WorldWideScience

Sample records for criticality analysis methodology

  1. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Horton, D.G.

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  2. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    D.G. Horton

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  3. Supplement to the Disposal Criticality Analysis Methodology

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The methodology for evaluating criticality potential for high-level radioactive waste and spent nuclear fuel after the repository is sealed and permanently closed is described in the Disposal Criticality Analysis Methodology Topical Report (DOE 1998b). The topical report provides a process for validating various models that are contained in the methodology and states that validation will be performed to support License Application. The Supplement to the Disposal Criticality Analysis Methodology provides a summary of data and analyses that will be used for validating these models and will be included in the model validation reports. The supplement also summarizes the process that will be followed in developing the model validation reports. These reports will satisfy commitments made in the topical report, and thus support the use of the methodology for Site Recommendation and License Application. It is concluded that this report meets the objective of presenting additional information along with references that support the methodology presented in the topical report and can be used both in validation reports and in answering requests for additional information received from the Nuclear Regulatory Commission concerning the topical report. The data and analyses summarized in this report and presented in the references are not sufficient to complete a validation report. However, this information will provide a basis for several of the validation reports. Data from several references in this report have been identified with TBV-1349. Release of the TBV governing this data is required prior to its use in quality affecting activities and for use in analyses affecting procurement, construction, or fabrication. Subsequent to the initiation of TBV-1349, DOE issued a concurrence letter (Mellington 1999) approving the request to identify information taken from the references specified in Section 1.4 as accepted data

  4. Disposal criticality analysis methodology for fissile waste forms

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1998-03-01

    A general methodology has been developed to evaluate the criticality potential of the wide range of waste forms planned for geologic disposal. The range of waste forms includes commercial spent fuel, high level waste, DOE spent fuel (including highly enriched), MOX using weapons grade plutonium, and immobilized plutonium. The disposal of these waste forms will be in a container with sufficiently thick corrosion resistant barriers to prevent water penetration for up to 10,000 years. The criticality control for DOE spent fuel is primarily provided by neutron absorber material incorporated into the basket holding the individual assemblies. For the immobilized plutonium, the neutron absorber material is incorporated into the waste form itself. The disposal criticality analysis methodology includes the analysis of geochemical and physical processes that can breach the waste package and affect the waste forms within. The basic purpose of the methodology is to guide the criticality control features of the waste package design, and to demonstrate that the final design meets the criticality control licensing requirements. The methodology can also be extended to the analysis of criticality consequences (primarily increased radionuclide inventory), which will support the total performance assessment for the repository

  5. Disposal criticality analysis methodology's principal isotope burnup credit

    International Nuclear Information System (INIS)

    Doering, T.W.; Thomas, D.A.

    2001-01-01

    This paper presents the burnup credit aspects of the United States Department of Energy Yucca Mountain Project's methodology for performing criticality analyses for commercial light-water-reactor fuel. The disposal burnup credit methodology uses a 'principal isotope' model, which takes credit for the reduced reactivity associated with the build-up of the principal actinides and fission products in irradiated fuel. Burnup credit is important to the disposal criticality analysis methodology and to the design of commercial fuel waste packages. The burnup credit methodology developed for disposal of irradiated commercial nuclear fuel can also be applied to storage and transportation of irradiated commercial nuclear fuel. For all applications, a series of loading curves is developed using a best-estimate methodology and, depending on the application, an additional administrative safety margin may be applied. The burnup credit methodology better represents the 'true' reactivity of the irradiated fuel configuration, and hence the real safety margin, than do evaluations using the 'fresh fuel' assumption. (author)
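    As a concrete illustration of how a loading curve is applied in practice, the sketch below checks a candidate assembly against a minimum-burnup-versus-initial-enrichment curve; the curve points, margin and numbers are invented for illustration and are not the project's actual limits.

```python
# Minimal sketch of applying a burnup-credit loading curve: an assembly is
# acceptable for loading only if its assigned burnup meets or exceeds the
# minimum required burnup for its initial enrichment. The curve below is
# hypothetical and purely illustrative.
import numpy as np

# (initial enrichment wt% U-235, minimum required burnup GWd/MTU) -- hypothetical
loading_curve = [(2.0, 0.0), (3.0, 10.0), (4.0, 22.0), (5.0, 35.0)]

def min_required_burnup(enrichment_wt_pct):
    """Linear interpolation on the (hypothetical) loading curve."""
    e, b = zip(*loading_curve)
    return float(np.interp(enrichment_wt_pct, e, b))

def acceptable(enrichment_wt_pct, assigned_burnup, admin_margin=0.0):
    """Acceptable if assigned burnup >= curve value plus any administrative margin."""
    return assigned_burnup >= min_required_burnup(enrichment_wt_pct) + admin_margin

print(acceptable(4.2, 30.0))                    # True for these made-up numbers
print(acceptable(4.2, 20.0, admin_margin=2.0))  # False: below the curve plus margin
```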

  6. Validating analysis methodologies used in burnup credit criticality calculations

    International Nuclear Information System (INIS)

    Brady, M.C.; Napolitano, D.G.

    1992-01-01

    The concept of allowing reactivity credit for the depleted (or burned) state of pressurized water reactor fuel in the licensing of spent fuel facilities introduces a new challenge to members of the nuclear criticality community. The primary difference in this analysis approach is the technical ability to calculate spent fuel compositions (or inventories) and to predict their effect on the system multiplication factor. Isotopic prediction codes are used routinely for in-core physics calculations and the prediction of radiation source terms for both thermal and shielding analyses, but represent an innovation for criticality specialists. This paper discusses two methodologies currently being developed to specifically evaluate isotopic composition and reactivity for the burnup credit concept. A comprehensive approach to benchmarking and validating the methods is also presented. This approach involves the analysis of commercial reactor critical data, fuel storage critical experiments, chemical assay isotopic data, and numerical benchmark calculations

  7. Complexity and Vulnerability Analysis of Critical Infrastructures: A Methodological Approach

    Directory of Open Access Journals (Sweden)

    Yongliang Deng

    2017-01-01

    Vulnerability analysis of network models has been widely adopted to explore the potential impacts of random disturbances, deliberate attacks, and natural disasters. However, almost all these models are based on a fixed topological structure, in which the physical properties of infrastructure components and their interrelationships are not well captured. In this paper, a new research framework is put forward to quantitatively explore and assess the complexity and vulnerability of critical infrastructure systems. Then, a case study is presented to prove the feasibility and validity of the proposed framework. After constructing the metro physical network (MPN), Pajek is employed to analyze its corresponding topological properties, including degree, betweenness, average path length, network diameter, and clustering coefficient. With a comprehensive understanding of the complexity of the MPN, it would be beneficial for the metro system to restrain near-misses or accidents at their origin and to support decision-making in emergency situations. Moreover, through the analysis of two simulation protocols for system component failure, it is found that the MPN turns out to be vulnerable when high-degree nodes or high-betweenness edges are attacked. These findings will be conducive to offering recommendations and proposals for robust design, risk-based decision-making, and prioritization of risk reduction investment.
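    A minimal sketch of the kind of topological analysis and attack simulation described above, using the networkx library and a small synthetic graph as a stand-in for the metro physical network analysed with Pajek in the paper; all values are illustrative.

```python
# Toy stand-in for a metro physical network (MPN): stations are nodes, track
# sections are edges. The graph and everything derived from it are invented.
import networkx as nx

mpn = nx.barabasi_albert_graph(n=30, m=2, seed=1)

# Topological properties examined in the paper.
print("average degree      :", sum(d for _, d in mpn.degree()) / mpn.number_of_nodes())
print("average path length :", nx.average_shortest_path_length(mpn))
print("network diameter    :", nx.diameter(mpn))
print("clustering coeff.   :", nx.average_clustering(mpn))
print("max node betweenness:", round(max(nx.betweenness_centrality(mpn).values()), 3))

def giant_fraction_after_attack(g, n_remove):
    """Remove the n_remove highest-degree nodes and return the fraction of the
    original nodes left in the largest connected component."""
    h = g.copy()
    targets = sorted(h.degree(), key=lambda kv: kv[1], reverse=True)[:n_remove]
    h.remove_nodes_from(node for node, _ in targets)
    giant = max(nx.connected_components(h), key=len)
    return len(giant) / g.number_of_nodes()

# Deliberate-attack protocol: connectivity degrades quickly when hubs are
# removed, which is the vulnerability pattern the study reports.
for k in (1, 3, 5):
    print(f"remove {k} hub(s) -> giant component fraction =",
          round(giant_fraction_after_attack(mpn, k), 2))
```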

  8. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Brownson, D. A.

    2002-01-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) Waste form/waste package degradation; (2) Waste package isotopic inventory; (3) Criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) Probability of criticality (for each potential critical configuration as well as total event); and (5) Criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information relative to the model report content and validation. The model reports and their revisions are being generated as a result of: (1) Commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) Open Items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002)

  9. Analysis of kyoto university reactor physics critical experiments using NCNSRC calculation methodology

    International Nuclear Information System (INIS)

    Amin, E.; Hathout, A.M.; Shouman, S.

    1997-01-01

    The Kyoto University reactor physics experiments on the university critical assembly are used to benchmark and validate the NCNSRC calculation methodology. This methodology has two lines, diffusion and Monte Carlo. The diffusion line includes the code WIMSD4 for cell calculations and the two-dimensional diffusion code DIXY2 for core calculations. The transport line uses the VAX version of the MULTIKENO code. Analysis is performed for the criticality and the temperature coefficients of reactivity (TCR) of the different light-water-moderated and -reflected cores utilized in the experiments. The results for both the eigenvalue and the TCR approximately reproduced the experimental and theoretical Kyoto results. However, some conclusions are drawn about the adequacy of the standard WIMSD4 library. This paper is an extension of the NCNSRC efforts to assess and validate computer tools and methods for both the ET-RR-1 and ET-MMPR-2 research reactors. 7 figs., 1 tab

  10. The Three Stages of Critical Policy Methodology: An Example from Curriculum Analysis

    Science.gov (United States)

    Rata, Elizabeth

    2014-01-01

    The article identifies and discusses three stages in the critical policy methodology used in the sociology of education. These are: firstly, employing a political economy theoretical framework that identifies causal links between global forces and local developments; secondly, analysing educational policy within that theoretically conceptualised…

  11. A philosophical analysis of the general methodology of qualitative research: a critical rationalist perspective.

    Science.gov (United States)

    Rudnick, Abraham

    2014-09-01

    Philosophical discussion of the general methodology of qualitative research, such as that used in some health research, has been inductivist or relativist to date, ignoring critical rationalism as a philosophical approach with which to discuss the general methodology of qualitative research. This paper presents a discussion of the general methodology of qualitative research from a critical rationalist perspective (inspired by Popper), using as an example mental health research. The widespread endorsement of induction in qualitative research is positivist and is suspect, if not false, particularly in relation to the context of justification (or rather theory testing) as compared to the context of discovery (or rather theory generation). Relativism is riddled with philosophical weaknesses and hence it is suspect if not false too. Theory testing is compatible with qualitative research, contrary to much writing about and in qualitative research, as theory testing involves learning from trial and error, which is part of qualitative research, and which may be the form of learning most conducive to generalization. Generalization involves comparison, which is a fundamental methodological requirement of any type of research (qualitative or other); hence the traditional grounding of quantitative and experimental research in generalization. Comparison--rather than generalization--is necessary for, and hence compatible with, qualitative research; hence, the common opposition to generalization in qualitative research is misdirected, disregarding whether this opposition's claims are true or false. In conclusion, qualitative research, similar to quantitative and experimental research, assumes comparison as a general methodological requirement, which is necessary for health research.

  12. Starting a Conversation about Critical Frame Analysis: Reflections on Dealing with Methodology in Feminist Research

    NARCIS (Netherlands)

    Haar, M. van der; Verloo, M.M.T.

    2016-01-01

    With this article we are contributing to a conversation about Critical Frame Analysis (CFA) as a feminist research method. CFA was developed within the context of two collaborative and comparative research studies of gender equality policies in the European context, MAGEEQ (www.mageeq.net) and QUING

  13. Critical/non-critical system methodology report

    International Nuclear Information System (INIS)

    1989-01-01

    The method used to determine how the Waste Isolation Pilot Plant (WIPP) facilities/systems were classified as critical or non-critical to the receipt of CH waste is described within this report. All WIPP critical facilities/systems are listed in the Operational Readiness Review Dictionary. Using the Final Safety Analysis Report (FSAR) as a guide to define the boundaries of the facilities/systems, a direct correlation of the ORR Dictionary to the FSAR can be obtained. The critical facilities/systems are those which are directly related to or have a critical support role in the receipt of CH waste. The facilities/systems must meet one of the following requirements to be considered critical: (a) confinement or measurement of the release of radioactive materials; (b) continued receipt and/or storage of transuranic waste (TRU) without an interruption greater than one month according to the shipping plan schedule; (c) maintaining the environmental and occupational safety of personnel in accordance with the established site programs; and (d) the physical security of the WIPP facilities

  14. Argumentation: A Methodology to Facilitate Critical Thinking.

    Science.gov (United States)

    Makhene, Agnes

    2017-06-20

    Caring is a demanding nursing activity that involves the complex nature of a human being and requires complex decision-making and problem solving through the critical thinking process. It is mandatory that critical thinking is facilitated in general, and in nursing education in particular, in order to render care in diverse multicultural patient care settings. This paper aims to describe how argumentation can be used to facilitate critical thinking in learners. A qualitative, exploratory and descriptive design that is contextual was used. A purposive sampling method was used to draw the sample, and Miles and Huberman's methodology of qualitative analysis was used to analyse the data. Lincoln and Guba's strategies were employed to ensure trustworthiness, while Dhai and McQuoid-Mason's principles of ethical consideration were used. Following data analysis, the findings were integrated with the literature, culminating in the formulation of guidelines that can be followed when using argumentation as a methodology to facilitate critical thinking.

  15. An intersectionality-based policy analysis framework: critical reflections on a methodology for advancing equity.

    Science.gov (United States)

    Hankivsky, Olena; Grace, Daniel; Hunting, Gemma; Giesbrecht, Melissa; Fridkin, Alycia; Rudrum, Sarah; Ferlatte, Olivier; Clark, Natalie

    2014-12-10

    In the field of health, numerous frameworks have emerged that advance understandings of the differential impacts of health policies to produce inclusive and socially just health outcomes. In this paper, we present the development of an important contribution to these efforts - an Intersectionality-Based Policy Analysis (IBPA) Framework. Developed over the course of two years in consultation with key stakeholders and drawing on best and promising practices of other equity-informed approaches, this participatory and iterative IBPA Framework provides guidance and direction for researchers, civil society, public health professionals and policy actors seeking to address the challenges of health inequities across diverse populations. Importantly, we present the application of the IBPA Framework in seven priority health-related policy case studies. The analysis of each case study is focused on explaining how IBPA: 1) provides an innovative structure for critical policy analysis; 2) captures the different dimensions of policy contexts including history, politics, everyday lived experiences, diverse knowledges and intersecting social locations; and 3) generates transformative insights, knowledge, policy solutions and actions that cannot be gleaned from other equity-focused policy frameworks. The aim of this paper is to inspire a range of policy actors to recognize the potential of IBPA to foreground the complex contexts of health and social problems, and ultimately to transform how policy analysis is undertaken.

  16. Identifying the critical success factors in the coverage of low vision services using the classification analysis and regression tree methodology.

    Science.gov (United States)

    Chiang, Peggy Pei-Chia; Xie, Jing; Keeffe, Jill Elizabeth

    2011-04-25

    To identify the critical success factors (CSF) associated with coverage of low vision services. Data were collected from a survey distributed to Vision 2020 contacts, government, and non-government organizations (NGOs) in 195 countries. The Classification and Regression Tree Analysis (CART) was used to identify the critical success factors of low vision service coverage. Independent variables were sourced from the survey: policies, epidemiology, provision of services, equipment and infrastructure, barriers to services, human resources, and monitoring and evaluation. Socioeconomic and demographic independent variables: health expenditure, population statistics, development status, and human resources in general, were sourced from the World Health Organization (WHO), World Bank, and the United Nations (UN). The findings identified that having >50% of children obtaining devices when prescribed (χ2 = 44; P < 0.001), having >3 rehabilitation workers per 10 million population (χ2 = 4.50; P = 0.034), a higher percentage of the population urbanized (χ2 = 14.54; P = 0.002), a level of private investment (χ2 = 14.55; P = 0.015), and being fully funded by government (χ2 = 6.02; P = 0.014) are critical success factors associated with coverage of low vision services. This study identified the most important predictors for countries with better low vision coverage. The CART is a useful and suitable methodology in survey research and is a novel way to simplify a complex global public health issue in eye care.
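    The following sketch illustrates the CART approach on synthetic data, using scikit-learn's DecisionTreeClassifier as a stand-in for the CART software used in the study; the feature names mirror the predictors listed above, but the data and the outcome definition are invented.

```python
# Illustrative CART sketch on synthetic country-level data. Feature names echo
# the abstract's predictors; all values and the outcome label are hypothetical.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 195  # one row per surveyed country (synthetic)
X = pd.DataFrame({
    "pct_children_devices":  rng.uniform(0, 100, n),  # % children receiving prescribed devices
    "rehab_workers_per_10m": rng.uniform(0, 10, n),   # rehabilitation workers per 10 million
    "pct_urbanized":         rng.uniform(10, 100, n),
    "private_investment":    rng.integers(0, 2, n),   # any private investment (0/1)
    "fully_gov_funded":      rng.integers(0, 2, n),   # fully funded by government (0/1)
})
# Hypothetical outcome: "good" low vision service coverage.
y = ((X["pct_children_devices"] > 50) & (X["rehab_workers_per_10m"] > 3)).astype(int)

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=10, random_state=0)
tree.fit(X, y)

# The printed tree plays the role of the CSF hierarchy: the first splits are
# the most discriminating predictors of coverage.
print(export_text(tree, feature_names=list(X.columns)))
```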

  17. Risk Assessment Planning for Airborne Systems: An Information Assurance Failure Mode, Effects and Criticality Analysis Methodology

    Science.gov (United States)

    2012-06-01

    No abstract is indexed for this record; the retrieved excerpt consists of reference-list fragments from the thesis, citing 'MasterCard and Visa Investigate Data Breach', New York Times (March 30, 2012), and Stamatis, D. (2003), Failure Mode Effect Analysis: FMEA from Theory to Execution.

  18. Critical infrastructure systems of systems assessment methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a 'willingness to pay' avoidance approach.
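    A toy sketch of the kind of conditional risk roll-up the abstract describes, in which the consequence term is scaled by an avoidance ('willingness to pay') weight; the structure and all numbers are assumptions for illustration and do not reproduce the Sandia methodology.

```python
# Toy conditional-risk roll-up: risk given an attack is the product of the
# probability that protection (detect/delay/respond) fails, the probability
# that recovery/mitigation also fails, and a weighted consequence term.
# All names and values are illustrative assumptions only.

def conditional_risk(p_protection_fails, p_no_recovery, consequence, avoidance_weight=1.0):
    """Risk | attack = P(protection fails) * P(no recovery) * weighted consequence."""
    return p_protection_fails * p_no_recovery * consequence * avoidance_weight

# Compare a physical-only path with a blended cyber/physical path (made-up values).
paths = {
    "physical only":          conditional_risk(0.10, 0.5, consequence=100.0),
    "blended cyber/physical": conditional_risk(0.35, 0.7, consequence=100.0, avoidance_weight=1.2),
}
for name, risk in sorted(paths.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:24s} conditional risk = {risk:6.1f}")
```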

  19. Some critical methodological issues in secondary analysis of world health organization data on elderly suicide rates.

    Science.gov (United States)

    Shah, Ajit

    2009-07-01

    Suicides may be misclassified as accidental deaths in countries with strict legal definitions of suicide, with cultural and religious factors leading to poor registration of suicide and stigma attached to suicide. The concordance between four different definitions of suicides was evaluated by examining the relationship between pure suicide and accidental death rates, gender differences, age-associated trends and potential distal risk and protective factors by conducting secondary analysis of the latest World Health Organisation data on elderly death rates. The four definitions of suicide were: (i) one-year pure suicide rates; (ii) one-year combined suicide rates (pure suicide rates combined with accidental death rates); (iii) five-year average pure suicide rates; and (iv) five-year average combined suicide rates (pure suicide rates combined with accidental death rates). The predicted negative correlation between pure suicide and accidental death rates was not observed. Gender differences were similar for all four definitions of suicide. There was a highly significant concordance for the findings of age-associated trends between one-year pure and combined suicide rates, one-year and five-year average pure suicide rates, and five-year average pure and combined suicide rates. There was poor concordance between pure and combined suicide rates for both one-year and five-year average data for the 14 potential distal risk and protective factors, but this concordance between one-year and five-year average pure suicide rates was highly significant. The use of one-year pure suicide rates in cross-national ecological studies examining gender differences, age-associated trends and potential distal risk and protective factors is likely to be practical, pragmatic and resource-efficient.
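    The concordance checks described above amount to correlating country-level rates computed under the different definitions; the sketch below shows that workflow on synthetic rates (the WHO data are not reproduced here).

```python
# Illustrative concordance check: correlate elderly suicide rates under
# different definitions across countries. The rates are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_countries = 60
one_year_pure = rng.gamma(shape=3.0, scale=5.0, size=n_countries)   # per 100,000, invented
five_year_pure = one_year_pure + rng.normal(0, 1.0, n_countries)    # closely related by design
accidental = rng.gamma(shape=4.0, scale=6.0, size=n_countries)      # independent in this toy data
one_year_combined = one_year_pure + accidental

for label, a, b in [
    ("pure vs accidental (negative if misclassification)", one_year_pure, accidental),
    ("one-year vs five-year pure", one_year_pure, five_year_pure),
    ("pure vs combined (one-year)", one_year_pure, one_year_combined),
]:
    rho, p = stats.spearmanr(a, b)
    print(f"{label:50s} rho = {rho:+.2f}, p = {p:.3g}")
```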

  20. Criticality incident detection assessment methodology

    International Nuclear Information System (INIS)

    Haley, Richard M.; Warburton, Simon J.; Bowden, Russell L.

    2003-01-01

    In the United Kingdom, all nuclear facilities that handle, treat or store fissile material require a Criticality Incident Detection and Alarm System (CIDAS) to be installed, unless a case is made for the omission of such a system. Where it is concluded that a CIDAS is required, the primary objective is the reliable detection of criticality and the initiation of prompt evacuation of plant workers from the vicinity of the incident. This paper will examine and compare various methods that can be used to demonstrate that a CIDAS will satisfy the detection criterion. The paper will focus on fit-for-purpose and cost-effective methods for the assessment of gamma-based systems. In the experience of the authors this is particularly useful in demonstrating the efficacy of existing systems in operational plant. (author)

  1. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies.

    Science.gov (United States)

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-10-24

    In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have allowed detection limits within the pM and nM ranges to be reached. Most of these developments proved their suitability in detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies is still behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is highly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative influence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies

    International Nuclear Information System (INIS)

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-01-01

    Graphical abstract: -- Highlights: •Several methods based on nanotechnology achieve limits of detection in the pM and nM ranges for mercury (II) analysis. •Most of these methods are validated in filtered water samples and/or spiked samples. •Thiols in real samples constitute real competition for any sensor based on the binding of mercury (II) ions. •Future research should include the study of matrix interferences, including thiols and dissolved organic matter. -- Abstract: In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have allowed detection limits within the pM and nM ranges to be reached. Most of these developments proved their suitability in detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies is still behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is highly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative influence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis

  3. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
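    The core calculation behind such an analysis is a population-weighted combination of building protection factors; the sketch below illustrates it with invented building types, protection factors and occupancy fractions.

```python
# Minimal sketch of the population-weighted protection idea behind a Regional
# Shelter Analysis: combine the protection factor (PF) of each building/location
# type with the fraction of people in it to estimate the average dose reduction.
# Building types, PFs, and occupancy fractions are illustrative only.

shelters = [
    # (description, protection factor, fraction of population there at this time of day)
    ("outdoors / in vehicles", 1.5,  0.10),
    ("light residential",      5.0,  0.55),
    ("multi-storey masonry",  20.0,  0.30),
    ("basements",             50.0,  0.05),
]

assert abs(sum(frac for _, _, frac in shelters) - 1.0) < 1e-9

# Average transmission = weighted mean of 1/PF; the outdoor dose D becomes D * transmission.
transmission = sum(frac / pf for _, pf, frac in shelters)
print(f"population-averaged transmission factor: {transmission:.3f}")
print(f"effective regional protection factor:    {1.0 / transmission:.1f}")
```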

  4. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance-related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation mode suited to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable range of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of the PRDBEs chosen based on each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details of the SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis

  5. Electric energy tariffs - critical analysis and methodological proposition; Estrutura de tarifas de energia eletrica. Analise critica e proposicoes metodologicas

    Energy Technology Data Exchange (ETDEWEB)

    Fugimoto, Sergio Kinya

    2010-07-01

    Currently, the electric energy sector is preparing for the third round of the distributors' tariff revisions. Since the regulatory environment is more consolidated in terms of required revenue, the agents are turning their attention to the necessary adjustment and correction of the tariff structure. In fact, ANEEL (the regulatory agency) set topics for R and D projects considered strategic for the development of the national energy sector, among them the Tariff Structure Project. Recently, the regulatory agency also announced public hearings dealing with cost allocation, price signals and tariffs for low-voltage consumers. In line with this debate, the thesis seeks to analyze the methodology for calculating tariffs, systematizing knowledge dispersed in various references. For this, it discusses the major aspects of peak pricing theory, including the American, British and French approaches, and investigates the connection between the logic built into the allocation of costs by hour and the criteria for electricity distribution system planning. Aiming to reflect the costs of each customer type and to indicate a better utilization of the distribution system, improvements and innovations are proposed, whose highlights are: shifting away from the idea that expansion costs should be allocated only to the peak time of the system, setting the periods after calculating the costs, changing how the reference charges are derived by average aggregation of the costs, and applying the methodology to altered load curves. Finally, this thesis seeks to prove that the current methodology, although designed at a time when the electricity sector was aggregated, can be adapted according to the proposed improvements and innovations, and thus applied to the current environment in which electric energy businesses and tariffs are separated into generation, transmission, distribution and retail areas. (author)

  6. Critical Thinking and the Use of Nontraditional Instructional Methodologies.

    Science.gov (United States)

    Orique, Sabrina B; McCarthy, Mary Ann

    2015-08-01

    The purpose of this study was to examine the relationship between critical thinking and the use of concept mapping (CM) and problem-based learning (PBL) during care plan development. A quasi-experimental study with a pretest-posttest design was conducted using a convenience sample (n = 49) of first-semester undergraduate baccalaureate nursing students. Critical thinking was measured using the Holistic Critical Thinking Scoring Rubric. Data analysis consisted of a repeated measures analysis of variance with post hoc mean comparison tests using the Bonferroni method. Findings indicated that mean critical thinking at phase 4 (CM and PBL) was significantly higher, compared with phase 1 (baseline), phase 2 (PBL), and phase 3 (CM [p < 0.001]). The results support the utilization of nontraditional instructional (CM and PBL) methodologies in undergraduate nursing curricula. Copyright 2015, SLACK Incorporated.
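    The statistical workflow described above can be sketched as follows with synthetic scores: a one-way repeated measures ANOVA across the four phases followed by Bonferroni-corrected pairwise comparisons; the effect sizes and score scale are assumptions, not the study data.

```python
# Sketch of the analysis workflow: repeated measures ANOVA over four phases,
# then Bonferroni-corrected paired t-tests. Scores are synthetic.
import itertools
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(42)
n_students = 49
phases = ["baseline", "PBL", "CM", "CM+PBL"]
shift = {"baseline": 0.0, "PBL": 0.3, "CM": 0.4, "CM+PBL": 1.0}  # assumed effects

rows = []
for sid in range(n_students):
    base = rng.normal(2.5, 0.5)  # each student's underlying critical-thinking level
    for ph in phases:
        rows.append({"student": sid, "phase": ph,
                     "score": base + shift[ph] + rng.normal(0, 0.3)})
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="score", subject="student", within=["phase"]).fit()
print(res.anova_table)

# Bonferroni-corrected pairwise comparisons between phases.
pairs = list(itertools.combinations(phases, 2))
alpha_corrected = 0.05 / len(pairs)
for a, b in pairs:
    t, p = stats.ttest_rel(df.loc[df.phase == a, "score"].values,
                           df.loc[df.phase == b, "score"].values)
    verdict = "significant" if p < alpha_corrected else "n.s."
    print(f"{a:8s} vs {b:8s}: p = {p:.4g} ({verdict} at Bonferroni-adjusted alpha)")
```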

  7. Criticality accident studies and methodology implemented at the CEA

    International Nuclear Information System (INIS)

    Barbry, Francis; Fouillaud, Patrick; Reverdy, Ludovic; Mijuin, Dominique

    2003-01-01

    Based on the studies and results of experimental programs performed since 1967 in the CRAC, then SILENE facilities, the CEA has devised a methodology for criticality accident studies. This methodology integrates all the main focuses of its approach, from criticality accident phenomenology to emergency planning and response, and thus includes aspects such as criticality alarm detector triggering, airborne releases, and irradiation risk assessment. (author)

  8. Auditing organizational communication: evaluating the methodological strengths and weaknesses of the critical incident technique, network analysis, and the communication satisfaction questionnaire

    NARCIS (Netherlands)

    Koning, K.H.

    2016-01-01

    This dissertation focuses on the methodology of communication audits. In the context of three Dutch high schools, we evaluated several audit instruments. The first study in this dissertation focuses on the question whether the rationale of the critical incident technique (CIT) still applies when it

  9. Applying critical analysis - main methods

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-02-01

    What is the usefulness of critical appraisal of literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently used in conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of literature can be done implicitly while reading an article, as in reading for personal interest, or can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.

  10. A review of costing methodologies in critical care studies.

    Science.gov (United States)

    Pines, Jesse M; Fager, Samuel S; Milzman, David P

    2002-09-01

    Clinical decision making in critical care has traditionally been based on clinical outcome measures such as mortality and morbidity. Over the past few decades, however, increasing competition in the health care marketplace has made it necessary to consider costs when making clinical and managerial decisions in critical care. Sophisticated costing methodologies have been developed to aid this decision-making process. We performed a narrative review of published costing studies in critical care during the past 6 years. A total of 282 articles were found, of which 68 met our search criteria. They involved a mean of 508 patients (range, 20-13,907). A total of 92.6% of the studies (63 of 68) used traditional cost analysis, whereas the remaining 7.4% (5 of 68) used cost-effectiveness analysis. None (0 of 68) used cost-benefit analysis or cost-utility analysis. A total of 36.7% (25 of 68) used hospital charges as a surrogate for actual costs. Of the 43 articles that actually counted costs, 37.2% (16 of 43) counted physician costs, 27.9% (12 of 43) counted facility costs, 34.9% (15 of 43) counted nursing costs, 9.3% (4 of 43) counted societal costs, and 90.7% (39 of 43) counted laboratory, equipment, and pharmacy costs. Our conclusion is that despite considerable progress in costing methodologies, critical care studies have not adequately implemented these techniques. Given the importance of financial implications in medicine, it would be prudent for critical care studies to use these more advanced techniques. Copyright 2002, Elsevier Science (USA). All rights reserved.
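    As a minimal illustration of the cost-effectiveness analysis the review finds under-used, the snippet below computes an incremental cost-effectiveness ratio (ICER) for a hypothetical ICU intervention versus usual care; all figures are invented.

```python
# Worked example of an incremental cost-effectiveness ratio (ICER).
# Mean cost and survival per patient are hypothetical illustration values.
usual_care   = {"cost": 18_000.0, "survival": 0.70}
intervention = {"cost": 22_500.0, "survival": 0.76}

icer = (intervention["cost"] - usual_care["cost"]) / \
       (intervention["survival"] - usual_care["survival"])
print(f"ICER = ${icer:,.0f} per additional survivor")
# A cost-utility analysis would replace survival with quality-adjusted life years (QALYs);
# a cost-benefit analysis would express the health gain itself in monetary terms.
```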

  11. Exploring Participatory Methodologies in Organizational Discourse Analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches as single-level vs. multi-level, critical vs. participatory, discursive vs. material methods. They raise methodological issues of combining such approaches to embrace multimodality in order to enable new contributions. As regards concepts, efforts have been made, but further exploration of methodological combinations and their practical implications is called for. This paper argues 1) to combine methodologies by approaching this as scholarly subjectification processes, and 2) to perform combinations in both...

  12. Stepping up for Childhood: A Contextual Critical Methodology

    Science.gov (United States)

    Kyriacopoulos, Konstantine; Sánchez, Marta

    2017-01-01

    In this paper, we theorize a critical methodology for education centering community experiences of systemic injustice, drawing upon Critical Race Theory, critical educational leadership studies, Chicana feminism, participant action research and political theory, to refocus our work on the human relationships at the center of the learning and…

  13. Methodologies for evaluation of environmental capacity and impact due to radioactive releases by critical path analysis and their application to the IPEN's aquatic environment as a typical case study

    International Nuclear Information System (INIS)

    Chandra, U.

    1986-01-01

    A brief description is made of the tested concepts for the determination of environmental capacity and impact by the critical path analysis technique, and of dose limitation/optimization for radioactive releases. These concepts/methodologies are being applied in the environment of IPEN. The aquatic environment of IPEN is dealt with in detail with a view to evaluating the possible critical paths, its capacity, and present and future radiological impacts. (Author) [pt]
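    A toy sketch of the critical-path idea referred to above: estimate the dose to the critical group per unit release for each exposure pathway, take the limiting (critical) pathway, and derive the environmental capacity from the dose constraint; all pathway factors and the constraint are invented for illustration.

```python
# Hypothetical dose-per-unit-release factors for aquatic exposure pathways and
# an assumed public dose constraint, used only to show how the critical path
# and the environmental capacity are derived.
dose_constraint_sv_per_y = 1.0e-3             # assumed dose constraint (Sv/y)

pathways_sv_per_bq = {                        # Sv/y to the critical group per Bq/y released
    "drinking water":       2.0e-16,
    "fish consumption":     8.0e-16,
    "irrigated vegetables": 3.0e-16,
    "external (sediments)": 5.0e-17,
}

critical_path, dose_factor = max(pathways_sv_per_bq.items(), key=lambda kv: kv[1])
capacity_bq_per_y = dose_constraint_sv_per_y / dose_factor

print(f"critical path          : {critical_path}")
print(f"environmental capacity : {capacity_bq_per_y:.2e} Bq/y "
      f"(annual release giving {dose_constraint_sv_per_y:.0e} Sv/y to the critical group)")
```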

  14. Critical Realism and Empirical Bioethics: A Methodological Exposition.

    Science.gov (United States)

    McKeown, Alex

    2017-09-01

    This paper shows how critical realism can be used to integrate empirical data and philosophical analysis within 'empirical bioethics'. The term empirical bioethics, whilst appearing oxymoronic, simply refers to an interdisciplinary approach to the resolution of practical ethical issues within the biological and life sciences, integrating social scientific, empirical data with philosophical analysis. It seeks to achieve a balanced form of ethical deliberation that is both logically rigorous and sensitive to context, to generate normative conclusions that are practically applicable to the problem, challenge, or dilemma. Since it incorporates both philosophical and social scientific components, empirical bioethics is a field that is consistent with the use of critical realism as a research methodology. The integration of philosophical and social scientific approaches to ethics has been beset with difficulties, not least because of the irreducibly normative, rather than descriptive, nature of ethical analysis and the contested relation between fact and value. However, given that facts about states of affairs inform potential courses of action and their consequences, there is a need to overcome these difficulties and successfully integrate data with theory. Previous approaches have been formulated to overcome obstacles in combining philosophical and social scientific perspectives in bioethical analysis; however each has shortcomings. As a mature interdisciplinary approach critical realism is well suited to empirical bioethics, although it has hitherto not been widely used. Here I show how it can be applied to this kind of research and explain how it represents an improvement on previous approaches.

  15. El análisis de criticidad, una metodología para mejorar la confiabilidad operacional // Criticality analysis, a methodology to improve the operational reliability.

    Directory of Open Access Journals (Sweden)

    R. Huerta Mendoza

    2000-10-01

    … to establish priorities and to focus the effort that guarantees success, maximizing profitability. (Keywords of the Spanish abstract: reliability, criticality, safety, environment, risk, availability, improvement.) Abstract: Criticality analysis is a methodology that allows the hierarchy or priorities of processes, systems and equipment to be established, creating a structure that facilitates the taking of effective and correct decisions and directing the effort and the resources to the areas where it is most important or necessary to improve operational dependability, based on the current reality. The improvement of the operational dependability of any installation, or of its systems and components, is associated with four fundamental aspects: human dependability, dependability of the process, dependability of the design and dependability of the maintenance. Regrettably, we rarely have limitless resources at hand, both economic and human, to improve these four aspects at the same time in all the areas of a company. The criteria for carrying out a criticality analysis are mainly associated with: safety, environment, production, operation and maintenance costs, failure rate and repair time. These criteria are related through a mathematical equation that generates a score for each evaluated element. The resulting list, the product of team work, allows criteria to be levelled and homologated in order to establish priorities and to focus the effort that guarantees success, maximizing profitability. Key words: PDVSA, dependability, criticality, security, surroundings, risk, readiness, improvement, changes.
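    A minimal sketch of the scoring equation and ranking described above, with hypothetical equipment items, weights and scales; it illustrates the general criticality-analysis pattern, not the PDVSA model.

```python
# Toy criticality ranking: each item gets a score from failure frequency and
# summed consequence factors (safety, environment, production, cost), and the
# list is sorted to set priorities. Items, scales and values are invented.

def criticality(freq, safety, environment, production, cost_factor):
    """Criticality = failure frequency x total consequence (one common form)."""
    consequence = safety + environment + production + cost_factor
    return freq * consequence

equipment = {
    # name: (frequency 1-4, safety 0-8, environment 0-8, production 0-10, cost 1-2)
    "feed pump P-101":      (4, 2, 1, 8, 2),
    "compressor K-201":     (2, 6, 4, 10, 2),
    "cooling tower fan":    (3, 1, 1, 4, 1),
    "instrument air dryer": (1, 1, 0, 6, 1),
}

ranking = sorted(equipment.items(), key=lambda kv: criticality(*kv[1]), reverse=True)
for name, scores in ranking:
    print(f"{name:22s} criticality = {criticality(*scores):3d}")
```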

  16. Trace Chemical Analysis Methodology

    Science.gov (United States)

    1980-04-01

    No abstract is indexed for this record; the retrieved excerpt consists of list-of-figures entries ('Modified DR/2 spectrophotometer face', 'Colorimetric oil analysis field test kit', 'Pictorial step ...') and text fragments on computer-assisted pattern recognition and the colorimetric oil analysis field test procedure.

  17. Fire safety analysis: methodology

    International Nuclear Information System (INIS)

    Kazarians, M.

    1998-01-01

    From a review of the fires that have occurred in nuclear power plants and the results of fire risk studies that have been completed over the last 17 years, we can conclude that internal fires in nuclear power plants can be an important contributor to plant risk. Methods and data are available to quantify the fire risk. These methods and data have been subjected to a series of reviews and detailed scrutiny and have been applied to a large number of plants. There is no doubt that we do not know everything about fire and its impact on a nuclear power plant. However, this lack of knowledge or uncertainty can be quantified and can be used in the decision-making process. In other words, the methods entail uncertainties and limitations that are not insurmountable, and there is little or no basis on which the results of a fire risk analysis would fail to support a decision process

  18. Blending critical realist and emancipatory practice development methodologies: making critical realism work in nursing research.

    LENUS (Irish Health Repository)

    Parlour, Randal

    2012-12-01

    This paper examines the efficacy of facilitation as a practice development intervention in changing practice within an Older Person setting and in implementing evidence into practice. It outlines the influences exerted by the critical realist paradigm in guiding emancipatory practice development activities and, in particular, how the former may be employed within an emancipatory practice development study to elucidate and increase understanding pertinent to causation and outcomes. The methodology is based upon an emancipatory practice development approach set within a realistic evaluation framework. This allows for systematic analysis of the social and contextual elements that influence the explication of outcomes associated with facilitation. The study is concentrated upon five practice development cycles, within which a sequence of iterative processes is integrated. The authors assert that combining critical realist and emancipatory processes offers a robust and practical method for translating evidence and implementing changes in practice, as the former affirms or falsifies the influence that emancipatory processes exert on attaining culture shift, and enabling transformation towards effective clinical practice. A new framework for practice development is proposed that establishes methodological coherency between emancipatory practice development and realistic evaluation. This augments the existing theoretical bases for both these approaches by contributing new theoretical and methodological understandings of causation.

  19. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis under modern conditions is demonstrated, and the notion of “situational analysis” is defined. We have concluded that situational analysis is a continuous system study whose purpose is to identify signs of dangerous situations, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the system's exposure to the situation now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative and search functions in the process of situational analysis is demonstrated. The basic methodological elements of situational analysis are grounded. The substantiation of the principal methodological elements of system analysis will enable the analyst to develop adaptive methods able to take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system, to diagnose such a situation and subject it to systematic and in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  20. Critical methodologies: early childhood research studies in Norway

    OpenAIRE

    Rhedding-Jones, Jeanette; Bjelkerud, Agnes Westgaard; Giæver, Katrine; Røkholt, Eline Grelland; Holten, Ingeborg Caroline Sæbøe; Lafton, Tove; Moxnes, Anna Rigmor; Pope, Liv Alice

    2014-01-01

    This is an open access article licensed under a Creative Commons Attribution 4.0 International License and originally published in Reconceptualizing Educational Research Methodology (RERM). You can access the article on publisher's website by following this link: https://journals.hioa.no/index.php/rerm This chapter exemplifies seven projects and their related research methodologies. It does so to consider how to construct critical research studies without replicating someone else’s researc...

  1. A study on methodologies for assessing safety critical network's risk impact on Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Lee, H. J.; Park, S. K.; Seo, S. J.

    2006-08-01

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for Nuclear Power Plant safety-critical networks. It is necessary to make a comprehensive survey of current methodologies for communication network reliability. Major outputs of the first year study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks
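    One standard way of quantifying communication-network reliability, of the kind surveyed in this study, is Monte Carlo estimation of two-terminal reliability from independent link availabilities; the sketch below uses an invented topology and failure probabilities.

```python
# Monte Carlo estimate of two-terminal reliability for a small, hypothetical
# safety-related network: sample which links are up, then check connectivity.
import random
import networkx as nx

edges = [  # (node A, node B, probability the link is up) -- hypothetical values
    ("sensor", "switch1", 0.99), ("sensor", "switch2", 0.99),
    ("switch1", "controller", 0.995), ("switch2", "controller", 0.995),
    ("switch1", "switch2", 0.999),
]

def two_terminal_reliability(edges, source, target, trials=20_000, seed=1):
    """Fraction of sampled link states in which source and target stay connected."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        g = nx.Graph()
        g.add_nodes_from({source, target})
        g.add_edges_from((a, b) for a, b, p in edges if rng.random() < p)
        if nx.has_path(g, source, target):
            hits += 1
    return hits / trials

print("P(sensor can reach controller) ≈",
      round(two_terminal_reliability(edges, "sensor", "controller"), 5))
```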

  2. Critical Proximity as a Methodological Move in Techno-Anthropology

    DEFF Research Database (Denmark)

    Birkbak, Andreas; Petersen, Morten Krogh; Elgaard Jensen, Torben

    2015-01-01

    proximity.’ Critical proximity offers an alternative to critical distance, especially with respect to avoiding premature references to abstract panoramas such as democratization and capitalist exploitation in the quest to conduct ‘critical’ analysis. Critical proximity implies, instead, granting the beings...

  3. An economic analysis methodology for project evaluation and programming.

    Science.gov (United States)

    2013-08-01

    Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key : quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...

  4. Towards a Critical Health Equity Research Stance: Why Epistemology and Methodology Matter More Than Qualitative Methods.

    Science.gov (United States)

    Bowleg, Lisa

    2017-10-01

    Qualitative methods are not intrinsically progressive. Methods are simply tools to conduct research. Epistemology, the justification of knowledge, shapes methodology and methods, and thus is a vital starting point for a critical health equity research stance, regardless of whether the methods are qualitative, quantitative, or mixed. In line with this premise, I address four themes in this commentary. First, I criticize the ubiquitous and uncritical use of the term health disparities in U.S. public health. Next, I advocate for the increased use of qualitative methodologies-namely, photovoice and critical ethnography-that, pursuant to critical approaches, prioritize dismantling social-structural inequities as a prerequisite to health equity. Thereafter, I discuss epistemological stance and its influence on all aspects of the research process. Finally, I highlight my critical discourse analysis HIV prevention research based on individual interviews and focus groups with Black men, as an example of a critical health equity research approach.

  5. The Paradigm of Critical Realism: Approach to an Interdisciplinary Methodology

    Directory of Open Access Journals (Sweden)

    M. A. Tavana

    2015-02-01

    Full Text Available The debate over the method of knowing in the humanities, the social sciences and the natural sciences is one of the central preoccupations of scholars, especially in the modern world. This concern first appeared as a methodological dispute involving advocates of the positivist paradigm, who drew on the natural sciences, applied the values of naturalism to social and human studies, and put forward testing, observation and repetition as the main criteria of knowledge. By contrast, for advocates of the hermeneutic paradigm the difference between the human studies and the natural sciences is essential, and they speak of an interpretive methodology (understanding) for human and social phenomena. In the second half of the 20th century, however, another paradigm was established: critical realism. This paradigm attempts to move beyond the methodological binary and, by drawing on the ontology of positivism and the epistemology of hermeneutics, to arrive at an interdisciplinary procedure of knowing. On this basis, the article asks: can critical realism be accepted as a methodology for interdisciplinary work? The article argues that a multilayered ontology and epistemology lead to a multilayered methodology that can build interdisciplinary knowledge.

  6. Critical Analysis of Multimodal Discourse

    DEFF Research Database (Denmark)

    van Leeuwen, Theo

    2013-01-01

    This is an encyclopaedia article which defines the fields of critical discourse analysis and multimodality studies, argues that within critical discourse analysis more attention should be paid to multimodality, and within multimodality to critical analysis, and ends reviewing a few examples of re...

  7. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application to lending operations in the Republic of Serbia. With a developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, carried out in order to minimize and manage credit risk. The paper then describes the processing of the credit application and the procedure for analyzing financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework is presented as applied in a concrete company. In the third part, models are presented that banks should use to protect themselves against risk exposure, i.e., to reduce losses on lending operations in Serbia and to adjust to market conditions in an optimal way.

  8. Stakeholder analysis methodologies resource book

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, W.M.; Farhar, B.C.

    1994-03-01

    Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.

  9. Performance Testing Methodology for Safety-Critical Programmable Logic Controller

    International Nuclear Information System (INIS)

    Kim, Chang Ho; Oh, Do Young; Kim, Ji Hyeon; Kim, Sung Ho; Sohn, Se Do

    2009-01-01

    The Programmable Logic Controller (PLC) for use in Nuclear Power Plant safety-related applications is being developed and tested for the first time in Korea. This safety-related PLC is being developed to meet the requirements of the regulatory guidelines and industry standards for safety systems. To confirm that the quality of the developed PLC is sufficient for use in a safety-critical system, document reviews and various product tests were performed on the development documents for software, hardware, and verification and validation (V&V). This paper presents the performance testing methodology applied to the PLC platform by KOPEC and discusses its effectiveness.

  10. Teaching Research Methodology Using a Project-Based Three Course Sequence Critical Reflections on Practice

    Science.gov (United States)

    Braguglia, Kay H.; Jackson, Kanata A.

    2012-01-01

    This article presents a reflective analysis of teaching research methodology through a three course sequence using a project-based approach. The authors reflect critically on their experiences in teaching research methods courses in an undergraduate business management program. The introduction of a range of specific techniques including student…

  11. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  12. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  13. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)
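
    As a small, generic illustration of quantitative pooling (not necessarily the causal meta-analysis methods of the dissertation), the sketch below combines hypothetical study effects using inverse-variance fixed-effect weighting; the effect sizes and standard errors are invented.

      import math

      # Hypothetical per-study effect estimates and standard errors (illustrative only).
      studies = [
          {"effect": 0.30, "se": 0.12},
          {"effect": 0.10, "se": 0.08},
          {"effect": 0.25, "se": 0.15},
      ]

      # Inverse-variance (fixed-effect) pooling: weight each study by 1/se^2.
      weights = [1.0 / s["se"] ** 2 for s in studies]
      pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
      pooled_se = math.sqrt(1.0 / sum(weights))

      print(f"Pooled effect = {pooled:.3f} (SE {pooled_se:.3f}), "
            f"95% CI [{pooled - 1.96 * pooled_se:.3f}, {pooled + 1.96 * pooled_se:.3f}]")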

  14. Safety analysis methodology for OPR 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun

    2005-01-01

    Full text: Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology, based on codes available to KEPRI, to overcome the problems arising from the vendor-oriented methodologies currently in use. For Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for the Westinghouse 3-loop plants by the Korean regulatory organization, and a project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. For non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents addressed in the final safety analysis report (FSAR) were analyzed. (author)

  15. Fieldwork Methodology in South American Maritime Archaeology: A Critical Review

    Science.gov (United States)

    Argüeso, Amaru; Ciarlo, Nicolás C.

    2017-12-01

    In archaeology, data obtained from the analysis of material evidence (i.e., the archaeological record) from extensive excavations have been a significant means for the ultimate development of interpretations about human life in the past. Therefore, the methodological procedures and tools employed during fieldwork are of crucial importance due to their effect on the information likely to be recovered. In the case of maritime archaeology, the development of rigorous methods and techniques allowed for reaching outcomes as solid as those from the work performed on land. These improvements constituted one of the principal supports—if not, the most important pillar—for its acceptance as a scientific field of study. Over time, the growing diversity of sites under study (e.g., shipwrecks, ports, dockyards, and prehistoric settlements) and the underwater environments encountered made it clear that there was a need for the application of specific methodological criteria, in accordance with the particularities of the sites and of each study (e.g., the research aims and the available resources). This article presents some ideas concerning the methodologies used in South American investigations that have exhibited a strong emphasis on the analysis of historical shipwrecks (the sixteenth to twentieth centuries). Based on a state-of-the-knowledge review of these research projects, in particular where excavations were conducted, the article focuses on the details of the main strategies adopted and results achieved. The ideas proposed in this article can be useful as a starting point for future activities of surveying, recording, and excavating shipwrecks.

  16. Towards risk-based management of critical infrastructures : enabling insights and analysis methodologies from a focused study of the bulk power grid.

    Energy Technology Data Exchange (ETDEWEB)

    Richardson, Bryan T.; LaViolette, Randall A.; Cook, Benjamin Koger

    2008-02-01

    This report summarizes research on a holistic analysis framework to assess and manage risks in complex infrastructures, with a specific focus on the bulk electric power grid (grid). A comprehensive model of the grid is described that can approximate the coupled dynamics of its physical, control, and market components. New realism is achieved in a power simulator extended to include relevant control features such as relays. The simulator was applied to understand failure mechanisms in the grid. Results suggest that the implementation of simple controls might significantly alter the distribution of cascade failures in power systems. The absence of cascade failures in our results raises questions about the underlying failure mechanisms responsible for widespread outages, and specifically whether these outages are due to a system effect or large-scale component degradation. Finally, a new agent-based market model for bilateral trades in the short-term bulk power market is presented and compared against industry observations.
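
    To illustrate the kind of cascade mechanism such grid studies probe, here is a toy load-redistribution model; it is not the extended power simulator described in the report, and the line loads, capacities, and redistribution rule are simplifying assumptions.

      # Toy cascading-overload model (illustrative only; not the simulator in the record).
      # Each line carries a load and has a capacity; when a line trips, the load of all
      # failed lines is shared equally among survivors, which may overload and trip too.

      def cascade(loads, capacities, initial_failure):
          failed = {initial_failure}
          while True:
              survivors = [i for i in range(len(loads)) if i not in failed]
              if not survivors:
                  break
              shed = sum(loads[i] for i in failed)
              effective = {i: loads[i] + shed / len(survivors) for i in survivors}
              newly_failed = {i for i in survivors if effective[i] > capacities[i]}
              if not newly_failed:
                  break
              failed |= newly_failed
          return failed

      loads = [0.70, 0.80, 0.75, 0.90, 0.60]        # per-unit line loads (made up)
      capacities = [0.90, 1.00, 0.85, 1.00, 0.75]   # per-unit line limits (made up)
      print("Lines lost after tripping line 3:", sorted(cascade(loads, capacities, 3)))

    Even a toy rule like this reproduces the qualitative point in the record: small changes in how load is shed or redistributed (i.e., simple controls) can determine whether a single trip stays local or propagates.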

  17. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor) which has been developed by Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, further validated final safety analysis methodology will be developed. Current licensing safety analysis methodology of the Westinghouse and KSNPP PWRs operating and under development in Korea as well as the Russian licensing safety analysis methodology for the integral reactors have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. Validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended for the nuclear regulatory authority to establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  18. Utilization of critical group and representative person methodologies: differences and difficulties

    International Nuclear Information System (INIS)

    Ferreira, Nelson L.D.; Rochedo, Elaine R.R.; Mazzilli, Barbara P.

    2013-01-01

    In Brazil, the assessment of the environmental impact of routine discharges of radionuclides, used for the protection of the public, is normally based on the determination of the so-called 'critical group'. For the same purpose, the ICRP (2007) proposed the adoption of the 'representative person', defined as the individual receiving a dose representative of the members of the population who are subject to the higher exposures. In this work, the different characteristics of the two concepts (critical group and representative person) are discussed, mainly with respect to their methodologies and the data they require. Some difficulties in obtaining site-specific data, mainly habit data, as well as the way such data are used, are also discussed. The critical group methodology basically uses average values, while the representative person methodology performs deterministic or probabilistic analyses using values obtained from distributions. As a reference case, the predicted effluent releases from the Uranium Hexafluoride Production Plant (USEXA) were considered, and effective doses were calculated for the members of the previously defined critical group of the Centro Experimental Aramar (CEA). (author)
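
    The contrast described above between average values and distribution-based estimates can be sketched with a toy ingestion-dose model; the dose coefficient, habit data, concentrations, and lognormal spreads below are invented for illustration and are not CEA/USEXA values.

      import math
      import random

      random.seed(1)

      # Toy ingestion-dose model: dose = intake * concentration * dose coefficient.
      # All parameter values are assumptions for illustration, not CEA/USEXA data.
      DOSE_COEFF = 5.0e-8        # Sv/Bq, assumed
      MEDIAN_INTAKE = 400.0      # kg/y of the foodstuff, assumed
      MEDIAN_CONC = 2.0          # Bq/kg in the foodstuff, assumed

      # "Critical group" style: one deterministic estimate from typical habit values.
      critical_group_dose = MEDIAN_INTAKE * MEDIAN_CONC * DOSE_COEFF

      # "Representative person" style: propagate habit and concentration distributions
      # and report a high percentile (here the 95th) of the resulting dose distribution.
      doses = []
      for _ in range(100_000):
          intake = random.lognormvariate(math.log(MEDIAN_INTAKE), 0.5)
          conc = random.lognormvariate(math.log(MEDIAN_CONC), 0.4)
          doses.append(intake * conc * DOSE_COEFF)

      doses.sort()
      p95 = doses[int(0.95 * len(doses))]
      print(f"Critical-group (typical values) dose : {critical_group_dose:.2e} Sv/y")
      print(f"Representative-person 95th percentile: {p95:.2e} Sv/y")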

  19. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O' Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  20. Methodology or method? A critical review of qualitative case study reports

    Directory of Open Access Journals (Sweden)

    Nerida Hyett

    2014-05-01

    Full Text Available Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.

  1. Methodology or method? A critical review of qualitative case study reports

    Science.gov (United States)

    Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia

    2014-01-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners. PMID:24809980

  2. Probabilistic methodology for turbine missile risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Frank, R.A.

    1984-01-01

    A methodology has been developed for estimation of the probabilities of turbine-generated missile damage to nuclear power plant structures and systems. Mathematical models of the missile generation, transport, and impact events have been developed and sequenced to form an integrated turbine missile simulation methodology. Probabilistic Monte Carlo techniques are used to estimate the plant impact and damage probabilities. The methodology has been coded in the TURMIS computer code to facilitate numerical analysis and plant-specific turbine missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and probabilities have been estimated for a hypothetical nuclear power plant case study. (orig.)
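
    A minimal sketch of the event chain described above (generation, strike, damage) sampled by Monte Carlo is given below; the probabilities and the single-target hit model are hypothetical, and this is not the TURMIS code or its models.

      import random

      random.seed(42)

      # Minimal Monte Carlo sketch of a turbine-missile damage frequency estimate.
      # All probabilities below are assumed values, not plant data.
      P_GENERATION = 1.0e-4        # missile-generating failure per turbine-year (assumed)
      TARGET_FRACTION = 0.02       # fraction of trajectories that strike the target (assumed)
      P_DAMAGE_GIVEN_HIT = 0.3     # barrier perforation / damage probability (assumed)

      def one_turbine_year():
          if random.random() > P_GENERATION:
              return False         # no missile generated
          if random.random() > TARGET_FRACTION:
              return False         # missile misses the safety-related target
          return random.random() < P_DAMAGE_GIVEN_HIT

      TRIALS = 10_000_000
      damaging = sum(one_turbine_year() for _ in range(TRIALS))
      analytic = P_GENERATION * TARGET_FRACTION * P_DAMAGE_GIVEN_HIT
      print(f"Sampled damage frequency ~ {damaging / TRIALS:.1e} per year "
            f"(analytic {analytic:.1e})")

    Because the end-to-end event is so rare, direct sampling like this is statistically inefficient; staged simulation or variance reduction is normally used in practice, which is presumably one reason the methodology sequences separate generation, transport, and impact models.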

  3. Critical evaluation of methodology commonly used in sample collection, storage and preparation for the analysis of pharmaceuticals and illicit drugs in surface water and wastewater by solid phase extraction and liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Baker, David R; Kasprzyk-Hordern, Barbara

    2011-11-04

    The main aim of this manuscript is to provide a comprehensive and critical verification of methodology commonly used for sample collection, storage and preparation in studies concerning the analysis of pharmaceuticals and illicit drugs in aqueous environmental samples with the usage of SPE-LC/MS techniques. This manuscript reports the results of investigations into several sample preparation parameters that to the authors' knowledge have not been reported or have received very little attention. This includes: (i) effect of evaporation temperature and (ii) solvent with regards to solid phase extraction (SPE) extracts; (iii) effect of silanising glassware; (iv) recovery of analytes during vacuum filtration through glass fibre filters and (v) pre LC-MS filter membranes. All of these parameters are vital to develop efficient and reliable extraction techniques; an essential factor given that target drug residues are often present in the aqueous environment at ng L(-1) levels. Presented is also the first comprehensive review of the stability of illicit drugs and pharmaceuticals in wastewater. Among the parameters studied are: time of storage, temperature and pH. Over 60 analytes were targeted including stimulants, opioid and morphine derivatives, benzodiazepines, antidepressants, dissociative anaesthetics, drug precursors, human urine indicators and their metabolites. The lack of stability of analytes in raw wastewater was found to be significant for many compounds. For instance, 34% of compounds studied reported a stability change >15% after only 12 h in raw wastewater stored at 2 °C; a very important finding given that wastewater is typically collected with the use of 24 h composite samplers. The stability of these compounds is also critical given the recent development of so-called 'sewage forensics' or 'sewage epidemiology' in which concentrations of target drug residues in wastewater are used to back-calculate drug consumption. Without an understanding of stability

  4. Evaluation of speech errors in Putonghua speakers with cleft palate: a critical review of methodology issues.

    Science.gov (United States)

    Jiang, Chenghui; Whitehill, Tara L

    2014-04-01

    Speech errors associated with cleft palate are well established for English and several other Indo-European languages. Few articles describing the speech of Putonghua (standard Mandarin Chinese) speakers with cleft palate have been published in English language journals. Although methodological guidelines have been published for the perceptual speech evaluation of individuals with cleft palate, there has been no critical review of methodological issues in studies of Putonghua speakers with cleft palate. A literature search was conducted to identify relevant studies published over the past 30 years in Chinese language journals. Only studies incorporating perceptual analysis of speech were included. Thirty-seven articles which met inclusion criteria were analyzed and coded on a number of methodological variables. Reliability was established by having all variables recoded for all studies. This critical review identified many methodological issues. These design flaws make it difficult to draw reliable conclusions about characteristic speech errors in this group of speakers. Specific recommendations are made to improve the reliability and validity of future studies, as well to facilitate cross-center comparisons.

  5. Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors

    Science.gov (United States)

    Gheorghiu, A.-D.; Ozunu, A.

    2012-04-01

    The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipelines facilities, are considered European critical infrastructure sectors, as per Annex I of the Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above mentioned Directive. In the field of risk assessment, there are a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou, M. et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) includes the identification of hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial analysis and then a detailed analysis step
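
    To illustrate the criterial-analysis screening step in generic terms, a simple likelihood-consequence ranking can be sketched as follows; the categories, scores, and threshold are assumptions for illustration, not the criteria of the proposed methodology.

      # Minimal likelihood x consequence screening sketch (categories and rankings are
      # illustrative assumptions, not the criteria of the cited methodology).
      LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4}
      CONSEQUENCE = {"minor": 1, "moderate": 2, "major": 3, "catastrophic": 4}

      def risk_rank(likelihood, consequence, detailed_threshold=8):
          score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
          needs_detailed_analysis = score >= detailed_threshold
          return score, needs_detailed_analysis

      hazards = [
          ("tank overfill", "possible", "moderate"),
          ("pipeline rupture near populated area", "unlikely", "catastrophic"),
          ("pump seal leak", "likely", "minor"),
      ]
      for name, lk, cq in hazards:
          score, detailed = risk_rank(lk, cq)
          print(f"{name:40s} score={score:2d} detailed analysis: {detailed}")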

  6. Critical Discourse Analysis and Leadership

    Science.gov (United States)

    Arriaza, Gilberto

    2015-01-01

    This article outlines the need of infusing critical discourse analysis into the preparation and support of prospective school leaders. It argues that in the process of school transformation, the school leader must possess the ability to self-reflect on his/her language and understand the potential power of language as a means that may support or…

  7. Decolonizing Interpretive Research: A Critical Bicultural Methodology for Social Change

    Science.gov (United States)

    Darder, Antonia

    2015-01-01

    This paper introduces a discussion of decolonizing interpretive research in a way that gives greater salience to and understanding of the theoretical efforts of critical bicultural education researchers over the years. Grounded in educational principles that have been derived from critical social theory, a decolonizing approach to theory building,…

  8. New Teaching Techniques to Improve Critical Thinking. The Diaprove Methodology

    Science.gov (United States)

    Saiz, Carlos; Rivas, Silvia F.

    2016-01-01

    The objective of this research is to ascertain whether new instructional techniques can improve critical thinking. To achieve this goal, two different instruction techniques (ARDESOS--group 1--and DIAPROVE--group 2--) were studied and a pre-post assessment of critical thinking in various dimensions such as argumentation, inductive reasoning,…

  9. Comparative analysis of proliferation resistance assessment methodologies

    International Nuclear Information System (INIS)

    Takaki, Naoyuki; Kikuchi, Masahiro; Inoue, Naoko; Osabe, Takeshi

    2005-01-01

    Comparative analysis of the methodologies was performed based on the discussions at the international workshop on 'Assessment Methodology of Proliferation Resistance for Future Nuclear Energy Systems' held in Tokyo in March 2005. Through the workshop and subsequent considerations, it became clear that proliferation resistance assessment methodologies are affected by the broader nuclear options being pursued and also by the political situation of the state. Even the definition of proliferation resistance, despite the commonality of fundamental issues, derives from the perceived threat and the implementation circumstances inherent to the larger programs. A deeper recognition of these differences among communities would support a more substantive and better-harmonized discussion. (author)

  10. Malware Analysis Sandbox Testing Methodology

    Directory of Open Access Journals (Sweden)

    Zoltan Balazs

    2016-01-01

    Full Text Available Manual processing of malware samples became impossible years ago. Sandboxes are used to automate the analysis of malware samples and to gather information about their dynamic behaviour, both at AV companies and at enterprises. Some malware samples use known techniques to detect when they run in a sandbox, but most of these sandbox detection techniques can be easily detected and thus flagged as malicious. I invented new approaches to detect these sandboxes. I developed a tool which can collect a lot of interesting information from these sandboxes in order to build statistics on how the current technologies work. After analysing these results, I will demonstrate tricks to detect sandboxes that cannot easily be flagged as malicious. Some sandboxes do not interact with the Internet in order to block data extraction, but with some DNS-fu the information can be extracted from these appliances as well.

  11. Sampling methodology and PCB analysis

    International Nuclear Information System (INIS)

    Dominelli, N.

    1995-01-01

    As a class of compounds PCBs are extremely stable and resist chemical and biological decomposition. Diluted solutions exposed to a range of environmental conditions will undergo some preferential degradation and the resulting mixture may differ considerably from the original PCB used as insulating fluid in electrical equipment. The structure of mixtures of PCBs (synthetic compounds prepared by direct chlorination of biphenyl with chlorine gas) is extremely complex and presents a formidable analytical problem, further complicated by the presence of PCBs as contaminants in oils to soils to water. This paper provides some guidance into sampling and analytical procedures; it also points out various potential problems encountered during these processes. The guidelines provided deal with sample collection, storage and handling, sample stability, laboratory analysis (usually gas chromatography), determination of PCB concentration, calculation of total PCB content, and quality assurance. 1 fig

  12. Nondestructive assay methodologies in nuclear forensics analysis

    International Nuclear Information System (INIS)

    Tomar, B.S.

    2016-01-01

    In the present chapter, the nondestructive assay (NDA) methodologies used for analysis of nuclear materials as a part of nuclear forensic investigation have been described. These NDA methodologies are based on (i) measurement of passive gamma and neutrons emitted by the radioisotopes present in the nuclear materials, (ii) measurement of gamma rays and neutrons emitted after the active interrogation of the nuclear materials with a source of X-rays, gamma rays or neutrons

  13. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  14. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W. (US Nuclear Regulatory Commission, Washington, DC 20555)

    1985-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 regulation to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  15. Preclosure Criticality Analysis Process Report

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The design approach for criticality of the disposal container and waste package will be dictated by existing regulatory requirements. This conclusion is based on the fact that preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the NRC. The major difference would be the use of a risk-informed approach with burnup credit. This approach could reduce licensing delays and costs of the repository. The probability of success for this proposed seamless licensing strategy is increased, since there is precedence of regulation (10 CFR Part 63 and NUREG 1520) and commercial precedence for allowing burnup credit at sites similar to Yucca Mountain during preclosure. While NUREG 1520 is not directly applicable to a facility for handling spent nuclear fuel, the risk-informed approach to criticality analysis in NUREG 1520 is considered indicative of how the NRC will approach risk-informed criticality analysis at spent fuel facilities in the future. The types of design basis events which must be considered during the criticality safety analysis portion of the Integrated Safety Analysis (ISA) are those events which result in unanticipated moderation, loss of neutron absorber, geometric changes in the critical system, or administrative errors in waste form placement (loading) of the disposal container. The specific events to be considered must be based on the review of the system's design, as discussed in Section 3.2. A transition of licensing approach (e.g., deterministic versus risk-informed, performance-based) is not obvious and will require analysis. For commercial spent nuclear fuel, the probability of interspersed moderation may be low enough to allow nearly the same Critical Limit for both preclosure and postclosure, though an administrative margin will be applied to preclosure and possibly not to postclosure. Similarly the Design Basis Events for the waste package may be incredible and therefore not

  16. Preclosure Criticality Analysis Process Report

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The design approach for criticality of the disposal container and waste package will be dictated by existing regulatory requirements. This conclusion is based on the fact that preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the NRC. The major difference would be the use of a risk-informed approach with burnup credit. This approach could reduce licensing delays and costs of the repository. The probability of success for this proposed seamless licensing strategy is increased, since there is precedence of regulation (10 CFR Part 63 and NUREG 1520) and commercial precedence for allowing burnup credit at sites similar to Yucca Mountain during preclosure. While NUREG 1520 is not directly applicable to a facility for handling spent nuclear fuel, the risk-informed approach to criticality analysis in NUREG 1520 is considered indicative of how the NRC will approach risk-informed criticality analysis at spent fuel facilities in the future. The types of design basis events which must be considered during the criticality safety analysis portion of the Integrated Safety Analysis (ISA) are those events which result in unanticipated moderation, loss of neutron absorber, geometric changes in the critical system, or administrative errors in waste form placement (loading) of the disposal container. The specific events to be considered must be based on the review of the system's design, as discussed in Section 3.2. A transition of licensing approach (e.g., deterministic versus risk-informed, performance-based) is not obvious and will require analysis. For commercial spent nuclear fuel, the probability of interspersed moderation may be low enough to allow nearly the same Critical Limit for both preclosure and postclosure, though an administrative margin will be applied to preclosure and possibly not to postclosure. Similarly the Design Basis Events for the waste package may be incredible and therefore not

  17. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on
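
    The statistical idea behind such a dimensional variation model can be illustrated with a one-dimensional Monte Carlo tolerance stack-up; the contributor chain, standard deviations, and clearance limits below are invented, and the sketch is not the 3DCS model described in the record.

      import random
      import statistics

      random.seed(0)

      # One-dimensional Monte Carlo tolerance stack-up sketch. The real ITER work uses
      # a 3D variation model (3DCS); the chain below is invented for illustration.
      # Each contributor: (nominal, standard deviation) in millimetres, assumed normal.
      CONTRIBUTORS = [(500.0, 0.8), (250.0, 0.5), (-748.0, 0.6)]   # closing gap nominal = 2.0 mm
      GAP_LIMITS = (0.0, 4.0)   # assumed assembly clearance requirement, mm

      def one_assembly():
          return sum(random.gauss(nom, sd) for nom, sd in CONTRIBUTORS)

      gaps = [one_assembly() for _ in range(200_000)]
      out_of_spec = sum(not (GAP_LIMITS[0] <= g <= GAP_LIMITS[1]) for g in gaps)

      print(f"mean gap = {statistics.mean(gaps):.3f} mm, "
            f"std dev = {statistics.pstdev(gaps):.3f} mm, "
            f"out-of-spec fraction = {out_of_spec / len(gaps):.4%}")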

  18. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  19. Constructive Analysis : A Study in Epistemological Methodology

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    The present study is concerned with the viability of the primary method in contemporary philosophy, i.e., conceptual analysis. Starting out by tracing the roots of this methodology to Platonic philosophy, the study questions whether such a methodology makes sense when divorced from Platonic philosophy, and develops a framework for a kind of analysis that is more in keeping with recent psychological research on categorization. Finally, it is shown that this kind of analysis can be applied to the concept of justification in a manner that furthers the epistemological goal of providing intellectual guidance.

  20. A study on safety analysis methodology in spent fuel dry storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Che, M. S.; Ryu, J. H.; Kang, K. M.; Cho, N. C.; Kim, M. S. [Hanyang Univ., Seoul (Korea, Republic of)

    2004-02-15

    The project covers: collection and review of domestic and foreign technology related to spent fuel dry storage facilities; analysis of a reference system; establishment of a framework for criticality safety analysis; review of accident analysis methodology; establishment of accident scenarios; and establishment of a scenario analysis methodology.

  1. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the security automated Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CITM). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  2. Risk Assessment Methodology for Protecting Our Critical Physical Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    BIRINGER,BETTY E.; DANNEELS,JEFFREY J.

    2000-12-13

    Critical infrastructures are central to our national defense and our economic well-being, but many are taken for granted. Presidential Decision Directive (PDD) 63 highlights the importance of eight of our critical infrastructures and outlines a plan for action. Greatly enhanced physical security systems will be required to protect these national assets from new and emerging threats. Sandia National Laboratories has been the lead laboratory for the Department of Energy (DOE) in developing and deploying physical security systems for the past twenty-five years. Many of the tools, processes, and systems employed in the protection of high consequence facilities can be adapted to the civilian infrastructure.

  3. Methodological and Epistemological Criticism on Experimental Accounting Research Published in Brazil

    Directory of Open Access Journals (Sweden)

    Paulo Frederico Homero Junior

    2016-06-01

    Full Text Available In this article, I analyze 17 experimental studies published in Brazilian accounting journals between 2006 and 2015 in order to develop both methodological and epistemological criticism of them. First, I discuss the methodological characteristics of the experiments and the main validity threats they face, analyzing how the selected articles deal with these threats. Overall, this analysis shows a lack of consideration of the validity of the constructs used, difficulty in developing internally valid experiments, and an inability to express confidence in the applicability of the results to contexts other than the experimental one. I then compare the positivist theoretical perspective these articles have in common with constructionist conceptions of the social sciences and criticize them on that basis. I maintain that these articles are characterized by a behaviorist approach, a reified notion of subjectivity, a disregard for cultural and historical specificities, and an axiological commitment to submission rather than to the emancipation of people in relation to management control. The paper contributes to the Brazilian accounting literature in two ways: by raising awareness of the challenges faced in conducting appropriate experimental designs, and by showing how experimental accounting research can be problematic from an epistemological point of view, with the aim of promoting an interparadigmatic debate that fosters greater awareness of the subject and more robust consideration of such issues by future researchers.

  4. PRECLOSURE CRITICALITY ANALYSIS PROCESS REPORT

    International Nuclear Information System (INIS)

    Danise, A.E.

    2004-01-01

    This report describes a process for performing preclosure criticality analyses for a repository at Yucca Mountain, Nevada. These analyses will be performed from the time of receipt of fissile material until permanent closure of the repository (preclosure period). The process describes how criticality safety analyses will be performed for various configurations of waste in or out of waste packages that could occur during preclosure as a result of normal operations or event sequences. The criticality safety analysis considers those event sequences resulting in unanticipated moderation, loss of neutron absorber, geometric changes, or administrative errors in waste form placement (loading) of the waste package. The report proposes a criticality analyses process for preclosure to allow a consistent transition from preclosure to postclosure, thereby possibly reducing potential cost increases and delays in licensing of Yucca Mountain. The proposed approach provides the advantage of using a parallel regulatory framework for evaluation of preclosure and postclosure performance and is consistent with the U.S. Nuclear Regulatory Commission's approach of supporting risk-informed, performance-based regulation for fuel cycle facilities, ''Yucca Mountain Review Plan, Final Report'', and 10 CFR Part 63. The criticality-related criteria for ensuring subcriticality are also described as well as which guidance documents will be utilized. Preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the U.S. Nuclear Regulatory Commission; therefore, the design approach for preclosure criticality safety will be dictated by existing regulatory requirements while using a risk-informed approach with burnup credit for in-package operations

  5. Simplified methodology for Angra 1 containment analysis

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1991-08-01

    A simplified analysis methodology was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The results were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, together with the short computational time required for the simulation, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  6. Complexity, Methodology and Method: Crafting a Critical Process of Research

    Science.gov (United States)

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  7. Taking critical ontology seriously. Implications for Political Science Methodology

    NARCIS (Netherlands)

    Wigger, A.; Horn, L.; Keman, H.; Woldendorp, J.

    2016-01-01

    To be ‘critical’ has become fashionable among social scientists in various disciplines. Only a few decades ago, the prefix ‘critical’ was almost automatically associated with Western Marxism and in particular the Frankfurt School. Today, the term critical is no longer limited to a single theoretical

  8. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    Silva, F.C.A. da.

    1990-01-01

    A critical review of 34 published severe radiological accidents in industrial gamma radiography, which happened in 15 countries from 1960 to 1988, was performed. The most frequent causes, consequences and dose estimation methods were analysed, aiming to establish better radiation safety procedures and accident analysis practices. The objective of this work is to develop a methodology for analyzing radiological accidents in industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of an event and to estimate the dose with good certainty. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programs, was adopted in the elaboration of the suggested methodology. The viability of using the Electron Gamma Shower 4 (EGS4) computer code system to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in ¹⁹²Ir radioactive source handling situations, was also studied. (author)

  9. A review of critical in-flight events research methodology

    Science.gov (United States)

    Giffin, W. C.; Rockwell, T. H.; Smith, P. E.

    1985-01-01

    Pilot's cognitive responses to critical in-flight events (CIFE's) were investigated, using pilots, who had on the average about 2540 flight hours each, in four experiments: (1) full-mission simulation in a general aviation trainer, (2) paper and pencil CIFE tests, (3) interactive computer-aided scenario testing, and (4) verbal protocols in fault diagnosis tasks. The results of both computer and paper and pencil tests showed only 50 percent efficiency in correct diagnosis of critical events. The efficiency in arriving at a diagnosis was also low: over 20 inquiries were made for 21 percent of the scenarios diagnosed. The information-seeking pattern was random, with frequent retracing over old inquiries. The measures for developing improved cognitive skills for CIFE's are discussed.

  10. An Evaluation Methodology for Protocol Analysis Systems

    Science.gov (United States)

    2007-03-01

    (Only fragments of the abstract survive in this record.) Terms and abbreviations: Main Memory Requirement; NS: Needham-Schroeder; NSL: Needham-Schroeder-Lowe; OCaml: Objective Caml; POSIX: Portable Operating System Interface. An evaluation methodology is needed. As with any field, there is a specialized language used within the protocol analysis community. ProVerif requires that Objective Caml (OCaml) be installed on the system; OCaml version 3.09.3 was installed for the Windows configuration.

  11. A Practical Risk Assessment Methodology for Safety-Critical Train Control Systems

    Science.gov (United States)

    2009-07-01

    This project proposes a Practical Risk Assessment Methodology (PRAM) for analyzing railroad accident data and assessing the risk and benefit of safety-critical train control systems. This report documents in simple steps the algorithms and data input...

  12. Research on Methodology to Prioritize Critical Digital Assets based on Nuclear Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Wonjik; Kwon, Kookheui; Kim, Hyundoo [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2016-10-15

    Digital systems are used in nuclear facilities to monitor and control various types of field devices, as well as to obtain and store vital information. It is therefore increasingly important for nuclear facilities to protect digital systems from cyber-attack, in terms of both safe operation and public health, since cyber compromise of these systems could lead to unacceptable radiological consequences. Based on KINAC/RS-015, a cyber security regulatory standard, regulatory activities for cyber security at nuclear facilities generally focus on critical digital assets (CDAs), which are safety, security, and emergency preparedness related digital assets. Critical digital assets are estimated to account for over 60% of all digital assets in a nuclear power plant, so prioritizing them is necessary to improve the efficiency of regulation and implementation. This paper introduces the status of research on a methodology for prioritizing critical digital assets based on nuclear risk assessment. To identify the digital assets that directly affect accidents, PRA results (event trees, fault trees, and minimal cut sets) are analyzed. According to the results of this analysis, the digital systems related to CD (core damage) are the ESF-CCS (safety-related component control system) and the Process-CCS (non-safety-related component control system), as well as the Engineered Safety Features Actuation System (ESFAS). These digital assets can be identified as Vital Digital Assets (VDAs). To develop a general methodology for identifying the accident-related VDAs among the CDAs, two approaches will be studied: (1) a method using the minimal cut set results of the PRA model, and (2) a method quantifying the results of a digital I and C PRA performed so that all digital cabinets related to each system are reflected in the fault trees.
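
    One way to turn minimal cut set results into a priority ranking, sketched here with invented cut sets and basic event probabilities (these are not actual plant PRA data), is to compute Fussell-Vesely importance under the rare-event approximation.

      # Sketch: rank components by Fussell-Vesely importance computed from minimal cut
      # sets under the rare-event approximation. The cut sets and basic event
      # probabilities are invented for illustration, not taken from a plant PRA.
      CUT_SETS = [
          {"ESFAS", "ESF-CCS"},
          {"Process-CCS"},
          {"ESF-CCS", "OperatorError"},
      ]
      BASIC_EVENT_PROB = {
          "ESFAS": 1e-4,
          "ESF-CCS": 5e-4,
          "Process-CCS": 2e-5,
          "OperatorError": 1e-2,
      }

      def cut_set_prob(cut_set):
          p = 1.0
          for event in cut_set:
              p *= BASIC_EVENT_PROB[event]
          return p

      # Rare-event approximation: top event probability ~ sum of cut set probabilities.
      top_event = sum(cut_set_prob(cs) for cs in CUT_SETS)

      def fussell_vesely(component):
          """Fraction of the top event probability carried by cut sets containing the component."""
          return sum(cut_set_prob(cs) for cs in CUT_SETS if component in cs) / top_event

      for comp in sorted(BASIC_EVENT_PROB, key=fussell_vesely, reverse=True):
          print(f"{comp:14s} FV = {fussell_vesely(comp):.3f}")

    Components with the highest importance would be natural candidates for designation as vital digital assets in a risk-informed prioritization of this kind.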

  13. Research on Methodology to Prioritize Critical Digital Assets based on Nuclear Risk Assessment

    International Nuclear Information System (INIS)

    Kim, Wonjik; Kwon, Kookheui; Kim, Hyundoo

    2016-01-01

    Digital systems are used in nuclear facilities to monitor and control various types of field devices, as well as to obtain and store vital information. It is therefore increasingly important for nuclear facilities to protect digital systems from cyber-attack, in terms of both safe operation and public health, since cyber compromise of these systems could lead to unacceptable radiological consequences. Based on KINAC/RS-015, a cyber security regulatory standard, regulatory activities for cyber security at nuclear facilities generally focus on critical digital assets (CDAs), which are safety, security, and emergency preparedness related digital assets. Critical digital assets are estimated to account for over 60% of all digital assets in a nuclear power plant, so prioritizing them is necessary to improve the efficiency of regulation and implementation. This paper introduces the status of research on a methodology for prioritizing critical digital assets based on nuclear risk assessment. To identify the digital assets that directly affect accidents, PRA results (event trees, fault trees, and minimal cut sets) are analyzed. According to the results of this analysis, the digital systems related to CD (core damage) are the ESF-CCS (safety-related component control system) and the Process-CCS (non-safety-related component control system), as well as the Engineered Safety Features Actuation System (ESFAS). These digital assets can be identified as Vital Digital Assets (VDAs). To develop a general methodology for identifying the accident-related VDAs among the CDAs, two approaches will be studied: (1) a method using the minimal cut set results of the PRA model, and (2) a method quantifying the results of a digital I and C PRA performed so that all digital cabinets related to each system are reflected in the fault trees.

  14. Critical Race Design: An Emerging Methodological Approach to Anti-Racist Design and Implementation Research

    Science.gov (United States)

    Khalil, Deena; Kier, Meredith

    2017-01-01

    This article is about introducing Critical Race Design (CRD), a research methodology that centers race and equity at the nucleus of educational opportunities by design. First, the authors define design-based implementation research (DBIR; Penuel, Fishman, Cheng, & Sabelli, 2011) as an equity-oriented education research methodology where…

  15. Critical assessment of jet erosion test methodologies for cohesive soil and sediment

    Science.gov (United States)

    Karamigolbaghi, Maliheh; Ghaneeizad, Seyed Mohammad; Atkinson, Joseph F.; Bennett, Sean J.; Wells, Robert R.

    2017-10-01

    The submerged Jet Erosion Test (JET) is a commonly used technique to assess the erodibility of cohesive soil. Employing a linear excess shear stress equation and impinging jet theory, simple numerical methods have been developed to analyze data collected using a JET to determine the critical shear stress and erodibility coefficient of soil. These include the Blaisdell, Iterative, and Scour Depth Methods, and all have been organized into easy-to-use spreadsheet routines. The analytical framework of the JET and its associated methods, however, are based on many assumptions that may not be satisfied in field and laboratory settings. The main objective of this study is to critically assess this analytical framework and these methodologies. Part of this assessment is to include the effect of flow confinement on the JET. The possible relationship between the derived erodibility coefficient and critical shear stress, a practical tool in soil erosion assessment, is examined, and a review of the deficiencies in the JET methodology also is presented. Using a large database of JET results from the United States and data from the literature, it is shown that each method can generate an acceptable curve fit through the scour depth measurements as a function of time. The analysis shows, however, that the Scour Depth and Iterative Methods may result in physically unrealistic values for the erosion parameters. The effect of flow confinement of the impinging jet increases the derived critical shear stress and decreases the erodibility coefficient by a factor of 2.4 relative to the unconfined flow assumption. For a given critical shear stress, the length of time over which scour depth data are collected also affects the calculation of erosion parameters. In general, there is a lack of consensus relating the derived soil erodibility coefficient to the derived critical shear stress. Although empirical relationships are statistically significant, the calculated erodibility coefficient for a
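
    Every method named above rests on the linear excess shear stress equation, usually written ε = k_d(τ − τ_c). As a hedged illustration only, with invented parameter values, and with the jet-hydraulics step that converts nozzle conditions and scour depth into the applied shear stress omitted:

```python
# Minimal sketch of the linear excess shear stress equation used in JET analysis:
#   erosion rate = k_d * (tau - tau_c)  when tau > tau_c, else 0.
# Parameter values below are illustrative only, not from the cited study.

def erosion_rate(tau_pa: float, k_d: float, tau_c_pa: float) -> float:
    """Erosion rate (m/s) for applied shear stress tau_pa (Pa), erodibility
    coefficient k_d (m^3/(N*s)) and critical shear stress tau_c_pa (Pa)."""
    excess = tau_pa - tau_c_pa
    return k_d * excess if excess > 0.0 else 0.0

if __name__ == "__main__":
    k_d = 1.0e-6   # hypothetical erodibility coefficient, m^3/(N*s)
    tau_c = 2.5    # hypothetical critical shear stress, Pa
    for tau in (1.0, 2.5, 5.0, 10.0):
        print(f"tau = {tau:5.1f} Pa -> erosion rate = {erosion_rate(tau, k_d, tau_c):.2e} m/s")
```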

  16. CRITICAL RADAR: TOOL AND METHODOLOGY FOR EVALUATING CURRENT PROJECTS USING MULTIPLE VARIABLES

    Directory of Open Access Journals (Sweden)

    André M. Ferrari

    2017-06-01

    Full Text Available Many resources are invested in processes for measuring project indicators without, however, giving a clear view of which projects deserve the right attention at the right time. This paper proposes the use of statistics, through the analysis of multiple variables and their interrelationships, to give a better basis to a critical assessment methodology for current projects used in a multinational mining company. The contribution of the research is to report the methodology, called Critical Radar, which is based on a graphical tool with simple operationalization that can support decision making in complex environments and has great flexibility across different market scenarios and possible changes in company guidelines. The tool has great potential to help evaluate current projects due to its characteristics: flexible use in different business areas; a high degree of freedom for improvement; use of a known market tool in its development; ease of viewing the results through charts and notes; and the user's freedom to use any indicators existing in the company, provided they comply with certain statistical data-quality characteristics.

  17. Analysis of the IPEN/MB-01 critical unit based on criticality experiments

    International Nuclear Information System (INIS)

    Santos, Adimir dos; Yamaguchi, Mitsuo; Ferreira, Carlos Roberto; Yoriyaz, Helio

    1995-01-01

    The analysis of the critical loading of the IPEN/MB-01 was performed by using several reactor cell methodologies. The results obtained by using the coupled NJOY/AMPX-II/HAMMER-TECHNION system show the good quality of the available nuclear data files as well as of the methodologies in the Reactor Physics area. The original HAMMER system shows results that are well outside of the desired quality for a cell code. (author), 15 refs, 3 figs, 5 tabs

  18. DOSTOEVSKY'S RELIGIOSITY AS A METHODOLOGICAL PROBLEM OF SOVIET LITERARY CRITICISM

    Directory of Open Access Journals (Sweden)

    Sergey Sergeevich Shaulov

    2012-11-01

    Full Text Available Soviet literary criticism, especially in the first decades after the 1917 Revolution, was quite biased in its treatment of Dostoevsky and his works. The reasons for this bias lie both inside and outside the sphere of political ideology. We suggest that there exists a genetic link between some Soviet readings of Dostoevsky and a number of interpretations made in the author's lifetime. Also analysed are the attempts to 'domesticate' Dostoevsky and adapt his works to drastically different cultural conditions and political norms. It is indicative that this adaptation has always passed through the stage of mythologizing the writer and his works. This mythologization paradoxically became a convergence point for Soviet (Lunacharsky), anti-Soviet (Berdyayev) and purely philosophical (Bakhtin) readings of Dostoevsky. Ultimately, the central Dostoevsky myth in post-revolutionary Russia was a version of Romantic mythology often directly expressed in comparing Dostoevsky with Prometheus. We also look at the negative readings of Dostoevsky, which construed the author as a certain mythological antagonist of the proletariat as the collective messiah. Such readings (exemplified in our article by Pereverzev's and Livshits') point at the ultimate limit of ethical assessment of Dostoevsky from the standpoint of rational secular humanism and the Soviet humanitarian thought as its version. Dostoevsky's artistic practice incorporates this tradition within the intranovel dialogue as just one of the voices and demonstrates its ethical insufficiency, which in its turn provokes the mixed reaction of 'appropriation' and 'rejection' from both Soviet thinkers and their contemporary heirs.

  19. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    Oliveira, Laura Cristina de

    2003-01-01

    In the present work the viability of using neutron activation analysis to perform clinical analyses of urine and blood was checked. The aim of this study is to investigate the biological behavior of animals that have been fed chow doped with natural uranium for a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentrations in biological samples. The quantitative results for urine sediment using NAA were compared with the conventional clinical analysis and the results were compatible. This methodology was also used on bone and body organs such as liver and muscle to help in the interpretation of possible anomalies. (author)

  20. Using Critical Thinking Drills to Teach and Assess Proficiency in Methodological and Statistical Thinking

    Science.gov (United States)

    Cascio, Ted V.

    2017-01-01

    This study assesses the effectiveness of critical thinking drills (CTDs), a repetitious classroom activity designed to improve methodological and statistical thinking in relation to psychological claims embedded in popular press articles. In each of four separate CTDs, students critically analyzed a brief article reporting a recent psychological…

  1. Using Six Sigma methodology to reduce patient transfer times from floor to critical-care beds.

    Science.gov (United States)

    Silich, Stephan J; Wetz, Robert V; Riebling, Nancy; Coleman, Christine; Khoueiry, Georges; Abi Rafeh, Nidal; Bagon, Emma; Szerszen, Anita

    2012-01-01

    In response to concerns regarding delays in transferring critically ill patients to intensive care units (ICU), a quality improvement project, using the Six Sigma process, was undertaken to correct issues leading to transfer delay. To test the efficacy of a Six Sigma intervention to reduce transfer time and establish a patient transfer process that would effectively enhance communication between hospital caregivers and improve the continuum of care for patients. The project was conducted at a 714-bed tertiary care hospital in Staten Island, New York. A Six Sigma multidisciplinary team was assembled to assess areas that needed improvement, manage the intervention, and analyze the results. The Six Sigma process identified eight key steps in the transfer of patients from general medical floors to critical care areas. Preintervention data and a root-cause analysis helped to establish the goal transfer-time limits of 3 h for any individual transfer and 90 min for the average of all transfers. The Six Sigma approach is a problem-solving methodology that resulted in almost a 60% reduction in patient transfer time from a general medical floor to a critical care area. The Six Sigma process is a feasible method for implementing healthcare related quality of care projects, especially those that are complex. © 2011 National Association for Healthcare Quality.
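
    The improvement goals quoted above (no single transfer over 3 hours, and an average of all transfers under 90 minutes) lend themselves to a simple monitoring check. As a hedged sketch only, with hypothetical transfer times in place of the hospital's actual data:

```python
# Minimal sketch: check hypothetical ICU transfer times against the goals cited above
# (no individual transfer over 3 hours, average of all transfers under 90 minutes).
from statistics import mean

transfer_minutes = [75, 120, 95, 160, 60, 85]  # hypothetical post-intervention data

INDIVIDUAL_LIMIT_MIN = 3 * 60   # 3-hour limit for any single transfer
AVERAGE_LIMIT_MIN = 90          # 90-minute limit for the average transfer

worst = max(transfer_minutes)
avg = mean(transfer_minutes)

print(f"worst transfer: {worst} min (limit {INDIVIDUAL_LIMIT_MIN} min) -> "
      f"{'OK' if worst <= INDIVIDUAL_LIMIT_MIN else 'exceeds limit'}")
print(f"average transfer: {avg:.1f} min (limit {AVERAGE_LIMIT_MIN} min) -> "
      f"{'OK' if avg <= AVERAGE_LIMIT_MIN else 'exceeds limit'}")
```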

  2. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-01-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  3. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-10-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  4. An investigation of the critical components of a land ethic: An application of Q methodology

    Science.gov (United States)

    Spradling, Suzanne Shaw

    Scope and method of study. The purpose of this study was to reveal the underlying structure of the beliefs of a sample of environmental educators regarding the critical components of a land or environmental ethic. Participants in the study were 30 environmental educators from seven states. All had been trained in one or more of the following national environmental education programs: Project WILD, Project WET, Project Learning Tree, Leopold Education Project, or Leave No Trace. Ages of the participants ranged from 18 to 63 years. Q methodology directed the study. Each participant completed a Q-sort of 54 statements related to environmental ethics. The data were analyzed using the computer program PQMethod 2.06. This program computed a correlation matrix as input for factor analysis, followed by a VARIMAX rotation. Participant demographic data were collected in order to provide a more complete picture of the revealed structure of beliefs. Findings and conclusions. A three-factor solution was revealed from the analysis of the data. These factors represent the groupings of the participants with like beliefs in reference to the critical components of environmental ethics. Factor one was named Nature's Advocates. These individuals believe in equal rights for all parts of the environment. Factor two was named Nature's Stewards because of the revealed belief that humans were to have dominion over the earth given to them by the creator and that natural resources should be used responsibly. Factor three was named Nature's Romantics because of their belief that nature should be preserved for its aesthetic value and because of their naive approach to conservation. The demographic data added detail to the portrait created from the Q-sort data analysis. It is important, then, to take into consideration what environmental educators believe about environmental ethics in designing a meaningful curriculum that seeks to foster the development of those ethics. This study reveals the beliefs
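
    The analysis pipeline described above (correlating participants' Q-sorts and factor-analyzing the result) can be illustrated outside PQMethod. The sketch below uses random stand-in data, extracts factors by a plain principal-components eigendecomposition, and omits the VARIMAX rotation step, so it illustrates the idea rather than reproducing the study's analysis:

```python
# Minimal sketch of the Q-methodology pipeline: correlate participants' Q-sorts,
# then factor-analyze the person-by-person correlation matrix. Data are random
# stand-ins, and the VARIMAX rotation used by PQMethod is omitted for brevity.
import numpy as np

rng = np.random.default_rng(0)
n_statements, n_participants = 54, 30
# Rows = statements, columns = participants (each column is one Q-sort).
sorts = rng.normal(size=(n_statements, n_participants))

# Person-by-person correlation matrix (Q methodology correlates people, not items).
corr = np.corrcoef(sorts, rowvar=False)          # shape (30, 30)

# Principal-components factor extraction: eigendecomposition of the correlation matrix.
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]                # sort factors by explained variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = 3                                    # the study reports a three-factor solution
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])

print("variance explained by first three factors:",
      np.round(eigvals[:n_factors] / eigvals.sum(), 3))
print("participant loadings (first 5 participants):")
print(np.round(loadings[:5], 2))
```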

  5. The "Critical" Elements of Illness Management and Recovery: Comparing Methodological Approaches.

    Science.gov (United States)

    McGuire, Alan B; Luther, Lauren; White, Dominique; White, Laura M; McGrew, John; Salyers, Michelle P

    2016-01-01

    This study examined three methodological approaches to defining the critical elements of Illness Management and Recovery (IMR), a curriculum-based approach to recovery. Sixty-seven IMR experts rated the criticality of 16 IMR elements on three dimensions: defining, essential, and impactful. Three elements (Recovery Orientation, Goal Setting and Follow-up, and IMR Curriculum) met all criteria for essential and defining and all but the most stringent criteria for impactful. Practitioners should consider competence in these areas as preeminent. The remaining 13 elements met varying criteria for essential and impactful. Findings suggest that criticality is a multifaceted construct, necessitating judgments about model elements across different criticality dimensions.

  6. Requirements Analysis in the Value Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Conner, Alison Marie

    2001-05-01

    The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. With the varied backgrounds and experiences the team brings to the study, come different perspectives on the problem and the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, either regulatory or customer prescribed. This paper will provide insight to the level of rigor applied to a requirements analysis step and give some examples of tools and techniques utilized to ease the management of the requirements and functions those requirements support for highly complex problems.

  7. Cost analysis methodology of spent fuel storage

    International Nuclear Information System (INIS)

    1994-01-01

    The report deals with the cost analysis of interim spent fuel storage; however, it is not intended either to give a detailed cost analysis or to compare the costs of the different options. This report provides a methodology for calculating the costs of different options for interim storage of the spent fuel produced in the reactor cores. Different technical features and storage options (dry and wet, away from reactor and at reactor) are considered and the factors affecting all options defined. The major cost categories are analysed. Then the net present value of each option is calculated and the levelized cost determined. Finally, a sensitivity analysis is conducted taking into account the uncertainty in the different cost estimates. Examples of current storage practices in some countries are included in the Appendices, with description of the most relevant technical and economic aspects. 16 figs, 14 tabs
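
    The calculation steps named above, discounting each option's costs to a net present value and then deriving a levelized cost, can be sketched as follows. All figures (discount rate, cash flows, stored quantities) are invented placeholders, not values from the report:

```python
# Minimal sketch of the cost comparison described above: discount each storage
# option's yearly cash flows to a net present value, then derive a levelized
# cost per unit of spent fuel stored. All numbers are illustrative.

def npv(cash_flows, rate):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cash_flows))

def levelized_cost(cost_flows, quantity_flows, rate):
    """Levelized unit cost: discounted costs divided by discounted quantities."""
    return npv(cost_flows, rate) / npv(quantity_flows, rate)

if __name__ == "__main__":
    rate = 0.05  # hypothetical discount rate
    # Hypothetical options: capital cost (M$) in year 0, then operating costs;
    # quantities are tonnes of spent fuel stored per year.
    options = {
        "dry, at reactor":        ([50.0] + [2.0] * 20, [0.0] + [100.0] * 20),
        "wet, away from reactor": ([80.0] + [1.5] * 20, [0.0] + [100.0] * 20),
    }
    for name, (costs, tonnes) in options.items():
        print(f"{name}: NPV = {npv(costs, rate):.1f} M$, "
              f"levelized cost = {levelized_cost(costs, tonnes, rate) * 1e6:.0f} $/t")
```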

  8. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  9. Methodologies and applications for critical infrastructure protection: State-of-the-art

    International Nuclear Information System (INIS)

    Yusta, Jose M.; Correa, Gabriel J.; Lacal-Arantegui, Roberto

    2011-01-01

    This work provides an update of the state-of-the-art on energy security relating to critical infrastructure protection. For this purpose, this survey is based upon the conceptual view of OECD countries, and specifically in accordance with EU Directive 114/08/EC on the identification and designation of European critical infrastructures, and on the 2009 US National Infrastructure Protection Plan. The review discusses the different definitions of energy security, critical infrastructure and key resources, and shows some of the experiences in countries considered as international references on the subject, including some information-sharing issues. In addition, the paper carries out a complete review of current methodologies, software applications and modelling techniques around critical infrastructure protection in accordance with their functionality in a risk management framework. The study of threats and vulnerabilities in critical infrastructure systems shows two important trends in methodologies and modelling. A first trend relates to the identification of methods, techniques, tools and diagrams to describe the current state of infrastructure. The other trend captures the dynamic behaviour of the infrastructure systems by means of simulation techniques including systems dynamics, Monte Carlo simulation, multi-agent systems, etc. - Highlights: → We examine critical infrastructure protection experiences, systems and applications. → Some international experiences are reviewed, including the EU EPCIP Plan and the US NIPP programme. → We discuss current methodologies and applications on critical infrastructure protection, with emphasis on electric networks.

  10. Choice-Induced Preference Change in the Free-Choice Paradigm: A Critical Methodological Review

    Directory of Open Access Journals (Sweden)

    Keise Izuma

    2013-02-01

    Full Text Available Choices not only reflect our preferences, but they also affect our behavior. The phenomenon of choice-induced preference change has been of interest to cognitive dissonance researchers in social psychology, and more recently, it has attracted the attention of researchers in economics and neuroscience. Preference modulation after the mere act of making a choice has been repeatedly demonstrated over the last 50 years by an experimental paradigm called the free-choice paradigm. However, in 2010, Chen and Risen pointed out a serious methodological flaw in this paradigm, arguing that evidence for choice-induced preference change is still insufficient. Despite the flaw, studies using the traditional free-choice paradigm continue to be published without addressing the criticism. Here, aiming to draw more attention to this issue, we briefly explain the methodological problem, and then describe simple simulation studies that illustrate how the free-choice paradigm produces a systematic pattern of preference change consistent with cognitive dissonance, even without any change in true preference. Our simulation also shows how a different level of noise in each phase of the free-choice paradigm independently contributes to the magnitude of artificial preference change. Furthermore, we review ways of addressing the critique and provide a meta-analysis to show the effect size of choice-induced preference change after addressing the critique. Finally, we review and discuss, based on the results of the simulation studies, how the criticism affects our interpretation of past findings generated from the free-choice paradigm. We conclude that the use of the conventional free-choice paradigm should be avoided in future research and the validity of past findings from studies using this paradigm should be empirically re-established.
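
    The simulation argument summarized above can be reproduced in miniature. The sketch below, with invented noise levels and item counts, keeps true preferences completely fixed and still yields a positive "spread" between chosen and rejected items, which is exactly the artifact attributed to the free-choice paradigm:

```python
# Minimal sketch of the artifact described above: with completely stable "true"
# preferences, the rate-choose-rate free-choice paradigm still yields a positive
# "spread" (chosen items look better, rejected items worse, on re-rating),
# because the choice reveals information that the noisy first ratings missed.
# All parameters (noise levels, item counts) are illustrative.
import random

random.seed(1)
N_ITEMS, RATING_NOISE, N_RUNS = 200, 1.0, 200

def one_run():
    true_pref = [random.gauss(0, 1) for _ in range(N_ITEMS)]
    rate1 = [p + random.gauss(0, RATING_NOISE) for p in true_pref]  # phase 1 ratings
    rate2 = [p + random.gauss(0, RATING_NOISE) for p in true_pref]  # phase 3 re-ratings
    # Phase 2: pair items whose first ratings are similar, then "choose" the item
    # with the higher true preference (plus a little decision noise).
    order = sorted(range(N_ITEMS), key=lambda i: rate1[i])
    spreads = []
    for a, b in zip(order[::2], order[1::2]):
        score_a = true_pref[a] + random.gauss(0, 0.3)
        score_b = true_pref[b] + random.gauss(0, 0.3)
        chosen, rejected = (a, b) if score_a > score_b else (b, a)
        spreads.append((rate2[chosen] - rate1[chosen]) - (rate2[rejected] - rate1[rejected]))
    return sum(spreads) / len(spreads)

avg_spread = sum(one_run() for _ in range(N_RUNS)) / N_RUNS
print(f"mean spread with no true preference change: {avg_spread:.3f} (> 0 is the artifact)")
```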

  11. Assessment of critical minerals: Updated application of an early-warning screening methodology

    Science.gov (United States)

    McCullough, Erin A.; Nassar, Nedal

    2017-01-01

    Increasing reliance on non-renewable mineral resources reinforces the need for identifying potential supply constraints before they occur. The US National Science and Technology Council recently released a report that outlines a methodology for screening potentially critical minerals based on three indicators: supply risk (R), production growth (G), and market dynamics (M). This early-warning screening was initially applied to 78 minerals across the years 1996 to 2013 and identified a subset of minerals as “potentially critical” based on the geometric average of these indicators—designated as criticality potential (C). In this study, the screening methodology has been updated to include data for 2014, as well as to incorporate revisions and modifications to the data, where applicable. Overall, C declined in 2014 for the majority of minerals examined largely due to decreases in production concentration and price volatility. However, the results vary considerably across minerals, with some minerals, such as gallium, recording increases for all three indicators. In addition to assessing magnitudinal changes, this analysis also examines the significance of the change relative to historical variation for each mineral. For example, although mined nickel’s R declined modestly in 2014 in comparison to that of other minerals, it was by far the largest annual change recorded for mined nickel across all years examined and is attributable to Indonesia’s ban on the export of unprocessed minerals. Based on the 2014 results, 20 minerals with the highest C values have been identified for further study including the rare earths, gallium, germanium, rhodium, tantalum, and tungsten.
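
    The screening indicator described above, criticality potential C as the geometric average of supply risk (R), production growth (G), and market dynamics (M), reduces to a one-line formula. The sketch below uses invented indicator scores rather than the published values:

```python
# Minimal sketch of the screening indicator described above: criticality potential C
# as the geometric mean of supply risk (R), production growth (G), and market
# dynamics (M). Indicator values are invented placeholders, not the 2014 results.

def criticality_potential(r: float, g: float, m: float) -> float:
    """Geometric mean of the three screening indicators."""
    return (r * g * m) ** (1.0 / 3.0)

minerals = {  # hypothetical indicator scores on a common 0-1 scale
    "gallium":  (0.8, 0.7, 0.6),
    "nickel":   (0.4, 0.3, 0.5),
    "tungsten": (0.7, 0.5, 0.6),
}

for name, (r, g, m) in sorted(minerals.items(),
                              key=lambda kv: criticality_potential(*kv[1]),
                              reverse=True):
    print(f"{name:8s}  C = {criticality_potential(r, g, m):.2f}")
```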

  12. Environmental impact statement analysis: dose methodology

    International Nuclear Information System (INIS)

    Mueller, M.A.; Strenge, D.L.; Napier, B.A.

    1981-01-01

    Standardized sections and methodologies are being developed for use in environmental impact statements (EIS) for activities to be conducted on the Hanford Reservation. Five areas for standardization have been identified: routine operations dose methodologies, accident dose methodology, Hanford Site description, health effects methodology, and socioeconomic environment for Hanford waste management activities

  13. Methodology for prioritizing cyber-vulnerable critical infrastructure equipment and mitigation strategies.

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Lon Andrew; Stinebaugh, Jennifer A.

    2010-04-01

    The Department of Homeland Security (DHS), National Cyber Security Division (NSCD), Control Systems Security Program (CSSP), contracted Sandia National Laboratories to develop a generic methodology for prioritizing cyber-vulnerable, critical infrastructure assets and the development of mitigation strategies for their loss or compromise. The initial project has been divided into three discrete deliverables: (1) A generic methodology report suitable to all Critical Infrastructure and Key Resource (CIKR) Sectors (this report); (2) a sector-specific report for Electrical Power Distribution; and (3) a sector-specific report for the water sector, including generation, water treatment, and wastewater systems. Specific reports for the water and electric sectors are available from Sandia National Laboratories.

  14. Autoclave nuclear criticality safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    D'Aquila, D.M. [Martin Marietta Energy Systems, Inc., Piketon, OH (United States)]; Tayloe, R.W. Jr. [Battelle, Columbus, OH (United States)]

    1991-12-31

    Steam-heated autoclaves are used in gaseous diffusion uranium enrichment plants to heat large cylinders of UF6. Nuclear criticality safety for these autoclaves is evaluated. To enhance criticality safety, systems are incorporated into the design of autoclaves to limit the amount of water present. These safety systems also increase the likelihood that any UF6 inadvertently released from a cylinder into an autoclave is not released to the environment. Up to 140 pounds of water can be held up in large autoclaves. This mass of water is sufficient to support a nuclear criticality when optimally combined with 125 pounds of UF6 enriched to 5 percent U-235. However, water in autoclaves is widely dispersed as condensed droplets and vapor, and is extremely unlikely to form a critical configuration with released UF6.

  15. Auto-Interviewing, Auto-Ethnography and Critical Incident Methodology for Eliciting a Self-Conceptualised Worldview

    Directory of Open Access Journals (Sweden)

    Béatrice Boufoy-Bastick

    2004-01-01

    Full Text Available Knowing oneself has been an age-old humanistic concern for many western and oriental philosophers. The same concern is now shared by modern psychologists and anthropologists who seek to understand the "self" and others by elucidating their worldviews. This paper presents an auto-anthropological methodology which can effectively elucidate one's worldview. This introspective qualitative methodology uses integratively three methodological processes, namely auto-interviewing, auto-ethnography and critical incident technique to elicit baseline cultural data. The paper reports on how this methodology was used to elicit my current worldview. It first explains how emic data were educed and rendered in emotionally enhanced narratives, which were then deconstructed to elicit the major recurring themes in the etic interpretive content analysis. To illustrate this auto-anthropological methodology, two cultural life events have been used: a critical incident in Singapore and a consciousness raising process in Fiji. The first event revealed my own education ideology while the second made me realise my mitigated support for cultural diversity. URN: urn:nbn:de:0114-fqs0401371

  16. Critical Race Theory-Social Constructivist Bricolage: A Health-Promoting Schools Research Methodology

    Science.gov (United States)

    Nyika, Lawrence; Murray-Orr, Anne

    2017-01-01

    While the current literature recognises the capacity of diverse methodologies to provide informative understandings of health-promoting schools (HPS), there is a paucity of examples to show how different research strategies can be used. We address this knowledge gap by examining the significance of a critical race theory-social constructivist…

  17. Towards a Critical Health Equity Research Stance: Why Epistemology and Methodology Matter More than Qualitative Methods

    Science.gov (United States)

    Bowleg, Lisa

    2017-01-01

    Qualitative methods are not intrinsically progressive. Methods are simply tools to conduct research. Epistemology, the justification of knowledge, shapes methodology and methods, and thus is a vital starting point for a critical health equity research stance, regardless of whether the methods are qualitative, quantitative, or mixed. In line with…

  18. M&A Performance and Economic Impact: Integration and Critical Assessment of Methodological Approach

    Directory of Open Access Journals (Sweden)

    Karolis Andriuskevicius

    2017-11-01

    Full Text Available Purpose of the article: Existing methodologies employed within the M&A performance framework are investigated and critically discussed. Methodology/methods: The research has been carried out as a structured assessment of past literature. The findings from scientific articles and studies by various scholars have been categorized, grouped and summarized to discern a meta-analytic view of the work carried out to date. Scientific aim: The conducted research seeks to ascertain and theoretically evaluate existing methodologies used in empirical studies that would allow a proper and critical understanding of the results of various findings in the holistic and global M&A area. Findings: The research elaborates on several key developments in M&A methodology and performance studies carried out in empirical works during the last two decades. The findings help to independently and objectively assess the performance of M&A from a holistic perspective. Conclusions: Each methodology measuring either M&A performance at the corporate level or the effects of M&A at the economy level should be interpreted and relied on with caution, as each of them has its limitations, while application of these methodologies is subject to data availability and is case specific.

  19. Performance management in healthcare: a critical analysis.

    Science.gov (United States)

    Hewko, Sarah J; Cummings, Greta G

    2016-01-01

    Purpose - The purpose of this paper is to explore the underlying theoretical assumptions and implications of current micro-level performance management and evaluation (PME) practices, specifically within health-care organizations. PME encompasses all activities that are designed and conducted to align employee outputs with organizational goals. Design/methodology/approach - PME, in the context of healthcare, is analyzed through the lens of critical theory. Specifically, Habermas' theory of communicative action is used to highlight some of the questions that arise in looking critically at PME. To provide a richer definition of key theoretical concepts, the authors conducted a preliminary, exploratory hermeneutic semantic analysis of the key words "performance" and "management" and of the term "performance management". Findings - Analysis reveals that existing micro-level PME systems in health-care organizations have the potential to create a workforce that is compliant, dependent, technically oriented and passive, and to support health-care systems in which inequalities and power imbalances are perpetually reinforced. Practical implications - At a time when the health-care system is under increasing pressure to provide high-quality, affordable services with fewer resources, it may be wise to investigate new sector-specific ways of evaluating and managing performance. Originality/value - In this paper, written for health-care leaders and health human resource specialists, the theoretical assumptions and implications of current PME practices within health-care organizations are explored. It is hoped that readers will be inspired to support innovative PME practices within their organizations that encourage peak performance among health-care professionals.

  20. Studying creativity training programs: A methodological analysis

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

    Throughout decades of creativity research, a range of creativity training programs have been developed, tested, and analyzed. In 2004 Scott and colleagues published a meta-analysis of all creativity training programs to date, and the review presented here set out to identify and analyze studies published since that seminal 2004 review. Focusing on quantitative studies of creativity training programs for adults, our systematic review resulted in 22 publications. All studies were analyzed, but comparing the reported effectiveness of training across studies proved difficult due to methodological inconsistencies, variations in the reporting of results, and the types of measures used. Thus a consensus for future studies is called for to answer the question: which elements make one creativity training program more effective than another? This is a question of equal relevance to academia and industry...

  1. Methodologies for risk analysis in slope instability

    International Nuclear Information System (INIS)

    Bernabeu Garcia, M.; Diaz Torres, J. A.

    2014-01-01

    This paper provides an overview of the different methodologies used in producing landslide risk maps, so that the reader can gain a basic knowledge of how to proceed in their development. Landslide hazard maps are increasingly demanded by governments because, due to climate change, deforestation and the pressure exerted by the growth of urban centers, the damage caused by natural phenomena increases each year, making this a field of study of growing importance. To explain the mapping process, each of the phases of which it is composed is covered: from the study of the types of slope movements and the necessary handling of geographic information systems (GIS), to landslide inventories and the analysis of susceptibility, hazard, vulnerability and risk. (Author)

  2. Critical analysis of the cranking

    International Nuclear Information System (INIS)

    Hamamoto, Ikuko

    1985-01-01

    Problems, success and shortcomings of the cranking model are discussed by choosing the following four critical topics: 1) the interaction between the ground- and the S-band, 2) vanishing M1 transition moments, 3) the relation between the signature-dependence of the ΔI=1 E2 transition rates in odd-A nuclei and the deviation of nuclear shape from axial symmetry, and 4) the quantum effect on rotational motion, especially on moments of inertia for triaxial shape. (orig.)

  3. Formation of the methodological matrix of the strategic analysis of the enterprise

    Directory of Open Access Journals (Sweden)

    N.H. Vygovskaya

    2018-04-01

    Full Text Available The article is devoted to the study of the methodological matrix of the strategic analysis of the enterprise. The aim of this article is to analyze the influence of the methodological changes of the 20th century on the methodology of strategic analysis, and to critically assess and generalize scientific approaches to its methods. The evaluation of scientific works on analysis made it possible to identify the following problems in the methodology of strategic analysis: the lack of consideration of the features of strategic analysis in the formation of its methods, which often leads to confusion with the methods of financial (economic, thrifty) analysis; the failure to use the fact that strategic analysis contains, besides the methods of analyzing the internal and external environment, methods of forecast analysis aimed at forming the strategy for the development of the enterprise; the identification of the concepts «image», «reception» and «method» of analysis; the multidirectionality and indistinctness of the classification criteria for the methods of strategic analysis; and the blind copying of foreign techniques and methods of strategic analysis without taking into account the specifics of domestic economic conditions. The expediency of using the system approach in forming the methodological design of strategic analysis is proved, which allows combining methodology as a science of methods (a broad approach to the methods of strategic analysis) with methodology as a set of applied methods and techniques of analysis (a narrow approach to methodology). The use of the system approach allowed three levels of the methodology of strategic analysis to be distinguished. The first and second levels of the methodology correspond to the level of science, the third level to practice. When developing the third level of special methods of strategic analysis, an approach is applied that differentiates them depending on the stages of strategic analysis (methods of the stage

  4. Criticality calculations for safety analysis

    International Nuclear Information System (INIS)

    Vellozo, S.O.

    1981-01-01

    Criticality studies of uranium nitrate and plutonium nitrate aqueous solutions were performed. For the uranium compound, three basic computer codes were used: GAMTEC-II, DTF-IV and KENO-IV. Water was used as the reflector, and the results obtained with the different computer codes were analyzed and compared with the 'Handbuch zur Kritikalität'. The cross sections and the cylindrical geometry were generated by the GAMTEC-II computer code. For the second compound, the thickness of the vessel containing plutonium nitrate was used, with rectangular geometry and a concrete reflector. The effective multiplication constant was calculated with the GAMTEC-II and KENO-IV library. The results show many differences. (E.G) [pt
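
    Comparisons of k-eff results from different codes, as mentioned above, are commonly expressed as reactivity differences in pcm, Δρ = (k1 − k2)/(k1·k2) × 10^5. This is a standard conversion rather than a step stated in the abstract, and the k-eff values below are invented placeholders, not results from the study:

```python
# Minimal sketch of how k_eff results from different codes are typically compared:
# the reactivity difference in pcm, delta_rho = (k1 - k2) / (k1 * k2) * 1e5.
# The k_eff values below are invented placeholders, not results from the study.

def reactivity_difference_pcm(k1: float, k2: float) -> float:
    """Reactivity difference between two multiplication factors, in pcm."""
    return (k1 - k2) / (k1 * k2) * 1e5

results = {"GAMTEC-II": 1.0023, "DTF-IV": 0.9987, "KENO-IV": 1.0005}  # hypothetical
reference = "KENO-IV"

for code, k in results.items():
    if code != reference:
        diff = reactivity_difference_pcm(k, results[reference])
        print(f"{code:9s} vs {reference}: {diff:+.0f} pcm")
```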

  5. Validation the methodology calculate critical position of control rods to the critical facility IPEN/MB-01

    International Nuclear Information System (INIS)

    Lopez Aldama, D.; Rodriguez Gual, R.

    1998-01-01

    The present work intends to validate the models and programs used at the Nuclear Technology Center for calculating the critical position of control rods by means of the analysis of the measurements performed at the critical facility IPEN/MB-01. The lattice calculations were carried out with the WIMS/D4 code, and for the global calculations the diffusion code SNAP-3D was used
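
    A typical final step in such a validation is locating the rod position at which the calculated k-eff crosses unity and comparing it with the measured critical position. The sketch below uses invented k-eff values and an invented measured position; in practice the k-eff values would come from the WIMS/D4 and SNAP-3D calculation chain:

```python
# Minimal sketch of the final step of such a validation: given k_eff values
# computed at several control-rod positions, interpolate to estimate the rod
# position where k_eff = 1 and compare it with the measured critical position.
# All numbers are invented placeholders.
import numpy as np

rod_withdrawal_percent = np.array([50.0, 55.0, 60.0, 65.0, 70.0])
k_eff_calculated = np.array([0.9920, 0.9965, 1.0005, 1.0042, 1.0078])

# k_eff increases monotonically with withdrawal here, so a simple interpolation
# of position as a function of k_eff is enough for the sketch.
predicted_critical_position = np.interp(1.0, k_eff_calculated, rod_withdrawal_percent)

measured_critical_position = 59.5  # hypothetical experimental value
print(f"predicted critical withdrawal: {predicted_critical_position:.2f} %")
print(f"deviation from measurement:    "
      f"{predicted_critical_position - measured_critical_position:+.2f} %")
```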

  6. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSALOF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Full Text Available Content analysis (CA) is a popular approach among researchers from different areas, but is incipient in project management (PM). However, the volume of usage apparently does not translate into application quality. The method receives constant criticism about the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigates the use of CA in PM research. To accomplish this goal, a systematic literature review is combined with CA of 23 articles from the EBSCO database covering the last 20 years (1996-2016). The findings show that the proposed framework can help researchers better apply CA and suggest that the use of the method, in terms of both quantity and quality, in PM research should be expanded. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM over the last 20 years.

  7. Risk assessment methodology for extreme wind and missile effects on critical facilities

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1985-01-01

    The TORMIS methodology has been applied to a number of probabilistic risk assessments of critical facilities in the continental United States. These analyses have centered on the estimation of tornado missile impact and damage risks to individual targets as well as to groups of targets at specific plants. A number of advancements and generalizations in the approach have recently been made. These include: (1) generalization of windfield options to include straight winds (WINMIS) and hurricanes (HURMIS); (2) generalization of the scoring to enable analysis of Boolean system expressions for damage probabilities on compound series and parallel safety trains; (3) generalization of the failure criteria to include wind pressure as well as missile impact; (4) generalization of the plant modeling capability to enable more detailed treatment of targets partially or fully enclosed by vulnerable cladding and to allow tracking of missiles inside such enclosures; and (5) incorporation of windspeed criteria for structural failure and subsequent production of potential missiles. This paper will present some of the basic theory and key results of recent TORMIS, WINMIS, and HURMIS applications. The influence of uncertainties in the estimation process and the data needed for plant-specific risk assessments will also be discussed
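
    The "Boolean system expressions for damage probabilities on compound series and parallel safety trains" mentioned in item (2) above can be illustrated with a small Monte Carlo sketch. The per-target damage probabilities and the example system logic below are invented; in TORMIS-type analyses they would come from the simulated windfield and missile-impact models:

```python
# Minimal sketch of the Boolean scoring step: estimating the probability that a
# plant system is damaged when its safety trains are combinations of series ("and")
# and parallel ("or") targets. Per-target damage probabilities per tornado strike
# are invented and treated as independent, purely for illustration.
import random

random.seed(7)

# Hypothetical per-strike damage probabilities for individual targets.
p_damage = {"pump_A": 0.02, "pump_B": 0.02, "control_cabinet": 0.01, "tank": 0.05}

def system_damaged(hit):
    """System fails if the tank is damaged, OR both redundant pumps are damaged,
    OR the control cabinet is damaged (a simple series/parallel Boolean expression)."""
    return hit["tank"] or (hit["pump_A"] and hit["pump_B"]) or hit["control_cabinet"]

N = 200_000
failures = 0
for _ in range(N):
    hit = {target: random.random() < p for target, p in p_damage.items()}
    failures += system_damaged(hit)

print(f"estimated conditional system damage probability per strike: {failures / N:.4f}")
```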

  8. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. Firstly, the gap in contemporary Business Process Modeling approaches is identified, and general modeling principles which can fill this gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of the process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems which were met during the project. Concluding from these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  9. Applicability of risk-informed criticality methodology to spent fuel repositories

    International Nuclear Information System (INIS)

    Mays, C.; Thomas, D.A.; Favet, D.

    2000-01-01

    An important objective of geologic disposal is keeping the fissionable material in a condition so that a self-sustaining nuclear chain reaction (criticality) is highly unlikely. This objective supports the overall performance objective of any repository, which is to protect the health and safety of the public by limiting radiological exposure. This paper describes a risk-informed, performance-based methodology, which combines deterministic and probabilistic approaches for evaluating the criticality potential of high-level waste and spent nuclear fuel after the repository is sealed and permanently closed (postclosure). (authors)

  10. The Challenges of Participant Photography: A Critical Reflection on Methodology and Ethics in Two Cultural Contexts.

    Science.gov (United States)

    Murray, Linda; Nash, Meredith

    2017-05-01

    Photovoice and photo-elicitation are two common methods of participant photography used in health research. Although participatory photography has many benefits, this critical reflection provides fellow researchers with insights into the methodological and ethical challenges faced when using such methods. In this article, we critically reflect on two studies that used participatory photography in different cultural contexts. The first study used photo-elicitation to investigate mothers' experiences of infant settling in central Vietnam. The second study used photovoice to explore pregnant embodiment in Australia. Following a discussion of the literature and a detailed overview of the two studies, we examine the methodological challenges in using participant photography before, during and after each study. This is followed by a discussion of ethical concerns that arose in relation to the burden of participation, confidentiality, consent, and the photographing of families and children. To conclude, we highlight implications for using participatory photography in other settings.

  11. A Review of Citation Analysis Methodologies for Collection Management

    Science.gov (United States)

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  12. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)]

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  13. Critical Discourse Analysis in Literacy Education: A Review of the Literature

    Science.gov (United States)

    Rogers, Rebecca; Schaenen, Inda

    2014-01-01

    This article is a critical, integrative literature review of scholarship in literacy studies from 2004 to 2012 that draws on critical discourse analysis (CDA). We discuss key issues, trends, and criticisms in the field. Our methodology was carried out in three stages. First, we searched educational databases to locate literacy-focused CDA…

  14. Dynamical analysis of critical assembly CC-1

    International Nuclear Information System (INIS)

    Aleman Fernandez, J.R.

    1990-01-01

    The computer code CC-1, developed for the analysis of transients in critical assemblies, is described. The results produced by the program are compared with those presented in the Safety Report for the Critical Assembly of the 'La Quebrada' Nuclear Research Centre (CIN). 7 refs

  15. Methodological systematic review: mortality in elderly patients with cervical spine injury: a critical appraisal of the reporting of baseline characteristics, follow-up, cause of death, and analysis of risk factors.

    NARCIS (Netherlands)

    Middendorp, J.J. van; Albert, T.J.; Veth, R.P.H.; Hosman, A.J.F.

    2010-01-01

    STUDY DESIGN: Methodologic systematic review. OBJECTIVE: To determine the validity of reported risk factors for mortality in elderly patients with cervical spine injury. SUMMARY OF BACKGROUND DATA: In elderly patients with cervical spine injury, mortality has frequently been associated with the type

  16. A Methodology and Toolkit for Deploying Reliable Security Policies in Critical Infrastructures

    Directory of Open Access Journals (Sweden)

    Faouzi Jaïdi

    2018-01-01

    Full Text Available Substantial advances in Information and Communication Technologies (ICT) bring out novel concepts, solutions, trends, and challenges to integrate intelligent and autonomous systems in critical infrastructures. A new generation of ICT environments (such as smart cities, Internet of Things, edge-fog-social-cloud computing, and big data analytics) is emerging; it has different applications to critical domains (such as transportation, communication, finance, commerce, and healthcare) and different interconnections via multiple layers of public and private networks, forming a grid of critical cyberphysical infrastructures. Protecting sensitive and private data and services in critical infrastructures is, at the same time, a main objective and a great challenge for deploying secure systems. It essentially requires setting up trusted security policies. Unfortunately, security solutions should remain compliant and regularly updated to follow and track the evolution of security threats. To address this issue, we propose an advanced methodology for deploying and monitoring the compliance of trusted access control policies. Our proposal extends the traditional life cycle of access control policies with pertinent activities. It integrates formal and semiformal techniques allowing the specification, the verification, the implementation, the reverse-engineering, the validation, the risk assessment, and the optimization of access control policies. To automate and facilitate the practice of our methodology, we introduce our system SVIRVRO that allows managing the extended life cycle of access control policies. We refer to an illustrative example to highlight the relevance of our contributions.
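
    One element of the methodology described above, monitoring whether a deployed access control policy still complies with the trusted reference policy, can be illustrated with a toy example. The rules and role names below are invented, and the real approach relies on formal and semiformal models rather than plain set comparison:

```python
# Minimal sketch of the compliance-monitoring idea: compare an access-control
# policy reverse-engineered from a deployed system with the trusted reference
# policy and report deviations. Rules are invented (subject, action, resource)
# triples, purely for illustration.

reference_policy = {
    ("operator", "read", "sensor_data"),
    ("operator", "write", "setpoints"),
    ("auditor", "read", "logs"),
}

deployed_policy = {
    ("operator", "read", "sensor_data"),
    ("operator", "write", "setpoints"),
    ("contractor", "write", "setpoints"),   # not in the reference policy
}

def compliance_report(reference, deployed):
    """Return rules that were added to or removed from the deployed policy."""
    return {
        "unauthorized_rules": sorted(deployed - reference),  # potential privilege creep
        "missing_rules": sorted(reference - deployed),       # functionality/safety gaps
    }

if __name__ == "__main__":
    report = compliance_report(reference_policy, deployed_policy)
    for kind, rules in report.items():
        print(kind)
        for rule in rules:
            print("  ", rule)
```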

  17. Critical Analysis of a Website: A Critique based on Critical Applied Linguistics and Critical Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Rina Agustina

    2013-05-01

    Full Text Available E-learning is easily found by browsing the internet, is mostly free of charge, and provides various learning materials. Spellingcity.com is one such e-learning website for teaching and learning English spelling, vocabulary and writing, which offers various games and activities for young learners, 6 to 8 year old learners in particular. Having considered those constraints, this paper aims to analyse the website from two different views: (1) critical applied linguistics (CAL) aspects and (2) critical discourse analysis (CDA). After analysing the website using CAL and CDA, it was found that the website is adequate for beginners, in that it provides fun learning through games as well as challenging learners to test their vocabulary. Despite these strengths, several issues require further thought in terms of learners' broader knowledge; for example, some of the learning materials focus on states in America, which is quite difficult for EFL learners who do not have adequate general knowledge. Thus, the findings imply that the website could be used as supporting learning material to accompany textbooks and vocabulary exercise books.

  18. Criticality Analysis of SAMOP Subcritical Assembly

    International Nuclear Information System (INIS)

    Tegas-Sutondo; Syarip; Triwulan-Tjiptono

    2005-01-01

    A criticality analysis has been performed for a homogeneous system of uranyl nitrate solution, as part of a preliminary design assessment of the neutronic aspects of the SAMOP subcritical assembly. The analysis is intended to determine critical parameters such as the minimum critical dimension and the critical mass for the desired concentration. As the basis of this analysis, a fuel system with an enrichment of 20% has been defined, in cylindrical geometry, both bare and reflected by graphite of 30 cm thickness. The MCNP code has been utilized for this purpose, for concentrations ranging from 150 g/l to 500 g/l. It is found that the concentration giving the minimum geometrical dimension is around 400 g/l for both the bare and reflected systems, while the minimum critical mass corresponds to a concentration of around 200 g/l, with critical masses of around 14.1 kg and 4.2 kg for the bare and reflected systems respectively. Based on the results of the calculations, it is concluded that, taking the criticality limits into consideration, the SAMOP subcritical assembly can be built from the neutronic point of view. (author)
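
    The parametric scan summarized above amounts to tabulating critical mass against solution concentration and picking the minimum. The sketch below shows only that bookkeeping step, with invented placeholder values standing in for the MCNP results:

```python
# Minimal sketch of the parametric-scan bookkeeping described above: tabulate
# critical mass versus uranyl nitrate concentration (these would come from MCNP
# criticality runs; the values below are invented placeholders) and pick the
# concentration giving the smallest critical mass.

# (concentration g/l, critical mass kg) for the reflected configuration
scan_results = [
    (150, 5.1),
    (200, 4.2),
    (250, 4.5),
    (300, 5.0),
    (400, 6.3),
    (500, 7.8),
]

best_conc, best_mass = min(scan_results, key=lambda row: row[1])
print(f"minimum critical mass {best_mass} kg at about {best_conc} g/l")
```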

  19. Critical ethnography: An under-used research methodology in neuroscience nursing.

    Science.gov (United States)

    Ross, Cheryl; Rogers, Cath; Duff, Diane

    2016-01-01

    Critical ethnography is a qualitative research method that endeavours to explore and understand dominant discourses that are seen as being the 'right' way to think, see, talk about or enact a particular 'action' or situation in society and recommend ways to re-dress social power inequities. In health care, vulnerable populations, including many individuals who have experienced neurological illnesses or injuries that leave them susceptible to the influence of others, would be suitable groups for study using critical ethnography methodology. Critical ethnography has also been used to study workplace culture. While ethnography has been effectively used to underpin other phenomena of interest to neuroscience nurses, only one example of the use of critical ethnography exists in the published literature related to neuroscience nursing. In our "Research Corner" in this issue of the Canadian Journal of Neuroscience Nursing (CJNN) our guest editors, Dr. Cheryl Ross and Dr. Cath Rogers will briefly highlight the origins of qualitative research, ethnography, and critical ethnography and describe how they are used and, as the third author, I will discuss the relevance of critical ethnography findings for neuroscience nurses.

  20. Methodology for Mode Selection in Corridor Analysis of Freight Transportation

    OpenAIRE

    Kanafani, Adib

    1984-01-01

    The purpose of this report is to outline a methodology for the analysis of mode selection in freight transportation. This methodology is intended to form part of transportation corridor analysis, a component of demand analysis that is part of a national transportation process. The methodological framework presented here provides a basis on which specific models and calculation procedures might be developed. It also provides a basis for the development of a data management system suitable for co...

  1. Critical dialogical approach: A methodological direction for occupation-based social transformative work.

    Science.gov (United States)

    Farias, Lisette; Laliberte Rudman, Debbie; Pollard, Nick; Schiller, Sandra; Serrata Malfitano, Ana Paula; Thomas, Kerry; van Bruggen, Hanneke

    2018-05-03

    Calls for embracing the potential and responsibility of occupational therapy to address socio-political conditions that perpetuate occupational injustices have materialized in the literature. However, to reach beyond traditional frameworks informing practices, this social agenda requires the incorporation of diverse epistemological and methodological approaches to support action commensurate with social transformative goals. Our intent is to present a methodological approach that can help extend the ways of thinking or frameworks used in occupational therapy and science to support the ongoing development of practices with and for individuals and collectives affected by marginalizing conditions. We describe the epistemological and theoretical underpinnings of a methodological approach drawing on Freire and Bakhtin's work. Integrating our shared experience taking part in an example study, we discuss the unique advantages of co-generating data using two methods aligned with this approach: dialogical interviews and critical reflexivity. Key considerations when employing this approach are presented, based on its proposed epistemological and theoretical stance and our shared experiences engaging in it. A critical dialogical approach offers one way forward in expanding occupational therapy and science scholarship by promoting collaborative knowledge generation and examination of taken-for-granted understandings that shape individuals' assumptions and actions.

  2. Critical analysis of science textbooks evaluating instructional effectiveness

    CERN Document Server

    2013-01-01

    The critical analysis of science textbooks is vital in improving teaching and learning at all levels in the subject, and this volume sets out a range of academic perspectives on how that analysis should be done. Each chapter focuses on an aspect of science textbook appraisal, with coverage of everything from theoretical and philosophical underpinnings, methodological issues, and conceptual frameworks for critical analysis, to practical techniques for evaluation. Contributions from many of the most distinguished scholars in the field give this collection its sure-footed contemporary relevance, reflecting the international standards of UNESCO as well as leading research organizations such as the American Association for the Advancement of Science (whose Project 2061 is an influential waypoint in developing protocols for textbook analysis). Thus the book shows how to gauge aspects of textbooks such as their treatment of controversial issues, graphical depictions, scientific historiography, vocabulary usage, acc...

  3. Pragmatic critical realism: could this methodological approach expand our understanding of employment relations?

    Science.gov (United States)

    Mearns, Susan Lesley

    2011-01-01

    This paper seeks to highlight the need for employment relations academics and researchers to expand their use of research methodologies in order to enable the advancement of theoretical debate within their discipline. It focuses on the contribution that pragmatic critical realism has made to the field of perception and argues that it would add value to the subject of employment relations. It is a theoretically centred review of pragmatic critical realism and the possible contribution this methodology would make to the field of employment relations. The paper concludes that the employment relationship does not take place in a vacuum; rather, it is centred on the interaction between imperfect individuals. Their interactions are therefore moulded by emotions which cannot be explored thoroughly, or even acknowledged, through a positivist's rigorous but limited view of what constitutes 'knowledge' and of how theory is developed. While not rejecting the contribution that quantitative data or positivism have made to the field, the study concludes that pragmatic critical realism has a lot to offer the development of the area and its theoretical foundations.

  4. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  5. The methodology of semantic analysis for extracting physical effects

    Science.gov (United States)

    Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.

    2017-01-01

    The paper presents a new methodology of semantic analysis for extracting physical effects. This methodology is based on the Tuzov ontology that formally describes the Russian language. Semantic patterns are described for extracting structured physical information in the form of physical effects, and a new text analysis algorithm is presented.
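
    The record does not reproduce the Tuzov-ontology patterns themselves, but the general idea of pattern-based extraction of physical-effect statements can be illustrated with a deliberately simplified sketch. The regular expression, the sample sentences and the PhysicalEffect structure below are illustrative assumptions, not the authors' formalism.

```python
import re
from dataclasses import dataclass

@dataclass
class PhysicalEffect:
    """Structured record: a cause acting through a relation to produce a result."""
    cause: str
    relation: str
    result: str

# Toy surface patterns standing in for ontology-backed semantic patterns.
# Each pattern captures a cause phrase, a relation verb, and a result phrase.
PATTERNS = [
    re.compile(
        r"(?P<cause>[\w\s]+?)\s+(?P<rel>increases|decreases|induces|produces)\s+(?P<result>[\w\s]+)",
        re.IGNORECASE,
    ),
]

def extract_effects(sentences):
    """Scan sentences and return any cause-relation-result triples that match."""
    effects = []
    for sentence in sentences:
        for pattern in PATTERNS:
            match = pattern.search(sentence)
            if match:
                effects.append(
                    PhysicalEffect(match["cause"].strip(), match["rel"].lower(), match["result"].strip())
                )
    return effects

if __name__ == "__main__":
    demo = [
        "Heating a conductor increases its electrical resistance.",
        "Mechanical stress in a quartz crystal produces an electric charge.",
    ]
    for effect in extract_effects(demo):
        print(effect)
```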

  6. Development of analysis methodology on turbulent thermal stripping

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Geun Jong; Jeon, Won Dae; Han, Jin Woo; Gu, Byong Kook [Changwon National University, Changwon(Korea)

    2001-03-01

    For developing the analysis methodology, the important governing factors of thermal stripping phenomena are identified as geometric configuration and flow characteristics such as velocity. Along these factors, the performance of turbulence models in the existing analysis methodology is evaluated against experimental data. The status of DNS application is also assessed based on the literature. The evaluation results are reflected in setting up the new analysis methodology. From the evaluation of the existing analysis methodology, the Full Reynolds Stress (FRS) model is identified as the best of the turbulence models considered, and LES is found to be able to provide time-dependent turbulence quantities. Further improvements in the near-wall region and the temperature variance equation are required for FRS, and implementation of new sub-grid scale models is required for LES. Through these improvements, a new, reliable analysis methodology for thermal stripping can be developed. 30 refs., 26 figs., 6 tabs. (Author)

  7. Critical feature analysis of a radiotherapy machine

    International Nuclear Information System (INIS)

    Rae, Andrew; Jackson, Daniel; Ramanan, Prasad; Flanz, Jay; Leyman, Didier

    2005-01-01

    The software implementation of the emergency shutdown feature in a major radiotherapy system was analyzed, using a directed form of code review based on module dependences. Dependences between modules are labelled by particular assumptions; this allows one to trace through the code, and identify those fragments responsible for critical features. An 'assumption tree' is constructed in parallel, showing the assumptions which each module makes about others. The root of the assumption tree is the critical feature of interest, and its leaves represent assumptions which, if not valid, might cause the critical feature to fail. The analysis revealed some unexpected assumptions that motivated improvements to the code
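
    To make the idea of an assumption tree concrete, the sketch below builds and walks a small tree whose root is a critical feature and whose leaves are the assumptions that, if violated, could defeat it. The module names and assumptions are invented for illustration and are not taken from the radiotherapy system analysed in the paper.

```python
# Minimal assumption-tree sketch: the root is the critical feature, interior
# nodes are modules, and leaves are the assumptions each module makes.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    name: str
    children: List["Node"] = field(default_factory=list)

def leaf_assumptions(node, path=()):
    """Return (path-to-leaf, assumption) pairs; the leaves are the assumptions
    that, if invalid, might cause the critical feature at the root to fail."""
    path = path + (node.name,)
    if not node.children:
        return [(path[:-1], node.name)]
    pairs = []
    for child in node.children:
        pairs.extend(leaf_assumptions(child, path))
    return pairs

if __name__ == "__main__":
    # Hypothetical dependency structure for an emergency-shutdown feature.
    tree = Node("emergency shutdown asserted", [
        Node("interlock module", [
            Node("assumes: watchdog timer fires within 10 ms"),
            Node("assumes: sensor driver reports saturation as a fault"),
        ]),
        Node("beam control module", [
            Node("assumes: shutdown command preempts queued beam requests"),
        ]),
    ])
    for path, assumption in leaf_assumptions(tree):
        print(" -> ".join(path), "|", assumption)
```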

  8. Analysis of Critical Parts and Materials

    Science.gov (United States)

    1980-12-01

    Recoverable strategy items listed in the report include: large orders; long lead procurement funding (including raw materials and facility funding); manpower analysis and training; manual ordering of some critical parts; a more active role in schedule negotiation; multiple source procurements; multi-year program funding; ordering spares with the original order; incentives; and better capital investment.

  9. ACRR fuel storage racks criticality safety analysis

    International Nuclear Information System (INIS)

    Bodette, D.E.; Naegeli, R.E.

    1997-10-01

    This document presents the criticality safety analysis for a new fuel storage rack to support modification of the Annular Core Research Reactor for production of molybdenum-99 at Sandia National Laboratories, Technical Area V facilities. Criticality calculations with the MCNP code investigated various contingencies for the criticality control parameters. Important contingencies included mix of fuel element types stored, water density due to air bubbles or water level for the over-moderated racks, interaction with existing fuel storage racks and fuel storage holsters in the fuel storage pool, neutron absorption of planned rack design and materials, and criticality changes due to manufacturing tolerances or damage. Some limitations or restrictions on use of the new fuel storage rack for storage operations were developed through the criticality analysis and are required to meet the double contingency requirements of criticality safety. As shown in the analysis, this system will remain subcritical under all credible upset conditions. Administrative controls are necessary for loading, moving, and handling the storage rack as well as for control of operations around it. 21 refs., 16 figs., 4 tabs

  10. Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Childs, R.L.; Rearden, B.T.

    1999-01-01

    Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community
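
    As a rough illustration of the kind of coefficient these tools produce, the sketch below estimates a relative sensitivity S = (dk/k)/(dp/p) of a multiplication factor to one input parameter by central differencing. The one-group infinite-medium formula is a textbook stand-in, not the FORSS or SCALE machinery described in the paper.

```python
# Sketch: first-order relative sensitivity S = (dk/k)/(dp/p) of a
# multiplication factor k to a parameter p, estimated by central differencing.
# The one-group model below stands in for the transport calculation a real
# sensitivity tool would perturb.

def k_inf(sigma_f, sigma_c, nu=2.43):
    """One-group k-infinity: nu * fission cross section / total absorption."""
    return nu * sigma_f / (sigma_f + sigma_c)

def relative_sensitivity(k_of_p, p0, rel_step=0.01):
    """Central-difference estimate of (dk/k)/(dp/p) at p = p0."""
    dp = rel_step * p0
    k_plus, k_minus = k_of_p(p0 + dp), k_of_p(p0 - dp)
    k0 = k_of_p(p0)
    return ((k_plus - k_minus) / (2.0 * dp)) * (p0 / k0)

if __name__ == "__main__":
    sigma_c = 0.010  # parasitic capture cross section (illustrative units)
    s_fission = relative_sensitivity(lambda sf: k_inf(sf, sigma_c), p0=0.005)
    print(f"relative sensitivity of k to the fission cross section: {s_fission:.3f}")
```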

  11. Mapping new theoretical and methodological terrain for knowledge translation: contributions from critical realism and the arts

    Science.gov (United States)

    Kontos, Pia C; Poland, Blake D

    2009-01-01

    Background Clinical practice guidelines have been a popular tool for the improvement of health care through the implementation of evidence from systematic research. Yet, it is increasingly clear that knowledge alone is insufficient to change practice. The social, cultural, and material contexts within which practice occurs may invite or reject innovation, complement or inhibit the activities required for success, and sustain or alter adherence to entrenched practices. However, knowledge translation (KT) models are limited in providing insight about how and why contextual contingencies interact, the causal mechanisms linking structural aspects of context and individual agency, and how these mechanisms influence KT. Another limitation of KT models is the neglect of methods to engage potential adopters of the innovation in critical reflection about aspects of context that influence practice, the relevance and meaning of innovation in the context of practice, and the identification of strategies for bringing about meaningful change. Discussion This paper presents a KT model, the Critical Realism and the Arts Research Utilization Model (CRARUM), that combines critical realism and arts-based methodologies. Critical realism facilitates understanding of clinical settings by providing insight into the interrelationship between its structures and potentials, and individual action. The arts nurture empathy, and can foster reflection on the ways in which contextual factors influence and shape clinical practice, and how they may facilitate or impede change. The combination of critical realism and the arts within the CRARUM model promotes the successful embedding of interventions, and greater impact and sustainability. Conclusion CRARUM has the potential to strengthen the science of implementation research by addressing the complexities of practice settings, and engaging potential adopters to critically reflect on existing and proposed practices and strategies for sustaining

  12. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels

    2010-01-01

    to nodes with simple functions such as liquid transport, gas transport, liquid storage, gas-liquid contacting etc. From the functions of the nodes the selection of relevant process variables and deviation variables follows directly. The knowledge required to perform the pre-meeting HAZOP task of dividing...... the plant along functional lines is that of chemical unit operations and transport processes plus some familiarity with the plant at hand. Thus the preparatory work may be performed by a chemical engineer with just an introductory course in risk assessment. The goal based methodology lends itself directly......

  13. Methodology for risk analysis of nuclear installations

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Senne Junior, Murillo; Jordao, Elizabete

    2002-01-01

    Both the general licensing standards for nuclear facilities and the specific ones require a risk assessment during the licensing process. The risk assessment is carried out by estimating both the probability of occurrence of accidents and their magnitudes. This is a complex task because the large number of potentially hazardous events that can occur in nuclear facilities makes it difficult to define the accident scenarios. There are also many available techniques to identify the potential accidents, estimate their probabilities, and evaluate their magnitudes. This paper presents a new methodology that systematizes the risk assessment process and orders the accomplishment of its several steps. (author)

  14. Empowerment in critical care - a concept analysis.

    Science.gov (United States)

    Wåhlin, Ingrid

    2017-03-01

    The purpose of this paper was to analyse how the concept of empowerment is defined in the scientific literature in relation to critical care. As empowerment is a mutual process affecting all individuals involved, the perspectives of not only patients and next of kin but also staff were sought. A literature review and a concept analysis based on Walker and Avant's analysis procedure were used to identify the basic elements of empowerment in critical care. Twenty-two articles with a focus on critical care were discovered and included in the investigation. A mutual and supportive relationship, knowledge, skills, power within oneself and self-determination were found to be the common attributes of empowerment in critical care. The results could be adapted and used for all parties involved in critical care - whether patients, next of kin or staff - as these defining attributes are assumed to be universal to all three groups, even if the more specific content of each attribute varies between groups and individuals. Although empowerment is only sparsely used in relation to critical care, it appears to be a very useful concept in this context. The benefits of improving empowerment are extensive: decreased levels of distress and strain, increased sense of coherence and control over the situation, and personal and/or professional development and growth, together with increased comfort and inner satisfaction. © 2016 The Authors. Scandinavian Journal of Caring Sciences published by John Wiley & Sons Ltd on behalf of Nordic College.

  15. Practical theology as ‘healing of memories’: Critical reflections on a specific methodology

    Directory of Open Access Journals (Sweden)

    Ian A. Nell

    2011-07-01

    Full Text Available When developing new perspectives and paradigms for practical theology in South Africa, we obviously have to take our South African context seriously. We live in a post-conflict society in which gigantic sociocultural shifts have taken place since 1994. Many institutions and groups endeavour to address the conflict, injustices and pain of the past, including the Institute for the Healing of Memories (IHOM). The Institute makes use of a specific methodology in their workshops. Having participated in these workshops in congregational contexts as well as in the training of theological students, in this article I investigated the methodology of the Institute as a framework for new perspectives on practical theology in South Africa. Making use of Victor Turner’s theoretical construct of ‘social drama’ as one way of looking at the methodology of the IHOM, I reflected critically on the challenges that it poses to practical theology by making use of a ‘rhetorical frame’ and trying to delineate some constructive proposals for further reflections on practical theological paradigms and perspectives.

  16. Computational methods for nuclear criticality safety analysis

    International Nuclear Information System (INIS)

    Maragni, M.G.

    1992-01-01

    Nuclear criticality safety analyses require the utilization of methods which have been tested and verified against benchmark results. In this work, criticality calculations based on the KENO-IV and MCNP codes are studied, aiming at the qualification of these methods at IPEN-CNEN/SP and COPESP. The utilization of variance reduction techniques is important to reduce the computer execution time, and several of them are analysed. As a practical example of the above methods, a criticality safety analysis for the storage tubes for irradiated fuel elements from the IEA-R1 research reactor has been carried out. This analysis showed that the MCNP code is more adequate for problems with complex geometries, and that the KENO-IV code gives conservative results when the generalized geometry option is not used. (author)

  17. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  18. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  19. Shuttle TPS thermal performance and analysis methodology

    Science.gov (United States)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mold line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  20. METHODOLOGICAL ANALYSIS OF TRAINING STUDENT basketball teams

    Directory of Open Access Journals (Sweden)

    Kozina Zh.L.

    2011-06-01

    Full Text Available The leading principles of preparing student basketball teams in higher education institutions are considered. The system includes the following: reliance on top-quality players in the structure of preparedness, widespread use of visual aids, teaching films and animations recording the execution of various techniques by professional basketball players, and the application of autogenic and ideomotor training methods according to our methodology. The study involved 63 students in years one to five from various universities of Kharkov, of the first and second sporting categories: 32 in the experimental group and 31 in the control group. The developed system of training student basketball players was used for one year. The results showed the efficiency of the developed system in the training process of student basketball players.

  1. Validity of contents of a paediatric critical comfort scale using mixed methodology.

    Science.gov (United States)

    Bosch-Alcaraz, A; Jordan-Garcia, I; Alcolea-Monge, S; Fernández-Lorenzo, R; Carrasquer-Feixa, E; Ferrer-Orona, M; Falcó-Pegueroles, A

    Critical illness in paediatric patients includes acute conditions in a healthy child as well as exacerbations of chronic disease, and these situations must therefore be clinically managed in critical care units. The role of the paediatric nurse is to ensure the comfort of these critically ill patients. To that end, instruments are required that correctly assess critical comfort. To describe the process for validating the content of a paediatric critical comfort scale using mixed-method research. Initially, a cross-cultural adaptation of the Comfort Behavior Scale from English to Spanish was carried out using the translation and back-translation method. Its content was then evaluated using mixed-method research. This second step was divided into a quantitative stage, in which an ad hoc questionnaire was used to assess the relevance and wording of each scale item, and a qualitative stage with two meetings with health professionals, patients and a family member, following the Delphi method recommendations. All scale items obtained a content validity index >0.80, except the relevance of physical movement, which obtained 0.76. Global content validity of the scale was 0.87 (high). During the qualitative stage, items from each of the scale domains were reformulated or eliminated in order to make the scale more comprehensible and applicable. The use of a mixed-method research methodology during the scale content validity phase allows the design of a richer and more assessment-sensitive instrument. Copyright © 2017 Sociedad Española de Enfermería Intensiva y Unidades Coronarias (SEEIUC). Publicado por Elsevier España, S.L.U. All rights reserved.

  2. Partnering for Research: A Critical Discourse Analysis

    Science.gov (United States)

    Irving, Catherine J.; English, Leona M.

    2008-01-01

    Using a critical discourse analysis, informed by poststructuralist theory, we explore the research phenomenon of coerced partnership. This lens allows us to pay attention to the social relations of power operating in knowledge generation processes, especially as they affect feminist researchers in adult education. We propose an alternative vision…

  3. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for keff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on the perturbation but demands the most computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (keff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recent increasing interest in reducing this margin makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular keff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in keff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
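
    A minimal sketch of the sampling-based approach described above is given below: input parameters are drawn from their assumed uncertainty distributions, a keff calculation is run for each sample, and the spread of the results gives the propagated uncertainty. The run_keff function here is a cheap analytical placeholder standing in for a Monte Carlo transport calculation, and the parameter distributions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def run_keff(enrichment, density):
    """Placeholder for a Monte Carlo transport calculation: returns a toy keff
    that increases with enrichment and material density (illustrative only)."""
    return 0.80 + 0.04 * enrichment + 0.05 * density

def sample_keff_uncertainty(n_samples=200):
    """Propagate assumed input uncertainties to keff by brute-force sampling."""
    keffs = []
    for _ in range(n_samples):
        # Assumed (illustrative) 1-sigma uncertainties on the inputs.
        enrichment = rng.normal(3.5, 0.05)   # wt% U-235
        density = rng.normal(1.0, 0.02)      # relative density
        keffs.append(run_keff(enrichment, density))
    keffs = np.asarray(keffs)
    return keffs.mean(), keffs.std(ddof=1)

if __name__ == "__main__":
    mean, sigma = sample_keff_uncertainty()
    print(f"keff = {mean:.4f} +/- {sigma:.4f} (1 sigma, from input sampling)")
```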

  4. Development of Advanced Non-LOCA Analysis Methodology for Licensing

    International Nuclear Information System (INIS)

    Jang, Chansu; Um, Kilsup; Choi, Jaedon

    2008-01-01

    KNF is developing a new design methodology for Non-LOCA analysis for licensing purposes. The code chosen is the best-estimate transient analysis code RETRAN, and the OPR1000 is the target plant. For this purpose, KNF prepared a simple nodal scheme appropriate to the licensing analyses and developed the designer-friendly analysis tool ASSIST (Automatic Steady-State Initialization and Safety analysis Tool). To check the validity of the newly developed methodology, the single CEA withdrawal and locked rotor accidents are analyzed using the new methodology and compared with current design results. The comparison shows good agreement, and it is concluded that the new design methodology can be applied to OPR1000 Non-LOCA licensing calculations.

  5. Opening Remarks of the Acquisition Path Analysis Methodology Session

    International Nuclear Information System (INIS)

    Renis, T.

    2015-01-01

    An overview of the recent development work that has been done on acquisition path analysis, implementation of the methodologies within the Department of Safeguards, lessons learned and future areas for development will be provided. (author)

  6. Vulnerability and Risk Analysis Program: Overview of Assessment Methodology

    National Research Council Canada - National Science Library

    2001-01-01

    .... Over the last three years, a team of national laboratory experts, working in partnership with the energy industry, has successfully applied the methodology as part of OCIP's Vulnerability and Risk Analysis Program (VRAP...

  7. A methodology for the data energy regional consumption consistency analysis

    International Nuclear Information System (INIS)

    Canavarros, Otacilio Borges; Silva, Ennio Peres da

    1999-01-01

    The article introduces a methodology for the consistency analysis of regional energy consumption data. The work is based on recent studies by several cited authors and addresses Brazilian energy matrices and regional energy balances. The results are compared and analyzed.

  8. Ideologies of English in a Chinese High School EFL Textbook: A Critical Discourse Analysis

    Science.gov (United States)

    Xiong, Tao; Qian, Yamin

    2012-01-01

    In this article we examine ideologies of English in present-day China with a special focus on textbook discourse. The research framework is informed by critical theories on language and education. Critical discourse analysis is applied as a methodological approach characterized by a socially committed attitude in the explanation and interpretation…

  9. Disposal criticality analysis for immobilized plutonium: Internal configurations

    International Nuclear Information System (INIS)

    Gottlieb, P.; Massari, J.R.; Cloke, P.L.

    1998-03-01

    The analysis for immobilized Pu follows the disposal criticality analysis methodology. In this study the focus is on determining the range of chemical compositions of the configurations which can occur following the aqueous degradation processes, particularly with respect to the concentrations of uranium, plutonium, and the principal neutron absorber, gadolinium. The principal analysis tool is a mass balance program that computes the amounts of plutonium, uranium, gadolinium, and chromium in solution as a function of time with inputs from a range of possible waste form dissolution rates, stainless steel corrosion rates, and compound solubilities for the neutronically significant elements. For the waste forms and degradation modes considered here, it is possible to preclude the possibility of criticality by maintaining a plutonium loading limit. Since the presence of hafnium is shown to increase this loading limit, the defense-in-depth policy would suggest the maximization of the amount of Hf as a backup criticality control material. At the end of 1997, after this study was completed, the ceramic waste form was downselected and a new formulation was developed, with the amount of Hf increased to the point where internal criticality may no longer be possible. In addition, recent calculations indicate that GdPO4 is insoluble over a much broader range of pH than is Gd2O3, so that its use as the Gd carrier in the waste form would provide an extra margin of defense-in-depth.
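
    The mass-balance calculation described above can be illustrated with a much-simplified time-stepping sketch: each species is released from the degrading waste form at a fixed rate and is retained in solution only up to a solubility limit, with the excess assumed to precipitate in place. The rates, solubility caps and time step are invented for illustration and are not the values used in the cited analysis.

```python
# Simplified mass-balance sketch for waste-form degradation: each species is
# released at a constant rate; the dissolved inventory is capped at a
# solubility limit and the excess is treated as precipitated in place.

# Illustrative inputs (not the values from the cited analysis).
release_rate = {"Pu": 1.0e-3, "U": 5.0e-3, "Gd": 8.0e-4}   # kg released per year
solubility_cap = {"Pu": 0.05, "U": 5.0, "Gd": 0.01}         # kg allowed in solution
inventory = {"Pu": 10.0, "U": 200.0, "Gd": 2.0}             # kg initially in waste form

def step(state, dt):
    """Advance the degradation model by dt years."""
    for species in state["solid"]:
        released = min(state["solid"][species], release_rate[species] * dt)
        state["solid"][species] -= released
        state["dissolved"][species] += released
        # Anything above the solubility cap precipitates locally.
        excess = max(0.0, state["dissolved"][species] - solubility_cap[species])
        state["dissolved"][species] -= excess
        state["precipitated"][species] += excess
    return state

if __name__ == "__main__":
    state = {
        "solid": dict(inventory),
        "dissolved": {sp: 0.0 for sp in inventory},
        "precipitated": {sp: 0.0 for sp in inventory},
    }
    for _ in range(100):            # 100 steps of 100 years = 10,000 years
        state = step(state, dt=100.0)
    print({sp: round(mass, 3) for sp, mass in state["precipitated"].items()})
```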

  10. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. Too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, making it difficult to narrow the field of potential adversaries effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation, defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed within the black and white hat communities of the Information Technology (IT) security research field. Once a solid understanding of the cutting edge security research is established, emerging trends in attack methodology can be identified and the gap between

  11. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The authors address the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information.

  12. Criticality safety analysis of the NPP Krsko storage racks

    International Nuclear Information System (INIS)

    Kromar, M.; Kurincic, B.

    2002-01-01

    NPP Krsko is going to increase the capacity of the spent fuel storage pool by replacing the existing racks with high-density racks. This will be the second reracking campaign since 1983, when storage was increased from 180 to 828 storage locations. The pool capacity will increase from 828 to 1694 locations with partial reracking by the spring of 2003. The installed capacity will be sufficient for the current design plant lifetime. Complete reracking of the spent fuel pool will further increase the capacity to 2321 storage locations. The design, rack manufacturing and installation have been awarded to Framatome ANP GmbH. Burnup credit methodology, which was approved by the Slovenian Nuclear Safety Administration in the previous licensing of the existing racks, will again be implemented in the licensing process, together with recent methodology improvements. Specific steps of the criticality safety analysis and representative results are presented in the paper. (author)

  13. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques, and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year; thus, the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  14. Diversion Path Analysis Handbook. Volume 1. Methodology

    International Nuclear Information System (INIS)

    Goodwin, K.E.; Schleter, J.C.; Maltese, M.D.K.

    1978-11-01

    Diversion Path Analysis (DPA) is a safeguards evaluation tool which is used to determine the vulnerability of the Material Control and Material Accounting (MC and MA) Subsystems to the threat of theft of Special Nuclear Material (SNM) by a knowledgeable Insider. The DPA team should consist of two individuals who have technical backgrounds. The implementation of DPA is divided into five basic steps: Information and Data Gathering, Process Characterization, Analysis of Diversion Paths, Results and Findings, and Documentation

  15. Selected critical examples of scientometric publication analysis

    DEFF Research Database (Denmark)

    Ingwersen, Peter

    2014-01-01

    Objective: This paper selects and outlines factors of central importance in the calculation, presentation and interpretation of publication analysis results from a scientometric perspective. The paper focuses on growth, world share analyses and the logic behind the computation of average numbers...... of authors, institutions or countries per publication indexed by Web of Science. Methodology: The paper uses examples from earlier research evaluation studies and cases based on online data to describe issues, problematic details, pitfalls and how to overcome them in publication analysis with respect...... to analytic tool application, calculation, presentation and interpretation. Results: By means of different kinds of analysis and presentation, the paper provides insight into scientometrics in the context of informetric analysis, selected cases of research productivity, publication patterns and research...

  16. Back- and fore-grounding ontology: exploring the linkages between critical realism, pragmatism, and methodologies in health & rehabilitation sciences.

    Science.gov (United States)

    DeForge, Ryan; Shaw, Jay

    2012-03-01

    Back- and fore-grounding ontology: exploring the linkages between critical realism, pragmatism, and methodologies in health & rehabilitation sciences As two doctoral candidates in a health and rehabilitation sciences program, we describe in this paper our respective paradigmatic locations along a quite nonlinear ontological-epistemological-axiological-methodological chain. In a turn-taking fashion, we unpack the tenets of critical realism and pragmatism, and then trace the linkages from these paradigmatic locations through to the methodological choices that address a community-based research problem. Beyond serving as an answer to calls for academics in training to demonstrate philosophical-theoretical-methodological integrity and coherence in their scholarship, this paper represents critical realism and its fore-grounding of a deeply stratified ontology in reflexive relation to pragmatism and its back-grounding of ontology. We conclude by considering the merits and challenges of conducting research from within singular versus proliferate paradigmatic perspectives. © 2011 Blackwell Publishing Ltd.

  17. Methodology: care of the critically ill and injured during pandemics and disasters: CHEST consensus statement.

    Science.gov (United States)

    Ornelas, Joe; Dichter, Jeffrey R; Devereaux, Asha V; Kissoon, Niranjan; Livinski, Alicia; Christian, Michael D

    2014-10-01

    Natural disasters, industrial accidents, terrorism attacks, and pandemics all have the capacity to result in large numbers of critically ill or injured patients. This supplement provides suggestions for all those involved in a disaster or pandemic with multiple critically ill patients, including front-line clinicians, hospital administrators, professional societies, and public health or government officials. The field of disaster medicine does not have the required body of evidence needed to undergo a traditional guideline development process. As a result, a consensus statement development methodology was used to capture the highest-caliber expert opinion in a structured, scientific approach. Task Force Executive Committee members identified core topic areas regarding the provision of care to critically ill or injured patients from pandemics or disasters and subsequently assembled an international panel for each identified area. International disaster medicine experts were brought together to identify key questions (in a population, intervention, comparator, outcome [PICO]-based format) within each of the core topic areas. Comprehensive literature searches were then conducted to identify studies upon which evidence-based recommendations could be made. No studies of sufficient quality were identified. Therefore, the panel developed expert opinion-based suggestions that are presented in this supplement using a modified Delphi process. A total of 315 suggestions were drafted across all topic groups. After two rounds of a Delphi consensus-development process, 267 suggestions were chosen by the panel to include in the document and published in a total of 12 manuscripts composing the core chapters of this supplement. Draft manuscripts were prepared by the topic editor and members of the working groups for each of the topics, producing a total of 11 papers. Once the preliminary drafts were received, the Executive Committee (Writing Committee) then met to review, edit, and

  18. Evaluation of frother performance in coal flotation: A critical review of existing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharya, S.; Dey, S. [Indian School for Mines, Dhanbad (India). Dept. for Fuel & Mineral Engineering

    2008-07-01

    Separation efficiency in flotation depends, to a considerable extent, on the efficiency of the frother used. A successful frother must achieve a delicate balance between froth stability and non-persistency. Ideally, the frother is not supposed to influence the state of the surface of the coal and minerals. In practice, however, interaction does occur between the frother, other reagents, and solid surfaces. Various commercially available frothers can differ slightly or significantly in their influence on the flotation results. Therefore, a plant operator is in a dilemma when it comes to selecting a frother to be used in his plant. This article attempts to critically review the different methodologies, which are available to compare the performance of two or more frothers in order to decide which would best serve the purpose of the plant operator.

  19. Criticality analysis in uranium enrichment plant

    International Nuclear Information System (INIS)

    Okamoto, Tsuyoshi; Kiyose, Ryohei

    1977-01-01

    In a large scale uranium enrichment plant, the uranium inventory in the cascade rooms is not very large, but the facilities dealing with the largest quantity of uranium in the process are the UF6 gas supply system and the blending system for controlling the product concentration. When UF6 spills out of these systems, enriched uranium accumulates and a criticality accident becomes a concern. If a NaF trap is placed at the front stage of the waste gas treatment system, large amounts of UF6 and HF are adsorbed together in the NaF trap, so the criticality safety of the trap must be checked. Various assumptions were made to perform criticality survey computations, using the WIMS code (transport analysis), for the system composed of UF6 and HF adsorbed on NaF traps. The minimum critical radius was about 53 cm for 3.5% enriched fuel for light water reactors. The optimum volume ratio of fissile material in the double salt UF6·2NaF and NaF·HF is about 40 vol.%. Criticality survey computations were also made for an annular NaF trap with a central cooling tube, and it was found that the cooling tube did not decrease the multiplication factor for cooling tube radii up to about 5 cm. (Wakatsuki, Y.)

  20. Determination of Critical Conditions for Puncturing Almonds Using Coupled Response Surface Methodology and Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Mahmood Mahmoodi-Eshkaftaki

    2013-01-01

    Full Text Available In this study, the effect of seed moisture content, probe diameter and loading velocity (puncture conditions) on some mechanical properties of almond kernel and peeled almond kernel is considered in order to model the relationship between the puncture conditions and rupture energy. Furthermore, the distribution of the mechanical properties is determined. The main objective is to determine the critical values of the mechanical properties significant for peeling machines. Response surface methodology was used to find the relationship between the input parameters and the output responses, and a fitness function was applied to find the optimal values using the genetic algorithm. A two-parameter Weibull function was used to describe the distribution of the mechanical properties. Based on the Weibull parameter values, i.e. the shape parameter (β) and scale parameter (η) calculated for each property, the variations in the mechanical distributions were completely described and it was confirmed that the mechanical properties are rule governed, which makes the Weibull function suitable for estimating their distributions. The energy model estimated using response surface methodology shows that the mechanical properties relate exponentially to the moisture and polynomially to the loading velocity and probe diameter, which enabled successful estimation of the rupture energy (R²=0.94). The genetic algorithm calculated the critical values of seed moisture, probe diameter, and loading velocity to be 18.11% on a dry mass basis, 0.79 mm, and 0.15 mm/min, respectively, with an optimum rupture energy of 1.97·10⁻³ J. These conditions were used for comparison with new samples, where the rupture energy was experimentally measured to be 2.68·10⁻³ and 2.21·10⁻³ J for kernel and peeled kernel, respectively, which was nearly in agreement with our model results.
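
    The two-parameter Weibull description of the measured properties can be reproduced in outline with the sketch below, which fits the shape (β) and scale (η) parameters to a sample of rupture-energy values. The synthetic data and the use of scipy's weibull_min distribution are illustrative assumptions, not the authors' measurements or software.

```python
from scipy.stats import weibull_min

# Synthetic rupture-energy sample (J), standing in for the measured data.
true_beta, true_eta = 2.5, 2.0e-3
energies = weibull_min.rvs(true_beta, loc=0.0, scale=true_eta, size=200, random_state=0)

# Fit a two-parameter Weibull (location fixed at zero, as in a two-parameter model).
beta_hat, _, eta_hat = weibull_min.fit(energies, floc=0.0)

print(f"shape beta = {beta_hat:.2f}")
print(f"scale eta  = {eta_hat:.2e} J")
# The fitted CDF F(E) = 1 - exp(-(E/eta)**beta) then describes the probability
# that a kernel ruptures at or below puncture energy E.
```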

  1. SINGULAR SPECTRUM ANALYSIS: METHODOLOGY AND APPLICATION TO ECONOMICS DATA

    Institute of Scientific and Technical Information of China (English)

    Hossein HASSANI; Anatoly ZHIGLJAVSKY

    2009-01-01

    This paper describes the methodology of singular spectrum analysis (SSA) and demonstrates that it is a powerful method of time series analysis and forecasting, particularly for economic time series. The authors consider the application of SSA to the analysis and forecasting of the Iranian national accounts data as provided by the Central Bank of the Islamic Republic of Iran.
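
    A minimal sketch of the basic SSA steps (embedding, singular value decomposition, grouping, and diagonal averaging) is given below for a toy series; it is a bare-bones illustration, not the full methodology or the national accounts application described in the paper.

```python
import numpy as np

def ssa_reconstruct(series, window, components):
    """Basic SSA: embed the series, take the SVD of the trajectory matrix,
    keep the selected elementary components, and reconstruct by diagonal averaging."""
    x = np.asarray(series, dtype=float)
    n, L = len(x), window
    K = n - L + 1
    # 1. Embedding: L x K trajectory (Hankel) matrix.
    X = np.column_stack([x[i:i + L] for i in range(K)])
    # 2. SVD of the trajectory matrix.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # 3. Grouping: keep only the requested elementary components.
    X_hat = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in components)
    # 4. Diagonal (Hankel) averaging back to a series of length n.
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(K):
        recon[j:j + L] += X_hat[:, j]
        counts[j:j + L] += 1.0
    return recon / counts

if __name__ == "__main__":
    t = np.arange(120)
    noise = np.random.default_rng(0).normal(0.0, 0.3, t.size)
    series = 0.05 * t + np.sin(2 * np.pi * t / 12) + noise
    trend_plus_cycle = ssa_reconstruct(series, window=24, components=range(3))
    print(trend_plus_cycle[:5].round(3))
```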

  2. Gap analysis methodology for business service engineering

    NARCIS (Netherlands)

    Nguyen, D.K.; van den Heuvel, W.J.A.M.; Papazoglou, M.; de Castro, V.; Marcos, E.; Hofreiter, B.; Werthner, H.

    2009-01-01

    Many of today’s service analysis and design techniques rely on ad-hoc and experience-based identification of value-creating business services and implicitly assume a “green-field” situation focusing on the development of completely new services while offering very limited support for discovering

  3. Discourse analysis: making complex methodology simple

    NARCIS (Netherlands)

    Bondarouk, Tatiana; Ruel, Hubertus Johannes Maria; Leino, T.; Saarinen, T.; Klein, S.

    2004-01-01

    Discourse-based analysis of organizations is not new in the field of interpretive social studies. More recently, information systems (IS) studies have also shown a keen interest in discourse (Wynn et al., 2002). The IS field has grown significantly in its multiplicity, which is echoed in the

  4. Scenario aggregation and analysis via Mean-Shift Methodology

    International Nuclear Information System (INIS)

    Mandelli, D.; Yilmaz, A.; Metzroth, K.; Aldemir, T.; Denning, R.

    2010-01-01

    A new generation of dynamic methodologies is being developed for nuclear reactor probabilistic risk assessment (PRA) which explicitly account for the time element in modeling the probabilistic system evolution and use numerical simulation tools to account for possible dependencies between failure events. The dynamic event tree (DET) approach is one of these methodologies. One challenge with dynamic PRA methodologies is the large amount of data they produce which may be difficult to analyze without appropriate software tools. The concept of 'data mining' is well known in the computer science community and several methodologies have been developed in order to extract useful information from a dataset with a large number of records. Using the dataset generated by the DET analysis of the reactor vessel auxiliary cooling system (RVACS) of an ABR-1000 for an aircraft crash recovery scenario and the Mean-Shift Methodology for data mining, it is shown how clusters of transients with common characteristics can be identified and classified. (authors)
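
    For readers unfamiliar with the Mean-Shift technique referred to above, the sketch below clusters a small set of synthetic "scenario" feature vectors with scikit-learn's implementation. The two-dimensional features, bandwidth choice and group structure are illustrative assumptions and have nothing to do with the actual RVACS/DET dataset.

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(seed=3)

# Synthetic scenario features, e.g. (time to a key event in h, peak temperature in K),
# drawn around three hypothetical outcome groups.
centers = np.array([[1.0, 900.0], [4.0, 1200.0], [8.0, 600.0]])
scenarios = np.vstack([c + rng.normal(0.0, [0.3, 40.0], size=(50, 2)) for c in centers])

# Scale the features so one dimension does not dominate the distance metric,
# then let Mean-Shift locate the modes (cluster centers) of the point density.
X = StandardScaler().fit_transform(scenarios)
bandwidth = estimate_bandwidth(X, quantile=0.2, random_state=3)
labels = MeanShift(bandwidth=bandwidth).fit_predict(X)

for k in np.unique(labels):
    print(f"cluster {k}: {np.sum(labels == k)} scenarios")
```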

  5. Methodological aspects on drug receptor binding analysis

    International Nuclear Information System (INIS)

    Wahlstroem, A.

    1978-01-01

    Although drug receptors occur in relatively low concentrations, they can be visualized by the use of appropriate radioindicators. In most cases the procedure is rapid and can reach a high degree of accuracy. Specificity of the interaction is studied by competition analysis. The necessity of using several radioindicators to define a receptor population is emphasized. It may be possible to define isoreceptors and drugs with selectivity for one isoreceptor. (Author)

  6. Critical Analysis of Boko Haram Insurgency

    Science.gov (United States)

    2017-06-09

    insurgency, which poses a threat and problem to the Nigerian government. This research consults and refers to materials including books, internet sources, and articles... The paper recommends that the government of Nigeria focus its efforts to defeat the group on socio-economic development and an improved intelligence network...

  7. Physical data generation methodology for return-to-power steam line break analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zee, Sung Kyun; Lee, Chung Chan; Lee, Chang Kue [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-02-01

    Current methodology to generate physics data for steam line break accident analysis of CE-type nuclear plants such as Yonggwang Unit 3 is valid only if the core does not return to criticality after shutdown. The methodology therefore requires a tremendous amount of net scram worth, especially at the end of the cycle when the moderator temperature coefficient is most negative, and a new methodology is needed to obtain reasonably conservative physics data when the reactor returns to a power condition. The current methodology uses ROCS, which includes only a closed channel model, but it is well known that the closed channel model estimates the core reactivity as too negative when the core flow rate is low. Therefore, a conservative methodology is presented which utilizes an open channel 3D HERMITE model. A return-to-power reactivity credit is produced to supplement the reactivity table generated by the closed channel model. Other data include the hot channel axial power shape, peaking factor and maximum quality for DNBR analysis, as well as the pin census for radiological consequence analysis. 48 figs., 22 tabs., 18 refs. (Author)

  8. Designing an AHP methodology to prioritize critical elements for product innovation: an intellectual capital perspective

    Directory of Open Access Journals (Sweden)

    Costa, R. V.

    2015-08-01

    Full Text Available Intellectual capital has over the past decades been recognized as an important source of competitive advantage and differentiation at the firm level. At the same time, innovation has become a critical factor for companies to ensure their sustainability, and even their survival, in a globalized business landscape. With these two crucial concepts for business success in mind, this study intends to build on the relationships between intellectual capital and product innovation at the firm level. Specifically, we will design and test a model based on the Analytic Hierarchy Process, whose aim is to allow the prioritization of intellectual capital elements according to their relative importance for product innovation performance at the firm level. The main goal of this research is to build a diagnosis and action tool that helps business managers incorporate an intellectual capital perspective into their product innovation initiatives. This framework will help managers to better understand which intellectual capital elements are more critical to their product innovation efforts, and thereby systematize actions and clarify resource allocation priorities to improve their product innovation capabilities. In order to validate the practicability of this proposal, the methodology was empirically applied to a Portuguese innovative company.
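
    The core AHP computation behind such a prioritization, deriving priority weights from a pairwise-comparison matrix and checking its consistency, can be sketched as below. The three intellectual-capital elements and the judgment values are invented for illustration and are not the model proposed by the author.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Return (priority vector, consistency ratio) for a reciprocal
    pairwise-comparison matrix, using the principal eigenvector method."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                           # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random index
    cr = ci / ri if ri else 0.0                            # consistency ratio
    return weights, cr

if __name__ == "__main__":
    criteria = ["human capital", "structural capital", "relational capital"]
    # Illustrative judgments: human capital moderately preferred over the others.
    A = [[1.0, 3.0, 2.0],
         [1 / 3, 1.0, 1 / 2],
         [1 / 2, 2.0, 1.0]]
    weights, cr = ahp_priorities(A)
    for name, w in zip(criteria, weights):
        print(f"{name}: {w:.3f}")
    print(f"consistency ratio: {cr:.3f} (values below ~0.10 are usually acceptable)")
```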

  9. Combining nutrition and exercise to optimize survival and recovery from critical illness: Conceptual and methodological issues.

    Science.gov (United States)

    Heyland, Daren K; Stapleton, Renee D; Mourtzakis, Marina; Hough, Catherine L; Morris, Peter; Deutz, Nicolaas E; Colantuoni, Elizabeth; Day, Andrew; Prado, Carla M; Needham, Dale M

    2016-10-01

    Survivors of critical illness commonly experience neuromuscular abnormalities, including muscle weakness known as ICU-acquired weakness (ICU-AW). ICU-AW is associated with delayed weaning from mechanical ventilation, extended ICU and hospital stays, more healthcare-related hospital costs, a higher risk of death, and impaired physical functioning and quality of life in the months after ICU admission. These observations speak to the importance of developing new strategies to aid in the physical recovery of acute respiratory failure patients. We posit that to maintain optimal muscle mass, strength and physical function, the combination of nutrition and exercise may have the greatest impact on physical recovery of survivors of critical illness. Randomized trials testing this and related hypotheses are needed. We discussed key methodological issues and proposed a common evaluation framework to stimulate work in this area and standardize our approach to outcome assessments across future studies. Copyright © 2015 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  10. Ethical and methodological issues in qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions: a critical review.

    Science.gov (United States)

    Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika

    2017-01-01

    Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness.

  11. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  12. Decision Making Analysis: Critical Factors-Based Methodology

    Science.gov (United States)

    2010-04-01

    the pitfalls associated with current wargaming methods, such as assuming a western view of rational values in decision-making regardless of the cultures... Utilization theory slightly expands the rational decision-making model as it states that “actors try to maximize their expected utility by weighing the... items to categorize the decision-making behavior of political leaders, which tend to demonstrate either a rational or cognitive leaning. Leaders

  13. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.

  14. Diversion path analysis handbook. Volume I. Methodology

    International Nuclear Information System (INIS)

    Maltese, M.D.K.; Goodwin, K.E.; Schleter, J.C.

    1976-10-01

    Diversion Path Analysis (DPA) is a procedure for analyzing internal controls of a facility in order to identify vulnerabilities to successful diversion of material by an adversary. The internal covert threat is addressed but the results are also applicable to the external overt threat. The diversion paths are identified. Complexity parameters include records alteration or falsification, multiple removals of sub-threshold quantities, collusion, and access authorization of the individual. Indicators, or data elements and information of significance to detection of unprevented theft, are identified by means of DPA. Indicator sensitivity is developed in terms of the threshold quantity, the elapsed time between removal and indication and the degree of localization of facility area and personnel given by the indicator. Evaluation of facility internal controls in light of these sensitivities defines the capability of interrupting identified adversary action sequences related to acquisition of material at fixed sites associated with the identified potential vulnerabilities. Corrective measures can, in many cases, also be prescribed for management consideration and action. DPA theory and concepts have been developing over the last several years, and initial field testing proved both the feasibility and practicality of the procedure. Follow-on implementation testing verified the ability of facility personnel to perform DPA

  15. Memory controllers for mixed-time-criticality systems architectures, methodologies and trade-offs

    CERN Document Server

    Goossens, Sven; Akesson, Benny; Goossens, Kees

    2016-01-01

    This book discusses the design and performance analysis of SDRAM controllers that cater to both real-time and best-effort applications, i.e. mixed-time-criticality memory controllers. The authors describe the state of the art, and then focus on an architecture template for reconfigurable memory controllers that addresses effectively the quickly evolving set of SDRAM standards, in terms of worst-case timing and power analysis, as well as implementation. A prototype implementation of the controller in SystemC and synthesizable VHDL for an FPGA development board are used as a proof of concept of the architecture template.

  16. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  17. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    International Nuclear Information System (INIS)

    Rearden, Bradley T.; Hopper, Calvin M.; Elam, Karla R.; Goluoglu, Sedat; Parks, Cecil V.

    2003-01-01

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)

  18. Risk analysis methodologies for the transportation of radioactive materials

    International Nuclear Information System (INIS)

    Geffen, C.A.

    1983-05-01

    Different methodologies have evolved for consideration of each of the many steps required in performing a transportation risk analysis. Although there are techniques that attempt to consider the entire scope of the analysis in depth, most applications of risk assessment to the transportation of nuclear fuel cycle materials develop specific methodologies for only one or two parts of the analysis. The remaining steps are simplified for the analyst by narrowing the scope of the effort (such as evaluating risks for only one material, or a particular set of accident scenarios, or movement over a specific route); performing a qualitative rather than a quantitative analysis (probabilities may be simply ranked as high, medium or low, for instance); or assuming some generic, conservative conditions for potential release fractions and consequences. This paper presents a discussion of the history and present state-of-the-art of transportation risk analysis methodologies. Many reports in this area were reviewed as background for this presentation. The literature review, while not exhaustive, did result in a complete representation of the major methods used today in transportation risk analysis. These methodologies primarily include the use of severity categories based on historical accident data, the analysis of specifically assumed accident sequences for the transportation activity of interest, and the use of fault or event tree analysis. Although the focus of this work has generally been on potential impacts to public groups, some effort has been expended in the estimation of risks to occupational groups in transportation activities
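
    As a toy illustration of the severity-category approach mentioned above, the sketch below (all frequencies, release fractions and doses are invented) computes the expected annual risk as the frequency-weighted sum of consequences over the severity categories.

```python
# Toy illustration of the severity-category approach: expected annual risk is
# the frequency-weighted sum of consequences over accident severity categories.
# All frequencies, release fractions and doses below are invented.
severity_categories = {
    # category: (annual accident frequency, release fraction,
    #            population dose in person-rem given a release)
    "minor":    (1.0e-3, 0.00, 0.0),
    "moderate": (1.0e-4, 0.01, 5.0),
    "severe":   (1.0e-6, 0.10, 200.0),
}

annual_risk = sum(freq * release * dose
                  for freq, release, dose in severity_categories.values())
print(f"Expected annual radiological risk: {annual_risk:.2e} person-rem")
```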

  19. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    This theoretical essay aims to reflect on three models of text interpretation used in qualitative research, which are often confused in their concepts and methodologies: Content Analysis, Discourse Analysis, and Conversation Analysis. After presenting the concepts, the essay proposes a preliminary discussion of the conceptual and theoretical-methodological differences perceived between them. A review of the literature was performed to support the conceptual and theoretical-methodological discussion. It could be verified that the models differ in the type of strategy used in the treatment of texts, the type of approach, and the appropriate theoretical position.

  20. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the revision of 10CFR50.46 in 1988, which allowed BE (Best-Estimate) methods in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have changed continuously from conservative EM (Evaluation Model) approaches to BE ones. In this context, LSC (Long-Term core Cooling) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and deficiencies of the old methodology were identified and, as a result, the USNRC suspended the approval of CENPD-254-P-A, which is the old LSC methodology for CE-designed NPPs. The regulatory bodies requested that the non-conservatisms be removed and that system transient behavior be reflected in all the LSC methodologies in use. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full break spectrum is considered, and applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the methodology could be applied without significant changes to current LSC plans
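
    The boric acid concentration tracking mentioned above can be illustrated with a minimal mass-balance sketch (this is not the BACON code itself, and the core water inventory, injection flow and boron content are hypothetical): boron carried in by the safety-injection flow accumulates in a fixed core mixing volume while water boils off at the same rate.

```python
# Minimal boric-acid build-up sketch (not the BACON code itself): boron carried
# in by the safety-injection flow stays in the core region while water boils
# off at the same rate, so the mixing volume is constant and the concentration
# rises. All parameter values are hypothetical.
def boron_ppm(t_hours, core_water_kg=20_000.0,
              si_flow_kg_per_h=50_000.0, si_boron_ppm=2_500.0):
    boron_kg = core_water_kg * si_boron_ppm * 1e-6                 # initial inventory
    boron_kg += t_hours * si_flow_kg_per_h * si_boron_ppm * 1e-6   # carried in by SI
    return boron_kg / core_water_kg * 1e6

for t in (0, 2, 6, 12, 24):
    print(f"{t:>3} h : {boron_ppm(t):9,.0f} ppm")
```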

  1. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    Fujii-e, Y.; Kozawa, Y.; Namba, C.

    1987-03-01

    Fusion systems which are under development as future energy systems have reached a stage that the break even is expected to be realized in the near future. It is desirable to demonstrate that fusion systems are well acceptable to the societal environment. There are three crucial viewpoints to measure the acceptability, that is, technological feasibility, economy and safety. These three points have close interrelation. The safety problem is more important since three large scale tokamaks, JET, TFTR and JT-60, start experiment, and tritium will be introduced into some of them as the fusion fuel. It is desirable to establish a methodology to resolve the safety-related issues in harmony with the technological evolution. The promising fusion system toward reactors is not yet settled. This study has the objective to develop and adequate methodology which promotes the safety design of general fusion systems and to present a basis for proposing the R and D themes and establishing the data base. A framework of the methodology, the understanding and modeling of fusion systems, the principle of ensuring safety, the safety analysis based on the function and the application of the methodology are discussed. As the result of this study, the methodology for the safety analysis and evaluation of fusion systems was developed. New idea and approach were presented in the course of the methodology development. (Kako, I.)

  2. Incorporation of advanced accident analysis methodology into safety analysis reports

    International Nuclear Information System (INIS)

    2003-05-01

    as structural analysis codes and computational fluid dynamics codes (CFD) are applied. The initial code development took place in the sixties and seventies and resulted in a set of quite conservative codes for the reactor dynamics, thermal-hydraulics and containment analysis. The most important limitations of these codes came from insufficient knowledge of the physical phenomena and of the limited computer memory and speed. Very significant advances have been made in the development of the code systems during the last twenty years in all of the above areas. If the data for the physical models of the code are sufficiently well established and allow quite a realistic analysis, these newer versions are called advanced codes. The assumptions used in the deterministic safety analysis vary from very pessimistic to realistic assumptions. In the accident analysis terminology, it is customary to call the pessimistic assumptions 'conservative' and the realistic assumptions 'best estimate'. The assumptions can refer to the selection of physical models, the introduction of these models into the code, and the initial and boundary conditions including the performance and failures of the equipment and human action. The advanced methodology in the present report means application of advanced codes (or best estimate codes), which sometimes represent a combination of various advanced codes for separate stages of the analysis, and in some cases in combination with experiments. The Safety Analysis Reports are required to be available before and during the operation of the plant in most countries. The contents, scope and stages of the SAR vary among the countries. The guide applied in the USA, i.e. the Regulatory Guide 1.70 is representative for the way in which the SARs are made in many countries. During the design phase, a preliminary safety analysis report (PSAR) is requested in many countries and the final safety analysis report (FSAR) is required for the operating licence. There is

  3. Methodology for flood risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Casada, M.L.; Fussell, J.B.

    1984-01-01

    The methodology for flood risk analysis described here addresses the effects of a flood on nuclear power plant safety systems. Combining the results of this method with the probability of a flood allows the effects of flooding to be included in a probabilistic risk assessment. The five-step methodology includes accident sequence screening to focus the detailed analysis efforts on the accident sequences that are significantly affected by a flood event. The quantitative results include the flood's contribution to system failure probability, accident sequence occurrence frequency and consequence category occurrence frequency. The analysis can be added to existing risk assessments without a significant loss in efficiency. The results of two example applications show the usefulness of the methodology. Both examples rely on the Reactor Safety Study for the required risk assessment inputs and present changes in the Reactor Safety Study results as a function of flood probability
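
    A minimal sketch of the screening arithmetic, with invented numbers: the flood's contribution to an accident-sequence frequency is the flood frequency multiplied by the conditional failure probabilities of the affected systems (assumed independent here).

```python
# Hedged sketch of the screening arithmetic: the flood's contribution to an
# accident-sequence frequency is the flood frequency multiplied by the
# conditional failure probabilities of the systems it affects (assumed
# independent here). All numbers are invented.
flood_frequency_per_year = 1.0e-4            # site flood exceedance frequency
p_fail_given_flood = {                       # conditional failure probabilities
    "service_water_pumps": 0.5,
    "emergency_power": 0.2,
}

sequence_frequency = flood_frequency_per_year
for system, p in p_fail_given_flood.items():
    sequence_frequency *= p

print(f"Flood-induced sequence frequency: {sequence_frequency:.1e} per year")
```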

  4. Sensitivity analysis of critical experiment with direct perturbation compared to TSUNAMI-3D sensitivity analysis

    International Nuclear Information System (INIS)

    Barber, A. D.; Busch, R.

    2009-01-01

    The goal of this work is to obtain sensitivities from direct uncertainty analysis calculation and correlate those calculated values with the sensitivities produced from TSUNAMI-3D (Tools for Sensitivity and Uncertainty Analysis Methodology Implementation in Three Dimensions). A full sensitivity analysis is performed on a critical experiment to determine the overall uncertainty of the experiment. Small perturbation calculations are performed for all known uncertainties to obtain the total uncertainty of the experiment. The results from a critical experiment are only known as well as the geometric and material properties. The goal of this relationship is to simplify the uncertainty quantification process in assessing a critical experiment, while still considering all of the important parameters. (authors)
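
    A minimal sketch of the direct-perturbation idea, assuming the criticality calculation is wrapped as a function of its uncertain parameters; a made-up linear surrogate stands in for the transport solver. Each relative sensitivity is a central finite difference, and the sensitivities are combined with the input uncertainties in quadrature to give a total relative uncertainty.

```python
# Direct-perturbation sensitivity sketch. The k_eff surrogate below is a toy
# stand-in for a real criticality calculation; parameters and uncertainties
# are invented for illustration.
def k_eff(enrichment_wt_pct, radius_cm):
    return 0.60 + 0.05 * enrichment_wt_pct + 0.01 * radius_cm   # toy surrogate

def relative_sensitivity(f, params, name, delta=0.01):
    """S = (dk/k) / (dp/p) by central finite difference about the nominal point."""
    k0 = f(**params)
    up, down = dict(params), dict(params)
    up[name] *= 1.0 + delta
    down[name] *= 1.0 - delta
    return (f(**up) - f(**down)) / (2.0 * delta * k0)

nominal = {"enrichment_wt_pct": 4.0, "radius_cm": 15.0}
rel_unc = {"enrichment_wt_pct": 0.005, "radius_cm": 0.010}   # relative 1-sigma

sens = {p: relative_sensitivity(k_eff, nominal, p) for p in nominal}
total = sum((sens[p] * rel_unc[p]) ** 2 for p in nominal) ** 0.5
for p, s in sens.items():
    print(f"{p:<18} S = {s:+.4f}")
print(f"total relative k_eff uncertainty ~ {total:.5f}")
```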

  5. Critical analysis of industrial electron accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Korenev, S. E-mail: sergey_korenev@steris.com

    2004-10-01

    The critical analysis of electron linacs for industrial applications (degradation of PTFE, curing of composites, modification of materials, sterilization and others) is considered in this report. Main physical requirements for industrial electron accelerators consist in the variations of beam parameters, such as kinetic energy and beam power. Questions for regulation of these beam parameters are considered. The level of absorbed dose in the irradiated product and throughput determines the main parameters of electron accelerator. The type of ideal electron linac for industrial applications is discussed.

  6. Critical analysis of industrial electron accelerators

    Science.gov (United States)

    Korenev, S.

    2004-09-01

    The critical analysis of electron linacs for industrial applications (degradation of PTFE, curing of composites, modification of materials, sterilization and others) is considered in this report. Main physical requirements for industrial electron accelerators consist in the variations of beam parameters, such as kinetic energy and beam power. Questions for regulation of these beam parameters are considered. The level of absorbed dose in the irradiated product and throughput determines the main parameters of electron accelerator. The type of ideal electron linac for industrial applications is discussed.

  7. Critical analysis of industrial electron accelerators

    International Nuclear Information System (INIS)

    Korenev, S.

    2004-01-01

    The critical analysis of electron linacs for industrial applications (degradation of PTFE, curing of composites, modification of materials, sterilization and others) is considered in this report. Main physical requirements for industrial electron accelerators consist in the variations of beam parameters, such as kinetic energy and beam power. Questions for regulation of these beam parameters are considered. The level of absorbed dose in the irradiated product and throughput determines the main parameters of electron accelerator. The type of ideal electron linac for industrial applications is discussed

   8. Simplified methodology for analysis of Angra-1 containment

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1988-01-01

    A simplified analysis methodology was developed to simulate a large-break loss-of-coolant accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The results were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained by this new methodology, together with its small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author) [pt

  9. Fission reactor critical experiments and analysis

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    Work accomplished in support of nonweapons programs by LASL Group Q-14 is described. Included are efforts in basic critical measurements, nuclear criticality safety, a plasma core critical assembly, and reactivity coefficient measurements

  10. Criticality safety analysis for mockup facility

    International Nuclear Information System (INIS)

    Shin, Young Joon; Shin, Hee Sung; Kim, Ik Soo; Oh, Seung Chul; Ro, Seung Gy; Bae, Kang Mok

    2000-03-01

    Benchmark calculations with the SCALE 4.4 CSAS6 module have been performed for 31 UO2 fuel, 15 MOX fuel and 10 metal criticality experiments, and the calculation biases of the SCALE 4.4 CSAS6 module were found to be 0.00982, 0.00579 and 0.02347, respectively. When CSAS6 is applied to the criticality safety analysis of the mockup facility, in which several kinds of nuclear material components are included, the calculation bias of CSAS6 is conservatively taken to be 0.02347. With the aid of this benchmarked code system, criticality safety analyses for the mockup facility under normal and hypothetical accident conditions have been carried out. The maximum k_eff under normal conditions is 0.28356, well below the subcritical limit of k_eff = 0.95. In one hypothetical accident condition, the maximum k_eff is found to be 0.73527, much lower than the subcritical limit. For another hypothetical accident condition, in which nuclear material leaks out of its container and spreads or lumps on the floor, it was assumed that the material forms a slab with water present in its empty space. k_eff was calculated as a function of slab thickness and of the volume ratio of water to nuclear material. The results show that k_eff increases as the water volume ratio increases and reaches its maximum when the empty space of the nuclear material is completely filled with water. The maximum k_eff value, 0.93960, is lower than the subcritical limit
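
    A minimal sketch of how a calculated k_eff is compared with a subcritical limit once a validation bias is known. Whether the bias is added to the calculated value or subtracted from the limit is a convention choice; the sketch adds it to k_eff, and all values are illustrative rather than the facility's own results.

```python
# Hedged sketch of a bias-adjusted acceptance check. Convention assumption:
# the validation bias is added to the calculated k_eff (it could equally be
# subtracted from the limit). All numbers are illustrative.
def is_acceptable(k_calc, bias, subcritical_limit=0.95, extra_margin=0.0):
    k_adjusted = k_calc + abs(bias) + extra_margin   # penalize by the bias
    return k_adjusted, k_adjusted < subcritical_limit

for k_calc in (0.62, 0.88, 0.935):
    k_adj, ok = is_acceptable(k_calc, bias=0.02347)
    print(f"k_calc = {k_calc:.5f} -> adjusted {k_adj:.5f} : "
          f"{'acceptable' if ok else 'NOT acceptable'}")
```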

  11. Literature research of FMEA (Failure Mode and Effects Analysis) methodology

    International Nuclear Information System (INIS)

    Hustak, S.

    1999-01-01

    The potential of the FMEA applications is demonstrated. Some approaches can be used for system analysis or immediately for PSA, in particular, for obtaining background information for fault tree analysis in the area of component modelling and, to a lesser extent, for identification of the initiating events. On the other hand, other FMEA applications, such as criticality analysis, are unusable in PSA. (author)

  12. Prioritizing critical success factors for reverse logistics implementation using fuzzy-TOPSIS methodology

    Science.gov (United States)

    Agrawal, Saurabh; Singh, Rajesh K.; Murtaza, Qasim

    2016-03-01

    The electronics industry is one of the fastest growing industries in the world. In India too, turnover is high and demand for electronics products is growing, especially after liberalization in the early nineties. These products generate e-waste, which has become a big environmental issue. Industry can handle this e-waste and these product returns efficiently by developing a reverse logistics (RL) system. A thorough study of critical success factors (CSFs) and their ordered implementation is essential for successful RL implementation. The aim of the study is to review the CSFs and to prioritize them for RL implementation in the Indian electronics industry. Twelve CSFs were identified through a literature review and discussion with experts from the Indian electronics industry. The fuzzy Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) approach is proposed for prioritizing these CSFs. A perusal of the literature indicates that fuzzy-TOPSIS has not previously been applied to the prioritization of CSFs in the Indian electronics industry. Five Indian electronics companies were selected for evaluation of this methodology. Results indicate that most of the identified factors are crucial for RL implementation. Top management awareness, resource management, economic factors, and contract terms and conditions are the four highest-priority factors, while process capabilities and skilled workers is the lowest-priority factor. The findings will be useful for successful RL implementation in the Indian electronics industry.
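
    The paper applies a fuzzy variant of TOPSIS; the sketch below uses the simpler crisp TOPSIS on invented expert scores for four of the critical success factors, just to show the ranking mechanics (normalize, weight, measure distances to the ideal and anti-ideal solutions, and rank by relative closeness).

```python
import numpy as np

# Simplified crisp TOPSIS sketch (the paper uses a fuzzy variant) on invented
# expert scores for four critical success factors against three benefit-type
# criteria. Steps: normalize, weight, distance to ideal/anti-ideal, rank.
csfs = ["top management awareness", "resource management",
        "economic factors", "skilled workers"]
scores = np.array([
    [8.0, 7.0, 9.0],
    [7.0, 8.0, 7.0],
    [6.0, 9.0, 6.0],
    [4.0, 5.0, 5.0],
])
weights = np.array([0.40, 0.35, 0.25])

norm = scores / np.linalg.norm(scores, axis=0)        # vector normalization
weighted = norm * weights
ideal, anti_ideal = weighted.max(axis=0), weighted.min(axis=0)
d_plus = np.linalg.norm(weighted - ideal, axis=1)
d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)              # higher = higher priority

for rank, i in enumerate(np.argsort(-closeness), start=1):
    print(f"{rank}. {csfs[i]:<26} closeness = {closeness[i]:.3f}")
```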

  13. Methodological developments and applications of neutron activation analysis

    International Nuclear Information System (INIS)

    Kucera, J.

    2007-01-01

    The paper reviews the author's experience acquired and achievements made in methodological developments of neutron activation analysis (NAA) of mostly biological materials. These involve epithermal neutron activation analysis, radiochemical neutron activation analysis using both single- and multi-element separation procedures, use of various counting modes, and the development and use of the self-verification principle. The role of NAA in the detection of analytical errors is discussed and examples of applications of the procedures developed are given. (author)

  14. PIXE methodology of rare earth element analysis and its applications

    International Nuclear Information System (INIS)

    Ma Xinpei

    1992-01-01

    The Proton Induced X-ray Emission (PIXE) methodology for rare earth element (REE) analysis is discussed, including the significance of REE analysis, the principle of PIXE applied to REEs, the selection of characteristic X-rays for lanthanide-series elements, the deconvolution of highly overlapped PIXE spectra, and the minimum detection limit (MDL) of REEs. Some practical applications are presented, and the special features of PIXE analysis of high-purity REE chemicals are discussed. (author)

  15. Development of seismic risk analysis methodologies at JAERI

    International Nuclear Information System (INIS)

    Tanaka, T.; Abe, K.; Ebisawa, K.; Oikawa, T.

    1988-01-01

    The usefulness of probabilistic safety assessment (PSA) is recognized worldwide for balanced design and regulation of nuclear power plants. In Japan, the Japan Atomic Energy Research Institute (JAERI) has been engaged in developing the methodologies necessary for carrying out PSA. The research and development program was started in 1980 and initially addressed only internal-initiator PSA. In 1985 the program was expanded to include external event analysis. Although this expanded program is intended to cover various external initiators, the current effort is dedicated to seismic risk analysis. There are three levels of seismic PSA, similar to internal-initiator PSA: Level 1, evaluation of core damage frequency; Level 2, evaluation of radioactive release frequency and source terms; and Level 3, evaluation of environmental consequences. In the JAERI program, only the methodologies for Level 1 seismic PSA are under development. The methodology development for seismic risk analysis is divided into two phases. The Phase I study is to establish a whole set of simple methodologies based on currently available data. In Phase II, a sensitivity study will be carried out to identify the parameters whose uncertainty may result in large uncertainty in seismic risk, and for such parameters the methodology will be upgraded. The Phase I study has now almost been completed. In this report, outlines of the study and some of its outcomes are described
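
    A minimal sketch of the Level 1 arithmetic such a program targets: core-damage frequency is obtained by convolving a seismic hazard curve with a plant-level fragility curve. Both curves below (a power-law hazard and a lognormal fragility) and all their parameters are invented.

```python
import math

# Hedged sketch of a Level 1 seismic PSA convolution: core-damage frequency
# = sum over ground-motion bins of (frequency of the bin) x (conditional
# probability of core damage at that ground motion). All numbers hypothetical.
def hazard_exceedance(pga_g):              # annual frequency of exceeding pga
    return 1.0e-3 * (0.1 / pga_g) ** 2.0   # toy power-law hazard curve

def fragility(pga_g, median_g=0.6, beta=0.4):   # lognormal plant fragility
    z = math.log(pga_g / median_g) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

pga_bins = [0.1 + 0.05 * i for i in range(30)]       # 0.10 g .. 1.55 g
cdf = 0.0
for low, high in zip(pga_bins[:-1], pga_bins[1:]):
    bin_freq = hazard_exceedance(low) - hazard_exceedance(high)
    cdf += bin_freq * fragility(0.5 * (low + high))
print(f"Seismic core-damage frequency ~ {cdf:.2e} per year")
```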

  16. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
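
    A minimal sketch of the screening step, assuming a hypothetical four-input simulation response: sample the credible input ranges, run the model, and rank the inputs by the fraction of output variability each explains (here via squared correlation).

```python
import numpy as np

# Minimal screening sketch: sample the input ranges, run the model, and rank
# inputs by how much of the output variability they explain (squared
# correlation). The "model" below is an invented stand-in for a simulation.
rng = np.random.default_rng(0)

def model(x):                       # hypothetical 4-input simulation response
    return 3.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 2] + 0.0 * x[:, 3]

n, dim = 500, 4
samples = rng.uniform(0.0, 1.0, size=(n, dim))     # step 1: credible ranges
response = model(samples)

r2 = [np.corrcoef(samples[:, j], response)[0, 1] ** 2 for j in range(dim)]
for j in np.argsort(r2)[::-1]:                      # step 2: screen parameters
    print(f"input {j}: R^2 = {r2[j]:.3f}")
```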

  17. Taipower's transient analysis methodology for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, Pinghue

    1998-01-01

    The methodology presented in this paper is a part of 'Taipower's Reload Design and Transient Analysis Methodologies for Light Water Reactors' developed by the Taiwan Power Company (TPC) and the Institute of Nuclear Energy Research. This methodology utilizes four computer codes developed or sponsored by the Electric Power Research Institute: the system transient analysis code RETRAN-02, the core thermal-hydraulic analysis code COBRAIIIC, the three-dimensional spatial kinetics code ARROTTA, and the fuel rod evaluation code FREY. Each of the computer codes was extensively validated. Analysis methods and modeling techniques were conservatively established for each application using a systematic evaluation with the assistance of sensitivity studies. The qualification results and analysis methods were documented in detail in TPC topical reports. The topical reports for COBRAIIIC, ARROTTA, and FREY have been reviewed and approved by the Atomic Energy Council (AEC). TPC's in-house transient methodology has been successfully applied to provide valuable support for many operational issues and plant improvements for TPC's Maanshan Units 1 and 2. Major applications include the removal of the resistance temperature detector bypass system, the relaxation of the hot-full-power moderator temperature coefficient design criteria imposed by the ROCAEC due to a concern about Anticipated Transient Without Scram, the reduction of the boron injection tank concentration and the elimination of its heat tracing, and the reduction of reactor coolant system flow. (author)

  18. Software criticality analysis of COTS/SOUP

    International Nuclear Information System (INIS)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-01-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading

  19. Analysis of criticality experiments at SHE

    International Nuclear Information System (INIS)

    Takano, Makoto; Doi, Takeshi; Hirano, Mitsumasa; Shindo, Ryuichi; Oomura, Hiroshi

    1982-03-01

    In this report, the criticality experiments conducted for the core configurations of the Semi-Homogeneous Experimental Assembly (SHE)-8, 12, 13 and 14 are analyzed for the purpose of verifying the computer codes and calculational methods employed in the nuclear design of the VHTR. The codes DELIGHT-5 and CITATION calculate the neutron spectrum and the effective multiplication factor, respectively. Each SHE system is modeled in two-dimensional R-Z and triangular geometries and in three-dimensional triangular-Z geometry. Various effects such as axial buckling, modeling, and the difference between diffusion and transport theory are also taken into account. Calculated values of the effective multiplication factor disagree with the experimental values by approximately 1-3%. The analysis is therefore considered inadequate for verification, and a more precise analysis is required, with emphasis on how to model the system, condense the group constants, and estimate the buckling value for the spectrum calculation. (author)

  20. Software criticality analysis of COTS/SOUP

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-09-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading.

  1. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

    A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. First, a conceptual framework is built, which is used to analyze the causal relationships between organizational factors and human reliability. Then, the inference model for Human Reliability Analysis is built by combining the conceptual framework with Bayesian networks, which is used to execute the causal inference and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the specific application of the proposed methodology. The results show that the proposed methodology of combining the conceptual model with Bayesian networks can not only readily model the causal relationships between organizational factors and human reliability, but also, in a given context, quantitatively measure human operational reliability and identify the most likely root causes of human error and their prioritization. (authors)
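
    A minimal two-node sketch of the idea (one organizational factor influencing human error), with invented probabilities and solved by direct enumeration rather than a Bayesian-network library; it shows both the forward (causal) and backward (diagnostic) inference the abstract mentions.

```python
# Hedged two-node sketch (organizational factor -> human error), solved by
# direct enumeration rather than a BN library. All probabilities are invented.
p_training_adequate = 0.8
p_error = {True: 0.01, False: 0.10}     # P(human error | training adequate?)

# Causal (forward) inference: marginal human error probability
p_err = (p_training_adequate * p_error[True]
         + (1 - p_training_adequate) * p_error[False])

# Diagnostic (backward) inference: P(training inadequate | an error occurred)
p_inadequate_given_err = (1 - p_training_adequate) * p_error[False] / p_err

print(f"P(error)                       = {p_err:.3f}")
print(f"P(training inadequate | error) = {p_inadequate_given_err:.2f}")
```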

  2. A Quantitative Examination of Critical Success Factors Comparing Agile and Waterfall Project Management Methodologies

    Science.gov (United States)

    Pedersen, Mitra

    2013-01-01

    This study investigated the rate of success for IT projects using agile and standard project management methodologies. Any successful project requires use of project methodology. Specifically, large projects require formal project management methodologies or models, which establish a blueprint of processes and project planning activities. This…

  3. A Critical Analysis of IQ Studies of Adopted Children

    Science.gov (United States)

    Richardson, Ken; Norgate, Sarah H.

    2006-01-01

    The pattern of parent-child correlations in adoption studies has long been interpreted to suggest substantial additive genetic variance underlying variance in IQ. The studies have frequently been criticized on methodological grounds, but those criticisms have not reflected recent perspectives in genetics and developmental theory. Here we apply…

  4. Broadening the Study of Participation in the Life Sciences: How Critical Theoretical and Mixed-Methodological Approaches Can Enhance Efforts to Broaden Participation

    Science.gov (United States)

    Metcalf, Heather

    2016-01-01

    This research methods Essay details the usefulness of critical theoretical frameworks and critical mixed-methodological approaches for life sciences education research on broadening participation in the life sciences. First, I draw on multidisciplinary research to discuss critical theory and methodologies. Then, I demonstrate the benefits of these…

  5. Critical reading and critical thinking--study design and methodology: a personal approach on how to read the clinical literature.

    Science.gov (United States)

    Lipman, Timothy O

    2013-04-01

    The volume of medical literature grows exponentially. Yet we are faced with the necessity to make clinical decisions based on the availability and quality of scientific information. The general strength (reliability, robustness) of any interpretation that guides us in clinical decision making is dependent on how information was obtained. All information and medical studies and, consequently, all conclusions are not created equal. It is incumbent upon us to be able to assess the quality of the information that guides us in the care of our patients. Being able to assess medical literature critically requires use of critical reading and critical thinking skills. To achieve these skills, to be able to analyze medical literature critically, takes a combination of education and practice, practice, and more practice.

  6. A Local Approach Methodology for the Analysis of Ultimate Strength ...

    African Journals Online (AJOL)

    The local approach methodology in contrast to classical fracture mechanics can be used to predict the onset of tearing fracture, and the effects of geometry in tubular joints. Finite element analysis of T-joints plate geometries, and tubular joints has been done. The parameters of constraint, equivalent stress, plastic strain and ...

  7. Methodology for reactor core physics analysis - part 2

    International Nuclear Information System (INIS)

    Ponzoni Filho, P.; Fernandes, V.B.; Lima Bezerra, J. de; Santos, T.I.C.

    1992-12-01

    The computer codes used for reactor core physics analysis are described. The modifications introduced in the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameters involved in qualifying the methodology is included. (F.E.). 5 ref, 7 figs, 5 tabs

  8. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    Science.gov (United States)

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  9. The critical review of methodologies and approaches to assess the inherent skin sensitization potential (skin allergies) of chemicals. Part II

    DEFF Research Database (Denmark)

    Thyssen, Jacob P; Giménez-Arnau, Elena; Lepoittevin, Jean-Pierre

    2012-01-01

    To identify specific cases, classes or specific use situations of chemicals for which 'safety thresholds' or 'safety limits' were set (in regulations, standards, in scientific research/clinical work, etc.) and critically review the scientific and methodological parameters used to set those limits....

  10. Critical Inquiry for the Social Good: Methodological Work as a Means for Truth-Telling in Education

    Science.gov (United States)

    Kuntz, Aaron M.; Pickup, Austin

    2016-01-01

    This article questions the ubiquity of the term "critical" in methodological scholarship, calling for a renewed association of the term with projects concerned with social justice, truth-telling, and overt articulations of the social good. Drawing on Michel Foucault's work with parrhesia (or truth-telling) and Aristotle's articulation of…

  11. Criticality analysis for hazardous materials transportation; Classificacao da criticidade das rotas do transporte rodoviario de produtos perigosos da BRASKEM

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Katia; Brady, Mariana [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Diniz, Americo [BRASKEM S.A., Sao Paulo, SP (Brazil)

    2008-07-01

    The poor condition of Brazilian roads forces companies to be more demanding about the transportation of hazardous materials, in order to avoid accidents and material releases and to have actions in place to contain releases affecting communities and water sources. To address this situation, DNV and BRASKEM developed a risk analysis methodology called Criticality Analysis for Hazardous Materials Transportation. The objective of this methodology is to identify the most critical points of the routes so that actions can be taken to avoid accidents. (author)

  12. PROBLEMS AND METHODOLOGY OF THE PETROLOGIC ANALYSIS OF COAL FACIES.

    Science.gov (United States)

    Chao, Edward C.T.

    1983-01-01

    This condensed synthesis gives a broad outline of the methodology of coal facies analysis, procedures for constructing sedimentation and geochemical formation curves, and micro- and macrostratigraphic analysis. The hypothetical coal bed profile has a 3-fold cycle of material characteristics. Based on studies of other similar profiles of the same coal bed, and on field studies of the sedimentary rock types and their facies interpretation, one can assume that the 3-fold subdivision is of regional significance.

  13. Critical analysis of the Colombian mining legislation

    International Nuclear Information System (INIS)

    Vargas P, Elkin; Gonzalez S, Carmen Lucia

    2003-01-01

    The document analyses the Colombian mining legislation, Act 685 of 2001, on the basis of the reasons expressed by the government and the miners for its conception and approval. The document tries to determine the advances achieved by this new Mining Code with regard to international mining competitiveness and its adaptation to the constitutional rules on environment, indigenous communities, decentralization and sustainable development. The analysis formulates general and specific hypotheses about the proposed objectives of the reform, which are confronted with the arguments and a critical evaluation of the results. Most hypotheses are not verified, thus demonstrating that the Colombian mining legislation is far from being the instrument needed to promote mining activities, make them competitive according to international standards, and adapt them to the principles of sustainable development, a healthy environment, community participation, ethnic minorities and regional autonomy

  14. Critical analysis of algebraic collective models

    International Nuclear Information System (INIS)

    Moshinsky, M.

    1986-01-01

    By algebraic collective models the author understands all those based on specific Lie algebras, whether the latter are suggested through simple shell-model considerations, as in the case of the Interacting Boson Approximation (IBA), or have a detailed microscopic foundation, as in the symplectic model. To analyze these models critically, it is convenient to take a simple conceptual example of them in which all steps can be implemented analytically or through elementary numerical analysis. In this note the symplectic model in a two-dimensional space, i.e. one based on an sp(4,R) Lie algebra, is taken as an example, and it is shown how its complete discussion gives a clearer understanding of the structure of algebraic collective models of nuclei. In particular, the association of Hamiltonians related to maximal subalgebras of the basic Lie algebra with specific types of spectra is discussed, together with the connections between spectra and shapes

  15. Two methodologies for optical analysis of contaminated engine lubricants

    International Nuclear Information System (INIS)

    Aghayan, Hamid; Yang, Jun; Bordatchev, Evgueni

    2012-01-01

    The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, are intensively utilized in engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in the engine lubricants. The methodologies are based on optical analysis of the distortion effect when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant–object optical appearance, where an a priori known periodic structure of the object is distorted by a contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image, such as the gray scale intensity of lubricant and object, shape width at object and lubricant levels, object relative intensity and width non-uniformity coefficient are newly proposed. Variations in the contaminant concentration and use of different contaminants lead to the changes of these parameters measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectrums, transfer function
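
    A minimal sketch of the statistical variant, assuming an invented one-dimensional periodic reference pattern and a crude blur-plus-noise distortion model standing in for the contaminated lubricant film; the normalized cross-correlation between the reference and the observed image drops as the contamination level rises.

```python
import numpy as np

# Sketch of the statistical variant: compare a known periodic reference pattern
# with its image seen through the lubricant film via a normalized
# cross-correlation coefficient. The distortion model (blur plus noise) and all
# parameters are invented for illustration.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 8.0 * np.pi, 800)
reference = (np.sign(np.sin(x)) + 1.0) / 2.0            # known periodic object

def observe(pattern, contamination):
    """Crude distortion model: moving-average blur plus additive noise."""
    width = max(1, int(contamination * 40))
    kernel = np.ones(width) / width
    blurred = np.convolve(pattern, kernel, mode="same")
    return blurred + rng.normal(0.0, contamination * 0.2, pattern.size)

for level in (0.05, 0.2, 0.5):
    image = observe(reference, level)
    rho = np.corrcoef(reference, image)[0, 1]           # cross-correlation
    print(f"contamination {level:.2f} -> correlation {rho:.3f}")
```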

  16. Criticality Model

    International Nuclear Information System (INIS)

    Alsaed, A.

    2004-01-01

    The ''Disposal Criticality Analysis Methodology Topical Report'' (YMP 2003) presents the methodology for evaluating potential criticality situations in the monitored geologic repository. As stated in the referenced Topical Report, the detailed methodology for performing the disposal criticality analyses will be documented in model reports. Many of the models developed in support of the Topical Report differ from the definition of models as given in the Office of Civilian Radioactive Waste Management procedure AP-SIII.10Q, ''Models'', in that they are procedural, rather than mathematical. These model reports document the detailed methodology necessary to implement the approach presented in the Disposal Criticality Analysis Methodology Topical Report and provide calculations utilizing the methodology. Thus, the governing procedure for this type of report is AP-3.12Q, ''Design Calculations and Analyses''. The ''Criticality Model'' is of this latter type, providing a process for evaluating the criticality potential of in-package and external configurations. The purpose of this analysis is to lay out the process for calculating the criticality potential for various in-package and external configurations and to calculate lower-bound tolerance limit (LBTL) values and determine range-of-applicability (ROA) parameters. The LBTL calculations and the ROA determinations are performed using selected benchmark experiments that are applicable to various waste forms and various in-package and external configurations. The waste forms considered in this calculation are pressurized water reactor (PWR), boiling water reactor (BWR), Fast Flux Test Facility (FFTF), Training Research Isotope General Atomic (TRIGA), Enrico Fermi, Shippingport pressurized water reactor, Shippingport light water breeder reactor (LWBR), N-Reactor, Melt and Dilute, and Fort Saint Vrain Reactor spent nuclear fuel (SNF). The scope of this analysis is to document the criticality computational method. The criticality

  17. SWANS: A Prototypic SCALE Criticality Sequence for Automated Optimization Using the SWAN Methodology

    International Nuclear Information System (INIS)

    Greenspan, E.

    2001-01-01

    SWANS is a new prototypic analysis sequence that provides an intelligent, semi-automatic search for the maximum k_eff of a given amount of specified fissile material, or for the minimum critical mass. It combines the optimization strategy of the SWAN code with the composition-dependent resonance self-shielded cross sections of the SCALE package. For a given system composition arrived at during the iterative optimization process, the value of k_eff is as accurate and reliable as that obtained using the CSAS1X sequence of SCALE-4.4. This report describes how SWAN is integrated within the SCALE system to form the new prototypic optimization sequence, describes the optimization procedure, provides a user guide for SWANS, and illustrates its application to five different types of problems. In addition, the report illustrates that resonance self-shielding might have a significant effect on the maximum k_eff value a given fissile material mass can have
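
    A minimal sketch of the optimization loop only (neither SWAN nor SCALE is modeled): one composition variable is adjusted iteratively, a move is kept when the recomputed k_eff improves, and the step is halved otherwise; a toy single-peaked surrogate stands in for the cross-section and transport calculation.

```python
# Hedged sketch of an iterative composition search (not SWAN/SCALE themselves):
# adjust one variable, keep the move if the recomputed k_eff improves, shrink
# the step otherwise. The k_eff surrogate is a toy single-peaked function of
# the moderator-to-fuel ratio; all numbers are invented.
def k_eff(mod_to_fuel_ratio):
    return 0.95 - 0.02 * (mod_to_fuel_ratio - 3.0) ** 2   # toy surrogate

ratio, step = 1.0, 1.0
best = k_eff(ratio)
for _ in range(50):
    for candidate in (ratio + step, ratio - step):
        k = k_eff(candidate)
        if candidate > 0.0 and k > best:
            ratio, best = candidate, k
            break
    else:
        step *= 0.5                     # no improvement: refine the step size

print(f"optimum moderator-to-fuel ratio ~ {ratio:.3f}, max k_eff ~ {best:.5f}")
```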

  18. Development of Audit Calculation Methodology for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joosuk; Kim, Gwanyoung; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The interim criteria contain more stringent limits than the previous ones. For example, pellet-to-cladding mechanical interaction (PCMI) was introduced as a new failure criterion, and both short-term (e.g., fuel-to-coolant interaction, rod burst) and long-term (e.g., fuel rod ballooning, flow blockage) phenomena should be addressed for core coolability assurance. For dose calculations, transient-induced fission gas release additionally has to be accounted for. Traditionally, the approved RIA analysis methodologies for licensing applications were developed based on a conservative approach, but the newly introduced safety criteria tend to reduce the margins to the criteria. Licensees are therefore trying to improve the margins by utilizing a less conservative approach. To cope with this trend, a new audit calculation methodology needs to be developed. In this paper, the new methodology, which is currently under development at KINS, is introduced. For the development of an audit calculation methodology for RIA safety analysis based on the realistic evaluation approach, preliminary calculations utilizing a best-estimate code have been performed for the initial core of the APR1400. The main conclusions are as follows. - With the assumption of a single full-strength control rod ejection in the HZP condition, rod failure due to PCMI is not predicted. - Coolability can be assured in view of enthalpy and fuel melting. - However, rod failure due to DNBR is expected, and there is also a possibility of fuel failure at rated power conditions.

  19. Analysis methodology for the post-trip return to power steam line break event

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chul Shin; Kim, Chul Woo; You, Hyung Keun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-06-01

    An analysis of Steam Line Break (SLB) events which result in a Return-to-Power (RTP) condition after reactor trip was performed for a postulated Yonggwang Nuclear Power Plant Unit 3 cycle 8. The analysis methodology for post-trip RTP SLB is quite different from that for non-RTP SLB and is more difficult. Therefore, it is necessary to develop a methodology to analyze the response of the NSSS parameters to post-trip RTP SLB events and the fuel performance after the total reactivity exceeds criticality. In this analysis, the cases with and without offsite power were simulated crediting the 3-D reactivity feedback effect due to a local heatup in the vicinity of the stuck CEA, and compared with the cases without 3-D reactivity feedback with respect to post-trip fuel performance, departure from nucleate boiling ratio (DNBR) and linear heat generation rate (LHGR). 36 tabs., 32 figs., 11 refs. (Author)

  20. Teaching for Art Criticism: Incorporating Feldman's Critical Analysis Learning Model in Students' Studio Practice

    Science.gov (United States)

    Subramaniam, Maithreyi; Hanafi, Jaffri; Putih, Abu Talib

    2016-01-01

    This study adopted 30 first year graphic design students' artwork, with critical analysis using Feldman's model of art criticism. Data were analyzed quantitatively; descriptive statistical techniques were employed. The scores were viewed in the form of mean score and frequencies to determine students' performances in their critical ability.…

  1. Rethinking Critical Mathematics: A Comparative Analysis of Critical, Reform, and Traditional Geometry Instructional Texts

    Science.gov (United States)

    Brantlinger, Andrew

    2011-01-01

    This paper presents findings from a comparative analysis of three similar secondary geometry texts, one critical unit, one standards-based reform unit, and one specialist chapter. I developed the critical unit as I took the tenets of critical mathematics (CM) and substantiated them in printed curricular materials in which to teach as part of a…

  2. Critical Discourse Analysis in Education: A Review of the Literature, 2004 to 2012

    Science.gov (United States)

    Rogers, Rebecca; Schaenen, Inda; Schott, Christopher; O'Brien, Kathryn; Trigos-Carrillo, Lina; Starkey, Kim; Chasteen, Cynthia Carter

    2016-01-01

    This article reviews critical discourse analysis scholarship in education research from 2004 to 2012. Our methodology was carried out in three stages. First, we searched educational databases. Second, we completed an analytic review template for each article and encoded these data into a digital spreadsheet to assess macro-trends in the field.…

  3. The Brazilian Experience with Agroecological Extension: A Critical Analysis of Reform in a Pluralistic Extension System

    Science.gov (United States)

    Diesel, Vivien; Miná Dias, Marcelo

    2016-01-01

    Purpose: To analyze the Brazilian experience in designing and implementing a recent extension policy reform based on agroecology, and reflect on its wider theoretical implications for extension reform literature. Design/methodology/approach: Using a critical public analysis we characterize the evolution of Brazilian federal extension policy…

  4. A cost effective approach for criticality accident analysis of a DOE SNF storage facility

    International Nuclear Information System (INIS)

    Garrett, R.L.; Couture, G.F.; Gough, S.T.

    1997-01-01

    This paper presents the methodologies used to derive criticality accident analyses for a spent nuclear fuel receipt, storage, handling, and shipping facility. Two criticality events are considered: process-induced and Natural Phenomena Hazards (NPH)-induced. The criticality analyses required the development of: (1) the frequency at which each scenario occurred, (2) the estimated number of fissions for each scenario, and (3) the consequences associated with each criticality scenario. A fault tree analysis was performed to quantify the frequency of criticality due to process-induced events. For the frequency analysis of NPH-induced criticality, a probabilistic approach was employed. To estimate the consequences of a criticality event, the resulting fission yield was determined using a probabilistic approach. For estimating the source term, an overall conservatism of 95% was targeted. This methodology, applied to the facility criticality scenarios, indicated that: (1) the 95th percentile yield levels from the historical yield distributions are approximately 5 x 10{sup 17} fissions and 5 x 10{sup 18} fissions for the internal-event-induced and NPH-induced criticality events, respectively; and (2) using probabilistic Latin Hypercube Sampling, the downwind 95th percentile dose to a receptor at the US DOE reservation boundary is 2.2 mrem. This estimate is compared to the bounding dose of 1.4 rem. 4 refs., 1 fig
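
    The probabilistic yield and dose estimation described above can be illustrated with a brief Latin Hypercube Sampling sketch. The input distributions, the dose conversion constant and the simple multiplicative dose model below are illustrative assumptions and are not taken from the referenced facility analysis.

```python
# Minimal sketch of percentile dose estimation with Latin Hypercube Sampling (LHS).
# The fission-yield and dispersion-factor distributions and the dose model are
# illustrative assumptions only; they are not from the referenced facility analysis.
import numpy as np
from scipy.stats import qmc, lognorm

n_samples = 10_000
sampler = qmc.LatinHypercube(d=2, seed=42)
u = sampler.random(n=n_samples)                        # stratified uniforms in [0, 1)^2

# Hypothetical lognormal inputs: fission yield and an atmospheric dispersion factor.
fissions = lognorm(s=1.0, scale=1e17).ppf(u[:, 0])     # fissions per event
chi_over_q = lognorm(s=0.8, scale=1e-6).ppf(u[:, 1])   # dispersion factor, s/m^3

dose_per_fission = 1.0e-15   # rem per fission per (s/m^3); assumed conversion constant
dose = fissions * chi_over_q * dose_per_fission        # dose at the receptor, rem

print(f"95th percentile dose: {np.percentile(dose, 95):.2e} rem")
```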

  5. CRITICAL ANALYSIS OF THE RELIABILITY OF INTUITIVE MORAL DECISIONS

    Directory of Open Access Journals (Sweden)

    V. V. Nadurak

    2017-06-01

    Full Text Available Purpose of the research is a critical analysis of the reliability of intuitive moral decisions. Methodology. The work is based on the methodological attitude of empirical ethics, involving the use of findings from empirical research in ethical reflection and decision making. Originality. The main kinds of intuitive moral decisions are identified: (1) intuitively emotional decisions (i.e. decisions made under the influence of the emotions that accompany the process of moral decision making); (2) decisions made under the influence of morally risky psychological aptitudes (unconscious human tendencies that make us think in a certain way and make decisions that are unacceptable from the logical and ethical point of view); (3) intuitively normative decisions (decisions made under the influence of socially learned norms that cause the evaluative feeling «good-bad» without conscious reasoning). It was found that all of these kinds of intuitive moral decisions can lead to mistakes in the moral life. Conclusions. Considering the fact that intuition systematically leads to erroneous moral decisions, intuitive reaction cannot be the only source for making such decisions. Conscious rational reasoning can compensate for the weaknesses of intuition. In this case, there is a need for a theoretical model that would structure the knowledge about the interactions between intuitive and rational factors in moral decision making and become the basis for suggestions that would help us to make the right moral decision.

  6. Latest developments on safety analysis methodologies at the Juzbado plant

    International Nuclear Information System (INIS)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A.

    2010-01-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, namely the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, such as regulatory requirements, operational experience and, of course, the strong commitment of ENUSA to maintain the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, and within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and the investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated early in 2009 to assess the actual operating conditions of all the systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points related to these three methodologies as well as an outline of the results obtained so far. (authors)

  7. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    The human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that come together for an error to occur in human tasks performed under normal operating conditions and in those carried out after an abnormal event. Additionally, the analysis of various accidents throughout history has shown that the human component has been a contributing factor to their causes. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the development of the first generation of HRA methodologies. Methods were subsequently developed that include additional performance shaping factors and the interaction between them in their models. Thus, by the mid-1990s, what are considered the second generation methodologies emerged. Among these is the methodology A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because it includes in its modeling errors of commission, the quantification of additional deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required an independent evaluation of the two related human failure events. Gathering the new human error probabilities therefore involves quantifying the nominal scenario and the cases of significant deviations considered because of their potential impact on the analyzed human failure events. As in the probabilistic safety analysis, the more specific factors with the highest contribution to the human error probabilities were extracted from the analysis of the sequences. (Author)

  8. Teaching For Art Criticism: Incorporating Feldman’s Critical Analysis Learning Model In Students’ Studio Practice

    OpenAIRE

    Maithreyi Subramaniam; Jaffri Hanafi; Abu Talib Putih

    2016-01-01

    This study adopted 30 first year graphic design students’ artwork, with critical analysis using Feldman’s model of art criticism. Data were analyzed quantitatively; descriptive statistical techniques were employed. The scores were viewed in the form of mean score and frequencies to determine students’ performances in their critical ability. Pearson Correlation Coefficient was used to find out the correlation between students’ studio practice and art critical ability scores. The...

  9. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej; Pereší ni, Peter; Kostić, Dejan; Canini, Marco

    2018-01-01

    and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a

  10. Analysis of critically refracted longitudinal waves

    Energy Technology Data Exchange (ETDEWEB)

    Pei, Ning, E-mail: npei@iastate.edu; Bond, Leonard J., E-mail: npei@iastate.edu [Center for Nondestructive Evaluation, Iowa State University, Ames, IA 50011 (United States)

    2015-03-31

    Fabrication processes, such as welding, forging, and rolling, can induce residual stresses in metals that will impact product performance and phenomena such as cracking and corrosion. To better manage residual stress, tools are needed to map its distribution. The critically refracted ultrasonic longitudinal (LCR) wave is one such approach that has been used for residual stress characterization. It has been shown to be sensitive to stress and less sensitive to the effects of the texture of the material. Although the LCR wave is increasingly widely applied, the factors that influence the formation of the LCR beam are seldom discussed. This paper reports a numerical model used to investigate the transducer parameters that can contribute to the directionality of the LCR wave and hence enable performance optimization when used for industrial applications. An orthogonal test method is used to study the transducer parameters which influence the LCR wave beams. This method provides a design tool that can be used to study and optimize multiple-parameter experiments and can identify which parameter or parameters are of most significance. The simulation of the sound field in a 2-D 'water-steel' model is obtained using a Spatial Fourier Analysis method. The effects of incident angle, standoff, the aperture and the center frequency of the transducer were studied. Results show that the aperture of the transducer, the center frequency and the incident angle are the most important factors in controlling the directivity of the resulting LCR wave fields.
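
    Because the record concerns critically refracted longitudinal (LCR) waves generated at a water-steel interface, a short sketch of the first critical angle from Snell's law may be a useful complement; the wave speeds used are typical handbook values and are assumptions, not data from the paper.

```python
# Minimal sketch: first critical angle for an LCR wave at a water-steel interface.
# Velocities are nominal handbook values (assumed), not taken from the paper.
import math

v_water = 1480.0   # longitudinal wave speed in water, m/s (assumed)
v_steel = 5900.0   # longitudinal wave speed in steel, m/s (assumed)

# Snell's law with a 90-degree refracted longitudinal wave gives the first critical angle.
theta_cr = math.degrees(math.asin(v_water / v_steel))
print(f"First critical angle ~ {theta_cr:.1f} degrees")   # roughly 14.5 degrees
```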

  11. Workplace bullying prevention: a critical discourse analysis.

    Science.gov (United States)

    Johnson, Susan L

    2015-10-01

    The aim of this study was to analyse the discourses of workplace bullying prevention of hospital nursing unit managers and in the official documents of the organizations where they worked. Workplace bullying can be a self-perpetuating problem in nursing units. As such, efforts to prevent this behaviour may be more effective than efforts to stop ongoing bullying. There is limited research on how healthcare organizations characterize their efforts to prevent workplace bullying. This was a qualitative study. Critical discourse analysis and Foucault's writings on governmentality and discipline were used to analyse data from interviews with hospital nursing unit managers (n = 15) and organizational documents (n = 22). Data were collected in 2012. The discourse of workplace bullying prevention centred around three themes: prevention of workplace bullying through managerial presence, normalizing behaviours and controlling behaviours. All three are individual level discourses of workplace bullying prevention. Current research indicates that workplace bullying is a complex issue with antecedents at the individual, departmental and organizational level. However, the discourse of the participants in this study only focused on prevention of bullying by moulding the behaviours of individuals. The effective prevention of workplace bullying will require departmental and organizational initiatives. Leaders in all types of organizations can use the results of this study to examine their organizations' discourses of workplace bullying prevention to determine where change is needed. © 2015 John Wiley & Sons Ltd.

  12. Critical analysis of the pedagogical practice of the teachers trainnees

    Directory of Open Access Journals (Sweden)

    Mónica Ruiz Quiroga

    2013-07-01

    Full Text Available This article reports the results of a research project supported by the Research Center of the Universidad Pedagógica Nacional, whose purpose was the redefinition of the training process of the students, within the frame of the pedagogical practice, in one of the research lines of the Degree in Elementary Education with emphasis on Social Sciences. On a theoretical level, analysis and discussion were developed from critical pedagogy, particularly the concepts of pedagogical practice, training and systematization of experiences. Methodologically, the project was developed from Educational Action Research. It was found that students and teachers conceive pedagogical practice in a critical way, related to their reflective and transformative personalities, something that breaks, in some way, with the traditional outlook that defines it as the confirmation of theory in the field. This way of conceiving it is the result of both the training process and the life history of each of them, as well as of the staging and discussion of the significance of the practice within the social sciences framework.

  13. Snapshot analysis for rhodium fixed incore detector using BEACON methodology

    International Nuclear Information System (INIS)

    Cha, Kyoon Ho; Choi, Yu Sun; Lee, Eun Ki; Park, Moon Ghu; Morita, Toshio; Heibel, Michael D.

    2004-01-01

    The purpose of this report is to process the rhodium detector data of the Yonggwang nuclear unit 4 cycle 5 core for the measured power distribution by using the BEACON methodology. Rhodium snapshots of the YGN 4 cycle 5 have been analyzed by both BEACON/SPINOVA and CECOR to compare the results of both codes, drawing on a large number of snapshots obtained during normal plant operation. Reviewing the results of this analysis, BEACON/SPINOVA can be used for the snapshot analysis of Korean Standard Nuclear Power (KSNP) plants

  14. Interpretive Phenomenological Analysis: An Appropriate Methodology for Educational Research?

    Directory of Open Access Journals (Sweden)

    Edward John Noon

    2018-04-01

    Full Text Available Interpretive phenomenological analysis (IPA) is a contemporary qualitative methodology, first developed by psychologist Jonathan Smith (1996). Whilst its roots are in psychology, it is increasingly being drawn upon by scholars in the human, social and health sciences (Charlick, Pincombe, McKellar, & Fielder, 2016). Despite this, IPA has received limited attention across educationalist literature. Drawing upon my experiences of using IPA to explore the barriers to the use of humour in the teaching of Childhood Studies (Noon, 2017), this paper will discuss its theoretical orientation, sampling and methods of data collection and analysis, before examining the strengths and weaknesses of IPA’s employment in educational research.

  15. A methodology for uncertainty analysis of reference equations of state

    DEFF Research Database (Denmark)

    Cheung, Howard; Frutiger, Jerome; Bell, Ian H.

    We present a detailed methodology for the uncertainty analysis of reference equations of state (EOS) based on Helmholtz energy. In recent years there has been an increased interest in uncertainties of property data and process models of thermal systems. In the literature there are various...... for uncertainty analysis is suggested as a tool for EOS. The uncertainties of the EOS properties are calculated from the experimental values and the EOS model structure through the parameter covariance matrix and subsequent linear error propagation. This allows reporting the uncertainty range (95% confidence...
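
    The covariance-based linear error propagation mentioned in the abstract can be sketched generically: given a parameter covariance matrix and the Jacobian of a property with respect to the fitted parameters, the property variance follows from J*Sigma*J^T. The toy model function and the numerical values below are placeholders, not the reference Helmholtz-energy EOS.

```python
# Minimal sketch of first-order (linear) error propagation through a model, as used
# for EOS property uncertainties. The toy property model, fitted parameters and
# covariance matrix are illustrative placeholders, not the Helmholtz-energy EOS.
import numpy as np

def prop(theta, T):
    """Toy property model y(T; theta) standing in for an EOS-derived property."""
    a, b = theta
    return a * T + b * T**2

theta_hat = np.array([0.5, 1.2e-3])             # fitted parameters (assumed)
cov_theta = np.array([[4e-4, 1e-6],             # parameter covariance matrix (assumed)
                      [1e-6, 9e-8]])

T = 300.0
J = np.array([T, T**2])                         # Jacobian of y with respect to (a, b)
var_y = J @ cov_theta @ J                       # linear propagation: J * Sigma * J^T
half_width = 1.96 * np.sqrt(var_y)              # approximate 95% confidence half-width

print(f"y(300 K) = {prop(theta_hat, T):.1f} +/- {half_width:.1f} (95% confidence)")
```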

  16. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies for the analysis of uncertainties employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e. the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and at an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For this purpose, the report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET and a methodology for quantification of APET uncertainty inputs with its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes
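
    As a rough illustration of item (3), the statistical propagation of uncertain branch probabilities through an accident progression event tree, the sketch below samples branch-point probabilities from assumed distributions and propagates them to an end-state frequency; the tree structure, the distributions and the initiating frequency are invented for illustration.

```python
# Minimal sketch: Monte Carlo propagation of uncertain branch probabilities through a
# two-branch-point event tree to an end-state frequency. The tree layout, the input
# distributions and the initiating (core damage) frequency are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

init_freq = 1e-5                                   # core damage frequency per year (assumed)
p_phenomenon = rng.beta(2, 18, n)                  # uncertain phenomenological branch probability
p_cont_fail = rng.lognormal(np.log(0.05), 0.7, n).clip(max=1.0)   # containment failure probability

end_state_freq = init_freq * p_phenomenon * p_cont_fail

lo, med, hi = np.percentile(end_state_freq, [5, 50, 95])
print(f"End-state frequency: median {med:.2e}/yr, 90% interval [{lo:.2e}, {hi:.2e}]/yr")
```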

  17. SWANS: A Prototypic SCALE Criticality Sequence for Automated Optimization Using the SWAN Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Greenspan, E.

    2001-01-11

    SWANS is a new prototypic analysis sequence that provides an intelligent, semi-automatic search for the maximum k{sub eff} of a given amount of specified fissile material, or of the minimum critical mass. It combines the optimization strategy of the SWAN code with the composition-dependent resonance self-shielded cross sections of the SCALE package. For a given system composition arrived at during the iterative optimization process, the value of k{sub eff} is as accurate and reliable as obtained using the CSAS1X Sequence of SCALE-4.4. This report describes how SWAN is integrated within the SCALE system to form the new prototypic optimization sequence, describes the optimization procedure, provides a user guide for SWANS, and illustrates its application to five different types of problems. In addition, the report illustrates that resonance self-shielding might have a significant effect on the maximum k{sub eff} value a given fissile material mass can have.
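
    The kind of iterative search SWANS automates, adjusting a composition variable to maximize k{sub eff} for a fixed fissile mass, can be caricatured with a one-dimensional bounded search over a surrogate curve; the surrogate function below is purely illustrative and has no connection to the SCALE cross-section treatment.

```python
# Minimal sketch of a semi-automatic search for the moderation ratio that maximizes
# k_eff for a fixed fissile mass. The log-quadratic surrogate for k_eff is purely
# illustrative; SWANS itself drives SCALE/CSAS1X cross-section and k_eff calculations.
import numpy as np
from scipy.optimize import minimize_scalar

def surrogate_keff(h_to_x):
    """Toy k_eff versus hydrogen-to-fissile atom ratio (assumed shape, peaked near 500)."""
    return 1.35 * np.exp(-0.5 * ((np.log(h_to_x) - np.log(500.0)) / 0.9) ** 2)

res = minimize_scalar(lambda x: -surrogate_keff(x), bounds=(10.0, 5000.0), method="bounded")
print(f"Optimum H/X ~ {res.x:.0f}, surrogate k_eff ~ {surrogate_keff(res.x):.3f}")
```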

  18. A methodology for strain-based fatigue reliability analysis

    International Nuclear Information System (INIS)

    Zhao, Y.X.

    2000-01-01

    A significant scatter of the cyclic stress-strain (CSS) responses should be noted for a nuclear reactor material, 1Cr18Ni9Ti pipe-weld metal. The existence of this scatter implies that a random applied cyclic strain history will be introduced under any of the loading modes, even a deterministic loading history. A non-conservative evaluation might be made in practice if the scatter is not considered. A methodology for strain-based fatigue reliability analysis, which takes the scatter into account, is developed. The responses are approximately modeled by probability-based CSS curves of the Ramberg-Osgood relation. The strain-life data are modeled, similarly, by probability-based strain-life curves of the Coffin-Manson law. The reliability assessment is constructed by considering the interference of the random applied fatigue strain and strain capacity histories. Probability density functions of the applied and capacity histories are given analytically. The methodology can be conveniently extrapolated to the case of a deterministic CSS relation, as existing methods do. The non-conservatism of the deterministic CSS relation and the applicability of the present methodology are indicated by an analysis of the material test results
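
    The probability-based Ramberg-Osgood and Coffin-Manson relations cited in the abstract lend themselves to a compact Monte Carlo reliability sketch in which the applied strain is interfered with the strain capacity at a target life; all parameter values and scatter levels below are invented placeholders, not the 1Cr18Ni9Ti test data.

```python
# Minimal Monte Carlo sketch of strain-based fatigue reliability: the applied strain
# amplitude (scattered Ramberg-Osgood cyclic stress-strain curve) is interfered with
# the strain capacity at a target life (scattered Coffin-Manson strain-life curve).
# All parameter values and scatter levels are invented, not the paper's 1Cr18Ni9Ti data.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

E = 193e3               # Young's modulus, MPa (assumed)
sigma_a = 400.0         # applied stress amplitude, MPa (assumed)
N_design = 1e4          # target number of cycles (assumed)

# Ramberg-Osgood cyclic stress-strain curve with scattered K' and n'.
K_p = rng.normal(1200.0, 80.0, n)      # cyclic strength coefficient, MPa
n_p = rng.normal(0.15, 0.01, n)        # cyclic strain hardening exponent
eps_applied = sigma_a / E + (sigma_a / K_p) ** (1.0 / n_p)

# Coffin-Manson strain-life curve with scattered fatigue coefficients.
sig_f = rng.normal(900.0, 60.0, n)     # fatigue strength coefficient, MPa
eps_f = rng.normal(0.35, 0.05, n)      # fatigue ductility coefficient
b, c = -0.09, -0.55                    # fatigue exponents (assumed deterministic)
eps_capacity = (sig_f / E) * (2 * N_design) ** b + eps_f * (2 * N_design) ** c

pf = np.mean(eps_applied > eps_capacity)   # interference: applied strain exceeds capacity
print(f"Estimated probability of failure at {N_design:.0e} cycles: {pf:.3e}")
```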

  19. The Combined Effect of Mere Exposure, Counterattitudinal Advocacy, and Art Criticism Methodology on Upper Elementary and Junior High Students' Affect Toward Art Works.

    Science.gov (United States)

    Hollingsworth, Patricia

    1983-01-01

    Results indicated that, for elementary students, art criticism was more effective than a combination of methodologies for developing positive affect toward art works. For junior high students, the combination methodology was more effective than art criticism, the exposure method, or the counterattitudinal advocacy method. (Author/SR)

  20. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES

    International Nuclear Information System (INIS)

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H. NRC.

    2005-01-01

    Several of the new generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model the embedment effect when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at the Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) the description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOBs, (2) the comparison of the analysis results for the different DOBs between the two methods, and (3) the performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The resulting assessment from this study indicates that simplified methods may be capable of capturing the seismic response for much more deeply embedded structures than standard practice would normally allow

  1. Teaching For Art Criticism: Incorporating Feldman’s Critical Analysis Learning Model In Students’ Studio Practice

    Directory of Open Access Journals (Sweden)

    Maithreyi Subramaniam

    2016-01-01

    Full Text Available This study adopted 30 first year graphic design students’ artwork, with critical analysis using Feldman’s model of art criticism. Data were analyzed quantitatively; descriptive statistical techniques were employed. The scores were viewed in the form of mean scores and frequencies to determine students’ performances in their critical ability. Pearson Correlation Coefficient was used to find out the correlation between students’ studio practice and art critical ability scores. The findings showed most students performed slightly better than average in the critical analyses and performed best in analysis among the four dimensions assessed. In the context of the students’ studio practice and critical ability, the findings showed there are some connections between the students’ art critical ability and studio practice.

  2. Bayesian methodology for the design and interpretation of clinical trials in critical care medicine: a primer for clinicians.

    Science.gov (United States)

    Kalil, Andre C; Sun, Junfeng

    2014-10-01

    To review Bayesian methodology and its utility to clinical decision making and research in the critical care field. Clinical, epidemiological, and biostatistical studies on Bayesian methods in PubMed and Embase from their inception to December 2013. Bayesian methods have been extensively used by a wide range of scientific fields, including astronomy, engineering, chemistry, genetics, physics, geology, paleontology, climatology, cryptography, linguistics, ecology, and computational sciences. The application of medical knowledge in clinical research is analogous to the application of medical knowledge in clinical practice. Bedside physicians have to make most diagnostic and treatment decisions on critically ill patients every day without clear-cut evidence-based medicine (more subjective than objective evidence). Similarly, clinical researchers have to make most decisions about trial design with limited available data. Bayesian methodology allows both subjective and objective aspects of knowledge to be formally measured and transparently incorporated into the design, execution, and interpretation of clinical trials. In addition, various degrees of knowledge and several hypotheses can be tested at the same time in a single clinical trial without the risk of multiplicity. Notably, the Bayesian technology is naturally suited for the interpretation of clinical trial findings for the individualized care of critically ill patients and for the optimization of public health policies. We propose that the application of the versatile Bayesian methodology in conjunction with the conventional statistical methods is not only ripe for actual use in critical care clinical research but it is also a necessary step to maximize the performance of clinical trials and its translation to the practice of critical care medicine.
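
    To make the Bayesian logic described here concrete, a minimal conjugate Beta-Binomial update for a trial's response rate is sketched below; the prior and the trial counts are invented for illustration and do not come from the cited review.

```python
# Minimal sketch of a Bayesian (Beta-Binomial) update for a treatment response rate in
# a hypothetical critical-care trial. The prior and the data are invented for illustration.
from scipy.stats import beta

a_prior, b_prior = 2, 8          # prior belief: response rate centred near 20% (assumed)
responders, n_patients = 14, 40  # hypothetical trial outcome

posterior = beta(a_prior + responders, b_prior + (n_patients - responders))

print(f"Posterior mean response rate: {posterior.mean():.2f}")
print(f"95% credible interval: {posterior.ppf(0.025):.2f} - {posterior.ppf(0.975):.2f}")
print(f"P(response rate > 0.25): {1 - posterior.cdf(0.25):.2f}")
```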

  3. 3-D rod ejection analysis using a conservative methodology

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Ho; Park, Jin Woo; Park, Guen Tae; Um, Kil Sup; Ryu, Seok Hee; Lee, Jae Il; Choi, Tong Soo [KEPCO, Daejeon (Korea, Republic of)

    2016-05-15

    The point kinetics model, which simplifies the core phenomena and physical specifications, is used for the conventional rod ejection accident analysis. The point kinetics model makes it convenient to assume conservative core parameters, but this simplification gives up a large amount of safety margin. The CHASER system couples the three-dimensional core neutron kinetics code ASTRA, the sub-channel analysis code THALES and the fuel performance analysis code FROST. The validation study for the CHASER system is addressed using the NEACRP three-dimensional PWR core transient benchmark problem. A series of conservative rod ejection analyses for the APR1400 type plant is performed for both hot full power (HFP) and hot zero power (HZP) conditions to determine the most limiting cases. The conservative rod ejection analysis methodology is designed to properly consider important phenomena and physical parameters.
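
    Since the abstract contrasts the conventional point kinetics approach with 3-D kinetics, a minimal one-delayed-group point kinetics integration is sketched here for orientation; the kinetics constants and the step reactivity insertion are generic assumptions, not APR1400 data.

```python
# Minimal sketch of the point kinetics model (one delayed-neutron group) for a step
# reactivity insertion, i.e. the simplified approach that 3-D systems such as CHASER refine.
# The kinetics parameters and the inserted reactivity are generic assumptions.
from scipy.integrate import solve_ivp

beta = 0.0065      # delayed neutron fraction (assumed)
lam = 0.08         # effective precursor decay constant, 1/s (assumed)
Lambda = 2e-5      # neutron generation time, s (assumed)
rho = 0.5 * beta   # step reactivity insertion of 0.5 dollars (assumed)

def point_kinetics(t, y):
    n, c = y                                   # relative power and precursor concentration
    dn = (rho - beta) / Lambda * n + lam * c
    dc = beta / Lambda * n - lam * c
    return [dn, dc]

y0 = [1.0, beta / (lam * Lambda)]              # steady-state initial condition
sol = solve_ivp(point_kinetics, (0.0, 5.0), y0, max_step=1e-3)
print(f"Relative power after 5 s: {sol.y[0, -1]:.2f}")
```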

  4. Methodological challenges in qualitative content analysis: A discussion paper.

    Science.gov (United States)

    Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit

    2017-09-01

    This discussion paper aims to map content analysis in the qualitative paradigm and explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic by which categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
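
    The maximum sampling error and confidence interval computation described in the abstract can be sketched with a standard t-based interval for replicate TG measurements; the replicate ash-content values below are invented for illustration.

```python
# Minimal sketch: maximum sampling error (half-width of a t-based confidence interval)
# for a thermogravimetric property from replicate measurements. The replicate ash
# contents below are invented example values, not data from the cited study.
import numpy as np
from scipy import stats

ash_content = np.array([1.92, 2.05, 1.98, 2.10, 1.95, 2.01])   # % dry basis (assumed)

n = ash_content.size
mean = ash_content.mean()
sem = ash_content.std(ddof=1) / np.sqrt(n)
max_sampling_error = stats.t.ppf(0.975, df=n - 1) * sem

print(f"Ash content: {mean:.2f} +/- {max_sampling_error:.2f} % (95% confidence)")
```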

  6. Safety analysis methodologies for radioactive waste repositories in shallow ground

    International Nuclear Information System (INIS)

    1984-01-01

    The report is part of the IAEA Safety Series and is addressed to authorities and specialists responsible for or involved in planning, performing and/or reviewing safety assessments of shallow ground radioactive waste repositories. It discusses approaches that are applicable to the safety analysis of a shallow ground repository. The methodologies, analysis techniques and models described are pertinent to the task of predicting the long-term performance of a shallow ground disposal system. They may be used during the processes of selection, confirmation and licensing of new sites and disposal systems or to evaluate the long-term consequences in the post-sealing phase of existing operating or inactive sites. The analysis may point out the need for remedial action, or provide information to be used in deciding on the duration of surveillance. Safety analyses, both general in nature and specific to a certain repository, site or design concept, are discussed, with emphasis on deterministic and probabilistic studies

  7. Advertisement Analysis: A Comparative Critical Study

    Science.gov (United States)

    Abdelaal, Noureldin Mohamed; Sase, Amal Saleh

    2014-01-01

    This study aimed at analyzing two advertisements, and investigating how advertisers use discourse and semiotics to make people and customers buy into their ideas, beliefs, or simply their products. The two advertisements analyzed are beauty products which have been selected from internet magazines. The methodology adopted in this study is…

  8. Constructing Israeli and Palestinian Identity: A Multimodal Critical Discourse Analysis of World History Textbooks and Teacher Discourse

    Science.gov (United States)

    Osborn, Daniel

    2017-01-01

    This research critically evaluates the depiction of Israelis and Palestinians in World History textbooks and World History teachers' instructional discourse. Employing a Multimodal Critical Discourse Analysis methodology, this study offers a comparison between written narratives and spoken discourse in order to analyze the portrayals found in…

  9. Ghost Hunting as a Means to Illustrate Scientific Methodology and Enhance Critical Thinking

    Science.gov (United States)

    Rockwell, Steven C.

    2012-01-01

    The increasing popularity of television shows featuring paranormal investigations has led to a renewed enthusiasm in ghost hunting activities, and belief in the paranormal in general. These shows typically feature a group of investigators who, while claiming to utilize proper scientifically correct methodologies, violate many core scientific…

  10. Toward a Methodology of Death: Deleuze's "Event" as Method for Critical Ethnography

    Science.gov (United States)

    Rodriguez, Sophia

    2016-01-01

    This article examines how qualitative researchers, specifically ethnographers, might utilize complex philosophical concepts in order to disrupt the normative truth-telling practices embedded in social science research. Drawing on my own research experiences, I move toward a methodology of death (for researcher/researched alike) grounded in…

  11. Comparative analysis as a basic research orientation: Key methodological problems

    Directory of Open Access Journals (Sweden)

    N P Narbut

    2015-12-01

    Full Text Available To date, the Sociological Laboratory of the Peoples’ Friendship University of Russia has accumulated a vast experience in the field of cross-cultural studies reflected in the publications based on the results of mass surveys conducted in Moscow, Maikop, Beijing, Guangzhou, Prague, Belgrade, and Pristina. However, these publications mainly focus on comparisons of the empirical data rather than methodological and technical issues, which is why the aim of this article is to identify key problems of comparative analysis in cross-cultural studies that become evident only if you conduct an empirical research yourself - from the first step of setting the problem and approving it by all the sides (countries) involved to the last step of interpreting and comparing the data obtained. The authors are sure that no sociologist would ever doubt the necessity and importance of comparative analysis in the broadest sense of the word, but at the same time very few are ready to discuss its key methodological challenges and prefer to ignore them completely. We summarize the problems of comparative analysis in sociology as follows: (1) applying research techniques to the sample in another country - both in translating and adapting them to different social realities and worldviews (in particular, the problematic status of standardization and the qualitative approach); (2) choosing the “right” respondents to question and relevant cases (cultures) to study; (3) designing the research scheme, i.e. justifying the sequence of steps (what should go first - methodology or techniques); (4) accepting the procedures that are correct within one country for cross-cultural work (whether or not that is an appropriate choice).

  12. Does College Teach Critical Thinking? A Meta-Analysis

    Science.gov (United States)

    Huber, Christopher R.; Kuncel, Nathan R.

    2016-01-01

    Educators view critical thinking as an essential skill, yet it remains unclear how effectively it is being taught in college. This meta-analysis synthesizes research on gains in critical thinking skills and attitudinal dispositions over various time frames in college. The results suggest that both critical thinking skills and dispositions improve…

  13. Analysis of gaming community using Soft System Methodology

    OpenAIRE

    Hurych, Jan

    2015-01-01

    This diploma thesis aims to analyse a virtual gaming community and its problems, in the case of the community belonging to the EU server of the game World of Tanks. To solve these problems, the Soft System Methodology by P. Checkland is used. The thesis includes an analysis of the significance of gaming communities for the gaming industry as a whole. The gaming community is then defined as a soft system. Three problems are analysed in the practical part of the thesis using the newer version of SSM. One iteration of...

  14. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartmental containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC could provide reasonable containment pressure and temperature responses when considering the inherent conservatism in the CONTEMPT-LT code.

  15. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartmental containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC could provide reasonable containment pressure and temperature responses when considering the inherent conservatism in the CONTEMPT-LT code

  16. METHODOLOGY FOR ANALYSIS OF DECISION MAKING IN AIR NAVIGATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2011-03-01

    Full Text Available Abstract. In the research of the Air Navigation System as a complex socio-technical system, the methodology of analysis of the human operator's decision-making has been developed. The significance of individual psychological factors as well as the impact of socio-psychological factors on the professional activities of a human operator during the flight situation development from normal to catastrophic were analyzed. On the basis of the reflexive theory of bipolar choice, the expected risks of decision-making by the Air Navigation System's operator influenced by external environment, previous experience and intentions were identified. The methods for analysis of decision-making by the human operator of the Air Navigation System using stochastic networks have been developed. Keywords: Air Navigation System, bipolar choice, human operator, decision-making, expected risk, individual psychological factors, methodology of analysis, reflexive model, socio-psychological factors, stochastic network.

  17. Social Network Analysis and Critical Realism

    DEFF Research Database (Denmark)

    Buch-Hansen, Hubert

    2014-01-01

    in relation to established philosophies of science. This article argues that there is a tension between applied and methods-oriented SNA studies, on the one hand, and those addressing the social-theoretical nature and implications of networks, on the other. The former, in many cases, exhibits positivist...... tendencies, whereas the latter incorporate a number of assumptions that are directly compatible with core critical realist views on the nature of social reality and knowledge. This article suggests that SNA may be detached from positivist social science and come to constitute a valuable instrument...... in the critical realist toolbox....

  18. Systematic approaches to data analysis from the Critical Decision Method

    Directory of Open Access Journals (Sweden)

    Martin Sedlár

    2015-01-01

    Full Text Available The aim of the present paper is to introduce how to analyse the qualitative data from the Critical Decision Method. First, a characterization of the method provides a meaningful introduction to the issue. This method, used in naturalistic decision making research, is one of the cognitive task analysis methods; it is based on a retrospective semi-structured interview about a critical incident from work and may be applied in various domains such as emergency services, military, transport, sport or industry. Researchers can make two types of methodological adaptation. Within-method adaptations modify the way of conducting the interviews and cross-method adaptations combine this method with other related methods. There are many descriptions of conducting the interview, but descriptions of how the data should be analysed are rare. Some researchers use conventional approaches like content analysis, grounded theory or individual procedures with reference to the objectives of the research project. Wong (2004) describes two approaches to data analysis proposed for this method of data collection, which are described and reviewed in detail. They enable systematic work with a large amount of data. The structured approach organizes the data according to an a priori analysis framework and is suitable for a clearly defined object of research. Each incident is studied separately. First, a decision chart showing the main decision points is made, followed by the incident summary. These decision points are used to identify the relevant statements from the transcript, which are analysed in terms of the Recognition-Primed Decision Model. Finally, the results from all the analysed incidents are integrated. The limitation of the structured approach is that it may not reveal some interesting concepts. The emergent themes approach helps to identify these concepts while maintaining a systematic framework for analysis and is used for exploratory research design. It

  19. Critical point analysis of phase envelope diagram

    Energy Technology Data Exchange (ETDEWEB)

    Soetikno, Darmadi; Siagian, Ucok W. R. [Department of Petroleum Engineering, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia); Kusdiantara, Rudy, E-mail: rkusdiantara@s.itb.ac.id; Puspita, Dila, E-mail: rkusdiantara@s.itb.ac.id; Sidarto, Kuntjoro A., E-mail: rkusdiantara@s.itb.ac.id; Soewono, Edy; Gunawan, Agus Y. [Department of Mathematics, Institut Teknologi Bandung, Jl. Ganesha 10, Bandung 40132 (Indonesia)

    2014-03-24

    A phase diagram or phase envelope is a relation between temperature and pressure that shows the conditions of equilibrium between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoirs. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated. Meanwhile, the dew point is the first point where liquid is formed when the gas is cooled. The critical point is the point where all of the properties of the gas and liquid phases are equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and the dissolution of certain chemicals. Here in this paper, we show the critical point analytically. It is then compared with numerical calculations of the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab.
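
    To illustrate the Peng-Robinson side of the computation, the short sketch below solves the PR cubic in the compressibility factor Z for a pure component at a given temperature and pressure; the propane constants are standard literature values, and the full phase-envelope and critical-point tracing of the paper is not reproduced.

```python
# Minimal sketch: Peng-Robinson compressibility-factor roots for a pure component, the
# kind of EOS evaluation that underlies bubble-, dew- and critical-point calculations.
# Propane critical constants are standard literature values; T and P are arbitrary.
import numpy as np

R = 8.314                                # J/(mol K)
Tc, Pc, omega = 369.8, 4.248e6, 0.152    # propane critical constants (literature values)
T, P = 300.0, 1.0e6                      # state point, K and Pa (assumed)

a = 0.45724 * R**2 * Tc**2 / Pc
b = 0.07780 * R * Tc / Pc
kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc))) ** 2

A = a * alpha * P / (R * T) ** 2
B = b * P / (R * T)

# Cubic in Z: Z^3 - (1 - B) Z^2 + (A - 3B^2 - 2B) Z - (A B - B^2 - B^3) = 0
coeffs = [1.0, -(1.0 - B), A - 3.0 * B**2 - 2.0 * B, -(A * B - B**2 - B**3)]
roots = np.roots(coeffs)
real_roots = sorted(r.real for r in roots if abs(r.imag) < 1e-10)
print("Real Z roots (smallest = liquid-like, largest = vapour-like):", real_roots)
```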

  20. Critical point analysis of phase envelope diagram

    International Nuclear Information System (INIS)

    Soetikno, Darmadi; Siagian, Ucok W. R.; Kusdiantara, Rudy; Puspita, Dila; Sidarto, Kuntjoro A.; Soewono, Edy; Gunawan, Agus Y.

    2014-01-01

    A phase diagram or phase envelope is a relation between temperature and pressure that shows the conditions of equilibrium between the different phases of chemical compounds, mixtures of compounds, and solutions. The phase diagram is an important issue in chemical thermodynamics and hydrocarbon reservoirs. It is very useful for process simulation, hydrocarbon reactor design, and petroleum engineering studies. It is constructed from the bubble line, the dew line, and the critical point. The bubble line and dew line are composed of bubble points and dew points, respectively. The bubble point is the first point at which gas is formed when a liquid is heated. Meanwhile, the dew point is the first point where liquid is formed when the gas is cooled. The critical point is the point where all of the properties of the gas and liquid phases are equal, such as temperature, pressure, amount of substance, and others. The critical point is very useful in fuel processing and the dissolution of certain chemicals. Here in this paper, we show the critical point analytically. It is then compared with numerical calculations of the Peng-Robinson equation using the Newton-Raphson method. As case studies, several hydrocarbon mixtures are simulated using Matlab

  1. Phenotypic variance, plasticity and heritability estimates of critical thermal limits depend on methodological context

    DEFF Research Database (Denmark)

    Chown, Steven L.; Jumbam, Keafon R.; Sørensen, Jesper Givskov

    2009-01-01

    used during assessments of critical thermal limits to activity. To date, the focus of work has almost exclusively been on the effects of rate variation on mean values of the critical limits. 2.  If the rate of temperature change used in an experimental trial affects not only the trait mean but also its...... this is the case for critical thermal limits using a population of the model species Drosophila melanogaster and the invasive ant species Linepithema humile. 4.  We found that effects of the different rates of temperature change are variable among traits and species. However, in general, different rates...... of temperature change resulted in different phenotypic variances and different estimates of heritability, presuming that genetic variance remains constant. We also found that different rates resulted in different conclusions regarding the responses of the species to acclimation, especially in the case of L...

  2. A powerful methodology for reactor vessel pressurized thermal shock analysis

    International Nuclear Information System (INIS)

    Boucau, J.; Mager, T.

    1994-01-01

    The recent operating experience of the Pressurized Water Reactor (PWR) industry has focused increasing attention on the issue of reactor vessel pressurized thermal shock (PTS). More specifically, the review of the old WWER-type reactors (WWER 440/230) has indicated sensitive behaviour with respect to neutron embrittlement. This has already led to some remedial actions, including safety injection water preheating and vessel annealing. Such measures are usually taken based on the analysis of a selected number of conservative PTS events. Consideration of all postulated cooldown events would draw attention to the impact of operator action and control system effects on reactor vessel PTS. Westinghouse has developed a methodology which couples event sequence analysis with probabilistic fracture mechanics analyses to identify those events that are of primary concern for reactor vessel integrity. Operating experience is utilized to aid in defining the appropriate event sequences and event frequencies of occurrence for the evaluation. Once the event sequences of concern are identified, detailed deterministic thermal-hydraulic and structural evaluations can be performed to determine the conditions required to minimize the extension of postulated flaws or enhance flaw arrest in the reactor vessel. The results of these analyses can then be used to better define further modifications to vessel and plant system design and to operating procedures. The purpose of the present paper is to describe this methodology and to show its benefits for decision making. (author). 1 ref., 3 figs.

  3. Development of a heat exchanger root-cause analysis methodology

    International Nuclear Information System (INIS)

    Jarrel, D.B.

    1989-01-01

    The objective of this work is to determine a generic methodology for approaching the accurate identification of the root cause of component failure. Root-cause determinations are an everyday challenge to plant personnel, but they are handled with widely differing degrees of success due to the approaches, levels of diagnostic expertise, and documentation. The criterion for success is simple: If the root cause of the failure has truly been determined and corrected, the same causal failure relationship will not be demonstrated again in the future. The approach to root-cause analysis (RCA) element definition was to first selectively choose and constrain a functionally significant component (in this case a component cooling water to service water heat exchanger) that has demonstrated prevalent failures. Then a root cause of failure analysis was performed by a systems engineer on a large number of actual failure scenarios. The analytical process used by the engineer was documented and evaluated to abstract the logic model used to arrive at the root cause. For the case of the heat exchanger, the actual root-cause diagnostic approach is described. A generic methodology for the solution of the root cause of component failure is demonstrable for this general heat exchanger sample

  4. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Full Text Available Many cost overruns in the world of construction are attributable to either unforeseen events or foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement to project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for appropriate allocation of contingency in project cost estimation. In the first step, project risks will be identified. Influence diagramming technique is employed to identify and to show how the risks affect the project cost elements and also the relationships among the risks themselves. The second step is to assess the project costs with regards to the risks under consideration. Using a linguistic approach, the degree of uncertainty of identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. For the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingencies, i.e. project contingency and management reserve are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.
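
    As one common, simpler alternative to the linguistic approach described here, contingency is often set from a Monte Carlo percentile of the aggregated cost risks; the sketch below illustrates that idea with invented risk data and is not the fuzzy/influence-diagram methodology of the paper.

```python
# Minimal sketch of percentile-based contingency allocation via Monte Carlo aggregation
# of cost risks. This is a common alternative illustration, not the linguistic and
# influence-diagram methodology of the paper; all cost and risk data are invented.
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
base_estimate = 10_000_000.0                       # base project cost estimate (assumed)

# Each risk: (probability of occurrence, most-likely impact, pessimistic impact).
risks = [(0.3, 200_000, 800_000),
         (0.5, 100_000, 400_000),
         (0.2, 500_000, 1_500_000)]

total_impact = np.zeros(n)
for p, likely, worst in risks:
    occurs = rng.random(n) < p
    impact = rng.triangular(0.5 * likely, likely, worst, n)   # triangular impact model
    total_impact += occurs * impact

p80_cost = np.percentile(base_estimate + total_impact, 80)
print(f"Contingency at the 80th percentile: {p80_cost - base_estimate:,.0f}")
```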

  5. Feng Youlan’s Interpretation of Western Philosophy:A Critical Examination from the Perspective of Metaphysical Methodology

    Directory of Open Access Journals (Sweden)

    Derong Chen

    2015-04-01

    Full Text Available This paper concentrates on Feng’s interpretation of Western philosophy from the perspective of metaphysical methodology and aims to display a limited observation of that interpretation through the window of metaphysical methodology. Based on a brief review of the recent studies of Feng Youlan and Western philosophy, this paper analyzes the progress and the insufficient aspects of current studies on this issue and particularly clarifies what metaphysics and metaphysical methods are in the context of Feng Youlan’s philosophy. In clarifying Feng’s interpretation of Western philosophy from the perspective of methodology, this paper further critically analyzes Feng’s positive metaphysical methods and negative metaphysical methods, and argues that Feng’s negative metaphysical methods are essentially a kind of attitude towards metaphysics but neither a kind of metaphysics nor a kind of metaphysical method. Instead of characterizing metaphysical methods as positive and negative as Feng did, this paper suggests an alternative division of metaphysical methods: direct and indirect methods of dealing with metaphysical issues.

  6. The Cognitive Mediation Hypothesis Revisited: An Empirical Response to Methodological and Theoretical Criticism.

    Science.gov (United States)

    Romero, Anna A.; And Others

    1996-01-01

    In order to address criticisms raised against the cognitive mediation hypothesis, three experiments were conducted to develop a more direct test of the hypothesis. Taken together, the three experiments provide converging support for the cognitive mediation hypothesis, reconfirming the central role of cognition in the persuasion process.…

  7. Ethical human resource management: a critical analysis

    OpenAIRE

    Khan, Muhammad

    2014-01-01

    In modern day, Human Resource Management (HRM) is seen as a mere variant of management control aiming intentionally to ‘colonize’ the identity of the individual employee which points to the contradictions between the idealised HRM theories and its practice commonly referred to as the difference between rhetoric and reality. These critical analyses suggest that HRM reflects a historical shift in the way work is defined and managed and research has to be undertaken on how morality and ethics ma...

  8. The liquidity preference theory: a critical analysis

    OpenAIRE

    Giancarlo Bertocco; Andrea Kalajzic

    2014-01-01

    Keynes in the General Theory, explains the monetary nature of the interest rate by means of the liquidity preference theory. The objective of this paper is twofold. First, to point out the limits of the liquidity preference theory. Second, to present an explanation of the monetary nature of the interest rate based on the arguments with which Keynes responded to the criticism levelled at the liquidity preference theory by supporters of the loanable funds theory such as Ohlin and Robertson. It ...

  9. Criticality analysis of a spent fuel shipping cask

    International Nuclear Information System (INIS)

    Pena, J.

    1984-01-01

    Criticality analysis for a system leads to the determination of the multiplication factor. Should such an analysis be performed for a spent fuel shipping cask, certain standards must be met. In this study a sample design is analyzed and criticality results are presented. (author)

  10. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.; Carroll, R. J.; Dabney, A. R.

    2012-01-01

    positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon
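
    The non-parametric comparisons mentioned in the record, a Kolmogorov-Smirnov test and a Wilcoxon rank-sum (Mann-Whitney) test, can be sketched on simulated skewed abundance data; the data and the choice of the rank-sum variant of the Wilcoxon test are assumptions made for illustration.

```python
# Minimal sketch of the non-parametric two-group comparisons mentioned in the record
# (Kolmogorov-Smirnov and a Wilcoxon rank-sum / Mann-Whitney test) applied to simulated,
# skewed protein abundances. The data and test settings are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
group_a = rng.lognormal(mean=2.0, sigma=0.6, size=30)   # simulated abundances, condition A
group_b = rng.lognormal(mean=2.4, sigma=0.6, size=30)   # simulated abundances, condition B

ks = stats.ks_2samp(group_a, group_b)
mw = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

print(f"Kolmogorov-Smirnov: D = {ks.statistic:.3f}, p = {ks.pvalue:.4f}")
print(f"Wilcoxon rank-sum (Mann-Whitney U): U = {mw.statistic:.1f}, p = {mw.pvalue:.4f}")
```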

  11. Sensitivity Analysis of Criticality for Different Nuclear Fuel Shapes

    International Nuclear Information System (INIS)

    Kang, Hyun Sik; Jang, Misuk; Kim, Seoung Rae

    2016-01-01

    Rod-type nuclear fuel was mainly developed in the past, but recent studies have been extended to plate-type nuclear fuel. Therefore, this paper reviews the sensitivity of criticality to different shapes of nuclear fuel. The criticality analysis was performed using MCNP5. MCNP5 is a well-known Monte Carlo code for criticality analysis and a general-purpose Monte Carlo N-Particle code that can be used for neutron, photon, electron or coupled neutron/photon/electron transport, including the capability to calculate eigenvalues for critical systems. We performed the sensitivity analysis of criticality for different fuel shapes. In the sensitivity analysis for simple fuel shapes, the criticality is proportional to the surface area, but for fuel assembly types it is not proportional to the surface area. In the sensitivity analysis for intervals between plates, the criticality increases as the interval increases, but if the interval is greater than 8 mm an opposite trend appears and the criticality decreases as the interval grows. As a result, no general conclusion that describes all cases in common could be obtained. A sensitivity analysis of criticality is therefore always required whenever the subject to be analyzed changes
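
    As a rough illustration of the surface-area comparison invoked for the simple fuel shapes, the sketch below compares the surface-to-volume ratio of a cylindrical rod and a flat plate of equal volume; the dimensions are arbitrary and no neutronics is computed, so only the geometric part of the argument is shown.

```python
# Minimal geometric sketch: surface-to-volume ratio of a fuel rod versus a fuel plate of
# equal volume. Dimensions are arbitrary and no neutronics is computed; this only
# illustrates the surface-area comparison mentioned in the record.
import math

# Cylindrical rod (assumed dimensions, cm).
r, h = 0.5, 400.0
rod_volume = math.pi * r**2 * h
rod_surface = 2.0 * math.pi * r * h + 2.0 * math.pi * r**2

# Flat plate with the same volume (assumed thickness and width, cm).
t, w = 0.15, 10.0
length = rod_volume / (t * w)
plate_surface = 2.0 * (t * w + t * length + w * length)

print(f"Rod   surface/volume: {rod_surface / rod_volume:.2f} 1/cm")
print(f"Plate surface/volume: {plate_surface / rod_volume:.2f} 1/cm")
```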

  12. Sensitivity Analysis of Criticality for Different Nuclear Fuel Shapes

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Hyun Sik; Jang, Misuk; Kim, Seoung Rae [NESS, Daejeon (Korea, Republic of)

    2016-10-15

    Rod-type nuclear fuel was mainly developed in the past, but recent studies have been extended to plate-type nuclear fuel. This paper therefore reviews the sensitivity of criticality to the different shapes of nuclear fuel. The criticality analysis was performed using MCNP5, a well-known Monte Carlo code for criticality analysis and a general-purpose Monte Carlo N-Particle code that can be used for neutron, photon, electron, or coupled neutron/photon/electron transport, including the capability to calculate eigenvalues for critical systems. We performed a sensitivity analysis of criticality for different fuel shapes. For simple fuel shapes, the criticality is proportional to the surface area, but for fuel assembly types it is not. In the sensitivity analysis for the interval between plates, the criticality increases with the interval, but beyond an interval of 8 mm the trend reverses and the criticality decreases as the interval grows. As a result, no single rule could be obtained that describes all cases in common. A sensitivity analysis of criticality is therefore always required whenever the subject to be analyzed changes.
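
    The two records above report that, for simple fuel shapes, criticality tracks the surface area. As a small worked example of the geometric comparison behind that statement (not the MCNP5 models of the paper), the sketch below compares the surface areas of a fuel rod and a fuel plate of equal volume; all dimensions are assumed.

    ```python
    import math

    # Illustrative geometry only: compare the surface areas of a fuel rod (cylinder)
    # and a fuel plate of equal volume. Dimensions are assumptions, not the paper's models.

    def cylinder_area(radius_cm: float, height_cm: float) -> float:
        return 2.0 * math.pi * radius_cm * (radius_cm + height_cm)

    def plate_area(length_cm: float, width_cm: float, thickness_cm: float) -> float:
        return 2.0 * (length_cm * width_cm + length_cm * thickness_cm + width_cm * thickness_cm)

    rod_r, rod_h = 0.5, 400.0                      # cm (assumed)
    rod_volume = math.pi * rod_r**2 * rod_h

    plate_l, plate_w = 400.0, 7.0                  # cm; thickness chosen to match the rod volume
    plate_t = rod_volume / (plate_l * plate_w)

    print(f"rod volume = {rod_volume:8.1f} cm^3, rod area   = {cylinder_area(rod_r, rod_h):8.1f} cm^2")
    print(f"plate thickness = {plate_t:5.3f} cm, plate area = {plate_area(plate_l, plate_w, plate_t):8.1f} cm^2")
    ```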

  13. Bilingualism and cognitive reserve: A critical overview and a plea for methodological innovations

    Directory of Open Access Journals (Sweden)

    Noelia eCalvo

    2016-01-01

    The decline of cognitive skills throughout healthy or pathological aging can be slowed down by experiences which foster cognitive reserve (CR). Recently, some studies on Alzheimer’s disease have suggested that CR may be enhanced by life-long bilingualism. However, the evidence is inconsistent and based on retrospective approaches featuring several methodological weaknesses. Some studies demonstrated at least four years of delay in dementia symptoms, while others did not find such an effect. Moreover, various methodological aspects vary from study to study. The present paper addresses contradictory findings, identifies possible lurking variables, and outlines methodological alternatives thereof. First, we characterize possible confounding factors that may have influenced extant results. Our focus is on the criteria to establish bilingualism, differences in sample design, the instruments used to examine cognitive skills, and the role of variables known to modulate life-long cognition. Second, we propose that these limitations could be largely circumvented through experimental approaches. Proficiency in the non-native language can be successfully assessed by combining subjective and objective measures; confounding variables which have been distinctively associated with certain bilingual groups (e.g., alcoholism, sleep disorders) can be targeted through relevant instruments; and cognitive status might be better tapped via robust cognitive screenings and executive batteries. Moreover, future research should incorporate tasks yielding predictable patterns of contrastive performance between bilinguals and monolinguals. Crucially, these include instruments which reveal bilingual disadvantages in vocabulary, null effects in working memory, and advantages in inhibitory control and other executive functions. Finally, paradigms tapping proactive interference (which assess the disruptive effect of long-term memory on newly learned information) could also offer useful data

  14. Bilingualism and Cognitive Reserve: A Critical Overview and a Plea for Methodological Innovations.

    Science.gov (United States)

    Calvo, Noelia; García, Adolfo M; Manoiloff, Laura; Ibáñez, Agustín

    2015-01-01

    The decline of cognitive skills throughout healthy or pathological aging can be slowed down by experiences which foster cognitive reserve (CR). Recently, some studies on Alzheimer's disease have suggested that CR may be enhanced by life-long bilingualism. However, the evidence is inconsistent and largely based on retrospective approaches featuring several methodological weaknesses. Some studies demonstrated at least 4 years of delay in dementia symptoms, while others did not find such an effect. Moreover, various methodological aspects vary from study to study. The present paper addresses contradictory findings, identifies possible lurking variables, and outlines methodological alternatives thereof. First, we characterize possible confounding factors that may have influenced extant results. Our focus is on the criteria to establish bilingualism, differences in sample design, the instruments used to examine cognitive skills, and the role of variables known to modulate life-long cognition. Second, we propose that these limitations could be largely circumvented through experimental approaches. Proficiency in the non-native language can be successfully assessed by combining subjective and objective measures; confounding variables which have been distinctively associated with certain bilingual groups (e.g., alcoholism, sleep disorders) can be targeted through relevant instruments; and cognitive status might be better tapped via robust cognitive screenings and executive batteries. Moreover, future research should incorporate tasks yielding predictable patterns of contrastive performance between bilinguals and monolinguals. Crucially, these include instruments which reveal bilingual disadvantages in vocabulary, null effects in working memory, and advantages in inhibitory control and other executive functions. Finally, paradigms tapping proactive interference (which assess the disruptive effect of long-term memory on newly learned information) could also offer useful data

  15. Bilingualism and Cognitive Reserve: A Critical Overview and a Plea for Methodological Innovations

    Science.gov (United States)

    Calvo, Noelia; García, Adolfo M.; Manoiloff, Laura; Ibáñez, Agustín

    2016-01-01

    The decline of cognitive skills throughout healthy or pathological aging can be slowed down by experiences which foster cognitive reserve (CR). Recently, some studies on Alzheimer's disease have suggested that CR may be enhanced by life-long bilingualism. However, the evidence is inconsistent and largely based on retrospective approaches featuring several methodological weaknesses. Some studies demonstrated at least 4 years of delay in dementia symptoms, while others did not find such an effect. Moreover, various methodological aspects vary from study to study. The present paper addresses contradictory findings, identifies possible lurking variables, and outlines methodological alternatives thereof. First, we characterize possible confounding factors that may have influenced extant results. Our focus is on the criteria to establish bilingualism, differences in sample design, the instruments used to examine cognitive skills, and the role of variables known to modulate life-long cognition. Second, we propose that these limitations could be largely circumvented through experimental approaches. Proficiency in the non-native language can be successfully assessed by combining subjective and objective measures; confounding variables which have been distinctively associated with certain bilingual groups (e.g., alcoholism, sleep disorders) can be targeted through relevant instruments; and cognitive status might be better tapped via robust cognitive screenings and executive batteries. Moreover, future research should incorporate tasks yielding predictable patterns of contrastive performance between bilinguals and monolinguals. Crucially, these include instruments which reveal bilingual disadvantages in vocabulary, null effects in working memory, and advantages in inhibitory control and other executive functions. Finally, paradigms tapping proactive interference (which assess the disruptive effect of long-term memory on newly learned information) could also offer useful data

  16. Partnership Studies: A New Methodological Approach to Literary Criticism in World Literatures, Languages and Education

    Directory of Open Access Journals (Sweden)

    Antonella Riem Natale

    2015-07-01

    This article briefly describes the innovative research undertaken by the Partnership Studies Group based at the University of Udine (Italy), which, since 1998, has been investigating the possible configurations of a partnership model within contemporary world literatures, language, and education. Partnership Studies draw upon non-binary and trans-disciplinary paradigms as propounded by Riane Eisler, and have been demonstrating their strength and potentialities as epistemological and methodological instruments of transcultural consciousness and awareness, capable of fostering harmonious understanding and relations of reciprocity rather than domination among different cultures.

  17. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    Directory of Open Access Journals (Sweden)

    Johnston Marie

    2007-03-01

    Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its (i) theoretical framework (was there a definition of what was being assessed and was it part of a theoretical model?) and (ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest), and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of theoretical framework for both clinician report and patient self-report measures. This review also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological

  18. Fundamentals of critical analysis: the concept of validity and analysis essentials

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-01-01

    Critical analysis of literature is an assessment process that allows the reader to get an idea of potential error in the results of a study, errors arising either from bias or from confounding. Critical analysis attempts to establish whether the study meets the expected criteria or methodological conditions. There are many checklists available that are commonly used to guide this analysis, but filling out a checklist is not tantamount to critical appraisal. Internal validity is defined as the extent to which a research finding actually represents the true relationship between exposure and outcome, considering the unique conditions in which the study was carried out. Attention must be given to the inclusion and exclusion criteria that were used, to the sampling methods, and to the baseline characteristics of the patients that were enrolled in the study. External validity refers to the possibility of generalizing conclusions beyond the study sample or the study population. External validity includes population validity and ecological validity. Lastly, the article covers potential threats to external validity that must be considered when analyzing a study.

  19. Proposal of methodology of tsunami accident sequence analysis induced by earthquake using DQFM methodology

    International Nuclear Information System (INIS)

    Muta, Hitoshi; Muramatsu, Ken

    2017-01-01

    Since the Fukushima-Daiichi nuclear power station accident, the Japanese regulatory body has improved and upgraded the regulation of nuclear power plants, and continuous effort is required to enhance risk management in the mid- to long term. Earthquakes and tsunamis are considered as the most important risks, and the establishment of probabilistic risk assessment (PRA) methodologies for these events is a major issue of current PRA. The Nuclear Regulation Authority (NRA) addressed the PRA methodology for tsunamis induced by earthquakes, which is one of the methodologies that should be enhanced step by step for the improvement and maturity of PRA techniques. The AESJ standard for the procedure of seismic PRA for nuclear power plants in 2015 provides the basic concept of the methodology; however, details of the application to the actual plant PRA model have not been sufficiently provided. This study proposes a detailed PRA methodology for tsunamis induced by earthquakes using the DQFM methodology, which contributes to improving the safety of nuclear power plants. Furthermore, this study also states the issues which need more research. (author)
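
    The DQFM approach referred to above quantifies a fault tree directly by Monte Carlo sampling of hazard demands and (possibly correlated) component capacities. The following sketch illustrates only that generic idea on a toy three-component system; the fragility parameters, correlation values and tree structure are invented and do not come from the AESJ standard or the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_samples = 100_000

    # Hypothetical tsunami height demand at the site (m), lognormally distributed.
    demand = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=n_samples)

    # Hypothetical component capacities (m), lognormal and partially correlated
    # (e.g., two pumps sharing location and design).
    median_cap = np.array([7.0, 7.0, 9.0])          # pump A, pump B, switchgear
    beta = np.array([0.3, 0.3, 0.25])               # log-standard deviations
    corr = np.array([[1.0, 0.7, 0.2],
                     [0.7, 1.0, 0.2],
                     [0.2, 0.2, 1.0]])
    cov = np.outer(beta, beta) * corr
    capacity = np.exp(rng.multivariate_normal(np.log(median_cap), cov, size=n_samples))

    # A component fails in a given sample when the demand exceeds its capacity.
    failed = demand[:, None] > capacity             # shape (n_samples, 3)

    # Toy fault tree: top event = (pump A AND pump B) OR switchgear.
    top_event = (failed[:, 0] & failed[:, 1]) | failed[:, 2]

    print(f"estimated top-event probability: {top_event.mean():.4f}")
    ```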

  20. A Methodological Framework for Software Safety in Safety Critical Computer Systems

    OpenAIRE

    P. V. Srinivas Acharyulu; P. Seetharamaiah

    2012-01-01

    Software safety must deal with the principles of safety management, safety engineering and software engineering for developing safety-critical computer systems, with the target of making the system safe, risk-free and fail-safe, in addition to providing a clear differentiation for assessing and evaluating the risk, following the principles of software risk management. Problem statement: Prevailing software quality models and standards do not adequately address software safety ...

  1. Identify and Classify Critical Success Factor of Agile Software Development Methodology Using Mind Map

    OpenAIRE

    Tasneem Abd El Hameed; Mahmoud Abd EL Latif; Sherif Kholief

    2016-01-01

    Selecting the right method, the right personnel and the right practices, and applying them adequately, determines the success of software development. In this paper, a qualitative study is carried out of the critical success factors identified in previous studies. The success factors are matched with their related principles to illustrate the most valuable factors for the success of the agile approach. This paper also shows that the twelve principles are poorly identified for a few factors resulting from qualitative and qua...

  2. Application of transient analysis methodology to heat exchanger performance monitoring

    International Nuclear Information System (INIS)

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial scale heat exchangers. A Galerkin-based numerical method with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown, when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to an in-situ transient test carried out at a nuclear power plant. The method, however, is applicable to any transient testing application

  3. Methodology to analysis of aging processes of containment spray system

    International Nuclear Information System (INIS)

    Borges, D. da Silva; Lava, D.D.; Moreira, M. de L.; Ferreira Guimarães, A.C.; Fernandes da Silva, L.

    2015-01-01

    This paper presents a contribution to the study of the aging process of components in commercial Pressurized Water Reactor (PWR) plants. The motivation for writing this work emerged from the current nuclear outlook: numerous nuclear power plants worldwide have reached an advanced operating age. This situation calls for a process to ensure the reliability of the operating systems of these plants, and therefore for methodologies capable of estimating the failure probability of components and systems. Beyond the safety factors involved, such methodologies can also be used to seek ways of extending the life cycle of nuclear plants, which would otherwise undergo decommissioning after an operating time of 40 years. That process negatively affects power generation and demands an enormous investment. Thus, this paper aims to present modeling techniques and sensitivity analysis which, together, can generate an estimate of how the components most sensitive to the aging process will behave during the normal operating cycle of a nuclear power plant. (authors)

  4. A new methodology of spatial cross-correlation analysis.

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes.

  5. A New Methodology of Spatial Cross-Correlation Analysis

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
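
    The two records above define a global spatial cross-correlation coefficient by analogy with Moran's index written as a quadratic form. The sketch below is one plausible reading of that construction (standardized variables and a weights matrix scaled to unit sum); the exact normalization used in the paper may differ, and the data are made up.

    ```python
    import numpy as np

    def global_cross_correlation(x: np.ndarray, y: np.ndarray, w: np.ndarray) -> float:
        """Moran-style global spatial cross-correlation between two variables.

        x, y : values observed at n spatial units
        w    : n-by-n spatial weights matrix (e.g., contiguity), zero diagonal
        """
        zx = (x - x.mean()) / x.std()        # population standardization
        zy = (y - y.mean()) / y.std()
        w_norm = w / w.sum()                 # scale weights to unit total (analogue of dividing by S0)
        return float(zx @ w_norm @ zy)       # quadratic form; reduces to Moran's I when y == x

    # Tiny illustrative example: four regions on a line, rook contiguity (made-up values).
    x = np.array([1.0, 2.0, 3.0, 4.0])       # e.g., urbanization level
    y = np.array([1.5, 2.5, 2.8, 4.2])       # e.g., economic output
    w = np.array([[0, 1, 0, 0],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)

    print(f"global spatial cross-correlation = {global_cross_correlation(x, y, w):.3f}")
    ```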

  6. Using of BEPU methodology in a final safety analysis report

    International Nuclear Information System (INIS)

    Menzel, Francine; Sabundjian, Gaiane; D'auria, Francesco; Madeira, Alzira A.

    2015-01-01

    Nuclear Reactor Safety (NRS) has been established since the discovery of nuclear fission, and the occurrence of accidents in nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and the plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach that addresses the uncertainties of the results. This work aims to present an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper intends to present the background of the licensing process through the main licensing requirements. (author)

  7. Using of BEPU methodology in a final safety analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Menzel, Francine; Sabundjian, Gaiane, E-mail: fmenzel@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); D' auria, Francesco, E-mail: f.dauria@ing.unipi.it [Universita degli Studi di Pisa, Gruppo di Ricerca Nucleare San Piero a Grado (GRNSPG), Pisa (Italy); Madeira, Alzira A., E-mail: alzira@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    Nuclear Reactor Safety (NRS) has been established since the discovery of nuclear fission, and the occurrence of accidents in nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and the plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach that addresses the uncertainties of the results. This work aims to present an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper intends to present the background of the licensing process through the main licensing requirements. (author)

  8. Methodological Choices in Muscle Synergy Analysis Impact Differentiation of Physiological Characteristics Following Stroke

    Directory of Open Access Journals (Sweden)

    Caitlin L. Banks

    2017-08-01

    Muscle synergy analysis (MSA) is a mathematical technique that reduces the dimensionality of electromyographic (EMG) data. Used increasingly in biomechanics research, MSA requires methodological choices at each stage of the analysis. Differences in methodological steps affect the overall outcome, making it difficult to compare results across studies. We applied MSA to EMG data collected from individuals post-stroke identified as either responders (RES) or non-responders (nRES) on the basis of a critical post-treatment increase in walking speed. Importantly, no clinical or functional indicators identified differences between the cohort of RES and nRES at baseline. For this exploratory study, we selected the five highest RES and five lowest nRES available from a larger sample. Our goal was to assess how the methodological choices made before, during, and after MSA affect the ability to differentiate two groups with intrinsic physiologic differences based on MSA results. We investigated 30 variations in MSA methodology to determine which choices allowed differentiation of RES from nRES at baseline. Trial-to-trial variability in time-independent synergy vectors (SVs) and time-varying neural commands (NCs) were measured as a function of: (1) number of synergies computed; (2) EMG normalization method before MSA; (3) whether SVs were held constant across trials or allowed to vary during MSA; and (4) synergy analysis output normalization method after MSA. MSA methodology had a strong effect on our ability to differentiate RES from nRES at baseline. Across all 10 individuals and MSA variations, two synergies were needed to reach an average of 90% variance accounted for (VAF). Based on effect sizes, differences in SV and NC variability between groups were greatest using two synergies with SVs that varied from trial-to-trial. Differences in SV variability were clearest using unit magnitude per trial EMG normalization, while NC variability was less sensitive to EMG
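
    Muscle synergy analysis is commonly implemented as a non-negative factorization of an EMG matrix into synergy vectors and neural commands, with the number of synergies chosen by variance accounted for (VAF). The sketch below shows that generic pipeline with scikit-learn's NMF on synthetic data; it is not the authors' processing chain, and only the 90% VAF threshold is taken from the abstract.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(2)

    # Synthetic EMG envelope matrix: 8 muscles x 500 time samples, built from 2 true synergies.
    true_sv = np.abs(rng.normal(size=(8, 2)))            # synergy vectors (muscle weightings)
    true_nc = np.abs(rng.normal(size=(2, 500)))          # neural commands (activation over time)
    emg = true_sv @ true_nc + 0.05 * np.abs(rng.normal(size=(8, 500)))

    def vaf(original: np.ndarray, reconstructed: np.ndarray) -> float:
        """Overall variance accounted for by the reconstruction."""
        return 1.0 - np.sum((original - reconstructed) ** 2) / np.sum(original ** 2)

    # Increase the number of synergies until VAF reaches 90% (threshold from the abstract).
    for n_syn in range(1, 6):
        model = NMF(n_components=n_syn, init="nndsvda", max_iter=2000, random_state=0)
        w = model.fit_transform(emg)                     # synergy vectors, 8 x n_syn
        h = model.components_                            # neural commands, n_syn x 500
        score = vaf(emg, w @ h)
        print(f"{n_syn} synergies: VAF = {score:.3f}")
        if score >= 0.90:
            break
    ```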

  9. Critical analysis of radiologist-patient interaction.

    Science.gov (United States)

    Morris, K J; Tarico, V S; Smith, W L; Altmaier, E M; Franken, E A

    1987-05-01

    A critical incident interview technique was used to identify features of radiologist-patient interactions considered effective and ineffective by patients. During structured interviews with 35 radiology patients and five patients' parents, three general categories of physician behavior were described: attention to patient comfort, explanation of procedure and results, and interpersonal sensitivity. The findings indicated that patients are sensitive to physicians' interpersonal styles and that they want physicians to explain procedures and results in an understandable manner and to monitor their well-being during procedures. The sample size of the study is small; thus further confirmation is needed. However, the implications for training residents and practicing radiologists in these behaviors are important in the current competitive medical milieu.

  10. MCNP perturbation technique for criticality analysis

    International Nuclear Information System (INIS)

    McKinney, G.W.; Iverson, J.L.

    1995-01-01

    The differential operator perturbation technique has been incorporated into the Monte Carlo N-Particle transport code MCNP and will become a standard feature of future releases. This feature includes first and/or second order terms of the Taylor series expansion for response perturbations related to cross-section data (i.e., density, composition, etc.). Criticality analyses can benefit from this technique in that predicted changes in the track-length tally estimator of k-eff may be obtained for multiple perturbations in a single run. A key advantage of this method is that a precise estimate of a small change in response (i.e., < 1%) is easily obtained. This technique can also offer acceptable accuracy, to within a few percent, for up to 20-30% changes in a response
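
    The perturbation feature described above returns first- and second-order differential coefficients that can be combined in a Taylor-series estimate of the change in k-eff. The snippet below merely evaluates such an expansion for a range of perturbation sizes; it does not call MCNP, and the coefficient values are invented.

    ```python
    def delta_keff(u1: float, u2: float, dp: float) -> float:
        """Second-order Taylor estimate of the change in k-eff.

        u1, u2 : first- and second-order differential coefficients (dk/dp, d2k/dp2),
                 as would be produced by a perturbation-style calculation
        dp     : fractional change in the perturbed parameter (e.g., material density)
        """
        return u1 * dp + 0.5 * u2 * dp**2

    # Hypothetical coefficients for a density perturbation (assumed values).
    u1, u2 = 0.12, -0.40
    for dp in (0.01, 0.05, 0.10, 0.20):
        print(f"dp = {dp:5.2f}  ->  estimated delta k-eff = {delta_keff(u1, u2, dp):+.5f}")
    ```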

  11. Critical reflections on realist review: insights from customizing the methodology to the needs of participatory research assessment.

    Science.gov (United States)

    Jagosh, Justin; Pluye, Pierre; Wong, Geoff; Cargo, Margaret; Salsberg, Jon; Bush, Paula L; Herbert, Carol P; Green, Lawrence W; Greenhalgh, Trish; Macaulay, Ann C

    2014-06-01

    Realist review has increased in popularity as a methodology for complex intervention assessment. Our experience suggests that the process of designing a realist review requires its customization to areas under investigation. To elaborate on this idea, we first describe the logic underpinning realist review and then present critical reflections on our application experience, organized in seven areas. These are the following: (1) the challenge of identifying middle range theory; (2) addressing heterogeneity and lack of conceptual clarity; (3) the challenge of appraising the quality of complex evidence; (4) the relevance of capturing unintended outcomes; (5) understanding the process of context, mechanism, and outcome (CMO) configuring; (6) incorporating middle-range theory in the CMO configuration process; and (7) using middle range theory to advance the conceptualization of outcomes - both visible and seemingly 'hidden'. One conclusion from our experience is that the degree of heterogeneity of the evidence base will determine whether theory can drive the development of review protocols from the outset, or will follow only after an intense period of data immersion. We hope that presenting a critical reflection on customizing realist review will convey how the methodology can be tailored to the often complex and idiosyncratic features of health research, leading to innovative evidence syntheses. Copyright © 2013 John Wiley & Sons, Ltd.

  12. Application of System Dynamics Methodology in Population Analysis

    Directory of Open Access Journals (Sweden)

    August Turina

    2009-09-01

    The goal of this work is to present the application of system dynamics and system thinking, as well as the advantages and possible defects of this analytic approach, in order to improve the analysis of complex systems such as population and, thereby, to monitor more effectively the underlying causes of migrations. This methodology has long been present in interdisciplinary scientific circles, but its scientific contribution has not been sufficiently applied in analysis practice in Croatia. Namely, the major part of system analysis is focused on detailed complexity rather than on dynamic complexity. Generally, the science of complexity deals with emergence, innovation, learning and adaptation. Complexity is viewed according to the number of system components, or through a number of combinations that must be continually analyzed in order to understand and consequently provide adequate decisions. Simulations containing thousands of variables and complex arrays of details distract overall attention from the basic cause patterns and key inter-relations emerging and prevailing within an analyzed population. Systems thinking offers a holistic and integral perspective for observation of the world.

  13. Transuranium analysis methodologies for biological and environmental samples

    International Nuclear Information System (INIS)

    Wessman, R.A.; Lee, K.D.; Curry, B.; Leventhal, L.

    1978-01-01

    Analytical procedures for the most abundant transuranium nuclides in the environment (i.e., plutonium and, to a lesser extent, americium) are available. There is a lack of procedures for doing sequential analysis for Np, Pu, Am, and Cm in environmental samples, primarily because of current emphasis on Pu and Am. Reprocessing requirements and waste disposal connected with the fuel cycle indicate that neptunium and curium must be considered in environmental radioactive assessments. Therefore it was necessary to develop procedures that determine all four of these radionuclides in the environment. The state of the art of transuranium analysis methodology as applied to environmental samples is discussed relative to different sample sources, such as soil, vegetation, air, water, and animals. Isotope-dilution analysis with Am-243 (Np-239) and Pu-236 or Pu-242 radionuclide tracers is used. Americium and curium are analyzed as a group, with Am-243 as the tracer. Sequential extraction procedures employing bis(2-ethylhexyl)orthophosphoric acid (HDEHP) were found to result in lower yields and higher Am-Cm fractionation than ion-exchange methods

  14. Methodology for the analysis and retirement of assets: Power transformers

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Gómez-Ramírez

    2015-09-01

    This article develops a methodology in the field of high-voltage engineering for the analysis and retirement of repaired power transformers, based on engineering criteria, in order to correlate the condition of the transformer from several points of view: electrical, mechanical, dielectric and thermal. An analysis of the state of the art reveals two situations of great significance. First, the international procedures are a "guide" for the acceptance of new transformers, so they cannot be applied to the letter to repaired transformers, owing to the degradation the transformer has undergone over the years and to all the factors that led to the repair in the first place. Second, based on the most recent technical literature, articles analyzing the dielectric oil and the paper have been reviewed, in which correlations are established between the quality of the insulating paper and the furan concentrations in the oil. Finally, most of the research carried out so far has focused on analyzing the transformer from the condition of the dielectric oil, since in most cases there is no possibility of performing forensic engineering inside an in-service transformer and thereby analyzing the design components that can compromise its integrity and operability.
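
    One widely cited example of the paper-quality/furan correlations mentioned above is the Chendong-type relation between 2-furfural (2-FAL) in oil and the degree of polymerization (DP) of the insulating paper. The sketch below applies that relation as an assumption; published coefficients vary between studies and transformer designs, so the numbers are indicative only.

    ```python
    import math

    def dp_from_2fal(fal_ppm: float) -> float:
        """Estimate paper degree of polymerization from 2-FAL in oil (ppm).

        Uses a Chendong-type relation, log10(2-FAL) = 1.51 - 0.0035 * DP, which is
        only one of several published correlations (assumption, indicative only).
        """
        return (1.51 - math.log10(fal_ppm)) / 0.0035

    for fal in (0.05, 0.1, 0.5, 1.0, 5.0):
        print(f"2-FAL = {fal:5.2f} ppm  ->  estimated DP = {dp_from_2fal(fal):6.0f}")
    ```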

  15. Critical review of methodology and application of risk ranking for prioritisation of food and feed related issues, on the basis of the size of anticipated health impact

    NARCIS (Netherlands)

    Fels-Klerx, van der H.J.; Asselt, van E.D.; Raley, M.; Poulsen, M.; Korsgaard, H.; Bredsdorff, L.; Nauta, M.; Flari, V.; Agostino, D' M.; Coles, D.G.; Frewer, L.J.

    2015-01-01

    This study aimed to critically review methodologies for ranking of risks related to feed/food safety and nutritional hazards, on the basis of their anticipated human health impact. An extensive systematic literature review was performed to identify and characterize the available methodologies for

  16. Failure mode effect analysis and fault tree analysis as a combined methodology in risk management

    Science.gov (United States)

    Wessiani, N. A.; Yoshio, F.

    2018-04-01

    Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most studies choose only one of these two methods for their risk management methodology, whereas combining them reduces the drawbacks each method has when implemented separately. This paper aims to combine the FMEA and FTA methodologies for assessing risk. A case study in a metal company illustrates how this combined methodology can be implemented. In the case study, the combined methodology is used to assess the internal risks that occur in the production process. Further, those internal risks should be mitigated based on their risk levels.
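
    A common way to combine the two methods is to rank failure modes by the FMEA risk priority number (RPN = severity x occurrence x detection) and then propagate the leading basic events to a top event through a fault tree. The sketch below shows that combination on invented data; it is not the metal-company case study of the paper.

    ```python
    # FMEA part: rank failure modes by RPN = severity * occurrence * detection (1-10 scales).
    failure_modes = {
        # name: (severity, occurrence, detection, estimated probability per demand) -- invented
        "furnace overheat": (9, 3, 4, 1e-3),
        "conveyor jam":     (5, 6, 3, 5e-3),
        "sensor drift":     (4, 5, 7, 2e-3),
    }
    ranked = sorted(failure_modes.items(),
                    key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2], reverse=True)
    for name, (s, o, d, _) in ranked:
        print(f"{name:18s} RPN = {s * o * d}")

    # FTA part: a toy fault tree for the top event "production stop".
    p = {name: vals[3] for name, vals in failure_modes.items()}

    # AND gate: overheat must coincide with sensor drift (protection fails);
    # OR gate: that combination, or a conveyor jam, stops production.
    p_and = p["furnace overheat"] * p["sensor drift"]                 # independence assumed
    p_top = 1.0 - (1.0 - p_and) * (1.0 - p["conveyor jam"])           # OR of independent events
    print(f"estimated top-event probability per demand: {p_top:.2e}")
    ```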

  17. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique, or of a combination of techniques, depends on many factors, such as the motivation for the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the factors mentioned above. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  18. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique, or of a combination of techniques, depends on many factors, such as the motivation for the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the factors mentioned above. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost
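
    The duplicated records above list candidate hazard analysis techniques and the factors that drive the choice among them (motivation, available data, process complexity, expertise, perceived risk). One simple, generic way to operationalize such a selection is a weighted decision matrix, sketched below with invented weights and scores; it is an illustration, not the systematic methodology developed in the paper.

    ```python
    # Factors and their weights (assumed for illustration; they sum to 1).
    weights = {"motivation": 0.2, "data availability": 0.3,
               "process complexity": 0.3, "expertise": 0.2}

    # Scores (1-5) of each candidate technique against each factor (invented values).
    scores = {
        "HAZOP": {"motivation": 5, "data availability": 3, "process complexity": 5, "expertise": 3},
        "FMEA":  {"motivation": 4, "data availability": 4, "process complexity": 3, "expertise": 4},
        "What-If/Checklist": {"motivation": 3, "data availability": 5, "process complexity": 2, "expertise": 5},
    }

    def weighted_score(technique: dict) -> float:
        return sum(weights[f] * technique[f] for f in weights)

    for name, s in sorted(scores.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
        print(f"{name:18s} weighted score = {weighted_score(s):.2f}")
    ```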

  19. Critical analysis of adsorption data statistically

    Science.gov (United States)

    Kaushal, Achla; Singh, S. K.

    2017-10-01

    Experimental data can be presented, computed, and critically analysed in different ways using statistics. A variety of statistical tests are used to make decisions about the significance and validity of experimental data. In the present study, adsorption was carried out to remove zinc ions from a contaminated aqueous solution using mango leaf powder. The experimental data were analysed statistically by hypothesis testing, applying the t test, the paired t test and the chi-square test to (a) test the optimum value of the process pH, (b) verify the success of the experiment and (c) study the effect of adsorbent dose on zinc ion removal from aqueous solutions. Comparison of the calculated and tabulated values of t and chi-square showed the results to be in favour of the data collected from the experiment, and this has been shown on probability charts. The K value obtained for the Langmuir isotherm was 0.8582 and the m value for the Freundlich adsorption isotherm was 0.725, both for mango leaf powder.
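
    The hypothesis tests named above (one-sample t test, paired t test and chi-square) are straightforward to reproduce with scipy.stats; the sketch below shows the mechanics on made-up removal-efficiency data, not the study's measurements.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical zinc removal efficiencies (%) from replicate runs (not the study's data).
    removal_ph5 = np.array([78.2, 80.1, 79.5, 81.0, 77.8])
    removal_ph6 = np.array([85.4, 86.1, 84.9, 86.8, 85.0])

    # (a) One-sample t test: is mean removal at pH 6 different from a nominal 80%?
    t1, p1 = stats.ttest_1samp(removal_ph6, popmean=80.0)

    # (b) Paired t test: does changing pH from 5 to 6 change removal on the same batches?
    t2, p2 = stats.ttest_rel(removal_ph5, removal_ph6)

    # (c) Chi-square goodness of fit: observed vs expected removal at several adsorbent doses
    #     (totals match, as the test requires).
    observed = np.array([62.0, 71.0, 79.0, 84.0])
    expected = np.array([60.0, 72.0, 80.0, 84.0])
    chi2, p3 = stats.chisquare(observed, f_exp=expected)

    print(f"one-sample t: t={t1:.2f}, p={p1:.3g}")
    print(f"paired t:     t={t2:.2f}, p={p2:.3g}")
    print(f"chi-square:   chi2={chi2:.2f}, p={p3:.3g}")
    ```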

  20. The analysis of RWAP(Rod Withdrawal at Power) using the KEPRI methodology

    International Nuclear Information System (INIS)

    Yang, C. K.; Kim, Y. H.

    2001-01-01

    KEPRI developed a new methodology based on RASP (Reactor Analysis Support Package). In this paper, the analysis of the RWAP (Rod Withdrawal at Power) accident, which can result in reactivity and power distribution anomalies, was performed using the KEPRI methodology. The calculation describes the RWAP transient and documents the analysis, including the computer code modeling assumptions and the input parameters used in the analysis. To validate the new methodology, the results of the calculation were compared with the FSAR. The results obtained with the KEPRI methodology are similar to those of the FSAR, and the results of the sensitivity study of postulated parameters were similar to those of the existing methodology

  1. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    The GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Because tie breakers are installed in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and the average unavailability of the system were obtained. A parallel analysis between the GO methodology and the fault tree methodology was also performed. The results showed that the installation of tie breakers is rational and necessary, and that, compared with the fault tree methodology, the GO methodology makes the modeling much easier and the chart much more succinct for analyzing the reliability of the power supply system. (author)
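
    Once minimal cut sets are available (from a GO chart or a fault tree), the average unavailability of the system is commonly approximated by the rare-event sum of cut-set products. The sketch below applies that standard approximation to an invented set of components and cut sets; it does not reproduce the GO model of the facility's power supply system.

    ```python
    # Component average unavailabilities (assumed values for illustration).
    q = {
        "offsite line A": 2e-3,
        "offsite line B": 2e-3,
        "diesel gen":     5e-2,
        "tie breaker":    1e-3,
    }

    # Hypothetical minimal cut sets of a "loss of bus power" top event.
    minimal_cut_sets = [
        ("offsite line A", "offsite line B", "diesel gen"),
        ("offsite line A", "tie breaker"),
        ("offsite line B", "tie breaker"),
    ]

    def cut_set_probability(cut_set) -> float:
        prob = 1.0
        for component in cut_set:
            prob *= q[component]            # independence assumed
        return prob

    # Rare-event approximation: system unavailability ~ sum over minimal cut sets.
    q_system = sum(cut_set_probability(cs) for cs in minimal_cut_sets)
    print(f"approximate system unavailability: {q_system:.2e}")
    ```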

  2. MARKETING MIX: AN ATTEMPT AT CRITICAL ANALYSIS

    OpenAIRE

    Kotliarov I.D.

    2012-01-01

    The present paper contains an analysis of the main directions of evolution of the marketing mix concept. Typical problems of each approach are demonstrated. The classical form of the marketing mix (4Ps) is recommended as the basic form, which, however, may be adapted to the specific characteristics of the firm and its industry

  3. SCALE-4 analysis of pressurized water reactor critical configurations. Volume 1: Summary

    International Nuclear Information System (INIS)

    DeHart, M.D.

    1995-03-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original fresh composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized water reactors (PWR). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power's Surry Unit 1 reactor for the Cycle 2 core; Volume 4 documents the calculations performed based on GPU Nuclear Corporation's Three Mile Island Unit 1 Cycle 5 core; and, lastly, Volume 5 describes the analysis of Virginia Power's North Anna Unit 1 Cycle 5 core. Each of the reactor-specific volumes provides the details of calculations performed to determine the effective multiplication factor for each reactor core for one or more critical configurations using the SCALE-4 system; these results are summarized in this volume. Differences between the core designs and their possible impact on the criticality calculations are also discussed. Finally, results are presented for additional analyses performed to verify that solutions were sufficiently converged

  4. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error together with the construction of a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the event diagnosis phase. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  5. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-01

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error together with the construction of a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the event diagnosis phase. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants

  6. FRACTAL ANALYSIS OF TRABECULAR BONE: A STANDARDISED METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ian Parkinson

    2011-05-01

    A standardised methodology for the fractal analysis of histological sections of trabecular bone has been established. A modified box counting method has been developed for use on a PC based image analyser (Quantimet 500MC, Leica Cambridge). The effect of image analyser settings, magnification, image orientation and threshold levels was determined. Also, the range of scale over which trabecular bone is effectively fractal was determined, and a method was formulated to objectively calculate more than one fractal dimension from the modified Richardson plot. The results show that magnification, image orientation and threshold settings have little effect on the estimate of fractal dimension. Trabecular bone has a lower limit below which it is not fractal (λ < 25 μm) and the upper limit is 4250 μm. There are three distinct fractal dimensions for trabecular bone (sectional fractals), with magnitudes greater than 1.0 and less than 2.0. It has been shown that trabecular bone is effectively fractal over a defined range of scale. Also, within this range, there is more than one fractal dimension, describing spatial structural entities. Fractal analysis is a model-independent method for describing a complex multifaceted structure, which can be adapted for the study of other biological systems. This may be at the cell, tissue or organ level and complements conventional histomorphometric and stereological techniques.
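
    The record above estimates fractal dimension with a modified box counting method and a modified Richardson plot. The sketch below implements only plain 2-D box counting on a synthetic binary image and fits the slope of the log-log relationship; the study's modifications and scale-range selection are omitted, and the image is random rather than a bone section.

    ```python
    import numpy as np

    def box_count(image: np.ndarray, box_size: int) -> int:
        """Number of boxes of side `box_size` containing at least one foreground pixel."""
        h, w = image.shape
        count = 0
        for i in range(0, h, box_size):
            for j in range(0, w, box_size):
                if image[i:i + box_size, j:j + box_size].any():
                    count += 1
        return count

    # Synthetic binary pattern (random foreground pixels), purely illustrative.
    rng = np.random.default_rng(3)
    image = rng.random((256, 256)) < 0.15

    sizes = np.array([2, 4, 8, 16, 32, 64])
    counts = np.array([box_count(image, s) for s in sizes])

    # Box-counting dimension = negative slope of log(count) vs log(box size).
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    print(f"estimated box-counting dimension: {-slope:.2f}")
    ```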

  7. Methodological frontier in operational analysis for roundabouts: a review

    Directory of Open Access Journals (Sweden)

    Orazio Giuffre'

    2016-11-01

    Several studies have shown that modern roundabouts are safe and effective as engineering countermeasures for traffic calming, and they are now widely used worldwide. The increasing use of roundabouts and, more recently, of turbo and flower roundabouts has produced a great variety of experiences in the fields of intersection design, traffic safety and capacity modelling. As for unsignalized intersections, which represent the starting point for extending knowledge about operational analysis to roundabouts, the general situation in capacity estimation is still characterized by the discussion between gap acceptance models and empirical regression models. However, capacity modelling must contain both the analytical construction and solution of the model and the implementation of driver behavior. Thus, issues concerning a realistic modelling of driver behavior through the parameters included in the models are always of interest for practitioners and analysts in transportation and road infrastructure engineering. Based on these considerations, this paper presents a literature review of the key methodological issues in the operational analysis of modern roundabouts. Focus is placed on the aspects associated with gap acceptance behavior, the derivation of the analytically based models and the calculation of the parameters included in the capacity equations, as well as on steady state and non-steady state conditions and uncertainty in entry capacity estimation. Finally, insights on future developments of research in this field are also outlined.
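
    As an illustration of the gap-acceptance family of capacity models contrasted with regression models in the review above, the sketch below evaluates a Siegloch-type entry capacity formula from a circulating flow, a critical gap and a follow-up time; the parameter values are textbook-style assumptions, not values taken from the review.

    ```python
    import math

    def siegloch_capacity(q_circ_vph: float, t_c: float, t_f: float) -> float:
        """Siegloch-type gap-acceptance entry capacity (veh/h).

        q_circ_vph : circulating (conflicting) flow, veh/h
        t_c        : critical gap, s
        t_f        : follow-up time, s
        """
        return (3600.0 / t_f) * math.exp(-(q_circ_vph / 3600.0) * (t_c - t_f / 2.0))

    # Illustrative parameters for a single-lane roundabout entry (assumed values).
    t_c, t_f = 4.1, 2.9
    for q_circ in (200, 400, 600, 800, 1000):
        cap = siegloch_capacity(q_circ, t_c, t_f)
        print(f"circulating flow {q_circ:4d} veh/h -> entry capacity = {cap:6.0f} veh/h")
    ```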

  8. Applications of probabilistic risk analysis in nuclear criticality safety design

    International Nuclear Information System (INIS)

    Chang, J.K.

    1992-01-01

    Many documents have been prepared that try to define the scope of the criticality analysis and that suggest adding probabilistic risk analysis (PRA) to the deterministic safety analysis. The report of the US Department of Energy (DOE) AL 5481.1B suggested that an accident is credible if the occurrence probability is greater than 1 x 10^-6/yr. The draft DOE 5480 safety analysis report suggested that safety analyses should include the application of methods such as deterministic safety analysis, risk assessment, reliability engineering, common-cause failure analysis, human reliability analysis, and human factor safety analysis techniques. The US Nuclear Regulatory Commission (NRC) report NRC SG830.110 suggested that major safety analysis methods should include but not be limited to risk assessment, reliability engineering, and human factor safety analysis. All of these suggestions have recommended including PRA in the traditional criticality analysis

  9. SCALE system cross-section validation for criticality safety analysis

    International Nuclear Information System (INIS)

    Hathout, A.M.; Westfall, R.M.; Dodds, H.L. Jr.

    1980-01-01

    The purpose of this study is to test selected data from three cross-section libraries for use in the criticality safety analysis of UO2 fuel rod lattices. The libraries, which are distributed with the SCALE system, are used to analyze potential criticality problems which could arise in the industrial fuel cycle for PWR and BWR reactors. Fuel lattice criticality problems could occur in pool storage, dry storage with accidental moderation, shearing and dissolution of irradiated elements, and in fuel transport and storage due to inadequate packing and shipping cask design. The data were tested by using the SCALE system to analyze 25 recently performed critical experiments
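
    A typical step after running a suite of critical-experiment benchmarks such as the one above is to estimate the calculational bias (mean deviation of the calculated k-eff from the experimental value, here taken as 1.0) and its spread. The sketch below performs only that elementary statistical step on invented k-eff results; real validation adds trend analysis and statistically derived subcritical margins, and the simple limit shown is not the SCALE procedure.

    ```python
    import numpy as np

    # Hypothetical calculated k-eff values for a suite of critical experiments (k_exp = 1.0).
    k_calc = np.array([0.9962, 0.9981, 1.0004, 0.9978, 0.9990,
                       0.9969, 0.9995, 0.9973, 0.9988, 1.0001])

    bias = k_calc.mean() - 1.0                 # negative bias => code underpredicts k-eff
    sigma = k_calc.std(ddof=1)                 # sample standard deviation of the results

    # A simple, non-rigorous illustrative limit: 1 + bias - 2*sigma - administrative margin.
    admin_margin = 0.05
    usl = 1.0 + bias - 2.0 * sigma - admin_margin

    print(f"bias  = {bias:+.4f}")
    print(f"sigma = {sigma:.4f}")
    print(f"illustrative upper subcritical limit = {usl:.4f}")
    ```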

  10. Methodological aspects of functional neuroimaging at high field strength: a critical review

    International Nuclear Information System (INIS)

    Scheef, L.; Landsberg, M.W.; Boecker, H.

    2007-01-01

    The last few years have proven that high field magnetic resonance imaging (MRI) is superior in nearly every way to conventional equipment up to 1.5 tesla (T). Following the global success of 3T-scanners in research institutes and medical practices, a new generation of MRI devices with field strengths of 7T and higher is now on the horizon. The introduction of ultra high fields has brought MRI technology closer to the physical limitations and increasingly greater costs are required to achieve this goal. This article provides a critical overview of the advantages and problems of functional neuroimaging using ultra high field strengths. This review is principally limited to T2*-based functional imaging techniques not dependent on contrast agents. The main issues include the significance of high field technology with respect to SNR, CNR, resolution, and sequences, as well as artifacts, noise exposure, and SAR. Of great relevance is the discussion of parallel imaging, which will presumably determine the further development of high and ultra high field strengths. Finally, the importance of high field strengths for functional neuroimaging is explained by selected publications. (orig.)

  11. Embodying Critical and Corporeal Methodology: Digital Storytelling With Young Women in Eating Disorder Recovery

    Directory of Open Access Journals (Sweden)

    Andrea LaMarre

    2016-03-01

    Full Text Available Digital storytelling is an arts-based research method that offers researchers an opportunity to engage deeply with participants, speak back to dominant discourses, and re-imagine bodily possibilities. In this article, we describe the process of developing a research-based digital storytelling curriculum exploring eating disorder recovery. We have built this curriculum around research interviews with young women in recovery as well as research and popular literature on eating disorder recovery. Here, we highlight how the curriculum acted as a scaffolding device for the participants' artistic creation around their lived experiences of recovery. The participants' stories crystallize what resonated for them in the workshop process: they each have an open-ended narrative arc, emphasize the intercorporeality of recovery, and focus on recovery as process. The nuances within each story reveal unique embodied experiences that contextualize their recoveries. Using the example of eating disorder recovery, we offer an illustration of the possibilities of digital storytelling as a critical arts-based research method and what we gain from doing research differently in terms of participant-researcher relationships and the value of the arts in disrupting dominant discourses. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs160278

  12. Risk and Interdependencies in Critical Infrastructures A Guideline for Analysis

    CERN Document Server

    Utne, Ingrid; Vatn, Jørn

    2012-01-01

    Today’s society is completely dependent on critical networks such as water supply, sewage, electricity, ICT and transportation. Risk and vulnerability analyses are needed to grasp the impact of threats and hazards. However, these become quite complex as there are strong interdependencies both within and between infrastructure systems. Risk and Interdependencies in Critical Infrastructures: A guideline for analysis provides methods for analyzing risks and interdependencies of critical infrastructures. A number of analysis approaches are described and are adapted to each of these infrastructures. Various approaches are also revised, and all are supported by several examples and illustrations. Particular emphasis is given to the analysis of various interdependencies that often exist between the infrastructures. Risk and Interdependencies in Critical Infrastructures: A guideline for analysis provides a good tool to identify the hazards that are threatening your infrastructures, and will enhance the un...

  13. Analysis of Critical Earth Observation Priorities for Societal Benefit

    Science.gov (United States)

    Zell, E. R.; Huff, A. K.; Carpenter, A. T.; Friedl, L.

    2011-12-01

    To ensure that appropriate near real-time (NRT) and historical Earth observation data are available to benefit society and meet end-user needs, the Group on Earth Observations (GEO) sponsored a multi-disciplinary study to identify a set of critical and common Earth observations associated with 9 Societal Benefit Areas (SBAs): Agriculture, Biodiversity, Climate, Disasters, Ecosystems, Energy, Health, Water, and Weather. GEO is an intergovernmental organization working to improve the availability, access, and use of Earth observations to benefit society through a Global Earth Observation System of Systems (GEOSS). The study, overseen by the GEO User Interface Committee, focused on the "demand" side of Earth observation needs: which users need what types of data, and when? The methodology for the study was a meta-analysis of over 1,700 publicly available documents addressing Earth observation user priorities, under the guidance of expert advisors from around the world. The result was a ranking of 146 Earth observation parameters that are critical and common to multiple SBAs, based on an ensemble of 4 statistically robust methods. Within the results, key details emerged on NRT observations needed to serve a broad community of users. The NRT observation priorities include meteorological parameters, vegetation indices, land cover and soil property observations, water body and snow cover properties, and atmospheric composition. The results of the study and examples of NRT applications will be presented. The applications are as diverse as the list of priority parameters. For example, NRT meteorological and soil moisture information can support monitoring and forecasting for more than 25 infectious diseases, including epidemic diseases, such as malaria, and diseases of major concern in the U.S., such as Lyme disease. Quickly evolving events that impact forests, such as fires and insect outbreaks, can be monitored and forecasted with a combination of vegetation indices, fuel

  14. Screening Analysis of Criticality Features, Events, and Processes for License Application

    International Nuclear Information System (INIS)

    J.A. McClure

    2004-01-01

    This report documents the screening analysis of postclosure criticality features, events, and processes. It addresses the probability of criticality events resulting from degradation processes as well as disruptive events (i.e., seismic, rock fall, and igneous). Probability evaluations are performed utilizing the configuration generator described in ''Configuration Generator Model'', a component of the methodology from ''Disposal Criticality Analysis Methodology Topical Report''. The total probability per package of criticality is compared against the regulatory probability criterion for inclusion of events established in 10 CFR 63.114(d) (consider only events that have at least one chance in 10,000 of occurring over 10,000 years). The total probability of criticality accounts for the evaluation of identified potential critical configurations of all baselined commercial and U.S. Department of Energy spent nuclear fuel waste form and waste package combinations, both internal and external to the waste packages. This criticality screening analysis utilizes available information for the 21-Pressurized Water Reactor Absorber Plate, 12-Pressurized Water Reactor Absorber Plate, 44-Boiling Water Reactor Absorber Plate, 24-Boiling Water Reactor Absorber Plate, and the 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Short waste package types. Where defensible, assumptions have been made for the evaluation of the following waste package types in order to perform a complete criticality screening analysis: 21-Pressurized Water Reactor Control Rod, 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Long, and 2-Multi-Canister Overpack/2-Defense High-Level Radioactive Waste package types. The inputs used to establish probabilities for this analysis report are based on information and data generated for the Total System Performance Assessment for the License Application, where available. This analysis report determines whether criticality is to be
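
    The screening logic described above reduces to a simple comparison: sum the per-configuration criticality probabilities over all waste package types and compare the total against the 10 CFR 63.114(d) criterion. The Python sketch below illustrates only that comparison; the per-package probabilities are invented placeholders, not values from this report.

        # Illustrative sketch only: hypothetical per-package criticality
        # probabilities, not results of the screening analysis itself.

        # 10 CFR 63.114(d) screening criterion: events with less than one chance
        # in 10,000 of occurring within 10,000 years may be excluded.
        SCREENING_THRESHOLD = 1.0e-4   # probability over the 10,000-year period

        # Hypothetical probabilities of criticality per waste package type,
        # already integrated over the 10,000-year regulatory period.
        probability_per_package = {
            "21-PWR Absorber Plate": 2.0e-8,
            "44-BWR Absorber Plate": 1.5e-8,
            "5-DHLW/DOE Short": 4.0e-9,
        }

        total_probability = sum(probability_per_package.values())
        screened_out = total_probability < SCREENING_THRESHOLD

        print(f"Total probability of criticality: {total_probability:.2e}")
        print("Screened out (below criterion)" if screened_out
              else "Retained for the performance assessment")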

  15. A faster reactor transient analysis methodology for PCs

    International Nuclear Information System (INIS)

    Ott, K.O.

    1991-10-01

    The simplified ANL model for LMR transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes the form of a quadratic equation, the ''quadratic dynamics equation.'' This model forms the basis for the GW-BASIC program LTC (LMR Transient Calculation), which can effectively be run on a PC. The GW-BASIC version of the LTC program is described in detail in Volume 2 of this report.

  16. Critical parameters for isobutane determined by the image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Masui, G. [Center for Multiscale Mechanics and Mechanical Systems, Keio University, Hiyoshi 3-14-1, Kohoku-ku, Yokohama 223-8522 (Japan); Honda, Y. [Center for Multiscale Mechanics and Mechanical Systems, Keio University, Hiyoshi 3-14-1, Kohoku-ku, Yokohama 223-8522 (Japan); Uematsu, M. [Center for Multiscale Mechanics and Mechanical Systems, Keio University, Hiyoshi 3-14-1, Kohoku-ku, Yokohama 223-8522 (Japan)]. E-mail: uematsu@mech.keio.ac.jp

    2006-12-15

    (p, ρ, T) Measurements and visual observations of the meniscus for isobutane were carried out carefully in the critical region over the range of temperatures: -15 mK ≤ (T - T_c) ≤ 35 mK, and of densities: -7.5 kg·m⁻³ ≤ (ρ - ρ_c) ≤ 7.5 kg·m⁻³ by a metal-bellows volumometer with an optical cell. Vapor pressures were also measured at T = (310, 405, 406, 407, and 407.5) K. The critical point of T_c and ρ_c was determined by the image analysis of the critical opalescence which is proposed in this study. The critical pressure p_c was determined to be the pressure measurement at the critical point. Comparisons of the critical parameters with values given in the literature are presented.

  17. Critical parameters for isobutane determined by the image analysis

    International Nuclear Information System (INIS)

    Masui, G.; Honda, Y.; Uematsu, M.

    2006-01-01

    (p, ρ, T) Measurements and visual observations of the meniscus for isobutane were carried out carefully in the critical region over the range of temperatures: -15 mK ≤ (T - T_c) ≤ 35 mK, and of densities: -7.5 kg·m⁻³ ≤ (ρ - ρ_c) ≤ 7.5 kg·m⁻³ by a metal-bellows volumometer with an optical cell. Vapor pressures were also measured at T = (310, 405, 406, 407, and 407.5) K. The critical point of T_c and ρ_c was determined by the image analysis of the critical opalescence which is proposed in this study. The critical pressure p_c was determined to be the pressure measurement at the critical point. Comparisons of the critical parameters with values given in the literature are presented.

  18. A critical analysis of the quark status

    CERN Document Server

    Basile, M; Giusti, P; Massam, Thomas; Palmonari, F; Romeo, G C; Valenti, G; Zichichi, A

    1977-01-01

    A world analysis of the experiments to search for quarks shows that the general belief that quarks do not exist is not based on such good experimental grounds. For example, the extensive searches so far performed in strong interactions are limited to small p/sub T/ values; the electromagnetic case is even worse, while quark production in weak interactions is at present an unexplored field. Intuitive arguments on a plausible proton-breaking mechanism are presented in order to emphasize the serious limitations of the experiments performed so far, and to stimulate further searches in the right direction. (15 refs).

  19. Monte Carlo criticality analysis for dissolvers with neutron poison

    International Nuclear Information System (INIS)

    Yu, Deshun; Dong, Xiufang; Pu, Fuxiang.

    1987-01-01

    A criticality analysis for dissolvers with neutron poison is given on the basis of the Monte Carlo method. In the Monte Carlo calculations of thermal neutron group parameters for the fuel pieces, the neutron transport length is determined in terms of the maximum cross-section approach. A set of related effective multiplication factors (k_eff) is calculated by the Monte Carlo method for the three cases. The related numerical results are quite useful, from the standpoint of criticality safety analysis, for the design and operation of this kind of dissolver. (author)
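
    As an illustration of the Monte Carlo approach itself (not of the dissolver model used in the paper), the following Python sketch estimates k_inf for an infinite, homogeneous one-group medium by analog sampling of scattering, capture, and fission; the cross sections are arbitrary example values.

        import random

        # One-group macroscopic cross sections (cm^-1); arbitrary example values.
        SIGMA_S = 0.30   # scattering
        SIGMA_C = 0.01   # radiative capture
        SIGMA_F = 0.06   # fission
        NU = 2.43        # average neutrons released per fission
        SIGMA_T = SIGMA_S + SIGMA_C + SIGMA_F

        def k_inf_estimate(histories=100_000, seed=1):
            """Analog Monte Carlo estimate of k_inf (fission neutrons per source neutron)."""
            random.seed(seed)
            fission_neutrons = 0.0
            for _ in range(histories):
                while True:
                    xi = random.random() * SIGMA_T
                    if xi < SIGMA_S:
                        continue                  # scattering: keep following the neutron
                    elif xi < SIGMA_S + SIGMA_C:
                        break                     # capture: history ends with no offspring
                    else:
                        fission_neutrons += NU    # fission: tally the expected offspring
                        break
            return fission_neutrons / histories

        analytic = NU * SIGMA_F / (SIGMA_C + SIGMA_F)
        print(f"k_inf ~ {k_inf_estimate():.3f} (analytic value: {analytic:.3f})")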

  20. Critical analysis of marketing in Croatian publishing

    Directory of Open Access Journals (Sweden)

    Silvija Gašparić

    2018-03-01

    Full Text Available Marketing is an inevitable part of today's modern lifestyle. The role that marketing plays is so big that it has become the most important part of business. Due to the crisis that is still affecting publishers in Croatia, this paper emphasizes the power of advertising as a key ingredient for overcoming this situation and upgrading the Croatian publishing system. The framework of the paper is based on marketing as a tool that leads to the popularization of books and an increase in sales. Besides the experimental part, which gives an insight into the public's opinion about books, publishing and marketing, the first chapter gives a literature review and an analysis of the whole process of book publishing in Croatia, pointing out mistakes that Croatian publishers make. The benefits of foreign publishing are also mentioned and used for comparison with, and projection onto, the problems of the domestic market. The aim of this analysis and viewpoint paper is to contribute to the comprehension of marketing strategies and activities and their use and benefits in Croatian publishing.

  1. Determination Methodology of the Fiduciary Law and Critic Towards Sharia Fiduciary Institutional Dualism and its Legislation

    Directory of Open Access Journals (Sweden)

    Iwan Setiawan

    2015-04-01

    Full Text Available The Qur’anic verse Al-Baqarah: 283 explains rahn in the context of travel; in practice, however, rahn is also enforced in the muqim (settled) condition. Sharia fiduciary arrangements need to be included in law, since Muslims are the majority in Indonesia and, based on the decision of the OKI (Islamic Conference Organization), Indonesia is regarded as an Islamic country. Supported by Pancasila, this generates the theory of God’s sovereignty. The research method used is the normative juridical method, i.e., legal research conducted through the study of literature. The research shows that: (1) the essence of fiduciary (rahn) in Islamic law (syara’) is treating a thing that has property value in the view of syara’ as a debt guarantee, which makes it possible to recover the whole or part of the debt from that thing, and the method of istinbath al-hukmi related to fiduciary practice by fiduciary institutions uses qiyas (analogy); (2) the dualism of sharia fiduciary institutions and sharia banking arose from the fatwas of the National Sharia Board MUI (Indonesian Council of Religious Scholars) No. 25/DSN-MUI/III/2002 and No. 26/DSN-MUI/III/2002, under which sharia financial institutions are allowed to offer products that accord with sharia principles; and (3) criticism of Law No. 21 of 2008 and Government Regulation No. 51 of 2011 is not directly addressed, but both contain similarities and explanations concerning products that must accord with sharia principles.

  2. SCALE 5: Powerful new criticality safety analysis tools

    International Nuclear Information System (INIS)

    Bowman, Stephen M.; Hollenbach, Daniel F.; Dehart, Mark D.; Rearden, Bradley T.; Gauld, Ian C.; Goluoglu, Sedat

    2003-01-01

    Version 5 of the SCALE computer software system developed at Oak Ridge National Laboratory, scheduled for release in December 2003, contains several significant new modules and sequences for criticality safety analysis and marks the most important update to SCALE in more than a decade. This paper highlights the capabilities of these new modules and sequences, including continuous energy flux spectra for processing multigroup problem-dependent cross sections; one- and three-dimensional sensitivity and uncertainty analyses for criticality safety evaluations; two-dimensional flexible mesh discrete ordinates code; automated burnup-credit analysis sequence; and one-dimensional material distribution optimization for criticality safety. (author)

  3. Computational methods for criticality safety analysis within the scale system

    International Nuclear Information System (INIS)

    Parks, C.V.; Petrie, L.M.; Landers, N.F.; Bucholz, J.A.

    1986-01-01

    The criticality safety analysis capabilities within the SCALE system are centered around the Monte Carlo codes KENO IV and KENO V.a, which are both included in SCALE as functional modules. The XSDRNPM-S module is also an important tool within SCALE for obtaining multiplication factors for one-dimensional system models. This paper reviews the features and modeling capabilities of these codes along with their implementation within the Criticality Safety Analysis Sequences (CSAS) of SCALE. The CSAS modules provide automated cross-section processing and user-friendly input that allow criticality safety analyses to be done in an efficient and accurate manner. 14 refs., 2 figs., 3 tabs

  4. 50 Years of coastal erosion analysis: A new methodological approach.

    Science.gov (United States)

    Prieto Campos, Antonio; Diaz Cuevas, Pilar; Ojeda zujar, Jose; Guisado-Pintado, Emilia

    2017-04-01

    Coasts around the world have been subjected to increasing anthropogenic pressures which, combined with natural hazard impacts (storm events, rising sea levels), have led to severe erosion problems with negative impacts on the economy and the safety of coastal communities. The Andalusian coast (South Spain) is a renowned global tourist destination. In the past decades a deep transformation in the economic model led to significant land use changes: strong regulation of rivers, urbanisation and occupation of dunes, among others. As a result, irreversible transformations of the coastline caused by aggressive urbanisation now have to be faced by local authorities and are suffered by locals and visitors. Moreover, the expected impacts of climate change, aggravated by anthropic activities, emphasise the need for tools that facilitate decision making for sustainable coastal management. In this contribution a homogeneous methodology (a single proxy and one photointerpreter) is proposed for the calculation of coastal erosion rates of exposed beaches in Andalusia (640 km) through the use of detailed series (1:2500) of open source orthophotographs for the period 1956-1977-2001-2011. The combination of the well-established DSAS software (Digital Shoreline Analysis System) with a spatial database (PostgreSQL), which integrates the resulting erosion rates with related coastal thematic information (geomorphology, presence of engineering infrastructures, dunes and ecosystems), enhances the capacity for analysis and exploitation. Further, the homogeneity of the method used allows the comparison of results among years on a highly diverse coast, with both Mediterranean and Atlantic façades. The novel development and integration of a PostgreSQL/PostGIS database facilitates the exploitation of the results by the user (for instance by relating calculated rates to other thematic information such as the geomorphology of the coast or the presence of a dune field on
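
    The rate calculation at the heart of a DSAS-style analysis can be illustrated with an End Point Rate (EPR) for a single transect: the change in shoreline position between the oldest and newest surveys divided by the elapsed time. The Python sketch below uses hypothetical shoreline distances and is not the authors' PostgreSQL/DSAS workflow.

        from datetime import date

        # Hypothetical shoreline positions along one transect, measured as the
        # distance (metres) from a fixed baseline, positive seaward.
        shoreline_positions = {
            date(1956, 7, 1): 182.4,
            date(1977, 7, 1): 168.9,
            date(2001, 7, 1): 151.2,
            date(2011, 7, 1): 146.5,
        }

        def end_point_rate(positions):
            """Shoreline change rate (m/yr) between the oldest and newest surveys."""
            oldest, newest = min(positions), max(positions)
            years = (newest - oldest).days / 365.25
            return (positions[newest] - positions[oldest]) / years

        print(f"EPR = {end_point_rate(shoreline_positions):+.2f} m/yr")  # negative = erosion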

  5. Analysis of Critical Infrastructure Dependencies and Interdependencies

    Energy Technology Data Exchange (ETDEWEB)

    Petit, Frederic [Argonne National Lab. (ANL), Argonne, IL (United States); Verner, Duane [Argonne National Lab. (ANL), Argonne, IL (United States); Brannegan, David [Argonne National Lab. (ANL), Argonne, IL (United States); Buehring, William [Argonne National Lab. (ANL), Argonne, IL (United States); Dickinson, David [Argonne National Lab. (ANL), Argonne, IL (United States); Guziel, Karen [Argonne National Lab. (ANL), Argonne, IL (United States); Haffenden, Rebecca [Argonne National Lab. (ANL), Argonne, IL (United States); Phillips, Julia [Argonne National Lab. (ANL), Argonne, IL (United States); Peerenboom, James [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-06-01

    The report begins by defining dependencies and interdependencies and exploring basic concepts of dependencies in order to facilitate a common understanding and consistent analytical approaches. Key concepts covered include: characteristics of dependencies (upstream dependencies, internal dependencies, and downstream dependencies); classes of dependencies (physical, cyber, geographic, and logical); and dimensions of dependencies (operating environment, coupling and response behavior, type of failure, infrastructure characteristics, and state of operations). From there, the report proposes a multi-phase roadmap to support dependency and interdependency assessment activities nationwide, identifying a range of data inputs, analysis activities, and potential products for each phase, as well as key steps needed to progress from one phase to the next. The report concludes by outlining a comprehensive, iterative, and scalable framework for analyzing dependencies and interdependencies that stakeholders can integrate into existing risk and resilience assessment efforts.

  6. Compliance strategy for statistically based neutron overpower protection safety analysis methodology

    International Nuclear Information System (INIS)

    Holliday, E.; Phan, B.; Nainer, O.

    2009-01-01

    The methodology employed in the safety analysis of the slow Loss of Regulation (LOR) event in the OPG and Bruce Power CANDU reactors, referred to as Neutron Overpower Protection (NOP) analysis, is a statistically based methodology. Further enhancement to this methodology includes the use of Extreme Value Statistics (EVS) for the explicit treatment of aleatory and epistemic uncertainties, and probabilistic weighting of the initial core states. A key aspect of this enhanced NOP methodology is to demonstrate adherence, or compliance, with the analysis basis. This paper outlines a compliance strategy capable of accounting for the statistical nature of the enhanced NOP methodology. (author)
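
    As a hedged illustration of the extreme-value idea only (not the licensed NOP methodology), the Python sketch below fits a generalized extreme value distribution to synthetic per-simulation maxima of a figure of merit and estimates the probability of exceeding a hypothetical analysis limit.

        import numpy as np
        from scipy.stats import genextreme

        # Synthetic per-simulation maxima of a figure of merit (e.g., a maximum
        # channel overpower ratio); data and limit are invented for illustration.
        rng = np.random.default_rng(0)
        sample_maxima = 0.90 + 0.02 * rng.gumbel(size=200)
        LIMIT = 1.0   # hypothetical compliance limit

        # Fit the GEV distribution and estimate the exceedance probability.
        shape, loc, scale = genextreme.fit(sample_maxima)
        p_exceed = genextreme.sf(LIMIT, shape, loc=loc, scale=scale)

        print(f"Fitted GEV: shape={shape:.3f}, loc={loc:.3f}, scale={scale:.3f}")
        print(f"Estimated probability of exceeding the limit: {p_exceed:.2e}")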

  7. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    Science.gov (United States)

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool makes it possible to identify the steps in an analytical procedure with the greatest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  8. Challenges in the vulnerability and risk analysis of critical infrastructures

    International Nuclear Information System (INIS)

    Zio, Enrico

    2016-01-01

    The objective of this paper is to provide a systematic view on the problem of vulnerability and risk analysis of critical infrastructures. Reflections are made on the inherent complexities of these systems, related challenges are identified and possible ways forward for their analysis and management are indicated. Specifically: the framework of vulnerability and risk analysis is examined in relation to its application for the protection and resilience of critical infrastructures; it is argued that the complexity of these systems is a challenging characteristic, which calls for the integration of different modeling perspectives and new approaches of analysis; examples are given in relation to the Internet and, particularly, the electric power grid, as representative of critical infrastructures and the associated complexity; the integration of different types of analyses and methods of system modeling is put forward for capturing the inherent structural and dynamic complexities of critical infrastructures and eventually evaluating their vulnerability and risk characteristics, so that decisions on protections and resilience actions can be taken with the required confidence. - Highlights: • The problem of the protection and resilience of CIs is the focus of the work. • The vulnerability and risk analysis framework for this is critically examined. • The complexity of CIs is presented as a challenge for system modeling and analysis. • The integration of different modeling perspectives of analysis is put forward as a solution. • The extension of the analysis framework to new methods for dealing with surprises and black swans is advocated.

  9. Critical experiments, measurements, and analyses to establish a crack arrest methodology for nuclear pressure vessel steels

    International Nuclear Information System (INIS)

    Hahn, G.T.

    1977-01-01

    Substantial progress was made in three important areas: crack propagation and arrest theory, two-dimensional dynamic crack propagation analyses, and a laboratory test method for the material property data base. The major findings were as follows: Measurements of run-arrest events lent support to the dynamic, energy conservation theory of crack arrest. A two-dimensional, dynamic, finite-difference analysis, including inertia forces and thermal gradients, was developed. The analysis was successfully applied to run-arrest events in DCB (double-cantilever-beam) and SEN (single-edge notched) test pieces. A simplified procedure for measuring K_D and K_Im values with ordinary and duplex DCB specimens was demonstrated. The procedure employs a dynamic analysis of the crack length at arrest and requires no special instrumentation. The new method was applied to ''duplex'' specimens to measure the large K_D values displayed by A533B steel above the nil-ductility temperature. K_D crack velocity curves and K_Im values of two heats of A533B steel and the corresponding values for the plane strain fracture toughness associated with static initiation (K_Ic), dynamic initiation (K_Id), and the static stress intensity at crack arrest (K_Ia) were measured. Possible relations among these toughness indices are identified. During the past year the principal investigators of the participating groups reached agreement on a crack arrest theory appropriate for the pressure vessel problem. 7 figures

  10. Modeling methodology for supply chain synthesis and disruption analysis

    Science.gov (United States)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of the synthesized supply chain system can be evaluated quantitatively; and (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
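
    The reachability step can be illustrated with a breadth-first search over a small state-transition abstraction of a disrupted supply chain. The states and transitions in the Python sketch below are hypothetical and stand in for the hierarchical model described in the paper.

        from collections import deque

        # Hypothetical disruption-propagation graph: each state lists the states
        # it can transition to.
        transitions = {
            "nominal":            ["supplier_delay", "demand_spike"],
            "supplier_delay":     ["inventory_buffer", "line_stoppage"],
            "demand_spike":       ["expedited_orders"],
            "inventory_buffer":   ["nominal"],
            "expedited_orders":   ["nominal"],
            "line_stoppage":      ["customer_backorder"],
            "customer_backorder": [],
        }

        def is_reachable(start, target):
            """Breadth-first search over the state graph."""
            seen, queue = {start}, deque([start])
            while queue:
                state = queue.popleft()
                if state == target:
                    return True
                for nxt in transitions.get(state, []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return False

        print(is_reachable("nominal", "customer_backorder"))  # True: the disruption can propagate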

  11. Critical analysis of world uranium resources

    Science.gov (United States)

    Hall, Susan; Coleman, Margaret

    2013-01-01

    The U.S. Department of Energy, Energy Information Administration (EIA) joined with the U.S. Department of the Interior, U.S. Geological Survey (USGS) to analyze the world uranium supply and demand balance. To evaluate short-term primary supply (0–15 years), the analysis focused on Reasonably Assured Resources (RAR), which are resources projected with a high degree of geologic assurance and considered to be economically feasible to mine. Such resources include uranium resources from mines currently in production as well as resources that are in the stages of feasibility or of being permitted. Sources of secondary supply for uranium, such as stockpiles and reprocessed fuel, were also examined. To evaluate long-term primary supply, estimates of uranium from unconventional and from undiscovered resources were analyzed. At 2010 rates of consumption, uranium resources identified in operating or developing mines would fuel the world nuclear fleet for about 30 years. However, projections currently predict an increase in uranium requirements tied to expansion of nuclear energy worldwide. Under a low-demand scenario, requirements through the period ending in 2035 are about 2.1 million tU. In the low demand case, uranium identified in existing and developing mines is adequate to supply requirements. However, whether or not these identified resources will be developed rapidly enough to provide an uninterrupted fuel supply to expanded nuclear facilities could not be determined. On the basis of a scenario of high demand through 2035, 2.6 million tU is required and identified resources in operating or developing mines is inadequate. Beyond 2035, when requirements could exceed resources in these developing properties, other sources will need to be developed from less well-assured resources, deposits not yet at the prefeasibility stage, resources that are currently subeconomic, secondary sources, undiscovered conventional resources, and unconventional uranium supplies. This

  12. A METHODOLOGICAL APPROACH TO THE STRATEGIC ANALYSIS OF FOOD SECURITY

    Directory of Open Access Journals (Sweden)

    Anastasiia Mostova

    2017-12-01

    Full Text Available The objective of the present work is to substantiate the use of tools for strategic analysis in order to develop a strategy for the country’s food security under current conditions and to devise the author’s original technique to perform strategic analysis of food security using a SWOT-analysis. The methodology of the study. The article substantiates the need for strategic planning of food security. The author considers stages in strategic planning and explains the importance of the stage of strategic analysis of the country’s food security. It is proposed to apply a SWOT-analysis when running a strategic analysis of food security. The study is based on the system of indicators and characteristics of the country’s economy, agricultural sector, market trends, and material-technical, financial, and human resources, which are essential to obtain an objective assessment of the impact of trends and factors on food security, and in order to further develop the procedure for conducting a strategic analysis of the country’s food security. Results of the study. The procedure for strategic analysis of food security is developed based on the tool of a SWOT-analysis, which implies three stages: a strategic analysis of weaknesses and strengths, opportunities and threats; construction of the matrix of weaknesses and strengths, opportunities, and threats (SWOT-analysis matrix); and formation of the food security strategy based on the SWOT-analysis matrix. A list of characteristics was compiled in order to conduct a strategic analysis of food security and to categorize them as strengths or weaknesses, threats, and opportunities. The characteristics are systematized into strategic groups (production, market, resources, consumption); this is necessary for the objective establishment of strategic directions, responsible performers, allocation of resources, and effective control, for the purpose of further development and implementation of the strategy. A strategic analysis

  13. Potential impacts of ENDF/B-V on critical experiment analysis based on ZEBRA-8 criticals

    Energy Technology Data Exchange (ETDEWEB)

    Choong, T S

    1982-06-01

    The ZEBRA-8 series of null-zone measurements featured a different neutron spectrum for each assembly. The experiments were designed for the purpose of basic data testing. The series covers a range of spectra both harder and softer than that of the LMFBR. The potential impacts of the newly released ENDF/B-V cross-section library on LMFBR critical experiment analysis are discussed based on the analysis of the ZEBRA-8 series.

  14. Using functional analysis in archival appraisal a practical and effective alternative to traditional appraisal methodologies

    CERN Document Server

    Robyns, Marcus C

    2014-01-01

    In an age of scarcity and the challenge of electronic records, can archivists and records managers continue to rely upon traditional methodology essentially unchanged since the early 1950s? Using Functional Analysis in Archival Appraisal: A Practical and Effective Alternative to Traditional Appraisal Methodologies shows how archivists in other countries are already using functional analysis, which offers a better, more effective, and eminently more practical alternative to traditional appraisal methodologies that rely upon an analysis of the records themselves.

  15. Critical incident analysis through narrative reflective practice: A case study

    Directory of Open Access Journals (Sweden)

    Thomas S. C. Farrell

    2013-01-01

    Full Text Available Teachers can reflect on their practices by articulating and exploring incidents they consider critical to themselves or others. By talking about these critical incidents, teachers can make better sense of seemingly random experiences that occur in their teaching because they hold the real inside knowledge, especially personal intuitive knowledge, expertise and experience that is based on their accumulated years as language educators teaching in schools and classrooms. This paper is about one such critical incident analysis that an ESL teacher in Canada revealed to her critical friend and how both used McCabe’s (2002) narrative framework for analyzing an important critical incident that occurred in the teacher’s class.

  16. Population Analysis: A Methodology for Understanding Populations in COIN Environments

    National Research Council Canada - National Science Library

    Burke, Mark C; Self, Eric C

    2008-01-01

    .... Our methodology provides a heuristic model, called the "3 x 5 P.I.G.S.P.E.E.R. Model," that can be applied in any environment and will help bridge the gap between strategic theory and tactical implementation...

  17. SCIENTIFIC METHODOLOGY FOR THE APPLIED SOCIAL SCIENCES: CRITICAL ANALYSES ABOUT RESEARCH METHODS, TYPOLOGIES AND CONTRIBUTIONS FROM MARX, WEBER AND DURKHEIM

    Directory of Open Access Journals (Sweden)

    Mauricio Corrêa da Silva

    2015-06-01

    Full Text Available This study aims to discuss the importance of the scientific method for conducting and disseminating research in the applied social sciences, to discuss research typologies, and to highlight the contributions of Marx, Weber and Durkheim to scientific methodology. To reach this objective, we conducted a review of the literature on the term research, the scientific method, research techniques and scientific methodologies. The results of the investigation revealed that it is fundamental for the academic investigator to use a scientific method to conduct and disseminate his/her academic work in the applied social sciences, in comparison with the biochemical or computer sciences and with the indicated literature. Regarding the contributions to scientific methodology, Marx contributed the dialectical method, a striking analysis explicative of social phenomena, and the need to understand phenomena as historical and concrete totalities; Weber, the distinction between "facts" and "value judgments" to provide objectivity to the social sciences; and Durkheim, the need to conceptualize the object of study very well, to reject sense data, and to be imbued with the spirit of discovery and of being surprised by the results.

  18. Cognitive systems engineering analysis of the JCO criticality accident

    International Nuclear Information System (INIS)

    Tanabe, Fumiya; Yamaguchi, Yukichi

    2000-01-01

    The JCO Criticality Accident is analyzed with a framework based on cognitive systems engineering. With the framework, the analysis is conducted integrally from both the system viewpoint and the actors' viewpoint. The occupational chemical risk was an important safety constraint for the actors, as was the nuclear risk, due to a criticality accident, to the public and to the actors. The actors' inappropriate mental model of the work system played a critical role, and several factors (e.g., poor training and education, lack of information on criticality safety control in the procedures and instructions, and lack of warning signs at the workplace) contributed to forming and shaping that mental model. Based on the analysis, several countermeasures, such as warning signs, an information system for supporting actors, and improved training and education, are derived to prevent such an accident. (author)

  19. Analysis of the criticality safety of a nuclear fuel deposit

    International Nuclear Information System (INIS)

    Landeyro, P.A.; Mincarini, M.

    1987-01-01

    In the present work, a criticality accident safety analysis of nuclear fuel deposits is performed. The analysis is performed utilizing two methods derived from different physical principles: (1) the surface density method, obtained from experimental research; and (2) the solid angle method, derived from transport theory.

  20. Religious Education in Russia: A Comparative and Critical Analysis

    Science.gov (United States)

    Blinkova, Alexandra; Vermeer, Paul

    2018-01-01

    RE in Russia has been recently introduced as a compulsory regular school subject during the last year of elementary school. The present study offers a critical analysis of the current practice of Russian RE by comparing it with RE in Sweden, Denmark and Britain. This analysis shows that Russian RE is ambivalent. Although it is based on a…

  1. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, namely Granger causality tests on the 45 data points. However, the insufficiency of well-established causality models was found, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for use in situations of insufficient historical data. The Delphi method was selected and conducted to obtain consensus on the existence of causality among the 15 selected experts, utilizing 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. The existence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme condition tests were conducted on the developed SD model to ensure its capability to mimic reality and its robustness and validity as a platform for causality analysis. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, where very limited work has been done.
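
    The Granger causality step can be illustrated with statsmodels on synthetic data (the study's own 45 observations are not reproduced here); the sketch asks whether lagged values of a candidate "cause" series improve the prediction of an "effect" series.

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        # Synthetic two-column data set: column 0 is the "effect" series and
        # column 1 the candidate "cause", with an approximate lag-1 dependence.
        rng = np.random.default_rng(42)
        n = 45
        cause = rng.normal(size=n)
        effect = 0.8 * np.roll(cause, 1) + rng.normal(scale=0.3, size=n)
        data = np.column_stack([effect, cause])

        # Test whether the second column Granger-causes the first.
        results = grangercausalitytests(data, maxlag=2, verbose=False)
        for lag, res in results.items():
            f_stat, p_value = res[0]["ssr_ftest"][:2]
            print(f"lag {lag}: F = {f_stat:.2f}, p = {p_value:.4f}")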

  2. CRITICISM AND SUPPORT TO CORPORATE SOCIAL RESPONSIBILITY: AN ETHNOGRAPHIC APPROACH BASED ON THE WORKERS’ EXPERIENCE AND A QUALITATIVE METHODOLOGY PROPOSAL

    Directory of Open Access Journals (Sweden)

    JUAN ANTONIO NAVARRO PRADOS

    2007-01-01

    Full Text Available This paper aims at presenting the partial results and process of an investigation which analyzes the experience of a group of employees, namely the experience of the implementation process of the corporate social responsibility policy of a medium-sized service company. For the case study, participant observation, analysis of corporate documents and in-depth interviews with 64 employees across all organisational levels were employed. AtlasTi software was used to analyse and feed back the information received. This analysis produced a matrix of 161 content codes further analysed by means of network analysis methodology. Eventually, the content network data were compared to the corporate sociogram. The investigation has been carried out during the last three years.

  3. The development of a safety analysis methodology for the optimized power reactor 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun; Yo-Han, Kim

    2005-01-01

    Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology based on the delicate codes available to KEPRI to overcome the problems arising from the currently used vendor-oriented methodologies. For the Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for the Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. Also, for the Non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents mentioned in the final safety analysis report (FSAR) were analyzed. (author)

  4. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has recently been developed and adopted for small-break LOCA EEQ. The new M/E release analysis methodology has been extended to the M/E release analysis for containment design for large-break LOCA and the main steam line break (MSLB) accident, and named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K, enhanced with an M/E model and a LOCA long-term model, with CONTEMPT4/MOD5. The KIMERA methodology is applied to the MSLB M/E release analysis to assess its validity for MSLB containment design. The results are compared with the OPR1000 FSAR.

  5. Diagnosing the EAP needs of Turkish medical students: A longitudinal critical needs analysis

    Directory of Open Access Journals (Sweden)

    Neslihan Önder Özdemir

    2014-10-01

    Full Text Available This study uses a longitudinal critical needs analysis to diagnose the English for academic purposes (EAP needs of Turkish medical students seeking proficiency in medical English and contribute to needs analysis methodology. The data were collected from medical students and specialists. To obtain valid and reliable information about medical students’ needs, three types of instruments were used: ethnographic methods, including sustained observation and participation in a research setting; reflective journals; and a questionnaire and in-depth interview. The questionnaire design was based on essays collected from the students during their study, and the items were constructed from the students’ own words. To the best of my knowledge, this study is the first attempt in the literature to triangulate both methods and data with a focus on critical pedagogy to diagnose EAP needs. The findings are the result of the triangulation of data and methodology to ensure the reliability and validity of the findings. A total of 525 subjects participated in the research (186 participants in the pilot study and 339 participants in the main study. The findings revealed medical students’ expectations of their English for specific purposes (ESP instructor, students’ shortcomings, and the problems and strategies they use while learning medical English. The interview data analysis sought to determine whether higher education students can be a reliable source to consult for their own educational needs in higher education. The methodology followed here can be replicated in other mainstream classrooms.

  6. Sports Management for Sports Massification Planned and Executed by Social Organizations. Critics to Models, Experiences and Proposal Methodological Accompaniment

    OpenAIRE

    Lorenza Antonia Reyes de Duran

    2016-01-01

    This proposal of analysis, interpretation, deconstruction, self-criticism and guidance is born of, and comes from, work experience in mass sports planned by social organizations, set in opposition (not in the conventional comparative sense) to private business and state models of sport and management. The contribution made by sports management experience from positions of power, whether state or business, is undeniable, and its impact is difficult to express in numbers given its humanistic value, which is incalculable. Howe...

  7. Conceptual and critical analysis of the Implicit Leadership Theory

    OpenAIRE

    Hernández Avilés, Omar David; García Ramos, Tania

    2013-01-01

    The purpose of this essay is to present a conceptual and critical analysis of the Implicit Leadership Theory (ILT). The objectives are: 1) explaining the main concepts of the ILT; 2) explaining the main processes of the ILT; 3) identifying constructivist assumptions in the ILT; 4) identifying constructionist assumptions in the ILT; and 5) critically analyzing the theoretical assumptions of the ILT. When analyzing the constructivist and constructionist assumptions in the ILT, the constructivist leadersh...

  8. A fast reactor transient analysis methodology for personal computers

    International Nuclear Information System (INIS)

    Ott, K.O.

    1993-01-01

    A simplified model for a liquid-metal-cooled reactor (LMR) transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All 30 differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes a new form, i.e., the quadratic dynamics equation. In this integral formulation, the initial value problem of typical LMR transients can be solved with large time steps (initially 1 s, later up to 256 s). This makes transient problems amenable to treatment on a personal computer. The resulting mathematical model forms the basis for the GW-BASIC LMR Transient Calculation (LTC) program. The LTC program has also been converted to QuickBASIC. The running time for a 10-h overpower transient is then ∼40 to 10 s, depending on the hardware version (286, 386, or 486 with math coprocessors).
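
    The reduction to a quadratic can be illustrated generically: under the prompt jump approximation with a linear-in-power feedback term, the per-step balance is quadratic in the normalized power. The Python sketch below uses textbook-style placeholder coefficients and is not the actual LTC quadratic dynamics equation.

        import math

        # Generic illustrative quantities, not the LTC coefficients.
        BETA = 0.0065      # delayed-neutron fraction
        ALPHA_F = -0.003   # feedback reactivity per unit normalized power (negative)
        N0 = 1.0           # initial normalized power
        RHO_EXT = 0.002    # externally inserted reactivity, well below prompt critical
        S = BETA * N0      # delayed-neutron source term at the initial equilibrium

        # Prompt jump balance: n * (BETA - RHO_EXT - ALPHA_F * (n - N0)) = S
        # Rearranged:         ALPHA_F * n**2 - (BETA - RHO_EXT + ALPHA_F * N0) * n + S = 0
        a = ALPHA_F
        b = -(BETA - RHO_EXT + ALPHA_F * N0)
        c = S

        disc = b * b - 4.0 * a * c
        roots = [(-b + math.sqrt(disc)) / (2.0 * a), (-b - math.sqrt(disc)) / (2.0 * a)]
        n_new = max(r for r in roots if r > 0.0)   # keep the physically meaningful root

        print(f"Normalized power after the prompt jump: {n_new:.3f}")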

  9. Reading the World's Classics Critically: A Keyword-Based Approach to Literary Analysis in Foreign Language Studies

    Science.gov (United States)

    García, Nuria Alonso; Caplan, Alison

    2014-01-01

    While there are a number of important critical pedagogies being proposed in the field of foreign language study, more attention should be given to providing concrete examples of how to apply these ideas in the classroom. This article offers a new approach to the textual analysis of literary classics through the keyword-based methodology originally…

  10. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    International Nuclear Information System (INIS)

    Licu, Tony; Cioran, Florin; Hayward, Brent; Lowe, Andrew

    2007-01-01

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture in which people are encouraged to provide full and open information about how incidents occurred, and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability

  11. Methodologies for analysis of patterning in the mouse RPE sheet

    Science.gov (United States)

    Boatright, Jeffrey H.; Dalal, Nupur; Chrenek, Micah A.; Gardner, Christopher; Ziesel, Alison; Jiang, Yi; Grossniklaus, Hans E.

    2015-01-01

    Purpose Our goal was to optimize procedures for assessing shapes, sizes, and other quantitative metrics of retinal pigment epithelium (RPE) cells and contact- and noncontact-mediated cell-to-cell interactions across a large series of flatmount RPE images. Methods The two principal methodological advances of this study were optimization of a mouse RPE flatmount preparation and refinement of open-access software to rapidly analyze large numbers of flatmount images. Mouse eyes were harvested, and extra-orbital fat and muscles were removed. Eyes were fixed for 10 min, and dissected by puncturing the cornea with a sharp needle or a stab knife. Four radial cuts were made with iridectomy scissors from the puncture to near the optic nerve head. The lens, iris, and the neural retina were removed, leaving the RPE sheet exposed. The dissection and outcomes were monitored and evaluated by video recording. The RPE sheet was imaged under fluorescence confocal microscopy after staining for ZO-1 to identify RPE cell boundaries. Photoshop, Java, Perl, and Matlab scripts, as well as CellProfiler, were used to quantify selected parameters. Data were exported into Excel spreadsheets for further analysis. Results A simplified dissection procedure afforded a consistent source of images that could be processed by computer. The dissection and flatmounting techniques were illustrated in a video recording. Almost all of the sheet could be routinely imaged, and substantial fractions of the RPE sheet (usually 20–50% of the sheet) could be analyzed. Several common technical problems were noted and workarounds developed. The software-based analysis merged 25 to 36 images into one and adjusted settings to record an image suitable for large-scale identification of cell-to-cell boundaries, and then obtained quantitative descriptors of the shape of each cell, its neighbors, and interactions beyond direct cell–cell contact in the sheet. To validate the software, human- and computer
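
    The per-cell measurement step can be sketched with generic image-analysis tools (here scikit-image, used as a stand-in for the CellProfiler/script pipeline described above): label the interiors enclosed by a binary boundary mask and extract simple shape metrics. The boundary mask below is a synthetic grid, not a real ZO-1-stained flatmount image.

        import numpy as np
        from skimage.measure import label, regionprops

        # Synthetic binary mask of cell boundaries (a regular grid stands in for
        # the ZO-1 staining pattern of a real flatmount image).
        boundaries = np.zeros((200, 200), dtype=bool)
        boundaries[::25, :] = True   # horizontal "cell walls"
        boundaries[:, ::25] = True   # vertical "cell walls"

        # Connected interior regions between boundaries correspond to cells.
        cells = label(~boundaries)
        props = regionprops(cells)

        areas = [p.area for p in props]
        eccentricities = [p.eccentricity for p in props]
        print(f"cells detected: {len(props)}")
        print(f"mean area: {np.mean(areas):.1f} px, mean eccentricity: {np.mean(eccentricities):.2f}")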

  12. Critical Analysis of the Mathematical Formalism of Theoretical Physics. II. Foundations of Vector Calculus

    Science.gov (United States)

    Kalanov, Temur Z.

    2014-03-01

    A critical analysis of the foundations of standard vector calculus is proposed. The methodological basis of the analysis is the unity of formal logic and of rational dialectics. It is proved that the vector calculus is an incorrect theory because: (a) it is not based on a correct methodological basis, the unity of formal logic and of rational dialectics; (b) it does not contain the correct definitions of ``movement,'' ``direction,'' and ``vector''; (c) it does not take into consideration the dimensions of physical quantities (i.e., number names, denominate numbers, concrete numbers) characterizing the concept of ``physical vector,'' and, therefore, it has no natural-scientific meaning; (d) operations on ``physical vectors'' and the vector calculus propositions relating to the ``physical vectors'' are contrary to formal logic.

  13. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW methodology is a success oriented system analysis technique, and is capable of evaluating a large system with complex operational sequences. Recently an integrated analysis framework of the GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism, Japanese Government. This paper describes (a) an overview of the GO-FLOW methodology, (b) the procedure for treating a phased mission problem, (c) common cause failure analysis, (d) uncertainty analysis, and (e) the integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  14. National Waste Repository Novi Han operational safety analysis report. Safety assessment methodology

    International Nuclear Information System (INIS)

    2003-01-01

    The scope of the safety assessment (SA) presented includes: waste management functions (acceptance, conditioning, storage, disposal), inventory (current and expected in the future), hazards (radiological and non-radiological), and normal and accident modes. The stages in the development of the SA are: criteria selection, information collection, safety analysis and safety assessment documentation. After reviewing the facility's functions and the national and international requirements, the criteria for assessing the safety level are set. As a result of the second stage, the actual parameters of the facility necessary for the safety analysis are obtained. The methodology is selected on the basis of the comparability of the results with the results of previous safety assessments and with existing standards and requirements. The procedure and requirements for scenario selection are described. A radiological hazard categorisation of the facilities is presented. A qualitative hazards and operability analysis is applied. The resulting list of events is subjected to a prioritization procedure based on the method of 'criticality analysis', so that an estimate of the risk is given for each event. Events whose risk falls on the boundary of acceptability, or is unacceptable, are subjected to the next steps of the analysis. As a result, lists of scenarios for PSA and possible design scenarios are established. PSA logical modelling and quantitative calculations of accident sequences are presented.
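
    The 'criticality analysis' prioritization can be sketched as a simple risk matrix: each event is assigned qualitative likelihood and consequence classes, and events falling into the boundary or unacceptable categories are carried forward. The classes, matrix, and events in the Python sketch below are hypothetical, not those of the Novi Han assessment.

        # Hypothetical risk matrix mapping (likelihood, consequence) to a category.
        RISK_MATRIX = {
            ("low", "minor"):  "acceptable",
            ("low", "major"):  "boundary",
            ("high", "minor"): "boundary",
            ("high", "major"): "unacceptable",
        }

        # Hypothetical events with their qualitative classes.
        events = [
            ("drum drop during handling", "high", "minor"),
            ("loss of ventilation",       "low",  "minor"),
            ("fire in the storage area",  "low",  "major"),
        ]

        for name, likelihood, consequence in events:
            category = RISK_MATRIX[(likelihood, consequence)]
            follow_up = category in ("boundary", "unacceptable")
            print(f"{name:28s} -> {category:12s} further analysis: {follow_up}")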

  15. Critical parameters for propane determined by the image analysis

    Energy Technology Data Exchange (ETDEWEB)

    Honda, Y.; Sato, T. [Center for Multiscale Mechanics and Mechanical Systems, Keio University, Hiyoshi 3-14-1, Kohoku-ku, Yokohama 223-8522 (Japan); Uematsu, M. [Center for Multiscale Mechanics and Mechanical Systems, Keio University, Hiyoshi 3-14-1, Kohoku-ku, Yokohama 223-8522 (Japan)], E-mail: uematsu@mech.keio.ac.jp

    2008-02-15

    The (p, ρ, T) measurements and visual observations of the meniscus for propane were carried out carefully in the critical region over the range of temperatures: -60 mK ≤ (T - T_c) ≤ 40 mK and of densities: -4 kg·m^-3 ≤ (ρ - ρ_c) ≤ 6 kg·m^-3 by a metal-bellows volumometer with an optical cell. Vapour pressures were also measured at T = (320.000, 343.132, 369.000, and 369.625) K. The critical point (T_c, ρ_c, p_c) was determined by image analysis of the critical opalescence. Comparisons of the critical parameters with values given in the literature are presented.

  16. Critical parameters for propane determined by the image analysis

    International Nuclear Information System (INIS)

    Honda, Y.; Sato, T.; Uematsu, M.

    2008-01-01

    The (p, ρ, T) measurements and visual observations of the meniscus for propane were carried out carefully in the critical region over the range of temperatures: -60 mK ≤ (T - T_c) ≤ 40 mK and of densities: -4 kg·m^-3 ≤ (ρ - ρ_c) ≤ 6 kg·m^-3 by a metal-bellows volumometer with an optical cell. Vapour pressures were also measured at T = (320.000, 343.132, 369.000, and 369.625) K. The critical point (T_c, ρ_c, p_c) was determined by image analysis of the critical opalescence. Comparisons of the critical parameters with values given in the literature are presented

  17. Searching for scientific literacy and critical pedagogy in socioscientific curricula: A critical discourse analysis

    Science.gov (United States)

    Cummings, Kristina M.

    The omnipresence of science and technology in our society requires the development of a critical and scientifically literate citizenry. However, the inclusion of socioscientific issues, which are open-ended controversial issues informed by both science and societal factors such as politics, economics, and ethics, does not guarantee the development of these skills. The purpose of this critical discourse analysis is to identify and analyze the discursive strategies used in intermediate science texts and curricula that address socioscientific topics and the extent to which the discourses are designed to promote or suppress the development of scientific literacy and a critical pedagogy. Three curricula that address the issue of energy and climate change were analyzed using Gee's (2011) building tasks and inquiry tools. The curricula were written by an education organization entitled PreSEES, a corporate-sponsored group called NEED, and a non-profit organization named Oxfam. The analysis found that the PreSEES and Oxfam curricula elevated the significance of climate change and the NEED curriculum deemphasized the issue. The PreSEES and Oxfam curricula promoted the development of scientific literacy while the NEED curriculum suppressed its development. The PreSEES and Oxfam curricula both promoted the development of critical pedagogy; however, only the Oxfam curriculum provided authentic opportunities to enact sociopolitical change. The NEED curriculum suppressed the development of critical pedagogy. From these findings, the following conclusions were drawn. When socioscientific issues are presented with the development of scientific literacy and critical pedagogy, the curricula allow students to develop fact-based opinions about the issue. However, curricula that address socioscientific issues without the inclusion of these skills minimize the significance of the issue and normalize the hegemonic worldview promoted by the curricula's authors. Based on these findings

  18. Seismic hazard analysis. A methodology for the Eastern United States

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D L

    1980-08-01

    This report presents a probabilistic approach for estimating the seismic hazard in the Central and Eastern United States. The probabilistic model (Uniform Hazard Methodology) systematically incorporates the subjective opinion of several experts in the evaluation of seismic hazard. Subjective input, assumptions and associated hazard are kept separate for each expert so as to allow review and preserve diversity of opinion. The report is organized into five sections: Introduction, Methodology Comparison, Subjective Input, Uniform Hazard Methodology (UHM), and Uniform Hazard Spectrum. Section 2, Methodology Comparison, briefly describes the present approach and compares it with other available procedures. The remainder of the report focuses on the UHM. Specifically, Section 3 describes the elicitation of subjective input; Section 4 gives details of the various mathematical models (earthquake source geometry, magnitude distribution, attenuation relationship) and how these models are combined to calculate seismic hazard. The last section, Uniform Hazard Spectrum, highlights the main features of typical results. Specific results and sensitivity analyses are not presented in this report. (author)
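
    As a point of reference for how source geometry, magnitude distribution and attenuation combine into a hazard estimate, the following is a minimal single-source sketch in Python. The Gutenberg-Richter parameters, source distance and attenuation relation are invented placeholders, not values from the report.

        # Minimal single-source probabilistic seismic hazard sketch (illustrative only).
        # Annual exceedance rate:  lambda(PGA > a) = sum_m  nu(m) * P(PGA > a | m, r)
        import math
        import numpy as np

        a_gr, b_gr = 3.0, 1.0            # assumed Gutenberg-Richter recurrence parameters
        m_min, m_max, dm = 5.0, 7.5, 0.1
        r_km = 50.0                      # assumed source-to-site distance

        def annual_rate_ge(m):
            """Annual rate of earthquakes with magnitude >= m (truncated at m_max)."""
            return max(10.0 ** (a_gr - b_gr * m) - 10.0 ** (a_gr - b_gr * m_max), 0.0)

        def median_pga_g(m, r):
            """Very simple hypothetical attenuation relation (median PGA in g)."""
            return math.exp(-3.5 + 0.8 * m - 1.1 * math.log(r + 10.0))

        def prob_exceed(a, m, r, sigma_ln=0.6):
            """P(PGA > a | m, r) assuming lognormal scatter about the median."""
            z = (math.log(a) - math.log(median_pga_g(m, r))) / sigma_ln
            return 0.5 * math.erfc(z / math.sqrt(2.0))

        def hazard_rate(a):
            """Total annual rate of exceeding acceleration a (in g)."""
            mags = np.arange(m_min, m_max, dm)
            # incremental event rate in each magnitude bin, times exceedance probability
            nu = np.array([annual_rate_ge(m) - annual_rate_ge(m + dm) for m in mags])
            probs = np.array([prob_exceed(a, m + 0.5 * dm, r_km) for m in mags])
            return float(np.sum(nu * probs))

        for a in (0.05, 0.1, 0.2, 0.4):
            print(f"PGA > {a:.2f} g: annual exceedance rate = {hazard_rate(a):.2e}")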

  19. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    Science.gov (United States)

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that although narrative inquiry as a research methodology entails free conversations and personal stories, it should not be totally free and fictional, as it has to conform to recognized standards used for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  20. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  1. A methodology for accident analysis of fusion breeder blankets and its application to helium-cooled lead–lithium blanket

    International Nuclear Information System (INIS)

    Panayotov, Dobromir; Poitevin, Yves; Grief, Andrew; Trow, Martin; Dillistone, Michael

    2016-01-01

    'Fusion for Energy' (F4E) is designing, developing, and implementing the European Helium-Cooled Lead-Lithium (HCLL) and Helium-Cooled Pebble-Bed (HCPB) Test Blanket Systems (TBSs) for ITER (Nuclear Facility INB-174). Safety demonstration is an essential element for the integration of these TBSs into ITER and accident analysis is one of its critical components. A systematic approach to accident analysis has been developed under the F4E contract on TBS safety analyses. F4E technical requirements, together with Amec Foster Wheeler and INL efforts, have resulted in a comprehensive methodology for fusion breeding blanket accident analysis that addresses the specificity of the breeding blanket designs, materials, and phenomena while remaining consistent with the approach already applied to ITER accident analyses. Furthermore, the methodology phases are illustrated in the paper by its application to the EU HCLL TBS using both MELCOR and RELAP5 codes.

  2. In Their Own Words? Methodological Considerations in the Analysis of Terrorist Autobiographies

    Directory of Open Access Journals (Sweden)

    Mary Beth Altier

    2012-01-01

    Full Text Available Despite the growth of terrorism literature in the aftermath of the 9/11 attacks, there remain several methodological challenges to studying certain aspects of terrorism. This is perhaps most evident in attempts to uncover the attitudes, motivations, and intentions of individuals engaged in violent extremism and how they are sometimes expressed in problematic behavior. Such challenges invariably stem from the fact that terrorists and the organizations to which they belong represent clandestine populations engaged in illegal activity. Unsurprisingly, these qualities make it difficult for the researcher to identify and locate willing subjects of study—let alone a representative sample. In this research note, we suggest the systematic analysis of terrorist autobiographies offers a promising means of investigating difficult-to-study areas of terrorism-related phenomena. Investigation of autobiographical accounts not only offers additional data points for the study of individual psychological issues, but also provides valuable perspectives on the internal structures, processes, and dynamics of terrorist organizations more broadly. Moreover, given most autobiographies cover critical events and personal experiences across the life course, they provide a unique lens into how terrorists perceive their world and insight into their decision-making processes. We support our advocacy of this approach by highlighting its methodological strengths and shortcomings.

  3. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project

    International Nuclear Information System (INIS)

    Roberds, W.J.; Plum, R.J.; Visca, P.J.

    1984-11-01

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab

  4. Common cause failure analysis methodology for complex systems

    International Nuclear Information System (INIS)

    Wagner, D.P.; Cate, C.L.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complex system reliability analysis. This paper extends existing methods of computer aided common cause failure analysis by allowing analysis of the complex systems often encountered in practice. The methods presented here aid in identifying potential common cause failures and also address quantitative common cause failure analysis

  5. Discovering the Effects-Endstate Linkage: Using Soft Systems Methodology to Perform EBO Mission Analysis

    National Research Council Canada - National Science Library

    Young, Jr, William E

    2005-01-01

    .... EBO mission analysis is shown to be more problem structuring than problem solving. A new mission analysis process is proposed using a modified version of Soft Systems Methodology to meet these challenges...

  6. The Potential Unity of Critical Thinking and Values Analysis.

    Science.gov (United States)

    Browne, M. Neil

    Metaphorically, the head and the heart represent different decision-making strategies. The disjunction between these two cultures is both sharp and unnecessary. The conflict between rationality and emotion is much broader than the tension between critical thinking and values analysis, but the assumptions responsible for the mutual awkwardness of…

  7. Quantifying tight-gas sandstone permeability via critical path analysis

    Science.gov (United States)

    Rock permeability has been actively investigated over the past several decades by the geosciences community. However, its accurate estimation still presents significant technical challenges, especially in spatially complex rocks. In this letter, we apply critical path analysis (CPA) to estimate perm...

  8. Examining Bilingual Children's Gender Ideologies through Critical Discourse Analysis

    Science.gov (United States)

    Martinez-Roldan, Carmen M.

    2005-01-01

    This article presents a case study of young bilingual students' discussions of literature in a second-grade Spanish/English bilingual classroom in the US. Sociocultural, critical, and Chicana feminist perspectives informed an analysis of the ways the children worked at understanding, marking, and resisting gender boundaries. This critical…

  9. Acknowledging the Infrasystem: A Critical Feminist Analysis of Systems Theory.

    Science.gov (United States)

    Creedon, Pamela J.

    1993-01-01

    Examines the absence of a critical feminist perspective in the application of systems theory as a unifying model for public relations. Describes an unacknowledged third system, the infrasystem, that constructs both suprasystem and subsystem interactions. Concludes with a case analysis of sport as illustration. (HB)

  10. Teaching Blended Content Analysis and Critically Vigilant Media Consumption

    Science.gov (United States)

    Harris, Christopher S.

    2015-01-01

    The semester-long activity described herein uses an integrated instructional approach to media studies to introduce students to the research method of qualitative content analysis and help them become more critically vigilant media consumers. The goal is to increase students' media literacy by guiding them in the design of an exploratory…

  11. The Digital Single Market and Legal Certainty : A Critical Analysis

    NARCIS (Netherlands)

    Castermans, A.G.; Graaff, de R.; Haentjens, M.; Colombi, Ciacchi A.

    2016-01-01

    This chapter critically examines the CESL from the viewpoint of its capability to provide legal certainty for commercial actors. This chapter’s analysis focuses on three important stages in the life cycle of a contract, seen from a business perspective: the scope rules that determine whether the

  12. Ideology, Rationality and Reproduction in Education: A Critical Discourse Analysis

    Science.gov (United States)

    Lim, Leonel

    2014-01-01

    In undertaking a critical discourse analysis of the professed aims and objectives of one of the most influential curricula in the teaching of thinking, this article foregrounds issues of power and ideology latent in curricular discourses of rationality. Specifically, it documents the subtle but powerful ways in which political and class…

  13. Critical Discourse Analysis of Advertising: Implications for Language Teacher Education

    Science.gov (United States)

    Turhan, Burcu; Okan, Zuhal

    2017-01-01

    Advertising is a prominent discourse type which is inevitably linked to a range of disciplines. This study examines the language of a non-product advertisement, not isolating it from its interaction with other texts that surrounds it. It is based on Norman Fairclough's Critical Discourse Analysis (CDA) framework in which there are three levels of…

  14. Critical reflection activation analysis - a new near-surface probe

    International Nuclear Information System (INIS)

    Gunn, J.M.F.; Trohidou, K.N.

    1988-09-01

    We propose a new surface analytic technique, Critical Reflection Activation Analysis (CRAA). This technique allows accurate depth profiling of impurities ≤ 100 Å beneath a surface. The depth profile of the impurity is simply related to the induced activity as a function of the angle of reflection. We argue that the technique is practical and estimate its accuracy. (author)

  15. Ecological literacy materials for use in elementary schools: A critical analysis

    Science.gov (United States)

    Chambers, Joan Maureen

    My research is a critical examination of environmental science education resources for use in Alberta schools. I examine both the resources and the processes by which these resources are developed by diverse groups. My inquiry is guided by the following question: What is the nature of the discourse of ecological literacy in the promotion and content of teaching materials in elementary schools in Alberta? This critical analysis centres on the discourses, language, and perspectives (both hidden and overt) of these resources and processes; the manifestation of political agendas; existing relations; and the inclusion or exclusion of alternate views. Framed within critical theory and an ecosocial construct, my methodology employs critical discourse analysis and hermeneutic interpretation. I analyse selected environmental science resources produced for the elementary classroom by government and nongovernment organizations. I also interview the producers and/or writers of these instructional resources to provide the perspectives of some of the developers of these materials. The findings illustrate how the discursive management of the view of nature, human-nature relationships, uncertainty, multiple perspectives, and dimensions of ecological literacy in materials for schools offer students a particular perspective. These ecological and science discourses act to shape their personal relationships with nature and notions of environmental responsibility and consciousness. This research is necessary because, particularly in Alberta, corporate interests have the potential to impact school curricula. The study points to a need for a critical appraisal of resources for schools produced by the environmental science community.

  16. Internal fire analysis screening methodology for the Salem Nuclear Generating Station

    International Nuclear Information System (INIS)

    Eide, S.; Bertucio, R.; Quilici, M.; Bearden, R.

    1989-01-01

    This paper reports on an internal fire analysis screening methodology that has been utilized for the Salem Nuclear Generating Station (SNGS) Probabilistic Risk Assessment (PRA). The methodology was first developed and applied in the Brunswick Steam Electric Plant (BSEP) PRA. The SNGS application includes several improvements and extensions to the original methodology. The SNGS approach differs significantly from traditional fire analysis methodologies by providing a much more detailed treatment of transient combustibles. This level of detail results in a model which is more usable for assisting in the management of fire risk at the plant

  17. Development Risk Methodology for Whole Systems Trade Analysis

    Science.gov (United States)

    2016-08-01

    (WSTAT). In the early stages of the V&V for development risk, it was discovered that the original risk rating and methodology did not actually … WSTA has opened trade space exploration by allowing the tool to evaluate trillions of potential system configurations to then return a handful of …

  18. Development of the criticality accident analysis code, AGNES

    International Nuclear Information System (INIS)

    Nakajima, Ken

    1989-01-01

    In design work for facilities that handle nuclear fuel, the evaluation of criticality accidents cannot be avoided, even if their probability is negligibly small. In particular, in systems using solution fuel such as uranyl nitrate, the solution can easily take on a configuration favourable to criticality, and all past criticality accidents have occurred with solutions; therefore, the evaluation of criticality accidents becomes the most important item of the safety analysis. When a criticality accident occurs in a solution fuel system, power oscillations and pressure pulses are observed owing to the generation and movement of radiolysis gas voids. In order to evaluate the consequences of criticality accidents, these power oscillations and pressure pulses must be calculated accurately. For this purpose, the dynamics code AGNES (Accidentally Generated Nuclear Excursion Simulation code) was developed. AGNES is a reactor dynamics code with two independent void models, a modified energy model and a pressure model. As a benchmark calculation of the AGNES code, the results of an analysis of the CRAC experiments are reported. (K.I.)

  19. Critical Discourse Analysis. The Elaboration of a Problem Oriented Discourse Analytic Approach After Foucault

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-05-01

    Full Text Available Abstract: The German discourse researcher Siegfried JÄGER from Duisburg is the first to have published a German-language book about the methodology of discourse analysis after FOUCAULT. JÄGER integrates in his work the discourse analytic work of Jürgen LINK as well as the interdisciplinary discussion carried on in the discourse analytic journal "kultuRRevolution" (Journal for Applied Discourse Analysis). JÄGER and his co-workers have been associated with the Duisburger Institute for Language Research and Social Research (DISS, see http://www.diss-duisburg.de/) for 20 years, developing discourse theory and the methodology of discourse analysis. The interview was done via e-mail. It depicts the discourse analytic approach of JÄGER and his co-workers following the works of FOUCAULT and LINK. The interview reconstructs JÄGER's vita and his academic career. Further topics of the interview are the agenda of JÄGER's discourse studies, methodological considerations, the (problematic) relationship between FOUCAULDian discourse analysis and (discourse) linguistics, styles and organization of research, and questions concerning applied discourse analytic research as a form of critical intervention. URN: urn:nbn:de:0114-fqs0603219

  20. Multidisciplinary critical discourse analysis: a plea for diversity

    Directory of Open Access Journals (Sweden)

    Teun A. van Dijk

    2013-12-01

    Full Text Available This text is a Brazilian Portuguese version of the author's chapter in the book "Methods of Critical Discourse Analysis". The author outlines a Critical Discourse Analysis framework while presenting a synthesis of his thinking about some possible relations between discourse and society. The author's theoretical horizon ranges from the structuralist paradigm to the socio-cognitive one. Finally, the reader will find an early presentation of the categories of the author's theory of context, which was published seven years later.

  1. Sensitivity analysis of critical experiments with evaluated nuclear data libraries

    International Nuclear Information System (INIS)

    Fujiwara, D.; Kosaka, S.

    2008-01-01

    Criticality benchmark testing was performed with evaluated nuclear data libraries for thermal, low-enriched uranium fuel rod applications. C/E values for k_eff were calculated with the continuous-energy Monte Carlo code MVP2 and its libraries generated from ENDF/B-VI.8, ENDF/B-VII.0, JENDL-3.3 and JEFF-3.1. Subsequently, the observed k_eff discrepancies between libraries were decomposed to identify the sources of difference in the nuclear data libraries using a sensitivity analysis technique. The obtained sensitivity profiles are also utilized to estimate the applicability of the cold critical experiments to the boiling water reactor under hot operating conditions. (authors)
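
    As an illustration of how such sensitivity profiles can be used, the sketch below propagates assumed relative cross-section differences between two libraries through assumed sensitivity coefficients to a first-order k_eff difference; the reaction list, sensitivities and differences are hypothetical, not values from the paper.

        # First-order propagation of nuclear-data differences to k_eff (illustrative sketch):
        #   delta_k / k  ~=  sum_i  S_i * (delta_sigma_i / sigma_i)
        # where S_i = (sigma_i / k) * (dk / dsigma_i) are sensitivity coefficients.
        import numpy as np

        # Hypothetical sensitivity coefficients (per unit relative change in each reaction)
        sensitivities = {
            "U-235 nu-bar":   0.95,
            "U-235 fission":  0.30,
            "U-238 capture": -0.12,
            "H-1 elastic":    0.05,
        }

        # Hypothetical relative differences between two evaluated libraries (B relative to A)
        rel_diff = {
            "U-235 nu-bar":   0.002,
            "U-235 fission": -0.004,
            "U-238 capture":  0.010,
            "H-1 elastic":   -0.003,
        }

        S = np.array([sensitivities[r] for r in sensitivities])
        d = np.array([rel_diff[r] for r in sensitivities])

        delta_k_over_k = float(S @ d)
        print(f"Estimated first-order delta-k/k between libraries: {delta_k_over_k*1e5:.0f} pcm")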

  2. Kinetic analysis of sub-prompt-critical reactor assemblies

    International Nuclear Information System (INIS)

    Das, S.

    1992-01-01

    Neutronic analysis of safety-related kinetics problems in experimental neutron multiplying assemblies has been carried out using a sub-prompt-critical reactor model. The model is based on the concepts of a sub-prompt-critical nuclear reactor and of instantaneous neutron multiplication in a reactor system. Computations of reactor power, period and reactivity using the model show excellent agreement with results obtained from the exact kinetics method. Analytic expressions for the energy released in a controlled nuclear power excursion are derived. Application of the model to a Pulsed Fast Reactor gives its sensitivity between 4 and 5. (author). 6 refs., 4 figs., 1 tab

  3. GIS methodology for geothermal play fairway analysis: Example from the Snake River Plain volcanic province

    Science.gov (United States)

    DeAngelo, Jacob; Shervais, John W.; Glen, Jonathan; Nielson, Dennis L.; Garg, Sabodh; Dobson, Patrick; Gasperikova, Erika; Sonnenthal, Eric; Visser, Charles; Liberty, Lee M.; Siler, Drew; Evans, James P.; Santellanes, Sean

    2016-01-01

    Play fairway analysis in geothermal exploration derives from a systematic methodology originally developed within the petroleum industry and is based on a geologic and hydrologic framework of identified geothermal systems. We are tailoring this methodology to study the geothermal resource potential of the Snake River Plain and surrounding region. This project has contributed to the success of this approach by cataloging the critical elements controlling exploitable hydrothermal systems, establishing risk matrices that evaluate these elements in terms of both probability of success and level of knowledge, and building automated tools to process results. ArcGIS was used to compile a range of different data types, which we refer to as ‘elements’ (e.g., faults, vents, heatflow…), with distinct characteristics and confidence values. Raw data for each element were transformed into data layers with a common format. Because different data types have different uncertainties, each evidence layer had an accompanying confidence layer, which reflects spatial variations in these uncertainties. Risk maps represent the product of evidence and confidence layers, and are the basic building blocks used to construct Common Risk Segment (CRS) maps for heat, permeability, and seal. CRS maps quantify the variable risk associated with each of these critical components. In a final step, the three CRS maps were combined into a Composite Common Risk Segment (CCRS) map for analysis that reveals favorable areas for geothermal exploration. Python scripts were developed to automate data processing and to enhance the flexibility of the data analysis. Python scripting provided the structure that makes a custom workflow possible. Nearly every tool available in the ArcGIS ArcToolbox can be executed using commands in the Python programming language. This enabled the construction of a group of tools that could automate most of the processing for the project. Currently, our tools are repeatable
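
    A minimal sketch of the layer-combination step described above is given below, using plain NumPy arrays in place of ArcGIS rasters; the grid, the evidence-times-confidence weighting and the product used for the composite map are illustrative assumptions, not the project's actual tools.

        # Sketch of combining evidence and confidence layers into risk maps (illustrative only).
        # Each "layer" is a small grid; in the real workflow these would be ArcGIS rasters.
        import numpy as np

        rng = np.random.default_rng(0)
        shape = (4, 5)  # tiny hypothetical grid

        def risk_layer(evidence, confidence):
            """Risk map = evidence (favourability, 0-1) weighted by confidence (0-1)."""
            return evidence * confidence

        # Hypothetical evidence/confidence pairs for the three critical components
        heat_crs = risk_layer(rng.random(shape), rng.random(shape))
        perm_crs = risk_layer(rng.random(shape), rng.random(shape))
        seal_crs = risk_layer(rng.random(shape), rng.random(shape))

        # Composite Common Risk Segment map: here a simple product, so that a low score
        # in any one component (heat, permeability, seal) suppresses the composite score.
        ccrs = heat_crs * perm_crs * seal_crs

        best = np.unravel_index(np.argmax(ccrs), ccrs.shape)
        print("Most favourable cell:", best, "score:", round(float(ccrs[best]), 3))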

  4. A review of the current state-of-the-art methodology for handling bias and uncertainty in performing criticality safety evaluations. Final report

    International Nuclear Information System (INIS)

    Disney, R.K.

    1994-10-01

    The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSE's) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSE's within the overview of the US Department of Energy (DOE) is a shift in the overview function from a 'site' perception to a more uniform or 'national' perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation

  5. Exploring the Philosophical Underpinnings of Research: Relating Ontology and Epistemology to the Methodology and Methods of the Scientific, Interpretive, and Critical Research Paradigms

    Science.gov (United States)

    Scotland, James

    2012-01-01

    This paper explores the philosophical underpinnings of three major educational research paradigms: scientific, interpretive, and critical. The aim was to outline and explore the interrelationships between each paradigm's ontology, epistemology, methodology and methods. This paper reveals and then discusses some of the underlying assumptions of…

  6. Adding Value to the Learning Process by Online Peer Review Activities: Towards the Elaboration of a Methodology to Promote Critical Thinking in Future Engineers

    Science.gov (United States)

    Dominguez, Caroline; Nascimento, Maria M.; Payan-Carreira, Rita; Cruz, Gonçalo; Silva, Helena; Lopes, José; Morais, Maria da Felicidade A.; Morais, Eva

    2015-01-01

    Considering the results of research on the benefits and difficulties of peer review, this paper describes how teaching faculty, interested in promoting the acquisition of communication and critical thinking (CT) skills among engineering students, have been implementing a learning methodology through online peer review activities. While…

  7. A critical analysis of the tender points in fibromyalgia.

    Science.gov (United States)

    Harden, R Norman; Revivo, Gadi; Song, Sharon; Nampiaparampil, Devi; Golden, Gary; Kirincic, Marie; Houle, Timothy T

    2007-03-01

    To pilot methodologies designed to critically assess the American College of Rheumatology's (ACR) diagnostic criteria for fibromyalgia. Prospective, psychophysical testing. An urban teaching hospital. Twenty-five patients with fibromyalgia and 31 healthy controls (convenience sample). Pressure pain threshold was determined at the 18 ACR tender points and five sham points using an algometer (dolorimeter). The patients' "algometric total scores" (sums of the patients' average pain thresholds at the 18 tender points) were derived, as well as pain thresholds across sham points. The "algometric total score" could differentiate patients with fibromyalgia from normals with an accuracy of 85.7% (P […]) pain across sham points than across ACR tender points, sham points also could be used for diagnosis (85.7%; Ps […]) tested vs other painful conditions. The points specified by the ACR were only modestly superior to sham points in making the diagnosis. Most importantly, this pilot suggests single points, smaller groups of points, or sham points may be as effective in diagnosing fibromyalgia as the use of all 18 points, and suggests methodologies to definitively test that hypothesis.

  8. Fully Stochastic Distributed Methodology for Multivariate Flood Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Isabel Flores-Montoya

    2016-05-01

    Full Text Available An adequate estimation of the extreme behavior of basin response is essential both for designing river structures and for evaluating their risk. The aim of this paper is to develop a new methodology to generate extreme hydrograph series of thousands of years using an event-based model. To this end, a spatial-temporal synthetic rainfall generator (RainSimV3) is combined with a distributed, physically-based, event-based rainfall-runoff model (RIBS). The use of an event-based model allows longer hydrograph series to be simulated with lower computational and data requirements, but requires characterization of the initial basin state, which depends on the initial basin moisture distribution. To overcome this problem, this paper proposes a probabilistic calibration-simulation approach, which considers the initial state and the model parameters as random variables characterized by probability distributions through a Monte Carlo simulation. This approach is compared with two other approaches, the deterministic and the semi-deterministic approaches. Both of these use a single initial state. The deterministic approach also uses a single value of the model parameters, while the semi-deterministic approach obtains these values from their probability distributions through a Monte Carlo simulation, considering the basin variability. This methodology has been applied to the Corbès and Générargues basins, in the Southeast of France. The results show that the probabilistic approach offers the best fit. That means that the proposed methodology can be successfully used to characterize the extreme behavior of the basin, considering the basin variability and overcoming the basin initial state problem.
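
    A stripped-down sketch of the probabilistic calibration-simulation idea follows: an initial basin state and a model parameter are sampled for each event before a (here deliberately trivial, stand-in) rainfall-runoff transformation is applied. The distributions, the toy runoff model and the quantile summary are assumptions for illustration, not the RainSimV3/RIBS chain itself.

        # Sketch of the probabilistic (Monte Carlo) approach: initial basin state and model
        # parameters are treated as random variables for each simulated rainfall event.
        import numpy as np

        rng = np.random.default_rng(42)
        n_events = 100_000

        # Synthetic event rainfall depths (mm) - stand-in for the stochastic rainfall generator
        rainfall = rng.gamma(shape=1.2, scale=30.0, size=n_events)

        # Random initial soil moisture deficit (mm) and a random runoff-coefficient parameter
        initial_deficit = rng.uniform(0.0, 60.0, size=n_events)
        runoff_coeff = rng.normal(loc=0.6, scale=0.1, size=n_events).clip(0.1, 0.95)

        # Toy event-based transformation: effective rainfall times a runoff coefficient
        effective_rain = np.clip(rainfall - initial_deficit, 0.0, None)
        peak_proxy = runoff_coeff * effective_rain   # proxy for flood peak magnitude

        # Empirical quantiles of the simulated extreme response
        for T in (10, 100, 1000):
            q = np.quantile(peak_proxy, 1.0 - 1.0 / T)
            print(f"~1-in-{T} event return level of the peak proxy: {q:.1f}")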

  9. Applications of a damage tolerance analysis methodology in aircraft design and production

    Science.gov (United States)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture resistant concepts in the design, to utilize damage tolerance based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression after impact coupon and element tests. An integrated analysis package was developed to predict damage tolerance based margin-of-safety (MS) using NASTRAN generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.
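
    A margin-of-safety screening of the kind described can be sketched as below; the element strains are hypothetical and the generic ratio definition MS = allowable/applied - 1 is used for illustration, not necessarily the exact form implemented in the integrated package.

        # Generic damage-tolerance margin-of-safety sketch (illustrative values only).
        # MS = (allowable strain at delamination growth) / (applied strain) - 1
        # A negative MS flags an element whose applied strain exceeds the allowable.

        elements = {
            # element id: (applied compressive strain, predicted critical strain)
            "E101": (3200e-6, 4500e-6),
            "E102": (4100e-6, 3900e-6),
            "E103": (2500e-6, 4200e-6),
        }

        for elem, (applied, allowable) in elements.items():
            ms = allowable / applied - 1.0
            status = "OK" if ms >= 0.0 else "FAIL"
            print(f"{elem}: MS = {ms:+.2f} ({status})")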

  10. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K of PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin as compared with a classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, generally two kinds of uncertainties are required to be identified and quantified, namely model uncertainties and plant status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best estimate LOCA code, such as RELAP5 and TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Generally, the DRHM methodology can generate about 80-100 K of margin on PCT as compared to an Appendix K bounding-state LOCA analysis.
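
    A toy illustration of the statistical treatment of plant-status uncertainty is sketched below: plant parameters are sampled from assumed distributions, a stand-in response surface plays the role of the conservative evaluation model, and an upper percentile of the resulting PCT population is compared with the 1477 K (2200 °F) limit. All distributions and coefficients are invented.

        # Sketch of statistical treatment of plant-status uncertainty on PCT (illustrative).
        import numpy as np

        rng = np.random.default_rng(7)
        n = 10_000

        # Hypothetical plant-status parameters sampled from assumed distributions
        core_power_frac = rng.normal(1.00, 0.01, n)   # fraction of rated power
        peaking_factor = rng.normal(2.30, 0.05, n)    # hot-rod peaking factor
        accum_pressure = rng.uniform(4.0, 4.4, n)     # accumulator pressure, MPa

        # Stand-in "conservative evaluation model" response surface for PCT (K)
        pct = (900.0
               + 250.0 * (core_power_frac - 1.0) * 10.0
               + 120.0 * (peaking_factor - 2.30)
               - 40.0 * (accum_pressure - 4.2))

        print(f"Mean PCT          : {pct.mean():.1f} K")
        print(f"95th percentile   : {np.percentile(pct, 95):.1f} K")
        print(f"Margin to 1477 K  : {1477.0 - np.percentile(pct, 95):.1f} K")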

  11. INTEGRATED METHODOLOGY FOR PRODUCT PLANNING USING MULTI CRITERIA ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tarun Soota

    2016-09-01

    Full Text Available An integrated approach to multi-criteria decision problems is proposed using quality function deployment and the analytical network process. The objective of the work is to rationalize and improve the method of analyzing and interpreting customer needs and technical requirements. The methodology is used to determine and prioritize engineering requirements based on customer needs for the development of the best product. The framework allows the decision maker to decompose a complex problem into a hierarchical structure showing the relationships between the objective and the criteria. Multi-criteria decision modeling is used to extend the hierarchy process to both dependence and feedback. A case study on bikes is presented for the proposed model.
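
    The core prioritization step (customer importance weights combined with a need-versus-requirement relationship matrix, as in a house of quality) can be sketched as follows; the needs, technical requirements and scores below are hypothetical.

        # Sketch of QFD-style prioritization of technical requirements (illustrative data).
        import numpy as np

        customer_needs = ["light weight", "comfortable ride", "low price", "durability"]
        importance = np.array([4.0, 5.0, 3.0, 4.5])   # hypothetical customer weights (1-5)

        tech_requirements = ["frame material", "suspension design", "gear system"]
        # Relationship matrix: rows = customer needs, columns = technical requirements
        # (9 = strong, 3 = medium, 1 = weak, 0 = none)
        R = np.array([
            [9, 1, 0],
            [3, 9, 3],
            [3, 3, 1],
            [9, 3, 3],
        ])

        raw_scores = importance @ R
        priorities = raw_scores / raw_scores.sum()

        for req, p in sorted(zip(tech_requirements, priorities), key=lambda x: -x[1]):
            print(f"{req:18s} relative priority: {p:.2f}")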

  12. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...
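
    The two protocols being compared can be contrasted in a few lines: gap-score analysis computes performance minus importance per attribute, while importance-performance analysis places each attribute in a quadrant relative to the grand means. The attributes and ratings below are invented for illustration.

        # Sketch contrasting gap-score analysis with importance-performance (IP) analysis.
        # Ratings are hypothetical means on a 1-5 scale for a few service attributes.
        attributes = {
            # attribute: (importance, performance)
            "clean restrooms": (4.6, 3.8),
            "helpful staff":   (4.2, 4.4),
            "trail signage":   (3.9, 3.1),
            "parking":         (3.2, 3.5),
        }

        mean_imp = sum(i for i, _ in attributes.values()) / len(attributes)
        mean_perf = sum(p for _, p in attributes.values()) / len(attributes)

        for name, (imp, perf) in attributes.items():
            gap = perf - imp                       # gap-score protocol
            quadrant = (("Concentrate here" if perf < mean_perf else "Keep up the good work")
                        if imp >= mean_imp else
                        ("Low priority" if perf < mean_perf else "Possible overkill"))
            print(f"{name:16s} gap = {gap:+.1f}   IP quadrant: {quadrant}")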

  13. Pareto analysis of critical factors affecting technical institution evaluation

    Directory of Open Access Journals (Sweden)

    Victor Gambhir

    2012-08-01

    Full Text Available With the change of education policy in 1991, more and more technical institutions are being set up in India. Some of these institutions provide quality education, but others merely concentrate on quantity. Stakeholders are in a state of confusion about how to select the best institute for their higher educational studies. Although various agencies, including the print media, provide rankings of these institutions every year, their results are controversial and biased. In this paper, the authors have made an endeavor to find the critical factors for technical institution evaluation from a literature survey. A Pareto analysis has also been performed to find the intensity of these critical factors in the evaluation. This will not only help stakeholders take the right decisions but will also help the management of institutions in benchmarking, by identifying the most important critical areas in which to improve the existing system. This will in turn help the Indian economy.
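
    The Pareto step itself can be reproduced in a few lines: factors are ranked by weight, cumulative percentages are computed, and the 'vital few' up to a chosen cutoff (commonly 80%) are flagged. The factor names and weights below are illustrative, not the study's data.

        # Sketch of a Pareto analysis of evaluation factors (hypothetical weights).
        factors = {
            "faculty quality": 34,
            "placement record": 27,
            "infrastructure": 15,
            "industry linkage": 9,
            "research output": 7,
            "fees": 5,
            "location": 3,
        }

        total = sum(factors.values())
        cumulative = 0.0
        print(f"{'factor':18s} {'weight%':>8s} {'cum%':>7s}")
        for name, w in sorted(factors.items(), key=lambda kv: -kv[1]):
            share = 100.0 * w / total
            cumulative += share
            marker = " <- vital few" if cumulative <= 80.0 else ""
            print(f"{name:18s} {share:8.1f} {cumulative:7.1f}{marker}")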

  14. Critical thickness for Nb nanofilm on sapphire substrate: a critical analysis using finite element method

    International Nuclear Information System (INIS)

    Kumar, Arun; Subramaniam, Anandh

    2009-01-01

    Full text: On growth beyond the critical thickness, interfacial misfit dislocations partially relax the misfit strains in epitaxially grown nanofilms. In this study the stress state and growth of nanofilms are simulated using the Finite Element Method (FEM), by imposing stress-free strains corresponding to the lattice mismatch between the Nb nanofilm and the sapphire substrate. On growth of the Nb nanofilm, a triangular network of edge misfit dislocations nucleates at the (0001) Al2O3 || (111) Nb interface. Using a combined simulation of a coherently strained nanofilm and an edge dislocation, the equilibrium criterion for the nucleation of an edge dislocation is determined. Theoretical analyses in the literature use only the component of the Burgers vector parallel to the interface, which is an erroneous description of the stress state and energetics of the system. In this investigation the full interfacial edge dislocation is simulated using standard commercially available software, and comparisons are made with results available in the literature to bring out the utility of the methodology

  15. Critical experiments analysis by ABBN-90 constant system

    Energy Technology Data Exchange (ETDEWEB)

    Tsiboulia, A.; Nikolaev, M.N.; Golubev, V. [Institute of Physics and Power Engineering, Obninsk (Russian Federation)] [and others]

    1997-06-01

    The ABBN-90 is a new version of the well-known Russian group-constant system ABBN. The included constants were calculated based on files of evaluated nuclear data from the BROND-2, ENDF/B-VI, and JENDL-3 libraries. The ABBN-90 is intended for the calculation of different types of nuclear reactors and of radiation shielding. Calculations of criticality safety and reactivity accidents are also supported by this constant set. Validation of the ABBN-90 set was carried out using a computerized bank of evaluated critical experiments. This bank includes the results of experiments conducted in Russia and abroad on compact spherical assemblies with different reflectors, fast critical assemblies, and fuel/water solution criticality configurations. This report presents the results of the calculational analysis of the whole collection of critical experiments. All calculations were produced with the ABBN-90 group-constant system. Discrepancies revealed between experimental and calculated results, and their possible causes, are discussed. The INDECS system of codes and archives is also described. This system includes three computerized banks: LEMEX, which consists of evaluated experiments and their calculational results; LSENS, which consists of sensitivity coefficients; and LUND, which consists of group-constant covariance matrices. The INDECS system permits us to estimate the accuracy of neutronics calculations. A discussion of the reliability of such estimations is finally presented. 16 figs.

  16. Stream habitat analysis using the instream flow incremental methodology

    Science.gov (United States)

    Bovee, Ken D.; Lamb, Berton L.; Bartholow, John M.; Stalnaker, Clair B.; Taylor, Jonathan; Henriksen, Jim

    1998-01-01

    This document describes the Instream Flow Incremental Methodology (IFIM) in its entirety. It is intended to serve as a comprehensive introductory textbook on IFIM for training courses, as it contains the most complete description of IFIM in existence today. It should also serve as an official published guide to IFIM, counteracting the misconceptions about the methodology that have pervaded the professional literature since the mid-1980s, since it describes IFIM as envisioned by its developers. The document is aimed at decisionmakers in the management and allocation of natural resources, providing them with an overview, and at those who design and implement studies to inform the decisionmakers. There should be enough background on model concepts, data requirements, calibration techniques, and quality assurance to help the technical user design and implement a cost-effective application of IFIM that will provide policy-relevant information. Some of the chapters deal with the basic organization of IFIM and the procedural sequence of applying IFIM, starting with problem identification, study planning and implementation, and problem resolution.

  17. Critical asset and portfolio risk analysis: an all-hazards framework.

    Science.gov (United States)

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
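
    The notional product described above can be written out directly; the threat, vulnerability and consequence values below are placeholders, and a single interdependency multiplier stands in for the paper's first-order portfolio consequence model.

        # Sketch of the notional all-hazards risk product for a small asset portfolio:
        #   risk = threat likelihood x vulnerability (P(success)) x consequence
        # All values are hypothetical placeholders.
        assets = {
            # asset: (annual threat likelihood, vulnerability, direct consequence in M$,
            #         interdependency multiplier for downstream/portfolio losses)
            "substation A":   (0.02, 0.40, 50.0, 1.8),
            "control center": (0.01, 0.25, 120.0, 2.5),
            "pipeline pump":  (0.05, 0.60, 20.0, 1.2),
        }

        portfolio_risk = 0.0
        for name, (threat, vuln, consequence, interdep) in assets.items():
            risk = threat * vuln * consequence * interdep   # expected annual loss, M$
            portfolio_risk += risk
            print(f"{name:15s} expected annual loss: {risk:6.2f} M$")

        print(f"{'portfolio':15s} expected annual loss: {portfolio_risk:6.2f} M$")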

  18. Using HABIT to Establish the Chemicals Analysis Methodology for Maanshan Nuclear Power Plant

    OpenAIRE

    J. R. Wang; S. W. Chen; Y. Chiang; W. S. Hsu; J. H. Yang; Y. S. Tseng; C. Shih

    2017-01-01

    In this research, the HABIT analysis methodology was established for the Maanshan nuclear power plant (NPP). The Final Safety Analysis Report (FSAR), reports, and other data were used in this study. The HABIT methodology was used to evaluate the control room habitability under the CO2 storage burst. The HABIT result was below the R.G. 1.78 failure criterion. This indicates that Maanshan NPP habitability can be maintained. Additionally, the sensitivity study of the paramet...

  19. Risk analysis of critical infrastructures emphasizing electricity supply and interdependencies

    International Nuclear Information System (INIS)

    Kjølle, G.H.; Utne, I.B.; Gjerde, O.

    2012-01-01

    Failures in critical infrastructures can cause major damage to society. Wide-area interruptions (blackouts) in the electricity supply system have severe impacts on societal critical functions and other critical infrastructures, but there is no agreed-upon framework on how to analyze and predict the reliability of electricity supply. Thus, there is a need for an approach to cross-sector risk analyses, which facilitates risk analysis of outages in the electricity supply system and enables investigation of cascading failures and consequences in other infrastructures. This paper presents such an approach, which includes contingency analysis (power flow) and reliability analysis of power systems, as well as use of a cascade diagram for investigating interdependencies. A case study was carried out together with the Emergency Preparedness Group in the city of Oslo, Norway and the network company Hafslund Nett. The case study results highlight the need for cross-sector analyses by showing that the total estimated societal costs are substantially higher when cascading effects and consequences to other infrastructures are taken into account compared to only considering the costs of electricity interruptions as seen by the network company. The approach is a promising starting point for cross-sector risk analysis of electricity supply interruptions and consequences for dependent infrastructures.
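
    A bare-bones version of the cross-sector cost comparison can be sketched as follows: the expected energy not supplied for each outage scenario is priced once with a network-company interruption cost rate and once with an added multiplier representing cascading consequences in dependent infrastructures. All scenario frequencies, durations, cost rates and the multiplier are invented.

        # Sketch comparing network-company interruption costs with wider societal costs
        # when cascading effects on dependent infrastructures are included (toy numbers).
        scenarios = {
            # scenario: (annual frequency, duration in hours, interrupted power in MW)
            "single line outage":  (0.50, 2.0, 30.0),
            "double busbar fault": (0.05, 8.0, 200.0),
            "substation blackout": (0.01, 24.0, 400.0),
        }

        COST_PER_MWH = 4_000.0        # assumed direct interruption cost rate, EUR/MWh
        CASCADE_MULTIPLIER = 3.0      # assumed uplift for consequences in other infrastructures

        direct_total = societal_total = 0.0
        for name, (freq, hours, mw) in scenarios.items():
            eens = freq * hours * mw                 # expected energy not supplied, MWh/year
            direct = eens * COST_PER_MWH
            societal = direct * CASCADE_MULTIPLIER
            direct_total += direct
            societal_total += societal
            print(f"{name:22s} EENS = {eens:7.1f} MWh/a  direct cost = {direct/1e3:8.1f} kEUR/a")

        print(f"Direct cost total   : {direct_total/1e6:.2f} MEUR/a")
        print(f"Societal cost total : {societal_total/1e6:.2f} MEUR/a (with cascading effects)")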

  20. Sensitivity analysis of an experimental methodology to determine radionuclide diffusion coefficients in granite

    International Nuclear Information System (INIS)

    Alonso, U.; Missana, T.; Garcia-Gutierrez, M.; Patelli, A.; Rigato, V.

    2005-01-01

    Full text of publication follows: The long-term quantitative analysis of the migration behaviour of the relevant radionuclides (RN) within the geological barrier of a radioactive waste repository requires, amongst other data, the introduction of reliable transport parameters, such as diffusion coefficients. Since the determination of diffusion coefficients within crystalline rocks is complex and requires long experimental times even for non-sorbing radionuclides, the data available in the literature are very scarce. The nuclear ion beam technique RBS (Rutherford Backscattering Spectrometry), which is successfully used to determine diffusion profiles in thin-film science, is examined here as a possible technique to determine the diffusion coefficients of different RN within granite. As a first step, the sensitivity and limitations of the technique for analysing diffusion coefficients in granite samples are evaluated, considering that the technique is especially sensitive to heavy elements. The required experimental conditions in terms of experimental times, concentrations and methodology of analysis are discussed. The diffusants were selected taking into account the RBS sensitivity but also trying to cover different behaviours of critical RN and a wide range of possible oxidation states. In particular, Cs(I) was chosen as a representative fission product, while as relevant actinides or homologues, the diffusion of Th(IV), U(IV) and Eu(III) was studied. The diffusion of the above-mentioned cations is compared to the diffusion of Re and I as representatives of anionic species. The methodology allowed diffusion coefficients in the granite samples to be evaluated and, for most of the elements, the values obtained are in agreement with the values found in the literature. The calculated diffusion coefficients ranged from 10^-13 to 10^-16 m^2/s. It is remarkable that the RBS technique is especially promising for determining diffusion coefficients of highly sorbing RN and it is applicable to a wide range
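
    The extraction of a diffusion coefficient from a measured depth profile can be illustrated with a constant-source (complementary error function) solution fitted by least squares; the synthetic profile, diffusion time and assumed boundary condition below are for illustration only and do not reproduce the RBS data treatment of the paper.

        # Sketch: fitting a constant-source diffusion profile C(x,t) = C0 * erfc(x / (2*sqrt(D*t)))
        # to a (here synthetic) depth profile to recover an apparent diffusion coefficient D.
        import numpy as np
        from scipy.special import erfc
        from scipy.optimize import curve_fit

        t = 7 * 24 * 3600.0                     # assumed diffusion time: 7 days, in seconds
        depth = np.linspace(0.0, 1e-4, 40)      # depth below surface, m (0-100 micrometres)

        def profile(x, c0, log10_D):
            """Constant-source diffusion profile, with D parameterized as log10(D)."""
            D = 10.0 ** log10_D
            return c0 * erfc(x / (2.0 * np.sqrt(D * t)))

        # Synthetic "measured" profile with noise, generated with D_true = 1e-15 m^2/s
        rng = np.random.default_rng(1)
        c_meas = profile(depth, 1.0, -15.0) + rng.normal(0.0, 0.02, depth.size)

        popt, pcov = curve_fit(profile, depth, c_meas, p0=(1.0, -14.0))
        c0_fit, log10_d_fit = popt
        print(f"Fitted apparent diffusion coefficient: {10.0**log10_d_fit:.2e} m^2/s")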

  1. Benchmarking criticality analysis of TRIGA fuel storage racks.

    Science.gov (United States)

    Robinson, Matthew Loren; DeBey, Timothy M; Higginbotham, Jack F

    2017-01-01

    A criticality analysis was benchmarked to sub-criticality measurements of the hexagonal fuel storage racks at the United States Geological Survey TRIGA MARK I reactor in Denver. These racks, which hold up to 19 fuel elements each, are arranged at 0.61 m (2 feet) spacings around the outer edge of the reactor. A 3-dimensional model of the racks was created using MCNP5, and the model was verified experimentally by comparison to measured subcritical multiplication data collected during an approach-to-critical loading of two of the racks. The validated model was then used to show that in the extreme condition where the entire circumference of the pool is lined with racks loaded with used fuel, the storage array is subcritical with a k value of about 0.71, well below the regulatory limit of 0.8. A model was also constructed of the rectangular 2×10 fuel storage array used in many other TRIGA reactors to validate the technique against the original TRIGA licensing sub-critical analysis performed in 1966. The fuel used in this study was standard 20% enriched (LEU) aluminum- or stainless-steel-clad TRIGA fuel. Copyright © 2016. Published by Elsevier Ltd.
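
    The subcritical multiplication data mentioned above are typically reduced with an inverse-multiplication (1/M) plot: as fuel elements are added, 1/M is extrapolated toward zero to estimate the would-be critical loading, and k can be approximated from the measured multiplication. The detector counts below are invented for illustration.

        # Sketch of a 1/M (inverse multiplication) evaluation for an approach-to-critical
        # loading of a storage rack (hypothetical detector counts).
        import numpy as np

        baseline_counts = 200.0   # source-only count rate (no fuel loaded), counts/s
        loading = np.array([0, 4, 8, 12, 16, 19])                 # number of fuel elements
        counts = np.array([200., 232., 275., 333., 410., 480.])   # measured count rates

        M = counts / baseline_counts          # subcritical multiplication relative to source
        inv_M = 1.0 / M

        # Approximate k from multiplication: M ~ 1 / (1 - k)  =>  k ~ 1 - 1/M
        k_est = 1.0 - inv_M
        for n, im, k in zip(loading, inv_M, k_est):
            print(f"{n:2d} elements: 1/M = {im:.3f}, k ~ {k:.3f}")

        # Linear extrapolation of 1/M vs loading to 1/M = 0 (would-be critical loading)
        slope, intercept = np.polyfit(loading, inv_M, 1)
        critical_loading = -intercept / slope
        print(f"Extrapolated critical loading: {critical_loading:.1f} elements")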

  2. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    Science.gov (United States)

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  3. Interaction between core analysis methodology and nuclear design: some PWR examples

    International Nuclear Information System (INIS)

    Rothleder, B.M.; Eich, W.J.

    1982-01-01

    The interaction between core analysis methodology and nuclear design is exemplified by PSEUDAX, a major improvement related to the Advanced Recycle Methodology Program (ARMP) computer code system, still undergoing development by the Electric Power Research Institute. The mechanism of this interaction is explored by relating several specific nuclear design changes to the demands placed by these changes on the ARMP system, and by examining how these demands are met, first within the standard ARMP methodology and then through augmentation of the standard methodology by the development of PSEUDAX

  4. Development of design and analysis methodology for composite bolted joints

    Science.gov (United States)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/+/-45/90) family of laminates using filled hole and unnotched test data.

  5. I Frankenstein: from media critical reception to the semiological analysis

    Directory of Open Access Journals (Sweden)

    João Marcos Mateus Kogawa

    2017-10-01

    Full Text Available In 2014, the movie I, Frankenstein was released. The movie prompted comments from media critics, some of which we select as the object of our analysis. The analysis of the critical statements reveals a discourse organized around the axes of morality, profitability, traditionalism and temporality, producing a sense of disqualification: the movie is treated as something that 'hurts' the notion of the 'classic'. From this demonstration, this paper questions the claims that the new Frankenstein should respond to a tradition opened by Mary Shelley, and points to some senses that re-construct the contemporary myth. The new Frankenstein therefore requires an interrelationship between a technical apparatus - 3D technology - and a contemporary myth - an ideal of consumption oriented toward interactivity.

  6. LIFE CYCLE ASSESSMENT AND HAZARD ANALYSIS AND CRITICAL CONTROL POINTS TO THE PASTA PRODUCT

    Directory of Open Access Journals (Sweden)

    Yulexis Meneses Linares

    2016-10-01

    Full Text Available The objective of this work is to combine the Life Cycle Assessment (LCA) and Hazard Analysis and Critical Control Points (HACCP) methodologies for the determination of the risks that food production poses to human health and the ecosystem. The environmental performance of pasta production at the "Marta Abreu" Pasta Factory of Cienfuegos is assessed; the critical control points, determined by the biological hazards (fungi and pests) and the physical hazards (wood, paper, thread and ferromagnetic particles), were the raw materials - flour, semolina and their mixtures - and their disposition and extraction. Resources are the most affected damage category, owing to the consumption of fossil fuels.

  7. Critical analysis of the pedagogical practice of teacher trainees

    OpenAIRE

    Mónica Ruiz Quiroga; Cristian Camilo Ortiz Castiblanco; Jhider Soler Mejía

    2013-01-01

    This article reports the results of a research project supported by the Research Center of the Universidad Pedagógica Nacional, whose purpose was the redefinition of the training process of the students, within the framework of the pedagogical practice, in one of the research lines of the Degree in Elementary Education with emphasis on Social Sciences. On a theoretical level, analysis and discussion were developed from critical pedagogy, particularly the concepts of pedagogical practice, training an...

  8. Criticality safety and shielding analysis of WWER-440 fuel configurations

    International Nuclear Information System (INIS)

    Christoskov, I.

    2008-01-01

    An overview is made of some studies performed on the criticality safety and radiation shielding analysis of irradiated WWER-440 fuel storage and handling configurations. The analytical tools are based on the SCALE 4.4a code system, in combination with the TORT discrete ordinates transport code and the BUGLE-96 cross-sections library. The accuracy of some important results is assessed through comparison with independent evaluations and with measurement data. (author)

  9. A Critical Analysis of Attribute Development Programs for Army Leaders

    Science.gov (United States)

    2016-06-10

    implement a holistic approach to developing attributes within its members. These domains are human performance, psychological performance, spiritual ... (Master's thesis presented to the Faculty of the U.S. Army, dated 10-06-2016.)

  10. Criticality Analysis Of TCA Critical Lattices With MNCP-4C Monte Carlo Calculation

    International Nuclear Information System (INIS)

    Zuhair

    2002-01-01

    The use of uranium-plutonium mixed oxide (MOX) fuel in electricity-generating light water reactors (PWR, BWR) is being planned in Japan. Therefore, accuracy evaluations of neutronic analysis codes for MOX cores have been carried out by many scientists and reactor physicists. Benchmark evaluations for TCA were done using various calculation methods. The Monte Carlo method has become the most reliable way to predict the criticality of various reactor types. In this analysis, the MCNP-4C code was chosen because of the various advantages the code offers. All in all, the MCNP-4C calculation for the TCA core with 38 MOX critical lattice configurations gave results with high accuracy. The JENDL-3.2 library showed results significantly closer to those of ENDF/B-V. The k-eff values calculated with the ENDF/B-VI library were underestimated. The ENDF/B-V library gave the best estimation. It can be concluded that an MCNP-4C calculation, especially with the ENDF/B-V and JENDL-3.2 libraries, is the best choice for the core design of NPPs utilizing MOX fuel.

  11. American Offensive Funny Riddles: A Critical Metaphor Analysis

    Directory of Open Access Journals (Sweden)

    Ahmed Sahib Jabir Mubarak

    2018-01-01

    The paradox of offensive humor lies in the assumption that what evokes laughter can be harmful to someone. Linguistically, offense can be expressed directly or indirectly; humor, including riddles, is one of the most effective ways to show offense or aggression toward someone. Humor, on the other hand, is mostly expressed indirectly. Metaphoric forms are said to be among the most appealing strategies of humor language. The present study applies a critical metaphor analysis to some randomly selected American offensive humorous riddles related to various aspects of offense, such as race and nation. In this approach to critical discourse analysis, a cognitive dimension is added for the sake of analyzing figurative forms such as metaphor, which is considered an important part of ideology. Thus, critical metaphor analysis covers both social and cognitive aspects. It is concluded that offensive jokes (namely funny riddles) can be used as a tool to measure aggressiveness towards certain social aspects such as race; on the other hand, metaphors afford indications of facets of power, inequality and people's ideologies in American society.

  12. A methodology for stochastic analysis of share prices as Markov chains with finite states.

    Science.gov (United States)

    Mettle, Felix Okoe; Quaye, Enoch Nii Boi; Laryea, Ravenhill Adjetey

    2014-01-01

    Price volatilities make stock investments risky, leaving investors in a critical position when decisions must be made under uncertainty. To improve investors' confidence in evaluating exchange markets without resorting to time-series methodology, we specify equity price change as a stochastic process assumed to possess Markov dependency, with state transition probability matrices defined over the identified state space (decrease, stable or increase). We establish that the identified states communicate and that the chains are aperiodic and ergodic, thus possessing limiting distributions. We develop a methodology for determining the expected mean return time for stock price increases and also establish criteria for improving investment decisions based on the highest transition probabilities, lowest mean return time and highest limiting distributions. We further develop an R algorithm implementing the methodology introduced. The established methodology is applied to selected equities from Ghana Stock Exchange weekly trading data.
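
    The record above describes the core computations of that methodology. As a rough illustration of the workflow (not the authors' R implementation; the function names, three-state coding and sample state sequence are assumptions for this sketch), a Python version might look like:

```python
# Minimal sketch of the share-price Markov-chain workflow described above
# (states: decrease "D", stable "S", increase "I"). Function names and the
# sample state sequence are illustrative, not taken from the authors' R code.
import numpy as np

STATES = ["D", "S", "I"]

def estimate_transition_matrix(state_seq):
    """Count one-step transitions and normalise each row to probabilities."""
    idx = {s: i for i, s in enumerate(STATES)}
    counts = np.zeros((3, 3))
    for a, b in zip(state_seq[:-1], state_seq[1:]):
        counts[idx[a], idx[b]] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def limiting_distribution(P):
    """Stationary distribution pi solving pi P = pi (eigenvector of P.T for eigenvalue 1)."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    return pi / pi.sum()

# Weekly price-change states for one equity (hypothetical data).
seq = list("ISIIDSIIDDSIISIDIISSDII")
P = estimate_transition_matrix(seq)
pi = limiting_distribution(P)
mean_return_time = 1.0 / pi          # expected weeks between visits to each state
for s, p, m in zip(STATES, pi, mean_return_time):
    print(f"{s}: limiting prob = {p:.3f}, mean return time = {m:.2f} weeks")
```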

  13. Disposal criticality analysis for aluminum-based DOE fuels

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1997-11-01

    This paper describes the disposal criticality analysis for canisters containing aluminum-based Department of Energy fuels from research reactors. Different canisters were designed for disposal of highly enriched uranium (HEU) and medium enriched uranium (MEU) fuel. In addition to the standard criticality concerns in storage and transportation, such as flooding, the disposal criticality analysis must consider the degradation of the fuel and components within the waste package. Massachusetts Institute of Technology (MIT) U-Al fuel with 93.5% enriched uranium and Oak Ridge Research Reactor (ORR) U-Si-Al fuel with 21% enriched uranium are representative of the HEU and MEU fuel inventories, respectively. Conceptual canister designs with 64 MIT assemblies (16/layer, 4 layers) or 40 ORR assemblies (10/layer, 4 layers) were developed for these fuel types. Borated stainless steel plates were incorporated into a stainless steel internal basket structure within a 439 mm OD, 15 mm thick XM-19 canister shell. The Codisposal waste package contains 5 HLW canisters (represented by 5 Defense Waste Processing Facility canisters from the Savannah River Site) with the fuel canister placed in the center. It is concluded that without the presence of a fairly insoluble neutron absorber, the long-term action of infiltrating water can lead to a small, but significant, probability of criticality for both the HEU and MEU fuels. The use of 1.5 kg of Gd distributed throughout the MIT fuel and the use of carbon steels for the structural basket or 1.1 kg of Gd distributed in the ORR fuel will reduce the probability of criticality to virtually zero for both fuels.

  14. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  15. Methodological Analysis of Gregarious Behaviour of Agents in the Financial Markets

    OpenAIRE

    Solodukhin Stanislav V.

    2013-01-01

    The article considers methodological approaches to the analysis of gregarious (herd) behaviour of agents in the financial markets and also examines the foundations of agent-based modelling of decision-making processes that take the gregarious instinct into account.

  16. A Methodological Demonstration of Set-theoretical Approach to Social Media Maturity Models Using Necessary Condition Analysis

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan; Vatrapu, Ravi; Andersen, Kim Normann

    2016-01-01

    Despite being widely accepted and applied across research domains, maturity models have been criticized for lacking academic rigor; in particular, methodologically rigorous and empirically grounded or tested maturity models are quite rare. Attempting to close this gap, we adopt a set-theoretic approach by applying the Necessary Condition Analysis (NCA) technique to derive maturity stages and stage boundary conditions. The underlying ontology is to view stages (boundaries) in maturity models as a collection of necessary conditions. Using social media maturity data, we demonstrate the strength of our approach and evaluate some of the arguments presented by previous conceptually focused social media maturity models.

  17. Methodology for perceptual assessment of speech in patients with cleft palate: a critical review of the literature.

    Science.gov (United States)

    Lohmander, Anette; Olsson, Maria

    2004-01-01

    This review of 88 articles in three international journals was undertaken to investigate the methodology for perceptual speech assessment in patients with cleft palate. The articles were published between 1980 and 2000 in the Cleft Palate-Craniofacial Journal, the International Journal of Language and Communication Disorders, and Folia Phoniatrica et Logopaedica. The majority of the articles (76) were published in the Cleft Palate-Craniofacial Journal, with an increase in articles during the 1990s and 2000. Information about measures or variables was clearly given in all articles. However, the review raises several major concerns regarding the methods for data collection and documentation and the methods of measurement. The most distressing findings were the use of a cross-sectional design in studies of few patients with large age ranges and different types of clefts, the use of highly variable speech samples, and the lack of information about listeners and reliability. It is hoped that ongoing national and international collaborative efforts to standardize procedures for collection and analysis of perceptual data will help to eliminate such concerns and thus make comparison of published results possible in the future.

  18. Montecarlo simulation for a new high resolution elemental analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto [Universidad de La Frontera, Temuco (Chile). Facultad de Ingenieria y Administracion

    1996-12-31

    Spectra generated by binary, ternary and multielement matrixes irradiated by a variable-energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate appear when the photon energy is just above the absorption edge associated with each element, because of the emission of characteristic X-rays. For a given edge energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample into a 2π solid angle, associating a single multichannel spectrometer channel with each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high-resolution spectroscopy methodology, for which a synchrotron would be an ideal source, owing to its high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)

  19. Recent Methodologies for Creep Deformation Analysis and Its Life Prediction

    International Nuclear Information System (INIS)

    Kim, Woo-Gon; Park, Jae-Young; Iung

    2016-01-01

    To design materials for high-temperature service in the creep regime, various creep data are needed for codification, as follows: i) stress vs. creep rupture time for base metals and weldments (average and minimum), ii) stress vs. time to 1% total strain (average), iii) stress vs. time to onset of tertiary creep (minimum), iv) constitutive equations for constructing time- and temperature-dependent stress-strain curves (average), and v) isochronous stress-strain curves (average). Also, elevated-temperature components such as those used in modern power generation plants are designed using allowable stresses under creep conditions. The allowable stress is usually estimated on the basis of the creep rupture strength at up to 10^5 h at the operating temperature. The master curve based on the “sinh” function was found to have wider acceptance, with good flexibility in the low stress ranges beyond the experimental data. The proposed multi-C method for the Larson-Miller (LM) parameter gave better life predictions than a single-C method. These improved methodologies can be utilized to accurately predict the long-term creep life or strength of Gen-IV nuclear materials, which are designed for a life span of 60 years.
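
    For readers unfamiliar with the Larson-Miller (LM) parameter mentioned above, the following is a minimal single-C sketch of an LM master-curve fit and life prediction; the constant C, the linear master-curve form and the data points are illustrative assumptions, and the multi-C refinement proposed in the record is not reproduced here:

```python
# Illustrative sketch of a basic Larson-Miller parameter (LMP) fit:
# LMP = T * (C + log10(t_r)), with rupture life t_r in hours and T in kelvin.
# The data points and the single constant C are synthetic placeholders; the
# multi-C refinement discussed in the abstract is not reproduced here.
import numpy as np

C = 20.0  # commonly assumed LM constant for many steels (assumption)

# (temperature K, stress MPa, rupture time h) -- hypothetical test data
data = [
    (823.0, 200.0, 12000.0),
    (873.0, 200.0, 1500.0),
    (873.0, 150.0, 20000.0),
    (923.0, 150.0, 2500.0),
]

# Fit log10(stress) as a linear function of LMP (the "master curve").
lmp = np.array([T * (C + np.log10(tr)) for T, _, tr in data])
log_stress = np.log10([s for _, s, _ in data])
coeffs = np.polyfit(lmp, log_stress, deg=1)

def predicted_life(T, stress):
    """Invert the master curve to estimate rupture life (hours) at T (K) and stress (MPa)."""
    lmp_pred = (np.log10(stress) - coeffs[1]) / coeffs[0]
    return 10 ** (lmp_pred / T - C)

print(f"Estimated life at 848 K, 175 MPa: {predicted_life(848.0, 175.0):.0f} h")
```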

  20. Montecarlo simulation for a new high resolution elemental analysis methodology

    International Nuclear Information System (INIS)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto

    1996-01-01

    Spectra generated by binary, ternary and multielement matrixes irradiated by a variable-energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate appear when the photon energy is just above the absorption edge associated with each element, because of the emission of characteristic X-rays. For a given edge energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample into a 2π solid angle, associating a single multichannel spectrometer channel with each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high-resolution spectroscopy methodology, for which a synchrotron would be an ideal source, owing to its high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)

  1. CRITICAL RADIONUCLIDE AND PATHWAY ANALYSIS FOR THE SAVANNAH RIVER SITE

    Energy Technology Data Exchange (ETDEWEB)

    Jannik, T.

    2011-08-30

    This report is an update to the analysis, Assessment of SRS Radiological Liquid and Airborne Contaminants and Pathways, that was performed in 1997. An electronic version of this large original report is included on the CD attached to this report. During the operational history (1954 to the present) of the Savannah River Site (SRS), many different radionuclides have been released to the environment from the various production facilities. However, as shown by this updated radiological critical contaminant/critical pathway analysis, only a small number of the released radionuclides have been significant contributors to potential doses and risks to offsite people. The analysis covers radiological releases to the atmosphere and to surface waters, the principal media that carry contaminants offsite and potentially result in exposure of offsite people. Although the groundwater monitoring performed at the site shows that an estimated 5 to 10% of SRS has been contaminated by radionuclides, no evidence exists from the extensive monitoring performed that groundwater contaminated with these constituents has migrated off the site (SRS 2011). Therefore, with the notable exception of radiological source terms originating from shallow surface water migration into site streams, onsite groundwater was not considered a potential exposure pathway for offsite people. In addition, in response to Department of Energy (DOE) Order 435.1, several Performance Assessments (WSRC 2008; LWO 2009; SRR 2010; SRR 2011) and a comprehensive SRS Composite Analysis (SRNO 2010) have recently been completed at SRS. The critical radionuclides and pathways identified in these extensive reports are discussed and, where applicable, included in this analysis.

  2. Eco-efficiency analysis methodology on the example of the chosen polyolefins production

    OpenAIRE

    K. Czaplicka-Kolarz; D. Burchart-Korol; P. Krawczyk

    2010-01-01

    the chosen polyolefins production. The article also presents the main tools of eco-efficiency analysis: Life Cycle Assessment (LCA) and Net Present Value (NPV). Design/methodology/approach: On the basis of the LCA and NPV of high-density polyethylene (HDPE) and low-density polyethylene (LDPE) production, an eco-efficiency analysis is conducted. Findings: In this article the environmental and economic performance of the chosen polyolefins production is presented. The basic phases of the eco-efficiency methodology...

  3. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a Level-I probabilistic safety assessment. The work covers: the correct association of distributions with parameters, the weighting and qualification of expert opinions, the generation of samples according to sample size, and the study of the relationships between system variables and the system response. A series of statistical-mathematical techniques is recommended along the development of the analysis methodology, as well as different graphical visualizations for controlling the study. (author)

  4. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants, and the GO methodology is one of them. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu in order to examine its applicability to piping systems. Through this analysis, the authors identified some disadvantages of the GO methodology. In the GO methodology, a signal represents a single on-to-off or off-to-on transition; GO therefore finds the time point at which the state of a system changes, but it cannot treat a system whose state changes as off-on-off. Several computer runs are required to obtain the time-dependent failure probability of a system. To overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those of the GO methodology, but the meaning of the signal and time points, and the definitions of the operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given.

  5. Toward a computer-aided methodology for discourse analysis ...

    African Journals Online (AJOL)

    aided methods to discourse analysis”. This project aims to develop an e-learning environment dedicated to documenting, evaluating and teaching the use of corpus linguistic tools suitable for interpretative text analysis. Even though its roots are in ...

  6. Proteome analysis of Saccharomyces cerevisiae: a methodological outline

    DEFF Research Database (Denmark)

    Fey, S J; Nawrocki, A; Görg, A

    1997-01-01

    Proteome analysis offers a unique means of identifying important proteins, characterizing their modifications and beginning to describe their function. This is achieved through the combination of two technologies: protein separation and selection by two-dimensional gel electrophoresis, and protei...

  7. New enhancements to SCALE for criticality safety analysis

    International Nuclear Information System (INIS)

    Hollenbach, D.F.; Bowman, S.M.; Petrie, L.M.; Parks, C.V.

    1995-01-01

    As the speed, available memory, and reliability of computer hardware increase and the cost decreases, the complexity and usability of computer software will increase, taking advantage of the new hardware capabilities. Computer programs today must be more flexible and user friendly than those of the past. Within available resources, the SCALE staff at Oak Ridge National Laboratory (ORNL) is committed to upgrading its computer codes to keep pace with the current level of technology. This paper examines recent additions and enhancements to the criticality safety analysis sections of the SCALE code package. These recent additions and enhancements can be divided into nine categories: (1) new analytical computer codes, (2) new cross-section libraries, (3) new criticality search sequences, (4) enhanced graphical capabilities, (5) additional KENO enhancements, (6) enhanced resonance processing capabilities, (7) enhanced material information processing capabilities, (8) portability of the SCALE code package, and (9) other minor enhancements, modifications, and corrections to SCALE. Each of these additions and enhancements to the criticality safety analysis capabilities of the SCALE code system is discussed below.

  8. Theoretical and methodological analysis of personality theories of leadership

    OpenAIRE

    Оксана Григорівна Гуменюк

    2016-01-01

    A psychological analysis is conducted of the personality theories of leadership, which form the basis for other conceptual approaches to understanding the nature of leadership. The conceptual approaches to leadership are analyzed with priority given to personality theories, including the heroic, psychoanalytic, «trait», charismatic and five-factor theories. It is noted that the psychological analysis of personality theories is important for understanding the nature of leadership.

  9. Sports Management for Sports Massification Planned and Executed by Social Organizations: Critique of Models and Experiences, and a Proposal for Methodological Accompaniment

    Directory of Open Access Journals (Sweden)

    Lorenza Antonia Reyes de Duran

    2016-08-01

    This proposal for analysis, interpretation, deconstruction, self-criticism and guidance is born of work experience in mass sport planned by social organizations, set against - though not in the conventional comparative sense - private business and state models of sport management. The contributions made to sports management from positions of power, whether state or business, are undeniable, and their impact is difficult to express in numbers given their incalculable humanistic value. However, it is urgent to highlight the products and results achieved by some sport-related social organizations, such as the reference cases of Higuerón and the Independence municipality of Yaracuy state. From a dialectical analysis of reality, we try to understand the complex system involved in high-level sports management and performance, and we introduce praxis-based notions of sporting activity and an associated approach applied to social and community work. Opening and closing action-research processes, we make a proposal for accompanying sports management for massification by and from social organizations. This proposal is constructed from an innovative, flexible, open and inclusive curriculum with social, community and organizational relevance.

  10. Methodology Series Module 6: Systematic Reviews and Meta-analysis.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Systematic reviews and meta-analyses have become an important part of the biomedical literature, and they provide the "highest level of evidence" for various clinical questions. There are many studies - sometimes with contradictory conclusions - on a particular topic in the literature. Hence, as a clinician, which results will you believe? What will you tell your patient? Which drug is better? A systematic review or a meta-analysis may help us answer these questions. In addition, it may also help us understand the quality of the articles in the literature or the type of studies that have been conducted and published (for example, randomized trials or observational studies). The first step is to identify a research question for the systematic review or meta-analysis. The next step is to identify the articles that will be included in the study. This is done by searching various databases; it is important that the researcher search for articles in more than one database. It is also useful to form a group of researchers and statisticians with expertise in conducting systematic reviews and meta-analyses before initiating them. We strongly encourage readers to register their proposed review/meta-analysis with PROSPERO. Finally, these studies should be reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist.

  11. System implementation of hazard analysis and critical control points (HACCP) in a nitrogen production plant

    International Nuclear Information System (INIS)

    Barrantes Salazar, Alexandra

    2014-01-01

    A hazard analysis and critical control points (HACCP) system is deployed in a liquid nitrogen production plant. The main motivation is that nitrogen has become a complement in food packaging, used to increase shelf life or to provide a barrier that protects the product from manipulation. The methodology adopted was the analysis of critical control points for the nitrogen production plant. Knowledge of both the standard and the production process, as well as on-site verification, was necessary. In addition, all materials and/or processing units found in contact with the raw material or the product under study were evaluated, so that the intrinsic risks of each were identified from the physical, chemical and biological points of view, according to the origin or source of contamination. For each risk found, the probability of occurrence was evaluated according to its frequency and severity; with these variables determined, the type of risk detected was defined. Cases presenting a greater or critical risk were subjected to the decision tree, from which it was concluded that no critical control points were determined. However, maximum permitted limits were established for each of them. Each result is supported by literature or scientific references of reliable provenance, properly indicating the basis for the matter evaluated. In general, the material matrix and the process matrix were found to have no critical control points, so the project concludes at the analysis stage without generating a monitoring and verification system. Extending this project to cover the gaseous nitrogen packaging system is suggested, since the present work was limited to liquid nitrogen. Furthermore, liquid nitrogen production is a fully automated, closed process, so the introduction of contaminants is very limited, unlike the gaseous nitrogen process. (author)

  12. Critical slowing down and error analysis in lattice QCD simulations

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, Stefan [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Sommer, Rainer; Virotta, Francesco [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC

    2010-09-15

    We study the critical slowing down towards the continuum limit of lattice QCD simulations with Hybrid Monte Carlo type algorithms. In particular for the squared topological charge we find it to be very severe with an effective dynamical critical exponent of about 5 in pure gauge theory. We also consider Wilson loops which we can demonstrate to decouple from the modes which slow down the topological charge. Quenched observables are studied and a comparison to simulations of full QCD is made. In order to deal with the slow modes in the simulation, we propose a method to incorporate the information from slow observables into the error analysis of physical observables and arrive at safer error estimates. (orig.)
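
    A generic, simplified sketch of autocorrelation-aware error analysis of a Monte Carlo history is shown below; it illustrates the kind of error inflation the record is concerned with, but it is not the specific method (incorporating information from slow observables) proposed by the authors, and the AR(1) history standing in for a slow observable is synthetic:

```python
# Simplified sketch: estimate the integrated autocorrelation time tau_int of a
# Monte Carlo history and inflate the naive error by sqrt(2*tau_int). This is a
# generic illustration of autocorrelation-aware error analysis, not the specific
# method (coupling in slow observables) proposed in the paper. The synthetic
# AR(1) history stands in for a slow observable such as the topological charge.
import numpy as np

def tau_int(series, window=None):
    """Integrated autocorrelation time with a simple fixed summation window."""
    x = np.asarray(series, dtype=float) - np.mean(series)
    n = len(x)
    window = window or n // 20
    var = np.dot(x, x) / n
    rho = [np.dot(x[:-t], x[t:]) / ((n - t) * var) for t in range(1, window)]
    return 0.5 + sum(rho)

rng = np.random.default_rng(0)
# Synthetic slow observable: AR(1) process with autocorrelation 0.95.
history = np.empty(20000)
history[0] = 0.0
for i in range(1, len(history)):
    history[i] = 0.95 * history[i - 1] + rng.normal()

naive_err = history.std(ddof=1) / np.sqrt(len(history))
t_int = tau_int(history)
print(f"tau_int ~ {t_int:.1f}")
print(f"naive error {naive_err:.4f} -> corrected {naive_err * np.sqrt(2 * t_int):.4f}")
```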

  13. Critical slowing down and error analysis in lattice QCD simulations

    International Nuclear Information System (INIS)

    Schaefer, Stefan; Sommer, Rainer; Virotta, Francesco

    2010-09-01

    We study the critical slowing down towards the continuum limit of lattice QCD simulations with Hybrid Monte Carlo type algorithms. In particular for the squared topological charge we find it to be very severe with an effective dynamical critical exponent of about 5 in pure gauge theory. We also consider Wilson loops which we can demonstrate to decouple from the modes which slow down the topological charge. Quenched observables are studied and a comparison to simulations of full QCD is made. In order to deal with the slow modes in the simulation, we propose a method to incorporate the information from slow observables into the error analysis of physical observables and arrive at safer error estimates. (orig.)

  14. The Analysis of SBWR Critical Power Bundle Using Cobrag Code

    Directory of Open Access Journals (Sweden)

    Yohannes Sardjono

    2013-03-01

    Full Text Available The coolant mechanism of SBWR is similar with the Dodewaard Nuclear Power Plant (NPP in the Netherlands that first went critical in 1968. The similarity of both NPP is cooled by natural convection system. These coolant concept is very related with same parameters on fuel bundle design especially fuel bundle length, core pressure drop and core flow rate as well as critical power bundle. The analysis was carried out by using COBRAG computer code. COBRAG computer code is GE Company proprietary. Basically COBRAG computer code is a tool to solve compressible three-dimensional, two fluid, three field equations for two phase flow. The three fields are the vapor field, the continuous liquid field, and the liquid drop field. This code has been applied to analyses model flow and heat transfer within the reactor core. This volume describes the finitevolume equations and the numerical solution methods used to solve these equations. This analysis of same parameters has been done i.e.; inlet sub cooling 20 BTU/lbm and 40 BTU/lbm, 1000 psi pressure and R-factor is 1.038, mass flux are 0.5 Mlb/hr.ft2, 0.75 Mlb/hr.ft2, 1.00 Mlb/hr.ft2 and 1.25 Mlb/hr.ft2. Those conditions based on history operation of some type of the cell fuel bundle line at GE Nuclear Energy. According to the results, it can be concluded that SBWR critical power bundle is 10.5 % less than current BWR critical power bundle with length reduction of 12 ft to 9 ft.

  15. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  16. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed
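
    As background to the AOT/STI evaluations mentioned above, the sketch below shows a textbook approximation of how a surveillance test interval and an allowed outage time enter a standby component's time-averaged unavailability; it is a generic illustration with assumed parameter values, not the detailed SOCRATES model:

```python
# Generic sketch of how surveillance test intervals (STI) and allowed outage
# times (AOT) enter a standby component's time-averaged unavailability, the kind
# of quantity a risk-based technical-specification evaluation varies. The simple
# formula below is a textbook approximation, not the detailed SOCRATES model.

def average_unavailability(failure_rate, sti_hours, repair_freq_per_year, aot_hours):
    """lambda*T/2 term for undetected failures between tests, plus the fraction
    of the year the component is deliberately out of service under its AOT."""
    test_term = failure_rate * sti_hours / 2.0
    outage_term = repair_freq_per_year * aot_hours / 8760.0
    return test_term + outage_term

base = average_unavailability(1e-5, sti_hours=720, repair_freq_per_year=2, aot_hours=24)
extended = average_unavailability(1e-5, sti_hours=2160, repair_freq_per_year=2, aot_hours=72)
print(f"baseline unavailability:  {base:.2e}")
print(f"extended STI/AOT:         {extended:.2e} ({extended / base:.1f}x)")
```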

  17. Analysis of Combined Data from Heterogeneous Study Designs: A Methodological Proposal from the Patient Navigation Research program

    Science.gov (United States)

    Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; PNRP Design and Analysis Committee

    2013-01-01

    Background: The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose: To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods: The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed included: 1) analyzing and reporting each project separately; 2) combining data from all projects and performing an individual-level analysis; 3) pooling data from projects having similar study designs; 4) analyzing pooled data using a prospective meta-analytic technique; 5) analyzing pooled data utilizing a novel simulated group-randomized design. Results: Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and in their impact from differing project sample sizes. Limitations: The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions: The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become

  18. Uncertainty analysis on probabilistic fracture mechanics assessment methodology

    International Nuclear Information System (INIS)

    Rastogi, Rohit; Vinod, Gopika; Chandra, Vikas; Bhasin, Vivek; Babar, A.K.; Rao, V.V.S.S.; Vaze, K.K.; Kushwaha, H.S.; Venkat-Raj, V.

    1999-01-01

    Fracture mechanics has found profound usage in the design of components and in assessing fitness for purpose and estimating the residual life of operating components. Since defect size and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability. Monte Carlo simulation is one such procedure for the analysis of fracture probability. This paper deals with uncertainty analysis using Monte Carlo simulation methods. These methods were developed based on the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of this method is illustrated with a case study. (author)

  19. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    Science.gov (United States)

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims at a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separated and undifferentiated waste. The methodology allows cost-efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. It supports variance analysis that can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently, and it can be implemented by companies lacking a sophisticated management accounting system.
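
    A toy sketch of the standard-cost-times-actual-quantity idea underlying the proposed FCA methodology is given below; the collection streams, unit costs and booked cost are hypothetical and are not taken from the study:

```python
# Toy sketch of the full-cost-accounting idea described above: collection cost
# built from standard unit costs applied to actual collected quantities, plus a
# variance against the cost actually booked. Cost categories and figures are
# hypothetical, not taken from the study.
STANDARD_COST_PER_TONNE = {          # euro/tonne, by collection stream (assumed)
    "paper": 95.0,
    "glass": 60.0,
    "organic": 110.0,
    "residual": 70.0,
}

actual_tonnes = {"paper": 1200, "glass": 800, "organic": 1500, "residual": 4000}
actual_booked_cost = 690_000.0       # euro, from the municipality's accounts (assumed)

standard_cost = sum(STANDARD_COST_PER_TONNE[s] * q for s, q in actual_tonnes.items())
variance = actual_booked_cost - standard_cost

print(f"standard cost at actual quantities: {standard_cost:,.0f} EUR")
print(f"variance vs booked cost:            {variance:+,.0f} EUR "
      f"({'off-standard' if variance > 0 else 'favourable'})")
```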

  20. Analysis of market competitive structure: A new methodological approach based on usage situations

    International Nuclear Information System (INIS)

    Romero de la Fuente, J.; Yague Guillen, M. J.

    2007-01-01

    This paper proposes a new methodological approach to identifying market competitive structure, applying the usage-situation concept in positioning analysis. The dimensions used by consumers to classify products are identified using Correspondence Analysis, and competitive groups are formed. The results are validated with Discriminant Analysis. (Author) 23 refs.

  1. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    Hennig, D.; Nechvatal, L.

    1996-09-01

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operation points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs

  2. Critical Thinking Development in Pharmacy Education: A Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Michael J Peeters

    2016-03-01

    Objective: The investigators aimed to summarize prior studies of critical thinking development among pharmacy students, using the California Critical Thinking Skills Test (CCTST), Health Sciences Reasoning Test (HSRT), and Defining Issues Test (DIT). Methods: Independently, two investigators (KLZ, MJP) systematically searched the available literature using PubMed, Google Scholar, ERIC, PsychInfo, as well as pharmacy education conference abstracts in the American Journal of Pharmaceutical Education. Their search terms were ‘pharmacy’ and [‘critical thinking’, ‘HSRT’, ‘CCTST’, and ‘DIT’]. Studies included were those that investigated pharmacy students, used one of the tests (CCTST, HSRT, DIT), and used a longitudinal design with test administration at two or more time-points for the same subjects (i.e., development). On review, the CCTST and HSRT seem more foundational to analytical/critical thinking, while the DIT appears to measure moral/complex thinking. Summarizing used meta-analysis with Cohen’s d and random-effects modelling. Results: Five studies involved thinking development, with 10 separate cohorts for meta-analysis (8 cohorts for CCTST, 2 for DIT, and 0 for HSRT). At 5 institutions, 407 and 1148 students were included (CCTST and DIT, respectively). For the CCTST, the overall effect was 0.33 (0.19-0.47 95% CI) with some heterogeneity among study cohorts (I2=52%). For the DIT, the overall effect was -0.23 (-0.83 to 0.37 95% CI) with considerable heterogeneity between study cohorts (I2=95%). For the CCTST and DIT, some studies showed effect sizes greater than 0.5. A meta-analysis of the HSRT could not be conducted (i.e., 0 studies found). Implications: While measuring different aspects of “critical thinking”, the CCTST and DIT showed responsiveness to change and appear to be promising measures of cognitive development. These tests should be used in further well-designed research studies that explore strategies for improving cognitive
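
    The pooled effects and I2 values quoted above come from random-effects meta-analysis; a minimal DerSimonian-Laird sketch of that computation is shown below, with hypothetical per-cohort effect sizes and variances rather than the CCTST/DIT data:

```python
# Minimal DerSimonian-Laird random-effects pooling with an I^2 heterogeneity
# statistic, the kind of computation behind the pooled effects reported above.
# The per-cohort effect sizes and variances below are hypothetical placeholders,
# not the CCTST/DIT data from the meta-analysis.
import math

effects = [0.25, 0.40, 0.15, 0.55, 0.30]      # Cohen's d per cohort (hypothetical)
variances = [0.02, 0.03, 0.05, 0.04, 0.03]    # sampling variance of each d

w = [1.0 / v for v in variances]                       # fixed-effect weights
fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
Q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
df = len(effects) - 1
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / C)                          # between-study variance
I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
se = math.sqrt(1.0 / sum(w_re))
print(f"pooled d = {pooled:.2f} (95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f}), "
      f"I^2 = {I2:.0f}%")
```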

  3. A study of software safety analysis system for safety-critical software

    International Nuclear Information System (INIS)

    Chang, H. S.; Shin, H. K.; Chang, Y. W.; Jung, J. C.; Kim, J. H.; Han, H. H.; Son, H. S.

    2004-01-01

    The core factors and requirements for safety-critical software are traced, and the methodology adopted in each stage of the software life cycle is presented. In the concept phase, a Failure Modes and Effects Analysis (FMEA) for the system was performed. A feasibility evaluation of the selected safety parameter was performed, and a Preliminary Hazards Analysis list was prepared using the HAZOP (Hazard and Operability) technique. A checklist for management control was produced via a walk-through technique. Based on the evaluation of the checklist, the activities to be performed in the requirement phase were determined. In the design phase, hazard analysis was performed with Fault Tree Analysis (FTA) to check the safety capability of the system with regard to the safety software algorithm. In the test phase, the test items based on the FMEA were checked for fitness, guided by an accident scenario. The pressurizer low-pressure trip algorithm was selected as a sample for applying the FTA method to software safety analysis. By applying a CASE tool, the requirements traceability of the safety-critical system was enhanced throughout all software life cycle phases.

  4. SAFETY ANALYSIS METHODOLOGY FOR AGED CANDU® 6 NUCLEAR REACTORS

    Directory of Open Access Journals (Sweden)

    WOLFGANG HARTMANN

    2013-10-01

    This paper deals with the safety analysis for CANDU® 6 nuclear reactors as affected by aging of the main Heat Transport System (HTS). Operational and aging-related changes of the HTS throughout its lifetime may lead to restrictions in certain safety system settings and hence some restriction in performance under certain conditions. A step in confirming safe reactor operation is the tracking of relevant data and their interpretation using appropriate thermalhydraulic analytical models. Safety analyses ranging from the assessment of safety limits associated with the prevention of intermittent fuel sheath dryout for a slow Loss of Regulation (LOR) analysis to fission gas release after a fuel failure are summarized. Specifically for fission gas release, the thermalhydraulic analysis for a fresh core and an 11 Effective Full Power Years (EFPY) aged core is summarized, leading to the most severe stagnation break sizes for the inlet feeder break and the channel failure time. The associated coolant conditions provide the input data for the fuel analyses. Based on the thermalhydraulic data, the fission product inventory under normal operating conditions may be calculated for both fresh and aged cores, and the fission gas release may be evaluated during the transient. This analysis plays a major role in determining possible radiation doses to the public after postulated accidents.

  5. Methodological aspects in the analysis of spontaneously produced sputum

    NARCIS (Netherlands)

    Out, T. A.; Jansen, H. M.; Lutter, R.

    2001-01-01

    Analysis of sputum as a specimen containing inflammatory indices has gained considerable interest during the last decade with focus on chronic bronchitis (CB) with or without airway obstruction, cystic fibrosis (CF), chronic obstructive pulmonary disease (COPD) and asthma. The nature of the

  6. Recent methodology in the phytochemical analysis of ginseng

    NARCIS (Netherlands)

    Angelova, N.; Kong, H.-W.; Heijden, R. van de; Yang, S.-Y.; Choi, Y.H.; Kim, H.K.; Wang, M.; Hankemeier, T.; Greef, J. van der; Xu, G.; Verpoorte, R.

    2008-01-01

    This review summarises the most recent developments in ginseng analysis, in particular the novel approaches in sample pre-treatment and the use of high-performance liquid-chromatography-mass spectrometry. The review also presents novel data on analysing ginseng extracts by nuclear magnetic resonance

  7. Important Literature in Endocrinology: Citation Analysis and Historical Methodology.

    Science.gov (United States)

    Hurt, C. D.

    1982-01-01

    The results of a study comparing two approaches to the identification of important literature in endocrinology reveal that the association between the rankings of cited items obtained with the two methods is not statistically significant and that the use of citation or historical analysis alone will not result in the same set of literature. Forty-two sources are appended. (EJS)

  8. Boolean comparative analysis of qualitative data : a methodological note

    NARCIS (Netherlands)

    Romme, A.G.L.

    1995-01-01

    This paper explores the use of Boolean logic in the analysis of qualitative data, especially on the basis of so-called process theories. Process theories treat independent variables as necessary conditions which are binary rather than variable in nature, while the dependent variable is a final

  9. A methodological approach for the biomechanical cause analysis of golf-related lumbar spine injuries.

    Science.gov (United States)

    Sim, Taeyong; Jang, Dong-Jin; Oh, Euichaul

    2014-01-01

    A new methodological approach employing mechanical work (MW) determination and analysis of the relative portions of its elements was applied to investigate the biomechanical causes of golf-related lumbar spine injuries. Kinematic and kinetic parameters at the lumbar and lower limb joints were measured during the downswing in 18 golfers. The MW at the lumbar joint (LJ) was smaller than that at the right hip but larger than the MWs at the other joints. The contribution of joint angular velocity (JAV) to MW was much greater than that of net muscle moment (NMM) at the LJ, whereas the contribution of NMM to MW was greater than or similar to that of JAV at the other joints. Thus, the contribution of JAV to MW is likely more critical in terms of the probability of golf-related injury than that of NMM. The MW-based golf-related injury index (MWGII), proposed as the ratio of the contribution of JAV to MW to that of NMM, at the LJ (1.55) was significantly greater than those at the other joints, which is consistent with the frequent occurrence of golf-related injuries around the lumbar spine. Therefore, both MW and MWGII should be considered when investigating the biomechanical causes of lumbar spine injuries.

  10. CANDU safety analysis system establishment; development of trip coverage and multi-dimensional hydrogen analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H. [KOPEC, Taejon (Korea)

    2002-03-01

    The trip coverage analysis model requires the geometry network for the primary and secondary circuits as well as the plant control system in order to simulate all possible plant operating conditions throughout the plant life. The model was validated against power maneuvering and the Wolsong 4 commissioning tests. The trip coverage map was produced for the large-break loss of coolant accident and the complete loss of Class IV power event. Reliable multi-dimensional hydrogen analysis requires a high capability for thermal-hydraulic modelling. To acquire such a basic capability and verify the applicability of the GOTHIC code, assessments of the heat transfer model and of the hydrogen mixing and combustion models were performed. Also, an assessment methodology for flame acceleration and deflagration-to-detonation transition is established. 22 refs., 120 figs., 31 tabs. (Author)

  11. Big and complex data analysis methodologies and applications

    CERN Document Server

    2017-01-01

    This volume conveys some of the surprises, puzzles and success stories in high-dimensional and complex data analysis and related fields. Its peer-reviewed contributions showcase recent advances in variable selection, estimation and prediction strategies for a host of useful models, as well as essential new developments in the field. The continued and rapid advancement of modern technology now allows scientists to collect data of unprecedented size and complexity. Examples include epigenomic data, genomic data, proteomic data, high-resolution image data, high-frequency financial data, functional and longitudinal data, and network data. Simultaneous variable selection and estimation is one of the key statistical problems involved in analyzing such big and complex data. The purpose of this book is to stimulate research and foster interaction between researchers in the area of high-dimensional data analysis. More concretely, its goals are to: 1) highlight and expand the breadth of existing methods in...
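
    As a small, generic illustration of the simultaneous variable selection and estimation problem highlighted above (not code from the volume itself), an L1-penalised regression on synthetic high-dimensional data can be sketched as follows:

```python
# Small illustration of simultaneous variable selection and estimation in a
# high-dimensional setting, using an L1-penalised (lasso) regression on
# synthetic data. This is a generic example of the problem class the volume
# addresses, not code from any of its chapters.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(42)
n, p, k = 100, 500, 5                 # many more predictors than observations
X = rng.normal(size=(n, p))
true_coef = np.zeros(p)
true_coef[:k] = [3.0, -2.0, 1.5, 2.5, -1.0]
y = X @ true_coef + rng.normal(scale=0.5, size=n)

model = LassoCV(cv=5).fit(X, y)       # penalty chosen by cross-validation
selected = np.flatnonzero(model.coef_)
print(f"selected {selected.size} of {p} predictors: {selected[:10]}")
```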

  12. Environmental analysis applied to schools. Methodologies for data acquisition

    International Nuclear Information System (INIS)

    Andriola, L.; Ceccacci, R.

    2001-01-01

    The environmental analysis is the basis of environmental management for organizations and is considered the first step in EMAS. It makes it possible to identify and deal with the relevant issues and to gain a clear knowledge of an organization's environmental performance. Schools can be counted among such organizations. Nevertheless, the complexity of environmental issues and applicable regulations makes it very difficult for a school that wants to implement an environmental management system (EMAS, ISO 14001, etc.) to face this first step. Therefore, an instrument has been defined that is simple yet complete and coherent with the reference standards, to let schools shape their own process for preparing the initial environmental review. This instrument consists essentially of cards that, when completed, facilitate the drafting of the environmental analysis report.

  13. Development of analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kim, Cheol Woo; Kwon, Young Min; Kim, Sook Kwan

    1995-04-01

    A study for the development of an analysis methodology for hot leg break mass and energy release has been performed. For the blowdown period a modified CEFLASH-4A methodology is suggested. For the post-blowdown period a modified CONTRAST boil-off model is suggested. By using these computer codes, improved mass and energy release data are generated. A RELAP5/MOD3 analysis was also performed, and finally the FLOOD-3 computer code has been modified for use in the analysis of hot leg breaks. The results of the analysis using the modified FLOOD-3 are reasonable, as expected, and their trends are good. 66 figs., 8 tabs. (Author)

  14. Sustainable development goals for health promotion: a critical frame analysis.

    Science.gov (United States)

    Spencer, Grace; Corbin, J Hope; Miedema, Esther

    2018-05-25

    The Sustainable Development Goals (SDGs) lay the foundations for supporting global health and international development work for the next 15 years. Thirty years ago, the Ottawa Charter defined health promotion and outlined key principles for global action on health, including the importance of advocating, enabling and mediating for health equity. Advocacy underscores a human right to health and suggests political action to support its attainment. Enabling speaks to health promotion's focus on the empowerment of people and communities to take control over their health and aspirations. Mediation draws attention to the critical intersectoral partnerships required to address health and social inequities. Underpinned by this approach, the aim of this paper is to consider how key health promotion principles, namely, rights, empowerment and partnership feature (and are framed) within the SDGs and to consider how these framings may shape future directions for health promotion. To that end, a critical frame analysis of the Transforming Our World document was conducted. The analysis interrogated varying uses and meanings of partnerships, empowerment and rights (and their connections) within the SDGs. The analysis here presents three framings from the SDGs: (1) a moral code for global action on (in)equity; (2) a future orientation to address global issues yet devoid of history; and (3) a reductionist framing of health as the absence of disease. These framings raise important questions about the underpinning values of the SDGs and pathways to health equity - offering both challenges and opportunities for defining the nature and scope of health promotion.

  15. Exploratory market structure analysis. Topology-sensitive methodology.

    OpenAIRE

    Mazanec, Josef

    1999-01-01

    Given the recent abundance of brand choice data from scanner panels, market researchers have neglected the measurement and analysis of perceptions. Heterogeneity of perceptions is still a largely unexplored issue in market structure and segmentation studies. Over the last decade various parametric approaches toward modelling segmented perception-preference structures, such as combined MDS and Latent Class procedures, have been introduced. These methods, however, are not tailored for qualitative ...

  16. On the methodology of the analysis of Moessbauer spectra

    International Nuclear Information System (INIS)

    Vandenberghe, R.E.; Grave, E. de; Bakker, P.M.A. de

    1994-01-01

    A review is presented of the direct fitting procedures used in the analysis of Moessbauer spectra. Direct lineshape fitting with alternative profiles, as well as shape-dependent, shape-independent and quasi shape-independent distribution fitting methods, can all easily be incorporated into one computer program scheme, yielding great versatility for modification and/or extension of the programs according to the specific spectra. (orig.)

  17. Methodological issues underlying multiple decrement life table analysis.

    Science.gov (United States)

    Mode, C J; Avery, R C; Littman, G S; Potter, R G

    1977-02-01

    In this paper, the actuarial method of multiple decrement life table analysis of censored, longitudinal data is examined. The discussion is organized in terms of the first segment of usage of an intrauterine device. Weaknesses of the actuarial approach are pointed out, and an alternative approach, based on the classical model of competing risks, is proposed. Finally, the actuarial and the alternative method of analyzing censored data are compared, using data from the Taichung Medical Study on Intrauterine Devices.
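
    The competing-risks alternative mentioned above can be illustrated with a nonparametric cumulative incidence (Aalen-Johansen) estimate; the tiny IUD-style data set and the cause coding below are invented purely to show the mechanics:

```python
# Sketch of a nonparametric cumulative incidence estimate under competing risks
# with right-censoring (the competing-risks alternative to the actuarial method
# discussed above). Event codes: 0 = censored, 1 = accidental pregnancy,
# 2 = other termination (expulsion, removal, ...). The tiny data set is invented
# purely to show the mechanics.
import numpy as np

times  = np.array([3, 5, 5, 7, 8, 10, 12, 12, 15, 18], dtype=float)  # months of use
events = np.array([1, 0, 2, 1, 0,  2,  1,  0,  2,  0])               # cause codes

def cumulative_incidence(times, events, cause):
    """Aalen-Johansen estimate of P(event of the given cause by time t)."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    n_at_risk = len(times)
    surv = 1.0          # overall event-free survival just before the current time
    cif = 0.0
    out = []
    for t in np.unique(times):
        at_t = times == t
        d_cause = np.sum(at_t & (events == cause))
        d_any = np.sum(at_t & (events != 0))
        cif += surv * d_cause / n_at_risk
        surv *= 1.0 - d_any / n_at_risk
        n_at_risk -= np.sum(at_t)
        out.append((t, cif))
    return out

for t, p in cumulative_incidence(times, events, cause=1):
    print(f"month {t:>4.0f}: cumulative incidence of pregnancy = {p:.3f}")
```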

  18. Overdiagnosis of Bipolar Disorder: A Critical Analysis of the Literature

    Directory of Open Access Journals (Sweden)

    Amna A. Ghouse

    2013-01-01

    Full Text Available Bipolar disorder (BD is considered one of the most disabling mental conditions, with high rates of morbidity, disability, and premature death from suicide. Although BD is often misdiagnosed as major depressive disorder, some attention has recently been drawn to the possibility that BD could be overdiagnosed in some settings. The present paper focuses on a critical analysis of the overdiagnosis issue among bipolar patients. It includes a review of the available literature findings, followed by some recommendations aiming at optimizing the diagnosis of BD and increasing its reliability.

  19. Vaccines for human papillomavirus infection: A critical analysis

    Directory of Open Access Journals (Sweden)

    Nath Amiya

    2009-01-01

    Full Text Available This article takes a critical look at the pros and cons of human papillomavirus (HPV vaccines. There is enough evidence to suggest that the prophylactic vaccines are efficacious in preventing various benign and malignant conditions (including cervical cancers caused by HPV. Even though the vaccine is costly, hypothetical analysis has shown that HPV vaccination will be cost effective in the long run. Therapeutic HPV vaccines used to treat established disease are still undergoing evaluation in clinical studies, and results seem to be encouraging. Although several countries have started mandatory vaccination programs with the prophylactic HPV vaccines, conservatives have voiced concerns regarding the moral impact of such vaccination programs.

  20. DHLW Glass Waste Package Criticality Analysis (SCPB:N/A)

    International Nuclear Information System (INIS)

    Davis, J.W.

    1996-01-01

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development Department (WPDD) to determine the viability of the Defense High-Level Waste (DHLW) Glass waste package concept with respect to criticality regulatory requirements in compliance with the goals of the Waste Package Implementation Plan (Ref. 5.1) for conceptual design. These design calculations are performed in sufficient detail to provide a comprehensive comparison base with other design alternatives. The objective of this evaluation is to show to what extent the concept meets the regulatory requirements or indicate additional measures that are required for the intact waste package

  1. A critical analysis of the NegaWatt scenario

    International Nuclear Information System (INIS)

    Anon.

    2011-01-01

    The author proposes a rather radical critical analysis of the NegaWatt scenario which is mainly based on the development of the use of solid and liquid biomass produced by forests and farms, and of some marginal resources like wood and urban wastes. He shows that wood resources in France are not sufficient as part of the wood is used for construction. A further exploitation of wood would lead to a dramatic increase of costs. He shows that the scenario overestimates the available wood in France, and moreover, that the promoters of the scenario overstep the physical, biological, social and economic limits of the real world of agriculture

  2. Fast critical experiments in FCA and their analysis

    International Nuclear Information System (INIS)

    Hirota, Jitsuya

    1984-02-01

    JAERI Fast Critical Facility FCA went critical for the first time in April 1967. Since then, critical experiments and their analysis were carried out on thirty-five assemblies until March 1982. This report summarizes the many achievements obtained in these fifteen years and points out disagreements observed between calculation and experiment for further study. A series of mock-up experiments for the Experimental Fast Reactor JOYO, a theoretical and numerical study of the adjustment of group constants using integral data, and the development of a proton-recoil counter system for fast neutron spectrum measurement won high praise. Studies of the Doppler effect of structural materials, the effect of fission product accumulation on sodium-void worth, axially heterogeneous cores and actinide cross sections attracted world-wide attention. Significant contributions were also made to the Prototype Fast Breeder Reactor MONJU through partial mock-up experiments. Disagreements between calculation and experiment were observed in the following items: reaction rate distribution and reactivity worth of B4C absorber in the radial blanket, central reactivity worth in cores with reflector, plate/pin fuel heterogeneity effect on criticality, sodium-void effect in the central core region, Doppler effect of structural materials, core neutron spectrum near large resonances of iron and oxygen, effect of fission product accumulation on sodium-void worth, physics properties of heterogeneous cores, reactivity change resulting from fuel slumping, and so on. Further efforts should be made to resolve these disagreements by recalculating the experimental results with newly developed data and methods and by carrying out experiments intended to identify the causes of disagreement. (author)

  3. Criticality safety analysis of a calciner exit chute

    International Nuclear Information System (INIS)

    Haught, C.F.; Basoglu, B.; Brewer, R.W.; Hollenback, D.F.; Wilkinson, A.D.; Dodds, H.L.

    1994-01-01

    Calcination of uranyl nitrate into uranium oxide is part of normal operations at some enrichment plants. Typically, a calciner discharges uranium oxide powder (U3O8) into an exit chute that directs the powder into a receiving can located in a glove box. One possible scenario for a criticality accident is the exit chute becoming blocked with powder near its discharge. The blockage restricts the flow of powder, causing the exit chute to become filled with the powder. If blockage does occur, the height of the powder could reach a level that would not be safe from a criticality point of view. In this analysis, the subcritical height limit is examined for 98% enriched U3O8 in the exit chute with full water reflection and optimal water moderation. The height limit for ensuring criticality safety during such an accumulation is 28.2 cm above the top of the discharge pipe at the bottom of the chute. Chute design variations are also evaluated with full water reflection and optimal water moderation. Subcritical configurations for the exit chute variations are developed, but the configurations are not safe when combined with the calciner. To ensure criticality safety, modifications must be made to the calciner tube or safety measures must be implemented if these designs are to be utilized with 98% enriched material. A geometrically safe configuration for the exit chute is developed for a blockage of 20% enriched powder with full water reflection and optimal water moderation, and this configuration is safe when combined with the existing calciner

  4. Codevelopment of conceptual understanding and critical attitude: toward a systemic analysis of the survival blanket

    Science.gov (United States)

    Viennot, Laurence; Décamp, Nicolas

    2016-01-01

    One key objective of physics teaching is the promotion of conceptual understanding. Additionally, the critical faculty is universally seen as a central quality to be developed in students. In recent years, however, teaching objectives have placed stronger emphasis on skills than on concepts, and there is a risk that conceptual structuring may be disregarded. The question therefore arises as to whether it is possible for students to develop a critical stance without a conceptual basis, leading in turn to the issue of possible links between the development of conceptual understanding and critical attitude. In an in-depth study to address these questions, the participants were seven prospective physics and chemistry teachers. The methodology included a ‘teaching interview’, designed to observe participants’ responses to limited explanations of a given phenomenon and their ensuing intellectual satisfaction or frustration. The explanatory task related to the physics of how a survival blanket works, requiring a full and appropriate system analysis of the blanket. The analysis identified five recurrent lines of reasoning and linked these to judgments of adequacy of explanation, based on metacognitive/affective (MCA) factors, intellectual (dis)satisfaction and critical stance. Recurrent themes and MCA factors were used to map the intellectual dynamics that emerged during the interview process. Participants’ critical attitude was observed to develop in strong interaction with their comprehension of the topic. The results suggest that most students need to reach a certain level of conceptual mastery before they can begin to question an oversimplified explanation, although one student’s replies show that a different intellectual dynamics is also possible. The paper ends with a discussion of the implications of these findings for future research and for decisions concerning teaching objectives and the design of learning environments.

  5. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.
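
    The integrated model described above resides in Simulink and is not reproduced here; as a rough sketch of the underlying jitter-prediction step, the following propagates an assumed disturbance torque PSD through an assumed line-of-sight transfer function to an RMS pointing error (mode frequency, damping, gain and PSD level are all invented):

      import numpy as np
      from scipy import signal

      # Stand-in for one disturbance-to-LOS channel of an integrated model:
      # a lightly damped structural mode at 40 Hz coupling torque into LOS angle.
      wn, zeta, gain = 2 * np.pi * 40.0, 0.005, 0.1      # rad/s, -, urad per N*m
      los_tf = signal.TransferFunction([gain * wn**2], [1.0, 2 * zeta * wn, wn**2])

      # Flat (white) disturbance torque PSD over the analysis band, in (N*m)^2/Hz.
      f = np.linspace(0.1, 200.0, 20000)                 # Hz
      S_dist = np.full_like(f, 1.0e-2)

      # |H(j*2*pi*f)|^2 maps the disturbance PSD to a LOS PSD; integrate for RMS.
      _, H = signal.freqresp(los_tf, w=2 * np.pi * f)
      S_los = np.abs(H) ** 2 * S_dist
      rms_los = np.sqrt(np.trapz(S_los, f))
      print(f"predicted LOS jitter: {rms_los:.3f} urad RMS")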

  6. A Critical Evaluation of Ground-Penetrating Radar Methodology on the Kalavasos and Maroni Built Environments (KAMBE) Project, Cyprus (Invited)

    Science.gov (United States)

    Leon, J.; Urban, T.; Gerard-Little, P.; Kearns, C.; Manning, S. W.; Fisher, K.; Rogers, M.

    2013-12-01

    at these settlements. Having just completed this first phase of the project, we report on the results of large-scale geophysical survey, including the identification of at least two previously unknown building complexes (one at each site). Here we focus particularly on ground-penetrating radar (GPR) data and survey methodology, in an effort to critically examine the range of approaches applied throughout the project (e.g. various antennae frequencies, data-collection densities, soil moisture/seasonality of survey, and post-collection data processing [2]), and to identify the most effective parameters for archaeological geophysical survey in the region. This paper also advocates for the role of geophysical survey within a multi-component archaeological project, not simply as a prospection tool but as an archaeological data collection method in its own right. [1] Fisher, K. D., J. Leon, S. Manning, M. Rogers, and D. Sewell. In press, 2011-2012. 'The Kalavasos and Maroni Built Environments Project: Introduction and preliminary report on the 2008 and 2010 seasons.' Report of the Department of Antiquities, Cyprus. [2] e.g. Rogers, M., J. F. Leon, K. D. Fisher, S. W. Manning and D. Sewell. 2012. 'Comparing similar ground-penetrating radar surveys under different soil moisture conditions at Kalavasos-Ayios Dhimitrios, Cyprus.' Archaeological Prospection 19 (4): 297-305.
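
    As a rough illustration of one trade-off examined above (antenna frequency versus vertical resolution in drier or wetter soils), the sketch below applies the common quarter-wavelength rule of thumb; the permittivity values are placeholders, not results from the KAMBE surveys:

      # Rule-of-thumb GPR trade-off: higher antenna frequency improves resolution
      # but reduces penetration. Quarter-wavelength vertical resolution for a soil
      # with relative permittivity eps_r (values below are illustrative only).
      C = 0.3  # speed of light, m/ns

      def gpr_resolution_m(freq_mhz, eps_r):
          v = C / eps_r ** 0.5                  # wave velocity in the ground, m/ns
          wavelength = v / (freq_mhz * 1e-3)    # frequency converted to cycles/ns
          return wavelength / 4.0

      for freq in (200, 400, 900):
          for eps_r, label in ((5, "dry"), (15, "moist")):
              print(f"{freq:>3} MHz, {label:>5} soil (eps_r={eps_r:>2}): "
                    f"~{100 * gpr_resolution_m(freq, eps_r):.1f} cm vertical resolution")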

  7. Methodology for Design and Analysis of Reactive Distillation Involving Multielement Systems

    DEFF Research Database (Denmark)

    Jantharasuk, Amnart; Gani, Rafiqul; Górak, Andrzej

    2011-01-01

    A new methodology for the design and analysis of reactive distillation has been developed. In this work, the element-based approach, coupled with a driving force diagram, has been extended and applied to the design of a reactive distillation column involving multielement (multicomponent) systems ... consisting of two components. Based on this methodology, an optimal design configuration is identified using the equivalent binary-element driving force diagram. Two case studies, methyl acetate (MeOAc) synthesis and methyl tert-butyl ether (MTBE) synthesis, have been considered to demonstrate ... the successful applications of the methodology. Moreover, energy requirements for various column configurations corresponding to different feed locatio...
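
    A minimal sketch of the driving-force idea referred to above, assuming a constant relative volatility for the equivalent binary element pair (the value of alpha is arbitrary, and the element translation step of the methodology is not reproduced):

      import numpy as np

      def driving_force(x, alpha):
          """Binary driving force FD = y - x for constant relative volatility alpha,
          with y from the equilibrium relation y = alpha*x / (1 + (alpha - 1)*x)."""
          y = alpha * x / (1.0 + (alpha - 1.0) * x)
          return y - x

      alpha = 2.5                                # illustrative relative volatility
      x = np.linspace(0.0, 1.0, 501)
      fd = driving_force(x, alpha)

      # The composition at which the driving force peaks is used to locate the feed
      # (or reactive section) in the equivalent binary-element column.
      x_max = x[np.argmax(fd)]
      print(f"maximum driving force {fd.max():.3f} at element composition x = {x_max:.3f}")
      print(f"analytical optimum x* = {(np.sqrt(alpha) - 1) / (alpha - 1):.3f}")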

  8. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

    Full Text Available This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

    A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated at 8 913 000 €; on the contrary, indirect

  9. Landslide risk analysis: a multi-disciplinary methodological approach

    Science.gov (United States)

    Sterlacchini, S.; Frigerio, S.; Giacomelli, P.; Brambilla, M.

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis. A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated at 8 913 000 €; on the contrary, indirect damage ranged considerably
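
    A schematic sketch of the risk tabulation logic described above (expected loss per element as event probability times vulnerability times value); the elements, probabilities, vulnerabilities, values and the indirect-to-direct ratio are invented placeholders, not figures from the ALARM study:

      # Expected annual direct loss per element at risk; all numbers are invented.
      elements = [
          # (name,                    annual event prob., vulnerability 0-1, value in EUR)
          ("hotel",                   0.01, 0.30, 4_000_000),
          ("provincial road segment", 0.01, 0.60,   800_000),
          ("campsite",                0.01, 0.80,   500_000),
      ]

      total_direct = 0.0
      for name, p_event, vulnerability, value in elements:
          expected_loss = p_event * vulnerability * value
          total_direct += expected_loss
          print(f"{name:<26} expected annual direct loss: {expected_loss:>9,.0f} EUR")

      # Indirect damage (lost tourist flows, interrupted transport, ...) is estimated
      # separately and, as the study stresses, can be comparable in size to direct damage.
      indirect_fraction = 1.0        # assumed: indirect roughly equal to direct
      print(f"total direct: {total_direct:,.0f} EUR; "
            f"direct + indirect (assumed equal): {total_direct * (1 + indirect_fraction):,.0f} EUR")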

  10. Mediation analysis in nursing research: a methodological review.

    Science.gov (United States)

    Liu, Jianghong; Ulrich, Connie

    2016-12-01

    Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.
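
    A minimal sketch of the product-of-coefficients mediation analysis discussed above, on simulated data (the variable meanings and effect sizes are invented; in practice the indirect effect would be reported with a bootstrap confidence interval):

      import numpy as np
      import statsmodels.api as sm

      # Simulated data: X (intervention dose) -> M (self-efficacy) -> Y (adherence).
      rng = np.random.default_rng(42)
      n = 500
      x = rng.normal(size=n)
      m = 0.5 * x + rng.normal(size=n)               # mediator depends on X (path a)
      y = 0.4 * m + 0.1 * x + rng.normal(size=n)     # outcome depends on M (b) and X (c')

      # Path a: regress the mediator on the predictor.
      a = sm.OLS(m, sm.add_constant(x)).fit().params[1]

      # Paths b and c': regress the outcome on both the mediator and the predictor.
      fit_b = sm.OLS(y, sm.add_constant(np.column_stack([m, x]))).fit()
      b, c_prime = fit_b.params[1], fit_b.params[2]

      indirect = a * b                               # mediated (indirect) effect
      total = c_prime + indirect
      print(f"a = {a:.3f}, b = {b:.3f}, c' = {c_prime:.3f}")
      print(f"indirect effect a*b = {indirect:.3f} ({100 * indirect / total:.0f}% of total)")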

  11. Methodology for analysis and simulation of large multidisciplinary problems

    Science.gov (United States)

    Russell, William C.; Ikeda, Paul J.; Vos, Robert G.

    1989-01-01

    The Integrated Structural Modeling (ISM) program is being developed for the Air Force Weapons Laboratory and will be available for Air Force work. Its goal is to provide a design, analysis, and simulation tool intended primarily for directed energy weapons (DEW), kinetic energy weapons (KEW), and surveillance applications. The code is designed to run on DEC (VMS and UNIX), IRIS, Alliant, and Cray hosts. Several technical disciplines are included in ISM, namely structures, controls, optics, thermal, and dynamics. Four topics from the broad ISM goal are discussed. The first is project configuration management and includes two major areas: the software and database arrangement and the system model control. The second is interdisciplinary data transfer and refers to exchange of data between various disciplines such as structures and thermal. Third is a discussion of the integration of component models into one system model, i.e., multiple discipline model synthesis. Last is a presentation of work on a distributed processing computing environment.

  12. THE MURCHISON WIDEFIELD ARRAY 21 cm POWER SPECTRUM ANALYSIS METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    Jacobs, Daniel C.; Beardsley, A. P.; Bowman, Judd D. [Arizona State University, School of Earth and Space Exploration, Tempe, AZ 85287 (United States); Hazelton, B. J.; Sullivan, I. S.; Barry, N.; Carroll, P. [University of Washington, Department of Physics, Seattle, WA 98195 (United States); Trott, C. M.; Pindor, B.; Briggs, F.; Gaensler, B. M. [ARC Centre of Excellence for All-sky Astrophysics (CAASTRO) (Australia); Dillon, Joshua S.; Oliveira-Costa, A. de; Ewall-Wice, A.; Feng, L. [MIT Kavli Institute for Astrophysics and Space Research, Cambridge, MA 02139 (United States); Pober, J. C. [Brown University, Department of Physics, Providence, RI 02912 (United States); Bernardi, G. [Department of Physics and Electronics, Rhodes University, Grahamstown 6140 (South Africa); Cappallo, R. J.; Corey, B. E. [MIT Haystack Observatory, Westford, MA 01886 (United States); Emrich, D., E-mail: daniel.c.jacobs@asu.edu [International Centre for Radio Astronomy Research, Curtin University, Perth, WA 6845 (Australia); and others

    2016-07-10

    We present the 21 cm power spectrum analysis approach of the Murchison Widefield Array Epoch of Reionization project. In this paper, we compare the outputs of multiple pipelines for the purpose of validating statistical limits on cosmological hydrogen at redshifts between 6 and 12. Multiple independent data calibration and reduction pipelines are used to make power spectrum limits on a fiducial night of data. Comparing the outputs of imaging and power spectrum stages highlights differences in calibration, foreground subtraction, and power spectrum calculation. The power spectra found using these different methods span a space defined by the various tradeoffs between speed, accuracy, and systematic control. Lessons learned from comparing the pipelines range from the algorithmic to the prosaically mundane; all demonstrate the many pitfalls of neglecting reproducibility. We briefly discuss the way these different methods attempt to handle the question of evaluating a significant detection in the presence of foregrounds.
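
    None of the MWA pipelines is reproduced here; the sketch below only illustrates the core operation they share, binning squared Fourier amplitudes of a toy brightness-temperature cube into spherical |k| shells to form a one-dimensional power spectrum (grid size, box length and normalization convention are assumptions):

      import numpy as np

      rng = np.random.default_rng(1)
      n, box = 64, 300.0                       # grid cells per side, box length (e.g. Mpc)
      field = rng.normal(size=(n, n, n))       # stand-in for a brightness-temperature cube

      # FFT, squared amplitude, and the |k| of every Fourier cell.
      dk = 2 * np.pi / box
      ft = np.fft.fftn(field) * (box / n) ** 3
      power = np.abs(ft) ** 2 / box**3
      kfreq = np.fft.fftfreq(n, d=box / n) * 2 * np.pi
      kx, ky, kz = np.meshgrid(kfreq, kfreq, kfreq, indexing="ij")
      kmag = np.sqrt(kx**2 + ky**2 + kz**2)

      # Average the power in spherical |k| shells up to the Nyquist wavenumber.
      bins = np.arange(dk, np.pi * n / box, dk)
      which = np.digitize(kmag.ravel(), bins)
      pk = np.array([power.ravel()[which == i].mean() for i in range(1, len(bins))])
      kcent = 0.5 * (bins[1:] + bins[:-1])
      print(np.column_stack([kcent[:5], pk[:5]]))   # first few (k, P(k)) pairs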

  13. Methodology for global nonlinear analysis of nuclear systems

    International Nuclear Information System (INIS)

    Cacuci, D.G.; Cacuci, G.L.

    1987-01-01

    This paper outlines a general method for globally computing the crucial features of nonlinear problems: bifurcations, limit points, saddle points, and extrema (maxima and minima); our method also yields the local sensitivities (i.e., first order derivatives) of the system's state variables (e.g., fluxes, power, temperatures, flows) at any point in the system's phase space. We also present an application of this method to the nonlinear BWR model discussed in Refs. 8 and 11. The most significant novel feature of our method is the recasting of a general mathematical problem, comprising nonlinear constrained optimization and sensitivity analysis, into a fixed point problem of the form F[u(s), λ(s)] = 0 whose global zeros and singular points are related to the special features (i.e., extrema, bifurcations, etc.) of the original problem
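
    The BWR application itself is not reproduced here; as a minimal illustration of tracking zeros of a fixed point problem F(u, λ) = 0 along a parameter and reading off the local sensitivity du/dλ = -F_λ/F_u, the sketch below sweeps a one-dimensional toy problem toward a limit point (the toy F is an assumption, not the paper's model):

      import numpy as np
      from scipy.optimize import fsolve

      # Toy problem F(u, lam) = u**3 - u + lam, which has a fold (limit point)
      # at u = 1/sqrt(3), lam = 2/(3*sqrt(3)) ~ 0.385.
      def F(u, lam):
          return u**3 - u + lam

      def dF_du(u, lam):
          return 3 * u**2 - 1.0

      # Naive continuation: sweep the parameter, converging each point from the
      # previous solution, and report du/dlam = -F_lam / F_u (here F_lam = 1).
      u = 0.0
      for lam in np.linspace(0.0, 0.38, 20):
          u = fsolve(lambda x: F(x, lam), u)[0]
          fu = dF_du(u, lam)
          sens = -1.0 / fu
          tag = "  <- nearing a limit point (F_u ~ 0)" if abs(fu) < 0.2 else ""
          print(f"lam = {lam:.3f}  u = {u:+.4f}  du/dlam = {sens:+8.3f}{tag}")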

  14. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters devoted to various new approaches to intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others present the practical aspects and the...

  15. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis

    OpenAIRE

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-01-01

    In this paper, a new methodology for diagnosing skin cancer on images of dermatologic spots using image processing is presented. Currently, skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using filters such as the classic, inverse and nonlinear k-law filters. The sample images were obtained by a medical specialist, and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in can...
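
    The paper's filter definitions and diagnostic criteria are not reproduced here; the sketch below shows a generic k-law nonlinear filter applied in the Fourier domain, correlating a noisy test patch against a clean reference (the value of k and the synthetic images are arbitrary assumptions):

      import numpy as np

      def klaw_filter(image, reference, k=0.3):
          """Nonlinear k-law correlation: both spectra are raised to the power k in
          modulus (phase preserved) before the classical matched-filter product.
          k = 1 reduces to the classical filter; k = 0 to a phase-only filter."""
          F = np.fft.fft2(image)
          R = np.fft.fft2(reference)
          Fk = np.abs(F) ** k * np.exp(1j * np.angle(F))
          Rk = np.abs(R) ** k * np.exp(1j * np.angle(R))
          return np.abs(np.fft.fftshift(np.fft.ifft2(Fk * np.conj(Rk))))

      # Toy demonstration: locate a noisy, shifted "spot" against a clean reference.
      rng = np.random.default_rng(3)
      y, x = np.mgrid[-64:64, -64:64]
      reference = np.exp(-(x**2 + y**2) / (2 * 15.0**2))          # smooth round spot
      image = np.roll(reference, (10, -6), axis=(0, 1)) + 0.3 * rng.normal(size=reference.shape)

      peak = np.unravel_index(np.argmax(klaw_filter(image, reference, k=0.3)), reference.shape)
      print("correlation peak at", peak, "(an unshifted spot would peak at (64, 64))")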

  16. submitter Methodologies for the Statistical Analysis of Memory Response to Radiation

    CERN Document Server

    Bosser, Alexandre L; Tsiligiannis, Georgios; Frost, Christopher D; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigne, Frederic; Virtanen, Ari; Wrobel, Frederic; Dilillo, Luigi

    2016-01-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].
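
    The study's specific statistical methodologies are not reproduced here; one common building block for such analyses is an exact (Garwood) Poisson confidence interval on the per-bit upset cross-section, sketched below with invented counts and fluence:

      from scipy.stats import chi2

      def seu_cross_section(n_upsets, fluence_cm2, bits, conf=0.95):
          """Per-bit SEU cross-section with an exact (Garwood) Poisson confidence
          interval, assuming independent, Poisson-distributed upsets."""
          alpha = 1.0 - conf
          exposure = fluence_cm2 * bits
          mean = n_upsets / exposure
          low = 0.5 * chi2.ppf(alpha / 2, 2 * n_upsets) / exposure if n_upsets > 0 else 0.0
          high = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (n_upsets + 1)) / exposure
          return mean, low, high

      # Invented example: 37 upsets in a 16 Mbit SRAM at a fluence of 1e7 particles/cm^2.
      mean, low, high = seu_cross_section(37, 1e7, 16 * 2**20)
      print(f"sigma_bit = {mean:.2e} cm^2/bit  (95% CI {low:.2e} - {high:.2e})")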

  17. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As an interdisciplinary domain requiring advanced and innovative methodologies, the computational forensics domain is characterized by data that are simultaneously large-scale and uncertain, multidimensional and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper, a methodology and an automatic procedure based on fuzzy set theory and designed to infer precis...
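
    The paper's specific procedure is not reproduced here; the sketch below shows a plain fuzzy c-means clustering on invented two-dimensional "event" features, as a generic building block of such an approach (the feature meanings, cluster count and fuzzifier m are assumptions):

      import numpy as np

      def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
          """Plain fuzzy c-means: returns cluster centres and a membership matrix U,
          where U[i, j] is the degree to which sample i belongs to cluster j."""
          rng = np.random.default_rng(seed)
          U = rng.random((len(X), c))
          U /= U.sum(axis=1, keepdims=True)
          for _ in range(n_iter):
              W = U ** m
              centres = (W.T @ X) / W.sum(axis=0)[:, None]
              dist = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
              U = 1.0 / dist ** (2.0 / (m - 1.0))
              U /= U.sum(axis=1, keepdims=True)
          return centres, U

      # Invented "event" features (e.g. hour of day, distance from a reference point).
      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal([2.0, 1.0], 0.4, size=(50, 2)),
                     rng.normal([22.0, 5.0], 0.8, size=(50, 2))])
      centres, U = fuzzy_c_means(X, c=2)
      print("cluster centres:\n", np.round(centres, 2))
      print("memberships of the first three samples:\n", np.round(U[:3], 2))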

  18. Methodology of the Integrated Analysis of Company's Financial Status and Its Performance Results

    OpenAIRE

    Mackevičius, Jonas; Valkauskas, Romualdas

    2010-01-01

    Information about a company's financial status and its performance results is very important for the objective evaluation of the company's position in the market and its competitive possibilities in the future. Such information is provided in the financial statements. It is important to apply and investigate this information properly. A methodology for the integrated analysis of a company's financial status and performance results is recommended in this article. This methodology consists of these three elements...

  19. Percorsi linguistici e semiotici: Critical Multimodal Analysis of Digital Discourse

    Directory of Open Access Journals (Sweden)

    edited by Ilaria Moschini

    2014-12-01

    Full Text Available The language section of LEA - edited by Ilaria Moschini - is dedicated to the Critical Multimodal Analysis of Digital Discourse, an approach that encompasses the linguistic and semiotic detailed investigation of texts within a socio-cultural perspective. It features an interview with Professor Theo van Leeuwen by Ilaria Moschini and four essays: “Retwitting, reposting, repinning; reshaping identities online: Towards a social semiotic multimodal analysis of digital remediation” by Elisabetta Adami; “Multimodal aspects of corporate social responsibility communication” by Carmen Daniela Maier; “Pervasive Technologies and the Paradoxes of Multimodal Digital Communication” by Sandra Petroni and “Can the powerless speak? Linguistic and multimodal corporate media manipulation in digital environments: the case of Malala Yousafzai” by Maria Grazia Sindoni. 

  20. Criticality Analysis of SFP Region I under Dry Air Condition

    International Nuclear Information System (INIS)

    Kim, Ki Yong; Kim, Min Chul

    2016-01-01

    This paper provides the results of a criticality evaluation under the condition that new fuel assemblies for initial fuel loading are stored in Region 1 of the spent fuel pool (SFP) in dry air. The objective of this analysis is to ensure that the effective neutron multiplication factor (k_eff) of the SFP is less than 0.95 under that condition. The analysis confirmed that the effective neutron multiplication factor (k_eff) of Region 1 of the SFP is less than 0.95 under the dry-air condition. The k_eff in Region 1 of the SFP under the dry-air condition is 0.5865. The increase in k_calc for Region 1 after the mislocated fuel assembly accident is 0.0444 with the pool flooded with unborated water
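
    As a reminder of how such results are typically screened, the sketch below applies the usual upper-subcritical-limit check k_eff = k_calc + Δk_bias + Δk_uncertainty < 0.95; the bias and uncertainty terms are placeholders, and treating the reported 0.0444 as an additive increase over the dry-air base case is an assumption made only for illustration:

      # Upper-subcritical-limit screening for a spent fuel pool configuration.
      # The bias/uncertainty values below are placeholders, not from the cited analysis.
      K_LIMIT = 0.95

      def k_eff_check(k_calc, dk_bias=0.005, dk_unc=0.010, dk_accident=0.0):
          k_eff = k_calc + dk_bias + dk_unc + dk_accident
          return k_eff, k_eff < K_LIMIT

      # Dry-air base case reported in the abstract, then the postulated mislocated
      # fuel assembly accident with the pool flooded by unborated water (treated
      # here, as an assumption, as an additive increase on the base case).
      for label, dk_acc in [("Region 1, dry air", 0.0),
                            ("mislocated assembly, unborated water", 0.0444)]:
          k_eff, ok = k_eff_check(0.5865, dk_accident=dk_acc)
          print(f"{label:<38} k_eff = {k_eff:.4f}  {'OK' if ok else 'EXCEEDS LIMIT'}")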