WorldWideScience

Sample records for analysis methodology volume

  1. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  2. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  3. Contaminated Soil Volume Estimate Tracking Methodology

    International Nuclear Information System (INIS)

    Durham, L.A.; Johnson, R.L.; Rieman, C.; Kenna, T.; Pilon, R.

    2003-01-01

    The U.S. Army Corps of Engineers (USACE) is conducting a cleanup of radiologically contaminated properties under the Formerly Utilized Sites Remedial Action Program (FUSRAP). The largest cost element for most of the FUSRAP sites is the transportation and disposal of contaminated soil. Project managers and engineers need an estimate of the volume of contaminated soil to determine project costs and schedule. Once excavation activities begin and additional remedial action data are collected, the actual quantity of contaminated soil often deviates from the original estimate, resulting in cost and schedule impacts to the project. The project costs and schedule need to be frequently updated by tracking the actual quantities of excavated soil and contaminated soil remaining during the life of a remedial action project. A soil volume estimate tracking methodology was developed to provide a mechanism for project managers and engineers to create better project controls of costs and schedule. For the FUSRAP Linde site, an estimate of the initial volume of in situ soil above the specified cleanup guidelines was calculated on the basis of discrete soil sample data and other relevant data using indicator geostatistical techniques combined with Bayesian analysis. During the remedial action, updated volume estimates of remaining in situ soils requiring excavation were calculated on a periodic basis. In addition to taking into account the volume of soil that had been excavated, the updated volume estimates incorporated both new gamma walkover surveys and discrete sample data collected as part of the remedial action. A civil survey company provided periodic estimates of actual in situ excavated soil volumes. By using the results from the civil survey of actual in situ volumes excavated and the updated estimate of the remaining volume of contaminated soil requiring excavation, the USACE Buffalo District was able to forecast and update project costs and schedule. The soil volume
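
    The running update described above (an initial geostatistical estimate, reduced by civil-surveyed excavated volume and adjusted for newly identified contamination) can be sketched as simple bookkeeping. This is an illustrative sketch only; the function names and quantities below are hypothetical and not taken from the Linde site data.

```python
# Hypothetical sketch of the soil volume estimate tracking bookkeeping.
# All names and numbers are illustrative, not from the FUSRAP Linde site.

def update_remaining_volume(current_estimate_m3, excavated_m3,
                            newly_identified_m3=0.0):
    """Return the updated in situ volume still requiring excavation:
    the prior estimate, minus what the civil survey says was excavated,
    plus any volume newly identified by gamma walkover / discrete samples."""
    remaining = current_estimate_m3 - excavated_m3 + newly_identified_m3
    return max(remaining, 0.0)  # the remaining volume cannot be negative

def forecast_cost(remaining_m3, unit_cost_per_m3):
    """Project transportation-and-disposal cost for the remaining soil."""
    return remaining_m3 * unit_cost_per_m3

remaining = update_remaining_volume(12000.0, 3500.0, newly_identified_m3=400.0)
print(remaining)                       # 8900.0
print(forecast_cost(remaining, 250))   # 2225000.0
```

    Repeating this update on a periodic basis, as the abstract describes, is what lets project cost and schedule forecasts track the actual excavation rather than the original estimate.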

  4. Methodology for generating waste volume estimates

    International Nuclear Information System (INIS)

    Miller, J.Q.; Hale, T.; Miller, D.

    1991-09-01

    This document describes the methodology that will be used to calculate waste volume estimates for site characterization and remedial design/remedial action activities at each of the DOE Field Office, Oak Ridge (DOE-OR) facilities. This standardized methodology is designed to ensure consistency in waste estimating across the various sites and organizations that are involved in environmental restoration activities. The criteria and assumptions that are provided for generating these waste estimates will be implemented across all DOE-OR facilities and are subject to change based on comments received and actual waste volumes measured during future sampling and remediation activities. 7 figs., 8 tabs

  5. Benefit-Cost Analysis of Integrated Paratransit Systems: Volume 6. Technical Appendices.

    Science.gov (United States)

    1979-09-01

    This last volume includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...

  6. GO methodology. Volume 1. Overview manual

    International Nuclear Information System (INIS)

    1983-06-01

    The GO methodology is a success-oriented probabilistic system performance analysis technique. The methodology can be used to quantify system reliability and availability, identify and rank critical components and the contributors to system failure, construct event trees, and perform statistical uncertainty analysis. Additional capabilities of the method currently under development will enhance its use in evaluating the effects of external events and common cause failures on system performance. This Overview Manual provides a description of the GO methodology, how it can be used, and the benefits of using it in the analysis of complex systems.

  7. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    International Nuclear Information System (INIS)

    Temple, S.M.; Robbins, T.R.

    1986-09-01

    This report provides an overview of the basis and methodology requirements for determining Pressurized Water Reactor (PWR) technical specification related setpoints and focuses on development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970s to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. The descriptions of the typical setpoint methodologies are provided for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse. The description of the methods of analysis includes a discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR.

  8. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  9. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 2: Methodology and Results

    International Nuclear Information System (INIS)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-01-01

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3)

  10. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 2: Methodology and Results

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-10-01

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  11. Analysis of core damage frequency from internal events: Methodology guidelines: Volume 1

    International Nuclear Information System (INIS)

    Drouin, M.T.; Harper, F.T.; Camp, A.L.

    1987-09-01

    NUREG-1150 examines the risk to the public from a selected group of nuclear power plants. This report describes the methodology used to estimate the internal event core damage frequencies of four plants in support of NUREG-1150. In principle, this methodology is similar to methods used in past probabilistic risk assessments; however, by drawing on past studies and using analysts who are experienced in these techniques, the analyses can be focused on certain areas. In this approach, only the most important systems and failure modes are modeled in detail. Further, the data and human reliability analyses are simplified, with emphasis on the most important components and human actions. Using these methods, an analysis can be completed in six to nine months using two to three full-time systems analysts and part-time personnel in other areas, such as data analysis and human reliability analysis. This is significantly faster and less costly than previous analyses and provides most of the insights that are obtained by the more costly studies. 82 refs., 35 figs., 27 tabs

  12. SCALE-4 analysis of pressurized water reactor critical configurations. Volume 1: Summary

    International Nuclear Information System (INIS)

    DeHart, M.D.

    1995-03-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original fresh composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized water reactors (PWR). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power's Surry Unit 1 reactor for the Cycle 2 core; Volume 4 documents the calculations performed based on GPU Nuclear Corporation's Three Mile Island Unit 1 Cycle 5 core; and, lastly, Volume 5 describes the analysis of Virginia Power's North Anna Unit 1 Cycle 5 core. Each of the reactor-specific volumes provides the details of calculations performed to determine the effective multiplication factor for each reactor core for one or more critical configurations using the SCALE-4 system; these results are summarized in this volume. Differences between the core designs and their possible impact on the criticality calculations are also discussed. Finally, results are presented for additional analyses performed to verify that solutions were sufficiently converged

  13. Methodological issues in radiation dose-volume outcome analyses: Summary of a joint AAPM/NIH workshop

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Niemierko, Andrzej; Herbert, Donald; Yan, Di; Jackson, Andrew; Ten Haken, Randall K.; Langer, Mark; Sapareto, Steve

    2002-01-01

    This report represents a summary of presentations at a joint workshop of the National Institutes of Health and the American Association of Physicists in Medicine (AAPM). Current methodological issues in dose-volume modeling are addressed here from several different perspectives. Areas of emphasis include (a) basic modeling issues including the equivalent uniform dose framework and the bootstrap method, (b) issues in the valid use of statistics, including the need for meta-analysis, (c) issues in dealing with organ deformation and its effects on treatment response, (d) evidence for volume effects for rectal complications, (e) the use of volume effect data in liver and lung as a basis for dose escalation studies, and (f) implications of uncertainties in volume effect knowledge on optimized treatment planning. Taken together, these approaches to studying volume effects describe many implications for the development and use of this information in radiation oncology practice. Areas of significant interest for further research include the meta-analysis of clinical data; interinstitutional pooled data analyses of volume effects; analyses of the uncertainties in outcome prediction models, minimal parameter number outcome models for ranking treatment plans (e.g., equivalent uniform dose); incorporation of the effect of motion in the outcome prediction; dose-escalation/isorisk protocols based on outcome models; the use of functional imaging to study radio-response; and the need for further small animal tumor control probability/normal tissue complication probability studies
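
    The equivalent uniform dose framework mentioned above reduces a dose-volume histogram to a single effective dose for ranking treatment plans. A minimal sketch of the generalized EUD formula, gEUD = (sum_i v_i * d_i^a)**(1/a), follows; the dose bins and exponent values are illustrative, not clinical data.

```python
# Sketch of generalized equivalent uniform dose (gEUD) for a cumulative set
# of DVH bins. Bin values below are illustrative, not from any patient plan.

def geud(doses_gy, volume_fractions, a):
    """gEUD = (sum_i v_i * d_i**a) ** (1/a).
    Large positive a weights hot spots (serial organs at risk);
    a = 1 gives the mean dose; negative a weights cold spots (targets)."""
    assert abs(sum(volume_fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return sum(v * d ** a for d, v in zip(doses_gy, volume_fractions)) ** (1.0 / a)

# For a uniform dose, gEUD equals that dose regardless of the exponent.
print(round(geud([60.0, 60.0], [0.5, 0.5], a=8), 3))   # 60.0
# For a = 1, gEUD reduces to the mean dose.
print(round(geud([70.0, 30.0], [0.5, 0.5], a=1), 3))   # 50.0
```

    The "minimal parameter number" appeal noted in the abstract is visible here: one exponent a summarizes the dose-volume response assumption.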

  14. Improving process methodology for measuring plutonium burden in human urine using fission track analysis

    International Nuclear Information System (INIS)

    Krahenbuhl, M.P.; Slaughter, D.M.

    1998-01-01

    The aim of this paper is to clearly define the chemical and nuclear principles governing Fission Track Analysis (FTA) as used to determine environmental levels of 239Pu in urine. The paper also addresses deficiencies in FTA methodology and introduces improvements to make FTA a more reliable research tool. Our refined methodology, described herein, includes a chemically induced precipitation phase followed by anion exchange chromatography, and employs a chemical tracer, 236Pu. We have established an inverse correlation between Pu recovery and sample volume, and our data confirm that increases in sample volume do not result in higher accuracy or lower detection limits. We conclude that in subsequent studies samples should be limited to approximately two liters. The Pu detection limit for a sample of this volume is 2.8 μBq/l. (author)

  15. Waste isolation in the U.S., technical programs and public education. Volume 2 - low level waste, volume reduction methodologies and economics

    International Nuclear Information System (INIS)

    Post, R.G.

    1984-01-01

    This volume presents information regarding low-level waste, volume reduction methodologies and economics. Topics include: public education on nuclear waste; economics of low-level waste management systems; operating experience with advanced volume reduction techniques; solidification of waste; operating experience with advanced volume reduction techniques--incineration; regional plans for the disposal of low-level waste; radwaste system modifications at nuclear power plants; operating experience with advanced volume reduction techniques--operations and on-site storage issues; and economic impact of 10CFR61

  16. Safety analysis methodology for OPR 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun

    2005-01-01

    Full text: Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology, based on the codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. For non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)

  17. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
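
    The core combination the abstract describes (building protection factors weighted by where people actually are) can be sketched as a population-weighted average. The sketch below is hypothetical: the shelter categories, protection factors, and population fractions are illustrative values, not figures from the LLNL report.

```python
# Illustrative sketch of a Regional Shelter Analysis style calculation:
# combine the protection each building category provides with the fraction
# of the population sheltered in it. All values below are hypothetical.

def mean_protected_dose(outdoor_dose, shelter_distribution):
    """shelter_distribution maps category -> (population_fraction,
    protection_factor). Returns the population-averaged dose, where each
    group receives outdoor_dose divided by its protection factor."""
    fractions = [f for f, _ in shelter_distribution.values()]
    assert abs(sum(fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return sum(frac * outdoor_dose / pf
               for frac, pf in shelter_distribution.values())

# A hypothetical night-time posture (people mostly at home).
night = {"wood frame": (0.7, 3.0), "masonry": (0.2, 10.0), "basement": (0.1, 40.0)}
print(round(mean_protected_dose(100.0, night), 1))  # 25.6, versus 100 unsheltered
```

    Swapping in a different distribution (day vs. night, warned vs. unwarned, another country's building stock) changes only the dictionary, which is what makes the per-posture and per-country comparisons in the methodology tractable.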

  18. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    Energy Technology Data Exchange (ETDEWEB)

    1987-04-01

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  19. Assessment methodology for new cooling lakes. Volume 3. Limnological and fisheries data and bibliography. Final report

    International Nuclear Information System (INIS)

    1981-10-01

    This is the data volume of the report entitled Assessment Methodology for New Cooling Lakes. Limnological and fisheries data were compiled in this volume for potential users in the utility industry. Published papers, reports, other written information, computer files, and direct contacts were used to compile a matrix of information. This volume presents data and the bibliographic sources of the power plant and geographical, limnological, and fisheries information for 181 lakes and reservoirs, of which 134 were used for cooling purposes. Data for 65 lakes were complete with respect to the limnology and fisheries parameters, so that complete statistical analysis could be performed. Of these 65 lakes, 42 are used for cooling. Tables in this report contain data arranged by utility, power plant, limnology, water quality, morphoedaphic, and fishery categories. The data in the tables are keyed to a lake code. The references for the data shown are keyed to a numerical listing of the bibliography. Author, state, lake, and subject indexes facilitate searching for bibliographic information.

  20. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330 MWt SMART (System-integrated Modular Advanced ReacTor), which has been developed by the Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, a further validated final safety analysis methodology will be developed. The current licensing safety analysis methodologies of the Westinghouse and KSNPP PWRs operating and under development in Korea, as well as the Russian licensing safety analysis methodology for integral reactors, have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating in Korea and of the KNGR (Korean Next Generation Reactor) under construction. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break, and the small break loss of coolant accident. The SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. The validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended that the nuclear regulatory authority establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  21. Volume totalizers analysis of pipelines operated by the TRANSPETRO National Operational Control Center

    Energy Technology Data Exchange (ETDEWEB)

    Aramaki, Thiago Lessa; Montalvao, Antonio Filipe Falcao [Petrobras Transporte S.A. (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Marques, Thais Carrijo [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil)

    2012-07-01

    This paper presents the methodology and results of an analysis of differences in the volume totalizers used by systems such as batch tracking and leak detection for pipelines operated by TRANSPETRO's National Operational Control Center (CNCO). To optimize this type of analysis, software was developed for acquiring and processing historical data using the methodology developed here. The methodology takes into account the particularities encountered in the systems operated by TRANSPETRO and, more specifically, by the CNCO. (author)

  22. Methodological Elements of Situational Analysis

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full text available. The article investigates the theoretical and methodological principles of situational analysis and argues for its necessity under modern conditions. The notion of "situational analysis" is defined: a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate those signs comprehensively under the influence of a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative, and search functions in the process of situational analysis is demonstrated. The principal methodological elements of situational analysis are grounded; their substantiation will enable the analyst to develop adaptive methods that take into account the peculiar features of a unique object (a situation that has emerged in a complex system), to diagnose that situation and subject it to systematic, in-depth analysis, to identify risks and opportunities, and to make timely management decisions as a particular period requires.

  23. Diversion Path Analysis handbook. Volume 4 (of 4 volumes). Computer Program 2

    International Nuclear Information System (INIS)

    Schleter, J.C.

    1978-11-01

    The FORTRAN IV computer program DPA Computer Program 2 (DPACP-2) is used to produce tables and statistics on modifications identified when performing a Diversion Path Analysis (DPA) in accord with the methodology given in Volume 1. The program requires 259,088 bytes exclusive of the operating system. The data assembled and tabulated by DPACP-2 assist the DPA team in analyzing and evaluating modifications to the plant's safeguards system that would eliminate, or reduce the severity of, vulnerabilities identified by means of the DPA. These vulnerabilities relate to the capability of the plant's material control and material accounting subsystems to indicate diversion of special nuclear material (SNM) by a knowledgeable insider.

  24. Diversion Path Analysis handbook. Volume 3 (of 4 volumes). Computer Program 1

    International Nuclear Information System (INIS)

    Schleter, J.C.

    1978-11-01

    The FORTRAN IV computer program DPA Computer Program 1 (DPACP-1) is used to assemble and tabulate the data for Specific Diversion Paths (SDPs) identified when performing a Diversion Path Analysis (DPA) in accord with the methodology given in Volume 1. The program requires 255,498 bytes exclusive of the operating system. The data assembled and tabulated by DPACP-1 are used by the DPA team to assist in analyzing vulnerabilities, in a plant's material control and material accounting subsystems, to diversion of special nuclear material (SNM) by a knowledgeable insider. Based on this analysis, the DPA team can identify, and propose to plant management, modifications to the plant's safeguards system that would eliminate, or reduce the severity of, the identified vulnerabilities. The data are also used by plant supervision when investigating a potential diversion.

  25. Causal Meta-Analysis: Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)

  26. Decomposition analysis of differential dose volume histograms

    International Nuclear Information System (INIS)

    Heuvel, Frank van den

    2006-01-01

    Dose volume histograms are a common tool to assess the value of a treatment plan for various forms of radiation therapy treatment. The purpose of this work is to introduce, validate, and apply a set of tools to analyze differential dose volume histograms by decomposing them into physically and clinically meaningful normal distributions. A weighted sum of the decomposed normal distributions (e.g., weighted dose) is proposed as a new measure of target dose, rather than the more unstable point dose. The method and its theory are presented and validated using simulated distributions. Additional validation is performed by analyzing simple four-field box techniques encompassing a predefined target, using different treatment energies inside a water phantom. Furthermore, two clinical situations are analyzed using this methodology to illustrate practical usefulness. A treatment plan for a breast patient using a tangential field setup with wedges is compared to a comparable geometry using dose compensators. Finally, a normal tissue complication probability (NTCP) calculation is refined using this decomposition. The NTCP calculation is performed on the liver as the organ at risk in a treatment of a mesothelioma patient with involvement of the right lung. The comparison of the wedged breast treatment versus the compensator technique yields comparable classical dose parameters (e.g., conformity index ≅1 and equal dose at the ICRU dose point). The methodology proposed here shows a 4% difference in weighted dose, outlining the difference in treatment using a single parameter instead of at least two in a classical analysis (e.g., mean dose and maximal dose, or total dose variance). NTCP calculations for the mesothelioma case are generated automatically and show a 3% decrease with respect to the classical calculation. The decrease is slightly dependent on the fractionation and on the α/β-value utilized. In conclusion, this method is able to distinguish clinically
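
    The weighted-dose idea above can be sketched once a differential DVH has been decomposed into normal components: the plan metric is the weight-averaged component mean rather than a single point dose. The component weights, means, and sigmas below are illustrative, not values from the cited treatment plans.

```python
# Minimal sketch of the decomposition model: a differential DVH approximated
# as a weighted sum of normal distributions, with "weighted dose" as the
# weight-averaged component mean. Component values are illustrative only.

import math

def normal_pdf(d, mu, sigma):
    """Normal density evaluated at dose d."""
    return math.exp(-0.5 * ((d - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def mixture_pdf(d, components):
    """Differential DVH modeled as sum_k w_k * N(d; mu_k, sigma_k)."""
    return sum(w * normal_pdf(d, mu, sigma) for w, mu, sigma in components)

def weighted_dose(components):
    """Weighted dose: sum_k w_k * mu_k, normalized by the total weight."""
    total = sum(w for w, _, _ in components)
    return sum(w * mu for w, mu, _ in components) / total

# Two hypothetical components: the main target peak and a cooler sub-volume.
components = [(0.8, 60.0, 1.5), (0.2, 57.0, 2.5)]
print(round(weighted_dose(components), 2))  # 59.4
```

    In practice the components would be obtained by fitting the mixture model to the measured differential DVH; the single weighted-dose parameter then plays the summarizing role the abstract describes.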

  7. The effect of duration of illness and antipsychotics on subcortical volumes in schizophrenia: Analysis of 778 subjects

    Directory of Open Access Journals (Sweden)

    Naoki Hashimoto

    2018-01-01

    Discussion: A large sample size, uniform data collection methodology, and robust statistical analysis are strengths of the current study. This result suggests that special attention is needed when discussing the relationship between subcortical regional brain volumes and the pathophysiology of schizophrenia, because regional brain volumes may be affected by antipsychotic medication.

  8. De Minimis waste impacts analysis methodology. IMPACTS - BRC user's guide and methodology for radioactive wastes below regulatory concern. Draft report for comment. Volume 2

    International Nuclear Information System (INIS)

    Forstom, J.M.; Goode, D.J.

    1986-07-01

    This report describes the methodology and computer program used by NRC to evaluate radiological impacts associated with petitions to have specific slightly contaminated radioactive waste streams designated as "below regulatory concern." These wastes could be treated and disposed of at facilities which are not licensed for low-level radioactive waste management. The IMPACTS-BRC computer program is implemented on IBM-PC microcomputers using the FORTRAN programming language. Radiological impacts (doses) are estimated for several pathways including direct gamma radiation exposure, worker inhalation and exposure, offsite atmospheric and water releases, and intruder exposures. Annual impacts are calculated for the maximum individual, critical groups, and general population. The treatment and disposal options include onsite incineration, incineration at municipal and hazardous waste facilities, and disposal at sanitary landfills and hazardous waste landfills. Modifications to the program (from Volume 1) are primarily for microcomputer compatibility and to provide information needed to evaluate the petitions. Default environmental and facility parameters are developed representing conservative assumptions about site selection and operational procedures. In particular, the parameters of the groundwater pathway model are modified to represent more conservative assumptions than the original model (Volume 1).

  9. Tourism Methodologies - New Perspectives, Practices and Procedures

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different... in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt... codings and analysis, and tapping into the global network of social media.

  10. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.
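    Volume 1's bootstrap-simulation approach, at its simplest, resamples the data with replacement to characterize variability in an estimated statistic. A minimal percentile-bootstrap sketch (the sample values and settings are invented for illustration; the report's two-dimensional treatment of variability and uncertainty is considerably more elaborate):

    ```python
    import random
    import statistics

    def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=42):
        """Percentile bootstrap: resample with replacement, recompute the
        statistic on each resample, and take empirical quantiles of the
        replicates as an approximate (1 - alpha) confidence interval."""
        rng = random.Random(seed)
        n = len(data)
        reps = sorted(stat([rng.choice(data) for _ in range(n)]) for _ in range(n_boot))
        lo = reps[int((alpha / 2) * n_boot)]
        hi = reps[int((1 - alpha / 2) * n_boot) - 1]
        return lo, hi

    # Invented measurements; any numeric sample works.
    sample = [2.1, 2.4, 1.9, 2.6, 2.2, 2.8, 2.0, 2.5]
    low, high = bootstrap_ci(sample)
    print(low, high)  # approximate 95% CI for the mean
    ```

    The same resampling loop works for any statistic (median, variance, regression coefficient) by swapping the `stat` argument.
    
    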

  11. Development of analysis methodology on turbulent thermal stripping

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Geun Jong; Jeon, Won Dae; Han, Jin Woo; Gu, Byong Kook [Changwon National University, Changwon(Korea)

    2001-03-01

    To develop an analysis methodology, the important governing factors of the thermal stripping phenomenon are identified as geometric configuration and flow characteristics such as velocity. Along these factors, the performance of turbulence models in the existing analysis methodology is evaluated against experimental data. The status of DNS application is also assessed based on the literature. The evaluation results are reflected in setting up the new analysis methodology. From the evaluation of the existing analysis methodology, the Full Reynolds Stress (FRS) model is identified as the best among the turbulence models considered, and LES is found to be able to provide time-dependent turbulence values. Further improvements in the near-wall region and the temperature variance equation are required for FRS, and implementation of new sub-grid scale models is required for LES. Through these improvements, a new, reliable analysis methodology for thermal stripping can be developed. 30 refs., 26 figs., 6 tabs. (Author)

  12. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. The applicability of the developed GOTHIC methodology is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of three compartments, including the reactor containment and the RWST. In addition, the containment spray system and the containment recirculation system are simulated. Under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC calculation shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC could provide reasonable containment pressure and temperature responses, considering the inherent conservatism of the CONTEMPT-LT code.

  13. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. The applicability of the developed GOTHIC methodology is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of three compartments, including the reactor containment and the RWST. In addition, the containment spray system and the containment recirculation system are simulated. Under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC calculation shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC could provide reasonable containment pressure and temperature responses, considering the inherent conservatism of the CONTEMPT-LT code

  14. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSAL OF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Full Text Available Content analysis (CA is a popular approach among researchers from different areas, but incipient in project management (PM. However, the volume of usage apparently does not translate into application quality. The method receives constant criticism about the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigate the use of CA in PM research. To accomplish this goal, literature systematic review is conjugated with CA related to 23 articles from EBSCO base in the last 20 years (1996 – 2016. The findings showed that the proposed framework can help researchers better apply the CA and suggests that the use of the method in terms of quantity and quality in PM research should be expanded. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM in the last 20 years.

  15. The Self-Concept. Volume 1, A Review of Methodological Considerations and Measuring Instruments. Revised Edition.

    Science.gov (United States)

    Wylie, Ruth C.

    This volume of the revised edition describes and evaluates measurement methods, research designs, and procedures which have been or might appropriately be used in self-concept research. Working from the perspective that self-concept or phenomenal personality theories can be scientifically investigated, methodological flaws and questionable…

  16. Diversion Path Analysis Handbook. Volume 1. Methodology

    International Nuclear Information System (INIS)

    Goodwin, K.E.; Schleter, J.C.; Maltese, M.D.K.

    1978-11-01

    Diversion Path Analysis (DPA) is a safeguards evaluation tool which is used to determine the vulnerability of the Material Control and Material Accounting (MC and MA) Subsystems to the threat of theft of Special Nuclear Material (SNM) by a knowledgeable Insider. The DPA team should consist of two individuals who have technical backgrounds. The implementation of DPA is divided into five basic steps: Information and Data Gathering, Process Characterization, Analysis of Diversion Paths, Results and Findings, and Documentation

  17. Bare-Hand Volume Cracker for Raw Volume Data Analysis

    Directory of Open Access Journals (Sweden)

    Bireswar Laha

    2016-09-01

    Analysis of raw volume data generated from different scanning technologies faces a variety of challenges, related to search, pattern recognition, spatial understanding, quantitative estimation, and shape description. In a previous study, we found that the Volume Cracker (VC) 3D interaction (3DI) technique mitigated some of these problems, but this result was from a tethered glove-based system with users analyzing simulated data. Here, we redesigned the VC by using untethered bare-hand interaction with real volume datasets, with a broader aim of adoption of this technique in research labs. We developed symmetric and asymmetric interfaces for the Bare-Hand Volume Cracker (BHVC) through design iterations with a biomechanics scientist. We evaluated our asymmetric BHVC technique against standard 2D and widely used 3D interaction techniques with experts analyzing scanned beetle datasets. We found that our BHVC design significantly outperformed the other two techniques. This study contributes a practical 3DI design for scientists, documents lessons learned while redesigning for bare-hand trackers, and provides evidence suggesting that 3D interaction could improve volume data analysis for a variety of visual analysis tasks. Our contribution is in the realm of 3D user interfaces tightly integrated with visualization, for improving the effectiveness of visual analysis of volume datasets. Based on our experience, we also provide some insights into hardware-agnostic principles for design of effective interaction techniques.

  18. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is carried out by means of analysis methodologies specified for the performance-related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation modes suited to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the control logic developed for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using a system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of PRDBEs chosen for each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. This report, in which the overall details of the SMART performance analysis are specified based on the current SMART design, can therefore be utilized as a guide for the detailed performance analysis

  19. A general methodology for three-dimensional analysis of variation in target volume delineation

    NARCIS (Netherlands)

    Remeijer, P.; Rasch, C.; Lebesque, J. V.; van Herk, M.

    1999-01-01

    A generic method for three-dimensional (3-D) evaluation of target volume delineation in multiple imaging modalities is presented. The evaluation includes geometrical and statistical methods to estimate observer differences and variability in defining the Gross Tumor Volume (GTV) in relation to the

  20. SLIM-MAUD: an approach to assessing human error probabilities using structured expert judgment. Volume II. Detailed analysis of the technical issues

    International Nuclear Information System (INIS)

    Embrey, D.E.; Humphreys, P.; Rosa, E.A.; Kirwan, B.; Rea, K.

    1984-07-01

    This two-volume report presents the procedures and analyses performed in developing an approach for structuring expert judgments to estimate human error probabilities. Volume I presents an overview of work performed in developing the approach: SLIM-MAUD (Success Likelihood Index Methodology, implemented through the use of an interactive computer program called MAUD-Multi-Attribute Utility Decomposition). Volume II provides a more detailed analysis of the technical issues underlying the approach

  1. Abnormal condition and events analysis for instrumentation and control systems. Volume 1: Methodology for nuclear power plant digital upgrades. Final report

    International Nuclear Information System (INIS)

    McKemy, S.; Marcelli, M.; Boehmer, N.; Crandall, D.

    1996-01-01

    The ACES project was initiated to identify a cost-effective methodology for addressing abnormal conditions and events (ACES) in digital upgrades to nuclear power plant systems, as introduced by IEEE Standard 7-4.3.2-1993. Several methodologies and techniques currently in use in the defense, aerospace, and other communities for the assurance of digital safety systems were surveyed, and although several were shown to possess desirable qualities, none sufficiently met the needs of the nuclear power industry. This report describes a tailorable methodology for performing ACES analysis that is based on the more desirable aspects of the reviewed methodologies and techniques. The methodology is applicable to both safety- and non-safety-grade systems, addresses hardware, software, and system-level concerns, and can be applied in either a lifecycle or post-design timeframe. Employing this methodology for safety systems should facilitate the digital upgrade licensing process

  2. Exploring Participatory Methodologies in Organizational Discourse Analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches such as single-level vs. multi-level, critical vs. participatory, and discursive vs. material methods. They raise methodological issues of combining such approaches to embrace multimodality in order to enable new contributions. Conceptual efforts have been made, but further exploration of methodological combinations and their practical implications is called for. This paper argues 1) to combine methodologies by approaching this as scholarly subjectification processes, and 2) to perform combinations in both...

  3. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
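    The integrated, event-sequenced Monte Carlo idea behind a code like TORMIS can be sketched as a chain of sampled events whose joint rate is estimated by simulation. The per-event probabilities below are invented for illustration and are not from the TORMIS models:

    ```python
    import random

    def estimate_hit_probability(n_trials=100_000, seed=1):
        """Monte Carlo sketch of an event-sequenced missile hazard:
        sample each event in the chain (missile injection, transport
        toward the target, impact within the target area) and count
        end-to-end successes. Probabilities are hypothetical."""
        rng = random.Random(seed)
        p_injection = 0.30   # missile becomes airborne in the tornado
        p_transport = 0.10   # carried toward the plant structure
        p_impact = 0.05      # strikes the target area
        hits = sum(
            1
            for _ in range(n_trials)
            if rng.random() < p_injection
            and rng.random() < p_transport
            and rng.random() < p_impact
        )
        return hits / n_trials

    p = estimate_hit_probability()
    print(p)  # close to 0.30 * 0.10 * 0.05 = 1.5e-3
    ```

    A real assessment replaces each fixed probability with a data-based model (tornado occurrence records, wind tunnel and impact-test results) sampled per trial, but the sequencing-and-counting structure is the same.
    
    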

  4. Constructive Analysis : A Study in Epistemological Methodology

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    The present study is concerned with the viability of the primary method in contemporary philosophy, i.e., conceptual analysis. Starting out by tracing the roots of this methodology to Platonic philosophy, the study questions whether such a methodology makes sense when divorced from Platonic philosophy..., and develops a framework for a kind of analysis that is more in keeping with recent psychological research on categorization. Finally, it is shown that this kind of analysis can be applied to the concept of justification in a manner that furthers the epistemological goal of providing intellectual guidance.

  5. Development of Advanced Non-LOCA Analysis Methodology for Licensing

    International Nuclear Information System (INIS)

    Jang, Chansu; Um, Kilsup; Choi, Jaedon

    2008-01-01

    KNF is developing a new design methodology for Non-LOCA analysis for licensing purposes. The chosen code is the best-estimate transient analysis code RETRAN, and the OPR1000 is the target plant. For this purpose, KNF prepared a simple nodal scheme appropriate to the licensing analyses and developed the designer-friendly analysis tool ASSIST (Automatic Steady-State Initialization and Safety analysis Tool). To check the validity of the newly developed methodology, the single CEA withdrawal and locked rotor accidents are analyzed using the new methodology and compared with current design results. The comparison shows good agreement, and it is concluded that the new design methodology can be applied to the licensing calculations for OPR1000 Non-LOCA

  6. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  7. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  8. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Horton, D.G.

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  9. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    D.G. Horton

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  10. Selection methodology for LWR safety programs and proposals. Volume 2. Methodology application

    International Nuclear Information System (INIS)

    Ritzman, R.L.; Husseiny, A.A.

    1980-08-01

    The results of work done to update and apply a methodology for selecting (prioritizing) LWR safety technology R and D programs are described. The methodology is based on multiattribute utility (MAU) theory. Application of the methodology to rank-order a group of specific R and D programs included development of a complete set of attribute utility functions, specification of individual attribute scaling constants, and refinement and use of an interactive computer program (MAUP) to process decision-maker inputs and generate overall (multiattribute) program utility values. The output results from several decision-makers are examined for consistency and conclusions and recommendations regarding general use of the methodology are presented. 3 figures, 18 tables
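    The additive form of a multiattribute utility (MAU) evaluation can be sketched as a scaled sum of single-attribute utilities. The attributes, scaling constants, and utility scores below are hypothetical; the report's interactive MAUP program elicits such inputs from decision-makers rather than hard-coding them:

    ```python
    def overall_utility(scaling_constants, utilities):
        """Additive multiattribute utility: weighted sum of single-attribute
        utilities, with scaling constants summing to 1."""
        assert abs(sum(scaling_constants) - 1.0) < 1e-9
        return sum(k * u for k, u in zip(scaling_constants, utilities))

    # Hypothetical ranking of two R&D programs over three attributes
    # (e.g., safety benefit, cost, feasibility), utilities on [0, 1].
    k = [0.5, 0.3, 0.2]
    program_a = [0.8, 0.4, 0.9]
    program_b = [0.6, 0.9, 0.7]
    print(overall_utility(k, program_a))  # about 0.70
    print(overall_utility(k, program_b))  # about 0.71
    ```

    Rank-ordering the programs by these overall utility values is the prioritization step the methodology automates.
    
    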

  11. Methodology to Forecast Volume and Cost of Cancer Drugs in Low- and Middle-Income Countries

    Directory of Open Access Journals (Sweden)

    Yehoda M. Martei

    2018-02-01

    Purpose: In low- and middle-income countries (LMICs), frequent outages of the stock of cancer drugs undermine cancer care delivery and are potentially fatal for patients with cancer. The aim of this study is to describe a methodologic approach to forecast chemotherapy volume and estimate cost that can be readily updated and applied in most LMICs. Methods: The prerequisite data for forecasting are population-based incidence data and cost estimates per unit of each drug to be ordered. We used the supplementary guidelines from the WHO list of essential medicines for cancer to predict treatment plans and ordering patterns. We used de-identified aggregate data from the Botswana National Cancer Registry to estimate incident cases. The WHO Management Sciences for Health International Price Indicator was used to estimate unit costs per drug. Results: The chemotherapy volume required for incident cancer cases was estimated as the product of the standardized dose required to complete a full treatment regimen per patient, with a given cancer diagnosis and stage, multiplied by the total number of incident cancer cases with the respective diagnosis. The estimated chemotherapy cost to treat the 10 most common cancers in the public health care sector of Botswana is approximately 2.3 million US dollars. An estimated 66% of the budget is allocated to the costs of rituximab and trastuzumab alone, which are used by approximately 10% of the cancer population. Conclusion: This method provides a reproducible approach to forecast chemotherapy volume and cost in LMICs. The volume and cost outputs of this methodology provide key stakeholders with valuable information that can guide budget estimation, resource allocation, and drug-price negotiations for cancer treatment. Ultimately, this will minimize drug shortages or outages and reduce the potential loss of life that results from an erratic drug supply.
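    The Results paragraph above reduces to simple arithmetic: required volume per drug is the full-regimen dose times the number of incident cases, and cost is volume times unit price. A minimal sketch with entirely hypothetical incidence counts, regimen doses, and unit costs (not Botswana registry figures):

    ```python
    def forecast_drug_budget(incidence, regimens, unit_costs):
        """Volume per drug = full-regimen dose per patient x incident cases,
        summed over diagnoses; cost per drug = volume x unit cost."""
        volume = {}
        cost = {}
        for cancer, cases in incidence.items():
            for drug, dose_per_patient in regimens[cancer].items():
                volume[drug] = volume.get(drug, 0) + dose_per_patient * cases
        for drug, units in volume.items():
            cost[drug] = units * unit_costs[drug]
        return volume, cost

    # Hypothetical inputs: incident cases, mg per full regimen, USD per mg.
    incidence = {"breast": 100, "nhl": 40}
    regimens = {
        "breast": {"doxorubicin": 480, "trastuzumab": 7560},
        "nhl": {"doxorubicin": 400, "rituximab": 4200},
    }
    unit_costs = {"doxorubicin": 0.05, "trastuzumab": 6.0, "rituximab": 5.0}

    volume, cost = forecast_drug_budget(incidence, regimens, unit_costs)
    print(cost["trastuzumab"])  # 4536000.0
    ```

    Summing the cost dictionary gives the total budget estimate; stratifying incidence by stage, as the study does, only adds another key to the loops.
    
    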

  12. Researching virtual worlds methodologies for studying emergent practices

    CERN Document Server

    Phillips, Louise

    2013-01-01

    This volume presents a wide range of methodological strategies that are designed to take into account the complex, emergent, and continually shifting character of virtual worlds. It interrogates how virtual worlds emerge as objects of study through the development and application of various methodological strategies. Virtual worlds are not considered objects that exist as entities with fixed attributes independent of our continuous engagement with them and interpretation of them. Instead, they are conceived of as complex ensembles of technology, humans, symbols, discourses, and economic structures, ensembles that emerge in ongoing practices and specific situations. A broad spectrum of perspectives and methodologies is presented: Actor-Network-Theory and post-Actor-Network-Theory, performativity theory, ethnography, discourse analysis, Sense-Making Methodology, visual ethnography, multi-sited ethnography, and Social Network Analysis.

  13. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    This theoretical essay aims to reflect on three models of text interpretation used in qualitative research that are often confused in their concepts and methodologies: Content Analysis, Discourse Analysis, and Conversation Analysis. After presenting the concepts, the essay proposes a preliminary discussion of the conceptual and theoretical-methodological differences perceived between them. A review of the literature was performed to support this discussion. It could be verified that the models differ in the type of strategy used in the treatment of texts, the type of approach, and the appropriate theoretical position.

  14. Probabilistic methodology for turbine missile risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Frank, R.A.

    1984-01-01

    A methodology has been developed for estimation of the probabilities of turbine-generated missile damage to nuclear power plant structures and systems. Mathematical models of the missile generation, transport, and impact events have been developed and sequenced to form an integrated turbine missile simulation methodology. Probabilistic Monte Carlo techniques are used to estimate the plant impact and damage probabilities. The methodology has been coded in the TURMIS computer code to facilitate numerical analysis and plant-specific turbine missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and probabilities have been estimated for a hypothetical nuclear power plant case study. (orig.)

  15. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has recently been developed and adopted for small break LOCA EEQ. This methodology is extended to the M/E release analysis for containment design under large break LOCA and main steam line break (MSLB) accidents, and is named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K, with an enhanced M/E model and a LOCA long-term model, to CONTEMPT4/MOD5. The KIMERA methodology is applied to the MSLB M/E release analysis to evaluate its validity for MSLB in containment design. The results are compared with the OPR1000 FSAR

  16. Formation of the methodological matrix of the strategic analysis of the enterprise

    Directory of Open Access Journals (Sweden)

    N.H. Vygovskaya

    2018-04-01

    Full Text Available The article is devoted to the study of the methodological matrix of the strategic analysis of the enterprise. The aim of the article is to analyze the influence of methodological changes in the 20th century on the methodology of strategic analysis, and to critically assess and generalize scientific approaches to its methods. Evaluation of scientific works on analysis made it possible to identify the following problems in the methodology of strategic analysis: failure to account for the features of strategic analysis when forming its methods, which often leads to confusion with the methods of financial (economic) analysis; failure to use the fact that strategic analysis contains, besides methods for analyzing the internal and external environment, methods of forecast analysis aimed at forming the strategy for the development of the enterprise; conflation of the concepts «image», «reception» and «method» of analysis; multidirectional and indistinct classification criteria for the methods of strategic analysis; and blind copying of foreign techniques and methods of strategic analysis without taking into account the specifics of domestic economic conditions. The expediency of using the system approach in forming the methodological design of strategic analysis is proved, which allows the methodology as a science of methods (a broad approach to the methods of strategic analysis) to be combined with methodology as a set of applied methods and techniques of analysis (a narrow approach to methodology). The system approach made it possible to distinguish three levels of the methodology of strategic analysis. The first and second levels of methodology correspond to science, the third to practice. When developing the third level of special methods of strategic analysis, an approach is applied that differentiates them depending on the stages of strategic analysis (methods of the stage

  17. Taipower's transient analysis methodology for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, Pinghue

    1998-01-01

    The methodology presented in this paper is part of 'Taipower's Reload Design and Transient Analysis Methodologies for Light Water Reactors' developed by the Taiwan Power Company (TPC) and the Institute of Nuclear Energy Research. This methodology utilizes four computer codes developed or sponsored by the Electric Power Research Institute: the system transient analysis code RETRAN-02, the core thermal-hydraulic analysis code COBRAIIIC, the three-dimensional spatial kinetics code ARROTTA, and the fuel rod evaluation code FREY. Each of the computer codes was extensively validated. Analysis methods and modeling techniques were conservatively established for each application through systematic evaluation with the assistance of sensitivity studies. The qualification results and analysis methods were documented in detail in TPC topical reports. The topical reports for COBRAIIIC, ARROTTA, and FREY have been reviewed and approved by the Atomic Energy Council (AEC). TPC's in-house transient methodology has been successfully applied to provide valuable support for many operational issues and plant improvements for TPC's Maanshan Units 1 and 2. Major applications include the removal of the resistance temperature detector bypass system, the relaxation of the hot-full-power moderator temperature coefficient design criteria imposed by the ROCAEC due to a concern about Anticipated Transient Without Scram, the reduction of boron injection tank concentration and the elimination of the heat tracing, and the reduction of reactor coolant system flow. (author)

  18. A meta-analysis of randomized controlled trials of low-volume polyethylene glycol plus ascorbic acid versus standard-volume polyethylene glycol solution as bowel preparations for colonoscopy.

    Directory of Open Access Journals (Sweden)

    Qingsong Xie

    Full Text Available BACKGROUND: Standard-volume polyethylene glycol (PEG) gut lavage solutions are safe and effective, but they require the consumption of large volumes of fluid. A new lower-volume solution of PEG plus ascorbic acid has recently been used as a preparation for colonoscopy. AIM: A meta-analysis was performed to compare the performance of low-volume PEG plus ascorbic acid with standard-volume PEG as bowel preparation for colonoscopy. STUDY: Electronic and manual searches were performed to identify randomized controlled trials (RCTs) that compared the performance of low-volume PEG plus ascorbic acid with standard-volume PEG as bowel preparation for colonoscopy. After a methodological quality assessment and data extraction, the pooled estimates of bowel preparation efficacy during bowel cleansing, compliance with preparation, willingness to repeat the same preparation, and the side effects were calculated. We calculated pooled estimates of odds ratios (OR) by fixed- and/or random-effects models. We also assessed heterogeneity among studies and publication bias. RESULTS: Eleven RCTs were identified for analysis. The pooled ORs for preparation efficacy during bowel cleansing and for compliance with preparation for low-volume PEG plus ascorbic acid were 1.08 (95% CI = 0.98-1.28, P = 0.34) and 2.23 (95% CI = 1.67-2.98, P<0.00001), respectively, compared with those for standard-volume PEG. The side effects of vomiting and nausea for low-volume PEG plus ascorbic acid were reduced relative to standard-volume PEG. There was no significant publication bias, according to a funnel plot. CONCLUSIONS: Low-volume PEG plus ascorbic acid gut lavage achieved non-inferior efficacy for bowel cleansing, is more acceptable to patients, and has fewer side effects than standard-volume PEG as a bowel preparation method for colonoscopy.
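
The pooled odds ratios reported above come from standard fixed- and random-effects pooling; a minimal fixed-effect (inverse-variance) sketch on 2x2 trial counts, using made-up example data rather than the trials in this meta-analysis, might look like:

```python
import math

def pooled_or_fixed(studies):
    """Inverse-variance fixed-effect pooled odds ratio.

    studies: iterable of (events_trt, n_trt, events_ctl, n_ctl) 2x2 counts.
    """
    num = den = 0.0
    for ev_t, n_t, ev_c, n_c in studies:
        a, b = ev_t, n_t - ev_t                   # treatment events / non-events
        c, d = ev_c, n_c - ev_c                   # control events / non-events
        log_or = math.log((a * d) / (b * c))
        weight = 1.0 / (1/a + 1/b + 1/c + 1/d)    # inverse variance of log OR
        num += weight * log_or
        den += weight
    return math.exp(num / den)

# invented counts for two hypothetical trials
pooled = pooled_or_fixed([(10, 100, 5, 100), (20, 200, 10, 200)])
```

Each study's log odds ratio is weighted by the inverse of its variance, so larger, more precise trials dominate the pooled estimate.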

  19. Compliance strategy for statistically based neutron overpower protection safety analysis methodology

    International Nuclear Information System (INIS)

    Holliday, E.; Phan, B.; Nainer, O.

    2009-01-01

    The methodology employed in the safety analysis of the slow Loss of Regulation (LOR) event in the OPG and Bruce Power CANDU reactors, referred to as Neutron Overpower Protection (NOP) analysis, is a statistically based methodology. Further enhancement to this methodology includes the use of Extreme Value Statistics (EVS) for the explicit treatment of aleatory and epistemic uncertainties, and probabilistic weighting of the initial core states. A key aspect of this enhanced NOP methodology is to demonstrate adherence, or compliance, with the analysis basis. This paper outlines a compliance strategy capable of accounting for the statistical nature of the enhanced NOP methodology. (author)

  20. Probabilistic safety analysis procedures guide, Sections 8-12. Volume 2, Rev. 1

    International Nuclear Information System (INIS)

    McCann, M.; Reed, J.; Ruger, C.; Shiu, K.; Teichmann, T.; Unione, A.; Youngblood, R.

    1985-08-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. It will be revised as comments are received, and as experience is gained from its use. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. The first volume of the guide describes the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant (i.e., intrinsic to plant operation) and from loss of off-site electric power. The scope includes human reliability analysis, a determination of the importance of various core damage accident sequences, and an explicit treatment and display of uncertainties for key accident sequences. This second volume deals with the treatment of the so-called external events including seismic disturbances, fires, floods, etc. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance). This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are valuable for regulatory decision making. For internal events, methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study. For external events, more explicit guidance is given

  1. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics applications, this methodology should be applied recursively to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details of each step are given using simple examples. Numerical results on large-scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
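
Step (2), parameter screening, can be sketched with a simple one-at-a-time perturbation study (a much cruder stand-in for the intelligent experiment designs in PSUADE); the toy model and perturbation sizes below are invented:

```python
def screen_parameters(model, nominal, deltas):
    """One-at-a-time screening: perturb each input from its nominal value
    and rank inputs by the magnitude of the resulting output change."""
    base = model(nominal)
    effects = {}
    for name, delta in deltas.items():
        perturbed = dict(nominal)
        perturbed[name] += delta
        effects[name] = abs(model(perturbed) - base)
    return sorted(effects.items(), key=lambda kv: kv[1], reverse=True)

# toy model whose output is dominated by x1
model = lambda p: 10.0 * p["x1"] + 0.1 * p["x2"]
ranking = screen_parameters(model, {"x1": 1.0, "x2": 1.0}, {"x1": 0.1, "x2": 0.1})
```

The ranked list identifies the parameters whose uncertainty bounds deserve further effort, in the spirit of step (3) above.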

  2. Simplified methodology for Angra 1 containment analysis

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1991-08-01

    A simplified analysis methodology was developed to simulate a large-break loss-of-coolant accident at the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The results were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, together with its small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  3. Time series analysis of brain regional volume by MR image

    International Nuclear Information System (INIS)

    Tanaka, Mika; Tarusawa, Ayaka; Nihei, Mitsuyo; Fukami, Tadanori; Yuasa, Tetsuya; Wu, Jin; Ishiwata, Kiichi; Ishii, Kenji

    2010-01-01

    The present study proposed a methodology for time series analysis of the volumes of the frontal, parietal, temporal and occipital lobes and the cerebellum, because volumetric reports along the course of individual aging have scarcely been presented. The subjects analyzed were brain images of 20 healthy adults (2 males and 18 females, average age 69.0 y), for whom T1-weighted 3D SPGR (spoiled gradient recalled in the steady state) acquisitions with a GE SIGNA EXCITE HD 1.5T machine were conducted 4 times over a period of 42-50 months. The image size was 256 x 256 x (86-124) voxels with a digitization level of 16 bits. As templates for the regions, the standard gray matter atlas (icbn452_atlas_probability_gray) and its labeled counterpart (icbn.Labels), provided by the UCLA Laboratory of Neuro Imaging, were used for standardization of individual brains. Segmentation, normalization and coregistration were performed with the MR imaging software SPM8 (Statistical Parametric Mapping 8). Regional volumes were calculated as the ratio of their voxels to the whole-brain voxels, in percent. It was found that the regional volumes decreased with aging in all of the lobes examined and the cerebellum, at average rates of -0.11, -0.07, -0.04, -0.02, and -0.03 percent per year, respectively. The procedure for calculating the regional volumes, which has hitherto been operated manually, can be conducted automatically for an individual brain using the standard atlases above. (T.T.)
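
The volume measure used above (region voxels as a percentage of whole-brain voxels) and the per-year rate of change can be reproduced with a least-squares slope; a small sketch with made-up numbers:

```python
def regional_percent(region_voxels, brain_voxels):
    """Regional volume as a percentage of whole-brain voxel count."""
    return 100.0 * region_voxels / brain_voxels

def percent_per_year(percents, months):
    """Rate of change (percentage points per year) of a volume time series,
    from the least-squares slope against time in years."""
    years = [m / 12.0 for m in months]
    n = len(years)
    mx = sum(years) / n
    my = sum(percents) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, percents))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

# invented series declining by 0.11 percentage points per year over 36 months
slope = percent_per_year([10.0, 9.89, 9.78, 9.67], [0, 12, 24, 36])
```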

  4. The methodology of semantic analysis for extracting physical effects

    Science.gov (United States)

    Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.

    2017-01-01

    The paper presents a new methodology of semantic analysis for extracting physical effects. The methodology is based on the Tuzov ontology, which formally describes the Russian language. Semantic patterns are described for extracting structural physical information in the form of physical effects, and a new text analysis algorithm is described.

  5. Recurrence interval analysis of trading volumes.

    Science.gov (United States)

    Ren, Fei; Zhou, Wei-Xing

    2010-06-01

    We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
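
The recurrence intervals τ studied above are simply the gaps between successive threshold exceedances; a minimal sketch of their extraction from a volume series:

```python
def recurrence_intervals(series, q):
    """Intervals (in sampling steps) between successive values exceeding q."""
    exceed = [i for i, v in enumerate(series) if v > q]
    return [j - i for i, j in zip(exceed, exceed[1:])]

# toy series: exceedances of q=4 occur at indices 1, 3 and 6
taus = recurrence_intervals([1, 5, 2, 6, 1, 1, 7], q=4)
```

The distributional and memory analyses in the abstract (power-law tail fits, conditional distributions, detrended fluctuation analysis) would then be applied to the resulting interval sequence.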

  6. Development of seismic risk analysis methodologies at JAERI

    International Nuclear Information System (INIS)

    Tanaka, T.; Abe, K.; Ebisawa, K.; Oikawa, T.

    1988-01-01

    The usefulness of probabilistic safety assessment (PSA) is recognized worldwide for the balanced design and regulation of nuclear power plants. In Japan, the Japan Atomic Energy Research Institute (JAERI) has been engaged in developing the methodologies necessary for carrying out PSA. The research and development program was started in 1980, initially covering only internal-initiator PSA. In 1985 the program was expanded to include external event analysis. Although the expanded program is to cover various external initiators, the current effort is dedicated to seismic risk analysis. There are three levels of seismic PSA, analogous to internal-initiator PSA: Level 1, evaluation of core damage frequency; Level 2, evaluation of radioactive release frequency and source terms; and Level 3, evaluation of environmental consequences. In the JAERI program, only the methodologies for Level 1 seismic PSA are under development. The methodology development for seismic risk analysis is divided into two phases. The Phase I study is to establish a complete set of simple methodologies based on currently available data. In Phase II, a sensitivity study will be carried out to identify the parameters whose uncertainty may result in large uncertainty in seismic risk, and for such parameters the methodology will be upgraded. The Phase I study has now almost been completed. In this report, outlines of the study and some of its outcomes are described

  7. Safety assessment methodologies for near surface disposal facilities. Results of a co-ordinated research project (ISAM). Volume 1: Review and enhancement of safety assessment approaches and tools. Volume 2: Test cases

    International Nuclear Information System (INIS)

    2004-07-01

    the Safety Guide on 'Safety Assessment for Near Surface Disposal of Radioactive Waste' (Safety Standards Series No. WS-G- 1.1). The report of this CRP is presented in two volumes; Volume 1 contains a summary and a complete description of the ISAM project methodology and Volume 2 presents the application of the methodology to three hypothetical test cases

  8. Disposal criticality analysis methodology for fissile waste forms

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1998-03-01

    A general methodology has been developed to evaluate the criticality potential of the wide range of waste forms planned for geologic disposal. The range of waste forms includes commercial spent fuel, high-level waste, DOE spent fuel (including highly enriched), MOX using weapons-grade plutonium, and immobilized plutonium. The disposal of these waste forms will be in a container with sufficiently thick corrosion-resistant barriers to prevent water penetration for up to 10,000 years. The criticality control for DOE spent fuel is primarily provided by neutron absorber material incorporated into the basket holding the individual assemblies. For the immobilized plutonium, the neutron absorber material is incorporated into the waste form itself. The disposal criticality analysis methodology includes the analysis of geochemical and physical processes that can breach the waste package and affect the waste forms within. The basic purpose of the methodology is to guide the criticality control features of the waste package design, and to demonstrate that the final design meets the criticality control licensing requirements. The methodology can also be extended to the analysis of criticality consequences (primarily increased radionuclide inventory), which will support the total performance assessment for the repository

  9. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants; the GO methodology is one of them. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu, in order to examine its applicability to piping systems. Through this analysis, the authors identified some disadvantages of the GO methodology. In the GO methodology, a signal is either on-to-off or off-to-on; the methodology therefore finds the time point at which the state of a system changes, and cannot treat a system whose state changes as off-on-off. In addition, several computer runs are required to obtain the time-dependent failure probability of a system. To overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those in the GO methodology, but the meaning of signal and time point, and the definitions of operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given

  10. A Review of Citation Analysis Methodologies for Collection Management

    Science.gov (United States)

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  11. MPCV Exercise Operational Volume Analysis

    Science.gov (United States)

    Godfrey, A.; Humphreys, B.; Funk, J.; Perusek, G.; Lewandowski, B. E.

    2017-01-01

    In order to minimize the loss of bone and muscle mass during spaceflight, the Multi-purpose Crew Vehicle (MPCV) will include an exercise device and enough free space within the cabin for astronauts to use the device effectively. The NASA Digital Astronaut Project (DAP) has been tasked with using computational modeling to help determine whether the available operational volume is sufficient for in-flight exercise. Motion capture data were acquired using a 12-camera Smart DX system (BTS Bioengineering, Brooklyn, NY) while exercisers performed 9 resistive exercises without volume restrictions in a 1g environment. Data were collected from two male subjects, one in the 99th percentile of height and the other in the 50th percentile, using between 25 and 60 motion capture markers. Motion capture data were also recorded as a third subject, also near the 50th percentile in height, performed aerobic rowing during a parabolic flight. A motion capture system and algorithms developed previously and presented at last year's HRP-IWS were utilized to collect and process the data from the parabolic flight [1]. These motions were applied to a scaled version of a biomechanical model within the biomechanical modeling software OpenSim [2], and the volume sweeps of the motions were visually assessed against an imported CAD model of the operational volume. Further numerical analysis was performed using Matlab (Mathworks, Natick, MA) and the OpenSim API. This analysis determined the location of every marker in space over the duration of the exercise motion, and the distance of each marker to the nearest surface of the volume. Containment of the exercise motions within the operational volume was determined on a per-exercise and per-subject basis. The orientation of the exerciser and the angle of the footplate were two important factors upon which containment depended. Regions where the exercise motion exceeds the bounds of the operational volume have been
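
The per-marker containment check described above can be sketched against an axis-aligned box standing in for the CAD volume (the actual analysis used OpenSim and Matlab against the real cabin geometry; the box and marker coordinates below are invented):

```python
def signed_distance_to_box(point, lo, hi):
    """Distance from a point to the nearest surface of an axis-aligned box:
    positive inside, negative outside."""
    if all(l <= p <= h for p, l, h in zip(point, lo, hi)):
        return min(min(p - l, h - p) for p, l, h in zip(point, lo, hi))
    # outside: negated Euclidean distance to the box surface
    d2 = sum(max(l - p, 0.0, p - h) ** 2 for p, l, h in zip(point, lo, hi))
    return -d2 ** 0.5

def motion_contained(marker_positions, lo, hi):
    """True when every recorded marker position stays inside the volume."""
    return all(signed_distance_to_box(m, lo, hi) >= 0.0 for m in marker_positions)
```

Evaluating the signed distance for every marker at every frame gives both the containment verdict and the margin (or overshoot) reported per exercise.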

  12. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application to lending operations in the Republic of Serbia. With a developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of a loan applicant, carried out in order to minimize and manage credit risk. The paper then describes the processing of the credit application and the procedure for analyzing financial statements in order to gain insight into the borrower's creditworthiness. The second part of the paper presents the theoretical and methodological framework as applied in a concrete company. The third part presents models that banks should use to protect themselves against risk exposure, i.e. to reduce losses on lending operations in our country and to adjust to market conditions in an optimal way.

  13. Low Tidal Volume versus Non-Volume-Limited Strategies for Patients with Acute Respiratory Distress Syndrome. A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Walkey, Allan J; Goligher, Ewan C; Del Sorbo, Lorenzo; Hodgson, Carol L; Adhikari, Neill K J; Wunsch, Hannah; Meade, Maureen O; Uleryk, Elizabeth; Hess, Dean; Talmor, Daniel S; Thompson, B Taylor; Brower, Roy G; Fan, Eddy

    2017-10-01

    Trials investigating use of lower tidal volumes and inspiratory pressures for patients with acute respiratory distress syndrome (ARDS) have shown mixed results. To compare clinical outcomes of mechanical ventilation strategies that limit tidal volumes and inspiratory pressures (LTV) to strategies with tidal volumes of 10 to 15 ml/kg among patients with ARDS. This is a systematic review and meta-analysis of clinical trials investigating LTV mechanical ventilation strategies. We used random effects models to evaluate the effect of LTV on 28-day mortality, organ failure, ventilator-free days, barotrauma, oxygenation, and ventilation. Our primary analysis excluded trials for which the LTV strategy was combined with the additional strategy of higher positive end-expiratory pressure (PEEP), but these trials were included in a stratified sensitivity analysis. We performed metaregression of tidal volume gradient achieved between intervention and control groups on mortality effect estimates. We used Grading of Recommendations Assessment, Development, and Evaluation methodology to determine the quality of evidence. Seven randomized trials involving 1,481 patients met eligibility criteria for this review. Mortality was not significantly lower for patients receiving an LTV strategy (33.6%) as compared with control strategies (40.4%) (relative risk [RR], 0.87; 95% confidence interval [CI], 0.70-1.08; heterogeneity statistic I² = 46%), nor did an LTV strategy significantly decrease barotrauma or ventilator-free days when compared with a lower PEEP strategy. Quality of evidence for clinical outcomes was downgraded for imprecision. Metaregression showed a significant inverse association between larger tidal volume gradient between LTV and control groups and log odds ratios for mortality (β = -0.1587; P = 0.0022). Sensitivity analysis including trials that protocolized an LTV/high PEEP cointervention showed lower mortality associated with LTV (nine trials and 1
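
Random-effects pooling of the kind used above is conventionally done with the DerSimonian-Laird estimator; a compact sketch on log effect sizes (the inputs below are illustrative, not the trial data):

```python
def dersimonian_laird(log_effects, variances):
    """Random-effects pooling (DerSimonian-Laird): estimate the between-study
    variance tau^2, then combine studies with adjusted inverse-variance weights."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * y for wi, y in zip(w, log_effects)) / sw
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_effects))   # Cochran's Q
    df = len(log_effects) - 1
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)                                     # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_effects)) / sum(w_star)
    se = (1.0 / sum(w_star)) ** 0.5
    return pooled, se

# three invented homogeneous studies: tau^2 collapses to zero
pooled, se = dersimonian_laird([0.1, 0.1, 0.1], [0.04, 0.04, 0.04])
```

When heterogeneity (Q) exceeds its degrees of freedom, tau² inflates the study variances and pulls the weights toward equality, widening the pooled confidence interval.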

  14. The analysis of RWAP(Rod Withdrawal at Power) using the KEPRI methodology

    International Nuclear Information System (INIS)

    Yang, C. K.; Kim, Y. H.

    2001-01-01

    KEPRI developed a new methodology based on RASP (Reactor Analysis Support Package). In this paper, the analysis of the RWAP (Rod Withdrawal at Power) accident, which can result in reactivity and power distribution anomalies, was performed using the KEPRI methodology. The calculation describes the RWAP transient and documents the analysis, including the computer code modeling assumptions and input parameters used in the analysis. To validate the new methodology, the results of the calculation were compared with the FSAR. The results obtained using the KEPRI methodology are similar to the FSAR's, and the results of the sensitivity study of postulated parameters were similar to those of the existing methodology

  15. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume II. Software description and listings

    International Nuclear Information System (INIS)

    Ruhter, W.D.

    1984-05-01

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for the spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings

  16. Comparative analysis of proliferation resistance assessment methodologies

    International Nuclear Information System (INIS)

    Takaki, Naoyuki; Kikuchi, Masahiro; Inoue, Naoko; Osabe, Takeshi

    2005-01-01

    Comparative analysis of the methodologies was performed based on the discussions at the international workshop 'Assessment Methodology of Proliferation Resistance for Future Nuclear Energy Systems' held in Tokyo in March 2005. Through the workshop and subsequent considerations, it was clarified that proliferation resistance assessment methodologies are affected by the broader nuclear options being pursued and also by the political situation of the state. Even the definition of proliferation resistance, despite the commonality of fundamental issues, derives from the perceived threat and the implementation circumstances inherent to the larger programs. A deeper recognition of these differences among communities would help produce more substantive and harmonized discussion. (author)

  17. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    Fujii-e, Y.; Kozawa, Y.; Namba, C.

    1987-03-01

    Fusion systems, under development as future energy systems, have reached a stage at which break-even is expected to be achieved in the near future. It is desirable to demonstrate that fusion systems are well acceptable to the societal environment. There are three crucial viewpoints for measuring acceptability: technological feasibility, economy and safety. These three points are closely interrelated. The safety problem has become more important since the three large tokamaks, JET, TFTR and JT-60, started experiments, and tritium will be introduced into some of them as fusion fuel. It is desirable to establish a methodology for resolving safety-related issues in harmony with the technological evolution. The fusion system that will lead toward reactors is not yet settled. The objective of this study is to develop an adequate methodology that promotes the safety design of general fusion systems and to present a basis for proposing R and D themes and establishing the database. A framework of the methodology, the understanding and modeling of fusion systems, the principle of ensuring safety, safety analysis based on function, and the application of the methodology are discussed. As a result of this study, a methodology for the safety analysis and evaluation of fusion systems was developed. New ideas and approaches were presented in the course of the methodology development. (Kako, I.)

  18. Nondestructive assay methodologies in nuclear forensics analysis

    International Nuclear Information System (INIS)

    Tomar, B.S.

    2016-01-01

    In the present chapter, the nondestructive assay (NDA) methodologies used for the analysis of nuclear materials as part of a nuclear forensic investigation are described. These NDA methodologies are based on (i) measurement of the passive gamma rays and neutrons emitted by the radioisotopes present in the nuclear materials, and (ii) measurement of the gamma rays and neutrons emitted after active interrogation of the nuclear materials with a source of X-rays, gamma rays or neutrons

  19. Methodology for Mode Selection in Corridor Analysis of Freight Transportation

    OpenAIRE

    Kanafani, Adib

    1984-01-01

    The purpose of this report is to outline a methodology for the analysis of mode selection in freight transportation. This methodology is intended to be part of transportation corridor analysis, a component of demand analysis that is part of a national transportation process. The methodological framework presented here provides a basis on which specific models and calculation procedures might be developed. It also provides a basis for the development of a data management system suitable for co...

  20. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The author addresses the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information

  1. Supplement to the Disposal Criticality Analysis Methodology

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The methodology for evaluating the criticality potential of high-level radioactive waste and spent nuclear fuel after the repository is sealed and permanently closed is described in the Disposal Criticality Analysis Methodology Topical Report (DOE 1998b). The topical report provides a process for validating the various models contained in the methodology and states that validation will be performed to support the License Application. The Supplement to the Disposal Criticality Analysis Methodology summarizes the data and analyses that will be used to validate these models and that will be included in the model validation reports; it also summarizes the process that will be followed in developing those reports. The reports will satisfy commitments made in the topical report, and thus support the use of the methodology for Site Recommendation and License Application. It is concluded that this report meets its objective of presenting additional information, along with supporting references, that can be used both in validation reports and in answering requests for additional information received from the Nuclear Regulatory Commission concerning the topical report. The data and analyses summarized here are not sufficient to complete a validation report, but they will provide a basis for several of the validation reports. Data from several references in this report have been identified with TBV-1349. Release of the TBV governing these data is required prior to their use in quality-affecting activities and in analyses affecting procurement, construction, or fabrication. Subsequent to the initiation of TBV-1349, DOE issued a concurrence letter (Mellington 1999) approving the request to identify information taken from the references specified in Section 1.4 as accepted data

  2. Risk analysis methodologies for the transportation of radioactive materials

    International Nuclear Information System (INIS)

    Geffen, C.A.

    1983-05-01

    Different methodologies have evolved for consideration of each of the many steps required in performing a transportation risk analysis. Although there are techniques that attempt to consider the entire scope of the analysis in depth, most applications of risk assessment to the transportation of nuclear fuel cycle materials develop specific methodologies for only one or two parts of the analysis. The remaining steps are simplified for the analyst by narrowing the scope of the effort (such as evaluating risks for only one material, or a particular set of accident scenarios, or movement over a specific route); performing a qualitative rather than a quantitative analysis (probabilities may be simply ranked as high, medium or low, for instance); or assuming some generic, conservative conditions for potential release fractions and consequences. This paper presents a discussion of the history and present state-of-the-art of transportation risk analysis methodologies. Many reports in this area were reviewed as background for this presentation. The literature review, while not exhaustive, did result in a complete representation of the major methods used today in transportation risk analysis. These methodologies primarily include the use of severity categories based on historical accident data, the analysis of specifically assumed accident sequences for the transportation activity of interest, and the use of fault or event tree analysis. Although the focus of this work has generally been on potential impacts to public groups, some effort has been expended in the estimation of risks to occupational groups in transportation activities

  3. Intravascular volume in cirrhosis. Reassessment using improved methodology

    International Nuclear Information System (INIS)

    Rector, W.G. Jr.; Ibarra, F.

    1988-01-01

    Previous studies of blood volume (BV) in cirrhosis have variously failed to adjust BV properly for body size; determined plasma volume from the dilution of labeled albumin 10-20 min postinjection, when some extravascular redistribution has already occurred; and/or not used the correct whole body-peripheral hematocrit ratio (0.82) in calculating whole BV from plasma volume and the peripheral hematocrit. We measured BV with attention to these considerations in 19 patients with cirrhosis and reexamined the determinants of vascular volume and the relationship between vascular volume and sodium retention. BV was calculated as plasma volume (determined from the extrapolated plasma activity of intravenously injected [131I]albumin at time 0) divided by one minus the whole-body hematocrit (peripheral hematocrit × 0.82). The result was expressed per kilogram dry body weight, determined by subtracting the mass of ascites (measured by isotope dilution; 1 liter = 1 kg) from the actual body weight of nonedematous patients. Measured and expressed in this way, BV correlated strongly with esophageal variceal size (r = 0.87, P less than 0.05), although not with net portal, right atrial, inferior vena caval, or arterial pressure, and was significantly greater in patients with sodium retention than in patients without sodium retention. The principal modifier of vascular volume in cirrhosis is vascular capacity, which is probably mainly determined by the extent of the portasystemic collateral circulation. Increased vascular volume in patients with sodium retention, as compared to patients without sodium retention, supports the overflow theory of ascites formation
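
    The calculation described in this record can be sketched numerically. The sketch below assumes the standard relation BV = plasma volume / (1 - whole-body hematocrit) with the 0.82 whole body-peripheral hematocrit ratio; all patient values are hypothetical, not data from the study.

```python
# Illustrative sketch of the blood-volume calculation described above.
# Whole-body hematocrit = 0.82 * peripheral hematocrit; whole blood volume is
# plasma volume / (1 - whole-body hematocrit), normalized to dry body weight
# (actual weight minus the measured mass of ascites, 1 liter = 1 kg).
# All numbers below are hypothetical.

def blood_volume_ml_per_kg(plasma_volume_ml, peripheral_hct,
                           body_weight_kg, ascites_liters):
    whole_body_hct = 0.82 * peripheral_hct
    whole_bv_ml = plasma_volume_ml / (1.0 - whole_body_hct)
    dry_weight_kg = body_weight_kg - ascites_liters  # 1 L ascites ~ 1 kg
    return whole_bv_ml / dry_weight_kg

# Hypothetical patient: 3000 ml plasma volume, peripheral Hct 0.40,
# 70 kg weighed with 5 L of ascites.
bv = blood_volume_ml_per_kg(3000.0, 0.40, 70.0, 5.0)
print(round(bv, 1))  # ml blood per kg dry body weight
```
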

  4. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  5. Disposal criticality analysis methodology's principal isotope burnup credit

    International Nuclear Information System (INIS)

    Doering, T.W.; Thomas, D.A.

    2001-01-01

    This paper presents the burnup credit aspects of the United States Department of Energy Yucca Mountain Project's methodology for performing criticality analyses for commercial light-water-reactor fuel. The disposal burnup credit methodology uses a 'principal isotope' model, which takes credit for the reduced reactivity associated with the build-up of the principal actinides and fission products in irradiated fuel. Burnup credit is important to the disposal criticality analysis methodology and to the design of commercial fuel waste packages. The burnup credit methodology developed for disposal of irradiated commercial nuclear fuel can also be applied to its storage and transportation. For all applications, a series of loading curves is developed using a best-estimate methodology; depending on the application, an additional administrative safety margin may be applied. The burnup credit methodology better represents the 'true' reactivity of the irradiated fuel configuration, and hence the real safety margin, than do evaluations using the 'fresh fuel' assumption. (author)
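
    A loading curve of the kind mentioned in this record can be applied as a simple interpolation-and-margin check. The sketch below is a minimal illustration; the curve points, function names and margin value are hypothetical placeholders, not values from the methodology.

```python
# Hypothetical burnup-credit loading-curve check: a loading curve gives the
# minimum assembly-average burnup (GWd/MTU) required for acceptance as a
# function of initial enrichment (wt% U-235). Curve points are illustrative.
import bisect

ENRICHMENT = [2.0, 3.0, 4.0, 5.0]      # wt% U-235
MIN_BURNUP = [0.0, 10.0, 25.0, 40.0]   # GWd/MTU required at each enrichment

def required_burnup(enrichment):
    """Linear interpolation on the loading curve."""
    if enrichment <= ENRICHMENT[0]:
        return MIN_BURNUP[0]
    if enrichment >= ENRICHMENT[-1]:
        return MIN_BURNUP[-1]
    i = bisect.bisect_right(ENRICHMENT, enrichment)
    x0, x1 = ENRICHMENT[i - 1], ENRICHMENT[i]
    y0, y1 = MIN_BURNUP[i - 1], MIN_BURNUP[i]
    return y0 + (y1 - y0) * (enrichment - x0) / (x1 - x0)

def acceptable(enrichment, burnup, margin=0.0):
    """True if assembly burnup meets the curve plus any administrative margin."""
    return burnup >= required_burnup(enrichment) + margin

print(acceptable(3.5, 20.0))  # 3.5 wt% needs 17.5 GWd/MTU here -> True
```
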

  6. Assessment of historical leak model methodology as applied to the REDOX high-level waste tank SX-108

    International Nuclear Information System (INIS)

    JONES, T.E.

    1999-01-01

    Using the Historical Leak Model approach, the estimated leak rate (and therefore, the projected leak volume) for Tank 241-SX-108 could not be reproduced from the data included in the initial document describing the leak methodology. An analysis of the parameters affecting tank heat load calculations strongly suggests that the historical tank operating data lack the precision and accuracy required to estimate tank leak volumes using the Historical Leak Model methodology

  7. Two methodologies for optical analysis of contaminated engine lubricants

    International Nuclear Information System (INIS)

    Aghayan, Hamid; Yang, Jun; Bordatchev, Evgueni

    2012-01-01

    The performance, efficiency and lifetime of modern combustion engines depend significantly on the quality of the engine lubricant. Contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and degrade lubricant quality. Direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, have therefore been intensively utilized in the engine condition monitoring systems and sensors developed within the last decade. Such sensors for measuring engine lubricant properties can be used to detect the functional limit of the in-use lubricant, increase the drain interval and reduce environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in engine lubricants. The methodologies are based on optical analysis of the distortion effect that arises when an object is imaged through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies lies in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. The acquired image then represents a combined lubricant–object optical appearance, in which the a priori known periodic structure of the object is distorted by the contaminated lubricant. In the object shape-based optical analysis, several parameters of the acquired optical image are newly proposed, such as the gray-scale intensity of lubricant and object, the shape width at object and lubricant levels, the object relative intensity and the width non-uniformity coefficient. Variations in contaminant concentration and the use of different contaminants lead to changes in these parameters, which are measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectrums, transfer function

  8. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  9. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W. (US Nuclear Regulatory Commission, Washington, DC 20555)

    1985-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 regulation to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  10. Internal fire analysis screening methodology for the Salem Nuclear Generating Station

    International Nuclear Information System (INIS)

    Eide, S.; Bertucio, R.; Quilici, M.; Bearden, R.

    1989-01-01

    This paper reports on an internal fire analysis screening methodology that has been utilized for the Salem Nuclear Generating Station (SNGS) Probabilistic Risk Assessment (PRA). The methodology was first developed and applied in the Brunswick Steam Electric Plant (BSEP) PRA. The SNGS application includes several improvements and extensions to the original methodology. The SNGS approach differs significantly from traditional fire analysis methodologies by providing a much more detailed treatment of transient combustibles. This level of detail results in a model which is more usable for assisting in the management of fire risk at the plant

  11. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.

  12. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    International Nuclear Information System (INIS)

    Kaye, R.D.; Henriksen, K.; Jones, R.; Morisseau, D.S.; Serig, D.I.

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included

  13. Simplified methodology for analysis of the Angra-1 containment

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1988-01-01

    A simplified analysis methodology was developed to simulate a large-break loss-of-coolant accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The results were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, including its small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  14. The development of a safety analysis methodology for the optimized power reactor 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun; Yo-Han, Kim

    2005-01-01

    Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology, based on the codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For loss-of-coolant accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for Westinghouse 3-loop plants by the Korean regulatory organization, and a project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. For non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)

  15. A study on safety analysis methodology in spent fuel dry storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Che, M. S.; Ryu, J. H.; Kang, K. M.; Cho, N. C.; Kim, M. S. [Hanyang Univ., Seoul (Korea, Republic of)

    2004-02-15

    Collection and review of the domestic and foreign technology related to spent fuel dry storage facility. Analysis of a reference system. Establishment of a framework for criticality safety analysis. Review of accident analysis methodology. Establishment of accident scenarios. Establishment of scenario analysis methodology.

  16. Methodology for flood risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Casada, M.L.; Fussell, J.B.

    1984-01-01

    The methodology for flood risk analysis described here addresses the effects of a flood on nuclear power plant safety systems. Combining the results of this method with the probability of a flood allows the effects of flooding to be included in a probabilistic risk assessment. The five-step methodology includes accident sequence screening to focus the detailed analysis efforts on the accident sequences that are significantly affected by a flood event. The quantitative results include the flood's contribution to system failure probability, accident sequence occurrence frequency and consequence category occurrence frequency. The analysis can be added to existing risk assessments without a significant loss in efficiency. The results of two example applications show the usefulness of the methodology. Both examples rely on the Reactor Safety Study for the required risk assessment inputs and present changes in the Reactor Safety Study results as a function of flood probability
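
    The idea of combining a flood's occurrence probability with its effect on system failure probabilities, as described in this record, can be sketched as follows. All initiator frequencies, system names and failure probabilities below are hypothetical illustrations, not values from the methodology or the Reactor Safety Study.

```python
# Sketch of folding a flood frequency together with conditional system-failure
# probabilities to obtain the flood's contribution to accident-sequence
# frequency. All numbers are hypothetical.

FLOOD_FREQ = 1.0e-3  # floods per year at the site (hypothetical)

# Hypothetical system unavailabilities: flood-free versus given the flood.
SYSTEMS = {
    "aux_feedwater": {"no_flood": 1.0e-4, "given_flood": 1.0e-1},
    "service_water": {"no_flood": 5.0e-5, "given_flood": 2.0e-1},
}

def sequence_freq(initiator_freq, failed_systems, flooded):
    """Frequency of one accident sequence: initiator AND all listed failures
    (failures assumed mutually independent given the flood state)."""
    key = "given_flood" if flooded else "no_flood"
    p = 1.0
    for name in failed_systems:
        p *= SYSTEMS[name][key]
    return initiator_freq * p

# Flood-induced contribution of the sequence "flood + both systems fail":
flood_contrib = sequence_freq(FLOOD_FREQ, ["aux_feedwater", "service_water"], True)
print(f"{flood_contrib:.1e} per year")
```

    The same function evaluated with `flooded=False` and the internal initiator frequency gives the flood-free sequence frequency, so the change in overall results can be expressed as a function of flood probability, as the record describes.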

  17. Diversion path analysis handbook. Volume I. Methodology

    International Nuclear Information System (INIS)

    Maltese, M.D.K.; Goodwin, K.E.; Schleter, J.C.

    1976-10-01

    Diversion Path Analysis (DPA) is a procedure for analyzing internal controls of a facility in order to identify vulnerabilities to successful diversion of material by an adversary. The internal covert threat is addressed but the results are also applicable to the external overt threat. The diversion paths are identified. Complexity parameters include records alteration or falsification, multiple removals of sub-threshold quantities, collusion, and access authorization of the individual. Indicators, or data elements and information of significance to detection of unprevented theft, are identified by means of DPA. Indicator sensitivity is developed in terms of the threshold quantity, the elapsed time between removal and indication and the degree of localization of facility area and personnel given by the indicator. Evaluation of facility internal controls in light of these sensitivities defines the capability of interrupting identified adversary action sequences related to acquisition of material at fixed sites associated with the identified potential vulnerabilities. Corrective measures can, in many cases, also be prescribed for management consideration and action. DPA theory and concepts have been developing over the last several years, and initial field testing proved both the feasibility and practicality of the procedure. Follow-on implementation testing verified the ability of facility personnel to perform DPA

  18. Probabilistic safety analysis procedures guide. Sections 1-7 and appendices. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Bari, R.A.; Buslik, A.J.; Cho, N.Z.

    1985-08-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. It will be revised as comments are received, and as experience is gained from its use. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This first volume of the guide describes the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant (i.e., intrinsic to plant operation) and from loss of off-site electric power. The scope includes human reliability analysis, a determination of the importance of various core damage accident sequences, and an explicit treatment and display of uncertainties for key accident sequences. The second volume deals with the treatment of the so-called external events including seismic disturbances, fires, floods, etc. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance). This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are valuable for regulatory decision making. For internal events, methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study. For external events, more explicit guidance is given

  19. Pressure fluctuation analysis for charging pump of chemical and volume control system of nuclear power plant

    Directory of Open Access Journals (Sweden)

    Chen Qiang

    2016-01-01

    Full Text Available The Equipment Failure Root Cause Analysis (ERCA) methodology is employed in this paper to investigate the root cause of charging pump pressure fluctuation in the chemical and volume control system (RCV) of a pressurized water reactor (PWR) nuclear power plant. An RCA project task group was set up at the beginning of the analysis process. The possible failure modes are listed according to the characteristics of the charging pump's actual pressure fluctuation and maintenance experience, and the failure modes are analysed in sequence against the collected evidence. The analysis suggests that the shaft nut, gradually loosened in service, is the root cause, and corresponding corrective actions are put forward in detail.

  20. SLSF loop handling system. Volume I. Structural analysis

    International Nuclear Information System (INIS)

    Ahmed, H.; Cowie, A.; Ma, D.

    1978-10-01

    The SLSF loop handling system was analyzed for deadweight and postulated dynamic loading conditions, identified in Chapters II and III of Volume I of this report, using a linear elastic static-equivalent method of stress analysis. The stress analysis of the loop handling machine is presented in Volume I of this report. Chapter VII of Volume I is a contribution by the EG and G Co., which performed the work under ANL supervision

  1. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  2. Scenario aggregation and analysis via Mean-Shift Methodology

    International Nuclear Information System (INIS)

    Mandelli, D.; Yilmaz, A.; Metzroth, K.; Aldemir, T.; Denning, R.

    2010-01-01

    A new generation of dynamic methodologies is being developed for nuclear reactor probabilistic risk assessment (PRA) which explicitly account for the time element in modeling the probabilistic system evolution and use numerical simulation tools to account for possible dependencies between failure events. The dynamic event tree (DET) approach is one of these methodologies. One challenge with dynamic PRA methodologies is the large amount of data they produce, which may be difficult to analyze without appropriate software tools. The concept of 'data mining' is well known in the computer science community, and several methodologies have been developed to extract useful information from a dataset with a large number of records. Using the dataset generated by the DET analysis of the reactor vessel auxiliary cooling system (RVACS) of an ABR-1000 for an aircraft crash recovery scenario, and the Mean-Shift methodology for data mining, it is shown how clusters of transients with common characteristics can be identified and classified. (authors)
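
    The clustering step this record relies on can be illustrated with a minimal one-dimensional mean-shift: each point is repeatedly shifted to the mean of its neighbors until it converges on a local mode, and points sharing a mode form a cluster. The data values and bandwidth below are hypothetical, and the sketch uses a flat kernel for simplicity.

```python
# Minimal 1-D mean-shift sketch (flat kernel) illustrating how transients with
# similar outcome parameters could be grouped into clusters.

def mean_shift_1d(points, bandwidth, tol=1e-6, max_iter=200):
    """Shift each point to its local mode; points that converge to the same
    mode receive the same cluster label."""
    modes = []
    for p in points:
        x = p
        for _ in range(max_iter):
            window = [q for q in points if abs(q - x) <= bandwidth]
            new_x = sum(window) / len(window)  # mean of the window = shift step
            if abs(new_x - x) < tol:
                break
            x = new_x
        modes.append(x)
    # Merge modes that lie within one bandwidth of each other into labels.
    centers, labels = [], []
    for m in modes:
        for i, c in enumerate(centers):
            if abs(m - c) < bandwidth:
                labels.append(i)
                break
        else:
            centers.append(m)
            labels.append(len(centers) - 1)
    return labels

# Two well-separated groups of (hypothetical) scenario outcome values:
data = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
print(mean_shift_1d(data, bandwidth=1.0))  # -> [0, 0, 0, 1, 1, 1]
```

    Unlike k-means, mean-shift does not require the number of clusters in advance, which suits DET output where the number of distinct transient classes is not known beforehand.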

  3. Control volume based hydrocephalus research; analysis of human data

    Science.gov (United States)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach can directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. The clinical data obtained for analysis are discussed, along with the data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
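
    The core of the control-volume approach is the integral mass conservation statement that net flow across the boundary equals the rate of volume change inside, dV/dt = Q_in - Q_out. A minimal numerical sketch, using a synthetic sinusoidal flow waveform rather than clinical data, shows how a measured net-flow waveform integrates to a volume-change waveform.

```python
# Sketch of the control-volume mass balance: integrating net flow across the
# boundary gives the volume change inside, dV/dt = Q_in - Q_out.
# The waveform below is synthetic, standing in for MR velocity-derived flow.
import math

DT = 0.01  # s, sampling interval (hypothetical)
T = [i * DT for i in range(100)]  # one 1-second "cardiac cycle"
Q_NET = [0.5 * math.sin(2 * math.pi * t) for t in T]  # ml/s net inflow

def volume_change(q_net, dt):
    """Cumulative volume change by trapezoidal integration of net flow."""
    v, out = 0.0, [0.0]
    for a, b in zip(q_net, q_net[1:]):
        v += 0.5 * (a + b) * dt
        out.append(v)
    return out

v = volume_change(Q_NET, DT)
# For a periodic flow, the net volume change over a full cycle returns to ~0.
print(abs(v[-1]) < 1e-2)
```
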

  4. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW methodology is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. Recently, an integrated analysis framework for GO-FLOW was developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism of the Japanese Government. This paper describes (a) an overview of the GO-FLOW methodology, (b) the procedure for treating a phased mission problem, (c) common cause failure analysis, (d) uncertainty analysis, and (e) the integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)
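
To give a feel for success-oriented evaluation of a phased mission, the toy below multiplies phase success probabilities through series/parallel structures. This is only the spirit of such methods, not the actual GO-FLOW operator set, and it assumes independence between phases (which real phased-mission analysis, tracking shared components, does not).

```python
# Toy phased-mission success calculation in the spirit of success-oriented
# methods such as GO-FLOW (not the actual GO-FLOW operators).
def series(*ps):
    """All components must work."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def parallel(*ps):
    """At least one component must work."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Phase 1: pump A feeding through valve V (hypothetical reliabilities)
p_phase1 = series(0.98, 0.99)
# Phase 2: redundant pumps A and B, then the same valve
p_phase2 = series(parallel(0.98, 0.95), 0.99)

# Assumes phase outcomes are independent -- a deliberate simplification
mission_success = p_phase1 * p_phase2
print(round(mission_success, 4))
```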

  5. An Evaluation Methodology for Protocol Analysis Systems

    Science.gov (United States)

    2007-03-01

    Main Memory Requirement; NS: Needham-Schroeder; NSL: Needham-Schroeder-Lowe; OCaml: Objective Caml; POSIX: Portable Operating System...methodology is needed. A. PROTOCOL ANALYSIS FIELD: As with any field, there is a specialized language used within the protocol analysis community. Figure...ProVerif requires that Objective Caml (OCaml) be installed on the system; OCaml version 3.09.3 was installed. C. WINDOWS CONFIGURATION OS

  6. Coyote Papers: The University of Arizona Working Papers in Linguistics, Volume 11. Special Volume on Native American Languages.

    Science.gov (United States)

    Weinberg, Jessica P., Ed.; O'Bryan, Erin L., Ed.; Moll, Laura A., Ed.; Haugan, Jason D., Ed.

    The five papers included in this volume approach the study of American Indian languages from a diverse array of methodological and theoretical approaches to linguistics. Two papers focus on approaches that come from the applied linguistics tradition, emphasizing ethnolinguistics and discourse analysis: Sonya Bird's paper "A Cross Cultural…

  7. Canonical duality theory unified methodology for multidisciplinary study

    CERN Document Server

    Latorre, Vittorio; Ruan, Ning

    2017-01-01

    This book on canonical duality theory provides a comprehensive review of its philosophical origin, physics foundation, and mathematical statements in both finite- and infinite-dimensional spaces. A ground-breaking methodological theory, canonical duality theory can be used for modeling complex systems within a unified framework and for solving a large class of challenging problems in multidisciplinary fields in engineering, mathematics, and the sciences. This volume places a particular emphasis on canonical duality theory’s role in bridging the gap between non-convex analysis/mechanics and global optimization.  With 18 total chapters written by experts in their fields, this volume provides a nonconventional theory for unified understanding of the fundamental difficulties in large deformation mechanics, bifurcation/chaos in nonlinear science, and the NP-hard problems in global optimization. Additionally, readers will find a unified methodology and powerful algorithms for solving challenging problems in comp...

  8. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    Oliveira, Laura Cristina de

    2003-01-01

    In the present work, the viability of using neutron activation analysis (NAA) to perform clinical analyses of urine and blood was assessed. The aim of this study is to investigate the biological behavior of animals that have been fed chow doped with natural uranium over a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentrations in biological samples. The quantitative results for urine sediment obtained using NAA were compared with conventional clinical analysis, and the results were compatible. This methodology was also applied to bone and body organs such as liver and muscle to help the interpretation of possible anomalies. (author)
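
The absolute (parametric) method computes element mass directly from the activation equation rather than from a comparator standard. The sketch below inverts that equation for a gold monitor with entirely hypothetical flux, activity, and timing values; only the nuclear constants (197Au cross-section ~98.65 b, 198Au half-life 2.695 d) are standard.

```python
import math

# Absolute-method activation equation (illustrative numbers):
#   A = (m * N_A * theta / M) * phi * sigma * (1 - exp(-lam*t_irr)) * exp(-lam*t_dec)
N_A = 6.022e23                 # Avogadro's number, 1/mol
phi = 1.0e13                   # thermal neutron flux, n/cm^2/s (assumed)
sigma = 98.65e-24              # 197Au capture cross-section, cm^2 (~98.65 b)
M = 197.0                      # molar mass, g/mol
theta = 1.0                    # isotopic abundance of 197Au
half_life = 2.695 * 24 * 3600  # 198Au half-life, s
lam = math.log(2) / half_life
t_irr = 3600.0                 # 1 h irradiation (assumed)
t_dec = 600.0                  # 10 min decay (assumed)

A_meas = 5.0e4                 # measured, efficiency-corrected activity, Bq

saturation = 1.0 - math.exp(-lam * t_irr)
decay = math.exp(-lam * t_dec)
# Solve the activation equation for the element mass in the sample
mass_g = A_meas * M / (N_A * theta * phi * sigma * saturation * decay)
print(f"{mass_g:.3e} g")
```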

  9. Failure mode effect analysis and fault tree analysis as a combined methodology in risk management

    Science.gov (United States)

    Wessiani, N. A.; Yoshio, F.

    2018-04-01

    Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most studies choose only one of these two methods for their risk management methodology. Combining the two methods reduces the drawbacks each has when implemented separately. This paper aims to combine FMEA and FTA into a single methodology for assessing risk. A case study in a metal company illustrates how this methodology can be implemented. In the case study, the combined methodology assesses the internal risks that occur in the production process; those internal risks should then be mitigated based on their level of risk.
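
The two halves of such a combined methodology are easy to sketch: FMEA ranks failure modes by Risk Priority Number (severity x occurrence x detection), while FTA propagates basic-event probabilities through logic gates to a top event. The failure modes, ratings, and probabilities below are hypothetical.

```python
# FMEA side: Risk Priority Number = Severity x Occurrence x Detection (1-10)
failure_modes = {
    "bearing_wear":   {"S": 7, "O": 5, "D": 4},
    "seal_leak":      {"S": 9, "O": 2, "D": 6},
    "motor_overheat": {"S": 8, "O": 3, "D": 3},
}
rpn = {name: f["S"] * f["O"] * f["D"] for name, f in failure_modes.items()}
ranked = sorted(rpn, key=rpn.get, reverse=True)  # highest RPN first

# FTA side: top-event probability for independent basic events
def or_gate(*ps):
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(*ps):
    q = 1.0
    for p in ps:
        q *= p
    return q

# Hypothetical top event: line stoppage =
#   (bearing wear OR seal leak) AND failure of the safeguard
p_top = and_gate(or_gate(1e-3, 5e-4), 1e-2)
print(ranked[0], p_top)
```

In the combined workflow, the high-RPN modes from FMEA would become candidate basic events for the fault tree.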

  10. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K of PCT margin compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin than classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, two kinds of uncertainties generally need to be identified and quantified: model uncertainties and plant status uncertainties. In particular, it takes considerable effort to systematically quantify the individual model uncertainties of a best-estimate LOCA code such as RELAP5 or TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Generally, the DRHM can generate about 80-100 K of margin on PCT compared to an Appendix K bounding-state LOCA analysis.
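
Statistical treatment of plant status uncertainties in CSAU-type analyses is commonly done with nonparametric (Wilks) tolerance limits: the number of code runs needed so that the highest computed PCT bounds the 95th percentile with 95% confidence. Assuming the standard first-order, one-sided formulation, the minimal sample size can be computed as:

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the maximum of n random code runs bounds the
    `coverage` quantile with one-sided `confidence` (first-order Wilks):
    require 1 - coverage**n >= confidence."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

n_runs = wilks_sample_size()
print(n_runs)  # 59 code runs for a 95/95 one-sided tolerance limit
```

The familiar 59-run figure drops out immediately; higher-order formulations (taking the second- or third-highest run) need more samples but give more robust estimates.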

  11. Combined cycle solar central receiver hybrid power system study. Volume III. Appendices. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-11-01

    A design study for a 100 MW gas turbine/steam turbine combined cycle solar/fossil-fuel hybrid power plant is presented. This volume contains the appendices: (a) preconceptual design data; (b) market potential analysis methodology; (c) parametric analysis methodology; (d) EPGS systems description; (e) commercial-scale solar hybrid power system assessment; and (f) conceptual design data lists. (WHK)

  12. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 2: Appendices

    Science.gov (United States)

    Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.

    1993-05-01

    This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios of from 1.2 to 7.5. The data is from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0 using a 7.68 inch diameter injector. The data was taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data is presented, both characteristic velocity efficiencies and energy release efficiencies, for those tests of sufficient duration to record steady state values.
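
Frequency-domain stability assessment of this kind boils down to locating the dominant spectral peak in the chamber-pressure signal and comparing it against the acoustic mode frequencies. The sketch below does this for a simulated pressure trace; the sample rate, mode frequency, and amplitudes are illustrative only, not data from the test program.

```python
import numpy as np

fs = 20000.0                 # sample rate, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)

# Simulated chamber-pressure fluctuation: a first-tangential-mode-like
# tone buried in broadband noise (illustrative numbers only)
f_1t = 2500.0
rng = np.random.default_rng(1)
p = 0.2 * np.sin(2 * np.pi * f_1t * t) + 0.05 * rng.standard_normal(t.size)

# Windowed FFT; pick the dominant non-DC bin
spec = np.abs(np.fft.rfft(p * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.argmax(spec[1:]) + 1]  # skip the DC bin
print(dominant)
```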

  13. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    Science.gov (United States)

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  14. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses.

  16. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project

    International Nuclear Information System (INIS)

    Roberds, W.J.; Plum, R.J.; Visca, P.J.

    1984-11-01

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab

  17. Assessing the quality of the volume-outcome relationship in uro-oncology.

    Science.gov (United States)

    Mayer, Erik K; Purkayastha, Sanjay; Athanasiou, Thanos; Darzi, Ara; Vale, Justin A

    2009-02-01

    To systematically assess the quality of evidence for the volume-outcome relationship in uro-oncology, and thus facilitate the formulation of health policy within this speciality. 'Implementation of Improving Outcome Guidance' has led to centralization of uro-oncology based on published studies that supported a 'higher volume-better outcome' relationship, but improved awareness of methodological drawbacks in health service research has called the strength of this proposed volume-outcome relationship into question. We systematically searched previous relevant reports and extracted all articles from 1980 onwards assessing the volume-outcome relationship for cystectomy, prostatectomy and nephrectomy at the institution and/or surgeon level. Studies were assessed for their methodological quality using a previously validated rating system. Where possible, meta-analytical methods were used to calculate overall differences in outcome measures between low- and high-volume healthcare providers. In all, 22 studies were included in the final analysis; 19 of these were published in the last 5 years. Only four studies appropriately explored the effect of both institution and surgeon volume on outcome measures. Mortality and length of stay were the most frequently measured outcomes. The median total quality scores within each of the operation types were 8.5, 9 and 8 for cystectomy, prostatectomy and nephrectomy, respectively (possible maximum score 18). Random-effects modelling showed a higher risk of mortality in low-volume institutions than in high-volume institutions for both cystectomy and nephrectomy (odds ratio 1.88, 95% confidence interval 1.54-2.29, and 1.28, 1.10-1.49, respectively). The methodological quality of volume-outcome research as applied to cystectomy, prostatectomy and nephrectomy is only modest at best. Accepting several limitations, pooled analysis confirms a higher-volume, lower-mortality relationship for cystectomy and nephrectomy.
Future research should
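
The random-effects pooling used in such meta-analyses can be sketched with the DerSimonian-Laird estimator: compute a log odds ratio and variance per study from its 2x2 table, estimate the between-study variance tau^2, and re-weight. The study counts below are hypothetical, not the data behind the odds ratios reported above.

```python
import math

# (deaths_low, n_low, deaths_high, n_high) per study -- hypothetical counts
studies = [
    (30, 400, 35, 700),
    (22, 300, 28, 650),
    (18, 250, 20, 500),
]

log_ors, weights = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c
    lor = math.log((a * d) / (b * c))      # log odds ratio, low vs high volume
    var = 1 / a + 1 / b + 1 / c + 1 / d    # Woolf variance of the log OR
    log_ors.append(lor)
    weights.append(1 / var)

# Fixed-effect pooling, then DerSimonian-Laird between-study variance tau^2
fixed = sum(w * y for w, y in zip(weights, log_ors)) / sum(weights)
q = sum(w * (y - fixed) ** 2 for w, y in zip(weights, log_ors))
df = len(studies) - 1
c_factor = sum(weights) - sum(w * w for w in weights) / sum(weights)
tau2 = max(0.0, (q - df) / c_factor)

# Random-effects weights and pooled estimate
re_weights = [1 / (1 / w + tau2) for w in weights]
pooled = sum(w * y for w, y in zip(re_weights, log_ors)) / sum(re_weights)
print(round(math.exp(pooled), 2))  # pooled odds ratio, low vs high volume
```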

  18. Fire risk analysis for nuclear power plants: Methodological developments and applications

    International Nuclear Information System (INIS)

    Kazarians, M.; Apostolakis, G.; Siu, N.O.

    1985-01-01

    A methodology to quantify the risk from fires in nuclear power plants is described. This methodology combines engineering judgment, statistical evidence, fire phenomenology, and plant system analysis. It can be divided into two major parts: (1) fire scenario identification and quantification, and (2) analysis of the impact on plant safety. This article primarily concentrates on the first part. Statistical analysis of fire occurrence data is used to establish the likelihood of ignition. The temporal behaviors of the two competing phenomena, fire propagation and fire detection and suppression, are studied and their characteristic times are compared. Severity measures are used to further specialize the frequency of the fire scenario. The methodology is applied to a switchgear room of a nuclear power plant
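
The "competing phenomena" step can be sketched as follows: a fire scenario contributes damage frequency only if suppression takes longer than the time for the fire to propagate to its targets. With an exponential model for suppression time, the non-suppression probability is exp(-lambda*t_damage). All numbers below are illustrative, not data from the switchgear-room application.

```python
import math

# Hypothetical switchgear-room fire scenario (illustrative numbers only)
ignition_freq = 2.0e-2   # fires per reactor-year in the room
severity_frac = 0.3      # fraction severe enough to threaten cable trays
supp_rate = 1.0 / 10.0   # suppression rate, 1/min (mean suppression 10 min)
t_damage = 15.0          # minutes for the fire to damage its targets

# Competing phenomena: damage occurs only if suppression takes longer
# than the propagation time (exponential suppression-time model)
p_non_supp = math.exp(-supp_rate * t_damage)
scenario_freq = ignition_freq * severity_frac * p_non_supp
print(f"{scenario_freq:.2e} damaging fires per reactor-year")
```

The resulting frequency would then feed the plant-systems side of the analysis (conditional core damage probability given the damaged equipment).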

  19. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  1. Analysis of increasing trend of mortgage volume in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Petra Střelcová

    2009-01-01

    The aim of this paper is an empirical analysis of mortgage volume in the Czech Republic and the identification of factors behind the increasing trend in mortgage volume from 2001 to 2007. Firstly, analyses of the quarterly time series of mortgage volume and the average mortgage rate are performed. Consequently, the causality between mortgage volume and the average mortgage rate is analysed; the mortgage rate is the most important factor in economic agents' decisions on residential investment. Afterwards, the causality between mortgage volume and selected factors is analysed via multiple regression analysis, and on this basis the factors that best describe mortgage volume are selected. Our empirical analysis validates the causality between mortgage volume and the mortgage rate, the unemployment rate and the price level of real estate. Part of this paper is also an economic explanation of this causality and an estimate of the expected development of mortgage volume, especially in connection with the present economic recession.
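
A multiple regression of this shape can be sketched with ordinary least squares. The data below are synthetic and the coefficients are invented for illustration; the point is only the mechanics of regressing volume on the rate, unemployment, and price-level factors.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 28  # e.g. quarterly observations over 2001-2007

# Hypothetical explanatory variables
mortgage_rate = rng.uniform(3.5, 6.0, n)   # %
unemployment = rng.uniform(5.0, 9.0, n)    # %
price_level = rng.uniform(90, 160, n)      # real-estate price index

# Simulated mortgage volume driven by the three factors plus noise
volume = (200 - 15 * mortgage_rate - 8 * unemployment
          + 1.2 * price_level + rng.normal(0, 3.0, n))

# OLS: volume = b0 + b1*rate + b2*unemployment + b3*price + error
X = np.column_stack([np.ones(n), mortgage_rate, unemployment, price_level])
beta, *_ = np.linalg.lstsq(X, volume, rcond=None)
print(beta.round(2))  # intercept and the three coefficients
```

With real quarterly series, one would additionally check for trends and collinearity among the regressors before interpreting the coefficients causally.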

  2. A methodology for strain-based fatigue reliability analysis

    International Nuclear Information System (INIS)

    Zhao, Y.X.

    2000-01-01

    A significant scatter in the cyclic stress-strain (CSS) responses is observed for a nuclear reactor material, 1Cr18Ni9Ti pipe-weld metal. The existence of this scatter implies that a random applied cyclic strain history will be introduced under any loading mode, even a deterministic loading history. A non-conservative evaluation might be given in practice if the scatter is not considered. A methodology for strain-based fatigue reliability analysis that takes the scatter into account is developed. The responses are approximately modeled by probability-based CSS curves of the Ramberg-Osgood relation. The strain-life data are modeled, similarly, by probability-based strain-life curves of the Coffin-Manson law. The reliability assessment is constructed by considering the interference of the random applied fatigue strain and capacity histories. Probability density functions of the applied and capacity histories are given analytically. The methodology can be conveniently reduced to the case of a deterministic CSS relation, as in existing methods. The non-conservatism of the deterministic CSS relation and the applicability of the present methodology are demonstrated by an analysis of the material test results.
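
The deterministic backbone of such an analysis is the strain-life relation (Basquin elastic term plus Coffin-Manson plastic term), solved for life at a given strain amplitude. The parameters below are loosely typical of an austenitic steel and are not the paper's fitted values for 1Cr18Ni9Ti; the probabilistic methodology would replace them with distributions.

```python
# Strain-life (Basquin + Coffin-Manson) relation:
#   eps_a = (sf/E) * (2N)**b + ef * (2N)**c
# Illustrative parameters only (not the 1Cr18Ni9Ti fit)
E = 193e3    # Young's modulus, MPa
sf = 900.0   # fatigue strength coefficient, MPa
b = -0.09    # fatigue strength exponent
ef = 0.35    # fatigue ductility coefficient
c = -0.55    # fatigue ductility exponent

def strain_amplitude(n_cycles):
    two_n = 2.0 * n_cycles
    return (sf / E) * two_n ** b + ef * two_n ** c

def life_for_strain(eps_a, lo=1.0, hi=1e9):
    # strain_amplitude decreases monotonically with N: bisect on a log scale
    for _ in range(200):
        mid = (lo * hi) ** 0.5
        if strain_amplitude(mid) > eps_a:
            lo = mid
        else:
            hi = mid
    return (lo * hi) ** 0.5

n = life_for_strain(0.004)  # cycles to failure at 0.4% strain amplitude
print(f"{n:.0f}")
```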

  3. Volume conduction effects on wavelet cross-bicoherence analysis

    International Nuclear Information System (INIS)

    Memon, I.A.; Channa, C.

    2013-01-01

    Cross-bicoherence analysis is one of the important nonlinear signal processing tools used to measure quadratic phase coupling between the frequencies of two different time series. It is frequently used in EEG (electroencephalography) analysis for the diagnosis of various cognitive and neurological disorders. Volume conduction effects of various uncorrelated sources present in the brain can introduce bias into the estimated values of the cross-bicoherence function. Previous studies have discussed volume conduction effects on the coherence function, which measures the linear relationship between EEG signals in terms of their phase and amplitude. However, to the best of our knowledge, the volume conduction effect on cross-bicoherence analysis, which is quite a different technique, has not been investigated until now. This study is divided into two major parts. The first part investigates the characteristics of VCUS (volume conduction effects due to uncorrelated sources) on EEG cross-bicoherence analysis, using simulated EEG data due to uncorrelated sources present in the brain. The second part investigates the effects of VCUS on the statistical analysis of the results of EEG-based cross-bicoherence analysis. The study has an important clinical application because most studies based on EEG cross-bicoherence analysis have avoided the issue of VCUS. The cross-bicoherence analysis was performed by detecting the change in the MSCB (magnitude square cross-bicoherence function) between EEG activities of change-detection and no-change-detection trials, using real EEG signals. (author)
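
A minimal segment-averaged cross-bicoherence estimator is sketched below, using Fourier rather than wavelet transforms and one common normalization (definitions vary in the literature). It detects quadratic phase coupling at a frequency pair (f1, f2): values near 1 indicate that the phase of y at f1+f2 is locked to the phases of x at f1 and y at f2. The signals are synthetic, not EEG.

```python
import numpy as np

def cross_bicoherence(x, y, fs, f1, f2, nperseg=256):
    """Segment-averaged, normalized cross-bicoherence magnitude at (f1, f2).
    Normalized by Cauchy-Schwarz so the result lies in [0, 1]."""
    n_seg = len(x) // nperseg
    k1 = int(round(f1 * nperseg / fs))
    k2 = int(round(f2 * nperseg / fs))
    num = 0.0 + 0.0j
    d1 = d2 = 0.0
    for s in range(n_seg):
        seg = slice(s * nperseg, (s + 1) * nperseg)
        X = np.fft.fft(x[seg] * np.hanning(nperseg))
        Y = np.fft.fft(y[seg] * np.hanning(nperseg))
        num += X[k1] * Y[k2] * np.conj(Y[k1 + k2])
        d1 += abs(X[k1] * Y[k2]) ** 2
        d2 += abs(Y[k1 + k2]) ** 2
    return abs(num) / np.sqrt(d1 * d2)

fs = 256.0
t = np.arange(0, 32, 1 / fs)
rng = np.random.default_rng(3)
x = np.sin(2 * np.pi * 10 * t)                  # 10 Hz component in x
y = np.sin(2 * np.pi * 14 * t + 0.5)            # 14 Hz component in y
y = y + 0.5 * np.sin(2 * np.pi * 24 * t + 0.5)  # phase-coupled 24 Hz term
y = y + 0.1 * rng.standard_normal(t.size)       # measurement noise

bic = cross_bicoherence(x, y, fs, 10.0, 14.0)
print(round(bic, 2))
```

A mixing matrix applied to uncorrelated simulated sources could then be used to study how volume conduction biases this estimate.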

  4. Criteria for the development and use of the methodology for environmentally-acceptable fossil energy site evaluation and selection. Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Eckstein, L.; Northrop, G.; Scott, R.

    1980-02-01

    This report serves as a companion document to the report, Volume 1: Environmentally-Acceptable Fossil Energy Site Evaluation and Selection: Methodology and Users Guide, in which a methodology was developed which allows the siting of fossil fuel conversion facilities in areas with the least environmental impact. The methodology, known as SELECS (Site Evaluation for Energy Conversion Systems) does not replace a site specific environmental assessment, or an environmental impact statement (EIS), but does enhance the value of an EIS by thinning down the number of options to a manageable level, by doing this in an objective, open and selective manner, and by providing preliminary assessment and procedures which can be utilized during the research and writing of the actual impact statement.
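
Site-screening methodologies of this kind typically reduce to scoring candidate sites against weighted environmental criteria. The toy below is a generic weighted-sum ranking, not the actual SELECS algorithm, and all criteria, weights, and scores are invented.

```python
# Hypothetical weighted-criteria screening (generic, not actual SELECS):
# each candidate site is scored 1 (worst) to 10 (best) on each criterion,
# and a weighted sum ranks the alternatives.
weights = {"air_quality": 0.30, "water_use": 0.25,
           "land_impact": 0.20, "ecology": 0.25}

sites = {
    "Site A": {"air_quality": 7, "water_use": 4, "land_impact": 8, "ecology": 6},
    "Site B": {"air_quality": 5, "water_use": 8, "land_impact": 6, "ecology": 7},
    "Site C": {"air_quality": 8, "water_use": 6, "land_impact": 5, "ecology": 4},
}

scores = {name: sum(weights[c] * v for c, v in crits.items())
          for name, crits in sites.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 2))
```

Only the top-ranked handful of sites would then proceed to a full site-specific environmental assessment, which is exactly the "thinning down" role described above.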

  5. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  6. Cuadernos de Autoformacion en Participacion Social: Metodologia. Volumen 2. Primera Edicion (Self-Instructional Notebooks on Social Participation: Methodology. Volume 2. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-Instructional Notebooks on Social Participation" comprises six volumes intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of the series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  7. Requirements Analysis in the Value Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Conner, Alison Marie

    2001-05-01

    The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. With the varied backgrounds and experiences the team brings to the study, come different perspectives on the problem and the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, either regulatory or customer prescribed. This paper will provide insight to the level of rigor applied to a requirements analysis step and give some examples of tools and techniques utilized to ease the management of the requirements and functions those requirements support for highly complex problems.

  8. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, within the black hat and white hat communities.
Once a solid understanding of the cutting edge security research is established, emerging trends in attack methodology can be identified and the gap between

  9. Physical data generation methodology for return-to-power steam line break analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zee, Sung Kyun; Lee, Chung Chan; Lee, Chang Kue [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-02-01

    The current methodology for generating physics data for the steam line break accident analysis of a CE-type nuclear plant such as Yonggwang Unit 3 is valid only if the core does not reach criticality after shutdown. The methodology therefore requires a tremendous amount of net scram worth, especially at the end of the cycle when the moderator temperature coefficient is most negative. A new methodology is thus needed to obtain reasonably conservative physics data when the reactor returns to a power condition. The current methodology uses ROCS, which includes only a closed channel model, but it is well known that the closed channel model estimates the core reactivity as too negative when the core flow rate is low. Therefore, a conservative methodology is presented which utilizes an open channel 3D HERMITE model. Return-to-power reactivity credit is produced to supplement the reactivity table generated by the closed channel model. Other data include the hot channel axial power shape, the peaking factor and the maximum quality for DNBR analysis, as well as a pin census for radiological consequence analysis. 48 figs., 22 tabs., 18 refs. (Author)

  10. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository - Volume 3: Appendices

    International Nuclear Information System (INIS)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-01-01

    The United States Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and the Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while it prepares for the final compliance evaluation. This report presents results from the NDCA, which examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. The analysis investigated the potential for post-closure criticality, the consequences of a criticality excursion, and the probability and frequency of post-closure criticality. The results of the NDCA are intended to provide DOE-EM with a technical basis for measuring risk, which can be used in screening arguments to eliminate post-closure criticality FEPs (features, events, and processes) from consideration in the compliance assessment on the basis of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3)

  11. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the factors that combine to produce errors in human tasks, both those performed under normal operating conditions and those performed after an abnormal event. In addition, the analysis of various accidents throughout history has found the human component to be a contributing cause. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the first generation of HRA methodologies. Methods were subsequently developed to include additional performance shaping factors, and the interactions among them, in the models. By the mid-1990s, what are considered the second-generation methodologies had emerged. Among these is A Technique for Human Event Analysis (ATHEANA). The application of this methodology to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of deviations from the nominal scenario considered in the accident sequences of the probabilistic safety analysis and, for this event, the evaluation of dependency among actions. That is, the generic human failure event first required independent evaluation of the two related human failure events. Gathering the new human error probabilities thus involves quantifying the nominal scenario and the cases of significant deviation, selected for their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the analysis of the sequences identified the specific factors with the highest contribution to the human error probabilities. (Author)

  12. Interaction between core analysis methodology and nuclear design: some PWR examples

    International Nuclear Information System (INIS)

    Rothleder, B.M.; Eich, W.J.

    1982-01-01

    The interaction between core analysis methodology and nuclear design is exemplified by PSEUDAX, a major improvement related to the Advanced Recycle Methodology Program (ARMP) computer code system, still undergoing development by the Electric Power Research Institute. The mechanism of this interaction is explored by relating several specific nuclear design changes to the demands these changes place on the ARMP system, and by examining how these demands are met, first within the standard ARMP methodology and then through augmentation of the standard methodology by the development of PSEUDAX

  13. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  14. SINGULAR SPECTRUM ANALYSIS: METHODOLOGY AND APPLICATION TO ECONOMICS DATA

    Institute of Scientific and Technical Information of China (English)

    Hossein HASSANI; Anatoly ZHIGLJAVSKY

    2009-01-01

    This paper describes the methodology of singular spectrum analysis (SSA) and demonstrates that it is a powerful method of time series analysis and forecasting, particularly for economic time series. The authors consider the application of SSA to the analysis and forecasting of the Iranian national accounts data as provided by the Central Bank of the Islamic Republic of Iran.
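
    The basic SSA algorithm the paper builds on (embedding into a trajectory matrix, singular value decomposition, grouping of elementary components, and diagonal averaging) can be sketched in a few lines of NumPy. This is a generic illustration of the method, not the authors' implementation; the window length and choice of components are arbitrary.

```python
import numpy as np

def ssa_reconstruct(series, window, components):
    """Basic SSA: embed, decompose via SVD, reconstruct chosen components."""
    n = len(series)
    k = n - window + 1
    # Step 1: embedding -- build the trajectory (Hankel) matrix.
    X = np.column_stack([series[i:i + window] for i in range(k)])
    # Step 2: singular value decomposition of the trajectory matrix.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Step 3: grouping -- keep only the selected elementary matrices.
    Xg = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in components)
    # Step 4: diagonal averaging (Hankelization) back to a series.
    out = np.zeros(n)
    counts = np.zeros(n)
    for i in range(window):
        for j in range(k):
            out[i + j] += Xg[i, j]
            counts[i + j] += 1
    return out / counts

# Usage: extract the oscillatory signal (leading pair of components)
# from a noisy sine series.
t = np.linspace(0, 4 * np.pi, 200)
noisy = np.sin(t) + 0.3 * np.random.default_rng(0).normal(size=200)
trend = ssa_reconstruct(noisy, window=40, components=[0, 1])
```

    For a clean periodic signal the first pair of eigentriples captures the oscillation, so the reconstruction is much closer to the underlying sine than the noisy input.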

  15. Methodological developments and applications of neutron activation analysis

    International Nuclear Information System (INIS)

    Kucera, J.

    2007-01-01

    The paper reviews the author's experience acquired and achievements made in methodological developments of neutron activation analysis (NAA) of mostly biological materials. These involve epithermal neutron activation analysis, radiochemical neutron activation analysis using both single- and multi-element separation procedures, use of various counting modes, and the development and use of the self-verification principle. The role of NAA in the detection of analytical errors is discussed and examples of applications of the procedures developed are given. (author)

  16. Experimental analysis of fuzzy controlled energy efficient demand controlled ventilation economizer cycle variable air volume air conditioning system

    Directory of Open Access Journals (Sweden)

    Rajagopalan Parameshwaran

    2008-01-01

    In the quest for energy-conservative building design, there is now a great opportunity for a flexible and sophisticated air conditioning system capable of delivering the better thermal comfort, indoor air quality, and energy efficiency that are strongly desired. The variable refrigerant volume air conditioning system provides considerable energy savings, cost effectiveness, and reduced space requirements. Applications of intelligent control, such as the fuzzy logic controller, especially adapted to variable air volume air conditioning systems, have drawn more interest in recent years than classical control systems. An experimental analysis was performed to investigate the inherent operational characteristics of the combined variable refrigerant volume and variable air volume air conditioning systems under fixed ventilation, demand-controlled ventilation, and combined demand-controlled ventilation and economizer cycle techniques for two seasonal conditions. The test results of the variable refrigerant volume and variable air volume air conditioning system for each technique are presented. The results infer that the system controlled by the fuzzy logic methodology and operated under the CO2-based mechanical ventilation scheme yields average energy savings of 37% and 56% per day in summer and winter conditions, respectively. Based on the experimental results, the fuzzy-based combined system can be considered an alternative energy-efficient air conditioning scheme, with significant energy-saving potential compared to the conventional constant air volume air conditioning system.
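
    As a rough illustration of the control idea (not the authors' controller), a minimal Mamdani-style fuzzy rule base can map a CO2 reading to an outdoor-air damper position. All membership breakpoints and damper settings below are invented for the sketch.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_damper_position(co2_ppm):
    """Map a CO2 level to an outdoor-air damper position (0-100 %).

    Fuzzify CO2 into low/medium/high, fire one rule per set, then
    defuzzify by a weighted average of singleton rule outputs.
    """
    # Fuzzification (breakpoints are illustrative, not from the paper).
    low = tri(co2_ppm, 300, 450, 700)
    med = tri(co2_ppm, 500, 800, 1100)
    high = tri(co2_ppm, 900, 1300, 1700)
    weights = [low, med, high]
    positions = [20.0, 55.0, 95.0]   # damper % demanded by each rule
    total = sum(weights)
    if total == 0:                   # outside all sets: clamp to an end
        return 20.0 if co2_ppm < 500 else 95.0
    return sum(w * p for w, p in zip(weights, positions)) / total
```

    At 450 ppm only the "low" rule fires, so the damper sits at the minimum ventilation position; between breakpoints the output blends smoothly, which is the practical appeal of fuzzy control over on/off demand-controlled ventilation.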

  17. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Because tie breakers are set in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis between GO methodology and fault tree methodology was also performed. The results showed that the setup of tie breakers is rational and necessary, and that, compared with fault tree methodology, GO methodology makes the modeling much easier and the chart much more succinct for the reliability analysis of the power supply system. (author)
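
    The quantitative outputs reported here are minimal cut sets and average unavailability. Independently of whether GO or fault-tree modeling produced the cut sets, system unavailability is commonly obtained by the rare-event approximation, checked below against exact inclusion-exclusion on a hypothetical two-feeder/tie-breaker model (all component names and unavailabilities are invented):

```python
import math
from itertools import combinations

def rare_event_unavailability(cut_sets, q):
    """First-order (rare-event) approximation: sum over minimal cut
    sets of the product of component unavailabilities."""
    return sum(math.prod(q[c] for c in cs) for cs in cut_sets)

def exact_unavailability(cut_sets, q):
    """Exact unavailability by inclusion-exclusion over cut-set unions
    (tractable only for small models)."""
    total = 0.0
    n = len(cut_sets)
    for r in range(1, n + 1):
        for combo in combinations(cut_sets, r):
            union = set().union(*combo)
            total += (-1) ** (r + 1) * math.prod(q[c] for c in union)
    return total

# Hypothetical power-supply model: two incoming feeders with a tie
# breaker, plus a common busbar (numbers are illustrative only).
q = {"feederA": 1e-2, "feederB": 1e-2, "tie": 5e-3, "busbar": 1e-4}
cuts = [{"feederA", "feederB"}, {"feederA", "tie"}, {"busbar"}]
approx = rare_event_unavailability(cuts, q)   # 2.5e-4
exact = exact_unavailability(cuts, q)
```

    For component unavailabilities this small the rare-event sum is a tight upper bound on the exact result, which is why it is the standard reported figure.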

  18. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments
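
    The sequenced event models described above lend themselves to a nested Monte Carlo estimate: per tornado strike, sample the number of injected missiles and, per missile, whether it both impacts and damages the target. The sketch below shows only the structure of such a simulation; every parameter value is an illustrative placeholder, not TORMIS data.

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's multiplicative method, adequate for small Poisson means."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= limit:
            return k - 1

def damage_prob_given_strike(trials, mean_missiles=5.0,
                             p_hit=0.02, p_damage=0.1, seed=1):
    """Monte Carlo estimate of P(damaging impact | tornado strike):
    sample the number of transported missiles, then check whether any
    one of them both hits the structure and exceeds its damage threshold."""
    rng = random.Random(seed)
    damaging = 0
    for _ in range(trials):
        n = poisson_sample(rng, mean_missiles)
        if any(rng.random() < p_hit * p_damage for _ in range(n)):
            damaging += 1
    return damaging / trials

# Annual risk = P(tornado strike per year) x P(damage | strike).
p_cond = damage_prob_given_strike(50_000)
annual_risk = 1e-3 * p_cond
```

    Factoring the estimate into a strike rate times a conditional damage probability keeps the sampling efficient: the rare outer event is handled analytically while the missile physics is sampled.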

  19. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 1: Executive Summary

    International Nuclear Information System (INIS)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.Z.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-01-01

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and the Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while it prepares for the final compliance evaluation. This report presents results from the NDCA, which examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. The analysis investigated the potential for post-closure criticality, the consequences of a criticality excursion, and the probability and frequency of post-closure criticality. The results of the NDCA are intended to provide DOE-EM with a technical basis for measuring risk, which can be used in screening arguments to eliminate post-closure criticality FEPs (features, events, and processes) from consideration in the compliance assessment on the basis of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3)

  20. A methodological proposal for quantifying environmental compensation through the spatial analysis of vulnerability indicators

    Directory of Open Access Journals (Sweden)

    Fabio Enrique Torresan

    2008-06-01

    The aim of this work was to propose a methodology for quantifying environmental compensation through the spatial analysis of vulnerability indicators. A case study was applied to the analysis of sand extraction enterprises in the region of Descalvado and Analândia, inland São Paulo State, Brazil. Environmental vulnerability scores were attributed to indicators related to erosion, hydrological resources, and biodiversity loss. This methodological proposal allowed analyzing the locational alternatives of a given enterprise with the objective of reducing impacts while also reducing the costs of environmental compensation. The application of the methodology significantly reduced the degree of subjectivity usually associated with most impact evaluation methodologies. The term environmental compensation refers to the developer's obligation to support the establishment and maintenance of Conservation Units, applicable to enterprises with significant environmental impact, in accordance with Law 9.986/2000. This law establishes that the volume of resources to be applied by the developer must be at least 0.5% of the total costs foreseen for the implementation of the enterprise, with this percentage fixed by the competent environmental agency according to the degree of environmental impact. Accordingly, this article proposes a methodology for quantifying environmental compensation through the spatial analysis of environmental vulnerability indicators. The proposal was applied in a case study of sand-mining enterprises in the Descalvado/Analândia region, inland São Paulo State. Environmental vulnerability indices were attributed to impact indicators related to erosion, water resources, and biodiversity loss. This methodology represents an important instrument for environmental and economic planning and can be adapted to various
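
    The quantification rule described above (a compensation rate starting at the legal floor of 0.5% of implementation cost and growing with the degree of environmental impact) can be expressed as a small function. The linear mapping from an aggregated vulnerability score and the 5% ceiling are assumptions for illustration; the article's actual scoring scheme may differ.

```python
def compensation_value(total_cost, vulnerability_score,
                       min_rate=0.005, max_rate=0.05):
    """Map an aggregated vulnerability score in [0, 1] to a compensation
    rate between the legal floor of 0.5% and a hypothetical 5% ceiling,
    applied to the total implementation cost of the enterprise."""
    score = min(max(vulnerability_score, 0.0), 1.0)  # clamp to [0, 1]
    rate = min_rate + score * (max_rate - min_rate)
    return total_cost * rate

# A low-impact enterprise pays the legal minimum of 0.5%:
low = compensation_value(1_000_000, 0.0)    # 5000.0
# A highly vulnerable site pays proportionally more:
high = compensation_value(1_000_000, 0.8)   # 41000.0
```

    Tying the rate to a spatially derived vulnerability score is exactly what lets a developer compare siting alternatives by their compensation cost, as the abstract describes.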

  1. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

    A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. First, a conceptual framework is built and used to analyze the causal relationships between organizational factors and human reliability. Then, an inference model for HRA is built by combining the conceptual framework with Bayesian networks, which is used to execute causal and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the application of the proposed methodology. The results show that combining the conceptual model with Bayesian networks not only makes it easy to model the causal relationship between organizational factors and human reliability, but also, in a given context, allows the human operational reliability to be measured quantitatively and the most likely root causes of human error to be identified and prioritized. (authors)
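
    The causal and diagnostic inference the methodology performs can be illustrated on the smallest possible network: one organizational factor influencing one human-error node. The probabilities below are invented for the sketch; the paper's network is of course much richer.

```python
def hra_inference(p_poor_culture, p_err_poor, p_err_good):
    """Two-node Bayesian network: organizational factor -> human error.

    Causal inference: marginal probability of a human error.
    Diagnostic inference: posterior probability of the poor
    organizational state given an observed error (Bayes' rule).
    """
    # Marginalize the organizational factor out of the error node.
    p_error = p_poor_culture * p_err_poor + (1 - p_poor_culture) * p_err_good
    # Invert the network: which root cause best explains the error?
    p_poor_given_error = p_poor_culture * p_err_poor / p_error
    return p_error, p_poor_given_error

# Illustrative numbers: a 20% chance of a poor safety culture, which
# raises the task error probability from 0.02 to 0.15.
p_error, posterior = hra_inference(0.2, 0.15, 0.02)
# p_error = 0.046; observing an error raises belief in the poor
# organizational state from 0.20 to about 0.65.
```

    The diagnostic direction is what supports root-cause prioritization: posterior probabilities over the organizational states rank the most likely contributors to an observed error.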

  2. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and "what if" questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  3. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-01-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  4. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-10-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  5. Stakeholder analysis methodologies resource book

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, W.M.; Farhar, B.C.

    1994-03-01

    Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.

  6. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools that have been most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, the event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research was supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of probabilistic safety assessment; an overview of various system analysis techniques; an overview of the GO-FLOW methodology; the GO-FLOW analysis support system; the procedure for treating a phased-mission problem; functions for common cause failure analysis, uncertainty analysis, and common cause failure analysis with uncertainty; and the system for printing out GO-FLOW analysis results in the form of figures or tables. These functions are explained by analyzing sample systems such as the PWR AFWS and the BWR ECCS. The appendices describe the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in them. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis, with a wide range of applications, and with the development of the total GO-FLOW system it has become a powerful tool in a living PSA. (author) 54 refs
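
    As a toy illustration of the success-oriented, phased-mission viewpoint (not the GO-FLOW algorithm itself), the survival probability of a single non-repairable train across consecutive mission phases with constant failure rates is simply the product of per-phase reliabilities. All durations and rates below are notional.

```python
import math

def phased_mission_reliability(durations_h, failure_rates):
    """Success-oriented estimate for one non-repairable train that must
    operate through every phase: R = exp(-sum(lambda_i * t_i)).
    Durations in hours, rates in failures per hour."""
    return math.exp(-sum(lam * t for lam, t in zip(failure_rates, durations_h)))

# Start-up, injection, and recirculation phases of an ECCS-like mission:
r = phased_mission_reliability([0.5, 4.0, 20.0], [1e-3, 2e-4, 5e-5])
```

    Real phased-mission analysis must also track components shared across phases and demand failures at phase boundaries, which is where tools like GO-FLOW earn their keep over this closed-form sketch.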

  7. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models that progress from the logical models of object-oriented real-time systems analysis through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
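
    A 'real-time systems-analysis object' as characterized here, a named concurrent entity whose time-behavior is a set of states plus state-transition rules, might be sketched as follows; the valve example and all identifiers are hypothetical.

```python
class AnalysisObject:
    """Minimal sketch of a systems-analysis object: a named entity with
    an explicit state set and a table of state-transition rules."""

    def __init__(self, name, initial, transitions):
        self.name = name
        self.state = initial
        # transitions maps (current_state, event) -> next_state
        self.transitions = transitions

    def handle(self, event):
        """Apply a transition rule; reject events with no rule defined."""
        key = (self.state, event)
        if key not in self.transitions:
            raise ValueError(
                f"{self.name}: no rule for {event!r} in state {self.state!r}")
        self.state = self.transitions[key]
        return self.state

# A valve controller modeled as one concurrent entity:
valve = AnalysisObject("valve", "closed", {
    ("closed", "open_cmd"): "opening",
    ("opening", "limit_switch"): "open",
    ("open", "close_cmd"): "closed",
})
valve.handle("open_cmd")
```

    Because the analysis object already carries its states and rules, the transformation to a software object in the design phase is largely mechanical, which is the "seamless" property the paper emphasizes.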

  8. Exploring the Central Nervous System: methodological state of the art

    International Nuclear Information System (INIS)

    Darcourt, Jacques; Koulibaly, Pierre-Malick; Migneco, Octave

    2005-01-01

    Analysis of the clinical use of brain SPECT demonstrates a gap between recently published methodological developments and current clinical practice. We review recent methodological developments that could be useful in three classical clinical applications: the diagnosis of Alzheimer's disease, the evaluation of dopaminergic neurotransmission in Parkinson's disease, and the study of epilepsy. In Alzheimer's disease, methods of spatial standardization and comparison to a normative database are most useful to the least experienced observers; for this purpose, methodological approaches oriented to routine practice work better and are simpler than SPM. Quantification is essential in the study of dopaminergic neurotransmission, and the measurement of binding potential is biased by septal penetration, attenuation, scatter, and the partial volume effect. The partial volume effect introduces the largest error, and its correction is difficult because of the co-registration precision required with magnetic resonance images. The study of epilepsy by subtraction of ictal and interictal SPECT has demonstrated its clinical value; it is an image fusion operation whose methods are now very well defined. (author)

  9. Vulnerability and Risk Analysis Program: Overview of Assessment Methodology

    National Research Council Canada - National Science Library

    2001-01-01

    .... Over the last three years, a team of national laboratory experts, working in partnership with the energy industry, has successfully applied the methodology as part of OCIP's Vulnerability and Risk Analysis Program (VRAP...

  10. Evaluation Methodology. The Evaluation Exchange. Volume 11, Number 2, Summer 2005

    Science.gov (United States)

    Coffman, Julia, Ed.

    2005-01-01

    This is the third issue of "The Evaluation Exchange" devoted entirely to the theme of methodology, though every issue tries to identify new methodological choices, the instructive ways in which people have applied or combined different methods, and emerging methodological trends. For example, lately "theories of change" have gained almost…

  11. Environmental impact statement analysis: dose methodology

    International Nuclear Information System (INIS)

    Mueller, M.A.; Strenge, D.L.; Napier, B.A.

    1981-01-01

    Standardized sections and methodologies are being developed for use in environmental impact statements (EIS) for activities to be conducted on the Hanford Reservation. Five areas for standardization have been identified: routine operations dose methodologies, accident dose methodology, Hanford Site description, health effects methodology, and socioeconomic environment for Hanford waste management activities

  12. Performance evaluation of the technical capabilities of DOE sites for disposal of mixed low-level waste. Volume 1: Executive summary

    International Nuclear Information System (INIS)

    1996-03-01

    A team of analysts designed and conducted a performance evaluation (PE) to estimate the technical capabilities of fifteen Department of Energy sites for disposal of mixed low-level waste (i.e., waste that contains both low-level radioactive materials and hazardous constituents). Volume 1 summarizes the process for selecting the fifteen sites, the methodology used in the evaluation, and the conclusions derived from the evaluation. Volume 1 is an executive summary both of the PE methodology and of the results obtained from the PEs. While this volume briefly reviews the scope and method of analyses, its main objective is to emphasize the important insights and conclusions derived from the conduct of the PEs. Volume 2 provides details about the site-selection process, the performance-evaluation methodology, and the overall results of the analysis. Volume 3 contains detailed evaluations of the fifteen sites and discussions of the results for each site

  13. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    Silva, F.C.A. da.

    1990-01-01

    A critical review of 34 published severe radiological accidents in industrial gamma radiography, which occurred in 15 countries from 1960 to 1988, was performed. The most frequent causes, consequences, and dose estimation methods were analyzed with the aim of establishing better procedures for radiation safety and accident analysis. The objective of this work is to elaborate a methodology for analyzing radiological accidents in industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of an event and to estimate the dose with good certainty. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programs, was adopted in elaborating the suggested methodology. The viability of using the Electron Gamma Shower 4 (EGS4) computer code system to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in 192Ir radioactive source handling situations, was also studied. (author)
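
    Alongside full Monte Carlo transport with EGS4, accident dose reconstruction commonly starts from a first-order point-source, inverse-square estimate. The sketch below uses an approximate dose-rate constant for 192Ir; treat both the constant and the scenario numbers as illustrative assumptions, not reference data.

```python
def gamma_dose_mSv(activity_GBq, minutes, distance_m, gamma_constant=0.13):
    """Inverse-square point-source estimate of whole-body dose.

    gamma_constant ~0.13 mSv/h per GBq at 1 m is an approximate,
    commonly quoted value for Ir-192; shielding, scatter, and source
    geometry are all ignored in this first-order screening estimate."""
    return gamma_constant * activity_GBq * (minutes / 60.0) / distance_m ** 2

# One minute of handling a 1 TBq (1000 GBq) source at 30 cm:
dose = gamma_dose_mSv(1000, 1, 0.3)   # roughly 24 mSv
```

    Such screening estimates bound the plausible dose range quickly; the detailed EGS4 calculation then refines the figure for the actual handling geometry.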

  14. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    Science.gov (United States)

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.
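
    The depletion observed with thin-film SPME, and its cure by ultra-thin coatings, follows directly from the equilibrium mass balance D = Kfs·Vf / (Kfs·Vf + Vs): shrinking the coating volume shrinks the extracted fraction. The coating volumes and distribution constant below are invented to illustrate the scaling, not taken from the paper.

```python
def depletion_fraction(vf_uL, vs_uL, kfs):
    """Fraction of analyte extracted by the coating at equilibrium:
    D = Kfs*Vf / (Kfs*Vf + Vs). Non-depletive SPME typically targets
    a depletion of only a few percent of the free analyte."""
    return kfs * vf_uL / (kfs * vf_uL + vs_uL)

# Illustrative numbers for a 100 uL well and Kfs = 1000:
thick = depletion_fraction(0.5, 100, 1000)     # ~0.83: equilibrium disturbed
ultra = depletion_fraction(0.005, 100, 1000)   # ~0.05: near non-depletive
```

    Keeping D small is what makes repeated sampling from the same 100 µL well possible without shifting the free/bound equilibrium being measured.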

  15. Economic Analysis. Volume V. Course Segments 65-79.

    Science.gov (United States)

    Sterling Inst., Washington, DC. Educational Technology Center.

    The fifth volume of the multimedia, individualized course in economic analysis produced for the United States Naval Academy covers segments 65-79 of the course. Included in the volume are discussions of monopoly markets, monopolistic competition, oligopoly markets, and the theory of factor demand and supply. Other segments of the course, the…

  16. Development of Audit Calculation Methodology for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joosuk; Kim, Gwanyoung; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The interim criteria contain more stringent limits than the previous ones. For example, pellet-to-cladding mechanical interaction (PCMI) was introduced as a new failure criterion. And both short-term (e.g., fuel-to-coolant interaction, rod burst) and long-term (e.g., fuel rod ballooning, flow blockage) phenomena should be addressed for core coolability assurance. For dose calculations, transient-induced fission gas release additionally has to be accounted for. Traditionally, the approved RIA analysis methodologies for licensing application are developed based on a conservative approach. But the newly introduced safety criteria tend to reduce the margins to the criteria. Thereby, licensees are trying to improve the margins by utilizing a less conservative approach. In this situation, to cope with this trend, a new audit calculation methodology needs to be developed. In this paper, the new methodology, which is currently under development at KINS, is introduced. For the development of an audit calculation methodology for RIA safety analysis based on the realistic evaluation approach, a preliminary calculation utilizing a best-estimate code has been performed on the initial core of APR1400. The main conclusions follow. - With the assumption of a single full-strength control rod ejection in the HZP condition, rod failure due to PCMI is not predicted. - Coolability can be assured in view of enthalpy and fuel melting. - But rod failure due to DNBR is expected, and there is a possibility of fuel failure at rated power conditions as well.

  17. Opening Remarks of the Acquisition Path Analysis Methodology Session

    International Nuclear Information System (INIS)

    Renis, T.

    2015-01-01

    An overview of the recent development work that has been done on acquisition path analysis, implementation of the methodologies within the Department of Safeguards, lessons learned and future areas for development will be provided. (author)

  18. An economic analysis methodology for project evaluation and programming.

    Science.gov (United States)

    2013-08-01

    Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...

  19. Eco-efficiency analysis methodology on the example of the chosen polyolefins production

    OpenAIRE

    K. Czaplicka-Kolarz; D. Burchart-Korol; P. Krawczyk

    2010-01-01

    the chosen polyolefins production. The article presents also main tools of eco-efficiency analysis: Life Cycle Assessment (LCA) and Net Present Value (NPV).Design/methodology/approach: On the basis of LCA and NPV of high density polyethylene (HDPE) and low density polyethylene (LDPE) production, eco-efficiency analysis is conducted.Findings: In this article environmental and economic performance of the chosen polyolefins production was presented. The basis phases of eco-efficiency methodology...
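
    The NPV component of such an eco-efficiency analysis follows the standard discounted cash flow formula. A minimal sketch, with an illustrative discount rate and cash flow series rather than figures from the HDPE/LDPE study:

```python
def npv(rate, cash_flows):
    """Net Present Value: cash flows discounted back to year 0.

    cash_flows[0] is the initial (year-0) flow, typically a negative
    investment; subsequent entries are yearly net returns.
    """
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Illustrative project: 1000 invested, 400 returned yearly for 3 years, 8% rate
value = npv(0.08, [-1000.0, 400.0, 400.0, 400.0])
```

    In the eco-efficiency diagram, this economic score is then plotted against the LCA-derived environmental score for each polyolefin production route.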

  20. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the revision of 10CFR50.46 in 1988, which allowed the BE (Best-Estimate) method in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have changed continuously from conservative EM (Evaluation Model) approaches to BE ones. In this context, LSC (Long-Term core Cooling) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and deficiencies of the old methodology were identified, and as a result, the USNRC suspended the approval of CENPD-254-P-A, which is the old LSC methodology for CE-designed NPPs. Regulatory bodies requested that the non-conservatisms be removed and that system transient behaviors be reflected in all the LSC methodologies used. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full range of the break spectrum is considered, and the applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the methodology could be applied with no significant changes to current LSC plans

  1. Research program for seismic qualification of nuclear plant electrical and mechanical equipment. Task 3. Recommendations for improvement of equipment qualification methodology and criteria. Volume 3

    International Nuclear Information System (INIS)

    Kana, D.D.; Pomerening, D.J.

    1984-08-01

    The Research Program for Seismic Qualification of Nuclear Plant Electrical and Mechanical Equipment spanned a period of three years and resulted in seven technical summary reports, each covering in detail the findings of the different tasks and subtasks; these have been combined into five NUREG/CR volumes. Volume 3 presents recommendations for improvement of equipment qualification methodology and procedural clarification/modification. The fifth category of recommendations identifies issues where adequate information does not exist to allow a recommendation to be made

  2. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    The article is devoted to the theoretical and methodological principles of the strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for study strategies is proved under modern conditions of a high level of dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified in the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of a company’s capital is grounded. The economic nature of the capital of the company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners’ prosperity and to achieve a social effect.

  3. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    International Nuclear Information System (INIS)

    Licu, Tony; Cioran, Florin; Hayward, Brent; Lowe, Andrew

    2007-01-01

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture in which people are encouraged to provide full and open information about how incidents occurred, and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability

  4. An Introduction to the Special Volume on Political Methodology

    Directory of Open Access Journals (Sweden)

    Micah Altman

    2011-08-01

    This special volume of the Journal of Statistical Software on political methodology includes 14 papers, with wide-ranging software contributions of political scientists to their own field, and more generally to statistical data analysis in the social sciences and beyond. Special emphasis is given to software that is written in or can cooperate with the R system for statistical computing.

  5. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new design of computer mouse using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were evaluated to determine their environmental impact categories. A sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest amounts in the environmental categories while retaining a high maximum stress value.
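
    The concept-scoring step mentioned above can be sketched as a weighted sum over selection criteria. The criteria, weights, and ratings below are illustrative assumptions, not the paper's actual scoring matrix:

```python
# Weighted criteria for a DFD concept-scoring matrix (illustrative weights)
criteria_weights = {"part_count": 0.40, "disassembly_time": 0.35, "cost": 0.25}

# Ratings (1-5) of each design concept against each criterion (illustrative)
concepts = {
    "A": {"part_count": 3, "disassembly_time": 2, "cost": 4},
    "B": {"part_count": 4, "disassembly_time": 5, "cost": 3},
}

# Weighted score per concept; the highest score wins the screening
scores = {
    name: sum(criteria_weights[c] * r for c, r in ratings.items())
    for name, ratings in concepts.items()
}
best = max(scores, key=scores.get)
```

    With these illustrative weights, concept B comes out ahead, consistent with the screening result reported in the abstract.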

  6. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej; Pereší ni, Peter; Kostić, Dejan; Canini, Marco

    2018-01-01

    and performance characteristics is essential for ensuring successful and safe deployments.We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a

  8. Sensitivity and uncertainty analyses applied to criticality safety validation. Volume 2

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Parks, C.V.

    1999-01-01

    This report presents the application of sensitivity and uncertainty (S/U) analysis methodologies developed in Volume 1 to the code/data validation tasks of a criticality safety computational study. Sensitivity and uncertainty analysis methods were first developed for application to fast reactor studies in the 1970s. This work has revitalized and updated the existing S/U computational capabilities such that they can be used as prototypic modules of the SCALE code system, which contains criticality analysis tools currently in use by criticality safety practitioners. After complete development, simplified tools are expected to be released for general use. The methods for application of S/U and generalized linear-least-squares methodology (GLLSM) tools to the criticality safety validation procedures were described in Volume 1 of this report. Volume 2 of this report presents the application of these procedures to the validation of criticality safety analyses supporting uranium operations where enrichments are greater than 5 wt %. Specifically, the traditional k_eff trending analyses are compared with newly developed k_eff trending procedures, utilizing the D and c_k coefficients described in Volume 1. These newly developed procedures are applied to a family of postulated systems involving U(11)O_2 fuel, with H/X values ranging from 0 to 1,000. These analyses produced a series of guidance and recommendations for the general usage of these various techniques. Recommendations for future work are also detailed
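
    The core arithmetic behind such S/U methods is first-order propagation of cross-section covariances through k_eff sensitivities, the so-called sandwich rule. A minimal sketch, where the sensitivity vector and covariance matrix are illustrative stand-ins rather than SCALE-generated data:

```python
import numpy as np

# Relative sensitivities of k_eff to two nuclide-reaction pairs (illustrative)
S = np.array([0.30, -0.05])

# Relative covariance matrix of the corresponding cross sections (illustrative)
C = np.array([[4.0e-4, 1.0e-5],
              [1.0e-5, 9.0e-4]])

# First-order "sandwich rule": relative variance of k_eff = S C S^T
rel_var = S @ C @ S
rel_std_pct = 100.0 * np.sqrt(rel_var)  # % uncertainty in k_eff
```

    Shared sensitivity profiles between a benchmark and an application system are what similarity coefficients such as c_k summarize; the sketch above shows only the single-system variance step.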

  9. Accidental safety analysis methodology development in decommission of the nuclear facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. H.; Hwang, J. H.; Jae, M. S.; Seong, J. H.; Shin, S. H.; Cheong, S. J.; Pae, J. H.; Ang, G. R.; Lee, J. U. [Seoul National Univ., Seoul (Korea, Republic of)

    2002-03-15

    Decontamination and Decommissioning (D and D) of a nuclear reactor costs about 20% of the construction expense, and the production of nuclear wastes during decommissioning raises environmental issues. Decommissioning of nuclear reactors in Korea is just beginning, and clear standards and regulations for decommissioning are lacking. This work, an accident safety analysis for decommissioning of nuclear facilities, can provide a solid ground for such standards and regulations. For the source term analysis of the Kori-1 reactor vessel, an MCNP/ORIGEN calculation methodology was applied. The activity of each important nuclide in the vessel was estimated at a time after 2008, the year the Kori-1 plant is supposed to be decommissioned. In addition, a methodology for risk analysis assessment in decommissioning was developed.

  10. Development of analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kim, Cheol Woo; Kwon, Young Min; Kim, Sook Kwan

    1995-04-01

    A study for the development of an analysis methodology for hot leg break mass and energy release was performed. For the blowdown period a modified CEFLASH-4A methodology is suggested. For the post-blowdown period a modified CONTRAST boil-off model is suggested. By using these computer codes, improved mass and energy release data are generated. Also, a RELAP5/MOD3 analysis was performed, and finally the FLOOD-3 computer code was modified for use in the analysis of hot leg break. The results of the analysis using the modified FLOOD-3 are reasonable, as expected, and their trends are good. 66 figs., 8 tabs. (Author)

  11. Using functional analysis in archival appraisal a practical and effective alternative to traditional appraisal methodologies

    CERN Document Server

    Robyns, Marcus C

    2014-01-01

    In an age of scarcity and the challenge of electronic records, can archivists and records managers continue to rely upon traditional methodology essentially unchanged since the early 1950s? Using Functional Analysis in Archival Appraisal: A Practical and Effective Alternative to Traditional Appraisal Methodologies shows how archivists in other countries are already using functional analysis, which offers a better, more effective, and eminently more practical alternative to traditional appraisal methodologies that rely upon an analysis of the records themselves.

  12. Using HABIT to Establish the Chemicals Analysis Methodology for Maanshan Nuclear Power Plant

    OpenAIRE

    J. R. Wang; S. W. Chen; Y. Chiang; W. S. Hsu; J. H. Yang; Y. S. Tseng; C. Shih

    2017-01-01

    In this research, the HABIT analysis methodology was established for Maanshan nuclear power plant (NPP). The Final Safety Analysis Report (FSAR), reports, and other data were used in this study. To evaluate the control room habitability under the CO2 storage burst, the HABIT methodology was used to perform this analysis. The HABIT result was below the R.G. 1.78 failure criteria. This indicates that Maanshan NPP habitability can be maintained. Additionally, the sensitivity study of the paramet...

  13. Geostatistical approach for assessing soil volumes requiring remediation: validation using lead-polluted soils underlying a former smelting works.

    Science.gov (United States)

    Demougeot-Renard, Helene; De Fouquet, Chantal

    2004-10-01

    Assessing the volume of soil requiring remediation and the accuracy of this assessment constitutes an essential step in polluted site management. If this remediation volume is not properly assessed, misclassification may lead both to environmental risks (polluted soils may not be remediated) and financial risks (unexpected discovery of polluted soils may generate additional remediation costs). To minimize such risks, this paper proposes a geostatistical methodology based on stochastic simulations that allows the remediation volume and its uncertainty to be assessed using investigation data. The methodology thoroughly reproduces the conditions in which the soils are classified and extracted at the remediation stage. The validity of the approach is tested by applying it to the data collected during the investigation phase of a former lead smelting works and by comparing the results with the volume that was actually remediated. This real remediated volume was composed of all the remediation units that were classified as polluted after systematic sampling and analysis during the clean-up stage. The volume estimated from the 75 samples collected during site investigation slightly overestimates (5.3% relative error) the remediated volume deduced from the 212 remediation units.
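
    The simulation-based volume estimate at the heart of such a methodology reduces to classifying each remediation unit in each simulated realization against the clean-up threshold and reading volume statistics off the ensemble. A minimal sketch in which the threshold, unit volume, and simulated concentration field are illustrative assumptions (only the 212-unit count echoes the case study):

```python
import numpy as np

rng = np.random.default_rng(0)

n_realizations, n_units = 500, 212   # 212 remediation units, as in the case study
unit_volume = 10.0                   # m3 per remediation unit (assumption)
threshold = 300.0                    # clean-up threshold, mg/kg (assumption)

# Stand-in for conditional geostatistical simulations of lead concentration;
# a real study would use simulations conditioned on the 75 investigation samples
sims = rng.lognormal(mean=5.0, sigma=0.8, size=(n_realizations, n_units))

# Volume classified as polluted in each realization
volumes = (sims > threshold).sum(axis=1) * unit_volume

# Expected remediation volume and a 90% uncertainty interval
v_mean = volumes.mean()
v_low, v_high = np.percentile(volumes, [5, 95])
```

    Comparing this simulated distribution against the volume actually excavated is exactly the validation exercise the paper performs.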

  14. Control-Volume Analysis Of Thrust-Augmenting Ejectors

    Science.gov (United States)

    Drummond, Colin K.

    1990-01-01

    New method of analysis of transient flow in thrust-augmenting ejector based on control-volume formulation of governing equations. Considered as potential elements of propulsion subsystems of short-takeoff/vertical-landing airplanes.

  15. Methodology for reactor core physics analysis - part 2

    International Nuclear Information System (INIS)

    Ponzoni Filho, P.; Fernandes, V.B.; Lima Bezerra, J. de; Santos, T.I.C.

    1992-12-01

    The computer codes used for reactor core physics analysis are described. The modifications introduced in the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameters involved in qualifying the methodology is included. (F.E.). 5 ref, 7 figs, 5 tabs

  16. Performance evaluation of the technical capabilities of DOE sites for disposal of mixed low-level waste: Volume 3, Site evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Waters, R.D.; Gruebel, M.M. [eds.

    1996-03-01

    A team of analysts designed and conducted a performance evaluation to estimate the technical capabilities of fifteen Department of Energy sites for disposal of mixed low-level waste (i.e., waste that contains both low-level radioactive materials and hazardous constituents). Volume 1 summarizes the process for selecting the fifteen sites, the methodology used in the evaluation, and the conclusions derived from the evaluation. Volume 2 provides details about the site-selection process, the performance-evaluation methodology, and the overall results of the analysis. Volume 3 contains detailed evaluations of the fifteen sites and discussion of the results for each site.

  17. Performance evaluation of the technical capabilities of DOE sites for disposal of mixed low-level waste: Volume 3, Site evaluations

    International Nuclear Information System (INIS)

    Waters, R.D.; Gruebel, M.M.

    1996-03-01

    A team of analysts designed and conducted a performance evaluation to estimate the technical capabilities of fifteen Department of Energy sites for disposal of mixed low-level waste (i.e., waste that contains both low-level radioactive materials and hazardous constituents). Volume 1 summarizes the process for selecting the fifteen sites, the methodology used in the evaluation, and the conclusions derived from the evaluation. Volume 2 provides details about the site-selection process, the performance-evaluation methodology, and the overall results of the analysis. Volume 3 contains detailed evaluations of the fifteen sites and discussion of the results for each site

  18. New methodology for a person identification system

    Indian Academy of Sciences (India)

    Sadhana, Volume 31, Issue 3, June 2006, pp. 259-276. R Bremananth, A Chitra. ... Experimental results illustrate that the proposed method has been easily espoused in elections, bank transactions and other security applications.

  19. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies employed for the analysis of uncertainties in the Level 2 PSA, especially the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared based on the following criterion: the uncertainty quantification process must be logical, scrutable, complete, consistent and at an appropriate level of detail, as mandated by the Level 2 PSA objectives. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) selection of phenomenological branch events for uncertainty analysis in the APET, a methodology for quantification of APET uncertainty inputs and its implementation procedure, (3) statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes
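
    The statistical propagation of uncertain branch probabilities through an APET can be illustrated with Monte Carlo sampling over a toy two-question event tree. The tree structure and the beta distributions standing in for analyst degrees of belief are assumptions for illustration only:

```python
import random

random.seed(42)

def sample_containment_failure():
    """One Monte Carlo pass through a toy two-question event tree.

    Each branch probability is itself uncertain (here beta-distributed),
    reflecting the analyst's subjective degree of belief; a full APET has
    many more questions and dependencies between them.
    """
    p_early = random.betavariate(2, 8)   # uncertain early-failure probability
    p_late = random.betavariate(3, 7)    # uncertain late-failure probability
    # Probability of any containment failure for this sampled tree
    return p_early + (1.0 - p_early) * p_late

n = 20000
samples = [sample_containment_failure() for _ in range(n)]
mean_failure = sum(samples) / n
```

    The spread of `samples`, not just its mean, is the quantity of interest: it carries the epistemic uncertainty forward into the source term categories.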

  20. Methodology for Design and Analysis of Reactive Distillation Involving Multielement Systems

    DEFF Research Database (Denmark)

    Jantharasuk, Amnart; Gani, Rafiqul; Górak, Andrzej

    2011-01-01

    A new methodology for design and analysis of reactive distillation has been developed. In this work, the elementbased approach, coupled with a driving force diagram, has been extended and applied to the design of a reactive distillation column involving multielement (multicomponent) systems...... consisting of two components. Based on this methodology, an optimal design configuration is identified using the equivalent binary-element-driving force diagram. Two case studies of methyl acetate (MeOAc) synthesis and methyl-tert-butyl ether (MTBE) synthesis have been considered to demonstrate...... the successful applications of the methodology. Moreover, energy requirements for various column configurations corresponding to different feed locatio...

  1. Architectural and Behavioral Systems Design Methodology and Analysis for Optimal Habitation in a Volume-Limited Spacecraft for Long Duration Flights

    Science.gov (United States)

    Kennedy, Kriss J.; Lewis, Ruthan; Toups, Larry; Howard, Robert; Whitmire, Alexandra; Smitherman, David; Howe, Scott

    2016-01-01

    As our human spaceflight missions change as we reach towards Mars, the risk of an adverse behavioral outcome increases, and requirements for crew health, safety, and performance, and for the internal architecture, will need to change to accommodate unprecedented mission demands. Evidence shows that architectural arrangement and habitability elements impact behavior. Net habitable volume is the volume available to the crew after accounting for elements that decrease the functional volume of the spacecraft. Determination of the minimum acceptable net habitable volume and associated architectural design elements, as mission duration and environment vary, is key to enabling, maintaining, and/or enhancing human performance and psychological and behavioral health. Current NASA efforts to derive minimum acceptable net habitable volumes, study the interaction of covariates and stressors, such as sensory stimulation, communication, autonomy, and privacy, and apply the results to internal architecture design layouts, attributes, and use of advanced accommodations will be presented. Furthermore, implications of crew adaptation to available volume as they transfer from Earth accommodations, to deep space travel, to planetary surface habitats, and back will be discussed.

  2. Tidal volume and mortality in mechanically ventilated children: a systematic review and meta-analysis of observational studies*.

    Science.gov (United States)

    de Jager, Pauline; Burgerhof, Johannes G M; van Heerde, Marc; Albers, Marcel J I J; Markhorst, Dick G; Kneyber, Martin C J

    2014-12-01

    To determine whether tidal volume is associated with mortality in critically ill, mechanically ventilated children. MEDLINE, EMBASE, and CINAHL databases from inception until July 2013 and bibliographies of included studies, without language restrictions. Randomized clinical trials and observational studies reporting mortality in mechanically ventilated PICU patients. Two authors independently selected studies and extracted data on study methodology, quality, and patient outcomes. Meta-analyses were performed using the Mantel-Haenszel random-effects model. Heterogeneity was quantified using I². Study quality was assessed using the Newcastle-Ottawa Score for cohort studies. Out of 142 citations, seven studies met the inclusion criteria, and an additional two articles were identified from the references of the found articles. One was excluded. These eight studies included 1,756 patients. Mortality rates ranged from 13% to 42%. There was no association between tidal volume and mortality when tidal volume was dichotomized at 7, 8, 10, or 12 mL/kg. Comparing patients ventilated with tidal volume less than 7 mL/kg and greater than 10 mL/kg or greater than 12 mL/kg, and tidal volume less than 8 mL/kg and greater than 10 mL/kg or greater than 12 mL/kg, also showed no association between tidal volume and mortality. Limiting the analysis to patients with acute lung injury/acute respiratory distress syndrome did not change these results. Heterogeneity was observed in all pooled analyses. A relationship between tidal volume and mortality in mechanically ventilated children could not be identified, irrespective of the severity of disease. The significant heterogeneity observed in the pooled analyses necessitates future studies in well-defined patient populations to understand the effects of tidal volume on patient outcome.
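
    The I² statistic used to quantify heterogeneity in such pooled analyses is derived from Cochran's Q. A minimal sketch of the standard computation, with illustrative effect sizes and inverse-variance weights rather than the review's data:

```python
def i_squared(effects, weights):
    """Higgins' I^2 (%) from study effect sizes and inverse-variance weights.

    Q is Cochran's heterogeneity statistic; I^2 expresses the share of total
    variation across studies attributable to heterogeneity rather than chance.
    """
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    return max(0.0, 100.0 * (q - df) / q) if q > 0 else 0.0

# Illustrative log odds ratios from four studies, with illustrative weights
i2 = i_squared([0.10, 0.45, -0.20, 0.60], [10.0, 8.0, 12.0, 6.0])
```

    By convention, I² values around 25%, 50%, and 75% are read as low, moderate, and high heterogeneity, which is why the "significant heterogeneity" noted above tempers the pooled conclusion.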

  3. Analysis of Interbrand, BrandZ and BAV brand valuation methodologies

    Directory of Open Access Journals (Sweden)

    Krstić Bojan

    2011-01-01

    Brand valuation is considered one of the most significant challenges not only for the theory and practice of contemporary marketing, but for other disciplines as well. Namely, the complex nature of this issue implies the need for a multidisciplinary approach and the creation of a methodology which goes beyond the borders of marketing as a discipline and includes knowledge derived from accounting, finance and other areas. However, mostly one-sided approaches, oriented towards determining brand value either based on research results of consumer behavior and attitudes or based on the financial success of the brand, are dominant in the marketing and financial literature. Simultaneously with these theoretical methodologies, consultancy and marketing agencies and other subjects have been developing their own brand valuation methods and models. Some of them can be assigned to a comprehensive approach to brand valuation, which overcomes the mentioned problem of one-sided analysis of brand value. The comprehensive approach presumes brand valuation based on the benefits which a brand provides to both the customers and the enterprise that owns it, in other words, based on qualitative and quantitative measures respectively reflecting the behavior and attitudes of consumers and the assumed financial value of the brand, or, more precisely, brand value capitalization. According to the defined research subject, this paper is structured as follows: the importance and problem of brand value are reviewed in the Introduction, and the three most well-known brand valuation methodologies developed by consultancy agencies - the Interbrand methodology and the BrandZ and BAV models - are analyzed in the next section. In the further considerations, the results of a comparative analysis of these methodologies are presented and implications for adequate brand valuation are suggested.

  4. Latest developments on safety analysis methodologies at the Juzbado plant

    International Nuclear Information System (INIS)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A.

    2010-01-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, namely, the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, such as Regulator requirements, operational experience and, of course, the strong commitment of ENUSA to maintaining the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, and within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated earlier in 2009 to assess the actual operating conditions of all the systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points related to these three methodologies, as well as an outline of the results obtained so far. (authors)

  5. Optimisation of the link volume for weakest link failure prediction in NBG-18 nuclear graphite

    International Nuclear Information System (INIS)

    Hindley, Michael P.; Groenwold, Albert A.; Blaine, Deborah C.; Becker, Thorsten H.

    2014-01-01

    This paper describes the process for approximating the optimal size of the link volume required for a weakest-link failure calculation in nuclear graphite, with NBG-18 used as an example. As part of the failure methodology, the link volume is defined in terms of two grouping criteria: the first is a factor of the maximum grain size, and the second is a function of an equivalent stress limit. A methodology for approximating these grouping criteria is presented. The failure methodology employs finite element analysis (FEA) to predict the failure load at 50% probability of failure. The average experimental failure load, as determined for 26 test geometries, is used to evaluate the accuracy of the weakest-link failure calculations. The influence of the two grouping criteria on the failure load prediction is evaluated by defining a prediction error across all test cases. Mathematical optimisation is used to find the minimum error across the range of test-case failure predictions. This minimum error is shown to deliver the most accurate failure prediction across the whole range of components, although some test cases in the range predict conservative failure loads. The optimisation objective function is then penalised to account for non-conservative prediction of the failure load in any test case, the optimisation is repeated, and a link volume is found for conservative failure prediction. The failure prediction for each test case is evaluated, in detail, for the proposed link volumes. Based on the analysis, link design volumes for NBG-18 are recommended for either accurate or conservative failure prediction.
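
The penalised optimisation described above can be sketched as a grid search over the two grouping criteria, minimising a prediction error that punishes non-conservative (over-predicted) failure loads. This is an illustrative sketch only: the surrogate `predict_failure_load`, the penalty factor and the data are assumptions, not the authors' FEA-based calculation.

```python
def prediction_error(predicted, measured, penalty=10.0):
    """Mean relative error; non-conservative predictions (predicted > measured)
    are penalised so the optimum favours conservative failure loads."""
    total = 0.0
    for p, m in zip(predicted, measured):
        err = abs(p - m) / m
        if p > m:  # non-conservative: predicted failure load too high
            err *= penalty
        total += err
    return total / len(predicted)

def predict_failure_load(grain_factor, stress_limit, case):
    # Toy surrogate standing in for the FEA weakest-link calculation,
    # so the sketch is runnable; not the paper's model.
    return case["base_load"] * (1.0 - 0.02 * grain_factor) * stress_limit

def optimise_criteria(cases, grain_factors, stress_limits):
    """Exhaustive search over the two grouping criteria."""
    best = None
    for gf in grain_factors:
        for sl in stress_limits:
            pred = [predict_failure_load(gf, sl, c) for c in cases]
            meas = [c["measured"] for c in cases]
            err = prediction_error(pred, meas)
            if best is None or err < best[0]:
                best = (err, gf, sl)
    return best  # (minimum error, grain factor, stress limit)
```

In the paper the two criteria feed the link-volume definition and the error is evaluated against 26 experimental geometries; here two toy cases suffice to show the mechanics.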

  6. Methodology for the analysis of pollutant emissions from a city bus

    International Nuclear Information System (INIS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-01-01

    In this work a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As the test circuit, a passenger transportation line in a Spanish city was used. Different ways of processing and representing the data were studied and, derived from this work, a new approach is proposed. The methodology was useful for detecting the most important uncertainties arising during the registration and processing of data from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz data recording. The methodology proposed allows for the comparison of results (as mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel–air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least. (paper)
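
The "analysis by categories" idea can be sketched as follows: split a 1 Hz velocity trace into acceleration, deceleration, cruise and idle sequences, then report the mean emission level per category. The thresholds and data below are illustrative assumptions, not the paper's values.

```python
def categorise(velocity, accel_threshold=0.3):
    """Label each 1 Hz sample transition by driving category (illustrative thresholds)."""
    cats = []
    for i in range(1, len(velocity)):
        dv = velocity[i] - velocity[i - 1]  # m/s per sample at 1 Hz
        if velocity[i] < 0.5:
            cats.append("idle")
        elif dv > accel_threshold:
            cats.append("acceleration")
        elif dv < -accel_threshold:
            cats.append("deceleration")
        else:
            cats.append("cruise")
    return cats

def mean_by_category(cats, emissions):
    """Mean emission value (any pollutant, arbitrary units) per driving category."""
    sums, counts = {}, {}
    for c, e in zip(cats, emissions):
        sums[c] = sums.get(c, 0.0) + e
        counts[c] = counts.get(c, 0) + 1
    return {c: sums[c] / counts[c] for c in sums}
```

Comparing the per-category means against the whole-cycle mean is what lets the paper attribute most of the total emissions to acceleration sequences.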

  7. Methodology for the analysis of pollutant emissions from a city bus

    Science.gov (United States)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As the test circuit, a passenger transportation line in a Spanish city was used. Different ways of processing and representing the data were studied and, derived from this work, a new approach is proposed. The methodology was useful for detecting the most important uncertainties arising during the registration and processing of data from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz data recording. The methodology proposed allows for the comparison of results (as mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least.

  8. Improved robotic stereotactic body radiation therapy plan quality and planning efficacy for organ-confined prostate cancer utilizing overlap-volume histogram-driven planning methodology

    International Nuclear Information System (INIS)

    Wu, Binbin; Pang, Dalong; Lei, Siyuan; Gatti, John; Tong, Michael; McNutt, Todd; Kole, Thomas; Dritschilo, Anatoly; Collins, Sean

    2014-01-01

    Background and purpose: This study determines whether the overlap-volume histogram (OVH)-driven planning methodology can be adapted to robotic SBRT (CyberKnife Robotic Radiosurgery System) to further reduce the bladder and rectal doses achieved in plans created manually by clinical planners. Methods and materials: A database of clinically delivered robotic SBRT plans (36.25 Gy at 7.25 Gy/fraction) for 425 patients with localized prostate cancer was used as a cohort to establish an organ's distance-to-dose model. The OVH-driven planning methodology was refined by adding a PTV volume factor to counter the target's dose fall-off effect, and was incorporated into Multiplan to automate SBRT planning. For validation, automated plans (APs) for 12 new patients were generated, and their achieved dose/volume values were compared to the corresponding manually created, clinically delivered plans (CPs). A two-sided Wilcoxon rank-sum test was used for statistical comparison, with a significance level of p < 0.05. Results: PTV V(36.25 Gy) was comparable: 95.6% in CPs compared to 95.1% in APs (p = 0.2). On average, the refined approach lowered V(18.12 Gy) to the bladder and rectum by 8.2% (p < 0.05) and 6.4% (p = 0.14), respectively. A physician confirmed that the APs were clinically acceptable. Conclusions: The improvements in APs could further reduce the toxicities observed in SBRT for organ-confined prostate cancer.
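
The two-sided Wilcoxon rank-sum comparison used above boils down to the Mann-Whitney U statistic on the paired sets of dose/volume values. A minimal stdlib sketch of the statistic (midranks handle ties; the p-value machinery is omitted for brevity, and the data shown in the test are illustrative, not the study's):

```python
def rank_sum_u(x, y):
    """Mann-Whitney U statistic for sample x against sample y (midranks for ties)."""
    combined = sorted((v, i) for i, v in enumerate(x + y))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1  # extend over a run of tied values
        midrank = (i + j) / 2.0 + 1.0  # average 1-based rank for the tie group
        for k in range(i, j + 1):
            ranks[combined[k][1]] = midrank
        i = j + 1
    r1 = sum(ranks[:len(x)])               # rank sum of the first sample
    return r1 - len(x) * (len(x) + 1) / 2  # U statistic for sample x
```

In practice one would use a library implementation (e.g. `scipy.stats.mannwhitneyu`) that also supplies the two-sided p-value; the sketch just shows what is being computed.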

  9. Towards a Multimodal Methodology for the Analysis of Translated/Localised Games

    Directory of Open Access Journals (Sweden)

    Bárbara Resende Coelho

    2016-12-01

    Multimedia materials require research methodologies that can comprehend all of their assets. Videogames are the epitome of multimedia, joining image, sound, video, animation, graphics and text with the factor of interactivity. A methodology for research into the translation and localisation of videogames should be able to analyse all of their assets and features. This paper sets out to develop a research methodology for games and their translations/localisations that goes beyond the collection and analysis of screenshots and includes as many of their assets as possible. Using the fully localised version of the game Watchdogs, this paper shows how tools and technologies allow for transcending the mere analysis of linguistic content within multimedia materials. Using the ELAN software (The Language Archive) to analyse Portuguese-language dubbed and English-language subtitled excerpts from the videogame, it was possible to identify patterns in both linguistic and audio-visual elements, as well as to correlate them.

  10. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.; Carroll, R. J.; Dabney, A. R.

    2012-01-01

    positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon
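
The truncated abstract above mentions comparing skewed, left-censored LC-MS intensity data with standard tests such as Kolmogorov-Smirnov and Wilcoxon. A minimal stdlib sketch of one such comparison is given below; the censoring rule (substituting the detection limit) and the data are illustrative assumptions, not the authors' procedure.

```python
def censor(values, detection_limit):
    """Left-censor: intensities below the detection limit are set to the limit
    (a simple substitution rule; survival methods model censoring more faithfully)."""
    return [max(v, detection_limit) for v in values]

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical distance
    between the two empirical CDFs."""
    a, b = sorted(a), sorted(b)
    d = 0.0
    for p in sorted(set(a + b)):
        fa = sum(1 for v in a if v <= p) / len(a)
        fb = sum(1 for v in b if v <= p) / len(b)
        d = max(d, abs(fa - fb))
    return d
```

The paper's point is precisely that such substitution-plus-test pipelines handle censoring crudely, motivating survival-analysis methodology instead.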

  11. Systemic design methodologies for electrical energy systems analysis, synthesis and management

    CERN Document Server

    Roboam, Xavier

    2012-01-01

    This book proposes systemic design methodologies applied to electrical energy systems, in particular analysis and system management, modeling and sizing tools. It includes 8 chapters: after an introduction to the systemic approach (history, basics & fundamental issues, index terms) for designing energy systems, this book presents two different graphical formalisms especially dedicated to multidisciplinary devices modeling, synthesis and analysis: Bond Graph and COG/EMR. Other systemic analysis approaches for quality and stability of systems, as well as for safety and robustness analysis tools are also proposed. One chapter is dedicated to energy management and another is focused on Monte Carlo algorithms for electrical systems and networks sizing. The aim of this book is to summarize design methodologies based in particular on a systemic viewpoint, by considering the system as a whole. These methods and tools are proposed by the most important French research laboratories, which have many scientific partn...

  12. Diversion path analysis handbook. Volume 2 (of 4 volumes). Example

    International Nuclear Information System (INIS)

    Goodwin, K.E.; Schleter, J.C.; Maltese, M.D.K.

    1978-11-01

    Volume 2 of the Handbook is divided into two parts, the workpaper documentation and the summary documentation. The former sets forth, in terms of the hypothetical process, the analysis guidelines, the information gathered, the characterization of the process, the specific diversion paths related to the process, and, finally, the results and findings of the Diversion Path Analysis (DPA). The summary documentation, made up of portions of sections already prepared for the workpapers, is a concise statement of results and recommendations for management use. Most of the details available in the workpapers are not used, or are held to a minimum, in this report. Also, some rearrangement of the excerpted sections has been made in order to permit rapid comprehension by a manager having only limited time to devote to study and review of the analysis

  13. Evaluation of potential severe accidents during Low Power and Shutdown Operations at Grand Gulf, Unit 1. Volume 2, Part 1B: Analysis of core damage frequency from internal events for Plant Operational State 5 during a refueling outage, Main report (Section 10)

    International Nuclear Information System (INIS)

    Whitehead, D.; Darby, J.; Yakle, J.

    1994-06-01

    This document contains the accident sequence analysis of internally initiated events for Grand Gulf, Unit 1 as it operates in the Low Power and Shutdown Plant Operational State 5 during a refueling outage. The report documents the methodology used during the analysis, describes the results from the application of the methodology, and compares the results with those from two full-power analyses performed on Grand Gulf. This document, Volume 2, Part 1B, presents Section 10 of this report, Human Reliability Analysis.

  14. Network meta-analysis - highly attractive but more methodological research is needed

    Directory of Open Access Journals (Sweden)

    Singh Sonal

    2011-06-01

    Network meta-analysis, in the context of a systematic review, is a meta-analysis in which multiple treatments (that is, three or more) are compared using both direct comparisons of interventions within randomized controlled trials and indirect comparisons across trials based on a common comparator. To ensure the validity of findings from network meta-analyses, the systematic review must be designed rigorously and conducted carefully. Aspects of designing and conducting a systematic review for network meta-analysis include defining the review question, specifying eligibility criteria, searching for and selecting studies, assessing risk of bias and quality of evidence, conducting the network meta-analysis, and interpreting and reporting findings. This commentary summarizes the methodologic challenges and research opportunities for network meta-analysis relevant to each aspect of the systematic review process, based on discussions at a network meta-analysis methodology meeting we hosted in May 2010 at the Johns Hopkins Bloomberg School of Public Health. Since this commentary reflects the discussion at that meeting, it is not intended to provide an overview of the field.

  15. Whole-Volume Clustering of Time Series Data from Zebrafish Brain Calcium Images via Mixture Modeling.

    Science.gov (United States)

    Nguyen, Hien D; Ullmann, Jeremy F P; McLachlan, Geoffrey J; Voleti, Venkatakaushik; Li, Wenze; Hillman, Elizabeth M C; Reutens, David C; Janke, Andrew L

    2018-02-01

    Calcium is a ubiquitous messenger in neural signaling events. An increasing number of techniques enable visualization of neurological activity in animal models via luminescent proteins that bind to calcium ions. These techniques generate large volumes of spatially correlated time series. A model-based functional data analysis methodology via Gaussian mixtures is proposed for the clustering of data from such visualizations. The methodology is theoretically justified, and a computationally efficient approach to estimation is suggested. An example analysis of a zebrafish imaging experiment is presented.
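
The model-based clustering idea rests on fitting a Gaussian mixture by expectation-maximisation. A toy two-component, one-dimensional EM is sketched below to show the mechanics; the paper works with far richer functional mixtures fitted to whole time series, so everything here (initialisation, dimensionality, data) is an illustrative assumption.

```python
import math

def em_gmm_1d(data, iters=50):
    """EM for a two-component 1-D Gaussian mixture (toy illustration)."""
    mu = [min(data), max(data)]  # crude initialisation at the data extremes
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            dens = [w[k] / math.sqrt(2 * math.pi * var[k])
                    * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            s = sum(dens)
            resp.append([d / s for d in dens])
        # M-step: re-estimate weights, means and variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(1e-6, sum(r[k] * (x - mu[k]) ** 2
                                   for r, x in zip(resp, data)) / nk)
    return mu, var, w
```

Cluster membership then follows from the responsibilities: each time series (or summary feature) is assigned to the component that explains it best.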

  16. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    Science.gov (United States)

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims at a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separated and undifferentiated waste. The methodology enables cost-efficiency analysis, benchmarking and variance analysis, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies; variance analysis can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently. The methodology can also be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
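
The "standard cost times actual quantity" idea can be sketched in a few lines: the full collection cost is built from benchmark unit costs, so the variance against the actual reported cost isolates efficiency from accounting choices. The unit costs below are hypothetical figures for illustration, not values from the study.

```python
# Hypothetical standard costs (EUR/tonne) for each collection stream.
STANDARD_COST_PER_TONNE = {
    "separated": 180.0,
    "undifferentiated": 95.0,
}

def full_collection_cost(quantities_tonnes):
    """Benchmark full cost = standard unit cost x actual collected quantity."""
    return sum(STANDARD_COST_PER_TONNE[w] * q for w, q in quantities_tonnes.items())

def cost_variance(actual_cost, quantities_tonnes):
    """Positive variance = actual spending above the standard-cost benchmark."""
    return actual_cost - full_collection_cost(quantities_tonnes)
```

A positive variance flags off-standard performance to investigate; comparing variances across municipalities gives the benchmarking the abstract describes.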

  17. PIXE methodology of rare earth element analysis and its applications

    International Nuclear Information System (INIS)

    Ma Xinpei

    1992-01-01

    The Proton Induced X-ray Emission (PIXE) methodology for rare earth element (REE) analysis is discussed, including the significance of REE analysis, the principle of PIXE applied to REEs, the selection of characteristic X-rays for lanthanide-series elements, the deconvolution of highly overlapped PIXE spectra, and the minimum detection limit (MDL) of REEs. Some practical applications are presented, and the particular features of PIXE analysis of high-purity REE chemicals are discussed. (author)

  18. Methodologies for the Statistical Analysis of Memory Response to Radiation

    CERN Document Server

    Bosser, Alexandre L; Tsiligiannis, Georgios; Frost, Christopher D; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigne, Frederic; Virtanen, Ari; Wrobel, Frederic; Dilillo, Luigi

    2016-01-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].

  19. Development and analysis of finite volume methods

    International Nuclear Information System (INIS)

    Omnes, P.

    2010-05-01

    This document is a synthesis of a set of works concerning the development and the analysis of finite volume methods used for the numerical approximation of partial differential equations (PDEs) stemming from physics. In the first part, the document deals with co-localized Godunov type schemes for the Maxwell and wave equations, with a study on the loss of precision of this scheme at low Mach number. In the second part, discrete differential operators are built on fairly general, in particular very distorted or nonconforming, bidimensional meshes. These operators are used to approach the solutions of PDEs modelling diffusion, electro and magneto-statics and electromagnetism by the discrete duality finite volume method (DDFV) on staggered meshes. The third part presents the numerical analysis and some a priori as well as a posteriori error estimations for the discretization of the Laplace equation by the DDFV scheme. The last part is devoted to the order of convergence in the L2 norm of the finite volume approximation of the solution of the Laplace equation in one dimension and on meshes with orthogonality properties in two dimensions. Necessary and sufficient conditions, relatively to the mesh geometry and to the regularity of the data, are provided that ensure the second-order convergence of the method. (author)
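
The last part of the synthesis concerns second-order convergence of the finite volume approximation of the Laplace equation in one dimension. The behaviour can be illustrated with a toy cell-centred scheme for -u'' = f on (0,1) with homogeneous Dirichlet data; this is an illustrative sketch under standard assumptions, not code from the document.

```python
import math

def solve_poisson_fv(n, f):
    """Cell-centred finite volume scheme for -u'' = f on (0,1), u(0)=u(1)=0.
    Returns cell centres and cell values; solved with the Thomas algorithm."""
    h = 1.0 / n
    centers = [(i + 0.5) * h for i in range(n)]
    lower = [-1.0 / h] * n
    upper = [-1.0 / h] * n
    diag = [2.0 / h] * n
    diag[0] = diag[-1] = 3.0 / h  # half-cell distance from centre to boundary
    rhs = [h * f(x) for x in centers]
    # Forward elimination
    for i in range(1, n):
        m = lower[i] / diag[i - 1]
        diag[i] -= m * upper[i - 1]
        rhs[i] -= m * rhs[i - 1]
    # Back substitution
    u = [0.0] * n
    u[-1] = rhs[-1] / diag[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (rhs[i] - upper[i] * u[i + 1]) / diag[i]
    return centers, u

def max_error(n):
    """Max-norm error against the manufactured solution u = sin(pi x)."""
    xs, u = solve_poisson_fv(n, lambda x: math.pi ** 2 * math.sin(math.pi * x))
    return max(abs(ui - math.sin(math.pi * x)) for x, ui in zip(xs, u))
```

Halving the mesh size should cut the max-norm error by roughly a factor of four, the second-order behaviour the document establishes for meshes with orthogonality properties.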

  20. Cost analysis methodology of spent fuel storage

    International Nuclear Information System (INIS)

    1994-01-01

    The report deals with the cost analysis of interim spent fuel storage; however, it is not intended either to give a detailed cost analysis or to compare the costs of the different options. This report provides a methodology for calculating the costs of different options for interim storage of the spent fuel produced in the reactor cores. Different technical features and storage options (dry and wet, away from reactor and at reactor) are considered and the factors affecting all options defined. The major cost categories are analysed. Then the net present value of each option is calculated and the levelized cost determined. Finally, a sensitivity analysis is conducted taking into account the uncertainty in the different cost estimates. Examples of current storage practices in some countries are included in the Appendices, with description of the most relevant technical and economic aspects. 16 figs, 14 tabs
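
The report's cost comparison steps (net present value of yearly costs, then a levelized cost per unit of fuel stored) can be sketched as follows; the discount rate and figures in the test are illustrative assumptions, not values from the report.

```python
def npv(cash_flows, rate):
    """Net present value of a yearly series; index 0 is the current year (undiscounted)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def levelized_cost(yearly_costs, yearly_tonnes, rate):
    """Levelized cost: discounted costs divided by discounted quantities stored."""
    return npv(yearly_costs, rate) / npv(yearly_tonnes, rate)
```

Running the same calculation under perturbed cost estimates is the basis of the sensitivity analysis the report describes.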

  1. Price-volume multifractal analysis and its application in Chinese stock markets

    Science.gov (United States)

    Yuan, Ying; Zhuang, Xin-tian; Liu, Zhi-ying

    2012-06-01

    An empirical study of Chinese stock markets is conducted using statistical tools. First, the multifractality of the stock price return series, r_t = ln(P_{t+1}) - ln(P_t), and of the trading volume variation series, v_t = ln(V_{t+1}) - ln(V_t), is confirmed using multifractal detrended fluctuation analysis. Furthermore, a multifractal detrended cross-correlation analysis between stock price returns and trading volume variations in Chinese stock markets is conducted; the cross relationship between them is also found to be multifractal. Second, the cross-correlation between stock price P_t and trading volume V_t is studied empirically using the cross-correlation function and detrended cross-correlation analysis. Both the Shanghai and Shenzhen stock markets show pronounced long-range cross-correlations between stock price and trading volume. Third, a composite index R based on price and trading volume is introduced. Compared with the stock price return series r_t and the trading volume variation series v_t, the R variation series not only retains the characteristics of the original series but also demonstrates the relative correlation between stock price and trading volume. Finally, we analyze the multifractal characteristics of the R variation series before and after three financial events in China (namely, Price Limits, the Reform of Non-tradable Shares, and the financial crisis in 2008) over the whole sample period to study changes in stock market fluctuation and financial risk. The empirical results verify the validity of R.
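
The preprocessing step shared by these analyses is the construction of log-change series and a cross-correlation between them. A minimal sketch (the full MF-DFA/MF-DCCA machinery is much more involved; this only shows the input series and a lag-0 correlation):

```python
import math

def log_changes(series):
    """Log changes, e.g. returns r_t = ln(P_{t+1}/P_t) or volume variations v_t."""
    return [math.log(series[t + 1] / series[t]) for t in range(len(series) - 1)]

def correlation(a, b):
    """Plain Pearson cross-correlation at lag 0."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)
```

The detrended variants replace this global correlation with correlations of detrended profiles over windows of varying scale, which is what exposes the long-range, multifractal structure.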

  2. A methodology for the data energy regional consumption consistency analysis

    International Nuclear Information System (INIS)

    Canavarros, Otacilio Borges; Silva, Ennio Peres da

    1999-01-01

    The article introduces a methodology for the consistency analysis of regional energy consumption data. The work is based on recent studies by several cited authors and addresses Brazilian energy matrices and regional energy balances. The results are compared and analyzed.

  3. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    International Nuclear Information System (INIS)

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-01-01

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the limitations imposed by the narrow analytical scope of existing methodologies. Prior to the development, several safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. As the next step, a draft methodology for pressurized water reactors has been developed and adapted to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report.

  4. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a level-I probabilistic safety assessment. The work covers: the correct association of distributions to parameters, the weighting and qualification of expert opinions, the generation of samples according to sample size, and the study of the relationships among system variables and the system response. A series of statistical-mathematical techniques is recommended for the development of the analysis methodology, as well as different graphical visualizations for monitoring the study. (author)
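
One common way to study the relationship between sampled parameters and the system response is a Monte Carlo run followed by rank correlations. The sketch below is an illustrative stand-in for that step (the model, parameter names and sample size are assumptions, not the paper's):

```python
import random

def ranks(values):
    """0-based ranks of a sequence (assumes no ties, as with continuous samples)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def spearman(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    m = (n - 1) / 2.0  # mean rank, identical for both sequences without ties
    cov = sum((x - m) * (y - m) for x, y in zip(ra, rb))
    var = sum((x - m) ** 2 for x in ra)
    return cov / var

def sensitivity(model, n_samples=2000, seed=1):
    """Sample two uncertain inputs, evaluate the response, and return the
    Spearman correlation of each input with the response."""
    rng = random.Random(seed)
    xs, ys, out = [], [], []
    for _ in range(n_samples):
        x, y = rng.uniform(0, 1), rng.uniform(0, 1)
        xs.append(x)
        ys.append(y)
        out.append(model(x, y))
    return spearman(xs, out), spearman(ys, out)
```

Inputs with the largest absolute rank correlation are the ones whose distribution assignments most deserve scrutiny, which is the point of the sensitivity step.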

  5. Two-dimensional transient thermal analysis of a fuel rod by finite volume method

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Rhayanne Yalle Negreiros; Silva, Mário Augusto Bezerra da; Lira, Carlos Alberto de Oliveira, E-mail: ryncosta@gmail.com, E-mail: mabs500@gmail.com, E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamento de Energia Nuclear

    2017-07-01

    One of the greatest concerns when studying a nuclear reactor is the guarantee of safe temperature limits throughout the system at all times. The preservation of the core structure, along with the confinement of radioactive material within a controlled system, is the main focus during the operation of a reactor. The purpose of this paper is to present the temperature distribution for a nominal channel of the AP1000 reactor, developed by Westinghouse Co., during steady-state and transient operations. In the analysis, the system was subjected to normal operating conditions and then to blockages of the coolant flow. The time necessary to achieve a new safe stationary state (when possible) is presented. The methodology applied in this analysis was based on a two-dimensional survey accomplished by the application of the Finite Volume Method (FVM). A steady solution is obtained and compared with an analytical solution that disregards axial heat transport, to determine its relevance. The results show the importance of considering axial heat transport in this type of study. A transient analysis shows the behavior of the system when submitted to coolant blockage at the channel's entrance. Three blockages were simulated (10%, 20% and 30%), and the results show that, for a nominal channel, the system can still be considered safe (there is no bubble formation up to that point). (author)
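
A rough one-dimensional analogue of the transient finite-volume calculation is explicit time stepping of the heat conduction equation with fixed-temperature boundaries. This is illustrative only (the paper solves a two-dimensional fuel-rod/coolant problem); geometry, properties and boundary values are assumptions.

```python
def step_heat_1d(temps, alpha, dx, dt, t_left, t_right):
    """One explicit time step of 1-D heat conduction on a uniform grid,
    with prescribed boundary temperatures applied through ghost values."""
    assert alpha * dt / dx ** 2 <= 0.5, "explicit scheme stability limit"
    padded = [t_left] + temps + [t_right]
    new = temps[:]
    for i in range(len(temps)):
        new[i] = padded[i + 1] + alpha * dt / dx ** 2 * (
            padded[i] - 2 * padded[i + 1] + padded[i + 2])
    return new
```

Marching until the profile stops changing recovers the steady distribution; the time needed to settle after a boundary change is the transient quantity of interest in such analyses.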

  6. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Many cost overruns in the world of construction are attributable either to unforeseen events or to foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement in project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for the appropriate allocation of contingency in project cost estimation. In the first step, project risks are identified. The influence diagramming technique is employed to identify and show how the risks affect the project cost elements, as well as the relationships among the risks themselves. The second step is to assess the project costs with regard to the risks under consideration. Using a linguistic approach, the degree of uncertainty of the identified project risks is assessed and quantified; the problem of dependency between risks is taken into consideration during this analysis. In the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingency, i.e. project contingency and management reserve, are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.
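
One widely used way to size contingency from a quantified risk model (shown here as a stand-in for the paper's linguistic/influence-diagram approach, not a reproduction of it) is Monte Carlo simulation of uncertain cost items, setting contingency at a chosen percentile of the total-cost distribution minus the base estimate. Items, percentile and distributions below are illustrative assumptions.

```python
import random

def simulate_total_cost(items, rng):
    """One sampled project total; each item is a (low, most_likely, high)
    triangular three-point estimate."""
    return sum(rng.triangular(lo, hi, mode) for lo, mode, hi in items)

def contingency(items, percentile=0.8, n=10000, seed=42):
    """Contingency = chosen percentile of simulated totals minus the base
    (most-likely) estimate."""
    rng = random.Random(seed)
    totals = sorted(simulate_total_cost(items, rng) for _ in range(n))
    base = sum(mode for _, mode, _ in items)
    return totals[int(percentile * n)] - base
```

With right-skewed estimates (more upside cost risk than downside), the simulated percentile sits above the base estimate, yielding a positive contingency.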

  7. Development of the fire PSA methodology and the fire analysis computer code system

    International Nuclear Information System (INIS)

    Katsunori, Ogura; Tomomichi, Ito; Tsuyoshi, Uchida; Yusuke, Kasagawa

    2009-01-01

    A fire PSA methodology has been developed and applied to NPPs in Japan for power operation and low-power/shutdown (LPSD) states. The CDFs of the preliminary fire PSA for power operation were higher than those for internal events. A fire propagation analysis code system (CFAST/FDS Network) is being developed and verified through the OECD PRISME Project. An extension of the scope to the LPSD state is planned in order to determine the risk level. To characterize the fire risk level precisely, several enhancements of the methodology are planned: verification and validation of the phenomenological fire propagation analysis code (CFAST/FDS Network) in the context of fire PSA, and enhancement of the methodology, such as the application of the 'Electric Circuit Analysis' in NUREG/CR-6850 and related tests, in order to quantify the hot-short effect precisely. The development of a seismic-induced fire PSA method, integrating existing seismic PSA and fire PSA methods, is ongoing. The fire PSA will be applied to review the validity of fire prevention and mitigation measures.

  8. Price-volume multifractal analysis of the Moroccan stock market

    Science.gov (United States)

    El Alaoui, Marwane

    2017-11-01

    In this paper, we analyzed the price-volume multifractal cross-correlations of the Moroccan Stock Exchange. We chose the period from January 1st, 2000 to January 20th, 2017 to investigate the multifractal behavior of the price change and volume change series. We used the multifractal detrended cross-correlation analysis method (MF-DCCA) and multifractal detrended fluctuation analysis (MF-DFA) to analyze the series. We computed the bivariate generalized Hurst exponent, the Rényi exponent and the singularity spectrum for each pair of indices to measure the cross-correlations quantitatively. Furthermore, we used the detrended cross-correlation coefficient (DCCA) and the cross-correlation test (Q(m)) to analyze the cross-correlations quantitatively and qualitatively. We found evidence of price-volume multifractal cross-correlations, and the spectrum width indicates a strong multifractal cross-correlation. We also noted that the volume change series is anti-persistent when the generalized Hurst exponent is analyzed for all moments q. The cross-correlation test showed the presence of a significant cross-correlation; however, the DCCA coefficient had a small positive value, which means that the level of correlation is not very significant. Finally, we analyzed the sources of multifractality and their degree of contribution to the series.

  9. Evaluation of operational safety at Babcock and Wilcox Plants: Volume 2, Thermal-hydraulic results

    International Nuclear Information System (INIS)

    Wheatley, P.D.; Davis, C.B.; Callow, R.A.; Fletcher, C.D.; Dobbe, C.A.; Beelman, R.J.

    1987-11-01

    The Nuclear Regulatory Commission has initiated a research program to develop a methodology to assess the operational performance of Babcock and Wilcox plants and to apply this methodology on a trial basis. The methodology developed for analyzing Babcock and Wilcox plants integrated methods used in both thermal-hydraulics and human factors and compared results with information used in the assessment of risk. The integrated methodology involved an evaluation of a selected plant for each pressurized water reactor vendor during a limited number of transients. A plant was selected to represent each vendor, and three transients were identified for analysis. The plants were Oconee Unit 1 for Babcock and Wilcox, H.B. Robinson Unit 2 for Westinghouse, and Calvert Cliffs Unit 1 for Combustion Engineering. The three transients were a complete loss of all feedwater, a small-break loss-of-coolant accident, and a steam-generator overfill with auxiliary feedwater. Included in the integrated methodology was an assessment of the thermal-hydraulic behavior, including event timing, of the plants during the three transients. Thermal-hydraulic results are presented in this volume (Volume 2) of the report. 26 refs., 30 figs., 7 tabs

  10. Export Potential of the Enterprise: Essence and Methodological Bases of the Analysis

    Directory of Open Access Journals (Sweden)

    Melnyk Olga G.

    2017-03-01

    The article considers theoretical and methodological aspects of the analysis of an enterprise's export potential and the methodological basis for its measurement. Analyzing and summarizing scientific works on the problem, the views of researchers on the definition of the concept of "export potential of the enterprise" are systematized. The article considers the economic content of the enterprise's export potential from the standpoint of the system-structural approach, defining it as a complex systemic formation of interrelated and interacting elements of economic and non-economic origin, of internal and external action. It is found that in the international economic space the export potential of the enterprise acquires new qualitative features, reflecting not just the resource potential of the national economic entity but also the needs and interests of foreign countries and their economic agents. The functional role of the export potential is to implement the targets of the foreign economic activity of the enterprise. The nature of these targets can differ, and they are formed on the principle of meeting the needs of external markets. The level of satisfaction of these needs by an individual enterprise can be evaluated through indicators such as the volume of exports, the quality of exported products and the level of export diversification, which determine the result of the export activity and, in relation to its purpose, serve as a criterion of the efficiency of the enterprise's export potential. As a result of the study, the components of the export potential of the enterprise are singled out, and a model of their interrelationships is presented. The prospects of the research relate to branch-specific aspects of the formation of the enterprise's export potential, allowing its structural elements and directions of development to be highlighted.

  11. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

The Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree”, which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from the cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and the likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and it can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses the “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to their influencing factors.

  12. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and with gamma attenuation factors calculated using MCNP-5. Both the relative and the absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is also analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.
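The normalization described above is simple arithmetic once the flux and attenuation factors are known. The sketch below is a minimal illustration; the function names and the example values in the comments are assumptions for demonstration, not figures from the paper:

```python
def normalized_count_rate(gross_rate, background_rate, thermal_flux, attenuation):
    """Normalize a net prompt-gamma count rate (counts/s) by the measured
    thermal-neutron flux and a calculated gamma attenuation factor."""
    net = gross_rate - background_rate
    return net / (thermal_flux * attenuation)

def concentration_relative(rate_sample, rate_standard, conc_standard):
    """Relative method: scale a standard of known concentration by the
    ratio of normalized count rates."""
    return conc_standard * rate_sample / rate_standard

# Hypothetical numbers: 120 counts/s gross, 20 counts/s background,
# 1e4 n/cm^2/s flux, attenuation factor 0.8.
r = normalized_count_rate(120.0, 20.0, 1e4, 0.8)
```

The relative method cancels detector efficiency and geometry factors, which is why only the ratio of normalized rates is needed.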

  13. Improved USGS methodology for assessing continuous petroleum resources

    Science.gov (United States)

    Charpentier, Ronald R.; Cook, Troy A.

    2010-01-01

    This report presents an improved methodology for estimating volumes of continuous (unconventional) oil and gas resources within the United States and around the world. The methodology is based on previously developed U.S. Geological Survey methodologies that rely on well-scale production data. Improvements were made primarily to how the uncertainty about estimated ultimate recoveries is incorporated in the estimates. This is particularly important when assessing areas with sparse or no production data, because the new methodology allows better use of analog data from areas with significant discovery histories.
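Aggregating per-well estimated ultimate recoveries (EURs) under uncertainty is naturally expressed as a Monte Carlo sum. The sketch below is a generic illustration of that idea only; the lognormal form, parameter names, and all values are assumptions, not the USGS calibration:

```python
import random

def simulate_total_recovery(n_wells, eur_mu, eur_sigma, n_trials=1000, seed=42):
    """Monte Carlo aggregation: draw a lognormal EUR for each untested
    well, sum them per trial, and return the mean total over all trials."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        totals.append(sum(rng.lognormvariate(eur_mu, eur_sigma)
                          for _ in range(n_wells)))
    return sum(totals) / n_trials

# Hypothetical assessment unit: 10 wells, log-mean 0, log-sd 0.5.
mean_total = simulate_total_recovery(10, 0.0, 0.5)
```

In sparse-data areas, the EUR distribution parameters would come from analog areas with significant discovery histories, which is precisely the improvement the report emphasizes.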

  14. A methodology for finding the optimal iteration number of the SIRT algorithm for quantitative Electron Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Okariz, Ana, E-mail: ana.okariz@ehu.es [eMERG, Fisika Aplikatua I Saila, Faculty of Engineering, University of the Basque Country, UPV/EHU, Rafael Moreno “Pitxitxi” Pasealekua 3, 48013 Bilbao (Spain); Guraya, Teresa [eMERG, Departamento de Ingeniería Minera y Metalúrgica y Ciencia de los Materiales, Faculty of Engineering, University of the Basque Country, UPV/EHU, Rafael Moreno “Pitxitxi” Pasealekua 3, 48013 Bilbao (Spain); Iturrondobeitia, Maider [eMERG, Departamento de Expresión Gráfica y Proyectos de Ingeniería, Faculty of Engineering, University of the Basque Country, UPV/EHU, Rafael Moreno “Pitxitxi” Pasealekua 3, 48013 Bilbao (Spain); Ibarretxe, Julen [eMERG, Fisika Aplikatua I Saila, Faculty of Engineering,University of the Basque Country, UPV/EHU, Rafael Moreno “Pitxitxi” Pasealekua 2, 48013 Bilbao (Spain)

    2017-02-15

The SIRT (Simultaneous Iterative Reconstruction Technique) algorithm is commonly used in Electron Tomography to calculate the original volume of the sample from noisy images, but the results provided by this iterative procedure are strongly dependent on the specific implementation of the algorithm, as well as on the number of iterations employed for the reconstruction. In this work, a methodology for selecting the iteration number of the SIRT reconstruction that provides the most accurate segmentation is proposed. The methodology is based on the statistical analysis of the intensity profiles at the edge of the objects in the reconstructed volume. A phantom which resembles a carbon black aggregate has been created to validate the methodology, and the SIRT implementations of two free software packages (TOMOJ and TOMO3D) have been used. - Highlights: • The non-uniformity of the resolution in electron tomography reconstructions has been demonstrated. • An overall resolution for the evaluation of the quality of electron tomography reconstructions has been defined. • Parameters for estimating an overall resolution across the reconstructed volume have been proposed. • The overall resolution of the reconstructions of a phantom has been estimated from the probability density functions. • It has been proven that the reconstructions with the best overall resolutions have provided the most accurate segmentations.
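A common textbook form of the SIRT update is x ← x + C Aᵀ R (b − A x), where R and C hold the inverse row and column sums of the projection matrix A. The sketch below implements that generic form; it is not the TOMOJ or TOMO3D implementation, whose details (and hence results) differ, as the abstract notes:

```python
import numpy as np

def sirt(A, b, n_iter):
    """Minimal SIRT on a dense nonnegative system matrix A:
    x <- x + C A^T R (b - A x), with R = inverse row sums and
    C = inverse column sums (the usual SIRT weighting)."""
    R = 1.0 / np.maximum(A.sum(axis=1), 1e-12)   # per-ray weights
    C = 1.0 / np.maximum(A.sum(axis=0), 1e-12)   # per-voxel weights
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x + C * (A.T @ (R * (b - A @ x)))
    return x
```

On noisy data the reconstruction error typically decreases and then rises again with iteration count (semi-convergence), which is exactly why a principled stopping criterion such as the one proposed in this record matters.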

  15. A methodology for finding the optimal iteration number of the SIRT algorithm for quantitative Electron Tomography

    International Nuclear Information System (INIS)

    Okariz, Ana; Guraya, Teresa; Iturrondobeitia, Maider; Ibarretxe, Julen

    2017-01-01

The SIRT (Simultaneous Iterative Reconstruction Technique) algorithm is commonly used in Electron Tomography to calculate the original volume of the sample from noisy images, but the results provided by this iterative procedure are strongly dependent on the specific implementation of the algorithm, as well as on the number of iterations employed for the reconstruction. In this work, a methodology for selecting the iteration number of the SIRT reconstruction that provides the most accurate segmentation is proposed. The methodology is based on the statistical analysis of the intensity profiles at the edge of the objects in the reconstructed volume. A phantom which resembles a carbon black aggregate has been created to validate the methodology, and the SIRT implementations of two free software packages (TOMOJ and TOMO3D) have been used. - Highlights: • The non-uniformity of the resolution in electron tomography reconstructions has been demonstrated. • An overall resolution for the evaluation of the quality of electron tomography reconstructions has been defined. • Parameters for estimating an overall resolution across the reconstructed volume have been proposed. • The overall resolution of the reconstructions of a phantom has been estimated from the probability density functions. • It has been proven that the reconstructions with the best overall resolutions have provided the most accurate segmentations.

  16. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

As an interdisciplinary domain requiring advanced and innovative methodologies, the computational forensics domain is characterized by data that are simultaneously large-scale and uncertain, multidimensional and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper, a methodology and an automatic procedure based on fuzzy set theory and designed to infer precis...

  17. Automated microscopic characterization of metallic ores with image analysis: a key to improve ore processing. I: test of the methodology

    International Nuclear Information System (INIS)

    Berrezueta, E.; Castroviejo, R.

    2007-01-01

Ore microscopy has traditionally been an important support for the control of ore processing, but the volume of present-day processes is beyond the reach of human operators. Automation is therefore compulsory, but its development through digital image analysis (DIA) is limited by various problems, such as the similarity in reflectance values of some important ores, their anisotropism, and the performance of instruments and methods. The results presented show that automated identification and quantification by DIA are possible through multiband (RGB) determinations with a research-grade 3CCD video camera on a reflected-light microscope. These results were obtained by systematic measurement of selected ores accounting for most industrial applications. Polarized light is avoided, so the effects of anisotropism can be neglected. Quality control at various stages and statistical analysis are important, as is the application of complementary (e.g. metallogenetic) criteria. The sequential methodology is described and illustrated through practical examples. (Author)

  18. Methodological Analysis of Gregarious Behaviour of Agents in the Financial Markets

    OpenAIRE

    Solodukhin Stanislav V.

    2013-01-01

The article considers methodological approaches to the analysis of gregarious (herd) behaviour of agents in the financial markets and also studies the foundations of agent-based modelling of decision-making processes that take the gregarious instinct into consideration.

  19. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    Hennig, D.; Nechvatal, L.

    1996-09-01

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operation points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs

  20. Plasma volume methodology: Evans blue, hemoglobin-hematocrit, and mass density transformations

    Science.gov (United States)

    Greenleaf, J. E.; Hinghofer-Szalkay, H.

    1985-01-01

Methods for measuring absolute levels of and changes in plasma volume are presented, along with derivations of the pertinent equations. The reduction in variability of the Evans blue dye dilution technique achieved using chromatographic column purification suggests that the day-to-day variability of plasma volume in humans is less than ± 20 ml. Mass density determination using the mechanical-oscillator technique provides a method for measuring vascular fluid shifts continuously, for assessing the density of the filtrate, and for quantifying movements of protein across microvascular walls. Equations for calculating the volume and density of shifted fluid are presented.
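The core of the Evans blue technique is the dilution principle: plasma volume equals the injected dye mass divided by its equilibrium plasma concentration. The sketch below illustrates this, plus the standard hematocrit conversion to whole-blood volume; the example numbers are hypothetical, not data from the paper:

```python
def plasma_volume_ml(dye_injected_mg, plasma_conc_mg_per_ml):
    """Dilution principle: plasma volume = injected dye mass divided by
    its plasma concentration after complete mixing."""
    return dye_injected_mg / plasma_conc_mg_per_ml

def blood_volume_ml(plasma_volume_ml, hematocrit):
    """Approximate whole-blood volume from plasma volume and the
    hematocrit (fraction of blood volume occupied by cells)."""
    return plasma_volume_ml / (1.0 - hematocrit)

# Hypothetical measurement: 22.6 mg dye injected, 0.0075 mg/ml measured.
pv = plasma_volume_ml(22.6, 0.0075)   # about 3013 ml
bv = blood_volume_ml(pv, 0.45)
```

In practice the measured concentration would be corrected for dye loss from the vascular space, which is one source of the variability the purification step reduces.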

  1. Southern forest inventory and analysis volume equation user’s guide

    Science.gov (United States)

    Christopher M. Oswalt; Roger C. Conner

    2011-01-01

    Reliable volume estimation procedures are fundamental to the mission of the Forest Inventory and Analysis (FIA) program. Moreover, public access to FIA program procedures is imperative. Here we present the volume estimation procedures used by the southern FIA program of the U.S. Department of Agriculture Forest Service Southern Research Station. The guide presented...

  2. Analysis Planning Methodology: For Thesis, Joint Applied Project, & MBA Research Reports

    OpenAIRE

    Naegle, Brad R.

    2010-01-01

    Acquisition Research Handbook Series Purpose: This guide provides the graduate student researcher—you—with techniques and advice on creating an effective analysis plan, and it provides methods for focusing the data-collection effort based on that analysis plan. As a side benefit, this analysis planning methodology will help you to properly scope the research effort and will provide you with insight for changes in that effort. The information presented herein was supported b...

  3. Recycle operations as a methodology for radioactive waste volume reduction

    International Nuclear Information System (INIS)

    Rasmussen, G.A.

    1985-01-01

The costs of packaging, transportation and burial of low-level radioactive metallic waste have become so high that an alternative method, decontamination for volume reduction prior to disposal, can now be justified. The operation of a large-scale centralized recycle center for the decontamination of selected low-level radioactive waste has been proven to be an effective method for waste volume reduction and for retrieving valuable materials for unlimited use. The centralized recycle center concept allows the application of state-of-the-art decontamination technology, resulting in a reduction in utility disposal costs and in the overall net amount of material being buried. Examples of specific decontamination process activities at the centralized facility are reviewed, along with a discussion of the economic impact of decontamination for recycling and volume reduction. Based on almost two years of operation of a centralized decontamination facility, a demonstrated capability exists. The concept has been cost effective and proves that valuable resources can be recycled

  4. Analysis of market competitive structure: A new methodological approach based on usage situation

    International Nuclear Information System (INIS)

    Romero de la Fuente, J.; Yague Guillen, M. J.

    2007-01-01

This paper proposes a new methodological approach to identifying market competitive structure, applying the usage-situation concept in positioning analysis. The dimensions used by consumers to classify products are identified using Correspondence Analysis, and competitive groups are formed. Results are validated with Discriminant Analysis. (Author) 23 refs

  5. Comparative analysis as a basic research orientation: Key methodological problems

    Directory of Open Access Journals (Sweden)

    N P Narbut

    2015-12-01

Full Text Available To date, the Sociological Laboratory of the Peoples’ Friendship University of Russia has accumulated a vast experience in the field of cross-cultural studies, reflected in the publications based on the results of mass surveys conducted in Moscow, Maikop, Beijing, Guangzhou, Prague, Belgrade, and Pristina. However, these publications mainly focus on comparisons of the empirical data rather than on methodological and technical issues; that is why the aim of this article is to identify key problems of comparative analysis in cross-cultural studies that become evident only if you conduct an empirical research yourself - from the first step of setting the problem and approving it by all the sides (countries) involved to the last step of interpreting and comparing the data obtained. The authors are sure that no sociologist would ever doubt the necessity and importance of comparative analysis in the broadest sense of the word, but at the same time very few are ready to discuss its key methodological challenges and prefer to ignore them completely. We summarize the problems of comparative analysis in sociology as follows: (1) applying research techniques to the sample in another country - both translating and adapting them to different social realities and worldviews (in particular, the problematic status of standardization and the qualitative approach); (2) choosing the “right” respondents to question and relevant cases (cultures) to study; (3) designing the research scheme, i.e. justifying the sequence of steps (what should go first - methodology or techniques); (4) accepting the procedures that are correct within one country for cross-cultural work (whether or not that is an appropriate choice).

  6. PROBLEMS AND METHODOLOGY OF THE PETROLOGIC ANALYSIS OF COAL FACIES.

    Science.gov (United States)

    Chao, Edward C.T.

    1983-01-01

    This condensed synthesis gives a broad outline of the methodology of coal facies analysis, procedures for constructing sedimentation and geochemical formation curves, and micro- and macrostratigraphic analysis. The hypothetical coal bed profile has a 3-fold cycle of material characteristics. Based on studies of other similar profiles of the same coal bed, and on field studies of the sedimentary rock types and their facies interpretation, one can assume that the 3-fold subdivision is of regional significance.

  7. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    Science.gov (United States)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

The development of a comprehensive and eclectic methodology for the conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and the associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements which guide and govern applications are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  8. Extracting Metrics for Three-dimensional Root Systems: Volume and Surface Analysis from In-soil X-ray Computed Tomography Data.

    Science.gov (United States)

    Suresh, Niraj; Stephens, Sean A; Adams, Lexor; Beck, Anthon N; McKinney, Adriana L; Varga, Tamas

    2016-04-26

Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications for climate change and crop management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving plants. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. We aimed to develop a cost-free and efficient tool that approximates the surface and volume of the root, regardless of its shape, from three-dimensional (3D) tomography data. The root structure of a Prairie dropseed (Sporobolus heterolepis) specimen was imaged using XCT. The root was reconstructed, and the primary root structure was extracted from the data using a combination of licensed and open-source software. An isosurface polygonal mesh was then created for ease of analysis. We have developed the standalone application imeshJ, written in MATLAB, to calculate root volume and surface area from the mesh. The outputs of imeshJ are the surface area (in mm²) and the volume (in mm³). The process, utilizing a unique combination of tools from imaging to quantitative root analysis, is described. The combination of XCT and open-source software proved to be a powerful way to noninvasively image plant root samples, segment root data, and extract quantitative information from the 3D data. This methodology of processing 3D data should be applicable to other material/sample systems where there is connectivity between components of similar X-ray attenuation and difficulties arise with segmentation.
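For a closed triangle mesh, volume and surface area follow from standard formulas: the signed-tetrahedron (divergence theorem) sum for volume and summed triangle areas for surface. imeshJ itself is a MATLAB application whose internals are not given here, so the Python sketch below only illustrates the standard mesh formulas such a tool presumably relies on (an assumption):

```python
import numpy as np

def mesh_volume_and_area(vertices, faces):
    """Closed triangle mesh metrics: volume via the signed-tetrahedron
    sum (divergence theorem) and surface via summed triangle areas."""
    V = np.asarray(vertices, dtype=float)
    vol = 0.0
    area = 0.0
    for i, j, k in faces:
        v0, v1, v2 = V[i], V[j], V[k]
        vol += np.dot(v0, np.cross(v1, v2)) / 6.0   # signed tet volume
        area += 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0))
    return abs(vol), area

# Unit right tetrahedron as a sanity check: volume 1/6.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
tris = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
vol, area = mesh_volume_and_area(verts, tris)
```

The signed sum is orientation-dependent, so consistently wound faces (all outward or all inward) are required; taking the absolute value of the total makes the result independent of which of the two conventions is used.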

  9. A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children

    Science.gov (United States)

    Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

    2012-01-01

    Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

  10. 3-D rod ejection analysis using a conservative methodology

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Ho; Park, Jin Woo; Park, Guen Tae; Um, Kil Sup; Ryu, Seok Hee; Lee, Jae Il; Choi, Tong Soo [KEPCO, Daejeon (Korea, Republic of)

    2016-05-15

The point kinetics model, which simplifies the core phenomena and physical specifications, is used for conventional rod ejection accident analysis. The point kinetics model makes it convenient to assume conservative core parameters, but this simplification sacrifices a large amount of safety margin. The CHASER system couples the three-dimensional core neutron kinetics code ASTRA, the sub-channel analysis code THALES and the fuel performance analysis code FROST. A validation study for the CHASER system is addressed using the NEACRP three-dimensional PWR core transient benchmark problem. A series of conservative rod ejection analyses for the APR1400 type plant is performed for both hot full power (HFP) and hot zero power (HZP) conditions to determine the most limiting cases. The conservative rod ejection analysis methodology is designed to properly consider the important phenomena and physical parameters.
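The point kinetics model that this work contrasts with the coupled 3-D approach can be sketched with a single delayed-neutron group; the explicit Euler integration and every parameter value below are illustrative assumptions, not inputs from the CHASER analysis:

```python
def point_kinetics(rho, beta, Lambda, lam, t_end, dt=1e-4):
    """One-delayed-group point kinetics, explicit Euler:
        dn/dt = ((rho - beta) / Lambda) * n + lam * c
        dc/dt = (beta / Lambda) * n - lam * c
    Starts from equilibrium (n = 1) and returns relative power n(t_end)."""
    n = 1.0
    c = beta / (Lambda * lam)  # equilibrium precursor concentration
    t = 0.0
    while t < t_end:
        dn = ((rho - beta) / Lambda) * n + lam * c
        dc = (beta / Lambda) * n - lam * c
        n += dt * dn
        c += dt * dc
        t += dt
    return n

# Illustrative step insertion of 100 pcm (beta = 650 pcm, Lambda = 1e-4 s).
power = point_kinetics(0.001, 0.0065, 1e-4, 0.08, 0.05)
```

The model carries no spatial information at all, which is why conservative bounding parameters must be layered on top of it, at the cost of the margin the 3-D methodology recovers.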

  11. Methodological challenges in qualitative content analysis: A discussion paper.

    Science.gov (United States)

    Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit

    2017-09-01

This discussion paper aims to map content analysis in the qualitative paradigm and explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic in how categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. A Local Approach Methodology for the Analysis of Ultimate Strength ...

    African Journals Online (AJOL)

The local approach methodology, in contrast to classical fracture mechanics, can be used to predict the onset of tearing fracture and the effects of geometry in tubular joints. Finite element analysis of T-joint plate geometries and tubular joints has been performed. The parameters of constraint, equivalent stress, plastic strain and ...

  13. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    Science.gov (United States)

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  14. Does Flywheel Paradigm Training Improve Muscle Volume and Force? A Meta-Analysis.

    Science.gov (United States)

    Nuñez Sanchez, Francisco J; Sáez de Villarreal, Eduardo

    2017-11-01

Núñez Sanchez, FJ and Sáez de Villarreal, E. Does flywheel paradigm training improve muscle volume and force? A meta-analysis. J Strength Cond Res 31(11): 3177-3186, 2017. Several studies have confirmed the efficacy of flywheel paradigm training for improving muscle volume and force. A meta-analysis of 13 studies with a total of 18 effect sizes was performed to analyse the role of various factors in the effectiveness of flywheel paradigm training. The following inclusion criteria were employed for the analysis: (a) randomized studies; (b) high-validity and high-reliability instruments; (c) published in a high-quality peer-reviewed journal; (d) healthy participants; (e) studies where the eccentric programme was described; and (f) studies where increases in muscle volume and force were measured before and after training. Increases in muscle volume and force were noted with the use of flywheel systems over short periods of training. The increase in muscle mass appears not to have been influenced by the existence of eccentric overload during the exercise. The increase in force was significantly higher with the existence of eccentric overload during the exercise. The responses identified in this analysis are essential and should be considered by strength and conditioning professionals regarding the most appropriate dose-response trends for flywheel paradigm systems to optimize the increase in muscle volume and force.
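Pooling effect sizes across studies is commonly done by inverse-variance weighting. The paper's exact weighting scheme is not reproduced here, so the sketch below shows the generic fixed-effect model only, with hypothetical numbers:

```python
def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance meta-analysis: weight each study's
    effect size by 1/variance and return the weighted mean."""
    weights = [1.0 / v for v in variances]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Two hypothetical studies: equal variances give the plain average.
pooled = pooled_effect([0.5, 1.0], [0.1, 0.1])
```

A random-effects model would add a between-study variance component to each weight, which matters when, as here, training protocols differ across the included studies.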

  15. Methodological Choices in Muscle Synergy Analysis Impact Differentiation of Physiological Characteristics Following Stroke

    Directory of Open Access Journals (Sweden)

    Caitlin L. Banks

    2017-08-01

Full Text Available Muscle synergy analysis (MSA) is a mathematical technique that reduces the dimensionality of electromyographic (EMG) data. Used increasingly in biomechanics research, MSA requires methodological choices at each stage of the analysis. Differences in methodological steps affect the overall outcome, making it difficult to compare results across studies. We applied MSA to EMG data collected from individuals post-stroke identified as either responders (RES) or non-responders (nRES) on the basis of a critical post-treatment increase in walking speed. Importantly, no clinical or functional indicators identified differences between the cohorts of RES and nRES at baseline. For this exploratory study, we selected the five highest RES and five lowest nRES available from a larger sample. Our goal was to assess how the methodological choices made before, during, and after MSA affect the ability to differentiate two groups with intrinsic physiologic differences based on MSA results. We investigated 30 variations in MSA methodology to determine which choices allowed differentiation of RES from nRES at baseline. Trial-to-trial variability in time-independent synergy vectors (SVs) and time-varying neural commands (NCs) was measured as a function of: (1) the number of synergies computed; (2) the EMG normalization method before MSA; (3) whether SVs were held constant across trials or allowed to vary during MSA; and (4) the synergy analysis output normalization method after MSA. MSA methodology had a strong effect on our ability to differentiate RES from nRES at baseline. Across all 10 individuals and MSA variations, two synergies were needed to reach an average of 90% variance accounted for (VAF). Based on effect sizes, differences in SV and NC variability between groups were greatest using two synergies with SVs that varied from trial to trial. Differences in SV variability were clearest using unit magnitude per trial EMG normalization, while NC variability was less sensitive to EMG
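Muscle synergies are usually extracted by non-negative matrix factorization (NMF) of the muscles-by-time EMG matrix, with the variance accounted for (VAF) criterion used to choose the number of synergies. The sketch below is a generic multiplicative-update NMF, not the specific algorithm or settings used in this study:

```python
import numpy as np

def nmf(E, n_syn, n_iter=500, seed=0):
    """Multiplicative-update NMF: factor a nonnegative EMG matrix E
    (muscles x time) into synergy vectors W and neural commands H."""
    rng = np.random.default_rng(seed)
    m, t = E.shape
    W = rng.random((m, n_syn)) + 1e-6
    H = rng.random((n_syn, t)) + 1e-6
    for _ in range(n_iter):
        H *= (W.T @ E) / (W.T @ W @ H + 1e-12)
        W *= (E @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

def vaf(E, W, H):
    """Variance accounted for by the reconstruction W @ H."""
    resid = E - W @ H
    return 1.0 - (resid ** 2).sum() / (E ** 2).sum()
```

Increasing `n_syn` until VAF crosses a threshold (90% in this record) is one of the methodological choices the study shows can change group-level conclusions.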

  16. Analysis of gaming community using Soft System Methodology

    OpenAIRE

    Hurych, Jan

    2015-01-01

This diploma thesis aims to analyse a virtual gaming community and its problems, in the case of the community belonging to the EU server of the game World of Tanks. To solve these problems, the Soft System Methodology (SSM) of P. Checkland is used. The thesis includes an analysis of the significance of gaming communities for the gaming industry as a whole. The gaming community is then defined as a soft system. Three problems are analysed in the practical part of the thesis using a newer version of SSM. One iteration of...

  17. Proposal of methodology of tsunami accident sequence analysis induced by earthquake using DQFM methodology

    International Nuclear Information System (INIS)

    Muta, Hitoshi; Muramatsu, Ken

    2017-01-01

Since the Fukushima-Daiichi nuclear power station accident, the Japanese regulatory body has improved and upgraded the regulation of nuclear power plants, and continuous effort is required to enhance risk management in the mid- to long term. Earthquakes and tsunamis are considered the most important risks, and the establishment of probabilistic risk assessment (PRA) methodologies for these events is a major issue of current PRA. The Nuclear Regulation Authority (NRA) addressed the PRA methodology for tsunamis induced by earthquakes, one of the methodologies that should be enhanced step by step as PRA techniques improve and mature. The AESJ standard for the procedure of seismic PRA for nuclear power plants (2015) provides the basic concept of the methodology; however, the details of its application to an actual plant PRA model have not been sufficiently provided. This study proposes a detailed PRA methodology for tsunamis induced by earthquakes using the DQFM methodology, which contributes to improving the safety of nuclear power plants. Furthermore, this study also identifies the issues that need more research. (author)

  18. Applications of a methodology for the analysis of learning trends in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, Hang Youn; Choi, Sung Nam; Yun, Won Yong

    1995-01-01

A methodology is applied to identify learning trends related to the safety and availability of U.S. commercial nuclear power plants. The application is intended to aid in reducing the likelihood of human errors. To ensure that the methodology can be easily adapted to various types of classification schemes for operation data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme was selected for the methodology. Significance criteria for human-initiated events affecting the systems and for events caused by human deficiencies were used. Clustering analysis was used to identify the learning trend in multi-dimensional histograms. A computer code based on the K-Means algorithm was developed and applied to find the learning period, in which error rates decrease monotonically with plant age
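The clustering step can be illustrated with a plain one-dimensional K-Means over per-age-bin event rates; this is a generic sketch, not the code described in the record, and the example rates are invented:

```python
def kmeans_1d(values, k, n_iter=50):
    """Plain 1-D K-Means, used here to group plant-age bins by their
    event rate so that a low-rate (post-learning) cluster stands out."""
    step = max(1, len(values) // k)
    centers = sorted(values)[::step][:k]   # spread initial centers
    for _ in range(n_iter):
        groups = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            groups[nearest].append(v)
        new_centers = []
        for i, g in enumerate(groups):
            new_centers.append(sum(g) / len(g) if g else centers[i])
        centers = new_centers
    return sorted(centers)

# Invented error rates per plant-age bin: early years high, later years low.
centers = kmeans_1d([0.9, 1.0, 1.1, 0.1, 0.2, 0.15], 2)
```

In the multi-dimensional case described in the record, each histogram bin would be a vector and the absolute difference would become a Euclidean distance, but the assignment/update loop is the same.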

  19. Thermodynamic analysis of a Stirling engine including regenerator dead volume

    Energy Technology Data Exchange (ETDEWEB)

    Puech, Pascal; Tishkova, Victoria [Universite de Toulouse, UPS, CNRS, CEMES, 29 rue Jeanne Marvig, F-31055 Toulouse (France)

    2011-02-15

    This paper provides a theoretical investigation of the thermodynamic analysis of a Stirling engine with linear and sinusoidal variations of the volume. The regenerator in a Stirling engine is an internal heat exchanger that allows the engine to reach high efficiency. We used an isothermal model to analyse the net work and the heat stored in the regenerator during a complete cycle. We show that the engine efficiency with perfect regeneration does not depend on the regenerator dead volume, but that this dead volume strongly amplifies the effect of imperfect regeneration. An analytical expression estimating the improvement due to the regenerator is proposed, including the combined effects of dead volume and imperfect regeneration. This could be used at a very preliminary stage of the engine design process. (author)
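The qualitative conclusion above (perfect regeneration hides the dead volume; imperfect regeneration is amplified by it) can be illustrated with a standard isothermal Stirling-cycle estimate. This is a textbook-style sketch, not the authors' analytical expression; the regenerator effectiveness `eps` and effective volume ratio `r` are illustrative parameters.

```python
import math

def stirling_efficiency(t_hot, t_cold, r, eps, cv_over_r=1.5):
    """Isothermal-model Stirling cycle efficiency (per-mole basis).

    t_hot, t_cold : source/sink temperatures (K)
    r             : effective volume (compression) ratio; dead volume lowers r
    eps           : regenerator effectiveness, 1.0 = perfect regeneration
    cv_over_r     : dimensionless heat capacity c_v/R (1.5 for a monatomic ideal gas)
    """
    work = (t_hot - t_cold) * math.log(r)  # net cycle work divided by nR
    q_in = t_hot * math.log(r) + (1 - eps) * cv_over_r * (t_hot - t_cold)
    return work / q_in

# Perfect regeneration recovers the Carnot limit regardless of r ...
print(stirling_efficiency(900, 300, 2.0, eps=1.0))  # ~0.667 = 1 - Tc/Th
# ... while with imperfect regeneration, a smaller effective r (more dead
# volume) enlarges the relative weight of the regenerator-loss term:
print(stirling_efficiency(900, 300, 2.0, eps=0.8))
print(stirling_efficiency(900, 300, 1.5, eps=0.8))
```

Lowering `r` at fixed `eps < 1` reduces the efficiency, consistent with the paper's conclusion that dead volume amplifies the imperfect-regeneration penalty.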

  20. Methodology for the analysis of self-tensioned wooden structural floors

    Directory of Open Access Journals (Sweden)

    F. Suárez-Riestra

    2017-09-01

    Full Text Available A self-tensioning system is described, constituted by a force-multiplying device which, attached to the supports at the ends of the structural element, converts the vertical resultant of the gravitational actions into an effective tensioning action through the movement induced by a set of rods. The self-tensioning system offers high performance thanks to the beneficial effect of the opposite deflection generated by the tensioning, in proportion to the increase of the gravitational action. This allows long-span timber ribbed floors to be designed with reduced depths. The complexity of calculation due to the non-linearity of the system can be obviated with the methodology of analysis developed in the article. To illustrate the advantages of the self-tensioning system and the methodology of analysis, six cases of ribbed floors have been analysed, with spans of 9, 12 and 15 m and variable imposed loads of 3.00 kN/m2 and 5.00 kN/m2.

  1. A framework for characterizing usability requirements elicitation and analysis methodologies (UREAM)

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Mannaert, H.

    2012-01-01

    Dedicated methodologies for the elicitation and analysis of usability requirements have been proposed in literature, usually developed by usability experts. The usability of these approaches by non-expert software engineers is not obvious. In this paper, the objective is to support developers and

  2. Doppler sonography of diabetic feet: Quantitative analysis of blood flow volume

    International Nuclear Information System (INIS)

    Seo, Young Lan; Kim, Ho Chul; Choi, Chul Soon; Yoon, Dae Young; Han, Dae Hee; Moon, Jeung Hee; Bae, Sang Hoon

    2002-01-01

    To analyze Doppler sonographic findings of diabetic feet by estimating quantitative blood flow volume and by analyzing Doppler waveforms. Doppler sonography was performed in thirty-four patients (10 diabetic patients with foot ulceration, 14 diabetic patients without ulceration, and 10 normal subjects as the control group) to measure the flow volume of the arteries of the lower extremities (posterior and anterior tibial arteries, and distal femoral artery). Doppler waveforms were also analyzed to evaluate the nature of the changed blood flow volume in diabetic patients; the waveforms were classified into triphasic, biphasic-1, biphasic-2 and monophasic patterns. The flow volume of arteries in diabetic patients with foot ulceration was increased with statistical significance compared with that of diabetic patients without foot ulceration or that of the normal control group (p<0.05). Analysis of Doppler waveforms revealed that the frequency of the biphasic-2 pattern was significantly higher in diabetic patients than in the normal control group (p<0.05). Doppler sonography in diabetic feet showed increased flow volume and biphasic Doppler waveforms; these findings suggest neuropathy rather than ischemic change in diabetic feet.

  3. Doppler sonography of diabetic feet: Quantitative analysis of blood flow volume

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Young Lan; Kim, Ho Chul; Choi, Chul Soon; Yoon, Dae Young; Han, Dae Hee; Moon, Jeung Hee; Bae, Sang Hoon [Hallym University College of Medicine, Seoul (Korea, Republic of)

    2002-09-15

    To analyze Doppler sonographic findings of diabetic feet by estimating quantitative blood flow volume and by analyzing Doppler waveforms. Doppler sonography was performed in thirty-four patients (10 diabetic patients with foot ulceration, 14 diabetic patients without ulceration, and 10 normal subjects as the control group) to measure the flow volume of the arteries of the lower extremities (posterior and anterior tibial arteries, and distal femoral artery). Doppler waveforms were also analyzed to evaluate the nature of the changed blood flow volume in diabetic patients; the waveforms were classified into triphasic, biphasic-1, biphasic-2 and monophasic patterns. The flow volume of arteries in diabetic patients with foot ulceration was increased with statistical significance compared with that of diabetic patients without foot ulceration or that of the normal control group (p<0.05). Analysis of Doppler waveforms revealed that the frequency of the biphasic-2 pattern was significantly higher in diabetic patients than in the normal control group (p<0.05). Doppler sonography in diabetic feet showed increased flow volume and biphasic Doppler waveforms; these findings suggest neuropathy rather than ischemic change in diabetic feet.

  4. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S., E-mail: dayane.silva@usp.br, E-mail: gdjian@ipen.br, E-mail: aclima@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The aim of this work is to present a methodology for analyzing the Angra 2 reactor containment during loss-of-coolant accidents (LOCA). This study helps ensure the safety of the surrounding population in the event of such accidents. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system of a light-water reactor during an accident. The containment of a PWR plant is a concrete building internally lined with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  5. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    International Nuclear Information System (INIS)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S.

    2015-01-01

    The aim of this work is to present a methodology for analyzing the Angra 2 reactor containment during loss-of-coolant accidents (LOCA). This study helps ensure the safety of the surrounding population in the event of such accidents. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system of a light-water reactor during an accident. The containment of a PWR plant is a concrete building internally lined with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  6. Effects of methodology and analysis strategy on robustness of pestivirus phylogeny.

    Science.gov (United States)

    Liu, Lihong; Xia, Hongyan; Baule, Claudia; Belák, Sándor; Wahlberg, Niklas

    2010-01-01

    Phylogenetic analysis of pestiviruses is a useful tool for classifying novel pestiviruses and for revealing their phylogenetic relationships. In this study, the robustness of pestivirus phylogenies has been compared by analysing the 5'UTR and the complete N(pro) and E2 gene regions, separately and combined, with four methods: neighbour-joining (NJ), maximum parsimony (MP), maximum likelihood (ML), and Bayesian inference (BI). The strategy of analysing the combined sequence dataset by the BI, ML, and MP methods resulted in a single, well-supported tree topology, indicating a reliable and robust pestivirus phylogeny. By contrast, the single-gene analysis strategy resulted in 12 trees of different topologies, revealing different relationships among pestiviruses. These results indicate that the analysis strategy and the methodology are two vital aspects affecting the robustness of the pestivirus phylogeny. The strategy and methodologies outlined in this paper may have broader application in inferring the phylogeny of other RNA viruses.

  7. Development of a heat exchanger root-cause analysis methodology

    International Nuclear Information System (INIS)

    Jarrel, D.B.

    1989-01-01

    The objective of this work is to determine a generic methodology for accurately identifying the root cause of component failure. Root-cause determinations are an everyday challenge for plant personnel, but they are handled with widely differing degrees of success owing to differences in approach, level of diagnostic expertise, and documentation. The criterion for success is simple: if the root cause of the failure has truly been determined and corrected, the same causal failure relationship will not appear again in the future. The approach to defining the elements of root-cause analysis (RCA) was first to select and constrain a functionally significant component (in this case, a component-cooling-water to service-water heat exchanger) that has demonstrated prevalent failures. A root-cause-of-failure analysis was then performed by a systems engineer on a large number of actual failure scenarios. The analytical process used by the engineer was documented and evaluated to abstract the logic model used to arrive at the root cause. For the case of the heat exchanger, the actual root-cause diagnostic approach is described. A generic methodology for determining the root cause of component failure is demonstrated for this general heat exchanger example.

  8. A non-perturbative analysis in finite volume gauge theory

    International Nuclear Information System (INIS)

    Koller, J.; State Univ. of New York, Stony Brook; Van Baal, P.; State Univ. of New York, Stony Brook

    1988-01-01

    We discuss SU(2) gauge theory on a three-torus using a finite volume expansion. Our discovery of natural coordinates allows us to obtain continuum results in a region where Monte Carlo data are also available. The obtained results agree well with the perturbative and semiclassical analysis for small volumes, and there is fair agreement with the Monte Carlo results in intermediate volumes. The simple picture which emerges for the approximate low-energy dynamics is that of three interacting particles enclosed in a sphere, with zero total 'angular momentum'. The validity of an adiabatic approximation is investigated. The fundamentally new understanding gained is that non-perturbative dynamics can be incorporated by imposing boundary conditions which arise through the nontrivial topology of configuration space. (orig.)

  9. Methodology of the Integrated Analysis of Company's Financial Status and Its Performance Results

    OpenAIRE

    Mackevičius, Jonas; Valkauskas, Romualdas

    2010-01-01

    Information about a company's financial status and its performance results is very important for the objective evaluation of the company's position in the market and its competitive possibilities in the future. Such information is provided in the financial statements, and it is important to apply and investigate it properly. A methodology for the integrated analysis of a company's financial status and performance results is recommended in this article. This methodology consists of these three elements...

  10. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    Full Text Available The aim of this article is to present a methodological model for analyzing students' group interaction to improve their essays in online learning environments based on asynchronous, written communication. In these environments, teacher and student scaffolds for discussion are essential to promote interaction, and one such scaffold is feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills that gap, contributing to the analysis of how feedback processes are implemented while students discuss collaboratively in a specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship between written, asynchronous group interaction and students' activity and the changes incorporated into the final text. The model proposed includes the following dimensions: (1) student participation, (2) nature of student learning, and (3) quality of student learning. The main contribution of this article is to present the methodological model and to demonstrate its operability regarding how students incorporate such feedback into their essays.

  11. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.

    1992-01-01

    The intent of this study was to rigorously examine all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2500) to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to the classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in the time and manpower required to complete the study were also realized.
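The probabilistic core of such a study, rolling expected values back through a decision tree, can be sketched in a few lines. The tree below is a toy two-branch example; the payoffs and probabilities are invented for illustration and are not Rangely data.

```python
def expected_value(node):
    """Roll back a decision tree.

    A node is either a float payoff (leaf),
    ('decision', [(label, child), ...])  -> pick the best branch, or
    ('chance',   [(prob, child), ...])   -> probability-weighted average.
    """
    if not isinstance(node, tuple):
        return node  # leaf payoff
    kind, branches = node
    if kind == 'decision':
        return max(expected_value(child) for _, child in branches)
    return sum(p * expected_value(child) for p, child in branches)

# Toy choice: expand the CO2 flood (uncertain NPV) or hold (NPV 0)
tree = ('decision', [
    ('expand', ('chance', [(0.6, 100.0), (0.4, -20.0)])),
    ('hold', 0.0),
])
print(expected_value(tree))  # 0.6*100 + 0.4*(-20) = 52.0
```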

  12. Discovering the Effects-Endstate Linkage: Using Soft Systems Methodology to Perform EBO Mission Analysis

    National Research Council Canada - National Science Library

    Young, Jr, William E

    2005-01-01

    .... EBO mission analysis is shown to be more problem structuring than problem solving. A new mission analysis process is proposed using a modified version of Soft Systems Methodology to meet these challenges...

  13. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES

    International Nuclear Information System (INIS)

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H. NRC.

    2005-01-01

    Several of the new-generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model the embedment effect when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) a description of a simplified approach and a detailed approach for the SSI analyses of a structure with various DOBs, (2) a comparison of the analysis results for the different DOBs between the two methods, and (3) a performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The assessment resulting from this study indicates that simplified methods may be capable of capturing the seismic response of much more deeply embedded structures than would normally be allowed by standard practice.

  14. Interpretive Phenomenological Analysis: An Appropriate Methodology for Educational Research?

    Directory of Open Access Journals (Sweden)

    Edward John Noon

    2018-04-01

    Full Text Available Interpretive phenomenological analysis (IPA) is a contemporary qualitative methodology, first developed by the psychologist Jonathan Smith (1996). Whilst its roots are in psychology, it is increasingly being drawn upon by scholars in the human, social and health sciences (Charlick, Pincombe, McKellar, & Fielder, 2016). Despite this, IPA has received limited attention in the educationalist literature. Drawing upon my experiences of using IPA to explore the barriers to the use of humour in the teaching of Childhood Studies (Noon, 2017), this paper will discuss its theoretical orientation, sampling, and methods of data collection and analysis, before examining the strengths and weaknesses of IPA's employment in educational research.

  15. Predicted costs of environmental controls for a commercial oil shale industry. Volume 1. An engineering analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nevens, T.D.; Culbertson, W.J. Jr.; Wallace, J.R.; Taylor, G.C.; Jovanovich, A.P.; Prien, C.H.; Hicks, R.E.; Probstein, R.F.; Domahidy, G.

    1979-07-01

    The pollution control costs for a commercial oil shale industry were determined in a joint effort by Denver Research Institute, Water Purification Associates of Cambridge, and Stone and Webster Engineering of Boston and Denver. Four commercial oil shale processes were considered. The results in terms of cost per barrel of syncrude oil are predicted to be as follows: Paraho Process, $0.67 to $1.01; TOSCO II Process, $1.43 to $1.91; MIS Process, $2.02 to $3.03; and MIS/Lurgi-Ruhrgas Process, $1.68 to $2.43. Alternative pollution control equipment and integrated pollution control strategies were considered and optimal systems selected for each full-scale plant. A detailed inventory of equipment (along with the rationale for selection), a detailed description of control strategies, itemized costs and predicted emission levels are presented for each process. Capital and operating cost data are converted to a cost per barrel basis using detailed economic evaluation procedures. Ranges of cost are determined using a subjective self-assessment of uncertainty approach. An accepted methodology for probability encoding was used, and cost ranges are presented as subjective probability distributions. Volume I presents the detailed engineering results. Volume II presents the detailed analysis of uncertainty in the predicted costs.

  16. Methodology for national risk analysis and prioritization of toxic industrial chemicals.

    Science.gov (United States)

    Taxell, Piia; Engström, Kerstin; Tuovila, Juha; Söderström, Martin; Kiljunen, Harri; Vanninen, Paula; Santonen, Tiina

    2013-01-01

    The identification of chemicals that pose the greatest threat to human health from incidental releases is a cornerstone in public health preparedness for chemical threats. The present study developed and applied a methodology for the risk analysis and prioritization of industrial chemicals to identify the most significant chemicals that pose a threat to public health in Finland. The prioritization criteria included acute and chronic health hazards, physicochemical and environmental hazards, national production and use quantities, the physicochemical properties of the substances, and the history of substance-related incidents. The presented methodology enabled a systematic review and prioritization of industrial chemicals for the purpose of national public health preparedness for chemical incidents.
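A prioritization of this kind typically reduces to a weighted multi-criteria score per chemical. The sketch below assumes criteria scored on a common ordinal scale; the criteria names, weights, and scores are hypothetical illustrations, not the values used in the Finnish study.

```python
def rank_chemicals(table, weights):
    """Rank chemicals by a weighted sum of their criterion scores (higher = higher priority)."""
    totals = {
        name: sum(weights[criterion] * score for criterion, score in crit.items())
        for name, crit in table.items()
    }
    return sorted(totals, key=totals.get, reverse=True), totals

# Hypothetical 1-5 scores against the kinds of criteria listed in the abstract
weights = {'acute_hazard': 0.4, 'chronic_hazard': 0.2, 'use_quantity': 0.3, 'incident_history': 0.1}
table = {
    'chlorine': {'acute_hazard': 5, 'chronic_hazard': 2, 'use_quantity': 4, 'incident_history': 3},
    'styrene':  {'acute_hazard': 2, 'chronic_hazard': 3, 'use_quantity': 5, 'incident_history': 1},
}
order, totals = rank_chemicals(table, weights)
print(order)  # the acute-hazard weighting puts chlorine first
```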

  17. Volume 2. Probabilistic analysis of HTGR application studies. Supporting data

    International Nuclear Information System (INIS)

    1980-09-01

    Volume II, Probabilistic Analysis of HTGR Application Studies - Supporting Data, gives the detailed data, both deterministic and probabilistic, employed in the calculations presented in Volume I. The HTGR plants and the fossil plants considered in the study are listed. GCRA provided the technical experts from whom the data were obtained by MAC personnel. The names of the technical experts (interviewees) and the analysts (interviewers) are given for the probabilistic data.

  18. Frequency Analysis of Gradient Estimators in Volume Rendering

    NARCIS (Netherlands)

    Bentum, Marinus Jan; Lichtenbelt, Barthold B.A.; Malzbender, Tom

    1996-01-01

    Gradient information is used in volume rendering to classify and color samples along a ray. In this paper, we present an analysis of the theoretically ideal gradient estimator and compare it to some commonly used gradient estimators. A new method is presented to calculate the gradient at arbitrary
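The baseline against which such estimators are usually compared is the central-difference gradient of the sampled scalar field. A minimal sketch on a nested-list volume follows; the specific estimators analyzed in the paper are not reproduced here.

```python
def central_gradient(vol, x, y, z, spacing=1.0):
    """Central-difference gradient of a 3-D scalar volume at an interior voxel."""
    gx = (vol[x + 1][y][z] - vol[x - 1][y][z]) / (2.0 * spacing)
    gy = (vol[x][y + 1][z] - vol[x][y - 1][z]) / (2.0 * spacing)
    gz = (vol[x][y][z + 1] - vol[x][y][z - 1]) / (2.0 * spacing)
    return (gx, gy, gz)

# Synthetic linear field f(x, y, z) = 2x + 3y + 5z, whose gradient is (2, 3, 5)
vol = [[[2 * x + 3 * y + 5 * z for z in range(3)] for y in range(3)] for x in range(3)]
print(central_gradient(vol, 1, 1, 1))  # (2.0, 3.0, 5.0)
```

In shading, the resulting vector is normalized and used as the surface normal at the sample; the estimator's frequency response determines how faithfully fine detail survives this step.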

  19. Differences in regional grey matter volumes in currently ill patients with anorexia nervosa.

    Science.gov (United States)

    Phillipou, Andrea; Rossell, Susan Lee; Gurvich, Caroline; Castle, David Jonathan; Abel, Larry Allen; Nibbs, Richard Grant; Hughes, Matthew Edward

    2018-01-01

    Neurobiological findings in anorexia nervosa (AN) are inconsistent, including differences in regional grey matter volumes. Methodological limitations often contribute to the inconsistencies reported. The aim of this study was to improve on these methodologies by utilising voxel-based morphometry (VBM) analysis with diffeomorphic anatomic registration through an exponentiated Lie algebra algorithm (DARTEL), in a relatively large group of individuals with AN. Twenty-six individuals with AN and 27 healthy controls underwent a T1-weighted magnetic resonance imaging (MRI) scan. AN participants were found to have reduced grey matter volumes in a number of areas, including regions of the basal ganglia (including the ventral striatum) and the parietal and temporal cortices. Body mass index (BMI) and global scores on the Eating Disorder Examination Questionnaire (EDE-Q) were also found to correlate with grey matter volumes in a region of the brainstem (including the substantia nigra and ventral tegmental area) in AN, and predicted 56% of the variance in grey matter volumes in this area. The brain regions associated with grey matter reductions in AN are consistent with regions responsible for cognitive deficits associated with the illness, including anhedonia, deficits in affect perception, and saccadic eye movement abnormalities. Overall, the findings suggest reduced grey matter volumes in AN that are associated with eating disorder symptomatology. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  20. Advances in statistical models for data analysis

    CERN Document Server

    Minerva, Tommaso; Vichi, Maurizio

    2015-01-01

    This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.

  1. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Science.gov (United States)

    Toddi A. Steelman; Branda Nowell; Deena. Bayoumi; Sarah. McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  2. Performance evaluation of the technical capabilities of DOE sites for disposal of mixed low-level waste. Volume 2: Technical basis and discussion of results

    International Nuclear Information System (INIS)

    Waters, R.D.; Gruebel, M.M.; Hospelhorn, M.B.

    1996-03-01

    A team of analysts designed and conducted a performance evaluation to estimate the technical capabilities of fifteen Department of Energy sites for disposal of mixed low-level waste (i.e., waste that contains both low-level radioactive materials and hazardous constituents). Volume 1 summarizes the process for selecting the fifteen sites, the methodology used in the evaluation, and the conclusions derived from the evaluation. Volume 2 first describes the screening process used to select the sites considered in the performance evaluations (PEs). This volume then provides the technical details of the methodology for conducting the performance evaluations. It also provides a comparison and analysis of the overall results for all sites that were evaluated. Volume 3 contains detailed evaluations of the fifteen sites and discussions of the results for each site.

  3. Know your tools - concordance of different methods for measuring brain volume change after ischemic stroke

    Energy Technology Data Exchange (ETDEWEB)

    Yassi, Nawaf; Campbell, Bruce C.V.; Davis, Stephen M.; Bivard, Andrew [The University of Melbourne, Departments of Medicine and Neurology, Melbourne Brain Centre at The Royal Melbourne Hospital, Parkville, Victoria (Australia); Moffat, Bradford A.; Steward, Christopher; Desmond, Patricia M. [The University of Melbourne, Department of Radiology, The Royal Melbourne Hospital, Parkville (Australia); Churilov, Leonid [The University of Melbourne, The Florey Institute of Neurosciences and Mental Health, Parkville (Australia); Parsons, Mark W. [University of Newcastle and Hunter Medical Research Institute, Priority Research Centre for Translational Neuroscience and Mental Health, Newcastle (Australia)

    2015-07-15

    Longitudinal brain volume changes have been investigated in a number of cerebral disorders as a surrogate marker of clinical outcome. In stroke, unique methodological challenges are posed by dynamic structural changes occurring after onset, particularly those relating to the infarct lesion. We aimed to evaluate agreement between different analysis methods for the measurement of post-stroke brain volume change, and to explore technical challenges inherent to these methods. Fifteen patients with anterior circulation stroke underwent magnetic resonance imaging within 1 week of onset and at 1 and 3 months. Whole-brain as well as grey- and white-matter volume were estimated separately using both an intensity-based and a surface watershed-based algorithm. In the case of the intensity-based algorithm, the analysis was also performed with and without exclusion of the infarct lesion. Due to the effects of peri-infarct edema at the baseline scan, longitudinal volume change was measured as percentage change between the 1 and 3-month scans. Intra-class and concordance correlation coefficients were used to assess agreement between the different analysis methods. Reduced major axis regression was used to inspect the nature of bias between measurements. Overall agreement between methods was modest with strong disagreement between some techniques. Measurements were variably impacted by procedures performed to account for infarct lesions. Improvements in volumetric methods and consensus between methodologies employed in different studies are necessary in order to increase the validity of conclusions derived from post-stroke cerebral volumetric studies. Readers should be aware of the potential impact of different methods on study conclusions. (orig.)

  4. Know your tools - concordance of different methods for measuring brain volume change after ischemic stroke

    International Nuclear Information System (INIS)

    Yassi, Nawaf; Campbell, Bruce C.V.; Davis, Stephen M.; Bivard, Andrew; Moffat, Bradford A.; Steward, Christopher; Desmond, Patricia M.; Churilov, Leonid; Parsons, Mark W.

    2015-01-01

    Longitudinal brain volume changes have been investigated in a number of cerebral disorders as a surrogate marker of clinical outcome. In stroke, unique methodological challenges are posed by dynamic structural changes occurring after onset, particularly those relating to the infarct lesion. We aimed to evaluate agreement between different analysis methods for the measurement of post-stroke brain volume change, and to explore technical challenges inherent to these methods. Fifteen patients with anterior circulation stroke underwent magnetic resonance imaging within 1 week of onset and at 1 and 3 months. Whole-brain as well as grey- and white-matter volume were estimated separately using both an intensity-based and a surface watershed-based algorithm. In the case of the intensity-based algorithm, the analysis was also performed with and without exclusion of the infarct lesion. Due to the effects of peri-infarct edema at the baseline scan, longitudinal volume change was measured as percentage change between the 1 and 3-month scans. Intra-class and concordance correlation coefficients were used to assess agreement between the different analysis methods. Reduced major axis regression was used to inspect the nature of bias between measurements. Overall agreement between methods was modest with strong disagreement between some techniques. Measurements were variably impacted by procedures performed to account for infarct lesions. Improvements in volumetric methods and consensus between methodologies employed in different studies are necessary in order to increase the validity of conclusions derived from post-stroke cerebral volumetric studies. Readers should be aware of the potential impact of different methods on study conclusions. (orig.)
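One of the agreement statistics named above, the concordance correlation coefficient (Lin's CCC), can be computed directly. A minimal sketch in the population-variance form, with illustrative (not study) data:

```python
def concordance_cc(x, y):
    """Lin's concordance correlation coefficient between two measurement series."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    var_x = sum((a - mean_x) ** 2 for a in x) / n
    var_y = sum((b - mean_y) ** 2 for b in y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y)) / n
    # Penalizes both scale and location shifts, unlike Pearson correlation
    return 2.0 * cov / (var_x + var_y + (mean_x - mean_y) ** 2)

# Two hypothetical methods measuring % brain volume change in four patients
intensity = [-0.8, -0.3, 0.1, -1.2]
watershed = [-0.6, -0.2, 0.3, -1.0]
print(concordance_cc(intensity, watershed))
print(concordance_cc(intensity, intensity))  # identical methods give 1.0
```

Unlike the Pearson coefficient, a constant offset between two methods lowers the CCC, which is why it is the preferred agreement measure in method-comparison studies like this one.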

  5. Snapshot analysis for rhodium fixed incore detector using BEACON methodology

    International Nuclear Information System (INIS)

    Cha, Kyoon Ho; Choi, Yu Sun; Lee, Eun Ki; Park, Moon Ghu; Morita, Toshio; Heibel, Michael D.

    2004-01-01

    The purpose of this report is to process the rhodium detector data of the Yonggwang nuclear unit 4 cycle 5 core to obtain the measured power distribution using the BEACON methodology. Rhodium snapshots of the YGN 4 cycle 5 core have been analyzed by both BEACON/SPINOVA and CECOR to compare the results of the two codes, based on a large number of snapshots obtained during normal plant operation. The results of this analysis show that BEACON/SPINOVA can be used for the snapshot analysis of Korean Standard Nuclear Power (KSNP) plants.

  6. A methodology for uncertainty analysis of reference equations of state

    DEFF Research Database (Denmark)

    Cheung, Howard; Frutiger, Jerome; Bell, Ian H.

    We present a detailed methodology for the uncertainty analysis of reference equations of state (EOS) based on Helmholtz energy. In recent years there has been an increased interest in uncertainties of property data and process models of thermal systems. In the literature there are various...... for uncertainty analysis is suggested as a tool for EOS. The uncertainties of the EOS properties are calculated from the experimental values and the EOS model structure through the parameter covariance matrix and subsequent linear error propagation. This allows reporting the uncertainty range (95% confidence...
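    The linear error propagation described above can be sketched compactly: the variance of a model output is the quadratic form J Σ Jᵀ of the parameter covariance matrix Σ with the Jacobian J of the model with respect to its parameters. The model and covariance below are toy assumptions, not the paper's EOS:

```python
import numpy as np

def propagated_sigma(jacobian, cov):
    """Standard deviation of a scalar model output, given the parameter
    covariance matrix and the row-vector Jacobian df/dp."""
    j = np.atleast_2d(jacobian)
    return float(np.sqrt((j @ cov @ j.T).item()))

# Toy model y = p0 + p1*x with an assumed 2x2 parameter covariance
cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])
x = 2.0
J = np.array([1.0, x])                  # [dy/dp0, dy/dp1]
sigma = propagated_sigma(J, cov)
print(f"95% confidence half-width: {1.96 * sigma:.3f}")
```

    For an EOS based on Helmholtz energy the Jacobian would contain the derivatives of the computed property with respect to each fitted coefficient, evaluated at the state point of interest.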

  7. Automated methodology for estimating waste streams generated from decommissioning contaminated facilities

    International Nuclear Information System (INIS)

    Toth, J.J.; King, D.A.; Humphreys, K.K.; Haffner, D.R.

    1994-01-01

    As part of the DOE Programmatic Environmental Impact Statement (PEIS), a viable way to determine aggregate waste volumes, cost, and direct labor hours for decommissioning and decontaminating facilities is required. In this paper, a methodology is provided for determining waste streams, cost, and direct labor hours from remediation of contaminated facilities. The method was developed using U.S. facility remediation data and information from several decommissioning programs, including reactor decommissioning projects, and provides rapid, consistent analysis for many facility types. Three remediation scenarios are considered for facility decontamination and decommissioning (D&D): unrestricted land use, semi-restricted land use, and restricted land use. Unrestricted land use involves removing radioactive components, decontaminating the building surfaces, and demolishing the remaining structure. Semi-restricted land use involves removing transuranic contamination and immobilizing the contamination on-site. Restricted land use involves removing the transuranic contamination and leaving the building standing. In both the semi-restricted and restricted land use scenarios, verification of containment with environmental monitoring is required. To use the methodology, facilities are placed in a building category depending upon the level of contamination, construction design, and function of the building. Unit volume and unit area waste generation factors are used to calculate waste volumes and estimate the amount of waste generated in each of the following classifications: low-level, transuranic, and hazardous waste. Unit factors for cost and labor hours are also applied to estimate D&D cost and labor hours.
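    The unit-factor approach described above reduces to multiplying a facility-size measure by per-classification generation factors. A minimal illustration (the factor values below are invented for the sketch, not the report's data):

```python
# Hypothetical unit generation factors: m3 of waste per m2 of floor area,
# by waste classification (illustrative values only).
UNIT_FACTORS = {
    "low-level":   0.12,
    "transuranic": 0.01,
    "hazardous":   0.03,
}

def waste_streams(floor_area_m2):
    """Estimate waste volume (m3) per classification for one facility."""
    return {cls: f * floor_area_m2 for cls, f in UNIT_FACTORS.items()}

print(waste_streams(1000.0))
```

    In the actual methodology the factors would be selected from the building category (contamination level, construction design, function) and analogous unit factors would yield cost and direct labor hours.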

  8. Impact of partial-volume correction in oncological PET studies. A systematic review and meta-analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cysouw, Matthijs C.F.; Kramer, Gerbrand M.; Hoekstra, Otto S. [VU University Medical Centre, Department of Radiology and Nuclear Medicine, Amsterdam (Netherlands); Schoonmade, Linda J. [VU University Medical Centre, Department of Medical Library, Amsterdam (Netherlands); Boellaard, Ronald [VU University Medical Centre, Department of Radiology and Nuclear Medicine, Amsterdam (Netherlands); University Medical Centre Groningen, Department of Nuclear Medicine and Molecular Imaging, Groningen (Netherlands); Vet, Henrica C.W. de [VU University Medical Centre, Department of Epidemiology and Biostatistics, Amsterdam (Netherlands)

    2017-11-15

    Positron-emission tomography can be useful in oncology for diagnosis, (re)staging, determining prognosis, and response assessment. However, partial-volume effects hamper accurate quantification of lesions <2-3 x the PET system's spatial resolution, and the clinical impact of this is not evident. This systematic review provides an up-to-date overview of studies investigating the impact of partial-volume correction (PVC) in oncological PET studies. We searched in PubMed and Embase databases according to the PRISMA statement, including studies from inception till May 9, 2016. Two reviewers independently screened all abstracts and eligible full-text articles and performed quality assessment according to QUADAS-2 and QUIPS criteria. For a set of similar diagnostic studies, we statistically pooled the results using bivariate meta-regression. Thirty-one studies were eligible for inclusion. Overall, study quality was good. For diagnosis and nodal staging, PVC yielded a strong trend of increased sensitivity at expense of specificity. Meta-analysis of six studies investigating diagnosis of pulmonary nodules (679 lesions) showed no significant change in diagnostic accuracy after PVC (p = 0.222). Prognostication was not improved for non-small cell lung cancer and esophageal cancer, whereas it did improve for head and neck cancer. Response assessment was not improved by PVC for (locally advanced) breast cancer or rectal cancer, and it worsened in metastatic colorectal cancer. The accumulated evidence to date does not support routine application of PVC in standard clinical PET practice. Consensus on the preferred PVC methodology in oncological PET should be reached. Partial-volume-corrected data should be used as adjuncts to, but not yet replacement for, uncorrected data. (orig.)

  9. The turning points of world history : financial and methodological interpretations

    OpenAIRE

    Kukliński, Antoni (ed.); Pawłowski, Krzysztof (ed.); Swianiewicz, Jan (ed.)

    2012-01-01

    Volumes VI and VII of the REUPUS Series “The Atlantic Community. The Titanic of the XXI Century?” and “The Turning Points of World History” can be seen as twin volumes trying to find new empirical observations, new methodological approaches and new value judgements to face the enigma of the XXI Century. Those volumes try to present some new interpretations of one of the greatest turning points of human history which is the essential feature of our times... Volume VII is the ...

  10. Discourse analysis: A useful methodology for health-care system researches.

    Science.gov (United States)

    Yazdannik, Ahmadreza; Yousefy, Alireza; Mohammadi, Sepideh

    2017-01-01

    Discourse analysis (DA) is an interdisciplinary field of inquiry that is becoming an increasingly popular research strategy in many disciplines, yet it has been little employed by health-care researchers. The methodology involves a focus on the sociocultural and political context in which text and talk occur. DA adds a linguistic approach to an understanding of the relationship between language and ideology, exploring the way in which theories of reality and relations of power are encoded in such aspects as the syntax, style, and rhetorical devices used in texts. DA is a useful and productive qualitative methodology, but it has been underutilized within health-care system research. Without a clear understanding of discourse theory and DA, it is difficult to comprehend important research findings and impossible to use DA as a research strategy. To redress this deficiency, this article presents an introduction to the concepts of discourse and DA, the history of DA, its philosophical background, the types of DA, and analysis strategies. Finally, we discuss how discourse affects the ideological dimensions of phenomena in the health-care system, health beliefs, and intra-disciplinary relationships in health care.

  11. Low-level radioactive waste from commercial nuclear reactors. Volume 1. Recommendations for technology developments with potential to significantly improve low-level radioactive waste management

    International Nuclear Information System (INIS)

    Rodgers, B.R.; Jolley, R.L.

    1986-02-01

    The overall task of this program was to provide an assessment of currently available technology for treating commercial low-level radioactive waste (LLRW), to initiate development of a methodology for choosing one technology for a given application, and to identify research needed to improve current treatment techniques and decision methodology. The resulting report is issued in four volumes. Volume 1 provides an executive summary and a general introduction to the four-volume set, in addition to recommendations for research and development (R and D) for low-level radioactive waste (LLRW) treatment. Generic, long-range, and/or high-risk programs identified and prioritized as needed R and D in the LLRW field include: (1) systems analysis to develop decision methodology; (2) alternative processes for dismantling, decontaminating, and decommissioning; (3) ion exchange; (4) incinerator technology; (5) disposal technology; (6) demonstration of advanced technologies; (7) technical assistance; (8) below regulatory concern materials; (9) mechanical treatment techniques; (10) monitoring and analysis procedures; (11) radical process improvements; (12) physical, chemical, thermal, and biological processes; (13) fundamental chemistry; (14) interim storage; (15) modeling; and (16) information transfer. The several areas are discussed in detail

  12. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Brownson, D. A.

    2002-01-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) Waste form/waste package degradation; (2) Waste package isotopic inventory; (3) Criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) Probability of criticality (for each potential critical configuration as well as the total event); and (5) Criticality consequences. The purpose of this summary report is to provide the status of the model reports and a schedule for their completion. This report also provides information on the model report content and validation. The model reports and their revisions are being generated as a result of: (1) Commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) Open Items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during the DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002).

  13. Statistical representative elementary volumes of porous media determined using greyscale analysis of 3D tomograms

    Science.gov (United States)

    Bruns, S.; Stipp, S. L. S.; Sørensen, H. O.

    2017-09-01

    Digital rock physics carries the dogmatic concept of having to segment volume images for quantitative analysis but segmentation rejects huge amounts of signal information. Information that is essential for the analysis of difficult and marginally resolved samples, such as materials with very small features, is lost during segmentation. In X-ray nanotomography reconstructions of Hod chalk we observed partial volume voxels with an abundance that limits segmentation based analysis. Therefore, we investigated the suitability of greyscale analysis for establishing statistical representative elementary volumes (sREV) for the important petrophysical parameters of this type of chalk, namely porosity, specific surface area and diffusive tortuosity, by using volume images without segmenting the datasets. Instead, grey level intensities were transformed to a voxel level porosity estimate using a Gaussian mixture model. A simple model assumption was made that allowed formulating a two point correlation function for surface area estimates using Bayes' theory. The same assumption enables random walk simulations in the presence of severe partial volume effects. The established sREVs illustrate that in compacted chalk, these simulations cannot be performed in binary representations without increasing the resolution of the imaging system to a point where the spatial restrictions of the represented sample volume render the precision of the measurement unacceptable. We illustrate this by analyzing the origins of variance in the quantitative analysis of volume images, i.e. resolution dependence and intersample and intrasample variance. Although we cannot make any claims on the accuracy of the approach, eliminating the segmentation step from the analysis enables comparative studies with higher precision and repeatability.
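    The grey-level-to-porosity transformation described above can be sketched with a two-component mixture: each voxel's porosity is taken as the posterior probability of the "pore" component. The component parameters below are assumed for illustration; in practice they would be fitted to the image histogram:

```python
import numpy as np

def voxel_porosity(grey, mu_pore, mu_solid, sigma):
    """Per-voxel porosity estimate as the posterior probability of the pore
    component in an equal-weight, equal-variance two-Gaussian mixture."""
    g = np.asarray(grey, float)
    pdf = lambda mu: np.exp(-0.5 * ((g - mu) / sigma) ** 2)
    p_pore, p_solid = pdf(mu_pore), pdf(mu_solid)
    return p_pore / (p_pore + p_solid)

grey = np.array([10.0, 50.0, 90.0])          # dark = pore, bright = solid
phi = voxel_porosity(grey, mu_pore=10.0, mu_solid=90.0, sigma=15.0)
print(phi.round(3))                          # per-voxel porosity estimates
print(f"bulk porosity: {phi.mean():.3f}")
```

    A partial-volume voxel midway between the two modes receives a porosity near 0.5 rather than being forced to 0 or 1, which is exactly the information a binary segmentation would discard.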

  14. A SAS2H/KENO-V methodology for 3D fuel burnup analysis

    International Nuclear Information System (INIS)

    Milosevic, M.; Greenspan, E.; Vujic, J.

    2002-01-01

    An efficient methodology for 3D fuel burnup analysis of LWR reactors is described in this paper. The methodology is founded on coupling a Monte Carlo method for 3D calculation of the node power distribution with a transport method for the depletion calculation in a 1D Wigner-Seitz equivalent cell for each node independently. The proposed fuel burnup modeling, based on application of the SCALE-4.4a control modules SAS2H and KENO-V.a, is verified for the case of a 2D x-y model of the IRIS 15 x 15 fuel assembly (with reflective boundary conditions) using two well-benchmarked code systems: MOCUP, a coupled MCNP-4C and ORIGEN2.1 utility code, and the KENO-V.a/ORIGEN2.1 code system recently developed by the authors of this paper. The proposed SAS2H/KENO-V.a methodology was then applied to the 3D burnup analysis of the IRIS-1000 benchmark core. The detailed k_eff and power density evolution with burnup are reported. (author)

  15. Principal Component Analysis-Based Pattern Analysis of Dose-Volume Histograms and Influence on Rectal Toxicity

    International Nuclear Information System (INIS)

    Soehn, Matthias; Alber, Markus; Yan Di

    2007-01-01

    Purpose: The variability of dose-volume histogram (DVH) shapes in a patient population can be quantified using principal component analysis (PCA). We applied this to rectal DVHs of prostate cancer patients and investigated the correlation of the PCA parameters with late bleeding. Methods and Materials: PCA was applied to the rectal wall DVHs of 262 patients, who had been treated with a four-field box, conformal adaptive radiotherapy technique. The correlated changes in the DVH pattern were revealed as 'eigenmodes,' which were ordered by their importance to represent data set variability. Each DVH is uniquely characterized by its principal components (PCs). The correlation of the first three PCs and chronic rectal bleeding of Grade 2 or greater was investigated with uni- and multivariate logistic regression analyses. Results: Rectal wall DVHs in four-field conformal RT can primarily be represented by the first two or three PCs, which describe ∼94% or 96% of the DVH shape variability, respectively. The first eigenmode models the total irradiated rectal volume; thus, PC1 correlates to the mean dose. Mode 2 describes the interpatient differences of the relative rectal volume in the two- or four-field overlap region. Mode 3 reveals correlations of volumes with intermediate doses (∼40-45 Gy) and volumes with doses >70 Gy; thus, PC3 is associated with the maximal dose. According to univariate logistic regression analysis, only PC2 correlated significantly with toxicity. However, multivariate logistic regression analysis with the first two or three PCs revealed an increased probability of bleeding for DVHs with more than one large PC. Conclusions: PCA can reveal the correlation structure of DVHs for a patient population as imposed by the treatment technique and provide information about its relationship to toxicity. It proves useful for augmenting normal tissue complication probability modeling approaches
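    The decomposition described above, where each patient's DVH is represented by a few principal components, can be sketched with a standard SVD-based PCA. The DVH curves below are synthetic sigmoids with random shift and slope, invented purely to illustrate the technique (not the study's cohort):

```python
import numpy as np

# Synthetic cumulative DVHs: rows are patients, columns are dose bins.
rng = np.random.default_rng(0)
dose = np.linspace(0, 70, 36)
shift = rng.normal(40, 5, size=50)[:, None]      # per-patient dose shift
slope = rng.normal(0.15, 0.03, size=50)[:, None] # per-patient falloff slope
dvh = 100.0 / (1.0 + np.exp(slope * (dose[None, :] - shift)))

# PCA via SVD of the mean-centered data matrix
centered = dvh - dvh.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)   # fraction of variance per eigenmode
pcs = centered @ Vt.T             # principal-component scores per patient

print(f"variance explained by first 3 modes: {explained[:3].sum():.1%}")
```

    Each row of `Vt` is an eigenmode of DVH-shape variability, and each patient's DVH is summarized by its scores `pcs`, which is what the study correlates with late rectal bleeding.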

  16. Development of methodology for the analysis of fuel behavior in light water reactor in design basis accidents

    International Nuclear Information System (INIS)

    Salatov, A. A.; Goncharov, A. A.; Eremenko, A. S.; Kuznetsov, V. I.; Bolnov, V. A.; Gusev, A. S.; Dolgov, A. B.; Ugryumov, A. V.

    2013-01-01

    The report attempts to analyze the current experience with fuel safety for light-water reactors (LWRs) under design-basis accident conditions in terms of its compliance with international requirements for licensing nuclear power plants. The components of a methodology for analyzing fuel behavior in design-basis accidents in LWRs are considered: the classification of design-basis accidents, the phenomenology of fuel behavior in design-basis accidents, the system of fuel safety criteria and their experimental support, the applicability of the computer codes and input data used for computational analysis of fuel behavior in accidents, and the way of accounting for the uncertainty of calculation models and input data. A brief history of the development of probabilistic safety analysis methodology for nuclear power plants abroad is given. Examples of a conservative approach to the safety analysis of VVER fuel and of a probabilistic approach to the safety analysis of TVS-K fuel are presented. In the authors' opinion, the actual problems in developing a methodology for analyzing the behavior of VVER fuel under design-basis accident conditions are the following: (1) development of a common methodology for analyzing the behavior of VVER fuel in design-basis accidents that implements a realistic approach to the analysis of uncertainty, which will in future be necessary for the licensing of operating VVER fuel abroad; and (2) experimental and analytical support of the methodology: experimental studies to identify and characterize the key uncertainties of computational models of the fuel and cladding, development of computational models of key events in codes, and validation of the codes on the basis of integral experiments.

  17. Methodology for thermal hydraulic conceptual design and performance analysis of KALIMER core

    International Nuclear Information System (INIS)

    Young-Gyun Kim; Won-Seok Kim; Young-Jin Kim; Chang-Kue Park

    2000-01-01

    This paper summarizes the methodology for thermal hydraulic conceptual design and performance analysis used for the KALIMER core, in particular the preliminary methodology for flow grouping and peak pin temperature calculation. The major technical results of the conceptual design for the KALIMER 98.03 core are presented and compared with those of the KALIMER 97.07 design core. The KALIMER 98.03 design core proved to be better optimized than the 97.07 design core: the number of flow groups was reduced from 16 to 11, and the equalized peak cladding midwall temperature from 654 deg. C to 628 deg. C. This was achieved through nuclear and thermal hydraulic design optimization, i.e. core power flattening and an increase of the radial blanket power fraction. Coolant flow distribution to the assemblies and core coolant/component temperatures must be determined in the core thermal hydraulic analysis. Sodium flow is distributed to the core assemblies with the overall goal of equalizing the peak cladding midwall temperatures for the peak-temperature pin of each bundle, and thus equalizing pin cladding damage accumulation and pin reliability. The flow grouping and the peak pin temperature calculation for the preliminary conceptual design are performed with the modules ORFCE-F60 and ORFCE-T60, respectively. The basic subchannel analysis will be performed with the SLTHEN code, and the detailed subchannel analysis with the MATRA-LMR code, which is under development for the K-Core system. This methodology proved practical for KALIMER core thermal hydraulic design in the related benchmark calculation studies, and it is used for the KALIMER core thermal hydraulic conceptual design. (author)

  18. The Nuclear Organization and Management Analysis Concept methodology: Four years later

    International Nuclear Information System (INIS)

    Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

    1992-01-01

    The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology have been demonstrated at two commercial nuclear power plants (NPPs) and one fossil power plant. In addition, some of the methods have been applied in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper focuses on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

  19. Small-Volume Injections: Evaluation of Volume Administration Deviation From Intended Injection Volumes.

    Science.gov (United States)

    Muffly, Matthew K; Chen, Michael I; Claure, Rebecca E; Drover, David R; Efron, Bradley; Fitch, William L; Hammer, Gregory B

    2017-10-01

    In the perioperative period, anesthesiologists and postanesthesia care unit (PACU) nurses routinely prepare and administer small-volume IV injections, yet the accuracy of delivered medication volumes in this setting has not been described. In this ex vivo study, we sought to characterize the degree to which small-volume injections (≤0.5 mL) deviated from the intended injection volumes among a group of pediatric anesthesiologists and pediatric postanesthesia care unit (PACU) nurses. We hypothesized that as the intended injection volumes decreased, the deviation from those intended injection volumes would increase. Ten attending pediatric anesthesiologists and 10 pediatric PACU nurses each performed a series of 10 injections into a simulated patient IV setup. Practitioners used separate 1-mL tuberculin syringes with removable 18-gauge needles (Becton-Dickinson & Company, Franklin Lakes, NJ) to aspirate 5 different volumes (0.025, 0.05, 0.1, 0.25, and 0.5 mL) of 0.25 mM Lucifer Yellow (LY) fluorescent dye constituted in saline (Sigma Aldrich, St. Louis, MO) from a rubber-stoppered vial. Each participant then injected the specified volume of LY fluorescent dye via a 3-way stopcock into IV tubing with free-flowing 0.9% sodium chloride (10 mL/min). The injected volume of LY fluorescent dye and 0.9% sodium chloride then drained into a collection vial for laboratory analysis. Microplate fluorescence wavelength detection (Infinite M1000; Tecan, Mannedorf, Switzerland) was used to measure the fluorescence of the collected fluid. Administered injection volumes were calculated based on the fluorescence of the collected fluid using a calibration curve of known LY volumes and associated fluorescence.To determine whether deviation of the administered volumes from the intended injection volumes increased at lower injection volumes, we compared the proportional injection volume error (loge [administered volume/intended volume]) for each of the 5 injection volumes using a linear
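    The error metric named above, log_e(administered volume / intended volume), is straightforward to compute. A sketch with invented administered volumes (not the study's measurements):

```python
import math

def proportional_error(administered_ml, intended_ml):
    """Proportional injection-volume error, log_e(administered / intended):
    0 means exact delivery, positive means over-delivery."""
    return math.log(administered_ml / intended_ml)

intended     = [0.025, 0.05, 0.1, 0.25, 0.5]     # mL, as in the study design
administered = [0.031, 0.046, 0.104, 0.247, 0.498]  # mL, hypothetical values
for i, a in zip(intended, administered):
    print(f"intended {i:.3f} mL -> error {proportional_error(a, i):+.3f}")
```

    The log transform makes over- and under-delivery symmetric (delivering 2x or 0.5x the intended volume gives errors of equal magnitude), which is why it suits a linear analysis across the five volume levels.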

  20. Methodologies for risk analysis in slope instability

    International Nuclear Information System (INIS)

    Bernabeu Garcia, M.; Diaz Torres, J. A.

    2014-01-01

    This paper reviews the different methodologies used in producing landslide risk maps, so that the reader can gain a basic understanding of how to proceed in their development. Landslide hazard maps are increasingly demanded by governments: due to climate change, deforestation and the pressure exerted by the growth of urban centers, the damage caused by natural phenomena increases each year, making this a field of study of growing importance. To explain the mapping process, each of its phases is described in turn: from the study of the types of slope movements and the necessary handling of geographic information systems (GIS), through inventories, to the analyses of landslide susceptibility, hazard, vulnerability and risk. (Author)

  1. Reporting and methodological quality of survival analysis in articles published in Chinese oncology journals.

    Science.gov (United States)

    Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying

    2017-12-01

    Survival analysis methods have gained widespread use in the field of oncology. To achieve reliable results, the methodological process and reporting quality are crucial. This review provides the first examination of the methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. The aims were to examine the methodological and reporting quality of survival analysis, to identify common deficiencies, to suggest desirable precautions in the analysis, and to offer advice for authors, readers, and editors. A total of 242 survival analysis articles were evaluated from the 1492 articles published in 4 leading Chinese oncology journals in 2013. Articles were evaluated according to 16 established items for the proper use and reporting of survival analysis. The application rates of Kaplan-Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used a parametric method for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 articles were under 80% (the lowest being 75.25%) and 55 articles were 100%. The reporting rates of all types of survival endpoint were lower than 10%. Eleven of the 100 articles that reported a loss to follow-up stated how it was treated in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks of the proportional hazards assumption, no report of testing for interactions and collinearity between independent variables, and no report of the sample size calculation method. Thirty-six articles (32.74%) reported the methods of independent variable selection. The above defects could make potentially inaccurate
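    The Kaplan-Meier estimator, the method applied in over 90% of the reviewed articles, is simple enough to implement from scratch. A minimal sketch on invented follow-up data (months), where censored observations reduce the risk set without triggering a survival drop:

```python
def kaplan_meier(times, events):
    """times: follow-up durations; events: 1 = event observed, 0 = censored.
    Returns [(time, survival probability)] at each distinct event time."""
    s, curve = 1.0, []
    for t in sorted({t for t, e in zip(times, events) if e == 1}):
        at_risk = sum(1 for ti in times if ti >= t)
        deaths = sum(1 for ti, e in zip(times, events) if ti == t and e == 1)
        s *= 1.0 - deaths / at_risk           # product-limit update
        curve.append((t, s))
    return curve

times  = [6, 7, 10, 15, 19, 25]   # hypothetical follow-up, months
events = [1, 0, 1,  1,  0,  1]
for t, s in kaplan_meier(times, events):
    print(f"t={t:>2} months  S(t)={s:.3f}")
```

    Note how the censored patients (t = 7 and t = 19) leave the risk set between event times; ignoring censoring, one of the common deficiencies noted above, would bias S(t) downward.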

  2. Automatic delineation of functional volumes in emission tomography for oncology applications

    International Nuclear Information System (INIS)

    Hatt, M.

    2008-12-01

    One of the main factors of error for semi-quantitative analysis in positron emission tomography (PET) imaging for diagnosis and patient follow up, as well as new flourishing applications like image guided radiotherapy, is the methodology used to define the volumes of interest in the functional images. This is explained by poor image quality in emission tomography resulting from noise and partial volume effects induced blurring, as well as the variability of acquisition protocols, scanner models and image reconstruction procedures. The large number of proposed methodologies for the definition of a PET volume of interest does not help either. The majority of such proposed approaches are based on deterministic binary thresholding that are not robust to contrast variation and noise. In addition, these methodologies are usually unable to correctly handle heterogeneous uptake inside tumours. The objective of this thesis is to develop an automatic, robust, accurate and reproducible 3D image segmentation approach for the functional volumes determination of tumours of all sizes and shapes, and whose activity distribution may be strongly heterogeneous. The approach we have developed is based on a statistical image segmentation framework, combined with a fuzzy measure, which allows to take into account both noisy and blurry properties of nuclear medicine images. It uses a stochastic iterative parameters estimation and a locally adaptive model of the voxel and its neighbours for the estimation and segmentation. The developed approaches have been evaluated using a large array of datasets, comprising both simulated and real acquisitions of phantoms and tumours. The results obtained on phantom acquisitions allowed to validate the accuracy of the segmentation with respect to the size of considered structures, down to 13 mm in diameter (about twice the spatial resolution of a typical PET scanner), as well as its robustness with respect to noise, contrast variation, acquisition

  3. Micro analysis of fringe field formed inside LDA measuring volume

    International Nuclear Information System (INIS)

    Ghosh, Abhijit; Nirala, A K

    2016-01-01

    In the present study we propose a technique for micro analysis of fringe field formed inside laser Doppler anemometry (LDA) measuring volume. Detailed knowledge of the fringe field obtained by this technique allows beam quality, alignment and fringe uniformity to be evaluated with greater precision and may be helpful for selection of an appropriate optical element for LDA system operation. A complete characterization of fringes formed at the measurement volume using conventional, as well as holographic optical elements, is presented. Results indicate the qualitative, as well as quantitative, improvement of fringes formed at the measurement volume by holographic optical elements. Hence, use of holographic optical elements in LDA systems may be advantageous for improving accuracy in the measurement. (paper)

  4. Sectional analysis for volume determination and selection of volume equations for the Tapajós National Forest

    Directory of Open Access Journals (Sweden)

    Renato Bezerra da Silva Ribeiro

    2014-12-01

    The aim of this study was to analyze different section lengths for volume determination and to fit volumetric models for estimating timber production in a forest management area of the Tapajós National Forest (FNT). Six sectioning treatments were tested on 152 logs of 12 commercial species. The obtained volumes were compared statistically by analysis of variance (ANOVA) to choose the best method of sectioning and to calculate the actual volume of 2,094 sample trees in different commercial diameter classes. Ten mathematical models were fitted to the whole data set and to the species Manilkara huberi (Ducke) Chevalier (maçaranduba), Lecythis lurida (Miers) S.A.Mori (jarana) and Hymenaea courbaril L. (jatobá). The criteria used to choose the best model were the adjusted coefficient of determination in percentage (R2adj%), the standard error of the estimate in percentage (Syx%), the significance of the parameters, the normality of residuals, the variance inflation factor (VIF) and the graphic distribution of residuals. There was no statistical difference between the methods of sectioning, and thus using the total length of the logs was more practical in the field. The logarithmic forms of the Schumacher and Hall and Spurr models were the best for estimating the volume for these species and for the whole sample set.
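    The logarithmic Schumacher-Hall model selected above, ln(V) = b0 + b1 ln(D) + b2 ln(H), is linear in its coefficients and can be fitted by ordinary least squares. A sketch on invented tree measurements (D = diameter in cm, H = height in m, V = volume in m3), not the FNT sample:

```python
import numpy as np

D = np.array([55.0, 62.0, 70.0, 80.0, 95.0, 110.0])   # diameter, cm
H = np.array([18.0, 20.0, 22.0, 24.0, 27.0, 30.0])    # height, m
V = 0.00006 * D**1.9 * H**1.1                         # synthetic "true" volumes

# Design matrix for ln(V) = b0 + b1*ln(D) + b2*ln(H)
X = np.column_stack([np.ones_like(D), np.log(D), np.log(H)])
coef, *_ = np.linalg.lstsq(X, np.log(V), rcond=None)
b0, b1, b2 = coef
print(f"b1 (diameter exponent) = {b1:.3f}, b2 (height exponent) = {b2:.3f}")
```

    Because the synthetic data follow the model exactly, the fit recovers the generating exponents (1.9 and 1.1); with real measurements, the residual diagnostics listed in the abstract (Syx%, normality, VIF) decide between candidate models.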

  5. Source apportionment and sensitivity analysis: two methodologies with two different purposes

    Science.gov (United States)

    Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe

    2017-11-01

    This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and decoupled direct method (DDM). A simple theoretical example to compare these approaches is used highlighting differences and potential implications for policy. When the relationships between concentration and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used indifferently for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
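    The paper's central point, that impacts and contributions diverge under nonlinearity, can be shown with a two-line numeric example. The quadratic concentration-emission model below is invented purely to expose the discrepancy:

```python
def concentration(e1, e2):
    """Toy nonlinear concentration-emission relationship."""
    return (e1 + e2) ** 2

e1, e2 = 3.0, 1.0
total = concentration(e1, e2)                 # 16.0

# Brute-force impact of source 1 (sensitivity analysis):
# total minus the concentration simulated without source 1.
impact_1 = total - concentration(0.0, e2)     # 16 - 1 = 15

# Apportioned contribution of source 1 (share proportional to emissions,
# as a mass-conserving tagged scheme would attribute it).
contribution_1 = total * e1 / (e1 + e2)       # 12

print(impact_1, contribution_1)               # 15.0 12.0 -- not equal
```

    With a linear model the two numbers coincide, which is why quantifying the nonlinearity first, as the abstract recommends, determines whether either method can stand in for the other.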

  6. Modern charge-density analysis

    CERN Document Server

    Gatti, Carlo

    2012-01-01

    Focusing on developments from the past 10-15 years, this volume presents an objective overview of the research in charge density analysis. The most promising methodologies are included, in addition to powerful interpretative tools and a survey of important areas of research.

  7. A new methodology of spatial cross-correlation analysis.

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes.
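By way of illustration, Moran's index can be written as a quadratic form in the standardized variable, and a cross-correlation analogue replaces one of the two vectors. The sketch below is a plausible minimal reading of such definitions, not the paper's exact formulation:

```python
import numpy as np

def morans_i(z, W):
    """Global Moran's index as a quadratic form, I = z'Wz / z'z,
    with z standardized and W row-normalized."""
    z = (z - z.mean()) / z.std()
    W = W / W.sum(axis=1, keepdims=True)
    return (z @ W @ z) / (z @ z)

def cross_morans_i(x, y, W):
    """Global spatial cross-correlation coefficient by analogy:
    R = zx' W zy / n (assumed form, for illustration only)."""
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    W = W / W.sum(axis=1, keepdims=True)
    return (zx @ W @ zy) / len(x)

# Five locations on a chain; a smooth spatial gradient gives positive autocorrelation.
W = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
z = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
```

For this smooth gradient both indexes come out positive (0.6), and the cross index of a variable with itself reduces to the autocorrelation case.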

  8. Comparison Of Irms Delhi Methodology With Who Methodology On Immunization Coverage

    Directory of Open Access Journals (Sweden)

    Singh Padam

    1996-01-01

    Research question: What are the merits of the IRMS model over the WHO model for coverage evaluation surveys? Which method is superior and appropriate for coverage evaluation surveys of immunization in our setting? Objective: To compare the IRMS Delhi methodology with the WHO methodology on immunization coverage. Study design: Cross-sectional. Setting: Both urban and rural. Participants: Mothers and children. Sample size: 300 children between 1-2 years and 300 mothers in rural areas; 75 children and 75 mothers in urban areas. Study variables: Rural, urban, caste group, size of the stratum, literacy, sex and cost effectiveness. Outcome variables: Coverage level of immunization. Analysis: Routine statistical analysis. Results: The IRMS-developed methodology scores a better rating than the WHO methodology, especially when coverage evaluation is attempted in medium-sized villages with socio-economic segregation, which remains the main characteristic of Indian villages.

  9. Optimization of cyanide extraction from wastewater using emulsion liquid membrane system by response surface methodology.

    Science.gov (United States)

    Xue, Juan Qin; Liu, Ni Na; Li, Guo Ping; Dang, Long Tao

    To solve the disposal problem of cyanide wastewater, removal of cyanide from wastewater using a water-in-oil emulsion liquid membrane (ELM) was studied in this work. Specifically, the effects of the surfactant Span-80, the carrier trioctylamine (TOA), the stripping agent NaOH solution and the emulsion-to-external-phase volume ratio on removal of cyanide were investigated. Removal of total cyanide was determined using the silver nitrate titration method. Regression analysis and optimization of the conditions were conducted using the Design-Expert software and response surface methodology (RSM). The actual cyanide removals and the removals predicted by the RSM analysis were in close agreement, and the optimal conditions were determined to be as follows: volume fraction of Span-80, 4% (v/v); volume fraction of TOA, 4% (v/v); concentration of NaOH, 1% (w/v); and emulsion-to-external-phase volume ratio, 1:7. Under the optimum conditions, the measured removal of total cyanide was 95.07%, against an RSM-predicted removal of 94.90%, a small deviation. The treatment of cyanide wastewater using an ELM is thus an effective technique for application in industry.
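Response surface methodology of this kind fits a second-order polynomial to the measured responses and locates its stationary point. A minimal two-factor sketch on synthetic data (the surface and all numbers are invented, not the paper's regression):

```python
import numpy as np

# Synthetic "removal %" response with a known optimum at x1 = 4, x2 = 1
def true_removal(x1, x2):
    return 95.0 - (x1 - 4.0) ** 2 - 2.0 * (x2 - 1.0) ** 2

rng = np.random.default_rng(0)
x1 = rng.uniform(2.0, 6.0, 30)   # e.g. surfactant fraction (invented units)
x2 = rng.uniform(0.0, 2.0, 30)   # e.g. NaOH concentration (invented units)
y = true_removal(x1, x2)

# Full second-order model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point: solve grad y = 0  ->  [[2*b11, b12], [b12, 2*b22]] @ x = -[b1, b2]
A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
x_opt = np.linalg.solve(A, -b[1:3])
y_opt = float(np.array([1.0, x_opt[0], x_opt[1],
                        x_opt[0]**2, x_opt[1]**2, x_opt[0]*x_opt[1]]) @ b)
```

Since the synthetic response is itself quadratic and noise-free, the fit recovers the optimum exactly; with real experimental data one would also check lack-of-fit before trusting the stationary point.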

  10. Methodology for thermal-hydraulics analysis of pool type MTR fuel research reactors

    International Nuclear Information System (INIS)

    Umbehaun, Pedro Ernesto

    2000-01-01

    This work presents a methodology developed for the thermal-hydraulic analysis of pool-type MTR-fuel research reactors. For this methodology a computational program, FLOW, and a model, MTRCR-IEAR1, were developed. FLOW calculates the cooling flow distribution in the fuel elements, control elements and irradiators, and through the channels formed among the fuel elements and among the irradiators and reflectors. This program was validated against experimental data for the IEA-R1 research reactor core at IPEN-CNEN/SP. MTRCR-IEAR1 is a model based on the commercial program Engineering Equation Solver (EES). Besides the steady-state thermal-hydraulic analyses of the core performed by traditional computational programs such as COBRA-3C/RERTR and PARET, this model allows the analysis of parallel channels with different cooling flows and/or geometries. Uncertainty factors for the variables from the neutronic and thermal-hydraulic calculations, and also from fuel element fabrication, are introduced in the model. For steady-state analyses, MTRCR-IEAR1 showed good agreement with the results of COBRA-3C/RERTR and PARET. The developed methodology was used to calculate the cooling flow distribution and to perform the thermal-hydraulic analysis of a typical core configuration of the IEA-R1 research reactor. (author)

  11. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a proximate analysis, and no correlation between the mean values and the maximum sampling errors of the two methods was observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
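A common way to quantify the maximum sampling error of a measured property is the half-width of the confidence interval of the mean, e = t·s/√n. A minimal sketch (the measurements and the choice of confidence level are illustrative, not from the paper):

```python
import math
from statistics import mean, stdev

def max_sampling_error(samples, t_crit):
    """Half-width of the mean's confidence interval: e = t * s / sqrt(n).
    t_crit must match the chosen confidence level and n-1 degrees of freedom."""
    return t_crit * stdev(samples) / math.sqrt(len(samples))

# Five hypothetical ash-content measurements (%), with the two-sided 95%
# Student t value for 4 degrees of freedom (2.776):
ash = [4.1, 4.4, 3.9, 4.2, 4.0]
e = max_sampling_error(ash, t_crit=2.776)
ci = (mean(ash) - e, mean(ash) + e)
```

If the half-width exceeds the precision required for the fuel property, more replicates are needed, which is exactly the kind of sampling-map decision the paper addresses.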

  12. The decade 1989-1998 in Spanish psychology: an analysis of research in statistics, methodology, and psychometric theory.

    Science.gov (United States)

    García-Pérez, M A

    2001-11-01

    This paper presents an analysis of research published in the decade 1989-1998 by Spanish faculty members in the areas of statistical methods, research methodology, and psychometric theory. Database search and direct correspondence with faculty members in Departments of Methodology across Spain rendered a list of 193 papers published in these broad areas by 82 faculty members. These and other faculty members had actually published 931 papers over the decade of analysis, but 738 of them addressed topics not appropriate for description in this report. Classification and analysis of these 193 papers revealed topics that have attracted the most interest (psychophysics, item response theory, analysis of variance, sequential analysis, and meta-analysis) as well as other topics that have received less attention (scaling, factor analysis, time series, and structural models). A significant number of papers also dealt with various methodological issues (software, algorithms, instrumentation, and techniques). A substantial part of this report is devoted to describing the issues addressed across these 193 papers--most of which are written in the Spanish language and published in Spanish journals--and some representative references are given.

  13. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela D A Guardia

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  14. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  15. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    International Nuclear Information System (INIS)

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for the analysis of external events in the PSA of the Jose Cabrera NPP. In the fire analysis, a version of the COMPBRN3 code, modified by Empresarios Agrupados according to the guidelines of Appendix D of NUREG/CR-5088, has been used. Generic cases were modelled and general conclusions obtained that are applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The flood analysis methodology is based on the construction of event trees to represent flood propagation as a function of the condition of the communication paths between areas, and of trees showing propagation stages as a function of the affected areas and damaged mitigation equipment. To determine the temporal evolution of the flooded-area level, the CAINZO-EA code has been developed, adapted to specific plant characteristics. In both the fire and flood analyses a quantification methodology has been adopted which consists of analysing the damage caused at each stage of growth or propagation and identifying, in the internal-events models, the gates, basic events or headers to which safe failure (probability 1) due to damage is assigned. (Author)

  16. Left ventricular volume analysis as a basic tool to describe cardiac function.

    Science.gov (United States)

    Kerkhof, Peter L M; Kuznetsova, Tatiana; Ali, Rania; Handly, Neal

    2018-03-01

    The heart is often regarded as a compression pump. Therefore, determination of pressure and volume is essential for cardiac function analysis. Traditionally, ventricular performance was described in terms of the Starling curve, i.e., output related to input. This view is based on two variables (namely, stroke volume and end-diastolic volume), was often studied in the isolated (i.e., denervated) heart, and has dominated the interpretation of cardiac mechanics over the last century. The ratio of the prevailing coordinates within that paradigm is termed the ejection fraction (EF), the popular metric routinely used in the clinic. Here we present an insightful alternative approach that describes volume regulation by relating end-systolic volume (ESV) to end-diastolic volume (EDV). This route obviates the undesired use of metrics derived from differences or ratios, as employed in previous models. We illustrate the basic principles of ventricular volume regulation with data obtained from intact-animal experiments and collected in healthy humans. Special attention is given to sex-specific differences. The method can be applied to the dynamics of a single heart and to an ensemble of individuals. Group analysis allows for stratification regarding sex, age, medication, and additional clinically relevant covariates. A straightforward procedure derives the relationship between EF and ESV and describes myocardial oxygen consumption in terms of ESV. This representation enhances insight and reduces the impact of the metric EF, in favor of the end-systolic elastance concept advanced four decades ago.
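The link between the classical metric and the volume-regulation view is simply EF = (EDV − ESV)/EDV, with the alternative representation obtained by regressing ESV on EDV across an ensemble. A minimal sketch with invented volumes (not patient data):

```python
import numpy as np

def ejection_fraction(edv, esv):
    """EF = stroke volume / end-diastolic volume = (EDV - ESV) / EDV."""
    return (edv - esv) / edv

# Invented ensemble of (EDV, ESV) pairs in mL:
edv = np.array([100.0, 120.0, 140.0, 160.0, 180.0])
esv = np.array([40.0, 50.0, 60.0, 70.0, 80.0])

ef = ejection_fraction(edv, esv)

# Volume-regulation view: linear regression ESV = a + b * EDV over the group
b, a = np.polyfit(edv, esv, 1)
```

Note that a perfectly linear ESV-EDV relation (as in this toy ensemble) still yields an EF that varies with heart size, which is one reason the authors argue the ratio metric can obscure the underlying regulation.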

  17. Transuranium analysis methodologies for biological and environmental samples

    International Nuclear Information System (INIS)

    Wessman, R.A.; Lee, K.D.; Curry, B.; Leventhal, L.

    1978-01-01

    Analytical procedures for the most abundant transuranium nuclides in the environment (i.e., plutonium and, to a lesser extent, americium) are available. There is a lack of procedures for doing sequential analysis for Np, Pu, Am, and Cm in environmental samples, primarily because of the current emphasis on Pu and Am. Reprocessing requirements and waste disposal connected with the fuel cycle indicate that neptunium and curium must be considered in environmental radioactive assessments. Therefore it was necessary to develop procedures that determine all four of these radionuclides in the environment. The state of the art of transuranium analysis methodology as applied to environmental samples is discussed relative to different sample sources, such as soil, vegetation, air, water, and animals. Isotope-dilution analysis with 243Am (239Np) and 236Pu or 242Pu radionuclide tracers is used. Americium and curium are analyzed as a group, with 243Am as the tracer. Sequential extraction procedures employing bis(2-ethylhexyl)orthophosphoric acid (HDEHP) were found to result in lower yields and higher Am-Cm fractionation than ion-exchange methods.

  18. Application of System Dynamics Methodology in Population Analysis

    Directory of Open Access Journals (Sweden)

    August Turina

    2009-09-01

    The goal of this work is to present the application of system dynamics and systems thinking, as well as the advantages and possible shortcomings of this analytic approach, in order to improve the analysis of complex systems such as populations and thereby monitor more effectively the underlying causes of migrations. This methodology has long been present in interdisciplinary scientific circles, but its scientific contribution has not been sufficiently applied in analytical practice in Croatia. Namely, the major part of system analysis is focused on detail complexity rather than on dynamic complexity. Generally, the science of complexity deals with emergence, innovation, learning and adaptation. Complexity is viewed according to the number of system components, or through the number of combinations that must be continually analyzed in order to understand the system and consequently make adequate decisions. Simulations containing thousands of variables and complex arrays of details distract attention from the basic causal patterns and key interrelations emerging and prevailing within an analyzed population. Systems thinking offers a holistic and integral perspective for observing the world.

  19. Tools for the analysis of dose optimization: I. Effect-volume histogram

    International Nuclear Information System (INIS)

    Alber, M.; Nuesslin, F.

    2002-01-01

    With the advent of dose optimization algorithms, predominantly for intensity-modulated radiotherapy (IMRT), computer software has progressed beyond the point of being merely a tool at the hands of an expert and has become an active, independent mediator of the dosimetric conflicts between treatment goals and risks. To understand and control the internal decision finding as well as to provide means to influence it, a tool for the analysis of the dose distribution is presented which reveals the decision-making process performed by the algorithm. The internal trade-offs between partial volumes receiving high or low doses are driven by functions which attribute a weight to each volume element. The statistics of the distribution of these weights is cast into an effect-volume histogram (EVH) in analogy to dose-volume histograms. The analysis of the EVH reveals which traits of the optimum dose distribution result from the defined objectives, and which are a random consequence of under- or misspecification of treatment goals. The EVH can further assist in the process of finding suitable objectives and balancing conflicting objectives. If biologically inspired objectives are used, the EVH shows the distribution of local dose effect relative to the prescribed level. (author)
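In analogy to a cumulative dose-volume histogram, an effect-volume histogram can be built by tabulating, for each weight level, the fraction of the volume whose per-voxel weight reaches that level. A minimal sketch with invented weights (the actual objective-derived weights are model-specific):

```python
import numpy as np

def cumulative_histogram(values, levels):
    """Fraction of volume elements with value >= each level (DVH/EVH convention)."""
    values = np.asarray(values, dtype=float)
    return np.array([(values >= lv).mean() for lv in levels])

# Per-voxel effect weights produced by some objective function (invented numbers):
weights = np.array([0.1, 0.4, 0.4, 0.8, 1.2, 2.0])
levels = np.array([0.0, 0.5, 1.0, 1.5])
evh = cumulative_histogram(weights, levels)
```

Reading the curve is then the same as reading a DVH: a heavy upper tail indicates that a few volume elements dominate the optimizer's trade-off, which is the diagnostic use the paper proposes.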

  20. Control volume based hydrocephalus research; a phantom study

    Science.gov (United States)

    Cohen, Benjamin; Voorhees, Abram; Madsen, Joseph; Wei, Timothy

    2009-11-01

    Hydrocephalus is a complex spectrum of neurophysiological disorders involving perturbation of the intracranial contents; primarily increased intraventricular cerebrospinal fluid (CSF) volume and intracranial pressure are observed. CSF dynamics are highly coupled to the cerebral blood flows and pressures as well as the mechanical properties of the brain. Hydrocephalus, as such, is a very complex biological problem. We propose integral control volume analysis as a method of tracking these important interactions using mass and momentum conservation principles. As a first step in applying this methodology in humans, an in vitro phantom is used as a simplified model of the intracranial space. The phantom's design consists of a rigid container filled with a compressible gel. Within the gel a hollow spherical cavity represents the ventricular system and a cylindrical passage represents the spinal canal. A computer controlled piston pump supplies sinusoidal volume fluctuations into and out of the flow phantom. MRI is used to measure fluid velocity and volume change as functions of time. Independent pressure measurements and momentum flow rate measurements are used to calibrate the MRI data. These data are used as a framework for future work with live patients and normal individuals. Flow and pressure measurements on the flow phantom will be presented through the control volume framework.

  1. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, a gap in contemporary business process modeling approaches is identified, and general modeling principles that can fill this gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated through use in a real project. Using examples from this project, the main features of the methodology are explained, together with the significant problems that were encountered during the project. Drawing on these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  2. Atlas based brain volumetry: How to distinguish regional volume changes due to biological or physiological effects from inherent noise of the methodology.

    Science.gov (United States)

    Opfer, Roland; Suppa, Per; Kepp, Timo; Spies, Lothar; Schippling, Sven; Huppertz, Hans-Jürgen

    2016-05-01

    Fully-automated regional brain volumetry based on structural magnetic resonance imaging (MRI) plays an important role in quantitative neuroimaging. In clinical trials as well as in clinical routine multiple MRIs of individual patients at different time points need to be assessed longitudinally. Measures of inter- and intrascanner variability are crucial to understand the intrinsic variability of the method and to distinguish volume changes due to biological or physiological effects from inherent noise of the methodology. To measure regional brain volumes an atlas based volumetry (ABV) approach was deployed using a highly elastic registration framework and an anatomical atlas in a well-defined template space. We assessed inter- and intrascanner variability of the method in 51 cognitively normal subjects and 27 Alzheimer dementia (AD) patients from the Alzheimer's Disease Neuroimaging Initiative by studying volumetric results of repeated scans for 17 compartments and brain regions. Median percentage volume differences of scan-rescans from the same scanner ranged from 0.24% (whole brain parenchyma in healthy subjects) to 1.73% (occipital lobe white matter in AD), with generally higher differences in AD patients as compared to normal subjects (e.g., 1.01% vs. 0.78% for the hippocampus). Minimum percentage volume differences detectable with an error probability of 5% were in the one-digit percentage range for almost all structures investigated, with most of them being below 5%. Intrascanner variability was independent of magnetic field strength. The median interscanner variability was up to ten times higher than the intrascanner variability. Copyright © 2016 Elsevier Inc. All rights reserved.
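Scan-rescan variability of this kind is commonly summarized by the percentage volume difference of paired measurements, with a minimum detectable change at 5% error probability estimated from the upper tail of those differences. A minimal sketch with invented rescan data (a generic construction, not the paper's exact statistical procedure):

```python
import numpy as np

def pct_volume_difference(v1, v2):
    """Percentage volume difference of a scan-rescan pair, relative to the pair mean."""
    return 100.0 * abs(v1 - v2) / ((v1 + v2) / 2.0)

# Invented hippocampus scan-rescan volumes (mL) for ten subjects:
scan   = np.array([3.10, 3.40, 2.95, 3.20, 3.60, 3.05, 3.30, 3.15, 3.50, 2.80])
rescan = np.array([3.08, 3.45, 2.99, 3.15, 3.66, 3.02, 3.33, 3.20, 3.44, 2.83])

diffs = pct_volume_difference(scan, rescan)
median_diff = np.median(diffs)
# Volume changes below this threshold are indistinguishable from rescan noise
# at roughly 5% error probability:
min_detectable = np.percentile(diffs, 95)
```

Only longitudinal changes larger than `min_detectable` should be attributed to biology rather than to the methodology's inherent noise, which is the distinction the abstract emphasizes.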

  3. Reachable volume RRT

    KAUST Repository

    McMahon, Troy

    2015-05-01

    © 2015 IEEE. Reachable volumes are a new technique that allows one to efficiently restrict sampling to feasible/reachable regions of the planning space even for high degree of freedom and highly constrained problems. However, they have so far only been applied to graph-based sampling-based planners. In this paper we develop the methodology to apply reachable volumes to tree-based planners such as Rapidly-Exploring Random Trees (RRTs). In particular, we propose a reachable volume RRT called RVRRT that can solve high degree of freedom problems and problems with constraints. To do so, we develop a reachable volume stepping function, a reachable volume expand function, and a distance metric based on these operations. We also present a reachable volume local planner to ensure that local paths satisfy constraints for methods such as PRMs. We show experimentally that RVRRTs can solve constrained problems with as many as 64 degrees of freedom and unconstrained problems with as many as 134 degrees of freedom. RVRRTs can solve problems more efficiently than existing methods, requiring fewer nodes and collision detection calls. We also show that it is capable of solving difficult problems that existing methods cannot.

  4. Reachable volume RRT

    KAUST Repository

    McMahon, Troy; Thomas, Shawna; Amato, Nancy M.

    2015-01-01

    © 2015 IEEE. Reachable volumes are a new technique that allows one to efficiently restrict sampling to feasible/reachable regions of the planning space even for high degree of freedom and highly constrained problems. However, they have so far only been applied to graph-based sampling-based planners. In this paper we develop the methodology to apply reachable volumes to tree-based planners such as Rapidly-Exploring Random Trees (RRTs). In particular, we propose a reachable volume RRT called RVRRT that can solve high degree of freedom problems and problems with constraints. To do so, we develop a reachable volume stepping function, a reachable volume expand function, and a distance metric based on these operations. We also present a reachable volume local planner to ensure that local paths satisfy constraints for methods such as PRMs. We show experimentally that RVRRTs can solve constrained problems with as many as 64 degrees of freedom and unconstrained problems with as many as 134 degrees of freedom. RVRRTs can solve problems more efficiently than existing methods, requiring fewer nodes and collision detection calls. We also show that it is capable of solving difficult problems that existing methods cannot.
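For readers unfamiliar with the underlying planner, one RRT extension step looks roughly as follows. This is the generic RRT step; the paper's contribution is to replace it with a reachable-volume stepping function that keeps new samples in constraint-satisfying regions:

```python
import math

def rrt_extend(tree, sample, step):
    """Generic RRT extension: move from the nearest tree node toward `sample`
    by at most `step`, and add the new node to the tree."""
    nearest = min(tree, key=lambda p: math.dist(p, sample))
    d = math.dist(nearest, sample)
    if d <= step:
        new = sample                       # sample is within reach: adopt it directly
    else:
        t = step / d                       # otherwise step toward it by `step`
        new = tuple(a + t * (b - a) for a, b in zip(nearest, sample))
    tree.append(new)
    return new

tree = [(0.0, 0.0)]
node = rrt_extend(tree, (10.0, 0.0), step=1.0)  # -> (1.0, 0.0)
```

In the RVRRT variant described above, both the nearest-neighbor distance and the stepping would be computed with reachable-volume-aware operations instead of plain Euclidean ones.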

  5. Aroma profile design of wine spirits: Multi-objective optimization using response surface methodology.

    Science.gov (United States)

    Matias-Guiu, Pau; Rodríguez-Bencomo, Juan José; Pérez-Correa, José R; López, Francisco

    2018-04-15

    Developing new distillation strategies can help the spirits industry to improve quality, safety and process efficiency. Batch stills equipped with a packed column and an internal partial condenser are an innovative experimental system, allowing a fast and flexible management of the rectification. In this study, the impact of four factors (heart-cut volume, head-cut volume, pH and cooling flow rate of the internal partial condenser during the head-cut fraction) on 18 major volatile compounds of Muscat spirits was optimized using response surface methodology and desirability function approaches. Results have shown that high rectification at the beginning of the heart-cut enhances the overall positive aroma compounds of the product, reducing off-flavor compounds. In contrast, optimum levels of heart-cut volume, head-cut volume and pH factors varied depending on the process goal. Finally, three optimal operational conditions (head off-flavors reduction, flowery terpenic enhancement and fruity ester enhancement) were evaluated by chemical and sensory analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
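The desirability-function approach mentioned above combines the fitted responses into a single objective: each response is mapped to [0, 1] and the geometric mean is maximized. A minimal sketch using the standard Derringer-Suich forms (the scores and thresholds below are invented):

```python
import numpy as np

def desirability_max(y, low, target, s=1.0):
    """Larger-is-better desirability: 0 at/below `low`, 1 at/above `target`."""
    d = (y - low) / (target - low)
    return float(np.clip(d, 0.0, 1.0) ** s)

def desirability_min(y, target, high, s=1.0):
    """Smaller-is-better desirability: 1 at/below `target`, 0 at/above `high`."""
    d = (high - y) / (high - target)
    return float(np.clip(d, 0.0, 1.0) ** s)

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities."""
    ds = np.asarray(ds, dtype=float)
    return float(ds.prod() ** (1.0 / len(ds)))

# E.g. maximize a flowery-terpene score while minimizing a head off-flavor score:
d1 = desirability_max(7.5, low=5.0, target=10.0)  # 0.5
d2 = desirability_min(2.0, target=1.0, high=3.0)  # 0.5
D = overall_desirability([d1, d2])                # 0.5
```

Because the geometric mean is zero whenever any single desirability is zero, an operating point that completely fails one goal is rejected outright, which is why this aggregation suits conflicting distillation objectives.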

  6. Methodologies and intelligent systems for technology enhanced learning

    CERN Document Server

    Gennari, Rosella; Vitorini, Pierpaolo; Vicari, Rosa; Prieta, Fernando

    2014-01-01

    This volume presents recent research on methodologies and intelligent systems for technology enhanced learning. It contains the contributions of the ebuTEL 2013 conference, which took place in Trento, Italy, on September 16th, 2013, and of the mis4TEL 2014 conference, which took place in Salamanca, Spain, on September 4th-6th, 2014. This conference series is an open forum for discussing intelligent systems for technology enhanced learning and empirical methodologies for their design and evaluation.

  7. Performance analysis for disposal of mixed low-level waste. 1: Methodology

    International Nuclear Information System (INIS)

    Waters, R.D.; Gruebel, M.M.

    1999-01-01

    A simple methodology has been developed for evaluating the technical capabilities of potential sites for disposal of mixed low-level radioactive waste. The results of the evaluation are expressed as permissible radionuclide concentrations in disposed waste. The methodology includes an analysis of three separate pathways: (1) releases of radionuclides to groundwater; (2) releases of potentially volatile radionuclides to the atmosphere; and (3) the consequences of inadvertent intrusion into a disposal facility. For each radionuclide, its limiting permissible concentration in disposed waste is the lowest of the permissible concentrations determined from each of the three pathways. These permissible concentrations in waste at an evaluated site can be used to assess the capability of the site to dispose of waste streams containing multiple radionuclides
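The limiting-concentration rule described above, together with the usual way such limits are applied to a waste stream containing several radionuclides, can be sketched as follows (the sum-of-fractions acceptance rule is a standard convention assumed here, and all numbers are invented):

```python
def limiting_concentration(pathway_limits):
    """Limiting permissible concentration for one radionuclide:
    the lowest of the per-pathway permissible concentrations."""
    return min(pathway_limits.values())

def sum_of_fractions(concentrations, limits):
    """For a stream with several radionuclides, the usual acceptance rule is
    sum(C_i / L_i) <= 1 (assumed convention, not stated in the abstract)."""
    return sum(c / l for c, l in zip(concentrations, limits))

# Invented per-pathway permissible concentrations for one radionuclide:
pathway_limits = {"groundwater": 3.0, "atmospheric": 12.0, "intruder": 5.0}
limit = limiting_concentration(pathway_limits)       # 3.0, set by groundwater

# Hypothetical two-radionuclide stream checked against its limiting concentrations:
fraction = sum_of_fractions([1.0, 2.0], [4.0, 8.0])  # 0.25 + 0.25 = 0.5 <= 1
```

Taking the minimum across pathways mirrors the abstract's statement that the most restrictive of the three pathway analyses governs each radionuclide's permissible concentration.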

  8. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the frequency of aircraft crashes into the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but this data is not provided by this document.

  9. Analysis methodology for the post-trip return to power steam line break event

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chul Shin; Kim, Chul Woo; You, Hyung Keun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)]

    1996-06-01

    An analysis of Steam Line Break (SLB) events that result in a Return-to-Power (RTP) condition after reactor trip was performed for a postulated Yonggwang Nuclear Power Plant Unit 3 cycle 8. The analysis methodology for post-trip RTP SLB is quite different from that for non-RTP SLB and is more difficult. It was therefore necessary to develop a methodology to analyze the response of the NSSS parameters to post-trip RTP SLB events, and the fuel performance after the total reactivity exceeds criticality. In this analysis, the cases with and without offsite power were simulated crediting the 3-D reactivity feedback effect due to local heatup in the vicinity of the stuck CEA, and compared with the cases without 3-D reactivity feedback with respect to post-trip fuel performance: Departure from Nucleate Boiling Ratio (DNBR) and Linear Heat Generation Rate (LHGR). 36 tabs., 32 figs., 11 refs. (Author)

  10. Studying creativity training programs: A methodological analysis

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

    Throughout decades of creativity research, a range of creativity training programs have been developed, tested, and analyzed. In 2004 Scott and colleagues published a meta‐analysis of all creativity training programs to date, and the review presented here set out to identify and analyze studies...... published since the seminal 2004 review. Focusing on quantitative studies of creativity training programs for adults, our systematic review resulted in 22 publications. All studies were analyzed, but comparing the reported effectiveness of training across studies proved difficult due to methodological...... inconsistencies, variations in reporting of results, as well as types of measures used. Thus a consensus for future studies is called for to answer the question: Which elements make one creativity training program more effective than another? This is a question of equal relevance to academia and industry...

  11. Ruling the Commons. Introducing a new methodology for the analysis of historical commons

    Directory of Open Access Journals (Sweden)

    Tine de Moor

    2016-10-01

    Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area within commons studies. In recent years an international team of historians has worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge of the dynamics of institutions for collective action – in particular commons. This project aims to contribute to the current debate on commons on three different fronts. Theoretically, it explicitly draws our attention to issues of change and adaptation in the commons – contrasting with more static analyses. Empirically, it highlights the value of historical records as a rich source of information for longitudinal analysis of the functioning of commons. Methodologically, it develops a systematic way of analyzing and comparing commons’ regulations across regions and time, setting a number of variables that have been defined on the basis of the “most common denominators” in commons regulation across countries and time periods. In this paper we introduce the project, describe our sources and methodology, and present the preliminary results of our analysis.

  12. Analysis of the high water wave volume for the Sava River near Zagreb

    Science.gov (United States)

    Trninic, Dusan

    2010-05-01

    The paper analyses the volumes of the Sava River high water waves near Zagreb during the period 1926-2008 (N = 83 years), which is needed for more efficient control of high and flood waters. The primary Sava flood control structures in the City of Zagreb are dikes built on both riverbanks and the Odra Relief Canal with a lateral spillway upstream from the City of Zagreb. Intensive morphological changes in the greater Sava area near Zagreb, together with anthropogenic and climate variations and changes in the Sava catchment up to the Zagreb area, require a detailed analysis of the water wave characteristics. In one analysis, maximum annual volumes are calculated for high water waves with constant durations of 10, 20, 30, 40, 50 and 60 days. Such calculations encompass the total quantity of water (base and surface runoff). The log Pearson III distribution is fitted to this series of maximum annual volumes. Based on the results obtained, interrelations are established between the wave volume as a function of duration and occurrence probability. In addition to the analysis of maximum volumes of constant duration, it is also of interest to analyse the maximum volume in excess of a reference discharge, since this is very important for flood control. To determine the reference discharges, discharges of specific durations are taken from an average discharge duration curve; the adopted reference discharges have durations of 50, 40, 30, 20 and 10%. As in the previous case, the log Pearson III distribution is fitted to the maximum wave data series. For the reference discharge Q = 604 m3/s (duration 10%), a linear trend of the maximum annual volumes exceeding the reference discharge is calculated for the Sava near Zagreb during the analyzed period. The analysis results show a significantly decreasing trend.
    A similar analysis is carried out for the following three reference discharges: that for regular flood control measures at the Sava near Zagreb, which are proclaimed when the water level is 350 cm
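The frequency-analysis step described above (fitting a log Pearson III distribution to annual maxima and reading off volumes for given occurrence probabilities) can be sketched as follows. The data are synthetic placeholders, not the Sava record, and fitting `scipy.stats.pearson3` to the log-transformed maxima is one common reading of "log Pearson III":

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical annual-maximum 30-day wave volumes (10^6 m^3); illustrative only.
volumes = rng.lognormal(mean=6.0, sigma=0.4, size=83)

# Log Pearson III: fit a Pearson type III distribution to log10 of the maxima.
log_v = np.log10(volumes)
skew, loc, scale = stats.pearson3.fit(log_v)

def design_volume(T):
    """Volume exceeded on average once in T years (return period T)."""
    p_non_exceed = 1.0 - 1.0 / T
    return 10 ** stats.pearson3.ppf(p_non_exceed, skew, loc=loc, scale=scale)

for T in (10, 100):
    print(f"T = {T:3d} yr: {design_volume(T):.0f} x 10^6 m^3")
```

The same fitted distribution then yields the volume-duration-probability interrelations by repeating the fit for each wave duration.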

  13. Optimising the education of responsible shift personnel in nuclear power plants. Volume 1 for Chapter 3: Investigational methodology

    International Nuclear Information System (INIS)

    1985-01-01

    In line with the usual announcement procedures, an analysis was to be carried out of those activities from which capabilities, knowledge and then learning objectives can be derived in consecutive stages. In this respect, this volume contains articles on the following: the derivation of learning objectives from activities on the themes of capabilities and knowledge; the analysis of professional activity; the appraisal of the descriptors and a textual presentation of the activities. (DG) [de

  14. Yucca Mountain transportation routes: Preliminary characterization and risk analysis; Volume 2, Figures [and] Volume 3, Technical Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R. [Nevada Univ., Las Vegas, NV (United States). Transportation Research Center]

    1991-05-31

    This report presents appendices related to the preliminary assessment and risk analysis for high-level radioactive waste transportation routes to the proposed Yucca Mountain Project repository. Information includes data on population density, traffic volume, ecologically sensitive areas, and accident history.

  15. The methodology of root cause analysis for equipment failure and its application at Guangdong nuclear power stations

    International Nuclear Information System (INIS)

    Gao Ligang; Lu Qunxian

    2004-01-01

    The methodology of Equipment Failure Root Cause Analysis (RCA) is described; as a systematic analysis methodology, it comprises nine steps. Its process is explained through real examples, and six precautions for applying RCA are pointed out. The paper also summarizes the experience of RCA application at Daya Bay Nuclear Power Station and emphasizes seven key factors for RCA success, mainly concerning organization, objectives, analysts, analysis techniques, the external technical support system, the development of corrective actions, and the monitoring system for corrective actions. (authors)

  16. Real options and volume uncertainty by field development projects

    International Nuclear Information System (INIS)

    Ekern, S.; Stensland, G.

    1993-12-01

    The report concerns a study on the use of option methodology in field development projects. It shows how the value of flexibility in the different decision processes can be found by means of real option methodology. Particular attention is paid to the uncertainty concerning the volume of reserves and production capacity. The results of the study are based on the research project ''Use of real options in field development projects''. The project is partially connected to another project, ''Decisive behaviour and alternative action under uncertainty in the petroleum sector''. The main topics are as follows: an example with volume uncertainty; real options and volume uncertainty; gradual disclosure of uncertainty in the production; the value of flexible production equipment. 33 refs., 19 figs., 17 tabs
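The "value of flexibility" that real option methodology captures can be illustrated with a minimal one-period example: waiting resolves the volume uncertainty before the irreversible investment is made. All figures here are invented; the report's models are of course far richer:

```python
# One-period real-option sketch: invest now vs wait for volume information.
r = 0.05                        # discount rate per period
invest_cost = 100.0             # irreversible development cost
p = 0.5                         # probability of the high-volume outcome
v_high, v_low = 180.0, 70.0     # project value once reserves are revealed

# Invest now: commit before knowing the volume.
npv_now = p * v_high + (1 - p) * v_low - invest_cost

# Wait one period: invest only if the revealed volume makes the NPV positive.
npv_wait = (p * max(v_high - invest_cost, 0.0)
            + (1 - p) * max(v_low - invest_cost, 0.0)) / (1 + r)

value_of_flexibility = npv_wait - npv_now
print(npv_now, npv_wait, value_of_flexibility)
```

With these numbers, waiting avoids the loss in the low-volume state, so the option to defer has positive value even though investing now already has a positive NPV.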

  17. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Beyond that, the industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting business requirements and following the business strategy.

  18. A New Methodology of Spatial Cross-Correlation Analysis

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
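The global coefficient described above can be sketched as a Moran-style quadratic form over standardized variables and row-normalized spatial weights; the exact standardization and normalization below are our assumptions, not necessarily the paper's definitions:

```python
import numpy as np

def global_cross_correlation(x, y, W):
    """Moran-style global spatial cross-correlation: z_x' W z_y / n."""
    n = len(x)
    zx = (x - x.mean()) / x.std()            # standardized variable 1
    zy = (y - y.mean()) / y.std()            # standardized variable 2
    Wn = W / W.sum(axis=1, keepdims=True)    # row-normalized spatial weights
    return float(zx @ Wn @ zy) / n

# Toy example: 4 regions on a line, rook contiguity weights.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, 2.0, 3.0, 4.0])   # e.g. urbanization level
y = np.array([1.5, 2.5, 3.0, 4.5])   # e.g. economic output
print(global_cross_correlation(x, y, W))
```

A positive value indicates that high values of one variable tend to neighbor high values of the other, which is the pattern the paper examines for urbanization and economic development.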

  19. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  20. Sentiment analysis and ontology engineering an environment of computational intelligence

    CERN Document Server

    Chen, Shyi-Ming

    2016-01-01

    This edited volume provides the reader with a fully updated, in-depth treatise on the emerging principles, conceptual underpinnings, algorithms and practice of Computational Intelligence in the realization of concepts and implementation of models of sentiment analysis and ontology-oriented engineering. The volume involves studies devoted to key issues of sentiment analysis, sentiment models, and ontology engineering. The book is structured into three main parts. The first part offers a comprehensive and prudently structured exposure to the fundamentals of sentiment analysis and natural language processing. The second part consists of studies devoted to the concepts, methodologies, and algorithmic developments elaborating on fuzzy linguistic aggregation to emotion analysis, carrying out interpretability of computational sentiment models, emotion classification, sentiment-oriented information retrieval, a methodology of adaptive dynamics in knowledge acquisition. The third part includes a plethora of applica...

  1. Socioeconomic effects of the DOE Gas Centrifuge Enrichment Plant. Volume 1: methodology and analysis

    International Nuclear Information System (INIS)

    1979-01-01

    The socioeconomic effects of the Gas Centrifuge Enrichment Plant being built in Portsmouth, Ohio were studied. Chapters are devoted to labor force, housing, population changes, economic impact, method for analysis of services, analysis of service impacts, schools, and local government finance

  2. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),

  3. METHODOLOGY FOR ANALYSIS OF DECISION MAKING IN AIR NAVIGATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2011-03-01

    In research on the Air Navigation System as a complex socio-technical system, a methodology for the analysis of the human operator's decision-making has been developed. The significance of individual psychological factors, as well as the impact of socio-psychological factors on the professional activities of a human operator during the development of a flight situation from normal to catastrophic, was analyzed. On the basis of the reflexive theory of bipolar choice, the expected risks of decision-making by the Air Navigation System's operator, as influenced by the external environment, previous experience and intentions, were identified. Methods for the analysis of decision-making by the human operator of the Air Navigation System using stochastic networks have been developed. Keywords: Air Navigation System, bipolar choice, human operator, decision-making, expected risk, individual psychological factors, methodology of analysis, reflexive model, socio-psychological factors, stochastic network.

  4. Conceptual foresight of the volumes of postal money orders in the Republic of Komi

    Directory of Open Access Journals (Sweden)

    Lyubov' Aleksandrovna Kuratova

    2012-03-01

    This paper describes a methodology for forecasting the volume of postal services on the basis of statistical regression analysis, using the example of the Republic of Komi. A statistical regression model of the market of postal money orders of the Komi Republic in the period 2005–2010 is constructed, and the influence of internal and external factors on this market is investigated. A conceptual forecast of the development of the regional market of postal money orders for 2011–2012 is presented. Regression models were analyzed not only for the dynamic sequence of data, but also for sequences of data across territories, which revealed independent correlated factors that change and evolve only weakly over time. The presented results have important practical and methodological significance for predicting the volume of postal money orders as well as of other types of services.
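The trend-regression forecast described above can be sketched with an ordinary least-squares fit over the observation window; the figures below are illustrative placeholders, not the Komi data:

```python
import numpy as np

# Hypothetical annual volumes of postal money orders (thousand orders).
years = np.array([2005, 2006, 2007, 2008, 2009, 2010], dtype=float)
volume = np.array([120.0, 131.0, 138.0, 150.0, 158.0, 169.0])

# Ordinary least-squares trend line: volume ~ slope * year + intercept.
slope, intercept = np.polyfit(years, volume, 1)

# Extrapolate the trend to the forecast horizon.
for year in (2011, 2012):
    print(year, round(slope * year + intercept, 1))
```

In practice one would also test the residuals and add the internal and external factors as further regressors, as the paper describes.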

  5. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  6. Big and complex data analysis methodologies and applications

    CERN Document Server

    2017-01-01

    This volume conveys some of the surprises, puzzles and success stories in high-dimensional and complex data analysis and related fields. Its peer-reviewed contributions showcase recent advances in variable selection, estimation and prediction strategies for a host of useful models, as well as essential new developments in the field. The continued and rapid advancement of modern technology now allows scientists to collect data of increasingly unprecedented size and complexity. Examples include epigenomic data, genomic data, proteomic data, high-resolution image data, high-frequency financial data, functional and longitudinal data, and network data. Simultaneous variable selection and estimation is one of the key statistical problems involved in analyzing such big and complex data. The purpose of this book is to stimulate research and foster interaction between researchers in the area of high-dimensional data analysis. More concretely, its goals are to: 1) highlight and expand the breadth of existing methods in...

  7. Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics

    Science.gov (United States)

    Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, Agnès

    2003-06-01

    We analyze the volume distribution of natural rockfalls on different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs, for a large range of rockfall sizes (10^2-10^10 m^3), regardless of the geological settings and of the preexisting geometry of fracture patterns that are drastically different on the three studied areas. The power law distribution for rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes to possibly control the rockfall volumes. This way neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value for rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types.
This change of exponents can be driven by the material strength, which
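An exponent such as 0.5 ± 0.2 is typically obtained by maximum likelihood over volumes above a completeness cutoff (the Aki/Hill estimator); a sketch on synthetic data, not the Grenoble or Yosemite catalogs:

```python
import numpy as np

# Maximum-likelihood estimate of the exponent b of a power-law volume
# distribution P(V >= v) ~ (v / v_min)^(-b) above a completeness cutoff v_min.
rng = np.random.default_rng(1)
v_min = 1.0
b_true = 0.5
# Synthetic catalog: classical Pareto tail with exponent b_true, minimum v_min.
volumes = (rng.pareto(b_true, size=2000) + 1.0) * v_min

b_hat = len(volumes) / np.log(volumes / v_min).sum()   # Aki/Hill MLE
se = b_hat / np.sqrt(len(volumes))                     # asymptotic standard error

print(f"b = {b_hat:.2f} +/- {se:.2f}")
```

The same estimator applied separately to each catalog is one way the similarity of exponents across settings can be tested within error bars.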

  8. Shop for quality or quantity? Volumes and costs in clinical laboratories.

    Science.gov (United States)

    Barletta, Giovanni; Zaninotto, Martina; Faggian, Diego; Plebani, Mario

    2013-02-01

    The increasing need to reduce the costs of providing diagnostic laboratory services has prompted initiatives based on the centralization and consolidation of laboratory facilities. However, the majority of papers and experiences reported in the literature focus on the "cost per test", thus overlooking the real value of a laboratory service, which requires more complex economic evaluations such as cost-benefit, cost-effectiveness, and cost-utility analysis. It is important to perform cost analysis, which is no mean feat, by taking into consideration all variables affecting the final and true cost per test. The present study was conducted in order to evaluate the costs of delivering laboratory services in 20 Italian clinical laboratories using a widely accepted methodology, the so-called "activity-based costing" analysis. The finding of a trend towards a decrease in total costs, due to an increase in test volumes, attained statistical significance only for quantities of up to about 1,100,000 tests per year. For 1,800,000 tests or more, the cost per test appeared to range from 1.5 to 2.0 € irrespective of the volume. Regarding the relationship between volumes and number of staff, there is an evident linear relationship between the number of senior staff and volumes, whereas this trend is not observed for medical technologists, the degree and type of automation strongly affecting this variable. The findings of the present study confirm that the relationship between volumes and costs is not linear; since it is complex, numerous variables should be taken into account.
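The reported shape of the volume-cost curve (cost per test falling with volume and flattening at high volumes) can be reproduced with a toy activity-based costing model; all activities, costs, and variable shares below are invented for illustration, not the study's data:

```python
# Toy activity-based costing: allocate each activity's annual cost to tests,
# splitting it into a fixed part and a part that scales with test volume.
activities = {
    # activity: (annual cost in euro at 1,000,000 tests, share that varies with volume)
    "reagents":  (600_000, 1.0),   # fully variable with volume
    "staff":     (900_000, 0.4),   # partly fixed (senior staff scale, technologists less so)
    "equipment": (300_000, 0.2),
    "overheads": (200_000, 0.0),   # fixed
}

def cost_per_test(volume):
    total = 0.0
    for annual_cost, variable_share in activities.values():
        fixed = annual_cost * (1 - variable_share)
        variable = annual_cost * variable_share * (volume / 1_000_000)
        total += fixed + variable
    return total / volume

for v in (500_000, 1_100_000, 1_800_000):
    print(v, round(cost_per_test(v), 2))
```

Because the fixed block is spread over more tests, cost per test falls steeply at first and then flattens, qualitatively matching the 1.5-2.0 € plateau the study reports.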

  9. CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL

    Energy Technology Data Exchange (ETDEWEB)

    Vinson, D.

    2010-07-11

    Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in the Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF even with severe cladding breaches for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U. S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and

  10. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis

    OpenAIRE

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-01-01

    In this paper a new methodology for diagnosing skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis, using filters such as the classic, inverse and k-law nonlinear filters. The sample images were obtained by a medical specialist, and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in can...

  11. Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design

    Science.gov (United States)

    Tajino, Akira; James, Robert; Kijima, Kyoichi

    2005-01-01

    Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

  12. Different methodologies in neutron activation to approach the full analysis of environmental and nutritional samples

    International Nuclear Information System (INIS)

    Freitas, M.C.; Dionisio, I.; Dung, H.M.

    2008-01-01

    Different methodologies of neutron activation analysis (NAA) are now available at the Technological and Nuclear Institute (Sacavem, Portugal), namely Compton suppression, epithermal activation, replicate and cyclic activation, and low energy photon measurement. Prompt gamma activation analysis (PGAA) will be implemented soon. Results by instrumental NAA and PGAA on environmental and nutritional samples are discussed herein, showing that PGAA - carried out at the Institute of Isotope Research (Budapest, Hungary) - provides an effective input for assessing the relevant elements. Sensitivity enhancement in NAA by Compton suppression is also illustrated. Through a judicious combination of methodologies, practically all elements of interest in pollution and nutrition terms can be determined. (author)

  13. VOLUMNECT: measuring volumes with Kinect

    Science.gov (United States)

    Quintino Ferreira, Beatriz; Griné, Miguel; Gameiro, Duarte; Costeira, João Paulo; Sousa Santos, Beatriz

    2014-03-01

    This article presents a solution for measuring the volume of packed objects using 3D cameras (such as the Microsoft KinectTM). We target application scenarios, such as warehouses or distribution and logistics companies, where it is important to promptly compute package volumes, yet high accuracy is not pivotal. Our application automatically detects cuboid objects using the depth camera data and computes their volumes, sorting them to allow space optimization. The proposed methodology applies simple computer vision and image processing methods to a point cloud, such as connected components, morphological operations and the Harris corner detector, producing encouraging results, namely an accuracy in volume measurement of 8 mm. Aspects that can be further improved are identified; nevertheless, the current solution is already promising, turning out to be cost-effective for the envisaged scenarios.
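The final volume-from-extents step can be sketched on a synthetic point cloud; a real pipeline must first segment the cuboid from the scene and align it to its principal axes, as the paper describes with connected components and corner detection:

```python
import numpy as np

# Synthetic stand-in for a segmented, axis-aligned cuboid point cloud
# (no Kinect required): points filling a box, plus depth-sensor-like noise.
rng = np.random.default_rng(2)
dims = np.array([0.40, 0.30, 0.20])                 # true box dimensions, metres
points = rng.uniform(0.0, 1.0, (5000, 3)) * dims    # points inside the cuboid
points += rng.normal(0.0, 0.002, points.shape)      # ~2 mm measurement noise

# Estimate the cuboid volume from the axis-aligned extents of the cloud.
extents = points.max(axis=0) - points.min(axis=0)
volume = float(np.prod(extents))

print(f"estimated volume: {volume * 1e3:.2f} litres (true {np.prod(dims) * 1e3:.0f})")
```

With millimetre-level sensor noise the extents are slightly inflated, which is consistent with the article's point that such systems trade a little accuracy for speed and low cost.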

  14. An efficient methodology for the analysis of primary frequency control of electric power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D.P. [Nikola Tesla Institute, Belgrade (Yugoslavia)]; Mijailovic, S.V. [Electricity Coordinating Center, Belgrade (Yugoslavia)]

    2000-06-01

    The paper presents an efficient methodology for the analysis of primary frequency control of electric power systems. This methodology continuously monitors the electromechanical transient processes with durations that last up to 30 s, occurring after the characteristic disturbances. It covers the period of short-term dynamic processes, appearing immediately after the disturbance, in which the dynamics of the individual synchronous machines is dominant, as well as the period with the uniform movement of all generators and restoration of their voltages. The characteristics of the developed methodology were determined based on the example of real electric power interconnection formed by the electric power systems of Yugoslavia, a part of Republic of Srpska, Romania, Bulgaria, former Yugoslav Republic of Macedonia, Greece and Albania (the second UCPTE synchronous zone). (author)

  15. PRT Impact Study Pre-PRT Phase : Volume 1. Travel Analysis.

    Science.gov (United States)

    1976-03-01

    Part of a three-volume work, this report describes the analysis performed on travel data collected for the Pre-PRT Impact Study. The data analyzed consist of travel behavior, travel patterns, model utilization and travel costs of various modes of tra...

  16. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    OpenAIRE

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  17. METHODOLOGY OF THE DRUGS MARKET VOLUME MODELING ON THE EXAMPLE OF HEMOPHILIA A

    Directory of Open Access Journals (Sweden)

    N. B. Molchanova

    2015-01-01

    Hemophilia A is a serious genetic disease which, without the required therapy, may lead to disability even at an early age. The only therapeutic approach is replacement therapy with drugs of blood coagulation factor VIII (FVIII). Modeling the volume of the coagulation drugs market allows an evaluation of the level of patients' provision with the necessary therapy. The purpose of the study was to model a "perfect" market for these drugs and compare it with the real one. In modeling the market volume we used data on the number of hemophilia A patients from the federal registry, Russian and international morbidity indices, real-practice data on the average consumption of blood coagulation factor drugs, and data on drug prescription according to the standards and protocols of care. According to the standards of care delivery, the average annual volume of FVIII drug consumption amounted to 406 325 244 IU for children and 964 578 678 IU for adults, i.e. the average volume of the "perfect" market is 1 370 903 922 IU for all patients. This is 1.8 times the real volume of FVIII drugs which, according to the IMS marketing agency, amounted to 765 000 000 IU in 2013. The modeling shows that, despite relatively high patient coverage, there is potential for almost a doubling of the market.
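The abstract's arithmetic can be checked directly:

```python
# Modeled ("perfect") market volume vs the real 2013 FVIII market, as reported.
children_iu = 406_325_244
adults_iu = 964_578_678
real_market_iu = 765_000_000   # IMS figure for 2013

perfect_market_iu = children_iu + adults_iu
ratio = perfect_market_iu / real_market_iu

print(perfect_market_iu)   # 1370903922
print(round(ratio, 1))     # 1.8
```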

  18. Safety analysis methodologies for radioactive waste repositories in shallow ground

    International Nuclear Information System (INIS)

    1984-01-01

    The report is part of the IAEA Safety Series and is addressed to authorities and specialists responsible for or involved in planning, performing and/or reviewing safety assessments of shallow ground radioactive waste repositories. It discusses approaches that are applicable to the safety analysis of a shallow ground repository. The methodologies, analysis techniques and models described are pertinent to the task of predicting the long-term performance of a shallow ground disposal system. They may be used during the processes of selection, confirmation and licensing of new sites and disposal systems, or to evaluate the long-term consequences in the post-sealing phase of existing operating or inactive sites. The analysis may point out the need for remedial action, or provide information to be used in deciding on the duration of surveillance. Safety analyses, both general in nature and specific to a certain repository, site or design concept, are discussed, with emphasis on deterministic and probabilistic studies.

  19. Analysis of the chemical equilibrium of combustion at constant volume

    Directory of Open Access Journals (Sweden)

    Marius BREBENEL

    2014-04-01

    Full Text Available Determining the composition of a mixture of combustion gases at a given temperature is based on chemical equilibrium, where the equilibrium constants are usually calculated under the assumption of constant pressure and temperature. In this paper, an analysis of the changes occurring when combustion takes place at constant volume is presented, and a specific formula for the equilibrium constant is derived. The simple reaction of carbon combustion in pure oxygen is then considered in both cases (constant pressure and constant volume) as an example of application, observing the changes in the composition of the combustion gases as a function of temperature.
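
    The constant-pressure and constant-volume descriptions of equilibrium are linked by a standard textbook relation (given here as general background, not the paper's specific derivation). Writing $K_p$ over partial pressures and $K_c$ over molar concentrations, the ideal-gas law $p_i = c_i R T$ gives

```latex
K_p \;=\; \prod_i p_i^{\,\nu_i}
    \;=\; \prod_i \left(c_i R T\right)^{\nu_i}
    \;=\; K_c \,(R T)^{\Delta\nu},
\qquad
\Delta\nu \;=\; \sum_{i\,\in\,\text{gas}} \nu_i .
```

    For the example reaction C(s) + O2 -> CO2 treated in the paper, $\Delta\nu = 0$ for the gas phase, so $K_p$ and $K_c$ coincide; when $\Delta\nu \neq 0$, fixing the volume rather than the pressure shifts the equilibrium composition, which is the effect the paper quantifies.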

  20. Development of Thermal-hydraulic Analysis Methodology for Multi-module Breeding Blankets in K-DEMO

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Geon-Woo; Lee, Jeong-Hun; Park, Goon-Cherl; Cho, Hyoung-Kyu [Seoul National University, Seoul (Korea, Republic of); Im, Kihak [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    In this paper, the purpose of the analyses is to extend the capability of MARS-KS to the entire blanket system, which includes a few hundred individual blanket modules. The plan for the whole-blanket-system analysis using MARS-KS is introduced, and the results of the multiple-blanket-module analysis are summarized. MARS-KS, a thermal-hydraulic analysis code for nuclear reactor safety, was applied to the thermal analysis of the K-DEMO breeding blanket conceptual design. A methodology to simulate multiple blanket modules was then proposed, which uses a supervisor program to first handle each blanket module individually and then distribute the flow rate considering the pressure drop arising in each module. As a feasibility test of the proposed methodology, 10 outboard blankets in a toroidal field sector, connected with each other through the inlet and outlet common headers, were simulated. The calculated flow rates, pressure drops, and temperatures showed the validity of the calculation, and thanks to parallelization using MPI, an almost linear speed-up was obtained.
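
    The header-coupled flow split described above can be illustrated with a minimal sketch (hypothetical loss coefficients; assuming each module's pressure drop behaves as dp = k*q**2 and all modules share common inlet and outlet headers, so every module sees the same pressure drop):

```python
import math

def distribute_flow(q_total, k):
    """Split a total header flow among parallel modules so that every
    module sees the same pressure drop, assuming dp_i = k_i * q_i**2."""
    s = sum(1.0 / math.sqrt(ki) for ki in k)
    dp = (q_total / s) ** 2                 # common pressure drop
    q = [math.sqrt(dp / ki) for ki in k]    # per-module flow rates
    return dp, q

# Two modules, the second four times more resistive:
dp, q = distribute_flow(3.0, [1.0, 4.0])
print(dp, q)  # 4.0 [2.0, 1.0]
```

The real supervisor iterates with full MARS-KS module solutions instead of an algebraic loss law, but the balancing principle is the same.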

  1. Ovarian volume and antral follicle count assessed by MRI and transvaginal ultrasonography: a methodological study.

    Science.gov (United States)

    Leonhardt, Henrik; Gull, Berit; Stener-Victorin, Elisabet; Hellström, Mikael

    2014-03-01

    Ultrasonographic measurements of ovarian volume and antral follicle count are of clinical importance as diagnostic features of polycystic ovarian syndrome (PCOS), and as a parameter in estimation of ovarian follicular reserve in infertility care. To compare two-dimensional (2D)/three-dimensional (3D) transvaginal ultrasonography (TVUS) and magnetic resonance imaging (MRI) for estimation of ovarian volume and antral follicle count, and to assess reproducibility and inter-observer agreement of MRI measurements. Volumes of 172 ovaries in 99 women aged 21-37 years were calculated (length x width x height x 0.523) with conventional 2D TVUS and 2D MRI. Semi-automatic estimates of ovarian volumes were obtained by 3D MRI. Antral follicles were counted manually on 2D MRI and automatically by 3D TVUS (SonoAVC), and stratified according to follicle size. Mean ovarian volume assessed by 2D TVUS (13.1 ± 6.4 mL) was larger than assessed by 2D MRI (9.6 ± 4.1) and 3D MRI (11.4 ± 4.5) (P 0.77. 2D MRI reveals more antral follicles, especially of small size, than 3D TVUS. Ovarian volume estimation by MRI provides smaller volumes than by the reference standard 2D TVUS. Ovarian volume estimation by 3D MRI, allowing independence of non-ellipsoid ovarian shape measurement errors, provides volumes closer to 2D TVUS values than does 2D MRI. Reproducibility and inter-observer agreement of 2D MRI measurements of ovarian volume and total follicle count are good.
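
    The ellipsoid formula quoted in the abstract (where 0.523 is the rounded value of pi/6) is straightforward to apply; a small sketch with hypothetical measurements:

```python
def ellipsoid_volume_ml(length_cm, width_cm, height_cm):
    """Ellipsoid approximation used in the abstract above:
    V = length x width x height x 0.523 (0.523 ~ pi/6)."""
    return length_cm * width_cm * height_cm * 0.523

# Hypothetical ovary measuring 4 x 3 x 2 cm:
print(round(ellipsoid_volume_ml(4.0, 3.0, 2.0), 3))  # 12.552 mL
```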

  2. Left ventricular pressure and volume data acquisition and analysis using LabVIEW.

    Science.gov (United States)

    Cassidy, S C; Teitel, D F

    1997-03-01

    To automate analysis of left ventricular pressure-volume data, we used LabVIEW to create applications that digitize and display data recorded from conductance and manometric catheters. The applications separate data into cardiac cycles, calculate parallel conductance, and calculate indices of left ventricular function, including end-systolic elastance, preload-recruitable stroke work, stroke volume, ejection fraction, stroke work, maximum and minimum derivative of ventricular pressure, heart rate, indices of relaxation, peak filling rate, and ventricular chamber stiffness. Pressure-volume loops can be displayed graphically, and the analyses are exported to a text file. These applications have simplified and automated the process of evaluating ventricular function.
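
    A few of the listed indices are simple to derive from digitized pressure-volume samples; a minimal sketch with synthetic values (not the LabVIEW implementation, and the sample data are hypothetical):

```python
def stroke_volume(edv_ml, esv_ml):
    """Stroke volume from end-diastolic and end-systolic volumes."""
    return edv_ml - esv_ml

def ejection_fraction(edv_ml, esv_ml):
    """Fraction of the end-diastolic volume ejected per beat."""
    return (edv_ml - esv_ml) / edv_ml

def max_dp_dt(pressure_mmhg, fs_hz):
    """Maximum derivative of ventricular pressure from sampled data
    (finite differences scaled by the sampling rate)."""
    return max((b - a) * fs_hz for a, b in zip(pressure_mmhg, pressure_mmhg[1:]))

print(stroke_volume(120, 50))                 # 70 mL
print(round(ejection_fraction(120, 50), 2))   # 0.58
```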

  3. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Science.gov (United States)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
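
    One common way to make flood risk quantitative before and after mitigation, shown here as a generic illustration rather than the SUFRI formulation, is expected annual damage: the integral of damage over annual exceedance probability, typically evaluated by the trapezoidal rule over a set of design events:

```python
def expected_annual_damage(exc_probs, damages):
    """Trapezoidal integration of the damage vs. annual-exceedance-
    probability curve; exc_probs must be given in descending order."""
    return sum(
        (p0 - p1) * (d0 + d1) / 2.0
        for (p0, d0), (p1, d1) in zip(zip(exc_probs, damages),
                                      zip(exc_probs[1:], damages[1:]))
    )

# Hypothetical curve: frequent events do little damage, rare ones a lot.
ead = expected_annual_damage([0.5, 0.1, 0.01], [0.0, 100.0, 500.0])
print(round(ead, 1))  # 47.0 (monetary units per year)
```

Re-evaluating the curve with post-measure damages gives the risk reduction attributable to the non-structural measures.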

  4. Peripheral i.v. analysis (PIVA) of venous waveforms for volume assessment in patients undergoing haemodialysis.

    Science.gov (United States)

    Hocking, K M; Alvis, B D; Baudenbacher, F; Boyer, R; Brophy, C M; Beer, I; Eagle, S

    2017-12-01

    The assessment of intravascular volume status remains a challenge for clinicians. Peripheral i.v. analysis (PIVA) is a method for analysing the peripheral venous waveform that has been used to monitor volume status. We present a proof-of-concept study evaluating the efficacy of PIVA in detecting changes in fluid volume. We enrolled 37 hospitalized patients undergoing haemodialysis (HD) as a controlled model for intravascular volume loss. Respiratory rate (F0) and pulse rate (F1) frequencies were measured. The PIVA signal was obtained by fast Fourier analysis of the venous waveform, followed by weighting the magnitude of the amplitude at the pulse rate frequency. PIVA was compared with peripheral venous pressure and standard monitoring of vital signs. Regression analysis showed a linear correlation between volume loss and change in the PIVA signal (R2=0.77). Receiver operator curves demonstrated that the PIVA signal showed an area under the curve of 0.89 for detection of a 20 ml kg-1 change in volume. There was no correlation between volume loss and peripheral venous pressure, blood pressure or pulse rate. PIVA-derived pulse rate and respiratory rate were consistent with similar numbers derived from the bio-impedance and electrical signals from the electrocardiogram. PIVA is a minimally invasive, novel modality for detecting changes in fluid volume status, respiratory rate and pulse rate in spontaneously breathing patients with peripheral i.v. cannulas. © The Author 2017. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com
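
    The core signal-processing step, reading the spectral amplitude of the venous waveform at the pulse-rate frequency, can be sketched on a synthetic waveform (the frequencies and amplitudes below are assumptions for illustration, not the authors' exact algorithm):

```python
import numpy as np

fs = 100.0                        # sampling rate, Hz
t = np.arange(0.0, 10.0, 1.0 / fs)
# Synthetic venous waveform: respiratory (F0) plus pulse-rate (F1) component.
wave = 0.3 * np.sin(2 * np.pi * 0.25 * t) + 0.5 * np.sin(2 * np.pi * 1.2 * t)

spectrum = np.fft.rfft(wave)
freqs = np.fft.rfftfreq(wave.size, 1.0 / fs)
amplitude = 2.0 * np.abs(spectrum) / wave.size   # single-sided amplitude

k = np.argmin(np.abs(freqs - 1.2))   # bin nearest the pulse-rate frequency
print(round(float(amplitude[k]), 1))  # 0.5, the injected pulse amplitude
```

In the clinical setting this amplitude falls as intravascular volume is removed, which is the correlation the study reports.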

  5. Application of fault tree methodology in the risk analysis of complex systems

    International Nuclear Information System (INIS)

    Vasconcelos, V. de.

    1984-01-01

    This study describes the fault tree methodology and applies it to the risk assessment of complex facilities. In the description of the methodology, an attempt has been made to provide all the pertinent basic information, pointing out its more important aspects, such as fault tree construction, evaluation techniques, and their use in the risk and reliability assessment of a system. In view of their importance, topics like common mode failures, human errors, the databases used in the calculations, and uncertainty evaluation of the results are discussed separately, each in its own chapter. For the purpose of applying the methodology, it was necessary to implement computer codes normally used for this kind of analysis. The computer codes PREP, KITT and SAMPLE, written in FORTRAN IV, were chosen due to their availability and to the fact that they have been used in important studies of the nuclear area, like WASH-1400. With these codes, the probability of occurrence of excessive pressure in the main system of the component test loop (CTC) of CDTN was evaluated. (Author) [pt
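
    The quantitative core of fault tree evaluation is combining basic-event probabilities through the gates; a minimal sketch with hypothetical basic events (assuming independent events, as classical point-estimate evaluations of the PREP/KITT style do):

```python
from functools import reduce

def p_and(probs):
    """AND gate: all inputs must fail (independent events)."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def p_or(probs):
    """OR gate: at least one input fails."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical top event: relief valve fails AND (sensor fails OR operator error).
top = p_and([0.01, p_or([0.05, 0.1])])
print(round(top, 6))  # 0.00145
```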

  6. More Than Just a Discursive Practice? Conceptual Principles and Methodological Aspects of Dispositif Analysis

    Directory of Open Access Journals (Sweden)

    Andrea D. Bührmann

    2007-05-01

    Full Text Available This article gives an introduction to the conceptual and practical field of dispositif analysis, a field that is of great importance but as yet underdeveloped. To provide this introduction, we first explain the terms discourse and dispositif. We then examine the conceptual instruments and methodological procedures of dispositif analysis. In this way, we define the relations between discourse and (a) non-discursive practices, (b) subjectification, (c) everyday orders of knowledge and (d) institutional practices, as well as societal changes, as central issues of dispositif analysis. Furthermore, we point out the methodological possibilities and limitations of dispositif analysis, and demonstrate them with some practical examples. In general, this article aims to extend the perspectives of discourse theory and research by stressing the relations between normative orders of knowledge, their effects on interactions, and the individual self-reflections connected with them. URN: urn:nbn:de:0114-fqs0702281

  7. Best-estimate methodology for analysis of anticipated transients without scram in pressurized water reactors

    International Nuclear Information System (INIS)

    Rebollo, L.

    1993-01-01

    Union Fenosa, a utility company in Spain, has performed research on pressurized water reactor (PWR) safety with respect to the development of a best-estimate methodology for the analysis of anticipated transients without scram (ATWS), i.e., those anticipated transients for which failure of the reactor protection system is postulated. A scientific and technical approach is adopted with respect to the ATWS phenomenon as it affects a PWR, specifically the Zorita nuclear power plant, a single-loop Westinghouse-designed PWR in Spain. In this respect, an ATWS sequence analysis methodology based on published codes, generically applicable to any PWR, is proposed, which covers all the anticipated phenomena and defines the applicable acceptance criteria. The areas contemplated are cell neutron analysis, core thermal hydraulics, and plant dynamics, which are developed, qualified, and validated by comparison with reference calculations and measurements obtained from integral or separate-effects tests.

  8. Mixing volume determination in batch transfers through sonic detectors

    Energy Technology Data Exchange (ETDEWEB)

    Baptista, Renan Martins [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). Centro de Pesquisas]. E-mail: renan@cenpes.petrobras.com.br; Rachid, Felipe Bastos de Freitas [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Dept. de Engenharia Mecanica]. E-mail: rachid@mec.uff.br; Araujo, Jose Henrique Carneiro de [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Dept. de Ciencia da Computacao]. E-mail: jhca@dcc.ic.uff.br

    2000-07-01

    An experimental methodology for evaluating mixing volumes in batch transfers by means of sonic detectors is reported in this paper. Mixing volumes were then computed for a diesel/gasoline transfer carried out through a pipeline operated by Petrobras, for different interface points. It is shown that an adequate choice of the interface points is crucial for keeping the mixing volume uncertainty within acceptable limits. (author)

  9. A retrospective analysis of complications of large volume liposuction; local perspective from a third world country

    International Nuclear Information System (INIS)

    Arshad, S.M.; Latif, S.; Altaf, H.N.

    2017-01-01

    This study aimed to evaluate the complications that occurred in patients undergoing large volume liposuction, and to see if there was a correlation between the amount of aspirate and the rate of complications. Methodology: A detailed history, complete physical examination, BMI, and anthropometric measurements were documented for all patients. All patients underwent liposuction using the tumescent technique under general anesthesia at Yusra General Hospital, and were discharged home after 24 to 48 hours. Pressure garments were advised for 6 weeks, and patients were called for weekly follow-up over 6 weeks. Complications were documented, and SPSS version 20 was used for data analysis. Results: Of 217 patients, 163 (75%) were female and 54 male. Mean age was 37.1 ± 6.7 years. Bruising and seroma were the most common complications, at 4.1% and 2.3%, respectively. The incidence of infection was 0.9%. One patient had over-correction and four patients (1.8%) had under-correction. Significant blood loss was encountered in one patient. Two patients (0.9%) had pulmonary embolism and 2 (0.9%) suffered from necrotizing fasciitis. None of our patients undergoing large volume liposuction had fat embolism, and there was no mortality. Conclusion: Careful patient selection and strict adherence to guidelines can ensure a good outcome and minimize the risk of complications. Both physicians and patients should be educated to have realistic expectations, to avoid complications and improve patient safety. (author)

  10. Methodologies and Intelligent Systems for Technology Enhanced Learning

    CERN Document Server

    Gennari, Rosella; Vittorini, Pierpaolo; Prieta, Fernando

    2015-01-01

    This volume presents recent research on Methodologies and Intelligent Systems for Technology Enhanced Learning. It contains the contributions of MIS4TEL 2015, which took place in Salamanca, Spain, from June 3rd to 5th, 2015. Like the previous edition, the conference and its proceedings are an open forum for discussing intelligent systems for Technology Enhanced Learning and empirical methodologies for their design or evaluation. The MIS4TEL’15 conference was organized by the University of L’Aquila, the Free University of Bozen-Bolzano and the University of Salamanca.

  11. The TE coupled RELAP5/PANTHER/COBRA code package and methodology for integrated PWR accident analysis

    International Nuclear Information System (INIS)

    Schneidesch, Christophe R.; Zhang, Jinzhao; Ammirabile, Luca; Dalleur, Jean-Paul

    2006-01-01

    At Tractebel Engineering (TE), a dynamic coupling has been developed between the best estimate thermal hydraulics system code RELAP5 and the 3-dimensional neutronics code PANTHER via the transient analysis code linkage program TALINK. An interface between PANTHER and the subchannel core thermal-hydraulic analysis code COBRA 3C has been established for on-line calculation of the Departure from Nucleate Boiling Ratio (DNBR). In addition to the standard RELAP5-PANTHER coupling, the fully dynamic coupling of the RELAP5/PANTHER/COBRA3C-TE code package can be activated for evaluation purposes, in which the PANTHER closed-channel thermal-hydraulics module is replaced by COBRA3C-TE, with cross flow modelling and extended T-H flow condition capabilities. The qualification of the RELAP5-PANTHER coupling demonstrated the robustness achieved by the combined 3-D neutron kinetics/system T-H code package for transient simulations. The coupled TE code package has been approved by the Belgian Safety Authorities and is used at TE for analyzing asymmetric PWR accidents with strong core-system interactions. In particular, the TE coupled code package was first used to develop a main steam line break in hot shutdown conditions (SLBHZP) accident analysis methodology based on the TE deterministic bounding approach. This methodology has been reviewed and accepted by the Belgian Safety Authorities for specific applications. Those specific applications are related to the power up-rate and steam generator replacement project of the Doel 2 plant or to the Tihange-3 SLB accident re-analysis. A coupled feedwater line break (FLB) accident analysis methodology is currently being reviewed for application approval. The results of coupled thermal-hydraulic and neutronic analysis of SLB and FLB show that there exist important margins in the traditional final safety analysis report (FSAR) accident analysis. Those margins can be used to increase the operational flexibility of the plants. 
Moreover, the


  13. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small volume samples. Pulsed-dc-ESI utilizes a constant high voltage to remotely induce the generation of a single-polarity pulsed electrospray. This method significantly boosts sample economy, so as to obtain several minutes of MS signal duration from a sample of merely picoliter volume. The elongated MS signal duration enables us to collect abundant MS(2) information on components of interest in a small volume sample for systematic analysis. The method was successfully applied to single cell metabolomics analysis. We obtained 2-D profiles of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells, covering 1034 components and 656 components for Allium cepa and HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small volume samples.

  14. Use of lean and six sigma methodology to improve operating room efficiency in a high-volume tertiary-care academic medical center.

    Science.gov (United States)

    Cima, Robert R; Brown, Michael J; Hebl, James R; Moore, Robin; Rogers, James C; Kollengode, Anantha; Amstutz, Gwendolyn J; Weisbrod, Cheryl A; Narr, Bradly J; Deschamps, Claude

    2011-07-01

    Operating rooms (ORs) are resource-intensive and costly hospital units. Maximizing OR efficiency is essential to maintaining an economically viable institution. OR efficiency projects often focus on a limited number of ORs or cases; efforts across an entire OR suite have not been reported. Lean and Six Sigma methodologies were developed in the manufacturing industry to increase efficiency by eliminating non-value-added steps. We applied Lean and Six Sigma methodologies across an entire surgical suite to improve efficiency. A multidisciplinary surgical process improvement team constructed a value stream map of the entire surgical process, from the decision for surgery to discharge. Each process step was analyzed in 3 domains, i.e., personnel, information processed, and time. Multidisciplinary teams addressed 5 work streams to increase value at each step: minimizing volume variation; streamlining the preoperative process; reducing nonoperative time; eliminating redundant information; and promoting employee engagement. Process improvements were implemented sequentially in surgical specialties. Key performance metrics were collected before and after implementation. Across 3 surgical specialties, process redesign resulted in substantial improvements in on-time starts and a reduction in the number of cases past 5 pm. Substantial gains were achieved in nonoperative time, staff overtime, and ORs saved. These changes resulted in substantial increases in margin/OR/day. Use of Lean and Six Sigma methodologies increased OR efficiency and financial performance across an entire operating suite. Process mapping, leadership support, staff engagement, and sharing performance metrics are keys to enhancing OR efficiency. The performance gains were substantial, sustainable, positive financially, and transferable to other specialties. Copyright © 2011 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  15. An analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kwon, Young Min; Kim, Taek Mo; Chung, Hae Yong; Lee, Sang Jong

    1996-07-01

    An analysis methodology for the hot leg break mass and energy release is developed. For the blowdown period, a modified CEFLASH-4A analysis is suggested. For the post-blowdown period, a new computer model named COMET is developed. Unlike the previous post-blowdown analysis model FLOOD3, COMET is capable of analyzing both cold leg and hot leg break cases. The cold leg break model is essentially the same as that of FLOOD3, with some improvements. The results produced by the newly proposed hot leg break model in COMET follow the same trends as those observed in a scaled-down integral experiment. The analysis results for UCN 3 and 4 obtained with COMET are qualitatively and quantitatively in good agreement with those predicted by best-estimate analysis using RELAP5/MOD3. Therefore, the COMET code is validated and can be used for licensing analysis. 6 tabs., 82 figs., 9 refs. (Author)

  16. Development of a methodology of analysis of instabilities in BWR reactors; Desarrollo de una metodologia de analisis de inestabilidades en reactores PWR

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Fenoll, M.; Abarca, A.; Barrachina, T.; Miro, R.; Verdu, G.

    2012-07-01

    This paper presents a methodology for the analysis of instabilities in BWR-type reactors. The methodology covers modal analysis of the operating point, signal analysis techniques, and the simulation of transients with the coupled 3D RELAP5/PARCSv2.7 code.
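
    A standard stability measure in BWR instability signal analysis is the decay ratio, the ratio of successive oscillation peaks; a minimal sketch on a synthetic damped power signal (the damping ratio and frequency below are assumptions for illustration, not plant data):

```python
import math

import numpy as np

zeta, f = 0.05, 0.5                 # damping ratio, oscillation frequency (Hz)
wn = 2.0 * math.pi * f
wd = wn * math.sqrt(1.0 - zeta ** 2)
t = np.arange(0.0, 6.0, 0.001)
signal = np.exp(-zeta * wn * t) * np.sin(wd * t)

# Successive local maxima of the oscillation:
interior = signal[1:-1]
peaks = interior[(interior > signal[:-2]) & (interior > signal[2:])]
decay_ratio = peaks[1] / peaks[0]
print(round(float(decay_ratio), 2))  # 0.73 (< 1, i.e. a stable, damped mode)
```

For a pure second-order mode the peak ratio equals exp(-2*pi*zeta/sqrt(1-zeta**2)); a decay ratio approaching 1 signals the onset of instability.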

  17. Validating analysis methodologies used in burnup credit criticality calculations

    International Nuclear Information System (INIS)

    Brady, M.C.; Napolitano, D.G.

    1992-01-01

    The concept of allowing reactivity credit for the depleted (or burned) state of pressurized water reactor fuel in the licensing of spent fuel facilities introduces a new challenge to members of the nuclear criticality community. The primary difference in this analysis approach is the technical ability to calculate spent fuel compositions (or inventories) and to predict their effect on the system multiplication factor. Isotopic prediction codes are used routinely for in-core physics calculations and the prediction of radiation source terms for both thermal and shielding analyses, but represent an innovation for criticality specialists. This paper discusses two methodologies currently being developed to specifically evaluate isotopic composition and reactivity for the burnup credit concept. A comprehensive approach to benchmarking and validating the methods is also presented. This approach involves the analysis of commercial reactor critical data, fuel storage critical experiments, chemical assay isotopic data, and numerical benchmark calculations

  18. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on

  19. Application of transient analysis methodology to heat exchanger performance monitoring

    International Nuclear Information System (INIS)

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial scale heat exchangers. A Galerkin-based numerical method, with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers, is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady-state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady-state conditions is too small to permit meaningful testing, whereas an adequate heat load develops immediately after a reactor shutdown, when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to in-situ transient testing carried out at a nuclear power plant. The method, however, is applicable to any transient testing application.
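
    The idea behind transient testing can be illustrated with a far cruder model than the paper's Galerkin spectral formulation: a single-node energy balance whose response to a step in inlet temperature is governed by the exchanger conductance UA, so matching a measured transient yields UA (all parameter values below are hypothetical):

```python
mdot_cp = 50.0   # hot-stream capacity rate (kW/K), assumed
ua = 25.0        # overall heat transfer conductance (kW/K), assumed
t_cold = 20.0    # cooling-water temperature (C)
t_in = 80.0      # hot-side inlet temperature after the step (C)
cap = 500.0      # lumped thermal capacitance of metal + fluid (kJ/K)

t_metal, dt = 20.0, 0.1          # initial state; explicit-Euler step (s)
for _ in range(2000):            # 200 s of transient, ~30 time constants
    flow_in = mdot_cp * (t_in - t_metal)   # kW picked up from the hot stream
    flow_out = ua * (t_metal - t_cold)     # kW rejected to the cooling water
    t_metal += dt * (flow_in - flow_out) / cap

# Steady state from the same balance: mdot_cp*(t_in - T) = ua*(T - t_cold)
t_steady = (mdot_cp * t_in + ua * t_cold) / (mdot_cp + ua)
print(round(t_metal, 2), round(t_steady, 2))  # 60.0 60.0
```

A fouled exchanger (lower UA) approaches a hotter steady temperature along a slower curve, which is what the transient fit detects.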

  20. Methodology to analysis of aging processes of containment spray system

    International Nuclear Information System (INIS)

    Borges, D. da Silva; Lava, D.D.; Moreira, M. de L.; Ferreira Guimarães, A.C.; Fernandes da Silva, L.

    2015-01-01

    This paper presents a contribution to the study of the aging process of components in commercial Pressurized Water Reactor (PWR) plants. The motivation for this work emerged from the current nuclear perspective: numerous nuclear power plants worldwide have reached an advanced operating time. This situation requires a process to ensure the reliability of the operative systems of these plants, and therefore methodologies capable of estimating the failure probability of components and systems are necessary. In addition to the safety factors involved, such methodologies can be used to search for ways to ensure the extension of the life cycle of nuclear plants, which will otherwise inevitably undergo decommissioning after an operating time of 40 years, a process that negatively affects power generation and demands enormous investment. Thus, this paper presents modeling techniques and sensitivity analysis which, together, can generate an estimate of how the components that are more sensitive to the aging process will behave during the normal operation cycle of a nuclear power plant. (authors)
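
    Aging, an increasing failure rate with operating time, is commonly modeled with a Weibull distribution whose shape parameter beta > 1; a small sketch with hypothetical parameters (a generic illustration, not the paper's specific model):

```python
import math

def weibull_failure_prob(t, eta, beta):
    """Cumulative failure probability F(t) = 1 - exp(-(t/eta)**beta)."""
    return 1.0 - math.exp(-((t / eta) ** beta))

def weibull_hazard(t, eta, beta):
    """Instantaneous failure rate h(t) = (beta/eta) * (t/eta)**(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

eta, beta = 40.0, 2.5   # characteristic life ~40 years; beta > 1 means aging
print(round(weibull_failure_prob(40.0, eta, beta), 3))  # 0.632 at t = eta
print(weibull_hazard(30.0, eta, beta) > weibull_hazard(10.0, eta, beta))  # True
```

A sensitivity analysis then varies eta and beta per component to rank which ones dominate end-of-life risk.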

  1. A powerful methodology for reactor vessel pressurized thermal shock analysis

    International Nuclear Information System (INIS)

    Boucau, J.; Mager, T.

    1994-01-01

    Recent operating experience of the Pressurized Water Reactor (PWR) industry has focused increasing attention on the issue of reactor vessel pressurized thermal shock (PTS). More specifically, the review of the older WWER-type reactors (WWER 440/230) has indicated sensitivity to neutron embrittlement. This has already led to remedial actions, including safety injection water preheating and vessel annealing. Such measures are usually taken based on the analysis of a selected number of conservative PTS events. Consideration of all postulated cooldown events would draw attention to the impact of operator actions and control system effects on reactor vessel PTS. Westinghouse has developed a methodology which couples event sequence analysis with probabilistic fracture mechanics analyses to identify those events that are of primary concern for reactor vessel integrity. Operating experience is utilized to aid in defining the appropriate event sequences and event frequencies of occurrence for the evaluation. Once the event sequences of concern are identified, detailed deterministic thermal-hydraulic and structural evaluations can be performed to determine the conditions required to minimize the extension of postulated flaws or enhance flaw arrest in the reactor vessel. The results of these analyses can then be used to better define further modifications to vessel and plant system design and to operating procedures. The purpose of the present paper is to describe this methodology and to show its benefits for decision making. (author). 1 ref., 3 figs
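    The screening step of such a methodology can be sketched as ranking transients by risk contribution, i.e. event frequency times conditional probability of crack initiation. The event names, frequencies, and conditional probabilities below are hypothetical, not from the record.

```python
def rank_events(events):
    """Rank cooldown transients by vessel-risk contribution:
    frequency of occurrence times conditional failure probability."""
    return sorted(events, key=lambda e: e["freq"] * e["cond_fail"], reverse=True)

# Hypothetical event sequences with per-year frequencies
events = [
    {"name": "small_loca",       "freq": 5e-3, "cond_fail": 1e-4},
    {"name": "steamline_break",  "freq": 1e-4, "cond_fail": 1e-2},
    {"name": "stuck_open_valve", "freq": 1e-2, "cond_fail": 1e-6},
]
ranked = rank_events(events)
```

    Only the top-ranked sequences would then receive the detailed deterministic thermal-hydraulic and fracture mechanics treatment.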

  2. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for a statistical trend analysis (STA) of failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations between the statistical trend analysis and probabilistic risk assessment (PRA) are discussed in terms of the categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
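    A standard statistical test for trend in rare failure events (not necessarily the one used in the record) is the Laplace test for a Poisson process; it is a reasonable minimal illustration of trend significance testing. The event times below are hypothetical.

```python
import math

def laplace_trend_statistic(event_times, horizon):
    """Laplace test for trend in a Poisson process observed on [0, horizon].
    Under the no-trend null, U is approximately standard normal; a large
    positive U suggests an increasing failure rate, a negative U a decreasing one."""
    n = len(event_times)
    u = (sum(event_times) / n - horizon / 2.0) / (horizon * math.sqrt(1.0 / (12.0 * n)))
    return u

# Hypothetical failure times bunched late in the observation window
increasing = [60.0, 70.0, 80.0, 90.0, 95.0]
u = laplace_trend_statistic(increasing, horizon=100.0)
```

    Here u exceeds the one-sided 5% critical value of 1.645, so the tentative increasing trend would be flagged as significant.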

  3. PSA methodology development and application in Japan

    International Nuclear Information System (INIS)

    Kazuo Sato; Toshiaki Tobioka; Kiyoharu Abe

    1987-01-01

    The outlines of Japanese activities on the development and application of probabilistic safety assessment (PSA) methodologies are described. First, the methodology development activities are described for system reliability analysis, operational data analysis, core melt accident analysis, environmental consequence analysis and seismic risk analysis. Then, examples of methodology application by the regulatory side and the industry side are described. (author)

  4. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Directory of Open Access Journals (Sweden)

    I. Escuder-Bueno

    2012-09-01

    Full Text Available Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces were developed in the period 2009–2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis for estimating current risk from a social perspective and identifying tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of the developed approach. The main advantage of the methodology presented herein is that it provides a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. This can be of great interest for decision makers as it provides rational and solid information.
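    A common way to collapse a quantitative flood risk curve into a single number, before and after mitigation, is the expected annual damage (EAD) obtained by integrating damage over annual exceedance probability. This is a generic sketch, not the SUFRI computation; the (probability, damage) pairs are hypothetical.

```python
def expected_annual_damage(events):
    """Trapezoidal integration of damage over annual exceedance probability."""
    events = sorted(events)                       # (probability, damage) pairs
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(events, events[1:]):
        ead += (p1 - p0) * (d0 + d1) / 2.0
    return ead

# Hypothetical risk curves before and after non-structural measures
before = [(0.01, 50e6), (0.1, 10e6), (0.5, 1e6)]
after_measures = [(0.01, 30e6), (0.1, 4e6), (0.5, 0.5e6)]
benefit = expected_annual_damage(before) - expected_annual_damage(after_measures)
```

    Comparing the two EAD values quantifies the annual benefit of investing in the mitigation measures.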

  5. Methodology for qualitative content analysis with the technique of mind maps using Nvivo and FreeMind softwares

    Directory of Open Access Journals (Sweden)

    José Leonardo Oliveira Lima

    2016-12-01

    Full Text Available Introduction: In a survey it is not enough to choose tools, resources and procedures. It is important to understand the method beyond the techniques and its relationship with philosophy, epistemology and methodology. Objective: To discuss theoretical and methodological concerns about qualitative research in Information Science and the process of Qualitative Content Analysis (QCA) in the field of user studies, and to show a path of QCA integrated with the mind map technique for developing categories and indicators, using Qualitative Data Analysis Software (QDAS) and mind-mapping tools. Methodology: The research was descriptive, methodological and bibliographical, with fieldwork conducted through open interviews that were processed using the QCA method with the support of the QDAS Nvivo and the FreeMind software for mind map design. Results: The theory of qualitative research and QCA is presented, along with a methodological path of QCA using the techniques and software mentioned above. Conclusions: Regarding qualitative research, the theoretical framework suggests the need for more dialogue between Information Science and other disciplines. The process of QCA evidenced: a viable path that might help further related investigations using QDAS; and the contribution of mind maps and their design software to the development of the indicators and categories of QCA.

  6. Methodology for LOCA analysis and its qualification procedures for PWR reload licensing

    International Nuclear Information System (INIS)

    Serrano, M.A.B.

    1986-01-01

    The methodology for LOCA analysis developed by FURNAS and its qualification procedure for PWR reload licensing are presented. Digital computer codes developed by the NRC and published collectively as the WREM package were modified to obtain versions that comply with each requirement of the Brazilian Licensing Criteria. This methodology is applied to the Angra-1 base case to conclude the qualification process. (Author) [pt

  7. A CERCLA-Based Decision Model to Support Remedy Selection for an Uncertain Volume of Contaminants at a DOE Facility

    Energy Technology Data Exchange (ETDEWEB)

    Christine E. Kerschus

    1999-03-31

    The Paducah Gaseous Diffusion Plant (PGDP) operated by the Department of Energy is challenged with selecting the appropriate remediation technology to clean up contaminants at Waste Area Group (WAG) 6. This research utilizes value-focused thinking and multiattribute preference theory concepts to produce a decision analysis model designed to aid the decision makers in their selection process. The model is based on CERCLA's five primary balancing criteria, tailored specifically to WAG 6 and the contaminants of concern, utilizes expert opinion and the best available engineering, cost, and performance data, and accounts for uncertainty in contaminant volume. The model ranks 23 remediation technologies (trains) in their ability to achieve the CERCLA criteria at various contaminant volumes. A sensitivity analysis is performed to examine the effects of changes in expert opinion and uncertainty in volume. Further analysis reveals how volume uncertainty is expected to affect technology cost, time and ability to meet the CERCLA criteria. The model provides the decision makers with a CERCLA-based decision analysis methodology that is objective, traceable, and robust to support the WAG 6 Feasibility Study. In addition, the model can be adjusted to address other DOE contaminated sites.
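    The core of a multiattribute preference model of this kind is typically an additive value function: each technology train gets a weighted sum of single-attribute scores. The criteria labels, weights, and scores below are hypothetical illustrations, not the WAG 6 model's actual inputs.

```python
def additive_value(scores, weights):
    # Additive multiattribute value: sum of weight times single-attribute value
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical weights over CERCLA-style balancing criteria (must sum to 1)
weights = {"protectiveness": 0.30, "long_term": 0.25, "toxicity_reduction": 0.20,
           "short_term": 0.15, "implementability": 0.10}
# Hypothetical single-attribute scores in [0, 1] for two candidate trains
trains = {
    "excavate_and_dispose": {"protectiveness": 0.9, "long_term": 0.8,
                             "toxicity_reduction": 0.4, "short_term": 0.3,
                             "implementability": 0.7},
    "in_situ_bioremediation": {"protectiveness": 0.7, "long_term": 0.6,
                               "toxicity_reduction": 0.8, "short_term": 0.8,
                               "implementability": 0.6},
}
ranked = sorted(trains, key=lambda t: additive_value(trains[t], weights), reverse=True)
```

    Rerunning the ranking with scores evaluated at several contaminant volumes, and with perturbed weights, reproduces the volume-uncertainty and sensitivity analyses described above.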

  8. A CERCLA-Based Decision Model to Support Remedy Selection for an Uncertain Volume of Contaminants at a DOE Facility

    International Nuclear Information System (INIS)

    Christine E. Kerschus

    1999-01-01

    The Paducah Gaseous Diffusion Plant (PGDP) operated by the Department of Energy is challenged with selecting the appropriate remediation technology to clean up contaminants at Waste Area Group (WAG) 6. This research utilizes value-focused thinking and multiattribute preference theory concepts to produce a decision analysis model designed to aid the decision makers in their selection process. The model is based on CERCLA's five primary balancing criteria, tailored specifically to WAG 6 and the contaminants of concern, utilizes expert opinion and the best available engineering, cost, and performance data, and accounts for uncertainty in contaminant volume. The model ranks 23 remediation technologies (trains) in their ability to achieve the CERCLA criteria at various contaminant volumes. A sensitivity analysis is performed to examine the effects of changes in expert opinion and uncertainty in volume. Further analysis reveals how volume uncertainty is expected to affect technology cost, time and ability to meet the CERCLA criteria. The model provides the decision makers with a CERCLA-based decision analysis methodology that is objective, traceable, and robust to support the WAG 6 Feasibility Study. In addition, the model can be adjusted to address other DOE contaminated sites

  9. A new methodology to study customer electrocardiogram using RFM analysis and clustering

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Gholamian

    2011-04-01

    Full Text Available One of the primary issues in marketing planning is to know the customer's behavioral trends. A customer's purchasing interest may fluctuate for different reasons, and it is important to find the declining or increasing trends whenever they happen. It is important to study these fluctuations to improve customer relationships. There are different methods to increase customers' willingness to buy, such as planning good promotions, increased advertising, etc. This paper proposes a new methodology to measure customers' behavioral trends, called the customer electrocardiogram. The proposed model uses the K-means clustering method with RFM analysis to study customers' fluctuations over different time frames. We also apply the proposed electrocardiogram methodology to a real-world case study in the food industry, and the results are discussed in detail.
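    The RFM-plus-K-means step can be sketched as follows: build a recency/frequency/monetary table, standardise it, and cluster. The customer records are made up, and the tiny Lloyd's-algorithm implementation (with deterministic seeding) is a generic stand-in for whatever K-means variant the paper used.

```python
import numpy as np

def kmeans(x, k, iters=20):
    # Plain Lloyd iterations, deterministically seeded with the first k points
    centers = x[:k].copy()
    labels = np.zeros(len(x), dtype=int)
    for _ in range(iters):
        labels = np.argmin(((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean(axis=0)
    return labels

# Hypothetical RFM table: recency (days), frequency (orders), monetary (spend)
rfm = np.array([[5.0, 40.0, 900.0], [7.0, 35.0, 850.0],     # active, high value
                [200.0, 2.0, 30.0], [220.0, 1.0, 20.0]])    # lapsed, low value
z = (rfm - rfm.mean(axis=0)) / rfm.std(axis=0)              # standardise attributes
labels = kmeans(z, k=2)
```

    Tracking which cluster each customer falls into across successive time frames yields the "electrocardiogram" of rising and falling engagement.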

  10. Investigation of Causal Relationship between Stock Prices and Trading Volume using Toda and Yamamoto Procedure

    Directory of Open Access Journals (Sweden)

    Sushil BAJAJ

    2014-11-01

    Full Text Available The present study probes the relationship between stock prices and trading volume. For this purpose, daily data on adjusted closing stock prices and trading volume of 39 individual securities and the S&P CNX Nifty from January 1, 1998 to May 31, 2013 have been used. In this study, instead of applying the ordinary Granger causality test to investigate the relationship between stock prices and trading volume, the Toda and Yamamoto (1995) procedure has been applied for analyzing the data. The lag length chosen by the AIC and FPE criteria has been verified by running the Lagrange Multiplier (LM) test, and the causality determined by the Toda and Yamamoto test has also been confirmed using VAR methodology. Although the Toda and Yamamoto and VAR tests produced slightly dissimilar results, the empirical analysis provides sufficient grounds to declare the presence of interaction between stock prices and trading volume.
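    The essence of the Toda-Yamamoto procedure is to fit a VAR in levels with p + d_max lags and then test exclusion of only the first p lags of the candidate cause, leaving the d_max augmentation lags unrestricted. The sketch below uses an F-type statistic on ordinary least squares fits as a simplified stand-in for the Wald test, with simulated data where x Granger-causes y; none of it reproduces the paper's actual series.

```python
import numpy as np

def lagmat(data, lags):
    # Row t contains data[t-1], ..., data[t-lags]
    n = len(data)
    return np.column_stack([data[lags - j:n - j] for j in range(1, lags + 1)])

def ty_exclusion_fstat(y, x, p, d_max):
    """Regress y on p + d_max lags of y and x, then F-test exclusion of the
    first p lags of x only (the augmentation lags stay in both models)."""
    m = p + d_max
    Y = y[m:]
    ylags, xlags = lagmat(y, m), lagmat(x, m)
    X_full = np.column_stack([np.ones(len(Y)), ylags, xlags])
    X_rest = np.column_stack([np.ones(len(Y)), ylags, xlags[:, p:]])  # drop tested lags

    def ssr(X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        r = Y - X @ beta
        return r @ r

    ssr_r, ssr_f = ssr(X_rest), ssr(X_full)
    df2 = len(Y) - X_full.shape[1]
    return ((ssr_r - ssr_f) / p) / (ssr_f / df2)

rng = np.random.default_rng(1)
x = rng.standard_normal(400)
y = np.zeros(400)
for t in range(2, 400):                       # x Granger-causes y at lag 1
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
f = ty_exclusion_fstat(y, x, p=1, d_max=1)
```

    A large statistic rejects the no-causality null; in practice one would use a proper chi-square Wald test (e.g. via statsmodels) and choose d_max as the maximum integration order of the series.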

  11. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    International Nuclear Information System (INIS)

    Shen Zupei; Li Xiaodong; Huang Xiangrui

    2003-01-01

    The author applies the GO methodology to reliability analysis of the offsite power supply system of the Daya Bay NPP. Direct quantitative calculation formulas for the steady-state reliability target of a system with shared signals, and dynamic calculation formulas for the state probability of a unit with two states, are derived. A method to solve the fault event sets of the system is also presented, and all the fault event sets of the offsite power supply system and their failure probabilities are obtained. The recovery reliability of the offsite power supply system after a stability failure of the power grid is also calculated. The result shows that the GO methodology is simple and useful for the steady-state and dynamic reliability analysis of repairable systems
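    For a two-state repairable unit, the classical point-availability solution (a standard Markov result, not quoted from the record) is A(t) = μ/(λ+μ) + λ/(λ+μ)·exp(−(λ+μ)t), which is the kind of dynamic state-probability formula such analyses rest on. The rates below are hypothetical.

```python
import math

def availability(t, lam, mu):
    """Point availability of a two-state repairable unit starting in the
    working state, with failure rate lam and repair rate mu:
    A(t) = mu/(lam+mu) + lam/(lam+mu) * exp(-(lam+mu) * t)."""
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

lam, mu = 1e-3, 1e-1     # per hour, hypothetical values
a0 = availability(0.0, lam, mu)        # starts fully available
a_inf = availability(1e6, lam, mu)     # settles at mu / (lam + mu)
```

    The steady-state limit μ/(λ+μ) is what a static reliability target would use, while the exponential transient matters for recovery analyses such as the post-grid-failure calculation above.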

  12. How to conduct a qualitative meta-analysis: Tailoring methods to enhance methodological integrity.

    Science.gov (United States)

    Levitt, Heidi M

    2018-05-01

    Although qualitative research has long been of interest in the field of psychology, meta-analyses of qualitative literatures (sometimes called meta-syntheses) are still quite rare. Like quantitative meta-analyses, these methods function to aggregate findings and identify patterns across primary studies, but their aims, procedures, and methodological considerations may vary. This paper explains the function of qualitative meta-analyses and their methodological development. Recommendations have broad relevance but are framed with an eye toward their use in psychotherapy research. Rather than arguing for the adoption of any single meta-method, this paper advocates for considering how procedures can best be selected and adapted to enhance a meta-study's methodological integrity. Throughout the paper, recommendations are provided to help researchers identify procedures that can best serve their studies' specific goals. Meta-analysts are encouraged to consider the methodological integrity of their studies in relation to central research processes, including identifying a set of primary research studies, transforming primary findings into initial units of data for a meta-analysis, developing categories or themes, and communicating findings. The paper provides guidance for researchers who desire to tailor meta-analytic methods to meet their particular goals while enhancing the rigor of their research.

  13. Dose-volume analysis for quality assurance of interstitial brachytherapy for breast cancer

    International Nuclear Information System (INIS)

    Vicini, Frank A.; Kestin, Larry L.; Edmundson, Gregory K.; Jaffray, David A.; Wong, John W.; Kini, Vijay R.; Chen, Peter Y.; Martinez, Alvaro A.

    1999-01-01

    Purpose/Objective: The use of brachytherapy in the management of breast cancer has increased significantly over the past several years. Unfortunately, few techniques have been developed to compare dosimetric quality and target volume coverage concurrently. We present a new method of implant evaluation that incorporates computed tomography-based three-dimensional (3D) dose-volume analysis with traditional measures of brachytherapy quality. Analyses performed in this fashion will be needed to ultimately assist in determining the efficacy of breast implants. Methods and Materials: Since March of 1993, brachytherapy has been used as the sole radiation modality after lumpectomy in selected protocol patients with early-stage breast cancer treated with breast-conserving therapy. Eight patients treated with high-dose-rate (HDR) brachytherapy who had surgical clips outlining the lumpectomy cavity and underwent computed tomography (CT) scanning after implant placement were selected for this study. For each patient, the postimplant CT dataset was transferred to a 3D treatment planning system. The lumpectomy cavity, target volume (lumpectomy cavity plus a 1-cm margin), and entire breast were outlined on each axial slice. Once all volumes were entered, the programmed HDR brachytherapy source positions and dwell times were imported into the 3D planning system. Using the tools provided by the 3D planning system, the implant dataset was then registered to the visible implant template in the CT dataset. The distribution of the implant dose was analyzed with respect to defined volumes via dose-volume histograms (DVH). Isodose surfaces, the dose homogeneity index, and dosimetric coverage of the defined volumes were calculated and contrasted. All patients received 32 Gy to the entire implanted volume in 8 fractions of 4 Gy over 4 days. Results: Three-plane implants were used for 7 patients and a two-plane implant for 1 patient. 
The median number of needles per implant was 16.5 (range
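    The dose-volume histogram (DVH) computation underlying this kind of implant evaluation reduces to counting the fraction of target voxels at or above each dose threshold. The dose grid, target mask, and dose levels below are hypothetical, chosen only to make the mechanics concrete.

```python
import numpy as np

def cumulative_dvh(dose, mask, bins):
    """Cumulative DVH: fraction of the masked volume receiving at least each
    threshold dose (uniform voxel volumes assumed)."""
    d = dose[mask]
    return np.array([(d >= b).mean() for b in bins])

# Hypothetical 3D dose grid with a hot core, and a target-volume mask
dose = np.zeros((10, 10, 10))
dose[2:8, 2:8, 2:8] = 36.0           # implanted high-dose region (Gy)
dose[4:6, 4:6, 4:6] = 40.0           # hotter central region
target = np.zeros_like(dose, dtype=bool)
target[3:7, 3:7, 3:7] = True         # 64-voxel target inside the implant
dvh = cumulative_dvh(dose, target, bins=[32.0, 36.0, 40.0])
```

    Reading the DVH at the prescription dose gives target coverage, and comparing volumes at two dose levels yields homogeneity-type indices like those contrasted in the study.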

  14. Development methodology for industrial diesel engines; Entwicklungsmethode fuer Industrie-Dieselmotoren

    Energy Technology Data Exchange (ETDEWEB)

    Bergmann, Dirk; Kech, Johannes [MTU Friedrichshafen GmbH (Germany)

    2011-11-15

    In order to remain cost-effective with relatively low production volumes in spite of the high requirements regarding emissions and durability, MTU uses a clearly structured development methodology with a close interlinking of technology and product development in the development of its large engines. For the new engine of the 4000 Series with cooled EGR, MTU applied this methodology in order to implement the emissions concept from the initial idea right through to the serial product. (orig.)

  15. Automated scoping methodology for liquid metal natural circulation small reactor

    International Nuclear Information System (INIS)

    Son, Hyung M.; Suh, Kune Y.

    2014-01-01

    Highlights: • An automated scoping methodology for a natural circulation small modular reactor is developed. • An in-house code is developed to carry out system analysis and core geometry generation during scoping. • Adjustment relations are obtained to correct the critical core geometry obtained from diffusion theory. • The optimized design specification is found using an objective function value. • Convex hull volume is utilized to quantify the impact of different constraints on the scope range. - Abstract: A novel scoping method is proposed that can automatically generate the design variable ranges of a natural circulation driven liquid metal cooled small reactor. From performance requirements based upon the Generation IV system roadmap, appropriate structural materials are selected, and engineering constraints are compiled from the literature. Utilizing ASME codes and standards, appropriate geometric sizing criteria for the constituent components are developed to ensure integrity of the system during its lifetime. An in-house one-dimensional thermal-hydraulic system analysis code is developed based upon the momentum integral model and finite element methods to deal with non-uniform discretization of temperature nodes for the convection and thermal diffusion equations of the liquid metal coolant. In order to quickly generate critical core dimensions from given unit cell information, an adjustment relation that relates the critical geometry estimated from one-group diffusion theory to that from the MCNP code is constructed and utilized throughout the process. For the selected unit cell dimension ranges, burnup calculations are carried out to check whether the cores can generate energy over the reactor lifetime. Utilizing a random sampling method, the sizing criteria, and the in-house analysis codes, an automated scoping methodology is developed. The methodology is applied to a nitride fueled integral type lead cooled natural circulation reactor concept to generate design scopes which satisfy the given constraints. Three dimensional convex
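    The random-sampling scoping step can be sketched generically: sample candidate designs over the variable ranges, apply each engineering constraint as a predicate, and measure how much of the design space survives. The design variables, ranges, and constraint forms below are hypothetical stand-ins for the record's actual sizing criteria.

```python
import numpy as np

def feasible_fraction(samples, constraints):
    # Fraction of randomly sampled designs satisfying every constraint
    ok = np.ones(len(samples), dtype=bool)
    for c in constraints:
        ok &= c(samples)
    return ok.mean()

rng = np.random.default_rng(42)
# Hypothetical design variables: core height (m) and coolant velocity (m/s)
designs = np.column_stack([rng.uniform(1.0, 4.0, 100_000),
                           rng.uniform(0.1, 2.0, 100_000)])
constraints = [
    lambda d: d[:, 0] * d[:, 1] < 3.0,   # stand-in natural-circulation limit
    lambda d: d[:, 1] > 0.3,             # stand-in minimum-flow limit
]
frac = feasible_fraction(designs, constraints)
```

    Comparing the feasible fraction (or, as in the record, the convex hull volume of the feasible points) with and without a given constraint quantifies that constraint's impact on the design scope.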

  16. Internal event analysis of Laguna Verde Unit 1 Nuclear Power Plant. System Analysis

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1993-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the "Internal Event Analysis of Laguna Verde Unit 1 Nuclear Power Plant", CNSNS-TR-004, in five volumes. The reports are organized as follows: CNSNS-TR-004 Volume 1: Introduction and Methodology. CNSNS-TR-004 Volume 2: Initiating Event and Accident Sequences. CNSNS-TR-004 Volume 3: System Analysis. CNSNS-TR-004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR-004 Volume 5: Appendices A, B and C. This volume presents the results of the system analysis for the Laguna Verde Unit 1 Nuclear Power Plant. The system analysis involved the development of logical models for all the systems included in the accident sequence event tree headings, and for all the support systems required to operate the front line systems. For the internal event analysis for Laguna Verde, 16 front line systems and 5 support systems were included. Detailed fault trees were developed for most of the important systems. Simplified fault trees focusing on major faults were constructed for those systems that can be adequately represented using this kind of modeling. For those systems where fault tree models were not constructed, actual data were used to represent the dominant failures of the systems. The main failures included in the fault trees are hardware failures, test and maintenance unavailabilities, common cause failures, and human errors. The SETS and TEMAC codes were used to perform the qualitative and quantitative fault tree analyses. (Author)
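    Quantitative fault tree evaluation of the kind performed by codes like SETS/TEMAC ultimately reduces minimal cut sets to a top-event probability; under the standard rare-event approximation this is just the sum of cut-set products. The basic events, probabilities, and cut sets below are hypothetical.

```python
def cutset_probability(cutset, basic_events):
    # A minimal cut set fails only if all of its basic events fail
    p = 1.0
    for e in cutset:
        p *= basic_events[e]
    return p

def top_event_probability(cutsets, basic_events):
    """Rare-event approximation: sum of minimal-cut-set probabilities
    (adequate when basic-event probabilities are small)."""
    return sum(cutset_probability(c, basic_events) for c in cutsets)

# Hypothetical basic events: redundant pumps, a valve, an operator error
p_basic = {"pump_a": 1e-3, "pump_b": 1e-3, "valve": 5e-4, "operator": 1e-2}
cutsets = [{"pump_a", "pump_b"}, {"valve", "operator"}]
p_top = top_event_probability(cutsets, p_basic)
```

    Real PRA quantification also folds in common-cause failure models and test/maintenance unavailabilities as additional basic events, exactly as the abstract lists.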

  17. Fully automated segmentation of oncological PET volumes using a combined multiscale and statistical model

    International Nuclear Information System (INIS)

    Montgomery, David W. G.; Amira, Abbes; Zaidi, Habib

    2007-01-01

    The widespread application of positron emission tomography (PET) in clinical oncology has driven this imaging technology into a number of new research and clinical arenas. Increasing numbers of patient scans have led to an urgent need for efficient data handling and the development of new image analysis techniques to aid clinicians in the diagnosis of disease and planning of treatment. Automatic quantitative assessment of metabolic PET data is attractive and will certainly revolutionize the practice of functional imaging since it can lower variability across institutions and may enhance the consistency of image interpretation independent of reader experience. In this paper, a novel automated system for the segmentation of oncological PET data aiming at providing an accurate quantitative analysis tool is proposed. The initial step involves expectation maximization (EM)-based mixture modeling using a k-means clustering procedure, which varies voxel order for initialization. A multiscale Markov model is then used to refine this segmentation by modeling spatial correlations between neighboring image voxels. An experimental study using an anthropomorphic thorax phantom was conducted for quantitative evaluation of the performance of the proposed segmentation algorithm. The comparison of actual tumor volumes to the volumes calculated using different segmentation methodologies including standard k-means, spatial domain Markov Random Field Model (MRFM), and the new multiscale MRFM proposed in this paper showed that the latter dramatically reduces the relative error to less than 8% for small lesions (7 mm radii) and less than 3.5% for larger lesions (9 mm radii). The analysis of the resulting segmentations of clinical oncologic PET data seems to confirm that this methodology shows promise and can successfully segment patient lesions. For problematic images, this technique enables the identification of tumors situated very close to nearby high normal physiologic uptake. 
The
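    The initial EM mixture-modeling stage of such a segmentation pipeline can be illustrated in one dimension: fit a two-component Gaussian mixture to voxel intensities so that background and lesion classes separate. The data are simulated, and the crude median-split initialisation is a stand-in for the paper's k-means initialisation; the spatial (Markov) refinement stage is not shown.

```python
import numpy as np

def em_gmm_1d(x, iters=100):
    """Two-component 1D Gaussian mixture fitted by expectation maximization."""
    mu = np.array([x[x <= np.median(x)].mean(), x[x > np.median(x)].mean()])
    var = np.array([x.var(), x.var()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each value
        dens = pi / np.sqrt(2 * np.pi * var) * np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

rng = np.random.default_rng(0)
background = rng.normal(1.0, 0.2, 2000)    # normal-uptake voxel intensities
lesion = rng.normal(4.0, 0.4, 200)         # hot-lesion voxel intensities
pi, mu, var = em_gmm_1d(np.concatenate([background, lesion]))
```

    Thresholding on the fitted posteriors yields an initial labeling, which the multiscale Markov model then refines using spatial correlations between neighboring voxels.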

  18. Methodology for characterisation of glass fibre composite architecture

    DEFF Research Database (Denmark)

    Hansen, Jens Zangenberg; Larsen, J.B.; Østergaard, R.C.

    2012-01-01

    The present study outlines a methodology for microstructural characterisation of fibre reinforced composites containing circular fibres. Digital micrographs of polished cross-sections are used as input to a numerical image processing tool that determines spatial mapping and radii detection of the fibres. The information is used for different analyses to investigate and characterise the fibre architecture. As an example, the methodology is applied to glass fibre reinforced composites with varying fibre contents. The different fibre volume fractions (FVFs) affect the number of contact points per fibre, the communal fibre distance and the local FVF. The fibre diameter distribution and packing pattern remain somewhat similar for the considered materials. The methodology is a step towards a better understanding of the composite microstructure and can be used to evaluate the interconnection between
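    Once fibre centres and radii have been detected, the local fibre volume fraction of an observation window follows directly as the area fraction covered by the circular cross-sections. The radii and window size below are hypothetical.

```python
import math

def fibre_volume_fraction(radii, window_area):
    # Area fraction of circular fibre cross-sections within an observation window
    return sum(math.pi * r * r for r in radii) / window_area

# Hypothetical detected fibres in a 100 x 100 um^2 micrograph window
radii_um = [8.0] * 30
fvf = fibre_volume_fraction(radii_um, window_area=100.0 * 100.0)
```

    Evaluating this over a sliding window gives the local FVF map, which can then be related to contact-point counts and inter-fibre distances.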

  19. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article. Fault tree analysis, deemed one of the most effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in a nuclear power plant is given to illustrate the proposed method.
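    The simplest form of failure rate allocation (a generic weighted apportionment, not the paper's dualistic-contrast model) distributes a series-system failure-rate target across subsystems in proportion to weights reflecting, say, relative complexity. The subsystem names and numbers below are hypothetical.

```python
def allocate_failure_rates(lambda_target, weights):
    """Weighted apportionment of a series-system failure-rate target:
    each subsystem receives a share proportional to its weight."""
    total = sum(weights.values())
    return {name: lambda_target * w / total for name, w in weights.items()}

# Hypothetical emergency-diesel-generator subsystems (failure rate per hour)
alloc = allocate_failure_rates(1e-4, {"fuel": 3.0, "cooling": 2.0, "control": 5.0})
```

    More refined schemes like the one in the record replace the fixed weights with quantities derived from the fault tree structure and optimization criteria.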

  20. Development of safety analysis methodology for moderator system failure of CANDU-6 reactor by thermal-hydraulics/physics coupling

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Jin, Dong Sik; Chang, Soon Heung

    2013-01-01

    Highlights: • A new safety analysis methodology for moderator system failures of CANDU-6 is developed. • The new methodology uses a thermalhydraulics-physics coupling concept. • The thermalhydraulic code is CATHENA; the physics code is RFSP-IST. • A moderator system failure ends in subcriticality through self-shutdown. -- Abstract: A new safety analysis methodology for moderator system failures of the CANDU-6 nuclear power plant (NPP) has been developed by coupling the thermalhydraulic code CATHENA with the reactor core physics code RFSP-IST. This sophisticated methodology can replace the legacy methodology using MODSTBOIL and SMOKIN-G2 in the fields of thermalhydraulics and reactor physics, respectively. The CATHENA thermalhydraulic model can simulate the thermalhydraulic behaviour of all the moderator systems, such as the calandria tank, head tank, moderator circulating circuit and cover gas circulating circuit, and can also predict moderator thermalhydraulic properties such as density, temperature and water level in the calandria tank as the moderator system failure progresses. These calculated moderator properties are provided as inputs to the 3-dimensional neutron kinetics solution module of RFSP-IST (CERBRRS), which predicts the change in reactor power and returns the calculated power to CATHENA. These coupled calculations are performed at 2 s time steps, equivalent to the slow control of the CANDU-6 reactor regulating system (RRS). The safety analysis results using this coupling methodology reveal that the reactor enters a self-shutdown mode, without any engineered safety system or human intervention, for the postulated moderator system failures of loss of heat sink and loss of moderator inventory, respectively
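    The fixed-time-step data exchange described above can be sketched as a toy explicit coupling loop: a thermalhydraulic state feeds a physics module, whose power feeds back into the thermalhydraulic heat-up. The density and power relations below are invented linear stand-ins for CATHENA and RFSP-IST, intended only to show the coupling structure and the self-shutdown tendency of a negative feedback.

```python
def moderator_density(temp_c):
    # Toy thermalhydraulic response: density falls linearly with temperature
    return 1.0 - 0.0005 * (temp_c - 60.0)

def reactor_power(density, nominal=100.0):
    # Toy physics response: power falls as the moderator thins (negative feedback)
    return nominal * max(0.0, (density - 0.90) / 0.10)

def couple(steps, dt=2.0, heat_rate=0.5):
    """Explicit 2 s time-step coupling: TH state -> physics -> TH heat-up."""
    temp, history = 60.0, []
    for _ in range(steps):
        rho = moderator_density(temp)            # TH -> physics
        power = reactor_power(rho)               # physics -> TH
        temp += dt * heat_rate * power / 100.0   # moderator heats while at power
        history.append(power)
    return history

powers = couple(steps=400)
```

    In the toy loop, power decays monotonically toward zero as the moderator heats up, mirroring (only qualitatively) the self-shutdown behaviour the coupled analysis predicts.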

  1. Safety analysis methodology with assessment of the impact of the prediction errors of relevant parameters

    International Nuclear Information System (INIS)

    Galia, A.V.

    2011-01-01

    The best estimate plus uncertainty (BEAU) approach requires the use of extensive resources and is therefore usually applied to cases in which the safety margin obtained with a conservative methodology can be questioned. Outside the BEAU methodology, there is no clear approach for dealing with the uncertainties resulting from prediction errors in the safety analyses performed for licensing submissions. However, the regulatory document RD-310 mentions that the analysis method shall account for uncertainties in the analysis data and models. A simple and reasonable possible approach, representing only the author's views, is presented to take into account the impact of prediction errors and other uncertainties when performing safety analysis in line with regulatory requirements. The approach proposes taking into account the prediction error of relevant parameters. Relevant parameters would be those plant parameters that are surveyed and are used to initiate the action of a mitigating system, or those that are representative of the most challenging phenomena for the integrity of a fission barrier. Examples of the application of the methodology are presented, involving a comparison between the results with the new approach and a best estimate calculation during the blowdown phase for two small breaks in a generic CANDU 6 station. The calculations are performed with the CATHENA computer code. (author)
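    One plausible reading of "taking into account the prediction error of relevant parameters" is to bias each surveyed trip parameter by its prediction error in the non-conservative direction, so that the mitigating action is credited late. The setpoint, error magnitude, and direction convention below are hypothetical, not taken from the record.

```python
def biased_setpoint(nominal, prediction_error, direction):
    """Shift a trip setpoint by the parameter's prediction error in the
    non-conservative direction (direction = -1 delays a falling-parameter trip)."""
    return nominal + direction * prediction_error

# Hypothetical low-pressure trip: credit the trip 50 kPa later than nominal
effective = biased_setpoint(nominal=5500.0, prediction_error=50.0, direction=-1)
```

    The analysis is then rerun with the biased setpoints, and the margin change relative to the unbiased best-estimate case quantifies the impact of the prediction errors.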

  2. Guidance for the application of an assessment methodology for innovative nuclear energy systems. INPRO manual - Overview of the methodology. Vol. 1 of 9 of the final report of phase 1 of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) including a CD-ROM comprising all volumes

    International Nuclear Information System (INIS)

    2008-11-01

    The International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) was initiated in the year 2000, based on a resolution of the IAEA General Conference (GC(44)/RES/21). The main objectives of INPRO are (1) to help to ensure that nuclear energy is available to contribute in fulfilling energy needs in the 21st century in a sustainable manner, (2) to bring together both technology holders and technology users to consider jointly the international and national actions required to achieve desired innovations in nuclear reactors and fuel cycles; and (3) to create a forum to involve all relevant stakeholders that will have an impact on, draw from, and complement the activities of existing institutions, as well as ongoing initiatives at the national and international level. This document follows the guidelines of the INPRO report 'Methodology for the assessment of innovative nuclear reactors and fuel cycles, Report of Phase 1B (first part) of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO)', IAEA-TECDOC-1434 (2004), together with its previous report Guidance for the evaluation for innovative nuclear reactors and fuel cycles, Report of Phase 1A of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO), IAEA-TECDOC-1362 (2003). This INPRO manual is comprised of an overview volume (laid out in this report), and eight additional volumes (available on a CD-ROM attached to the inside back cover of this report) covering the areas of economics (Volume 2), infrastructure (Volume 3), waste management (Volume 4), proliferation resistance (Volume 5), physical protection (Volume 6), environment (Volume 7), safety of reactors (Volume 8), and safety of nuclear fuel cycle facilities (Volume 9). The overview volume sets out the philosophy of INPRO and a general discussion of the INPRO methodology. This overview volume discusses the relationship of INPRO with the UN concept of sustainability to demonstrate how the

  3. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    International Nuclear Information System (INIS)

    Reimund, Kevin K.

    2015-01-01

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. This method utilizes equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and riverwater. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1+√w⁻¹), which is lower than the maximum power density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at "maximum power density operating pressure" requires at least 2.9 kmol. Utilizing this methodology, it is possible to examine the effects of volumetric solution cost, operation of a module at various pressures, and operation of a constant-pressure module with various feeds.

  4. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    Energy Technology Data Exchange (ETDEWEB)

    Reimund, Kevin K. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; McCutcheon, Jeffrey R. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; Wilson, Aaron D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-08-01

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. This method utilizes equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and riverwater. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1+√w⁻¹), which is lower than the maximum power density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at “maximum power density operating pressure” requires at least 2.9 kmol. Utilizing this methodology, it is possible to examine the effects of volumetric solution cost, operation of a module at various pressures, and operation of a constant-pressure module with various feeds.
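
    The ideal-solution numbers quoted above can be reproduced in a few lines. The sketch below is a deliberately simplified pressure-volume model (a dilute ideal draw solution against a pure-water feed), not the paper's full mass-fraction treatment; the `energy_density` helper and the seawater-scale osmotic pressure are illustrative assumptions.

```python
import numpy as np

# Toy PRO energy-density model (dilute ideal draw solution, pure-water
# feed) -- a simplification, not the paper's full model. Water permeates
# at applied pressure dP until the diluted draw's osmotic pressure falls
# to dP; extractable work per unit final volume is dP * (1 - dP/pi_c).
def energy_density(dP, pi_c):
    return dP * (1.0 - dP / pi_c)

pi_c = 2.7e6                                 # Pa, seawater-scale osmotic pressure
p = np.linspace(1e3, pi_c, 10_000)
best = p[np.argmax(energy_density(p, pi_c))]
print(best / pi_c)                           # close to 0.5: optimum at pi/2 in this limit

# Minimum ideal solute per kWh: 1 kWh of work divided by RT per mole.
kmol_per_kwh = 3.6e6 / (8.314 * 298.15) / 1000.0
print(round(kmol_per_kwh, 2))                # 1.45, matching the abstract
```

    In the dilute limit the optimum sits at π/2; the paper's mass-fraction correction π/(1+√w⁻¹) shifts it below that, as the abstract notes.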

  5. Methodology for cost analysis of film-based and filmless portable chest systems

    Science.gov (United States)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost-analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.
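
    A cost-per-exam roll-up of the kind described can be sketched as below. All component names and numbers are invented for illustration; they are not the study's data.

```python
# Hypothetical direct cost-per-exam roll-up in the spirit of the
# methodology above: per-year fixed costs (labor, straight-line
# depreciation, maintenance) spread over annual exam volume, plus
# per-exam materials and storage. Numbers are illustrative only.
def cost_per_exam(annual_exams, labor, equipment_cost, useful_life_yr,
                  maintenance, materials_per_exam, storage_per_exam):
    depreciation = equipment_cost / useful_life_yr
    fixed_per_year = labor + depreciation + maintenance
    return fixed_per_year / annual_exams + materials_per_exam + storage_per_exam

film = cost_per_exam(9000, labor=120_000, equipment_cost=150_000,
                     useful_life_yr=5, maintenance=10_000,
                     materials_per_exam=4.50, storage_per_exam=1.20)
filmless = cost_per_exam(9000, labor=90_000, equipment_cost=400_000,
                         useful_life_yr=5, maintenance=30_000,
                         materials_per_exam=0.00, storage_per_exam=0.80)
print(round(film, 2), round(filmless, 2))   # 23.48 23.02 with these inputs
```

    Sensitivity analysis then amounts to re-running the same function while varying one input (e.g. labor or storage cost) at a time.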

  6. Leukocyte telomere length and hippocampus volume: a meta-analysis [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Gustav Nilsonne

    2015-10-01

    Leukocyte telomere length has been shown to correlate with hippocampus volume, but effect estimates differ in magnitude and are not uniformly positive. This study aimed primarily to investigate the relationship between leukocyte telomere length and hippocampus gray matter volume by meta-analysis, and secondarily to investigate possible effect moderators. Five studies were included with a total of 2107 participants, of which 1960 were contributed by one single influential study. A random-effects meta-analysis estimated the effect at r = 0.12 [95% CI -0.13, 0.37] in the presence of heterogeneity and a subjectively estimated moderate to high risk of bias. There was no evidence that apolipoprotein E (APOE) genotype was an effect moderator, nor that the ratio of leukocyte telomerase activity to telomere length was a better predictor than leukocyte telomere length for hippocampus volume. This meta-analysis, while not proving a positive relationship, also is not able to disprove the earlier finding of a positive correlation in the one large study included in the analyses. We propose that a relationship between leukocyte telomere length and hippocampus volume may be mediated by transmigrating monocytes which differentiate into microglia in the brain parenchyma.
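
    A random-effects pooling of correlations of this kind can be reproduced with the standard DerSimonian-Laird procedure on Fisher-z-transformed coefficients. The r and n values below are invented placeholders, not the five studies in the meta-analysis.

```python
import math

# Minimal DerSimonian-Laird random-effects pooling of correlations via
# Fisher's z. Inputs are per-study correlations and sample sizes;
# var(z) = 1/(n-3) for a Pearson correlation.
def pool_correlations(rs, ns):
    zs = [math.atanh(r) for r in rs]            # Fisher z-transform
    ws = [n - 3 for n in ns]                    # inverse-variance weights
    z_fixed = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    q = sum(w * (z - z_fixed) ** 2 for w, z in zip(ws, zs))
    c = sum(ws) - sum(w ** 2 for w in ws) / sum(ws)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)    # between-study variance
    ws_re = [1.0 / (1.0 / w + tau2) for w in ws]
    z_re = sum(w * z for w, z in zip(ws_re, zs)) / sum(ws_re)
    se = (1.0 / sum(ws_re)) ** 0.5
    return math.tanh(z_re), math.tanh(z_re - 1.96 * se), math.tanh(z_re + 1.96 * se)

# Illustrative inputs: one large study dominating four small ones.
r, lo, hi = pool_correlations([0.10, 0.25, -0.05, 0.30, 0.12],
                              [1960, 40, 35, 42, 30])
print(f"r = {r:.2f} [{lo:.2f}, {hi:.2f}]")
```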

  7. Determination of bone mineral volume fraction using impedance analysis and Bruggeman model

    Energy Technology Data Exchange (ETDEWEB)

    Ciuchi, Ioana Veronica; Olariu, Cristina Stefania, E-mail: oocristina@yahoo.com; Mitoseriu, Liliana, E-mail: lmtsr@uaic.ro

    2013-11-20

    Highlights: • Mineral volume fraction of a bone sample was determined. • Dielectric properties of the bone sample and of collagen type I were determined by impedance spectroscopy. • The Bruggeman effective medium approximation was applied in order to evaluate the mineral volume fraction of the sample. • The computed values were compared with those derived from a histogram test performed on SEM micrographs. -- Abstract: Measurements by impedance spectroscopy and the Bruggeman effective medium approximation model were employed in order to determine the mineral volume fraction of dry bone. This approach assumes that two or more phases are present in the composite: one is the matrix (environment) and the others are inclusion phases. A fragment of femur diaphysis dense bone from a young pig was investigated in its dehydrated state. By measuring the dielectric properties of bone and its main components (hydroxyapatite and collagen) and using the Bruggeman approach, the mineral volume filling factor was determined. The computed mineral volume fraction was confirmed by a histogram test analysis based on the SEM microstructures. In spite of its simplicity, the method provides a good approximation for the bone mineral volume fraction. The method, which uses impedance spectroscopy and EMA modeling, can be further developed by considering the conductive components of the bone tissue as a non-invasive in situ impedance technique for bone composition evaluation and monitoring.
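
    Inverting a two-phase Bruggeman relation for the inclusion volume fraction is a one-line algebra step, sketched below. The symmetric Bruggeman form and the permittivity values are illustrative assumptions, not the paper's bone data.

```python
# Two-phase symmetric Bruggeman effective-medium relation:
#   f*(ei - ee)/(ei + 2*ee) + (1 - f)*(em - ee)/(em + 2*ee) = 0
# solved for the inclusion (mineral) volume fraction f, given the
# inclusion, matrix, and measured effective permittivities.
def bruggeman_fraction(eps_incl, eps_matrix, eps_eff):
    a = (eps_incl - eps_eff) / (eps_incl + 2 * eps_eff)
    b = (eps_matrix - eps_eff) / (eps_matrix + 2 * eps_eff)
    return -b / (a - b)

# Illustrative permittivities (not measured bone values):
f = bruggeman_fraction(eps_incl=15.0, eps_matrix=4.0, eps_eff=8.0)
print(round(f, 3))   # 0.47 for these inputs
```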

  8. Determination of bone mineral volume fraction using impedance analysis and Bruggeman model

    International Nuclear Information System (INIS)

    Ciuchi, Ioana Veronica; Olariu, Cristina Stefania; Mitoseriu, Liliana

    2013-01-01

    Highlights: • Mineral volume fraction of a bone sample was determined. • Dielectric properties of the bone sample and of collagen type I were determined by impedance spectroscopy. • The Bruggeman effective medium approximation was applied in order to evaluate the mineral volume fraction of the sample. • The computed values were compared with those derived from a histogram test performed on SEM micrographs. -- Abstract: Measurements by impedance spectroscopy and the Bruggeman effective medium approximation model were employed in order to determine the mineral volume fraction of dry bone. This approach assumes that two or more phases are present in the composite: one is the matrix (environment) and the others are inclusion phases. A fragment of femur diaphysis dense bone from a young pig was investigated in its dehydrated state. By measuring the dielectric properties of bone and its main components (hydroxyapatite and collagen) and using the Bruggeman approach, the mineral volume filling factor was determined. The computed mineral volume fraction was confirmed by a histogram test analysis based on the SEM microstructures. In spite of its simplicity, the method provides a good approximation for the bone mineral volume fraction. The method, which uses impedance spectroscopy and EMA modeling, can be further developed by considering the conductive components of the bone tissue as a non-invasive in situ impedance technique for bone composition evaluation and monitoring.

  9. Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor

    Science.gov (United States)

    Mkhabela, Peter Tshepo

    The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (based on neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be a HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding since it requires proper treatment of both slower and much longer transients (with time scales of hours and days) and fast and short transients (with time scales of minutes and seconds). There is limited operational and experimental data available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study provided a contribution to greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and various multi-physics effects that can influence it. Consideration of the feedback due to the influence of leakage was taken into account by development and implementation of improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate, with proper interpolation from cross-section tables. To achieve this, the NEM/THERMIX coupled code system was developed, creating a system that is efficient and stable over the duration of transient calculations that last several tens of hours. Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis

  10. Review of Department of Defense Education Activity (DODEA) Schools. Volume II: Quantitative Analysis of Educational Quality

    National Research Council Canada - National Science Library

    Anderson, Lowell

    2000-01-01

    This volume compiles, and presents in integrated form, IDA's quantitative analysis of educational quality provided by DoD's dependent schools, It covers the quantitative aspects of volume I in greater...

  11. Lobar analysis of collapsibility indices to assess functional lung volumes in COPD patients.

    Science.gov (United States)

    Kitano, Mariko; Iwano, Shingo; Hashimoto, Naozumi; Matsuo, Keiji; Hasegawa, Yoshinori; Naganawa, Shinji

    2014-01-01

    We investigated correlations between lung volume collapsibility indices and pulmonary function test (PFT) results and assessed lobar differences in chronic obstructive pulmonary disease (COPD) patients, using paired inspiratory and expiratory three-dimensional (3D) computed tomography (CT) images. We retrospectively assessed 28 COPD patients who underwent paired inspiratory and expiratory CT and PFT exams on the same day. A computer-aided diagnostic system calculated total lobar volume and emphysematous lobar volume (ELV). Normal lobar volume (NLV) was determined by subtracting ELV from total lobar volume, both for the inspiratory phase (NLVI) and for the expiratory phase (NLVE). We also determined lobar collapsibility indices: NLV collapsibility ratio (NLVCR) (%) = (1 - NLVE/NLVI) × 100%. Associations between lobar volumes and PFT results, and between collapsibility indices and PFT results, were determined by Pearson correlation analysis. NLVCR values were significantly correlated with PFT results. Forced expiratory volume in 1 second, measured as percent of predicted (FEV1%P), was significantly correlated with NLVCR values for the lower lobes. Diffusing capacity per unit alveolar volume, measured as percent of predicted (DLCO/VA%P), was strongly correlated with ELVI for the upper lobes. These indices may thus be useful for assessing pulmonary function in COPD patients.
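
    The collapsibility index defined above, and the Pearson correlation step, can be written directly. The lobar volumes and FEV1 values below are invented for illustration, not the study's patient data.

```python
import numpy as np

# NLV collapsibility ratio from the study's definition, applied to
# made-up per-patient lower-lobe volumes (mL), then correlated with
# hypothetical FEV1 %predicted values.
def nlvcr(nlv_insp, nlv_exp):
    """NLVCR (%) = (1 - NLVE/NLVI) * 100."""
    return (1.0 - nlv_exp / nlv_insp) * 100.0

nlv_i = np.array([1200.0, 950.0, 1100.0, 800.0])   # inspiratory NLV
nlv_e = np.array([700.0, 600.0, 900.0, 650.0])     # expiratory NLV
fev1_pct = np.array([65.0, 55.0, 80.0, 60.0])      # hypothetical FEV1 %P

cr = nlvcr(nlv_i, nlv_e)
r = np.corrcoef(cr, fev1_pct)[0, 1]                # Pearson r
print(np.round(cr, 1), round(float(r), 2))
```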

  12. Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.

    Science.gov (United States)

    Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M

    2017-07-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.
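
    Why clustering must be accounted for can be seen from the textbook design-effect formula (a standard result, not a method introduced by this review): with equal cluster sizes m and intraclass correlation ICC, the variance of a group-randomized estimate is inflated by DEFF = 1 + (m - 1)·ICC.

```python
# Design effect and effective sample size for a group-randomized trial
# with equal cluster sizes. Standard textbook formulas; the numbers in
# the example are illustrative.
def design_effect(cluster_size, icc):
    """DEFF = 1 + (m - 1) * ICC."""
    return 1.0 + (cluster_size - 1) * icc

def effective_n(total_n, cluster_size, icc):
    """Sample size after deflating by the design effect."""
    return total_n / design_effect(cluster_size, icc)

# 40 clusters of 50 with ICC = 0.01: 2000 individuals carry the
# information of roughly 1342 independent observations.
print(design_effect(50, 0.01))
print(round(effective_n(2000, 50, 0.01)))   # 1342
```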

  13. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    Science.gov (United States)

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data but PO gives a distinctive insight, revealing what people are really doing, instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights. It is not commonly discussed in nursing research and therefore this study can provide insight, which cannot be seen or revealed by using other data collection methods. Therefore, this paper can produce a useful tool for those who intend to use PO and grounded theory in their nursing research.

  14. Level II Probabilistic Safety Analysis Methodology for the Application to GEN-IV Sodium-cooled Fast Reactor

    International Nuclear Information System (INIS)

    Park, S. Y.; Kim, T. W.; Han, S. H.; Jeong, H. Y.

    2010-03-01

    The Korea Atomic Energy Research Institute (KAERI) has been developing liquid metal reactor (LMR) design technologies under a National Nuclear R and D Program. Nevertheless, there is no domestic experience of probabilistic safety assessment (PSA) for a fast reactor with metal fuel. Therefore, the objective of this study is to establish the methodologies of risk assessment for the reference design of the GEN-IV sodium fast reactor (SFR). The applicability of the PSA methodology of the U.S. NRC and the PRISM plant to the domestic GEN-IV SFR has been studied. The study contains a plant damage state analysis, a containment event tree analysis, and a source-term release category binning process.

  15. Comparing Internet Probing Methodologies Through an Analysis of Large Dynamic Graphs

    Science.gov (United States)

    2014-06-01

    Several Internet probing methodologies are in use, including DIMES, IPlane, Ark IPv4 All Prefix /24, and recently the NPS probing methodology. The NPS probing methodology is different from the others because it ... trace, a history of the forward interface-level path and the time to send and acknowledge are available to analyze. However, traceroute may not return

  16. ALARA cost/benefit analysis at Union Electric company using the ARP/AI methodology

    International Nuclear Information System (INIS)

    Williams, M.C.

    1987-01-01

    This paper describes the development of a specific method for the justification of expenditures associated with reducing occupational radiation exposure to as low as reasonably achievable (ALARA). The methodology is based on the concepts of the Apparent Reduction Potential (ARP) and Achievability Index (AI) as described in NUREG/CR-0446, Union Electric's corporate planning model, and the EPRI model for dose rate buildup with reactor operating life. The ARP provides a screening test to determine whether there is a need for ALARA expenditures, based on actual or predicted exposure rates and/or dose experience. The AI is a means of assessing all costs and all benefits, even though they are expressed in different units of measurement such as person-rem and dollars, to determine whether ALARA expenditures are justified and what their value is. This method of cost/benefit analysis can be applied by any company or organization utilizing site-specific exposure and dose rate data, incorporating consideration of administrative exposure controls, which may vary from organization to organization. Specific example cases are presented and compared to other methodologies for ALARA cost/benefit analysis.
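
    The ARP/AI constructs themselves are specific to NUREG/CR-0446 and are not reproduced here; the sketch below only illustrates the generic comparison underlying any such screen: cost per person-rem averted against an organizational reference value. All names and numbers are hypothetical.

```python
# Generic ALARA cost-benefit screen (illustrative, NOT the ARP/AI
# procedure itself): an expenditure passes when its cost per person-rem
# averted is at or below the organization's reference value.
def cost_per_person_rem(cost_usd, dose_averted_person_rem):
    return cost_usd / dose_averted_person_rem

REFERENCE_VALUE = 2000.0  # $/person-rem: a policy choice, varies by organization

proposals = {                               # (cost $, person-rem averted)
    "extra shielding":   (50_000.0, 40.0),
    "remote tooling":    (120_000.0, 25.0),
    "decon before work": (8_000.0, 6.0),
}
for name, (cost, averted) in proposals.items():
    ratio = cost_per_person_rem(cost, averted)
    verdict = "justified" if ratio <= REFERENCE_VALUE else "not justified"
    print(f"{name}: ${ratio:.0f}/person-rem -> {verdict}")
```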

  17. Preliminary CFD analysis methodology for flow in a LFR fuel assembly

    International Nuclear Information System (INIS)

    Catana, A.; Ioan, M.; Serbanel, M.

    2013-01-01

    In this paper a preliminary Computational Fluid Dynamics (CFD) analysis was performed in order to set up a methodology to be used for more complex coolant flow analysis inside the ALFRED nuclear reactor fuel assembly. The core contains 171 separate fuel assemblies, each consisting of a hexagonal array of 127 fuel rods. Three honeycomb spacer grids are proposed along the fuel rods with the aim of keeping the flow geometry intact during reactor operation. The main goal of this paper is to compute several hydraulic parameters: pressure, velocity, wall shear stress, and turbulence parameters, with and without spacer grids. In this analysis we consider an adiabatic case; no heat transfer is considered so far, but we pave the road toward more complex thermal-hydraulic analysis for ALFRED (and LFRs in general). The CAELINUX CFD distribution was used with its main components: Salome-Meca (for geometry and mesh) and Code_Saturne as the single-phase CFD solver. The ParaView and VisIt postprocessors were used for data extraction and graphical displays. (authors)

  18. Methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006.

    Science.gov (United States)

    Rodríguez-Ramírez, Sonia; Mundo-Rosas, Verónica; Jiménez-Aguilar, Alejandra; Shamah-Levy, Teresa

    2009-01-01

    To describe the methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006 (ENSANUT 2006) carried out in Mexico. Dietary data from the population who participated in the ENSANUT 2006 were collected through a 7-day food-frequency questionnaire. Energy and nutrient intake of each food consumed and adequacy percentage by day were also estimated. Intakes and adequacy percentages > 5 SDs from the energy and nutrient general distribution and observations with energy adequacy percentages < 25% were excluded from the analysis. Valid dietary data were obtained from 3552 children aged 1 to 4 years, 8716 children aged 5 to 11 years, 8442 adolescents, 15951 adults, and 3357 older adults. It is important to detail the methodology for the analysis of dietary data to standardize data cleaning criteria and to be able to compare the results of different studies.
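
    The stated exclusion rules (observations more than 5 SDs above the energy distribution, and energy adequacy below 25%) reduce to a simple filter. The sketch below uses synthetic numbers; the survey's actual per-nutrient distributions and cut-offs are richer than this.

```python
import numpy as np

# Sketch of the cleaning rules described above, on synthetic intake data:
# drop records > 5 SD above the energy distribution, and records with
# energy adequacy < 25% of the requirement.
rng = np.random.default_rng(0)
energy = rng.normal(2000, 400, 1000)      # kcal/day, synthetic
requirement = 2100.0                      # kcal/day, toy requirement
adequacy = energy / requirement * 100.0   # adequacy percentage

upper = energy.mean() + 5 * energy.std()
keep = (energy <= upper) & (adequacy >= 25)
print(int(keep.sum()), "of", len(energy), "records retained")
```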

  19. Procedures for treating common cause failures in safety and reliability studies: Volume 2, Analytic background and techniques: Final report

    International Nuclear Information System (INIS)

    Mosleh, A.; Fleming, K.N.; Parry, G.W.; Paula, H.M.; Worledge, D.H.; Rasmuson, D.M.

    1988-12-01

    This report presents a framework for the inclusion of the impact of common cause failures in risk and reliability evaluations. Common cause failures are defined as that subset of dependent failures for which causes are not explicitly included in the logic model as basic events. The emphasis here is on providing procedures for a practical, systematic approach that can be used to perform and clearly document the analysis. The framework and the methods discussed for performing the different stages of the analysis integrate insights obtained from engineering assessments of the system and the historical evidence from multiple failure events into a systematic, reproducible, and defensible analysis. This document, Volume 2, contains a series of appendices that provide additional background and methodological detail on several important topics discussed in Volume 1
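
    As a concrete illustration of why common cause failures dominate redundant-system risk, the snippet below uses the simple beta-factor model, one member of the parametric family treated in CCF frameworks of this kind; the lambda and beta values are illustrative, not taken from the report.

```python
import math

# Beta-factor common-cause sketch: a fraction beta of the component
# failure rate is assumed to fail all redundant trains simultaneously.
def component_failure_rates(total_rate, beta):
    """Split a failure rate into (independent, common-cause) parts."""
    return (1.0 - beta) * total_rate, beta * total_rate

lam, beta, mission_t = 1e-3, 0.1, 100.0   # failures/h, beta factor, hours
ind, ccf = component_failure_rates(lam, beta)

# Two redundant trains: both must fail independently, or one shared
# common-cause event fails both at once. Even at beta = 0.1 the CCF
# term exceeds the double-independent term for this mission time.
p_both_independent = (1 - math.exp(-ind * mission_t)) ** 2
p_common_cause = 1 - math.exp(-ccf * mission_t)
print(f"{p_both_independent:.4f} vs {p_common_cause:.4f}")
```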

  20. Cost-volume-profit and net present value analysis of health information systems.

    Science.gov (United States)

    McLean, R A

    1998-08-01

    The adoption of any information system should be justified by an economic analysis demonstrating that its projected benefits outweigh its projected costs. Analysts differ, however, on which method to employ for such a justification. Accountants prefer cost-volume-profit analysis, and economists prefer net present value analysis. The article explains the strengths and weaknesses of each method and shows how they can be used together so that well-informed investments in information systems can be made.
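
    The two methods contrasted above each reduce to a short formula, sketched side by side below with invented numbers for a hypothetical information system.

```python
# Cost-volume-profit: the breakeven volume where per-unit contribution
# (price - variable cost) covers the fixed cost.
def breakeven_volume(fixed_cost, price, variable_cost):
    return fixed_cost / (price - variable_cost)

# Net present value: discounted sum of cash flows; cashflows[0] is the
# time-0 outlay (negative).
def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical system: $500k outlay, $25 revenue and $5 variable cost
# per exam, $180k net annual benefit over four years at 8%.
print(breakeven_volume(500_000, 25.0, 5.0))   # 25000.0 exams
print(round(npv(0.08, [-500_000, 180_000, 180_000, 180_000, 180_000]), 2))
```

    A positive NPV and an achievable breakeven volume together make a stronger case than either method alone, which is the article's point about using them in tandem.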

  1. Methodology and boundary conditions applied to the analysis on internal flooding for Kozloduy NPP units 5 and 6

    International Nuclear Information System (INIS)

    Demireva, E.; Goranov, S.; Horstmann, R.

    2004-01-01

    Within the Modernization Program of Units 5 and 6 of Kozloduy NPP, a comprehensive analysis of internal flooding has been carried out for the reactor building outside the containment and for the turbine hall by FRAMATOME ANP and ENPRO Consult. The objective of this presentation is to provide information on the applied methodology and boundary conditions. A separate report called 'Methodology and boundary conditions' has been elaborated in order to provide the foundation for the study. The methodology report provides definitions and advice for the following topics: scope of the study; safety objectives; basic assumptions and postulates (plant conditions, grace periods for manual actions, single failure postulate, etc.); sources of flooding (postulated piping leaks and ruptures, malfunctions and personnel error); main activities of the flooding analysis; study conclusions and suggestions of remedial measures. (authors)

  2. A long-scale biodiversity monitoring methodology for Spanish national forest inventory. Application to Álava region

    Directory of Open Access Journals (Sweden)

    Iciar Alberdi

    2014-04-01

    Aim of study: In this study, a methodology has been designed to assess biodiversity in the frame of the Spanish National Forest Inventory with the aim of evaluating the conservation status of Spanish forests and their future evolution. This methodology takes into account the different national and international initiatives together with the different types and characteristics of forests in Spain. Area of study: Álava province (Basque Country, Spain). Material and methods: To analyse the contribution of each of the different indices to the biodiversity assessment, a statistical analysis using PCA multivariate techniques was performed for structure, composition and dead wood indicators. Main results: The selected biodiversity indicators (based on field measurements) are presented along with an analysis of the results from four representative forest types in Álava, by way of an example of the potential of this methodology. Research highlights: The statistical analysis revealed the important information contribution of the Mingling index to the composition indicators. Regarding the structure indicators, the standard deviation and skewness of height and diameter are of particular interest. Finally, it is worth pointing out the value of assessing dead saplings, since they provide additional information and their volume is a particularly useful parameter for analysing the success of regeneration. Keywords: species richness; structural diversity; dead wood; NFI; PCA.
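
    The PCA step described under Material and methods can be sketched on a small, made-up plot-by-indicator matrix. The indicator columns below (richness, mingling, height SD, diameter skewness, dead-wood volume) and their values are invented for illustration, not the Álava data.

```python
import numpy as np

# PCA via the eigendecomposition of the correlation matrix of
# standardized biodiversity indicators (synthetic plot data).
X = np.array([
    [ 8, 0.62, 4.1, 0.3, 12.0],
    [ 5, 0.41, 2.8, 0.9,  3.5],
    [11, 0.75, 5.6, 0.1, 20.0],
    [ 6, 0.50, 3.0, 0.7,  6.0],
    [ 9, 0.66, 4.8, 0.2, 15.0],
])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each indicator
cov = np.cov(Xs, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)[::-1]        # eigenvalues, descending
explained = eigvals / eigvals.sum()
print(np.round(explained, 2))                  # PC1 dominates these toy data
```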

  3. Rectal cancer surgery: volume-outcome analysis.

    LENUS (Irish Health Repository)

    Nugent, Emmeline

    2010-12-01

    There is strong evidence supporting the importance of the volume-outcome relationship with respect to lung and pancreatic cancers. This relationship for rectal cancer surgery however remains unclear. We review the currently available literature to assess the evidence base for volume outcome in relation to rectal cancer surgery.

  4. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    Shen, S.-H.; Smidts, C.; Mosleh, A.

    1997-01-01

    This paper presents a model-based human error taxonomy and data collection. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of a nuclear power plant operating crew during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem solving strategies'; and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics that extract generic characteristics of error-prone behaviors and error-prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures, with associated training, to minimize or preclude certain human errors. It also helps in the design of control rooms, and in the assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  5. The Application of Best Estimate and Uncertainty Analysis Methodology to Large LOCA Power Pulse in a CANDU 6 Reactor

    International Nuclear Information System (INIS)

    Abdul-Razzak, A.; Zhang, J.; Sills, H.E.; Flatt, L.; Jenkins, D.; Wallace, D.J.; Popov, N.

    2002-01-01

    The paper briefly describes a best estimate plus uncertainty analysis (BE+UA) methodology and presents its prototyping application to the power pulse phase of a limiting large Loss-of-Coolant Accident (LOCA) for a CANDU 6 reactor fuelled with CANFLEX fuel. The methodology is consistent with and builds on world practice. The analysis is divided into two phases to focus on the dominant parameters for each phase and to allow for the consideration of all identified highly ranked parameters in the statistical analysis and response surface fits for margin parameters. The objective of this analysis is to quantify improvements in predicted safety margins under best estimate conditions. (authors)
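
    The statistical step in BE+UA methodologies commonly rests on Wilks' order-statistics result (standard BEPU practice, not a detail taken from this paper): with n independent code runs, the probability that the largest result bounds the 95th percentile of the output is 1 - 0.95^n, and n = 59 is the smallest sample giving 95% confidence.

```python
# Wilks' formula for one-sided first-order tolerance limits: find the
# smallest n with 1 - coverage**n >= confidence.
def wilks_n(coverage=0.95, confidence=0.95):
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_n())   # 59 for the usual 95%/95% criterion
```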

  6. ABC/2 Method Does not Accurately Predict Cerebral Arteriovenous Malformation Volume.

    Science.gov (United States)

    Roark, Christopher; Vadlamudi, Venu; Chaudhary, Neeraj; Gemmete, Joseph J; Seinfeld, Joshua; Thompson, B Gregory; Pandey, Aditya S

    2018-02-01

    Stereotactic radiosurgery (SRS) is a treatment option for cerebral arteriovenous malformations (AVMs) to prevent intracranial hemorrhage. The decision to proceed with SRS is usually based on calculated nidal volume. Physicians commonly use the ABC/2 formula, based on digital subtraction angiography (DSA), when counseling patients for SRS. To determine whether AVM volume calculated using the ABC/2 method on DSA is accurate when compared to the exact volume calculated from thin-cut axial sections used for SRS planning. Retrospective search of a neurovascular database to identify AVMs treated with SRS from 1995 to 2015. Maximum nidal diameters in orthogonal planes on DSA images were recorded to determine volume using the ABC/2 formula. Nidal target volume was extracted from operative reports of SRS. Volumes were then compared using descriptive statistics and paired t-tests. Ninety intracranial AVMs were identified. Median volume was 4.96 cm3 [interquartile range (IQR) 1.79-8.85] with SRS planning methods and 6.07 cm3 (IQR 1.3-13.6) with the ABC/2 methodology. Moderate correlation was seen between SRS and ABC/2 volumes (r = 0.662); the two methods differed significantly (t = -3.2; P = .002). When AVMs were dichotomized based on ABC/2 volume, significant differences remained (t = 3.1, P = .003 for ABC/2 volume ≤ 7 cm3, with a significant difference also for ABC/2 volume > 7 cm3). The ABC/2 method overestimates cerebral AVM volume when compared to volumetric analysis from SRS planning software. For AVMs > 7 cm3, the overestimation is even greater. SRS planning techniques were also significantly different from values derived from equations for cones and cylinders. Copyright © 2017 by the Congress of Neurological Surgeons
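
    A quick way to see why ABC/2 is only a shortcut: for a perfect ellipsoid with orthogonal diameters A, B, C the exact volume is πABC/6, and π/6 ≈ 0.524 is approximated by 1/2. A real nidus is irregular and fills only part of that bounding ellipsoid, which is consistent with the overestimation reported above. The diameters below are hypothetical.

```python
import math

def abc_over_2(a, b, c):
    """ABC/2 shortcut applied to DSA diameter measurements."""
    return a * b * c / 2.0

def ellipsoid_volume(a, b, c):
    """Exact ellipsoid volume for orthogonal diameters a, b, c: pi*abc/6."""
    return math.pi * a * b * c / 6.0

a, b, c = 3.0, 2.5, 2.0   # cm, hypothetical nidal diameters
print(round(abc_over_2(a, b, c), 2))        # 7.5
print(round(ellipsoid_volume(a, b, c), 2))  # 7.85
```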

  7. METHODOLOGICAL ANALYSIS OF STUDYING THE PROBLEM OF PERCEPTION IN FUTURE MUSIC TEACHERS’ PROFESSIONAL TRAINING

    Directory of Open Access Journals (Sweden)

    Zhang Bo

    2017-04-01

    Full Text Available The article presents a methodological analysis of the problem of perception in future music teachers' professional training. The author analyses works of outstanding scholars in philosophy, psychology, and art education, and reveals a hierarchical system of options for musical perception. The methodological foundation is supported by modern research in the speciality, the theory and methodology of musical study, which lends the presented material proper form and thoroughness. Drawing on vocal and choral research into forming future music teachers' value-based perception of music art, the author aims to present a methodological analysis of the problem of perception in their professional training. Applying the system approach to forming this value-based perception while students are trained for vocal and choral work with senior pupils extends their artistic awareness and helps them distinguish art works and phenomena, see their properties, and orient themselves in the informative content of musical works. Special attention is paid to the methodological principles of researching the category of perception with respect to the value-based understanding of images in music art works. Analysing scientific sources on voice production, the author finds that perception is closely related to the transformation of external information and underlies the forming of images and the operation of attention, memory, thinking, and emotions. The features of perceiving vocal and choral studies, and their extrapolation by students, are analysed with regard to future professional activity with senior pupils in the aspects of perception and transformation of musical and intonation information, analysis, object perception, and interpretation in accordance with future

  8. A computationally efficient 3D finite-volume scheme for violent liquid–gas sloshing

    CSIR Research Space (South Africa)

    Oxtoby, Oliver F

    2015-10-01

    Full Text Available We describe a semi-implicit volume-of-fluid free-surface-modelling methodology for flow problems involving violent free-surface motion. For efficient computation, a hybrid-unstructured edge-based vertex-centred finite volume discretisation...

  9. A design methodology for unattended monitoring systems

    International Nuclear Information System (INIS)

    SMITH, JAMES D.; DELAND, SHARON M.

    2000-01-01

    The authors presented a high-level methodology for the design of unattended monitoring systems, focusing on a system to detect diversion of nuclear materials from a storage facility. The methodology is composed of seven interrelated analyses: Facility Analysis, Vulnerability Analysis, Threat Assessment, Scenario Assessment, Design Analysis, Conceptual Design, and Performance Assessment. The design of the monitoring system is iteratively improved until it meets a set of pre-established performance criteria. The methodology presented here is based on other, well-established system analysis methodologies and hence they believe it can be adapted to other verification or compliance applications. In order to make this approach more generic, however, there needs to be more work on techniques for establishing evaluation criteria and associated performance metrics. They found that defining general-purpose evaluation criteria for verifying compliance with international agreements was a significant undertaking in itself. They finally focused on diversion of nuclear material in order to simplify the problem so that they could work out an overall approach for the design methodology. However, general guidelines for the development of evaluation criteria are critical for a general-purpose methodology. A poor choice in evaluation criteria could result in a monitoring system design that solves the wrong problem

  10. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.

    Science.gov (United States)

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-10-01

    In this paper a new methodology for the diagnosing of skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis by using filters such as the classic, inverse and k-law nonlinear. The sample images were obtained by a medical specialist and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%.
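    The k-law nonlinear filtering mentioned above raises the Fourier magnitude spectrum to a power k < 1 to emphasise weak spectral components; a hedged sketch of how a scalar spectral index could be derived from it (the normalisation and index definition here are illustrative assumptions, not the authors' exact procedure):

```python
import numpy as np

def klaw_spectral_index(image, k=0.3):
    """Apply a k-law nonlinearity |F|^k to the image spectrum and
    return the fraction of the filtered spectral energy lying outside
    a low-frequency core (an illustrative scalar index in [0, 1])."""
    F = np.fft.fftshift(np.fft.fft2(image.astype(float)))
    mag = np.abs(F) ** k                  # k-law nonlinear filter
    h, w = mag.shape
    cy, cx = h // 2, w // 2
    r = min(h, w) // 8                    # half-width of the core
    core = mag[cy - r:cy + r, cx - r:cx + r].sum()
    return (mag.sum() - core) / mag.sum()

# A complex (speckled) pattern, standing in for a dermatologic spot image.
rng = np.random.default_rng(0)
idx = klaw_spectral_index(rng.random((64, 64)))
```

A diagnostic rule would then compare such an index against a range of indices calibrated on confirmed cancerous and benign spots.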

  11. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  12. A faster reactor transient analysis methodology for PCs

    International Nuclear Information System (INIS)

    Ott, K.O.

    1991-10-01

    The simplified ANL model for LMR transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied, as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes the form of a quadratic equation, the ''quadratic dynamics equation.'' This model forms the basis for the GW-BASIC program LTC (LMR Transient Calculation), which can effectively be run on a PC. The GW-BASIC version of the LTC program is described in detail in Volume 2 of this report
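    The prompt jump approximation described above drops the prompt-neutron time derivative, so the power follows the delayed-neutron precursors algebraically; a minimal one-delayed-group sketch (parameter values are illustrative, not taken from the ANL model):

```python
# One-delayed-group point kinetics under the prompt jump approximation.
# Neglecting the prompt-neutron time derivative (Lambda*dn/dt ~ 0) gives
#   n(t) = lam * C(t) * Lam / (beta - rho(t)),
# coupled with the precursor balance dC/dt = beta*n/Lam - lam*C.
beta, Lam, lam = 0.0065, 1.0e-3, 0.08   # illustrative kinetics parameters

def step(n, C, rho, dt):
    """Advance one time step: explicit precursor update, then the
    prompt-jump algebraic relation for the power n."""
    C = C + dt * (beta * n / Lam - lam * C)
    n = lam * C * Lam / (beta - rho)
    return n, C

# At rho = 0 the steady state C = beta*n/(lam*Lam) keeps n constant.
n, C = 1.0, beta / (lam * Lam)
for _ in range(1000):
    n, C = step(n, C, 0.0, 1.0e-3)

# A small positive (sub-prompt-critical) reactivity makes power rise.
n2, C2 = 1.0, beta / (lam * Lam)
for _ in range(1000):
    n2, C2 = step(n2, C2, 0.002, 1.0e-3)
```

The approximation is valid only while rho stays well below beta, which is exactly the regime the abstract describes.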

  13. Petroleum supply annual 1998: Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1998 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. This second volume contains final statistics for each month of 1998, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs.

  14. Petroleum supply annual, 1997. Volume 2

    International Nuclear Information System (INIS)

    1998-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1997 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. The second volume contains final statistics for each month of 1997, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs

  15. Petroleum supply annual 1996: Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1996 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Capacity; each with final annual data. The second volume contains final statistics for each month of 1996, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs.

  16. Petroleum supply annual, 1997. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1997 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. The second volume contains final statistics for each month of 1997, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs.

  17. Petroleum supply annual 1994, Volume 2

    International Nuclear Information System (INIS)

    1995-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1994 through annual and monthly surveys. The PSA is divided into two volumes. This first volume contains four sections: Summary Statistics, Detailed Statistics, Refinery Capacity, and Oxygenate Capacity each with final annual data. The second volume contains final statistics for each month of 1994, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary

  18. Petroleum supply annual 1996: Volume 2

    International Nuclear Information System (INIS)

    1997-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1996 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Capacity; each with final annual data. The second volume contains final statistics for each month of 1996, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs

  19. Petroleum supply annual 1995: Volume 2

    International Nuclear Information System (INIS)

    1996-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1995 through monthly surveys. The PSA is divided into two volumes. This first volume contains three sections: Summary Statistics, Detailed Statistics, and selected Refinery Statistics each with final annual data. The second volume contains final statistics for each month of 1995, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary

  20. Petroleum supply annual 1998. Volume 2

    International Nuclear Information System (INIS)

    1999-06-01

    The Petroleum Supply Annual (PSA) contains information on the supply and disposition of crude oil and petroleum products. The publication reflects data that were collected from the petroleum industry during 1998 through monthly surveys. The PSA is divided into two volumes. The first volume contains three sections: Summary Statistics, Detailed Statistics, and Refinery Statistics; each with final annual data. This second volume contains final statistics for each month of 1998, and replaces data previously published in the Petroleum Supply Monthly (PSM). The tables in Volumes 1 and 2 are similarly numbered to facilitate comparison between them. Explanatory Notes, located at the end of this publication, present information describing data collection, sources, estimation methodology, data quality control procedures, modifications to reporting requirements and interpretation of tables. Industry terminology and product definitions are listed alphabetically in the Glossary. 35 tabs

  1. Evaluation of safety assessment methodologies in Rocky Flats Risk Assessment Guide (1985) and Building 707 Final Safety Analysis Report (1987)

    International Nuclear Information System (INIS)

    Walsh, B.; Fisher, C.; Zigler, G.; Clark, R.A.

    1990-01-01

    FSARs. Rockwell International, as operating contractor at the Rocky Flats plant, conducted a safety analysis program during the 1980s. That effort resulted in Final Safety Analysis Reports (FSARs) for several buildings, among them the Building 707 Final Safety Analysis Report, June 1987 (707FSAR), as well as a Plant Safety Analysis Report. The Rocky Flats Risk Assessment Guide, March 1985 (RFRAG85), documents the methodologies that were used for those FSARs. Resources available for preparation of those Rocky Flats FSARs were very limited. After addressing the more pressing safety issues, some of which are described below, the present contractor (EG&G) intends to conduct a program of upgrading the FSARs. This report presents the results of a review of the methodologies described in RFRAG85 and 707FSAR and contains suggestions that might be incorporated into the methodology for the FSAR upgrade effort

  2. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej

    2018-02-15

    Software-Defined Networking (SDN) and OpenFlow are actively being standardized and deployed. These deployments rely on switches that come from various vendors and differ in terms of performance and available features. Understanding these differences and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a stream of rule updates while observing both the control plane view as reported by the switch and the probed data plane state, and determines switch characteristics by comparing the two views. We measure, report and explain the performance characteristics of flow table updates in six hardware OpenFlow switches. Our results describing rule update rates can help SDN designers make their controllers efficient. Further, we also highlight differences between the OpenFlow specification and its implementations that, if ignored, pose a serious threat to network security and correctness.

  3. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  4. Legal basis for risk analysis methodology while ensuring food safety in the Eurasian Economic union and the Republic of Belarus

    Directory of Open Access Journals (Sweden)

    E.V. Fedorenko

    2015-09-01

    Full Text Available Health risk analysis methodology is an internationally recognized tool for ensuring food safety. Its three main elements, risk assessment, risk management and risk communication to inform interested parties about the risk, are legislated and implemented in the Eurasian Economic Union and the Republic of Belarus. There is a corresponding organizational and functional framework for applying risk analysis methodology both in justifying production safety indicators and in public health surveillance. Common methodological approaches and criteria for evaluating public health risk have been determined and are used in developing and applying food safety requirements. Risk assessment can be used to justify safety indicators (contaminants, food additives) and to evaluate the effectiveness of programs for enriching food with micronutrients.

  5. Shrinkage Analysis on Thick Plate Part using Response Surface Methodology (RSM

    Directory of Open Access Journals (Sweden)

    Isafiq M.

    2016-01-01

    Full Text Available The work reported herein analyses the quality (shrinkage) of a thick plate part using Response Surface Methodology (RSM). Previous research showed that the most influential factors affecting shrinkage of moulded parts are mould and melt temperature. Autodesk Moldflow Insight software was used for the analysis, while the specifications of a Nessei NEX 1000 injection moulding machine and P20 mould material were incorporated in this study, with Acrylonitrile Butadiene Styrene (ABS) as the moulded thermoplastic material. Mould temperature, melt temperature, packing pressure and packing time were selected as variable parameters. The results show that shrinkage improved by 42.48% and 14.41% in the parallel and normal directions respectively after the optimisation process.
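    Response Surface Methodology as used above fits a low-order polynomial to responses measured at designed parameter settings and then optimises on the fitted surface; a generic single-factor sketch with synthetic data (not the study's actual shrinkage results):

```python
import numpy as np

# Synthetic "shrinkage" response with a known quadratic law, standing in
# for simulation results at coded melt-temperature settings.
x = np.linspace(-1.0, 1.0, 9)                 # coded factor levels
y = 2.0 + 0.5 * x + 1.5 * x**2                # true response, noise-free

# Fit y ~ b0 + b1*x + b2*x^2 by least squares (the RSM model form).
X = np.column_stack([np.ones_like(x), x, x**2])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

# The stationary point of the fitted surface is the candidate optimum.
x_opt = -b1 / (2.0 * b2)
```

In a real study the same fit is done over several factors (mould temperature, melt temperature, packing pressure, packing time) with interaction terms, and the optimum is taken from the fitted multidimensional surface.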

  6. Application of NASA Kennedy Space Center system assurance analysis methodology to nuclear power plant systems designs

    International Nuclear Information System (INIS)

    Page, D.W.

    1985-01-01

    The Kennedy Space Center (KSC) entered into an agreement with the Nuclear Regulatory Commission (NRC) to conduct a study to demonstrate the feasibility and practicality of applying the KSC System Assurance Analysis (SAA) methodology to nuclear power plant systems designs. In joint meetings of KSC and Duke Power personnel, an agreement was made to select two CATAWBA systems, the Containment Spray System and the Residual Heat Removal System, for the analyses. Duke Power provided KSC with a full set of Final Safety Analysis Reports as well as schematics for the two systems. During Phase I of the study the reliability analyses of the SAA were performed. During Phase II the hazard analyses were performed. The final product of Phase II is a handbook for implementing the SAA methodology into nuclear power plant systems designs. The purpose of this paper is to describe the SAA methodology as it applies to nuclear power plant systems designs and to discuss the feasibility of its application. The conclusion is drawn that nuclear power plant systems and aerospace ground support systems are similar in complexity and design and share common safety and reliability goals. The SAA methodology is readily adaptable to nuclear power plant designs because of its practical application of existing and well known safety and reliability analytical techniques tied to an effective management information system

  7. Three-Dimensional Echocardiography-Derived Non-Invasive Right Ventricular Pressure-Volume Analysis.

    Science.gov (United States)

    Huang, Kuan-Chih; Lin, Lian-Yu; Hwang, Juey-Jen; Lin, Lung-Chun

    2017-09-01

    In patients with pulmonary hypertension, repeated evaluations of right ventricular (RV) function are still required for clinical decision making, but the invasive nature of current pressure-volume analysis makes regular follow-up in a clinical setting infeasible. We enrolled 12 patients with pulmonary arterial hypertension (PAH) and 10 with pulmonary venous hypertension (PVH) from May 2016 to October 2016. All patients underwent a clinically indicated right heart catheterization (RHC), from which the right ventricular pressure recordings were conjugated with RV volume from 3-D echocardiography to generate a pressure-volume loop. A continuous-wave Doppler envelope of tricuspid regurgitation was transformed into a pressure gradient recording by the simplified Bernoulli equation, and a systolic pressure gradient-volume (PG-V) diagram was generated by similar methods. The area enclosed by the pressure-volume loop was calculated to represent semi-invasive right ventricular stroke work (RVSW_RHC), and the area between the PG-V diagram and the x-axis was calculated to estimate non-invasive RVSW (RVSW_echo). Patients with PAH had higher RV pressure, lower pulmonary arterial wedge pressure and larger RV volume, the latter contributed by dilation of the RV mid-cavity minor dimension. We found no significant difference in traditional parameters between these two groups, but RVSW values were significantly higher in PAH patients. The RVSW values of the two methods were significantly correlated by the equation RVSW_echo = 0.8447 RVSW_RHC + 129.38 (R² = 0.9151). All rights reserved.
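    The two ingredients above, the simplified Bernoulli conversion of the tricuspid-regurgitation velocity and the area of the pressure-volume loop, can be sketched generically (the loop vertices below are synthetic, not patient data):

```python
def bernoulli_gradient(v_mps):
    """Simplified Bernoulli equation: delta-P [mmHg] = 4 * v^2,
    with v the regurgitant jet velocity in m/s."""
    return 4.0 * v_mps ** 2

def loop_area(points):
    """Shoelace area of a closed pressure-volume loop given as
    (volume, pressure) vertices; |area| estimates stroke work
    in mmHg*mL."""
    s = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# A 3 m/s TR jet implies a 36 mmHg systolic RV-RA pressure gradient.
grad = bernoulli_gradient(3.0)
# Idealised rectangular loop: 60 mL stroke volume at a 30 mmHg net
# pressure difference -> 1800 mmHg*mL of stroke work.
work = loop_area([(50.0, 5.0), (110.0, 5.0), (110.0, 35.0), (50.0, 35.0)])
```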

  8. Current self-reported symptoms of attention deficit/hyperactivity disorder are associated with total brain volume in healthy adults.

    Directory of Open Access Journals (Sweden)

    Martine Hoogman

    Full Text Available BACKGROUND: Reduced total brain volume is a consistent finding in children with Attention Deficit/Hyperactivity Disorder (ADHD). In order to get a better understanding of the neurobiology of ADHD, we take the first step in studying the dimensionality of current self-reported adult ADHD symptoms by looking at their relation with total brain volume. METHODOLOGY/PRINCIPAL FINDINGS: In a sample of 652 highly educated adults, the association between total brain volume, assessed with magnetic resonance imaging, and the current number of self-reported ADHD symptoms was studied. The results showed an association between these self-reported ADHD symptoms and total brain volume. Post-hoc analysis revealed that the symptom domain of inattention had the strongest association with total brain volume. In addition, the threshold for impairment coincides with the threshold for brain volume reduction. CONCLUSIONS/SIGNIFICANCE: This finding improves our understanding of the biological substrates of self-reported ADHD symptoms, and suggests total brain volume as a target intermediate phenotype for future gene-finding in ADHD.

  9. Alternative occupied volume integrity (OVI) tests and analyses.

    Science.gov (United States)

    2013-10-01

    FRA, supported by the Volpe Center, conducted research on alternative methods of evaluating occupied volume integrity (OVI) in passenger railcars. Guided by this research, an alternative methodology for evaluating OVI that ensures an equivalent or gr...

  10. CANDU safety analysis system establishment; development of trip coverage and multi-dimensional hydrogen analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H. [KOPEC, Taejon (Korea)

    2002-03-01

    The trip coverage analysis model requires the geometry network for the primary and secondary circuits as well as the plant control system, so as to simulate all possible plant operating conditions throughout the plant life. The model was validated against power maneuvering and the Wolsong 4 commissioning test. The trip coverage map was produced for the large-break loss-of-coolant accident and the complete loss of Class IV power event. Reliable multi-dimensional hydrogen analysis requires a strong thermal-hydraulic modelling capability. To acquire this basic capability and verify the applicability of the GOTHIC code, assessments of the heat transfer model and the hydrogen mixing and combustion models were performed. An assessment methodology for flame acceleration and deflagration-to-detonation transition is also established. 22 refs., 120 figs., 31 tabs. (Author)

  11. Comparative analysis between three different methodologies for design of MSE walls: FHWA NHI-10-024, BS 8006 and EBGEO

    International Nuclear Information System (INIS)

    Galindo Mondragon, A.

    2014-01-01

    This document reflects current practice in the design of MSE walls using partial coefficients. A deep comparison between three of the most widely applied methodologies for the design of this type of structure has been carried out (Galindo, 2012). In the study, almost all the limit states involved in external and internal analyses were examined. The methodologies under study are FHWA NHI-10-024 (2009), BS 8006 (2010) and EBGEO (2010), used in the United States, Great Britain and Germany, respectively. As a complement to the analysis, the results of two examples developed with the three methodologies are presented, showing a tendency toward more conservative wall designs with EBGEO and BS 8006 in comparison with FHWA. (Author)

  12. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  13. Methodology for repeated load analysis of composite structures with embedded magnetic microwires

    Directory of Open Access Journals (Sweden)

    K. Semrád

    2017-01-01

    Full Text Available The article addresses the strength of cyclically loaded composite structures with the possibility of contactless stress measurement inside the material. For this purpose a contactless tensile stress sensor, using an improved induction principle based on magnetic microwires embedded in the composite structure, has been developed. A methodology based on the E-N approach was applied for the analysis of the repeated loading of the wing hinge connection, including finite element method (FEM) fatigue strength analysis. The results proved that composites, in comparison with metal structures, offer significant weight reduction for small aircraft construction, while the required strength, stability and lifetime of the components are retained.
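    The E-N (strain-life) approach mentioned above relates local strain amplitude to fatigue life through the Coffin-Manson relation; a hedged sketch solving it numerically for life (the material constants below are generic illustrative values, not those of the hinge material):

```python
def strain_amplitude(reversals, E=70e3, sf=900.0, b=-0.1, ef=0.3, c=-0.6):
    """Coffin-Manson strain-life relation:
    ea = (sf/E)*(2N)^b + ef*(2N)^c, with 'reversals' = 2N,
    sf/ef the fatigue strength/ductility coefficients, b/c the
    corresponding exponents, E in MPa."""
    return (sf / E) * reversals ** b + ef * reversals ** c

def life_reversals(ea_target, lo=1.0, hi=1e9):
    """Bisect (in log space) for the number of reversals giving strain
    amplitude ea_target; valid because ea is monotone decreasing."""
    mid = lo
    for _ in range(200):
        mid = (lo * hi) ** 0.5
        if strain_amplitude(mid) > ea_target:
            lo = mid      # amplitude too high -> life is longer
        else:
            hi = mid
    return mid
```

A repeated-load analysis then counts strain cycles at the hinge (e.g. by rainflow counting) and accumulates damage per cycle as 1/N from this curve.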

  14. 3-Dimensional Methodology for the Control Rod Ejection Accident Analysis Using UNICORN™

    International Nuclear Information System (INIS)

    Jang, Chan-su; Um, Kil-sup; Ahn, Dawk-hwan; Kim, Yo-han; Sung, Chang-kyung; Song, Jae-seung

    2006-01-01

    The control rod ejection accident has been analyzed with the STRIKIN-II code using the point kinetics model coupled with conservative factors to address three-dimensional aspects. This may result in a severe transient with very high fuel enthalpy deposition. KNFC, with the support of KEPRI and KAERI, is developing a 3-dimensional methodology for rod ejection accident analysis using UNICORN™ (Unified Code of RETRAN, TORC and MASTER). For this purpose, the 3-dimensional MASTER-TORC codes, which have been combined through a dynamic-link library by KAERI, are used in the transient analysis of the core, and the RETRAN code is used to estimate the enthalpy deposition in the hot rod

  15. A comparative analysis of methodology for inventory of greenhouse gases emissions - IPCC and CORINAIR

    International Nuclear Information System (INIS)

    Vasilev, Kh.

    1998-01-01

    The inventory of greenhouse gases (GHG) is performed by two accepted methods: CORINAIR (of the EU) and IPCC (of the UN Intergovernmental Panel on Climate Change). The first is applied only in European countries; the second covers GHG emissions worldwide. The versions IPCC-95 and CORINAIR94 are compared from a theoretical and methodological point of view. In Bulgaria the version CORINAIR95 has not yet been applied, and the inventory analysis for 1994 uses CORINAIR90. The emissions of the main GHG and precursor gases are compared, and the main elements of the inventory are analysed, taking into account the values recommended by CORINAIR94. A table of correspondence between the two methods is used, and the differences concerning transport vehicles are also taken into account. Differences between the two methods are noted in the following directions: the nomenclature of the activities emitting GHG; the organization of the inventory guides; and the kinds of activities and technologies included. Qualitative comparisons are made separately for the energy sector and for industry. The results show very large differences in the volume of the emitted GHG; the reasons can be classified as methodological differences and differences in the kind and values of the emission coefficients. To determine the latter, standard values for Eastern Europe from the IPCC guide have been applied, as well as data from experimental investigations; in the CORINAIR method, the CORINAIR90 emission coefficients are used. The differences between the emission coefficients determined by the two methods are as large as a factor of two or more for CO with solid fuels, e.g. in energy production; a factor of three for NOx; and up to a factor of twenty for methane, also with solid fuels. The two methods do not capture the emissions of precursor gases for some industrial processes.
This disadvantage is overcome in IPCC96, and it is necessary to complement the emission coefficients in the database, especially for precursor gases regarding the

  16. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    Science.gov (United States)

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV) infection. Assessment of fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage, fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms are employed for background-tissue separation and for fibrosis detection in liver tissue regions, in the first and third stages of the methodology, respectively. Because the image contains several types of tissue regions (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms are employed to identify liver tissue regions and exclude all other non-liver tissue regions from the CPA computation. For the evaluation of the methodology, 79 liver biopsy images were employed, yielding a 1.31% mean absolute CPA error with a 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid the manual threshold-based and region selection processes widely used in similar approaches in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
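
    The staged pipeline described in this abstract (cluster background from tissue, then detect collagen inside the tissue) can be sketched in miniature. The following is an illustrative toy, not the authors' implementation: the two-cluster 1-D k-means, the intensity convention (bright background, darker tissue, darkest collagen) and the 0.35 collagen cutoff are all assumptions.

```python
import numpy as np

def kmeans_1d(values, iters=20):
    """Two-cluster 1-D k-means, initialised at the min and max intensities."""
    centers = np.array([values.min(), values.max()], dtype=float)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(2):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

def cpa(image, collagen_cutoff=0.35):
    """Collagen proportional area = collagen pixels / tissue pixels.

    Stage 1: separate (bright) background from (darker) tissue by clustering.
    Stage 2: inside the tissue cluster, call the darkest pixels collagen.
    """
    px = image.ravel()
    labels, centers = kmeans_1d(px)
    tissue = px[labels == np.argmin(centers)]    # darker cluster = tissue
    collagen = tissue[tissue < collagen_cutoff]  # hypothetical staining cutoff
    return len(collagen) / len(tissue)

# Synthetic "biopsy": 50 background px (0.9), 40 plain tissue (0.5), 10 collagen (0.2)
img = np.array([0.9] * 50 + [0.5] * 40 + [0.2] * 10).reshape(10, 10)
print(f"CPA = {cpa(img):.0%}")
```

    On this synthetic image the collagen proportional area comes out at 20% by construction (10 collagen pixels out of 50 tissue pixels).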

  17. A rational methodology for the study of foundations for marine structures

    International Nuclear Information System (INIS)

    Mira Mc Willams, P.; Fernandez-Merodo, J. A.; Pastor Perez, M.; Monte Saez, J. L.; Martinez Santamaria, J. M.; Cuellar Mirasol, V.; Martin Baanante, M. E.; Rodriguez Sanchez-Arevalo, I; Lopez Maldonando, J. D.; Tomas Sampedro, A.

    2011-01-01

    A methodology for the study of marine foundations is presented. The response in displacements, stresses and pore water pressures is obtained from a coupled finite element formulation. Loads due to wave action on the foundation are obtained from a volume-of-fluid type fluid–structure interaction numerical model. Additionally, the methodology includes a Generalized Plasticity based constitutive model for granular materials capable of representing the liquefaction phenomena of sands subjected to cyclic loading, such as those frequently appearing in the problems studied. Calibration of this model requires a series of laboratory tests detailed herein. The methodology is applied to the study of the response of a caisson breakwater foundation. (Author) 10 refs.

  18. Measuring service line competitive position. A systematic methodology for hospitals.

    Science.gov (United States)

    Studnicki, J

    1991-01-01

    To mount a broad effort aimed at improving their competitive position for some service or group of services, hospitals have begun to pursue product line management techniques. A few hospitals have even reorganized completely under the product line framework. The benefits include focusing accountability for operations and results, facilitating coordination between departments and functions, stimulating market segmentation, and promoting rigorous examination of new and existing programs. As part of its strategic planning process, a suburban Baltimore hospital developed a product line management methodology with six basic steps: (1) define the service lines (which they did by grouping all existing diagnosis-related groups into 35 service lines), (2) determine the contribution of each service line to total inpatient volume, (3) determine trends in service line volumes (by comparing data over time), (4) derive a useful comparison group (competing hospitals or groups of hospitals with comparable size, scope of services, payer mix, and financial status), (5) review multiple time frames, and (6) summarize the long- and short-term performance of the hospital's service lines to focus further analysis. This type of systematic and disciplined analysis can become part of a permanent strategic intelligence program. When hospitals have such a program in place, their market research, planning, budgeting, and operations will be tied together in a true management decision support system.
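
    Steps 2 and 3 of the methodology above (each line's contribution to total inpatient volume, and its trend over time) reduce to simple arithmetic over discharge counts. The sketch below uses invented service lines and figures, not the Baltimore hospital's data.

```python
def contribution(volumes):
    """Share of total inpatient volume per service line (step 2)."""
    total = sum(volumes.values())
    return {line: count / total for line, count in volumes.items()}

def volume_trend(prev_year, curr_year):
    """Fractional change in volume per service line over time (step 3)."""
    return {line: (curr_year[line] - prev_year[line]) / prev_year[line]
            for line in prev_year}

# Invented discharge counts for two consecutive years
y1990 = {"cardiology": 2400, "orthopedics": 1500, "obstetrics": 2100}
y1991 = {"cardiology": 2640, "orthopedics": 1350, "obstetrics": 2100}

share = contribution(y1991)
trend = volume_trend(y1990, y1991)
print(f"cardiology: {share['cardiology']:.1%} of volume, "
      f"{trend['cardiology']:+.0%} vs prior year")
```

    The same two tables, computed for a comparison group of competing hospitals (step 4), would complete the competitive-position picture the abstract describes.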

  19. Analysis of cold leg LOCA with failed HPSI by means of integrated safety assessment methodology

    International Nuclear Information System (INIS)

    Gonzalez-Cadelo, J.; Queral, C.; Montero-Mayorga, J.

    2014-01-01

    Highlights: • Results of ISA for the considered sequences endorse EOP guidance in an original way. • ISA yields accurate available times for accident management actions. • RCP-trip adequacy and the available time for beginning depressurization are evaluated. • ISA minimizes the need for expert judgment in performing safety assessment. - Abstract: The integrated safety assessment (ISA) methodology, developed by the Spanish Nuclear Safety Council (CSN), has been applied to a thermal–hydraulic analysis of cold leg LOCA sequences with an unavailable High Pressure Injection System in a Westinghouse 3-loop PWR. This analysis has been performed with the TRACE 5.0 patch 1 code. The ISA methodology allows obtaining the Damage Domain (the region of the parameter space where a safety limit is exceeded) as a function of uncertain parameters (break area) and operator actuation times, and provides the analyst with useful information about the impact of these uncertain parameters on safety concerns. In this work two main issues have been analyzed: the effect of reactor coolant pump trip and the available time for the beginning of secondary-side depressurization. The main conclusions are that present Emergency Operating Procedures (EOPs) are adequate for managing this kind of sequence and that the ISA methodology is able to take into account time delays and parameter uncertainties

  20. Warpage analysis on thin shell part using response surface methodology (RSM)

    Science.gov (United States)

    Zulhasif, Z.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    The moulding parameters are optimised to reduce warpage defects using Autodesk Moldflow Insight (AMI) 2012 software. The product is injected using Acrylonitrile-Butadiene-Styrene (ABS) material. The analysis varies the processing parameters of melt temperature, mould temperature, packing pressure and packing time. Design of Experiments (DOE) is integrated to obtain a polynomial model using Response Surface Methodology (RSM). The Glowworm Swarm Optimisation (GSO) method is then used to predict the best combination of parameters to minimise warpage defects and produce high-quality parts.
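
    The RSM step amounts to fitting a second-order polynomial to DOE results and then searching it for the parameter combination with the lowest predicted warpage. The sketch below fits such a surface by ordinary least squares over two of the four factors (melt temperature and packing pressure); the data are synthetic with a known optimum, and the GSO search is replaced by a plain grid search, so it illustrates the idea only.

```python
import numpy as np

def design_matrix(t, p):
    """Second-order (quadratic) RSM model in two factors."""
    return np.column_stack([np.ones_like(t), t, p, t * t, p * p, t * p])

# Synthetic DOE results: warpage (mm) with a known optimum at 230 degC, 80 MPa
T, P = np.meshgrid(np.arange(210, 251, 10), np.arange(60, 101, 10))
t, p = T.ravel().astype(float), P.ravel().astype(float)
warpage = 0.50 + 0.0010 * (t - 230) ** 2 + 0.0004 * (p - 80) ** 2

# Least-squares fit of the polynomial response surface
coef, *_ = np.linalg.lstsq(design_matrix(t, p), warpage, rcond=None)

# Stand-in for the GSO step: dense grid search over the fitted surface
tg, pg = np.meshgrid(np.linspace(210, 250, 401), np.linspace(60, 100, 401))
pred = design_matrix(tg.ravel(), pg.ravel()) @ coef
best = np.argmin(pred)
print(f"predicted optimum: {tg.ravel()[best]:.0f} degC, {pg.ravel()[best]:.0f} MPa")
```

    Because the synthetic data are exactly quadratic, the fitted surface recovers the planted optimum; on real DOE data the fit would be approximate and a metaheuristic such as GSO searches the surface instead of a grid.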

  1. Sample volume and alignment analysis for an optical particle counter sizer, and other applications

    International Nuclear Information System (INIS)

    Holve, D.J.; Davis, G.W.

    1985-01-01

    Optical methods for particle size distribution measurements in practical high temperature environments are approaching feasibility and offer significant advantages over conventional sampling methods. A key requirement of single particle counting techniques is the need to know features of the sample volume intensity distribution which in general are a function of the particle scattering properties and optical system geometry. In addition, the sample volume intensity distribution is sensitive to system alignment and thus calculations of alignment sensitivity are required for assessment of practical alignment tolerances. To this end, an analysis of sample volume characteristics for single particle counters in general has been developed. Results from the theory are compared with experimental measurements and shown to be in good agreement. A parametric sensitivity analysis is performed and a criterion for allowable optical misalignment is derived for conditions where beam steering caused by fluctuating refractive-index gradients is significant
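
    For a counter built around a Gaussian beam, a first-order alignment tolerance follows directly from the beam profile: a transverse offset r reduces the intensity seen at the nominal sample volume by a factor exp(-2r²/w²). The sketch below inverts that relation; the beam waist and the acceptable 10% intensity drop are illustrative assumptions, not values from the paper.

```python
import math

def intensity_ratio(r_um, waist_um):
    """Relative intensity of a Gaussian beam at transverse offset r."""
    return math.exp(-2.0 * (r_um / waist_um) ** 2)

def max_misalignment(waist_um, max_drop):
    """Largest offset keeping the intensity loss below max_drop (inverse of above)."""
    return waist_um * math.sqrt(-0.5 * math.log(1.0 - max_drop))

w = 50.0                           # assumed 1/e^2 beam waist, micrometres
tol = max_misalignment(w, 0.10)    # allow a 10% intensity drop
print(f"alignment tolerance: +/-{tol:.1f} um")
```

    A particle counter sized by absolute scattered intensity would miscount once misalignment exceeds this bound, which is the kind of tolerance criterion the abstract derives.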

  2. A Probabilistic Analysis Methodology and Its Application to A Spent Fuel Pool System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyowon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of); Ryu, Ho G. [Daedeok R and D Center, Daejeon (Korea, Republic of)

    2013-05-15

    A similar accident occurred at the 2nd unit of the PAKS nuclear power station in Hungary on 10 April 2003, where insufficient cooling caused the spent fuel to burn up or partly melt. Many previous studies have analyzed and measured the risk of spent fuel damage. In the 1980s, conditions changed with the development of high-density storage racks and new information concerning the possibility of cladding fires in drained spent fuel pools, and the US NRC assessed the spent fuel pool risk under Generic Issue 82. In the 1990s, under US NRC sponsorship, a risk assessment of the spent fuel pool at the Susquehanna Steam Electric Station (SSES) was performed, and the Analysis and Evaluation of Operational Data (AEOD) office was organized for accumulating reliability data. A methodology for assessing the risk associated with the spent fuel pool facility has been developed and is applied to the reference plant. It is shown that the methodology developed in this study can contribute to assessing these kinds of SFP facilities. In this probabilistic risk analysis, the LINV initiating event has the highest frequency of occurrence, and the most dominant cut-sets include human errors. The result of this analysis can contribute to identifying weaknesses in the preventive and mitigating systems of the SFP facility.

  3. Volume of the adrenal and pituitary glands in depression

    DEFF Research Database (Denmark)

    Kessing, Lars Vedel; Willer, Inge Stoel; Knorr, Ulla

    2011-01-01

    Numerous studies have shown that the hypothalamic-pituitary-adrenal (HPA) axis is hyperactive in some depressed patients. It is unclear whether such hyperactivity results in changed volumes of the adrenal glands, pituitary gland and hypothalamus. We systematically reviewed all controlled studies on adrenal, pituitary or hypothalamus volume in unipolar depressive disorder published in PubMed from 1966 to December 2009. We identified three studies that investigated the volume of the adrenal glands and eight studies that examined the volume of the pituitary gland, but no studies on the hypothalamus were found. Two out of three studies found a statistically significant increase in adrenal volume in patients compared to controls. Four out of eight studies found a statistically significant increase in pituitary volume in patients compared to controls. Different methodological problems were...

  4. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.

  5. Viscous wing theory development. Volume 1: Analysis, method and results

    Science.gov (United States)

    Chow, R. R.; Melnik, R. E.; Marconi, F.; Steinhoff, J.

    1986-01-01

    Viscous transonic flows at large Reynolds numbers over 3-D wings were analyzed using a zonal viscid-inviscid interaction approach. A new numerical AFZ scheme was developed in conjunction with the finite volume formulation for the solution of the inviscid full-potential equation. A special far-field asymptotic boundary condition was developed and a second-order artificial viscosity included for an improved inviscid solution methodology. The integral method was used for the laminar/turbulent boundary layer and 3-D viscous wake calculation. The interaction calculation included the coupling conditions of the source flux due to the wing surface boundary layer, the flux jump due to the viscous wake, and the wake curvature effect. A method was also devised incorporating the 2-D trailing edge strong interaction solution for the normal pressure correction near the trailing edge region. A fully automated computer program was developed to perform the proposed method with one scalar version to be used on an IBM-3081 and two vectorized versions on Cray-1 and Cyber-205 computers.

  6. Market projections of cellulose nanomaterial-enabled products-- Part 2: Volume estimates

    Science.gov (United States)

    John Cowie; E.M. (Ted) Bilek; Theodore H. Wegner; Jo Anne Shatkin

    2014-01-01

    Nanocellulose has enormous potential to provide an important materials platform in numerous product sectors. This study builds on previous work by the same authors in which likely high-volume, low-volume, and novel applications for cellulosic nanomaterials were identified. In particular, this study creates a transparent methodology and estimates the potential annual...

  7. Conceptual foresight of the volumes of postal money orders in the Republic of Komi

    OpenAIRE

    Lyubov Kuratova

    2012-01-01

    This paper describes a methodology elaborated for forecasting the volume of postal services on the basis of statistical regression analysis, using the example of the Republic of Komi. A statistical regression model of the market of postal money orders of the Komi Republic over the period 2005–2010 is constructed and used to investigate the influence of internal and external factors on that market. The conceptual foresight of development of the ...
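
    The forecasting approach the abstract describes (regression on historical service volumes) can be illustrated with a one-variable least-squares trend. The volume figures below are invented for illustration, not the Komi data.

```python
def linfit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

years = [2005, 2006, 2007, 2008, 2009, 2010]
orders = [410, 432, 455, 470, 498, 515]   # thousands of money orders (invented)

slope, intercept = linfit(years, orders)
forecast_2011 = slope * 2011 + intercept
print(f"trend: {slope:.1f} thousand/yr; 2011 forecast: {forecast_2011:.0f} thousand")
```

    A multi-factor version of the same fit, with socio-economic indicators as additional regressors, is closer to the model the paper builds.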

  8. Methodology for systematic analysis and improvement of manufacturing unit process life-cycle inventory (UPLCI)—CO2PE! initiative (cooperative effort on process emissions in manufacturing). Part 1: Methodology description

    DEFF Research Database (Denmark)

    Kellens, Karel; Dewulf, Wim; Overcash, Michael

    2012-01-01

    This report proposes a life-cycle analysis (LCA)-oriented methodology for systematic inventory analysis of the use phase of manufacturing unit processes, providing unit process datasets to be used in life-cycle inventory (LCI) databases and libraries. The methodology has been developed... the provision of high-quality data for LCA studies of products using these unit process datasets for the manufacturing processes, as well as the in-depth analysis of individual manufacturing unit processes. In addition, the accruing availability of data for a range of similar machines (same process, different...)... and resource efficiency improvements of the manufacturing unit process. To ensure optimal reproducibility and applicability, documentation guidelines for data and metadata are included in both approaches. Guidance on definition of functional unit and reference flow as well as on determination of system...

  9. Application of Agent Methodology in Healthcare Information Systems

    Directory of Open Access Journals (Sweden)

    Reem Abdalla

    2017-02-01

    This paper presents a case study describing the features and phases of two agent methodologies: the Gaia methodology for agent-oriented analysis and design, and Tropos, a detailed agent-oriented software engineering methodology, exploring each methodology's ability to present solutions for small problems. We also attempt to discover whether each methodology is in fact understandable and usable. In addition, we collected and noted the advantages and weaknesses of these methodologies, and the relationships among their models, during the analysis. The Guardian Angel: Patient-Centered Health Information System (GA:PCHIS), a personal system to help track, manage, and interpret the subject's health history and give advice to both patient and provider, is used as the case study throughout the paper.

  10. Use of response surface methodology for development of new microwell-based spectrophotometric method for determination of atorvastatin calcium in tablets

    Directory of Open Access Journals (Sweden)

    Wani Tanveer A

    2012-11-01

    Abstract Background Response surface methodology by Box–Behnken design, employing the multivariate approach, enables substantial improvement in method development using fewer experiments, without wastage of large volumes of organic solvents, which leads to high analysis cost. This methodology has not previously been employed for development of a method for analysis of atorvastatin calcium (ATR-Ca). Results The present research study describes the use of this methodology in the optimization and validation of a new microwell-based UV-Visible spectrophotometric method for determination of ATR-Ca in its tablets. By the use of quadratic regression analysis, equations were developed to describe the behavior of the response as simultaneous functions of the selected independent variables. Accordingly, the optimum conditions were determined, which included the concentration of 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ), the reaction time and the temperature. The absorbance of the colored CT complex was measured at 460 nm by a microwell-plate absorbance reader. The method was validated, in accordance with ICH guidelines, for accuracy, precision, selectivity and linearity (r² = 0.9993) over the concentration range of 20–200 μg/ml. The assay was successfully applied to the analysis of ATR-Ca in its pharmaceutical dosage forms with good accuracy and precision. Conclusion The assay described herein has great practical value in the routine analysis of ATR-Ca in quality control laboratories, as it has high-throughput capability and consumes minimum volumes of organic solvent; it thus reduces the analysts' exposure to the toxic effects of organic solvents, offers an environmentally friendly "Green" approach, and reduces the analysis cost by 50-fold.

  11. Use of response surface methodology for development of new microwell-based spectrophotometric method for determination of atorvastatin calcium in tablets

    Science.gov (United States)

    2012-01-01

    Background Response surface methodology by Box–Behnken design, employing the multivariate approach, enables substantial improvement in method development using fewer experiments, without wastage of large volumes of organic solvents, which leads to high analysis cost. This methodology has not previously been employed for development of a method for analysis of atorvastatin calcium (ATR-Ca). Results The present research study describes the use of this methodology in the optimization and validation of a new microwell-based UV-Visible spectrophotometric method for determination of ATR-Ca in its tablets. By the use of quadratic regression analysis, equations were developed to describe the behavior of the response as simultaneous functions of the selected independent variables. Accordingly, the optimum conditions were determined, which included the concentration of 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ), the reaction time and the temperature. The absorbance of the colored CT complex was measured at 460 nm by a microwell-plate absorbance reader. The method was validated, in accordance with ICH guidelines, for accuracy, precision, selectivity and linearity (r² = 0.9993) over the concentration range of 20–200 μg/ml. The assay was successfully applied to the analysis of ATR-Ca in its pharmaceutical dosage forms with good accuracy and precision. Conclusion The assay described herein has great practical value in the routine analysis of ATR-Ca in quality control laboratories, as it has high-throughput capability and consumes minimum volumes of organic solvent; it thus reduces the analysts' exposure to the toxic effects of organic solvents, offers an environmentally friendly "Green" approach, and reduces the analysis cost by 50-fold. PMID:23146143

  12. Analysis and modeling of complex data in behavioral and social sciences

    CERN Document Server

    Okada, Akinori; Ragozini, Giancarlo; Weihs, Claus

    2014-01-01

    This volume presents theoretical developments, applications and computational methods for the analysis and modeling in behavioral and social sciences where data are usually complex to explore and investigate. The challenging proposals provide a connection between statistical methodology and the social domain with particular attention to computational issues in order to effectively address complicated data analysis problems. The papers in this volume stem from contributions initially presented at the joint international meeting JCS-CLADAG held in Anacapri (Italy) where the Japanese Classification Society and the Classification and Data Analysis Group of the Italian Statistical Society had a stimulating scientific discussion and exchange.

  13. A design methodology to reduce waste in the construction process

    Institute of Scientific and Technical Information of China (English)

    AndrewN.BALDWIN; SimonA.AUSTIN; AndrewKEYS

    2003-01-01

    This paper describes a conceptual tool to enable construction professionals to identify where waste is generated during the construction of buildings and to address how it can be reduced. It allows an improvement in waste management practices on site by forecasting future waste types and volumes, and it reduces waste volumes on site through the identification of wasteful design practices. The tool contributes to all stages of design and construction. At the Concept Stage of Design, the proposed methodology provides a framework for reducing waste through better informed decisions. At the Detailed Design Stage, it gives a methodology to address the areas of concern and provides focused information to aid the reduction of waste through informed design decisions. During construction, it provides a tool to predict waste types arising on site, thus allowing a system of proactive waste management that will aid skip segregation strategies, leading to improved waste recycling and reuse.

  14. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hugo, Jacques [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  15. Technical support document: Energy efficiency standards for consumer products: Room air conditioners, water heaters, direct heating equipment, mobile home furnaces, kitchen ranges and ovens, pool heaters, fluorescent lamp ballasts and television sets. Volume 1, Methodology

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    The Energy Policy and Conservation Act (P.L. 94-163), as amended, establishes energy conservation standards for 12 of the 13 types of consumer products specifically covered by the Act. The legislation requires the Department of Energy (DOE) to consider new or amended standards for these and other types of products at specified times. DOE is currently considering amending standards for seven types of products: water heaters, direct heating equipment, mobile home furnaces, pool heaters, room air conditioners, kitchen ranges and ovens (including microwave ovens), and fluorescent light ballasts and is considering establishing standards for television sets. This Technical Support Document presents the methodology, data, and results from the analysis of the energy and economic impacts of the proposed standards. This volume presents a general description of the analytic approach, including the structure of the major models.

  16. A Methodology for the Analysis of Memory Response to Radiation through Bitmap Superposition and Slicing

    CERN Document Server

    Bosser, A.; Tsiligiannis, G.; Ferraro, R.; Frost, C.; Javanainen, A.; Puchner, H.; Rossi, M.; Saigne, F.; Virtanen, A.; Wrobel, F.; Zadeh, A.; Dilillo, L.

    2015-01-01

    A methodology is proposed for the statistical analysis of memory radiation test data, with the aim of identifying trends in the single-event upset (SEU) distribution. The case study treated is a 65 nm SRAM irradiated with neutrons, protons and heavy ions.
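
    As we read it, bitmap superposition overlays the per-run upset maps (readback XOR written pattern) so that cells flipped in many runs stand out against the uniform background expected of random SEUs. A minimal sketch, with made-up readback data rather than the paper's measurements:

```python
import numpy as np

def upset_bitmap(written, readback):
    """Per-cell upset map for one irradiation run: 1 where a bit flipped."""
    return (written ^ readback).astype(np.uint8)

def superpose(bitmaps):
    """Element-wise sum across runs: how often each cell was upset."""
    return np.sum(np.stack(bitmaps), axis=0)

written = np.zeros((8, 8), dtype=np.uint8)   # all-zeros test pattern

# Per-run upset coordinates: scattered one-off SEUs plus one hypothetical
# weak cell at (2, 3) that flips in every run (all coordinates invented).
run_hits = [[(0, 1), (5, 5)], [(7, 0), (3, 3)], [(1, 6)],
            [(4, 2), (6, 7)], [(0, 4)]]
runs = []
for hits in run_hits:
    readback = written.copy()
    for r, c in hits + [(2, 3)]:
        readback[r, c] ^= 1
    runs.append(upset_bitmap(written, readback))

stack = superpose(runs)
weak_row, weak_col = np.unravel_index(np.argmax(stack), stack.shape)
print(f"most-upset cell: ({weak_row}, {weak_col}), "
      f"flipped in {stack[weak_row, weak_col]} of {len(runs)} runs")
```

    Slicing the superposed map by row, column or bit line, as the title suggests, would then expose layout-correlated sensitivity rather than single weak cells.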

  17. Combining soft system methodology and pareto analysis in safety management performance assessment : an aviation case

    NARCIS (Netherlands)

    Karanikas, Nektarios

    2016-01-01

    Although reengineering is strategically advantageous for organisations in order to remain functional and sustainable, safety must remain a priority and the respective efforts need to be maintained. This paper suggests the combination of soft system methodology (SSM) and Pareto analysis on the scope of
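
    Pareto analysis, as used in such assessments, ranks finding categories by frequency and isolates the "vital few" that account for, say, 80% of occurrences. The categories and counts below are invented for illustration; the 80% cutoff is the conventional choice.

```python
def pareto_vital_few(counts, cutoff=0.80):
    """Return categories, largest first, until `cutoff` of the total is covered."""
    total = sum(counts.values())
    vital, covered = [], 0
    for category, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
        vital.append(category)
        covered += n
        if covered / total >= cutoff:
            break
    return vital

# Invented audit-finding counts for a safety management assessment
findings = {
    "procedure not followed": 41,
    "documentation gap": 23,
    "training shortfall": 14,
    "equipment defect": 6,
    "communication failure": 4,
    "other": 2,
}
print("vital few:", pareto_vital_few(findings))
```

    SSM would then be applied to those few categories to model the human-activity system behind them, which is the combination the paper proposes.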

  18. Analysis of kyoto university reactor physics critical experiments using NCNSRC calculation methodology

    International Nuclear Information System (INIS)

    Amin, E.; Hathout, A.M.; Shouman, S.

    1997-01-01

    The Kyoto University reactor physics experiments on the university's critical assembly are used to benchmark and validate the NCNSRC calculation methodology. This methodology has two lines, diffusion and Monte Carlo. The diffusion line includes the WIMSD4 code for cell calculations and the two-dimensional diffusion code DIXY2 for core calculations. The transport line uses the MULTIKENO code, VAX version. Analysis is performed for the criticality and the temperature coefficients of reactivity (TCR) of the different light-water moderated and reflected cores utilized in the experiments. The results for both the eigenvalue and the TCR approximately reproduce the experimental and theoretical Kyoto results. However, some conclusions are drawn about the adequacy of the standard WIMSD4 library. This paper is an extension of the NCNSRC efforts to assess and validate computer tools and methods for both the ET-RR-1 and ET-MMPR-2 research reactors. 7 figs., 1 tab

  19. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    Science.gov (United States)

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) in a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied to the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories emerge as the major source of Pb and an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).
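
    The kind of source quantification described above can be illustrated by combining each source's extent with a per-unit emission rate and comparing the resulting loads. The source names, extents and rates below are placeholders for illustration, not the paper's values.

```python
def source_loads(sources):
    """Annual load per source: extent (m2, m, vehicle-km, ...) x unit emission rate."""
    return {name: extent * rate for name, (extent, rate) in sources.items()}

def shares(loads):
    """Fractional contribution of each source to the total load."""
    total = sum(loads.values())
    return {name: load / total for name, load in loads.items()}

# Hypothetical Pb budget for a small catchment (extent, g/year per unit extent)
pb_sources = {
    "roof accessories (gutters, joints)": (1_200, 2.5),
    "roof surfaces": (90_000, 0.01),
    "traffic (brake/tire wear)": (15_000, 0.02),
}
pb_shares = shares(source_loads(pb_sources))
top = max(pb_shares, key=pb_shares.get)
print(f"dominant Pb source: {top} ({pb_shares[top]:.0%})")
```

    Propagating uncertainty would mean replacing each extent and rate with a distribution and repeating the calculation, which is the second half of the methodology the abstract outlines.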

  20. Thermoeconomic analysis of storage systems for solar heating and cooling systems: A comparison between variable-volume and fixed-volume tanks

    International Nuclear Information System (INIS)

    Buonomano, Annamaria; Calise, Francesco; Ferruzzi, Gabriele

    2013-01-01

    The paper investigates different control strategies for thermal storage management in SHC (Solar Heating and Cooling) systems. The SHC system under investigation is based on a field of evacuated solar collectors coupled with a single-stage LiBr–H2O absorption chiller; auxiliary thermal energy is supplied by a gas-fired boiler. The SHC system is also equipped with a novel thermal storage system consisting of a variable-volume storage tank. It includes three separate tanks and a number of mixers and diverters managed by novel control strategies based on combinations of series/parallel charging and discharging approaches. The aim of this component is to vary the thermal storage capacity as a function of solar radiation availability and the user's thermal/cooling energy demand. The system increases the number of active tanks when the time shift between solar energy and user demand is high; conversely, when this time shift is low, the number of active tanks is automatically reduced. In addition, when excess solar energy cannot be stored in these tanks, a heat exchanger in the solar loop is used to produce DHW (Domestic Hot Water). The analysis is carried out by means of a zero-dimensional transient simulation model developed in the TRNSYS software. In order to assess the operating and capital costs of the systems under analysis, an economic model is also proposed, and a parametric analysis was implemented to determine the set of synthesis/design variables that maximize system profitability. The novel variable-volume storage system, in both proposed configurations, was also compared with a constant-volume storage system from the energy and economic points of view. The results showed that the presented storage system saves up to 20% of the natural gas used by the auxiliary boiler only for very high solar fractions. In all the other cases, marginal savings are achieved by the

  1. Application of best estimate and uncertainty safety analysis methodology to loss of flow events at Ontario Power Generation's Darlington Nuclear Generating Station

    International Nuclear Information System (INIS)

    Huget, R.G.; Lau, D.K.; Luxat, J.C.

    2001-01-01

    Ontario Power Generation (OPG) is currently developing a new safety analysis methodology based on best estimate and uncertainty (BEAU) analysis. The framework and elements of the new safety analysis methodology are defined. The evolution of safety analysis technology at OPG has been thoroughly documented. Over the years, the use of conservative limiting assumptions in OPG safety analyses has led to gradual erosion of predicted safety margins. The main purpose of the new methodology is to provide a more realistic quantification of safety margins within a probabilistic framework, using best estimate results, with an integrated accounting of the underlying uncertainties. Another objective of the new methodology is to provide a cost-effective means for on-going safety analysis support of OPG's nuclear generating stations. Discovery issues and plant aging effects require that the safety analyses be periodically revised and, in the past, the cost of reanalysis at OPG has been significant. As OPG enters the new competitive marketplace for electricity, there is a strong need to conduct safety analysis in a less cumbersome manner. This paper presents the results of the first licensing application of the new methodology in support of planned design modifications to the shutdown systems (SDSs) at Darlington Nuclear Generating Station (NGS). The design modifications restore dual trip parameter coverage over the full range of reactor power for certain postulated loss-of-flow (LOF) events. The application of BEAU analysis to the single heat transport pump trip event provides a realistic estimation of the safety margins for the primary and backup trip parameters. These margins are significantly larger than those predicted by conventional limit of the operating envelope (LOE) analysis techniques. (author)
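
The BEAU approach described above replaces single conservative calculations with a probabilistic propagation of uncertainties through a best-estimate model. The sketch below illustrates the idea only; the margin model, coefficients, and parameter distributions are invented stand-ins, not values from the analysis. Uncertain inputs are sampled, the margin is computed for each sample, and a conservative percentile of the resulting distribution is reported instead of a single limit-of-envelope number.

```python
import random

def trip_margin(flow, power, gain):
    # Hypothetical best-estimate model: margin between the trip setpoint
    # and the predicted peak process parameter (normalized units).
    return 0.25 * flow - 0.10 * power + 0.05 * gain

def beau_margin(n=20000, seed=1):
    """Propagate input uncertainties and return the 5th-percentile margin."""
    rng = random.Random(seed)
    margins = []
    for _ in range(n):
        flow = rng.gauss(1.0, 0.02)   # normalized coolant flow
        power = rng.gauss(1.0, 0.01)  # normalized reactor power
        gain = rng.gauss(1.0, 0.05)   # instrument calibration factor
        margins.append(trip_margin(flow, power, gain))
    margins.sort()
    return margins[int(0.05 * n)]     # conservative (5th percentile) margin

print(round(beau_margin(), 3))
```

The 5th percentile of the margin distribution is typically well above a limit-of-envelope result, which stacks every conservatism simultaneously.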

  2. Estimating traffic volume on Wyoming low volume roads using linear and logistic regression methods

    Directory of Open Access Journals (Sweden)

    Dick Apronti

    2016-12-01

Full Text Available Traffic volume is an important parameter in most transportation planning applications. Low volume roads make up about 69% of road miles in the United States. Estimating traffic on low volume roads is a cost-effective alternative to taking traffic counts, because traditional traffic counts are expensive and impractical for low priority roads. The purpose of this paper is to present the development of two alternative means of cost-effectively estimating traffic volumes for low volume roads in Wyoming and to make recommendations for their implementation. The study methodology involves reviewing existing studies, identifying data sources, and carrying out the model development. The utility of the models developed was then verified by comparing actual traffic volumes to those predicted by the models. The study resulted in two regression models that are inexpensive and easy to implement. The first was a linear regression model that utilized pavement type, access to highways, predominant land use types, and population to estimate traffic volume. In verifying the model, an R2 value of 0.64 and a root mean square error of 73.4% were obtained. The second was a logistic regression model that identified the level of traffic on roads using five thresholds or levels. The logistic regression model was verified by estimating traffic volume thresholds and determining the percentage of roads that were accurately classified as belonging to the given thresholds. For the five thresholds, the percentage of roads classified correctly ranged from 79% to 88%. In conclusion, the verification of the models indicated that both model types are useful for accurate and cost-effective estimation of traffic volumes for low volume Wyoming roads. The models were recommended for use in traffic volume estimations for low volume roads in pavement management and environmental impact assessment studies.
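
The two-model strategy can be sketched in a few lines. The predictors mirror those named in the abstract (pavement type, highway access, population), but the data below are synthetic and the coefficients are invented for illustration; this is not the Wyoming dataset or the fitted models from the study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)
n = 200
# Synthetic stand-ins for the predictors named in the abstract:
paved = rng.integers(0, 2, n)          # pavement type (0 = unpaved, 1 = paved)
access = rng.integers(0, 2, n)         # direct access to a highway
population = rng.uniform(50, 5000, n)  # population served
X = np.column_stack([paved, access, population])

# Hypothetical "true" relationship plus noise, for illustration only.
aadt = 40 + 120 * paved + 80 * access + 0.05 * population + rng.normal(0, 20, n)

# Model 1: linear regression predicting the traffic volume directly.
lin = LinearRegression().fit(X, aadt)
print(f"R^2 = {lin.score(X, aadt):.2f}")

# Model 2: logistic regression classifying roads into ordered volume classes,
# analogous to the thresholds used in the study.
labels = np.digitize(aadt, bins=[100, 200, 300])  # 4 ordered volume classes
logit = LogisticRegression(max_iter=1000).fit(X, labels)
print(f"classification accuracy = {logit.score(X, labels):.2f}")
```

The linear model gives a point estimate of volume; the logistic model trades precision for robustness by predicting only a volume class, which is often sufficient for pavement management decisions.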

  3. Model documentation Natural Gas Transmission and Distribution Model of the National Energy Modeling System. Volume 1

    International Nuclear Information System (INIS)

    1996-01-01

The Natural Gas Transmission and Distribution Model (NGTDM) of the National Energy Modeling System is developed and maintained by the Energy Information Administration (EIA), Office of Integrated Analysis and Forecasting. This report documents the archived version of the NGTDM that was used to produce the natural gas forecasts presented in the Annual Energy Outlook 1996 (DOE/EIA-0383(96)). The purpose of this report is to provide a reference document for model analysts, users, and the public that defines the objectives of the model, describes its basic approach, and provides detail on the methodology employed. Previously this report represented Volume I of a two-volume set. Volume II reported on model performance, detailing convergence criteria and properties, results of sensitivity testing, comparison of model outputs with the literature and/or other model results, and major unresolved issues.

  4. Model documentation Natural Gas Transmission and Distribution Model of the National Energy Modeling System. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-26

The Natural Gas Transmission and Distribution Model (NGTDM) of the National Energy Modeling System is developed and maintained by the Energy Information Administration (EIA), Office of Integrated Analysis and Forecasting. This report documents the archived version of the NGTDM that was used to produce the natural gas forecasts presented in the Annual Energy Outlook 1996 (DOE/EIA-0383(96)). The purpose of this report is to provide a reference document for model analysts, users, and the public that defines the objectives of the model, describes its basic approach, and provides detail on the methodology employed. Previously this report represented Volume I of a two-volume set. Volume II reported on model performance, detailing convergence criteria and properties, results of sensitivity testing, comparison of model outputs with the literature and/or other model results, and major unresolved issues.

  5. Methodology for deriving hydrogeological input parameters for safety-analysis models - application to fractured crystalline rocks of Northern Switzerland

    International Nuclear Information System (INIS)

    Vomvoris, S.; Andrews, R.W.; Lanyon, G.W.; Voborny, O.; Wilson, W.

    1996-04-01

    Switzerland is one of many nations with nuclear power that is seeking to identify rock types and locations that would be suitable for the underground disposal of nuclear waste. A common challenge among these programs is to provide engineering designers and safety analysts with a reasonably representative hydrogeological input dataset that synthesizes the relevant information from direct field observations as well as inferences and model results derived from those observations. Needed are estimates of the volumetric flux through a volume of rock and the distribution of that flux into discrete pathways between the repository zones and the biosphere. These fluxes are not directly measurable but must be derived based on understandings of the range of plausible hydrogeologic conditions expected at the location investigated. The methodology described in this report utilizes conceptual and numerical models at various scales to derive the input dataset. The methodology incorporates an innovative approach, called the geometric approach, in which field observations and their associated uncertainty, together with a conceptual representation of those features that most significantly affect the groundwater flow regime, were rigorously applied to generate alternative possible realizations of hydrogeologic features in the geosphere. In this approach, the ranges in the output values directly reflect uncertainties in the input values. As a demonstration, the methodology is applied to the derivation of the hydrogeological dataset for the crystalline basement of Northern Switzerland. (author) figs., tabs., refs

  6. Low-level radioactive waste in the northeast: revised waste volume projections

    International Nuclear Information System (INIS)

    1984-06-01

The volume of low-level radioactive waste generated in the eleven Northeast states has undergone significant change since the initial 1982 analysis and projection. These revised projections incorporate improved data reporting and evidence of sharp declines in certain categories of waste. Volumes in the 1982-1983 period reflect waste shipped for disposal as reported by disposal site operators. Projected waste volumes represent waste intended for disposal. The recent dramatic changes in source reduction and waste management practices underscore the need for annual review of waste volume projections. The volume of waste shipped for off-site disposal has declined approximately 12% in two years, from an average 1,092,500 ft³ annually in 1979 to 1981 to an annual average of 956,500 ft³ in 1982 to 1983; reactor waste disposal volumes declined by about 39,000 ft³, or 7%, during this period. Non-reactor waste volumes shipped for disposal declined by over 70,000 ft³, or 15%, during this period. The data suggest that generators increased their use of such management practices as source reduction, compaction, or, for carbon-14 and tritium, temporary storage followed by disposal as non-radioactive waste under the NRC de minimis standard effective March 1981. Using the Technical Subcommittee projection methodology, the volume of low-level waste produced annually in the eleven states, individually and collectively, is expected to increase through the year 2000, but at a significantly lower rate than initially projected. By the year 2000, the Northeast is projected to generate 1,137,600 ft³ of waste annually, an increase of about 20% over the 1982 to 1983 average volume.
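
The headline percentages can be checked directly from the volumes quoted in the abstract:

```python
avg_1979_81 = 1_092_500  # ft^3/yr, 1979-1981 average shipped for disposal
avg_1982_83 = 956_500    # ft^3/yr, 1982-1983 average shipped for disposal
proj_2000 = 1_137_600    # ft^3/yr, projected annual volume by 2000

decline = avg_1979_81 - avg_1982_83
pct = 100 * decline / avg_1979_81
print(f"decline: {decline:,} ft^3 ({pct:.1f}%)")   # consistent with "approximately 12%"

growth = 100 * (proj_2000 - avg_1982_83) / avg_1982_83
print(f"projected increase by 2000: {growth:.1f}%")  # consistent with "about 20%"
```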

  7. ANALYSIS OF EFFECTIVENESS OF METHODOLOGICAL SYSTEM FOR PROBABILITY AND STOCHASTIC PROCESSES COMPUTER-BASED LEARNING FOR PRE-SERVICE ENGINEERS

    Directory of Open Access Journals (Sweden)

    E. Chumak

    2015-04-01

Full Text Available The author substantiates that only methodological training systems for mathematical disciplines that implement information and communication technologies (ICT) can meet the requirements of the modern educational paradigm and make it possible to increase educational efficiency. Due to this fact, the necessity of developing a methodology for computer-based learning of the theory of probability and stochastic processes for pre-service engineers is underlined in the paper. The results of the experimental study analyzing the efficiency of this methodological system are shown. The analysis includes three main stages: ascertaining, searching, and forming. The key criteria of the efficiency of the designed methodological system are the level of students' probabilistic and stochastic skills and their learning motivation. The effect of implementing the methodological system on the level of students' IT literacy is shown in the paper. The expansion of the range of purposes for which students apply ICT is described by the author. The level of formation of students' learning motivation at the ascertaining and forming stages of the experiment is analyzed, and the level of intrinsic learning motivation for pre-service engineers is determined at these stages. For this purpose, the methodology for testing the students' learning motivation in the chosen specialty is presented in the paper. The increase in intrinsic learning motivation of the experimental group students (E group) relative to the control group students (C group) is demonstrated.

  8. Study of the Utah uranium-milling industry. Volume I. A policy analysis

    International Nuclear Information System (INIS)

    Turley, R.E.

    1980-05-01

This is the first volume of a two-volume study of the Utah Uranium Milling Industry. The study was precipitated by a 1977 report issued by the Western Interstate Nuclear Board entitled Policy Recommendations on Financing Stabilization, Perpetual Surveillance and Maintenance of Uranium Mill Tailings. Volume I of this study is a policy analysis, or technology assessment, of the uranium milling industry in the state of Utah; specifically, the study addresses issues that deal with the perpetual surveillance, monitoring, and maintenance of uranium tailings piles at the end of uranium milling operations, i.e., following shutdown and decommissioning. Volume II of this report serves somewhat as an appendix: it presents a full description of the uranium industry in the state of Utah, including its history and statements regarding its future. The topics covered in Volume I are as follows: today's uranium industry in Utah; management of the industry's characteristic nuclear radiation; uranium mill licensing and regulation; state licensing and regulation of uranium mills; forecast of future milling operations; policy needs relative to perpetual surveillance, monitoring, and maintenance of tailings; policy needs relative to perpetual oversight; economic aspects; state revenue from uranium; and a summary with conclusions and recommendations. Appendices, figures and tables are also presented.

  9. Adaptation of SW-846 methodology for the organic analysis of radioactive mixed wastes

    International Nuclear Information System (INIS)

    Griest, W.H.; Schenley, R.L.; Tomkins, B.A.; Caton, J.E. Jr.; Fleming, G.S.; Harmon, S.H.; Wachter, L.J.; Garcia, M.E.; Edwards, M.D.

    1990-01-01

    Modifications to SW-846 sample preparation methodology permit the organic analysis of radioactive mixed waste with minimum personal radiation exposure and equipment contamination. This paper describes modifications to SW-846 methods 5030 and 3510-3550 for sample preparation in radiation-zoned facilities (hood, glove box, and hot cell) and GC-MS analysis of the decontaminated organic extracts in a conventional laboratory for volatile and semivolatile organics by methods 8240 and 8270 (respectively). Results will be presented from the analysis of nearly 70 nuclear waste storage tank liquids and 17 sludges. Regulatory organics do not account for the organic matter suggested to be present by total organic carbon measurements. 7 refs., 5 tabs

  10. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    Wickett, A.J.; Yadigaroglu, G.

    1994-08-01

The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives of the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems; to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods; to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects; to identify as clearly as possible unreconciled differences; and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed: the need for uncertainty analysis; identification and ranking of uncertainties; characterisation, quantification and combination of uncertainties; and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop.

  11. Blood volume studies

    International Nuclear Information System (INIS)

    Lewis, S.M.; Yin, J.A.L.

    1986-01-01

The use of dilution analysis with such radioisotopes as ⁵¹Cr, ³²P, ⁹⁹ᵐTc and ¹¹³ᵐIn for measuring red cell volume is reviewed briefly. The use of ¹²⁵I and ¹³¹I for plasma volume studies is also considered, and the subsequent determination of total blood volume is discussed, together with the role of the splenic red cell volume. Substantial bibliography. (UK)
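
The dilution principle behind all of these measurements is the same: a known activity of tracer is injected, allowed to mix, and the compartment volume is recovered from the equilibrium concentration. A minimal sketch (the numbers are illustrative only, not clinical reference values):

```python
def dilution_volume(injected_activity, equilibrium_concentration):
    """Isotope dilution: volume = injected activity / concentration after mixing.

    Units must match, e.g. counts/min injected and counts/min per mL sampled.
    """
    return injected_activity / equilibrium_concentration

# Example: 2.0e6 cpm of 51Cr-labelled red cells injected; after mixing,
# a sample reads 1.0e3 cpm per mL of red cells.
rcv_ml = dilution_volume(2.0e6, 1.0e3)
print(rcv_ml)  # 2000.0 mL red cell volume
```

The same relation applied to an ¹²⁵I-labelled plasma tracer gives plasma volume, and red cell plus plasma volume yields total blood volume.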

  12. Application of NASA Kennedy Space Center System Assurance Analysis methodology to nuclear power plant systems designs

    International Nuclear Information System (INIS)

    Page, D.W.

    1985-01-01

    In May of 1982, the Kennedy Space Center (KSC) entered into an agreement with the NRC to conduct a study to demonstrate the feasibility and practicality of applying the KSC System Assurance Analysis (SAA) methodology to nuclear power plant systems designs. North Carolina's Duke Power Company expressed an interest in the study and proposed the nuclear power facility at CATAWBA for the basis of the study. In joint meetings of KSC and Duke Power personnel, an agreement was made to select two CATAWBA systems, the Containment Spray System and the Residual Heat Removal System, for the analyses. Duke Power provided KSC with a full set of Final Safety Analysis Reports (FSAR) as well as schematics for the two systems. During Phase I of the study the reliability analyses of the SAA were performed. During Phase II the hazard analyses were performed. The final product of Phase II is a handbook for implementing the SAA methodology into nuclear power plant systems designs. The purpose of this paper is to describe the SAA methodology as it applies to nuclear power plant systems designs and to discuss the feasibility of its application. (orig./HP)

  13. Application of code scaling, applicability and uncertainty methodology to large break LOCA analysis of two loop PWR

    International Nuclear Information System (INIS)

    Mavko, B.; Stritar, A.; Prosek, A.

    1993-01-01

In NED 119, No. 1 (May 1990), a Technical Program Group published a series of six papers presenting a new methodology for the safety evaluation of emergency core cooling systems in nuclear power plants. This paper describes the application of that new methodology to the LB LOCA analysis of a two-loop Westinghouse power plant. Results of the original work were used wherever possible, so that the analysis was finished in less than one man-year of work. Steam generator plugging level and safety injection flow rate were used as additional uncertainty parameters, which had not been used in the original work. The computer code RELAP5/MOD2 was used. The response surface was generated by regression analysis and by the artificial-neural-network-like Optimal Statistical Estimator method. Results were also compared to the analytical calculation. (orig.)
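
The response-surface step of this kind of analysis can be illustrated compactly: a handful of expensive code runs are replaced by a cheap polynomial surrogate fitted by least squares. Everything below is an illustrative stand-in; the parameter names, ranges, and the synthetic "code output" are assumptions, not values from the RELAP5/MOD2 analysis.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical code runs: peak clad temperature (PCT) as a function of the two
# added uncertainty parameters, SG tube plugging fraction and SI flow multiplier.
plug = rng.uniform(0.0, 0.10, 40)          # steam generator plugging fraction
si = rng.uniform(0.9, 1.1, 40)             # safety injection flow multiplier
pct = 1000 + 400 * plug - 150 * (si - 1.0) + rng.normal(0, 5, 40)  # invented

# Quadratic response surface:
# PCT ~ c0 + c1*plug + c2*si + c3*plug^2 + c4*si^2 + c5*plug*si
A = np.column_stack([np.ones_like(plug), plug, si, plug**2, si**2, plug * si])
coef, *_ = np.linalg.lstsq(A, pct, rcond=None)

# The surrogate can now be sampled thousands of times at negligible cost
# instead of re-running the thermal-hydraulic code.
fitted = A @ coef
print(f"max fitted PCT at the run points: {fitted.max():.0f} K")
```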

  14. Extending the input–output energy balance methodology in agriculture through cluster analysis

    International Nuclear Information System (INIS)

    Bojacá, Carlos Ricardo; Casilimas, Héctor Albeiro; Gil, Rodrigo; Schrevens, Eddie

    2012-01-01

The input–output balance methodology has been applied to characterize the energy balance of agricultural systems. This study proposes to extend this methodology with multivariate analysis to reveal particular patterns in the energy use of a system. The objective was to demonstrate the usefulness of multivariate exploratory techniques to analyze the variability found in a farming system and to establish efficiency categories that can be used to improve the energy balance of the system. To this purpose an input–output analysis was applied to the major greenhouse tomato production area in Colombia. Individual energy profiles were built and the k-means clustering method was applied to the production factors. On average, the production system in the study zone consumes 141.8 GJ ha⁻¹ to produce 96.4 GJ ha⁻¹, resulting in an energy efficiency of 0.68. With the k-means clustering analysis, three clusters of farmers were identified, with energy efficiencies of 0.54, 0.67 and 0.78. The most energy-efficient cluster grouped 56.3% of the farmers. It is possible to optimize the production system by improving the management practices of those with the lowest energy use efficiencies. Multivariate analysis techniques proved to be a complementary pathway to improving the energy efficiency of a system. -- Highlights: ► An input–output energy balance was estimated for greenhouse tomatoes in Colombia. ► We used the k-means clustering method to classify growers based on their energy use. ► Three clusters of growers were found with energy efficiencies of 0.54, 0.67 and 0.78. ► Overall system optimization is possible by improving the energy use of the less efficient.
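
The clustering step can be sketched as follows. The farm-level data here are synthetic, generated around the averages quoted in the abstract (141.8 GJ ha⁻¹ input, efficiency 0.68); the real study clustered measured production factors, not these invented profiles.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)

# Synthetic per-farm energy profiles (GJ/ha): input energy and efficiency,
# centred on the averages reported for the study zone.
inputs = rng.normal(141.8, 25.0, 60).clip(80, 220)
efficiency = rng.normal(0.68, 0.10, 60).clip(0.4, 0.9)

# Standardize before clustering so both variables carry equal weight.
X = np.column_stack([inputs, efficiency])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(Xs)

for k in range(3):
    eff_k = efficiency[km.labels_ == k]
    print(f"cluster {k}: {eff_k.size} farms, mean efficiency {eff_k.mean():.2f}")
```

Each cluster's mean efficiency then defines an efficiency category, and the practices of the best cluster become the improvement target for the others.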

  15. Multi-Objective Optimization of Moving-magnet Linear Oscillatory Motor Using Response Surface Methodology with Quantum-Behaved PSO Operator

    Science.gov (United States)

    Lei, Meizhen; Wang, Liqiang

    2018-01-01

To reduce manufacturing difficulty and increase the magnetic thrust density, a moving-magnet linear oscillatory motor (MMLOM) without inner stators was proposed. To obtain the optimal design of maximum electromagnetic thrust with minimal permanent magnet material, firstly, a 3D finite element analysis (FEA) model of the MMLOM was built and verified by comparison with prototype experimental results. The influence of the permanent magnet (PM) design parameters on the electromagnetic thrust was then systematically analyzed by the 3D FEA. Secondly, response surface methodology (RSM) was employed to build a response surface model of the new MMLOM, yielding an analytical model of the PM volume and thrust. A multi-objective optimization method for the PM design parameters, using RSM with a quantum-behaved PSO (QPSO) operator, was then proposed, together with a way to choose the best PM design parameters from the multi-objective solution sets. Finally, 3D FEA results for the optimal design candidates were compared. The comparison showed that the proposed method can obtain the best combination of geometric parameters for reducing the PM volume while increasing the thrust.
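
A minimal quantum-behaved PSO over an analytical response surface looks like the sketch below. The two design variables, the surrogate "thrust" and "volume" surfaces, and the weighted-sum scalarization of the two objectives are all invented for illustration; they are not the paper's RSM model or its Pareto-based selection procedure.

```python
import math, random

rng = random.Random(0)

def objective(x):
    # Hypothetical response-surface surrogates (illustration only):
    # thrust peaks at an interior design point; PM volume grows with size.
    t, w = x                                     # PM thickness, width in [0, 1]
    thrust = 1.0 - (t - 0.7)**2 - 0.5 * (w - 0.6)**2
    volume = 0.6 * t + 0.4 * w
    return 0.7 * (1.0 - thrust) + 0.3 * volume   # weighted-sum scalarization

def qpso(f, dim=2, n=20, iters=200, beta=0.75):
    """Quantum-behaved PSO: particles jump around a local attractor with a
    spread set by their distance to the mean of the personal bests."""
    X = [[rng.random() for _ in range(dim)] for _ in range(n)]
    P = [x[:] for x in X]                        # personal bests
    g = min(P, key=f)[:]                         # global best
    for _ in range(iters):
        mbest = [sum(p[d] for p in P) / n for d in range(dim)]
        for i in range(n):
            for d in range(dim):
                phi = rng.random()
                attractor = phi * P[i][d] + (1 - phi) * g[d]
                u = rng.random()
                sign = 1 if rng.random() < 0.5 else -1
                step = beta * abs(mbest[d] - X[i][d]) * math.log(1 / u)
                X[i][d] = min(1.0, max(0.0, attractor + sign * step))
            if f(X[i]) < f(P[i]):
                P[i] = X[i][:]
                if f(P[i]) < f(g):
                    g = P[i][:]
    return g

best = qpso(objective)
print([round(v, 2) for v in best])
```

For this quadratic surrogate the optimum sits slightly below the pure-thrust optimum (0.7, 0.6), since the volume term pulls both dimensions down; QPSO locates it without any gradient information.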

  16. SOLVENT-BASED TO WATERBASED ADHESIVE-COATED SUBSTRATE RETROFIT - VOLUME I: COMPARATIVE ANALYSIS

    Science.gov (United States)

This volume presents the analysis of case study facilities' experience with waterbased adhesive use and retrofit requirements. (NOTE: The coated and laminated substrate manufacturing industry was selected as part of NRMRL's support of the 33/50 Program because of its significan...

  17. Stereological analysis of nuclear volume in recurrent meningiomas

    DEFF Research Database (Denmark)

    Madsen, C; Schrøder, H D

    1994-01-01

A stereological estimation of nuclear volume in recurrent and non-recurrent meningiomas was made. The aim was to investigate whether this method could discriminate between these two groups. We found that the mean nuclear volumes in recurrent meningiomas were all larger at debut than in any of the control tumors. The mean nuclear volume of the individual recurrent tumors appeared to change with time, showing a tendency to diminish. A relationship between large nuclear volume at presentation and the number of, or time interval between, recurrences was not found. We conclude that measurement of mean nuclear volume in meningiomas might help identify a group at risk of recurrence.
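
The abstract does not state which estimator was used; for reference, the standard design-unbiased estimator of the volume-weighted mean nuclear volume from point-sampled intercepts (Gundersen and Jensen) is:

```latex
\bar{v}_V \;=\; \frac{\pi}{3}\,\overline{\ell_0^{\,3}}
```

where $\ell_0$ is the length of the intercept through a nucleus measured through a uniformly sampled point, and the cubed intercept lengths are averaged over all sampled nuclei. Cubing the intercepts makes the estimate sensitive to the largest nuclei, which is why such estimators discriminate well in tumor-grading applications.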

  18. Stratal Control Volumes and Stratal Control Trajectories: A New Method to Constrain, Understand and Reconcile Results from Stratigraphic Outcrop Analysis, Subsurface Analysis and Analogue and Numerical Modelling

    Science.gov (United States)

    Burgess, P. M.; Steel, R. J.

    2016-12-01

Decoding a history of Earth's surface dynamics from strata requires robust quantitative understanding of supply and accommodation controls. The concept of stratigraphic solution sets has proven useful in this decoding, but application and development of this approach has so far been surprisingly limited. Stratal control volumes, areas and trajectories are new approaches defined here, building on previous ideas about stratigraphic solution sets, to help analyse and understand the sedimentary record of Earth surface dynamics. They may have particular application in reconciling results from outcrop and subsurface analysis with results from analogue and numerical experiments. Stratal control volumes are sets of points in a three-dimensional volume, with axes of subsidence, sediment supply and eustatic rates of change, populated with probabilities derived from analysis of subsidence, supply and eustasy time series (Figure 1). These empirical probabilities indicate the likelihood of occurrence of any particular combination of control rates defined by any point in the volume. The stratal control volume can then be analysed to determine which parts of the volume represent relative sea-level fall and rise, where in the volume particular stacking patterns will occur, and how probable those stacking patterns are. For outcrop and subsurface analysis, using a stratal control area with eustasy and subsidence combined on a relative sea-level axis allows similar analysis, and may be preferable. A stratal control trajectory is a history of supply and accommodation creation rates, interpreted from outcrop or subsurface data, or observed in analogue and numerical experiments, and plotted as a series of linked points forming a trajectory through the stratal control volume (Figure 1) or area. Three examples are presented, one from outcrop and two theoretical. Much work remains to be done to build a properly representative database of stratal controls, but careful comparison of stratal
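
A numerical sketch of building an empirical stratal control volume: the joint histogram of subsidence, supply, and eustasy rates gives the probability of each rate combination, and derived quantities such as the probability of relative sea-level fall can be read off directly. The three rate histories below are invented stand-ins for the measured time series the approach calls for.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical rate histories (e.g., m/kyr): subsidence (positive = accommodation
# creation), sediment supply, and eustatic sea-level change.
subsidence = rng.normal(0.5, 0.2, 1000)
supply = rng.normal(1.0, 0.4, 1000)
eustasy = rng.normal(0.0, 0.3, 1000)

# Empirical stratal control volume: joint probability over rate combinations.
counts, edges = np.histogramdd(
    np.column_stack([subsidence, supply, eustasy]), bins=(10, 10, 10)
)
prob = counts / counts.sum()

# Relative sea level falls when eustatic fall outpaces subsidence-created
# accommodation, i.e. when subsidence + eustasy < 0.
p_fall = np.mean(subsidence + eustasy < 0)
print(f"P(relative sea-level fall) = {p_fall:.2f}")
```

Collapsing the subsidence and eustasy axes into a single relative sea-level axis, as the abstract suggests for outcrop work, amounts to histogramming `subsidence + eustasy` against `supply` instead.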

  19. Analysis of Combined Data from Heterogeneous Study Designs: A Methodological Proposal from the Patient Navigation Research program

    Science.gov (United States)

    Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; Design, PNRP; Committee, Analysis

    2013-01-01

Background: The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is, however, no agreed-upon prospective methodology for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose: To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods: The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed were: (1) analyzing and reporting each project separately; (2) combining data from all projects and performing an individual-level analysis; (3) pooling data from projects having similar study designs; (4) analyzing pooled data using a prospective meta-analytic technique; (5) analyzing pooled data utilizing a novel simulated group-randomized design. Results: Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and in their sensitivity to differing project sample sizes. Limitations: The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions: The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become

  20. Keeping it pure – a pedagogical case study of teaching soft systems methodology in scenario and policy analysis

    Directory of Open Access Journals (Sweden)

    Ian Yeoman

    2016-09-01

Full Text Available Purpose – Soft systems methodology (SSM) is well documented in the academic and management literature. Over the last 40 years, the methodology has come to be adapted depending on the tool users' skills and experience in order to fit the problem. The purpose of this paper is to demonstrate good teaching and learning practice from a pedagogical perspective. Design/methodology/approach – Dr Ian Yeoman of Victoria University of Wellington provides a personal reflection on how the methodology is used in the teaching and learning of TOUR301 Tourism Policy and Planning as a policy and scenario analysis method. Findings – The paper articulates the seven stages of SSM, from the unstructured problem situation through Rich Pictures, vision and guiding principles, policy solutions, comparisons, feasibility, and implementation stages. The paper uses a series of teaching tasks to break down the complexity of the methodology, thus guiding students and teachers in how to deploy the methodology in the classroom. Originality/value – The value of the paper is that it demonstrates the reflective practice of SSM in action as an exemplar of good practice. The paper clearly articulates the stages of the methodology so that students and teachers can adopt this approach in classroom environments following a scaffolding learning approach. The use of teaching tasks throughout the paper helps bring clarity and order, thus enabling the teacher to effectively teach the subject and the students to learn. The most significant contribution of this paper is its articulation of good teaching practice in policy and scenario analysis, expressed through five learning lessons: facilitating a learning environment; the impact of visual thinking; political theory; the importance of incremental learning; and problem-based learning and international students.

  1. Research Methodology in Global Strategy Research

    DEFF Research Database (Denmark)

    Cuervo-Cazurra, Alvaro; Mudambi, Ram; Pedersen, Torben

    2017-01-01

We review advances in research methodology used in global strategy research and provide suggestions on how researchers can improve their analyses and arguments. Methodological advances in the extraction of information, such as computer-aided text analysis, and in the analysis of datasets, such as differences-in-differences and propensity score matching, have helped deal with challenges (e.g., endogeneity and causality) that bedeviled earlier studies and resulted in conflicting findings. These methodological advances need to be considered as tools that complement theoretical arguments and well-explained logics and mechanisms so that researchers can provide better and more relevant recommendations to managers designing the global strategies of their organizations.

  2. IAEA methodology of the ITDB information analysis from nuclear security perspective

    International Nuclear Information System (INIS)

    2010-01-01

    The IAEA methodology for analysis of the Illicit Trafficking Database (ITDB) addresses general and specific risks, trends and patterns. This methodology assists in the identification of security needs that are specific to a material, activity, location, country or even region. Finally, the methodology also analyses the lessons learned.

  3. A theoretical-experimental methodology for assessing the sensitivity of biomedical spectral imaging platforms, assays, and analysis methods.

    Science.gov (United States)

    Leavesley, Silas J; Sweat, Brenner; Abbott, Caitlyn; Favreau, Peter; Rich, Thomas C

    2018-01-01

    Spectral imaging technologies have been used for many years by the remote sensing community. More recently, these approaches have been applied to biomedical problems, where they have shown great promise. However, biomedical spectral imaging has been complicated by the high variance of biological data and the reduced ability to construct test scenarios with fixed ground truths. Hence, it has been difficult to objectively assess and compare biomedical spectral imaging assays and technologies. Here, we present a standardized methodology that allows assessment of the performance of biomedical spectral imaging equipment, assays, and analysis algorithms. This methodology incorporates real experimental data and a theoretical sensitivity analysis, preserving the variability present in biomedical image data. We demonstrate that this approach can be applied in several ways: to compare the effectiveness of spectral analysis algorithms, to compare the response of different imaging platforms, and to assess the level of target signature required to achieve a desired performance. Results indicate that it is possible to compare even very different hardware platforms using this methodology. Future applications could include a range of optimization tasks, such as maximizing detection sensitivity or acquisition speed, providing high utility for investigators ranging from design engineers to biomedical scientists. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
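The core operation assessed in assays like this is spectral unmixing of a measured spectrum into endmember contributions. A minimal illustrative sketch of that step; the endmember shapes, wavelength grid, abundances and noise level below are invented for the example, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
wl = np.linspace(450, 700, 64)  # hypothetical wavelength grid, nm

def peak(center, width):
    """Toy Gaussian endmember spectrum."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Three invented endmembers: two fluorophores plus broad autofluorescence
E = np.column_stack([peak(520, 15), peak(600, 20), peak(550, 60)])

true_abund = np.array([0.7, 0.2, 0.5])  # invented ground-truth abundances
measured = E @ true_abund + 0.01 * rng.standard_normal(wl.size)

# Linear unmixing by least squares; in a sensitivity study the target
# abundance would be swept down until the estimate leaves tolerance
est, *_ = np.linalg.lstsq(E, measured, rcond=None)
```

A theoretical sensitivity analysis in the spirit of the abstract would repeat this over many noise draws and signature levels to map out detection performance.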

  4. Development of methodology for evaluation of 99mTc and 131I incorporated activities during lactation

    International Nuclear Information System (INIS)

    Santos, L.; Dantas, A.L.A.; Mesquita, S.A.; Oliveira, S.M.V.; Instituto de Radioprotecao e Dosimetria

    2012-01-01

    Internal contamination of babies may occur through ingestion of milk from mothers occupationally exposed to ionizing radiation with possible incorporation, or from mothers submitted to medical exposures during lactation. Radionuclide concentrations in the mother's milk may cause organ absorbed doses in the babies in proportion to the breast volumes. Milk analysis allows the activities ingested by the babies to be determined from the peak activity of the mother's milk, considering the decrease of the activity rate and the milk activities drunk at different time intervals. The aim of this work was to develop simulators and a methodology to evaluate 99mTc and 131I during lactation, in the following steps: to prepare standard solutions of milk contaminated separately with 99mTc and 131I; and to build four breast simulators (600 g and 800 g) and perform the respective calibrations for two geometries (breast and whole-body) in the Whole-Body Counter Unit at the Instituto de Radioprotecao e Dosimetria. The results demonstrated the efficiency of the system in determining 99mTc and 131I activities in breasts during the lactation period. The positioning methodology for the 'breast geometry' proved more efficient than the 'whole-body geometry' for different breast volumes. The experiment allows a better evaluation of the internal dosimetry of mothers and their young children. (author)

  5. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

    The availability of reliable tools and an associated methodology able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After outlining the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, on its extended experimental database and on the assessment of the GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO model uncertainties is of the utmost importance for accurate fuel design margin evaluation, as illustrated by some application examples. With the submittal of Topical Report GALILEO to the U.S. NRC in 2013, GALILEO and its methodology are on the way to industrial use in a wide range of irradiation conditions. (authors)
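The Monte Carlo random sampling described above can be illustrated in a few lines: draw each uncertain input from its distribution, push the samples through a response model, and read design margins off an upper percentile of the output population. The response function and the distributions below are invented placeholders, not GALILEO models:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo trials

# Invented input uncertainties (placeholders, not GALILEO's models):
cond = rng.normal(1.0, 0.05, N)    # conductivity multiplier, +/-5% (1 sigma)
gap = rng.lognormal(0.0, 0.10, N)  # gap conductance multiplier, skewed

def centerline_temperature(cond, gap, nominal=1200.0):
    """Toy response: temperature rises as conductivity or gap conductance drop."""
    return nominal / (0.6 * cond + 0.4 * gap)

t = centerline_temperature(cond, gap)
# A design margin is read from an upper percentile of the output population
p95 = np.percentile(t, 95.0)
```

The realistic methodology's advantage over deterministic bounding is visible here: the 95th percentile of the sampled population is far less conservative than stacking worst-case inputs.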

  6. A new methodology for fault detection in rolling element bearings using singular spectrum analysis

    Directory of Open Access Journals (Sweden)

    Bugharbee Hussein Al

    2018-01-01

    Full Text Available This paper proposes a vibration-based methodology for fault detection in rolling element bearings, based on pure data analysis via the singular spectrum method. The method builds a baseline space from feature vectors made of signals measured in the healthy/baseline bearing condition. The feature vectors are made from the Euclidean norms of the first three PCs found for the measured signals. The lagged version of any new signal, corresponding to a new (possibly faulty) condition, is then projected onto this baseline feature space in order to assess its similarity to the baseline condition. The category of a new signal vector is determined from the Mahalanobis distance (MD) of its feature vector to the baseline space. A validation of the methodology is presented based on results from an experimental test rig. The results obtained confirm the effective performance of the suggested methodology. It consists of simple steps and is easy to apply, with the prospect of making it automatic and suitable for commercial applications.
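The pipeline outlined above (lag-embed each signal, take the norms of the first three principal components as a feature vector, then score new signals by Mahalanobis distance to the baseline feature cloud) can be sketched as follows; the signals, window length and fault model are synthetic stand-ins, not the test-rig data:

```python
import numpy as np

def ssa_features(signal, window=30):
    """Feature vector = Euclidean norms of the first three principal
    components of the lag-embedded (trajectory) matrix; these norms
    equal the three leading singular values."""
    n = len(signal) - window + 1
    X = np.stack([signal[i:i + window] for i in range(n)])
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    return np.linalg.norm(U[:, :3] * s[:3], axis=0)

def mahalanobis(x, mean, cov_inv):
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 400)

# Baseline (healthy) condition: noisy sinusoids (synthetic stand-ins)
baseline = [np.sin(2 * np.pi * 25 * t) + 0.1 * rng.standard_normal(t.size)
            for _ in range(20)]
F = np.stack([ssa_features(s) for s in baseline])
mu, cov_inv = F.mean(axis=0), np.linalg.inv(np.cov(F.T))

# A faulty signal: same tone plus periodic impacts, as from a bearing defect
faulty = np.sin(2 * np.pi * 25 * t) + 0.1 * rng.standard_normal(t.size)
faulty[::40] += 3.0

d_ok = mahalanobis(ssa_features(baseline[0]), mu, cov_inv)
d_bad = mahalanobis(ssa_features(faulty), mu, cov_inv)
```

In practice the decision threshold on the distance would be set from the spread of baseline distances (for example, a high percentile of leave-one-out scores).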

  7. Application of a new methodology to the multicycle analysis for the Laguna Verde NPP in Mexico

    International Nuclear Information System (INIS)

    Cortes C, Carlos C.

    1997-01-01

    This paper describes the improvements made to the physical and economic methodologies of the multicycle analysis for the Boiling Water Reactors of the Laguna Verde NPP in Mexico, based on commercial codes and in-house developed computational tools. With these changes in the methodology, three feasible scenarios are generated for the operation of Laguna Verde Nuclear Power Plant Unit 2 on 12-, 18- and 24-month cycles. The physical and economic results obtained are shown. Furthermore, the effect of replacement power is included in the economic evaluation. (author). 11 refs., 3 figs., 7 tabs

  8. Methodology of evaluation of value created in the productive processes

    OpenAIRE

    M.T. Roszak

    2008-01-01

    Purpose: The purpose of this paper was to present a methodology for analysis of productive processes, applying value analysis and multi-criterion analysis, which allow the technology and organization of the productive processes to be evaluated. Design/methodology/approach: The methodology of evaluation of productive processes presented in the paper is based on analysis of activities in the productive processes and their characteristics with reference to the value created in the productive chain. Findings...

  9. A methodology for the analysis of differential coexpression across the human lifespan.

    Science.gov (United States)

    Gillis, Jesse; Pavlidis, Paul

    2009-09-22

    Differential coexpression is a change in coexpression between genes that may reflect 'rewiring' of transcriptional networks. It has previously been hypothesized that such changes might be occurring over time in the lifespan of an organism. While both coexpression and differential expression of genes have been previously studied in life stage change or aging, differential coexpression has not. Generalizing differential coexpression analysis to many time points presents a methodological challenge. Here we introduce a method for analyzing changes in coexpression across multiple ordered groups (e.g., over time) and extensively test its validity and usefulness. Our method is based on the use of the Haar basis set to efficiently represent changes in coexpression at multiple time scales, and thus represents a principled and generalizable extension of the idea of differential coexpression to life stage data. We used published microarray studies categorized by age to test the methodology. We validated the methodology by testing our ability to reconstruct Gene Ontology (GO) categories using our measure of differential coexpression and compared this result to using coexpression alone. Our method allows significant improvement in characterizing these groups of genes. Further, we examine the statistical properties of our measure of differential coexpression and establish that the results are significant both statistically and by an improvement in semantic similarity. In addition, we found that our method finds more significant changes in gene relationships compared to several other methods of expressing temporal relationships between genes, such as coexpression over time. Differential coexpression over age generates significant and biologically relevant information about the genes producing it. Our Haar basis methodology for determining age-related differential coexpression performs better than other tested methods. The Haar basis set also lends itself to ready interpretation.
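The Haar basis idea can be shown concretely: transform a gene pair's coexpression values across ordered age groups, and a change in correlation at a given life stage appears as energy in the detail coefficient at the matching scale. A toy sketch with invented correlation values, not the microarray data of the study:

```python
import numpy as np

def haar_transform(x):
    """Full Haar decomposition of a length-2^k vector; output is ordered
    [scaled mean, coarsest detail, ..., finest details]."""
    x = np.asarray(x, dtype=float)
    details = []
    while len(x) > 1:
        details.append((x[0::2] - x[1::2]) / np.sqrt(2.0))
        x = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    return np.concatenate([x] + details[::-1])

# Coexpression of a gene pair over 8 ordered age groups (invented values):
steady = np.full(8, 0.8)                                         # no rewiring
switch = np.array([0.8, 0.8, 0.8, 0.8, -0.8, -0.8, -0.8, -0.8])  # mid-life flip

h_steady = haar_transform(steady)
h_switch = haar_transform(switch)
# The steady pair loads only on the mean term; the flip concentrates its
# energy in the coarsest detail coefficient = differential coexpression
```

Because the transform is orthonormal, total energy is preserved, so the size of a detail coefficient is directly comparable across scales.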

  10. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  11. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality assumption have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, the Granger causality test, on 45 data points. However, the insufficiency of well-established causality models was revealed, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for use in situations where historical data are insufficient. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts, using three rounds of questionnaires; the study revealed that only 20% of the propositions were not supported. Both methods showed the existence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures. With that, a computer modeling and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme condition tests were conducted on the developed SD model to ensure its ability to mimic reality, and its robustness and validity as a platform for causality analysis. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, where very limited work has been done.
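The Granger causality test used above boils down to an F-test comparing an autoregressive model of one series with and without lagged values of the other. A self-contained sketch on synthetic data (the series and coupling strength are invented; a real study would use a library implementation with proper p-values):

```python
import numpy as np

def granger_f(y, x, lags=2):
    """F statistic for 'x Granger-causes y': restricted AR(lags) model of y
    versus the same model augmented with lagged x."""
    n = len(y)
    Y = y[lags:]
    X_r = np.column_stack([np.ones(n - lags)] +
                          [y[lags - k:n - k] for k in range(1, lags + 1)])
    X_u = np.column_stack([X_r] +
                          [x[lags - k:n - k] for k in range(1, lags + 1)])
    rss = lambda A: float(np.sum((Y - A @ np.linalg.lstsq(A, Y, rcond=None)[0]) ** 2))
    rss_r, rss_u = rss(X_r), rss(X_u)
    df_den = n - lags - X_u.shape[1]
    return ((rss_r - rss_u) / lags) / (rss_u / df_den)

rng = np.random.default_rng(1)
n = 300
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):   # y is driven by lagged x, so x -> y should register
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

f_xy = granger_f(y, x)  # x -> y: large F
f_yx = granger_f(x, y)  # y -> x: near the null
```

Running the test in both directions, as here, is what distinguishes the unidirectional causality assumed by the Strategy Map from the bidirectional causality the study actually found.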

  12. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    Science.gov (United States)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel consumption, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient fuel system alternative using mathematical models, and the set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions to develop in-house energy supply sources for RH processing facilities are provided.

  13. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We therefore analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II, supported by the ESA PSS-05-0 Standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used together with the ESA PSS-05-0 Standard. Our outcomes may, in general, be used by teams who need to build small satellites; in particular, they will be used when we build the on-board software applications for the SATEX-II.

  14. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  15. FRACTAL ANALYSIS OF TRABECULAR BONE: A STANDARDISED METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ian Parkinson

    2011-05-01

    Full Text Available A standardised methodology for the fractal analysis of histological sections of trabecular bone has been established. A modified box counting method has been developed for use on a PC based image analyser (Quantimet 500MC, Leica Cambridge). The effect of image analyser settings, magnification, image orientation and threshold levels was determined. Also, the range of scale over which trabecular bone is effectively fractal was determined, and a method was formulated to objectively calculate more than one fractal dimension from the modified Richardson plot. The results show that magnification, image orientation and threshold settings have little effect on the estimate of fractal dimension. Trabecular bone has a lower limit below which it is not fractal (λ < 25 μm) and the upper limit is 4250 μm. There are three distinct fractal dimensions for trabecular bone (sectional fractals), with magnitudes greater than 1.0 and less than 2.0. It has been shown that trabecular bone is effectively fractal over a defined range of scale and that, within this range, there is more than one fractal dimension, describing spatial structural entities. Fractal analysis is a model independent method for describing a complex multifaceted structure, which can be adapted for the study of other biological systems. This may be at the cell, tissue or organ level and complements conventional histomorphometric and stereological techniques.
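The box counting method reduces to counting, at each box size, how many boxes contain foreground pixels, then taking the slope of the Richardson (log-log) plot. A minimal sketch on synthetic binary images (not the histological data of the study, and without the authors' modifications for multiple sectional dimensions):

```python
import numpy as np

def box_count_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Fractal (box-counting) dimension of a binary image: slope of
    log(box count) versus log(1/box size) on the Richardson plot."""
    h, w = img.shape
    counts = []
    for s in sizes:
        blocks = img[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

n = 256
square = np.ones((n, n), dtype=bool)  # a filled region: dimension ~2
line = np.zeros((n, n), dtype=bool)
line[n // 2, :] = True                # a straight line: dimension ~1

d_square = box_count_dimension(square)
d_line = box_count_dimension(line)
```

The study's key refinement is restricting the fit to the range of scales where the structure is actually fractal; here that would mean choosing `sizes` inside the 25 μm to 4250 μm window rather than fitting all scales blindly.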

  16. Elaboration of the methodological referential for life cycle analysis of first generation biofuels in the French context

    International Nuclear Information System (INIS)

    2008-01-01

    This study was made in the particular context of strong growth in the biofuels market and the commitment of French and European public authorities, and certain Member States (Germany, Netherlands, UK), to the development of certification schemes for first generation biofuels. The elaboration of such schemes requires a consensus on the methodology to apply when producing Life Cycle Analyses (LCA) of biofuels. To answer this demand, the study built up the methodological referential for biofuel LCAs in order to assess the greenhouse gas (GHG) emissions, fossil fuel consumption and local atmospheric pollutant emissions induced by the different biofuel production pathways. The work consisted of methodological engineering and was accomplished thanks to the participation of all the members of the Technical Committee of the study. An initial bibliographic review of biofuel LCAs allowed the identification of the main methodological issues (listed below). For each point, the impact of the methodological choices on the biofuel environmental balances was assessed by several sensitivity analyses, whose results were taken into account in the elaboration of the recommendations: - Consideration of the environmental burdens associated with buildings, equipment and their maintenance - Quantification of nitrous oxide (N2O) emissions from fields - Impact of Land Use Change (LUC) - Allocation method for the distribution of the environmental impacts of biofuel production pathways between the different products and coproducts generated. Within the framework of this study, we made no distinction in terms of methodological approach between GHG emissions and local pollutant emissions, because the methodological issues cover all the environmental burdens and do not require specific approaches. This executive summary presents the methodological aspects related to biofuel LCAs. The complete report of the study presents, in addition...

  17. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1997-01-01

    The heuristically based generalized perturbation theory (HGPT) methodology has been used extensively in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is discussed here, with particular emphasis on problems involving an intensive system control variable. (author)

  18. A cost effective waste management methodology for power reactor waste streams

    International Nuclear Information System (INIS)

    Granus, M.W.; Campbell, A.D.

    1984-01-01

    This paper describes a computer based methodology for the selection of processing methods (solidification/dewatering) for various power reactor radwaste streams. The purpose of this methodology is to select the method that provides the most cost effective solution to waste management, taking into account the overall cost of processing, transportation and disposal. The selection matrix on which the methodology is based comprises over ten thousand combinations of liner, cask, process, and disposal options from which the waste manager can choose. The measure of cost effective waste management is the concurrent evaluation of total dollars spent; the common denominator is dollars per cubic foot of the input waste stream, with dollars per curie of the input waste stream providing checks and balances. The result of this analysis can then be used to assess the total waste management cost. To this end, the methodology can be employed to predict a given number of events (processes, transportation, and disposals) and project the annual cost of waste management. For the purposes of this paper, the authors provide examples of the application of the methodology on a typical BWR at 2, 4 and 6 years, in 1984 dollars. Process selection is influenced by a number of factors which must be evaluated independently for each waste stream. Final processing cost is affected by the particular process efficiency and a variety of regulatory constraints. The interface between process selection and cask selection/transportation is driven by the goal of placing the greatest amount of pre-processed waste in the package while remaining within the bounds of weight, volume, regulatory, and cask availability limitations. Disposal cost is the cost of burial and can be affected by the availability of burial space and the location of the disposal site in relation to the generator.
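The selection matrix described above reduces to enumerating process/liner/cask/disposal combinations and minimizing dollars per cubic foot of input waste. A toy sketch with invented option lists and unit costs (the real matrix has thousands of entries and regulatory constraints the sketch ignores):

```python
from itertools import product

# Invented option lists and unit costs, in $ per cubic foot (illustrative only)
process_cost = {"cement_solidification": 90.0, "dewatering": 40.0}
volume_factor = {"cement_solidification": 1.6, "dewatering": 0.5}  # output/input ft3
liner_cost = {"HIC": 55.0, "steel_liner": 30.0}
cask_cost = {"shielded_A": 25.0, "shielded_B": 18.0}
disposal_cost = {"site_1": 70.0, "site_2": 95.0}

def cost_per_input_ft3(proc, liner, cask, site):
    """Common denominator from the abstract: dollars per cubic foot of the
    *input* waste stream; downstream costs scale with processed volume."""
    f = volume_factor[proc]
    return process_cost[proc] + f * (liner_cost[liner] + cask_cost[cask]
                                     + disposal_cost[site])

best = min(product(process_cost, liner_cost, cask_cost, disposal_cost),
           key=lambda c: cost_per_input_ft3(*c))
best_cost = cost_per_input_ft3(*best)
```

The volume factor is the pivot of the whole comparison: a process that shrinks the waste (dewatering) discounts every downstream packaging, transport and burial cost, while solidification multiplies them.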

  19. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity, as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes.
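The paired-analysis idea (a single T2w search space shared across time points, so black holes at both scans are measured inside the same lesion mask) can be sketched with toy binary masks; all images, thresholds and the darkening model below are synthetic stand-ins for the MRI pipeline, which also involves registration the sketch omits:

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (32, 32)

# Toy binary T2w lesion masks at two time points (synthetic stand-ins)
t2_a = rng.random(shape) < 0.10
t2_b = rng.random(shape) < 0.12

# Paired analysis: one search space shared by both time points, so black
# holes at both scans are measured inside the same T2w region
search_space = t2_a | t2_b

# "Black holes" = darkest voxels inside the search space (toy intensities);
# time B is uniformly darker, simulating progression
img_a = rng.random(shape)
img_b = img_a - 0.05
thr = 0.3
bh_a = search_space & (img_a < thr)
bh_b = search_space & (img_b < thr)
delta_volume = int(bh_b.sum()) - int(bh_a.sum())
```

Using the combined mask removes the component of volume change that would otherwise come purely from the two T2w segmentations disagreeing about the search region.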

  20. An analysis of malar fat volume in two age groups: implications for craniofacial surgery.

    Science.gov (United States)

    Corey, Christina L; Popelka, Gerald R; Barrera, Jose E; Most, Sam P

    2012-12-01

    Objective To evaluate how malar fat pad (MFP) volumes vary with age, after controlling for gender and body mass index (BMI). Study Design A prospective case-control study evaluating volume of the MFP in women of two age groups. Methods Soft tissue dimensions were measured in eight subjects using magnetic resonance imaging. A multiplanar localizing sequence, followed in sagittal and coronal orientations using a turbo spin echo sequence, was performed to define the MFP. Volumetric calculations were then performed using a 3D image analysis application (Dextroscope, Volume Interactions, Republic of Singapore) to circumscribe areas, orient dimensions, and calculate volumes of the MFP. Results These data reveal no significant difference in the mean (standard deviation) right MFP (p = 0.50), left MFP (p = 0.41), or total MFP (p = 0.45) volumes when comparing the two age groups. In addition, these data indicate that there was no correlation between age and total MFP volume (Pearson correlation coefficient 0.27). Moreover, there was no correlation between age and the ratio of total volume/BMI (Pearson correlation coefficient -0.18). Conclusions Although the sample size of this study was small, these data indicate that ptosis of midfacial fat is more important than volume loss in midfacial aging. These data would suggest repositioning as the primary modality for craniofacial reconstruction.