WorldWideScience

Sample records for analysis methodology volume

  1. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  2. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  3. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    International Nuclear Information System (INIS)

    This report provides an overview of the basis and methodology requirements for determining Pressurized Water Reactor (PWR) technical specifications related setpoints and focuses on development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970's to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. The descriptions of the typical setpoint methodologies are provided for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse. The description of the methods of analysis includes the discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR

  4. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume I. Data analysis methodology and hardware description

    International Nuclear Information System (INIS)

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings

  5. A normative price for a manufactured product: The SAMICS methodology. Volume 2: Analysis

    Science.gov (United States)

    Chamberlain, R. G.

    1979-01-01

    The Solar Array Manufacturing Industry Costing Standards provide standard formats, data, assumptions, and procedures for determining the price a hypothetical solar array manufacturer would have to be able to obtain in the market to realize a specified after-tax rate of return on equity for a specified level of production. The methodology and its theoretical background are presented. The model is sufficiently general to be used in any production-line manufacturing environment. Implementation of this methodology by the Solar Array Manufacturing Industry Simulation computer program is discussed.
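
    As an illustration of the price logic this abstract describes, the sketch below computes the unit price that recovers costs plus a target after-tax return on equity. The flat cost structure, names, and numbers are assumptions for illustration, not the SAMICS cost model itself.

    ```python
    # Minimal sketch of a "normative price": the unit price at which a
    # hypothetical manufacturer earns a target after-tax return on equity.
    # The cost structure and all numbers are illustrative assumptions,
    # not the actual SAMICS model.

    def normative_price(annual_cost, equity, target_roe, tax_rate, annual_units):
        """Price per unit yielding `target_roe` after tax on `equity`."""
        required_after_tax_profit = target_roe * equity
        required_pretax_profit = required_after_tax_profit / (1.0 - tax_rate)
        required_revenue = annual_cost + required_pretax_profit
        return required_revenue / annual_units

    # Example: $4M annual cost, $10M equity, 20% target ROE, 40% tax, 1M units/yr
    print(normative_price(4e6, 10e6, 0.20, 0.40, 1e6))  # -> ~7.33 $/unit
    ```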

  6. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.L.

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. 29 refs., 15 tabs.
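
    The probabilistic approach sketched above combines source activity rates, magnitude-distance distributions, and a ground motion model into an annual exceedance rate. A standard form of the hazard integral is shown below as an illustration; the report's expert-opinion formulation may differ in detail.

    ```latex
    % Standard probabilistic seismic hazard integral (illustrative):
    \lambda(A > a) = \sum_{i=1}^{N_{\mathrm{sources}}} \nu_i
      \int_m \int_r P\left(A > a \mid m, r\right)
      f_{M_i}(m)\, f_{R_i}(r)\, \mathrm{d}m\, \mathrm{d}r
    ```

    Here ν_i is the activity rate of source i, f_M and f_R are the magnitude and distance densities, and P(A > a | m, r) comes from the ground motion model.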

  7. Socioeconomic effects of the DOE Gas Centrifuge Enrichment Plant. Volume 1: methodology and analysis

    International Nuclear Information System (INIS)

    The socioeconomic effects of the Gas Centrifuge Enrichment Plant being built in Portsmouth, Ohio were studied. Chapters are devoted to labor force, housing, population changes, economic impact, method for analysis of services, analysis of service impacts, schools, and local government finance

  8. Methodology for generating waste volume estimates

    International Nuclear Information System (INIS)

    This document describes the methodology that will be used to calculate waste volume estimates for site characterization and remedial design/remedial action activities at each of the DOE Field Office, Oak Ridge (DOE-OR) facilities. This standardized methodology is designed to ensure consistency in waste estimating across the various sites and organizations that are involved in environmental restoration activities. The criteria and assumptions that are provided for generating these waste estimates will be implemented across all DOE-OR facilities and are subject to change based on comments received and actual waste volumes measured during future sampling and remediation activities. 7 figs., 8 tabs

  9. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  10. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
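
    The core combination step can be illustrated with a short sketch: weight each shelter category's dose reduction by the fraction of the population in it. The protection factors and population split below are hypothetical placeholders, not values from the report.

    ```python
    # Population-weighted exposure given an outdoor dose and a shelter mix.
    # Protection factors (PF) and population fractions are hypothetical.

    def population_weighted_exposure(outdoor_dose, shelter_mix):
        """shelter_mix: list of (population_fraction, protection_factor)."""
        assert abs(sum(f for f, _ in shelter_mix) - 1.0) < 1e-9
        return sum(frac * outdoor_dose / pf for frac, pf in shelter_mix)

    # Night posture: 70% in light housing (PF 3), 25% in basements (PF 20),
    # 5% outdoors (PF 1).
    mix = [(0.70, 3.0), (0.25, 20.0), (0.05, 1.0)]
    print(population_weighted_exposure(10.0, mix))  # ~2.96, vs. 10 unsheltered
    ```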

  11. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 2: Methodology and Results

    International Nuclear Information System (INIS)

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3)

  12. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 2: Methodology and Results

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-10-01

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  13. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  14. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple checklists to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  15. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  16. Normative price for a manufactured product: the SAMICS methodology. Volume II. Analysis. JPL publication 78-98. [Solar Array Manufacturing Industry Costing Standards

    Energy Technology Data Exchange (ETDEWEB)

    Chamberlain, R.G.

    1979-01-15

    The Solar Array Manufacturing Industry Costing Standards (SAMICS) provide standard formats, data, assumptions, and procedures for determining the price a hypothetical solar array manufacturer would have to be able to obtain in the market to realize a specified after-tax rate of return on equity for a specified level of production. This document presents the methodology and its theoretical background. It is contended that the model is sufficiently general to be used in any production-line manufacturing environment. Implementation of this methodology by the Solar Array Manufacturing Industry Simulation computer program (SAMIS III, Release 1) is discussed.

  17. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor) which has been developed by Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, further validated final safety analysis methodology will be developed. Current licensing safety analysis methodology of the Westinghouse and KSNPP PWRs operating and under development in Korea as well as the Russian licensing safety analysis methodology for the integral reactors have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. Validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended for the nuclear regulatory authority to establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  18. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  19. Particulate matter test in small volume parenterals: critical aspects in sampling methodology.

    Science.gov (United States)

    Pavanetto, F; Conti, B; Genta, I; Ponci, R; Montanari, L; Grassi, M

    1989-06-01

    The following critical steps of the particulate matter test sampling methodology for small volume parenteral products (SVPs), conducted by the light-blockage method, were considered: 1) reliability of the small volume aspirator sampler for different sample volumes; 2) particulate matter distribution inside each ampoule in liquid products (8 liquid SVPs tested); 3) influence of the sample preparation method on the evaluation of the final contamination of the sample. Nine liquid SVPs were tested by preparing samples following the three U.S.P. XXI methods: I) unit as it is (direct analysis), II) unit diluted, III) sample obtained by combining several units. Particle counts were performed by a HIAC/ROYCO model 3000 counter fitted with a small volume sampler. The validation of the sampler shows that it should be improved; a more accurate and stricter validation than the one stated by U.S.P. XXI is suggested. The particulate matter distribution in liquid products is found to be uniform inside the ampoule in the size range from ≥2 microns to ≥10 microns; the analysis can be performed examining only a portion of the whole content. The three sample preparation methods lead to significantly different contamination results. The particulate control test should be conducted by direct analysis, as it is carried out under the same conditions as for product use. The combining method (III) is suggested for products of less than 2 ml volume that cannot be examined by direct analysis. PMID:2803449

  20. Methodological advancements in procedures for common cause failure analysis

    International Nuclear Information System (INIS)

    This paper summarizes the methodological advancements achieved in the process of developing a procedures guide for the analysis of common cause failures (CCF) in safety and reliability studies. The work was sponsored by the Electric Power Research Institute and the U.S. Nuclear Regulatory Commission, and resulted in the publication of a two-volume guidebook. The methodological advancements include the development of a systematic framework for qualitative and quantitative analysis of CCFs, introduction of basic events, improvements in parametric models and their estimators, and development of a series of techniques for the creation of a plant-specific CCF database
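
    As an illustration of the parametric models mentioned above (not necessarily the specific estimators advanced in the guidebook), the widely used beta-factor model splits a component's total failure rate into independent and common cause parts:

    ```latex
    % Beta-factor model (illustrative example of a parametric CCF model):
    \lambda_{\mathrm{total}} = \lambda_{\mathrm{ind}} + \lambda_{\mathrm{CCF}},
    \qquad
    \beta = \frac{\lambda_{\mathrm{CCF}}}{\lambda_{\mathrm{total}}}
    \;\Longrightarrow\;
    \lambda_{\mathrm{CCF}} = \beta\,\lambda_{\mathrm{total}}
    ```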

  1. Exploratory Factor Analysis: Conceptual and Methodological Basis

    OpenAIRE

    Edgardo R. Pérez; Medrano, Leonardo

    2010-01-01

    The present work reviews the foundations and principal procedures of exploratory factor analysis, conceived as an essential method for the construction, adjustment and validation of psychological tests. The article first considers the principal strategic decisions that researchers must make during the implementation of this methodology. Additionally, this work presents discussions regarding different conceptual and methodological considerations that exploratory factor analysis presents in e...

  2. The Methodology of Data Envelopment Analysis.

    Science.gov (United States)

    Sexton, Thomas R.

    1986-01-01

    The methodology of data envelopment analysis, (DEA) a linear programming-based method, is described. Other procedures often used for measuring relative productive efficiency are discussed in relation to DEA, including ratio analysis and multiple regression analysis. The DEA technique is graphically illustrated for only two inputs and one output.…

  3. Methodology of human factor analysis

    International Nuclear Information System (INIS)

    The paper describes the manner in which the Heat Production Department of Electricite de France analyses the human factors in nuclear power plants. After describing the teams and structures set up to deal with this subject, the paper emphasizes two types of methods which are used, most often in complementary fashion: (1) an a posteriori analysis, which consists in studying the events which have taken place at nuclear power plants and in seeking the deep-seated causes so as to prevent their recurrence in future; (2) an a priori analysis, which consists in analysing a work situation and in detecting all its potential failure factors so as to prevent their resulting once again in dysfunctions of the facility. To illustrate these two types of analysis, two examples are given: first, a study of the telephonic communications between operators in one plant (in which the a posteriori and a priori analyses are developed) and, second, a study of stress in a plant (in which only the a priori analysis is used). (author). 1 tab

  4. Malware Analysis Sandbox Testing Methodology

    Directory of Open Access Journals (Sweden)

    Zoltan Balazs

    2016-01-01

    Manual processing of malware samples became impossible years ago. Sandboxes are used to automate the analysis of malware samples to gather information about the dynamic behaviour of the malware, both at AV companies and at enterprises. Some malware samples use known techniques to detect when they run in a sandbox, but most of these sandbox detection techniques can be easily detected and thus flagged as malicious. I invented new approaches to detect these sandboxes. I developed a tool which can collect a lot of interesting information from these sandboxes to create statistics on how the current technologies work. After analysing these results I will demonstrate tricks to detect sandboxes. These tricks can’t be easily flagged as malicious. Some sandboxes do not interact with the Internet in order to block data extraction, but with some DNS-fu the information can be extracted from these appliances as well.

  5. Apparatus for measuring rat body volume: a methodological proposition.

    Science.gov (United States)

    Hohl, Rodrigo; de Oliveira, Renato Buscariolli; Vaz de Macedo, Denise; Brenzikofer, René

    2007-03-01

    We propose a communicating-vessels system to measure body volume in live rats through water level detection by hydrostatic weighing. The reproducibility, accuracy, linearity, and reliability of this apparatus were evaluated in two tests using previously weighed water or six aluminum cylinders of known volume after proper system calibration. The applicability of this apparatus to measurement of live animals (Wistar rats) was tested in a transversal experiment with five rats, anesthetized and nonanesthetized. We took 18 measurements of the volume under each condition (anesthetized and nonanesthetized), totaling 90 measurements. The addition of water volumes (50-700 ml) produced a regression equation with a slope of 1.0006 +/- 0.0017, intercept of 0.75 +/- 0.81 (R(2) = 0.99999, standard error of estimate = 0.58 ml), and bias of approximately 1 ml. The differences between the known cylinder volumes and the volumes calculated by the system were attributed to air bubbles trapped in the apparatus or the fur. The proposed apparatus for measuring rat body volume is inexpensive and may be useful for a range of scientific purposes. PMID:17082370
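
    The validation reported above is essentially a linear calibration of measured against true volume. A minimal reproduction of that regression step, with synthetic data standing in for the paper's measurements, might look like this:

    ```python
    # Regress measured volume on true (added) volume and inspect slope and
    # intercept. The data are synthetic, generated to mimic the reported fit
    # (slope 1.0006 +/- 0.0017, intercept ~0.75 ml, SEE ~0.58 ml).
    import numpy as np

    true_ml = np.array([50, 100, 200, 300, 400, 500, 600, 700], dtype=float)
    measured_ml = true_ml * 1.0006 + 0.75 + np.random.normal(0, 0.58, true_ml.size)

    slope, intercept = np.polyfit(true_ml, measured_ml, 1)
    print(f"slope={slope:.4f}, intercept={intercept:.2f} ml")
    ```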

  6. Simplified methodology for Angra 1 containment analysis

    International Nuclear Information System (INIS)

    A simplified analysis methodology was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained by this new methodology were satisfactory for a preliminary evaluation of the Angra 1 global parameters, at a small computational cost. (author)

  7. Exploring participatory methodologies in organizational discourse analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches as single-level vs. multi-level, critical vs. participatory, discursive vs. material methods. They raise methodological issues of combining such approaches to embrace multimodality in order to enable new … collaborative governance processes in Denmark. The paper contributes to refining methodologies of organizational discourse analysis by elaborating method-mixing that embraces multimodal organizational discourses. Furthermore, it discusses practical implications of the struggling subjectification processes of … and practices by dealing with challenges of methodological overview, responsive creativity and identity-struggle. The potentials hereof are demonstrated and discussed with cases of two both critical and co-creative practices, namely ‘organizational modelling’ and ‘fixed/unfixed positioning’, from fieldwork in …

  8. Methodology of analysis of casting defects

    Directory of Open Access Journals (Sweden)

    L.A. Dobrzański

    2006-08-01

    Purpose: The goal of this publication is to present the methodology of the automatic supervision and control of the technological process of manufacturing elements from aluminium alloys, and the methodology of the automatic quality assessment of these elements based on analysis of images obtained with X-ray defect detection, employing artificial intelligence tools. The methodologies developed will make identification and classification of defects possible, and appropriate process control will make it possible to reduce them and to eliminate them, at least in part. Design/methodology/approach: The methodology presented in the paper makes it possible to determine the types and classes of defects developed during casting of elements from aluminium alloys, making use of photos obtained with the flaw detection method with X-ray radiation. It is very important to prepare the neural network data in the appropriate way, including their standardization, carrying out the proper image analysis, and correct selection and calculation of the geometrical coefficients of flaws in the X-ray images. Computer software was developed for this task. Findings: Combining all the methods making use of image analysis, geometrical shape coefficients and neural networks will make it possible to achieve better efficiency of class recognition of flaws developed in the material. Practical implications: The presented issues may be essential, among others, for manufacturers of car subassemblies from light alloys, where meeting stringent quality requirements ensures the demanded service life of the manufactured products. Originality/value: A correctly specified number of products enables such technological process control that the number of casting defects can be reduced by means of proper correction of the process.
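
    One classic geometrical shape coefficient of the kind used to describe flaws in X-ray images before classification is circularity; the sketch below is illustrative and assumes the flaw's area and perimeter have already been measured from a binary image (the paper's actual coefficient set is not reproduced here).

    ```python
    # Circularity shape coefficient: 4*pi*A / P^2, equal to 1.0 for a disc
    # and smaller for elongated or ragged flaws.
    import math

    def circularity(area: float, perimeter: float) -> float:
        return 4.0 * math.pi * area / perimeter ** 2

    print(circularity(math.pi * 5**2, 2 * math.pi * 5))  # disc -> 1.0
    print(circularity(100.0, 60.0))                      # elongated flaw -> ~0.35
    ```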

  9. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    In the present work the viability of using the neutron activation analysis to perform urine and blood clinical analysis was checked. The aim of this study is to investigate the biological behavior of animals that has been fed with chow doped by natural uranium for a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentration on biological samples. The quantitative results of urine sediment using NAA were compared with the conventional clinical analysis and the results were compatible. This methodology was also used on bone and body organs such as liver and muscles to help the interpretation of possible anomalies. (author)

  10. Metabolically active volumes automatic delineation methodologies in PET imaging: Review and perspectives

    International Nuclear Information System (INIS)

    PET imaging is now considered a gold standard tool in clinical oncology, especially for diagnosis purposes. More recent applications such as therapy follow-up or tumor targeting in radiotherapy require a fast, accurate and robust delineation of metabolically active tumor volumes on emission images, which cannot be obtained through manual contouring. This clinical need has spurred a large number of methodological developments regarding automatic methods to define tumor volumes on PET images. This paper reviews most of the methodologies that have been recently proposed and discusses their framework and methodological and/or clinical validation. Perspectives regarding the future work to be done are also suggested. (authors)

  11. Methodology for the systems engineering process. Volume 3: Operational availability

    Science.gov (United States)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
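
    For reference, the steady-state form usually taken as the starting point for operational availability (the report develops the parameter in more generality) is:

    ```latex
    % Steady-state operational availability (standard form):
    A_o = \frac{\mathrm{Uptime}}{\mathrm{Uptime} + \mathrm{Downtime}}
        = \frac{\mathrm{MTBM}}{\mathrm{MTBM} + \mathrm{MDT}}
    ```

    where MTBM is the mean time between maintenance actions and MDT the mean downtime per action.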

  12. Assessment methodology for new cooling lakes. Volume 3. Limnological and fisheries data and bibliography. Final report

    International Nuclear Information System (INIS)

    This is the data volume of the report entitled Assessment Methodology for New Cooling Lakes. Limnological and fisheries data were compiled in this volume for potential users in the utility industry. Published papers, reports, other written information, computer files, and direct contacts were used to compile a matrix of information. This volume presents data and the bibliographic sources of the power plant and geographical, limnological, and fisheries information for 181 lakes and reservoirs, of which 134 were used for cooling purposes. Data for 65 lakes were completed with respect to the limnology and fisheries parameters so that complete statistical analysis could be performed. Of these 65 lakes, 42 are used for cooling. Tables in this report contain data arranged by utility, power plant, limnology, water quality, morphoedaphic, and fishery categories. The data in the tables are keyed to a lake code. The references for the data shown are keyed to a numerical listing of the bibliography. Author, state, lake, and subject indexes facilitate searching for bibliographic information

  13. LOFT blowdown experiment safety analysis methodology

    International Nuclear Information System (INIS)

    An unprecedented blowdown experiment safety analysis (ESA) has been performed for the first two scheduled nuclear experiments in the Loss-of-Fluid Test (LOFT) facility. The ESA methodology is a unique approach needed to estimate conservatively the maximum consequences that will occur during an experiment. Through use of this information an acceptable risk in terms of adequate protection of the facility, personnel, and general public can be balanced with the requirements of the experiment program objectives. As an example, one of the LOFT program objectives is to evaluate the performance and effectiveness of emergency core cooling systems (ECCS) while relying on the same ECCSs (and backup ECCSs) to effectively perform as plant protection systems (PPS). The purpose of this paper is to present the LOFT blowdown ESA methodology

  14. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
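
    The sequenced, time-history simulation can be caricatured in a few lines: sample tornado occurrence, missile injection, transport and impact, and count damaging histories. Every distribution and parameter below is an invented placeholder; the TORMIS event models are data-based and far more detailed.

    ```python
    # Toy Monte Carlo over the event sequence occurrence -> injection ->
    # transport/impact -> damage. All probabilities are placeholders.
    import random

    def one_year_damage(rng: random.Random) -> bool:
        if rng.random() > 1e-3:          # no tornado at the site this year
            return False
        n_missiles = rng.randint(0, 50)  # missiles injected into the wind field
        for _ in range(n_missiles):
            hits = rng.random() < 0.01       # transport + impact
            if hits and rng.random() < 0.1:  # damage given impact
                return True
        return False

    rng = random.Random(42)
    trials = 1_000_000
    p = sum(one_year_damage(rng) for _ in range(trials)) / trials
    print(f"estimated annual damage probability ~ {p:.2e}")
    ```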

  15. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study.

  16. Enhanced recovery of unconventional gas. The methodology--Volume III (of 3 volumes)

    Energy Technology Data Exchange (ETDEWEB)

    Kuuskraa, V. A.; Brashear, J. P.; Doscher, T. M.; Elkins, L. E.

    1979-02-01

    The methodology is described in chapters on the analytic approach, estimated natural gas production, recovery from tight gas sands, recovery from Devonian shales, recovery from coal seams, and recovery from geopressured aquifers. (JRD)

  17. Cost analysis methodology of spent fuel storage

    International Nuclear Information System (INIS)

    The report deals with the cost analysis of interim spent fuel storage; however, it is not intended either to give a detailed cost analysis or to compare the costs of the different options. This report provides a methodology for calculating the costs of different options for interim storage of the spent fuel produced in the reactor cores. Different technical features and storage options (dry and wet, away from reactor and at reactor) are considered and the factors affecting all options defined. The major cost categories are analysed. Then the net present value of each option is calculated and the levelized cost determined. Finally, a sensitivity analysis is conducted taking into account the uncertainty in the different cost estimates. Examples of current storage practices in some countries are included in the Appendices, with description of the most relevant technical and economic aspects. 16 figs, 14 tabs
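
    The net present value and levelized cost calculations described above take the usual discounted forms (shown as an illustration; the report's cost categories are more detailed):

    ```latex
    % Discounted cost measures for a storage option over years t = 0..T:
    \mathrm{NPV} = \sum_{t=0}^{T} \frac{C_t}{(1+r)^t},
    \qquad
    \mathrm{LC} = \frac{\sum_{t=0}^{T} C_t\,(1+r)^{-t}}
                       {\sum_{t=0}^{T} Q_t\,(1+r)^{-t}}
    ```

    with C_t the cost in year t, Q_t the quantity of fuel handled or stored in year t, and r the discount rate.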

  18. Protein crystallography. Methodological development and comprehensive analysis

    International Nuclear Information System (INIS)

    There have been remarkable developments in the methodology for protein structure analysis over the past few decades. Currently, single-wavelength anomalous diffraction phasing of a selenomethionyl derivative (Se-SAD) is used as a general method for determining protein structure, while the sulfur single-wavelength anomalous diffraction method (S-SAD) using native protein is evolving as a next-generation method. In this paper, we look back on the early applications of multi-wavelength anomalous diffraction phasing of a selenomethionyl derivative (Se-MAD) and introduce the study of ribosomal proteins as an example of the comprehensive analysis that took place in the 1990s. Furthermore, we refer to the current state of development of the S-SAD method as well as automatic structure determination. (author)

  19. Methodologies for risk analysis in slope instability

    International Nuclear Information System (INIS)

    This paper reviews the different methodologies used to produce landslide risk maps, so that the reader can gain basic knowledge of how to proceed in their development. Landslide hazard maps are increasingly demanded by governments because, due to climate change, deforestation and the pressure exerted by the growth of urban centers, the damage caused by natural phenomena increases each year, making this area of work a field of study of growing importance. To explain the mapping process, each of its phases is covered in turn: from the study of the types of slope movements and the necessary management of geographic information systems (GIS), to inventories and analyses of landslide susceptibility, hazard, vulnerability and risk. (Author)

  20. Methodological issues in radiation dose-volume outcome analyses: Summary of a joint AAPM/NIH workshop

    International Nuclear Information System (INIS)

    This report represents a summary of presentations at a joint workshop of the National Institutes of Health and the American Association of Physicists in Medicine (AAPM). Current methodological issues in dose-volume modeling are addressed here from several different perspectives. Areas of emphasis include (a) basic modeling issues including the equivalent uniform dose framework and the bootstrap method, (b) issues in the valid use of statistics, including the need for meta-analysis, (c) issues in dealing with organ deformation and its effects on treatment response, (d) evidence for volume effects for rectal complications, (e) the use of volume effect data in liver and lung as a basis for dose escalation studies, and (f) implications of uncertainties in volume effect knowledge on optimized treatment planning. Taken together, these approaches to studying volume effects describe many implications for the development and use of this information in radiation oncology practice. Areas of significant interest for further research include the meta-analysis of clinical data; interinstitutional pooled data analyses of volume effects; analyses of the uncertainties in outcome prediction models, minimal parameter number outcome models for ranking treatment plans (e.g., equivalent uniform dose); incorporation of the effect of motion in the outcome prediction; dose-escalation/isorisk protocols based on outcome models; the use of functional imaging to study radio-response; and the need for further small animal tumor control probability/normal tissue complication probability studies
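
    The equivalent uniform dose framework mentioned above is commonly written in its generalized power-law form, reproduced here for orientation:

    ```latex
    % Generalized equivalent uniform dose over dose-volume histogram bins i:
    \mathrm{EUD} = \left( \sum_i v_i\, D_i^{\,a} \right)^{1/a}
    ```

    where v_i is the fractional volume receiving dose D_i and a is a tissue-specific parameter (large positive a emphasizes hot spots, a = 1 gives the mean dose).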

  1. Robust SMO methodology for exposure tool and mask variations in high volume production

    Science.gov (United States)

    Hashimoto, Takaki; Kai, Yasunobu; Masukawa, Kazuyuki; Nojima, Shigeki; Kotani, Toshiya

    2013-04-01

    A robust source mask optimization (RSMO) methodology has been developed for the first time to decrease variations of critical dimension (CD) and overlay displacement on the wafer caused by extremely complex exposure tools and mask patterns. The RSMO methodology takes into account exposure tool variations of source shape, aberrations and mask, as well as dose and focus, to obtain source shapes and mask patterns robust to the exposure tool variations. A comparison between the conventional SMO and the new RSMO found that the RSMO improved the edge placement error (EPE) and the displacement sensitivity to coma and astigmatism aberrations by 14% and 40%, respectively. Interestingly, even a greatly simplified source from the RSMO provides a smaller overall EPE than the needlessly complex source shape from the conventional SMO. Thus, the RSMO methodology is much more effective for semiconductor products in high volume production.

  2. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments.

  3. The significance and future of functional analysis methodologies

    OpenAIRE

    Mace, F. Charles

    1994-01-01

    Iwata, Dorsey, Slifer, Bauman, and Richman (1982) presented the first comprehensive and standardized methodology for identifying operant functions of aberrant behavior. This essay discusses the significance functional analysis has had for applied behavior analysis. The methodology has lessened the field's reliance on default technologies and promoted analysis of environment—behavior interactions maintaining target responses as the basis for selecting treatments. It has also contributed to the...

  4. Competitive analysis in banking: appraisal of the methodologies

    OpenAIRE

    Nicola Cetorelli

    1999-01-01

    How do we measure competition in the banking industry? This article provides an overview of the methodology currently used in competitive analysis and highlights alternative techniques that could be used to complement this methodology. Given the ongoing process of consolidation in U.S. banking, assessing the competitiveness of financial services markets is an important issue for policymakers.

  5. Texture analysis methodologies for magnetic resonance imaging

    OpenAIRE

    Materka, Andrzej

    2004-01-01

    Methods for the analysis of digital-image texture are reviewed. The functions of MaZda, a computer program for quantitative texture analysis developed within the framework of the European COST (Cooperation in the Field of Scientific and Technical Research) B11 program, are introduced. Examples of texture analysis in magnetic resonance images are discussed.

  6. METHODOLOGICAL ANALYSIS OF TRAINING STUDENT basketball teams

    Directory of Open Access Journals (Sweden)

    Kozina Zh.L.

    2011-06-01

    The leading principles of the preparation of student basketball teams in higher education institutions are considered. The system includes the following: reliance on top-quality players in the structure of preparedness; widespread use of visual aids, including teaching films and animations recording the execution of various techniques by professional basketball players; and the application of methods of autogenic and ideomotor training according to our methodology. The study involved 63 student basketball players of years 1-5 from various universities of Kharkov, holding 1st-2nd sports categories: 32 in the experimental group and 31 in the control group. The developed system of training student basketball players was used for 1 year. The results confirm the efficiency of the developed system in the training process of student basketball players.

  7. A Methodology For Flood Vulnerability Analysis In Complex Flood Scenarios

    Science.gov (United States)

    Figueiredo, R.; Martina, M. L. V.; Dottori, F.

    2015-12-01

    Nowadays, flood risk management is gaining importance in order to mitigate and prevent flood disasters, and consequently the analysis of flood vulnerability is becoming a key research topic. In this paper, we propose a methodology for large-scale analysis of flood vulnerability. The methodology is based on a GIS-based index, which considers local topography, terrain roughness and basic information about the flood scenario to reproduce the diffusive behaviour of floodplain flow. The methodology synthesizes the spatial distribution of index values into maps and curves, used to represent the vulnerability in the area of interest. Its application allows for considering different levels of complexity of flood scenarios, from localized flood defence failures to complex hazard scenarios involving river reaches. The components of the methodology are applied and tested in two floodplain areas in Northern Italy recently affected by floods. The results show that the methodology can provide an original and valuable insight into flood vulnerability variables and processes.

  8. A methodology for the data energy regional consumption consistency analysis

    International Nuclear Information System (INIS)

    The article introduces a methodology for consistency analysis of regional energy consumption data. The work builds on recent studies by several cited authors and addresses the Brazilian energy matrices and regional energy balances. The results are compared and analyzed

  9. Integrated Methodology for Software Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2012-01-01

    The techniques most used to ensure safety and reliability of systems are applied together as a whole, and in most cases the software components are overlooked or too little analyzed. The present paper describes the application of fault tree analysis to software systems, an analysis defined as Software Fault Tree Analysis (SFTA); the fault trees are evaluated using binary decision diagrams, all of this integrated and used with the help of a Java reliability library.
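
    The fault tree evaluation at the heart of SFTA reduces, for independent basic events, to simple AND/OR gate algebra. The event names and probabilities in the sketch below are hypothetical, and a real SFTA tool would work on binary decision diagrams rather than this direct expansion.

    ```python
    # Top-event probability from AND/OR gates over independent basic events.

    def gate_and(*probs):   # all inputs must fail
        p = 1.0
        for q in probs:
            p *= q
        return p

    def gate_or(*probs):    # any input failing suffices
        p_none = 1.0
        for q in probs:
            p_none *= (1.0 - q)
        return 1.0 - p_none

    p_input_check = 1e-4    # hypothetical basic events
    p_handler_bug = 5e-4
    p_watchdog    = 1e-2

    # Top event: input check fails AND (handler fails OR watchdog fails)
    print(gate_and(p_input_check, gate_or(p_handler_bug, p_watchdog)))
    ```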

  10. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    The author addresses the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information

  11. Constructive Analysis : A Study in Epistemological Methodology

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    … and develops a framework for a kind of analysis that is more in keeping with recent psychological research on categorization. Finally, it is shown that this kind of analysis can be applied to the concept of justification in a manner that furthers the epistemological goal of providing intellectual …

  12. [Free will and neurobiology: a methodological analysis].

    Science.gov (United States)

    Brücher, K; Gonther, U

    2006-04-01

    Whether or not the neurobiological basis of mental processes is compatible with the philosophical postulate of free will is a matter of committed debating in our days. What is the meaning of those frequently-quoted experiments concerning voluntary action? Both convictions, being autonomous subjects and exercising a strong influence on the world by applying sciences, have become most important for modern human self-conception. Now these two views are growing apart and appear contradictory because neurobiology tries to reveal the illusionary character of free will. In order to cope with this ostensible dichotomy it is recommended to return to the core of scientific thinking, i. e. to the reflection about truth and methods. The neurobiological standpoint referring to Libet as well as the philosophical approaches to free will must be analysed, considering pre-conceptions and context-conditions. Hence Libet's experiments can be criticised on different levels: methods, methodology and epistemology. Free will is a highly complex system, not a simple fact. Taking these very complicated details into account it is possible to define conditions of compatibility and to use the term free will still in a meaningful way, negotiating the obstacles called pure chance and determinism. PMID:16671159

  13. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  14. PROBABILISTIC METHODOLOGY OF LOW CYCLE FATIGUE ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    Jin Hui; Wang Jinnuo; Wang Libin

    2003-01-01

    The cyclic stress-strain response (CSSR), Neuber's rule (NR) and the cyclic strain-life relation (CSLR) are treated as probabilistic curves in the local stress-strain method of low cycle fatigue analysis. The randomness of loading and the theory of fatigue damage accumulation (TOFDA) are considered. The probabilistic analysis of local stress, local strain and fatigue life is constructed based on first-order Taylor series expansions. Through the proposed method, fatigue reliability analysis can be accomplished.
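
    The two relations named in the abstract are usually written as follows (standard forms; the paper's notation may differ): Neuber's rule linking the elastic nominal solution to the local stress-strain product, and the Coffin-Manson/Basquin strain-life law.

    ```latex
    % Neuber's rule and the strain-life relation (standard forms):
    \sigma\,\varepsilon = \frac{(K_t\, S)^2}{E},
    \qquad
    \frac{\Delta\varepsilon}{2}
      = \frac{\sigma_f'}{E}\,(2N_f)^{b} + \varepsilon_f'\,(2N_f)^{c}
    ```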

  15. SINGULAR SPECTRUM ANALYSIS: METHODOLOGY AND APPLICATION TO ECONOMICS DATA

    Institute of Scientific and Technical Information of China (English)

    Hossein HASSANI; Anatoly ZHIGLJAVSKY

    2009-01-01

    This paper describes the methodology of singular spectrum analysis (SSA) and demonstrates that it is a powerful method of time series analysis and forecasting, particularly for economic time series. The authors consider the application of SSA to the analysis and forecasting of the Iranian national accounts data as provided by the Central Bank of the Islamic Republic of Iran.
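
    Basic SSA proceeds by embedding the series in a trajectory matrix, decomposing it by SVD, and reconstructing selected components by diagonal averaging. The sketch below shows that pipeline under assumed choices of window length and component count; it is a generic illustration, not the authors' implementation.

    ```python
    # Basic SSA: embed -> SVD -> truncate -> diagonal averaging.
    import numpy as np

    def ssa_reconstruct(x, window, n_components):
        n = len(x)
        k = n - window + 1
        # Trajectory (Hankel) matrix: columns are lagged windows of the series.
        X = np.column_stack([x[i:i + window] for i in range(k)])
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X_hat = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
        # Diagonal averaging (Hankelization) back to a series.
        recon = np.zeros(n)
        counts = np.zeros(n)
        for j in range(k):
            recon[j:j + window] += X_hat[:, j]
            counts[j:j + window] += 1
        return recon / counts

    t = np.arange(120)
    series = np.sin(2 * np.pi * t / 12) + 0.3 * np.random.randn(120)
    trend_cycle = ssa_reconstruct(series, window=24, n_components=2)
    ```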

  16. Evaluation of the Field Test of Project Information Packages: Volume III--Resource Cost Analysis.

    Science.gov (United States)

    Al-Salam, Nabeel; And Others

    The third of three volumes evaluating the first year field test of the Project Information Packages (PIPs) provides a cost analysis study as a key element in the total evaluation. The resource approach to cost analysis is explained, and the specific resource methodology used in the main cost analysis of the 19 PIP field-test projects is detailed. The…

  17. Scenario aggregation and analysis via Mean-Shift Methodology

    International Nuclear Information System (INIS)

    A new generation of dynamic methodologies is being developed for nuclear reactor probabilistic risk assessment (PRA) which explicitly account for the time element in modeling the probabilistic system evolution and use numerical simulation tools to account for possible dependencies between failure events. The dynamic event tree (DET) approach is one of these methodologies. One challenge with dynamic PRA methodologies is the large amount of data they produce which may be difficult to analyze without appropriate software tools. The concept of 'data mining' is well known in the computer science community and several methodologies have been developed in order to extract useful information from a dataset with a large number of records. Using the dataset generated by the DET analysis of the reactor vessel auxiliary cooling system (RVACS) of an ABR-1000 for an aircraft crash recovery scenario and the Mean-Shift Methodology for data mining, it is shown how clusters of transients with common characteristics can be identified and classified. (authors)
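
    A minimal sketch of the clustering step is shown below using scikit-learn's MeanShift. The two-feature summary of each simulated transient is a hypothetical stand-in for the RVACS dataset described in the record, and features would normally be scaled before bandwidth estimation on real data.

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

# Hypothetical stand-in for DET output: each row summarizes one simulated
# transient (e.g., peak vessel temperature [K], time of peak [s]).
rng = np.random.default_rng(0)
transients = np.vstack([
    rng.normal([900.0, 1200.0], [15.0, 60.0], size=(40, 2)),   # "recovered" class
    rng.normal([1150.0, 700.0], [25.0, 50.0], size=(25, 2)),   # "damage" class
])

bw = estimate_bandwidth(transients, quantile=0.3)
ms = MeanShift(bandwidth=bw).fit(transients)

print("clusters found:", len(np.unique(ms.labels_)))
print("cluster centers:\n", ms.cluster_centers_)
```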

  18. A New Methodology of Spatial Crosscorrelation Analysis

    CERN Document Server

    Chen, Yanguang

    2015-01-01

    The idea of spatial crosscorrelation was conceived of long ago. However, unlike the related spatial autocorrelation, the theory and method of spatial crosscorrelation analysis have remained undeveloped. This paper presents a set of models and working methods for spatial crosscorrelation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form and by means of mathematical reasoning, I derive a theoretical framework for geographical crosscorrelation analysis. First, two sets of spatial crosscorrelation coefficients are defined, including a global spatial crosscorrelation coefficient and a set of local spatial crosscorrelation coefficients. Second, a pair of scatterplots of spatial crosscorrelation is proposed, and different scatterplots show different relationships between correlated variables. Based on the spatial crosscorrelation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial crosscorrelation) and indirect correlation (sp...
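
    A small sketch of a global spatial cross-correlation coefficient in the Moran-style quadratic form mentioned above is given below. The normalization and the toy contiguity matrix are assumptions for illustration and may differ from the paper's exact definitions.

```python
import numpy as np

def global_crosscorrelation(x, y, W):
    """Bivariate Moran-style global spatial cross-correlation:
    the quadratic form z_x' W z_y / n with row-standardized weights.
    The exact normalization here is an assumption for illustration."""
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    Wr = W / W.sum(axis=1, keepdims=True)      # row-standardized weights
    return zx @ Wr @ zy / len(x)

# toy example: four regions on a line, rook contiguity
W = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 2.9, 4.2, 4.8])
print(global_crosscorrelation(x, y, W))   # positive: neighbors co-vary
```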

  19. Discourse analysis: making complex methodology simple

    NARCIS (Netherlands)

    Bondarouk, Tatyana; Ruel, Huub; Leino, T.; Saarinen, T.; Klein, S.

    2004-01-01

    Discursive-based analysis of organizations is not new in the field of interpretive social studies. Only recently, however, have information systems (IS) studies shown a keen interest in discourse (Wynn et al, 2002). The IS field has grown significantly in its multiplicity that is echoed in the disco

  20. Climatic analysis methodology of vernacular architecture

    OpenAIRE

    Gil Crespo, Ignacio Javier; Barbero Barrera, María del Mar; Maldonado Ramos, Luis

    2015-01-01

    Vernacular architecture has demonstrated its perfect environmental adaptation through its empirical development and improvement by generations of user-builders. Nowadays, the sustainability of vernacular architecture is the aim of several research projects, in which the same method should be applied in order for the results to be comparable. Hence, we propose a research method comprising several steps. Through the analysis of geographical, lithological, economic, cultural and social influences as well as mater...

  1. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 2: Appendices

    Science.gov (United States)

    Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.

    1993-01-01

    This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios from 1.2 to 7.5. The data is from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0 using a 7.68 inch diameter injector. The data was taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data is presented, both characteristic velocity efficiencies and energy release efficiencies, for those tests of sufficient duration to record steady state values.
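
    As a sketch of how a dominant instability frequency can be extracted from such high-frequency pressure records, the snippet below locates the strongest spectral peak in a windowed power spectrum. The signal is synthetic and the sample rate and mode frequency are assumed, not taken from the test data; real input would come from the Kistler transducer records.

```python
import numpy as np

fs = 50_000                                    # sample rate in Hz (assumed)
t = np.arange(0, 0.2, 1 / fs)
# synthetic chamber-pressure perturbation: tone near a tangential mode + noise
rng = np.random.default_rng(1)
p = 0.2 * np.sin(2 * np.pi * 3_000 * t) + 0.05 * rng.standard_normal(t.size)

# Hann-windowed power spectrum; the peak bin marks the dominant mode
spec = np.abs(np.fft.rfft(p * np.hanning(p.size))) ** 2
freqs = np.fft.rfftfreq(p.size, 1 / fs)
print(f"dominant mode near {freqs[spec.argmax()]:.0f} Hz")
```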

  2. Conceptual evidence collection and analysis methodology for Android devices

    OpenAIRE

    Martini, Ben; Do, Quang; Choo, Kim-Kwang Raymond

    2015-01-01

    Android devices continue to grow in popularity and capability meaning the need for a forensically sound evidence collection methodology for these devices also increases. This chapter proposes a methodology for evidence collection and analysis for Android devices that is, as far as practical, device agnostic. Android devices may contain a significant amount of evidential data that could be essential to a forensic practitioner in their investigations. However, the retrieval of this data require...

  3. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels;

    2010-01-01

    directly for implementation into a computer aided reasoning tool for HAZOP studies to perform root cause and consequence analysis. Such a tool will facilitate finding causes far away from the site of the deviation. A Functional HAZOP Assistant is proposed and investigated in a HAZOP study of an industrial...... to nodes with simple functions such as liquid transport, gas transport, liquid storage, gas-liquid contacting etc. From the functions of the nodes the selection of relevant process variables and deviation variables follows directly. The knowledge required to perform the pre-meeting HAZOP task of...

  4. Risk analysis methodologies for the transportation of radioactive materials

    International Nuclear Information System (INIS)

    Different methodologies have evolved for consideration of each of the many steps required in performing a transportation risk analysis. Although there are techniques that attempt to consider the entire scope of the analysis in depth, most applications of risk assessment to the transportation of nuclear fuel cycle materials develop specific methodologies for only one or two parts of the analysis. The remaining steps are simplified for the analyst by narrowing the scope of the effort (such as evaluating risks for only one material, or a particular set of accident scenarios, or movement over a specific route); performing a qualitative rather than a quantitative analysis (probabilities may be simply ranked as high, medium or low, for instance); or assuming some generic, conservative conditions for potential release fractions and consequences. This paper presents a discussion of the history and present state-of-the-art of transportation risk analysis methodologies. Many reports in this area were reviewed as background for this presentation. The literature review, while not exhaustive, did result in a complete representation of the major methods used today in transportation risk analysis. These methodologies primarily include the use of severity categories based on historical accident data, the analysis of specifically assumed accident sequences for the transportation activity of interest, and the use of fault or event tree analysis. Although the focus of this work has generally been on potential impacts to public groups, some effort has been expended in the estimation of risks to occupational groups in transportation activities

  5. Finite Volume Methods: Foundation and Analysis

    Science.gov (United States)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the attainment of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
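
    A minimal example of the building blocks named above (a monotone numerical flux inside a conservative finite volume update) is sketched below for the 1D Burgers equation with a local Lax-Friedrichs flux. The grid, CFL number and initial data are arbitrary choices for illustration.

```python
import numpy as np

def fv_burgers_lf(u0, dx, t_end, cfl=0.45):
    """First-order finite volume scheme for u_t + (u^2/2)_x = 0 with the
    local Lax-Friedrichs monotone flux and periodic boundaries."""
    u = u0.copy()
    t = 0.0
    while t < t_end:
        a = np.max(np.abs(u)) + 1e-12                # max wave speed
        dt = min(cfl * dx / a, t_end - t)            # CFL-limited time step
        ul, ur = u, np.roll(u, -1)                   # states at interface i+1/2
        flux = 0.25 * (ul**2 + ur**2) - 0.5 * a * (ur - ul)   # LF flux
        u = u - dt / dx * (flux - np.roll(flux, 1))  # conservative update
        t += dt
    return u

x = np.linspace(0, 1, 200, endpoint=False)
u = fv_burgers_lf(np.sin(2 * np.pi * x) + 0.5, dx=1 / 200, t_end=0.3)
```

    Because the flux is monotone and the update conservative, the scheme satisfies a discrete maximum principle and captures the shock without spurious oscillation, at first-order accuracy.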

  6. New methodologies in stable isotope analysis

    International Nuclear Information System (INIS)

    In the 1970s, soil scientists stressed the need for a fast, easy to use 15N analyser to replace the isotope ratio mass spectrometer (IRMS) and Kjeldahl sample preparation. By 1984, three groups had succeeded in interfacing an elemental analyser to an IRMS. 'Continuous flow' Dumas combustion converted N in plant tissue or soil to a pulse of N2 gas, taken to the mass spectrometer by a helium carrier. Throughput increased from 20 to 100 analyses per day and only 5 μg N were required compared with 50 μg N for Kjeldahl-Rittenberg preparation and IRMS analysis. Since 1987, a software controlled automated nitrogen and carbon analyser-mass spectrometer (ANCA-MS) has been developed with which 15N and 13C can be measured to 0.0003 and 0.0002 at.% RSD respectively. Reducing hardware has made it portable, enabling it to be used in the field. Measurement of submicrogram quantities of nitrogen is possible using software control to move the oxygen pulse, with its N2 'blank', out of phase with the sample. Software also allows operation at twice normal speed, enabling plant breeders to screen genotypes for N fixing ability within the flowering period. 35 refs, 6 figs, 7 tabs

  7. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame, such as mega-scale fuel-cell-based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term, such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U.S. Department of Energy in identifying the research areas and technologies that warrant further support.

  8. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    Fusion systems, which are under development as future energy systems, have reached a stage at which break-even is expected to be realized in the near future. It is desirable to demonstrate that fusion systems are readily acceptable to society. There are three crucial viewpoints from which to measure this acceptability: technological feasibility, economy and safety. These three points are closely interrelated. The safety problem has become more important since the three large-scale tokamaks, JET, TFTR and JT-60, started experiments, and tritium will be introduced into some of them as the fusion fuel. It is desirable to establish a methodology to resolve the safety-related issues in harmony with the technological evolution. The fusion system most promising for reactors is not yet settled. The objective of this study is to develop an adequate methodology which promotes the safety design of general fusion systems and to present a basis for proposing R and D themes and establishing the database. A framework of the methodology, the understanding and modeling of fusion systems, the principle of ensuring safety, the function-based safety analysis and the application of the methodology are discussed. As the result of this study, the methodology for the safety analysis and evaluation of fusion systems was developed. New ideas and approaches were presented in the course of the methodology development. (Kako, I.)

  9. Simplified methodology for analysis of Angra-1 containment

    International Nuclear Information System (INIS)

    A simplified analysis methodology was developed to simulate a large-break loss-of-coolant accident at the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained by this new methodology, together with its short computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  10. Incorporation of advanced accident analysis methodology into safety analysis reports

    International Nuclear Information System (INIS)

    as structural analysis codes and computational fluid dynamics codes (CFD) are applied. The initial code development took place in the sixties and seventies and resulted in a set of quite conservative codes for the reactor dynamics, thermal-hydraulics and containment analysis. The most important limitations of these codes came from insufficient knowledge of the physical phenomena and from the limited computer memory and speed. Very significant advances have been made in the development of the code systems during the last twenty years in all of the above areas. If the data for the physical models of the code are sufficiently well established and allow quite a realistic analysis, these newer versions are called advanced codes. The assumptions used in the deterministic safety analysis vary from very pessimistic to realistic assumptions. In the accident analysis terminology, it is customary to call the pessimistic assumptions 'conservative' and the realistic assumptions 'best estimate'. The assumptions can refer to the selection of physical models, the introduction of these models into the code, and the initial and boundary conditions including the performance and failures of the equipment and human action. The advanced methodology in the present report means application of advanced codes (or best estimate codes), which sometimes represent a combination of various advanced codes for separate stages of the analysis, and in some cases in combination with experiments. The Safety Analysis Reports are required to be available before and during the operation of the plant in most countries. The contents, scope and stages of the SAR vary among the countries. The guide applied in the USA, i.e. Regulatory Guide 1.70, is representative of the way in which SARs are prepared in many countries. During the design phase, a preliminary safety analysis report (PSAR) is requested in many countries and the final safety analysis report (FSAR) is required for the operating licence. There is

  11. Electric utility value analysis methodology for wind energy conversion systems

    Science.gov (United States)

    Bush, L. R.; Cretcher, C. K.; Davey, T. H.

    1981-09-01

    The methodology summarized in this report was developed in support of a study of the value of augmenting conventional electric energy generation with wind energy conversion systems (WECS). A major objective of the value analysis study is the creation of an analytical methodology to assess WECS installed in a utility's generation system. The pertinent measures of value include both the displacement of conventional fuels, measured in volumetric and economic terms, and the potential for capacity credit, realized either through the sale of capacity or the deferral of new installations. Recognizing the strong site dependence of wind energy conversion, a second major objective of the value analysis study is the application of the methodology to several candidate utilities. These results will illustrate the practicability of the methodology and identify any deficiencies, allow a comparison of relative WECS value among the sites studied, and develop background information pertinent to possible selection decisions for WECS siting and level of penetration. The specific purpose of this report is to outline the methodology developed in support of the value analysis study.

  12. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    A synthesized methodology of safety analysis and evaluation for general fusion systems is proposed. In the course of the methodology development, its main frame has been constructed in order to take account of all safety-related items and to ensure logical consistency. The safety-related items are divided broadly into two groups. One of them is the protection of the public from radiological hazard, which is introduced as a safety requirement from a viewpoint external to the fusion system. The other items are matters from an internal viewpoint and are related to the behavior of the fusion system itself. These items comprise the understanding of a fusion system, the safety ensuring principle and the function-based safety analysis. All of these items have been mapped onto the frame consistently, considering the mutual relations among them. To complete the methodology development, the safety evaluation for the actual design of a fusion system has been performed in conformity with this methodology. Thus, it has been demonstrated that the methodology proposed here is appropriate for the safety analysis and evaluation of fusion systems. (author). 9 refs, 4 figs, 2 tabs

  13. Taipower's transient analysis methodology for pressurized water reactors

    International Nuclear Information System (INIS)

    The methodology presented in this paper is a part of the 'Taipower's Reload Design and Transient Analysis Methodologies for Light Water Reactors' developed by the Taiwan Power Company (TPC) and the Institute of Nuclear Energy Research. This methodology utilizes four computer codes developed or sponsored by the Electric Power Research Institute: the system transient analysis code RETRAN-02, the core thermal-hydraulic analysis code COBRAIIIC, the three-dimensional spatial kinetics code ARROTTA, and the fuel rod evaluation code FREY. Each of the computer codes was extensively validated. Analysis methods and modeling techniques were conservatively established for each application using a systematic evaluation with the assistance of sensitivity studies. The qualification results and analysis methods were documented in detail in TPC topical reports. The topical reports for COBRAIIIC, ARROTTA, and FREY have been reviewed and approved by the Atomic Energy Council (AEC). TPC's in-house transient methodology has been successfully applied to provide valuable support for many operational issues and plant improvements for TPC's Maanshan Units 1 and 2. Major applications include the removal of the resistance temperature detector bypass system, the relaxation of the hot-full-power moderator temperature coefficient design criteria imposed by the ROCAEC due to a concern about Anticipated Transient Without Scram, the reduction of boron injection tank concentration and the elimination of the heat tracing, and the reduction of reactor coolant system flow. (author)

  14. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    A critical review of 34 published severe radiological accidents in industrial gamma radiography, which happened in 15 countries from 1960 to 1988, was performed. The most frequent causes, consequences and dose estimation methods were analysed, aiming to establish better procedures for radiation safety and accident analysis. The objective of this work is to elaborate a methodology for analysing radiological accidents in industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of the event and to estimate the dose with good certainty. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programs, was adopted in the elaboration of the suggested methodology. The viability of using the Electron Gamma Shower 4 Computer Code System to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in ¹⁹²Ir radioactive source handling situations, was also studied. (author)
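
    As an illustration of the simplest dose-reconstruction step such a methodology would include (before any Monte Carlo treatment such as EGS4), the sketch below applies the unshielded point-source inverse-square relation. The ¹⁹²Ir gamma-ray constant used is an approximate literature value and the scenario numbers are hypothetical.

```python
# Approximate gamma-ray constant for Ir-192 in mSv*m^2 / (h*GBq);
# treat this value as indicative only.
GAMMA_IR192 = 0.13

def dose_mSv(activity_GBq, distance_m, time_h):
    """Unshielded point-source dose estimate via the inverse-square law."""
    return GAMMA_IR192 * activity_GBq * time_h / distance_m ** 2

# hypothetical scenario: handling a 1.5 TBq source at 30 cm for 2 minutes
print(f"{dose_mSv(1500.0, 0.30, 2 / 60):.1f} mSv")
```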

  15. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. Firstly, a conceptual framework is built, which is used to analyze the causal relationships between the organizational factors and human reliability. Then, the inference model for Human Reliability Analysis is built by combining the conceptual framework with Bayesian networks, which is used to execute the causal inference and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the specific application of the proposed methodology. The results show that the proposed methodology of combining the conceptual model with Bayesian networks can not only easily model the causal relationship between organizational factors and human reliability, but also, in a given context, quantitatively measure human operational reliability and identify or prioritize the most likely root causes of human error. (authors)
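
    To make the causal and diagnostic inference concrete, the toy fragment below links two hypothetical organizational factors to an error probability and performs both the marginal (causal) and the posterior (diagnostic) calculation by direct enumeration. The structure and all conditional-probability numbers are invented for illustration, not taken from the paper.

```python
# Toy Bayesian-network fragment: two organizational factors -> human error.
p_training_poor = 0.2          # P(T = poor training)
p_safety_culture_weak = 0.3    # P(S = weak safety culture)

# CPT: P(error | T, S); all numbers invented for illustration
p_error = {(True, True): 0.12, (True, False): 0.05,
           (False, True): 0.03, (False, False): 0.008}

def marginal_error():
    """Causal inference: marginal P(error) by enumerating parent states."""
    total = 0.0
    for t in (True, False):
        for s in (True, False):
            pt = p_training_poor if t else 1 - p_training_poor
            ps = p_safety_culture_weak if s else 1 - p_safety_culture_weak
            total += pt * ps * p_error[(t, s)]
    return total

def posterior_training_given_error():
    """Diagnostic inference: P(T = poor | error) via Bayes' rule."""
    joint = p_training_poor * sum(
        (p_safety_culture_weak if s else 1 - p_safety_culture_weak)
        * p_error[(True, s)] for s in (True, False))
    return joint / marginal_error()

print(f"P(error) = {marginal_error():.4f}")                       # ~0.026
print(f"P(poor training | error) = {posterior_training_given_error():.3f}")
```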

  16. Disposal criticality analysis methodology's principal isotope burnup credit

    International Nuclear Information System (INIS)

    This paper presents the burnup credit aspects of the United States Department of Energy Yucca Mountain Project's methodology for performing criticality analyses for commercial light-water-reactor fuel. The disposal burnup credit methodology uses a 'principal isotope' model, which takes credit for the reduced reactivity associated with the build-up of the primary principal actinides and fission products in irradiated fuel. Burnup credit is important to the disposal criticality analysis methodology and to the design of commercial fuel waste packages. The burnup credit methodology developed for disposal of irradiated commercial nuclear fuel can also be applied to storage and transportation of irradiated commercial nuclear fuel. For all applications a series of loading curves are developed using a best estimate methodology and depending on the application, an additional administrative safety margin may be applied. The burnup credit methodology better represents the 'true' reactivity of the irradiated fuel configuration, and hence the real safety margin, than do evaluations using the 'fresh fuel' assumption. (author)
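
    A minimal sketch of how a burnup-credit loading curve is applied in practice is given below: an assembly is acceptable if its burnup lies above the curve at its initial enrichment, optionally with an administrative margin added. The curve points are invented for illustration and do not represent any licensed loading curve.

```python
import numpy as np

# Hypothetical loading curve: minimum assembly-average burnup (GWd/MTU)
# required at each initial enrichment (wt% U-235). Invented numbers.
enrichment_pts = np.array([2.0, 3.0, 4.0, 5.0])
min_burnup_pts = np.array([0.0, 12.0, 25.0, 40.0])

def assembly_acceptable(enrichment, burnup, margin=0.0):
    """Check an assembly against the loading curve; `margin` models an
    additional administrative safety margin on required burnup."""
    required = np.interp(enrichment, enrichment_pts, min_burnup_pts) + margin
    return burnup >= required

print(assembly_acceptable(4.2, 30.0))              # True for this toy curve
print(assembly_acceptable(4.2, 30.0, margin=5.0))  # False once margin applied
```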

  17. Financial constraints in capacity planning: a national utility regulatory model (NUREG). Volume I of III: methodology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1981-10-29

    This report develops and demonstrates the methodology for the National Utility Regulatory (NUREG) Model developed under contract number DEAC-01-79EI-10579. It is accompanied by two supporting volumes. Volume II is a user's guide for operation of the NUREG software. This includes description of the flow of software and data, as well as the formats of all user data files. Finally, Volume III is a software description guide. It briefly describes, and gives a listing of, each program used in NUREG.

  18. Methodology for reactor core physics analysis - part 2

    International Nuclear Information System (INIS)

    The computer codes used for reactor core physics analysis are described. The modifications introduced in the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameter involved in qualifying the methodology is included. (F.E.). 5 ref, 7 figs, 5 tabs

  19. Experiences in Uranium Abundance Analysis Using the MTE Methodology

    OpenAIRE

    BLACK CLAUDIE KATE; Zuleger, Evelyn; HORTA DOMENECH Joan; VARGAS ZUNIGA Martin

    2012-01-01

    The ITU in Karlsruhe routinely analyses a multitude of samples from a wide range of internal and external customers for safeguards. High-throughput analysis techniques are employed with meticulous care to ensure that accurate, precise and timely results are provided. Thermal ionisation mass spectrometry (TIMS) is used for isotopic analysis of uranium to determine abundance and concentration information. At ITU we employ the modified total evaporation (MTE) methodology[1] for minor isotope an...

  20. Behavior analysis and training-a methodology for behavior engineering.

    Science.gov (United States)

    Colombetti, M; Dorigo, M; Borghi, G

    1996-01-01

    We propose Behavior Engineering as a new technological area whose aim is to provide methodologies and tools for developing autonomous robots. Building robots is a very complex engineering enterprise that requires the exact definition and scheduling of the activities which a designer, or a team of designers, should follow. Behavior Engineering is, within the autonomous robotics realm, the equivalent of more established disciplines like Software Engineering and Knowledge Engineering. In this article we first give a detailed presentation of a Behavior Engineering methodology, which we call Behavior Analysis and Training (BAT), where we stress the role of learning and training. Then we illustrate the application of the BAT methodology to three cases involving different robots: two mobile robots and a manipulator. Results show the feasibility of the proposed approach. PMID:18263040

  1. Intelligent signal analysis methodologies for nuclear detection, identification and attribution

    Science.gov (United States)

    Alamaniotis, Miltiadis

    Detection and identification of special nuclear materials can be fully performed with a radiation detector-spectrometer. Due to several physical and computational limitations, the development of fast and accurate radioisotope identifier (RIID) algorithms is essential for automated radioactive source detection and characterization. The challenge is to identify individual isotope signatures embedded in an aggregation of spectral signatures. In addition, background and isotope spectra overlap, further complicating the signal analysis. These concerns are addressed in this thesis through a set of intelligent methodologies recognizing signature spectra and the background spectrum and, subsequently, identifying radionuclides. Initially, detection and extraction of signature patterns is accomplished by means of fuzzy logic. The fuzzy logic methodology is applied to three types of radiation signal processing applications, where it exhibits high positive detection, a low false alarm rate and very short execution time, while outperforming the maximum likelihood fitting approach. In addition, an innovative Pareto-optimal multiobjective fitting of gamma ray spectra using evolutionary computing is presented. The methodology exhibits perfect identification while performing better than single-objective fitting. Lastly, an innovative kernel-based machine learning methodology was developed for estimating the natural background spectrum in gamma ray spectra. The novelty of the methodology lies in the fact that it implements a data-based approach and does not require any explicit physics modeling. Results show that the kernel-based method adequately estimates the gamma background, but the algorithm's performance exhibits a strong dependence on the selected kernel.

  2. Current status of methodologies for seismic probabilistic safety analysis

    International Nuclear Information System (INIS)

    This report is a review of the methodology for conducting a seismic-probabilistic safety analysis (PSA) at a nuclear power station. The objective of this review is to provide an up-to-date review of the state-of-the-art of the various sub-methodologies that comprise the overall seismic-PSA methodology for addressing the safety of nuclear power stations, plus an overview of the whole methodological picture. In preparing this review, the author has had in mind several categories of readers and users: policy-level decision-makers (such as managers of nuclear power stations and regulators of nuclear safety), seismic-PSA practitioners, and PSA practitioners more broadly. The review concentrates on evaluating the extent to which today's seismic-PSA methodology produces reliable and useful results and insights, at its current state-of-the-art level, for assessing nuclear-power-station safety. Also, this review paper deals exclusively with seismic-PSA for addressing nuclear-power-station safety. Because the author is based in the U.S., it is natural that this review will contain more emphasis on U.S. experience than on experience in other countries. However, significant experience elsewhere is a major part of the basis for this evaluation

  3. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    Science.gov (United States)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  4. Screening Analysis: Volume 1, Description and Conclusions.

    Energy Technology Data Exchange (ETDEWEB)

    Bonneville Power Administration; Corps of Engineers; Bureau of Reclamation

    1992-08-01

    The SOR consists of three analytical phases leading to a Draft EIS. The first phase, Pilot Analysis, was performed for the purpose of testing the decision analysis methodology being used in the SOR. The Pilot Analysis is described later in this chapter. The second phase, Screening Analysis, examines all possible operating alternatives using a simplified analytical approach. It is described in detail in this and the next chapter. This document also presents the results of screening. The final phase, Full-Scale Analysis, will be documented in the Draft EIS and is intended to evaluate comprehensively the few best alternatives arising from the screening analysis. The purpose of screening is to analyze a wide variety of differing ways of operating the Columbia River system to test the reaction of the system to change. The many alternatives considered reflect the range of needs and requirements of the various river users and interests in the Columbia River Basin. While some of the alternatives might be viewed as extreme, the information gained from the analysis is useful in highlighting issues and conflicts in meeting operating objectives. Screening is also intended to develop a broad technical basis for evaluation, including regional experts, and to begin developing an evaluation capability for each river use that will support full-scale analysis. Finally, screening provides a logical method for examining all possible options and reaching a decision on a few alternatives worthy of full-scale analysis. An organizational structure was developed and staffed to manage and execute the SOR, specifically during the screening phase and the upcoming full-scale analysis phase. The organization involves ten technical work groups, each representing a particular river use. Several other groups exist to oversee or support the efforts of the work groups.

  5. Development of Audit Calculation Methodology for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joosuk; Kim, Gwanyoung; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The interim criteria contain more stringent limits than previous ones. For example, pellet-to-cladding mechanical interaction (PCMI) was introduced as a new failure criterion. Both short-term (e.g., fuel-to-coolant interaction, rod burst) and long-term (e.g., fuel rod ballooning, flow blockage) phenomena should be addressed for core coolability assurance. For dose calculations, transient-induced fission gas release has to be additionally accounted for. Traditionally, the approved RIA analysis methodologies for licensing applications have been developed based on a conservative approach. But the newly introduced safety criteria tend to reduce the margins to the criteria, so licensees are trying to improve the margins by utilizing a less conservative approach. To cope with this trend, a new audit calculation methodology needs to be developed. In this paper, the new methodology, which is currently under development in KINS, is introduced. For the development of an audit calculation methodology for RIA safety analysis based on the realistic evaluation approach, a preliminary calculation utilizing a best-estimate code has been done on the initial core of APR1400. The main conclusions follow. - With the assumption of a single full-strength control rod ejection at HZP conditions, rod failure due to PCMI is not predicted. - Coolability can be assured in terms of enthalpy and fuel melting. - However, rod failure due to DNBR is expected, and there is a possibility of fuel failure at rated power conditions as well.

  6. Comparative analysis of EPA cost-benefit methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Poch, L.; Gillette, J.; Veil, J.

    1998-05-01

    In recent years, reforming the regulatory process has received much attention from diverse groups such as environmentalists, the government, and industry. A cost-benefit analysis can be a useful way to organize and compare the favorable and unfavorable impacts a proposed action might have on society. Since 1981, two Executive Orders have required the U.S. Environmental Protection Agency (EPA) and other regulatory agencies to perform cost-benefit analyses in support of regulatory decision making. At the EPA, a cost-benefit analysis is published as a document called a regulatory impact analysis (RIA). This report reviews cost-benefit methodologies used by three EPA program offices: Office of Air and Radiation, Office of Solid Waste, and Office of Water. These offices were chosen because they promulgate regulations that affect the policies of this study's sponsor (U.S. Department of Energy, Office of Fossil Energy) and the technologies it uses. The study was conducted by reviewing 11 RIAs recently published by the three offices and by interviewing staff members in the offices. To draw conclusions about the EPA cost-benefit methodologies, their components were compared with those of a standard methodology (i.e., those that should be included in a comprehensive cost-benefit methodology). This study focused on the consistency of the approaches as well as their strengths and weaknesses, since differences in the cost-benefit methodologies themselves or in their application can cause confusion and preclude consistent comparison of regulations both within and among program offices.

  7. Estimates of emergency operating capacity in US manufacturing and nonmanufacturing industries - Volume 1: Concepts and Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, D.B. (Pacific Northwest Lab., Richland, WA (USA)); Serot, D.E. (D/E/S Research, Richland, WA (USA)); Kellogg, M.A. (ERCE, Inc., Portland, OR (USA))

    1991-03-01

    Development of integrated mobilization preparedness policies requires planning estimates of available productive capacity during national emergency conditions. Such estimates must be developed in a manner to allow evaluation of current trends in capacity and the consideration of uncertainties in various data inputs and in engineering assumptions. This study developed estimates of emergency operating capacity (EOC) for 446 manufacturing industries at the 4-digit Standard Industrial Classification (SIC) level of aggregation and for 24 key nonmanufacturing sectors. This volume lays out the general concepts and methods used to develop the emergency operating estimates. The historical analysis of capacity extends from 1974 through 1986. Some nonmanufacturing industries are included. In addition to mining and utilities, key industries in transportation, communication, and services were analyzed. Physical capacity and efficiency of production were measured. 3 refs., 2 figs., 12 tabs. (JF)

  8. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    Three issues are critical to the public acceptability of nuclear fusion as an energy system. These are technological feasibility, economic viability and safety. Safety will be especially important when tritium is used as a fuel and the reactor becomes radioactive. As a result of this study, a safety analysis and evaluation methodology for fusion systems was developed, in which all the safety-related issues of the fusion system could be integrated and resolved. A general descriptive model, the three principal items to be assured, an approach to safety assurance based on event categorization and the function-based safety analysis are all discussed. The usefulness of the methodology was illustrated by the application of the safety evaluation to the R-Tokamak. (author)

  9. Algebraic parameters identification of DC motors: methodology and analysis

    Science.gov (United States)

    Becedas, J.; Mamani, G.; Feliu, V.

    2010-10-01

    A fast, non-asymptotic, algebraic parameter identification method is applied to an uncertain DC motor to estimate the uncertain parameters: the viscous friction coefficient and the inertia. In this work, the methodology is developed and its convergence analysed; a comparative study between the traditional recursive least squares method and the algebraic identification method is carried out; and an analysis of the estimator in a noisy system is presented. Computer simulations were carried out to validate the suitability of the identification algorithm.
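
    For context on the comparison mentioned above, the sketch below implements the classical recursive least squares baseline on an assumed first-order DC-motor model. The plant parameters, the noise level and the availability of a derivative measurement are simplifying assumptions for illustration, not the paper's setup.

```python
import numpy as np

# Assumed motor model: J*dw/dt + b*w = k*u, i.e. dw/dt = theta1*w + theta2*u
# with theta1 = -b/J and theta2 = k/J; RLS estimates theta from noisy data.
J_true, b_true, k_true = 0.01, 0.1, 0.05
dt, n = 1e-3, 2000
rng = np.random.default_rng(1)

w = 0.0
theta = np.zeros(2)            # estimates of [-b/J, k/J]
P = np.eye(2) * 1e3            # estimate covariance
for _ in range(n):
    u = rng.uniform(-1, 1)                       # persistently exciting input
    dw = (-b_true * w + k_true * u) / J_true     # true plant derivative
    phi = np.array([w, u])                       # regressor vector
    y = dw + rng.normal(0, 0.1)                  # noisy derivative "measurement"
    K = P @ phi / (1.0 + phi @ P @ phi)          # RLS gain
    theta += K * (y - phi @ theta)               # parameter update
    P -= np.outer(K, phi @ P)                    # covariance update
    w += dw * dt                                 # advance the plant

print("estimated [-b/J, k/J]:", theta,
      " true:", [-b_true / J_true, k_true / J_true])
```

    In practice the derivative would itself be estimated by filtering, which is one source of the slow convergence that algebraic methods aim to avoid.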

  10. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    OpenAIRE

    Andi Andi

    2004-01-01

    Many cost overruns in the world of construction are attributable to either unforeseen events or foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement to project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for appropriate allocation of contingency in project cost estimation. In the first step, project risks...

  11. Methodology, the matching law, and applied behavior analysis

    OpenAIRE

    Vyse, Stuart A.

    1986-01-01

    The practical value of the quantitative analysis of behavior is limited by two methodological characteristics of this area of research: the use of (a) steady-state strategies and (b) relative vs. absolute response rates. Applied behavior analysts are concerned with both transition-state and steady-state behavior, and applied interventions are typically evaluated by their effects on absolute response rates. Quantitative analyses of behavior will have greater practical value when methods are de...

  12. Risk Analysis in Construction Projects: A Practical Selection Methodology

    OpenAIRE

    Alberto De Marco; Muhammad Jamaluddin Thaheem

    2013-01-01

    Project Risk Management (PRM) is gaining attention from researchers and practitioners in the form of sophisticated tools and techniques to help construction managers perform risk management. However, the large variety of techniques has made selecting an appropriate solution a complex and risky task in itself. Accordingly, this study proposes a practical framework methodology to assist construction project managers and practitioners in choosing a suitable risk analysis technique based on selec...

  13. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    The human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that come together for an error to occur in human tasks performed under normal operating conditions and in those performed after an abnormal event. Additionally, the analysis of various accidents in history has found the human component to be a contributing factor in their causes. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the development of the first generation of HRA methodologies. Subsequently, methods were developed to include additional performance shaping factors in their models, as well as the interactions between them. By the mid-1990s, what are considered the second-generation methodologies had appeared. Among these is the methodology A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of additional deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required the independent evaluation of the two related human failure events. The gathering of the new human error probabilities thus involves the quantification of the nominal scenario and of the cases of significant deviations, considered for their potential impact on the analysed human failure events. As in probabilistic safety analysis, the analysis of the sequences identified the more specific factors with the highest contribution to the human error probabilities. (Author)

  14. An Efficient Analysis Methodology for Fluted-Core Composite Structures

    Science.gov (United States)

    Oremont, Leonard; Schultz, Marc R.

    2012-01-01

    The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes, and uses shell elements to capture behavior that would normally require solid elements to capture the detailed mechanical response of the structure. The shell thicknesses and offsets using this analysis technique are parameterized, and the parameters are adjusted through a heuristic procedure until this model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though only discussed in the context of fluted-core composites, is widely applicable to other concepts.

  15. Snapshot analysis for rhodium fixed incore detector using BEACON methodology

    International Nuclear Information System (INIS)

    The purpose of this report is to process the rhodium detector data of the Yonggwang nuclear unit 4 cycle 5 core into the measured power distribution by using the BEACON methodology. Rhodium snapshots of YGN 4 cycle 5, obtained in large numbers during normal plant operation, have been analyzed by both BEACON/SPINOVA and CECOR to compare the results of the two codes. Reviewing the results of this analysis, BEACON/SPINOVA can be used for the snapshot analysis of Korean Standard Nuclear Power (KSNP) plants

  16. PANSYSTEMS ANALYSIS: MATHEMATICS, METHODOLOGY, RELATIVITY AND DIALECTICAL THINKING

    Institute of Scientific and Technical Information of China (English)

    郭定和; 吴学谋; 冯向军; 李永礼

    2001-01-01

    Based on new analysis modes and new definitions with relative mathematization and simplification or strengthening forms for concepts of generalized systems, panderivatives, pansymmetry, panbox principle, pansystems relativity, etc., the framework and related principles of pansystems methodology and pansystems relativity are developed. Related contents include: pansystems with relatively universal mathematizing forms, 200 types of dualities, duality transformation, pansymmetry transformation, pansystems dialectics, the 8-domain method, pansystems mathematical methods, generalized quantification, the principles of approximation-transforming, pan-equivalence theorems, supply-demand analysis, thinking experiment, generalized gray systems, etc.

  17. Towards a Methodology for Identifying Program Constraints During Requirements Analysis

    Science.gov (United States)

    Romo, Lilly; Gates, Ann Q.; Della-Piana, Connie Kubo

    1997-01-01

    Requirements analysis is the activity that involves determining the needs of the customer, identifying the services that the software system should provide and understanding the constraints on the solution. The result of this activity is a natural language document, typically referred to as the requirements definition document. The problems that exist in defining requirements in large-scale software projects include synthesizing knowledge from various domain experts and communicating this information across multiple levels of personnel. One approach that addresses part of this problem is called context monitoring and involves identifying the properties of and relationships between objects that the system will manipulate. This paper examines several software development methodologies, discusses the support that each provides for eliciting such information from experts and specifying the information, and suggests refinements to these methodologies.

  18. SURVEY ON BIG DATA ANALYSIS & PROCESSING WITH DATA MINING METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Bharati Punjabi

    2015-03-01

    Data Mining is an analytic process designed to explore data in search of consistent patterns and/or systematic relationships between variables, and then to validate the findings by applying the detected patterns to new subsets of data. Big Data concerns large-volume, complex, growing data sets with multiple, autonomous sources. With the fast development of networking, data storage, and data collection capacity, Big Data is now rapidly expanding in all science and engineering domains, including the physical, biological and biomedical sciences. Big Data is a term used to identify datasets that, due to their large size and complexity, cannot be managed with current methodologies or data mining software tools. Big Data mining is the capability of extracting useful information from these large datasets or streams of data, which, due to their volume, variability, and velocity, was not possible before. The Big Data challenge is becoming one of the most exciting opportunities for the years ahead.

  19. Capillary Electrophoresis-based Methodology Development for Biomolecule Analysis

    OpenAIRE

    Li, Ni

    2011-01-01

    Capillary electrophoresis (CE) is a separation tool with wide applications in biomolecule analysis. Fast and high-resolution separation requiring minute sample volumes is advantageous to study multiple components in biological samples. Flexible modes and methods can be developed. In this thesis, I focus on developing and applying novel CE methods to study multi-target nucleic acid sensing with high sensitivity (Part I) and interactions between multiple components, i.e. proteins, nanoparticles...

  20. Safety analysis methodologies for radioactive waste repositories in shallow ground

    International Nuclear Information System (INIS)

    The report is part of the IAEA Safety Series and is addressed to authorities and specialists responsible for or involved in planning, performing and/or reviewing safety assessments of shallow ground radioactive waste repositories. It discusses approaches that are applicable for the safety analysis of a shallow ground repository. The methodologies, analysis techniques and models described are pertinent to the task of predicting the long-term performance of a shallow ground disposal system. They may be used during the processes of selection, confirmation and licensing of new sites and disposal systems or to evaluate the long-term consequences in the post-sealing phase of existing operating or inactive sites. The analysis may point out the need for remedial action, or provide information to be used in deciding on the duration of surveillance. Safety analyses, both general in nature and specific to a certain repository, site or design concept, are discussed, with emphasis on deterministic and probabilistic studies.

  1. Computational stress analysis using finite volume methods

    OpenAIRE

    Fallah, Nosrat Allah

    2000-01-01

    There is a growing interest in applying finite volume methods to model solid mechanics problems and multi-physics phenomena. During the last ten years an increasing amount of activity has taken place in this area. Unlike the finite element formulation, which generally involves volume integrals, the finite volume formulation transfers volume integrals to surface integrals using the divergence theorem. This transformation, for the convection and diffusion terms in the governing equations, ensures...

  2. SYNTHESIS OF SAFETY ANALYSIS AND FIRE HAZARD ANALYSIS METHODOLOGIES

    Energy Technology Data Exchange (ETDEWEB)

    Coutts, D

    2007-04-17

    Successful implementation of both the nuclear safety program and fire protection program is best accomplished using a coordinated process that relies on sound technical approaches. When systematically prepared, the documented safety analysis (DSA) and fire hazard analysis (FHA) can present a consistent technical basis that streamlines implementation. If not coordinated, the DSA and FHA can present inconsistent conclusions, which can create unnecessary confusion and can promulgate a negative safety perception. This paper will compare the scope, purpose, and analysis techniques for DSAs and FHAs. It will also consolidate several lessons-learned papers on this topic, which were prepared in the 1990s.

  3. METHODOLOGY FOR ANALYSIS OF DECISION MAKING IN AIR NAVIGATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2011-03-01

    In the research of the Air Navigation System as a complex socio-technical system, a methodology for analysing the human operator's decision-making has been developed. The significance of individual psychological factors, as well as the impact of socio-psychological factors on the professional activities of a human operator during the development of a flight situation from normal to catastrophic, were analyzed. On the basis of the reflexive theory of bipolar choice, the expected risks of decision-making by the Air Navigation System's operator, as influenced by the external environment, previous experience and intentions, were identified. Methods for the analysis of decision-making by the human operator of the Air Navigation System using stochastic networks have been developed. Keywords: Air Navigation System, bipolar choice, human operator, decision-making, expected risk, individual psychological factors, methodology of analysis, reflexive model, socio-psychological factors, stochastic network.

  4. A Posteriori Analysis for Hydrodynamic Simulations Using Adjoint Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Woodward, C S; Estep, D; Sandelin, J; Wang, H

    2009-02-26

    This report contains results of analysis done during an FY08 feasibility study investigating the use of adjoint methodologies for a posteriori error estimation for hydrodynamics simulations. We developed an approach to adjoint analysis for these systems through use of modified equations and viscosity solutions. Targeting first the 1D Burgers equation, we include a verification of the adjoint operator for the modified equation for the Lax-Friedrichs scheme, then derivations of an a posteriori error analysis for a finite difference scheme and a discontinuous Galerkin scheme applied to this problem. We include some numerical results showing the use of the error estimate. Lastly, we develop a computable a posteriori error estimate for the MAC scheme applied to stationary Navier-Stokes.
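
    For readers unfamiliar with the forward problem, the sketch below (illustrative only, not the report's code) advances the 1D inviscid Burgers equation with the Lax-Friedrichs scheme named in the abstract; its built-in numerical viscosity is exactly what the modified-equation analysis characterises.

      # Lax-Friedrichs scheme for u_t + (u^2/2)_x = 0 on a periodic grid.
      import numpy as np

      nx, nt = 200, 100
      dx = 2.0 / nx
      dt = 0.4 * dx                     # CFL-limited step (|u| <= 1 here)
      x = np.linspace(-1.0, 1.0, nx, endpoint=False)
      u = np.where(x < 0.0, 1.0, 0.0)   # Riemann data -> right-moving shock

      def flux(u):
          return 0.5 * u * u

      for _ in range(nt):
          up = np.roll(u, -1)           # u[i+1], periodic wrap
          um = np.roll(u, 1)            # u[i-1]
          # Update: neighbour average plus centred flux difference
          u = 0.5 * (up + um) - 0.5 * dt / dx * (flux(up) - flux(um))

      print(u.min(), u.max())   # shock front smeared by numerical viscosity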

  5. Analysis of gaming community using Soft System Methodology

    OpenAIRE

    Hurych, Jan

    2015-01-01

    This diploma thesis aims to analyse a virtual gaming community and its problems, in the case of the community belonging to the EU server of the game World of Tanks. To solve these problems, the Soft System Methodology by P. Checkland is used. The thesis includes an analysis of the significance of gaming communities for the gaming industry as a whole. The gaming community is then defined as a soft system. Three problems are analysed in the practical part of the thesis using the newer version of SSM. One iteration of...

  6. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    Science.gov (United States)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements which guide and govern applications are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  7. Applications of integrated safety analysis methodology to reload safety evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Chan Su; Um, Kil Sup [Korea Nuclear Fuel, Daejeon (Korea, Republic of)

    2011-03-15

    Korea Nuclear Fuel is developing the X-GEN fuel, which shows high performance and robust reliability, for worldwide supply. However, simplified code systems such as CESEC-III, which were developed in the 1970s, are still used in the current non-LOCA safety analysis of OPR1000 and APR1400 plants. It is therefore essential to secure an advanced safety analysis methodology to make the best use of the merits of the X-GEN fuel. To accomplish this purpose, the integrated safety analysis methodology (iSAM) is developed by selecting the best-estimate thermal-hydraulic code RETRAN. iSAM possesses remarkable advantages, such as generality, integrity, and designer-friendly features. That is, iSAM can be applied to both OPR1000 and APR1400 plants and uses only one computer code, RETRAN, over the whole scope of the non-LOCA safety analyses. iSAM also adopts a unique automatic initialization and run tool, the automatic steady-state initialization and safety analysis tool (ASSIST), which enables less experienced designers to use the new design code RETRAN without difficulty. In this paper, a brief overview of iSAM is given, and the results of applying iSAM to typical non-LOCA transients checked during the reload design are reported. The typical non-LOCA transients selected are the single control element assembly withdrawal (SCEAW) accident, the asymmetric steam generator transients (ASGT), the locked rotor (LR) accident, and the bank CEA withdrawal (BCEAW) event. Comparison to current licensing results shows a close resemblance; thus, it reveals that iSAM can be applied to the non-LOCA safety analysis of OPR1000 and APR1400 plants.

  8. Applications of integrated safety analysis methodology to reload safety evaluation

    International Nuclear Information System (INIS)

    Korea Nuclear Fuel is developing the X-GEN fuel, which shows high performance and robust reliability, for worldwide supply. However, simplified code systems such as CESEC-III, which were developed in the 1970s, are still used in the current non-LOCA safety analysis of OPR1000 and APR1400 plants. It is therefore essential to secure an advanced safety analysis methodology to make the best use of the merits of the X-GEN fuel. To accomplish this purpose, the integrated safety analysis methodology (iSAM) is developed by selecting the best-estimate thermal-hydraulic code RETRAN. iSAM possesses remarkable advantages, such as generality, integrity, and designer-friendly features. That is, iSAM can be applied to both OPR1000 and APR1400 plants and uses only one computer code, RETRAN, over the whole scope of the non-LOCA safety analyses. iSAM also adopts a unique automatic initialization and run tool, the automatic steady-state initialization and safety analysis tool (ASSIST), which enables less experienced designers to use the new design code RETRAN without difficulty. In this paper, a brief overview of iSAM is given, and the results of applying iSAM to typical non-LOCA transients checked during the reload design are reported. The typical non-LOCA transients selected are the single control element assembly withdrawal (SCEAW) accident, the asymmetric steam generator transients (ASGT), the locked rotor (LR) accident, and the bank CEA withdrawal (BCEAW) event. Comparison to current licensing results shows a close resemblance; thus, it reveals that iSAM can be applied to the non-LOCA safety analysis of OPR1000 and APR1400 plants.

  9. Volume totalizers analysis of pipelines operated by TRANSPETRO National Operational Control Center; Analise de totalizadores de volume em oleodutos operados pelo Centro Nacional de Controle e Operacao da TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Aramaki, Thiago Lessa; Montalvao, Antonio Filipe Falcao [Petrobras Transporte S.A. (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Marques, Thais Carrijo [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil)

    2012-07-01

    This paper presents the methodology and results of the analysis of differences in volume totalizers used in systems such as batch tracking and leak detection for pipelines operated by the National Operational Control Center (CNCO) at TRANSPETRO. In order to optimize this type of analysis, software was developed for the acquisition and processing of historical data using the methodology developed. The methodology takes into account the particularities encountered in the systems operated by TRANSPETRO and, more specifically, by the CNCO. (author)

  10. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Full Text Available Many cost overruns in the world of construction are attributable to either unforeseen events or foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement in project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for appropriate allocation of contingency in project cost estimation. In the first step, project risks are identified. An influence diagramming technique is employed to identify and show how the risks affect the project cost elements, as well as the relationships among the risks themselves. The second step is to assess the project costs with regard to the risks under consideration. Using a linguistic approach, the degree of uncertainty of the identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. In the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingencies, i.e. project contingency and management reserve, are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.
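
    As a rough illustration of the final allocation step, the sketch below sets the two contingency tiers from percentiles of a Monte Carlo cost simulation; the cost elements, distributions, and percentile choices are all hypothetical, and the paper's linguistic risk quantification would replace the assumed triangular distributions.

      # Monte Carlo contingency allocation sketch (all numbers invented).
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      base_total = 120.0 + 300.0 + 80.0   # base estimates of three elements

      # Triangular(min, mode, max) encodes each element's assessed uncertainty
      samples = (rng.triangular(100, 120, 180, n)     # earthworks
                 + rng.triangular(280, 300, 360, n)   # structure
                 + rng.triangular(70, 80, 110, n))    # finishes

      p50, p80 = np.percentile(samples, [50, 80])

      # Project contingency covers risk up to P50; management reserve tops
      # it up to P80, mirroring the paper's two-tier split.
      print(f"project contingency: {p50 - base_total:.1f}")
      print(f"management reserve : {p80 - p50:.1f}")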

  11. Limits, choices, expectation: methodological horizons to history textbooks analysis

    Directory of Open Access Journals (Sweden)

    Jean Carlos Moreno

    2012-03-01

    Full Text Available This article presents some possibilities for the analysis of History textbooks, towards the construction of a methodological outline that addresses, at the same time, different aspects of the History Education Disciplinary Code, its application, and the production context of these cultural objects. On the one hand, the purpose is to understand how textbooks fit within schooling and print culture, both of which are projects of modernity sharing the trajectory of the history of the book, without forgetting the force field that sustains them in each context. At the same time, the content analysis needs to pay attention to the specific aspects of school History teaching: those concerning identity, language, affection, moral development, cognition and horizons of expectation.

  12. Methodological progresses in Markovian availability analysis and applications

    International Nuclear Information System (INIS)

    The Markovian model applied to reliability analysis is well known as an effective tool whenever dependencies affect the probabilistic behaviour of a system's components. Its ability to follow the dynamical evolution of systems allows human actions to be included in the temporal evolution (inspections and maintenance, including human failure probabilities). The starting point has been the STAGEN-MARELA code. Although this code already achieves much progress towards reducing the size of Markovian matrices (merging of Markov processes for systems exhibiting symmetries), there is still an imperative need to reduce memory requirements. This implies, as a first step of any realistic analysis, a modularization of the studied system into subsystems, which can then be 'coupled'. The methodology is applied to the auxiliary feedwater injection of Doel 3. (orig./HSCH)
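
    A minimal sketch of the underlying computation, assuming a single repairable component with constant failure and repair rates (the merged, modularized matrices discussed above generalize this two-state case):

      # Steady-state Markovian availability: solve pi Q = 0 with sum(pi) = 1.
      import numpy as np

      lam, mu = 1e-3, 1e-1        # per hour: fail ~1/1000 h, repair ~1/10 h

      # Generator matrix over states {0: up, 1: down}
      Q = np.array([[-lam, lam],
                    [mu, -mu]])

      # Replace one balance equation with the normalisation condition
      A = np.vstack([Q.T[:-1], np.ones(2)])
      b = np.array([0.0, 1.0])
      pi = np.linalg.solve(A, b)

      print("steady availability:", pi[0])   # mu/(lam+mu) ~ 0.990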

  13. Fracture mechanics analysis on VVER1000 RPV with different methodologies

    International Nuclear Information System (INIS)

    The main component that limits the operational life of a Nuclear Power Plant (NPP) is the Reactor Pressure Vessel (RPV), because the properties of its carbon steel change during the operational life due to different causes: the high neutron flux in the welding region, thermal ageing, etc. This results in an increase of the RPV embrittlement level, which decreases the safety margin against crack propagation in case of transients with fast cooling rates due to emergency system injection or an increase of the secondary-side heat exchange. This problem is known as Pressurized Thermal Shock (PTS) and constitutes a relevant safety issue for NPPs that have been in operation for several years. Nowadays, the scientific community is trying to move the PTS analysis towards a “Best Estimate” (BE) scheme, with the aim of removing the excess of conservatism in each step of the analysis that stems from the limited knowledge of the phenomena in the eighties, when the problem was first considered in safety analysis. This change has been pushed by the possibility of extending the operational life of some plants, and it has been made possible by the availability of ever more powerful computers and sophisticated computer codes, which allow the analyst to perform very detailed analyses, with a very high degree of precision, of the mixing phenomena occurring at small scale in the downcomer, and to calculate the stress intensity factor at the crack tip with very refined meshes of millions of nodes. This paper describes the main steps of a PTS analysis: the system thermal-hydraulic calculation, the CFD analysis, the stress analysis and the fracture mechanics analysis for the RPV of a generic VVER1000. In particular, the paper shows a comparison of the results of the fracture mechanics analysis performed with different methodologies for the calculation of the stress intensity factor at the crack tip (KI). (author)
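
    As a pointer to the fracture mechanics step, the sketch below evaluates the textbook stress intensity factor K_I = Y·σ·√(πa) against an assumed toughness; every number is illustrative and the expression is far simpler than the refined finite element models compared in the paper.

      # Toy K_I screening calculation (all values assumed, not plant data).
      import math

      sigma = 150.0e6     # stress at the crack location, Pa
      a = 5.0e-3          # crack depth, m
      Y = 1.12            # geometry factor for a shallow surface crack
      K_Ic = 60.0e6       # assumed fracture toughness, Pa*sqrt(m)

      K_I = Y * sigma * math.sqrt(math.pi * a)
      print(f"K_I = {K_I / 1e6:.1f} MPa*sqrt(m), margin = {K_Ic / K_I:.2f}")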

  14. Rectal cancer surgery: volume-outcome analysis.

    LENUS (Irish Health Repository)

    Nugent, Emmeline

    2010-12-01

    There is strong evidence supporting the importance of the volume-outcome relationship with respect to lung and pancreatic cancers. This relationship for rectal cancer surgery however remains unclear. We review the currently available literature to assess the evidence base for volume outcome in relation to rectal cancer surgery.

  15. Best Estimate Analysis Methodology in the STARS Project at PSI

    International Nuclear Information System (INIS)

    The Project performed Best Estimate deterministic safety assessments of the Swiss nuclear power plants on behalf of the Swiss nuclear safety authority (HSK) and the utilities. The overall goal was to provide independent expertise in deterministic safety analysis to HSK and the utilities: identification and quantification of safety margins in existing Swiss nuclear power plants, and technical input for safety-related decisions based on timely and independent analysis. The paper describes the tools and the philosophy of Best Estimate analysis in the STARS Project, the code development model, code assessment, and code applications. PSI concluded that Best Estimate codes should be able to handle a wide variety of different transients, operational or unanticipated, in a numerically robust way, with a minimum of manipulation of qualified plant models. Validation and assessment of the adequacy of code models for the intended application is essential to justify the results of any Best Estimate analysis. This can be helped by: easy access to relevant experimental data, where a central, extensive database (Separate Effects and Integral Test data) would help substantially; easy access to plant data, which are essential for qualifying plant and code models; and assessment against experimental data, as it provides very valuable information for uncertainty methodologies. The application of uncertainty analysis to Best Estimate results is becoming important for decision-making, e.g. licensing, plant operation and configuration changes, etc.

  16. Evaluation of ATHEANA methodology a second generation human reliability analysis

    International Nuclear Information System (INIS)

    Incidents and accidents at nuclear power plants (NPPs) have been and always will be considered undesired occurrences. Human error (REASON, 1990) is probably the major contributor to serious accidents such as those that occurred at Three Mile Island NPP, Unit 2 (TMI-2), in 1979, and Chernobyl, Unit 4, in 1986, and at other NPPs (AEOD/E95-01, 1995). Reviews and analyses of those accidents and other near-misses have shown operators performing actions that are not required for the accident response and that, in fact, worsen the plant's condition. Such an action, where a person does something he is not supposed to do, believing it to be the right thing to do, resulting in changes to the plant that may be worse than if he had done nothing, is called an Error of Commission (EOC). These inappropriate actions are strongly affected (NUREG-1624, Rev.1, 2000) by the off-normal context (i.e., the combination of plant conditions and performance shaping factors) of the event scenario, which virtually forces the operator to fail. Considering that this kind of human intervention could be an important failure mode and a precursor to more serious events, aggravated by the fact that current probabilistic risk assessment (PRA) does not consider this kind of error, a new Human Reliability Analysis (HRA) methodology (NUREG-1624, Rev.1, 2000), called 'A Technique for Human Event Analysis' (ATHEANA), was developed. ATHEANA is a multidisciplinary second-generation HRA method and provides an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant accidents. This paper presents this new methodology in order to identify its weak and strong points, to evaluate its advantages, disadvantages and benefits to safety, and, finally, to analyze its application to the Angra NPP. (author)

  17. SCALE-4 analysis of pressurized water reactor critical configurations. Volume 1: Summary

    International Nuclear Information System (INIS)

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original fresh composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized water reactors (PWR). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power's Surry Unit 1 reactor for the Cycle 2 core; Volume 4 documents the calculations performed based on GPU Nuclear Corporation's Three Mile Island Unit 1 Cycle 5 core; and, lastly, Volume 5 describes the analysis of Virginia Power's North Anna Unit 1 Cycle 5 core. Each of the reactor-specific volumes provides the details of calculations performed to determine the effective multiplication factor for each reactor core for one or more critical configurations using the SCALE-4 system; these results are summarized in this volume. Differences between the core designs and their possible impact on the criticality calculations are also discussed. Finally, results are presented for additional analyses performed to verify that solutions were sufficiently converged

  18. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Full Text Available Decision making, at both the individual and the organizational level, is always accompanied by a search for others' opinions. The tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter provides a rich anthology of sentiments. This user-generated content can serve as a benefit to the market if the semantic orientations are deliberated. Opinion mining and sentiment analysis are the formalizations for studying and construing opinions and sentiments. The digital ecosystem has itself paved the way for the use of the huge volume of opinionated data recorded. This paper is an attempt to review and evaluate the various techniques used for opinion and sentiment analysis.
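
    To make the simplest of the surveyed techniques concrete, here is a toy lexicon-based scorer; the word lists are invented for illustration, whereas real systems use published lexicons or learned models.

      # Minimal lexicon-based sentiment scoring sketch.
      POSITIVE = {"good", "great", "excellent", "love", "useful"}
      NEGATIVE = {"bad", "poor", "terrible", "hate", "useless"}
      NEGATORS = {"not", "never", "no"}

      def sentiment(text: str) -> int:
          """Signed score; a negator flips the polarity of the next opinion word."""
          score, flip = 0, 1
          for raw in text.lower().split():
              word = raw.strip(".,!?")
              if word in NEGATORS:
                  flip = -1
                  continue
              if word in POSITIVE:
                  score += flip
              elif word in NEGATIVE:
                  score -= flip
              flip = 1              # negation scope ends after one word
          return score

      print(sentiment("The reviews were not bad, the forum is great"))  # 2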

  19. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop technology that can be used to enhance human performance. The research goal of this project is twofold: the development of a domestic human performance database, and the analysis of human error, constructing a technical basis for human reliability analysis. This study has three main results. The first is the development of a domestic human performance database, called OPERA-I (Operator Performance and Reliability Analysis, Part I). In addition, based on OPERA-I, the task complexity (TACOM) measure and a methodology for optimizing diagnosis procedures were developed. The second main result is the standardization of the HRA method. Finally, the Misdiagnosis Tree Analysis (MDTA) technique, which can analyze EOCs, has been developed. These results, such as OPERA-I, TACOM, the standardized HRA method and the MDTA technique, will be used to improve the quality of domestic PSA.

  20. SLSF loop handling system. Volume I. Structural analysis

    International Nuclear Information System (INIS)

    The SLSF loop handling system was analyzed for deadweight and postulated dynamic loading conditions, identified in Chapters II and III of Volume I of this report, using a linear elastic static equivalent method of stress analysis. The stress analysis of the loop handling machine is presented in Volume I of this report. Chapter VII of Volume I is a contribution by EG and G Co., who performed the work under ANL supervision.

  1. A normative price for a manufactured product: The SAMICS methodology. Volume 1: Executive summary

    Science.gov (United States)

    Chamberlain, R. G.

    1979-01-01

    A summary for the Solar Array Manufacturing Industry Costing Standards report contains a discussion of capabilities and limitations, a non-technical overview of the methodology, and a description of the input data which must be collected. It also describes the activities that were and are being taken to ensure validity of the results and contains an up-to-date bibliography of related documents.

  2. Life prediction methodology for ceramic components of advanced vehicular heat engines: Volume 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Khandelwal, P.K.; Provenzano, N.J.; Schneider, W.E. [Allison Engine Co., Indianapolis, IN (United States)

    1996-02-01

    One of the major challenges involved in the use of ceramic materials is ensuring adequate strength and durability. This activity has developed methodology which can be used during the design phase to predict the structural behavior of ceramic components. The effort involved the characterization of injection molded and hot isostatically pressed (HIPed) PY-6 silicon nitride, the development of nondestructive evaluation (NDE) technology, and the development of analytical life prediction methodology. Four failure modes are addressed: fast fracture, slow crack growth, creep, and oxidation. The techniques deal with failures initiating at the surface as well as internal to the component. The life prediction methodologies for fast fracture and slow crack growth have been verified using a variety of confirmatory tests. The verification tests were conducted at room and elevated temperatures up to a maximum of 1371 °C. The tests involved (1) flat circular disks subjected to bending stresses and (2) high-speed rotating spin disks. Reasonable correlation was achieved for a variety of test conditions and failure mechanisms. The predictions associated with surface failures proved to be optimistic, requiring re-evaluation of the components' initial fast fracture strengths. Correlation was achieved for the spin disks which failed in fast fracture from internal flaws. Time-dependent elevated temperature slow crack growth spin disk failures were also successfully predicted.
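
    As a hedged illustration of the fast-fracture part of such a methodology, the sketch below evaluates a two-parameter Weibull (weakest-link) failure probability with a simple effective-volume scaling; the parameter values are assumptions for illustration, not Allison data.

      # Weibull fast-fracture probability sketch (parameters assumed).
      import math

      m = 10.0            # Weibull modulus
      sigma_0 = 600.0     # characteristic strength, MPa
      V_e = 2.0           # effective volume ratio vs. the test specimen

      def p_fail(sigma_mpa: float) -> float:
          """Two-parameter Weibull failure probability with volume scaling."""
          return 1.0 - math.exp(-V_e * (sigma_mpa / sigma_0) ** m)

      for s in (300, 400, 500):
          print(f"{s} MPa -> P_f = {p_fail(s):.3f}")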

  3. Analysis of the low-altitude proton flux asymmetry: methodology

    CERN Document Server

    Kruglanski, M

    1999-01-01

    Existing East-West asymmetry models of the trapped proton fluxes at low altitudes depend on the local magnetic dip angle and a density scale height derived from atmospheric models. We propose an alternative approach which maps the directional flux over a drift shell (B_m, L) in terms of the local pitch and azimuthal angles α and β, where β is defined in the local mirror plane as the angle between the proton arrival direction and the surface normal to the drift shell. This approach has the advantage that it only depends on drift shell parameters and does not involve an atmosphere model. A semi-empirical model based on the new methodology is able to reproduce the angular distribution of a set of SAMPEX/PET proton flux measurements. Guidelines are proposed for spacecraft missions and data analysis procedures that are intended to be used for the building of new trapped radiation environment models.
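
    The geometric core of the mapping reduces to plain vector arithmetic; in the sketch below (all vectors invented for illustration) α is the local pitch angle and β the angle between the arrival direction and the drift shell normal.

      # Pitch and azimuthal angle sketch for the (alpha, beta) mapping.
      import numpy as np

      def angle_deg(v1, v2):
          """Angle between two vectors, in degrees."""
          c = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
          return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

      b_hat = np.array([0.0, 0.0, 1.0])     # local magnetic field direction
      n_shell = np.array([1.0, 0.0, 0.0])   # normal to the drift shell
      arrival = np.array([0.6, 0.8, 0.0])   # proton arrival direction

      alpha = angle_deg(arrival, b_hat)     # local pitch angle -> 90.0
      beta = angle_deg(arrival, n_shell)    # azimuthal angle   -> 53.1
      print(f"alpha = {alpha:.1f} deg, beta = {beta:.1f} deg")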

  4. Analysis of the low-altitude proton flux asymmetry: methodology

    International Nuclear Information System (INIS)

    Existing East-West asymmetry models of the trapped proton fluxes at low altitudes depend on the local magnetic dip angle and a density scale height derived from atmospheric models. We propose an alternative approach which maps the directional flux over a drift shell (Bm, L) in terms of the local pitch and azimuthal angles α and β, where β is defined in the local mirror plane as the angle between the proton arrival direction and the surface normal to the drift shell. This approach has the advantage that it only depends on drift shell parameters and does not involve an atmosphere model. A semi-empirical model based on the new methodology is able to reproduce the angular distribution of a set of SAMPEX/PET proton flux measurements. Guidelines are proposed for spacecraft missions and data analysis procedures that are intended to be used for the building of new trapped radiation environment models

  5. Using of BEPU methodology in a final safety analysis report

    International Nuclear Information System (INIS)

    Nuclear Reactor Safety (NRS) has been an established concern since the discovery of nuclear fission, and the occurrence of accidents in nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and the plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach, addressing the uncertainties of the results. This work aims to give an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper intends to demonstrate the background of the licensing process through the main licensing requirements. (author)

  6. A methodology for chemical hazards analysis at nuclear fuels reprocessing plants

    International Nuclear Information System (INIS)

    The Savannah River Laboratory employs a formal methodology for chemical hazards analysis primarily for use in the risk assessment of its nuclear fuels reprocessing plants. The methodology combines interactive matrices for reactions of available materials, fault tree analysis, human factors, and extensive data banks on the operating history of the plants. Examples illustrate the methodology and a related data bank

  7. Transuranium analysis methodologies for biological and environmental samples

    International Nuclear Information System (INIS)

    Analytical procedures for the most abundant transuranium nuclides in the environment (i.e., plutonium and, to a lesser extent, americium) are available. There is a lack of procedures for doing sequential analysis for Np, Pu, Am, and Cm in environmental samples, primarily because of current emphasis on Pu and Am. Reprocessing requirements and waste disposal connected with the fuel cycle indicate that neptunium and curium must be considered in environmental radioactive assessments. Therefore it was necessary to develop procedures that determine all four of these radionuclides in the environment. The state of the art of transuranium analysis methodology as applied to environmental samples is discussed relative to different sample sources, such as soil, vegetation, air, water, and animals. Isotope-dilution analysis with 243Am (239Np) and 236Pu or 242Pu radionuclide tracers is used. Americium and curium are analyzed as a group, with 243Am as the tracer. Sequential extraction procedures employing bis(2-ethyl-hexyl)orthophosphoric acid (HDEHP) were found to result in lower yields and higher Am--Cm fractionation than ion-exchange methods
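
    The isotope-dilution arithmetic itself is short: the recovery of the added tracer gives the chemical yield, which corrects the analyte result. A sketch with invented activities:

      # Isotope-dilution yield correction sketch (numbers illustrative).
      tracer_added_mBq = 50.0       # e.g. 242Pu tracer spiked into the sample
      tracer_measured_mBq = 37.5    # tracer activity in the final fraction
      analyte_measured_mBq = 12.0   # analyte activity in the same fraction

      yield_frac = tracer_measured_mBq / tracer_added_mBq   # chemical recovery
      analyte_in_sample = analyte_measured_mBq / yield_frac

      print(f"recovery: {yield_frac:.0%}, analyte: {analyte_in_sample:.1f} mBq")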

  8. Methodology for the analysis and retirement of assets: Power transformers

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Gómez-Ramírez

    2015-09-01

    Full Text Available This article develops a high-voltage engineering methodology for the analysis and retirement of repaired power transformers, based on engineering criteria, in order to establish a correlation between the conditions of the transformer from several points of view: electrical, mechanical, dielectric and thermal. An analysis of the state of the art reveals two situations of great significance. First, the international procedures are a 'guide' for the acceptance of new transformers, so they cannot be applied literally to repaired transformers, owing to the degradation process the transformer has undergone over the years and all the factors that may have led to a repair. Second, based on the most recent technical literature, articles analyzing the dielectric oil and the insulating paper have been reviewed, in which correlations are established between the quality of the insulating paper and the furan concentrations in the oils. Finally, a large part of the research carried out so far has focused on analyzing the transformer from the condition of the dielectric oil, since in most cases there is no possibility of performing forensic engineering inside a transformer in operation and thus of analyzing the design components that can compromise its integrity and operability.

  9. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary Business Process Modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated through use in a real project. Using examples from this project, the main methodology features are explained, together with the significant problems which were met during the project. Concluding from these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  10. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    OpenAIRE

    L. Renbaum-Wolff; J. W. Grayson; A. K. Bertram

    2012-01-01

    Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μl, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemic...

  11. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    OpenAIRE

    L. Renbaum-Wolff; J. W. Grayson; A. K. Bertram

    2013-01-01

    Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μl, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemical properties a...

  12. Life prediction methodology for ceramic components of advanced heat engines. Phase 1: Volume 2, Appendices

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This volume presents the following appendices: ceramic test specimen drawings and schematics, mixed-mode and biaxial stress fracture of structural ceramics for advanced vehicular heat engines (U. Utah), mode I/mode II fracture toughness and tension/torsion fracture strength of NT154 Si nitride (Brown U.), summary of strength test results and fractography, fractography photographs, derivations of statistical models, Weibull strength plots for fast fracture test specimens, and size functions.

  13. Metabolic tumour volumes measured at staging in lymphoma: methodological evaluation on phantom experiments and patients

    Energy Technology Data Exchange (ETDEWEB)

    Meignan, Michel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Paris-Est University, Service de Medecine Nucleaire, EAC CNRS 7054, Hopital Henri Mondor AP-HP, Creteil (France); Sasanelli, Myriam; Itti, Emmanuel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Casasnovas, Rene Olivier [CHU Le Bocage, Department of Hematology, Dijon (France); Luminari, Stefano [University of Modena and Reggio Emilia, Department of Diagnostic, Clinic and Public Health Medicine, Modena (Italy); Fioroni, Federica [Santa Maria Nuova Hospital-IRCCS, Department of Medical Physics, Reggio Emilia (Italy); Coriani, Chiara [Santa Maria Nuova Hospital-IRCCS, Department of Radiology, Reggio Emilia (Italy); Masset, Helene [Henri Mondor Hospital, Department of Radiophysics, Creteil (France); Gobbi, Paolo G. [University of Pavia, Department of Internal Medicine and Gastroenterology, Fondazione IRCCS Policlinico San Matteo, Pavia (Italy); Merli, Francesco [Santa Maria Nuova Hospital-IRCCS, Department of Hematology, Reggio Emilia (Italy); Versari, Annibale [Santa Maria Nuova Hospital-IRCCS, Department of Nuclear Medicine, Reggio Emilia (Italy)

    2014-06-15

    The presence of a bulky tumour at staging on CT is an independent prognostic factor in malignant lymphomas. However, its prognostic value is limited in diffuse disease. Total metabolic tumour volume (TMTV) determined on {sup 18}F-FDG PET/CT could give a better evaluation of the total tumour burden and may help patient stratification. Different methods of TMTV measurement established in phantoms simulating lymphoma tumours were investigated and validated in 40 patients with Hodgkin lymphoma and diffuse large B-cell lymphoma. Data were processed by two nuclear medicine physicians in Reggio Emilia and Creteil. Nineteen phantoms filled with {sup 18}F-saline were scanned; these comprised spherical or irregular volumes from 0.5 to 650 cm{sup 3} with tumour-to-background ratios from 1.65 to 40. Volumes were measured with different SUVmax thresholds. In patients, TMTV was measured on PET at staging by two methods: volumes of individual lesions were measured using a fixed 41 % SUVmax threshold (TMTV{sub 41}) and a variable visually adjusted SUVmax threshold (TMTV{sub var}). In phantoms, the 41 % threshold gave the best concordance between measured and actual volumes. Interobserver agreement was almost perfect. In patients, the agreement between the reviewers for TMTV{sub 41} measurement was substantial (ρ {sub c} = 0.986, CI 0.97 - 0.99) and the difference between the means was not significant (212 ± 218 cm{sup 3} for Creteil vs. 206 ± 219 cm{sup 3} for Reggio Emilia, P = 0.65). By contrast the agreement was poor for TMTV{sub var}. There was a significant direct correlation between TMTV{sub 41} and normalized LDH (r = 0.652, CI 0.42 - 0.8, P <0.001). Higher disease stages and bulky tumour were associated with higher TMTV{sub 41}, but high TMTV{sub 41} could be found in patients with stage 1/2 or nonbulky tumour. Measurement of baseline TMTV in lymphoma using a fixed 41% SUVmax threshold is reproducible and correlates with the other parameters for tumour mass evaluation
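
    A minimal sketch of the fixed-threshold measurement: each lesion is segmented at 41% of its own SUVmax and the voxel volumes are summed. The arrays and voxel size below are synthetic stand-ins for real PET data.

      # TMTV at a fixed 41% SUVmax threshold (synthetic data).
      import numpy as np

      voxel_cm3 = 0.4 * 0.4 * 0.4         # assumed voxel volume, cm^3

      def lesion_volume(suv: np.ndarray, frac: float = 0.41) -> float:
          """Volume of voxels at or above frac * SUVmax of this lesion."""
          mask = suv >= frac * suv.max()
          return mask.sum() * voxel_cm3

      rng = np.random.default_rng(0)
      # Two synthetic sub-volumes cropped around hypothetical hot spots
      lesions = [rng.gamma(2.0, 2.0, size=(20, 20, 20)) for _ in range(2)]

      tmtv = sum(lesion_volume(l) for l in lesions)
      print(f"TMTV_41 = {tmtv:.1f} cm^3")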

  14. Simplex volume analysis for finding endmembers in hyperspectral imagery

    Science.gov (United States)

    Li, Hsiao-Chi; Song, Meiping; Chang, Chein-I.

    2015-05-01

    Using maximal simplex volume as an optimal criterion for finding endmembers is a common approach and has been widely studied in the literature. Interestingly, very little work has been reported on how the simplex volume is actually calculated. It turns out that the issue of calculating simplex volume is much more complicated and involved than one might think. This paper investigates the issue from two different aspects, geometric structure and eigen-analysis. The geometric approach derives the volume from the simplex structure, multiplying its base by its height. The eigen-analysis approach, on the other hand, takes advantage of the Cayley-Menger determinant to calculate the simplex volume. The major issue with this approach arises when the matrix is rank-deficient but a determinant is required. To deal with this problem, two methods are generally considered. One is to perform data dimensionality reduction to make the matrix full rank; the drawback of this method is that the original volume is shrunk, and the volume found for a dimensionality-reduced simplex is not the true original simplex volume. The other is to use singular value decomposition (SVD) to find singular values for calculating the simplex volume; the dilemma of this method is its numerical instability. This paper explores all three of these methods for simplex volume calculation. Experimental results show that the geometric structure-based method yields the most reliable simplex volume.
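
    For concreteness, the sketch below computes a tetrahedron volume both ways: via the Cayley-Menger determinant on pairwise squared distances, and via the determinant of edge vectors (the full-rank geometric route).

      # Two simplex volume routes on a unit tetrahedron (volume 1/6).
      import math
      import numpy as np

      def volume_cayley_menger(pts: np.ndarray) -> float:
          """Volume of the n-simplex on n+1 points via Cayley-Menger."""
          n = pts.shape[0] - 1
          d2 = np.sum((pts[:, None, :] - pts[None, :, :]) ** 2, axis=-1)
          cm = np.ones((n + 2, n + 2))
          cm[0, 0] = 0.0
          cm[1:, 1:] = d2
          coeff = (-1) ** (n + 1) / (2 ** n * math.factorial(n) ** 2)
          return math.sqrt(coeff * np.linalg.det(cm))

      def volume_edge_vectors(pts: np.ndarray) -> float:
          """Volume as |det(edge vectors)| / n!; needs full-rank data."""
          n = pts.shape[0] - 1
          return abs(np.linalg.det(pts[1:] - pts[0])) / math.factorial(n)

      tetra = np.array([[0., 0., 0.], [1., 0., 0.],
                        [0., 1., 0.], [0., 0., 1.]])
      print(volume_cayley_menger(tetra), volume_edge_vectors(tetra))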

  15. Analysis of service identification in SOA methodologies - with a unification in POSI, perspective oriented service identification

    OpenAIRE

    2008-01-01

    This thesis is written in the context of Model Driven Architectures and SOA, and investigates different methodologies and their ways of identifying and describing a service oriented architecture. The methodologies analyzed are OASIS, ARIS, COMET-S and Archimate. With the knowledge gained from the analysis, a perspective oriented service identification methodology is proposed: POSI. POSI will be analyzed and later compared with the thesis' initial analysis of the mentioned methodolog...

  16. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error, constructing a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors during event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  17. FRACTAL ANALYSIS OF TRABECULAR BONE: A STANDARDISED METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ian Parkinson

    2011-05-01

    Full Text Available A standardised methodology for the fractal analysis of histological sections of trabecular bone has been established. A modified box counting method has been developed for use on a PC-based image analyser (Quantimet 500MC, Leica Cambridge). The effect of image analyser settings, magnification, image orientation and threshold levels was determined. Also, the range of scale over which trabecular bone is effectively fractal was determined, and a method was formulated to objectively calculate more than one fractal dimension from the modified Richardson plot. The results show that magnification, image orientation and threshold settings have little effect on the estimate of fractal dimension. Trabecular bone has a lower limit below which it is not fractal (λ < 25 μm), and the upper limit is 4250 μm. There are three distinct fractal dimensions for trabecular bone (sectional fractals), with magnitudes greater than 1.0 and less than 2.0. It has been shown that trabecular bone is effectively fractal over a defined range of scale. Also, within this range, there is more than one fractal dimension, describing spatial structural entities. Fractal analysis is a model-independent method for describing a complex multifaceted structure, which can be adapted for the study of other biological systems. This may be at the cell, tissue or organ level and complements conventional histomorphometric and stereological techniques.
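
    A minimal box-counting sketch on a synthetic binary image (illustrative only, not the Quantimet protocol described above): count occupied boxes over a series of box sizes and take the slope of the log-log (Richardson) plot.

      # Box-counting fractal dimension estimate on a binary image.
      import numpy as np

      def box_count(img: np.ndarray, size: int) -> int:
          """Number of size x size boxes containing any foreground pixel."""
          h, w = img.shape
          count = 0
          for i in range(0, h, size):
              for j in range(0, w, size):
                  if img[i:i + size, j:j + size].any():
                      count += 1
          return count

      rng = np.random.default_rng(1)
      img = rng.random((256, 256)) < 0.1   # stand-in for a thresholded section

      sizes = np.array([2, 4, 8, 16, 32])
      counts = np.array([box_count(img, s) for s in sizes])

      # Fractal dimension = -slope of log(count) versus log(size)
      slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
      print(f"estimated dimension: {-slope:.2f}")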

  18. Methodology for the analysis of dynamic human actions

    International Nuclear Information System (INIS)

    A methodology for the analysis of human actions under accident conditions has been developed, which uses information from plant simulator runs, plant procedures, and plant systems information. The objective is to enhance the completeness of the event sequence model (event trees) with respect to both favorable and unfavorable operator actions. Routine human actions that impact the plant at or below the systems level, such as test and maintenance actions, are handled in the systems analysis. Types of dynamic operator actions analyzed in this paper are actions taken during an event sequence that: supplement the automatic response of plant systems for event mitigation, change or detract from the automatic response of plant systems, or lead to recovery of failed systems. The derived results can be used directly in a probabilistic risk assessment. It is judged that the major cause of possible error is misdiagnosis, which can lead to either errors of omission or errors of commission. Operator mistakes may occur when a situation is misclassified or when inappropriate decisions and response selections are made in the operator action sequences. The operator action sequences are modeled in a natural progression of human response, including observation of plant parameters and the diagnosis of the event or the decision to take action

  19. Analysis of Computer-Mediated Communication: Using Formal Concept Analysis as a Visualizing Methodology.

    Science.gov (United States)

    Hara, Noriko

    2002-01-01

    Introduces the use of Formal Concept Analysis (FCA) as a methodology to visualize the data in computer-mediated communication. Bases FCA on a mathematical lattice theory and offers visual maps (graphs) with conceptual hierarchies, and proposes use of FCA combined with content analysis to analyze computer-mediated communication. (Author/LRW)
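
    A tiny sketch of the FCA machinery on an invented message-attribute context: every formal concept (extent, intent) is found by closing subsets of objects.

      # Enumerate formal concepts of a small binary context.
      from itertools import combinations

      objects = ["msg1", "msg2", "msg3"]
      attrs = {"msg1": {"question"},
               "msg2": {"question", "reply"},
               "msg3": {"reply", "offtopic"}}

      def intent(objs):
          """Attributes shared by all objects in objs (all attrs if empty)."""
          sets = [attrs[o] for o in objs]
          return set.intersection(*sets) if sets else set().union(*attrs.values())

      def extent(attributes):
          """Objects possessing every attribute in the given set."""
          return {o for o in objects if attributes <= attrs[o]}

      concepts = set()
      for r in range(len(objects) + 1):
          for objs in combinations(objects, r):
              b = intent(objs)
              a = extent(b)               # closure of the object subset
              concepts.add((frozenset(a), frozenset(b)))

      for a, b in sorted(concepts, key=lambda c: len(c[0])):
          print(sorted(a), "<->", sorted(b))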

  20. A study on safety analysis methodology in spent fuel dry storage facility

    International Nuclear Information System (INIS)

    Collection and review of the domestic and foreign technology related to spent fuel dry storage facility. Analysis of a reference system. Establishment of a framework for criticality safety analysis. Review of accident analysis methodology. Establishment of accident scenarios. Establishment of scenario analysis methodology

  1. A study on safety analysis methodology in spent fuel dry storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Che, M. S.; Ryu, J. H.; Kang, K. M.; Cho, N. C.; Kim, M. S. [Hanyang Univ., Seoul (Korea, Republic of)

    2004-02-15

    Collection and review of the domestic and foreign technology related to spent fuel dry storage facility. Analysis of a reference system. Establishment of a framework for criticality safety analysis. Review of accident analysis methodology. Establishment of accident scenarios. Establishment of scenario analysis methodology.

  2. Optimising the education of responsible shift personnel in nuclear power plants. Volume 1 for Chapter 3: Investigational methodology

    International Nuclear Information System (INIS)

    In line with the usual announcement procedures, an analysis was to be carried out of those activities from which capabilities, knowledge and, in turn, learning objectives can be derived in consecutive stages. Accordingly, this volume contains articles on the following: the derivation of learning objectives from activities, on the themes of capabilities and knowledge; the analysis of professional activity; the appraisal of the descriptors; and a textual presentation of the activities. (DG)

  3. Economic Analysis. Volume V. Course Segments 65-79.

    Science.gov (United States)

    Sterling Inst., Washington, DC. Educational Technology Center.

    The fifth volume of the multimedia, individualized course in economic analysis produced for the United States Naval Academy covers segments 65-79 of the course. Included in the volume are discussions of monopoly markets, monopolistic competition, oligopoly markets, and the theory of factor demand and supply. Other segments of the course, the…

  4. Heliostat manufacturing cost analysis. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Drumheller, K; Schulte, S C; Dilbeck, R A; Long, L W

    1979-10-01

    This study has two primary objectives. The first is to provide a detailed cost evaluation of the second generation of DOE heliostats, from which repowering heliostat designs are likely to be derived. A second objective is to provide an analytical foundation for the evaluation of future heliostat designs. The approach taken for this study was to produce a cost estimate for the production of the McDonnell Douglas prototype design by generating estimates of the materials, labor, overhead, and facilities costs for two different production scenarios, 25,000 heliostats per year and 250,000 heliostats per year. The primary conclusion of this study is that the second generation of heliostat designs should cost approximately $100/m{sup 2} at volumes of 25,000 units/year. This price falls to approximately $80/m{sup 2} at volumes of 250,000 units/year. A second conclusion is that cost reduction begins at relatively low production volumes and that many production benefits can be obtained at production rates of 5,000 to 15,000 units/year. A third conclusion is that the SAMICS model and the SAMIS III program can be useful tools in heliostat manufacturing, costing, and economic studies.
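
    One way to interpolate between the two quoted price points is a power-law volume-cost model; the functional form and fitted exponent below are an assumption for illustration, not part of the study.

      # Power-law cost-volume sketch fitted to the two quoted points.
      import math

      v1, c1 = 25_000, 100.0     # $/m^2 at 25,000 heliostats/yr
      v2, c2 = 250_000, 80.0     # $/m^2 at 250,000 heliostats/yr

      b = math.log(c2 / c1) / math.log(v2 / v1)   # cost ~ volume**b

      def cost_per_m2(volume: float) -> float:
          return c1 * (volume / v1) ** b

      for v in (5_000, 15_000, 100_000):
          print(f"{v:>7} units/yr -> ~${cost_per_m2(v):.0f}/m^2")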

  5. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    Directory of Open Access Journals (Sweden)

    L. Renbaum-Wolff

    2013-01-01

    Full Text Available Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μl, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemical properties and with real-time relative humidity and temperature control; hence, the technique should be well suited for determining the viscosities, under atmospherically relevant conditions, of particles collected from environmental chambers. In this technique, supermicron particles are first deposited on an inert hydrophobic substrate. Then, insoluble beads (~1 μm in diameter) are embedded in the particles. Next, a flow of gas is introduced over the particles, which generates a shear stress on the particle surfaces. The sample responds to this shear stress by generating internal circulations, which are quantified with an optical microscope by monitoring the movement of the beads. The rate of internal circulation is shown to be a function of particle viscosity but independent of the particle material for a wide range of organic and organic-water samples. A calibration curve is constructed from the experimental data that relates the rate of internal circulation to particle viscosity, and this calibration curve is successfully used to predict viscosities in multicomponent organic mixtures.
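
    The calibration step lends itself to a compact sketch: fit a power law between circulation rate and known viscosity on standards, then invert it for an unknown sample. All numbers below are invented for illustration.

      # Log-log calibration fit and inversion (synthetic standards).
      import numpy as np

      eta_std = np.array([1e-2, 1e-1, 1e0, 1e1, 1e2])       # viscosity, Pa s
      rate_std = np.array([80.0, 9.0, 0.95, 0.11, 0.012])   # bead rate, um/s

      # Straight line in log-log space: log(rate) = m*log(eta) + c
      m, c = np.polyfit(np.log10(eta_std), np.log10(rate_std), 1)

      def viscosity_from_rate(rate_um_s: float) -> float:
          """Invert the calibration curve for an unknown particle sample."""
          return 10.0 ** ((np.log10(rate_um_s) - c) / m)

      print(f"fit slope: {m:.2f}")
      print(f"sample at 0.5 um/s -> eta ~ {viscosity_from_rate(0.5):.2f} Pa s")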

  6. Application of GO methodology for reliability analysis of repairable system

    International Nuclear Information System (INIS)

    GO methodology is a success-oriented method of system reliability analysis, and it has been applied to non-repairable systems. The author studies the application of GO methodology to repairable systems. Considering the dependence of components, quantitative formulas for some GO operations for repairable systems have been derived. Based on this study, a GO program has been developed. Reliability parameters such as steady availability and expected failure number can be calculated directly from the GO chart, and an example of the High Pressure Injection System of a Nuclear Power Plant is given. The study supports the development and application of GO methodology to repairable systems such as nuclear pipeline systems or chemical process systems.
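
    A hedged sketch of the success-oriented bookkeeping, assuming independent components (the quantitative operators derived in the paper additionally treat dependence):

      # Steady availabilities combined through series/parallel success logic.
      def availability(lam: float, mu: float) -> float:
          """Steady availability of a repairable component: mu / (lam + mu)."""
          return mu / (lam + mu)

      a_pump = availability(lam=2e-4, mu=5e-2)      # hypothetical pump
      a_valve = availability(lam=1e-4, mu=1e-1)     # hypothetical valve

      series = a_pump * a_valve                     # both must succeed
      parallel = 1 - (1 - a_pump) * (1 - a_valve)   # either path succeeds

      print(f"series  : {series:.5f}")
      print(f"parallel: {parallel:.7f}")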

  7. Population analysis a methodology for understanding populations in COIN environments

    OpenAIRE

    Self, Eric C.

    2008-01-01

    This thesis outlines a methodology for use by tactical operators to better understand the dynamics of the population whose support they are attempting to gain. In turn, these operators (Army soldiers, Marines, Special Forces, SEALs, Civil Affairs, etc.) can use this information to more effectively develop strategy, plan operations, and conduct tactical missions. Our methodology provides a heuristic model, called the "3 x 5 P.I.G.S.P.E.E.R. Model," that can be applied in any environment and...

  8. Overview of core simulation methodologies for light water reactor analysis

    International Nuclear Information System (INIS)

    The current in-core fuel management calculation methods provide a very efficient route to predicting the neutronics behavior of light water reactor (LWR) cores, and their prediction accuracy for current-generation LWRs is generally sufficient. However, since neutronics calculations for LWRs are based on various assumptions and simplifications, we should also recognize the many implicit limitations that are 'embedded' in current neutronics calculation methodologies. The continuous effort to improve core simulation methodologies is also discussed. (author)

  9. Methodology for the systems engineering process. Volume 1: System functional activities

    Science.gov (United States)

    Nelson, J. H.

    1972-01-01

    Systems engineering is examined in terms of functional activities that are performed in the conduct of a system definition/design, and system development is described in a parametric analysis that combines functions, performance, and design variables. Emphasis is placed on identification of activities performed by design organizations, design specialty groups, as well as a central systems engineering organizational element. Identification of specific roles and responsibilities for doing functions, and monitoring and controlling activities within the system development operation are also emphasized.

  10. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community
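
    Idea (3) can be sketched with a small Latin hypercube sampler propagated through a toy risk expression; this is purely illustrative and not the NUREG-1150 codes.

      # Latin hypercube sampling for uncertainty propagation (toy model).
      import numpy as np

      rng = np.random.default_rng(7)

      def lhs(n: int, dims: int) -> np.ndarray:
          """One stratified sample per equal-probability bin, per dimension."""
          u = (rng.random((n, dims)) + np.arange(n)[:, None]) / n
          for d in range(dims):
              u[:, d] = rng.permutation(u[:, d])
          return u

      n = 1000
      u = lhs(n, 2)
      release = 10.0 ** (u[:, 0] * 2.0 - 3.0)     # toy fraction, 1e-3..1e-1
      frequency = 10.0 ** (u[:, 1] * 2.0 - 6.0)   # toy frequency, 1e-6..1e-4 /yr

      risk = release * frequency
      print(f"mean risk: {risk.mean():.2e}")
      print(f"95th percentile: {np.percentile(risk, 95):.2e}")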

  11. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Murfin, W.B. [Technadyne Engineering Consultants, Inc., Albuquerque, NM (United States); Hora, S.C. [Hawaii Univ., Hilo, HI (United States)

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community.

  12. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    The GO-FLOW methodology is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. Recently, an integrated analysis framework based on GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism of the Japanese Government. This paper describes (a) an overview of the GO-FLOW methodology, (b) the procedure for treating a phased mission problem, (c) common cause failure analysis, (d) uncertainty analysis, and (e) the integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  13. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  14. Representing WWW navigational data: a graphical methodology to support qualitative analysis

    OpenAIRE

    Honey Lucas

    1999-01-01

    This paper concerns the development and use of a graphical methodology in the analysis of individuals' World Wide Web navigational behaviour. Specifically, it reports on the comparison of this methodology to other methodologies of qualitative analysis. Examples are given of the graphical representations of individuals' interactions with particular World Wide Web medical resources, and the integration of these graphical representations with textual data is considered. Indications for further development in this area are also suggested.

  15. Representing WWW navigational data: a graphical methodology to support qualitative analysis

    Directory of Open Access Journals (Sweden)

    Honey Lucas

    1999-01-01

    This paper concerns the development and use of a graphical methodology in the analysis of individuals’ World Wide Web navigational behaviour. Specifically, it reports on the comparison of this methodology to other methodologies of qualitative analysis. Examples are given of the graphical representations of individuals’ interactions with particular World Wide Web medical resources, and the integration of these graphical representations with textual data is considered. Indications for further development in this area are also suggested.

  16. A methodology for chemical hazards analysis at nuclear fuels reprocessing plants

    International Nuclear Information System (INIS)

    The Savannah River Laboratory employs a formal methodology for chemical hazards analysis primarily for use in the risk assessment of its nuclear fuels reprocessing plants. The methodology combines interactive matrices for reactions of available materials, fault tree analysis, human factors, and extensive data banks on the operating history of the plants. Examples illustrate the methodology and a related data bank. 5 refs., 2 figs., 4 tabs.

  17. Laser power conversion system analysis, volume 2

    Science.gov (United States)

    Jones, W. S.; Morgan, L. L.; Forsyth, J. B.; Skratt, J. P.

    1979-01-01

    The orbit-to-ground laser power conversion system analysis investigated the feasibility and cost effectiveness of converting solar energy into laser energy in space, and transmitting the laser energy to earth for conversion to electrical energy. The analysis included space laser systems with electrical outputs on the ground ranging from 100 to 10,000 MW. The space laser power system was shown to be feasible and a viable alternative to the microwave solar power satellite. The narrow laser beam provides many options and alternatives not attainable with a microwave beam.

  18. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project

    International Nuclear Information System (INIS)

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab

  19. An Analysis of the Research Methodology of the Ramirez Study.

    Science.gov (United States)

    Thomas, Wayne P.

    1992-01-01

    Analyzes the political, educational, and technical factors that strongly influenced the Ramirez study of bilingual programs. Enumerates strengths and weaknesses of the study's research methodology, along with implications for decision making in language-minority education. Summarizes defensible conclusions of the study that have not yet been…

  20. Seismic hazard analysis. A methodology for the Eastern United States

    International Nuclear Information System (INIS)

    This report presents a probabilistic approach for estimating the seismic hazard in the Central and Eastern United States. The probabilistic model (Uniform Hazard Methodology) systematically incorporates the subjective opinion of several experts in the evaluation of seismic hazard. Subjective input, assumptions and associated hazard are kept separate for each expert so as to allow review and preserve diversity of opinion. The report is organized into five sections: Introduction, Methodology Comparison, Subjective Input, Uniform Hazard Methodology (UHM), and Uniform Hazard Spectrum. Section 2, Methodology Comparison, briefly describes the present approach and compares it with other available procedures. The remainder of the report focuses on the UHM. Specifically, Section 3 describes the elicitation of subjective input; Section 4 gives details of the various mathematical models (earthquake source geometry, magnitude distribution, attenuation relationship) and how these models are combined to calculate seismic hazard. The last section, Uniform Hazard Spectrum, highlights the main features of typical results. Specific results and sensitivity analyses are not presented in this report. (author)

  1. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    Science.gov (United States)

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that although narrative inquiry as a research methodology entails free conversations and personal stories, it should not be totally free and fictional, as it has to conform to recognized standards used for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  2. Response Surface Methodology for the analysis of reactor safety: methodology, development and implementation

    International Nuclear Information System (INIS)

    Reactor safety engineering utilizes RSM techniques as a tool to aid the recovery of information from large simulation codes. The growing interest in the topic, and the need for rigorous methods in risk and reliability analysis have stimulated systematic studies which will lead to the production of an RSM handbook for safety engineering purposes. This paper deals with recent developments in the area which are reported under three main headings: a re-evaluation of the philosophy of using RSM techniques with nuclear safety codes; a comparative study of suitable response functions and experimental design procedures for use in RSM; and a preliminary discussion of multioutput code RSM analysis. The theoretical developments will be shown with reference to their practical applications

  3. Foresight Analysis at the Regional Level - A Participatory Methodological Framework

    OpenAIRE

    Anastasia Stratigea; Chrysaida – Aliki Papadopoulou

    2013-01-01

    The focus of the present paper is on the potential of participatory scenario planning as a tool for regional future studies. More specifically, a methodological framework for participatory scenario planning is presented, integrating an analytical participatory scenario planning approach (the LIPSOR model) with the Focus Groups and Future Workshop participatory tools. This framework is applied to a Greek rural region, for building scenarios and structuring policies for its future rural develop...

  4. Passive system reliability analysis using the APSRA methodology

    International Nuclear Information System (INIS)

    In this paper, we present a methodology known as APSRA (Assessment of Passive System ReliAbility) for evaluating the reliability of passive systems. The methodology has been applied to the boiling natural circulation system in the Main Heat Transport System of the Indian AHWR concept. In the APSRA methodology, the passive system reliability is evaluated from the failure probability of the system to carry out the desired function. The methodology first determines the operational characteristics of the system and the failure conditions by assigning predetermined failure criteria. The failure surface is predicted using a best estimate code, considering deviations of the operating parameters from their nominal states that affect the natural circulation performance. Since the applicability of best estimate codes to passive systems is neither proven nor sufficiently understood, APSRA relies more on experimental data for various aspects of natural circulation, such as steady-state natural circulation, flow instabilities, and CHF under oscillatory conditions. APSRA proposes to compare the code predictions with the test data to generate the uncertainties in the failure parameter prediction, which are later considered in the code for accurate prediction of the failure surface of the system. Once the failure surface of the system is predicted, the cause of failure is examined through root diagnosis; failure occurs mainly due to the failure of mechanical components. The failure probability of these components is evaluated through a classical PSA treatment using generic data. The reliability of the natural circulation system is evaluated from the probability of availability of the components required for the success of natural circulation in the system.

  5. Methodological long-term analysis of global bioenergy potential

    OpenAIRE

    Kang, Seungwoo; Selosse, Sandrine; Maïzi, Nadia

    2016-01-01

    This report presents the methodology investigated to make the representation of bioenergy resources in the long-term bottom-up optimization model TIAM-FR more suitable and relevant. Indeed, the current simplified representation cannot distinguish the different uses of each bioenergy source. Furthermore, considering the important role of global bioenergy trade in the energy system, particularly when projecting future energy systems, disaggregation of these resources appears ...

  6. Cuadernos de Autoformacion en Participacion Social: Metodologia. Volumen 2. Primera Edicion (Self-Instructional Notebooks on Social Participation: Methodology. Volume 2. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is indicated in…

  7. Task analysis of nuclear power plant control room crews. Volume 3. Task data forms

    International Nuclear Information System (INIS)

    A task analysis of nuclear power plant control room crews was performed by General Physics Corporation and BioTechnology, Inc., for the Office of Nuclear Regulatory Research. The task analysis methodology used in the project is discussed and compared to traditional task analysis and job analysis methods. The objective of the project was to conduct a crew task analysis that would provide data for evaluating six areas: (1) human engineering design of control rooms and retrofitting of current control rooms; (2) the numbers and types of control room operators needed with requisite skills and knowledge; (3) operator qualification and training requirements; (4) normal, off-normal, and emergency operating procedures; (5) job performance aids; and (6) communications. The data collection approach focused on a generic structural framework for assembling the multitude of task data that were observed. Control room crew task data were observed and recorded within the context of an operating sequence. The data collection was conducted at eight power plant sites (in simulators and/or in control rooms) by teams comprising human factors and operations personnel. Plants were sampled according to NSSS vendor, vintage, simulator availability, architect-engineer, and control room configuration. The results of the data collection effort were compiled in a computerized task database. Six demonstrations for suitability analysis were subsequently conducted, one in each of the above areas, and are described in this report. Volume 1 details the Project Approach and Methodology. Volume 2 provides the Data Results, including a description of the computerized task analysis data format. Volumes 3 and 4 present the Task Data Forms that resulted from the project and are available on a computerized database management system.

  8. Thermodynamic analysis of a Stirling engine including regenerator dead volume

    Energy Technology Data Exchange (ETDEWEB)

    Puech, Pascal; Tishkova, Victoria [Universite de Toulouse, UPS, CNRS, CEMES, 29 rue Jeanne Marvig, F-31055 Toulouse (France)

    2011-02-15

    This paper provides a theoretical investigation of the thermodynamic analysis of a Stirling engine with linear and sinusoidal variations of the volume. The regenerator in a Stirling engine is an internal heat exchanger that allows high efficiency to be reached. We used an isothermal model to analyse the net work and the heat stored in the regenerator during a complete cycle. We show that the engine efficiency with perfect regeneration does not depend on the regenerator dead volume, but this dead volume strongly amplifies the imperfect regeneration effect. An analytical expression to estimate the improvement due to the regenerator is proposed that includes the combined effects of dead volume and imperfect regeneration. This could be used at a very preliminary stage of the engine design process. (author)
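
    A minimal numerical counterpart of the isothermal model described above, assuming sinusoidal volume variations and a log-mean regenerator temperature; all parameter values are invented, and the sign of the net work depends on the phase convention chosen.

```python
import numpy as np

# Isothermal Stirling-cycle sketch (illustrative parameters, not the paper's).
# The ideal-gas mass balance over the three spaces gives the pressure:
#   p = m*R / (Vc/Tc + Vd/Td + Vh/Th)
R, m = 287.0, 1e-3                  # gas constant [J/(kg K)], gas mass [kg]
Tc, Th = 300.0, 900.0               # compression / expansion temperatures [K]
Td = (Th - Tc) / np.log(Th / Tc)    # effective dead-volume (regenerator) temperature
Vd = 2e-5                           # regenerator dead volume [m^3]

theta = np.linspace(0.0, 2.0 * np.pi, 100_000)
Vh = 5e-5 * (1.0 + np.cos(theta)) / 2.0                # hot space leads
Vc = 5e-5 * (1.0 + np.cos(theta - np.pi / 2.0)) / 2.0  # cold space lags by 90 deg

p = m * R / (Vc / Tc + Vd / Td + Vh / Th)
W = np.trapz(p, Vc + Vh + Vd)       # net indicated work per cycle [J]
print(f"net work per cycle: {W:.2f} J")
```

    Consistent with the abstract, varying Vd in this sketch changes the pressure level but not the perfect-regeneration efficiency; modelling the amplified losses would require adding a regenerator effectiveness below unity.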

  9. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    International Nuclear Information System (INIS)

    The regulatory requirement to develop an upgraded safety basis for a DOE Nuclear Facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, "Safety Basis Requirements," requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, "Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants," as a safe harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  10. Methodologies for Assessing the Cumulative Environmental Effects of Hydroelectric Development of Fish and Wildlife in the Columbia River Basin, Volume 1, Recommendations, 1987 Final Report.

    Energy Technology Data Exchange (ETDEWEB)

    Stull, Elizabeth Ann

    1987-07-01

    This volume is the first of a two-part set addressing methods for assessing the cumulative effects of hydropower development on fish and wildlife in the Columbia River Basin. Species and habitats potentially affected by cumulative impacts are identified for the basin, and the most significant effects of hydropower development are presented. Then, current methods for measuring and assessing single-project effects are reviewed, followed by a review of methodologies with potential for use in assessing the cumulative effects associated with multiple projects. Finally, two new approaches for cumulative effects assessment are discussed in detail. Overall, this report identifies and reviews the concepts, factors, and methods necessary for understanding and conducting a cumulative effects assessment in the Columbia River Basin. Volume 2 will present a detailed procedural handbook for performing a cumulative assessment using the integrated tabular methodology introduced in this volume. 308 refs., 18 figs., 10 tabs.

  11. A study on the core analysis methodology for SMART CEA ejection accident-I

    International Nuclear Information System (INIS)

    A methodology to analyze the fuel enthalpy is developed based on MASTER, a time-dependent three-dimensional core analysis code. Using the proposed methodology, the SMART CEA ejection accident is analyzed. Moreover, radiation doses are estimated at the exclusion area boundary and the low population zone to confirm that the criteria for the accident are met. (Author). 31 refs., 13 tabs., 18 figs.

  12. Human reliability analysis of Three Mile Island II accident considering THERP and ATHEANA methodologies

    International Nuclear Information System (INIS)

    The main purpose of this work is to perform a human reliability analysis using the THERP (Technique for Human Error Rate Prediction) and ATHEANA (A Technique for Human Event Analysis) methodologies, as well as their application to the development of qualitative and quantitative analyses of a nuclear power plant accident. The accident selected was the one that occurred at the Three Mile Island (TMI) Unit 2 Pressurized Water Reactor (PWR) nuclear power plant. The accident analysis revealed a series of unsafe actions that resulted in the permanent loss of the unit. This study also aims at enhancing the understanding of the THERP and ATHEANA methodologies and their possible interactions in practical applications. The TMI accident analysis pointed out the possibility of integrating the THERP and ATHEANA methodologies. In this work, the integration of both methodologies is developed in a way that allows a better understanding of the influence of the operational context on human errors. (author)

  13. Human reliability analysis of Three Mile Island II accident considering THERP and ATHEANA methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, Renato Alves; Alvarenga, Marco Antonio Bayout; Gibelli, Sonia Maria Orlando [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)]. E-mails: rfonseca@cnen.gov.br; bayout@cnen.gov.br; sonia@cnen.gov.br; Alvim, Antonio Carlos Marques; Frutuoso e Melo, Paulo Fernando Ferreira [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil)]. E-mails: Alvim@con.ufrj.br; frutuoso@con.ufrj.br

    2008-07-01

    The main purpose of this work is to perform a human reliability analysis using the THERP (Technique for Human Error Rate Prediction) and ATHEANA (A Technique for Human Event Analysis) methodologies, as well as their application to the development of qualitative and quantitative analyses of a nuclear power plant accident. The accident selected was the one that occurred at the Three Mile Island (TMI) Unit 2 Pressurized Water Reactor (PWR) nuclear power plant. The accident analysis revealed a series of unsafe actions that resulted in the permanent loss of the unit. This study also aims at enhancing the understanding of the THERP and ATHEANA methodologies and their possible interactions in practical applications. The TMI accident analysis pointed out the possibility of integrating the THERP and ATHEANA methodologies. In this work, the integration of both methodologies is developed in a way that allows a better understanding of the influence of the operational context on human errors. (author)

  14. Volume conduction effects on wavelet cross-bicoherence analysis

    International Nuclear Information System (INIS)

    Cross-bicoherence analysis is one of the important nonlinear signal processing tools used to measure quadratic phase coupling between frequencies of two different time series. It is frequently used in the diagnosis of various cognitive and neurological disorders in EEG (electroencephalography) analysis. Volume conduction effects of various uncorrelated sources present in the brain can introduce bias into the estimated values of the cross-bicoherence function. Previous studies have discussed volume conduction effects on the coherence function, which is used to measure the linear relationship between EEG signals in terms of their phase and amplitude. However, to the best of our knowledge, the effect of volume conduction on cross-bicoherence analysis, which is quite a different technique, has not been investigated until now. This study is divided into two major parts. The first part investigates the characteristics of VCUS (volume conduction effects due to uncorrelated sources) in EEG cross-bicoherence analysis, using simulated EEG data due to uncorrelated sources present in the brain. The second part investigates the effects of VCUS on the statistical analysis of the results of EEG-based cross-bicoherence analysis. The study has an important clinical application, because most studies based on EEG cross-bicoherence analysis have avoided the issue of VCUS. The cross-bicoherence analysis was performed by detecting the change in the MSCB (magnitude-squared cross-bicoherence function) between the EEG activities of change-detection and no-change-detection trials. Real EEG signals were used. (author)
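
    For reference, a common segment-averaged estimator of the magnitude-squared cross-bicoherence can be sketched as follows; the normalization is one of several conventions in the literature, and the function name and parameters are illustrative rather than the authors' implementation.

```python
import numpy as np

def cross_bicoherence(x, y, nperseg=256):
    """Magnitude-squared cross-bicoherence estimate (hedged sketch).

    b2(f1, f2) = |sum_k X_k(f1) Y_k(f2) conj(Y_k(f1+f2))|^2
                 / (sum_k |X_k(f1) Y_k(f2)|^2 * sum_k |Y_k(f1+f2)|^2)
    summed over windowed segments k; values near 1 indicate quadratic
    phase coupling between frequency f1 in x and f2 in y.
    """
    nseg = len(x) // nperseg
    win = np.hanning(nperseg)
    F = nperseg // 4              # keep f1 + f2 below the Nyquist bin
    f1 = np.arange(F)[:, None]
    f2 = np.arange(F)[None, :]
    num = np.zeros((F, F), dtype=complex)
    d1 = np.zeros((F, F))
    d2 = np.zeros((F, F))
    for k in range(nseg):
        seg = slice(k * nperseg, (k + 1) * nperseg)
        X = np.fft.fft(win * (x[seg] - np.mean(x[seg])))
        Y = np.fft.fft(win * (y[seg] - np.mean(y[seg])))
        prod = X[f1] * Y[f2]
        num += prod * np.conj(Y[f1 + f2])
        d1 += np.abs(prod) ** 2
        d2 += np.abs(Y[f1 + f2]) ** 2
    return np.abs(num) ** 2 / (d1 * d2 + 1e-30)  # small term avoids 0/0
```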

  15. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software depends on comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we have reviewed the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs pertaining to requirement engineering research and for understanding the strengths and limitations of existing requirement engineering techniques. The study also summarizes the best practices and how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. The study also highlights the importance of security requirements: although they are part of the non-functional requirements, they are naturally considered fundamental to secure software development.

  16. Seismic hazard analysis. Application of methodology, results, and sensitivity studies

    International Nuclear Information System (INIS)

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. (author)

  17. Methodology of thermal hydraulic analysis for substantiation of reactor vessel brittle fracture resistance

    International Nuclear Information System (INIS)

    A methodology of thermal-hydraulic analysis for substantiating reactor vessel brittle fracture resistance is presented in this article. This procedure was used during the PTS study for SUNPP Unit 1 and represents the generally accepted international approach.

  18. Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology

    International Nuclear Information System (INIS)

    This report, "Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology", contains a summary of the laboratory critical experiment (LCE) analyses used to support the validation of the disposal criticality analysis methodology. The objective of this report is to present a summary of the LCE analyses' results. These results demonstrate the ability of MCNP to accurately predict the critical multiplication factor (keff) for fuel with different configurations. Results from the LCE evaluations will support the development and validation of the criticality models used in the disposal criticality analysis methodology. These models and their validation have been discussed in the "Disposal Criticality Analysis Methodology Topical Report" (CRWMS M and O 1998a).

  19. Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    J. Scaglione

    1999-09-09

    This report, "Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology", contains a summary of the laboratory critical experiment (LCE) analyses used to support the validation of the disposal criticality analysis methodology. The objective of this report is to present a summary of the LCE analyses' results. These results demonstrate the ability of MCNP to accurately predict the critical multiplication factor (keff) for fuel with different configurations. Results from the LCE evaluations will support the development and validation of the criticality models used in the disposal criticality analysis methodology. These models and their validation have been discussed in the "Disposal Criticality Analysis Methodology Topical Report" (CRWMS M&O 1998a).

  20. Analysis of the chemical equilibrium of combustion at constant volume

    Directory of Open Access Journals (Sweden)

    Marius BREBENEL

    2014-04-01

    Determining the composition of a mixture of combustion gases at a given temperature is based on chemical equilibrium, with the equilibrium constants calculated on the assumption of constant pressure and temperature. In this paper, an analysis of the changes occurring when combustion takes place at constant volume is presented, deriving a specific formula for the equilibrium constant. The simple reaction of carbon combustion in pure oxygen in both cases (constant pressure and constant volume) is then considered as an example of application, observing the changes occurring in the composition of the combustion gases depending on temperature.
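
    As a minimal worked complement: for ideal gases, a constant-pressure equilibrium constant Kp relates to the concentration-based constant Kc relevant at constant volume through Kp = Kc(RT)^Δν, with Δν the change in moles of gas. The sketch below applies this conversion; the Kp value is an invented placeholder, not a figure from the paper.

```python
# Hedged sketch: convert a constant-pressure equilibrium constant Kp into
# its concentration-based counterpart Kc via Kp = Kc * (R*T)**dnu.
R = 0.082057  # gas constant in L.atm/(mol.K), for concentrations in mol/L

def kc_from_kp(kp: float, T: float, dnu: float) -> float:
    return kp / (R * T) ** dnu

# Example: the dissociation CO2 <-> CO + 1/2 O2 has dnu = +1/2
# (the Kp value below is illustrative only).
print(kc_from_kp(kp=1.0e-3, T=3000.0, dnu=0.5))
```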

  1. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),

  2. Methodologic research needs in environmental epidemiology: data analysis.

    Science.gov (United States)

    Prentice, R L; Thomas, D

    1993-01-01

    A brief review is given of data analysis methods for the identification and quantification of associations between environmental exposures and health events of interest. Data analysis methods are outlined for each of the study designs mentioned, with an emphasis on topics in need of further research. Particularly noted are the need for improved methods for accommodating exposure assessment measurement errors in analytic epidemiologic studies and for improved methods for the conduct and analysis of aggregate data (ecologic) studies. PMID:8206041

  3. A New Methodology of Spatial Cross-Correlation Analysis

    OpenAIRE

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is deri...

  4. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a Level I probabilistic safety assessment. The work covers: the correct association of distributions with parameters, the weighting and qualification of expert opinions, the generation of samples according to sample size, and the study of the relationships between system variables and the system response. A series of statistical-mathematical techniques is recommended along the development of the analysis methodology, as well as different graphical visualizations for the control of the study. (author)
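
    A hedged sketch of the sampling-based workflow the abstract outlines: assign distributions, generate samples, and study input-response relationships via rank correlations. The distributions and the toy response function are invented, not taken from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 1000
params = {
    "lambda_pump": rng.lognormal(mean=-7.0, sigma=0.8, size=n),  # assumed dist.
    "lambda_valve": rng.lognormal(mean=-8.0, sigma=0.5, size=n),
    "p_operator": rng.beta(a=2.0, b=50.0, size=n),
}
# Toy system response, standing in for the PSA model's top-event frequency.
response = params["lambda_pump"] * params["p_operator"] + params["lambda_valve"]

# Spearman rank correlations flag which inputs dominate the response.
for name, values in params.items():
    rho, _ = stats.spearmanr(values, response)
    print(f"{name:14s} Spearman rho = {rho:+.2f}")
```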

  5. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  6. Micro analysis of fringe field formed inside LDA measuring volume

    Science.gov (United States)

    Ghosh, Abhijit; Nirala, A. K.

    2016-05-01

    In the present study we propose a technique for micro analysis of the fringe field formed inside the laser Doppler anemometry (LDA) measuring volume. Detailed knowledge of the fringe field obtained by this technique allows beam quality, alignment and fringe uniformity to be evaluated with greater precision and may be helpful for selecting an appropriate optical element for LDA system operation. A complete characterization of the fringes formed at the measurement volume using conventional as well as holographic optical elements is presented. Results indicate the qualitative as well as quantitative improvement of the fringes formed at the measurement volume by holographic optical elements. Hence, the use of holographic optical elements in LDA systems may be advantageous for improving measurement accuracy.

  7. Meta-analysis: Its role in psychological methodology

    Directory of Open Access Journals (Sweden)

    Andrej Kastrin

    2008-11-01

    Meta-analysis refers to the statistical analysis of a large collection of independent observations for the purpose of integrating results. The main objectives of this article are to define meta-analysis as a method of data integration, to draw attention to some particularities of its use, and to encourage researchers to use meta-analysis in their work. The benefits of meta-analysis include more effective exploitation of existing data from independent sources and a contribution to more powerful domain knowledge. It may also serve as a support tool to generate new research hypotheses. The idea of combining results of independent studies addressing the same research question dates back to the sixteenth century. Meta-analysis was reinvented in 1976 by Glass, to refute the conclusion of an eminent colleague, Eysenck, that psychotherapy was essentially ineffective. We review some major historical landmarks of meta-analysis and its statistical background. We present in detail the concept of the effect size measure, the problem of heterogeneity, and the two models used to combine individual effect sizes (the fixed and random effects models). Two visualization techniques, forest and funnel plots, are demonstrated. We developed RMetaWeb, a simple and fast web server application to conduct meta-analysis online. RMetaWeb is the first web meta-analysis application and is completely based on the R software environment for statistical computing and graphics.
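
    The fixed-effect (inverse-variance) pooling described above can be sketched in a few lines; the effect sizes and standard errors here are made-up numbers, and Cochran's Q is included as the usual heterogeneity check that motivates switching to a random-effects model.

```python
import numpy as np

effects = np.array([0.30, 0.45, 0.12, 0.60])   # per-study effect sizes (invented)
se = np.array([0.12, 0.20, 0.15, 0.25])        # their standard errors (invented)

w = 1.0 / se**2                                # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)       # fixed-effect pooled estimate
pooled_se = np.sqrt(1.0 / np.sum(w))

# Cochran's Q: large values relative to k-1 suggest heterogeneity,
# i.e. a random-effects model may be more appropriate.
Q = np.sum(w * (effects - pooled) ** 2)
print(f"pooled effect = {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI), Q = {Q:.2f}")
```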

  8. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 1-Summary

    International Nuclear Information System (INIS)

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original "fresh" composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized-water reactors (PWR). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Isotopic densities for spent fuel assemblies in the core were calculated using the SAS2H analytical sequence in SCALE-4. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code sequence was used to extract the necessary isotopic densities from SAS2H results and to provide the data in the format required for SCALE-4 criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (keff) for the critical configuration. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for analysis of each critical configuration. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power's Surry Unit 1 reactor for the Cycle 2

  9. The XMM Cluster Survey: X-ray analysis methodology

    CERN Document Server

    Lloyd-Davies, E J; Hosmer, Mark; Mehrtens, Nicola; Davidson, Michael; Sabirli, Kivanc; Mann, Robert G; Hilton, Matt; Liddle, Andrew R; Viana, Pedro T P; Campbell, Heather C; Collins, Chris A; Dubois, E Naomi; Freeman, Peter; Hoyle, Ben; Kay, Scott T; Kuwertz, Emma; Miller, Christopher J; Nichol, Robert C; Sahlen, Martin; Stanford, S Adam; Stott, John P

    2010-01-01

    The XMM Cluster Survey (XCS) is a serendipitous search for galaxy clusters using all publicly available data in the XMM-Newton Science Archive. Its main aims are to measure cosmological parameters and trace the evolution of X-ray scaling relations. In this paper we describe the data processing methodology applied to the 5776 XMM observations used to construct the current XCS source catalogue. A total of 3669 >4σ cluster candidates with >50 background-subtracted X-ray counts are extracted from a total non-overlapping area suitable for cluster searching of 410 deg^2. Of these, 1022 candidates are detected with >300 X-ray counts, and we demonstrate that robust temperature measurements can be obtained down to this count limit. We describe in detail the automated pipelines used to perform the spectral and surface brightness fitting for these sources, as well as to estimate redshifts from the X-ray data alone. A total of 517 (126) X-ray temperatures to a typical accuracy of <40 (<10) per cent have ...

  10. Montecarlo simulation for a new high resolution elemental analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto [Universidad de La Frontera, Temuco (Chile). Facultad de Ingenieria y Administracion

    1996-12-31

    Spectra generated by binary, ternary and multielement matrices irradiated by a variable-energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate appear when the photon energy is just above the edge associated with each element, because of the emission of characteristic X rays. For a given associated energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample in a 2π solid angle, associating a single multichannel spectrometer channel with each incident energy (multichannel scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high-resolution spectroscopy methodology, for which a synchrotron would be an ideal source, due to its high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)

  11. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    Science.gov (United States)

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims at a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separate and undifferentiated waste. The methodology allows cost-efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. It supports variance analysis that can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently, and it can be implemented by companies lacking a sophisticated management accounting system. PMID:26613351
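
    A minimal sketch of the standard-cost-times-actual-quantity calculation and the resulting variance; the waste types, unit costs and quantities are invented placeholders, not figures from the study.

```python
# Standard cost per tonne by waste type (invented illustrative values, EUR).
standard_cost = {"separate": 180.0, "undifferentiated": 95.0}

def collection_cost(waste_type: str, tonnes: float) -> float:
    # Standard unit cost x actual collected quantity.
    return standard_cost[waste_type] * tonnes

actual_cost = 41_000.0  # EUR, as recorded in the accounts (invented)
budget = collection_cost("separate", 150) + collection_cost("undifferentiated", 120)
variance = actual_cost - budget  # positive variance flags off-standard performance
print(f"standard cost {budget:.0f} EUR, variance {variance:+.0f} EUR")
```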

  12. Propulsion system safety analysis methodology for commercial transport aircraft

    OpenAIRE

    Knife, S.

    1997-01-01

    Airworthiness certification of commercial transport aircraft requires a safety analysis of the propulsion system to establish that the probability of a failure jeopardising the safety of the aeroplane is acceptably low. The needs and desired features of such a propulsion system safety analysis are discussed, and current techniques and assumptions employed in such analyses are evaluated. It is concluded that current assumptions and techniques are not well suited to predicting...

  13. Methodology for statistical analysis of SENCAR mouse skin assay data.

    OpenAIRE

    Stober, J A

    1986-01-01

    Various response measures and statistical methods appropriate for the analysis of data collected in the SENCAR mouse skin assay are examined. The characteristics of the tumor response data do not readily lend themselves to the classical methods for hypothesis testing. The advantages and limitations of conventional methods of analysis and methods recommended in the literature are discussed. Several alternative response measures that were developed specifically to answer the problems inherent i...

  14. Translation model, translation analysis, translation strategy: an integrated methodology

    OpenAIRE

    VOLKOVA TATIANA A.

    2014-01-01

    The paper revisits the concepts of translation model, translation analysis, and translation strategy from an integrated perspective: a translation strategy naturally follows translation analysis performed on a given set of textual, discursive and communicative parameters that form a valid translation model. Translation modeling is reconsidered in terms of a paradigm shift and a distinction between a process-oriented (descriptive) model and an action-oriented (prescriptive) model. Following th...

  15. Methodologic research needs in environmental epidemiology: data analysis.

    OpenAIRE

    Prentice, R. L.; Thomas, D.

    1993-01-01

    A brief review is given of data analysis methods for the identification and quantification of associations between environmental exposures and health events of interest. Data analysis methods are outlined for each of the study designs mentioned, with an emphasis on topics in need of further research. Particularly noted are the need for improved methods for accommodating exposure assessment measurement errors in analytic epidemiologic studies and for improved methods for the conduct and analys...

  16. Multi-physics analysis methodologies for signal integrity

    OpenAIRE

    Jiang, L.

    2010-01-01

    This tutorial discusses two multi-physics problems involved in modern signal integrity technologies: (1) the multi-physics issue related to frequencies, providing fundamental insights into multi-scale problems and the general strategy for dealing with multi-scale simulations; and (2) multi-physics thermal-electrical coupling analysis for on-chip and packaging structures. Theoretical analysis and numerical benchmarks are both employed in the tutorial.

  17. Physical data generation methodology for return-to-power steam line break analysis

    International Nuclear Information System (INIS)

    The current methodology for generating physics data for the steam line break accident analysis of CE-type nuclear plants such as Yonggwang Unit 3 is valid only if the core does not reach criticality after shutdown. The methodology therefore requires a tremendous amount of net scram worth, especially at the end of the cycle when the moderator temperature coefficient is most negative. A new methodology is thus needed to obtain reasonably conservative physics data when the reactor returns to a power condition. The current methodology uses ROCS, which includes only a closed channel model, but it is well known that the closed channel model predicts the core reactivity to be too negative if the core flow rate is low. Therefore, a conservative methodology is presented which utilizes the open channel 3D HERMITE model. A return-to-power reactivity credit is produced to supplement the reactivity table generated by the closed channel model. Other data include the hot channel axial power shape, the peaking factor and the maximum quality for DNBR analysis, as well as the pin census for radiological consequence analysis. 48 figs., 22 tabs., 18 refs. (Author)

  18. Trading Volume and Stock Indices: A Test of Technical Analysis

    Directory of Open Access Journals (Sweden)

    Paul Abbondante

    2010-01-01

    Problem statement: Technical analysis and its emphasis on trading volume has been used to analyze movements in individual stock prices and make investment recommendations to either buy or sell that stock. Little attention has been paid to investigating the relationship between trading volume and various stock indices. Approach: Since stock indices track overall stock market movements, trends in trading volume could be used to forecast future stock market trends. Instead of focusing only on individual stocks, this study examines movements in major stock markets as a whole. Regression analysis was used to investigate the relationship between trading volume and five popular stock indices using daily data from January 2000 to June 2010. A lag of 5 days was used because this represents the prior week of trading volume. The total sample size ranges from 1,534 to 2,638 observations. Smaller samples were used to test which investment horizon explains movements of the indices most completely. Results: The F statistics were significant for samples using 6 and 16 months of data. The F statistic was not significant using a sample of 1 month of data. This is surprising given the short-term focus of technical analysis. The results indicate that above-average returns can be achieved using futures, options and exchange traded funds which track these indices. Conclusion: Future research efforts will include out-of-sample forecasting to determine if above-average returns can be achieved. Additional research can be conducted to determine the optimal number of lags for each index.
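
    The study's regression design, each day's index value on the previous five days of trading volume, might be sketched as follows; the series are synthetic stand-ins, not the actual index and volume data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2638
volume = rng.lognormal(mean=20.0, sigma=0.3, size=n)   # synthetic daily volume
index = 1000.0 + np.cumsum(rng.normal(size=n))         # synthetic index level

# Regress index_t on volume_{t-1} ... volume_{t-5} (the prior trading week).
lags = 5
X = np.column_stack([volume[lags - k - 1 : n - k - 1] for k in range(lags)])
X = np.column_stack([np.ones(len(X)), X])              # intercept + 5 lags
y = index[lags:]

beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
print("intercept and lag coefficients:", np.round(beta, 4))
```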

  19. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operation points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs

  20. Principal component analysis based methodology to distinguish protein SERS spectra

    Science.gov (United States)

    Das, G.; Gentile, F.; Coluccio, M. L.; Perri, A. M.; Nicastri, A.; Mecarini, F.; Cojoc, G.; Candeloro, P.; Liberale, C.; De Angelis, F.; Di Fabrizio, E.

    2011-05-01

    Surface-enhanced Raman scattering (SERS) substrates were fabricated using electroplating and e-beam lithography techniques. Nanostructures were obtained comprising regular arrays of gold nanoaggregates with a diameter of 80 nm and a mutual distance between the aggregates (gap) ranging from 10 to 30 nm. The nanopatterned SERS substrate enabled better control and reproducibility of the generation of plasmon polaritons (PPs). SERS measurements were performed for various proteins, namely bovine serum albumin (BSA), myoglobin, ferritin, lysozyme, RNase-B, α-casein, α-lactalbumin and trypsin. Principal component analysis (PCA) was used to organize and classify the proteins on the basis of their secondary structure. Cluster analysis showed that the classification error was about 14%. The paper clearly shows that the combined use of SERS measurements and PCA analysis is effective in categorizing proteins on the basis of secondary structure.
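
    A minimal sketch of the PCA step, assuming spectra arrive as a samples-by-wavenumber matrix; random placeholders stand in for the measured protein spectra.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
spectra = rng.normal(size=(80, 1024))   # 80 spectra x 1024 Raman-shift bins (placeholder)

X = StandardScaler().fit_transform(spectra)
pca = PCA(n_components=3)
scores = pca.fit_transform(X)           # each spectrum reduced to 3 PC scores

# The PC scores feed a downstream cluster analysis; the explained variance
# shows how much spectral variation the first components capture.
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
print("first spectrum scores:", np.round(scores[0], 2))
```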

  1. A simplified methodology for nuclear waste repository thermal analysis

    International Nuclear Information System (INIS)

    A simplified model for repository thermal analysis is presented in this paper. The proposed model provides a general capability to efficiently calculate the time-dependent temperature field in a geologic repository. The model analyzes both horizontal and vertical emplacement of nuclear waste packages. Verification of the code was performed by comparison with standard models based on detailed numerical methods. The new model's utility was demonstrated through a case study in which a large number of repository-scale thermal analysis calculations is needed.

  2. Uncertainty analysis on probabilistic fracture mechanics assessment methodology

    International Nuclear Information System (INIS)

    Fracture mechanics has found profound usage in the design of components and in assessing the fitness for purpose and residual life of operating components. Since defect sizes and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability. Monte Carlo simulation is one such procedure for analyzing fracture probability. This paper deals with uncertainty analysis using Monte Carlo simulation methods. These methods were developed based on the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of this method is illustrated with a case study. (author)
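
    The Monte Carlo procedure for fracture probability can be sketched as below, assuming a simple through-crack stress-intensity model; the distributions, stress level and K formula are illustrative assumptions, not the paper's R6-based implementation.

```python
import numpy as np

# Sample statistically distributed defect size and fracture toughness, then
# count the fraction of samples whose stress intensity exceeds the toughness.
rng = np.random.default_rng(7)
n = 200_000
a = rng.exponential(scale=8e-3, size=n)            # defect depth [m] (assumed)
Kic = rng.normal(loc=120.0, scale=15.0, size=n)    # toughness [MPa sqrt(m)] (assumed)
sigma = 350.0                                      # applied stress [MPa] (assumed)

K = sigma * np.sqrt(np.pi * a)                     # K for a simple crack model
p_fail = np.mean(K > Kic)                          # failure when K exceeds Kic
print(f"estimated fracture probability: {p_fail:.2e}")
```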

  3. Setting events in applied behavior analysis: Toward a conceptual and methodological expansion

    OpenAIRE

    Wahler, Robert G.; Fox, James J.

    1981-01-01

    The contributions of applied behavior analysis as a natural science approach to the study of human behavior are acknowledged. However, it is also argued that applied behavior analysis has provided limited access to the full range of environmental events that influence socially significant behavior. Recent changes in applied behavior analysis to include analysis of side effects and social validation represent ways in which the traditional applied behavior analysis conceptual and methodological...

  4. Vulnerability analysis of interdependent infrastructure systems: A methodological framework

    Science.gov (United States)

    Wang, Shuliang; Hong, Liu; Chen, Xueguang

    2012-06-01

    Infrastructure systems such as power and water supplies form the cornerstone of modern society and are essential for the functioning of a society and its economy. They are becoming more and more interconnected and interdependent with the development of science, technology and the economy. Risk and vulnerability analysis of interdependent infrastructures for security considerations has become an important subject, and some achievements have been made in this area. Since different infrastructure systems have different structural and functional properties, there is no universal all-encompassing 'silver bullet solution' to the problem of analyzing the vulnerability associated with interdependent infrastructure systems, so a framework of analysis is required. This paper takes the power and water systems of a major city in China as an example and develops a framework for the analysis of the vulnerability of interdependent infrastructure systems. Four interface design strategies based on distance, betweenness, degree, and clustering coefficient are constructed. Then two types of vulnerability (long-term vulnerability and focused vulnerability) are illustrated and analyzed. Finally, a method for ranking critical components in interdependent infrastructures is given for protection purposes. It is concluded that the framework proposed here is useful for vulnerability analysis of interdependent systems and will be helpful for system owners making decisions on infrastructure design and protection.
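
    A hedged sketch of the component-ranking step, using the same graph metrics the abstract names (degree, betweenness, clustering coefficient); the synthetic graph stands in for real power/water network data.

```python
import networkx as nx

# Synthetic scale-free graph as a stand-in for an infrastructure network.
G = nx.barabasi_albert_graph(n=50, m=2, seed=3)

degree = dict(G.degree())
betweenness = nx.betweenness_centrality(G)
clustering = nx.clustering(G)

# Rank candidates for protection by betweenness, one of the metrics behind
# the four interface design strategies named in the abstract.
critical = sorted(betweenness, key=betweenness.get, reverse=True)[:5]
for node in critical:
    print(node, degree[node],
          round(betweenness[node], 3), round(clustering[node], 2))
```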

  5. Methodological developments and applications of neutron activation analysis

    Czech Academy of Sciences Publication Activity Database

    Kučera, Jan

    2007-01-01

    Vol. 273, No. 2 (2007), pp. 273-280. ISSN 0236-5731 Institutional research plan: CEZ:AV0Z10480505 Keywords: neutron activation analysis * radiochemical separation Subject RIV: BE - Theoretical Physics Impact factor: 0.499, year: 2007

  6. Comparative proteomic analysis of human pancreatic juice: Methodological study

    NARCIS (Netherlands)

    Zhou, Lu; Lu, Z.H.; Yang, A.M.; Deng, R.X.; Mai, C.R.; Sang, X.T.; Faber, Klaas Nico; Lu, X.H.

    2007-01-01

    Pancreatic cancer is the most lethal of all the common malignancies. Markers for early detection of this disease are urgently needed. Here, we optimized and applied a proteome analysis of human pancreatic juice to identify biomarkers for pancreatic cancer. Pancreatic juice samples, devoid of blood o

  7. Comparative proteomic analysis of human pancreatic juice : Methodological study

    NARCIS (Netherlands)

    Zhou, Lu; Lu, ZhaoHui; Yang, AiMing; Deng, RuiXue; Mai, CanRong; Sang, XinTing; Faber, Klaas Nico; Lu, XingHua

    2007-01-01

    Pancreatic cancer is the most lethal of all the common malignancies. Markers for early detection of this disease are urgently needed. Here, we optimized and applied a proteome analysis of human pancreatic juice to identify biomarkers for pancreatic cancer. Pancreatic juice samples, devoid of blood o

  8. Analysis of the chemical equilibrium of combustion at constant volume

    OpenAIRE

    Marius BREBENEL

    2014-01-01

    Determining the composition of a mixture of combustion gases at a given temperature is based on chemical equilibrium, with the equilibrium constants calculated on the assumption of constant pressure and temperature. In this paper, an analysis of the changes occurring when combustion takes place at constant volume is presented, deriving a specific formula for the equilibrium constant. The simple reaction of carbon combustion in pure oxygen in both cases (constant pressure and constant ...

  9. Graphic analysis of flow-volume curves: a pilot study

    OpenAIRE

    Lee, Jungsil; Lee, Choon-Taek; Lee, Jae Ho; Cho, Young-Jae; Park, Jong Sun; Oh, Yeon-Mok; Lee, Sang-Do; Yoon, Ho Il; ,

    2016-01-01

    Background Conventional spirometric parameters have shown poor correlation with symptoms and health status in chronic obstructive pulmonary disease (COPD). While it is well known that the pattern of the expiratory flow-volume curve (EFVC) represents ventilatory dysfunction, few attempts have been made to derive quantitative parameters by analyzing the curve. In this study, we aimed to derive useful parameters from the EFVC via graphic analysis and tried to validate them in patients with COPD. ...

  10. Criteria for the development and use of the methodology for environmentally-acceptable fossil energy site evaluation and selection. Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Eckstein, L.; Northrop, G.; Scott, R.

    1980-02-01

    This report serves as a companion document to the report, Volume 1: Environmentally-Acceptable Fossil Energy Site Evaluation and Selection: Methodology and Users Guide, in which a methodology was developed that allows the siting of fossil fuel conversion facilities in areas with the least environmental impact. The methodology, known as SELECS (Site Evaluation for Energy Conversion Systems), does not replace a site-specific environmental assessment or an environmental impact statement (EIS), but it does enhance the value of an EIS by thinning the number of options down to a manageable level, by doing so in an objective, open, and selective manner, and by providing preliminary assessments and procedures that can be utilized during the research and writing of the actual impact statement.

  11. Environmental analysis applied to schools. Methodologies for data acquisition

    International Nuclear Information System (INIS)

    Environmental analysis is the basis of environmental management for organizations and is considered the first step in EMAS. It allows an organization to identify and address environmental issues and to gain a clear picture of its environmental performance. Schools are among such organizations. Nevertheless, the complexity of environmental issues and applicable regulations makes it very difficult for a school that wants to implement an environmental management system (EMAS, ISO 14001, etc.) to take this first step. An instrument has therefore been defined that is simple yet complete and consistent with the reference standards, allowing schools to shape their own process for preparing the initial environmental review. This instrument essentially consists of cards that, when completed, facilitate the drafting of the environmental analysis report

  12. Methodological reflections on gesture analysis in SLA and bilingualism research

    OpenAIRE

    Gullberg, Marianne

    2010-01-01

    Gestures, i.e. the symbolic movements that speakers perform while they speak, form a closely interconnected system with speech, where gestures serve both addressee-directed (‘communicative’) and speaker-directed (‘internal’) functions. This article aims (1) to show that a combined analysis of gesture and speech offers new ways to address theoretical issues in second language acquisition (SLA) and bilingualism studies, probing SLA and bilingualism as product and process; and (2) to outline som...

  13. METHODOLOGY FOR ANALYSIS OF DECISION MAKING IN AIR NAVIGATION SYSTEM

    OpenAIRE

    Kharchenko, Volodymyr; Shmelova, Tetyana; Sikirda, Yuliya

    2011-01-01

    Abstract. In the research of the Air Navigation System as a complex socio-technical system, the methodology of analysis of the human operator's decision-making has been developed. The significance of individual psychological factors as well as the impact of socio-psychological factors on the professional activities of a human operator during flight situation development from normal to catastrophic were analyzed. On the basis of the reflexive theory of bipolar choice, the expected risks of decision-makin...

  14. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    International Nuclear Information System (INIS)

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)

  15. Towards a Domain Analysis Methodology for Collaborative Filtering

    OpenAIRE

    RAFTER, RACHAEL; Smyth, Barry

    2001-01-01

    Collaborative filtering has the ability to make personalised information recommendations in the absence of rich content meta-data, relying instead on correlations between the preferences of similar users. However, it depends largely on there being sufficient overlap between the profiles of similar users, and its accuracy is compromised in sparse domains with little profile overlap. We describe an extensive analysis that investigates key domain characteristics that are vital to colla...

  16. Scenario analysis in spatial impact assessment: a methodological approach

    OpenAIRE

    Torrieri, F.; Nijkamp, P.

    2009-01-01

    This paper introduces the concept of Spatial or Territorial Impact Assessment as a new tool for balanced urban or regional planning from a long-term sustainability perspective. It then argues that modern scenario methods may be a useful complement to pro-active and future oriented urban or regional strategic thinking. A cognitive interactive model for scenario analysis is next presented and its advantages are outlined.

  17. QUANTITATIVE METHODOLOGY FOR STABILITY ANALYSIS OF NONLINEAR ROTOR SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    ZHENG Hui-ping; XUE Yu-sheng; CHEN Yu-shu

    2005-01-01

    Rotor-bearing systems, widely applied in industry, are nonlinear dynamic systems with multiple degrees of freedom. Modern concepts of design and maintenance call for quantitative stability analysis. Using a trajectory-based, stability-preserving dimensional reduction, a quantitative stability analysis method for rotor systems is presented. First, an n-dimensional nonlinear non-autonomous rotor system is decoupled into n subsystems after numerical integration. Each of them has only one degree of freedom and contains time-varying parameters to represent all other state variables. In this way, the n-dimensional trajectory is mapped into a set of one-dimensional trajectories. The dynamic central point (DCP) of a subsystem is then defined on the extended phase plane, namely, the force-position plane. Characteristics of curves on the extended phase plane and the DCP's kinetic energy difference sequence for general motion in rotor systems are studied. The corresponding stability margins of the trajectory are evaluated quantitatively. By means of the margin and its sensitivity analysis, the critical parameters of the period-doubling bifurcation and the Hopf bifurcation in a flexible rotor supported by two short journal bearings with nonlinear suspension are determined.

  18. Development of VHTR Core Analysis and Verification Methodology

    International Nuclear Information System (INIS)

    The primary objective of this project is to develop a three-dimensional cylindrical geometry code to analyze PBRs using the Analytic Function Expansion Nodal (AFEN) method; the second objective is to produce the numerical data and to verify the deterministic code for commercial PBR and prism reactor cores using the Monte Carlo method. We developed the TOPS code and verified its validity with various benchmark problems for steady-state and transient conditions. Considering the pebble flow and temperature distribution within the core, the core analysis for a commercial pebble-type reactor was carried out using the Monte Carlo method with spatially dependent Dancoff factors, and the results were also evaluated with the Monte Carlo method. In addition, the decay chain model was optimized, and multi-group cross-section processing for DeCART using McCARD was implemented to treat the double heterogeneity effect. The TOPS code can be used in VHTR design and reactor core characteristics evaluation, and the Monte Carlo core analysis results can be used for verification of the deterministic code. Furthermore, it is expected that the analysis method can be incorporated into the deterministic code

  19. Development of VHTR Core Analysis and Verification Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Han Gyu; Kim, Chang Hyo; Park, Ho Jin [Seoul National University, Seoul (Korea, Republic of)] (and others)

    2009-03-15

    The primary objective of this project is to develop a three-dimensional cylindrical geometry code to analyze PBRs using the Analytic Function Expansion Nodal (AFEN) method; the second objective is to produce the numerical data and to verify the deterministic code for commercial PBR and prism reactor cores using the Monte Carlo method. We developed the TOPS code and verified its validity with various benchmark problems for steady-state and transient conditions. Considering the pebble flow and temperature distribution within the core, the core analysis for a commercial pebble-type reactor was carried out using the Monte Carlo method with spatially dependent Dancoff factors, and the results were also evaluated with the Monte Carlo method. In addition, the decay chain model was optimized, and multi-group cross-section processing for DeCART using McCARD was implemented to treat the double heterogeneity effect. The TOPS code can be used in VHTR design and reactor core characteristics evaluation, and the Monte Carlo core analysis results can be used for verification of the deterministic code. Furthermore, it is expected that the analysis method can be incorporated into the deterministic code.

  20. Methodology Improvement of Reactor Physics Codes for CANDU Channels Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myung Hyun; Choi, Geun Suk; Win, Naing; Aung, Tharndaing; Baek, Min Ho; Lim, Jae Yong [Kyunghee University, Seoul (Korea, Republic of)

    2010-04-15

    As operational time increases, pressure tubes and calandria tubes in a CANDU core inevitably undergo geometric deformation along the tube length. A pressure tube may sag downward within a calandria tube due to irradiation-induced creep. This can seriously compromise the integrity of the pressure tube, so measurement of the deflection state of in-service pressure tubes is very important for the safety of a CANDU reactor. In this paper, the impacts of fuel channel deformation on nuclear characteristics were evaluated in order to improve nuclear design tools with respect to the local effects of abnormal deformations. A sagged pressure tube is known to displace the fuel bundles eccentrically within the pressure tube by up to 0.6 cm. In this case, adverse pin power distribution and reactivity balance can affect reactor safety under normal and accident conditions. Thermal and radiation-induced creep in a pressure tube also expands the tube, by up to 5% in volume. In that case, the additional coolant provides more moderation in the deformed channel, resulting in an increase of reactivity. Sagging of the pressure tube did not cause considerable change in K-inf values; however, expansion of the pressure tube made a relatively large change in K-inf. Modeling the eccentric and enlarged configuration is not easy when preparing input geometry in both HELIOS and MCNP, and there is no way to consider this deformation in a one-dimensional homogenization tool such as the WIMS code. The suggested way of handling the deformation is to correct for the expansion effect by adjusting the number density of the coolant: the number density of the heavy water coolant is increased as the expansion rate increases. This correction was done in the intact channel without changing geometry, and it was found to be very effective in the prediction of K-inf values. In this study, further

  1. Analysis of the link between a definition of sustainability and the life cycle methodologies

    DEFF Research Database (Denmark)

    Jørgensen, Andreas; Herrmann, Ivan Tengbjerg; Bjørn, Anders

    2013-01-01

    presented and detailed to a level enabling an analysis of the relation to the impact categories at midpoint level considered in life cycle (LC) methodologies.The interpretation of the definition of sustainability as outlined in Our Common Future (WCED 1987) suggests that the assessment of a product...... large extent considered in any of the LC methodologies. Furthermore, because of the present level of knowledge about what creates and destroys social capital, it is difficult to assess how it relates to the LC methodologies. It is also found that the LCC is only relevant in the context of a life cycle...

  2. Functional Unfold Principal Component Regression Methodology for Analysis of Industrial Batch Process Data

    DEFF Research Database (Denmark)

    Mears, Lisa; Nørregaard, Rasmus; Sin, Gürkan;

    2016-01-01

    . It is shown that application of functional data analysis and the choice of variance scaling method have the greatest impact on the prediction accuracy. Considering the vast amount of batch process data continuously generated in industry, this methodology can potentially contribute as a tool to identify......This work proposes a methodology utilizing functional unfold principal component regression (FUPCR), for application to industrial batch process data as a process modeling and optimization tool. The methodology is applied to an industrial fermentation dataset, containing 30 batches of a production...

  3. CRITICAL ANALYSIS OF THE EXTREME PROGRAMMING (XP) PROJECT MANAGEMENT METHODOLOGY IN THE INFORMATION TECHNOLOGY FIELD

    OpenAIRE

    Ionel NĂFTĂNĂILĂ; Ivona ORZEA

    2009-01-01

    Extreme Programming is a modern project management methodology, part of the Agile family of methodologies. The present paper offers a critical analysis of Extreme Programming (XP) in terms of the advantages and disadvantages it implies, from both a theoretical and a practical standpoint. On the theoretical side, the paper presents the main contributions in the Extreme Programming literature, analyzing at the same time the main characteristics ...

  4. The Phenomenological Life-World Analysis and the Methodology of the Social Sciences

    OpenAIRE

    Eberle, Thomas S.

    2010-01-01

    This Alfred Schutz Memorial Lecture discusses the relationship between the phenomenological life-world analysis and the methodology of the social sciences, which was the central motive of Schutz's work. I have set two major goals in this lecture. The first is to scrutinize the postulate of adequacy, as this postulate is the most crucial of Schutz's methodological postulates. Max Weber devised the postulate of 'adequacy of meaning' in analogy to the postulate of 'causal adequacy' (a concept used ...

  5. Using cost – volume – profit analysis by management

    OpenAIRE

    Trifan, A.; Anton, C. E.

    2011-01-01

    Founded on the distinction between variable costs and fixed costs, the analysis of the relationship between the volume of activity, costs, and profit is directed at decision-making, guiding an entity's management toward optimal results. It is known that the models describing how expenses develop at an entity's level form the basis of cost analysis. Then, given that forecasting requires taking into account fluctuations in activity, the grouping of ex...

  6. Using cost – volume – profit analysis by management

    Directory of Open Access Journals (Sweden)

    Trifan, A.

    2011-01-01

    Full Text Available Founded on the distinction between variable costs and fixed costs, the analysis of the relationship between the volume of activity, costs, and profit is directed at decision-making, guiding an entity's management toward optimal results. It is known that the models describing how expenses develop at an entity's level form the basis of cost analysis. Then, given that forecasting requires taking into account fluctuations in activity, the grouping of expenses into variable and fixed will be used for forecasting management, for evaluating an entity's performance, and for analyzing decision alternatives.
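
    As a minimal sketch of the standard cost-volume-profit relations this kind of analysis rests on (all figures below are illustrative, not from the article):

```python
# Minimal cost-volume-profit (CVP) sketch: break-even volume and profit at
# a given activity level. All figures are illustrative.

def break_even_volume(fixed_costs: float, price: float, variable_cost: float) -> float:
    """Volume at which total revenue exactly covers total costs."""
    contribution_margin = price - variable_cost  # per unit
    return fixed_costs / contribution_margin

def profit(volume: float, fixed_costs: float, price: float, variable_cost: float) -> float:
    """Profit = contribution margin * volume - fixed costs."""
    return (price - variable_cost) * volume - fixed_costs

if __name__ == "__main__":
    fc, p, vc = 50_000.0, 25.0, 15.0
    bev = break_even_volume(fc, p, vc)  # 5000 units
    print(f"break-even volume: {bev:.0f} units")
    print(f"profit at 7000 units: {profit(7000, fc, p, vc):.0f}")
```

    The break-even point is where the contribution margin (price minus variable cost per unit) exactly recovers the fixed costs; decision alternatives are then evaluated by how far planned volume sits above or below it.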

  7. Methodological Approach to the Energy Analysis of Unconstrained Historical Buildings

    Directory of Open Access Journals (Sweden)

    Chiara Burattini

    2015-08-01

    Full Text Available The EU goal of quasi-zero energy buildings is not easy to reach for a country like Italy, as it holds a large number of UNESCO sites, many of which are entire historical old towns. This paper focuses on improving the energy performance of historical Italian architecture through simple interventions that respect the building without changing its shape and structure. The work starts from an energy analysis of a building located in the historic center of Tivoli, a town close to Rome. The analysis follows the recommendations of UNI TS 11300-Part 1, which indicates how to evaluate energy consumption. The calculations were performed only on the building envelope, based on passive solutions and alternatives. Four passive strategies were examined and applied based on the location of the building and the non-alteration of the structure and the landscape. The results impacted positively on the energy performance of the building: the annual energy saving reached a maximum of 25%. This work shows how it is possible to improve the energy performance of an existing building, achieving significant energy savings while respecting the building's architecture, shape, function, and the surrounding landscape.

  8. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

    Full Text Available This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potentially vulnerable elements, by the estimation of the expected physical effects due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

    A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities. This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated in 8 913 000 €; on the contrary, indirect

  9. Landslide risk analysis: a multi-disciplinary methodological approach

    Science.gov (United States)

    Sterlacchini, S.; Frigerio, S.; Giacomelli, P.; Brambilla, M.

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potentially vulnerable elements, by the estimation of the expected physical effects due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis. A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated at 8 913 000 €; on the contrary, indirect damage ranged considerably

  10. An ATWS Analysis with a Realistic Evaluation Methodology

    International Nuclear Information System (INIS)

    Anticipated Transients Without Scram (ATWS) occur when all control and shutdown assemblies fail to insert into the core following an automatic reactor trip. The major concern with ATWS derives from the consequences of the high primary system pressure that is characteristic of these transients. According to Section 2.4 of the YVL guides, the Finnish regulations for the safety of nuclear power plants (NPPs), the acceptance criterion for the ATWS analysis is that the pressure of the protected item not exceed 1.3 times the design pressure. The main purpose of this paper is to make a preliminary assessment of the impact of this criterion on the APR1400 in the European regulatory environment by applying the European Utility Requirements (EUR) for Light Water Reactor Nuclear Power Plants

  11. The Murchison Widefield Array 21 cm Power Spectrum Analysis Methodology

    CERN Document Server

    Jacobs, Daniel C; Trott, C M; Dillon, Joshua S; Pindor, B; Sullivan, I S; Pober, J C; Barry, N; Beardsley, A P; Bernardi, G; Bowman, Judd D; Briggs, F; Cappallo, R J; Carroll, P; Corey, B E; de Oliveira-Costa, A; Emrich, D; Ewall-Wice, A; Feng, L; Gaensler, B M; Goeke, R; Greenhill, L J; Hewitt, J N; Hurley-Walker, N; Johnston-Hollitt, M; Kaplan, D L; Kasper, J C; Kim, H S; Kratzenberg, E; Lenc, E; Line, J; Loeb, A; Lonsdale, C J; Lynch, M J; McKinley, B; McWhirter, S R; Mitchell, D A; Morales, M F; Morgan, E; Neben, A R; Thyagarajan, N; Oberoi, D; Offringa, A R; Ord, S M; Paul, S; Prabu, T; Procopio, P; Riding, J; Rogers, A E E; Roshi, A; Shankar, N Udaya; Sethi, Shiv K; Srivani, K S; Subrahmanyan, R; Tegmark, M; Tingay, S J; Waterson, M; Wayth, R B; Webster, R L; Whitney, A R; Williams, A; Williams, C L; Wu, C; Wyithe, J S B

    2016-01-01

    We present the 21 cm power spectrum analysis approach of the Murchison Widefield Array Epoch of Reionization project. In this paper, we compare the outputs of multiple pipelines for the purpose of validating statistical limits on cosmological hydrogen at redshifts between 6 and 12. Multiple, independent data calibration and reduction pipelines are used to set power spectrum limits on a fiducial night of data. Comparing the outputs of the imaging and power spectrum stages highlights differences in calibration, foreground subtraction, and power spectrum calculation. The power spectra found using these different methods span a space defined by the various tradeoffs between speed, accuracy, and systematic control. Lessons learned from comparing the pipelines range from the algorithmic to the prosaically mundane; all demonstrate the many pitfalls of neglecting reproducibility. We briefly discuss the way these different methods attempt to handle the question of evaluating a significant detection in the presence of foregr...

  12. UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg; Craig Rieger

    2013-07-01

    It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution, and several researchers are making headway on this problem. However, the inability to easily determine the magnitude of a building's effective thermal mass, and how the heating, ventilation, and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems that utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
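
    A minimal sketch of the underlying idea, assuming a lumped two-node RC thermal model (all parameter values hypothetical, not the authors' actual building model): the eigendecomposition of the state matrix yields the modes and time constants of energy exchange between the thermal masses.

```python
# Sketch: modal analysis of a lumped-parameter (RC) building thermal model.
# Two thermal masses (air node and structure node) coupled to each other and
# to the ambient; all parameter values are hypothetical.
import numpy as np

C1, C2 = 5e5, 5e7        # heat capacities [J/K]: air node, structure node
R12, R1a = 0.002, 0.01   # thermal resistances [K/W]: air-structure, air-ambient

# State matrix for dT/dt = A @ T (ambient held at zero for the mode analysis)
A = np.array([
    [-(1/R12 + 1/R1a) / C1,  (1/R12) / C1],
    [ (1/R12) / C2,         -(1/R12) / C2],
])

eigvals, eigvecs = np.linalg.eig(A)
time_constants = -1.0 / eigvals.real  # seconds; one fast mode, one slow mode
for tau, vec in zip(time_constants, eigvecs.T):
    print(f"mode: tau = {tau / 3600:.2f} h, shape =", vec)
```

    In this toy model the fast mode is dominated by the light air node and the slow mode by the massive structure; it is this separation of time constants that makes building mass usable as distributed storage.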

  13. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  14. Analysis methodology and recent results of the IGS network combination

    Science.gov (United States)

    Ferland, R.; Kouba, J.; Hutchison, D.

    2000-11-01

    A working group of the International GPS Service (IGS) was created to look after Reference Frame (RF) issues and contribute to the densification and improvement of the International Terrestrial Reference Frame (ITRF). One important objective of the Reference Frame Working Group is to generate consistent IGS station coordinates and velocities, Earth Rotation Parameters (ERP) and geocenter estimates along with the appropriate covariance information. These parameters have a direct impact on other IGS products such as the estimation of GPS satellite ephemerides, as well as satellite and station clocks. The information required is available weekly from the Analysis Centers (AC) (cod, emr, esa, gfz, jpl, ngs, sio) and from the Global Network Associate Analysis Centers (GNAAC) (JPL, mit, ncl) using a "Software Independent Exchange Format" (SINEX). The AC are also contributing daily ERPs as part of their weekly submission. The procedure in place simultaneously combines the weekly station coordinates, geocenter and daily ERP estimates. A cumulative solution containing station coordinates and velocity is also updated with each weekly combination. This provides a convenient way to closely monitor the quality of the estimated station coordinates and to have an up-to-date cumulative solution available at all times. To provide some necessary redundancy, the weekly station coordinates solution is compared against the GNAAC solutions. Each of the three GNAACs uses its own software, allowing independent verification of the combination process. The RMS of the coordinate differences in the north, east and up components between the AC/GNAAC and the ITRF97 Reference Frame Stations are 4-10 mm, 5-20 mm and 6-25 mm. The station velocities within continental plates are compared to the NNR-NUVEL1A plate motion model (DeMets et al., 1994). The north, east and up velocity RMS are 2 mm/y, 3 mm/y and 8 mm/y. Note that NNR-NUVEL1A assumes a zero vertical velocity.

  15. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets

    OpenAIRE

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-01-01

    Purpose: With the emergence of clinical outcomes databases as tools routinely utilized within institutions comes the need for software tools that support automated statistical analysis of these large data sets and intra-institutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that address both issues.

  16. Parallel runway requirement analysis study. Volume 2: Simulation manual

    Science.gov (United States)

    Ebrahimi, Yaghoob S.; Chun, Ken S.

    1993-01-01

    This document is a user manual for operating the PLAND_BLUNDER (PLB) simulation program. The simulation is based on two aircraft approaching parallel runways independently, using parallel Instrument Landing System (ILS) equipment during Instrument Meteorological Conditions (IMC). If an aircraft deviates from its assigned localizer course toward the opposite runway, this constitutes a blunder that could endanger the aircraft on the adjacent path. The worst-case scenario is a blundering aircraft that is unable to recover and continues toward the adjacent runway. PLAND_BLUNDER is a Monte Carlo-type simulation that models the events and aircraft positioning during such a blunder situation. The model simulates two aircraft performing parallel ILS approaches using Instrument Flight Rules (IFR) or visual procedures. PLB uses a simple movement model and control law in three dimensions (X, Y, Z). The parameters of the simulation inputs and outputs are defined in this document along with a sample of the statistical analysis. This document is the second volume of a two-volume set; Volume 1 describes the application of the PLB to the analysis of close parallel runway operations.
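
    The PLB model itself is not reproduced here; the following is a minimal Monte Carlo sketch of the general idea, with entirely hypothetical geometry, dynamics, and parameter values: sample blunder trajectories and estimate how often the miss distance to the aircraft on the adjacent approach falls below a safety threshold.

```python
# Minimal Monte Carlo sketch of a parallel-approach blunder study.
# All geometry, dynamics, and parameter values are hypothetical; the real
# PLAND_BLUNDER model uses a 3-D movement model and control law.
import math
import random

RUNWAY_SEPARATION_M = 1035.0  # lateral spacing between parallel approaches
SAFETY_THRESHOLD_M = 150.0    # miss distance considered a conflict
N_TRIALS = 100_000

def miss_distance() -> float:
    """One trial: a blunderer turns toward the adjacent approach path."""
    heading_err = math.radians(random.gauss(30.0, 5.0))  # blunder angle
    reaction_s = max(random.gauss(8.0, 2.0), 0.0)        # ATC + pilot delay
    speed_ms = random.gauss(70.0, 5.0)                   # approach speed
    # Lateral distance closed before the blunder is arrested:
    closed = reaction_s * speed_ms * math.sin(heading_err)
    return max(RUNWAY_SEPARATION_M - closed, 0.0)

conflicts = sum(miss_distance() < SAFETY_THRESHOLD_M for _ in range(N_TRIALS))
print(f"P(conflict | blunder) ~ {conflicts / N_TRIALS:.5f}")
```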

  17. Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1

    Science.gov (United States)

    1985-01-01

    The primary objective of Task 3 is to provide additional analysis and insight necessary to support key design/programmatic decisions for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) the conduct of tradeoff and sensitivity analyses; and (4) the review/verification of results within the context of the evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating systems, software configuration management, and the software development environment facility.

  18. A new methodology for the CFD uncertainty analysis

    Institute of Scientific and Technical Information of China (English)

    YAO Zhen-qiu; SHEN Hong-cui; GAO Hui

    2013-01-01

    With respect to measurement uncertainty, this paper discusses the definition, sources, classification, and expressions of CFD uncertainty. Based on orthogonal design and statistical inference theory, a new verification and validation method and the related procedures for CFD simulation are developed. With this method, two examples of CFD verification and validation are studied for the drag coefficient and the nominal wake fraction, and the calculation factors, and their interactions, that significantly affect the simulation results are obtained. Moreover, the sizes of all uncertainty components resulting from the controlled and uncontrolled calculation factors are determined, and the optimal combination of calculation factors is obtained by effect estimation in the orthogonal experiment design. It is shown that the new method can be used for verification in CFD uncertainty analysis and can reasonably and definitively judge the credibility of the simulation result. For the CFD simulation of the drag coefficient and the nominal wake fraction, the predicted results can be validated. Although there is still some difference between the simulation results and the experimental results, their approximate level and credibility can be accepted.

  19. Energy minimization in medical image analysis: Methodologies and applications.

    Science.gov (United States)

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous method and discrete method. The former includes Newton-Raphson method, gradient descent method, conjugate gradient method, proximal gradient method, coordinate descent method, and genetic algorithm-based method, while the latter covers graph cuts method, belief propagation method, tree-reweighted message passing method, linear programming method, maximum margin learning method, simulated annealing method, and iterated conditional modes method. We also discuss the minimal surface method, primal-dual method, and the multi-objective optimization method. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview on those applications as well. PMID:26186171
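
    As a minimal illustration of the continuous-optimization family the survey covers, the sketch below minimizes a simple 1-D Tikhonov-style denoising energy by gradient descent; the energy, data, and step size are illustrative constructions, not taken from any specific method in the survey.

```python
# Gradient descent on a 1-D Tikhonov denoising energy:
#   E(u) = ||u - f||^2 + lam * ||D u||^2,  D = forward difference.
import numpy as np

rng = np.random.default_rng(0)
f = np.sin(np.linspace(0.0, 2.0 * np.pi, 200)) + 0.3 * rng.standard_normal(200)

lam, step = 5.0, 0.02  # smoothing weight, gradient-descent step size

def energy(u: np.ndarray) -> float:
    return float(np.sum((u - f) ** 2) + lam * np.sum(np.diff(u) ** 2))

u = f.copy()
for _ in range(500):
    # d/du of ||u - f||^2 is 2(u - f); d/du of ||Du||^2 is 2 D^T D u.
    dtd = np.zeros_like(u)
    dtd[1:-1] = 2.0 * u[1:-1] - u[:-2] - u[2:]
    dtd[0], dtd[-1] = u[0] - u[1], u[-1] - u[-2]
    u -= step * (2.0 * (u - f) + 2.0 * lam * dtd)

print(f"E(f) = {energy(f):.1f}  ->  E(u) = {energy(u):.1f}")
```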

  20. METHODOLOGICAL APPROACH AND MODEL ANALYSIS FOR IDENTIFICATION OF TOURIST TRENDS

    Directory of Open Access Journals (Sweden)

    Neven Šerić

    2015-06-01

    Full Text Available The appeal and diversity of a destination's offer are antecedents of growth in tourism visits. Destination supply is differentiated through new, specialised tourism products. The usual approach consists of forming specialised tourism products in accordance with the existing tourism destination image. Another approach, prevalent in the practice of developed tourism destinations, is based on innovating the destination supply in accordance with global tourism trends. For this purpose, it is advisable to choose a method for monitoring and analysing tourism trends. The goal is to determine the actual trends governing target markets, differentiating whims from trends during the tourism preseason. Considering the return on investment, modifying a destination's tourism offer on the basis of a tourism whim is a risky endeavour indeed. Adapting the destination's supply to tourism whims can result in a shifted image, one that is unable to ensure long-term interest and growth in tourist vacations. With regard to tourism trend research, and based on the research conducted, a model for evaluating tourism phenomena is proposed that determines whether a tourism phenomenon is a trend or a whim.

  1. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  2. Fire risk analysis for nuclear power plants: Methodological developments and applications

    International Nuclear Information System (INIS)

    A methodology to quantify the risk from fires in nuclear power plants is described. This methodology combines engineering judgment, statistical evidence, fire phenomenology, and plant system analysis. It can be divided into two major parts: (1) fire scenario identification and quantification, and (2) analysis of the impact on plant safety. This article primarily concentrates on the first part. Statistical analysis of fire occurrence data is used to establish the likelihood of ignition. The temporal behaviors of the two competing phenomena, fire propagation and fire detection and suppression, are studied and their characteristic times are compared. Severity measures are used to further specialize the frequency of the fire scenario. The methodology is applied to a switchgear room of a nuclear power plant
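
    Under the common simplifying assumption that both competing phenomena have exponentially distributed completion times (a textbook construction, not necessarily the article's actual model), the comparison of characteristic times reduces to a closed form:

```latex
% Race between fire propagation (rate \lambda_p = 1/\tau_p) and
% detection-plus-suppression (rate \lambda_s = 1/\tau_s).
P(\text{damage}) = P(t_p < t_s) = \frac{\lambda_p}{\lambda_p + \lambda_s},
\qquad
f_{\text{scenario}} = f_{\text{ignition}} \cdot P(\text{damage}) \cdot w_{\text{severity}}
```

    Here $f_{\text{ignition}}$ is the statistically estimated ignition frequency and $w_{\text{severity}}$ a severity weighting of the kind the abstract mentions; both symbols are illustrative notation.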

  3. SCALE-4 analysis of pressurized water reactor critical configurations. Volume 3: Surry Unit 1 Cycle 2

    International Nuclear Information System (INIS)

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using selected critical configurations from commercial pressurized-water reactors. The analysis methodology selected for all the calculations in this report is based on the codes and data provided in the SCALE-4 code system. This volume of the report documents the SCALE system analysis of two reactor critical configurations for Surry Unit 1 Cycle 2. This unit and cycle were chosen for a previous analysis using a different methodology because detailed isotopics from multidimensional reactor calculations were available from the Virginia Power Company. These data permitted a direct comparison of criticality calculations using the utility-calculated isotopics with those using the isotopics generated by the SCALE-4 SAS2H sequence. These reactor critical benchmarks have been reanalyzed using the methodology described above. The two benchmark critical calculations were the beginning-of-cycle (BOC) startup at hot, zero-power (HZP) and an end-of-cycle (EOC) critical at hot, full-power (HFP) critical conditions. These calculations were used to check for consistency in the calculated results for different burnup, downtime, temperature, xenon, and boron conditions. The keff results were 1.0014 and 1.0113, respectively, with a standard deviation of 0.0005

  4. Safety assessment methodologies for near surface disposal facilities. Results of a co-ordinated research project (ISAM). Volume 1: Review and enhancement of safety assessment approaches and tools. Volume 2: Test cases

    International Nuclear Information System (INIS)

    the Safety Guide on 'Safety Assessment for Near Surface Disposal of Radioactive Waste' (Safety Standards Series No. WS-G- 1.1). The report of this CRP is presented in two volumes; Volume 1 contains a summary and a complete description of the ISAM project methodology and Volume 2 presents the application of the methodology to three hypothetical test cases

  5. Application of 3D Scanned Imaging Methodology for Volume, Surface Area, and Envelope Density Evaluation of Densified Biomass

    Science.gov (United States)

    Measurement of surface area, volume, and density is essential for quantifying, evaluating, and designing biomass densification, storage, and transport operations. Acquiring accurate and repeatable measurements of these parameters for hygroscopic densified biomass is not straightforward and on...

  6. Analysis of volume holographic storage allowing large-angle illumination

    Science.gov (United States)

    Shamir, Joseph

    2005-05-01

    Advanced technological developments have stimulated renewed interest in volume holography for applications such as information storage and wavelength multiplexing for communications and laser beam shaping. In these and many other applications, the information-carrying wave fronts usually possess narrow spatial-frequency bands, although they may propagate at large angles with respect to each other or a preferred optical axis. Conventional analytic methods are not capable of properly analyzing the optical architectures involved. For mitigation of the analytic difficulties, a novel approximation is introduced to treat narrow spatial-frequency band wave fronts propagating at large angles. This approximation is incorporated into the analysis of volume holography based on a plane-wave decomposition and Fourier analysis. As a result of the analysis, the recently introduced generalized Bragg selectivity is rederived for this more general case and is shown to provide enhanced performance for the above indicated applications. The power of the new theoretical description is demonstrated with the help of specific examples and computer simulations. The simulations reveal some interesting effects, such as coherent motion blur, that were predicted in an earlier publication.

  7. Analysis of Cavity Volumes in Proteins Using Percolation Theory

    Science.gov (United States)

    Green, Sheridan; Jacobs, Donald; Farmer, Jenny

    Molecular packing is studied in a diverse set of globular proteins in their native state, ranging in size from 34 to 839 residues. A new algorithm has been developed that builds upon the classic Hoshen-Kopelman algorithm for site percolation, combined with a local connection criterion that classifies empty space within a protein as a cavity when it is large enough to hold a spherical probe of radius R, and as microvoid otherwise. Although microvoid cannot fit an object (e.g. a molecule or ion) the size of the probe or larger, total microvoid volume is a major contribution to protein volume. Importantly, the cavity and microvoid classification depends on the probe radius. As probe size decreases, less microvoid forms in favor of more cavities. As probe size is varied from large to small, many disconnected cavities merge to form a percolating path. For fixed probe size, microvoid, cavity, and solvent-accessible boundary volume properties reflect conformational fluctuations. These results are visualized on three-dimensional structures. Analysis of the cluster statistics within the framework of percolation theory suggests that interconversion between microvoid and cavity pathways regulates the dynamics of solvent penetration during partial unfolding events important to protein function.
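
    A minimal sketch of the classic labeling core that the abstract builds on, written for 2-D site percolation on a boolean grid (the protein application adds a 3-D geometry and a probe-radius connection criterion not shown here):

```python
# Minimal Hoshen-Kopelman-style cluster labeling for 2-D site percolation.
import numpy as np

def find(labels: list[int], x: int) -> int:
    """Union-find root lookup with path compression."""
    while labels[x] != x:
        labels[x] = labels[labels[x]]
        x = labels[x]
    return x

def hoshen_kopelman(occupied: np.ndarray) -> np.ndarray:
    """Label connected clusters of occupied sites (4-connectivity)."""
    rows, cols = occupied.shape
    cluster = np.zeros((rows, cols), dtype=int)
    labels = [0]  # labels[i] points toward the root label of cluster i
    for r in range(rows):
        for c in range(cols):
            if not occupied[r, c]:
                continue
            up = cluster[r - 1, c] if r > 0 else 0
            left = cluster[r, c - 1] if c > 0 else 0
            if up == 0 and left == 0:        # start a new cluster
                labels.append(len(labels))
                cluster[r, c] = len(labels) - 1
            elif up > 0 and left > 0:        # merge two clusters
                ru, rl = find(labels, up), find(labels, left)
                labels[max(ru, rl)] = min(ru, rl)
                cluster[r, c] = min(ru, rl)
            else:                            # join the single neighbor
                cluster[r, c] = find(labels, max(up, left))
    # Second pass: flatten every label to its root.
    for r in range(rows):
        for c in range(cols):
            if cluster[r, c]:
                cluster[r, c] = find(labels, cluster[r, c])
    return cluster

grid = np.random.default_rng(1).random((20, 20)) < 0.55  # occupancy p = 0.55
print("cluster labels present:", np.unique(hoshen_kopelman(grid)))
```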

  8. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  9. User's operating procedures. Volume 2: Scout project financial analysis program

    Science.gov (United States)

    Harris, C. G.; Haris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data system, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user friendly menu drivers.

  10. Application of seismic analysis methodology to small modular integral reactor internals

    International Nuclear Information System (INIS)

    The fluid-structure interaction (FSI) effect should be carefully considered in the seismic analysis of nuclear reactor internals to obtain appropriate seismic responses, because the dynamic characteristics of reactor internals change when they are submerged in the reactor coolant. This study presents a seismic analysis methodology that considers the FSI effect for an integral reactor and applies it to the System-Integrated Modular Advanced Reactor (SMART) developed in Korea. In this methodology, we focus especially on constructing a numerical analysis model that can represent the dynamic behavior arising from the FSI effect. The effect is included in the simplified seismic analysis model by adopting fluid elements at the gaps between the structures. The overall procedure for constructing the seismic analysis model is verified using dynamic characteristics extracted from a scaled-down model, and a time history analysis is then carried out on the constructed model, applying the El Centro earthquake input to obtain the major seismic responses. The results show that the seismic analysis model can clearly provide the seismic responses of the reactor internals. Moreover, the results emphasize the importance of considering the FSI effect in the seismic analysis of an integral reactor. (author)

  11. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis that can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects, where the operations or methods of the objects correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models that progress from the object-oriented real-time systems analysis and design logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems-analysis objects are transformed into software objects.
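
    A minimal sketch of what such an entity with states and state-transition rules might look like in code; the class, states, events, and transition table below are illustrative assumptions, not taken from the paper.

```python
# Illustrative "systems-analysis object": a concurrent entity modeled as a
# state machine with explicit states and transition rules. All names are
# hypothetical; the methodology itself is language-neutral.
from dataclasses import dataclass, field

@dataclass
class AnalysisObject:
    name: str
    state: str = "idle"
    # Transition rules: (current_state, event) -> next_state
    transitions: dict[tuple[str, str], str] = field(default_factory=lambda: {
        ("idle", "start"): "sampling",
        ("sampling", "data_ready"): "processing",
        ("processing", "done"): "idle",
        ("sampling", "timeout"): "fault",
        ("processing", "timeout"): "fault",
    })

    def handle(self, event: str) -> None:
        """Apply a transition rule; unknown events leave the state unchanged."""
        self.state = self.transitions.get((self.state, event), self.state)

sensor = AnalysisObject("pressure_sensor")
for ev in ["start", "data_ready", "done"]:
    sensor.handle(ev)
    print(sensor.name, "->", sensor.state)
```

    Because the states and transition rules are already explicit at the analysis stage, the same object can be carried unchanged into an object-oriented high-level design, which is the 'seamless' transition the paper argues for.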

  12. Parallel runway requirement analysis study. Volume 1: The analysis

    Science.gov (United States)

    Ebrahimi, Yaghoob S.

    1993-01-01

    The correlation of increased flight delays with the level of aviation activity is well recognized. A main contributor to these flight delays has been the capacity of airports. Though new airport and runway construction would significantly increase airport capacity, few programs of this type are currently underway, let alone planned, because of the high cost associated with such endeavors. Therefore, it is necessary to achieve the most efficient and cost-effective use of existing fixed airport resources through better planning and control of traffic flows. In fact, during the past few years the FAA has initiated such an airport capacity program designed to provide additional capacity at existing airports. Some of the improvements that the program has generated thus far have been based on new Air Traffic Control procedures, terminal automation, additional Instrument Landing Systems, improved controller display aids, and improved utilization of multiple runways/Instrument Meteorological Conditions (IMC) approach procedures. A useful element in understanding potential operational capacity enhancements at high-demand airports has been the development and use of an analysis tool called the PLAND_BLUNDER (PLB) Simulation Model. The objective in building this simulation was to develop a parametric model that could be used to determine the minimum safety level of parallel runway operations for various parameters representing the airplane, navigation, surveillance, and ATC system performance. This simulation is useful for: a quick and economical evaluation of existing environments that are experiencing IMC delays, an efficient way to study and validate proposed procedure modifications, an aid in evaluating requirements for new airports or new runways in old airports, a simple, parametric investigation of a wide range of issues and approaches, an ability to tradeoff air and ground technology and procedures contributions, and a way of considering probable

  13. Applications of Innovative Safety Analysis Methodology (ISAM) to Reload Safety Evaluation

    International Nuclear Information System (INIS)

    KNF has developed the Innovative Safety Analysis Methodology (ISAM), using the RETRAN code for non-LOCA transient analysis, over three years beginning in 2006. The first objective of this project is to secure the safety analysis methodology required for the export of the X-GEN fuel that KNF is developing. The second is to establish an improved methodology to be applied to the licensing safety analyses for all OPR1000 and APR1400 plants. The ISAM possesses the characteristics of a designer-friendly methodology. To verify its applicability to reload safety evaluation, most transients for the safety analysis report and for the COLSS/CPC setpoints have been analyzed and compared with current safety analysis results. The comparisons show good agreement, and it is concluded that the ISAM can be used in the licensing calculations for all OPR1000 and APR1400 plants. In this paper, the application results for the COLSS/CPC setpoint transients, such as the single CEA withdrawal (SCEAW) event and the asymmetric steam generator transients (ASGT), are presented

  14. Applications of Innovative Safety Analysis Methodology (ISAM) to Reload Safety Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Chan Su; Um, Kil Sup [Korea Nuclear Fuel, Daejeon (Korea, Republic of)

    2009-05-15

    KNF has developed the Innovative Safety Analysis Methodology (ISAM), using the RETRAN code for non-LOCA transient analysis, over three years beginning in 2006. The first objective of this project is to secure the safety analysis methodology required for the export of the X-GEN fuel that KNF is developing. The second is to establish an improved methodology to be applied to the licensing safety analyses for all OPR1000 and APR1400 plants. The ISAM possesses the characteristics of a designer-friendly methodology. To verify its applicability to reload safety evaluation, most transients for the safety analysis report and for the COLSS/CPC setpoints have been analyzed and compared with current safety analysis results. The comparisons show good agreement, and it is concluded that the ISAM can be used in the licensing calculations for all OPR1000 and APR1400 plants. In this paper, the application results for the COLSS/CPC setpoint transients, such as the single CEA withdrawal (SCEAW) event and the asymmetric steam generator transients (ASGT), are presented

  15. Applications of a methodology for the analysis of learning trends in nuclear power plants

    International Nuclear Information System (INIS)

    A methodology is applied to identify the learning trend related to the safety and availability of U.S. commercial nuclear power plants. The application is intended to aid in reducing the likelihood of human errors. To ensure that the methodology can be easily adapted to various classification schemes for operation data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme is selected. The significance criteria for human-initiated events affecting the systems and for events caused by human deficiencies were used. Clustering analysis was used to identify the learning trend in multi-dimensional histograms. A computer code based on the K-Means algorithm is developed and applied to find the learning period, in which error rates decrease monotonically with plant age
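
    A minimal sketch of the clustering step, assuming yearly error rates versus plant age as the input (the data, cluster count, and monotonicity check are illustrative assumptions, not the study's actual data bank or code):

```python
# Sketch: locate a "learning period" (monotonically decreasing error rate)
# in plant-age vs. error-rate data with k-means. Data are synthetic.
import numpy as np
from sklearn.cluster import KMeans

plant_age = np.arange(1, 21, dtype=float)            # years of operation
error_rate = 5.0 * np.exp(-0.3 * plant_age) + 0.5    # events per year
error_rate += np.random.default_rng(3).normal(0.0, 0.05, plant_age.size)

X = np.column_stack([plant_age, error_rate])
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

for k in range(2):
    ages = plant_age[km.labels_ == k]
    rates = error_rate[km.labels_ == k]
    decreasing = bool(np.all(np.diff(rates[np.argsort(ages)]) < 0))
    print(f"cluster {k}: ages {ages.min():.0f}-{ages.max():.0f}, "
          f"monotonically decreasing: {decreasing}")
```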

  16. Reliability analysis of repairable system based on GO-FLOW methodology

    International Nuclear Information System (INIS)

    A quantitative analysis method named GO-FLOW is introduced to analyze the reliability of systems with maintenance priorities and a limited number of repairmen. An approximate-formula model that can be applied in the GO-FLOW calculation is derived for the reliability parameters of a repairable assembly. The model's feasibility is then validated and its error analyzed. An example of a redundant pump component is presented, and the result obtained by GO-FLOW is compared with that of the GO methodology. The results show that the GO-FLOW methodology can be used for quantitative analysis of this sort of repairable system; the GO-FLOW model is effective, and its algorithm is more convenient than that of the GO methodology. (authors)

  17. Motion analysis of knee joint using dynamic volume images

    Science.gov (United States)

    Haneishi, Hideaki; Kohno, Takahiro; Suzuki, Masahiko; Moriya, Hideshige; Mori, Sin-ichiro; Endo, Masahiro

    2006-03-01

    Acquisition and analysis of the three-dimensional movement of the knee joint is desired in orthopedic surgery. We have developed two methods to obtain dynamic volume images of the knee joint. One is a 2D/3D registration method combining bi-plane dynamic X-ray fluoroscopy with a static three-dimensional CT; the other uses so-called 4D-CT with a cone beam and a wide 2D detector. In this paper, we present two analyses of knee joint movement obtained by these methods: (1) transition of the nearest points between femur and tibia, and (2) principal component analysis (PCA) of six parameters representing the three-dimensional movement of the knee. As preprocessing for the analysis, the femur and tibia regions are first extracted from the volume data at each time frame, and the tibia is then registered between frames by an affine transformation consisting of rotation and translation. The same transformation is applied to the femur as well. Using those image data, the movement of the femur relative to the tibia can be analyzed. Six movement parameters of the femur, consisting of three translation parameters and three rotation parameters, are obtained from those images. In analysis (1), the axis of each bone is first found and the flexion angle of the knee joint is calculated. For each flexion angle, the minimum distance between femur and tibia and the location giving that minimum are found for both the lateral and the medial condyle. As a result, the movement of the lateral condyle was observed to be larger than that of the medial condyle. In analysis (2), it was found that the movement of the knee can be represented by the first three principal components with a precision of 99.58%, and those three components appear to relate strongly to the three major movements of the femur in the knee bend known in orthopedic surgery.
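
    As an illustration of the PCA step, the following sketch (ours, not the authors' code) reduces a hypothetical time series of six pose parameters to its principal components via the SVD; frame counts and latent patterns are invented.

      import numpy as np

      rng = np.random.default_rng(1)
      # Hypothetical pose trajectory: one row per time frame, six columns
      # (three translations, three rotations) of femur motion relative to tibia.
      t = np.linspace(0, 1, 60)
      basis = rng.normal(size=(2, 6))                  # two latent motion patterns
      poses = (np.outer(np.sin(2 * np.pi * t), basis[0])
               + np.outer(t, basis[1]) + rng.normal(0, 0.02, (60, 6)))

      X = poses - poses.mean(axis=0)                   # center each parameter
      U, s, Vt = np.linalg.svd(X, full_matrices=False)
      explained = s**2 / (s**2).sum()                  # variance ratio per PC

      print("variance explained by first 3 PCs:", explained[:3].sum())
      scores = X @ Vt[:3].T                            # motion in PC coordinates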

  18. Validation of On-Orbit Methodology for the Assessment of Cardiac Function and Changes in the Circulating Volume Using Ultrasound and "Braslet-M" Occlusion Cuffs

    Science.gov (United States)

    Bogomolov, V. V.; Duncan, J. M.; Alferova, I. V.; Dulchavsky, S. A.; Ebert, D.; Hamilton, D. R.; Matveev, V. P.; Sargsyan, A. E.

    2008-01-01

    Recent advances in remotely guided imaging techniques on the ISS allow the acquisition of high-quality ultrasound data by crewmember operators with no medical background and minimal training. However, ongoing efforts are required to develop and validate methodology for complex imaging protocols to ensure their repeatability, efficiency, and suitability for use aboard the ISS. This Station Developmental Test Objective (SDTO) tests a cardiovascular evaluation methodology that takes advantage of the ISS ultrasound capability, the Braslet-M device, and modified respiratory maneuvers (Valsalva and Mueller) to broaden the spectrum of anatomical and functional information on the human cardiovascular system during long-duration space missions. The proposed methodology optimizes and combines new and previously demonstrated methods, and is expected to benefit medically indicated assessments, operational research protocols, and data collections for science. Braslet-M is a current Russian operational countermeasure that compresses the upper thigh to impede venous return from the lower extremities. The goal of the SDTO is to establish and validate a repeatable ultrasound-based methodology for the assessment of a number of cardiovascular criteria in microgravity. The Braslet-M device is used as a means to acutely alter volume distribution while focused ultrasound measurements are performed. Modified respiratory maneuvers are performed during the volume manipulations to record commensurate changes in anatomical and functional parameters. The overall cardiovascular effects of the Braslet-M device are not completely understood, and although it is not a primary objective of this SDTO, this effort will provide pilot data regarding the suitability of Braslet-M for its intended purpose, its effects, and the indications for its use.

  19. Success story in software engineering using NIAM (Natural language Information Analysis Methodology)

    Energy Technology Data Exchange (ETDEWEB)

    Eaton, S.M.; Eaton, D.S.

    1995-10-01

    To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goal of having both the customer and the analyst completely understand the information. We use the customer's own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

  20. Validation of On-Orbit Methodology for the Assessment of Cardiac Function and Changes in the Circulating Volume Using Ultrasound and Braslet-M Occlusion Cuffs

    Science.gov (United States)

    Hamilton, Douglas; Sargsyan, Ashot E.; Ebert, Douglas; Duncan, Michael; Bogomolov, Valery V.; Alferova, Irina V.; Matveev, Vladimir P.; Dulchavsky, Scott A.

    2010-01-01

    The objective of this joint U.S. - Russian project was the development and validation of an in-flight methodology to assess a number of cardiac and vascular parameters associated with circulating volume and its manipulation in long-duration space flight. Responses to modified Valsalva and Mueller maneuvers were measured by cardiac and vascular ultrasound (US) before, during, and after temporary volume reduction by means of Braslet-M thigh occlusion cuffs (Russia). Materials and Methods: The study protocol was conducted in 14 sessions on 9 ISS crewmembers, with an average exposure to microgravity of 122 days. Baseline cardiovascular measurements were taken by echocardiography in multiple modes (including tissue Doppler of both ventricles) and femoral and jugular vein imaging on the International Space Station (ISS). The Braslet devices were then applied and measurements were repeated after >10 minutes. The cuffs were then released and the hemodynamic recovery process was monitored. Modified Valsalva and Mueller maneuvers were used throughout the protocol. All US data were acquired by the HDI-5000 ultrasound system aboard the ISS (ATL/Philips, USA) during remotely guided sessions. The study protocol, including the use of Braslet-M for this purpose, was approved by the ISS Human Research Multilateral Review Board (HRMRB). Results: The effects of fluid sequestration on a number of echocardiographic and vascular parameters were readily detectable by in-flight US, as were responses to respiratory maneuvers. The overall volume status assessment methodology appears to be valid and practical, with a decrease in left heart lateral E (tissue Doppler) as one of the most reliable measures. Increase in the femoral vein cross-sectional areas was consistently observed with Braslet application. Other significant differences and trends within the extensive cardiovascular data were also observed. (Decreased - RV and LV preload indices, Cardiac Output, LV E all maneuvers, LV Stroke

  1. Environmentally-acceptable fossil energy site evaluation and selection: methodology and user's guide. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Northrop, G.M.

    1980-02-01

    This report is designed to facilitate assessments of environmental and socioeconomic impacts of fossil energy conversion facilities which might be implemented at potential sites. The discussion of methodology and the User's Guide contained herein are presented in a format that assumes the reader is not an energy technologist. Indeed, this methodology is meant for application by almost anyone with an interest in a potential fossil energy development - planners, citizen groups, government officials, and members of industry. It may also be of instructional value. The methodology is called: Site Evaluation for Energy Conversion Systems (SELECS) and is organized in three levels of increasing sophistication. Only the least complicated version - the Level 1 SELECS - is presented in this document. As stated above, it has been expressly designed to enable just about anyone to participate in evaluating the potential impacts of a proposed energy conversion facility. To accomplish this objective, the Level 1 calculations have been restricted to ones which can be performed by hand in about one working day. Data collection and report preparation may bring the total effort required for a first or one-time application to two to three weeks. If repeated applications are made in the same general region, the assembling of data for a different site or energy conversion technology will probably take much less time.

  2. Analysis of the processes in training groups: A methodological proposal and an empirical exemplification

    OpenAIRE

    Florinda Picone; Giuseppe Ruvolo

    2014-01-01

    The authors propose a new methodology for group process analysis through a coding grid applied to the text of a group experience. The proposed method has been constructed on the basis of the CCRT model elaborated by L. Luborsky and some analytic categories suggested by Lieberman and Whitaker in their Focal Group Conflict Theory. The authors also present a sample empirical application of their method to the text of an analytic training group. Keywords: group process analysis, training group, empirical group resea...

  3. The risks analysis like a practice of secure software development : A revision of models and methodologies

    OpenAIRE

    Carrillo Verdún, José; Gasca Hurtado, Gloria; Tovar Caro, Edmundo; Vega Zepeda, Vianca

    2006-01-01

    This document presents and analyzes risk analysis across the whole software development life cycle, framed as one of the recommended practices for secure software development. It presents and compares a set of risk analysis methodologies and strategies, taking as criteria some classifications proposed by different authors and the objectives they pursue, in order to orient them toward evaluation criteria for secure software development.

  4. Complete Photoionization Experiments via Ultrafast Coherent Control with Polarization Multiplexing II: Numerics & Analysis Methodologies

    CERN Document Server

    Hockett, P; Lux, C; Baumert, T

    2015-01-01

    The feasibility of complete photoionization experiments, in which the full set of photoionization matrix elements is determined using multiphoton ionization schemes with polarization-shaped pulses, has recently been demonstrated [Hockett et al., Phys. Rev. Lett. 112, 223001 (2014)]. Here we extend our previous work to discuss further details of the numerics and analysis methodology utilised, and compare the results directly to new tomographic photoelectron measurements, which provide a more sensitive test of the validity of the results. In so doing we discuss in detail the physics of the photoionization process, and suggest various avenues and prospects for this coherent multiplexing methodology.

  5. Can the capability approach be evaluated within the frame of mainstream economics?: A methodological analysis

    Directory of Open Access Journals (Sweden)

    Karaçay Çakmak Hatice

    2010-01-01

    Full Text Available The aim of this article is to examine the capability approach of Amartya Sen and mainstream economic theory in terms of their epistemological, methodological, and philosophical/cultural aspects. The reason for undertaking this analysis is the belief that Sen's capability approach, contrary to some economists' claims, is uncongenial to mainstream economic views on epistemology and methodology (though not on ontology). However, while some social scientists regard Sen as, on the whole, a mainstream economist, his own approach strongly criticizes both the theory and practice of mainstream economics.

  6. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival models and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.
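
    As a minimal illustration of the standard tests mentioned above (a Python sketch of our own, not the authors' supplemental R code), the following compares two groups of simulated, left-censored peak intensities; the distributions and detection limit are invented.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)

      # Simulated log-normal peak intensities for two conditions.
      a = rng.lognormal(mean=2.0, sigma=0.5, size=200)
      b = rng.lognormal(mean=2.3, sigma=0.5, size=200)

      # Left-censor low-abundance features below an (invented) detection limit.
      limit = 6.0
      a_obs = a[a >= limit]
      b_obs = b[b >= limit]

      # Non-parametric comparisons of the observed (uncensored) intensities.
      ks = stats.ks_2samp(a_obs, b_obs)
      wmw = stats.mannwhitneyu(a_obs, b_obs, alternative="two-sided")
      print(f"KS p={ks.pvalue:.3g}, Wilcoxon-Mann-Whitney p={wmw.pvalue:.3g}")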

  7. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.

  8. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    International Nuclear Information System (INIS)

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included

  9. A Methodological Review for the Analysis of Divide and Conquer Based Sorting/ Searching Algorithms

    Directory of Open Access Journals (Sweden)

    Deepak Abhyankar

    2011-09-01

    Full Text Available This paper develops a practical methodology for the analysis of sorting/searching algorithms. To achieve this objective, an analytical study of Quicksort and the searching problem was undertaken. This work explains that asymptotic analysis can be misleading if applied carelessly. The study provides fresh insight into the working of Quicksort and binary search, and also presents an exact analysis of Quicksort. Our study finds that asymptotic analysis is a form of approximation and may hide many useful facts. It is shown that infinitely many inefficient algorithms can easily be classified together with a few efficient algorithms under the asymptotic approach.
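
    To make the distinction between exact and asymptotic analysis concrete, here is a small sketch (ours, not the paper's) that counts the comparisons Quicksort actually performs and sets them against the leading-order asymptotic estimate.

      import math
      import random

      def quicksort(a, counter):
          """Return a sorted copy of a, counting element comparisons."""
          if len(a) <= 1:
              return a
          pivot = a[0]
          rest = a[1:]
          counter[0] += len(rest)           # one comparison per element vs pivot
          left = [x for x in rest if x < pivot]
          right = [x for x in rest if x >= pivot]
          return quicksort(left, counter) + [pivot] + quicksort(right, counter)

      random.seed(0)
      for n in (100, 1000, 10000):
          data = random.sample(range(10 * n), n)
          count = [0]
          quicksort(data, count)
          # Exact count vs the leading asymptotic term ~ 2 n ln n.
          print(n, count[0], round(2 * n * math.log(n)))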

  10. A methodology for the analysis of protection against overpressure using the Ramona-3 B code

    International Nuclear Information System (INIS)

    A methodology for analyzing the most severe overpressure transient that could occur at the Laguna Verde Nuclear Power Plant is presented; this study is required as part of the licensing analysis for a fuel reload. The analysis is carried out with the Ramona-3B code, and the results are compared against the safety analysis report of the Laguna Verde Nuclear Power Plant. The aim of the analysis is to determine the maximum pressure reached in the reactor vessel during operational events, in order to demonstrate compliance with the ASME code for containers and pressure vessels. (Author)

  11. An efficient methodology for the analysis of primary frequency control of electric power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D.P. [Nikola Tesla Institute, Belgrade (Yugoslavia); Mijailovic, S.V. [Electricity Coordinating Center, Belgrade (Yugoslavia)

    2000-06-01

    The paper presents an efficient methodology for the analysis of primary frequency control of electric power systems. This methodology continuously monitors the electromechanical transient processes, lasting up to 30 s, that occur after characteristic disturbances. It covers the period of short-term dynamic processes appearing immediately after the disturbance, in which the dynamics of the individual synchronous machines is dominant, as well as the period of uniform movement of all generators and restoration of their voltages. The characteristics of the developed methodology were determined using the example of a real electric power interconnection formed by the electric power systems of Yugoslavia, a part of the Republic of Srpska, Romania, Bulgaria, the former Yugoslav Republic of Macedonia, Greece and Albania (the second UCPTE synchronous zone). (author)

  12. Performance analysis of complex repairable industrial systems using PSO and fuzzy confidence interval based methodology.

    Science.gov (United States)

    Garg, Harish

    2013-03-01

    The main objective of the present paper is to propose a methodology for analyzing the behavior of complex repairable industrial systems. In real-life situations, it is difficult to find the optimal design policies for MTBF (mean time between failures), MTTR (mean time to repair) and related costs by utilizing available resources and uncertain data. For this, an availability-cost optimization model has been constructed for determining the optimal design parameters to improve system design efficiency. The uncertainties in the data related to each component of the system are estimated with the help of fuzzy and statistical methodology in the form of triangular fuzzy numbers. Using these data, the various reliability parameters which affect system performance are obtained in the form of fuzzy membership functions by the proposed confidence interval based fuzzy Lambda-Tau (CIBFLT) methodology. The results computed by CIBFLT are compared with those of the existing fuzzy Lambda-Tau methodology. Sensitivity analysis of the system MTBF has also been addressed. The methodology has been illustrated through a case study of the washing unit, the main part of a paper plant. PMID:23098922
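
    As a rough illustration of the triangular-fuzzy-number machinery such methodologies build on (a sketch under our own simplifying assumptions, not the paper's CIBFLT implementation), the following propagates a triangular MTBF and MTTR through the standard steady-state availability formula at a given alpha-cut.

      # Triangular fuzzy number (low, mode, high) and its alpha-cut interval.
      def alpha_cut(tfn, alpha):
          low, mode, high = tfn
          return (low + alpha * (mode - low), high - alpha * (high - mode))

      def availability_cut(mtbf_tfn, mttr_tfn, alpha):
          """Interval of A = MTBF / (MTBF + MTTR) at a given alpha-cut,
          using interval arithmetic (A is monotonic in MTBF and MTTR)."""
          b_lo, b_hi = alpha_cut(mtbf_tfn, alpha)
          r_lo, r_hi = alpha_cut(mttr_tfn, alpha)
          a_min = b_lo / (b_lo + r_hi)   # worst case: short MTBF, long MTTR
          a_max = b_hi / (b_hi + r_lo)   # best case: long MTBF, short MTTR
          return a_min, a_max

      # Hypothetical component data (hours): +/-15% spread around the mode.
      mtbf = (850.0, 1000.0, 1150.0)
      mttr = (8.5, 10.0, 11.5)
      for alpha in (0.0, 0.5, 1.0):
          print(alpha, availability_cut(mtbf, mttr, alpha))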

  13. Analysis of target volumes for gliomas; Volumes-cibles anatomocliniques (GTV et CTV) des tumeurs gliales

    Energy Technology Data Exchange (ETDEWEB)

    Kantor, G. [Centre Regional de Lutte Contre le Cancer, Service de Radiotherapie, Institut Bergonie, 33 - Bordeaux (France); Bordeaux-2 Univ., 33 (France); Loiseau, H. [Hopital Pellegrin-Tripode, Service de Neurochirurgie, 33 - Bordeaux (France); Bordeaux-2 Univ., 33 (France)

    2005-06-15

    Gliomas are the most frequent tumors of the central nervous system in adults. These intra-parenchymal tumors are infiltrative, and the most important criterion for the definition of the GTV and CTV is the extent of infiltration. Delineation of the GTV and CTV for untreated and resected gliomas remains a controversial and difficult issue because of the discrepancy between real tumor invasion and that estimated by CT or MRI. A joint analysis of four different approaches is particularly helpful: histopathological correlation with CT and MRI, use of new imaging modalities, patterns of relapse after treatment, and interobserver studies. The presence of isolated tumor cells in intact brain, oedema or adjacent structures requires the definition of two different options for the CTV: (i) a geometrical option, with the GTV defined as the tumor mass revealed by the contrast-enhanced zone on CT or MRI and a CTV with an expanded margin of 2 or 3 cm; (ii) an anatomic option, including the entire zone of oedema or isolated tumor cell infiltration extending at least as far as the limits of the hyperintense zone on T2-weighted MRI. Inclusion of adjacent structures (such as white matter, corpus callosum, subarachnoid spaces) in the CTV depends mainly on the site of the tumor and generally enlarges the volume. (authors)

  14. Best-estimate methodology for analysis of anticipated transients without scram in pressurized water reactors

    International Nuclear Information System (INIS)

    Union Fenosa, a utility company in Spain, has performed research on pressurized water reactor (PWR) safety with respect to the development of a best-estimate methodology for the analysis of anticipated transients without scram (ATWS), i.e., those anticipated transients for which failure of the reactor protection system is postulated. A scientific and technical approach is adopted with respect to the ATWS phenomenon as it affects a PWR, specifically the Zorita nuclear power plant, a single-loop Westinghouse-designed PWR in Spain. In this respect, an ATWS sequence analysis methodology based on published codes that is generically applicable to any PWR is proposed, which covers all the anticipated phenomena and defines the applicable acceptance criteria. The areas contemplated are cell neutron analysis, core thermal hydraulics, and plant dynamics, which are developed, qualified, and validated by comparison with reference calculations and measurements obtained from integral or separate-effects tests.

  15. Best-estimate methodology for analysis of anticipated transients without scram in pressurized water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Rebollo, L. (Union Fenosa, Madrid (Spain))

    1993-07-01

    Union Fenosa, a utility company in Spain, has performed research on pressurized water reactor (PWR) safety with respect to the development of a best-estimate methodology for the analysis of anticipated transients without scram (ATWS), i.e., those anticipated transients for which failure of the reactor protection system is postulated. A scientific and technical approach is adopted with respect to the ATWS phenomenon as it affects a PWR, specifically the Zorita nuclear power plant, a single-loop Westinghouse-designed PWR in Spain. In this respect, an ATWS sequence analysis methodology based on published codes that is generically applicable to any PWR is proposed, which covers all the anticipated phenomena and defines the applicable acceptance criteria. The areas contemplated are cell neutron analysis, core thermal hydraulics, and plant dynamics, which are developed, qualified, and validated by comparison with reference calculations and measurements obtained from integral or separate-effects tests.

  16. Coal gasification systems engineering and analysis. Volume 1: Executive summary

    Science.gov (United States)

    1980-01-01

    Feasibility analyses and systems engineering studies for a 20,000 tons per day medium Btu (MBG) coal gasification plant to be built by TVA in Northern Alabama were conducted. Major objectives were as follows: (1) provide design and cost data to support the selection of a gasifier technology and other major plant design parameters, (2) provide design and cost data to support alternate product evaluation, (3) prepare a technology development plan to address areas of high technical risk, and (4) develop schedules, PERT charts, and a work breakdown structure to aid in preliminary project planning. Volume one contains a summary of gasification system characterizations. Five gasification technologies were selected for evaluation: Koppers-Totzek, Texaco, Lurgi Dry Ash, Slagging Lurgi, and Babcock and Wilcox. A summary of the trade studies and cost sensitivity analysis is included.

  17. Dependability analysis of a very large volume neutrino telescope

    International Nuclear Information System (INIS)

    This work considers a first-order approximation to the dependability analysis of complex large-scale installations. The dependability criterion used here is quantitative unavailability, and an appropriate unavailability model is presented. The model assumes that the system is symmetrical, has several levels of hierarchy, and that components found in the same level are similar and function independently. The application example comes from very large volume neutrino telescopes installed under water or ice, consisting of several thousand optical modules. The readout architecture of the detector has several levels of multiplexing, including optical detection towers, branches and tower sectors. The paper presents results for various alternative detector layouts and distances of the detector from the onshore facilities. It also develops dependability requirements for major components and/or subsystems consistent with an overall system performance target. The results depict the dependence of the system unavailability on the number of optical modules and on the alternative deep-sea infrastructure configurations for transferring the measured signals.
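
    The record gives no formulas, but a hierarchical unavailability model of this kind can be sketched as follows; the assumptions here are ours for illustration: independent identical components, each level operational if at least k of its n children are up, and invented counts and availabilities.

      from math import comb

      def avail_k_of_n(child_avail, n, k):
          """Availability of a level with n independent, identical children,
          operational when at least k children are available."""
          return sum(comb(n, i) * child_avail**i * (1 - child_avail)**(n - i)
                     for i in range(k, n + 1))

      # Hypothetical hierarchy: optical module -> tower -> sector -> detector.
      a = 0.995                                # single optical module
      a = avail_k_of_n(a, n=20, k=18)          # tower needs 18 of 20 modules
      a = avail_k_of_n(a, n=10, k=9)           # sector needs 9 of 10 towers
      a = avail_k_of_n(a, n=8, k=7)            # detector needs 7 of 8 sectors
      print("system unavailability:", 1 - a)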

  18. Analysis of Partial Volume Effects on Accurate Measurement of the Hippocampus Volume

    Institute of Scientific and Technical Information of China (English)

    Maryam Hajiesmaeili; Jamshid Dehmeshki; Tim Ellis

    2014-01-01

    Hippocampal volume loss is an important biomarker for distinguishing subjects with Alzheimer's disease (AD), and its measurement in magnetic resonance images (MRI) is influenced by partial volume effects (PVE). This paper describes a post-processing approach that quantifies PVE for correction of the hippocampal volume using a spatial fuzzy C-means (SFCM) method. The algorithm is evaluated on a dataset of 20 T1-weighted MRI scans sampled at two different resolutions. The corrected volumes for the left and right hippocampus (HC), which are 23% and 18% for the low-resolution dataset and 6% and 5% for the high-resolution dataset, respectively, are lower than the hippocampal volumes obtained from manual segmentation. The results show the importance of applying this technique in AD detection with low-resolution datasets.
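
    For orientation, a bare-bones fuzzy C-means iteration (the non-spatial core on which SFCM variants build; everything here is an illustrative assumption, not the paper's algorithm) looks like this on 1-D intensities:

      import numpy as np

      def fcm(x, c=3, m=2.0, n_iter=50, seed=0):
          """Fuzzy C-means on 1-D intensities x; returns memberships, centers."""
          rng = np.random.default_rng(seed)
          u = rng.dirichlet(np.ones(c), size=len(x))        # initial memberships
          centers = np.zeros(c)
          for _ in range(n_iter):
              w = u ** m
              centers = (w * x[:, None]).sum(0) / w.sum(0)  # fuzzy cluster means
              d = np.abs(x[:, None] - centers[None, :]) + 1e-12
              # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
              ratio = d[:, :, None] / d[:, None, :]
              u = 1.0 / (ratio ** (2.0 / (m - 1.0))).sum(axis=2)
          return u, centers

      # Toy voxel intensities: background, partial-volume mix, tissue.
      x = np.concatenate([np.full(50, 0.1), np.full(20, 0.5), np.full(50, 0.9)])
      u, centers = fcm(x + np.random.default_rng(1).normal(0, 0.02, x.size))
      print(np.sort(centers))   # ~[0.1, 0.5, 0.9]; memberships quantify PVE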

  19. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela D A Guardia

    Full Text Available Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  20. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740
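
    As a flavor of what a RESTful analysis resource can look like (a minimal sketch with an invented endpoint, payload and analysis, not the GEAS implementation), consider the following Flask service:

      import math

      from flask import Flask, jsonify, request

      app = Flask(__name__)

      @app.route("/services/fold-change", methods=["POST"])
      def fold_change():
          """Toy analysis resource: log2 fold change between two sample groups.
          Expects JSON: {"control": [...], "treated": [...]} of expression values."""
          data = request.get_json(force=True)
          control = sum(data["control"]) / len(data["control"])
          treated = sum(data["treated"]) / len(data["treated"])
          return jsonify({"log2_fold_change": math.log2(treated / control)})

      if __name__ == "__main__":
          app.run(port=5000)  # POST JSON to /services/fold-change to exercise it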

  1. Synfuel program analysis. Volume 1: Procedures-capabilities

    Science.gov (United States)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    The analytic procedures and capabilities developed by Resource Applications (RA) for examining the economic viability, public costs, and national benefits of alternative synfuel projects are described. This volume is intended for Department of Energy (DOE) and Synthetic Fuel Corporation (SFC) program management personnel and includes a general description of the costing, venture, and portfolio models with enough detail for the reader to be able to specify cases and interpret outputs. It contains an explicit description (with examples) of the types of results which can be obtained when applied to the analysis of individual projects; the analysis of input uncertainty, i.e., risk; and the analysis of portfolios of such projects, including varying technology mixes and buildup schedules. The objective is to obtain, on the one hand, comparative measures of private investment requirements and expected returns (under differing public policies) as they affect the private decision to proceed, and, on the other, public costs and national benefits as they affect public decisions to participate (in what form, in what areas, and to what extent).

  2. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Directory of Open Access Journals (Sweden)

    I. Escuder-Bueno

    2012-09-01

    Full Text Available Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis for estimating current risk from a social perspective and identifying tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this approach. The main advantage of the methodology presented herein is that it provides a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures, which can be of great interest for decision makers as it provides rational and solid information.

  3. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.

  4. Two-dimensional thermal analysis of a fuel rod by finite volume method

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Rhayanne Y.N.; Silva, Mario A.B. da; Lira, Carlos A.B. de O., E-mail: ryncosta@gmail.com, E-mail: mabs500@gmail.com, E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamaento de Energia Nuclear

    2015-07-01

    In a nuclear reactor, power generation is limited by thermal and physical limitations rather than by nuclear parameters. The operation of a reactor core, even with the best heat removal system, must take into account the fact that the temperatures of fuel and cladding must not exceed safety limits anywhere in the core. If such limits are not respected, damage to the fuel element may release large quantities of radioactive material into the coolant or even lead to core meltdown. Thermal analyses of fuel rods are often accomplished by considering the one-dimensional heat diffusion equation. The aim of this study, a first step in this effort, is to verify the temperature distribution for a two-dimensional heat transfer problem in an advanced reactor. The methodology is based on the Finite Volume Method (FVM), which considers a balance of the property of interest over each control volume. The methodology is validated by comparing numerical and analytical solutions. For the two-dimensional analysis, the results indicate that the temperature profile agrees with the expected physical behavior, providing quantitative information for the development of advanced reactors. (author)

  5. Two-dimensional thermal analysis of a fuel rod by finite volume method

    International Nuclear Information System (INIS)

    In a nuclear reactor, power generation is limited by thermal and physical limitations rather than by nuclear parameters. The operation of a reactor core, even with the best heat removal system, must take into account the fact that the temperatures of fuel and cladding must not exceed safety limits anywhere in the core. If such limits are not respected, damage to the fuel element may release large quantities of radioactive material into the coolant or even lead to core meltdown. Thermal analyses of fuel rods are often accomplished by considering the one-dimensional heat diffusion equation. The aim of this study, a first step in this effort, is to verify the temperature distribution for a two-dimensional heat transfer problem in an advanced reactor. The methodology is based on the Finite Volume Method (FVM), which considers a balance of the property of interest over each control volume. The methodology is validated by comparing numerical and analytical solutions. For the two-dimensional analysis, the results indicate that the temperature profile agrees with the expected physical behavior, providing quantitative information for the development of advanced reactors. (author)
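
    A toy version of such a calculation (our sketch with invented dimensions and properties, not the authors' solver) discretizes steady 2-D conduction with uniform heat generation by finite volumes and iterates to convergence:

      import numpy as np

      # Steady 2-D heat conduction with uniform generation, k*lap(T) + q = 0,
      # on a square cross-section with a fixed surface temperature.
      n = 41                       # grid cells per side (hypothetical)
      L = 0.01                     # side length, m (invented)
      k = 3.0                      # conductivity, W/(m K) (illustrative)
      q = 2.0e8                    # volumetric heat source, W/m^3 (invented)
      T_wall = 600.0               # boundary temperature, K

      dx = L / (n - 1)
      T = np.full((n, n), T_wall)

      # On a uniform grid the finite-volume balance over each interior cell
      # reduces to the 5-point stencil: T_P = (sum of 4 neighbours + q dx^2/k)/4.
      for it in range(50000):
          T_new = T.copy()
          T_new[1:-1, 1:-1] = 0.25 * (T[2:, 1:-1] + T[:-2, 1:-1] +
                                      T[1:-1, 2:] + T[1:-1, :-2] +
                                      q * dx**2 / k)
          if np.max(np.abs(T_new - T)) < 1e-5:
              T = T_new
              break
          T = T_new

      print("peak (centre) temperature, K:", T.max())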

  6. Travel time impacts analysis of system-wide signal timing optimization methodology

    OpenAIRE

    Ainchil Cayuela, Luis María

    2014-01-01

    This study analyzes the economic impact that users would experience from the travel time variation due to system-wide signal timing optimization. To do this, a comprehensive analysis of travel time user benefits is conducted using traffic volume, speed and other attributes of the road network, before and after signal timing optimization.

  7. Improved robotic stereotactic body radiation therapy plan quality and planning efficacy for organ-confined prostate cancer utilizing overlap-volume histogram-driven planning methodology

    International Nuclear Information System (INIS)

    Background and purpose: The purpose of this study is to determine whether the overlap-volume histogram (OVH)-driven planning methodology can be adapted to robotic SBRT (CyberKnife Robotic Radiosurgery System) to further reduce the bladder and rectal doses achieved in plans manually created by clinical planners. Methods and materials: A database containing clinically delivered robotic SBRT plans (7.25 Gy/fraction to 36.25 Gy) of 425 patients with localized prostate cancer was used as a cohort to establish an organ's distance-to-dose model. The OVH-driven planning methodology was refined by adding a PTV volume factor to counter the target's dose fall-off effect, and was incorporated into Multiplan to automate SBRT planning. For validation, automated plans (APs) for 12 new patients were generated, and their achieved dose/volume values were compared to the corresponding manually created, clinically delivered plans (CPs). A two-sided Wilcoxon rank-sum test was used for statistical comparison, with a significance level of p < 0.05. Results: The PTV's V(36.25 Gy) was comparable: 95.6% in CPs compared to 95.1% in APs (p = 0.2). On average, the refined approach lowered V(18.12 Gy) to the bladder and rectum by 8.2% (p < 0.05) and 6.4% (p = 0.14). A physician confirmed that the APs were clinically acceptable. Conclusions: The improvements in APs could further reduce the toxicities observed in SBRT for organ-confined prostate cancer
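
    An overlap-volume histogram itself is straightforward to compute; the sketch below (illustrative masks, spacing and distances, not the study's code) uses a Euclidean distance transform to ask what fraction of an organ-at-risk lies within each expansion distance of the target.

      import numpy as np
      from scipy import ndimage

      def overlap_volume_histogram(target, oar, spacing, distances):
          """Fraction of OAR voxels lying within each distance (mm) of the target.
          target, oar: boolean 3-D masks; spacing: voxel size in mm per axis."""
          # Distance from every voxel to the nearest target voxel (0 inside it).
          dist = ndimage.distance_transform_edt(~target, sampling=spacing)
          d_oar = dist[oar]
          return [float((d_oar <= t).mean()) for t in distances]

      # Toy geometry: a spherical "PTV" with an adjacent slab-shaped OAR.
      z, y, x = np.ogrid[:60, :60, :60]
      target = (z - 30)**2 + (y - 30)**2 + (x - 30)**2 <= 10**2
      oar = np.zeros((60, 60, 60), dtype=bool)
      oar[:, :, 43:50] = True

      print(overlap_volume_histogram(target, oar, spacing=(1.0, 1.0, 1.0),
                                     distances=range(0, 21, 5)))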

  8. Reliability analysis of repairable systems using Petri nets and vague Lambda-Tau methodology.

    Science.gov (United States)

    Garg, Harish

    2013-01-01

    The main objective of the paper is to develop a methodology, named vague Lambda-Tau, for the reliability analysis of repairable systems. The Petri net tool is applied to represent the asynchronous and concurrent processing of the system, instead of fault tree analysis. To enhance the relevance of the reliability study, vague set theory is used for representing the failure rates and repair times instead of classical (crisp) or fuzzy set theory, because vague sets are characterized by a truth membership function and a false membership function (non-membership function) such that the sum of both values is less than 1. The proposed methodology involves qualitative modeling using PN and quantitative analysis using the Lambda-Tau method of solution, with the basic events represented by intuitionistic fuzzy numbers with triangular membership functions. Sensitivity analysis has also been performed, and the effects on system MTBF are addressed. The methodology improves on the shortcomings of the existing probabilistic approaches and gives a better understanding of system behavior through its graphical representation. The washing unit of a paper mill situated in a northern part of India, producing approximately 200 tons of paper per day, has been considered to demonstrate the proposed approach. The results may be helpful for plant personnel in analyzing the systems' behavior and improving their performance by adopting suitable maintenance strategies. PMID:22789401

  9. Health monitoring methodology based on exergetic analysis for building mechanical systems

    International Nuclear Information System (INIS)

    Exergetic analysis is not often performed in the context of retrocommissioning (RCX); this research provides insight into the benefits of incorporating this approach. Data collected from a previously developed RCX test for an air handling unit (AHU) on a college campus are used in an advanced thermodynamic analysis. The operating data are analyzed using the first and second laws, and retrofit design solutions are recommended for improved system performance; the second-law analysis is particularly helpful because it requires few additional calculations or data collections. The thermodynamic methodology is extended to a building's cooling plant, which uses a vapor compression refrigeration cycle (VCRC) chiller. Existing chiller data collected for the design of an automated fault detection and diagnosis methodology are used. As with the AHU analysis, the second-law analysis locates irreversibilities that would not be found by a first-law analysis alone. Plant data representing both normal and faulty operation are used to develop a chiller model for assessing performance and health monitoring. The data are analyzed to determine the viability of health monitoring by performing an exergy analysis on existing data. Conclusions are drawn about the usefulness of exergetic analysis for improving the operation of energy-intensive building mechanical systems.
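
    To indicate how little extra computation a second-law check adds, here is a small sketch (our own, with illustrative state values) of the specific flow exergy of an air stream treated as an ideal gas, psi = (h - h0) - T0 (s - s0):

      import math

      CP_AIR = 1005.0   # J/(kg K), ideal-gas specific heat (approximate)
      R_AIR = 287.0     # J/(kg K), gas constant for air

      def flow_exergy(T, p, T0=293.15, p0=101325.0):
          """Specific flow exergy (J/kg) of air relative to dead state (T0, p0),
          neglecting kinetic, potential and chemical contributions."""
          dh = CP_AIR * (T - T0)
          ds = CP_AIR * math.log(T / T0) - R_AIR * math.log(p / p0)
          return dh - T0 * ds

      # Illustrative AHU supply-air state: 13 C, slightly above ambient pressure.
      print(flow_exergy(T=286.15, p=102000.0), "J/kg")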

  10. Development of high pressure two-phase choked flow analysis methodology in complex piping system

    International Nuclear Information System (INIS)

    The choked flow mechanism, the characteristics of the two-phase sound velocity, and compressibility effects on flow through various piping system components are studied in order to develop an analysis methodology for high-pressure two-phase choked flow in complex piping systems, allowing evaluation of the choking flow rate and piping system design related analyses. Pipe flow can be considered choked when the Mach number equals 1, and compressibility effects can be accounted for through a modified incompressible formula in the momentum equation. Based on these findings, an overall analysis system is developed to study thermal-hydraulic effects on steady-state piping system flow, and future research items are presented. (Author)
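
    The record does not spell out its sound-velocity model; one common starting point (a homogeneous-mixture assumption on our part, not necessarily the author's) is Wood's equation for the two-phase mixture sound speed, sketched here with illustrative air-water values.

      import math

      def mixture_sound_speed(alpha, rho_g, c_g, rho_l, c_l):
          """Homogeneous two-phase sound speed via Wood's equation:
          1/(rho_m c_m^2) = alpha/(rho_g c_g^2) + (1-alpha)/(rho_l c_l^2)."""
          rho_m = alpha * rho_g + (1 - alpha) * rho_l
          inv = alpha / (rho_g * c_g**2) + (1 - alpha) / (rho_l * c_l**2)
          return 1.0 / math.sqrt(rho_m * inv)

      # Illustrative air-water values at ambient conditions.
      for alpha in (0.0, 0.01, 0.5, 0.99):
          c = mixture_sound_speed(alpha, rho_g=1.2, c_g=340.0,
                                  rho_l=1000.0, c_l=1500.0)
          print(f"void fraction {alpha:4.2f}: c_m = {c:7.1f} m/s")
          # Note the sharp minimum at intermediate void fractions.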

  11. Predicted costs of environmental controls for a commercial oil shale industry. Volume 1. An engineering analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nevens, T.D.; Culbertson, W.J. Jr.; Wallace, J.R.; Taylor, G.C.; Jovanovich, A.P.; Prien, C.H.; Hicks, R.E.; Probstein, R.F.; Domahidy, G.

    1979-07-01

    The pollution control costs for a commercial oil shale industry were determined in a joint effort by Denver Research Institute, Water Purification Associates of Cambridge, and Stone and Webster Engineering of Boston and Denver. Four commercial oil shale processes were considered. The results in terms of cost per barrel of syncrude oil are predicted to be as follows: Paraho Process, $0.67 to $1.01; TOSCO II Process, $1.43 to $1.91; MIS Process, $2.02 to $3.03; and MIS/Lurgi-Ruhrgas Process, $1.68 to $2.43. Alternative pollution control equipment and integrated pollution control strategies were considered and optimal systems selected for each full-scale plant. A detailed inventory of equipment (along with the rationale for selection), a detailed description of control strategies, itemized costs and predicted emission levels are presented for each process. Capital and operating cost data are converted to a cost per barrel basis using detailed economic evaluation procedures. Ranges of cost are determined using a subjective self-assessment of uncertainty approach. An accepted methodology for probability encoding was used, and cost ranges are presented as subjective probability distributions. Volume I presents the detailed engineering results. Volume II presents the detailed analysis of uncertainty in the predicted costs.

  12. 3-D volume reconstruction of skin lesions for melanin and blood volume estimation and lesion severity analysis.

    Science.gov (United States)

    D'Alessandro, Brian; Dhawan, Atam P

    2012-11-01

    Subsurface information about skin lesions, such as the blood volume beneath the lesion, is important for the analysis of lesion severity towards early detection of skin cancer such as malignant melanoma. Depth information can be obtained from diffuse reflectance based multispectral transillumination images of the skin. An inverse volume reconstruction method is presented which uses a genetic algorithm optimization procedure with a novel population initialization routine and nudge operator based on the multispectral images to reconstruct the melanin and blood layer volume components. Forward model evaluation for fitness calculation is performed using a parallel processing voxel-based Monte Carlo simulation of light in skin. Reconstruction results for simulated lesions show excellent volume accuracy. Preliminary validation is also done using a set of 14 clinical lesions, categorized into lesion severity by an expert dermatologist. Using two features, the average blood layer thickness and the ratio of blood volume to total lesion volume, the lesions can be classified into mild and moderate/severe classes with 100% accuracy. The method therefore has excellent potential for detection and analysis of pre-malignant lesions. PMID:22829392
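
    The reconstruction relies on a genetic algorithm; the following generic GA loop (an illustrative skeleton with an invented stand-in fitness function, not the paper's voxel-based Monte Carlo pipeline) shows the structure of such an inverse search:

      import random

      random.seed(0)
      TARGET = [0.4, 0.7, 0.2]      # hypothetical "true" layer parameters

      def fitness(ind):
          # Stand-in for the forward model: negative squared error vs. target.
          return -sum((a - b) ** 2 for a, b in zip(ind, TARGET))

      def evolve(pop_size=40, n_genes=3, n_gen=100, p_mut=0.2):
          pop = [[random.random() for _ in range(n_genes)]
                 for _ in range(pop_size)]
          for _ in range(n_gen):
              pop.sort(key=fitness, reverse=True)
              parents = pop[:pop_size // 2]             # truncation selection
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = random.sample(parents, 2)
                  cut = random.randrange(1, n_genes)    # one-point crossover
                  child = a[:cut] + b[cut:]
                  if random.random() < p_mut:           # "nudge"-style mutation
                      g = random.randrange(n_genes)
                      child[g] = min(1.0, max(0.0, child[g] +
                                              random.gauss(0, 0.05)))
                  children.append(child)
              pop = parents + children
          return max(pop, key=fitness)

      print(evolve())   # converges toward TARGET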

  13. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    Full Text Available The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology, I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Beyond that, this industry is characterized by high flexibility, strong competition, and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a precise description of the procedure for collecting business requirements and following the business strategy.

  14. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    International Nuclear Information System (INIS)

    The author applies the GO methodology to the reliability analysis of the offsite power supply system of the Daya Bay NPP. Direct quantitative formulas for the steady-state reliability of a system with shared signals, and dynamic formulas for the state probability of a two-state unit, are derived. A method to find the fault event sets of the system is also presented, and all the fault event sets of the offsite power supply system and their failure probabilities are obtained. The recovery reliability of the offsite power supply system after a stability failure of the power grid is also calculated. The result shows that the GO methodology is simple and useful in the steady-state and dynamic reliability analysis of repairable systems.
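
    For the two-state unit mentioned above, the classical Markov result (standard reliability theory, not taken from the paper) gives the probability of being operable at time t; a minimal sketch with invented rates:

      import math

      def p_up(t, lam, mu, p0=1.0):
          """Availability of a two-state repairable unit with failure rate lam
          and repair rate mu, starting up with probability p0:
          P_up(t) = mu/(lam+mu) + (p0 - mu/(lam+mu)) * exp(-(lam+mu) t)."""
          a_inf = mu / (lam + mu)
          return a_inf + (p0 - a_inf) * math.exp(-(lam + mu) * t)

      # Illustrative rates (per hour): MTBF = 1000 h, MTTR = 10 h.
      lam, mu = 1e-3, 1e-1
      for t in (0, 10, 100, 1000):
          print(t, round(p_up(t, lam, mu), 6))   # settles at mu/(lam+mu) ~ 0.990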

  15. Methodological Approach for Performing Human Reliability and Error Analysis in Railway Transportation System

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2011-10-01

    Full Text Available Today, billions of dollars are being spent annually worldwide to develop, manufacture, and operate transportation systems such as trains, ships, aircraft, and motor vehicles. Around 70 to 90 percent of transportation crashes are, directly or indirectly, the result of human error. In fact, with the development of technology, system reliability has increased dramatically during the past decades, while human reliability has remained unchanged over the same period. Accordingly, human error is now considered the most significant source of accidents or incidents in safety-critical systems. The aim of the paper is to propose a methodological approach to improve the reliability of transportation systems, and of railway transportation systems in particular. The methodology presented is based on Failure Modes, Effects and Criticality Analysis (FMECA) and Human Reliability Analysis (HRA).
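
    A core FMECA step is the criticality ranking; the sketch below (generic textbook-style risk priority numbers with invented failure modes, not the paper's railway data) shows the usual computation:

      # Risk Priority Number: RPN = severity x occurrence x detection,
      # each scored 1-10 (a common FMECA convention).
      failure_modes = [
          # (description, severity, occurrence, detection)
          ("signal misread by driver",    9, 4, 6),
          ("brake actuator sticks",       8, 3, 4),
          ("track circuit false clear",  10, 2, 7),
      ]

      ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3],
                      reverse=True)
      for desc, s, o, d in ranked:
          print(f"RPN={s*o*d:4d}  {desc}")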

  16. Path Constitution Analysis: A Methodology for Understanding Path Dependence and Path Creation

    Directory of Open Access Journals (Sweden)

    Jörg Sydow

    2012-11-01

    Full Text Available Although an increasing number of studies of technological, institutional and organizational change refer to the concepts of path dependence and path creation, few attempts have been made to consider these concepts explicitly in their methodological accounts. This paper addresses this gap and contributes to the literature by developing a comprehensive methodology that originates from the concepts of path dependence and path creation – path constitution analysis (PCA – and allows for the integration of multi-actor constellations on multiple levels of analysis within a process perspective. Based upon a longitudinal case study in the field of semiconductors, we illustrate PCA ‘in action’ as a template for other researchers and critically examine its adequacy. We conclude with implications for further path-oriented inquiries.

  17. Social representations, correspondence factor analysis and characterization questionnaire: a methodological contribution.

    Science.gov (United States)

    Lo Monaco, Grégory; Piermattéo, Anthony; Guimelli, Christian; Abric, Jean-Claude

    2012-11-01

    The characterization questionnaire is inspired by Q-sort methodologies (i.e., qualitative sorting). It consists in asking participants to give their opinion on a list of items by sorting them into categories depending on their level of characterization of the object. This technique allows us to obtain distributions for each item and each response modality (i.e., characteristic vs. not chosen vs. not characteristic). This contribution sets out to analyze these frequencies by means of correspondence factor analysis. The originality of this contribution lies in the fact that this kind of analysis has never been used to process data collected by means of this questionnaire. The procedure is detailed and exemplified by means of two empirical studies on social representations of the good wine and the good supermarket. The interest of such a contribution is discussed from both a methodological point of view and an applications perspective. PMID:23156928
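
    Correspondence factor analysis of such an item-by-modality table reduces to a short SVD computation; the sketch below (illustrative counts and standard CA algebra, not the authors' code) extracts the factor coordinates:

      import numpy as np

      # Hypothetical item x response-modality contingency table
      # (rows: items; columns: characteristic / not chosen / not characteristic).
      N = np.array([[40, 30, 10],
                    [15, 25, 40],
                    [35, 20, 25],
                    [10, 15, 55]], dtype=float)

      P = N / N.sum()                       # correspondence matrix
      r = P.sum(axis=1)                     # row masses
      c = P.sum(axis=0)                     # column masses
      # Standardized residuals, then SVD: the core of correspondence analysis.
      S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
      U, s, Vt = np.linalg.svd(S, full_matrices=False)

      row_coords = (U * s) / np.sqrt(r)[:, None]   # principal row coordinates
      inertia = s**2 / (s**2).sum()
      print("share of inertia per axis:", inertia.round(3))
      print(row_coords[:, :2])              # item positions on first two axes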

  18. Novel Data Mining Methodologies for Adverse Drug Event Discovery and Analysis

    OpenAIRE

    Harpaz, Rave; DuMouchel, William; Shah, Nigam H; Madigan, David; Ryan, Patrick; Friedman, Carol

    2012-01-01

    Discovery of new adverse drug events (ADEs) in the post-approval period is an important goal of the health system. Data mining methods that can transform data into meaningful knowledge to inform patient safety have proven to be essential. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used in support of ADE discovery and analysis.

  19. Methodological reflections on gesture analysis in second language acquisition and bilingualism research

    OpenAIRE

    Gullberg, M.

    2010-01-01

    Gestures, the symbolic movements speakers perform while they speak, form a closely inter-connected system with speech where gestures serve both addressee-directed (‘communicative’) and speaker-directed (’internal’) functions. This paper aims (1) to show that a combined analysis of gesture and speech offers new ways to address theoretical issues in SLA and bilingualism studies, probing SLA and bilingualism as product and process; and (2) to outline some methodological concerns and desiderata t...

  20. A journalistic corpus: a methodology for the analysis of the financial crisis in Spain

    OpenAIRE

    Botella Trelis, Ana Paloma; Stuart, Keith Douglas Charles; Gadea, Lucía

    2015-01-01

    In this paper, we propose a methodological approach to the linguistic study of a journalistic corpus. It analyzes the monitoring of the financial crisis in Spain in 2012 by two of the most important Spanish newspapers. The paper describes ongoing research into expressions of opinion in the discourse of the news about the financial crisis in Spain. In other words, this corpus-driven study investigates the expression of opinions through language in order to develop a semantic analysis of newspa...

  1. Social Exchange Concept As A Methodological Framework For Employment Relations Analysis

    OpenAIRE

    Azer Efendiev; Anna Gogoleva; Evgeniya Balabanova

    2014-01-01

    The goal of the paper is to suggest the methodological framework of social exchange for analysis of employment relations. Our literature review revealed confusion concerning the definition of social exchange in the context of labor processes and employee-organization relations. The latter are complex and imply all elements of reciprocal and negotiated exchange as well as economic and social forms of exchange. We focus on rules and means of exchange as well as power-dependence relations during...

  2. THE METHODOLOGY OF SYNDROME ANALYSIS WITHIN THE PARADIGM OF "QUALITATIVE RESEARCH" IN CLINICAL PSYCHOLOGY

    OpenAIRE

    Elena I. Pervichko; Yuri P. Zinchenko

    2012-01-01

    This article considers the potential for applying contemporary philosophical theories (which distinguish classical, nonclassical, and postnonclassical types of scientific rationality) to the specification of theoretical methodological principles in the study of clinical psychology. We prove that psychological syndrome analysis (developed by the Vygotsky-Luria-Zeigarnik school), taken as a system of principles for organizing research as well as for interpreting its results, conforms to the epi...

  3. Methodology of demand forecast by market analysis of electric power and load curves

    International Nuclear Information System (INIS)

    A methodology for demand forecasting by consumer class, and for the aggregation of these class forecasts, is presented. The market actually served can be analysed through appropriate measurements and load-curve studies. Assumptions about future market behaviour by consumer class (industrial, residential, commercial, others) are presented, and the actions foreseen to optimise this market are obtained through load-curve modulation. The future demand is then determined by appropriate aggregation of these segmented demands. (C.G.C.)
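
    A toy sketch of the class-wise forecasting-and-aggregation step (all load shapes and peak figures below are invented for illustration):

    ```python
    import numpy as np

    # Hypothetical hourly load shapes per consumer class (normalized to peak = 1)
    # and forecast class peak demands in MW.
    hours = np.arange(24)
    shapes = {
        "industrial":  0.6 + 0.4 * ((hours >= 6) & (hours <= 18)),
        "residential": 0.3 + 0.7 * np.exp(-((hours - 20) ** 2) / 8.0),
        "commercial":  0.2 + 0.8 * ((hours >= 9) & (hours <= 21)),
    }
    peaks_mw = {"industrial": 500.0, "residential": 800.0, "commercial": 300.0}

    # Aggregate the segmented demands into the system load curve
    system_curve = sum(peaks_mw[k] * np.asarray(shapes[k], dtype=float) for k in shapes)
    print("forecast system peak: %.0f MW at hour %d" % (system_curve.max(), system_curve.argmax()))
    ```

    Note that the aggregated system peak is below the sum of the class peaks; this diversity effect is exactly what load-curve studies quantify.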

  4. Methodological Aspects of Qualitative-Quantitative Analysis of Decision-Making Processes

    Directory of Open Access Journals (Sweden)

    Gawlik Remigiusz

    2016-06-01

    Full Text Available The paper aims at recognizing the possibilities and perspectives of applying qualitative-quantitative research methodology in the field of economics, with a special focus on production engineering management processes. The main goal of the research is to define methods that would extend the research apparatus of economists and managers with tools that allow the inclusion of qualitative determinants in quantitative analysis. Such an approach is justified by the qualitative character of many determinants of economic occurrences. At the same time, the quantitative approach seems to be predominant in production engineering management, although methods for transposing qualitative decision criteria can be found in the literature. Nevertheless, international economics and management could profit from a mixed methodology incorporating both types of determinants into joint decision-making models. The research methodology consists of a literature review and the authors' own analysis of the applicability of mixed qualitative-quantitative methods to managerial decision-making. The expected outcome of the research is to find which methods should be applied to include qualitative-quantitative analysis in multicriteria decision-making models in the field of economics, with special regard to production engineering management.
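
    A minimal sketch of one common transposition: qualitative ratings mapped to a numeric scale and combined in a weighted-sum multicriteria score. The scale, criteria, weights and alternatives are all invented, and the paper may favour other MCDM families (e.g. AHP):

    ```python
    # Map qualitative ratings onto a 1-5 scale, then score alternatives.
    SCALE = {"very low": 1, "low": 2, "medium": 3, "high": 4, "very high": 5}

    criteria_weights = {"unit cost": 0.4, "flexibility": 0.35, "workforce morale": 0.25}

    alternatives = {
        # "unit cost" is already quantitative (scored so that higher = better)
        "line A": {"unit cost": 4.2, "flexibility": SCALE["medium"], "workforce morale": SCALE["high"]},
        "line B": {"unit cost": 3.1, "flexibility": SCALE["very high"], "workforce morale": SCALE["low"]},
    }

    def score(alt):
        return sum(criteria_weights[c] * alt[c] for c in criteria_weights)

    for name, alt in alternatives.items():
        print("%s: %.2f" % (name, score(alt)))
    print("preferred:", max(alternatives, key=lambda a: score(alternatives[a])))
    ```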

  5. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    This paper presents a model-based human error taxonomy and data collection method. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for the analysis of the actions of nuclear power plant operating crews during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem solving strategies'; and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics that extract generic characteristics of error-prone behaviors and error-prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to help define better symptom-based and other auxiliary procedures, with associated training, to minimize or preclude certain human errors. It also helps in the design of control rooms and in the assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  6. Safety analysis methodology with assessment of the impact of the prediction errors of relevant parameters

    International Nuclear Information System (INIS)

    The best estimate plus uncertainty (BEAU) approach requires the use of extensive resources and is therefore usually applied to cases in which the available safety margin obtained with a conservative methodology can be questioned. Outside the BEAU methodology, there is no clear approach for dealing with the uncertainties resulting from prediction errors in the safety analyses performed for licensing submissions. However, the regulatory document RD-310 mentions that the analysis method shall account for uncertainties in the analysis data and models. A possible approach, which is simple and reasonable and represents just the author's views, is presented to take into account the impact of prediction errors and other uncertainties when performing safety analysis in line with regulatory requirements. The approach proposes taking into account the prediction error of relevant parameters. Relevant parameters would be those plant parameters that are surveyed and are used to initiate the action of a mitigating system, or those that are representative of the most challenging phenomena for the integrity of a fission barrier. Examples of the application of the methodology are presented, involving a comparison between the results of the new approach and a best estimate calculation during the blowdown phase for two small breaks in a generic CANDU 6 station. The calculations are performed with the CATHENA computer code. (author)

  7. Atlas based brain volumetry: How to distinguish regional volume changes due to biological or physiological effects from inherent noise of the methodology.

    Science.gov (United States)

    Opfer, Roland; Suppa, Per; Kepp, Timo; Spies, Lothar; Schippling, Sven; Huppertz, Hans-Jürgen

    2016-05-01

    Fully automated regional brain volumetry based on structural magnetic resonance imaging (MRI) plays an important role in quantitative neuroimaging. In clinical trials as well as in clinical routine, multiple MRIs of individual patients at different time points need to be assessed longitudinally. Measures of inter- and intrascanner variability are crucial to understand the intrinsic variability of the method and to distinguish volume changes due to biological or physiological effects from inherent noise of the methodology. To measure regional brain volumes, an atlas-based volumetry (ABV) approach was deployed using a highly elastic registration framework and an anatomical atlas in a well-defined template space. We assessed inter- and intrascanner variability of the method in 51 cognitively normal subjects and 27 Alzheimer dementia (AD) patients from the Alzheimer's Disease Neuroimaging Initiative by studying volumetric results of repeated scans for 17 compartments and brain regions. Median percentage volume differences of scan-rescans from the same scanner ranged from 0.24% (whole brain parenchyma in healthy subjects) to 1.73% (occipital lobe white matter in AD), with generally higher differences in AD patients as compared to normal subjects (e.g., 1.01% vs. 0.78% for the hippocampus). Minimum percentage volume differences detectable with an error probability of 5% were in the one-digit percentage range for almost all structures investigated, with most of them being below 5%. Intrascanner variability was independent of magnetic field strength. The median interscanner variability was up to ten times higher than the intrascanner variability. PMID:26723849
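
    A small sketch of the scan-rescan arithmetic involved (the volumes are invented and the repeatability formula is the common Bland-Altman-style one; the paper's exact statistics may differ):

    ```python
    import numpy as np

    # Hypothetical scan-rescan hippocampus volumes (ml) from the same scanner
    scan1 = np.array([3.61, 3.42, 3.88, 3.55, 3.70, 3.49])
    scan2 = np.array([3.64, 3.40, 3.85, 3.58, 3.66, 3.52])

    pct_diff = 200.0 * np.abs(scan1 - scan2) / (scan1 + scan2)
    print("median scan-rescan difference: %.2f%%" % np.median(pct_diff))

    # Smallest change detectable with 5% error probability:
    # 1.96 * sqrt(2) * within-subject SD, with SD_w = SD(paired diff) / sqrt(2)
    within_sd = np.std(scan1 - scan2, ddof=1) / np.sqrt(2)
    mdc = 1.96 * np.sqrt(2) * within_sd
    print("minimum detectable volume change: %.3f ml" % mdc)
    ```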

  8. Network meta-analysis-highly attractive but more methodological research is needed

    Directory of Open Access Journals (Sweden)

    Singh Sonal

    2011-06-01

    Full Text Available Network meta-analysis, in the context of a systematic review, is a meta-analysis in which multiple treatments (that is, three or more) are compared using both direct comparisons of interventions within randomized controlled trials and indirect comparisons across trials based on a common comparator. To ensure the validity of findings from network meta-analyses, the systematic review must be designed rigorously and conducted carefully. Aspects of designing and conducting a systematic review for network meta-analysis include defining the review question, specifying eligibility criteria, searching for and selecting studies, assessing risk of bias and quality of evidence, conducting a network meta-analysis, and interpreting and reporting findings. This commentary summarizes the methodologic challenges and research opportunities for network meta-analysis relevant to each aspect of the systematic review process, based on discussions at a network meta-analysis methodology meeting we hosted in May 2010 at the Johns Hopkins Bloomberg School of Public Health. Since this commentary reflects the discussion at that meeting, it is not intended to provide an overview of the field.

  9. TRAC-BF1/NEM stability methodology for BWR core-wide and regional stability analysis

    International Nuclear Information System (INIS)

    A time-series analysis stability methodology is presented based on the TRAC-BF1/NEM coupled code. The methodology presented has potential application to BWR core-wide and regional stability studies, enabled by the 3D capabilities of the code. The stability analysis is performed at two different levels: using the TRAC-BF1 point kinetics model, and employing the three-dimensional neutronic transient capability of the NEM code. Point kinetics calculations show power fluctuations when white noise is applied to the inlet mass flow rate of each of the channel components. These fluctuations contain information about the system stability and are subsequently studied with time-series analysis methods. The analysis performed showed that the reactor core has a low-frequency resonance typical of BWRs. Analysis of preliminary three-dimensional calculations indicates that the power fluctuations do not contain the typical resonance at low frequency. This fact may be related to the limitation of the thermal-hydraulic (T-H) feedback representation through the use of two-dimensional tables for the cross-sections needed for 3D kinetics calculations. The results suggest that a more accurate table look-up should be used, which includes a three-dimensional representation of the feedback parameters (namely, average fuel temperature, average moderator temperature, and void fraction of the T-H cell of interest). Further research is being conducted on improving the cross-section modeling methodology used to feed the neutron kinetics code for both steady-state and transient cases. Also, a comprehensive analysis of the code's transient solution is being conducted to investigate the nature of the weak dependence of the power response on T-H variations during the performed 3D stability transient calculations.
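
    Not the TRAC-BF1/NEM procedure itself, but a self-contained sketch (synthetic signal; the AR order, rates and seed are arbitrary) of one standard way such noise records yield a decay ratio and resonance frequency, via an autoregressive fit:

    ```python
    import numpy as np

    def decay_ratio(signal, dt, order=20):
        """Estimate decay ratio and resonance frequency from a noise record
        by fitting an AR(order) model and locating its dominant oscillatory pole."""
        x = np.asarray(signal, dtype=float)
        x = x - x.mean()
        p = order
        A = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
        a, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
        poles = np.roots(np.concatenate(([1.0], -a)))
        osc = [z for z in poles if abs(z.imag) > 1e-6]   # oscillatory poles only
        z0 = max(osc, key=abs)                           # dominant pole pair
        r, theta = abs(z0), abs(np.angle(z0))
        return r ** (2 * np.pi / theta), theta / (2 * np.pi * dt)

    # Synthetic check: a ~0.5 Hz damped oscillation (true DR = 0.6) driven by white noise
    rng = np.random.default_rng(0)
    dt, f0, dr_true = 0.08, 0.5, 0.6
    r0, th = dr_true ** (f0 * dt), 2 * np.pi * f0 * dt
    x = np.zeros(4096)
    for n in range(2, len(x)):
        x[n] = 2 * r0 * np.cos(th) * x[n - 1] - r0**2 * x[n - 2] + rng.normal()
    dr, freq = decay_ratio(x, dt)
    print("DR ~ %.2f (true 0.6), resonance ~ %.2f Hz (true 0.5)" % (dr, freq))
    ```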

  10. Finite-Volume Analysis for the Cahn-Hilliard equation with Dynamic boundary conditions

    OpenAIRE

    Nabet, Flore

    2014-01-01

    This work is devoted to the convergence analysis of a finite-volume approximation of the 2D Cahn-Hilliard equation with dynamic boundary conditions. The proposed method couples a 2D finite-volume method in a bounded, smooth domain with a 1D finite-volume method on its boundary. We prove convergence of the sequence of approximate solutions.
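
    For orientation, one common form of this problem class is (a hedged restatement; the paper's exact scaling, potentials and notation may differ):

    \[
    \partial_t c = \Delta \mu, \qquad \mu = -\Delta c + f'(c) \quad \text{in } \Omega \subset \mathbb{R}^2,
    \]
    \[
    \partial_n \mu = 0, \qquad \partial_t c = \sigma \Delta_\Gamma c - \partial_n c - g'(c) \quad \text{on } \Gamma = \partial\Omega,
    \]

    where f and g are bulk and surface potentials (e.g. the double well f(c) = ¼(c² − 1)²) and Δ_Γ is the Laplace–Beltrami operator; the dynamic boundary condition is the equation that the coupled 1D boundary finite-volume scheme discretizes.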

  11. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    International Nuclear Information System (INIS)

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. This method utilizes equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and riverwater. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1+√w⁻¹), which is lower than the maximum power density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum of 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at "maximum power density operating pressure" requires at least 2.9 kmol. Utilizing this methodology, it is possible to examine the effects of volumetric solution cost, operation of a module at various pressures, and operation of a constant pressure module with various feeds.

  12. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    Energy Technology Data Exchange (ETDEWEB)

    Reimund, Kevin K. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; McCutcheon, Jeffrey R. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; Wilson, Aaron D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-08-01

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. This method utilizes equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and riverwater. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1+√w⁻¹), which is lower than the maximum power density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum of 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at “maximum power density operating pressure” requires at least 2.9 kmol. Utilizing this methodology, it is possible to examine the effects of volumetric solution cost, operation of a module at various pressures, and operation of a constant pressure module with various feeds.
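
    Restated in display form (using the abstract's symbols: u is the specific energy density, π the osmotic pressure of the concentrated feed, w its water mass fraction):

    \[
    \Delta P^{*}_{u} \;=\; \frac{\pi}{1+\sqrt{w^{-1}}} \;<\; \frac{\Delta \pi}{2} \;=\; \Delta P^{*}_{W},
    \]

    i.e. the pressure maximizing total volumetric energy density sits below the classical maximum-power-density pressure, which is why operating at Δπ/2 roughly doubles the minimum solute requirement quoted above (2.9 vs. 1.45 kmol per kWh).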

  13. Common cause failure analysis of PWR containment spray system by GO-FLOW methodology

    International Nuclear Information System (INIS)

    Highlights: • Identification of particular causes of failure for common cause failure analysis. • Evaluation of the dynamic reliability of the PWR containment spray system (standard case). • Selection of important parameters by sensitivity analysis for CCF analysis. • The calculated dynamic reliability is significantly worse than that of the standard case. • GO-FLOW, with its advanced functions, can be used as an alternative to fault tree and event tree analysis. -- Abstract: Common cause failure (CCF) is the simultaneous failure of multiple components due to some particular cause and has long been recognized as an important issue in probabilistic safety assessment (PSA). CCFs can make an important contribution to system unreliability. In this study, common cause failure has been considered in the reliability analysis, and the CCF analysis procedure is treated with the GO-FLOW methodology. The PWR containment spray system has been taken as the sample system. It is shown that the dynamic reliability of the containment spray system is significantly decreased by common cause failures.
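
    GO-FLOW itself is a success-oriented flowgraph technique; as a self-contained illustration of why CCF degrades a redundant system, here is the standard beta-factor parameterization applied to a two-train system (all rates and the beta value are assumed):

    ```python
    import math

    lam = 1.0e-4        # total failure rate per train, per hour (assumed)
    beta = 0.1          # assumed fraction of failures that are common cause
    t = 720.0           # mission time, hours (assumed)

    lam_ind = (1.0 - beta) * lam          # independent part of the rate
    lam_ccf = beta * lam                  # common-cause part of the rate

    q_ind = 1.0 - math.exp(-lam_ind * t)  # one train fails independently
    q_ccf = 1.0 - math.exp(-lam_ccf * t)  # both trains fail together

    q_no_ccf = (1.0 - math.exp(-lam * t)) ** 2
    q_with_ccf = q_ind**2 + q_ccf         # rare-event approximation

    print("1-of-2 system unavailability without CCF: %.2e" % q_no_ccf)
    print("with beta-factor CCF:                     %.2e" % q_with_ccf)
    ```

    Even a modest beta makes the common-cause term dominate the squared independent term, mirroring the abstract's finding that CCF significantly decreases the spray system's reliability.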

  14. Using the HRA Calculator in Human Reliability Analysis done with Methodology Described in NUREG-1921

    International Nuclear Information System (INIS)

    The HRA Calculator is a tool designed to support the preparation and documentation of human reliability analysis in fire probabilistic safety assessment (PSA). It collects the tasks required to develop and quantify human error probabilities in accordance with the methodology described in NUREG-1921. The HRA Calculator is a database that includes the tasks indicated in NUREG-1921 for human reliability analysis. For the quantitative analysis task, the HRA Calculator includes several quantification methods for human actions performed before and after an accident. One of the main advantages of the HRA Calculator is the systematization and standardization of fire human reliability analysis. It is also a tool that allows the use of more objective criteria to define and quantify human actions, so that the models capture, to the extent possible, the reality of the plant as it is operated. (Author)

  15. Socio-economic Value Analysis in Geospatial and Earth Observation: A methodology review (Invited)

    Science.gov (United States)

    Coote, A. M.; Bernknopf, R.; Smart, A.

    2013-12-01

    Many industries have long since realised that applying macro-economic analysis methodologies to assess the socio-economic value of a programme is a critical step in convincing decision makers to authorise investment. The geospatial and earth observation industry has, however, been slow to embrace economic analysis. There is nevertheless a growing number of studies, published in the last few years, that have applied economic principles to this domain. They have adopted a variety of different approaches, including:
    - Computable General Equilibrium (CGE) modelling
    - Revealed preference and stated preference (willingness-to-pay surveys)
    - Partial analysis
    - Simulations
    - Cost-benefit analysis (with and without risk analysis)
    This paper will critically review these approaches and assess their applicability to different situations and their ability to meet multiple objectives.
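
    As a pocket-sized example of the last family listed, a discounted cost-benefit calculation (all cash flows and the discount rate are invented):

    ```python
    # Net present value of a hypothetical Earth-observation programme (M$).
    rate = 0.05
    costs    = [120.0, 40.0, 40.0, 40.0, 40.0]   # year 0..4
    benefits = [  0.0, 60.0, 80.0, 90.0, 90.0]

    npv = sum((b - c) / (1.0 + rate) ** year
              for year, (b, c) in enumerate(zip(benefits, costs)))
    print("NPV: %.1f M$" % npv)   # positive NPV supports the investment case
    ```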

  16. Instrumental and methodological developments for isotope dilution analysis of gaseous mercury species

    OpenAIRE

    Larsson, Tom

    2007-01-01

    This thesis deals with instrumental and methodological developments for speciation analysis of gaseous mercury (Hg(g)), based on isotope dilution analysis (IDA). The studied species include Hg0, (CH3)2Hg, CH3HgX and HgX2 (where X symbolises a negatively charged counter ion in the form of a halide or hydroxyl ion). Gas chromatography hyphenated with inductively coupled plasma mass spectrometry (GC-ICPMS) was used for separation and detection of Hg(g) species. Permeation tubes were used for the...

  17. Technical support document: Energy efficiency standards for consumer products: Room air conditioners, water heaters, direct heating equipment, mobile home furnaces, kitchen ranges and ovens, pool heaters, fluorescent lamp ballasts and television sets. Volume 1, Methodology

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    The Energy Policy and Conservation Act (P.L. 94-163), as amended, establishes energy conservation standards for 12 of the 13 types of consumer products specifically covered by the Act. The legislation requires the Department of Energy (DOE) to consider new or amended standards for these and other types of products at specified times. DOE is currently considering amending standards for seven types of products: water heaters, direct heating equipment, mobile home furnaces, pool heaters, room air conditioners, kitchen ranges and ovens (including microwave ovens), and fluorescent light ballasts and is considering establishing standards for television sets. This Technical Support Document presents the methodology, data, and results from the analysis of the energy and economic impacts of the proposed standards. This volume presents a general description of the analytic approach, including the structure of the major models.

  18. Spatial and model-order based reactor signal analysis methodology for BWR core stability evaluation

    International Nuclear Information System (INIS)

    A new methodology for boiling water reactor core stability evaluation from measured noise signals has recently been developed and adopted at the Paul Scherrer Institut (PSI). This methodology consists of a general reactor noise analysis in which as much as possible of the information recorded during the tests is investigated prior to determining core representative stability parameters, i.e. the decay ratio (DR) and the resonance frequency, along with an associated estimate of the uncertainty range. A central part of this approach is that the evaluation of the core stability parameters is performed not only for a few but for ALL recorded neutron flux signals, thereby allowing the assessment of signal-related uncertainties. In addition, for each signal, three different model-order optimization methods are systematically employed to take into account the sensitivity to the model order. The methodology is then applied to the evaluation of the core stability measurements performed at the Leibstadt NPP, Switzerland, during cycles 10, 13 and 19. The results show that as the core becomes very stable, the method-related uncertainty becomes the major contributor to the overall uncertainty range, while for intermediate DR values, the signal-related uncertainty becomes dominant. However, as the core stability deteriorates, the method-related and signal-related spreads have similar contributions to the overall uncertainty, and both are found to be small. The PSI methodology identifies the origin of the different contributions to the uncertainty. Furthermore, in order to assess the results obtained with the current methodology, a comparative study is carried out for completeness with respect to results from previously developed and applied procedures. The results show good agreement between the current method and the other methods.

  19. Yucca Mountain transportation routes: Preliminary characterization and risk analysis; Volume 2, Figures [and] Volume 3, Technical Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R. [Nevada Univ., Las Vegas, NV (United States). Transportation Research Center

    1991-05-31

    This report presents appendices related to the preliminary assessment and risk analysis for high-level radioactive waste transportation routes to the proposed Yucca Mountain Project repository. Information includes data on population density, traffic volume, ecologically sensitive areas, and accident history.

  20. Methodological approaches to planar and volumetric scintigraphic imaging of small volume targets with high spatial resolution and sensitivity

    International Nuclear Information System (INIS)

    Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique, which provides information reporting the functional states of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of diseases for physiopathological, genomic and drug discovery studies. However, most of the experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than man. Consequently, images of targets obtained with conventional gamma-cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years in order to obtain images of small targets with good spatial resolution and sensitivity. Multi pinhole, coded mask- and slit-based collimators are presented as alternative approaches to improve image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3-D representations of the volumetric distribution of target's radiotracers. Simultaneously, they can be used to minimize artifacts and blurring arising when single pinhole collimators are used. Representation images are presented, which illustrate the use of these collimators. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction to obtain near-field images. We conclude this review by showing that the use of appropriate hardware and software tools adapted to conventional gamma-cameras can be of great help in obtaining relevant functional information in experiments using small animals. (author)

  1. Methodological approaches to planar and volumetric scintigraphic imaging of small volume targets with high spatial resolution and sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Mejia, J.; Galvis-Alonso, O.Y. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Faculdade de Medicina. Dept. de Biologia Molecular], e-mail: mejia_famerp@yahoo.com.br; Braga, J. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Div. de Astrofisica; Correa, R. [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Div. de Ciencia Espacial e Atmosferica; Leite, J.P. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Neurologia, Psiquiatria e Psicologia Medica; Simoes, M.V. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Clinica Medica

    2009-08-15

    Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique, which provides information reporting the functional states of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of diseases for physiopathological, genomic and drug discovery studies. However, most of the experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than man. Consequently, images of targets obtained with conventional gamma-cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years in order to obtain images of small targets with good spatial resolution and sensitivity. Multi pinhole, coded mask- and slit-based collimators are presented as alternative approaches to improve image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3-D representations of the volumetric distribution of target's radiotracers. Simultaneously, they can be used to minimize artifacts and blurring arising when single pinhole collimators are used. Representation images are presented, which illustrate the use of these collimators. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction to obtain near-field images. We conclude this review by showing that the use of appropriate hardware and software tools adapted to conventional gamma-cameras can be of great help in obtaining relevant functional information in experiments using small animals. (author)

  2. A Theoretical and Empirical Analysis of Challenges and motivations of Interdisciplinary Studies Implying on Interdisciplinary Methodology

    Directory of Open Access Journals (Sweden)

    M. Fatehrad

    2012-01-01

    Full Text Available Purpose: although the literature has focused on explaining the theoretical background of interdisciplinary studies, few empirical studies have paid attention to the main reasons for performing interdisciplinary projects. To fill this research gap, the current study tries to identify the challenges and motivations that lead researchers to participate in interdisciplinary projects. The main purposes of the research are: (1) identifying the challenges and motivations of interdisciplinary studies, with implications for interdisciplinary methodology; (2) determining the degree of importance of the above elements for interdisciplinary studies. Methodology: the current study is exploratory in terms of methodology, descriptive in terms of research design, and developmental in terms of purpose. Employing an interdisciplinary team, this study first investigated the theoretical background and prior research on interdisciplinary studies and identified the challenges and motivations of such studies. Second, structured interviews were performed with five interdisciplinary researchers. To investigate the identified elements empirically, a self-administered questionnaire was used, based on the literature and interviews. The questionnaire was distributed among 64 researchers from different disciplines who had experienced interdisciplinary activities. Data were analyzed by confirmatory factor analysis, composite reliability, one-sample t-test, Friedman test and analysis of variance (ANOVA) using LISREL and SPSS. Results indicated that the recognized motivations (except for public learning and satisfying psychological needs) and the identified challenges have a considerable effect on performing interdisciplinary studies.

  3. Volume-Rendering-Based Interactive 3D Measurement for Quantitative Analysis of 3D Medical Images

    OpenAIRE

    Yakang Dai; Jian Zheng; Yuetao Yang; Duojie Kuai; Xiaodong Yang

    2013-01-01

    3D medical images are widely used to assist diagnosis and surgical planning in clinical applications, where quantitative measurement of interesting objects in the image is of great importance. Volume rendering is widely used for qualitative visualization of 3D medical images. In this paper, we introduce a volume-rendering-based interactive 3D measurement framework for quantitative analysis of 3D medical images. In the framework, 3D widgets and volume clipping are integrated with volume render...

  4. Sensitivity and uncertainty analysis within a methodology for evaluating environmental restoration technologies

    Science.gov (United States)

    Zio, Enrico; Apostolakis, George E.

    1999-03-01

    This paper illustrates an application of sensitivity and uncertainty analysis techniques within a methodology for evaluating environmental restoration technologies. The methodology consists of two main parts: the first part ("analysis") integrates a wide range of decision criteria and impact evaluation techniques in a framework that emphasizes and incorporates input from stakeholders in all aspects of the process. Its products are the rankings of the alternative options for each stakeholder using, essentially, expected utility theory. The second part ("deliberation") utilizes the analytical results of the "analysis" and attempts to develop consensus among the stakeholders in a session in which the stakeholders discuss and evaluate the analytical results. This paper deals with the analytical part of the approach and the uncertainty and sensitivity analyses that were carried out in preparation for the deliberative process. The objective of these investigations was to test the robustness of the assessments and to point out possible existing sources of disagreement among the participating stakeholders, thus providing insights for the subsequent deliberative process. Standard techniques, such as differential analysis, Monte Carlo sampling and a two-dimensional policy region analysis, proved sufficient for the task.
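
    A generic sketch of Monte Carlo sampling with a rank-correlation sensitivity screen, in the spirit of the techniques named (the attribute model, weights and distributions below are invented stand-ins for the paper's stakeholder utility model):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    cost     = rng.lognormal(mean=np.log(50.0), sigma=0.3, size=n)   # M$
    dose     = rng.normal(loc=1.0, scale=0.2, size=n)                # person-Sv
    schedule = rng.uniform(2.0, 6.0, size=n)                         # years

    # Hypothetical multi-attribute (dis)utility: weighted, normalized attributes
    u = 0.5 * cost / 50.0 + 0.3 * dose / 1.0 + 0.2 * schedule / 4.0

    print("mean utility %.3f, 90%% interval [%.3f, %.3f]"
          % (u.mean(), *np.percentile(u, [5, 95])))

    # Rank inputs by Spearman-style rank correlation with the output
    ranks = lambda v: np.argsort(np.argsort(v))
    for name, x in [("cost", cost), ("dose", dose), ("schedule", schedule)]:
        rc = np.corrcoef(ranks(x), ranks(u))[0, 1]
        print("rank correlation %-8s %+.2f" % (name, rc))
    ```

    The rank-correlation column is the kind of output used to flag which uncertain inputs drive disagreement-relevant spread in the rankings.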

  5. A gap analysis methodology for collecting crop genepools: a case study with Phaseolus beans.

    Directory of Open Access Journals (Sweden)

    Julián Ramírez-Villegas

    Full Text Available BACKGROUND: The wild relatives of crops represent a major source of valuable traits for crop improvement. These resources are threatened by habitat destruction, land use changes, and other factors, requiring their urgent collection and long-term availability for research and breeding from ex situ collections. We propose a method to identify gaps in ex situ collections (i.e. gap analysis) of crop wild relatives as a means to guide efficient and effective collecting activities. METHODOLOGY/PRINCIPAL FINDINGS: The methodology prioritizes among taxa based on a combination of sampling, geographic, and environmental gaps. We apply the gap analysis methodology to wild taxa of the Phaseolus genepool. Of 85 taxa, 48 (56.5%) are assigned high priority for collecting due to lack of representation, or under-representation, in genebanks; 17 taxa are given medium priority for collecting, 15 low priority, and 5 species are assessed as adequately represented in ex situ collections. Gap "hotspots", representing priority target areas for collecting, are concentrated in central Mexico, although the narrow endemic nature of a suite of priority species adds a number of specific additional regions to the spatial collecting priorities. CONCLUSIONS/SIGNIFICANCE: Results of the gap analysis method mostly align very well with expert opinion of gaps in ex situ collections, with only a few exceptions. A more detailed prioritization of taxa and geographic areas for collection can be achieved by including in the analysis predictive threat factors, such as climate change or habitat destruction, or by adding additional prioritization filters, such as the degree of relatedness to cultivated species (i.e. ease of use in crop breeding). Furthermore, results for multiple crop genepools may be overlaid, which would allow a global analysis of gaps in ex situ collections of the world's plant genetic resources.

  6. A Genetic Analysis of Brain Volumes and IQ in Children

    Science.gov (United States)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler Intelligence Scale for Children-III. Phenotypic…

  7. A genetic analysis of brain volumes and IQ in children

    NARCIS (Netherlands)

    Leeuwen, van Marieke; Peper, Jiska S.; Berg, van den Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler I

  8. Volume component analysis for classification of LiDAR data

    Science.gov (United States)

    Varney, Nina M.; Asari, Vijayan K.

    2015-03-01

    One of the most difficult challenges of working with LiDAR data is the large number of data points that are produced. Analysing these large data sets is an extremely time-consuming process. For this reason, automatic perception of LiDAR scenes is a growing area of research. Currently, most LiDAR feature extraction relies on geometrical features specific to the point cloud of interest. These geometrical features are scene-specific and often rely on the scale and orientation of the object for classification. This paper proposes a robust method for reduced-dimensionality feature extraction from 3D objects using a volume component analysis (VCA) approach. This VCA approach is based on principal component analysis (PCA). PCA is a method of reduced feature extraction that computes a covariance matrix from the original input vector. The eigenvectors corresponding to the largest eigenvalues of the covariance matrix are used to describe an image. Block-based PCA is an adapted method for feature extraction in facial images because PCA, when performed in local areas of the image, can extract more significant features than when the entire image is considered. The image space is split into several of these blocks, and PCA is computed individually for each block. This VCA approach proposes that a LiDAR point cloud can be represented as a series of voxels whose values correspond to the point density within that relative location. From this voxelized space, block-based PCA is used to analyze sections of the space where the sections, when combined, represent features of the entire 3D object. These features are then used as the input to a support vector machine which is trained to identify four classes of objects, vegetation, vehicles, buildings and barriers, with an overall accuracy of 93.8%.
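
    A compact sketch of this voxelize → block-PCA → SVM pipeline on synthetic point clouds (the grid and block sizes, class shapes and all data are invented, and the paper's four-class setup is reduced to two toy classes):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC

    def voxelize(points, grid=16):
        """Normalized voxel-density grid for an Nx3 point cloud."""
        mins, maxs = points.min(axis=0), points.max(axis=0)
        idx = ((points - mins) / (maxs - mins + 1e-9) * (grid - 1)).astype(int)
        vox = np.zeros((grid, grid, grid))
        np.add.at(vox, (idx[:, 0], idx[:, 1], idx[:, 2]), 1.0)
        return vox / vox.max()

    def blocks_of(vox, block=4):
        """Flattened non-overlapping blocks of the voxel grid."""
        g = vox.shape[0]
        return np.array([vox[i:i+block, j:j+block, k:k+block].ravel()
                         for i in range(0, g, block)
                         for j in range(0, g, block)
                         for k in range(0, g, block)])

    # Two toy classes: flat "ground-like" slabs vs compact "vegetation-like" blobs
    rng = np.random.default_rng(1)
    clouds, labels = [], []
    for _ in range(20):
        clouds.append(rng.uniform(size=(500, 3)) * np.array([1.0, 1.0, 0.05]))
        labels.append(0)
        clouds.append(rng.normal(scale=0.2, size=(500, 3)))
        labels.append(1)

    vox_list = [voxelize(c) for c in clouds]
    pca = PCA(n_components=10).fit(np.vstack([blocks_of(v) for v in vox_list]))  # block-based PCA
    X = np.array([pca.transform(blocks_of(v)).ravel() for v in vox_list])
    clf = SVC(kernel="rbf").fit(X, labels)
    print("training accuracy: %.3f" % clf.score(X, labels))
    ```

    Because the features come from voxel densities rather than raw coordinates, the representation is less sensitive to point count and, with density normalization, to scale.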

  9. A Feasibility Analysis Methodology for Decentralized Wastewater Systems - Energy-Efficiency and Cost.

    Science.gov (United States)

    Naik, Kartiki S; Stenstrom, Michael K

    2016-03-01

    Centralized wastewater treatment, widely practiced in developed areas, involves transporting wastewater from large urban areas to a large capacity plant using a single network of sewers, whereas decentralization is the concept of wastewater collection, treatment and reuse at or near its point of generation. Smaller decentralized plants can achieve extensive reclamation and wastewater management with energy-efficient reclaimed water pumping, modularized expansion and lower capital investment. We devised a methodology to preliminarily assess these alternatives using local constraints and conducted a feasibility analysis for each option. It addressed various scenarios using the pump-back energy consumption, sewer and treatment plant construction and capacity expansion cost. We demonstrated this methodology by applying it to the Hollywood vicinity (California). In this study, the decentralized configuration was more economical and energy-efficient than the centralized system. The pump-back energy consumption was about 50% of the aeration energy consumption for the centralized option. PMID:26730575

  10. Analysis of kyoto university reactor physics critical experiments using NCNSRC calculation methodology

    International Nuclear Information System (INIS)

    The Kyoto University reactor physics experiments on the university's critical assembly are used to benchmark and validate the NCNSRC calculation methodology. This methodology has two lines, diffusion and Monte Carlo. The diffusion line includes the code WIMSD4 for cell calculations and the two-dimensional diffusion code DIXY2 for core calculations. The transport line uses the VAX version of the MULTIKENO code. Analysis is performed for the criticality and for the temperature coefficients of reactivity (TCR) of the different light-water-moderated and -reflected cores utilized in the experiments. The results for both the eigenvalue and the TCR approximately reproduce the experimental and theoretical Kyoto results. However, some conclusions are drawn about the adequacy of the standard WIMSD4 library. This paper is an extension of the NCNSRC efforts to assess and validate computer tools and methods for both the ET-RR-1 and ET-MMPR-2 research reactors. 7 figs., 1 tab

  11. A new methodology to study customer electrocardiogram using RFM analysis and clustering

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Gholamian

    2011-04-01

    Full Text Available One of the primary issues in marketing planning is to know the customer's behavioral trends. A customer's purchasing interest may fluctuate for different reasons, and it is important to detect declining or increasing trends whenever they happen. It is important to study these fluctuations to improve customer relationships. There are different methods to increase customers' willingness to purchase, such as planning good promotions, increasing advertisement, etc. This paper proposes a new methodology to measure customers' behavioral trends, called the customer electrocardiogram. The proposed model uses the K-means clustering method with RFM analysis to study customer fluctuations over different time frames. We also apply the proposed electrocardiogram methodology to a real-world case study in the food industry, and the results are discussed in detail.
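
    A minimal sketch of the RFM-plus-K-means step across time frames (all transaction data below are synthetic, and the paper's actual "electrocardiogram" construction may differ):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    n_customers = 200

    def rfm_snapshot():
        recency   = rng.integers(1, 120, n_customers)    # days since last purchase
        frequency = rng.poisson(5, n_customers) + 1      # purchases in the frame
        monetary  = rng.gamma(2.0, 50.0, n_customers)    # spend in the frame
        return np.column_stack([recency, frequency, monetary])

    frames = [rfm_snapshot() for _ in range(4)]          # four quarterly frames

    scaler = StandardScaler().fit(np.vstack(frames))
    km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(scaler.transform(frames[0]))

    # A customer's "electrocardiogram": the sequence of cluster labels per frame
    trajectories = np.column_stack([km.predict(scaler.transform(f)) for f in frames])
    print("customer 0 trajectory across frames:", trajectories[0])
    ```

    Movements between clusters over consecutive frames are the declining or increasing trends the methodology is meant to surface.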

  12. The Nuclear Organization and Management Analysis Concept methodology: Four years later

    International Nuclear Information System (INIS)

    The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology have been demonstrated at two commercial nuclear power plants (NPPs) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

  13. A methodology for the analysis on a national scale of environmental parameters

    International Nuclear Information System (INIS)

    It is here described a methodology for the research over the Italian territory of areas suitable for the siting of nuclear power plants. This methodology is designed for the analysis of a wide territory and it makes use of all the parameters available with a continuous character all over Italy (i.e. demographical and hydrographical data); the consideration of the missing parameters is deferred at the moment of the study on the areas resulting from the first review. The authors underline the usefulness of a territorial ''data-bank'', both for sorting out siting zones for nuclear power plants and for more genetic environmental and sanitary evaluations. In this report are also presented thematic charts, deriving from the elaboration of some parameters

  14. A consistent probabilistic methodology for the Seabrook Station containment event tree analysis

    International Nuclear Information System (INIS)

    A containment response analysis incudes a quantification of accident progression, containment failure, and source terms. A consistent and fully probabilistic methodology was developed and applied in the Seabrook probabilistic risk assessment (PRA). A wide range of deterministic accident analyses were used as input to the probabilistic quantification of the containment event tree. The methodology embraces the definition of initial conditions, criteria for selecting top events, a probabilistic quantification of top event split fractions and the interpretation and propagation of uncertainties. The correct probabilistic interpretation of top event split fractions as uncertainties of deterministic events instead of frequencies of random events has important implications in the final risk results, when risk is stated in a probability of frequency format; that is, when the uncertainties in the frequency of consequences are explicitly quantified

  15. The Nuclear Organization and Management Analysis Concept methodology: Four years later

    Energy Technology Data Exchange (ETDEWEB)

    Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

    1992-08-01

    The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology have been demonstrated at two commercial nuclear power plants (NPPs) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

  17. A Review and Analysis on Mobile Application Development Processes using Agile Methodologies

    Directory of Open Access Journals (Sweden)

    Harleen K. Flora

    2013-07-01

    currently in use for the development of mobile applications. This paper provides a detailed review and analysis on the use of agile methodologies in the proposed processes associated with mobile application skills and highlights its benefit and constraints. In addition, based on this analysis, future research needs are identified and discussed.

  18. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository - Volume 3: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R. (INEEL); Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K. (SNL); Rath, J.S. (New Mexico Engineering Research Institute)

    1998-10-01

    The United States Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  19. Analysis methodology for RBMK-1500 core safety and investigations on corium coolability during a LWR severe accident

    OpenAIRE

    Jasiulevicius, Audrius

    2004-01-01

    This thesis presents work involving two broad aspects within the field of nuclear reactor analysis and safety. These are: - development of a fully independent reactor dynamics and safety analysis methodology for RBMK-1500 core transient accidents, and - experiments on the enhancement of the coolability of a particulate bed or a melt pool due to heat removal through the control rod guide tubes. The first part of the thesis focuses on the development of the RBMK-1500 analysis methodology based on th...

  20. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hugo, Jacques [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  1. The methodological fundaments of development state analysis of surface engineering technologies

    Directory of Open Access Journals (Sweden)

    A. Dobrzańska-Danikiewicz

    2010-06-01

    Full Text Available Purpose: The goal of this paper is to present the authors' methodological fundaments for the development state analysis of surface engineering technologies against the background of the macro- and microenvironment. The analysis is carried out as part of the project entitled “The foresight of surface properties formation leading technologies of engineering materials and biomaterials”. The research project, called FORSURF, is co-funded by the European Regional Development Fund. Design/methodology/approach: Foresight is the whole of the activities focused on choosing the best vision of the future and showing the ways to realise that vision using the right methods. Technology foresight, in particular, is the process of bringing together scientists, engineers, industrialists, government officials and others in order to identify areas of strategic research and the leading technologies which, in the long term, will contribute to the greatest economic and social benefits and sustain industrial competitiveness. The FORSURF project belongs to the set of technology foresights. Findings: The expected result of the development state analysis of surface engineering technologies against the background of the macro- and microenvironment is the set of crucial technologies in each considered research scope. There are fourteen research scopes in the FORSURF project. Research limitations/implications: The results of the development state analysis of surface engineering technologies form the basis for the subject matter of the first research iteration of the Delphi method carried out within the framework of the FORSURF project. The main research implication of the whole FORSURF project is the identification of strategic research directions crucial in the next 20 years in the field of surface engineering. Practical implications: The practical implication of the definition of the methodological fundaments of development state analysis of surface engineering technology is to

  2. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    This paper reports on a study whose intent was to rigorously examine all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2500) in order to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to the classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in time and manpower required to complete the study were also realized.
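
    A toy expected-NPV comparison of the kind such a decision tree rolls up (the probabilities and outcomes are invented):

    ```python
    # Two strategies, each with a probabilistic CO2-price scenario (p, NPV in M$).
    strategies = {
        "expand now": [(0.3, 180.0), (0.5, 90.0), (0.2, -40.0)],
        "defer 2 yr": [(0.3, 120.0), (0.5, 85.0), (0.2,  10.0)],
    }

    for name, branches in strategies.items():
        ev = sum(p * npv for p, npv in branches)
        downside = min(npv for _, npv in branches)
        print("%-10s  E[NPV] = %6.1f M$   worst case = %6.1f M$" % (name, ev, downside))
    ```

    Here the higher-EV strategy also carries the worse downside, which is exactly the upside/downside trade-off the probabilistic display makes visible to decision makers.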

  3. The next generation analysis methodology for cracked pipe systems subjected to dynamic loads

    International Nuclear Information System (INIS)

    Evaluation procedures for cracked piping systems under dynamic loads, seismic and water hammer in particular, have evolved over the years from assuming an instantaneous brittle double-ended pipe break in the early years of the nuclear industry, to using the peak load from a linear elastic analysis with elastic-plastic fracture mechanics over the past 10 years. With improvements in computing power and developments in the fracture mechanics analysis of cracked pipe, it is now possible, using workstation computers, to perform nonlinear time-history cracked-pipe analyses that can predict not only the maximum load, but also make reasonably accurate, slightly conservative predictions of the time to surface crack penetration, of how far the crack may propagate around the circumference, and of the crack-opening history for leak-rate, decompression, and jet-force calculations. This new analysis methodology, the so-called cracked-pipe-element nonlinear FEA fracture analysis, is currently being formulated and refined. The analytical methods and comparisons with experimental cracked-pipe data, as well as some specific applications illustrating the margins obtained by this methodology versus more traditional analyses, will be summarized.

  4. Symbolic Dynamics Analysis: a new methodology for foetal heart rate variability analysis

    OpenAIRE

    Improta, Giovanni

    2015-01-01

    Cardiotocography (CTG) is a widespread foetal diagnostic method. However, it lacks objectivity and reproducibility owing to its dependence on the observer's expertise. To overcome these limitations, more objective methods for CTG interpretation have been proposed. In particular, many developed techniques aim to assess the foetal heart rate variability (FHRV). Among them, some methodologies from nonlinear systems theory have been applied to the study of FHRV. All the techniques have proved to be ...

  5. Methodology for Analysis, Modeling and Simulation of Airport Gate-waiting Delays

    Science.gov (United States)

    Wang, Jianfeng

    This dissertation presents methodologies to estimate gate-waiting delays from historical data, to identify gate-waiting-delay functional causes in major U.S. airports, and to evaluate the impact of gate operation disruptions and mitigation strategies on gate-waiting delay. Airport gates are a source of congestion in the air transportation system. When an arriving flight cannot pull into its gate, the delay it experiences is called gate-waiting delay. Some possible reasons for gate-waiting delay are: the gate is occupied, gate staff or equipment is unavailable, the weather prevents the use of the gate (e.g., lightning), or the airline has a preferred gate assignment. Gate-waiting delays potentially stay with the aircraft throughout the day (unless they are absorbed), adding costs to passengers and the airlines. As the volume of flights increases, ensuring that airport gates do not become a choke point of the system is critical. The first part of the dissertation presents a methodology for estimating gate-waiting delays based on historical, publicly available sources. Analysis of gate-waiting delays at major U.S. airports in the summer of 2007 identifies the following. (i) Gate-waiting delay is not a significant problem on the majority of days; however, the worst delay days (e.g., 4% of the days at LGA) are extreme outliers. (ii) The Atlanta International Airport (ATL), the John F. Kennedy International Airport (JFK), the Dallas/Fort Worth International Airport (DFW) and the Philadelphia International Airport (PHL) experience the highest gate-waiting delays among major U.S. airports. (iii) There is a significant gate-waiting-delay difference between airlines due to disproportional gate allocation. (iv) Gate-waiting delay is sensitive to the time of day and to schedule peaks. According to basic principles of queueing theory, gate-waiting delay can be attributed to over-scheduling, higher-than-scheduled arrival rate, longer-than-scheduled gate-occupancy time, and reduced gate
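
    A minimal sketch of the queueing-theory view invoked above, treating gates as the servers of an M/M/c queue and computing the mean wait from the Erlang-C formula; the arrival rate, occupancy time and gate count are illustrative, not values from the dissertation:

        import math

        def erlang_c_wait(arrival_rate, service_rate, servers):
            """Mean queueing delay Wq for an M/M/c queue (Erlang C)."""
            a = arrival_rate / service_rate      # offered load in Erlangs
            if a >= servers:
                return float("inf")              # unstable: demand exceeds capacity
            erlang = a ** servers / math.factorial(servers) * servers / (servers - a)
            p_wait = erlang / (sum(a ** k / math.factorial(k)
                                   for k in range(servers)) + erlang)
            return p_wait / (servers * service_rate - arrival_rate)

        # 30 arrivals/hour, 50-minute mean occupancy (1.2 turns/hour/gate), 28 gates
        print(f"mean gate wait: {60 * erlang_c_wait(30.0, 1.2, 28):.1f} minutes")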

  6. A methodology for the analysis and improvement of a firm´s competitiveness

    Directory of Open Access Journals (Sweden)

    Jose Celso Contador

    2006-01-01

    Full Text Available This paper presents a new methodology for the analysis of a group of companies, aiming at explaining and increasing a firm's competitiveness. Based on the model of the fields and weapons of the competition, the methodology distinguishes between business and operational competitive strategies. The former consists of some of the 15 fields of the competition, and the latter consists of the weapons of the competition. Competitiveness is explained through the application of several mathematical variables. The influence of the competitive strategies is statistically evaluated using the Wilcoxon-Mann-Whitney non-parametric test, the t-test, and Pearson's correlation. The methodology was applied to companies belonging to the textile pole of Americana; one of the conclusions reached is that what explains competitiveness is the operational strategy rather than the business strategy. Therefore, to improve competitiveness, a company must intensify its focus on weapons that are relevant to the fields where it decided to compete.
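
    A minimal sketch of the statistical battery named above (Wilcoxon-Mann-Whitney, t-test, Pearson correlation) run with scipy on made-up samples; none of the data relate to the Americana textile pole study:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Hypothetical competitiveness scores for two groups of firms
        group_a = rng.normal(loc=7.0, scale=1.0, size=20)
        group_b = rng.normal(loc=5.5, scale=1.2, size=20)
        # Hypothetical per-firm "weapon intensity" paired with group_a's scores
        weapon_intensity = 0.8 * group_a + rng.normal(0.0, 0.5, size=20)

        u, p_u = stats.mannwhitneyu(group_a, group_b)
        t, p_t = stats.ttest_ind(group_a, group_b)
        r, p_r = stats.pearsonr(weapon_intensity, group_a)
        print(f"Mann-Whitney p={p_u:.4f}, t-test p={p_t:.4f}, "
              f"Pearson r={r:.2f} (p={p_r:.4f})")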

  7. Application of fault tree methodology in the risk analysis of complex systems

    International Nuclear Information System (INIS)

    This study describes the fault tree methodology and applies it to the risk assessment of complex facilities. In the description of the methodology, an attempt has been made to provide all the pertinent basic information, pointing out its more important aspects such as fault tree construction, evaluation techniques, and their use in the risk and reliability assessment of a system. In view of their importance, topics like common mode failures, human errors, the data bases used in the calculations, and the uncertainty evaluation of the results are discussed separately, each one in its own chapter. For the purpose of applying the methodology, it was necessary to implement computer codes normally used for this kind of analysis. The computer codes PREP, KITT and SAMPLE, written in FORTRAN IV, were chosen due to their availability and to the fact that they have been used in important studies of the nuclear area, like WASH-1400. With these codes, the probability of occurrence of excessive pressure in the main system of the component test loop (CTC) of CDTN was evaluated. (Author)
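
    A minimal sketch of a fault-tree top-event evaluation with independent basic events, using the standard AND/OR gate probability rules; the tree and the event probabilities are hypothetical and are not the CTC overpressure model:

        from math import prod

        def p_and(*probs):
            """AND gate: all independent input events must occur."""
            return prod(probs)

        def p_or(*probs):
            """OR gate: at least one independent input event occurs."""
            return 1.0 - prod(1.0 - p for p in probs)

        # Hypothetical tree: overpressure if the relief valve fails AND
        # (the pressure controller fails OR the operator misses the alarm).
        P_VALVE, P_CTRL, P_OPER = 1e-3, 5e-3, 1e-2
        top = p_and(P_VALVE, p_or(P_CTRL, P_OPER))
        print(f"top-event probability ~ {top:.2e}")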

  8. Multiscale Entropy Analysis of Center-of-Pressure Dynamics in Human Postural Control: Methodological Considerations

    Directory of Open Access Journals (Sweden)

    Brian J. Gow

    2015-11-01

    Full Text Available Multiscale entropy (MSE) is a widely used metric for characterizing the nonlinear dynamics of physiological processes. Significant variability, however, exists in the methodological approaches to MSE, which may ultimately impact results and their interpretations. Using publications focused on balance-related center of pressure (COP) dynamics, we highlight sources of methodological heterogeneity that can impact study findings. Seventeen studies were systematically identified that employed MSE for characterizing COP displacement dynamics. We identified five key methodological procedures that varied significantly between studies: (1) data length; (2) frequencies of the COP dynamics analyzed; (3) sampling rate; (4) point matching tolerance and sequence length; and (5) filtering of displacement changes from drifts, fidgets, and shifts. We discuss strengths and limitations of the various approaches employed and supply flowcharts to assist in the decision making process regarding each of these procedures. Our guidelines are intended to more broadly inform the design and analysis of future studies employing MSE for continuous time series, such as COP.
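
    A minimal sketch of the MSE computation the paper scrutinizes: coarse-grain the series at each scale, then take the sample entropy of each coarse-grained series. The choices m = 2 and r = 0.15 SD follow common conventions, the input is synthetic noise rather than COP data, and the naive pairwise comparison has O(N^2) memory cost:

        import numpy as np

        def sample_entropy(x, m=2, r=0.15):
            """SampEn(m, r) = -ln(A/B); r is a fraction of the signal SD."""
            x = np.asarray(x, dtype=float)
            tol = r * x.std()
            def match_pairs(length):
                t = np.array([x[i:i + length] for i in range(len(x) - length)])
                d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)  # Chebyshev
                return ((d <= tol).sum() - len(t)) / 2    # exclude self-matches
            b, a = match_pairs(m), match_pairs(m + 1)
            return -np.log(a / b) if a > 0 and b > 0 else np.nan

        def multiscale_entropy(x, scales=(1, 2, 5), m=2, r=0.15):
            out = []
            for s in scales:
                n = len(x) // s
                coarse = np.asarray(x[:n * s]).reshape(n, s).mean(axis=1)
                out.append(sample_entropy(coarse, m, r))
            return out

        rng = np.random.default_rng(1)
        print(multiscale_entropy(rng.normal(size=1000)))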

  9. Uncertainty quantification methodology development for the best-estimate safety analysis

    International Nuclear Information System (INIS)

    This study deals with two approaches to uncertainty quantification methodology. In the first approach, an uncertainty quantification methodology is proposed and applied to the estimation of nuclear reactor fuel peak cladding temperature (PCT) uncertainty. The proposed method adopts the use of Latin hypercube sampling (LHS). The independence of the input variables is verified through a correlation coefficient test. The uncertainty of the output variables is estimated through a goodness-of-fit test on the sample data. In the application, the approach taken to quantifying the total mean and total 95% probability PCTs is given. Emphasis is placed upon the PCT uncertainty estimation due to model or correlation uncertainties, with the assumption that the significant sources of PCT uncertainty have been determined. In the second approach, an uncertainty quantification methodology is proposed for severe accident analysis, which involves large uncertainties. The proposed method adopts the concept of the probabilistic belief measure to transform an analyst's belief on a top event into the equivalent probability of that top event. For the purpose of comparison, analyses are done by (1) applying probability theory, regarding the occurrence probability of the top event as a physical probability or a frequency, (2) applying fuzzy set theory with a fuzzy-numbered occurrence probability of the top event, and (3) transforming the analyst's belief on the top event into an equivalent probability by the probabilistic belief measure method
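
    A minimal sketch of the Latin hypercube sampling step, propagating two hypothetical model-parameter uncertainties through a stand-in PCT response; the bounds and response function are illustrative, not taken from the study (requires scipy >= 1.7 for scipy.stats.qmc):

        import numpy as np
        from scipy.stats import qmc

        sampler = qmc.LatinHypercube(d=2, seed=0)
        unit = sampler.random(n=200)              # stratified samples on [0, 1)^2
        # Hypothetical uncertain inputs: heat-transfer and decay-heat multipliers
        x = qmc.scale(unit, l_bounds=[0.8, 0.9], u_bounds=[1.2, 1.1])

        def pct_model(htc_mult, decay_mult):
            """Stand-in for the code-calculated PCT response (K)."""
            return 1100.0 + 250.0 * (decay_mult - 1.0) - 180.0 * (htc_mult - 1.0)

        pct = pct_model(x[:, 0], x[:, 1])
        print(f"mean PCT = {pct.mean():.1f} K, "
              f"95th percentile = {np.percentile(pct, 95):.1f} K")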

  10. Solar thermal technology development: Estimated market size and energy cost savings. Volume 2: Assumptions, methodology and results

    Science.gov (United States)

    Gates, W. R.

    1983-02-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Three fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. Solar thermal technology research and development (R&D) is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest. Analysis is also provided regarding two federal incentives currently in use: The Federal Business Energy Tax Credit and direct R&D funding.

  11. Development of CFD Analysis Methodology of Hydraulic Load Evaluation in POSRV Piping System

    International Nuclear Information System (INIS)

    APR1400 has been developed as an advanced light water reactor that adopts new technologies. One of the major technologies is the IRWST (In-containment Refueling Water Storage Tank) placed inside the containment. For this new arrangement, the POSRV-IRWST connecting line must remain safe when the POSRV (Pilot Operated Safety Relief Valve) is opened. Theoretical solutions and experimental data are needed to demonstrate structural integrity, but proven data are insufficient from the viewpoint of hydrodynamics. The hydrodynamic flow analysis and the thermodynamic behavior analysis should therefore be performed using CFD. The objective of this study is to develop a CFD analysis methodology for hydraulic load evaluation in the IRWST piping system. This method provides a basic hydraulic load evaluation for the POSRV piping system. It will also help to analyze the fluid-structure interaction and to predict special phenomena, and can therefore be used as a basis for the most suitable design

  12. Adaptation of SW-846 methodology for the organic analysis of radioactive mixed wastes

    International Nuclear Information System (INIS)

    Modifications to SW-846 sample preparation methodology permit the organic analysis of radioactive mixed waste with minimum personal radiation exposure and equipment contamination. This paper describes modifications to SW-846 methods 5030 and 3510-3550 for sample preparation in radiation-zoned facilities (hood, glove box, and hot cell) and GC-MS analysis of the decontaminated organic extracts in a conventional laboratory for volatile and semivolatile organics by methods 8240 and 8270 (respectively). Results will be presented from the analysis of nearly 70 nuclear waste storage tank liquids and 17 sludges. Regulatory organics do not account for the organic matter suggested to be present by total organic carbon measurements. 7 refs., 5 tabs

  13. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article, and fault tree analysis, deemed to be one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in a nuclear power plant is given to illustrate the proposed method.
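
    A minimal sketch of one simple failure-rate allocation for a series system, apportioning a system-level target among subsystems in proportion to weighting factors; the weights and target are hypothetical, and this is not the paper's dualistic-contrast model:

        # For a series system of exponential components, subsystem failure
        # rates add up to the system rate, so a target can be split by weight.
        SYSTEM_TARGET = 1e-4        # hypothetical target failure rate (per hour)
        weights = {                 # hypothetical relative complexity/duty weights
            "fuel_supply": 3.0,
            "starting_system": 2.0,
            "generator": 1.0,
        }
        total = sum(weights.values())
        allocation = {name: SYSTEM_TARGET * w / total
                      for name, w in weights.items()}
        for name, lam in allocation.items():
            print(f"{name}: allocated failure rate = {lam:.2e} /h")
        assert abs(sum(allocation.values()) - SYSTEM_TARGET) < 1e-12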

  14. Strengths and weakness of the methodologies for the innovative nuclear systems analysis

    International Nuclear Information System (INIS)

    At present, nuclear energy is in a transition period, and new efforts are under way to find ways to build more nuclear power. Because nuclear energy is an inherently multivariable system, judgments of its potential tend to evolve towards multi-criteria analysis methodologies for analyzing innovative nuclear systems for the future. Nuclear energy has been an active energy player for 50 years, and several times large efforts have been made to evaluate the potential of future nuclear energy development under different scenarios using multi-criteria analysis methods. Without attempting an assessment of what finally happened in the evolution of nuclear technology, but performing only a factual comparison and using only data available in the 1950s, this work analyzes whether multi-criteria analysis methods are sufficient to predict the final success of the currently well-established commercial reactors. The conclusion is that, if uncertainties are not included, the classical multi-criteria methodologies evaluated could not have been used to predict the successful deployment of the PWR, BWR and CANDU with the status of knowledge of 1956, without including other factors and external non-numerical judgment. Including uncertainties produces results compatible with the later historical evolution, but only if they are included with large margins and as a penalty in the figures of merit

  15. Performance of neutron activation analysis in the evaluation of bismuth iodide purification methodology

    Energy Technology Data Exchange (ETDEWEB)

    Armelin, Maria Jose A.; Ferraz, Caue de Mello; Hamada, Margarida M., E-mail: marmelin@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Lab. de Analise por Ativacao Neutronica

    2015-07-01

    Bismuth tri-iodide (BiI{sub 3}) is an attractive material for use as a semiconductor. In this paper, BiI{sub 3} crystals have been grown by the vertical Bridgman technique using commercially available powder. The impurities were evaluated by instrumental neutron activation analysis (INAA). The results show that INAA is an analytical method appropriate for monitoring the impurities Ag, As, Br, Cr, K, Mo, Na and Sb in the various stages of the BiI{sub 3} purification methodology. (author)

  16. Performance Evaluation and Measurement of the Organization in Strategic Analysis and Control: Methodological Aspects

    Directory of Open Access Journals (Sweden)

    Živan Ristić

    2006-12-01

    Full Text Available Information acquired by measuring and evaluation is a necessary condition for good decision-making in strategic management. This work deals with: (a) methodological aspects of evaluation (kinds of evaluation, metaevaluation) and measurement (supposition of isomorphism in measurement, kinds and levels of measurement, errors in measurement, and the basic characteristics of measurement); (b) evaluation and measurement of the potential and accomplishments of the organization in the Kaplan-Norton perspectives (the perspectives of learning and development, internal processes, the consumer/user, and the financial perspective); (c) systems and IT solutions for evaluating and measuring the performance of the organization in strategic analysis and control.

  17. Improved Methodology Application for 12-Rad Analysis in a Shielded Facility at SRS

    International Nuclear Information System (INIS)

    DOE Order 420.1 requires establishing 12-rad evacuation zone boundaries and installing a Criticality Accident Alarm System (CAAS) per the ANS-8.3 standard for facilities having a probability of criticality greater than 10^-6 per year. The H-Canyon at the Savannah River Site (SRS) is one of the reprocessing facilities where SRS reactor fuels, research reactor fuels, and other fissile materials are processed and purified using a modified Purex process called H-Modified, or the HM Process. This paper discusses an improved methodology for 12-rad zone analysis and its implementation within this large shielded facility, which has a large variety of criticality sources and scenarios

  18. Development of a methodology for strontium isotopic analysis on archaeological skeletal tissue

    International Nuclear Information System (INIS)

    Full text: Strontium isotope analysis of skeletal tissue provides information about human migration. By comparing 87Sr/86Sr values for human bone or dental tissue with the local strontium isotope signature, determined from faunal, agricultural and/or environmental samples, it may be possible to identify the migration movements of our ancestors. The present work describes the development of a methodology for bone tissue preparation prior to the measurement of strontium isotope ratios by MC-ICPMS. Different extraction procedures were evaluated on old bone samples for the suitable separation of the diagenetic and non-diagenetic strontium. (author)

  19. A Methodology to Evaluate Object oriented Software Systems Using Change Requirement Traceability Based on Impact Analysis

    Directory of Open Access Journals (Sweden)

    Sunil T. D

    2014-06-01

    Full Text Available It is a well-known fact that software maintenance plays a major role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change-requirement-traceability-based impact analysis methodology for non-functional requirements, using functional requirements. The major issues have been related to change impact algorithms and inheritance of functionality.

  20. Performance of neutron activation analysis in the evaluation of bismuth iodide purification methodology

    International Nuclear Information System (INIS)

    Bismuth tri-iodide (BiI3) is an attractive material for use as a semiconductor. In this paper, BiI3 crystals have been grown by the vertical Bridgman technique using commercially available powder. The impurities were evaluated by instrumental neutron activation analysis (INAA). The results show that INAA is an analytical method appropriate for monitoring the impurities Ag, As, Br, Cr, K, Mo, Na and Sb in the various stages of the BiI3 purification methodology. (author)

  1. A Closed-Loop Optimal Neural-Network Controller to Optimize Rotorcraft Aeromechanical Behaviour. Volume 1; Theory and Methodology

    Science.gov (United States)

    Leyland, Jane Anne

    2001-01-01

    Given the predicted growth in air transportation, the potential exists for significant market niches for rotary wing subsonic vehicles. Technological advances which optimise rotorcraft aeromechanical behaviour can contribute significantly to both their commercial and military development, acceptance, and sales. Examples of the optimisation of rotorcraft aeromechanical behaviour which are of interest include the minimisation of vibration and/or loads. The reduction of rotorcraft vibration and loads is an important means to extend the useful life of the vehicle and to improve its ride quality. Although vibration reduction can be accomplished by using passive dampers and/or tuned masses, active closed-loop control has the potential to reduce vibration and loads throughout a wider flight regime whilst adding less weight to the aircraft than that obtained by using passive methods. It is emphasised that the analysis described herein is applicable to all those rotorcraft aeromechanical behaviour optimisation problems for which the relationship between the harmonic control vector and the measurement vector can be adequately described by a neural-network model.

  2. Methodological approaches to analysis of agricultural countermeasures on radioactive contaminated areas: Estimation of effectiveness and comparison of different alternatives

    DEFF Research Database (Denmark)

    Yatsalo, B.I.; Hedemann Jensen, P.; Alexakhin, R.M.

    Methodological aspects of countermeasure analysis in the long-term period after a nuclear accident are discussed, using agricultural countermeasures for illustrative purposes. The estimates of effectiveness for specific countermeasures, as well as methods of justified action level assessments and...

  3. The API methodology for risk-based inspection (RBI) analysis for the petroleum and petrochemical industry

    International Nuclear Information System (INIS)

    Twenty-one petroleum and petrochemical companies are currently sponsoring a project within the American Petroleum Institute (API) to develop a risk-based inspection (RBI) methodology for application in the refining and petrochemical industry. This paper describes that RBI methodology and provides a summary of the three levels of RBI analysis developed by the project. Also included is a review of the first pilot project to validate the methodology by applying RBI to several existing refining units. The failure of pressure equipment in a process unit can have several undesirable effects. For the purpose of RBI analysis, the API RBI program categorizes these effects into four basic risk outcomes: flammable events, toxic releases, major environmental damage, and business interruption losses. API RBI is a strategic process, both qualitative and quantitative, for understanding and reducing the risks associated with operating pressure equipment. This paper shows how API RBI assesses the potential consequences of a failure of the pressure boundary, as well as the likelihood (probability) of failure. Risk-based inspection also prioritizes risk levels in a systematic manner so that the owner-user can plan an inspection program that focuses more resources on the higher-risk equipment, while possibly saving inspection resources where they are not doing an effective job of reducing risk. At the same time, if the consequence of failure is a significant driving force for high-risk equipment items, plant management also has the option of applying consequence mitigation steps to minimize the impact of a hazardous release, should one occur. The target audience for this paper is engineers, inspectors, and managers who want to understand what API risk-based inspection is all about, what the benefits and limitations of RBI are, and how inspection practices can be changed to reduce risks and/or save costs without impacting safety risk. (Author)
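
    A minimal sketch of the likelihood-times-consequence prioritization that RBI rests on; the equipment items, failure likelihoods and consequence scores are invented for illustration and do not come from the API methodology:

        # Rank equipment by risk = likelihood of failure x consequence of failure.
        # Hypothetical annual failure likelihoods and consequence scores ($MM).
        equipment = [
            ("crude_column",    2e-3, 40.0),
            ("reboiler",        8e-3, 5.0),
            ("overhead_drum",   1e-3, 12.0),
            ("transfer_piping", 5e-3, 20.0),
        ]

        ranked = sorted(equipment, key=lambda e: e[1] * e[2], reverse=True)
        print("inspection priority (highest risk first):")
        for name, likelihood, consequence in ranked:
            print(f"  {name}: risk = {likelihood * consequence:.3f} $MM/yr")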

  4. Criticality Safety Evaluation of a Swiss wet storage pool using a global uncertainty analysis methodology

    International Nuclear Information System (INIS)

    Highlights: • Uncertainty evaluation of manufacturing tolerances is relevant for wet storage pools. • Analyses based on global modelling can over-estimate the conservatism. • It is important to independently perturb the elements related to the storage racks. • PDF sets based on conventional assumptions could be excessively conservative. • We suggest caution in applying the standard approach when special PDFs are involved. - Abstract: Uncertainty quantification is a key component in the Criticality Safety Evaluation (CSE) of spent nuclear fuel systems. An important source of uncertainties is manufacturing and technological parameter tolerances. In this work, this class of uncertainties is evaluated for a Swiss wet storage pool. The selected configuration corresponds to a case where the target criticality eigenvalue is close to the upper criticality safety limits. Although the current PSI CSE safety criteria are fulfilled, it is reasonable to apply uncertainty quantification methodologies in order to provide the regulatory authorities with additional information relevant for safety evaluations. The MTUQ (Manufacturing and Technological Uncertainty Quantification) methodology, based on global stochastic sampling, was the selected tool for the analysis. This tool is specifically designed for the treatment of geometrical/material uncertainties for any target system. In particular, the advanced modelling capability of MTUQ allows the implementation of realistic boundary conditions, with a resulting detailed evaluation of the statistical quantities of interest in CSE. The computational code employed is the MCNP Monte Carlo based neutron transport code. The analysis showed the benefits of using realistic modelling compared to the traditional one-factor-at-a-time methodology applied to systems modelled using repeated structures. A detailed comparison between the two approaches is also presented. Finally, the paper discusses the role of asymmetrical probability distribution

  5. Effects of immersion on visual analysis of volume data.

    Science.gov (United States)

    Laha, Bireswar; Sensharma, Kriti; Schiffbauer, James D; Bowman, Doug A

    2012-04-01

    Volume visualization has been widely used for decades for analyzing datasets ranging from 3D medical images to seismic data to paleontological data. Many have proposed using immersive virtual reality (VR) systems to view volume visualizations, and there is anecdotal evidence of the benefits of VR for this purpose. However, there has been very little empirical research exploring the effects of higher levels of immersion for volume visualization, and it is not known how various components of immersion influence the effectiveness of visualization in VR. We conducted a controlled experiment in which we studied the independent and combined effects of three components of immersion (head tracking, field of regard, and stereoscopic rendering) on the effectiveness of visualization tasks with two x-ray microscopic computed tomography datasets. We report significant benefits of analyzing volume data in an environment involving those components of immersion. We find that the benefits do not necessarily require all three components simultaneously, and that the components have variable influence on different task categories. The results of our study improve our understanding of the effects of immersion on perceived and actual task performance, and provide guidance on the choice of display systems to designers seeking to maximize the effectiveness of volume visualization applications. PMID:22402687

  6. Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor

    Science.gov (United States)

    Mkhabela, Peter Tshepo

    The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be an HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding, since it requires proper treatment of both slower and much longer transients (of time scale in hours and days) and fast and short transients (of time scale in minutes and seconds). Limited operational and experimental data are available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study contributed to greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and the various multi-physics effects that can influence it. Feedback due to the influence of leakage was taken into account through the development and implementation of improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate, with proper interpolation from cross-section tables. To achieve this, the NEM/THERMIX coupled code system was developed to create a system that is efficient and stable over the duration of transient calculations that last over several tens of hours. Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis

  7. Optimization of coagulation-flocculation process for pulp and paper mill effluent by response surface methodological analysis.

    Science.gov (United States)

    Ahmad, A L; Wong, S S; Teng, T T; Zuhairi, A

    2007-06-25

    Coagulation-flocculation is a proven technique for the treatment of wastewater with high suspended solids. In this study, the central composite face-centered design (CCFD) and response surface methodology (RSM) have been applied to optimize the two most important operating variables, coagulant dosage and pH, in the coagulation-flocculation process for pulp and paper mill wastewater treatment. Treated wastewater with high total suspended solids (TSS) removal, a low sludge volume index (SVI) and high water recovery are the main objectives to be achieved through the coagulation-flocculation process. The effects of the interaction between coagulant dosage and pH on TSS removal and SVI are significant, whereas there is no interaction between coagulant dosage and water recovery. Quadratic models have been developed for the response variables, i.e. TSS removal, SVI and water recovery, based on the high coefficient of determination (R(2)) value of >0.99 obtained from the analysis of variances (ANOVA). The optimum conditions for coagulant dosage and pH are 1045 mg L(-1) and 6.75, respectively, where 99% TSS removal, an SVI of 37 mL g(-1) and 82% water recovery can be obtained. PMID:17161910
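
    A minimal sketch of fitting the kind of two-factor quadratic response-surface model described above by ordinary least squares; the coded design points and responses are fabricated stand-ins for the CCFD data:

        import numpy as np

        # Hypothetical coded design (dosage, pH) and measured TSS removal (%)
        x1 = np.array([-1, 1, -1, 1, -1, 1, 0, 0, 0])   # coagulant dosage (coded)
        x2 = np.array([-1, -1, 1, 1, 0, 0, -1, 1, 0])   # pH (coded)
        y = np.array([72, 81, 78, 93, 80, 92, 79, 90, 95])

        # Quadratic RSM model:
        # y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
        X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(dict(zip(["b0", "b1", "b2", "b12", "b11", "b22"],
                       np.round(coef, 2))))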

  8. Analysis on volume grating induced by femtosecond laser pulses.

    Science.gov (United States)

    Zhou, Keya; Guo, Zhongyi; Ding, Weiqiang; Liu, Shutian

    2010-06-21

    We report on a kind of self-assembled volume grating in silica glass induced by tightly focused femtosecond laser pulses. The formation of the volume grating is attributed to multiple microexplosions in the transparent material induced by the femtosecond pulses. The first-order diffraction efficiency depends strongly on the pulse energy and the laser scanning velocity, and reaches as high as 30%. The diffraction pattern of the fabricated grating is numerically simulated and analyzed by a two-dimensional FDTD method and the Fresnel diffraction integral. The numerical results confirm our prediction of the formation of the volume grating, which agrees well with our experimental results. PMID:20588497

  9. Adding value in oil and gas by applying decision analysis methodologies: case history

    Energy Technology Data Exchange (ETDEWEB)

    Marot, Nicolas [Petro Andina Resources Inc., Alberta (Canada); Francese, Gaston [Tandem Decision Solutions, Buenos Aires (Argentina)

    2008-07-01

    Petro Andina Resources Ltd., together with Tandem Decision Solutions, developed a strategic long-range plan applying decision analysis methodology. The objective was to build a robust and fully integrated strategic plan that accomplishes company growth goals and sets the strategic directions for the long range. The stochastic methodology and the Integrated Decision Management (IDM{sup TM}) staged approach allowed the company to visualize the value and risk associated with the different strategies while achieving organizational alignment, clarity of action and confidence in the path forward. A decision team jointly involving PAR representatives and Tandem consultants was established to carry out this four-month project. Discovery and framing sessions allowed the team to disrupt the status quo, discuss near- and far-reaching ideas, and gather the building blocks from which creative strategic alternatives were developed. A comprehensive stochastic valuation model was developed to assess the potential value of each strategy, applying simulation tools, sensitivity analysis tools and contingency planning techniques. The final insights and results were used to populate the final strategic plan presented to the company board, providing confidence to the team and assuring that the work embodies the best available ideas, data and expertise, and that the proposed strategy was ready to be elaborated into an optimized course of action. (author)

  10. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map, with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality assumption have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, the Granger causality test, on 45 data points. However, the causality models were found to be insufficiently established, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for situations where historical data are insufficient. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts through three rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Evidence of bidirectional causality, demonstrating significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality, and its robustness and validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using SD methodology, where very limited work has been done.
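
    A minimal sketch of a pairwise Granger causality test of the kind described, using statsmodels on synthetic series; the lag order and the data are illustrative, not the study's 45 BSC data points:

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(0)
        n = 200
        x = rng.normal(size=n)
        y = np.zeros(n)
        for t in range(1, n):     # y depends on lagged x: x should Granger-cause y
            y[t] = 0.6 * x[t - 1] + 0.2 * y[t - 1] + rng.normal(scale=0.5)

        data = pd.DataFrame({"y": y, "x": x})
        # Tests whether the second column helps predict the first, up to maxlag lags
        grangercausalitytests(data[["y", "x"]], maxlag=2)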

  11. Methodological basis for analysis and accounting of NPP probabilistic safety analysis uncertainties

    International Nuclear Information System (INIS)

    The paper presents a classification of NPP probabilistic safety analysis uncertainties and defines their main sources. It sets forth methods to perform statistical and analytical analysis of the different uncertainty classes and proposes a sequence of efforts related to the analysis and accounting of uncertainties in making decisions on NPP safety

  12. Nonlinear Structural Analysis Methodology and Dynamics Scaling of Inflatable Parabolic Reflector Antenna Concepts

    Science.gov (United States)

    Sreekantamurthy, Tham; Gaspar, James L.; Mann, Troy; Behun, Vaughn; Pearson, James C., Jr.; Scarborough, Stephen

    2007-01-01

    Ultra-lightweight and ultra-thin membrane inflatable antenna concepts are fast evolving to become the state-of-the-art antenna concepts for deep-space applications. NASA Langley Research Center has been involved in structural dynamics research on antenna structures. One of the goals of the research is to develop a structural analysis methodology for prediction of the static and dynamic response characteristics of the inflatable antenna concepts. This research is focused on computational studies that use nonlinear large-deformation finite element analysis to characterize the ultra-thin membrane responses of the antennas. Recently, structural analyses have been performed on a few parabolic reflector antennas of varying size and shape, referred to in the paper as the 0.3-meter subscale, 2-meter half-scale, and 4-meter full-scale antennas. The various aspects studied included the nonlinear analysis methodology and solution techniques, ways to speed convergence in iterative methods, the sensitivities of responses with respect to structural loads (such as inflation pressure, gravity, and pretension loads) in ground and in-space conditions, and the ultra-thin membrane wrinkling characteristics. Several such intrinsic aspects studied have provided valuable insight into the evaluation of structural characteristics of such antennas. While analyzing these structural characteristics, a quick study was also made to assess the applicability of dynamics scaling of the half-scale antenna. This paper presents the details of the nonlinear structural analysis results, and discusses the insight gained from the studies on the various intrinsic aspects of the analysis methodology. The predicted reflector surface characteristics of the three inflatable ultra-thin membrane parabolic reflector antenna concepts are presented as easily observable displacement fringe patterns with associated maximum values, and normal mode shapes and associated frequencies. Wrinkling patterns are

  13. Development of core thermal hydraulic analysis methodology using multichannel code system

    International Nuclear Information System (INIS)

    A multi-channel core analysis model using the subchannel code TORC is developed to improve the thermal margin, and is assessed and compared with the existing single-channel analysis model. To apply the TORC code to the W-type reactor core, a hot subchannel DNBR analysis model is developed using a lumping technique. In addition, sensitivity studies of TORC with respect to various models and input parameters are carried out to appreciate the code characteristics. The developed core analysis model is applied to the evaluation of the thermal margin for the 17 x 17 KOFA-loaded core. For this calculation, the KRB1 CHF correlation is developed on the basis of W and Siemens bundle CHF data, and the DNB design limit is established using the STDP method. From the results of the steady-state and transient analyses of the 17 x 17 KOFA-loaded core, it is found that an extra 10% DNBR margin can be obtained compared with the existing single-channel analysis methodology. (Author) 65 figs., 12 tabs

  14. Task analysis of nuclear-power-plant control-room crews: project approach methodology

    International Nuclear Information System (INIS)

    A task analysis of nuclear-power-plant control-room crews was performed by General Physics Corporation and BioTechnology, Inc., for the Office of Nuclear Regulatory Research. The task-analysis methodology used in the project is discussed and compared to traditional task-analysis and job-analysis methods. The objective of the project was to conduct a crew task analysis that would provide data for evaluating six areas: (1) human-engineering design of control rooms and retrofitting of current control rooms; (2) the numbers and types of control-room operators needed, with requisite skills and knowledge; (3) operator qualification and training requirements; (4) normal, off-normal, and emergency operating procedures; (5) job-performance aids; and (6) communications. The data-collection approach focused on a generic structural framework for assembling the multitude of task data that were observed. The results of the data-collection effort were compiled in a computerized task database. Six demonstrations of suitability analysis were subsequently conducted, one in each of the above areas, and are described in this report

  15. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans.

    Science.gov (United States)

    Bernaldo de Quirós, Yara; González-Díaz, Oscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D; Mazzariol, Sandro; Di Guardo, Giovanni; Fernández, Antonio

    2011-01-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen. PMID:22355708

  16. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans

    Science.gov (United States)

    de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; di Guardo, Giovanni; Fernández, Antonio

    2011-12-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen.

  17. Patient's radioprotection and analysis of DPC practices and certification of health facilities - Methodological guide

    International Nuclear Information System (INIS)

    This methodological guide has been published in compliance with French and European regulatory texts to define the modalities of implementation of the assessment of clinical practices resulting in exposure to ionizing radiation in the medical environment (radiotherapy, radio-surgery, interventional radiology, nuclear medicine), to promote clinical audits, and to ease the implementation of programs of continuous professional development in radiotherapy, radiology and nuclear medicine. This guide proposes an analysis of professional practices through analysis sheets which address several aspects: scope, practice data, objectives in terms of improvement of radiation protection, regulatory and institutional references, operational objectives, methods, approaches and tools, follow-up indicators, actions to improve practices, professional target, collective approach, program organisation, and program valorisation in existing arrangements. It also gives 20 program proposals which notably aim at continuous professional development, 5 of them dealing with diagnosis-oriented imaging examinations, 9 with radiology and risk management, 4 with radiotherapy, and 2 with nuclear medicine

  18. MossWinn—methodological advances in the field of Mössbauer data analysis

    International Nuclear Information System (INIS)

    The methodology of Mössbauer data analysis has been advanced via the development of a novel scientific database system concept and its realization in the field of Mössbauer spectroscopy, as well as by the application of parallel computing techniques for the enhancement of the efficiency of various processes encountered in the practice of Mössbauer data handling and analysis. The present article describes the new database system concept along with details of its realization in the form of the MossWinn Internet Database (MIDB), and illustrates the performance advantage that may be realized on multi-core processor systems by the application of parallel algorithms for the implementation of database system functions.

  19. A SAS2H/KENO-V Methodology for 3D Full Core depletion analysis

    International Nuclear Information System (INIS)

    This paper describes the use of a SAS2H/KENO-V methodology for 3D full core depletion analysis and illustrates its capabilities by applying it to burnup analysis of the IRIS core benchmarks. This new SAS2H/KENO-V sequence combines a 3D Monte Carlo full core calculation of the node power distribution with a 1D Wigner-Seitz equivalent cell transport method for independent depletion calculation of each of the nodes. This approach reduces the time required to obtain comparable results by more than an order of magnitude relative to the MOCUP code system. The SAS2H/KENO-V results for the asymmetric IRIS core benchmark are in good agreement with the results of the ALPHA/PHOENIX/ANC code system. (author)

  20. Methodology of analysis of sustainable development of Ukraine by using the theory of fuzzy logic

    Directory of Open Access Journals (Sweden)

    2016-02-01

    Full Text Available The objective of this article is to analyse theoretical and methodological aspects of the assessment of sustainable development in times of crisis. A methodical approach to the analysis of the sustainable development of a territory, taking into account an assessment of the level of economic security, is proposed. The necessity of developing a complex methodical approach that accounts for indeterminacy and multi-criteria properties in the tasks of ensuring economic security, on the basis of fuzzy logic theory (the fuzzy set theory), is demonstrated. The results of using the fuzzy set method to trace the dynamics of sustainable development in Ukraine during the years 2002-2012 are presented.

  1. Thermodynamic analysis of a Stirling engine including dead volumes of hot space, cold space and regenerator

    Energy Technology Data Exchange (ETDEWEB)

    Kongtragool, Bancha; Wongwises, Somchai [Fluid Mechanics, Thermal Engineering and Multiphase Flow Research Laboratory (FUTURE), Department of Mechanical Engineering, Faculty of Engineering, King Mongkut' s University of Technology Thonburi, 91 Suksawas 48, Bangmod, Bangkok 10140 (Thailand)

    2006-03-01

    This paper provides a theoretical investigation of the thermodynamic analysis of a Stirling engine. An isothermal model is developed for an imperfect-regeneration Stirling engine with dead volumes of the hot space, cold space and regenerator, in which the regenerator effective temperature is the arithmetic mean of the heater and cooler temperatures. Numerical simulation is performed and the effects of the regenerator effectiveness and dead volumes are studied. Results from this study indicate that the engine net work is affected only by the dead volumes, while the heat input and engine efficiency are affected by both the regenerator effectiveness and the dead volumes. The engine net work decreases with increasing dead volume. The heat input increases with increasing dead volume and decreasing regenerator effectiveness. The engine efficiency decreases with increasing dead volume and decreasing regenerator effectiveness. (author)
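
    A minimal Schmidt-type sketch of an isothermal Stirling cycle with dead volumes, integrating p dV numerically over one crank revolution; the gas mass, volumes, temperatures and phase angle are illustrative, and this is a simplified stand-in rather than the paper's exact model:

        import numpy as np

        R, M = 287.0, 1e-3            # gas constant (J/kg.K) and gas mass (kg)
        TH, TC = 900.0, 300.0         # hot and cold space temperatures (K)
        TR = 0.5 * (TH + TC)          # regenerator effective temperature (mean)
        VSW = 1e-4                    # swept volume of each space (m^3)
        VD_H, VD_C, VD_R = 2e-5, 2e-5, 3e-5   # dead volumes (m^3)
        PHASE = np.pi / 2             # cold space lags the hot space by 90 deg

        theta = np.linspace(0.0, 2.0 * np.pi, 2001)
        vh = VD_H + 0.5 * VSW * (1.0 + np.cos(theta))
        vc = VD_C + 0.5 * VSW * (1.0 + np.cos(theta - PHASE))
        # Isothermal ideal gas: fixed total mass spread over the three spaces
        p = M * R / (vh / TH + vc / TC + VD_R / TR)

        w_net = np.trapz(p, vh + vc)  # cyclic integral of p dV = net work
        print(f"net indicated work per cycle ~ {w_net:.2f} J")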

  2. Extending the input–output energy balance methodology in agriculture through cluster analysis

    International Nuclear Information System (INIS)

    The input–output balance methodology has been applied to characterize the energy balance of agricultural systems. This study proposes to extend this methodology with the inclusion of multivariate analysis to reveal particular patterns in the energy use of a system. The objective was to demonstrate the usefulness of multivariate exploratory techniques for analyzing the variability found in a farming system and establishing efficiency categories that can be used to improve the energy balance of the system. To this purpose, an input–output analysis was applied to the major greenhouse tomato production area in Colombia. Individual energy profiles were built and the k-means clustering method was applied to the production factors. On average, the production system in the study zone consumes 141.8 GJ ha−1 to produce 96.4 GJ ha−1, resulting in an energy efficiency of 0.68. With the k-means clustering analysis, three clusters of farmers were identified with energy efficiencies of 0.54, 0.67 and 0.78. The most energy-efficient cluster grouped 56.3% of the farmers. It is possible to optimize the production system by improving the management practices of those with the lowest energy use efficiencies. Multivariate analysis techniques proved to be a complementary pathway for improving the energy efficiency of a system. -- Highlights: ► An input–output energy balance was estimated for greenhouse tomatoes in Colombia. ► We used the k-means clustering method to classify growers based on their energy use. ► Three clusters of growers were found with energy efficiencies of 0.54, 0.67 and 0.78. ► Overall system optimization is possible by improving the energy use of the less efficient growers.
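
    A minimal sketch of the clustering step, grouping hypothetical per-farm energy profiles into three efficiency categories with scikit-learn's k-means; the feature values are invented, not the Colombian survey data:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(42)
        # Hypothetical per-farm profiles: energy input and output (GJ/ha)
        e_in = rng.normal(loc=140.0, scale=25.0, size=60)
        e_out = e_in * rng.uniform(0.5, 0.8, size=60)   # efficiencies ~0.5-0.8
        profiles = np.column_stack([e_in, e_out])

        km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(profiles)
        for k in range(3):
            eff = e_out[km.labels_ == k] / e_in[km.labels_ == k]
            print(f"cluster {k}: {(km.labels_ == k).sum()} farms, "
                  f"mean efficiency {eff.mean():.2f}")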

  3. Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks

    Science.gov (United States)

    Brown, Richard Lee

    2008-01-01

    Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks; it represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology

  4. Automated segmentation and dose-volume analysis with DICOMautomaton

    Science.gov (United States)

    Clark, H.; Thomas, S.; Moiseenko, V.; Lee, R.; Gill, B.; Duzenli, C.; Wu, J.

    2014-03-01

    Purpose: Exploration of historical data for regional organ dose sensitivity is limited by the effort needed to (sub-)segment large numbers of contours. A system has been developed which can rapidly perform autonomous contour sub-segmentation and generic dose-volume computations, substantially reducing the effort required for exploratory analyses. Methods: A contour-centric approach is taken, which enables lossless, reversible segmentation and dramatically reduces computation time compared with voxel-centric approaches. Segmentation can be specified on a per-contour, per-organ, or per-patient basis, and can be performed along either an embedded plane or in terms of the contour's bounds (e.g., splitting an organ into fractional-volume/dose pieces along any 3D unit vector). More complex segmentation techniques are available. Anonymized data from 60 head-and-neck cancer patients were used to compare dose-volume computations with Varian's Eclipse(TM) (Varian Medical Systems, Inc.). Results: Mean doses and dose-volume histograms computed agree strongly with Varian's Eclipse(TM). Contours which have been segmented can be injected back into patient data permanently and in a Digital Imaging and Communications in Medicine (DICOM)-conforming manner. Lossless segmentation persists across such injection, and remains fully reversible. Conclusions: DICOMautomaton allows researchers to rapidly, accurately, and autonomously segment large amounts of data into intricate structures suitable for analyses of regional organ dose sensitivity.
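
    A minimal sketch of the generic dose-volume computation mentioned above, building a cumulative dose-volume histogram (DVH) from an array of per-voxel doses; the dose values are synthetic and this is not DICOMautomaton's implementation:

        import numpy as np

        rng = np.random.default_rng(7)
        voxel_doses = rng.gamma(shape=9.0, scale=5.0, size=100_000)  # doses (Gy)

        def cumulative_dvh(doses, bin_width=1.0):
            """Fraction of structure volume receiving at least each dose level."""
            levels = np.arange(0.0, doses.max() + bin_width, bin_width)
            return levels, np.array([(doses >= d).mean() for d in levels])

        levels, volume_fraction = cumulative_dvh(voxel_doses)
        d95 = np.percentile(voxel_doses, 5)  # dose covering 95% of the volume
        print(f"mean dose = {voxel_doses.mean():.1f} Gy, D95 = {d95:.1f} Gy")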

  5. CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL

    Energy Technology Data Exchange (ETDEWEB)

    Vinson, D.

    2010-07-11

    Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor-quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF, even with severe cladding breaches, for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U.S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and

  6. Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh

    2010-10-01

    The Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Based on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized: (1) subjective judgement in the PIRT process; (2) high cost, due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence and use of the same numerical grids for both scaled experiments and real plant applications; and (5) user effects. Although a large amount of effort has gone into improving the CSAU methodology, these issues persist. With the effort to develop next-generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the derivative of the physical solution with respect to any constant parameter). When parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) quantifying numerical errors: new codes which are fully implicit and of higher-order accuracy can run much faster, with numerical errors quantified by FSA; (2) quantitative PIRT (Q-PIRT) to reduce subjective judgement and improve efficiency: treat numerical errors as special sensitivities against other physical uncertainties; only parameters having large uncertainty effects on design criteria are considered; (3) greatly reducing computational costs for uncertainty quantification by (a) choosing optimized time steps and spatial sizes; (b) using gradient information
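
    The FSA idea can be sketched for a single ODE (a generic illustration, not the authors' code): for du/dt = f(u, p), the sensitivity s = du/dp satisfies ds/dt = (df/du)s + df/dp and can be integrated alongside the state:

        # Generic forward-sensitivity sketch: solve du/dt = -p*u together with
        # its sensitivity s = du/dp, which obeys ds/dt = -p*s - u, and compare
        # against the analytic result du/dp = -t*exp(-p*t).
        import numpy as np
        from scipy.integrate import solve_ivp

        p = 2.0

        def rhs(t, y):
            u, s = y
            return [-p * u, -p * s - u]   # state equation, sensitivity equation

        sol = solve_ivp(rhs, (0.0, 1.0), [1.0, 0.0], rtol=1e-10, atol=1e-12)
        u_end, s_end = sol.y[:, -1]
        print(s_end, -1.0 * np.exp(-p))   # numerical vs analytic du/dp at t = 1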

  7. SCALE-4 analysis of pressurized water reactor critical configurations. Volume 2: Sequoyah Unit 2 Cycle 3

    International Nuclear Information System (INIS)

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized-water reactors. The analysis methodology selected for all the calculations reported herein is based on the codes and data provided in the SCALE-4 code system. This volume of the report documents the SCALE system analysis of three reactor critical configurations for Sequoyah Unit 2 Cycle 3. This unit and cycle were chosen because of their relevance to spent fuel benchmark applications: (1) the unit had a long downtime of 2.7 years during the middle of cycle (MOC) 3, and (2) the core consisted entirely of burned fuel at the MOC restart. The first benchmark critical calculation was the MOC restart at hot, full-power (HFP) critical conditions. The other two benchmark critical calculations were the beginning-of-cycle (BOC) startup at both hot, zero-power (HZP) and HFP critical conditions. These latter calculations were used to check for consistency in the calculated results for different burnups and downtimes. The keff results were in the range of 1.00014 to 1.00259, with a standard deviation of less than 0.001.

  8. National Waste Repository Novi Han operational safety analysis report. Safety assessment methodology

    International Nuclear Information System (INIS)

    The scope of the safety assessment (SA) presented includes: waste management functions (acceptance, conditioning, storage, disposal), inventory (current and expected in the future), hazards (radiological and non-radiological), and normal and accident modes. The stages in the development of the SA are: criteria selection, information collection, safety analysis, and safety assessment documentation. After reviewing the facility's functions and the national and international requirements, the criteria for safety level assessment are set. As a result of the second stage, the actual parameters of the facility necessary for the safety analysis are obtained. The methodology is selected on the basis of the comparability of the results with those of previous safety assessments and with existing standards and requirements. The procedure and requirements for scenario selection are described. A radiological hazard categorisation of the facilities is presented. A qualitative hazard and operability analysis is applied. The resulting list of events is subjected to a prioritisation procedure by the method of 'criticality analysis', so that an estimate of the risk is given for each event. The events whose risk falls on the boundary of acceptability, or is unacceptable, are subjected to the next steps of the analysis. As a result, lists of scenarios for PSA and possible design scenarios are established. PSA logical modeling and quantitative calculations of accident sequences are presented.

  9. A Methodology for the Analysis and Selection of Alternatives for the Disposition of Surplus Plutonium

    International Nuclear Information System (INIS)

    The Department of Energy (DOE) - Office of Fissile Materials Disposition (OFMD) has announced a Record of Decision (ROD) selecting alternatives for disposition of surplus plutonium. A major objective of this decision was to further U.S. efforts to prevent the proliferation of nuclear weapons. Other concerns that were addressed include economic, technical, institutional, schedule, environmental, and health and safety issues. The technical, environmental, and nonproliferation analyses supporting the ROD are documented in three DOE reports (DOE-TSR 96, DOE-PEIS 96, and DOE-NN 97, respectively). At the request of OFMD, a team of analysts from the Amarillo National Resource Center for Plutonium (ANRCP) provided an independent evaluation of the alternatives for plutonium that were considered during the evaluation effort. This report outlines the methodology used by the ANRCP team. This methodology, referred to as multiattribute utility theory (MAU), provides a structure for assembling the results of detailed technical, economic, schedule, environmental, and nonproliferation analyses for OFMD, DOE policy makers, other stakeholders, and the general public in a systematic way. The MAU methodology has been supported for use in similar situations by the National Research Council, an agency of the National Academy of Sciences. It is important to emphasize that the MAU process does not lead to a computerized model that actually determines the decision for a complex problem. MAU is a management tool that is one component, albeit a key component, of a decision process. We subscribe to the philosophy that the result of using models should be insights, not numbers. The MAU approach consists of four steps: (1) identification of alternatives, objectives, and performance measures, (2) estimation of the performance of the alternatives with respect to the objectives, (3) development of value functions and weights for the objectives, and (4) evaluation of the alternatives and sensitivity

  10. A Methodology for the Analysis and Selection of Alternatives for the Disposition of Surplus Plutonium

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-08-31

    The Department of Energy (DOE) - Office of Fissile Materials Disposition (OFMD) has announced a Record of Decision (ROD) selecting alternatives for disposition of surplus plutonium. A major objective of this decision was to further U.S. efforts to prevent the proliferation of nuclear weapons. Other concerns that were addressed include economic, technical, institutional, schedule, environmental, and health and safety issues. The technical, environmental, and nonproliferation analyses supporting the ROD are documented in three DOE reports [DOE-TSR 96, DOE-PEIS 96, and DOE-NN 97, respectively]. At the request of OFMD, a team of analysts from the Amarillo National Resource Center for Plutonium (ANRCP) provided an independent evaluation of the alternatives for plutonium that were considered during the evaluation effort. This report outlines the methodology used by the ANRCP team. This methodology, referred to as multiattribute utility theory (MAU), provides a structure for assembling the results of detailed technical, economic, schedule, environmental, and nonproliferation analyses for OFMD, DOE policy makers, other stakeholders, and the general public in a systematic way. The MAU methodology has been supported for use in similar situations by the National Research Council, an agency of the National Academy of Sciences. It is important to emphasize that the MAU process does not lead to a computerized model that actually determines the decision for a complex problem. MAU is a management tool that is one component, albeit a key component, of a decision process. We subscribe to the philosophy that the result of using models should be insights, not numbers. The MAU approach consists of four steps: (1) identification of alternatives, objectives, and performance measures, (2) estimation of the performance of the alternatives with respect to the objectives, (3) development of value functions and weights for the objectives, and (4) evaluation of the alternatives and sensitivity
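
    Both records outline the same four MAU steps; a toy linear-additive illustration of steps (3) and (4), with purely hypothetical weights and scores rather than the ANRCP values, looks like:

        # Toy multiattribute-utility illustration (hypothetical weights and
        # scores, not the ANRCP evaluation): rank alternatives by weighted
        # additive value.
        weights = {"cost": 0.3, "schedule": 0.2, "nonproliferation": 0.5}
        # single-attribute values assumed already scaled to [0, 1] by value functions
        scores = {
            "immobilization": {"cost": 0.7, "schedule": 0.8, "nonproliferation": 0.6},
            "MOX fuel":       {"cost": 0.5, "schedule": 0.6, "nonproliferation": 0.9},
        }

        def mau_value(alt):
            return sum(weights[a] * scores[alt][a] for a in weights)

        for alt in sorted(scores, key=mau_value, reverse=True):
            print(f"{alt:15s} overall value = {mau_value(alt):.2f}")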

  11. New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks

    Science.gov (United States)

    Kurtz, Nolan Scot

    The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazards, e.g. seismicity, storm surge, hurricane winds, etc., but also how those components participate at the network level, using network analysis. Focusing on the most damaged components does not necessarily increase network functionality, which is what matters most to the people who depend on the infrastructure. Several network analyses (e.g. S-RDA, LP-bounds, and crude MCS) and performance metrics (e.g. disconnection bounds and component importance) are available for such purposes. Because these networks already exist, their state in time is also important: if networks are close to chloride sources, deterioration may be a major issue, and information from field inspections may also have a large impact on the quantitative models. To address such issues, hazard risk analysis methodologies have been developed analytically for deteriorating networks subjected to seismicity, i.e. earthquakes. A bridge component model was constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis, as they are relevant to specific structures. Furthermore, the network effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems; such a method handles many kinds of system and component problems with singular or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to the network size. Special network topologies may
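
    The crude-MCS network analysis mentioned above can be sketched generically (hypothetical five-edge network, not the dissertation's models): estimate the probability that a source is disconnected from a sink under independent edge failures:

        # Generic crude Monte Carlo sketch: probability that source "s" is
        # disconnected from sink "t" when edges fail independently.
        import random
        import networkx as nx

        edges = [("s", "a", 0.1), ("s", "b", 0.2), ("a", "t", 0.15),
                 ("b", "t", 0.1), ("a", "b", 0.05)]   # (u, v, failure probability)

        def disconnected_once():
            H = nx.Graph()
            H.add_nodes_from("sabt")
            for u, v, pf in edges:
                if random.random() > pf:      # edge survives this sample
                    H.add_edge(u, v)
            return not nx.has_path(H, "s", "t")

        N = 100_000
        p_disc = sum(disconnected_once() for _ in range(N)) / N
        print(f"estimated disconnection probability ~ {p_disc:.4f}")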

  12. Thermodynamic and exergoeconomic analysis of a cement plant: Part I – Methodology

    International Nuclear Information System (INIS)

    Highlights: • Energy, exergy and exergoeconomic analyses of a complete cement plant are presented. • The first- and second-law efficiencies based on the energy and exergy analyses are defined for the entire cement plant. • The specific energy consumption of all sections of the cement plant has been analyzed. • The specific manufacturing costs of farine, clinker and cement have been determined by the cost analysis. - Abstract: The energy, exergy and exergoeconomic analysis of a cement factory is presented in two parts. This paper is the first part of the study, which includes the thermodynamic and exergoeconomic methodology and the formulations developed for such a comprehensive and detailed analysis. The second part of the study applies the developed formulation to an actual cement plant located in Gaziantep, Turkey. The energy consumption of the cement industry is about 5% of total global industrial energy consumption. It is also one of the world's largest industrial sources of CO2 emissions. In this paper, a cement plant is considered with all of its main manufacturing units. Mass, energy, and exergy balances are applied to each system. The first- and second-law efficiencies based on the energy and exergy analysis, together with performance assessment parameters, are defined for the entire cement plant. The formulations for the cost of products, and for cost formation and allocation within the system, are developed based on exergoeconomic analysis. In order to obtain the optimal marketing price of cement and to decrease the specific energy consumption of the whole plant, the cost analysis formulated here is of substantial importance.
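
    In the usual formulation (the paper's exact definitions may differ), the first- and second-law efficiencies of a unit, and the exergy rate carried by a heat stream at temperature T with dead-state temperature T_0, are:

        \[ \eta_I = \frac{\dot{E}_{\mathrm{out}}}{\dot{E}_{\mathrm{in}}}, \qquad
           \eta_{II} = \frac{\dot{Ex}_{\mathrm{out}}}{\dot{Ex}_{\mathrm{in}}}, \qquad
           \dot{Ex}_{\mathrm{heat}} = \left(1 - \frac{T_0}{T}\right) \dot{Q} \]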

  13. Analysis of maternal and child health policies in Malawi: The methodological perspective.

    Science.gov (United States)

    Daire, J; Khalil, D

    2015-12-01

    The question of why most health policies do not achieve their intended results continues to receive considerable attention in the literature. This is in the light of the recognized gap between policy as intent and policy as practice, which calls for substantial research work to understand the factors that improve policy implementation. Although there is substantial work explaining why policies achieve or fail to achieve their intended outcomes, there are limited case studies that illustrate how to analyze policies from the methodological perspective. In this article, we report and discuss how a mixed qualitative research method was applied to analyzing maternal and child health policies in Malawi. For the purposes of this article, we do not report research findings; instead we focus our discussion on the methodology of the study and draw lessons for policy analysis research work. We base our discussion on our experiences from a study in which we analyzed maternal and child health policies in Malawi over the period from 1964 to 2008. Noting the multifaceted nature of maternal and child health policies, we adopted a mixed qualitative research method, whereby a number of data collection methods were employed. This approach allowed for the capturing of different perspectives on maternal and child health policies in Malawi and for offsetting the weaknesses of each individual method, especially in terms of data validity. This research suggested that the multidimensional nature of maternal and child health policies, like other health policies, calls for a combination of research designs as well as a variety of methods of data collection and analysis. In addition, we suggest that, as an emerging research field, health policy analysis will benefit more from case study designs because they provide rich experiences in the actual policy context. PMID:26955434

  14. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of 'soft' information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset 'attractiveness' to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  15. Application of transient analysis methodology to quantify thermal performance of heat exchangers

    International Nuclear Information System (INIS)

    A transient testing technique is developed to evaluate the thermal performance of industrial-scale heat exchangers. A Galerkin-based numerical method, with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers, is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady-state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of apparent fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady-state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown, when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated with reference to an in-situ transient test carried out at a nuclear power plant. The method, however, is applicable to any transient testing application.
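
    A much-simplified flavor of such transient performance estimation is shown below: a lumped effectiveness-NTU model fitted to synthetic outlet-temperature data. The paper itself solves the full transient equations with a Galerkin spectral method; this sketch, with assumed capacity rates and inlet histories, only illustrates the idea of extracting UA from transient data:

        # Much-simplified sketch (lumped counterflow effectiveness-NTU model,
        # synthetic data); not the paper's Galerkin spectral solution.
        import numpy as np
        from scipy.optimize import curve_fit

        C_h, C_c = 5.0e4, 4.0e4                    # hot/cold capacity rates, W/K
        t = np.linspace(0, 600, 50)                # s
        T_hi = 60.0 + 20.0 * np.exp(-t / 200.0)    # decaying hot inlet after shutdown
        T_ci = 25.0 * np.ones_like(t)              # steady cold inlet

        def T_ho(t_arr, UA):
            Cmin, Cmax = min(C_h, C_c), max(C_h, C_c)
            ntu, r = UA / Cmin, Cmin / Cmax
            eff = (1 - np.exp(-ntu * (1 - r))) / (1 - r * np.exp(-ntu * (1 - r)))
            q = eff * Cmin * (T_hi - T_ci)         # counterflow effectiveness relation
            return T_hi - q / C_h

        UA_true = 3.0e4
        measured = T_ho(t, UA_true) + np.random.normal(0, 0.05, t.size)
        UA_fit, _ = curve_fit(T_ho, t, measured, p0=[1.0e4])
        print(f"fitted UA = {UA_fit[0]:.3e} W/K (true {UA_true:.3e})")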

  16. What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis.

    Directory of Open Access Journals (Sweden)

    Kara Schick-Makaroff

    2016-03-01

    Background: When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use 'research synthesis' as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. Methods: We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. Results: We identified four broad categories of research synthesis methodology: conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. Conclusions: The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers.

  17. Accuracy of ionospheric models used in GNSS and SBAS: methodology and analysis

    Science.gov (United States)

    Rovira-Garcia, A.; Juan, J. M.; Sanz, J.; González-Casado, G.; Ibáñez, D.

    2016-03-01

    The characterization of the accuracy of ionospheric models currently used in global navigation satellite systems (GNSSs) is a long-standing issue. The characterization remains a challenging problem owing to the lack of sufficiently accurate slant ionospheric determinations to be used as a reference. The present study proposes a methodology based on the comparison of the predictions of any ionospheric model with actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are separated out as hardware delays (a receiver constant plus a satellite constant) per day. The present study was conducted for the entire year of 2014, i.e. during the last solar cycle maximum. The ionospheric models assessed are the operational models broadcast by the global positioning system (GPS) and Galileo constellations, the satellite-based augmentation systems (SBAS) (i.e. the European Geostationary Navigation Overlay Service (EGNOS) and the wide area augmentation system (WAAS)), a number of post-processed global ionospheric maps (GIMs) from different International GNSS Service (IGS) analysis centres (ACs) and, finally, a more sophisticated GIM computed by the research group of Astronomy and GEomatics (gAGE). Ionospheric models based on GNSS data and represented on a grid (IGS GIMs or SBAS) correct about 85 % of the total slant ionospheric delay, whereas the models broadcast in the navigation messages of GPS and Galileo only account for about 70 %. Our gAGE GIM is shown to correct 95 % of the delay. The proposed methodology appears to be a useful tool to improve current ionospheric models.
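
    The headline percentages suggest an RMS-based score; the sketch below assumes such a metric (the authors' exact definition may differ) applied to synthetic slant-delay data:

        # Generic sketch of the assessment idea (not the authors' pipeline):
        # score an ionospheric model by the fraction of reference slant delay
        # it removes, using an assumed RMS-based metric.
        import numpy as np

        rng = np.random.default_rng(0)
        stec_ref = rng.uniform(5, 60, 1000)              # "true" slant delays, TECU
        stec_model = stec_ref + rng.normal(0, 4, 1000)   # model with 4-TECU errors

        def percent_corrected(model, ref):
            rms_err = np.sqrt(np.mean((model - ref) ** 2))
            rms_ref = np.sqrt(np.mean(ref ** 2))
            return 100.0 * (1.0 - rms_err / rms_ref)

        print(f"model corrects ~{percent_corrected(stec_model, stec_ref):.1f}% of delay")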

  18. In Their Own Words? Methodological Considerations in the Analysis of Terrorist Autobiographies

    Directory of Open Access Journals (Sweden)

    Mary Beth Altier

    2012-01-01

    Despite the growth of the terrorism literature in the aftermath of the 9/11 attacks, there remain several methodological challenges to studying certain aspects of terrorism. This is perhaps most evident in attempts to uncover the attitudes, motivations, and intentions of individuals engaged in violent extremism, and how these are sometimes expressed in problematic behavior. Such challenges invariably stem from the fact that terrorists and the organizations to which they belong represent clandestine populations engaged in illegal activity. Unsurprisingly, these qualities make it difficult for the researcher to identify and locate willing subjects of study, let alone a representative sample. In this research note, we suggest that the systematic analysis of terrorist autobiographies offers a promising means of investigating difficult-to-study areas of terrorism-related phenomena. Investigation of autobiographical accounts not only offers additional data points for the study of individual psychological issues, but also provides valuable perspectives on the internal structures, processes, and dynamics of terrorist organizations more broadly. Moreover, given that most autobiographies cover critical events and personal experiences across the life course, they provide a unique lens into how terrorists perceive their world and insight into their decision-making processes. We support our advocacy of this approach by highlighting its methodological strengths and shortcomings.

  19. Systematic Analysis of an IEDD Unit Based on a New Methodology for M&S

    Directory of Open Access Journals (Sweden)

    Jesus Carlos Pedraza Ortega

    2011-01-01

    In the field of modeling and simulation of highly complex mechatronic designs, a systematic analysis of an IEDD (improvised explosive device disposal) unit [1] is presented, based on a new methodology for modeling and simulation divided into six stages in order to increase the accuracy of validation of the whole system. This mechatronic unit is a non-holonomic unmanned wheeled mobile manipulator (MU-NH-WMM), formed by a differential-traction base and a manipulator arm with 4 degrees of freedom mounted on the wheeled mobile base. The contribution of this work is a novel methodology based on a practical proposal of the philosophy of mechatronics design, which establishes the kinematics suitable for a coupled wheeled mobile manipulator, where the motion equations and kinematic transformations are the basis of the specific stages used to obtain the dynamics of the coupled system, validating the behavior and trajectory tracking required to achieve the complex tasks of approaching the work area and appropriately handling the explosive device. This work focuses on the first of these tasks, such that errors in the model can be detected and later confined by the proposed control.
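
    The differential-traction base follows standard differential-drive (unicycle) kinematics; a generic sketch (not the authors' six-stage model, with assumed wheel radius and track width) is:

        # Standard differential-drive (unicycle) kinematics sketch, the traction
        # model such a unit typically uses; parameters are illustrative.
        import numpy as np

        def step(pose, v, omega, dt):
            """Integrate pose (x, y, theta) under body velocities v, omega."""
            x, y, th = pose
            return np.array([x + v * np.cos(th) * dt,
                             y + v * np.sin(th) * dt,
                             th + omega * dt])

        def wheel_to_body(w_r, w_l, r=0.1, L=0.5):
            """Wheel speeds (rad/s) to (v, omega); r = wheel radius, L = track width (m)."""
            return r * (w_r + w_l) / 2.0, r * (w_r - w_l) / L

        pose = np.zeros(3)
        for _ in range(100):              # drive an arc for 1 s at 10 ms steps
            v, om = wheel_to_body(6.0, 4.0)
            pose = step(pose, v, om, 0.01)
        print(pose)                       # final (x, y, heading)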

  20. PROPOSAL FOR THE DEVELOPMENT OF AN ANALYSIS METHODOLOGY OF FIELD EVENTS ON A BUSMAKER

    Directory of Open Access Journals (Sweden)

    CLAUDEMIR ROBERTO SILVA

    2012-07-01

    Customer service is becoming, day by day, a mandatory and differentiating requirement at the moment a product purchase is decided. For bus makers, the search for field information, especially regarding their products, is not easy. For this reason, bus makers usually develop, together with their service and parts representatives, techniques to meet their customers' needs in a fast and efficient way. Unfortunately, however, there is no methodology that can logically measure field occurrences. This differential may therefore be one of the most important points for managers making decisions, since it mostly involves maintenance costs and directly affects product quality and company profitability. Thus, research that identifies how bus makers have been dealing with this issue becomes relevant. Based on this, the aim of this work is to present an analysis methodology for field occurrences, concatenating data in a clear and objective way through a computational system proposed for the company studied in this paper. Customers' demands will be recorded in a system named 'Fatiz 1.0'. As a result, it will be possible to control field occurrence operations, as well as to improve the management process. It is concluded that the proposed system will help to improve company productivity, owing to its customer integration and interface.

  1. Analysis of airborne radiometric data. Volume 3. Topical reports

    International Nuclear Information System (INIS)

    This volume consists of four topical reports: a general discussion of the philosophy of unfolding spectra with continuum and discrete components, a mathematical treatment of the effects of various physical parameters on the uncollided gamma-ray spectrum at aircraft elevations, a discussion of the application of the unfolding code MAZNAI to airborne data, and a discussion of the effects of the nonlinear relationship between energy deposited and pulse height in NaI(Tl) detectors.
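
    Spectrum unfolding of the kind performed by MAZNAI can be given a generic flavor (synthetic response templates and non-negative least squares; not the MAZNAI algorithm itself):

        # Generic spectrum-unfolding sketch: recover component intensities from
        # a measured pulse-height spectrum via non-negative least squares,
        # given known detector-response templates (all inputs synthetic).
        import numpy as np
        from scipy.optimize import nnls

        channels = np.arange(64)
        continuum = np.exp(-channels / 30.0)               # continuum template
        peak = lambda mu: np.exp(-0.5 * ((channels - mu) / 2.0) ** 2)
        R = np.column_stack([continuum, peak(20), peak(45)])

        true_w = np.array([50.0, 10.0, 5.0])
        measured = R @ true_w + np.random.default_rng(1).normal(0, 0.5, channels.size)

        weights, residual = nnls(R, measured)
        print("unfolded intensities:", np.round(weights, 2))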

  2. Trading Volume: Definitions, Data Analysis, and Implications of Portfolio Theory

    OpenAIRE

    Lo, Andrew W.; Jiang W. Wang

    2000-01-01

    We examine the implications of portfolio theory for the cross-sectional behavior of equity trading volume. Two-fund separation theorems suggest a natural definition for trading activity: share turnover. If two-fund separation holds, share turnover must be identical for all securities. If (K+1)-fund separation holds, we show that turnover satisfies an approximately linear K-factor structure. These implications are examined empirically using individual weekly turnover data for NYSE and AMEX sec...
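
    Share turnover, the activity measure referred to above, is conventionally defined (notation assumed here) as the ratio of shares traded to shares outstanding, and the (K+1)-fund result says turnover is approximately spanned by K common factors:

        \[ \tau_{jt} = \frac{X_{jt}}{N_{jt}}, \qquad
           \tau_{jt} \approx \sum_{k=1}^{K} \delta_{jk} F_{kt} \]

    where X_jt is the number of shares of security j traded in week t, N_jt the shares outstanding, F_kt the common turnover factors, and delta_jk the loadings.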

  3. An integrated probabilistic risk analysis decision support methodology for systems with multiple state variables

    International Nuclear Information System (INIS)

    Probabilistic risk analysis (PRA) methods have been proven to be valuable in risk and reliability analysis. However, a weak link seems to exist between methods for analysing risks and those for making rational decisions. The integrated decision support system (IDSS) methodology presented in this paper attempts to address this issue in a practical manner. It consists of three phases: a PRA phase, a risk sensitivity analysis (SA) phase and an optimisation phase, which are implemented through an integrated computer software system. In the risk analysis phase the problem is analysed by the Boolean representation method (BRM), a PRA method that can deal with systems with multiple state variables and feedback loops. In the second phase the results obtained from the BRM are utilised directly to perform importance and risk SA. In the third phase, the problem is formulated as a multiple objective decision making problem in the form of multiple objective reliability optimisation. An industrial example is included. The resultant solutions of a five-objective reliability optimisation are presented, on the basis of which rational decision making can be explored.
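
    As a generic flavor of the importance analysis performed in the second phase (an illustrative Birnbaum measure on a three-component toy system, not the paper's BRM formulation):

        # Illustrative Birnbaum importance on a toy system: component A in
        # series with a parallel pair (B, C). All probabilities hypothetical.
        import itertools

        p = {"A": 0.02, "B": 0.05, "C": 0.05}   # component failure probabilities

        def system_fails(state):                # state[c] True = component failed
            return state["A"] or (state["B"] and state["C"])

        def failure_prob(probs):
            total = 0.0
            for fails in itertools.product([True, False], repeat=3):
                state = dict(zip("ABC", fails))
                w = 1.0
                for c in "ABC":
                    w *= probs[c] if state[c] else 1.0 - probs[c]
                if system_fails(state):
                    total += w
            return total

        for c in "ABC":                         # Birnbaum: dP(system)/dp_c
            hi = failure_prob({**p, c: 1.0})
            lo = failure_prob({**p, c: 0.0})
            print(f"I_B({c}) = {hi - lo:.4f}")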

  4. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic and industrial use, as well as use by public and municipal authorities and for street lighting) and examine their relation with variables such as population, total area, population density and Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of the prefecture. We visualize the results of the analysis and perform cluster and outlier analysis using the Anselin local Moran's I statistic, as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight into, and better understanding of, the regional development model in Greece, and justifies the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
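
    The two statistics named above are available in the PySAL stack; a sketch of their use follows, in which the shapefile name and the demand column are assumptions:

        # Sketch of local Moran's I and Getis-Ord Gi* with the PySAL stack
        # (hypothetical inputs: 'prefectures.shp' and the column name).
        import geopandas as gpd
        import libpysal
        from esda.moran import Moran_Local
        from esda.getisord import G_Local

        gdf = gpd.read_file("prefectures.shp")           # one polygon per prefecture
        w = libpysal.weights.Queen.from_dataframe(gdf)   # contiguity weights
        w.transform = "r"

        y = gdf["domestic_use"].values                   # assumed demand column
        lisa = Moran_Local(y, w)                         # Anselin local Moran's I
        gi = G_Local(y, w, star=True)                    # Getis-Ord Gi*

        gdf["lisa_sig"] = lisa.p_sim < 0.05              # significant clusters/outliers
        gdf["hotspot"] = gi.Zs > 1.96                    # hot spots at ~5% level
        print(gdf[["lisa_sig", "hotspot"]].sum())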

  5. Life Cycle Exergy Analysis of Wind Energy Systems : Assessing and improving life cycle analysis methodology

    OpenAIRE

    Davidsson, Simon

    2011-01-01

    Wind power capacity is currently growing fast around the world. At the same time different forms of life cycle analysis are becoming common for measuring the environmental impact of wind energy systems. This thesis identifies several problems with current methods for assessing the environmental impact of wind energy and suggests improvements that will make these assessments more robust. The use of the exergy concept combined with life cycle analysis has been proposed by several researchers ov...

  6. CDM afforestation and reforestation baseline methodologies: An analysis of the submission and approval process

    OpenAIRE

    Michaelowa, Axel; Rawat, V. R. S.

    2007-01-01

    Afforestation and Reforestation (A/R), also widely termed LULUCF, have been an important field of conflict in the Clean Development Mechanism (CDM) of the Kyoto Protocol. The first methodology for A/R projects was submitted only in October 2004, and the first project was registered only in November 2006, two years after the first project in the energy sector. Like energy efficiency and transportation methodologies, A/R methodologies also suffer a high rejection rate. 20 A/R CDM methodologies...

  7. A Systemic Method for Organisational Stakeholder Identification and Analysis Using Soft Systems Methodology (SSM)

    OpenAIRE

    Wang, Wei; Liu, Wenbin; Mingers, John

    2015-01-01

    This paper presents a systemic methodology for identifying and analysing the stakeholders of an organisation at many different levels. The methodology is based on soft systems methodology and is applicable to all types of organisation, both for profit and non-profit. The methodology begins with the top-level objectives of the organisation, developed through debate and discussion, and breaks these down into the key activities needed to achieve them. A range of stakeholders are identified for e...

  8. Methodology for Impact Analysis of the Mobile Web in Developing Countries: a Pilot Study in Nairobi, Kenya

    OpenAIRE

    Purwandari, Betty; Hall, Wendy; Wills, Gary

    2011-01-01

    In this paper, we describe an impact analysis methodology to measure the impact of the Mobile Web in developing nations. This methodology is needed to anticipate the effects of the Mobile Web on society. Moreover, it can guide the advancement of Mobile Web technology to better serve its users. In May 2010, a pilot study to test the methodology was carried out in Nairobi, Kenya. Forty-seven students from three leading universities participated in the study. Questionnaires were used to ask how they u...

  9. IAEA methodology of the ITDB information analysis from nuclear security perspective

    International Nuclear Information System (INIS)

    The IAEA methodology for analysing the Illicit Trafficking Database (ITDB) considers general and specific risks, trends and patterns. This methodology assists in the identification of security needs that are specific to a material, activity, location or country, or even to a region. Finally, the methodology also analyses the lessons learned.

  10. Failure detection and isolation methodology based on the sequential analysis and extended Kalman filter technique

    International Nuclear Information System (INIS)

    Nuclear power plant operation relies on an accurate and precise response of the monitoring system in order to assure a safe operational standard during the most predictable operational transients. The signals from the sensors are in general contaminated with noise and random fluctuations, making a precise plant assessment uncertain and raising the possibility of erroneous operator decisions or even false alarm actuation. In practice, the noisy environment can even mask a sensor malfunction, causing the plant operational status to be misread. In the present work a new failure detection and isolation (FDI) algorithm has been developed, based on sequential analysis and extended Kalman filter residue monitoring. The present methodology has been applied both to highly redundant monitoring systems and to non-redundant systems where high signal reliability is required. (C.M.)
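
    The combination described, sequential analysis applied to Kalman filter residues, can be sketched generically as Wald's sequential probability ratio test (SPRT) on filter innovations; thresholds, variances and the bias magnitude below are illustrative, not the paper's values:

        # Generic sketch: monitor (assumed Gaussian) Kalman-filter innovations
        # with an SPRT-style log-likelihood ratio to flag a sensor bias.
        import numpy as np

        rng = np.random.default_rng(3)
        sigma, bias = 1.0, 1.5
        resid = np.concatenate([rng.normal(0, sigma, 200),       # healthy period
                                rng.normal(bias, sigma, 200)])   # faulty period

        A, B = np.log(99), np.log(1 / 99)    # barriers for ~1% error rates
        llr = 0.0
        for k, r in enumerate(resid):
            # log-likelihood ratio of N(bias, sigma) vs N(0, sigma) per residual
            llr += (bias * r - 0.5 * bias ** 2) / sigma ** 2
            llr = max(llr, B)                # reset at lower barrier (CUSUM-like)
            if llr >= A:
                print(f"fault declared at sample {k}")
                break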

  11. Methodology for adding and amending glycaemic index values to a nutrition analysis package.

    LENUS (Irish Health Repository)

    Levis, Sharon P

    2011-04-01

    Since its introduction in 1981, the glycaemic index (GI) has been a useful tool for classifying the glycaemic effects of carbohydrate foods. Consumption of a low-GI diet has been associated with a reduced risk of developing CVD, diabetes mellitus and certain cancers. WISP (Tinuviel Software, Llanfechell, Anglesey, UK) is a nutrition software package used for the analysis of food intake records and 24 h recalls. Within its database, WISP contains the GI values of foods based on the International Tables 2002. The aim of the present study is to describe in detail a methodology for adding and amending GI values to the WISP database in a clinical or research setting, using data from the updated International Tables 2008.

  12. From continuous flow analysis to programmable Flow Injection techniques. A history and tutorial of emerging methodologies.

    Science.gov (United States)

    Ruzicka, Jaromir Jarda

    2016-09-01

    Automation of reagent-based assays, also known as Flow Analysis, is based on sample processing in which a sample flows towards and through a detector for monitoring of its components. The Achilles heel of this methodology is that the majority of FA techniques use a constant, continuous forward flow to transport the sample, an approach which continually consumes reagents and generates chemical waste. Therefore the purpose of this report is to highlight recent developments in flow programming that not only save reagents, but also lead, by means of advanced sample processing, to selective and sensitive assays based on stop-flow measurement. Flow programming combined with a novel approach to data harvesting yields a novel approach to single-standard calibration and avoids interference caused by refractive index. Finally, flow programming is useful for sample preparation, such as rapid, extensive sample dilution. The principles are illustrated by selected references to an available online tutorial, http://www.flowinjectiontutorial.com/. PMID:27343609

  13. A fuzzy logic methodology for fault-tree analysis in critical safety systems

    International Nuclear Information System (INIS)

    A new approach for fault-tree analysis in critical safety systems, employing fuzzy sets for information representation, is presented in this paper. The methodology is based on the utilization of the extension principle for mapping crisp measurements to various degrees of membership in the fuzzy set of linguistic Truth. Criticality alarm systems are used in miscellaneous nuclear fuel processing, handling, and storage facilities to reduce the risk associated with fissile material operations. Fault-tree methodologies are graphic illustrations of the failure logic associated with the development of a particular system failure (top event) from basic subcomponent failures (primary events). The term event denotes a dynamic change of state that occurs to system elements, which may include hardware, software, human, or environmental factors. A fault tree represents a detailed, deductive analysis that requires extensive system information. The knowledge incorporated in a fault tree can be articulated in logical rules of the form "IF A is true THEN B is true". However, it is well known that this type of syllogism fails to give an answer when the satisfaction of the antecedent clause is only partial. Zadeh suggested a new type of fuzzy conditional inference. This type of syllogism (generalized modus ponens) reads as follows: Premise: A is partially true. Implication: IF A is true THEN B is true. Conclusion: B is partially true. In generalized modus ponens, the antecedent is true only to some degree; hence, it is desired to compute the grade to which the consequent is satisfied. Fuzzy sets provide a natural environment for this type of computation because fuzzy variables (e.g., B) can take fuzzy values (e.g., partially true).
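
    Generalized modus ponens can be illustrated with textbook min-max (Mamdani-style) inference; this is a generic sketch with hypothetical fuzzy sets, not necessarily the paper's exact formulation:

        # Minimal generalized-modus-ponens sketch (sup-min composition).
        import numpy as np

        x = np.linspace(0, 10, 101)         # universe of discourse

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        A = tri(x, 2, 5, 8)                 # antecedent set ("pressure high")
        B = tri(x, 4, 7, 10)                # consequent set ("risk elevated")
        A_prime = tri(x, 3, 6, 9)           # observation, partially matching A

        # degree to which the observation satisfies the antecedent ...
        firing = np.max(np.minimum(A_prime, A))
        # ... clips the consequent: B' = min(firing, B)
        B_prime = np.minimum(firing, B)
        print(f"antecedent satisfied to degree {firing:.2f}; "
              f"max membership of conclusion {B_prime.max():.2f}")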

  14. Sensitivity analysis of an experimental methodology to determine radionuclide diffusion coefficients in granite

    International Nuclear Information System (INIS)

    The long-term quantitative analysis of the migration behaviour of the relevant radionuclides (RN) within the geological barrier of a radioactive waste repository requires, amongst other data, the introduction of reliable transport parameters, such as diffusion coefficients. Since the determination of diffusion coefficients within crystalline rocks is complex and requires long experimental times even for non-sorbing radionuclides, the data available in the literature are very scarce. The nuclear ion beam technique RBS (Rutherford Backscattering Spectrometry), successfully used to determine diffusion profiles in thin-film science, is examined here as a possibly suitable technique for determining the diffusion coefficients of different RN within granite. As a first step, the sensitivity and limitations of the technique for analysing diffusion coefficients in granite samples are evaluated, considering that the technique is especially sensitive to heavy elements. The required experimental conditions in terms of experimental times, concentration and methodology of analysis are discussed. The diffusants were selected accounting for the RBS sensitivity but also trying to cover different behaviours of critical RN and a wide range of possible oxidation states. In particular, Cs(I) was chosen as a representative fission product, while, as relevant actinides or homologues, the diffusion of Th(IV), U(IV) and Eu(III) was studied. The diffusion of these cations is compared to the diffusion of Re and I as representatives of anionic species. The methodology allowed diffusion coefficients in the granite samples to be evaluated and, for most of the elements, the values obtained are in agreement with the values found in the literature. The calculated diffusion coefficients ranged from 10^-13 to 10^-16 m^2/s. It is remarkable that the RBS technique is especially promising for determining diffusion coefficients of highly sorbing RN, and it is applicable to a wide range of
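
    Once a concentration-depth profile has been measured, a diffusion coefficient is typically extracted by fitting a constant-source solution of Fick's second law; the sketch below uses synthetic data in the quoted D range, not the paper's RBS spectra:

        # Illustrative extraction of D from a concentration-depth profile
        # (constant-source erfc solution; all data synthetic).
        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.special import erfc

        t = 30 * 86400.0                      # diffusion time: 30 days, s
        x = np.linspace(0, 2e-4, 40)          # depth, m (0 to 200 micrometres)

        def profile(x, C0, D15):              # D parametrized in units of 1e-15 m^2/s
            D = D15 * 1e-15
            return C0 * erfc(x / (2.0 * np.sqrt(D * t)))

        D_true = 1.0                          # i.e. 1e-15 m^2/s, within the quoted range
        noise = 1 + np.random.default_rng(2).normal(0, 0.03, x.size)
        data = profile(x, 1.0, D_true) * noise
        (C0_fit, D15_fit), _ = curve_fit(profile, x, data, p0=[0.8, 0.5])
        print(f"fitted D = {D15_fit * 1e-15:.2e} m^2/s")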

  15. User's manual and analysis methodology of probabilistic fracture mechanics analysis code PASCAL3 for reactor pressure vessel (Contract research)

    International Nuclear Information System (INIS)

    As a part of the structural integrity research for aging LWR (Light Water Reactor) components, the probabilistic fracture mechanics (PFM) analysis code PASCAL (PFM Analysis of Structural Components in Aging LWR) has been developed at JAEA. The PASCAL code evaluates the conditional probabilities of crack initiation and fracture of a reactor pressure vessel (RPV) under transient conditions such as pressurized thermal shock (PTS). The continuous development of the code has aimed to improve the accuracy and reliability of the analysis by introducing new analysis methodologies and algorithms that reflect recent developments in fracture mechanics and computer performance. The previous version of PASCAL (PASCAL Ver.2), released in 2007, has many functions, including an evaluation method for embedded cracks, conditional probabilities of crack initiation and fracture of an RPV, a PTS transient database, an inspection crack detection probability model, and others. Since 2007, PASCAL Ver.2 has been improved mainly by considering the effects of weld-overlay cladding on the inner surface of the RPV. A generalized analysis method is available on the basis of the development of PASCAL Ver.3 and sensitivity analysis results. The graphical user interface (GUI), including the generalized method and some probabilistic fracture mechanics functions, has also been updated for PASCAL3. This report provides the user's manual, examples of analysis, and the theoretical background of PASCAL Ver.3. (author)
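
    The core PFM computation can be given a schematic flavor (not PASCAL itself): sample the fracture toughness and count the fraction of samples exceeded by the applied stress-intensity factor, with both distributions assumed here for illustration:

        # Schematic PFM flavor: conditional crack-initiation probability as the
        # fraction of sampled toughness values below the applied K (all
        # numbers and distributions assumed, not PASCAL's models).
        import numpy as np

        rng = np.random.default_rng(7)
        K_applied = 60.0                      # MPa*sqrt(m), peak K during transient (assumed)
        # sampled fracture toughness K_Ic, lognormal around an irradiated mean (assumed)
        K_Ic = rng.lognormal(mean=np.log(80.0), sigma=0.25, size=1_000_000)

        p_init = np.mean(K_Ic < K_applied)    # conditional initiation probability
        print(f"P(initiation | transient) ~ {p_init:.3e}")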

  16. Experimental stress analysis for materials and structures stress analysis models for developing design methodologies

    CERN Document Server

    Freddi, Alessandro; Cristofolini, Luca

    2015-01-01

    This book summarizes the main methods of experimental stress analysis and examines their application to various states of stress of major technical interest, highlighting aspects not always covered in the classic literature. It is explained how experimental stress analysis assists in the verification and completion of analytical and numerical models, the development of phenomenological theories, the measurement and control of system parameters under operating conditions, and identification of causes of failure or malfunction. Cases addressed include measurement of the state of stress in models, measurement of actual loads on structures, verification of stress states in circumstances of complex numerical modeling, assessment of stress-related material damage, and reliability analysis of artifacts (e.g. prostheses) that interact with biological systems. The book will serve graduate students and professionals as a valuable tool for finding solutions when analytical solutions do not exist.

  17. Methodology of economic analysis of evidence of cartel in the resale market of fuels

    International Nuclear Information System (INIS)

    The existence of anti-competitive conduct such as cartels leads to a situation of high prices and profits, harming competition and society in general. The methodology of economic analysis of evidence of cartels used by the ANP in the fuel resale market involves analysis of the behavior of average resale and distribution prices, the nominal average gross resale margin, and the coefficient of variation of resale and distribution prices of fuel over a given period, by municipality. Combining the analysis of these elements, the ANP has suggested the investigation of possible cartels. This text aims to contribute to a better definition of the relevant market in the analysis of economic evidence of cartels in the fuel resale market, and to add elements currently not considered in the ANP's analysis and in the regulation of the sector. To this end, this article is organized into three sections besides the introduction and final considerations. The first section deconstructs some myths about cartels in the automotive fuel resale segment by analyzing the main causes leading to complaints by consumers. The second presents a conceptual analysis of the relevant market, since this definition is essential to characterize anti-competitive practices carried out by companies holding market power, notably the formation of cartels. Finally, there is a discussion of how the main bodies involved act in the dismantling of anti-competitive practices in the industry. The expected results point to greater integration between the agencies that safeguard competition and a better definition of the relevant market for fuel resale. (author)

  18. Methodology of analysis of economic evidence of cartel in the resale retail of the fuel sector

    International Nuclear Information System (INIS)

    The existence of anti-competitive conduct such as cartels leads to a situation of high prices and profits, harming competition and society in general. The methodology of economic analysis of evidence of cartels used by the ANP in the fuel resale market involves analysis of the behavior of average resale and distribution prices, the nominal average gross resale margin, and the coefficient of variation of resale and distribution prices of fuel over a given period, by municipality. Combining the analysis of these elements, the ANP has suggested the investigation of possible cartels. This text aims to contribute to a better definition of the relevant market in the analysis of economic evidence of cartels in the fuel resale market, and to add elements currently not considered in the ANP's analysis and in the regulation of the sector. To this end, this article is organized into three sections besides the introduction and final considerations. The first section deconstructs some myths about cartels in the automotive fuel resale segment by analyzing the main causes leading to complaints by consumers. The second presents a conceptual analysis of the relevant market, since this definition is essential to characterize anti-competitive practices carried out by companies holding market power, notably the formation of cartels. Finally, there is a discussion of how the main bodies involved act in the dismantling of anti-competitive practices in the industry. The expected results point to greater integration between the agencies that safeguard competition and a better definition of the relevant market for fuel resale. (author)
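
    The coefficient-of-variation screen described in both records can be illustrated with toy price data; the threshold below is purely illustrative, not an ANP criterion:

        # Toy price-dispersion screen: a persistently low coefficient of
        # variation of resale prices is treated as one indicator worth
        # investigating (illustrative data and threshold only).
        import numpy as np

        def coef_variation(prices):
            prices = np.asarray(prices, dtype=float)
            return prices.std(ddof=1) / prices.mean()

        stations_a = [5.79, 5.80, 5.79, 5.81, 5.80, 5.79]   # suspiciously aligned
        stations_b = [5.45, 5.92, 5.60, 6.10, 5.38, 5.75]   # dispersed pricing

        for name, p in [("municipality A", stations_a), ("municipality B", stations_b)]:
            cv = coef_variation(p)
            flag = "investigate" if cv < 0.01 else "no signal"
            print(f"{name}: CV = {cv:.4f} -> {flag}")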

  19. Sampling and analytical methodologies for instrumental neutron activation analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    The IAEA supports a number of projects having to do with the analysis of airborne particulate matter by nuclear techniques. Most of this work involves the use of activation analysis in its various forms, particularly instrumental neutron activation analysis (INAA). This technique has been widely used in many different countries for the analysis of airborne particulate matter, and there are already many publications in scientific journals, books and reports describing such work. The present document represents an attempt to summarize the most important features of INAA as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and by other scientists, who are not yet fully experienced in the application of INAA to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability, although they are presented here in a way that takes account of the particular requirements arising from the use of INAA as the analytical technique. The analytical part of the document, however, is presented in a form that is applicable only to INAA. (Subsequent publications in this series are expected to deal specifically with other nuclear-related techniques such as energy-dispersive X-ray fluorescence (ED-XRF) and particle-induced X-ray emission (PIXE) analysis.) Although the methods and procedures described here have been found through experience to yield acceptable results, they should not be considered mandatory. Any other procedure used should, however, be chosen to be capable of yielding results at least of equal quality to those described.
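
    The quantitative basis of INAA is the standard activation equation, quoted here in its textbook form (symbols defined below; not taken from this document):

        \[ A(t_d) = N \, \sigma \, \phi \, \left(1 - e^{-\lambda t_{\mathrm{irr}}}\right) e^{-\lambda t_d} \]

    where A is the induced activity, N the number of target nuclei, sigma the activation cross-section, phi the neutron flux, lambda the decay constant of the product nuclide, t_irr the irradiation time and t_d the decay time before counting.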

  20. Evaluation of safety assessment methodologies in Rocky Flats Risk Assessment Guide (1985) and Building 707 Final Safety Analysis Report (1987)

    International Nuclear Information System (INIS)

    Rockwell International, as operating contractor at the Rocky Flats plant, conducted a safety analysis program during the 1980s. That effort resulted in Final Safety Analysis Reports (FSARs) for several buildings, one of them being the Building 707 Final Safety Analysis Report, June 1987 (707FSAR), and a Plant Safety Analysis Report. The Rocky Flats Risk Assessment Guide, March 1985 (RFRAG85), documents the methodologies that were used for those FSARs. Resources available for the preparation of those Rocky Flats FSARs were very limited. After addressing the more pressing safety issues, some of which are described below, the present contractor (EG&G) intends to conduct a program of upgrading the FSARs. This report presents the results of a review of the methodologies described in RFRAG85 and 707FSAR and contains suggestions that might be incorporated into the methodology for the FSAR upgrade effort.

  1. Control Volume Analysis, Entropy Balance and the Entropy Production in Flow Systems

    CERN Document Server

    Niven, Robert K

    2014-01-01

    This chapter concerns "control volume analysis", the standard engineering tool for the analysis of flow systems, and its application to entropy balance calculations. Firstly, the principles of control volume analysis are enunciated and applied to flows of conserved quantities (e.g. mass, momentum, energy) through a control volume, giving integral (Reynolds transport theorem) and differential forms of the conservation equations. Several definitions of steady state are discussed. The concept of "entropy" is then established using Jaynes' maximum entropy method, both in general and in equilibrium thermodynamics. The thermodynamic entropy then gives the "entropy production" concept. Equations for the entropy production are then derived for simple, integral and infinitesimal flow systems. Some technical aspects are examined, including discrete and continuum representations of volume elements, the effect of radiation, and the analysis of systems subdivided into compartments. A Reynolds decomposition of the entropy ...
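
    The central identity of control volume analysis is the Reynolds transport theorem, quoted here in its standard form for a conserved quantity with specific (per-unit-mass) value phi:

        \[ \frac{d}{dt} \int_{\mathrm{sys}} \rho \, \phi \, dV
           = \frac{\partial}{\partial t} \int_{CV} \rho \, \phi \, dV
           + \oint_{CS} \rho \, \phi \, (\mathbf{v} \cdot \mathbf{n}) \, dA \]

    i.e. the rate of change following the system equals the accumulation within the control volume plus the net flux through the control surface.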

  2. Dose volume analysis in brachytherapy and stereotactic radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Tozer-Loft, S.M

    2000-12-01

    A brief introduction to three branches of radiotherapy is given: interstitial brachytherapy, external beam megavoltage radiotherapy, and stereotactic radiosurgery. The current interest in issues around conformity, uniformity and optimisation is explained in the light of technical developments in these fields. A novel method of displaying dose-volume information, which mathematically suppresses the inverse-square law, as first suggested by L.L. Anderson for use in brachytherapy, is explained in detail, and some improvements are proposed. These 'natural' histograms are extended to show the effects of real point sources, which do not exactly follow the inverse-square law, and to demonstrate the in-target dose-volume distribution, previously unpublished. The histograms are used as a way of mathematically analysing the properties of theoretical mono-energetic radionuclides, and for demonstrating the dosimetric properties of a potential new brachytherapy source (Ytterbium-169). A new modification of the Anderson formalism is then described for producing Anderson Inverse-Square Shifted (AISS) histograms for the Gamma Knife, which are shown to be useful for demonstrating the quality of stereotactic radiosurgery dose distributions. A study is performed analysing the results of Gamma Knife treatments of 44 patients suffering from a benign brain tumour (acoustic neuroma). Follow-up data are used to estimate the volume shrinkage or growth of each tumour, and this measure of outcome is compared with a range of figures of merit which express different aspects of the quality of each dose distribution. The results are analysed in an attempt to answer the question: what are the important features of the dose distribution (conformality, uniformity, etc.) which show a definite relationship with the outcome of the treatment? Initial results show positively that, when Gamma Knife radiosurgery is used to treat acoustic neuroma, some measures of conformality seem to have a surprising
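
    The inverse-square suppression can be sketched as follows (a recalled textbook form of the 'natural' coordinate, hedged, and not the thesis code): for a point source, dose scales as r^-2 while volume scales as r^3, so histogramming in u = D^(-3/2) renders the ideal point-source contribution flat and makes deviations from inverse-square stand out:

        # Sketch of the 'natural histogram' idea: in u = dose**-1.5 the
        # inverse-square point source gives a flat histogram (synthetic data).
        import numpy as np

        rng = np.random.default_rng(5)
        r = rng.uniform(0, 1, 200_000) ** (1 / 3)   # points uniform in a unit sphere
        dose = 1.0 / r**2                            # ideal inverse-square dose

        u = dose ** -1.5                             # natural coordinate, u ~ r^3 ~ V
        hist, edges = np.histogram(u, bins=20, range=(0, 1))
        print(np.round(hist / hist.mean(), 2))       # ~1.0 in every bin: flat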

  3. Evaluating the effects of dam breach methodologies on Consequence Estimation through Sensitivity Analysis

    Science.gov (United States)

    Kalyanapu, A. J.; Thames, B. A.

    2013-12-01

    Dam breach modeling often includes the application of models that are sophisticated, yet computationally intensive, to compute flood propagation at high temporal and spatial resolutions. This results in a significant need for computational capacity that requires the development of newer flood models using multi-processor and graphics processing techniques. Recently, a comprehensive benchmark exercise, the 12th Benchmark Workshop on Numerical Analysis of Dams, was organized by the International Commission on Large Dams (ICOLD) to evaluate the performance of the various tools used for dam break risk assessment. The ICOLD workshop is focused on estimating the consequences of failure of a hypothetical dam near a hypothetical populated area with complex demographics and economic activity. The current study uses this hypothetical case and focuses on evaluating the effects of dam breach methodologies on consequence estimation and analysis. The current study uses the ICOLD hypothetical data, including the topography, dam geometric and construction information, and land use/land cover data, along with socio-economic and demographic data. The objective of this study is to evaluate the impacts of using four different dam breach methods on the consequence estimates used in the risk assessments. The four methodologies used are: i) Froehlich (1995), ii) MacDonald and Langridge-Monopolis 1984 (MLM), iii) Von Thun and Gillette 1990 (VTG), and iv) Froehlich (2008). To achieve this objective, three different modeling components were used. First, using HEC-RAS v.4.1, dam breach discharge hydrographs are developed. These hydrographs are then provided as flow inputs into a two-dimensional flood model named Flood2D-GPU, which leverages the computer's graphics card for much-improved computational performance. Lastly, outputs from Flood2D-GPU, including inundated areas, depth grids, velocity grids, and flood wave arrival time grids, are input into HEC-FIA, which provides the
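
    The first breach method, Froehlich (1995), is commonly quoted as a pair of regression equations for average breach width and failure time; the coefficients below are as usually cited and should be verified against the original paper before any real use:

        # Froehlich (1995) breach-parameter regressions as commonly quoted
        # (coefficients to be verified against the original paper).
        def froehlich_1995(V_w, h_b, overtopping=True):
            """V_w: reservoir volume above breach invert (m^3); h_b: breach height (m).
            Returns (average breach width, m; failure time, hours)."""
            K_o = 1.4 if overtopping else 1.0
            B_avg = 0.1803 * K_o * V_w**0.32 * h_b**0.19
            t_f = 0.00254 * V_w**0.53 * h_b**-0.90
            return B_avg, t_f

        B, tf = froehlich_1995(V_w=5.0e6, h_b=20.0)   # hypothetical reservoir
        print(f"average breach width ~ {B:.0f} m, failure time ~ {tf:.2f} h")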

  4. Detecting Hidden Encrypted Volume Files via Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Mario Piccinelli

    2015-05-01

    Nowadays various software tools have been developed for the purpose of creating encrypted volume files. Many of those tools are open source and freely available on the internet. Because of that, the probability of finding encrypted files which could contain forensically useful information has dramatically increased. While decoding these files without the key is still a major challenge, simply being able to recognize their existence is now a top priority for every digital forensics investigation. In this paper we present a statistical approach for finding elements of a seized filesystem which have a reasonable chance of containing encrypted data.
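
    The statistical intuition behind such detectors is that encrypted (or compressed) files look like uniform random bytes. The sketch below computes two standard screening statistics, byte entropy and a chi-square uniformity statistic; the file path and thresholds are placeholders, not the paper's calibrated values.

```python
import math
from collections import Counter

def byte_stats(path, limit=1 << 20):
    """Shannon entropy [bits/byte] and chi-square vs. uniform for a file prefix."""
    data = open(path, "rb").read(limit)
    counts = Counter(data)
    n = len(data)
    entropy = -sum(c / n * math.log2(c / n) for c in counts.values())
    expected = n / 256.0
    chi2 = sum((counts.get(b, 0) - expected) ** 2 / expected for b in range(256))
    return entropy, chi2

entropy, chi2 = byte_stats("suspect.bin")   # hypothetical seized file
# For uniform random bytes, entropy -> 8 and chi2 fluctuates around 255
# (mean of a chi-square with 255 degrees of freedom).
if entropy > 7.99 and 180 < chi2 < 340:
    print("high-entropy, uniform byte distribution: possible encrypted volume")
```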

  5. Geometrical considerations in dose volume analysis in intracavitary treatment

    Energy Technology Data Exchange (ETDEWEB)

    Deshpande, D.D. [Dept. of Medical Physics, Tata Memorial Hospital, Bombay (India); Shrivastava, S.K. [Dept. of Radiation Oncology, Tata Memorial Hospital, Bombay (India); Pradhan, A.S. [Bhabha Atomic Research Centre, Bombay (India); Viswanathan, P.S. [Dept. of Medical Physics, Tata Memorial Hospital, Bombay (India); Dinshaw, K.A. [Dept. of Radiation Oncology, Tata Memorial Hospital, Bombay (India)

    1996-06-01

    The present work aimed to study the relationship between the volume enclosed by the reference isodose surface and various geometrical parameters of the intracavitary applicator in the treatment of carcinoma of the cervix. The pear-shaped volume of the reference isodose, derived from the Total Reference Air Kerma (TRAK), and the product of its dimensions (height H, width W and thickness T, which depend on the applicator geometry) were estimated for 100 intracavitary applications treated on a Selectron LDR machine. Orthogonal radiographs taken for each patient were used for measurement of the actual geometric dimensions of the applicator and for carrying out the dosimetry on the TP-11 treatment planning system. The dimensions H, W and T of the reference isodose surface (60 Gy) were also noted. The ratio of the product HWT to the pear-shaped volume was found to be mainly a function of colpostat separation, and not of other geometrical parameters such as the maximum vertical and antero-posterior dimensions of the applicator. The ratio remained almost constant for a particular combination of uterine tandem and colpostat. Variations in the ratio were attributed to non-standard geometry. The ratio of the volume of the reference isodose surface to the product of its dimensions thus depends upon the colpostat separation. (orig./MG)
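
    As an illustrative check (not from the paper): if the reference isodose surface were a perfect ellipsoid with axes H, W and T, the ratio HWT/V would be the constant 6/pi, about 1.91, so a near-constant measured ratio for a given applicator combination behaves like a fixed-shape surface. The dimensions below are hypothetical.

```python
import math

H, W, T = 6.0, 5.0, 4.0                    # cm, hypothetical isodose dimensions
V_ellipsoid = math.pi / 6.0 * H * W * T    # ellipsoid volume from its axes
print(H * W * T / V_ellipsoid)             # 6/pi ~ 1.9099, independent of H, W, T
```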

  6. Chemical analysis and volume reduction of radioactive HEPA filter waste

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, In Ho; Choi, Wang Kyu; Lee, Suk Chol; Min, Byung Youn; Yang, Hee Chul; Lee, Kun Woo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-05-15

    Owing to the active operation of nuclide facilities at KAERI, many spent filters used in the ventilation systems of these nuclear facilities have been generated as spent filter waste. This waste generally consists of HEPA filters that have captured the contaminants in the air streams generated during the operation of the nuclide facilities. Therefore, this study was conducted to investigate the radionuclides and heavy metals in HEPA filters, and the characteristics of melting as a decontamination and volume reduction

  7. Methodology for seismic analysis of FBR core assembly using variable added damping

    International Nuclear Information System (INIS)

    It is necessary to consider the fluid-solid coupling effect due to the interaction between the coolant and the core assemblies when analyzing the seismic performance of an FBR core assembly. The added damping was mostly treated as a constant in previous research. In fact, the effect of the coolant on the assemblies depends strongly on the gap between the core assemblies, so the damping should be treated as a variable. In order to simulate the vibration of the core assembly more accurately, a methodology for the seismic analysis of FBR core assemblies using variable added damping was studied. In this paper, a seismic analysis model of a single row of core assemblies (5 assemblies) of an FBR was established. A comparison of the two added-damping models, constant and variable, shows that seismic analysis of the core assemblies with variable added damping is feasible and effective. Meanwhile, the simulation method used in this paper yields a more precise approximation of the vibration of the core assembly and lays the foundation for more realistically simulating the seismic response of reactor core assemblies. It also helps to reduce the conservative margin of the structural design and is meaningful in engineering applications. (authors)
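
    The paper's own damping model is not reproduced in the abstract, so the toy sketch below only illustrates why gap dependence matters: squeeze-film theory for a narrow fluid gap suggests the added damping grows roughly with the inverse cube of the gap, so a constant value calibrated at the nominal gap badly underestimates the coupling as assemblies approach each other. All numbers are hypothetical.

```python
import numpy as np

def added_damping(h, c_ref=100.0, h_ref=3.0e-3):
    """Toy gap-dependent added damping [N*s/m]: c(h) = c_ref * (h_ref / h)^3.

    h     : coolant gap between adjacent assemblies [m]
    c_ref : damping at the reference (nominal) gap h_ref, hypothetical value
    """
    return c_ref * (h_ref / np.asarray(h)) ** 3

gaps = np.array([3.0e-3, 2.0e-3, 1.0e-3])   # nominal gap down to near-contact
print(added_damping(gaps))                   # damping rises ~27x at a 1 mm gap
```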

  8. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume II. Software description and listings

    International Nuclear Information System (INIS)

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for a spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings

  9. User's manual and analysis methodology of probabilistic fracture mechanics analysis code PASCAL Ver.2 for reactor pressure vessel (Contract research)

    International Nuclear Information System (INIS)

    As a part of the aging structural integrity research for LWR components, the probabilistic fracture mechanics (PFM) analysis code PASCAL (PFM Analysis of Structural Components in Aging LWR) has been developed at JAEA. This code evaluates the conditional probabilities of crack initiation and fracture of a reactor pressure vessel (RPV) under transient conditions such as pressurized thermal shock (PTS). The development of the code has aimed at improving the accuracy and reliability of analysis by introducing new analysis methodologies and algorithms reflecting recent developments in fracture mechanics and computer performance. PASCAL Ver.1 has functions for optimized sampling in stratified Monte Carlo simulation, the elastic-plastic fracture criterion of the R6 method, crack growth analysis models for a semi-elliptical crack, recovery of fracture toughness due to thermal annealing, and so on. Since then, under contract between the Ministry of Economy, Trade and Industry of Japan and JAEA, we have continued to develop and introduce new functions into PASCAL Ver.2, such as an evaluation method for an embedded crack, a K_I database for a semi-elliptical crack considering stress discontinuity at the base/cladding interface, a PTS transient database, and others. A generalized analysis method is proposed on the basis of the development of PASCAL Ver.2 and the results of sensitivity analyses. A graphical user interface (GUI) including the generalized method as default values has also been developed for PASCAL Ver.2. This report provides the user's manual and theoretical background of PASCAL Ver.2. (author)
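
    The abstract does not reproduce PASCAL's models, so the sketch below only illustrates the skeleton of a PFM calculation of this kind: sample the uncertain inputs, compare an applied stress intensity factor with the sampled toughness, and count failures. The distributions, the stress value and the K_I approximation are all placeholders, not PASCAL's correlations.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Uncertain inputs (placeholder distributions).
a = rng.lognormal(mean=np.log(3e-3), sigma=0.5, size=n)   # crack depth [m]
K_Ic = rng.normal(loc=80.0, scale=10.0, size=n)           # toughness [MPa*sqrt(m)]
sigma = 300.0                                             # membrane stress [MPa]

# Simple surface-crack approximation for the applied stress intensity factor.
K_I = 1.12 * sigma * np.sqrt(np.pi * a)

# Conditional failure probability: fraction of samples where K_I exceeds K_Ic.
p_fail = np.mean(K_I > K_Ic)
print(f"conditional failure probability ~ {p_fail:.2e}")
```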

  10. An advanced human reliability analysis methodology: analysis of cognitive errors focused on

    International Nuclear Information System (INIS)

    Conventional Human Reliability Analysis (HRA) methods such as THERP/ASEP, HCR and SLIM have been criticised for their deficiencies in analysing the cognitive errors which occur during the operator's decision-making process. In order to overcome the limitations of the conventional methods, an advanced HRA method, the so-called second-generation HRA method, covering both qualitative analysis and quantitative assessment of cognitive errors, has been under development based on the state-of-the-art theory of cognitive systems engineering and error psychology. The method was developed on the basis of a human decision-making model and the relation between cognitive functions and performance influencing factors. The application of the proposed method to two emergency operation tasks is presented

  11. Typology of end-of-life priorities in Saudi females: averaging analysis and Q-methodology

    Directory of Open Access Journals (Sweden)

    Hammami MM

    2016-05-01

    Muhammad M Hammami,1,2 Safa Hammami,1 Hala A Amer,1 Nesrine A Khodr1 1Clinical Studies and Empirical Ethics Department, King Faisal Specialist Hospital and Research Centre, 2College of Medicine, Alfaisal University, Riyadh, Saudi Arabia Background: Understanding culture- and sex-related end-of-life preferences is essential to provide quality end-of-life care. We have previously explored end-of-life choices in Saudi males and found important culture-related differences, and that Q-methodology is useful in identifying intracultural, opinion-based groups. Here, we explore Saudi females' end-of-life choices. Methods: A volunteer sample of 68 females rank-ordered 47 opinion statements on end-of-life issues into a nine-category symmetrical distribution. The ranking scores of the statements were analyzed by averaging analysis and Q-methodology. Results: The mean age of the females in the sample was 30.3 years (range, 19–55 years). Among them, 51% reported average religiosity, 78% reported very good health, 79% reported very good life quality, and 100% reported high-school education or more. The extreme five overall priorities were to be able to say the statement of faith, be at peace with God, die without having the body exposed, maintain dignity, and resolve all conflicts. The extreme five overall dis-priorities were to die in the hospital, die well dressed, be informed about impending death by family/friends rather than doctor, die at the peak of life, and not know if one has a fatal illness. Q-methodology identified five opinion-based groups with qualitatively different characteristics: “physical and emotional privacy concerned, family caring” (younger, lower religiosity), “whole person” (higher religiosity), “pain and informational privacy concerned” (lower life quality), “decisional privacy concerned” (older, higher life quality), and “life quantity concerned, family dependent” (high life quality, low life satisfaction). Out of the
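
    For readers unfamiliar with the Q-methodology computation, the sketch below shows its core mechanics: Q-sorts are correlated person by person and the correlation matrix is factor-analysed (here with a plain PCA stand-in for the usual centroid/varimax pipeline), so that respondents who sorted the statements similarly load on the same opinion factor. The data are synthetic; this is not the study's analysis code.

```python
import numpy as np

rng = np.random.default_rng(1)
n_statements, n_people = 47, 68
# Stand-in Q-sorts: each person assigns statements to categories -4..+4.
sorts = rng.integers(-4, 5, size=(n_statements, n_people))

# Person-by-person correlation matrix (people are the "variables" in Q).
R = np.corrcoef(sorts, rowvar=False)

# Factor-analyse via eigen-decomposition; keep the five largest factors.
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order[:5]] * np.sqrt(eigvals[order[:5]])

# Assign each respondent to the factor on which they load most strongly.
groups = np.argmax(np.abs(loadings), axis=1)
print(np.bincount(groups, minlength=5))   # sizes of the five opinion groups
```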

  12. WaterSense Program: Methodology for National Water Savings Analysis Model Indoor Residential Water Use

    Energy Technology Data Exchange (ETDEWEB)

    Whitehead, Camilla Dunham; McNeil, Michael; Dunham_Whitehead, Camilla; Letschert, Virginie; della_Cava, Mirka

    2008-02-28

    The U.S. Environmental Protection Agency (EPA) influences the market for plumbing fixtures and fittings by encouraging consumers to purchase products that carry the WaterSense label, which certifies those products as performing at low flow rates compared to unlabeled fixtures and fittings. As consumers decide to purchase water-efficient products, water consumption will decline nationwide. Decreased water consumption should prolong the operating life of water and wastewater treatment facilities. This report describes the method used to calculate national water savings attributable to EPA's WaterSense program. A Microsoft Excel spreadsheet model, the National Water Savings (NWS) analysis model, accompanies this methodology report. Version 1.0 of the NWS model evaluates indoor residential water consumption. Two additional documents, a Users' Guide to the spreadsheet model and an Impacts Report, accompany the NWS model and this methodology document. Altogether, these four documents represent Phase One of this project. The Users' Guide leads policy makers through the spreadsheet options available for projecting the water savings that result from various policy scenarios. The Impacts Report shows national water savings that will result from differing degrees of market saturation of high-efficiency water-using products. This detailed methodology report describes the NWS analysis model, which examines the effects of WaterSense by tracking the shipments of products that WaterSense has designated as water-efficient. The model estimates market penetration of products that carry the WaterSense label. Market penetration is calculated for both existing and new construction. The NWS model estimates savings based on an accounting analysis of water-using products and of building stock. Estimates of future national water savings will help policy makers further direct the focus of WaterSense and calculate stakeholder impacts from the program. Calculating the total gallons of water the
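
    The accounting logic described above reduces to installed stock times per-use savings. The toy sketch below shows that shape of calculation for a single product class; the shipment figures, flush rates and flow ratings are invented and stand in for the model's tracked shipment and building-stock data.

```python
# Toy stock-accounting calculation in the spirit of the NWS model.
base_gpf, labeled_gpf = 1.6, 1.28          # gallons per flush (illustrative)
flushes_per_fixture_day = 12.0             # household flushes on one toilet

stock_millions = 0.0                       # installed labeled fixtures
for year in range(2007, 2012):
    stock_millions += 4.0                  # assumed labeled shipments per year
    annual_gal = (stock_millions * 1e6 * flushes_per_fixture_day
                  * (base_gpf - labeled_gpf) * 365.0)
    print(year, f"{annual_gal / 1e9:.1f} billion gallons saved")
```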

  13. Dose volume analysis in brachytherapy and stereotactic radiosurgery

    CERN Document Server

    Tozer-Loft, S M

    2000-01-01

    compared with a range of figures of merit which express different aspects of the quality of each dose distribution. The results are analysed in an attempt to answer the question: What are the important features of the dose distribution (conformality, uniformity, etc) which show a definite relationship with the outcome of the treatment? Initial results show positively that, when Gamma Knife radiosurgery is used to treat acoustic neuroma, some measures of conformality seem to have a surprising, but significant association with outcome. A brief introduction to three branches of radiotherapy is given: interstitial brachytherapy, external beam megavoltage radiotherapy, and stereotactic radiosurgery. The current interest in issues around conformity, uniformity and optimisation is explained in the light of technical developments in these fields. A novel method of displaying dose-volume information, which mathematically suppresses the inverse-square law, as first suggested by L.L. Anderson for use in brachytherapy i...

  14. Aerodynamic analysis of flapping foils using volume grid deformation code

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Jin Hwan [Seoul National University, Seoul (Korea, Republic of); Kim, Jee Woong; Park, Soo Hyung; Byun, Do Young [Konkuk University, Seoul (Korea, Republic of)

    2009-06-15

    Nature-inspired flapping foils have attracted interest for their high thrust efficiency, but the large motions of their boundaries need to be considered. It is challenging to develop robust, efficient grid deformation algorithms appropriate for such large motions in three dimensions. In this paper, a volume grid deformation code is developed based on finite macro-elements and transfinite interpolation, and successfully interfaced to a structured multi-block Navier-Stokes code. A suitable condition that generates the macro-elements efficiently and improves the robustness of grid regularity is presented as well. As demonstrated on an airfoil with various flapping-related motions, the aerodynamic forces computed by the developed method are shown to be in good agreement with experimental data or previous numerical solutions
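
    As background on the interpolation ingredient, the sketch below implements plain two-dimensional transfinite interpolation, the standard construction that rebuilds interior grid points from four boundary curves so that boundary motion propagates smoothly inward. It is a generic textbook version, not the paper's macro-element code, and the grid size and boundary motion are invented.

```python
import numpy as np

def tfi_2d(bottom, top, left, right):
    """2D transfinite interpolation.

    bottom/top: (ni, 2) boundary curves; left/right: (nj, 2) boundary curves;
    corner points must be shared between the curves that meet there.
    """
    ni, nj = bottom.shape[0], left.shape[0]
    s = np.linspace(0.0, 1.0, ni)[:, None, None]   # xi
    t = np.linspace(0.0, 1.0, nj)[None, :, None]   # eta
    lin = ((1 - t) * bottom[:, None, :] + t * top[:, None, :]
           + (1 - s) * left[None, :, :] + s * right[None, :, :])
    corners = ((1 - s) * (1 - t) * bottom[0] + s * (1 - t) * bottom[-1]
               + (1 - s) * t * top[0] + s * t * top[-1])
    return lin - corners   # (ni, nj, 2) grid of point coordinates

ni = nj = 21
x = np.linspace(0.0, 1.0, ni)
eta = np.linspace(0.0, 1.0, nj)
bottom = np.stack([x, 0.1 * np.sin(np.pi * x)], axis=1)  # "flapped" lower wall
top = np.stack([x, np.ones_like(x)], axis=1)
left = np.stack([np.zeros_like(eta), eta], axis=1)
right = np.stack([np.ones_like(eta), eta], axis=1)

grid = tfi_2d(bottom, top, left, right)   # interior follows the moved wall
print(grid.shape, grid[ni // 2, nj // 2])
```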

  15. Economic analysis of the space shuttle system, volume 1

    Science.gov (United States)

    1972-01-01

    An economic analysis of the space shuttle system is presented. The analysis is based on economic benefits, recurring costs, non-recurring costs, and economic tradeoff functions. The most economic space shuttle configuration is determined on the basis of: (1) the objectives of a reusable space transportation system, (2) the various space transportation systems considered, and (3) alternative space shuttle systems.

  16. Outcomes from the regional Co-operation in the Area of the Safety Analysis Methodology

    International Nuclear Information System (INIS)

    The International Atomic Energy Agency (IAEA) carried out the Co-ordinated Research Program (CRP) on 'Validation of Accident and Safety Analysis Methodology' in the period between 1995 and 1998. Three areas of interest identified by the participants referred to pressurised water reactors of Western and Eastern type (PWR and WWER). The specific areas of attention were: system behaviour of the primary and secondary loops (PS area), the containment response (CO area) and severe accidents (SA area). During the CRP it became clear that the technology advancements, the available tools (i.e. codes) and the experimental databases in the above areas are quite different. At the conclusion of the CRP, all objectives of the program had been reached. This paper presents a summary of the regional co-operation in this framework. The CRP activities focused on the codes and expertise available at the participating organisations. This overview therefore summarises their experience related to the state of the art in the field of computational accident analysis. In addition, the paper proposes recommendations for future activities related to code usage, user effects and code development. In pursuit of these goals, special attention is given to the importance of international co-operation. (author)

  17. An Alternative Methodological Approach for Cost-Effectiveness Analysis and Decision Making in Genomic Medicine.

    Science.gov (United States)

    Fragoulakis, Vasilios; Mitropoulou, Christina; van Schaik, Ron H; Maniadakis, Nikolaos; Patrinos, George P

    2016-05-01

    Genomic Medicine aims to improve therapeutic interventions and diagnostics and the quality of life of patients, but also to rationalize healthcare costs. To reach this goal, careful assessment and identification of evidence gaps for public health genomics priorities are required so that a more efficient healthcare environment is created. Here, we propose a public health genomics-driven approach that adjusts the classical healthcare decision-making process with an alternative methodological approach to cost-effectiveness analysis, which is particularly helpful for genomic medicine interventions. By combining classical cost-effectiveness analysis with budget constraints, social preferences, and patient ethics, we demonstrate the application of this model, the Genome Economics Model (GEM), based on a previously reported genome-guided intervention from a developing country environment. The model and the attendant rationale provide a practical guide by which all major healthcare stakeholders could ensure the sustainability of funding for genome-guided interventions, their adoption and coverage by health insurance funds, and prioritization of Genomic Medicine research, development, and innovation, given the restriction of budgets, particularly in developing countries and low-income healthcare settings in developed countries. The implications of the GEM for policy makers interested in Genomic Medicine and new health technology and innovation assessment are also discussed. PMID:27096406
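
    To ground the terms, the sketch below shows the classical cost-effectiveness core that such a model extends: an incremental cost-effectiveness ratio screened against a willingness-to-pay threshold, plus the kind of budget-constraint check the GEM emphasizes. All figures are hypothetical and do not come from the paper.

```python
# Classical cost-effectiveness screening with a budget constraint.
cost_std, qaly_std = 12_000.0, 6.0     # standard of care, per patient
cost_gen, qaly_gen = 15_500.0, 6.4     # genome-guided intervention, per patient

# Incremental cost-effectiveness ratio: extra cost per extra QALY gained.
icer = (cost_gen - cost_std) / (qaly_gen - qaly_std)

wtp_threshold = 20_000.0               # willingness to pay per QALY (assumed)
budget, eligible = 3.0e6, 1_000        # annual budget and eligible patients

# Even a cost-effective intervention can be unaffordable for the payer.
affordable = eligible * (cost_gen - cost_std) <= budget
print(f"ICER = {icer:,.0f} per QALY;",
      "cost-effective" if icer <= wtp_threshold else "not cost-effective",
      "and affordable" if affordable else "but exceeds the budget")
```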

  18. Analysis and design methodology for the development of optimized direct detection CO2 DIAL receivers

    Science.gov (United States)

    Cooke, Bradly J.; Laubscher, Bryan E.; Cafferty, Maureen M.; Olivas, Nicholas L.; Schmitt, Mark J.; Fuller, Kenneth R.; Goeller, Roy M.; Mietz, Donald E.; Tiee, Joseph J.; Sander, Robert K.; Vampola, John L.; Price, Stephen L.; Kasai, Ichiro

    1997-10-01

    The analysis methodology and corresponding analytical tools for the design of optimized, low-noise, hard-target-return CO2 differential absorption lidar (DIAL) receiver systems implementing both single-element detectors and multi-pixel imaging arrays for passive/active remote-sensing applications are presented. System parameters and components composing the receiver include: aperture, focal length, field of view, cold shield requirements, image plane dimensions, pixel dimensions, pixel pitch and fill factor, detection quantum efficiency, optical filter requirements, and amplifier and temporal sampling parameters. The performance analysis is accomplished by calculating the system's CO2 laser range response, total noise, optical geometric form factor and optical resolution. The noise components include speckle, photon noise due to signal, scene and atmospheric background, cold shield, and electronic noise. System resolution is simulated through cascaded optical transfer functions and includes effects due to atmosphere, optics, image sampling, and system motion. Experimental results of a developmental single-element detector receiver designed to detect 100 ns wide laser pulses (10 - 100 kHz pulse repetition rates) backscattered from hard targets at nominal ranges of 10 km are presented. The receiver sensitivity is near-background-noise limited, given an 8.5 - 11.5 micrometer radiant optical bandwidth, with the total noise floor spectrally white for maximum pulse averaging efficiency.
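
    Since the listed noise terms are statistically independent, a receiver noise budget adds them in quadrature; the sketch below shows that arithmetic and the background-limited check implied by the last sentence. The individual values are placeholders, not the paper's measurements.

```python
import math

# Hypothetical noise budget, in electrons (rms) per sample.
noise = {
    "signal_shot": 40.0,
    "background_shot": 220.0,   # scene + atmosphere, 8.5 - 11.5 um band
    "cold_shield": 30.0,
    "speckle": 90.0,
    "electronics": 60.0,
}

# Independent terms add in quadrature (root-sum-square).
total = math.sqrt(sum(v * v for v in noise.values()))
print(f"total ~ {total:.0f} e- rms;",
      f"background fraction = {noise['background_shot'] / total:.2f}")
# A background fraction near 1 means the receiver is background-noise limited.
```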

  19. Methodology for the physical and chemical exergetic analysis of steam boilers

    International Nuclear Information System (INIS)

    This paper presents a framework for thermodynamic (energy and exergy) analyses of industrial steam boilers. Mass, energy, and exergy analyses were used to develop a methodology for evaluating thermodynamic properties and the energy and exergy input and output resources in industrial steam boilers. The resulting methods provide an analytic procedure for the physical and chemical exergetic analysis of steam boilers for appropriate applications. The energy and exergy efficiencies obtained for the entire boiler were 69.56% and 38.57%, respectively, at a standard reference-state temperature of 25 °C and an evaporation ratio of 12. The chemical exergy of the material streams was considered to offer more comprehensive detail on energy and exergy resource allocation and on the losses of the processes in a steam boiler. Highlights: (1) thermodynamic properties and performance variables associated with material streams were evaluated; (2) resource allocation and the magnitude of exergetic losses in steam boilers were analysed; (3) the chemical exergy of material streams contributed to improved exergy values; (4) higher operational parameters lead to higher boiler exergy; (5) exergy destruction was higher in the combustion unit than in the heat-exchanging unit.
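
    The gap between the 69.56% energy efficiency and the 38.57% exergy efficiency reflects the fact that steam carries much less work potential than energy. The sketch below computes the specific physical exergy of a steam stream, e_ph = (h - h0) - T0 (s - s0); the property values are rounded steam-table figures for illustration, and a property library would normally supply h and s.

```python
# Physical (flow) exergy of a steam stream relative to a 25 C dead state.
T0 = 298.15                      # reference temperature [K]
h0, s0 = 104.9, 0.3674           # saturated liquid water at 25 C [kJ/kg, kJ/kg.K]

h, s = 3213.6, 6.7690            # steam at ~4 MPa, 400 C (rounded table values)
e_ph = (h - h0) - T0 * (s - s0)  # specific physical exergy [kJ/kg]

energy_in = h - h0               # energy content on the same reference
print(f"exergy = {e_ph:.0f} kJ/kg vs energy = {energy_in:.0f} kJ/kg "
      f"({e_ph / energy_in:.0%} of the energy is work potential)")
```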

  20. ANALYSIS OF THE EFFECTIVENESS AND EFFICIENCY OF MANAGEMENT SYSTEMS BASED ON SYSTEM ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Yurij Vasilkov

    2014-09-01

    In this paper we consider the problem of analyzing the effectiveness and efficiency of management systems, which is especially relevant when implementing the requirements of ISO 9001, ISO 14001 and other standards at an enterprise. Studying a management system on the basis of a systematic approach focuses on revealing its integrative (i.e. systemic) qualities and on identifying the variety of relationships and mechanisms that produce these qualities. This makes it possible to identify the causes of the real state of affairs and to explain successes and failures. An important aspect of a systematic approach to the analysis of the effectiveness and efficiency of production control management is the multiplicity of interests of the "stakeholders" involved in the production process when operational goals and the ways to achieve them are formed.

  1. Application of probabilistic safety analysis methodology to physical security in Unit 1 of Laguna Verde Nuclear Power plant

    International Nuclear Information System (INIS)

    The implementation and application of a methodology for probabilistic safety analysis in the vulnerability analysis project of Unit 1 of the Laguna Verde Nuclear Power plant (CNLV), performed by Comision Federal de Electricidad with the technical support of Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS), is presented in this work. The results obtained by applying this methodology identify the most important targets or fundamental areas of CNLV in which sabotage actions could endanger the physical security of the plant and of the population in general. (Author)

  2. Coal gasification systems engineering and analysis, volume 2

    Science.gov (United States)

    1980-01-01

    The major design-related features of each generic plant system were characterized in a catalog. Based on the catalog and requirements data, approximately 17 designs and cost estimates were developed for MBG and alternate products. A series of generic trade studies was conducted to support all of the design studies. A set of cost and programmatic analyses was conducted to supplement the designs. The cost methodology employed for the design and sensitivity studies was documented and implemented in a computer program. Plant design and construction schedules were developed for the K-T, Texaco, and B&W MBG plant designs. A generic work breakdown structure was prepared, based on the K-T design, to coincide with TVA's planned management approach. An extensive set of cost sensitivity analyses was completed for the K-T, Texaco, and B&W designs. Product price competitiveness was evaluated for MBG and the alternate products. A draft management policy and procedures manual was evaluated. A supporting technology development plan was developed to address high technology risk issues. The issues were identified and ranked in terms of importance and tractability, and a plan developed for obtaining data or developing technology required to mitigate the risk.

  3. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 2-Sequoyah Unit 2 Cycle 3

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M.

    1995-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized-water reactors. The analysis methodology selected for all the calculations reported herein is based on the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies in the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k_eff) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of three reactor critical configurations for Sequoyah Unit 2 Cycle 3. This unit and cycle were chosen because of their relevance to spent fuel benchmark applications: (1) the unit had a significantly long downtime of 2.7 years during the middle of cycle (MOC) 3, and (2) the core consisted entirely of burned fuel at the MOC restart. The first benchmark critical calculation was the MOC restart at hot, full-power (HFP) critical conditions. The

  4. Statistical analysis of the blast furnace process output parameter using arima control chart with proposed methodology of control limits setting

    OpenAIRE

    Noskievičová, Darja

    2009-01-01

    The paper deals with the statistical analysis of a selected output parameter of the blast furnace process. The analysed measurements were shown to be autocorrelated, and the ARIMA control chart was selected for their analysis as a very useful statistical process control (SPC) method. First, a methodology for setting control limits in ARIMA control charts, treating time-series outlier analysis as an integral part of ARIMA model building, was designed. Then this proposal was applied to ...

  6. Photovoltaic venture analysis. Final report. Volume III. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    This appendix contains a brief summary of a detailed description of alternative future energy scenarios which provide an overall backdrop for the photovoltaic venture analysis. Also included is a summary of a photovoltaic market/demand workshop, a summary of a photovoltaic supply workshop which used cross-impact analysis, and a report on photovoltaic array and system prices in 1982 and 1986. The results of a sectorial demand analysis for photovoltaic power systems used in the residential sector (single family homes), the service, commercial, and institutional sector (schools), and in the central power sector are presented. An analysis of photovoltaics in the electric utility market is given, and a report on the industrialization of photovoltaic systems is included. A DOE information memorandum regarding ''A Strategy for a Multi-Year Procurement Initiative on Photovoltaics (ACTS No. ET-002)'' is also included. (WHK)

  7. 3RD WP PROBABILISTIC CRITICALITY ANALYSIS: METHODOLOGY FOR BASKET DEGRADATION WITH APPLICATION TO COMMERCIAL SNF

    Energy Technology Data Exchange (ETDEWEB)

    P. Goulib

    1997-09-15

    This analysis was prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development (WPD) department to describe the latest version of the probabilistic criticality analysis methodology and its application to the entire commercial waste stream of commercial pressurized water reactor (PWR) spent nuclear fuel (SNF) expected to be emplaced in the repository. The purpose of this particular application is to evaluate the 21-assembly PWR absorber plate waste package (WP) with respect to degraded-mode criticality performance. The degradation of principal concern is of the borated stainless steel absorber plates, which are part of the waste package basket and which constitute a major part of the waste package criticality control. The degradation (corrosion, dissolution) of this material will result in the release of most of the boron from the waste package and increase the possibility of criticality. The results of this evaluation will be expressed in terms of the fraction of the PWR SNF which can exceed a given k_eff, as a function of time, and the peak value of that fraction over a time period up to several hundred thousand years. The ultimate purpose of this analysis is to support the waste package design, which defines waste packages to cover a range of SNF characteristics. In particular, with respect to PWR criticality the current categories are: (1) no specific criticality control material, (2) borated stainless steel plates in the waste package basket, and (3) zirconium-clad boron carbide control rods (Ref. 5.4). The results of this analysis will indicate the coverage provided by the first two categories. With these results, this study will provide the first quantitative estimate of the benefit expected from the control measure consisting of borated stainless steel plates. This document is the third waste package probabilistic criticality analysis. The first two analyses (Ref. 5.12 for the first and Ref. 5.15 for the second) were based primarily on the

  8. SLUDGE TREATMENT PROJECT ALTERNATIVES ANALYSIS SUMMARY REPORT (VOLUME 1)

    International Nuclear Information System (INIS)

    Highly radioactive sludge (containing up to 300,000 curies of actinides and fission products) resulting from the storage of degraded spent nuclear fuel is currently stored in temporary containers located in the 105-K West storage basin near the Columbia River. The background, history, and known characteristics of this sludge are discussed in Section 2 of this report. There are many compelling reasons to remove this sludge from the K-Basin. These reasons are discussed in detail in Section 1, and they include the following: (1) Reduce the risk to the public (from a potential release of highly radioactive material as fine respirable particles by airborne or waterborne pathways); (2) Reduce the overall risk to the Hanford worker; and (3) Reduce the risk to the environment (the K-Basin is situated above a hazardous chemical contaminant plume and hinders remediation of the plume until the sludge is removed). The DOE-RL has stated that a key DOE objective is to remove the sludge from the K-West Basin and River Corridor as soon as possible, which will reduce risks to the environment, allow for remediation of contaminated areas underlying the basins, and support closure of the 100-KR-4 operable unit. The environmental and nuclear safety risks associated with this sludge have resulted in multiple legal and regulatory remedial action decisions, plans, and commitments that are summarized in Table ES-1 and discussed in more detail in Volume 2, Section 9.

  9. SLUDGE TREATMENT PROJECT ALTERNATIVES ANALYSIS SUMMARY REPORT [VOLUME 1]

    Energy Technology Data Exchange (ETDEWEB)

    FREDERICKSON JR; ROURK RJ; HONEYMAN JO; JOHNSON ME; RAYMOND RE

    2009-01-19

    Highly radioactive sludge (containing up to 300,000 curies of actinides and fission products) resulting from the storage of degraded spent nuclear fuel is currently stored in temporary containers located in the 105-K West storage basin near the Columbia River. The background, history, and known characteristics of this sludge are discussed in Section 2 of this report. There are many compelling reasons to remove this sludge from the K-Basin. These reasons are discussed in detail in Section 1, and they include the following: (1) Reduce the risk to the public (from a potential release of highly radioactive material as fine respirable particles by airborne or waterborne pathways); (2) Reduce the overall risk to the Hanford worker; and (3) Reduce the risk to the environment (the K-Basin is situated above a hazardous chemical contaminant plume and hinders remediation of the plume until the sludge is removed). The DOE-RL has stated that a key DOE objective is to remove the sludge from the K-West Basin and River Corridor as soon as possible, which will reduce risks to the environment, allow for remediation of contaminated areas underlying the basins, and support closure of the 100-KR-4 operable unit. The environmental and nuclear safety risks associated with this sludge have resulted in multiple legal and regulatory remedial action decisions, plans, and commitments that are summarized in Table ES-1 and discussed in more detail in Volume 2, Section 9.

  10. Photovoltaic venture analysis. Final report. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    A description of the integrating model for photovoltaic venture analysis is given; input assumptions for the model are described; and the integrating model program listing is given. The integrating model is an explicit representation of the interactions between photovoltaic markets and supply under alternative sets of assumptions. First, it provides a consistent way of assembling and integrating the various assumptions, data, and information that have been obtained on photovoltaic systems supply and demand factors. Second, it provides a mechanism for understanding the implications of all the interacting assumptions. By representing the assumptions in a common, explicit framework, much more complex interactions can be considered than are possible intuitively. The integrating model therefore provides a way of examining the relative importance of different assumptions, parameters, and inputs through sensitivity analysis. Also, detailed results of model sensitivity analysis and detailed market and systems information are presented. (WHK)

  11. Computer Aided Methodology for Simultaneous Synthesis, Design & Analysis of Chemical Products-Processes

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2006-01-01

    A new combined methodology for computer aided molecular design and process flowsheet design is presented. The methodology is based on the group contribution approach for prediction of molecular properties and design of molecules. Using the same principles, process groups have been developed toget...

  12. Methodological Challenges in Researching Threshold Concepts: A Comparative Analysis of Three Projects

    Science.gov (United States)

    Quinlan, K. M.; Male, S.; Baillie, C.; Stamboulis, A.; Fill, J.; Jaffer, Z.

    2013-01-01

    Threshold concepts were introduced nearly 10 years ago by Ray Land and Jan Meyer. This work has spawned four international conferences and hundreds of papers. Although the idea has clearly gained traction in higher education, this sub-field does not yet have a fully fledged research methodology or a strong critical discourse about methodology.…

  13. Methodology and boundary conditions applied to the analysis on internal flooding for Kozloduy NPP units 5 and 6

    International Nuclear Information System (INIS)

    Within the Modernization Program of Units 5 and 6 of Kozloduy NPP, a comprehensive analysis of internal flooding has been carried out for the reactor building outside the containment and for the turbine hall by FRAMATOME ANP and ENPRO Consult. The objective of this presentation is to provide information on the applied methodology and boundary conditions. A separate report called 'Methodology and boundary conditions' has been elaborated in order to provide the foundation for the study. The methodology report provides definitions and advice on the following topics: scope of the study; safety objectives; basic assumptions and postulates (plant conditions, grace periods for manual actions, single failure postulate, etc.); sources of flooding (postulated piping leaks and ruptures, malfunctions and personnel error); main activities of the flooding analysis; study conclusions and suggestions of remedial measures. (authors)

  14. Photovoltaic venture analysis. Final report. Volume I. Executive summary

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    The objective of the study, government programs under investigation, and a brief review of the approach are presented. Potential markets for photovoltaic systems relevant to the study are described. The response of the photovoltaic supply industry is then considered. A model which integrates the supply and demand characteristics of photovoltaics over time was developed. This model also calculates the economic benefits associated with various government subsidy programs. Results are derived under alternative possible supply, demand, and macroeconomic conditions. A probabilistic analysis of the costs and benefits of a $380 million federal photovoltaic procurement initiative, as well as certain alternative strategies, is summarized. Conclusions and recommendations based on the analysis are presented.

  15. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    The availability of reliable tools and an associated methodology able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After outlining the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, on its extended experimental database and on the assessment of the GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO model uncertainties is of the utmost importance for accurate fuel design margin evaluation, as illustrated by some application examples. With the submittal of the GALILEO Topical Report to the U.S. NRC in 2013, GALILEO and its methodology are on the way to being used industrially in a wide range of irradiation conditions. (authors)
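
    The statistical core of such "realistic" (best-estimate plus uncertainty) methodologies is usually order-statistics based. The hedged sketch below shows the common first-order Wilks recipe: draw 59 random samples of the uncertain inputs, run the model once per sample, and take the maximum output as a one-sided 95%/95% bound. The stand-in "fuel model" and input distributions are invented and bear no relation to GALILEO's.

```python
import numpy as np

rng = np.random.default_rng(7)

def fuel_model(power, conductivity, gap):
    # Placeholder response surface standing in for a fuel-performance code.
    return 600.0 + 8.0 * power / conductivity + 3.0e4 * gap   # fake temp [C]

n = 59                                        # first-order one-sided Wilks size
power = rng.normal(40.0, 2.0, n)              # linear heat rate [kW/m], assumed
conductivity = rng.normal(3.0, 0.2, n)        # pellet conductivity [W/m.K]
gap = rng.normal(2.0e-3, 2.0e-4, n)           # pellet-clad gap [m]

temps = fuel_model(power, conductivity, gap)  # one code run per sample
print(f"95/95 bound on peak temperature ~ {temps.max():.0f} C")
```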

  16. A Framework for Decomposition and Analysis of Agile Methodologies During Their Adaptation

    Science.gov (United States)

    Mikulenas, Gytenis; Kapocius, Kestutis

    In recent years there has been a steady increase of interest in Agile software development methodologies and techniques, which are often positioned as proven alternatives to the traditional plan-driven approaches. However, although there is no shortage of Agile methodologies to choose from, the formal methods for actually choosing or adapting the right one are lacking. The aim of the presented research was to define the formal way of preparing Agile methodologies for adaptation and creating an adaptation process framework. We argue that Agile methodologies can be successfully broken down into individual parts that can be specified on three different levels and later analyzed with regard to problem/concern areas. Results of such decomposition can form the foundation for the decisions on the adaptation of the specific Agile methodology. A case study is included in this chapter to further clarify the proposed approach.

  17. The methodology for developing a prospective meta-analysis in the family planning community

    Directory of Open Access Journals (Sweden)

    Jacobson Janet C

    2011-04-01

    Conclusions: PMA is a novel research method that improves meta-analysis by including several study sites, establishing uniform reporting of specific outcomes, and yet allowing some independence on the part of individual sites with respect to the conduct of research. The inclusion of several sites increases statistical power to address important clinical questions. Compared to multi-center trials, PMA methodology encourages collaboration, aids in the development of new investigators, decreases study costs, and decreases time to publication. Trial registration: ClinicalTrials.gov NCT00613366, NCT00886834, NCT01001897, NCT01147497 and NCT01307111

  18. Do skeletal cephalometric characteristics correlate with condylar volume, surface and shape? A 3D analysis

    Directory of Open Access Journals (Sweden)

    Saccucci Matteo

    2012-05-01

    Objective: The purpose of this study was to determine condylar volume in subjects with different mandibular divergence and skeletal class using cone-beam computed tomography (CBCT) and analysis software. Materials and methods: For 94 patients (46 females and 48 males; mean age 24.3 ± 6.5 years), rendered reconstructions of the left and right temporomandibular joints (TMJs) were obtained. Subjects were then classified on the basis of the ANB angle and the GoGn-SN angle into three classes (I, II, III). The data for the different classes were compared. Results: No significant difference was observed in the whole sample between the right and the left sides in condylar volume. The analysis of mean volume among low, normal and high mandibular plane angles revealed a significantly higher volume and surface in low-angle subjects. Class III subjects also tended to show a higher condylar volume and surface than class I and class II subjects, although the difference was not significant. Conclusions: Higher condylar volume was a common characteristic of low-angle subjects compared with normal and high mandibular plane angle subjects. Skeletal class also appears to be associated with condylar volume and surface.

  19. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    International Nuclear Information System (INIS)

    Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.
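
    The combination step named in the abstract can be made concrete: given endpoint-wise score statistics and an estimate of their covariance (as in the Wei, Lin and Weissfeld setting), a single global statistic is the variance-weighted linear combination below. The numbers are made up for illustration; this is not the study's data.

```python
import numpy as np
from scipy import stats

# Endpoint-wise score statistics (1st/2nd/3rd recurrence, death) and a
# working covariance matrix; both are invented stand-ins.
z = np.array([1.9, 1.4, 1.1, 2.2])
sigma = np.array([[1.0, 0.6, 0.5, 0.4],
                  [0.6, 1.0, 0.6, 0.4],
                  [0.5, 0.6, 1.0, 0.5],
                  [0.4, 0.4, 0.5, 1.0]])

ones = np.ones_like(z)
w = np.linalg.solve(sigma, ones)          # optimal linear-combination weights
z_global = w @ z / np.sqrt(ones @ w)      # standard normal under H0
p = 2 * stats.norm.sf(abs(z_global))
print(f"global z = {z_global:.2f}, two-sided p = {p:.3f}")
```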

  20. A methodological framework for hydromorphological assessment, analysis and monitoring (IDRAIM) aimed at promoting integrated river management

    Science.gov (United States)

    Rinaldi, M.; Surian, N.; Comiti, F.; Bussettini, M.

    2015-12-01

    A methodological framework for hydromorphological assessment, analysis and monitoring (named IDRAIM) has been developed with the specific aim of supporting the management of river processes by integrating the objectives of ecological quality and flood risk mitigation. The framework builds on existing and up-to-date geomorphological concepts and approaches and has been tested on several Italian streams. The framework includes the following four phases: (1) catchment-wide characterization of the fluvial system; (2) evolutionary trajectory reconstruction and assessment of current river conditions; (3) description of future trends of channel evolution; and (4) identification of management options. The framework provides specific consideration of the temporal context, in terms of reconstructing the trajectory of past channel evolution as a basis for interpreting present river conditions and future trends. A series of specific tools has been developed for the assessment of river conditions, in terms of morphological quality and channel dynamics. These include: the Morphological Quality Index (MQI), the Morphological Dynamics Index (MDI), the Event Dynamics Classification (EDC), and the river morphodynamic corridors (MC and EMC). The monitoring of morphological parameters and indicators, alongside the assessment of future scenarios of channel evolution provides knowledge for the identification, planning and prioritization of actions for enhancing morphological quality and risk mitigation.

  1. Quantitative Analysis of Mutant Subclones in Chronic Myeloid Leukemia: Comparison of Different Methodological Approaches.

    Science.gov (United States)

    Preuner, Sandra; Barna, Agnes; Frommlet, Florian; Czurda, Stefan; Konstantin, Byrgazov; Alikian, Mary; Machova Polakova, Katerina; Sacha, Tomasz; Richter, Johan; Lion, Thomas; Gabriel, Christian

    2016-01-01

    Identification and quantitative monitoring of mutant BCR-ABL1 subclones displaying resistance to tyrosine kinase inhibitors (TKIs) have become important tasks in patients with Ph-positive leukemias. Different technologies have been established for patient screening. Various next-generation sequencing (NGS) platforms facilitating sensitive detection and quantitative monitoring of mutations in the ABL1-kinase domain (KD) have been introduced recently, and are expected to become the preferred technology in the future. However, broad clinical implementation of NGS methods has been hampered by the limited accessibility at different centers and the current costs of analysis which may not be regarded as readily affordable for routine diagnostic monitoring. It is therefore of interest to determine whether NGS platforms can be adequately substituted by other methodological approaches. We have tested three different techniques including pyrosequencing, LD (ligation-dependent)-PCR and NGS in a series of peripheral blood specimens from chronic myeloid leukemia (CML) patients carrying single or multiple mutations in the BCR-ABL1 KD. The proliferation kinetics of mutant subclones in serial specimens obtained during the course of TKI-treatment revealed similar profiles via all technical approaches, but individual specimens showed statistically significant differences between NGS and the other methods tested. The observations indicate that different approaches to detection and quantification of mutant subclones may be applicable for the monitoring of clonal kinetics, but careful calibration of each method is required for accurate size assessment of mutant subclones at individual time points. PMID:27136541

  2. Strategic analysis methodology for energy systems with remote island case study

    International Nuclear Information System (INIS)

    A strategic analysis methodology is presented for adaptive energy systems engineering to realize an optimal level of service in the context of a community's social, economic, and environmental position. The groundwork stage involves characterizing the social context, assessing available energy resources, identifying environmental issues, setting eco-resource limits, and quantifying socio-economic constraints for a given region. A spectrum of development options is then constructed according to the range of energy service levels identified for the sector under study. A spectrum of conceptual energy systems is generated and infrastructure investments and resource use are modeled. The outcome is a matrix of energy system investment possibilities for the range of energy demand levels reflecting the values, ideas, and expectations expressed by the community. These models are then used to assess technical feasibility and economic, environmental and social risk. The result is an easily understood graphical depiction of local aspirations, investment options, and risks which clearly differentiates development opportunities from non-viable concepts. The approach was applied to a case study on Rotuma, an isolated Pacific Island. The case study results show a clear development opportunity space for Rotuma where desired energy services are in balance with investment sources, resource availability, and environmental constraints.

  3. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    Science.gov (United States)

    Zain, Zakiyah; Aziz, Nazrina; Ahmad, Yuhaniz; Azwan, Zairul; Raduan, Farhana; Sagap, Ismail

    2014-12-01

    Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.

  4. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    Energy Technology Data Exchange (ETDEWEB)

    Zain, Zakiyah, E-mail: zac@uum.edu.my; Ahmad, Yuhaniz, E-mail: yuhaniz@uum.edu.my [School of Quantitative Sciences, Universiti Utara Malaysia, UUM Sintok 06010, Kedah (Malaysia); Azwan, Zairul, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Raduan, Farhana, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Sagap, Ismail, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com [Surgery Department, Universiti Kebangsaan Malaysia Medical Centre, Jalan Yaacob Latif, 56000 Bandar Tun Razak, Kuala Lumpur (Malaysia); Aziz, Nazrina, E-mail: nazrina@uum.edu.my

    2014-12-04

    Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.

  5. Sampling and analytical methodologies for energy dispersive X-ray fluorescence analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    The present document represents an attempt to summarize the most important features of the different forms of ED-XRF as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and by other scientists, who are not yet fully experienced in the application of ED-XRF to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability. Emphasis is also placed on the sources of errors affecting the sampling of airborne particulate matter. The analytical part of the document describes the different forms of ED-XRF and their potential applications. Spectrum evaluation, a key step in X-ray spectrometry, is covered in depth, including discussion of several calibration and peak fitting techniques and computer programs especially designed for this purpose. 148 refs, 25 figs, 13 tabs
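
    As a miniature of the spectrum-evaluation step discussed above, the sketch below fits a single Gaussian peak on a linear background to a synthetic spectrum slice and reports the net peak area with its uncertainty. Real ED-XRF evaluation handles overlapping lines, escape peaks and detector response, which this toy deliberately ignores.

```python
import numpy as np
from scipy.optimize import curve_fit

def peak(x, area, mu, sig, b0, b1):
    """Gaussian of given net area on a linear background b0 + b1*x."""
    gauss = area / (sig * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x - mu) / sig) ** 2)
    return gauss + b0 + b1 * x

# Synthetic spectrum slice standing in for real detector counts.
x = np.arange(180.0, 240.0)                     # channel numbers
rng = np.random.default_rng(3)
truth = peak(x, area=5000.0, mu=210.0, sig=4.0, b0=50.0, b1=-0.1)
counts = rng.poisson(truth).astype(float)

# Weighted least-squares fit with Poisson-like uncertainties.
p0 = [counts.sum(), 210.0, 3.0, 40.0, 0.0]
popt, pcov = curve_fit(peak, x, counts, p0=p0, sigma=np.sqrt(counts + 1))
area, err = popt[0], np.sqrt(pcov[0, 0])
print(f"net peak area = {area:.0f} +/- {err:.0f} counts")
```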

  6. Operability probabilistic analysis: methodology for economic improvement through the parallelization of process plants

    International Nuclear Information System (INIS)

    One of the major challenges that emergent technologies must overcome is economic competitiveness with respect to currently established technologies: they must not only use energy resources and raw materials efficiently in their production processes, but also maximize the return on the economic resources committed in the initial plant investment. In special cases, such as those related to electric power or fuel production, the fixed cost represents a high percentage of the total cost, and a strong dependence on the plant factor is observed. This parameter is subject to variations that cannot be scheduled but that can be predicted by means of analytic tools capable of relating the failure rates of the elements present in the plant to the probability of downtime, such as Operability Probabilistic Analysis. This study evaluated the implications of changes in plant configuration, with the purpose of quantifying the economic advantages of dividing equipment into a larger or smaller number of parallel trains (parallelization); the general objective function for evaluating parallelization alternatives is established, and the basic concepts needed to carry out this methodology are presented. Finally, a case study is developed for the sulfuric acid decomposition section of a hydrogen production plant. (Author)
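
    The economic argument about parallelization hinges on how splitting capacity across independent trains changes the probability of deep output shortfalls (the mean plant factor is unchanged, but the distribution tightens). A minimal sketch, assuming identical, independent trains with an invented per-train availability:

```python
import math

def prob_output_at_least(n, a, fraction):
    """P(at least `fraction` of nameplate capacity on line) with n
    equal-size, independent parallel trains, each available with
    probability a."""
    k_min = math.ceil(fraction * n)
    return sum(math.comb(n, k) * a**k * (1 - a)**(n - k)
               for k in range(k_min, n + 1))

# Same nameplate capacity split into 1, 2 or 4 trains (a = 0.90):
for n in (1, 2, 4):
    print(f"{n} train(s): P(output >= 50%) = {prob_output_at_least(n, 0.90, 0.5):.4f}")
```

    The gain in supply reliability must then be weighed against the higher specific capital cost of several small trains, which is what the study's objective function captures.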

  7. A New Approximate Fracture Mechanics Analysis Methodology for Composites with a Crack or Hole

    Science.gov (United States)

    Tsai, H. C.; Arocho, A.

    1990-01-01

    A new approximate theory which links the inherent flaw concept with the theory of crack tip stress singularities at a bi-material interface was developed. Three assumptions were made: (1) an inherent flaw (i.e., damage zone) exists at the tip of the crack, (2) fracture of filamentary composites initiates at a crack lying in the matrix material at the matrix/filament interface, and (3) the laminate fails whenever the principal load-carrying laminae fail. This third assumption implies that, for a laminate consisting of 0 degree plies, matrix cracks perpendicular to the 0 degree filaments are the triggering mechanism for final failure. Based on this theory, a parameter bar K sub Q, which is similar to the stress intensity factor for isotropic materials but has a different dimension, was defined. Utilizing existing test data, it was found that bar K sub Q can be treated as a material constant. Based on this finding, a fracture mechanics analysis methodology was developed. The analytical results correlate well with test results. This new approximate theory can be applied to both brittle and metal matrix composite laminates with a crack or hole.
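
    The inherent-flaw construction described above can be emulated by fitting a damage-zone size and a free singularity order to residual-strength data. The sketch below is a generic illustration of that functional form, not the authors' formulation; the data points, the symbol names, and the use of a free exponent `lam` are assumptions (`lam = 0.5` recovers the isotropic stress-intensity form).

```python
import numpy as np
from scipy.optimize import curve_fit

def failure_stress(a, kq, a0, lam):
    """Residual strength of a notched laminate under an inherent-flaw
    assumption: an effective damage zone a0 is added to the physical
    crack half-length a, with singularity order lam left free."""
    return kq / (np.pi * (a + a0)) ** lam

# Hypothetical strength data: crack half-length (m) vs failure stress (MPa)
a = np.array([0.002, 0.004, 0.008, 0.016])
s = np.array([310.0, 255.0, 205.0, 160.0])

(kq, a0, lam), _ = curve_fit(failure_stress, a, s, p0=[30.0, 0.001, 0.5])
print(f"K_Q = {kq:.1f} MPa*m^{lam:.2f}, a0 = {a0 * 1e3:.2f} mm, lambda = {lam:.2f}")
```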

  8. Summer 2012 Testing and Analysis of the Chemical Mixture Methodology -- Part I

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, Clifford S.; Yu, Xiao-Ying; Coggin, Rebekah L.; Ponder, Lashaundra A.; Booth, Alexander E.; Petrocchi, Achille J.; Horn, Sarah M.; Yao, Juan

    2012-07-01

    This report presents the key findings made by the Chemical Mixture Methodology (CMM) project team during the first stage of their summer 2012 testing and analysis of the CMM. The study focused on answering the following questions:
    o What percentage of the chemicals in the CMM Rev 27 database is associated with each Health Code Number (HCN)? How does this result influence the relative importance of acute HCNs and chronic HCNs in the CMM data set?
    o What is the benefit of using the HCN-based approach? Which Modes of Action and Target Organ Effects tend to be important in determining the HCN-based Hazard Index (HI) for a chemical mixture? (The additive HI bookkeeping is sketched below.)
    o What are some of the potential issues associated with the current HCN-based approach? What are the opportunities for improving the performance and/or technical defensibility of the HCN-based approach? How would those improvements increase the benefit of using the HCN-based approach?
    o What is the Target Organ System Effect approach, and how can it be used to improve upon the current HCN-based approach? How do the benefits users would derive from the Target Organ System approach compare to the benefits available from the current HCN-based approach?
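
    In outline, an HCN-based Hazard Index is an additive concentration-to-limit sum grouped by Health Code Number. The sketch below illustrates that bookkeeping only; the chemicals, limits, and HCN labels are invented, and the real CMM database and workbook logic are not reproduced here.

```python
from collections import defaultdict

def hcn_hazard_indices(releases, hcn_limits):
    """Sum concentration/limit ratios within each Health Code Number.

    releases   : {chemical: concentration at the receptor (mg/m3)}
    hcn_limits : {chemical: {health_code_number: limit (mg/m3)}}
    """
    hi = defaultdict(float)
    for chem, conc in releases.items():
        for hcn, limit in hcn_limits.get(chem, {}).items():
            hi[hcn] += conc / limit
    return dict(hi)

# Hypothetical two-chemical mixture (all numbers illustrative only):
releases = {"ammonia": 5.0, "chlorine": 0.3}
hcn_limits = {
    "ammonia":  {"7.2 respiratory irritation": 17.0, "5.1 CNS effects": 35.0},
    "chlorine": {"7.2 respiratory irritation": 0.87},
}
his = hcn_hazard_indices(releases, hcn_limits)
print(his, "-> controlling HCN:", max(his, key=his.get))
```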

  9. The Component Packaging Problem: A Vehicle for the Development of Multidisciplinary Design and Analysis Methodologies

    Science.gov (United States)

    Fadel, Georges; Bridgewood, Michael; Figliola, Richard; Greenstein, Joel; Kostreva, Michael; Nowaczyk, Ronald; Stevenson, Steve

    1999-01-01

    This report summarizes academic research which has resulted in an increased appreciation for multidisciplinary efforts among our students, colleagues and administrators. It has also generated a number of research ideas that emerged from the interaction between disciplines. Overall, 17 undergraduate students and 16 graduate students benefited directly from the NASA grant; an additional 11 graduate students were impacted and participated without financial support from NASA. The work resulted in 16 theses (with 7 to be completed in the near future) and 67 papers or reports, mostly published in 8 journals and/or presented at various conferences (a total of 83 papers, presentations and reports published based on NASA-inspired or -supported work). In addition, the faculty and students presented related work at many meetings, and continuing work has been proposed to NSF, the Army, industry and other state and federal institutions to continue efforts in the direction of multidisciplinary, and recently multi-objective, design and analysis. The specific problem addressed is component packing, which was solved as a multi-objective problem using iterative genetic algorithms and decomposition. Further testing and refinement of the methodology developed is presently under investigation. Teaming issues research and classes resulted in the publication of a web site (http://design.eng.clemson.edu/psych4991), which provides pointers and techniques to interested parties. Specific advantages of using iterative genetic algorithms, hurdles faced and resolved, and institutional difficulties associated with multi-discipline teaming are described in some detail.

  10. Introduction on performance analysis and profiling methodologies for KVM on ARM virtualization

    Science.gov (United States)

    Motakis, Antonios; Spyridakis, Alexander; Raho, Daniel

    2013-05-01

    The introduction of hardware virtualization extensions on ARM Cortex-A15 processors has enabled the implementation of full virtualization solutions for this architecture, such as KVM on ARM. This trend motivates the need to quantify and understand the performance impact that arises from applying this technology. In this work we examine some key performance metrics of KVM on ARM processors, which can provide useful insight and may lead to potential improvements in the future. This includes measurements such as interrupt latency and guest exit cost, performed on ARM Versatile Express and Samsung Exynos 5250 hardware platforms. Furthermore, we discuss additional methodologies that could provide a deeper understanding of the performance footprint of KVM in the future. We identify some of the most interesting approaches in this field and perform a tentative analysis of how they might be implemented in the KVM on ARM port. These take into consideration hardware- and software-based counters for profiling, as well as issues related to the limitations of the simulators that are often used, such as the ARM Fast Models platform.
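
    Guest exit counts, one ingredient of the exit-cost measurements mentioned above, can be collected on a KVM host via tracepoint counters. A hedged sketch wrapping the Linux `perf` tool follows; it assumes root privileges and a kernel exposing the kvm:kvm_exit tracepoint, and the exact event names can differ across architectures and kernel versions. This is not the paper's instrumentation, only one plausible way to gather such a metric.

```python
import subprocess

def count_kvm_exits(workload_cmd):
    """Run a workload under `perf stat`, counting guest exits
    system-wide via the kvm:kvm_exit tracepoint."""
    cmd = ["perf", "stat", "-a", "-e", "kvm:kvm_exit"] + workload_cmd
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stderr        # perf prints its counter summary to stderr

# e.g. count exits while a guest benchmark runs for ten seconds:
print(count_kvm_exits(["sleep", "10"]))
```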

  11. RADON AND PROGENY ALPHA-PARTICLE ENERGY ANALYSIS USING NUCLEAR TRACK METHODOLOGY

    International Nuclear Information System (INIS)

    A preliminary procedure for alpha energy analysis of radon and progeny using Nuclear Track Methodology (NTM) is described in this paper. The method is based on the relationship between alpha-particle energies deposited in polycarbonate material (CR-39) and the track size developed after a well-established chemical etching process. Track geometry, defined by parameters such as major or minor diameters, track area and overall track length, is shown to correlate with alpha-particle energy over the range 6.00 MeV (218Po) to 7.69 MeV (214Po). Track features are measured and the data analyzed automatically using a digital imaging system and commercial PC software. Examination of particle track diameters in CR-39 exposed to environmental radon reveals a multi-modal distribution. Locations of the maxima in this distribution are highly correlated with alpha particle energies of radon daughters, and the distributions are sufficiently resolved to identify the radioisotopes. This method can be useful for estimating the radiation dose from indoor exposure to radon and its progeny.
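
    The multi-modal diameter distribution described above can be resolved into isotope peaks by fitting a mixture of Gaussians to the track-diameter histogram. The sketch below uses synthetic data; the mode positions and widths are invented and do not represent a calibrated CR-39 response.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(d, a1, m1, s1, a2, m2, s2):
    """Bimodal model for the track-diameter distribution: one mode per
    alpha energy (e.g., 218Po at 6.00 MeV and 214Po at 7.69 MeV)."""
    return (a1 * np.exp(-0.5 * ((d - m1) / s1) ** 2)
            + a2 * np.exp(-0.5 * ((d - m2) / s2) ** 2))

# Hypothetical etched-track diameters in CR-39 (microns):
rng = np.random.default_rng(1)
diam = np.concatenate([rng.normal(8.2, 0.5, 600),    # one isotope's mode
                       rng.normal(6.9, 0.4, 400)])   # the other's
counts, edges = np.histogram(diam, bins=60)
centers = 0.5 * (edges[:-1] + edges[1:])

p0 = [counts.max(), 8.2, 0.5, counts.max() / 2, 6.9, 0.4]
popt, _ = curve_fit(two_gaussians, centers, counts.astype(float), p0=p0)
print(f"modes at {popt[1]:.2f} and {popt[4]:.2f} microns")
```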

  12. Radon and progeny alpha-particle energy analysis using nuclear track methodology

    International Nuclear Information System (INIS)

    A preliminary procedure for alpha-energy analysis of radon and its progeny using nuclear track methodology (NTM) is described in this paper. The method is based on the relationship between alpha-particle energies deposited in polycarbonate material (CR-39) and the track size developed after a well-established chemical etching process. Track geometry, defined by parameters such as major or minor diameters, track area and overall track length, is shown to correlate with alpha-particle energy over the range 6.00 MeV (218Po) to 7.69 MeV (214Po). Track features are measured and the data analyzed automatically using a digital imaging system and commercial PC software. Examination of particle track diameters in CR-39 exposed to environmental radon reveals a multi-modal distribution. Locations of the maxima in this distribution are highly correlated with alpha-particle energies of radon daughters, and the distributions are sufficiently resolved to identify the radioisotopes. This method can be useful for estimating the radiation dose from indoor exposure to radon and its progeny. (author)

  13. Modeling and Analysis of MRR, EWR and Surface Roughness in EDM Milling through Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    A. K.M.S. Iqbal

    2010-01-01

    Full Text Available Problem statement: Electrical Discharge Machining (EDM has grown over the last few decades from a novelty to a mainstream manufacturing process. Though, EDM process is very demanding but the mechanism of the process is complex and far from completely understood. It is difficult to establish a model that can accurately predict the performance by correlating the process parameters. The optimum processing parameters are essential to increase the production rate and decrease the machining time, since the materials, which are processed by EDM and even the process is very costly. This research establishes empirical relations regarding machining parameters and the responses in analyzing the machinability of the stainless steel. Approach: The machining factors used are voltage, rotational speed of electrode and feed rate over the responses MRR, EWR and Ra. Response surface methodology was used to investigate the relationships and parametric interactions between the three controllable variables on the MRR, EWR and Ra. Central composite experimental design was used to estimate the model coefficients of the three factors. The responses were modeled using a response surface model based on experimental results. The significant coefficients were obtained by performing Analysis Of Variance (ANOVA at 95% level of significance. Results: The variation in percentage errors for developed models was found within 5%. Conclusion: The developed models show that voltage and rotary motion of electrode are the most significant machining parameters influencing MRR, EWR and Ra. These models can be used to get the desired responses within the experimental range.
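
    The modeling step (a full quadratic response surface checked by ANOVA at 95% significance) can be sketched with statsmodels. The data below are synthetic stand-ins, the factor ranges are invented, and the formula simply shows the shape of a three-factor quadratic model with two-way interactions; it is not the paper's fitted model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: voltage V, electrode speed N, feed rate F,
# and a measured response MRR (units and ranges are invented).
rng = np.random.default_rng(2)
n = 20
df = pd.DataFrame({"V": rng.uniform(40, 80, n),
                   "N": rng.uniform(100, 300, n),
                   "F": rng.uniform(2, 10, n)})
df["MRR"] = (0.05 * df.V + 0.01 * df.N + 0.002 * df.V * df.N / 100
             - 0.0004 * df.V**2 + rng.normal(0, 0.3, n))

# Full quadratic response-surface model with two-way interactions:
model = smf.ols("MRR ~ V + N + F + I(V**2) + I(N**2) + I(F**2)"
                " + V:N + V:F + N:F", data=df).fit()
print(model.params)
print(model.pvalues[model.pvalues < 0.05])   # terms significant at 95%
```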

  14. Underground Test Area Subproject Phase I Data Analysis Task. Volume II - Potentiometric Data Document Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume II of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the potentiometric data. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  15. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  16. Underground Test Area Subproject Phase I Data Analysis Task. Volume VI - Groundwater Flow Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-11-01

    Volume VI of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the groundwater flow model data. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  17. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  18. Validation of On-Orbit Methodology for the Assessment of Cardiac Function and Changes in the Circulating Volume Using "Braslet-M" Occlusion Cuffs

    Science.gov (United States)

    Hamilton, D. R.; Sargsyan, A. E.; Garcia, K. M.; Ebert, D.; Feiveson, A. H.; Alferova, I. V.; Dulchavsky, S. A.; Matveev, V. P.; Bogomolov, V. V.; Duncan, J. M.

    2011-01-01

    BACKGROUND: The transition to microgravity eliminates the hydrostatic gradients in the vascular system. The resulting fluid redistribution commonly manifests as facial edema, engorgement of the external neck veins, nasal congestion, and headache. This experiment examined the responses to modified Valsalva and Mueller maneuvers as measured by cardiac and vascular ultrasound in a baseline microgravity steady state, and under the influence of thigh occlusion cuffs (Braslet cuffs). METHODS: Nine International Space Station crewmember subjects (Expeditions 16 - 20) were examined in 15 experiment sessions 101 ± 46 days after launch (mean ± SD; range 33 - 185). Twenty-seven cardiac and vascular parameters were obtained under three respiratory conditions (baseline, Valsalva, and Mueller) before and after tightening of the Braslet cuffs, for a total of 162 data points per session. The quality of cardiac and vascular ultrasound examinations was assured through remote monitoring and guidance by investigators at the NASA Telescience Center in Houston, TX, USA. RESULTS: Fourteen of the 81 measured conditions were significantly different with Braslet application and were apparently related to cardiac preload reduction or an increase in the venous volume sequestered in the lower extremity. These changes involved 10 of the 27 parameters measured. In secondary analysis, 7 of the 27 parameters were found to respond differently to respiratory maneuvers depending on the presence or absence of thigh compression, with a total of 11 differences. CONCLUSIONS: Acute application of Braslet occlusion cuffs causes lower extremity fluid sequestration and exerts proportionate, measurable effects on cardiac performance in microgravity. Ultrasound techniques measuring the hemodynamic effects of thigh cuffs in combination with respiratory maneuvers may serve as an effective tool in determining the volume status of a cardiac or hemodynamically compromised patient in microgravity.

  19. Planning manual for energy resource development on Indian lands. Volume I. Benefit--cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-01

    Section II follows a brief introduction and is entitled "Benefit-Cost Analysis Framework." The analytical framework deals with two major steps involved in assessing the pros and cons of energy resource development (or any other type of development). The first is to identify and describe the overall tribal resource planning and decision process. The second is to develop a detailed methodological approach to the assessment of the benefits and costs of energy development alternatives within the context of the tribe's overall planning process. Sections III, IV, and V present the application of the benefit-cost analysis methodology to coal; oil and gas; and uranium, oil shale, and geothermal development, respectively. The methodology creates hypothetical examples that illustrate realistic development opportunities for the majority of tribes that have significant reserves of one or more of the resources that may be economic to develop.
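
    At the core of such a framework is the discounting arithmetic behind a benefit-cost ratio. A minimal sketch, with all cash flows and the discount rate invented for illustration and no claim to match the manual's worked examples:

```python
def npv(flows, rate):
    """Present value of a list of annual flows; year 0 is undiscounted."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Hypothetical coal-lease alternative (all figures illustrative):
years = 20
benefits = [0.0] + [12.0e6] * years     # royalties, wages, taxes per year
costs    = [40.0e6] + [3.0e6] * years   # up-front development plus O&M
rate = 0.07                             # assumed tribal discount rate

bc_ratio = npv(benefits, rate) / npv(costs, rate)
print(f"B/C ratio at {rate:.0%}: {bc_ratio:.2f}")  # > 1 favors development
```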

  20. Elaboration of the methodological referential for life cycle analysis of first generation biofuels in the French context

    International Nuclear Information System (INIS)

    This study was made in the particular context of strong growth in the biofuels market and the commitment of French and European public authorities, and of certain Member States (Germany, the Netherlands, the UK), to the development of certification schemes for first-generation biofuels. The elaboration of such schemes requires a consensus on the methodology to apply when producing Life Cycle Analyses (LCA) of biofuels. To answer this demand, the study built up the methodological referential for biofuel LCAs in order to assess the greenhouse gas (GHG) emissions, fossil fuel consumption and local atmospheric pollutant emissions induced by the different biofuel production pathways. The work consisted of methodological engineering and was accomplished thanks to the participation of all the members of the study's Technical Committee. An initial bibliographic review of biofuel LCAs allowed the identification of the main methodological issues:
    - Consideration of the environmental burdens associated with buildings, equipment and their maintenance
    - Quantification of nitrous oxide (N2O) emissions from fields
    - Impact of Land Use Change (LUC)
    - Allocation method for distributing the environmental impacts of biofuel production pathways between the different products and coproducts generated (illustrated in the sketch below)
    For each point, the impact of the methodological choices on the biofuels' environmental balances was assessed by several sensitivity analyses, and the results of these analyses were taken into account in elaborating the recommendations. Within the framework of this study, no distinction was made in terms of methodological approach between GHG emissions and local pollutant emissions; the methodological issues cover all the environmental burdens and do not require specific approaches. This executive summary presents the methodological aspects related to biofuel LCAs. The complete report of the study presents in addition
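
    Of the issues listed, the allocation question is the most mechanical: the same burden can be split between a biofuel and its coproducts by energy content, mass, or economic value, with materially different results. A minimal energy-content sketch, with hypothetical masses and heating values that are not taken from the study:

```python
def allocate_by_energy(total_ghg, outputs):
    """Split a process step's GHG burden between products in proportion
    to energy content (one of several allocation rules debated for
    first-generation biofuel LCAs; mass or economic value are others).

    total_ghg : kg CO2-eq emitted by the shared process step
    outputs   : {product: (mass in kg, lower heating value in MJ/kg)}
    """
    energy = {p: m * lhv for p, (m, lhv) in outputs.items()}
    total = sum(energy.values())
    return {p: total_ghg * e / total for p, e in energy.items()}

# Hypothetical rapeseed crushing step: oil for biodiesel plus meal
shares = allocate_by_energy(
    1000.0,  # kg CO2-eq for the step
    {"rapeseed oil": (400.0, 37.0), "rapeseed meal": (560.0, 18.0)})
print(shares)
```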

  1. Methodology for conceptual remote sensing spacecraft technology: insertion analysis balancing performance, cost, and risk

    Science.gov (United States)

    Bearden, David A.; Duclos, Donald P.; Barrera, Mark J.; Mosher, Todd J.; Lao, Norman Y.

    1997-12-01

    Emerging technologies and micro-instrumentation are changing the way remote sensing spacecraft missions are developed and implemented. Government agencies responsible for procuring space systems are increasingly requesting analyses to estimate the cost, performance and design impacts of advanced technology insertion for both state-of-the-art systems and systems to be built 5 to 10 years in the future. Numerous spacecraft technology development programs are being sponsored by Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) agencies with the goal of enhancing spacecraft performance, reducing mass, and reducing cost. However, it is often the case that technology studies, in the interest of maximizing subsystem-level performance and/or mass reduction, do not anticipate synergistic system-level effects. Furthermore, even though technical risks are often identified as one of the largest cost drivers for space systems, many cost/design processes and models ignore the effects of cost risk in the interest of quick estimates. To address these issues, the Aerospace Corporation developed a concept analysis methodology and associated software tools. These tools, collectively referred to as the concept analysis and design evaluation toolkit (CADET), facilitate system architecture studies and space system conceptual designs focusing on design heritage, technology selection, and associated effects on cost, risk and performance at the system and subsystem level. CADET allows: (1) quick response to technical design and cost questions; (2) assessment of the cost and performance impacts of existing and new designs/technologies; and (3) estimation of cost uncertainties and risks. These capabilities aid mission designers in determining the configuration of remote sensing missions that meet essential requirements in a cost-effective manner. This paper discusses the development of CADET modules and their application to several remote sensing satellite
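
    CADET's internals are not spelled out here, but capability (3) is commonly implemented as a Monte Carlo roll-up of subsystem cost distributions. The sketch below shows that generic pattern only; the subsystems, triangular spreads, and threshold are invented and are not CADET's model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical subsystem cost models: (low, mode, high) in $M, the kind
# of spread a heritage/technology-maturity rating might assign.
subsystems = {
    "payload":     (18.0, 24.0, 40.0),   # new technology: wide spread
    "bus":         (10.0, 12.0, 16.0),   # high heritage: narrow spread
    "integration": (4.0,  5.0,  8.0),
}

n = 100_000
total = sum(rng.triangular(lo, mode, hi, n)
            for lo, mode, hi in subsystems.values())

print(f"mean cost      : {total.mean():.1f} $M")
print(f"70th percentile: {np.percentile(total, 70):.1f} $M")
print(f"P(cost > 45 $M): {(total > 45).mean():.2%}")
```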

  2. Wind power projects in the CDM: Methodologies and tools for baselines, carbon financing and sustainability analysis

    International Nuclear Information System (INIS)

    The report is intended to be a guidance document for project developers, investors, lenders, and CDM host countries involved in wind power projects in the CDM. The report explores in particular those issues that are important in CDM project assessment and development, that is, baseline development, carbon financing, and environmental sustainability. It does not deal in detail with those issues that are routinely covered in a standard wind power project assessment. The report tests, compares, and recommends methodologies for and approaches to baseline development. To present the application and implications of the various methodologies and approaches in a concrete context, Africa's largest wind farm, namely the 60 MW wind farm located in Zafarana, Egypt, is examined as a hypothetical CDM wind power project. The report shows that for the present case example there is a difference of about 25% between the lowest (0.5496 tCO2/MWh) and the highest emission rate (0.6868 tCO2/MWh) estimated in accordance with the three standardized approaches to baseline development under the Marrakesh Accords. This difference in emission factors comes about partly as a result of including hydroelectric power in the baseline scenario. Hydroelectric resources constitute around 21% of the generation capacity in Egypt, and, if hydropower is excluded, the difference between the lowest and the highest baseline is reduced to 18%. Furthermore, since the two variations of the 'historical' baseline option examined result in the highest and the lowest baselines, disregarding this baseline option altogether reduces the difference between the lowest and the highest to 16%. The ES3 model, which the Systems Analysis Department at Risoe National Laboratory has developed, makes it possible for this report to explore the project-specific approach to baseline development in some detail. Based on quite disaggregated data on the Egyptian electricity system, including the wind power production
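
    A standardized baseline reduces, in the simplest case, to an emission factor multiplied by the project's zero-emission generation. The sketch below shows a combined-margin calculation; the 50/50 weighting is a common default rather than the report's choice, and the report's low and high rates are used purely as placeholder inputs, not as its actual operating- and build-margin values.

```python
def combined_margin(om, bm, w_om=0.5):
    """Weighted combined-margin grid emission factor (tCO2/MWh):
    a mix of the operating margin (generation the project displaces
    on the margin) and the build margin (recent capacity additions)."""
    return w_om * om + (1 - w_om) * bm

# Placeholder inputs: the report's high/low rates stand in for OM/BM.
ef = combined_margin(om=0.6868, bm=0.5496)

annual_mwh = 180_000   # hypothetical: 60 MW at ~3000 full-load hours/yr
print(f"baseline EF {ef:.4f} tCO2/MWh "
      f"-> {annual_mwh * ef:,.0f} tCO2/yr avoided")
```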

  3. Energy intensive industry for Alaska. Volume II: case analysis.

    Energy Technology Data Exchange (ETDEWEB)

    1978-09-01

    A case analysis of the attractiveness of the primary aluminium metal industry in Alaska is presented. Part 1 provides a discussion of the economics of the industry and describes a conceptual Alaskan smelter, including its physical nature, employment and tax consequences, and environmental attributes. Part 2 discusses the social and economic impacts. Part 3 discusses the state management options for involvement with the industry. (MCW)

  4. Space tug economic analysis study. Volume 1: Executive summary

    Science.gov (United States)

    1972-01-01

    An economic analysis of space tug operations is presented. The space tug is defined as any liquid propulsion stage under 100,000 pounds propellant loading that is flown from the space shuttle cargo bay. Two classes of vehicles are the orbit injection stages and reusable space tugs. The vehicle configurations, propellant combinations, and operating modes used for the study are reported. The summary contains data on the study approach, results, conclusions, and recommendations.

  5. Distributed Anomaly Detection using Minimum Volume Elliptical Principal Component Analysis

    OpenAIRE

    O'Reilly, CE; Gluhak, A.; Imran, A.

    2016-01-01

    Principal component analysis combined with the residual error is an effective anomaly detection technique. In an environment where anomalies are present in the training set, the derived principal components can be skewed by the anomalies. A further aspect of anomaly detection is that data might be distributed across different nodes in a network, and their communication to a centralized processing unit may be prohibited due to communication cost. Current solutions to distributed anomaly detection rely on a h...
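
    The residual-error detector the abstract builds on can be stated compactly: project onto the top principal components, reconstruct, and flag samples whose squared reconstruction error (the SPE/Q statistic) exceeds a training-set quantile. A plain-PCA sketch follows; it does not implement the paper's minimum-volume elliptical variant or its distributed protocol, and all data are synthetic.

```python
import numpy as np

def fit_pca(x, k):
    """Principal subspace of centered data via SVD."""
    mu = x.mean(axis=0)
    _, _, vt = np.linalg.svd(x - mu, full_matrices=False)
    return mu, vt[:k]                       # mean and top-k components

def residual_error(x, mu, components):
    """Squared reconstruction error (SPE/Q statistic) per sample."""
    z = (x - mu) @ components.T             # project onto subspace
    recon = z @ components + mu
    return ((x - recon) ** 2).sum(axis=1)

rng = np.random.default_rng(4)
train = rng.normal(size=(500, 10)) @ rng.normal(size=(10, 10))  # correlated
test = np.vstack([train[:5], rng.normal(0, 6, size=(5, 10))])   # 5 anomalies

mu, comps = fit_pca(train, k=3)
threshold = np.percentile(residual_error(train, mu, comps), 99)
print((residual_error(test, mu, comps) > threshold).astype(int))
```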

  6. Field screening sampling and analysis strategy and methodology for the 183-H Solar Evaporation Basins: Phase 2, Soils

    International Nuclear Information System (INIS)

    This document provides a sampling/analytical strategy and methodology for Resource Conservation and Recovery Act (RCRA) closure of the 183-H Solar Evaporation Basins within the boundaries and requirements identified in the initial Phase II Sampling and Analysis Plan for RCRA Closure of the 183-H Solar Evaporation Basins

  7. Methodological Synthesis in Quantitative L2 Research: A Review of Reviews and a Case Study of Exploratory Factor Analysis

    Science.gov (United States)

    Plonsky, Luke; Gonulal, Talip

    2015-01-01

    Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…

  8. A Methodological Reflection on the Process of Narrative Analysis: Alienation and Identity in the Life Histories of English Language Teachers

    Science.gov (United States)

    Menard-Warwick, Julia

    2011-01-01

    This article uses data from life-history interviews with English language teachers in Chile and California to illustrate methodological processes in teacher identity research through narrative analysis. To this end, the author describes the steps she took in identifying an issue to be examined, selecting particular narratives as representative of…

  9. The Analysis of Polish Economy’s Transformation to Knowledge Based Economy on the Basis of Knowledge Assessment Methodology

    OpenAIRE

    Sokołowska-Woźniak, Justyna

    2015-01-01

    The main aim of this paper is to analyze the transformation of Poland to a knowledge based economy on the basis of the World Bank's Knowledge Assessment Methodology. The analysis of change is used to compare the performance (strengths and weaknesses) of the Polish economy in 1994, 2004 and 2014 with regard to the main aspects of the knowledge based economy.
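
    KAM's headline indexes are built from rank-normalized indicators scaled to 0-10 and averaged into pillar indexes. The sketch below mimics that normalization in spirit; the indicator, the country values, and the exact scaling are illustrative assumptions rather than the World Bank's published formula.

```python
import numpy as np

def kam_normalize(values):
    """Rank-based 0-10 normalization in the spirit of the World Bank's
    Knowledge Assessment Methodology: a country's score reflects how
    many other countries it outranks on the indicator."""
    values = np.asarray(values, dtype=float)
    ranks = values.argsort().argsort()      # 0 = worst, n-1 = best
    return 10.0 * ranks / (len(values) - 1)

# Hypothetical indicator (e.g., researchers per million) for 5 countries:
raw = {"PL": 1600, "DE": 4300, "CZ": 3000, "HU": 2700, "SK": 2400}
scores = dict(zip(raw, kam_normalize(list(raw.values()))))
print(scores)   # a pillar index would average several such indicators
```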

  10. Space tug economic analysis study. Volume 2: Tug concepts analysis. Part 2: Economic analysis

    Science.gov (United States)

    1972-01-01

    An economic analysis of space tug operations is presented. The subjects discussed are: (1) cost uncertainties, (2) scenario analysis, (3) economic sensitivities, (4) mixed integer programming formulation of the space tug problem, and (5) critical parameters in the evaluation of a public expenditure.
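
    Topic (4), the mixed integer programming formulation, can be illustrated with a toy fleet-selection model: integer counts of each vehicle must cover mission demand at minimum cost. The vehicle classes, coverage numbers, and costs below are invented and the formulation is far simpler than the study's; it only shows the MIP shape (requires SciPy >= 1.9 for `milp`).

```python
import numpy as np
from scipy.optimize import LinearConstraint, milp

# Columns: [OIS-small, OIS-large, reusable tug]; unit costs in $M.
cost = np.array([30.0, 55.0, 90.0])

# coverage[m, v] = missions of class m one unit of vehicle v can fly/yr
coverage = np.array([[4, 2, 6],      # low-orbit deliveries
                     [0, 3, 4],      # geosynchronous deliveries
                     [0, 0, 2]])     # retrieval missions (reusable only)
demand = np.array([12, 9, 4])        # required missions per year

res = milp(c=cost,
           constraints=LinearConstraint(coverage, lb=demand, ub=np.inf),
           integrality=np.ones(3))   # all decision variables integer
print(res.x, f"-> total cost {res.fun:.0f} $M")
```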

  11. Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, Douglas B.; Anderson, Dave M.; Belzer, David B.; Cort, Katherine A.; Dirks, James A.; Hostick, Donna J.

    2004-06-18

    The requirements of the Government Performance and Results Act (GPRA) of 1993 mandate the reporting of outcomes expected to result from programs of the Federal government. The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official metrics for its 11 major programs using its Office of Planning, Budget Formulation, and Analysis (OPBFA). OPBFA conducts an annual integrated modeling analysis to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. Two of EERE’s major programs include the Building Technologies Program (BT) and Office of Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports the OPBFA effort by developing the program characterizations and other market information affecting these programs that is necessary to provide input to the EERE integrated modeling analysis. Throughout the report we refer to these programs as “buildings-related” programs, because the approach is not limited in application to BT or WIP. To adequately support OPBFA in the development of official GPRA metrics, PNNL communicates with the various activities and projects in BT and WIP to determine how best to characterize their activities planned for the upcoming budget request. PNNL then analyzes these projects to determine what the results of the characterizations would imply for energy markets, technology markets, and consumer behavior. This is accomplished by developing nonintegrated estimates of energy, environmental, and financial benefits (i.e., outcomes) of the technologies and practices expected to result from the budget request. These characterizations and nonintegrated modeling results are provided to OPBFA as inputs to the official benefits estimates developed for the Federal Budget. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits

  12. Problems of method of technology assessment. A methodological analysis; Methodenprobleme des Technology Assessment; Eine methodologische Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, V.

    1993-03-01

    The study undertakes to analyse the theoretical and methodological structure of Technology Assessment (TA). It is based on a survey of TA studies, which provided an important precondition for theoretically sound statements on methodological aspects of TA. It was established that the main basic theoretical problems of TA lie in the field of dealing with complexity. This is also apparent in the constitution of problems, the most elementary and central task of TA. A scientifically founded constitution of problems and the corresponding construction of models call for interdisciplinary scientific work. Interdisciplinarity in the TA research process is achieved at the level of virtual networks, these networks being composed of individuals suited to teamwork. The emerging network structures have an objective-organizational and an ideational basis. The objective-organizational basis is mainly the result of team composition and the external affiliations of the team members. The ideational basis of the virtual network is represented by the team members' mode of thinking, which is individually located at a multidisciplinary level. The theoretical 'skeleton' of the TA knowledge system, which is represented by linkage structures based on process knowledge, can be generated and processed in connection with knowledge of problem types, areas of analysis and procedures for dealing with complexity. Within this process, disciplinary knowledge is a necessary but not a sufficient condition. Metatheoretical and metadisciplinary knowledge, together with the correspondingly processed complexity of models, form the basis for the necessary methodological awareness that allows TA to become designable as a research procedure. (orig./HP)

  13. Evaluation of potential severe accidents during Low Power and Shutdown Operations at Grand Gulf, Unit 1. Volume 2, Part 1B: Analysis of core damage frequency from internal events for Plant Operational State 5 during a refueling outage, Main report (Section 10)

    International Nuclear Information System (INIS)

    This document contains the accident sequence analysis of internally initiated events for Grand Gulf, Unit 1 as it operates in the Low Power and Shutdown Plant Operational State 5 during a refueling outage. The report documents the methodology used during the analysis, describes the results from the application of the methodology, and compares the results with those from two full-power analyses performed on Grand Gulf. This document, Volume 2, Part 1B, presents Section 10 of the report, the Human Reliability Analysis.

  14. 'Rosatom' sites vulnerability analysis and assessment of their physical protection effectiveness. Methodology and 'tools'

    International Nuclear Information System (INIS)

    Full text: Enhancement of physical protection (PP) efficiency at nuclear sites (NS) of State Corporation (SC) 'Rosatom' is one of its priorities. This issue is reflected in a series of international and Russian documents. PP enhancement at the sites can be achieved through upgrades of both administrative procedures and the technical security system. In any case, however, it is first necessary to identify the so-called 'objects of physical protection', that is, to answer the question of what we need to protect, and to identify design basis threats (DBT) and adversary models. Answers to these questions constitute the content of vulnerability analysis (VA) work for nuclear sites. Further, it is necessary to answer the question of how well we protect these 'objects of physical protection' and the site as a whole; this is the essence of assessing physical protection effectiveness. In the process of effectiveness assessment at specific Rosatom sites, we assess the effectiveness of the existing physical protection system (PPS) and of the proposed options for its upgrade. Besides, it becomes possible to select the optimal option based on a 'cost-efficiency' criterion. Carrying out this work is a mandatory requirement defined in federal-level documents. State Corporation 'Rosatom' has in place methodologies for vulnerability analysis and effectiveness assessment, as well as 'tools' (methods, regulations, computer software) that make it possible to put the above work into practice. Corresponding regulations have been developed and approved by Rosatom senior management. Special software for PPS effectiveness assessment called 'Vega-2', developed by a Rosatom specialized subsidiary, State Enterprise 'Eleron', is designed to assess PPS effectiveness at fixed nuclear sites. It has been implemented at practically all major Rosatom nuclear sites. As of now, the 'Vega-2' software has been certified and prepared for forwarding to the corporation's nuclear sites so
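
    The internals of 'Vega-2' are not public, so as a stand-in the sketch below uses an EASI-style interruption calculation from the open Sandia literature: walk the adversary path and, at each element, accumulate the probability of first detection times the probability that the response force arrives within the remaining delay. All inputs are hypothetical, and the timing model is a simplification.

```python
from math import erf, sqrt

def p_interruption(p_detect, delay_after, response_time, sigma=0.2):
    """EASI-style estimate of the probability that the response force
    interrupts an adversary path (not the 'Vega-2' algorithm).

    p_detect      : detection probability at each path element
    delay_after   : adversary task time remaining after each element (s)
    response_time : mean guard-force response time (s)
    sigma         : assumed relative spread of the timing uncertainty
    """
    def p_response(t_remaining):
        # P(response completes before the adversary finishes), normal model
        s = sigma * sqrt(t_remaining**2 + response_time**2)
        return 0.5 * (1 + erf((t_remaining - response_time) / (s * sqrt(2))))

    p_total, p_no_detect_yet = 0.0, 1.0
    for pd, t_rem in zip(p_detect, delay_after):
        p_total += p_no_detect_yet * pd * p_response(t_rem)
        p_no_detect_yet *= 1 - pd
    return p_total

# Hypothetical 3-element path: fence, door, vault (timings in seconds)
print(f"P(interruption) = "
      f"{p_interruption([0.5, 0.7, 0.9], [300, 120, 60], 150):.2f}")
```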

  15. Waste Isolation Pilot Plant Safety Analysis Report. Volume 1

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  16. Waste Isolation Pilot Plant Safety Analysis Report. Volume 4

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  17. Waste Isolation Pilot Plant Safety Analysis Report. Volume 5

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  18. Waste Isolation Pilot Plant Safety Analysis Report. Volume 2

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  19. Waste Isolation Pilot Plant Safety Analysis Report. Volume 3

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  20. Methodology and Systems Analysis of Data Mining Model for Successful Implementation of Data Warehouse in Tertiary Institutions

    OpenAIRE

    U.F. Eze

    2014-01-01

    This research work, titled "Methodology and system analysis of data mining model for successful implementation of data warehouse in tertiary institutions", is a proposal that provides a framework used to structure, plan, and control the process involved in information discovery for tertiary institutions. It equally deals with the series of steps or procedures that govern the analysis and design of this particular Data Mining Model for Tertiary Institutions. The methods, techniques ...