WorldWideScience

Sample records for analysis methodology volume

  1. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  2. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  3. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume I. Data analysis methodology and hardware description

    International Nuclear Information System (INIS)

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings
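
    The abstract does not spell out how the two peak positions are used; one routine use of two known gamma lines is a linear energy calibration of the spectrum. The sketch below illustrates that step only, with hypothetical channel positions; it is not the analyzer's actual algorithm.

    def energy_calibration(ch_148, ch_208):
        """Linear energy calibration from the measured channel positions of
        the 148.6-keV (241Pu) and 208.0-keV (237U) peaks: E = gain*ch + offset."""
        gain = (208.0 - 148.6) / (ch_208 - ch_148)
        offset = 148.6 - gain * ch_148
        return gain, offset

    # Hypothetical peak channel positions.
    gain, offset = energy_calibration(ch_148=743.2, ch_208=1040.1)
    print(f"gain = {gain:.4f} keV/channel, offset = {offset:.2f} keV")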

  4. Transport of solid commodities via freight pipeline: demand analysis methodology. Volume IV. First year final report

    Energy Technology Data Exchange (ETDEWEB)

    Allen, W.B.; Plaut, T.

    1976-07-01

    In order to determine the feasibility of intercity freight pipelines, it was necessary to determine whether sufficient traffic flows currently exist between various origins and destinations to justify consideration of a mode whose operating characteristics become competitive under conditions of high-traffic volume. An intercity origin/destination freight-flow matrix was developed for a large range of commodities from published sources. A high-freight traffic-density corridor between Chicago and New York and another between St. Louis and New York were studied. These corridors, which represented 18 cities, had single-direction flows of 16 million tons/year. If trans-shipment were allowed at each of the 18 cities, flows of up to 38 million tons/year were found in each direction. These figures did not include mineral or agricultural products. After determining that such pipeline-eligible freight-traffic volumes existed, the next step was to determine the ability of freight pipeline to penetrate such markets. Modal-split models were run on aggregate data from the 1967 Census of Transportation. Modal-split models were also run on disaggregate data specially collected for this study. The freight pipeline service characteristics were then substituted into both the aggregate and disaggregate models (truck vs. pipeline and then rail vs. pipeline) and estimates of pipeline penetration into particular STCC commodity groups were made. Based on these very preliminary results, it appears that freight pipeline has market penetration potential that is consistent with high-volume participation in the intercity freight market.
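
    The abstract does not give the modal-split model's functional form; a common specification for such mode-choice work is a binary logit in cost and time. The sketch below illustrates that form only, with invented coefficients and service characteristics rather than the study's estimates.

    import math

    def pipeline_share(cost_truck, time_truck, cost_pipe, time_pipe,
                       b_cost=-0.015, b_time=-0.05):
        """Binary logit share of freight captured by pipeline vs. truck;
        utilities are linear in cost ($/ton) and transit time (hours)."""
        u_truck = b_cost * cost_truck + b_time * time_truck
        u_pipe = b_cost * cost_pipe + b_time * time_pipe
        return math.exp(u_pipe) / (math.exp(u_pipe) + math.exp(u_truck))

    # Example corridor where pipeline is cheaper but slower than truck.
    share = pipeline_share(cost_truck=25.0, time_truck=18.0,
                           cost_pipe=15.0, time_pipe=36.0)
    print(f"Estimated pipeline share: {share:.1%}")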

  5. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.L.

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. 29 refs., 15 tabs.

  6. Socioeconomic effects of the DOE Gas Centrifuge Enrichment Plant. Volume 1: methodology and analysis

    International Nuclear Information System (INIS)

    The socioeconomic effects of the Gas Centrifuge Enrichment Plant being built in Portsmouth, Ohio were studied. Chapters are devoted to labor force, housing, population changes, economic impact, method for analysis of services, analysis of service impacts, schools, and local government finance

  7. Methodology for generating waste volume estimates

    International Nuclear Information System (INIS)

    This document describes the methodology that will be used to calculate waste volume estimates for site characterization and remedial design/remedial action activities at each of the DOE Field Office, Oak Ridge (DOE-OR) facilities. This standardized methodology is designed to ensure consistency in waste estimating across the various sites and organizations that are involved in environmental restoration activities. The criteria and assumptions that are provided for generating these waste estimates will be implemented across all DOE-OR facilities and are subject to change based on comments received and actual waste volumes measured during future sampling and remediation activities. 7 figs., 8 tabs

  8. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
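
    As a rough illustration of the core combination the methodology performs (building protection factors weighted by where people are), the sketch below computes a population-average dose fraction for one posture. Shelter categories, protection factors and population shares are hypothetical placeholders, not LLNL data.

    def population_dose_fraction(population_shares, protection_factors):
        """Mean fraction of the outdoor gamma dose received by a population
        distributed over shelter categories; PF = outdoor dose / indoor dose."""
        assert abs(sum(population_shares.values()) - 1.0) < 1e-9
        return sum(share / protection_factors[kind]
                   for kind, share in population_shares.items())

    # Hypothetical daytime, minimally warned posture (illustrative values only).
    shares = {"outdoors": 0.05, "wood frame": 0.45, "masonry": 0.30, "large concrete": 0.20}
    pf = {"outdoors": 1.0, "wood frame": 3.0, "masonry": 10.0, "large concrete": 40.0}
    print(f"Population-average dose fraction: {population_dose_fraction(shares, pf):.3f}")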

  9. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 2: Methodology and Results

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-10-01

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  10. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 2: Methodology and Results

    International Nuclear Information System (INIS)

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3)

  11. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  12. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple methods to complex network-based simulations, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  13. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis under modern conditions is proved, and the notion of “situational analysis” is defined. We conclude that situational analysis is a continuous, systematic study whose purpose is to identify the signs of dangerous situations, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions to eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of its diagnostic, evaluative and searching functions is demonstrated. The principal methodological elements of situational analysis are grounded. Their substantiation will enable the analyst to develop adaptive methods that take into account the peculiar features of a unique object (a situation that has emerged in a complex system), to diagnose such a situation and subject it to systematic, in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  14. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  15. Normative price for a manufactured product: the SAMICS methodology. Volume II. Analysis. JPL publication 78-98. [Solar Array Manufacturing Industry Costing Standards

    Energy Technology Data Exchange (ETDEWEB)

    Chamberlain, R.G.

    1979-01-15

    The Solar Array Manufacturing Industry Costing Standards (SAMICS) provide standard formats, data, assumptions, and procedures for determining the price a hypothetical solar array manufacturer would have to be able to obtain in the market to realize a specified after-tax rate of return on equity for a specified level of production. This document presents the methodology and its theoretical background. It is contended that the model is sufficiently general to be used in any production-line manufacturing environment. Implementation of this methodology by the Solar Array Manufacturing Industry Simulation computer program (SAMIS III, Release 1) is discussed.

  16. Particulate matter test in small volume parenterals: critical aspects in sampling methodology.

    Science.gov (United States)

    Pavanetto, F; Conti, B; Genta, I; Ponci, R; Montanari, L; Grassi, M

    1989-06-01

    The following critical steps of the particulate matter test sampling methodology for small volume parenteral products (SVPs), conducted by the light blockage method, were considered: (1) reliability of the small volume aspirator sampler for different sample volumes; (2) particulate matter distribution inside each ampoule in liquid products (8 liquid SVPs tested); (3) influence of the sample preparation method on the evaluation of the final contamination of the sample. Nine liquid SVPs were tested by preparing samples following the three U.S.P. XXI methods: (I) unit as it is (direct analysis), (II) unit diluted, (III) sample obtained by combining several units. Particle counts were performed by a HIAC/ROYCO model 3000 counter fitted with a small volume sampler. The validation of the sampler shows that it should be improved; a more accurate and stricter validation than the one stated by U.S.P. XXI is suggested. The particulate matter distribution in liquid products is found to be uniform inside the ampoule over the size range from ≥2 microns to ≥10 microns; the analysis can therefore be performed by examining only a portion of the whole content. The three sample preparation methods lead to significantly different contamination results. The particulate control test should be conducted by direct analysis, as it is carried out under the same conditions as for product use. The combining method (III) is suggested for products of less than 2 ml volume that cannot be examined by direct analysis. PMID:2803449

  17. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report presents a preliminary safety analysis methodology for the 330 MWt SMART (System-integrated Modular Advanced ReacTor), which has been developed by the Korea Atomic Energy Research Institute (KAERI) with funding from the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, a further validated final safety analysis methodology will be developed. The current licensing safety analysis methodologies of the Westinghouse and KSNPP PWRs operating and under development in Korea, as well as the Russian licensing safety analysis methodology for integral reactors, have been reviewed and compared in developing the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against the licensing practices of the PWRs operating or the KNGR (Korean Next Generation Reactor) under construction in Korea. A detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary-to-secondary pipe break and the small break loss of coolant accident. The SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design evolves. The validated safety analysis methodology will be submitted to MOST as a Topical Report for review of the SMART licensing safety analysis methodology. It is thus recommended that the nuclear regulatory authority establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  18. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  19. Constructive Analysis : A Study in Epistemological Methodology

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    The present study is concerned with the viability of the primary method in contemporary philosophy, i.e., conceptual analysis. Starting out by tracing the roots of this methodology to Platonic philosophy, the study questions whether such a methodology makes sense when divorced from Platonic philosophy...

  20. Similar methodological analysis involving the user experience.

    Science.gov (United States)

    Almeida e Silva, Caio Márcio; Okimoto, Maria Lúcia R L; Tanure, Raffaela Leane Zenni

    2012-01-01

    This article deals with the use of a protocol for the analysis of similar methodologies related to user experience. Articles recounting experiments in the area were selected, analyzed on the basis of the similar-analysis protocol, and finally synthesized and associated.

  1. An Optimization Methodology (DEA Analysis)

    Directory of Open Access Journals (Sweden)

    Jibendu Kumar Mantri

    2007-08-01

    In the growing face of deforestation, conservation is the only way to save the forest and its precious wild animals from the human encounter. “Project Tiger” (1973) at Similipal is a welcome step in the direction of conservation of the tiger, whose population is on the verge of extinction. For the proper protection, preservation and propagation of the tiger and the forest in the Similipal Tiger Reserve (STR), funds have been allocated from time to time by the central government, the state government and various NGOs of national and international repute. The responsibility of managing the earmarked funds rests with the management of the STR. This paper observes the interrelationship of funds with the trend of the tiger population and other variables by using a suitable econometric model, and some standard results are explained. It also examines the level of efficiency of fund utilization over eight financial years with the help of Data Envelopment Analysis (DEA).
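
    For readers unfamiliar with DEA, the sketch below solves the standard input-oriented CCR efficiency linear program for each decision-making unit (here, a financial year), treating allocated funds as the single input and two conservation outcomes as outputs. The data and the input/output structure are invented for illustration and are not the paper's variables; the LP form itself is the textbook CCR envelopment model.

    import numpy as np
    from scipy.optimize import linprog

    def dea_efficiency(X, Y, k):
        """Input-oriented CCR efficiency of unit k.

        X: (n_units, n_inputs) inputs; Y: (n_units, n_outputs) outputs.
        Returns theta in (0, 1]; theta == 1 means unit k lies on the frontier.
        """
        n, m = X.shape
        s = Y.shape[1]
        c = np.zeros(n + 1)          # decision vector: [theta, lambda_1..lambda_n]
        c[0] = 1.0                   # minimize theta
        # Inputs:  sum_j lambda_j x_ij <= theta * x_ik
        A_in = np.hstack([-X[k].reshape(m, 1), X.T])
        # Outputs: sum_j lambda_j y_rj >= y_rk  (written as <= with signs flipped)
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([np.zeros(m), -Y[k]]),
                      bounds=[(0, None)] * (n + 1), method="highs")
        return res.x[0]

    # Eight hypothetical years: input = funds, outputs = tiger count, patrolled area.
    X = np.array([[4.1], [4.8], [5.2], [5.0], [6.3], [6.0], [7.1], [7.4]])
    Y = np.array([[92, 210], [95, 230], [97, 250], [96, 240],
                  [99, 260], [98, 255], [101, 270], [100, 265]])
    for year in range(len(X)):
        print(f"Year {year + 1}: efficiency = {dea_efficiency(X, Y, year):.3f}")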

  2. Malware Analysis Sandbox Testing Methodology

    Directory of Open Access Journals (Sweden)

    Zoltan Balazs

    2016-01-01

    Manual processing of malware samples became impossible years ago. Sandboxes are used to automate the analysis of malware samples to gather information about their dynamic behaviour, both at AV companies and at enterprises. Some malware samples use known techniques to detect when they run in a sandbox, but most of these sandbox detection techniques can be easily detected and thus flagged as malicious. I invented new approaches to detect these sandboxes and developed a tool which can collect a great deal of interesting information from them, in order to build statistics on how the current technologies work. After analysing these results I will demonstrate tricks to detect sandboxes that cannot easily be flagged as malicious. Some sandboxes do not interact with the Internet in order to block data extraction, but with some DNS-fu the information can be extracted from these appliances as well.

  3. Methodology for the systems engineering process. Volume 3: Operational availability

    Science.gov (United States)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
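
    As a minimal numerical companion to the discussion, the following Monte Carlo sketch estimates the probability that a system with exponential failure and repair times is up at a random demand time; the estimate converges to the steady-state ratio MTBF/(MTBF + MTTR). All parameter values are arbitrary, and the report's own availability-state formulation is richer than this.

    import random

    def operational_availability(mtbf, mttr, mission_hours, trials=50_000, seed=1):
        """Monte Carlo estimate of the probability the system is up at a
        uniformly sampled demand time (alternating up/down renewal process)."""
        rng = random.Random(seed)
        up_count = 0
        for _ in range(trials):
            t, up = 0.0, True
            demand = rng.uniform(0, mission_hours)
            while t < demand:
                mean = mtbf if up else mttr
                t += rng.expovariate(1.0 / mean)
                if t < demand:
                    up = not up
            up_count += up
        return up_count / trials

    est = operational_availability(mtbf=400.0, mttr=20.0, mission_hours=5000.0)
    print(f"Simulated A_o = {est:.3f}  (analytic 400/420 = {400/420:.3f})")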

  4. Exploring participatory methodologies in organizational discourse analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches such as single-level vs. multi-level, critical vs. participatory, and discursive vs. material methods. They raise methodological issues of combining such approaches to embrace multimodality in order to enable new contributions. Conceptual efforts have been made, but further exploration of methodological combinations and their practical implications is called for. This paper argues (1) to combine methodologies by approaching this as scholarly subjectification processes, and (2) to perform combinations in both … and practices by dealing with challenges of methodological overview, responsive creativity and identity-struggle. The potentials hereof are demonstrated and discussed with cases of two both critical and co-creative practices, namely ‘organizational modelling’ and ‘fixed/unfixed positioning’, from fieldwork...

  5. Exploring Participatory Methodologies in Organizational Discourse Analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches as single-level vs. multi-level, critical vs. participatory, discursive vs. material methods. They raise methodological issues of combining such to embrace multimodality in order to enable new contribu...

  6. Comparative analysis of energy costing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    El-Sawy, A.H.; Leigh, J.G.; Trehan, R.K.

    1979-02-01

    The methodologies used by 16 organizations active in the geothermal area for computing levelized busbar costs of electricity from geothermal (hydrothermal) resources are discussed. The methodologies are compared by (a) comparing the results obtained using two standard data sets, (b) a theoretical analysis of the mathematical formulation of the embedded models, and (c) an examination of differences in data and assumptions. The objective is to attempt to resolve differences in estimates of geothermal (and conventional) electric power costs, upon which policies may be formulated and research, development and demonstration activities designed and implemented.

  7. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    In the present work, the viability of using neutron activation analysis to perform urine and blood clinical analyses was assessed. The aim of this study is to investigate the biological behavior of animals that have been fed chow doped with natural uranium over a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentrations in biological samples. The quantitative results for urine sediment using NAA were compared with conventional clinical analysis and found to be compatible. This methodology was also applied to bone and body organs such as liver and muscle to help interpret possible anomalies. (author)

  8. Enhanced recovery of unconventional gas. The methodology--Volume III (of 3 volumes)

    Energy Technology Data Exchange (ETDEWEB)

    Kuuskraa, V. A.; Brashear, J. P.; Doscher, T. M.; Elkins, L. E.

    1979-02-01

    The methodology is described in chapters on the analytic approach, estimated natural gas production, recovery from tight gas sands, recovery from Devonian shales, recovery from coal seams, and recovery from geopressured aquifers. (JRD)

  9. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study.
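
    The following toy event-tree Monte Carlo is in the spirit of the simulation sequence described (tornado occurrence, missile generation, impact, damage). It is emphatically not the TORMIS code: every probability and the missile-count model are placeholders chosen only to make the structure concrete.

    import random

    def damage_probability(p_tornado_per_yr, exposure_yr, mean_missiles,
                           p_hit, p_damage_given_hit, trials=200_000, seed=7):
        """Probability of at least one damaging missile impact over the
        exposure period, via straightforward event sampling."""
        rng = random.Random(seed)
        damaged = 0
        for _ in range(trials):
            if rng.random() < min(1.0, p_tornado_per_yr * exposure_yr):
                n = rng.randint(0, 2 * mean_missiles)   # crude missile count model
                for _ in range(n):
                    if rng.random() < p_hit and rng.random() < p_damage_given_hit:
                        damaged += 1
                        break
        return damaged / trials

    print(damage_probability(p_tornado_per_yr=1e-2, exposure_yr=40,
                             mean_missiles=40, p_hit=1e-2, p_damage_given_hit=0.2))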

  10. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study

  11. LOFT blowdown experiment safety analysis methodology

    International Nuclear Information System (INIS)

    An unprecedented blowdown experiment safety analysis (ESA) has been performed for the first two scheduled nuclear experiments in the Loss-of-Fluid Test (LOFT) facility. The ESA methodology is a unique approach needed to estimate conservatively the maximum consequences that will occur during an experiment. Through use of this information an acceptable risk in terms of adequate protection of the facility, personnel, and general public can be balanced with the requirements of the experiment program objectives. As an example, one of the LOFT program objectives is to evaluate the performance and effectiveness of emergency core cooling systems (ECCS) while relying on the same ECCSs (and backup ECCSs) to effectively perform as plant protection systems (PPS). The purpose of this paper is to present the LOFT blowdown ESA methodology

  12. Spatial analysis methodology applied to rural electrification

    Energy Technology Data Exchange (ETDEWEB)

    Amador, J. [Department of Electric Engineering, EUTI, UPM, Ronda de Valencia, E-28012 Madrid (Spain); Dominguez, J. [Renewable Energies Division, CIEMAT, Av. Complutense 22, E-28040 Madrid (Spain)

    2006-08-15

    The use of geographical information systems (GISs) in studies of the regional integration of renewable energies provides advantages such as speed, amount of information and analysis capacity, among others. However, these characteristics make it difficult to link the results to the initial variables, and therefore to validate the GIS. This makes it hard to ascertain the reliability of both the results and their subsequent analysis. To solve these problems, a GIS-based method using renewable energies for rural electrification is proposed, structured in three stages, with the aim of determining the influence of the initial variables on the result. In the first stage, a classic sensitivity analysis of the equivalent electrification cost (LEC) is performed; the second stage involves a spatial sensitivity analysis and the third determines the stability of the results. This methodology has been verified in the application of a GIS in Lorca (Spain). (author)

  13. Cost analysis methodology of spent fuel storage

    International Nuclear Information System (INIS)

    The report deals with the cost analysis of interim spent fuel storage; however, it is not intended either to give a detailed cost analysis or to compare the costs of the different options. This report provides a methodology for calculating the costs of different options for interim storage of the spent fuel produced in the reactor cores. Different technical features and storage options (dry and wet, away from reactor and at reactor) are considered and the factors affecting all options defined. The major cost categories are analysed. Then the net present value of each option is calculated and the levelized cost determined. Finally, a sensitivity analysis is conducted taking into account the uncertainty in the different cost estimates. Examples of current storage practices in some countries are included in the Appendices, with description of the most relevant technical and economic aspects. 16 figs, 14 tabs
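
    As a compact illustration of the net-present-value and levelized-cost steps mentioned above, the sketch below levelizes a cost stream over a discounted throughput stream. The storage option figures are invented; the formula is the standard one (present value of costs divided by present value of quantities).

    def levelized_cost(costs, quantities, rate):
        """Constant unit price whose discounted revenue equals the
        discounted cost stream: PV(costs) / PV(quantities)."""
        pv = lambda stream: sum(v / (1.0 + rate) ** t for t, v in enumerate(stream))
        return pv(costs) / pv(quantities)

    # Illustrative dry-storage option: up-front capital, then flat O&M (M$/yr).
    costs = [120.0] + [6.0] * 39
    throughput = [0.0] + [100.0] * 39      # tHM received per year
    print(f"Levelized cost: {levelized_cost(costs, throughput, 0.05):.3f} M$/tHM")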

  14. Protein crystallography. Methodological development and comprehensive analysis

    International Nuclear Information System (INIS)

    There have been remarkable developments in the methodology for protein structure analysis over the past few decades. Currently, single-wavelength anomalous diffraction phasing of a selenomethionyl derivative (Se-SAD) is used as a general method for determining protein structure, while the sulfur single-wavelength anomalous diffraction method (S-SAD) using native protein is evolving as a next-generation method. In this paper, we look back on the early applications of multi-wavelength anomalous diffraction phasing of a selenomethionyl derivative (Se-MAD) and introduce the study of ribosomal proteins as an example of the comprehensive analysis that took place in the 1990s. Furthermore, we refer to the current state of development of the S-SAD method as well as automatic structure determination. (author)

  15. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments.

  16. Swept Volume Parameterization for Isogeometric Analysis

    Science.gov (United States)

    Aigner, M.; Heinrich, C.; Jüttler, B.; Pilgerstorfer, E.; Simeon, B.; Vuong, A.-V.

    Isogeometric Analysis uses NURBS representations of the domain for performing numerical simulations. The first part of this paper presents a variational framework for generating NURBS parameterizations of swept volumes. The class of these volumes covers a number of interesting free-form shapes, such as blades of turbines and propellers, ship hulls or wings of airplanes. The second part of the paper reports the results of isogeometric analysis which were obtained with the help of the generated NURBS volume parameterizations. In particular we discuss the influence of the chosen parameterization and the incorporation of boundary conditions.
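
    For readers unfamiliar with the NURBS machinery behind such parameterizations, the sketch below evaluates B-spline basis functions by the Cox-de Boor recursion, the building block from which NURBS volumes are assembled as weighted tensor products. It is a generic textbook routine, not the authors' variational parameterization method.

    def bspline_basis(i, p, u, knots):
        """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
        if p == 0:
            return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
        left = right = 0.0
        if knots[i + p] != knots[i]:
            left = (u - knots[i]) / (knots[i + p] - knots[i]) \
                   * bspline_basis(i, p - 1, u, knots)
        if knots[i + p + 1] != knots[i + 1]:
            right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
                    * bspline_basis(i + 1, p - 1, u, knots)
        return left + right

    # Quadratic basis on an open knot vector; the four values sum to 1.
    knots = [0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0]
    vals = [bspline_basis(i, 2, 0.4, knots) for i in range(4)]
    print(vals, sum(vals))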

  17. METHODOLOGICAL ANALYSIS OF TRAINING STUDENT basketball teams

    Directory of Open Access Journals (Sweden)

    Kozina Zh.L.

    2011-06-01

    The leading approaches to the preparation of basketball teams in higher education are considered. The system includes the following: reliance on top-quality players in the structure of preparedness; widespread use of visual aids, teaching films and animations recording the execution of various techniques by professional basketball players; and the application of methods of autogenic and ideomotor training according to our methodology. The study involved 63 first- to fifth-year students of the first and second sports categories from various universities of Kharkov: 32 in the experimental group and 31 in the control group. The developed system for training student basketball players was applied for one year. The results confirm the efficiency of the developed system in the training process of student basketball players.

  18. A Methodology For Flood Vulnerability Analysis In Complex Flood Scenarios

    Science.gov (United States)

    Figueiredo, R.; Martina, M. L. V.; Dottori, F.

    2015-12-01

    Nowadays, flood risk management is gaining importance as a means to mitigate and prevent flood disasters, and consequently the analysis of flood vulnerability is becoming a key research topic. In this paper, we propose a methodology for large-scale analysis of flood vulnerability. The methodology is based on a GIS-based index, which considers local topography, terrain roughness and basic information about the flood scenario to reproduce the diffusive behaviour of floodplain flow. The methodology synthesizes the spatial distribution of index values into maps and curves used to represent the vulnerability in the area of interest. Its application allows for considering different levels of complexity of flood scenarios, from localized flood defence failures to complex hazard scenarios involving river reaches. The components of the methodology are applied and tested in two floodplain areas in Northern Italy recently affected by floods. The results show that the methodology can provide an original and valuable insight into flood vulnerability variables and processes.

  19. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 2: Appendices

    Science.gov (United States)

    Niiya, Karen E.; Walker, Richard E.; Pieper, Jerry L.; Nguyen, Thong V.

    1993-01-01

    This final report includes a discussion of the work accomplished during the period from Dec. 1988 through Nov. 1991. The objective of the program was to assemble existing performance and combustion stability models into a usable design methodology capable of designing and analyzing high-performance and stable LOX/hydrocarbon booster engines. The methodology was then used to design a validation engine. The capabilities and validity of the methodology were demonstrated using this engine in an extensive hot fire test program. The engine used LOX/RP-1 propellants and was tested over a range of mixture ratios, chamber pressures, and acoustic damping device configurations. This volume contains time domain and frequency domain stability plots which indicate the pressure perturbation amplitudes and frequencies from approximately 30 tests of a 50K thrust rocket engine using LOX/RP-1 propellants over a range of chamber pressures from 240 to 1750 psia with mixture ratios from 1.2 to 7.5. The data is from test configurations which used both bitune and monotune acoustic cavities and from tests with no acoustic cavities. The engine had a length of 14 inches and a contraction ratio of 2.0 using a 7.68 inch diameter injector. The data was taken from both stable and unstable tests. All combustion instabilities were spontaneous in the first tangential mode. Although stability bombs were used and generated overpressures of approximately 20 percent, no tests were driven unstable by the bombs. The stability instrumentation included six high-frequency Kistler transducers in the combustion chamber, a high-frequency Kistler transducer in each propellant manifold, and tri-axial accelerometers. Performance data is presented, both characteristic velocity efficiencies and energy release efficiencies, for those tests of sufficient duration to record steady state values.

  20. Integrated Methodology for Software Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2012-01-01

    The most commonly used techniques to ensure the safety and reliability of systems are applied together as a whole, and in most cases the software components are overlooked or too little analyzed. The present paper describes the application of fault tree analysis to software systems, defined as Software Fault Tree Analysis (SFTA); the fault trees are evaluated using binary decision diagrams, all of these being integrated and used with the help of a Java reliability library.

  1. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  2. SINGULAR SPECTRUM ANALYSIS: METHODOLOGY AND APPLICATION TO ECONOMICS DATA

    Institute of Scientific and Technical Information of China (English)

    Hossein HASSANI; Anatoly ZHIGLJAVSKY

    2009-01-01

    This paper describes the methodology of singular spectrum analysis (SSA) and demonstrates that it is a powerful method of time series analysis and forecasting, particularly for economic time series. The authors consider the application of SSA to the analysis and forecasting of the Iranian national accounts data as provided by the Central Bank of the Islamic Republic of Iran.
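
    For readers new to SSA, the following generic sketch performs the standard four steps (embedding, singular value decomposition, grouping, diagonal averaging) to extract a smooth component from a noisy series. It is a textbook implementation, not the authors' code, and the test series is synthetic.

    import numpy as np

    def ssa_reconstruct(series, window, rank):
        """Basic SSA: embed, decompose, reconstruct from leading components."""
        x = np.asarray(series, dtype=float)
        n = len(x)
        k = n - window + 1
        # 1) Embedding: build the trajectory (Hankel) matrix.
        X = np.column_stack([x[i:i + window] for i in range(k)])
        # 2) SVD and truncation to the leading `rank` components.
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # 3) Diagonal averaging (Hankelization) back to a series.
        out = np.zeros(n)
        counts = np.zeros(n)
        for j in range(k):
            out[j:j + window] += Xr[:, j]
            counts[j:j + window] += 1
        return out / counts

    t = np.arange(120)
    noisy = (np.sin(2 * np.pi * t / 12) + 0.02 * t
             + np.random.default_rng(0).normal(0, 0.3, 120))
    trend_plus_cycle = ssa_reconstruct(noisy, window=24, rank=3)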

  3. PROBABILISTIC METHODOLOGY OF LOW CYCLE FATIGUE ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    Jin Hui; Wang Jinnuo; Wang Libin

    2003-01-01

    The cyclic stress-strain response (CSSR), Neuber's rule (NR) and the cyclic strain-life relation (CSLR) are treated as probabilistic curves in the local stress and strain method of low cycle fatigue analysis. The randomness of loading and the theory of fatigue damage accumulation (TOFDA) are considered. The probabilistic analysis of local stress, local strain and fatigue life is constructed based on first-order Taylor series expansions. Through the method proposed, fatigue reliability analysis can be accomplished.
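
    The relations named in the abstract have standard deterministic forms, reproduced below in conventional notation for reference (the paper's notation may differ); in the probabilistic version the material constants K', n', sigma_f', epsilon_f', b and c are treated as random variables.

    % Cyclic stress-strain response (Ramberg-Osgood form):
    \frac{\Delta\varepsilon}{2} = \frac{\Delta\sigma}{2E}
        + \left(\frac{\Delta\sigma}{2K'}\right)^{1/n'}

    % Neuber's rule linking the nominal stress range \Delta S to local ranges:
    \Delta\sigma \, \Delta\varepsilon = \frac{(K_t\,\Delta S)^2}{E}

    % Cyclic strain-life relation (Basquin-Coffin-Manson):
    \frac{\Delta\varepsilon}{2} = \frac{\sigma_f'}{E}\,(2N_f)^b + \varepsilon_f'\,(2N_f)^c

    % First-order Taylor moments for a response Z = g(X_1,\dots,X_n):
    \mathrm{E}[Z] \approx g(\boldsymbol{\mu}), \qquad
    \mathrm{Var}[Z] \approx \sum_i \left(\frac{\partial g}{\partial X_i}\Big|_{\boldsymbol{\mu}}\right)^{\!2}\sigma_i^2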

  4. A New Methodology of Spatial Crosscorrelation Analysis

    CERN Document Server

    Chen, Yanguang

    2015-01-01

    The idea of spatial crosscorrelation was conceived of long ago. However, unlike the related spatial autocorrelation, the theory and method of spatial crosscorrelation analysis have remained undeveloped. This paper presents a set of models and working methods for spatial crosscorrelation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form and by means of mathematical reasoning, I derive a theoretical framework for geographical crosscorrelation analysis. First, two sets of spatial crosscorrelation coefficients are defined, including a global spatial crosscorrelation coefficient and a set of local spatial crosscorrelation coefficients. Second, a pair of scatterplots of spatial crosscorrelation is proposed, and different scatterplots show different relationships between correlated variables. Based on the spatial crosscorrelation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial crosscorrelation) and indirect correlation (sp...
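
    One plausible reading of the global coefficient in quadratic form is a Moran-style statistic applied to two z-scored variables, sketched below; the paper's exact normalization may differ, and the adjacency data here are fabricated.

    import numpy as np

    def global_spatial_crosscorrelation(x, y, W):
        """Moran-style global cross-correlation: z_x' W z_y / sum(W),
        with z-scored variables and zero diagonal in the weights."""
        W = np.array(W, dtype=float)    # copy so the caller's W is untouched
        np.fill_diagonal(W, 0.0)
        zx = (x - x.mean()) / x.std()
        zy = (y - y.mean()) / y.std()
        return float(zx @ W @ zy / W.sum())

    # Four locations on a line with rook adjacency; x and y trend together.
    W = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]])
    x = np.array([1.0, 2.0, 3.0, 4.0])
    y = np.array([1.2, 1.9, 3.4, 3.9])
    print(global_spatial_crosscorrelation(x, y, W))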

  5. Finite Volume Methods: Foundation and Analysis

    Science.gov (United States)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the obtention of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
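
    To make a few of the building blocks concrete, here is a minimal MUSCL-type finite volume step for 1D linear advection with a minmod slope limiter on a periodic grid; it is a standard textbook construction, not something specific to this article.

    import numpy as np

    def minmod(a, b):
        """Slope limiter used to retain a local discrete maximum principle."""
        return np.where(a * b > 0.0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

    def advect_step(u, c, dx, dt):
        """One finite volume step for u_t + c u_x = 0 (c > 0, periodic):
        minmod-limited linear reconstruction with upwind numerical fluxes."""
        slope = minmod(np.roll(u, -1) - u, u - np.roll(u, 1))
        u_face = u + 0.5 * slope          # value at each cell's right face
        flux = c * u_face                 # upwind flux for c > 0
        return u - dt / dx * (flux - np.roll(flux, 1))

    # Advect a square pulse once around the unit interval.
    n, c = 200, 1.0
    dx = 1.0 / n
    x = (np.arange(n) + 0.5) * dx
    u = np.where((x > 0.3) & (x < 0.5), 1.0, 0.0)
    dt = 0.4 * dx / c                     # CFL number 0.4
    for _ in range(round(1.0 / (c * dt))):
        u = advect_step(u, c, dx, dt)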

  6. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels;

    2010-01-01

    for implementation into a computer aided reasoning tool for HAZOP studies to perform root cause and consequence analysis. Such a tool will facilitate finding causes far away from the site of the deviation. A Functional HAZOP Assistant is proposed and investigated in a HAZOP study of an industrial scale Indirect...

  7. Discourse analysis: making complex methodology simple

    NARCIS (Netherlands)

    Bondarouk, Tatyana; Ruel, Huub; Leino, T.; Saarinen, T.; Klein, S.

    2004-01-01

    Discourse-based analysis of organizations is not new in the field of interpretive social studies. More recently, information systems (IS) studies have also shown a keen interest in discourse (Wynn et al., 2002). The IS field has grown significantly in its multiplicity, which is echoed in the disco...

  8. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  9. Climatic analysis methodology of vernacular architecture

    OpenAIRE

    Gil Crespo, Ignacio Javier; Barbero Barrera, María del Mar; Maldonado Ramos, Luis

    2015-01-01

    Vernacular architecture has demonstrated its perfect environmental adaptation through its empirical development and improvement by generations of user-builders. Nowadays, the sustainability of vernacular architecture is the aim of several research projects, in which the same method should be applied in order for the results to be comparable. Hence, we propose a research method combining several steps: the analysis of geographical, lithological, economic, cultural and social influences as well as mater...

  10. Analysis of automated highway system risks and uncertainties. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.
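
    A small sketch of the workflow the volume describes: converting expert percentile judgments into probability distributions and propagating them by Monte Carlo to a summary index. The lognormal family, the percentile values and the toy cost/benefit index are all assumptions for illustration, not the study's models.

    import numpy as np

    def lognormal_from_percentiles(p10, p90):
        """Fit a lognormal to an expert's 10th/90th percentile assessments;
        returns (mu, sigma) of the underlying normal (z_0.90 ~ 1.2816)."""
        z = 1.2816
        sigma = (np.log(p90) - np.log(p10)) / (2.0 * z)
        mu = 0.5 * (np.log(p90) + np.log(p10))
        return mu, sigma

    rng = np.random.default_rng(42)
    n = 100_000
    # Hypothetical AHS factors: per-vehicle equipment cost ($), lane capacity (veh/h).
    cost = rng.lognormal(*lognormal_from_percentiles(2_000, 8_000), n)
    capacity = rng.lognormal(*lognormal_from_percentiles(4_000, 8_000), n)
    index = capacity / cost               # toy benefit-per-cost summary index
    print(np.percentile(index, [10, 50, 90]))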

  11. Financial constraints in capacity planning: a national utility regulatory model (NUREG). Volume I of III: methodology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1981-10-29

    This report develops and demonstrates the methodology for the National Utility Regulatory (NUREG) Model developed under contract number DEAC-01-79EI-10579. It is accompanied by two supporting volumes. Volume II is a user's guide for operation of the NUREG software. This includes description of the flow of software and data, as well as the formats of all user data files. Finally, Volume III is a software description guide. It briefly describes, and gives a listing of, each program used in NUREG.

  12. New methodologies in stable isotope analysis

    International Nuclear Information System (INIS)

    In the 1970s, soil scientists stressed the need for a fast, easy to use 15N analyser to replace the isotope ratio mass spectrometer (IRMS) and Kjeldahl sample preparation. By 1984, three groups had succeeded in interfacing an elemental analyser to an IRMS. 'Continuous flow' Dumas combustion converted N in plant tissue or soil to a pulse of N2 gas, taken to the mass spectrometer by a helium carrier. Throughput increased from 20 to 100 analyses per day and only 5 μg N were required compared with 50 μg N for Kjeldahl-Rittenberg preparation and IRMS analysis. Since 1987, a software controlled automated nitrogen and carbon analyser-mass spectrometer (ANCA-MS) has been developed with which 15N and 13C can be measured to 0.0003 and 0.0002 at.% RSD respectively. Reducing hardware has made it portable, enabling it to be used in the field. Measurement of submicrogram quantities of nitrogen is possible using software control to move the oxygen pulse, with its N2 'blank', out of phase with the sample. Software also allows operation at twice normal speed, enabling plant breeders to screen genotypes for N fixing ability within the flowering period. 35 refs, 6 figs, 7 tabs

  13. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  14. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    Fusion systems, which are under development as future energy systems, have reached a stage where break-even is expected to be achieved in the near future. It is desirable to demonstrate that fusion systems are well accepted by society. There are three crucial viewpoints for measuring this acceptability: technological feasibility, economy and safety. These three points are closely interrelated. The safety problem has become more important since the three large-scale tokamaks, JET, TFTR and JT-60, started experiments, and tritium will be introduced into some of them as the fusion fuel. It is desirable to establish a methodology that resolves the safety-related issues in harmony with the technological evolution. The most promising fusion system toward reactors is not yet settled. The objective of this study is to develop an adequate methodology that promotes the safety design of general fusion systems and to present a basis for proposing R and D themes and establishing the database. A framework for the methodology, the understanding and modeling of fusion systems, the principle of ensuring safety, the function-based safety analysis and the application of the methodology are discussed. As a result of this study, the methodology for the safety analysis and evaluation of fusion systems was developed. New ideas and approaches were presented in the course of the methodology development. (Kako, I.)

  15. Simplified methodology for analysis of the Angra-1 containment

    International Nuclear Information System (INIS)

    A simplified analysis methodology was developed to simulate a large-break loss-of-coolant accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The results were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained by this new methodology, together with the small computational time required for simulation, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  16. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    A synthesized methodology for the safety analysis and evaluation of general fusion systems is proposed. In the course of the methodology development, its main frame has been constructed so as to take account of all safety-related items and to ensure logical consistency. The safety-related items are divided broadly into two groups. One of them is the protection of the public from radiological hazard, which is introduced as a safety requirement from a viewpoint external to the fusion system. The other items are matters from an internal viewpoint and are related to the behavior of the fusion system itself. These items comprise the understanding of a fusion system, the safety-ensuring principle and the function-based safety analysis. All of these items have been mapped onto the frame consistently, considering the mutual relations among them. To complete the methodology development, a safety evaluation of an actual fusion system design has been performed in conformity with this methodology. It has thus been demonstrated that the methodology proposed here is appropriate for the safety analysis and evaluation of fusion systems. (author). 9 refs, 4 figs, 2 tabs

  17. Taipower's transient analysis methodology for pressurized water reactors

    International Nuclear Information System (INIS)

    The methodology presented in this paper is part of 'Taipower's Reload Design and Transient Analysis Methodologies for Light Water Reactors' developed by the Taiwan Power Company (TPC) and the Institute of Nuclear Energy Research. This methodology utilizes four computer codes developed or sponsored by the Electric Power Research Institute: the system transient analysis code RETRAN-02, the core thermal-hydraulic analysis code COBRAIIIC, the three-dimensional spatial kinetics code ARROTTA, and the fuel rod evaluation code FREY. Each of the computer codes was extensively validated. Analysis methods and modeling techniques were conservatively established for each application using a systematic evaluation with the assistance of sensitivity studies. The qualification results and analysis methods were documented in detail in TPC topical reports. The topical reports for COBRAIIIC, ARROTTA, and FREY have been reviewed and approved by the Atomic Energy Council (AEC). TPC's in-house transient methodology has been successfully applied to provide valuable support for many operational issues and plant improvements for TPC's Maanshan Units 1 and 2. Major applications include the removal of the resistance temperature detector bypass system, the relaxation of the hot-full-power moderator temperature coefficient design criteria imposed by the ROCAEC due to a concern about Anticipated Transient Without Scram, the reduction of boron injection tank concentration and the elimination of the heat tracing, and the reduction of reactor coolant system flow. (author)

  18. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    A critical review of 34 published severe radiological accidents in industrial gamma radiography, which occurred in 15 countries between 1960 and 1988, was performed. The most frequent causes, consequences and dose estimation methods were analysed with the aim of establishing better radiation safety and accident analysis procedures. The objective of this work is to elaborate a radiological accident analysis methodology for industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of an event and to estimate the dose with good certainty. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programmes, was adopted in the elaboration of the suggested methodology. The viability of using the Electron Gamma Shower 4 computer code system to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in 192Ir radioactive source handling situations, was also studied. (author)

  19. Methodology for reactor core physics analysis - part 2

    International Nuclear Information System (INIS)

    The computer codes used for reactor core physics analysis are described. The modifications introduced in the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameters involved in qualifying the methodology is included. (F.E.). 5 refs, 7 figs, 5 tabs

  20. Estimates of emergency operating capacity in US manufacturing and nonmanufacturing industries - Volume 1: Concepts and Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, D.B. (Pacific Northwest Lab., Richland, WA (USA)); Serot, D.E. (D/E/S Research, Richland, WA (USA)); Kellogg, M.A. (ERCE, Inc., Portland, OR (USA))

    1991-03-01

    Development of integrated mobilization preparedness policies requires planning estimates of available productive capacity during national emergency conditions. Such estimates must be developed in a manner to allow evaluation of current trends in capacity and the consideration of uncertainties in various data inputs and in engineering assumptions. This study developed estimates of emergency operating capacity (EOC) for 446 manufacturing industries at the 4-digit Standard Industrial Classification (SIC) level of aggregation and for 24 key nonmanufacturing sectors. This volume lays out the general concepts and methods used to develop the emergency operating estimates. The historical analysis of capacity extends from 1974 through 1986. Some nonmanufacturing industries are included. In addition to mining and utilities, key industries in transportation, communication, and services were analyzed. Physical capacity and efficiency of production were measured. 3 refs., 2 figs., 12 tabs. (JF)

  1. Screening Analysis: Volume 1, Description and Conclusions.

    Energy Technology Data Exchange (ETDEWEB)

    Bonneville Power Administration; Corps of Engineers; Bureau of Reclamation

    1992-08-01

    The SOR consists of three analytical phases leading to a Draft EIS. The first phase, Pilot Analysis, was performed for the purpose of testing the decision analysis methodology being used in the SOR; it is described later in this chapter. The second phase, Screening Analysis, examines all possible operating alternatives using a simplified analytical approach. It is described in detail in this and the next chapter, and this document also presents the results of screening. The final phase, Full-Scale Analysis, will be documented in the Draft EIS and is intended to evaluate comprehensively the few best alternatives arising from the screening analysis. The purpose of screening is to analyze a wide variety of differing ways of operating the Columbia River system to test the reaction of the system to change. The many alternatives considered reflect the range of needs and requirements of the various river users and interests in the Columbia River Basin. While some of the alternatives might be viewed as extreme, the information gained from the analysis is useful in highlighting issues and conflicts in meeting operating objectives. Screening is also intended to develop a broad technical basis for evaluation, involving regional experts, and to begin developing an evaluation capability for each river use that will support full-scale analysis. Finally, screening provides a logical method for examining all possible options and reaching a decision on a few alternatives worthy of full-scale analysis. An organizational structure was developed and staffed to manage and execute the SOR, specifically during the screening phase and the upcoming full-scale analysis phase. The organization involves ten technical work groups, each representing a particular river use. Several other groups exist to oversee or support the efforts of the work groups.

  2. Relationship between stroke volume and pulse pressure during blood volume perturbation: a mathematical analysis.

    Science.gov (United States)

    Bighamian, Ramin; Hahn, Jin-Oh

    2014-01-01

    Arterial pulse pressure has been widely used as a surrogate of stroke volume, for example, in the guidance of fluid therapy. However, recent experimental investigations suggest that arterial pulse pressure is not linearly proportional to stroke volume, and the mechanisms underlying the relation between the two have not been clearly understood. The goal of this study was to elucidate how arterial pulse pressure and stroke volume respond to a perturbation in the left ventricular blood volume, based on a systematic mathematical analysis. Both our mathematical analysis and experimental data showed that the relative change in arterial pulse pressure due to a left ventricular blood volume perturbation was consistently smaller than the corresponding relative change in stroke volume, due to the nonlinear left ventricular pressure-volume relation during diastole, which reduces the sensitivity of arterial pulse pressure to perturbations in the left ventricular blood volume. Therefore, arterial pulse pressure must be used with care as a surrogate of stroke volume in guiding fluid therapy.
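
    The qualitative finding above can be illustrated with a toy calculation: whenever pulse pressure is a concave function of stroke volume (here a hypothetical power law PP = k·SV^a with a < 1, not the authors' physiological model), the relative change in PP is necessarily smaller than the relative change in SV.

        # Toy sketch (hypothetical parameters, not the paper's model):
        # a concave PP-SV relation implies %change(PP) < %change(SV).
        k, a = 2.0, 0.7                  # illustrative scale and exponent
        sv0, sv1 = 70.0, 80.0            # stroke volume before/after a volume perturbation [mL]
        pp0, pp1 = k * sv0 ** a, k * sv1 ** a
        print(f"relative SV change: {(sv1 - sv0) / sv0:.1%}")   # 14.3%
        print(f"relative PP change: {(pp1 - pp0) / pp0:.1%}")   # ~9.8%, consistently smaller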

  3. Sensitivity analysis of volume scattering phase functions.

    Science.gov (United States)

    Tuchow, Noah; Broughton, Jennifer; Kudela, Raphael

    2016-08-01

    To solve the radiative transfer equation and relate inherent optical properties (IOPs) to apparent optical properties (AOPs), knowledge of the volume scattering phase function is required. Due to the difficulty of measuring the phase function, it is frequently approximated. We explore the sensitivity of derived AOPs to the phase function parameterization, and compare measured and modeled values of both the AOPs and estimated phase functions using data from Monterey Bay, California during an extreme "red tide" bloom event. Using in situ measurements of absorption and attenuation coefficients, as well as two sets of measurements of the volume scattering function (VSF), we compared output from the Hydrolight radiative transfer model to direct measurements. We found that several common assumptions used in parameterizing the radiative transfer model consistently introduced overestimates of modeled versus measured remote-sensing reflectance values. Phase functions derived from VSF measurements at multiple wavelengths and a single scattering angle significantly overestimated reflectances when using the manufacturer-supplied corrections, but were substantially improved using newly published corrections; phase functions calculated from VSF measurements at three angles and three wavelengths and processed using manufacturer-supplied corrections were comparable, demonstrating that reasonable predictions can be made using two commercially available instruments. While other studies have reached similar conclusions, our work extends the analysis to coastal waters dominated by an extreme algal bloom with surface chlorophyll concentrations in excess of 100 mg m-3. PMID:27505819
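
    A common single-parameter stand-in for an unmeasured phase function is the Henyey-Greenstein form (used here purely as an illustration; the study's own parameterizations may differ). The sketch below evaluates it and integrates the backscattered fraction, the quantity to which remote-sensing reflectance is most sensitive:

        import numpy as np

        def henyey_greenstein(theta, g):
            """Henyey-Greenstein phase function [1/sr] at scattering angle theta [rad]."""
            return (1.0 - g**2) / (4.0 * np.pi * (1.0 + g**2 - 2.0 * g * np.cos(theta)) ** 1.5)

        theta = np.linspace(0.0, np.pi, 18001)
        dtheta = theta[1] - theta[0]
        for g in (0.85, 0.90, 0.95):     # illustrative asymmetry values for forward-peaked scattering
            p = henyey_greenstein(theta, g)
            back = theta > np.pi / 2.0
            # fraction of scattered power directed into the backward hemisphere
            bb = 2.0 * np.pi * np.sum(p[back] * np.sin(theta[back])) * dtheta
            print(f"g = {g:.2f}: backscatter fraction ~ {bb:.4f}")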

  4. Latest developments on safety analysis methodologies at the Juzbado plant

    Energy Technology Data Exchange (ETDEWEB)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A. [ENUSA Industrias Avanzadas S. A., Juzbado Nuclear Fuel Fabrication Plant, Ctra. Salamanca-Ledesma, km. 26, 37015 Juzbado, Salamanca (Spain)

    2010-07-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, that is to say, the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, such as Regulator requirements, operational experience and, of course, the strong commitment of ENUSA to maintain the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, and within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated early in 2009 to assess the actual operating conditions of all systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points of these three methodologies as well as an outline of the results obtained so far. (authors)

  5. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    OpenAIRE

    Andi Andi

    2004-01-01

    Many cost overruns in the world of construction are attributable to either unforeseen events or foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement to project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for appropriate allocation of contingency in project cost estimation. In the first step, project risks...

  6. Computational stress analysis using finite volume methods

    OpenAIRE

    Fallah, Nosrat Allah

    2000-01-01

    There is a growing interest in applying finite volume methods to model solid mechanics problems and multi-physics phenomena. During the last ten years an increasing amount of activity has taken place in this area. Unlike the finite element formulation, which generally involves volume integrals, the finite volume formulation transfers volume integrals to surface integrals using the divergence theorem. This transformation for the convection and diffusion terms in the governing equations ensures...
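
    The flux-balance idea described above can be sketched on the simplest possible case, a 1D steady diffusion equation d/dx(k du/dx) + s = 0 on uniform cells: integrating over each cell and applying the divergence theorem leaves only the difference of face fluxes. All parameters below are illustrative:

        import numpy as np

        n, L = 50, 1.0                   # number of cells, domain length
        dx = L / n
        k, s = 1.0, 1.0                  # conductivity and uniform source (made-up values)
        A = np.zeros((n, n))
        b = np.full(n, -s * dx)          # integrated source per cell

        for i in range(n):
            # face flux coefficients; boundary faces use a half-cell spacing (u = 0 at both ends)
            aw = k / dx if i > 0 else 2.0 * k / dx
            ae = k / dx if i < n - 1 else 2.0 * k / dx
            A[i, i] = -(aw + ae)
            if i > 0:
                A[i, i - 1] = aw
            if i < n - 1:
                A[i, i + 1] = ae

        u = np.linalg.solve(A, b)
        x = (np.arange(n) + 0.5) * dx    # cell centres
        err = np.max(np.abs(u - s * x * (L - x) / (2.0 * k)))   # exact solution is a parabola
        print(f"max error vs exact solution: {err:.2e}")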

  7. An Efficient Analysis Methodology for Fluted-Core Composite Structures

    Science.gov (United States)

    Oremont, Leonard; Schultz, Marc R.

    2012-01-01

    The primary loading condition in launch-vehicle barrel sections is axial compression, and it is therefore important to understand the compression behavior of any structures, structural concepts, and materials considered in launch-vehicle designs. This understanding will necessarily come from a combination of test and analysis. However, certain potentially beneficial structures and structural concepts do not lend themselves to commonly used simplified analysis methods, and therefore innovative analysis methodologies must be developed if these structures and structural concepts are to be considered. This paper discusses such an analysis technique for the fluted-core sandwich composite structural concept. The presented technique is based on commercially available finite-element codes, and uses shell elements to capture the detailed mechanical response that would normally require solid elements. The shell thicknesses and offsets in this analysis technique are parameterized, and the parameters are adjusted through a heuristic procedure until the model matches the mechanical behavior of a more detailed shell-and-solid model. Additionally, the detailed shell-and-solid model can be strategically placed in a larger, global shell-only model to capture important local behavior. Comparisons between shell-only models, experiments, and more detailed shell-and-solid models show excellent agreement. The discussed analysis methodology, though presented only in the context of fluted-core composites, is widely applicable to other concepts.

  8. PANSYSTEMS ANALYSIS: MATHEMATICS, METHODOLOGY,RELATIVITY AND DIALECTICAL THINKING

    Institute of Scientific and Technical Information of China (English)

    郭定和; 吴学谋; 冯向军; 李永礼

    2001-01-01

    Based on new analysis modes and new definitions with relative mathematization and simplification or strengthening forms for concepts of generalized systems, panderivatives, pansymmetry, panbox principle, pansystems relativity, etc., the framework and related principles of pansystems methodology and pansystems relativity are developed. Related contents include: pansystems with relatively universal mathematizing forms, 200 types of dualities, duality transformation, pansymmetry transformation, pansystems dialectics, the 8-domain method, pansystems mathematical methods, generalized quantification, the principles of approximation-transforming, pan-equivalence theorems, supply-demand analysis, thinking experiment, generalized gray systems, etc.

  9. Snapshot analysis for rhodium fixed incore detector using BEACON methodology

    International Nuclear Information System (INIS)

    The purpose of this report is to process the rhodium detector data of the Yonggwang nuclear unit 4 cycle 5 core for the measured power distribution by using the BEACON methodology. Rhodium snapshots of YGN 4 cycle 5 were analyzed by both BEACON/SPINOVA and CECOR to compare the results of the two codes, based on a large number of snapshots obtained during normal plant operation. A review of the results of this analysis shows that BEACON/SPINOVA can be used for the snapshot analysis of Korean Standard Nuclear Power (KSNP) plants

  10. Simplifying multivariate survival analysis using global score test methodology

    Science.gov (United States)

    Zain, Zakiyah; Aziz, Nazrina; Ahmad, Yuhaniz

    2015-12-01

    In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve multiple endpoints, and this situation further complicates the analysis of survival data. In the case of tumor patients, endpoints concerning survival times include the times from tumor removal until the first, second and third tumor recurrences, and the time to death. For each patient, these endpoints are correlated, and the estimation of the correlation between two score statistics is fundamental in the derivation of the overall treatment advantage. In this paper, the bivariate survival analysis method using the global score test methodology is extended to the multivariate setting.
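
    A minimal sketch of the combination step (illustrative numbers, not the paper's data): endpoint-specific standardised score statistics are combined, through their estimated correlation matrix, into either a directional global Z or an omnibus chi-square statistic:

        import numpy as np
        from scipy import stats

        z = np.array([1.9, 2.2])         # hypothetical standardised scores for two endpoints
        rho = 0.6                        # estimated correlation between the two scores
        V = np.array([[1.0, rho], [rho, 1.0]])

        # directional (sum) combination: Z_global ~ N(0,1) under the null
        z_global = z.sum() / np.sqrt(np.ones(2) @ V @ np.ones(2))
        # omnibus combination: chi-square with 2 degrees of freedom under the null
        chi2 = z @ np.linalg.solve(V, z)
        print(f"Z = {z_global:.2f} (one-sided p = {stats.norm.sf(z_global):.4f})")
        print(f"chi2 = {chi2:.2f} (p = {stats.chi2.sf(chi2, df=2):.4f})")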

  11. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    Science.gov (United States)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  12. Capillary Electrophoresis-based Methodology Development for Biomolecule Analysis

    OpenAIRE

    Li, Ni

    2011-01-01

    Capillary electrophoresis (CE) is a separation tool with wide applications in biomolecule analysis. Fast and high-resolution separation requiring minute sample volumes is advantageous to study multiple components in biological samples. Flexible modes and methods can be developed. In this thesis, I focus on developing and applying novel CE methods to study multi-target nucleic acid sensing with high sensitivity (Part I) and interactions between multiple components, i.e. proteins, nanoparticles...

  13. Safety analysis methodologies for radioactive waste repositories in shallow ground

    International Nuclear Information System (INIS)

    The report is part of the IAEA Safety Series and is addressed to authorities and specialists responsible for or involved in planning, performing and/or reviewing safety assessments of shallow ground radioactive waste repositories. It discusses approaches that are applicable for the safety analysis of a shallow ground repository. The methodologies, analysis techniques and models described are pertinent to the task of predicting the long-term performance of a shallow ground disposal system. They may be used during the processes of selection, confirmation and licensing of new sites and disposal systems or to evaluate the long-term consequences in the post-sealing phase of existing operating or inactive sites. The analysis may point out the need for remedial action, or provide information to be used in deciding on the duration of surveillance. Safety analyses, both general in nature and specific to a certain repository, site or design concept, are discussed, with emphasis on deterministic and probabilistic studies.

  14. Volume totalizers analysis of pipelines operated by TRANSPETRO National Operational Control Center; Analise de totalizadores de volume em oleodutos operados pelo Centro Nacional de Controle e Operacao da TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Aramaki, Thiago Lessa; Montalvao, Antonio Filipe Falcao [Petrobras Transporte S.A. (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Marques, Thais Carrijo [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil)

    2012-07-01

    This paper presents the results and the methodology used in the analysis of differences in volume totalizers used in systems such as batch tracking and leak detection for pipelines operated by the National Operational Control Center (CNCO) at TRANSPETRO. In order to optimize this type of analysis, software was developed for the acquisition and processing of historical data using the methodology developed. The methodology takes into account the particularities encountered in systems operated by TRANSPETRO and, more specifically, by CNCO. (author)
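
    The basic check such systems perform can be sketched as a windowed volume balance: received volume should match delivered volume once the change in line inventory (line-pack) is accounted for. Names, data and the alarm threshold below are purely illustrative:

        # Hypothetical totalizer-difference check of the kind used for leak detection.
        def volume_imbalance(v_in, v_out, linepack_change):
            """Windowed volume balance [m3]; a persistent positive residual may indicate a leak."""
            return v_in - v_out - linepack_change

        windows = [   # (inlet m3, outlet m3, line-pack change m3) -- made-up data
            (1000.2, 998.9, 1.0),
            (1001.5, 1000.8, 0.4),
            (999.8, 994.1, 0.5),
        ]
        THRESHOLD = 2.0   # m3, illustrative alarm limit
        for i, (vin, vout, dlp) in enumerate(windows):
            r = volume_imbalance(vin, vout, dlp)
            print(f"window {i}: residual = {r:+.2f} m3 [{'ALARM' if abs(r) > THRESHOLD else 'ok'}]")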

  15. Registering a methodology for imaging and analysis of residual-limb shape after transtibial amputation

    Directory of Open Access Journals (Sweden)

    Alexander S. Dickinson, PhD

    2016-03-01

    Full Text Available Successful prosthetic rehabilitation following lower-limb amputation depends upon a safe and comfortable socket-residual limb interface. Current practice predominantly uses a subjective, iterative process to establish socket shape, often requiring several visits to a prosthetist. This study proposes an objective methodology for residual-limb shape scanning and analysis by high-resolution, automated measurements. A 3-D printed "analog" residuum was scanned with three surface digitizers on 10 occasions. Accuracy was measured by the scan-height error between repeat analog scans and the computer-aided design (CAD) geometry, and by the scan versus CAD volume. Subsequently, 20 male residuum casts from ambulatory individuals with transtibial amputation were scanned by two observers, and 10 were repeat-scanned by one observer. The shape files were aligned spatially, and geometric measurements were extracted. Repeatability was evaluated by intraclass correlation, Bland-Altman analysis of scan volumes, and pairwise root-mean-square error ranges of scan area and width profiles. Submillimeter accuracy was achieved when scanning the analog shape using white light and laser scanning technologies. Scanning male residuum casts was highly repeatable within and between observers. The analysis methodology provides clinical researchers and prosthetists with the capability to establish their own quantitative, objective, multipatient datasets. This could provide an evidence base for training, long-term follow-up, and interpatient outcome comparison, for decision support in socket design.

  16. Loss of Coolant Accident Analysis Methodology for SMART-P

    Energy Technology Data Exchange (ETDEWEB)

    Bae, K. H.; Lee, G. H.; Yang, S. H.; Yoon, H. Y.; Kim, S. H.; Kim, H. C

    2006-02-15

    The analysis methodology for loss-of-coolant accidents (LOCAs) for SMART-P is described in this report. SMART-P is an advanced integral-type PWR producing a maximum thermal power of 65.5 MW with metallic fuel. LOCAs are hypothetical accidents that would result from the loss of reactor coolant, at a rate in excess of the capability of the reactor coolant makeup system, from breaks in pipes in the reactor coolant pressure boundary up to and including a break equivalent in size to the double-ended rupture of the largest pipe in the reactor coolant system. Since SMART-P contains the major primary circuit components in a single Reactor Pressure Vessel (RPV), the possibility of a large-break LOCA (LBLOCA) is inherently eliminated and only the small-break LOCA is postulated. This report describes the outline and acceptance criteria of the small-break LOCA (SBLOCA) for SMART-P and documents the conservative analytical model and method and the analysis results using the TASS/SMR code. This analysis method is applied in the SBLOCA analysis performed for the ECCS performance evaluation, which is described in section 6.3.3 of the safety analysis report. The predictions of the SMART-P SBLOCA analysis model for the break flow, system pressure and temperature distributions, reactor coolant distribution, single- and two-phase natural circulation phenomena, the timing of the major sequences of events, etc. should be compared and verified against the applicable separate and integral effects test results. It is also required to establish feasible acceptance criteria applicable to the metallic-fueled integral reactor SMART-P. The analysis methodology for the SBLOCA described in this report will be further developed and validated as the design and licensing status of SMART-P evolves.

  17. Rectal cancer surgery: volume-outcome analysis.

    LENUS (Irish Health Repository)

    Nugent, Emmeline

    2010-12-01

    There is strong evidence supporting the importance of the volume-outcome relationship with respect to lung and pancreatic cancers. This relationship, however, remains unclear for rectal cancer surgery. We review the currently available literature to assess the evidence base for the volume-outcome relationship in rectal cancer surgery.

  18. Analysis of gaming community using Soft System Methodology

    OpenAIRE

    Hurych, Jan

    2015-01-01

    This diploma thesis aims to analyse a virtual gaming community and its problems, in the case of the community belonging to the EU server of the game World of Tanks. To solve these problems, Soft System Methodology by P. Checkland is used. The thesis includes an analysis of the significance of gaming communities for the gaming industry as a whole. The gaming community is then defined as a soft system. Three problems are analysed in the practical part of the thesis using a newer version of SSM. One iteration of...

  19. A Posteriori Analysis for Hydrodynamic Simulations Using Adjoint Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Woodward, C S; Estep, D; Sandelin, J; Wang, H

    2009-02-26

    This report contains results of analysis done during an FY08 feasibility study investigating the use of adjoint methodologies for a posteriori error estimation in hydrodynamics simulations. We developed an approach to adjoint analysis for these systems through the use of modified equations and viscosity solutions. Targeting first the 1D Burgers equation, we include a verification of the adjoint operator for the modified equation for the Lax-Friedrichs scheme, then derivations of an a posteriori error analysis for a finite difference scheme and a discontinuous Galerkin scheme applied to this problem. We include some numerical results showing the use of the error estimate. Lastly, we develop a computable a posteriori error estimate for the MAC scheme applied to the stationary Navier-Stokes equations.
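
    The core adjoint-weighted-residual idea can be sketched on a plain linear system rather than the report's Burgers/Navier-Stokes settings: for A u = f and an output functional J(u) = g·u, the output error of an approximate solution u_h satisfies J(u) - J(u_h) = φ·r, where A^T φ = g is the adjoint solve and r = f - A u_h the residual. The example below (made-up data) verifies the identity:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 8
        A = np.eye(n) + 0.1 * rng.standard_normal((n, n))
        f, g = rng.standard_normal(n), rng.standard_normal(n)

        u = np.linalg.solve(A, f)                   # reference solution
        u_h = u + 1e-3 * rng.standard_normal(n)     # stands in for a coarse/approximate solve

        phi = np.linalg.solve(A.T, g)               # adjoint solve
        r = f - A @ u_h                             # residual of the approximate solution
        print(f"estimated output error {phi @ r:+.6e}")
        print(f"true output error      {g @ (u - u_h):+.6e}")   # identical for linear problems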

  20. Evaluating some Reliability Analysis Methodologies in Seismic Design

    Directory of Open Access Journals (Sweden)

    A. E. Ghoulbzouri

    2011-01-01

    Full Text Available Problem statement: Accounting for the uncertainties present in the geometric and material data of reinforced concrete buildings is performed in this study within the context of performance-based seismic engineering design. Approach: The reliability of the expected performance state is assessed by using various methodologies based on finite element nonlinear static pushover analysis and a specialized reliability software package. The reliability approaches considered included full coupling with an external finite element code and response-surface-based methods, in conjunction with either the first-order reliability method or the importance sampling method. Various types of probability distribution functions modelling parameter uncertainties were introduced. Results: The probability of failure according to the reliability analysis method used and to the selected distribution of probabilities was obtained. A convergence analysis of the importance sampling method was performed. The required duration of analysis as a function of the reliability method used was evaluated. Conclusion/Recommendations: It was found that reliability results are sensitive to the reliability analysis method used and to the selected probability distributions. Durations of analysis for coupled methods were found to be higher than those associated with response-surface-based methods, although one should include the time needed to derive the response surfaces. For the reinforced concrete building considered in this study, significant variations were found between all the considered reliability methodologies. The fully coupled importance sampling method is recommended, but the first-order reliability method applied to a response surface model can be used with good accuracy. Finally, the probability distributions should be carefully identified, since specifying only the mean and the standard deviation was found to be insufficient.
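
    As a minimal illustration of one of the methods compared above, the sketch below estimates a small failure probability P[g(X) < 0] by importance sampling, shifting the sampling density toward the failure region; the limit-state function and parameters are hypothetical, not the study's structural model:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        beta = 3.5                                  # limit state g(x) = beta - x with X ~ N(0,1)
        n = 100_000

        x = rng.standard_normal(n)                  # crude Monte Carlo
        pf_mc = np.mean(beta - x < 0)

        y = rng.normal(beta, 1.0, n)                # IS density centred at the design point
        w = stats.norm.pdf(y) / stats.norm.pdf(y, loc=beta)   # likelihood ratio weights
        pf_is = np.mean((beta - y < 0) * w)

        print(f"exact  Pf = {stats.norm.cdf(-beta):.3e}")
        print(f"crude  Pf = {pf_mc:.3e}")           # noisy at this sample size
        print(f"IS     Pf = {pf_is:.3e}")           # far lower variance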

  1. Life prediction methodology for ceramic components of advanced vehicular heat engines: Volume 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Khandelwal, P.K.; Provenzano, N.J.; Schneider, W.E. [Allison Engine Co., Indianapolis, IN (United States)

    1996-02-01

    One of the major challenges involved in the use of ceramic materials is ensuring adequate strength and durability. This activity has developed a methodology which can be used during the design phase to predict the structural behavior of ceramic components. The effort involved the characterization of injection-molded and hot isostatically pressed (HIPed) PY-6 silicon nitride, the development of nondestructive evaluation (NDE) technology, and the development of analytical life prediction methodology. Four failure modes are addressed: fast fracture, slow crack growth, creep, and oxidation. The techniques deal with failures initiating at the surface as well as internal to the component. The life prediction methodologies for fast fracture and slow crack growth have been verified using a variety of confirmatory tests. The verification tests were conducted at room and elevated temperatures up to a maximum of 1371 °C. The tests involved (1) flat circular disks subjected to bending stresses and (2) high-speed rotating spin disks. Reasonable correlation was achieved for a variety of test conditions and failure mechanisms. The predictions associated with surface failures proved to be optimistic, requiring re-evaluation of the components' initial fast-fracture strengths. Correlation was achieved for the spin disks, which failed in fast fracture from internal flaws. Time-dependent elevated-temperature slow-crack-growth spin disk failures were also successfully predicted.
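
    Fast-fracture predictions of this kind are conventionally built on the two-parameter Weibull weakest-link model; the sketch below uses that standard form with illustrative parameters (not the PY-6 characterisation data) to show the size effect on allowable stress:

        import numpy as np

        # Pf = 1 - exp(-(V / V0) * (sigma / sigma0)**m)   (two-parameter Weibull)
        m = 10.0            # Weibull modulus (strength scatter), illustrative
        sigma0 = 600.0      # characteristic strength [MPa] at reference volume V0
        V0 = 1.0            # reference volume [mm^3]

        def failure_probability(sigma, V):
            return 1.0 - np.exp(-(V / V0) * (sigma / sigma0) ** m)

        # size effect: a larger stressed volume fails at a lower stress
        for V in (1.0, 10.0, 100.0):
            s50 = sigma0 * (np.log(2.0) * V0 / V) ** (1.0 / m)   # stress at Pf = 50%
            print(f"V = {V:6.1f} mm^3: sigma(Pf=0.5) = {s50:5.1f} MPa")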

  2. SCALE-4 analysis of pressurized water reactor critical configurations. Volume 1: Summary

    International Nuclear Information System (INIS)

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original fresh composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized water reactors (PWR). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power's Surry Unit 1 reactor for the Cycle 2 core; Volume 4 documents the calculations performed based on GPU Nuclear Corporation's Three Mile Island Unit 1 Cycle 5 core; and, lastly, Volume 5 describes the analysis of Virginia Power's North Anna Unit 1 Cycle 5 core. Each of the reactor-specific volumes provides the details of calculations performed to determine the effective multiplication factor for each reactor core for one or more critical configurations using the SCALE-4 system; these results are summarized in this volume. Differences between the core designs and their possible impact on the criticality calculations are also discussed. Finally, results are presented for additional analyses performed to verify that solutions were sufficiently converged

  3. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Full Text Available Many cost overruns in the world of construction are attributable to either unforeseen events or foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement in project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for the appropriate allocation of contingency in project cost estimation. In the first step, project risks are identified. An influence diagramming technique is employed to identify and show how the risks affect the project cost elements, as well as the relationships among the risks themselves. The second step is to assess the project costs with regard to the risks under consideration. Using a linguistic approach, the degree of uncertainty of the identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. In the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingency, i.e. project contingency and management reserve, are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.
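
    The paper's own assessment step is linguistic (fuzzy); as a simplified stand-in, the sketch below uses plain Monte Carlo over a few correlated risk-driven cost elements to show how a total contingency can be split into project contingency (base to P50) and management reserve (P50 to P90). All figures are made up:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 50_000
        base = 1_000.0                              # deterministic base estimate

        market = rng.uniform(0.9, 1.2, n)           # shared factor, a crude stand-in for risk dependency
        risk1 = rng.triangular(0, 20, 80, n) * market
        risk2 = rng.triangular(0, 50, 150, n) * market
        risk3 = rng.triangular(0, 10, 40, n)

        total = base + risk1 + risk2 + risk3
        p50, p90 = np.percentile(total, [50, 90])
        print(f"project contingency (base -> P50): {p50 - base:6.1f}")
        print(f"management reserve  (P50 -> P90): {p90 - p50:6.1f}")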

  4. Life prediction methodology for ceramic components of advanced heat engines. Phase 1: Volume 2, Appendices

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    This volume presents the following appendices: ceramic test specimen drawings and schematics, mixed-mode and biaxial stress fracture of structural ceramics for advanced vehicular heat engines (U. Utah), mode I/mode II fracture toughness and tension/torsion fracture strength of NT154 Si nitride (Brown U.), summary of strength test results and fractography, fractography photographs, derivations of statistical models, Weibull strength plots for fast fracture test specimens, and size functions.

  5. Methodological progresses in Markovian availability analysis and applications

    International Nuclear Information System (INIS)

    The Markovian model applied to reliability analysis is well known as an effective tool whenever dependencies affect the probabilistic behaviour of a system's components. Its ability to model the dynamical evolution of systems allows human actions to be included in the temporal evolution (inspections, maintenance, including human failure probabilities). The starting point has been the Stagen-Marela code. In spite of the fact that this code already achieves much progress towards reducing the size of Markovian matrices (merging of Markov processes of systems exhibiting symmetries), there is still an imperative need to reduce memory requirements. This implies, as a first step of any realistic analysis, a modularization of the studied system into subsystems, which can be 'coupled'. The methodology is applied to the auxiliary feedwater injection of Doel 3. (orig./HSCH)
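
    The smallest possible instance of the Markovian approach (illustrative rates, unrelated to the Doel 3 study): a single repairable component with failure rate lambda and repair rate mu, whose steady-state availability follows from solving pi Q = 0 with the probabilities summing to one:

        import numpy as np

        lam, mu = 1e-3, 1e-1                        # failure and repair rates [1/h], illustrative
        Q = np.array([[-lam,  lam],                 # state 0: component up
                      [  mu,  -mu]])                # state 1: component down (under repair)

        # replace one balance equation of pi Q = 0 by the normalisation sum(pi) = 1
        Aug = np.vstack([Q.T[:-1], np.ones(2)])
        pi = np.linalg.solve(Aug, np.array([0.0, 1.0]))
        print(f"steady-state availability = {pi[0]:.5f}")   # equals mu / (lam + mu)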

  6. Segment clustering methodology for unsupervised Holter recordings analysis

    Science.gov (United States)

    Rodríguez-Sotelo, Jose Luis; Peluffo-Ordoñez, Diego; Castellanos Dominguez, German

    2015-01-01

    Cardiac arrhythmia analysis on Holter recordings is an important issue in clinical settings; however, it implicitly involves dealing with a large amount of unlabelled data, which means a high computational cost. In this work an unsupervised methodology based on a segment framework is presented, which consists of dividing the raw data into a balanced number of segments in order to identify fiducial points, and to characterize and cluster the heartbeats in each segment separately. The resulting clusters are merged or split according to an assumed criterion of homogeneity. This framework compensates for the high computational cost of Holter analysis, making its implementation possible for future real-time applications. The performance of the method is measured on the records from the MIT/BIH arrhythmia database and achieves high values of sensitivity and specificity, taking advantage of the database labels, for a broad range of heartbeat types recommended by the AAMI.

  7. Metabolic tumour volumes measured at staging in lymphoma: methodological evaluation on phantom experiments and patients

    Energy Technology Data Exchange (ETDEWEB)

    Meignan, Michel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Paris-Est University, Service de Medecine Nucleaire, EAC CNRS 7054, Hopital Henri Mondor AP-HP, Creteil (France); Sasanelli, Myriam; Itti, Emmanuel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Casasnovas, Rene Olivier [CHU Le Bocage, Department of Hematology, Dijon (France); Luminari, Stefano [University of Modena and Reggio Emilia, Department of Diagnostic, Clinic and Public Health Medicine, Modena (Italy); Fioroni, Federica [Santa Maria Nuova Hospital-IRCCS, Department of Medical Physics, Reggio Emilia (Italy); Coriani, Chiara [Santa Maria Nuova Hospital-IRCCS, Department of Radiology, Reggio Emilia (Italy); Masset, Helene [Henri Mondor Hospital, Department of Radiophysics, Creteil (France); Gobbi, Paolo G. [University of Pavia, Department of Internal Medicine and Gastroenterology, Fondazione IRCCS Policlinico San Matteo, Pavia (Italy); Merli, Francesco [Santa Maria Nuova Hospital-IRCCS, Department of Hematology, Reggio Emilia (Italy); Versari, Annibale [Santa Maria Nuova Hospital-IRCCS, Department of Nuclear Medicine, Reggio Emilia (Italy)

    2014-06-15

    The presence of a bulky tumour at staging on CT is an independent prognostic factor in malignant lymphomas. However, its prognostic value is limited in diffuse disease. Total metabolic tumour volume (TMTV) determined on {sup 18}F-FDG PET/CT could give a better evaluation of the total tumour burden and may help patient stratification. Different methods of TMTV measurement established in phantoms simulating lymphoma tumours were investigated and validated in 40 patients with Hodgkin lymphoma and diffuse large B-cell lymphoma. Data were processed by two nuclear medicine physicians in Reggio Emilia and Creteil. Nineteen phantoms filled with {sup 18}F-saline were scanned; these comprised spherical or irregular volumes from 0.5 to 650 cm{sup 3} with tumour-to-background ratios from 1.65 to 40. Volumes were measured with different SUVmax thresholds. In patients, TMTV was measured on PET at staging by two methods: volumes of individual lesions were measured using a fixed 41 % SUVmax threshold (TMTV{sub 41}) and a variable visually adjusted SUVmax threshold (TMTV{sub var}). In phantoms, the 41 % threshold gave the best concordance between measured and actual volumes. Interobserver agreement was almost perfect. In patients, the agreement between the reviewers for TMTV{sub 41} measurement was substantial (ρ{sub c} = 0.986, CI 0.97 - 0.99) and the difference between the means was not significant (212 ± 218 cm{sup 3} for Creteil vs. 206 ± 219 cm{sup 3} for Reggio Emilia, P = 0.65). By contrast, the agreement was poor for TMTV{sub var}. There was a significant direct correlation between TMTV{sub 41} and normalized LDH (r = 0.652, CI 0.42 - 0.8, P < 0.001). Higher disease stages and bulky tumour were associated with higher TMTV{sub 41}, but high TMTV{sub 41} could be found in patients with stage 1/2 or nonbulky tumour. Measurement of baseline TMTV in lymphoma using a fixed 41 % SUVmax threshold is reproducible and correlates with the other parameters for tumour mass evaluation.
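
    The fixed-threshold segmentation validated above reduces, in essence, to masking each lesion at 41 % of its SUVmax and summing the voxel volumes. The toy sketch below applies the idea to a synthetic SUV array (real analyses work per lesion inside dedicated software; the array and voxel size are made up):

        import numpy as np

        rng = np.random.default_rng(7)
        suv = rng.uniform(0.5, 1.5, (40, 40, 40))   # background activity (synthetic)
        suv[15:25, 15:25, 15:25] += 8.0             # synthetic "lesion"

        voxel_volume_cm3 = 0.4 * 0.4 * 0.4          # 4 mm isotropic voxels, illustrative

        threshold = 0.41 * suv.max()                # fixed 41 % SUVmax threshold
        mtv = (suv >= threshold).sum() * voxel_volume_cm3
        print(f"metabolic tumour volume = {mtv:.1f} cm3")   # ~64 cm3 for this toy lesion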

  8. Fracture mechanics analysis on VVER1000 RPV with different methodologies

    International Nuclear Information System (INIS)

    The main component that limits the operational life of a nuclear power plant (NPP) is the reactor pressure vessel (RPV), because the properties of its carbon steel change during the operational life due to different causes: high neutron flux in the welding region, thermal aging, etc. This results in an increase of the RPV embrittlement level, which decreases the safety margin against crack propagation in case of transients with fast cooling rates due to emergency system injection or increased secondary-side heat exchange. This problem is known as Pressurized Thermal Shock (PTS) and constitutes a relevant safety problem for NPPs that have been in operation for several years. Nowadays, the scientific community is trying to move the PTS analysis toward a "Best Estimate" (BE) approach, with the aim of removing the excess conservatism introduced at each step of the analysis by the limited knowledge of the phenomena in the eighties, when the problem was first considered in safety analysis. This change has been driven by the possibility of extending the operational life of some plants, and it has been made possible by the availability of ever more powerful computers and sophisticated computer codes, which allow the analyst to model with a very high degree of precision the mixing phenomena occurring at small scale in the downcomer and to calculate the stress intensity factor at the crack tip with very refined meshes of millions of nodes. This paper describes the main steps of a PTS analysis: system thermal-hydraulic calculation, CFD analysis, stress analysis and the fracture mechanics analysis for the RPV of a generic VVER1000. In particular, the paper shows a comparison of the results of the fracture mechanics analysis performed with different methodologies for the calculation of the stress intensity factor at the crack tip (KI). (author)

  9. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Full Text Available Decision making, both at the individual and the organizational level, is always accompanied by a search for others' opinions on the matter. The tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter provides a rich anthology of sentiments. This user-generated content can benefit the market if its semantic orientation is determined. Opinion mining and sentiment analysis are the formalizations for studying and construing opinions and sentiments. The digital ecosystem has itself paved the way for the recording and use of huge volumes of opinionated data. This paper is an attempt to review and evaluate the various techniques used for opinion and sentiment analysis.

  10. Optimising the education of responsible shift personnel in nuclear power plants. Volume 1 for Chapter 3: Investigational methodology

    International Nuclear Information System (INIS)

    In line with the usual announcement procedures, an analysis was to be carried out of those activities from which capabilities, knowledge and then learning objectives can be derived in consecutive stages. In this respect, this volume contains articles on the following: the derivation of learning objectives from activities with respect to capabilities and knowledge; the analysis of professional activity; the appraisal of the descriptors; and a textual presentation of the activities. (DG)

  11. Transuranium analysis methodologies for biological and environmental samples

    International Nuclear Information System (INIS)

    Analytical procedures for the most abundant transuranium nuclides in the environment (i.e., plutonium and, to a lesser extent, americium) are available. There is a lack of procedures for doing sequential analysis for Np, Pu, Am, and Cm in environmental samples, primarily because of current emphasis on Pu and Am. Reprocessing requirements and waste disposal connected with the fuel cycle indicate that neptunium and curium must be considered in environmental radioactive assessments. Therefore it was necessary to develop procedures that determine all four of these radionuclides in the environment. The state of the art of transuranium analysis methodology as applied to environmental samples is discussed relative to different sample sources, such as soil, vegetation, air, water, and animals. Isotope-dilution analysis with 243Am (239Np) and 236Pu or 242Pu radionuclide tracers is used. Americium and curium are analyzed as a group, with 243Am as the tracer. Sequential extraction procedures employing bis(2-ethyl-hexyl)orthophosphoric acid (HDEHP) were found to result in lower yields and higher Am--Cm fractionation than ion-exchange methods

  12. Methodology for the analysis and retirement of assets: Power transformers

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Gómez-Ramírez

    2015-09-01

    Full Text Available This article develops a high-voltage engineering methodology for the analysis and retirement of repaired power transformers, based on engineering criteria, in order to establish a correlation between the condition of the transformer from several points of view: electrical, mechanical, dielectric and thermal. An analysis of the state of the art reveals two situations of great significance. First, the international procedures are a "guide" for the acceptance of new transformers, so they cannot be applied literally to repaired transformers, owing to the degradation process the transformer has undergone over the years and to all the factors that led to the repair in the first place. Second, based on the most recent technical literature, articles analysing the dielectric oil and the insulating paper have been reviewed, in which correlations are established between the quality of the insulating paper and the furan concentrations in the oils. Finally, a large part of the research performed so far has focused on analysing the transformer from the condition of the dielectric oil, so in most cases it has not been possible to perform forensic engineering inside an operating transformer and thereby analyse the design components that can compromise its integrity and operability.

  13. The Ohio River Basin energy facility siting model. Volume 1: Methodology

    Science.gov (United States)

    Fowler, G. L.; Bailey, R. E.; Gordon, S. I.; Jansen, S. D.; Randolph, J. C.; Jones, W. W.

    1981-04-01

    The siting model developed for ORBES is specifically designed for regional policy analysis. The region includes 423 counties in an area that consists of all of Kentucky and substantial portions of Illinois, Indiana, Ohio, Pennsylvania, and West Virginia.

  14. Analysis of service identification in SOA methodologies - with a unification in POSI, perspective oriented service identification

    OpenAIRE

    2008-01-01

    This thesis is written in the context of Model Driven Architectures and SOA, and investigates different methodologies and their ways of identifying and describing a service-oriented architecture. The methodologies analyzed are OASIS, ARIS, COMET-S and Archimate. With the knowledge from the analysis, a perspective-oriented service identification methodology is proposed: POSI. POSI is analyzed and later compared with the thesis' initial analysis of the mentioned methodolog...

  15. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    Directory of Open Access Journals (Sweden)

    L. Renbaum-Wolff

    2013-01-01

    Full Text Available Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μl, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemical properties and with real-time relative humidity and temperature control; hence, the technique should be well-suited for determining the viscosities, under atmospherically relevant conditions, of particles collected from environmental chambers. In this technique, supermicron particles are first deposited on an inert hydrophobic substrate. Then, insoluble beads (~1 μm in diameter) are embedded in the particles. Next, a flow of gas is introduced over the particles, which generates a shear stress on the particle surfaces. The sample responds to this shear stress by generating internal circulations, which are quantified with an optical microscope by monitoring the movement of the beads. The rate of internal circulation is shown to be a function of particle viscosity but independent of the particle material for a wide range of organic and organic-water samples. A calibration curve is constructed from the experimental data that relates the rate of internal circulation to particle viscosity, and this calibration curve is successfully used to predict viscosities in multicomponent organic mixtures.

  17. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error, constructing a technical basis for human reliability analysis. This study produced three main results. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors during event diagnosis. The task complexity (TACOM) measure and the methodology for optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of software, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique with K-HRA; it can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  18. Methodology for comparative analysis of sustainability in agroforestry systems

    Directory of Open Access Journals (Sweden)

    Saulo Barbosa Lopes

    2003-03-01

    Full Text Available The work analyzes the political/institutional links and technological formats of agroforestry systems in the Caí and Taquari river valleys of the Brazilian state of Rio Grande do Sul in an effort to determine a value for their ability to endure: their sustainability. Sustainability indicators are created for the analysis of these systems. The indicator values are used to create a Sustainability Index (SI) for each studied agroforestry system and to identify each system's sustainability pattern. The different agroforestry systems are also classified according to their structural characteristics. Analysis of the identified patterns and indexes reveals the adequacy of the methodology employed and the consequences of each system's institutional arrangement, technological format, and sustainability pattern. The agroforestry system that combines exotic forest species with watermelon and the system that combines native forest species with citrus fruit stand out as being the most sustainable, while, from an institutional perspective, those systems that were linked in an "associative" arrangement had the highest sustainability index values.

  19. FRACTAL ANALYSIS OF TRABECULAR BONE: A STANDARDISED METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ian Parkinson

    2011-05-01

    Full Text Available A standardised methodology for the fractal analysis of histological sections of trabecular bone has been established. A modified box-counting method has been developed for use on a PC-based image analyser (Quantimet 500MC, Leica Cambridge). The effect of image analyser settings (magnification, image orientation and threshold levels) was determined. Also, the range of scale over which trabecular bone is effectively fractal was determined, and a method was formulated to objectively calculate more than one fractal dimension from the modified Richardson plot. The results show that magnification, image orientation and threshold settings have little effect on the estimate of fractal dimension. Trabecular bone has a lower limit below which it is not fractal (λ < 25 μm) and an upper limit of 4250 μm. There are three distinct fractal dimensions for trabecular bone (sectional fractals), with magnitudes greater than 1.0 and less than 2.0. It has been shown that trabecular bone is effectively fractal over a defined range of scale. Also, within this range, there is more than one fractal dimension, describing spatial structural entities. Fractal analysis is a model-independent method for describing a complex multifaceted structure, which can be adapted for the study of other biological systems. This may be at the cell, tissue or organ level and complements conventional histomorphometric and stereological techniques.
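
    A minimal sketch of the box-counting estimate the record describes, assuming an already binarised section image (the random test image below is a stand-in, not real histology):

        import numpy as np

        def box_counting_dimension(img, box_sizes):
            """Estimate the box-counting (fractal) dimension of a 2-D boolean image."""
            counts = []
            for s in box_sizes:
                # Trim the image so it tiles exactly into s x s boxes.
                h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
                tiles = img[:h, :w].reshape(h // s, s, w // s, s)
                # Count the boxes containing at least one foreground pixel.
                counts.append(np.count_nonzero(tiles.any(axis=(1, 3))))
            # Fractal dimension = slope of log(count) against log(1/box size).
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(0)
        section = rng.random((512, 512)) > 0.7   # stand-in for a binarised section
        print(box_counting_dimension(section, [2, 4, 8, 16, 32]))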

  20. Methodology for the analysis of dynamic human actions

    International Nuclear Information System (INIS)

    A methodology for the analysis of human actions under accident conditions has been developed, which uses information from plant simulator runs, plant procedures, and plant systems information. The objective is to enhance the completeness of the event sequence model (event trees) with respect to both favorable and unfavorable operator actions. Routine human actions that impact the plant at or below the systems level, such as test and maintenance actions, are handled in the systems analysis. The types of dynamic operator actions analyzed in this paper are actions taken during an event sequence that: supplement the automatic response of plant systems for event mitigation, change or detract from the automatic response of plant systems, or lead to recovery of failed systems. The derived results can be used directly in a probabilistic risk assessment. It is judged that the major cause of possible error is misdiagnosis, which can lead to either errors of omission or errors of commission. Operator mistakes may occur when a situation is misclassified or when inappropriate decisions and response selections are made in the operator action sequences. The operator action sequences are modeled in a natural progression of human response, including observation of plant parameters and the diagnosis of the event or the decision to take action.

  1. Methodology for the systems engineering process. Volume 1: System functional activities

    Science.gov (United States)

    Nelson, J. H.

    1972-01-01

    Systems engineering is examined in terms of the functional activities that are performed in the conduct of a system definition/design, and system development is described in a parametric analysis that combines functions, performance, and design variables. Emphasis is placed on the identification of activities performed by design organizations and design specialty groups, as well as by a central systems engineering organizational element. The identification of specific roles and responsibilities for performing functions, and for monitoring and controlling activities within the system development operation, is also emphasized.

  2. Analysis of Computer-Mediated Communication: Using Formal Concept Analysis as a Visualizing Methodology.

    Science.gov (United States)

    Hara, Noriko

    2002-01-01

    Introduces the use of Formal Concept Analysis (FCA) as a methodology to visualize the data in computer-mediated communication. Bases FCA on mathematical lattice theory, offers visual maps (graphs) with conceptual hierarchies, and proposes the use of FCA combined with content analysis to analyze computer-mediated communication. (Author/LRW)

  4. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Murfin, W.B. [Technadyne Engineering Consultants, Inc., Albuquerque, NM (United States); Hora, S.C. [Hawaii Univ., Hilo, HI (United States)

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community.

  5. Modeling methodology for supply chain synthesis and disruption analysis

    Science.gov (United States)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer-driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system, and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of the synthesized supply chain system can be evaluated quantitatively; and (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
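
    A toy illustration of the reachability idea (state names and transitions below are hypothetical, not taken from the paper): model disruption states as a directed graph and ask which states the system can end up in:

        from collections import deque

        # Hypothetical disruption model: each supply chain state maps to the
        # states reachable from it (e.g. after a supplier failure or re-routing).
        transitions = {
            "normal": ["supplier_down"],
            "supplier_down": ["rerouted", "stockout"],
            "rerouted": ["normal"],
            "stockout": ["expedited_recovery"],
            "expedited_recovery": ["normal"],
        }

        def reachable_states(start):
            """Breadth-first search over the state graph from `start`."""
            seen, frontier = {start}, deque([start])
            while frontier:
                for nxt in transitions.get(frontier.popleft(), []):
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
            return seen

        # Can a disruption in normal operation lead to a stockout?
        print("stockout" in reachable_states("normal"))   # True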

  6. Methodology for Using 3-Dimensional Sonography to Measure Fetal Adrenal Gland Volumes in Pregnant Women With and Without Early Life Stress.

    Science.gov (United States)

    Kim, Deborah; Epperson, C Neill; Ewing, Grace; Appleby, Dina; Sammel, Mary D; Wang, Eileen

    2016-09-01

    Fetal adrenal gland volumes on 3-dimensional sonography have been studied as potential predictors of preterm birth. However, no consistent methodology has been published. This article describes the methodology used in a study that is evaluating the effects of maternal early life stress on fetal adrenal growth to allow other researchers to compare methodologies across studies. Fetal volumetric data were obtained in 36 women at 20 to 22 and 28 to 30 weeks' gestation. Two independent examiners measured multiple images of a single fetal adrenal gland from each sonogram. Intra- and inter-rater consistency was examined. In addition, fetal adrenal volumes between male and female fetuses were reported. The intra- and inter-rater reliability was satisfactory when the mean of 3 measurements from each rater was used. At 20 weeks' gestation, male fetuses had larger average adjusted adrenal volumes than female fetuses (mean, 0.897 versus 0.638; P = .004). At 28 weeks' gestation, the fetal weight was more influential in determining values for adjusted fetal adrenal volume (0.672 for male fetuses versus 0.526 for female fetuses; P = .034). This article presents a methodology for assessing fetal adrenal volume using 3-dimensional sonography that can be used by other researchers to provide more consistency across studies.

  7. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  8. Overview of core simulation methodologies for light water reactor analysis

    International Nuclear Information System (INIS)

    The current in-core fuel management calculation methods provide a very efficient route to predicting the neutronics behavior of light water reactor (LWR) cores, and their prediction accuracy for current-generation LWRs is generally sufficient. However, since neutronics calculations for LWRs are based on various assumptions and simplifications, we should also recognize the many implicit limitations that are 'embedded' in current neutronics calculation methodologies. Continuing efforts to improve core simulation methodologies are also discussed. (author)

  9. Using Cost-Volume-Profit Analysis in Decision Making

    OpenAIRE

    IONELA-CLAUDIA DINA; GABRIELA BUŞAN

    2009-01-01

    Cost-volume-profit analysis studies how total revenues, total costs and operating profit evolve as changes occur in production volume, selling price, unit variable cost and/or the fixed costs of a product. Managers use this analysis to answer questions such as: How will revenues and costs be affected if we still sell 1,000 units? What if we raise or lower selling prices? What if we expand our business into foreign markets?
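
    The arithmetic behind these questions is compact enough to sketch directly (the figures below are illustrative only):

        def operating_profit(units, price, unit_variable_cost, fixed_costs):
            """Profit = contribution margin per unit x volume - fixed costs."""
            return (price - unit_variable_cost) * units - fixed_costs

        def breakeven_units(price, unit_variable_cost, fixed_costs):
            """Sales volume at which operating profit is zero."""
            return fixed_costs / (price - unit_variable_cost)

        # Selling 1,000 units at 50 each, variable cost 30 per unit, fixed costs 15,000:
        print(operating_profit(1000, 50.0, 30.0, 15_000))   # 5000.0
        print(breakeven_units(50.0, 30.0, 15_000))          # 750.0 units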

  10. Geometrical-Based Navigation System Performance Assessment in the Space Service Volume Using a Multiglobal Navigation Satellite System Methodology

    Science.gov (United States)

    Welch, Bryan W.

    2016-01-01

    NASA is participating in the International Committee on Global Navigation Satellite Systems (GNSS) (ICG)'s efforts towards demonstrating the benefits to the space user in the Space Service Volume (SSV) when a multi-GNSS solution-space approach is utilized. The ICG Working Group on Enhancement of GNSS Performance, New Services and Capabilities has started a three-phase analysis initiative as an outcome of recommendations at the ICG-10 meeting, in preparation for the ICG-11 meeting. The first phase of that initiative, of increasing complexity and fidelity, is based on a purely geometrically derived access technique. The first phase of analysis has been completed, and the results are documented in this paper.

  11. Cuadernos de Autoformacion en Participacion Social: Metodologia. Volumen 2. Primera Edicion (Self-Instructional Notebooks on Social Participation: Methodology. Volume 2. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is indicated in…

  12. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the black- and white-hat segments of the IT security research community. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  13. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project

    International Nuclear Information System (INIS)

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab

  14. A methodological proposal for quantifying environmental compensation through the spatial analysis of vulnerability indicators

    Directory of Open Access Journals (Sweden)

    Fabio Enrique Torresan

    2008-06-01

    Full Text Available The aim of this work was to propose a methodology for quantifying environmental compensation through the spatial analysis of vulnerability indicators. A case study was applied to the analysis of sand extraction enterprises in the region of Descalvado and Analândia, inland São Paulo State, Brazil. Environmental vulnerability scores were attributed to indicators related to erosion, hydrological resources and biodiversity loss. This methodological proposal allowed the locational alternatives of a given enterprise to be analyzed with the objective of reducing impacts and, at the same time, reducing the costs of environmental compensation. The application of the methodology significantly reduced the degree of subjectivity usually associated with most impact evaluation methodologies. The term environmental compensation refers to the developer's obligation to support the establishment and maintenance of Conservation Units, applicable to enterprises with significant environmental impact, in accordance with Law 9.986/2000. This law establishes that the volume of resources to be applied by the developer must be at least 0.5% of the total costs foreseen for the implementation of the enterprise, this percentage being set by the competent environmental agency according to the degree of environmental impact. This methodology represents an important environmental and economic planning instrument and can be adapted to various

  15. Methodologies for Assessing the Cumulative Environmental Effects of Hydroelectric Development of Fish and Wildlife in the Columbia River Basin, Volume 1, Recommendations, 1987 Final Report.

    Energy Technology Data Exchange (ETDEWEB)

    Stull, Elizabeth Ann

    1987-07-01

    This volume is the first of a two-part set addressing methods for assessing the cumulative effects of hydropower development on fish and wildlife in the Columbia River Basin. Species and habitats potentially affected by cumulative impacts are identified for the basin, and the most significant effects of hydropower development are presented. Then, current methods for measuring and assessing single-project effects are reviewed, followed by a review of methodologies with potential for use in assessing the cumulative effects associated with multiple projects. Finally, two new approaches for cumulative effects assessment are discussed in detail. Overall, this report identifies and reviews the concepts, factors, and methods necessary for understanding and conducting a cumulative effects assessment in the Columbia River Basin. Volume 2 will present a detailed procedural handbook for performing a cumulative assessment using the integrated tabular methodology introduced in this volume. 308 refs., 18 figs., 10 tabs.

  16. An Analysis of the Research Methodology of the Ramirez Study.

    Science.gov (United States)

    Thomas, Wayne P.

    1992-01-01

    Analyzes the political, educational, and technical factors that strongly influenced the Ramirez study of bilingual programs. Enumerates strengths and weaknesses of the study's research methodology, along with implications for decision making in language-minority education. Summarizes defensible conclusions of the study that have not yet been…

  17. Estimation of cell volume and biomass of penicillium chrysogenum using image analysis.

    Science.gov (United States)

    Packer, H L; Keshavarz-Moore, E; Lilly, M D; Thomas, C R

    1992-02-20

    A methodology for the estimation of biomass in the penicillin fermentation using image analysis is presented. Two regions of hyphae are defined to describe the growth of mycelia during fermentation: (1) the cytoplasmic region, and (2) the degenerated region, including large vacuoles. The volume occupied by each of these regions in a fixed volume of sample is estimated from area measurements using image analysis. Areas are converted to volumes by treating the hyphae as solid cylinders with the hyphal diameter as the cylinder diameter. The volumes of the cytoplasmic and degenerated regions are converted into dry-weight estimates using hyphal density values available from the literature. The image analysis technique is able to estimate biomass even in the presence of nondissolved solids at concentrations of up to 30 g L⁻¹. It is shown to successfully estimate concentrations of mycelia from 0.03 to 38 g L⁻¹. Although the technique has been developed for the penicillin fermentation, it should be applicable to other (nonpelleted) fungal fermentations.
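
    A sketch of the area-to-volume-to-dry-weight conversion described above, assuming a hyphal density taken from the literature (the 1.1 g cm⁻³ value and all measurements below are illustrative, not the authors' data):

        import math

        def hyphal_biomass_g_per_l(projected_areas_um2, hyphal_diameter_um,
                                   sample_volume_ul, density_g_per_cm3=1.1):
            """Convert measured hyphal projected areas to a dry-weight concentration.

            Each hypha is treated as a solid cylinder: from projected area A and
            diameter d, length = A / d and volume = pi * (d/2)**2 * (A / d).
            """
            total_volume_um3 = sum(
                math.pi * (hyphal_diameter_um / 2) ** 2 * (area / hyphal_diameter_um)
                for area in projected_areas_um2
            )
            grams = total_volume_um3 * 1e-12 * density_g_per_cm3  # um^3 -> cm^3 -> g
            return grams / (sample_volume_ul * 1e-6)              # g per litre

        # Three measured hyphae in a 10 uL sample, 3 um hyphal diameter:
        print(hyphal_biomass_g_per_l([5200.0, 8400.0, 3100.0], 3.0, 10.0))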

  18. Response Surface Methodology for the analysis of reactor safety: methodology, development and implementation

    International Nuclear Information System (INIS)

    Reactor safety engineering utilizes RSM techniques as a tool to aid the recovery of information from large simulation codes. The growing interest in the topic, and the need for rigorous methods in risk and reliability analysis, have stimulated systematic studies which will lead to the production of an RSM handbook for safety engineering purposes. This paper deals with recent developments in the area, which are reported under three main headings: a re-evaluation of the philosophy of using RSM techniques with nuclear safety codes; a comparative study of suitable response functions and experimental design procedures for use in RSM; and a preliminary discussion of multi-output code RSM analysis. The theoretical developments are shown with reference to their practical applications.

  19. Reactor Accident Analysis Methodology for the Advanced Test Reactor Critical Facility Documented Safety Analysis Upgrade

    International Nuclear Information System (INIS)

    The regulatory requirement to develop an upgraded safety basis for a DOE Nuclear Facility was realized in January 2001 by issuance of a revision to Title 10 of the Code of Federal Regulations Section 830 (10 CFR 830). Subpart B of 10 CFR 830, "Safety Basis Requirements," requires a contractor responsible for a DOE Hazard Category 1, 2, or 3 nuclear facility to either submit by April 9, 2001 the existing safety basis which already meets the requirements of Subpart B, or to submit by April 10, 2003 an upgraded facility safety basis that meets the revised requirements. 10 CFR 830 identifies Nuclear Regulatory Commission (NRC) Regulatory Guide 1.70, "Standard Format and Content of Safety Analysis Reports for Nuclear Power Plants," as a safe-harbor methodology for preparation of a DOE reactor documented safety analysis (DSA). The regulation also allows for use of a graded approach. This report presents the methodology that was developed for preparing the reactor accident analysis portion of the Advanced Test Reactor Critical Facility (ATRC) upgraded DSA. The methodology was approved by DOE for developing the ATRC safety basis as an appropriate application of a graded approach to the requirements of 10 CFR 830.

  20. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),

  1. INTEGRATED METHODOLOGY FOR PRODUCT PLANNING USING MULTI CRITERIA ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tarun Soota

    2016-09-01

    Full Text Available An integrated approach to multi-criteria decision problems is proposed using quality function deployment and the analytical network process. The objective of the work is to rationalize and improve the method of analyzing and interpreting customer needs and technical requirements. The methodology is used to determine and prioritize engineering requirements based on customer needs for the development of the best product. The framework allows the decision maker to decompose a complex problem into a hierarchical structure showing the relationships between objective and criteria. Multi-criteria decision modeling is used to extend the hierarchy process to both dependence and feedback. A case study on bikes is presented for the proposed model.

  2. A study on the core analysis methodology for SMART CEA ejection accident-I

    Energy Technology Data Exchange (ETDEWEB)

    Zee, Sung Kyun; Lee, Chung Chan; Kim, Kyo Yoon; Cho, Byung Oh

    1999-04-01

    A methodology to analyze the fuel enthalpy is developed based on MASTER, a time-dependent, three-dimensional core analysis code. Using the proposed methodology, the SMART CEA ejection accident is analyzed. Moreover, radiation doses are estimated at the exclusion area boundary and the low population zone to confirm that the criteria for the accident are met. (Author). 31 refs., 13 tabs., 18 figs.

  5. Human reliability analysis of Three Mile Island II accident considering THERP and ATHEANA methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, Renato Alves; Alvarenga, Marco Antonio Bayout; Gibelli, Sonia Maria Orlando [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)]. E-mails: rfonseca@cnen.gov.br; bayout@cnen.gov.br; sonia@cnen.gov.br; Alvim, Antonio Carlos Marques; Frutuoso e Melo, Paulo Fernando Ferreira [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil)]. E-mails: Alvim@con.ufrj.br; frutuoso@con.ufrj.br

    2008-07-01

    The main purpose of this work is to perform a human reliability analysis using the THERP (Technique for Human Error Rate Prediction) and ATHEANA (A Technique for Human Error Analysis) methodologies, as well as their application to the development of qualitative and quantitative analyses of a nuclear power plant accident. The accident selected was the one that occurred at the Three Mile Island (TMI) Unit 2 Pressurized Water Reactor (PWR) nuclear power plant. The accident analysis revealed a series of unsafe actions that resulted in permanent loss of the unit. This study also aims at enhancing the understanding of the THERP and ATHEANA methodologies and their possible interactions with practical applications. The TMI accident analysis has pointed out the possibility of integration of the THERP and ATHEANA methodologies. In this work, the integration between both methodologies is developed in a way that allows a better understanding of the influence of the operational context on human errors. (author)

  6. Micro analysis of fringe field formed inside LDA measuring volume

    Science.gov (United States)

    Ghosh, Abhijit; Nirala, A. K.

    2016-05-01

    In the present study we propose a technique for micro analysis of fringe field formed inside laser Doppler anemometry (LDA) measuring volume. Detailed knowledge of the fringe field obtained by this technique allows beam quality, alignment and fringe uniformity to be evaluated with greater precision and may be helpful for selection of an appropriate optical element for LDA system operation. A complete characterization of fringes formed at the measurement volume using conventional, as well as holographic optical elements, is presented. Results indicate the qualitative, as well as quantitative, improvement of fringes formed at the measurement volume by holographic optical elements. Hence, use of holographic optical elements in LDA systems may be advantageous for improving accuracy in the measurement.
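
    For context, the fringe field geometry follows standard LDA relations; a small sketch (the wavelength and half-angle values below are illustrative, not the system described in the paper):

        import math

        def fringe_spacing(wavelength_nm, half_angle_deg):
            """Fringe spacing d_f = lambda / (2 sin(kappa)), kappa = beam half-angle."""
            return wavelength_nm * 1e-9 / (2.0 * math.sin(math.radians(half_angle_deg)))

        def doppler_frequency(velocity_m_s, wavelength_nm, half_angle_deg):
            """A particle crossing the fringes scatters light at f_D = v / d_f."""
            return velocity_m_s / fringe_spacing(wavelength_nm, half_angle_deg)

        d_f = fringe_spacing(514.5, 2.8)   # Ar-ion green line, 2.8 deg half-angle
        print(f"fringe spacing: {d_f * 1e6:.2f} um")   # ~5.27 um
        print(f"Doppler frequency at 10 m/s: {doppler_frequency(10, 514.5, 2.8) / 1e6:.2f} MHz")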

  7. Method for measuring anterior chamber volume by image analysis

    Science.gov (United States)

    Zhai, Gaoshou; Zhang, Junhong; Wang, Ruichang; Wang, Bingsong; Wang, Ningli

    2007-12-01

    Anterior chamber volume (ACV) is very important for an oculist making a rational pathological diagnosis for patients who have ocular diseases such as glaucoma, yet it is difficult to measure accurately. In this paper, a method is devised to measure anterior chamber volumes based on JPEG-formatted image files that have been transformed from medical images using the anterior-chamber optical coherence tomographer (AC-OCT) and corresponding image-processing software. The corresponding algorithms for image analysis and ACV calculation are implemented in VC++, and a series of anterior chamber images of typical patients are analyzed; the calculated anterior chamber volumes are verified to be in accord with clinical observation. The results show that the measurement method is effective and feasible and has the potential to improve the accuracy of ACV calculation. Meanwhile, measures should be taken to simplify the manual preprocessing of the images.
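
    One plausible way to turn segmented slice areas into a volume is trapezoidal integration across evenly spaced slices (a sketch only; the paper's exact algorithm is not reproduced here, and the areas below are invented):

        def anterior_chamber_volume(slice_areas_mm2, slice_spacing_mm):
            """Integrate successive AC-OCT cross-sectional areas (trapezoidal rule).

            Returns volume in mm^3, which equals microlitres.
            """
            a = slice_areas_mm2
            return sum((a[i] + a[i + 1]) / 2.0 * slice_spacing_mm
                       for i in range(len(a) - 1))

        # Nine parallel slices, 0.25 mm apart:
        areas = [2.1, 7.8, 12.4, 15.0, 15.9, 14.7, 11.2, 6.9, 1.8]
        print(anterior_chamber_volume(areas, 0.25))   # chamber volume in uL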

  8. Grinding analysis of Indian coal using response surface methodology

    Institute of Scientific and Technical Information of China (English)

    Twinkle Singh; Aishwarya Awasthi; Pranjal Tripathi; Shina Gautam; Alok Gautam

    2016-01-01

    The present work discusses a systematic approach to modelling the grinding parameters of coal in a ball mill. A three-level Box-Behnken design combined with response surface methodology using a second-order model was applied to experiments done according to the model requirement. Three parameters, ball charge (10-20 balls), coal content (100-200 g) and grinding time (4-8 min), were chosen for the experiments as well as for the modeling work. Coal fineness is defined as the d80 (80% passing size). A quadratic model was developed to show the effect of the parameters and their interactions on the fineness of the product. Three different sizes (4, 1 and 0.65 mm) of Indian coal were used. The model equations for each fraction were developed and different sets of experiments were performed. The predicted values of coal fineness were in good agreement with the experimental results (R² values for d80 vary between 0.97 and 0.99). Finer product sizes for all three coal feed sizes were obtained with a larger ball charge, less grinding time and less solid content. This work demonstrates the efficient use of response surface methodology and the Box-Behnken design for the grinding of Indian coal.
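
    A minimal sketch of fitting the second-order model to a three-factor Box-Behnken design by least squares (the 15-run coded design is the standard layout; the d80 responses below are invented for illustration):

        import numpy as np
        from itertools import combinations

        def quadratic_design_matrix(X):
            """Columns: 1, x_i, x_i^2, and all two-factor interactions x_i*x_j."""
            n, k = X.shape
            cols = [np.ones(n)]
            cols += [X[:, i] for i in range(k)]
            cols += [X[:, i] ** 2 for i in range(k)]
            cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
            return np.column_stack(cols)

        # Coded factors: ball charge, coal content, grinding time (Box-Behnken, k = 3).
        X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
                      [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
                      [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
                      [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
        y = np.array([410, 380, 395, 350, 430, 400, 370, 340,
                      420, 390, 365, 345, 375, 372, 378], dtype=float)  # d80, um

        beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
        print(beta)   # intercept, linear, quadratic and interaction coefficients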

  9. Analysis of MSLB for APR1400 using SPACE Mass and Energy Release Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Seong Min; Ahn, Hyoung Kyoun; Park, Seok Jeong; Park, Chan Eok [KEPCO EnC, Daejeon (Korea, Republic of)

    2015-05-15

    The Main Steam Line Break (MSLB) accident is the most important design basis accident for containment building functional design. A Mass and Energy Release (MER) analysis methodology using the Safety and Performance Analysis CodE (SPACE) has been under development. SPACE, a best-estimate code for safety analysis, is used to analyze the nuclear steam supply system together with the Containment Analysis Package (CAP). Mass and energy release is predicted for the APR1400 during an MSLB accident. The peak pressure and temperature are compared with those from the previous MER analysis methodology, the KEPCO EnC improved mass and energy release analysis (KIMERA). Various kinds of MSLB for the APR1400 are analyzed with MSIV failure and LCC at 102%, 75%, 50% and 20% power. The MSLB analysis is performed using a linked calculation between SPACE and CAP. In conclusion, the maximum peak pressure is less conservative than the peak pressure from the previous methodology, while the maximum peak temperature is more conservative than the peak temperature from the previous methodology. Further work comparing the mass and energy release with that of the previous methodology is needed to establish the SPACE MER analysis methodology.

  10. Seismic hazard analysis. Application of methodology, results, and sensitivity studies

    International Nuclear Information System (INIS)

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. (author)

  11. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software depends on comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we have reviewed the prominent processes, tools and technologies used in the requirement-gathering phase. The study is useful for perceiving the current state of affairs pertaining to requirement engineering research and for understanding the strengths and limitations of existing requirement engineering techniques. The study also summarizes best practices and how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. The study also highlights the importance of security requirements: although they are part of the non-functional requirements, they are naturally considered fundamental to secure software development.

  12. Trading Volume and Stock Indices: A Test of Technical Analysis

    Directory of Open Access Journals (Sweden)

    Paul Abbondante

    2010-01-01

    Full Text Available Problem statement: Technical analysis and its emphasis on trading volume has been used to analyze movements in individual stock prices and make investment recommendations to either buy or sell that stock. Little attention has been paid to investigating the relationship between trading volume and various stock indices. Approach: Since stock indices track overall stock market movements, trends in trading volume could be used to forecast future stock market trends. Instead of focusing only on individual stocks, this study examined movements in major stock markets as a whole. Regression analysis was used to investigate the relationship between trading volume and five popular stock indices using daily data from January 2000 to June 2010. A lag of 5 days was used because this represents the prior week of trading volume. The total sample size ranges from 1,534 to 2,638 observations. Smaller samples were used to test which investment horizon explains movements of the indices more completely. Results: The F statistics were significant for samples using 6 and 16 months of data. The F statistic was not significant using a sample of 1 month of data. This is surprising given the short-term focus of technical analysis. The results indicate that above-average returns can be achieved using futures, options and exchange-traded funds which track these indices. Conclusion: Future research efforts will include out-of-sample forecasting to determine if above-average returns can be achieved. Additional research can be conducted to determine the optimal number of lags for each index.
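
    A sketch of the regression setup described above (synthetic data only; the study's actual index and volume series are not reproduced here):

        import numpy as np

        def lagged_volume_regression(index_close, volume, n_lags=5):
            """OLS fit of daily index returns on the prior week's trading volumes."""
            returns = np.diff(index_close) / index_close[:-1]
            rows = [volume[t - n_lags:t] for t in range(n_lags, len(returns))]
            X = np.column_stack([np.ones(len(rows)), np.array(rows)])
            beta, *_ = np.linalg.lstsq(X, returns[n_lags:], rcond=None)
            return beta   # intercept followed by one coefficient per lag

        rng = np.random.default_rng(1)
        close = np.cumprod(1 + rng.normal(0, 0.01, 600)) * 1000  # synthetic index
        vol = rng.lognormal(13, 0.3, 600)                        # synthetic volume
        print(lagged_volume_regression(close, vol))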

  13. Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    J. Scaglione

    1999-09-09

    This report, "Summary Report of Laboratory Critical Experiment Analyses Performed for the Disposal Criticality Analysis Methodology", contains a summary of the laboratory critical experiment (LCE) analyses used to support the validation of the disposal criticality analysis methodology. The objective of this report is to present a summary of the LCE analyses' results. These results demonstrate the ability of MCNP to accurately predict the critical multiplication factor (keff) for fuel with different configurations. Results from the LCE evaluations will support the development and validation of the criticality models used in the disposal criticality analysis methodology. These models and their validation have been discussed in the "Disposal Criticality Analysis Methodology Topical Report" (CRWMS M&O 1998a).

  14. Criteria for the development and use of the methodology for environmentally-acceptable fossil energy site evaluation and selection. Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Eckstein, L.; Northrop, G.; Scott, R.

    1980-02-01

    This report serves as a companion document to the report, Volume 1: Environmentally-Acceptable Fossil Energy Site Evaluation and Selection: Methodology and Users Guide, in which a methodology was developed which allows the siting of fossil fuel conversion facilities in areas with the least environmental impact. The methodology, known as SELECS (Site Evaluation for Energy Conversion Systems) does not replace a site specific environmental assessment, or an environmental impact statement (EIS), but does enhance the value of an EIS by thinning down the number of options to a manageable level, by doing this in an objective, open and selective manner, and by providing preliminary assessment and procedures which can be utilized during the research and writing of the actual impact statement.

  15. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a level-I probabilistic safety assessment. The work covers: the correct association of distributions with parameters, the importance ranking and qualification of expert opinions, the generation of samples of appropriate sizes, and the study of the relationships among system variables and the system response. A series of statistical and mathematical techniques is recommended for use throughout the development of the analysis methodology, as well as different graphical visualizations for monitoring the study. (author)
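
    As one concrete example of the sampling-and-relationships step, a Spearman rank-correlation screen between sampled parameters and the system response (a toy model, not the study's actual PSA inputs):

        import numpy as np

        def spearman_sensitivity(samples, response):
            """Rank correlation of each input parameter with the response.

            samples: (n_runs, n_parameters) draws from the parameter distributions;
            response: (n_runs,) model output, e.g. a core damage frequency.
            """
            def ranks(x):
                r = np.empty(len(x))
                r[np.argsort(x)] = np.arange(len(x))
                return r
            rr = ranks(response)
            return np.array([np.corrcoef(ranks(samples[:, j]), rr)[0, 1]
                             for j in range(samples.shape[1])])

        rng = np.random.default_rng(3)
        lam = rng.lognormal(mean=[-6.0, -4.0, -5.0], sigma=0.5, size=(1000, 3))
        cdf = 2.0 * lam[:, 0] + 0.1 * lam[:, 1] + 0.5 * lam[:, 2]  # toy response
        print(spearman_sensitivity(lam, cdf))   # first parameter should dominate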

  16. Local Analysis of Shock Capturing Using Discontinuous Galerkin Methodology

    Science.gov (United States)

    Atkins, H. L.

    1997-01-01

    The compact form of the discontinuous Galerkin method allows for a detailed local analysis of the method in the neighborhood of the shock for a non-linear model problem. Insight gained from the analysis leads to new flux formulas that are stable and that preserve the compactness of the method. Although developed for a model equation, the flux formulas are applicable to systems such as the Euler equations. This article presents the analysis for methods with a degree up to 5. The analysis is accompanied by supporting numerical experiments using Burgers' equation and the Euler equations.

  17. Substance precedes methodology: on cost-benefit analysis and equity

    NARCIS (Netherlands)

    Martens, C.J.C.M.

    2011-01-01

    While distributive aspects have been a topic of discussion in relation to cost–benefit analysis (CBA), little systematic thought has been given in the CBA literature to the focus of such an equity analysis in evaluating transport projects. The goal of the paper is to provide an overview of the vario

  18. The XMM Cluster Survey: X-ray analysis methodology

    CERN Document Server

    Lloyd-Davies, E J; Hosmer, Mark; Mehrtens, Nicola; Davidson, Michael; Sabirli, Kivanc; Mann, Robert G; Hilton, Matt; Liddle, Andrew R; Viana, Pedro T P; Campbell, Heather C; Collins, Chris A; Dubois, E Naomi; Freeman, Peter; Hoyle, Ben; Kay, Scott T; Kuwertz, Emma; Miller, Christopher J; Nichol, Robert C; Sahlen, Martin; Stanford, S Adam; Stott, John P

    2010-01-01

    The XMM Cluster Survey (XCS) is a serendipitous search for galaxy clusters using all publicly available data in the XMM-Newton Science Archive. Its main aims are to measure cosmological parameters and trace the evolution of X-ray scaling relations. In this paper we describe the data processing methodology applied to the 5776 XMM observations used to construct the current XCS source catalogue. A total of 3669 >4σ cluster candidates with >50 background-subtracted X-ray counts are extracted from a total non-overlapping area suitable for cluster searching of 410 deg². Of these, 1022 candidates are detected with >300 X-ray counts, and we demonstrate that robust temperature measurements can be obtained down to this count limit. We describe in detail the automated pipelines used to perform the spectral and surface brightness fitting for these sources, as well as to estimate redshifts from the X-ray data alone. A total of 517 (126) X-ray temperatures to a typical accuracy of <40 (<10) per cent have ...

  19. Montecarlo simulation for a new high resolution elemental analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto [Universidad de La Frontera, Temuco (Chile). Facultad de Ingenieria y Administracion

    1996-12-31

    Full text. Spectra generated by binary, ternary and multielement matrices when irradiated by a variable-energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate appear when the photon energy is just above the edge associated with each element, because of the emission of characteristic X-rays. For a given associated energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample into a 2π solid angle, associating a single multichannel spectrometer channel with each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high-resolution spectroscopy methodology, for which a synchrotron would be an ideal source due to its high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)

  20. Meta-analysis: Its role in psychological methodology

    Directory of Open Access Journals (Sweden)

    Andrej Kastrin

    2008-11-01

    Full Text Available Meta-analysis refers to the statistical analysis of a large collection of independent observations for the purpose of integrating results. The main objectives of this article are to define meta-analysis as a method of data integration, to draw attention to some particularities of its use, and to encourage researchers to use meta-analysis in their work. The benefits of meta-analysis include more effective exploitation of existing data from independent sources and a contribution to more powerful domain knowledge. It may also serve as a support tool to generate new research hypotheses. The idea of combining results of independent studies addressing the same research question dates back to the sixteenth century. Meta-analysis was reinvented in 1976 by Glass, to refute the conclusion of an eminent colleague, Eysenck, that psychotherapy was essentially ineffective. We review some major historical landmarks of meta-analysis and its statistical background. We present the concept of an effect size measure, the problem of heterogeneity, and the two models used to combine individual effect sizes (the fixed and random effects models) in great detail. Two visualization techniques, forest and funnel plot graphics, are demonstrated. We developed RMetaWeb, a simple and fast web server application to conduct meta-analysis online. RMetaWeb is the first web meta-analysis application and is completely based on the R software environment for statistical computing and graphics.
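
    A minimal sketch of the fixed-effect (inverse-variance) pooling the article reviews, with Cochran's Q as the usual heterogeneity check (the effect sizes below are invented):

        import math

        def fixed_effect_meta(effects, variances):
            """Inverse-variance weighted pooled effect size (fixed-effect model)."""
            weights = [1.0 / v for v in variances]
            pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
            se = math.sqrt(1.0 / sum(weights))
            # Cochran's Q: large values suggest heterogeneity, i.e. a
            # random-effects model may be more appropriate.
            q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
            return pooled, se, q

        effects = [0.42, 0.31, 0.55, 0.12]    # per-study effect sizes (e.g. d)
        variances = [0.04, 0.02, 0.09, 0.05]  # their sampling variances
        pooled, se, q = fixed_effect_meta(effects, variances)
        print(f"pooled d = {pooled:.3f} +/- {1.96 * se:.3f} (95% CI), Q = {q:.2f}")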

  1. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    Science.gov (United States)

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separate and undifferentiated waste. Our methodology allows cost efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. Our methodology allows benchmarking and variance analysis that can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently. Our methodology can be implemented by companies lacking a sophisticated management accounting system.
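
    A bare-bones sketch of the standard-cost-times-actual-quantity calculation and the resulting variance (the function names and figures are hypothetical, not the authors' model):

        def standard_collection_cost(actual_tonnes, standard_cost_per_tonne):
            """Benchmark cost: standard unit cost x actual collected quantity."""
            return standard_cost_per_tonne * actual_tonnes

        def cost_variance(actual_total_cost, actual_tonnes, standard_cost_per_tonne):
            """Positive variance = actual spending above the standard benchmark."""
            return actual_total_cost - standard_collection_cost(
                actual_tonnes, standard_cost_per_tonne)

        # One municipality, separate vs undifferentiated collection:
        print(cost_variance(182_000, 1_400, 120.0))  # +14000: above standard
        print(cost_variance(95_000, 1_100, 90.0))    # -4000: below standard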

  3. Multi-physics analysis methodologies for signal integrity

    OpenAIRE

    Jiang, L.

    2010-01-01

    This tutorial discusses two multi-physics problems involved in modern signal integrity technologies: (1) a multi-physics issue related to frequencies, providing fundamental insights about multi-scale problems and the general strategy for dealing with multi-scale simulations; and (2) multi-physics thermal-electrical coupling analysis for on-chip and packaging structures. Theoretical analysis and numerical benchmarks are both employed in the tutorial.

  4. Methodology for statistical analysis of SENCAR mouse skin assay data.

    OpenAIRE

    Stober, J A

    1986-01-01

    Various response measures and statistical methods appropriate for the analysis of data collected in the SENCAR mouse skin assay are examined. The characteristics of the tumor response data do not readily lend themselves to the classical methods for hypothesis testing. The advantages and limitations of conventional methods of analysis and methods recommended in the literature are discussed. Several alternative response measures that were developed specifically to answer the problems inherent i...

  5. Functional Unfold Principal Component Regression Methodology for Analysis of Industrial Batch Process Data

    DEFF Research Database (Denmark)

    Mears, Lisa; Nørregaard, Rasmus; Sin, Gürkan;

    2016-01-01

    This work proposes a methodology utilizing functional unfold principal component regression (FUPCR) for application to industrial batch process data as a process modeling and optimization tool. The methodology is applied to an industrial fermentation dataset containing 30 batches of a production process operating at Novozymes A/S. Following the FUPCR methodology, the final product concentration could be predicted with an average prediction error of 7.4%. Multiple iterations of preprocessing were applied by implementing the methodology to identify the best data handling methods for the model. It is shown that application of functional data analysis and the choice of variance scaling method have the greatest impact on the prediction accuracy. Considering the vast amount of batch process data continuously generated in industry, this methodology can potentially contribute as a tool to identify...
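
    The unfold-and-regress core of FUPCR can be sketched as follows on synthetic data; the functional (smoothing) step is omitted, and all array dimensions and values are assumptions for illustration, not the Novozymes dataset.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(30, 50, 4))   # 30 batches x 50 time points x 4 variables
      y = 5.0 + X[:, -10:, 0].mean(axis=1) + 0.1 * rng.normal(size=30)  # final titre proxy

      Xu = X.reshape(30, -1)                        # batch-wise unfolding: 30 x 200
      Xu = (Xu - Xu.mean(axis=0)) / Xu.std(axis=0)  # variance scaling (autoscaling)

      scores = PCA(n_components=5).fit_transform(Xu)  # principal component scores
      model = LinearRegression().fit(scores, y)       # regression on the scores
      pred = model.predict(scores)
      print("mean relative error: %.1f%%" % (100 * np.mean(np.abs(pred - y) / y)))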

  6. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operation points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs

  7. Principal component analysis based methodology to distinguish protein SERS spectra

    Science.gov (United States)

    Das, G.; Gentile, F.; Coluccio, M. L.; Perri, A. M.; Nicastri, A.; Mecarini, F.; Cojoc, G.; Candeloro, P.; Liberale, C.; De Angelis, F.; Di Fabrizio, E.

    2011-05-01

    Surface-enhanced Raman scattering (SERS) substrates were fabricated using electro-plating and e-beam lithography techniques. Nano-structures were obtained comprising regular arrays of gold nanoaggregates with a diameter of 80 nm and a mutual distance between the aggregates (gap) ranging from 10 to 30 nm. The nanopatterned SERS substrate made it possible to achieve better control and reproducibility in the generation of plasmon polaritons (PPs). SERS measurements were performed for various proteins, namely bovine serum albumin (BSA), myoglobin, ferritin, lysozyme, RNase-B, α-casein, α-lactalbumin and trypsin. Principal component analysis (PCA) was used to organize and classify the proteins on the basis of their secondary structure. Cluster analysis proved that the error committed in the classification was about 14%. In the paper, it was clearly shown that the combined use of SERS measurements and PCA analysis is effective in categorizing the proteins on the basis of secondary structure.
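
    A minimal sketch of the PCA-plus-clustering step on spectra, using synthetic Gaussian bands in place of measured protein SERS spectra; band positions, noise level and cluster count are assumptions for illustration.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      wn = np.linspace(600, 1800, 400)                     # wavenumber axis (1/cm)
      def spectrum(center):                                # one noisy band
          return np.exp(-((wn - center) / 40.0) ** 2) + 0.05 * rng.normal(size=wn.size)

      # two "secondary structure" groups with slightly shifted amide bands
      spectra = np.array([spectrum(1650) for _ in range(10)] +
                         [spectrum(1620) for _ in range(10)])

      scores = PCA(n_components=2).fit_transform(spectra)  # project onto 2 PCs
      labels = KMeans(n_clusters=2, n_init=10).fit_predict(scores)
      print(labels)                                        # grouping in PC space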

  8. Recent methodology in the phytochemical analysis of ginseng

    NARCIS (Netherlands)

    Angelova, N.; Kong, H.-W.; Heijden, R. van de; Yang, S.-Y.; Choi, Y.H.; Kim, H.K.; Wang, M.; Hankemeier, T.; Greef, J. van der; Xu, G.; Verpoorte, R.

    2008-01-01

    This review summarises the most recent developments in ginseng analysis, in particular the novel approaches in sample pre-treatment and the use of high-performance liquid-chromatography-mass spectrometry. The review also presents novel data on analysing ginseng extracts by nuclear magnetic resonance

  9. Comparative proteomic analysis of human pancreatic juice : Methodological study

    NARCIS (Netherlands)

    Zhou, Lu; Lu, ZhaoHui; Yang, AiMing; Deng, RuiXue; Mai, CanRong; Sang, XinTing; Faber, Klaas Nico; Lu, XingHua

    2007-01-01

    Pancreatic cancer is the most lethal of all the common malignancies. Markers for early detection of this disease are urgently needed. Here, we optimized and applied a proteome analysis of human pancreatic juice to identify biomarkers for pancreatic cancer. Pancreatic juice samples, devoid of blood o

  10. Comparative proteomic analysis of human pancreatic juice: Methodological study

    NARCIS (Netherlands)

    Zhou, Lu; Lu, Z.H.; Yang, A.M.; Deng, R.X.; Mai, C.R.; Sang, X.T.; Faber, Klaas Nico; Lu, X.H.

    2007-01-01

    Pancreatic cancer is the most lethal of all the common malignancies. Markers for early detection of this disease are urgently needed. Here, we optimized and applied a proteome analysis of human pancreatic juice to identify biomarkers for pancreatic cancer. Pancreatic juice samples, devoid of blood o

  11. Important Literature in Endocrinology: Citation Analysis and Historical Methodology.

    Science.gov (United States)

    Hurt, C. D.

    1982-01-01

    Results of a study comparing two approaches to the identification of important literature in endocrinology reveal that the association between the rankings of cited items under the two methods is not statistically significant, and that use of citation or historical analysis alone will not result in the same set of literature. Forty-two sources are appended. (EJS)

  12. Application of 3D Scanned Imaging Methodology for Volume, Surface Area, and Envelope Density Evaluation of Densified Biomass

    Science.gov (United States)

    Measurement of surface area, volume, and density is essential for quantifying, evaluating, and designing the biomass densification, storage, and transport operations. Acquiring accurate and repeated measurements of these parameters for hygroscopic densified biomass is not straightforward and on...

  13. Methodology Improvement of Reactor Physics Codes for CANDU Channels Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Myung Hyun; Choi, Geun Suk; Win, Naing; Aung, Tharndaing; Baek, Min Ho; Lim, Jae Yong [Kyunghee University, Seoul (Korea, Republic of)

    2010-04-15

    As operational time increases, the pressure tubes and calandria tubes in a CANDU core inevitably undergo geometrical deformation along the tube length. A pressure tube may sag downward within its calandria tube due to irradiation creep. This can seriously compromise pressure tube integrity, so measurement of the deflection state of in-service pressure tubes is very important for CANDU reactor safety. In this paper, the impacts of fuel channel deformation on nuclear characteristics were evaluated in order to improve nuclear design tools with respect to the local effects of abnormal deformations. A sagged pressure tube is known to produce an eccentric configuration of the fuel bundles within the pressure tube, by up to 0.6 cm. In this case, adverse pin power distributions and reactivity balance can affect reactor safety under normal and accident conditions. Thermal and radiation-induced creep in a pressure tube also expands the tube; the maximum expansion is known to be 5% in volume. In this case, more coolant provides more moderation in the deformed channel, resulting in an increase of reactivity. Sagging of the pressure tube did not cause a considerable change in K-inf values; expansion of the pressure tube, however, produced a relatively large change in K-inf. Modeling the eccentric and enlarged configuration is not easy when preparing input geometry for either HELIOS or MCNP, and there is no way to represent this deformation in a one-dimensional homogenization tool such as the WIMS code. The suggested way of handling the deformation is a correction method that captures the expansion effect by adjusting the coolant number density: the number density of the heavy water coolant is increased as the rate of expansion increases. This correction is made in the intact channel without changing the geometry, and it was found to be very effective in the prediction of K-inf values. In this study, further
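
    The coolant number-density correction described above amounts to scaling the D2O density by the channel volume expansion while keeping the intact geometry. A minimal sketch, with an illustrative nominal density rather than actual CANDU design data:

      # Coolant number-density correction for pressure tube expansion (illustrative)
      def corrected_coolant_density(n_nominal, volume_expansion_fraction):
          # an expanded channel holds more coolant; with intact geometry this is
          # represented as a proportionally higher coolant number density
          return n_nominal * (1.0 + volume_expansion_fraction)

      n_d2o = 3.32e-2   # nominal D2O number density (atoms/barn-cm), assumed value
      for expansion in (0.0, 0.025, 0.05):   # up to the 5% volume expansion noted above
          print(f"{expansion:.1%} expansion -> {corrected_coolant_density(n_d2o, expansion):.4e}")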

  14. Respirable crystalline silica: Analysis methodologies; Silice cristalina respirable: Metodologias de analisis

    Energy Technology Data Exchange (ETDEWEB)

    Gomez-Tena, M. P.; Zumaquero, E.; Ibanez, M. J.; Machi, C.; Escric, A.

    2012-07-01

    This paper describes different analysis methodologies in occupational environments and raw materials. A review is presented of the existing methodologies, the approximations made, some of the constraints involved, as well as the best measurement options for the different raw materials. In addition, the different factors that might affect the precision and accuracy of the results are examined. With regard to the methodologies used for the quantitative analysis of any of the polymorphs, particularly of quartz, the study centres on the analytical X-ray diffraction method. Simplified methods of calculation and experimental separation, such as separation by centrifugation, sedimentation, and dust generation in controlled environments, are evaluated for the estimation of this fraction in the raw materials. In addition, a review is presented of the methodologies used for the collection of respirable crystalline silica in environmental dust. (Author)

  15. Towards a Domain Analysis Methodology for Collaborative Filtering

    OpenAIRE

    RAFTER, RACHAEL; Smyth, Barry

    2001-01-01

    Collaborative filtering has the ability to make personalised information recommendations in the absence of rich content meta-data, relying instead on correlations between the preferences of similar users. However, it depends largely on there being sufficient overlap between the profiles of similar users, and its accuracy is compromised in sparse domains with little profile overlap. We describe an extensive analysis that investigates key domain characteristics that are vital to colla...

  16. QUANTITATIVE METHODOLOGY FOR STABILITY ANALYSIS OF NONLINEAR ROTOR SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    ZHENG Hui-ping; XUE Yu-sheng; CHEN Yu-shu

    2005-01-01

    Rotor-bearing systems, widely applied in industry, are nonlinear dynamic systems with multiple degrees of freedom. Modern concepts of design and maintenance call for quantitative stability analysis. Using trajectory-based stability preservation and dimensional reduction, a quantitative stability analysis method for rotor systems is presented. At first, an n-dimensional nonlinear non-autonomous rotor system is decoupled into n subsystems after numerical integration. Each of them has only one degree of freedom and contains time-varying parameters to represent all other state variables. In this way, an n-dimensional trajectory is mapped into a set of one-dimensional trajectories. The dynamic central point (DCP) of a subsystem is then defined on the extended phase plane, namely, the force-position plane. Characteristics of curves on the extended phase plane and the DCP's kinetic energy difference sequence for general motion in rotor systems are studied. The corresponding stability margins of the trajectory are evaluated quantitatively. By means of the margin and its sensitivity analysis, the critical parameters of the period doubling bifurcation and the Hopf bifurcation in a flexible rotor supported by two short journal bearings with nonlinear suspension are determined.

  17. Path Constitution Analysis: A Methodology for Understanding Path Dependence and Path Creation

    OpenAIRE

    Sydow, Jörg; Windeler, Arnold; Müller-Seitz, Gordon; Lange, Knut

    2012-01-01

    Although an increasing number of studies of technological, institutional and organizational change refer to the concepts of path dependence and path creation, few attempts have been made to consider these concepts explicitly in their methodological accounts. This paper addresses this gap and contributes to the literature by developing a comprehensive methodology that originates from the concepts of path dependence and path creation – path constitution analysis (PCA) – and allows for the integ...

  18. The Phenomenological Life-World Analysis and the Methodology of the Social Sciences

    OpenAIRE

    Eberle, Thomas S.

    2010-01-01

    This Alfred Schutz Memorial Lecture discusses the relationship between the phenomenological life-world analysis and the methodology of the social sciences, which was the central motive of Schutz's work. I have set two major goals in this lecture. The first is to scrutinize the postulate of adequacy, as this postulate is the most crucial of Schutz's methodological postulates. Max Weber devised the postulate 'adequacy of meaning' in analogy to the postulate of 'causal adequacy' (a concept used ...

  19. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

    Full Text Available This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potentially vulnerable elements, by the estimation of the expected physical effects due to the occurrence of a damaging phenomenon, and by the analysis of the social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

    A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities. This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated in 8 913 000 €; on the contrary, indirect

  20. An ATWS Analysis with a Realistic Evaluation Methodology

    International Nuclear Information System (INIS)

    Anticipated Transients Without Scram (ATWS) would occur on failure of all the control and shutdown assemblies to insert into the core following an automatic reactor trip. The major concern of an ATWS derives from the consequences of the high primary system pressure that is characteristic of these transients. According to section 2.4 of the YVL guides, the Finnish regulations for the safety of nuclear power plants (NPP), the acceptance criterion for the ATWS analysis is that the pressure of the protected item does not exceed a limit of 1.3 times the design pressure. The main purpose of this paper is to preliminarily assess its impact on the APR1400 for European regulatory environments by applying the European Utility Requirements (EUR) for Light Water Reactor Nuclear Power Plants

  1. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches to intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  2. UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg; Craig Rieger

    2013-07-01

    It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution and several researchers are making headway in this problem. However, the inability to easily determine the magnitude of the building’s effective thermal mass, and how the heating ventilation and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems which utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
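
    A minimal sketch of the modal idea on a lumped (RC-network) thermal model: the eigenvalues and eigenvectors of the state matrix give the time constants and mode shapes of energy exchange between thermal masses. The two-node network and all parameter values are hypothetical, not the Boise building.

      import numpy as np

      C1, C2 = 5e6, 2e7        # heat capacities (J/K): air node, structural mass
      R12, R1o = 0.002, 0.005  # thermal resistances (K/W): air-mass, air-outdoor

      # dT/dt = A @ T for node temperatures relative to outdoors
      A = np.array([[-(1/R12 + 1/R1o) / C1, (1/R12) / C1],
                    [ (1/R12) / C2,        -(1/R12) / C2]])

      eigvals, eigvecs = np.linalg.eig(A)
      for lam, vec in zip(eigvals, eigvecs.T):
          print(f"mode: time constant {-1/lam.real/3600:.1f} h, shape {vec}")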

  3. The Murchison Widefield Array 21 cm Power Spectrum Analysis Methodology

    CERN Document Server

    Jacobs, Daniel C; Trott, C M; Dillon, Joshua S; Pindor, B; Sullivan, I S; Pober, J C; Barry, N; Beardsley, A P; Bernardi, G; Bowman, Judd D; Briggs, F; Cappallo, R J; Carroll, P; Corey, B E; de Oliveira-Costa, A; Emrich, D; Ewall-Wice, A; Feng, L; Gaensler, B M; Goeke, R; Greenhill, L J; Hewitt, J N; Hurley-Walker, N; Johnston-Hollitt, M; Kaplan, D L; Kasper, J C; Kim, H S; Kratzenberg, E; Lenc, E; Line, J; Loeb, A; Lonsdale, C J; Lynch, M J; McKinley, B; McWhirter, S R; Mitchell, D A; Morales, M F; Morgan, E; Neben, A R; Thyagarajan, N; Oberoi, D; Offringa, A R; Ord, S M; Paul, S; Prabu, T; Procopio, P; Riding, J; Rogers, A E E; Roshi, A; Shankar, N Udaya; Sethi, Shiv K; Srivani, K S; Subrahmanyan, R; Tegmark, M; Tingay, S J; Waterson, M; Wayth, R B; Webster, R L; Whitney, A R; Williams, A; Williams, C L; Wu, C; Wyithe, J S B

    2016-01-01

    We present the 21 cm power spectrum analysis approach of the Murchison Widefield Array Epoch of Reionization project. In this paper, we compare the outputs of multiple pipelines for the purpose of validating statistical limits on cosmological hydrogen at redshifts between 6 and 12. Multiple, independent data calibration and reduction pipelines are used to make power spectrum limits on a fiducial night of data. Comparing the outputs of imaging and power spectrum stages highlights differences in calibration, foreground subtraction and power spectrum calculation. The power spectra found using these different methods span a space defined by the various tradeoffs between speed, accuracy, and systematic control. Lessons learned from comparing the pipelines range from the algorithmic to the prosaically mundane; all demonstrate the many pitfalls of neglecting reproducibility. We briefly discuss the way these different methods attempt to handle the question of evaluating a significant detection in the presence of foregr...

  4. The Murchison Widefield Array 21 cm Power Spectrum Analysis Methodology

    Science.gov (United States)

    Jacobs, Daniel C.; Hazelton, B. J.; Trott, C. M.; Dillon, Joshua S.; Pindor, B.; Sullivan, I. S.; Pober, J. C.; Barry, N.; Beardsley, A. P.; Bernardi, G.; Bowman, Judd D.; Briggs, F.; Cappallo, R. J.; Carroll, P.; Corey, B. E.; de Oliveira-Costa, A.; Emrich, D.; Ewall-Wice, A.; Feng, L.; Gaensler, B. M.; Goeke, R.; Greenhill, L. J.; Hewitt, J. N.; Hurley-Walker, N.; Johnston-Hollitt, M.; Kaplan, D. L.; Kasper, J. C.; Kim, HS; Kratzenberg, E.; Lenc, E.; Line, J.; Loeb, A.; Lonsdale, C. J.; Lynch, M. J.; McKinley, B.; McWhirter, S. R.; Mitchell, D. A.; Morales, M. F.; Morgan, E.; Neben, A. R.; Thyagarajan, N.; Oberoi, D.; Offringa, A. R.; Ord, S. M.; Paul, S.; Prabu, T.; Procopio, P.; Riding, J.; Rogers, A. E. E.; Roshi, A.; Udaya Shankar, N.; Sethi, Shiv K.; Srivani, K. S.; Subrahmanyan, R.; Tegmark, M.; Tingay, S. J.; Waterson, M.; Wayth, R. B.; Webster, R. L.; Whitney, A. R.; Williams, A.; Williams, C. L.; Wu, C.; Wyithe, J. S. B.

    2016-07-01

    We present the 21 cm power spectrum analysis approach of the Murchison Widefield Array Epoch of Reionization project. In this paper, we compare the outputs of multiple pipelines for the purpose of validating statistical limits on cosmological hydrogen at redshifts between 6 and 12. Multiple independent data calibration and reduction pipelines are used to make power spectrum limits on a fiducial night of data. Comparing the outputs of imaging and power spectrum stages highlights differences in calibration, foreground subtraction, and power spectrum calculation. The power spectra found using these different methods span a space defined by the various tradeoffs between speed, accuracy, and systematic control. Lessons learned from comparing the pipelines range from the algorithmic to the prosaically mundane; all demonstrate the many pitfalls of neglecting reproducibility. We briefly discuss the way these different methods attempt to handle the question of evaluating a significant detection in the presence of foregrounds.
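
    A minimal sketch of the final power-spectrum stage the compared pipelines share: Fourier transform a line-of-sight signal and bin the squared amplitudes by wavenumber. The input is white noise standing in for calibrated, foreground-subtracted data, and all cosmological bookkeeping (units, window functions, error bars) is omitted.

      import numpy as np

      rng = np.random.default_rng(5)
      signal = rng.normal(size=1024)          # stand-in for a 21 cm sightline
      ft = np.fft.rfft(signal)
      k = np.fft.rfftfreq(1024)               # wavenumber axis (arbitrary units)
      power = np.abs(ft) ** 2 / signal.size

      bins = np.linspace(0.0, 0.5, 11)        # 10 k-bins
      idx = np.digitize(k, bins)
      ps = [power[idx == b].mean() for b in range(1, len(bins))]
      print(np.round(ps, 3))                  # binned power spectrum estimate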

  5. Analysis methodology and recent results of the IGS network combination

    Science.gov (United States)

    Ferland, R.; Kouba, J.; Hutchison, D.

    2000-11-01

    A working group of the International GPS Service (IGS) was created to look after Reference Frame (RF) issues and contribute to the densification and improvement of the International Terrestrial Reference Frame (ITRF). One important objective of the Reference Frame Working Group is to generate consistent IGS station coordinates and velocities, Earth Rotation Parameters (ERP) and geocenter estimates along with the appropriate covariance information. These parameters have a direct impact on other IGS products such as the estimation of GPS satellite ephemerides, as well as satellite and station clocks. The information required is available weekly from the Analysis Centers (AC) (cod, emr, esa, gfz, jpl, ngs, sio) and from the Global Network Associate Analysis Centers (GNAAC) (jpl, mit, ncl) using a "Software Independent Exchange Format" (SINEX). The ACs also contribute daily ERPs as part of their weekly submissions. The procedure in place simultaneously combines the weekly station coordinates, geocenter and daily ERP estimates. A cumulative solution containing station coordinates and velocities is also updated with each weekly combination. This provides a convenient way to closely monitor the quality of the estimated station coordinates and to have an up-to-date cumulative solution available at all times. To provide some necessary redundancy, the weekly station coordinates solution is compared against the GNAAC solutions. Each of the three GNAACs uses its own software, allowing independent verification of the combination process. The RMS of the coordinate differences in the north, east and up components between the AC/GNAAC and the ITRF97 Reference Frame Stations are 4-10 mm, 5-20 mm and 6-25 mm. The station velocities within continental plates are compared to the NNR-NUVEL1A plate motion model (DeMets et al., 1994). The north, east and up velocity RMS are 2 mm/y, 3 mm/y and 8 mm/y. Note that NNR-NUVEL1A assumes a zero vertical velocity.
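
    The combination step can be illustrated by a variance-weighted mean of the analysis-center estimates for a single coordinate component; the values below are invented, and the operational combination works on full SINEX covariance matrices rather than scalar sigmas.

      import numpy as np

      # one station's weekly north-component estimates from several ACs (mm), with sigmas
      x = np.array([4.2, 5.1, 3.8, 4.6])
      sigma = np.array([2.0, 3.0, 2.5, 2.0])

      w = 1.0 / sigma**2                          # inverse-variance weights
      combined = np.sum(w * x) / np.sum(w)
      combined_sigma = np.sqrt(1.0 / np.sum(w))
      print(f"combined estimate: {combined:.2f} +/- {combined_sigma:.2f} mm")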

  6. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets

    OpenAIRE

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-01-01

    Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues.

  7. Analysis of volume holographic storage allowing large-angle illumination

    Science.gov (United States)

    Shamir, Joseph

    2005-05-01

    Advanced technological developments have stimulated renewed interest in volume holography for applications such as information storage and wavelength multiplexing for communications and laser beam shaping. In these and many other applications, the information-carrying wave fronts usually possess narrow spatial-frequency bands, although they may propagate at large angles with respect to each other or a preferred optical axis. Conventional analytic methods are not capable of properly analyzing the optical architectures involved. For mitigation of the analytic difficulties, a novel approximation is introduced to treat narrow spatial-frequency band wave fronts propagating at large angles. This approximation is incorporated into the analysis of volume holography based on a plane-wave decomposition and Fourier analysis. As a result of the analysis, the recently introduced generalized Bragg selectivity is rederived for this more general case and is shown to provide enhanced performance for the above indicated applications. The power of the new theoretical description is demonstrated with the help of specific examples and computer simulations. The simulations reveal some interesting effects, such as coherent motion blur, that were predicted in an earlier publication.

  8. Synfuel program analysis. Volume I. Procedures-capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This is the first of the two volumes describing the analytic procedures and resulting capabilities developed by Resource Applications (RA) for examining the economic viability, public costs, and national benefits of alternative synfuel projects and integrated programs. This volume is intended for Department of Energy (DOE) and Synthetic Fuel Corporation (SFC) program management personnel and includes a general description of the costing, venture, and portfolio models with enough detail for the reader to be able to specify cases and interpret outputs. It also contains an explicit description (with examples) of the types of results which can be obtained when applied to: the analysis of individual projects; the analysis of input uncertainty, i.e., risk; and the analysis of portfolios of such projects, including varying technology mixes and buildup schedules. In all cases, the objective is to obtain, on the one hand, comparative measures of private investment requirements and expected returns (under differing public policies) as they affect the private decision to proceed, and, on the other, public costs and national benefits as they affect public decisions to participate (in what form, in what areas, and to what extent).

  9. User's operating procedures. Volume 2: Scout project financial analysis program

    Science.gov (United States)

    Harris, C. G.; Haris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data system, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user friendly menu drivers.

  10. Anthropological analysis of taekwondo--new methodological approach.

    Science.gov (United States)

    Cular, Drazen; Munivrana, Goran; Katić, Ratko

    2013-05-01

    The aim of this research is to determine the order and importance of the impacts of particular anthropological characteristics and of technical and tactical competence on success in taekwondo, according to the opinions of top taekwondo instructors (experts). Partial objectives include analysis of the metric characteristics of the measuring instrument and determining differences between the two disciplines (sparring and the technical discipline of patterns) and the two competition systems (WTF and ITF). In accordance with these aims, the research was conducted on a sample of respondents consisting of 730 taekwondo instructors from 6 continents and 69 countries (from which we selected 242 instructors), who are at different success levels in both taekwondo competition systems (styles) and the two taekwondo disciplines. The respondents were divided into 3 qualitative subsamples (OST-USP-VRH) using the dependent variable of the instructor's accomplished results. In 6 languages, they electronically evaluated the percentage impact (%) of motor and functional skills (MOTFS), morphological characteristics (MORF), the psychological profile of an athlete (PSIH), athletic intelligence (INTE) and technical and tactical competence (TE-TA) on success in taekwondo. The analysis of the metric characteristics of the constructed instrument showed a satisfactory degree of agreement (IHr), which is proportional to the level of respondent quality, i.e. it grows along with the increase in instructor quality in all analysed disciplines of both systems. Top instructors assigned the highest portion of impact on success to the motor and functional skills (MOTFS) variable: WTF-SPB=29.1, ITF-SPB=29.2, WTF-THN=35.0, ITF-THN=32.0. Statistically significant differences in the opinions of instructors of different styles and disciplines were not recorded in any of the analysed variables. The only exception is the psychological profile of an athlete variable, which WTF instructors of sparring (AM=23.7%), on a significance

  11. A new methodology for the CFD uncertainty analysis

    Institute of Scientific and Technical Information of China (English)

    YAO Zhen-qiu; SHEN Hong-cui; GAO Hui

    2013-01-01

    With respect to measurement uncertainty, this paper discusses the definition, the sources, the classification and the expressions of CFD uncertainty. Based on orthogonal design and statistical inference theory, a new verification and validation method and the related procedures for CFD simulation are developed. With this method, two examples of CFD verification and validation are studied for the drag coefficient and the nominal wake fraction, and the calculation factors and their interactions that significantly affect the simulation results are obtained. Moreover, the sizes of all uncertainty components resulting from the controlled and uncontrolled calculation factors are determined, and the optimal combination of the calculation factors is obtained by effect estimation in the orthogonal experiment design. It is shown that the new method can be used for verification in CFD uncertainty analysis, and can reasonably and definitively judge the credibility of the simulation result. As for the CFD simulation of the drag coefficient and the nominal wake fraction, the predicted results can be validated. Although there is still some difference between the simulation results and the experimental results, the level of approximation and credibility can be accepted.
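
    A minimal sketch of the effect-estimation step on an L4(2^3) orthogonal array: each main effect is the difference between the mean response at the factor's two levels. The factor names and drag-coefficient responses are invented for illustration.

      import numpy as np

      # L4 orthogonal array, three two-level factors coded as -1/+1
      L4 = np.array([[-1, -1, -1],
                     [-1, +1, +1],
                     [+1, -1, +1],
                     [+1, +1, -1]])
      response = np.array([0.412, 0.408, 0.395, 0.399])   # hypothetical CFD results

      for j, name in enumerate(["grid density", "time step", "turbulence model"]):
          effect = response[L4[:, j] == +1].mean() - response[L4[:, j] == -1].mean()
          print(f"{name}: main effect = {effect:+.4f}")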

  12. METHODOLOGICAL APPROACH AND MODEL ANALYSIS FOR IDENTIFICATION OF TOURIST TRENDS

    Directory of Open Access Journals (Sweden)

    Neven Šerić

    2015-06-01

    Full Text Available The draw and diversity of a destination's offer is an antecedent of growth in tourism visits. Destination supply differentiation is carried out through new, specialised tourism products. The usual approach consists of forming specialised tourism products in accordance with the existing tourism destination image. Another approach, prevalent in the practice of developed tourism destinations, is based on innovating the destination supply in accordance with global tourism trends. For this purpose, it is advisable to choose a method for monitoring and analysing tourism trends. The goal is to determine the actual trends governing target markets, differentiating whims from trends during the tourism preseason. When considering the return on investment, modifying a destination's tourism offer on the basis of a tourism whim is a risky endeavour indeed. Adapting the destination's supply to tourism whims can result in a shifted image, one that is unable to ensure long-term interest and growth in tourist vacations. With regard to tourism trend research and based on the research conducted, a model for evaluating tourism phenomena is proposed, one that determines whether a tourism phenomenon is a tourism trend or a tourism whim.

  13. Energy minimization in medical image analysis: Methodologies and applications.

    Science.gov (United States)

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous methods and discrete methods. The former includes the Newton-Raphson method, gradient descent method, conjugate gradient method, proximal gradient method, coordinate descent method, and genetic algorithm-based method, while the latter covers the graph cuts method, belief propagation method, tree-reweighted message passing method, linear programming method, maximum margin learning method, simulated annealing method, and iterated conditional modes method. We also discuss the minimal surface method, primal-dual method, and the multi-objective optimization method. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview of those applications as well. PMID:26186171
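
    As a minimal illustration of one surveyed continuous method, the sketch below runs gradient descent on a Tikhonov-style denoising energy E(u) = ||u - f||^2 + lam*||grad u||^2 for a synthetic 1-D signal; the step size, regularization weight and periodic boundary handling are assumptions.

      import numpy as np

      rng = np.random.default_rng(2)
      f = np.sign(np.sin(np.linspace(0, 3 * np.pi, 200))) + 0.3 * rng.normal(size=200)

      u, lam, step = f.copy(), 2.0, 0.1
      for _ in range(500):
          lap = np.roll(u, -1) - 2 * u + np.roll(u, 1)   # discrete Laplacian (periodic)
          grad_E = 2 * (u - f) - 2 * lam * lap           # dE/du
          u -= step * grad_E                             # descend the energy
      print("data-fidelity residual:", np.sum((u - f) ** 2))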

  14. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity for performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link the unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future

  15. Motion analysis of knee joint using dynamic volume images

    Science.gov (United States)

    Haneishi, Hideaki; Kohno, Takahiro; Suzuki, Masahiko; Moriya, Hideshige; Mori, Sin-ichiro; Endo, Masahiro

    2006-03-01

    Acquisition and analysis of the three-dimensional movement of the knee joint is desired in orthopedic surgery. We have developed two methods to obtain dynamic volume images of the knee joint. One is a 2D/3D registration method combining bi-plane dynamic X-ray fluoroscopy with a static three-dimensional CT; the other uses so-called 4D-CT, with a cone beam and a wide 2D detector. In this paper, we present two analyses of knee joint movement obtained by these methods: (1) transition of the nearest points between the femur and tibia, and (2) principal component analysis (PCA) of six parameters representing the three-dimensional movement of the knee. As a preprocessing step, the femur and tibia regions are first extracted from the volume data at each time frame, and then registration of the tibia between different frames by an affine transformation consisting of rotation and translation is performed. The same transformation is applied to the femur as well. Using those image data, the movement of the femur relative to the tibia can be analyzed. Six movement parameters of the femur, consisting of three translation parameters and three rotation parameters, are obtained from those images. In analysis (1), the axis of each bone is first found and then the flexion angle of the knee joint is calculated. For each flexion angle, the minimum distance between femur and tibia and the location giving the minimum distance are found for both the lateral condyle and the medial condyle. As a result, it was observed that the movement of the lateral condyle is larger than that of the medial condyle. In analysis (2), it was found that the movement of the knee can be represented by the first three principal components with a precision of 99.58%, and those three components seem to relate strongly to three major movements of the femur in the knee bend known in orthopedic surgery.
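
    The PCA step on the six motion parameters can be sketched as follows; the synthetic "flexion" sequence below merely mimics one dominant motion with small coupled components, so that the first few components capture almost all variance, as reported above.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(3)
      t = np.linspace(0.0, 1.0, 100)
      flexion = 120.0 * t                                   # dominant rotation (deg)
      params = np.column_stack([
          flexion,
          0.2 * flexion + rng.normal(0, 1, 100),            # coupled rotations
          0.1 * flexion + rng.normal(0, 1, 100),
          5 * t + rng.normal(0, 0.5, 100),                  # translations (mm)
          2 * t + rng.normal(0, 0.5, 100),
          rng.normal(0, 0.5, 100),
      ])

      pca = PCA().fit(params - params.mean(axis=0))
      print(np.cumsum(pca.explained_variance_ratio_)[:3])   # first three components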

  16. Validation of On-Orbit Methodology for the Assessment of Cardiac Function and Changes in the Circulating Volume Using Ultrasound and Braslet-M Occlusion Cuffs

    Science.gov (United States)

    Hamilton, Douglas; Sargsyan, Ashot E.; Ebert, Douglas; Duncan, Michael; Bogomolov, Valery V.; Alferova, Irina V.; Matveev, Vladimir P.; Dulchavsky, Scott A.

    2010-01-01

    The objective of this joint U.S. - Russian project was the development and validation of an in-flight methodology to assess a number of cardiac and vascular parameters associated with circulating volume and its manipulation in long-duration space flight. Responses to modified Valsalva and Mueller maneuvers were measured by cardiac and vascular ultrasound (US) before, during, and after temporary volume reduction by means of Braslet-M thigh occlusion cuffs (Russia). Materials and Methods: The study protocol was conducted in 14 sessions on 9 ISS crewmembers, with an average exposure to microgravity of 122 days. Baseline cardiovascular measurements were taken by echocardiography in multiple modes (including tissue Doppler of both ventricles) and femoral and jugular vein imaging on the International Space Station (ISS). The Braslet devices were then applied and measurements were repeated after >10 minutes. The cuffs were then released and the hemodynamic recovery process was monitored. Modified Valsalva and Mueller maneuvers were used throughout the protocol. All US data were acquired by the HDI-5000 ultrasound system aboard the ISS (ATL/Philips, USA) during remotely guided sessions. The study protocol, including the use of Braslet-M for this purpose, was approved by the ISS Human Research Multilateral Review Board (HRMRB). Results: The effects of fluid sequestration on a number of echocardiographic and vascular parameters were readily detectable by in-flight US, as were responses to respiratory maneuvers. The overall volume status assessment methodology appears to be valid and practical, with a decrease in left heart lateral E (tissue Doppler) as one of the most reliable measures. Increase in the femoral vein cross-sectional areas was consistently observed with Braslet application. Other significant differences and trends within the extensive cardiovascular data were also observed. (Decreased - RV and LV preload indices, Cardiac Output, LV E all maneuvers, LV Stroke

  17. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  18. Parallel runway requirement analysis study. Volume 1: The analysis

    Science.gov (United States)

    Ebrahimi, Yaghoob S.

    1993-01-01

    The correlation of increased flight delays with the level of aviation activity is well recognized. A main contributor to these flight delays has been the capacity of airports. Though new airport and runway construction would significantly increase airport capacity, few programs of this type are currently underway, let alone planned, because of the high cost associated with such endeavors. Therefore, it is necessary to achieve the most efficient and cost effective use of existing fixed airport resources through better planning and control of traffic flows. In fact, during the past few years the FAA has initiated such an airport capacity program designed to provide additional capacity at existing airports. Some of the improvements that the program has generated thus far have been based on new Air Traffic Control procedures, terminal automation, additional Instrument Landing Systems, improved controller display aids, and improved utilization of multiple runways/Instrument Meteorological Conditions (IMC) approach procedures. A useful element in understanding potential operational capacity enhancements at high demand airports has been the development and use of an analysis tool called the PLAND_BLUNDER (PLB) Simulation Model. The objective of building this simulation was to develop a parametric model that could be used for analysis in determining the minimum safety level of parallel runway operations for various parameters representing the airplane, navigation, surveillance, and ATC system performance. This simulation is useful as: a quick and economical evaluation of existing environments that are experiencing IMC delays; an efficient way to study and validate proposed procedure modifications; an aid in evaluating requirements for new airports or new runways at old airports; a simple, parametric investigation of a wide range of issues and approaches; an ability to trade off air and ground technology and procedure contributions; and a way of considering probable

  19. Environmentally-acceptable fossil energy site evaluation and selection: methodology and user's guide. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Northrop, G.M.

    1980-02-01

    This report is designed to facilitate assessments of environmental and socioeconomic impacts of fossil energy conversion facilities which might be implemented at potential sites. The discussion of methodology and the User's Guide contained herein are presented in a format that assumes the reader is not an energy technologist. Indeed, this methodology is meant for application by almost anyone with an interest in a potential fossil energy development - planners, citizen groups, government officials, and members of industry. It may also be of instructional value. The methodology is called: Site Evaluation for Energy Conversion Systems (SELECS) and is organized in three levels of increasing sophistication. Only the least complicated version - the Level 1 SELECS - is presented in this document. As stated above, it has been expressly designed to enable just about anyone to participate in evaluating the potential impacts of a proposed energy conversion facility. To accomplish this objective, the Level 1 calculations have been restricted to ones which can be performed by hand in about one working day. Data collection and report preparation may bring the total effort required for a first or one-time application to two to three weeks. If repeated applications are made in the same general region, the assembling of data for a different site or energy conversion technology will probably take much less time.

  1. Atlas based brain volumetry: How to distinguish regional volume changes due to biological or physiological effects from inherent noise of the methodology.

    Science.gov (United States)

    Opfer, Roland; Suppa, Per; Kepp, Timo; Spies, Lothar; Schippling, Sven; Huppertz, Hans-Jürgen

    2016-05-01

    Fully-automated regional brain volumetry based on structural magnetic resonance imaging (MRI) plays an important role in quantitative neuroimaging. In clinical trials as well as in clinical routine multiple MRIs of individual patients at different time points need to be assessed longitudinally. Measures of inter- and intrascanner variability are crucial to understand the intrinsic variability of the method and to distinguish volume changes due to biological or physiological effects from inherent noise of the methodology. To measure regional brain volumes an atlas based volumetry (ABV) approach was deployed using a highly elastic registration framework and an anatomical atlas in a well-defined template space. We assessed inter- and intrascanner variability of the method in 51 cognitively normal subjects and 27 Alzheimer dementia (AD) patients from the Alzheimer's Disease Neuroimaging Initiative by studying volumetric results of repeated scans for 17 compartments and brain regions. Median percentage volume differences of scan-rescans from the same scanner ranged from 0.24% (whole brain parenchyma in healthy subjects) to 1.73% (occipital lobe white matter in AD), with generally higher differences in AD patients as compared to normal subjects (e.g., 1.01% vs. 0.78% for the hippocampus). Minimum percentage volume differences detectable with an error probability of 5% were in the one-digit percentage range for almost all structures investigated, with most of them being below 5%. Intrascanner variability was independent of magnetic field strength. The median interscanner variability was up to ten times higher than the intrascanner variability.
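
    The "minimum detectable difference at 5% error probability" can be derived from scan-rescan variability under a normality assumption: if d is the percentage difference between two repeated measurements, the two-sided 95% threshold is 1.96 times the standard deviation of d. A minimal sketch with invented rescan differences:

      import numpy as np

      rescan_diff_pct = np.array([0.3, -0.5, 0.8, -0.2, 0.6, -0.7, 0.4, -0.1])  # %
      sd_diff = np.std(rescan_diff_pct, ddof=1)   # SD of scan-rescan differences
      mdc_95 = 1.96 * sd_diff                     # minimum detectable change, alpha = 0.05
      print(f"minimum detectable volume change: {mdc_95:.2f}%")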

  2. Comparative analysis of operational forecasts versus actual weather conditions in airline flight planning, volume 1

    Science.gov (United States)

    Keitz, J. F.

    1982-01-01

    The impact of more timely and accurate weather data on airline flight planning with the emphasis on fuel savings is studied. This volume of the report discusses the results of Task 1 of the four major tasks included in the study. Task 1 compares flight plans based on forecasts with plans based on the verifying analysis from 33 days during the summer and fall of 1979. The comparisons show that: (1) potential fuel savings conservatively estimated to be between 1.2 and 2.5 percent could result from using more timely and accurate weather data in flight planning and route selection; (2) the Suitland forecast generally underestimates wind speeds; and (3) the track selection methodology of many airlines operating on the North Atlantic may not be optimum resulting in their selecting other than the optimum North Atlantic Organized Track about 50 percent of the time.

  3. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  4. Application of seismic analysis methodology to small modular integral reactor internals

    International Nuclear Information System (INIS)

    The fluid-structure interaction (FSI) effect should be carefully considered in a seismic analysis of nuclear reactor internals to obtain the appropriate seismic responses, because the dynamic characteristics of reactor internals change when they are submerged in the reactor coolant. This study proposes a seismic analysis methodology that considers the FSI effect in an integral reactor, and applies the methodology to the System-Integrated Modular Advanced Reactor (SMART) developed in Korea. In this methodology, we especially focus on constructing a numerical analysis model that can represent the dynamic behavior arising from the FSI effect. The effect is included in the simplified seismic analysis model by adopting fluid elements in the gap between the structures. The overall procedure for constructing the seismic analysis model is verified using dynamic characteristics extracted from a scaled-down model, and the time history analysis is then carried out with the constructed seismic analysis model, applying the El Centro earthquake input, in order to obtain the major seismic responses. The results show that the seismic analysis model can clearly provide the seismic responses of the reactor internals. Moreover, the results emphasize the importance of considering the FSI effect in the seismic analysis of the integral reactor. (author)
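
    Why the FSI effect matters can be seen from a single-mode added-mass sketch: representing the coolant in the gap as hydrodynamic added mass lowers the natural frequency of the submerged internals. All mass and stiffness values are illustrative, not SMART design data.

      import numpy as np

      m_struct = 1200.0   # structural mass (kg)
      m_added = 400.0     # fluid added mass from the narrow gap (kg)
      k = 5.0e7           # support stiffness (N/m)

      f_dry = np.sqrt(k / m_struct) / (2 * np.pi)
      f_wet = np.sqrt(k / (m_struct + m_added)) / (2 * np.pi)
      print(f"in air: {f_dry:.1f} Hz, submerged: {f_wet:.1f} Hz")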

  5. Semantic analysis according to Peep Koort--a substance-oriented research methodology.

    Science.gov (United States)

    Sivonen, Kerstin; Kasén, Anne; Eriksson, Katie

    2010-12-01

    The aim of this article is to describe the hermeneutic semantic analysis created by Professor Peep Koort (1920-1977) and to discuss it as a methodology for research within caring science. The methodology is developed with a hermeneutic approach that differs from the traditions of semantic analysis in philosophy or linguistics. The research objects are core concepts and theoretical constructs (originally within the academic discipline of education science, later on within the academic discipline of caring science), with a focus on deeper understanding of essential meaning content in the development of a discipline. The qualitative methodology of hermeneutic semantic analysis is described step by step as created by Koort and as interpreted and developed by the authors. An etymological investigation and an analysis of synonymy between related concepts within a conceptual family guide the researcher in understanding and discriminating the conceptual dimensions of meaning content connected to the word studied, thus giving opportunities to summarise it in a theoretical definition, a discovery that can be tested in varying contexts. From a caring science perspective, we find the hermeneutic methodology of semantic analysis fruitful and suitable for researchers developing their understanding of core concepts and theoretical constructs connected to the development of the academic discipline.

  6. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects whose operations or methods correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
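
    A minimal sketch of a 'real-time systems-analysis object' as characterized above: an entity whose time-behavior is a set of states plus state-transition rules. The valve example and event names are hypothetical.

      # State/transition building block for real-time systems analysis (illustrative)
      class AnalysisObject:
          def __init__(self, transitions, initial):
              self.transitions = transitions   # {(state, event): next_state}
              self.state = initial

          def handle(self, event):
              # apply a transition rule if one exists, else stay in the same state
              self.state = self.transitions.get((self.state, event), self.state)
              return self.state

      valve = AnalysisObject({("closed", "cmd_open"): "opening",
                              ("opening", "limit_hit"): "open",
                              ("open", "cmd_close"): "closed"},
                             initial="closed")
      print(valve.handle("cmd_open"), valve.handle("limit_hit"))  # opening open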

  7. Applications of a methodology for the analysis of learning trends in nuclear power plants

    International Nuclear Information System (INIS)

    A methodology is applied to identify learning trends related to the safety and availability of U.S. commercial nuclear power plants. The application is intended to aid in reducing the likelihood of human error. To ensure that the methodology can be easily adapted to various classification schemes for operating data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme was selected. Significance criteria for human-initiated events affecting the systems and for events caused by human deficiencies were used. Clustering analysis was used to identify the learning trend in multi-dimensional histograms. A computer code based on the K-Means algorithm was developed and applied to find the learning period, in which error rates decrease monotonically with plant age.
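
    A minimal sketch of the clustering step, assuming scikit-learn's K-Means in place of the report's custom code; the age bins and event rates are invented.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans  # stands in for the report's custom K-Means code

    # Invented data: rows = plant-age bins, columns = event categories
    # (e.g., significant human-initiated events per reactor-year).
    rates = np.array([
        [2.1, 1.8], [1.9, 1.6], [1.4, 1.2], [1.0, 0.9],
        [0.8, 0.7], [0.7, 0.6], [0.7, 0.6], [0.6, 0.6],
    ])
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(rates)
    print(labels)  # separates early "learning" age bins from mature ones

    # Learning period: leading run of age bins over which total rates never increase.
    total = rates.sum(axis=1)
    end = next((i for i in range(1, len(total)) if total[i] > total[i - 1]), len(total))
    print(f"learning period: age bins 0..{end - 1}")
    ```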

  8. Analysis of target volumes for gliomas; Volumes-cibles anatomocliniques (GTV et CTV) des tumeurs gliales

    Energy Technology Data Exchange (ETDEWEB)

    Kantor, G. [Centre Regional de Lutte Contre le Cancer, Service de Radiotherapie, Institut Bergonie, 33 - Bordeaux (France); Bordeaux-2 Univ., 33 (France); Loiseau, H. [Hopital Pellegrin-Tripode, Service de Neurochirurgie, 33 - Bordeaux (France); Bordeaux-2 Univ., 33 (France)

    2005-06-15

    Gliomas are the most frequent tumors of the central nervous system in adults. These intra-parenchymal tumors are infiltrative, and the most important criterion for definition of the GTV and CTV is the extent of infiltration. Delineation of the GTV and CTV for untreated and resected gliomas remains a controversial and difficult issue because of the discrepancy between real tumor invasion and that estimated by CT or MRI. A joint analysis of four different methods is particularly helpful: histopathological correlation with CT and MRI, use of new imaging modalities, patterns of relapse after treatment, and interobserver studies. The presence of isolated tumor cells in intact brain, oedema or adjacent structures requires the definition of two different options for the CTV: (i) a geometrical option, with the GTV defined as the tumor mass revealed by the contrast-enhanced zone on CT or MRI and a CTV with an expanded margin of 2 or 3 cm; (ii) an anatomic option including the entire zone of oedema or isolated tumor cell infiltration, extending at least as far as the limits of the hyperintense zone on T2-weighted MRI. Inclusion of adjacent structures (such as white matter, corpus callosum, subarachnoid spaces) in the CTV mainly depends on the site of the tumor, and the resulting volume is generally enlarged. (authors)
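
    As a hedged illustration of the geometrical option only, the snippet below expands a hypothetical binary GTV mask by a fixed margin; the mask, voxel size and margin are invented.

    ```python
    import numpy as np
    from scipy.ndimage import binary_dilation

    # Hypothetical 3-D binary GTV mask on a grid with 2 mm isotropic voxels.
    voxel_mm = 2.0
    gtv = np.zeros((64, 64, 64), dtype=bool)
    gtv[28:36, 28:36, 30:34] = True  # contrast-enhancing tumor mass

    # Geometrical option: expand the GTV by a 20 mm margin. Iterating the default
    # cross-shaped element gives a city-block margin; clinical systems use a
    # Euclidean expansion, so this is only a first approximation.
    margin_vox = int(round(20.0 / voxel_mm))
    ctv = binary_dilation(gtv, iterations=margin_vox)
    print(gtv.sum(), ctv.sum())  # voxel counts before/after expansion
    ```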

  9. Analysis of Partial Volume Effects on Accurate Measurement of the Hippocampus Volume

    Institute of Scientific and Technical Information of China (English)

    Maryam Hajiesmaeili; Jamshid Dehmeshki; Tim Ellis

    2014-01-01

    Hippocampal volume loss is an important biomarker in distinguishing subjects with Alzheimer's disease (AD), and its measurement in magnetic resonance images (MRI) is influenced by partial volume effects (PVE). This paper describes a post-processing approach to quantify PVE for correction of the hippocampal volume by using a spatial fuzzy C-means (SFCM) method. The algorithm is evaluated on a dataset of 20 T1-weighted MRI scans sampled at two different resolutions. The corrected volumes for the left and right hippocampus (HC) are lower than the hippocampal volumes from manual segmentation by 23% and 18% for the low-resolution dataset and by 6% and 5% for the high-resolution dataset, respectively. The results show the importance of applying this technique in AD detection with low-resolution datasets.
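
    For orientation, here is a plain fuzzy C-means sketch on invented 1-D intensities; the paper's SFCM additionally incorporates spatial neighborhood information into the memberships, which is omitted here.

    ```python
    import numpy as np

    def fcm(x, c=3, m=2.0, iters=100, seed=0):
        """Plain fuzzy C-means on 1-D intensities (no spatial term)."""
        rng = np.random.default_rng(seed)
        u = rng.dirichlet(np.ones(c), size=len(x))        # memberships, rows sum to 1
        for _ in range(iters):
            um = u ** m
            centers = um.T @ x / um.sum(axis=0)           # weighted cluster centers
            d = np.abs(x[:, None] - centers[None, :]) + 1e-12
            u = 1.0 / d ** (2.0 / (m - 1.0))              # u_ik proportional to d_ik^(-2/(m-1))
            u /= u.sum(axis=1, keepdims=True)
        return u, centers

    # Invented intensities: CSF / gray matter / white matter. Voxels between the
    # modes get intermediate memberships -- the partial-volume information used
    # to correct the hippocampal volume.
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(30, 3, 200), rng.normal(70, 4, 200),
                        rng.normal(110, 4, 200)])
    u, centers = fcm(x)
    print(np.sort(centers))
    ```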

  10. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.

  12. Success story in software engineering using NIAM (Natural language Information Analysis Methodology)

    Energy Technology Data Exchange (ETDEWEB)

    Eaton, S.M.; Eaton, D.S.

    1995-10-01

    To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goals of both the customer and the analyst completely understanding the information. We use the customer's own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

  13. Parametric analysis of architectural volumes through genetic algorithms

    Directory of Open Access Journals (Sweden)

    Pedro Salcedo Lagos

    2015-03-01

    In recent years, architectural design has developed partly through new digital design techniques, which allow geometries to be generated from the definition of initial parameters and the programming of formal relationships between them. Design processes based on these technologies can create shapes that modify and adapt to multiple constraints or specific evaluation criteria, which raises the problem of identifying the best architectural solution. Several studies have applied genetic algorithms to this problem. This paper demonstrates the possibility of implementing a parametric analysis of architectural volumes with a genetic algorithm, combining functional, environmental and structural requirements with an effective search method that selects a variety of suitable solutions through digital technologies.
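
    A toy version of the idea, using a mutation-only evolutionary loop rather than a full genetic algorithm with crossover; the three parameters (width, depth, height) and the fitness terms are invented stand-ins for functional, environmental and structural requirements.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Invented fitness: floor area vs. an envelope (energy) proxy and a height limit.
    def fitness(p):
        w, d, h = p                                   # width, depth, height [m]
        floor_area = w * d * max(int(h // 3.0), 1)    # storeys of 3 m
        envelope = 2 * (w + d) * h + w * d            # proxy for heat loss
        penalty = 1e3 * max(h - 30.0, 0.0)            # planning constraint: h <= 30 m
        return floor_area - 0.5 * envelope - penalty

    low, high = np.array([5, 5, 3.0]), np.array([40, 40, 40.0])
    pop = rng.uniform(low, high, size=(60, 3))        # initial population
    for _ in range(200):
        scores = np.array([fitness(p) for p in pop])
        parents = pop[np.argsort(scores)[-20:]]       # truncation selection
        pop = parents[rng.integers(0, 20, 60)] + rng.normal(0, 0.5, (60, 3))
        pop = np.clip(pop, low, high)                 # mutation + bounds
    best = pop[np.argmax([fitness(p) for p in pop])]
    print("best (w, d, h):", np.round(best, 1))
    ```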

  14. Methodological Principles of Assessing the Volume of Investment Influx from Non-State Pension Funds into the Economy of Ukraine

    Directory of Open Access Journals (Sweden)

    Dmitro Leonov

    2004-11-01

    This article addresses the processes of forming investment resources from non-state pension funds under current conditions in Ukraine and the laws and regulations that define the principles of the formation of investment institutions. Based on factors that in the near future will affect the decision-making processes by which different kinds of investors make payments to non-state pension funds, we develop a procedure for assessing the volume of investment inflow from non-state pension funds into the economy and propose a procedure for long- and short-term forecasting of that investment inflow into the Ukrainian economy.

  15. Complete Photoionization Experiments via Ultrafast Coherent Control with Polarization Multiplexing II: Numerics & Analysis Methodologies

    CERN Document Server

    Hockett, P; Lux, C; Baumert, T

    2015-01-01

    The feasibility of complete photoionization experiments, in which the full set of photoionization matrix elements are determined, using multiphoton ionization schemes with polarization-shaped pulses has recently been demonstrated [Hockett et. al., Phys. Rev. Lett. 112, 223001 (2014)]. Here we extend our previous work to discuss further details of the numerics and analysis methodology utilised, and compare the results directly to new tomographic photoelectron measurements, which provide a more sensitive test of the validity of the results. In so doing we discuss in detail the physics of the photoionization process, and suggest various avenues and prospects for this coherent multiplexing methodology.

  16. Can the capability approach be evaluated within the frame of mainstream economics?: A methodological analysis

    Directory of Open Access Journals (Sweden)

    Karaçay Çakmak Hatice

    2010-01-01

    The aim of this article is to examine the capability approach of Amartya Sen and mainstream economic theory in terms of their epistemological, methodological and philosophical/cultural aspects. The reason for undertaking this analysis is the belief that Sen's capability approach, contrary to some economists' claims, is uncongenial to mainstream economic views on epistemology and methodology (though not on ontology). While some social scientists regard Sen as, on the whole, a mainstream economist, his own approach strongly criticizes both the theory and practice of mainstream economics.

  17. Petri net modeling and software safety analysis: methodology for an embedded military application.

    OpenAIRE

    Lewis, Alan D.

    1988-01-01

    Approved for public release; distribution is unlimited This thesis investigates the feasibility of software safety analysis using Petri net modeling and an automated suite of Petri Net UTilities (P-NUT) developed at UC Irvine. We briefly introduce software safety concepts, Petri nets, reachability theory, and the use of P-NUT. We then develop a methodology to combine these ideas for efficient and effective preliminary safety analysis of a real-time, embedded software, ...
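
    The flavor of the analysis can be sketched as follows: enumerate the reachable markings of a small net and check a safety property on each. The mutual-exclusion net below is invented for illustration; P-NUT automates this kind of reachability analysis at scale.

    ```python
    from collections import deque

    # Invented mutual-exclusion net: two tasks compete for one resource.
    # Each transition is (pre-conditions, post-conditions): place -> token count.
    transitions = {
        "t1_acquire": ({"t1_ready": 1, "res": 1}, {"t1_crit": 1}),
        "t1_release": ({"t1_crit": 1}, {"t1_ready": 1, "res": 1}),
        "t2_acquire": ({"t2_ready": 1, "res": 1}, {"t2_crit": 1}),
        "t2_release": ({"t2_crit": 1}, {"t2_ready": 1, "res": 1}),
    }

    def fire(marking, pre, post):
        if any(marking.get(p, 0) < n for p, n in pre.items()):
            return None  # transition not enabled
        m = dict(marking)
        for p, n in pre.items():
            m[p] -= n
        for p, n in post.items():
            m[p] = m.get(p, 0) + n
        return {p: n for p, n in m.items() if n > 0}

    def reachable(initial):
        seen, queue = {frozenset(initial.items())}, deque([initial])
        while queue:
            m = queue.popleft()
            for pre, post in transitions.values():
                nxt = fire(m, pre, post)
                if nxt is not None and frozenset(nxt.items()) not in seen:
                    seen.add(frozenset(nxt.items()))
                    queue.append(nxt)
        return seen

    # Safety property: the two tasks are never in their critical section together.
    for m in reachable({"t1_ready": 1, "t2_ready": 1, "res": 1}):
        d = dict(m)
        assert not (d.get("t1_crit") and d.get("t2_crit"))
    print("mutual exclusion holds in all reachable markings")
    ```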

  18. Two-dimensional thermal analysis of a fuel rod by finite volume method

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Rhayanne Y.N.; Silva, Mario A.B. da; Lira, Carlos A.B. de O., E-mail: ryncosta@gmail.com, E-mail: mabs500@gmail.com, E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamaento de Energia Nuclear

    2015-07-01

    In a nuclear reactor, the amount of power generation is limited by thermal and physical limitations rather than by nuclear parameters. The operation of a reactor core, even with the best heat removal system, must take into account the fact that the temperatures of fuel and cladding shall not exceed safety limits anywhere in the core. If these limits are not respected, damage to the fuel element may release large quantities of radioactive material into the coolant or even lead to core meltdown. Thermal analyses of fuel rods are often accomplished by considering the one-dimensional heat diffusion equation. The aim of this study is to verify the temperature distribution for a two-dimensional heat transfer problem in an advanced reactor. The methodology is based on the Finite Volume Method (FVM), which considers a balance of the property of interest over each control volume. The methodology is validated by comparing numerical and analytical solutions. For the two-dimensional analysis, the results indicate that the temperature profile agrees with the expected physical behavior, providing quantitative information for the development of advanced reactors. (author)
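
    A minimal finite-volume sketch in the spirit of the paper: steady two-dimensional heat conduction with a uniform volumetric source and fixed boundary temperature, solved by Jacobi iteration on the per-cell balance. The Cartesian geometry and all values are illustrative, not the authors' fuel-rod model.

    ```python
    import numpy as np

    # Steady 2-D conduction with uniform volumetric source q''' on a square
    # cross-section, fixed 600 K boundary; the finite-volume balance per cell is
    # k*(sum of neighbor temperatures - 4*T_P)/dx^2 + q = 0.
    n, L = 40, 0.01          # cells per side, domain size [m]
    dx = L / n
    k, q = 3.0, 3.0e8        # conductivity [W/m.K], heat source [W/m^3] (illustrative)
    T = np.full((n + 2, n + 2), 600.0)   # includes boundary cells

    for _ in range(20000):   # Jacobi iteration until convergence
        T[1:-1, 1:-1] = 0.25 * (T[2:, 1:-1] + T[:-2, 1:-1]
                                + T[1:-1, 2:] + T[1:-1, :-2]
                                + q * dx**2 / k)
        T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 600.0  # Dirichlet boundary
    print(f"peak temperature: {T.max():.0f} K")
    ```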

  19. An efficient methodology for the analysis of primary frequency control of electric power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D.P. [Nikola Tesla Institute, Belgrade (Yugoslavia); Mijailovic, S.V. [Electricity Coordinating Center, Belgrade (Yugoslavia)

    2000-06-01

    The paper presents an efficient methodology for the analysis of primary frequency control of electric power systems. This methodology continuously monitors the electromechanical transient processes, lasting up to 30 s, that occur after characteristic disturbances. It covers the period of short-term dynamic processes appearing immediately after the disturbance, in which the dynamics of the individual synchronous machines is dominant, as well as the period with uniform movement of all generators and restoration of their voltages. The characteristics of the developed methodology were determined on the example of the real electric power interconnection formed by the electric power systems of Yugoslavia, a part of the Republic of Srpska, Romania, Bulgaria, the former Yugoslav Republic of Macedonia, Greece and Albania (the second UCPTE synchronous zone). (author)
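
    For intuition, a single-area sketch of primary frequency control: swing dynamics plus a first-order governor with droop, integrated over the 30 s window the methodology targets. Parameters and the disturbance are invented, not taken from the UCPTE study.

    ```python
    # Single-area primary frequency control: swing equation + droop governor.
    H, D, R, Tg = 5.0, 1.0, 0.05, 0.5   # inertia [s], load damping, droop, governor lag [s]
    dP = -0.10                           # disturbance: 10% generation deficit [p.u.]
    dt, t_end = 0.01, 30.0               # integrate over the 30 s window

    f = pg = 0.0                         # frequency deviation, governor output [p.u.]
    for _ in range(int(t_end / dt)):
        dfdt = (dP + pg - D * f) / (2 * H)    # swing equation
        dpgdt = (-f / R - pg) / Tg            # droop with first-order governor lag
        f, pg = f + dt * dfdt, pg + dt * dpgdt
    print(f"quasi-steady frequency deviation: {50 * f:.3f} Hz (50 Hz base)")
    ```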

  20. A Methodological Review for the Analysis of Divide and Conquer Based Sorting/ Searching Algorithms

    Directory of Open Access Journals (Sweden)

    Deepak Abhyankar

    2011-09-01

    This paper develops a practical methodology for the analysis of sorting/searching algorithms. To achieve this objective, an analytical study of Quicksort and the searching problem was undertaken. The work explains that asymptotic analysis can be misleading if applied carelessly. The study provides fresh insight into the working of Quicksort and binary search, and also presents an exact analysis of Quicksort. We find that asymptotic analysis is a form of approximation and may hide many useful facts: infinitely many inefficient algorithms can be lumped together with a few efficient algorithms under the asymptotic approach.
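
    The spirit of an exact (rather than asymptotic) analysis can be reproduced by instrumenting the algorithms and counting comparisons directly; the sketch below counts one partition comparison per element and one probe per binary-search iteration. The counting convention is one common choice, not necessarily the paper's.

    ```python
    import random

    comparisons = 0

    def quicksort(a):
        global comparisons
        if len(a) <= 1:
            return a
        pivot, rest = a[0], a[1:]
        comparisons += len(rest)              # one partition comparison per element
        return (quicksort([x for x in rest if x < pivot]) + [pivot]
                + quicksort([x for x in rest if x >= pivot]))

    def binary_search(a, key):
        global comparisons
        lo, hi = 0, len(a) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            comparisons += 1                  # one probe per iteration
            if a[mid] == key:
                return mid
            lo, hi = (mid + 1, hi) if a[mid] < key else (lo, mid - 1)
        return -1

    random.seed(1)
    data = random.sample(range(100000), 1000)
    s = quicksort(data)
    print("quicksort comparisons:", comparisons)   # close to 2*n*ln(n), not just "O(n log n)"
    comparisons = 0
    binary_search(s, s[500])
    print("binary search probes:", comparisons)    # at most ceil(log2(n+1))
    ```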

  1. A methodology for the analysis of protection against overpressure using the Ramona-3 B code

    International Nuclear Information System (INIS)

    A methodology for analyzing the most severe overpressure transient that could occur at the Laguna Verde Nuclear Power Plant is presented; this study is required as part of the licensing analysis for a fuel reload. The analysis is carried out with the RAMONA-3B code, and the results are compared against the safety analysis report of the Laguna Verde Nuclear Power Plant. The aim of the analysis is to determine the maximum pressure reached in the reactor vessel during operational events, in order to demonstrate compliance with the ASME code for containers and pressure vessels. (Author)

  2. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival models including the accelerated failure time (AFT) model with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing proportions of missingness. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.
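
    A hedged numerical sketch of the censoring idea (in Python rather than the authors' R code): intensities are log-normal and left-censored at a detection limit, and censored observations contribute the CDF instead of the density to the likelihood. All data and the detection limit are invented.

    ```python
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(0)
    limit = 1.0                                          # detection limit (censoring point)
    y1 = rng.lognormal(mean=0.5, sigma=1.0, size=200)    # condition A (invented)
    y2 = rng.lognormal(mean=1.0, sigma=1.0, size=200)    # condition B (invented)

    def negloglik(params, y):
        mu, logsig = params
        sig = np.exp(logsig)
        obs = y[y >= limit]
        ncens = np.sum(y < limit)                        # intensities below the limit
        # Observed points contribute the log-normal density...
        ll = stats.norm.logpdf(np.log(obs), mu, sig).sum() - np.log(obs).sum()
        # ...censored points contribute log P(Y < limit).
        ll += ncens * stats.norm.logcdf((np.log(limit) - mu) / sig)
        return -ll

    fits = [optimize.minimize(negloglik, x0=[0.0, 0.0], args=(y,)) for y in (y1, y2)]
    print(f"log-scale means: {fits[0].x[0]:.2f} vs {fits[1].x[0]:.2f}")
    ```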

  3. Predicted costs of environmental controls for a commercial oil shale industry. Volume 1. An engineering analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nevens, T.D.; Culbertson, W.J. Jr.; Wallace, J.R.; Taylor, G.C.; Jovanovich, A.P.; Prien, C.H.; Hicks, R.E.; Probstein, R.F.; Domahidy, G.

    1979-07-01

    The pollution control costs for a commercial oil shale industry were determined in a joint effort by Denver Research Institute, Water Purification Associates of Cambridge, and Stone and Webster Engineering of Boston and Denver. Four commercial oil shale processes were considered. The results in terms of cost per barrel of syncrude oil are predicted to be as follows: Paraho Process, $0.67 to $1.01; TOSCO II Process, $1.43 to $1.91; MIS Process, $2.02 to $3.03; and MIS/Lurgi-Ruhrgas Process, $1.68 to $2.43. Alternative pollution control equipment and integrated pollution control strategies were considered and optimal systems selected for each full-scale plant. A detailed inventory of equipment (along with the rationale for selection), a detailed description of control strategies, itemized costs and predicted emission levels are presented for each process. Capital and operating cost data are converted to a cost per barrel basis using detailed economic evaluation procedures. Ranges of cost are determined using a subjective self-assessment of uncertainty approach. An accepted methodology for probability encoding was used, and cost ranges are presented as subjective probability distributions. Volume I presents the detailed engineering results. Volume II presents the detailed analysis of uncertainty in the predicted costs.

  4. Accidental safety analysis methodology development in decommission of the nuclear facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. H.; Hwang, J. H.; Jae, M. S.; Seong, J. H.; Shin, S. H.; Cheong, S. J.; Pae, J. H.; Ang, G. R.; Lee, J. U. [Seoul National Univ., Seoul (Korea, Republic of)

    2002-03-15

    Decontamination and decommissioning (D and D) of a nuclear reactor costs about 20% of the construction expense, and the production of nuclear wastes during decommissioning raises environmental issues. Decommissioning of nuclear reactors in Korea is at a very early stage, lacking clear standards and regulations. The accident safety analysis for decommissioning of nuclear facilities developed in this work can provide a solid basis for such standards and regulations. For the source term analysis of the Kori-1 reactor vessel, an MCNP/ORIGEN calculation methodology was applied. The activity of each important nuclide in the vessel was estimated for dates after 2008, the year the Kori-1 plant is expected to be decommissioned. In addition, a methodology for risk analysis assessment in decommissioning was developed.
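
    A back-of-envelope version of the decay step that the MCNP/ORIGEN calculation refines: activity of typical activation nuclides after a few years of cooling. The inventories are invented; only the half-lives are physical.

    ```python
    import math

    half_life_y = {"Co-60": 5.27, "Ni-63": 100.1, "Fe-55": 2.74}   # physical half-lives [y]
    a0_bq = {"Co-60": 1.0e14, "Ni-63": 5.0e13, "Fe-55": 8.0e13}    # invented shutdown inventories

    t = 3.0  # years of cooling after shutdown
    for nuc, t_half in half_life_y.items():
        a = a0_bq[nuc] * math.exp(-math.log(2.0) * t / t_half)     # A(t) = A0 * 2^(-t/T_half)
        print(f"{nuc}: {a:.2e} Bq")
    ```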

  5. Best-estimate methodology for analysis of anticipated transients without scram in pressurized water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Rebollo, L. (Union Fenosa, Madrid (Spain))

    1993-07-01

    Union Fenosa, a utility company in Spain, has performed research on pressurized water reactor (PWR) safety with respect to the development of a best-estimate methodology for the analysis of anticipated transients without scram (ATWS), i.e., those anticipated transients for which failure of the reactor protection system is postulated. A scientific and technical approach is adopted with respect to the ATWS phenomenon as it affects a PWR, specifically the Zorita nuclear power plant, a single-loop Westinghouse-designed PWR in Spain. In this respect, an ATWS sequence analysis methodology based on published codes that is generically applicable to any PWR is proposed, which covers all the anticipated phenomena and defines the applicable acceptance criteria. The areas contemplated are cell neutron analysis, core thermal hydraulics, and plant dynamics, which are developed, qualified, and validated by comparison with reference calculations and measurements obtained from integral or separate-effects tests.

  7. A new analysis methodology for the motion of self-propelled particles and its application

    Science.gov (United States)

    Byun, Young-Moo; Lammert, Paul; Crespi, Vincent

    2011-03-01

    Self-propelled particles (SPPs) on the microscale in solution are a growing field of study, with potential applications in nanomedicine and nanorobots. However, little detailed quantitative analysis of SPP motion has been performed so far, because the self-propelled motion is strongly coupled to Brownian motion, which makes extraction of the intrinsic propulsion mechanism problematic and has led to inconsistent conclusions. Here, we present a novel way to decompose the motion of an SPP into self-propelled and Brownian components; accurate values for the self-propulsion speed and diffusion coefficients of the SPP are obtained for the first time. We then apply our analysis methodology to ostensible chemotaxis of SPPs and reveal the actual (non-chemotactic) mechanism of the phenomenon, demonstrating that our analysis methodology is a powerful and reliable tool.
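
    One common decomposition, assumed here for illustration and not necessarily the authors' exact procedure, fits the short-time mean-squared displacement MSD(t) ≈ 4Dt + v²t² to separate the Brownian and self-propelled components. The trajectory below is simulated with invented parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    dt, n, v_true, d_true = 0.1, 5000, 2.0, 0.5
    heading = rng.normal(0, 0.05, n).cumsum()       # slowly diffusing orientation
    steps = (v_true * dt * np.c_[np.cos(heading), np.sin(heading)]
             + np.sqrt(2 * d_true * dt) * rng.normal(size=(n, 2)))
    pos = steps.cumsum(axis=0)                      # simulated 2-D trajectory

    lags = np.arange(1, 20)
    msd = np.array([np.mean(np.sum((pos[k:] - pos[:-k]) ** 2, axis=1)) for k in lags])
    a, b, _ = np.polyfit(lags * dt, msd, 2)         # msd ~ a*t^2 + b*t at short times
    print(f"v ~ {np.sqrt(max(a, 0)):.2f} (true 2.0), D ~ {b / 4:.2f} (true 0.5)")
    ```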

  8. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    The article is devoted to the theoretical and methodological principles of the strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for the study of strategies is demonstrated for modern conditions of high dynamism, uncertainty and risk. The methodological elements of the strategic analysis (the indicators, the factors, the methods of study, the subjects of analysis, and the sources of incoming and outgoing information) are justified within the system of financial management, allowing its theoretical foundations to be improved. It is argued that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. A system of indexes is substantiated, based on the needs of strategic financial analysis, and a classification of factors determining the size and structure of a company's capital is developed. The economic nature of the company's capital is clarified: we consider capital to be a stock of economic resources in the form of cash, tangible and intangible assets accumulated through savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners' prosperity and to achieve a social effect.

  9. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and for the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740
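
    The wrapping pattern the methodology systematizes can be sketched as follows; Flask, the route name and the toy statistic are illustrative choices, not the GEAS implementation, and the semantic annotation layer is omitted.

    ```python
    from flask import Flask, jsonify, request
    import numpy as np

    app = Flask(__name__)

    @app.post("/analyses/differential-expression")   # invented route
    def differential_expression():
        payload = request.get_json()
        a = np.array(payload["groupA"], dtype=float)  # expression values, condition A
        b = np.array(payload["groupB"], dtype=float)  # expression values, condition B
        # Stand-in for the wrapped legacy tool: a simple log2 fold change.
        return jsonify({"log2fc": float(np.log2(a.mean() / b.mean()))})

    if __name__ == "__main__":
        app.run(port=5000)
    ```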

  10. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM panel plus additional DOE and DOE contractor personnel, specializes in the development of the part of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. It should be used to provide the generic data needed to calculate the frequency of aircraft crashes onto the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.
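
    A hedged sketch of the frequency estimate such data feed into, using a four-factor form of the kind used in aircraft crash risk methodology (operations × crash rate × crash-location probability × effective area); all numbers are invented placeholders, not data from the report.

    ```python
    # F = sum_i N_i * P_i * f_i * A_i over aircraft categories:
    # operations/yr, crashes/operation, crash-location probability [1/mi^2],
    # effective facility area [mi^2]. All values are invented placeholders.
    categories = {
        "general aviation": (40000, 2.0e-5, 1.0e-3, 0.010),
        "commercial":       (15000, 5.0e-7, 5.0e-4, 0.020),
        "military":         (3000,  1.0e-5, 2.0e-3, 0.015),
    }
    F = sum(N * P * f * A for N, P, f, A in categories.values())
    print(f"estimated crash frequency onto facility: {F:.1e} per year")
    ```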

  11. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Directory of Open Access Journals (Sweden)

    I. Escuder-Bueno

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces were developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis for estimating current risk from a social perspective and identifying tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision-making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of the approach. The main advantage of applying the methodology presented here is that it provides a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures, which can be of great interest for decision-makers as it provides rational and solid information.
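
    As a minimal quantitative illustration of "risk before and after" a non-structural measure, the sketch below integrates damage over annual exceedance probability to get expected annual damage; the probabilities, damages and the 30% mitigation factor are invented, not SUFRI results.

    ```python
    import numpy as np

    p = np.array([0.1, 0.04, 0.02, 0.01, 0.002])            # annual exceedance probabilities
    damage_before = np.array([0.0, 2.0, 8.0, 20.0, 60.0])   # M EUR per event (invented)
    damage_after = damage_before * 0.7                      # e.g. a warning system cuts losses 30%

    ead_before = -np.trapz(damage_before, p)   # minus sign: p decreases along the axis
    ead_after = -np.trapz(damage_after, p)
    print(f"EAD before: {ead_before:.2f}, after: {ead_after:.2f} M EUR/yr")
    ```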

  12. Travel time impacts analysis of system-wide signal timing optimization methodology

    OpenAIRE

    Ainchil Cayuela, Luis María

    2014-01-01

    This study analyzes the economic impact that users would experience from travel time variations due to system-wide signal timing optimization. To do this, a comprehensive analysis of travel time user benefits is conducted using traffic volume, speed and other attributes of the road network, before and after signal timing optimization.

  13. Methodological approaches to planar and volumetric scintigraphic imaging of small volume targets with high spatial resolution and sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Mejia, J.; Galvis-Alonso, O.Y. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Faculdade de Medicina. Dept. de Biologia Molecular], e-mail: mejia_famerp@yahoo.com.br; Braga, J. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Div. de Astrofisica; Correa, R. [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Div. de Ciencia Espacial e Atmosferica; Leite, J.P. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Neurologia, Psiquiatria e Psicologia Medica; Simoes, M.V. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Clinica Medica

    2009-08-15

    Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique which provides information on the functional state of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of diseases for physiopathological, genomic and drug discovery studies. However, most of the experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than man. Consequently, images of targets obtained with conventional gamma-cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years in order to obtain images of small targets with good spatial resolution and sensitivity. Multi-pinhole, coded-mask and slit-based collimators are presented as alternative approaches to improve image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3-D representations of the volumetric distribution of a target's radiotracers. Simultaneously, they can be used to minimize the artifacts and blurring that arise when single-pinhole collimators are used. Representative images are presented to illustrate the use of these collimators. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction for near-field imaging. We conclude this review by showing that the use of appropriate hardware and software tools adapted to conventional gamma-cameras can be of great help in obtaining relevant functional information in experiments using small animals. (author)

  14. Technical support document: Energy efficiency standards for consumer products: Room air conditioners, water heaters, direct heating equipment, mobile home furnaces, kitchen ranges and ovens, pool heaters, fluorescent lamp ballasts and television sets. Volume 1, Methodology

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    The Energy Policy and Conservation Act (P.L. 94-163), as amended, establishes energy conservation standards for 12 of the 13 types of consumer products specifically covered by the Act. The legislation requires the Department of Energy (DOE) to consider new or amended standards for these and other types of products at specified times. DOE is currently considering amending standards for seven types of products: water heaters, direct heating equipment, mobile home furnaces, pool heaters, room air conditioners, kitchen ranges and ovens (including microwave ovens), and fluorescent light ballasts and is considering establishing standards for television sets. This Technical Support Document presents the methodology, data, and results from the analysis of the energy and economic impacts of the proposed standards. This volume presents a general description of the analytic approach, including the structure of the major models.

  15. Finite-Volume Analysis for the Cahn-Hilliard equation with Dynamic boundary conditions

    OpenAIRE

    Nabet, Flore

    2014-01-01

    This work is devoted to the convergence analysis of a finite-volume approximation of the 2D Cahn-Hilliard equation with dynamic boundary conditions. The method that we propose couples a 2D finite-volume method in a bounded, smooth domain with a 1D finite-volume method on its boundary. We prove convergence of the sequence of approximate solutions.

  16. Health monitoring methodology based on exergetic analysis for building mechanical systems

    International Nuclear Information System (INIS)

    Exergetic analysis is not often performed in the context of retrocommissioning (RCX); this research provides insight into the benefits of incorporating this approach. Data collected from a previously developed RCX test for an air handling unit (AHU) on a college campus are used in an advanced thermodynamic analysis. The operating data is analyzed using the first and second laws and retrofit design solutions are recommended for improved system performance; the second law analysis is particularly helpful because it requires few additional calculations or data collections. The thermodynamic methodology is extended to a building's cooling plant, which uses a vapor compression refrigeration cycle (VCRC) chiller. Existing chiller data collected for the design of automated fault detection and diagnosis methodology is used. As with the AHU analysis, the second law analysis locates irreversibilities that would not be determined from a first law analysis alone. Plant data representing both normal and faulty operation is used to develop a chiller model for assessing performance and health monitoring. Data is analyzed to determine the viability of health monitoring by performing an exergy analysis on existing data. Conclusions are drawn about the usefulness of exergetic analysis for improving system operations of energy intensive building mechanical systems.

  17. Development of high pressure two-phase choked flow analysis methodology in complex piping system

    International Nuclear Information System (INIS)

    The choked flow mechanism, the characteristics of the two-phase sound velocity, and compressibility effects on flow through various piping system components are studied in order to develop an analysis methodology for high-pressure two-phase choked flow in complex piping systems, allowing evaluation of the choking flow rate and analyses related to piping system design. Flow in a pipe can be considered choked when the Mach number equals 1, and compressibility effects can be accounted for through a modified incompressible formulation of the momentum equation. Based on these findings, an overall analysis system is developed to study thermal-hydraulic effects on steady-state flow in piping systems, and future research items are presented. (Author)

  18. Yucca Mountain transportation routes: Preliminary characterization and risk analysis; Volume 2, Figures [and] Volume 3, Technical Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R. [Nevada Univ., Las Vegas, NV (United States). Transportation Research Center

    1991-05-31

    This report presents appendices related to the preliminary assessment and risk analysis for high-level radioactive waste transportation routes to the proposed Yucca Mountain Project repository. Information includes data on population density, traffic volume, ecologically sensitive areas, and accident history.

  19. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    Energy Technology Data Exchange (ETDEWEB)

    Reimund, Kevin K. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; McCutcheon, Jeffrey R. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; Wilson, Aaron D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-08-01

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. The method uses equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and river water. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1+√w⁻¹), which is lower than the maximum power density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum of 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at the "maximum power density operating pressure" requires at least 2.9 kmol. Using this methodology, it is possible to examine the effects of volumetric solution cost, operation of a module at various pressures, and operation of a constant-pressure module with various feeds.

  20. Path Constitution Analysis: A Methodology for Understanding Path Dependence and Path Creation

    Directory of Open Access Journals (Sweden)

    Jörg Sydow

    2012-11-01

    Although an increasing number of studies of technological, institutional and organizational change refer to the concepts of path dependence and path creation, few attempts have been made to consider these concepts explicitly in their methodological accounts. This paper addresses this gap and contributes to the literature by developing a comprehensive methodology that originates from the concepts of path dependence and path creation – path constitution analysis (PCA – and allows for the integration of multi-actor constellations on multiple levels of analysis within a process perspective. Based upon a longitudinal case study in the field of semiconductors, we illustrate PCA ‘in action’ as a template for other researchers and critically examine its adequacy. We conclude with implications for further path-oriented inquiries.

  1. Social representations, correspondence factor analysis and characterization questionnaire: a methodological contribution.

    Science.gov (United States)

    Lo Monaco, Grégory; Piermattéo, Anthony; Guimelli, Christian; Abric, Jean-Claude

    2012-11-01

    The characterization questionnaire is inspired by Q-sort methodologies (i.e. qualitative sorting). It consists in asking participants to give their opinion on a list of items by sorting them into categories depending on their level of characterization of the object. This technique yields distributions for each item and each response modality (i.e. characteristic vs. not chosen vs. not characteristic). This contribution analyzes these frequencies by means of correspondence factor analysis; its originality lies in the fact that this kind of analysis has never been used to process data collected by means of this questionnaire. The procedure is detailed and exemplified through two empirical studies on social representations of the good wine and the good supermarket. The interest of such a contribution is discussed from both a methodological point of view and an applications perspective. PMID:23156928
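
    A hedged numerical sketch of correspondence analysis applied to a characterization-questionnaire table (rows = items, columns = response modalities); the counts are invented, and the SVD-of-standardized-residuals formulation is the standard one, not necessarily the authors' software.

    ```python
    import numpy as np

    # Rows = items, columns = (characteristic, not chosen, not characteristic).
    N = np.array([[60, 25, 15],
                  [20, 30, 50],
                  [45, 35, 20],
                  [10, 25, 65]], dtype=float)   # invented counts

    P = N / N.sum()
    r, c = P.sum(axis=1), P.sum(axis=0)
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardized residuals
    U, sv, Vt = np.linalg.svd(S, full_matrices=False)
    row_coords = (U * sv) / np.sqrt(r)[:, None]          # item principal coordinates
    print("axis inertias:", np.round(sv**2 / (sv**2).sum(), 3))
    print("items on axis 1:", np.round(row_coords[:, 0], 2))
    ```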

  2. Methodological Approach for Performing Human Reliability and Error Analysis in Railway Transportation System

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2011-10-01

    Today, billions of dollars are spent annually worldwide to develop, manufacture, and operate transportation systems such as trains, ships, aircraft, and motor vehicles. Around 70 to 90 percent of transportation crashes are, directly or indirectly, the result of human error. In fact, with the development of technology, system reliability has increased dramatically during the past decades, while human reliability has remained unchanged over the same period. Accordingly, human error is now considered the most significant source of accidents or incidents in safety-critical systems. The aim of the paper is to propose a methodological approach to improve the reliability of transportation systems, in particular railway transportation systems. The methodology presented is based on Failure Modes, Effects and Criticality Analysis (FMECA) and Human Reliability Analysis (HRA).
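
    A minimal FMECA-style sketch: rank failure modes by risk priority number (RPN = severity × occurrence × detection). The railway failure modes and scores are invented examples, and the paper's HRA component is not represented.

    ```python
    # RPN = severity * occurrence * detection, each scored on a 1-10 scale.
    failure_modes = [
        ("signal passed at danger (driver error)", 9, 4, 6),
        ("wrong route set by dispatcher",          7, 3, 4),
        ("door closure check omitted",             5, 5, 3),
    ]
    for mode, s, o, d in sorted(failure_modes, key=lambda m: -m[1] * m[2] * m[3]):
        print(f"RPN {s * o * d:4d}  {mode}")
    ```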

  3. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology, I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, the industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting the business requirements and following the business strategy.

  4. Putting phylogeny into the analysis of biological traits: a methodological approach.

    Science.gov (United States)

    Jombart, Thibaut; Pavoine, Sandrine; Devillard, Sébastien; Pontier, Dominique

    2010-06-01

    Phylogenetic comparative methods have long considered phylogenetic signal as a source of statistical bias in the correlative analysis of biological traits. However, the main life-history strategies existing in a set of taxa are often combinations of life history traits that are inherently phylogenetically structured. In this paper, we present a method for identifying evolutionary strategies from large sets of biological traits, using phylogeny as a source of meaningful historical and ecological information. Our methodology extends a multivariate method developed for the analysis of spatial patterns, and relies on finding combinations of traits that are phylogenetically autocorrelated. Using extensive simulations, we show that our method efficiently uncovers phylogenetic structures with respect to various tree topologies, and remains powerful in cases where a large majority of traits are not phylogenetically structured. Our methodology is illustrated using empirical data, and implemented in the adephylo package for the free software R.
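
    A hedged sketch of the core quantity: Moran's I computed with a phylogenetic proximity matrix measures how strongly a trait (or a combination of traits) is phylogenetically autocorrelated. The proximities and trait values are invented; the adephylo package implements the real method in R.

    ```python
    import numpy as np

    def morans_i(x, W):
        z = x - x.mean()
        W = W / W.sum()                      # normalize proximities
        return len(x) * (z @ W @ z) / (z @ z)

    # Invented proximities (e.g. inverse patristic distances) for 5 taxa in two clades.
    W = np.array([[0, 4, 1, 1, 1],
                  [4, 0, 1, 1, 1],
                  [1, 1, 0, 4, 2],
                  [1, 1, 4, 0, 2],
                  [1, 1, 2, 2, 0]], dtype=float)
    trait = np.array([2.0, 2.2, 5.0, 5.1, 4.0])    # clade-structured trait
    print(f"Moran's I = {morans_i(trait, W):.2f}")  # > 0: phylogenetic autocorrelation
    ```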

  5. Novel Data Mining Methodologies for Adverse Drug Event Discovery and Analysis

    OpenAIRE

    Harpaz, Rave; DuMouchel, William; Shah, Nigam H; Madigan, David; Ryan, Patrick; Friedman, Carol

    2012-01-01

    Discovery of new adverse drug events (ADEs) in the post-approval period is an important goal of the health system. Data mining methods that can transform data into meaningful knowledge to inform patient safety have proven to be essential. New opportunities have emerged to harness data sources that have not been used within the traditional framework. This article provides an overview of recent methodological innovations and data sources used in support of ADE discovery and analysis.
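
    One classic signal-detection method from this literature, shown as a minimal sketch with invented counts, is the proportional reporting ratio computed from a 2×2 table of spontaneous reports.

    ```python
    # 2x2 table of spontaneous reports (invented counts):
    a, b = 40, 9960       # drug of interest: with event / without event
    c, d = 200, 199800    # all other drugs:  with event / without event

    prr = (a / (a + b)) / (c / (c + d))
    print(f"PRR = {prr:.1f}")   # PRR >> 1 flags a drug-event pair for review
    ```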

  6. Methodology of image analysis for study of the vertisols moisture content

    OpenAIRE

    Cumbrera Gonzalez, Ramiro Alberto; Milán Vega, Humberto; Tarquis Alfonso, Ana Maria

    2014-01-01

    The main problem in studying vertical drainage from the moisture distribution on a Vertisol profile is the search for suitable methods and procedures. Our aim was to design a digital image processing methodology and its analysis to characterize the moisture content distribution of a Vertisol profile. In this research, twelve soil pits were excavated on a bare Mazic Pellic Vertisol, six of them on May 13, 2011 and the rest on May 19, 2011, after a moderate rainfall event. Digi...

  7. A genetic analysis of brain volumes and IQ in children

    Science.gov (United States)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year-old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler Intelligence Scale for Children-III. Phenotypic…

  10. Volume component analysis for classification of LiDAR data

    Science.gov (United States)

    Varney, Nina M.; Asari, Vijayan K.

    2015-03-01

    One of the most difficult challenges of working with LiDAR data is the large number of data points that are produced. Analysing these large data sets is an extremely time-consuming process, and for this reason automatic perception of LiDAR scenes is a growing area of research. Currently, most LiDAR feature extraction relies on geometrical features specific to the point cloud of interest; these features are scene-specific and often rely on the scale and orientation of the object for classification. This paper proposes a robust method for reduced-dimensionality feature extraction of 3D objects using a volume component analysis (VCA) approach. The VCA approach is based on principal component analysis (PCA), a method of feature extraction that computes a covariance matrix from the original input vector; the eigenvectors corresponding to the largest eigenvalues of the covariance matrix are used to describe an image. Block-based PCA is an adapted method for feature extraction in facial images because PCA, when performed in local areas of the image, can extract more significant features than when the entire image is considered: the image space is split into several blocks, and PCA is computed individually for each block. VCA proposes that a LiDAR point cloud can be represented as a series of voxels whose values correspond to the point density at the corresponding location. From this voxelized space, block-based PCA is used to analyze sections of the space; the sections, when combined, represent features of the entire 3-D object. These features are then used as the input to a support vector machine trained to identify four classes of objects (vegetation, vehicles, buildings and barriers) with an overall accuracy of 93.8%.
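
    A hedged sketch of the pipeline as described: voxelize a point cloud into a density grid, split the grid into blocks, run PCA per block, and concatenate the projections as the SVM feature vector. The grid size, block size and random points are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    points = rng.uniform(0, 1, size=(5000, 3))        # stand-in for one LiDAR object

    grid, _ = np.histogramdd(points, bins=(8, 8, 8))  # voxel point densities
    # Split the 8x8x8 grid into 4x4x4 blocks of 2x2x2 voxels -> 64 blocks of 8 values.
    blocks = (grid.reshape(4, 2, 4, 2, 4, 2)
                  .transpose(0, 2, 4, 1, 3, 5).reshape(64, 8))

    def pca_project(X, k=2):
        Xc = X - X.mean(axis=0)
        vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
        return Xc @ vecs[:, -k:]                      # top-k principal components

    features = pca_project(blocks).ravel()            # feature vector for the SVM
    print(features.shape)
    ```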

  11. Methodological Aspects of Qualitative-Quantitative Analysis of Decision-Making Processes

    Directory of Open Access Journals (Sweden)

    Gawlik Remigiusz

    2016-06-01

    The paper aims at recognizing the possibilities and perspectives for the application of qualitative-quantitative research methodology in the field of economics, with a special focus on production engineering management processes. The main goal of the research is to define methods that would extend the research apparatus of economists and managers with tools that allow the inclusion of qualitative determinants in quantitative analysis. Such an approach is justified by the qualitative character of many determinants of economic phenomena. At the same time, the quantitative approach seems to be predominant in production engineering management, although methods for transposing qualitative decision criteria can be found in the literature. Nevertheless, international economics and management could profit from a mixed methodology incorporating both types of determinants into joint decision-making models. The research methodology consists of a literature review and the author's own analysis of the applicability of mixed qualitative-quantitative methods for managerial decision-making. The expected outcome of the research is to find which methods should be applied to include qualitative-quantitative analysis in multicriteria decision-making models in the field of economics, with special regard to production engineering management.

  12. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    This paper presents a model-based human error taxonomy and data collection scheme. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of nuclear power plant operating crews during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics, such as 'error in selection of problem-solving strategies'; and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics which extract generic characteristics of error-prone behaviors and error-prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to help define better symptom-based and other auxiliary procedures, with associated training, to minimize or preclude certain human errors. It also helps in the design of control rooms and in the assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  13. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    The aim of this article is to present a methodological model for analyzing students' group interaction to improve their essays in online learning environments based on asynchronous, written communication. In these environments, teacher and student scaffolds for discussion are essential to promote interaction, and one such scaffold is feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students use feedback to improve learning. This methodological model fills this gap, contributing to the analysis of how feedback processes are implemented while students discuss collaboratively in the specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship between written, asynchronous group interaction and students' activity and the changes incorporated into the final text. The proposed model includes the following dimensions: (1) student participation, (2) nature of student learning, and (3) quality of student learning. The main contribution of this article is to present the methodological model and to ascertain its operativity with regard to how students incorporate feedback into their essays.

  14. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository - Volume 3: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R. (INEEL); Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K. (SNL); Rath, J.S. (New Mexico Engineering Research Institute)

    1998-10-01

    The United States Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNF in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while it prepares for final compliance evaluation. This report presents results from the NDCA, which examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. The analysis investigated the potential for post-closure criticality, the consequences of a criticality excursion, and the frequency of post-closure criticality. The results of the NDCA are intended to provide DOE-EM with a technical basis for measuring risk that can be used in screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  15. Solar thermal technology development: Estimated market size and energy cost savings. Volume 2: Assumptions, methodology and results

    Science.gov (United States)

    Gates, W. R.

    1983-02-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Three fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. Solar thermal technology research and development (R&D) is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest. Analysis is also provided regarding two federal incentives currently in use: The Federal Business Energy Tax Credit and direct R&D funding.

  16. Network meta-analysis-highly attractive but more methodological research is needed

    Directory of Open Access Journals (Sweden)

    Singh Sonal

    2011-06-01

    Network meta-analysis, in the context of a systematic review, is a meta-analysis in which multiple treatments (that is, three or more) are compared using both direct comparisons of interventions within randomized controlled trials and indirect comparisons across trials based on a common comparator. To ensure validity of findings from network meta-analyses, the systematic review must be designed rigorously and conducted carefully. Aspects of designing and conducting a systematic review for network meta-analysis include defining the review question, specifying eligibility criteria, searching for and selecting studies, assessing risk of bias and quality of evidence, conducting the network meta-analysis, and interpreting and reporting findings. This commentary summarizes the methodological challenges and research opportunities for network meta-analysis relevant to each aspect of the systematic review process, based on discussions at a network meta-analysis methodology meeting we hosted in May 2010 at the Johns Hopkins Bloomberg School of Public Health. Since this commentary reflects the discussion at that meeting, it is not intended to provide an overview of the field.
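
    One standard building block for the indirect comparisons described above is the Bucher adjusted indirect comparison, sketched below in Python; the effect estimates and standard errors are invented for illustration.

    ```python
    import math

    def bucher_indirect(d_ac, se_ac, d_bc, se_bc):
        """Adjusted indirect comparison of A vs B via common comparator C:
        d_AB = d_AC - d_BC, with the variances adding (Bucher et al., 1997)."""
        d_ab = d_ac - d_bc
        se_ab = math.sqrt(se_ac**2 + se_bc**2)
        return d_ab, se_ab, (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)

    # Invented log odds ratios from two pairwise meta-analyses.
    d, se, (lo, hi) = bucher_indirect(d_ac=-0.40, se_ac=0.12,
                                      d_bc=-0.15, se_bc=0.10)
    print(f"indirect A vs B: {d:.2f} (SE {se:.2f}), 95% CI {lo:.2f} to {hi:.2f}")
    ```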

  17. TRAC-BF1/NEM stability methodology for BWR core wide and regional stability analysis

    International Nuclear Information System (INIS)

    A time-series analysis stability methodology is presented based on the TRAC-BF1/NEM coupled code. The methodology has potential application to BWR core-wide and regional stability studies, enabled by the 3D capabilities of the code. The stability analysis is performed at two different levels: using the TRAC-BF1 point kinetics model, and employing the three-dimensional neutronic transient capability of the NEM code. Point kinetics calculations show power fluctuations when white noise is applied to the inlet mass flow rate of each of the channel components. These fluctuations contain information about the system stability and are subsequently studied with time-series analysis methods. The analysis performed showed that the reactor core has a low-frequency resonance typical of BWRs. Analysis of preliminary three-dimensional calculations indicates that the power fluctuations do not contain the typical resonance at low frequency. This may be related to the limitation of representing the thermal-hydraulic (T-H) feedback through two-dimensional tables of the cross-sections needed for 3D kinetics calculations. The results suggest that a more accurate table look-up should be used, one that includes a three-dimensional representation of the feedback parameters (namely, average fuel temperature, average moderator temperature, and void fraction of the T-H cell of interest). Further research is being conducted on improving the cross-section modeling methodology used to feed the neutron kinetics code for both steady-state and transient cases. A comprehensive analysis of the code's transient solution is also being conducted to investigate the nature of the weak dependence of the power response on T-H variations during the performed 3D stability transient calculations.
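
    To illustrate the flavor of such time-series stability analysis, the following self-contained Python sketch drives a damped second-order system with white noise and then recovers the resonance frequency from the spectrum and a decay-ratio estimate from the autocorrelation; the system parameters are invented and the estimators are simplified stand-ins for the methods used with the coupled code.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    dt, n = 0.05, 8000                     # 0.05 s step, 400 s record
    f0, zeta = 0.5, 0.04                   # assumed BWR-like resonance
    w0 = 2 * np.pi * f0

    # Damped second-order system driven by white noise, a crude stand-in
    # for the flow-noise-driven power response.
    x = np.zeros(n)
    v = 0.0
    for k in range(1, n):
        a = -2 * zeta * w0 * v - w0**2 * x[k - 1] + 50.0 * rng.standard_normal()
        v += a * dt
        x[k] = x[k - 1] + v * dt

    # Resonance frequency from the spectral peak (DC bin excluded).
    psd = np.abs(np.fft.rfft(x - x.mean()))**2
    freqs = np.fft.rfftfreq(n, dt)
    print("resonance ~", round(float(freqs[psd[1:].argmax() + 1]), 2), "Hz")

    # Decay ratio from the ratio of successive autocorrelation maxima.
    xc = x - x.mean()
    ac = np.correlate(xc, xc, "full")[n - 1:]
    ac /= ac[0]
    period = int(1 / (f0 * dt))            # samples per oscillation (~40)
    p1 = ac[period // 2: 3 * period // 2].max()
    p2 = ac[3 * period // 2: 5 * period // 2].max()
    print("decay ratio ~", round(float(p2 / p1), 2))
    ```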

  18. Style, content and format guide for writing safety analysis documents. Volume 1, Safety analysis reports for DOE nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    1994-06-01

    The purpose of Volume 1 of this four-volume style guide is to furnish guidelines on writing and publishing Safety Analysis Reports (SARs) for DOE nuclear facilities at Sandia National Laboratories. The scope of Volume 1 encompasses not only general guidelines for writing and publishing, but also the prescribed contents of topics and appendices, along with examples from typical SARs for DOE nuclear facilities.

  19. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities must be carried out. In Brazil, such assessments are part of the Safety Analysis Reports required by CNEN (the Brazilian Nuclear Energy Commission) and of the Risk Analysis Studies required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the many tools that support the hazard analysis process are: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique, or a combination of techniques, depends on many factors, such as the motivation for the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the risks involved. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account these factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  20. Methodology for the analysis of fenbendazole and its metabolites in plasma, urine, feces, and tissue homogenates.

    Science.gov (United States)

    Barker, S A; Hsieh, L C; Short, C R

    1986-05-15

    New methodology is presented for the extraction and analysis of the anthelmintic fenbendazole and its metabolites from plasma, urine, liver homogenates, and feces from several animal species. Quantitation of fenbendazole and its metabolites was conducted by high-pressure liquid chromatography using ultraviolet detection at 290 nm. The combined extraction and analysis procedures give excellent recoveries in all of the different biological matrices examined. High specificity, low limits of detection, excellent linearity and accuracy, and low inter- and intrasample variability were also obtained. The study of fenbendazole pharmacokinetics in vitro and in vivo should be greatly enhanced through the use of these methods.

  1. Analysis of the processes in training groups: A methodological proposal and an empirical exemplification

    Directory of Open Access Journals (Sweden)

    Florinda Picone

    2014-09-01

    The authors propose a new methodology for group process analysis through a coding grid applied to the text of a group experience. The proposed method is constructed on the basis of the CCRT model elaborated by L. Luborsky and on some analytic categories suggested by Lieberman and Whitaker in their Focal Group Conflict Theory. The authors also present an empirical application of the method to the text of an analytic training group. Keywords: group process analysis, training group, empirical group research

  2. Sensitivity and uncertainty analysis within a methodology for evaluating environmental restoration technologies

    Science.gov (United States)

    Zio, Enrico; Apostolakis, George E.

    1999-03-01

    This paper illustrates an application of sensitivity and uncertainty analysis techniques within a methodology for evaluating environmental restoration technologies. The methodology consists of two main parts: the first part ("analysis") integrates a wide range of decision criteria and impact evaluation techniques in a framework that emphasizes and incorporates input from stakeholders in all aspects of the process. Its products are the rankings of the alternative options for each stakeholder using, essentially, expected utility theory. The second part ("deliberation") utilizes the analytical results of the "analysis" and attempts to develop consensus among the stakeholders in a session in which the stakeholders discuss and evaluate the analytical results. This paper deals with the analytical part of the approach and the uncertainty and sensitivity analyses that were carried out in preparation for the deliberative process. The objective of these investigations was to test the robustness of the assessments and to point out possible sources of disagreement among the participating stakeholders, thus providing insights for the subsequent deliberative process. Standard techniques, such as differential analysis, Monte Carlo sampling, and a two-dimensional policy region analysis, proved sufficient for the task.
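
    As a minimal illustration of Monte Carlo sampling combined with a sensitivity ranking, the following Python sketch samples uncertain inputs of an invented multi-attribute utility model and ranks them by rank correlation with the output; the distributions and weights are assumptions, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000
    # Invented uncertain inputs of a stakeholder utility model.
    cost     = rng.normal(10.0, 2.0, n)     # M$
    duration = rng.uniform(2.0, 6.0, n)     # years
    risk     = rng.beta(2, 8, n)            # residual-risk index

    # Illustrative multi-attribute (dis)utility with assumed weights.
    utility = -0.5 * cost - 0.3 * duration - 20.0 * risk

    def ranks(a):
        return np.argsort(np.argsort(a))

    # Spearman-style sensitivity ranking of inputs against the output.
    for name, x in [("cost", cost), ("duration", duration), ("risk", risk)]:
        r = np.corrcoef(ranks(x), ranks(utility))[0, 1]
        print(f"{name:9s} rank correlation with utility: {r:+.2f}")
    ```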

  3. A gap analysis methodology for collecting crop genepools: a case study with phaseolus beans.

    Directory of Open Access Journals (Sweden)

    Julián Ramírez-Villegas

    BACKGROUND: The wild relatives of crops represent a major source of valuable traits for crop improvement. These resources are threatened by habitat destruction, land use changes, and other factors, requiring their urgent collection and long-term availability for research and breeding from ex situ collections. We propose a method to identify gaps in ex situ collections (i.e., gap analysis) of crop wild relatives as a means to guide efficient and effective collecting activities. METHODOLOGY/PRINCIPAL FINDINGS: The methodology prioritizes among taxa based on a combination of sampling, geographic, and environmental gaps. We apply the gap analysis methodology to wild taxa of the Phaseolus genepool. Of 85 taxa, 48 (56.5%) are assigned high priority for collecting due to absence from, or under-representation in, genebanks; 17 taxa are given medium priority for collecting; 15 low priority; and 5 species are assessed as adequately represented in ex situ collections. Gap "hotspots", representing priority target areas for collecting, are concentrated in central Mexico, although the narrow endemic nature of a suite of priority species adds a number of specific additional regions to the spatial collecting priorities. CONCLUSIONS/SIGNIFICANCE: Results of the gap analysis method mostly align very well with expert opinion of gaps in ex situ collections, with only a few exceptions. A more detailed prioritization of taxa and geographic areas for collection can be achieved by including in the analysis predictive threat factors, such as climate change or habitat destruction, or by adding additional prioritization filters, such as the degree of relatedness to cultivated species (i.e., ease of use in crop breeding). Furthermore, results for multiple crop genepools may be overlaid, which would allow a global analysis of gaps in ex situ collections of the world's plant genetic resources.
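
    A minimal sketch of how the three gap components might be combined into a collecting priority is shown below in Python; the equal weighting, 0-10 scale and class thresholds are assumptions for illustration, not the paper's exact scoring rules.

    ```python
    # Combine the three gap components into a collecting priority. The 0-10
    # scale, equal weights and class thresholds are assumptions.
    def collecting_priority(sampling_gap, geographic_gap, environmental_gap):
        score = (sampling_gap + geographic_gap + environmental_gap) / 3.0
        if score >= 7:
            return score, "high priority for collecting"
        if score >= 5:
            return score, "medium priority"
        if score >= 3:
            return score, "low priority"
        return score, "adequately represented ex situ"

    print(collecting_priority(9.1, 7.4, 6.2))   # hypothetical wild taxon
    ```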

  4. A Feasibility Analysis Methodology for Decentralized Wastewater Systems - Energy-Efficiency and Cost.

    Science.gov (United States)

    Naik, Kartiki S; Stenstrom, Michael K

    2016-03-01

    Centralized wastewater treatment, widely practiced in developed areas, involves transporting wastewater from large urban areas to a large capacity plant using a single network of sewers, whereas decentralization is the concept of wastewater collection, treatment and reuse at or near its point of generation. Smaller decentralized plants can achieve extensive reclamation and wastewater management with energy-efficient reclaimed water pumping, modularized expansion and lower capital investment. We devised a methodology to preliminarily assess these alternatives using local constraints and conducted a feasibility analysis for each option. It addressed various scenarios using the pump-back energy consumption, sewer and treatment plant construction and capacity expansion cost. We demonstrated this methodology by applying it to the Hollywood vicinity (California). In this study, the decentralized configuration was more economical and energy-efficient than the centralized system. The pump-back energy consumption was about 50% of the aeration energy consumption for the centralized option.

  6. A new methodology to study customer electrocardiogram using RFM analysis and clustering

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Gholamian

    2011-04-01

    One of the primary issues in marketing planning is understanding customers' behavioral trends. A customer's purchasing interest may fluctuate for different reasons, and it is important to detect declining or increasing trends whenever they happen and to study these fluctuations in order to improve customer relationships. There are different methods to increase customers' willingness to purchase, such as planning good promotions, increasing advertisement, etc. This paper proposes a new methodology for measuring customers' behavioral trends, called the customer electrocardiogram. The proposed model uses the K-means clustering method with RFM (recency, frequency, monetary) analysis to study customer fluctuations over different time frames. We also apply the proposed electrocardiogram methodology to a real-world case study from the food industry, and the results are discussed in detail.
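
    A minimal Python sketch of the RFM-plus-clustering step follows, using scikit-learn's K-means on synthetic RFM data; the data, the number of clusters and the standardization choice are assumptions for illustration.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(42)
    # Synthetic RFM table: recency (days), frequency (orders), monetary ($).
    rfm = np.column_stack([
        rng.integers(1, 365, 200),
        rng.integers(1, 50, 200),
        rng.gamma(2.0, 150.0, 200),
    ])

    # Standardize so no single RFM attribute dominates the distances.
    X = StandardScaler().fit_transform(rfm)
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

    # The customer "electrocardiogram" would track a customer's cluster over
    # successive time frames; here we just report per-cluster RFM centroids.
    for c in range(4):
        print(c, rfm[labels == c].mean(axis=0).round(1))
    ```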

  7. Methodology for Design and Analysis of Reactive Distillation Involving Multielement Systems

    DEFF Research Database (Denmark)

    Jantharasuk, Amnart; Gani, Rafiqul; Górak, Andrzej;

    2011-01-01

    A new methodology for the design and analysis of reactive distillation has been developed. In this work, the element-based approach, coupled with a driving force diagram, has been extended and applied to the design of a reactive distillation column involving multielement (multicomponent) systems. The transformation of ordinary systems to element-based ones and the aggregation of non-key elements allow the important design parameters, such as the number of stages, the feed stage and the minimum reflux ratio, to be determined by using simple diagrams similar to those regularly employed for non-reactive systems consisting of two components. Based on this methodology, an optimal design configuration is identified using the equivalent binary-element driving force diagram. Two case studies, methyl acetate (MeOAc) synthesis and methyl tert-butyl ether (MTBE) synthesis, have been considered to demonstrate the methodology.

  9. A methodology for the analysis on a national scale of environmental parameters

    International Nuclear Information System (INIS)

    A methodology is described for identifying, across the Italian territory, areas suitable for the siting of nuclear power plants. The methodology is designed for the analysis of a wide territory and makes use of all the parameters available with a continuous character all over Italy (i.e., demographic and hydrographic data); consideration of the missing parameters is deferred to the study of the areas resulting from the first screening. The authors underline the usefulness of a territorial 'data bank', both for sorting out siting zones for nuclear power plants and for more generic environmental and health evaluations. Thematic charts derived from the elaboration of some of the parameters are also presented.

  10. The Nuclear Organization and Management Analysis Concept methodology: Four years later

    Energy Technology Data Exchange (ETDEWEB)

    Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

    1992-08-01

    The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology has been demonstrated at two commercial nuclear power plants (NPP) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

  12. A Review and Analysis on Mobile Application Development Processes using Agile Methodologies

    Directory of Open Access Journals (Sweden)

    Harleen K. Flora

    2013-07-01

    This paper provides a detailed review and analysis of the agile methodologies currently in use for the development of mobile applications and of the development processes proposed for them, and highlights their benefits and constraints. In addition, based on this analysis, future research needs are identified and discussed.

  13. The methodological fundaments of development state analysis of surface engineering technologies

    Directory of Open Access Journals (Sweden)

    A. Dobrzańska-Danikiewicz

    2010-06-01

    Purpose: The goal of this paper is to present the author's methodological fundaments for the development state analysis of surface engineering technologies against a background of the macro- and microenvironment. That analysis is carried out as part of the project entitled "The foresight of surface properties formation leading technologies of engineering materials and biomaterials". The research project, called FORSURF, is co-funded by the European Regional Development Fund. Design/methodology/approach: Foresight is the whole of the activity focused on choosing the best vision of the future and showing ways of realising that vision using the right methods. The approach called technology foresight is the process of bringing together scientists, engineers, industrialists, government officials and others in order to identify areas of strategic research and the leading technologies which in the long term will contribute to the greatest economic and social benefits and sustain industrial competitiveness. The FORSURF project belongs to the set of technology foresights. Findings: The set of crucial technologies in each considered research scope is the expected result of the development state analysis of surface engineering technologies against a background of the macro- and microenvironment; there are fourteen research scopes in the FORSURF project. Research limitations/implications: The results of the development state analysis of surface engineering technologies condition the subject matter of the first research iteration of the Delphi method carried out within the framework of the FORSURF project. The main research implication of the whole FORSURF project is the identification of strategic research directions crucial in the next 20 years in the field of surface engineering. Practical implications: The practical implication of the definition of the methodological fundaments of the development state analysis of surface engineering technology is to

  14. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hugo, Jacques [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  15. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    Energy Technology Data Exchange (ETDEWEB)

    Licu, Tony [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: antonio.licu@eurocontrol.int; Cioran, Florin [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: florin.cioran@eurocontrol.int; Hayward, Brent [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: bhayward@dedale.net; Lowe, Andrew [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: alowe@dedale.net

    2007-09-15

    The Systemic Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture, in which people are encouraged to provide full and open information about how incidents occurred and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, in which the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability.

  16. Effects of immersion on visual analysis of volume data.

    Science.gov (United States)

    Laha, Bireswar; Sensharma, Kriti; Schiffbauer, James D; Bowman, Doug A

    2012-04-01

    Volume visualization has been widely used for decades for analyzing datasets ranging from 3D medical images to seismic data to paleontological data. Many have proposed using immersive virtual reality (VR) systems to view volume visualizations, and there is anecdotal evidence of the benefits of VR for this purpose. However, there has been very little empirical research exploring the effects of higher levels of immersion for volume visualization, and it is not known how various components of immersion influence the effectiveness of visualization in VR. We conducted a controlled experiment in which we studied the independent and combined effects of three components of immersion (head tracking, field of regard, and stereoscopic rendering) on the effectiveness of visualization tasks with two x-ray microscopic computed tomography datasets. We report significant benefits of analyzing volume data in an environment involving those components of immersion. We find that the benefits do not necessarily require all three components simultaneously, and that the components have variable influence on different task categories. The results of our study improve our understanding of the effects of immersion on perceived and actual task performance, and provide guidance on the choice of display systems to designers seeking to maximize the effectiveness of volume visualization applications.

  17. Methodology for Analysis, Modeling and Simulation of Airport Gate-waiting Delays

    Science.gov (United States)

    Wang, Jianfeng

    This dissertation presents methodologies to estimate gate-waiting delays from historical data, to identify the functional causes of gate-waiting delay at major U.S. airports, and to evaluate the impact of gate operation disruptions and mitigation strategies on gate-waiting delay. Airport gates are a source of congestion in the air transportation system. When an arriving flight cannot pull into its gate, the delay it experiences is called gate-waiting delay. Some possible reasons for gate-waiting delay are: the gate is occupied, gate staff or equipment is unavailable, the weather prevents the use of the gate (e.g. lightning), or the airline has a preferred gate assignment. Gate-waiting delays potentially stay with the aircraft throughout the day (unless they are absorbed), adding costs to passengers and the airlines. As the volume of flights increases, ensuring that airport gates do not become a choke point of the system is critical. The first part of the dissertation presents a methodology for estimating gate-waiting delays based on historical, publicly available sources. Analysis of gate-waiting delays at major U.S. airports in the summer of 2007 identifies the following. (i) Gate-waiting delay is not a significant problem on the majority of days; however, the worst delay days (e.g. 4% of the days at LGA) are extreme outliers. (ii) The Atlanta International Airport (ATL), the John F. Kennedy International Airport (JFK), the Dallas/Fort Worth International Airport (DFW) and the Philadelphia International Airport (PHL) experience the highest gate-waiting delays among major U.S. airports. (iii) There is a significant gate-waiting-delay difference between airlines due to disproportionate gate allocation. (iv) Gate-waiting delay is sensitive to the time of day and to schedule peaks. According to basic principles of queueing theory, gate-waiting delay can be attributed to over-scheduling, a higher-than-scheduled arrival rate, longer-than-scheduled gate-occupancy time, and reduced gate availability.
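
    The queueing-theory attribution at the end can be made concrete with a toy simulation: flights arrive at random and wait for the first free gate. The Python sketch below uses invented arrival and occupancy parameters, not the dissertation's data or model.

    ```python
    import heapq
    import numpy as np

    rng = np.random.default_rng(7)

    def gate_waiting(n_flights, n_gates, mean_gap_min, mean_occupancy_min):
        """Toy queueing model: flights arrive at exponential intervals and
        take the first gate to free up; returns per-flight waits (minutes)."""
        arrivals = np.cumsum(rng.exponential(mean_gap_min, n_flights))
        free_at = [0.0] * n_gates          # min-heap of gate release times
        heapq.heapify(free_at)
        delays = []
        for t in arrivals:
            gate_free = heapq.heappop(free_at)
            start = max(t, gate_free)      # wait whenever all gates are busy
            delays.append(start - t)
            heapq.heappush(free_at, start + rng.exponential(mean_occupancy_min))
        return np.array(delays)

    d = gate_waiting(n_flights=500, n_gates=12,
                     mean_gap_min=2.0, mean_occupancy_min=22.0)
    print(f"mean wait {d.mean():.1f} min, "
          f"95th percentile {np.percentile(d, 95):.1f} min")
    ```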

  18. Symbolic Dynamics Analysis: a new methodology for foetal heart rate variability analysis

    OpenAIRE

    Improta, Giovanni

    2015-01-01

    Cardiotocography (CTG) is a widespread foetal diagnostic method. However, it lacks objectivity and reproducibility, owing to its dependence on the observer's expertise. To overcome these limitations, more objective methods for CTG interpretation have been proposed. In particular, many of the developed techniques aim to assess the foetal heart rate variability (FHRV). Among them, some methodologies from nonlinear systems theory have been applied to the study of FHRV. All the techniques have proved to be ...
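
    To give a flavor of the symbolic dynamics approach, the following Python sketch quantizes a synthetic RR-interval series into symbols and computes the Shannon entropy of short symbol words; the quantization scheme, word length and test signal are assumptions for illustration, not the thesis's exact algorithm.

    ```python
    import numpy as np

    def symbolic_word_entropy(rr_ms, n_symbols=4, word_len=3):
        """Quantize the RR series into n_symbols uniform bins between its
        min and max, then return the Shannon entropy (bits) of the
        distribution of consecutive symbol words of length word_len."""
        rr = np.asarray(rr_ms, dtype=float)
        edges = np.linspace(rr.min(), rr.max(), n_symbols + 1)[1:-1]
        symbols = np.digitize(rr, edges)           # values in 0..n_symbols-1
        words = np.array([symbols[i:i + word_len]
                          for i in range(len(symbols) - word_len + 1)])
        _, counts = np.unique(words, axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    rng = np.random.default_rng(3)
    rr = 420 + 15 * np.sin(np.arange(600) / 8.0) + rng.normal(0, 5, 600)
    print(f"word entropy: {symbolic_word_entropy(rr):.2f} bits")
    ```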

  19. Determination of fiber volume in graphite/epoxy materials using computer image analysis

    Science.gov (United States)

    Viens, Michael J.

    1990-01-01

    The fiber volume of graphite/epoxy specimens was determined by analyzing optical images of cross sectioned specimens using image analysis software. Test specimens were mounted and polished using standard metallographic techniques and examined at 1000 times magnification. Fiber volume determined using the optical imaging agreed well with values determined using the standard acid digestion technique. The results were found to agree within 5 percent over a fiber volume range of 45 to 70 percent. The error observed is believed to arise from fiber volume variations within the graphite/epoxy panels themselves. The determination of ply orientation using image analysis techniques is also addressed.
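
    The core computation, estimating fiber volume as the area fraction of fiber-phase pixels, can be sketched in a few lines of Python; the synthetic image and the simple iterative threshold are illustrative stand-ins for the metallographic images and image analysis software used in the study.

    ```python
    import numpy as np

    def fiber_volume_fraction(gray, threshold=None):
        """Estimate fiber volume as the area fraction of bright (fiber)
        pixels; without a threshold, an isodata-style iteration splits the
        two phases at the midpoint of their means."""
        g = np.asarray(gray, dtype=float)
        if threshold is None:
            t = g.mean()
            for _ in range(50):
                t_new = 0.5 * (g[g > t].mean() + g[g <= t].mean())
                if abs(t_new - t) < 1e-3:
                    break
                t = t_new
            threshold = t
        return (g > threshold).mean()

    # Synthetic two-phase cross-section: dark epoxy ~60, bright fibers ~200.
    rng = np.random.default_rng(0)
    img = rng.choice([60.0, 200.0], size=(512, 512), p=[0.4, 0.6])
    img += rng.normal(0, 10, img.shape)
    print(f"estimated fiber volume: {fiber_volume_fraction(img):.1%}")
    ```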

  20. A methodology for the analysis and improvement of a firm's competitiveness

    Directory of Open Access Journals (Sweden)

    Jose Celso Contador

    2006-01-01

    This paper presents a new methodology for the analysis of a group of companies, aiming at explaining and increasing a firm's competitiveness. Based on the model of the fields and weapons of the competition, the methodology distinguishes between business and operational competitive strategies. The first consists of some of the 15 fields of the competition, and the latter consists of the weapons of the competition. Competitiveness is explained through the application of several mathematical variables. The influence of the competitive strategies is statistically evaluated using the Wilcoxon-Mann-Whitney non-parametric test, the t-test, and Pearson's correlation. The methodology was applied to companies belonging to the textile pole of Americana; one of the conclusions reached is that what explains competitiveness is the operational strategy rather than the business strategy. Therefore, to improve competitiveness, a company must intensify its focus on weapons that are relevant to the fields where it has decided to compete.

  1. Application of fault tree methodology in the risk analysis of complex systems

    International Nuclear Information System (INIS)

    This study describes the fault tree methodology and applies it to the risk assessment of complex facilities. The description of the methodology attempts to provide all the pertinent basic information, pointing out its more important aspects, such as fault tree construction, evaluation techniques, and their use in the risk and reliability assessment of a system. In view of their importance, topics like common-mode failures, human errors, the databases used in the calculations, and the uncertainty evaluation of the results are discussed separately, each in its own chapter. For the purpose of applying the methodology, it was necessary to implement computer codes normally used for this kind of analysis. The computer codes PREP, KITT and SAMPLE, written in FORTRAN IV, were chosen due to their availability and to the fact that they have been used in important studies of the nuclear area, such as WASH-1400. With these codes, the probability of occurrence of excessive pressure in the main system of the component test loop (CTC) at CDTN was evaluated. (Author)
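
    As a minimal illustration of the quantitative side of the methodology, the following Python sketch evaluates a small two-level fault tree with the standard AND/OR gate algebra for independent basic events; the event names and probabilities are invented and are not CTC loop data.

    ```python
    from functools import reduce

    def p_and(probs):                 # all inputs must occur
        return reduce(lambda a, b: a * b, probs, 1.0)

    def p_or(probs):                  # at least one input occurs
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

    # Invented basic-event probabilities (per demand).
    relief_valve_fails    = 1.0e-3
    pressure_sensor_fails = 5.0e-4
    operator_misses_alarm = 1.0e-2

    # Top event "excessive pressure": the relief valve fails AND the
    # detection/mitigation path fails (sensor fails OR operator misses alarm).
    top = p_and([relief_valve_fails,
                 p_or([pressure_sensor_fails, operator_misses_alarm])])
    print(f"top-event probability ~ {top:.2e}")    # ~1.05e-05
    ```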

  2. Multiscale Entropy Analysis of Center-of-Pressure Dynamics in Human Postural Control: Methodological Considerations

    Directory of Open Access Journals (Sweden)

    Brian J. Gow

    2015-11-01

    Multiscale entropy (MSE) is a widely used metric for characterizing the nonlinear dynamics of physiological processes. Significant variability, however, exists in the methodological approaches to MSE, which may ultimately impact results and their interpretations. Using publications focused on balance-related center of pressure (COP) dynamics, we highlight sources of methodological heterogeneity that can impact study findings. Seventeen studies were systematically identified that employed MSE for characterizing COP displacement dynamics. We identified five key methodological procedures that varied significantly between studies: (1) data length; (2) frequencies of the COP dynamics analyzed; (3) sampling rate; (4) point-matching tolerance and sequence length; and (5) filtering of displacement changes from drifts, fidgets, and shifts. We discuss the strengths and limitations of the various approaches employed and supply flowcharts to assist in the decision-making process regarding each of these procedures. Our guidelines are intended to more broadly inform the design and analysis of future studies employing MSE for continuous time series, such as COP.
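
    A compact Python sketch of the MSE computation itself follows: the series is coarse-grained at each scale and a simplified sample entropy is computed, with the point-matching tolerance r and sequence length m exposed, since those are exactly two of the heterogeneity sources discussed; the test series and parameter values are illustrative.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r_frac=0.15):
        """Simplified SampEn(m, r): -log of the ratio of (m+1)-point to
        m-point template matches, with tolerance r = r_frac * std. Whether
        r is set from the raw or the coarse-grained series is itself one of
        the methodological choices that vary between studies."""
        x = np.asarray(x, dtype=float)
        r = r_frac * x.std()

        def matches(mm):
            t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)
            return (d <= r).sum() - len(t)     # exclude self-matches

        return -np.log(matches(m + 1) / matches(m))

    def multiscale_entropy(x, scales=range(1, 6), m=2):
        """Coarse-grain by non-overlapping averaging, then apply SampEn."""
        out = []
        for s in scales:
            n = len(x) // s
            cg = np.asarray(x[:n * s]).reshape(n, s).mean(axis=1)
            out.append(sample_entropy(cg, m))
        return out

    rng = np.random.default_rng(0)
    cop_velocity = rng.normal(0, 1, 1200)      # toy stand-in for COP changes
    print([round(v, 2) for v in multiscale_entropy(cop_velocity)])
    ```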

  3. Uncertainty quantification methodology development for the best-estimate safety analysis

    International Nuclear Information System (INIS)

    This study deals with two approaches to uncertainty quantification methodology. In the first approach, an uncertainty quantification methodology is proposed and applied to the estimation of nuclear reactor fuel peak cladding temperature (PCT) uncertainty. The proposed method adopts Latin hypercube sampling (LHS). The independence of the input variables is verified through a correlation coefficient test, and the uncertainty of the output variables is estimated through a goodness-of-fit test on the sample data. In the application, the approach taken to quantifying the total mean and the total 95% probability PCTs is given. Emphasis is placed upon PCT uncertainty estimation due to model or correlation uncertainties, under the assumption that the significant sources of PCT uncertainty have been determined. In the second approach, an uncertainty quantification methodology is proposed for severe accident analysis, which involves large uncertainties. The proposed method adopts the concept of the probabilistic belief measure to transform an analyst's belief about a top event into the equivalent probability of that top event. For the purpose of comparison, analyses are done by (1) applying probability theory, treating the occurrence probability of the top event as a physical probability or a frequency; (2) applying fuzzy set theory with a fuzzy-numbered occurrence probability of the top event; and (3) transforming the analyst's belief about the top event into an equivalent probability by the probabilistic belief measure method.
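
    A minimal Python sketch of the LHS-based first approach follows: stratified uniform scores are mapped onto assumed model-uncertainty distributions, checked for independence via pairwise correlations, and propagated through an invented linear PCT surrogate (in a real application each sample would drive a thermal-hydraulics code run).

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2024)

    def latin_hypercube(n, dims):
        """n stratified samples in [0, 1]^dims: one point per equal-width
        stratum in each dimension, with strata shuffled per column."""
        u = (rng.random((n, dims)) + np.arange(n)[:, None]) / n
        for j in range(dims):
            u[:, j] = rng.permutation(u[:, j])
        return u

    # Assumed uncertain multipliers on three models/correlations (purely
    # illustrative, not licensing data), mapped from uniform LHS scores.
    u = latin_hypercube(100, 3)
    gap_conductance = norm.ppf(u[:, 0], loc=1.0, scale=0.08)
    chf_correlation = norm.ppf(u[:, 1], loc=1.0, scale=0.12)
    decay_heat      = norm.ppf(u[:, 2], loc=1.0, scale=0.03)

    # Independence check in the spirit of the correlation coefficient test.
    off_diag = np.abs(np.corrcoef(u.T) - np.eye(3)).max()
    print(f"max |pairwise correlation|: {off_diag:.2f}")

    # Invented linear PCT surrogate (kelvin).
    pct = (900 + 120 * (gap_conductance - 1) + 200 * (chf_correlation - 1)
           + 350 * (decay_heat - 1))
    print(f"mean PCT {pct.mean():.0f} K, "
          f"95th percentile {np.percentile(pct, 95):.0f} K")
    ```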

  4. Simulation Methodology for Analysis of Substrate Noise Impact on Analog / RF Circuits Including Interconnect Resistance

    CERN Document Server

    Soens, C; Wambacq, P; Donnay, S

    2011-01-01

    This paper reports a novel simulation methodology for the analysis and prediction of substrate noise impact on analog/RF circuits, taking into account the role of the parasitic resistance of the on-chip interconnect in the impact mechanism. The methodology allows investigation of the role of the individual devices (including parasitic devices) of the analog/RF circuit in the overall impact. In this way, it is revealed which devices have to be taken care of (shielding, topology change) to protect the circuit against substrate noise. The developed methodology is used to analyze the impact of substrate noise on a 3 GHz LC-tank voltage controlled oscillator (VCO) designed in a high-ohmic 0.18 µm 1PM6 CMOS technology. For this VCO (in the investigated frequency range from DC to 15 MHz), impact is mainly caused by resistive coupling of noise from the substrate to the non-ideal on-chip ground interconnect, resulting in analog ground bounce and frequency modulation. Hence, the presented test-case reveals the important role of the o...

  5. Time-domain analysis methodology for large-scale RLC circuits and its applications

    Institute of Scientific and Technical Information of China (English)

    LUO Zuying; CAI Yici; Sheldon X.-D Tan; HONG Xianlong; WANG Xiaoyi; PAN Zhu; FU Jingjing

    2006-01-01

    With soaring operating frequencies and decreasing feature sizes, VLSI circuits with RLC parasitic components behave more like analog circuits and should be carefully analyzed in physical design. However, the number of extracted RLC components is typically too large to be analyzed efficiently using present analog circuit simulators like SPICE. In order to speed up the simulations without an error penalty, this paper proposes a novel methodology to compress the time-discretized circuits resulting from numerical integration approximation at every time step. The main contribution of the methodology is the efficient structure-level compression of DC circuits containing many current sources, which is an important complement to present circuit analysis theory. The methodology consists of the following parts: 1) an approach is proposed to delete all intermediate nodes of RL branches; 2) an efficient approach is proposed to compress and back-solve parallel and serial branches, so that analyzing circuits of tree topology is error-free and of linear complexity; 3) the Y-Δ transformation method is used to reduce and back-solve the intermediate nodes of ladder circuits without error and with linear complexity. Thus, the whole simulation method is very accurate and of linear complexity for analyzing circuits of chain topology. Based on the methodology, we propose several novel algorithms for efficiently solving RLC-model transient power/ground (P/G) networks. Among them, the EQU-ADI algorithm of linear complexity is proposed to solve RLC P/G networks with mesh-tree or mesh-chain topologies. Experimental results show that the proposed method is at least two orders of magnitude faster than SPICE while scaling linearly in both time and memory complexity to solve very large P/G networks.

  6. A Methodology to Evaluate Object-oriented Software Systems Using Change Requirement Traceability Based on Impact Analysis

    Directory of Open Access Journals (Sweden)

    Sunil T. D

    2014-06-01

    It is a well-known fact that software maintenance plays a major role in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change-requirement-traceability-based impact analysis methodology for non-functional requirements, using functional requirements. The major issues are related to change impact algorithms and the inheritance of functionality.
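
    The traceability-based impact analysis can be sketched as a graph walk: starting from a changed requirement, collect everything transitively linked to it. The Python example below uses an invented traceability graph for illustration.

    ```python
    from collections import deque

    # Invented traceability graph: artifact -> artifacts that depend on it.
    depends_on_me = {
        "REQ-security": ["AuthService"],
        "AuthService": ["LoginController", "SessionCache"],
        "SessionCache": ["ReportModule"],
        "LoginController": [],
        "ReportModule": [],
    }

    def impact_set(changed):
        """Breadth-first walk over traceability links, collecting every
        artifact transitively affected by the change request."""
        seen, queue = set(), deque([changed])
        while queue:
            node = queue.popleft()
            for dep in depends_on_me.get(node, []):
                if dep not in seen:
                    seen.add(dep)
                    queue.append(dep)
        return seen

    print(sorted(impact_set("REQ-security")))
    # ['AuthService', 'LoginController', 'ReportModule', 'SessionCache']
    ```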

  7. Performance Evaluation and Measurement of the Organization in Strategic Analysis and Control: Methodological Aspects

    Directory of Open Access Journals (Sweden)

    Živan Ristić

    2006-12-01

    Information acquired by measurement and evaluation is a necessary condition for good decision-making in strategic management. This work deals with: (a) methodological aspects of evaluation (kinds of evaluation, metaevaluation) and measurement (the supposition of isomorphism in measurement, kinds and levels of measurement, errors in measurement, and the basic characteristics of measurement); (b) evaluation and measurement of the potential and accomplishments of the organization in the Kaplan-Norton perspectives (the learning and development perspective, the internal processes perspective, the customer/user perspective, and the financial perspective); and (c) systems and IT solutions for evaluating and measuring the performance of the organization in strategic analysis and control.

  8. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article, and fault tree analysis, deemed to be one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in a nuclear power plant is given to illustrate the proposed method.
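
    For a series system whose OR-gated fault tree gives the system failure rate as the sum of the subsystem rates, a simple proportional (ARINC-style) apportionment illustrates the flavor of failure-rate allocation; the subsystem names and rates below are invented, not the paper's data.

    ```python
    # Series system: the OR-gated fault tree gives lambda_sys ~ sum(lambda_i),
    # so a target system failure rate can be apportioned in proportion to each
    # subsystem's current contribution (ARINC-style). Invented data below.
    current = {"fuel rack": 2.0e-4, "governor": 1.2e-4, "cooling": 0.8e-4}  # /h
    target_system = 2.5e-4                                                  # /h

    total = sum(current.values())
    allocated = {name: lam / total * target_system
                 for name, lam in current.items()}

    for name, lam in allocated.items():
        print(f"{name:9s} current {current[name]:.1e} /h "
              f"-> allocated {lam:.1e} /h")
    print(f"allocated sum: {sum(allocated.values()):.1e} /h")
    ```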

  9. Refined multi-level methodology in parallel computing environment for BWR RIA analysis

    Science.gov (United States)

    Solis-Rodarte, Jorge

    2000-12-01

    Best-estimate methodologies for boiling water reactors can reduce the traditionally conservative thermal margins imposed on the designs and during the operation of this type of nuclear reactor. Traditional operating thermal margins are obtained with simplified modeling techniques that are supplemented with the required dose of conservatism. Much more realistic transient pin peaking distributions, for instance, can be predicted by applying a dehomogenization algorithm based on a flux reconstruction scheme that uses nodal results at each time step of both steady-state and transient calculations. A subchannel analysis module for obtaining thermal margins supplements the calculation approach used. A multi-level methodology has been implemented to extend the TRAC-BF1/NEM coupled code capability to obtain the transient fuel rod response. To fulfill the development requirements, some improved neutronic models were implemented in the NEM solution algorithm, namely the pin power reconstruction capability and the simulation of a dynamic scram. The obtained results were coupled to a subchannel analysis module, the COBRA-TF thermal-hydraulic (T-H) subchannel analysis code. The objective of the pin power reconstruction capability of NEM is to locate the most limiting node (axial region of an assembly/channel) within the core. The power information obtained from the NEM 3D neutronic calculation is used by the hot channel analysis module (COBRA-TF). COBRA-TF also needs the T-H conditions at the boundary nodes; this information is provided by the TRAC-BF1 T-H system analysis code. The subchannel analysis module uses this information to recalculate the fluid and thermal-hydraulic conditions in the most limiting node within the core.

  10. Automated segmentation and dose-volume analysis with DICOMautomaton

    Science.gov (United States)

    Clark, H.; Thomas, S.; Moiseenko, V.; Lee, R.; Gill, B.; Duzenli, C.; Wu, J.

    2014-03-01

    Purpose: Exploration of historical data for regional organ dose sensitivity is limited by the effort needed to (sub-)segment large numbers of contours. A system has been developed which can rapidly perform autonomous contour sub-segmentation and generic dose-volume computations, substantially reducing the effort required for exploratory analyses. Methods: A contour-centric approach is taken which enables lossless, reversible segmentation and dramatically reduces computation time compared with voxel-centric approaches. Segmentation can be specified on a per-contour, per-organ, or per-patient basis, and can be performed along either an embedded plane or in terms of the contour's bounds (e.g., split organ into fractional-volume/dose pieces along any 3D unit vector). More complex segmentation techniques are available. Anonymized data from 60 head-and-neck cancer patients were used to compare dose-volume computations with Varian's Eclipse™ (Varian Medical Systems, Inc.). Results: Mean doses and dose-volume histograms computed agree strongly with Varian's Eclipse™. Contours which have been segmented can be injected back into patient data permanently and in a Digital Imaging and Communications in Medicine (DICOM)-conforming manner. Lossless segmentation persists across such injection, and remains fully reversible. Conclusions: DICOMautomaton allows researchers to rapidly, accurately, and autonomously segment large amounts of data into intricate structures suitable for analyses of regional organ dose sensitivity.

  11. The API methodology for risk-based inspection (RBI) analysis for the petroleum and petrochemical industry

    International Nuclear Information System (INIS)

    Twenty-one petroleum and petrochemical companies are currently sponsoring a project within the American Petroleum Institute (API) to develop a risk-based inspection (RBI) methodology for application in the refining and petrochemical industry. This paper describes that RBI methodology and provides a summary of the three levels of RBI analysis developed by the project. Also included is a review of the first pilot project to validate the methodology by applying RBI to several existing refining units. The failure of pressure equipment in a process unit can have several undesirable effects. For the purpose of RBI analysis, the API RBI program categorizes these effects into four basic risk outcomes: flammable events, toxic releases, major environmental damage, and business interruption losses. API RBI is a strategic process, both qualitative and quantitative, for understanding and reducing the risks associated with operating pressure equipment. This paper shows how API RBI assesses the potential consequences of a failure of the pressure boundary, as well as the likelihood (probability) of failure. Risk-based inspection also prioritizes risk levels in a systematic manner, so that the owner-user can plan an inspection program that focuses more resources on the higher-risk equipment, while possibly saving inspection resources where they are not doing an effective job of reducing risk. At the same time, if the consequence of failure is a significant driving force for high-risk equipment items, plant management also has the option of applying consequence-mitigation steps to minimize the impact of a hazardous release, should one occur. The target audience for this paper is engineers, inspectors, and managers who want to understand what API risk-based inspection is about, what the benefits and limitations of RBI are, and how inspection practices can be changed to reduce risks and/or save costs without impacting safety. (Author)
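
    The qualitative level of such an analysis is often expressed as a likelihood-consequence risk matrix; the short Python sketch below ranks invented equipment items this way, with category values and band thresholds chosen purely for illustration.

    ```python
    # Toy risk-based inspection screening: risk = likelihood x consequence on
    # a 5x5 matrix. Items, category values and band cut-offs are invented.
    equipment = [
        ("V-101 separator",    4, 5),
        ("E-204 exchanger",    2, 4),
        ("P-310 pump casing",  3, 2),
        ("T-115 storage tank", 1, 5),
    ]

    def risk_band(score):
        return "HIGH" if score >= 15 else "MEDIUM" if score >= 8 else "LOW"

    for score, tag, lik, con in sorted(
            ((lik * con, tag, lik, con) for tag, lik, con in equipment),
            reverse=True):
        print(f"{tag:18s} L={lik} C={con} risk={score:2d} -> {risk_band(score)}")
    ```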

  12. Criticality Safety Evaluation of a Swiss wet storage pool using a global uncertainty analysis methodology

    International Nuclear Information System (INIS)

    Highlights: • Uncertainty evaluation of manufacturing tolerances is relevant for wet storage pools. • Analyses based on global modelling can overestimate the conservatism. • It is important to perturb elements of the storage racks independently. • PDF sets based on conventional assumptions can be overly conservative. • We suggest caution in applying the standard approach when special PDFs are involved. - Abstract: Uncertainty quantification is a key component in the Criticality Safety Evaluation (CSE) of spent nuclear fuel systems. An important source of uncertainty is manufacturing and technological parameter tolerances. In this work, this class of uncertainties is evaluated for a Swiss wet storage pool. The selected configuration corresponds to a case where the target criticality eigenvalue is close to the upper criticality safety limits. Although current PSI CSE safety criteria are fulfilled, it is reasonable to apply uncertainty quantification methodologies in order to provide the regulatory authorities with additional information relevant to safety evaluations. The MTUQ (Manufacturing and Technological Uncertainty Quantification) methodology, based on global stochastic sampling, was the selected tool for the analysis. This tool is specifically designed for the treatment of geometrical/material uncertainties for any target system. In particular, the advanced modelling capability of MTUQ allows the implementation of realistic boundary conditions, with a resulting detailed evaluation of the statistical quantities of interest in CSE. The computational code employed is the MCNP Monte Carlo neutron transport code. The analysis showed the benefits of using realistic modelling compared to the traditional one-factor-at-a-time methodology applied to systems modelled using repeated structures. A detailed comparison between the two approaches is also presented. Finally, the role of asymmetrical probability distributions is discussed.

  13. Optimization of coagulation-flocculation process for pulp and paper mill effluent by response surface methodological analysis.

    Science.gov (United States)

    Ahmad, A L; Wong, S S; Teng, T T; Zuhairi, A

    2007-06-25

    Coagulation-flocculation is a proven technique for the treatment of wastewater with high suspended solids. In this study, a central composite face-centered design (CCFD) and response surface methodology (RSM) have been applied to optimize the two most important operating variables, coagulant dosage and pH, in the coagulation-flocculation process for pulp and paper mill wastewater treatment. High total suspended solids (TSS) removal, a low sludge volume index (SVI) and high water recovery of the treated wastewater are the main objectives to be achieved through the coagulation-flocculation process. The effects of the interaction between coagulant dosage and pH on TSS removal and SVI are significant, whereas there is no interaction effect on water recovery. Quadratic models have been developed for the response variables, i.e. TSS removal, SVI and water recovery, based on the high coefficient of determination (R²) values of >0.99 obtained from the analysis of variance (ANOVA). The optimum conditions for coagulant dosage and pH are 1045 mg L⁻¹ and 6.75, respectively, where 99% TSS removal, an SVI of 37 mL g⁻¹ and 82% water recovery can be obtained.
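
    The RSM step can be sketched compactly in Python: fit a full quadratic model to face-centered design points by least squares and locate the stationary point of the fitted surface; the design responses below are invented, not the paper's measurements.

    ```python
    import numpy as np

    # Face-centered design points in coded units (-1, 0, +1) for coagulant
    # dosage (x1) and pH (x2); the TSS-removal responses are invented.
    X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                  [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], dtype=float)
    y = np.array([78.0, 90.0, 84.0, 88.0, 86.0, 95.0, 88.0, 92.0, 98.0])

    # Full quadratic model:
    # b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1**2 + b22*x2**2
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)

    # Stationary point of the fitted surface (candidate optimum, coded units).
    b1, b2, b12, b11, b22 = beta[1:]
    H = np.array([[2 * b11, b12], [b12, 2 * b22]])
    optimum = np.linalg.solve(H, -np.array([b1, b2]))
    print("coefficients:", beta.round(2))
    print("stationary point (coded units):", optimum.round(2))
    ```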

  14. Qualitative data analysis using the NVivo program and the application of the methodology of grounded theory procedures

    Directory of Open Access Journals (Sweden)

    Niedbalski Jakub

    2012-02-01

    The main aim of the article is to identify the capabilities and constraints of using CAQDAS (Computer-Assisted Qualitative Data Analysis Software) programs in qualitative data analysis. Our considerations are based on personal experience gained while conducting research projects using the methodology of grounded theory (GT) and the NVivo 8 program. In the presented article we focus on the relations between the methodological principles of grounded theory and the technical possibilities of NVivo 8. The paper presents our opinion about the most important options available in NVivo 8 and their application in studies based on the methodology of grounded theory.

  15. Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor

    Science.gov (United States)

    Mkhabela, Peter Tshepo

    The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (based on neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be an HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding since it requires proper treatment of both slow and much longer transients (with time scales of hours and days) and fast and short transients (with time scales of minutes and seconds). There is limited operational and experimental data available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study contributed to greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and the various multi-physics effects that can influence it. Feedback due to the influence of leakage was taken into account by developing and implementing improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate, with proper interpolation from cross-section tables. To achieve this, the NEM/THERMIX coupled code system was developed, a system that is efficient and stable over transient calculations lasting several tens of hours. Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis
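
    The neutronics/thermal-hydraulics coupling described above is, at its core, a fixed-point iteration between two solvers. The toy sketch below illustrates only that pattern; the feedback coefficients and solver stand-ins are invented and bear no relation to NEM/THERMIX or the PBMR.

        # Toy Picard (fixed-point) iteration between a "neutronics" solve and a
        # "thermal-hydraulics" solve with temperature feedback; coefficients invented.
        def neutronics_power(fuel_temp_K):
            # Negative temperature feedback: power falls as fuel temperature rises
            return 400.0 * (1.0 - 2.0e-5 * (fuel_temp_K - 900.0))  # MW

        def thermal_solve(power_MW):
            # Fuel temperature rises with power
            return 600.0 + 1.2 * power_MW  # K

        power, temp = 400.0, 900.0
        for it in range(1, 101):
            new_power = neutronics_power(temp)
            new_temp = thermal_solve(new_power)
            converged = abs(new_power - power) < 1e-8 and abs(new_temp - temp) < 1e-8
            power, temp = new_power, new_temp
            if converged:
                break
        print(f"converged in {it} iterations: power = {power:.3f} MW, T_fuel = {temp:.2f} K")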

  16. Adding value in oil and gas by applying decision analysis methodologies: case history

    Energy Technology Data Exchange (ETDEWEB)

    Marot, Nicolas [Petro Andina Resources Inc., Alberta (Canada); Francese, Gaston [Tandem Decision Solutions, Buenos Aires (Argentina)

    2008-07-01

    Petro Andina Resources Ltd., together with Tandem Decision Solutions, developed a strategic long-range plan applying decision analysis methodology. The objective was to build a robust and fully integrated strategic plan that accomplishes company growth goals and sets the strategic directions for the long range. The stochastic methodology and the Integrated Decision Management (IDM{sup TM}) staged approach allowed the company to visualize the value and risk associated with the different strategies while achieving organizational alignment, clarity of action and confidence in the path forward. A decision team jointly involving PAR representatives and Tandem consultants was established to carry out this four-month project. Discovery and framing sessions allowed the team to challenge the status quo, discuss near- and far-reaching ideas and gather the building blocks from which creative strategic alternatives were developed. A comprehensive stochastic valuation model was developed to assess the potential value of each strategy, applying simulation tools, sensitivity analysis tools and contingency planning techniques. Final insights and results were used to populate the final strategic plan presented to the company board, providing confidence to the team, assuring that the work embodies the best available ideas, data and expertise, and that the proposed strategy was ready to be elaborated into an optimized course of action. (author)
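
    In the spirit of the stochastic valuation described, a strategy's value distribution can be sketched with a simple Monte Carlo model; the cash-flow formula, distributions and numbers below are entirely illustrative and not company data.

        import numpy as np

        rng = np.random.default_rng(7)
        N = 100_000

        def strategy_npv(capex, price, volume, opex_ratio, rate=0.10, years=10):
            # NPV of a simple production strategy; all inputs are illustrative
            cash = price * volume * (1 - opex_ratio)
            discount = np.array([(1 + rate) ** -t for t in range(1, years + 1)])
            return -capex + cash * discount.sum()

        # Uncertain inputs drawn per trial (all distributions are assumptions)
        npv = strategy_npv(
            capex=rng.normal(300.0, 30.0, N),            # M$
            price=rng.lognormal(np.log(60.0), 0.25, N),  # $/bbl
            volume=rng.normal(1.5, 0.2, N),              # Mbbl/yr
            opex_ratio=rng.uniform(0.35, 0.50, N),
        )
        print(f"P10/P50/P90 NPV (M$): {np.percentile(npv, [10, 50, 90]).round(1)}")
        print(f"probability of value destruction: {(npv < 0).mean():.2%}")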

  17. A New Mathematical Model for Flank Wear Prediction Using Functional Data Analysis Methodology

    Directory of Open Access Journals (Sweden)

    Sonja Jozić

    2014-01-01

    Full Text Available This paper presents a new approach to improving the reliability of flank wear prediction during the end milling process. In the present work, prediction of flank wear has been achieved by using cutting parameters and force signals as the sensitive carriers of information about the machining process. A series of experiments were conducted to establish the relationship between flank wear and the cutting force components, as well as cutting parameters such as cutting speed, feed per tooth, and radial depth of cut. In order to predict flank wear, a new linear regression mathematical model has been developed utilizing functional data analysis methodology. The regression coefficients of the model are time-dependent functions that have been determined through the use of functional data analysis methodology. The mathematical model has been developed from the applied cutting parameters and the cutting force components measured during the end milling of a workpiece made of 42CrMo4 steel. The efficiency and flexibility of the developed model have been verified by comparing it with a separate experimental data set.
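
    The essence of such a time-varying-coefficient regression can be sketched as follows: each coefficient beta_j(t) is expanded in a basis over machining time, which makes the model linear in the basis weights. The basis choice, force signal and wear data below are synthetic placeholders, not the paper's measurements.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 200)                  # normalized machining time
        Fx = 100 + 40 * t + rng.normal(0, 2, t.size)    # synthetic cutting-force signal

        def basis(t, K=4):
            # Simple polynomial basis for the time-dependent coefficients beta_j(t);
            # a B-spline basis would be the more typical FDA choice
            return np.column_stack([t**k for k in range(K)])

        B = basis(t)
        # Observed flank wear (synthetic "ground truth" with noise)
        VB = 0.05 + 0.25 * t**0.6 + 0.0008 * Fx * t + rng.normal(0, 0.004, t.size)

        # Model: VB(t) = B(t) @ c0 + Fx(t) * (B(t) @ c1)  -> linear in (c0, c1)
        A = np.hstack([B, Fx[:, None] * B])
        coef, *_ = np.linalg.lstsq(A, VB, rcond=None)
        c0, c1 = coef[:B.shape[1]], coef[B.shape[1]:]

        VB_hat = A @ coef
        print("RMS prediction error (mm):", np.sqrt(np.mean((VB - VB_hat) ** 2)).round(4))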

  18. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of this causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, namely the Granger causality test, on 45 data points. However, well-established causality models proved insufficient, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for use in situations with insufficient historical data. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts, utilizing 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Evidence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality, and its robustness and validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, where very limited work has been done.
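
    The econometric step referred to above is a standard Granger causality test; a sketch using statsmodels on synthetic BSC-style measures (the variable names, lag structure and data are invented for illustration):

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(1)
        n = 45  # the study used 45 data points

        # Synthetic BSC-style measures: employee satisfaction "drives" customer satisfaction
        employee = rng.normal(size=n).cumsum()
        customer = np.roll(employee, 2) + rng.normal(scale=0.5, size=n)
        customer[:2] = rng.normal(size=2)

        # Does column 1 (employee) Granger-cause column 0 (customer)?
        data = np.column_stack([customer, employee])
        results = grangercausalitytests(data, maxlag=3)
        for lag, res in results.items():
            fstat, pval = res[0]["ssr_ftest"][:2]
            print(f"lag {lag}: F = {fstat:.2f}, p = {pval:.4f}")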

  19. Nonlinear Structural Analysis Methodology and Dynamics Scaling of Inflatable Parabolic Reflector Antenna Concepts

    Science.gov (United States)

    Sreekantamurthy, Tham; Gaspar, James L.; Mann, Troy; Behun, Vaughn; Pearson, James C., Jr.; Scarborough, Stephen

    2007-01-01

    Ultra-lightweight and ultra-thin membrane inflatable antenna concepts are fast evolving to become the state-of-the-art antenna concepts for deep-space applications. NASA Langley Research Center has been involved in structural dynamics research on antenna structures. One of the goals of the research is to develop a structural analysis methodology for prediction of the static and dynamic response characteristics of inflatable antenna concepts. This research is focused on computational studies using nonlinear large-deformation finite element analysis to characterize the ultra-thin membrane responses of the antennas. Recently, structural analyses have been performed on a few parabolic reflector antennas of varying size and shape, referred to in the paper as the 0.3-meter subscale, 2-meter half-scale, and 4-meter full-scale antennas. The aspects studied included nonlinear analysis methodology and solution techniques, ways to speed convergence in iterative methods, the sensitivities of responses with respect to structural loads, such as inflation pressure, gravity, and pretension loads in ground and in-space conditions, and the ultra-thin membrane wrinkling characteristics. The study of these intrinsic aspects has provided valuable insight into the evaluation of the structural characteristics of such antennas. While analyzing these structural characteristics, a quick study was also made to assess the applicability of dynamics scaling of the half-scale antenna. This paper presents the details of the nonlinear structural analysis results and discusses the insight gained from the studies of the various intrinsic aspects of the analysis methodology. The predicted reflector surface characteristics of the three inflatable ultra-thin membrane parabolic reflector antenna concepts are presented as easily observable displacement fringe patterns with associated maximum values, and normal mode shapes with associated frequencies. Wrinkling patterns are
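
    The numerical backbone of such nonlinear static analyses is incremental load stepping with Newton iterations at each step, one of the convergence-aiding ideas mentioned. A one-degree-of-freedom toy with an invented cubic stiffening term illustrates the pattern; real membrane FEA applies the same loop to many thousands of degrees of freedom.

        import numpy as np

        def residual(u, load):
            # Toy stiffening "membrane": internal force k1*u + k3*u^3 balancing the load
            return 10.0 * u + 400.0 * u**3 - load

        def tangent(u):
            return 10.0 + 1200.0 * u**2

        # Incremental load stepping with Newton-Raphson iterations at each step
        u = 0.0
        for load in np.linspace(0.0, 50.0, 11)[1:]:
            for _ in range(50):
                du = -residual(u, load) / tangent(u)
                u += du
                if abs(du) < 1e-12:
                    break
            print(f"load = {load:5.1f}  ->  displacement = {u:.6f}")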

  20. Analysis of some nuclear waste management options. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Berman, L.E.; Ensminger, D.A.; Giuffre, M.S.; Koplik, C.M.; Oston, S.G.; Pollak, G.D.; Ross, B.I.

    1978-10-10

    This report describes risk analyses performed on that portion of a nuclear fuel cycle which begins following solidification of high-level waste. Risks associated with handling, interim storage and transportation of the waste are assessed, as well as the long term implications of disposal in deep mined cavities. The risk is expressed in terms of expected dose to the general population and peak dose to individuals in the population. This volume consists of appendices which provide technical details of the work performed.

  1. Analysis of airborne radiometric data. Volume 3. Topical reports

    International Nuclear Information System (INIS)

    This volume consists of four topical reports: a general discussion of the philosophy of unfolding spectra with continuum and discrete components, a mathematical treatment of the effects of various physical parameters on the uncollided gamma-ray spectrum at aircraft elevations, a discussion of the application of the unfolding code MAZNAI to airborne data, and a discussion of the effects of the nonlinear relationship between energy deposited and pulse height in NaI(Tl) detectors

  2. Discourse analysis: a new methodology for understanding the ideologies of health and illness.

    Science.gov (United States)

    Lupton, D

    1992-06-01

    Discourse analysis is an interdisciplinary field of inquiry which has been little employed by public health practitioners. The methodology involves a focus upon the sociocultural and political context in which text and talk occur. Discourse analysis is, above all, concerned with a critical analysis of the use of language and the reproduction of dominant ideologies (belief systems) in discourse (defined here as a group of ideas or patterned way of thinking which can both be identified in textual and verbal communications and located in wider social structures). Discourse analysis adds a linguistic approach to an understanding of the relationship between language and ideology, exploring the way in which theories of reality and relations of power are encoded in such aspects as the syntax, style and rhetorical devices used in texts. This paper argues that discourse analysis is pertinent to the concerns of public health, for it has the potential to lay bare the ideological dimension of such phenomena as lay health beliefs, the doctor-patient relationship, and the dissemination of health information in the entertainment mass media. This dimension is often neglected by public health research. The method of discourse analysis is explained, and examples of its use in the area of public health given. PMID:1391155

  3. Development of core thermal hydraulic analysis methodology using multichannel code system

    International Nuclear Information System (INIS)

    A multi-channel core analysis model using the subchannel code TORC is developed to improve the thermal margin, and is assessed and compared with the existing single-channel analysis model. To apply the TORC code to the W-type reactor core, a hot subchannel DNBR analysis model is developed using a lumping technique. In addition, sensitivity studies of TORC with respect to various models and input parameters are carried out to appreciate the code characteristics. The developed core analysis model is applied to the evaluation of the thermal margin for the 17 x 17 KOFA loaded core. For this calculation, the KRB1 CHF correlation is developed on the basis of W and Siemens bundle CHF data, and the DNB design limit is established using the STDP method. From the results of the steady-state and transient analyses of the 17 x 17 KOFA loaded core, it is found that an extra 10% DNBR margin can be obtained compared with the existing single-channel analysis methodology. (Author) 65 figs., 12 tabs
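
    At each axial location, a DNBR evaluation reduces to the ratio of the critical heat flux predicted by a correlation to the local heat flux, minimized along the hot channel. The sketch below uses an invented heat-flux profile and a placeholder correlation (not KRB1) purely to show the computation and the design-limit check.

        import numpy as np

        # Hypothetical axial heat-flux profile; a real analysis would come from a
        # subchannel code such as TORC with a qualified CHF correlation
        z = np.linspace(0.0, 3.66, 25)                     # axial position (m)
        q_local = 1.1e6 * np.sin(np.pi * (z + 0.1) / 3.9)  # local heat flux (W/m^2)

        def chf(z):
            # Placeholder critical heat flux, decreasing along the heated length
            return 3.2e6 - 4.0e5 * (z / 3.66)

        dnbr = chf(z) / q_local
        print(f"minimum DNBR = {dnbr.min():.3f} at z = {z[dnbr.argmin()]:.2f} m")
        print("meets a 1.30 design limit:", dnbr.min() > 1.30)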

  4. MossWinn—methodological advances in the field of Mössbauer data analysis

    International Nuclear Information System (INIS)

    The methodology of Mössbauer data analysis has been advanced via the development of a novel scientific database system concept and its realization in the field of Mössbauer spectroscopy, as well as by the application of parallel computing techniques for the enhancement of the efficiency of various processes encountered in the practice of Mössbauer data handling and analysis. The present article describes the new database system concept along with details of its realization in the form of the MossWinn Internet Database (MIDB), and illustrates the performance advantage that may be realized on multi-core processor systems by the application of parallel algorithms for the implementation of database system functions.
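
    The parallel-computing aspect mentioned can be illustrated with a sketch that fits many spectra concurrently; the Lorentzian doublet model and the synthetic data are placeholders and do not reflect MossWinn's internals.

        import numpy as np
        from multiprocessing import Pool
        from scipy.optimize import curve_fit

        V = np.linspace(-4.0, 4.0, 256)  # velocity axis (mm/s)

        def doublet(v, a, center, split, width, base):
            # Two Lorentzian absorption lines (a symmetric quadrupole doublet)
            lor = lambda c: a / (1.0 + ((v - c) / width) ** 2)
            return base - lor(center - split / 2) - lor(center + split / 2)

        def fit_one(spectrum):
            p0 = [np.ptp(spectrum), 0.0, 1.5, 0.15, spectrum.max()]
            popt, _ = curve_fit(doublet, V, spectrum, p0=p0)
            return popt

        if __name__ == "__main__":
            rng = np.random.default_rng(3)
            true = [500.0, 0.1, 1.8, 0.15, 10000.0]
            spectra = [doublet(V, *true) + rng.normal(0, 20, V.size) for _ in range(8)]
            with Pool() as pool:  # fit the spectra in parallel worker processes
                params = pool.map(fit_one, spectra)
            print("fitted quadrupole splittings:", [round(p[2], 3) for p in params])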

  5. Patient's radioprotection and analysis of DPC practices and certification of health facilities - Methodological guide

    International Nuclear Information System (INIS)

    This methodological guide has been published in compliance with French and European regulatory texts to define the modalities of implementation of the assessment of clinical practices resulting in exposure to ionizing radiation in the medical environment (radiotherapy, radio-surgery, interventional radiology, nuclear medicine), to promote clinical audits, and to ease the implementation of programs of continuous professional development in radiotherapy, radiology and nuclear medicine. This guide proposes an analysis of professional practices through analysis sheets which address several aspects: scope, practice data, objectives in terms of improvement of radiation protection, regulatory and institutional references, operational objectives, methods, approaches and tools, follow-up indicators, actions to improve practices, professional target, collective approach, program organisation, and program valorisation in existing arrangements. It also gives 20 program proposals which notably aim at continuous professional development, 5 of them dealing with diagnosis-oriented imaging examinations, 9 with radiology and risk management, 4 with radiotherapy, and 2 with nuclear medicine

  6. Methodology for the analysis of sustainable development of Ukraine using the theory of fuzzy logic

    Directory of Open Access Journals (Sweden)

    Methodology for the analysis of sustainable development of Ukraine using the theory of fuzzy logic

    2016-02-01

    Full Text Available The objective of the article is to analyse the theoretical and methodological aspects of assessing sustainable development in times of crisis. A methodical approach to the analysis of the sustainable development of a territory, taking into account the assessment of the level of economic security, is proposed. The necessity of developing a complex methodical approach that accounts for indeterminacy and multi-criteria properties in economic security tasks, on the basis of fuzzy logic theory (the fuzzy sets theory), is demonstrated. Results of applying the fuzzy set method to the dynamics of sustainable development in Ukraine during the years 2002-2012 are presented.
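
    A sketch of the fuzzy-set mechanics underlying such an assessment, assuming a hypothetical normalized economic-security indicator and invented membership functions for the linguistic terms:

        import numpy as np

        def trapmf(x, a, b, c, d):
            # Trapezoidal membership function, a workhorse of fuzzy-set models
            return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

        # Hypothetical normalized economic-security indicator per year
        years = np.arange(2002, 2013)
        indicator = np.array([0.35, 0.38, 0.42, 0.47, 0.52, 0.58, 0.44,
                              0.31, 0.36, 0.41, 0.45])

        # Linguistic terms "low", "medium", "high" as fuzzy sets on [0, 1]
        low    = trapmf(indicator, -0.1, 0.0, 0.3, 0.5)
        medium = trapmf(indicator,  0.3, 0.45, 0.55, 0.7)
        high   = trapmf(indicator,  0.5, 0.7, 1.0, 1.1)

        for y, l, m, h in zip(years, low, medium, high):
            label = ("low", "medium", "high")[int(np.argmax([l, m, h]))]
            print(f"{y}: mu_low={l:.2f} mu_med={m:.2f} mu_high={h:.2f} -> {label}")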

  7. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans

    Science.gov (United States)

    de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; di Guardo, Giovanni; Fernández, Antonio

    2011-12-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen.

  8. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans.

    Science.gov (United States)

    Bernaldo de Quirós, Yara; González-Díaz, Oscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D; Mazzariol, Sandro; Di Guardo, Giovanni; Fernández, Antonio

    2011-01-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen. PMID:22355708

  9. Seismic geometric attribute analysis for fracture characterization: New methodologies and applications

    Science.gov (United States)

    Di, Haibin

    In 3D subsurface exploration, detection of faults and fractures from 3D seismic data is vital to robust structural and stratigraphic analysis in the subsurface, and great efforts have been made in the development and application of various seismic attributes (e.g. coherence, semblance, curvature, and flexure). However, the existing algorithms and workflows are not accurate and efficient enough for robust fracture detection, especially in naturally fractured reservoirs with complicated structural geometry and fracture networks. My Ph.D. research proposes the following scope of work to enhance our capability and to help improve the resolution of fracture characterization and prediction. For the discontinuity attribute, previous methods have difficulty highlighting subtle discontinuities from seismic data in cases where the local amplitude variation has a non-zero mean. This study proposes implementing a gray-level transformation and the Canny edge detector for improved imaging of discontinuities. Specifically, the new process transforms seismic signals to zero mean and helps amplify subtle discontinuities, leading to an enhanced visualization of structural and stratigraphic details. Applications to various 3D seismic datasets demonstrate that the new algorithm is superior to previous discontinuity-detection methods. Integrating both discontinuity magnitude and discontinuity azimuth helps better define channels, faults and fractures than the traditional similarity, amplitude gradient and semblance attributes. For the flexure attribute, the existing algorithm is computationally intensive and limited in lateral resolution for steeply-dipping formations. This study proposes a new and robust volume-based algorithm that evaluates the flexure attribute more accurately and effectively. The algorithm first fits a cubic surface volumetrically to the seismic data using a diamond 13-node grid cell, and then computes flexure from the spatial derivatives of the fitted surface. To avoid
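
    The discontinuity-enhancement step described (a gray-level transformation followed by the Canny edge detector) can be sketched with scikit-image on a synthetic amplitude slice; histogram equalization stands in here for the paper's specific gray-level transform, and the data are synthetic.

        import numpy as np
        from skimage import exposure, feature

        rng = np.random.default_rng(5)
        # Synthetic time slice: smooth amplitude background with a non-zero-mean
        # trend, a weak fault-like offset, and noise
        x = np.linspace(0, 1, 256)
        slice2d = 50 + 20 * x[None, :] + rng.normal(0, 1.5, (256, 256))
        slice2d[:, 128:] += 4.0  # subtle discontinuity

        # Gray-level transformation (here: histogram equalization) amplifies
        # subtle contrasts that a raw-amplitude edge detector would miss
        equalized = exposure.equalize_hist(slice2d)

        # Canny edge detector highlights the discontinuity in the transformed slice
        edges = feature.canny(equalized, sigma=2.0)
        print("fraction of pixels flagged as discontinuity:", edges.mean().round(4))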

  10. Space Tug Docking Study. Volume 5: Cost Analysis

    Science.gov (United States)

    1976-01-01

    The cost methodology, summary cost data, resulting cost estimates by Work Breakdown Structure (WBS), technical characteristics data, program funding schedules and the WBS used for the costing are discussed. Cost estimates for two tasks of the study are reported. The first task developed cost estimates for design, development, test and evaluation (DDT&E) and theoretical first unit (TFU) at the component level (Level 7) for all items reported in the data base. Task B developed total subsystem DDT&E costs and funding schedules for the three candidate Rendezvous and Docking Systems: manual, autonomous, and hybrid.

  11. Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks

    Science.gov (United States)

    Brown, Richard Lee

    2008-01-01

    Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology

  12. Methodology for cost analysis of film-based and filmless portable chest systems

    Science.gov (United States)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi- modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost-analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.

  13. CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL

    Energy Technology Data Exchange (ETDEWEB)

    Vinson, D.

    2010-07-11

    Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in the Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF even with severe cladding breaches for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U. S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and
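
    The containment criterion at the heart of such an analysis compares a computed release rate against the regulatory allowables (A2 x 10^-6 per hour for normal transport and A2 per week for accident conditions, per 10 CFR 71.51 as implemented through ANSI N14.5). The A2 value and source-term numbers below are placeholders, not the report's values.

        # Containment criterion sketch: compare computed release rates against the
        # regulatory allowables of 10 CFR 71.51 / ANSI N14.5. Numbers are placeholders.
        A2_Ci = 0.54  # A2 value of the limiting nuclide (Ci), illustrative

        allow_normal_Ci_per_s   = A2_Ci * 1e-6 / 3600.0      # A2 x 10^-6 per hour
        allow_accident_Ci_per_s = A2_Ci / (7 * 24 * 3600.0)  # A2 per week

        def release_rate(conc_Ci_per_cm3, leak_cm3_per_s):
            # Releasable activity concentration times volumetric leak rate
            return conc_Ci_per_cm3 * leak_cm3_per_s

        rn = release_rate(conc_Ci_per_cm3=2.0e-8, leak_cm3_per_s=1.0e-4)  # normal
        ra = release_rate(conc_Ci_per_cm3=5.0e-7, leak_cm3_per_s=1.0e-3)  # accident
        print(f"normal:   {rn:.2e} Ci/s vs {allow_normal_Ci_per_s:.2e} Ci/s "
              f"-> ok={rn <= allow_normal_Ci_per_s}")
        print(f"accident: {ra:.2e} Ci/s vs {allow_accident_Ci_per_s:.2e} Ci/s "
              f"-> ok={ra <= allow_accident_Ci_per_s}")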

  14. A Methodology for the Analysis and Selection of Alternative for the Disposition of Surplus Plutonium

    International Nuclear Information System (INIS)

    The Department of Energy (DOE) Office of Fissile Materials Disposition (OFMD) has announced a Record of Decision (ROD) selecting alternatives for disposition of surplus plutonium. A major objective of this decision was to further U.S. efforts to prevent the proliferation of nuclear weapons. Other concerns that were addressed include economic, technical, institutional, schedule, environmental, and health and safety issues. The technical, environmental, and nonproliferation analyses supporting the ROD are documented in three DOE reports (DOE-TSR 96, DOE-PEIS 96, and DOE-NN 97, respectively). At the request of OFMD, a team of analysts from the Amarillo National Resource Center for Plutonium (ANRCP) provided an independent evaluation of the alternatives for plutonium that were considered during the evaluation effort. This report outlines the methodology used by the ANRCP team. This methodology, referred to as multiattribute utility theory (MAU), provides a structure for assembling the results of detailed technical, economic, schedule, environmental, and nonproliferation analyses for OFMD, DOE policy makers, other stakeholders, and the general public in a systematic way. The MAU methodology has been supported for use in similar situations by the National Research Council, an agency of the National Academy of Sciences. It is important to emphasize that the MAU process does not lead to a computerized model that actually determines the decision for a complex problem. MAU is a management tool that is one component, albeit a key component, of a decision process. We subscribe to the philosophy that the result of using models should be insights, not numbers. The MAU approach consists of four steps: (1) identification of alternatives, objectives, and performance measures, (2) estimation of the performance of the alternatives with respect to the objectives, (3) development of value functions and weights for the objectives, and (4) evaluation of the alternatives and sensitivity analysis
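
    The additive form of MAU can be sketched directly: each alternative's utility is the weighted sum of its single-attribute values, followed by a sensitivity check on the weights. The alternative names echo the disposition options discussed, but all scores and weights below are invented for illustration.

        import numpy as np

        # Alternatives scored on each objective's performance measure (rows = alternatives);
        # names, scores and weights are purely illustrative
        alternatives = ["MOX fuel", "Immobilization", "Hybrid"]
        objectives   = ["cost", "schedule", "nonproliferation", "environment"]
        scores = np.array([[0.60, 0.50, 0.90, 0.70],
                           [0.80, 0.70, 0.60, 0.80],
                           [0.70, 0.60, 0.80, 0.75]])
        weights = np.array([0.30, 0.20, 0.35, 0.15])  # elicited from decision makers

        # Additive multiattribute utility: U(a) = sum_i w_i * v_i(a)
        utility = scores @ weights
        for name, u in sorted(zip(alternatives, utility), key=lambda t: -t[1]):
            print(f"{name:15s} U = {u:.3f}")

        # Simple sensitivity check: perturb one weight and renormalize
        w2 = weights.copy(); w2[0] += 0.1; w2 /= w2.sum()
        print("ranking stable under +0.1 cost weight:",
              np.argmax(scores @ w2) == np.argmax(utility))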

  15. New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks

    Science.gov (United States)

    Kurtz, Nolan Scot

    The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazards, e.g. seismicity, storm surge, hurricane winds, etc., but also how they participate at the network level, using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people that depend on the infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude-MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Since these networks already exist, their evolution in time is also important. If networks are close to chloride sources, deterioration may be a major issue. Information from field inspections may also have a large impact on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e. earthquakes, have been developed analytically. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis as they are relevant for specific structures. Furthermore, the network effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems; the method handles many kinds of system and component problems with single or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to network size. Special network topologies may
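
    The crude-MCS baseline mentioned above has a compact form: sample component failures from their fragilities and count the realizations in which source and sink become disconnected. The toy network and failure probabilities below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(11)

        # Toy lifeline network: edges with illustrative seismic failure probabilities
        edges = {(0, 1): 0.05, (1, 2): 0.10, (0, 3): 0.08, (3, 2): 0.12, (2, 4): 0.03}
        source, sink = 0, 4

        def connected(up_edges):
            # Graph search over the surviving edges only
            seen, stack = {source}, [source]
            while stack:
                n = stack.pop()
                for a, b in up_edges:
                    nxt = b if a == n else a if b == n else None
                    if nxt is not None and nxt not in seen:
                        seen.add(nxt)
                        stack.append(nxt)
            return sink in seen

        N, failures = 100_000, 0
        for _ in range(N):
            surviving = [e for e, pf in edges.items() if rng.random() > pf]
            failures += not connected(surviving)
        print(f"crude-MCS disconnection probability ~ {failures / N:.5f}")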

  16. Dose volume analysis in brachytherapy and stereotactic radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Tozer-Loft, S.M

    2000-12-01

    A brief introduction to three branches of radiotherapy is given: interstitial brachytherapy, external beam megavoltage radiotherapy, and stereotactic radiosurgery. The current interest in issues around conformity, uniformity and optimisation is explained in the light of technical developments in these fields. A novel method of displaying dose-volume information, which mathematically suppresses the inverse-square law, first suggested by L.L. Anderson for use in brachytherapy, is explained in detail and some improvements are proposed. These 'natural' histograms are extended to show the effects of real point sources, which do not exactly follow the inverse-square law, and to demonstrate the in-target dose-volume distribution, previously unpublished. The histograms are used as a way of mathematically analysing the properties of theoretical mono-energetic radionuclides, and for demonstrating the dosimetric properties of a potential new brachytherapy source (Ytterbium-169). A new modification of the Anderson formalism is then described for producing Anderson Inverse-Square Shifted (AISS) histograms for the Gamma Knife, which are shown to be useful for demonstrating the quality of stereotactic radiosurgery dose distributions. A study is performed analysing the results of Gamma Knife treatments of 44 patients suffering from a benign brain tumour (acoustic neuroma). Follow-up data are used to estimate the volume shrinkage or growth of each tumour, and this measure of outcome is compared with a range of figures of merit which express different aspects of the quality of each dose distribution. The results are analysed in an attempt to answer the question: which features of the dose distribution (conformality, uniformity, etc.) show a definite relationship with the outcome of the treatment? Initial results show positively that, when Gamma Knife radiosurgery is used to treat acoustic neuroma, some measures of conformality seem to have a surprising
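
    For orientation, a conventional cumulative dose-volume histogram (the starting point that the Anderson-style 'natural' and AISS transformations then re-express) can be computed from a dose grid as below; the dose grid and target mask are synthetic stand-ins for real treatment-planning data.

        import numpy as np

        rng = np.random.default_rng(13)
        # Synthetic 3D dose grid (Gy) and a boolean target mask
        dose = rng.gamma(shape=4.0, scale=3.0, size=(40, 40, 40))
        target = np.zeros_like(dose, dtype=bool)
        target[15:25, 15:25, 15:25] = True

        def cumulative_dvh(dose, mask, bins=100):
            # Fraction of the masked volume receiving at least each dose level
            d = dose[mask]
            levels = np.linspace(0.0, d.max(), bins)
            volume_fraction = np.array([(d >= lv).mean() for lv in levels])
            return levels, volume_fraction

        levels, vf = cumulative_dvh(dose, target)
        d95 = levels[np.searchsorted(-vf, -0.95)]  # dose covering ~95% of the target
        print(f"D95 ~ {d95:.1f} Gy; V(12 Gy) = {(dose[target] >= 12.0).mean():.2%}")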

  17. Detecting Hidden Encrypted Volume Files via Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Mario Piccinelli

    2015-05-01

    Full Text Available Nowadays various software tools have been developed for the purpose of creating encrypted volume files. Many of those tools are open source and freely available on the internet. Because of that, the probability of finding encrypted files which could contain forensically useful information has dramatically increased. While decoding these files without the key is still a major challenge, the simple fact of being able to recognize their existence is now a top priority for every digital forensics investigation. In this paper we will present a statistical approach to find elements of a seized filesystem which have a reasonable chance of containing encrypted data.
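
    A sketch of the statistical signature such an approach exploits: well-encrypted data is indistinguishable from uniform random bytes, so per-file Shannon entropy and a chi-square test against the uniform byte distribution flag candidates. The thresholds and the file path below are illustrative, not the paper's values.

        import math
        from collections import Counter

        def randomness_stats(path, sample_bytes=1 << 20):
            # Shannon entropy (bits/byte) and chi-square vs. a uniform byte
            # distribution. Encrypted volumes look uniformly random: entropy near
            # 8.0 and a chi-square statistic close to its expectation (~255 for
            # 255 degrees of freedom).
            with open(path, "rb") as f:
                data = f.read(sample_bytes)
            counts = Counter(data)
            n = len(data)
            entropy = -sum(c / n * math.log2(c / n) for c in counts.values())
            expected = n / 256.0
            chi2 = sum((counts.get(b, 0) - expected) ** 2 / expected for b in range(256))
            return entropy, chi2

        # Usage (hypothetical path and illustrative thresholds):
        # H, chi2 = randomness_stats("suspect.img")
        # flagged = H > 7.999 and 190 < chi2 < 330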

  18. Synfuel program analysis. Volume 2: VENVAL users manual

    Science.gov (United States)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This volume is intended for program analysts and is a users manual for the VENVAL model. It contains specific explanations of input data requirements and programming procedures for the use of this model. VENVAL is a generalized computer program to aid in the evaluation of prospective private-sector production ventures. The program can project interrelated values of installed capacity, production, sales revenue, operating costs, depreciation, investment, debt, earnings, taxes, return on investment, depletion, and cash flow measures. It can also compute related public-sector and other external costs and revenues if unit costs are furnished.

  19. Control Volume Analysis, Entropy Balance and the Entropy Production in Flow Systems

    CERN Document Server

    Niven, Robert K

    2014-01-01

    This chapter concerns "control volume analysis", the standard engineering tool for the analysis of flow systems, and its application to entropy balance calculations. Firstly, the principles of control volume analysis are enunciated and applied to flows of conserved quantities (e.g. mass, momentum, energy) through a control volume, giving integral (Reynolds transport theorem) and differential forms of the conservation equations. Several definitions of steady state are discussed. The concept of "entropy" is then established using Jaynes' maximum entropy method, both in general and in equilibrium thermodynamics. The thermodynamic entropy then gives the "entropy production" concept. Equations for the entropy production are then derived for simple, integral and infinitesimal flow systems. Some technical aspects are examined, including discrete and continuum representations of volume elements, the effect of radiation, and the analysis of systems subdivided into compartments. A Reynolds decomposition of the entropy ...
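
    For reference, the integral entropy balance that such a control volume analysis yields can be written, in a standard textbook form consistent with the description above (S_cv is the entropy within the control volume, \dot{Q}_j the heat transfer at boundary temperature T_j, \dot{m}s the entropy convected through inlets and outlets, and \dot{\sigma} the entropy production, non-negative by the second law):

        \frac{dS_{cv}}{dt} \;=\; \sum_j \frac{\dot{Q}_j}{T_j}
            \;+\; \sum_{\mathrm{in}} \dot{m}\,s \;-\; \sum_{\mathrm{out}} \dot{m}\,s \;+\; \dot{\sigma},
        \qquad \dot{\sigma} \;\ge\; 0

    At steady state the storage term vanishes and the entropy production equals the net entropy outflow minus the entropy supplied by heat transfer.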

  20. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    Science.gov (United States)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for evaluating the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: what is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators identifying the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems, with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups, taking into account socio-geophysical factors and probabilities, and using demographic databases based on GIS analysis.
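
    Step (iv), the probability fields, amounts to binning many trajectory endpoints on a geographical grid and normalizing by the number of trajectories. The sketch below uses random-walk endpoints in place of real trajectory-model output; the site location, step statistics and grid are invented.

        import numpy as np

        rng = np.random.default_rng(17)
        site_lon, site_lat = 28.0, 51.5  # hypothetical NRS location

        # Synthetic 3-day "trajectories": random-walk endpoints standing in for
        # output of a real trajectory model driven by meteorological fields
        n_traj, n_steps = 5000, 72
        dlon = rng.normal(0.15, 0.25, (n_traj, n_steps)).cumsum(axis=1)  # mean westerlies
        dlat = rng.normal(0.00, 0.20, (n_traj, n_steps)).cumsum(axis=1)
        lon, lat = site_lon + dlon[:, -1], site_lat + dlat[:, -1]

        # Probability field: fraction of trajectories reaching each 1x1 degree cell
        H, lon_edges, lat_edges = np.histogram2d(
            lon, lat, bins=[np.arange(0, 61), np.arange(35, 76)])
        prob = H / n_traj
        i, j = np.unravel_index(prob.argmax(), prob.shape)
        print(f"most impacted cell: lon {lon_edges[i]:.0f}-{lon_edges[i+1]:.0f}, "
              f"lat {lat_edges[j]:.0f}-{lat_edges[j+1]:.0f}, p = {prob[i, j]:.3f}")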

  1. Analysis of maternal and child health policies in Malawi: The methodological perspective.

    Science.gov (United States)

    Daire, J; Khalil, D

    2015-12-01

    The question of why most health policies do not achieve their intended results continues to receive considerable attention in the literature. This is in the light of the recognized gap between policy as intent and policy as practice, which calls for substantial research work to understand the factors that improve policy implementation. Although there is substantial work that explains the reasons why policies achieve or fail to achieve their intended outcomes, there are limited case studies that illustrate how to analyze policies from the methodological perspective. In this article, we report and discuss how a mixed qualitative research method was applied to the analysis of maternal and child health policies in Malawi. For the purposes of this article, we do not report research findings; instead we focus our discussion on the methodology of the study and draw lessons for policy analysis research work. We base our discussion on our experiences from a study in which we analyzed maternal and child health policies in Malawi over the period from 1964 to 2008. Noting the multifaceted nature of maternal and child health policies, we adopted a mixed qualitative research method, whereby a number of data collection methods were employed. This approach allowed for the capturing of different perspectives on maternal and child health policies in Malawi and for compensating for the weaknesses of each method, especially in terms of data validity. This research suggested that the multidimensional nature of maternal and child health policies, like other health policies, calls for a combination of research designs as well as a variety of methods of data collection and analysis. In addition, we suggest that, as an emerging research field, health policy analysis will benefit more from case study designs because they provide rich experiences in the actual policy context.

  2. Analysis of maternal and child health policies in Malawi: The methodological perspective.

    Science.gov (United States)

    Daire, J; Khalil, D

    2015-12-01

    The question of why most health policies do not achieve their intended results continues to receive considerable attention in the literature. This is in the light of the recognized gap between policy as intent and policy as practice, which calls for substantial research work to understand the factors that improve policy implementation. Although there is substantial work that explains the reasons why policies achieve or fail to achieve their intended outcomes, there are limited case studies that illustrate how to analyze policies from the methodological perspective. In this article, we report and discuss how a mixed qualitative research method was applied to the analysis of maternal and child health policies in Malawi. For the purposes of this article, we do not report research findings; instead we focus our discussion on the methodology of the study and draw lessons for policy analysis research work. We base our discussion on our experiences from a study in which we analyzed maternal and child health policies in Malawi over the period from 1964 to 2008. Noting the multifaceted nature of maternal and child health policies, we adopted a mixed qualitative research method, whereby a number of data collection methods were employed. This approach allowed for the capturing of different perspectives on maternal and child health policies in Malawi and for compensating for the weaknesses of each method, especially in terms of data validity. This research suggested that the multidimensional nature of maternal and child health policies, like other health policies, calls for a combination of research designs as well as a variety of methods of data collection and analysis. In addition, we suggest that, as an emerging research field, health policy analysis will benefit more from case study designs because they provide rich experiences in the actual policy context. PMID:26955434

  3. In Their Own Words? Methodological Considerations in the Analysis of Terrorist Autobiographies

    Directory of Open Access Journals (Sweden)

    Mary Beth Altier

    2012-01-01

    Full Text Available Despite the growth of terrorism literature in the aftermath of the 9/11 attacks, there remain several methodological challenges to studying certain aspects of terrorism. This is perhaps most evident in attempts to uncover the attitudes, motivations, and intentions of individuals engaged in violent extremism and how these are sometimes expressed in problematic behavior. Such challenges invariably stem from the fact that terrorists and the organizations to which they belong represent clandestine populations engaged in illegal activity. Unsurprisingly, these qualities make it difficult for the researcher to identify and locate willing subjects of study, let alone a representative sample. In this research note, we suggest that the systematic analysis of terrorist autobiographies offers a promising means of investigating difficult-to-study areas of terrorism-related phenomena. Investigation of autobiographical accounts not only offers additional data points for the study of individual psychological issues, but also provides valuable perspectives on the internal structures, processes, and dynamics of terrorist organizations more broadly. Moreover, given that most autobiographies cover critical events and personal experiences across the life course, they provide a unique lens into how terrorists perceive their world and insight into their decision-making processes. We support our advocacy of this approach by highlighting its methodological strengths and shortcomings.

  4. Proposal of a Methodology of Stakeholder Analysis for the Brazilian Satellite Space Program

    Directory of Open Access Journals (Sweden)

    Mônica Elizabeth Rocha de Oliveira

    2012-03-01

    Full Text Available To ensure the continuity and growth of space activities in Brazil, it is fundamental to persuade Brazilian society and its representatives in Government of the importance of investments in space activities. It is also important to convince talented professionals to take an interest in space activities; the best schools to offer courses related to the space sector; and innovative companies to take part in space-sector activities, looking to returns, mainly in terms of market differentiation and qualification, as a path to participating in high-technology, high-complexity projects. On the one hand, this process of convincing, or more importantly committing, these actors to space activities implies a thorough understanding of their expectations and needs, in order to plan how the system/organization can meet them. On the other hand, if stakeholders understand how much they can benefit from this relationship, their consequent commitment will greatly strengthen the action of the system/organization. With this framework in perspective, this paper proposes a methodology of stakeholder analysis for the Brazilian satellite space program. In the exercise developed in the article, stakeholders are identified from a study of the legal framework of the Brazilian space program. Subsequently, the proposed methodology is applied to the planning of actions by a public organization.

  5. Accuracy of ionospheric models used in GNSS and SBAS: methodology and analysis

    Science.gov (United States)

    Rovira-Garcia, A.; Juan, J. M.; Sanz, J.; González-Casado, G.; Ibáñez, D.

    2016-03-01

    The characterization of the accuracy of ionospheric models currently used in global navigation satellite systems (GNSSs) is a long-standing issue. The characterization remains a challenging problem owing to the lack of sufficiently accurate slant ionospheric determinations to be used as a reference. The present study proposes a methodology based on the comparison of the predictions of any ionospheric model with actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are separated as hardware delays (a receiver constant plus a satellite constant) per day. The present study was conducted for the entire year of 2014, i.e. during the last solar cycle maximum. The ionospheric models assessed are the operational models broadcast by the global positioning system (GPS) and Galileo constellations, the satellite-based augmentation system (SBAS) (i.e. European Geostationary Navigation Overlay System (EGNOS) and wide area augmentation system (WAAS)), a number of post-process global ionospheric maps (GIMs) from different International GNSS Service (IGS) analysis centres (ACs) and, finally, a more sophisticated GIM computed by the research group of Astronomy and GEomatics (gAGE). Ionospheric models based on GNSS data and represented on a grid (IGS GIMs or SBAS) correct about 85 % of the total slant ionospheric delay, whereas the models broadcasted in the navigation messages of GPS and Galileo only account for about 70 %. Our gAGE GIM is shown to correct 95 % of the delay. The proposed methodology appears to be a useful tool to improve current ionospheric models.
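
    The separation of the model-minus-measurement differences into daily hardware delays (one constant per receiver plus one per satellite) is a rank-deficient linear least-squares problem that needs a datum constraint; a sketch with synthetic constants, assuming a zero-mean satellite datum:

        import numpy as np

        rng = np.random.default_rng(19)
        n_rec, n_sat = 30, 10

        # Truth: per-day receiver and satellite hardware-delay constants (TECU)
        Br = rng.normal(0, 5, n_rec)
        Bs = rng.normal(0, 3, n_sat)
        Bs -= Bs.mean()  # datum: zero-mean satellite constants

        # "Model minus carrier-phase" differences for every receiver/satellite pair
        d = Br[:, None] + Bs[None, :] + rng.normal(0, 0.5, (n_rec, n_sat))

        # Design matrix of the linear model d(r, s) = Br[r] + Bs[s]
        rows = []
        for r in range(n_rec):
            for s in range(n_sat):
                row = np.zeros(n_rec + n_sat)
                row[r] = 1.0
                row[n_rec + s] = 1.0
                rows.append(row)
        A = np.array(rows)
        y = d.ravel()

        # Append the zero-mean datum constraint to remove the rank deficiency
        A = np.vstack([A, np.r_[np.zeros(n_rec), np.ones(n_sat)]])
        y = np.r_[y, 0.0]
        est, *_ = np.linalg.lstsq(A, y, rcond=None)
        print("receiver-bias recovery RMS (TECU):",
              np.sqrt(np.mean((est[:n_rec] - Br) ** 2)).round(3))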

  6. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume II. Software description and listings

    International Nuclear Information System (INIS)

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings

  7. Vehicle technologies heavy vehicle program : FY 2008 benefits analysis, methodology and results --- final report.

    Energy Technology Data Exchange (ETDEWEB)

    Singh, M.; Energy Systems; TA Engineering

    2008-02-29

    This report describes the approach to estimating the benefits, and the analysis results, for the Heavy Vehicle Technologies activities of the Vehicle Technologies (VT) Program of EERE. The scope of the effort includes: (1) characterizing baseline and advanced technology vehicles for Class 3-6 and Class 7 and 8 trucks, (2) identifying technology goals associated with the DOE EERE programs, (3) estimating the market potential of technologies that improve fuel efficiency and/or use alternative fuels, and (4) determining the petroleum and greenhouse gas emissions reductions associated with the advanced technologies. In FY 2008 the Heavy Vehicles program continued to address various sources of energy loss, rather than focusing more narrowly on engine efficiency and alternative fuels. These changes are the result of a planning effort that first occurred during FY 2004 and was updated in the past year (Ref. 1). This narrative describes characteristics of the heavy truck market as they relate to the analysis, the analysis methodology (including a discussion of the models used to estimate market potential and benefits), and the benefits estimated as a result of the adoption of the advanced technologies. The market penetrations are used as part of the EERE-wide integrated analysis to provide the final benefit estimates reported in the FY 2008 Budget Request. The energy savings models are utilized by the VT program for internal project management purposes.

  8. An integrated probabilistic risk analysis decision support methodology for systems with multiple state variables

    International Nuclear Information System (INIS)

    Probabilistic risk analysis (PRA) methods have been proven to be valuable in risk and reliability analysis. However, a weak link seems to exist between methods for analysing risks and those for making rational decisions. The integrated decision support system (IDSS) methodology presented in this paper attempts to address this issue in a practical manner. It consists of three phases: a PRA phase, a risk sensitivity analysis (SA) phase and an optimisation phase, which are implemented through an integrated computer software system. In the risk analysis phase the problem is analysed by the Boolean representation method (BRM), a PRA method that can deal with systems with multiple state variables and feedback loops. In the second phase the results obtained from the BRM are utilised directly to perform importance and risk SA. In the third phase, the problem is formulated as a multiple objective decision making problem in the form of multiple objective reliability optimisation. An industrial example is included. The resultant solutions of a five-objective reliability optimisation are presented, on the basis of which rational decision making can be explored

  9. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic and industrial use, as well as use by public and municipal authorities and street lighting) and examine their relation to variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012, with annual temporal resolution and spatial resolution down to the level of prefecture. We visualize the results of the analysis and perform cluster and outlier analysis using the Anselin local Moran's I statistic, as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight into, and better understanding of, the regional development model in Greece, and provides the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
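
    The local Moran's I statistic used for the cluster and outlier analysis has a compact form: I_i = z_i * sum_j w_ij z_j, with standardized values z and row-standardized spatial weights w (up to a normalization constant; significance is normally judged by permutation, and in practice a library such as PySAL would be used). The adjacency structure and demand values below are invented stand-ins for the prefecture data.

        import numpy as np

        rng = np.random.default_rng(23)
        n = 51  # e.g. one value per prefecture

        # Hypothetical electricity demand per region and a random symmetric
        # contiguity structure standing in for true prefecture adjacency
        demand = rng.lognormal(mean=6.0, sigma=0.5, size=n)
        W = (rng.random((n, n)) < 0.1).astype(float)
        W = np.triu(W, 1); W = W + W.T                        # symmetric, zero diagonal
        W = W / np.maximum(W.sum(axis=1, keepdims=True), 1)   # row-standardize

        # Local Moran's I: I_i = z_i * sum_j w_ij z_j (z = standardized variable)
        z = (demand - demand.mean()) / demand.std()
        local_I = z * (W @ z)
        print("regions flagged as potential hot/cold spots or outliers:",
              np.where(np.abs(local_I) > 1.0)[0])  # illustrative cutoff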

  10. A Systemic Method for Organisational Stakeholder Identification and Analysis Using Soft Systems Methodology (SSM)

    OpenAIRE

    Wang, Wei; Liu, Wenbin; Mingers, John

    2015-01-01

    This paper presents a systemic methodology for identifying and analysing the stakeholders of an organisation at many different levels. The methodology is based on soft systems methodology and is applicable to all types of organisation, both for profit and non-profit. The methodology begins with the top-level objectives of the organisation, developed through debate and discussion, and breaks these down into the key activities needed to achieve them. A range of stakeholders are identified for e...

  11. CDM afforestation and reforestation baseline methodologies: An analysis of the submission and approval process

    OpenAIRE

    Michaelowa, Axel; Rawat, V. R. S.

    2007-01-01

    Afforestation and Reforestation (A/R), also widely termed LULUCF, have been an important field of conflict in the Clean Development Mechanism (CDM) of the Kyoto Protocol. The first methodology for A/R projects was submitted only in October 2004, and the first project was registered only in November 2006, two years after the first project in the energy sector. Like energy efficiency and transportation methodologies, A/R methodologies also suffer a high rejection rate. 20 A/R CDM methodologies...

  12. From continuous flow analysis to programmable Flow Injection techniques. A history and tutorial of emerging methodologies.

    Science.gov (United States)

    Ruzicka, Jaromir Jarda

    2016-09-01

    Automation of reagent-based assays, also known as Flow Analysis, is based on sample processing in which a sample flows towards and through a detector for monitoring of its components. The Achilles heel of this methodology is that the majority of FA techniques use constant continuous forward flow to transport the sample - an approach which continually consumes reagents and generates chemical waste. Therefore the purpose of this report is to highlight recent developments in flow programming that not only save reagents, but also lead, by means of advanced sample processing, to selective and sensitive assays based on stop-flow measurement. Flow programming combined with a novel approach to data harvesting yields a novel approach to single-standard calibration, and avoids interference caused by refractive index. Finally, flow programming is useful for sample preparation, such as rapid, extensive sample dilution. The principles are illustrated by selected references to an available online tutorial, http://www.flowinjectiontutorial.com/. PMID:27343609

  13. Failure detection and isolation methodology based on the sequential analysis and extended Kalman filter technique

    International Nuclear Information System (INIS)

    Nuclear power plant operation relies on the accurate and precise response of the monitoring system in order to assure a safe operational standard during the most predictable operational transients. The signals from the sensors are in general contaminated with noise and random fluctuations, making a precise plant assessment uncertain and opening the possibility of erroneous operator decisions or even false alarm actuation. In practice, the noisy environment can even mask a sensor malfunction, so that the plant operational status is misread. In the present work a new failure detection and isolation (FDI) algorithm has been developed, based on sequential analysis and extended Kalman filter residual monitoring. The present methodology has been applied both to highly redundant monitoring systems and to non-redundant systems where high signal reliability is required. (C.M.)
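
    The record gives no implementation details, but the combination it names (Kalman-filter residual monitoring fed into a sequential test) can be sketched as below; the random-walk state model, the noise levels and the Wald-type thresholds are illustrative assumptions, not the authors' algorithm.

        # Sketch: scalar Kalman filter whose innovations drive a sequential
        # (SPRT/Page-type) test for a residual mean shift of size fault_bias.
        import numpy as np

        def sprt_fdi(measurements, q=1e-4, r=0.01, fault_bias=0.2,
                     alpha=0.01, beta=0.01):
            upper = np.log((1 - beta) / alpha)   # alarm threshold
            lower = np.log(beta / (1 - alpha))   # healthy-decision bound
            x, p, llr = 0.0, 1.0, 0.0
            for k, z in enumerate(measurements):
                p += q                  # predict (random-walk state model)
                s = p + r               # innovation variance
                nu = z - x              # innovation (residual)
                # log-likelihood ratio increment: mean 0 vs mean fault_bias
                llr += (fault_bias / s) * (nu - fault_bias / 2.0)
                llr = max(llr, lower)   # Page-type restart at the lower bound
                if llr >= upper:
                    return k            # failure declared at sample k
                gain = p / s            # Kalman update
                x += gain * nu
                p *= 1.0 - gain
            return None                 # no failure detected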

  14. Methodology for adding and amending glycaemic index values to a nutrition analysis package.

    LENUS (Irish Health Repository)

    Levis, Sharon P

    2011-04-01

    Since its introduction in 1981, the glycaemic index (GI) has been a useful tool for classifying the glycaemic effects of carbohydrate foods. Consumption of a low-GI diet has been associated with a reduced risk of developing CVD, diabetes mellitus and certain cancers. WISP (Tinuviel Software, Llanfechell, Anglesey, UK) is a nutrition software package used for the analysis of food intake records and 24 h recalls. Within its database, WISP contains the GI values of foods based on the International Tables 2002. The aim of the present study is to describe in detail a methodology for adding and amending GI values to the WISP database in a clinical or research setting, using data from the updated International Tables 2008.

  15. Life Cycle Exergy Analysis of Wind Energy Systems : Assessing and improving life cycle analysis methodology

    OpenAIRE

    Davidsson, Simon

    2011-01-01

    Wind power capacity is currently growing fast around the world. At the same time different forms of life cycle analysis are becoming common for measuring the environmental impact of wind energy systems. This thesis identifies several problems with current methods for assessing the environmental impact of wind energy and suggests improvements that will make these assessments more robust. The use of the exergy concept combined with life cycle analysis has been proposed by several researchers ov...

  16. Measurement and analysis of grain boundary grooving by volume diffusion

    Science.gov (United States)

    Hardy, S. C.; Mcfadden, G. B.; Coriell, S. R.; Voorhees, P. W.; Sekerka, R. F.

    1991-01-01

    Experimental measurements of isothermal grain boundary grooving by volume diffusion are carried out for Sn bicrystals in the Sn-Pb system near the eutectic temperature. The dimensions of the groove increase with a temporal exponent of 1/3, and measurement of the associated rate constant allows the determination of the product of the liquid diffusion coefficient D and the capillarity length Gamma associated with the interfacial free energy of the crystal-melt interface. The small-slope theory of Mullins is generalized to the entire range of dihedral angles by using a boundary integral formulation of the associated free boundary problem, and excellent agreement with experimental groove shapes is obtained. By using the diffusivity measured by Jordon and Hunt, the present measured values of Gamma are found to agree to within 5 percent with the values obtained from experiments by Gunduz and Hunt on grain boundary grooving in a temperature gradient.
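
    The 1/3 temporal exponent and the role of the product D·Gamma can be stated explicitly. In the small-slope limit of Mullins' theory for grooving by volume diffusion, the groove width w and depth d scale as below (a sketch; the dimensionless prefactors depend on the dihedral angle, which is why the boundary-integral generalisation above matters):

        \[
          w(t) = k_w \,(D\,\Gamma\,t)^{1/3}, \qquad
          d(t) = k_d \,(D\,\Gamma\,t)^{1/3}
        \]

    A fit of groove size against t^{1/3} therefore yields the rate constant and hence D·Gamma, and an independently measured diffusivity D then isolates the capillarity length Gamma.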

  17. Dose volume analysis in brachytherapy and stereotactic radiosurgery

    CERN Document Server

    Tozer-Loft, S M

    2000-01-01

    A brief introduction to three branches of radiotherapy is given: interstitial brachytherapy, external beam megavoltage radiotherapy, and stereotactic radiosurgery. The current interest in issues around conformity, uniformity and optimisation is explained in the light of technical developments in these fields. A novel method of displaying dose-volume information, which mathematically suppresses the inverse-square law, as first suggested by L.L. Anderson for use in brachytherapy, is ... compared with a range of figures of merit which express different aspects of the quality of each dose distribution. The results are analysed in an attempt to answer the question: what are the important features of the dose distribution (conformality, uniformity, etc) which show a definite relationship with the outcome of the treatment? Initial results show positively that, when Gamma Knife radiosurgery is used to treat acoustic neuroma, some measures of conformality seem to have a surprising, but significant association with outcome.

  18. Evaluation of safety assessment methodologies in Rocky Flats Risk Assessment Guide (1985) and Building 707 Final Safety Analysis Report (1987)

    Energy Technology Data Exchange (ETDEWEB)

    Walsh, B.; Fisher, C.; Zigler, G.; Clark, R.A. [Science and Engineering Associates, Inc., Albuquerque, NM (United States)

    1990-11-09

    Rockwell International, as operating contractor at the Rocky Flats plant, conducted a safety analysis program during the 1980s. That effort resulted in Final Safety Analysis Reports (FSARs) for several buildings, one of them being the Building 707 Final Safety Analysis Report, June 1987 (707FSAR), and in a Plant Safety Analysis Report. The Rocky Flats Risk Assessment Guide, March 1985 (RFRAG85) documents the methodologies that were used for those FSARs. Resources available for preparation of those Rocky Flats FSARs were very limited. After addressing the more pressing safety issues, some of which are described below, the present contractor (EG&G) intends to conduct a program of upgrading the FSARs. This report presents the results of a review of the methodologies described in RFRAG85 and 707FSAR and contains suggestions that might be incorporated into the methodology for the FSAR upgrade effort.

  19. Economic analysis of the space shuttle system, volume 1

    Science.gov (United States)

    1972-01-01

    An economic analysis of the space shuttle system is presented. The analysis is based on economic benefits, recurring costs, non-recurring costs, and economic tradeoff functions. The most economic space shuttle configuration is determined on the basis of: (1) objectives of the reusable space transportation system, (2) various space transportation systems considered, and (3) alternative space shuttle systems.

  20. Methodology of analysis of economic evidence of cartel in the resale retail of the fuel sector

    International Nuclear Information System (INIS)

    The existence of anti-competitive conduct such as cartels leads to a situation of high prices and profits, harming competition and society in general. The ANP's methodology for the economic analysis of evidence of cartels in the fuel resale market involves analysis of the behavior of average resale and distribution prices, the nominal average gross resale margin, and the coefficient of variation of resale and distribution prices of fuel for a given period, by municipality. Combining these elements, the ANP has suggested the investigation of possible cartels. This text aims to contribute to a better definition of the relevant market in the analysis of economic evidence of cartels in the fuel resale market, and to add elements currently not considered in the ANP's analysis and in the regulation of the sector. To this end, this article is organized into three sections besides the introduction and final considerations. The first section deconstructs some myths about cartels in the automotive fuel retail resale segment by analyzing the main causes leading to complaints by consumers. The second section presents a conceptual analysis of the relevant market, since this definition is essential to characterize anti-competitive practices carried out by companies holding market power, notably the formation of cartels. Finally, there is a discussion of how the main bodies involved act in dismantling anti-competitive practices in the industry. The expected results favor greater integration between the agencies that safeguard competition and a better definition of the relevant market for fuel resale. (author)

  1. Methodology of economic analysis of evidence of cartel in the resale market of fuels

    International Nuclear Information System (INIS)

    The existence of anti-competitive conduct such as cartels leads to a situation of high prices and profits, harming competition and society in general. The ANP's methodology for the economic analysis of evidence of cartels in the fuel resale market involves analysis of the behavior of average resale and distribution prices, the nominal average gross resale margin, and the coefficient of variation of resale and distribution prices of fuel for a given period, by municipality. Combining these elements, the ANP has suggested the investigation of possible cartels. This text aims to contribute to a better definition of the relevant market in the analysis of economic evidence of cartels in the fuel resale market, and to add elements currently not considered in the ANP's analysis and in the regulation of the sector. To this end, this article is organized into three sections besides the introduction and final considerations. The first section deconstructs some myths about cartels in the automotive fuel retail resale segment by analyzing the main causes leading to complaints by consumers. The second section presents a conceptual analysis of the relevant market, since this definition is essential to characterize anti-competitive practices carried out by companies holding market power, notably the formation of cartels. Finally, there is a discussion of how the main bodies involved act in dismantling anti-competitive practices in the industry. The expected results favor greater integration between the agencies that safeguard competition and a better definition of the relevant market for fuel resale. (author)

  2. User's manual and analysis methodology of probabilistic fracture mechanics analysis code PASCAL3 for reactor pressure vessel (Contract research)

    International Nuclear Information System (INIS)

    As a part of the structural integrity research for aging LWR (Light Water Reactor) components, the probabilistic fracture mechanics (PFM) analysis code PASCAL (PFM Analysis of Structural Components in Aging LWR) has been developed in JAEA. The PASCAL code evaluates the conditional probabilities of crack initiation and fracture of a reactor pressure vessel (RPV) under transient conditions such as pressurized thermal shock (PTS). The continuous development of the code has aimed to improve the accuracy and reliability of analysis by introducing new analysis methodologies and algorithms that reflect recent developments in fracture mechanics and computer performance. The previous version, PASCAL Ver.2, released in 2007, has many functions, including an evaluation method for embedded cracks, conditional probabilities of crack initiation and fracture of an RPV, a PTS transient database, an inspection crack detection probability model, and others. Since 2007, PASCAL Ver.2 has been improved mainly to consider the effects of weld-overlay cladding on the inner surface of the RPV. A generalized analysis method is available on the basis of the development of PASCAL Ver.3 and sensitivity analysis results. The graphical user interface (GUI), including the generalized method and some probabilistic fracture mechanics functions, has also been updated for PASCAL Ver.3. This report provides the user's manual, examples of analysis, and the theoretical background of PASCAL Ver.3. (author)
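
    For readers unfamiliar with PFM codes, the conditional-probability calculation can be illustrated with a toy Monte Carlo; this is not PASCAL, and the distributions and the stress intensity factor model below are placeholder assumptions only.

        # Toy PFM sketch: sample flaw depth and fracture toughness, compare
        # the applied stress intensity factor, estimate P(initiation | transient).
        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000

        a_mm = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=n)  # flaw depth, mm
        k1c = 30.0 + 120.0 * rng.weibull(4.0, size=n)              # toughness, MPa*sqrt(m)

        def applied_k(a_mm, sigma=300.0):
            """Toy membrane-stress SIF for a shallow surface flaw."""
            return 1.1 * sigma * np.sqrt(np.pi * a_mm * 1e-3)

        p_init = np.mean(applied_k(a_mm) > k1c)
        print(f"conditional P(crack initiation) ~ {p_init:.2e}")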

  3. Evaluating the effects of dam breach methodologies on Consequence Estimation through Sensitivity Analysis

    Science.gov (United States)

    Kalyanapu, A. J.; Thames, B. A.

    2013-12-01

    Dam breach modeling often involves sophisticated yet computationally intensive models that compute flood propagation at high temporal and spatial resolutions. This creates a significant need for computational capacity and drives the development of newer flood models using multi-processor and graphics processing techniques. Recently, a comprehensive benchmark exercise, the 12th Benchmark Workshop on Numerical Analysis of Dams, was organized by the International Commission on Large Dams (ICOLD) to evaluate the performance of the various tools used for dam break risk assessment. The ICOLD workshop is focused on estimating the consequences of failure of a hypothetical dam near a hypothetical populated area with complex demographics and economic activity. The current study uses this hypothetical case study and focuses on evaluating the effects of dam breach methodologies on consequence estimation and analysis. It uses the ICOLD hypothetical data, including the topography, dam geometric and construction information, and land use/land cover data, along with socio-economic and demographic data. The objective of this study is to evaluate the impacts of using four different dam breach methods on the consequence estimates used in the risk assessments. The four methodologies used are: i) Froehlich (1995), ii) MacDonald and Langridge-Monopolis 1984 (MLM), iii) Von Thun and Gillette 1990 (VTG), and iv) Froehlich (2008). To achieve this objective, three different modeling components were used. First, using HEC-RAS v.4.1, dam breach discharge hydrographs are developed. These hydrographs are then provided as flow inputs into a two-dimensional flood model named Flood2D-GPU, which leverages the computer's graphics card for much improved computational capability. Lastly, outputs from Flood2D-GPU, including inundated areas, depth grids, velocity grids, and flood wave arrival time grids, are input into HEC-FIA, which provides the
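
    Of the four breach methods listed, the Froehlich (1995) peak-outflow regression is the simplest to state; the sketch below uses its commonly cited SI form (coefficients as usually published in the literature, not values taken from this record).

        def froehlich_1995_peak_outflow(v_w_m3: float, h_w_m: float) -> float:
            """Peak breach outflow (m^3/s) from reservoir volume above the
            breach invert (m^3) and depth of water above the invert (m)."""
            return 0.607 * v_w_m3**0.295 * h_w_m**1.24

        # e.g. ~10^7 m^3 stored behind 20 m of head above the breach invert
        print(froehlich_1995_peak_outflow(1e7, 20.0))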

  4. Experimental stress analysis for materials and structures stress analysis models for developing design methodologies

    CERN Document Server

    Freddi, Alessandro; Cristofolini, Luca

    2015-01-01

    This book summarizes the main methods of experimental stress analysis and examines their application to various states of stress of major technical interest, highlighting aspects not always covered in the classic literature. It is explained how experimental stress analysis assists in the verification and completion of analytical and numerical models, the development of phenomenological theories, the measurement and control of system parameters under operating conditions, and identification of causes of failure or malfunction. Cases addressed include measurement of the state of stress in models, measurement of actual loads on structures, verification of stress states in circumstances of complex numerical modeling, assessment of stress-related material damage, and reliability analysis of artifacts (e.g. prostheses) that interact with biological systems. The book will serve graduate students and professionals as a valuable tool for finding solutions when analytical solutions do not exist.

  5. Photovoltaic venture analysis. Final report. Volume III. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    This appendix contains a brief summary of a detailed description of alternative future energy scenarios which provide an overall backdrop for the photovoltaic venture analysis. Also included is a summary of a photovoltaic market/demand workshop, a summary of a photovoltaic supply workshop which used cross-impact analysis, and a report on photovoltaic array and system prices in 1982 and 1986. The results of a sectorial demand analysis for photovoltaic power systems used in the residential sector (single family homes), the service, commercial, and institutional sector (schools), and in the central power sector are presented. An analysis of photovoltaics in the electric utility market is given, and a report on the industrialization of photovoltaic systems is included. A DOE information memorandum regarding "A Strategy for a Multi-Year Procurement Initiative on Photovoltaics (ACTS No. ET-002)" is also included. (WHK)

  6. SLUDGE TREATMENT PROJECT ALTERNATIVES ANALYSIS SUMMARY REPORT (VOLUME 1)

    International Nuclear Information System (INIS)

    Highly radioactive sludge (containing up to 300,000 curies of actinides and fission products) resulting from the storage of degraded spent nuclear fuel is currently stored in temporary containers located in the 105-K West storage basin near the Columbia River. The background, history, and known characteristics of this sludge are discussed in Section 2 of this report. There are many compelling reasons to remove this sludge from the K-Basin. These reasons are discussed in detail in Section 1, and they include the following: (1) Reduce the risk to the public (from a potential release of highly radioactive material as fine respirable particles by airborne or waterborne pathways); (2) Reduce the overall risk to the Hanford worker; and (3) Reduce the risk to the environment (the K-Basin is situated above a hazardous chemical contaminant plume and hinders remediation of the plume until the sludge is removed). The DOE-RL has stated that a key DOE objective is to remove the sludge from the K-West Basin and River Corridor as soon as possible, which will reduce risks to the environment, allow for remediation of contaminated areas underlying the basins, and support closure of the 100-KR-4 operable unit. The environmental and nuclear safety risks associated with this sludge have resulted in multiple legal and regulatory remedial action decisions, plans, and commitments that are summarized in Table ES-1 and discussed in more detail in Volume 2, Section 9.

  7. User's manual and analysis methodology of probabilistic fracture mechanics analysis code PASCAL Ver.2 for reactor pressure vessel (Contract research)

    International Nuclear Information System (INIS)

    As a part of the aging structural integrity research for LWR components, the probabilistic fracture mechanics (PFM) analysis code PASCAL (PFM Analysis of Structural Components in Aging LWR) has been developed in JAEA. This code evaluates the conditional probabilities of crack initiation and fracture of a reactor pressure vessel (RPV) under transient conditions such as pressurized thermal shock (PTS). The development of the code has aimed to improve the accuracy and reliability of analysis by introducing new analysis methodologies and algorithms that reflect recent developments in fracture mechanics and computer performance. PASCAL Ver.1 has functions for optimized sampling in stratified Monte Carlo simulation, the elastic-plastic fracture criterion of the R6 method, crack growth analysis models for a semi-elliptical crack, recovery of fracture toughness due to thermal annealing, and so on. Since then, under contract between the Ministry of Economy, Trade and Industry of Japan and JAEA, we have continued to develop and introduce new functions into PASCAL Ver.2, such as an evaluation method for embedded cracks, a KI database for a semi-elliptical crack considering stress discontinuity at the base/cladding interface, a PTS transient database, and others. A generalized analysis method is proposed on the basis of the development of PASCAL Ver.2 and the results of sensitivity analyses. A graphical user interface (GUI) including the generalized method as default values has also been developed for PASCAL Ver.2. This report provides the user's manual and theoretical background of PASCAL Ver.2. (author)

  8. Typology of end-of-life priorities in Saudi females: averaging analysis and Q-methodology

    Directory of Open Access Journals (Sweden)

    Hammami MM

    2016-05-01

    Muhammad M Hammami,1,2 Safa Hammami,1 Hala A Amer,1 Nesrine A Khodr1 1Clinical Studies and Empirical Ethics Department, King Faisal Specialist Hospital and Research Centre, 2College of Medicine, Alfaisal University, Riyadh, Saudi Arabia Background: Understanding culture- and sex-related end-of-life preferences is essential to providing quality end-of-life care. We have previously explored end-of-life choices in Saudi males and found important culture-related differences, and that Q-methodology is useful in identifying intracultural, opinion-based groups. Here, we explore Saudi females’ end-of-life choices. Methods: A volunteer sample of 68 females rank-ordered 47 opinion statements on end-of-life issues into a nine-category symmetrical distribution. The ranking scores of the statements were analyzed by averaging analysis and Q-methodology. Results: The mean age of the females in the sample was 30.3 years (range, 19–55 years). Among them, 51% reported average religiosity, 78% reported very good health, 79% reported very good life quality, and 100% reported high-school education or more. The extreme five overall priorities were to be able to say the statement of faith, be at peace with God, die without having the body exposed, maintain dignity, and resolve all conflicts. The extreme five overall dis-priorities were to die in the hospital, die well dressed, be informed about impending death by family/friends rather than doctor, die at the peak of life, and not know if one has a fatal illness. Q-methodology identified five opinion-based groups with qualitatively different characteristics: “physical and emotional privacy concerned, family caring” (younger, lower religiosity), “whole person” (higher religiosity), “pain and informational privacy concerned” (lower life quality), “decisional privacy concerned” (older, higher life quality), and “life quantity concerned, family dependent” (high life quality, low life satisfaction). Out of the

  9. An advanced human reliability analysis methodology: analysis of cognitive errors focused on

    International Nuclear Information System (INIS)

    Conventional Human Reliability Analysis (HRA) methods such as THERP/ASEP, HCR and SLIM have been criticised for their deficiency in analysing the cognitive errors which occur during the operator's decision-making process. In order to overcome the limitations of the conventional methods, an advanced HRA method, the so-called 2nd-generation HRA method, including both qualitative analysis and quantitative assessment of cognitive errors, is being developed based on the state-of-the-art theory of cognitive systems engineering and error psychology. The method was developed on the basis of a human decision-making model and the relation between cognitive functions and performance influencing factors. The application of the proposed method to two emergency operation tasks is presented.

  10. Structural analysis of cylindrical thrust chambers, volume 1

    Science.gov (United States)

    Armstrong, W. H.

    1979-01-01

    Life predictions of regeneratively cooled rocket thrust chambers are normally derived from classical material fatigue principles. The failures observed in experimental thrust chambers do not appear to be due entirely to material fatigue. The chamber coolant walls in the failed areas exhibit progressive bulging and thinning during cyclic firings until the wall stress finally exceeds the material rupture stress and failure occurs. A preliminary analysis of an oxygen-free high-conductivity (OFHC) copper cylindrical thrust chamber demonstrated that the inclusion of cumulative cyclic plastic effects enables the observed coolant wall thinout to be predicted. The thinout curve constructed from the reference analysis of 10 firing cycles was extrapolated from the tenth cycle to the 200th cycle. The preliminary OFHC copper chamber 10-cycle analysis was extended so that the extrapolated thinout curve could be established by performing cyclic analysis of deformed configurations at 100 and 200 cycles. Thus the original range of extrapolation was reduced and the thinout curve was adjusted by using calculated thinout rates at 100 and 200 cycles. An analysis of the same undeformed chamber model constructed of half-hard Amzirc to study the effect of material properties on the thinout curve is included.

  11. Photovoltaic venture analysis. Final report. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    A description of the integrating model for photovoltaic venture analysis is given; input assumptions for the model are described; and the integrating model program listing is given. The integrating model is an explicit representation of the interactions between photovoltaic markets and supply under alternative sets of assumptions. It provides a consistent way of assembling and integrating the various assumptions, data, and information that have been obtained on photovoltaic systems supply and demand factors. Secondly, it provides a mechanism for understanding the implications of all the interacting assumptions. By representing the assumptions in a common, explicit framework, much more complex interactions can be considered than are possible intuitively. The integrating model therefore provides a way of examining the relative importance of different assumptions, parameters, and inputs through sensitivity analysis. Also, detailed results of model sensitivity analysis and detailed market and systems information are presented. (WHK)

  12. WaterSense Program: Methodology for National Water Savings Analysis Model Indoor Residential Water Use

    Energy Technology Data Exchange (ETDEWEB)

    Whitehead, Camilla Dunham; McNeil, Michael; Dunham_Whitehead, Camilla; Letschert, Virginie; della_Cava, Mirka

    2008-02-28

    The U.S. Environmental Protection Agency (EPA) influences the market for plumbing fixtures and fittings by encouraging consumers to purchase products that carry the WaterSense label, which certifies those products as performing at low flow rates compared to unlabeled fixtures and fittings. As consumers decide to purchase water-efficient products, water consumption will decline nationwide. Decreased water consumption should prolong the operating life of water and wastewater treatment facilities. This report describes the method used to calculate national water savings attributable to EPA's WaterSense program. A Microsoft Excel spreadsheet model, the National Water Savings (NWS) analysis model, accompanies this methodology report. Version 1.0 of the NWS model evaluates indoor residential water consumption. Two additional documents, a Users' Guide to the spreadsheet model and an Impacts Report, accompany the NWS model and this methodology document. Altogether, these four documents represent Phase One of this project. The Users' Guide leads policy makers through the spreadsheet options available for projecting the water savings that result from various policy scenarios. The Impacts Report shows national water savings that will result from differing degrees of market saturation of high-efficiency water-using products. This detailed methodology report describes the NWS analysis model, which examines the effects of WaterSense by tracking the shipments of products that WaterSense has designated as water-efficient. The model estimates market penetration of products that carry the WaterSense label. Market penetration is calculated for both existing and new construction. The NWS model estimates savings based on an accounting analysis of water-using products and of building stock. Estimates of future national water savings will help policy makers further direct the focus of WaterSense and calculate stakeholder impacts from the program. Calculating the total gallons of water the
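
    The accounting logic sketched above (shipments of labeled products enter the installed stock; savings follow from the per-use water difference) can be compressed into a toy calculation; the function and every number below are placeholders, not the NWS spreadsheet.

        # Toy stock-accounting estimate of annual water savings.
        def annual_savings_gal(stock_units, labeled_share, gal_per_use_base,
                               gal_per_use_eff, uses_per_unit_year):
            saved_per_use = gal_per_use_base - gal_per_use_eff
            return stock_units * labeled_share * uses_per_unit_year * saved_per_use

        # e.g. 40M installed faucets, 25% labeled, 0.5 gal saved per use
        print(f"{annual_savings_gal(40e6, 0.25, 2.2, 1.7, 3000):.3g} gal/yr")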

  13. Thermal characterization and analysis of microliter liquid volumes using the three-omega method.

    Science.gov (United States)

    Roy-Panzer, Shilpi; Kodama, Takashi; Lingamneni, Srilakshmi; Panzer, Matthew A; Asheghi, Mehdi; Goodson, Kenneth E

    2015-02-01

    Thermal phenomena in many biological systems offer an alternative detection opportunity for quantifying relevant sample properties. While there is substantial prior work on thermal characterization methods for fluids, the push in the biology and biomedical research communities towards analysis of reduced sample volumes drives a need to extend and scale these techniques to these volumes of interest, which can be below 100 pl. This work applies the 3ω technique to measure the temperature-dependent thermal conductivity and heat capacity of de-ionized water, silicone oil, and salt buffer solution droplets from 24 to 80 °C. Heater geometries range in length from 200 to 700 μm and in width from 2 to 5 μm to accommodate the size restrictions imposed by small volume droplets. We use these devices to measure droplet volumes of 2 μl and demonstrate the potential to extend this technique down to pl droplet volumes based on an analysis of the thermally probed volume. Sensitivity and uncertainty analyses provide guidance for relevant design variables for characterizing properties of interest by investigating the tradeoffs between measurement frequency regime, device geometry, and substrate material. Experimental results show that we can extract thermal conductivity and heat capacity with these sample volumes to within less than 1% of thermal properties reported in the literature.
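
    For reference, the slope form of the 3-omega method commonly used for such measurements (Cahill's line-heater result, assumed here; the paper's exact data reduction may differ) extracts the thermal conductivity from the in-phase temperature oscillation versus heater frequency:

        \[
          \Delta T_{2\omega} = \frac{P}{\pi l \kappa}\left[\frac{1}{2}\ln\frac{D}{r^{2}}
            - \frac{1}{2}\ln(2\omega) + \eta\right],
          \qquad
          \kappa = -\frac{P}{2\pi l}
            \left(\frac{\mathrm{d}\,\Delta T_{2\omega}}{\mathrm{d}\ln(2\omega)}\right)^{-1}
        \]

    where P is the heater power, l the heater length, r its half-width, D the sample's thermal diffusivity and eta a constant; with conductivity and diffusivity in hand, the heat capacity follows from kappa = D rho c.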

  14. Towards criterion validity in classroom language analysis: methodological constraints of metadiscourse and inter-rater agreement

    Directory of Open Access Journals (Sweden)

    Douglas Altamiro Consolo

    2001-02-01

    This paper reports on a process to validate a revised version of a system for coding classroom discourse in foreign language lessons, a context in which the dual role of language (as content and means of communication and the speakers' specific pedagogical aims lead to a certain degree of ambiguity in language analysis. The language used by teachers and students has been extensively studied, and a framework of concepts concerning classroom discourse well-established. Models for coding classroom language need, however, to be revised when they are applied to specific research contexts. The application and revision of an initial framework can lead to the development of earlier models, and to the re-definition of previously established categories of analysis that have to be validated. The procedures followed to validate a coding system are related here as guidelines for conducting research under similar circumstances. The advantages of using instruments that incorporate two types of data, that is, quantitative measures and qualitative information from raters' metadiscourse, are discussed, and it is suggested that such procedure can contribute to the process of validation itself, towards attaining reliability of research results, as well as indicate some constraints of the adopted research methodology.

  15. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  16. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  17. Underground Test Area Subproject Phase I Data Analysis Task. Volume VI - Groundwater Flow Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-11-01

    Volume VI of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the groundwater flow model data. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  18. Application of probabilistic safety analysis methodology to physical security in Unit 1 of Laguna Verde Nuclear Power plant

    International Nuclear Information System (INIS)

    The implementation and application of a methodology for probabilistic safety analysis in the Vulnerability Analysis Project of the Laguna Verde Nuclear Power Plant (CNLV), performed by the Comision Federal de Electricidad with the technical support of the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS), is presented in this work. The results obtained by applying this methodology identify the most important targets or fundamental areas of CNLV in which the execution of sabotage actions could endanger the physical security of CNLV and of the population in general. (Author)

  19. Space tug economic analysis study. Volume 1: Executive summary

    Science.gov (United States)

    1972-01-01

    An economic analysis of space tug operations is presented. The space tug is defined as any liquid propulsion stage under 100,000 pounds propellant loading that is flown from the space shuttle cargo bay. Two classes of vehicles are the orbit injection stages and reusable space tugs. The vehicle configurations, propellant combinations, and operating modes used for the study are reported. The summary contains data on the study approach, results, conclusions, and recommendations.

  20. ANALYSIS OF THE EFFECTIVENESS AND EFFICIENCY OF MANAGEMENT SYSTEMS BASED ON SYSTEM ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Yurij Vasilkov

    2014-09-01

    In this paper we consider the problem of analyzing the effectiveness and efficiency of management systems, which is particularly relevant in implementing the requirements of ISO 9001, ISO 14001 and other standards at an enterprise. Studying a management system on the basis of a systematic approach focuses on revealing its integrative (i.e., systemic) qualities and on identifying the variety of relationships and mechanisms behind these qualities. It makes it possible to identify the causes of the real state of affairs and to explain successes and failures. An important aspect of a systematic approach to analyzing the effectiveness and efficiency of production management is the multiplicity of interests of the "stakeholders" involved in the production process in forming operational goals and the ways to achieve them.

  1. Analysis Methodology for Optimal Selection of Ground Station Site in Space Missions

    Science.gov (United States)

    Nieves-Chinchilla, J.; Farjas, M.; Martínez, R.

    2013-12-01

    Optimization of ground station sites is especially important in complex missions that include several small satellites (clusters or constellations), such as the QB50 project, where one ground station has to be able to track several space vehicles, even simultaneously. In this regard the design of the communication system has to carefully take into account the ground station site and the relevant signal phenomena, which depend on the frequency band. These aspects become even more relevant to establishing a trusted communication link when the ground segment site is in an urban area and/or low orbits are selected for the space segment. In addition, updated cartography with high-resolution data of the location and its surroundings helps to develop recommendations for the design of the site for space vehicle tracking and hence to improve effectiveness. The objectives of this analysis methodology are: completion of cartographic information; modelling of the obstacles that hinder communication between the ground and space segments; and representation, in the generated 3D scene, of the degree of signal/noise impairment caused by the phenomena that interfere with communication. The integration of new geographic data capture technologies, such as 3D laser scanning, shows that optimization of the antenna elevation mask, at its AOS and LOS azimuths along the visible horizon, maximizes visibility time with space vehicles. Furthermore, from the captured three-dimensional cloud of points, specific information is selected and, using 3D modeling techniques, the 3D scene of the antenna location site and its surroundings is generated. The resulting 3D model evidences nearby obstacles related to the cartographic conditions, such as mountain formations and buildings, and any additional obstacles that interfere with the operational quality of the antenna (other antennas and electronic devices that emit or receive in the same bandwidth
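
    One concrete step of this methodology, testing look directions against a site horizon mask derived from the 3D point cloud, can be sketched as follows; the function and array names are hypothetical, and a sorted azimuth grid is assumed.

        # Sketch: is a pass sample visible above the site's horizon mask?
        import numpy as np

        def visible(az_deg, el_deg, mask_az_deg, mask_el_deg):
            """True where the look elevation clears the interpolated mask
            (mask_az_deg must be sorted, in degrees over [0, 360))."""
            mask_at = np.interp(np.asarray(az_deg) % 360.0,
                                mask_az_deg, mask_el_deg, period=360.0)
            return np.asarray(el_deg) > mask_at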

  2. Waste Isolation Pilot Plant Safety Analysis Report. Volume 5

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  3. Waste Isolation Pilot Plant Safety Analysis Report. Volume 1

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  4. Waste Isolation Pilot Plant Safety Analysis Report. Volume 2

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  5. Waste Isolation Pilot Plant Safety Analysis Report. Volume 4

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  6. Waste Isolation Pilot Plant Safety Analysis Report. Volume 3

    International Nuclear Information System (INIS)

    This Safety Analysis Report (SAR) has been prepared by the US Department of Energy (DOE) to support the construction and operation of the Waste Isolation Pilot Plant (WIPP) in southeastern New Mexico. The WIPP facility is designed to receive, inspect, emplace, and store unclassified defense-generated transuranic wastes in a retrievable fashion in an underground salt medium and to conduct studies and perform experiments in salt with high-level wastes. Upon the successful completion of these studies and experiments, WIPP is designed to serve as a permanent facility. The first chapter of this report provides a summary of the location and major design features of WIPP. Chapters 2 through 5 describe the site characteristics, design criteria, and design bases used in the design of the plant and the plant operations. Chapter 6 discusses radiation protection; Chapters 7 and 8 present an accident analysis of the plant and an assessment of the long-term waste isolation at WIPP. The conduct of operations and operating controls and limits are discussed in Chapters 9 and 10. The quality assurance programs are described in Chapter 11

  7. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare

    NARCIS (Netherlands)

    Bracke, M.B.M.; Edwards, S.A.; Metz, J.H.M.; Noordhuizen, J.P.T.M.; Algers, B.

    2008-01-01

    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally app

  8. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    The availability of reliable tools and an associated methodology able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on Monte Carlo-type random sampling of all relevant input variables. After outlining the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, its extended experimental database, and the assessment of the GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO model uncertainties is of the utmost importance for accurate fuel design margin evaluation, as illustrated by some application examples. With the submittal of the GALILEO Topical Report to the U.S. NRC in 2013, GALILEO and its methodology are on the way to being industrially used in a wide range of irradiation conditions. (authors)
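
    The Monte Carlo flavour of such a realistic methodology can be illustrated with a toy propagation; the response function and input distributions below are placeholders, not GALILEO models or AREVA data.

        # Toy Monte Carlo propagation of model/input uncertainties to an
        # upper-percentile design quantity.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 10_000

        power = rng.normal(1.00, 0.02, n)    # relative power uncertainty
        conduct = rng.normal(1.00, 0.05, n)  # fuel conductivity model factor
        gap = rng.normal(1.00, 0.10, n)      # gap conductance model factor

        def toy_fuel_temperature(p, k, h, t_base=1000.0):
            """Placeholder response: temperature rises with power and falls
            with conductivity and gap conductance."""
            return t_base * p / (0.6 * k + 0.4 * h)

        t = toy_fuel_temperature(power, conduct, gap)
        print("95th percentile:", np.percentile(t, 95))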

  9. A Framework for Decomposition and Analysis of Agile Methodologies During Their Adaptation

    Science.gov (United States)

    Mikulenas, Gytenis; Kapocius, Kestutis

    In recent years there has been a steady increase of interest in Agile software development methodologies and techniques, which are often positioned as proven alternatives to the traditional plan-driven approaches. However, although there is no shortage of Agile methodologies to choose from, the formal methods for actually choosing or adapting the right one are lacking. The aim of the presented research was to define the formal way of preparing Agile methodologies for adaptation and creating an adaptation process framework. We argue that Agile methodologies can be successfully broken down into individual parts that can be specified on three different levels and later analyzed with regard to problem/concern areas. Results of such decomposition can form the foundation for the decisions on the adaptation of the specific Agile methodology. A case study is included in this chapter to further clarify the proposed approach.

  10. Space tug economic analysis study. Volume 2: Tug concepts analysis. Part 2: Economic analysis

    Science.gov (United States)

    1972-01-01

    An economic analysis of space tug operations is presented. The subjects discussed are: (1) cost uncertainties, (2) scenario analysis, (3) economic sensitivities, (4) mixed integer programming formulation of the space tug problem, and (5) critical parameters in the evaluation of a public expenditure.

  11. The methodology for developing a prospective meta-analysis in the family planning community

    Directory of Open Access Journals (Sweden)

    Jacobson Janet C

    2011-04-01

    Conclusions: PMA is a novel research method that improves meta-analysis by including several study sites, establishing uniform reporting of specific outcomes, and yet allowing some independence on the part of individual sites with respect to the conduct of research. The inclusion of several sites increases statistical power to address important clinical questions. Compared to multi-center trials, PMA methodology encourages collaboration, aids in the development of new investigators, decreases study costs, and decreases time to publication. Trial Registration: ClinicalTrials.gov: NCT00613366, NCT00886834, NCT01001897, NCT01147497 and NCT01307111

  12. Determination of bone mineral volume fraction using impedance analysis and Bruggeman model

    Energy Technology Data Exchange (ETDEWEB)

    Ciuchi, Ioana Veronica; Olariu, Cristina Stefania, E-mail: oocristina@yahoo.com; Mitoseriu, Liliana, E-mail: lmtsr@uaic.ro

    2013-11-20

    Highlights: • The mineral volume fraction of a bone sample was determined. • The dielectric properties of the bone sample and of collagen type I were determined by impedance spectroscopy. • The Bruggeman effective medium approximation was applied in order to evaluate the mineral volume fraction of the sample. • The computed values were compared with ones derived from a histogram test performed on SEM micrographs. -- Abstract: Measurements by impedance spectroscopy and the Bruggeman effective medium approximation model were employed in order to determine the mineral volume fraction of dry bone. This approach assumes that two or more phases are present in the composite: the matrix (environment), while the other ones are inclusion phases. A fragment of femur diaphysis dense bone from a young pig was investigated in its dehydrated state. By measuring the dielectric properties of bone and its main components (hydroxyapatite and collagen) and using the Bruggeman approach, the mineral volume filling factor was determined. The computed mineral volume fraction was confirmed by a histogram test analysis based on the SEM microstructures. In spite of its simplicity, the method provides a good approximation for the bone mineral volume fraction. The method, which uses impedance spectroscopy and EMA modeling, can be further developed, by considering the conductive components of the bone tissue, into a non-invasive in situ impedance technique for bone composition evaluation and monitoring.
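
    For the symmetric two-phase Bruggeman relation with spherical inclusions, the volume fraction follows in closed form from the measured permittivities; below is a minimal sketch with hypothetical values (the paper's multi-phase treatment may differ).

        # Solve f*a + (1 - f)*b = 0 for the mineral fraction f, where a and b
        # are the Bruggeman terms of the mineral and collagen phases.
        def bruggeman_fraction(eps_mineral, eps_collagen, eps_effective):
            a = (eps_mineral - eps_effective) / (eps_mineral + 2 * eps_effective)
            b = (eps_collagen - eps_effective) / (eps_collagen + 2 * eps_effective)
            return b / (b - a)

        print(bruggeman_fraction(15.0, 4.0, 8.0))  # hypothetical permittivities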

  13. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    International Nuclear Information System (INIS)

    Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated to the overall survival. In this study, global score test methodology is used in combining the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.
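
    In generic form (an equal-weight Wei-Lin-Weissfeld-type combination is assumed here, since the record does not state the weights), the global statistic pools the K marginal score statistics U_k, with estimated covariance matrix Sigma-hat, into a single standard normal test statistic:

        \[
          U = (U_1,\ldots,U_K)^{\mathsf{T}}, \qquad
          Z_{\mathrm{global}} =
            \frac{\mathbf{1}^{\mathsf{T}} U}{\sqrt{\mathbf{1}^{\mathsf{T}} \hat{\Sigma}\, \mathbf{1}}}
            \sim N(0,1) \ \text{under } H_0
        \]

    The correlation between the recurrence times and overall survival enters through the off-diagonal elements of the estimated covariance matrix.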

  14. Error analysis of overlay compensation methodologies and proposed functional tolerances for EUV photomask flatness

    Science.gov (United States)

    Ballman, Katherine; Lee, Christopher; Dunn, Thomas; Bean, Alexander

    2016-05-01

    Due to the impact on image placement and overlay errors inherent in all reflective lithography systems, EUV reticles will need to adhere to flatness specifications below 10 nm for 2018 production. These single-value metrics are nearly impossible to meet using the current tooling infrastructure (current state-of-the-art reticles report P-V flatness of ~60 nm). In order to focus innovation on areas which lack capability for flatness compensation or correction, this paper redefines flatness metrics as "correctable" vs. "non-correctable" based on the surface topography's contributions to the final IP budget at wafer, as well as on whether data-driven corrections (write compensation or corrections at the scanner) are available for the reticle's specific shape. To better understand and define the limitations of write compensation and scanner corrections, an error budget for the processes contributing to these two methods is presented. Photomask flatness measurement tools are now targeting 6σ reproducibility <1 nm (previous 3σ reproducibility ~3 nm) in order to drive down error contributions and provide more accurate data for correction techniques. Taking advantage of the high-order measurement capabilities of improved metrology tooling, as well as computational capabilities which enable fast measurements and analysis of sophisticated shapes, we propose a methodology for the industry to create functional tolerances focused on the flatness errors that are not correctable with compensation.

  15. The Component Packaging Problem: A Vehicle for the Development of Multidisciplinary Design and Analysis Methodologies

    Science.gov (United States)

    Fadel, Georges; Bridgewood, Michael; Figliola, Richard; Greenstein, Joel; Kostreva, Michael; Nowaczyk, Ronald; Stevenson, Steve

    1999-01-01

    This report summarizes academic research which has resulted in an increased appreciation for multidisciplinary efforts among our students, colleagues and administrators. It has also generated a number of research ideas that emerged from the interaction between disciplines. Overall, 17 undergraduate students and 16 graduate students benefited directly from the NASA grant; an additional 11 graduate students were impacted and participated without financial support from NASA. The work resulted in 16 theses (with 7 to be completed in the near future) and 67 papers or reports, mostly published in 8 journals and/or presented at various conferences (a total of 83 papers, presentations and reports published based on NASA-inspired or NASA-supported work). In addition, the faculty and students presented related work at many meetings, and continuing work has been proposed to NSF, the Army, industry and other state and federal institutions to continue efforts in the direction of multidisciplinary and, more recently, multi-objective design and analysis. The specific problem addressed is component packing, which was solved as a multi-objective problem using iterative genetic algorithms and decomposition. Further testing and refinement of the methodology developed is presently under investigation. Teaming issues research and classes resulted in the publication of a web site (http://design.eng.clemson.edu/psych4991) which provides pointers and techniques to interested parties. Specific advantages of using iterative genetic algorithms, hurdles faced and resolved, and institutional difficulties associated with multi-discipline teaming are described in some detail.

  16. A methodological framework for hydromorphological assessment, analysis and monitoring (IDRAIM) aimed at promoting integrated river management

    Science.gov (United States)

    Rinaldi, M.; Surian, N.; Comiti, F.; Bussettini, M.

    2015-12-01

    A methodological framework for hydromorphological assessment, analysis and monitoring (named IDRAIM) has been developed with the specific aim of supporting the management of river processes by integrating the objectives of ecological quality and flood risk mitigation. The framework builds on existing and up-to-date geomorphological concepts and approaches and has been tested on several Italian streams. The framework includes the following four phases: (1) catchment-wide characterization of the fluvial system; (2) evolutionary trajectory reconstruction and assessment of current river conditions; (3) description of future trends of channel evolution; and (4) identification of management options. The framework provides specific consideration of the temporal context, in terms of reconstructing the trajectory of past channel evolution as a basis for interpreting present river conditions and future trends. A series of specific tools has been developed for the assessment of river conditions, in terms of morphological quality and channel dynamics. These include: the Morphological Quality Index (MQI), the Morphological Dynamics Index (MDI), the Event Dynamics Classification (EDC), and the river morphodynamic corridors (MC and EMC). The monitoring of morphological parameters and indicators, alongside the assessment of future scenarios of channel evolution provides knowledge for the identification, planning and prioritization of actions for enhancing morphological quality and risk mitigation.

  17. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    Science.gov (United States)

    Zain, Zakiyah; Aziz, Nazrina; Ahmad, Yuhaniz; Azwan, Zairul; Raduan, Farhana; Sagap, Ismail

    2014-12-01

    Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the treatment options available for patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, the set of responses concerned with survival times includes: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.
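
    As a rough illustration of the combining step described above, the following Python sketch pools per-endpoint standardized score statistics into one global chi-square statistic using their estimated correlation matrix. It is a minimal sketch of the general idea, not the authors' implementation; the z-values and correlation matrix below are invented for illustration.

        import numpy as np

        def global_score_test(z, R):
            # Combine k per-endpoint score statistics z into a single global
            # statistic z' R^{-1} z, referred to a chi-square with k df.
            z = np.asarray(z, dtype=float)
            Rinv = np.linalg.inv(np.asarray(R, dtype=float))
            return float(z @ Rinv @ z)

        # Toy example: three recurrence endpoints plus time to death
        z = [1.9, 1.4, 1.1, 2.2]
        R = np.array([[1.0, 0.5, 0.4, 0.3],
                      [0.5, 1.0, 0.5, 0.3],
                      [0.4, 0.5, 1.0, 0.3],
                      [0.3, 0.3, 0.3, 1.0]])
        print(global_score_test(z, R))   # compare to chi2(4) quantiles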

  18. Sampling and analytical methodologies for energy dispersive X-ray fluorescence analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    The present document represents an attempt to summarize the most important features of the different forms of ED-XRF as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and by other scientists, who are not yet fully experienced in the application of ED-XRF to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability. Emphasis is also placed on the sources of error affecting the sampling of airborne particulate matter. The analytical part of the document describes the different forms of ED-XRF and their potential applications. Spectrum evaluation, a key step in X-ray spectrometry, is covered in depth, including discussion of several calibration and peak fitting techniques and computer programs especially designed for this purpose. 148 refs, 25 figs, 13 tabs
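
    Peak fitting is central to the spectrum-evaluation step discussed here. The sketch below fits a single Gaussian photopeak on a linear background with scipy, a common minimal model in ED-XRF spectrum evaluation; the synthetic Fe K-alpha-like peak and all parameter values are illustrative, not taken from the document.

        import numpy as np
        from scipy.optimize import curve_fit

        def peak_model(E, area, mu, sigma, b0, b1):
            # Single Gaussian photopeak on a linear background
            gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((E - mu) / sigma) ** 2)
            return gauss + b0 + b1 * E

        # Synthetic spectrum around a 6.40 keV Fe K-alpha-like peak
        E = np.linspace(5.5, 7.5, 200)
        rng = np.random.default_rng(1)
        counts = peak_model(E, 5000, 6.40, 0.08, 50, -2) + rng.normal(0, 8, E.size)
        popt, _ = curve_fit(peak_model, E, counts, p0=(4000, 6.4, 0.1, 40, 0))
        print("fitted net peak area:", popt[0])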

  19. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    Energy Technology Data Exchange (ETDEWEB)

    Zain, Zakiyah, E-mail: zac@uum.edu.my; Ahmad, Yuhaniz, E-mail: yuhaniz@uum.edu.my [School of Quantitative Sciences, Universiti Utara Malaysia, UUM Sintok 06010, Kedah (Malaysia); Azwan, Zairul, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Raduan, Farhana, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Sagap, Ismail, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com [Surgery Department, Universiti Kebangsaan Malaysia Medical Centre, Jalan Yaacob Latif, 56000 Bandar Tun Razak, Kuala Lumpur (Malaysia); Aziz, Nazrina, E-mail: nazrina@uum.edu.my

    2014-12-04

    Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the treatment options available for patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, the set of responses concerned with survival times includes: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.

  20. Cointegration methodology for psychological researchers: An introduction to the analysis of dynamic process systems.

    Science.gov (United States)

    Stroe-Kunold, Esther; Gruber, Antje; Stadnytska, Tetiana; Werner, Joachim; Brosig, Burkhard

    2012-11-01

    Longitudinal data analysis focused on internal characteristics of a single time series has attracted increasing interest among psychologists. The systemic psychological perspective suggests, however, that many long-term phenomena are mutually interconnected, forming a dynamic system. Hence, only multivariate methods can handle such human dynamics appropriately. Unlike the majority of time series methodologies, the cointegration approach allows interdependencies of integrated (i.e., extremely unstable) processes to be modelled. This advantage results from the fact that cointegrated series are connected by stationary long-run equilibrium relationships. Vector error-correction models are frequently used representations of cointegrated systems. They capture both this equilibrium and compensation mechanisms in the case of short-term deviations due to developmental changes. Thus, the past disequilibrium serves as explanatory variable in the dynamic behaviour of current variables. Employing empirical data from cognitive psychology, psychosomatics, and marital interaction research, this paper describes how to apply cointegration methods to dynamic process systems and how to interpret the parameters under investigation from a psychological perspective.
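
    A minimal numerical sketch of the error-correction idea: two cointegrated random walks are linked by a stationary long-run relation, and the lagged disequilibrium enters the regression for the short-run changes. This is a bare-bones Engle-Granger-style two-step estimate in numpy, not the full vector error-correction modelling used in the paper; all series and coefficients are simulated.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 400
        x = np.cumsum(rng.normal(size=n))            # integrated I(1) process
        y = 0.8 * x + rng.normal(scale=0.5, size=n)  # cointegrated with x

        # Step 1: long-run equilibrium y = beta * x
        beta = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
        ect = y - beta * x                           # error-correction term

        # Step 2: short-run dynamics driven by the lagged disequilibrium
        dy, dx = np.diff(y), np.diff(x)
        X = np.column_stack([np.ones(n - 1), ect[:-1], dx])
        coef = np.linalg.lstsq(X, dy, rcond=None)[0]
        print("adjustment speed alpha:", coef[1])    # expected to be negative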

  1. Summer 2012 Testing and Analysis of the Chemical Mixture Methodology -- Part I

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, Clifford S.; Yu, Xiao-Ying; Coggin, Rebekah L.; Ponder, Lashaundra A.; Booth, Alexander E.; Petrocchi, Achille J.; Horn, Sarah M.; Yao, Juan

    2012-07-01

    This report presents the key findings made by the Chemical Mixture Methodology (CMM) project team during the first stage of their summer 2012 testing and analysis of the CMM. The study focused on answering the following questions:
    o What is the percentage of the chemicals in the CMM Rev 27 database associated with each Health Code Number (HCN)? How does this result influence the relative importance of acute HCNs and chronic HCNs in the CMM data set?
    o What is the benefit of using the HCN-based approach? Which Modes of Action and Target Organ Effects tend to be important in determining the HCN-based Hazard Index (HI) for a chemical mixture?
    o What are some of the potential issues associated with the current HCN-based approach? What are the opportunities for improving the performance and/or technical defensibility of the HCN-based approach? How would those improvements increase the benefit of using the HCN-based approach?
    o What is the Target Organ System Effect approach and how can it be used to improve upon the current HCN-based approach? How do the benefits users would derive from using the Target Organ System approach compare to the benefits available from the current HCN-based approach?
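
    The HCN-based Hazard Index referred to above is, in essence, a sum of concentration-to-limit ratios accumulated per health code. A toy sketch of that accumulation (all chemicals, limits and HCN labels below are hypothetical):

        # Hazard index as the sum of concentration-to-limit ratios, grouped by
        # a (hypothetical) health code; HI >= 1 flags a protective-action concern.
        releases = {                   # chemical: (concentration, limit, HCN)
            "chem_A": (1.2, 10.0, "acute_respiratory"),
            "chem_B": (0.4, 1.0, "acute_respiratory"),
            "chem_C": (5.0, 50.0, "chronic_liver"),
        }
        hi_by_hcn = {}
        for conc, limit, hcn in releases.values():
            hi_by_hcn[hcn] = hi_by_hcn.get(hcn, 0.0) + conc / limit
        print(hi_by_hcn)   # {'acute_respiratory': 0.52, 'chronic_liver': 0.1}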

  2. Introduction on performance analysis and profiling methodologies for KVM on ARM virtualization

    Science.gov (United States)

    Motakis, Antonios; Spyridakis, Alexander; Raho, Daniel

    2013-05-01

    The introduction of hardware virtualization extensions on ARM Cortex-A15 processors has enabled the implementation of full virtualization solutions for this architecture, such as KVM on ARM. This trend motivates the need to quantify and understand the performance impact that arises from the application of this technology. In this work we start looking into some interesting performance metrics on KVM for ARM processors, which can provide useful insight and may lead to potential improvements in the future. This includes measurements such as interrupt latency and guest exit cost, performed on ARM Versatile Express and Samsung Exynos 5250 hardware platforms. Furthermore, we discuss additional methodologies that can, in the future, provide a deeper understanding of the performance footprint of KVM. We identify some of the most interesting approaches in this field, and perform a tentative analysis of how these may be implemented in the KVM on ARM port. These take into consideration hardware- and software-based counters for profiling, and issues related to the limitations of the simulators that are often used, such as the ARM Fast Models platform.

  3. Modeling and Analysis of MRR, EWR and Surface Roughness in EDM Milling through Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    A. K.M.S. Iqbal

    2010-01-01

    Problem statement: Electrical Discharge Machining (EDM) has grown over the last few decades from a novelty to a mainstream manufacturing process. Though the EDM process is in high demand, its mechanism is complex and far from completely understood. It is difficult to establish a model that can accurately predict the performance by correlating the process parameters. Optimum processing parameters are essential to increase the production rate and decrease the machining time, since the materials processed by EDM, and the process itself, are very costly. This research establishes empirical relations between machining parameters and the responses in analyzing the machinability of the stainless steel. Approach: The machining factors used are voltage, rotational speed of electrode and feed rate over the responses MRR, EWR and Ra. Response surface methodology was used to investigate the relationships and parametric interactions between the three controllable variables on the MRR, EWR and Ra. A central composite experimental design was used to estimate the model coefficients of the three factors. The responses were modeled using a response surface model based on experimental results. The significant coefficients were obtained by performing Analysis Of Variance (ANOVA) at the 95% level of significance. Results: The variation in percentage errors for the developed models was found to be within 5%. Conclusion: The developed models show that voltage and rotary motion of the electrode are the most significant machining parameters influencing MRR, EWR and Ra. These models can be used to obtain the desired responses within the experimental range.
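
    The modelling step described here amounts to fitting a second-order polynomial response surface by least squares and screening coefficients with ANOVA. A minimal numpy sketch of the surface-fitting part, with synthetic data in coded units standing in for the experimental results:

        import numpy as np

        # Second-order response surface:
        # y = b0 + sum(bi*xi) + sum(bii*xi^2) + cross terms, for three coded factors
        rng = np.random.default_rng(2)
        X = rng.uniform(-1, 1, size=(20, 3))        # voltage, speed, feed (coded)
        y = 5 + 2*X[:, 0] + 1.5*X[:, 1] - X[:, 2] + 0.8*X[:, 0]*X[:, 1] \
            + rng.normal(0, 0.1, 20)

        def design_matrix(X):
            x1, x2, x3 = X.T
            return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                    x1**2, x2**2, x3**2,
                                    x1*x2, x1*x3, x2*x3])

        b, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
        print("fitted coefficients:", np.round(b, 2))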

  4. Elaboration of the methodological referential for life cycle analysis of first generation biofuels in the French context

    International Nuclear Information System (INIS)

    This study was made in the particular context of strong growth in the biofuels market and the involvement of French and European public authorities, and certain Member States (Germany, Netherlands, UK), in the development of certification schemes for first generation biofuels. The elaboration of such schemes requires a consensus on the methodology to apply when producing Life Cycle Analyses (LCA) of biofuels. To answer this demand, the study built up the methodological referential for biofuels LCAs in order to assess the Greenhouse Gas (GHG) emissions, fossil fuel consumption and local atmospheric pollutant emissions induced by the different biofuel production pathways. The work consisted in methodological engineering, and was accomplished thanks to the participation of all the members of the Technical Committee of the study. An initial bibliographic review on biofuels LCAs allowed the identification of the main methodological issues (listed below). For each point, the impact of the methodological choices on the biofuels' environmental balances was assessed by several sensitivity analyses. The results of these analyses were taken into account for the elaboration of the recommendations:
    - Consideration of the environmental burdens associated with buildings, equipment and their maintenance
    - Quantification of nitrous oxide (N2O) emissions from fields
    - Impact of Land Use Change (LUC)
    - Allocation method for the distribution of the environmental impacts of biofuel production pathways between the different products and coproducts generated (a toy allocation example follows this record)
    Within the framework of this study, we made no distinction in terms of methodological approach between GHG emissions and local pollutant emissions. This results from the fact that the methodological issues cover all the environmental burdens and do not require specific approaches. This executive summary presents the methodological aspects related to biofuels LCAs. The complete report of the study presents in addition
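
    Of the issues listed above, allocation is the most mechanical: pathway burdens are split between products and coproducts according to a chosen key. A toy sketch of energy-content allocation (all figures invented):

        # Energy allocation: the pathway's burdens are split between the fuel and
        # its coproduct in proportion to their energy outputs (values illustrative).
        total_ghg = 100.0                                  # kg CO2-eq for the pathway
        energy = {"ethanol": 80.0, "DDGS_coproduct": 20.0} # MJ of each output
        total_energy = sum(energy.values())
        allocated = {k: total_ghg * v / total_energy for k, v in energy.items()}
        print(allocated)   # {'ethanol': 80.0, 'DDGS_coproduct': 20.0}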

  5. Wind power projects in the CDM: Methodologies and tools for baselines, carbon financing and sustainability analysis

    International Nuclear Information System (INIS)

    The report is intended to be a guidance document for project developers, investors, lenders, and CDM host countries involved in wind power projects in the CDM. The report explores in particular those issues that are important in CDM project assessment and development, that is, baseline development, carbon financing, and environmental sustainability. It does not deal in detail with those issues that are routinely covered in a standard wind power project assessment. The report tests, compares, and recommends methodologies for and approaches to baseline development. To present the application and implications of the various methodologies and approaches in a concrete context, Africa's largest wind farm, namely the 60 MW wind farm located in Zafarana, Egypt, is examined as a hypothetical CDM wind power project. The report shows that for the present case example there is a difference of about 25% between the lowest (0.5496 tCO2/MWh) and the highest emission rate (0.6868 tCO2/MWh) estimated in accordance with the three standardized approaches to baseline development under the Marrakesh Accord. This difference in emission factors comes about partly as a result of including hydroelectric power in the baseline scenario. Hydroelectric resources constitute around 21% of the generation capacity in Egypt, and, if hydropower is excluded, the difference between the lowest and the highest baseline is reduced to 18%. Furthermore, since the two variations of the 'historical' baseline option examined result in the highest and the lowest baselines, disregarding this baseline option altogether reduces the difference between the lowest and the highest to 16%. The ES3-model, which the Systems Analysis Department at Risoe National Laboratory has developed, makes it possible for this report to explore the project-specific approach to baseline development in some detail. Based on quite disaggregated data on the Egyptian electricity system, including the wind power production
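
    Numerically, the standardized baselines discussed here reduce to generation-weighted average emission factors over some slice of the grid. The sketch below uses an invented generation mix purely to reproduce the qualitative effect noted in the abstract: excluding zero-emission hydropower raises the baseline emission factor.

        # Grid baseline emission factor as a generation-weighted average; the mix
        # is illustrative, not the actual Egyptian system data used in the report.
        plants = [  # (annual generation in GWh, emission factor in tCO2/MWh)
            (60000, 0.55),   # gas-fired
            (25000, 0.80),   # oil-fired
            (21000, 0.0),    # hydro
        ]
        total_gen = sum(g for g, _ in plants)
        ef = sum(g * f for g, f in plants) / total_gen
        print(f"baseline EF with hydro:    {ef:.4f} tCO2/MWh")    # 0.5000
        ef_no_hydro = sum(g * f for g, f in plants[:2]) / sum(g for g, _ in plants[:2])
        print(f"baseline EF without hydro: {ef_no_hydro:.4f} tCO2/MWh")  # 0.6235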

  6. Methodological Synthesis in Quantitative L2 Research: A Review of Reviews and a Case Study of Exploratory Factor Analysis

    Science.gov (United States)

    Plonsky, Luke; Gonulal, Talip

    2015-01-01

    Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…

  7. Scaling analysis for mixing in large stratified volumes of passive containment

    International Nuclear Information System (INIS)

    Integral tests play a key role in assessing the feasibility of the passive containment cooling system (PCCS) and the accuracy of the calculation model. The scaling analysis for mixing in large stratified volumes of the PCCS provides the primary theoretical basis for determining the key dimensions of the integral test facility. The key parameters for mixing in large stratified volumes were obtained by scaling analysis based on the hierarchical two-tiered scaling method. The similarity criteria that ensure the integral test facility can adequately simulate mixing in the passive containment were obtained. (authors)

  8. Comparison of nested case-control and survival analysis methodologies for analysis of time-dependent exposure

    Directory of Open Access Journals (Sweden)

    Platt Robert W

    2005-01-01

    Abstract Background Epidemiological studies of exposures that vary with time require an additional level of methodological complexity to account for the time-dependence of exposure. This study compares a nested case-control approach for the study of time-dependent exposure with cohort analysis using Cox regression including time-dependent covariates. Methods A cohort of 1340 subjects with four fixed and seven time-dependent covariates was used for this study. Nested case-control analyses were repeated 100 times for each of 4, 8, 16, 32, and 64 controls per case, and point estimates were compared to those obtained using Cox regression on the full cohort. Computational efficiencies were evaluated by comparing central processing unit times required for analysis of the cohort at sizes 1, 2, 4, 8, 16, and 32 times its initial size. Results Nested case-control analyses yielded results that were similar to results of Cox regression on the full cohort. Cox regression was found to be 125 times slower than the nested case-control approach (using four controls per case). Conclusions The nested case-control approach is a useful alternative for cohort analysis when studying time-dependent exposures. Its superior computational efficiency may be particularly useful when studying rare outcomes in databases, where the ability to analyze larger sample sizes can improve the power of the study.
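
    The core of the nested case-control design is risk-set sampling: for each case, controls are drawn from subjects still at risk at the case's event time. A minimal Python sketch of that sampling step (the toy cohort is invented; real implementations also handle ties, matching factors and censoring in more detail):

        import random
        random.seed(42)

        def sample_risk_sets(subjects, m):
            # subjects: list of (id, event_or_censoring_time, is_case);
            # for each case, draw m controls from subjects still at risk at its time.
            sets = []
            for cid, t, is_case in subjects:
                if not is_case:
                    continue
                at_risk = [s for s, ts, _ in subjects if ts >= t and s != cid]
                sets.append((cid, random.sample(at_risk, min(m, len(at_risk)))))
            return sets

        cohort = [(1, 2.0, True), (2, 3.5, False), (3, 4.0, True),
                  (4, 5.0, False), (5, 6.0, False)]
        print(sample_risk_sets(cohort, m=2))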

  9. Problems of method of technology assessment. A methodological analysis; Methodenprobleme des Technology Assessment; Eine methodologische Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, V.

    1993-03-01

    The study undertakes to analyse the theoretical and methodological structure of Technology Assessment (TA). It is based on a survey of TA studies, which provided an important precondition for theoretically sound statements on methodological aspects of TA. It was established that the main basic theoretical problems of TA lie in the field of dealing with complexity. This is also apparent in the constitution of problems, the most elementary and central approach of TA. Scientifically founded constitution of problems and the corresponding construction of models call for interdisciplinary scientific work. Interdisciplinarity in the TA research process is achieved at the level of virtual networks, these networks being composed of individuals suited to teamwork. The emerging network structures have an objective-organizational and an ideational basis. The objective-organizational basis is mainly the result of team composition and the external affiliations of the team members. The ideational basis of the virtual network is represented by the team members' mode of thinking, which is individually located at a multidisciplinary level. The theoretical 'skeleton' of the TA knowledge system, which is represented by process-knowledge-based linkage structures, can be generated and processed in connection with knowledge on types of problems, areas of analysis and procedures for dealing with complexity. Within this process, disciplinary knowledge is a necessary but not a sufficient condition. Metatheoretical and metadisciplinary knowledge, and the correspondingly processed complexity of models, are the basis for the necessary methodological awareness that allows TA to become designable as a research procedure. (orig./HP)

  10. Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, Douglas B.; Anderson, Dave M.; Belzer, David B.; Cort, Katherine A.; Dirks, James A.; Hostick, Donna J.

    2004-06-18

    The requirements of the Government Performance and Results Act (GPRA) of 1993 mandate the reporting of outcomes expected to result from programs of the Federal government. The U.S. Department of Energy's (DOE's) Office of Energy Efficiency and Renewable Energy (EERE) develops official metrics for its 11 major programs using its Office of Planning, Budget Formulation, and Analysis (OPBFA). OPBFA conducts an annual integrated modeling analysis to produce estimates of the energy, environmental, and financial benefits expected from EERE's budget request. Two of EERE's major programs are the Building Technologies Program (BT) and the Office of Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports the OPBFA effort by developing the program characterizations and other market information affecting these programs that are necessary to provide input to the EERE integrated modeling analysis. Throughout the report we refer to these programs as “buildings-related” programs, because the approach is not limited in application to BT or WIP. To adequately support OPBFA in the development of official GPRA metrics, PNNL communicates with the various activities and projects in BT and WIP to determine how best to characterize their activities planned for the upcoming budget request. PNNL then analyzes these projects to determine what the results of the characterizations would imply for energy markets, technology markets, and consumer behavior. This is accomplished by developing nonintegrated estimates of the energy, environmental, and financial benefits (i.e., outcomes) of the technologies and practices expected to result from the budget request. These characterizations and nonintegrated modeling results are provided to OPBFA as inputs to the official benefits estimates developed for the Federal Budget. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits.

  11. Principal component and volume of interest analyses in depressed patients imaged by {sup 99m}Tc-HMPAO SPET: a methodological comparison

    Energy Technology Data Exchange (ETDEWEB)

    Pagani, Marco [Institute of Cognitive Sciences and Technologies, CNR, Rome (Italy); Section of Nuclear Medicine, Department of Hospital Physics, Karolinska Hospital, Stockholm (Sweden); Gardner, Ann; Haellstroem, Tore [NEUROTEC, Division of Psychiatry, Karolinska Institutet, Huddinge University Hospital, Stockholm (Sweden); Salmaso, Dario [Institute of Cognitive Sciences and Technologies, CNR, Rome (Italy); Sanchez Crespo, Alejandro; Jonsson, Cathrine; Larsson, Stig A. [Section of Nuclear Medicine, Department of Hospital Physics, Karolinska Hospital, Stockholm (Sweden); Jacobsson, Hans [Department of Radiology, Karolinska Hospital, Stockholm (Sweden); Lindberg, Greger [Department of Medicine, Division of Gastroenterology and Hepatology, Karolinska Institutet, Huddinge University Hospital, Stockholm (Sweden); Waegner, Anna [Department of Clinical Neuroscience, Division of Neurology, Karolinska Hospital, Stockholm (Sweden)

    2004-07-01

    Previous regional cerebral blood flow (rCBF) studies on patients with unipolar major depressive disorder (MDD) have analysed clusters of voxels or single regions and yielded conflicting results, showing either higher or lower rCBF in MDD as compared to normal controls (CTR). The aim of this study was to assess rCBF distribution changes in 68 MDD patients, investigating the data set with both volume of interest (VOI) analysis and principal component analysis (PCA). The rCBF distribution in 68 MDD and 66 CTR, at rest, was compared. Technetium-99m d,l-hexamethylpropylene amine oxime single-photon emission tomography was performed and the uptake in 27 VOIs, bilaterally, was assessed using a standardising brain atlas. Data were then grouped into factors by means of PCA performed on rCBF of all 134 subjects and based on all 54 VOIs. VOI analysis showed a significant group x VOI x hemisphere interaction (P<0.001). rCBF in eight VOIs (in the prefrontal, temporal, occipital and central structures) differed significantly between groups at the P<0.05 level. PCA identified 11 anatomo-functional regions that interacted with groups (P<0.001). As compared to CTR, MDD rCBF was relatively higher in right associative temporo-parietal-occipital cortex (P<0.01) and bilaterally in prefrontal (P<0.005) and frontal cortex (P<0.025), anterior temporal cortex and central structures (P<0.05 and P<0.001 respectively). Higher rCBF in a selected group of MDD as compared to CTR at rest was found using PCA in five clusters of regions sharing close anatomical and functional relationships. At the single VOI level, all eight regions showing group differences were included in such clusters. PCA is a data-driven method for recasting VOIs to be used for group evaluation and comparison. The appearance of significant differences absent at the VOI level emphasises the value of analysing the relationships among brain regions for the investigation of psychiatric disease. (orig.)
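
    The PCA step described here can be reproduced in outline with a singular value decomposition of the centred subjects-by-VOIs matrix. The sketch below uses random data of the same shape (134 subjects, 54 VOIs) purely to show the mechanics, not the study's rCBF values:

        import numpy as np

        # PCA of rCBF uptake: rows = 134 subjects, columns = 54 VOIs (synthetic)
        rng = np.random.default_rng(3)
        X = rng.normal(size=(134, 54))
        Xc = X - X.mean(axis=0)               # centre each VOI
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        explained = s**2 / np.sum(s**2)       # variance share per component
        scores = Xc @ Vt.T                    # subject scores on the components
        print("variance explained by first 11 components:", explained[:11].sum())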

  12. Computer Aided Methodology for Simultaneous Synthesis, Design & Analysis of Chemical Products-Processes

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2006-01-01

    A new combined methodology for computer aided molecular design and process flowsheet design is presented. The methodology is based on the group contribution approach for prediction of molecular properties and design of molecules. Using the same principles, process groups have been developed together with their corresponding flowsheet property models. To represent the process flowsheets in the same way as molecules, a unique but simple notation system has been developed. The methodology has been converted into a prototype software, which has been tested with several case studies covering a wide range of problems. In this paper, only the computer aided flowsheet design related features are presented.
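
    The group-contribution principle underlying the methodology is simply additivity over functional groups. A toy sketch (the group inventory and contribution values below are placeholders, not the parameters used in the paper):

        # First-order group-contribution estimate: a property is the sum of the
        # contributions of the functional groups present (numbers hypothetical).
        contributions = {"CH3": 23.6, "CH2": 22.9, "OH": 92.9}   # per-group values
        molecule = {"CH3": 1, "CH2": 2, "OH": 1}                 # group counts
        prop = sum(n * contributions[g] for g, n in molecule.items())
        print("estimated property:", prop)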

  13. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 5 - North Anna Unit 1 Cycle 5

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M.

    1993-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor (AFR) criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark AFR criticality analysis methods using selected critical configurations from commercial pressurized-water reactors (PWR). The analysis methodology selected for all calculations reported herein was the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies in the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and to provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k{sub eff}) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of one reactor critical configuration for North Anna Unit 1 Cycle 5. This unit and cycle were chosen for a previous analysis using a different methodology because detailed isotopics from multidimensional reactor calculations were available from the Virginia Power Company. These data permitted comparison of criticality calculations directly using the utility-calculated isotopics to those using the isotopics generated by the SCALE-4 SAS2H

  14. 'Rosatom' sites vulnerability analysis and assessment of their physical protection effectiveness. Methodology and 'tools'

    International Nuclear Information System (INIS)

    Full text: Enhancement of physical protection (PP) efficiency at nuclear sites (NS) of State Corporation (SC) 'Rosatom' is one of its priorities. This issue is reflected in a series of international and Russian documents. PP enhancement at the sites can be achieved through upgrades of both administrative procedures and the technical security system. However, in any case it is requisite to initially identify the so-called 'objects of physical protection', that is, to answer the question of what we need to protect, and to identify design basis threats (DBT) and adversary models. Answers to these questions constitute the contents of papers on vulnerability analysis (VA) for nuclear sites. Further, it is necessary to answer the question of to what extent we protect these 'objects of physical protection' and the site as a whole; this is the essence of assessment of physical protection effectiveness. In the process of effectiveness assessment at specific Rosatom sites we assess the effectiveness of the existing physical protection system (PPS) and the proposed options for its upgrades. In addition, it becomes possible to select the optimal option based on a 'cost-efficiency' criterion. Implementation of this work is a mandatory requirement as defined in federal-level documents. In State Corporation 'Rosatom' there are methodologies in place for vulnerability analysis and effectiveness assessment, as well as 'tools' (methods, regulations, computer software) that make it possible to put the above work into practice. There are corresponding regulations developed and approved by the Rosatom senior management. Special software for PPS effectiveness assessment called 'Vega-2', developed by a specialized Rosatom subsidiary, State Enterprise 'Eleron', is designed to assess PPS effectiveness at fixed nuclear sites. It was implemented at practically all the major Rosatom nuclear sites. As of now, this 'Vega-2' software has been certified and prepared for forwarding to corporation's nuclear sites so

  15. Proposition of a modeling and an analysis methodology of integrated reverse logistics chain in the direct chain

    Directory of Open Access Journals (Sweden)

    Faycal Mimouni

    2016-04-01

    Purpose: To propose a modeling and analysis methodology, based on the combination of Bayesian networks and Petri networks, for reverse logistics integrated into the direct supply chain. Design/methodology/approach: Network modeling by combining Petri and Bayesian networks. Findings: Modeling with a Bayesian network complemented with a Petri network to break the cycle problem in the Bayesian network. Research limitations/implications: Demands are independent from returns. Practical implications: The model can only be used for nonperishable products. Social implications: Legislation aspects: recycling laws; protection of the environment; client satisfaction via after-sale service. Originality/value: A Bayesian network with a cycle, combined with a Petri network.

  16. Methodology and Systems Analysis of Data Mining Model for Successful Implementation of Data Warehouse in Tertiary Institutions

    OpenAIRE

    U.F. Eze

    2014-01-01

    This research work, titled “Methodology and system analysis of data mining model for successful implementation of data warehouse in tertiary institutions”, is a proposal that provides a framework used to structure, plan, and control the process involved in information discovery for tertiary institutions. It equally deals with the series of steps or procedures which govern the analysis and design of this particular Data Mining Model for Tertiary Institutions. The methods, techniques ...

  17. Methodologies for localizing loco-regional hypopharyngeal carcinoma recurrences in relation to FDG-PET positive and clinical radiation therapy target volumes

    DEFF Research Database (Denmark)

    Due, Anne Kirkebjerg; Korreman, Stine Sofia; Tomé, Wolfgang;

    2010-01-01

    Focal methods to determine the source of recurrence are presented, tested for reproducibility and compared to volumetric approaches with respect to the number of recurrences ascribed to the FDG-PET positive and high dose volumes.

  18. Geo-ethical dimension of community's safety: rural and urban population vulnerability analysis methodology

    Science.gov (United States)

    Kostyuchenko, Yuriy; Movchan, Dmytro; Kopachevsky, Ivan; Yuschenko, Maxim

    2016-04-01

    The modern world is based on relations more than on causalities, so communicative, socio-economic, and socio-cultural issues are important for understanding the nature of risks and for making correct, ethical decisions. Today a large part of risk analysts have declared a new nature of modern risks. We face coherent or systemic risks, the realization of which leads to domino effects and unexpected growth of losses and fatalities. This type of risk originates in the complicated nature of the heterogeneous environment, the close interconnection of engineering networks, and the changing structure of society. A heterogeneous multi-agent environment generates systemic risks, which require the analysis of multi-source data with sophisticated tools. The formal basis for the analysis of this type of risk has been developed during the last 5-7 years. But issues of social fairness, ethics, and education require further development. One aspect of the analysis of social issues in risk management is studied in this paper. A formal algorithm for quantitative analysis of multi-source data is proposed. As demonstrated, using the proposed methodological basis and algorithm, it is possible to obtain a regularized spatial-temporal distribution of the investigated parameters over the whole observation period with rectified reliability and controlled uncertainty. The results of the disaster data analysis demonstrate that about half of direct disaster damage might be caused by social factors: education, experience and social behaviour. Using the data presented, it is also possible to estimate quantitative parameters of the loss distributions: a relation between education, age, experience, and losses, as well as vulnerability (in terms of probable damage) with respect to financial status at the current social density. It is demonstrated that at the wide-scale range, education determines risk perception and hence the vulnerability of societies. But at the local level there are important heterogeneities. Land use and urbanization structure essentially influence vulnerability. The way to

  19. Between analysis and transformation: technology, methodology and evaluation on the SPLICE project

    OpenAIRE

    Johnson, Mark; Lager, Peter; Pollard, Bill; Hall, Graham; Edwards, Miranda; Whitfield, Pete; Ward, Rupert

    2009-01-01

    This paper concerns the ways in which technological change may entail methodological development in e-learning research. The focus of our argument centres on the subject of evaluation in e-learning and how technology can contribute to consensus-building on the value of project outcomes, and the identification of mechanisms behind those outcomes. We argue that a critical approach to the methodology of evaluation which harnesses technology in this way is vital to agile and effective pol...

  20. Comparing Rough Set Theory with Multiple Regression Analysis as Automated Valuation Methodologies

    OpenAIRE

    Maurizio d’Amato

    2007-01-01

    This paper focuses on the problem of applying rough set theory to mass appraisal. This methodology was first introduced by a Polish mathematician, and has been applied recently as an automated valuation methodology by the author. The method allows the appraiser to estimate a property without defining econometric modeling, although it does not give any quantitative estimation of marginal prices. In a previous paper by the author, data were organized into classes prior to the valuation process,...

  1. Multiscale Entropy Analysis of Center-of-Pressure Dynamics in Human Postural Control: Methodological Considerations

    OpenAIRE

    Gow, Brian J.; Chung-Kang Peng; Wayne, Peter M.; Andrew C Ahn

    2015-01-01

    Multiscale entropy (MSE) is a widely used metric for characterizing the nonlinear dynamics of physiological processes. Significant variability, however, exists in the methodological approaches to MSE which may ultimately impact results and their interpretations. Using publications focused on balance-related center of pressure (COP) dynamics, we highlight sources of methodological heterogeneity that can impact study findings. Seventeen studies were systematically identified that employed MSE f...
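
    For readers unfamiliar with the metric, MSE coarse-grains a series at successive scales and computes sample entropy at each scale. A compact, simplified numpy sketch is given below; the parameter choices m = 2 and r = 0.15 are common defaults in the COP literature, but they are exactly the kind of methodological choice the review is about, and this implementation (tolerance taken from each coarse-grained series) is only one of several variants.

        import numpy as np

        def sample_entropy(x, m=2, r=0.15):
            # SampEn: -ln(A/B), matches judged by Chebyshev distance < r * std
            x = np.asarray(x, dtype=float)
            tol = r * x.std()
            def pair_matches(mm):
                templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
                d = np.max(np.abs(templ[:, None] - templ[None, :]), axis=2)
                return (np.sum(d < tol) - len(templ)) / 2   # exclude self-matches
            B, A = pair_matches(m), pair_matches(m + 1)
            return -np.log(A / B) if A > 0 and B > 0 else np.inf

        def mse(x, scales=range(1, 6)):
            # Coarse-grain by non-overlapping averaging, then SampEn per scale
            x = np.asarray(x, dtype=float)
            return [sample_entropy(x[:len(x) // s * s].reshape(-1, s).mean(axis=1))
                    for s in scales]

        print(mse(np.random.default_rng(4).normal(size=1000)))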

  2. A general methodology for mobility analysis of mechanisms based on constraint screw theory

    Institute of Scientific and Technical Information of China (English)

    HUANG Zhen; LIU JingFang; ZENG DaXing

    2009-01-01

    It is well known that the traditional Grübler-Kutzbach formula fails to calculate the mobility of some classical mechanisms or many modern parallel robots, and this situation seriously hampers mechanical innovation. To seek an efficient and universal method for mobility calculation has been a heated topic in the sphere of mechanism. The modified Grübler-Kutzbach criterion proposed by us achieved success in calculating the mobility of a lot of highly complicated mechanisms, especially the mobility of all recent parallel mechanisms listed by Gogu, and the Bennett mechanism known for its particular difficulty. With wide applications of the criterion, a systematic methodology has recently formed. This paper systematically presents the methodology based on the screw theory for the first time and analyzes six representative puzzling mechanisms. In addition, the methodology is convenient for judgment of the instantaneous or full-cycle mobility, and has become an effective and general method of great scientific value and practical significance. In the first half, this paper introduces the basic screw theory, then it presents the effective methodology formed within this decade. The second half of this paper presents how to apply the methodology by analyzing the mobility of several puzzling mechanisms. Finally, this paper contrasts and analyzes some different methods and interprets the essential reason for the validity of our methodology.
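
    For orientation, the modified Grübler-Kutzbach criterion is commonly stated as M = d(n - g - 1) + Σf_i + v - ζ, with d the order of the mechanism (6 minus the number of common constraints), n the number of links, g the number of joints, f_i the joint freedoms, v the redundant constraints and ζ the passive degrees of freedom. A minimal sketch with a planar four-bar linkage as the check case; the real screw-theoretic work lies in determining d, v and ζ, which this toy skips.

        def mobility(d, n, g, f, v=0, zeta=0):
            # Modified Grubler-Kutzbach criterion: M = d(n - g - 1) + sum(f) + v - zeta
            return d * (n - g - 1) + sum(f) + v - zeta

        # Planar four-bar linkage: d = 3, four links, four revolute joints (f = 1 each)
        print(mobility(d=3, n=4, g=4, f=[1, 1, 1, 1]))   # -> 1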

  3. A general methodology for mobility analysis of mechanisms based on constraint screw theory

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    It is well known that the traditional Grübler-Kutzbach formula fails to calculate the mobility of some classical mechanisms or many modern parallel robots, and this situation seriously hampers mechanical innovation. To seek an efficient and universal method for mobility calculation has been a heated topic in the sphere of mechanism. The modified Grübler-Kutzbach criterion proposed by us achieved success in calculating the mobility of a lot of highly complicated mechanisms, especially the mobility of all recent parallel mechanisms listed by Gogu, and the Bennett mechanism known for its particular difficulty. With wide applications of the criterion, a systematic methodology has recently formed. This paper systematically presents the methodology based on the screw theory for the first time and analyzes six representative puzzling mechanisms. In addition, the methodology is convenient for judgment of the instantaneous or full-cycle mobility, and has become an effective and general method of great scientific value and practical significance. In the first half, this paper introduces the basic screw theory, then it presents the effective methodology formed within this decade. The second half of this paper presents how to apply the methodology by analyzing the mobility of several puzzling mechanisms. Finally, this paper contrasts and analyzes some different methods and interprets the essential reason for the validity of our methodology.

  4. Simple, fast, and accurate methodology for quantitative analysis using Fourier transform infrared spectroscopy, with bio-hybrid fuel cell examples.

    Science.gov (United States)

    Mackie, David M; Jahnke, Justin P; Benyamin, Marcus S; Sumner, James J

    2016-01-01

    The standard methodologies for quantitative analysis (QA) of mixtures using Fourier transform infrared (FTIR) instruments have evolved until they are now more complicated than necessary for many users' purposes. We present a simpler methodology, suitable for widespread adoption of FTIR QA as a standard laboratory technique across disciplines by occasional users.
    • Algorithm is straightforward and intuitive, yet it is also fast, accurate, and robust.
    • Relies on component spectra, minimization of errors, and local adaptive mesh refinement.
    • Tested successfully on real mixtures of up to nine components.
    We show that our methodology is robust to challenging experimental conditions such as similar substances, component percentages differing by three orders of magnitude, and imperfect (noisy) spectra. As examples, we analyze biological, chemical, and physical aspects of bio-hybrid fuel cells.
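
    The least-squares core of such a methodology can be sketched as a non-negative mixture fit of component spectra. The example below builds three synthetic Gaussian 'component spectra' and recovers mixture fractions with scipy's NNLS solver; it illustrates the principle only and omits the paper's error minimization and adaptive mesh refinement.

        import numpy as np
        from scipy.optimize import nnls

        # Mixture spectrum as a non-negative combination of component spectra:
        # solve min ||A w - b|| with w >= 0, then normalise w to fractions.
        wavenumbers = np.linspace(800, 1800, 500)
        def band(c, w=40.0):
            return np.exp(-0.5 * ((wavenumbers - c) / w) ** 2)
        A = np.column_stack([band(1050), band(1400), band(1650)])  # 3 pure components
        true_w = np.array([0.6, 0.3, 0.1])
        b = A @ true_w + np.random.default_rng(5).normal(0, 0.005, wavenumbers.size)
        w, _ = nnls(A, b)
        print("estimated fractions:", np.round(w / w.sum(), 3))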

  5. Informational database methodology for urban risk analysis.Case study: the historic centre of Bucharest

    Science.gov (United States)

    Armas, I.; Dumitrascu, S.

    2009-04-01

    , but is also a very populated area, these being factors that favour a high susceptibility level. In addition, the majority of the buildings are included in the first and second categories of seismic risk, being built between 1875 and 1940, the age of the buildings establishing an increased vulnerability to natural hazards. The methodology was developed through the contribution of three partner universities from Bucharest: the University of Bucharest, the Academy for Economic Studies and the Technical University of Constructions. The method suggested was based on the analysis and processing of digital and statistical spatial information derived from 1:500 topographical plans, satellite pictures, archives and historical maps used for the identification of the age of the buildings. Also, an important stage was represented by the field investigations, which provided the data used in the assessment of the buildings: year of construction, location and vicinity, height, number of floors, state and function of the building, equipment and construction type. The information collected from the field, together with the data resulting from the digitization of the orthophotoplans, was inserted in ArcGIS in order to compile the database. Furthermore, the team from the Cybernetics Faculty developed a special software package in Visual Studio and SQL Server in order to insert the sheets in GIS so that they could be statistically processed. The final product of the study is a program whose main functions include editing, analysis based on selected factors (individual or group) and viewing of building information in the form of maps or 3D visualizations. The strengths of the resulting informational system are given by its extended range of applicability, short processing time, accessibility, and capacity to support a large amount of information, and it thus stands out as an adequate instrument to fit the needs of a susceptible population.

  6. A normative price for energy from an electricity generation system: An Owner-dependent Methodology for Energy Generation (system) Assessment (OMEGA). Volume 1: Summary

    Science.gov (United States)

    Chamberlain, R. G.; Mcmaster, K. M.

    1981-01-01

    The utility-owned solar electric system methodology is generalized and updated. The net present value of the system is determined by consideration of all financial benefits and costs (including a specified return on investment). Life cycle costs, life cycle revenues, and residual system values are obtained. Break-even values of system parameters are estimated by setting the net present value to zero. While the model was designed for photovoltaic generators with a possible thermal energy byproduct, its applicability is not limited to such systems. The resulting owner-dependent methodology for energy generation system assessment consists of a few equations that can be evaluated without the aid of a high-speed computer.
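
    The break-even logic described above can be sketched in a few lines: discount all costs and all energy to present value, then solve NPV = 0 for the energy price. All figures below are invented:

        import numpy as np

        # Break-even energy price such that NPV = 0 (illustrative inputs only)
        rate, years = 0.08, 20
        annual_energy = 5000.0                 # MWh/yr
        capital, annual_cost = 4.0e6, 5.0e4    # $ up front, $/yr operating
        disc = 1 / (1 + rate) ** np.arange(1, years + 1)
        pv_energy = annual_energy * disc.sum()
        pv_costs = capital + annual_cost * disc.sum()
        breakeven_price = pv_costs / pv_energy
        print(f"break-even price: {breakeven_price:.2f} $/MWh")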

  7. Development of a methodology for analysis of the crystalline quality of single crystals

    International Nuclear Information System (INIS)

    This work aims to establish a methodology for the analysis of the crystalline quality of single crystals. It shows that from tridimensional neutron diffraction rocking curves it is possible to determine the intrinsic full widths at half maximum (FWHM) of the crystalline domains of a crystal, as well as the relative intensities of such domains and the angular distances between them. For the development of the method, a tridimensional I x ω x χ rocking curve was obtained with neutrons from a mosaic aluminum crystal. The intensity I was measured as rocking curves by turning the crystal around the ω-axis of a goniometer, one curve for each angular position χ obtained by step-scanning this angle in a convenient interval. The set of individual bidimensional I x ω rocking curves formed the tridimensional I x ω x χ rocking curve for the aluminum crystal. The tridimensional rocking curve was fitted by Gaussians and deconvolved from the instrumental broadening in both directions ω and χ. The instrumental broadenings were obtained with a perfect lithium fluoride (LiF) crystal from rocking curves measured around the ω and χ-axes. Owing to an enhanced Lorentz factor in direction χ, which excessively enlarged the rocking curves in that direction, the χ scale had to be shrunk by a correction factor. The shrinkage made the FWHM of domains in that direction equivalent to those found in direction ω. The construction of a contour map with individualized domains, on the basis of the tridimensional rocking curve of the aluminum crystal, made it easier to determine the characteristics of each domain. This contour map showed five domains. They were characterized in relation to FWHM, relative intensity and the angular distances between them. (author)
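
    The deconvolution from instrumental broadening mentioned here reduces, for Gaussian profiles, to subtracting widths in quadrature. A one-line sketch with illustrative values:

        import numpy as np

        # For Gaussians, widths add in quadrature, so the intrinsic FWHM of a
        # domain follows from the measured and instrumental widths:
        def intrinsic_fwhm(measured, instrumental):
            return np.sqrt(measured**2 - instrumental**2)

        print(intrinsic_fwhm(0.42, 0.15))   # degrees; values are illustrative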

  8. Analysis of agreement among definitions of metabolic syndrome in nondiabetic Turkish adults: a methodological study

    Directory of Open Access Journals (Sweden)

    Bersot Thomas P

    2007-12-01

    Abstract Background We aimed to explore the agreement among World Health Organization (WHO), European Group for the Study of Insulin Resistance (EGIR), National Cholesterol Education Program (NCEP), American College of Endocrinology (ACE), and International Diabetes Federation (IDF) definitions of the metabolic syndrome. Methods 1568 subjects (532 men, 1036 women; mean age 45, standard deviation (SD) 13 years) were evaluated in this cross-sectional, methodological study. Cardiometabolic risk factors were determined. Insulin sensitivity was calculated by HOMA-IR. Agreement among definitions was determined by the kappa statistic. ANOVA and post hoc Tukey's test were used to compare multiple groups. Results The agreement between WHO and EGIR definitions was very good (kappa: 0.83). The agreement between NCEP, ACE, and IDF definitions was substantial to very good (kappa: 0.77–0.84). The agreement between NCEP or ACE or IDF and WHO or EGIR definitions was fair (kappa: 0.32–0.37). The age- and sex-adjusted prevalence of metabolic syndrome was 38% by NCEP, 42% by ACE and IDF, 20% by EGIR and 19% by WHO definition. The evaluated definitions were dichotomized after analysis of design, agreement and prevalence: definitions requiring insulin measurement (WHO and EGIR) and definitions not requiring insulin measurement (NCEP, ACE, IDF). One definition was selected from each set for comparison. WHO-defined subjects were more insulin resistant than subjects without the metabolic syndrome (mean and SD for log HOMA-IR, 0.53 ± 0.14 vs. 0.07 ± 0.23, respectively, p < […]). […] 0.05, but lower log HOMA-IR values (p < […]). Conclusion The metabolic syndrome definitions that do not require measurement of insulin levels (NCEP, ACE and IDF) identify twice as many patients with insulin resistance and increased Framingham risk scores, and are more useful than the definitions that require measurement of insulin levels (WHO and EGIR).
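
    The kappa statistic used throughout these results measures agreement beyond chance between two classifications. A minimal sketch for the binary case (the two toy label vectors are invented, not the study's data):

        import numpy as np

        def cohens_kappa(a, b):
            # Agreement between two binary classifications beyond chance
            a, b = np.asarray(a), np.asarray(b)
            po = np.mean(a == b)                     # observed agreement
            p_yes = np.mean(a) * np.mean(b)          # chance both positive
            p_no = (1 - np.mean(a)) * (1 - np.mean(b))
            pe = p_yes + p_no                        # chance agreement
            return (po - pe) / (1 - pe)

        who = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
        ncep = [1, 1, 1, 0, 1, 0, 1, 0, 1, 0]
        print(round(cohens_kappa(who, ncep), 2))     # -> 0.62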

  9. Phylodynamic analysis of porcine circovirus type 2: Methodological approach and datasets.

    Science.gov (United States)

    Franzo, Giovanni; Cortey, Martì; Segalés, Joaquim; Hughes, Joseph; Drigo, Michele

    2016-09-01

    Since its first description, PCV2 has emerged as one of the most economically relevant diseases for the swine industry. Despite the introduction of vaccines effective in controlling clinical syndromes, PCV2 spread was not prevented, and some potential evidence of vaccine immune escape has recently been reported ("Complete genome sequence of a novel porcine circovirus type 2b variant present in cases of vaccine failures in the United States" (Xiao and Halbur, 2012) [1]; "Genetic and antigenic characterization of a newly emerging porcine circovirus type 2b mutant first isolated in cases of vaccine failure in Korea" (Seo et al., 2014) [2]). In this article, we used a collection of PCV2 full genomes, provided in the present manuscript, and several phylogenetic, phylodynamic and bioinformatic methods to investigate different aspects of PCV2 epidemiology, history and evolution (more thoroughly described in "Phylodynamic analysis of porcine circovirus type 2 reveals global waves of emerging genotypes and the circulation of recombinant forms" [3]). The methodological approaches used to consistently detect recombination events and estimate population dynamics and spreading patterns of rapidly evolving ssDNA viruses are herein reported. The programs used are described and the original scripts have been provided. The assembled databases used are also made available. These consist of a broad collection of complete genome sequences (i.e., 843 sequences: 63 complete genomes of PCV2a, 310 of PCV2b, 4 of PCV2c, 217 of PCV2d, 64 of CRF01, 140 of CRF02 and 45 of CRF03), divided into the different ORFs (i.e., ORF1, ORF2 and intergenic regions) of PCV2 genotypes and major Circulating Recombinant Forms (CRF), properly annotated with the respective collection date and country. Globally, all of these data can be used as a starting point for further studies and for classification purposes. PMID:27508215

  10. A Study for Appropriateness of National Nuclear Policy by using Economic Analysis Methodology after Fukushima accident

    International Nuclear Information System (INIS)

    The aim of this paper is to clarify the appropriateness of the national nuclear policy in the BPE of Korea from an economic perspective. To do this, the paper focuses only on the economic analysis methodology, without considering other conditions such as political, cultural, or historical factors. In a number of countries, especially Korea, nuclear energy policy has kept the status quo after the Fukushima accident. However, the nation's nuclear policy may vary depending on the choice of the people. Thus, to make the right decisions, it is important to deliver accurate information and knowledge about nuclear energy to the people. As shown in this paper, the levelized cost of nuclear power is the lowest among the base load units. As the reliance on nuclear power grows stronger through this economic logic, the nuclear safety and environmental elements should be strengthened. Based on this, national nuclear policy should be promoted. In the aftermath of the Fukushima accident, recognized as the world's worst nuclear disaster since Chernobyl, there have been some changes in the nuclear energy policies of various countries. Germany, for example, called a halt to the operation of nuclear power plants (NPPs) accounting for about 7.5% of the national power generation capacity, 6.3 GW. Developing countries such as China and India conducted renewed safety checks of their nuclear power plants before proceeding with their nuclear programs. The Korean government announced 'The 6th Basic Plan for Long-term Electricity Supply and Demand (BPE)', considering the safety and general public acceptance of nuclear power plants. According to the BPE, the plan for additional NPP construction was postponed, except for constructions that had already been reflected in the 5th BPE. All told, the responses in nuclear energy policy differ from country to country depending on their own circumstances.
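
    The levelized-cost comparison invoked here discounts lifetime costs and lifetime generation to a common present value. A sketch with invented cost profiles (a high-capital/low-fuel plant versus a low-capital/high-fuel plant), not the figures underlying the Korean BPE analysis:

        import numpy as np

        def lcoe(capital, annual_om, annual_fuel, annual_mwh, rate, years):
            # Levelized cost of electricity: discounted lifetime costs divided
            # by discounted lifetime generation.
            t = np.arange(1, years + 1)
            disc = 1 / (1 + rate) ** t
            costs = capital + ((annual_om + annual_fuel) * disc).sum()
            energy = (annual_mwh * disc).sum()
            return costs / energy

        # Nuclear-like profile (high capital, low fuel) vs gas-like profile
        print(lcoe(5e9, 1.2e8, 5e7, 8.0e6, 0.07, 40))   # $/MWh
        print(lcoe(1e9, 3.0e7, 4.0e8, 4.0e6, 0.07, 25)) # $/MWh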

  11. A new methodology for fluorescence analysis of composite resins used in anterior direct restorations.

    Science.gov (United States)

    de Lima, Liliane Motta; Abreu, Jessica Dantas; Cohen-Carneiro, Flavia; Regalado, Diego Ferreira; Pontes, Danielson Guedes

    2015-01-01

    The aim of this study was to use a new methodology to evaluate the fluorescence of composite resins for direct restorations. Microhybrid (group 1, Amelogen; group 2, Opallis; group 3, Filtek Z250) and nanohybrid (group 4, Filtek Z350 XT; group 5, Brilliant NG; group 6, Evolu-X) composite resins were analyzed in this study. A prefabricated matrix was used to prepare 60 specimens of 7.0 × 3.0 mm (n = 10 per group); the composite resin discs were prepared in 2 increments (1.5 mm each) and photocured for 20 seconds. To establish a control group of natural teeth, 10 maxillary central incisor crowns were horizontally sectioned to create 10 discs of dentin and enamel tissues with the same dimensions as the composite resin specimens. The specimens were placed in a box with ultraviolet light, and photographs were taken. Aperture 3.0 software was used to quantify the central portion of the image of each specimen in shades of red (R), green (G), and blue (B) of the RGB color space. The brighter the B shade in the evaluated area of the image, the greater the fluorescence shown by the specimen. One-way analysis of variance revealed significant differences between the groups. The fluorescence achieved in group 1 was statistically similar to that of the control group and significantly different from those of the other groups (Bonferroni test). Groups 3 and 4 had the lowest fluorescence values, which were significantly different from those of the other groups. According to the results of this study, neither the size nor the amount of inorganic particles in the evaluated composite resin materials predicts whether the material will exhibit good fluorescence.
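
    The image-quantification step described here (mean blue-channel intensity over the central area of each specimen photograph) is straightforward to sketch with Pillow and numpy; the file name below is hypothetical:

        import numpy as np
        from PIL import Image

        # Mean RGB intensities over the central region of a specimen photograph,
        # with the blue channel serving as the fluorescence proxy.
        img = np.asarray(Image.open("specimen_disc.png").convert("RGB"), dtype=float)
        h, w, _ = img.shape
        centre = img[h//4:3*h//4, w//4:3*w//4]
        r, g, b = (centre[..., k].mean() for k in range(3))
        print(f"mean RGB of central area: ({r:.1f}, {g:.1f}, {b:.1f})")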

  12. A Study for Appropriateness of National Nuclear Policy by using Economic Analysis Methodology after Fukushima accident

    Energy Technology Data Exchange (ETDEWEB)

    Shim, Jong Myoung; Roh, Myung Sub [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-10-15

    The aim of this paper is to clarify the appropriateness of the national nuclear policy in the BPE of Korea from an economic perspective. To do this, the paper focuses only on the economic analysis methodology, without considering other conditions such as political, cultural, or historical factors. In a number of countries, especially Korea, nuclear energy policy has kept the status quo after the Fukushima accident. However, the nation's nuclear policy may vary depending on the choice of the people. Thus, to make the right decisions, it is important to deliver accurate information and knowledge about nuclear energy to the people. As shown in this paper, the levelized cost of nuclear power is the lowest among the base load units. As the reliance on nuclear power grows stronger through this economic logic, the nuclear safety and environmental elements should be strengthened. Based on this, national nuclear policy should be promoted. In the aftermath of the Fukushima accident, recognized as the world's worst nuclear disaster since Chernobyl, there have been some changes in the nuclear energy policies of various countries. Germany, for example, called a halt to the operation of nuclear power plants (NPPs) accounting for about 7.5% of the national power generation capacity, 6.3 GW. Developing countries such as China and India conducted renewed safety checks of their nuclear power plants before proceeding with their nuclear programs. The Korean government announced 'The 6th Basic Plan for Long-term Electricity Supply and Demand (BPE)', considering the safety and general public acceptance of nuclear power plants. According to the BPE, the plan for additional NPP construction was postponed, except for constructions that had already been reflected in the 5th BPE. All told, the responses in nuclear energy policy differ from country to country depending on their own circumstances.

  13. CITY OF TAMPA MANAGEMENT ANALYSIS AND REPORT SYSTEM (MARS). VOLUME 2. OPERATIONS MANUAL

    Science.gov (United States)

    This three-volume report describes the development and implementation of a management analysis and report system (MARS) in the Tampa, Florida, Water and Sanitary Sewer Departments. Original system development was based on research conducted in a smaller water utility in Kenton Co...

  14. CITY OF TAMPA MANAGEMENT ANALYSIS AND REPORT SYSTEM (MARS). VOLUME 1. CASE STUDY

    Science.gov (United States)

    This three-volume report describes the development and implementation of a management analysis and report system (MARS) in the Tampa, Florida, Water and Sanitary Sewer Departments. Original system development was based on research conducted in a smaller water utility in Kenton Co...

  15. CITY OF TAMPA MANAGEMENT ANALYSIS AND REPORT SYSTEM (MARS). VOLUME 3. PROGRAMMING MANUAL

    Science.gov (United States)

    This three-volume report describes the development and implementation of a management analysis and report system (MARS) in the Tampa, Florida, Water and Sanitary Sewer Departments. MARS will help both the Water and Sanitary Sewer Departments control costs and manage expanding ser...

  16. Geotechnical Analysis Report for July 2004 - June 2005, Volume 2, Supporting Data

    International Nuclear Information System (INIS)

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2005. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  17. Supply-demand analysis. Volume II. of the offshore oil industry support craft market. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, J.H.

    1977-10-01

    Volume Two of this report presents a description of the market for offshore petroleum industry support craft and an analysis of that market. Financial performance of five major operating companies is described. A forecast of support craft supply and demand for 1977, 1982, and 1987 is included.

  18. Geotechnical Analysis Report for July 2004 - June 2005, Volume 2, Supporting Data

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions LLC

    2006-03-20

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2005. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  19. Waste Isolation Pilot Plant Geotechnical Analysis Report for July 2005 - June 2006, Volume 2, Supporting Data

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions LLC

    2007-03-25

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2006. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  20. Meta-analysis: Effects of glycerol administration on plasma volume, haemoglobin, and haematocrit.

    Science.gov (United States)

    Koehler, Karsten; Thevis, Mario; Schaenzer, Wilhelm

    2013-01-01

    Glycerol in combination with excess fluid can be used to increase total body water. Because glycerol hyperhydration may also be misused to mask the effects of blood doping on doping-relevant parameters, namely haemoglobin and haematocrit, glycerol has been prohibited by the World Anti-Doping Agency since 2010. In order to test this rationale, the purpose of this meta-analysis was to quantify the effects of glycerol hyperhydration on plasma volume, haemoglobin, and haematocrit in comparison to administration of fluid only. Following a literature search, a total of seven studies were included, and meta-analyses were performed separately for the effects on plasma volume (5 studies, total n = 54) and on haemoglobin (6 studies, n = 52) and haematocrit (6 studies, n = 52). The meta-analysis revealed that the increase in plasma volume was 3.3% larger (95% CI: 1.1 to 5.5%) after glycerol administration when compared to fluid only. Reductions in haemoglobin were 0.2 g/dl (95% CI: -0.3 to 0.0) larger, and there was no difference in the changes in haematocrit between glycerol and fluid administration (95% CI: -0.7 to 0.8%). In comparison with other plasma-volume expanding agents, glycerol hyperhydration has a very limited potential for increasing plasma volume and altering doping-relevant blood parameters. PMID:24353192
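
    For readers unfamiliar with the pooling step, the sketch below shows a DerSimonian-Laird random-effects aggregation, the standard technique behind this kind of meta-analysis; the effect sizes and variances are hypothetical, not the seven studies used by the authors.

        # DerSimonian-Laird random-effects pooling (hypothetical effects and
        # variances, not the authors' data).
        import numpy as np

        def random_effects(effects, variances):
            e, v = np.asarray(effects, float), np.asarray(variances, float)
            w = 1.0 / v                                   # fixed-effect weights
            mu_fe = np.sum(w * e) / np.sum(w)
            q = np.sum(w * (e - mu_fe) ** 2)              # Cochran's Q
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (len(e) - 1)) / c)       # between-study variance
            w_re = 1.0 / (v + tau2)
            mu = np.sum(w_re * e) / np.sum(w_re)
            se = np.sqrt(1.0 / np.sum(w_re))
            return mu, (mu - 1.96 * se, mu + 1.96 * se)

        mu, ci = random_effects([2.8, 4.1, 1.9, 3.6, 3.0], [1.2, 0.9, 1.5, 1.1, 1.0])
        print(f"pooled effect = {mu:.1f}%  (95% CI {ci[0]:.1f} to {ci[1]:.1f})")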

  1. Doppler sonography of diabetic feet: Quantitative analysis of blood flow volume

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Young Lan; Kim, Ho Chul; Choi, Chul Soon; Yoon, Dae Young; Han, Dae Hee; Moon, Jeung Hee; Bae, Sang Hoon [Hallym University College of Medicine, Seoul (Korea, Republic of)

    2002-09-15

    To analyze Doppler sonographic findings of diabetic feet by estimating the quantitative blood flow volume and by analyzing the waveform on Doppler. Doppler sonography was performed in thirty-four patients (10 diabetic patients with foot ulceration, 14 diabetic patients without ulceration, and 10 normal patients as the normal control group) to measure the flow volume of the arteries of the lower extremities (posterior and anterior tibial arteries, and distal femoral artery). Analysis of Doppler waveforms was also done to evaluate the nature of the changed blood flow volume of diabetic patients, and the waveforms were classified into triphasic, biphasic-1, biphasic-2 and monophasic patterns. Flow volume of arteries in diabetic patients with foot ulceration was increased with statistical significance when compared to that of diabetic patients without foot ulceration or that of the normal control group (p < 0.05). Analysis of Doppler waveforms revealed that the frequency of the biphasic-2 pattern was significantly higher in diabetic patients than in the normal control group (p < 0.05). Doppler sonography in diabetic feet showed increased flow volume and a biphasic Doppler waveform, and these findings suggest neuropathy rather than ischemic changes in diabetic feet.

  2. Assessment of methodologies for analysis of the dungeness B accidental aircraft crash risk.

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey L.; Hansen, Clifford W.

    2010-09-01

    The Health and Safety Executive (HSE) has requested Sandia National Laboratories (SNL) to review the aircraft crash methodology for nuclear facilities that is being used in the United Kingdom (UK). The scope of the work included a review of one method utilized in the UK for assessing the potential for accidental airplane crashes into nuclear facilities (Task 1) and a comparison of the UK methodology against similar International Atomic Energy Agency (IAEA), United States (US) Department of Energy (DOE), and US Nuclear Regulatory Commission (NRC) methods (Task 2). Based on the conclusions from Tasks 1 and 2, an additional Task 3 would provide an assessment of a site-specific crash frequency for the Dungeness B facility using one of the other methodologies. This report documents the results of Task 2. The comparison of the different methods was performed for the three primary contributors to aircraft crash risk at the Dungeness B site: airfield-related crashes, crashes below airways, and background crashes. The methods and data specified in each methodology were compared for each of these risk contributors, differences in the methodologies were identified, and the importance of these differences was qualitatively and quantitatively assessed. The bases for each of the methods and the data used were considered in this assessment process. A comparison of the treatment of the consequences of the aircraft crashes was not included in this assessment because the frequency of crashes into critical structures is currently low based on the existing Dungeness B assessment. Although the comparison found substantial differences between the UK and the three alternative methodologies (IAEA, NRC, and DOE), this assessment concludes that use of any of these alternative methodologies would not change the conclusions reached for the Dungeness B site. Performance of Task 3 is thus not recommended.

  3. Methodology assessment and recommendations for the Mars science laboratory launch safety analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Sturgis, Beverly Rainwater; Metzinger, Kurt Evan; Powers, Dana Auburn; Atcitty, Christopher B.; Robinson, David B; Hewson, John C.; Bixler, Nathan E.; Dodson, Brian W.; Potter, Donald L.; Kelly, John E.; MacLean, Heather J.; Bergeron, Kenneth Donald (Sala & Associates); Bessette, Gregory Carl; Lipinski, Ronald J.

    2006-09-01

    thousands of possible event sequences and to build up a statistical representation of the releases for each accident case. A code to carry out this process will have to be developed or adapted from previous MMRTG missions. Finally, Level C translates the release (or "source term") information from Level B into public risk by applying models for atmospheric transport and the health consequences of exposure to the released plutonium dioxide. A number of candidate codes for this level of analysis are available. This report surveys the range of available codes and tools for each of these levels and makes recommendations for which choices are best for the MSL mission. It also identifies areas where improvements to the codes are needed. In some cases a second tier of codes may be identified to provide supporting or clarifying insight about particular issues. The main focus of the methodology assessment is to identify a suite of computational tools that can produce a high quality SAR that can be successfully reviewed by external bodies (such as the Interagency Nuclear Safety Review Panel) on the schedule established by NASA and DOE.

  4. Multifractal methodology

    CERN Document Server

    Salat, Hadrien; Arcaute, Elsa

    2016-01-01

    Various methods have been developed independently to study the multifractality of measures in many different contexts. Although they all convey the same intuitive idea of giving a "dimension" to sets where a quantity scales similarly within a space, they are not necessarily equivalent on a more rigorous level. This review article aims at unifying the multifractal methodology by presenting the multifractal theoretical framework and principal practical methods, namely the moment method, the histogram method, multifractal detrended fluctuation analysis (MDFA) and modulus maxima wavelet transform (MMWT), with a comparative and interpretative eye.
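
    Of the methods surveyed, the moment method is the most compact to illustrate. The sketch below estimates the mass exponents tau(q) of a 2-D measure by box counting, using the convention Z(q, s) ~ s**tau(q); the synthetic measure is only a placeholder, not the authors' code.

        # Moment-method sketch: mass exponents tau(q) of a 2-D measure by box
        # counting, with Z(q, s) ~ s**tau(q) (generic illustration).
        import numpy as np

        def tau_spectrum(measure, qs, box_sizes):
            """measure: 2-D array of non-negative weights summing to 1."""
            taus = []
            for q in qs:
                log_s, log_z = [], []
                for s in box_sizes:
                    n = measure.shape[0] // s
                    boxes = measure[:n * s, :n * s].reshape(n, s, n, s).sum(axis=(1, 3))
                    p = boxes[boxes > 0]
                    log_s.append(np.log(s))
                    log_z.append(np.log(np.sum(p ** q)))     # partition function
                taus.append(np.polyfit(log_s, log_z, 1)[0])  # slope = tau(q)
            return np.array(taus)

        rng = np.random.default_rng(0)
        m = rng.random((256, 256)) ** 4                      # heterogeneous measure
        m /= m.sum()
        print(tau_spectrum(m, qs=[0, 1, 2], box_sizes=[2, 4, 8, 16, 32]))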

  5. A philosophical analysis of the general methodology of qualitative research: a critical rationalist perspective.

    Science.gov (United States)

    Rudnick, Abraham

    2014-09-01

    Philosophical discussion of the general methodology of qualitative research, such as that used in some health research, has been inductivist or relativist to date, ignoring critical rationalism as a philosophical approach with which to discuss the general methodology of qualitative research. This paper presents a discussion of the general methodology of qualitative research from a critical rationalist perspective (inspired by Popper), using as an example mental health research. The widespread endorsement of induction in qualitative research is positivist and is suspect, if not false, particularly in relation to the context of justification (or rather theory testing) as compared to the context of discovery (or rather theory generation). Relativism is riddled with philosophical weaknesses and hence it is suspect if not false too. Theory testing is compatible with qualitative research, contrary to much writing about and in qualitative research, as theory testing involves learning from trial and error, which is part of qualitative research, and which may be the form of learning most conducive to generalization. Generalization involves comparison, which is a fundamental methodological requirement of any type of research (qualitative or other); hence the traditional grounding of quantitative and experimental research in generalization. Comparison--rather than generalization--is necessary for, and hence compatible with, qualitative research; hence, the common opposition to generalization in qualitative research is misdirected, disregarding whether this opposition's claims are true or false. In conclusion, qualitative research, similar to quantitative and experimental research, assumes comparison as a general methodological requirement, which is necessary for health research.

  6. A philosophical analysis of the general methodology of qualitative research: a critical rationalist perspective.

    Science.gov (United States)

    Rudnick, Abraham

    2014-09-01

    Philosophical discussion of the general methodology of qualitative research, such as that used in some health research, has been inductivist or relativist to date, ignoring critical rationalism as a philosophical approach with which to discuss the general methodology of qualitative research. This paper presents a discussion of the general methodology of qualitative research from a critical rationalist perspective (inspired by Popper), using as an example mental health research. The widespread endorsement of induction in qualitative research is positivist and is suspect, if not false, particularly in relation to the context of justification (or rather theory testing) as compared to the context of discovery (or rather theory generation). Relativism is riddled with philosophical weaknesses and hence it is suspect if not false too. Theory testing is compatible with qualitative research, contrary to much writing about and in qualitative research, as theory testing involves learning from trial and error, which is part of qualitative research, and which may be the form of learning most conducive to generalization. Generalization involves comparison, which is a fundamental methodological requirement of any type of research (qualitative or other); hence the traditional grounding of quantitative and experimental research in generalization. Comparison--rather than generalization--is necessary for, and hence compatible with, qualitative research; hence, the common opposition to generalization in qualitative research is misdirected, disregarding whether this opposition's claims are true or false. In conclusion, qualitative research, similar to quantitative and experimental research, assumes comparison as a general methodological requirement, which is necessary for health research. PMID:22592885

  7. A Multi-Criteria Decision Analysis based methodology for quantitatively scoring the reliability and relevance of ecotoxicological data.

    Science.gov (United States)

    Isigonis, Panagiotis; Ciffroy, Philippe; Zabeo, Alex; Semenzin, Elena; Critto, Andrea; Giove, Silvio; Marcomini, Antonio

    2015-12-15

    Ecotoxicological data are highly important for risk assessment processes and are used for deriving environmental quality criteria, which are enacted for assuring the good quality of waters, soils or sediments and achieving desirable environmental quality objectives. Therefore, evaluating the reliability of available data is highly important when analysing their possible use in the aforementioned processes. A thorough analysis of currently available frameworks for the assessment of ecotoxicological data has led to the identification of significant flaws but, at the same time, various opportunities for improvement. In this context, a new methodology, based on Multi-Criteria Decision Analysis (MCDA) techniques, has been developed with the aim of analysing the reliability and relevance of ecotoxicological data (which are produced through laboratory biotests for individual effects) in a transparent, quantitative way, through the use of expert knowledge, multiple criteria and fuzzy logic. The proposed methodology can be used for the production of weighted Species Sensitivity Weighted Distributions (SSWD), as a component of the ecological risk assessment of chemicals in aquatic systems. The MCDA aggregation methodology is described in detail and demonstrated through examples in the article, and the hierarchically structured framework that is used for the evaluation and classification of ecotoxicological data is briefly discussed. The methodology is demonstrated for the aquatic compartment but it can be easily tailored to other environmental compartments (soil, air, sediments).
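
    The aggregation idea can be illustrated with a deliberately simplified weighted-sum stand-in for the fuzzy MCDA scoring the authors describe; the criteria, weights, and scores below are hypothetical.

        # Deliberately simplified weighted-sum stand-in for the fuzzy MCDA
        # aggregation described above (criteria, weights, scores hypothetical).
        criteria_weights = {"test_design": 0.30, "documentation": 0.20,
                            "endpoint_relevance": 0.30, "exposure_duration": 0.20}

        def reliability_score(scores):
            """scores maps each criterion to an expert judgement in [0, 1]."""
            return sum(criteria_weights[c] * v for c, v in scores.items())

        study = {"test_design": 0.9, "documentation": 0.6,
                 "endpoint_relevance": 0.8, "exposure_duration": 0.7}
        print(f"aggregated reliability = {reliability_score(study):.2f}")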

  8. Organizational analysis and safety for utilities with nuclear power plants: an organizational overview. Volume 1

    International Nuclear Information System (INIS)

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. A model is introduced for the purposes of organizing the literature review and showing key relationships among identified organizational factors and nuclear power plant safety. Volume I of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety

  9. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 1: Executive Summary

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.Z.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-10-01

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  10. Analysis of methodologies for assessing the employability of Sport graduates in Portugal

    Directory of Open Access Journals (Sweden)

    Dina Miragaia

    2013-01-01

    Full Text Available The purpose of this article is to analyze the methodologies used to monitor and evaluate the employability of Portuguese sport graduates. National and international literature was consulted to understand the significance of the employability concept and its implications for the methodological design used in employability evaluation reports. Questionnaires applied by official organizations were consulted, and the coherence between the dimensions/variables uncovered and the concept of employability was examined. Problems of validity, reliability, discrimination and comparability between studies were identified, which suggests that new ways of evaluating employability are required. One possible way consists of using the methodology of triangulation (of data and researchers), which integrates inter-institutional research.

  11. Note: Methodology for the analysis of Bluetooth gateways in an implemented scatternet

    Science.gov (United States)

    Etxaniz, J.; Monje, P. M.; Aranguren, G.

    2014-03-01

    This Note introduces a novel methodology to analyze the time performance of Bluetooth gateways in multi-hop networks, known as scatternets. The methodology is focused on distinguishing between the processing time and the time that each communication between nodes takes along an implemented scatternet. This technique is not only valid for Bluetooth networks but also for other wireless networks that offer access to their middleware in order to include beacons in the operation of the nodes. We show in this Note the results of the tests carried out on a Bluetooth scatternet in order to highlight the reliability and effectiveness of the methodology. The results also validate this technique, showing convergence in the results when subtracting the time for the beacons from the delay measurements.
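
    The bookkeeping behind the methodology can be sketched as follows: if beacons log each node's arrival and departure times, the end-to-end delay splits into processing time and link time. The timestamps below are hypothetical.

        # Beacons log each node's (arrival, departure) times, so the end-to-end
        # delay splits into processing time and link time (hypothetical ms).
        hops = [(0.0, 4.2), (11.8, 15.1), (23.9, 27.5)]  # (arrival, departure)
        arrival_at_sink = 35.0

        processing = sum(dep - arr for arr, dep in hops)
        total = arrival_at_sink - hops[0][0]
        print(f"processing = {processing:.1f} ms, link time = {total - processing:.1f} ms")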

  12. Design of batch operations: Systematic methodology for generation and analysis of sustainable alternatives

    DEFF Research Database (Denmark)

    Carvalho, Ana; Matos, Henrique A.; Gani, Rafiqul

    2010-01-01

    The objective of this paper is to present a new methodology that is able to generate, screen and identify sustainable alternatives to continuous chemical processes as well as processes operating in the batch mode. The methodology generates the sustainable (design) alternatives by locating the operational, environmental, economical and safety related problems inherent in the process (batch or continuous). Alternatives that are more sustainable, compared to a reference, are generated and evaluated by addressing one or more of the identified problems. A decomposition technique as well as a set ... processes are described, highlighting the main differences between them. Through two case studies, the application of the methodology to obtain sustainable design alternatives for batch plants is highlighted.

  13. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small-volume samples. Pulsed-dc-ESI utilizes a constant high voltage to induce the generation of single-polarity pulsed electrospray remotely. This method significantly boosts sample economy, so as to obtain several minutes of MS signal duration from a sample of merely picoliter volume. The elongated MS signal duration enables us to collect abundant MS(2) information on components of interest in a small-volume sample for systematic analysis. This method has been successfully applied to single-cell metabolomics analysis. We obtained 2-D profiles of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells, concerning 1034 components and 656 components for Allium cepa and HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples.

  14. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity, as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes. PMID:24179734
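
    The primary outcome measure, rank correlation to a clinical variable, can be sketched as below, with synthetic data standing in for the 247-patient sample; SciPy's spearmanr is assumed available.

        # Rank correlation between lesion volume change and a clinical variable
        # for two pipelines (synthetic data; SciPy assumed available).
        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(1)
        clinical = rng.normal(size=247)                     # e.g. MSFC scores
        unpaired = -0.15 * clinical + rng.normal(size=247)  # noisier volume change
        paired = -0.30 * clinical + 0.6 * rng.normal(size=247)

        for name, change in [("unpaired", unpaired), ("paired", paired)]:
            rho, p = spearmanr(change, clinical)
            print(f"{name}: rho = {rho:+.3f} (p = {p:.2g})")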

  15. Leukocyte telomere length and hippocampus volume: a meta-analysis [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Gustav Nilsonne

    2015-10-01

    Full Text Available Leukocyte telomere length has been shown to correlate with hippocampus volume, but effect estimates differ in magnitude and are not uniformly positive. This study aimed primarily to investigate the relationship between leukocyte telomere length and hippocampus gray matter volume by meta-analysis, and secondarily to investigate possible effect moderators. Five studies were included with a total of 2107 participants, of which 1960 were contributed by one single influential study. A random-effects meta-analysis estimated the effect at r = 0.12 [95% CI -0.13, 0.37] in the presence of heterogeneity and a subjectively estimated moderate to high risk of bias. There was no evidence that apolipoprotein E (APOE) genotype was an effect moderator, nor that the ratio of leukocyte telomerase activity to telomere length was a better predictor than leukocyte telomere length for hippocampus volume. This meta-analysis, while not establishing a positive relationship, is also unable to disprove the earlier finding of a positive correlation in the one large study included in the analyses. We propose that a relationship between leukocyte telomere length and hippocampus volume may be mediated by transmigrating monocytes which differentiate into microglia in the brain parenchyma.

  16. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, and most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity, as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes.

  17. Methodology for the biosphere analysis in the evaluation of deep geological repositories for high radioactive waste

    International Nuclear Information System (INIS)

    This report summarizes the work done and the achievements reached within the R and D Project that IMA/CIEMAT has had with ENRESA during 1993-1995. The overall R and D Project has a wide radiological protection context, but the work reported here relates only to the development of a Methodology for considering the Biosphere sub-system in the assessments of deep geological repositories for high-level radioactive wastes (HLW). The main areas concerned within the Methodology are the structure and morphology of the biosphere in the long term relevant to deep disposal of HLW, in the context of the assessment of these systems, and the appropriate modelling of the behaviour of radionuclides released to the biosphere system, together with the associated human exposure. This document first provides a review of past and present international and national concerns about biosphere modelling and its importance in relation to the definition of safety criteria. A joint ENRESA/ANDRA/IPSN/CIEMAT study on the definition and practical description of biosphere systems under different climatic states is then summarized. The Methodology developed by IMA/CIEMAT is outlined with an illustration of the way it works. Different steps and procedures are included for a better practical understanding of the software tools developed within the project to support the application of the Methodology. This Methodology is widely based on an international working group on Reference Biospheres; part of the national work for ENRESA has been supported under a collaborative agreement with QuantiSci Ltd. Specific software developments have been carried out in collaboration with QuantiSci Ltd and with the Polytechnical University of Madrid. Most of the items included within the Methodology, and moreover the Methodology as a whole, follow a continuous progressive development. It is increasingly recognized that assessment capabilities, establishment of safety criteria and regulatory framework and the steps in a

  18. Investment in selective social programs: a proposed methodological tool for the analysis of programs’ sustainability

    Directory of Open Access Journals (Sweden)

    Manuel Antonio Barahona Montero

    2014-08-01

    Full Text Available This paper proposes a methodology to evaluate the sustainability of Selective Social Programs (SSP), based on the relationship between economic growth and human development posed by the United Nations Development Program (UNDP). For such purposes, the Circle of Sustainability is developed, which comprises 12 pillars. Each pillar is evaluated based on its current status and impact. Combining both results allows the sustainability of these programs to be assessed and areas of focus to be identified. This methodology therefore helps to better channel available efforts and resources.

  19. Organizational analysis and safety for utilities with nuclear power plants: perspectives for organizational assessment. Volume 2

    International Nuclear Information System (INIS)

    This two-volume report presents the results of initial research on the feasibility of applying organizational factors in nuclear power plant (NPP) safety assessment. Volume 1 of this report contains an overview of the literature, a discussion of available safety indicators, and a series of recommendations for more systematically incorporating organizational analysis into investigations of nuclear power plant safety. The six chapters of this volume discuss the major elements in our general approach to safety in the nuclear industry. The chapters include information on organizational design and safety; organizational governance; utility environment and safety related outcomes; assessments by selected federal agencies; review of data sources in the nuclear power industry; and existing safety indicators

  20. Volumes of the hippocampus and amygdala in patients with borderline personality disorder: a meta-analysis.

    Science.gov (United States)

    Nunes, Paulo Menezes; Wenzel, Amy; Borges, Karinne Tavares; Porto, Cristianne Ribeiro; Caminha, Renato Maiato; de Oliveira, Irismar Reis

    2009-08-01

    Individuals with borderline personality disorder (BPD) often exhibit impulsive and aggressive behavior. The hippocampus and amygdala form part of the limbic system, which plays a central role in controlling such expressions of emotional reactivity. There are mixed results in the literature regarding whether patients with BPD have smaller hippocampal and amygdalar volume relative to healthy controls. To clarify the precise nature of these mixed results, we performed a meta-analysis to aggregate data on the size of the hippocampus and amygdala in patients with BPD. Seven publications involving six studies and a total of 104 patients with BPD and 122 healthy controls were included. A significantly smaller volume was found in both the right and left hippocampi and amygdala of patients with BPD compared to healthy controls. These findings raise the possibility that reduced hippocampal and amygdalar volumes are biological substrates of some symptoms of BPD. PMID:19663654

  1. Laser Raman spectroscopic analysis of polymorphic forms in microliter fluid volumes.

    Science.gov (United States)

    Anquetil, Patrick A; Brenan, Colin J H; Marcolli, Claudia; Hunter, Ian W

    2003-01-01

    Knowledge and control of the polymorphic phase of chemical compounds are important aspects of drug development in the pharmaceutical industry. We report herein in situ and real-time Raman spectroscopic polymorphic analysis of optically trapped microcrystals in a microliter volume format. The system studied in particular was the recrystallization of carbamazepine (CBZ) in methanol. Raman spectrometry enabled noninvasive measurement of the amount of dissolved CBZ in a sample as well as polymorphic characterization, whereas exclusive recrystallization of either CBZ form I or CBZ form III from saturated solutions was achieved by specific selection of sample-cell cooling profiles. Additionally, using a microcell versus a macroscopic volume gives the advantage of reaching equilibrium much faster while using only a small quantity of compound. We demonstrate that laser Raman spectral polymorphic analysis in a microliter cell is a potentially viable screening platform and could lead to a new high-throughput method for polymorph screening.

  2. Multicriteria decision-making analysis based methodology for predicting carbonate rocks' uniaxial compressive strength

    Directory of Open Access Journals (Sweden)

    Ersoy Hakan

    2012-10-01

    Full Text Available Uniaxial compressive strength (UCS) deals with materials' ability to withstand axially-directed pushing forces and is considered to be one of rock materials' most important mechanical properties. However, the UCS test is an expensive, very time-consuming test to perform in the laboratory and requires high-quality core samples having regular geometry. Empirical equations were thus proposed for predicting UCS as a function of rocks' index properties. An analytic hierarchy process and multiple regression analysis based methodology was used (as opposed to traditional linear regression methods) on data-sets obtained from carbonate rocks in NE Turkey. Limestone samples ranging from Devonian to late Cretaceous ages were chosen; travertine-onyx samples were selected from morphological environments considering their surface environmental conditions. Test results from experiments carried out on about 250 carbonate rock samples were used in deriving the model. While the hierarchy model focused on determining the most important index properties affecting UCS, regression analysis established meaningful relationships between UCS and index properties; 0.85 and 0.83 positive coefficient correlations between the variables were determined by regression analysis. The methodology provided an appropriate alternative to quantitative estimation of UCS and avoided the need for tedious and time-consuming laboratory testing.
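
    The analytic-hierarchy step can be sketched compactly: criterion weights are derived from a pairwise comparison matrix, here via the geometric-mean approximation of the principal eigenvector. The matrix entries and index-property names are hypothetical.

        # AHP step: criterion weights from a pairwise comparison matrix via the
        # geometric-mean approximation (entries and property names hypothetical).
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],      # e.g. porosity vs. P-wave velocity
                      [1 / 3, 1.0, 2.0],    # vs. point-load index, judged pairwise
                      [1 / 5, 1 / 2, 1.0]])

        gm = A.prod(axis=1) ** (1.0 / A.shape[0])   # row geometric means
        weights = gm / gm.sum()
        print("index-property weights:", np.round(weights, 3))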

  3. Development of Methodology for Spent Fuel Pool Severe Accident Analysis Using MELCOR Program

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Won-Tae; Shin, Jae-Uk [RETech. Co. LTD., Yongin (Korea, Republic of); Ahn, Kwang-Il [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    The general reason why SFP severe accident analysis has to be considered is that there is a potentially great risk due to the huge number of fuel assemblies and the absence of a containment around an SFP building. In most cases, the SFP building is vulnerable to external damage or attack. Conversely, the low decay heat of the fuel assemblies may make accident processes slower than accidents in a reactor core, because of the great deal of water present. In short, the severity of the potential consequences means that SFP risk management cannot be excluded from consideration. The U.S. Nuclear Regulatory Commission has performed consequence studies of postulated spent fuel pool accidents. The Fukushima-Daiichi accident has accelerated the need for such studies, causing the nuclear industry and regulatory bodies to reexamine several assumptions concerning beyond-design-basis events such as a station blackout. The tsunami brought about a loss of coolant accident, leading to the explosion of hydrogen in the SFP building. Analyses of SFP accident processes in the case of a loss of coolant with no heat removal have been performed. Few studies, however, have focused on the long-term progression of an SFP severe accident under no mitigation action, such as water makeup to the SFP. The USNRC and OECD have worked together to examine the behavior of PWR fuel assemblies under severe accident conditions in a spent fuel rack. In support of the investigation, several new features of the MELCOR model have been added to simulate both a BWR fuel assembly and a PWR 17 x 17 assembly in a spent fuel pool rack undergoing severe accident conditions. The purpose of the study in this paper is to develop a methodology for the long-term analysis of a plant-level SFP severe accident by using the new-featured MELCOR program for the OPR-1000 Nuclear Power Plant. The study is to investigate the ability of MELCOR in predicting the entire process of SFP severe accident phenomena, including the molten corium and concrete reaction. The

  4. Selection methodology for LWR safety R and D programs and proposals. Volume III. User's manual for the multi-attribute utility package (MAUP)

    International Nuclear Information System (INIS)

    The computer program which was developed to apply the multi-attribute utility (MAU) methodology to the selection of LWR safety R and D programs and proposals is described. An overview of the MAU method is presented, followed by a description of the steps incorporated in developing individual modules for use in the multi-attribute utility package (MAUP). Each module is described complete with usage information and an example of computer output
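
    The core multi-attribute utility computation that MAUP automates is a weighted additive aggregation of single-attribute utilities. A minimal sketch follows; the attributes, weights, and proposals are hypothetical.

        # Additive multi-attribute utility scoring of the kind MAUP automates;
        # attributes, weights, and proposals are hypothetical.
        weights = {"safety_benefit": 0.5, "cost": 0.3, "schedule": 0.2}

        def utility(proposal):
            """proposal maps each attribute to a single-attribute utility in [0, 1]."""
            return sum(weights[a] * u for a, u in proposal.items())

        proposals = {"A": {"safety_benefit": 0.8, "cost": 0.4, "schedule": 0.9},
                     "B": {"safety_benefit": 0.6, "cost": 0.9, "schedule": 0.5}}
        for name in sorted(proposals, key=lambda k: -utility(proposals[k])):
            print(f"proposal {name}: U = {utility(proposals[name]):.2f}")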

  5. Study for the optimization of a transport aircraft wing for maximum fuel efficiency. Volume 1: Methodology, criteria, aeroelastic model definition and results

    Science.gov (United States)

    Radovcich, N. A.; Dreim, D.; Okeefe, D. A.; Linner, L.; Pathak, S. K.; Reaser, J. S.; Richardson, D.; Sweers, J.; Conner, F.

    1985-01-01

    Work performed in the design of a transport aircraft wing for maximum fuel efficiency is documented with emphasis on design criteria, design methodology, and three design configurations. The design database includes complete finite element model description, sizing data, geometry data, loads data, and inertial data. A design process which satisfies the economics and practical aspects of a real design is illustrated. The cooperative study relationship between the contractor and NASA during the course of the contract is also discussed.

  6. Challenges and methodology for safety analysis of a high-level waste tank with large periodic releases of flammable gas

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, J.N.; Pasamehmetoglu, K.O.; White, J.R. [Los Alamos National Lab., NM (United States); Stewart, C.W. [Pacific Northwest Lab., Richland, WA (United States)

    1994-07-01

    Tank 241-SY-101, located at the Department of Energy Hanford Site, has periodically released up to 10,000 ft³ of flammable gas. This release has been one of the highest-priority DOE operational safety problems. The gases include hydrogen and ammonia (fuels) and nitrous oxide (oxidizer). There have been many opinions regarding the controlling mechanisms for these releases, but demonstrating an adequate understanding of the problem, selecting a mitigation methodology, and preparing the safety analysis have presented numerous new challenges. The mitigation method selected for the tank was to install a pump that would mix the tank contents and eliminate the sludge layer believed to be responsible for the gas retention and periodic releases. This report will describe the principal analysis methodologies used to prepare the safety assessment for the installation and operation of the pump, and because this activity has been completed, it will describe the results of pump operation.

  7. Guidance for the application of an assessment methodology for innovative nuclear energy systems. INPRO manual - Overview of the methodology. Vol. 1 of 9 of the final report of phase 1 of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) including a CD-ROM comprising all volumes

    International Nuclear Information System (INIS)

    The International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) was initiated in the year 2000, based on a resolution of the IAEA General Conference (GC(44)/RES/21). The main objectives of INPRO are (1) to help to ensure that nuclear energy is available to contribute in fulfilling energy needs in the 21st century in a sustainable manner, (2) to bring together both technology holders and technology users to consider jointly the international and national actions required to achieve desired innovations in nuclear reactors and fuel cycles; and (3) to create a forum to involve all relevant stakeholders that will have an impact on, draw from, and complement the activities of existing institutions, as well as ongoing initiatives at the national and international level. This document follows the guidelines of the INPRO report 'Methodology for the assessment of innovative nuclear reactors and fuel cycles, Report of Phase 1B (first part) of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO)', IAEA-TECDOC-1434 (2004), together with its previous report Guidance for the evaluation for innovative nuclear reactors and fuel cycles, Report of Phase 1A of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO), IAEA-TECDOC-1362 (2003). This INPRO manual is comprised of an overview volume (laid out in this report), and eight additional volumes (available on a CD-ROM attached to the inside back cover of this report) covering the areas of economics (Volume 2), infrastructure (Volume 3), waste management (Volume 4), proliferation resistance (Volume 5), physical protection (Volume 6), environment (Volume 7), safety of reactors (Volume 8), and safety of nuclear fuel cycle facilities (Volume 9). The overview volume sets out the philosophy of INPRO and a general discussion of the INPRO methodology. This overview volume discusses the relationship of INPRO with the UN concept of sustainability to demonstrate how the

  8. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    Science.gov (United States)

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  9. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    Science.gov (United States)

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  10. The Three Stages of Critical Policy Methodology: An Example from Curriculum Analysis

    Science.gov (United States)

    Rata, Elizabeth

    2014-01-01

    The article identifies and discusses three stages in the critical policy methodology used in the sociology of education. These are: firstly, employing a political economy theoretical framework that identifies causal links between global forces and local developments; secondly, analysing educational policy within that theoretically conceptualised…

  11. The NetLogger Methodology for High Performance Distributed Systems Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tierney, Brian; Johnston, William; Crowley, Brian; Hoo, Gary; Brooks, Chris; Gunter, Dan

    1999-12-23

    The authors describe a methodology that enables the real-time diagnosis of performance problems in complex high-performance distributed systems. The methodology includes tools for generating precision event logs that can be used to provide detailed end-to-end application and system level monitoring; a Java agent-based system for managing the large amount of logging data; and tools for visualizing the log data and real-time state of the distributed system. The authors developed these tools for analyzing a high-performance distributed system centered around the transfer of large amounts of data at high speeds from a distributed storage server to a remote visualization client. However, this methodology should be generally applicable to any distributed system. This methodology, called NetLogger, has proven invaluable for diagnosing problems in networks and in distributed systems code. This approach is novel in that it combines network, host, and application-level monitoring, providing a complete view of the entire system.
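
    The style of instrumentation can be sketched generically: precision, timestamped event records emitted at application milestones, which later tools correlate end to end. The sketch below is illustrative only and does not reproduce the actual NetLogger API or log format.

        # Generic sketch of precision event logging for end-to-end performance
        # analysis (illustrative only; not the actual NetLogger API or format).
        import json
        import sys
        import time

        def log_event(name, **fields):
            """Emit one timestamped event record as a JSON line."""
            record = {"ts": time.time(), "event": name, **fields}
            sys.stdout.write(json.dumps(record) + "\n")

        log_event("transfer.start", host="server01", nbytes=10 ** 9)
        time.sleep(0.01)                       # stand-in for the monitored work
        log_event("transfer.end", host="server01", nbytes=10 ** 9)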

  12. A Case Study of a Case Study: Analysis of a Robust Qualitative Research Methodology

    Science.gov (United States)

    Snyder, Catherine

    2012-01-01

    A unique multi-part qualitative study methodology is presented from a study which tracked the transformative journeys of four career-changing women from STEM fields into secondary education. The article analyzes the study's use of archived writing, journaling, participant-generated photography, interviews, member-checking, and reflexive analytical…

  13. Is High Breast Density a Risk Factor for Breast Cancer ? Significant Points Emerging from the DMIST Study Methodological Analysis

    OpenAIRE

    Colin, Catherine; Prince, Violaine

    2009-01-01

    High breast density (HBD) tends to be seen as a significant and independent risk factor for breast cancer. This article describes a methodological and quantitative study of the variables selected by the large DMIST study, i.e., age, hormonal status and breast density, in correlation with cancer occurrence frequency. The statistical analysis of cancer rates in every patient subgroup of a study involving more than 42,000 women in screening shows that HBD, when isolated from other variables, do...

  14. Design and tolerance analysis of a low bending loss hole-assisted fiber using statistical design methodology.

    Science.gov (United States)

    Van Erps, Jürgen; Debaes, Christof; Nasilowski, Tomasz; Watté, Jan; Wojcik, Jan; Thienpont, Hugo

    2008-03-31

    We present the design of a low bending loss hole-assisted fiber for a 180°-bend fiber socket application, including a tolerance analysis for manufacturability. To this aim, we make use of statistical design methodology, combined with a fully vectorial mode solver. Two resulting designs are presented and their performance in terms of bending loss, coupling loss to Corning SMF-28 standard telecom fiber, and cut-off wavelength is calculated.

  15. Application of human reliability analysis methodology of second generation; Aplicacion de metodologia de analisis de confiabilidad humana de segunda generacion

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz S, T. de J.; Nelson E, P. F. [Facultad de Ingenieria, Departamento de Sistemas Energeticos, UNAM, Paseo Cuauhnahuac 8532, 62550 Jiutepec, Morelos (Mexico)], e-mail: trs@cie.unam.mx

    2009-10-15

    Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that combine to produce an error in human tasks, both those performed under normal operating conditions and those performed after an abnormal event. Additionally, in the analysis of various accidents in history, the human component has been found to be a contributing factor in the cause. Because of the need to understand the forms and probability of human error, the 1960s saw the beginning of the collection of generic data, which resulted in the development of the first generation of HRA methodologies. Subsequently, methods were developed to include additional performance shaping factors, and the interaction between them, in the models. Thus, by the mid-1990s came what are considered the second-generation methodologies. Among these is the methodology A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of additional deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required independent evaluation of the two related human failure events. The gathering of the new human error probabilities thus involves the quantification of the nominal scenario and of the cases of significant deviations considered because of their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the analysis of the sequences extracted the more specific factors with the highest contribution to the human error probabilities. (Author)
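
    The dependency evaluation mentioned above can be illustrated with the widely cited THERP conditioning formulas, used here as a simple stand-in (ATHEANA's own treatment is more elaborate); the nominal human error probabilities are hypothetical.

        # Dependency adjustment between two successive human failure events using
        # the THERP conditioning formulas as a stand-in (nominal HEPs hypothetical).
        def conditional_hep(p, level):
            """Conditional HEP of the second action given the dependency level."""
            return {"zero": p,
                    "low": (1 + 19 * p) / 20,
                    "moderate": (1 + 6 * p) / 7,
                    "high": (1 + p) / 2,
                    "complete": 1.0}[level]

        p1, p2 = 1.0e-2, 5.0e-3
        joint = p1 * conditional_hep(p2, "moderate")
        print(f"joint failure probability = {joint:.2e}")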

  16. Perfusion analysis using a wide coverage flat-panel volume CT: feasibility study

    Science.gov (United States)

    Grasruck, M.; Gupta, R.; Reichardt, B.; Klotz, E.; Schmidt, B.; Flohr, T.

    2007-03-01

    We developed a flat-panel detector based Volume CT (VCT) prototype scanner with large z-coverage. In this prototype scanner, a Varian 4030CB a-Si flat-panel detector was mounted in a multi-slice CT gantry (Siemens Medical Solutions), which provides a 25 cm field of view with 18 cm z-coverage at isocenter. The large volume covered in one rotation can be used for visualization of complete organs of small animals, e.g., rabbits. By implementing a mode with continuous scanning, we are able to reconstruct the complete volume at any point in time during the propagation of a contrast bolus. Multiple volumetric reconstructions over time elucidate the first-pass dynamics of a bolus of contrast, resulting in 4-D angiography and potentially allowing whole-organ perfusion analysis. We studied the extent to which pixel-based permeability and blood volume calculation with a modified Patlak approach was possible. Experimental validation was performed by imaging the evolution of a contrast bolus in New Zealand rabbits. Despite the short circulation time of a rabbit, the temporal resolution was sufficient to visually resolve the various phases of the first pass of the contrast bolus. Perfusion imaging required substantial spatial smoothing but allowed a qualitative discrimination of different types of parenchyma in brain and liver. Whether a true quantitative analysis is possible requires further study.
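
    The Patlak approach reduces, in its basic form, to a linear regression: plotting the tissue-to-arterial concentration ratio against the normalized integral of the arterial curve yields the permeability as slope and the blood volume fraction as intercept. The sketch below uses synthetic curves, not scanner data.

        # Patlak-plot sketch: tissue-to-arterial ratio regressed against the
        # normalized integral of the arterial curve gives permeability (slope)
        # and fractional blood volume (intercept). Synthetic curves only.
        import numpy as np

        t = np.linspace(1.0, 60.0, 60)           # s, 1-s spacing
        ca = 5.0 * t * np.exp(-t / 10.0)         # arterial input (a.u.)
        ct = 0.05 * np.cumsum(ca) + 0.12 * ca    # tissue = K*integral + V0*Ca

        x = np.cumsum(ca) / ca                   # integral of Ca, normalized by Ca
        y = ct / ca
        K, v0 = np.polyfit(x, y, 1)
        print(f"K = {K:.3f} 1/s, blood volume fraction = {v0:.3f}")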

  17. Configuration evaluation and criteria plan. Volume 1: System trades study and design methodology plan (preliminary). Space Transportation Main Engine (STME) configuration study

    Science.gov (United States)

    Bair, E. K.

    1986-01-01

    The System Trades Study and Design Methodology Plan is used to conduct trade studies to define the combination of Space Transportation Main Engine (STME) features that will optimize candidate engine configurations. This is accomplished by using vehicle sensitivities and engine parametric data to establish engine chamber pressure and area ratio design points for candidate engine configurations. Engineering analyses are to be conducted to refine and optimize the candidate configurations at their design points. The optimized engine data and characteristics are then evaluated and compared against the other candidates being considered. The Evaluation Criteria Plan is then used to compare and rank the optimized engine configurations on the basis of cost.

  18. Fluid Vessel Quantity Using Non-invasive PZT Technology Flight Volume Measurements Under Zero G Analysis

    Science.gov (United States)

    Garofalo, Anthony A

    2013-01-01

    The purpose of the project is to perform analysis of data from the Systems Engineering Educational Discovery (SEED) program's 2011 and 2012 Fluid Vessel Quantity using Non-Invasive PZT Technology flight volume measurements under zero-g conditions (parabolic plane flight data), along with experimental planning and lab work for future sub-orbital experiments using the NASA PZT technology for fluid volume measurement. Along with conducting data analysis of flight data, I also did a variety of other tasks. I provided the lab with detailed technical drawings, experimented with 3D printers, made changes to the liquid nitrogen skid schematics, and learned how to weld. I also programmed microcontrollers to interact with various sensors and helped with other things going on around the lab.

  19. FINITE VOLUME NUMERICAL ANALYSIS FOR PARABOLIC EQUATION WITH ROBIN BOUNDARY CONDITION

    Institute of Scientific and Technical Information of China (English)

    Xia Cui

    2005-01-01

    In this paper, the finite volume method on unstructured meshes is studied for a parabolic convection-diffusion problem on an open bounded set of R^d (d = 2 or 3) with a Robin boundary condition. Upwinding approximations are adapted to treat both the convection term and the Robin boundary condition. By starting directly from the formulation of the finite volume scheme, numerical analysis is carried out. By using several discrete functional analysis techniques, such as summation by parts and discrete norm inequalities, the stability of and error estimates for the approximate solution are established; existence and uniqueness of the approximate solution and first-order temporal and L2 and H1 spatial norm convergence properties are obtained.
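
    A structured-mesh 1-D analogue conveys the two ingredients analysed above: an upwind convective flux and a Robin condition imposed through a ghost value. The parameters below are illustrative.

        # 1-D structured-mesh sketch: explicit finite volumes for
        # u_t + a*u_x = nu*u_xx with an upwind convective flux and a Robin
        # condition alpha*u + beta*u_x = g enforced at x = 1 through a ghost
        # value (homogeneous Dirichlet at x = 0; illustrative parameters).
        import numpy as np

        n, a, nu = 100, 1.0, 0.01
        dx, dt = 1.0 / n, 2.0e-4
        alpha, beta, g = 1.0, 0.5, 0.0
        x = np.linspace(dx / 2, 1 - dx / 2, n)          # cell centers
        u = np.exp(-100 * (x - 0.3) ** 2)               # initial profile

        for _ in range(2000):
            ghost_l = -u[0]                             # reflects u = 0 at x = 0
            ghost_r = (g - (alpha / 2 - beta / dx) * u[-1]) / (alpha / 2 + beta / dx)
            ue = np.concatenate(([ghost_l], u, [ghost_r]))
            conv = a * (ue[1:-1] - ue[:-2]) / dx        # upwind difference, a > 0
            diff = nu * (ue[2:] - 2 * ue[1:-1] + ue[:-2]) / dx ** 2
            u = u + dt * (diff - conv)

        print(f"peak after t = 0.4: {u.max():.3f} at x = {x[u.argmax()]:.2f}")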

  20. Fluid Vessel Quantity using Non-Invasive PZT Technology Flight Volume Measurements Under Zero G Analysis

    Science.gov (United States)

    Garofalo, Anthony A.

    2013-01-01

    The purpose of the project is to perform analysis of data from the Systems Engineering Educational Discovery (SEED) program's 2011 and 2012 Fluid Vessel Quantity using Non-Invasive PZT Technology flight volume measurements under zero-g conditions (parabolic plane flight data), along with experimental planning and lab work for future sub-orbital experiments using the NASA PZT technology for fluid volume measurement. Along with conducting data analysis of flight data, I also did a variety of other tasks. I provided the lab with detailed technical drawings, experimented with 3D printers, made changes to the liquid nitrogen skid schematics, and learned how to weld. I also programmed microcontrollers to interact with various sensors and helped with other things going on around the lab.

  1. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    International Nuclear Information System (INIS)

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs

  2. HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual

    International Nuclear Information System (INIS)

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs

  3. HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.; Lessor, D.L.

    1987-09-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs.

  4. Predicting Nonauditory Adverse Radiation Effects Following Radiosurgery for Vestibular Schwannoma: A Volume and Dosimetric Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hayhurst, Caroline; Monsalves, Eric; Bernstein, Mark; Gentili, Fred [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada); Heydarian, Mostafa; Tsao, May [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Schwartz, Michael [Radiation Oncology Program and Division of Neurosurgery, Sunnybrook Hospital, Toronto (Canada); Prooijen, Monique van [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Millar, Barbara-Ann; Menard, Cynthia [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Kulkarni, Abhaya V. [Division of Neurosurgery, Hospital for Sick Children, University of Toronto (Canada); Laperriere, Norm [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Zadeh, Gelareh, E-mail: Gelareh.Zadeh@uhn.on.ca [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada)

    2012-04-01

    Purpose: To define clinical and dosimetric predictors of nonauditory adverse radiation effects after radiosurgery for vestibular schwannoma treated with a 12 Gy prescription dose. Methods: We retrospectively reviewed our experience with vestibular schwannoma patients treated between September 2005 and December 2009. Two hundred patients were treated at a 12 Gy prescription dose; 80 had complete clinical and radiological follow-up for at least 24 months (median, 28.5 months). All treatment plans were reviewed for target volume and dosimetry characteristics: gradient index; homogeneity index, defined as the maximum dose in the treatment volume divided by the prescription dose; conformity index; brainstem dose; and trigeminal nerve dose. All adverse radiation effects (ARE) were recorded. Because the intent of our study was to focus on the nonauditory adverse effects, hearing outcome was not evaluated. Results: Twenty-seven (33.8%) patients developed ARE: 5 (6%) developed hydrocephalus, 10 (12.5%) reported new ataxia, 17 (21%) developed trigeminal dysfunction, 3 (3.75%) had facial weakness, and 1 patient developed hemifacial spasm. The development of edema within the pons was significantly associated with ARE (p = 0.001). On multivariate analysis, only target volume was a significant predictor of ARE (p = 0.001). There is a target volume threshold of 5 cm3, above which ARE are more likely. The treatment plan dosimetric characteristics are not associated with ARE, although the maximum dose to the 5th nerve is a significant predictor of trigeminal dysfunction, with a threshold of 9 Gy. The overall 2-year tumor control rate was 96%. Conclusions: Target volume is the most important predictor of adverse radiation effects, and we identified the significant treatment volume threshold to be 5 cm3. We also established through our series that the maximum tolerable dose to the 5th nerve is 9 Gy.

  5. Wind power projects in the CDM: Methodologies and tools for baselines, carbon financing and sustainability analysis

    DEFF Research Database (Denmark)

    Ringius, L.; Grohnheit, Poul Erik; Nielsen, Lars Henrik;

    2002-01-01

    The report is intended to be a guidance document for project developers, investors, lenders, and CDM host countries involved in wind power projects in the CDM. The report explores in particular those issues that are important in CDM project assessment and development - that is, baseline development, carbon financing, and environmental sustainability. It does not deal in detail with those issues that are routinely covered in a standard wind power project assessment. The report tests, compares, and recommends methodologies for and approaches to baseline development. To present the application and implications of the various methodologies and approaches in a concrete context, Africa's largest wind farm - namely the 60 MW wind farm located in Zafarana, Egypt - is examined as a hypothetical CDM wind power project. The report shows that for the present case example there is a difference of about 25% between...

  6. The total scattering atomic pair distribution function: New methodology for nanostructure analysis

    Science.gov (United States)

    Masadeh, Ahmad

    Conventional X-ray diffraction (XRD) methods probe for the presence of long-range order (periodic structure), which is reflected in the Bragg peaks. Local structural deviations or disorder mainly affect the diffuse scattering intensity. In order to obtain structural information about both long-range order and local structural disorder, a technique that takes into account both Bragg and diffuse scattering needs to be employed, such as the atomic pair distribution function (PDF) technique. This work introduces a PDF-based methodology to quantitatively investigate nanostructured materials in general. The introduced methodology can be applied to quantitatively extract information about the structure, crystallinity level, core/shell size, nanoparticle size, and inhomogeneous internal strain of the measured nanoparticles. This method is generally applicable to the characterization of nano-scale solids, many of which may exhibit complex disorder and strain.
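
    The core numerical step of the PDF technique can be sketched as a sine Fourier transform of the reduced structure function; the snippet below uses a synthetic S(Q) purely for illustration, since real total scattering data require absorption, background, and normalization corrections first, and the finite Qmax causes termination ripples in G(r).

```python
import numpy as np

# Hedged sketch of the PDF transform:
#   G(r) = (2/pi) * Integral_0^Qmax  Q [S(Q) - 1] sin(Q r) dQ
Q = np.linspace(0.0, 25.0, 2000)                     # scattering vector, 1/Angstrom
S = 1.0 + 0.5 * np.exp(-Q / 8.0) * np.sin(2.8 * Q)   # toy structure function

r = np.linspace(0.01, 20.0, 1000)                    # real-space grid, Angstrom
FQ = Q * (S - 1.0)                                   # reduced structure function
# sine transform evaluated by the trapezoidal rule for every r at once
G = (2.0 / np.pi) * np.trapz(FQ[None, :] * np.sin(np.outer(r, Q)), Q, axis=1)
```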

  7. THE ANALYSIS OF SOCIAL PROTECTION EXPENDITURE OF ROMANIA BY FUNCTIONS ACCORDING TO THE EUROPEAN UNION METHODOLOGY

    Directory of Open Access Journals (Sweden)

    OANA DOBRE-BARON

    2014-12-01

    Full Text Available For Romania, the status of full member of the European Union created obligations, and especially the need to comply with the directives and regulations of the European bodies. Among these is the European Commission regulation requiring each Member State to implement the ESSPROS methodology, which coordinates the social security systems of each country in terms of statistical data collection and the calculation of welfare indicators on a comparable basis. This paper presents and analyzes the way in which Romania complied with this regulation, how it applies the European Union methodology and, especially, the extent to which our social protection system ensures the welfare of citizens compared to other systems in the European Union.

  8. Analysis of the link between a definition of sustainability and the life cycle methodologies

    DEFF Research Database (Denmark)

    Jørgensen, Andreas; Herrmann, Ivan Tengbjerg; Bjørn, Anders

    2013-01-01

    It has been claimed that, in order to assess the sustainability of products, a combination of the results from a life cycle assessment (LCA), a social life cycle assessment (SLCA) and life cycle costing (LCC) is needed. Despite the frequent reference to this claim in the literature, very little ... A product's sustainability is about addressing the extent to which product life cycles affect poverty levels among the current generation, as well as changes in the levels of natural, human, produced and social capital available for the future population. It is shown that the extent to which product life cycles affect ... large extent considered in any of the LC methodologies. Furthermore, because of the present level of knowledge about what creates and destroys social capital, it is difficult to assess how it relates to the LC methodologies. It is also found that the LCC is only relevant in the context of a life cycle ...

  9. Formal security analysis of registration protocols for interactive systems: a methodology and a case of study

    CERN Document Server

    Diaz, Jesus; Rodriguez, Francisco B

    2012-01-01

    In this work we present and formally analyze CHAT-SRP (CHAos based Tickets-Secure Registration Protocol), a protocol that provides interactive and collaborative platforms with a cryptographically robust solution to classical security issues. Namely, we focus on the secrecy and authenticity properties while keeping high usability. Indeed, most interactive platforms currently base their security properties almost exclusively on the correct implementation and configuration of the systems; in this sense, users are forced to blindly trust the system administrators and developers. Moreover, as far as we know, there is a lack of formal methodologies for the verification of security properties of interactive applications. We propose here a methodology to fill this gap, i.e., to analyse both the security of the proposed protocol and the pertinence of the underlying premises. To this end, we propose the definition and formal evaluation of a protocol for the distribution of digital identities. Once distributed, thes...

  10. COMPARATIVE ANALYSIS OF TRAINING METHODOLOGY EFFICIENCY ON THE MOTOR SPHERE OF JUNIOR I DANCERS

    Directory of Open Access Journals (Sweden)

    Grigore Virgil

    2015-10-01

    Full Text Available The purpose of this paper is to highlight the influence of the training methodology on the motor sphere of Junior I dancers. This scientific approach involved an experimental study organized at the "Two Step" Club of Bucharest. The research was conducted from January 2012 to November 2013 by investigating two groups of athletes, an experimental group and a control group; each group included 12 dancers aged 12 to 13, corresponding to the Junior I sports classification category. The results show that, thanks to the training methodology applied, the Junior I dancers in the experimental group improved the strength of their abdominal and arm muscles, increased the mobility of their spine and coxo-femoral joints, and improved their strength under speed conditions as well.

  11. Reliability Analysis of Phased Mission Systems Considering Sensitivity, Uncertainty and Common Cause Failure Analysis Using the GO-FLOW Methodology

    Directory of Open Access Journals (Sweden)

    Muhammad Hashim

    2013-04-01

    Full Text Available Reliability is the probability that a device will perform its required function under stated conditions for a specified period of time. Common Cause Failures (CCFs), multiple failures arising from a shared cause, have long been recognized (U.S. NRC, 1975) as an important issue in Probabilistic Safety Assessment (PSA), and uncertainty and sensitivity analyses provide important information for the evaluation of system reliability. In this study, two cases have been considered. In the first case, the authors analyze the reliability of a PWR safety system with the GO-FLOW methodology, as an alternative to Fault Tree and Event Tree Analysis, because it is a success-oriented system analysis technique with which the reliability analysis of complex systems is comparatively easy to conduct. In the second case, a sensitivity analysis is performed in order to prioritize the parameters with the largest contribution to system reliability, together with common cause failure and uncertainty analyses. As an example of a phased mission system, the PWR containment spray system is considered.
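
    As an illustration of how common cause failure enters such a reliability calculation, the sketch below uses a simple beta-factor split in a Monte Carlo estimate for a two-train system over one mission phase. This is a generic textbook construction, not the GO-FLOW algorithm, and all rates are invented placeholders.

```python
import numpy as np

# 1-out-of-2 standby system over a mission phase, beta-factor CCF model.
rng = np.random.default_rng(0)
lam_total = 1e-3      # assumed per-hour failure rate of one train
beta = 0.1            # assumed fraction of failures that are common cause
t_mission = 720.0     # mission phase length, hours
n = 1_000_000

lam_ind = (1.0 - beta) * lam_total    # independent part of the rate
lam_ccf = beta * lam_total            # common cause part

t_a = rng.exponential(1.0 / lam_ind, n)   # independent failure, train A
t_b = rng.exponential(1.0 / lam_ind, n)   # independent failure, train B
t_c = rng.exponential(1.0 / lam_ccf, n)   # shock failing both trains at once

# the system fails when both trains are down: min(max(tA, tB), tCCF)
t_sys = np.minimum(np.maximum(t_a, t_b), t_c)
p_fail = np.mean(t_sys <= t_mission)
print(f"phase failure probability ~ {p_fail:.2e}")
```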

  12. Analysis of Innovation Modes in the Russian Economy: Methodological Approaches and First Results

    OpenAIRE

    Leonid Gokhberg; Tatyana Kuznetsova; Vitaly Roud

    2010-01-01

    The paper discusses methodologies used to adapt to the Russian context the measures of STI performance developed and implemented under the framework of the European Manufacturing Survey (EMS), and reports on the results. Two rounds of surveys were conducted in 2009-2010, covering around 2000 Russian companies; the third round is planned for 2012. The paper outlines how innovative activities in the real sector of the Russian economy can be measured. The EMS toolkit must be significantly adap...

  13. Analysis of the application and integration of methodologies by software development companies

    OpenAIRE

    Soliński, Adam

    2012-01-01

    Context. In recent years a significant shift from plan-driven development towards agile has been observed, which is considered a vast improvement to processes. However, it has also been noted that agile methodologies are hardly ever applied in their pure form; moreover, hybrid processes emerge as combinations of plan-driven and agile practices. In addition, agile adoption has been reported to result in both benefits and limitations. Objectives. In this study the following matters ...

  14. Impression management : developing and illustrating a scheme of analysis for narrative disclosures – a methodological note

    OpenAIRE

    Brennan, Niamh; Guillamon-Saorin, Encarna; Pierce, Aileen

    2009-01-01

    Purpose – This paper develops a holistic measure for analysing impression management and for detecting bias introduced into corporate narratives as a result of impression management. Design/methodology/approach – Prior research on the seven impression management methods in the literature is summarised. Four of the less-researched methods are described in detail, and are illustrated with examples from UK Annual Results’ Press Releases (ARPRs). A method of computing a holistic composite impr...

  15. Review on the NEI Methodology of Debris Transport Analysis in Sump Blockage Issue for APR1400

    International Nuclear Information System (INIS)

    Since the USNRC (United States Nuclear Regulatory Commission) initially addressed post-accident sump performance under Unresolved Safety Issue USI A-43, the sump blockage issue has gone through GSI-191, Regulatory Guide 1.82, Rev. 3 (RG 1.82 Rev. 3), and Generic Letter 2004-02 for PWRs (Pressurized Water Reactors). In response to these USNRC activities, NEI 04-07 was issued in order to evaluate the post-accident performance of a plant's recirculation sump. The baseline methodology of NEI 04-07 is composed of break selection, debris generation, latent debris, debris transport, and head loss. As an analytical refinement of NEI 04-07, computational fluid dynamics (CFD) is suggested for the evaluation of debris transport in emergency core cooling (ECC) recirculation mode, as guided by RG 1.82 Rev. 3. The Korean nuclear industry also keeps pace with international activities on this safety issue, with the Kori 1 plant as a pioneer. Korea has also been pursuing the development of an advanced PWR, the APR1400, which incorporates several improved safety features. Considering the sump blockage issue, one of its key features is the adoption of the IRWST (In-containment Refueling Water Storage Tank). This device, as the acronym implies, changes the emergency core cooling water injection pattern, which calls for a review of the applicability of the NEI 04-07 methodology. In this paper we discuss the applicability of the NEI 04-07 methodology, propose a new methodology, and present a preliminary debris transport analysis.

  16. A design methodology for scenario-analysis in urban freight modelling

    OpenAIRE

    Ambrosini, Christian; Gonzalez-Feliu, Jesus; Toilier, Florence

    2013-01-01

    Urban goods movement modelling is a popular subject in urban logistics research. However, most models remain under-used because practitioners have difficulty applying them to simulate urban policies and their impacts on transport flows, mainly when the assessed situations differ from the initial usage of the model. This paper aims to address that issue by proposing a methodology for scenario construction and assessment using current models and tools. The met...

  17. Performance Evaluation and Measurement of the Organization in Strategic Analysis and Control: Methodological Aspects

    OpenAIRE

    Živan Ristić; Neđo Balaban

    2006-01-01

    Information acquired by measurement and evaluation is a necessary condition for good decision-making in strategic management. This work deals with: (a) methodological aspects of evaluation (kinds of evaluation, metaevaluation) and of measurement (the supposition of isomorphism in measurement, kinds and levels of measurement, errors in measurement, and the basic characteristics of measurement); (b) evaluation and measurement of the potential and accomplishments of the organization in the Kaplan-Norton perspect...

  18. Accuracy of ionospheric models used in GNSS and SBAS: methodology and analysis

    OpenAIRE

    Rovira Garcia, Adrià; Juan Zornoza, José Miguel; Sanz Subirana, Jaume; González Casado, Guillermo; Ibáñez Segura, Deimos

    2016-01-01

    The characterization of the accuracy of ionospheric models currently used in global navigation satellite systems (GNSSs) is a long-standing issue. It remains a challenging problem owing to the lack of sufficiently accurate slant ionospheric determinations to be used as a reference. The present study proposes a methodology based on the comparison of the predictions of any ionospheric model with actual unambiguous carrier-phase measur...

  19. A Novel Water Supply Network Sectorization Methodology Based on a Complete Economic Analysis, Including Uncertainties

    Directory of Open Access Journals (Sweden)

    Enrique Campbell

    2016-04-01

    Full Text Available The core idea behind sectorization of Water Supply Networks (WSNs is to establish areas partially isolated from the rest of the network to improve operational control. Besides the benefits associated with sectorization, some drawbacks must be taken into consideration by water operators: the economic investment associated with both boundary valves and flowmeters and the reduction of both pressure and system resilience. The target of sectorization is to properly balance these negative and positive aspects. Sectorization methodologies addressing the economic aspects mainly consider costs of valves and flowmeters and of energy, and the benefits in terms of water saving linked to pressure reduction. However, sectorization entails other benefits, such as the reduction of domestic consumption, the reduction of burst frequency and the enhanced capacity to detect and intervene over future leakage events. We implement a development proposed by the International Water Association (IWA to estimate the aforementioned benefits. Such a development is integrated in a novel sectorization methodology based on a social network community detection algorithm, combined with a genetic algorithm optimization method and Monte Carlo simulation. The methodology is implemented over a fraction of the WSN of Managua city, capital of Nicaragua, generating a net benefit of 25,572 $/year.
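
    The Monte Carlo treatment of uncertain benefits mentioned above can be sketched as follows; the distributions and dollar figures are invented placeholders, and the actual methodology couples this sampling with the community detection and genetic algorithm steps, which are omitted here.

```python
import numpy as np

# Sample each benefit/cost term of a candidate sectorization from an
# assumed triangular distribution and summarise the net benefit.
rng = np.random.default_rng(1)
n = 100_000
water_saving    = rng.triangular(10_000, 18_000, 30_000, n)   # $/yr
burst_reduction = rng.triangular( 2_000,  5_000,  9_000, n)   # $/yr
leak_detection  = rng.triangular( 3_000,  7_000, 12_000, n)   # $/yr
costs           = rng.triangular( 8_000, 11_000, 15_000, n)   # valves, meters, energy

net = water_saving + burst_reduction + leak_detection - costs
print(f"mean net benefit ~ {net.mean():,.0f} $/yr, "
      f"5th-95th pct [{np.percentile(net, 5):,.0f}, {np.percentile(net, 95):,.0f}]")
```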

  20. The HGPT methodology with 'control variable option' in quasistatic nuclear reactor analysis

    International Nuclear Information System (INIS)

    The heuristically based generalized perturbation theory (HGPT), to first and higher order, applied to the neutron field of a reactor system, is discussed in relation to quasistatic problems. This methodology is of interest in reactor operation. In this application it may allow an on-line appraisal of the main physical 'observables' (i.e., quantities which can be directly measured by devices near the core) of the reactor system when subject to alterations relevant to normal system exploitation, e.g., control rod movement and/or soluble boron concentration changes introduced to compensate power level variations following electrical network demands. In the frame of a wide R and D program aimed at setting up a fast-answering system for operator support in PWR normal operation, a joint cooperation is underway among FRAMATOME, ENEA and the University of Bologna to develop an original application of the GPT methodology to PWR technology. The present paper, after describing the main features of the theory, in particular the fundamentals of the so-called 'control variable option', emphasizes the general objectives of such a system. The results from a small-scale investigation performed on a simplified PWR system corroborate the validity of the proposed methodology and allow some preliminary conclusions to be drawn on its industrial potential. (author)
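
    As background, the generic first-order GPT sensitivity expression takes the form sketched below; the HGPT 'control variable option' elaborates on this, and the notation here is a standard textbook sketch rather than the paper's own formulation.

```latex
% For a response Q = <h, phi> with the flux phi governed by B(p) phi = 0,
% a first-order parameter perturbation delta p changes Q by
\delta Q \simeq \Big\langle \Gamma^{*},\, \frac{\partial B}{\partial p}\,\varphi \Big\rangle\, \delta p ,
% where the importance function Gamma* solves the adjoint problem
% B^{*}\Gamma^{*} = -\,\partial Q / \partial \varphi .
```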

  1. Evaluation of Cylinder Volume Estimation Methods for In–Cylinder Pressure Trace Analysis

    OpenAIRE

    Adrian Irimescu

    2012-01-01

    In–cylinder pressure trace analysis is an important investigation tool frequently employed in the study of internal combustion engines. While technical data is usually available for experimental engines, in some cases measurements are performed on automotive engines for which only the most basic geometry features are available. Therefore, several authors aimed to determine the cylinder volume and length of the connecting rod by other methods than direct measurement. This stu...

  2. 3D photography in the objective analysis of volume augmentation including fat augmentation and dermal fillers.

    Science.gov (United States)

    Meier, Jason D; Glasgold, Robert A; Glasgold, Mark J

    2011-11-01

    The authors present quantitative and objective 3D data from their studies showing long-term results with facial volume augmentation. The first study analyzes fat grafting of the midface and the second presents augmentation of the tear trough with hyaluronic filler. Surgeons using 3D quantitative analysis can learn the duration of results and the optimal amount to inject, as well as show patients results that are not demonstrable with standard 2D photography. PMID:22004863

  3. Effect of varicocelectomy on testis volume and semen parameters in adolescents: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Tie Zhou

    2015-01-01

    Full Text Available Varicocele repair in adolescents remains controversial. Our aim is to identify and combine the results of clinical trials published thus far to ascertain the efficacy of varicocelectomy in improving testis volume and semen parameters compared with non-treatment controls. A literature search was performed using Medline, Embase and Web of Science, and included results obtained from meta-analyses, randomized and nonrandomized controlled studies. The study population was adolescents with clinically palpable varicocele, with or without testicular asymmetry or abnormal semen parameters. Cases were allocated to treatment and observation groups, and testis volume or semen parameters were adopted as outcome measures. As a result, seven randomized controlled trials (RCTs) and nonrandomized controlled trials studying bilateral testis volume or semen parameters in both treatment and observation groups were identified. Using a random-effects model, the mean difference in testis volume between the treatment group and the observation group was 2.9 ml (95% confidence interval [CI]: 0.6, 5.2; P < 0.05) for the varicocele side and 1.5 ml (95% CI: 0.3, 2.7; P < 0.05) for the healthy side. The random-effects model analysis demonstrated that the mean differences in semen concentration, total semen motility, and normal morphology between the two groups were 13.7 × 10^6 ml^−1 (95% CI: −1.4, 28.8; P = 0.075), 2.5% (95% CI: −3.6, 8.6; P = 0.424), and 2.9% (95% CI: −3.0, 8.7; P = 0.336), respectively. In conclusion, although varicocelectomy significantly improved bilateral testis volume in adolescents with varicocele compared with observation, semen parameters did not show any statistically significant difference between the two groups. Well-planned, properly conducted RCTs are needed to confirm the above conclusion further and to explore whether varicocele repair in adolescents could improve subsequent spontaneous pregnancy rates.
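
    The random-effects pooling reported above can be sketched with the DerSimonian-Laird estimator; the per-study numbers below are invented placeholders, not the trial data.

```python
import numpy as np

# DerSimonian-Laird random-effects pooling of per-study mean differences
# in testis volume (ml) with their standard errors (hypothetical values).
md = np.array([3.1, 2.2, 4.0, 1.8, 3.5])     # mean differences
se = np.array([1.2, 0.9, 1.5, 0.8, 1.1])     # standard errors

w = 1.0 / se**2                              # fixed-effect weights
mu_fe = np.sum(w * md) / np.sum(w)
Q = np.sum(w * (md - mu_fe) ** 2)            # Cochran's Q heterogeneity statistic
k = len(md)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (se**2 + tau2)                  # random-effects weights
mu_re = np.sum(w_re * md) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled MD = {mu_re:.2f} ml, 95% CI [{mu_re - 1.96*se_re:.2f}, "
      f"{mu_re + 1.96*se_re:.2f}], tau^2 = {tau2:.2f}")
```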

  4. Development of a methodology for the economical analysis of fuel cycles, application to the Laguna Verde central

    International Nuclear Information System (INIS)

    In this work a methodology developed to carry out the economic analysis of the fuel cycle of a nuclear reactor is presented. The methodology was applied to the Laguna Verde Nuclear Power Station (CNLV). The reload scenario designs for the CNLV are made with the Core Master Presto (CM-Presto) code, a three-dimensional simulator of the reactor core; the data it produces, together with the information of the Energy Use Plan (PUE), allowed us to obtain reliable results through the fitting of an economic calculation algorithm that brings all components of the fuel cycle to present worth. With the application of the methodology, the generated energy was obtained, as well as the respective cost of each sub-lot type of assemblies per operation cycle, from the start-up of the CNLV until September 13, 2002. Using the present worth method, all values were referred to November 5, 1988, the date of the beginning of operation. At the end of the analysis, a levelized cost of 6.188 mills/kWh was obtained for the first 9 cycles of Unit 1 of the CNLV, and it was observed that the costs of the first 3 operation cycles are the highest. Considering only the values from cycle 4 onward, the levelized cost turns out to be 5.96 mills/kWh. The cost per fuel lot was also obtained, to evaluate the performance of assemblies with the same physical composition. (Author)

  5. Development and application of a methodology for the analysis of significant human related event trends in nuclear power plants

    International Nuclear Information System (INIS)

    A methodology is developed to identify and flag significant trends related to the safety and availability of U.S. commercial nuclear power plants. The development is intended to aid in reducing the likelihood of human errors. To ensure that the methodology can be easily adapted to various types of classification schemes for operating data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme was selected for the methodology. Significance criteria were developed for human-initiated events affecting the systems and for events caused by human deficiencies. Clustering analysis was used to verify the learning trend in multidimensional histograms. A computer code based on the K-Means algorithm was developed and applied to find the learning period, in which error rates decrease monotonically with plant age. Freeman-Tukey (F-T) deviates are used to flag generic problems, identified by a large positive deviate (here, approximately above 2.0). The identified generic problems are: decision errors, which are highly associated with reactor startup operations in the learning period of PWR plants (PWRs); response errors, which are highly associated with Secondary Non-Nuclear Systems (SNS) in PWRs; and significant errors affecting systems caused by response actions, which are highly associated with the startup reactor mode in BWRs.
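
    The Freeman-Tukey screening step can be sketched directly; the counts below are invented placeholders, with deviates above roughly 2.0 flagging candidate generic problems as described.

```python
import numpy as np

# Contingency table of event counts (e.g. error type x plant mode), with
# expected counts from the independence model and F-T deviates per cell.
observed = np.array([[12, 3, 7],
                     [ 5, 9, 4]], dtype=float)
row = observed.sum(axis=1, keepdims=True)
col = observed.sum(axis=0, keepdims=True)
expected = row * col / observed.sum()

# Freeman-Tukey deviate: z = sqrt(x) + sqrt(x + 1) - sqrt(4 m + 1)
z = np.sqrt(observed) + np.sqrt(observed + 1.0) - np.sqrt(4.0 * expected + 1.0)
flagged = np.argwhere(z > 2.0)               # candidate "generic problem" cells
print(z.round(2), flagged, sep="\n")
```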

  6. Magnetic resonance velocity imaging derived pressure differential using control volume analysis

    Directory of Open Access Journals (Sweden)

    Cohen Benjamin

    2011-03-01

    Full Text Available Abstract Background Diagnosis and treatment of hydrocephalus is hindered by a lack of systemic understanding of the interrelationships between pressures and flow of cerebrospinal fluid in the brain. Control volume analysis provides a fluid physics approach to quantify and relate pressure and flow information. The objective of this study was to use control volume analysis and magnetic resonance velocity imaging to non-invasively estimate pressure differentials in vitro. Method A flow phantom was constructed, with water as the experimental fluid. The phantom was connected to a high-resolution differential pressure sensor and a computer-controlled pump producing sinusoidal flow. Magnetic resonance velocity measurements were taken and subsequently analyzed to derive pressure differential waveforms using momentum conservation principles. Independent sensor measurements were obtained for comparison. Results Using the magnetic resonance data, the momentum balance in the phantom was computed. The measured differential pressure force had an amplitude of 14.4 dynes (pressure gradient amplitude 0.30 Pa/cm). A 12.5% normalized root mean square deviation between derived and directly measured pressure differentials was obtained. These experiments demonstrate one example of the potential utility of control volume analysis and the concepts involved in its application. Conclusions This study validates a non-invasive measurement technique for relating velocity measurements to pressure differentials. These methods may be applied to clinical measurements to estimate pressure differentials in vivo which could not be obtained with current clinical sensors.
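
    The pressure differential derivation rests on the integral momentum balance over the control volume; a generic statement, assuming a Newtonian viscous stress tau and constant density, is sketched below (the study's exact discretization of the MR velocity data may differ).

```latex
% Net pressure force on control volume V with boundary S, recovered from
% the measured velocity field u:
\int_S p\,\mathbf{n}\,\mathrm{d}A
  = -\frac{\partial}{\partial t}\int_V \rho\,\mathbf{u}\,\mathrm{d}V
    - \oint_S \rho\,\mathbf{u}\,(\mathbf{u}\cdot\mathbf{n})\,\mathrm{d}A
    + \int_S \boldsymbol{\tau}\cdot\mathbf{n}\,\mathrm{d}A
```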

  7. Report of a CSNI workshop on uncertainty analysis methods. Volume 1 + 2

    International Nuclear Information System (INIS)

    The OECD NEA CSNI Principal Working Group 2 (PWG2) Task Group on Thermal Hydraulic System Behaviour (TGTHSB) has, in recent years, received presentations of a variety of different methods to analyze the uncertainty in the calculations of advanced unbiased (best estimate) codes. Proposals were also made for an International Standard Problem (ISP) to compare the uncertainty analysis methods. The objectives for the Workshop were to discuss and fully understand the principles of uncertainty analysis relevant to LOCA modelling and like problems, to examine the underlying issues from first principles, in preference to comparing and contrasting the currently proposed methods, to reach consensus on the issues identified as far as possible while not avoiding the controversial aspects, to identify as clearly as possible unreconciled differences, and to issue a Status Report. Eight uncertainty analysis methods were presented. A structured discussion of various aspects of uncertainty analysis followed - the need for uncertainty analysis, identification and ranking of uncertainties, characterisation, quantification and combination of uncertainties and applications, resources and future developments. As a result, the objectives set out above were, to a very large extent, achieved. Plans for the ISP were also discussed. Volume 1 contains a record of the discussions on uncertainty methods. Volume 2 is a compilation of descriptions of the eight uncertainty analysis methods presented at the workshop

  8. Development of a Generalized Methodology for Soil-Structure Interaction Analysis Using Nonlinear Time-Domain Techniques

    International Nuclear Information System (INIS)

    A generalized time-domain method for Soil-Structure Interaction Analysis is developed, based upon an extension of the Bielak Method. The methodology is combined with the use of a simple hysteretic soil model based upon the Ramberg-Osgood formulation and applied to a notional Small Modular Reactor. These benchmark results compare well with those obtained by using the industry-standard frequency domain code SASSI. The methodology provides a path forward for investigation of other sources of nonlinearity, including those associated with the use of more physically-realistic material models incorporating pore-pressure effects, gap opening/closing, the effect of nonlinear structural elements, and 3D seismic inputs.
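
    For reference, one common one-dimensional Ramberg-Osgood backbone form is sketched below; the report's exact parametrisation of the hysteretic soil model is not given in this abstract, so the constants here are generic assumptions.

```latex
% Shear strain gamma versus shear stress tau, with initial modulus G,
% reference stress tau_y, and dimensionless fitting constants alpha, r:
\gamma = \frac{\tau}{G}\left[\,1 + \alpha \left|\frac{\tau}{\tau_y}\right|^{\,r-1}\right]
```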

  9. Conceptual and methodological frameworks for large scale and high resolution analysis of the physical flood susceptibility of buildings

    Science.gov (United States)

    Blanco-Vogt, A.; Schanze, J.

    2013-10-01

    Several approaches are available for assessing flood damage to buildings and critical infrastructure. However, these methods can hardly be applied at a large scale up to now, because high-resolution classification and characterisation approaches for the built structures are lacking. To overcome this obstacle, the paper presents, first, a conceptual framework for understanding the physical flood susceptibility of buildings and, second, a methodological framework for its analysis. The latter ranges from the automatic extraction of buildings, mainly from remote sensing, with their subsequent classification and characterisation, to a systematic physical flood susceptibility assessment. The work shows the results of implementing and testing the methodology in a district of the city of Magangué, Magdalena River, Colombia.

  10. Lobar analysis of collapsibility indices to assess functional lung volumes in COPD patients

    Directory of Open Access Journals (Sweden)

    Kitano M

    2014-12-01

    Full Text Available Mariko Kitano,1 Shingo Iwano,1 Naozumi Hashimoto,2 Keiji Matsuo,3 Yoshinori Hasegawa,2 Shinji Naganawa1 1Department of Radiology, 2Department of Respiratory Medicine, Graduate School of Medicine, Nagoya University, Nagoya, Aichi, Japan; 3Department of Radiology, Ichinomiya Municipal Hospital, Ichinomiya, Aichi, Japan Background: We investigated correlations between lung volume collapsibility indices and pulmonary function test (PFT) results and assessed lobar differences in chronic obstructive pulmonary disease (COPD) patients, using paired inspiratory and expiratory three-dimensional (3D) computed tomography (CT) images. Methods: We retrospectively assessed 28 COPD patients who underwent paired inspiratory and expiratory CT and PFT exams on the same day. A computer-aided diagnostic system calculated total lobar volume and emphysematous lobar volume (ELV). Normal lobar volume (NLV) was determined by subtracting ELV from total lobar volume, both for the inspiratory phase (NLVI) and for the expiratory phase (NLVE). We also determined lobar collapsibility indices: NLV collapsibility ratio, NLVCR (%) = (1 − NLVE/NLVI) × 100%. Associations between lobar volumes and PFT results, and between collapsibility indices and PFT results, were determined by Pearson correlation analysis. Results: NLVCR values were significantly correlated with PFT results. Forced expiratory volume in 1 second, measured as percent of predicted (FEV1%P), was significantly correlated with NLVCR values for the lower lobes (P<0.01), whereas this correlation was not significant for the upper lobes (P=0.05). FEV1%P results were also moderately correlated with inspiratory and expiratory ELV (ELVI,E) for the lower lobes (P<0.05). In contrast, the ratio of the diffusion capacity for carbon monoxide to alveolar gas volume, measured as percent of predicted (DLCO/VA%P), was strongly correlated with ELVI for the upper lobes (P<0.001), whereas this correlation with NLVCR values was weaker for the upper lobes (P<0
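
    The collapsibility index defined above transcribes directly into code; the lobar volumes below are hypothetical placeholders.

```python
# NLV collapsibility ratio from paired inspiratory/expiratory volumes.
def nlvcr(nlv_insp: float, nlv_exp: float) -> float:
    """Normal-lung-volume collapsibility ratio, percent."""
    return (1.0 - nlv_exp / nlv_insp) * 100.0

total_i, elv_i = 1500.0, 400.0   # hypothetical inspiratory total / emphysematous volume (ml)
total_e, elv_e = 1000.0, 380.0   # hypothetical expiratory values (ml)
print(f"NLVCR = {nlvcr(total_i - elv_i, total_e - elv_e):.1f} %")
```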

  11. Fabrication, testing, and analysis of anisotropic carbon/glass hybrid composites: volume 1: technical report.

    Energy Technology Data Exchange (ETDEWEB)

    Wetzel, Kyle K. (Wetzel Engineering, Inc. Lawrence, Kansas); Hermann, Thomas M. (Wichita state University, Wichita, Kansas); Locke, James (Wichita state University, Wichita, Kansas)

    2005-11-01

    ... from the long axis for approximately two-thirds of the laminate volume (discounting skin layers), with reinforcing carbon fibers oriented axially comprising the remaining one-third of the volume. Finite element analysis of each laminate has been performed to examine first-ply failure. Three failure criteria - maximum stress, maximum strain, and Tsai-Wu - have been compared. Failure predicted by all three criteria proves generally conservative, with the stress-based criteria the most conservative. For laminates that respond nonlinearly to loading, large error is observed in the prediction of failure using maximum strain as the criterion. This report documents the methods and results in two volumes. Volume 1 contains descriptions of the laminates, their fabrication and testing, the methods of analysis, the results, and the conclusions and recommendations. Volume 2 contains a comprehensive summary of the individual test results for all laminates.
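
    Of the three criteria compared, Tsai-Wu is the only interactive one; a plane-stress sketch is given below, with invented ply strengths and the common default choice for the interaction term F12.

```python
import numpy as np

# Plane-stress Tsai-Wu first-ply-failure check for a unidirectional ply.
# Strengths are magnitudes in MPa and are invented placeholders.
Xt, Xc = 1500.0, 1200.0     # longitudinal tensile / compressive strength
Yt, Yc = 50.0, 200.0        # transverse strengths
S = 70.0                    # in-plane shear strength

F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
F12 = -0.5 * np.sqrt(F11 * F22)        # common default interaction term

def tsai_wu(s1: float, s2: float, t12: float) -> float:
    """Failure index; values >= 1 indicate first ply failure."""
    return (F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2
            + F66*t12**2 + 2*F12*s1*s2)

print(tsai_wu(800.0, 20.0, 30.0))      # ply stresses in MPa
```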

  12. Performance Prediction Modelling for Flexible Pavement on Low Volume Roads Using Multiple Linear Regression Analysis

    Directory of Open Access Journals (Sweden)

    C. Makendran

    2015-01-01

    Full Text Available Prediction models for low volume village roads in India are developed to evaluate the progression of different types of distress, such as roughness, cracking, and potholes. Even though the Government of India is investing a huge quantum of money in road construction every year, poor control over the quality of road construction and its subsequent maintenance is leading to faster road deterioration. In this regard, it is essential that scientific maintenance procedures be evolved on the basis of the performance of low volume flexible pavements. Considering the above, an attempt has been made in this research endeavor to develop prediction models to understand the progression of roughness, cracking, and potholes in flexible pavements exposed to little or no routine maintenance. Distress data were collected from low volume rural roads covering about 173 stretches spread across the state of Tamil Nadu in India. Based on the collected data, distress prediction models have been developed using multiple linear regression analysis. Further, the models have been validated using independent field data. It can be concluded that the models developed in this study can serve as useful tools for practicing engineers maintaining flexible pavements on low volume roads.
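
    A minimal version of such a distress model fitted by ordinary least squares might look as follows; the predictors and observations are invented placeholders, not the Tamil Nadu data.

```python
import numpy as np

# OLS fit of roughness progression against pavement age and traffic.
age = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)              # years
traffic = np.array([0.1, 0.3, 0.4, 0.7, 0.9, 1.2, 1.4, 1.7])       # M axles
roughness = np.array([2.1, 2.4, 2.6, 3.0, 3.3, 3.8, 4.0, 4.5])     # IRI, m/km

X = np.column_stack([np.ones_like(age), age, traffic])   # design matrix
beta, *_ = np.linalg.lstsq(X, roughness, rcond=None)     # coefficients
pred = X @ beta
r2 = 1 - np.sum((roughness - pred)**2) / np.sum((roughness - roughness.mean())**2)
print(f"IRI = {beta[0]:.2f} + {beta[1]:.2f}*age + {beta[2]:.2f}*traffic, R^2 = {r2:.3f}")
```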

  13. Multifidus Muscle Volume Estimation Based on Three-Dimensional Wavelet Multi-Resolution Analysis (MRA) with Buttocks Computer-Tomography (CT) Images

    OpenAIRE

    Kohei Arai

    2013-01-01

    A Multi-Resolution Analysis (MRA) based edge detection algorithm is proposed for estimating the volume of the multifidus muscle in Computer Tomography (CT) scanned images. From the point of view of processing complexity, the volume of the multifidus muscle would be a better measure for metabolic syndrome than internal fat. The proposed measure shows an R-squared of 0.178, which corresponds to the mutual correlation between internal fat and the volume of the multifidus muscle. It is also found that the R-squared betwe...

  14. Subsystem Hazard Analysis Methodology for the Ares I Upper Stage Source Controlled Items

    Science.gov (United States)

    Mitchell, Michael S.; Winner, David R.

    2010-01-01

    This article describes processes involved in developing subsystem hazard analyses for Source Controlled Items (SCI), specific components, sub-assemblies, and/or piece parts, of the NASA ARES I Upper Stage (US) project. SCIs will be designed, developed and /or procured by Boeing as an end item or an off-the-shelf item. Objectives include explaining the methodology, tools, stakeholders and products involved in development of these hazard analyses. Progress made and further challenges in identifying potential subsystem hazards are also provided in an effort to assist the System Safety community in understanding one part of the ARES I Upper Stage project.

  15. Methodology for the application of probabilistic safety analysis to the cobalt therapy units in Cuba

    International Nuclear Information System (INIS)

    The present work discusses the main elements to be kept in mind for the use of Probabilistic Safety Analysis in the evaluation of the safety of the cobalt therapy units of Cuba. It also presents, as part of the results of the first stage of the study, the methodological guide being used in an IAEA research contract currently carried out by the authors from the CNSN, together with other specialists of the Ministry of Public Health (MINSAP).

  16. A Tool and Methodology for AC-Stability Analysis of Continuous-Time Closed-Loop Systems

    CERN Document Server

    Milev, Momchil

    2011-01-01

    Presented are a methodology and a DFII-based tool for AC-stability analysis of a wide variety of closed-loop continuous-time circuits (operational amplifiers and other linear circuits). The methodology allows for easy identification and diagnostics of AC-stability problems, including not only main-loop effects but also local-instability loops in current mirrors, bias circuits and emitter or source followers, without breaking the loop. The results of the analysis are easy to interpret. Estimated phase margin is readily available. Instability nodes and loops, along with their respective oscillation frequencies, are immediately identified and mapped to the existing circuit nodes, thus offering significant advantages compared to traditional "black-box" methods of stability analysis (transient overshoot, Bode and phase margin plots, etc.). The tool for AC-stability analysis is written in SKILL and is fully integrated in the DFII environment. Its "push-button" graphical user interface (GUI) is easy to use and understand. The t...

  17. Development of Thermal-hydraulic Analysis Methodology for Multi-module Breeding Blankets in K-DEMO

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Geon-Woo; Lee, Jeong-Hun; Park, Goon-Cherl; Cho, Hyoung-Kyu [Seoul National University, Seoul (Korea, Republic of); Im, Kihak [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    In this paper, the purpose of the analyses is to extend the capability of MARS-KS to the entire blanket system, which includes a few hundred single blanket modules. The plan for the whole blanket system analysis using MARS-KS is then introduced, and the results of the multiple blanket module analysis are summarized. MARS-KS, a thermal-hydraulic analysis code for nuclear reactor safety, was applied to the thermal analysis for the conceptual design of the K-DEMO breeding blanket. A methodology to simulate multiple blanket modules was then proposed, which uses a supervisor program to handle each blanket module individually at first and then distribute the flow rate considering the pressure drops arising in each module. For a feasibility test of the proposed methodology, 10 outboard blankets in a toroidal field sector were simulated, connected with each other through common inlet and outlet headers. The calculated flow rates, pressure drops, and temperatures showed the validity of the calculation, and thanks to parallelization using MPI, an almost linear speed-up could be obtained.
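
    The header-coupled flow split the supervisor must reproduce can be sketched with a quadratic loss model per module; the loss coefficients below are invented, and in the actual methodology each module's pressure drop would come from a MARS-KS calculation, with the split found iteratively.

```python
import numpy as np

# Parallel modules share inlet/outlet headers, so they see the same
# pressure drop. With dp_i = k_i * m_i**2 per module, equal dp implies
# m_i proportional to 1/sqrt(k_i), scaled to the total flow.
k = np.array([2.0, 2.4, 1.8, 2.1, 2.6, 2.2, 1.9, 2.3, 2.0, 2.5])  # Pa/(kg/s)^2
m_total = 50.0                                                    # kg/s

inv_sqrt_k = 1.0 / np.sqrt(k)
m = m_total * inv_sqrt_k / inv_sqrt_k.sum()   # equal-dp flow split
dp = k * m ** 2                               # identical across modules
print(m.round(2), f"header dp ~ {dp[0]:.1f} Pa")
```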

  18. Methodology to decide optimum replacement term for components of nuclear power plants using decision analysis

    International Nuclear Information System (INIS)

    Most economic analyses for the replacement of major components of Nuclear Power Plants (NPPs) have been performed in deterministic ways. However, the analysis results are more or less affected by the uncertainties associated with the input variables. Therefore, it is desirable to use a probabilistic economic analysis method to properly consider the uncertainty of the real problem. In this paper, the probabilistic economic analysis method and the decision analysis technique are briefly described. The probabilistic economic analysis method using decision analysis provides an efficient and accurate way of performing economic analyses for the repair and/or replacement of major components of NPPs.

  19. Final safety analysis report for the Galileo Mission: Volume 1, Reference design document

    Energy Technology Data Exchange (ETDEWEB)

    1988-05-01

    The Galileo mission uses nuclear power sources called Radioisotope Thermoelectric Generators (RTGs) to provide the spacecraft's primary electrical power. Because these generators contain nuclear material, a Safety Analysis Report (SAR) is required. A preliminary SAR and an updated SAR were previously issued that provided an evolving status report on the safety analysis. As a result of the Challenger accident, the launch dates for both Galileo and Ulysses missions were later rescheduled for November 1989 and October 1990, respectively. The decision was made by agreement between the DOE and the NASA to have a revised safety evaluation and report (FSAR) prepared on the basis of these revised vehicle accidents and environments. The results of this latest revised safety evaluation are presented in this document (Galileo FSAR). Volume I, this document, provides the background design information required to understand the analyses presented in Volumes II and III. It contains descriptions of the RTGs, the Galileo spacecraft, the Space Shuttle, the Inertial Upper Stage (IUS), the trajectory and flight characteristics including flight contingency modes, and the launch site. There are two appendices in Volume I which provide detailed material properties for the RTG.

  20. Fast implementation of kernel simplex volume analysis based on modified Cholesky factorization for endmember extraction

    Institute of Scientific and Technical Information of China (English)

    Jing LI; Xiao-run LI; Li-jiao WANG; Liao-ying ZHAO

    2016-01-01

    Endmember extraction is a key step in the hyperspectral image analysis process. The kernel new simplex growing algorithm (KNSGA), recently developed as a nonlinear alternative to the simplex growing algorithm (SGA), has proven a promising endmember extraction technique. However, KNSGA still suffers from two issues limiting its application. First, its random initialization leads to inconsistency in final results; second, excessive computation is caused by the iterations of a simplex volume calculation. To solve the first issue, the spatial pixel purity index (SPPI) method is used in this study to extract the first endmember, eliminating the initialization dependence. A novel approach tackles the second issue by initially using a modified Cholesky factorization to decompose the volume matrix into triangular matrices, in order to avoid directly computing the determinant tautologically in the simplex volume formula. Theoretical analysis and experiments on both simulated and real spectral data demonstrate that the proposed algorithm significantly reduces computational complexity, and runs faster than the original algorithm.
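
    The determinant-avoiding volume step can be sketched as follows: the volume of a simplex is sqrt(det(G))/p! for the Gram matrix G of its edge vectors, and a Cholesky factor delivers sqrt(det(G)) as a product of diagonal entries. This is a plain Cholesky illustration; the paper's modified factorization additionally guards against the near-singular cases that arise as the simplex grows.

```python
import numpy as np
from math import factorial

def simplex_volume(vertices):
    """vertices: (p+1, d) array with p <= d; returns the p-volume."""
    E = vertices[1:] - vertices[0]          # (p, d) edge vectors
    G = E @ E.T                             # p x p Gram matrix
    L = np.linalg.cholesky(G)               # raises LinAlgError if degenerate
    # det(G) = prod(diag(L))**2, so sqrt(det(G)) = prod(diag(L))
    return float(np.prod(np.diag(L))) / factorial(E.shape[0])

verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
print(simplex_volume(verts))                # 1/6 for the unit tetrahedron
```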

  1. Economic analysis of a volume reduction/polyethylene solidification system for low-level radioactive wastes

    International Nuclear Information System (INIS)

    A study was conducted at Brookhaven National Laboratory to determine the economic feasibility of a fluidized bed volume reduction/polyethylene solidification system for low-level radioactive wastes. These results are compared with the ''null'' alternative of no volume reduction and solidification of aqueous waste streams in hydraulic cement. The economic analysis employed a levelized revenue requirement (LRR) technique conducted over a ten year period. An interactive computer program was written to conduct the LRR calculations. Both of the treatment/solidification options were considered for a number of scenarios including type of plant (BWR or PWR) and transportation distance to the disposal site. If current trends in the escalation rates of cost components continue, the volume reduction/polyethylene solidification option will be cost effective for both BWRs and PWRs. Data indicate that a minimum net annual savings of $0.8 million per year (for a PWR shipping its waste 750 miles) and a maximum net annual savings of $9 million per year (for a BWR shipping its waste 2500 miles) can be achieved. A sensitivity analysis was performed for the burial cost escalation rate, which indicated that variation of this factor will impact the total levelized revenue requirement. The burial cost escalation rate which yields a break-even condition was determined for each scenario considered. 11 refs., 8 figs., 39 tabs
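
    The levelization arithmetic behind an LRR comparison can be sketched as follows; the cash flows, discount rate, and escalation rate are invented placeholders, not the study's data.

```python
# Annual revenue requirements are discounted to present worth, then spread
# as a constant annuity over the study period via the capital recovery factor.
def levelized(annual_costs, rate):
    """Levelized annual cost of a stream of year-end cash flows."""
    pw = sum(c / (1.0 + rate) ** (t + 1) for t, c in enumerate(annual_costs))
    n = len(annual_costs)
    crf = rate * (1.0 + rate) ** n / ((1.0 + rate) ** n - 1.0)
    return pw * crf

vr_costs = [4.2e6] * 10                                # volume reduction option
null_costs = [3.0e6 * 1.12 ** t for t in range(10)]    # escalating burial costs
rate = 0.10
print(f"VR option:   {levelized(vr_costs, rate) / 1e6:.2f} M$/yr")
print(f"null option: {levelized(null_costs, rate) / 1e6:.2f} M$/yr")
```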

  2. Conceptual design and systems analysis of photovoltaic power systems. Final report. Volume III(2). Technology

    Energy Technology Data Exchange (ETDEWEB)

    Pittman, P.F.

    1977-05-01

    Conceptual designs were made and analyses were performed on three types of solar photovoltaic power systems. Included were Residential (1 to 10 kW), Intermediate (0.1 to 10 MW), and Central (50 to 1000 MW) Power Systems to be installed in the 1985 to 2000 time period. The following analyses and simulations are covered: residential power system computer simulations, intermediate power systems computer simulation, central power systems computer simulation, array comparative performance, utility economic and margin analyses, and financial analysis methodology.

  3. Breast cancer statistics and prediction methodology: a systematic review and analysis.

    Science.gov (United States)

    Dubey, Ashutosh Kumar; Gupta, Umesh; Jain, Sonal

    2015-01-01

    Breast cancer is a menacing cancer, primarily affecting women. Continuous research is going on for detecting breast cancer in the early stage, as the possibility of cure in early stages is bright. There are two main objectives of this study: first, to establish statistics for breast cancer, and second, to find methodologies which can be helpful in early stage detection of breast cancer based on previous studies. The breast cancer statistics for incidence and mortality of the UK, US, India and Egypt were considered for this study. The findings of this study show that the overall mortality rates of the UK and US have improved because of awareness, improved medical technology and screening; in the case of India and Egypt, however, the situation is less positive because of a lack of awareness. The methodological findings of this study suggest a combined framework based on data mining and evolutionary algorithms, which provides a strong bridge towards improving the classification and detection accuracy of breast cancer data. PMID:26028079

  4. Systems Studies Department FY 78 activity report. Volume 2. Systems analysis. [Sandia Laboratories, Livermore

    Energy Technology Data Exchange (ETDEWEB)

    Gold, T.S.

    1979-02-01

    The Systems Studies Department at Sandia Laboratories Livermore (SLL) has two primary responsibilities: to provide computational and mathematical services and to perform systems analysis studies. This document (Volume 2) describes the FY 78 Systems Analysis highlights. The description is an unclassified overview of activities and is not complete or exhaustive. The objective of the systems analysis activities is to evaluate the relative value of alternative concepts and systems. SLL systems analysis activities reflect Sandia Laboratory programs and in 1978 consisted of study efforts in three areas: national security: evaluations of strategic, theater, and navy nuclear weapons issues; energy technology: particularly in support of Sandia's solar thermal programs; and nuclear fuel cycle physical security: a special project conducted for the Nuclear Regulatory Commission. Highlights of these activities are described in the following sections. 7 figures. (RWR)

  5. Evaluation of Cylinder Volume Estimation Methods for In–Cylinder Pressure Trace Analysis

    Directory of Open Access Journals (Sweden)

    Adrian Irimescu

    2012-09-01

    In-cylinder pressure trace analysis is an important investigation tool frequently employed in the study of internal combustion engines. While technical data are usually available for experimental engines, in some cases measurements are performed on automotive engines for which only the most basic geometry features are available. Several authors have therefore aimed to determine the cylinder volume and connecting rod length by methods other than direct measurement. This study evaluates two such methods. Estimating the connecting rod length from the general engine category was found to be the more appropriate approach, as opposed to using an equation that predicts cylinder volume with good accuracy around top dead centre for most geometries.
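
    For context, estimates of this kind feed into the standard slider-crank cylinder volume relation; a minimal sketch follows, with illustrative bore, stroke, rod length, and compression ratio values that are not from the paper:

        import numpy as np

        def cylinder_volume(theta, bore, stroke, rod, cr):
            """Instantaneous cylinder volume [m^3] at crank angle theta [rad],
            from the standard slider-crank relation."""
            a = stroke / 2.0                  # crank radius
            area = np.pi * bore**2 / 4.0      # piston crown area
            v_disp = area * stroke            # displaced volume
            v_clear = v_disp / (cr - 1.0)     # clearance volume from compression ratio
            # piston displacement from top dead centre
            x = a + rod - (a * np.cos(theta) + np.sqrt(rod**2 - (a * np.sin(theta))**2))
            return v_clear + area * x

        theta = np.radians(np.arange(0, 361, 30))
        V = cylinder_volume(theta, bore=0.081, stroke=0.089, rod=0.144, cr=10.5)
        print(V.min(), V.max())   # clearance volume at TDC, maximum near BDC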

  6. Power-line-induced ac potential on natural-gas pipelines for complex rights-of-way configurations. Volume 2. Graphical analysis handbook. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Frazier, M.

    1983-05-01

    Joint use of common corridors for overhead electric-power-transmission lines and buried natural-gas-transmission pipelines results in undesired coupling of electromagnetic energy onto the natural-gas-transmission pipelines. The project has resulted in the development of the methodology and techniques for analyzing such complex common corridor coupling problems. Field tests were conducted to verify key aspects of the analysis. Two complementary methods have been developed for solving coupling problems on common corridors: a handbook that provides simplified procedures and graphical aids that can be used to analyze many less complex common corridor concerns, and a computer program that provides the means for analyzing a wide range of more complex configurations. This volume presents the simplified graphical analysis.

  7. Thermal Hydraulics Design and Analysis Methodology for a Solid-Core Nuclear Thermal Rocket Engine Thrust Chamber

    Science.gov (United States)

    Wang, Ten-See; Canabal, Francisco; Chen, Yen-Sen; Cheng, Gary; Ito, Yasushi

    2013-01-01

    Nuclear thermal propulsion is a leading candidate for in-space propulsion for human Mars missions. This chapter describes a thermal hydraulics design and analysis methodology developed at the NASA Marshall Space Flight Center in support of the nuclear thermal propulsion development effort. The objective of this campaign is to bridge the design methods of the Rover/NERVA era with a modern computational fluid dynamics and heat transfer methodology, in order to predict the thermal, fluid, and hydrogen environments of a hypothetical solid-core nuclear thermal engine, the Small Engine, designed in the 1960s. The computational methodology is based on an unstructured-grid, pressure-based, all-speed, chemically reacting computational fluid dynamics and heat transfer platform, with formulations of flow and heat transfer through porous and solid media implemented to describe the hydrogen flow channels inside the solid core. Design analyses of a single flow element and of the entire solid-core thrust chamber of the Small Engine were performed, and the results are presented herein.
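
    The abstract does not state the porous-media formulation used; one common choice for modeling flow resistance in such channels, given here only as a plausible background sketch rather than the authors' stated equations, is a Darcy-Forchheimer momentum sink:

        % Darcy-Forchheimer momentum sink per unit volume (background sketch,
        % not the paper's stated formulation): K is permeability, C_F the
        % inertial coefficient, u_i the superficial velocity components.
        S_i = -\left( \frac{\mu}{K}\, u_i + \frac{C_F\,\rho}{\sqrt{K}}\,\lvert\mathbf{u}\rvert\, u_i \right)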

  8. Methodology to carry out a sensitivity and uncertainty analysis for cross sections using a coupled model Trace-Parcs

    International Nuclear Information System (INIS)

    A methodology was implemented to carry out a sensitivity and uncertainty analysis of the cross sections used in a coupled Trace/Parcs model for a control-rod-drop transient in a BWR-5. A model of the reactor core was used for the neutronic code Parcs, in which the assemblies located in the core are described. The thermal-hydraulic model in Trace was kept simple: a single Chan component was designed to represent all the core assemblies, placed inside a single vessel with appropriate boundary conditions. The thermal-hydraulic part was coupled with the neutronic part, first for the steady state, and then the control-rod-drop transient was run for the sensitivity and uncertainty analysis. To analyze the cross sections used in the coupled model during the transient, probability density functions were generated for 22 parameters selected from the neutronic parameters used by Parcs, yielding 100 different cases of the coupled Trace/Parcs model, each with a different cross-section database. All these cases were executed with the coupled model, producing 100 output files for the control-rod-drop transient, with emphasis on the nominal power, for which an uncertainty analysis was performed and the corresponding uncertainty band generated. This analysis shows the range of the selected responses as the chosen uncertainty parameters vary. The sensitivity analysis complements the uncertainty analysis by identifying the parameter or parameters with the greatest influence on the results, so that attention can focus on those parameters and their effects can be better understood. Since the model was not built from actual operating data, the value of this work lies in demonstrating how the sensitivity and uncertainty methodology is applied. (Author)
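
    A minimal sketch of the sampling-and-banding step described above, assuming independent normal probability density functions; run_coupled_model is a hypothetical stand-in for a Trace/Parcs execution, and all distribution parameters are invented:

        import numpy as np

        rng = np.random.default_rng(42)
        N_CASES, N_PARAMS = 100, 22   # as in the study: 100 runs, 22 parameters

        # Placeholder PDFs: relative perturbation factors for each cross-section
        # parameter, here independent normals with an assumed 3% standard deviation.
        factors = rng.normal(loc=1.0, scale=0.03, size=(N_CASES, N_PARAMS))

        def run_coupled_model(perturbation):
            """Hypothetical stand-in for one coupled transient run; returns a
            power-vs-time trace (here a toy decaying transient)."""
            t = np.linspace(0.0, 10.0, 200)
            return perturbation.mean() * (1.0 + 0.5 * np.exp(-t)) * 100.0

        powers = np.array([run_coupled_model(f) for f in factors])   # (100, 200)

        # Uncertainty band on nominal power, e.g. a 5th-95th percentile envelope.
        lower, upper = np.percentile(powers, [5, 95], axis=0)
        print("band width at t=0:", upper[0] - lower[0])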

  9. Development and applications of a methodology for the analysis of significant human related event trends in nuclear power plants

    International Nuclear Information System (INIS)

    A methodology is developed to identify and flag significant trends related to the safety and availability of commercial nuclear power plants, with the aim of reducing the likelihood of human error. To ensure that the methodology can easily be adapted to various classification schemes for operating data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme was selected. Significance criteria were developed for human-initiated events affecting the systems and for events caused by human deficiencies. Clustering analysis was used to verify the learning trend in multidimensional histograms, and a computer code based on the K-means algorithm was developed and applied to find the learning period, in which error rates decrease monotonically with plant age. Freeman-Tukey (F-T) deviates are used to select generic problems, identified by a large positive deviate (here, above approximately 2.0). The identified generic problems are decision errors, which are highly associated with reactor startup operations in the learning period of PWR plants, and response errors, which are highly associated with Secondary Non-Nuclear Systems (SNS) in PWRs; these correspond to inconsistencies in the pattern of the associated data. The program P3F for the analysis of multiway frequency tables in Biomedical Computer Programs may provide incorrect estimates of the expected frequencies if structural zeros, where a frequency is constrained to be zero, occur. To obtain reasonable results, structural-zero entries in multiway tables should therefore be eliminated, either by redefining the multiway tables or by using a more general computer program for multiway tables. (author)
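
    The Freeman-Tukey deviate used to flag generic problems has a standard closed form; a minimal sketch follows, with an invented observed/expected table for illustration:

        import numpy as np

        def freeman_tukey_deviate(observed, expected):
            """Standard Freeman-Tukey deviate for count data: approximately
            standard normal under the null, so large positive values flag
            cells with far more events than expected."""
            o = np.asarray(observed, dtype=float)
            e = np.asarray(expected, dtype=float)
            return np.sqrt(o) + np.sqrt(o + 1.0) - np.sqrt(4.0 * e + 1.0)

        # Invented example: event counts by error type vs. expected frequencies.
        obs = np.array([30, 12, 4, 9])
        exp = np.array([18.0, 11.5, 6.0, 8.5])
        dev = freeman_tukey_deviate(obs, exp)
        flagged = dev > 2.0    # the report's threshold for "generic problems"
        print(dev.round(2), flagged)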

  10. PRESSURE-VOLUME ANALYSIS OF THE LUNG WITH AN EXPONENTIAL AND LINEAR-EXPONENTIAL MODEL IN ASTHMA AND COPD

    NARCIS (Netherlands)

    BOGAARD, JM; OVERBEEK, SE; VERBRAAK, AFM; VONS, C; FOLGERING, HTM; VANDERMARK, TW; ROOS, CM; STERK, PJ

    1995-01-01

    The prevalence of abnormalities in lung elasticity in patients with asthma or chronic obstructive pulmonary disease (COPD) is still unclear. This might be due to uncertainties concerning the method of analysis of quasistatic deflation lung pressure-volume curves. Pressure-volume curves were obtained
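
    The record is truncated and does not give the authors' exact parameterization; the exponential deflation model referred to in the title is conventionally written in the Salazar-Knowles form, shown here only as background:

        % Exponential deflation P-V model (Salazar-Knowles form; assumed
        % background, not quoted from the truncated record):
        V(P) = V_{\max} - A\, e^{-K P}
        % V_max: extrapolated volume at infinite distending pressure;
        % A: volume span constant; K: shape factor describing elasticity.
        % The linear-exponential variant in the title extends this basic form.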

  11. Combined cycle solar central receiver hybrid power system study. Volume III. Appendices. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    None

    1979-11-01

    A design study for a 100 MW gas turbine/steam turbine combined cycle solar/fossil-fuel hybrid power plant is presented. This volume contains the appendices: (a) preconceptual design data; (b) market potential analysis methodology; (c) parametric analysis methodology; (d) EPGS systems description; (e) commercial-scale solar hybrid power system assessment; and (f) conceptual design data lists. (WHK)

  12. Development of chiral methodologies by capillary electrophoresis with ultraviolet and mass spectrometry detection for duloxetine analysis in pharmaceutical formulations.

    Science.gov (United States)

    Sánchez-López, Elena; Montealegre, Cristina; Marina, María Luisa; Crego, Antonio L

    2014-10-10

    Two chiral methodologies were developed by capillary electrophoresis (CE) with UV and mass spectrometry (MS) detection to ensure the quality control of the drug duloxetine, commercialized as a pure enantiomer. Both methods were optimized to achieve a high baseline enantioresolution (Rs > 2) and acceptable precision in terms of RSD values. The developed methods were validated and applied for the first time to the analysis of four pharmaceutical formulations. The content of R-duloxetine in all these samples was below the detection limit, and the amount of S-duloxetine was in good agreement with the labeled content; the results obtained by the two methods did not differ significantly (p-values > 0.05).

  13. Portable laserscan for in-ditch dent profiling and strain analysis: methodology and application development

    Energy Technology Data Exchange (ETDEWEB)

    Arumugam, Udayasankar; Tandon, Samarth; Gao, Ming; Krishnamurthy, Ravi [Blade Energy Partners, Houston, Texas (United States); Hanson, Ben; Rehman, Hamood; Fingerhut, Martin [Applus RTD, Houston, Texas (United States)

    2010-07-01

    Mechanical damage to pipelines has been assessed with two methodologies. The traditional one is a depth-based assessment, whose limitations can result in an underestimation of dent severity. In recent years, therefore, operators have preferred to complement this method with strain-based criteria. Data from an ILI caliper tool can be used to calculate strain and thereby determine dent severity. After every ILI run, verification of inspection performance is necessary, but this has been a challenge for the industry, in part because of the lack of a unified protocol. According to a recent study, LaserScan 3D technology provides an accurate profile and is an ideal tool for verification of ILI performance. This paper introduces a portable LaserScan 3D mapping technology for measuring dents, alone or associated with other anomalies, and discusses the accuracy and resolution of this technology and its appropriateness for pipelines.

  14. Study of a dosimetric methodology for plutonium by means of radiotoxicological analysis in urine

    International Nuclear Information System (INIS)

    The present study is mainly concerned with an internal individual monitoring program for workers dealing with 239Pu, based on measuring the 239Pu content in their urine. General aspects of plutonium radiotoxicity and its chemical, physical, and metabolic properties are discussed. The methodology chosen for the 239Pu analyses in urine is based on wet ashing of the urine sample, followed by plutonium separation by precipitation with lanthanum nitrate and extraction with thenoyltrifluoroacetone. After separation, the sample is electrodeposited and the activity measured by alpha spectrometry. The results were analyzed taking into account the counting efficiency of 23.72%, the chemical recovery of 85.3%, and the lower limit of detection of 1.1 x 10^-3 Bq. Finally, the bases for establishing reference levels for urinary excretion are discussed, considering the maximum permissible body burden (MPBB) and the annual limit of intake (ALI). (author)
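
    A minimal sketch of how those calibration figures convert a net alpha count to activity; only the efficiency, recovery, and detection limit come from the abstract, while the count and counting time below are invented:

        # Convert an alpha-spectrometry net count to 239Pu activity,
        # using the abstract's calibration factors.
        EFFICIENCY = 0.2372   # counting efficiency (23.72%)
        RECOVERY = 0.853      # chemical recovery (85.3%)
        LLD_BQ = 1.1e-3       # lower limit of detection [Bq]

        def activity_bq(net_counts, count_time_s):
            """Activity [Bq] = net counts / (time * efficiency * recovery)."""
            return net_counts / (count_time_s * EFFICIENCY * RECOVERY)

        a = activity_bq(net_counts=35, count_time_s=60_000)   # ~17 h count, invented
        print(f"{a:.2e} Bq", "above LLD" if a > LLD_BQ else "below LLD")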

  15. A review on nuclear forensic methodology for analysis of nuclear material of unknown origin

    International Nuclear Information System (INIS)

    With the growing use of nuclear power and the threat of illicit nuclear smuggling, nuclear forensics aids law enforcement in tracing the modus operandi behind such threats. Extensive nuclear proliferation, the race among countries to acquire nuclear capability, and the global terrorism scenario have made Nuclear Forensic Science an essential technology for tackling nuclear threats. Gamma spectrometry, alpha spectrometry, thermal ionization mass spectrometry, and inductively coupled plasma mass spectrometry are employed for the characterization and relative isotopic composition determination of nuclear material, while techniques such as SEM, TEM, FT-IR, GC-MS, and electrophoresis are used to characterize contaminated materials that may be introduced to deceive investigative agencies. The present paper provides a systematic forensic methodology for nuclear and radioactive materials encountered at a crime scene or arising from accidental discharges or military activities. (author)

  16. A methodology for spacecraft technology insertion analysis balancing benefit, cost, and risk

    Science.gov (United States)

    Bearden, David Allen

    Emerging technologies are changing the way space missions are developed and implemented. Technology development programs are proceeding with the goal of enhancing spacecraft performance and reducing mass and cost. However, it is often the case that technology insertion assessment activities, in the interest of maximizing performance and/or mass reduction, do not consider synergistic system-level effects. Furthermore, even though technical risks are often identified as a large cost and schedule driver, many design processes ignore the effects of cost and schedule uncertainty. This research is based on the hypothesis that technology selection is a problem of balancing interrelated (and potentially competing) objectives. Current spacecraft technology selection approaches are summarized, and a Methodology for Evaluating and Ranking Insertion of Technology (MERIT) that expands on these practices to attack otherwise unsolved problems is demonstrated. MERIT combines the modern techniques of technology maturity measures, parametric models, genetic algorithms, and risk assessment (cost and schedule) in a unique manner to resolve very difficult issues, including user-generated uncertainty, relationships between cost/schedule and complexity, and technology "portfolio" management. While the methodology is sufficiently generic that it may in theory be applied to a number of technology insertion problems, this research focuses on application to the specific case of small spacecraft, on the cost and schedule risks associated with advanced technology, and on the application of heuristics to facilitate informed system-level technology utilization decisions earlier in the conceptual design phase. MERIT extends the state of the art in technology insertion assessment and selection practice and, if adopted, may aid designers in determining the configuration of complex systems that meet essential requirements in a timely, cost-effective manner.
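
    MERIT's actual models are not given in the abstract; purely as an illustration of the genetic-algorithm-with-risk-penalty idea it describes, the following toy technology-portfolio selection uses invented benefit, cost, and risk numbers and an assumed fitness weighting:

        import numpy as np

        rng = np.random.default_rng(1)

        # Invented candidate technologies: benefit score, cost, schedule risk.
        benefit = np.array([8.0, 5.0, 6.5, 3.0, 7.0, 4.0])
        cost    = np.array([4.0, 2.0, 5.0, 1.0, 6.0, 2.5])
        risk    = np.array([0.3, 0.1, 0.4, 0.05, 0.5, 0.2])
        BUDGET = 10.0

        def fitness(mask):
            """Total benefit scaled by joint success probability, with
            over-budget portfolios strongly penalized (illustrative weighting)."""
            if cost[mask].sum() > BUDGET:
                return -1.0
            return benefit[mask].sum() * (1.0 - risk[mask]).prod()

        pop = rng.integers(0, 2, size=(40, 6)).astype(bool)
        for _ in range(60):
            fit = np.array([fitness(m) for m in pop])
            parents = pop[np.argsort(fit)[-12:]]                  # keep the fittest
            pairs = parents[rng.integers(0, 12, size=(40, 2))]
            cross = rng.random((40, 6)) < 0.5                     # uniform crossover
            pop = np.where(cross, pairs[:, 0], pairs[:, 1])
            pop ^= rng.random((40, 6)) < 0.03                     # bit-flip mutation

        best = max(pop, key=fitness)
        print("selected technologies:", np.flatnonzero(best), "fitness:", fitness(best))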

  17. Factor Analysis in Assessing the Research Methodology Quality of Systematic Reviews

    Directory of Open Access Journals (Sweden)

    Andrada Elena URDA-CÎMPEAN

    2011-12-01

    Introduction: Many high-quality systematic reviews available from medical journals, databases and other electronic sources differ in quality and provide different answers to the same question. The literature recommends a checklist-type approach, which overcomes many of the problems associated with measurement. Aim: This study proposes to identify, in a checklist-type approach, the factors most commonly used (from a methodological point of view) in assessing the quality of systematic reviews, and thereby to mirror the actual state of medical writing. We analyze the factors' occurrence and/or their development in the text and in the abstract of systematic reviews published in 2011. Methods: The study randomly selected free full-text systematic reviews published in 2011, found in PubMed and in the Cochrane Database. The most commonly used factors were identified from the PRISMA statement and quality measurement tools. Results: The evaluated systematic reviews mentioned or developed several of the factors studied. Only 78% of the papers surveyed used the correct IMRAD format, and 59% of them mentioned the sample size used. The correspondence between the content of a paper and its abstract amounted to 54.63% and 51.85% for the two sets of factors, which can lead to a poor appreciation of an article when only the abstract is read. Conclusions: Researchers do not properly take into consideration scientific articles and the assessment tools used for quality evaluation. They should place more value on the methodological factors that help assess systematic review quality, while journals are the only party who can enforce quality standards in medical writing.

  18. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    International Nuclear Information System (INIS)

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project, which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage, after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak, sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  19. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  20. Durability of recycled aggregate concrete designed with the Equivalent Mortar Volume (EMV) method: Validation under the Spanish context and its adaptation to Bolomey methodology

    Directory of Open Access Journals (Sweden)

    Jiménez, C.

    2014-03-01

    Some durability properties are analyzed in concretes produced with a novel proportioning method for recycled aggregate concrete (RAC), in order to validate the method in the Spanish context. Two types of concrete mixes were prepared: one following the guidelines of the named method, and another based on an adaptation of the method to the Bolomey methodology. Two types of recycled concrete aggregate (RCA) were used, with RCA replacement of natural aggregate (NA) ranging from 20% to 100%; the 20% level was chosen to comply with Spanish recommendations. The properties studied were water penetration under pressure, water absorption, and susceptibility to chloride attack. It is verified that the new method and the developed adaptation result in concrete mixes with properties better than or similar to those of natural aggregate concrete (NAC) and conventional RAC, while saving substantial amounts of cement.