WorldWideScience

Sample records for analysis methodology volume

  1. Diversion Path Analysis Handbook. Volume 1. Methodology

    International Nuclear Information System (INIS)

    Goodwin, K.E.; Schleter, J.C.; Maltese, M.D.K.

    1978-11-01

    Diversion Path Analysis (DPA) is a safeguards evaluation tool used to determine the vulnerability of the Material Control and Material Accounting (MC and MA) subsystems to the threat of theft of Special Nuclear Material (SNM) by a knowledgeable insider. The DPA team should consist of two individuals with technical backgrounds. The implementation of DPA is divided into five basic steps: Information and Data Gathering, Process Characterization, Analysis of Diversion Paths, Results and Findings, and Documentation.

  2. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  3. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  4. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    International Nuclear Information System (INIS)

    Temple, S.M.; Robbins, T.R.

    1986-09-01

    This report provides an overview of the basis and methodology requirements for determining setpoints related to Pressurized Water Reactor (PWR) technical specifications, and focuses on development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970s to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. Descriptions of the typical setpoint methodologies are provided for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse. The description of the methods of analysis includes discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties, based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR

  5. A general methodology for three-dimensional analysis of variation in target volume delineation

    NARCIS (Netherlands)

    Remeijer, P.; Rasch, C.; Lebesque, J. V.; van Herk, M.

    1999-01-01

    A generic method for three-dimensional (3-D) evaluation of target volume delineation in multiple imaging modalities is presented. The evaluation includes geometrical and statistical methods to estimate observer differences and variability in defining the Gross Tumor Volume (GTV) in relation to the

  6. A normative price for a manufactured product: The SAMICS methodology. Volume 2: Analysis

    Science.gov (United States)

    Chamberlain, R. G.

    1979-01-01

    The Solar Array Manufacturing Industry Costing Standards provide standard formats, data, assumptions, and procedures for determining the price a hypothetical solar array manufacturer would have to be able to obtain in the market to realize a specified after-tax rate of return on equity for a specified level of production. The methodology and its theoretical background are presented. The model is sufficiently general to be used in any production-line manufacturing environment. Implementation of this methodology by the Solar Array Manufacturing Industry Simulation computer program is discussed.
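
    The core of this kind of normative-price calculation can be illustrated with a simple single-period model: find the price at which after-tax profit yields the required return on equity. The sketch below is a minimal illustration under assumed inputs; the one-line profit model and all figures are our hypothetical simplification, not the actual SAMICS formulation, which handles multi-year cash flows, depreciation, and inflation.

    ```python
    # Minimal normative-price sketch: solve (p*Q - C) * (1 - t) = r * E for p.
    # All numbers and the single-period model are illustrative assumptions,
    # not the actual SAMICS methodology.

    def normative_price(annual_cost, equity, required_roe, tax_rate, annual_output):
        """Unit price at which after-tax profit equals the required return on equity."""
        return (annual_cost + required_roe * equity / (1.0 - tax_rate)) / annual_output

    # Hypothetical manufacturer: $2.0M annual cost, $5.0M equity, 15% required
    # after-tax ROE, 40% tax rate, 500 kW of arrays produced per year.
    print(normative_price(2.0e6, 5.0e6, 0.15, 0.40, 5.0e5))  # ~6.5 $/W
    ```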

  7. Transport of solid commodities via freight pipeline: demand analysis methodology. Volume IV. First year final report

    Energy Technology Data Exchange (ETDEWEB)

    Allen, W.B.; Plaut, T.

    1976-07-01

    In order to determine the feasibility of intercity freight pipelines, it was necessary to determine whether sufficient traffic flows currently exist between various origins and destinations to justify consideration of a mode whose operating characteristics became competitive under conditions of high-traffic volume. An intercity origin/destination freight-flow matrix was developed for a large range of commodities from published sources. A high-freight traffic-density corridor between Chicago and New York and another between St. Louis and New York were studied. These corridors, which represented 18 cities, had single-direction flows of 16 million tons/year. If trans-shipment were allowed at each of the 18 cities, flows of up to 38 million tons/year were found in each direction. These figures did not include mineral or agricultural products. After determining that such pipeline-eligible freight-traffic volumes existed, the next step was to determine the ability of freight pipeline to penetrate such markets. Modal-split models were run on aggregate data from the 1967 Census of Transportation. Modal-split models were also run on disaggregate data specially collected for this study. The freight pipeline service characteristics were then substituted into both the aggregate and disaggregate models (truck vs. pipeline and then rail vs. pipeline) and estimates of pipeline penetration into particular STCC commodity groups were made. Based on these very preliminary results, it appears that freight pipeline has market penetration potential that is consistent with high-volume participation in the intercity freight market.
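
    A modal-split model of the kind described estimates each mode's share of a flow from its service characteristics, and pipeline penetration is estimated by substituting pipeline attributes into the fitted model. The sketch below is a minimal binary logit illustration; the coefficients and service values are invented placeholders, not the study's calibrated truck/pipeline or rail/pipeline models.

    ```python
    import math

    # Minimal binary logit modal-split sketch (truck vs. pipeline).
    # Coefficients and service values are hypothetical, for illustration only.

    def pipeline_share(cost_pipe, days_pipe, cost_truck, days_truck,
                       b_cost=-0.04, b_time=-0.30):
        """Probability a shipment moves by pipeline under a binary logit model."""
        u_pipe = b_cost * cost_pipe + b_time * days_pipe
        u_truck = b_cost * cost_truck + b_time * days_truck
        return math.exp(u_pipe) / (math.exp(u_pipe) + math.exp(u_truck))

    # Substitute assumed pipeline service characteristics into the fitted model:
    print(pipeline_share(cost_pipe=12.0, days_pipe=4.0,     # $/ton, transit days
                         cost_truck=20.0, days_truck=2.0))  # ~0.43 share
    ```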

  8. Socioeconomic effects of the DOE Gas Centrifuge Enrichment Plant. Volume 1: methodology and analysis

    International Nuclear Information System (INIS)

    1979-01-01

    The socioeconomic effects of the Gas Centrifuge Enrichment Plant being built in Portsmouth, Ohio were studied. Chapters are devoted to labor force, housing, population changes, economic impact, method for analysis of services, analysis of service impacts, schools, and local government finance

  9. CONTAMINATED SOIL VOLUME ESTIMATE TRACKING METHODOLOGY

    International Nuclear Information System (INIS)

    Durham, L.A.; Johnson, R.L.; Rieman, C.; Kenna, T.; Pilon, R.

    2003-01-01

    The U.S. Army Corps of Engineers (USACE) is conducting a cleanup of radiologically contaminated properties under the Formerly Utilized Sites Remedial Action Program (FUSRAP). The largest cost element for most of the FUSRAP sites is the transportation and disposal of contaminated soil. Project managers and engineers need an estimate of the volume of contaminated soil to determine project costs and schedule. Once excavation activities begin and additional remedial action data are collected, the actual quantity of contaminated soil often deviates from the original estimate, resulting in cost and schedule impacts to the project. The project costs and schedule need to be frequently updated by tracking the actual quantities of excavated soil and contaminated soil remaining during the life of a remedial action project. A soil volume estimate tracking methodology was developed to provide a mechanism for project managers and engineers to create better project controls of costs and schedule. For the FUSRAP Linde site, an estimate of the initial volume of in situ soil above the specified cleanup guidelines was calculated on the basis of discrete soil sample data and other relevant data using indicator geostatistical techniques combined with Bayesian analysis. During the remedial action, updated volume estimates of remaining in situ soils requiring excavation were calculated on a periodic basis. In addition to taking into account the volume of soil that had been excavated, the updated volume estimates incorporated both new gamma walkover surveys and discrete sample data collected as part of the remedial action. A civil survey company provided periodic estimates of actual in situ excavated soil volumes. By using the results from the civil survey of actual in situ volumes excavated and the updated estimate of the remaining volume of contaminated soil requiring excavation, the USACE Buffalo District was able to forecast and update project costs and schedule. The soil volume
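
    The tracking logic described here is, at its core, periodic bookkeeping: start from the geostatistical estimate, subtract the surveyed excavated volume, add volume newly identified by walkover surveys and discrete samples, and re-forecast cost from the remainder. A minimal sketch under assumed numbers follows; the function names, quantities, and unit cost are hypothetical illustrations, not Linde site values.

    ```python
    # Minimal sketch of periodic soil-volume and cost re-forecasting.
    # All quantities are hypothetical illustrations, not FUSRAP/Linde data.

    def update_remaining_volume(prior_estimate_cy, excavated_cy, newly_identified_cy):
        """Remaining in situ contaminated soil after an update cycle (cubic yards)."""
        return prior_estimate_cy - excavated_cy + newly_identified_cy

    def forecast_disposal_cost(remaining_cy, unit_cost_per_cy=350.0):
        """Re-forecast transportation and disposal cost from the remaining volume."""
        return remaining_cy * unit_cost_per_cy

    remaining = update_remaining_volume(prior_estimate_cy=40_000,
                                        excavated_cy=12_500,
                                        newly_identified_cy=1_800)
    print(remaining, forecast_disposal_cost(remaining))
    ```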

  10. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University, sponsored by the U.S. Department of Energy under Grant Number DE-FG05-95ER30250. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.
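
    To give a flavor of the bootstrap principle the report builds on, the sketch below computes a percentile confidence interval for a mean by resampling. It is a generic illustration under an invented data sample, not the report's two-dimensional variability/uncertainty methodology.

    ```python
    import random
    import statistics

    # Generic percentile-bootstrap sketch; illustrative only.

    def bootstrap_ci(data, stat=statistics.mean, n_boot=5000, alpha=0.05, seed=0):
        """Percentile bootstrap confidence interval for a sample statistic."""
        rng = random.Random(seed)
        reps = sorted(stat(rng.choices(data, k=len(data))) for _ in range(n_boot))
        return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

    emissions = [3.1, 2.7, 3.4, 2.9, 3.8, 3.0, 2.6, 3.3]  # hypothetical sample
    print(bootstrap_ci(emissions))  # 95% interval for the mean
    ```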

  11. Methodology for generating waste volume estimates

    International Nuclear Information System (INIS)

    Miller, J.Q.; Hale, T.; Miller, D.

    1991-09-01

    This document describes the methodology that will be used to calculate waste volume estimates for site characterization and remedial design/remedial action activities at each of the DOE Field Office, Oak Ridge (DOE-OR) facilities. This standardized methodology is designed to ensure consistency in waste estimating across the various sites and organizations that are involved in environmental restoration activities. The criteria and assumptions that are provided for generating these waste estimates will be implemented across all DOE-OR facilities and are subject to change based on comments received and actual waste volumes measured during future sampling and remediation activities. 7 figs., 8 tabs

  12. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
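
    The combination step can be illustrated simply: weight each building category's dose transmission (the inverse of its protection factor) by the fraction of the population sheltered in it, then apply the result to a predicted outdoor fallout dose. The three-category breakdown and all numbers below are hypothetical placeholders, far coarser than the actual methodology.

    ```python
    # Population-weighted shelter protection sketch; all values hypothetical.

    # (population fraction, protection factor PF) per building category
    building_stock = [
        (0.30, 3.0),   # light residential
        (0.50, 10.0),  # multi-story masonry / office
        (0.20, 40.0),  # basements and building cores
    ]

    def mean_transmission(stock):
        """Average fraction of the outdoor dose received by the sheltered population."""
        return sum(frac / pf for frac, pf in stock)

    outdoor_dose = 5.0  # hypothetical unsheltered fallout dose (Gy)
    print(outdoor_dose * mean_transmission(building_stock))  # mean sheltered dose
    ```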

  13. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 2: Methodology and Results

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-10-01

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  14. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 2: Methodology and Results

    International Nuclear Information System (INIS)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-01-01

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3)

  15. Incident analysis methodology

    International Nuclear Information System (INIS)

    Libmann, J.

    1986-05-01

    The number of French nuclear power stations in operation, and their organization into standardized plant series, very soon led to the need for a precise organization within both the nuclear safety authorities and the operator, Electricite de France. The methods of analysis have been gradually extended and diversified, and we shall discuss them; it is evident, however, that a very precise definition is needed of the boundary between what concerns safety and what does not. This report first deals with the criteria on which declarations are based, before outlining the main guidelines of the analysis methodology

  16. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from the simple to complex network-based simulations, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selecting the most appropriate technique.

  17. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  18. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    The article investigates the theoretical and methodological principles of situational analysis. The necessity of situational analysis under modern conditions is demonstrated, and the notion of “situational analysis” is defined. We conclude that situational analysis is a continuous, systematic study whose purpose is to identify signs of dangerous situations, to evaluate such signs comprehensively as they are influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative, and search functions in the process of situational analysis is demonstrated. The basic methodological elements of situational analysis are grounded. Substantiating these principal methodological elements will enable the analyst to develop adaptive methods that account for the peculiar features of a unique object, namely a situation that has emerged in a complex system; to diagnose such a situation and subject it to systematic, in-depth analysis; to identify risks and opportunities; and to make timely management decisions as required by a particular period.

  19. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    OpenAIRE

    Tetyana KOVALCHUK

    2016-01-01

    The article deals with the investigation of theoretical and methodological principles of situational analysis. The necessity of situational analysis is proved in modern conditions. The notion “situational analysis” is determined. We have concluded that situational analysis is a continuous system study which purpose is to identify dangerous situation signs, to evaluate comprehensively such signs influenced by a system of objective and subjective factors, to search for motivated targeted action...

  20. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  1. Recent Methodology in Ginseng Analysis

    Science.gov (United States)

    Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

    2012-01-01

    Matching its popularity in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to hereafter as ginseng analysis, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade, including emerging techniques and analytical trends. Ginseng analysis encompasses all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

  2. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    The research presented in this paper concerns the definition of a methodology for developing credit analysis in companies and its application to lending operations in the Republic of Serbia. With a developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, aimed at minimizing and managing credit risk, and describes the processing of the credit application and the procedure for analyzing financial statements to gain insight into the borrower's creditworthiness. The second part of the paper presents the theoretical and methodological framework as applied in a specific company. The third part presents models that banks should use to protect against risk exposure; their goal is to reduce losses on lending operations in Serbia and to adjust to market conditions in an optimal way.

  3. METHODOLOGY OF MATHEMATICAL ANALYSIS IN POWER NETWORK

    OpenAIRE

    Jerzy Szkutnik; Mariusz Kawecki

    2008-01-01

    Power distribution network analysis is taken into account. Based on the correlation coefficient, the authors establish a methodology of mathematical analysis useful in identifying the substations that bear responsibility for power stoppages. A methodology for risk assessment is also presented.

  4. Stakeholder analysis methodologies resource book

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, W.M.; Farhar, B.C.

    1994-03-01

    Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.

  5. Architectural and Behavioral Systems Design Methodology and Analysis for Optimal Habitation in a Volume-Limited Spacecraft for Long Duration Flights

    Science.gov (United States)

    Kennedy, Kriss J.; Lewis, Ruthan; Toups, Larry; Howard, Robert; Whitmire, Alexandra; Smitherman, David; Howe, Scott

    2016-01-01

    As human spaceflight missions change and reach towards Mars, the risk of an adverse behavioral outcome increases, and the requirements for crew health, safety, and performance, as well as for the internal architecture, will need to change to accommodate unprecedented mission demands. Evidence shows that architectural arrangement and habitability elements impact behavior. Net habitable volume is the volume available to the crew after accounting for elements that decrease the functional volume of the spacecraft. Determination of the minimum acceptable net habitable volume and associated architectural design elements, as mission duration and environment vary, is key to enabling, maintaining, and/or enhancing human performance and psychological and behavioral health. Current NASA efforts to derive minimum acceptable net habitable volumes, to study the interaction of covariates and stressors such as sensory stimulation, communication, autonomy, and privacy, and to apply the results to internal architecture design layouts, attributes, and the use of advanced accommodations will be presented. Furthermore, implications of crew adaptation to available volume as they transfer from Earth accommodations, to deep space travel, to planetary surface habitats, and return, will be discussed.
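
    Net habitable volume itself is a simple accounting quantity: pressurized volume minus everything that subtracts from functional living space. The sketch below shows that bookkeeping with invented deduction categories and numbers; it is an illustration of the definition, not NASA's actual accounting.

    ```python
    # Net habitable volume (NHV) sketch: pressurized volume minus deductions.
    # Categories and values are hypothetical illustrations.

    def net_habitable_volume(pressurized_m3, deductions_m3):
        """NHV = pressurized volume minus volume lost to non-habitable elements."""
        return pressurized_m3 - sum(deductions_m3.values())

    deductions = {
        "equipment_racks": 35.0,
        "stowage": 20.0,
        "subsystems_and_structure": 15.0,
        "unusable_geometry": 10.0,
    }
    print(net_habitable_volume(pressurized_m3=160.0, deductions_m3=deductions))  # m^3
    ```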

  6. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)

  7. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor) which has been developed by Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, further validated final safety analysis methodology will be developed. Current licensing safety analysis methodology of the Westinghouse and KSNPP PWRs operating and under development in Korea as well as the Russian licensing safety analysis methodology for the integral reactors have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. Validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended for the nuclear regulatory authority to establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  8. Benefit-Cost Analysis of Integrated Paratransit Systems : Volume 6. Technical Appendices.

    Science.gov (United States)

    1979-09-01

    This last volume includes five technical appendices which document the methodologies used in the benefit-cost analysis. They are the following: Scenario analysis methodology; Impact estimation; Example of impact estimation; Sensitivity analysis; Agg...

  9. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  10. METHODOLOGICAL STRATEGIES FOR TEXTUAL DATA ANALYSIS:

    Directory of Open Access Journals (Sweden)

    Juan Carlos Rincón-Vásquez

    2011-12-01

    This paper presents a classification methodology for studies of textual data. This classification is based on the two predominant methodologies for social science research: qualitative and quantitative. The basic assumption is that the research process involves three main features: (1) structuring the research, (2) collection of information, and (3) analysis and interpretation of data. In each, there are general guidelines for textual studies.

  11. Constructive Analysis : A Study in Epistemological Methodology

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    The present study is concerned with the viability of the primary method in contemporary philosophy, i.e., conceptual analysis. Starting out by tracing the roots of this methodology to Platonic philosophy, the study questions whether such a methodology makes sense when divorced from Platonic philosophy...

  12. Rat sperm motility analysis: methodologic considerations

    Science.gov (United States)

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

  13. Methodology of human factor analysis

    International Nuclear Information System (INIS)

    Griffon-Fouco, M.

    1988-01-01

    The paper describes the manner in which the Heat Production Department of Electricite de France analyses human factors in nuclear power plants. After describing the teams and structures set up to deal with this subject, the paper emphasizes two types of methods, most often used in complementary fashion: (1) a posteriori analysis, which consists in studying events that have taken place at nuclear power plants and seeking their deep-seated causes so as to prevent their recurrence in future; (2) a priori analysis, which consists in analysing a work situation and detecting all its potential failure factors so as to prevent them from resulting in dysfunctions of the facility. To illustrate these two types of analysis, two examples are given: first, a study of telephone communications between operators in one plant (in which both the a posteriori and a priori analyses are developed) and, second, a study of stress in a plant (in which only the a priori analysis is used). (author). 1 tab

  14. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    D.G. Horton

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  15. Comparative analysis of proliferation resistance assessment methodologies

    International Nuclear Information System (INIS)

    Takaki, Naoyuki; Kikuchi, Masahiro; Inoue, Naoko; Osabe, Takeshi

    2005-01-01

    Comparative analysis of the methodologies was performed based on the discussions at the international workshop on 'Assessment Methodology of Proliferation Resistance for Future Nuclear Energy Systems' held in Tokyo in March 2005. Through the workshop and subsequent considerations, it became clear that proliferation resistance assessment methodologies are affected by the broader nuclear options being pursued and also by the political situation of the state. Even the definition of proliferation resistance, despite the commonality of fundamental issues, derives from the perceived threat and the implementation circumstances inherent to the larger programs. A deeper recognition of these differences among communities would help to advance the discussion toward harmonization. (author)

  16. Nondestructive assay methodologies in nuclear forensics analysis

    International Nuclear Information System (INIS)

    Tomar, B.S.

    2016-01-01

    In the present chapter, the nondestructive assay (NDA) methodologies used for the analysis of nuclear materials as part of a nuclear forensic investigation are described. These NDA methodologies are based on (i) measurement of the passive gamma rays and neutrons emitted by the radioisotopes present in the nuclear materials, and (ii) measurement of the gamma rays and neutrons emitted after active interrogation of the nuclear materials with a source of X-rays, gamma rays or neutrons

  17. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  18. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.; US Nuclear Regulatory Commission, Washington, DC 20555)

    1985-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 regulation to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  19. Intravascular volume in cirrhosis. Reassessment using improved methodology

    International Nuclear Information System (INIS)

    Rector, W.G. Jr.; Ibarra, F.

    1988-01-01

    Previous studies of blood volume (BV) in cirrhosis have either not adjusted BV properly for body size; determined plasma volume from the dilution of labeled albumin 10-20 min postinjection, when some extravascular redistribution has already occurred; and/or not used the correct whole body-peripheral hematocrit ratio (0.82) in calculating whole BV from plasma volume and the peripheral hematocrit. We measured BV with attention to these considerations in 19 patients with cirrhosis and reexamined the determinants of vascular volume and the relationship between vascular volume and sodium retention. BV was calculated as plasma volume (determined from the extrapolated plasma activity of intravenously injected [131I]albumin at time zero) divided by [1 - (peripheral hematocrit x 0.82)]. The result was expressed per kilogram dry body weight, determined by subtracting the mass of ascites (measured by isotope dilution; 1 liter = 1 kg) from the actual body weight of nonedematous patients. Measured and expressed in this way, BV correlated strongly with esophageal variceal size (r = 0.87, P less than 0.05), although not with net portal, right atrial, inferior vena caval, or arterial pressure, and was significantly greater in patients with sodium retention than in patients without sodium retention. The principal modifier of vascular volume in cirrhosis is vascular capacity, which is probably mainly determined by the extent of the portasystemic collateral circulation. The increased vascular volume in patients with sodium retention supports the overflow theory of ascites formation
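
    A worked illustration of the stated calculation follows. It assumes, as in the abstract, that whole-body hematocrit is 0.82 times the peripheral value, so BV = PV / (1 - 0.82 x Hct); the patient numbers are hypothetical.

    ```python
    # Worked illustration of the abstract's blood-volume calculation.
    # Patient values are hypothetical.

    def whole_blood_volume_ml(plasma_volume_ml, peripheral_hct, ratio=0.82):
        # Whole-body hematocrit = 0.82 x peripheral hematocrit, so
        # BV = PV / (1 - 0.82 x Hct_peripheral).
        return plasma_volume_ml / (1.0 - ratio * peripheral_hct)

    def bv_per_kg_dry(bv_ml, body_weight_kg, ascites_kg):
        # Express per kilogram dry body weight (actual weight minus ascites mass).
        return bv_ml / (body_weight_kg - ascites_kg)

    bv = whole_blood_volume_ml(plasma_volume_ml=3400.0, peripheral_hct=0.38)
    print(bv, bv_per_kg_dry(bv, body_weight_kg=74.0, ascites_kg=6.0))  # mL, mL/kg
    ```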

  20. Simplified methodology for Angra 1 containment analysis

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1991-08-01

    A simplified methodology of analysis was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The data obtained were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, together with its short computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  1. Methodology update for estimating volume to service flow ratio.

    Science.gov (United States)

    2015-12-01

    Volume/service flow ratio (VSF) is calculated by the Highway Performance Monitoring System (HPMS) software as an indicator of peak-hour congestion. It is an essential input to the Kentucky Transportation Cabinet's (KYTC) key planning applications, ...
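
    The ratio itself is straightforward arithmetic; what matters operationally is the threshold applied to it. A minimal sketch follows, where the 0.80 congestion flag is an assumed illustration, not KYTC's or HPMS's actual criterion.

    ```python
    # Volume/service flow ratio (VSF) sketch; the flag threshold is an assumed example.

    def vsf(peak_hour_volume_vph, service_flow_rate_vph):
        """Peak-hour volume divided by the service flow rate (vehicles per hour)."""
        return peak_hour_volume_vph / service_flow_rate_vph

    ratio = vsf(peak_hour_volume_vph=1450, service_flow_rate_vph=1700)
    print(round(ratio, 3), "congested" if ratio >= 0.80 else "uncongested")
    ```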

  2. Exploring participatory methodologies in organizational discourse analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches such as single-level vs. multi-level, critical vs. participatory, and discursive vs. material methods. They raise methodological issues of combining these to embrace multimodality in order to enable new contributions … and practices by dealing with challenges of methodological overview, responsive creativity and identity-struggle. The potentials hereof are demonstrated and discussed with cases of two practices that are both critical and co-creative, namely ‘organizational modelling’ and ‘fixed/unfixed positioning’, from fieldwork …

  3. Selection methodology for LWR safety programs and proposals. Volume 2. Methodology application

    International Nuclear Information System (INIS)

    Ritzman, R.L.; Husseiny, A.A.

    1980-08-01

    The results of work done to update and apply a methodology for selecting (prioritizing) LWR safety technology R and D programs are described. The methodology is based on multiattribute utility (MAU) theory. Application of the methodology to rank-order a group of specific R and D programs included development of a complete set of attribute utility functions, specification of individual attribute scaling constants, and refinement and use of an interactive computer program (MAUP) to process decision-maker inputs and generate overall (multiattribute) program utility values. The output results from several decision-makers are examined for consistency and conclusions and recommendations regarding general use of the methodology are presented. 3 figures, 18 tables
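
    The aggregation at the core of MAU theory is a weighted sum of single-attribute utilities, U = sum over i of k_i * u_i(x_i). The sketch below is a minimal additive illustration; the attributes, scaling constants, and scores are invented, and it stands in for, rather than reproduces, the interactive MAUP program.

    ```python
    # Minimal additive multiattribute utility (MAU) sketch.
    # Attributes, weights, and utility scores are hypothetical.

    def overall_utility(weights, utilities):
        """Weighted additive utility: sum of k_i * u_i, with weights summing to 1."""
        assert abs(sum(weights.values()) - 1.0) < 1e-9
        return sum(weights[a] * utilities[a] for a in weights)

    weights = {"safety_benefit": 0.5, "cost": 0.3, "schedule": 0.2}
    program_a = {"safety_benefit": 0.8, "cost": 0.4, "schedule": 0.6}
    program_b = {"safety_benefit": 0.6, "cost": 0.9, "schedule": 0.7}

    # Rank-order the candidate R&D programs by overall utility:
    for name, scores in [("A", program_a), ("B", program_b)]:
        print(name, overall_utility(weights, scores))
    ```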

  4. METHODOLOGY FOR INSTITUTIONAL ANALYSIS OF STABILITY OF REGIONAL FINANCIAL SYSTEM

    Directory of Open Access Journals (Sweden)

    A. V. Milenkov

    2016-01-01

    The relevance of the article stems from the urgent need to develop a methodological framework for regional finance research, dictated by the substantial increase in the volume and range of socio-economic problems whose solution, including their financial support, is the responsibility of the public authorities of the Russian Federation. The article presents the results of the author's research on the institutional analysis of the stability of the regional financial system, understood as a set of institutions and organizations interacting with the regional real sector of the economy. Methodology: the methodological basis of the article comprises economic and statistical methods of analysis, legal documents on the sustainability of the regional financial system, and publications in the field of economic and financial security. Conclusions/relevance: the practical significance of the work lies in its conclusions and recommendations, aimed at the widespread use and adaptation of institutional analysis of the sources of regional financial system stability, which can be used by the legislative and executive authorities of the Russian Federation and the Ministry of Defence in their current activity. Methodological approaches are proposed for structuring the objectives of institutional analysis on the basis of a hierarchical representation of the institutional environment in which the financial system of a federal subject functions.

  5. Recycle operations as a methodology for radioactive waste volume reduction

    International Nuclear Information System (INIS)

    Rasmussen, G.A.

    1985-01-01

    The costs for packaging, transportation and burial of low-level radioactive metallic waste have become so high that an alternate method of decontamination for volume reduction prior to disposal can now be justified. The operation of a large-scale centralized recycle center for decontamination of selected low-level radioactive waste has been proven to be an effective method for waste volume reduction and for retrieving valuable materials for unlimited use. The centralized recycle center concept allows application of state-of-the-art decontamination technology, resulting in a reduction in utility disposal costs and a reduction in the overall net amount of material being buried. Examples of specific decontamination process activities at the centralized facility will be reviewed along with a discussion of the economic impact of decontamination for recycling and volume reduction. Based on almost two years of operation of a centralized decontamination facility, a demonstrated capability exists. The concept has been cost effective and proves that valuable resources can be recycled

  6. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    Oliveira, Laura Cristina de

    2003-01-01

    In the present work, the viability of using neutron activation analysis (NAA) to perform urine and blood clinical analyses was checked. The aim of this study is to investigate the biological behavior of animals fed chow doped with natural uranium over a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentrations in biological samples. The quantitative results for urine sediment obtained by NAA were compared with conventional clinical analysis, and the results were compatible. This methodology was also used on bone and body organs such as liver and muscle to help the interpretation of possible anomalies. (author)

  7. BIBLIOGRAPHIES ON ROLE METHODOLOGY AND PROPOSITIONS. STUDIES IN THE ROLE METHODOLOGY AND PROPOSITIONS, VOLUME D.

    Science.gov (United States)

    BIDDLE, BRUCE J.; AND OTHERS

    At present, role theory is moving toward a more eminent position in sociology. Role theory has definite and basic distinctions with respect to persons, number, background, characteristics, and cognitions. The bibliographic studies of previous research in the role area are presented. Focus is on the previous use of role methodology and…

  8. Graphic display development methodology: Volume 1, Theory: Final report

    International Nuclear Information System (INIS)

    Pankrantz, D.

    1986-11-01

    The Graphic Display Development Program is intended to develop computer-based displays which support the symptomatic emergency operating procedures for BWRs. The purpose is to provide a technical basis and methodology for linking two post-TMI safety initiatives: improved operating procedures and the Safety Parameter Display System (NUREG-0737 Supplement 1). Whereas consistency between displays and emergency operating procedures is desirable, no assumption of such an NRC requirement is either expressed or implied in this document. Accordingly, this program should be viewed not as the only acceptable approach to SPDS design but as one of many possible approaches which may be pursued. This program has been supported as a generic activity on behalf of the Boiling Water Reactor Owner's Group (BWROG). No endorsement by any individual utility member of the BWROG is either expressed or implied in this document, nor is any utility obligated to implement this program at any plant

  9. Requirements Analysis in the Value Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Conner, Alison Marie

    2001-05-01

    The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. With the varied backgrounds and experiences the team brings to the study, come different perspectives on the problem and the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, either regulatory or customer prescribed. This paper will provide insight to the level of rigor applied to a requirements analysis step and give some examples of tools and techniques utilized to ease the management of the requirements and functions those requirements support for highly complex problems.

  10. Cost analysis methodology of spent fuel storage

    International Nuclear Information System (INIS)

    1994-01-01

    The report deals with the cost analysis of interim spent fuel storage; however, it is not intended either to give a detailed cost analysis or to compare the costs of the different options. This report provides a methodology for calculating the costs of different options for interim storage of the spent fuel produced in the reactor cores. Different technical features and storage options (dry and wet, away from reactor and at reactor) are considered and the factors affecting all options defined. The major cost categories are analysed. Then the net present value of each option is calculated and the levelized cost determined. Finally, a sensitivity analysis is conducted taking into account the uncertainty in the different cost estimates. Examples of current storage practices in some countries are included in the Appendices, with description of the most relevant technical and economic aspects. 16 figs, 14 tabs
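
    The report's central calculations, net present value and levelized cost, can be sketched generically as below. The cash flows, storage quantities, and discount rate are hypothetical placeholders, not values from the report.

    ```python
    # Generic NPV and levelized-cost sketch; all inputs are hypothetical.

    def npv(values, rate):
        """Present value of a series where values[t] applies to year t."""
        return sum(v / (1.0 + rate) ** t for t, v in enumerate(values))

    def levelized_unit_cost(costs, quantities, rate):
        """Discounted costs divided by discounted quantities (e.g., $ per tHM stored)."""
        return npv(costs, rate) / npv(quantities, rate)

    costs = [250.0] + [12.0] * 20      # M$: capital in year 0, then annual O&M
    quantities = [0.0] + [300.0] * 20  # tHM of spent fuel stored per year
    print(levelized_unit_cost(costs, quantities, rate=0.05))  # ~0.11 M$/tHM
    ```

    A sensitivity analysis of the kind the report describes would simply re-run this calculation while varying one input (e.g., the discount rate or annual O&M cost) over its uncertainty range.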

  11. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W.; and others

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and "what if" questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  12. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
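
    In the same spirit, a toy Monte Carlo estimate of damage probability can be sketched by conditioning on a tornado strike, sampling a missile count, and testing each missile for impact and damage. Every distribution and number below is an invented placeholder, vastly simpler than the TORMIS event models.

    ```python
    import math
    import random

    # Toy Monte Carlo sketch of tornado-missile damage risk.
    # All probabilities and distributions are invented placeholders, not TORMIS.

    def poisson(rng, lam):
        """Knuth's method for Poisson sampling; adequate for small means."""
        limit, k, p = math.exp(-lam), 0, 1.0
        while p > limit:
            k += 1
            p *= rng.random()
        return k - 1

    def p_damage_given_strike(n_trials=100_000, mean_missiles=15.0,
                              p_impact=0.005, p_damage=0.2, seed=7):
        """Fraction of simulated strikes producing at least one damaging impact."""
        rng = random.Random(seed)
        damaging = 0
        for _ in range(n_trials):
            n = poisson(rng, mean_missiles)
            if any(rng.random() < p_impact * p_damage for _ in range(n)):
                damaging += 1
        return damaging / n_trials

    annual_strike_frequency = 1.0e-4  # assumed tornado strike rate at the site
    print(annual_strike_frequency * p_damage_given_strike())  # damage prob. per year
    ```

    Conditioning on a strike and multiplying by the strike frequency afterward keeps every trial informative, a simple variance-reduction choice when the strike itself is rare.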

  13. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    In a circumstance where the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for use in multi-compartment containment performance analysis. The applicability of the developed GOTHIC methodology is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of just 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results: the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses, considering the inherent conservatism in the CONTEMPT-LT code.

  14. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    In a circumstance where the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for use in multi-compartment containment performance analysis. The applicability of the developed GOTHIC methodology is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of just 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results: the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses, considering the inherent conservatism in the CONTEMPT-LT code.

  15. Bare-Hand Volume Cracker for Raw Volume Data Analysis

    Directory of Open Access Journals (Sweden)

    Bireswar Laha

    2016-09-01

    Analysis of raw volume data generated from different scanning technologies faces a variety of challenges related to search, pattern recognition, spatial understanding, quantitative estimation, and shape description. In a previous study, we found that the Volume Cracker (VC) 3D interaction (3DI) technique mitigated some of these problems, but this result came from a tethered, glove-based system with users analyzing simulated data. Here, we redesigned the VC using untethered bare-hand interaction with real volume datasets, with the broader aim of adoption of this technique in research labs. We developed symmetric and asymmetric interfaces for the Bare-Hand Volume Cracker (BHVC) through design iterations with a biomechanics scientist. We evaluated our asymmetric BHVC technique against standard 2D and widely used 3D interaction techniques with experts analyzing scanned beetle datasets. We found that our BHVC design significantly outperformed the other two techniques. This study contributes a practical 3DI design for scientists, documents lessons learned while redesigning for bare-hand trackers, and provides evidence suggesting that 3D interaction could improve volume data analysis for a variety of visual analysis tasks. Our contribution is in the realm of 3D user interfaces tightly integrated with visualization, for improving the effectiveness of visual analysis of volume datasets. Based on our experience, we also provide some insights into hardware-agnostic principles for the design of effective interaction techniques.

  16. Methodologies for risk analysis in slope instability

    International Nuclear Information System (INIS)

    Bernabeu Garcia, M.; Diaz Torres, J. A.

    2014-01-01

    This paper is an overview of the different methodologies used to produce landslide risk maps, intended to give the reader a basic understanding of how such maps are developed. Landslide hazard maps are increasingly demanded by governments: because of climate change, deforestation and the pressure exerted by the growth of urban centers, the damage caused by natural phenomena increases each year, making this a field of study of growing importance. To explain the mapping process, each of its phases is walked through in turn: the study of the types of slope movements, the necessary handling of geographic information systems (GIS), landslide inventories, and the analysis of susceptibility, hazard, vulnerability and risk. (Author)

  17. Studying creativity training programs: A methodological analysis

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

    Throughout decades of creativity research, a range of creativity training programs have been developed, tested, and analyzed. In 2004, Scott and colleagues published a meta-analysis of all creativity training programs to date, and the review presented here set out to identify and analyze studies published since that seminal 2004 review. Focusing on quantitative studies of creativity training programs for adults, our systematic review resulted in 22 publications. All studies were analyzed, but comparing the reported effectiveness of training across studies proved difficult due to methodological inconsistencies and variations in the reporting of results as well as in the types of measures used. Thus a consensus for future studies is called for to answer the question: which elements make one creativity training program more effective than another? This is a question of equal relevance to academia and industry.

  18. Methodological considerations for improving Western blot analysis.

    Science.gov (United States)

    MacPhee, Daniel J

    2010-01-01

    The need for a technique that could allow determination of the antigen specificity of antisera led to the development of a method for producing, on a nitrocellulose membrane, a replica of proteins that had been separated electrophoretically on polyacrylamide gels. This method was coined Western blotting and is very useful for studying the presence, relative abundance, relative molecular mass, post-translational modification, and interactions of specific proteins. As a result, it is utilized routinely in many fields of scientific research, such as chemistry, biology and the biomedical sciences. This review touches on some of the methodological conditions that should be considered to improve Western blot analysis, particularly as a guide for graduate students but also for scientists who wish to continue adapting this now fundamental research tool. Copyright 2009 Elsevier Inc. All rights reserved.

  19. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSAL OF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Content analysis (CA) is a popular approach among researchers from different areas, but it is still incipient in project management (PM). Moreover, the volume of usage apparently does not translate into application quality: the method receives constant criticism about the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigates the use of CA in PM research. To accomplish this goal, a systematic literature review is combined with a content analysis of 23 articles from the EBSCO database covering the last 20 years (1996-2016). The findings show that the proposed framework can help researchers apply CA better, and they suggest that use of the method in PM research should expand in both quantity and quality. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM over the last 20 years.

  20. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    Energy Technology Data Exchange (ETDEWEB)

    1987-04-01

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  1. Risk analysis methodology designed for small and medium enterprises

    OpenAIRE

    Ladislav Beránek; Radim Remeš

    2009-01-01

    The aim of this paper is to present risk analysis procedures successfully applied by several Czech small and medium enterprises. The paper presents in detail the individual steps we use in the risk analysis of small and medium enterprises in the Czech Republic. The suggested approach to risk analysis is based on modifications of the FRAP methodology and the BITS recommendations. The modifications of both methodologies are described in detail. We propose a modified risk analysis methodology which is quick a...

  2. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  3. A Review of Citation Analysis Methodologies for Collection Management

    Science.gov (United States)

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  4. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data-based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments.
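
    As a rough illustration of the Monte Carlo structure described above, the sketch below chains together a few sampled events per trial and counts damaging outcomes. Every distribution and number is a hypothetical placeholder, not TORMIS data or models.

        import random

        def one_trial(rng):
            # Hypothetical event chain: tornado strike -> missile injection ->
            # transport -> impact -> damage (all models illustrative only).
            wind = rng.lognormvariate(4.5, 0.4)        # m/s, placeholder intensity model
            if wind < 60.0:                            # too weak to loft this missile class
                return False
            injected = rng.random() < 0.3              # placeholder injection probability
            hits_target = rng.random() < 1.0e-3        # placeholder transport/impact model
            damages = rng.random() < min(1.0, wind / 150.0)  # placeholder fragility
            return injected and hits_target and damages

        rng = random.Random(42)
        n = 200_000
        hits = sum(one_trial(rng) for _ in range(n))
        tornado_strike_freq = 1.0e-4                   # per year, placeholder
        print("P(damage | strike) ~", hits / n)
        print("damage frequency ~", tornado_strike_freq * hits / n, "per year")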

  5. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels

    2010-01-01

    This paper presents a goal-based methodology for HAZOP studies in which a functional model of the plant is used to assist in a functional decomposition, starting from the purpose of the plant and continuing down to the function of a single node, e.g. a pipe section. This approach leads...

  6. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  7. Application of Control Volume Analysis to Cerebrospinal Fluid Dynamics

    Science.gov (United States)

    Wei, Timothy; Cohen, Benjamin; Anor, Tomer; Madsen, Joseph

    2011-11-01

    Hydrocephalus is among the most common birth defects and at present can be neither prevented nor cured. Afflicted individuals face serious issues, which at present are too complicated and not well enough understood to treat via systematic therapies. This talk outlines the framework and application of a control volume methodology to clinical phase-contrast MRI data. Specifically, integral control volume analysis uses a fundamental fluid dynamics methodology to quantify intracranial dynamics within a precise, direct, and physically meaningful framework. A chronically shunted hydrocephalic patient in need of a revision procedure was used as an in vivo case study. Magnetic resonance velocity measurements within the patient's aqueduct were obtained in four biomedical states and were analyzed using the methods presented here. Pressure force estimates were obtained, showing distinct differences in amplitude, phase, and waveform shape for different intracranial states within the same individual. Thoughts on the implications and opportunities for physiological and diagnostic research and development will be presented.
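
    As a rough sketch of what integral control volume analysis means here: for a fixed control volume spanning a section of the aqueduct, the net axial force on the fluid balances the rate of change of momentum inside the volume plus the net momentum flux through its end faces. The hypothetical code below evaluates a 1-D discretized version of this balance from a time series of velocity profiles such as phase-contrast MRI might provide; the geometry, density and synthetic velocity data are illustrative assumptions, not values from the study.

        import numpy as np

        rho = 1007.0             # kg/m^3, approximate CSF density (assumption)
        area = 3.0e-6            # m^2, aqueduct cross-section (placeholder)
        length = 5.0e-3          # m, control volume length (placeholder)
        dt = 0.02                # s, frame spacing of the velocity measurements

        # u[t, x]: axial velocity (m/s) per time frame and axial station; synthetic
        # pulsatile data stands in for the phase-contrast MRI measurements.
        t = np.arange(0.0, 1.0, dt)[:, None]
        x = np.linspace(0.0, length, 20)
        u = 0.05 * np.sin(2 * np.pi * t) * (1.0 + 0.1 * x / length)

        # Momentum inside the control volume: trapezoidal integral of rho*u*A over x.
        dx = np.diff(x)
        momentum = rho * area * np.sum(0.5 * (u[:, 1:] + u[:, :-1]) * dx, axis=1)
        dmom_dt = np.gradient(momentum, dt)

        # Net momentum flux through the end faces: rho*A*(u_out^2 - u_in^2).
        flux = rho * area * (u[:, -1] ** 2 - u[:, 0] ** 2)

        force = dmom_dt + flux   # net axial force on the fluid in the volume, N
        print("peak force estimate: %.3e N" % np.max(np.abs(force)))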

  8. Methodology for risk analysis of nuclear installations

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Senne Junior, Murillo; Jordao, Elizabete

    2002-01-01

    Both the licensing standards for general uses in nuclear facilities and the specific ones require a risk assessment during their licensing processes. The risk assessment is carried out through the estimation of both probability of the occurrence of the accident, and their magnitudes. This is a complex task because the great deal of potential hazardous events that can occur in nuclear facilities difficult the statement of the accident scenarios. There are also many available techniques to identify the potential accidents, estimate their probabilities, and evaluate their magnitudes. In this paper is presented a new methodology that systematizes the risk assessment process, and orders the accomplishment of their several steps. (author)

  9. Methodology to Forecast Volume and Cost of Cancer Drugs in Low- and Middle-Income Countries

    Directory of Open Access Journals (Sweden)

    Yehoda M. Martei

    2018-02-01

    Purpose: In low- and middle-income countries (LMICs), frequent stock outages of cancer drugs undermine cancer care delivery and are potentially fatal for patients with cancer. The aim of this study is to describe a methodologic approach to forecasting chemotherapy volume and estimating cost that can be readily updated and applied in most LMICs. Methods: The prerequisite data for forecasting are population-based incidence data and cost estimates per unit of each drug to be ordered. We used the supplementary guidelines of the WHO list of essential medicines for cancer to predict treatment plans and ordering patterns. We used de-identified aggregate data from the Botswana National Cancer Registry to estimate incident cases. The WHO Management Sciences for Health International Price Indicator was used to estimate unit costs per drug. Results: The chemotherapy volume required for incident cancer cases was estimated as the standardized dose required to complete a full treatment regimen per patient with a given cancer diagnosis and stage, multiplied by the total number of incident cancer cases with that diagnosis. The estimated chemotherapy cost to treat the 10 most common cancers in the public health care sector of Botswana is approximately 2.3 million US dollars. An estimated 66% of the budget is allocated to the costs of rituximab and trastuzumab alone, which are used by approximately 10% of the cancer population. Conclusion: This method provides a reproducible approach to forecasting chemotherapy volume and cost in LMICs. The volume and cost outputs of this methodology give key stakeholders valuable information that can guide budget estimation, resource allocation, and drug-price negotiations for cancer treatment. Ultimately, this will minimize drug shortages or outages and reduce the potential loss of lives that results from an erratic drug supply.
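
    The forecasting rule stated in the abstract (regimen dose per patient, times incident cases, times unit cost) is simple enough to sketch directly. The drugs, doses, case counts and prices below are invented placeholders, not Botswana registry or MSH price data.

        # Hypothetical inputs: per-diagnosis incident cases and full-regimen drug doses.
        incident_cases = {"breast": 250, "nhl": 120}          # placeholder counts
        regimen_mg_per_patient = {                            # placeholder regimens
            "breast": {"doxorubicin": 480, "cyclophosphamide": 4800},
            "nhl": {"rituximab": 4200, "doxorubicin": 400},
        }
        unit_cost_per_mg = {"doxorubicin": 0.05,              # placeholder USD/mg
                            "cyclophosphamide": 0.01,
                            "rituximab": 5.00}

        volume_mg = {}                                        # total mg of each drug
        for dx, cases in incident_cases.items():
            for drug, mg in regimen_mg_per_patient[dx].items():
                volume_mg[drug] = volume_mg.get(drug, 0) + mg * cases

        total_cost = sum(volume_mg[d] * unit_cost_per_mg[d] for d in volume_mg)
        for drug, mg in sorted(volume_mg.items()):
            print(f"{drug}: {mg:,} mg, ${mg * unit_cost_per_mg[drug]:,.0f}")
        print(f"total budget estimate: ${total_cost:,.0f}")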

  10. Development of economic consequence methodology for process risk analysis.

    Science.gov (United States)

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
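
    To make the loss-function idea concrete, here is a minimal sketch of an inverted-normal-style loss function, one plausible reading of the "modified inverted normal" family named above: the loss rises smoothly from zero at the process target and saturates at a maximum loss as the deviation grows. The functional form and all parameter values are illustrative assumptions, not taken from the paper.

        import math

        def inverted_normal_loss(y, target, max_loss, gamma):
            # Loss that saturates at max_loss as |y - target| grows; gamma sets
            # how quickly deviations become costly (illustrative form):
            # L(y) = max_loss * (1 - exp(-(y - target)^2 / (2 * gamma^2)))
            return max_loss * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

        # Example: a pressure-deviation scenario with placeholder numbers.
        target_kpa, gamma_kpa, worst_case_usd = 500.0, 40.0, 2.0e6
        for deviation in (0.0, 20.0, 60.0, 120.0):
            loss = inverted_normal_loss(target_kpa + deviation, target_kpa,
                                        worst_case_usd, gamma_kpa)
            print(f"deviation {deviation:5.1f} kPa -> loss ${loss:,.0f}")

        # Step 4 of the methodology integrates losses of different types;
        # here simply summed over hypothetical categories.
        total = sum([1.2e5, 3.4e5, loss])   # e.g. downtime + cleanup + last scenario
        print(f"integrated loss: ${total:,.0f}")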

  11. Recurrence interval analysis of trading volumes.

    Science.gov (United States)

    Ren, Fei; Zhou, Wei-Xing

    2010-06-01

    We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
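
    The core computation, extracting the recurrence intervals tau between successive volumes exceeding a threshold q, is straightforward. A minimal sketch follows, using a synthetic volume series in place of the Chinese stock data and a crude tail estimate rather than the full battery of goodness-of-fit tests used in the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        volume = rng.lognormal(mean=10.0, sigma=1.0, size=100_000)  # synthetic "trading volume"

        q = np.quantile(volume, 0.95)            # threshold: top 5% of volumes
        exceed_idx = np.flatnonzero(volume > q)  # times of threshold exceedances
        tau = np.diff(exceed_idx)                # recurrence intervals between them

        print("mean recurrence interval:", tau.mean())

        # Crude power-law tail check: Hill estimator on the largest intervals
        # (the paper uses KS-type goodness-of-fit tests instead).
        tail = np.sort(tau)[-1000:]
        alpha = 1.0 / np.mean(np.log(tail / tail[0]))
        print("Hill tail exponent estimate:", alpha)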

  12. Diversion Path Analysis handbook. Volume 4 (of 4 volumes). Computer Program 2

    International Nuclear Information System (INIS)

    Schleter, J.C.

    1978-11-01

    The FORTRAN IV computer program, DPA Computer Program 2 (DPACP-2) is used to produce tables and statistics on modifications identified when performing a Diversion Path Analysis (DPA) in accord with the methodology given in Volume 1. The program requires 259088 bytes exclusive of the operating system. The data assembled and tabulated by DPACP-2 assist the DPA team in analyzing and evaluating modifications to the plant's safeguards system that would eliminate, or reduce the severity of, vulnerabilities identified by means of the DPA. These vulnerabilities relate to the capability of the plant's material control and material accounting subsystems to indicate diversion of special nuclear material (SNM) by a knowledgeable insider

  13. Shuttle TPS thermal performance and analysis methodology

    Science.gov (United States)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  14. METHODOLOGICAL ANALYSIS OF TRAINING STUDENT BASKETBALL TEAMS

    Directory of Open Access Journals (Sweden)

    Kozina Zh.L.

    2011-06-01

    The leading approaches to the preparation of student basketball teams in higher education are considered. The system includes the following: reliance on top-quality players in the structure of preparedness; widespread use of visual aids, teaching films and animations recording how various techniques are executed by professional basketball players; and application of the methods of autogenic and ideomotor training according to our methodology. The study involved 63 students in years one to five from various universities of Kharkov, holding first or second sports categories: 32 in the experimental group and 31 in the control group. The developed system for training student basketball players was applied for one year. The results confirm the efficiency of the developed system in the training process of student basketball players.

  15. An economic analysis methodology for project evaluation and programming.

    Science.gov (United States)

    2013-08-01

    Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...

  16. Vulnerability and Risk Analysis Program: Overview of Assessment Methodology

    National Research Council Canada - National Science Library

    2001-01-01

    .... Over the last three years, a team of national laboratory experts, working in partnership with the energy industry, has successfully applied the methodology as part of OCIP's Vulnerability and Risk Analysis Program (VRAP...

  17. [Free will and neurobiology: a methodological analysis].

    Science.gov (United States)

    Brücher, K; Gonther, U

    2006-04-01

    Whether or not the neurobiological basis of mental processes is compatible with the philosophical postulate of free will is a matter of committed debate these days. What is the meaning of the frequently quoted experiments concerning voluntary action? Both convictions, being autonomous subjects and exercising a strong influence on the world by applying the sciences, have become most important for modern human self-conception. Now these two views are growing apart and appear contradictory, because neurobiology tries to reveal the illusionary character of free will. In order to cope with this ostensible dichotomy, it is recommended to return to the core of scientific thinking, i.e. to reflection about truth and methods. The neurobiological standpoint referring to Libet, as well as the philosophical approaches to free will, must be analysed considering preconceptions and context conditions. Hence Libet's experiments can be criticised on different levels: methods, methodology and epistemology. Free will is a highly complex system, not a simple fact. Taking these very complicated details into account, it is possible to define conditions of compatibility and to use the term free will in a still meaningful way, negotiating the obstacles called pure chance and determinism.

  18. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The author addresses the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information

  19. Social Networks Analysis: Classification, Evaluation, and Methodologies

    Science.gov (United States)

    2011-02-28

    and time performance. We also focus on large-scale network size and dynamic changes in networks and research new capabilities in performing social networks analysis utilizing parallel and distributed processing.

  20. Radiochemical Analysis Methodology for Uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  1. Understanding Skill in EVA Mass Handling. Volume 4; An Integrated Methodology for Evaluating Space Suit Mobility and Stability

    Science.gov (United States)

    McDonald, P. Vernon; Newman, Dava

    1999-01-01

    The empirical investigation of extravehicular activity (EVA) mass handling conducted on NASA's Precision Air-Bearing Floor led to a Phase I SBIR from JSC. The purpose of the SBIR was to design an innovative system for evaluating space suit mobility and stability in conditions that simulate EVA on the surface of the Moon or Mars. The approach we used to satisfy the Phase I objectives was based on a structured methodology for the development of human-systems technology. Accordingly the project was broken down into a number of tasks and subtasks. In sequence, the major tasks were: 1) Identify missions and tasks that will involve EVA and resulting mobility requirements in the near and long term; 2) Assess possible methods for evaluating mobility of space suits during field-based EVA tests; 3) Identify requirements for behavioral evaluation by interacting with NASA stakeholders; 4) Identify necessary and sufficient technology for implementation of a mobility evaluation system; and 5) Prioritize and select technology solutions. The work conducted in these tasks is described in this final volume of the series on EVA mass handling. While prior volumes in the series focus on novel data-analytic techniques, this volume addresses technology that is necessary for minimally intrusive data collection and near-real-time data analysis and display.

  2. Discourse analysis: making complex methodology simple

    NARCIS (Netherlands)

    Bondarouk, Tatiana; Ruel, Hubertus Johannes Maria; Leino, T.; Saarinen, T.; Klein, S.

    2004-01-01

    Discourse-based analysis of organizations is not new in the field of interpretive social studies. Not long ago, information systems (IS) studies also began to show a keen interest in discourse (Wynn et al., 2002). The IS field has grown significantly in its multiplicity, which is echoed in the

  3. Scenario aggregation and analysis via Mean-Shift Methodology

    International Nuclear Information System (INIS)

    Mandelli, D.; Yilmaz, A.; Metzroth, K.; Aldemir, T.; Denning, R.

    2010-01-01

    A new generation of dynamic methodologies is being developed for nuclear reactor probabilistic risk assessment (PRA) which explicitly account for the time element in modeling the probabilistic system evolution and use numerical simulation tools to account for possible dependencies between failure events. The dynamic event tree (DET) approach is one of these methodologies. One challenge with dynamic PRA methodologies is the large amount of data they produce, which may be difficult to analyze without appropriate software tools. The concept of 'data mining' is well known in the computer science community, and several methodologies have been developed to extract useful information from a dataset with a large number of records. Using the dataset generated by the DET analysis of the reactor vessel auxiliary cooling system (RVACS) of an ABR-1000 for an aircraft crash recovery scenario, together with the mean-shift methodology for data mining, it is shown how clusters of transients with common characteristics can be identified and classified. (authors)
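
    Mean-shift clustering itself is compact enough to sketch: each point is iteratively moved to the kernel-weighted mean of its neighbors until it settles at a density mode, and points converging to the same mode form a cluster. The snippet below shows the idea on synthetic 2-D "scenario features"; it is a generic illustration of the technique, not the authors' RVACS analysis.

        import numpy as np

        def mean_shift(points, bandwidth, iters=50):
            # Move every point toward the Gaussian-kernel-weighted mean of the data.
            modes = points.copy()
            for _ in range(iters):
                for i, p in enumerate(modes):
                    d2 = np.sum((points - p) ** 2, axis=1)
                    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
                    modes[i] = (w[:, None] * points).sum(axis=0) / w.sum()
            return modes

        rng = np.random.default_rng(1)
        # Two synthetic groups of transients described by two features each.
        pts = np.vstack([rng.normal([0, 0], 0.3, (40, 2)),
                         rng.normal([3, 3], 0.3, (40, 2))])
        modes = mean_shift(pts, bandwidth=0.8)

        # Points whose modes nearly coincide belong to the same cluster.
        labels = np.unique(np.round(modes, 1), axis=0, return_inverse=True)[1]
        print("cluster sizes:", np.bincount(labels))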

  4. Methodological aspects on drug receptor binding analysis

    International Nuclear Information System (INIS)

    Wahlstroem, A.

    1978-01-01

    Although drug receptors occur in relatively low concentrations, they can be visualized by the use of appropriate radioindicators. In most cases the procedure is rapid and can reach a high degree of accuracy. Specificity of the interaction is studied by competition analysis. The necessity of using several radioindicators to define a receptor population is emphasized. It may be possible to define isoreceptors and drugs with selectivity for one isoreceptor. (Author)

  5. Advanced Methodology for Containment M/E Release Analysis

    International Nuclear Information System (INIS)

    Kim, C. W.; Park, S. J.; Song, J. H.; Choi, H. R.; Seo, J. T.

    2006-01-01

    Recently, a new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has been developed and applied to the small break LOCA (SBLOCA). This M/E release analysis methodology for EEQ is here extended to the M/E release analysis for containment design for the large break LOCA (LBLOCA) and the main steam line break (MSLB) accident. The advanced M/E release methodology for containment design includes the same engine as the M/E methodology for EEQ; however, conservative approaches for the M/E release, such as a break spillage model and multipliers on the heat transfer coefficient (HTC), are added. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which, like KREM (KEPRI Realistic Evaluation Model), couples RELAP5/MOD3.1/K and CONTEMPT4/MOD5. RELAP5K is based on RELAP5/MOD3.1/K and includes conservatisms for the M/E release and a long-term analysis model. The advanced methodology, adopting recent analysis technology, is able to calculate the various transient stages of a LOCA in a single code system and can also perform the M/E release analysis during the long-term cooling period together with the containment response. This advanced M/E release methodology is developed for the LOCA and applied to the MSLB. The results are compared with those in the Ulchin Nuclear Unit (UCN) 3 and 4 FSAR.

  6. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  7. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  8. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the 1988 revision of 10CFR50.46, which allowed best-estimate (BE) methods in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have shifted continuously from conservative evaluation model (EM) approaches to BE ones. In this context, long-term core cooling (LSC) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and improper features of the old methodology were identified and, as a result, the USNRC suspended the approval of CENPD-254-P-A, the old LSC methodology for CE-designed NPPs. The regulatory bodies requested that the non-conservatisms be removed and that system transient behaviors be reflected in all the LSC methodologies used. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full range of the break spectrum is considered, and the applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the new methodology can be applied without significant changes to current LSC plans.

  9. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    This theoretical essay reflects on three models of text interpretation used in qualitative research whose concepts and methodologies are often confused: content analysis, discourse analysis, and conversation analysis. After presenting the concepts, the essay proposes a preliminary discussion of the conceptual and theoretical-methodological differences perceived between them. A review of the literature was performed to support this discussion. It could be verified that the models differ in the type of strategy used in the treatment of texts, the type of approach, and the theoretical position adopted.

  10. Diversion Path Analysis handbook. Volume 3 (of 4 volumes). Computer Program 1

    International Nuclear Information System (INIS)

    Schleter, J.C.

    1978-11-01

    The FORTRAN IV computer program, DPA Computer Program 1 (DPACP-1), is used to assemble and tabulate the data for Specific Diversion Paths (SDPs) identified when performing a Diversion Path Analysis (DPA) in accord with the methodology given in Volume 1. The program requires 255498 bytes exclusive of the operating system. The data assembled and tabulated by DPACP-1 are used by the DPA team to assist in analyzing vulnerabilities, in a plant's material control and material accounting subsystems, to diversion of special nuclear material (SNM) by a knowledgeable insider. Based on this analysis, the DPA team can identify, and propose to plant management, modifications to the plant's safeguards system that would eliminate, or reduce the severity of, the identified vulnerabilities. The data are also used by plant supervision when investigating a potential diversion

  11. Methodology for flood risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Casada, M.L.; Fussell, J.B.

    1984-01-01

    The methodology for flood risk analysis described here addresses the effects of a flood on nuclear power plant safety systems. Combining the results of this method with the probability of a flood allows the effects of flooding to be included in a probabilistic risk assessment. The five-step methodology includes accident sequence screening to focus the detailed analysis efforts on the accident sequences that are significantly affected by a flood event. The quantitative results include the flood's contribution to system failure probability, accident sequence occurrence frequency and consequence category occurrence frequency. The analysis can be added to existing risk assessments without a significant loss in efficiency. The results of two example applications show the usefulness of the methodology. Both examples rely on the Reactor Safety Study for the required risk assessment inputs and present changes in the Reactor Safety Study results as a function of flood probability
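
    The quantitative step, folding the flood into an accident sequence frequency, amounts to weighting flood-conditional failure probabilities by the flood frequency. A toy numeric sketch with invented values (not taken from the Reactor Safety Study):

        # Hypothetical numbers for one accident sequence, per year.
        flood_freq = 1.0e-3                 # annual frequency of the flood event
        f_seq_no_flood = 4.0e-6             # sequence frequency without flooding
        p_seq_given_flood = 2.0e-3          # sequence probability given the flood

        # Annual sequence frequency with flood effects included: the flood
        # contribution is added to the non-flood contribution.
        freq = (1.0 - flood_freq) * f_seq_no_flood + flood_freq * p_seq_given_flood
        flood_contribution = flood_freq * p_seq_given_flood / freq
        print(f"sequence frequency: {freq:.2e} per year")
        print(f"flood contribution: {flood_contribution:.1%}")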

  12. Incorporation of advanced accident analysis methodology into safety analysis reports

    International Nuclear Information System (INIS)

    2003-05-01

    as structural analysis codes and computational fluid dynamics (CFD) codes are applied. The initial code development took place in the sixties and seventies and resulted in a set of quite conservative codes for reactor dynamics, thermal-hydraulics and containment analysis. The most important limitations of these codes came from insufficient knowledge of the physical phenomena and from limited computer memory and speed. Very significant advances have been made in the development of the code systems during the last twenty years in all of the above areas. If the data for the physical models of the code are sufficiently well established and allow quite a realistic analysis, these newer versions are called advanced codes. The assumptions used in deterministic safety analysis vary from very pessimistic to realistic. In accident analysis terminology, it is customary to call the pessimistic assumptions 'conservative' and the realistic assumptions 'best estimate'. The assumptions can refer to the selection of physical models, the introduction of these models into the code, and the initial and boundary conditions, including the performance and failures of the equipment and human actions. The advanced methodology in the present report means the application of advanced (or best-estimate) codes, sometimes as a combination of various advanced codes for separate stages of the analysis, and in some cases in combination with experiments. Safety Analysis Reports are required to be available before and during the operation of the plant in most countries. The contents, scope and stages of the SAR vary among countries. The guide applied in the USA, i.e. Regulatory Guide 1.70, is representative of the way in which SARs are prepared in many countries. During the design phase, a preliminary safety analysis report (PSAR) is requested in many countries, and the final safety analysis report (FSAR) is required for the operating licence. There is

  13. Simplified methodology for analysis of Angra-1 containment

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1988-01-01

    A simplified analysis methodology was developed to simulate a large break loss of coolant accident at the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The data obtained were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained by this new methodology, together with its small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  14. Analysis of automated highway system risks and uncertainties. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.
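
    One common way to turn expert percentile assessments into distributions, consistent with the protocol described, is to fit a simple parametric form to the elicited percentiles and propagate it by Monte Carlo. The sketch below fits a lognormal to hypothetical 10th/50th/90th percentile judgments for a cost factor and propagates it into a benefit/cost index; all numbers are invented, and the fitting choice is an assumption, not the study's exact procedure.

        import numpy as np

        # Hypothetical expert judgment: 10th/50th/90th percentiles of the vehicle
        # cost increment (USD) for one AHS deployment scenario.
        p10, p50, p90 = 1500.0, 3000.0, 7000.0

        # Fit a lognormal: the median fixes mu; the 10-90 spread fixes sigma.
        z90 = 1.2816                        # standard normal 90th percentile
        mu = np.log(p50)
        sigma = np.log(p90 / p10) / (2.0 * z90)

        rng = np.random.default_rng(7)
        cost = rng.lognormal(mu, sigma, size=100_000)
        benefit = rng.normal(5000.0, 1200.0, size=100_000)  # placeholder benefit/vehicle

        ratio = benefit / cost
        print("P(benefit > cost):", np.mean(ratio > 1.0))
        print("benefit/cost 5th-95th percentiles:", np.percentile(ratio, [5, 95]))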

  15. PIXE methodology of rare earth element analysis and its applications

    International Nuclear Information System (INIS)

    Ma Xinpei

    1992-01-01

    The Proton Induced X-ray Emission (PIXE) methodology for rare earth element (REE) analysis is discussed, including the significance of REE analysis, the principles of PIXE applied to REEs, the selection of characteristic X-rays for the lanthanide series elements, the deconvolution of highly overlapped PIXE spectra, and the minimum detection limit (MDL) for REEs. Some practical applications are presented, and the particulars of PIXE analysis of high-purity REE chemicals are discussed. (author)
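
    Deconvolution of overlapping lanthanide L-lines is essentially a constrained peak-fitting problem. A minimal sketch: model a region of the spectrum as a sum of Gaussians on a flat background and fit by nonlinear least squares. The energies, counts and peak shapes below are synthetic; real PIXE fits also handle detector response, escape peaks and structured backgrounds more carefully.

        import numpy as np
        from scipy.optimize import curve_fit

        def two_peaks(E, a1, e1, a2, e2, width, bkg):
            # Two Gaussians sharing a common width, on a flat background.
            g = lambda a, c: a * np.exp(-0.5 * ((E - c) / width) ** 2)
            return g(a1, e1) + g(a2, e2) + bkg

        # Synthetic overlapped doublet (e.g. neighboring lanthanide L-alpha lines).
        E = np.linspace(4.5, 5.5, 200)                       # keV
        rng = np.random.default_rng(3)
        truth = two_peaks(E, 900, 4.84, 400, 5.03, 0.06, 50)
        counts = rng.poisson(truth).astype(float)

        p0 = [800, 4.8, 300, 5.0, 0.08, 40]                  # initial guesses
        popt, pcov = curve_fit(two_peaks, E, counts, p0=p0,
                               sigma=np.sqrt(counts + 1.0))  # ~Poisson weights
        areas = (popt[0] * popt[4] * np.sqrt(2 * np.pi),
                 popt[2] * popt[4] * np.sqrt(2 * np.pi))
        print("fitted peak areas (counts):", areas)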

  16. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
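
    Step (2), the parameter screening study, can be illustrated with the Morris elementary-effects idea, a standard screening technique in the spirit of the divide-and-conquer strategy described (the report's own sampling strategies may differ): perturb one input at a time along random trajectories and rank inputs by the mean absolute effect on the output. Everything below, including the stand-in simulator, is illustrative.

        import numpy as np

        def model(x):
            # Placeholder simulator: x[2] is influential, x[0] mildly, x[1] inert.
            return np.sin(x[2]) * 10.0 + x[0] ** 2 + 0.0 * x[1]

        def morris_screening(model, dim, n_traj=50, delta=0.1, seed=0):
            rng = np.random.default_rng(seed)
            effects = [[] for _ in range(dim)]
            for _ in range(n_traj):
                x = rng.uniform(0.0, 1.0 - delta, size=dim)
                y = model(x)
                for j in rng.permutation(dim):        # one-at-a-time moves
                    x[j] += delta
                    y_new = model(x)
                    effects[j].append(abs(y_new - y) / delta)
                    y = y_new
            return [float(np.mean(e)) for e in effects]   # mu* per input

        mu_star = morris_screening(model, dim=3)
        ranking = sorted(enumerate(mu_star), key=lambda t: -t[1])
        print("mu* per input:", mu_star)
        print("screening ranking (most sensitive first):", [i for i, _ in ranking])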

  17. Methodological Tool or Methodology? Beyond Instrumentality and Efficiency with Qualitative Data Analysis Software

    Directory of Open Access Journals (Sweden)

    Pengfei Zhao

    2016-04-01

    Qualitative data analysis software (QDAS) has become increasingly popular among researchers. However, very few discussions have developed regarding the effect of QDAS on the validity of qualitative data analysis. It is a pressing issue, especially because the recent proliferation of conceptualizations of validity has challenged, and to some degree undermined, the taken-for-granted connection between the methodologically neutral understanding of validity and QDAS. This article suggests an alternative framework for examining the relationship between validity and the use of QDAS. Shifting the analytic focus from instrumentality and efficiency of QDAS to the research practice itself, we propose that qualitative researchers should formulate a "reflective space" at the intersection of their methodological approach, the built-in validity structure of QDAS and the specific research context, in order to make deliberative and reflective methodological decisions. We illustrate this new framework through discussion of a collaborative action research project. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1602160

  18. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

    A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. First, a conceptual framework is built and used to analyze the causal relationships between organizational factors and human reliability. Then, an inference model for HRA is built by combining the conceptual framework with Bayesian networks, and used to perform causal and diagnostic inference on human reliability. Finally, a case example is presented to demonstrate the application of the proposed methodology. The results show that combining the conceptual model with Bayesian networks not only models the causal relationships between organizational factors and human reliability easily but also, in a given context, allows quantitative measurement of human operational reliability and identification of the most likely root causes of human error, or a prioritization of those causes. (authors)
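
    A tiny numeric sketch of the Bayesian-network step, with structure, states and probabilities all invented for illustration: one organizational factor influences a performance-shaping factor, which in turn sets the human error probability. Forward inference gives the marginal error probability; Bayes' rule gives the diagnostic direction.

        # Invented two-level network: training quality -> procedure adherence -> error.
        p_training_good = 0.7
        p_adherence_given_training = {True: 0.9, False: 0.5}   # P(adheres | training good?)
        p_error_given_adherence = {True: 0.001, False: 0.02}   # P(error | adheres?)

        # Forward (causal) inference: marginal human error probability.
        p_error = 0.0
        for training in (True, False):
            p_t = p_training_good if training else 1.0 - p_training_good
            for adheres in (True, False):
                p_a = p_adherence_given_training[training]
                p_a = p_a if adheres else 1.0 - p_a
                p_error += p_t * p_a * p_error_given_adherence[adheres]
        print(f"P(error) = {p_error:.4f}")

        # Diagnostic inference: given that an error occurred, was training poor?
        p_error_and_poor = sum(
            (1.0 - p_training_good)
            * (p_adherence_given_training[False] if a else 1.0 - p_adherence_given_training[False])
            * p_error_given_adherence[a]
            for a in (True, False)
        )
        print(f"P(training poor | error) = {p_error_and_poor / p_error:.3f}")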

  19. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    Silva, F.C.A. da.

    1990-01-01

    A critical review of 34 published severe radiological accidents in industrial gamma radiography, which happened in 15 countries between 1960 and 1988, was performed. The most frequent causes, consequences and dose estimation methods were analysed, with the aim of establishing better radiation safety procedures and accident analysis. The objective of this work is to elaborate a radiological accident analysis methodology for industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of an event and to estimate the dose with good confidence. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programs, was adopted in the elaboration of the suggested methodology. The viability of using the Electron Gamma Shower 4 computer code system to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in 192Ir radioactive source handling situations, was also studied. (author)

  20. Disposal criticality analysis methodology's principal isotope burnup credit

    International Nuclear Information System (INIS)

    Doering, T.W.; Thomas, D.A.

    2001-01-01

    This paper presents the burnup credit aspects of the United States Department of Energy Yucca Mountain Project's methodology for performing criticality analyses for commercial light-water-reactor fuel. The disposal burnup credit methodology uses a 'principal isotope' model, which takes credit for the reduced reactivity associated with the build-up of the primary principal actinides and fission products in irradiated fuel. Burnup credit is important to the disposal criticality analysis methodology and to the design of commercial fuel waste packages. The burnup credit methodology developed for disposal of irradiated commercial nuclear fuel can also be applied to storage and transportation of irradiated commercial nuclear fuel. For all applications, a series of loading curves is developed using a best-estimate methodology and, depending on the application, an additional administrative safety margin may be applied. The burnup credit methodology better represents the 'true' reactivity of the irradiated fuel configuration, and hence the real safety margin, than do evaluations using the 'fresh fuel' assumption. (author)
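
    In practice, a loading curve of the kind described above is a boundary in (initial enrichment, burnup) space: an assembly is acceptable for loading if its assembly-average burnup lies on or above the curve for its enrichment. A sketch with an entirely hypothetical curve (real curves come from the validated depletion and criticality calculations the paper describes, and would carry the administrative margin):

        import numpy as np

        # Hypothetical loading curve: minimum required burnup (GWd/MTU)
        # versus initial enrichment (wt% U-235). NOT a licensed curve.
        enrichment_pts = np.array([2.0, 3.0, 4.0, 5.0])
        min_burnup_pts = np.array([0.0, 10.0, 22.0, 35.0])

        def acceptable(enrichment, burnup):
            # Accept the assembly if burnup >= interpolated loading-curve value.
            required = np.interp(enrichment, enrichment_pts, min_burnup_pts)
            return burnup >= required, required

        for e, b in [(3.5, 20.0), (4.5, 20.0)]:
            ok, req = acceptable(e, b)
            print(f"{e} wt%, {b} GWd/MTU: need >= {req:.1f} -> {'accept' if ok else 'reject'}")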

  1. Development of mass and energy release analysis methodology

    International Nuclear Information System (INIS)

    Kim, Cheol Woo; Song, Jeung Hyo; Park, Seok Jeong; Kim, Tech Mo; Han, Kee Soo; Choi, Han Rim

    2009-01-01

    Recently, new approaches to accident analysis using realistic evaluation have been attempted. These new approaches provide more margin for plant safety, design, operation and maintenance. KREM (KEPRI Realistic Evaluation Methodology) for a large break loss-of-coolant accident (LOCA) is based on the RELAP5/MOD3 computer code with realistic evaluation models. KOPEC has developed KIMERA (KOPEC Improved Mass and Energy Release Analysis methodology), likewise based on realistic evaluation, to improve the analysis method for mass and energy (M/E) release and to obtain adequate margin. Unlike conventional M/E release analysis methodologies, KIMERA uses a simplified single code system, which reduces the computing effort, especially for LOCA analysis. The computer code system of this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which, like the KREM methodology, couples RELAP5/MOD3.1/K and CONTEMPT4/MOD5. KIMERA, based on the same engine as KREM, adopts conservative approaches for the M/E release such as a break spillage model, multipliers on the heat transfer coefficient (HTC), and a long-term cooling model. KIMERA was developed for a LOCA, applied to a main steam line break (MSLB), and approved by the Korean government. KIMERA can calculate the various transient stages of a LOCA in one continuous run of a single code system, and can perform the M/E release analysis during the long-term cooling period together with the containment pressure and temperature (P/T) response. The containment P/T analysis results are compared with those of the Ulchin Nuclear Power Plant Units 3 and 4 (UCN 3 and 4) FSAR; UCN 3 and 4 is an OPR1000 (Optimized Power Reactor 1000) type nuclear power plant. The results for a large break LOCA and an MSLB are similar to those of the FSAR for UCN 3 and 4. However, the containment pressure during the post-blowdown period of a large break LOCA has a much lower second peak than the first peak. The resultant containment peak

  2. Interpersonal Dynamics in a Simulated Prison: A Methodological Analysis

    Science.gov (United States)

    Banuazizi, Ali; Movahedi, Siamak

    1975-01-01

    A critical overview is presented of the Stanford Prison Experiment, conducted by Zimbardo and his co-investigators, in which they attempted a structural analysis of the problems of imprisonment. Key assumptions are questioned, primarily on methodological grounds, casting doubt on the plausibility of the experimenters' final causal inferences.…

  3. Methodology for reactor core physics analysis - part 2

    International Nuclear Information System (INIS)

    Ponzoni Filho, P.; Fernandes, V.B.; Lima Bezerra, J. de; Santos, T.I.C.

    1992-12-01

    The computer codes used for reactor core physics analysis are described. The modifications introduced into the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameters involved in qualifying the methodology is included. (F.E.). 5 ref, 7 figs, 5 tabs

  4. A Local Approach Methodology for the Analysis of Ultimate Strength ...

    African Journals Online (AJOL)

    The local approach methodology, in contrast to classical fracture mechanics, can be used to predict the onset of tearing fracture and the effects of geometry in tubular joints. Finite element analysis of T-joint plate geometries and of tubular joints has been performed. The parameters of constraint, equivalent stress, plastic strain and ...

  5. Active disturbance rejection control: methodology and theoretical analysis.

    Science.gov (United States)

    Huang, Yi; Xue, Wenchao

    2014-07-01

    The methodology of ADRC and the progress of its theoretical analysis are reviewed in the paper. Several breakthroughs for control of nonlinear uncertain systems, made possible by ADRC, are discussed. The key in employing ADRC, which is to accurately determine the "total disturbance" that affects the output of the system, is illuminated. The latest results in theoretical analysis of the ADRC-based control systems are introduced. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
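
    The "total disturbance" estimation that the review identifies as the key step is commonly done with an extended state observer (ESO). A minimal linear-ESO sketch for a second-order plant follows; the plant, gains and disturbance are illustrative choices, not taken from the paper.

        import numpy as np

        dt, b0 = 0.001, 1.0      # time step; assumed input gain of the plant
        wo = 50.0                # observer bandwidth (the main tuning knob)
        beta = np.array([3 * wo, 3 * wo ** 2, wo ** 3])  # bandwidth-parameterized gains
        kp, kd = 4.0, 4.0        # simple PD gains acting on the estimates

        x = np.array([0.0, 0.0])  # true plant state: output and its derivative
        z = np.zeros(3)           # ESO: est. output, est. derivative, est. total disturbance

        for k in range(5000):
            u = (-kp * z[0] - kd * z[1] - z[2]) / b0   # cancel the estimated disturbance
            d = np.sin(5.0 * k * dt)                   # unknown disturbance (illustrative)
            x = x + dt * np.array([x[1], d + b0 * u])  # plant: y'' = d + b0*u
            e = x[0] - z[0]                            # output estimation error
            z = z + dt * np.array([z[1] + beta[0] * e,          # ESO treats the
                                   z[2] + b0 * u + beta[1] * e, # disturbance as an extra
                                   beta[2] * e])                # state driven by e

        print("estimated vs true disturbance:",
              round(float(z[2]), 3), round(float(np.sin(5.0 * 5000 * dt)), 3))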

  6. Disposal criticality analysis methodology for fissile waste forms

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1998-03-01

    A general methodology has been developed to evaluate the criticality potential of the wide range of waste forms planned for geologic disposal. The range of waste forms includes commercial spent fuel, high level waste, DOE spent fuel (including highly enriched), MOX using weapons-grade plutonium, and immobilized plutonium. These waste forms will be disposed of in containers with corrosion-resistant barriers thick enough to prevent water penetration for up to 10,000 years. Criticality control for DOE spent fuel is primarily provided by neutron absorber material incorporated into the basket holding the individual assemblies. For the immobilized plutonium, the neutron absorber material is incorporated into the waste form itself. The disposal criticality analysis methodology includes the analysis of geochemical and physical processes that can breach the waste package and affect the waste forms within. The basic purpose of the methodology is to guide the criticality control features of the waste package design and to demonstrate that the final design meets the criticality control licensing requirements. The methodology can also be extended to the analysis of criticality consequences (primarily increased radionuclide inventory), which will support the total performance assessment for the repository

  7. Methodological Variability Using Electronic Nose Technology For Headspace Analysis

    Science.gov (United States)

    Knobloch, Henri; Turner, Claire; Spooner, Andrew; Chambers, Mark

    2009-05-01

    Since the idea of electronic noses was published, numerous electronic nose (e-nose) developments and applications have been reported for analyzing solid, liquid and gaseous samples in the food and automotive industries or for medical purposes. However, little is known about methodological pitfalls that might be associated with e-nose technology. Some of the methodological variation caused by changes in ambient temperature, the use of different filters, and changes in mass flow rates is described. Reasons for a lack of stability and reproducibility are given, explaining why methodological variation influences sensor responses and why e-nose technology may not always be sufficiently robust for headspace analysis. However, the potential of e-nose technology is also discussed.

  8. Two methodologies for optical analysis of contaminated engine lubricants

    International Nuclear Information System (INIS)

    Aghayan, Hamid; Yang, Jun; Bordatchev, Evgueni

    2012-01-01

    The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, have been intensively utilized in the engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase the drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in engine lubricants. The methodologies are based on optical analysis of the distortion effect observed when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant–object optical appearance, where the a priori known periodic structure of the object is distorted by the contaminated lubricant. In the object shape-based optical analysis, several parameters of the acquired optical image are newly proposed, such as the gray-scale intensity of the lubricant and the object, the shape width at the object and lubricant levels, the object relative intensity, and a width non-uniformity coefficient. Variations in the contaminant concentration and the use of different contaminants lead to changes in these parameters, which are measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectrums, transfer function
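
    The statistical variant of the methodology reduces, at its core, to comparing a known periodic reference pattern with its image seen through the lubricant film. A minimal 1-D sketch: compute the normalized cross-correlation between the clean and distorted intensity profiles, with contamination lowering the peak correlation. The signals and the blur-plus-noise distortion model are synthetic stand-ins; the paper's full method works on 2-D images with spectra and transfer functions.

        import numpy as np

        rng = np.random.default_rng(5)
        x = np.linspace(0, 10 * np.pi, 2000)
        reference = np.sign(np.sin(x))          # known periodic object (square grating)

        def observed(contamination):
            # Blur + noise standing in for scattering by a contaminated film.
            widths = np.linspace(-3, 3, 61)
            kernel = np.exp(-0.5 * (widths / (0.2 + 3.0 * contamination)) ** 2)
            blurred = np.convolve(reference, kernel / kernel.sum(), mode="same")
            return blurred + rng.normal(0.0, 0.05 + 0.3 * contamination, x.size)

        def peak_ncc(a, b):
            # Peak of the normalized cross-correlation; ~1 for identical signals.
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return np.max(np.correlate(a, b, mode="same")) / a.size

        for c in (0.0, 0.2, 0.5):
            print(f"contamination {c:.1f}: peak NCC = {peak_ncc(reference, observed(c)):.3f}")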

  9. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on
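
    3DCS is commercial tolerance software, but the core of any dimensional variation study - propagating part and assembly tolerances through a dimension chain by Monte Carlo sampling - fits in a few lines. The stack-up below is purely illustrative; the dimensions, tolerances and acceptance limit are invented and are not ITER values.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # number of simulated assemblies

# Hypothetical 1-D stack-up: three parts plus one assembly step, each a
# nominal dimension (mm) with a normal tolerance interpreted as 3-sigma.
contributors = [  # (nominal, tolerance)
    (1200.0, 1.5),
    (800.0, 1.0),
    (400.0, 0.8),
    (0.0, 2.0),   # assembly positioning error
]
nominal_total = sum(nom for nom, _ in contributors)
samples = sum(rng.normal(nom, tol / 3, N) for nom, tol in contributors)

limit = 3.0  # acceptance: |deviation from nominal| <= 3 mm
deviation = samples - nominal_total
print(f"mean dev = {deviation.mean():+.3f} mm, 3-sigma = {3 * deviation.std():.3f} mm")
print(f"out-of-tolerance rate = {(np.abs(deviation) > limit).mean():.4%}")
```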

  10. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  11. Screening Analysis : Volume 1, Description and Conclusions.

    Energy Technology Data Exchange (ETDEWEB)

    Bonneville Power Administration; Corps of Engineers; Bureau of Reclamation

    1992-08-01

    The SOR consists of three analytical phases leading to a Draft EIS. The first phase, Pilot Analysis, was performed to test the decision analysis methodology being used in the SOR; it is described later in this chapter. The second phase, Screening Analysis, examines all possible operating alternatives using a simplified analytical approach. It is described in detail in this and the next chapter, and this document also presents the results of screening. The final phase, Full-Scale Analysis, will be documented in the Draft EIS and is intended to evaluate comprehensively the few best alternatives arising from the screening analysis. The purpose of screening is to analyze a wide variety of differing ways of operating the Columbia River system to test the reaction of the system to change. The many alternatives considered reflect the range of needs and requirements of the various river users and interests in the Columbia River Basin. While some of the alternatives might be viewed as extreme, the information gained from the analysis is useful in highlighting issues and conflicts in meeting operating objectives. Screening is also intended to develop a broad technical basis for evaluation, drawing on regional experts, and to begin developing an evaluation capability for each river use that will support full-scale analysis. Finally, screening provides a logical method for examining all possible options and reaching a decision on a few alternatives worthy of full-scale analysis. An organizational structure was developed and staffed to manage and execute the SOR, specifically during the screening phase and the upcoming full-scale analysis phase. The organization involves ten technical work groups, each representing a particular river use. Several other groups exist to oversee or support the efforts of the work groups.

  12. Evaluation Methodology. The Evaluation Exchange. Volume 11, Number 2, Summer 2005

    Science.gov (United States)

    Coffman, Julia, Ed.

    2005-01-01

    This is the third issue of "The Evaluation Exchange" devoted entirely to the theme of methodology, though every issue tries to identify new methodological choices, the instructive ways in which people have applied or combined different methods, and emerging methodological trends. For example, lately "theories of change" have gained almost…

  13. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    Science.gov (United States)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating the failure risk of spaceflight systems, to assess flight readiness and identify risk control measures, is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
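
    The PFA idea - drive a conventional engineering failure model with probability distributions on its uncertain parameters and read failure probability off the simulated outcomes - can be illustrated with a deliberately simple stress-versus-strength model. All distributions and numbers below are assumptions for illustration, not values from the report; the subsequent Bayesian update against test and flight experience is omitted.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 1_000_000

# Engineering analysis model: failure occurs when applied stress exceeds
# strength. Parameter uncertainty is expressed as probability distributions.
stress = rng.normal(300.0, 25.0, N)                # MPa, load uncertainty
strength = rng.lognormal(np.log(420.0), 0.08, N)   # MPa, material scatter
model_error = rng.normal(1.0, 0.05, N)             # multiplicative modeling accuracy

failures = stress * model_error > strength
print(f"estimated failure probability: {failures.mean():.2e}")
```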

  14. RISK ANALYSIS IN CONSTRUCTION PROJECTS: A PRACTICAL SELECTION METHODOLOGY

    OpenAIRE

    Alberto De Marco; Muhammad Jamaluddin Thaheem

    2014-01-01

    Project Risk Management (PRM) is gaining attention from researchers and practitioners in the form of sophisticated tools and techniques to help construction managers perform risk management. However, the large variety of techniques has made selecting an appropriate solution a complex and risky task in itself. Accordingly, this study proposes a practical framework methodology to assist construction project managers and practitioners in choosing a suitable risk analysis technique based on selec...

  15. Estimates of emergency operating capacity in US manufacturing and nonmanufacturing industries - Volume 1: Concepts and Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, D.B. (Pacific Northwest Lab., Richland, WA (USA)); Serot, D.E. (D/E/S Research, Richland, WA (USA)); Kellogg, M.A. (ERCE, Inc., Portland, OR (USA))

    1991-03-01

    Development of integrated mobilization preparedness policies requires planning estimates of available productive capacity during national emergency conditions. Such estimates must be developed in a manner to allow evaluation of current trends in capacity and the consideration of uncertainties in various data inputs and in engineering assumptions. This study developed estimates of emergency operating capacity (EOC) for 446 manufacturing industries at the 4-digit Standard Industrial Classification (SIC) level of aggregation and for 24 key nonmanufacturing sectors. This volume lays out the general concepts and methods used to develop the emergency operating estimates. The historical analysis of capacity extends from 1974 through 1986. Some nonmanufacturing industries are included. In addition to mining and utilities, key industries in transportation, communication, and services were analyzed. Physical capacity and efficiency of production were measured. 3 refs., 2 figs., 12 tabs. (JF)

  16. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that come together for an error to occur in human tasks performed under normal operating conditions and in those performed after an abnormal event. Additionally, analysis of various accidents in history has found the human component to be a contributing cause. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the first generation of HRA methodologies. Methods were subsequently developed to include additional performance shaping factors, and the interactions between them, in their models. By the mid-1990s came what are considered the second-generation methodologies, among them A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required independent evaluation of the two related human failure events. Gathering the new human error probabilities therefore involves quantifying the nominal scenario and the cases of significant deviations, considering their potential impact on the analyzed human failure events. As in probabilistic safety analysis, analysis of the sequences extracted the more specific factors with the highest contribution to the human error probabilities. (Author)

  17. Crossing trend analysis methodology and application for Turkish rainfall records

    Science.gov (United States)

    Şen, Zekâi

    2018-01-01

    Trend analyses are the necessary tools for depicting possible general increase or decrease in a given time series. There are many versions of trend identification methodologies such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, regression line, and Şen's innovative trend analysis. The literature has many papers about the use, cons and pros, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend from the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has dependent or independent structure and also without any dependence on the type of the probability distribution function. The validity of this method is presented through extensive Monte Carlo simulation technique and its comparison with other existing trend identification methodologies. The application of the methodology is presented for a set of annual daily extreme rainfall time series from different parts of Turkey and they have physically independent structure.
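
    The crossing criterion has a direct operational reading: among candidate trend lines passing through the centroid of the series, keep the slope whose residual series crosses zero most often. The sketch below is one plausible implementation of that reading, not Şen's published code.

```python
import numpy as np

def crossing_trend_slope(x: np.ndarray, slopes: np.ndarray) -> float:
    """Return the candidate slope whose trend line through the series
    centroid maximizes the number of up-/down-crossings of the residuals."""
    t = np.arange(x.size)
    tc, xc = t.mean(), x.mean()               # centroid of the time series
    best_slope, best_crossings = 0.0, -1
    for s in slopes:
        residuals = x - (xc + s * (t - tc))   # detrend about the centroid
        signs = np.sign(residuals)
        signs = signs[signs != 0]             # ignore exact zeros
        crossings = int(np.sum(signs[1:] != signs[:-1]))
        if crossings > best_crossings:
            best_slope, best_crossings = s, crossings
    return best_slope

rng = np.random.default_rng(1)
series = 0.05 * np.arange(200) + rng.normal(0, 1, 200)  # synthetic trend + noise
print(crossing_trend_slope(series, np.linspace(-0.2, 0.2, 401)))  # near 0.05
```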

  18. The effect of duration of illness and antipsychotics on subcortical volumes in schizophrenia: Analysis of 778 subjects

    Directory of Open Access Journals (Sweden)

    Naoki Hashimoto

    2018-01-01

    Discussion: A large sample size, uniform data collection methodology and robust statistical analysis are strengths of the current study. The result suggests that special attention is needed when discussing the relationship between subcortical regional brain volumes and the pathophysiology of schizophrenia, because regional brain volumes may be affected by antipsychotic medication.

  19. Revised Rapid Soils Analysis Kit (RSAK) - Wet Methodology

    Science.gov (United States)

    2018-01-01

    ... identify characteristics of buried explosives. The existing Rapid Soils Analysis Kit (RSAK), developed at ERDC, was modified to shrink its cube volume ... under Project 354894 CALDERA. The technical monitor was Dr. John Q. Ehrgott Jr. The principal investigator for this study was ... and depth of a buried improvised explosive device (IED) based on factors such as soil type, soil density, soil moisture, and dimensional ...

  20. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES

    International Nuclear Information System (INIS)

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H. NRC.

    2005-01-01

    Several of the new generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallow embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model the embedment effect when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at the Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) the description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOB, (2) the comparison of the analysis results for the different DOBs between the two methods, and (3) the performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The resulting assessment from this study has indicated that simplified methods may be capable of capturing the seismic response for much more deeply embedded structures than would normally be allowed by standard practice.

  1. Multi-resolution Convolution Methodology for ICP Waveform Morphology Analysis.

    Science.gov (United States)

    Shaw, Martin; Piper, Ian; Hawthorne, Christopher

    2016-01-01

    Intracranial pressure (ICP) monitoring is a key clinical tool in the assessment and treatment of patients in neurointensive care. ICP morphology analysis can be useful in the classification of waveform features. A methodology for the decomposition of an ICP signal into clinically relevant dimensions has been devised that allows the identification of important ICP waveform types. It has three main components. First, multi-resolution convolution analysis is used for the main signal decomposition. Then, an impulse function is created, with multiple parameters, that can represent any form in the signal under analysis. Finally, a simple, localised optimisation technique is used to find morphologies of interest in the decomposed data. A pilot application of this methodology using a simple signal has been performed. This has shown that the technique works, with receiver operating characteristic area-under-the-curve values for each of the waveform types (plateau wave, B wave, and high and low compliance states) of 0.936, 0.694, 0.676 and 0.698, respectively. This is a novel technique that showed some promise during the pilot analysis. However, it requires further optimisation to become a usable clinical tool for the automated analysis of ICP signals.
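
    As a rough illustration of the decomposition step, a signal can be convolved with smoothing kernels of increasing width so that slow phenomena (plateau-like rises) separate from faster activity. The Gaussian kernel family and the synthetic trace below are assumptions; the abstract does not specify which kernels the authors use.

```python
import numpy as np

def multi_resolution_decompose(signal: np.ndarray, widths=(5, 25, 125)):
    """Convolve the signal with Gaussian kernels of increasing width,
    returning one smoothed component per resolution level."""
    levels = []
    for w in widths:
        support = np.arange(-3 * w, 3 * w + 1)
        kernel = np.exp(-0.5 * (support / w) ** 2)
        kernel /= kernel.sum()
        levels.append(np.convolve(signal, kernel, mode="same"))
    return levels

# Synthetic "ICP-like" trace: slow plateau-like rise plus faster oscillation.
t = np.linspace(0, 600, 6000)                    # 10 min sampled at 10 Hz
icp = 10 + 8 / (1 + np.exp(-(t - 300) / 20))     # slow plateau-like rise
icp += 1.5 * np.sin(2 * np.pi * t / 60)          # faster periodic wave

coarse = multi_resolution_decompose(icp)[-1]     # coarsest resolution level
# Away from the edges: fast wave smoothed out, slow rise retained.
print(float(coarse[1000]), float(coarse[5000]))
```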

  2. Development of test methodology for dynamic mechanical analysis instrumentation

    Science.gov (United States)

    Allen, V. R.

    1982-01-01

    Dynamic mechanical analysis instrumentation was used to develop specific test methodology for determining engineering parameters of selected materials, especially plastics and elastomers, over a broad temperature range in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiberglass-supported resin. The correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system gave a negative result, because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

  3. Safety analysis methodologies for radioactive waste repositories in shallow ground

    International Nuclear Information System (INIS)

    1984-01-01

    The report is part of the IAEA Safety Series and is addressed to authorities and specialists responsible for, or involved in, planning, performing and/or reviewing safety assessments of shallow ground radioactive waste repositories. It discusses approaches that are applicable to the safety analysis of a shallow ground repository. The methodologies, analysis techniques and models described are pertinent to the task of predicting the long-term performance of a shallow ground disposal system. They may be used during the processes of selection, confirmation and licensing of new sites and disposal systems, or to evaluate the long-term consequences in the post-sealing phase of existing operating or inactive sites. The analysis may point out the need for remedial action, or provide information to be used in deciding on the duration of surveillance. Safety analyses, both general in nature and specific to a certain repository, site or design concept, are discussed, with emphasis on deterministic and probabilistic studies.

  4. Considerations on a methodological framework for the analysis of texts

    Directory of Open Access Journals (Sweden)

    David Andrés Camargo Mayorga

    2017-03-01

    This article presents a review of literature relevant to the construction of a methodological framework for the analysis of texts in applied social sciences such as economics, supported by the main hermeneutical approaches from philosophy, linguistics and the social sciences. In essence, these approaches assume that every discourse carries meaning - be it truthful or not - and that discourses express complex social relations. Thus, any content analysis is ultimately a certain type of hermeneutics (interpretation), while trying to account for the multiple phenomena immersed in the production, application, use and reproduction of knowledge within the text. When applying discourse analysis to teaching texts in the economic sciences, we find traces of legalistic, political and ethnocentric tendencies, among other discourses hidden in the text. For this reason, analysis of the internal discourse of the text allows us to delve into state ideology and its underlying or latent discourses.

  5. Methodological challenges in qualitative content analysis: A discussion paper.

    Science.gov (United States)

    Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit

    2017-09-01

    This discussion paper aims to map content analysis in the qualitative paradigm and explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic in how categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. New Methodology of Block Cipher Analysis using Chaos Game

    Directory of Open Access Journals (Sweden)

    Budi Sulistyo

    2011-08-01

    Block cipher analysis covers randomness analysis and cryptanalysis. This paper proposes a new method potentially usable for both. The method uses the concept of a true random sequence as a reference for measuring the randomness level of a random sequence. Using this concept, the paper defines bias, which represents the violation of a random sequence from a true random sequence. In this paper, a block cipher is treated as the mapping function of a discrete-time dynamical system. The dynamical system framework makes it possible to apply various analysis techniques developed in the dynamical systems field. There are three main parts to the methodology presented in this paper: the dynamical system framework for block cipher analysis, a new chaos game scheme, and an extended measure concept related to chaos game and fractal analysis. The paper also presents the general procedures of the proposed method, which include: symbolic dynamics analysis of the discrete dynamical system whose mapping function is the block cipher, random sequence construction, use of the random sequence as input to a chaos game scheme, measurement of the chaos game scheme's output using the extended measure concept, and analysis of the measurement results. The analysis process for a specific real or sample block cipher, and its results, are beyond the scope of this paper.
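
    The chaos-game step can be made concrete: a symbol stream (here, cipher output read two bits at a time) is folded into the unit square by repeatedly moving halfway toward the corner assigned to each symbol, and the occupancy of the resulting point cloud serves as a crude randomness measure. The corner assignment and the occupancy measure below are illustrative assumptions, not the authors' exact scheme or extended measure.

```python
import numpy as np

CORNERS = np.array([(0, 0), (0, 1), (1, 0), (1, 1)], dtype=float)

def chaos_game(symbols: np.ndarray) -> np.ndarray:
    """Fold a stream of 2-bit symbols (0..3) into the unit square:
    each step moves halfway toward the corner of the current symbol."""
    points = np.empty((symbols.size, 2))
    p = np.array([0.5, 0.5])
    for i, s in enumerate(symbols):
        p = (p + CORNERS[s]) / 2.0
        points[i] = p
    return points

def occupancy(points: np.ndarray, grid: int = 32) -> float:
    """Fraction of grid cells visited: true-random input tends to fill
    the square, while structured input leaves characteristic gaps."""
    cells = np.floor(points * grid).astype(int).clip(0, grid - 1)
    return len({(int(x), int(y)) for x, y in cells}) / grid ** 2

rng = np.random.default_rng(3)
random_stream = rng.integers(0, 4, 50_000)        # stand-in for cipher output
periodic_stream = np.tile(np.arange(4), 12_500)   # highly structured input
print(occupancy(chaos_game(random_stream)))    # close to 1.0
print(occupancy(chaos_game(periodic_stream)))  # tiny: orbit collapses to a cycle
```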

  7. Proficiency examination in English language: Needs analysis and methodological proposals

    Directory of Open Access Journals (Sweden)

    Maria Elizete Luz Saes

    2014-04-01

    The purpose of this work is to provide tools for reflection on some of the learning difficulties students present when they take English proficiency examinations, as well as to suggest methodological proposals that can be implemented in didactic support groups, tutoring, or classrooms, by means of face-to-face or distance learning activities. The observations resulting from student performance motivated the preparation of this paper, whose theoretical assumptions are based on needs analysis of the target audience, the exploration of oral and written discursive genres, and the possibilities of interaction provided by technological mediation.

  8. METHODOLOGY FOR ANALYSIS OF DECISION MAKING IN AIR NAVIGATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2011-03-01

    In research on the Air Navigation System as a complex socio-technical system, a methodology for the analysis of human-operator decision-making has been developed. The significance of individual psychological factors, as well as the impact of socio-psychological factors on the professional activities of a human operator during the development of a flight situation from normal to catastrophic, was analyzed. On the basis of the reflexive theory of bipolar choice, the expected risks of decision-making by the Air Navigation System operator under the influence of the external environment, previous experience and intentions were identified. Methods for the analysis of decision-making by the human operator of the Air Navigation System using stochastic networks have been developed. Keywords: Air Navigation System, bipolar choice, human operator, decision-making, expected risk, individual psychological factors, methodology of analysis, reflexive model, socio-psychological factors, stochastic network.

  9. A Finite-Volume "Shaving" Method for Interfacing NASA/DAO's Physical Space Statistical Analysis System to the Finite-Volume GCM with a Lagrangian Control-Volume Vertical Coordinate

    Science.gov (United States)

    Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)

    2001-01-01

    Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.

  10. National survey of drinking and driving attitudes and behaviors : 2008. Volume 3, methodology report

    Science.gov (United States)

    2010-08-01

    This report presents the details of the methodology used for the 2008 National Survey of Drinking and Driving Attitudes and Behaviors conducted by Gallup, Inc. for : the National Highway Traffic Safety Administration (NHTSA). This survey represents t...

  11. Plasma volume methodology: Evans blue, hemoglobin-hematocrit, and mass density transformations

    Science.gov (United States)

    Greenleaf, J. E.; Hinghofer-Szalkay, H.

    1985-01-01

    Methods for measuring absolute levels and changes in plasma volume are presented along with derivations of pertinent equations. Reduction in variability of the Evans blue dye dilution technique using chromatographic column purification suggests that the day-to-day variability in the plasma volume in humans is less than ±20 ml. Mass density determination using the mechanical-oscillator technique provides a method for measuring vascular fluid shifts continuously, for assessing the density of the filtrate, and for quantifying movements of protein across microvascular walls. Equations for the calculation of volume and density of shifted fluid are presented.
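
    The hemoglobin-hematocrit transformation named in the title is conventionally written in the form popularized by Dill and Costill (1974); quoting it here is an editorial addition for context, since the record does not reproduce its equations. With hemoglobin (Hb) and hematocrit (Hct) measured before (b) and after (a) an intervention, the percentage change in plasma volume is

```latex
\%\Delta PV \;=\; 100\left[\frac{\mathrm{Hb}_b}{\mathrm{Hb}_a}\cdot\frac{1-\mathrm{Hct}_a}{1-\mathrm{Hct}_b}\;-\;1\right].
```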

  12. Development of a heat exchanger root-cause analysis methodology

    International Nuclear Information System (INIS)

    Jarrel, D.B.

    1989-01-01

    The objective of this work is to determine a generic methodology for approaching the accurate identification of the root cause of component failure. Root-cause determinations are an everyday challenge to plant personnel, but they are handled with widely differing degrees of success due to the approaches, levels of diagnostic expertise, and documentation. The criterion for success is simple: If the root cause of the failure has truly been determined and corrected, the same causal failure relationship will not be demonstrated again in the future. The approach to root-cause analysis (RCA) element definition was to first selectively choose and constrain a functionally significant component (in this case a component cooling water to service water heat exchanger) that has demonstrated prevalent failures. Then a root cause of failure analysis was performed by a systems engineer on a large number of actual failure scenarios. The analytical process used by the engineer was documented and evaluated to abstract the logic model used to arrive at the root cause. For the case of the heat exchanger, the actual root-cause diagnostic approach is described. A generic methodology for the solution of the root cause of component failure is demonstrable for this general heat exchanger sample

  13. A methodology for reliability analysis in health networks.

    Science.gov (United States)

    Spyrou, Stergiani; Bamidis, Panagiotis D; Maglaveras, Nicos; Pangalos, George; Pappas, Costas

    2008-05-01

    A reliability model for the health care domain, based on requirements analysis at the early design stage of a regional health network (RHN), is introduced. RHNs are considered as systems supporting the services provided by health units, hospitals, and the regional authority. Reliability assessment in the health care domain constitutes a field of quality assessment for RHNs. A novel approach for predicting system reliability in the early stage of designing RHN systems is presented in this paper. The uppermost scope is to identify the critical processes of an RHN system prior to its implementation. In the methodology, Unified Modeling Language activity diagrams are used to identify megaprocesses at the regional level, and the customer behavior model graph (CBMG) is used to describe the state transitions of the processes. The CBMG is annotated with: 1) the reliability of each component state and 2) the transition probabilities between states within the scope of the life cycle of the process. A stochastic reliability model (Markov model) is applied to predict the reliability of the business process, as well as to identify the critical states and compare them with other processes to reveal the most critical ones. The ultimate benefit of the applied methodology is the design of more reliable components in an RHN system. The innovation of this approach to reliability modeling lies in the analysis of severity classes of failures and the application of stochastic modeling using discrete-time Markov chains in RHNs.
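
    The discrete-time Markov treatment can be sketched directly: with "completed" and "failed" as absorbing states, the probability that a process instance ends in failure follows from the absorption relation B = (I - Q)^-1 R. The three-state process and its numbers below are a toy example, not one of the paper's RHN megaprocesses.

```python
import numpy as np

# Transient states: 0=register, 1=examine, 2=report.
# Q: transitions among transient states; R: transitions to the absorbing
# states [completed, failed], including per-state failure probabilities.
Q = np.array([
    [0.00, 0.95, 0.00],
    [0.00, 0.10, 0.85],
    [0.00, 0.00, 0.00],
])
R = np.array([
    [0.00, 0.05],
    [0.00, 0.05],
    [0.97, 0.03],
])

B = np.linalg.solve(np.eye(3) - Q, R)   # absorption probabilities
print(f"P(complete | start at register) = {B[0, 0]:.4f}")
print(f"P(fail     | start at register) = {B[0, 1]:.4f}")
```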

  14. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Many cost overruns in the world of construction are attributable either to unforeseen events or to foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement in project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for the appropriate allocation of contingency in project cost estimation. In the first step, project risks are identified. An influence diagramming technique is employed to identify and show how the risks affect the project cost elements, as well as the relationships among the risks themselves. The second step is to assess the project costs with regard to the risks under consideration. Using a linguistic approach, the degree of uncertainty of identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. In the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingency, i.e. project contingency and management reserve, are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.
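
    A common quantitative counterpart to this kind of allocation is a Monte Carlo cost-risk run in which project contingency is read off a chosen percentile of the simulated total risk impact, with management reserve covering a higher percentile. The risk register and percentiles below are invented for illustration and do not reproduce the paper's linguistic method.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 100_000
base_estimate = 10_000_000  # currency units

# Hypothetical risk register: (probability, min, most likely, max impact).
risks = [
    (0.30, 100_000, 250_000, 800_000),
    (0.15, 200_000, 500_000, 1_500_000),
    (0.50, 50_000, 120_000, 300_000),
]

total_impact = np.zeros(N)
for p, lo, ml, hi in risks:
    occurs = rng.random(N) < p
    total_impact += occurs * rng.triangular(lo, ml, hi, N)

contingency = np.percentile(total_impact, 80)            # project contingency (P80)
reserve = np.percentile(total_impact, 95) - contingency  # management reserve
print(f"P80 project contingency: {contingency:,.0f}")
print(f"management reserve (P95 - P80): {reserve:,.0f}")
```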

  15. Contemporary Impact Analysis Methodology for Planetary Sample Return Missions

    Science.gov (United States)

    Perino, Scott V.; Bayandor, Javid; Samareh, Jamshid A.; Armand, Sasan C.

    2015-01-01

    Development of an Earth entry vehicle and the methodology created to evaluate the vehicle's impact landing response when returning to Earth is reported. NASA's future Mars Sample Return Mission requires a robust vehicle to return Martian samples back to Earth for analysis. The Earth entry vehicle is a proposed solution to this Mars mission requirement. During Earth reentry, the vehicle slows within the atmosphere and then impacts the ground at its terminal velocity. To protect the Martian samples, a spherical energy absorber called an impact sphere is under development. The impact sphere is composed of hybrid composite and crushable foam elements that endure large plastic deformations during impact and cause a highly nonlinear vehicle response. The developed analysis methodology captures a range of complex structural interactions and much of the failure physics that occurs during impact. Numerical models were created and benchmarked against experimental tests conducted at NASA Langley Research Center. The postimpact structural damage assessment showed close correlation between simulation predictions and experimental results. Acceleration, velocity, displacement, damage modes, and failure mechanisms were all effectively captured. These investigations demonstrate that the Earth entry vehicle has great potential in facilitating future sample return missions.

  16. Control volume based hydrocephalus research; analysis of human data

    Science.gov (United States)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed, along with the data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
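
    The integral balance underlying the approach is standard and worth stating; for a control volume V with bounding surface S around a brain compartment, mass conservation reads as below (quoted in its textbook form, which may differ in detail from the authors' formulation), with the analogous momentum balance supplying the pressure information extracted from MR velocity data.

```latex
\frac{d}{dt}\int_V \rho \, dV \;+\; \oint_S \rho\,\mathbf{u}\cdot\mathbf{n}\, dS \;=\; 0
```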

  17. Rectal cancer surgery: volume-outcome analysis.

    LENUS (Irish Health Repository)

    Nugent, Emmeline

    2010-12-01

    There is strong evidence supporting the importance of the volume-outcome relationship with respect to lung and pancreatic cancers. This relationship for rectal cancer surgery however remains unclear. We review the currently available literature to assess the evidence base for volume outcome in relation to rectal cancer surgery.

  18. Methodological appraisal of SPECT measurements of cerebral blood volume and cerebral tissue hematocrit. Chapter 25

    International Nuclear Information System (INIS)

    Sakai, Fumihiko

    1988-01-01

    In this communication a critical appraisal is given of the method for measuring cerebral blood volume (CBV) and cerebral hematocrit employing single-photon emission computed tomography (SPECT). 2 refs

  19. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    Science.gov (United States)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  20. Volume totalizers analysis of pipelines operated by TRANSPETRO National Operational Control Center; Analise de totalizadores de volume em oleodutos operados pelo Centro Nacional de Controle e Operacao da TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Aramaki, Thiago Lessa; Montalvao, Antonio Filipe Falcao [Petrobras Transporte S.A. (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Marques, Thais Carrijo [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil)

    2012-07-01

    This paper presents the results of, and the methodology for, analyzing differences in volume totalizers used in systems such as batch tracking and leak detection for pipelines operated by TRANSPETRO's National Operational Control Center (CNCO). In order to optimize this type of analysis, software was developed for the acquisition and processing of historical data using the methodology developed. The methodology takes into account the particularities encountered in systems operated by TRANSPETRO, more specifically by CNCO. (author)

  1. Big and complex data analysis methodologies and applications

    CERN Document Server

    2017-01-01

    This volume conveys some of the surprises, puzzles and success stories in high-dimensional and complex data analysis and related fields. Its peer-reviewed contributions showcase recent advances in variable selection, estimation and prediction strategies for a host of useful models, as well as essential new developments in the field. The continued and rapid advancement of modern technology now allows scientists to collect data of increasingly unprecedented size and complexity. Examples include epigenomic data, genomic data, proteomic data, high-resolution image data, high-frequency financial data, functional and longitudinal data, and network data. Simultaneous variable selection and estimation is one of the key statistical problems involved in analyzing such big and complex data. The purpose of this book is to stimulate research and foster interaction between researchers in the area of high-dimensional data analysis. More concretely, its goals are to: 1) highlight and expand the breadth of existing methods in...

  2. SLSF loop handling system. Volume I. Structural analysis. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, H.; Cowie, A.; Ma, D.

    1978-10-01

    SLSF loop handling system was analyzed for deadweight and postulated dynamic loading conditions, identified in Chapters II and III in Volume I of this report, using a linear elastic static equivalent method of stress analysis. Stress analysis of the loop handling machine is presented in Volume I of this report. Chapter VII in Volume I of this report is a contribution by EG and G Co., who performed the work under ANL supervision.

  3. SLSF loop handling system. Volume I. Structural analysis

    International Nuclear Information System (INIS)

    Ahmed, H.; Cowie, A.; Ma, D.

    1978-10-01

    SLSF loop handling system was analyzed for deadweight and postulated dynamic loading conditions, identified in Chapters II and III in Volume I of this report, using a linear elastic static equivalent method of stress analysis. Stress analysis of the loop handling machine is presented in Volume I of this report. Chapter VII in Volume I of this report is a contribution by EG and G Co., who performed the work under ANL supervision

  4. SCALE-4 analysis of pressurized water reactor critical configurations. Volume 1: Summary

    International Nuclear Information System (INIS)

    DeHart, M.D.

    1995-03-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original fresh composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized water reactors (PWR). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power's Surry Unit 1 reactor for the Cycle 2 core; Volume 4 documents the calculations performed based on GPU Nuclear Corporation's Three Mile Island Unit 1 Cycle 5 core; and, lastly, Volume 5 describes the analysis of Virginia Power's North Anna Unit 1 Cycle 5 core. Each of the reactor-specific volumes provides the details of calculations performed to determine the effective multiplication factor for each reactor core for one or more critical configurations using the SCALE-4 system; these results are summarized in this volume. Differences between the core designs and their possible impact on the criticality calculations are also discussed. Finally, results are presented for additional analyses performed to verify that solutions were sufficiently converged

  5. Methodology to analysis of aging processes of containment spray system

    International Nuclear Information System (INIS)

    Borges, D. da Silva; Lava, D.D.; Moreira, M. de L.; Ferreira Guimarães, A.C.; Fernandes da Silva, L.

    2015-01-01

    This paper presents a contribution to the study of the aging process of components in commercial Pressurized Water Reactor (PWR) plants. The motivation for this work emerged from the current nuclear perspective: numerous nuclear power plants worldwide have reached an advanced operating age. This situation requires a process to ensure the reliability of the operating systems of these plants, and hence methodologies capable of estimating the failure probability of components and systems. In addition to the safety factors involved, such methodologies can be used to seek ways of extending the life cycle of nuclear plants, which would otherwise inevitably undergo decommissioning after an operating time of 40 years. Decommissioning negatively affects power generation and demands an enormous investment. Thus, this paper presents modeling techniques and sensitivity analysis which together can generate an estimate of how the components most sensitive to the aging process will behave during the normal operating cycle of a nuclear power plant. (authors)
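
    A minimal quantitative companion to such aging studies is a Weibull failure model whose shape parameter exceeds one, so that the hazard rate grows with operating time. The sketch below uses invented parameters, not plant data or the paper's models.

```python
import math

def weibull_failure_prob(t_hours: float, eta: float, beta: float) -> float:
    """Cumulative failure probability F(t) = 1 - exp(-(t/eta)^beta).
    beta > 1 models wear-out (aging); beta = 1 is a constant hazard."""
    return 1.0 - math.exp(-((t_hours / eta) ** beta))

# Hypothetical characteristic life and shape for an aging component.
eta, beta = 350_000.0, 2.2
for years in (10, 20, 30, 40):
    t = years * 8766.0  # operating hours, assuming continuous service
    print(f"{years:2d} y: F(t) = {weibull_failure_prob(t, eta, beta):.3f}")
```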

  6. Using of BEPU methodology in a final safety analysis report

    International Nuclear Information System (INIS)

    Menzel, Francine; Sabundjian, Gaiane; D'auria, Francesco; Madeira, Alzira A.

    2015-01-01

    Nuclear Reactor Safety (NRS) has been established as a discipline since the discovery of nuclear fission, and the occurrence of accidents in nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach that addresses the uncertainties of the results. This work aims to give an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper demonstrates the background of the licensing process through the main licensing requirements. (author)
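
    BEPU evaluations conventionally size their uncertainty runs with Wilks' nonparametric formula; the abstract does not name it, so this is quoted as standard context rather than as the authors' method. For a one-sided tolerance limit covering a fraction γ of the output population with confidence β, the minimum number of code runs n satisfies

```latex
1-\gamma^{\,n}\;\ge\;\beta \quad\Longrightarrow\quad n\;\ge\;\frac{\ln(1-\beta)}{\ln\gamma},
```

    so the familiar first-order 95%/95% criterion gives n ≥ ln(0.05)/ln(0.95) ≈ 58.4, i.e. 59 runs.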

  7. Using of BEPU methodology in a final safety analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Menzel, Francine; Sabundjian, Gaiane, E-mail: fmenzel@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); D' auria, Francesco, E-mail: f.dauria@ing.unipi.it [Universita degli Studi di Pisa, Gruppo di Ricerca Nucleare San Piero a Grado (GRNSPG), Pisa (Italy); Madeira, Alzira A., E-mail: alzira@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    Nuclear Reactor Safety (NRS) has been established as a discipline since the discovery of nuclear fission, and the occurrence of accidents in nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach that addresses the uncertainties of the results. This work aims to give an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper demonstrates the background of the licensing process through the main licensing requirements. (author)

  8. Analysis of the low-altitude proton flux asymmetry: methodology

    CERN Document Server

    Kruglanski, M

    1999-01-01

    Existing East-West asymmetry models of the trapped proton fluxes at low altitudes depend on the local magnetic dip angle and a density scale height derived from atmospheric models. We propose an alternative approach which maps the directional flux over a drift shell (B_m, L) in terms of the local pitch and azimuthal angles alpha and beta, where beta is defined in the local mirror plane as the angle between the proton arrival direction and the surface normal to the drift shell. This approach has the advantage that it only depends on drift shell parameters and does not involve an atmosphere model. A semi-empirical model based on the new methodology is able to reproduce the angular distribution of a set of SAMPEX/PET proton flux measurements. Guidelines are proposed for spacecraft missions and data analysis procedures that are intended to be used for the building of new trapped radiation environment models.

  9. Computational methodology for ChIP-seq analysis

    Science.gov (United States)

    Shin, Hyunjin; Liu, Tao; Duan, Xikun; Zhang, Yong; Liu, X. Shirley

    2015-01-01

    Chromatin immunoprecipitation coupled with massive parallel sequencing (ChIP-seq) is a powerful technology to identify the genome-wide locations of DNA binding proteins such as transcription factors or modified histones. As more and more experimental laboratories are adopting ChIP-seq to unravel the transcriptional and epigenetic regulatory mechanisms, computational analyses of ChIP-seq also become increasingly comprehensive and sophisticated. In this article, we review current computational methodology for ChIP-seq analysis, recommend useful algorithms and workflows, and introduce quality control measures at different analytical steps. We also discuss how ChIP-seq could be integrated with other types of genomic assays, such as gene expression profiling and genome-wide association studies, to provide a more comprehensive view of gene regulatory mechanisms in important physiological and pathological processes. PMID:25741452

  10. Metabolic tumour volumes measured at staging in lymphoma: methodological evaluation on phantom experiments and patients

    International Nuclear Information System (INIS)

    Meignan, Michel; Sasanelli, Myriam; Itti, Emmanuel; Casasnovas, Rene Olivier; Luminari, Stefano; Fioroni, Federica; Coriani, Chiara; Masset, Helene; Gobbi, Paolo G.; Merli, Francesco; Versari, Annibale

    2014-01-01

    The presence of a bulky tumour at staging on CT is an independent prognostic factor in malignant lymphomas. However, its prognostic value is limited in diffuse disease. Total metabolic tumour volume (TMTV) determined on ¹⁸F-FDG PET/CT could give a better evaluation of the total tumour burden and may help patient stratification. Different methods of TMTV measurement established in phantoms simulating lymphoma tumours were investigated and validated in 40 patients with Hodgkin lymphoma and diffuse large B-cell lymphoma. Data were processed by two nuclear medicine physicians in Reggio Emilia and Creteil. Nineteen phantoms filled with ¹⁸F-saline were scanned; these comprised spherical or irregular volumes from 0.5 to 650 cm³ with tumour-to-background ratios from 1.65 to 40. Volumes were measured with different SUVmax thresholds. In patients, TMTV was measured on PET at staging by two methods: volumes of individual lesions were measured using a fixed 41% SUVmax threshold (TMTV41) and a variable visually adjusted SUVmax threshold (TMTVvar). In phantoms, the 41% threshold gave the best concordance between measured and actual volumes. Interobserver agreement was almost perfect. In patients, the agreement between the reviewers for TMTV41 measurement was substantial (ρc = 0.986, CI 0.97 - 0.99) and the difference between the means was not significant (212 ± 218 cm³ for Creteil vs. 206 ± 219 cm³ for Reggio Emilia, P = 0.65). By contrast, the agreement was poor for TMTVvar. There was a significant direct correlation between TMTV41 and normalized LDH (r = 0.652, CI 0.42 - 0.8, P [...] TMTV41, but high TMTV41 could be found in patients with stage 1/2 or nonbulky tumour. Measurement of baseline TMTV in lymphoma using a fixed 41% SUVmax threshold is reproducible and correlates with the other parameters for tumour mass evaluation. It should be evaluated in prospective studies. (orig.)
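
    The fixed-threshold method reduces to a few lines per lesion: segment at 41% of the lesion SUVmax and multiply the voxel count by the voxel volume. The sketch below assumes lesions have already been isolated into separate SUV arrays, which is the non-trivial part in practice.

```python
import numpy as np

def lesion_mtv(suv: np.ndarray, voxel_volume_ml: float) -> float:
    """Metabolic volume of one lesion: voxels >= 41% of the lesion SUVmax."""
    threshold = 0.41 * suv.max()
    return float(np.count_nonzero(suv >= threshold)) * voxel_volume_ml

def total_mtv(lesions: list[np.ndarray], voxel_volume_ml: float) -> float:
    """TMTV41: sum of per-lesion metabolic volumes."""
    return sum(lesion_mtv(lesion, voxel_volume_ml) for lesion in lesions)

# Hypothetical example: two lesions on a 4 x 4 x 4 mm voxel grid (0.064 ml).
rng = np.random.default_rng(5)
lesion_a = rng.gamma(2.0, 1.5, (20, 20, 20))   # stand-in SUV values
lesion_b = rng.gamma(2.0, 2.5, (15, 15, 15))
print(f"TMTV41 = {total_mtv([lesion_a, lesion_b], 0.064):.1f} ml")
```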

  11. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    This paper introduces a methodology for modeling business processes. Creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary business process modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained, together with the significant problems met during the project. Concluding from these problems and from the results of the methodology evaluation, the needed future development of the methodology is outlined.

  12. The Study of Productivity Measurement and Incentive Methodology (Phase III - Paper Test). Volume 1

    Science.gov (United States)

    1986-03-14

    ... The analogy of a crown of jewels might be used to illustrate the relationship. The models tested represent the jewels and are valuable in their own right. However, when the jewels are placed in the crown (models built into an integrated methodology), they take on added value. Recommendations: We ... through product price changes. The mathematical relationship remains just as true for a whole industry as for individual ...

  13. A Methodology for Selection of a Satellite Servicing Architecture. Volume 3. Appendices.

    Science.gov (United States)

    1985-12-01

    ... minimize pollution created by a system. Such things as water, air, and land pollution are desired to be minimized in a system selection. ... Public Safety ... Guy, 1983. note: SB will recirculate ... U76 - Life support mass requirements per person-time [kg/person*hr] (i.e., air, water, food, etc.), 0.1 < U76 < 1 ... to System Performance Evaluation Methodology. Philippines: Addison-Wesley, 1978. Koelle, Dietrich E., "The TRANSCOST Model for Estimation of Launch ...

  14. Methodology for the analysis and retirement of assets: Power transformers

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Gómez-Ramírez

    2015-09-01

    The following article develops a high-voltage engineering methodology for the analysis and retirement of repaired power transformers, based on engineering criteria, in order to establish a correlation between the conditions of the transformer from several points of view: electrical, mechanical, dielectric and thermal. An analysis of the state of the art reveals two situations of great significance. First, the international procedures are a "guide" for the acceptance of new transformers, so they cannot be applied literally to repaired transformers, owing to the degradation the transformer has suffered over the years and to all the factors that led to a possible repair. Second, based on the most recent technical literature, articles analyzing the dielectric oil and the paper were reviewed, in which correlations are established between the quality of the insulating paper and the furan concentrations in the oil. Finally, a large part of the research to date has focused on analyzing the transformer from the condition of the dielectric oil, since in most cases there has been no possibility of performing forensic engineering inside an operating transformer to analyze the design components that can compromise its integrity and operability.
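
    The paper-furan correlation mentioned above is often quantified with published regressions such as Chendong's, which estimates the degree of polymerization (DP) of the insulating paper from the 2-furfuraldehyde (2FAL) concentration in oil. Quoting it is an editorial addition; repaired units, mixed insulation and oil changes can all pull a transformer away from any single regression.

```python
import math

def dp_from_2fal_chendong(c_2fal_ppm: float) -> float:
    """Chendong (1991) correlation: log10(2FAL, ppm) = 1.51 - 0.0035 * DP,
    inverted to estimate paper DP from the oil's furan content."""
    return (1.51 - math.log10(c_2fal_ppm)) / 0.0035

for c in (0.1, 0.5, 1.0, 5.0):
    print(f"2FAL = {c:4.1f} ppm  ->  DP ~ {dp_from_2fal_chendong(c):4.0f}")
```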

  15. Transuranium analysis methodologies for biological and environmental samples

    International Nuclear Information System (INIS)

    Wessman, R.A.; Lee, K.D.; Curry, B.; Leventhal, L.

    1978-01-01

    Analytical procedures for the most abundant transuranium nuclides in the environment (i.e., plutonium and, to a lesser extent, americium) are available. There is a lack of procedures for doing sequential analysis for Np, Pu, Am, and Cm in environmental samples, primarily because of current emphasis on Pu and Am. Reprocessing requirements and waste disposal connected with the fuel cycle indicate that neptunium and curium must be considered in environmental radioactive assessments. Therefore it was necessary to develop procedures that determine all four of these radionuclides in the environment. The state of the art of transuranium analysis methodology as applied to environmental samples is discussed relative to different sample sources, such as soil, vegetation, air, water, and animals. Isotope-dilution analysis with ²⁴³Am (²³⁹Np) and ²³⁶Pu or ²⁴²Pu radionuclide tracers is used. Americium and curium are analyzed as a group, with ²⁴³Am as the tracer. Sequential extraction procedures employing bis(2-ethylhexyl)orthophosphoric acid (HDEHP) were found to result in lower yields and higher Am-Cm fractionation than ion-exchange methods.

  16. Life prediction methodology for ceramic components of advanced vehicular heat engines: Volume 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Khandelwal, P.K.; Provenzano, N.J.; Schneider, W.E. [Allison Engine Co., Indianapolis, IN (United States)

    1996-02-01

    One of the major challenges involved in the use of ceramic materials is ensuring adequate strength and durability. This activity has developed methodology which can be used during the design phase to predict the structural behavior of ceramic components. The effort involved the characterization of injection molded and hot isostatic pressed (HIPed) PY-6 silicon nitride, the development of nondestructive evaluation (NDE) technology, and the development of analytical life prediction methodology. Four failure modes are addressed: fast fracture, slow crack growth, creep, and oxidation. The techniques deal with failures initiating at the surface as well as internal to the component. The life prediction methodology for fast fracture and slow crack growth have been verified using a variety of confirmatory tests. The verification tests were conducted at room and elevated temperatures up to a maximum of 1371 °C. The tests involved (1) flat circular disks subjected to bending stresses and (2) high speed rotating spin disks. Reasonable correlation was achieved for a variety of test conditions and failure mechanisms. The predictions associated with surface failures proved to be optimistic, requiring re-evaluation of the components' initial fast fracture strengths. Correlation was achieved for the spin disks which failed in fast fracture from internal flaws. Time dependent elevated temperature slow crack growth spin disk failures were also successfully predicted.

  17. New Methodology of Block Cipher Analysis Using Chaos Game

    Directory of Open Access Journals (Sweden)

    Budi Sulistyo

    2014-11-01

    Block cipher analysis covers randomness analysis and cryptanalysis. This paper proposes a new method potentially useful for both. The method uses the true random sequence concept as a reference for measuring the randomness level of a random sequence, and on this basis defines bias as the violation of a random sequence from a true random sequence. The block cipher is treated as the mapping function of a discrete-time dynamical system; this framework makes it possible to apply various analysis techniques developed in the dynamical systems field. There are three main parts to the methodology presented in this paper: the dynamical system framework for block cipher analysis, a new chaos game scheme, and an extended measure concept related to chaos game and fractal analysis. The paper also presents the general procedures of the proposed method: symbolic dynamics analysis of the discrete dynamical system whose mapping function is the block cipher, random sequence construction, use of the random sequence as input to a chaos game scheme, measurement of the chaos game output using the extended measure concept, and analysis of the measurement results. The analysis of a specific real or sample block cipher, and the corresponding results, are beyond the scope of this paper.
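
    A minimal illustrative sketch of the classical chaos game driven by a bit stream, assuming pairs of bits select the corners of the unit square; the paper's specific scheme and its extended measure are not reproduced here. A truly random input should fill the square uniformly, so strong structure in the occupancy counts signals bias in the sequence (Python, for illustration only):

    import numpy as np

    def chaos_game(bits):
        """Map a bit stream to points via x_{n+1} = (x_n + corner) / 2."""
        corners = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)   # two bits per step
        idx = pairs[:, 0] * 2 + pairs[:, 1]                 # corner index 0-3
        pts = np.empty((len(idx), 2))
        x = np.array([0.5, 0.5])
        for i, c in enumerate(idx):
            x = (x + corners[c]) / 2.0
            pts[i] = x
        return pts

    rng = np.random.default_rng(3)
    pts = chaos_game(rng.integers(0, 2, size=10_000))
    occupancy, _, _ = np.histogram2d(pts[:, 0], pts[:, 1], bins=16)
    print("occupancy std/mean:", occupancy.std() / occupancy.mean())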

  18. A Methodology for Loading the Advanced Test Reactor Driver Core for Experiment Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cowherd, Wilson M.; Nielsen, Joseph W.; Choe, Dong O.

    2016-11-01

    In support of experiments in the ATR, a new methodology was devised for loading the ATR Driver Core. This methodology will replace the existing methodology used by the INL Neutronic Analysis group to analyze experiments. Studied in this paper was the as-run analysis for ATR Cycle 152B, specifically comparing measured lobe powers and eigenvalue calculations.

  19. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.

  20. Failure mode effect analysis and fault tree analysis as a combined methodology in risk management

    Science.gov (United States)

    Wessiani, N. A.; Yoshio, F.

    2018-04-01

    Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most studies choose only one of these two methods for their risk management methodology, whereas combining them reduces the drawbacks each method has when implemented separately. This paper combines FMEA and FTA into a single methodology for assessing risk, illustrated by a case study in a metal company. In the case study, the combined methodology assesses the internal risks that occur in the production process; those internal risks are then mitigated according to their risk levels.
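
    A minimal sketch of how the two techniques slot together, assuming hypothetical failure modes, ratings and probabilities (none taken from the case study): FMEA ranks failure modes by Risk Priority Number, while FTA combines basic-event probabilities through logic gates.

    def rpn(severity, occurrence, detection):
        """FMEA Risk Priority Number: each factor rated on a 1-10 scale."""
        return severity * occurrence * detection

    def or_gate(probs):
        """FTA OR gate: at least one independent basic event occurs."""
        p = 1.0
        for q in probs:
            p *= 1.0 - q
        return 1.0 - p

    def and_gate(probs):
        """FTA AND gate: all independent basic events occur."""
        p = 1.0
        for q in probs:
            p *= q
        return p

    # Hypothetical failure modes of a production process: (S, O, D) ratings.
    modes = {"misaligned die": (7, 4, 3), "overheated furnace": (9, 2, 5)}
    for name, (s, o, d) in modes.items():
        print(name, "RPN =", rpn(s, o, d))

    # Hypothetical top event fed by two independent basic events.
    print("P(top event) =", or_gate([0.01, 0.005]))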

  1. Methodology for calculating the volume of condensate droplets on topographically modified, microgrooved surfaces.

    Science.gov (United States)

    Sommers, A D

    2011-05-03

    Liquid droplets on micropatterned surfaces consisting of parallel grooves tens of micrometers in width and depth are considered, and a method for calculating the droplet volume on these surfaces is presented. This model, which utilizes the elongated and parallel-sided nature of droplets condensed on these microgrooved surfaces, requires inputs from two droplet images at ϕ = 0° and ϕ = 90°--namely, the droplet major axis, minor axis, height, and two contact angles. In this method, a circular cross-sectional area is extruded the length of the droplet where the chord of the extruded circle is fixed by the width of the droplet. The maximum apparent contact angle is assumed to occur along the side of the droplet because of the surface energy barrier to wetting imposed by the grooves--a behavior that was observed experimentally. When applied to water droplets condensed onto a microgrooved aluminum surface, this method was shown to calculate the actual droplet volume to within 10% for 88% of the droplets analyzed. This method is useful for estimating the volume of retained droplets on topographically modified, anisotropic surfaces where both heat and mass transfer occur and the surface microchannels are aligned parallel to gravity to assist in condensate drainage.
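
    A minimal sketch of the extrusion calculation under simplifying assumptions: a uniform circular-segment cross section whose chord equals the droplet width and whose contact angle is the maximum apparent angle at the droplet side (the published method also uses the second image to refine the inputs):

    import math

    def droplet_volume(width, length, theta_deg):
        """Extruded circular-segment volume: width and length in metres,
        theta_deg the apparent side contact angle in degrees (< 180)."""
        theta = math.radians(theta_deg)
        r = width / (2.0 * math.sin(theta))   # radius of the circular cap
        segment_area = r * r * (theta - math.sin(theta) * math.cos(theta))
        return segment_area * length          # extrude along the groove axis

    # Example: a 1.0 mm wide, 2.5 mm long droplet with a 100 degree side angle.
    print(droplet_volume(1.0e-3, 2.5e-3, 100.0), "m^3")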

  2. The analysis of RWAP(Rod Withdrawal at Power) using the KEPRI methodology

    International Nuclear Information System (INIS)

    Yang, C. K.; Kim, Y. H.

    2001-01-01

    KEPRI developed a new methodology based on RASP (Reactor Analysis Support Package). In this paper, the analysis of the RWAP (Rod Withdrawal at Power) accident, which can result in reactivity and power distribution anomalies, was performed using the KEPRI methodology. The calculation describes the RWAP transient and documents the analysis, including the computer code modeling assumptions and input parameters used. To validate the new methodology, the results were compared with the FSAR: the results obtained with the KEPRI methodology are similar to the FSAR's, and the sensitivity results for the postulated parameters are similar to those of the existing methodology.

  3. Metabolic tumour volumes measured at staging in lymphoma: methodological evaluation on phantom experiments and patients

    Energy Technology Data Exchange (ETDEWEB)

    Meignan, Michel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Paris-Est University, Service de Medecine Nucleaire, EAC CNRS 7054, Hopital Henri Mondor AP-HP, Creteil (France); Sasanelli, Myriam; Itti, Emmanuel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Casasnovas, Rene Olivier [CHU Le Bocage, Department of Hematology, Dijon (France); Luminari, Stefano [University of Modena and Reggio Emilia, Department of Diagnostic, Clinic and Public Health Medicine, Modena (Italy); Fioroni, Federica [Santa Maria Nuova Hospital-IRCCS, Department of Medical Physics, Reggio Emilia (Italy); Coriani, Chiara [Santa Maria Nuova Hospital-IRCCS, Department of Radiology, Reggio Emilia (Italy); Masset, Helene [Henri Mondor Hospital, Department of Radiophysics, Creteil (France); Gobbi, Paolo G. [University of Pavia, Department of Internal Medicine and Gastroenterology, Fondazione IRCCS Policlinico San Matteo, Pavia (Italy); Merli, Francesco [Santa Maria Nuova Hospital-IRCCS, Department of Hematology, Reggio Emilia (Italy); Versari, Annibale [Santa Maria Nuova Hospital-IRCCS, Department of Nuclear Medicine, Reggio Emilia (Italy)

    2014-06-15

    The presence of a bulky tumour at staging on CT is an independent prognostic factor in malignant lymphomas. However, its prognostic value is limited in diffuse disease. Total metabolic tumour volume (TMTV) determined on ¹⁸F-FDG PET/CT could give a better evaluation of the total tumour burden and may help patient stratification. Different methods of TMTV measurement established in phantoms simulating lymphoma tumours were investigated and validated in 40 patients with Hodgkin lymphoma and diffuse large B-cell lymphoma. Data were processed by two nuclear medicine physicians in Reggio Emilia and Creteil. Nineteen phantoms filled with ¹⁸F-saline were scanned; these comprised spherical or irregular volumes from 0.5 to 650 cm³ with tumour-to-background ratios from 1.65 to 40. Volumes were measured with different SUVmax thresholds. In patients, TMTV was measured on PET at staging by two methods: volumes of individual lesions were measured using a fixed 41% SUVmax threshold (TMTV41) and a variable visually adjusted SUVmax threshold (TMTVvar). In phantoms, the 41% threshold gave the best concordance between measured and actual volumes. Interobserver agreement was almost perfect. In patients, the agreement between the reviewers for TMTV41 measurement was substantial (ρc = 0.986, CI 0.97-0.99) and the difference between the means was not significant (212 ± 218 cm³ for Creteil vs. 206 ± 219 cm³ for Reggio Emilia, P = 0.65). By contrast the agreement was poor for TMTVvar. There was a significant direct correlation between TMTV41 and normalized LDH (r = 0.652, CI 0.42-0.8, P < 0.001). Higher disease stages and bulky tumour were associated with higher TMTV41, but high TMTV41 could be found in patients with stage 1/2 or nonbulky tumour. Measurement of baseline TMTV in lymphoma using a fixed 41% SUVmax threshold is reproducible and correlates with the other parameters for tumour mass evaluation.
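
    A minimal sketch of the fixed-threshold rule, assuming each lesion has already been isolated as a 3-D array of SUV values (lesion segmentation and scanner calibration are outside the snippet) and a hypothetical voxel size:

    import numpy as np

    def lesion_mtv(suv, voxel_volume_ml, frac=0.41):
        """Metabolic tumour volume: voxels at or above frac * SUVmax of the lesion."""
        return float((suv >= frac * suv.max()).sum()) * voxel_volume_ml

    # Illustrative stand-in lesions; a 4 x 4 x 4 mm voxel = 0.064 ml (hypothetical).
    rng = np.random.default_rng(0)
    lesions = [rng.gamma(2.0, 2.0, size=(20, 20, 20)) for _ in range(3)]
    tmtv = sum(lesion_mtv(les, voxel_volume_ml=0.064) for les in lesions)
    print(f"TMTV = {tmtv:.1f} ml")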

  4. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Because tie breakers are set in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis with fault tree methodology was also performed. The results showed that the setup of tie breakers was rational and necessary, and that, compared with fault tree methodology, GO methodology made the modeling much easier and the resulting chart much more succinct for the reliability analysis of the power supply system. (author)

  5. Cost analysis in health centers using 'Step Down' methodology

    Directory of Open Access Journals (Sweden)

    Matejić S.

    2015-01-01

    Health care reform aims to improve health system performance, one objective being to reduce costs by increasing the efficiency of health care provision. Performance improvement implies accepting innovation in all health care activities, including health care financial management, and successful implementation of health care financing reform requires a prior analysis of costs and activities in health institutions. In this work we performed a comparative cost analysis of 27 health institutions by applying an innovative system for analyzing and controlling the costs of health care services. A spreadsheet system was first built, using the internationally recognized 'Step Down' methodology for cost control and analysis in hospitals, and adapted for primary health care institutions. Results achieved: the dominant cost, employees' salaries, averages around 80% and does not depend on the size of the primary health institution (Dom zdravlja); significant differences exist in the percentage shares of the costs of medicines, medical supplies and diagnostic services; there is an obvious difference in the percentage share of technical maintenance costs, resulting from the uneven percentage of non-medical employees, differences in infrastructure organization, differences in the condition and type of equipment, differences in the type of space heating and heating fuel, and patient transportation obligations, especially for home treatment and polyvalent patronage services. There is a large difference in the average cost per outpatient examination, a consequence of the uneven number of services performed, especially in dentistry, and a significant difference in the number of preventive health examinations performed, which has a direct impact on their cost. The main conclusion of the analysis is that the existing disparities in costs can jeopardize implementation of Primary health
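
    A minimal sketch of the 'Step Down' principle with hypothetical departments and allocation bases: support-department costs are cascaded once, in a fixed order, onto the departments that follow them, never backwards.

    def step_down(direct_costs, bases, support_order, final_departments):
        """Cascade support-department costs onto later departments.

        direct_costs      -- {department: direct cost}
        bases             -- {support department: {receiver: basis weight}}
        support_order     -- support departments in allocation order (allocated
                             once, never receiving back from later departments)
        final_departments -- revenue-producing departments absorbing all costs
        """
        costs = dict(direct_costs)
        for sup in support_order:
            pool = costs.pop(sup)
            receivers = {d: w for d, w in bases[sup].items() if d in costs}
            total = sum(receivers.values())
            for d, w in receivers.items():
                costs[d] += pool * w / total
        return {d: costs[d] for d in final_departments}

    # Hypothetical departments with illustrative bases (e.g. staff counts).
    direct = {"maintenance": 100.0, "administration": 200.0,
              "general practice": 500.0, "dentistry": 300.0}
    bases = {
        "maintenance": {"administration": 1, "general practice": 6, "dentistry": 3},
        "administration": {"general practice": 7, "dentistry": 3},
    }
    print(step_down(direct, bases, ["maintenance", "administration"],
                    ["general practice", "dentistry"]))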

  6. FRACTAL ANALYSIS OF TRABECULAR BONE: A STANDARDISED METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ian Parkinson

    2011-05-01

    A standardised methodology for the fractal analysis of histological sections of trabecular bone has been established. A modified box counting method has been developed for use on a PC-based image analyser (Quantimet 500MC, Leica Cambridge). The effect of image analyser settings, magnification, image orientation and threshold levels was determined. Also, the range of scale over which trabecular bone is effectively fractal was determined, and a method was formulated to objectively calculate more than one fractal dimension from the modified Richardson plot. The results show that magnification, image orientation and threshold settings have little effect on the estimate of fractal dimension. Trabecular bone has a lower limit below which it is not fractal (λ < 25 μm) and the upper limit is 4250 μm. There are three distinct fractal dimensions for trabecular bone (sectional fractals), with magnitudes greater than 1.0 and less than 2.0. It has been shown that trabecular bone is effectively fractal over a defined range of scale and that, within this range, there is more than one fractal dimension describing spatial structural entities. Fractal analysis is a model-independent method for describing a complex multifaceted structure, which can be adapted for the study of other biological systems, whether at the cell, tissue or organ level, and complements conventional histomorphometric and stereological techniques.
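
    A minimal box-counting sketch on a binary image, assuming a square array; the paper's modified method on the Quantimet analyser additionally restricts the fit to the scale range over which bone is fractal and extracts several sectional dimensions, which is not reproduced here.

    import numpy as np

    def box_count(image, box):
        """Number of box x box cells containing at least one foreground pixel."""
        s = image.shape[0] // box
        trimmed = image[:s * box, :s * box]
        return int(trimmed.reshape(s, box, s, box).any(axis=(1, 3)).sum())

    def box_dimension(image, boxes=(2, 4, 8, 16, 32, 64)):
        """Slope of log N(box) against log(1/box): the box-counting dimension."""
        sizes = np.array(boxes, dtype=float)
        counts = np.array([box_count(image, b) for b in boxes], dtype=float)
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        return slope

    rng = np.random.default_rng(0)
    img = rng.random((256, 256)) > 0.7   # stand-in for a thresholded section
    print(f"estimated dimension: {box_dimension(img):.2f}")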

  7. Methodological frontier in operational analysis for roundabouts: a review

    Directory of Open Access Journals (Sweden)

    Orazio Giuffre'

    2016-11-01

    Several studies have shown that modern roundabouts are safe and effective engineering countermeasures for traffic calming, and they are now widely used worldwide. The increasing use of roundabouts and, more recently, of turbo and flower roundabouts has produced a great variety of experience in the fields of intersection design, traffic safety and capacity modelling. As for unsignalized intersections, which represent the starting point for extending knowledge of operational analysis to roundabouts, the general situation in capacity estimation is still characterized by the debate between gap acceptance models and empirical regression models. However, capacity modelling must contain both the analytical construction and solution of the model and the implementation of driver behavior; thus, issues in realistically modelling driver behavior through the parameters included in the models are always of interest to practitioners and analysts in transportation and road infrastructure engineering. Based on these considerations, this paper presents a literature review of the key methodological issues in the operational analysis of modern roundabouts. Focus is placed on the aspects associated with gap acceptance behavior, the derivation of the analytically based models and the calculation of the parameters included in the capacity equations, as well as steady-state and non-steady-state conditions and uncertainty in entry capacity estimation. Finally, insights on future developments of research in this field are also outlined.

  8. 50 Years of coastal erosion analysis: A new methodological approach.

    Science.gov (United States)

    Prieto Campos, Antonio; Diaz Cuevas, Pilar; Ojeda Zujar, Jose; Guisado-Pintado, Emilia

    2017-04-01

    Coasts around the world have been subjected to increasing anthropogenic pressures which, combined with the impacts of natural hazards (storm events, rising sea levels), have led to severe erosion problems with negative impacts on the economy and the safety of coastal communities. The Andalusian coast (South Spain) is a renowned global tourist destination. In the past decades a deep transformation in the economic model led to significant land use changes: strong regulation of rivers, urbanisation and occupation of dunes, among others. As a result, local authorities must now face the irreversible transformations of the coastline caused by this aggressive urbanisation, suffered by locals and visitors alike. Moreover, the expected impacts of climate change, aggravated by anthropogenic activities, emphasise the need for tools that facilitate decision making for sustainable coastal management. In this contribution a homogeneous methodology (a single proxy and a single photointerpreter) is proposed for the calculation of coastal erosion rates of exposed beaches in Andalusia (640 km), through the use of detailed series (1:2500) of open-source orthophotographs for the period 1956-1977-2001-2011. The combination of the traditional DSAS software (Digital Shoreline Analysis System) with a spatial database (PostgreSQL) that integrates the resulting erosion rates with related coastal thematic information (geomorphology, presence of engineering infrastructures, dunes and ecosystems) enhances the capacity for analysis and exploitation. Further, the homogeneity of the method allows comparison of results across years on a highly diverse coast with both Mediterranean and Atlantic façades. The novel development and integration of a PostgreSQL/PostGIS database facilitates the exploitation of the results by the user (for instance by relating calculated rates with other thematic information such as coastal geomorphology or the presence of a dune field on
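
    A minimal sketch of the transect-rate arithmetic that DSAS automates, assuming hypothetical shoreline positions (metres along each transect) already digitised from the orthophotographs for the four survey years:

    import numpy as np

    years = np.array([1956.0, 1977.0, 2001.0, 2011.0])
    # Rows: transects; columns: shoreline position per survey (illustrative).
    positions = np.array([[120.0, 112.5, 101.0, 96.0],
                          [ 80.0,  79.0,  74.5, 70.0]])

    # End Point Rate: net movement between first and last survey (m/yr).
    epr = (positions[:, -1] - positions[:, 0]) / (years[-1] - years[0])

    # Linear Regression Rate: least-squares slope through all surveys (m/yr).
    lrr = np.polyfit(years, positions.T, 1)[0]

    for i, (e, l) in enumerate(zip(epr, lrr)):
        print(f"transect {i}: EPR = {e:+.2f} m/yr, LRR = {l:+.2f} m/yr")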

  9. Compliance strategy for statistically based neutron overpower protection safety analysis methodology

    International Nuclear Information System (INIS)

    Holliday, E.; Phan, B.; Nainer, O.

    2009-01-01

    The methodology employed in the safety analysis of the slow Loss of Regulation (LOR) event in the OPG and Bruce Power CANDU reactors, referred to as Neutron Overpower Protection (NOP) analysis, is a statistically based methodology. Further enhancement to this methodology includes the use of Extreme Value Statistics (EVS) for the explicit treatment of aleatory and epistemic uncertainties, and probabilistic weighting of the initial core states. A key aspect of this enhanced NOP methodology is to demonstrate adherence, or compliance, with the analysis basis. This paper outlines a compliance strategy capable of accounting for the statistical nature of the enhanced NOP methodology. (author)

  10. Modeling methodology for supply chain synthesis and disruption analysis

    Science.gov (United States)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of a synthesized supply chain system can be evaluated quantitatively; (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.
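
    A minimal sketch of reachability analysis on a discrete state graph, with hypothetical supply-chain states and transitions standing in for the paper's hierarchical models:

    from collections import deque

    def reachable(transitions, start):
        """Breadth-first search: the set of states reachable from start."""
        seen, queue = {start}, deque([start])
        while queue:
            state = queue.popleft()
            for nxt in transitions.get(state, ()):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    # Hypothetical disruption states of a two-tier supply chain.
    transitions = {
        "normal": ["supplier delay"],
        "supplier delay": ["plant starved", "normal"],
        "plant starved": ["order backlog"],
        "order backlog": ["normal"],
    }
    print("reachable from 'normal':", sorted(reachable(transitions, "normal")))
    # A disruption is recoverable if 'normal' is reachable from the disrupted state.
    print("recoverable from 'plant starved':",
          "normal" in reachable(transitions, "plant starved"))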

  11. A METHODOLOGICAL APPROACH TO THE STRATEGIC ANALYSIS OF FOOD SECURITY

    Directory of Open Access Journals (Sweden)

    Anastasiia Mostova

    2017-12-01

    The objective of the present work is to substantiate the use of strategic analysis tools for developing a strategy for the country's food security under current conditions, and to devise the author's original technique for performing a strategic analysis of food security using a SWOT analysis. The methodology of the study. The article substantiates the need for strategic planning of food security, considers the stages of strategic planning, and explains the importance of the strategic analysis stage for the country's food security, proposing that a SWOT analysis be applied. The study is based on a system of indicators and characteristics of the country's economy, agricultural sector, market trends, and material-technical, financial and human resources, which are essential for an objective assessment of the impact of trends and factors on food security and for further developing the procedure for strategic analysis of the country's food security. Results of the study. The procedure for strategic analysis of food security is developed on the basis of the SWOT analysis tool and comprises three stages: a strategic analysis of weaknesses and strengths, opportunities and threats; construction of the matrix of weaknesses, strengths, opportunities and threats (the SWOT matrix); and formation of the food security strategy based on that matrix. A list of characteristics was compiled in order to conduct the strategic analysis of food security and to categorize them as strengths or weaknesses, threats, or opportunities. The characteristics are organized into strategic groups (production, market, resources, consumption), which is necessary for objectively establishing strategic directions, responsible performers, allocation of resources, and effective control, for the purpose of further development and implementation of the strategy. A strategic analysis

  12. Iterative Transport-Diffusion Methodology For LWR Core Analysis

    Science.gov (United States)

    Colameco, David; Ivanov, Boyan D.; Beacon, Daniel; Ivanov, Kostadin N.

    2014-06-01

    This paper presents an update on the development of an advanced methodology for core calculations that uses local heterogeneous solutions for on-the-fly nodal cross-section generation. The Iterative Transport-Diffusion Method is an embedded transport approach that is expected to provide results with near 3D transport accuracy for a fraction of the time required by a full 3D transport method. In this methodology, the infinite environment used for homogenized nodal cross-section generation is replaced with a simulated 3D environment of the diffusion calculation. This update focuses on burnup methodology, axial leakage and 3D modeling.

  13. CADDIS Volume 4. Data Analysis: Getting Started

    Science.gov (United States)

    Assembling data for an ecological causal analysis, matching biological and environmental samples in time and space, organizing data along conceptual causal pathways, data quality and quantity requirements, Data Analysis references.

  14. Longitudinal analysis of mouse SDOCT volumes

    Science.gov (United States)

    Antony, Bhavna J.; Carass, Aaron; Lang, Andrew; Kim, Byung-Jin; Zack, Donald J.; Prince, Jerry L.

    2017-03-01

    Spectral-domain optical coherence tomography (SDOCT), in addition to its routine clinical use in the diagnosis of ocular diseases, has begun to find increasing use in animal studies. Animal models are frequently used to study disease mechanisms as well as to test drug efficacy. In particular, SDOCT provides the ability to study animals longitudinally and non-invasively over long periods of time. However, the lack of anatomical landmarks makes the longitudinal scan acquisition prone to inconsistencies in orientation. Here, we propose a method for the automated registration of mouse SDOCT volumes. The method begins by accurately segmenting the blood vessels and the optic nerve head region in the scans using a pixel classification approach. The segmented vessel maps from follow-up scans were registered using an iterative closest point (ICP) algorithm to the baseline scan to allow for the accurate longitudinal tracking of thickness changes. Eighteen SDOCT volumes from a light damage model study were used to train a random forest utilized in the pixel classification step. The area under the curve (AUC) in a leave-one-out study for the retinal blood vessels and the optic nerve head (ONH) was found to be 0.93 and 0.98, respectively. The complete proposed framework, the retinal vasculature segmentation and the ICP registration, was applied to a secondary set of scans obtained from a light damage model. A qualitative assessment of the registration showed no registration failures.
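
    A minimal sketch of rigid 2-D iterative closest point alignment of the kind used for the vessel maps, assuming the baseline and follow-up vessel points are already extracted as N x 2 coordinate arrays (the pixel-classification step is not shown):

    import numpy as np

    def best_rigid(src, dst):
        """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
        cs, cd = src.mean(0), dst.mean(0)
        u, _, vt = np.linalg.svd((src - cs).T @ (dst - cd))
        if np.linalg.det((u @ vt).T) < 0:   # enforce a proper rotation
            vt[-1] *= -1
        r = (u @ vt).T
        return r, cd - r @ cs

    def icp(moving, fixed, iters=30):
        """Align the follow-up point set to the baseline point set."""
        pts = moving.copy()
        for _ in range(iters):
            d = ((pts[:, None, :] - fixed[None, :, :]) ** 2).sum(-1)
            r, t = best_rigid(pts, fixed[d.argmin(axis=1)])   # nearest matches
            pts = pts @ r.T + t
        return pts

    rng = np.random.default_rng(1)
    baseline = rng.random((200, 2)) * 100
    a = np.radians(5.0)
    rot = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
    followup = baseline @ rot.T + np.array([3.0, -2.0])       # synthetic follow-up
    aligned = icp(followup, baseline)
    print("mean residual:", np.abs(aligned - baseline).mean())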

  15. Using functional analysis in archival appraisal a practical and effective alternative to traditional appraisal methodologies

    CERN Document Server

    Robyns, Marcus C

    2014-01-01

    In an age of scarcity and the challenge of electronic records, can archivists and records managers continue to rely upon traditional methodology essentially unchanged since the early 1950s? Using Functional Analysis in Archival Appraisal: A Practical and Effective Alternative to Traditional Appraisal Methodologies shows how archivists in other countries are already using functional analysis, which offers a better, more effective, and imminently more practical alternative to traditional appraisal methodologies that rely upon an analysis of the records themselves.

  16. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    Directory of Open Access Journals (Sweden)

    L. Renbaum-Wolff

    2013-01-01

    Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μl, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemical properties, with real-time relative humidity and temperature control; hence, the technique should be well suited for determining the viscosities, under atmospherically relevant conditions, of particles collected from environmental chambers. In this technique, supermicron particles are first deposited on an inert hydrophobic substrate. Then, insoluble beads (~1 μm in diameter) are embedded in the particles. Next, a flow of gas is introduced over the particles, which generates a shear stress on the particle surfaces. The sample responds to this shear stress by generating internal circulations, which are quantified with an optical microscope by monitoring the movement of the beads. The rate of internal circulation is shown to be a function of particle viscosity but independent of the particle material for a wide range of organic and organic-water samples. A calibration curve is constructed from the experimental data that relates the rate of internal circulation to particle viscosity, and this calibration curve is successfully used to predict viscosities in multicomponent organic mixtures.
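
    A minimal sketch of constructing and inverting such a calibration curve, assuming hypothetical (circulation rate, viscosity) pairs measured on standards of known viscosity and a power-law form for the curve:

    import numpy as np

    rates = np.array([50.0, 12.0, 2.5, 0.5, 0.1])        # bead circulation, 1/s
    viscosities = np.array([0.01, 0.05, 0.3, 1.5, 8.0])  # known standards, Pa s

    # Fit log(eta) as a linear function of log(rate); rate falls as eta rises.
    slope, intercept = np.polyfit(np.log(rates), np.log(viscosities), 1)

    def viscosity_from_rate(rate):
        """Read an unknown sample's viscosity off the calibration curve."""
        return float(np.exp(intercept + slope * np.log(rate)))

    print(f"eta at a circulation rate of 1.0 1/s: {viscosity_from_rate(1.0):.2f} Pa s")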

  17. Towards a sharp-interface volume-of-fluid methodology for modeling evaporation

    Science.gov (United States)

    Pathak, Ashish; Raessi, Mehdi

    2017-11-01

    In modeling evaporation, the diffuse-interface (one-domain) formulation yields inaccurate results. Recent efforts approaching the problem via a sharp-interface (two-domain) formulation have shown significant improvements, and the reasons behind their better performance are discussed in the present work. All available sharp-interface methods, however, exclusively employ the level-set approach. In the present work, we develop a sharp-interface evaporation model in a volume-of-fluid (VOF) framework in order to leverage its mass-conserving property as well as its ability to handle large topological changes. We start with a critical review of the assumptions underlying the mathematical equations governing evaporation. For example, it is shown that the assumption of incompressibility can be applied only in special circumstances. The famous D² law used for benchmarking is valid exclusively for steady-state test problems, whereas transient effects are present over a significant portion of the lifetime of a micron-size droplet. Therefore, a 1D spherical, fully transient model is developed to provide a benchmark transient solution. Finally, a 3D Cartesian Navier-Stokes evaporation solver is developed, and some preliminary validation test cases are presented for static and moving drop evaporation. This material is based upon work supported by the Department of Energy, Office of Energy Efficiency and Renewable Energy and the Department of Defense, Tank and Automotive Research, Development, and Engineering Center, under Award Number DEEE0007292.
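
    For reference, a minimal sketch of the classical D² law referred to above, under which the squared droplet diameter decreases linearly in time, d(t)² = d₀² − Kt; the droplet size and the evaporation constant K below are illustrative:

    import numpy as np

    def d2_law_diameter(d0, k, t):
        """Droplet diameter history under the steady-state D² law."""
        return np.sqrt(np.maximum(d0 ** 2 - k * t, 0.0))   # clamp once fully evaporated

    d0 = 50e-6                             # 50 micron droplet
    k = 1.0e-9                             # m²/s, illustrative evaporation constant
    t = np.linspace(0.0, d0 ** 2 / k, 6)   # up to the D²-law lifetime
    print(d2_law_diameter(d0, k, t))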

  18. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has recently been developed and adopted for small-break LOCA EEQ. This methodology has now been extended to the M/E release analysis for containment design for large-break LOCA and main steam line break (MSLB) accidents, and named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K, with an enhanced M/E model and a LOCA long-term model, to CONTEMPT4/MOD5. The KIMERA methodology is applied here to the MSLB M/E release analysis to validate it for MSLB containment design, and the results are compared with the OPR1000 FSAR.

  19. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  20. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    International Nuclear Information System (INIS)

    Licu, Tony; Cioran, Florin; Hayward, Brent; Lowe, Andrew

    2007-01-01

    The Systemic Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture, in which people are encouraged to provide full and open information about how incidents occurred and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability

  1. CADDIS Volume 4. Data Analysis: Download Software

    Science.gov (United States)

    Overview of the data analysis tools available for download on CADDIS. Provides instructions for downloading and installing CADStat, access to a Microsoft Excel macro for computing SSDs, and a brief overview of command-line use of R, a statistical software package.

  2. Methodologies for analysis of patterning in the mouse RPE sheet

    Science.gov (United States)

    Boatright, Jeffrey H.; Dalal, Nupur; Chrenek, Micah A.; Gardner, Christopher; Ziesel, Alison; Jiang, Yi; Grossniklaus, Hans E.

    2015-01-01

    Purpose Our goal was to optimize procedures for assessing shapes, sizes, and other quantitative metrics of retinal pigment epithelium (RPE) cells and contact- and noncontact-mediated cell-to-cell interactions across a large series of flatmount RPE images. Methods The two principal methodological advances of this study were optimization of a mouse RPE flatmount preparation and refinement of open-access software to rapidly analyze large numbers of flatmount images. Mouse eyes were harvested, and extra-orbital fat and muscles were removed. Eyes were fixed for 10 min, and dissected by puncturing the cornea with a sharp needle or a stab knife. Four radial cuts were made with iridectomy scissors from the puncture to near the optic nerve head. The lens, iris, and the neural retina were removed, leaving the RPE sheet exposed. The dissection and outcomes were monitored and evaluated by video recording. The RPE sheet was imaged under fluorescence confocal microscopy after staining for ZO-1 to identify RPE cell boundaries. Photoshop, Java, Perl, and Matlab scripts, as well as CellProfiler, were used to quantify selected parameters. Data were exported into Excel spreadsheets for further analysis. Results A simplified dissection procedure afforded a consistent source of images that could be processed by computer. The dissection and flatmounting techniques were illustrated in a video recording. Almost all of the sheet could be routinely imaged, and substantial fractions of the RPE sheet (usually 20–50% of the sheet) could be analyzed. Several common technical problems were noted and workarounds developed. The software-based analysis merged 25 to 36 images into one and adjusted settings to record an image suitable for large-scale identification of cell-to-cell boundaries, and then obtained quantitative descriptors of the shape of each cell, its neighbors, and interactions beyond direct cell–cell contact in the sheet. To validate the software, human- and computer

  4. Iterative transport-diffusion methodology for LWR core analysis

    International Nuclear Information System (INIS)

    Colameco, D.; Beacon, D.; Ivanov, K.N.; Ivanov, B.D.

    2013-01-01

    This paper presents an update on the development of an advanced methodology for Light Water Reactor core calculations that uses local heterogeneous solutions for on-the-fly nodal cross-section generation. The Iterative Transport-Diffusion Method (ITDM) is an embedded transport approach that is expected to provide results with near 3D transport accuracy for a fraction of the time required by a full 3D transport method. In this methodology, the infinite environment used for homogenized nodal cross-section generation is replaced with a simulated 3D environment of the diffusion calculation. It is shown that the ITDM methodology provides very promising results when using partial currents as boundary conditions for loosely coupling a 2D lattice transport code to a 3D core nodal solver. The use of partial currents is a major improvement over the albedo concept: the solutions converged in a smoother manner

  5. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T.; Helton, J.C.; Murfin, W.B.; Hora, S.C.

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community
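
    A minimal sketch of the stratified sampling idea behind the 'efficient sampling procedure', assuming a Latin hypercube design and a toy response function in place of the chained accident-progression, source-term and consequence models:

    import numpy as np

    def latin_hypercube(n_samples, n_vars, seed=None):
        """Uniform [0, 1) Latin hypercube: one draw per stratum per variable."""
        rng = np.random.default_rng(seed)
        strata = np.array([rng.permutation(n_samples) for _ in range(n_vars)]).T
        return (strata + rng.random((n_samples, n_vars))) / n_samples

    # Propagate uncertainty through a toy response y = x1 * x2.
    sample = latin_hypercube(200, 2, seed=0)
    y = sample[:, 0] * sample[:, 1]
    print(f"mean = {y.mean():.3f}, 95th percentile = {np.percentile(y, 95):.3f}")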

  7. The analysis of classroom talk: methods and methodologies.

    Science.gov (United States)

    Mercer, Neil

    2010-03-01

    This article describes methods for analysing classroom talk, comparing their strengths and weaknesses. Both quantitative and qualitative methods are described and assessed, with a discussion of their mixed use. It is acknowledged that particular methods are often embedded in particular methodologies, which are based on specific theories of social action, research paradigms, and disciplines; a comparison is therefore made of two contemporary methodologies, linguistic ethnography and sociocultural research. The article concludes with some comments on the current state of development of this field of research and on ways that it might usefully progress.

  8. Analysis of urea distribution volume in hemodialysis.

    Science.gov (United States)

    Maduell, F; Sigüenza, F; Caridad, A; Miralles, F; Serrato, F

    1994-01-01

    According to the urea kinetic model, the urea distribution volume (V) is considered to be that of body water, distributed in a single compartment. Since the V value is difficult to measure, it is common to use 58% of body weight, even though the true value may range from 35 to 75%. In this study, we calculated V by an accurate method based on the total elimination of urea into the dialysate, and examined whether different dialysis characteristics modify it. Thirty-five patients were included in this study, 19 men and 16 women, on a chronic hemodialysis programme. The dialysate was collected in a graduated tank, and the concentrations of urea in plasma and in dialysate were determined every hour. Every patient received six dialysis sessions, varying the blood flow (250 or 350 ml/min), the ultrafiltration (0.5 or 1.5 l/h), the membrane (cuprophane or polyacrylonitrile) and/or the buffer (bicarbonate or acetate). At the end of the hemodialysis session, the V value ranged from 43 to 72% of body weight; nevertheless, this value was practically constant within each patient. The V value gradually increased throughout the dialysis session: 42.1 +/- 6.9% of body weight in the first hour, 50.7 +/- 7.5% in the second hour and 55.7 +/- 7.9% at the end of the session. Changing the blood flow, ultrafiltration, membrane or buffer did not alter the results. The V value was significantly higher in men than in women, 60.0 +/- 6.6% vs. 50.5 +/- 5.9% of body weight (p < 0.001).
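
    A minimal sketch of the mass-balance idea behind the dialysate-collection method, assuming a single well-mixed pool and negligible urea generation during the session (the study's hourly sampling and any ultrafiltration correction are omitted):

    def urea_distribution_volume(dialysate_urea_mg, plasma_pre_mg_per_l,
                                 plasma_post_mg_per_l):
        """V (litres): total urea recovered in the tank over the plasma fall."""
        removed = dialysate_urea_mg                        # total mass removed
        drop = plasma_pre_mg_per_l - plasma_post_mg_per_l  # concentration fall
        return removed / drop

    # Illustrative session: 36 g recovered while plasma urea falls 1.5 -> 0.5 g/l.
    v = urea_distribution_volume(36_000.0, 1_500.0, 500.0)
    print(f"V = {v:.1f} l")   # 36 l here, i.e. ~58% of a 62 kg patient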

  9. Geometric nonlinear functional analysis volume 1

    CERN Document Server

    Benyamini, Yoav

    1999-01-01

    The book presents a systematic and unified study of geometric nonlinear functional analysis. This area has its classical roots in the beginning of the twentieth century and is now a very active research area, having close connections to geometric measure theory, probability, classical analysis, combinatorics, and Banach space theory. The main theme of the book is the study of uniformly continuous and Lipschitz functions between Banach spaces (e.g., differentiability, stability, approximation, existence of extensions, fixed points, etc.). This study leads naturally also to the classification of

  10. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    Science.gov (United States)

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that although narrative inquiry as a research methodology entails free conversations and personal stories, it should not be totally free and fictional, as it has to conform to recognized standards for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  11. Seismic hazard analysis. A methodology for the Eastern United States

    International Nuclear Information System (INIS)

    Bernreuter, D.L.

    1980-08-01

    This report presents a probabilistic approach for estimating the seismic hazard in the Central and Eastern United States. The probabilistic model (Uniform Hazard Methodology) systematically incorporates the subjective opinion of several experts in the evaluation of seismic hazard. Subjective input, assumptions and the associated hazard are kept separate for each expert so as to allow review and preserve diversity of opinion. The report is organized into five sections: Introduction, Methodology Comparison, Subjective Input, Uniform Hazard Methodology (UHM), and Uniform Hazard Spectrum. Section 2, Methodology Comparison, briefly describes the present approach and compares it with other available procedures. The remainder of the report focuses on the UHM. Specifically, Section 3 describes the elicitation of subjective input; Section 4 gives details of the various mathematical models (earthquake source geometry, magnitude distribution, attenuation relationship) and how these models are combined to calculate seismic hazard. The last section, Uniform Hazard Spectrum, highlights the main features of typical results. Specific results and sensitivity analyses are not presented in this report. (author)

  12. Qualitative Analysis of Comic Strip Culture: A Methodological Inquiry.

    Science.gov (United States)

    Newman, Isadore; And Others

    The paper is a methodological inquiry into the interpretation of qualitative data. It explores a grounded-theory approach to the synthesis of data and examines, in particular, the construction of categories. It focuses on ways of organizing and attaching meaning to data, as research problems embedded in a cultural context are explored. A…

  13. Grounded Theory and Educational Ethnography: A Methodological Analysis and Critique.

    Science.gov (United States)

    Smith, Louis M.; Pohland, Paul A.

    This paper analyzes and evaluates the methodological approach developed by B. G. Glaser and A. L. Strauss in THE DISCOVERY OF GROUNDED THEORY (Chicago: Aldine, 1967). Smith and Pohland's major intent is to raise Glaser and Strauss' most significant concepts and issues, analyze them in the context of seven of their own studies, and in conclusion…

  14. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing the areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the IT security research community, within the black and white hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  15. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  16. Statistical analysis of rockfall volume distributions: Implications for rockfall dynamics

    Science.gov (United States)

    Dussauge, Carine; Grasso, Jean-Robert; Helmstetter, Agnès

    2003-06-01

    We analyze the volume distribution of natural rockfalls in different geological settings (i.e., calcareous cliffs in the French Alps, Grenoble area, and granite Yosemite cliffs, California Sierra) and different volume ranges (i.e., regional and worldwide catalogs). Contrary to previous studies that included several types of landslides, we restrict our analysis to rockfall sources which originated on subvertical cliffs. For the three data sets, we find that the rockfall volumes follow a power law distribution with a similar exponent value, within error bars. This power law distribution was also proposed for rockfall volumes that occurred along road cuts. All these results argue for a recurrent power law distribution of rockfall volumes on subvertical cliffs, over a large range of rockfall sizes (10² to 10¹⁰ m³), regardless of the geological settings and of the preexisting geometry of fracture patterns, which are drastically different in the three studied areas. The power law distribution for rockfall volumes could emerge from two types of processes. First, the observed power law distribution of rockfall volumes is similar to the one reported for both fragmentation experiments and fragmentation models. This argues for the geometry of rock mass fragment sizes possibly controlling the rockfall volumes; in that case, neither cascade nor avalanche processes would influence the rockfall volume distribution. Second, without any requirement of scale-invariant quenched heterogeneity patterns, the rock mass dynamics can arise from avalanche processes driven by fluctuations of the rock mass properties, e.g., cohesion or friction angle. This model may also explain the power law distribution reported for landslides involving unconsolidated materials. We find that the exponent value for rockfall volumes on subvertical cliffs, 0.5 ± 0.2, is significantly smaller than the 1.2 ± 0.3 value reported for mixed landslide types. This change of exponents can be driven by the material strength, which
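
    The exponent of such a distribution is commonly estimated by maximum likelihood. The sketch below is a generic illustration on a synthetic catalog, not the authors' code; the threshold v_min and all variable names are assumptions.

    import numpy as np

    # Minimal sketch: maximum-likelihood fit of a power-law exponent b for
    # a survival function P(V > v) ~ (v / v_min)^(-b), on a synthetic catalog.
    rng = np.random.default_rng(0)
    v_min = 1e2                                    # completeness threshold in m^3 (assumed)
    volumes = v_min * (1 + rng.pareto(0.5, 2000))  # synthetic catalog, true exponent 0.5

    n = volumes.size
    b_hat = n / np.log(volumes / v_min).sum()      # MLE of the exponent
    se = b_hat / np.sqrt(n)                        # asymptotic standard error
    print(f"b = {b_hat:.2f} +/- {se:.2f}")         # recovers a value near 0.5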

  17. Research program for seismic qualification of nuclear plant electrical and mechanical equipment. Task 3. Recommendations for improvement of equipment qualification methodology and criteria. Volume 3

    International Nuclear Information System (INIS)

    Kana, D.D.; Pomerening, D.J.

    1984-08-01

    The Research Program for Seismic Qualification of Nuclear Plant Electrical and Mechanical Equipment spanned a period of three years and resulted in seven technical summary reports, each covering in detail the findings of different tasks and subtasks; these reports have been combined into five NUREG/CR volumes. Volume 3 presents recommendations for improvement of equipment qualification methodology and for procedural clarification/modification. The fifth category of recommendations identifies issues where adequate information does not exist to allow a recommendation to be made

  18. A methodological proposal for quantifying environmental compensation through the spatial analysis of vulnerability indicators

    Directory of Open Access Journals (Sweden)

    Fabio Enrique Torresan

    2008-06-01

    Full Text Available The aim of this work was to propose a methodology for quantifying environmental compensation through the spatial analysis of vulnerability indicators. A case study was applied to the analysis of sand extraction enterprises in the region of Descalvado and Analândia, inland São Paulo State, Brazil. Environmental vulnerability scores were attributed to indicators related to erosion, hydrological resources and biodiversity loss. This methodological proposal allowed analyzing the location alternatives of a given enterprise with the objective of reducing impacts and, at the same time, reducing the costs of environmental compensation. The application of the methodology significantly reduced the degree of subjectivity usually associated with most impact evaluation methodologies. The term environmental compensation refers to the developer's obligation to support the establishment and maintenance of Conservation Units, applicable to enterprises of significant environmental impact, in accordance with Law 9.986/2000. This law establishes that the volume of resources to be applied by the developer must be at least 0.5% of the total costs foreseen for implementing the enterprise, and that this percentage is to be set by the competent environmental agency according to the degree of environmental impact. Accordingly, this article proposes a methodology for quantifying environmental compensation through the spatial analysis of environmental vulnerability indicators. The proposal was applied in a case study of sand mining enterprises in the Descalvado/Analândia region, inland São Paulo State. Environmental vulnerability indices were attributed to impact indicators related to erosion, water resources and biodiversity loss. This methodology represents an important instrument for environmental and economic planning and can be adapted to various

  19. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project

    International Nuclear Information System (INIS)

    Roberds, W.J.; Plum, R.J.; Visca, P.J.

    1984-11-01

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab

  20. Sensitivity analysis of project appraisal variables. Volume I. Key variables

    Energy Technology Data Exchange (ETDEWEB)

    1979-07-01

    The Division of Fossil Fuel Utilization within the US Department of Energy (DOE) uses a project appraisal methodology for annual assessment of its research and development projects. Exercise of the methodology provides input to the budget preparation and planning process. Consequently, it is essential that all appraisal inputs and outputs are as accurate and credible as possible. The purpose of this task is to examine the accuracy and credibility of the 1979 appraisal results by conducting a sensitivity analysis of several appraisal inputs. This analysis is designed to: examine the sensitivity of the results to adjustments in the values of selected parameters; explain the differences between computed ranks and professional judgment ranks; and revise the final results of the 1979 project appraisal and provide the first inputs to refinement of the appraisal methodology for future applications.

  1. Application of KIMERA Methodology to Kori 3 and 4 LBLOCA M/E Release Analysis

    International Nuclear Information System (INIS)

    Song, Jeung Hyo; Hwang, Byung Heon; Kim, Cheol Woo

    2007-01-01

    A new mass and energy (M/E) release analysis methodology called KIMERA (KOPEC Improved Mass and Energy Release Analysis) has been developed. It is a realistic evaluation methodology for M/E release analysis in containment design and is applicable to LOCA and main steam line break (MSLB) accidents. The KIMERA methodology has the same engine as KREM (KEPRI Realistic Evaluation Model), the realistic evaluation methodology for LOCA peak clad temperature analysis, and adds several supplementary conservative models for the M/E release, such as a break spillage model and a multiplier on the heat transfer coefficient (HTC). To estimate the applicability of the KIMERA methodology to licensing analysis, a large break LOCA (LBLOCA) M/E analysis was performed for UCN 3 and 4, the typical plant of the OPR1000 type. The results showed that the peak pressure and temperature occurred earlier and had lower values than those of the UCN 3 and 4 FSAR. The KIMERA methodology removes the over-conservatism from the FSAR results during the post-blowdown period of a large break LOCA and provides more margin in containment design. In this study, an LBLOCA M/E analysis using the KIMERA methodology is performed for Kori 3 and 4, the typical plant of the Westinghouse type, and the results are compared with those of the Kori Nuclear Unit 3 and 4 FSAR

  2. Full-Envelope Launch Abort System Performance Analysis Methodology

    Science.gov (United States)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at altitude for Mach 1, altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.

  3. Methodology for seismic risk analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Kaplan, S.; Perla, H.F.; Bley, D.C.

    1983-01-01

    This methodology begins by quantifying the fragility of all key components and structures in the plant. By means of the logic encoded in the plant event trees and fault trees, the component fragilities are combined to form fragilities for the occurrence of plant damage states or release categories. Combining these, in turn, with the seismicity curves yields the frequencies of those states or releases. Uncertainty is explicitly included at each step of the process
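
    For context, component fragility in seismic risk analysis is conventionally represented by a lognormal curve; a generic form (a sketch of the standard convention, not necessarily the exact parameterization used by these authors) is

        P_f(a) = \Phi\left( \frac{\ln(a / A_m)}{\beta} \right)

    where a is the ground-motion level (e.g., peak ground acceleration), A_m the median capacity, \beta the logarithmic standard deviation, and \Phi the standard normal cumulative distribution function. Convolving such fragilities with the seismicity (hazard) curve then yields the annual frequencies of the damage states or releases.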

  4. Geometrical-Based Navigation System Performance Assessment in the Space Service Volume Using a Multiglobal Navigation Satellite System Methodology

    Science.gov (United States)

    Welch, Bryan W.

    2016-01-01

    NASA is participating in the International Committee on Global Navigation Satellite Systems (GNSS) (ICG)'s efforts towards demonstrating the benefits to the space user in the Space Service Volume (SSV) when a multi-GNSS solution space approach is utilized. The ICG Working Group on Enhancement of GNSS Performance, New Services and Capabilities has started a three-phase analysis initiative, increasing in complexity and fidelity with each phase, as an outcome of recommendations at the ICG-10 meeting and in preparation for the ICG-11 meeting. The first phase of the initiative is based on a purely geometrically derived access technique. This first phase has been completed, and the results are documented in this paper.

  5. The effect of duration of illness and antipsychotics on subcortical volumes in schizophrenia: Analysis of 778 subjects.

    Science.gov (United States)

    Hashimoto, Naoki; Ito, Yoichi M; Okada, Naohiro; Yamamori, Hidenaga; Yasuda, Yuka; Fujimoto, Michiko; Kudo, Noriko; Takemura, Ariyoshi; Son, Shuraku; Narita, Hisashi; Yamamoto, Maeri; Tha, Khin Khin; Katsuki, Asuka; Ohi, Kazutaka; Yamashita, Fumio; Koike, Shinsuke; Takahashi, Tsutomu; Nemoto, Kiyotaka; Fukunaga, Masaki; Onitsuka, Toshiaki; Watanabe, Yoshiyuki; Yamasue, Hidenori; Suzuki, Michio; Kasai, Kiyoto; Kusumi, Ichiro; Hashimoto, Ryota

    2018-01-01

    The effect of duration of illness and antipsychotic medication on the volumes of subcortical structures in schizophrenia has been inconsistent among previous reports. We implemented a large-sample analysis utilizing clinical data from the 11 institutions that participated in a previous meta-analysis. Imaging and clinical data of 778 schizophrenia subjects were taken from a prospective meta-analysis conducted by the COCORO consortium in Japan. The effects of duration of illness and of the daily dose and type of antipsychotics were assessed using a linear mixed effect model in which the volumes of subcortical structures computed by FreeSurfer were the dependent variable; age, sex, duration of illness, daily dose of antipsychotics and intracranial volume were the independent variables; and the type of imaging protocol was incorporated as a random effect for the intercept. The statistical significance of the fixed effects was assessed. Daily dose of antipsychotics was positively associated with left globus pallidus volume and negatively associated with right hippocampus volume. It was also positively associated with the laterality index of the globus pallidus. Duration of illness was positively associated with bilateral globus pallidus volumes. Type of antipsychotics did not have any effect on the subcortical volumes. A large sample size, uniform data collection methodology and robust statistical analysis are strengths of the current study. This result suggests that special attention is needed when discussing the relationship between subcortical regional brain volumes and the pathophysiology of schizophrenia, because regional brain volumes may be affected by antipsychotic medication.
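
    The described model maps naturally onto a standard mixed-model API. The sketch below is a generic illustration, not the authors' code: the file name and column names are assumptions, and cpz_dose stands in for the daily antipsychotic dose.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per subject; 'protocol' identifies the imaging protocol.
    df = pd.read_csv("subcortical_volumes.csv")   # hypothetical input file

    # Subcortical volume regressed on the fixed effects, with a random
    # intercept per imaging protocol, mirroring the model in the abstract.
    model = smf.mixedlm(
        "pallidum_left ~ age + sex + duration_of_illness + cpz_dose + icv",
        data=df,
        groups=df["protocol"],
    )
    result = model.fit()
    print(result.summary())   # significance of the fixed effects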

  6. Cuadernos de Autoformacion en Participacion Social: Metodologia. Volumen 2. Primera Edicion (Self-Instructional Notebooks on Social Participation: Methodology. Volume 2. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  7. INTEGRATED METHODOLOGY FOR PRODUCT PLANNING USING MULTI CRITERIA ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tarun Soota

    2016-09-01

    Full Text Available An integrated approach to multi-criteria decision problems is proposed using quality function deployment and the analytic network process. The objective of the work is to rationalize and improve the method of analyzing and interpreting customer needs and technical requirements. The methodology is used to determine and prioritize engineering requirements based on customer needs for the development of the best product. The framework allows the decision maker to decompose a complex problem into a hierarchical structure that shows the relationship between objective and criteria. Multi-criteria decision modeling is used to extend the hierarchy process to both dependence and feedback. A case study on bikes is presented for the proposed model.

  8. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared side by side using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  9. Frequency Analysis of Gradient Estimators in Volume Rendering

    NARCIS (Netherlands)

    Bentum, Marinus Jan; Lichtenbelt, Barthold B.A.; Malzbender, Tom

    1996-01-01

    Gradient information is used in volume rendering to classify and color samples along a ray. In this paper, we present an analysis of the theoretically ideal gradient estimator and compare it to some commonly used gradient estimators. A new method is presented to calculate the gradient at arbitrary

  10. Simplifying the spectral analysis of the volume operator

    NARCIS (Netherlands)

    Loll, R.

    1997-01-01

    The volume operator plays a central role in both the kinematics and dynamics of canonical approaches to quantum gravity which are based on algebras of generalized Wilson loops. We introduce a method for simplifying its spectral analysis, for quantum states that can be realized on a cubic

  11. Seismic hazard analysis. Application of methodology, results, and sensitivity studies

    International Nuclear Information System (INIS)

    Bernreuter, D.L.

    1981-10-01

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for determining estimated seismic hazard in this region of the country. (author)

  12. Development of a methodology for analysis of the impact of modifying neutron cross sections

    International Nuclear Information System (INIS)

    Wenner, M. T.; Haghighat, A.; Adams, J. M.; Carlson, A. D.; Grimes, S. M.; Massey, T. N.

    2004-01-01

    Monte Carlo analysis of a Time-of-Flight (TOF) experiment can be utilized to examine the accuracy of nuclear cross section data. Accurate determination of these data is paramount in characterizing reactor lifetime. We have developed a methodology to examine the impact of modifying the cross section libraries currently available in ENDF-6 format (1) where deficiencies may exist, and have shown that it may be effective for examining the accuracy of nuclear cross section data. The new methodology has been applied to the iron scattering cross sections, and the use of the revised cross sections suggests that reactor pressure vessel fluence may be underestimated. (authors)

  13. Sectional analysis for volume determination and selection of volume equations for the Tapajos Nacional Forest

    Directory of Open Access Journals (Sweden)

    Renato Bezerra da Silva Ribeiro

    2014-12-01

    Full Text Available The aim of this study was to analyze different section lengths for volume determination and to fit volumetric models for estimating timber production in a forest management area of the Tapajós National Forest (FNT). Six sectioning treatments were tested on 152 logs of 12 commercial species. The obtained volumes were statistically compared by analysis of variance (ANOVA) to choose the best method of sectioning and to calculate the actual volume of 2,094 sample trees in different commercial diameter classes. Ten mathematical models were fitted to the whole data set and to the species Manilkara huberi (Ducke) Chevalier (maçaranduba), Lecythis lurida (Miers) Samori (jarana) and Hymenaea courbaril L. (jatobá). The criteria used to choose the best model were the adjusted coefficient of determination in percentage (R²adj%), the standard error of estimate in percentage (Syx%), the significance of the parameters, the normality of residuals, the Variance Inflation Factor (VIF) and the graphic distribution of residuals. There was no statistical difference between the sectioning methods, and thus using the total length of the logs was more practical in the field. The logarithmic forms of the Schumacher and Hall and Spurr models were the best for estimating the volume for these species and for the whole sample set.
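
    For reference, the two model forms singled out above are usually written as follows in the forest mensuration literature (a reference sketch, not copied from the paper), with V the stem volume, D the diameter at breast height, H the height, \beta_i fitted coefficients and \varepsilon the error term:

        Schumacher and Hall:  \ln V = \beta_0 + \beta_1 \ln D + \beta_2 \ln H + \varepsilon

        Spurr (logarithmic):  \ln V = \beta_0 + \beta_1 \ln(D^2 H) + \varepsilon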

  14. Aerodynamic configuration design using response surface methodology analysis

    Science.gov (United States)

    Engelund, Walter C.; Stanley, Douglas O.; Lepsch, Roger A.; Mcmillin, Mark M.; Unal, Resit

    1993-01-01

    An investigation has been conducted to determine a set of optimal design parameters for a single-stage-to-orbit reentry vehicle. Several configuration geometry parameters which had a large impact on the entry vehicle flying characteristics were selected as design variables: the fuselage fineness ratio, the nose to body length ratio, the nose camber value, the wing planform area scale factor, and the wing location. The optimal geometry parameter values were chosen using a response surface methodology (RSM) technique which allowed for a minimum dry weight configuration design that met a set of aerodynamic performance constraints on the landing speed, and on the subsonic, supersonic, and hypersonic trim and stability levels. The RSM technique utilized, specifically the central composite design method, is presented, along with the general vehicle conceptual design process. Results are presented for an optimized configuration along with several design trade cases.
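
    The core of the RSM step is fitting a second-order polynomial surrogate to responses observed at the central-composite design points and then optimizing over that surrogate. The sketch below is a generic illustration with synthetic data; the three coded variables stand in for (and are not) the actual vehicle geometry parameters.

    import numpy as np
    from itertools import combinations

    # Fit a full quadratic response surface by least squares:
    # y = b0 + sum_i bi*xi + sum_{i<j} bij*xi*xj + sum_i bii*xi^2
    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, size=(30, 3))      # 30 runs, 3 coded design variables
    y = 5 + 2*X[:, 0] - X[:, 1]**2 + 0.5*X[:, 0]*X[:, 2] + rng.normal(0, 0.05, 30)

    def quad_features(X):
        cols = [np.ones(len(X))] + [X[:, i] for i in range(X.shape[1])]
        cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
        cols += [X[:, i]**2 for i in range(X.shape[1])]
        return np.column_stack(cols)

    beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)
    # 'beta' parameterizes a smooth surrogate that a constrained optimizer
    # (e.g., scipy.optimize.minimize) can search for the minimum-weight design.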

  15. Volume conduction effects on wavelet cross-bicoherence analysis

    International Nuclear Information System (INIS)

    Memon, I.A.; Channa, C.

    2013-01-01

    Cross-bicoherence analysis is an important nonlinear signal processing tool used to measure quadratic phase coupling between frequencies of two different time series. It is frequently used in the diagnosis of various cognitive and neurological disorders in EEG (Electroencephalography) analysis. Volume conduction effects of the various uncorrelated sources present in the brain can bias the estimated values of the cross-bicoherence function. Previous studies have discussed volume conduction effects on the coherence function, which measures the linear relationship between EEG signals in terms of their phase and amplitude. However, the volume conduction effect on cross-bicoherence analysis, which is quite a different technique, has to the best of our knowledge not been investigated until now. This study is divided into two major parts. The first part investigates the characteristics of VCUS (Volume Conduction effects due to Uncorrelated Sources) in EEG cross-bicoherence analysis, using simulated EEG data due to uncorrelated sources present in the brain. The second part investigates the effects of VCUS on the statistical analysis of the results of EEG-based cross-bicoherence analysis. The study provides an important clinical application, because most studies based on EEG cross-bicoherence analysis have avoided the issue of VCUS. The cross-bicoherence analysis was performed by detecting the change in the MSCB (Magnitude Square Cross-Bicoherence Function) between EEG activities of change-detection and no-change-detection trials. Real EEG signals were used. (author)
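
    As background, the cross-bispectrum underlying this analysis is commonly estimated as B_xy(f1, f2) = E[X(f1) Y(f2) X*(f1 + f2)], and the magnitude-squared cross-bicoherence is this quantity normalized to lie between 0 and 1, for example

        b_{xy}^2(f_1, f_2) = \frac{\left| E\left[ X(f_1)\, Y(f_2)\, X^*(f_1 + f_2) \right] \right|^2}{E\left[ |X(f_1) Y(f_2)|^2 \right]\, E\left[ |X(f_1 + f_2)|^2 \right]}

    Normalization conventions vary across the literature, so this is one common form rather than necessarily the exact estimator used in the paper.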

  16. Analysis of the chemical equilibrium of combustion at constant volume

    Directory of Open Access Journals (Sweden)

    Marius BREBENEL

    2014-04-01

    Full Text Available Determining the composition of a mixture of combustion gases at a given temperature is based on chemical equilibrium, where the equilibrium constants are usually calculated on the assumption of constant pressure and temperature. In this paper, an analysis of the changes occurring when combustion takes place at constant volume is presented, deriving a specific formula for the equilibrium constant. The simple reaction of carbon combustion in pure oxygen is then considered in both cases (constant pressure and constant volume) as an example of application, observing the changes occurring in the composition of the combustion gases depending on temperature.
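
    The key point can be stated compactly: at constant volume the equilibrium is naturally written in concentrations, and for ideal gases the concentration-based and pressure-based constants are related by

        K_p = K_c (RT)^{\Delta n}

    where \Delta n is the change in moles of gas in the reaction. For C(s) + O_2 \rightleftharpoons CO_2, \Delta n = 0 and the two constants coincide, whereas for 2C(s) + O_2 \rightleftharpoons 2CO, \Delta n = +1 and K_p = K_c RT, so the constant-volume composition shifts differently with temperature.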

  17. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),

  18. Micro analysis of fringe field formed inside LDA measuring volume

    International Nuclear Information System (INIS)

    Ghosh, Abhijit; Nirala, A K

    2016-01-01

    In the present study we propose a technique for micro analysis of the fringe field formed inside a laser Doppler anemometry (LDA) measuring volume. Detailed knowledge of the fringe field obtained by this technique allows beam quality, alignment and fringe uniformity to be evaluated with greater precision and may be helpful in selecting an appropriate optical element for LDA system operation. A complete characterization of the fringes formed at the measurement volume using conventional as well as holographic optical elements is presented. Results indicate a qualitative as well as quantitative improvement of the fringes formed at the measurement volume by holographic optical elements. Hence, the use of holographic optical elements in LDA systems may be advantageous for improving measurement accuracy. (paper)
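
    For orientation, the fringe spacing in a dual-beam LDA measurement volume follows from the beam geometry (a standard relation, not specific to this paper):

        d_f = \frac{\lambda}{2 \sin(\theta / 2)}

    where \lambda is the laser wavelength and \theta the full intersection angle between the two beams; a particle crossing the fringes with velocity component u perpendicular to them produces a Doppler burst of frequency f_D = u / d_f, which is why fringe uniformity directly affects measurement accuracy.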

  19. On the Application of Syntactic Methodologies in Automatic Text Analysis.

    Science.gov (United States)

    Salton, Gerard; And Others

    1990-01-01

    Summarizes various linguistic approaches proposed for document analysis in information retrieval environments. Topics discussed include syntactic analysis; use of machine-readable dictionary information; knowledge base construction; the PLNLP English Grammar (PEG) system; phrase normalization; and statistical and syntactic phrase evaluation used…

  20. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  1. Substance precedes methodology: on cost-benefit analysis and equity

    NARCIS (Netherlands)

    Martens, C.J.C.M.

    2011-01-01

    While distributive aspects have been a topic of discussion in relation to cost–benefit analysis (CBA), little systematic thought has been given in the CBA literature to the focus of such an equity analysis in evaluating transport projects. The goal of the paper is to provide an overview of the

  2. Transport of solid commodities via freight pipeline: cost estimating methodology. Volume III, parts A and B. First year final report

    Energy Technology Data Exchange (ETDEWEB)

    Warner, J.A.; Morlok, E.K.; Gimm, K.K.; Zandi, I.

    1976-07-01

    In order to examine the feasibility of an intercity freight pipeline, it was necessary to develop cost equations for various competing transportation modes. This volume presents cost-estimating equations for rail carload, trailer-on-flatcar, truck, and freight pipeline. Section A presents mathematical equations that approximate the fully allocated and variable costs contained in the ICC cost tables for rail carload, trailer-on-flatcar (TOFC) and truck common-carrier intercity freight movements. These equations were developed to enable the user to approximate the ICC costs quickly and easily. They should find use in initial studies of costs where exact values are not needed, such as in consideration of rate changes, studies of profitability, and in general inter-modal comparisons. Section B discusses the development of a set of engineering cost equations for pneumo-capsule pipelines. The development was based on an analysis of system components and can readily be extended to other types of pipeline. The model was developed for the purpose of a feasibility study. It employs a limited number of generalized parameters and its use is recommended when sufficient detailed and specific engineering information is lacking. These models were used in the comparison of modes presented in Volume I and hence no conclusions regarding relative costs or service of the modes are presented here. The primary conclusion is that the estimates of costs resulting from these models are subject to considerable uncertainty.

  3. Comprehensive Safety Analysis 2010 Safety Measurement System (SMS) Methodology, Version 2.1 Revised December 2010

    Science.gov (United States)

    2010-12-01

    This report documents the Safety Measurement System (SMS) methodology developed to support the Comprehensive Safety Analysis 2010 (CSA 2010) Initiative for the Federal Motor Carrier Safety Administration (FMCSA). The SMS is one of the major tools for...

  4. Montecarlo simulation for a new high resolution elemental analysis methodology

    International Nuclear Information System (INIS)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto

    1996-01-01

    Full text. Spectra generated by binary, ternary and multielement matrixes irradiated by a variable-energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate appear when the photon energy is just above the absorption edge associated with each element, because of the emission of characteristic X rays. For a given edge energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample in a 2π solid angle, associating a single multichannel spectrometer channel to each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high resolution spectroscopy methodology, for which a synchrotron would be an ideal source, due to its high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)

  6. Stomatal oscillations in olive trees: analysis and methodological implications.

    Science.gov (United States)

    López-Bernal, Alvaro; García-Tejera, Omar; Testi, Luca; Orgaz, Francisco; Villalobos, Francisco J

    2017-10-13

    Stomatal oscillations have long been disregarded in the literature despite the fact that the phenomenon has been described for a variety of plant species. This study aims to characterize the occurrence of oscillations in olive trees (Olea europaea L.) under different growing conditions and their methodological implications. Three experiments with young potted olives and one with large field-grown trees were performed. Sap flow measurements were always used to monitor the occurrence of oscillations, with additional determinations of trunk diameter variations and leaf-level stomatal conductance, photosynthesis and water potential also conducted in some cases. Strong oscillations with periods of 30-60 min were generally observed for young trees, while large field trees rarely showed significant oscillations. Severe water stress led to the disappearance of oscillations, but moderate water deficits occasionally promoted them. Simultaneous oscillations were also found in leaf stomatal conductance, leaf photosynthesis and trunk diameter, with the former presenting the highest amplitudes. The strong oscillations found in young potted olive trees preclude the use of infrequent measurements of stomatal conductance and related variables to characterize differences between trees of different cultivars or subjected to different experimental treatments. Under these circumstances, our results suggest that reliable estimates could be obtained using measurement intervals below 15 min.

  7. Recent Methodologies for Creep Deformation Analysis and Its Life Prediction

    International Nuclear Information System (INIS)

    Kim, Woo-Gon; Park, Jae-Young; Iung

    2016-01-01

    To design materials for high-temperature creep service, various creep data are needed for codification, as follows: i) stress vs. creep rupture time for base metals and weldments (average and minimum), ii) stress vs. time to 1% total strain (average), iii) stress vs. time to onset of tertiary creep (minimum), iv) constitutive equations for conducting time- and temperature-dependent stress-strain calculations (average), and v) isochronous stress-strain curves (average). Elevated-temperature components such as those used in modern power generation plants are designed using allowable stress under creep conditions. The allowable stress is usually estimated on the basis of up to 10⁵ h creep rupture strength at the operating temperature. The master curve of the "sinh" function was found to have wider acceptance, with good flexibility in the low stress ranges beyond the experimental data. The proposed multi-C method for the Larson-Miller (LM) parameter revealed better life prediction than a single-C method. These improved methodologies can be utilized to accurately predict the long-term creep life or strength of Gen-IV nuclear materials, which are designed for a life span of 60 years

  8. Meta-analysis: Its role in psychological methodology

    Directory of Open Access Journals (Sweden)

    Andrej Kastrin

    2008-11-01

    Full Text Available Meta-analysis refers to the statistical analysis of a large collection of independent observations for the purpose of integrating results. The main objectives of this article are to define meta-analysis as a method of data integration, to draw attention to some particularities of its use, and to encourage researchers to use meta-analysis in their work. The benefits of meta-analysis include more effective exploitation of existing data from independent sources and a contribution to more powerful domain knowledge. It may also serve as a support tool for generating new research hypotheses. The idea of combining results of independent studies addressing the same research question dates back to the sixteenth century. Meta-analysis was reinvented in 1976 by Glass, to refute the conclusion of an eminent colleague, Eysenck, that psychotherapy was essentially ineffective. We review some major historical landmarks of meta-analysis and its statistical background. We present in detail the concept of an effect size measure, the problem of heterogeneity, and the two models used to combine individual effect sizes (the fixed and random effects models). Two visualization techniques, forest and funnel plots, are demonstrated. We developed RMetaWeb, a simple and fast web server application to conduct meta-analysis online. RMetaWeb is the first web meta-analysis application and is completely based on the R software environment for statistical computing and graphics.
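
    The two combination models mentioned have compact standard forms (the usual textbook presentation, e.g., DerSimonian-Laird for the random-effects variant). With study effects \theta_i and within-study variances v_i over k studies:

        Fixed effect:   w_i = \frac{1}{v_i}, \qquad \hat{\theta} = \frac{\sum_i w_i \theta_i}{\sum_i w_i}, \qquad \mathrm{Var}(\hat{\theta}) = \frac{1}{\sum_i w_i}

        Random effects: w_i^* = \frac{1}{v_i + \hat{\tau}^2}, \qquad \hat{\tau}^2 = \max\!\left( 0,\; \frac{Q - (k - 1)}{\sum_i w_i - \sum_i w_i^2 / \sum_i w_i} \right)

    where Q = \sum_i w_i (\theta_i - \hat{\theta})^2 is the heterogeneity statistic; the pooled random-effects estimate uses w_i^* in place of w_i.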

  9. Development of Safety Margin Analysis Methodology on Aging Effect for CANDU Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Man Woong; Lee, Sang Kyu; Kim, Hyun Koon; Yoo, Kun Joong; Ryu, Yong Ho [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Yoo, Jun Soo; Choi, Yong Won; Park, Chang Hwan [Seoul National Univ., Seoul (Korea, Republic of)

    2007-07-01

    Considering that the operating time of Wolsong Unit 1 is approaching its design life of 30 years, the aging effect due to component degradation has to be taken into consideration as an important safety issue. However, since the thermal-hydraulic effects of aging have not been clearly identified, a safety analysis methodology has not been well established so far. Therefore, in this study, the aging effects on thermal-hydraulic characteristics were investigated and a safety margin analysis methodology considering aging effects is proposed.

  10. Development of Safety Margin Analysis Methodology on Aging Effect for CANDU Reactors (II)

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Man Woong; Lee, Sang Kyu; Kim, Hyun Koon; Yoo, Kun Joong; Ryu, Yong Ho [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Choi, Yong Won; Lee, Un Chul [Nuclear Engr. Seoul Nat' l Univ., Seoul (Korea, Republic of)

    2007-10-15

    Considering that the operating time of Wolsong Unit 1 is approaching its design life of 30 years, the aging effect due to component degradation has to be taken into consideration as an important safety issue. However, since the thermal-hydraulic effects of aging have not been clearly identified, a safety analysis methodology has not been well established so far. Therefore, in this study, the aging effects on thermal-hydraulic characteristics were investigated and a safety margin analysis methodology considering aging effects is proposed.

  11. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a level-I probabilistic safety assessment. The work covers: the correct association of distributions to parameters, the importance and qualification of expert opinions, the generation of samples according to sample sizes, and the study of the relationships among system variables and system response. A series of statistical-mathematical techniques is recommended along the development of the analysis methodology, as well as different graphical visualizations for the control of the study. (author)

  12. Vapor Pressure Data Analysis and Correlation Methodology for Data Spanning the Melting Point

    Science.gov (United States)

    2013-10-01

    Final report, ECBC-CR-135, by David E..., covering March-June 2013. Once the specimen is adequately degassed, the liquid menisci in the U-tube are brought to the same level and the pressure is read on the manometer.

  13. Toward a computer-aided methodology for discourse analysis ...

    African Journals Online (AJOL)

    aided methods to discourse analysis”. This project aims to develop an e-learning environment dedicated to documenting, evaluating and teaching the use of corpus linguistic tools suitable for interpretative text analysis. Even though its roots are in ...

  14. Biofuel transportation analysis tool : description, methodology, and demonstration scenarios

    Science.gov (United States)

    2014-01-01

    This report describes a Biofuel Transportation Analysis Tool (BTAT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Department of Defense (DOD) Office of Naval Research ...

  15. Common methodology for steady state harmonic analysis of inverters

    Energy Technology Data Exchange (ETDEWEB)

    Vittek, J. [Technical Univ. of Transport and Communication, Zilina (Slovakia). Dept. of Electric Traction and Energetics; Najjar, M.Y. [Cleveland State Univ., OH (United States)

    1995-07-01

    This paper presents a time-saving analysis of m-phase symmetrical inverter systems in the complex plane. The reduction in computation time arises from the 2m-side symmetry of such systems in the complex plane. Equations for the characteristic values of the voltage and current waveforms in the complex domain are developed. The validity of the analysis is shown for single- and three-phase symmetrical systems under three different modulation techniques, using the equations presented in this paper.

  16. Methodologies and techniques for analysis of network flow data

    Energy Technology Data Exchange (ETDEWEB)

    Bobyshev, A.; Grigoriev, M.; /Fermilab

    2004-12-01

    Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow based tools is an on-going effort. This paper describes the most recent developments in flow analysis at Fermilab.

  17. Theoretical and methodological analysis of personality theories of leadership

    OpenAIRE

    Оксана Григорівна Гуменюк

    2016-01-01

    The psychological analysis of personality theories of leadership, which form the basis for other conceptual approaches to understanding the nature of leadership, is conducted. The conceptual approaches to leadership are analyzed taking into account the priority of personality theories, including the heroic, psychoanalytic, «trait», charismatic and five-factor theories. It is noted that the psychological analysis of personality theories is important in understanding the nature of leadership

  18. Price-volume multifractal analysis of the Moroccan stock market

    Science.gov (United States)

    El Alaoui, Marwane

    2017-11-01

    In this paper, we analyzed price-volume multifractal cross-correlations on the Moroccan Stock Exchange. We chose the period from January 1st, 2000 to January 20th, 2017 to investigate the multifractal behavior of the price change and volume change series. We used the multifractal detrended cross-correlation analysis method (MF-DCCA) and multifractal detrended fluctuation analysis (MF-DFA) to analyze the series. We computed the bivariate generalized Hurst exponent, the Rényi exponent and the singularity spectrum for each pair of indices to measure cross-correlations quantitatively. Furthermore, we used the detrended cross-correlation coefficient (DCCA) and the cross-correlation test (Q(m)) to analyze cross-correlations quantitatively and qualitatively. We found evidence of price-volume multifractal cross-correlations, and the spectrum width indicates a strong multifractal cross-correlation. The volume change series is anti-persistent, as the generalized Hurst exponent shows for all moments q. The cross-correlation test showed the presence of a significant cross-correlation. However, the DCCA coefficient had a small positive value, which means that the level of correlation is not very significant. Finally, we analyzed the sources of multifractality and their degree of contribution to the series.
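
    Two of the quantities used above have compact standard definitions (the usual forms from the DFA/DCCA literature, not anything specific to this paper). The detrended cross-correlation coefficient at scale s is

        \rho_{DCCA}(s) = \frac{F^2_{DCCA}(s)}{F_{DFA,x}(s)\, F_{DFA,y}(s)}

    i.e., the detrended covariance normalized by the two detrended variances, so that -1 \le \rho_{DCCA}(s) \le 1; and multifractality is read off from the scaling of the q-th order fluctuation function, F_q(s) \propto s^{h(q)}, where a q-dependent generalized Hurst exponent h(q) signals a multifractal series.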

  19. Identifying radiotherapy target volumes in brain cancer by image analysis.

    Science.gov (United States)

    Cheng, Kun; Montgomery, Dean; Feng, Yang; Steel, Robin; Liao, Hanqing; McLaren, Duncan B; Erridge, Sara C; McLaughlin, Stephen; Nailon, William H

    2015-10-01

    To establish the optimal radiotherapy fields for treating brain cancer patients, the tumour volume is often outlined on magnetic resonance (MR) images, where the tumour is clearly visible, and mapped onto computerised tomography images used for radiotherapy planning. This process requires considerable clinical experience and is time consuming, a burden that will continue to increase as more complex image sequences are used in this process. Here, the potential of image analysis techniques for automatically identifying the radiation target volume on MR images, and thereby assisting clinicians with this difficult task, was investigated. A gradient-based level set approach was applied to the MR images of five patients with grades II, III and IV malignant cerebral glioma. The relationship between the target volumes produced by image analysis and those produced by a radiation oncologist was also investigated. The contours produced by image analysis were compared with the contours produced by an oncologist and used for treatment. In 93% of cases, the Dice similarity coefficient was found to be between 60 and 80%. This feasibility study demonstrates that image analysis has the potential for automatic outlining in the management of brain cancer patients; however, more testing and validation on a much larger patient cohort are required.
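
    The agreement metric used here, the Dice similarity coefficient, is simple to compute; below is a generic sketch on synthetic binary masks (illustrative values, not the study's data).

    import numpy as np

    # Dice similarity coefficient between two binary masks:
    # DSC = 2|A intersect B| / (|A| + |B|); 1.0 for perfect overlap, 0.0 for none.
    def dice(a: np.ndarray, b: np.ndarray) -> float:
        a, b = a.astype(bool), b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    auto = np.zeros((64, 64), dtype=bool)      # e.g., level-set contour
    auto[20:40, 20:40] = True
    manual = np.zeros((64, 64), dtype=bool)    # e.g., oncologist's contour
    manual[24:44, 16:36] = True
    print(f"DSC = {dice(auto, manual):.2f}")   # 0.64, in the 60-80% band reported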

  20. CADDIS Volume 4. Data Analysis: Exploratory Data Analysis

    Science.gov (United States)

    Intro to exploratory data analysis. Overview of variable distributions, scatter plots, correlation analysis, GIS datasets. Use of conditional probability to examine stressor levels and impairment. Exploring correlations among multiple stressors.

  1. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed
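
    The quantities SOCRATES varies enter risk through the time-averaged unavailability of standby components. A simplified textbook form (a sketch of the usual PRA approximation, not necessarily the code's exact model) is

        \bar{q} \approx q_0 + \tfrac{1}{2} \lambda T + f\, d

    where q_0 is the per-demand failure probability, \lambda the standby failure rate, T the surveillance test interval, f the frequency of maintenance or repair events, and d their mean duration (bounded by the AOT). Lengthening T grows the \lambda T / 2 term, while relaxing the AOT grows the downtime term, which is how changes in technical specifications translate into risk.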

  2. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  3. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    Science.gov (United States)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
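
    A common building block for the aggregation step described above is the weighted linear opinion pool (a generic form; the calibration-based weighting scheme developed in the report is more elaborate):

        f(\theta) = \sum_{i=1}^{n} w_i f_i(\theta), \qquad \sum_i w_i = 1, \quad w_i \ge 0

    where f_i is expert i's elicited probability density for the uncertain quantity \theta and w_i a weight, e.g., derived from each expert's performance on calibration questions.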

  4. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    Science.gov (United States)

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separate and undifferentiated waste. Our methodology allows cost-efficiency analysis, benchmarking and variance analysis, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies; variance analysis can be used to identify the causes of off-standard performance and guide managers to deploy resources more efficiently. The methodology can be implemented by companies lacking a sophisticated management accounting system.
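
    The standard-cost logic is easy to sketch: full cost per stream = standard unit costs x actual quantities, with the variance against actual spending flagging efficiency gaps. All figures, cost categories and names below are illustrative assumptions, not data from the study.

    # Full collection cost from standard unit costs (EUR/tonne, assumed)
    # and actual collected quantities (tonnes/year, assumed).
    STANDARD_COST = {
        "separate":         {"labour": 55.0, "vehicles": 30.0, "overhead": 10.0},
        "undifferentiated": {"labour": 35.0, "vehicles": 20.0, "overhead":  8.0},
    }

    def full_cost(stream: str, tonnes: float) -> float:
        return tonnes * sum(STANDARD_COST[stream].values())

    actual_tonnes = {"separate": 1200.0, "undifferentiated": 3400.0}
    actual_spend  = {"separate": 123000.0, "undifferentiated": 210000.0}

    for stream, tonnes in actual_tonnes.items():
        standard = full_cost(stream, tonnes)
        variance = actual_spend[stream] - standard   # > 0 means off-standard cost
        print(f"{stream}: standard {standard:,.0f} EUR, variance {variance:+,.0f} EUR")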

  5. Recent methodology in the phytochemical analysis of ginseng

    NARCIS (Netherlands)

    Angelova, N.; Kong, H.-W.; Heijden, R. van de; Yang, S.-Y.; Choi, Y.H.; Kim, H.K.; Wang, M.; Hankemeier, T.; Greef, J. van der; Xu, G.; Verpoorte, R.

    2008-01-01

    This review summarises the most recent developments in ginseng analysis, in particular the novel approaches in sample pre-treatment and the use of high-performance liquid-chromatography-mass spectrometry. The review also presents novel data on analysing ginseng extracts by nuclear magnetic resonance

  6. SAFETY ANALYSIS METHODOLOGY FOR AGED CANDU® 6 NUCLEAR REACTORS

    Directory of Open Access Journals (Sweden)

    WOLFGANG HARTMANN

    2013-10-01

    Full Text Available This paper deals with the safety analysis for CANDU® 6 nuclear reactors as affected by aging of the main Heat Transport System (HTS). Operational and aging-related changes of the HTS throughout its lifetime may lead to restrictions in certain safety system settings and hence some restriction in performance under certain conditions. A step in confirming safe reactor operation is the tracking of relevant data and their corresponding interpretation by the use of appropriate thermalhydraulic analytic models. Safety analyses are summarized, ranging from the assessment of safety limits associated with the prevention of intermittent fuel sheath dryout in a slow Loss of Regulation (LOR) analysis to fission gas release after a fuel failure. Specifically for fission gas release, the thermalhydraulic analysis for a fresh core and an 11 Effective Full Power Years (EFPY) aged core is summarized, leading to the most severe stagnation break sizes for the inlet feeder break and the channel failure time. The associated coolant conditions provide the input data for the fuel analyses. Based on the thermalhydraulic data, the fission product inventory under normal operating conditions may be calculated for both fresh and aged cores, and the fission gas release may be evaluated during the transient. This analysis plays a major role in determining possible radiation doses to the public after postulated accidents have occurred.

  7. Physical data generation methodology for return-to-power steam line break analysis

    International Nuclear Information System (INIS)

    Zee, Sung Kyun; Lee, Chung Chan; Lee, Chang Kue

    1996-02-01

    The current methodology for generating physics data for the steam line break accident analysis of CE-type nuclear plants such as Yonggwang Unit 3 is valid only if the core does not reach criticality after shutdown. The methodology therefore requires a tremendous amount of net scram worth, especially at the end of the cycle, when the moderator temperature coefficient is most negative. A new methodology is thus needed to obtain reasonably conservative physics data when the reactor returns to a power condition. The current methodology uses ROCS, which includes only a closed channel model, and it is well known that the closed channel model estimates the core reactivity as too negative if the core flow rate is low. Therefore, a conservative methodology is presented which utilizes an open channel 3D HERMITE model. A return-to-power reactivity credit is produced to supplement the reactivity table generated by the closed channel model. Other data include the hot channel axial power shape, the peaking factor and the maximum quality for DNBR analysis, as well as the pin census for radiological consequence analysis. 48 figs., 22 tabs., 18 refs. (Author)

  8. Pressure fluctuation analysis for charging pump of chemical and volume control system of nuclear power plant

    Directory of Open Access Journals (Sweden)

    Chen Qiang

    2016-01-01

    The Equipment Failure Root Cause Analysis (ERCA) methodology is employed in this paper to investigate the root cause of charging pump pressure fluctuation in the chemical and volume control system (RCV) of a pressurized water reactor (PWR) nuclear power plant. An RCA project task group was set up at the beginning of the analysis process. The possible failure modes are listed according to the characteristics of the charging pump's actual pressure fluctuation and maintenance experience, and the failure modes are analysed in sequence against the collected evidence. The analysis suggests that a shaft nut that gradually loosened in service is the root cause, and corresponding corrective actions are put forward in detail.

  9. Analysis of volume expansion data for periclase, lime, corundum ...

    Indian Academy of Sciences (India)

    We present an analysis of the volume expansion data for periclase (MgO), lime (CaO), corundum (Al2O3) and spinel (MgAl2O4) determined experimentally by Fiquet et al (1999) from 300 K up to 3000 K. The thermal equations of state due to Suzuki et al (1979) and Shanker et al (1997) are used to study the ...

  10. Functional Unfold Principal Component Regression Methodology for Analysis of Industrial Batch Process Data

    DEFF Research Database (Denmark)

    Mears, Lisa; Nørregaard, Rasmus; Sin, Gürkan

    2016-01-01

    This work proposes a methodology utilizing functional unfold principal component regression (FUPCR) for application to industrial batch process data as a process modeling and optimization tool. The methodology is applied to an industrial fermentation dataset containing 30 batches of a production process operating at Novozymes A/S. Following the FUPCR methodology, the final product concentration could be predicted with an average prediction error of 7.4%. Multiple iterations of preprocessing were applied by implementing the methodology to identify the best data handling methods for the model. It is shown that application of functional data analysis and the choice of variance scaling method have the greatest impact on the prediction accuracy. Considering the vast amount of batch process data continuously generated in industry, this methodology can potentially contribute as a tool to identify ...
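
    The unfold-then-regress workflow described above can be illustrated in a few lines. The sketch below is a generic stand-in for FUPCR, assuming synthetic data and scikit-learn's PCA followed by linear regression; the array shapes, component count and variable names are illustrative assumptions, not details taken from the paper.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(42)

      # Hypothetical data: 30 batches x 100 aligned time points x 5 process variables.
      X = rng.normal(size=(30, 100, 5))
      y = rng.normal(loc=10.0, scale=1.0, size=30)   # final product concentration

      # Batch-wise unfolding: each batch becomes one row of time-by-variable features.
      X_unfolded = X.reshape(X.shape[0], -1)

      # Variance scaling -- a preprocessing choice the study found influential.
      X_scaled = (X_unfolded - X_unfolded.mean(axis=0)) / X_unfolded.std(axis=0)

      # Principal component regression: compress, then regress on the scores.
      pca = PCA(n_components=5).fit(X_scaled)
      scores = pca.transform(X_scaled)
      model = LinearRegression().fit(scores, y)

      y_hat = model.predict(scores)
      print("mean relative error: %.1f%%" % (100 * np.mean(np.abs(y_hat - y) / y)))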

  11. Environmental analysis applied to schools. Methodologies for data acquisition

    International Nuclear Information System (INIS)

    Andriola, L.; Ceccacci, R.

    2001-01-01

    The environmental analysis is the basis of environmental management for organizations and is considered the first step in EMAS. It makes it possible to identify and deal with environmental issues and to gain clear knowledge of an organization's environmental performance. Schools can be counted among such organizations. Nevertheless, the complexity of environmental issues and applicable regulations makes it very difficult for a school that wants to implement an environmental management system (EMAS, ISO 14001, etc.) to face this first step. An instrument has therefore been defined that is simple yet complete and coherent with the reference standards, to let schools shape their own process for elaborating the initial environmental review. This instrument consists, essentially, of cards that, when completed, facilitate the drafting of the environmental analysis report.

  12. Exploratory market structure analysis. Topology-sensitive methodology.

    OpenAIRE

    Mazanec, Josef

    1999-01-01

    Given the recent abundance of brand choice data from scanner panels, market researchers have neglected the measurement and analysis of perceptions. Heterogeneity of perceptions is still a largely unexplored issue in market structure and segmentation studies. Over the last decade various parametric approaches toward modelling segmented perception-preference structures, such as combined MDS and Latent Class procedures, have been introduced. These methods, however, are not tailored for qualitative ...

  13. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    International Nuclear Information System (INIS)

    Rearden, Bradley T.; Hopper, Calvin M.; Elam, Karla R.; Goluoglu, Sedat; Parks, Cecil V.

    2003-01-01

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)
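
    For readers unfamiliar with this kind of sensitivity and uncertainty propagation, a minimal sketch of the generic "sandwich rule" follows; the sensitivity vector and covariance matrix below are invented for illustration and are not SCALE/TSUNAMI output.

      import numpy as np

      # Sandwich rule: var(k) = S^T C S, where S holds the relative sensitivities
      # of k-eff to each nuclear data parameter and C is the relative covariance
      # matrix of those parameters. All numbers are illustrative assumptions.
      S = np.array([0.35, -0.12, 0.08])          # sensitivity coefficients
      C = np.array([[4.0e-4, 1.0e-5, 0.0],
                    [1.0e-5, 9.0e-4, 2.0e-5],
                    [0.0,    2.0e-5, 1.6e-4]])   # parameter covariance

      var_k = S @ C @ S
      print("relative k-eff uncertainty: %.3f%%" % (100 * np.sqrt(var_k)))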

  14. Development Risk Methodology for Whole Systems Trade Analysis

    Science.gov (United States)

    2016-08-01

    ... a time assessment for one notional technology. This time assessment is a probability distribution [7] where the area under the curve totals 1.0. (Ref. 7: Hogg and Tanis, "Probability and Statistical Inference (Sixth Edition)", Prentice Hall, 2001.)

  15. CANDU safety analysis system establishment; development of trip coverage and multi-dimensional hydrogen analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H. [KOPEC, Taejon (Korea)

    2002-03-01

    The trip coverage analysis model requires the geometry network for the primary and secondary circuits as well as the plant control system, in order to simulate all possible plant operating conditions throughout the plant life. The model was validated against power maneuvering and the Wolsong 4 commissioning tests. Trip coverage maps were produced for the large-break loss-of-coolant accident and the complete loss of Class IV power event. Reliable multi-dimensional hydrogen analysis requires a strong thermal-hydraulic modelling capability. To acquire this basic capability and verify the applicability of the GOTHIC code, assessments of the heat transfer model and of the hydrogen mixing and combustion models were performed. An assessment methodology for flame acceleration and deflagration-to-detonation transition is also established. 22 refs., 120 figs., 31 tabs. (Author)

  16. Methodological Approach to the Energy Analysis of Unconstrained Historical Buildings

    OpenAIRE

    Chiara Burattini; Fabio Nardecchia; Fabio Bisegna; Lucia Cellucci; Franco Gugliermetti; Andrea de Lieto Vollaro; Ferdinando Salata; Iacopo Golasi

    2015-01-01

    The goal set by the EU of quasi-zero-energy buildings is not easy to reach for a country like Italy, which holds a large number of UNESCO sites, many of them entire historical old towns. This paper focuses on the problem of improving the energy performance of historical Italian architecture through simple interventions that respect the building without changing its shape and structure. The work starts from an energy analysis of a building located in the historic center of Tivoli, a...

  17. Complexity and Vulnerability Analysis of Critical Infrastructures: A Methodological Approach

    Directory of Open Access Journals (Sweden)

    Yongliang Deng

    2017-01-01

    Vulnerability analysis of network models has been widely adopted to explore the potential impacts of random disturbances, deliberate attacks, and natural disasters. However, almost all these models are based on a fixed topological structure, in which the physical properties of infrastructure components and their interrelationships are not well captured. In this paper, a new research framework is put forward to quantitatively explore and assess the complexity and vulnerability of critical infrastructure systems, and a case study is presented to prove the feasibility and validity of the proposed framework. After constructing the metro physical network (MPN), Pajek is employed to analyze its topological properties, including degree, betweenness, average path length, network diameter, and clustering coefficient. A comprehensive understanding of the complexity of the MPN can help the metro system restrain incipient near-misses or accidents and support decision-making in emergency situations. Moreover, the analysis of two simulation protocols for system component failure shows that the MPN turns out to be vulnerable when high-degree nodes or high-betweenness edges are attacked. These findings will be conducive to offering recommendations and proposals for robust design, risk-based decision-making, and prioritization of risk reduction investment.
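
    A toy version of this style of analysis, using NetworkX in place of Pajek, is sketched below; the eight-station graph and the two-node attack are invented for illustration.

      import networkx as nx

      # Small stand-in for the metro physical network (MPN).
      G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("B", "E"),
                    ("E", "F"), ("F", "C"), ("D", "G"), ("G", "H")])

      degree = dict(G.degree())
      betweenness = nx.betweenness_centrality(G)
      print("avg path length:", nx.average_shortest_path_length(G))
      print("diameter:", nx.diameter(G))
      print("clustering:", nx.average_clustering(G))
      print("most central node:", max(betweenness, key=betweenness.get))

      # Targeted-attack protocol: remove the highest-degree nodes first and
      # watch the giant component shrink -- the vulnerability signature above.
      H = G.copy()
      for node, _ in sorted(degree.items(), key=lambda kv: -kv[1])[:2]:
          H.remove_node(node)
      giant = max(nx.connected_components(H), key=len)
      print("giant component after attack:", len(giant), "of", G.number_of_nodes())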

  18. Criteria for the development and use of the methodology for environmentally-acceptable fossil energy site evaluation and selection. Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Eckstein, L.; Northrop, G.; Scott, R.

    1980-02-01

    This report serves as a companion document to Volume 1: Environmentally-Acceptable Fossil Energy Site Evaluation and Selection: Methodology and Users Guide, in which a methodology was developed that allows the siting of fossil fuel conversion facilities in areas with the least environmental impact. The methodology, known as SELECS (Site Evaluation for Energy Conversion Systems), does not replace a site-specific environmental assessment or an environmental impact statement (EIS), but it does enhance the value of an EIS by thinning the number of options down to a manageable level, by doing so in an objective, open and selective manner, and by providing preliminary assessments and procedures that can be utilized during the research and writing of the actual impact statement.

  19. Landslide risk analysis: a multi-disciplinary methodological approach

    Science.gov (United States)

    Sterlacchini, S.; Frigerio, S.; Giacomelli, P.; Brambilla, M.

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis. A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated at 8 913 000 €; indirect damage, on the contrary, ranged considerably ...

  20. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

    A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities. This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated in 8 913 000 €; on the contrary, indirect

  1. Respirable crystalline silica: Analysis methodologies; Silice cristalina respirable: Metodologias de analisis

    Energy Technology Data Exchange (ETDEWEB)

    Gomez-Tena, M. P.; Zumaquero, E.; Ibanez, M. J.; Machi, C.; Escric, A.

    2012-07-01

    This paper describes different analysis methodologies for respirable crystalline silica in occupational environments and raw materials. A review is presented of the existing methodologies, the approximations made, some of the constraints involved, and the best measurement options for the different raw materials. In addition, the different factors that might affect the precision and accuracy of the results are examined. With regard to the methodologies used for the quantitative analysis of any of the polymorphs, particularly quartz, the study centres on the analytical X-ray diffraction method. Simplified calculation methods and experimental separation methods, such as separation by centrifugation, sedimentation, and dust generation in controlled environments, are evaluated for the estimation of this fraction in the raw materials. In addition, a review is presented of the methodologies used for the collection of respirable crystalline silica in environmental dust. (Author)

  2. External Events Analysis for LWRS/RISMC Project: Methodology Development and Early Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Parisi, Carlo [Idaho National Laboratory; Prescott, Steven Ralph [Idaho National Laboratory; Yorg, Richard Alan [Idaho National Laboratory; Coleman, Justin Leigh [Idaho National Laboratory; Szilard, Ronaldo Henriques [Idaho National Laboratory

    2016-02-01

    The ultimate scope of Industrial Application #2 (IA) of the LWRS/RISMC project is a realistic simulation of natural external hazards that pose a threat to a nuclear power plant. This scope requires the development of a methodology and of a qualified set of tools able to perform advanced risk-informed safety analysis. In particular, the methodology should be able to combine results from seismic, flooding and thermal-hydraulic (TH) deterministic calculations with dynamic PRA. This summary presents the key points of the methodology being developed and a first sample application to a simple problem (a spent fuel pool).

  3. Methodology for analysis and simulation of large multidisciplinary problems

    Science.gov (United States)

    Russell, William C.; Ikeda, Paul J.; Vos, Robert G.

    1989-01-01

    The Integrated Structural Modeling (ISM) program is being developed for the Air Force Weapons Laboratory and will be available for Air Force work. Its goal is to provide a design, analysis, and simulation tool intended primarily for directed energy weapons (DEW), kinetic energy weapons (KEW), and surveillance applications. The code is designed to run on DEC (VMS and UNIX), IRIS, Alliant, and Cray hosts. Several technical disciplines are included in ISM, namely structures, controls, optics, thermal, and dynamics. Four topics from the broad ISM goal are discussed. The first is project configuration management and includes two major areas: the software and database arrangement and the system model control. The second is interdisciplinary data transfer and refers to exchange of data between various disciplines such as structures and thermal. Third is a discussion of the integration of component models into one system model, i.e., multiple discipline model synthesis. Last is a presentation of work on a distributed processing computing environment.

  4. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches to intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others present the practical aspects and the...

  5. THE MURCHISON WIDEFIELD ARRAY 21 cm POWER SPECTRUM ANALYSIS METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    Jacobs, Daniel C.; Beardsley, A. P.; Bowman, Judd D. [Arizona State University, School of Earth and Space Exploration, Tempe, AZ 85287 (United States); Hazelton, B. J.; Sullivan, I. S.; Barry, N.; Carroll, P. [University of Washington, Department of Physics, Seattle, WA 98195 (United States); Trott, C. M.; Pindor, B.; Briggs, F.; Gaensler, B. M. [ARC Centre of Excellence for All-sky Astrophysics (CAASTRO) (Australia); Dillon, Joshua S.; Oliveira-Costa, A. de; Ewall-Wice, A.; Feng, L. [MIT Kavli Institute for Astrophysics and Space Research, Cambridge, MA 02139 (United States); Pober, J. C. [Brown University, Department of Physics, Providence, RI 02912 (United States); Bernardi, G. [Department of Physics and Electronics, Rhodes University, Grahamstown 6140 (South Africa); Cappallo, R. J.; Corey, B. E. [MIT Haystack Observatory, Westford, MA 01886 (United States); Emrich, D., E-mail: daniel.c.jacobs@asu.edu [International Centre for Radio Astronomy Research, Curtin University, Perth, WA 6845 (Australia); and others

    2016-07-10

    We present the 21 cm power spectrum analysis approach of the Murchison Widefield Array Epoch of Reionization project. In this paper, we compare the outputs of multiple pipelines for the purpose of validating statistical limits on cosmological hydrogen at redshifts between 6 and 12. Multiple independent data calibration and reduction pipelines are used to set power spectrum limits on a fiducial night of data. Comparing the outputs of the imaging and power spectrum stages highlights differences in calibration, foreground subtraction, and power spectrum calculation. The power spectra found using these different methods span a space defined by the various tradeoffs between speed, accuracy, and systematic control. Lessons learned from comparing the pipelines range from the algorithmic to the prosaically mundane; all demonstrate the many pitfalls of neglecting reproducibility. We briefly discuss how these different methods attempt to handle the question of evaluating a significant detection in the presence of foregrounds.

  6. Mediation analysis in nursing research: a methodological review.

    Science.gov (United States)

    Liu, Jianghong; Ulrich, Connie

    2016-12-01

    Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.
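
    As a concrete illustration of the product-of-coefficients logic behind such models, here is a minimal sketch using statsmodels on simulated data; the variable roles and effect sizes are assumptions for demonstration only.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 200
      x = rng.normal(size=n)                       # predictor (e.g., an intervention)
      m = 0.5 * x + rng.normal(size=n)             # hypothesized mediator
      y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # outcome

      # Path a: predictor -> mediator.
      a = sm.OLS(m, sm.add_constant(x)).fit().params[1]
      # Path b: mediator -> outcome, controlling for the predictor.
      b = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit().params[2]

      print("indirect (mediated) effect a*b = %.3f" % (a * b))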

  7. Methodology for global nonlinear analysis of nuclear systems

    International Nuclear Information System (INIS)

    Cacuci, D.G.; Cacuci, G.L.

    1987-01-01

    This paper outlines a general method for globally computing the crucial features of nonlinear problems: bifurcations, limit points, saddle points, and extrema (maxima and minima); our method also yields the local sensitivities (i.e., first-order derivatives) of the system's state variables (e.g., fluxes, power, temperatures, flows) at any point in the system's phase space. We also present an application of this method to the nonlinear BWR model discussed in Refs. 8 and 11. The most significant novel feature of our method is the recasting of a general mathematical problem comprising several aspects, among them (1) nonlinear constrained optimization and (2) sensitivity analysis, into a fixed point problem of the form F[u(s), λ(s)] = 0, whose global zeros and singular points are related to the special features (i.e., extrema, bifurcations, etc.) of the original problem.

  8. UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg; Craig Rieger

    2013-07-01

    It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution, and several researchers are making headway on this problem. However, the inability to easily determine the magnitude of a building's effective thermal mass, and how the heating, ventilation and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems which utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is then successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
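
    The core idea, treating eigenvectors of a lumped thermal network as modes of energy transfer, can be sketched in a few lines. The two-mass RC model and its parameter values below are invented for illustration and are not the building model from the paper.

      import numpy as np

      # Lumped two-mass thermal network, free response: C dT/dt = -K T.
      # C: thermal capacitances [J/K]; K: conductance matrix [W/K].
      C = np.diag([5.0e6, 2.0e7])                  # light mass vs. heavy slab
      K = np.array([[300.0, -200.0],
                    [-200.0, 250.0]])

      # Modes of energy transfer = eigenvectors of the state matrix A = -C^-1 K;
      # each eigenvalue gives a modal time constant tau = -1/lambda.
      A = -np.linalg.inv(C) @ K
      eigvals, eigvecs = np.linalg.eig(A)
      for lam, vec in zip(eigvals, eigvecs.T):
          print("tau = %.1f h, mode shape = %s" % (-1 / lam / 3600, np.round(vec, 2)))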

  9. CADDIS Volume 4. Data Analysis: Selecting an Analysis Approach

    Science.gov (United States)

    An approach for selecting statistical analyses to inform causal analysis. Describes methods for determining whether test site conditions differ from reference expectations. Describes an approach for estimating stressor-response relationships.

  10. submitter Methodologies for the Statistical Analysis of Memory Response to Radiation

    CERN Document Server

    Bosser, Alexandre L; Tsiligiannis, Georgios; Frost, Christopher D; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigne, Frederic; Virtanen, Ari; Wrobel, Frederic; Dilillo, Luigi

    2016-01-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].

  11. Methodology of the Integrated Analysis of Company's Financial Status and Its Performance Results

    OpenAIRE

    Mackevičius, Jonas; Valkauskas, Romualdas

    2010-01-01

    Information about a company's financial status and its performance results is very important for the objective evaluation of the company's position in the market and its competitive possibilities in the future. Such information is provided in the financial statements, and it is important to apply and investigate it properly. A methodology for the integrated analysis of a company's financial status and performance results is recommended in this article. This methodology consists of three elements...

  12. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As an interdisciplinary domain requiring advanced and innovative methodologies, computational forensics is characterized by data that are simultaneously large-scale, uncertain, multidimensional and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper, a methodology and an automatic procedure, based on fuzzy set theory and designed to infer precis...
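
    A plain-NumPy fuzzy c-means routine conveys the kind of soft clustering the abstract refers to; the data and parameter choices below are assumptions, and the paper's actual procedure is more elaborate.

      import numpy as np

      def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
          # Fuzzy c-means: each point gets a soft membership in every cluster
          # instead of a hard label.
          rng = np.random.default_rng(seed)
          U = rng.random((len(X), c))
          U /= U.sum(axis=1, keepdims=True)              # memberships sum to 1
          for _ in range(n_iter):
              W = U ** m                                  # fuzzified weights
              centers = (W.T @ X) / W.sum(axis=0)[:, None]
              d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
              p = 2.0 / (m - 1.0)
              U = (d ** -p) / (d ** -p).sum(axis=1, keepdims=True)
          return centers, U

      # Two overlapping blobs stand in for uncertain, approximate forensic data.
      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
      centers, U = fuzzy_c_means(X)
      print(np.round(centers, 2))     # cluster prototypes
      print(np.round(U[:3], 2))       # soft memberships of the first three points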

  13. Automated microscopic characterization of metallic ores with image analysis: a key to improve ore processing. I: test of the methodology

    International Nuclear Information System (INIS)

    Berrezueta, E.; Castroviejo, R.

    2007-01-01

    Ore microscopy has traditionally been an important support for the control of ore processing, but the volume of present-day processes is beyond the reach of human operators. Automation is therefore compulsory, but its development through digital image analysis (DIA) is limited by various problems, such as the similarity in reflectance values of some important ores, their anisotropism, and the performance of instruments and methods. The results presented show that automated identification and quantification by DIA are possible through multiband (RGB) determinations with a research 3CCD video camera on a reflected-light microscope. These results were obtained by systematic measurement of selected ores accounting for most of the industrial applications. Polarized light is avoided, so the effects of anisotropism can be neglected. Quality control at various stages and statistical analysis are important, as is the application of complementary criteria (e.g. metallogenetic). The sequential methodology is described and illustrated through practical examples. (Author)

  14. A fast reactor transient analysis methodology for personal computers

    International Nuclear Information System (INIS)

    Ott, K.O.

    1993-01-01

    A simplified model for liquid-metal-cooled reactor (LMR) transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All 30 differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied, as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes a new form, the quadratic dynamics equation. In this integral formulation, the initial value problem of typical LMR transients can be solved with large time steps (initially 1 s, later up to 256 s), which makes transient problems amenable to treatment on a personal computer. The resulting mathematical model forms the basis for the GW-BASIC LMR transient calculation (LTC) program, which has also been converted to QuickBASIC. The running time for a 10-h overpower transient is then ∼40 to 10 s, depending on the hardware version (286, 386, or 486 with math coprocessor).
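
    The prompt jump approximation mentioned above can be illustrated with one-delayed-group point kinetics: setting the fast neutron-density derivative to zero leaves a single slow precursor equation that tolerates large time steps. This is a generic sketch with illustrative constants, not the LTC program.

      # One-delayed-group point kinetics under the prompt jump approximation:
      # dn/dt ~ 0 gives n = lam * Lambda * C / (beta - rho), and the precursor
      # balance reduces to dC/dt = lam * rho / (beta - rho) * C.
      beta, lam, Lambda = 0.0035, 0.08, 4.0e-7    # delayed fraction, decay, gen. time
      rho = -0.001                                # constant negative reactivity [dk/k]

      C = beta / (lam * Lambda)                   # steady-state precursors for n0 = 1
      dt, t_end = 1.0, 600.0                      # large 1 s steps are stable here
      for _ in range(int(t_end / dt)):
          C += dt * lam * rho / (beta - rho) * C  # explicit Euler on the slow variable
      n = lam * Lambda * C / (beta - rho)
      print("relative power after %.0f s: %.2e" % (t_end, n))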

  15. Notions and methodologies for uncertainty analysis in simulations of transitory events of a nuclear central

    International Nuclear Information System (INIS)

    Alva N, J.; Ortiz V, J.; Amador G, R.; Delfin L, A.

    2007-01-01

    The objective of the present work is to gather the basic notions related to uncertainty analysis and some of the methodologies to be applied in studies of transient events in a nuclear power station, in particular thermal-hydraulic phenomena. The concepts and methodologies mentioned in this work are the result of an exhaustive bibliographical investigation of the topic in the nuclear field. Uncertainty analysis methodologies have been developed by diverse institutions and are broadly used worldwide for application to the results of best-estimate computer codes in thermal-hydraulic and safety analyses of nuclear plants and reactors. The main sources of uncertainty, the types of uncertainty, and aspects related to best-estimate models and methods are also presented. (Author)

  16. Economy As A Phenomenon Of Culture: Theoretical And Methodological Analysis

    Directory of Open Access Journals (Sweden)

    S. N. Ivaskovsky

    2017-01-01

    The article redefines economy as a phenomenon of culture, a product of a historically and socially grounded set of values shared by members of a given society. The research shows that culture is not always identical to social utility, because there are multiple examples in which archaic, traditionalist, irrational cultural norms hinder social and economic progress and trap nations in poverty and underdevelopment. One of the reasons for the lack of scholarly attention to the cultural dimension of economy is the triumph of positivism in economics. Mathematics has become the dominant language of economic analysis, which leads to the transformation of economics into a sort of «social physics», accompanied by the loss of the original humanitarian nature shared by the works of all the great economists of the past. The author emphasizes the importance of an interdisciplinary approach to economic research and the incorporation of the achievements of the other social disciplines (history, philosophy, sociology and cultural studies) into the subject matter of economic theory. Substantiating the main thesis of the article, the author shows that there is a profound ontological bond between economy and culture, which primarily consists in the fact that these spheres of human relations are aimed at the solution of the same problem: the competitive selection of the best ways for people to survive and to satisfy their relevant living needs. In order to overcome the difficulties related to the inclusion of culture in the set of analytical tools used in economic theory, the author suggests using the category of «cultural capital», which re-establishes the earlier and more familiar meaning of capital for economists.

  17. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future ...
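
    As a small example of the primary, empirical model class discussed above, the sketch below fits a logistic growth curve to synthetic plate counts with SciPy; the data and parameter values are invented.

      import numpy as np
      from scipy.optimize import curve_fit

      def log10_logistic(t, a, b, mu):
          # a = log10 initial count, b = log10 maximum count, mu = growth rate [1/h]
          return b - np.log10(1 + (10 ** (b - a) - 1) * np.exp(-mu * t))

      t = np.linspace(0, 24, 25)                                # hours
      rng = np.random.default_rng(3)
      y = log10_logistic(t, 3.0, 9.0, 0.8) + rng.normal(0, 0.1, t.size)

      params, _ = curve_fit(log10_logistic, t, y, p0=[2.0, 8.0, 0.5])
      print("log10 N0 = %.2f, log10 Nmax = %.2f, mu = %.2f 1/h" % tuple(params))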

  18. A methodology for estimating the volume of Baltic timber to Spain using the Sound Toll Registers : 1670-1806

    NARCIS (Netherlands)

    Gallagher, Nathan

    2016-01-01

    The Sound Toll Registers Online project has opened a trove of information for historians, but calculating the actual volume of the trade it represents remains a challenge. Attempts have been made for products that were measured in weight or volume, but timber products were usually recorded by the ...

  19. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  20. Comparative analysis of two weight-of-evidence methodologies for integrated sediment quality assessment.

    Science.gov (United States)

    Khosrovyan, A; Rodríguez-Romero, A; Antequera Ramos, M; DelValls, T A; Riba, I

    2015-02-01

    The results of sediment quality assessment by two different weight-of-evidence methodologies were compared. Both methodologies used the same dataset but as criteria and procedures were different, the results emphasized different aspects of sediment contamination. One of the methodologies integrated the data by means of a multivariate analysis and suggested bioavailability of contaminants and their spatial distribution. The other methodology, used in the dredged material management framework recently proposed in Spain, evaluated sediment toxicity in general by assigning categories. Despite the differences in the interpretation and presentation of results, the methodologies evaluated sediment risk similarly, taking into account chemical concentrations and toxicological effects. Comparison of the results of different approaches is important to define their limitations and thereby avoid implications of potential environmental impacts from different management options, as in the case of dredged material risk assessment. Consistent results of these two methodologies emphasized validity and robustness of the integrated, weight-of-evidence, approach to sediment quality assessment. Limitations of the methodologies were discussed.

  1. Study for Improving the Definition of the Army Objective Force Methodology, Phase II (IDOFOR II). Volume I. Executive Summary.

    Science.gov (United States)

    1981-10-01

    ... produced by the methodology are expected to have a shelf life of 2 years or more. The improved methodology has embedded in it the capability to ascribe funding and other resources to each future objective force design considered. Cost estimates must be ...

  2. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology, proceeding from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis, and a new computer mouse design using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were compared to determine the environmental impact by category. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest impact in the environmental categories and the greatest maximum stress value.

  3. Synfuel program analysis. Volume I. Procedures-capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This is the first of two volumes describing the analytic procedures and resulting capabilities developed by Resource Applications (RA) for examining the economic viability, public costs, and national benefits of alternative synfuel projects and integrated programs. This volume is intended for Department of Energy (DOE) and Synthetic Fuel Corporation (SFC) program management personnel and includes a general description of the costing, venture, and portfolio models with enough detail for the reader to be able to specify cases and interpret outputs. It also contains an explicit description (with examples) of the types of results which can be obtained when applied to: the analysis of individual projects; the analysis of input uncertainty, i.e., risk; and the analysis of portfolios of such projects, including varying technology mixes and buildup schedules. In all cases, the objective is to obtain, on the one hand, comparative measures of private investment requirements and expected returns (under differing public policies) as they affect the private decision to proceed, and, on the other, public costs and national benefits as they affect public decisions to participate (in what form, in what areas, and to what extent).

  4. Experimental analysis of fuzzy controlled energy efficient demand controlled ventilation economizer cycle variable air volume air conditioning system

    Directory of Open Access Journals (Sweden)

    Rajagopalan Parameshwaran

    2008-01-01

    In the quest for energy-conservative building design, there is now a great opportunity for a flexible and sophisticated air conditioning system capable of delivering the better thermal comfort, indoor air quality, and energy efficiency that are strongly desired. The variable refrigerant volume air conditioning system provides considerable energy savings, cost effectiveness and reduced space requirements. Applications of intelligent control, such as fuzzy logic controllers, especially adapted to variable air volume air conditioning systems, have drawn more interest in recent years than classical control systems. An experimental analysis was performed to investigate the inherent operational characteristics of combined variable refrigerant volume and variable air volume air conditioning systems under fixed ventilation, demand-controlled ventilation, and combined demand-controlled ventilation and economizer cycle techniques for two seasonal conditions. The test results for each technique are presented. They show that the system controlled by the fuzzy logic methodology and operated under the CO2-based mechanical ventilation scheme yields average energy savings of 37% and 56% per day in summer and winter conditions, respectively. Based on the experimental results, the fuzzy-based combined system can be considered an alternative energy-efficient air conditioning scheme, having significant energy-saving potential compared to the conventional constant air volume air conditioning system.

  5. Safety assessment methodologies for near surface disposal facilities. Results of a co-ordinated research project (ISAM). Volume 1: Review and enhancement of safety assessment approaches and tools. Volume 2: Test cases

    International Nuclear Information System (INIS)

    2004-07-01

    ... the Safety Guide on 'Safety Assessment for Near Surface Disposal of Radioactive Waste' (Safety Standards Series No. WS-G-1.1). The report of this CRP is presented in two volumes: Volume 1 contains a summary and a complete description of the ISAM project methodology, and Volume 2 presents the application of the methodology to three hypothetical test cases.

  6. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    International Nuclear Information System (INIS)

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-01-01

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report

  7. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  8. Analysis of increasing trend of mortgage volume in the Czech Republic

    Directory of Open Access Journals (Sweden)

    Petra Střelcová

    2009-01-01

    The aim of this paper is an empirical analysis of mortgage volume in the Czech Republic and the identification of factors behind the increasing trend in mortgage volume in the period from 2001 to 2007. Firstly, analyses of the quarterly time series of mortgage volume and average mortgage rate are performed, and the causality between mortgage volume and the average mortgage rate is examined; the mortgage rate is the most important factor in economic agents' decisions on residential investment. Afterwards, the causality between mortgage volume and selected factors is analysed via multiple regression analysis, and the factors describing mortgage volume are selected on this basis. Our empirical analysis validates the causality between mortgage volume and the mortgage rate, the unemployment rate, and the price level of real estate. Part of this paper is also an economic interpretation of the causality and an estimation of the expected development of mortgage volume, especially in connection with the present economic and business recession.
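
    The multiple-regression step described above can be sketched with statsmodels; the quarterly series below are simulated stand-ins for the Czech data, and the coefficients are assumptions for illustration.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(7)
      n = 28                                         # quarterly data, 2001-2007

      # Synthetic stand-ins for the explanatory factors named in the abstract.
      rate = 6 - 0.1 * np.arange(n) + rng.normal(0, 0.2, n)      # mortgage rate [%]
      unemp = 8 + rng.normal(0, 0.5, n)                          # unemployment [%]
      prices = 100 * 1.02 ** np.arange(n) + rng.normal(0, 2, n)  # real-estate index
      volume = 500 - 40 * rate - 10 * unemp + 2 * prices + rng.normal(0, 10, n)

      X = sm.add_constant(np.column_stack([rate, unemp, prices]))
      fit = sm.OLS(volume, X).fit()
      print(fit.summary())      # coefficients, t-statistics and R^2 for the factors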

  9. Tourism Methodologies

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different ways ... in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt ...

  10. French Epistemology and its Revisions: Towards a Reconstruction of the Methodological Position of Foucaultian Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2007-05-01

    Full Text Available This article reconstructs epistemology in the tradition of Gaston BACHELARD as one of the main foundations of the methodology of FOUCAULTian discourse analysis. Foundational concepts and the methodological approach of French epistemology are one of the continuities in the work of Michel FOUCAULT. BACHELARDian epistemology (and of his successor Georges CANGUILHEM can be used for the reconstruction of the FOUCAULTian methodology and it can also be used to instruct the practices of FOUCAULTian discourse analysis as a stand-alone form of qualitative social research. French epistemology was developed in critical opposition to the phenomenology of Edmund HUSSERL, and to phenomenological theories of science. Because the phenomenology of HUSSERL is one foundation of social phenomenology, the reconstruction of the FOUCAULTian methodology—as built on the French tradition of BACHELARDian epistemology—makes it clear that FOUCAULTian discourse analysis is incommensurable with approaches derived from social phenomenology. The epistemology of BACHELARD is portrayed as a proto-version of discourse analysis. Discourses as well as discourse analyses are conceived as forms of socio-epistemological practice. In this article, the main concepts and strategies of French epistemology are introduced and related to discourse analysis. The consequences of epistemology for a self-reflexive methodology and its practice are discussed. URN: urn:nbn:de:0114-fqs0702241

  11. Applications of a methodology for the analysis of learning trends in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, Hang Youn; Choi, Sung Nam; Yun, Won Yong

    1995-01-01

    A methodology is applied to identify learning trends related to the safety and availability of U.S. commercial nuclear power plants, with the aim of reducing the likelihood of human errors. To ensure that the methodology can easily be adapted to various classification schemes for operation data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme was selected. Significance criteria for human-initiated events affecting the systems, and for events caused by human deficiencies, were used. Clustering analysis was used to identify learning trends in multi-dimensional histograms. A computer code based on the K-Means algorithm was developed and applied to find the learning period, in which error rates decrease monotonically with plant age.
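
    A minimal sketch of the clustering step is given below, using scikit-learn's KMeans on simulated plant-year event rates; the feature construction is a guess at TRACE-style inputs, not the original data bank.

      import numpy as np
      from sklearn.cluster import KMeans

      # Hypothetical features: one row per plant-year, columns are rates of
      # human-initiated events in three illustrative categories.
      rng = np.random.default_rng(5)
      plant_age = np.repeat(np.arange(1, 11), 8)              # ages 1..10 years
      base = 5.0 * np.exp(-0.25 * plant_age)[:, None]         # learning: decaying rate
      rates = rng.poisson(base + 0.5, size=(plant_age.size, 3))

      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(rates)

      # A learning trend shows up as clusters ordered by age with falling rates.
      for k in range(3):
          sel = labels == k
          print("cluster %d: mean age %.1f, mean event rate %.2f"
                % (k, plant_age[sel].mean(), rates[sel].mean()))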

  12. Is Article Methodological Quality Associated With Conflicts of Interest?: An Analysis of the Plastic Surgery Literature.

    Science.gov (United States)

    Cho, Brian H; Lopez, Joseph; Means, Jessica; Lopez, Sandra; Milton, Jacqueline; Tufaro, Anthony P; May, James W; Dorafshar, Amir H

    2017-12-01

    Conflicts of interest (COI) are an emerging area of discussion within the field of plastic surgery. Recently, several reports have found that research studies that disclose COI are associated with publication of positive outcomes. We hypothesize that this association is driven by higher-quality studies receiving industry funding. This study aimed to investigate the association between industry support and study methodological quality. We reviewed all entries in Plastic and Reconstructive Surgery, Annals of Plastic Surgery, and Journal of Plastic, Reconstructive, and Aesthetic Surgery within a 1-year period encompassing 2013. All clinical research articles were analyzed. Studies were evaluated blindly for methodology quality based on a validated scoring system. An ordinal logistic regression model was used to examine the association between methodology score and COI. A total of 1474 articles were reviewed, of which 483 met our inclusion criteria. These articles underwent methodological quality scoring. Conflicts of interest were reported in 28 (5.8%) of these articles. After adjusting for article characteristics in the ordinal logistic regression analysis, there was no significant association between articles with COI and higher methodological scores (P = 0.7636). Plastic surgery studies that disclose COI are not associated with higher methodological quality when compared with studies that do not disclose COI. These findings suggest that although the presence of COI is associated with positive findings, the association is not shown to be driven by higher-quality studies.

  13. Identifying items to assess methodological quality in physical therapy trials: a factor analysis.

    Science.gov (United States)

    Armijo-Olivo, Susan; Cummings, Greta G; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-09-01

    Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). A methodological research design was used, and an EFA was performed. Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items.
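
    The exploratory factor analysis workflow can be sketched as follows; scikit-learn's maximum-likelihood FactorAnalysis with varimax rotation stands in for the principal axis factoring used in the study, and the item responses are simulated.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      # Synthetic stand-in: 100 trials scored on 10 binary quality items that
      # load on two underlying constructs.
      rng = np.random.default_rng(11)
      latent = rng.normal(size=(100, 2))
      loadings = rng.uniform(0.3, 0.9, size=(2, 10))
      items = (latent @ loadings + rng.normal(0, 0.5, (100, 10)) > 0).astype(float)

      fa = FactorAnalysis(n_components=2, rotation="varimax").fit(items)
      print(np.round(fa.components_.T, 2))   # item loadings on each common factor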

  14. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Science.gov (United States)

    Toddi A. Steelman; Branda Nowell; Deena. Bayoumi; Sarah. McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  15. An analysis of the methodological underpinnings of social learning research in natural resource Management

    NARCIS (Netherlands)

    Rodela, R.; Cundill, G.; Wals, A.E.J.

    2012-01-01

    This analysis is focused on research that uses a social learning approach to study natural resource issues. We map out the prevailing epistemological orientation of social learning research through the de-construction of the methodological choices reported in current social learning literature.

  16. A Methodology for the Analysis of Memory Response to Radiation through Bitmap Superposition and Slicing

    CERN Document Server

    Bosser, A.; Tsiligiannis, G.; Ferraro, R.; Frost, C.; Javanainen, A.; Puchner, H.; Rossi, M.; Saigne, F.; Virtanen, A.; Wrobel, F.; Zadeh, A.; Dilillo, L.

    2015-01-01

    A methodology is proposed for the statistical analysis of memory radiation test data, with the aim of identifying trends in the single-event upset (SEU) distribution. The treated case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions.

  17. Analysis of Introducing Active Learning Methodologies in a Basic Computer Architecture Course

    Science.gov (United States)

    Arbelaitz, Olatz; José I. Martín; Muguerza, Javier

    2015-01-01

    This paper presents an analysis of introducing active methodologies in the Computer Architecture course taught in the second year of the Computer Engineering Bachelor's degree program at the University of the Basque Country (UPV/EHU), Spain. The paper reports the experience from three academic years, 2011-2012, 2012-2013, and 2013-2014, in which…

  18. Genome-wide expression studies of atherosclerosis: critical issues in methodology, analysis, interpretation of transcriptomics data

    NARCIS (Netherlands)

    Bijnens, A. P. J. J.; Lutgens, E.; Ayoubi, T.; Kuiper, J.; Horrevoets, A. J.; Daemen, M. J. A. P.

    2006-01-01

    During the past 6 years, gene expression profiling of atherosclerosis has been used to identify genes and pathways relevant in vascular (patho)physiology. This review discusses some critical issues in the methodology, analysis, and interpretation of the data of gene expression studies that have made

  19. Combining soft system methodology and pareto analysis in safety management performance assessment : an aviation case

    NARCIS (Netherlands)

    Karanikas, Nektarios

    2016-01-01

    Although reengineering is strategically advantageous for organisations seeking to remain functional and sustainable, safety must remain a priority and the corresponding efforts need to be maintained. This paper suggests the combination of soft system methodology (SSM) and Pareto analysis on the scope of

  20. Validation of On-Orbit Methodology for the Assessment of Cardiac Function and Changes in the Circulating Volume Using Ultrasound and Braslet-M Occlusion Cuffs

    Science.gov (United States)

    Hamilton, Douglas; Sargsyan, Ashot E.; Ebert, Douglas; Duncan, Michael; Bogomolov, Valery V.; Alferova, Irina V.; Matveev, Vladimir P.; Dulchavsky, Scott A.

    2010-01-01

    The objective of this joint U.S. - Russian project was the development and validation of an in-flight methodology to assess a number of cardiac and vascular parameters associated with circulating volume and its manipulation in long-duration space flight. Responses to modified Valsalva and Mueller maneuvers were measured by cardiac and vascular ultrasound (US) before, during, and after temporary volume reduction by means of Braslet-M thigh occlusion cuffs (Russia). Materials and Methods: The study protocol was conducted in 14 sessions on 9 ISS crewmembers, with an average exposure to microgravity of 122 days. Baseline cardiovascular measurements were taken by echocardiography in multiple modes (including tissue Doppler of both ventricles) and femoral and jugular vein imaging on the International Space Station (ISS). The Braslet devices were then applied and measurements were repeated after >10 minutes. The cuffs were then released and the hemodynamic recovery process was monitored. Modified Valsalva and Mueller maneuvers were used throughout the protocol. All US data were acquired by the HDI-5000 ultrasound system aboard the ISS (ATL/Philips, USA) during remotely guided sessions. The study protocol, including the use of Braslet-M for this purpose, was approved by the ISS Human Research Multilateral Review Board (HRMRB). Results: The effects of fluid sequestration on a number of echocardiographic and vascular parameters were readily detectable by in-flight US, as were responses to respiratory maneuvers. The overall volume status assessment methodology appears to be valid and practical, with a decrease in left heart lateral E (tissue Doppler) as one of the most reliable measures. Increase in the femoral vein cross-sectional areas was consistently observed with Braslet application. Other significant differences and trends within the extensive cardiovascular data were also observed. (Decreased - RV and LV preload indices, Cardiac Output, LV E all maneuvers, LV Stroke

  1. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed, and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, as well as parametric survival models, including accelerated failure time (AFT) models with log-normal, log-logistic, and Weibull distributions, were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.
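
    The article supplies R code; for orientation, a roughly analogous log-normal AFT fit can be sketched in Python. The sketch assumes the lifelines package and its fit_left_censoring entry point for parametric AFT fitters; the column names and values are invented placeholders, and the authors' R implementation remains the reference.

        import pandas as pd
        from lifelines import LogNormalAFTFitter

        # Hypothetical frame: one row per (protein, sample). 'observed' is 0
        # where the peak intensity is left-censored at the detection limit;
        # 'group' encodes the experimental condition being compared.
        df = pd.DataFrame({
            "intensity": [10.2, 8.1, 7.7, 5.0, 5.0, 12.3],
            "observed":  [1, 1, 1, 0, 0, 1],
            "group":     [0, 0, 1, 1, 0, 1],
        })

        aft = LogNormalAFTFitter()
        aft.fit_left_censoring(df, duration_col="intensity", event_col="observed")
        aft.print_summary()   # the 'group' coefficient tests differential abundance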

  2. Systemic design methodologies for electrical energy systems analysis, synthesis and management

    CERN Document Server

    Roboam, Xavier

    2012-01-01

    This book proposes systemic design methodologies applied to electrical energy systems, in particular analysis and system management, modeling and sizing tools. It includes 8 chapters: after an introduction to the systemic approach (history, basics & fundamental issues, index terms) for designing energy systems, this book presents two different graphical formalisms especially dedicated to multidisciplinary devices modeling, synthesis and analysis: Bond Graph and COG/EMR. Other systemic analysis approaches for quality and stability of systems, as well as for safety and robustness analysis tools are also proposed. One chapter is dedicated to energy management and another is focused on Monte Carlo algorithms for electrical systems and networks sizing. The aim of this book is to summarize design methodologies based in particular on a systemic viewpoint, by considering the system as a whole. These methods and tools are proposed by the most important French research laboratories, which have many scientific partn...

  3. Methodology for national risk analysis and prioritization of toxic industrial chemicals.

    Science.gov (United States)

    Taxell, Piia; Engström, Kerstin; Tuovila, Juha; Söderström, Martin; Kiljunen, Harri; Vanninen, Paula; Santonen, Tiina

    2013-01-01

    The identification of chemicals that pose the greatest threat to human health from incidental releases is a cornerstone in public health preparedness for chemical threats. The present study developed and applied a methodology for the risk analysis and prioritization of industrial chemicals to identify the most significant chemicals that pose a threat to public health in Finland. The prioritization criteria included acute and chronic health hazards, physicochemical and environmental hazards, national production and use quantities, the physicochemical properties of the substances, and the history of substance-related incidents. The presented methodology enabled a systematic review and prioritization of industrial chemicals for the purpose of national public health preparedness for chemical incidents.

  4. Low Tidal Volume versus Non-Volume-Limited Strategies for Patients with Acute Respiratory Distress Syndrome. A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Walkey, Allan J; Goligher, Ewan C; Del Sorbo, Lorenzo; Hodgson, Carol L; Adhikari, Neill K J; Wunsch, Hannah; Meade, Maureen O; Uleryk, Elizabeth; Hess, Dean; Talmor, Daniel S; Thompson, B Taylor; Brower, Roy G; Fan, Eddy

    2017-10-01

    Trials investigating use of lower tidal volumes and inspiratory pressures for patients with acute respiratory distress syndrome (ARDS) have shown mixed results. To compare clinical outcomes of mechanical ventilation strategies that limit tidal volumes and inspiratory pressures (LTV) to strategies with tidal volumes of 10 to 15 ml/kg among patients with ARDS. This is a systematic review and meta-analysis of clinical trials investigating LTV mechanical ventilation strategies. We used random effects models to evaluate the effect of LTV on 28-day mortality, organ failure, ventilator-free days, barotrauma, oxygenation, and ventilation. Our primary analysis excluded trials for which the LTV strategy was combined with the additional strategy of higher positive end-expiratory pressure (PEEP), but these trials were included in a stratified sensitivity analysis. We performed metaregression of tidal volume gradient achieved between intervention and control groups on mortality effect estimates. We used Grading of Recommendations Assessment, Development, and Evaluation methodology to determine the quality of evidence. Seven randomized trials involving 1,481 patients met eligibility criteria for this review. Mortality was not significantly lower for patients receiving an LTV strategy (33.6%) as compared with control strategies (40.4%) (relative risk [RR], 0.87; 95% confidence interval [CI], 0.70-1.08; heterogeneity statistic I² = 46%), nor did an LTV strategy significantly decrease barotrauma or ventilator-free days when compared with a lower PEEP strategy. Quality of evidence for clinical outcomes was downgraded for imprecision. Metaregression showed a significant inverse association between larger tidal volume gradient between LTV and control groups and log odds ratios for mortality (β, -0.1587; P = 0.0022). Sensitivity analysis including trials that protocolized an LTV/high PEEP cointervention showed lower mortality associated with LTV (nine trials and 1
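
    The pooled estimate quoted above (RR, 0.87; 95% CI, 0.70-1.08; I² = 46%) is the output of a standard random effects computation, which is compact enough to sketch directly. The per-trial log relative risks and standard errors below are placeholders, not the seven trials' data, and the sketch assumes the common DerSimonian-Laird estimator rather than whichever estimator the authors used.

        import numpy as np

        # Placeholder per-trial log relative risks and standard errors.
        y  = np.array([-0.35, -0.10, 0.05, -0.30, -0.20, 0.10, -0.15])
        se = np.array([ 0.20,  0.15, 0.25,  0.18,  0.22, 0.30,  0.17])

        w = 1.0 / se**2                                    # fixed-effect weights
        q = np.sum(w * (y - np.average(y, weights=w))**2)  # Cochran's Q
        dof = len(y) - 1
        c = w.sum() - (w**2).sum() / w.sum()
        tau2 = max(0.0, (q - dof) / c)                     # between-trial variance

        w_re = 1.0 / (se**2 + tau2)                        # random-effects weights
        mu = np.average(y, weights=w_re)
        se_mu = np.sqrt(1.0 / w_re.sum())
        i2 = max(0.0, (q - dof) / q) * 100.0               # heterogeneity I^2

        lo, hi = mu - 1.96 * se_mu, mu + 1.96 * se_mu
        print(f"RR {np.exp(mu):.2f} "
              f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f}), I^2 = {i2:.0f}%")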

  5. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 2. Appendixes

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume 2 contains program listings including subroutines for the four TSC frequency domain programs described in V...

  6. Review of Department of Defense Education Activity (DODEA) Schools. Volume II: Quantitative Analysis of Educational Quality

    National Research Council Canada - National Science Library

    Anderson, Lowell

    2000-01-01

    This volume compiles, and presents in integrated form, IDA's quantitative analysis of educational quality provided by DoD's dependent schools. It covers the quantitative aspects of volume I in greater...

  7. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  8. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    The article is devoted to the theoretical and methodological principles of the strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for developing strategies is demonstrated under modern conditions of high dynamism, uncertainty, and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified within the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of a company's capital is grounded. The economic nature of the capital of a company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners' prosperity, and to achieve a social effect.

  9. Environmentally-acceptable fossil energy site evaluation and selection: methodology and user's guide. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Northrop, G.M.

    1980-02-01

    This report is designed to facilitate assessments of environmental and socioeconomic impacts of fossil energy conversion facilities which might be implemented at potential sites. The discussion of methodology and the User's Guide contained herein are presented in a format that assumes the reader is not an energy technologist. Indeed, this methodology is meant for application by almost anyone with an interest in a potential fossil energy development - planners, citizen groups, government officials, and members of industry. It may also be of instructional value. The methodology is called: Site Evaluation for Energy Conversion Systems (SELECS) and is organized in three levels of increasing sophistication. Only the least complicated version - the Level 1 SELECS - is presented in this document. As stated above, it has been expressly designed to enable just about anyone to participate in evaluating the potential impacts of a proposed energy conversion facility. To accomplish this objective, the Level 1 calculations have been restricted to ones which can be performed by hand in about one working day. Data collection and report preparation may bring the total effort required for a first or one-time application to two to three weeks. If repeated applications are made in the same general region, the assembling of data for a different site or energy conversion technology will probably take much less time.

  10. An efficient methodology for the analysis of primary frequency control of electric power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D.P. [Nikola Tesla Institute, Belgrade (Yugoslavia); Mijailovic, S.V. [Electricity Coordinating Center, Belgrade (Yugoslavia)

    2000-06-01

    The paper presents an efficient methodology for the analysis of primary frequency control of electric power systems. This methodology continuously monitors the electromechanical transient processes with durations that last up to 30 s, occurring after the characteristic disturbances. It covers the period of short-term dynamic processes, appearing immediately after the disturbance, in which the dynamics of the individual synchronous machines is dominant, as well as the period with the uniform movement of all generators and restoration of their voltages. The characteristics of the developed methodology were determined based on the example of real electric power interconnection formed by the electric power systems of Yugoslavia, a part of Republic of Srpska, Romania, Bulgaria, former Yugoslav Republic of Macedonia, Greece and Albania (the second UCPTE synchronous zone). (author)

  11. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    International Nuclear Information System (INIS)

    Kaye, R.D.; Henriksen, K.; Jones, R.; Morisseau, D.S.; Serig, D.I.

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included

  12. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.

  13. Characterisation of radiotherapy planning volumes using textural analysis.

    Science.gov (United States)

    Nailon, William H; Redpath, Anthony T; McLaren, Duncan B

    2008-01-01

    Computer-based artificial intelligence methods for classification and delineation of the gross tumour volume (GTV) on computerised tomography (CT) and magnetic resonance (MR) images do not, at present, provide the accuracy required for radiotherapy applications. This paper describes an image analysis method for classification of distinct regions within the GTV, and other clinically relevant regions, on CT images acquired on eight bladder cancer patients at the radiotherapy planning stage and thereafter at regular intervals during treatment. Statistical and fractal textural features (N=27) were calculated on the bladder, rectum and a control region identified on axial, coronal and sagittal CT images. Unsupervised classification results demonstrate that with a reduced feature set (N=3) the approach offers significant classification accuracy on axial, coronal and sagittal CT image planes and has the potential to be developed further for radiotherapy applications, particularly towards an automatic outlining approach.
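
    Statistical texture features of the sort counted among the 27 used here are often derived from grey-level co-occurrence matrices, which scikit-image computes directly. The sketch below is illustrative only, assuming an arbitrary 8-bit image patch; it does not reproduce the paper's exact feature set, and the fractal features are omitted.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        # Hypothetical 8-bit CT patch centred on a region of interest.
        rng = np.random.default_rng(2)
        patch = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)

        # Co-occurrence matrix at distance 1 in four directions, then
        # classic Haralick-style statistics averaged over directions.
        glcm = graycomatrix(patch, distances=[1],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=256, symmetric=True, normed=True)

        features = {name: graycoprops(glcm, name).mean()
                    for name in ("contrast", "homogeneity", "energy", "correlation")}
        print(features)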

  14. Characterisation of radiotherapy planning volumes using textural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nailon, William H.; Redpath, Anthony T.; McLaren, Duncan B. (Dept. of Oncology Physics, Edinburgh Cancer Centre, Western General Hospital, Edinburgh (United Kingdom))

    2008-08-15

    Computer-based artificial intelligence methods for classification and delineation of the gross tumour volume (GTV) on computerised tomography (CT) and magnetic resonance (MR) images do not, at present, provide the accuracy required for radiotherapy applications. This paper describes an image analysis method for classification of distinct regions within the GTV, and other clinically relevant regions, on CT images acquired on eight bladder cancer patients at the radiotherapy planning stage and thereafter at regular intervals during treatment. Statistical and fractal textural features (N=27) were calculated on the bladder, rectum and a control region identified on axial, coronal and sagittal CT images. Unsupervised classification results demonstrate that with a reduced feature set (N=3) the approach offers significant classification accuracy on axial, coronal and sagittal CT image planes and has the potential to be developed further for radiotherapy applications, particularly towards an automatic outlining approach

  15. Stereological analysis of nuclear volume in recurrent meningiomas

    DEFF Research Database (Denmark)

    Madsen, C; Schrøder, H D

    1994-01-01

    A stereological estimation of nuclear volume in recurrent and non-recurrent meningiomas was made. The aim was to investigate whether this method could discriminate between these two groups. We found that the mean nuclear volumes in recurrent meningiomas were all larger at debut than in any...... nuclear volume in meningiomas might help identify a group at risk of recurrence....

  16. Methodology for Risk Analysis of Dam Gates and Associated Operating Equipment Using Fault Tree Analysis

    National Research Council Canada - National Science Library

    Patev, Robert C; Putcha, Chandra; Foltz, Stuart D

    2005-01-01

    .... This report summarizes research on methodologies to assist in quantifying risks related to dam gates and associated operating equipment, and how those risks relate to overall spillway failure risk...
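
    The core arithmetic of such a fault tree analysis is the propagation of basic-event probabilities through AND/OR gates; a compressed sketch follows, with events and probabilities invented for illustration rather than taken from the report.

        # Minimal fault tree evaluation assuming independent basic events.
        def and_gate(*p):            # all inputs must fail
            out = 1.0
            for x in p:
                out *= x
            return out

        def or_gate(*p):             # any single failure fails the gate
            out = 1.0
            for x in p:
                out *= (1.0 - x)
            return 1.0 - out

        # Hypothetical basic events for a gate-hoisting system.
        p_motor, p_backup_motor = 1e-3, 5e-3
        p_control, p_operator = 2e-4, 1e-3

        hoist_fails = and_gate(p_motor, p_backup_motor)     # redundant drives
        command_fails = or_gate(p_control, p_operator)
        gate_fails_to_open = or_gate(hoist_fails, command_fails)
        print(f"P(gate fails to open) = {gate_fails_to_open:.2e}")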

  17. Three-Dimensional Hysterosalpingo Contrast Sonography with Gel Foam: Methodology and Feasibility to Obtain 3-Dimensional Volumes of Tubal Shape.

    Science.gov (United States)

    Exacoustos, Caterina; Pizzo, Alessandra; Lazzeri, Lucia; Pietropolli, Adalgisa; Piccione, Emilio; Zupi, Errico

    To investigate the feasibility of hysterosalpingo foam sonography (HyFoSy) with automated 3-dimensional (3D) software in the evaluation of tubal patency and visualization of the tubal course by obtaining a 3D volume acquisition of tubes. Prospective observational study (Canadian Task Force classification III). University hospital. A total of 132 infertile women were evaluated between October 2013 and February 2015. All patients underwent HyFoSy with the new automated 3D coded contrast imaging (CCI) followed by 2-dimensional (2D) real-time HyFoSy. To evaluate the feasibility of 3D visualization of the tubal course, consecutive volume acquisitions were performed during gel foam contrast agent injection. Conventional 2D real-time hysterosalpingo contrast sonography (HyCoSy) by detection of gel foam moving through the tubes and around the ovaries was finally performed and considered to indicate the final results of tubal status. All the patients underwent 3D CCI HyFoSy, followed by 2D real-time HyFoSy. After both procedures, we observed 108 patients (81.8%) with bilateral tubal patency, 22 patients (16.6%) with unilateral tubal patency, and 2 patients (1.5%) with bilateral tubal occlusion. The concordance rates for tubal status between the first and second 3D volume acquisitions and the final 2D real-time evaluation were 84.8% and 97.0%, respectively. Transvaginal ultrasound HyFoSy with 3D volume reconstruction of the uterus and tubes is an accurate and safe technique that allows complete visualization of tubal shape and patency with high patient compliance. Copyright © 2017 AAGL. Published by Elsevier Inc. All rights reserved.

  18. Methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006.

    Science.gov (United States)

    Rodríguez-Ramírez, Sonia; Mundo-Rosas, Verónica; Jiménez-Aguilar, Alejandra; Shamah-Levy, Teresa

    2009-01-01

    To describe the methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006 (ENSANUT 2006) carried out in Mexico. Dietary data from the population who participated in the ENSANUT 2006 were collected through a 7-day food-frequency questionnaire. Energy and nutrient intake of each food consumed and adequacy percentage by day were also estimated. Intakes and adequacy percentages > 5 SDs from the energy and nutrient general distribution and observations with energy adequacy percentages < 25% were excluded from the analysis. Valid dietary data were obtained from 3552 children aged 1 to 4 years, 8716 children aged 5 to 11 years, 8442 adolescents, 15951 adults, and 3357 older adults. It is important to detail the methodology for the analysis of dietary data to standardize data cleaning criteria and to be able to compare the results of different studies.

  19. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    The article is devoted to the theoretical and methodological principles of the strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for developing strategies is demonstrated under modern conditions of high dynamism, uncertainty, and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified within the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of a company's capital is grounded. The economic nature of the capital of a company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners' prosperity, and to achieve a social effect.

  20. Methodological choices for research in Information Science: Contributions to domain analysis

    Directory of Open Access Journals (Sweden)

    Juliana Lazzarotto FREITAS

    The article focuses on ways of organizing studies according to their methodological choices in the Base Referencial de Artigos de Periódicos em Ciência da Informação (Reference Database of Journal Articles in Information Science). We highlight how organizing scientific production by methodological choices in Information Science contributes to the identification of its production features and to domain analysis. We studied research categories and proposed five classification criteria: research purposes, approaches, focus, techniques, and type of analysis. The proposal is empirically applied to a corpus in Information Science of 689 articles, 10% of the production indexed in the Base Referencial de Artigos de Periódicos em Ciência da Informação from 1972 to 2010. We adopt content analysis to interpret the methodological choices of the authors identified in the corpus. The results indicate that exploratory studies predominate with respect to research purpose; regarding the research approach, bibliographic and documentary studies predominate; systematic observation, questionnaires, and interviews were the most widely used techniques; document analysis and content analysis are the most widely used types of analysis; and theoretical, historical, and bibliometric foci predominate. We found that some studies combine two methodological choices and explicit epistemological approaches, such as the studies following the positivist approach in the 1970s and those influenced by the phenomenological approach in the 1980s, which increased the use of qualitative research methods.

  1. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.
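
    To make the notion of exposing an analysis tool as a RESTful service concrete, a minimal Flask endpoint is sketched below. It is a generic illustration, not code from the GEAS repository; the route, payload fields, and response are invented, and a real service would also carry the semantic annotations the paper describes.

        from flask import Flask, jsonify, request

        app = Flask(__name__)

        @app.route("/differential-expression", methods=["POST"])
        def differential_expression():
            # Hypothetical payload: expression matrix plus group labels.
            payload = request.get_json()
            matrix = payload["expression"]      # genes x samples
            groups = payload["groups"]          # e.g. ["case", "control", ...]
            # A real service would delegate to the wrapped analysis tool here.
            result = {"n_genes": len(matrix), "groups": sorted(set(groups))}
            return jsonify(result)

        if __name__ == "__main__":
            app.run(port=5000)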

  2. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela D A Guardia

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  3. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis

    Science.gov (United States)

    Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740

  4. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Science.gov (United States)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
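
    The quantitative half of such a method typically reduces to an expected annual damage computed by integrating damage over flood exceedance probability, evaluated before and after mitigation; that reading is an assumption here, and the figures below are invented rather than taken from the SUFRI case studies.

        import numpy as np

        # Hypothetical damage estimates (EUR) for floods of given return periods.
        return_periods = np.array([10, 50, 100, 500])        # years
        damage_before  = np.array([1e5, 2e6, 8e6, 3e7])
        damage_after   = np.array([2e4, 5e5, 3e6, 2e7])      # with measures

        p_exceed = 1.0 / return_periods                      # annual probabilities

        def expected_annual_damage(p, d):
            # Trapezoidal integration of damage over exceedance probability.
            order = np.argsort(p)
            return np.trapz(d[order], p[order])

        ead_before = expected_annual_damage(p_exceed, damage_before)
        ead_after = expected_annual_damage(p_exceed, damage_after)
        print(f"EAD before: {ead_before:.3e} EUR/yr, after: {ead_after:.3e} EUR/yr")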

  5. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams, the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility specific data but this data is not provided by this document.
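
    Crash frequency in this setting is commonly summarized by a four-factor expression of the form F = Σ N·P·f(x, y)·A over flight categories: annual operations, crash rate per operation, a site-specific crash-location probability density, and the facility's effective target area. The sketch below simply evaluates that sum; every number in it is an invented placeholder, and the standard's own data tables are the authoritative source.

        # Four-factor aircraft crash frequency, summed over flight categories.
        # (label, operations/yr, crash rate per op, f(x,y) per sq mile, A in sq miles)
        categories = [
            ("general aviation", 50_000, 1e-5, 2e-3, 0.01),
            ("commercial",       20_000, 5e-8, 1e-3, 0.01),
            ("military",          5_000, 2e-6, 4e-3, 0.01),
        ]

        frequency = sum(n * p * f_xy * area for _, n, p, f_xy, area in categories)
        print(f"Estimated crash frequency: {frequency:.2e} per year")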

  6. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Brownson, D. A.

    2002-01-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) Waste form/waste package degradation; (2) Waste package isotopic inventory; (3) Criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) Probability of criticality (for each potential critical configuration as well as total event); and (5) Criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information relative to the model report content and validation. The model reports and their revisions are being generated as a result of: (1) Commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) Open Items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002)

  7. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Directory of Open Access Journals (Sweden)

    I. Escuder-Bueno

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009–2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

  8. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams, the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility specific data but this data is not provided by this document

  9. Predicted costs of environmental controls for a commercial oil shale industry. Volume 1. An engineering analysis

    Energy Technology Data Exchange (ETDEWEB)

    Nevens, T.D.; Culbertson, W.J. Jr.; Wallace, J.R.; Taylor, G.C.; Jovanovich, A.P.; Prien, C.H.; Hicks, R.E.; Probstein, R.F.; Domahidy, G.

    1979-07-01

    The pollution control costs for a commercial oil shale industry were determined in a joint effort by Denver Research Institute, Water Purification Associates of Cambridge, and Stone and Webster Engineering of Boston and Denver. Four commercial oil shale processes were considered. The results in terms of cost per barrel of syncrude oil are predicted to be as follows: Paraho Process, $0.67 to $1.01; TOSCO II Process, $1.43 to $1.91; MIS Process, $2.02 to $3.03; and MIS/Lurgi-Ruhrgas Process, $1.68 to $2.43. Alternative pollution control equipment and integrated pollution control strategies were considered and optimal systems selected for each full-scale plant. A detailed inventory of equipment (along with the rationale for selection), a detailed description of control strategies, itemized costs and predicted emission levels are presented for each process. Capital and operating cost data are converted to a cost per barrel basis using detailed economic evaluation procedures. Ranges of cost are determined using a subjective self-assessment of uncertainty approach. An accepted methodology for probability encoding was used, and cost ranges are presented as subjective probability distributions. Volume I presents the detailed engineering results. Volume II presents the detailed analysis of uncertainty in the predicted costs.

  10. A retrospective analysis of complications of large volume liposuction; local perspective from a third world country

    International Nuclear Information System (INIS)

    Arshad, S.M.; Latif, S.; Altaf, H.N.

    2017-01-01

    This study aimed to evaluate the complications that occurred in patients undergoing large volume liposuction and to determine whether there was a correlation between the amount of aspirate and the rate of complications. Methodology: A detailed history, complete physical examination, BMI, and anthropometric measurements were documented for all patients. All patients underwent liposuction using the tumescent technique under general anesthesia in Yusra General Hospital. Patients were discharged home after 24 to 48 hours. Pressure garments were advised for 6 weeks, and patients were called for weekly follow-up over the same period. Complications were documented. SPSS version 20 was used for analysis of data. Results: Out of 217 patients, 163 (75%) were female and 54 male. Mean age was 37.1 ± 6.7 years. Bruising and seroma were the most common complications (4.1% and 2.3%, respectively). The incidence of infection was 0.9%. One patient had over-correction and four patients (1.8%) had under-correction. Significant blood loss was encountered in one patient. Two patients (0.9%) had pulmonary embolism and 2 (0.9%) suffered from necrotizing fasciitis. None of our patients undergoing large volume liposuction had fat embolism and there was no mortality. Conclusion: Careful patient selection and strict adherence to guidelines can ensure a good outcome and can minimize risk of complications. Both physicians and patients should be educated to have realistic expectations to avoid complications and improve patient safety. (author)

  11. Methodological Choices in Muscle Synergy Analysis Impact Differentiation of Physiological Characteristics Following Stroke

    Directory of Open Access Journals (Sweden)

    Caitlin L. Banks

    2017-08-01

    Muscle synergy analysis (MSA) is a mathematical technique that reduces the dimensionality of electromyographic (EMG) data. Used increasingly in biomechanics research, MSA requires methodological choices at each stage of the analysis. Differences in methodological steps affect the overall outcome, making it difficult to compare results across studies. We applied MSA to EMG data collected from individuals post-stroke identified as either responders (RES) or non-responders (nRES) on the basis of a critical post-treatment increase in walking speed. Importantly, no clinical or functional indicators identified differences between the cohort of RES and nRES at baseline. For this exploratory study, we selected the five highest RES and five lowest nRES available from a larger sample. Our goal was to assess how the methodological choices made before, during, and after MSA affect the ability to differentiate two groups with intrinsic physiologic differences based on MSA results. We investigated 30 variations in MSA methodology to determine which choices allowed differentiation of RES from nRES at baseline. Trial-to-trial variability in time-independent synergy vectors (SVs) and time-varying neural commands (NCs) was measured as a function of: (1) the number of synergies computed; (2) the EMG normalization method before MSA; (3) whether SVs were held constant across trials or allowed to vary during MSA; and (4) the synergy analysis output normalization method after MSA. MSA methodology had a strong effect on our ability to differentiate RES from nRES at baseline. Across all 10 individuals and MSA variations, two synergies were needed to reach an average of 90% variance accounted for (VAF). Based on effect sizes, differences in SV and NC variability between groups were greatest using two synergies with SVs that varied from trial to trial. Differences in SV variability were clearest using unit magnitude per trial EMG normalization, while NC variability was less sensitive to EMG
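
    Although the record is cut off above, the computation underlying most muscle synergy analyses is a non-negative matrix factorization of the EMG envelope matrix into synergy vectors and neural commands, accepted once variance accounted for (VAF) exceeds a threshold such as the 90% cited in the abstract. A minimal scikit-learn version is sketched below with placeholder data and preprocessing; it is not the authors' pipeline.

        import numpy as np
        from sklearn.decomposition import NMF

        # Hypothetical EMG envelopes: 8 muscles x 500 time samples, non-negative.
        rng = np.random.default_rng(3)
        emg = np.abs(rng.normal(size=(8, 500)))

        def extract_synergies(emg, n_synergies):
            model = NMF(n_components=n_synergies, init="nndsvda", max_iter=1000)
            w = model.fit_transform(emg)      # synergy vectors (muscles x k)
            h = model.components_             # neural commands (k x time)
            vaf = 1.0 - np.sum((emg - w @ h) ** 2) / np.sum(emg ** 2)
            return w, h, vaf

        # Add synergies until variance accounted for reaches 90%.
        for k in range(1, emg.shape[0] + 1):
            w, h, vaf = extract_synergies(emg, k)
            if vaf >= 0.90:
                print(f"{k} synergies, VAF = {vaf:.3f}")
                break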

  12. Reporting and methodological quality of survival analysis in articles published in Chinese oncology journals.

    Science.gov (United States)

    Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying

    2017-12-01

    Survival analysis methods have gained widespread use in the field of oncology. For achievement of reliable results, the methodological process and reporting quality are crucial. This review provides the first examination of the methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. The aims were to examine the methodological and reporting quality of survival analysis, to identify common deficiencies, to suggest desirable precautions in the analysis, and to offer advice for authors, readers, and editors. A total of 242 survival analysis articles were included for evaluation from 1492 articles published in 4 leading Chinese oncology journals in 2013. Articles were evaluated according to 16 established items for proper use and reporting of survival analysis. The application rates of Kaplan-Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used parametric methods for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 articles were under 80% (the lowest was 75.25%) and 55 articles were 100%. The report rates of all types of survival endpoint were lower than 10%. Eleven of the 100 articles that reported a loss to follow-up stated how it was treated in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks for the proportional hazards assumption; no report of testing for interactions and collinearity between independent variables; and no report of the sample size calculation method. Thirty-six articles (32.74%) reported the method of independent variable selection. The above defects could make potentially inaccurate
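
    Several of the deficiencies catalogued here, unreported proportional hazards checks in particular, are straightforward to address in standard software. A minimal Cox fit with a Schoenfeld-residual assumption check using the Python lifelines package is sketched below; the column names and values are placeholders, not data from the surveyed articles.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Hypothetical survival data: time in months, event indicator, covariates.
        df = pd.DataFrame({
            "time":  [5, 12, 20, 9, 30, 14, 25, 7],
            "event": [1, 1, 0, 1, 0, 1, 0, 1],
            "age":   [60, 55, 48, 70, 52, 65, 58, 62],
            "stage": [2, 1, 1, 3, 1, 2, 2, 3],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="time", event_col="event")
        cph.print_summary()

        # The check many surveyed articles omitted: a test of the
        # proportional hazards assumption based on Schoenfeld residuals.
        cph.check_assumptions(df, p_value_threshold=0.05)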

  13. Two-dimensional transient thermal analysis of a fuel rod by finite volume method

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Rhayanne Yalle Negreiros; Silva, Mário Augusto Bezerra da; Lira, Carlos Alberto de Oliveira, E-mail: ryncosta@gmail.com, E-mail: mabs500@gmail.com, E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamento de Energia Nuclear

    2017-07-01

    One of the greatest concerns when studying a nuclear reactor is ensuring that temperatures remain within safe limits throughout the system at all times. The preservation of the core structure, along with the confinement of radioactive material within a controlled system, is the main focus during the operation of a reactor. The purpose of this paper is to present the temperature distribution for a nominal channel of the AP1000 reactor, developed by Westinghouse Co., during steady-state and transient operations. In the analysis, the system was subjected to normal operating conditions and then to blockages of the coolant flow. The time necessary to achieve a new safe stationary state (when possible) is presented. The methodology applied in this analysis was based on a two-dimensional survey performed with the Finite Volume Method (FVM). A steady solution is obtained and compared with an analytical solution that disregards axial heat transport to determine its relevance. The results show the importance of considering axial heat transport in this type of study. A transient analysis shows the behavior of the system when subjected to coolant blockage at the channel's entrance. Three blockages were simulated (10%, 20%, and 30%), and the results show that, for a nominal channel, the system can still be considered safe (there is no bubble formation up to that point). (author)
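
    A one-dimensional analogue of the scheme described here fits in a few lines: each control volume is updated explicitly from the conductive fluxes across its faces plus a volumetric source. This is a didactic sketch with made-up properties and boundary treatment, not the authors' two-dimensional fuel-rod model.

        import numpy as np

        # 1-D explicit finite volume update for transient heat conduction.
        n, length = 50, 0.004                # control volumes, pellet radius [m]
        dx = length / n
        k, rho, cp = 3.0, 10_000.0, 300.0    # W/m-K, kg/m3, J/kg-K (placeholders)
        q = 3e8                              # volumetric heat source [W/m3]
        alpha = k / (rho * cp)
        dt = 0.4 * dx ** 2 / alpha           # within the explicit stability limit
        T = np.full(n, 600.0)                # initial temperature [K]
        T_cool = 560.0                       # coolant-side temperature [K]

        for _ in range(20_000):
            flux = -k * np.diff(T) / dx                       # internal face fluxes
            dTdt = np.full(n, q / (rho * cp))                 # heat source term
            dTdt[1:] += flux / (rho * cp * dx)                # inflow from the left
            dTdt[:-1] -= flux / (rho * cp * dx)               # outflow to the right
            dTdt[-1] -= k * (T[-1] - T_cool) / dx / (rho * cp * dx)  # cooled face
            T += dt * dTdt

        print(f"centreline {T[0]:.1f} K, surface {T[-1]:.1f} K")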

  14. Pertinent anatomy and analysis for midface volumizing procedures.

    Science.gov (United States)

    Surek, Christopher C; Beut, Javier; Stephens, Robert; Jelks, Glenn; Lamb, Jerome

    2015-05-01

    The study was conducted to construct an anatomically inspired midfacial analysis facilitating safe, accurate, and dynamic nonsurgical rejuvenation. Emphasis is placed on determining injection target areas and adverse event zones. Twelve hemifacial fresh cadavers were dissected in a layered fashion. Dimensional measurements between the midfacial fat compartments, prezygomatic space, mimetic muscles, and neurovascular bundles were used to develop a topographic analysis for clinical injections. A longitudinal line from the base of the alar crease to the medial edge of the levator anguli oris muscle (1.9 cm), lateral edge of the levator anguli oris muscle (2.6 cm), and zygomaticus major muscle (4.6 cm) partitions the cheek into two aesthetic regions. A six-step facial analysis outlines three target zones and two adverse event zones and triangulates the point of maximum cheek projection. The lower adverse event zone yields an anatomical explanation to inadvertent jowling during anterior cheek injection. The upper adverse event zone localizes the palpebral branch of the infraorbital artery. The medial malar target area isolates quadrants for anterior cheek projection and tear trough effacement. The middle malar target area addresses lid-cheek blending and superficial compartment turgor. The lateral malar target area highlights lateral cheek projection and locates the prezygomatic space. This stepwise analysis illustrates target areas and adverse event zones to achieve midfacial support, contour, and profile in the repose position and simultaneous molding of a natural shape during animation. This reproducible method can be used both procedurally and in record-keeping for midface volumizing procedures.

  15. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    International Nuclear Information System (INIS)

    Shen Zupei; Li Xiaodong; Huang Xiangrui

    2003-01-01

    The author applies the GO methodology to the reliability analysis of the offsite power supply system of the Daya Bay NPP. Direct quantitative calculation formulas for the steady-state reliability targets of a system with shared signals, and dynamic calculation formulas for the state probability of a two-state unit, are derived. A method for solving the fault event sets of the system is also presented, and all fault event sets of the offsite power supply system, together with their failure probabilities, are obtained. The recovery reliability of the offsite power supply system after a stability failure of the power grid is also calculated. The results show that the GO methodology is simple and effective for the steady-state and dynamic reliability analysis of repairable systems
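
    For a two-state repairable unit of the kind mentioned above, the dynamic state probability has a standard closed form from Markov analysis. The sketch below evaluates it for assumed failure and repair rates; it illustrates the type of formula derived, not the paper's specific coefficients.

    ```python
    # Dynamic availability of a repairable two-state unit (standard Markov result):
    # P_up(t) = mu/(lam+mu) + (lam/(lam+mu)) * exp(-(lam+mu) t), starting from "up".
    import numpy as np

    lam, mu = 1e-3, 1e-1          # failure and repair rates [1/h] (assumed values)
    t = np.linspace(0.0, 100.0, 11)

    A_inf = mu / (lam + mu)                        # steady-state availability
    P_up = A_inf + (lam / (lam + mu)) * np.exp(-(lam + mu) * t)

    for ti, p in zip(t, P_up):
        print(f"t = {ti:6.1f} h  P_up = {p:.6f}")
    print("steady-state availability:", A_inf)
    ```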

  16. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    International Nuclear Information System (INIS)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S.

    2015-01-01

    The aim of this work is to present a calculation methodology for the Angra 2 reactor containment during accidents of the Loss of Coolant Accident (LOCA) type. This study helps to ensure the safety of the surrounding population in the event of such accidents. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool used for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system of a light-water reactor during an accident. The containment of a PWR plant is a concrete building internally lined with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  17. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S., E-mail: dayane.silva@usp.br, E-mail: gdjian@ipen.br, E-mail: aclima@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The aim of this work is to present a calculation methodology for the Angra 2 reactor containment during accidents of the Loss of Coolant Accident (LOCA) type. This study helps to ensure the safety of the surrounding population in the event of such accidents. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool used for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system of a light-water reactor during an accident. The containment of a PWR plant is a concrete building internally lined with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  18. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology, I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting business requirements and following the business strategy.

  19. Effects of methodology and analysis strategy on robustness of pestivirus phylogeny.

    Science.gov (United States)

    Liu, Lihong; Xia, Hongyan; Baule, Claudia; Belák, Sándor; Wahlberg, Niklas

    2010-01-01

    Phylogenetic analysis of pestiviruses is a useful tool for classifying novel pestiviruses and for revealing their phylogenetic relationships. In this study, the robustness of pestivirus phylogenies was compared through analyses of the 5'UTR and of the complete N(pro) and E2 gene regions, separately and combined, performed with four methods: neighbour-joining (NJ), maximum parsimony (MP), maximum likelihood (ML), and Bayesian inference (BI). The strategy of analysing the combined sequence dataset with the BI, ML, and MP methods resulted in a single, well-supported tree topology, indicating a reliable and robust pestivirus phylogeny. By contrast, the single-gene analysis strategy resulted in 12 trees of different topologies, revealing different relationships among pestiviruses. These results indicate that the analysis strategy and the methodology are two vital aspects affecting the robustness of the pestivirus phylogeny. The strategies and methodologies outlined in this paper may have broader application in inferring the phylogeny of other RNA viruses.
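
    As a concrete illustration of one of the four methods compared (neighbour-joining), the sketch below builds an NJ tree from a toy alignment with Biopython; the four short sequences are placeholders, not pestivirus data.

    ```python
    # Minimal neighbour-joining sketch with Biopython; toy sequences only.
    from Bio.Align import MultipleSeqAlignment
    from Bio.Seq import Seq
    from Bio.SeqRecord import SeqRecord
    from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor
    from Bio import Phylo

    aln = MultipleSeqAlignment([
        SeqRecord(Seq("ACGTACGTAC"), id="virus_A"),
        SeqRecord(Seq("ACGTACGTCC"), id="virus_B"),
        SeqRecord(Seq("ACGAACGTCC"), id="virus_C"),
        SeqRecord(Seq("TCGAACGACC"), id="virus_D"),
    ])

    dm = DistanceCalculator("identity").get_distance(aln)  # pairwise distances
    tree = DistanceTreeConstructor().nj(dm)                # neighbour-joining tree
    Phylo.draw_ascii(tree)
    ```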

  20. Methodological Principles of Assessing the Volume of Investment Influx from Non-State Pension Funds into the Economy of Ukraine

    Directory of Open Access Journals (Sweden)

    Dmitro Leonov

    2004-11-01

    This article addresses the processes of forming investment resources from non-state pension funds under current conditions in Ukraine, and the laws and regulations that define the principles of the formation of investment institutions. Based on factors that in the nearest future will affect the decision-making processes by which different kinds of investors make payments to non-state pension funds, we develop a procedure for assessing the volume of investment influx from non-state pension funds into the economy, and propose a procedure for the long- and short-term prognosis of the volume of investment influx from non-state pension funds into the Ukrainian economy.

  1. Analysis of Interbrand, BrandZ and BAV brand valuation methodologies

    Directory of Open Access Journals (Sweden)

    Krstić Bojan

    2011-01-01

    Brand valuation is considered one of the most significant challenges not only for the theory and practice of contemporary marketing, but for other disciplines as well. The complex nature of this issue implies the need for a multidisciplinary approach and for the creation of a methodology that goes beyond the borders of marketing as a discipline and includes knowledge derived from accounting, finance and other areas. However, mostly one-sided approaches, oriented towards determining brand value either from research on consumer behavior and attitudes or from the financial success of the brand, dominate the marketing and financial literature. In parallel with these theoretical methodologies, consultancy and marketing agencies and other actors have been developing their own brand valuation methods and models. Some of these can be assigned to the comprehensive approach to brand valuation, which overcomes the problem of one-sided analysis of brand value. The comprehensive approach presumes brand valuation based on the benefits which the brand provides both to customers and to the enterprise that owns it; in other words, it is based on qualitative and quantitative measures reflecting, respectively, the behavior and attitudes of consumers and the assumed financial value of the brand, or, more precisely, the capitalization of brand value. According to the defined research subject, this paper is structured as follows: the importance and the problem of brand value are reviewed in the Introduction, and the three best-known brand valuation methodologies developed by consultancy agencies - the Interbrand methodology and the BrandZ and BAV models - are analyzed in the next section. In the subsequent considerations, the results of a comparative analysis of these methodologies are presented and implications for adequate brand valuation are suggested.

  2. Tidal volume and mortality in mechanically ventilated children: a systematic review and meta-analysis of observational studies*.

    Science.gov (United States)

    de Jager, Pauline; Burgerhof, Johannes G M; van Heerde, Marc; Albers, Marcel J I J; Markhorst, Dick G; Kneyber, Martin C J

    2014-12-01

    To determine whether tidal volume is associated with mortality in critically ill, mechanically ventilated children. MEDLINE, EMBASE, and CINAHL databases were searched from inception until July 2013, together with the bibliographies of included studies, without language restrictions. Randomized clinical trials and observational studies reporting mortality in mechanically ventilated PICU patients were eligible. Two authors independently selected studies and extracted data on study methodology, quality, and patient outcomes. Meta-analyses were performed using the Mantel-Haenszel random-effects model. Heterogeneity was quantified using I². Study quality was assessed using the Newcastle-Ottawa Scale for cohort studies. Out of 142 citations, seven studies met the inclusion criteria, and two additional articles were identified from the references of those studies; one was subsequently excluded. The eight included studies comprised 1,756 patients. Mortality rates ranged from 13% to 42%. There was no association between tidal volume and mortality when tidal volume was dichotomized at 7, 8, 10, or 12 mL/kg. Comparing patients ventilated with tidal volume less than 7 mL/kg versus greater than 10 mL/kg or greater than 12 mL/kg, and tidal volume less than 8 mL/kg versus greater than 10 mL/kg or greater than 12 mL/kg, also showed no association between tidal volume and mortality. Limiting the analysis to patients with acute lung injury/acute respiratory distress syndrome did not change these results. Heterogeneity was observed in all pooled analyses. A relationship between tidal volume and mortality in mechanically ventilated children could not be identified, irrespective of the severity of disease. The significant heterogeneity observed in the pooled analyses necessitates future studies in well-defined patient populations to understand the effects of tidal volume on patient outcome.
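
    The random-effects pooling step described above can be sketched numerically. The example below pools log risk ratios with the DerSimonian-Laird estimator and reports I²; the event counts for the three mock studies are invented, and this inverse-variance sketch differs slightly in weighting from the Mantel-Haenszel approach used in the review.

    ```python
    # DerSimonian-Laird random-effects pooling of risk ratios (mock data).
    import numpy as np

    # (events, total) in high- vs low-tidal-volume groups for three mock studies
    high = np.array([[20, 100], [15, 80], [30, 120]])
    low  = np.array([[18, 110], [12, 75], [25, 130]])

    rr = (high[:, 0] / high[:, 1]) / (low[:, 0] / low[:, 1])
    y  = np.log(rr)                                              # log risk ratios
    v  = (1/high[:, 0] - 1/high[:, 1]) + (1/low[:, 0] - 1/low[:, 1])  # variances

    w_f = 1 / v                                          # fixed-effect weights
    q   = np.sum(w_f * (y - np.sum(w_f * y) / w_f.sum()) ** 2)   # Cochran's Q
    df  = len(y) - 1
    tau2 = max(0.0, (q - df) / (w_f.sum() - np.sum(w_f**2) / w_f.sum()))

    w_r = 1 / (v + tau2)                                 # random-effects weights
    pooled = np.sum(w_r * y) / w_r.sum()
    se = np.sqrt(1 / w_r.sum())
    i2 = max(0.0, (q - df) / q) * 100                    # I^2 heterogeneity
    print(f"pooled RR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(pooled - 1.96*se):.2f}-{np.exp(pooled + 1.96*se):.2f}), "
          f"I^2 = {i2:.0f}%")
    ```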

  3. Methodology of demand forecast by market analysis of electric power and load curves

    International Nuclear Information System (INIS)

    Barreiro, C.J.; Atmann, J.L.

    1989-01-01

    A methodology for the demand forecasting of consumer classes and for the aggregation of these forecasts is presented. The market actually served can be analyzed through appropriate measurements and load curve studies. Assumptions about future market behaviour by consumer class (industrial, residential, commercial, others) are presented, and actions to optimize this market, obtained through load curve modulation, are foreseen. Future demand is determined by the appropriate aggregation of these segmented demands. (C.G.C.)
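
    A minimal numeric sketch of the aggregation step: per-class daily load curves are scaled by assumed class peaks and growth factors and summed into a system forecast. All shapes and numbers below are illustrative assumptions.

    ```python
    # Aggregating segmented class demands into a system demand forecast.
    import numpy as np

    hours = np.arange(24)
    curves = {  # normalized daily load shapes per consumer class (assumed)
        "industrial":  np.where((hours >= 6) & (hours <= 18), 1.0, 0.6),
        "residential": 0.5 + 0.5 * np.exp(-((hours - 20) ** 2) / 8.0),
        "commercial":  np.where((hours >= 9) & (hours <= 21), 1.0, 0.3),
    }
    peak_mw = {"industrial": 400.0, "residential": 300.0, "commercial": 200.0}
    growth  = {"industrial": 1.03, "residential": 1.05, "commercial": 1.04}  # yearly

    total = sum(peak_mw[c] * growth[c] * curves[c] for c in curves)
    print(f"forecast system peak: {total.max():.0f} MW at hour {total.argmax()}")
    ```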

  4. An alternative methodology for the analysis of electrical resistivity data from a soil gas study

    OpenAIRE

    Johansson, Sara; Rosqvist, Hakan; Svensson, Mats; Dahlin, Torleif; Leroux, Virginie

    2011-01-01

    The aim of this paper is to present an alternative method for the analysis of resistivity data. The methodology was developed during a study to evaluate if electrical resistivity can be used as a tool for analysing subsurface gas dynamics and gas emissions from landfills. The main assumption of this study was that variations in time of resistivity data correspond to variations in the relative amount of gas and water in the soil pores. Field measurements of electrical resistivity, static chamb...

  5. Environmental Analysis of Springs in Urban Areas–A Methodological Proposal

    OpenAIRE

    Milton Pavezzi Netto; Gustavo D'Almeida Scarpinella; Ricardo Siloto da Silva

    2013-01-01

    The springs located in urban areas are the outpouring of surface water, which can serve as water supply, effluent receptors and important local macro-drainage elements. With unplanned occupation, non-compliance with environmental legislation and the importance of these water bodies, it is vital to analyze the springs within urban areas, considering the Brazilian forest code. This paper submits an analysis and discussion methodology proposal of environmental compliance fun...

  6. Methodological Approach to Company Cash Flows Target-Oriented Forecasting Based on Financial Position Analysis

    OpenAIRE

    Sergey Krylov

    2012-01-01

    The article treats a new methodological approach to the company cash flows target-oriented forecasting based on its financial position analysis. The approach is featured to be universal and presumes application of the following techniques developed by the author: financial ratio values correction techniques and correcting cash flows techniques. The financial ratio values correction technique assumes to analyze and forecast company financial position while the correcting cash flows technique i...

  7. Unique Approach to Threat Analysis Mapping: A Malware Centric Methodology for Better Understanding the Adversary Landscape

    Science.gov (United States)

    2016-04-05

    [Abstract damaged in extraction; only fragments survive.] Title: A Unique Approach to Threat Analysis Mapping: A Malware-Centric Methodology for Better Understanding the Adversary Landscape (Deana Shick, Kyle...). The recoverable fragments mention a vulnerability that allows attackers to execute arbitrary code via unspecified vectors [Mitre 2016]; that the wide landscape and usage of Adobe Flash Player made this...; and a use-after-free vulnerability in Microsoft Internet Explorer affecting versions 9 and 10 [Mitre 2016]. The attack landscape of these vulnerabilities was

  8. An analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kwon, Young Min; Kim, Taek Mo; Chung, Hae Yong; Lee, Sang Jong

    1996-07-01

    An analysis methodology for hot leg break mass and energy release is developed. For the blowdown period, a modified CEFLASH-4A analysis is suggested. For the post-blowdown period, a new computer model named COMET is developed. Unlike the previous post-blowdown analysis model FLOOD3, COMET is capable of analyzing both cold leg and hot leg break cases. The cold leg break model is essentially the same as that of FLOOD3, with some improvements. The results of the newly proposed hot leg break model in COMET follow the same trends as those observed in a scaled-down integral experiment, and the analysis results for UCN 3 and 4 by COMET are qualitatively and quantitatively in good agreement with those predicted by best-estimate analysis using RELAP5/MOD3. Therefore, the COMET code is validated and can be used for licensing analysis. 6 tabs., 82 figs., 9 refs. (Author)

  9. Tourism Methodologies - New Perspectives, Practices and Procedures

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different ways: how do we collect data in interview and fieldwork situations, how do we engage with the performative aspects of tourism as a field of study, and how do we handle codings and analysis, or tap into the global network of social media? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore.

  10. Efficacy of bronchoscopic lung volume reduction: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Iftikhar IH

    2014-05-01

    Imran H Iftikhar,1 Franklin R McGuire,1 Ali I Musani2; 1Department of Medicine, Division of Pulmonary, Critical Care and Sleep Medicine, University of South Carolina, Columbia, SC, USA; 2Department of Medicine, Division of Pulmonary, Critical Care and Sleep Medicine, National Jewish Health, Denver, CO, USA. Background: Over the last several years, the morbidity, mortality, and high costs associated with lung volume reduction (LVR) surgery have fuelled the development of different methods for bronchoscopic LVR (BLVR) in patients with emphysema. In this meta-analysis, we sought to study and compare the efficacy of most of these methods. Methods: Eligible studies were retrieved from PubMed and Embase for the following BLVR methods: one-way valves, sealants (BioLVR), LVR coils, airway bypass stents, and bronchial thermal vapor ablation. Primary study outcomes included the mean change post-intervention in the lung function tests, the 6-minute walk distance, and the St George's Respiratory Questionnaire. Secondary outcomes included treatment-related complications. Results: Except for the airway bypass stents, all other methods of BLVR showed efficacy in primary outcomes. However, in comparison, the BioLVR method showed the most significant findings and was the least associated with major treatment-related complications. For the BioLVR method, the mean change in forced expiratory volume in one second was 0.18 L (95% confidence interval [CI]: 0.09 to 0.26; P<0.001); in 6-minute walk distance, 23.98 m (95% CI: 12.08 to 35.88; P<0.01); and in St George's Respiratory Questionnaire, −8.88 points (95% CI: −12.12 to −5.64; P<0.001). Conclusion: The preliminary findings of our meta-analysis signify the importance of most methods of BLVR. The magnitude of the effect on selected primary outcomes shows noninferiority, if not equivalence, when compared to what is known for surgical LVR. Keywords: emphysema, endobronchial valves, sealants, stents, coils

  11. Methodology for the analysis of pollutant emissions from a city bus

    International Nuclear Information System (INIS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-01-01

    In this work, a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during typical operation under urban driving conditions. A passenger transportation line in a Spanish city was used as the test circuit. Different ways of processing and representing the data were studied and, from this work, a new approach is proposed. The methodology was useful for detecting the most important uncertainties arising during the registration and processing of data from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz data recording. The proposed methodology allows the comparison of results (as mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is shown to be a robust and helpful tool for isolating the effect of the main vehicle parameters (relative fuel–air ratio and velocity) on pollutant emissions. Acceleration sequences were shown to have the highest contribution to total emissions, whereas deceleration sequences had the lowest. (paper)

  12. Methodology for the analysis of pollutant emissions from a city bus

    Science.gov (United States)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work, a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during typical operation under urban driving conditions. A passenger transportation line in a Spanish city was used as the test circuit. Different ways of processing and representing the data were studied and, from this work, a new approach is proposed. The methodology was useful for detecting the most important uncertainties arising during the registration and processing of data from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz data recording. The proposed methodology allows the comparison of results (as mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is shown to be a robust and helpful tool for isolating the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. Acceleration sequences were shown to have the highest contribution to total emissions, whereas deceleration sequences had the lowest.
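
    The analysis by categories described in these two records can be sketched as a groupby over sequence-tagged records; the rows below are invented for illustration, not OBS-1300 or particle spectrometer measurements.

    ```python
    # Mean emissions per driving-sequence category (mock data).
    import pandas as pd

    df = pd.DataFrame({
        "sequence": ["acceleration", "cruise", "deceleration", "acceleration",
                     "idle", "cruise", "deceleration", "acceleration"],
        "nox_mg_s": [12.0, 6.5, 1.2, 14.5, 2.0, 7.1, 0.9, 13.2],      # assumed
        "fuel_air_ratio": [0.85, 0.55, 0.20, 0.90, 0.25, 0.60, 0.18, 0.88],
    })

    by_cat = df.groupby("sequence").agg(
        mean_nox=("nox_mg_s", "mean"),
        mean_phi=("fuel_air_ratio", "mean"),
        samples=("nox_mg_s", "size"),
    )
    print(by_cat.sort_values("mean_nox", ascending=False))
    ```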

  13. Network meta-analysis-highly attractive but more methodological research is needed

    Directory of Open Access Journals (Sweden)

    Singh Sonal

    2011-06-01

    Network meta-analysis, in the context of a systematic review, is a meta-analysis in which multiple treatments (that is, three or more) are compared using both direct comparisons of interventions within randomized controlled trials and indirect comparisons across trials based on a common comparator. To ensure the validity of findings from network meta-analyses, the systematic review must be designed rigorously and conducted carefully. Aspects of designing and conducting a systematic review for network meta-analysis include defining the review question, specifying eligibility criteria, searching for and selecting studies, assessing risk of bias and quality of evidence, conducting the network meta-analysis, and interpreting and reporting findings. This commentary summarizes the methodological challenges and research opportunities for network meta-analysis relevant to each aspect of the systematic review process, based on discussions at a network meta-analysis methodology meeting we hosted in May 2010 at the Johns Hopkins Bloomberg School of Public Health. Since this commentary reflects the discussion at that meeting, it is not intended to provide an overview of the field.

  14. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    International Nuclear Information System (INIS)

    Reimund, Kevin K.

    2015-01-01

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. The method uses equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and riverwater. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1+√w⁻¹), which is lower than the maximum power density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum of 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at the "maximum power density operating pressure" requires at least 2.9 kmol. Using this methodology, it is possible to examine the effects of volumetric solution cost, the operation of a module at various pressures, and the operation of a constant-pressure module with various feeds.

  15. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    Energy Technology Data Exchange (ETDEWEB)

    Reimund, Kevin K. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; McCutcheon, Jeffrey R. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; Wilson, Aaron D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-08-01

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. The method uses equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and riverwater. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1+√w⁻¹), which is lower than the maximum power density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum of 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at the “maximum power density operating pressure” requires at least 2.9 kmol. Using this methodology, it is possible to examine the effects of volumetric solution cost, the operation of a module at various pressures, and the operation of a constant-pressure module with various feeds.
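
    The solute-requirement figures quoted in both records can be checked with a short calculation, under the ideal van 't Hoff assumption that one mole of solute can release at most about RT of mixing energy; the 298 K temperature is an assumption.

    ```python
    # Quick check of the 1.45 kmol/kWh and 2.9 kmol/kWh figures quoted above.
    R, T = 8.314, 298.15          # gas constant [J/(mol.K)], temperature [K]
    kwh = 3.6e6                   # joules in one kilowatt-hour

    n_min = kwh / (R * T)         # minimum ideal solute per kWh (moles)
    print(f"minimum solute: {n_min/1000:.2f} kmol/kWh")          # ~1.45 kmol

    # Operating at the maximum power density pressure (delta_pi / 2) captures
    # only half of the available mixing energy, doubling the solute requirement.
    print(f"at max power density: {2*n_min/1000:.2f} kmol/kWh")  # ~2.9 kmol
    ```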

  16. Analysis of individual tree volume equations for Cupressus ...

    African Journals Online (AJOL)

    Three different volume equations were fitted to individual tree volume (V) data collected on 260 Cupressus lusitanica trees from 49 plantations in Munessa Shashemene Forest, Ethiopia. The data were first split randomly into equation development and equation testing data sets of equal size. Diameter at breast height (D) ...

  17. A Genetic Analysis of Brain Volumes and IQ in Children

    Science.gov (United States)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler Intelligence Scale for Children-III. Phenotypic…

  18. A genetic analysis of brain volumes and IQ in children

    NARCIS (Netherlands)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stéphanie Martine; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler

  19. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports required by CNEN (Brazilian Nuclear Energy Commission) and of the Risk Analysis Studies required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique, or of a combination of techniques, depends on many factors, such as the motivation for the analysis, the available data, the complexity of the process being analyzed, the expertise available in hazard analysis, and the initial perception of the risks involved. This paper presents a systematic methodology to select the most suitable set of tools for conducting the hazard analysis, taking these factors into account. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  20. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports required by CNEN (Brazilian Nuclear Energy Commission) and of the Risk Analysis Studies required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique, or of a combination of techniques, depends on many factors, such as the motivation for the analysis, the available data, the complexity of the process being analyzed, the expertise available in hazard analysis, and the initial perception of the risks involved. This paper presents a systematic methodology to select the most suitable set of tools for conducting the hazard analysis, taking these factors into account. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  1. Yucca Mountain transportation routes: Preliminary characterization and risk analysis; Volume 2, Figures [and] Volume 3, Technical Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R. [Nevada Univ., Las Vegas, NV (United States). Transportation Research Center

    1991-05-31

    This report presents appendices related to the preliminary assessment and risk analysis for high-level radioactive waste transportation routes to the proposed Yucca Mountain Project repository. Information includes data on population density, traffic volume, ecologically sensitive areas, and accident history.

  2. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for statistical trend analysis (STA) of failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations between statistical trend analysis and probabilistic risk assessment (PRA) are discussed in terms of the categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
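
    A generic way to test for a trend in rare failure rates, in the spirit of the STA described above (though not the paper's exact formulation), is Poisson regression of yearly counts with an exposure offset; the counts and exposures below are invented.

    ```python
    # Poisson-regression trend test for rare failure events (mock data).
    import numpy as np
    import statsmodels.api as sm

    years    = np.arange(10)                              # observation years
    exposure = np.full(10, 50.0)                          # unit-years at risk (assumed)
    counts   = np.array([4, 5, 3, 4, 2, 3, 2, 1, 2, 1])  # rare failures (assumed)

    X = sm.add_constant(years.astype(float))
    model = sm.GLM(counts, X, family=sm.families.Poisson(),
                   offset=np.log(exposure)).fit()

    slope = model.params[1]                               # log-rate change per year
    print(f"yearly rate ratio: {np.exp(slope):.3f}, p-value: {model.pvalues[1]:.3f}")
    ```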

  3. A parameter estimation and identifiability analysis methodology applied to a street canyon air pollution model

    DEFF Research Database (Denmark)

    Ottosen, T. B.; Ketzel, Matthias; Skov, H.

    2016-01-01

    Mathematical models are increasingly used in environmental science, thus increasing the importance of uncertainty and sensitivity analyses. In the present study, an iterative parameter estimation and identifiability analysis methodology is applied to an atmospheric model – the Operational Street Pollution Model (OSPM). The identifiability analysis showed that some model parameters were significantly more sensitive than others. The application of the determined optimal parameter values was shown to successfully equilibrate the model biases among the individual streets and species. It was also shown that the frequentist approach...

  4. Methodology for repeated load analysis of composite structures with embedded magnetic microwires

    Directory of Open Access Journals (Sweden)

    K. Semrád

    2017-01-01

    The article addresses the strength of cyclically loaded composite structures and the possibility of contactless stress measurement inside the material. For this purpose, a contactless tensile stress sensor using an improved induction principle, based on magnetic microwires embedded in the composite structure, has been developed. A methodology based on the E-N approach was applied to the analysis of the repeated loading of the wing hinge connection, including finite element method (FEM) fatigue strength analysis. The results proved that composites, in comparison with metal structures, offer a significant weight reduction for small aircraft construction, while the required strength, stability and lifetime of the components are retained.

  5. Wet weather highway accident analysis and skid resistance data management system (volume II : user's manual).

    Science.gov (United States)

    1992-06-01

    The objectives and scope of this research are to establish an effective methodology for wet weather accident analysis and to develop a database management system to facilitate information processing and storage for the accident analysis process, skid...

  6. Wet weather highway accident analysis and skid resistance data management system (volume I).

    Science.gov (United States)

    1992-06-01

    The objectives and scope of this research are to establish an effective methodology for wet weather accident analysis and to develop a database management system to facilitate information processing and storage for the accident analysis process, skid...

  7. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    Science.gov (United States)

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
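
    The time-stratified referent scheme that performed well in the reviewed simulations is easy to state in code: for each event day, the referents are the other days of the same calendar month that fall on the same weekday. The sketch below implements just that selection step; the date is illustrative.

    ```python
    # Time-stratified case-crossover referent selection (illustrative date).
    from datetime import date
    import calendar

    def time_stratified_referents(event_day: date) -> list[date]:
        """All days in the event's month sharing its weekday, excluding itself."""
        _, ndays = calendar.monthrange(event_day.year, event_day.month)
        return [d for day in range(1, ndays + 1)
                if (d := date(event_day.year, event_day.month, day)) != event_day
                and d.weekday() == event_day.weekday()]

    # Four referent Wednesdays in March 2010: the 3rd, 10th, 24th and 31st.
    print(time_stratified_referents(date(2010, 3, 17)))
    ```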

  8. Analysis of local bus markets : volume I – methodology and findings : final report.

    Science.gov (United States)

    2017-07-04

    Although New Jersey has an extensive network of public transit, traffic congestion and transportation-related greenhouse gas (GHG) emissions are significant concerns in the state. This research hypothesizes that traffic congestion and air quality concerns in...

  9. FAA Air Traffic Control Operations Concepts. Volume 1. ATC Background and Analysis Methodology. Change 1

    Science.gov (United States)

    1988-07-29

    [Abstract damaged in extraction; only fragments survive, referencing alphanumeric entries, processed information, the RMK data processing subsystem, and FDEP.] The recoverable text describes tasks that act upon or control a situation or state of affairs, usually of a verbal or motor nature (OUTPUT); notes motor (voice, tactile) processes; and observes that many tasks are composites of several of these processes, as noted in the characterizations of task type.

  10. A SAS2H/KENO-V methodology for 3D fuel burnup analysis

    International Nuclear Information System (INIS)

    Milosevic, M.; Greenspan, E.; Vujic, J.

    2002-01-01

    An efficient methodology for the 3D fuel burnup analysis of LWR reactors is described in this paper. This methodology is founded on coupling a Monte Carlo method for the 3D calculation of the node power distribution with a transport method for the depletion calculation in a 1D Wigner-Seitz equivalent cell for each node independently. The proposed fuel burnup modeling, based on the application of the SCALE-4.4a control modules SAS2H and KENO-V.a, is verified for the case of a 2D x-y model of the IRIS 15 x 15 fuel assembly (with reflective boundary conditions) using two well-benchmarked code systems. The first is MOCUP, a coupled MCNP-4C and ORIGEN2.1 utility code, and the second is the KENO-V.a/ORIGEN2.1 code system recently developed by the authors of this paper. The proposed SAS2H/KENO-V.a methodology was applied to the 3D burnup analysis of the IRIS-1000 benchmark core. Detailed k-eff and power density evolution with burnup are reported. (author)
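
    The coupling pattern behind such a methodology, alternating a 3D power solve with independent nodewise depletion over successive burnup steps, can be sketched schematically. The two "solvers" below are toy stand-ins (assumptions), not SAS2H/KENO-V.a or MOCUP interfaces.

    ```python
    # Schematic coupled power/depletion burnup loop with toy stand-in solvers.
    import numpy as np

    def mc_power_distribution(compositions):
        """Stand-in for the 3D Monte Carlo solve: returns node relative powers."""
        fissile = np.array([c["fissile"] for c in compositions])
        return fissile / fissile.sum()         # toy power model (assumed)

    def deplete_node(comp, power, days):
        """Stand-in for the 1D cell depletion of one node over one burnup step."""
        burn = 1e-3 * power * days             # toy depletion rate (assumed)
        return {"fissile": comp["fissile"] * (1.0 - burn)}

    nodes = [{"fissile": 1.0 + 0.05 * i} for i in range(8)]  # assumed loading
    for step, days in enumerate([50, 50, 100, 100]):
        power = mc_power_distribution(nodes)   # 3D power from current state
        nodes = [deplete_node(c, p, days) for c, p in zip(nodes, power)]
        print(f"step {step}: power peaking = {power.max() * len(nodes):.3f}")
    ```

    In the toy model, the more heavily loaded nodes run at higher power and deplete faster, so the peaking factor flattens with burnup, which is precisely the feedback that the iterative coupling is meant to capture.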

  11. Ruling the Commons. Introducing a new methodology for the analysis of historical commons

    Directory of Open Access Journals (Sweden)

    Tine de Moor

    2016-10-01

    Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area within commons studies. In recent years an international team of historians has worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge of the dynamics of institutions for collective action – in particular, commons. This project aims to contribute to the current debate on commons on three different fronts. Theoretically, it explicitly draws attention to issues of change and adaptation in the commons – contrasting with more static analyses. Empirically, it highlights the value of historical records as a rich source of information for the longitudinal analysis of the functioning of commons. Methodologically, it develops a systematic way of analyzing and comparing commons' regulations across regions and time, setting a number of variables defined on the basis of the "most common denominators" in commons regulation across countries and time periods. In this paper we introduce the project, describe our sources and methodology, and present the preliminary results of our analysis.

  12. A gap analysis methodology for collecting crop genepools: a case study with phaseolus beans.

    Directory of Open Access Journals (Sweden)

    Julián Ramírez-Villegas

    BACKGROUND: The wild relatives of crops represent a major source of valuable traits for crop improvement. These resources are threatened by habitat destruction, land use changes, and other factors, requiring their urgent collection and long-term availability for research and breeding from ex situ collections. We propose a method to identify gaps in ex situ collections (i.e. gap analysis) of crop wild relatives as a means to guide efficient and effective collecting activities. METHODOLOGY/PRINCIPAL FINDINGS: The methodology prioritizes among taxa based on a combination of sampling, geographic, and environmental gaps. We apply the gap analysis methodology to wild taxa of the Phaseolus genepool. Of 85 taxa, 48 (56.5%) are assigned high priority for collecting due to their absence from, or under-representation in, genebanks; 17 taxa are given medium priority for collecting, 15 low priority, and 5 species are assessed as adequately represented in ex situ collections. Gap "hotspots", representing priority target areas for collecting, are concentrated in central Mexico, although the narrow endemism of a suite of priority species adds a number of specific additional regions to the spatial collecting priorities. CONCLUSIONS/SIGNIFICANCE: The results of the gap analysis method mostly align very well with expert opinion on gaps in ex situ collections, with only a few exceptions. A more detailed prioritization of taxa and geographic areas for collection can be achieved by including predictive threat factors in the analysis, such as climate change or habitat destruction, or by adding further prioritization filters, such as the degree of relatedness to cultivated species (i.e. ease of use in crop breeding). Furthermore, results for multiple crop genepools may be overlaid, which would allow a global analysis of gaps in ex situ collections of the world's plant genetic resources.
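
    The prioritization logic can be sketched as a small scoring table: sampling, geographic and environmental representativeness scores are averaged into a final priority score that maps to a collecting priority class. The scores, thresholds and class labels below are illustrative assumptions, not the paper's published values.

    ```python
    # Toy gap-analysis priority scoring (all scores and thresholds assumed).
    import pandas as pd

    taxa = pd.DataFrame({
        "taxon": ["P. wild_sp1", "P. wild_sp2", "P. wild_sp3"],  # hypothetical
        "srs": [1.5, 6.0, 9.5],   # sampling representativeness, 0-10 (assumed)
        "grs": [2.0, 5.5, 8.0],   # geographic representativeness (assumed)
        "ers": [3.0, 7.0, 9.0],   # environmental representativeness (assumed)
    })
    taxa["fps"] = taxa[["srs", "grs", "ers"]].mean(axis=1)  # final priority score

    def priority(fps: float) -> str:
        if fps < 3.0:  return "high"      # poorly represented ex situ
        if fps < 5.0:  return "medium"
        if fps < 7.5:  return "low"
        return "adequately represented"

    taxa["priority"] = taxa["fps"].apply(priority)
    print(taxa)
    ```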

  13. Vicinity analysis: a methodology for the identification of similar protein active sites.

    Science.gov (United States)

    McGready, A; Stevens, A; Lipkin, M; Hudson, B D; Whitley, D C; Ford, M G

    2009-05-01

    Vicinity analysis (VA) is a new methodology developed to identify similarities between protein binding sites based on their three-dimensional structure and the chemical similarity of matching residues. The major objective is to enable searching of the Protein Data Bank (PDB) for similar sub-pockets, especially in proteins from different structural and biochemical series. Inspection of the ligands bound in these pockets should allow ligand functionality to be identified, thus suggesting novel monomers for use in library synthesis. VA has been developed initially using the ATP binding site in kinases, an important class of protein targets involved in cell signalling and growth regulation. This paper defines the VA procedure and describes matches to the phosphate binding sub-pocket of cyclin-dependent protein kinase 2 that were found by searching a small test database that has also been used to parameterise the methodology.

  14. Next generation iterative transport-diffusion methodology (ITDM), for LWR core analysis

    Science.gov (United States)

    Colameco, David V.

    The work in this dissertation shows that the neutronic modeling of a Pressurized Water Reactor (PWR) can be greatly improved through the use of an iterative transport-diffusion method (one-step procedure) compared to the current once-through transport-to-diffusion methodology (two-step procedure). The current methodology is efficient; however, the infinite-environment approximation of the transport lattice calculation introduces errors into the diffusion core calculation because the effect of the core environment is missing. The iterative transport-diffusion method replaces the infinite environment with a simulated 3D environment from the diffusion calculation. This dissertation further develops previous 2D work on ITDM into a simulated 3D environment, with contributions made in the treatment of axial leakage. Burnup steps are simulated over a cycle, and in the future simple thermal modeling can be added for full-core fuel cycle analysis. (Abstract shortened by UMI.).

  15. A new methodology to study customer electrocardiogram using RFM analysis and clustering

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Gholamian

    2011-04-01

    One of the primary issues in marketing planning is understanding customers' behavioral trends. A customer's purchasing interest may fluctuate for different reasons, and it is important to detect declining or increasing trends whenever they happen. It is important to study these fluctuations in order to improve customer relationships. There are different ways to increase customers' willingness to buy, such as planning good promotions, increasing advertisement, etc. This paper proposes a new methodology for measuring customers' behavioral trends, called the customer electrocardiogram. The proposed model uses the K-means clustering method with RFM analysis to study customer fluctuations over different time frames. We also apply the proposed electrocardiogram methodology to a real-world case study from the food industry, and the results are discussed in detail.
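
    The clustering step named above is straightforward with scikit-learn: standardize the recency/frequency/monetary table and run K-means; repeating the fit for successive time frames and tracking each customer's cluster membership produces the trend the authors call an electrocardiogram. The records below are invented.

    ```python
    # RFM + K-means clustering step (mock customer records).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rfm = np.array([      # [recency_days, frequency, monetary]
        [5,  40, 1200.0],
        [30, 12,  300.0],
        [90,  2,   50.0],
        [10, 25,  800.0],
        [60,  5,  120.0],
    ])

    X = StandardScaler().fit_transform(rfm)     # put features on one scale
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    for row, lab in zip(rfm, labels):
        print(row, "-> cluster", lab)
    ```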

  16. [Risk analysis study for patients care in radiotherapy: Learning curve and methodological evolution].

    Science.gov (United States)

    El Bakri, S; Fleury, B; Le Grévellec, M

    2015-10-01

    To describe the evolution of our risk mapping methodology over the past two years. Based on the FMEA (failure mode and effects analysis) method, some aspects have been adapted (e.g., the concepts of risk control and effort scale) and others introduced (e.g., the concept of residual risk management). A weekly meeting is scheduled by a multidisciplinary team in order to support the different projects. Experience and practice have led us to upgrade our severity and detectability scales, identify critical points and introduce the residual risk management concept. Some difficulties remain with regard to the multiplicity of scenarios. Risk mapping is an essential tool in the implementation of quality risk management, specifically when the methodology is progressive and involves all the members of a multidisciplinary team. Copyright © 2015 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
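
    The FMEA scoring that underlies such a risk mapping can be sketched in a few lines: each failure mode receives severity, occurrence and detectability scores, and their product, the risk priority number, ranks the modes. The failure modes, scales and the RPN ≥ 100 action threshold below are illustrative assumptions, not the department's actual mapping.

    ```python
    # FMEA-style risk priority numbers: RPN = severity * occurrence * detectability.
    failure_modes = [
        # (description, severity 1-10, occurrence 1-10, detectability 1-10) assumed
        ("wrong patient positioning",      8, 3, 4),
        ("incorrect dose prescription",   10, 2, 3),
        ("treatment plan transfer error",  9, 2, 2),
        ("imaging protocol omitted",       5, 4, 5),
    ]

    scored = [(desc, s * o * d) for desc, s, o, d in failure_modes]
    for desc, rpn in sorted(scored, key=lambda x: -x[1]):
        flag = "  <- treat first" if rpn >= 100 else ""   # assumed action threshold
        print(f"RPN {rpn:4d}  {desc}{flag}")
    ```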

  17. Methodology for Design and Analysis of Reactive Distillation Involving Multielement Systems

    DEFF Research Database (Denmark)

    Jantharasuk, Amnart; Gani, Rafiqul; Górak, Andrzej

    2011-01-01

    A new methodology for the design and analysis of reactive distillation has been developed. In this work, the element-based approach, coupled with a driving force diagram, has been extended and applied to the design of a reactive distillation column involving multielement (multicomponent) systems. The transformation of ordinary systems to element-based ones and the aggregation of non-key elements allow the important design parameters, such as the number of stages, the feed stage and the minimum reflux ratio, to be determined using simple diagrams similar to those regularly employed for non-reactive systems consisting of two components. Based on this methodology, an optimal design configuration is identified using the equivalent binary-element driving force diagram. Two case studies, methyl acetate (MeOAc) synthesis and methyl tert-butyl ether (MTBE) synthesis, have been considered to demonstrate
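
    The driving-force construction can be illustrated for an equivalent binary element pair with constant relative volatility: with y = αx/(1+(α−1)x), the driving force F(x) = y − x peaks at x = 1/(1+√α), and the feed stage is typically located near that maximum. The α value below is an assumption.

    ```python
    # Driving force F(x) = y - x for a constant-relative-volatility binary pair.
    import numpy as np

    alpha = 2.5                           # assumed constant relative volatility
    x = np.linspace(0.0, 1.0, 1001)       # light-element liquid composition
    y = alpha * x / (1.0 + (alpha - 1.0) * x)
    F = y - x                             # driving force

    x_peak = x[np.argmax(F)]
    print(f"max driving force {F.max():.4f} at x = {x_peak:.3f}")
    print(f"analytical peak location: {1.0 / (1.0 + np.sqrt(alpha)):.3f}")
    ```

    Designing at the maximum driving force tends to minimize the separation effort, which is the rationale for reading the feed stage and stage count off this diagram.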

  18. [Methodological novelties applied to the anthropology of food: agent-based models and social networks analysis].

    Science.gov (United States)

    Díaz Córdova, Diego

    2016-01-01

    The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social networks analysis. In order to illustrate these methods in action, two cases based in materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.

  19. The Nuclear Organization and Management Analysis Concept methodology: Four years later

    International Nuclear Information System (INIS)

    Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

    1992-01-01

    The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology have been demonstrated at two commercial nuclear power plants (NPPs) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper focuses on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations

  20. Technical support document: Energy efficiency standards for consumer products: Room air conditioners, water heaters, direct heating equipment, mobile home furnaces, kitchen ranges and ovens, pool heaters, fluorescent lamp ballasts and television sets. Volume 1, Methodology

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    The Energy Policy and Conservation Act (P.L. 94-163), as amended, establishes energy conservation standards for 12 of the 13 types of consumer products specifically covered by the Act. The legislation requires the Department of Energy (DOE) to consider new or amended standards for these and other types of products at specified times. DOE is currently considering amending standards for seven types of products: water heaters, direct heating equipment, mobile home furnaces, pool heaters, room air conditioners, kitchen ranges and ovens (including microwave ovens), and fluorescent light ballasts and is considering establishing standards for television sets. This Technical Support Document presents the methodology, data, and results from the analysis of the energy and economic impacts of the proposed standards. This volume presents a general description of the analytic approach, including the structure of the major models.

  1. Methodological approaches to planar and volumetric scintigraphic imaging of small volume targets with high spatial resolution and sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Mejia, J.; Galvis-Alonso, O.Y. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Faculdade de Medicina. Dept. de Biologia Molecular], e-mail: mejia_famerp@yahoo.com.br; Braga, J. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Div. de Astrofisica; Correa, R. [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Div. de Ciencia Espacial e Atmosferica; Leite, J.P. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Neurologia, Psiquiatria e Psicologia Medica; Simoes, M.V. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Clinica Medica

    2009-08-15

    Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique, which provides information reporting the functional states of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of diseases for physiopathological, genomic and drug discovery studies. However, most of the experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than man. Consequently, images of targets obtained with conventional gamma-cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years in order to obtain images of small targets with good spatial resolution and sensitivity. Multi pinhole, coded mask- and slit-based collimators are presented as alternative approaches to improve image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3-D representations of the volumetric distribution of target's radiotracers. Simultaneously, they can be used to minimize artifacts and blurring arising when single pinhole collimators are used. Representation images are presented, which illustrate the use of these collimators. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction to obtain near-field images. We conclude this review by showing that the use of appropriate hardware and software tools adapted to conventional gamma-cameras can be of great help in obtaining relevant functional information in experiments using small animals. (author)

  2. Methodological approaches to planar and volumetric scintigraphic imaging of small volume targets with high spatial resolution and sensitivity

    International Nuclear Information System (INIS)

    Mejia, J.; Galvis-Alonso, O.Y.; Braga, J.; Correa, R.; Leite, J.P.; Simoes, M.V.

    2009-01-01

    Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique that provides information on the functional state of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of disease for physiopathological, genomic and drug-discovery studies. However, most experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than humans. Consequently, images of targets obtained with conventional gamma-cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years to obtain images of small targets with good spatial resolution and sensitivity. Multipinhole, coded-mask- and slit-based collimators are presented as alternative approaches to improve image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3-D representations of the volumetric distribution of the target's radiotracer. Simultaneously, they can be used to minimize the artifacts and blurring that arise when single-pinhole collimators are used. Representative images are presented to illustrate the use of these collimators. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction for near-field imaging. We conclude this review by showing that the use of appropriate hardware and software tools adapted to conventional gamma-cameras can be of great help in obtaining relevant functional information in experiments using small animals. (author)

  3. Methodological approaches to planar and volumetric scintigraphic imaging of small volume targets with high spatial resolution and sensitivity

    Directory of Open Access Journals (Sweden)

    J. Mejia

    2009-08-01

    Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique that provides information on the functional state of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of disease for physiopathological, genomic and drug-discovery studies. However, most experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than humans. Consequently, images of targets obtained with conventional gamma-cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years to obtain images of small targets with good spatial resolution and sensitivity. Multipinhole, coded-mask- and slit-based collimators are presented as alternative approaches to improve image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3-D representations of the volumetric distribution of the target's radiotracer. Simultaneously, they can be used to minimize the artifacts and blurring that arise when single-pinhole collimators are used. Representative images are presented to illustrate the use of these collimators. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction for near-field imaging. We conclude this review by showing that the use of appropriate hardware and software tools adapted to conventional gamma-cameras can be of great help in obtaining relevant functional information in experiments using small animals.

  4. An Analysis of Insider Trading in the Credit Derivatives Market Using the Event Study Methodology

    Directory of Open Access Journals (Sweden)

    Ewa Wareluk

    2013-12-01

    Purpose: In this paper I investigate the information flow between the credit default swap market and the stock market, as well as insider trading in the credit default swap market. Methodology: I use the event study methodology to calculate abnormal stock returns and abnormal credit default swap premium changes. The analysis is based on 175,874 observations collected for 92 companies between the years 2001 and 2010. Findings: The results show that the information flow from the credit default swap market to the stock market is most significant in the case of negative rating outlooks. The information flow is much less significant in relation to negative surprises during announcements of annual financial results and to rating upgrades. Evidence of insider trading is also most evident with reference to negative rating outlooks. Additionally, a distinctive feature of the credit default swap market and the stock market is their asymmetric response to negative and positive credit information. Research limitations: The event study methodology does not consider reasons for the information flow between markets other than the ones actually investigated. The credit events and credit risk information used in this research are just a proposal and can be extended by future researchers. Originality: This paper discusses a new research area. The main research area in terms of insider trading is still the stock market, with special focus on the US market. I decided to explore the insider trading phenomenon in the credit default swap market, considering only contracts quoted with reference to European underlying assets. This part of the financial market is attractive in terms of economic research, as credit derivatives are more commonly used not only in North America but also in Europe.
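
    As a rough illustration of the event study mechanics described above (not the author's code), the following sketch estimates a market model over a pre-event window and cumulates abnormal returns around an event date; the return series, window lengths and event are all synthetic.

```python
import numpy as np

def event_study_car(stock_ret, market_ret, event_idx,
                    est_win=120, event_win=5):
    """Market-model event study: estimate alpha/beta on an estimation
    window before the event, then cumulate abnormal returns around it."""
    # Estimation window strictly before the event window.
    est = slice(event_idx - est_win - event_win, event_idx - event_win)
    beta, alpha = np.polyfit(market_ret[est], stock_ret[est], 1)

    # Event window centered on the event date.
    ev = slice(event_idx - event_win, event_idx + event_win + 1)
    expected = alpha + beta * market_ret[ev]
    abnormal = stock_ret[ev] - expected
    return abnormal, abnormal.sum()   # abnormal returns and CAR

# Synthetic daily returns for demonstration only.
rng = np.random.default_rng(1)
mkt = rng.normal(0.0, 0.01, 400)
stk = 0.0002 + 1.2 * mkt + rng.normal(0.0, 0.01, 400)
stk[250:253] -= 0.03                 # injected "negative credit news" drop

ar, car = event_study_car(stk, mkt, event_idx=250)
print("CAR around event:", round(car, 4))
```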

  5. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    Science.gov (United States)

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV) infection. Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images were employed, obtaining a 1.31% mean absolute CPA error with a 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid the manual threshold-based and region selection processes widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time.
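
    A minimal sketch of the kind of pipeline the abstract describes, using scikit-learn's k-means for background/tissue separation and for collagen detection; the liver/non-liver classification stage is omitted, and the image, colors and heuristics are placeholders rather than the paper's settings.

```python
import numpy as np
from sklearn.cluster import KMeans

def collagen_proportional_area(rgb):
    """Crude CPA estimate: cluster pixels into background / tissue /
    collagen-stained classes, then report collagen area / tissue area."""
    pixels = rgb.reshape(-1, 3).astype(float)

    # Stage 1: background vs tissue (background assumed brightest cluster).
    km_bg = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
    bright = km_bg.cluster_centers_.mean(axis=1).argmax()
    tissue = pixels[km_bg.labels_ != bright]

    # Stage 2: within tissue, split stained collagen from other tissue.
    km_fib = KMeans(n_clusters=2, n_init=10, random_state=0).fit(tissue)
    # Assume collagen stain is the redder cluster (placeholder heuristic).
    collagen = km_fib.cluster_centers_[:, 0].argmax()
    return 100.0 * (km_fib.labels_ == collagen).sum() / len(tissue)

# Synthetic 'biopsy' image for demonstration only.
rng = np.random.default_rng(2)
img = rng.integers(180, 255, size=(64, 64, 3))               # bright background
img[16:48, 16:48] = rng.integers(60, 140, size=(32, 32, 3))  # darker tissue
img[20:28, 20:28, 0] = 220                                   # red-stained patch
print(f"CPA ~ {collagen_proportional_area(img):.1f}%")
```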

  6. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository - Volume 3: Appendices

    International Nuclear Information System (INIS)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-01-01

    The United States Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and the Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNF in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while preparations are made for the final compliance evaluation. This report presents results from the NDCA, which examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. The analysis investigated the potential for post-closure criticality, the consequences of a criticality excursion, and the probability/frequency of post-closure criticality. The results of the NDCA are intended to provide DOE-EM with a technical basis for measuring risk, which can be used in screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3)

  7. Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.

    Science.gov (United States)

    Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M

    2017-07-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.

  8. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hugo, Jacques [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  9. METHODOLOGICAL ANALYSIS OF STUDYING THE PROBLEM OF PERCEPTION IN FUTURE MUSIC TEACHERS’ PROFESSIONAL TRAINING

    Directory of Open Access Journals (Sweden)

    Zhang Bo

    2017-04-01

    The article presents a methodological analysis of the problem of perception in future music teachers' professional training. The author analyses the work of leading scholars in philosophy, psychology, and art education, and reveals a hierarchical system of options for musical perception. The methodological foundation is supported by modern research in the theory and methodology of musical study, which gives shape and detail to the material presented. Drawing on vocal and choral research in the field of forming future music teachers' valued perception of music art, the author aims to present a methodological analysis of the problem of perception in future music teachers' professional training. Applying the system approach to the problem of forming future music teachers' valued perception of music art, while they are trained in vocal and choral work with senior pupils, extends their artistic awareness; it contributes to distinguishing art works and phenomena, to seeing their properties, and to providing orientation in the informative content of music art works. Special attention is paid to revealing the methodological principles of research on the category of perception in the aspect of the valued understanding of images in music art works. As a result of analysing scientific sources on the issue of voice production, the author finds that perception is closely related to the transformation of external information, conditioning the forming of images and engaging attention, memory, thinking, and emotions. The features of perception in vocal and choral studies, and their extrapolation by students to future professional activity with senior pupils, are analysed in the aspects of perception and transformation of musical and intonation information, analysis, object perception, and interpretation in accordance with future...

  10. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.

    1992-01-01

    The intent of this study was to rigorously examine all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2500) to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to the classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in time and manpower required to complete the study were also realized
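
    To make the decision-tree step concrete, here is a generic expected-value rollback over a small tree; the options, probabilities and payoffs are invented and are not Rangely figures.

```python
# Generic expected-value rollback for a decision tree.
# Chance nodes average their branches by probability; decision nodes
# take the best branch. All numbers are illustrative only.

def rollback(node):
    kind = node["type"]
    if kind == "payoff":
        return node["value"]
    if kind == "chance":
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":
        return max(rollback(child) for _, child in node["branches"])
    raise ValueError(kind)

tree = {
    "type": "decision",
    "branches": [
        ("expand CO2 injection", {
            "type": "chance",
            "branches": [
                (0.6, {"type": "payoff", "value": 120.0}),   # high recovery
                (0.4, {"type": "payoff", "value": -30.0}),   # poor response
            ],
        }),
        ("status quo", {"type": "payoff", "value": 25.0}),
    ],
}

print("expected NPV of best strategy:", rollback(tree))  # 0.6*120 - 0.4*30 = 60
```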

  11. Study of Automobile Market Dynamics : Volume 2. Analysis.

    Science.gov (United States)

    1977-08-01

    Volume II describes the work in providing statistical inputs to a computer model by examining the effects of various options on the number of automobiles sold; the distribution of sales among small, medium and large cars; the distribution between aut...

  12. Drug research methodology. Volume 3, The detection and quantitation of drugs of interest in body fluids from drivers

    Science.gov (United States)

    1980-03-01

    This report presents the findings of a workshop on the chemical analysis of human body fluids for drugs of interest in highway safety. A cross-disciplinary panel of experts reviewed the list of drugs of interest developed in a previous workshop and d...

  13. Influence of Software Tool and Methodological Aspects of Total Metabolic Tumor Volume Calculation on Baseline [18F]FDG PET to Predict Survival in Hodgkin Lymphoma.

    Science.gov (United States)

    Kanoun, Salim; Tal, Ilan; Berriolo-Riedinger, Alina; Rossi, Cédric; Riedinger, Jean-Marc; Vrigneaud, Jean-Marc; Legrand, Louis; Humbert, Olivier; Casasnovas, Olivier; Brunotte, François; Cochet, Alexandre

    2015-01-01

    To investigate the respective influence of the software tool and the total metabolic tumor volume (TMTV0) calculation method on the prognostic stratification of baseline 2-deoxy-2-[18F]fluoro-D-glucose positron emission tomography ([18F]FDG-PET) in newly diagnosed Hodgkin lymphoma (HL), 59 patients with newly diagnosed HL were retrospectively included. [18F]FDG-PET was performed before any treatment. Four sets of TMTV0 were calculated with the Beth Israel (BI) software: based on an absolute threshold selecting voxels with standardized uptake value (SUV) >2.5 (TMTV0_2.5), applying a per-lesion threshold of 41% of the SUVmax (TMTV0_41), and using per-patient adapted thresholds based on the SUVmax of the liver (>125% and >140% of the liver background SUVmax; TMTV0_125 and TMTV0_140). TMTV0_41 was also determined with commercial software for comparison of software tools. ROC curves were used to determine the optimal threshold for each TMTV0 to predict treatment failure. Median follow-up was 39 months. There was an excellent correlation between TMTV0_41 determined with BI and with the commercial software (r = 0.96). The optimal TMTV0 cut-off values and areas under the ROC curve for the prediction of progression-free survival (PFS) were, respectively: 313 ml and 0.70, 432 ml and 0.68, 450 ml and 0.68, and 330 ml and 0.68. There was no significant difference between ROC curves. A high TMTV0 value was predictive of poor PFS with all methodologies: 4-year PFS was 83% vs 42% (p = 0.006) for TMTV0_2.5, 83% vs 41% (p = 0.003) for TMTV0_41, 85% vs 40% (p<0.001) for TMTV0_125 and 83% vs 42% (p = 0.004) for TMTV0_140. In newly diagnosed HL, baseline metabolic tumor volume values were significantly influenced by the choice of the method used for determination of volume. However, no significant differences were found in terms of prognosis.
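
    As an illustration of threshold-based metabolic tumor volume calculation (a generic sketch, not the Beth Israel implementation), the following computes a TMTV from an SUV array under an absolute and a liver-based threshold; the array, voxel size and liver SUVmax are synthetic, and the per-lesion 41% SUVmax method would additionally require lesion segmentation.

```python
import numpy as np

def tmtv_ml(suv, voxel_volume_ml, threshold):
    """Total metabolic tumor volume: volume of all voxels above threshold."""
    return float((suv > threshold).sum()) * voxel_volume_ml

# Synthetic SUV volume: low-uptake background plus two 'lesions'.
rng = np.random.default_rng(3)
suv = rng.uniform(0.2, 1.5, size=(60, 60, 60))
suv[10:18, 10:18, 10:18] = 8.0         # lesion 1
suv[35:40, 35:40, 35:40] = 5.0         # lesion 2
voxel_ml = 0.064                       # 4 x 4 x 4 mm voxel = 0.064 ml

liver_suv_max = 3.0                    # would be measured in a liver VOI

print("TMTV (SUV > 2.5):         ", tmtv_ml(suv, voxel_ml, 2.5), "ml")
print("TMTV (>125% liver SUVmax):", tmtv_ml(suv, voxel_ml, 1.25 * liver_suv_max), "ml")
```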

  14. Three years of the OCRA methodology in Brazil: critical analysis and results.

    Science.gov (United States)

    Ruddy, Facci; Eduardo, Marcatto; Edoardo, Santino

    2012-01-01

    The authors make a detailed analysis of the introduction of the OCRA methodology in Brazil, which started in August 2008 with the launch of the "OCRA Book" translated into Portuguese. They evaluate the importance of assessing the exposure of the upper limbs to risk due to repetitive movements and efforts, according to national and international legislation, demonstrating the interconnection of the OCRA methodology with the Regulating Norms of the Ministry of Labor and Work (NRs - MTE), especially with NR-17 and its Application Manual. They discuss the new paradigms of the OCRA method in relation to the classic paradigms of ergonomic knowledge, and indicate the OCRA method as the tool to be used to confirm (or not) the New Previdentiary Epidemiologic Nexus NTEP/FAP. The authors present conclusions based on the practical results that practitioners certified in the OCRA methodology achieved in applications to different work activities in diverse economic segments, showing risk reduction and productivity gains for the companies.

  15. Multiscale Entropy Analysis of Center-of-Pressure Dynamics in Human Postural Control: Methodological Considerations

    Directory of Open Access Journals (Sweden)

    Brian J. Gow

    2015-11-01

    Multiscale entropy (MSE) is a widely used metric for characterizing the nonlinear dynamics of physiological processes. Significant variability, however, exists in the methodological approaches to MSE, which may ultimately impact results and their interpretations. Using publications focused on balance-related center of pressure (COP) dynamics, we highlight sources of methodological heterogeneity that can impact study findings. Seventeen studies were systematically identified that employed MSE for characterizing COP displacement dynamics. We identified five key methodological procedures that varied significantly between studies: (1) data length; (2) frequencies of the COP dynamics analyzed; (3) sampling rate; (4) point-matching tolerance and sequence length; and (5) filtering of displacement changes from drifts, fidgets, and shifts. We discuss strengths and limitations of the various approaches employed and supply flowcharts to assist in the decision-making process regarding each of these procedures. Our guidelines are intended to more broadly inform the design and analysis of future studies employing MSE for continuous time series, such as COP.
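
    A compact sketch of the MSE computation (coarse-graining followed by sample entropy) follows; the parameters are invented, and note that the tolerance r is recomputed from each coarse-grained series here, itself one of the methodological choices this review highlights.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.15):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points (within tolerance r) also match at m+1."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n = len(x)

    def count_pairs(dim):
        # n - m templates for both lengths, per Richman & Moorman.
        tpl = np.array([x[i:i + dim] for i in range(n - m)])
        count = 0
        for i in range(len(tpl) - 1):
            dist = np.max(np.abs(tpl[i + 1:] - tpl[i]), axis=1)
            count += int((dist <= r).sum())
        return count

    b, a = count_pairs(m), count_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

def multiscale_entropy(x, max_scale=5, m=2, r_factor=0.15):
    """Coarse-grain the series at each scale, then compute SampEn."""
    x = np.asarray(x, dtype=float)
    mse = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = x[:n * tau].reshape(n, tau).mean(axis=1)
        mse.append(sample_entropy(coarse, m, r_factor))
    return mse

# White noise: SampEn should fall with scale; COP data would be loaded here.
rng = np.random.default_rng(4)
print([round(v, 2) for v in multiscale_entropy(rng.normal(size=3000))])
```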

  16. Application of fault tree methodology in the risk analysis of complex systems

    International Nuclear Information System (INIS)

    Vasconcelos, V. de.

    1984-01-01

    This study describes the fault tree methodology and applies it to the risk assessment of complex facilities. The description of the methodology attempts to provide all the pertinent basic information, pointing out its more important aspects such as fault tree construction, evaluation techniques, and their use in the risk and reliability assessment of a system. In view of their importance, topics such as common-mode failures, human errors, the databases used in the calculations, and the uncertainty evaluation of the results are discussed separately, each in its own chapter. To apply the methodology, it was necessary to implement computer codes normally used for this kind of analysis. The computer codes PREP, KITT and SAMPLE, written in FORTRAN IV, were chosen due to their availability and to the fact that they have been used in important studies in the nuclear area, such as WASH-1400. With these codes, the probability of occurrence of excessive pressure in the main system of the component test loop (CTC) of CDTN was evaluated. (Author)
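
    The codes named above are not reproduced here, but the basic quantification step of a fault tree can be sketched generically: with independent basic events, an AND gate multiplies probabilities and an OR gate combines complements. The tree structure and numbers below are invented.

```python
# Minimal fault-tree evaluation sketch (not PREP/KITT): top-event
# probability from basic-event probabilities assuming independence.
# OR gate: 1 - prod(1 - p_i); AND gate: prod(p_i).

from functools import reduce

def evaluate(gate):
    if isinstance(gate, float):                 # basic event probability
        return gate
    op, children = gate
    probs = [evaluate(c) for c in children]
    if op == "AND":
        return reduce(lambda a, b: a * b, probs, 1.0)
    if op == "OR":
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)
    raise ValueError(op)

# Top event: overpressure = relief valve fails AND (sensor fails OR operator error)
tree = ("AND", [1e-3,                      # relief valve failure
                ("OR", [2e-3, 5e-2])])     # sensor failure OR operator error
print(f"top-event probability ~ {evaluate(tree):.2e}")
```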

  17. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej

    2018-02-15

    Software-Defined Networking (SDN) and OpenFlow are actively being standardized and deployed. These deployments rely on switches that come from various vendors and differ in terms of performance and available features. Understanding these differences and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a stream of rule updates while both observing the control-plane view as reported by the switch and probing the data-plane state, and determines switch characteristics by comparing the two views. We measure, report and explain the performance characteristics of flow table updates in six hardware OpenFlow switches. Our results on rule update rates can help SDN designers make their controllers efficient. Further, we highlight differences between the OpenFlow specification and its implementations that, if ignored, pose a serious threat to network security and correctness.

  18. A new methodology for fault detection in rolling element bearings using singular spectrum analysis

    Directory of Open Access Journals (Sweden)

    Bugharbee Hussein Al

    2018-01-01

    This paper proposes a vibration-based methodology for fault detection in rolling element bearings, based on pure data analysis via the singular spectrum method. The method builds a baseline space from feature vectors made of signals measured in the healthy/baseline bearing condition. The feature vectors are formed from the Euclidean norms of the first three principal components (PCs) found for the measured signals. The lagged version of any new signal, corresponding to a new (possibly faulty) condition, is then projected onto this baseline feature space in order to assess its similarity to the baseline condition. The category of a new signal vector is determined from the Mahalanobis distance (MD) of its feature vector to the baseline space. A validation of the methodology is presented based on results from an experimental test rig. The results obtained confirm the effective performance of the suggested methodology. It consists of simple steps and is easy to apply, with a perspective to make it automatic and suitable for commercial applications.
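
    A toy end-to-end sketch of the suggested scheme follows (the signals, window length and feature details are invented stand-ins, not the paper's settings): baseline features are taken from the leading singular values of each signal's lagged trajectory matrix, and new signals are scored by Mahalanobis distance to the baseline cloud.

```python
import numpy as np

def ssa_features(signal, window=30, n_pc=3):
    """Feature vector from singular spectrum analysis: the norms of the
    first n_pc principal components of the lagged (trajectory) matrix,
    which equal the leading singular values."""
    rows = len(signal) - window + 1
    traj = np.array([signal[i:i + window] for i in range(rows)])
    s = np.linalg.svd(traj, compute_uv=False)
    return s[:n_pc]

def mahalanobis(x, mean, cov_inv):
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(5)
t = np.arange(2048) / 2048.0

# Baseline: repeated 'healthy' vibration records (synthetic stand-ins).
baseline = np.array([ssa_features(np.sin(300 * t) + 0.3 * rng.normal(size=t.size))
                     for _ in range(20)])
mu = baseline.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

healthy_new = np.sin(300 * t) + 0.3 * rng.normal(size=t.size)
faulty = np.sin(300 * t) + 0.3 * rng.normal(size=t.size)
faulty[::128] += 3.0                      # injected impulsive 'fault'

print("MD healthy-like:", mahalanobis(ssa_features(healthy_new), mu, cov_inv))
print("MD faulty:      ", mahalanobis(ssa_features(faulty), mu, cov_inv))
```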

  19. Quantitative studies of rhubarb using quantitative analysis of multicomponents by single marker and response surface methodology.

    Science.gov (United States)

    Sun, Jiachen; Wu, Yueting; Dong, Shengjie; Li, Xia; Gao, Wenyuan

    2017-10-01

    In this work, we developed a novel approach to evaluate the contents of bioactive components in rhubarb, based on quantitative analysis of multicomponents by a single marker combined with response surface methodology. The quantitative analysis of multicomponents by a single-marker method, based on high-performance liquid chromatography coupled with photodiode array detection, was developed and applied to determine the contents of 12 bioactive components in rhubarb. No significant differences were found between the results of the single-marker method and the external standard method. In order to maximize the extraction of the 12 bioactive compounds from rhubarb, the ultrasonic-assisted extraction conditions were optimized by response surface methodology coupled with a Box-Behnken design. The optimal conditions were as follows: ethanol/water proportion 74.39%, solvent-to-solid ratio 24.07:1 v/w, extraction time 51.13 min, and extraction temperature 63.61°C. The analytical scheme established in this research should be a reliable, convenient, and appropriate method for the quantitative determination of bioactive compounds in rhubarb.
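
    The response-surface step can be sketched generically: fit a second-order model to coded design points and solve for the stationary point. The design points and yields below are invented, not the rhubarb data.

```python
import numpy as np

# Generic response-surface step: fit a quadratic model
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# to coded design points, then locate the stationary (optimal) point.
def fit_quadratic(X, y):
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def stationary_point(coef):
    _, b1, b2, b11, b22, b12 = coef
    # Solve gradient = 0: [[2*b11, b12], [b12, 2*b22]] @ x = -[b1, b2]
    H = np.array([[2 * b11, b12], [b12, 2 * b22]])
    return np.linalg.solve(H, -np.array([b1, b2]))

# Invented design points (coded -1..1) and measured extraction yields.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [0, 0], [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
y = np.array([62.0, 70.0, 66.0, 71.0, 78.0, 77.5, 72.0, 75.0, 70.5, 73.0])

coef = fit_quadratic(X, y)
print("optimum (coded factors):", stationary_point(coef))
```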

  20. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    [...] (ii) assessment of property model prediction errors, (iii) effect of outliers and data pre-treatment, (iv) formulation of the parameter estimation problem (e.g. weighted least squares, ordinary least squares, robust regression, etc.). In this study a comprehensive methodology is developed to perform a rigorous [...] 2) weighted-least-squares regression. 3) Initialization of the estimation by use of linear algebra, providing a first guess. 4) Sequential and simultaneous GC parameter estimation using 4 different minimization algorithms. 5) Thorough uncertainty analysis: a) based on asymptotic approximation of the parameter [...]
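
    A generic sketch of weighted-least-squares estimation with asymptotic parameter uncertainty, the kind of step the record outlines, is given below; the linear model, data and weights are invented, and in a real group-contribution (GC) model the design-matrix columns would be group-occurrence counts.

```python
import numpy as np

def wls_with_uncertainty(A, y, w):
    """Weighted least squares with asymptotic parameter covariance:
    theta = (A'WA)^-1 A'Wy;  cov(theta) ~ s^2 (A'WA)^-1."""
    W = np.diag(w)
    AtWA = A.T @ W @ A
    theta = np.linalg.solve(AtWA, A.T @ W @ y)
    resid = y - A @ theta
    dof = len(y) - A.shape[1]
    s2 = float(resid @ (W @ resid)) / dof
    cov = s2 * np.linalg.inv(AtWA)
    return theta, np.sqrt(np.diag(cov))   # estimates and standard errors

# Invented linear GC-style model: property = sum_k n_k * c_k
rng = np.random.default_rng(6)
A = rng.integers(0, 4, size=(40, 3)).astype(float)    # group occurrence counts
true_c = np.array([2.0, -1.0, 0.5])
y = A @ true_c + rng.normal(0, 0.2, size=40)
w = np.ones(40)                                       # equal weights here

theta, se = wls_with_uncertainty(A, y, w)
print("estimates:", theta.round(3), " std errors:", se.round(3))
```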

  1. Analysis of the link between a definition of sustainability and the life cycle methodologies

    DEFF Research Database (Denmark)

    Jørgensen, Andreas; Herrmann, Ivan Tengbjerg; Bjørn, Anders

    2013-01-01

    [...] is presented and detailed to a level enabling an analysis of the relation to the impact categories at midpoint level considered in life cycle (LC) methodologies. The interpretation of the definition of sustainability as outlined in Our Common Future (WCED 1987) suggests that the assessment of a product [...] sustainability assessment (LCSA) if focusing on the monetary gains or losses for the poor. Yet, this is an aspect which is already considered in several SLCA approaches. The current consensus that LCSA can be performed through combining the results from an SLCA, LCA and LCC is only partially supported [...]

  2. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1997-01-01

    The heuristically based generalized perturbation theory (HGPT) methodology has been extensively used in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is commented on here, with particular emphasis on problems implying an intensive system control variable. (author)

  3. Problem solving and data analysis using Minitab a clear and easy guide to six sigma methodology

    CERN Document Server

    Khan, Rehman M

    2013-01-01

    Six Sigma statistical methodology using Minitab. Problem Solving and Data Analysis using Minitab presents example-based learning to aid readers in understanding how to use MINITAB 16 for statistical analysis and problem solving. Each example and exercise is broken down into the exact steps that must be followed in order to take the reader through key learning points and work through complex analyses. Exercises are featured at the end of each example so that the reader can be assured that they have understood the key learning points. Key features: Provides readers with a step-by-step guide to problem solving and statistical analysis using Minitab 16, which is also compatible with version 15. Includes fully worked examples with graphics showing menu selections and Minitab outputs. Uses example-based learning that the reader can work through at their own pace. Contains hundreds of screenshots to aid the reader, along with explanations of the statistics being performed and interpretation of results. Presents the core s...

  4. Adaptation of SW-846 methodology for the organic analysis of radioactive mixed wastes

    International Nuclear Information System (INIS)

    Griest, W.H.; Schenley, R.L.; Tomkins, B.A.; Caton, J.E. Jr.; Fleming, G.S.; Harmon, S.H.; Wachter, L.J.; Garcia, M.E.; Edwards, M.D.

    1990-01-01

    Modifications to SW-846 sample preparation methodology permit the organic analysis of radioactive mixed waste with minimum personal radiation exposure and equipment contamination. This paper describes modifications to SW-846 methods 5030 and 3510-3550 for sample preparation in radiation-zoned facilities (hood, glove box, and hot cell) and GC-MS analysis of the decontaminated organic extracts in a conventional laboratory for volatile and semivolatile organics by methods 8240 and 8270 (respectively). Results will be presented from the analysis of nearly 70 nuclear waste storage tank liquids and 17 sludges. Regulatory organics do not account for the organic matter suggested to be present by total organic carbon measurements. 7 refs., 5 tabs

  5. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    Science.gov (United States)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel consumption, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, a block scheme for the synthesis of the most efficient fuel system alternative using mathematical models, and a set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions to develop in-house energy supply sources for RH processing facilities are also provided.

  6. Solar thermal technology development: Estimated market size and energy cost savings. Volume 2: Assumptions, methodology and results

    Science.gov (United States)

    Gates, W. R.

    1983-02-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Three fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. Solar thermal technology research and development (R&D) is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest. Analysis is also provided regarding two federal incentives currently in use: The Federal Business Energy Tax Credit and direct R&D funding.

  7. Solar thermal technology development: Estimated market size and energy cost savings. Volume 2: Assumptions, methodology and results

    Science.gov (United States)

    Gates, W. R.

    1983-01-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Three fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. Solar thermal technology research and development (R&D) is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest. Analysis is also provided regarding two federal incentives currently in use: The Federal Business Energy Tax Credit and direct R&D funding.

  8. CT volumetric analysis of pleural effusions: a comparison with thoracentesis volumes.

    Science.gov (United States)

    Chiao, David; Hanley, Michael; Olazagasti, Juan M

    2015-09-01

    The primary objective of this study was to compare computed tomography (CT) volumetric analysis of pleural effusions with thoracentesis volumes. The secondary objective was to compare subjective grading of pleural effusion size with thoracentesis volumes. This was a retrospective study of 67 patients with free-flowing pleural effusions who underwent therapeutic thoracentesis. CT volumetric analysis was performed on all patients, and the CT volumes were compared with the thoracentesis volumes. In addition, the subjective grading of pleural effusion size was compared with the thoracentesis volumes. The average difference between CT volume and thoracentesis volume was 9.4 mL (1.3%) ± 290 mL (30%); these volumes were not statistically different (P = .79, paired two-tailed Student's t-test). The thoracentesis volume of a "small," "moderate," and "large" pleural effusion, as graded on chest CT, was found to be approximately 410 ± 260 mL, 770 ± 270 mL and 1370 ± 650 mL, respectively; the thoracentesis volume of a "small," "moderate," and "large" pleural effusion, as graded on chest radiograph, was found to be approximately 610 ± 320 mL, 1040 ± 460 mL, and 1530 ± 830 mL, respectively. CT volumetric analysis is an accessible tool that can be used to accurately quantify the size of pleural effusions.
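
    A small sketch of the two quantitative steps implied above, with synthetic numbers rather than study data: summing a binary segmentation mask into a volume, and comparing paired CT and thoracentesis volumes with a paired two-tailed t-test.

```python
import numpy as np
from scipy import stats

def volume_ml(mask, voxel_volume_ml):
    """Volume of a binary segmentation mask in milliliters."""
    return float(mask.sum()) * voxel_volume_ml

# Synthetic effusion mask: ~11,000 voxels of 0.05 ml each -> 560 ml.
mask = np.zeros((100, 100, 40), dtype=bool)
mask[30:70, 30:70, 10:17] = True
print("CT volume estimate:", volume_ml(mask, 0.05), "ml")

# Paired comparison of CT vs thoracentesis volumes (invented cohort).
rng = np.random.default_rng(7)
thora = rng.uniform(300, 1500, size=25)
ct = thora + rng.normal(0, 150, size=25)     # unbiased, noisy agreement
t, p = stats.ttest_rel(ct, thora)
print(f"paired t-test: t = {t:.2f}, p = {p:.2f}")
```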

  9. Statistical forensic methodology for oil spill source identification using two-tailed student's t approach. Volume 1

    International Nuclear Information System (INIS)

    Yang, C.; Wang, Z.; Hollebone, B.; Brown, C.E.; Landriault, M.

    2007-01-01

    A thorough chemical characterization of oil must be conducted following an oil spill in order to determine the source of the oil, to distinguish the spilled oil from background hydrocarbons, and to quantitatively evaluate the extent of the impact of the spill. Gas chromatography with flame ionization detection and mass spectrometry was used in conjunction with statistical data analysis to determine the source of a spill that occurred in 2004 in a harbor in the Netherlands. Three oil samples were collected from the harbor spill, where a thick layer of oil was found between a bunker boat and the quay next to the bunker centre. The three samples were sent to different laboratories for a round robin test to defensibly correlate the spilled oil to the suspected source candidates. The source characterization and identification were validated by quantitative evaluation of five petroleum-characteristic alkylated PAH homologous series (naphthalene, phenanthrene, dibenzothiophene, fluorene and chrysene), pentacyclic biomarkers, bicyclic sesquiterpanes and diamondoid compounds. The use of biomarkers for identifying the source of spilled oils has increased in recent years due to their specificity and high resistance to biodegradation. There was no strong difference among the three oil samples according to radar plots of diagnostic ratios of PAHs, isoprenoids, biomarkers, bicyclic sesquiterpanes and diamondoids. The two-tailed unpaired Student's t-tests provided strong evidence for which ship was responsible for the oil spill incident. However, it was cautioned that although the two-tailed unpaired Student's t-tests along with oil fingerprinting successfully identified the spill source, the method has limitations. Experimental results showed that the spilled oil and the two source candidates were quite similar in both chemical fingerprints and concentration profiles for the determined target hydrocarbons. 17 refs., 4 tabs., 3 figs
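
    The statistical step can be illustrated as follows (replicate values invented): replicate measurements of one diagnostic ratio from the spill sample and a candidate source are compared with a two-tailed unpaired Student's t-test.

```python
import numpy as np
from scipy import stats

# Compare replicate measurements of one diagnostic ratio (e.g. an
# alkylated-PAH ratio) between the spill sample and a candidate source
# oil. The replicate values below are invented for illustration.
spill_ratio = np.array([1.02, 0.98, 1.05, 1.01, 0.99])
source_ratio = np.array([1.00, 1.03, 0.97, 1.04, 1.00])

t, p = stats.ttest_ind(spill_ratio, source_ratio)   # assumes equal variances
print(f"t = {t:.2f}, p = {p:.2f}")
if p > 0.05:
    print("ratio difference not significant -> consistent with a match")
else:
    print("ratios differ significantly -> candidate disfavored")
```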

  10. Factors and competitiveness analysis in rare earth mining, new methodology: case study from Brazil.

    Science.gov (United States)

    Silva, Gustavo A; Petter, Carlos O; Albuquerque, Nelson R

    2018-03-01

    Rare earths are increasingly being applied in high-tech industries, such as green energy (e.g. wind power), hybrid cars, electric cars, permanent high-performance magnets, superconductors, luminophores and many other industrial sectors involved in modern technologies. Given that China dominates this market and imposes restrictions on production and exports whenever opportunities arise, it is becoming more and more challenging to develop business ventures in this sector. Several initiatives were taken to prospect new resources and develop the production chain, including the mining of these mineral assets around the world, but some factors of uncertainty, including current low prices, increased the challenge of transforming current resources into deposits or productive mines. Thus, analyzing the competitiveness of advanced projects becomes indispensable. This work introduces a new methodology of competitiveness analysis in which some variables are considered as main factors that can strongly contribute to making a rare earth element (REE) mining enterprise unfeasible. With this methodology, which is quite practical and reproducible, it was possible to verify some real facts: for example, that the Lynas Mount Weld CLD (AUS) Project is resilient to the uncertainties of the RE sector, while the Molycorp Project is facing major financial difficulties (under judicial reorganization). It was also possible to verify that the Araxá Project of CBMM in Brazil is one of the most competitive in this country. Thus, we contribute to the existing literature by providing a new methodology for competitiveness analysis in rare earth mining.

  11. Factors and competitiveness analysis in rare earth mining, new methodology: case study from Brazil

    Directory of Open Access Journals (Sweden)

    Gustavo A. Silva

    2018-03-01

    Rare earths are increasingly being applied in high-tech industries, such as green energy (e.g. wind power), hybrid cars, electric cars, permanent high-performance magnets, superconductors, luminophores and many other industrial sectors involved in modern technologies. Given that China dominates this market and imposes restrictions on production and exports whenever opportunities arise, it is becoming more and more challenging to develop business ventures in this sector. Several initiatives were taken to prospect new resources and develop the production chain, including the mining of these mineral assets around the world, but some factors of uncertainty, including current low prices, increased the challenge of transforming current resources into deposits or productive mines. Thus, analyzing the competitiveness of advanced projects becomes indispensable. This work introduces a new methodology of competitiveness analysis in which some variables are considered as main factors that can strongly contribute to making a rare earth element (REE) mining enterprise unfeasible. With this methodology, which is quite practical and reproducible, it was possible to verify some real facts: for example, that the Lynas Mount Weld CLD (AUS) Project is resilient to the uncertainties of the RE sector, while the Molycorp Project is facing major financial difficulties (under judicial reorganization). It was also possible to verify that the Araxá Project of CBMM in Brazil is one of the most competitive in this country. Thus, we contribute to the existing literature by providing a new methodology for competitiveness analysis in rare earth mining. Keywords: Earth sciences, Business, Economics, Industry

  12. Gait analysis methodology for the measurement of biomechanical parameters in total knee arthroplasties. A literature review.

    Science.gov (United States)

    Papagiannis, Georgios I; Triantafyllou, Athanasios I; Roumpelakis, Ilias M; Papagelopoulos, Panayiotis J; Babis, George C

    2018-03-01

    Gait analysis using external skin markers provides scope for the study of the kinematic and kinetic parameters shown by different total knee arthroplasties (TKAs). An appropriate methodology is therefore of great importance for the collection and correlation of valid data. Equipment must be calibrated before measurements to assure accuracy: force plates should be calibrated at 1080 Hz, and optoelectronic cameras should use a 120 Hz sampling frequency, because of the nature of gait activity. The Davis model, which accurately defines the positions of the markers, is widely accepted and cited for the gait analysis of TKAs. To ensure the reproducibility of the measurement, a static trial in the anatomical position must be captured. All acquisitions of dynamic data must then be checked for consistency in walking speed and for abnormal gait style caused by fatigue or distraction. To establish the repeatability of the measurement, this procedure must be repeated over a pre-defined number of 3-5 gait cycles. Anthropometric measurements should be combined with three-dimensional marker data from the static trial to provide the positions of the joint centers and to define the anatomical axes of the total knee arthroplasty. Kinetic data should be normalized to body weight (BW) or to a percentage of BW and height, depending on the study. External moments should be calculated using inverse dynamics and amplitude-normalized to body mass (Nm/kg). In summary, a standard gait analysis methodology is necessary when measuring TKA biomechanical parameters in order to collect accurate, adequate, valid and reproducible data. Further research should clarify whether the development of a specific kinematic model is appropriate for a more accurate definition of the total knee implant joint center in measurements concerning 3D gait analysis.

  13. A Closed-Loop Optimal Neural-Network Controller to Optimize Rotorcraft Aeromechanical Behaviour. Volume 1; Theory and Methodology

    Science.gov (United States)

    Leyland, Jane Anne

    2001-01-01

    Given the predicted growth in air transportation, the potential exists for significant market niches for rotary wing subsonic vehicles. Technological advances which optimise rotorcraft aeromechanical behaviour can contribute significantly to their commercial and military development, acceptance, and sales. Examples of the optimisation of rotorcraft aeromechanical behaviour which are of interest include the minimisation of vibration and/or loads. The reduction of rotorcraft vibration and loads is an important means to extend the useful life of the vehicle and to improve its ride quality. Although vibration reduction can be accomplished by using passive dampers and/or tuned masses, active closed-loop control has the potential to reduce vibration and loads throughout a wider flight regime whilst adding less weight to the aircraft than that required by passive methods. It is emphasised that the analysis described herein is applicable to all those rotorcraft aeromechanical behaviour optimisation problems for which the relationship between the harmonic control vector and the measurement vector can be adequately described by a neural-network model.

  14. Methodology for the analysis of self-tensioned wooden structural floors

    Directory of Open Access Journals (Sweden)

    F. Suárez-Riestra

    2017-09-01

    A self-tensioning system is described, constituted by a force-multiplying device which, attached to the supports at the ends of the structural element, is able to convert the vertical resultant of the gravitational actions into an effective tensioning action through the movement induced by a set of rods. The self-tensioning system is able to offer high performance thanks to the beneficial effect of the opposite deflection generated by the tensioning, in proportion to the increase of the gravitational action. This allows long-span timber ribbed floors to be designed with reduced depths. The complexity of calculation due to the non-linearity of the system can be obviated with the methodology of analysis developed in this article. In order to illustrate the advantages of the self-tensioning system and the methodology of analysis developed here, six cases of ribbed floors are analysed, with spans of 9, 12 and 15 m and variable live loads of 3.00 kN/m2 and 5.00 kN/m2.

  15. Investigating DMOs through the Lens of Social Network Analysis: Theoretical Gaps, Methodological Challenges and Practitioner Perspectives

    Directory of Open Access Journals (Sweden)

    Dean HRISTOV

    2015-06-01

    The extant literature on networks in tourism management research has traditionally acknowledged destinations as the primary unit of analysis. This paper takes an alternative perspective and positions Destination Management Organisations (DMOs) at the forefront of today's tourism management research agenda. Whilst providing a relatively structured approach to generating enquiry, network research, vis-à-vis Social Network Analysis (SNA), in DMOs is often surrounded by serious impediments. Embedded in the network literature, this conceptual article aims to provide a practitioner perspective on addressing the obstacles to undertaking network studies in DMO organisations. A simple, three-step methodological framework for investigating DMOs as inter-organisational networks of member organisations is proposed in response to the complexities of network research. The rationale behind introducing such a framework lies in the opportunity to trigger discussions and encourage further academic contributions embedded in both theory and practice. Academic and practitioner contributions are likely to yield insights into the importance of network methodologies applied to DMO organisations.

  16. Adding value in oil and gas by applying decision analysis methodologies: case history

    Energy Technology Data Exchange (ETDEWEB)

    Marot, Nicolas [Petro Andina Resources Inc., Alberta (Canada); Francese, Gaston [Tandem Decision Solutions, Buenos Aires (Argentina)

    2008-07-01

    Petro Andina Resources Ltd., together with Tandem Decision Solutions, developed a strategic long-range plan applying decision analysis methodology. The objective was to build a robust and fully integrated strategic plan that accomplishes company growth goals and sets the strategic directions for the long range. The stochastic methodology and the Integrated Decision Management (IDM(TM)) staged approach allowed the company to visualize the value and risk associated with the different strategies while achieving organizational alignment, clarity of action and confidence in the path forward. A decision team jointly involving PAR representatives and Tandem consultants was established to carry out this four-month project. Discovery and framing sessions allowed the team to disrupt the status quo, discuss near- and far-reaching ideas, and gather the building blocks from which creative strategic alternatives were developed. A comprehensive stochastic valuation model was developed to assess the potential value of each strategy, applying simulation tools, sensitivity analysis tools and contingency planning techniques. Final insights and results were used to populate the final strategic plan presented to the company board, providing confidence to the team and assuring that the work embodies the best available ideas, data and expertise, and that the proposed strategy was ready to be elaborated into an optimized course of action. (author)

  17. ALARA cost/benefit analysis at Union Electric company using the ARP/AI methodology

    International Nuclear Information System (INIS)

    Williams, M.C.

    1987-01-01

    This paper describes the development of a specific method for the justification of expenditures associated with reducing occupational radiation exposure to as low as reasonably achievable (ALARA). The methodology is based on the concepts of the Apparent Reduction Potential (ARP) and Achievability Index (AI) as described in NUREG/CR-0446, Union Electric's corporate planning model, and the EPRI model for dose rate buildup over reactor operating life. The ARP provides a screening test to determine whether there is a need for ALARA expenditures, based on actual or predicted exposure rates and/or dose experience. The AI is a means of assessing all costs and all benefits, even though they are expressed in different units of measurement such as person-rem and dollars, to determine whether ALARA expenditures are justified and what their value is. This method of cost/benefit analysis can be applied by any company or organization utilizing site-specific exposure and dose rate data, incorporating consideration of administrative exposure controls, which may vary from organization to organization. Specific example cases are presented and compared to other methodologies for ALARA cost/benefit analysis
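
    A toy cost/benefit screening calculation in the ALARA spirit follows; the dollars-per-person-rem value, project costs and dose savings are all invented, and actual programs use their own monetary valuation of dose.

```python
# ALARA cost/benefit screening sketch. A dose-reduction project is
# justified when the monetized value of the averted collective dose
# exceeds its net cost. All figures are illustrative only.

DOLLARS_PER_PERSON_REM = 2000.0        # assumed monetary value of dose

def alara_net_benefit(cost_usd, dose_averted_person_rem_per_yr, years):
    averted = dose_averted_person_rem_per_yr * years
    benefit = averted * DOLLARS_PER_PERSON_REM
    return benefit - cost_usd, averted

projects = {
    "added shielding":  (150_000.0, 12.0, 10),   # cost, person-rem/yr, life
    "remote tooling":   (600_000.0, 25.0, 10),
    "procedure change": (20_000.0,   1.5, 10),
}

for name, args in projects.items():
    net, averted = alara_net_benefit(*args)
    verdict = "justified" if net > 0 else "not justified"
    print(f"{name:18s} averts {averted:5.0f} person-rem, net ${net:>10,.0f} -> {verdict}")
```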

  18. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map, with its unidirectional causality feature. Despite its apparent popularity, criticisms of this causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using an econometric analysis, the Granger causality test, on the 45 data points. However, an insufficiency of well-established causality models was found, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for use in situations of insufficient historical data. The Delphi method was selected and conducted to obtain consensus on the existence of causality among the 15 selected experts, using 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. The existence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality, and its robustness and validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, an area where very limited work has been done.
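
    The Granger step can be sketched with statsmodels on synthetic series (these are not the study's 45 data points): lags of x are tested for whether they improve the prediction of y.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Synthetic pair of series where x drives y with a 2-period lag.
rng = np.random.default_rng(8)
n = 200
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 2] + 0.3 * rng.normal()

# statsmodels expects a 2-D array [y, x]: tests "x Granger-causes y".
data = np.column_stack([y, x])
res = grangercausalitytests(data, maxlag=3, verbose=False)
for lag in (1, 2, 3):
    p = res[lag][0]["ssr_ftest"][1]
    print(f"lag {lag}: p = {p:.4f}")
```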

  19. Core melt progression and consequence analysis methodology development in support of the Savannah River Reactor PSA

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Sharp, D.A.; Amos, C.N.; Wagner, K.C.; Bradley, D.R.

    1992-01-01

    A three-level Probabilistic Safety Assessment (PSA) of production reactor operation has been underway since 1985 at the US Department of Energy's Savannah River Site (SRS). The goals of this analysis are to: analyze the existing margins of safety provided by the heavy-water reactor (HWR) design when challenged by postulated severe accidents; compare measures of risk to the general public and onsite workers with guideline values, as well as with those posed by commercial reactor operation; and develop the methodology and database necessary to prioritize improvements to engineered safety systems and components, operator training, and engineering projects that contribute significantly to improving plant safety. PSA technical staff from the Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) have performed the assessment despite two obstacles: a variable baseline plant configuration and power level, and a lack of technically applicable code methodology to model the SRS reactor conditions. This paper discusses the detailed effort necessary to modify the requisite codes before accident analysis insights for the risk assessment were obtained

  20. Geometrical considerations in dose volume analysis in intracavitary treatment

    International Nuclear Information System (INIS)

    Deshpande, D.D.; Shrivastava, S.K.; Pradhan, A.S.; Viswanathan, P.S.; Dinshaw, K.A.

    1996-01-01

    The present work aimed to study the relationship between the volume enclosed by the reference isodose surface and various geometrical parameters of the intracavitary applicator in the treatment of carcinoma of the cervix. The pear-shaped volume of the reference isodose, derived from the Total Reference Air Kerma (TRAK), and the product of its dimensions (height H, width W and thickness T), which depends on the applicator geometry, were estimated for 100 intracavitary applications treated on a Selectron LDR machine. Orthogonal radiographs taken for each patient were used to measure the actual geometric dimensions of the applicator and to carry out the dosimetry on a TP-11 treatment planning system. The dimensions H, W and T of the reference isodose surface (60 Gy) were also noted. The ratio of the product HWT to the pear-shaped volume was found to be mainly a function of colpostat separation and not of other geometrical parameters such as the maximum vertical and antero-posterior dimensions of the applicator. The ratio remained almost constant for a particular combination of uterine tandem and colpostats, and variations in the ratio were attributed to non-standard geometry. The ratio of the volume of the reference isodose surface to the product of its dimensions thus depends upon the colpostat separation. (orig./MG)

  1. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and the analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is relatively heavy and complicated. More importantly, the existing method assesses dependence from an optimistic perspective, which may lead to an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on an improvement of the existing method, is proposed, which is expected to be easier to apply and more effective.
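
    A minimal sketch of the evidence-fusion step that underlies such Dempster-Shafer-based assessments: Dempster's rule of combination for two mass functions. The frame of dependence levels and the mass values are hypothetical illustrations, not taken from the paper.

        # Minimal sketch of Dempster's rule of combination, the fusion step
        # of D-S evidence theory. Focal elements are frozensets over a
        # hypothetical frame of dependence levels.
        from itertools import product

        def dempster_combine(m1, m2):
            """Combine two mass functions whose focal elements are frozensets."""
            combined, conflict = {}, 0.0
            for (a, wa), (b, wb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb
            if conflict >= 1.0:
                raise ValueError("total conflict: sources cannot be combined")
            return {s: w / (1.0 - conflict) for s, w in combined.items()}

        LOW, MED, HIGH = "low", "moderate", "high"   # hypothetical dependence levels
        m1 = {frozenset({LOW}): 0.6, frozenset({LOW, MED}): 0.4}
        m2 = {frozenset({LOW}): 0.3, frozenset({MED, HIGH}): 0.7}
        print(dempster_combine(m1, m2))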

  2. Patient's radioprotection and analysis of DPC practices and certification of health facilities - Methodological guide

    International Nuclear Information System (INIS)

    Bataillon, Remy; Lafont, Marielle; Rousse, Carole; Vuillez, Jean-Philippe; Ducou Le Pointe, Hubert; Grenier, Nicolas; Lartigau, Eric; Orcel, Philippe; Dujarric, Francis; Beaupin, Alain; Bar, Olivier; Blondet, Emmanuelle; Combe, Valerie; Pages, Frederique

    2012-11-01

    This methodological guide has been published in compliance with French and European regulatory texts to define the modalities of implementation of the assessment of clinical practices resulting in exposure to ionizing radiation in the medical environment (radiotherapy, radio-surgery, interventional radiology, nuclear medicine), to promote clinical audits, and to ease the implementation of programs of continuous professional development in radiotherapy, radiology and nuclear medicine. The guide proposes an analysis of professional practices through analysis sheets which address several aspects: scope, practice data, objectives in terms of improvement of radiation protection, regulatory and institutional references, operational objectives, methods, approaches and tools, follow-up indicators, actions to improve practices, professional target, collective approach, program organisation, and program valorisation within existing arrangements. It also gives 20 program proposals which notably aim at continuous professional development: 5 of them deal with diagnosis-oriented imaging examinations, 9 with radiology and risk management, 4 with radiotherapy, and 2 with nuclear medicine.

  3. Methodology for the analysis of sustainable development of Ukraine using fuzzy logic theory

    Directory of Open Access Journals (Sweden)

    2016-02-01

    Full Text Available The objective of the article is to analyse the theoretical and methodological aspects of assessing sustainable development in times of crisis. A methodical approach to the analysis of the sustainable development of a territory, taking into account the assessment of its level of economic security, is proposed. The need for a complex methodical approach that accounts for indeterminacy and multiple criteria in the tasks of ensuring economic security, on the basis of fuzzy logic theory (the fuzzy sets theory), is demonstrated. The results of applying the fuzzy sets method to the dynamics of sustainable development in Ukraine during the years 2002-2012 are presented.
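
    As a hedged illustration of the fuzzy-set step, the sketch below maps a normalized economic-security indicator onto linguistic levels via triangular membership functions; the breakpoints and the indicator value are hypothetical, chosen only for illustration.

        # Minimal sketch: triangular fuzzy membership for a composite indicator.
        def triangular(x, a, b, c):
            """Membership in a triangular fuzzy number (a, b, c);
            a == b or b == c give left/right shoulders."""
            if x < a or x > c:
                return 0.0
            if x == b:
                return 1.0
            if x < b:
                return (x - a) / (b - a)
            return (c - x) / (c - b)

        levels = {            # hypothetical linguistic terms on a 0..1 scale
            "critical":     (0.0, 0.0, 0.35),
            "satisfactory": (0.2, 0.5, 0.8),
            "sustainable":  (0.65, 1.0, 1.0),
        }

        indicator = 0.42      # e.g. a normalized composite index for one year
        memberships = {name: triangular(indicator, *abc) for name, abc in levels.items()}
        print(memberships)    # degree of membership in each level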

  4. [Promising directions for the development of risk analysis methodology in Russia].

    Science.gov (United States)

    Avaliani, S L; Bezpal'ko, L E; Bobkova, I E; Mishina, A L

    2013-01-01

    The article considers promising directions for the development of risk analysis methodology in Russia, taking into account recent world achievements in this area and the requirements for harmonization of the system for control of environment quality. The main open questions of risk analysis as applied to the regulation of environmental protection activity were shown to relate both to the insufficiency of legislative, normative and executive support for this area of administrative activity, and to the need to resolve methodical questions concerning new approaches to the justification and use of reference levels of chemicals, including the development of modern approaches to the specification of carcinogenic and non-carcinogenic risks, cumulative ones among them.

  5. A SAS2H/KENO-V Methodology for 3D Full Core depletion analysis

    International Nuclear Information System (INIS)

    Milosevic, M.; Greenspan, E.; Vujic, J.; Petrovic, B.

    2003-04-01

    This paper describes the use of a SAS2H/KENO-V methodology for 3D full core depletion analysis and illustrates its capabilities by applying it to burnup analysis of the IRIS core benchmarks. This new SAS2H/KENO-V sequence combines a 3D Monte Carlo full core calculation of node power distribution and a 1D Wigner-Seitz equivalent cell transport method for independent depletion calculation of each of the nodes. This approach reduces by more than an order of magnitude the time required for getting comparable results using the MOCUP code system. The SAS2H/KENO-V results for the asymmetric IRIS core benchmark are in good agreement with the results of the ALPHA/PHOENIX/ANC code system. (author)

  6. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans.

    Science.gov (United States)

    Bernaldo de Quirós, Yara; González-Díaz, Oscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D; Mazzariol, Sandro; Di Guardo, Giovanni; Fernández, Antonio

    2011-01-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen.

  7. [Evaluation on methodological problems in reports concerning quantitative analysis of syndrome differentiation of diabetes mellitus].

    Science.gov (United States)

    Chen, Bi-Cang; Wu, Qiu-Ying; Xiang, Cheng-Bin; Zhou, Yi; Guo, Ling-Xiang; Zhao, Neng-Jiang; Yang, Shu-Yu

    2006-01-01

    To evaluate the quality of reports published in the last 10 years in China on the quantitative analysis of syndrome differentiation for diabetes mellitus (DM), in order to explore the methodological problems in these reports and find possible solutions. The main medical literature databases in China were searched. Thirty-one articles were included and evaluated by the principles of clinical epidemiology. There were many mistakes and deficiencies in these articles, concerning clinical trial design, diagnostic criteria for DM, standards of syndrome differentiation of DM, case inclusion and exclusion criteria, sample size estimation, data comparability and statistical methods. It is necessary and important to improve the quality of reports on the quantitative analysis of syndrome differentiation of DM in light of the principles of clinical epidemiology.

  8. Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks

    Science.gov (United States)

    Brown, Richard Lee

    2008-01-01

    Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce costs, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology

  9. Extending the input–output energy balance methodology in agriculture through cluster analysis

    International Nuclear Information System (INIS)

    Bojacá, Carlos Ricardo; Casilimas, Héctor Albeiro; Gil, Rodrigo; Schrevens, Eddie

    2012-01-01

    The input–output balance methodology has been applied to characterize the energy balance of agricultural systems. This study proposes to extend this methodology with the inclusion of multivariate analysis to reveal particular patterns in the energy use of a system. The objective was to demonstrate the usefulness of multivariate exploratory techniques to analyze the variability found in a farming system and establish efficiency categories that can be used to improve the energy balance of the system. For this purpose, an input–output analysis was applied to the major greenhouse tomato production area in Colombia. Individual energy profiles were built and the k-means clustering method was applied to the production factors. On average, the production system in the study zone consumes 141.8 GJ ha⁻¹ to produce 96.4 GJ ha⁻¹, resulting in an energy efficiency of 0.68. With the k-means clustering analysis, three clusters of farmers were identified with energy efficiencies of 0.54, 0.67 and 0.78. The most energy-efficient cluster grouped 56.3% of the farmers. It is possible to optimize the production system by improving the management practices of those with the lowest energy use efficiencies. Multivariate analysis techniques proved to be a complementary pathway to improve the energy efficiency of a system. -- Highlights: ► An input–output energy balance was estimated for greenhouse tomatoes in Colombia. ► We used the k-means clustering method to classify growers based on their energy use. ► Three clusters of growers were found with energy efficiencies of 0.54, 0.67 and 0.78. ► Overall system optimization is possible by improving the energy use of the less efficient.
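
    A minimal sketch of the clustering step: k-means on per-farm energy-input profiles, then a summary per cluster. The feature matrix is a hypothetical stand-in for the growers' actual input-output energy profiles.

        # Minimal sketch: k-means on synthetic per-farm energy-input profiles.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        # rows: farms; columns: energy inputs (e.g. fertilizer, fuel, electricity), GJ/ha
        profiles = rng.gamma(shape=2.0, scale=20.0, size=(60, 4))

        km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(profiles)
        for k in range(3):
            members = profiles[km.labels_ == k]
            print(f"cluster {k}: {len(members)} farms, "
                  f"mean total input = {members.sum(axis=1).mean():.1f} GJ/ha")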

  10. The Spirit of OMERACT: Q Methodology Analysis of Conference Characteristics Valued by Delegates.

    Science.gov (United States)

    Flurey, Caroline A; Kirwan, John R; Hadridge, Phillip; Richards, Pamela; Grosskleg, Shawna; Tugwell, Peter S

    2015-10-01

    To identify the major features of OMERACT meetings as valued by frequent participants and to explore whether there are groups of participants with different opinions. Using Q methodology (a qualitative and quantitative approach to grouping people according to subjective opinion), participants (who attended more than 1 OMERACT conference) sorted 66 statements relating to the "spirit of OMERACT" according to level of agreement across a normal distribution grid. Data were examined using Q factor analysis. Of 226 potential participants, 105 responded (46%). All participants highly ranked the focus on global standardization of methods, outcome measures, data-driven research, methodological discussion, and international collaboration. Four factors describing the "spirit of OMERACT" were identified: "Evidence not eminence" (n = 31) valued the data- and evidence-driven research above personality and status; "Collaboration and collegiality" (n = 19) valued the international and cross-stakeholder collaboration, interaction, and collegiality; "Equal voices, equal votes, common goals" (n = 12) valued equality in discussion and voting, with everyone striving toward the same goal; "Principles and product, not process" (n = 8) valued the principles of focusing on outcome measures and the product of guiding clinical trials, but were unsure whether the process is necessary to reach this. The factors did not segregate different stakeholder groups. Delegates value different elements of OMERACT, and thus the "spirit of OMERACT" encompasses evidence-based research, collaboration, and equality, although a small group are unsure whether the process is necessary to achieve the end result. Q methodology may prove useful for conference organizers to identify their delegates' different needs to tailor conference content.

  11. A Methodology for the Analysis and Selection of Alternative for the Disposition of Surplus Plutonium

    International Nuclear Information System (INIS)

    1999-01-01

    The Department of Energy (DOE) - Office of Fissile Materials Disposition (OFMD) has announced a Record of Decision (ROD) selecting alternatives for disposition of surplus plutonium. A major objective of this decision was to further U.S. efforts to prevent the proliferation of nuclear weapons. Other concerns that were addressed include economic, technical, institutional, schedule, environmental, and health and safety issues. The technical, environmental, and nonproliferation analyses supporting the ROD are documented in three DOE reports (DOE-TSR 96, DOE-PEIS 96, and DOE-NN 97, respectively). At the request of OFMD, a team of analysts from the Amarillo National Resource Center for Plutonium (ANRCP) provided an independent evaluation of the alternatives for plutonium that were considered during the evaluation effort. This report outlines the methodology used by the ANRCP team. This methodology, referred to as multiattribute utility theory (MAU), provides a structure for assembling results of detailed technical, economic, schedule, environment, and nonproliferation analyses for OFMD, DOE policy makers, other stakeholders, and the general public in a systematic way. The MAU methodology has been supported for use in similar situations by the National Research Council, an agency of the National Academy of Sciences [1]. It is important to emphasize that the MAU process does not lead to a computerized model that actually determines the decision for a complex problem. MAU is a management tool that is one component, albeit a key component, of a decision process. We subscribe to the philosophy that the result of using models should be insights, not numbers. The MAU approach consists of four steps: (1) identification of alternatives, objectives, and performance measures, (2) estimation of the performance of the alternatives with respect to the objectives, (3) development of value functions and weights for the objectives, and (4) evaluation of the alternatives and sensitivity
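
    To make the four-step MAU structure concrete, here is a minimal sketch of the weighted-additive evaluation at its core; the alternatives, single-attribute utilities and weights are hypothetical placeholders, not values from the ANRCP assessment.

        # Minimal sketch of a weighted-additive MAU evaluation (step 4 above).
        # All names and numbers are hypothetical illustrations.
        weights = {   # objective weights, summing to 1
            "nonproliferation": 0.35, "cost": 0.25, "schedule": 0.20, "environment": 0.20,
        }
        # single-attribute utilities on a 0..1 scale (1 = best)
        alternatives = {
            "immobilization": {"nonproliferation": 0.8, "cost": 0.7,
                               "schedule": 0.6, "environment": 0.7},
            "MOX fuel":       {"nonproliferation": 0.7, "cost": 0.5,
                               "schedule": 0.5, "environment": 0.6},
        }

        for name, utils in alternatives.items():
            score = sum(weights[obj] * u for obj, u in utils.items())
            print(f"{name}: overall utility = {score:.3f}")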

  12. A Methodology for the Analysis and Selection of Alternative for the Disposition of Surplus Plutonium

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-08-31

    The Department of Energy (DOE) - Office of Fissile Materials Disposition (OFMD) has announced a Record of Decision (ROD) selecting alternatives for disposition of surplus plutonium. A major objective of this decision was to further U.S. efforts to prevent the proliferation of nuclear weapons. Other concerns that were addressed include economic, technical, institutional, schedule, environmental, and health and safety issues. The technical, environmental, and nonproliferation analyses supporting the ROD are documented in three DOE reports [DOE-TSR 96, DOE-PEIS 96, and DOE-NN 97, respectively]. At the request of OFMD, a team of analysts from the Amarillo National Resource Center for Plutonium (ANRCP) provided an independent evaluation of the alternatives for plutonium that were considered during the evaluation effort. This report outlines the methodology used by the ANRCP team. This methodology, referred to as multiattribute utility theory (MAU), provides a structure for assembling results of detailed technical, economic, schedule, environment, and nonproliferation analyses for OFMD, DOE policy makers, other stakeholders, and the general public in a systematic way. The MAU methodology has been supported for use in similar situations by the National Research Council, an agency of the National Academy of Sciences [1]. It is important to emphasize that the MAU process does not lead to a computerized model that actually determines the decision for a complex problem. MAU is a management tool that is one component, albeit a key component, of a decision process. We subscribe to the philosophy that the result of using models should be insights, not numbers. The MAU approach consists of four steps: (1) identification of alternatives, objectives, and performance measures, (2) estimation of the performance of the alternatives with respect to the objectives, (3) development of value functions and weights for the objectives, and (4) evaluation of the alternatives and sensitivity

  13. National Waste Repository Novi Han operational safety analysis report. Safety assessment methodology

    International Nuclear Information System (INIS)

    2003-01-01

    The scope of the safety assessment (SA) presented includes: waste management functions (acceptance, conditioning, storage, disposal), inventory (current and expected in the future), hazards (radiological and non-radiological), and normal and accident modes. The stages in the development of the SA are: criteria selection, information collection, safety analysis, and safety assessment documentation. After reviewing the facility's functions and the national and international requirements, the criteria for assessing the safety level are set. As a result of the second stage, the actual parameters of the facility necessary for the safety analysis are obtained. The methodology is selected on the basis of the comparability of the results with those of previous safety assessments and with existing standards and requirements. The procedure and requirements for scenario selection are described, and a radiological hazard categorization of the facilities is presented. A qualitative hazards and operability analysis is applied. The resulting list of events is subjected to a prioritization procedure using the method of 'criticality analysis', so that a risk estimate is given for each event. Events whose risk falls on the boundary of acceptability, or is unacceptable, are subjected to the next steps of the analysis. As a result, lists of scenarios for PSA and possible design scenarios are established. PSA logical modelling and quantitative calculations of accident sequences are presented.

  14. Southern forest inventory and analysis volume equation user’s guide

    Science.gov (United States)

    Christopher M. Oswalt; Roger C. Conner

    2011-01-01

    Reliable volume estimation procedures are fundamental to the mission of the Forest Inventory and Analysis (FIA) program. Moreover, public access to FIA program procedures is imperative. Here we present the volume estimation procedures used by the southern FIA program of the U.S. Department of Agriculture Forest Service Southern Research Station. The guide presented...

  15. Analysis of some nuclear waste management options. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Berman, L.E.; Ensminger, D.A.; Giuffre, M.S.; Koplik, C.M.; Oston, S.G.; Pollak, G.D.; Ross, B.I.

    1978-10-10

    This report describes risk analyses performed on that portion of a nuclear fuel cycle which begins following solidification of high-level waste. Risks associated with handling, interim storage and transportation of the waste are assessed, as well as the long term implications of disposal in deep mined cavities. The risk is expressed in terms of expected dose to the general population and peak dose to individuals in the population. This volume consists of appendices which provide technical details of the work performed.

  16. Analysis of airborne radiometric data. Volume 3. Topical reports

    Energy Technology Data Exchange (ETDEWEB)

    Reed, J.H.; Shreve, D.C.; Sperling, M.; Woolson, W.A.

    1978-05-01

    This volume consists of four topical reports: a general discussion of the philosophy of unfolding spectra with continuum and discrete components, a mathematical treatment of the effects of various physical parameters on the uncollided gamma-ray spectrum at aircraft elevations, a discussion of the application of the unfolding code MAZNAI to airborne data, and a discussion of the effects of the nonlinear relationship between energy deposited and pulse height in NaI(Tl) detectors.

  17. Predicted stand volume for Eucalyptus plantations by spatial analysis

    Science.gov (United States)

    Latifah, Siti; Teodoro, RV; Myrna, GC; Nathaniel, CB; Leonardo, M. F.

    2018-03-01

    The main objective of the present study was to assess nonlinear models, generated by integrating the stand volume growth rate, for estimating the growth and yield of Eucalyptus. Primary data were collected for points of interest (POI) of permanent sample plots (PSPs) and inventory sample plots in the Aek Nauli sector, Simalungun regency, North Sumatera Province, Indonesia, from December 2008 to March 2009. The demand for forestry information has continued to grow over recent years: because many forest managers and decision makers face complex decisions, reliable information has become a necessity, and geospatial technology has been widely used in the assessment of natural resources, including plantation forests. The yield of the Eucalyptus plantations, represented by merchantable volume, was the dependent variable, while the factors affecting yield, namely stand variables and geographic variables, were the independent variables. The majority of the study site has a stand volume class of 0-50 m3/ha, covering 16.59 ha or 65.85% of the total area.
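
    The abstract does not specify the nonlinear model form, so as a hedged illustration the sketch below fits a Chapman-Richards growth curve, a common choice for stand volume over age, to synthetic data with scipy; the parameters and measurements are hypothetical.

        # Minimal sketch: fit a Chapman-Richards yield curve to synthetic data.
        import numpy as np
        from scipy.optimize import curve_fit

        def chapman_richards(age, A, k, m):
            """V = A * (1 - exp(-k * age)) ** m, volume over stand age."""
            return A * (1.0 - np.exp(-k * age)) ** m

        rng = np.random.default_rng(2)
        age = np.linspace(2, 10, 30)                       # stand age, years (synthetic)
        volume = chapman_richards(age, 180, 0.35, 2.0) + rng.normal(0, 5, age.size)

        params, _ = curve_fit(chapman_richards, age, volume, p0=(150, 0.3, 2.0))
        print("fitted A, k, m:", np.round(params, 3))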

  18. Analysis of maternal and child health policies in Malawi: The methodological perspective.

    Science.gov (United States)

    Daire, J; Khalil, D

    2015-12-01

    The question of why most health policies do not achieve their intended results continues to receive considerable attention in the literature. This is in the light of the recognized gap between policy as intent and policy as practice, which calls for substantial research work to understand the factors that improve policy implementation. Although there is substantial work explaining the reasons why policies achieve or fail to achieve their intended outcomes, there are limited case studies that illustrate how to analyze policies from the methodological perspective. In this article, we report and discuss how a mixed qualitative research method was applied for analyzing maternal and child health policies in Malawi. For the purposes of this article, we do not report research findings; instead we focus our discussion on the methodology of the study and draw lessons for policy analysis research work. We base our discussion on our experiences from a study in which we analyzed maternal and child health policies in Malawi over the period from 1964 to 2008. Noting the multifaceted nature of maternal and child health policies, we adopted a mixed qualitative research method, whereby a number of data collection methods were employed. This approach allowed for the capturing of different perspectives of maternal and child health policies in Malawi and for strengthening of the weaknesses of each method, especially in terms of data validity. This research suggested that the multidimensional nature of maternal and child health policies, like other health policies, calls for a combination of research designs as well as a variety of methods of data collection and analysis. In addition, we suggest that, as an emerging research field, health policy analysis will benefit more from case study designs because they provide rich experiences in the actual policy context.

  19. In Their Own Words? Methodological Considerations in the Analysis of Terrorist Autobiographies

    Directory of Open Access Journals (Sweden)

    Mary Beth Altier

    2012-01-01

    Full Text Available Despite the growth of terrorism literature in the aftermath of the 9/11 attacks, there remain several methodological challenges to studying certain aspects of terrorism. This is perhaps most evident in attempts to uncover the attitudes, motivations, and intentions of individuals engaged in violent extremism and how they are sometimes expressed in problematic behavior. Such challenges invariably stem from the fact that terrorists and the organizations to which they belong represent clandestine populations engaged in illegal activity. Unsurprisingly, these qualities make it difficult for the researcher to identify and locate willing subjects of study—let alone a representative sample. In this research note, we suggest the systematic analysis of terrorist autobiographies offers a promising means of investigating difficult-to-study areas of terrorism-related phenomena. Investigation of autobiographical accounts not only offers additional data points for the study of individual psychological issues, but also provides valuable perspectives on the internal structures, processes, and dynamics of terrorist organizations more broadly. Moreover, given most autobiographies cover critical events and personal experiences across the life course, they provide a unique lens into how terrorists perceive their world and insight into their decision-making processes. We support our advocacy of this approach by highlighting its methodological strengths and shortcomings.

  20. The U-tube sampling methodology and real-time analysis of geofluids

    International Nuclear Information System (INIS)

    Freifeld, Barry; Perkins, Ernie; Underschultz, James; Boreham, Chris

    2009-01-01

    The U-tube geochemical sampling methodology, an extension of the porous cup technique proposed by Wood (1973), provides minimally contaminated aliquots of multiphase fluids from deep reservoirs and allows for accurate determination of dissolved gas composition. The initial deployment of the U-tube during the Frio Brine Pilot CO2 storage experiment, Liberty County, Texas, obtained representative samples of brine and supercritical CO2 from a depth of 1.5 km. A quadrupole mass spectrometer provided real-time analysis of dissolved gas composition. Since the initial demonstration, the U-tube has been deployed for (1) sampling of fluids down gradient of the proposed Yucca Mountain High-Level Waste Repository, Amargosa Valley, Nevada, (2) acquiring fluid samples beneath permafrost in Nunavut Territory, Canada, and (3) sampling at a CO2 storage demonstration project within a depleted gas reservoir, Otway Basin, Victoria, Australia. The addition of in-line high-pressure pH and EC sensors allows for continuous monitoring of the fluid during sample collection. Difficulties have arisen during U-tube sampling, such as blockage of sample lines by naturally occurring waxes or by freezing conditions; however, workarounds such as solvent flushing or heating have been used to address these problems. The U-tube methodology has proven to be robust and, with careful consideration of the constraints and limitations, can provide high-quality geochemical samples.

  1. A methodology for selection of wind energy system locations using multicriterial analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sansevic, M.; Rabadan, Lj. Pilic [Croatia Univ., Faculty of Electrical Engineering, Mechanical Engineering and Naval Architecture, Split (Croatia)

    1996-12-31

    The effectiveness of a wind turbine generator depends not only on its performance but also on the site's wind resource. Thus the problem of location selection should be approached systematically, by considering a set of relevant parameters, particularly those having a significant economic and ecological impact. This paper presents the methodology used in selecting locations for the operation of wind energy systems. It is based on a multicriterial analysis which enables comparison and ranking of locations according to a set of different parameters. The principal objectives (criteria) in location selection are: energy-economic, technical-technological, physical planning, and environment and life protection objectives. For the mathematical modelling of this multicriterial problem, the PROMETHEE method is chosen, which was developed especially for the solution of rather "poorly" structured problems, thus justifying its application in the preliminary stage of site selection for wind energy systems. The developed methodology is applied in selecting locations on the island of Rhodes, using the available database of the Geographic Information System and the wind potential data obtained by means of the AIOLOS program. (Author)
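
    A minimal sketch of the ranking core of PROMETHEE (here the PROMETHEE II net-flow variant) with a linear preference function; the candidate sites, criteria, weights and preference thresholds are hypothetical illustrations, not the Rhodes data.

        # Minimal sketch: PROMETHEE II net outranking flows for 3 sites.
        import numpy as np

        scores = np.array([      # rows: candidate sites; columns: criteria (higher = better)
            [7.2, 0.60, 0.8],    # e.g. mean wind speed, grid proximity, land suitability
            [6.5, 0.90, 0.6],
            [8.1, 0.40, 0.7],
        ])
        weights = np.array([0.5, 0.3, 0.2])   # hypothetical criterion weights
        p = np.array([2.0, 0.5, 0.4])         # preference thresholds per criterion

        n = scores.shape[0]
        phi = np.zeros(n)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                d = scores[i] - scores[j]
                pref = np.clip(d / p, 0.0, 1.0)    # linear preference function
                pi_ij = float(weights @ pref)      # aggregated preference of i over j
                phi[i] += pi_ij / (n - 1)          # leaving-flow contribution
                phi[j] -= pi_ij / (n - 1)          # entering-flow contribution
        print("net flows:", np.round(phi, 3))      # higher = better-ranked site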

  2. METHODOLOGICAL ASPECTS OF CONTENT ANALYSIS OF CONVERGENCE BETWEEN UKRAINIAN GAAP AND INTERNATIONAL FINANCIAL REPORTING STANDARDS

    Directory of Open Access Journals (Sweden)

    R. Kuzina

    2015-06-01

    Full Text Available The objective conditions of Ukraine's integration into the global business environment require a strengthening of accounting and financial reporting. At the stage of attracting investment into the country, there is a need to prepare financial statements according to generally accepted basic principles, which are based on common International Financial Reporting Standards (IFRS). An assessment of the convergence of national standards with International Financial Reporting Standards is therefore relevant. However, before conducting a content analysis it is necessary to determine methodological approaches to the selection of key indicators for the assessment of convergence. The purpose of the article is to define methodological approaches to the selection and development of a list of key elements of IFRS for the further evaluation of the convergence of national and international standards. To assess convergence, 187 basic key elements measuring the level of convergence to IFRS were selected. Sampling was carried out on the basis of the author's professional judgment of the key indicators of each standard, grounded in the evaluation of the usefulness of accounting information. These indicators make it possible to calculate the specific level of convergence of international and national standards and to determine to what extent statements prepared under domestic standards correspond to IFRS. In other words, can one assert with some certainty that Ukraine has achieved "good practices in IFRS implementation" or not? This calculation will allow an assessment of the regulatory efforts of government agencies (the Ministry of Finance) on the approximation of Ukrainian standards to IFRS.

  3. Systematic Analysis of an IEED Unit Based in a New Methodology for M&S

    Directory of Open Access Journals (Sweden)

    Luis Adrian Zuñiga Aviles

    2010-12-01

    Full Text Available In the field of modeling and simulation of mechatronic designs of high complexity, a systematic analysis of an IEDD unit [1] (improvised explosive device disposal) is presented, based on a new methodology for modeling and simulation divided into 6 stages in order to increase the accuracy of validation of the whole system. This mechatronic unit is a non-holonomic unmanned wheeled mobile manipulator (MU-NH-WMM), formed by a differential traction base and a manipulator arm with 4 degrees of freedom mounted on the wheeled mobile base. The contribution of this work is a novel methodology based on a practical proposal of the philosophy of mechatronic design, which establishes the kinematics suitable for a coupled wheeled mobile manipulator, where the motion equations and kinematic transformations are the basis of the specific stages for obtaining the dynamics of the coupled system. The behavior and trajectory tracking are validated in order to achieve the complex tasks of approaching the work area and appropriately handling the explosive device; this work is focused on the first of these tasks, such that errors in the model can be detected and later confined by the proposed control.
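
    As a hedged illustration of the differential-traction kinematics such a model builds on, the sketch below integrates the standard non-holonomic unicycle equations of a differential-drive base; the wheel radius, track width and wheel-speed inputs are hypothetical, not the unit's parameters.

        # Minimal sketch: forward kinematics of a differential-drive base.
        import math

        def step(x, y, theta, w_r, w_l, r=0.1, L=0.4, dt=0.01):
            """One Euler step. w_r, w_l: wheel angular speeds [rad/s];
            r: wheel radius [m]; L: track width [m]."""
            v = r * (w_r + w_l) / 2.0      # linear velocity of the base
            omega = r * (w_r - w_l) / L    # angular velocity of the base
            return (x + v * math.cos(theta) * dt,
                    y + v * math.sin(theta) * dt,
                    theta + omega * dt)

        x = y = theta = 0.0
        for _ in range(500):               # 5 s of a gentle left turn
            x, y, theta = step(x, y, theta, w_r=6.0, w_l=5.0)
        print(f"pose after 5 s: x={x:.2f} m, y={y:.2f} m, "
              f"heading={math.degrees(theta):.1f} deg")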

  4. Systematic Analysis of an IEED Unit Based in a New Methodology for M&S

    Directory of Open Access Journals (Sweden)

    Jesus Carlos Pedraza Ortega

    2011-01-01

    Full Text Available In the field of modeling and simulation of mechatronic designs of high complexity, a systematic analysis of an IEDD unit [1] (improvised explosive device disposal) is presented, based on a new methodology for modeling and simulation divided into 6 stages in order to increase the accuracy of validation of the whole system. This mechatronic unit is a non-holonomic unmanned wheeled mobile manipulator (MU-NH-WMM), formed by a differential traction base and a manipulator arm with 4 degrees of freedom mounted on the wheeled mobile base. The contribution of this work is a novel methodology based on a practical proposal of the philosophy of mechatronic design, which establishes the kinematics suitable for a coupled wheeled mobile manipulator, where the motion equations and kinematic transformations are the basis of the specific stages for obtaining the dynamics of the coupled system. The behavior and trajectory tracking are validated in order to achieve the complex tasks of approaching the work area and appropriately handling the explosive device; this work is focused on the first of these tasks, such that errors in the model can be detected and later confined by the proposed control.

  5. A methodology for analysis of impacts of grid integration of renewable energy

    Energy Technology Data Exchange (ETDEWEB)

    George, Mel [Department of Energy Science and Engineering, Indian Institute of Technology, Bombay, Powai, Mumbai 400 076 (India); Banerjee, Rangan, E-mail: rangan@iitb.ac.i [Department of Energy Science and Engineering, Indian Institute of Technology, Bombay, Powai, Mumbai 400 076 (India)

    2011-03-15

    Present electricity grids are predominantly thermal (coal, gas) and hydro based. Conventional power planning involves hydro-thermal scheduling and merit order dispatch. In the future, modern renewables (hydro, solar and biomass) are likely to have a significant share in the power sector. This paper presents a method to analyse the impacts of renewables in the electricity grid. A load duration curve based approach has been developed. Renewable energy sources have been treated as negative loads to obtain a modified load duration curve, from which capacity savings in terms of base and peak load generation can be computed. The methodology is illustrated for solar, wind and biomass power for Tamil Nadu (a state in India). The trade-offs and interactions between renewable sources are analysed. The impacts on capacity savings of varying the wind regime have also been shown. Scenarios for 2021-22 have been constructed to illustrate the proposed methodology. This technique can be useful for power planners for an analysis of renewables in future electricity grids. - Research highlights: ► A new method to analyse impacts of renewables in the electricity grid. ► Effects of wind, solar PV and biomass power on the load duration curve and capacity savings are shown. ► Illustration of intermittent renewables and their interplay for sites in India and the UK. ► Future scenarios constructed for generation expansion planning with higher levels of renewables.
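
    A minimal sketch of the negative-load construction described above: subtract hourly renewable output from hourly demand and re-sort in descending order to obtain the modified load duration curve. All series are synthetic placeholders, not the Tamil Nadu data.

        # Minimal sketch: renewables as negative load on a load duration curve.
        import numpy as np

        rng = np.random.default_rng(3)
        hours = np.arange(8760)
        demand = 6000 + 1500 * np.sin(2 * np.pi * hours / 24) \
                 + rng.normal(0, 200, 8760)                          # MW, synthetic
        wind = np.clip(rng.normal(800, 400, 8760), 0, 1600)          # MW, synthetic

        ldc = np.sort(demand)[::-1]                  # original load duration curve
        mod_ldc = np.sort(demand - wind)[::-1]       # renewables treated as negative load

        base_saving = ldc[-1] - mod_ldc[-1]          # reduction in minimum (base) load
        peak_saving = ldc[0] - mod_ldc[0]            # reduction in maximum (peak) load
        print(f"base-load capacity saving ~ {base_saving:.0f} MW, "
              f"peak-load saving ~ {peak_saving:.0f} MW")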

  6. A novel registration-based methodology for prediction of trabecular bone fabric from clinical QCT: A comprehensive analysis.

    Science.gov (United States)

    Chandran, Vimal; Reyes, Mauricio; Zysset, Philippe

    2017-01-01

    Osteoporosis leads to hip fractures in aging populations and is diagnosed by modern medical imaging techniques such as quantitative computed tomography (QCT). Hip fracture sites involve trabecular bone, whose strength is determined by volume fraction and orientation, known as fabric. However, bone fabric cannot be reliably assessed in clinical QCT images of proximal femur. Accordingly, we propose a novel registration-based estimation of bone fabric designed to preserve tensor properties of bone fabric and to map bone fabric by a global and local decomposition of the gradient of a non-rigid image registration transformation. Furthermore, no comprehensive analysis on the critical components of this methodology has been previously conducted. Hence, the aim of this work was to identify the best registration-based strategy to assign bone fabric to the QCT image of a patient's proximal femur. The normalized correlation coefficient and curvature-based regularization were used for image-based registration and the Frobenius norm of the stretch tensor of the local gradient was selected to quantify the distance among the proximal femora in the population. Based on this distance, closest, farthest and mean femora with a distinction of sex were chosen as alternative atlases to evaluate their influence on bone fabric prediction. Second, we analyzed different tensor mapping schemes for bone fabric prediction: identity, rotation-only, rotation and stretch tensor. Third, we investigated the use of a population average fabric atlas. A leave one out (LOO) evaluation study was performed with a dual QCT and HR-pQCT database of 36 pairs of human femora. The quality of the fabric prediction was assessed with three metrics, the tensor norm (TN) error, the degree of anisotropy (DA) error and the angular deviation of the principal tensor direction (PTD). The closest femur atlas (CTP) with a full rotation (CR) for fabric mapping delivered the best results with a TN error of 7.3 ± 0.9%, a DA
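
    As a hedged illustration of the rotation-only mapping scheme mentioned above, the sketch below extracts the rotation from a local deformation gradient by polar decomposition and rotates a fabric tensor with it; the gradient and tensor values are hypothetical, not the study's registration output.

        # Minimal sketch: rotation-only fabric mapping via polar decomposition.
        import numpy as np
        from scipy.linalg import polar

        F = np.array([[1.05, 0.10, 0.00],    # hypothetical local gradient of the
                      [-0.08, 0.98, 0.05],   # registration transform
                      [0.02, -0.03, 1.02]])
        R, U = polar(F)                      # F = R @ U: R rotation, U stretch

        M_atlas = np.diag([1.4, 1.0, 0.6])   # hypothetical fabric tensor (principal frame)
        M_mapped = R @ M_atlas @ R.T         # rotation preserves the eigenvalues (DA, TN)
        print(np.round(M_mapped, 3))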

  7. Vehicle technologies heavy vehicle program : FY 2008 benefits analysis, methodology and results --- final report.

    Energy Technology Data Exchange (ETDEWEB)

    Singh, M.; Energy Systems; TA Engineering

    2008-02-29

    This report describes the approach to estimating the benefits and analysis results for the Heavy Vehicle Technologies activities of the Vehicle Technologies (VT) Program of EERE. The scope of the effort includes: (1) Characterizing baseline and advanced technology vehicles for Class 3-6 and Class 7 and 8 trucks, (2) Identifying technology goals associated with the DOE EERE programs, (3) Estimating the market potential of technologies that improve fuel efficiency and/or use alternative fuels, and (4) Determining the petroleum and greenhouse gas emissions reductions associated with the advanced technologies. In FY 08 the Heavy Vehicles program continued its involvement with various sources of energy loss as compared to focusing more narrowly on engine efficiency and alternative fuels. These changes are the result of a planning effort that first occurred during FY 04 and was updated in the past year. (Ref. 1) This narrative describes characteristics of the heavy truck market as they relate to the analysis, a description of the analysis methodology (including a discussion of the models used to estimate market potential and benefits), and a presentation of the benefits estimated as a result of the adoption of the advanced technologies. The market penetrations are used as part of the EERE-wide integrated analysis to provide final benefit estimates reported in the FY08 Budget Request. The energy savings models are utilized by the VT program for internal project management purposes.

  8. Evaluation of methodologies for assessing the overall diet: dietary quality scores and dietary pattern analysis.

    Science.gov (United States)

    Ocké, Marga C

    2013-05-01

    This paper aims to describe different approaches for studying the overall diet with advantages and limitations. Studies of the overall diet have emerged because the relationship between dietary intake and health is very complex with all kinds of interactions. These cannot be captured well by studying single dietary components. Three main approaches to study the overall diet can be distinguished. The first method is researcher-defined scores or indices of diet quality. These are usually based on guidelines for a healthy diet or on diets known to be healthy. The second approach, using principal component or cluster analysis, is driven by the underlying dietary data. In principal component analysis, scales are derived based on the underlying relationships between food groups, whereas in cluster analysis, subgroups of the population are created with people that cluster together based on their dietary intake. A third approach includes methods that are driven by a combination of biological pathways and the underlying dietary data. Reduced rank regression defines linear combinations of food intakes that maximally explain nutrient intakes or intermediate markers of disease. Decision tree analysis identifies subgroups of a population whose members share dietary characteristics that influence (intermediate markers of) disease. It is concluded that all approaches have advantages and limitations and essentially answer different questions. The third approach is still more in an exploration phase, but seems to have great potential with complementary value. More insight into the utility of conducting studies on the overall diet can be gained if more attention is given to methodological issues.
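
    A minimal sketch of the second, data-driven approach: principal component analysis on standardized food-group intakes, where each component is read as a dietary pattern. The intake matrix is synthetic, standing in for real consumption data.

        # Minimal sketch: PCA-derived dietary patterns from synthetic intakes.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(4)
        intakes = rng.gamma(2.0, 50.0, size=(500, 6))   # persons x food groups, g/day

        z = StandardScaler().fit_transform(intakes)     # standardize each food group
        pca = PCA(n_components=2).fit(z)
        print("explained variance:", np.round(pca.explained_variance_ratio_, 3))
        print("pattern loadings:")
        print(np.round(pca.components_, 2))             # one row per dietary pattern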

  9. An integrated probabilistic risk analysis decision support methodology for systems with multiple state variables

    International Nuclear Information System (INIS)

    Sen, P.; Tan, John K.G.; Spencer, David

    1999-01-01

    Probabilistic risk analysis (PRA) methods have been proven to be valuable in risk and reliability analysis. However, a weak link seems to exist between methods for analysing risks and those for making rational decisions. The integrated decision support system (IDSS) methodology presented in this paper attempts to address this issue in a practical manner. It consists of three phases: a PRA phase, a risk sensitivity analysis (SA) phase and an optimisation phase, which are implemented through an integrated computer software system. In the risk analysis phase the problem is analysed by the Boolean representation method (BRM), a PRA method that can deal with systems with multiple state variables and feedback loops. In the second phase the results obtained from the BRM are used directly to perform importance and risk SA. In the third phase, the problem is formulated as a multiple objective decision making problem in the form of multiple objective reliability optimisation. An industrial example is included. The resulting solutions of a five-objective reliability optimisation are presented, on the basis of which rational decision making can be explored.

  10. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets.

    Science.gov (United States)

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-11-01

    With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and exchange between independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and the software tool for analysis of large data sets.
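
    A minimal sketch of the filtering chain described above: choose a dose threshold from an ROC curve (here via the Youden index, one common choice) and screen with Fisher exact, Welch and Kolmogorov-Smirnov tests. The dose-outcome data are synthetic placeholders, not the SBRT lung sets.

        # Minimal sketch: ROC-derived threshold plus the screening tests.
        import numpy as np
        from scipy.stats import fisher_exact, ttest_ind, ks_2samp
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(5)
        dose = rng.uniform(0, 60, 300)      # Gy per patient, synthetic
        event = (rng.uniform(size=300)
                 < 1 / (1 + np.exp(-(dose - 30) / 5))).astype(int)  # synthetic outcome

        fpr, tpr, thr = roc_curve(event, dose)
        threshold = thr[np.argmax(tpr - fpr)]          # Youden-index threshold

        above, below = event[dose >= threshold], event[dose < threshold]
        table = [[above.sum(), len(above) - above.sum()],
                 [below.sum(), len(below) - below.sum()]]
        _, fisher_p = fisher_exact(table)              # contingency-table screen
        welch_p = ttest_ind(dose[event == 1], dose[event == 0], equal_var=False).pvalue
        ks_p = ks_2samp(dose[event == 1], dose[event == 0]).pvalue
        print(f"threshold ~ {float(threshold):.1f} Gy; "
              f"Fisher p={fisher_p:.3g}, Welch p={welch_p:.3g}, KS p={ks_p:.3g}")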

  11. A MODEL OF ANALYSIS IN ANALYTICAL METHODOLOGY FOR BIOPHARMACEUTICAL QUALITY CONTROL.

    Science.gov (United States)

    Andrade, Cleyton; de la O Herrera, Miguel; Lemes, Elezer

    2018-02-14

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA (rcDNA). To determine the small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable and accurate. In principle, three techniques have the ability to measure rcDNA: the radioactive dot-blot, a type of hybridization; Threshold; and quantitative Polymerase Chain Reaction (qPCR). Quality Risk Management (QRM) is a systematic process for the evaluation, control and reporting of risks which may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by QRM, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the Hazard Analysis and Critical Control Points (HACCP) tool. HACCP makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we concluded that the radioactive dot-blot assay has the largest number of critical control points (CCP), followed by qPCR and Threshold. From the analysis of hazards (i.e., points of method failure) and the associated CCP of the method procedure, we concluded that the analytical methodology with the lowest risk of performance failure for rcDNA testing is qPCR. Copyright © 2018, Parenteral Drug Association.

  12. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate the various uses of electricity demand in Greece (agricultural, commercial, domestic and industrial use, as well as use by public and municipal authorities and for street lighting) and we examine their relation with variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of prefecture. We both visualize the results of the analysis and perform cluster and outlier analysis using the Anselin local Moran's I statistic, as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight into, and better understanding of, the regional development model in Greece, and justifies the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
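
    As a hedged illustration of the cluster/outlier statistic, here is a simplified local Moran's I on z-scored values with a row-standardized weights matrix; the demand vector and neighbourhood structure are hypothetical, and a production analysis would use a GIS package rather than this bare form.

        # Minimal sketch: simplified local Moran's I for 5 hypothetical prefectures.
        import numpy as np

        demand = np.array([12.0, 15.0, 9.0, 30.0, 28.0])   # e.g. GWh per prefecture
        W = np.array([                                     # 1 = neighbouring prefectures
            [0, 1, 1, 0, 0],
            [1, 0, 1, 1, 0],
            [1, 1, 0, 0, 1],
            [0, 1, 0, 0, 1],
            [0, 0, 1, 1, 0],
        ], dtype=float)
        W /= W.sum(axis=1, keepdims=True)                  # row-standardize weights

        z = (demand - demand.mean()) / demand.std()
        lag = W @ z                                        # average of neighbours' z-scores
        local_I = z * lag                                  # > 0: local cluster; < 0: outlier
        print(np.round(local_I, 2))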

  13. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Science.gov (United States)

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  14. Development of trip coverage analysis methodology - CATHENA trip coverage analysis model

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H.; Huh, J. Y.; Na, Y. H.; Lee, S. Y.; Kim, B. G.; Kim, H. H.; Kim, S. W.; Bae, C. J.; Kim, T. M.; Kim, S. R.; Han, B. S.; Moon, B. J.; Oh, M. T. [Korea Power Engineering Co., Yongin (Korea)

    2001-05-01

    This report describes the CATHENA model for trip coverage analysis. The model is prepared based on the Wolsong 2 design data and consists of the primary heat transport system, shutdown system, steam and feedwater system, reactor regulating system, heat transport pressure and inventory control system, and steam generator level and pressure control system. The new features and the parts modified from the Wolsong 2 CATHENA LOCA model that are required for trip coverage analysis are described. The model is tested by simulation of steady state at 100% FP and at several low powers. The cases of power rundown and power runup are also tested. 17 refs., 124 figs., 19 tabs. (Author)

  15. IAEA methodology of the ITDB information analysis from nuclear security perspective

    International Nuclear Information System (INIS)

    2010-01-01

    The IAEA methodology for analysis of the Illicit Trafficking Database (ITDB) examines general and specific risks, trends and patterns. The methodology assists in the identification of security needs that are specific to a material, activity, location or country, or even to a region. Finally, the methodology also analyses the lessons learned.

  16. Wind power projects in the CDM: Methodologies and tools for baselines, carbon financing and substainability analysis

    DEFF Research Database (Denmark)

    Ringius, L.; Grohnheit, Poul Erik; Nielsen, Lars Henrik

    2002-01-01

    , carbon financing, and environmental sustainability. It does not deal in detail with those issues that are routinely covered in a standard wind power project assessment. The report tests, compares, and recommends methodologies for and approaches to baseline development. To present the application... the highest and the lowest baselines; by disregarding this baseline option altogether the difference between the lowest and the highest is reduced to 16%. The ES3 model, which the Systems Analysis Department at Risø National Laboratory has developed, makes it possible for this report to explore the project-specific approach to baseline development in some detail. Based on quite disaggregated data on the Egyptian electricity system, including the wind power production profile of Zafarana, the emission rates estimated by runs with 1-hour time-steps of the simulation tool ES3 range from 0.590 tCO2/MWh to 0.610 tCO2/MWh...

  17. Methodology for adding and amending glycaemic index values to a nutrition analysis package.

    LENUS (Irish Health Repository)

    Levis, Sharon P

    2011-04-01

    Since its introduction in 1981, the glycaemic index (GI) has been a useful tool for classifying the glycaemic effects of carbohydrate foods. Consumption of a low-GI diet has been associated with a reduced risk of developing CVD, diabetes mellitus and certain cancers. WISP (Tinuviel Software, Llanfechell, Anglesey, UK) is a nutrition software package used for the analysis of food intake records and 24 h recalls. Within its database, WISP contains the GI values of foods based on the International Tables 2002. The aim of the present study is to describe in detail a methodology for adding and amending GI values to the WISP database in a clinical or research setting, using data from the updated International Tables 2008.
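
    Once per-food GI values are in the database, a package such as WISP can derive a meal or diet GI as a carbohydrate-weighted mean, the standard calculation in the GI literature. The sketch below shows that calculation with illustrative food values; it is not WISP's internal code.

        # Minimal sketch: meal GI as a weighted mean by available carbohydrate.
        foods = [
            # (name, GI, available carbohydrate in the portion, g) - illustrative only
            ("porridge", 55, 27.0),
            ("apple", 38, 15.0),
            ("milk", 31, 12.0),
        ]

        total_carb = sum(carb for _, _, carb in foods)
        meal_gi = sum(gi * carb for _, gi, carb in foods) / total_carb
        print(f"meal GI = {meal_gi:.1f}")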

  18. Numerical simulation methodologies for design and development of Diffuser-Augmented Wind Turbines - analysis and comparison

    Science.gov (United States)

    Michał, Lipian; Maciej, Karczewski; Jakub, Molinski; Krzysztof, Jozwik

    2016-01-01

    Different numerical computation methods used to develop a methodology for fast, efficient, reliable design and comparison of Diffuser-Augmented Wind Turbine (DAWT) geometries are presented. The demand for such methods is evident, given the multitude of geometrical parameters that influence the flow character through ducted turbines. The results of the Actuator Disk Model (ADM) simulations will be confronted with a simulation method of a higher order of accuracy, i.e. the 3D Fully-resolved Rotor Model (FRM), at the rotor design point. Both will be checked for consistency with the experimental results measured in the wind tunnel at the Institute of Turbomachinery (IMP), Lodz University of Technology (TUL). The attempt to find an efficient method (with a compromise between accuracy and design time) for the flow analysis pertinent to the DAWT is the novel approach presented in this paper.
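
    A minimal sketch of the momentum-theory relation on which an Actuator Disk Model rests: power and thrust coefficients as functions of the axial induction factor, peaking at the Betz limit for an open rotor. A diffuser shifts this optimum, which is the point of DAWT designs; the sketch covers only the open-rotor baseline.

        # Minimal sketch: open-rotor actuator disk momentum theory.
        import numpy as np

        a = np.linspace(0.0, 0.5, 51)          # axial induction factor
        cp = 4.0 * a * (1.0 - a) ** 2          # power coefficient
        ct = 4.0 * a * (1.0 - a)               # thrust coefficient

        i = cp.argmax()
        print(f"max Cp = {cp[i]:.3f} at a = {a[i]:.2f} "
              f"(Betz limit 16/27 ~ 0.593), Ct there = {ct[i]:.2f}")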

  19. Containment pressure analysis methodology during a LBLOCA with iteration between RELAP5 and COCOSYS

    International Nuclear Information System (INIS)

    Silva, Dayane Faria; Sabundjian, Gaianê; Souza, Ana Cecília Lima

    2017-01-01

    The pressure conditions inside the containment in the case of a Large Break Loss of Coolant Accident (LBLOCA) are most severe for a hot leg rupture, due to the large amount of mass and energy released from the break, which lies just after the pressure vessel. This work presents a methodology for pressure analysis within the containment of a Brazilian PWR, Angra 2, using an iterative process between the code that simulates the guillotine rupture - RELAP5 - and the COCOSYS code, which analyzes the containment pressure from the accident conditions. The results show that the iterative process between the codes allows the pressure data to converge to a more realistic solution. (author)

  20. New methodology developed for the differential scanning calorimetry analysis of polymeric matrixes incorporating phase change materials

    International Nuclear Information System (INIS)

    Barreneche, Camila; Solé, Aran; Miró, Laia; Martorell, Ingrid; Cabeza, Luisa F; Fernández, A Inés

    2012-01-01

    Nowadays, thermal comfort needs in buildings have led to an increase in the energy consumption of the residential and service sectors. For this reason, thermal energy storage is presented as an alternative for reducing this high consumption. Phase change materials (PCM) have been studied to store energy due to their high storage capacity. A polymeric material capable of macroencapsulating PCM was developed by the authors of this paper. However, difficulties were found while measuring the thermal properties of these materials by differential scanning calorimetry (DSC): the polymeric matrix interferes with the detection of the PCM properties. To remove this interfering effect, a new methodology was developed which replaces the conventional empty crucible used as a reference in the DSC analysis by a crucible containing the polymeric matrix. A clear signal from the PCM is thus obtained by subtracting the new full-crucible signal from the sample signal. (paper)
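
    A minimal numpy sketch of that subtraction, on synthetic heat-flow curves sampled on a common temperature grid; the grid, peak shape, heating rate and units are assumptions for illustration only.

      import numpy as np

      T = np.linspace(0.0, 60.0, 601)                    # temperature grid, deg C
      matrix_reference = 0.02 * T                        # interfering matrix signal (synthetic)
      pcm_peak = 1.5 * np.exp(-((T - 28.0) / 2.0) ** 2)  # PCM melting peak near 28 deg C
      sample = matrix_reference + pcm_peak               # what the DSC records for PCM + matrix
      pcm_signal = sample - matrix_reference             # matrix contribution cancels

      # Enthalpy estimate: integrate the isolated peak over temperature and
      # divide by the heating rate (assumed 10 K/min) to convert to time.
      heating_rate = 10.0 / 60.0                         # K/s
      enthalpy = np.trapz(pcm_signal, T) / heating_rate  # J/g if heat flow is in W/g
      print(enthalpy)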

  1. A new methodological approach to nonverbal behavior analysis in cultural perspective.

    Science.gov (United States)

    Agliati, Alessia; Vescovo, Antonietta; Anolli, Luigi

    2006-08-01

    The measurement of human behavior is a complex task, both for psychologists and human sciences researchers and with respect to technology, since advanced and sophisticated instruments may have to be implemented to manage the plurality of variables involved. In this article, an observational study is presented in which a quantitative procedure, the external variables method (Duncan & Fiske, 1977), was integrated with a structural analysis (Magnusson, 1993, 2000) in order to detect the hidden organization of nonverbal behavior in Italian and Icelandic interactions. To this aim, Theme software was introduced and employed. The results showed that both the frequency and the typology of gestures deeply change as a function of culture. Moreover, a high number of patterns was detected in both Italian and Icelandic interactions: They appeared to be complex sequences in which a huge number of events were constantly happening and recurring. In this domain, Theme software provides a methodological progression from the quantitative to the structural approach.

  2. Analysis of ICT Investments. Towards a Methodological Guide with Focus on Estimation of Intangible Benefits

    Directory of Open Access Journals (Sweden)

    LINDO, O. D.

    2017-06-01

    Investments in Information and Communication Technologies (ICTs) can provide firms with both tangible and intangible benefits (IBs). However, these investments are generally analyzed and evaluated by means of traditional methods, which focus on the measurement of the tangible to determine, mainly, the amount of profit that has been obtained. In this paper we develop the foundation of a methodological guide with a focus on the estimation of IBs. For this, we present evidence that suggests that an analysis of the impact of ICT investments focused on the estimation of IBs may be based on the value chain of a business unit; we also provide a list of factors of intangible value of ICTs identified in the literature reviewed, and propose stages to fulfill and variables to use so as to contribute to a comprehensive evaluation of said investments.

  3. Computer Assisted Data Analysis in the Dye Dilution Technique for Plasma Volume Measurement.

    Science.gov (United States)

    Bishop, Marvin; Robinson, Gerald D.

    1981-01-01

    Describes a method for undergraduate physiology students to measure plasma volume by the dye dilution technique, in which a computer is used to interpret data. Includes the computer program for the data analysis. (CS)
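
    A minimal sketch of the dye-dilution arithmetic such a program automates, assuming mono-exponential dye disappearance after mixing: a log-linear fit of the sampled concentrations is extrapolated back to the injection time, and the injected dose is divided by that extrapolated concentration. All numbers are illustrative.

      import numpy as np

      dose_mg = 25.0                                 # injected dye mass
      t_min = np.array([10.0, 20.0, 30.0, 40.0])     # sampling times after injection, min
      conc = np.array([8.9, 8.4, 7.9, 7.5])          # measured dye concentration, mg/L

      slope, intercept = np.polyfit(t_min, np.log(conc), 1)
      c0 = np.exp(intercept)                         # extrapolated concentration at t = 0
      plasma_volume_l = dose_mg / c0
      print(f"plasma volume ~ {plasma_volume_l:.2f} L")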

  4. Civic Improvement Program. Volume 2. Fallout Protection Factor Analysis Capability

    Science.gov (United States)

    1987-08-15

    ...be comprised of a 0.5 inch plasterboard layer on each side supported by 1.5-in x 3.5-in two-by-four studding on 16-inch centers. The volume fractions for these components would be 0.222 (1/4.5) for the plasterboard and 0.073 (1.5 x 3.5 / 4.5 x 16) for the wood. When density information is available... thick drywall plasterboard below. The foundation thickness is 10 inches of poured concrete. Figure 12 shows front and rear views of the baseline two...
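
    The quoted volume fractions can be checked directly from the stated wall geometry (total thickness 0.5 + 3.5 + 0.5 = 4.5 in); a short sketch of the arithmetic:

      # Volume fractions of a stud wall: 0.5-in plasterboard on each side of
      # 1.5-in x 3.5-in studs on 16-in centers (total thickness 4.5 in).
      wall_thickness = 0.5 + 3.5 + 0.5                        # in
      plasterboard_fraction = 1.0 / wall_thickness            # 1/4.5 ~ 0.222
      wood_fraction = (1.5 * 3.5) / (wall_thickness * 16.0)   # stud section per bay ~ 0.073
      print(plasterboard_fraction, wood_fraction)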

  5. A new methodology for assessing health policy and systems research and analysis capacity in African universities.

    Science.gov (United States)

    Lê, Gillian; Mirzoev, Tolib; Orgill, Marsha; Erasmus, Ermin; Lehmann, Uta; Okeyo, Stephen; Goudge, Jane; Maluka, Stephen; Uzochukwu, Benjamin; Aikins, Moses; de Savigny, Don; Tomson, Goran; Gilson, Lucy

    2014-10-08

    The importance of health policy and systems research and analysis (HPSR+A) has been increasingly recognised, but it is still unclear how most effectively to strengthen the capacity of the different organisations involved in this field. Universities are particularly crucial, but the expansive literature on capacity development has little to offer the unique needs of HPSR+A activity within universities, and often overlooks the pivotal contribution of capacity assessments to capacity strengthening. The Consortium for Health Policy and Systems Analysis in Africa 2011-2015 designed and implemented a new framework for capacity assessment for HPSR+A within universities. The methodology is reported in detail. Our reflections on developing and conducting the assessment generated four lessons for colleagues in the field. Notably, there are currently no published capacity assessment methodologies for HPSR+A that focus solely on universities; we report a first for the field to initiate the dialogue and exchange of experiences with others. Second, in HPSR+A, the unit of assessment can be a challenge, because HPSR+A groups within universities tend to overlap between academic departments and are embedded in different networks. Third, capacity assessment experience can itself be capacity strengthening, even taking into account that doing such assessments requires capacity. From our experience, we propose that future systematic assessments of HPSR+A capacity need to focus on both capacity assets and needs and assess capacity at the individual, organisational, and systems levels, whilst taking into account the networked nature of HPSR+A activity. A genuine partnership process between evaluators and those participating in an assessment can improve the quality of the assessment and the uptake of results in capacity strengthening.

  6. Typology of end-of-life priorities in Saudi females: averaging analysis and Q-methodology.

    Science.gov (United States)

    Hammami, Muhammad M; Hammami, Safa; Amer, Hala A; Khodr, Nesrine A

    2016-01-01

    Understanding culture- and sex-related end-of-life preferences is essential to provide quality end-of-life care. We have previously explored end-of-life choices in Saudi males and found important culture-related differences and that Q-methodology is useful in identifying intra-culture, opinion-based groups. Here, we explore Saudi females' end-of-life choices. A volunteer sample of 68 females rank-ordered 47 opinion statements on end-of-life issues into a nine-category symmetrical distribution. The ranking scores of the statements were analyzed by averaging analysis and Q-methodology. The mean age of the females in the sample was 30.3 years (range, 19-55 years). Among them, 51% reported average religiosity, 78% reported very good health, 79% reported very good life quality, and 100% reported high-school education or more. The extreme five overall priorities were to be able to say the statement of faith, be at peace with God, die without having the body exposed, maintain dignity, and resolve all conflicts. The extreme five overall dis-priorities were to die in the hospital, die well dressed, be informed about impending death by family/friends rather than by a doctor, die at the peak of life, and not know if one has a fatal illness. Q-methodology identified five opinion-based groups with qualitatively different characteristics: "physical and emotional privacy concerned, family caring" (younger, lower religiosity), "whole person" (higher religiosity), "pain and informational privacy concerned" (lower life quality), "decisional privacy concerned" (older, higher life quality), and "life quantity concerned, family dependent" (high life quality, low life satisfaction). Out of the extreme 14 priorities/dis-priorities for each group, 21%-50% were not represented among the extreme 20 priorities/dis-priorities for the entire sample. Consistent with the previously reported findings in Saudi males, transcendence and dying in the hospital were the extreme end-of-life priority and dis-priority, respectively.

  7. Methodology for systematic analysis and improvement of manufacturing unit process life-cycle inventory (UPLCI)—CO2PE! initiative (cooperative effort on process emissions in manufacturing). Part 1: Methodology description

    DEFF Research Database (Denmark)

    Kellens, Karel; Dewulf, Wim; Overcash, Michael

    2012-01-01

    This report proposes a life-cycle analysis (LCA)-oriented methodology for systematic inventory analysis of the use phase of manufacturing unit processes providing unit process datasets to be used in life-cycle inventory (LCI) databases and libraries. The methodology has been developed...

  8. Methodology of economic analysis of evidence of cartel in the resale market of fuels

    International Nuclear Information System (INIS)

    Costa, Cleber Ribeiro da Silva; Tiryaki, Gisele Ferreira; Ramos, Maria Olivia

    2010-01-01

    The existence of anti-competitive conduct such as cartels leads to a situation of high prices and profits, harming competition and society in general. The methodology of economic analysis of evidence of cartels used by the ANP in the fuel resale market involves analysis of the behavior of average resale and distribution prices, the nominal average gross resale margin, and the coefficient of variation of resale and distribution prices of fuel for a given period, by municipality. Combining these elements, the ANP suggests the investigation of possible cartels. This text aims to contribute to a better definition of the relevant market in the analysis of economic evidence of cartels in the fuel resale market and to add elements currently not considered in the ANP's analysis and in the regulation of the sector. To this end, the article is organized into three sections besides the introduction and final considerations. The first section deconstructs some myths about cartels in the retail fuel resale segment by analyzing the main causes leading to consumer complaints. The second presents a conceptual analysis of the relevant market, since this definition is essential to characterize anti-competitive practices carried out by companies holding market power, notably the formation of cartels. Finally, it discusses how the main bodies involved act in dismantling anti-competitive practices in the industry. The expected results favor greater integration between the agencies that safeguard competition and a better definition of the relevant market for the fuel resale segment. (author)

  10. Sampling and analytical methodologies for instrumental neutron activation analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    1992-01-01

    The IAEA supports a number of projects concerned with the analysis of airborne particulate matter by nuclear techniques. Most of this work involves the use of activation analysis in its various forms, particularly instrumental neutron activation analysis (INAA). This technique has been widely used in many different countries for the analysis of airborne particulate matter, and there are already many publications in scientific journals, books and reports describing such work. The present document represents an attempt to summarize the most important features of INAA as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of INAA to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability, although they are presented here in a way that takes account of the particular requirements arising from the use of INAA as the analytical technique. The analytical part of the document, however, is presented in a form that is applicable only to INAA. (Subsequent publications in this series are expected to deal specifically with other nuclear-related techniques such as energy-dispersive X-ray fluorescence (ED-XRF) and particle-induced X-ray emission (PIXE) analysis.) Although the methods and procedures described here have been found through experience to yield acceptable results, they should not be considered mandatory. Any other procedure used should, however, be chosen to be capable of yielding results at least of equal quality to those described.

  11. An advanced human reliability analysis methodology focused on the analysis of cognitive errors.

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2001-01-01

    Conventional Human Reliability Analysis (HRA) methods such as THERP/ASEP, HCR and SLIM have been criticised for their deficiency in analysing the cognitive errors that occur during the operator's decision-making process. In order to overcome the limitations of the conventional methods, an advanced HRA method, the so-called second-generation HRA method, covering both qualitative analysis and quantitative assessment of cognitive errors, is being developed based on the state-of-the-art theory of cognitive systems engineering and error psychology. The method was developed on the basis of a human decision-making model and the relation between cognitive functions and performance-influencing factors. The application of the proposed method to two emergency operation tasks is presented.

  12. Evaluating the effects of dam breach methodologies on Consequence Estimation through Sensitivity Analysis

    Science.gov (United States)

    Kalyanapu, A. J.; Thames, B. A.

    2013-12-01

    Dam breach modeling often includes the application of models that are sophisticated yet computationally intensive, to compute flood propagation at high temporal and spatial resolutions. This results in a significant need for computational capacity and drives the development of newer flood models using multi-processor and graphics processing techniques. Recently, a comprehensive benchmark exercise, the 12th Benchmark Workshop on Numerical Analysis of Dams, was organized by the International Commission on Large Dams (ICOLD) to evaluate the performance of the various tools used for dam break risk assessment. The ICOLD workshop is focused on estimating the consequences of failure of a hypothetical dam near a hypothetical populated area with complex demographics and economic activity. The current study uses this hypothetical case study and focuses on evaluating the effects of dam breach methodologies on consequence estimation and analysis. It uses the ICOLD hypothetical data, including the topography, dam geometry and construction information, and land use/land cover data, along with socio-economic and demographic data. The objective of this study is to evaluate the impacts of using four different dam breach methods on the consequence estimates used in the risk assessments. The four methodologies used are: i) Froehlich (1995), ii) MacDonald and Langridge-Monopolis 1984 (MLM), iii) Von Thun and Gillette 1990 (VTG), and iv) Froehlich (2008). To achieve this objective, three different modeling components were used. First, using HEC-RAS v.4.1, dam breach discharge hydrographs are developed. These hydrographs are then provided as flow inputs to a two-dimensional flood model named Flood2D-GPU, which leverages the computer's graphics card for much-improved computational performance. Lastly, outputs from Flood2D-GPU, including inundated areas, depth grids, velocity grids, and flood wave arrival time grids, are input into HEC-FIA, which provides the consequence estimates.
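
    For orientation, a sketch of one of the four breach-parameter methods named above, the Froehlich (1995) regressions in their commonly quoted SI form; the coefficients should be verified against the original paper before any real use, and the inputs below are illustrative.

      # Froehlich (1995) breach regressions (commonly quoted SI form):
      #   average breach width  B  = 0.1803 * Ko * Vw**0.32 * hb**0.19   [m]
      #   failure time          tf = 0.00254 * Vw**0.53 * hb**-0.90      [h]
      # Vw: reservoir volume above the breach invert (m^3); hb: breach
      # height (m); Ko = 1.4 for overtopping failures, 1.0 otherwise.
      def froehlich_1995(vw_m3, hb_m, overtopping=True):
          ko = 1.4 if overtopping else 1.0
          width_m = 0.1803 * ko * vw_m3**0.32 * hb_m**0.19
          time_h = 0.00254 * vw_m3**0.53 * hb_m**-0.90
          return width_m, time_h

      print(froehlich_1995(5.0e7, 30.0))   # ~140 m wide, ~1.4 h for this input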

  13. Methodology of comparative statistical analysis of Russian industry based on cluster analysis

    Directory of Open Access Journals (Sweden)

    Sergey S. Shishulin

    2017-01-01

    The article is devoted to the possibilities of applying multidimensional statistical analysis in the study of industrial production, on the basis of comparing its growth rates and structure with those of other developed and developing countries of the world. The purpose of this article is to determine the optimal set of statistical methods and to present the results of their application to industrial production data. The data include indicators such as output, gross value added, the number of employed, and other indicators of the system of national accounts and operational business statistics. The objects of observation are the industries of the countries of the Customs Union, the United States, Japan and Europe in 2005-2015. The research tools range from the simplest methods of transformation and graphical and tabular visualization of data to methods of multivariate statistical analysis. In particular, based on a specialized software package (SPSS), the principal components method, discriminant analysis, and hierarchical and non-hierarchical methods of cluster analysis (Ward's method and k-means) were applied. The application of the principal components method to the initial data makes it possible to substantially and effectively reduce the initial space of industrial production data. For example, in analyzing the structure of industrial production, the reduction was from fifteen industries to three basic, well-interpreted factors: relatively extractive industries (with a low degree of processing), high-tech industries, and consumer goods (medium-technology sectors). At the same time, a comparison of the results of applying cluster analysis to the initial data and to the data obtained with the principal components method established that clustering industrial production data on the basis of the new factors significantly improves the clustering results. As a result of analyzing the parameters of...
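
    A minimal sketch of the two-step procedure the article describes (principal components first, then clustering on the component scores), with a hypothetical countries-by-indicators matrix standing in for the national-accounts data:

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(42)
      X = rng.random((12, 15))          # 12 countries x 15 industry indicators (placeholder)

      # Reduce the indicator space to a few components, then cluster on them.
      scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(X))
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
      print(labels)                     # cluster assignment per country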

  14. GIS methodology for geothermal play fairway analysis: Example from the Snake River Plain volcanic province

    Science.gov (United States)

    DeAngelo, Jacob; Shervais, John W.; Glen, Jonathan; Nielson, Dennis L.; Garg, Sabodh; Dobson, Patrick; Gasperikova, Erika; Sonnenthal, Eric; Visser, Charles; Liberty, Lee M.; Siler, Drew; Evans, James P.; Santellanes, Sean

    2016-01-01

    Play fairway analysis in geothermal exploration derives from a systematic methodology originally developed within the petroleum industry and is based on a geologic and hydrologic framework of identified geothermal systems. We are tailoring this methodology to study the geothermal resource potential of the Snake River Plain and the surrounding region. This project has contributed to the success of this approach by cataloging the critical elements controlling exploitable hydrothermal systems, establishing risk matrices that evaluate these elements in terms of both probability of success and level of knowledge, and building automated tools to process the results. ArcGIS was used to compile a range of different data types, which we refer to as 'elements' (e.g., faults, vents, heat flow), with distinct characteristics and confidence values. Raw data for each element were transformed into data layers with a common format. Because different data types have different uncertainties, each evidence layer has an accompanying confidence layer, which reflects spatial variations in these uncertainties. Risk maps represent the product of evidence and confidence layers, and are the basic building blocks used to construct Common Risk Segment (CRS) maps for heat, permeability, and seal. CRS maps quantify the variable risk associated with each of these critical components. In a final step, the three CRS maps were combined into a Composite Common Risk Segment (CCRS) map that reveals favorable areas for geothermal exploration. Python scripts were developed to automate data processing and to enhance the flexibility of the data analysis. Python scripting provided the structure that makes a custom workflow possible. Nearly every tool available in the ArcGIS ArcToolbox can be executed using commands in the Python programming language. This enabled the construction of a group of tools that could automate most of the processing for the project. Currently, our tools are repeatable...
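
    A minimal numpy sketch of the layer algebra described above (risk = evidence x confidence; risk layers combined into CRS maps; the three CRS maps combined into the composite map). The abstract does not specify how layers are weighted or combined in the actual ArcGIS/Python workflow, so the averaging and product choices below are assumptions.

      import numpy as np

      rng = np.random.default_rng(1)
      shape = (100, 100)                      # placeholder grid

      def crs(evidence_layers, confidence_layers):
          """Average the evidence*confidence products into one CRS grid."""
          risk = [e * c for e, c in zip(evidence_layers, confidence_layers)]
          return np.mean(risk, axis=0)

      heat = crs([rng.random(shape)], [rng.random(shape)])    # e.g. heat flow
      perm = crs([rng.random(shape), rng.random(shape)],
                 [rng.random(shape), rng.random(shape)])      # e.g. faults, vents
      seal = crs([rng.random(shape)], [rng.random(shape)])    # e.g. cap-rock evidence

      ccrs = heat * perm * seal               # composite favorability map
      print(float(ccrs.max()))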

  15. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume II. Software description and listings

    International Nuclear Information System (INIS)

    Ruhter, W.D.

    1984-05-01

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241 Pu and 208-keV 237 U peaks are required for the spectral analysis, which gives plutonium isotopic ratios and weight-percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings.

  16. Descriptive analysis of bacon smoked with Brazilian woods from reforestation: methodological aspects, statistical analysis, and study of sensory characteristics.

    Science.gov (United States)

    Saldaña, Erick; Castillo, Luiz Saldarriaga; Sánchez, Jorge Cabrera; Siche, Raúl; de Almeida, Marcio Aurélio; Behrens, Jorge H; Selani, Miriam Mabel; Contreras-Castillo, Carmen J

    2018-02-21

    The aim of this study was to perform a descriptive analysis (DA) of bacons smoked with woods from reforestation and liquid smokes in order to investigate their sensory profile. Six samples of bacon were selected: three smoked bacons with different wood species (Eucalyptus citriodora, Acacia mearnsii, and Bambusa vulgaris), two artificially smoked bacon samples (liquid smoke) and one negative control (unsmoked bacon). Additionally, a commercial bacon sample was also evaluated. DA was developed successfully, presenting a good performance in terms of discrimination, consensus and repeatability. The study revealed that the smoking process modified the sensory profile by intensifying the "saltiness" and differentiating the unsmoked from the smoked samples. The results from the current research represent the first methodological development of descriptive analysis of bacon and may be used by food companies and other stakeholders to understand the changes in sensory characteristics of bacon due to the traditional smoking process.

  17. ANALYSIS OF THE EFFECTIVENESS AND EFFICIENCY OF MANAGEMENT SYSTEMS BASED ON SYSTEM ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Yurij Vasilkov

    2014-09-01

    In this paper we consider the problem of analyzing the effectiveness and efficiency of management systems, a problem that is especially relevant when implementing the requirements of ISO 9001, ISO 14001 and other standards at an enterprise. Studying a management system on the basis of a systems approach focuses on revealing its integrative (i.e., systemic) qualities and on identifying the variety of relationships and mechanisms that produce these qualities. It allows the causes of the real state of affairs to be identified, and successes and failures to be explained. An important aspect of a systems approach to the analysis of the effectiveness and efficiency of production management is the multiplicity of 'stakeholder' interests involved in the production process in the formation of operational goals and the ways to achieve them.

  18. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    Science.gov (United States)

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    The analysis identified eleven main characteristics of health interventions: (1) Procedural type, (2) Anatomical site, (3) Medical device, (4) Pathology, (5) Access, (6) Body system, (7) Population, (8) Aim, (9) Discipline, (10) Technique, and (11) Body function. These main characteristics were taken as the input classes for the formalization of the APC. We were also able to identify relevant relations between classes. The proposed four-step approach for formalizing the APC provides a novel, systematically developed, strong framework to semantically enrich procedure classifications. Although this methodology was designed to address the particularities of the APC, the included methods are based on generic analysis tasks and can therefore be re-used to provide a systematic representation of other procedure catalogs or classification systems, and hence contribute towards a universal alignment of such representations, if desired.

  19. Comparative analysis study for piping seismic design criteria. Part 1: Methodology and objectives

    International Nuclear Information System (INIS)

    Adams, T.M.; Branch, E.B.; Landers, D.F.; Tagart, S.W. Jr.

    1994-01-01

    As part of the First of a Kind Engineering effort for the Advanced Light Water Reactor, the Advanced Reactor Corporation formed a Piping Technical Core Group to develop a set of improved ASME Boiler and Pressure Vessel Code, Section III, design rules and approaches for ALWR plant piping and support design. In developing this proposed criteria, the Technical Core Group conducted an extensive comparative analysis study of many of the eighteen proposals for new or alternative seismic piping design criteria under consideration by the ASME Code Committee. As part of this comparative study, the Technical Core Group also devised a process for determining the relative conservatism of each design criterion studied. In addition, a method to relate these relative margins to EPRI/USNRC piping component test margins was developed, which allows an assessment of the minimum margin to failure that could be expected from each set of design rules and criteria studied. The study was conducted in two phases, and the methodology for each of these phases is presented. Phase 1 was done to develop the relative margins and relate these margins to the actual EPRI/USNRC test data. Phase 2 was done to study the optimized support configurations which could be created using each of the different criteria. Also presented in this paper is a discussion of the different criteria selected for study, the analysis methods/criteria applied, the piping models for the study, and the specific inputs for the comparative analyses.

  20. Nanoparticle analysis and characterization methodologies in environmental risk assessment of engineered nanoparticles.

    Science.gov (United States)

    Hassellöv, Martin; Readman, James W; Ranville, James F; Tiede, Karen

    2008-07-01

    Environmental risk assessments of engineered nanoparticles require thorough characterization of nanoparticles and their aggregates. Furthermore, quantitative analytical methods are required to determine environmental concentrations and enable both effect and exposure assessments. Many methods still need optimization and development, especially for new types of nanoparticles in water, but extensive experience can be gained from the fields of environmental chemistry of natural nanomaterials and from fundamental colloid chemistry. This review briefly describes most methods that are being exploited in nanoecotoxicology for analysis and characterization of nanomaterials. Methodological aspects are discussed in relation to the fields of nanometrology, particle size analysis and analytical chemistry. Differences in both the type of size measures (length, radius, aspect ratio, etc.), and the type of average or distributions afforded by the specific measures are compared. The strengths of single particle methods, such as electron microscopy and atomic force microscopy, with respect to imaging, shape determinations and application to particle process studies are discussed, together with their limitations in terms of counting statistics and sample preparation. Methods based on the measurement of particle populations are discussed in terms of their quantitative analyses, but the necessity of knowing their limitations in size range and concentration range is also considered. The advantage of combining complementary methods is highlighted.

  1. Methodological factors affecting joint moments estimation in clinical gait analysis: a systematic review.

    Science.gov (United States)

    Camomilla, Valentina; Cereatti, Andrea; Cutti, Andrea Giovanni; Fantozzi, Silvia; Stagni, Rita; Vannozzi, Giuseppe

    2017-08-18

    Quantitative gait analysis can provide a description of joint kinematics and dynamics, and it is recognized as a clinically useful tool for functional assessment, diagnosis and intervention planning. Clinically interpretable parameters are estimated from quantitative measures (e.g. ground reaction forces, skin marker trajectories) through biomechanical modelling. In particular, the estimation of joint moments during motion is grounded on several modelling assumptions: (1) body segmental and joint kinematics are derived from the trajectories of markers, modelling the human body as a kinematic chain; (2) joint resultant (net) loads are usually derived from force plate measurements through a model of segmental dynamics. Therefore, both measurement errors and modelling assumptions can affect the results, to an extent that also depends on the characteristics of the motor task analysed (e.g. gait speed). Errors affecting the trajectories of joint centres, the orientation of joint functional axes, the joint angular velocities, and the accuracy of inertial parameters and force measurements (concurring to the definition of the dynamic model) can weigh differently in the estimation of clinically interpretable joint moments. Numerous studies have addressed all these methodological aspects separately, but a critical analysis of how these aspects may affect the clinical interpretation of joint dynamics is still missing. This article aims at filling this gap through a systematic review of the literature, conducted on Web of Science, Scopus and PubMed. The final objective is hence to provide clear take-home messages to guide laboratories in the estimation of joint moments for clinical practice.

  2. METHODOLOGICAL APPROACH TO ANALYSIS AND EVALUATION OF INFORMATION PROTECTION IN INFORMATION SYSTEMS BASED ON VULNERABILITY DANGER

    Directory of Open Access Journals (Sweden)

    Y. M. Krotiuk

    2008-01-01

    The paper considers a methodological approach to the analysis and estimation of information security in information systems which is based on the analysis of vulnerabilities and the extent of their hazard. By vulnerability hazard is meant the complexity of exploiting the vulnerability as part of an information system. The necessary and sufficient conditions for exploiting a vulnerability have been determined in the paper. The paper proposes a generalized model of attack realization, which is used as a basis for constructing attack realization models for the exploitation of particular vulnerabilities. A criterion for estimating the information protection of information systems, based on the estimation of vulnerability hazard, is formulated in the paper. The proposed approach makes it possible to obtain a quantitative estimate of information system security on the basis of the proposed schemes for the realization of typical attacks on the distinguished classes of vulnerabilities. The methodological approach can be used for choosing among variants for the realization of protection mechanisms in information systems, as well as for estimating information security in operating information systems.

  3. Area overhead analysis of SEF: A design methodology for tolerating SEU

    International Nuclear Information System (INIS)

    Blaquiere, Y.; Savaria, Y.

    1987-01-01

    Soft-error filtering (SEF) is a design methodology proposed recently for implementing machines tolerant to SEU. This paper deals mainly with the evaluation and reduction of the area overhead introduced by SEF. A new shift-register filtering latch configuration is proposed. The use of this latch, optimized for minimum area, reduces the area overhead by a factor of 2.6 compared with latches optimized for time performance. A detailed analysis of the area overhead with SEF implemented on two relatively complex machines produced the following results: an SEF version of the 6800 microprocessor would require an area overhead varying between 12% and 69%, depending on the SEF latch used, and an SEF version of the RISCII microprocessor would incur a 38.8% area overhead. An analysis of the cost of implementing the Hamming error-correcting code on a register array is presented, and this cost is compared with that of implementing SEU tolerance directly with SEF. Finally, a hybrid approach is proposed in which a large register array is protected by an error-correcting code whereas the isolated latches are replaced by filtering latches. This hybrid approach reduces the area overhead to 18.8% for the RISCII architecture.

  4. Modeling and Design Analysis Methodology for Tailoring of Aircraft Structures with Composites

    Science.gov (United States)

    Rehfield, Lawrence W.

    2004-01-01

    Composite materials provide design flexibility in that fiber placement and orientation can be specified and a variety of material forms and manufacturing processes are available. It is possible, therefore, to 'tailor' the structure to a high degree in order to meet specific design requirements in an optimum manner. Common industrial practices, however, have limited the choices designers make. One of the reasons for this is that there is a dearth of conceptual/preliminary design analysis tools specifically devoted to identifying structural concepts for composite airframe structures. Large-scale finite element simulations are not suitable for such purposes. The present project has been devoted to creating modeling and design analysis methodology for use in the tailoring process of aircraft structures. Emphasis has been given to creating bend-twist elastic coupling in high-aspect-ratio wings or other lifting surfaces. The direction of our work was in concert with the overall NASA effort Twenty-First Century Aircraft Technology (TCAT). A multi-disciplinary team was assembled by Dr. Damodar Ambur to work on wing technology, which included our project.

  5. The Cultural Analysis of Soft Systems Methodology and the Configuration Model of Organizational Culture

    Directory of Open Access Journals (Sweden)

    Jürgen Staadt

    2015-06-01

    Organizations that find themselves in a problematic situation connected with cultural issues such as politics and power require adaptable research and corresponding modeling approaches so as to grasp the arrangements of that situation and their impact on organizational development. This article originates from an insider-ethnographic intervention into the problematic situation of the leading public housing provider in Luxembourg. Its aim is to describe how the more action-oriented cultural analysis of soft systems methodology and the theory-driven configuration model of organizational culture are mutually beneficial rather than contradictory. The data collected between 2007 and 2013 were analyzed manually as well as by means of ATLAS.ti. Results demonstrate that the cultural analysis enables an in-depth understanding of the power-laden environment within the organization that brings about the so-called "socio-political system", and that the configuration model makes it possible to depict the influence of that system on the whole organization. The overall research approach thus contributes toward a better understanding of the influence and impact of oppressive social environments and evolving power relations on the development of an organization.

  6. COMPARATIVE ANALYSIS OF INDICATORS OBTAINED BY CORINELAND COVER METHODOLOGY FOR SUSTAINABLE USE OF FOREST ECOSYSTEMS

    Directory of Open Access Journals (Sweden)

    Slaviša Popović

    2015-07-01

    The Serbian Environmental Protection Agency used international and national indicators to monitor forested landscape area for the period 1990-2000. The analysis was based on data obtained by the Corine Land Cover methodology for indicators such as Forest area, Forested landscape, Forest land, and Forest and semi-natural area. The analysis of forested-landscape indicators supported trend monitoring over the period 1990-2000. The dynamics of forested-area change can have a direct impact on the practical implementation of the indicators. The indicator Forest area can be used in planning the sustainable use of forests; its recorded growth rate in 2000, compared to 1990, is 0.296%. The indicator Forested landscape increased by 0.186% by 2000, while the indicator Forest land recorded a growth rate of 0.193%. Changes in the rates of these indicators can be used in the future for "emission trading". The smallest rate change, 0.1%, was recorded for the indicator Forests and semi-natural area. The information given by this indicator can be used for monitoring habitats in high mountain areas.

  7. Legal basis for risk analysis methodology while ensuring food safety in the Eurasian Economic union and the Republic of Belarus

    Directory of Open Access Journals (Sweden)

    E.V. Fedorenko

    2015-09-01

    Health risk analysis methodology is an internationally recognized tool for ensuring food safety. Its three main elements (risk assessment, risk management, and risk communication to inform interested parties about the risk) are legislated and implemented in the Eurasian Economic Union and the Republic of Belarus. There is a corresponding organizational and functional framework for the application of risk analysis methodology, both in the justification of product safety indicators and in the implementation of public health surveillance. Common methodological approaches and criteria for evaluating public health risk have been determined; they are used in the development and application of food safety requirements. Risk assessment can be used to justify safety indicators (contaminants, food additives) and to evaluate the effectiveness of programs for enriching food with micronutrients.

  8. Photovoltaic venture analysis. Final report. Volume III. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    This appendix contains a brief summary of a detailed description of alternative future energy scenarios which provide an overall backdrop for the photovoltaic venture analysis. Also included is a summary of a photovoltaic market/demand workshop, a summary of a photovoltaic supply workshop which used cross-impact analysis, and a report on photovoltaic array and system prices in 1982 and 1986. The results of a sectorial demand analysis for photovoltaic power systems used in the residential sector (single family homes), the service, commercial, and institutional sector (schools), and in the central power sector are presented. An analysis of photovoltaics in the electric utility market is given, and a report on the industrialization of photovoltaic systems is included. A DOE information memorandum regarding ''A Strategy for a Multi-Year Procurement Initiative on Photovoltaics (ACTS No. ET-002)'' is also included. (WHK)

  9. Analysis Methodology for Optimal Selection of Ground Station Site in Space Missions

    Science.gov (United States)

    Nieves-Chinchilla, J.; Farjas, M.; Martínez, R.

    2013-12-01

    Optimization of ground station sites is especially important in complex missions that include several small satellites (clusters or constellations), such as the QB50 project, where one ground station must be able to track several space vehicles, even simultaneously. In this regard, the design of the communication system has to take carefully into account the ground station site and the relevant signal phenomena, which depend on the frequency band. These aspects become even more relevant for proposing the optimal location of the ground station and establishing a trusted communication link when the ground segment sits in an urban area and/or low orbits are selected for the space segment. In addition, updated cartography with high-resolution data on the location and its surroundings helps to develop recommendations for the design of the site for space vehicle tracking and hence to improve effectiveness. The objectives of this analysis methodology are: completion of the cartographic information, modelling of the obstacles that hinder communication between the ground and space segments, and representation, in the generated 3D scene, of the degree of signal/noise impairment caused by the phenomena that interfere with communication. The integration of new geographic data capture technologies, such as 3D laser scanning, shows that better optimization of the antenna elevation mask, at its AOS and LOS azimuths along the visible horizon, maximizes visibility time with space vehicles. Furthermore, from the captured three-dimensional point cloud, specific information is selected and, using 3D modeling techniques, the 3D scene of the antenna location site and its surroundings is generated. The resulting 3D model shows nearby obstacles related to the cartographic conditions, such as mountain formations and buildings, and any additional obstacles that interfere with the operational quality of the antenna (other antennas and electronic devices that emit or receive in the same bandwidth).

  10. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    Science.gov (United States)

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of the assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes.
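
    For orientation, a sketch of the standard DerSimonian-Laird random-effects pooling on which such meta-analyses typically build; the review notes that standard techniques can break down when studies are small, so this is the baseline method only, with illustrative per-study estimates y and within-study variances v.

      import numpy as np

      y = np.array([0.21, 0.35, 0.28, 0.15, 0.40])  # per-study metric estimates
      v = np.array([0.010, 0.020, 0.015, 0.008, 0.030])

      w = 1.0 / v                                   # fixed-effect weights
      y_fixed = np.sum(w * y) / np.sum(w)
      q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q (heterogeneity)
      k = len(y)
      tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

      w_re = 1.0 / (v + tau2)                       # random-effects weights
      y_pooled = np.sum(w_re * y) / np.sum(w_re)
      se = np.sqrt(1.0 / np.sum(w_re))
      print(y_pooled, y_pooled - 1.96 * se, y_pooled + 1.96 * se)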

  11. Sexual orientation and adolescent substance use: a meta-analysis and methodological review.

    Science.gov (United States)

    Marshal, Michael P; Friedman, Mark S; Stall, Ron; King, Kevin M; Miles, Jonathan; Gold, Melanie A; Bukstein, Oscar G; Morse, Jennifer Q

    2008-04-01

    Several decades of research have shown that lesbian, gay and bisexual (LGB) adults are at high risk for substance use and substance use disorders (SUDs). These problems may often start prior to young adulthood; however, relatively little is known about risk for substance use in LGB adolescents. The primary aims of this paper were to conduct a meta-analysis of the relationship between sexual orientation and adolescent substance use and a systematic review and critique of the methodological characteristics of this literature. Medical and social science journals were searched using Medline and PsychInfo. Studies were included if they tested the relationship between sexual orientation and adolescent substance use. Eighteen published studies were identified. Data analysis procedures followed expert guidelines, and used National Institutes of Health (NIH)-sponsored meta-analysis software. LGB adolescents reported higher rates of substance use compared to heterosexual youth (overall odds ratio = 2.89, Cohen's d = 0.59). Effect sizes varied by gender, bisexuality status, sexual orientation definition and recruitment source. None of the studies tested mediation and only one tested moderation. One employed a matched comparison group design, one used a longitudinal design, and very few controlled for possible confounding variables. The odds of substance use for LGB youth were, on average, 190% higher than for heterosexual youth and substantially higher within some subpopulations of LGB youth (340% higher for bisexual youth, 400% higher for females). Causal mechanisms, protective factors and alternative explanations for this effect, as well as long-term substance use outcomes in LGB youth, remain largely unknown.
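
    The two overall effect sizes quoted above are mutually consistent under the standard logistic-distribution conversion between an odds ratio and Cohen's d:

      d = \frac{\sqrt{3}}{\pi}\,\ln(\mathrm{OR})
        = 0.5513 \times \ln(2.89) \approx 0.5513 \times 1.061 \approx 0.59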

  12. Chemical analysis of incense smokes used in Shaxi, Southwest China: a novel methodological approach in ethnobotany.

    Science.gov (United States)

    Staub, Peter O; Schiestl, Florian P; Leonti, Marco; Weckerle, Caroline S

    2011-10-31

    Characterization and comparative analysis of the main VOCs (volatile organic compounds) present in the smoke of 11 experimentally combusted plant species used as incense in Shaxi, Southwest China. Substances which may be responsible for the pleasant smell of the smokes, as well as substances with a potential pharmacological activity, are discussed. We adopt the dynamic headspace sorption method for the collection of smoke samples as a novel methodological approach in ethnobotany. The VOCs were identified using gas chromatography-mass spectrometry (GC-MS). Principal component analysis and canonical discriminant analysis were performed using PASW Statistics (Version 18.0.2). Among the identified compounds were 10 monoterpenoids, 7 sesquiterpenoids, 6 linear hydrocarbons, 6 methoxy phenolics, 2 benzenoids, 2 polycyclic aromatic hydrocarbons, and 2 fatty acids. Based on their volatile profiles, the species are well clustered intraspecifically and separated interspecifically. The most abundant among the compounds potentially responsible for the pleasant smells of the smokes are methyl salicylate (12.28±3.90%) for Gaultheria fragrantissima leaves, δ-cadinene (15.58±2.29%) for Juniperus squamata wood, and α-pinene for Cupressus funebris branches (9.16±7.73%) and Pistacia weinmanniifolia branches (19.52±8.66%). Several of the substances found are known for pharmacological activity, such as methyl salicylate, β-caryophyllene and cedrol. The species used by the local people in Shaxi for incense differ clearly with respect to the chemical compounds of their smoke. Further, incense smoke contains substances which are of pharmacological interest and might support medicinal uses of smoke. Cedrol, with its pleasant smell and sedative properties, may be an important factor in why specific plants are chosen as incense. Our findings support the idea that the effects of the use of incense as well as medicinal smoke depend on both the cultural and the pharmacological context.

  13. Analysis of modeling cumulative noise from simultaneous flights volume 1 : analysis at four national parks

    Science.gov (United States)

    2012-12-31

    This is the first of two volumes of the report on modeling cumulative noise from simultaneous flights. This volume includes: an overview of the time compression algorithms used to model simultaneous aircraft; a revised summary of a preliminary study (w...

  14. Do skeletal cephalometric characteristics correlate with condylar volume, surface and shape? A 3D analysis

    Directory of Open Access Journals (Sweden)

    Saccucci, Matteo

    2012-05-01

    Objective: The purpose of this study was to determine the condylar volume in subjects with different mandibular divergence and skeletal class using cone-beam computed tomography (CBCT) and analysis software. Materials and methods: For 94 patients (46 females and 48 males; mean age 24.3 ± 6.5 years), rendering reconstructions of the left and right temporomandibular joints (TMJs) were obtained. Subjects were then classified on the basis of the ANB angle and the GoGn-SN angle into three classes (I, II, III). The data of the different classes were compared. Results: No significant difference in condylar volume was observed in the whole sample between the right and the left sides. The analysis of mean volume among low, normal and high mandibular plane angles revealed a significantly higher volume and surface in low-angle subjects. Class III subjects also tended to show a higher condylar volume and surface than class I and class II subjects, although the difference was not significant. Conclusions: Higher condylar volume was a common characteristic of low-angle subjects compared to subjects with normal and high mandibular plane angles. Skeletal class also appears to be associated with condylar volume and surface.

  15. Left ventricular pressure and volume data acquisition and analysis using LabVIEW.

    Science.gov (United States)

    Cassidy, S C; Teitel, D F

    1997-03-01

    To automate the analysis of left ventricular pressure-volume data, we used LabVIEW to create applications that digitize and display data recorded from conductance and manometric catheters. The applications separate data into cardiac cycles, calculate parallel conductance, and calculate indices of left ventricular function, including end-systolic elastance, preload-recruitable stroke work, stroke volume, ejection fraction, stroke work, maximum and minimum derivative of ventricular pressure, heart rate, indices of relaxation, peak filling rate, and ventricular chamber stiffness. Pressure-volume loops can be graphically displayed. These analyses are exported to a text file. These applications have simplified and automated the process of evaluating ventricular function.
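
    A minimal sketch of a few of the listed indices, computed from one synthetic beat; the real applications operate on digitized conductance (volume) and manometric (pressure) signals, and all signal values below are illustrative only.

      import numpy as np

      t = np.linspace(0.0, 0.8, 801)                        # one beat at 75 bpm, s
      volume = 70.0 + 20.0 * np.cos(2 * np.pi * t / 0.8)    # mL (EDV 90, ESV 50)
      pressure = 70.0 + 45.0 * np.sin(2 * np.pi * t / 0.8)  # mmHg (synthetic loop)

      edv, esv = volume.max(), volume.min()
      stroke_volume = edv - esv                             # mL
      ejection_fraction = stroke_volume / edv
      dpdt = np.gradient(pressure, t)                       # mmHg/s; max/min dP/dt
      stroke_work = abs(np.trapz(pressure, volume))         # PV-loop area, mmHg*mL
      print(stroke_volume, ejection_fraction, dpdt.max(), dpdt.min(), stroke_work)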

  16. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare

    NARCIS (Netherlands)

    Bracke, M.B.M.; Edwards, S.A.; Metz, J.H.M.; Noordhuizen, J.P.T.M.; Algers, B.

    2008-01-01

    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally...

  17. Computer Aided Methodology for Simultaneous Synthesis, Design & Analysis of Chemical Products-Processes

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2006-01-01

    A new combined methodology for computer-aided molecular design and process flowsheet design is presented. The methodology is based on the group contribution approach for the prediction of molecular properties and the design of molecules. Using the same principles, process groups have been developed together...

  18. Photovoltaic venture analysis. Final report. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    A description of the integrating model for photovoltaic venture analysis is given; input assumptions for the model are described; and the integrating model program listing is given. The integrating model is an explicit representation of the interactions between photovoltaic markets and supply under alternative sets of assumptions. It provides a consistent way of assembling and integrating the various assumptions, data, and information that have been obtained on photovoltaic systems supply and demand factors. Secondly, it provides a mechanism for understanding the implications of all the interacting assumptions. By representing the assumptions in a common, explicit framework, much more complex interactions can be considered than are possible intuitively. The integrating model therefore provides a way of examining the relative importance of different assumptions, parameters, and inputs through sensitivity analysis. Also, detailed results of model sensitivity analysis and detailed market and systems information are presented. (WHK)

  19. Comparative analysis of two methods for measuring sales volumes during malaria medicine outlet surveys.

    Science.gov (United States)

    Patouillard, Edith; Kleinschmidt, Immo; Hanson, Kara; Pok, Sochea; Palafox, Benjamin; Tougher, Sarah; O'Connell, Kate; Goodman, Catherine

    2013-09-05

    There is increased interest in using commercial providers to improve access to quality malaria treatment. Understanding their current role is an essential first step, notably in terms of the volume of diagnostics and anti-malarials they sell. Sales volume data can be used to measure the importance of different provider and product types, the frequency of parasitological diagnosis, and the impact of interventions. Several methods for measuring sales volumes are available, yet all have methodological challenges, and evidence is lacking on the comparability of different methods. Using sales volume data on anti-malarials and rapid diagnostic tests (RDTs) for malaria collected through provider recall (RC) and retail audits (RA), this study measures the degree of agreement between the two methods at wholesale and retail commercial providers in Cambodia following the Bland-Altman approach. Relative strengths and weaknesses of the methods were also investigated through qualitative research with fieldworkers. A total of 67 wholesalers and 107 retailers were sampled. Wholesale sales volumes were estimated through both methods for 62 anti-malarials and 23 RDTs, and retail volumes for 113 anti-malarials and 33 RDTs. At wholesale outlets, RA estimates for anti-malarial sales were on average higher than RC estimates (mean difference of four adult equivalent treatment doses (95% CI 0.6-7.2)), equivalent to 30% of mean sales volumes. For RDTs at wholesalers, the between-method mean difference was not statistically significant (one test, 95% CI -6.0-4.0). At retail outlets, between-method differences for both anti-malarials and RDTs increased with larger volumes being measured, so mean differences were not a meaningful measure of agreement between the methods. Qualitative research revealed that in Cambodia, where sales volumes are small, RC had key advantages: providers were perceived to remember their sales volumes more easily and to find RC less invasive; fieldworkers found it more...
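
    A minimal sketch of the Bland-Altman agreement computation used in the study (mean between-method difference plus 95% limits of agreement), with illustrative paired outlet volumes; plotting the differences against the pair means is the standard check for the volume-dependent pattern the abstract reports.

      import numpy as np

      rc = np.array([12.0, 4.0, 30.0, 8.0, 15.0, 6.0, 22.0, 10.0])  # recall, doses/outlet
      ra = np.array([14.0, 5.0, 36.0, 7.0, 18.0, 6.0, 27.0, 12.0])  # retail audit

      diff = ra - rc
      bias = diff.mean()                        # mean between-method difference
      sd = diff.std(ddof=1)
      loa = (bias - 1.96 * sd, bias + 1.96 * sd)
      print(f"bias {bias:.2f}, 95% limits of agreement {loa[0]:.2f} to {loa[1]:.2f}")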

  20. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.