WorldWideScience

Sample records for analysis methodology volume

  1. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  2. Method for Determining Language Objectives and Criteria. Volume II. Methodological Tools: Computer Analysis, Data Collection Instruments.

    Science.gov (United States)

    1979-05-25

    This volume presents (1) methods for computer and hand analysis of numerical language performance data (with examples) and (2) samples of the interview, observation, and survey instruments used in collecting language data. (Author)

  3. CONTAMINATED SOIL VOLUME ESTIMATE TRACKING METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    Durham, L.A.; Johnson, R.L.; Rieman, C.; Kenna, T.; Pilon, R.

    2003-02-27

    The U.S. Army Corps of Engineers (USACE) is conducting a cleanup of radiologically contaminated properties under the Formerly Utilized Sites Remedial Action Program (FUSRAP). The largest cost element for most of the FUSRAP sites is the transportation and disposal of contaminated soil. Project managers and engineers need an estimate of the volume of contaminated soil to determine project costs and schedule. Once excavation activities begin and additional remedial action data are collected, the actual quantity of contaminated soil often deviates from the original estimate, resulting in cost and schedule impacts to the project. The project costs and schedule need to be frequently updated by tracking the actual quantities of excavated soil and contaminated soil remaining during the life of a remedial action project. A soil volume estimate tracking methodology was developed to provide a mechanism for project managers and engineers to create better project controls of costs and schedule. For the FUSRAP Linde site, an estimate of the initial volume of in situ soil above the specified cleanup guidelines was calculated on the basis of discrete soil sample data and other relevant data using indicator geostatistical techniques combined with Bayesian analysis. During the remedial action, updated volume estimates of remaining in situ soils requiring excavation were calculated on a periodic basis. In addition to taking into account the volume of soil that had been excavated, the updated volume estimates incorporated both new gamma walkover surveys and discrete sample data collected as part of the remedial action. A civil survey company provided periodic estimates of actual in situ excavated soil volumes. By using the results from the civil survey of actual in situ volumes excavated and the updated estimate of the remaining volume of contaminated soil requiring excavation, the USACE Buffalo District was able to forecast and update project costs and schedule. The soil volume
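    The record above describes an indicator-geostatistics volume estimate that is periodically updated as excavation proceeds. As a rough illustration of the bookkeeping involved (not the USACE implementation), the sketch below assumes the site has already been discretized into grid cells with a per-cell probability of exceeding the cleanup criterion; the function name, grid, and numbers are hypothetical.

```python
import numpy as np

def remaining_contaminated_volume(p_exceed, cell_volume_m3, excavated_mask):
    """Expected in situ soil volume still above the cleanup guideline.

    p_exceed       : 2D array of per-cell probabilities that soil exceeds the
                     criterion (e.g., from indicator kriging updated with new
                     gamma walkover and discrete sample data).
    cell_volume_m3 : soil volume represented by each grid cell.
    excavated_mask : boolean 2D array, True where soil has already been removed.
    """
    still_in_place = ~excavated_mask
    # Expected contaminated volume = sum of exceedance probabilities over
    # the cells that have not yet been excavated.
    return float(np.sum(p_exceed[still_in_place]) * cell_volume_m3)

# Example: 10 x 10 grid, 5 m3 per cell, one quadrant already excavated.
rng = np.random.default_rng(0)
p = rng.uniform(0.0, 1.0, size=(10, 10))
dug = np.zeros((10, 10), dtype=bool)
dug[:5, :5] = True
print(f"Estimated remaining volume: {remaining_contaminated_volume(p, 5.0, dug):.1f} m3")
```

    Re-running this sum each time p_exceed is updated with new survey data is what lets the forecast of remaining volume, and hence cost and schedule, track the actual excavation.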

  4. Transport of solid commodities via freight pipeline: demand analysis methodology. Volume IV. First year final report

    Energy Technology Data Exchange (ETDEWEB)

    Allen, W.B.; Plaut, T.

    1976-07-01

    In order to determine the feasibility of intercity freight pipelines, it was necessary to determine whether sufficient traffic flows currently exist between various origins and destinations to justify consideration of a mode whose operating characteristics became competitive under conditions of high-traffic volume. An intercity origin/destination freight-flow matrix was developed for a large range of commodities from published sources. A high-freight traffic-density corridor between Chicago and New York and another between St. Louis and New York were studied. These corridors, which represented 18 cities, had single-direction flows of 16 million tons/year. If trans-shipment were allowed at each of the 18 cities, flows of up to 38 million tons/year were found in each direction. These figures did not include mineral or agricultural products. After determining that such pipeline-eligible freight-traffic volumes existed, the next step was to determine the ability of freight pipeline to penetrate such markets. Modal-split models were run on aggregate data from the 1967 Census of Transportation. Modal-split models were also run on disaggregate data specially collected for this study. The freight pipeline service characteristics were then substituted into both the aggregate and disaggregate models (truck vs. pipeline and then rail vs. pipeline) and estimates of pipeline penetration into particular STCC commodity groups were made. Based on these very preliminary results, it appears that freight pipeline has market penetration potential that is consistent with high-volume participation in the intercity freight market.

  5. Seismic hazard analysis application of methodology, results, and sensitivity studies. Volume 4

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.L.

    1981-08-08

    As part of the Site Specific Spectra Project, this report seeks to identify the sources of and minimize uncertainty in estimates of seismic hazards in the Eastern United States. Findings are being used by the Nuclear Regulatory Commission to develop a synthesis among various methods that can be used in evaluating seismic hazard at the various plants in the Eastern United States. In this volume, one of a five-volume series, we discuss the application of the probabilistic approach using expert opinion. The seismic hazard is developed at nine sites in the Central and Northeastern United States, and both individual experts' and synthesis results are obtained. We also discuss and evaluate the ground motion models used to develop the seismic hazard at the various sites, analyzing extensive sensitivity studies to determine the important parameters and the significance of uncertainty in them. Comparisons are made between probabilistic and real spectra for a number of Eastern earthquakes. The uncertainty in the real spectra is examined as a function of the key earthquake source parameters. In our opinion, the single most important conclusion of this study is that the use of expert opinion to supplement the sparse data available on Eastern United States earthquakes is a viable approach for estimating the seismic hazard in this region of the country. 29 refs., 15 tabs.

  6. Computer technology -- 1996: Applications and methodology. PVP-Volume 326

    Energy Technology Data Exchange (ETDEWEB)

    Hulbert, G.M. [ed.] [Univ. of Michigan, Ann Arbor, MI (United States); Hsu, K.H. [ed.] [Babcock and Wilcox, Barberton, OH (United States); Lee, T.W. [ed.] [FMC Corp., Santa Clara, CA (United States); Nicholas, T. [ed.] [USAF Wright Laboratory, Wright-Patterson AFB, OH (United States)

    1996-12-01

    The primary objective of the Computer Technology Committee of the ASME Pressure Vessels and Piping Division is to promote interest and technical exchange in the field of computer technology, related to the design and analysis of pressure vessels and piping. The topics included in this volume are: analysis of bolted joints; nonlinear analysis, applications and methodology; finite element analysis and applications; and behavior of materials. Separate abstracts were prepared for 23 of the papers in this volume.

  7. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.

  8. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
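    As a toy illustration of the combination step described above (not the LLNL model itself), the sketch below population-weights building protection factors to obtain an expected dose for one region; the numbers and function name are made up.

```python
import numpy as np

def expected_sheltered_dose(outdoor_dose, population_fractions, protection_factors):
    """Population-averaged fallout dose for one region.

    outdoor_dose         : unsheltered reference dose for the region.
    population_fractions : fraction of the population in each building type or
                           location (must sum to 1).
    protection_factors   : dose-reduction factor of each building type/location
                           (dose inside = outdoor dose / protection factor).
    """
    fractions = np.asarray(population_fractions, dtype=float)
    pf = np.asarray(protection_factors, dtype=float)
    assert np.isclose(fractions.sum(), 1.0), "population fractions must sum to 1"
    return float(outdoor_dose * np.sum(fractions / pf))

# Example: 60% in light residential (PF 3), 30% in offices (PF 10), 10% outdoors (PF 1).
print(expected_sheltered_dose(10.0, [0.6, 0.3, 0.1], [3.0, 10.0, 1.0]))
```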

  9. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 2: Methodology and Results

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-10-01

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  10. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  11. Risk analysis methodology survey

    Science.gov (United States)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, ranging from simple methods to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selecting the most appropriate technique.

  12. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    The article investigates the theoretical and methodological principles of situational analysis. The necessity of situational analysis under modern conditions is demonstrated, and the notion of “situational analysis” is defined. We conclude that situational analysis is a continuous system study whose purpose is to identify signs of dangerous situations, to evaluate such signs comprehensively as they are influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative, and search functions in the process of situational analysis is demonstrated. The basic methodological elements of situational analysis are grounded. Substantiating these principal methodological elements will enable the analyst to develop adaptive methods that take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system, to diagnose such a situation and subject it to systematic, in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  13. Cassini Spacecraft Uncertainty Analysis Data and Methodology Review and Update/Volume 1: Updated Parameter Uncertainty Models for the Consequence Analysis

    Energy Technology Data Exchange (ETDEWEB)

    WHEELER, TIMOTHY A.; WYSS, GREGORY D.; HARPER, FREDERICK T.

    2000-11-01

    Uncertainty distributions for specific parameters of the Cassini General Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) Final Safety Analysis Report consequence risk analysis were revised and updated. The revisions and updates were done for all consequence parameters for which relevant information exists from the joint project on Probabilistic Accident Consequence Uncertainty Analysis by the United States Nuclear Regulatory Commission and the Commission of European Communities.

  14. Architectural and Behavioral Systems Design Methodology and Analysis for Optimal Habitation in a Volume-Limited Spacecraft for Long Duration Flights

    Science.gov (United States)

    Kennedy, Kriss J.; Lewis, Ruthan; Toups, Larry; Howard, Robert; Whitmire, Alexandra; Smitherman, David; Howe, Scott

    2016-01-01

    As our human spaceflight missions change and we reach towards Mars, the risk of an adverse behavioral outcome increases, and requirements for crew health, safety, and performance, and the internal architecture, will need to change to accommodate unprecedented mission demands. Evidence shows that architectural arrangement and habitability elements impact behavior. Net habitable volume is the volume available to the crew after accounting for elements that decrease the functional volume of the spacecraft. Determination of minimum acceptable net habitable volume and associated architectural design elements, as mission duration and environment vary, is key to enabling, maintaining, and/or enhancing human performance and psychological and behavioral health. Current NASA efforts to derive minimum acceptable net habitable volumes and study the interaction of covariates and stressors, such as sensory stimulation, communication, autonomy, and privacy, and application to internal architecture design layouts, attributes, and use of advanced accommodations will be presented. Furthermore, implications of crew adaptation to available volume as they transfer from Earth accommodations, to deep space travel, to planetary surface habitats, and return, will be discussed.

  15. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological) p

  16. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor) which has been developed by Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, further validated final safety analysis methodology will be developed. Current licensing safety analysis methodology of the Westinghouse and KSNPP PWRs operating and under development in Korea as well as the Russian licensing safety analysis methodology for the integral reactors have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. Validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended for the nuclear regulatory authority to establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  17. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  18. Constructive Analysis : A Study in Epistemological Methodology

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    The present study is concerned with the viability of the primary method in contemporary philosophy, i.e., conceptual analysis. Starting out by tracing the roots of this methodology to Platonic philosophy, the study questions whether such a methodology makes sense when divorced from Platonic philosophy...

  19. METHODOLOGICAL STRATEGIES FOR TEXTUAL DATA ANALYSIS:

    OpenAIRE

    Juan Carlos Rincón-Vásquez; Andrea Velandia-Morales; Idaly Barreto

    2011-01-01

    This paper presents a classification methodology for studies of textual data. This classification is based on the two predominant methodologies for social science research: qualitative and quantitative. The basic assumption is that the research process involves three main features: 1) Structure of the Research, 2) Collection of Information, and 3) Analysis and Interpretation of Data. In each, there are general guidelines for textual studies.

  20. METHODOLOGICAL STRATEGIES FOR TEXTUAL DATA ANALYSIS:

    Directory of Open Access Journals (Sweden)

    Juan Carlos Rincón-Vásquez

    2011-12-01

    This paper presents a classification methodology for studies of textual data. This classification is based on the two predominant methodologies for social science research: qualitative and quantitative. The basic assumption is that the research process involves three main features: 1) Structure of the Research, 2) Collection of Information, and 3) Analysis and Interpretation of Data. In each, there are general guidelines for textual studies.

  1. Rat sperm motility analysis: methodologic considerations

    Science.gov (United States)

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

  2. Similar methodological analysis involving the user experience.

    Science.gov (United States)

    Almeida e Silva, Caio Márcio; Okimoto, Maria Lúcia R L; Tanure, Raffaela Leane Zenni

    2012-01-01

    This article deals with the use of a protocol for the analysis of similar methodologies related to user experience. Articles recounting experiments in the area were selected, analyzed on the basis of the similar-analysis protocol, and finally synthesized and associated.

  3. Malware Analysis Sandbox Testing Methodology

    Directory of Open Access Journals (Sweden)

    Zoltan Balazs

    2016-01-01

    Manual processing of malware samples became impossible years ago. Sandboxes are used to automate the analysis of malware samples to gather information about the dynamic behaviour of the malware, both at AV companies and at enterprises. Some malware samples use known techniques to detect when they run in a sandbox, but most of these sandbox detection techniques can be easily detected and thus flagged as malicious. I invented new approaches to detect these sandboxes and developed a tool that can collect a lot of interesting information from them in order to build statistics on how the current technologies work. After analysing these results, I will demonstrate tricks to detect sandboxes that cannot easily be flagged as malicious. Some sandboxes do not interact with the Internet in order to block data extraction, but with some DNS-fu the information can be extracted from these appliances as well.

  4. Exploring Participatory Methodologies in Organizational Discourse Analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches as single-level vs. multi-level, critical vs. participatory, and discursive vs. material methods. They raise methodological issues of combining such approaches to embrace multimodality in order to enable new … in collaborative governance processes in Denmark. The paper contributes to refining methodologies of organizational discourse analysis by elaborating method-mixing that embraces multimodal organizational discourses. Furthermore, it discusses practical implications of the struggling subjectification processes … and practices by dealing with challenges of methodological overview, responsive creativity and identity-struggle. The potentials hereof are demonstrated and discussed with cases of two practices that are both critical and co-creative, namely ‘organizational modelling’ and ‘fixed/unfixed positioning’, from fieldwork…

  5. Exploring participatory methodologies in organizational discourse analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches as single-level vs. multi-level, critical vs. participatory, and discursive vs. material methods. They raise methodological issues of combining such approaches to embrace multimodality in order to enable new contributions. As regards concepts, efforts are made, but further exploration of methodological combinations and their practical implications is called for. This paper argues 1) to combine methodologies by approaching this as scholarly subjectification processes, and 2) to perform combinations in both … and practices by dealing with challenges of methodological overview, responsive creativity and identity-struggle. The potentials hereof are demonstrated and discussed with cases of two practices that are both critical and co-creative, namely ‘organizational modelling’ and ‘fixed/unfixed positioning’, from fieldwork…

  6. Seismic hazard methodology for the Central and Eastern United States. Volume 1: methodology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, R.K.; Veneziano, D.; Toro, G.; O'Hara, T.; Drake, L.; Patwardhan, A.; Kulkarni, R.; Kenney, R.; Winkler, R.; Coppersmith, K.

    1986-07-01

    A methodology to estimate the hazard of earthquake ground motion at a site has been developed. The methodology consists of systematic procedures to characterize earthquake sources, the seismicity parameters of those sources, and functions for the attenuation of seismic energy, incorporating multiple input interpretations by earth scientists. Uncertainties reflecting permissible alternative interpretations are quantified by use of probability logic trees and are propagated through the hazard results. The methodology is flexible and permits, for example, interpretations of seismic sources that are consistent with earth-science practice in the need to depict complexity and to accommodate alternative hypotheses. This flexibility is achieved by means of a tectonic framework interpretation from which alternative seismic sources are derived. To estimate rates of earthquake recurrence, maximum use is made of the historical earthquake database in establishing a uniform measure of earthquake size, in identifying independent events, and in determining the completeness of the earthquake record in time, space, and magnitude. Procedures developed as part of the methodology permit relaxation of the usual assumption of homogeneous seismicity within a source and provide unbiased estimates of recurrence parameters. The methodology incorporates the Poisson-exponential earthquake recurrence model and an extensive assessment of its applicability is provided. Finally, the methodology includes procedures to aggregate hazard results from a number of separate input interpretations to obtain a best-estimate value of hazard, together with its uncertainty, at a site.
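    To make the last two ideas concrete, the sketch below shows a Poisson exceedance probability and a weighted aggregation of hazard estimates over logic-tree branches; it is a schematic under assumed rates and weights, not the methodology's actual calculation.

```python
import numpy as np

def poisson_exceedance_prob(annual_rate, years):
    """Probability of at least one exceedance in `years`, under a Poisson model."""
    return 1.0 - np.exp(-annual_rate * years)

def aggregate_hazard(branch_weights, branch_rates):
    """Best-estimate annual exceedance rate from logic-tree branches.

    branch_weights : probability weight assigned to each alternative interpretation.
    branch_rates   : annual exceedance rate computed under each branch.
    """
    w = np.asarray(branch_weights, dtype=float)
    r = np.asarray(branch_rates, dtype=float)
    assert np.isclose(w.sum(), 1.0), "logic-tree weights must sum to 1"
    return float(np.dot(w, r))

# Three alternative source/attenuation interpretations for one ground-motion level.
rate = aggregate_hazard([0.2, 0.5, 0.3], [1e-4, 4e-4, 2e-3])
print(rate, poisson_exceedance_prob(rate, 50))
```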

  7. Fractal analysis: methodologies for biomedical researchers.

    Science.gov (United States)

    Ristanović, Dusan; Milosević, Nebojsa T

    2012-01-01

    Fractal analysis has become a popular method in all branches of scientific investigations including biology and medicine. Although there is a growing interest in the application of fractal analysis in biological sciences, questions about the methodology of fractal analysis have partly restricted its wider and comprehensible application. It is a notable fact that fractal analysis is derived from fractal geometry, but there are some unresolved issues that need to be addressed. In this respect, we discuss several related underlying principles for fractal analysis and establish the meaningful relationship between fractal analysis and fractal geometry. Since some concepts in fractal analysis are determined descriptively and/or qualitatively, this paper provides their exact mathematical definitions or explanations. Another aim of this study is to show that nowadays fractal analysis is an independent mathematical and experimental method based on Mandelbrot's fractal geometry, traditional Euclidean geometry, and Richardson's coastline method.
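    A standard, minimal example of the kind of measurement fractal analysis performs is the box-counting dimension of a binary image; the sketch below is a generic implementation, not code from the paper.

```python
import numpy as np

def box_counting_dimension(image, box_sizes):
    """Estimate the fractal (box-counting) dimension of a binary 2D image.

    image     : 2D boolean array, True where the object is present.
    box_sizes : iterable of box edge lengths in pixels (powers of two work best).
    """
    counts = []
    for size in box_sizes:
        count = 0
        for i in range(0, image.shape[0], size):
            for j in range(0, image.shape[1], size):
                if image[i:i + size, j:j + size].any():
                    count += 1
        counts.append(count)
    # The dimension is the slope of log(count) versus log(1/box size).
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Example: a filled square should give a dimension close to 2.
img = np.zeros((256, 256), dtype=bool)
img[64:192, 64:192] = True
print(box_counting_dimension(img, [2, 4, 8, 16, 32]))
```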

  8. Mass Spectrometry Methodology in Lipid Analysis

    Directory of Open Access Journals (Sweden)

    Lin Li

    2014-06-01

    Lipidomics is an emerging field in which the structures, functions and dynamic changes of lipids in cells, tissues or body fluids are investigated. Due to the vital roles of lipids in human physiological and pathological processes, lipidomics is attracting more and more attention. However, because of the diversity and complexity of lipids, lipid analysis is still full of challenges. The recent development of methods for lipid extraction and analysis, combined with bioinformatics technology, has greatly pushed forward the study of lipidomics. Among these methods, mass spectrometry (MS) is the most important technology for lipid analysis. In this review, the MS-based methodology for lipid analysis is introduced. It is believed that along with the rapid development of MS and its further application to lipid analysis, more functional lipids will be identified as biomarkers and therapeutic targets and used in the study of the mechanisms of disease.

  9. Spatial analysis methodology applied to rural electrification

    Energy Technology Data Exchange (ETDEWEB)

    Amador, J. [Department of Electric Engineering, EUTI, UPM, Ronda de Valencia, E-28012 Madrid (Spain); Dominguez, J. [Renewable Energies Division, CIEMAT, Av. Complutense 22, E-28040 Madrid (Spain)

    2006-08-15

    The use of geographical information systems (GISs) in studies of regional integration of renewable energies provides advantages such as speed, amount of information, analysis capacity and others. However, these characteristics make it difficult to link the results to the initial variables, and therefore to validate the GIS. This makes it hard to ascertain the reliability of both the results and their subsequent analysis. To solve these problems, a GIS-based method for rural electrification with renewable energies is proposed, structured in three stages, with the aim of determining the influence of the initial variables on the result. In the first stage, a classic sensitivity analysis of the equivalent electrification cost (LEC) is performed; the second stage involves a spatial sensitivity analysis and the third determines the stability of the results. This methodology has been verified in the application of a GIS in Lorca (Spain). (author)
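    A minimal sketch of the first-stage calculation described above, assuming the electrification cost is computed per settlement with a capital-recovery-factor formula; the cost figures and parameter values are illustrative placeholders, not data from the Lorca case.

```python
def levelized_electrification_cost(capital_cost, om_cost_per_year,
                                   annual_energy_kwh, discount_rate, lifetime_years):
    """Electrification cost per kWh for one settlement.

    A capital-recovery-factor formulation; the inputs stand in for the GIS layers
    (resource, demand, distance to grid) used in the study.
    """
    crf = (discount_rate * (1 + discount_rate) ** lifetime_years) / \
          ((1 + discount_rate) ** lifetime_years - 1)
    return (capital_cost * crf + om_cost_per_year) / annual_energy_kwh

base = levelized_electrification_cost(12000.0, 300.0, 4000.0, 0.08, 20)
# Classic one-at-a-time sensitivity: +/-20% on capital cost.
for factor in (0.8, 1.0, 1.2):
    lec = levelized_electrification_cost(12000.0 * factor, 300.0, 4000.0, 0.08, 20)
    print(f"capital x{factor:.1f}: cost = {lec:.3f} per kWh, change = {100 * (lec / base - 1):+.1f}%")
```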

  10. Enhanced recovery of unconventional gas. The methodology--Volume III (of 3 volumes)

    Energy Technology Data Exchange (ETDEWEB)

    Kuuskraa, V. A.; Brashear, J. P.; Doscher, T. M.; Elkins, L. E.

    1979-02-01

    The methodology is described in chapters on the analytic approach, estimated natural gas production, recovery from tight gas sands, recovery from Devonian shales, recovery from coal seams, and recovery from geopressured aquifers. (JRD)

  11. Design and Verification Methodology of Boundary Conditions for Finite Volume Schemes

    Science.gov (United States)

    2012-07-01

    Folkner, D.; Katz, A.; Sankaran, V. Design and Verification Methodology of Boundary Conditions for Finite Volume Schemes. ICCFD7-2012-1001, July 9-13, 2012. In-house work funded by the Army Research Office (ARO) under the supervision of Dr. Frederick Ferguson; the authors thank Dr. Ferguson for his continuing support of this research.

  12. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and `what if` questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  13. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    Energy Technology Data Exchange (ETDEWEB)

    1987-04-01

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  14. Cost analysis methodology: Photovoltaic Manufacturing Technology Project

    Energy Technology Data Exchange (ETDEWEB)

    Whisnant, R.A. (Research Triangle Inst., Research Triangle Park, NC (United States))

    1992-09-01

    This report describes work done under Phase 1 of the Photovoltaic Manufacturing Technology (PVMaT) Project. PVMaT is a five-year project to support the translation of research and development in PV technology into the marketplace. PVMaT, conceived as a DOE/industry partnership, seeks to advance PV manufacturing technologies, reduce PV module production costs, increase module performance, and expand US commercial production capacities. Under PVMaT, manufacturers will propose specific manufacturing process improvements that may contribute to the goals of the project, which are to lessen costs and thus hasten entry into larger-scale, grid-connected applications. Phase 1 of the PVMaT project is to identify obstacles and problems associated with manufacturing processes. This report describes the cost analysis methodology required under Phase 1 that will allow subcontractors to be ranked and evaluated during Phase 2.

  15. Bare-Hand Volume Cracker for Raw Volume Data Analysis

    Directory of Open Access Journals (Sweden)

    Bireswar Laha

    2016-09-01

    Analysis of raw volume data generated from different scanning technologies faces a variety of challenges, related to search, pattern recognition, spatial understanding, quantitative estimation, and shape description. In a previous study, we found that the Volume Cracker (VC) 3D interaction (3DI) technique mitigated some of these problems, but this result was from a tethered glove-based system with users analyzing simulated data. Here, we redesigned the VC by using untethered bare-hand interaction with real volume datasets, with a broader aim of adoption of this technique in research labs. We developed symmetric and asymmetric interfaces for the Bare-Hand Volume Cracker (BHVC) through design iterations with a biomechanics scientist. We evaluated our asymmetric BHVC technique against standard 2D and widely used 3D interaction techniques with experts analyzing scanned beetle datasets. We found that our BHVC design significantly outperformed the other two techniques. This study contributes a practical 3DI design for scientists, documents lessons learned while redesigning for bare-hand trackers, and provides evidence suggesting that 3D interaction could improve volume data analysis for a variety of visual analysis tasks. Our contribution is in the realm of 3D user interfaces tightly integrated with visualization, for improving the effectiveness of visual analysis of volume datasets. Based on our experience, we also provide some insights into hardware-agnostic principles for design of effective interaction techniques.

  16. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  17. High volume data storage architecture analysis

    Science.gov (United States)

    Malik, James M.

    1990-01-01

    A High Volume Data Storage Architecture Analysis was conducted. The results, presented in this report, will be applied to problems of high volume data requirements such as those anticipated for the Space Station Control Center. High volume data storage systems at several different sites were analyzed for archive capacity, storage hierarchy and migration philosophy, and retrieval capabilities. Proposed architectures were solicited from the sites selected for in-depth analysis. Model architectures for a hypothetical data archiving system, for a high speed file server, and for high volume data storage are attached.

  18. The space station assembly phase: Flight telerobotic servicer feasibility. Volume 2: Methodology and case study

    Science.gov (United States)

    Smith, Jeffrey H.; Gyamfi, Max A.; Volkmer, Kent; Zimmerman, Wayne F.

    1987-01-01

    A methodology is described for examining the feasibility of a Flight Telerobotic Servicer (FTS) using two assembly scenarios, defined at the EVA task level, for the 30 shuttle flights (beginning with MB-1) over a four-year period. Performing all EVA tasks by crew only is compared to a scenario in which crew EVA is augmented by FTS. A reference FTS concept is used as a technology baseline and life-cycle cost analysis is performed to highlight cost tradeoffs. The methodology, procedure, and data used to complete the analysis are documented in detail.

  19. Swept Volume Parameterization for Isogeometric Analysis

    Science.gov (United States)

    Aigner, M.; Heinrich, C.; Jüttler, B.; Pilgerstorfer, E.; Simeon, B.; Vuong, A.-V.

    Isogeometric Analysis uses NURBS representations of the domain for performing numerical simulations. The first part of this paper presents a variational framework for generating NURBS parameterizations of swept volumes. The class of these volumes covers a number of interesting free-form shapes, such as blades of turbines and propellers, ship hulls or wings of airplanes. The second part of the paper reports the results of isogeometric analysis which were obtained with the help of the generated NURBS volume parameterizations. In particular we discuss the influence of the chosen parameterization and the incorporation of boundary conditions.

  20. Development of economic consequence methodology for process risk analysis.

    Science.gov (United States)

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis, with appropriate models for risk analysis of process systems, is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating the different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
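    The sketch below illustrates the loss-function step under simple assumptions: a Taguchi-style quadratic loss, a saturating inverted-normal loss, and their integration over the deviations of one scenario. The parameter values and function names are hypothetical, not the models calibrated in the paper.

```python
import numpy as np

def quadratic_loss(deviation, k):
    """Taguchi-style quadratic loss: grows with the square of the process deviation."""
    return k * deviation ** 2

def inverted_normal_loss(deviation, max_loss, shape):
    """Inverted-normal loss: saturates at `max_loss` for large deviations."""
    return max_loss * (1.0 - np.exp(-0.5 * (deviation / shape) ** 2))

def total_scenario_loss(deviations, loss_funcs):
    """Integrate the losses of the process deviations making up one scenario."""
    return sum(f(d) for d, f in zip(deviations, loss_funcs))

# Example: a pressure excursion (quadratic loss) plus a flow deviation (saturating loss).
losses = total_scenario_loss(
    [5.0, 12.0],
    [lambda d: quadratic_loss(d, k=2.0e3),
     lambda d: inverted_normal_loss(d, max_loss=1.0e5, shape=8.0)],
)
print(f"Total economic loss for the scenario: ${losses:,.0f}")
```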

  1. DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Young, K. R.; Augustine, C.; Anderson, A.

    2010-02-01

    This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work, primarily for enhanced geothermal systems (EGS), sponsored by the U.S. Department of Energy Geothermal Technologies Program.

  2. METHODOLOGICAL ANALYSIS OF TRAINING STUDENT basketball teams

    Directory of Open Access Journals (Sweden)

    Kozina Zh.L.

    2011-06-01

    The leading principles of preparing student basketball teams in higher education institutions are considered. The system includes the following: reliance on top-quality players in the structure of preparedness; widespread use of visual aids, including instructional films and animations showing how professional basketball players execute various techniques; and the application of autogenic and ideomotor training according to our methodology. The study involved 63 students (years 1-5, sport categories 1-2) from various universities of Kharkov: 32 in the experimental group and 31 in the control group. The developed system of training student basketball players was used for one year. The efficiency of the developed system in the training process of student basketball players is demonstrated.

  3. Integrated Methodology for Software Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2012-01-01

    The most widely used techniques for ensuring system safety and reliability are applied together as a whole, and in most cases the software components are overlooked or too little analyzed. The present paper describes the applicability of fault tree analysis to software systems, an analysis defined as Software Fault Tree Analysis (SFTA); the fault trees are evaluated using binary decision diagrams, all of which is integrated and used with the help of a Java reliability library.

  4. Recurrence interval analysis of trading volumes.

    Science.gov (United States)

    Ren, Fei; Zhou, Wei-Xing

    2010-06-01

    We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
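    A minimal sketch of the basic measurement described above: recurrence intervals between volumes exceeding a threshold q chosen as an empirical quantile. The synthetic lognormal series stands in for real trading-volume data, and the quantile choice is illustrative.

```python
import numpy as np

def recurrence_intervals(volumes, quantile=0.95):
    """Recurrence intervals between trading volumes exceeding a threshold.

    The threshold q is an empirical quantile of the volume series; an interval
    is the number of time steps between successive exceedances.
    """
    v = np.asarray(volumes, dtype=float)
    q = np.quantile(v, quantile)
    exceed_times = np.flatnonzero(v > q)
    return np.diff(exceed_times)

# Synthetic heavy-tailed "volume" series just to exercise the function.
rng = np.random.default_rng(42)
vol = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)
tau = recurrence_intervals(vol, 0.95)
print(f"mean interval: {tau.mean():.1f}, max interval: {tau.max()}")
```

    The empirical distribution of these intervals is what the paper tests for power-law tails and memory effects.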

  5. SINGULAR SPECTRUM ANALYSIS: METHODOLOGY AND APPLICATION TO ECONOMICS DATA

    Institute of Scientific and Technical Information of China (English)

    Hossein HASSANI; Anatoly ZHIGLJAVSKY

    2009-01-01

    This paper describes the methodology of singular spectrum analysis (SSA) and demonstrates that it is a powerful method of time series analysis and forecasting, particularly for economic time series. The authors consider the application of SSA to the analysis and forecasting of the Iranian national accounts data as provided by the Central Bank of the Islamic Republic of Iran.
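    As a compact illustration of the SSA machinery (embedding, singular value decomposition, and diagonal averaging), the sketch below reconstructs a noisy series from its leading components; the window length and component count are arbitrary choices, not those used for the Iranian data.

```python
import numpy as np

def ssa_reconstruct(series, window, n_components):
    """Reconstruct a time series from its leading SSA components.

    series       : 1D array of observations.
    window       : embedding window length L (1 < L < len(series)).
    n_components : number of leading singular triples to keep.
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: columns are lagged windows of the series.
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    # Diagonal (Hankel) averaging maps the matrix back to a series.
    recon = np.zeros(n)
    counts = np.zeros(n)
    for col in range(k):
        recon[col:col + window] += approx[:, col]
        counts[col:col + window] += 1
    return recon / counts

t = np.linspace(0, 8 * np.pi, 400)
noisy = np.sin(t) + 0.3 * np.random.default_rng(1).normal(size=t.size)
smooth = ssa_reconstruct(noisy, window=60, n_components=2)
print(np.round(smooth[:5], 3))
```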

  6. Finite element analysis applied to dentoalveolar trauma: methodology description.

    Science.gov (United States)

    da Silva, B R; Moreira Neto, J J S; da Silva, F I; de Aguiar, A S W

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated.

  7. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  8. Mass Spectrometry Methodology in Lipid Analysis

    OpenAIRE

    Lin Li; Juanjuan Han; Zhenpeng Wang; Jian'an Liu; Jinchao Wei; Shaoxiang Xiong; Zhenwen Zhao

    2014-01-01

    Lipidomics is an emerging field, where the structures, functions and dynamic changes of lipids in cells, tissues or body fluids are investigated. Due to the vital roles of lipids in human physiological and pathological processes, lipidomics is attracting more and more attentions. However, because of the diversity and complexity of lipids, lipid analysis is still full of challenges. The recent development of methods for lipid extraction and analysis and the combination with bioinformatics tech...

  9. PROBABILISTIC METHODOLOGY OF LOW CYCLE FATIGUE ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    Jin Hui; Wang Jinnuo; Wang Libin

    2003-01-01

    The cyclic stress-strain response (CSSR), Neuber's rule (NR) and the cyclic strain-life relation (CSLR) are treated as probabilistic curves in the local stress-strain method of low cycle fatigue analysis. The randomness of loading and the theory of fatigue damage accumulation (TOFDA) are considered. The probabilistic analysis of local stress, local strain and fatigue life is constructed based on first-order Taylor series expansions. Through this proposed method, fatigue reliability analysis can be accomplished.
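    A minimal sketch of the deterministic core this probabilistic method builds on: a Coffin-Manson-type strain-life relation solved for life by bisection, with a first-order (Taylor) propagation of strain-amplitude uncertainty. The material constants are illustrative steel-like values, not data from the paper.

```python
import numpy as np

def strain_amplitude(n_cycles, e_mod, sf, b, ef, c):
    """Coffin-Manson strain-life relation: elastic plus plastic strain amplitude."""
    return (sf / e_mod) * (2 * n_cycles) ** b + ef * (2 * n_cycles) ** c

def fatigue_life(target_strain, e_mod, sf, b, ef, c, lo=1.0, hi=1e9):
    """Solve the strain-life equation for cycles to failure by bisection in log space."""
    for _ in range(200):
        mid = np.sqrt(lo * hi)
        # strain_amplitude decreases with life, so move toward the target from above.
        if strain_amplitude(mid, e_mod, sf, b, ef, c) > target_strain:
            lo = mid
        else:
            hi = mid
    return mid

# Illustrative steel-like parameters (stresses in MPa, exponents dimensionless).
params = dict(e_mod=200_000.0, sf=900.0, b=-0.09, ef=0.6, c=-0.6)
n_mean = fatigue_life(0.004, **params)

# First-order (Taylor) propagation of strain-amplitude uncertainty into life.
eps = 1e-5
dn_de = (fatigue_life(0.004 + eps, **params) - n_mean) / eps
sigma_strain = 0.0003
print(f"life ~ {n_mean:.0f} cycles, std ~ {abs(dn_de) * sigma_strain:.0f} cycles")
```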

  10. A New Methodology of Spatial Crosscorrelation Analysis

    CERN Document Server

    Chen, Yanguang

    2015-01-01

    The idea of spatial crosscorrelation was conceived of long ago. However, unlike the related spatial autocorrelation, the theory and method of spatial crosscorrelation analysis have remained undeveloped. This paper presents a set of models and working methods for spatial crosscorrelation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form and by means of mathematical reasoning, I derive a theoretical framework for geographical crosscorrelation analysis. First, two sets of spatial crosscorrelation coefficients are defined, including a global spatial crosscorrelation coefficient and a set of local spatial crosscorrelation coefficients. Second, a pair of scatterplots of spatial crosscorrelation is proposed, and different scatterplots show different relationships between correlated variables. Based on the spatial crosscorrelation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial crosscorrelation) and indirect correlation (sp...
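    The sketch below shows one plausible reading of a global spatial crosscorrelation coefficient in quadratic form, by analogy with Moran's index: standardized values of two variables coupled through a normalized spatial weights matrix. It is an illustrative construction, not necessarily the exact estimator defined in the paper.

```python
import numpy as np

def global_spatial_crosscorrelation(x, y, weights):
    """Global spatial crosscorrelation coefficient in quadratic form.

    x, y    : values of two variables observed at the same n locations.
    weights : n x n spatial weights matrix (zero diagonal); it is normalized
              so that its entries sum to 1 before forming the quadratic form.
    """
    zx = (np.asarray(x, float) - np.mean(x)) / np.std(x)
    zy = (np.asarray(y, float) - np.mean(y)) / np.std(y)
    w = np.asarray(weights, float)
    w = w / w.sum()
    return float(zx @ w @ zy)

# Four locations on a line, rook adjacency, positively co-varying variables.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(global_spatial_crosscorrelation([1, 2, 3, 4], [2, 4, 5, 9], W))
```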

  11. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  12. Bifilar analysis study, volume 1

    Science.gov (United States)

    Miao, W.; Mouzakis, T.

    1980-01-01

    A coupled rotor/bifilar/airframe analysis was developed and utilized to study the dynamic characteristics of the centrifugally tuned, rotor-hub-mounted, bifilar vibration absorber. The analysis contains the major components that impact the bifilar absorber performance, namely, an elastic rotor with hover aerodynamics, a flexible fuselage, and nonlinear individual degrees of freedom for each bifilar mass. Airspeed, rotor speed, bifilar mass and tuning variations are considered. The performance of the bifilar absorber is shown to be a function of its basic parameters: dynamic mass, damping and tuning, as well as the impedance of the rotor hub. The effect of the dissimilar responses of the individual bifilar masses which are caused by tolerance induced mass, damping and tuning variations is also examined.

  13. Seismic hazard methodology for the Central and Eastern United States: Volume 1: Part 2, Methodology (Revision 1): Final report

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, R.K.; Veneziano, D.; Van Dyck, J.; Toro, G.; O'Hara, T.; Drake, L.; Patwardhan, A.; Kulkarni, R.; Keeney, R.; Winkler, R.

    1988-11-01

    Aided by its consultant, the US Geological Survey (USGS), the Nuclear Regulatory Commission (NRC) reviewed "Seismic Hazard Methodology for the Central and Eastern United States." This topical report was submitted jointly by the Seismicity Owners Group (SOG) and the Electric Power Research Institute (EPRI) in July 1986 and was revised in February 1987. The NRC staff concludes that the SOG/EPRI Seismic Hazard Methodology, as documented in the topical report and associated submittals, is an acceptable methodology for use in calculating seismic hazard in the Central and Eastern United States (CEUS). These calculations will be based upon the data and information documented in the material that was submitted as the SOG/EPRI topical report and ancillary submittals. However, as part of the review process, the staff conditions its approval by noting areas in which problems may arise unless precautions detailed in the report are observed. 23 refs.

  14. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  15. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    Full Text Available This theoretical essay aims to reflect on three models of text interpretation used in qualitative research, which are often confused in their concepts and methodologies (Content Analysis, Discourse Analysis, and Conversation Analysis). After presenting the concepts, the essay proposes a preliminary discussion of the conceptual and theoretical methodological differences perceived between them. A review of the literature was performed to support the conceptual and theoretical methodological discussion. It was verified that the models differ in the type of strategy used in the treatment of texts, the type of approach, and the appropriate theoretical position.

  16. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics applications, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
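
    As a rough illustration of the parameter screening step described above, the sketch below computes Morris-style elementary effects for a toy model; the model function, parameter ranges and sample counts are illustrative assumptions, not taken from the report.

        import numpy as np

        # Hypothetical model: replace with the simulation response of interest.
        def model(x):
            return x[0] ** 2 + 3.0 * x[1] + 0.1 * x[2] * x[1]

        rng = np.random.default_rng(0)
        n_params, n_trajectories, delta = 3, 20, 0.2
        effects = [[] for _ in range(n_params)]

        for _ in range(n_trajectories):
            x = rng.uniform(0.0, 1.0 - delta, size=n_params)  # base point in the unit cube
            for i in range(n_params):
                x_pert = x.copy()
                x_pert[i] += delta                            # perturb one parameter at a time
                effects[i].append((model(x_pert) - model(x)) / delta)

        # mu* (mean absolute elementary effect) ranks parameters for the screening step.
        for i, e in enumerate(effects):
            print(f"parameter {i}: mu* = {np.mean(np.abs(e)):.3f}")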

  17. Financial constraints in capacity planning: a national utility regulatory model (NUREG). Volume I of III: methodology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1981-10-29

    This report develops and demonstrates the methodology for the National Utility Regulatory (NUREG) Model developed under contract number DEAC-01-79EI-10579. It is accompanied by two supporting volumes. Volume II is a user's guide for operation of the NUREG software. This includes description of the flow of software and data, as well as the formats of all user data files. Finally, Volume III is a software description guide. It briefly describes, and gives a listing of, each program used in NUREG.

  18. Analysis of automated highway system risks and uncertainties. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.
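
    The following sketch illustrates, under assumed numbers, how a subjective percentile assessment (here a hypothetical 10th/90th percentile pair for a cost factor) can be converted into a probability distribution of the kind used to propagate cost/benefit uncertainty; the lognormal form and the values are placeholders, not data from the study.

        import numpy as np
        from scipy.stats import norm, lognorm

        # Hypothetical expert assessment for a cost factor: 10th and 90th percentiles.
        p10, p90 = 800.0, 2500.0   # e.g. dollars per vehicle, illustrative only

        # For a lognormal variable, ln(X) is normal; solve for its mean and sd.
        z10, z90 = norm.ppf(0.10), norm.ppf(0.90)
        sigma = (np.log(p90) - np.log(p10)) / (z90 - z10)
        mu = np.log(p10) - sigma * z10

        dist = lognorm(s=sigma, scale=np.exp(mu))
        print("median:", dist.median())
        print("check p10, p90:", dist.ppf([0.10, 0.90]))

        # Samples like these would feed the cost/benefit uncertainty propagation.
        samples = dist.rvs(size=10000, random_state=1)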

  19. Active disturbance rejection control: methodology and theoretical analysis.

    Science.gov (United States)

    Huang, Yi; Xue, Wenchao

    2014-07-01

    The methodology of ADRC and the progress of its theoretical analysis are reviewed in the paper. Several breakthroughs for control of nonlinear uncertain systems, made possible by ADRC, are discussed. The key in employing ADRC, which is to accurately determine the "total disturbance" that affects the output of the system, is illuminated. The latest results in theoretical analysis of the ADRC-based control systems are introduced.
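
    A minimal linear ADRC sketch is given below, assuming a second-order plant, a third-order extended state observer and illustrative bandwidths; it is not the specific formulation reviewed in the paper, only a generic example of estimating and cancelling the "total disturbance".

        import numpy as np

        # Linear ADRC sketch for a second-order plant y'' = f(y, y', t) + b*u,
        # where f lumps unknown dynamics and external disturbance ("total disturbance").
        # Plant, gains and bandwidths below are illustrative assumptions.
        dt, T = 1e-3, 5.0
        b0 = 2.0                      # rough estimate of the input gain b
        wo, wc = 40.0, 8.0            # observer and controller bandwidths
        l1, l2, l3 = 3 * wo, 3 * wo**2, wo**3    # ESO gains
        kp, kd = wc**2, 2 * wc                   # PD gains on the estimated states

        x = np.zeros(2)               # true plant state [y, y']
        z = np.zeros(3)               # ESO state [y_hat, y'_hat, f_hat]
        r, u = 1.0, 0.0               # set point and control input

        for k in range(int(T / dt)):
            t = k * dt
            # Plant with unknown dynamics and an external disturbance after t = 2 s.
            f = -1.5 * x[1] - 5.0 * x[0] + (3.0 if t > 2.0 else 0.0)
            x += dt * np.array([x[1], f + b0 * u])
            y = x[0]

            # Extended state observer: estimates y, y' and the total disturbance f.
            e = z[0] - y
            z += dt * np.array([z[1] - l1 * e,
                                z[2] + b0 * u - l2 * e,
                                -l3 * e])

            # Control law: cancel the estimated disturbance, then apply PD control.
            u = (kp * (r - z[0]) - kd * z[1] - z[2]) / b0

        print("final output ~", round(x[0], 3))   # should settle near the set point r = 1.0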

  20. Estimates of emergency operating capacity in US manufacturing and nonmanufacturing industries - Volume 1: Concepts and Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, D.B. (Pacific Northwest Lab., Richland, WA (USA)); Serot, D.E. (D/E/S Research, Richland, WA (USA)); Kellogg, M.A. (ERCE, Inc., Portland, OR (USA))

    1991-03-01

    Development of integrated mobilization preparedness policies requires planning estimates of available productive capacity during national emergency conditions. Such estimates must be developed in a manner to allow evaluation of current trends in capacity and the consideration of uncertainties in various data inputs and in engineering assumptions. This study developed estimates of emergency operating capacity (EOC) for 446 manufacturing industries at the 4-digit Standard Industrial Classification (SIC) level of aggregation and for 24 key nonmanufacturing sectors. This volume lays out the general concepts and methods used to develop the emergency operating estimates. The historical analysis of capacity extends from 1974 through 1986. Some nonmanufacturing industries are included. In addition to mining and utilities, key industries in transportation, communication, and services were analyzed. Physical capacity and efficiency of production were measured. 3 refs., 2 figs., 12 tabs. (JF)

  1. A Case Study on Methodological Approaches to Conversation Analysis

    Institute of Scientific and Technical Information of China (English)

    廉莲

    2014-01-01

    Taking a piece of social interaction as the object of the study, some basic and brief analysis of how meaning is negotiated is offered from both structural and functional perspectives. The potential purpose is to provide readers with a rough but clear presentation of the assorted methods used in conversation analysis. Out of this presentation develops a possibility for language learners as well as teachers to become more aware of the differences, and also the interrelations, among these methodological approaches to conversation analysis, which may be of some relevance to teaching practice.

  2. Transfinite element methodology towards a unified thermal/structural analysis

    Science.gov (United States)

    Tamma, K. K.; Railkar, S. B.

    1986-01-01

    The paper describes computational developments towards thermal/structural modeling and analysis via a generalized common numerical methodology for effectively and efficiently interfacing interdisciplinary areas. The proposed formulations use transform methods in conjunction with finite element developments for each of the heat transfer and structural disciplines, respectively, providing avenues for obtaining the structural response due to thermal effects. An alternative methodology for unified thermal/structural analysis is presented. The potential of the approach is outlined in comparison with conventional schemes and existing practices. Highlights and characteristic features of the approach are described via general formulations and applications to several problems. Results obtained demonstrate excellent agreement with analytic solutions and/or conventional finite element schemes, while being obtained accurately and efficiently.

  3. Latest developments on safety analysis methodologies at the Juzbado plant

    Energy Technology Data Exchange (ETDEWEB)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A. [ENUSA Industrias Avanzadas S. A., Juzbado Nuclear Fuel Fabrication Plant, Ctra. Salamanca-Ledesma, km. 26, 37015 Juzbado, Salamanca (Spain)

    2010-07-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, namely the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, such as Regulator requirements, operational experience and, of course, the strong commitment of ENUSA to maintain the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, and within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated earlier in 2009 to assess the actual operating conditions of all the systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points related to these three methodologies as well as an outline of the results obtained so far. (authors)

  4. Event-scale power law recession analysis: quantifying methodological uncertainty

    Science.gov (United States)

    Dralle, David N.; Karst, Nathaniel J.; Charalampous, Kyriakos; Veenstra, Andrew; Thompson, Sally E.

    2017-01-01

    The study of single streamflow recession events is receiving increasing attention following the presentation of novel theoretical explanations for the emergence of power law forms of the recession relationship, and drivers of its variability. Individually characterizing streamflow recessions often involves describing the similarities and differences between model parameters fitted to each recession time series. Significant methodological sensitivity has been identified in the fitting and parameterization of models that describe populations of many recessions, but the dependence of estimated model parameters on methodological choices has not been evaluated for event-by-event forms of analysis. Here, we use daily streamflow data from 16 catchments in northern California and southern Oregon to investigate how combinations of commonly used streamflow recession definitions and fitting techniques impact parameter estimates of a widely used power law recession model. Results are relevant to watersheds that are relatively steep, forested, and rain-dominated. The highly seasonal mediterranean climate of northern California and southern Oregon ensures study catchments explore a wide range of recession behaviors and wetness states, ideal for a sensitivity analysis. In such catchments, we show the following: (i) methodological decisions, including ones that have received little attention in the literature, can impact parameter value estimates and model goodness of fit; (ii) the central tendencies of event-scale recession parameter probability distributions are largely robust to methodological choices, in the sense that differing methods rank catchments similarly according to the medians of these distributions; (iii) recession parameter distributions are method-dependent, but roughly catchment-independent, such that changing the choices made about a particular method affects a given parameter in similar ways across most catchments; and (iv) the observed correlative relationship

  5. Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description

    OpenAIRE

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar traum...

  6. Geolocation Via Satellite: A Methodology and Error Analysis

    Science.gov (United States)

    1988-05-01

    Personal author: M.J. Shensa. The remainder of the source abstract consists of unrecoverable report-documentation-page and equation fragments.

  7. Relationship between stroke volume and pulse pressure during blood volume perturbation: a mathematical analysis.

    Science.gov (United States)

    Bighamian, Ramin; Hahn, Jin-Oh

    2014-01-01

    Arterial pulse pressure has been widely used as a surrogate of stroke volume, for example, in the guidance of fluid therapy. However, recent experimental investigations suggest that arterial pulse pressure is not linearly proportional to stroke volume, and the mechanisms underlying the relation between the two have not been clearly understood. The goal of this study was to elucidate how arterial pulse pressure and stroke volume respond to a perturbation in the left ventricular blood volume based on a systematic mathematical analysis. Both our mathematical analysis and experimental data showed that the relative change in arterial pulse pressure due to a left ventricular blood volume perturbation was consistently smaller than the corresponding relative change in stroke volume, due to the nonlinear left ventricular pressure-volume relation during diastole, which reduces the sensitivity of arterial pulse pressure to perturbations in the left ventricular blood volume. Therefore, arterial pulse pressure must be used with care when used as a surrogate of stroke volume in guiding fluid therapy.

  8. Screening Analysis : Volume 1, Description and Conclusions.

    Energy Technology Data Exchange (ETDEWEB)

    Bonneville Power Administration; Corps of Engineers; Bureau of Reclamation

    1992-08-01

    The SOR consists of three analytical phases leading to a Draft EIS. The first phase, Pilot Analysis, was performed for the purpose of testing the decision analysis methodology being used in the SOR. The Pilot Analysis is described later in this chapter. The second phase, Screening Analysis, examines all possible operating alternatives using a simplified analytical approach. It is described in detail in this and the next chapter. This document also presents the results of screening. The final phase, Full-Scale Analysis, will be documented in the Draft EIS and is intended to evaluate comprehensively the few, best alternatives arising from the screening analysis. The purpose of screening is to analyze a wide variety of differing ways of operating the Columbia River system to test the reaction of the system to change. The many alternatives considered reflect the range of needs and requirements of the various river users and interests in the Columbia River Basin. While some of the alternatives might be viewed as extreme, the information gained from the analysis is useful in highlighting issues and conflicts in meeting operating objectives. Screening is also intended to develop a broad technical basis for evaluation, including regional experts, and to begin developing an evaluation capability for each river use that will support full-scale analysis. Finally, screening provides a logical method for examining all possible options and reaching a decision on a few alternatives worthy of full-scale analysis. An organizational structure was developed and staffed to manage and execute the SOR, specifically during the screening phase and the upcoming full-scale analysis phase. The organization involves ten technical work groups, each representing a particular river use. Several other groups exist to oversee or support the efforts of the work groups.

  9. PANSYSTEMS ANALYSIS: MATHEMATICS, METHODOLOGY,RELATIVITY AND DIALECTICAL THINKING

    Institute of Scientific and Technical Information of China (English)

    郭定和; 吴学谋; 冯向军; 李永礼

    2001-01-01

    Based on new analysis modes and new definitions with relatively universal mathematization and simplification or strengthening forms for concepts of generalized systems, panderivatives, pansymmetry, panbox principle, pansystems relativity, etc., the framework and related principles of pansystems methodology and pansystems relativity are developed. Related contents include: pansystems with relatively universal mathematizing forms, 200 types of dualities, duality transformation, pansymmetry transformation, pansystems dialectics, the 8-domain method, pansystems mathematical methods, generalized quantification, the principles of approximation-transforming, pan-equivalence theorems, supply-demand analysis, thinking experiment, generalized gray systems, etc.

  10. Simplifying multivariate survival analysis using global score test methodology

    Science.gov (United States)

    Zain, Zakiyah; Aziz, Nazrina; Ahmad, Yuhaniz

    2015-12-01

    In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve multiple endpoints, and this situation further complicates the analysis of survival data. In the case of tumor patients, endpoints concerning survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For each patient, these endpoints are correlated, and the estimation of the correlation between two score statistics is fundamental in derivation of overall treatment advantage. In this paper, the bivariate survival analysis method using the global score test methodology is extended to multivariate setting.
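
    A minimal numerical sketch of a global score test is shown below: standardized endpoint score statistics are combined through their estimated correlation matrix into a single global statistic. The statistics and correlations are made-up placeholders, not results from the paper.

        import numpy as np
        from scipy.stats import chi2

        # Hypothetical standardized score statistics for four survival endpoints
        # (times to 1st, 2nd, 3rd recurrence and to death), treatment vs. control.
        u = np.array([1.9, 1.4, 1.1, 0.8])

        # Estimated correlation matrix between the score statistics (assumed values).
        R = np.array([[1.0, 0.6, 0.5, 0.3],
                      [0.6, 1.0, 0.6, 0.3],
                      [0.5, 0.6, 1.0, 0.4],
                      [0.3, 0.3, 0.4, 1.0]])

        # Global score statistic: quadratic form u' R^{-1} u, chi-square with 4 df
        # under the null hypothesis of no overall treatment effect.
        q = u @ np.linalg.solve(R, u)
        p_value = chi2.sf(q, df=len(u))
        print(f"global statistic = {q:.2f}, p = {p_value:.3f}")

        # A one-directional alternative: equally weighted combination of the scores.
        z_global = u.sum() / np.sqrt(np.ones(4) @ R @ np.ones(4))
        print(f"one-sided global z = {z_global:.2f}")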

  11. Development of test methodology for dynamic mechanical analysis instrumentation

    Science.gov (United States)

    Allen, V. R.

    1982-08-01

    Dynamic mechanical analysis instrumentation was used for the development of specific test methodology for the determination of engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperatures in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiber-glass supported resin. The attempted correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system gave a negative result because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

  12. Registering a methodology for imaging and analysis of residual-limb shape after transtibial amputation

    Directory of Open Access Journals (Sweden)

    Alexander S. Dickinson, PhD

    2016-03-01

    Full Text Available Successful prosthetic rehabilitation following lower-limb amputation depends upon a safe and comfortable socket-residual limb interface. Current practice predominantly uses a subjective, iterative process to establish socket shape, often requiring several visits to a prosthetist. This study proposes an objective methodology for residual-limb shape scanning and analysis by high-resolution, automated measurements. A 3-D printed "analog" residuum was scanned with three surface digitizers on 10 occasions. Accuracy was measured by the scan-height error between repeat analog scans and the computer-aided design (CAD geometry and the scan versus CAD volume. Subsequently, 20 male residuum casts from ambulatory individuals with transtibial amputation were scanned by two observers, and 10 were repeat-scanned by one observer. The shape files were aligned spatially, and geometric measurements were extracted. Repeatability was evaluated by intraclass correlation, Bland-Altman analysis of scan volumes, and pairwise root-mean-square error ranges of scan area and width profiles. Submillimeter accuracy was achieved when scanning the analog shape using white light and laser scanning technologies. Scanning male residuum casts was highly repeatable within and between observers. The analysis methodology technique provides clinical researchers and prosthetists the capability to establish their own quantitative, objective, multipatient datasets. This could provide an evidence base for training, long-term follow-up, and interpatient outcome comparison, for decision support in socket design.
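
    The repeatability assessment described above can be illustrated with a short Bland-Altman and intraclass-correlation sketch; the scan volumes below are fabricated placeholders, not data from the study.

        import numpy as np

        # Hypothetical residuum scan volumes (cm^3) from two observers, same 10 casts.
        obs1 = np.array([812, 905, 778, 1010, 860, 950, 890, 820, 975, 840], float)
        obs2 = np.array([818, 897, 785, 1003, 866, 942, 896, 826, 969, 848], float)

        # Bland-Altman: mean difference (bias) and 95% limits of agreement.
        diff = obs1 - obs2
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)
        print(f"bias = {bias:.1f} cm^3, limits of agreement = [{bias - loa:.1f}, {bias + loa:.1f}]")

        # Simple one-way intraclass correlation from variance components (k = 2 raters).
        msb = 2 * ((obs1 + obs2) / 2).var(ddof=1)      # between-subject mean square
        msw = ((obs1 - obs2) ** 2 / 2).mean()          # within-subject mean square
        icc = (msb - msw) / (msb + msw)
        print(f"approximate ICC = {icc:.3f}")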

  13. Loss of Coolant Accident Analysis Methodology for SMART-P

    Energy Technology Data Exchange (ETDEWEB)

    Bae, K. H.; Lee, G. H.; Yang, S. H.; Yoon, H. Y.; Kim, S. H.; Kim, H. C

    2006-02-15

    The analysis methodology for loss-of-coolant accidents (LOCAs) in SMART-P is described in this report. SMART-P is an advanced integral-type PWR producing a maximum thermal power of 65.5 MW with metallic fuel. LOCAs are hypothetical accidents that would result from the loss of reactor coolant, at a rate in excess of the capability of the reactor coolant makeup system, from breaks in pipes in the reactor coolant pressure boundary up to and including a break equivalent in size to the double-ended rupture of the largest pipe in the reactor coolant system. Since SMART-P contains the major primary circuit components in a single Reactor Pressure Vessel (RPV), the possibility of a large break LOCA (LBLOCA) is inherently eliminated and only the small break LOCA is postulated. This report describes the outline and acceptance criteria of the small break LOCA (SBLOCA) for SMART-P and documents the conservative analytical model and method and the analysis results using the TASS/SMR code. This analysis method is applied in the SBLOCA analysis performed for the ECCS performance evaluation, which is described in section 6.3.3 of the safety analysis report. The predictions of the SMART-P SBLOCA analysis model for the break flow, system pressure and temperature distributions, reactor coolant inventory distribution, single- and two-phase natural circulation phenomena, and the timing of the major sequence of events should be compared against and verified with the applicable separate- and integral-effects test results. It is also required to set up feasible acceptance criteria applicable to the metallic-fueled integral reactor SMART-P. The analysis methodology for the SBLOCA described in this report will be further developed and validated as the design and licensing status of SMART-P evolves.

  14. Evaluating some Reliability Analysis Methodologies in Seismic Design

    Directory of Open Access Journals (Sweden)

    A. E. Ghoulbzouri

    2011-01-01

    Full Text Available Problem statement: This study accounts for uncertainties present in the geometric and material data of reinforced concrete buildings within the context of performance-based seismic engineering design. Approach: Reliability of the expected performance state is assessed using various methodologies based on finite element nonlinear static pushover analysis and a specialized reliability software package. The reliability approaches considered include full coupling with an external finite element code and response-surface-based methods in conjunction with either the first-order reliability method or the importance sampling method. Various types of probability distribution functions modelling parameter uncertainties were introduced. Results: The probability of failure was obtained as a function of the reliability analysis method used and of the selected probability distributions. A convergence analysis of the importance sampling method was performed, and the required analysis duration was evaluated as a function of the reliability method used. Conclusion/Recommendations: Reliability results were found to be sensitive to the reliability analysis method and to the selected probability distributions. Analysis durations for coupled methods were higher than those associated with response-surface-based methods, although the time needed to derive the response surfaces should also be included. For the reinforced concrete building considered in this study, significant variations were found between all the considered reliability methodologies. The fully coupled importance sampling method is recommended, but the first-order reliability method applied to a response surface model can be used with good accuracy. Finally, the probability distributions should be carefully identified, since specifying only the mean and the standard deviation was found to be insufficient.
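
    As a hedged illustration of the importance sampling step, the sketch below estimates a failure probability for a toy linear limit-state function with two standard normal variables; the limit state, the assumed design point and the sample size are illustrative only and do not correspond to the building studied in the paper.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(42)

        # Toy limit-state function: g(x) < 0 means failure; it stands in for the
        # pushover-based performance check (capacity minus demand).
        def g(x):
            return 3.0 - x[..., 0] - 0.5 * x[..., 1]

        # Two standard normal basic variables (e.g. normalized material/geometric data).
        n = 20000
        design_point = np.array([2.4, 1.2])   # assumed region of high failure density

        # Importance sampling density: standard normals shifted to the design point.
        x = rng.standard_normal((n, 2)) + design_point
        weights = (norm.pdf(x).prod(axis=1) /
                   norm.pdf(x - design_point).prod(axis=1))   # p(x) / q(x)

        pf = np.mean((g(x) < 0) * weights)
        print(f"estimated failure probability = {pf:.2e}")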

  15. Volume totalizers analysis of pipelines operated by TRANSPETRO National Operational Control Center; Analise de totalizadores de volume em oleodutos operados pelo Centro Nacional de Controle e Operacao da TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Aramaki, Thiago Lessa; Montalvao, Antonio Filipe Falcao [Petrobras Transporte S.A. (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Marques, Thais Carrijo [Universidade do Estado do Rio de Janeiro (UERJ), RJ (Brazil)

    2012-07-01

    This paper presents the methodology and results of the analysis of differences in volume totalizers used in systems such as batch tracking and leak detection for pipelines operated by the National Center for Operational Control (CNCO) at TRANSPETRO. In order to optimize this type of analysis, software was developed for the acquisition and processing of historical data using the methodology developed. The methodology takes into account the particularities encountered in the systems operated by TRANSPETRO and, more specifically, by CNCO. (author)

  16. Sensitivity Analysis of Entropy Generation in Nanofluid Flow inside a Channel by Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Bijan Darbari

    2016-02-01

    Full Text Available Nanofluids can afford excellent thermal performance and have a major role in energy conservation. In this paper, a sensitivity analysis has been performed using response surface methodology to calculate the effects of nanoparticles on entropy generation. For this purpose, the laminar forced convection of Al2O3-water nanofluid flow inside a channel is considered. The total entropy generation rate, consisting of the entropy generation rates due to heat transfer and friction loss, is calculated using the velocity and temperature gradients. The continuity, momentum and energy equations have been solved numerically using a finite volume method. The sensitivity of the entropy generation rate to different parameters such as the solid volume fraction, the particle diameter, and the Reynolds number is studied in detail. Series of simulations were performed for a range of solid volume fraction 0 ≤ ϕ ≤ 0.05, particle diameter 30 nm ≤ dp ≤ 90 nm, and Reynolds number 200 ≤ Re ≤ 800. The results showed that the total entropy generation is more sensitive to the Reynolds number than to the nanoparticle diameter or solid volume fraction. Also, the magnitude of total entropy generation, which increases with increasing Reynolds number, is much higher for the pure fluid than for the nanofluid.
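
    A small numerical sketch of the response-surface step follows: a synthetic entropy-generation response stands in for the CFD results, a full quadratic surface is fitted by least squares, and the coded linear coefficients give a sensitivity ranking. All numbers are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(3)

        # Coded design variables in [-1, 1]: solid volume fraction, particle diameter, Reynolds number.
        X = rng.uniform(-1, 1, size=(30, 3))

        # Synthetic "total entropy generation" response standing in for the CFD results.
        y = (5.0 + 0.3 * X[:, 0] - 0.2 * X[:, 1] + 1.8 * X[:, 2]
             + 0.4 * X[:, 2] ** 2 + 0.05 * rng.standard_normal(30))

        # Full quadratic response surface: intercept, linear, squared and interaction terms.
        def quad_terms(X):
            x1, x2, x3 = X.T
            return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                    x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

        beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)

        # With coded (+/-1) factors, the linear coefficients provide a sensitivity ranking.
        for name, b in zip(["phi", "d_p", "Re"], beta[1:4]):
            print(f"sensitivity to {name}: {b:+.3f}")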

  17. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Full Text Available Many cost overruns in the world of construction are attributable either to unforeseen events or to foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement in project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for the appropriate allocation of contingency in project cost estimation. In the first step, project risks are identified. An influence diagramming technique is employed to identify and to show how the risks affect the project cost elements, as well as the relationships among the risks themselves. The second step is to assess the project costs with regard to the risks under consideration. Using a linguistic approach, the degree of uncertainty of the identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. In the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingencies, i.e. project contingency and management reserve, are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.
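
    The contingency allocation idea can be illustrated with a simple Monte Carlo sketch in which cost elements follow assumed triangular distributions and the project contingency and management reserve are read off chosen percentiles of the simulated total; the items, ranges and percentile split are placeholders, not the paper's data or rule.

        import numpy as np

        rng = np.random.default_rng(7)

        # Hypothetical cost elements: (most likely, low, high) in currency units,
        # standing in for the linguistically assessed risk ranges in the paper.
        items = {"earthworks": (120, 110, 160),
                 "structure":  (300, 280, 380),
                 "finishes":   (150, 140, 200)}

        n = 50000
        total = np.zeros(n)
        for ml, lo, hi in items.values():
            total += rng.triangular(lo, ml, hi, size=n)

        base = sum(ml for ml, _, _ in items.values())
        p80, p95 = np.percentile(total, [80, 95])

        # Project contingency covers up to a chosen confidence level; management
        # reserve covers the tail beyond it (illustrative split only).
        print(f"base estimate       : {base:.0f}")
        print(f"project contingency : {p80 - base:.0f}")
        print(f"management reserve  : {p95 - p80:.0f}")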

  18. Contemporary Impact Analysis Methodology for Planetary Sample Return Missions

    Science.gov (United States)

    Perino, Scott V.; Bayandor, Javid; Samareh, Jamshid A.; Armand, Sasan C.

    2015-01-01

    The development of an Earth entry vehicle and the methodology created to evaluate the vehicle's impact landing response when returning to Earth are reported. NASA's future Mars Sample Return Mission requires a robust vehicle to return Martian samples back to Earth for analysis. The Earth entry vehicle is a proposed solution to this Mars mission requirement. During Earth reentry, the vehicle slows within the atmosphere and then impacts the ground at its terminal velocity. To protect the Martian samples, a spherical energy absorber called an impact sphere is under development. The impact sphere is composed of hybrid composite and crushable foam elements that endure large plastic deformations during impact and cause a highly nonlinear vehicle response. The developed analysis methodology captures a range of complex structural interactions and much of the failure physics that occurs during impact. Numerical models were created and benchmarked against experimental tests conducted at NASA Langley Research Center. The postimpact structural damage assessment showed close correlation between simulation predictions and experimental results. Acceleration, velocity, displacement, damage modes, and failure mechanisms were all effectively captured. These investigations demonstrate that the Earth entry vehicle has great potential in facilitating future sample return missions.

  19. Segment clustering methodology for unsupervised Holter recordings analysis

    Science.gov (United States)

    Rodríguez-Sotelo, Jose Luis; Peluffo-Ordoñez, Diego; Castellanos Dominguez, German

    2015-01-01

    Cardiac arrhythmia analysis on Holter recordings is an important issue in clinical settings; however, this task implicitly involves dealing with other problems related to the large amount of unlabelled data, which implies a high computational cost. In this work an unsupervised methodology based on a segment framework is presented, which consists of dividing the raw data into a balanced number of segments in order to identify fiducial points and to characterize and cluster the heartbeats in each segment separately. The resulting clusters are merged or split according to an assumed criterion of homogeneity. This framework reduces the high computational cost usually incurred in Holter analysis, making its implementation feasible for future real-time applications. The performance of the method is measured over the records from the MIT/BIH arrhythmia database and achieves high values of sensitivity and specificity, taking advantage of the database labels, for the broad set of heartbeat types recommended by the AAMI.
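
    A rough sketch of the segment-based clustering idea is given below, using synthetic heartbeat feature vectors and k-means as a stand-in for the clustering applied within each segment; the features, segment count and cluster count are assumptions, not the paper's settings.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)

        # Synthetic heartbeat feature vectors (e.g. morphology + RR features) standing
        # in for features extracted around detected fiducial points.
        n_beats, n_features = 5000, 6
        beats = rng.standard_normal((n_beats, n_features))
        beats[::10] += 4.0          # a sparse "ectopic-like" population

        # Divide the recording into balanced segments and cluster each one separately,
        # which keeps the per-segment computational cost bounded.
        n_segments = 10
        labels = np.empty(n_beats, dtype=int)
        for s, idx in enumerate(np.array_split(np.arange(n_beats), n_segments)):
            km = KMeans(n_clusters=2, n_init=10, random_state=s).fit(beats[idx])
            labels[idx] = km.labels_ + 2 * s     # keep segment-local cluster ids distinct

        # Clusters from different segments would then be merged or split according to
        # a homogeneity criterion (e.g. centroid distances), as described above.
        print("number of provisional clusters:", len(np.unique(labels)))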

  20. Mechanistic Methodology for Airport Pavement Design with Engineering Fabrics. Volume 1. Theoretical and Experimental Bases.

    Science.gov (United States)

    1984-08-01

    Report No. DOT/FAA/PM-84/19. Reflective cracks require labor-intensive operations for crack sealing and patching, thus becoming a significant maintenance expense item. The problem of... models for predicting allowable critical strains are not available. The problems are complicated further by the fact that asphaltic concrete is a...

  1. Big and complex data analysis methodologies and applications

    CERN Document Server

    2017-01-01

    This volume conveys some of the surprises, puzzles and success stories in high-dimensional and complex data analysis and related fields. Its peer-reviewed contributions showcase recent advances in variable selection, estimation and prediction strategies for a host of useful models, as well as essential new developments in the field. The continued and rapid advancement of modern technology now allows scientists to collect data of increasingly unprecedented size and complexity. Examples include epigenomic data, genomic data, proteomic data, high-resolution image data, high-frequency financial data, functional and longitudinal data, and network data. Simultaneous variable selection and estimation is one of the key statistical problems involved in analyzing such big and complex data. The purpose of this book is to stimulate research and foster interaction between researchers in the area of high-dimensional data analysis. More concretely, its goals are to: 1) highlight and expand the breadth of existing methods in...

  2. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Full Text Available Decision making at both the individual and organizational level is always accompanied by a search for others' opinions on the matter. The tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter provides a rich anthology of sentiments. This user-generated content can serve as a benefit to markets if its semantic orientations are deliberated. Opinion mining and sentiment analysis are the formalizations for studying and construing opinions and sentiments. The digital ecosystem has itself paved the way for the use of the huge volume of opinionated data recorded. This paper is an attempt to review and evaluate the various techniques used for opinion and sentiment analysis.
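
    As a hedged illustration of one family of the reviewed techniques, the sketch below implements a toy lexicon-based polarity scorer with naive negation handling; the lexicon and rules are illustrative only.

        # Minimal lexicon-based sentiment scorer; the lexicon and negation handling
        # are toy assumptions, illustrating only one family of the reviewed techniques.
        LEXICON = {"good": 1, "great": 2, "excellent": 2,
                   "bad": -1, "poor": -1, "terrible": -2}
        NEGATIONS = {"not", "no", "never"}

        def sentiment(text: str) -> int:
            score, negate = 0, False
            for token in text.lower().replace(".", " ").split():
                if token in NEGATIONS:
                    negate = True
                    continue
                value = LEXICON.get(token, 0)
                score += -value if negate else value
                negate = False
            return score

        print(sentiment("The battery life is great but the screen is not good"))  # 2 - 1 = 1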

  3. Control volume based hydrocephalus research; analysis of human data

    Science.gov (United States)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with the data processing techniques used to extract terms in the conservation equation. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
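
    The volume-conservation bookkeeping behind the control volume approach can be sketched as follows; the inflow and outflow waveforms are synthetic stand-ins for flows derived from MR velocity data, not clinical measurements.

        import numpy as np

        # Synthetic arterial inflow and venous + CSF outflow waveforms over one cardiac
        # cycle (mL/s), standing in for flows integrated from MR velocity data.
        t = np.linspace(0.0, 1.0, 200)                         # one cycle, 1 s
        q_in = 10.0 + 6.0 * np.sin(2 * np.pi * t)
        q_out = 10.0 + 6.0 * np.sin(2 * np.pi * (t - 0.08))    # delayed outflow

        # Integral mass (volume) conservation for the cranial control volume:
        # dV/dt = Q_in - Q_out; the cumulative integral gives the transient volume change.
        dv = np.concatenate(([0.0], np.cumsum((q_in - q_out)[:-1] * np.diff(t))))
        print(f"peak transient volume change ~ {dv.max():.2f} mL")
        print(f"net change over the cycle    ~ {dv[-1]:.3f} mL (should be near zero)")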

  4. Life prediction methodology for ceramic components of advanced vehicular heat engines: Volume 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Khandelwal, P.K.; Provenzano, N.J.; Schneider, W.E. [Allison Engine Co., Indianapolis, IN (United States)

    1996-02-01

    One of the major challenges involved in the use of ceramic materials is ensuring adequate strength and durability. This activity has developed methodology which can be used during the design phase to predict the structural behavior of ceramic components. The effort involved the characterization of injection molded and hot isostatic pressed (HIPed) PY-6 silicon nitride, the development of nondestructive evaluation (NDE) technology, and the development of analytical life prediction methodology. Four failure modes are addressed: fast fracture, slow crack growth, creep, and oxidation. The techniques deal with failures initiating at the surface as well as internal to the component. The life prediction methodology for fast fracture and slow crack growth has been verified using a variety of confirmatory tests. The verification tests were conducted at room and elevated temperatures up to a maximum of 1371 °C. The tests involved (1) flat circular disks subjected to bending stresses and (2) high speed rotating spin disks. Reasonable correlation was achieved for a variety of test conditions and failure mechanisms. The predictions associated with surface failures proved to be optimistic, requiring re-evaluation of the components' initial fast fracture strengths. Correlation was achieved for the spin disks which failed in fast fracture from internal flaws. Time dependent elevated temperature slow crack growth spin disk failures were also successfully predicted.

  5. Using of BEPU methodology in a final safety analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Menzel, Francine; Sabundjian, Gaiane, E-mail: fmenzel@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); D' auria, Francesco, E-mail: f.dauria@ing.unipi.it [Universita degli Studi di Pisa, Gruppo di Ricerca Nucleare San Piero a Grado (GRNSPG), Pisa (Italy); Madeira, Alzira A., E-mail: alzira@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    Nuclear Reactor Safety (NRS) has been established as a discipline since the discovery of nuclear fission, and the occurrence of accidents in Nuclear Power Plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and the plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach, addressing the uncertainties of the results. This work aims to give an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper intends to present the background of the licensing process through the main licensing requirements. (author)

  6. Rectal cancer surgery: volume-outcome analysis.

    LENUS (Irish Health Repository)

    Nugent, Emmeline

    2010-12-01

    There is strong evidence supporting the importance of the volume-outcome relationship with respect to lung and pancreatic cancers. This relationship for rectal cancer surgery however remains unclear. We review the currently available literature to assess the evidence base for volume outcome in relation to rectal cancer surgery.

  7. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary Business Process Modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of the process description. The methodology has been evaluated through use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems encountered during the project. Drawing on these problems and on the results of the methodology evaluation, the needed future development of the methodology is outlined.

  8. Methodology for the analysis and retirement of assets: Power transformers

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Gómez-Ramírez

    2015-09-01

    Full Text Available This article describes the development of a high-voltage engineering methodology for the analysis and retirement of repaired power transformers, based on engineering criteria, in order to establish a correlation between the conditions of the transformer from several points of view: electrical, mechanical, dielectric and thermal. An analysis of the state of the art reveals two situations of great significance. First, the international procedures are a "guide" for the acceptance of new transformers, so they cannot be applied literally to repaired transformers, owing to the degradation the transformer has undergone over the years and to all the factors that led to the repair. Second, based on the most recent technical literature, articles analyzing dielectric oils and insulating paper have been reviewed, in which correlations are established between the quality of the insulating paper and the furan concentrations in the oils. Finally, much of the research carried out so far has focused on analyzing the transformer from the condition of the dielectric oil, since in most cases it is not possible to perform forensic engineering inside a transformer in operation and thus to analyze the design components that can compromise its integrity and operability.

  9. Metabolic tumour volumes measured at staging in lymphoma: methodological evaluation on phantom experiments and patients

    Energy Technology Data Exchange (ETDEWEB)

    Meignan, Michel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Paris-Est University, Service de Medecine Nucleaire, EAC CNRS 7054, Hopital Henri Mondor AP-HP, Creteil (France); Sasanelli, Myriam; Itti, Emmanuel [Hopital Henri Mondor and Paris-Est University, Department of Nuclear Medicine, Creteil (France); Casasnovas, Rene Olivier [CHU Le Bocage, Department of Hematology, Dijon (France); Luminari, Stefano [University of Modena and Reggio Emilia, Department of Diagnostic, Clinic and Public Health Medicine, Modena (Italy); Fioroni, Federica [Santa Maria Nuova Hospital-IRCCS, Department of Medical Physics, Reggio Emilia (Italy); Coriani, Chiara [Santa Maria Nuova Hospital-IRCCS, Department of Radiology, Reggio Emilia (Italy); Masset, Helene [Henri Mondor Hospital, Department of Radiophysics, Creteil (France); Gobbi, Paolo G. [University of Pavia, Department of Internal Medicine and Gastroenterology, Fondazione IRCCS Policlinico San Matteo, Pavia (Italy); Merli, Francesco [Santa Maria Nuova Hospital-IRCCS, Department of Hematology, Reggio Emilia (Italy); Versari, Annibale [Santa Maria Nuova Hospital-IRCCS, Department of Nuclear Medicine, Reggio Emilia (Italy)

    2014-06-15

    The presence of a bulky tumour at staging on CT is an independent prognostic factor in malignant lymphomas. However, its prognostic value is limited in diffuse disease. Total metabolic tumour volume (TMTV) determined on ¹⁸F-FDG PET/CT could give a better evaluation of the total tumour burden and may help patient stratification. Different methods of TMTV measurement established in phantoms simulating lymphoma tumours were investigated and validated in 40 patients with Hodgkin lymphoma and diffuse large B-cell lymphoma. Data were processed by two nuclear medicine physicians in Reggio Emilia and Creteil. Nineteen phantoms filled with ¹⁸F-saline were scanned; these comprised spherical or irregular volumes from 0.5 to 650 cm³ with tumour-to-background ratios from 1.65 to 40. Volumes were measured with different SUVmax thresholds. In patients, TMTV was measured on PET at staging by two methods: volumes of individual lesions were measured using a fixed 41% SUVmax threshold (TMTV41) and a variable visually adjusted SUVmax threshold (TMTVvar). In phantoms, the 41% threshold gave the best concordance between measured and actual volumes. Interobserver agreement was almost perfect. In patients, the agreement between the reviewers for TMTV41 measurement was substantial (ρc = 0.986, CI 0.97 - 0.99) and the difference between the means was not significant (212 ± 218 cm³ for Creteil vs. 206 ± 219 cm³ for Reggio Emilia, P = 0.65). By contrast, the agreement was poor for TMTVvar. There was a significant direct correlation between TMTV41 and normalized LDH (r = 0.652, CI 0.42 - 0.8, P < 0.001). Higher disease stages and bulky tumour were associated with higher TMTV41, but high TMTV41 could be found in patients with stage 1/2 or nonbulky tumour. Measurement of baseline TMTV in lymphoma using a fixed 41% SUVmax threshold is reproducible and correlates with the other parameters for tumour mass evaluation.

  10. Handbook of Systems Analysis: Volume 1. Overview. Chapter 2. The Genesis of Applied Systems Analysis

    OpenAIRE

    1981-01-01

    The International Institute for Applied Systems Analysis is preparing a Handbook of Systems Analysis, which will appear in three volumes: Volume 1: Overview is aimed at a widely varied audience of producers and users of systems analysis studies. Volume 2: Methods is aimed at systems analysts and other members of systems analysis teams who need basic knowledge of methods in which they are not expert; this volume contains introductory overviews of such methods. Volume 3: Cases co...

  11. National Aviation Fuel Scenario Analysis Program (NAFSAP). Volume I. Model Description. Volume II. User Manual.

    Science.gov (United States)

    1980-03-01

    National Aviation Fuel Scenario Analysis Program. Volume I: Model Description; Volume II: User Manual. Only a fragment of the abstract is recoverable: "... executes post processor which translates results of the graphics program to machine readable code used by the pen plotter ... cr (depressing the carriage ..."

  12. A study on safety analysis methodology in spent fuel dry storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Che, M. S.; Ryu, J. H.; Kang, K. M.; Cho, N. C.; Kim, M. S. [Hanyang Univ., Seoul (Korea, Republic of)

    2004-02-15

    Collection and review of the domestic and foreign technology related to spent fuel dry storage facilities. Analysis of a reference system. Establishment of a framework for criticality safety analysis. Review of accident analysis methodology. Establishment of accident scenarios. Establishment of scenario analysis methodology.

  13. FRACTAL ANALYSIS OF TRABECULAR BONE: A STANDARDISED METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ian Parkinson

    2011-05-01

    Full Text Available A standardised methodology for the fractal analysis of histological sections of trabecular bone has been established. A modified box counting method has been developed for use on a PC based image analyser (Quantimet 500MC, Leica Cambridge). The effect of image analyser settings, magnification, image orientation and threshold levels was determined. Also, the range of scale over which trabecular bone is effectively fractal was determined and a method formulated to objectively calculate more than one fractal dimension from the modified Richardson plot. The results show that magnification, image orientation and threshold settings have little effect on the estimate of fractal dimension. Trabecular bone has a lower limit below which it is not fractal (λ < 25 μm) and the upper limit is 4250 μm. There are three distinct fractal dimensions for trabecular bone (sectional fractals), with magnitudes greater than 1.0 and less than 2.0. It has been shown that trabecular bone is effectively fractal over a defined range of scale. Also, within this range, there is more than one fractal dimension, describing spatial structural entities. Fractal analysis is a model independent method for describing a complex multifaceted structure, which can be adapted for the study of other biological systems. This may be at the cell, tissue or organ level and complements conventional histomorphometric and stereological techniques.
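
    A minimal box-counting sketch is given below on a synthetic binary image standing in for a thresholded trabecular bone section; the slope of the log-log (Richardson-type) plot estimates the fractal dimension. The image, box sizes and fit range are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic binary image standing in for a thresholded trabecular bone section.
        img = np.zeros((512, 512), dtype=bool)
        xs, ys = rng.integers(0, 512, 4000), rng.integers(0, 512, 4000)
        for x, y in zip(xs, ys):
            img[max(0, x - 1):x + 2, y] = True      # thin strut-like marks

        def box_count(image, size):
            """Count boxes of a given size containing at least one foreground pixel."""
            n = image.shape[0] // size
            trimmed = image[:n * size, :n * size]
            blocks = trimmed.reshape(n, size, n, size).any(axis=(1, 3))
            return blocks.sum()

        sizes = np.array([2, 4, 8, 16, 32, 64])
        counts = np.array([box_count(img, s) for s in sizes])

        # Slope of the log-log plot gives the box-counting dimension; over a limited
        # range of scales more than one slope may be fitted, as described above.
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        print(f"estimated box-counting dimension ~ {slope:.2f}")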

  14. Methodological frontier in operational analysis for roundabouts: a review

    Directory of Open Access Journals (Sweden)

    Orazio Giuffre'

    2016-11-01

    Full Text Available Several studies and research efforts have shown that modern roundabouts are safe and effective as engineering countermeasures for traffic calming, and they are now widely used worldwide. The increasing use of roundabouts and, more recently, turbo and flower roundabouts has produced a great variety of experiences in the fields of intersection design, traffic safety and capacity modelling. As with unsignalized intersections, which represent the starting point for extending knowledge about operational analysis to roundabouts, the general situation in capacity estimation is still characterized by the debate between gap acceptance models and empirical regression models. However, capacity modelling must contain both the analytical construction, and then solution, of the model and the implementation of driver behaviour. Thus, issues concerning realistic modelling of driver behaviour through the parameters included in the models are always of interest to practitioners and analysts in transportation and road infrastructure engineering. Based on these considerations, this paper presents a literature review of the key methodological issues in the operational analysis of modern roundabouts. Focus is placed on the aspects associated with gap acceptance behaviour, the derivation of the analytically based models and the calculation of the parameters included in the capacity equations, as well as steady-state and non-steady-state conditions and uncertainty in entry capacity estimation. Finally, insights into future developments of research in this field are also outlined.

  15. RESPONSE SURFACE METHODOLOGY ANALYSIS OF POLYPHENOL RECOVERY FROM ARTICHOKE WASTE

    Directory of Open Access Journals (Sweden)

    Antonio Zuorro

    2014-01-01

    Full Text Available Large amounts of a solid waste consisting mainly of outer bracts and stems are produced from the industrial processing of artichokes. In this study, the recovery of polyphenols from the two waste components was investigated. Extraction experiments were carried out by an environmentally friendly procedure using aqueous ethanol as solvent. The total polyphenol content, expressed as mg of GAE per g of dry weight, was 10.23±0.68 mg/g for bracts and 16.36±0.85 mg/g for stems. To evaluate the effect of temperature (T), extraction time (E) and liquid-to-solid ratio (R) on the extraction yields, a central composite design coupled with response surface methodology was used. Under the best conditions (T = 50°C, E = 110.4 min and R = 20 mL g-1), extraction yields between 90 and 93% were obtained. Statistical analysis of the data showed that E was the most influential factor, followed by T and R. Simplified polynomial models were developed to describe the effect of individual factors and their interactions on the extraction yield of polyphenols. Overall, the results of this study support the potential of using artichoke waste as a source of natural phenolic antioxidants and give useful directions on how to improve recovery by proper selection of extraction conditions.

  16. Hazardous materials transportation: a risk-analysis-based routing methodology.

    Science.gov (United States)

    Leonelli, P; Bonvicini, S; Spadoni, G

    2000-01-07

    This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion about risk measures suitable for linear risk sources, the arc capacities are introduced by comparing the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; then arc costs are defined in order to take into account both transportation out-of-pocket expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining, for a specific hazardous substance, the cheapest flow distribution, honouring the arc capacities, from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results for shipments of ammonia are discussed and, finally, further research developments are proposed.
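
    The minimum cost flow formulation can be sketched with a toy network in which arc weights combine out-of-pocket and risk-related costs and arc capacities reflect the risk criteria; the network and numbers below are invented for illustration and use the networkx solver rather than the paper's OPTIPATH code.

        import networkx as nx

        # Toy road network for one hazardous substance: arc 'weight' is the combined
        # out-of-pocket + risk-related cost per vehicle, 'capacity' is the maximum
        # number of shipments allowed by the individual/societal risk criteria.
        G = nx.DiGraph()
        G.add_edge("origin", "A", weight=4, capacity=60)
        G.add_edge("origin", "B", weight=6, capacity=100)
        G.add_edge("A", "destination", weight=5, capacity=50)
        G.add_edge("A", "B", weight=1, capacity=30)
        G.add_edge("B", "destination", weight=3, capacity=100)

        # 80 shipments must leave the origin and reach the destination.
        G.nodes["origin"]["demand"] = -80
        G.nodes["destination"]["demand"] = 80

        flow = nx.min_cost_flow(G)      # cheapest flow honouring the arc capacities
        print(flow)
        print("total cost:", nx.cost_of_flow(G, flow))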

  17. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    Directory of Open Access Journals (Sweden)

    L. Renbaum-Wolff

    2012-10-01

    Full Text Available Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μl, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemical properties and with real-time relative humidity and temperature control; hence, the technique should be well-suited for determining the viscosities, under atmospherically relevant conditions, of particles collected from environmental chambers. In this technique, supermicron particles are first deposited on an inert hydrophobic substrate. Then, insoluble beads (~1 μm in diameter) are embedded in the particles. Next, a flow of gas is introduced over the particles, which generates a shear stress on the particle surfaces. The sample responds to this shear stress by generating internal circulations, which are quantified with an optical microscope by monitoring the movement of the beads. The rate of internal circulation is shown to be a function of particle viscosity but independent of the particle material for a wide range of organic and organic-water samples. A calibration curve is constructed from the experimental data that relates the rate of internal circulation to particle viscosity, and this calibration curve is successfully used to predict viscosities in multicomponent organic mixtures.
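
    The final calibration step can be illustrated with a short interpolation sketch relating bead circulation rate to viscosity; the calibration points below are invented placeholders, not the paper's data.

        import numpy as np

        # Hypothetical calibration data: bead circulation rate (Hz) measured on
        # standards of known viscosity (Pa s); values are illustrative only.
        visc_std = np.array([1e-2, 1e-1, 1e0, 1e1, 1e2])      # Pa s
        rate_std = np.array([50.0, 9.0, 1.2, 0.15, 0.02])     # circulations per second

        # Interpolate in log-log space (rate decreases monotonically with viscosity).
        def viscosity_from_rate(rate):
            return 10 ** np.interp(np.log10(rate),
                                   np.log10(rate_std[::-1]),
                                   np.log10(visc_std[::-1]))

        print(f"estimated viscosity ~ {viscosity_from_rate(0.5):.2f} Pa s")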

  18. Economic Analysis. Volume V. Course Segments 65-79.

    Science.gov (United States)

    Sterling Inst., Washington, DC. Educational Technology Center.

    The fifth volume of the multimedia, individualized course in economic analysis produced for the United States Naval Academy covers segments 65-79 of the course. Included in the volume are discussions of monopoly markets, monopolistic competition, oligopoly markets, and the theory of factor demand and supply. Other segments of the course, the…

  19. Application Potential of Energy Systems at Navy Sites. Volume I. Methodology and Results.

    Science.gov (United States)

    1980-01-01

    remaining mix of selected systems (RDF, FBC, wind and cogeneration) meet the demand without relying on expensive commercial purchases. The second...and Honolulu in Hawaii; and El Centro and China Lake in California. A detailed analysis of these bases by the NES optimization code is recommended

  20. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Murfin, W.B. [Technadyne Engineering Consultants, Inc., Albuquerque, NM (United States); Hora, S.C. [Hawaii Univ., Hilo, HI (United States)

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community.
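
    As a rough, generic illustration of idea (3), Monte Carlo propagation with an efficient (Latin hypercube) sampling scheme can be sketched as follows; the surrogate consequence model and parameter ranges are hypothetical stand-ins and bear no relation to the NUREG-1150 codes.

```python
import numpy as np
from scipy.stats import qmc

# Two hypothetical uncertain inputs: an accident frequency (per reactor-year)
# and a release fraction. Ranges are illustrative only.
sampler = qmc.LatinHypercube(d=2, seed=0)
unit_sample = sampler.random(n=1000)
lower = [1e-6, 0.01]   # [frequency, release fraction]
upper = [1e-4, 0.50]
sample = qmc.scale(unit_sample, lower, upper)

def consequence_model(frequency, release_fraction):
    # Stand-in for the chained accident-progression/source-term/consequence
    # calculation: risk = frequency * consequence(release fraction).
    return frequency * 5.0e4 * release_fraction

risk = consequence_model(sample[:, 0], sample[:, 1])

# Summarise the propagated uncertainty in the risk measure.
print("mean risk: %.3e" % risk.mean())
print("5th / 95th percentile: %.3e / %.3e" % tuple(np.percentile(risk, [5, 95])))
```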

  1. Cellulose Isolation Methodology for NMR Analysis of Cellulose Ultrastructure

    Directory of Open Access Journals (Sweden)

    Art J. Ragauskas

    2011-11-01

    Full Text Available In order to obtain accurate information about the ultrastructure of cellulose from native biomass by 13C cross polarization magic angle spinning (CP/MAS) NMR spectroscopy, the cellulose component must be isolated due to overlapping resonances from both lignin and hemicellulose. Typically, cellulose isolation has been achieved via holocellulose pulping to remove lignin followed by an acid hydrolysis procedure to remove the hemicellulose components. Using 13C CP/MAS NMR and non-linear line-fitting of the cellulose C4 region, it was observed that the standard acid hydrolysis procedure caused an apparent increase in crystallinity of ~10% or less in the cellulose isolated from Populus holocellulose. We have examined the effect of the cellulose isolation method, particularly the acid treatment time for hemicellulose removal, on cellulose ultrastructural characteristics by studying these effects on cotton, microcrystalline cellulose (MCC) and holocellulose pulped Populus. 13C CP/MAS NMR of MCC indicated that holocellulose pulping and acid hydrolysis have little effect on the crystalline ultrastructural components of cellulose. Although any chemical method to isolate cellulose from native biomass will invariably alter substrate characteristics, especially those related to regions accessible to solvents, we found those changes to be minimal and consistent in samples of typical crystallinity and lignin/hemicellulose content. Based on the rate of hemicellulose removal, as determined by HPLC carbohydrate analysis, and the magnitude of cellulose ultrastructural alteration, the most suitable cellulose isolation methodology utilizes a treatment of 2.5 M HCl at 100 °C for a standard residence time between 1.5 and 4 h. However, for the most accurate crystallinity results this residence time should be determined empirically for a particular sample.

  2. Methodology for Using 3-Dimensional Sonography to Measure Fetal Adrenal Gland Volumes in Pregnant Women With and Without Early Life Stress.

    Science.gov (United States)

    Kim, Deborah; Epperson, C Neill; Ewing, Grace; Appleby, Dina; Sammel, Mary D; Wang, Eileen

    2016-09-01

    Fetal adrenal gland volumes on 3-dimensional sonography have been studied as potential predictors of preterm birth. However, no consistent methodology has been published. This article describes the methodology used in a study that is evaluating the effects of maternal early life stress on fetal adrenal growth to allow other researchers to compare methodologies across studies. Fetal volumetric data were obtained in 36 women at 20 to 22 and 28 to 30 weeks' gestation. Two independent examiners measured multiple images of a single fetal adrenal gland from each sonogram. Intra- and inter-rater consistency was examined. In addition, fetal adrenal volumes between male and female fetuses were reported. The intra- and inter-rater reliability was satisfactory when the mean of 3 measurements from each rater was used. At 20 weeks' gestation, male fetuses had larger average adjusted adrenal volumes than female fetuses (mean, 0.897 versus 0.638; P = .004). At 28 weeks' gestation, the fetal weight was more influential in determining values for adjusted fetal adrenal volume (0.672 for male fetuses versus 0.526 for female fetuses; P = .034). This article presents a methodology for assessing fetal adrenal volume using 3-dimensional sonography that can be used by other researchers to provide more consistency across studies.

  3. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection, minimal technical skills, and modest motivation for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation, defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, within both the black hat and white hat communities. Once a solid understanding of the cutting edge security research is established, emerging trends in attack methodology can be identified and the gap between

  4. Cost-utility analysis: Current methodological issues and future perspectives

    Directory of Open Access Journals (Sweden)

    Mark J C Nuijten

    2011-06-01

    Full Text Available The use of cost-effectiveness as the final criterion in the reimbursement process for listing of new pharmaceuticals can be questioned from a scientific and policy point of view. There is a lack of consensus on main methodological issues, and consequently we may question the appropriateness of the use of cost-effectiveness data in health care decision-making. Another concern is the appropriateness of the selection and use of an incremental cost-effectiveness threshold (cost/QALY). In this review, we focus on some key methodological concerns relating to discounting, the utility concept, cost assessment and modelling methodologies. Finally, we consider the relevance of some other important decision criteria, such as social values and equity.
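
    For readers unfamiliar with the threshold being questioned, a minimal sketch of a discounted incremental cost-effectiveness ratio (cost/QALY) calculation is given below; the cost and QALY streams, discount rate and threshold are illustrative assumptions.

```python
# Minimal sketch of an incremental cost-effectiveness ratio (cost/QALY) with
# discounting of future costs and QALYs. All numbers are illustrative.

def discounted_total(stream, rate):
    """Present value of a yearly stream, discounting from year 0."""
    return sum(value / (1 + rate) ** year for year, value in enumerate(stream))

# Hypothetical 5-year cost and QALY streams for a new drug and its comparator.
costs_new, qalys_new = [12000, 2000, 2000, 2000, 2000], [0.80, 0.78, 0.76, 0.74, 0.72]
costs_old, qalys_old = [3000, 1500, 1500, 1500, 1500],  [0.70, 0.66, 0.62, 0.58, 0.54]

discount_rate = 0.035  # an assumed annual rate; the choice itself is debated

delta_cost = discounted_total(costs_new, discount_rate) - discounted_total(costs_old, discount_rate)
delta_qaly = discounted_total(qalys_new, discount_rate) - discounted_total(qalys_old, discount_rate)

icer = delta_cost / delta_qaly
print("ICER: %.0f per QALY gained" % icer)
print("below an assumed 30000/QALY threshold?", icer < 30000)
```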

  5. A methodological proposal for quantifying environmental compensation through the spatial analysis of vulnerability indicators

    Directory of Open Access Journals (Sweden)

    Fabio Enrique Torresan

    2008-06-01

    Full Text Available The aim of this work was to propose a methodology for quantifying environmental compensation through the spatial analysis of vulnerability indicators. A case study was applied to the analysis of sand extraction enterprises in the region of Descalvado and Analândia, inland São Paulo State, Brazil. Environmental vulnerability scores were attributed to the indicators related to erosion, hydrological resources and biodiversity loss. This methodological proposal allowed analyzing the locational alternatives of a given enterprise with the objective of reducing impacts and, at the same time, reducing the costs of environmental compensation. The application of the methodology significantly reduced the degree of subjectivity usually associated with most impact evaluation methodologies. The term environmental compensation refers to the developer's obligation to support the establishment and maintenance of Conservation Units, applicable to enterprises with significant environmental impact, in accordance with Law 9.986/2000. This law establishes that the amount of resources to be applied by the developer must be at least 0.5% of the total costs foreseen for the implementation of the enterprise, with this percentage set by the competent environmental agency according to the degree of environmental impact. Accordingly, this article proposes a methodology for quantifying environmental compensation through the spatial analysis of environmental vulnerability indicators. The proposal was applied in a case study of sand mining enterprises in the Descalvado/Analândia region, inland São Paulo State. Environmental vulnerability indices were assigned to impact indicators related to erosion, water resources and biodiversity loss. This methodology represents an important instrument for environmental and economic planning and can be adapted to various
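
    A hedged sketch of the quantification logic (a legal floor of 0.5% of implementation cost, scaled upward by a weighted vulnerability score) is given below; the indicator weights, scores and scaling rule are hypothetical and are not the formula used in the study.

```python
# Hypothetical scaling of the compensation percentage (legal minimum of 0.5% of
# the project's implementation cost) by a weighted environmental vulnerability
# score. Weights, scores and the scaling rule are illustrative assumptions.

indicator_weights = {"erosion": 0.4, "water_resources": 0.35, "biodiversity_loss": 0.25}

def compensation(project_cost, vulnerability_scores, max_extra=0.045):
    """Return the compensation value for indicator scores given on a 0-1 scale."""
    score = sum(indicator_weights[k] * v for k, v in vulnerability_scores.items())
    percentage = 0.005 + max_extra * score   # 0.5% floor, rising with vulnerability
    return percentage * project_cost, percentage

value, pct = compensation(
    project_cost=2_000_000.0,
    vulnerability_scores={"erosion": 0.6, "water_resources": 0.3, "biodiversity_loss": 0.8},
)
print("compensation: %.2f (%.2f%% of project cost)" % (value, 100 * pct))
```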

  6. An Analysis of the Research Methodology of the Ramirez Study.

    Science.gov (United States)

    Thomas, Wayne P.

    1992-01-01

    Analyzes the political, educational, and technical factors that strongly influenced the Ramirez study of bilingual programs. Enumerates strengths and weaknesses of the study's research methodology, along with implications for decision making in language-minority education. Summarizes defensible conclusions of the study that have not yet been…

  7. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    Science.gov (United States)

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that although narrative inquiry as a research methodology entails free conversations and personal stories, it should not be totally free and fictional, as it has to conform to recognized standards for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  8. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  9. Cuadernos de Autoformacion en Participacion Social: Metodologia. Volumen 2. Primera Edicion (Self-Instructional Notebooks on Social Participation: Methodology. Volume 2. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  10. Methodologies for Assessing the Cumulative Environmental Effects of Hydroelectric Development of Fish and Wildlife in the Columbia River Basin, Volume 1, Recommendations, 1987 Final Report.

    Energy Technology Data Exchange (ETDEWEB)

    Stull, Elizabeth Ann

    1987-07-01

    This volume is the first of a two-part set addressing methods for assessing the cumulative effects of hydropower development on fish and wildlife in the Columbia River Basin. Species and habitats potentially affected by cumulative impacts are identified for the basin, and the most significant effects of hydropower development are presented. Then, current methods for measuring and assessing single-project effects are reviewed, followed by a review of methodologies with potential for use in assessing the cumulative effects associated with multiple projects. Finally, two new approaches for cumulative effects assessment are discussed in detail. Overall, this report identifies and reviews the concepts, factors, and methods necessary for understanding and conducting a cumulative effects assessment in the Columbia River Basin. Volume 2 will present a detailed procedural handbook for performing a cumulative assessment using the integrated tabular methodology introduced in this volume. 308 refs., 18 figs., 10 tabs.

  11. Full-Envelope Launch Abort System Performance Analysis Methodology

    Science.gov (United States)

    Aubuchon, Vanessa V.

    2014-01-01

    The implementation of a new dispersion methodology is described, which disperses abort initiation altitude or time along with all other Launch Abort System (LAS) parameters during Monte Carlo simulations. In contrast, the standard methodology assumes that an abort initiation condition is held constant (e.g., aborts initiated at the altitude for Mach 1, the altitude for maximum dynamic pressure, etc.) while dispersing other LAS parameters. The standard method results in large gaps in performance information due to the discrete nature of the initiation conditions, while the full-envelope dispersion method provides a significantly more comprehensive assessment of LAS abort performance for the full launch vehicle ascent flight envelope and identifies performance "pinch-points" that may occur at flight conditions outside of those contained in the discrete set. The new method has significantly increased the fidelity of LAS abort simulations and confidence in the results.
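
    A minimal sketch of the contrast between fixed-condition and dispersed abort-initiation Monte Carlo runs is shown below; the performance metric, parameter ranges and altitudes are illustrative stand-ins, not the actual LAS simulation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_runs = 5000

def separation_margin(altitude_m, thrust_factor):
    # Stand-in performance metric; a real study would run a full abort simulation.
    return thrust_factor * (1.0 - np.exp(-altitude_m / 15_000.0)) - 0.35

# Hypothetical dispersed LAS parameter (e.g. motor thrust scatter).
thrust_dispersion = rng.normal(1.0, 0.03, n_runs)

# Standard approach: abort initiation held at a few discrete flight conditions.
for alt in (7_500.0, 11_000.0):   # illustrative stand-ins for Mach 1 and max-q altitudes
    m = separation_margin(alt, thrust_dispersion)
    print("fixed initiation at %6.0f m: min margin %.3f" % (alt, m.min()))

# Full-envelope approach: disperse the initiation altitude along with the rest,
# exposing performance pinch-points that the discrete set would miss.
abort_altitude_m = rng.uniform(0.0, 40_000.0, n_runs)
margin = separation_margin(abort_altitude_m, thrust_dispersion)
worst = np.argsort(margin)[:5]
print("lowest margins found near altitudes (m):", np.round(abort_altitude_m[worst]))
```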

  12. Population Analysis: A Methodology for Understanding Populations in COIN Environments

    Science.gov (United States)

    2008-12-01

    contributions to the field. Emile Durkheim (1858-1917) is described as one of the founders of sociology. In the earliest days of sociology, thinkers...such as Durkheim, attempted to apply a scientific methodological approach to understanding societal change. During this era, philosophers such as... Durkheim, Max Weber and Karl Marx each contributed to the field in their attempts to explain the world around them. At a very basic level, this

  13. Analysis of urea distribution volume in hemodialysis.

    Science.gov (United States)

    Maduell, F; Sigüenza, F; Caridad, A; Miralles, F; Serrato, F

    1994-01-01

    According to the urea kinetic model, it is considered that the urea distribution volume (V) is that of body water and that it is distributed in only one compartment. Since the V value is difficult to measure, it is normal to use 58% of body weight, in spite of the fact that it may range from 35 to 75%. In this study, we have calculated the value of V by using an accurate method based on the total elimination of urea from the dialysate. We have studied V, and also whether the different dialysis characteristics modify it. Thirty-five patients were included in this study, 19 men and 16 women, under a chronic hemodialysis programme. The dialysate was collected in a graduated tank, and the concentrations of urea in plasma and in dialysate were determined every hour. Every patient received six dialysis sessions, changing the blood flow (250 or 350 ml/min), the ultrafiltration (0.5 or 1.5 l/h), the membrane (cuprophane or polyacrylonitrile) and/or the buffer (bicarbonate or acetate). At the end of the hemodialysis session, the V value ranged from 43 to 72% of body weight; nevertheless, this value was practically constant in every patient. The V value gradually increased throughout the dialysis session: 42.1 +/- 6.9% of body weight in the first hour, 50.7 +/- 7.5% in the second hour and 55.7 +/- 7.9% at the end of the dialysis session. Changing the blood flow, ultrafiltration, membrane or buffer did not alter the results. The V value was significantly higher in men than in women, 60.0 +/- 6.6% vs. 50.5 +/- 5.9% of body weight (p < 0.001).
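
    A simplified single-pool version of the dialysate mass-balance estimate of V can be sketched as follows; the numbers are illustrative, and urea generation and ultrafiltration volume changes are ignored for brevity.

```python
# Simplified single-pool estimate of the urea distribution volume V from total
# dialysate urea recovery, ignoring urea generation and ultrafiltration losses.
# All numbers are illustrative, not taken from the study.

dialysate_volume_l = 120.0        # total dialysate collected in the tank
dialysate_urea_mmol_l = 6.5       # mean urea concentration in the dialysate
plasma_urea_start_mmol_l = 28.0   # pre-dialysis plasma urea
plasma_urea_end_mmol_l = 9.0      # post-dialysis plasma urea
body_weight_kg = 70.0

urea_removed_mmol = dialysate_volume_l * dialysate_urea_mmol_l

# Mass balance: urea removed = V * (C_start - C_end)  =>  V = removed / delta C
V_litres = urea_removed_mmol / (plasma_urea_start_mmol_l - plasma_urea_end_mmol_l)

print("estimated V: %.1f L (%.0f%% of body weight)"
      % (V_litres, 100 * V_litres / body_weight_kg))
```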

  14. A study on the core analysis methodology for SMART CEA ejection accident-I

    Energy Technology Data Exchange (ETDEWEB)

    Zee, Sung Kyun; Lee, Chung Chan; Kim, Kyo Yoon; Cho, Byung Oh

    1999-04-01

    A methodology to analyze the fuel enthalpy is developed based on MASTER, a time-dependent, 3-dimensional core analysis code. Using the proposed methodology, the SMART CEA ejection accident is analyzed. Moreover, radiation doses are estimated at the exclusion area boundary and the low population zone to confirm compliance with the criteria for the accident. (Author). 31 refs., 13 tabs., 18 figs.

  15. Estimation of cell volume and biomass of penicillium chrysogenum using image analysis.

    Science.gov (United States)

    Packer, H L; Keshavarz-Moore, E; Lilly, M D; Thomas, C R

    1992-02-20

    A methodology for the estimation of biomass for the penicillin fermentation using image analysis is presented. Two regions of hyphae are defined to describe the growth of mycelia during fermentation: (1) the cytoplasmic region, and (2) the degenerated region including large vacuoles. The volume occupied by each of these regions in a fixed volume of sample is estimated from area measurements using image analysis. Areas are converted to volumes by treating the hyphae as solid cylinders with the hyphal diameter as the cylinder diameter. The volumes of the cytoplasmic and degenerated regions are converted into dry weight estimates using hyphal density values available from the literature. The image analysis technique is able to estimate biomass even in the presence of nondissolved solids at a concentration of up to 30 g/L. It is shown to successfully estimate concentrations of mycelia from 0.03 to 38 g/L. Although the technique has been developed for the penicillin fermentation, it should be applicable to other (nonpelleted) fungal fermentations.
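
    The area-to-dry-weight conversion described above can be sketched as follows; the hyphal diameter, density, dry-matter fraction and sample volume are assumed values for illustration, not the paper's calibration.

```python
import math

# Convert a measured hyphal projected area to biomass, treating hyphae as solid
# cylinders whose diameter equals the hyphal diameter. All values are
# illustrative assumptions.

hyphal_diameter_um = 3.0            # assumed mean hyphal diameter
hyphal_density_g_per_cm3 = 1.1      # density taken as a typical literature value
dry_weight_fraction = 0.25          # assumed dry matter fraction
sample_volume_ml = 0.1              # fixed volume of sample analysed per image set

def dry_weight_concentration(projected_area_um2):
    """Biomass concentration (g/L) from the total projected hyphal area in the sample."""
    # Projected area of a cylinder = diameter * length  =>  length = area / diameter
    length_um = projected_area_um2 / hyphal_diameter_um
    volume_um3 = math.pi * (hyphal_diameter_um / 2) ** 2 * length_um
    volume_cm3 = volume_um3 * 1e-12
    dry_weight_g = volume_cm3 * hyphal_density_g_per_cm3 * dry_weight_fraction
    return dry_weight_g / (sample_volume_ml / 1000.0)   # g per litre of broth

print("estimated biomass: %.2f g/L" % dry_weight_concentration(5.0e9))
```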

  16. INTEGRATED METHODOLOGY FOR PRODUCT PLANNING USING MULTI CRITERIA ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tarun Soota

    2016-09-01

    Full Text Available An integrated approach to multi-criteria decision problems is proposed using quality function deployment and the analytical network process. The objective of the work is to rationalize and improve the method of analyzing and interpreting customer needs and technical requirements. The methodology is used to determine and prioritize engineering requirements based on customer needs for the development of the best product. The framework allows the decision maker to decompose a complex problem into a hierarchical structure showing the relationships between the objective and the criteria. Multi-criteria decision modeling is used to extend the hierarchy process to both dependence and feedback. A case study on bikes is presented for the proposed model.
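
    A small sketch of how priorities can be derived from a pairwise comparison matrix (the eigenvector method used in AHP/ANP) is given below; the requirements and judgement values are hypothetical, not taken from the bike case study.

```python
import numpy as np

# Deriving priorities for engineering requirements from a pairwise comparison
# matrix. The matrix entries are hypothetical judgements for three requirements.
requirements = ["frame stiffness", "weight", "cost"]

# A[i, j] = how much more important requirement i is than requirement j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal eigenvector of A gives the priority weights.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency ratio check (random index RI = 0.58 for a 3x3 matrix).
ci = (eigvals.real[principal] - len(A)) / (len(A) - 1)
print(dict(zip(requirements, np.round(weights, 3))), "CR = %.3f" % (ci / 0.58))
```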

  17. Grinding analysis of Indian coal using response surface methodology

    Institute of Scientific and Technical Information of China (English)

    Twinkle Singh; Aishwarya Awasthi; Pranjal Tripathi; Shina Gautam; Alok Gautam

    2016-01-01

    The present work discusses a systematic approach to model grinding parameters of coal in a ball mill. A three-level Box-Behnken design combined with response surface methodology using a second-order model was applied to experiments done according to the model requirement. Three parameters, ball charge (10-20 balls), coal content (100-200 g) and grinding time (4-8 min), were chosen for the experiments as well as for the modeling work. Coal fineness is defined as the d80 (80% passing size). A quadratic model was developed to show the effect of the parameters and their interactions on the fineness of the product. Three different sizes (4, 1 and 0.65 mm) of Indian coal were used. The model equations for each fraction were developed and different sets of experiments were performed. The predicted values of the fineness of coal were in good agreement with the experimental results (R2 values of d80 varying between 0.97 and 0.99). Finer product sizes were obtained for all three coal sizes with a larger ball charge, less grinding time and less solid content. This work demonstrates the efficient use of response surface methodology and the Box-Behnken design for grinding of Indian coal.

  18. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software depends on comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we have reviewed the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs pertaining to requirement engineering research and for understanding the strengths and limitations of the existing requirement engineering techniques. The study also summarizes the best practices and how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. The study also highlights the importance of security requirements, which, although part of the non-functional requirements, are naturally considered fundamental to secure software development.

  19. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  20. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    Science.gov (United States)

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose a FCA methodology that uses standard cost and actual quantities to calculate the collection costs of separate and undifferentiated waste. Our methodology allows cost efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. Our methodology allows benchmarking and variance analysis that can be used to identify the causes of off-standards performance and guide managers to deploy resources more efficiently. Our methodology can be implemented by companies lacking a sophisticated management accounting system.
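
    A minimal sketch of the standard-cost-times-actual-quantity calculation and the resulting variance analysis is shown below; the unit costs, quantities and waste streams are illustrative assumptions, not figures from the study.

```python
# Standard-cost view of separated waste collection: cost = standard unit cost x
# actual quantity, with a variance against the actual cost incurred. All figures
# are illustrative.

standard_cost_per_tonne = {"separate": 95.0, "undifferentiated": 60.0}   # EUR/t

actual_quantities_t = {"separate": 1200.0, "undifferentiated": 3400.0}
actual_costs_eur    = {"separate": 123500.0, "undifferentiated": 198000.0}

for stream, qty in actual_quantities_t.items():
    standard_cost = standard_cost_per_tonne[stream] * qty
    variance = actual_costs_eur[stream] - standard_cost
    flag = "over standard" if variance > 0 else "under standard"
    print(f"{stream}: standard {standard_cost:,.0f} EUR, "
          f"actual {actual_costs_eur[stream]:,.0f} EUR, "
          f"variance {variance:+,.0f} EUR ({flag})")
```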

  1. Montecarlo simulation for a new high resolution elemental analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto [Universidad de La Frontera, Temuco (Chile). Facultad de Ingenieria y Administracion

    1996-12-31

    Full text. Spectra generated by binary, ternary and multielement matrixes when irradiated by a variable energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate appear when the photon energy is just above the absorption edge associated with each element, because of the emission of characteristic X rays. For a given edge energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample into a 2π solid angle, associating a single multichannel spectrometer channel to each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were generated with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high resolution spectroscopy methodology, for which a synchrotron would be an ideal source due to its high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)

  2. The XMM Cluster Survey: X-ray analysis methodology

    CERN Document Server

    Lloyd-Davies, E J; Hosmer, Mark; Mehrtens, Nicola; Davidson, Michael; Sabirli, Kivanc; Mann, Robert G; Hilton, Matt; Liddle, Andrew R; Viana, Pedro T P; Campbell, Heather C; Collins, Chris A; Dubois, E Naomi; Freeman, Peter; Hoyle, Ben; Kay, Scott T; Kuwertz, Emma; Miller, Christopher J; Nichol, Robert C; Sahlen, Martin; Stanford, S Adam; Stott, John P

    2010-01-01

    The XMM Cluster Survey (XCS) is a serendipitous search for galaxy clusters using all publicly available data in the XMM-Newton Science Archive. Its main aims are to measure cosmological parameters and trace the evolution of X-ray scaling relations. In this paper we describe the data processing methodology applied to the 5776 XMM observations used to construct the current XCS source catalogue. A total of 3669 >4σ cluster candidates with >50 background-subtracted X-ray counts are extracted from a total non-overlapping area suitable for cluster searching of 410 deg^2. Of these, 1022 candidates are detected with >300 X-ray counts, and we demonstrate that robust temperature measurements can be obtained down to this count limit. We describe in detail the automated pipelines used to perform the spectral and surface brightness fitting for these sources, as well as to estimate redshifts from the X-ray data alone. A total of 517 (126) X-ray temperatures to a typical accuracy of <40 (<10) per cent have ...

  3. A Novel Methodology for Thermal Analysis & 3-Dimensional Memory Integration

    CERN Document Server

    Cherian, Annmol; Jose, Jemy; Pangracious, Vinod; 10.5121/ijait.2011.1403

    2011-01-01

    The semiconductor industry is reaching a fascinating confluence in several evolutionary trends that will likely lead to a number of revolutionary changes in the design, implementation, scaling, and the use of computer systems. However, recently Moore's law has come to a stand-still since device scaling beyond 65 nm is not practical. 2D integration has problems like memory latency, power dissipation, and large foot-print. 3D technology comes as a solution to the problems posed by 2D integration. The utilization of 3D is limited by the problem of temperature crisis. It is important to develop an accurate power profile extraction methodology to design 3D structure. In this paper, design of 3D integration of memory is considered and hence the static power dissipation of the memory cell is analysed in transistor level and is used to accurately model the inter-layer thermal effects for 3D memory stack. Subsequently, packaging of the chip is considered and modelled using an architecture level simulator. This modelli...

  4. Meta-analysis: Its role in psychological methodology

    Directory of Open Access Journals (Sweden)

    Andrej Kastrin

    2008-11-01

    Full Text Available Meta-analysis refers to the statistical analysis of a large collection of independent observations for the purpose of integrating results. The main objectives of this article are to define meta-analysis as a method of data integration, to draw attention to some particularities of its use, and to encourage researchers to use meta-analysis in their work. The benefits of meta-analysis include more effective exploitation of existing data from independent sources and a contribution to more powerful domain knowledge. It may also serve as a support tool to generate new research hypotheses. The idea of combining results of independent studies addressing the same research question dates back to the sixteenth century. Meta-analysis was reinvented in 1976 by Glass, to refute the conclusion of an eminent colleague, Eysenck, that psychotherapy was essentially ineffective. We review some major historical landmarks of meta-analysis and its statistical background. We present the concept of the effect size measure, the problem of heterogeneity and the two models which are used to combine individual effect sizes (the fixed and random effects models) in great detail. Two visualization techniques, forest and funnel plot graphics, are demonstrated. We developed RMetaWeb, a simple and fast web server application to conduct meta-analysis online. RMetaWeb is the first web meta-analysis application and is completely based on the R software environment for statistical computing and graphics.
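
    The fixed and random effects combination of effect sizes mentioned above can be sketched as follows; the effect sizes and variances are illustrative, and the between-study variance is estimated with the common DerSimonian-Laird method.

```python
import numpy as np

# Inverse-variance pooling of study effect sizes under the fixed-effect model,
# plus a DerSimonian-Laird random-effects estimate. Numbers are illustrative.
effects   = np.array([0.30, 0.55, 0.10, 0.42, 0.25])
variances = np.array([0.02, 0.05, 0.03, 0.04, 0.01])

# Fixed-effect model: weight each study by the inverse of its variance.
w_fixed = 1.0 / variances
theta_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)

# Heterogeneity statistic Q and between-study variance tau^2 (DerSimonian-Laird).
Q = np.sum(w_fixed * (effects - theta_fixed) ** 2)
df = len(effects) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)

# Random-effects model: add tau^2 to each within-study variance.
w_random = 1.0 / (variances + tau2)
theta_random = np.sum(w_random * effects) / np.sum(w_random)

print("fixed-effect estimate:   %.3f" % theta_fixed)
print("random-effects estimate: %.3f (tau^2 = %.3f, Q = %.2f)" % (theta_random, tau2, Q))
```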

  5. Development of methodology for horizontal axis wind turbine dynamic analysis

    Science.gov (United States)

    Dugundji, J.

    1982-01-01

    Horizontal axis wind turbine dynamics were studied. The following findings are summarized: (1) review of the MOSTAS computer programs for dynamic analysis of horizontal axis wind turbines; (2) review of various analysis methods for rotating systems with periodic coefficients; (3) review of structural dynamics analysis tools for large wind turbine; (4) experiments for yaw characteristics of a rotating rotor; (5) development of a finite element model for rotors; (6) development of simple models for aeroelastics; and (7) development of simple models for stability and response of wind turbines on flexible towers.

  6. Functional Unfold Principal Component Regression Methodology for Analysis of Industrial Batch Process Data

    DEFF Research Database (Denmark)

    Mears, Lisa; Nørregaard, Rasmus; Sin, Gürkan;

    2016-01-01

    This work proposes a methodology utilizing functional unfold principal component regression (FUPCR) for application to industrial batch process data as a process modeling and optimization tool. The methodology is applied to an industrial fermentation dataset containing 30 batches of a production process operating at Novozymes A/S. Following the FUPCR methodology, the final product concentration could be predicted with an average prediction error of 7.4%. Multiple iterations of preprocessing were applied by implementing the methodology to identify the best data handling methods for the model. It is shown that application of functional data analysis and the choice of variance scaling method have the greatest impact on the prediction accuracy. Considering the vast amount of batch process data continuously generated in industry, this methodology can potentially contribute as a tool to identify...

  7. Object-oriented analysis and design: a methodology for modeling the computer-based patient record.

    Science.gov (United States)

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L

    1998-08-01

    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  8. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 1-Summary

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, M.D.

    1995-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit is to be taken for the reduced reactivity of burned or spent fuel relative to its original "fresh" composition, it is necessary to benchmark computational methods used in determining such reactivity worth against spent fuel reactivity measurements. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized-water reactors (PWRs). The analysis methodology utilized for all calculations in this report is based on the modules and data associated with the SCALE-4 code system. Isotopic densities for spent fuel assemblies in the core were calculated using the SAS2H analytical sequence in SCALE-4. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code sequence was used to extract the necessary isotopic densities from SAS2H results and to provide the data in the format required for the SCALE-4 criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k_eff) for the critical configuration. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for analysis of each critical configuration. Each of the five volumes comprising this report provides an overview of the methodology applied. Subsequent volumes also describe in detail the approach taken in performing criticality calculations for these PWR configurations: Volume 2 describes criticality calculations for the Tennessee Valley Authority's Sequoyah Unit 2 reactor for Cycle 3; Volume 3 documents the analysis of Virginia Power

  9. Criteria for the development and use of the methodology for environmentally-acceptable fossil energy site evaluation and selection. Volume 2. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Eckstein, L.; Northrop, G.; Scott, R.

    1980-02-01

    This report serves as a companion document to the report, Volume 1: Environmentally-Acceptable Fossil Energy Site Evaluation and Selection: Methodology and Users Guide, in which a methodology was developed which allows the siting of fossil fuel conversion facilities in areas with the least environmental impact. The methodology, known as SELECS (Site Evaluation for Energy Conversion Systems), does not replace a site-specific environmental assessment or an environmental impact statement (EIS), but does enhance the value of an EIS by thinning down the number of options to a manageable level, by doing this in an objective, open and selective manner, and by providing preliminary assessment and procedures which can be utilized during the research and writing of the actual impact statement.

  10. Hydrothermal analysis in engineering using control volume finite element method

    CERN Document Server

    Sheikholeslami, Mohsen

    2015-01-01

    Control volume finite element methods (CVFEM) bridge the gap between finite difference and finite element methods, using the advantages of both methods for simulation of multi-physics problems in complex geometries. In Hydrothermal Analysis in Engineering Using Control Volume Finite Element Method, CVFEM is covered in detail and applied to key areas of thermal engineering. Examples, exercises, and extensive references are used to show the use of the technique to model key engineering problems such as heat transfer in nanofluids (to enhance performance and compactness of energy systems),

  11. Theoretical and methodological analysis of personality theories of leadership

    Directory of Open Access Journals (Sweden)

    Оксана Григорівна Гуменюк

    2016-10-01

    Full Text Available A psychological analysis of the personality theories of leadership, which form the basis for other conceptual approaches to understanding the nature of leadership, is conducted. The conceptual approach to leadership is analyzed taking into account the priority of personality theories, including the heroic, psychoanalytic, «trait», charismatic and five-factor theories. It is noted that the psychological analysis of personality theories is important in understanding the nature of leadership

  12. Method for measuring anterior chamber volume by image analysis

    Science.gov (United States)

    Zhai, Gaoshou; Zhang, Junhong; Wang, Ruichang; Wang, Bingsong; Wang, Ningli

    2007-12-01

    Anterior chamber volume (ACV) is very important for an oculist to make a rational pathological diagnosis for patients who have optic diseases such as glaucoma, yet it is always difficult to measure accurately. In this paper, a method is devised to measure anterior chamber volumes based on JPEG-formatted image files that have been transformed from medical images acquired with the anterior-chamber optical coherence tomographer (AC-OCT) and the corresponding image-processing software. The corresponding algorithms for image analysis and ACV calculation are implemented in VC++, and a series of anterior chamber images of typical patients are analyzed, while anterior chamber volumes are calculated and verified to be in accord with clinical observation. The results show that the measurement method is effective and feasible and has the potential to improve the accuracy of ACV calculation. Meanwhile, some measures should be taken to simplify the manual preprocessing of the images.
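
    One plausible way to compute a chamber volume from segmented cross-sections (slab summation over slices) is sketched below; the pixel size, slice spacing and tiny dummy masks are assumptions for illustration and are not the paper's VC++ implementation.

```python
import numpy as np

# Sketch of estimating anterior chamber volume from a stack of segmented
# AC-OCT cross-sections: sum the chamber pixel areas per slice and multiply by
# the slice spacing. Masks and scale factors below are hypothetical.

pixel_size_mm = 0.02          # in-plane size of one pixel (assumed)
slice_spacing_mm = 0.25       # distance between consecutive cross-sections (assumed)

# Binary masks (1 = anterior chamber) for each slice; here three tiny dummy slices.
masks = [
    np.array([[0, 1, 1], [0, 1, 1], [0, 0, 0]]),
    np.array([[1, 1, 1], [1, 1, 1], [0, 1, 0]]),
    np.array([[0, 1, 0], [0, 1, 0], [0, 0, 0]]),
]

areas_mm2 = [mask.sum() * pixel_size_mm ** 2 for mask in masks]
volume_mm3 = sum(areas_mm2) * slice_spacing_mm   # simple slab summation

print("anterior chamber volume: %.4f mm^3" % volume_mm3)
```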

  13. Micro analysis of fringe field formed inside LDA measuring volume

    Science.gov (United States)

    Ghosh, Abhijit; Nirala, A. K.

    2016-05-01

    In the present study we propose a technique for micro analysis of fringe field formed inside laser Doppler anemometry (LDA) measuring volume. Detailed knowledge of the fringe field obtained by this technique allows beam quality, alignment and fringe uniformity to be evaluated with greater precision and may be helpful for selection of an appropriate optical element for LDA system operation. A complete characterization of fringes formed at the measurement volume using conventional, as well as holographic optical elements, is presented. Results indicate the qualitative, as well as quantitative, improvement of fringes formed at the measurement volume by holographic optical elements. Hence, use of holographic optical elements in LDA systems may be advantageous for improving accuracy in the measurement.

  14. Respirable crystalline silica: Analysis methodologies; Silice cristalina respirable: Metodologias de analisis

    Energy Technology Data Exchange (ETDEWEB)

    Gomez-Tena, M. P.; Zumaquero, E.; Ibanez, M. J.; Machi, C.; Escric, A.

    2012-07-01

    This paper describes different analysis methodologies for occupational environments and raw materials. A review is presented of the existing methodologies, the approximations made, some of the constraints involved, as well as the best measurement options for the different raw materials. In addition, the different factors that might affect the precision and accuracy of the results are examined. With regard to the methodologies used for the quantitative analysis of any of the polymorphs, particularly quartz, the study centres on the analytical X-ray diffraction method. Simplified methods of calculation and experimental separation, such as separation by centrifugation, sedimentation, and dust generation in controlled environments, are evaluated for the estimation of this fraction in the raw materials. In addition, a review is presented of the methodologies used for the collection of respirable crystalline silica in environmental dust. (Author)

  15. Comparative proteomic analysis of human pancreatic juice : Methodological study

    NARCIS (Netherlands)

    Zhou, Lu; Lu, ZhaoHui; Yang, AiMing; Deng, RuiXue; Mai, CanRong; Sang, XinTing; Faber, Klaas Nico; Lu, XingHua

    2007-01-01

    Pancreatic cancer is the most lethal of all the common malignancies. Markers for early detection of this disease are urgently needed. Here, we optimized and applied a proteome analysis of human pancreatic juice to identify biomarkers for pancreatic cancer. Pancreatic juice samples, devoid of blood o

  16. Comparative proteomic analysis of human pancreatic juice: Methodological study

    NARCIS (Netherlands)

    Zhou, Lu; Lu, Z.H.; Yang, A.M.; Deng, R.X.; Mai, C.R.; Sang, X.T.; Faber, Klaas Nico; Lu, X.H.

    2007-01-01

    Pancreatic cancer is the most lethal of all the common malignancies. Markers for early detection of this disease are urgently needed. Here, we optimized and applied a proteome analysis of human pancreatic juice to identify biomarkers for pancreatic cancer. Pancreatic juice samples, devoid of blood o

  17. Important Literature in Endocrinology: Citation Analysis and Historical Methodology.

    Science.gov (United States)

    Hurt, C. D.

    1982-01-01

    Results of a study comparing two approaches to the identification of important literature in endocrinology reveal that the association between the rankings of cited items produced by the two methods is not statistically significant and that the use of citation or historical analysis alone will not result in the same set of literature. Forty-two sources are appended. (EJS)

  18. Recent methodology in the phytochemical analysis of ginseng

    NARCIS (Netherlands)

    Angelova, N.; Kong, H.-W.; Heijden, R. van de; Yang, S.-Y.; Choi, Y.H.; Kim, H.K.; Wang, M.; Hankemeier, T.; Greef, J. van der; Xu, G.; Verpoorte, R.

    2008-01-01

    This review summarises the most recent developments in ginseng analysis, in particular the novel approaches in sample pre-treatment and the use of high-performance liquid-chromatography-mass spectrometry. The review also presents novel data on analysing ginseng extracts by nuclear magnetic resonance

  19. SAFETY ANALYSIS METHODOLOGY FOR AGED CANDU® 6 NUCLEAR REACTORS

    Directory of Open Access Journals (Sweden)

    WOLFGANG HARTMANN

    2013-10-01

    Full Text Available This paper deals with the Safety Analysis for CANDU® 6 nuclear reactors as affected by main Heat Transport System (HTS) aging. Operational and aging-related changes of the HTS throughout its lifetime may lead to restrictions in certain safety system settings and hence some restriction in performance under certain conditions. A step in confirming safe reactor operation is the tracking of relevant data and their corresponding interpretation by the use of appropriate thermalhydraulic analytic models. Safety analyses ranging from the assessment of safety limits associated with the prevention of intermittent fuel sheath dryout for a slow Loss of Regulation (LOR) analysis to fission gas release after a fuel failure are summarized. Specifically for fission gas release, the thermalhydraulic analysis for a fresh core and an 11 Effective Full Power Years (EFPY) aged core was summarized, leading to the most severe stagnation break sizes for the inlet feeder break and the channel failure time. The associated coolant conditions provide the input data for the fuel analyses. Based on the thermalhydraulic data, the fission product inventory under normal operating conditions may be calculated for both fresh and aged cores, and the fission gas release may be evaluated during the transient. This analysis plays a major role in determining possible radiation doses to the public after postulated accidents have occurred.

  20. Simulation methodology development for rotating blade containment analysis

    Institute of Scientific and Technical Information of China (English)

    Qing HE; Hai-jun XUAN; Lian-fang LIAO; Wei-rong HONG; Rong-ren WU

    2012-01-01

    An experimental and numerical investigation of aeroengine blade/case containment analysis is presented. Blade-out containment capability analysis is an essential step in new aeroengine design, but containment tests are time-consuming and incur significant costs; thus, developing a short-period and low-cost numerical method is warranted. Using explicit nonlinear dynamic finite element analysis software, the present study numerically investigated the high-speed impact process for simulated blade containment tests which were carried out on a high-speed spin testing facility. A number of simulations were conducted using finite element models with different mesh sizes and different values of both the contact penalty factor and the friction coefficient. Detailed comparisons between the experimental and numerical results reveal that the mesh size and the friction coefficient have a considerable impact on the results produced. It is shown that a finer mesh will predict lower containment capability of the case, which is closer to the test data. A larger value of the friction coefficient also predicts lower containment capability. However, the contact penalty factor has little effect on the simulation results if it is large enough to avoid false penetration.

  1. QUANTITATIVE METHODOLOGY FOR STABILITY ANALYSIS OF NONLINEAR ROTOR SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    ZHENG Hui-ping; XUE Yu-sheng; CHEN Yu-shu

    2005-01-01

    Rotor-bearing systems applied widely in industry are nonlinear dynamic systems of multi-degree-of-freedom. Modern concepts of design and maintenance call for quantitative stability analysis. Using trajectory-based stability-preserving dimensional reduction, a quantitative stability analysis method for rotor systems is presented. At first, an n-dimensional nonlinear non-autonomous rotor system is decoupled into n subsystems after numerical integration. Each of them has only one degree of freedom and contains time-varying parameters to represent all other state variables. In this way, the n-dimensional trajectory is mapped into a set of one-dimensional trajectories. The dynamic central point (DCP) of a subsystem is then defined on the extended phase plane, namely, the force-position plane. Characteristics of curves on the extended phase plane and the DCP's kinetic energy difference sequence for general motion in rotor systems are studied. The corresponding stability margins of the trajectory are evaluated quantitatively. By means of the margin and its sensitivity analysis, the critical parameters of the period doubling bifurcation and the Hopf bifurcation in a flexible rotor supported by two short journal bearings with nonlinear suspension are determined.

  2. Trading Volume and Stock Indices: A Test of Technical Analysis

    Directory of Open Access Journals (Sweden)

    Paul Abbondante

    2010-01-01

    Full Text Available Problem statement: Technical analysis, with its emphasis on trading volume, has been used to analyze movements in individual stock prices and make investment recommendations to either buy or sell that stock. Little attention has been paid to investigating the relationship between trading volume and various stock indices. Approach: Since stock indices track overall stock market movements, trends in trading volume could be used to forecast future stock market trends. Instead of focusing only on individual stocks, this study examines movements in major stock markets as a whole. Regression analysis was used to investigate the relationship between trading volume and five popular stock indices using daily data from January 2000 to June 2010. A lag of 5 days was used because this represents the prior week of trading volume. The total sample size ranges from 1,534 to 2,638 observations. Smaller samples were used to test the investment horizon that explains movements of the indices more completely. Results: The F statistics were significant for samples using 6 and 16 months of data. The F statistic was not significant using a sample of 1 month of data. This is surprising given the short-term focus of technical analysis. The results indicate that above-average returns can be achieved using futures, options and exchange traded funds which track these indices. Conclusion: Future research efforts will include out-of-sample forecasting to determine if above-average returns can be achieved. Additional research can be conducted to determine the optimal number of lags for each index.
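
    A hedged sketch of the lagged-volume regression described above is given below; the data are simulated placeholders, and in practice the daily index returns and volumes would be loaded from a market data source.

```python
import numpy as np

# Regression of daily index returns on the previous five days of trading volume,
# mirroring the 5-day lag used in the study. Data here are synthetic placeholders.
rng = np.random.default_rng(42)
n_days = 1600
volume = rng.lognormal(mean=20.0, sigma=0.3, size=n_days)
returns = 0.0002 + 1e-12 * volume + rng.normal(0, 0.01, n_days)   # synthetic returns

# Build the lagged design matrix: return_t regressed on volume_{t-1..t-5}.
lags = 5
X = np.column_stack([volume[lags - k - 1:n_days - k - 1] for k in range(lags)])
y = returns[lags:]
X = np.column_stack([np.ones(len(y)), X])

beta, residuals, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print("intercept and lag coefficients:", np.round(beta, 6))
print("R^2: %.4f" % r2)
```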

  3. Educational Historiographical Meta-analysis: Rethinking Methodology in the 1990s.

    Science.gov (United States)

    Kincheloe, Joe L.

    1991-01-01

    Discusses the relationship of ideology, social theory, critical analysis, hegemony, and objectivity in the work of educational historians. Comments that the training of educational historians often neglects intensive ideological analysis of historical research. Argues that study of historiographical methodology and consciousness of its dimensions…

  4. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

    Full Text Available This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potentially vulnerable elements, by the estimation of the expected physical effects due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

    A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities. This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated in 8 913 000 €; on the contrary, indirect

  5. The Murchison Widefield Array 21 cm Power Spectrum Analysis Methodology

    CERN Document Server

    Jacobs, Daniel C; Trott, C M; Dillon, Joshua S; Pindor, B; Sullivan, I S; Pober, J C; Barry, N; Beardsley, A P; Bernardi, G; Bowman, Judd D; Briggs, F; Cappallo, R J; Carroll, P; Corey, B E; de Oliveira-Costa, A; Emrich, D; Ewall-Wice, A; Feng, L; Gaensler, B M; Goeke, R; Greenhill, L J; Hewitt, J N; Hurley-Walker, N; Johnston-Hollitt, M; Kaplan, D L; Kasper, J C; Kim, H S; Kratzenberg, E; Lenc, E; Line, J; Loeb, A; Lonsdale, C J; Lynch, M J; McKinley, B; McWhirter, S R; Mitchell, D A; Morales, M F; Morgan, E; Neben, A R; Thyagarajan, N; Oberoi, D; Offringa, A R; Ord, S M; Paul, S; Prabu, T; Procopio, P; Riding, J; Rogers, A E E; Roshi, A; Shankar, N Udaya; Sethi, Shiv K; Srivani, K S; Subrahmanyan, R; Tegmark, M; Tingay, S J; Waterson, M; Wayth, R B; Webster, R L; Whitney, A R; Williams, A; Williams, C L; Wu, C; Wyithe, J S B

    2016-01-01

    We present the 21 cm power spectrum analysis approach of the Murchison Widefield Array Epoch of Reionization project. In this paper, we compare the outputs of multiple pipelines for the purpose of validating statistical limits on cosmological hydrogen at redshifts between 6 and 12. Multiple, independent, data calibration and reduction pipelines are used to make power spectrum limits on a fiducial night of data. Comparing the outputs of the imaging and power spectrum stages highlights differences in calibration, foreground subtraction and power spectrum calculation. The power spectra found using these different methods span a space defined by the various tradeoffs between speed, accuracy, and systematic control. Lessons learned from comparing the pipelines range from the algorithmic to the prosaically mundane; all demonstrate the many pitfalls of neglecting reproducibility. We briefly discuss the way these different methods attempt to handle the question of evaluating a significant detection in the presence of foregr...

  6. The Murchison Widefield Array 21 cm Power Spectrum Analysis Methodology

    Science.gov (United States)

    Jacobs, Daniel C.; Hazelton, B. J.; Trott, C. M.; Dillon, Joshua S.; Pindor, B.; Sullivan, I. S.; Pober, J. C.; Barry, N.; Beardsley, A. P.; Bernardi, G.; Bowman, Judd D.; Briggs, F.; Cappallo, R. J.; Carroll, P.; Corey, B. E.; de Oliveira-Costa, A.; Emrich, D.; Ewall-Wice, A.; Feng, L.; Gaensler, B. M.; Goeke, R.; Greenhill, L. J.; Hewitt, J. N.; Hurley-Walker, N.; Johnston-Hollitt, M.; Kaplan, D. L.; Kasper, J. C.; Kim, HS; Kratzenberg, E.; Lenc, E.; Line, J.; Loeb, A.; Lonsdale, C. J.; Lynch, M. J.; McKinley, B.; McWhirter, S. R.; Mitchell, D. A.; Morales, M. F.; Morgan, E.; Neben, A. R.; Thyagarajan, N.; Oberoi, D.; Offringa, A. R.; Ord, S. M.; Paul, S.; Prabu, T.; Procopio, P.; Riding, J.; Rogers, A. E. E.; Roshi, A.; Udaya Shankar, N.; Sethi, Shiv K.; Srivani, K. S.; Subrahmanyan, R.; Tegmark, M.; Tingay, S. J.; Waterson, M.; Wayth, R. B.; Webster, R. L.; Whitney, A. R.; Williams, A.; Williams, C. L.; Wu, C.; Wyithe, J. S. B.

    2016-07-01

    We present the 21 cm power spectrum analysis approach of the Murchison Widefield Array Epoch of Reionization project. In this paper, we compare the outputs of multiple pipelines for the purpose of validating statistical limits on cosmological hydrogen at redshifts between 6 and 12. Multiple independent data calibration and reduction pipelines are used to make power spectrum limits on a fiducial night of data. Comparing the outputs of imaging and power spectrum stages highlights differences in calibration, foreground subtraction, and power spectrum calculation. The power spectra found using these different methods span a space defined by the various tradeoffs between speed, accuracy, and systematic control. Lessons learned from comparing the pipelines range from the algorithmic to the prosaically mundane; all demonstrate the many pitfalls of neglecting reproducibility. We briefly discuss the way these different methods attempt to handle the question of evaluating a significant detection in the presence of foregrounds.

  7. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to theoretical aspects while the others present practical aspects and the...

  8. UNDERSTANDING FLOW OF ENERGY IN BUILDINGS USING MODAL ANALYSIS METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    John Gardner; Kevin Heglund; Kevin Van Den Wymelenberg; Craig Rieger

    2013-07-01

    It is widely understood that energy storage is the key to integrating variable generators into the grid. It has been proposed that the thermal mass of buildings could be used as a distributed energy storage solution and several researchers are making headway in this problem. However, the inability to easily determine the magnitude of the building’s effective thermal mass, and how the heating ventilation and air conditioning (HVAC) system exchanges thermal energy with it, is a significant challenge to designing systems which utilize this storage mechanism. In this paper we adapt modal analysis methods used in mechanical structures to identify the primary modes of energy transfer among thermal masses in a building. The paper describes the technique using data from an idealized building model. The approach is successfully applied to actual temperature data from a commercial building in downtown Boise, Idaho.
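
    To make the idea concrete, the sketch below applies a modal (eigen)decomposition to a lumped two-mass thermal RC model of a building; the network topology, capacitances, and resistances are invented for illustration and are not the model or data used in the paper.

```python
import numpy as np

# Illustrative lumped thermal RC network: two thermal masses (room air and
# building structure) coupled to each other and to the outdoor temperature.
C_air, C_mass = 5.0e6, 8.0e7                 # thermal capacitances [J/K]
R_am, R_ao, R_mo = 2.0e-3, 8.0e-3, 2.0e-2    # thermal resistances [K/W]

# State matrix for dT/dt = A T (temperatures taken relative to outdoors).
A = np.array([
    [-(1/R_am + 1/R_ao) / C_air,   (1/R_am) / C_air],
    [ (1/R_am) / C_mass,          -(1/R_am + 1/R_mo) / C_mass],
])

# Modal decomposition: eigenvalues give the decay rates of the thermal modes,
# eigenvectors show how strongly each mass participates in each mode.
eigvals, eigvecs = np.linalg.eig(A)
for lam, vec in zip(eigvals, eigvecs.T):
    tau_hours = -1.0 / lam / 3600.0
    print(f"mode: time constant = {tau_hours:.1f} h, shape = {vec}")
```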

  9. Anthropological analysis of taekwondo--new methodological approach.

    Science.gov (United States)

    Cular, Drazen; Munivrana, Goran; Katić, Ratko

    2013-05-01

    The aim of this research is to determine the order and importance of impacts of particular anthropological characteristics and technical and tactical competence on success in taekwondo according to opinions of top taekwondo instructors (experts). Partial objectives include analysis of metric characteristics of the measuring instrument, and determining differences between two disciplines (sparring and technical discipline of patterns) and two competition systems (WTF and ITF). In accordance with the aims, the research was conducted on a sample of respondents which consisted of 730 taekwondo instructors from 6 continents and from 69 countries (from which we selected 242 instructors), who are at different success levels in both taekwondo competition systems (styles) and two taekwondo disciplines. The respondents were divided into 3 qualitative subsamples (OST-USP-VRH) using the dependent variable of accomplished results of the instructor. In 6 languages, they electronically evaluated the impact in percentage value (%) of motor and functional skills (MOTFS), morphological characteristics (MORF), psychological profile of an athlete (PSIH), athletic intelligence (INTE) and technical and tactical competence (TE-TA) on success in taekwondo. The analysis of metric characteristics of the constructed instrument showed a satisfactory degree of agreement (IHr) which is proportional to the level of respondent quality, i.e. it grows along with the increase in instructor quality in all analysed disciplines of both systems. Top instructors assigned the highest portion of impact on success to the motor and functional skills (MOTFS) variable (WTF-SPB=29.1, ITF-SPB=29.2, WTF-THN=35.0, ITF-THN=32.0). Statistically significant differences in opinions of instructors of different styles and disciplines were not recorded in any of the analysed variables. The only exception is the psychological profile of an athlete variable, which WTF instructors of sparring (AM=23.7%), on a significance

  10. A new methodology for the CFD uncertainty analysis

    Institute of Scientific and Technical Information of China (English)

    YAO Zhen-qiu; SHEN Hong-cui; GAO Hui

    2013-01-01

    With respect to the measurement uncertainty, this paper discusses the definition, the sources, the classification and the expressions of the CFD uncertainty. Based on the orthogonal design and statistical inference theory, a new verification and validation method and the related procedures in the CFD simulation are developed. With the method, two examples of CFD verification and validation are studied for the drag coefficient and the nominal wake fraction, and the calculation factors and their interactions which would significantly affect the simulation results are obtained. Moreover, the sizes of all uncertainty components resulting from the controlled and uncontrolled calculation factors are determined, and the optimal combination of the calculation factors is obtained by an effect estimation in the orthogonal experiment design. It is shown that the new method can be used for the verification in the CFD uncertainty analysis, and can reasonably and definitely judge the credibility of the simulative result. As for the CFD simulation of the drag coefficient and the nominal wake fraction, the results predicted can be validated. Although there is still some difference between the simulation results and the experiment results, its approximate level and credibility can be accepted.
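
    As a minimal illustration of effect estimation in an orthogonal experiment design, the sketch below computes main effects from an L4(2^3) array; the factors, levels, and responses are invented placeholders, not the calculation factors or drag coefficients studied in the paper.

```python
import numpy as np

# Illustrative L4(2^3) orthogonal array: 4 runs, 3 two-level factors
# (e.g., grid density, time step, solver setting -- placeholders only).
design = np.array([
    [0, 0, 0],
    [0, 1, 1],
    [1, 0, 1],
    [1, 1, 0],
])
# Invented responses, e.g. computed drag coefficients for each run.
response = np.array([0.412, 0.418, 0.405, 0.409])

# Main effect of each factor = mean response at level 1 minus mean at level 0.
for j in range(design.shape[1]):
    lvl1 = response[design[:, j] == 1].mean()
    lvl0 = response[design[:, j] == 0].mean()
    print(f"factor {j}: main effect = {lvl1 - lvl0:+.4f}")
```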

  11. METHODOLOGICAL APPROACH AND MODEL ANALYSIS FOR IDENTIFICATION OF TOURIST TRENDS

    Directory of Open Access Journals (Sweden)

    Neven Šerić

    2015-06-01

    Full Text Available The appeal and diversity of the destination's offer is an antecedent of growth in tourist visits. The destination's supply is differentiated through new, specialised tourism products. The usual approach consists of forming specialised tourism products in accordance with the existing tourism destination image. Another approach, prevalent in the practice of developed tourism destinations, is based on innovating the destination supply in accordance with global tourism trends. For this purpose, it is advisable to choose a method for monitoring and analysing tourism trends. The goal is to determine the actual trends governing target markets, differentiating whims from trends during the tourism preseason. When considering the return on investment, modifying the destination's tourism offer on the basis of a tourism whim is indeed a risky endeavour. Adapting the destination's supply to tourism whims can result in a shifted image, one that is unable to ensure long-term interest and growth in tourist visits. With regard to tourism trend research and based on the research conducted, a model for evaluating tourism phenomena is proposed, one that determines whether a tourism phenomenon is a tourism trend or a tourism whim.

  12. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future
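
    As a minimal example of a whole-system continuous primary model of the kind classified above, the sketch below evaluates a simple logistic growth curve on a log10 scale; the parameter values are invented and the model is generic, not one of the specific models reviewed in the paper.

```python
import numpy as np

def logistic_growth(t, n0=3.0, nmax=9.0, mu_max=0.35):
    """Log10 cell density over time for a simple logistic primary model.

    n0, nmax : initial and maximum log10(CFU/ml); mu_max : specific growth
    rate [1/h]. All values are illustrative.
    """
    y0, ymax = 10.0**n0, 10.0**nmax
    y = ymax / (1.0 + (ymax / y0 - 1.0) * np.exp(-mu_max * t))
    return np.log10(y)

hours = np.arange(0, 49, 6)
for t, logn in zip(hours, logistic_growth(hours)):
    print(f"t = {t:2d} h  log10 N = {logn:.2f}")
```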

  13. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.
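
    A scoping study of this kind typically ends in a sampling plan over the selected uncertain parameters. The sketch below builds a small Latin hypercube sample as one plausible way to set up such a plan; the parameter names, ranges, and sample size are invented and are not the MELCOR parameters chosen in the report.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical uncertain parameters and ranges (NOT the MELCOR inputs from the report).
params = {
    "zircaloy_melt_temperature_K": (2030.0, 2250.0),
    "core_debris_porosity":        (0.30, 0.50),
    "valve_failure_probability":   (1.0e-3, 5.0e-3),
}

def latin_hypercube(n_samples, bounds, rng):
    """One stratified random value per bin and parameter, bins in shuffled order."""
    d = len(bounds)
    u = np.empty((n_samples, d))
    strata = np.arange(n_samples)
    for j in range(d):
        u[:, j] = (rng.permutation(strata) + rng.random(n_samples)) / n_samples
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

for row in latin_hypercube(8, list(params.values()), rng):
    print({name: f"{value:.4g}" for name, value in zip(params, row)})
```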

  14. Methodology for regional economic analysis of urban refuse as an energy source for the Northeast

    Energy Technology Data Exchange (ETDEWEB)

    Meier, P.M.; Le, T.

    1979-01-01

    The potential contribution of municipal refuse as an energy source depends on the spatial distribution and characteristics of the population and on the cost of alternative fuels. Regional economic analysis of urban refuse as an energy source therefore requires a methodology that integrates population, interregional fuel price variations, and other engineering variables into a single model that can be subjected to sensitivity analysis helpful to the policy maker. Such a framework is developed in the context of the Northeast States. The area includes New England, New York, New Jersey, Pennsylvania, Maryland, Delaware, and the District of Columbia.

  15. Immunoassay Methods and their Applications in Pharmaceutical Analysis: Basic Methodology and Recent Advances.

    Science.gov (United States)

    Darwish, Ibrahim A

    2006-09-01

    Immunoassays are bioanalytical methods in which the quantitation of the analyte depends on the reaction of an antigen (analyte) and an antibody. Immunoassays have been widely used in many important areas of pharmaceutical analysis such as diagnosis of diseases, therapeutic drug monitoring, and clinical pharmacokinetic and bioequivalence studies in drug discovery and the pharmaceutical industry. The importance and widespread use of immunoassay methods in pharmaceutical analysis are attributed to their inherent specificity, high throughput, and high sensitivity for the analysis of a wide range of analytes in biological samples. Recently, marked improvements have been achieved in the field of immunoassay development for the purposes of pharmaceutical analysis. These improvements involved the preparation of unique immunoanalytical reagents, the analysis of new categories of compounds, methodology, and instrumentation. The basic methodologies and recent advances in immunoassay methods applied in different fields of pharmaceutical analysis have been reviewed.

  16. Analysis Planning Methodology: For Thesis, Joint Applied Project, & MBA Research Reports

    OpenAIRE

    2013-01-01

    Acquisition Research Handbook Series Purpose: This guide provides the graduate student researcher—you—with techniques and advice on creating an effective analysis plan, and it provides methods for focusing the data-collection effort based on that analysis plan. As a side benefit, this analysis planning methodology will help you to properly scope the research effort and will provide you with insight for changes in that effort. The information presented herein was supported b...

  17. Workload Characterization an Essential Step in Computer Systems Performance Analysis - Methodology and Tools

    OpenAIRE

    CHEVERESAN, R.T.; HOLBAN, S.

    2009-01-01

    Computer system performance analysis is a very complex process in which hardware and software manufacturers invest significant human and financial resources. Workload characterization represents an essential component of performance analysis. This paper presents a trace-based methodology for the evaluation of software applications. It introduces a new analysis concept designed to significantly ease this process, and it presents a set of experimental data collected using the new analysis structure on a repre...

  18. Semantic analysis according to Peep Koort--a substance-oriented research methodology.

    Science.gov (United States)

    Sivonen, Kerstin; Kasén, Anne; Eriksson, Katie

    2010-12-01

    The aim of this article is to describe the hermeneutic semantic analysis created by professor Peep Koort (1920-1977) and to discuss it as a methodology for research within caring science. The methodology is developed with a hermeneutic approach that differs from the traditions of semantic analysis in philosophy or linguistics. The research objects are core concepts and theoretical constructs (originally within the academic discipline of education science, later on within the academic discipline of caring science), focusing on a deeper understanding of essential meaning content when developing a discipline. The qualitative methodology of hermeneutic semantic analysis is described step by step as created by Koort, interpreted and developed by the authors. An etymological investigation and an analysis of synonymy between related concepts within a conceptual family guide the researcher to understand and discriminate conceptual dimensions of meaning content connected to the word studied, thus giving opportunities to summarise it in a theoretical definition, a discovery that can be tested in varying contexts. From a caring science perspective, we find the hermeneutic methodology of semantic analysis fruitful and suitable for researchers developing their understanding of core concepts and theoretical constructs connected to the development of the academic discipline.

  19. French Epistemology and its Revisions: Towards a Reconstruction of the Methodological Position of Foucaultian Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2007-05-01

    Full Text Available This article reconstructs epistemology in the tradition of Gaston BACHELARD as one of the main foundations of the methodology of FOUCAULTian discourse analysis. Foundational concepts and the methodological approach of French epistemology are one of the continuities in the work of Michel FOUCAULT. BACHELARDian epistemology (and that of his successor Georges CANGUILHEM) can be used for the reconstruction of the FOUCAULTian methodology and it can also be used to instruct the practices of FOUCAULTian discourse analysis as a stand-alone form of qualitative social research. French epistemology was developed in critical opposition to the phenomenology of Edmund HUSSERL, and to phenomenological theories of science. Because the phenomenology of HUSSERL is one foundation of social phenomenology, the reconstruction of the FOUCAULTian methodology—as built on the French tradition of BACHELARDian epistemology—makes it clear that FOUCAULTian discourse analysis is incommensurable with approaches derived from social phenomenology. The epistemology of BACHELARD is portrayed as a proto-version of discourse analysis. Discourses as well as discourse analyses are conceived as forms of socio-epistemological practice. In this article, the main concepts and strategies of French epistemology are introduced and related to discourse analysis. The consequences of epistemology for a self-reflexive methodology and its practice are discussed. URN: urn:nbn:de:0114-fqs0702241

  20. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation but the original specification and perhaps the high-level design are not object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects, where the operations or methods of the objects correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
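
    A minimal sketch of what a 'real-time systems-analysis object' defined by states and state-transition rules might look like is given below; the object, events, and states are hypothetical, and the paper itself does not prescribe a programming language or this exact structure.

```python
from dataclasses import dataclass, field

@dataclass
class AnalysisObject:
    """A concurrent entity modelled by states and state-transition rules."""
    name: str
    state: str
    transitions: dict = field(default_factory=dict)  # (state, event) -> new state

    def handle(self, event: str) -> None:
        key = (self.state, event)
        if key in self.transitions:
            self.state = self.transitions[key]
        # Events with no matching rule are ignored in this sketch.

# Hypothetical valve-controller object from a real-time systems analysis.
valve = AnalysisObject(
    name="coolant_valve",
    state="closed",
    transitions={
        ("closed", "open_cmd"): "opening",
        ("opening", "limit_switch"): "open",
        ("open", "close_cmd"): "closing",
        ("closing", "limit_switch"): "closed",
    },
)

for ev in ["open_cmd", "limit_switch", "close_cmd", "limit_switch"]:
    valve.handle(ev)
    print(ev, "->", valve.state)
```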

  1. Towards a Multimodal Methodology for the Analysis of Translated/Localised Games

    Directory of Open Access Journals (Sweden)

    Bárbara Resende Coelho

    2016-12-01

    Full Text Available Multimedia materials require research methodologies that are able to comprehend all of their assets. Videogames are the epitome of multimedia, joining image, sound, video, animation, graphics and text with the interactivity factor. A methodology to conduct research into the translation and localisation of videogames should be able to analyse all of their assets and features. This paper sets out to develop a research methodology for games and their translations/localisations that goes beyond the collection and analysis of "screenshots" and includes as many of their assets as possible. Using the fully localised version of the game Watchdogs, this paper shows how tools and technologies allow for transcending the mere analysis of linguistic contents within multimedia materials. Using the ELAN software from The Language Archive to analyse Portuguese-language dubbed and English-language subtitled excerpts from the videogame, it was possible to identify patterns in both linguistic and audio-visual elements, as well as to correlate them.

  2. A critical methodological review of discourse and conversation analysis studies of family therapy.

    Science.gov (United States)

    Tseliou, Eleftheria

    2013-12-01

    Discourse (DA) and conversation (CA) analysis, two qualitative research methods, have been recently suggested as potentially promising for the study of family therapy due to common epistemological adherences and their potential for an in situ study of therapeutic dialog. However, to date, there is no systematic methodological review of the few existing DA and CA studies of family therapy. This study aims at addressing this lack by critically reviewing published DA and CA studies of family therapy on methodological grounds. Twenty-eight articles in total are reviewed in relation to certain methodological axes identified in the relevant literature. These include choice of method, framing of research question(s), data/sampling, type of analysis, epistemological perspective, content/type of knowledge claims, and attendance to criteria for good quality practice. It is argued that the reviewed studies show "glimpses" of the methods' potential for family therapy research despite the identification of certain "shortcomings" regarding their methodological rigor. These include unclearly framed research questions and the predominance of case study designs. They also include inconsistencies between choice of method, stated or unstated epistemological orientations and knowledge claims, and limited attendance to criteria for good quality practice. In conclusion, it is argued that DA and CA can add to the existing quantitative and qualitative methods for family therapy research. They can both offer unique ways for a detailed study of the actual therapeutic dialog, provided that future attempts strive for a methodologically rigorous practice and against their uncritical deployment.

  3. Atlas based brain volumetry: How to distinguish regional volume changes due to biological or physiological effects from inherent noise of the methodology.

    Science.gov (United States)

    Opfer, Roland; Suppa, Per; Kepp, Timo; Spies, Lothar; Schippling, Sven; Huppertz, Hans-Jürgen

    2016-05-01

    Fully-automated regional brain volumetry based on structural magnetic resonance imaging (MRI) plays an important role in quantitative neuroimaging. In clinical trials as well as in clinical routine, multiple MRIs of individual patients at different time points need to be assessed longitudinally. Measures of inter- and intrascanner variability are crucial to understand the intrinsic variability of the method and to distinguish volume changes due to biological or physiological effects from inherent noise of the methodology. To measure regional brain volumes an atlas based volumetry (ABV) approach was deployed using a highly elastic registration framework and an anatomical atlas in a well-defined template space. We assessed inter- and intrascanner variability of the method in 51 cognitively normal subjects and 27 Alzheimer dementia (AD) patients from the Alzheimer's Disease Neuroimaging Initiative by studying volumetric results of repeated scans for 17 compartments and brain regions. Median percentage volume differences of scan-rescans from the same scanner ranged from 0.24% (whole brain parenchyma in healthy subjects) to 1.73% (occipital lobe white matter in AD), with generally higher differences in AD patients as compared to normal subjects (e.g., 1.01% vs. 0.78% for the hippocampus). Minimum percentage volume differences detectable with an error probability of 5% were in the one-digit percentage range for almost all structures investigated, with most of them being below 5%. Intrascanner variability was independent of magnetic field strength. The median interscanner variability was up to ten times higher than the intrascanner variability.
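
    The sketch below illustrates the kind of arithmetic involved: percentage scan-rescan volume differences and a minimum change detectable at roughly 5% error probability, here taken as 1.96 times the standard deviation of the signed percentage differences. The volumes are invented, and this is only one common way to express such a threshold, not necessarily the statistic used in the paper.

```python
import numpy as np

# Invented hippocampus volumes [ml] from scan-rescan pairs on the same scanner.
scan1 = np.array([3.61, 3.48, 3.72, 3.55, 3.80])
scan2 = np.array([3.58, 3.52, 3.69, 3.60, 3.77])

# Percentage volume difference per pair, as used for intrascanner variability.
pct_diff = 200.0 * np.abs(scan1 - scan2) / (scan1 + scan2)
print("median scan-rescan difference: %.2f%%" % np.median(pct_diff))

# Minimum detectable change at ~5% error probability, here taken as
# 1.96 times the SD of the signed scan-rescan percentage differences.
signed_pct = 200.0 * (scan1 - scan2) / (scan1 + scan2)
mdc95 = 1.96 * np.std(signed_pct, ddof=1)
print("volume change detectable with ~5%% error probability: %.2f%%" % mdc95)
```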

  4. Cost-Benefit Analysis Methodology: Install Commercially Compliant Engines on National Security Exempted Vessels?

    Science.gov (United States)

    2015-11-05

    Technologies considered include: 1. Selective catalytic reduction (SCR); 2. Diesel particulate filter (DPF) – electrically regenerated active (ERADPF) ... insurmountable obstacles such as vessel range, engine room space, SLM, additional electric power, etc. Recommendations are developed on the basis of both ...

  5. Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design

    Science.gov (United States)

    Tajino, Akira; James, Robert; Kijima, Kyoichi

    2005-01-01

    Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

  6. Methodology of data extraction from a corpus for the conceptual analysis of metaphor in legal English

    OpenAIRE

    Kucheruk, Liliya

    2014-01-01

    The present paper, entitled "Methodology of data extraction from a corpus for the conceptual analysis of metaphor in legal English", investigates the main approaches to the study of metaphor in legal language from the point of view of cognitive linguistics. It also presents the most widespread methods of data extraction from a corpus.

  7. Success story in software engineering using NIAM (Natural language Information Analysis Methodology)

    Energy Technology Data Exchange (ETDEWEB)

    Eaton, S.M.; Eaton, D.S.

    1995-10-01

    To create an information system, we employ NIAM (Natural language Information Analysis Methodology). NIAM supports the goals of both the customer and the analyst completely understanding the information. We use the customer's own unique vocabulary, collect real examples, and validate the information in natural language sentences. Examples are discussed from a successfully implemented information system.

  8. Environmental failure mode and effects analysis (FMEA – a new approach to methodology

    Directory of Open Access Journals (Sweden)

    M. Roszak

    2015-04-01

    Full Text Available The purpose of this paper is to present a concept of FMEA analysis for environmental aspects, together with a discussion of the importance, implementation and application of the proposed concept. The analyses and the developed E-FMEA methodology have resulted in a proposal of management tools for manufacturing processes.
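
    For reference, the sketch below computes the conventional FMEA risk priority number (RPN = severity x occurrence x detection) for a few hypothetical environmental aspects; the aspects, ratings, and action threshold are invented and do not reproduce the E-FMEA criteria proposed in the paper.

```python
# Conventional FMEA risk priority numbers for hypothetical environmental aspects.
# Ratings (1-10 scales) and the action threshold are illustrative only.
failure_modes = [
    # (aspect, severity, occurrence, detection)
    ("coolant leak to drain", 8, 3, 4),
    ("VOC emission from degreasing", 6, 5, 3),
    ("excess energy use of compressor", 4, 7, 2),
]

for aspect, sev, occ, det in failure_modes:
    rpn = sev * occ * det
    flag = "action required" if rpn >= 90 else "acceptable"
    print(f"{aspect:35s} RPN = {rpn:3d} ({flag})")
```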

  9. A Methodology for the Analysis of Memory Response to Radiation through Bitmap Superposition and Slicing

    CERN Document Server

    Bosser, A.; Tsiligiannis, G.; Ferraro, R.; Frost, C.; Javanainen, A.; Puchner, H.; Rossi, M.; Saigne, F.; Virtanen, A.; Wrobel, F.; Zadeh, A.; Dilillo, L.

    2015-01-01

    A methodology is proposed for the statistical analysis of memory radiation test data, with the aim of identifying trends in the single-event upset (SEU) distribution. The treated case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions.
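
    A minimal sketch of the superposition-and-slicing idea is shown below: per-run upset bitmaps are accumulated into per-cell counts and then sliced by row to look for spatial trends. The memory geometry and upset data are randomly generated stand-ins, not the 65 nm SRAM test data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented upset bitmaps: each run marks which cells of a (rows x cols) memory
# flipped during irradiation. Real data would come from readback comparisons.
rows, cols, runs = 128, 64, 20
bitmaps = rng.random((runs, rows, cols)) < 0.002   # ~0.2% of cells upset per run

# Superposition: per-cell upset counts across all runs.
superposed = bitmaps.sum(axis=0)

# Slicing: per-row totals reveal address-dependent trends in the SEU distribution.
row_totals = superposed.sum(axis=1)
print("total SEUs:", int(superposed.sum()))
print("most-upset row:", int(row_totals.argmax()), "with", int(row_totals.max()), "SEUs")
```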

  10. Synfuel program analysis. Volume I. Procedures-capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This is the first of the two volumes describing the analytic procedures and resulting capabilities developed by Resource Applications (RA) for examining the economic viability, public costs, and national benefits of alternative synfuel projects and integrated programs. This volume is intended for Department of Energy (DOE) and Synthetic Fuel Corporation (SFC) program management personnel and includes a general description of the costing, venture, and portfolio models with enough detail for the reader to be able to specify cases and interpret outputs. It also contains an explicit description (with examples) of the types of results which can be obtained when applied to: the analysis of individual projects; the analysis of input uncertainty, i.e., risk; and the analysis of portfolios of such projects, including varying technology mixes and buildup schedules. In all cases, the objective is to obtain, on the one hand, comparative measures of private investment requirements and expected returns (under differing public policies) as they affect the private decision to proceed, and, on the other, public costs and national benefits as they affect public decisions to participate (in what form, in what areas, and to what extent).

  11. User's operating procedures. Volume 2: Scout project financial analysis program

    Science.gov (United States)

    Harris, C. G.; Haris, D. K.

    1985-01-01

    A review is presented of the user's operating procedures for the Scout Project Automatic Data system, called SPADS. SPADS is the result of the past seven years of software development on a Prime mini-computer located at the Scout Project Office, NASA Langley Research Center, Hampton, Virginia. SPADS was developed as a single entry, multiple cross-reference data management and information retrieval system for the automation of Project office tasks, including engineering, financial, managerial, and clerical support. This volume, two (2) of three (3), provides the instructions to operate the Scout Project Financial Analysis program in data retrieval and file maintenance via the user friendly menu drivers.

  12. Environmentally-acceptable fossil energy site evaluation and selection: methodology and user's guide. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Northrop, G.M.

    1980-02-01

    This report is designed to facilitate assessments of environmental and socioeconomic impacts of fossil energy conversion facilities which might be implemented at potential sites. The discussion of methodology and the User's Guide contained herein are presented in a format that assumes the reader is not an energy technologist. Indeed, this methodology is meant for application by almost anyone with an interest in a potential fossil energy development - planners, citizen groups, government officials, and members of industry. It may also be of instructional value. The methodology is called: Site Evaluation for Energy Conversion Systems (SELECS) and is organized in three levels of increasing sophistication. Only the least complicated version - the Level 1 SELECS - is presented in this document. As stated above, it has been expressly designed to enable just about anyone to participate in evaluating the potential impacts of a proposed energy conversion facility. To accomplish this objective, the Level 1 calculations have been restricted to ones which can be performed by hand in about one working day. Data collection and report preparation may bring the total effort required for a first or one-time application to two to three weeks. If repeated applications are made in the same general region, the assembling of data for a different site or energy conversion technology will probably take much less time.

  13. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for strategy studies is demonstrated under modern conditions of high dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified in the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of the company’s capital is grounded. The economic nature of the company’s capital is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and investment resource in the economic process in order to obtain profit, to ensure the growth of owners’ prosperity and to achieve social effect.

  14. An efficient methodology for the analysis of primary frequency control of electric power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D.P. [Nikola Tesla Institute, Belgrade (Yugoslavia); Mijailovic, S.V. [Electricity Coordinating Center, Belgrade (Yugoslavia)

    2000-06-01

    The paper presents an efficient methodology for the analysis of primary frequency control of electric power systems. This methodology continuously monitors the electromechanical transient processes with durations that last up to 30 s, occurring after the characteristic disturbances. It covers the period of short-term dynamic processes, appearing immediately after the disturbance, in which the dynamics of the individual synchronous machines is dominant, as well as the period with the uniform movement of all generators and restoration of their voltages. The characteristics of the developed methodology were determined based on the example of real electric power interconnection formed by the electric power systems of Yugoslavia, a part of Republic of Srpska, Romania, Bulgaria, former Yugoslav Republic of Macedonia, Greece and Albania (the second UCPTE synchronous zone). (author)

  15. Motion analysis of knee joint using dynamic volume images

    Science.gov (United States)

    Haneishi, Hideaki; Kohno, Takahiro; Suzuki, Masahiko; Moriya, Hideshige; Mori, Sin-ichiro; Endo, Masahiro

    2006-03-01

    Acquisition and analysis of the three-dimensional movement of the knee joint is desired in orthopedic surgery. We have developed two methods to obtain dynamic volume images of the knee joint. One is a 2D/3D registration method combining bi-plane dynamic X-ray fluoroscopy and a static three-dimensional CT; the other is a method using so-called 4D-CT, which uses a cone beam and a wide 2D detector. In this paper, we present two analyses of knee joint movement obtained by these methods: (1) transition of the nearest points between femur and tibia, and (2) principal component analysis (PCA) of six parameters representing the three-dimensional movement of the knee. As preprocessing, the femur and tibia regions are first extracted from the volume data at each time frame, and then the tibia is registered between frames by an affine transformation consisting of rotation and translation. The same transformation is applied to the femur as well. Using those image data, the movement of the femur relative to the tibia can be analyzed. Six movement parameters of the femur, consisting of three translation parameters and three rotation parameters, are obtained from those images. In analysis (1), the axis of each bone is first found and then the flexion angle of the knee joint is calculated. For each flexion angle, the minimum distance between femur and tibia and the location giving the minimum distance are found in both the lateral and the medial condyle. As a result, it was observed that the movement of the lateral condyle is larger than that of the medial condyle. In analysis (2), it was found that the movement of the knee can be represented by the first three principal components with a precision of 99.58%, and those three components seem to relate strongly to three major movements of the femur in knee bending known in orthopedic surgery.
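
    The sketch below shows the PCA step of analysis (2) on an invented time series of six movement parameters (three translations, three rotations) and reports the variance captured by the first three components; the data are synthetic and only illustrate the computation, not the reported 99.58% result.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented time series of six femur-relative-to-tibia parameters per frame:
# three translations [mm] and three rotations [deg] over 50 frames.
frames = 50
flexion = np.linspace(0, 120, frames)
motion = np.column_stack([
    0.10 * flexion + rng.normal(0, 0.5, frames),   # anterior translation
    0.02 * flexion + rng.normal(0, 0.3, frames),   # lateral translation
    rng.normal(0, 0.3, frames),                    # superior translation
    flexion,                                       # flexion angle itself
    0.15 * flexion + rng.normal(0, 1.0, frames),   # internal rotation
    rng.normal(0, 0.5, frames),                    # abduction
])

# PCA via SVD of the centred and column-standardised data matrix.
X = (motion - motion.mean(axis=0)) / motion.std(axis=0)
_, s, _ = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("cumulative variance of first three components: %.2f%%"
      % (100.0 * explained[:3].sum()))
```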

  16. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and parametric survival models and the accelerated failure time (AFT) model with log-normal, log-logistic and Weibull distributions, were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.
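
    The sketch below runs two of the standard tests mentioned (Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney) on invented log peak intensities, simply dropping the missing values; a survival/AFT analysis would instead treat those missing intensities as left-censored. This is not the authors' code, and the data are purely illustrative.

```python
import numpy as np
from scipy import stats

# Invented log peak intensities for one protein in two conditions; NaN marks
# missing (left-censored) low-abundance features.
group_a = np.array([18.2, 17.9, np.nan, 18.6, 17.5, np.nan, 18.1])
group_b = np.array([19.0, 18.7, 18.9, np.nan, 19.3, 18.8, 19.1])

a = group_a[~np.isnan(group_a)]
b = group_b[~np.isnan(group_b)]

# Non-parametric tests on the observed values only.
ks_stat, ks_p = stats.ks_2samp(a, b)
u_stat, u_p = stats.mannwhitneyu(a, b, alternative="two-sided")
print(f"KS p = {ks_p:.3f}, Mann-Whitney p = {u_p:.3f}")
```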

  17. A new analysis methodology for the motion of self-propelled particles and its application

    Science.gov (United States)

    Byun, Young-Moo; Lammert, Paul; Crespi, Vincent

    2011-03-01

    Microscale self-propelled particles (SPPs) in solution are a growing field of study, with potential applications in nanomedicine and nanorobotics. However, little detailed quantitative analysis of SPP motion has been performed so far because the self-propelled motion is strongly coupled to Brownian motion, which makes the extraction of intrinsic propulsion mechanisms problematic and has led to inconsistent conclusions. Here, we present a novel way to decompose the motion of the SPP into self-propelled and Brownian components; accurate values for the self-propulsion speed and diffusion coefficients of the SPP are obtained for the first time. Then, we apply our analysis methodology to the ostensible chemotaxis of SPPs and reveal the actual (non-chemotactic) mechanism of the phenomenon, demonstrating that our analysis methodology is a powerful and reliable tool.
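
    The abstract does not spell out the decomposition itself. One common way to separate the two components is to fit the short-lag mean-squared displacement to MSD(tau) = 4 D tau + v^2 tau^2; the sketch below does this on a simulated trajectory with known speed and diffusion coefficient, purely as an illustration and not as the authors' method.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a 2D trajectory: constant propulsion speed v plus Brownian noise.
dt, n_steps = 0.1, 2000
v_true, D_true = 2.0, 0.5            # um/s and um^2/s
theta = rng.uniform(0, 2 * np.pi)    # fixed heading (short-time approximation)
steps = (v_true * dt * np.array([np.cos(theta), np.sin(theta)])
         + np.sqrt(2 * D_true * dt) * rng.normal(size=(n_steps, 2)))
traj = np.cumsum(steps, axis=0)

# Mean-squared displacement at short lag times, then fit MSD = 4 D tau + v^2 tau^2.
lags = np.arange(1, 20)
msd = np.array([np.mean(np.sum((traj[k:] - traj[:-k])**2, axis=1)) for k in lags])
tau = lags * dt
A = np.column_stack([4 * tau, tau**2])          # unknowns: [D, v^2]
(D_fit, v2_fit), *_ = np.linalg.lstsq(A, msd, rcond=None)
print(f"D = {D_fit:.2f} um^2/s, v = {np.sqrt(v2_fit):.2f} um/s")
```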

  18. Accidental safety analysis methodology development in decommission of the nuclear facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. H.; Hwang, J. H.; Jae, M. S.; Seong, J. H.; Shin, S. H.; Cheong, S. J.; Pae, J. H.; Ang, G. R.; Lee, J. U. [Seoul National Univ., Seoul (Korea, Republic of)

    2002-03-15

    Decontamination and Decommissioning (D and D) of a nuclear reactor costs about 20% of the construction expense, and the production of nuclear wastes during decommissioning raises environmental issues. Decommissioning of nuclear reactors in Korea is just beginning, and clear standards and regulations for decommissioning are lacking. This work on accident safety analysis in decommissioning of nuclear facilities can provide a solid basis for such standards and regulations. For the source term analysis of the Kori-1 reactor vessel, an MCNP/ORIGEN calculation methodology was applied. The activity of each important nuclide in the vessel was estimated for a time after 2008, the year the Kori-1 plant is expected to be decommissioned. A methodology for risk assessment in decommissioning was also developed.

  19. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for strategy studies is demonstrated under modern conditions of high dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified in the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of the company’s capital is grounded. The economic nature of the company’s capital is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and investment resource in the economic process in order to obtain profit, to ensure the growth of owners’ prosperity and to achieve social effect.

  20. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.
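
    As a minimal sketch of wrapping one analysis step as a RESTful service, the example below exposes a log2 fold-change calculation over HTTP using Flask; the framework, endpoint, and payload format are assumptions for illustration and do not describe the actual GEAS services or their semantic annotations.

```python
# Flask and the endpoint/payload names are assumptions for illustration,
# not the actual GEAS service interfaces.
import math
import statistics

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/api/foldchange", methods=["POST"])
def fold_change():
    """Return the log2 fold change between two lists of expression values."""
    data = request.get_json()
    mean_a = statistics.mean(data["condition_a"])
    mean_b = statistics.mean(data["condition_b"])
    return jsonify({"log2_fold_change": math.log2(mean_b / mean_a)})

if __name__ == "__main__":
    app.run(port=5000)
```

    A client would POST a JSON body such as {"condition_a": [...], "condition_b": [...]} to /api/foldchange and receive the computed value back as JSON.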

  1. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela D A Guardia

    Full Text Available Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  2. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency for the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.

  3. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Directory of Open Access Journals (Sweden)

    I. Escuder-Bueno

    2012-09-01

    Full Text Available Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009–2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology presented herein is that it provides a quantitative estimate of flood risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
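
    The before/after comparison ultimately reduces to an expected-damage calculation. The sketch below shows a crude discrete version, summing probability times damage over a handful of flood events with and without non-structural measures; the event probabilities and damages are invented, and the SUFRI methodology itself is considerably more detailed.

```python
import numpy as np

# Illustrative flood events: annual exceedance probability and damage [million EUR].
# Values are invented; a real application derives them from hazard and damage models.
prob          = np.array([0.10, 0.04, 0.01, 0.002])
damage_before = np.array([0.5,  2.0,  8.0,  30.0])
damage_after  = np.array([0.2,  1.0,  5.0,  22.0])   # with non-structural measures

# Crude discrete expected annual damage: sum of probability times consequence.
ead_before = float(np.sum(prob * damage_before))
ead_after = float(np.sum(prob * damage_after))
print(f"expected annual damage before measures: {ead_before:.3f} M EUR")
print(f"expected annual damage after measures:  {ead_after:.3f} M EUR")
print(f"risk reduction: {ead_before - ead_after:.3f} M EUR per year")
```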

  4. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare.

    Science.gov (United States)

    Bracke, M B M; Edwards, S A; Metz, J H M; Noordhuizen, J P T M; Algers, B

    2008-07-01

    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally applied. Recently, a qualitative Risk Assessment approach has been published by the European Food Safety Authority (EFSA) for the first time, concerning the welfare of intensively reared calves. This paper reports on a critical analysis of this Risk Assessment (RA) approach from a semantic-modelling (SM) perspective, emphasizing the importance of several seemingly self-evident principles, including the definition of concepts, application of explicit methodological procedures and specification of how underlying values and scientific information lead to the RA output. In addition, the need to include positive aspects of welfare and overall welfare assessments are emphasized. The analysis shows that the RA approach for animal welfare could benefit from SM methodology to support transparent and science-based decision-making.

  5. Human factors evaluation of teletherapy: Function and task analysis. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kaye, R.D.; Henriksen, K.; Jones, R. [Hughes Training, Inc., Falls Church, VA (United States); Morisseau, D.S.; Serig, D.I. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-07-01

    As a treatment methodology, teletherapy selectively destroys cancerous and other tissue by exposure to an external beam of ionizing radiation. Sources of radiation are either a radioactive isotope, typically Cobalt-60 (Co-60), or a linear accelerator. Records maintained by the NRC have identified instances of teletherapy misadministration where the delivered radiation dose has differed from the radiation prescription (e.g., instances where fractions were delivered to the wrong patient, to the wrong body part, or were too great or too little with respect to the defined treatment volume). Both human error and machine malfunction have led to misadministrations. Effective and safe treatment requires a concern for precision and consistency of human-human and human-machine interactions throughout the course of therapy. The present study is the first part of a series of human factors evaluations for identifying the root causes that lead to human error in the teletherapy environment. The human factors evaluations included: (1) a function and task analysis of teletherapy activities, (2) an evaluation of the human-system interfaces, (3) an evaluation of procedures used by teletherapy staff, (4) an evaluation of the training and qualifications of treatment staff (excluding the oncologists), (5) an evaluation of organizational practices and policies, and (6) an identification of problems and alternative approaches for NRC and industry attention. The present report addresses the function and task analysis of teletherapy activities and provides the foundation for the conduct of the subsequent evaluations. The report includes sections on background, methodology, a description of the function and task analysis, and use of the task analysis findings for the subsequent tasks. The function and task analysis data base also is included.

  6. An empirical methodology derived from the analysis of information remaining on second hand hard disks

    Science.gov (United States)

    Fragkos, Grigorios; Mee, Vivienne; Xynos, Konstantinos; Angelopoulou, Olga

    In this paper we present the findings of an analysis of approximately 260 second-hand disks that was conducted in 2006. A third party organisation bought the disks from the second-hand market, providing a degree of anonymity. This paper will demonstrate the quantitative outcomes of the analysis and the overall experiences. It will look at how analysts can expand their tools and techniques in order to achieve faster results, how one can organise the analysis based on the way information is found, and finally how a holistic picture of the case can be generated following the proposed methodology.

  7. Methodological Principles of Assessing the Volume of Investment Influx from Non-State Pension Funds into the Economy of Ukraine

    Directory of Open Access Journals (Sweden)

    Dmitro Leonov

    2004-11-01

    Full Text Available This article addresses the processes of forming investment resources from non-state pension funds under current conditions in Ukraine and the laws and regulations that define the principles of the formation of investment institutions. Based on factors that in the nearest future will affect the decision-making process by which different kinds of investors make payments to non-state pension funds, we develop a procedure for assessing the volume of investment influx from non-state pension funds into the economy and propose a procedure for long- and short-term forecasting of the volume of investment influx from non-state pension funds into the Ukrainian economy.

  8. Three-dimensional volume analysis of vasculature in engineered tissues

    Science.gov (United States)

    YousefHussien, Mohammed; Garvin, Kelley; Dalecki, Diane; Saber, Eli; Helguera, María.

    2013-01-01

    Three-dimensional textural and volumetric image analysis holds great potential in understanding the image data produced by multi-photon microscopy. In this paper, an algorithm that quantitatively analyzes the texture and the morphology of vasculature in engineered tissues is proposed. The investigated 3D artificial tissues consist of Human Umbilical Vein Endothelial Cells (HUVEC) embedded in collagen and exposed to two regimes of ultrasound standing wave fields under different pressure conditions. Textural features were evaluated using the normalized Gray-Level Co-occurrence Matrix (GLCM) combined with Gray-Level Run Length Matrix (GLRLM) analysis. To minimize error resulting from any possible volume rotation and to provide a comprehensive textural analysis, an averaged version of nine GLCM and GLRLM orientations is used. To evaluate volumetric features, an automatic threshold using the gray level mean value is utilized. Results show that our analysis is able to differentiate among the exposed samples, due to morphological changes induced by the standing wave fields. Furthermore, we demonstrate that providing more textural parameters than are currently reported in the literature enhances the quantitative understanding of the heterogeneity of artificial tissues.
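
    For orientation, the sketch below computes a few orientation-averaged GLCM texture properties on a synthetic 2-D image, assuming scikit-image is available (the functions are spelled greycomatrix/greycoprops in older releases). The paper works on 3-D volumes and additionally uses GLRLM features and an automatic volume threshold, which are not reproduced here.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(1)
        img = (rng.random((128, 128)) * 255).astype(np.uint8)   # placeholder image slice

        angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]        # average over orientations
        glcm = graycomatrix(img, distances=[1], angles=angles,
                            levels=256, symmetric=True, normed=True)

        for prop in ("contrast", "homogeneity", "energy", "correlation"):
            # graycoprops returns one value per (distance, angle); take the orientation mean
            print(prop, float(graycoprops(glcm, prop).mean()))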

  9. Two-dimensional thermal analysis of a fuel rod by finite volume method

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Rhayanne Y.N.; Silva, Mario A.B. da; Lira, Carlos A.B. de O., E-mail: ryncosta@gmail.com, E-mail: mabs500@gmail.com, E-mail: cabol@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamaento de Energia Nuclear

    2015-07-01

    In a nuclear reactor, the amount of power generation is limited by thermal and physical limitations rather than by nuclear parameters. The operation of a reactor core, considering the best heat removal system, must take into account the fact that the temperatures of fuel and cladding shall not exceed safety limits anywhere in the core. If these limits are not respected, damage to the fuel element may release large quantities of radioactive material into the coolant or even lead to core meltdown. Thermal analyses of fuel rods are often accomplished by considering the one-dimensional heat diffusion equation. The aim of this study is to verify the temperature distribution for a two-dimensional heat transfer problem in an advanced reactor. The methodology is based on the Finite Volume Method (FVM), which considers a balance for the property of interest. The methodology is validated by comparing numerical and analytical solutions. For the two-dimensional analysis, the results indicate that the temperature profile agrees with expected physical considerations, providing quantitative information for the development of advanced reactors. (author)
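
    As an illustration of the finite-volume balance idea, the sketch below iterates the discrete cell balances of a 2-D steady conduction problem with a uniform heat source on a square grid; the geometry, properties and source value are invented round numbers, not the fuel-rod data of the paper.

        import numpy as np

        n = 41                 # control volumes per side
        dx = 0.25e-3           # cell size [m] (assumed)
        k = 3.0                # thermal conductivity [W/m.K] (assumed)
        q = 5.0e7              # volumetric heat source [W/m3] (assumed)
        T_wall = 600.0         # fixed boundary temperature [K] (assumed)

        T = np.full((n, n), T_wall)
        for _ in range(20000):                     # Jacobi sweeps over the interior balances
            T_new = T.copy()
            T_new[1:-1, 1:-1] = 0.25 * (T[2:, 1:-1] + T[:-2, 1:-1] +
                                        T[1:-1, 2:] + T[1:-1, :-2] + q * dx**2 / k)
            if np.max(np.abs(T_new - T)) < 1e-6:   # stop when the field has converged
                T = T_new
                break
            T = T_new

        print(f"peak temperature: {T.max():.1f} K")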

  10. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S., E-mail: dayane.silva@usp.br, E-mail: gdjian@ipen.br, E-mail: aclima@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The aim of this work is to present the methodology for calculating the Angra 2 reactor containment response during accidents of the Loss of Coolant Accident (LOCA) type. This study will help ensure the safety of the surrounding population in the event of such accidents. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool used for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system in a light-water reactor during an accident. The containment of a PWR plant is a concrete building lined internally with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  11. Heterogenic Solid Biofuel Sampling Methodology and Uncertainty Associated with Prompt Analysis

    Directory of Open Access Journals (Sweden)

    Jose A. Pazó

    2010-05-01

    Full Text Available Accurate determination of the properties of biomass is of particular interest in studies on biomass combustion or cofiring. The aim of this paper is to develop a methodology for prompt analysis of heterogeneous solid fuels with an acceptable degree of accuracy. Special care must be taken with the sampling procedure to achieve an acceptable degree of error and low statistical uncertainty. A sampling and error determination methodology for prompt analysis is presented and validated. Two approaches for the propagation of errors are also given and some comparisons are made in order to determine which may be better in this context. Results show in general low, acceptable levels of uncertainty, demonstrating that the samples obtained in the process are representative of the overall fuel composition.
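
    Two propagation rules commonly compared for this kind of derived prompt-analysis quantity are the worst-case (linear) sum and the root-sum-square of independent contributions; whether these are exactly the two approaches of the paper is not stated here, so the sketch below is only indicative, and the masses and uncertainties are invented.

        import math

        m_ash, u_ash = 0.52, 0.01          # ash mass [g] and its absolute uncertainty (assumed)
        m_sample, u_sample = 10.03, 0.02   # sample mass [g] and its absolute uncertainty (assumed)

        ash_fraction = m_ash / m_sample
        rel_linear = u_ash / m_ash + u_sample / m_sample                          # worst-case sum
        rel_quad = math.sqrt((u_ash / m_ash) ** 2 + (u_sample / m_sample) ** 2)   # independent errors

        print(f"ash fraction = {ash_fraction:.4f}")
        print(f"linear propagation:     +/- {rel_linear * ash_fraction:.4f}")
        print(f"quadrature propagation: +/- {rel_quad * ash_fraction:.4f}")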

  12. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    Full Text Available The objective of this article is to present grounded theory, used in qualitative research, as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting business requirements and following the business strategy.

  13. Putting phylogeny into the analysis of biological traits: a methodological approach.

    Science.gov (United States)

    Jombart, Thibaut; Pavoine, Sandrine; Devillard, Sébastien; Pontier, Dominique

    2010-06-01

    Phylogenetic comparative methods have long considered phylogenetic signal as a source of statistical bias in the correlative analysis of biological traits. However, the main life-history strategies existing in a set of taxa are often combinations of life history traits that are inherently phylogenetically structured. In this paper, we present a method for identifying evolutionary strategies from large sets of biological traits, using phylogeny as a source of meaningful historical and ecological information. Our methodology extends a multivariate method developed for the analysis of spatial patterns, and relies on finding combinations of traits that are phylogenetically autocorrelated. Using extensive simulations, we show that our method efficiently uncovers phylogenetic structures with respect to various tree topologies, and remains powerful in cases where a large majority of traits are not phylogenetically structured. Our methodology is illustrated using empirical data, and implemented in the adephylo package for the free software R.

  14. Methodological Approach for Performing Human Reliability and Error Analysis in Railway Transportation System

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2011-10-01

    Full Text Available Today, billions of dollars are being spent annually worldwide to develop, manufacture, and operate transportation systems such as trains, ships, aircraft, and motor vehicles. Around 70 to 90 percent of transportation crashes are, directly or indirectly, the result of human error. In fact, with the development of technology, system reliability has increased dramatically during the past decades, while human reliability has remained unchanged over the same period. Accordingly, human error is now considered the most significant source of accidents or incidents in safety-critical systems. The aim of the paper is to propose a methodological approach to improve the reliability of transportation systems, and in particular railway transportation systems. The methodology presented is based on Failure Modes, Effects and Criticality Analysis (FMECA) and Human Reliability Analysis (HRA).
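
    A classical FMECA-style ranking by risk priority number (severity x occurrence x detection) can be sketched as below; the failure modes and 1-10 ratings are hypothetical, and the paper's coupling of FMECA with human reliability analysis is not shown.

        # Hypothetical railway failure modes with 1-10 ratings (illustration only)
        failure_modes = [
            {"mode": "signal passed at danger (driver error)", "S": 9,  "O": 3, "D": 4},
            {"mode": "track circuit false clear",              "S": 10, "O": 2, "D": 5},
            {"mode": "door close interlock failure",           "S": 6,  "O": 4, "D": 3},
            {"mode": "dispatcher mis-sets route",              "S": 8,  "O": 3, "D": 6},
        ]

        for fm in failure_modes:
            fm["RPN"] = fm["S"] * fm["O"] * fm["D"]     # severity x occurrence x detection

        # Rank by descending RPN to prioritise mitigation effort
        for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
            print(f'{fm["RPN"]:4d}  {fm["mode"]}')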

  15. Methodological reflections on gesture analysis in second language acquisition and bilingualism research

    OpenAIRE

    Gullberg, M

    2010-01-01

    Gestures, the symbolic movements speakers perform while they speak, form a closely inter-connected system with speech where gestures serve both addressee-directed (‘communicative’) and speaker-directed (’internal’) functions. This paper aims (1) to show that a combined analysis of gesture and speech offers new ways to address theoretical issues in SLA and bilingualism studies, probing SLA and bilingualism as product and process; and (2) to outline some methodological concerns and desiderata t...

  16. Methodology of image analysis for study of the vertisols moisture content

    OpenAIRE

    Cumbrera Gonzalez, Ramiro Alberto; Milán Vega, Humberto; Tarquis Alfonso, Ana Maria

    2014-01-01

    The main problem in studying vertical drainage from the moisture distribution on a vertisol profile is finding suitable methods for these procedures. Our aim was to design a digital image processing methodology and its analysis to characterize the moisture content distribution of a vertisol profile. In this research, twelve soil pits were excavated on a bare Mazic Pellic Vertisol, six of them on May 13, 2011 and the rest on May 19, 2011, after a moderate rainfall event. Digi...

  17. Application of the Simulation Based Reliability Analysis on the LBB methodology

    OpenAIRE

    Pečínka L.; Švrček M.

    2008-01-01

    Guidelines on how to demonstrate the existence of Leak Before Break (LBB) have been developed in many western countries. These guidelines, partly based on NUREG/CR-6765, define the steps that should be fulfilled to get a conservative assessment of LBB acceptability. As a complement and also to help identify the key parameters that influence the resulting leakage and failure probabilities, the application of Simulation Based Reliability Analysis is under development. The used methodology will ...

  18. Risk Assessment Planning for Airborne Systems: An Information Assurance Failure Mode, Effects and Criticality Analysis Methodology

    Science.gov (United States)

    2012-06-01

    ...that a physical vehicle or system could fail, one of the earliest methodologies used was FMEA, failure mode and effects analysis (MIL-P-1629, 1949)... marginal, and minor failures, and included both direct effects and secondary effects. The early FMEA process was refined, and utilized in the space... systems (Goddard, Validating the Safety of Real-Time Control Systems Using FMEA, 1993), which moves into the realm of failure modes which are not...

  19. Development of an image analysis methodology for animal cell cultures characterization

    OpenAIRE

    Amaral, A.L.; Mesquita, D. P.; Xavier, Mariana; Rodrigues, L. R.; Kluskens, Leon; Ferreira, E. C.

    2014-01-01

    To establish a strong cell culture protocol and to evaluate experimental results, a quantitative determination of animal cell characteristics, such as confluence and morphology, is quite often required. Quantitative image analysis using automated processing has become a routine methodology in a wide range of applications, with the advantage of being non-invasive and non-destructive. However, in animal cell cultures, automatic techniques giving valuable information based on visual inspection ar...

  20. Analysis of Partial Volume Effects on Accurate Measurement of the Hippocampus Volume

    Institute of Scientific and Technical Information of China (English)

    Maryam Hajiesmaeili; Jamshid Dehmeshki; Tim Ellis

    2014-01-01

    Hippocampal volume loss is an important biomarker in distinguishing subjects with Alzheimer's disease (AD), and its measurement in magnetic resonance images (MRI) is influenced by partial volume effects (PVE). This paper describes a post-processing approach to quantify PVE for correction of the hippocampal volume by using a spatial fuzzy C-means (SFCM) method. The algorithm is evaluated on a dataset of 20 T1-weighted MRI scans sampled at two different resolutions. The corrected volumes for the left and right hippocampus (HC) are 23% and 18% lower than the manual-segmentation volumes for the low resolution dataset, and 6% and 5% lower for the high resolution dataset, respectively. Results show the importance of applying this technique in AD detection with low resolution datasets.
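
    For orientation, a plain (non-spatial) fuzzy C-means iteration on synthetic 1-D intensities is sketched below; the paper's SFCM adds a spatial neighbourhood term that is not reproduced here, and all numbers are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        # Synthetic voxel intensities: two tissue classes plus mixed (partial-volume) voxels
        x = np.concatenate([rng.normal(30, 3, 300), rng.normal(70, 3, 300), rng.uniform(30, 70, 60)])

        c = np.array([20.0, 80.0])     # initial cluster centres (assumed)
        m = 2.0                        # fuzzifier
        for _ in range(100):
            d = np.abs(x[:, None] - c[None, :]) + 1e-12                           # distances to centres
            u = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
            c_new = (u ** m).T @ x / np.sum(u ** m, axis=0)                       # weighted centre update
            if np.max(np.abs(c_new - c)) < 1e-6:
                c = c_new
                break
            c = c_new

        # Each row of u gives per-voxel class memberships, usable as a partial-volume estimate
        print("cluster centres:", np.round(c, 2))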

  1. Parametric analysis of architectural volumes through genetic algorithms

    Directory of Open Access Journals (Sweden)

    Pedro Salcedo Lagos

    2015-03-01

    Full Text Available In recent times, architectural design has developed partly due to new digital design techniques, which allow the generation of geometries based on the definition of initial parameters and the programming of formal relationships between them. Design processes based on these technologies make it possible to create shapes with the capacity to modify and adapt to multiple constraints or specific evaluation criteria, which raises the problem of identifying the best architectural solution. Several experiences have established the use of genetic algorithms to face this problem. This paper demonstrates the possibility of implementing a parametric analysis of architectural volumes with a genetic algorithm, in order to combine functional, environmental and structural requirements with an effective search method able to select a variety of proper solutions through digital technologies.
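
    A toy version of such a search is sketched below: a genetic algorithm over three box dimensions with a fitness that trades floor area against envelope area. The encoding, fitness weights and bounds are invented and are not those of the paper.

        import random

        BOUNDS = [(5, 40), (5, 40), (3, 60)]          # width, depth, height in metres (assumed)

        def fitness(ind):
            w, d, h = ind
            floors = max(1, int(h // 3))              # assume 3 m storey height
            floor_area = w * d * floors
            envelope = 2 * (w + d) * h + w * d        # walls plus roof, a cost proxy
            return floor_area - 2.0 * envelope        # weighting is an assumption

        def random_ind():
            return [random.uniform(lo, hi) for lo, hi in BOUNDS]

        def mutate(ind, p=0.3):
            out = []
            for g, (lo, hi) in zip(ind, BOUNDS):
                if random.random() < p:
                    g = g + random.gauss(0, 0.1 * (hi - lo))
                out.append(min(hi, max(lo, g)))       # keep genes inside bounds
            return out

        def crossover(a, b):
            return [ga if random.random() < 0.5 else gb for ga, gb in zip(a, b)]

        pop = [random_ind() for _ in range(60)]
        for _ in range(100):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:20]                        # truncation selection
            pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                             for _ in range(40)]

        best = max(pop, key=fitness)
        print([round(g, 1) for g in best], round(fitness(best), 1))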

  2. Coal gasification systems engineering and analysis. Volume 1: Executive summary

    Science.gov (United States)

    1980-01-01

    Feasibility analyses and systems engineering studies for a 20,000 tons per day medium Btu (MBG) coal gasification plant to be built by TVA in Northern Alabama were conducted. Major objectives were as follows: (1) provide design and cost data to support the selection of a gasifier technology and other major plant design parameters, (2) provide design and cost data to support alternate product evaluation, (3) prepare a technology development plan to address areas of high technical risk, and (4) develop schedules, PERT charts, and a work breakdown structure to aid in preliminary project planning. Volume one contains a summary of gasification system characterizations. Five gasification technologies were selected for evaluation: Koppers-Totzek, Texaco, Lurgi Dry Ash, Slagging Lurgi, and Babcock and Wilcox. A summary of the trade studies and cost sensitivity analysis is included.

  3. [Development of New Mathematical Methodology in Air Traffic Control for the Analysis of Hybrid Systems

    Science.gov (United States)

    Hermann, Robert

    1997-01-01

    The aim of this research is to develop new mathematical methodology for the analysis of hybrid systems of the type involved in Air Traffic Control (ATC) problems. Two directions of investigation were initiated. The first used the methodology of nonlinear generalized functions, whose mathematical foundations were initiated by Colombeau and developed further by Oberguggenberger; it has been extended to apply to ordinary differential systems of the type encountered in control, in joint work with the PI and M. Oberguggenberger. This involved a 'mixture' of 'continuous' and 'discrete' methodology. ATC clearly involves mixtures of two sorts of mathematical problems: (1) the 'continuous' dynamics of a standard control type described by ordinary differential equations (ODE) of the form dx/dt = f(x, u), and (2) the discrete lattice dynamics involved in cellular automata. Most of the CA literature involves a discretization of a partial differential equation system of the type encountered in physics problems (e.g., fluid and gas problems). Both of these directions require much thinking and new development of mathematical fundamentals before they may be utilized in the ATC work. Rather than consider CA as a 'discretization' of PDE systems, I believe that the ATC applications will require a completely different and new mathematical methodology, a sort of discrete analogue of jet bundles and/or the sheaf-theoretic techniques of topologists. Here too, I have begun work on virtually 'virgin' mathematical ground (at least from an 'applied' point of view) which will require considerable preliminary work.

  4. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    Full Text Available The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be the feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills this gap, contributing to the analysis of how feedback processes are implemented while students discuss collaboratively in a specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship between written and asynchronous group interaction, students' activity, and changes incorporated into the final text. The model proposed includes the following dimensions: (1) student participation, (2) nature of student learning, and (3) quality of student learning. The main contribution of this article is to present the methodological model and also to ascertain the model's operativity regarding how students incorporate such feedback into their essays.

  5. Technical support document: Energy efficiency standards for consumer products: Room air conditioners, water heaters, direct heating equipment, mobile home furnaces, kitchen ranges and ovens, pool heaters, fluorescent lamp ballasts and television sets. Volume 1, Methodology

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-01

    The Energy Policy and Conservation Act (P.L. 94-163), as amended, establishes energy conservation standards for 12 of the 13 types of consumer products specifically covered by the Act. The legislation requires the Department of Energy (DOE) to consider new or amended standards for these and other types of products at specified times. DOE is currently considering amending standards for seven types of products: water heaters, direct heating equipment, mobile home furnaces, pool heaters, room air conditioners, kitchen ranges and ovens (including microwave ovens), and fluorescent light ballasts and is considering establishing standards for television sets. This Technical Support Document presents the methodology, data, and results from the analysis of the energy and economic impacts of the proposed standards. This volume presents a general description of the analytic approach, including the structure of the major models.

  6. Network meta-analysis-highly attractive but more methodological research is needed

    Directory of Open Access Journals (Sweden)

    Singh Sonal

    2011-06-01

    Full Text Available Abstract Network meta-analysis, in the context of a systematic review, is a meta-analysis in which multiple treatments (that is, three or more) are being compared using both direct comparisons of interventions within randomized controlled trials and indirect comparisons across trials based on a common comparator. To ensure validity of findings from network meta-analyses, the systematic review must be designed rigorously and conducted carefully. Aspects of designing and conducting a systematic review for network meta-analysis include defining the review question, specifying eligibility criteria, searching for and selecting studies, assessing risk of bias and quality of evidence, conducting a network meta-analysis, and interpreting and reporting findings. This commentary summarizes the methodologic challenges and research opportunities for network meta-analysis relevant to each aspect of the systematic review process based on discussions at a network meta-analysis methodology meeting we hosted in May 2010 at the Johns Hopkins Bloomberg School of Public Health. Since this commentary reflects the discussion at that meeting, it is not intended to provide an overview of the field.
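
    One elementary building block of such indirect comparisons is the Bucher-style adjusted indirect estimate, sketched below with invented log odds ratios; the commentary itself does not prescribe this particular calculation.

        import math

        d_AB, se_AB = -0.40, 0.15    # treatment A vs common comparator B (assumed values)
        d_CB, se_CB = -0.10, 0.12    # treatment C vs common comparator B (assumed values)

        d_AC = d_AB - d_CB                         # indirect effect of A vs C
        se_AC = math.sqrt(se_AB**2 + se_CB**2)     # variances add for independent comparisons
        lo, hi = d_AC - 1.96 * se_AC, d_AC + 1.96 * se_AC

        print(f"indirect log-OR A vs C: {d_AC:.2f} (95% CI {lo:.2f} to {hi:.2f})")
        print(f"odds ratio: {math.exp(d_AC):.2f} ({math.exp(lo):.2f} to {math.exp(hi):.2f})")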

  7. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 3-Surry Unit 1 Cycle 2

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M.

    1995-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using selected critical configurations from commercial pressurized-water reactors. The analysis methodology selected for all the calculations in this report is based on the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies in the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and to provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k{sub eff}) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of two reactor critical configurations for Surry Unit 1 Cycle 2. This unit and cycle were chosen for a previous analysis using a different methodology because detailed isotopics from multidimensional reactor calculations were available from the Virginia Power Company. These data permitted a direct comparison of criticality calculations using the utility-calculated isotopics with those using the isotopics generated by the SCALE-4

  8. Methodology for the analysis of fenbendazole and its metabolites in plasma, urine, feces, and tissue homogenates.

    Science.gov (United States)

    Barker, S A; Hsieh, L C; Short, C R

    1986-05-15

    New methodology for the extraction and analysis of the anthelmintic fenbendazole and its metabolites from plasma, urine, liver homogenates, and feces from several animal species is presented. Quantitation of fenbendazole and its metabolites was conducted by high-pressure liquid chromatography using ultraviolet detection at 290 nm. The combined extraction and analysis procedures give excellent recoveries in all of the different biological matrices examined. High specificity, low limits of detection, and excellent linearity, accuracy, and inter- and intrasample variability were also obtained. The study of fenbendazole pharmacokinetics in vitro and in vivo should be greatly enhanced through the utilization of these methods.

  9. Socio-economic Value Analysis in Geospatial and Earth Observation: A methodology review (Invited)

    Science.gov (United States)

    Coote, A. M.; Bernknopf, R.; Smart, A.

    2013-12-01

    Many industries have long since realised that applying macro-economic analysis methodologies to assess the socio-economic value of a programme is a critical step to convincing decision makers to authorise investment. The geospatial and earth observation industry has, however, been slow to embrace economic analysis. There are, however, a growing number of studies, published in the last few years, that have applied economic principles to this domain. They have adopted a variety of different approaches, including: Computable General Equilibrium (CGE) modelling; revealed preference and stated preference (willingness-to-pay surveys); partial analysis; simulations; and cost-benefit analysis (with and without risk analysis). This paper will critically review these approaches and assess their applicability to different situations and to meeting multiple objectives.
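
    Of the approaches listed above, cost-benefit analysis is the most easily illustrated; the fragment below discounts invented cost and benefit streams to a net present value and a benefit-cost ratio. The discount rate and cash flows are assumptions, not figures from any of the reviewed studies.

        rate = 0.035                              # social discount rate (assumed)
        costs    = [10.0, 2.0, 2.0, 2.0, 2.0]     # programme costs per year, $M (assumed)
        benefits = [0.0, 4.0, 5.0, 6.0, 7.0]      # monetised benefits per year, $M (assumed)

        pv_benefits = sum(b / (1 + rate) ** t for t, b in enumerate(benefits))
        pv_costs = sum(c / (1 + rate) ** t for t, c in enumerate(costs))

        print(f"NPV = {pv_benefits - pv_costs:.2f} $M")
        print(f"benefit-cost ratio = {pv_benefits / pv_costs:.2f}")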

  10. Ruling the Commons. Introducing a new methodology for the analysis of historical commons

    Directory of Open Access Journals (Sweden)

    Tine de Moor

    2016-10-01

    Full Text Available Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area within commons studies. During the last years an international team of historians have worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge on the dynamics of institutions for collective action – in particular commons. This project aims to contribute to the current debate on commons on three different fronts. Theoretically, it explicitly draws our attention to issues of change and adaptation in the commons – contrasting with more static analyses. Empirically, it highlights the value of historical records as a rich source of information for longitudinal analysis of the functioning of commons. Methodologically, it develops a systematic way of analyzing and comparing commons’ regulations across regions and time, setting a number of variables that have been defined on the basis of the “most common denominators” in commons regulation across countries and time periods. In this paper we introduce the project, describe our sources and methodology, and present the preliminary results of our analysis.

  11. [Application Status of Evaluation Methodology of Electronic Medical Record: Evaluation of Bibliometric Analysis].

    Science.gov (United States)

    Lin, Dan; Liu, Jialin; Zhang, Rui; Li, Yong; Huang, Tingting

    2015-04-01

    In order to provide a reference and theoretical guidance for the evaluation of electronic medical records (EMR) and the establishment of an evaluation system in China, we applied a bibliometric analysis to assess the methodologies used at home and abroad, as well as to summarize their advantages and disadvantages. We systematically searched the international medical databases Ovid-MEDLINE, EBSCOhost, EI, EMBASE, PubMed and IEEE, and China's medical databases CBM and CNKI, between Jan. 1997 and Dec. 2012. We also reviewed the reference lists of articles for relevant articles. We selected qualified papers according to pre-established inclusion and exclusion criteria, and performed information extraction and analysis on them. Eventually, 1,736 papers were obtained from the online databases and another 16 articles from manual retrieval. Thirty-five articles met the inclusion and exclusion criteria and were retrieved and assessed. In the evaluation of EMR, the US accounted for 54.28% of studies, in the leading place, while Canada and Japan ranked joint second with 8.58% each. For the evaluation methodologies applied, the Information System Success Model, the Technology Acceptance Model (TAM), the Innovation Diffusion Model and the Cost-Benefit Access Model were the most widely used, with 25%, 20%, 12.5% and 10%, respectively. In this paper, we summarize our study on the application of methodologies for EMR evaluation, which can provide a reference for EMR evaluation in China.

  12. Methodological approaches to planar and volumetric scintigraphic imaging of small volume targets with high spatial resolution and sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Mejia, J.; Galvis-Alonso, O.Y. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Faculdade de Medicina. Dept. de Biologia Molecular], e-mail: mejia_famerp@yahoo.com.br; Braga, J. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Div. de Astrofisica; Correa, R. [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Div. de Ciencia Espacial e Atmosferica; Leite, J.P. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Neurologia, Psiquiatria e Psicologia Medica; Simoes, M.V. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Clinica Medica

    2009-08-15

    Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique, which provides information reporting the functional states of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of diseases for physiopathological, genomic and drug discovery studies. However, most of the experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than man. Consequently, images of targets obtained with conventional gamma-cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years in order to obtain images of small targets with good spatial resolution and sensitivity. Multi pinhole, coded mask- and slit-based collimators are presented as alternative approaches to improve image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3-D representations of the volumetric distribution of target's radiotracers. Simultaneously, they can be used to minimize artifacts and blurring arising when single pinhole collimators are used. Representation images are presented, which illustrate the use of these collimators. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction to obtain near-field images. We conclude this review by showing that the use of appropriate hardware and software tools adapted to conventional gamma-cameras can be of great help in obtaining relevant functional information in experiments using small animals. (author)

  13. A gap analysis methodology for collecting crop genepools: a case study with phaseolus beans.

    Directory of Open Access Journals (Sweden)

    Julián Ramírez-Villegas

    Full Text Available BACKGROUND: The wild relatives of crops represent a major source of valuable traits for crop improvement. These resources are threatened by habitat destruction, land use changes, and other factors, requiring their urgent collection and long-term availability for research and breeding from ex situ collections. We propose a method to identify gaps in ex situ collections (i.e. gap analysis of crop wild relatives as a means to guide efficient and effective collecting activities. METHODOLOGY/PRINCIPAL FINDINGS: The methodology prioritizes among taxa based on a combination of sampling, geographic, and environmental gaps. We apply the gap analysis methodology to wild taxa of the Phaseolus genepool. Of 85 taxa, 48 (56.5% are assigned high priority for collecting due to lack of, or under-representation, in genebanks, 17 taxa are given medium priority for collecting, 15 low priority, and 5 species are assessed as adequately represented in ex situ collections. Gap "hotspots", representing priority target areas for collecting, are concentrated in central Mexico, although the narrow endemic nature of a suite of priority species adds a number of specific additional regions to spatial collecting priorities. CONCLUSIONS/SIGNIFICANCE: Results of the gap analysis method mostly align very well with expert opinion of gaps in ex situ collections, with only a few exceptions. A more detailed prioritization of taxa and geographic areas for collection can be achieved by including in the analysis predictive threat factors, such as climate change or habitat destruction, or by adding additional prioritization filters, such as the degree of relatedness to cultivated species (i.e. ease of use in crop breeding. Furthermore, results for multiple crop genepools may be overlaid, which would allow a global analysis of gaps in ex situ collections of the world's plant genetic resources.

  14. Control Volume Analysis, Entropy Balance and the Entropy Production in Flow Systems

    OpenAIRE

    Niven, Robert K.; Noack, Bernd R.

    2014-01-01

    This chapter concerns "control volume analysis", the standard engineering tool for the analysis of flow systems, and its application to entropy balance calculations. Firstly, the principles of control volume analysis are enunciated and applied to flows of conserved quantities (e.g. mass, momentum, energy) through a control volume, giving integral (Reynolds transport theorem) and differential forms of the conservation equations. Several definitions of steady state are discussed. The concept of...
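
    For reference, the integral balance on which such a control-volume analysis rests is the Reynolds transport theorem; a standard textbook form for a fixed control volume is sketched below (not quoted from the chapter).

        % B: extensive quantity of the system, beta = dB/dm its intensive counterpart,
        % CV: fixed control volume bounded by control surface CS,
        % u: fluid velocity, n: outward unit normal on CS
        \frac{\mathrm{d}B_{\mathrm{sys}}}{\mathrm{d}t}
          = \frac{\partial}{\partial t}\int_{\mathrm{CV}} \rho\,\beta\,\mathrm{d}V
          + \oint_{\mathrm{CS}} \rho\,\beta\,(\mathbf{u}\cdot\mathbf{n})\,\mathrm{d}A

    Setting β = 1 gives mass conservation and β equal to a velocity component gives the momentum balance; an entropy balance follows once a production term is admitted, which is the direction the chapter then takes.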

  15. A new methodology to study customer electrocardiogram using RFM analysis and clustering

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Gholamian

    2011-04-01

    Full Text Available One of the primary issues in marketing planning is to know the customer's behavioral trends. A customer's purchasing interest may fluctuate for different reasons, and it is important to find the declining or increasing trends whenever they happen. It is important to study these fluctuations to improve customer relationships. There are different methods to increase the customer's willingness, such as planning good promotions, increasing advertising, etc. This paper proposes a new methodology to measure customers' behavioral trends, called the customer electrocardiogram. The proposed model uses the K-means clustering method with RFM analysis to study customer fluctuations over different time frames. We also apply the proposed electrocardiogram methodology to a real-world case study in the food industry, and the results are discussed in detail.
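
    The clustering step can be sketched with scikit-learn on a toy RFM table (recency, frequency, monetary value); the figures are invented, and the "electrocardiogram" itself, which tracks how segment membership shifts across successive time frames, is only hinted at here.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans

        # Columns: recency (days since last purchase), frequency, monetary value (invented)
        rfm = np.array([
            [5, 12, 420.0],
            [40, 3, 80.0],
            [2, 25, 990.0],
            [90, 1, 30.0],
            [15, 8, 260.0],
            [60, 2, 55.0],
        ])

        X = StandardScaler().fit_transform(rfm)                       # put R, F, M on one scale
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

        for row, lab in zip(rfm, labels):
            print(row, "-> segment", lab)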

  16. The Nuclear Organization and Management Analysis Concept methodology: Four years later

    Energy Technology Data Exchange (ETDEWEB)

    Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

    1992-08-01

    The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology has been demonstrated at two commercial nuclear power plants (NPP) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

  18. A Feasibility Analysis Methodology for Decentralized Wastewater Systems - Energy-Efficiency and Cost.

    Science.gov (United States)

    Naik, Kartiki S; Stenstrom, Michael K

    2016-03-01

    Centralized wastewater treatment, widely practiced in developed areas, involves transporting wastewater from large urban areas to a large capacity plant using a single network of sewers, whereas decentralization is the concept of wastewater collection, treatment and reuse at or near its point of generation. Smaller decentralized plants can achieve extensive reclamation and wastewater management with energy-efficient reclaimed water pumping, modularized expansion and lower capital investment. We devised a methodology to preliminarily assess these alternatives using local constraints and conducted a feasibility analysis for each option. It addressed various scenarios using the pump-back energy consumption, sewer and treatment plant construction and capacity expansion cost. We demonstrated this methodology by applying it to the Hollywood vicinity (California). In this study, the decentralized configuration was more economical and energy-efficient than the centralized system. The pump-back energy consumption was about 50% of the aeration energy consumption for the centralized option.
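
    A rough sense of the pump-back energy term that such a screening compares can be had from the hydraulic lifting energy per cubic metre; the heads and pump efficiency below are invented for illustration and do not come from the Hollywood case study.

        rho, g = 1000.0, 9.81          # water density [kg/m3], gravity [m/s2]

        def pump_energy_kwh_per_m3(head_m, efficiency=0.7):
            """Specific pumping energy [kWh/m3] to lift water by a given head."""
            return rho * g * head_m / efficiency / 3.6e6

        centralized_head = 120.0       # m, pumping reclaimed water back uphill (assumed)
        decentralized_head = 25.0      # m, local reuse near the point of generation (assumed)

        for name, head in [("centralized", centralized_head), ("decentralized", decentralized_head)]:
            print(f"{name}: {pump_energy_kwh_per_m3(head):.3f} kWh/m3")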

  19. Methodology for Design and Analysis of Reactive Distillation Involving Multielement Systems

    DEFF Research Database (Denmark)

    Jantharasuk, Amnart; Gani, Rafiqul; Górak, Andrzej;

    2011-01-01

    A new methodology for design and analysis of reactive distillation has been developed. In this work, the element-based approach, coupled with a driving force diagram, has been extended and applied to the design of a reactive distillation column involving multielement (multicomponent) systems. The transformation of ordinary systems to element-based ones and the aggregation of non-key elements allow the important design parameters, such as the number of stages, feed stage and minimum reflux ratio, to be determined by using simple diagrams similar to those regularly employed for non-reactive systems consisting of two components. Based on this methodology, an optimal design configuration is identified using the equivalent binary-element-driving force diagram. Two case studies of methyl acetate (MeOAc) synthesis and methyl-tert-butyl ether (MTBE) synthesis have been considered to demonstrate...

  20. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    Energy Technology Data Exchange (ETDEWEB)

    Reimund, Kevin K. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; McCutcheon, Jeffrey R. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; Wilson, Aaron D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-08-01

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. This method utilizes equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and riverwater. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1+√w⁻¹), which is lower than the maximum power density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum of 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at the “maximum power density operating pressure” requires at least 2.9 kmol. Utilizing this methodology, it is possible to examine the effects of volumetric solution cost, operation of a module at various pressures, and operation of a constant-pressure module with various feeds.

  1. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    Energy Technology Data Exchange (ETDEWEB)

    Licu, Tony [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: antonio.licu@eurocontrol.int; Cioran, Florin [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: florin.cioran@eurocontrol.int; Hayward, Brent [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: bhayward@dedale.net; Lowe, Andrew [EUROCONTROL-European Organization for the Safety of Air Navigation and Dedale Asia Pacific, Safety Team, Rue de la Fusee, 96, 1130 Brussels (Belgium)]. E-mail: alowe@dedale.net

    2007-09-15

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture in which people are encouraged to provide full and open information about how incidents occurred, and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability.

  2. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hugo, Jacques [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  3. Design Methodology for Bonded-Bolted Composite Joints. Volume I. Analysis Derivations and Illustrative Solutions

    Science.gov (United States)

    1982-02-01

    Air Force Wright Aeronautical Laboratories (AFWAL/FIBRA), Wright-Patterson AFB, OH 45433, February 1982, 99 pages. Requests for the software should be submitted in accordance with AFSC Sup 1 to AFR 300-6 (DOD Dir 4160.19 dtd 2 Apr 73) to AFWAL/FIBRA, Wright-Patterson AFB, OH 45433.

  4. Unified Methodology for Airport Pavement Analysis and Design. Volume 1. State of the Art

    Science.gov (United States)

    1991-06-01

    ...pavements with saturated bases and subgrades to well-drained pavements. The damage factors ranged from 10 to 70,000. Cedergren also demonstrated that... "Modulus of Asphalt Mixtures - An Unresolved Dilemma," Transportation Research Board, Research Record 1171, p. 193, Washington, DC... Cedergren, H.R...

  5. HARDMAN Comparability Analysis Methodology Guide. Volume 1. Manager’s Guide

    Science.gov (United States)

    1985-04-01

  6. Methodology for CFD Design Analysis of National Launch System Nozzle Manifold

    Science.gov (United States)

    Haire, Scot L.

    1993-01-01

    The current design environment dictates that high technology CFD (Computational Fluid Dynamics) analysis produce quality results in a timely manner if it is to be integrated into the design process. The design methodology outlined describes the CFD analysis of an NLS (National Launch System) nozzle film cooling manifold. The objective of the analysis was to obtain a qualitative estimate for the flow distribution within the manifold. A complex, 3D, multiple zone, structured grid was generated from a 3D CAD file of the geometry. A Euler solution was computed with a fully implicit compressible flow solver. Post processing consisted of full 3D color graphics and mass averaged performance. The result was a qualitative CFD solution that provided the design team with relevant information concerning the flow distribution in and performance characteristics of the film cooling manifold within an effective time frame. Also, this design methodology was the foundation for a quick turnaround CFD analysis of the next iteration in the manifold design.

  7. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository - Volume 3: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R. (INEEL); Sanchez, L.C.; Aguilar, R.; Trellue, H.R.; Cochrane, K. (SNL); Rath, J.S. (New Mexico Engineering Research Institute)

    1998-10-01

    The United States Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  8. Yucca Mountain transportation routes: Preliminary characterization and risk analysis; Volume 2, Figures [and] Volume 3, Technical Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Souleyrette, R.R. II; Sathisan, S.K.; di Bartolo, R. [Nevada Univ., Las Vegas, NV (United States). Transportation Research Center

    1991-05-31

    This report presents appendices related to the preliminary assessment and risk analysis for high-level radioactive waste transportation routes to the proposed Yucca Mountain Project repository. Information includes data on population density, traffic volume, ecologically sensitive areas, and accident history.

  9. Stereological analysis of nuclear volume in recurrent meningiomas

    DEFF Research Database (Denmark)

    Madsen, C; Schrøder, H D

    1994-01-01

    A stereological estimation of nuclear volume in recurrent and non-recurrent meningiomas was made. The aim was to investigate whether this method could discriminate between these two groups. We found that the mean nuclear volumes in recurrent meningiomas were all larger at debut than in any...... of the control tumors. The mean nuclear volume of the individual recurrent tumors appeared to change with time, showing a tendency to diminish. A relationship between large nuclear volume at presentation and number of or time interval between recurrences was not found. We conclude that measurement of mean...

  10. Efficacy of bronchoscopic lung volume reduction: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Iftikhar IH

    2014-05-01

    Full Text Available Imran H Iftikhar,1 Franklin R McGuire,1 Ali I Musani2 1Department of Medicine, Division of Pulmonary, Critical Care and Sleep Medicine, University of South Carolina, Columbia, SC, USA; 2Department of Medicine, Division of Pulmonary, Critical Care and Sleep Medicine, National Jewish Health, Denver, CO, USA. Background: Over the last several years, the morbidity, mortality, and high costs associated with lung volume reduction (LVR) surgery have fuelled the development of different methods for bronchoscopic LVR (BLVR) in patients with emphysema. In this meta-analysis, we sought to study and compare the efficacy of most of these methods. Methods: Eligible studies were retrieved from PubMed and Embase for the following BLVR methods: one-way valves, sealants (BioLVR), LVR coils, airway bypass stents, and bronchial thermal vapor ablation. Primary study outcomes included the mean change post-intervention in the lung function tests, the 6-minute walk distance, and the St George's Respiratory Questionnaire. Secondary outcomes included treatment-related complications. Results: Except for the airway bypass stents, all other methods of BLVR showed efficacy in primary outcomes. However, in comparison, the BioLVR method showed the most significant findings and was the least associated with major treatment-related complications. For the BioLVR method, the mean change in forced expiratory volume (in the first second) was 0.18 L (95% confidence interval [CI]: 0.09 to 0.26; P<0.001); in 6-minute walk distance it was 23.98 m (95% CI: 12.08 to 35.88; P<0.01); and in St George's Respiratory Questionnaire it was −8.88 points (95% CI: −12.12 to −5.64; P<0.001). Conclusion: The preliminary findings of our meta-analysis signify the importance of most methods of BLVR. The magnitude of the effect on selected primary outcomes shows noninferiority, if not equivalence, when compared to what is known for surgical LVR. Keywords: emphysema, endobronchial valves, sealants, stents, coils

  11. Methodology for Analysis, Modeling and Simulation of Airport Gate-waiting Delays

    Science.gov (United States)

    Wang, Jianfeng

    This dissertation presents methodologies to estimate gate-waiting delays from historical data, to identify gate-waiting-delay functional causes in major U.S. airports, and to evaluate the impact of gate operation disruptions and mitigation strategies on gate-waiting delay. Airport gates are a resource of congestion in the air transportation system. When an arriving flight cannot pull into its gate, the delay it experiences is called gate-waiting delay. Some possible reasons for gate-waiting delay are: the gate is occupied, gate staff or equipment is unavailable, the weather prevents the use of the gate (e.g. lightning), or the airline has a preferred gate assignment. Gate-waiting delays potentially stay with the aircraft throughout the day (unless they are absorbed), adding costs to passengers and the airlines. As the volume of flights increases, ensuring that airport gates do not become a choke point of the system is critical. The first part of the dissertation presents a methodology for estimating gate-waiting delays based on historical, publicly available sources. Analysis of gate-waiting delays at major U.S. airports in the summer of 2007 identifies the following. (i) Gate-waiting delay is not a significant problem on majority of days; however, the worst delay days (e.g. 4% of the days at LGA) are extreme outliers. (ii) The Atlanta International Airport (ATL), the John F. Kennedy International Airport (JFK), the Dallas/Fort Worth International Airport (DFW) and the Philadelphia International Airport (PHL) experience the highest gate-waiting delays among major U.S. airports. (iii) There is a significant gate-waiting-delay difference between airlines due to a disproportional gate allocation. (iv) Gate-waiting delay is sensitive to time of a day and schedule peaks. According to basic principles of queueing theory, gate-waiting delay can be attributed to over-scheduling, higher-than-scheduled arrival rate, longer-than-scheduled gate-occupancy time, and reduced gate
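
    The queueing intuition in the abstract can be sketched with a toy first-free-gate simulation; the arrival times, occupancy times and gate count below are invented, and the dissertation itself estimates these delays from historical data rather than simulating them.

        import heapq

        arrivals  = [0, 3, 5, 6, 8, 10, 12, 15, 18, 20]          # scheduled arrival times [min] (assumed)
        occupancy = [45, 50, 40, 55, 45, 60, 35, 50, 45, 40]      # gate occupancy per flight [min] (assumed)
        num_gates = 3

        free_at = [0.0] * num_gates        # time each gate next becomes free
        heapq.heapify(free_at)
        total_wait = 0.0
        for arr, occ in zip(arrivals, occupancy):
            gate_free = heapq.heappop(free_at)     # earliest-available gate
            start = max(arr, gate_free)            # wait on the ramp if all gates are busy
            total_wait += start - arr
            heapq.heappush(free_at, start + occ)

        print(f"mean gate-waiting delay: {total_wait / len(arrivals):.1f} min")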

  12. Style, content and format guide for writing safety analysis documents. Volume 1, Safety analysis reports for DOE nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    1994-06-01

    The purpose of Volume 1 of this 4-volume style guide is to furnish guidelines on writing and publishing Safety Analysis Reports (SARs) for DOE nuclear facilities at Sandia National Laboratories. The scope of Volume 1 encompasses not only the general guidelines for writing and publishing, but also the prescribed topics/appendices contents along with examples from typical SARs for DOE nuclear facilities.

  13. A methodology for the analysis and improvement of a firm´s competitiveness

    Directory of Open Access Journals (Sweden)

    Jose Celso Contador

    2006-01-01

    Full Text Available This paper presents a new methodology for the analysis of a group of companies, aiming at explaining and increasing a firm's competitiveness. Based on the model of the fields and weapons of the competition, the methodology distinguishes between business and operational competitive strategies. The first consists of some of the 15 fields of the competition, and the latter consists of the weapons of the competition. Competitiveness is explained through the application of several mathematical variables. The influence of the competitive strategies is statistically evaluated using the Wilcoxon-Mann-Whitney non-parametric test, the t-test, and Pearson's correlation. The methodology was applied to companies belonging to the textile pole of Americana; one of the conclusions reached is that what explains competitiveness is the operational strategy rather than the business strategy. Therefore, to improve competitiveness, a company must intensify its focus on weapons that are relevant to the fields where it decided to compete.
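
    The statistical battery named in the abstract is readily reproduced with SciPy on hypothetical "weapon intensity" scores, as in the sketch below; the data are invented and only the choice of tests mirrors the paper.

        import numpy as np
        from scipy import stats

        more_competitive = np.array([7.1, 6.8, 7.5, 8.0, 6.9, 7.7])   # hypothetical scores
        less_competitive = np.array([5.2, 6.0, 5.8, 6.3, 5.5, 6.1])   # hypothetical scores

        u, p_u = stats.mannwhitneyu(more_competitive, less_competitive, alternative="two-sided")
        t, p_t = stats.ttest_ind(more_competitive, less_competitive, equal_var=False)
        r, p_r = stats.pearsonr(np.concatenate([more_competitive, less_competitive]),
                                np.concatenate([np.ones(6), np.zeros(6)]))   # correlation with group flag

        print(f"Mann-Whitney U={u:.1f} (p={p_u:.3f}), t={t:.2f} (p={p_t:.3f}), r={r:.2f} (p={p_r:.3f})")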

  14. Simulation Methodology for Analysis of Substrate Noise Impact on Analog / RF Circuits Including Interconnect Resistance

    CERN Document Server

    Soens, C; Wambacq, P; Donnay, S

    2011-01-01

    This paper reports a novel simulation methodology for analysis and prediction of substrate noise impact on analog / RF circuits taking into account the role of the parasitic resistance of the on-chip interconnect in the impact mechanism. This methodology allows investigation of the role of the separate devices (also parasitic devices) in the analog / RF circuit in the overall impact. This way is revealed which devices have to be taken care of (shielding, topology change) to protect the circuit against substrate noise. The developed methodology is used to analyze impact of substrate noise on a 3 GHz LC-tank Voltage Controlled Oscillator (VCO) designed in a high-ohmic 0.18 $\\mu$m 1PM6 CMOS technology. For this VCO (in the investigated frequency range from DC to 15 MHz) impact is mainly caused by resistive coupling of noise from the substrate to the non-ideal on-chip ground interconnect, resulting in analog ground bounce and frequency modulation. Hence, the presented test-case reveals the important role of the o...

  15. The RAAF Logistics Study. Volume 4,

    Science.gov (United States)

    1986-10-01

    Use of Issue-Based Root Definitions; Application of Soft Systems Methodology to Information Systems Analysis; Conclusion; List of Abbreviations. ... 'Management Control Systems', Journal of Applied Systems Analysis, Volume 6, 1979, pages 51 to 67. The soft systems methodology was developed to tackle ... the soft systems methodology has many advantages which recommend it to this type of study area, but it does not model the time evolution of a system

  16. Solar thermal technology development: Estimated market size and energy cost savings. Volume 2: Assumptions, methodology and results

    Science.gov (United States)

    Gates, W. R.

    1983-02-01

    Estimated future energy cost savings associated with the development of cost-competitive solar thermal technologies (STT) are discussed. Analysis is restricted to STT in electric applications for 16 high-insolation/high-energy-price states. Three fuel price scenarios and three 1990 STT system costs are considered, reflecting uncertainty over future fuel prices and STT cost projections. Solar thermal technology research and development (R&D) is found to be unacceptably risky for private industry in the absence of federal support. Energy cost savings were projected to range from $0 to $10 billion (1990 values in 1981 dollars), depending on the system cost and fuel price scenario. Normal R&D investment risks are accentuated because the Organization of Petroleum Exporting Countries (OPEC) cartel can artificially manipulate oil prices and undercut growth of alternative energy sources. Federal participation in STT R&D to help capture the potential benefits of developing cost-competitive STT was found to be in the national interest. Analysis is also provided regarding two federal incentives currently in use: The Federal Business Energy Tax Credit and direct R&D funding.

  17. Time-domain analysis methodology for large-scale RLC circuits and its applications

    Institute of Scientific and Technical Information of China (English)

    LUO Zuying; CAI Yici; Sheldon X.-D Tan; HONG Xianlong; WANG Xiaoyi; PAN Zhu; FU Jingjing

    2006-01-01

    With soaring work frequencies and decreasing feature sizes, VLSI circuits with RLC parasitic components behave more like analog circuits and should be carefully analyzed in physical design. However, the number of extracted RLC components is typically too large to be analyzed efficiently by present analog circuit simulators like SPICE. In order to speed up the simulations without error penalty, this paper proposes a novel methodology to compress the time-discretized circuits resulting from numerical integration approximation at every time step. The main contribution of the methodology is the efficient structure-level compression of DC circuits containing many current sources, which is an important complement to present circuit analysis theory. The methodology consists of the following parts: 1) An approach is proposed to delete all intermediate nodes of RL branches. 2) An efficient approach is proposed to compress and back-solve parallel and serial branches, so that circuits of tree topology can be analyzed error-free with linear complexity. 3) The Y-to-Δ transformation method is used to reduce and back-solve the intermediate nodes of ladder circuits error-free with linear complexity. Thus, the whole simulation method is very accurate and of linear complexity for analyzing circuits of chain topology. Based on the methodology, we propose several novel algorithms for efficiently solving RLC-model transient power/ground (P/G) networks. Among them, the linear-complexity EQU-ADI algorithm is proposed to solve RLC P/G networks with mesh-tree or mesh-chain topologies. Experimental results show that the proposed method is at least two orders of magnitude faster than SPICE while it can scale linearly in both time and memory complexity to solve very large P/G networks.
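
    The structure-level compression described above relies on standard series/parallel and Y-to-Δ reductions. The toy function below shows only the series/parallel part on a purely resistive ladder; it is a simplified stand-in for the paper's RLC time-step compression, and the element values are made up.

```python
def ladder_input_resistance(series_r, shunt_r, load_r):
    """Collapse a resistive ladder (one series arm and one shunt arm per section)
    from the load back toward the source, one section at a time."""
    r_eq = load_r
    for rs, rp in zip(reversed(series_r), reversed(shunt_r)):
        r_eq = 1.0 / (1.0 / rp + 1.0 / r_eq)   # shunt arm in parallel with downstream network
        r_eq = rs + r_eq                        # then add the series arm
    return r_eq

# Hypothetical 3-section ladder: 1 ohm series arms, 10 ohm shunt arms, 5 ohm load
print(ladder_input_resistance([1.0, 1.0, 1.0], [10.0, 10.0, 10.0], 5.0))
```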

  18. A methodology for the analysis of differential coexpression across the human lifespan

    Directory of Open Access Journals (Sweden)

    Gillis Jesse

    2009-09-01

    Full Text Available Abstract Background Differential coexpression is a change in coexpression between genes that may reflect 'rewiring' of transcriptional networks. It has previously been hypothesized that such changes might be occurring over time in the lifespan of an organism. While both coexpression and differential expression of genes have been previously studied in life stage change or aging, differential coexpression has not. Generalizing differential coexpression analysis to many time points presents a methodological challenge. Here we introduce a method for analyzing changes in coexpression across multiple ordered groups (e.g., over time) and extensively test its validity and usefulness. Results Our method is based on the use of the Haar basis set to efficiently represent changes in coexpression at multiple time scales, and thus represents a principled and generalizable extension of the idea of differential coexpression to life stage data. We used published microarray studies categorized by age to test the methodology. We validated the methodology by testing our ability to reconstruct Gene Ontology (GO) categories using our measure of differential coexpression and compared this result to using coexpression alone. Our method allows significant improvement in characterizing these groups of genes. Further, we examine the statistical properties of our measure of differential coexpression and establish that the results are significant both statistically and by an improvement in semantic similarity. In addition, we found that our method finds more significant changes in gene relationships compared to several other methods of expressing temporal relationships between genes, such as coexpression over time. Conclusion Differential coexpression over age generates significant and biologically relevant information about the genes producing it. Our Haar basis methodology for determining age-related differential coexpression performs better than other tested methods. The
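
    A compact sketch of the core idea, assuming gene-pair correlations computed within each of a power-of-two number of ordered age groups and then projected onto the Haar basis; the data, group labels, and normalization are illustrative and do not reproduce the paper's pipeline.

```python
import numpy as np

def haar_transform(values):
    """Orthonormal Haar decomposition of a sequence whose length is a power of two."""
    x = np.asarray(values, dtype=float)
    details = []
    while x.size > 1:
        avg = (x[0::2] + x[1::2]) / np.sqrt(2.0)
        diff = (x[0::2] - x[1::2]) / np.sqrt(2.0)
        details.append(diff)            # detail coefficients at this time scale
        x = avg
    return x, details[::-1]             # overall level plus coarse-to-fine details

def differential_coexpression(expr_a, expr_b, groups, ordered_groups):
    """Correlation of one gene pair within each ordered group, then its Haar coefficients.
    Large detail coefficients flag a change in coexpression at that scale."""
    corr = [np.corrcoef(expr_a[groups == g], expr_b[groups == g])[0, 1]
            for g in ordered_groups]
    return haar_transform(corr)
```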

  19. A genetic analysis of brain volumes and IQ in children

    NARCIS (Netherlands)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stephanie M.; Brouwer, Rachel M.; Pol, Hilleke E. Hulshoff; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler I

  20. A Genetic Analysis of Brain Volumes and IQ in Children

    Science.gov (United States)

    van Leeuwen, Marieke; Peper, Jiska S.; van den Berg, Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler Intelligence Scale for Children-III. Phenotypic…

  1. A genetic analysis of brain volumes and IQ in children

    NARCIS (Netherlands)

    Leeuwen, van Marieke; Peper, Jiska S.; Berg, van den Stephanie M.; Brouwer, Rachel M.; Hulshoff Pol, Hilleke E.; Kahn, Rene S.; Boomsma, Dorret I.

    2009-01-01

    In a population-based sample of 112 nine-year old twin pairs, we investigated the association among total brain volume, gray matter and white matter volume, intelligence as assessed by the Raven IQ test, verbal comprehension, perceptual organization and perceptual speed as assessed by the Wechsler I

  2. Soft systems methodology analysis for scoping in environmental impact statement in Israel

    OpenAIRE

    Haklay, M.

    1999-01-01

    The current working paper will focus on Soft System Methodology (SSM) analysis of the process of issuing guidelines for Environmental Impact Statements (EIS) to developers in the Israeli context. The paper’s goal is to make the reader familiar with the terminology and the concepts of SSM, while serving as a case study for practising SSM. The paper starts with a “crash” introduction to SSM, followed by a general description of the process in the centre of the discussion - the Israeli EIS proce...

  3. Performance of neutron activation analysis in the evaluation of bismuth iodide purification methodology

    Energy Technology Data Exchange (ETDEWEB)

    Armelin, Maria Jose A.; Ferraz, Caue de Mello; Hamada, Margarida M., E-mail: marmelin@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Lab. de Analise por Ativacao Neutronica

    2015-07-01

    Bismuth tri-iodide (BiI{sub 3}) is an attractive material for use as a semiconductor. In this paper, BiI{sub 3} crystals have been grown by the vertical Bridgman technique using commercially available powder. The impurities were evaluated by instrumental neutron activation analysis (INAA). The results show that INAA is an analytical method appropriate for monitoring the impurities Ag, As, Br, Cr, K, Mo, Na and Sb in the various stages of the BiI{sub 3} purification methodology. (author)

  4. A Methodology to Evaluate Object oriented Software Systems Using Change Requirement Traceability Based on Impact Analysis

    Directory of Open Access Journals (Sweden)

    Sunil T. D

    2014-06-01

    Full Text Available It is a well-known fact that software maintenance plays a major role and finds importance in the software development life cycle. As object-oriented programming has become the standard, it is very important to understand the problems of maintaining object-oriented software systems. This paper aims at evaluating object-oriented software systems through a change requirement traceability-based impact analysis methodology for non-functional requirements using functional requirements. The major issues have been related to change impact algorithms and inheritance of functionality.
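
    The sketch below illustrates one simple way a change requirement can be traced to potentially impacted classes through transitive dependencies; the traceability links, class names, and requirement ID are hypothetical, and the paper's actual change impact algorithms are not reproduced here.

```python
from collections import deque

# Hypothetical traceability data: requirement -> directly traced classes, class -> dependents
req_to_classes = {"REQ-17": ["PaymentService"]}
class_deps = {
    "PaymentService": ["OrderController", "InvoiceGenerator"],
    "OrderController": ["OrderView"],
    "InvoiceGenerator": [],
    "OrderView": [],
}

def impact_set(requirement):
    """Breadth-first traversal from a changed requirement to all reachable classes."""
    impacted, queue = set(), deque(req_to_classes.get(requirement, []))
    while queue:
        cls = queue.popleft()
        if cls not in impacted:
            impacted.add(cls)
            queue.extend(class_deps.get(cls, []))
    return impacted

print(impact_set("REQ-17"))
```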

  5. Performance Evaluation and Measurement of the Organization in Strategic Analysis and Control: Methodological Aspects

    Directory of Open Access Journals (Sweden)

    Živan Ristić

    2006-12-01

    Full Text Available Information acquired by measuring and evaluation is a necessary condition for good decision-making in strategic management. This work deals with: (a) methodological aspects of evaluation (kinds of evaluation, metaevaluation) and measurement (supposition of isomorphism in measurement, kinds and levels of measurement, errors in measurement) and the basic characteristics of measurement; (b) evaluation and measurement of the potential and accomplishments of the organization in the Kaplan-Norton perspectives (the perspectives of learning and development, internal processes, the consumer/user, and the financial perspective); (c) systems and IT solutions for evaluating and measuring the performance of the organization in strategic analysis and control.

  6. Total materials consumption; an estimation methodology and example using lead; a materials flow analysis

    Science.gov (United States)

    Biviano, Marilyn B.; Wagner, Lorie A.; Sullivan, Daniel E.

    1999-01-01

    Materials consumption estimates, such as apparent consumption of raw materials, can be important indicators of sustainability. Apparent consumption of raw materials does not account for material contained in manufactured products that are imported or exported and may thus under- or over-estimate total consumption of materials in the domestic economy. This report demonstrates a methodology to measure the amount of materials contained in net imports (imports minus exports), using lead as an example. The analysis presents illustrations of differences between apparent and total consumption of lead and distributes these differences into individual lead-consuming sectors.

  7. Workload Characterization an Essential Step in Computer Systems Performance Analysis - Methodology and Tools

    Directory of Open Access Journals (Sweden)

    CHEVERESAN, R.T.

    2009-10-01

    Full Text Available Improving computer system performance is a very complex process in which hardware and software manufacturers invest important human and financial resources. Workload characterization represents an essential component of performance analysis. This paper presents a trace-based methodology for the evaluation of software applications. It introduces a new analysis concept designed to significantly ease this process, and it presents a set of experimental data collected using the new analysis structure on a representative set of scientific and commercial applications. Several important conclusions are drawn regarding workload characteristics, classifications and runtime behavior. This type of data is used by computer architects in their efforts to maximize the performance of the hardware platforms these applications are going to execute on.

  8. HPCC Methodologies for Structural Design and Analysis on Parallel and Distributed Computing Platforms

    Science.gov (United States)

    Farhat, Charbel

    1998-01-01

    In this grant, we have proposed a three-year research effort focused on developing High Performance Computation and Communication (HPCC) methodologies for structural analysis on parallel processors and clusters of workstations, with emphasis on reducing the structural design cycle time. Besides consolidating and further improving the FETI solver technology to address plate and shell structures, we have proposed to tackle the following design related issues: (a) parallel coupling and assembly of independently designed and analyzed three-dimensional substructures with non-matching interfaces, (b) fast and smart parallel re-analysis of a given structure after it has undergone design modifications, (c) parallel evaluation of sensitivity operators (derivatives) for design optimization, and (d) fast parallel analysis of mildly nonlinear structures. While our proposal was accepted, support was provided only for one year.

  9. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article. The fault tree analysis, deemed to be one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on the fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in the nuclear power plant is given to illustrate the proposed method.
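
    As a minimal illustration of how a fault tree rolls component failure probabilities up to a top event (assuming independent basic events), the sketch below evaluates a two-gate tree; the gate structure and probabilities are invented for the example and are not taken from the emergency diesel generator application.

```python
def or_gate(probs):
    """Top event occurs if at least one input event occurs (independent events)."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

def and_gate(probs):
    """Top event occurs only if all input events occur (independent events)."""
    p_all = 1.0
    for p in probs:
        p_all *= p
    return p_all

# Hypothetical tree: failure to start OR (train A unavailable AND train B unavailable)
p_top = or_gate([1.0e-3, and_gate([2.0e-2, 2.0e-2])])
print(f"Top event probability: {p_top:.2e}")
```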

  10. Refined multi-level methodology in parallel computing environment for BWR RIA analysis

    Science.gov (United States)

    Solis-Rodarte, Jorge

    2000-12-01

    Best-estimate methodologies in boiling water reactors can reduce the traditional conservative thermal margins imposed on the designs and during the operation of this type of nuclear reactor. Traditional operating thermal margins are obtained based on simplified modeling techniques that are supplemented with the required dose of conservatism. For instance, much more realistic transient pin peaking distributions can be predicted by applying a dehomogenization algorithm based on a flux reconstruction scheme which uses nodal results during both steady-state and transient calculations at each time step. A subchannel analysis module for obtaining thermal margins supplements the calculation approach used. A multi-level methodology to extend the TRAC-BF1/NEM coupled code capability to obtain the transient fuel rod response has been implemented. To fulfill the development requirements, some improved neutronic models were implemented into the NEM solution algorithm, namely the pin power reconstruction capability and the simulation of a dynamic scram. The obtained results were coupled to a subchannel analysis module: the COBRA-TF T-H subchannel analysis code. The objective of the pin power reconstruction capability of NEM is to locate the most limiting node (axial region of an assembly/channel) within the core. The power information obtained from the NEM 3D neutronic calculation is used by the hot channel analysis module (COBRA-TF). COBRA-TF also needs the T-H conditions at the boundary nodes. This information is provided by the TRAC-BF1 T-H system analysis code. The subchannel analysis module uses this information to re-calculate the fluid, thermal, and hydraulic conditions in the most limiting node (axial region of an assembly/channel) within the core.

  11. Qualitative data analysis using the NVivo program and the application of the methodology of grounded theory procedures

    Directory of Open Access Journals (Sweden)

    Niedbalski Jakub

    2012-02-01

    Full Text Available The main aim of the article is to identify the capabilities and constraints of using CAQDAS (Computer-Assisted Qualitative Data Analysis Software) programs in qualitative data analysis. Our considerations are based on the personal experiences gained while conducting research projects using the methodology of grounded theory (GT) and the NVivo 8 program. In the presented article we focused on the relations between the methodological principles of grounded theory and the technical possibilities of NVivo 8. The paper presents our opinion about the most important options available in NVivo 8 and their application in studies based on the methodology of grounded theory.

  12. Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor

    Science.gov (United States)

    Mkhabela, Peter Tshepo

    The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (based on neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be an HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding since it requires proper treatment of both slower and much longer transients (with time scales of hours and days) and fast and short transients (with time scales of minutes and seconds). There is limited operational and experimental data available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study contributed to greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and various multi-physics effects that can influence it. The feedback due to the influence of leakage was taken into account by the development and implementation of improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate for proper interpolation from cross-section tables. To achieve this, the NEM/THERMIX coupled code system was developed to create a system that is efficient and stable over the duration of transient calculations that last several tens of hours. Another achievement of the PhD thesis was the development and demonstration of a full-physics, three-dimensional safety analysis

  13. A New Mathematical Model for Flank Wear Prediction Using Functional Data Analysis Methodology

    Directory of Open Access Journals (Sweden)

    Sonja Jozić

    2014-01-01

    Full Text Available This paper presents a new approach improving the reliability of flank wear prediction during the end milling process. In the present work, prediction of flank wear has been achieved by using cutting parameters and force signals as the sensitive carriers of information about the machining process. A series of experiments were conducted to establish the relationship between flank wear and the cutting force components as well as the cutting parameters such as cutting speed, feed per tooth, and radial depth of cut. In order to be able to predict flank wear, a new linear regression mathematical model has been developed by utilizing functional data analysis methodology. The regression coefficients of the model are in the form of time-dependent functions that have been determined through the use of functional data analysis methodology. The mathematical model has been developed by means of the applied cutting parameters and the measured cutting force components during the end milling of a workpiece made of 42CrMo4 steel. The efficiency and flexibility of the developed model have been verified by comparing it with a separate experimental data set.
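
    A crude stand-in for the functional-coefficient regression described above: at each sampled time step, flank wear is regressed on the force signal across experiments, giving coefficient curves b0(t) and b1(t). The array shapes and the single-predictor form are assumptions for illustration, not the paper's full model.

```python
import numpy as np

def pointwise_coefficients(force_signals, flank_wear):
    """force_signals: (n_experiments, n_time) array of a cutting-force component;
    flank_wear: (n_experiments,) measured wear. Returns b0(t), b1(t)."""
    n_exp, n_time = force_signals.shape
    b0 = np.empty(n_time)
    b1 = np.empty(n_time)
    for t in range(n_time):
        X = np.column_stack([np.ones(n_exp), force_signals[:, t]])
        coef, *_ = np.linalg.lstsq(X, flank_wear, rcond=None)   # least-squares fit at time t
        b0[t], b1[t] = coef
    return b0, b1
```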

  14. Core melt progression and consequence analysis methodology development in support of the Savannah River Reactor PSA

    Energy Technology Data Exchange (ETDEWEB)

    O' Kula, K.R.; Sharp, D.A. (Westinghouse Savannah River Co., Aiken, SC (United States)); Amos, C.N.; Wagner, K.C.; Bradley, D.R. (Science Applications International Corp., Albuquerque, NM (United States))

    1992-01-01

    A three-level Probabilistic Safety Assessment (PSA) of production reactor operation has been underway since 1985 at the US Department of Energy's Savannah River Site (SRS). The goals of this analysis are to: Analyze existing margins of safety provided by the heavy-water reactor (HWR) design challenged by postulated severe accidents; Compare measures of risk to the general public and onsite workers to guideline values, as well as to those posed by commercial reactor operation; and Develop the methodology and database necessary to prioritize improvements to engineering safety systems and components, operator training, and engineering projects that contribute significantly to improving plant safety. PSA technical staff from the Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) have performed the assessment despite two obstacles: A variable baseline plant configuration and power level; and a lack of technically applicable code methodology to model the SRS reactor conditions. This paper discusses the detailed effort necessary to modify the requisite codes before accident analysis insights for the risk assessment were obtained.

  15. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, namely the Granger causality test, on 45 data points. However, the insufficiency of well-established causality models was found, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested to be used in situations where historical data are insufficient. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts, using 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Evidence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity, and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality, as well as its robustness and validity as a platform for causality analysis. This study applied a theoretical service management model within the BSC domain to a practical situation, using SD methodology, where very limited work has been done.
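
    For the econometric step mentioned above, a Granger causality test can be run with statsmodels roughly as sketched below; the two series, their names, and the lag choice are placeholders rather than the study's 45 actual data points.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "customer_sat": rng.normal(size=45),   # placeholder monthly measures
    "employee_sat": rng.normal(size=45),
})

# Tests whether the second column helps predict the first one at lags 1..3
results = grangercausalitytests(df[["customer_sat", "employee_sat"]].values, maxlag=3)
for lag, res in results.items():
    f_stat, p_value, _, _ = res[0]["ssr_ftest"]
    print(f"lag {lag}: F={f_stat:.2f}, p={p_value:.3f}")
```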

  16. Adding value in oil and gas by applying decision analysis methodologies: case history

    Energy Technology Data Exchange (ETDEWEB)

    Marot, Nicolas [Petro Andina Resources Inc., Alberta (Canada); Francese, Gaston [Tandem Decision Solutions, Buenos Aires (Argentina)

    2008-07-01

    Petro Andina Resources Ltd., together with Tandem Decision Solutions, developed a strategic long range plan applying decision analysis methodology. The objective was to build a robust and fully integrated strategic plan that accomplishes company growth goals and sets the strategic directions for the long range. The stochastic methodology and the Integrated Decision Management (IDM{sup TM}) staged approach allowed the company to visualize the associated value and risk of the different strategies while achieving organizational alignment, clarity of action and confidence in the path forward. A decision team jointly involving PAR representatives and Tandem consultants was established to carry out this four-month project. Discovery and framing sessions allowed the team to disrupt the status quo, discuss near- and far-reaching ideas and gather the building blocks from which creative strategic alternatives were developed. A comprehensive stochastic valuation model was developed to assess the potential value of each strategy, applying simulation tools, sensitivity analysis tools and contingency planning techniques. Final insights and results have been used to populate the final strategic plan presented to the company board, providing confidence to the team and assuring that the work embodies the best available ideas, data and expertise, and that the proposed strategy was ready to be elaborated into an optimized course of action. (author)

  17. Nonlinear Structural Analysis Methodology and Dynamics Scaling of Inflatable Parabolic Reflector Antenna Concepts

    Science.gov (United States)

    Sreekantamurthy, Tham; Gaspar, James L.; Mann, Troy; Behun, Vaughn; Pearson, James C., Jr.; Scarborough, Stephen

    2007-01-01

    Ultra-lightweight and ultra-thin membrane inflatable antenna concepts are fast evolving to become the state-of-the-art antenna concepts for deep-space applications. NASA Langley Research Center has been involved in structural dynamics research on antenna structures. One of the goals of the research is to develop a structural analysis methodology for prediction of the static and dynamic response characteristics of the inflatable antenna concepts. This research is focused on computational studies that use nonlinear large-deformation finite element analysis to characterize the ultra-thin membrane responses of the antennas. Recently, structural analyses have been performed on a few parabolic reflector antennas of varying size and shape, which are referred to in the paper as the 0.3-meter subscale, 2-meter half-scale, and 4-meter full-scale antennas. The various aspects studied included nonlinear analysis methodology and solution techniques, ways to speed convergence in iterative methods, the sensitivities of responses with respect to structural loads, such as inflation pressure, gravity, and pretension loads in ground and in-space conditions, and the ultra-thin membrane wrinkling characteristics. Several such intrinsic aspects studied have provided valuable insight into the evaluation of the structural characteristics of such antennas. While analyzing these structural characteristics, a quick study was also made to assess the applicability of dynamics scaling of the half-scale antenna. This paper presents the details of the nonlinear structural analysis results, and discusses the insight gained from the studies on the various intrinsic aspects of the analysis methodology. The predicted reflector surface characteristics of the three inflatable ultra-thin membrane parabolic reflector antenna concepts are presented as easily observable displacement fringe patterns with associated maximum values, and normal mode shapes with associated frequencies. Wrinkling patterns are

  18. Methodology for the analysis of sustainable development of Ukraine using fuzzy logic theory

    Directory of Open Access Journals (Sweden)

    2016-02-01

    Full Text Available The objective of the article is to analyze the theoretical and methodological aspects of assessing sustainable development in times of crisis. A methodical approach to the analysis of the sustainable development of a territory, taking into account the assessment of the level of economic security, is proposed. The necessity of developing a complex methodical approach that accounts for indeterminacy and multi-criteria properties in the tasks of ensuring economic security, on the basis of fuzzy logic theory (the fuzzy sets theory), is demonstrated. The results of applying the fuzzy sets method to the dynamics of sustainable development in Ukraine during the years 2002-2012 are presented.
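
    A toy illustration of the fuzzy-set idea referred to above: indicators are mapped to linguistic levels by triangular membership functions. The indicator names, values, and level definitions are invented for the example and are unrelated to the article's actual indicator system.

```python
def tri_membership(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

levels = {"low": (0.0, 0.0, 0.5), "medium": (0.2, 0.5, 0.8), "high": (0.5, 1.0, 1.0)}
indicators = {"economic": 0.62, "social": 0.48, "ecological": 0.55}   # normalized to [0, 1]

for name, value in indicators.items():
    grades = {lvl: round(tri_membership(value, *abc), 2) for lvl, abc in levels.items()}
    best = max(grades, key=grades.get)
    print(f"{name}: {grades} -> {best}")
```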

  19. Methodology for in situ gas sampling, transport and laboratory analysis of gases from stranded cetaceans

    Science.gov (United States)

    de Quirós, Yara Bernaldo; González-Díaz, Óscar; Saavedra, Pedro; Arbelo, Manuel; Sierra, Eva; Sacchini, Simona; Jepson, Paul D.; Mazzariol, Sandro; di Guardo, Giovanni; Fernández, Antonio

    2011-12-01

    Gas-bubble lesions were described in cetaceans stranded in spatio-temporal concordance with naval exercises using high-powered sonars. A behaviourally induced decompression sickness-like disease was proposed as a plausible causal mechanism, although these findings remain scientifically controversial. Investigations into the constituents of the gas bubbles in suspected gas embolism cases are highly desirable. We have found that vacuum tubes, insulin syringes and an aspirometer are reliable tools for in situ gas sampling, storage and transportation without appreciable loss of gas and without compromising the accuracy of the analysis. Gas analysis is conducted by gas chromatography in the laboratory. This methodology was successfully applied to a mass stranding of sperm whales, to a beaked whale stranded in spatial and temporal association with military exercises and to a cetacean chronic gas embolism case. Results from the freshest animals confirmed that bubbles were relatively free of gases associated with putrefaction and consisted predominantly of nitrogen.

  20. Theoretical error analysis of the sampling moiré method and phase compensation methodology for single-shot phase analysis.

    Science.gov (United States)

    Ri, Shien; Muramatsu, Takashi

    2012-06-01

    Recently, a rapid and accurate single-shot phase measurement technique called the sampling moiré method has been developed for small-displacement distribution measurements. In this study, the theoretical phase error of the sampling moiré method caused by linear intensity interpolation in the case of a mismatch between the sampling pitch and the original grating pitch is analyzed. The periodic phase error is proportional to the square of the spatial angular frequency of the moiré fringe. Moreover, an effective phase compensation methodology is developed to reduce the periodic phase error. Single-shot phase analysis can be performed accurately even when the sampling pitch does not exactly match the original grating pitch. The primary simulation results demonstrate the effectiveness of the proposed phase compensation methodology.

  1. Seismic geometric attribute analysis for fracture characterization: New methodologies and applications

    Science.gov (United States)

    Di, Haibin

    In 3D subsurface exploration, detection of faults and fractures from 3D seismic data is vital to robust structural and stratigraphic analysis in the subsurface, and great efforts have been made in the development and application of various seismic attributes (e.g. coherence, semblance, curvature, and flexure). However, the existing algorithms and workflows are not accurate and efficient enough for robust fracture detection, especially in naturally fractured reservoirs with complicated structural geometry and fracture networks. My Ph.D. research proposes the following scope of work to enhance our capability and to help improve the resolution of fracture characterization and prediction. For the discontinuity attribute, previous methods have difficulty highlighting subtle discontinuities from seismic data in cases where the local amplitude variation has a non-zero mean. This study proposes implementing a gray-level transformation and the Canny edge detector for improved imaging of discontinuities. Specifically, the new process transforms seismic signals to be zero mean and helps amplify subtle discontinuities, leading to an enhanced visualization of structural and stratigraphic details. Applications to various 3D seismic datasets demonstrate that the new algorithm is superior to previous discontinuity-detection methods. Integrating both discontinuity magnitude and discontinuity azimuth helps define channels, faults and fractures better than the traditional similarity, amplitude gradient and semblance attributes. For the flexure attribute, the existing algorithm is computationally intensive and limited by the lateral resolution for steeply-dipping formations. This study proposes a new and robust volume-based algorithm that evaluates the flexure attribute more accurately and efficiently. The algorithm first fits a cubic surface volumetrically to the seismic data by using a diamond 13-node grid cell, and then computes flexure using the spatial derivatives of the fitted surface. To avoid
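
    A minimal sketch of the gray-level transformation plus Canny step described above, applied to a single 2D seismic slice with scikit-image; the rescaling choice, smoothing sigma, and input array are assumptions for illustration, not the dissertation's exact workflow.

```python
import numpy as np
from skimage import exposure, feature

def discontinuity_map(seismic_slice, sigma=2.0):
    """seismic_slice: 2D array of amplitudes from a time or depth slice."""
    # Gray-level transformation: rescale amplitudes to a common [0, 1] range
    img = exposure.rescale_intensity(seismic_slice.astype(float), out_range=(0.0, 1.0))
    # Canny edge detector highlights subtle lateral discontinuities (faults, channels)
    return feature.canny(img, sigma=sigma)

# Hypothetical slice: random background with a sharp amplitude break down the middle
slice_ = np.random.default_rng(1).normal(size=(128, 128))
slice_[:, 64:] += 4.0
edges = discontinuity_map(slice_)
print(edges.sum(), "edge pixels detected")
```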

  2. Methodology for cost analysis of film-based and filmless portable chest systems

    Science.gov (United States)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.

  3. Influence of Software Tool and Methodological Aspects of Total Metabolic Tumor Volume Calculation on Baseline [18F]FDG PET to Predict Survival in Hodgkin Lymphoma.

    Directory of Open Access Journals (Sweden)

    Salim Kanoun

    Full Text Available To investigate the respective influence of software tool and total metabolic tumor volume (TMTV0) calculation method on the prognostic stratification of baseline 2-deoxy-2-[18F]fluoro-D-glucose positron emission tomography ([18F]FDG-PET) in newly diagnosed Hodgkin lymphoma (HL). 59 patients with newly diagnosed HL were retrospectively included. [18F]FDG-PET was performed before any treatment. Four sets of TMTV0 were calculated with the Beth Israel (BI) software: based on an absolute threshold selecting voxels with standardized uptake value (SUV) >2.5 (TMTV02.5), applying a per-lesion threshold of 41% of the SUV max (TMTV041), and using a per-patient adapted threshold based on the SUV max of the liver (>125% and >140% of the SUV max of the liver background; TMTV0125 and TMTV0140). TMTV041 was also determined with commercial software for comparison of software tools. ROC curves were used to determine the optimal threshold for each TMTV0 to predict treatment failure. Median follow-up was 39 months. There was an excellent correlation between TMTV041 determined with BI and with the commercial software (r = 0.96, p<0.0001). The median values for TMTV041, TMTV02.5, TMTV0125 and TMTV0140 were respectively 160 ml (used as reference), 210 ml ([28;154], p = 0.005), 183 ml ([-4;114], p = 0.06) and 143 ml ([-58;64], p = 0.9). The respective optimal TMTV0 thresholds and areas under the curve (AUC) for prediction of progression-free survival (PFS) were: 313 ml and 0.70, 432 ml and 0.68, 450 ml and 0.68, 330 ml and 0.68. There was no significant difference between ROC curves. A high TMTV0 value was predictive of poor PFS in all methodologies: 4-year PFS was 83% vs 42% (p = 0.006) for TMTV02.5, 83% vs 41% (p = 0.003) for TMTV041, 85% vs 40% (p<0.001) for TMTV0125 and 83% vs 42% (p = 0.004) for TMTV0140. In newly diagnosed HL, baseline metabolic tumor volume values were significantly influenced by the choice of the method used for determination of volume. However, no significant differences were found
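
    The four thresholding rules compared in the abstract can be mimicked with a few lines of numpy, as sketched below; the SUV array, voxel size, and liver reference value are synthetic placeholders, and a real TMTV workflow would apply the 41% rule per lesion rather than globally.

```python
import numpy as np

def tmtv(suv, voxel_volume_ml, threshold):
    """Metabolic tumor volume: voxels above an SUV threshold times the voxel volume."""
    return np.count_nonzero(suv > threshold) * voxel_volume_ml

rng = np.random.default_rng(7)
suv = rng.gamma(shape=2.0, scale=1.5, size=(64, 64, 64))  # stand-in for a lesion-masked SUV map
voxel_ml = 0.064                                           # 4 x 4 x 4 mm voxel
suv_max, suv_liver = suv.max(), 2.0                        # illustrative reference values

print("TMTV, SUV > 2.5        :", round(tmtv(suv, voxel_ml, 2.5), 1), "ml")
print("TMTV, > 41% of SUVmax  :", round(tmtv(suv, voxel_ml, 0.41 * suv_max), 1), "ml")
print("TMTV, > 125% of liver  :", round(tmtv(suv, voxel_ml, 1.25 * suv_liver), 1), "ml")
print("TMTV, > 140% of liver  :", round(tmtv(suv, voxel_ml, 1.40 * suv_liver), 1), "ml")
```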

  4. Determination of fiber volume in graphite/epoxy materials using computer image analysis

    Science.gov (United States)

    Viens, Michael J.

    1990-01-01

    The fiber volume of graphite/epoxy specimens was determined by analyzing optical images of cross-sectioned specimens using image analysis software. Test specimens were mounted and polished using standard metallographic techniques and examined at 1000 times magnification. Fiber volume determined using the optical imaging agreed well with values determined using the standard acid digestion technique. The results were found to agree within 5 percent over a fiber volume range of 45 to 70 percent. The error observed is believed to arise from fiber volume variations within the graphite/epoxy panels themselves. The determination of ply orientation using image analysis techniques is also addressed.
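
    The image-analysis step can be approximated with a simple threshold-and-count scheme, as sketched below; the file name, the Otsu threshold, and the assumption that fibers image brighter than the epoxy matrix are illustrative choices rather than the paper's procedure.

```python
from skimage import filters, io

def fiber_volume_fraction(image_path):
    """Estimate the fiber area fraction of a polished cross-section micrograph."""
    gray = io.imread(image_path, as_gray=True)
    threshold = filters.threshold_otsu(gray)   # separate bright fibers from darker matrix
    fibers = gray > threshold                  # binary fiber mask
    return fibers.mean()                       # area fraction approximates volume fraction

# print(fiber_volume_fraction("cross_section_1000x.tif"))   # hypothetical micrograph
```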

  5. New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks

    Science.gov (United States)

    Kurtz, Nolan Scot

    The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazards, e.g. seismicity, storm surge, hurricane winds, etc., but also how they participate at the network level, using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people that depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Since these networks already exist, their evolution in time is also important. If networks are close to chloride sources, deterioration may be a major issue. Information from field inspections may also have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e. earthquakes, have been developed analytically. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis as these were relevant for specific structures. Furthermore, the network effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems. Additionally, such a method handles many kinds of system and component problems with singular or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to network size. Special network topologies may
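
    Of the network analyses listed above, crude Monte Carlo simulation is the simplest to sketch; the function below estimates a source-sink disconnection probability on a small hypothetical graph with independent edge failures, which is far simpler than the lifeline-network models developed in the dissertation.

```python
import random
import networkx as nx

def disconnection_probability(graph, source, sink, p_fail, n_samples=20000):
    """Crude Monte Carlo estimate of P(source and sink disconnected) when each edge
    fails independently with probability p_fail[edge]."""
    failures = 0
    for _ in range(n_samples):
        survivors = [e for e in graph.edges if random.random() > p_fail[e]]
        g = nx.Graph(survivors)
        g.add_nodes_from(graph.nodes)
        if not nx.has_path(g, source, sink):
            failures += 1
    return failures / n_samples

# Hypothetical 4-node lifeline network with a uniform edge failure probability
net = nx.Graph([("A", "B"), ("B", "D"), ("A", "C"), ("C", "D"), ("B", "C")])
p = {e: 0.1 for e in net.edges}
print(disconnection_probability(net, "A", "D", p))
```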

  6. Analysis of maternal and child health policies in Malawi: The methodological perspective.

    Science.gov (United States)

    Daire, J; Khalil, D

    2015-12-01

    The question of why most health policies do not achieve their intended results continues to receive considerable attention in the literature. This is in the light of the recognized gap between policy as intent and policy as practice, which calls for substantial research work to understand the factors that improve policy implementation. Although there is substantial work that explains the reasons why policies achieve or fail to achieve their intended outcomes, there are limited case studies that illustrate how to analyze policies from the methodological perspective. In this article, we report and discuss how a mixed qualitative research method was applied to analyzing maternal and child health policies in Malawi. For the purposes of this article, we do not report research findings; instead we focus our discussion on the methodology of the study and draw lessons for policy analysis research work. We base our discussion on our experiences from a study in which we analyzed maternal and child health policies in Malawi over the period from 1964 to 2008. Noting the multifaceted nature of maternal and child health policies, we adopted a mixed qualitative research method, whereby a number of data collection methods were employed. This approach allowed for the capturing of different perspectives on maternal and child health policies in Malawi and compensated for the weaknesses of each method, especially in terms of data validity. This research suggested that the multidimensional nature of maternal and child health policies, like other health policies, calls for a combination of research designs as well as a variety of methods of data collection and analysis. In addition, we suggest that, as an emerging research field, health policy analysis will benefit more from case study designs because they provide rich experiences in the actual policy context.

  7. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of 'soft' information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset 'attractiveness' to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  8. Field Programmable Gate Array Reliability Analysis Using the Dynamic Flowgraph Methodology

    Directory of Open Access Journals (Sweden)

    Phillip McNelles

    2016-10-01

    Full Text Available Field programmable gate array (FPGA)-based systems are thought to be a practical option to replace certain obsolete instrumentation and control systems in nuclear power plants. An FPGA is a type of integrated circuit, which is programmed after being manufactured. FPGAs have some advantages over other electronic technologies, such as analog circuits, microprocessors, and Programmable Logic Controllers (PLCs), for nuclear instrumentation and control, and safety system applications. However, safety-related issues for FPGA-based systems remain to be verified. Owing to this, modeling FPGA-based systems for safety assessment has now become an important point of research. One potential methodology is the dynamic flowgraph methodology (DFM). It has been used for modeling software/hardware interactions in modern control systems. In this paper, FPGA logic was analyzed using DFM. Four aspects of FPGAs are investigated: the "IEEE 1164 standard," registers (D flip-flops), configurable logic blocks, and an FPGA-based signal compensator. The ModelSim simulations confirmed that DFM was able to accurately model those four FPGA properties, proving that DFM has the potential to be used in the modeling of FPGA-based systems. Furthermore, advantages of DFM over traditional reliability analysis methods and FPGA simulators are presented, along with a discussion of potential issues with using DFM for FPGA-based system modeling.

  9. In Their Own Words? Methodological Considerations in the Analysis of Terrorist Autobiographies

    Directory of Open Access Journals (Sweden)

    Mary Beth Altier

    2012-01-01

    Full Text Available Despite the growth of terrorism literature in the aftermath of the 9/11 attacks, there remain several methodological challenges to studying certain aspects of terrorism. This is perhaps most evident in attempts to uncover the attitudes, motivations, and intentions of individuals engaged in violent extremism and how they are sometimes expressed in problematic behavior. Such challenges invariably stem from the fact that terrorists and the organizations to which they belong represent clandestine populations engaged in illegal activity. Unsurprisingly, these qualities make it difficult for the researcher to identify and locate willing subjects of study—let alone a representative sample. In this research note, we suggest the systematic analysis of terrorist autobiographies offers a promising means of investigating difficult-to-study areas of terrorism-related phenomena. Investigation of autobiographical accounts not only offers additional data points for the study of individual psychological issues, but also provides valuable perspectives on the internal structures, processes, and dynamics of terrorist organizations more broadly. Moreover, given most autobiographies cover critical events and personal experiences across the life course, they provide a unique lens into how terrorists perceive their world and insight into their decision-making processes. We support our advocacy of this approach by highlighting its methodological strengths and shortcomings.

  10. METHODOLOGICAL ASPECTS OF CONTENT ANALYSIS OF CONVERGENCE BETWEEN UKRAINIAN GAAP AND INTERNATIONAL FINANCIAL REPORTING STANDARDS

    Directory of Open Access Journals (Sweden)

    R. Kuzina

    2015-06-01

    Full Text Available The objective conditions of Ukraine's integration into the global business environment create the need to strengthen accounting and financial reporting. At the stage of attracting investment into the country, there is a need to prepare financial statements whose generally accepted basic principles are based on common International Financial Reporting Standards (IFRS). The assessment of the convergence of national standards and International Financial Reporting Standards is therefore relevant. However, before conducting a content analysis of compliance with the standards, it is necessary to determine methodological approaches to the selection of key indicators for the assessment of convergence. The aim of the article is to define the methodological approaches to the selection and development of a list of key IFRS elements for further evaluation of the convergence of national and international standards. To assess convergence, 187 basic key elements measuring the level of convergence to IFRS were allocated. Sampling of the key indicators of each standard was carried out based on the professional judgment of the author and on the evaluation of the usefulness of accounting information. These figures make it possible to calculate the specific level of convergence of international and national standards and to determine how closely statements prepared under domestic standards correspond to IFRS. In other words, can we assert with some certainty that Ukraine has achieved "good practices in IFRS implementation" or not? This calculation will assess the regulatory efforts of government agencies (the Ministry of Finance) on the approximation of Ukrainian standards and IFRS.

  11. Proposal of a Methodology of Stakeholder Analysis for the Brazilian Satellite Space Program

    Directory of Open Access Journals (Sweden)

    Mônica Elizabeth Rocha de Oliveira

    2012-03-01

    Full Text Available To ensure the continuity and growth of space activities in Brazil, it is fundamental to persuade Brazilian society and its representatives in Government of the importance of investments in space activities. Also, it is important to convince talented professionals to place space activities as an object of their interest; the best schools should also be convinced to offer courses related to the space sector; finally, innovative companies should be convinced to take part in space sector activities, looking for returns, mainly in terms of market differentiation and qualification, as a path to take part in high-technology and high-complexity projects. On the one hand, this process of convincing, or more importantly committing, these actors to space activities implies a thorough understanding of their expectations and needs, in order to plan how the system/organization can meet them. On the other hand, if stakeholders understand how much they can benefit from this relationship, their consequent commitment will very much strengthen the action of the system/organization. With this framework in perspective, this paper proposes a methodology of stakeholder analysis for the Brazilian satellite space program. In the exercise developed in the article, stakeholders have been identified from a study of the legal framework of the Brazilian space program. Subsequently, the proposed methodology has been applied to the planning of actions by a public organization.

  12. Field programmable gate array reliability analysis using the dynamic flow graph methodology

    Energy Technology Data Exchange (ETDEWEB)

    McNelles, Phillip; Lu, Lixuan [Faculty of Energy Systems and Nuclear Science, University of Ontario Institute of Technology (UOIT), Ontario (Canada)

    2016-10-15

    Field programmable gate array (FPGA)-based systems are thought to be a practical option to replace certain obsolete instrumentation and control systems in nuclear power plants. An FPGA is a type of integrated circuit, which is programmed after being manufactured. FPGAs have some advantages over other electronic technologies, such as analog circuits, microprocessors, and Programmable Logic Controllers (PLCs), for nuclear instrumentation and control, and safety system applications. However, safety-related issues for FPGA-based systems remain to be verified. Owing to this, modeling FPGA-based systems for safety assessment has now become an important point of research. One potential methodology is the dynamic flowgraph methodology (DFM). It has been used for modeling software/hardware interactions in modern control systems. In this paper, FPGA logic was analyzed using DFM. Four aspects of FPGAs are investigated: the 'IEEE 1164 standard', registers (D flip-flops), configurable logic blocks, and an FPGA-based signal compensator. The ModelSim simulations confirmed that DFM was able to accurately model those four FPGA properties, proving that DFM has the potential to be used in the modeling of FPGA-based systems. Furthermore, advantages of DFM over traditional reliability analysis methods and FPGA simulators are presented, along with a discussion of potential issues with using DFM for FPGA-based system modeling.

  13. Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 1: Analysis methods

    Science.gov (United States)

    Waszak, M. R.; Schmidt, D. S.

    1985-01-01

    As aircraft become larger and lighter due to design requirements for increased payload and improved fuel efficiency, they will also become more flexible. For highly flexible vehicles, the handling qualities may not be accurately predicted by conventional methods. This study applies two analysis methods to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first type of analysis is an open-loop model analysis technique. This method considers the effects of modal residue magnitudes on determining vehicle handling qualities. The second method is a pilot-in-the-loop analysis procedure that considers several closed-loop system characteristics. Volume 1 consists of the development and application of the two analysis methods described above.

  14. THE STUDY OF ORGANIZATIONAL CULTURE: METHODOLOGY FOR QUANTITATIVE EVALUATION AND ANALYSIS

    Directory of Open Access Journals (Sweden)

    To Thu Trang

    2014-01-01

    Full Text Available This article discusses the concept of methods for evaluating organizational culture, describes qualitative and quantitative assessment methodologies, and lists the basic methodologies for assessing organizational culture. Professor Denison's methodology for assessing organizational culture is described in full.

  15. Methodology for Impact Analysis of the Mobile Web in Developing Countries: a Pilot Study in Nairobi, Kenya

    OpenAIRE

    Purwandari, Betty; Hall, Wendy; Wills, Gary

    2011-01-01

    In this paper, we describe an impact analysis methodology to measure the impact of the Mobile Web in developing nations. This methodology is needed to anticipate the effects of the Mobile Web on society. Moreover, it can guide the advancement of Mobile Web technology to better serve its users. In May 2010, a pilot study to test the methodology was carried out in Nairobi, Kenya. A total of 47 students from 3 leading universities participated in the study. Questionnaires were used to ask how they u...

  16. Application of probabilistic and decision analysis methods to structural mechanics and materials sciences problems. Volume 1. Planning document

    Energy Technology Data Exchange (ETDEWEB)

    Garrick, B.J.; Tagart, S.W. Jr. (eds.)

    1984-08-01

    Volume I presents an overview of the EPRI structural reliability research program. First, perspectives on the probabilistic treatment of uncertainty are presented. A brief explanation is given of why decision analysis methods are part of EPRI's structural reliability project, and how the use of such methods to handle uncertainty can improve decision-making in this area. A more detailed discussion of one approach for dealing with uncertainty about event probabilities is also presented. Next, a review of probabilistic risk analysis (PRA) is presented. This review includes a brief history of its development and application, an overview of the methodology involved, the role of structural reliability assessment in providing input to PRAs, and the treatment of uncertainties in that input. A brief discussion of the relationship between PRA and safety goals is also included.

  17. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic and industrial use, as well as use by public and municipal authorities and street lighting) and we examine their relation with variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the level of prefecture. We visualize the results of the analysis and perform cluster and outlier analysis using the Anselin local Moran's I statistic, as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight and a better understanding of the regional development model in Greece, and justifies the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/ or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
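
    The abstract names the Anselin local Moran's I statistic for the cluster and outlier analysis. As a minimal sketch, assuming a single attribute per region (for example, domestic demand per prefecture) and a row-standardized contiguity weights matrix, the statistic can be computed as below; the data and the weights matrix are illustrative, not the study's.

```python
import numpy as np

def local_morans_i(x: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Anselin local Moran's I for each observation.

    x : (n,) attribute values (e.g., electricity demand per region)
    w : (n, n) spatial weights matrix, row-standardized, zero diagonal
    """
    z = x - x.mean()
    m2 = (z ** 2).sum() / len(x)      # second moment of the deviations
    return (z / m2) * (w @ z)         # I_i = (z_i / m2) * sum_j w_ij z_j

# Illustrative example: 4 hypothetical regions with a simple contiguity structure.
x = np.array([10.0, 12.0, 3.0, 2.5])
w = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
w = w / w.sum(axis=1, keepdims=True)  # row-standardize
print(local_morans_i(x, w))           # positive values suggest local clustering
```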

  18. Evaluation of methodologies for assessing the overall diet: dietary quality scores and dietary pattern analysis.

    Science.gov (United States)

    Ocké, Marga C

    2013-05-01

    This paper aims to describe different approaches for studying the overall diet, with their advantages and limitations. Studies of the overall diet have emerged because the relationship between dietary intake and health is very complex, with all kinds of interactions that cannot be captured well by studying single dietary components. Three main approaches to studying the overall diet can be distinguished. The first method is researcher-defined scores or indices of diet quality. These are usually based on guidelines for a healthy diet or on diets known to be healthy. The second approach, using principal component or cluster analysis, is driven by the underlying dietary data. In principal component analysis, scales are derived based on the underlying relationships between food groups, whereas in cluster analysis, subgroups of the population are created with people that cluster together based on their dietary intake. A third approach includes methods that are driven by a combination of biological pathways and the underlying dietary data. Reduced rank regression defines linear combinations of food intakes that maximally explain nutrient intakes or intermediate markers of disease. Decision tree analysis identifies subgroups of a population whose members share dietary characteristics that influence (intermediate markers of) disease. It is concluded that all approaches have advantages and limitations and essentially answer different questions. The third approach is still in an exploratory phase, but seems to have great potential and complementary value. More insight into the utility of conducting studies on the overall diet can be gained if more attention is given to methodological issues.
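
    For the second, data-driven approach described above, a minimal sketch is given below: principal component analysis to derive dietary pattern scores and k-means clustering to group respondents. The intake matrix and the number of food groups are hypothetical stand-ins, not data from any cited study.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical intake matrix: rows = respondents, columns = food groups (g/day).
rng = np.random.default_rng(0)
intake = rng.gamma(shape=2.0, scale=50.0, size=(200, 6))

z = StandardScaler().fit_transform(intake)

# Principal component analysis: dietary pattern scores per respondent.
pca = PCA(n_components=2)
pattern_scores = pca.fit_transform(z)
print("variance explained:", pca.explained_variance_ratio_)

# Cluster analysis: subgroups of respondents with similar overall diets.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)
print("cluster sizes:", np.bincount(clusters))
```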

  19. Vehicle technologies heavy vehicle program : FY 2008 benefits analysis, methodology and results --- final report.

    Energy Technology Data Exchange (ETDEWEB)

    Singh, M.; Energy Systems; TA Engineering

    2008-02-29

    This report describes the approach to estimating the benefits and analysis results for the Heavy Vehicle Technologies activities of the Vehicle Technologies (VT) Program of EERE. The scope of the effort includes: (1) Characterizing baseline and advanced technology vehicles for Class 3-6 and Class 7 and 8 trucks, (2) Identifying technology goals associated with the DOE EERE programs, (3) Estimating the market potential of technologies that improve fuel efficiency and/or use alternative fuels, and (4) Determining the petroleum and greenhouse gas emissions reductions associated with the advanced technologies. In FY 08 the Heavy Vehicles program continued its involvement with various sources of energy loss as compared to focusing more narrowly on engine efficiency and alternative fuels. These changes are the result of a planning effort that first occurred during FY 04 and was updated in the past year. (Ref. 1) This narrative describes characteristics of the heavy truck market as they relate to the analysis, a description of the analysis methodology (including a discussion of the models used to estimate market potential and benefits), and a presentation of the benefits estimated as a result of the adoption of the advanced technologies. The market penetrations are used as part of the EERE-wide integrated analysis to provide final benefit estimates reported in the FY08 Budget Request. The energy savings models are utilized by the VT program for internal project management purposes.

  20. Methodology for Benefit Analysis of CAD/CAM (Computer-Aided Design/Computer-Aided Manufacturing) in USN Shipyards.

    Science.gov (United States)

    1984-03-01

    benefits of CAD/CAM and of the next generation technology, CIDER. The CADOS study (Ref. 13) offers a method to measure the intangibles of CAD/CAM... methodology that measures both tangible and intangible benefits of present CAD technology. This method would be hard to extend to CIDER technology because of...

  1. Methodology for adding and amending glycaemic index values to a nutrition analysis package.

    LENUS (Irish Health Repository)

    Levis, Sharon P

    2011-04-01

    Since its introduction in 1981, the glycaemic index (GI) has been a useful tool for classifying the glycaemic effects of carbohydrate foods. Consumption of a low-GI diet has been associated with a reduced risk of developing CVD, diabetes mellitus and certain cancers. WISP (Tinuviel Software, Llanfechell, Anglesey, UK) is a nutrition software package used for the analysis of food intake records and 24 h recalls. Within its database, WISP contains the GI values of foods based on the International Tables 2002. The aim of the present study is to describe in detail a methodology for adding and amending GI values to the WISP database in a clinical or research setting, using data from the updated International Tables 2008.

  2. Application of the Simulation Based Reliability Analysis on the LBB methodology

    Directory of Open Access Journals (Sweden)

    Pečínka L.

    2008-11-01

    Full Text Available Guidelines on how to demonstrate the existence of Leak Before Break (LBB) have been developed in many western countries. These guidelines, partly based on NUREG/CR-6765, define the steps that should be fulfilled to obtain a conservative assessment of LBB acceptability. As a complement, and also to help identify the key parameters that influence the resulting leakage and failure probabilities, the application of Simulation Based Reliability Analysis is under development. The methodology is demonstrated on the assessment of through-wall leakage crack stability according to the R6 method. R6 is a well-known engineering assessment procedure for evaluating the integrity of flawed structures. The influence of thermal ageing and seismic events has been elaborated.

  3. From continuous flow analysis to programmable Flow Injection techniques. A history and tutorial of emerging methodologies.

    Science.gov (United States)

    Ruzicka, Jaromir Jarda

    2016-09-01

    Automation of reagent-based assays, also known as Flow Analysis (FA), is based on sample processing in which a sample flows towards and through a detector for monitoring of its components. The Achilles heel of this methodology is that the majority of FA techniques use constant, continuous forward flow to transport the sample, an approach which continually consumes reagents and generates chemical waste. Therefore, the purpose of this report is to highlight recent developments in flow programming that not only save reagents but also lead, by means of advanced sample processing, to selective and sensitive assays based on stop-flow measurement. Flow programming combined with a novel approach to data harvesting yields a new approach to single-standard calibration and avoids interference caused by the refractive index. Finally, flow programming is useful for sample preparation, such as rapid, extensive sample dilution. The principles are illustrated by selected references and an online tutorial available at http://www.flowinjectiontutorial.com/.

  4. Discovery and functional analysis of lncRNAs: Methodologies to investigate an uncharacterized transcriptome.

    Science.gov (United States)

    Kashi, Kaori; Henderson, Lindsey; Bonetti, Alessandro; Carninci, Piero

    2016-01-01

    It is known that more than 70% of mammalian genomes are transcribed, yet the vast majority of transcripts do not code for proteins. Are these noncoding transcripts merely transcriptional noise, or do they serve a biological purpose? Recent developments in genomic analysis technologies, especially sequencing methods, have allowed researchers to create a large atlas of transcriptomes, study subcellular localization, and investigate potential interactions with proteins for a growing number of transcripts. Here, we review the current methodologies available for discovering and investigating functions of long noncoding RNAs (lncRNAs), which require a wide variety of applications to study their potential biological roles. This article is part of a Special Issue entitled: Clues to long noncoding RNA taxonomy, edited by Dr. Tetsuro Hirose and Dr. Shinichi Nakagawa.

  5. Analysis of the link between a definition of sustainability and the life cycle methodologies

    DEFF Research Database (Denmark)

    Jørgensen, Andreas; Herrmann, Ivan Tengbjerg; Bjørn, Anders

    2013-01-01

    A definition of sustainability is presented and detailed to a level enabling an analysis of its relation to the impact categories at midpoint level considered in life cycle (LC) methodologies. The interpretation of the definition of sustainability as outlined in Our Common Future (WCED 1987) suggests that the assessment of a product's sustainability is about addressing the extent to which product life cycles affect poverty levels among the current generation, as well as changes in the levels of natural, human, produced and social capital available for the future population. It is shown that the extent to which product life cycles affect poverty is to some extent covered by impact categories included in existing SLCA approaches. It is also found that the extent to which product life cycles affect natural capital is well covered by LCA, and that human capital is covered by both LCA and SLCA but in different ways. Produced capital is not to any...

  6. New methodology developed for the differential scanning calorimetry analysis of polymeric matrixes incorporating phase change materials

    Science.gov (United States)

    Barreneche, Camila; Solé, Aran; Miró, Laia; Martorell, Ingrid; Inés Fernández, A.; Cabeza, Luisa F.

    2012-08-01

    Nowadays, thermal comfort needs in buildings have led to an increase in the energy consumption of the residential and service sectors. For this reason, thermal energy storage is presented as an alternative for reducing this high consumption. Phase change materials (PCM) have been studied to store energy due to their high storage capacity. A polymeric material capable of macroencapsulating PCM was developed by the authors of this paper. However, difficulties were found while measuring the thermal properties of these materials by differential scanning calorimetry (DSC): the polymeric matrix interferes with the detection of the PCM properties by DSC. To remove this interfering effect, a new methodology was developed in which the conventional empty crucible used as the reference in the DSC analysis is replaced by a crucible composed of the polymeric matrix. A clear signal from the PCM is thus obtained by subtracting the new full-crucible signal from the sample signal.
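
    The data-handling step implied above, subtracting the signal of a matrix-filled reference crucible from the composite sample signal and integrating the remaining melting peak, can be sketched as follows; the traces, temperatures and heating rate are invented values, not measurements from the paper.

```python
import numpy as np

# Illustrative DSC traces (heat flow in mW vs. temperature in degC); not real data.
temperature = np.linspace(0.0, 60.0, 601)
matrix_reference = 0.02 * temperature                          # matrix-filled crucible signal
pcm_peak = 8.0 * np.exp(-((temperature - 28.0) ** 2) / 8.0)    # PCM melting endotherm
sample = matrix_reference + pcm_peak                           # composite (matrix + PCM) sample

# Subtract the matrix-filled reference signal to isolate the PCM response.
pcm_signal = sample - matrix_reference

# Integrate the peak over temperature and divide by the heating rate (degC/s)
# to convert the result to an apparent enthalpy (mW * s = mJ).
heating_rate = 10.0 / 60.0  # 10 K/min expressed in degC per second
enthalpy_mJ = np.trapz(pcm_signal, temperature) / heating_rate
print(f"apparent melting enthalpy ~ {enthalpy_mJ:.0f} mJ")
```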

  7. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    Property prediction models are a fundamental tool of process modeling and analysis, especially at the early stage of process development. Furthermore, property prediction models are the fundamental tool for computer-aided molecular design used for the development of new refrigerants. Group contribution (GC) based prediction methods use structurally dependent parameters in order to determine the property of pure components. The aim of the GC parameter estimation is to find the best possible set of model parameters that fits the experimental data; in that sense, there is often a lack of attention... The GC model uses the Marrero-Gani (MR) method, which considers the group contribution at different levels, both functional and structural. The methodology helps improve the accuracy and reliability of property modeling and provides a rigorous model quality check and assurance. This is expected to further...

  8. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    Science.gov (United States)

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes.
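
    As a point of reference for the pooling step discussed above, a minimal sketch of the conventional DerSimonian-Laird random-effects combination of study-level estimates and variances is shown below; the paper reviews alternatives better suited to small studies, so this is only the standard baseline, and the numbers are made up.

```python
import numpy as np

def dersimonian_laird(estimates: np.ndarray, variances: np.ndarray):
    """Standard random-effects pooling of study-level estimates."""
    w = 1.0 / variances                          # fixed-effect weights
    theta_fe = np.sum(w * estimates) / np.sum(w)
    q = np.sum(w * (estimates - theta_fe) ** 2)  # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(estimates) - 1)) / c)  # between-study variance
    w_re = 1.0 / (variances + tau2)              # random-effects weights
    theta_re = np.sum(w_re * estimates) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return theta_re, se_re, tau2

# Illustrative study-level estimates (e.g., log within-subject coefficients of variation).
est = np.array([0.12, 0.18, 0.10, 0.25])
var = np.array([0.002, 0.004, 0.001, 0.006])
print(dersimonian_laird(est, var))
```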

  9. A methodology for the semi-automatic digital image analysis of fragmental impactites

    Science.gov (United States)

    Chanou, A.; Osinski, G. R.; Grieve, R. A. F.

    2014-04-01

    A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware software ImageJ developed by the National Institutes of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced, where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations in the physical characteristics of some examples of fragmental impactites.
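
    The authors work in ImageJ; purely as an illustration of the same segmentation idea (global thresholding, connected-component labelling, areal fractions), a scripted sketch on a synthetic grayscale image is shown below. It is not the authors' workflow, and the image is fabricated.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

# Synthetic stand-in for a grayscale hand-sample scan (bright "clasts" on a dark matrix).
rng = np.random.default_rng(1)
image = rng.normal(0.2, 0.05, size=(256, 256))
image[40:90, 60:120] += 0.6   # one bright clast
image[150:200, 30:80] += 0.5  # another clast

# Semi-automatic segmentation: global Otsu threshold, then connected-component labelling.
mask = image > threshold_otsu(image)
labels = label(mask)

print(f"clast areal fraction: {mask.mean():.3f}")  # fraction of pixels classified as clasts
for region in regionprops(labels):
    print(f"clast {region.label}: area={region.area} px, orientation={region.orientation:.2f} rad")
```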

  10. Experimental stress analysis for materials and structures stress analysis models for developing design methodologies

    CERN Document Server

    Freddi, Alessandro; Cristofolini, Luca

    2015-01-01

    This book summarizes the main methods of experimental stress analysis and examines their application to various states of stress of major technical interest, highlighting aspects not always covered in the classic literature. It is explained how experimental stress analysis assists in the verification and completion of analytical and numerical models, the development of phenomenological theories, the measurement and control of system parameters under operating conditions, and identification of causes of failure or malfunction. Cases addressed include measurement of the state of stress in models, measurement of actual loads on structures, verification of stress states in circumstances of complex numerical modeling, assessment of stress-related material damage, and reliability analysis of artifacts (e.g. prostheses) that interact with biological systems. The book will serve graduate students and professionals as a valuable tool for finding solutions when analytical solutions do not exist.

  11. Development of trip coverage analysis methodology - CATHENA trip coverage analysis model

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H.; Huh, J. Y.; Na, Y. H.; Lee, S. Y.; Kim, B. G.; Kim, H. H.; Kim, S. W.; Bae, C. J.; Kim, T. M.; Kim, S. R.; Han, B. S.; Moon, B. J.; Oh, M. T. [Korea Power Engineering Co., Yongin (Korea)

    2001-05-01

    This report describes the CATHENA model for trip coverage analysis. The model is prepared based on the Wolsong 2 design data and consists of the primary heat transport system, shutdown system, steam and feedwater system, reactor regulating system, heat transport pressure and inventory control system, and steam generator level and pressure control system. The new features and the parts modified from the Wolsong 2 CATHENA LOCA model that are required for trip coverage analysis are described. The model is tested by simulating steady state at 100% FP and at several low powers. The cases of power rundown and power runup are also tested. 17 refs., 124 figs., 19 tabs. (Author)

  12. Volume analysis of supercooled water under high pressure

    OpenAIRE

    Duki, Solomon F.; Tsige, Mesfin

    2016-01-01

    Motivated by recent experimental findings on the volume of supercooled water at high pressure [O. Mishima, J. Chem. Phys. 133, 144503 (2010)], we performed an atomistic molecular dynamics simulation study of bulk water in the isothermal-isobaric ensemble. Cooling and heating cycles at different isobars and isothermal compression at different temperatures are performed on the water sample at pressures that range from 0 to 1.0 GPa. The cooling simulations are done at temperatures that range from...

  13. Analysis of airborne radiometric data. Volume 3. Topical reports

    Energy Technology Data Exchange (ETDEWEB)

    Reed, J.H.; Shreve, D.C.; Sperling, M.; Woolson, W.A.

    1978-05-01

    This volume consists of four topical reports: a general discussion of the philosophy of unfolding spectra with continuum and discrete components, a mathematical treatment of the effects of various physical parameters on the uncollided gamma-ray spectrum at aircraft elevations, a discussion of the application of the unfolding code MAZNAI to airborne data, and a discussion of the effects of the nonlinear relationship between energy deposited and pulse height in NaI(Tl) detectors.

  14. Legal basis for risk analysis methodology while ensuring food safety in the Eurasian Economic union and the Republic of Belarus

    Directory of Open Access Journals (Sweden)

    E.V. Fedorenko

    2015-09-01

    Full Text Available Health risk analysis methodology is an internationally recognized tool for ensuring food safety. The three main elements of risk analysis (risk assessment, risk management, and risk communication to inform the interested parties about the risk) are legislated and implemented in the Eurasian Economic Union and the Republic of Belarus. There is a corresponding organizational and functional framework for the application of risk analysis methodology, both in the justification of production safety indicators and in the implementation of public health surveillance. Common methodological approaches and criteria for evaluating public health risk are determined, and these are used in the development and application of food safety requirements. Risk assessment can be used in justifying safety indicators (contaminants, food additives) and in evaluating the effectiveness of programs on the enrichment of food with micronutrients.

  15. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume II. Software description and listings. [IAEAPU

    Energy Technology Data Exchange (ETDEWEB)

    Ruhter, W.D.

    1984-05-01

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV 241Pu and 208-keV 237U peaks are required for the spectral analysis that gives plutonium isotopic ratios and weight percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operation instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings.

  16. COMPARATIVE ANALYSIS OF INDICATORS OBTAINED BY CORINELAND COVER METHODOLOGY FOR SUSTAINABLE USE OF FOREST ECOSYSTEMS

    Directory of Open Access Journals (Sweden)

    Slaviša Popović

    2015-07-01

    Full Text Available The Serbian Environmental Protection Agency used international and national indicators to monitor forested landscape area for the period 1990-2000. The analysis was based on data obtained by the Corine Land Cover methodology for the indicators Forest area, Forested landscape, Forest land, and Forest and semi-natural area. The analysis of forested landscape indicators supported trend monitoring over the period 1990-2000. The dynamics of forested area changes could have a direct impact on the practical implementation of the indicators. The indicator Forest area can be used in planning the sustainable use of forests; its recorded growth rate in 2000, compared with 1990, is 0.296%. The indicator Forested landscape increased by 0.186% up to 2000, while the indicator Forest land recorded a growth rate of 0.193%. Changes in the rates of these indicators can be used in the future for "emission trading". The smallest rate change, 0.1%, was recorded for the indicator Forest and semi-natural area. The information given by this indicator can be used for monitoring habitats in high mountain areas.

  17. The Cultural Analysis of Soft Systems Methodology and the Configuration Model of Organizational Culture

    Directory of Open Access Journals (Sweden)

    Jürgen Staadt

    2015-06-01

    Full Text Available Organizations that find themselves within a problematic situation connected with cultural issues such as politics and power require adaptable research and corresponding modeling approaches so as to grasp the arrangements of that situation and their impact on the organizational development. This article originates from an insider-ethnographic intervention into the problematic situation of the leading public housing provider in Luxembourg. Its aim is to describe how the more action-oriented cultural analysis of soft systems methodology and the theory-driven configuration model of organizational culture are mutually beneficial rather than contradictory. The data collected between 2007 and 2013 were analyzed manually as well as by means of ATLAS.ti. Results demonstrate that the cultural analysis enables an in-depth understanding of the power-laden environment within the organization bringing about the so-called “socio-political system” and that the configuration model makes it possible to depict the influence of that system on the whole organization. The overall research approach thus contributes toward a better understanding of the influence and the impact of oppressive social environments and evolving power relations on the development of an organization.

  18. Towards criterion validity in classroom language analysis: methodological constraints of metadiscourse and inter-rater agreement

    Directory of Open Access Journals (Sweden)

    Douglas Altamiro Consolo

    2001-02-01

    Full Text Available This paper reports on a process to validate a revised version of a system for coding classroom discourse in foreign language lessons, a context in which the dual role of language (as content and as means of communication) and the speakers' specific pedagogical aims lead to a certain degree of ambiguity in language analysis. The language used by teachers and students has been extensively studied, and a framework of concepts concerning classroom discourse is well established. Models for coding classroom language need, however, to be revised when they are applied to specific research contexts. The application and revision of an initial framework can lead to the development of earlier models, and to the re-definition of previously established categories of analysis that have to be validated. The procedures followed to validate a coding system are related here as guidelines for conducting research under similar circumstances. The advantages of using instruments that incorporate two types of data, that is, quantitative measures and qualitative information from raters' metadiscourse, are discussed, and it is suggested that such a procedure can contribute to the process of validation itself, towards attaining reliability of research results, as well as indicating some constraints of the adopted research methodology.

  19. Dose volume analysis in brachytherapy and stereotactic radiosurgery

    Energy Technology Data Exchange (ETDEWEB)

    Tozer-Loft, S.M

    2000-12-01

    A brief introduction to three branches of radiotherapy is given: interstitial brachytherapy, external beam megavoltage radiotherapy, and stereotactic radiosurgery. The current interest in issues around conformity, uniformity and optimisation is explained in the light of technical developments in these fields. A novel method of displaying dose-volume information, which mathematically suppresses the inverse-square law, as first suggested by L.L. Anderson for use in brachytherapy, is explained in detail, and some improvements are proposed. These 'natural' histograms are extended to show the effects of real point sources which do not exactly follow the inverse-square law, and to demonstrate the in-target dose-volume distribution, previously unpublished. The histograms are used as a way of mathematically analysing the properties of theoretical mono-energetic radionuclides, and for demonstrating the dosimetric properties of a potential new brachytherapy source (Ytterbium-169). A new modification of the Anderson formalism is then described for producing Anderson Inverse-Square Shifted (AISS) histograms for the Gamma Knife, which are shown to be useful for demonstrating the quality of stereotactic radiosurgery dose distributions. A study is performed analysing the results of Gamma Knife treatments on 44 patients suffering from a benign brain tumour (acoustic neuroma). Follow-up data is used to estimate the volume shrinkage or growth of each tumour, and this measure of outcome is compared with a range of figures of merit which express different aspects of the quality of each dose distribution. The results are analysed in an attempt to answer the question: What are the important features of the dose distribution (conformality, uniformity, etc) which show a definite relationship with the outcome of the treatment? Initial results show positively that, when Gamma Knife radiosurgery is used to treat acoustic neuroma, some measures of conformality seem to have a surprising
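
    The inverse-square suppression that underlies Anderson's 'natural' histograms can be summarized with the textbook point-source relation sketched below; the derivation is included only for orientation and is not quoted from the thesis.

```latex
% For an ideal point source the dose falls off with the inverse square of distance,
%   D(r) = k / r^2, so r = (k / D)^{1/2},
% and the volume receiving at least dose D is the sphere of radius r(D),
% which is proportional to D^{-3/2}.
% Plotting volume against the transformed dose axis u = D^{-3/2} therefore turns the
% pure inverse-square contribution into a constant dV/du, so departures from it
% (attenuation, scatter, finite source geometry) become directly visible.
\[
  D(r) = \frac{k}{r^{2}}, \qquad
  V(D) = \frac{4}{3}\pi\,k^{3/2}\,D^{-3/2}, \qquad
  \frac{dV}{d\!\left(D^{-3/2}\right)} = \frac{4}{3}\pi\,k^{3/2} = \text{const.}
\]
```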

  20. Detecting Hidden Encrypted Volume Files via Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Mario Piccinelli

    2015-05-01

    Full Text Available Nowadays various software tools have been developed for the purpose of creating encrypted volume files. Many of those tools are open source and freely available on the internet. Because of that, the probability of finding encrypted files which could contain forensically useful information has dramatically increased. While decoding these files without the key is still a major challenge, the simple fact of being able to recognize their existence is now a top priority for every digital forensics investigation. In this paper we will present a statistical approach to find elements of a seized filesystem which have a reasonable chance of containing encrypted data.
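
    The paper's specific statistical tests are not reproduced in this record; a common first-pass indicator in the same spirit is the Shannon entropy of a file's byte distribution, which approaches 8 bits per byte for encrypted (or well-compressed) content. The sketch below uses a hypothetical file path and is a screening heuristic only, not the authors' method.

```python
import math
from collections import Counter
from pathlib import Path

def byte_entropy(path: Path, chunk_size: int = 1 << 20) -> float:
    """Shannon entropy of the file's byte distribution, in bits per byte (max 8.0)."""
    counts = Counter()
    with path.open("rb") as handle:
        while chunk := handle.read(chunk_size):
            counts.update(chunk)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

if __name__ == "__main__":
    # Entropy very close to 8.0 bits/byte, combined with the absence of a known file
    # header, flags a candidate encrypted volume; compressed media also scores high,
    # so this is a screening heuristic rather than proof.
    candidate = Path("suspect.bin")  # hypothetical path
    if candidate.exists():
        print(f"{candidate}: {byte_entropy(candidate):.3f} bits/byte")
```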

  1. Synfuel program analysis. Volume 2: VENVAL users manual

    Science.gov (United States)

    Muddiman, J. B.; Whelan, J. W.

    1980-07-01

    This volume is intended for program analysts and is a users manual for the VENVAL model. It contains specific explanations as to input data requirements and programming procedures for the use of this model. VENVAL is a generalized computer program to aid in the evaluation of prospective private sector production ventures. The program can project interrelated values of installed capacity, production, sales revenue, operating costs, depreciation, investment, debt, earnings, taxes, return on investment, depletion, and cash flow measures. It can also compute related public sector and other external costs and revenues if unit costs are furnished.

  2. Analysis Methodology for Optimal Selection of Ground Station Site in Space Missions

    Science.gov (United States)

    Nieves-Chinchilla, J.; Farjas, M.; Martínez, R.

    2013-12-01

    Optimization of ground station sites is especially important in complex missions that include several small satellites (clusters or constellations), such as the QB50 project, where one ground station would be able to track several space vehicles, even simultaneously. In this regard, the design of the communication system has to carefully take into account the ground station site and the relevant signal phenomena, which depend on the frequency band. These aspects become even more relevant when proposing the optimal location of the ground station and establishing a trusted communication link, owing to ground segment sites in urban areas and/or the selection of low orbits for the space segment. In addition, updated cartography with high-resolution data of the location and its surroundings helps to develop recommendations for the design of the site for space vehicle tracking and hence to improve effectiveness. The objectives of this analysis methodology are: completion of the cartographic information, modelling of the obstacles that hinder communication between the ground and space segments, and representation, in the generated 3D scene, of the degree of impairment in the signal/noise caused by the phenomena that interfere with communication. The integration of new geographic data capture technologies, such as 3D laser scanning, shows that increased optimization of the antenna elevation mask, in its AOS and LOS azimuths along the visible horizon, maximizes visibility time with space vehicles. Furthermore, from the three-dimensional cloud of points captured, specific information is selected and, using 3D modeling techniques, the 3D scene of the antenna location site and surroundings is generated. The resulting 3D model evidences nearby obstacles related to the cartographic conditions, such as mountain formations and buildings, and any additional obstacles that interfere with the operational quality of the antenna (other antennas and electronic devices that emit or receive in the same bandwidth

  3. ANALYSIS OF THE EFFECTIVENESS AND EFFICIENCY OF MANAGEMENT SYSTEMS BASED ON SYSTEM ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Yurij Vasilkov

    2014-09-01

    Full Text Available In this paper we consider the problem of analyzing the effectiveness and efficiency of management systems, which is especially relevant when implementing at the enterprise the requirements of ISO 9001, ISO 14001 and other standards. Studying a management system on the basis of a systematic approach focuses on revealing its integrative (i.e. systemic) qualities and on identifying the variety of relationships and mechanisms that produce these qualities. This makes it possible to identify the causes of the real state of affairs and to explain successes and failures. An important aspect of a systematic approach to the analysis of the effectiveness and efficiency of production management is the multiplicity of stakeholders' interests involved in the production process in the formation of operational goals and the ways to achieve them.

  4. Semiconductor applications of nanoliter droplet methodology with total reflection x-ray fluorescence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Thomasin C.; Sparks, Christopher M.; Havrilla, George J. E-mail: havrilla@lanl.gov; Beebe, Meredith R

    2004-08-31

    In this study, the nanoliter dried spot method was applied to semiconductor contamination analysis to enhance vapor phase decomposition processes with total reflection X-ray fluorescence detection. Nanoliter-sized droplets (10 and 50 nl) were deposited onto native silicon oxide wafer surfaces in a clean room environment from both single and multielemental standards containing various concentrations of iron in different matrices. Direct comparisons were made to droplets formed by conventional VPD with similar iron standards. Nanoliter dried spots could be reproducibly deposited and dried in air with typical drying times ranging from 20 s to 2 min depending on the nanoliter volume deposited, compared to VPD spots which have drying times ranging from tens of minutes to several hours. Both types of residues showed a linear relationship between Fe intensity and mass deposited. Variable angle experiments showed that both nanoliter and VPD deposits of single element standards were film-like in character, while residues formed from much more complex matrices and higher mass loadings were particulate in character. For the experimental conditions used in this study (30 kV, 100 mA), typical TXRF spectral Fe limits of detection were calculated to be on the order of picograms or ~1x10^10 atoms/cm^2 for a 0.8 cm^2 X-ray excitation beam area for both nanoliter dried spots and VPD spots prepared from single elemental standards. Calculated Fe detection limits for the 200 mm diameter silicon wafers used in this study were in the ~1x10^8 atoms/cm^2 range. By using nanoliter-sized droplets, the required sample volume is greatly reduced, resulting in higher sample throughput than with conventional VPD methods.

  5. SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.

    Science.gov (United States)

    Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

    2001-10-12

    As it is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which the risk assessment steps are interactively linked with the implementation of safety measures. The resultant system tells us the extent to which risk is reduced by each successive safety measure. It also indicates, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology has been illustrated with a case study.

  6. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare

    NARCIS (Netherlands)

    Bracke, M.B.M.; Edwards, S.A.; Metz, J.H.M.; Noordhuizen, J.P.T.M.; Algers, B.

    2008-01-01

    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally applied...

  7. Price-volume multifractal analysis and its application in Chinese stock markets

    Science.gov (United States)

    Yuan, Ying; Zhuang, Xin-tian; Liu, Zhi-ying

    2012-06-01

    An empirical study on Chinese stock markets is conducted using statistical tools. First, the multifractality of the stock price return series, r_i = ln(P_{i+1}) - ln(P_i), and of the trading volume variation series, v_i = ln(V_{i+1}) - ln(V_i), is confirmed using multifractal detrended fluctuation analysis. Furthermore, a multifractal detrended cross-correlation analysis between stock price return and trading volume variation in Chinese stock markets is also conducted. It is shown that the cross relationship between them is also multifractal. Second, the cross-correlation between stock price P_i and trading volume V_i is empirically studied using the cross-correlation function and detrended cross-correlation analysis. It is found that both the Shanghai stock market and the Shenzhen stock market show pronounced long-range cross-correlations between stock price and trading volume. Third, a composite index R based on price and trading volume is introduced. Compared with the stock price return series r_i and the trading volume variation series v_i, the R variation series not only retains the characteristics of the original series but also demonstrates the relative correlation between stock price and trading volume. Finally, we analyze the multifractal characteristics of the R variation series before and after three financial events in China (namely, Price Limits, the Reform of Non-tradable Shares and the financial crisis in 2008) over the whole sample period, in order to study the changes in stock market fluctuation and financial risk. The empirical results verify the validity of R.
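
    As a small sketch of the series construction defined above, together with a simple (non-detrended) lagged cross-correlation check between absolute returns and volume variations, the snippet below uses synthetic price and volume data; a full MF-DFA or MF-DCCA implementation is beyond this example.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic stand-ins for a daily closing price series and a trading volume series.
price = 100.0 * np.exp(np.cumsum(rng.normal(0.0, 0.01, 1000)))
volume = np.exp(10.0 + np.cumsum(rng.normal(0.0, 0.05, 1000)))

# Series defined in the abstract: r_i = ln(P_{i+1}) - ln(P_i), v_i = ln(V_{i+1}) - ln(V_i).
r = np.diff(np.log(price))
v = np.diff(np.log(volume))

def cross_correlation(x, y, max_lag=10):
    """Sample cross-correlation of standardized series for lags 0..max_lag."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    return [float(np.dot(x[: n - k], y[k:]) / (n - k)) for k in range(max_lag + 1)]

print(cross_correlation(np.abs(r), np.abs(v)))  # volatility-volume co-movement by lag
```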

  8. A Methodological Review of Exploratory Factor Analysis in Sexuality Research: Used Practices, Best Practices, and Data Analysis Resources.

    Science.gov (United States)

    Sakaluk, John K; Short, Stephen D

    2017-01-01

    Sexuality researchers frequently use exploratory factor analysis (EFA) to illuminate the distinguishable theoretical constructs assessed by a set of variables. EFA entails a substantive number of analytic decisions to be made with respect to sample size determination, and how factors are extracted, rotated, and retained. The available analytic options, however, are not all equally empirically rigorous. We discuss the commonly available options for conducting EFA and which options constitute best practices for EFA. We also present the results of a methodological review of the analytic options for EFA used by sexuality researchers in more than 200 EFAs, published in more than 160 articles and chapters from 1974 to 2014, in a sample of sexuality research journals. Our review reveals that best practices for EFA are actually those least frequently used by sexuality researchers. We introduce freely available analytic resources to help make it easier for sexuality researchers to adhere to best practices when conducting EFAs in their own research.

  9. The methodology for developing a prospective meta-analysis in the family planning community

    Directory of Open Access Journals (Sweden)

    Jacobson Janet C

    2011-04-01

    Conclusions: PMA is a novel research method that improves meta-analysis by including several study sites, establishing uniform reporting of specific outcomes, and yet allowing some independence on the part of individual sites with respect to the conduct of research. The inclusion of several sites increases statistical power to address important clinical questions. Compared to multi-center trials, PMA methodology encourages collaboration, aids in the development of new investigators, decreases study costs, and decreases time to publication. Trial Registration: ClinicalTrials.gov NCT00613366, NCT00886834, NCT01001897, NCT01147497 and NCT01307111

  10. A methodology for the structural and functional analysis of signaling and regulatory networks

    Directory of Open Access Journals (Sweden)

    Simeoni Luca

    2006-02-01

    Full Text Available Abstract Background Structural analysis of cellular interaction networks contributes to a deeper understanding of network-wide interdependencies, causal relationships, and basic functional capabilities. While the structural analysis of metabolic networks is a well-established field, similar methodologies have been scarcely developed and applied to signaling and regulatory networks. Results We propose formalisms and methods, relying on adapted and partially newly introduced approaches, which facilitate a structural analysis of signaling and regulatory networks with a focus on functional aspects. We use two different formalisms to represent and analyze interaction networks: interaction graphs and (logical) interaction hypergraphs. We show that, in interaction graphs, the determination of feedback cycles and of all the signaling paths between any pair of species is equivalent to the computation of elementary modes known from metabolic networks. Knowledge of the set of signaling paths and feedback loops facilitates the computation of intervention strategies and the classification of compounds into activators, inhibitors, ambivalent factors, and non-affecting factors with respect to a certain species. In some cases, qualitative effects induced by perturbations can be unambiguously predicted from the network scheme. Interaction graphs, however, are not able to capture AND relationships, which frequently occur in interaction networks. The consequent logical concatenation of all the arcs pointing into a species leads to Boolean networks. For a Boolean representation of cellular interaction networks we propose a formalism based on logical (or signed) interaction hypergraphs, which facilitates in particular a logical steady state analysis (LSSA). LSSA enables studies on the logical processing of signals and the identification of optimal intervention points (targets) in cellular networks. LSSA also reveals network regions whose parametrization and initial
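
    To illustrate the interaction-graph part of the formalism (enumerating feedback cycles and classifying them by the product of their edge signs), a toy signed network is analyzed below with networkx; the species names, edges and signs are invented for the example, and the sketch does not cover the hypergraph representation or the logical steady state analysis.

```python
import networkx as nx

# Toy signed interaction graph: +1 = activation, -1 = inhibition.
edges = [
    ("R", "A", +1),   # receptor activates A
    ("A", "B", +1),
    ("B", "C", +1),
    ("C", "A", -1),   # C inhibits A -> the loop A-B-C is negative feedback
    ("B", "B", +1),   # self-activation -> a positive feedback loop
]

g = nx.DiGraph()
for src, dst, sign in edges:
    g.add_edge(src, dst, sign=sign)

# Enumerate feedback cycles and classify each one by the product of its edge signs.
for cycle in nx.simple_cycles(g):
    closed = list(zip(cycle, cycle[1:] + cycle[:1]))
    sign = 1
    for u, v in closed:
        sign *= g[u][v]["sign"]
    kind = "positive" if sign > 0 else "negative"
    print(f"{' -> '.join(cycle)} : {kind} feedback loop")
```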

  11. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 2-Sequoyah Unit 2 Cycle 3

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M.

    1995-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark away-from-reactor criticality analysis methods using critical configurations from commercial pressurized-water reactors. The analysis methodology selected for all the calculations reported herein is based on the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies in the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (keff) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of three reactor critical configurations for Sequoyah Unit 2 Cycle 3. This unit and cycle were chosen because of their relevance to spent fuel benchmark applications: (1) the unit had a significantly long downtime of 2.7 years during the middle of cycle (MOC) 3, and (2) the core consisted entirely of burned fuel at the MOC restart. The first benchmark critical calculation was the MOC restart at hot, full-power (HFP) critical conditions. The

  12. Analyzing planned maintenance (PM) inspection data by failure mode and effect analysis methodology.

    Science.gov (United States)

    Ridgway, Malcolm

    2003-01-01

    There is no question that medical devices are becoming more reliable. However, we have had some difficulty finding a satisfactory method for providing persuasive documentary evidence that this improved reliability will allow us to relax our traditional planned maintenance (PM) practices without compromising patient safety. The acceptance and increasing use of Failure Mode and Effect Analysis (FMEA) by several of the oversight agencies, including the Joint Commission on Accreditation of Healthcare Organizations, provides us with an important opportunity to take another shot at this vexing problem. Using this proven FMEA methodology and some relatively simple rules to quantify the results of the routine PM inspections that all healthcare providers are still performing in considerable abundance, we have developed a method that allows us to reduce the test results to a simple, single measure (the Risk Score) that can be used to characterize the effectiveness and levels of safety of our current PM regimens. When tested on theoretical data and a sample of real PM inspection results, the method provides answers that seem reasonable. Although it will probably require some modification as we begin the standardized data gathering and gain working experience, it is our hope that this new approach will become generally accepted within the industry. This kind of positive response should enable us to persuade the various accrediting and licensing agencies to similarly accept the concept.

  13. Effectiveness and cost of failure mode and effects analysis methodology to reduce neurosurgical site infections.

    Science.gov (United States)

    Hover, Alexander R; Sistrunk, William W; Cavagnol, Robert M; Scarrow, Alan; Finley, Phillip J; Kroencke, Audrey D; Walker, Judith L

    2014-01-01

    Mercy Hospital Springfield is a tertiary care facility with 32 000 discharges and 15 000 inpatient surgeries in 2011. From June 2009 through January 2011, a stable inpatient elective neurosurgery infection rate of 2.15% was observed. The failure mode and effects analysis (FMEA) methodology to reduce inpatient neurosurgery infections was utilized. Following FMEA implementation, overall elective neurosurgery infection rates were reduced to 1.51% and sustained through May 2012. Compared with baseline, the post-FMEA deep-space and organ infection rate was reduced by 41% (P = .052). Overall hospital inpatient clean surgery infection rates for the same time frame did not decrease to the same extent, suggesting a specific effect of the FMEA. The study team believes that the FMEA interventions resulted in 14 fewer expected infections, $270 270 in savings, a 168-day reduction in expected length of stay, and 22 fewer readmissions. Given the serious morbidity and cost of health care-associated infections, the study team concludes that FMEA implementation was clinically cost-effective.

  14. Cointegration methodology for psychological researchers: An introduction to the analysis of dynamic process systems.

    Science.gov (United States)

    Stroe-Kunold, Esther; Gruber, Antje; Stadnytska, Tetiana; Werner, Joachim; Brosig, Burkhard

    2012-11-01

    Longitudinal data analysis focused on internal characteristics of a single time series has attracted increasing interest among psychologists. The systemic psychological perspective suggests, however, that many long-term phenomena are mutually interconnected, forming a dynamic system. Hence, only multivariate methods can handle such human dynamics appropriately. Unlike the majority of time series methodologies, the cointegration approach allows interdependencies of integrated (i.e., extremely unstable) processes to be modelled. This advantage results from the fact that cointegrated series are connected by stationary long-run equilibrium relationships. Vector error-correction models are frequently used representations of cointegrated systems. They capture both this equilibrium and compensation mechanisms in the case of short-term deviations due to developmental changes. Thus, the past disequilibrium serves as an explanatory variable in the dynamic behaviour of current variables. Employing empirical data from cognitive psychology, psychosomatics, and marital interaction research, this paper describes how to apply cointegration methods to dynamic process systems and how to interpret the parameters under investigation from a psychological perspective.
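
    A minimal sketch of the first step such an analysis typically takes, testing whether two integrated series share a stationary long-run (cointegrating) relationship before fitting a vector error-correction model, is shown below using statsmodels; the two series are synthetic and constructed to share a common stochastic trend.

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(7)
n = 400

# Two integrated (random-walk) series sharing a common stochastic trend,
# e.g. two symptom or mood measures tracked daily in a dyad (synthetic data).
common_trend = np.cumsum(rng.normal(size=n))
x = common_trend + rng.normal(scale=0.5, size=n)
y = 0.8 * common_trend + rng.normal(scale=0.5, size=n)

# Engle-Granger test: H0 = no cointegration (no stationary long-run relationship).
t_stat, p_value, _ = coint(x, y)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests a long-run equilibrium tying the series together, which a
# vector error-correction model would then parameterize (equilibrium + adjustment terms).
```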

  15. R&D ENTITIES AND STRATEGIC TRACKING SYSTEMS: AN ANALYSIS OF THE METHODOLOGIES USED

    Directory of Open Access Journals (Sweden)

    Ricardo Lopes Cardoso

    2010-04-01

    Full Text Available This paper's main purposes are to analyze research centers' strategic performance measurement systems, to study which methodologies are most commonly used, and to find out how the measurement systems and strategic goals are integrated. The sample consists of six research centers in Campinas, São Paulo, and the theoretical background used comprises strategic performance measurement systems such as the Tableau de Bord, the Sink and Tuttle model and Kaplan and Norton's Balanced Scorecard. The research results were subjected to qualitative analysis, leading to the following considerations: research centers measure, assess and monitor their results with strategic driver systems mainly through qualitative dimensions such as research, development and innovation; customer satisfaction (users and services); and process quality. Among the systems used, specific ones prevail, and half of them are conceptually linked to the Balanced Scorecard. The final research results demonstrate a great lack of integration between the strategic plans and the performance indicators, as well as a lack of integration between the performance indicators and the organization as a whole.

  16. Life cycle analysis of mitigation methodologies for railway rolling noise and groundbourne vibration.

    Science.gov (United States)

    Tuler, Mariana Valente; Kaewunruen, Sakdirat

    2017-04-15

    Negative outcomes such as noise and vibration generated by railways have become a challenge for both industry and academia, which must guarantee that the railway system can accomplish its purposes while at the same time providing comfort for users and for people living in the neighbourhoods along the railway corridor. Research interest in this field has been increasing, and advances in noise and vibration mitigation methodologies can be observed in the various engineering techniques that are constantly put to the test to address such effects. In contrast, life cycle analysis of the mitigation measures has not been thoroughly carried out. There is also a lack of detailed evaluation of the efficiency of the various mechanisms for controlling rolling noise and ground-borne vibration. This research is thus focused on the evaluation of the materials used, the total cost associated with the maintenance of such measures, and the carbon footprint left by each type of mechanism. The insight into carbon footprint, together with life cycle cost, will benefit the decision-making process for the industry in the selection of an optimal and suitable mechanism, since the environmental impact is a growing concern around the world.

  17. Modeling and Analysis of MRR, EWR and Surface Roughness in EDM Milling through Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    A. K.M.S. Iqbal

    2010-01-01

    Full Text Available Problem statement: Electrical Discharge Machining (EDM) has grown over the last few decades from a novelty to a mainstream manufacturing process. Although the EDM process is very demanding, the mechanism of the process is complex and far from completely understood. It is difficult to establish a model that can accurately predict the performance by correlating the process parameters. Optimum processing parameters are essential to increase the production rate and decrease the machining time, since both the materials processed by EDM and the process itself are very costly. This research establishes empirical relations between the machining parameters and the responses in order to analyze the machinability of stainless steel. Approach: The machining factors used are voltage, rotational speed of the electrode and feed rate, and the responses are MRR, EWR and Ra. Response surface methodology was used to investigate the relationships and parametric interactions between the three controllable variables and the MRR, EWR and Ra. A central composite experimental design was used to estimate the model coefficients of the three factors. The responses were modeled using a response surface model based on the experimental results. The significant coefficients were obtained by performing Analysis of Variance (ANOVA) at a 95% level of significance. Results: The percentage errors of the developed models were found to be within 5%. Conclusion: The developed models show that voltage and the rotary motion of the electrode are the most significant machining parameters influencing MRR, EWR and Ra. These models can be used to obtain the desired responses within the experimental range.
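
    The response-surface fitting step can be sketched as a second-order polynomial regression in the three factors named above; the design points, factor ranges and response values in the snippet are synthetic placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
# Synthetic design points: [voltage (V), electrode speed (rpm), feed rate (mm/min)].
X = rng.uniform([40.0, 100.0, 0.5], [120.0, 600.0, 2.5], size=(20, 3))

# Synthetic response standing in for measured MRR, with noise.
mrr = (0.02 * X[:, 0] + 0.001 * X[:, 1] + 0.5 * X[:, 2]
       + 0.0001 * X[:, 0] * X[:, 1] + rng.normal(scale=0.2, size=20))

# Second-order (quadratic plus interaction) response surface model.
poly = PolynomialFeatures(degree=2, include_bias=True)
model = LinearRegression(fit_intercept=False).fit(poly.fit_transform(X), mrr)

for name, coef in zip(poly.get_feature_names_out(["V", "N", "F"]), model.coef_):
    print(f"{name:>8s}: {coef: .4g}")
```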

  18. Summer 2012 Testing and Analysis of the Chemical Mixture Methodology -- Part I

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, Clifford S.; Yu, Xiao-Ying; Coggin, Rebekah L.; Ponder, Lashaundra A.; Booth, Alexander E.; Petrocchi, Achille J.; Horn, Sarah M.; Yao, Juan

    2012-07-01

    This report presents the key findings made by the Chemical Mixture Methodology (CMM) project team during the first stage of their summer 2012 testing and analysis of the CMM. The study focused on answering the following questions: o What is the percentage of the chemicals in the CMM Rev 27 database associated with each Health Code Number (HCN)? How does this result influence the relative importance of acute HCNs and chronic HCNs in the CMM data set? o What is the benefit of using the HCN-based approach? Which Modes of Action and Target Organ Effects tend to be important in determining the HCN-based Hazard Index (HI) for a chemical mixture? o What are some of the potential issues associated with the current HCN-based approach? What are the opportunities for improving the performance and/or technical defensibility of the HCN-based approach? How would those improvements increase the benefit of using the HCN-based approach? o What is the Target Organ System Effect approach and how can it be used to improve upon the current HCN-based approach? How do the benefits users would derive from using the Target Organ System Effect approach compare to those available from the current HCN-based approach?
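
    As a purely illustrative sketch of the kind of aggregation discussed, the snippet below sums concentration-to-limit ratios per Health Code Number to obtain an HCN-based hazard index for a mixture. The chemical names, limits, and HCN assignments are invented placeholders and do not reproduce the actual CMM Rev 27 algorithm or database.

```python
from collections import defaultdict

# Hypothetical mixture: (chemical, airborne concentration, exposure limit, HCNs)
mixture = [
    ("chemical_A", 2.0, 10.0, ["acute_respiratory", "chronic_liver"]),
    ("chemical_B", 0.5, 1.0, ["acute_respiratory"]),
    ("chemical_C", 4.0, 50.0, ["chronic_liver"]),
]

def hcn_hazard_indices(mixture):
    """Sum concentration/limit ratios for each Health Code Number (HCN)."""
    hi = defaultdict(float)
    for _, conc, limit, hcns in mixture:
        for hcn in hcns:
            hi[hcn] += conc / limit
    return dict(hi)

indices = hcn_hazard_indices(mixture)
for hcn, value in sorted(indices.items(), key=lambda kv: -kv[1]):
    flag = "exceeds 1.0" if value > 1.0 else "below 1.0"
    print(f"{hcn}: HI = {value:.2f} ({flag})")
```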

  19. Error analysis of overlay compensation methodologies and proposed functional tolerances for EUV photomask flatness

    Science.gov (United States)

    Ballman, Katherine; Lee, Christopher; Dunn, Thomas; Bean, Alexander

    2016-05-01

    Due to the impact on image placement and overlay errors inherent in all reflective lithography systems, EUV reticles will need to adhere to flatness specifications below 10nm for 2018 production. These single value metrics are near impossible to meet using current tooling infrastructure (current state of the art reticles report P-V flatness ~60nm). In order to focus innovation on areas which lack capability for flatness compensation or correction, this paper redefines flatness metrics as being "correctable" vs. "non-correctable" based on the surface topography's contributions to the final IP budget at wafer, as well as whether data driven corrections (write compensation or at scanner) are available for the reticle's specific shape. To better understand and define the limitations of write compensation and scanner corrections, an error budget for processes contributing to these two methods is presented. Photomask flatness measurement tools are now targeting 6σ reproducibility <1nm (previous 3σ reproducibility ~3nm) in order to drive down error contributions and provide more accurate data for correction techniques. Taking advantage of the high order measurement capabilities of improved metrology tooling, as well as computational capabilities which enable fast measurements and analysis of sophisticated shapes, we propose a methodology for the industry to create functional tolerances focused on the flatness errors that are not correctable with compensation.

  20. Survival analysis of colorectal cancer patients with tumor recurrence using global score test methodology

    Energy Technology Data Exchange (ETDEWEB)

    Zain, Zakiyah, E-mail: zac@uum.edu.my; Ahmad, Yuhaniz, E-mail: yuhaniz@uum.edu.my [School of Quantitative Sciences, Universiti Utara Malaysia, UUM Sintok 06010, Kedah (Malaysia); Azwan, Zairul, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Raduan, Farhana, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com; Sagap, Ismail, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com [Surgery Department, Universiti Kebangsaan Malaysia Medical Centre, Jalan Yaacob Latif, 56000 Bandar Tun Razak, Kuala Lumpur (Malaysia); Aziz, Nazrina, E-mail: nazrina@uum.edu.my

    2014-12-04

    Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, the sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated with the overall survival. In this study, global score test methodology is used to combine the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data on tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnicity, gender, age and stage at diagnosis are also reported.

  1. A methodological framework for hydromorphological assessment, analysis and monitoring (IDRAIM) aimed at promoting integrated river management

    Science.gov (United States)

    Rinaldi, M.; Surian, N.; Comiti, F.; Bussettini, M.

    2015-12-01

    A methodological framework for hydromorphological assessment, analysis and monitoring (named IDRAIM) has been developed with the specific aim of supporting the management of river processes by integrating the objectives of ecological quality and flood risk mitigation. The framework builds on existing and up-to-date geomorphological concepts and approaches and has been tested on several Italian streams. The framework includes the following four phases: (1) catchment-wide characterization of the fluvial system; (2) evolutionary trajectory reconstruction and assessment of current river conditions; (3) description of future trends of channel evolution; and (4) identification of management options. The framework provides specific consideration of the temporal context, in terms of reconstructing the trajectory of past channel evolution as a basis for interpreting present river conditions and future trends. A series of specific tools has been developed for the assessment of river conditions, in terms of morphological quality and channel dynamics. These include: the Morphological Quality Index (MQI), the Morphological Dynamics Index (MDI), the Event Dynamics Classification (EDC), and the river morphodynamic corridors (MC and EMC). The monitoring of morphological parameters and indicators, alongside the assessment of future scenarios of channel evolution provides knowledge for the identification, planning and prioritization of actions for enhancing morphological quality and risk mitigation.

  2. Quantitative Analysis of Mutant Subclones in Chronic Myeloid Leukemia: Comparison of Different Methodological Approaches.

    Science.gov (United States)

    Preuner, Sandra; Barna, Agnes; Frommlet, Florian; Czurda, Stefan; Konstantin, Byrgazov; Alikian, Mary; Machova Polakova, Katerina; Sacha, Tomasz; Richter, Johan; Lion, Thomas; Gabriel, Christian

    2016-04-29

    Identification and quantitative monitoring of mutant BCR-ABL1 subclones displaying resistance to tyrosine kinase inhibitors (TKIs) have become important tasks in patients with Ph-positive leukemias. Different technologies have been established for patient screening. Various next-generation sequencing (NGS) platforms facilitating sensitive detection and quantitative monitoring of mutations in the ABL1-kinase domain (KD) have been introduced recently, and are expected to become the preferred technology in the future. However, broad clinical implementation of NGS methods has been hampered by the limited accessibility at different centers and the current costs of analysis which may not be regarded as readily affordable for routine diagnostic monitoring. It is therefore of interest to determine whether NGS platforms can be adequately substituted by other methodological approaches. We have tested three different techniques including pyrosequencing, LD (ligation-dependent)-PCR and NGS in a series of peripheral blood specimens from chronic myeloid leukemia (CML) patients carrying single or multiple mutations in the BCR-ABL1 KD. The proliferation kinetics of mutant subclones in serial specimens obtained during the course of TKI-treatment revealed similar profiles via all technical approaches, but individual specimens showed statistically significant differences between NGS and the other methods tested. The observations indicate that different approaches to detection and quantification of mutant subclones may be applicable for the monitoring of clonal kinetics, but careful calibration of each method is required for accurate size assessment of mutant subclones at individual time points.

  3. An Isogeometric Design-through-analysis Methodology based on Adaptive Hierarchical Refinement of NURBS, Immersed Boundary Methods, and T-spline CAD Surfaces

    Science.gov (United States)

    2012-01-22

    ICES REPORT 12-05, January 2012: An Isogeometric Design-through-analysis Methodology based on Adaptive Hierarchical Refinement of NURBS, Immersed Boundary Methods, and T-spline CAD Surfaces (authors listed in the report form include M.J. Borden, E. Rank, and T.J.R. Hughes).

  4. An analysis of complex multiple-choice science-technology-society items: Methodological development and preliminary results

    Science.gov (United States)

    Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; Acevedo-Díaz, José-Antonio

    2006-07-01

    The scarce attention to the assessment and evaluation in science education research has been especially harmful for teaching science-technology-society (STS) issues, due to the dialectical, tentative, value-laden, and polemic nature of most STS topics. This paper tackles the methodological difficulties of the instruments that monitor views related to STS topics and rationalizes a quantitative methodology and an analysis technique to improve the utility of an empirically developed multiple-choice item pool, the Questionnaire of Opinions on STS. This methodology embraces an item-scaling psychometrics based on the judgments by a panel of experts, a multiple response model, a scoring system, and the data analysis. The methodology finally produces normalized attitudinal indices that represent the respondent's reasoned beliefs toward STS statements, the respondent's position on an item that comprises several statements, or the respondent's position on an entire STS topic that encompasses a set of items. Some preliminary results show the methodology's ability to evaluate the STS attitudes in a qualitative and quantitative way and for statistical hypothesis testing. Lastly, some applications for teacher training and STS curriculum development in science classrooms are discussed.

  5. Measurement and analysis of grain boundary grooving by volume diffusion

    Science.gov (United States)

    Hardy, S. C.; Mcfadden, G. B.; Coriell, S. R.; Voorhees, P. W.; Sekerka, R. F.

    1991-01-01

    Experimental measurements of isothermal grain boundary grooving by volume diffusion are carried out for Sn bicrystals in the Sn-Pb system near the eutectic temperature. The dimensions of the groove increase with a temporal exponent of 1/3, and measurement of the associated rate constant allows the determination of the product of the liquid diffusion coefficient D and the capillarity length Gamma associated with the interfacial free energy of the crystal-melt interface. The small-slope theory of Mullins is generalized to the entire range of dihedral angles by using a boundary integral formulation of the associated free boundary problem, and excellent agreement with experimental groove shapes is obtained. By using the diffusivity measured by Jordon and Hunt, the present measured values of Gamma are found to agree to within 5 percent with the values obtained from experiments by Gunduz and Hunt on grain boundary grooving in a temperature gradient.
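
    For orientation, the scaling described in the abstract can be written compactly as below; the prefactor k gathers material constants and dihedral-angle factors and is the rate constant the experiment measures. This is a schematic statement of the t^(1/3) law only, not the paper's boundary-integral formulation.

```latex
% Isothermal grain boundary grooving controlled by volume diffusion:
% groove dimensions grow with the cube root of time.
\[
  d(t) \;=\; k\, t^{1/3},
  \qquad
  k^{3} \;\propto\; D\,\Gamma ,
\]
% where d is a characteristic groove dimension (depth or width), D is the
% liquid diffusion coefficient, and \Gamma is the capillarity length of the
% crystal-melt interface, so measuring k yields the product D\Gamma.
```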

  6. Dose volume analysis in brachytherapy and stereotactic radiosurgery

    CERN Document Server

    Tozer-Loft, S M

    2000-01-01

    compared with a range of figures of merit which express different aspects of the quality of each dose distributions. The results are analysed in an attempt to answer the question: What are the important features of the dose distribution (conformality, uniformity, etc) which show a definite relationship with the outcome of the treatment? Initial results show positively that, when Gamma Knife radiosurgery is used to treat acoustic neuroma, some measures of conformality seem to have a surprising, but significant association with outcome. A brief introduction to three branches of radiotherapy is given: interstitial brachytherapy, external beam megavoltage radiotherapy, and stereotactic radiosurgery. The current interest in issues around conformity, uniformity and optimisation is explained in the light of technical developments in these fields. A novel method of displaying dose-volume information, which mathematically suppresses the inverse-square law, as first suggested by L.L. Anderson for use in brachytherapy i...

  7. Aerodynamic analysis of flapping foils using volume grid deformation code

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Jin Hwan [Seoul National University, Seoul (Korea, Republic of); Kim, Jee Woong; Park, Soo Hyung; Byun, Do Young [Konkuk University, Seoul (Korea, Republic of)

    2009-06-15

    Nature-inspired flapping foils have attracted interest for their high thrust efficiency, but the large motions of their boundaries need to be considered. It is challenging to develop robust, efficient grid deformation algorithms appropriate for such large motions in three dimensions. In this paper, a volume grid deformation code is developed based on finite macro-elements and transfinite interpolation, which successfully interfaces to a structured multi-block Navier-Stokes code. A suitable condition for generating the macro-elements efficiently and improving the robustness of grid regularity is presented as well. As demonstrated for an airfoil undergoing various flapping-related motions, the aerodynamic forces computed with the developed method are shown to be in good agreement with experimental data or a previous numerical solution.
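
    As a minimal illustration of the transfinite-interpolation idea the method builds on, the sketch below blends prescribed boundary displacements of a single 2D block into its interior. It is a generic one-block TFI under assumed boundary motions, not the authors' macro-element, multi-block implementation.

```python
import numpy as np

def tfi_displacement(d_left, d_right, d_bottom, d_top):
    """Blend boundary displacements into a block interior with
    2D transfinite interpolation (Coons patch blending)."""
    ni, nj = len(d_bottom), len(d_left)
    xi = np.linspace(0.0, 1.0, ni)[:, None]    # shape (ni, 1)
    eta = np.linspace(0.0, 1.0, nj)[None, :]   # shape (1, nj)

    d = ((1 - xi) * d_left[None, :] + xi * d_right[None, :]
         + (1 - eta) * d_bottom[:, None] + eta * d_top[:, None]
         - (1 - xi) * (1 - eta) * d_left[0]    # subtract doubly counted corners
         - (1 - xi) * eta * d_left[-1]
         - xi * (1 - eta) * d_right[0]
         - xi * eta * d_right[-1])
    return d  # interior displacement field, shape (ni, nj)

# Hypothetical flapping-type boundary motion: bottom edge heaves, others fixed.
ni, nj = 21, 11
d_bottom = 0.2 * np.sin(np.pi * np.linspace(0, 1, ni))
d_top = np.zeros(ni)
d_left = np.zeros(nj)
d_right = np.zeros(nj)
disp = tfi_displacement(d_left, d_right, d_bottom, d_top)
print(disp.shape, disp.max())
```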

  8. Scram discharge volume break studies accident sequence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harrington, R.M.; Hodge, S.A.

    1982-01-01

    This paper is a summary of a report describing the predicted response of Unit 1 at the Tennessee Valley Authority (TVA) Browns Ferry Nuclear Plant to a hypothetical small break loss of coolant accident (SBLOCA) outside of containment. The accident studied would be initiated by a break in the scram discharge volume (SDV) piping when it is pressurized to full reactor vessel pressure as a normal consequence of a reactor scram. If the scram could be reset, the scram outlet valves would close to isolate the SDV and the piping break from the reactor vessel. However, reset is possible only if the conditions that caused the scram have cleared; it has been assumed in this study that the scram signal remains in effect over a long period of time.

  9. Methodological Synthesis in Quantitative L2 Research: A Review of Reviews and a Case Study of Exploratory Factor Analysis

    Science.gov (United States)

    Plonsky, Luke; Gonulal, Talip

    2015-01-01

    Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…

  10. A Methodological Reflection on the Process of Narrative Analysis: Alienation and Identity in the Life Histories of English Language Teachers

    Science.gov (United States)

    Menard-Warwick, Julia

    2011-01-01

    This article uses data from life-history interviews with English language teachers in Chile and California to illustrate methodological processes in teacher identity research through narrative analysis. To this end, the author describes the steps she took in identifying an issue to be examined, selecting particular narratives as representative of…

  11. The decade 1989-1998 in Spanish psychology: an analysis of research in statistics, methodology, and psychometric theory.

    Science.gov (United States)

    García-Pérez, M A

    2001-11-01

    This paper presents an analysis of research published in the decade 1989-1998 by Spanish faculty members in the areas of statistical methods, research methodology, and psychometric theory. Database search and direct correspondence with faculty members in Departments of Methodology across Spain rendered a list of 193 papers published in these broad areas by 82 faculty members. These and other faculty members had actually published 931 papers over the decade of analysis, but 738 of them addressed topics not appropriate for description in this report. Classification and analysis of these 193 papers revealed topics that have attracted the most interest (psychophysics, item response theory, analysis of variance, sequential analysis, and meta-analysis) as well as other topics that have received less attention (scaling, factor analysis, time series, and structural models). A significant number of papers also dealt with various methodological issues (software, algorithms, instrumentation, and techniques). A substantial part of this report is devoted to describing the issues addressed across these 193 papers--most of which are written in the Spanish language and published in Spanish journals--and some representative references are given.

  12. Problems of method of technology assessment. A methodological analysis; Methodenprobleme des Technology Assessment; Eine methodologische Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, V.

    1993-03-01

    The study undertakes to analyse the theoretical and methodological structure of Technology Assessment (TA). It is based on a survey of TA studies, which provided an important precondition for theoretically sound statements on methodological aspects of TA. It was established that the main basic theoretical problems of TA lie in the field of dealing with complexity. This is also apparent in the constitution of problems, the most elementary and central approach of TA. Scientifically founded constitution of problems and the corresponding construction of models call for interdisciplinary scientific work. Interdisciplinarity in the TA research process is achieved at the level of virtual networks, these networks being composed of individuals suited to teamwork. The emerging network structures have an objective-organizational and an ideational basis. The objective-organizational basis is mainly the result of team composition and the external affiliations of the team members. The ideational basis of the virtual network is represented by the team members' mode of thinking, which is individually located at a multidisciplinary level. The theoretical 'skeleton' of the TA knowledge system, which is represented by linkage structures based on process knowledge, can be generated and also processed in connection with the knowledge on types of problems, areas of analysis and procedures to deal with complexity. Within this process, disciplinary knowledge is a necessary but not a sufficient condition. Metatheoretical and metadisciplinary knowledge, and the correspondingly processed complexity of models, are the basis for the necessary methodological awareness that allows TA to become designable as a research procedure. (orig./HP)

  13. Comparison of nested case-control and survival analysis methodologies for analysis of time-dependent exposure

    Directory of Open Access Journals (Sweden)

    Platt Robert W

    2005-01-01

    Full Text Available Abstract Background Epidemiological studies of exposures that vary with time require an additional level of methodological complexity to account for the time-dependence of exposure. This study compares a nested case-control approach for the study of time-dependent exposure with cohort analysis using Cox regression including time-dependent covariates. Methods A cohort of 1340 subjects with four fixed and seven time-dependent covariates was used for this study. Nested case-control analyses were repeated 100 times for each of 4, 8, 16, 32, and 64 controls per case, and point estimates were compared to those obtained using Cox regression on the full cohort. Computational efficiencies were evaluated by comparing central processing unit times required for analysis of the cohort at sizes 1, 2, 4, 8, 16, and 32 times its initial size. Results Nested case-control analyses yielded results that were similar to results of Cox regression on the full cohort. Cox regression was found to be 125 times slower than the nested case-control approach (using four controls per case). Conclusions The nested case-control approach is a useful alternative for cohort analysis when studying time-dependent exposures. Its superior computational efficiency may be particularly useful when studying rare outcomes in databases, where the ability to analyze larger sample sizes can improve the power of the study.
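
    The core of the nested case-control design described here is risk-set sampling: for each case, controls are drawn at random from the subjects still at risk at the case's event time. The sketch below illustrates that step on an invented toy cohort; the conditional logistic analysis that would follow is omitted.

```python
import random

# Toy cohort: (subject_id, follow_up_time, had_event)
cohort = [
    (1, 2.0, True), (2, 5.0, False), (3, 3.5, True), (4, 6.0, False),
    (5, 1.0, False), (6, 4.0, True), (7, 7.0, False), (8, 2.5, False),
]

def nested_case_control(cohort, controls_per_case, seed=0):
    """Risk-set sampling: match each case to controls still at risk
    (follow-up time at least as long as the case's event time)."""
    rng = random.Random(seed)
    sampled_sets = []
    for case_id, event_time, is_case in cohort:
        if not is_case:
            continue
        risk_set = [sid for sid, t, _ in cohort
                    if sid != case_id and t >= event_time]
        controls = rng.sample(risk_set, min(controls_per_case, len(risk_set)))
        sampled_sets.append({"case": case_id, "time": event_time,
                             "controls": controls})
    return sampled_sets

for matched_set in nested_case_control(cohort, controls_per_case=4):
    print(matched_set)
```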

  14. Methodological Framework for Analysis of Buildings-Related Programs: The GPRA Metrics Effort

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, Douglas B.; Anderson, Dave M.; Belzer, David B.; Cort, Katherine A.; Dirks, James A.; Hostick, Donna J.

    2004-06-18

    The requirements of the Government Performance and Results Act (GPRA) of 1993 mandate the reporting of outcomes expected to result from programs of the Federal government. The U.S. Department of Energy’s (DOE’s) Office of Energy Efficiency and Renewable Energy (EERE) develops official metrics for its 11 major programs using its Office of Planning, Budget Formulation, and Analysis (OPBFA). OPBFA conducts an annual integrated modeling analysis to produce estimates of the energy, environmental, and financial benefits expected from EERE’s budget request. Two of EERE’s major programs include the Building Technologies Program (BT) and Office of Weatherization and Intergovernmental Program (WIP). Pacific Northwest National Laboratory (PNNL) supports the OPBFA effort by developing the program characterizations and other market information affecting these programs that is necessary to provide input to the EERE integrated modeling analysis. Throughout the report we refer to these programs as “buildings-related” programs, because the approach is not limited in application to BT or WIP. To adequately support OPBFA in the development of official GPRA metrics, PNNL communicates with the various activities and projects in BT and WIP to determine how best to characterize their activities planned for the upcoming budget request. PNNL then analyzes these projects to determine what the results of the characterizations would imply for energy markets, technology markets, and consumer behavior. This is accomplished by developing nonintegrated estimates of energy, environmental, and financial benefits (i.e., outcomes) of the technologies and practices expected to result from the budget request. These characterizations and nonintegrated modeling results are provided to OPBFA as inputs to the official benefits estimates developed for the Federal Budget. This report documents the approach and methodology used to estimate future energy, environmental, and financial benefits

  15. Thermal characterization and analysis of microliter liquid volumes using the three-omega method.

    Science.gov (United States)

    Roy-Panzer, Shilpi; Kodama, Takashi; Lingamneni, Srilakshmi; Panzer, Matthew A; Asheghi, Mehdi; Goodson, Kenneth E

    2015-02-01

    Thermal phenomena in many biological systems offer an alternative detection opportunity for quantifying relevant sample properties. While there is substantial prior work on thermal characterization methods for fluids, the push in the biology and biomedical research communities towards analysis of reduced sample volumes drives a need to extend and scale these techniques to these volumes of interest, which can be below 100 pl. This work applies the 3ω technique to measure the temperature-dependent thermal conductivity and heat capacity of de-ionized water, silicone oil, and salt buffer solution droplets from 24 to 80 °C. Heater geometries range in length from 200 to 700 μm and in width from 2 to 5 μm to accommodate the size restrictions imposed by small volume droplets. We use these devices to measure droplet volumes of 2 μl and demonstrate the potential to extend this technique down to pl droplet volumes based on an analysis of the thermally probed volume. Sensitivity and uncertainty analyses provide guidance for relevant design variables for characterizing properties of interest by investigating the tradeoffs between measurement frequency regime, device geometry, and substrate material. Experimental results show that we can extract thermal conductivity and heat capacity with these sample volumes to within less than 1% of thermal properties reported in the literature.
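
    For reference, a commonly quoted reduced form of the 3ω relation (line heater of length l on a semi-infinite medium, low-frequency slope regime) links the thermal conductivity to the slope of the temperature-oscillation amplitude versus the logarithm of the heating frequency. The droplet-scale geometries analyzed in the paper may require different expressions, so this is only the textbook limit.

```latex
% Slope form of the 3-omega method (semi-infinite substrate, line heater):
% the in-phase temperature oscillation falls linearly with ln(omega),
% and the slope gives the thermal conductivity kappa.
\[
  \kappa \;\approx\; -\,\frac{P}{2\pi l}\,
  \left(\frac{\mathrm{d}\,\Delta T}{\mathrm{d}\,\ln\omega}\right)^{-1},
\]
% where P is the heater power, l the heater length, \Delta T the amplitude
% of the temperature oscillation at frequency 2\omega, and \omega the
% angular frequency of the heating current.
```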

  16. Do skeletal cephalometric characteristics correlate with condylar volume, surface and shape? A 3D analysis

    Directory of Open Access Journals (Sweden)

    Saccucci Matteo

    2012-05-01

    Full Text Available Abstract Objective The purpose of this study was to determine the condylar volume in subjects with different mandibular divergence and skeletal class using cone-beam computed tomography (CBCT) and analysis software. Materials and methods For 94 patients (46 females and 48 males; mean age 24.3 ± 6.5 years), rendering reconstructions of the left and right temporomandibular joints (TMJs) were obtained. Subjects were then classified into three classes (I, II, III) on the basis of the ANB angle and the GoGn-SN angle, and the data of the different classes were compared. Results No significant difference was observed in the whole sample between the right and the left sides in condylar volume. The analysis of mean volume among low, normal and high mandibular plane angles revealed a significantly higher volume and surface in low angle subjects (p  Class III subjects also tended to show a higher condylar volume and surface than class I and class II subjects, although the difference was not significant. Conclusions Higher condylar volume was a common characteristic of low angle subjects compared to normal and high mandibular plane angle subjects. Skeletal class also appears to be associated with condylar volume and surface.

  17. Coal gasification systems engineering and analysis, volume 2

    Science.gov (United States)

    1980-01-01

    The major design related features of each generic plant system were characterized in a catalog. Based on the catalog and requirements data, approximately 17 designs and cost estimates were developed for MBG and alternate products. A series of generic trade studies was conducted to support all of the design studies. A set of cost and programmatic analyses were conducted to supplement the designs. The cost methodology employed for the design and sensitivity studies was documented and implemented in a computer program. Plant design and construction schedules were developed for the K-T, Texaco, and B&W MBG plant designs. A generic work breakdown structure was prepared, based on the K-T design, to coincide with TVA's planned management approach. An extensive set of cost sensitivity analyses was completed for K-T, Texaco, and B&W design. Product price competitiveness was evaluated for MBG and the alternate products. A draft management policy and procedures manual was evaluated. A supporting technology development plan was developed to address high technology risk issues. The issues were identified and ranked in terms of importance and tractability, and a plan developed for obtaining data or developing technology required to mitigate the risk.

  18. Proposition of a modeling and an analysis methodology of integrated reverse logistics chain in the direct chain

    Directory of Open Access Journals (Sweden)

    Faycal Mimouni

    2016-04-01

    Full Text Available Purpose: To propose a modeling and analysis methodology, based on the combination of Bayesian networks and Petri nets, for reverse logistics integrated into the direct supply chain. Design/methodology/approach: Network modeling by combining Petri nets and Bayesian networks. Findings: A Bayesian network model complemented with a Petri net to break the cycle problem in the Bayesian network. Research limitations/implications: Demands are independent of returns. Practical implications: The model can only be used for nonperishable products. Social implications: Legislative aspects: recycling laws; protection of the environment; client satisfaction via after-sales service. Originality/value: A Bayesian network with a cycle, combined with a Petri net.

  19. Left ventricular pressure and volume data acquisition and analysis using LabVIEW.

    Science.gov (United States)

    Cassidy, S C; Teitel, D F

    1997-03-01

    To automate analysis of left ventricular pressure-volume data, we used LabVIEW to create applications that digitize and display data recorded from conductance and manometric catheters. Applications separate data into cardiac cycles, calculate parallel conductance, and calculate indices of left ventricular function, including end-systolic elastance, preload-recruitable stroke work, stroke volume, ejection fraction, stroke work, maximum and minimum derivative of ventricular pressure, heart rate, indices of relaxation, peak filling rate, and ventricular chamber stiffness. Pressure-volume loops can be graphically displayed. These analyses are exported to a text-file. These applications have simplified and automated the process of evaluating ventricular function.
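
    As a language-neutral illustration of a few of the indices listed (the original applications are LabVIEW VIs), the sketch below computes stroke volume, ejection fraction, stroke work, and the dP/dt extrema from one sampled pressure-volume cycle; the waveforms are synthetic placeholders.

```python
import numpy as np

# Synthetic single cardiac cycle sampled over one beat (0.8 s).
# An elliptical pressure-volume loop stands in for measured data.
n = 200
t = np.linspace(0.0, 0.8, n)
theta = 2 * np.pi * t / 0.8
volume = 57.5 + 22.5 * np.cos(theta)        # ml, swings between 35 and 80
pressure = 64.0 + 56.0 * np.sin(theta)      # mmHg, swings between 8 and 120

edv, esv = volume.max(), volume.min()       # end-diastolic / end-systolic volume
stroke_volume = edv - esv
ejection_fraction = stroke_volume / edv

# Stroke work: area enclosed by the P-V loop (trapezoidal line integral of P dV).
stroke_work = abs(np.sum(0.5 * (pressure[1:] + pressure[:-1]) * np.diff(volume)))

dpdt = np.gradient(pressure, t)             # derivative of ventricular pressure
dpdt_max, dpdt_min = dpdt.max(), dpdt.min()

print(f"SV = {stroke_volume:.1f} ml, EF = {ejection_fraction:.2f}")
print(f"SW = {stroke_work:.0f} mmHg*ml, dP/dt max/min = {dpdt_max:.0f}/{dpdt_min:.0f} mmHg/s")
```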

  20. Underground Test Area Subproject Phase I Data Analysis Task. Volume VI - Groundwater Flow Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-11-01

    Volume VI of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the groundwater flow model data. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  1. Underground Test Area Subproject Phase I Data Analysis Task. Volume VII - Tritium Transport Model Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the tritium transport model documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  2. Underground Test Area Subproject Phase I Data Analysis Task. Volume IV - Hydrologic Parameter Data Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-09-01

    Volume IV of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the hydrologic parameter data. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  3. Underground Test Area Subproject Phase I Data Analysis Task. Volume VIII - Risk Assessment Documentation Package

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-12-01

    Volume VIII of the documentation for the Phase I Data Analysis Task performed in support of the current Regional Flow Model, Transport Model, and Risk Assessment for the Nevada Test Site Underground Test Area Subproject contains the risk assessment documentation. Because of the size and complexity of the model area, a considerable quantity of data was collected and analyzed in support of the modeling efforts. The data analysis task was consequently broken into eight subtasks, and descriptions of each subtask's activities are contained in one of the eight volumes that comprise the Phase I Data Analysis Documentation.

  4. Photovoltaic venture analysis. Final report. Volume III. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Costello, D.; Posner, D.; Schiffel, D.; Doane, J.; Bishop, C.

    1978-07-01

    This appendix contains a brief summary of a detailed description of alternative future energy scenarios which provide an overall backdrop for the photovoltaic venture analysis. Also included is a summary of a photovoltaic market/demand workshop, a summary of a photovoltaic supply workshop which used cross-impact analysis, and a report on photovoltaic array and system prices in 1982 and 1986. The results of a sectorial demand analysis for photovoltaic power systems used in the residential sector (single family homes), the service, commercial, and institutional sector (schools), and in the central power sector are presented. An analysis of photovoltaics in the electric utility market is given, and a report on the industrialization of photovoltaic systems is included. A DOE information memorandum regarding ''A Strategy for a Multi-Year Procurement Initiative on Photovoltaics (ACTS No. ET-002)'' is also included. (WHK)

  5. A methodology for stochastic analysis of share prices as Markov chains with finite states.

    Science.gov (United States)

    Mettle, Felix Okoe; Quaye, Enoch Nii Boi; Laryea, Ravenhill Adjetey

    2014-01-01

    Price volatility makes stock investments risky, leaving investors in a critical position when decisions are made under uncertainty. To improve investors' confidence in evaluating exchange markets, without using time series methodology, we specify equity price changes as a stochastic process assumed to possess Markov dependency, with state transition probability matrices defined over the identified state space (i.e. decrease, stable or increase). We established that the identified states communicate, and that the chains are aperiodic and ergodic and thus possess limiting distributions. We developed a methodology for determining the expected mean return time for stock price increases, and also established criteria for improving investment decisions based on the highest transition probabilities, the lowest mean return times and the highest limiting distributions. We further developed an R algorithm for running the methodology introduced. The established methodology is applied to selected equities from the Ghana Stock Exchange's weekly trading data.
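
    Although the authors implement their procedure in R, the sketch below reproduces the same chain-level quantities in Python: a transition matrix estimated from a categorized price-change series, its limiting (stationary) distribution, and the mean return time of each state (the reciprocal of its stationary probability for an ergodic chain). The toy return series is invented for illustration.

```python
import numpy as np

STATES = ["decrease", "stable", "increase"]

def classify(returns, band=0.002):
    """Map weekly returns to states: below -band, within band, above band."""
    return [0 if r < -band else (2 if r > band else 1) for r in returns]

def transition_matrix(states, k=3):
    counts = np.zeros((k, k))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def stationary_distribution(P):
    """Left eigenvector of P for eigenvalue 1, normalized to sum to one."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

# Invented weekly returns standing in for real exchange data.
rng = np.random.default_rng(1)
returns = rng.normal(0.001, 0.01, size=300)

P = transition_matrix(classify(returns))
pi = stationary_distribution(P)
mean_return_time = 1.0 / pi          # expected weeks until a state recurs

for s, p, m in zip(STATES, pi, mean_return_time):
    print(f"{s:9s}  limiting prob = {p:.3f}  mean return time = {m:.1f} weeks")
```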

  6. Ceramic component development analysis -- Volume 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Boss, D.E.

    1998-06-09

    The development of advanced filtration media for advanced fossil-fueled power generating systems is a critical step in meeting the performance and emissions requirements for these systems. While porous metal and ceramic candle-filters have been available for some time, the next generation of filters will include ceramic-matrix composites (CMCs) (Techniweave/Westinghouse, Babcock and Wilcox (B and W), DuPont Lanxide Composites), intermetallic alloys (Pall Corporation), and alternate filter geometries (CeraMem Separations). The goal of this effort was to perform a cursory review of the manufacturing processes used by 5 companies developing advanced filters, from the perspective of process repeatability and the ability of their processes to be scaled up to production volumes. Given the brief nature of the on-site reviews, only an overview of the processes and systems could be obtained. Each of the 5 companies had developed some level of manufacturing and quality assurance documentation, with most of the companies leveraging the procedures from other products they manufacture. It was found that all of the filter manufacturers had a solid understanding of the product development path. Given that these filters are largely developmental, significant additional work is necessary to understand the process-performance relationships and to project manufacturing costs.

  7. Efficient Substrate Noise Coupling Verification and Failure Analysis Methodology for Smart Power ICs in Automotive Applications

    OpenAIRE

    2016-01-01

    International audience; This paper presents a methodology to analyze substrate noise coupling and reduce its effects in smart power integrated circuits. The methodology considers the propagation of minority carriers in the substrate; hence, it models the lateral bipolar junction transistors that are layout dependent and are not modeled in conventional substrate extraction tools. It allows the designer to simulate substrate currents and check their effects on circuit functionality.

  8. Geo-ethical dimension of community's safety: rural and urban population vulnerability analysis methodology

    Science.gov (United States)

    Kostyuchenko, Yuriy; Movchan, Dmytro; Kopachevsky, Ivan; Yuschenko, Maxim

    2016-04-01

    The modern world is built on relations more than on causalities, so communicative, socio-economic, and socio-cultural issues are important for understanding the nature of risks and for making correct, ethical decisions. Most risk analysts now recognize the changed nature of modern risks: we face coherent or systemic risks whose realization leads to domino effects and unexpected growth of losses and fatalities. This type of risk originates in the complicated nature of a heterogeneous environment, the close interconnection of engineering networks, and the changing structure of society. A heterogeneous multi-agent environment generates systemic risks, whose analysis requires multi-source data and sophisticated tools. A formal basis for the analysis of this type of risk has been developed during the last 5-7 years, but issues of social fairness, ethics, and education require further development. One aspect of the analysis of social issues in risk management is studied in this paper, and a formal algorithm for quantitative analysis of multi-source data is proposed. As demonstrated, using the proposed methodological basis and algorithm it is possible to obtain a regularized spatial-temporal distribution of the investigated parameters over the whole observation period with rectified reliability and controlled uncertainty. The analysis of disaster data demonstrates that about half of direct disaster damage may be caused by social factors: education, experience and social behaviour. Using the data presented, it is also possible to estimate quantitative parameters of the loss distributions: the relation between education, age, experience and losses, as well as vulnerability (in terms of probable damage) with respect to financial status at the current social density. It is demonstrated that, over a wide range of scales, education determines risk perception and thus the vulnerability of societies, but at the local level there are important heterogeneities; land-use and urbanization structure influence vulnerability essentially. The way to

  9. Simple, fast, and accurate methodology for quantitative analysis using Fourier transform infrared spectroscopy, with bio-hybrid fuel cell examples.

    Science.gov (United States)

    Mackie, David M; Jahnke, Justin P; Benyamin, Marcus S; Sumner, James J

    2016-01-01

    The standard methodologies for quantitative analysis (QA) of mixtures using Fourier transform infrared (FTIR) instruments have evolved until they are now more complicated than necessary for many users' purposes. We present a simpler methodology, suitable for widespread adoption of FTIR QA as a standard laboratory technique across disciplines by occasional users. • Algorithm is straightforward and intuitive, yet it is also fast, accurate, and robust. • Relies on component spectra, minimization of errors, and local adaptive mesh refinement. • Tested successfully on real mixtures of up to nine components. We show that our methodology is robust to challenging experimental conditions such as similar substances, component percentages differing by three orders of magnitude, and imperfect (noisy) spectra. As examples, we analyze biological, chemical, and physical aspects of bio-hybrid fuel cells.
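
    A minimal sketch of the component-spectrum fitting the bullet points allude to: a measured mixture spectrum is decomposed as a non-negative combination of reference component spectra by least squares. The Gaussian "spectra" are synthetic stand-ins, and the local adaptive mesh refinement of the authors' algorithm is not reproduced here.

```python
import numpy as np
from scipy.optimize import nnls

wavenumbers = np.linspace(800, 1800, 500)   # cm^-1, synthetic axis

def band(center, width, height=1.0):
    """Gaussian absorption band used to fake a component spectrum."""
    return height * np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

# Reference spectra of three hypothetical pure components (columns).
components = np.column_stack([
    band(1050, 30) + band(1450, 20, 0.5),
    band(1100, 25) + band(1650, 35, 0.8),
    band(1250, 40),
])

true_fractions = np.array([0.6, 0.3, 0.1])
mixture = components @ true_fractions
mixture += np.random.default_rng(0).normal(0, 0.005, mixture.shape)  # noise

# Non-negative least squares: minimize ||components @ x - mixture||.
fractions, residual = nnls(components, mixture)
fractions /= fractions.sum()                 # normalize to relative amounts

print("estimated fractions:", np.round(fractions, 3))
print("residual norm:", round(residual, 4))
```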

  10. A general methodology for mobility analysis of mechanisms based on constraint screw theory

    Institute of Scientific and Technical Information of China (English)

    HUANG Zhen; LIU JingFang; ZENG DaXing

    2009-01-01

    It is well known that the traditional Grübler-Kutzbach formula fails to calculate the mobility of some classical mechanisms or many modern parallel robots, and this situation seriously hampers mechanical innovation. To seek an efficient and universal method for mobility calculation has been a heated topic in the sphere of mechanism. The modified Grübler-Kutzbach criterion proposed by us achieved success in calculating the mobility of a lot of highly complicated mechanisms, especially the mobility of all recent parallel mechanisms listed by Gogu, and the Bennett mechanism known for its particular difficulty. With wide applications of the criterion, a systematic methodology has recently formed. This paper systematically presents the methodology based on the screw theory for the first time and analyzes six representative puzzling mechanisms. In addition, the methodology is convenient for judgment of the instantaneous or full-cycle mobility, and has become an effective and general method of great scientific value and practical significance. In the first half, this paper introduces the basic screw theory, then it presents the effective methodology formed within this decade. The second half of this paper presents how to apply the methodology by analyzing the mobility of several puzzling mechanisms. Finally, this paper contrasts and analyzes some different methods and interprets the essential reason for the validity of our methodology.
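
    For context, the traditional Grübler-Kutzbach formula and the general shape of the modified criterion can be written as below. The order d, the number of redundant constraints ν, and any passive degrees of freedom ζ must be determined from the constraint-screw analysis described in the paper, so the second line is only a schematic of the modified criterion, not a substitute for that analysis.

```latex
% Traditional Grubler-Kutzbach criterion (spatial mechanisms):
\[
  M \;=\; 6\,(n - g - 1) \;+\; \sum_{i=1}^{g} f_i ,
\]
% Modified criterion (schematic form): d is the order of the mechanism
% obtained from the screw-system analysis, nu counts redundant constraints,
% and zeta counts passive degrees of freedom.
\[
  M \;=\; d\,(n - g - 1) \;+\; \sum_{i=1}^{g} f_i \;+\; \nu \;-\; \zeta ,
\]
% where n is the number of links (including the frame), g the number of
% joints, and f_i the degrees of freedom of joint i.
```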

  11. A general methodology for mobility analysis of mechanisms based on constraint screw theory

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    It is well known that the traditional Grübler-Kutzbach formula fails to calculate the mobility of some classical mechanisms or many modern parallel robots, and this situation seriously hampers mechanical innovation. To seek an efficient and universal method for mobility calculation has been a heated topic in the sphere of mechanism. The modified Grübler-Kutzbach criterion proposed by us achieved success in calculating the mobility of a lot of highly complicated mechanisms, especially the mobility of all recent parallel mechanisms listed by Gogu, and the Bennett mechanism known for its particular difficulty. With wide applications of the criterion, a systematic methodology has recently formed. This paper systematically presents the methodology based on the screw theory for the first time and analyzes six representative puzzling mechanisms. In addition, the methodology is convenient for judgment of the instantaneous or full-cycle mobility, and has become an effective and general method of great scientific value and practical significance. In the first half, this paper introduces the basic screw theory, then it presents the effective methodology formed within this decade. The second half of this paper presents how to apply the methodology by analyzing the mobility of several puzzling mechanisms. Finally, this paper contrasts and analyzes some different methods and interprets the essential reason for the validity of our methodology.

  12. N+3 Aircraft Concept Designs and Trade Studies. Volume 2; Appendices-Design Methodologies for Aerodynamics, Structures, Weight, and Thermodynamic Cycles

    Science.gov (United States)

    Greitzer, E. M.; Bonnefoy, P. A.; delaRosaBlanco, E.; Dorbian, C. S.; Drela, M.; Hall, D. K.; Hansman, R. J.; Hileman, J. I.; Liebeck, R. H.; Lovegren, J.; Mody, P.; Pertuze, J. A.; Sato, S.; Spakovszky, Z. S.; Tan, C. S.; Hollman, J. S.; Duda, J. E.; Fitzgerald, N.; Houghton, J.; Kerrebrock, J. L.; Kiwada, G. F.; Kordonowy, D.; Parrish, J. C.; Tylko, J.; Wen, E. A.

    2010-01-01

    Appendices A to F present the theory behind the TASOPT methodology and code. Appendix A describes the bulk of the formulation, while Appendices B to F develop the major sub-models for the engine, fuselage drag, BLI accounting, etc.

  13. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is only beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 Standard, in particular with ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used with the ESA PSS-05-0 Standard. In general, our outcomes may be used by teams that need to build small satellites; in particular, they will be used when we build the on-board software applications for the SATEX-II.

  14. A New Methodology for Frequency Domain Analysis of Wave Energy Converters with Periodically Varying Physical Parameters

    Science.gov (United States)

    Mosher, Mark

    violations, making it extremely useful in the automated design optimization process; the methodology allows large number of design iterations, including both physical design and control variables, to be evaluated and conclusively compared. In the development of the perturbation method, it was discovered that the device's motion response can be calculated from an infinite series of second order ordinary differential equations that can be truncated without destroying the solution accuracy. It was found that the response amplitude operator for the generic form of a solution component provides a means to gauge the device's response to a given wave input and control parameter variation, including a gauge of the solution process stability. It is unclear as of yet if this is physical, a result of the solution process, or both. However, for a given control parameter set resulting in an unstable solution, the instability was shown to be, at least in part, a result of the device's dynamics. If the stability concerns can be addressed through additional constraints and updates to the wave energy converter hydrodynamic parameters, the methodology will expand on the commonly accepted boundaries for wave energy converter frequency-domain analysis methods and be of much practical importance in the evaluation of control techniques in the field of wave energy converter technology.

  15. A new methodology for fluorescence analysis of composite resins used in anterior direct restorations.

    Science.gov (United States)

    de Lima, Liliane Motta; Abreu, Jessica Dantas; Cohen-Carneiro, Flavia; Regalado, Diego Ferreira; Pontes, Danielson Guedes

    2015-01-01

    The aim of this study was to use a new methodology to evaluate the fluorescence of composite resins for direct restorations. Microhybrid (group 1, Amelogen; group 2, Opallis; group 3, Filtek Z250) and nanohybrid (group 4, Filtek Z350 XT; group 5, Brilliant NG; group 6, Evolu-X) composite resins were analyzed in this study. A prefabricated matrix was used to prepare 60 specimens of 7.0 × 3.0 mm (n = 10 per group); the composite resin discs were prepared in 2 increments (1.5 mm each) and photocured for 20 seconds. To establish a control group of natural teeth, 10 maxillary central incisor crowns were horizontally sectioned to create 10 discs of dentin and enamel tissues with the same dimensions as the composite resin specimens. The specimens were placed in a box with ultraviolet light, and photographs were taken. Aperture 3.0 software was used to quantify the central portion of the image of each specimen in shades of red (R), green (G), and blue (B) of the RGB color space. The brighter the B shade in the evaluated area of the image, the greater the fluorescence shown by the specimen. One-way analysis of variance revealed significant differences between the groups. The fluorescence achieved in group 1 was statistically similar to that of the control group and significantly different from those of the other groups (Bonferroni test). Groups 3 and 4 had the lowest fluorescence values, which were significantly different from those of the other groups. According to the results of this study, neither the size nor the amount of inorganic particles in the evaluated composite resin materials predicts if the material will exhibit good fluorescence.

  16. Analysis of agreement among definitions of metabolic syndrome in nondiabetic Turkish adults: a methodological study

    Directory of Open Access Journals (Sweden)

    Bersot Thomas P

    2007-12-01

    Full Text Available Abstract Background We aimed to explore the agreement among the World Health Organization (WHO), European Group for the Study of Insulin Resistance (EGIR), National Cholesterol Education Program (NCEP), American College of Endocrinology (ACE), and International Diabetes Federation (IDF) definitions of the metabolic syndrome. Methods 1568 subjects (532 men, 1036 women; mean age 45 and standard deviation (SD) 13 years) were evaluated in this cross-sectional, methodological study. Cardiometabolic risk factors were determined. Insulin sensitivity was calculated by HOMA-IR. Agreement among definitions was determined by the kappa statistic. ANOVA and post hoc Tukey's test were used to compare multiple groups. Results The agreement between WHO and EGIR definitions was very good (kappa: 0.83). The agreement between NCEP, ACE, and IDF definitions was substantial to very good (kappa: 0.77–0.84). The agreement between NCEP or ACE or IDF and WHO or EGIR definitions was fair (kappa: 0.32–0.37). The age and sex adjusted prevalence of metabolic syndrome was 38% by NCEP, 42% by ACE and IDF, 20% by EGIR and 19% by WHO definition. The evaluated definitions were dichotomized after analysis of design, agreement and prevalence: definitions requiring insulin measurement (WHO and EGIR) and definitions not requiring insulin measurement (NCEP, ACE, IDF). One definition was selected from each set for comparison. WHO-defined subjects were more insulin resistant than subjects without the metabolic syndrome (mean and SD for log HOMA-IR, 0.53 ± 0.14 vs. 0.07 ± 0.23, respectively, p 0.05, but lower log HOMA-IR values (p Conclusion The metabolic syndrome definitions that do not require measurement of insulin levels (NCEP, ACE and IDF) identify twice as many patients with insulin resistance and increased Framingham risk scores, and are more useful than the definitions that require measurement of insulin levels (WHO and EGIR).
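
    A small illustration of the agreement statistic used throughout the abstract: Cohen's kappa computed from two binary classifications (metabolic syndrome yes/no under two hypothetical definitions). The labels below are invented, not the study's data.

```python
import numpy as np

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters/definitions applied to the same subjects."""
    a = np.asarray(labels_a)
    b = np.asarray(labels_b)
    categories = np.union1d(a, b)
    observed = np.mean(a == b)                        # observed agreement p_o
    expected = sum(np.mean(a == c) * np.mean(b == c)  # chance agreement p_e
                   for c in categories)
    return (observed - expected) / (1.0 - expected)

# Hypothetical classifications of 12 subjects by two definitions (1 = MetS).
def_1 = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0]
def_2 = [1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0]
print(f"kappa = {cohens_kappa(def_1, def_2):.2f}")
```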

  17. The IMAGE project: methodological issues for the molecular genetic analysis of ADHD

    Directory of Open Access Journals (Sweden)

    Faraone Stephen V

    2006-08-01

    Full Text Available Abstract The genetic mechanisms involved in attention deficit hyperactivity disorder (ADHD are being studied with considerable success by several centres worldwide. These studies confirm prior hypotheses about the role of genetic variation within genes involved in the regulation of dopamine, norepinephrine and serotonin neurotransmission in susceptibility to ADHD. Despite the importance of these findings, uncertainties remain due to the very small effects sizes that are observed. We discuss possible reasons for why the true strength of the associations may have been underestimated in research to date, considering the effects of linkage disequilibrium, allelic heterogeneity, population differences and gene by environment interactions. With the identification of genes associated with ADHD, the goal of ADHD genetics is now shifting from gene discovery towards gene functionality – the study of intermediate phenotypes ('endophenotypes'. We discuss methodological issues relating to quantitative genetic data from twin and family studies on candidate endophenotypes and how such data can inform attempts to link molecular genetic data to cognitive, affective and motivational processes in ADHD. The International Multi-centre ADHD Gene (IMAGE project exemplifies current collaborative research efforts on the genetics of ADHD. This European multi-site project is well placed to take advantage of the resources that are emerging following the sequencing of the human genome and the development of international resources for whole genome association analysis. As a result of IMAGE and other molecular genetic investigations of ADHD, we envisage a rapid increase in the number of identified genetic variants and the promise of identifying novel gene systems that we are not currently investigating, opening further doors in the study of gene functionality.

  18. A Study for Appropriateness of National Nuclear Policy by using Economic Analysis Methodology after Fukushima accident

    Energy Technology Data Exchange (ETDEWEB)

    Shim, Jong Myoung; Roh, Myung Sub [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2013-10-15

    The aim of this paper is to clarify the appropriateness of the national nuclear policy in Korea's BPE from an economic perspective. To do this, the paper focuses only on the economic analysis methodology, without considering other conditions such as political, cultural, or historical factors. In a number of countries, especially Korea, nuclear energy policy has kept the status quo after the Fukushima accident. However, the nation's nuclear policy may vary depending on the choice of the people; thus, to make the right decisions, it is important to deliver accurate information and knowledge about nuclear energy to the people. As shown in this paper, the levelized cost of nuclear power is the lowest among the base-load units. As reliance on nuclear power grows stronger through this economic logic, nuclear safety and environmental elements will have to be strengthened, and national nuclear policy should be promoted on this basis. In the aftermath of the Fukushima accident, recognized as the world's worst nuclear disaster since Chernobyl, there have been changes in the nuclear energy policies of various countries. Germany, for example, called a halt to the operation of nuclear power plants (NPPs) accounting for about 7.5% of national power generation capacity, some 6.3 GW. Developing countries such as China and India re-checked the safety of their nuclear power plants before proceeding with their nuclear programs. The Korean government announced 'The 6th Basic Plan for Long-term Electricity Supply and Demand (BPE)', considering the safety and general public acceptance of nuclear power plants. According to the BPE, plans for additional NPP construction were postponed, except for constructions already reflected in the 5th BPE. All told, countries' responses in nuclear energy policy differ depending on their own circumstances.
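
    For reference, the levelized cost of electricity invoked in the comparison is conventionally defined as the discounted lifetime cost divided by the discounted lifetime generation; the symbols below follow that generic textbook definition rather than the specific cost categories of the BPE analysis.

```latex
% Generic levelized cost of electricity (LCOE) over plant life T at discount rate r:
\[
  \mathrm{LCOE}
  \;=\;
  \frac{\displaystyle\sum_{t=0}^{T} \frac{I_t + O\&M_t + F_t}{(1+r)^{t}}}
       {\displaystyle\sum_{t=0}^{T} \frac{E_t}{(1+r)^{t}}},
\]
% where I_t is the investment cost, O&M_t the operations and maintenance cost,
% F_t the fuel cost, and E_t the electricity generated in year t.
```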

  19. Multifractal methodology

    CERN Document Server

    Salat, Hadrien; Arcaute, Elsa

    2016-01-01

    Various methods have been developed independently to study the multifractality of measures in many different contexts. Although they all convey the same intuitive idea of giving a "dimension" to sets where a quantity scales similarly within a space, they are not necessarily equivalent on a more rigorous level. This review article aims at unifying the multifractal methodology by presenting the multifractal theoretical framework and principal practical methods, namely the moment method, the histogram method, multifractal detrended fluctuation analysis (MDFA) and modulus maxima wavelet transform (MMWT), with a comparative and interpretative eye.
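
    As a concrete illustration of the moment method mentioned above, the sketch below estimates the mass exponents tau(q) from the scaling of the partition function over box sizes and converts them into generalized dimensions D_q = tau(q)/(q - 1). It is a minimal box-counting implementation for a two-dimensional measure on a toy data set, not the formulation of any particular method reviewed in the article.

    import numpy as np

    def generalized_dimensions(measure, qs, box_sizes):
        """Moment method: fit tau(q) from Z(q, eps) = sum_i mu_i(eps)**q ~ eps**tau(q).
        qs must not contain q = 1 (D_1 requires a separate limit)."""
        n = measure.shape[0]
        dims = []
        for q in qs:
            log_z, log_eps = [], []
            for s in box_sizes:
                boxes = measure.reshape(n // s, s, n // s, s).sum(axis=(1, 3))  # coarse-grain into boxes of side s
                mu = boxes[boxes > 0] / measure.sum()
                log_z.append(np.log(np.sum(mu ** q)))
                log_eps.append(np.log(s / n))
            tau = np.polyfit(log_eps, log_z, 1)[0]  # slope of log Z versus log eps
            dims.append(tau / (q - 1))
        return dims

    # toy example: an uneven random measure on a 256 x 256 grid
    rng = np.random.default_rng(0)
    m = rng.random((256, 256)) ** 4
    print(generalized_dimensions(m, qs=[0, 2, 3], box_sizes=[2, 4, 8, 16, 32]))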

  20. Assessment of methodologies for analysis of the dungeness B accidental aircraft crash risk.

    Energy Technology Data Exchange (ETDEWEB)

    LaChance, Jeffrey L.; Hansen, Clifford W.

    2010-09-01

    The Health and Safety Executive (HSE) has requested Sandia National Laboratories (SNL) to review the aircraft crash methodology for nuclear facilities that are being used in the United Kingdom (UK). The scope of the work included a review of one method utilized in the UK for assessing the potential for accidental airplane crashes into nuclear facilities (Task 1) and a comparison of the UK methodology against similar International Atomic Energy Agency (IAEA), United States (US) Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) methods (Task 2). Based on the conclusions from Tasks 1 and 2, an additional Task 3 would provide an assessment of a site-specific crash frequency for the Dungeness B facility using one of the other methodologies. This report documents the results of Task 2. The comparison of the different methods was performed for the three primary contributors to aircraft crash risk at the Dungeness B site: airfield related crashes, crashes below airways, and background crashes. The methods and data specified in each methodology were compared for each of these risk contributors, differences in the methodologies were identified, and the importance of these differences was qualitatively and quantitatively assessed. The bases for each of the methods and the data used were considered in this assessment process. A comparison of the treatment of the consequences of the aircraft crashes was not included in this assessment because the frequency of crashes into critical structures is currently low based on the existing Dungeness B assessment. Although the comparison found substantial differences between the UK and the three alternative methodologies (IAEA, NRC, and DOE) this assessment concludes that use of any of these alternative methodologies would not change the conclusions reached for the Dungeness B site. Performance of Task 3 is thus not recommended.
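
    The methodologies compared in this report all reduce, at some level, to multiplying a crash rate per unit area by the number of operations and an effective target area, summed over airfield-related, airway and background contributors. The sketch below is a generic point-target model of that kind; it is not the UK, IAEA, NRC or DOE formulation, and every number in it is a placeholder.

    def crash_frequency(airfield_sources, background_rate_per_km2, target_area_km2):
        """Generic point-target aircraft crash frequency (events per year).

        airfield_sources: list of (crash_rate_per_movement_per_km2, movements_per_year)
        tuples evaluated at the site location; background_rate_per_km2 covers
        overflights not associated with a nearby airfield or airway."""
        airfield = sum(c * n * target_area_km2 for c, n in airfield_sources)
        background = background_rate_per_km2 * target_area_km2
        return airfield + background

    # hypothetical inputs for illustration only
    f = crash_frequency(
        airfield_sources=[(1.0e-8, 20000), (5.0e-9, 5000)],  # two airfield/airway contributors
        background_rate_per_km2=2.0e-5,
        target_area_km2=0.02,
    )
    print(f"{f:.2e} crashes per year")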

  1. Methodology assessment and recommendations for the Mars science laboratory launch safety analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Sturgis, Beverly Rainwater; Metzinger, Kurt Evan; Powers, Dana Auburn; Atcitty, Christopher B.; Robinson, David B; Hewson, John C.; Bixler, Nathan E.; Dodson, Brian W.; Potter, Donald L.; Kelly, John E.; MacLean, Heather J.; Bergeron, Kenneth Donald (Sala & Associates); Bessette, Gregory Carl; Lipinski, Ronald J.

    2006-09-01

    thousands of possible event sequences and to build up a statistical representation of the releases for each accident case. A code to carry out this process will have to be developed or adapted from previous MMRTG missions. Finally, Level C translates the release (or "source term") information from Level B into public risk by applying models for atmospheric transport and the health consequences of exposure to the released plutonium dioxide. A number of candidate codes for this level of analysis are available. This report surveys the range of available codes and tools for each of these levels and makes recommendations for which choices are best for the MSL mission. It also identifies areas where improvements to the codes are needed. In some cases a second tier of codes may be identified to provide supporting or clarifying insight about particular issues. The main focus of the methodology assessment is to identify a suite of computational tools that can produce a high-quality SAR that can be successfully reviewed by external bodies (such as the Interagency Nuclear Safety Review Panel) on the schedule established by NASA and DOE.

  2. A Multi-Criteria Decision Analysis based methodology for quantitatively scoring the reliability and relevance of ecotoxicological data.

    Science.gov (United States)

    Isigonis, Panagiotis; Ciffroy, Philippe; Zabeo, Alex; Semenzin, Elena; Critto, Andrea; Giove, Silvio; Marcomini, Antonio

    2015-12-15

    Ecotoxicological data are highly important for risk assessment processes and are used for deriving environmental quality criteria, which are enacted for assuring the good quality of waters, soils or sediments and achieving desirable environmental quality objectives. Therefore, evaluating the reliability of available data is of significant importance for analysing their possible use in the aforementioned processes. The thorough analysis of currently available frameworks for the assessment of ecotoxicological data has led to the identification of significant flaws but, at the same time, various opportunities for improvement. In this context, a new methodology, based on Multi-Criteria Decision Analysis (MCDA) techniques, has been developed with the aim of analysing the reliability and relevance of ecotoxicological data (which are produced through laboratory biotests for individual effects), in a transparent quantitative way, through the use of expert knowledge, multiple criteria and fuzzy logic. The proposed methodology can be used for the production of weighted Species Sensitivity Weighted Distributions (SSWD), as a component of the ecological risk assessment of chemicals in aquatic systems. The MCDA aggregation methodology is described in detail and demonstrated through examples in the article, and the hierarchically structured framework that is used for the evaluation and classification of ecotoxicological data is briefly discussed. The methodology is demonstrated for the aquatic compartment but it can be easily tailored to other environmental compartments (soil, air, sediments).

  3. Passive solar design handbook. Volume 3: Passive solar design analysis

    Science.gov (United States)

    Jones, R. W.; Bascomb, J. D.; Kosiewicz, C. E.; Lazarus, G. S.; McFarland, R. D.; Wray, W. O.

    1982-07-01

    Simple analytical methods concerning the design of passive solar heating systems are presented with an emphasis on the average annual heating energy consumption. Key terminology and methods are reviewed. The solar load ratio (SLR) is defined, and its relationship to analysis methods is reviewed. The annual calculation, or Load Collector Ratio (LCR) method, is outlined. Sensitivity data are discussed. Information is presented on balancing conservation and passive solar strategies in building design. Detailed analysis data are presented for direct gain and sunspace systems, and details of the systems are described. Key design parameters are discussed in terms of their impact on annual heating performance of the building. These are the sensitivity data. The SLR correlations for the respective system types are described. The monthly calculation, or SLR method, based on the SLR correlations, is reviewed. Performance data are given for 9 direct gain systems and 15 water wall and 42 Trombe wall systems.

  4. Passive solar design handbook. Volume III. Passive solar design analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jones, R.W.; Balcomb, J.D.; Kosiewicz, C.E.; Lazarus, G.S.; McFarland, R.D.; Wray, W.O.

    1982-07-01

    Simple analytical methods concerning the design of passive solar heating systems are presented with an emphasis on the average annual heating energy consumption. Key terminology and methods are reviewed. The solar load ratio (SLR) is defined, and its relationship to analysis methods is reviewed. The annual calculation, or Load Collector Ratio (LCR) method, is outlined. Sensitivity data are discussed. Information is presented on balancing conservation and passive solar strategies in building design. Detailed analysis data are presented for direct gain and sunspace systems, and details of the systems are described. Key design parameters are discussed in terms of their impact on annual heating performance of the building. These are the sensitivity data. The SLR correlations for the respective system types are described. The monthly calculation, or SLR method, based on the SLR correlations, is reviewed. Performance data are given for 9 direct gain systems and 15 water wall and 42 Trombe wall systems. (LEW)
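
    To make the monthly SLR calculation concrete, the sketch below evaluates a solar savings fraction from the solar load ratio using a correlation of simple exponential form, and sums the resulting auxiliary heat over a heating season. The correlation coefficients and monthly figures are placeholders chosen only for illustration; the handbook tabulates system-specific correlation coefficients for the direct gain, water wall and Trombe wall systems listed above.

    import math

    def monthly_ssf(solar_absorbed, load, b=1.0, c=1.0, d=0.8):
        """Solar savings fraction from the solar load ratio (SLR = absorbed solar / load).
        Exponential correlation with placeholder coefficients b, c, d."""
        slr = solar_absorbed / load
        return max(0.0, min(1.0, b - c * math.exp(-d * slr)))

    def annual_auxiliary_heat(monthly_loads, monthly_solar):
        """Annual auxiliary (back-up) heat = sum over months of (1 - SSF) * load."""
        return sum((1.0 - monthly_ssf(s, q)) * q for q, s in zip(monthly_loads, monthly_solar))

    # hypothetical monthly heating loads and absorbed solar gains (MJ) for a nine-month season
    loads = [900, 800, 650, 400, 200, 150, 300, 550, 750]
    solar = [300, 350, 420, 380, 250, 200, 280, 330, 310]
    print(f"Auxiliary heat: {annual_auxiliary_heat(loads, solar):.0f} MJ")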

  5. A philosophical analysis of the general methodology of qualitative research: a critical rationalist perspective.

    Science.gov (United States)

    Rudnick, Abraham

    2014-09-01

    Philosophical discussion of the general methodology of qualitative research, such as that used in some health research, has been inductivist or relativist to date, ignoring critical rationalism as a philosophical approach with which to discuss the general methodology of qualitative research. This paper presents a discussion of the general methodology of qualitative research from a critical rationalist perspective (inspired by Popper), using as an example mental health research. The widespread endorsement of induction in qualitative research is positivist and is suspect, if not false, particularly in relation to the context of justification (or rather theory testing) as compared to the context of discovery (or rather theory generation). Relativism is riddled with philosophical weaknesses and hence it is suspect if not false too. Theory testing is compatible with qualitative research, contrary to much writing about and in qualitative research, as theory testing involves learning from trial and error, which is part of qualitative research, and which may be the form of learning most conducive to generalization. Generalization involves comparison, which is a fundamental methodological requirement of any type of research (qualitative or other); hence the traditional grounding of quantitative and experimental research in generalization. Comparison--rather than generalization--is necessary for, and hence compatible with, qualitative research; hence, the common opposition to generalization in qualitative research is misdirected, disregarding whether this opposition's claims are true or false. In conclusion, qualitative research, similar to quantitative and experimental research, assumes comparison as a general methodological requirement, which is necessary for health research.

  6. Quantitative Indicators for Defense Analysis. Volume II. Technical Report

    Science.gov (United States)

    1975-06-01

    34*"WTOiw«* piB ^ r- ••’ ’ ’■’.WH""" - "«.JH QUAURANT II Hot War JIoL ]War land i Cold I |Criscs War iThreaten ed - Crisis 1...34The Political Analysis of Negotiations," World Politics 26. 3 (April). ^(1971) The Politics of Trade Negotiations Between Africa and the EEC

  7. Failure Analysis Seminar: Techniques and Teams. Seminar Notes. Volume I.

    Science.gov (United States)

    1981-01-01

    [OCR fragments from the seminar notes: "Failure Analysis Strategy" by Augustine E. Magistro; contributed by Augustine Magistro, Picatinny Arsenal, and Lawrence R. Seggel, U.S. Army Missile Command; the report is available from the National Technical ...; topics include identification and improvement of leadership styles; biographic sketches include A.E. "Gus" Magistro, Systems Evaluation.]

  8. Principal component and volume of interest analyses in depressed patients imaged by 99mTc-HMPAO SPET: a methodological comparison

    Energy Technology Data Exchange (ETDEWEB)

    Pagani, Marco [Institute of Cognitive Sciences and Technologies, CNR, Rome (Italy); Section of Nuclear Medicine, Department of Hospital Physics, Karolinska Hospital, Stockholm (Sweden); Gardner, Ann; Haellstroem, Tore [NEUROTEC, Division of Psychiatry, Karolinska Institutet, Huddinge University Hospital, Stockholm (Sweden); Salmaso, Dario [Institute of Cognitive Sciences and Technologies, CNR, Rome (Italy); Sanchez Crespo, Alejandro; Jonsson, Cathrine; Larsson, Stig A. [Section of Nuclear Medicine, Department of Hospital Physics, Karolinska Hospital, Stockholm (Sweden); Jacobsson, Hans [Department of Radiology, Karolinska Hospital, Stockholm (Sweden); Lindberg, Greger [Department of Medicine, Division of Gastroenterology and Hepatology, Karolinska Institutet, Huddinge University Hospital, Stockholm (Sweden); Waegner, Anna [Department of Clinical Neuroscience, Division of Neurology, Karolinska Hospital, Stockholm (Sweden)

    2004-07-01

    Previous regional cerebral blood flow (rCBF) studies on patients with unipolar major depressive disorder (MDD) have analysed clusters of voxels or single regions and yielded conflicting results, showing either higher or lower rCBF in MDD as compared to normal controls (CTR). The aim of this study was to assess rCBF distribution changes in 68 MDD patients, investigating the data set with both volume of interest (VOI) analysis and principal component analysis (PCA). The rCBF distribution in 68 MDD and 66 CTR, at rest, was compared. Technetium-99m d,l-hexamethylpropylene amine oxime single-photon emission tomography was performed and the uptake in 27 VOIs, bilaterally, was assessed using a standardising brain atlas. Data were then grouped into factors by means of PCA performed on rCBF of all 134 subjects and based on all 54 VOIs. VOI analysis showed a significant group x VOI x hemisphere interaction (P<0.001). rCBF in eight VOIs (in the prefrontal, temporal, occipital and central structures) differed significantly between groups at the P<0.05 level. PCA identified 11 anatomo-functional regions that interacted with groups (P<0.001). As compared to CTR, MDD rCBF was relatively higher in right associative temporo-parietal-occipital cortex (P<0.01) and bilaterally in prefrontal (P<0.005) and frontal cortex (P<0.025), anterior temporal cortex and central structures (P<0.05 and P<0.001 respectively). Higher rCBF in a selected group of MDD as compared to CTR at rest was found using PCA in five clusters of regions sharing close anatomical and functional relationships. At the single VOI level, all eight regions showing group differences were included in such clusters. PCA is a data-driven method for recasting VOIs to be used for group evaluation and comparison. The appearance of significant differences absent at the VOI level emphasises the value of analysing the relationships among brain regions for the investigation of psychiatric disease. (orig.)

  9. Scale-4 Analysis of Pressurized Water Reactor Critical Configurations: Volume 5 - North Anna Unit 1 Cycle 5

    Energy Technology Data Exchange (ETDEWEB)

    Bowman, S.M.

    1993-01-01

    The requirements of ANSI/ANS 8.1 specify that calculational methods for away-from-reactor (AFR) criticality safety analyses be validated against experimental measurements. If credit for the negative reactivity of the depleted (or spent) fuel isotopics is desired, it is necessary to benchmark computational methods against spent fuel critical configurations. This report summarizes a portion of the ongoing effort to benchmark AFR criticality analysis methods using selected critical configurations from commercial pressurized-water reactors (PWR). The analysis methodology selected for all calculations reported herein was the codes and data provided in the SCALE-4 code system. The isotopic densities for the spent fuel assemblies in the critical configurations were calculated using the SAS2H analytical sequence of the SCALE-4 system. The sources of data and the procedures for deriving SAS2H input parameters are described in detail. The SNIKR code module was used to extract the necessary isotopic densities from the SAS2H results and to provide the data in the format required by the SCALE criticality analysis modules. The CSASN analytical sequence in SCALE-4 was used to perform resonance processing of the cross sections. The KENO V.a module of SCALE-4 was used to calculate the effective multiplication factor (k{sub eff}) of each case. The SCALE-4 27-group burnup library containing ENDF/B-IV (actinides) and ENDF/B-V (fission products) data was used for all the calculations. This volume of the report documents the SCALE system analysis of one reactor critical configuration for North Anna Unit 1 Cycle 5. This unit and cycle were chosen for a previous analysis using a different methodology because detailed isotopics from multidimensional reactor calculations were available from the Virginia Power Company. These data permitted comparison of criticality calculations directly using the utility-calculated isotopics to those using the isotopics generated by the SCALE-4 SAS2H

  10. Methodologies for localizing loco-regional hypopharyngeal carcinoma recurrences in relation to FDG-PET positive and clinical radiation therapy target volumes

    DEFF Research Database (Denmark)

    Due, Anne Kirkebjerg; Korreman, Stine; Bentzen, Søren M;

    2010-01-01

    Focal methods to determine the source of recurrence are presented, tested for reproducibility and compared to volumetric approaches with respect to the number of recurrences ascribed to the FDG-PET positive and high dose volumes....

  11. Analysis of methodologies for assessing the employability of Sport graduates in Portugal

    Directory of Open Access Journals (Sweden)

    Dina Miragaia

    2013-01-01

    Full Text Available The purpose of this article is to analyze the methodologies used to monitor and evaluate the employability of Portuguese sport graduates. National and international literature was consulted to understand the significance of the employability concept and its implications for the methodological design used in evaluation reports of employability. Questionnaires applied by official organizations were consulted, and the coherence between the dimensions/variables uncovered and the concept of employability was examined. Problems of validity, reliability, discrimination and comparability between studies were identified, which suggests that new ways to evaluate employability are required. One possible way consists in the use of the methodology of triangulation (of data and researchers) that integrates inter-institutional research.

  12. Design of batch operations: Systematic methodology for generation and analysis of sustainable alternatives

    DEFF Research Database (Denmark)

    Carvalho, Ana; Matos, Henrique A.; Gani, Rafiqul

    2010-01-01

    The objective of this paper is to present a new methodology that is able to generate, screen and identify sustainable alternatives to continuous chemical processes as well as processes operating in the batch mode. The methodology generates the sustainable (design) alternatives by locating the operational, environmental, economical and safety related problems inherent in the process (batch or continuous). Alternatives that are more sustainable, compared to a reference, are generated and evaluated by addressing one or more of the identified problems. A decomposition technique as well as a set ... processes are described, highlighting the main differences between them. Through two case studies, the application of the methodology, to obtain sustainable design alternatives for batch plants, is highlighted.

  13. Note: Methodology for the analysis of Bluetooth gateways in an implemented scatternet

    Science.gov (United States)

    Etxaniz, J.; Monje, P. M.; Aranguren, G.

    2014-03-01

    This Note introduces a novel methodology to analyze the time performance of Bluetooth gateways in multi-hop networks, known as scatternets. The methodology is focused on distinguishing between the processing time and the time that each communication between nodes takes along an implemented scatternet. This technique is not only valid for Bluetooth networks but also for other wireless networks that offer access to their middleware in order to include beacons in the operation of the nodes. We show in this Note the results of the tests carried out on a Bluetooth scatternet in order to highlight the reliability and effectiveness of the methodology. The results also validate this technique showing convergence in the results when subtracting the time for the beacons from the delay measurements.

  14. Methodologies for localizing loco-regional hypopharyngeal carcinoma recurrences in relation to FDG-PET positive and clinical radiation therapy target volumes

    DEFF Research Database (Denmark)

    Due, Anne Kirkebjerg; Korreman, Stine Sofia; Tomé, Wolfgang;

    2010-01-01

    Focal methods to determine the source of recurrence are presented, tested for reproducibility and compared to volumetric approaches with respect to the number of recurrences ascribed to the FDG-PET positive and high dose volumes.

  15. Investment in selective social programs: a proposed methodological tool for the analysis of programs’ sustainability

    Directory of Open Access Journals (Sweden)

    Manuel Antonio Barahona Montero

    2014-08-01

    Full Text Available This paper proposes a methodology to evaluate the sustainability of Selective Social Programs (SSP), based on the relationship between economic growth and human development posed by the United Nations Development Program (UNDP). For such purposes, the Circle of Sustainability is developed, which is comprised of 12 pillars. Each pillar is evaluated based on its current status and impact. Combining both results allows the sustainability of these programs to be assessed and areas of focus to be identified. Therefore, this methodology helps to better channel available efforts and resources.

  16. Enhancing the chemical mixture methodology in emergency preparedness and consequence assessment analysis.

    Science.gov (United States)

    Yu, Xiao-Ying; Glantz, Clifford S; Yao, Juan; He, Hua; Petrocchi, Achille J; Craig, Douglas K; Ciolek, John T; Booth, Alexander E

    2013-11-16

    Emergency preparedness personnel at U.S. Department of Energy (DOE) facilities use the chemical mixture methodology (CMM) to estimate the potential health impacts to workers and the public from the unintended airborne release of chemical mixtures. The CMM uses a Hazard Index (HI) for each chemical in a mixture to compare a chemical's concentration at a receptor location to an appropriate concentration limit for that chemical. This limit is typically based on Protection Action Criteria (PAC) values developed and published by the DOE. As a first cut, the CMM sums the HIs for all the chemicals in a mixture to conservatively estimate their combined health impact. A cumulative HI>1.0 represents a concentration exceeding the concentration limit and indicates the potential for adverse health effects. Next, Health Code Numbers (HCNs) are used to identify the target organ systems that may be impacted by exposure to each chemical in a mixture. The sum of the HIs for the maximally impacted target organ system is used to provide a refined, though still conservative, estimate of the potential for adverse health effects from exposure to the chemical mixture. This paper explores approaches to enhance the effectiveness of the CMM by using HCN weighting factors. A series of 24 case studies have been defined to evaluate both the existing CMM and three new approaches for improving the CMM. The first approach uses a set of HCN weighting factors that are applied based on the priority ranking of the HCNs for each chemical. The second approach uses weighting factors based on the priority rankings of the HCNs established for a given type of concentration limit. The third approach uses weighting factors that are based on the exposure route used to derive PAC values and a priority ranking of the HCNs (the same ranking as used in the second approach). Initial testing indicates that applying weighting factors increases the effectiveness of the CMM in general, though care must be taken to
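
    A minimal sketch of the screening logic described above: each chemical's Hazard Index is its receptor concentration divided by its concentration limit, the HIs are summed as a first cut, and the largest HI sum over any single target-organ system gives the refined estimate. The chemicals, limits and target-organ labels below are hypothetical (actual HCNs are numeric codes), and no weighting factors are applied.

    def chemical_mixture_screen(chemicals):
        """Sum the Hazard Indices of a mixture and find the worst single target-organ HI.

        chemicals: list of dicts with keys
          'conc'  - concentration at the receptor,
          'limit' - concentration limit (e.g., a PAC value) in the same units,
          'hcns'  - target-organ labels for the chemical."""
        total_hi = sum(c['conc'] / c['limit'] for c in chemicals)
        by_organ = {}
        for c in chemicals:
            hi = c['conc'] / c['limit']
            for hcn in c['hcns']:
                by_organ[hcn] = by_organ.get(hcn, 0.0) + hi
        worst_organ_hi = max(by_organ.values()) if by_organ else 0.0
        return total_hi, worst_organ_hi

    # hypothetical three-chemical release; concentrations and limits in mg/m3
    mix = [
        {'conc': 1.2, 'limit': 5.0, 'hcns': ['respiratory', 'eye']},
        {'conc': 0.8, 'limit': 2.0, 'hcns': ['respiratory']},
        {'conc': 0.1, 'limit': 1.0, 'hcns': ['neurological']},
    ]
    total, worst = chemical_mixture_screen(mix)
    print(f"Summed HI = {total:.2f}, worst target-organ HI = {worst:.2f}")  # values above 1.0 flag concern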

  17. Multicriteria decision-making analysis based methodology for predicting carbonate rocks' uniaxial compressive strength

    Directory of Open Access Journals (Sweden)

    Ersoy Hakan

    2012-10-01

    Full Text Available

    ABSTRACT

    Uniaxial compressive strength (UCS) refers to a material's ability to withstand axially directed compressive forces and is considered one of the most important mechanical properties of rock materials. However, the UCS test is an expensive, very time-consuming test to perform in the laboratory and requires high-quality core samples having regular geometry. Empirical equations have thus been proposed for predicting UCS as a function of rocks' index properties. An analytical hierarchy process and multiple regression analysis based methodology was used (as opposed to traditional linear regression methods) on data sets obtained from carbonate rocks in NE Turkey. Limestone samples ranging from Devonian to late Cretaceous ages were chosen; travertine-onyx samples were selected from morphological environments considering their surface environmental conditions. Test results from experiments carried out on about 250 carbonate rock samples were used in deriving the model. While the hierarchy model focused on determining the most important index properties affecting UCS, regression analysis established meaningful relationships between UCS and index properties; positive correlation coefficients of 0.85 and 0.83 between the variables were determined by regression analysis. The methodology provided an appropriate alternative for the quantitative estimation of UCS and avoided the need for tedious and time-consuming laboratory testing.



  18. A normative price for energy from an electricity generation system: An Owner-dependent Methodology for Energy Generation (system) Assessment (OMEGA). Volume 1: Summary

    Science.gov (United States)

    Chamberlain, R. G.; Mcmaster, K. M.

    1981-01-01

    The utility-owned solar electric system methodology is generalized and updated. The net present value of the system is determined by consideration of all financial benefits and costs (including a specified return on investment). Life cycle costs, life cycle revenues, and residual system values are obtained. Break-even values of system parameters are estimated by setting the net present value to zero. While the model was designed for photovoltaic generators with a possible thermal energy byproduct, its applicability is not limited to such systems. The resulting owner-dependent methodology for energy generation system assessment consists of a few equations that can be evaluated without the aid of a high-speed computer.
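
    A minimal sketch of the net-present-value and break-even logic summarized above: the system's net present value is computed from discounted annual net benefits, and the break-even energy price is found as the price at which that value is zero. The bisection search and all input figures are illustrative assumptions, not the OMEGA equations themselves.

    def net_present_value(cash_flows, rate):
        """NPV of annual net cash flows; index 0 is the initial investment (negative)."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    def break_even_price(annual_mwh, annual_cost, capex, rate, years, lo=0.0, hi=1000.0):
        """Energy price ($/MWh) at which the system NPV is zero, found by bisection."""
        def npv_at(price):
            flows = [-capex] + [price * annual_mwh - annual_cost] * years
            return net_present_value(flows, rate)
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            if npv_at(mid) > 0:
                hi = mid  # price more than covers costs, try a lower price
            else:
                lo = mid
        return 0.5 * (lo + hi)

    # hypothetical photovoltaic system figures
    price = break_even_price(annual_mwh=1800, annual_cost=20000, capex=1.2e6, rate=0.08, years=30)
    print(f"Break-even energy price: {price:.1f} $/MWh")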

  19. Development of Methodology for Spent Fuel Pool Severe Accident Analysis Using MELCOR Program

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Won-Tae; Shin, Jae-Uk [RETech. Co. LTD., Yongin (Korea, Republic of); Ahn, Kwang-Il [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    The general reason why SFP severe accident analysis has to be considered is that there is a potentially great risk due to the huge number of fuel assemblies and the absence of a containment around the SFP building. In most cases, the SFP building is vulnerable to external damage or attack. On the other hand, the low decay heat of the fuel assemblies may make the accident processes slow compared to an accident in the reactor core, because of the great deal of water present. In short, the potential severity of the consequences means that SFP risk management cannot be excluded from consideration. The U.S. Nuclear Regulatory Commission has performed consequence studies of postulated spent fuel pool accidents. The Fukushima-Daiichi accident has accelerated the need for consequence studies of postulated spent fuel pool accidents, causing the nuclear industry and regulatory bodies to reexamine several assumptions concerning beyond-design-basis events such as a station blackout. The tsunami brought about a loss of coolant accident, leading to the explosion of hydrogen in the SFP building. Analyses of SFP accident processes in the case of a loss of coolant with no heat removal have been studied. Few studies, however, have focused on the long-term progression of an SFP severe accident with no mitigation action such as water makeup to the SFP. The USNRC and the OECD have worked together to examine the behavior of PWR fuel assemblies under severe accident conditions in a spent fuel rack. In support of the investigation, several new features of the MELCOR model have been added to simulate both a BWR fuel assembly and a PWR 17 x 17 assembly in a spent fuel pool rack undergoing severe accident conditions. The purpose of the study in this paper is to develop a methodology for the long-term analysis of a plant-level SFP severe accident by using the new-featured MELCOR program for the OPR-1000 Nuclear Power Plant. The study investigates the ability of MELCOR to predict the entire process of SFP severe accident phenomena, including the molten corium and concrete reaction. The

  20. Design and tolerance analysis of a low bending loss hole-assisted fiber using statistical design methodology.

    Science.gov (United States)

    Van Erps, Jürgen; Debaes, Christof; Nasilowski, Tomasz; Watté, Jan; Wojcik, Jan; Thienpont, Hugo

    2008-03-31

    We present the design of a low bending loss hole-assisted fiber for a 180°-bend fiber socket application, including a tolerance analysis for manufacturability. To this aim, we make use of statistical design methodology, combined with a fully vectorial mode solver. Two resulting designs are presented and their performance in terms of bending loss, coupling loss to Corning SMF-28 standard telecom fiber, and cut-off wavelength is calculated.

  1. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    Science.gov (United States)

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  2. Quality and Rigor of the Concept Mapping Methodology: A Pooled Study Analysis

    Science.gov (United States)

    Rosas, Scott R.; Kane, Mary

    2012-01-01

    The use of concept mapping in research and evaluation has expanded dramatically over the past 20 years. Researchers in academic, organizational, and community-based settings have applied concept mapping successfully without the benefit of systematic analyses across studies to identify the features of a methodologically sound study. Quantitative…

  3. Making Explicit the Analysis of Students' Mathematical Discourses--Revisiting a Newly Developed Methodological Framework

    Science.gov (United States)

    Ryve, Andreas

    2006-01-01

    Sfard and Kieran [Kieran, C., Educational Studies in Mathematics 46, 2001, 187-228; Sfard, A., Educational Studies in Mathematics 46, 2001, 13-57; Sfard, A. and Kieran, C., Mind, Culture, and Activity 8, 2001, 42-76] have developed a methodological framework, which aims at characterizing the students' mathematical discourses while they are working…

  4. Analysis of Feedback Processes in Online Group Interaction: A Methodological Model

    Science.gov (United States)

    Espasa, Anna; Guasch, Teresa; Alvarez, Ibis M.

    2013-01-01

    The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be the…

  5. Computer Aided Methodology for Simultaneous Synthesis, Design & Analysis of Chemical Products-Processes

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2006-01-01

    together with their corresponding flowsheet property models. To represent the process flowsheets in the same way as molecules, a unique but simple notation system has been developed. The methodology has been converted into a prototype software, which has been tested with several case studies covering...

  6. Abort Trigger False Positive and False Negative Analysis Methodology for Threshold-Based Abort Detection

    Science.gov (United States)

    Melcher, Kevin J.; Cruz, Jose A.; Johnson, Stephen B.; Lo, Yunnhon

    2015-01-01

    This paper describes a quantitative methodology for bounding the false positive (FP) and false negative (FN) probabilities associated with a human-rated launch vehicle abort trigger (AT) that includes sensor data qualification (SDQ). In this context, an AT is a hardware and software mechanism designed to detect the existence of a specific abort condition. Also, SDQ is an algorithmic approach used to identify sensor data suspected of being corrupt so that suspect data does not adversely affect an AT's detection capability. The FP and FN methodologies presented here were developed to support estimation of the probabilities of loss of crew and loss of mission for the Space Launch System (SLS) which is being developed by the National Aeronautics and Space Administration (NASA). The paper provides a brief overview of system health management as being an extension of control theory; and describes how ATs and the calculation of FP and FN probabilities relate to this theory. The discussion leads to a detailed presentation of the FP and FN methodology and an example showing how the FP and FN calculations are performed. This detailed presentation includes a methodology for calculating the change in FP and FN probabilities that result from including SDQ in the AT architecture. To avoid proprietary and sensitive data issues, the example incorporates a mixture of open literature and fictitious reliability data. Results presented in the paper demonstrate the effectiveness of the approach in providing quantitative estimates that bound the probability of a FP or FN abort determination.
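
    To illustrate the kind of calculation involved, the sketch below bounds FP and FN probabilities for a simple k-of-n threshold abort trigger with Gaussian sensor noise: FP is the probability that enough sensors exceed the threshold when no abort condition exists, and FN the probability that too few exceed it when one does. This is a toy model with made-up numbers; it is not the SLS methodology and it omits the SDQ contribution described in the paper.

    import math

    def norm_sf(x, mu, sigma):
        """P(X > x) for X ~ Normal(mu, sigma)."""
        return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2.0)))

    def binom_tail(n, k, p):
        """P(at least k successes in n independent trials of probability p)."""
        return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    def abort_trigger_fp_fn(threshold, nominal_mu, abort_mu, sigma, n_sensors=3, votes=2):
        """FP/FN probabilities for a votes-of-n_sensors threshold trigger with Gaussian noise."""
        p_exceed_nominal = norm_sf(threshold, nominal_mu, sigma)  # single sensor trips with no abort condition
        p_exceed_abort = norm_sf(threshold, abort_mu, sigma)      # single sensor trips during a real abort condition
        fp = binom_tail(n_sensors, votes, p_exceed_nominal)       # trigger fires spuriously
        fn = 1.0 - binom_tail(n_sensors, votes, p_exceed_abort)   # trigger misses a real abort condition
        return fp, fn

    fp, fn = abort_trigger_fp_fn(threshold=105.0, nominal_mu=100.0, abort_mu=115.0, sigma=2.0)
    print(f"FP = {fp:.2e}, FN = {fn:.2e}")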

  7. A Case Study of a Case Study: Analysis of a Robust Qualitative Research Methodology

    Science.gov (United States)

    Snyder, Catherine

    2012-01-01

    A unique multi-part qualitative study methodology is presented from a study which tracked the transformative journeys of four career-changing women from STEM fields into secondary education. The article analyzes the study's use of archived writing, journaling, participant-generated photography, interviews, member-checking, and reflexive analytical…

  8. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    Science.gov (United States)

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  9. The NetLogger Methodology for High Performance Distributed Systems Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tierney, Brian; Johnston, William; Crowley, Brian; Hoo, Gary; Brooks, Chris; Gunter, Dan

    1999-12-23

    The authors describe a methodology that enables the real-time diagnosis of performance problems in complex high-performance distributed systems. The methodology includes tools for generating precision event logs that can be used to provide detailed end-to-end application and system level monitoring; a Java agent-based system for managing the large amount of logging data; and tools for visualizing the log data and real-time state of the distributed system. The authors developed these tools for analyzing a high-performance distributed system centered around the transfer of large amounts of data at high speeds from a distributed storage server to a remote visualization client. However, this methodology should be generally applicable to any distributed system. This methodology, called NetLogger, has proven invaluable for diagnosing problems in networks and in distributed systems code. This approach is novel in that it combines network, host, and application-level monitoring, providing a complete view of the entire system.

  10. Efficiency of bimaxillary advancement surgery in increasing the volume of the upper airways: a systematic review of observational studies and meta-analysis.

    Science.gov (United States)

    Rosário, Henrique Damian; Oliveira, Gustavo Mussi Stefan; Freires, Irlan Almeida; de Souza Matos, Felipe; Paranhos, Luiz Renato

    2017-01-01

    Postsurgical changes of the airways have become a great point of interest because it has been reported that maxillomandibular advancement surgery can improve or eliminate obstructive sleep apnea; however, its treatment effectiveness is still controversial. The purpose of this systematic review and meta-analysis was to assess the effectiveness of maxillomandibular advancement surgery to increase upper airway volume in adults, comparing before and after treatment. Bibliographic searches of observational studies with no restriction of year or language were performed in the electronic databases PubMed, Scopus, ScienceDirect and SciELO for articles published up to April 2015. After verification of duplicate records, 1860 articles were examined. Of these, ten met the eligibility criteria, of which three were excluded for having poor methodological quality. The other seven articles were included in the systematic review and six in the meta-analysis, representing 83 patients. One study whose data were not given in absolute values was excluded from the meta-analysis. The meta-analysis showed a statistically significant difference between the averages of upper airway volume before and after surgery (7.86 cm³ [95% CI: 6.22, 9.49], p = 1.00). Clinical evidence suggests that the upper airway volume is increased after maxillomandibular advancement surgery.
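
    For readers unfamiliar with the pooling step, the sketch below performs a fixed-effect (inverse-variance) meta-analysis of per-study mean differences in airway volume. The per-study values are invented for illustration and do not reproduce the review's data or its choice of meta-analytic model.

    import math

    def pooled_mean_difference(studies):
        """Inverse-variance (fixed-effect) pooling of per-study mean differences.

        studies: list of (mean_difference, standard_error) tuples."""
        weights = [1.0 / se ** 2 for _, se in studies]
        pooled = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
        se_pooled = math.sqrt(1.0 / sum(weights))
        ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
        return pooled, ci

    # hypothetical per-study airway volume increases (cm3) and their standard errors
    studies = [(7.1, 1.2), (8.4, 0.9), (6.9, 1.5), (8.8, 1.1), (7.5, 1.3), (8.0, 1.0)]
    md, (lo, hi) = pooled_mean_difference(studies)
    print(f"Pooled increase: {md:.2f} cm3 (95% CI {lo:.2f} to {hi:.2f})")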

  11. Nuclear Dynamics Consequence Analysis (NDCA) for the Disposal of Spent Nuclear Fuel in an Underground Geologic Repository--Volume 1: Executive Summary

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, L.L.; Wilson, J.R.; Sanchez, L.Z.; Aguilar, R.; Trellue, H.R.; Cochrane, K.; Rath, J.S.

    1998-10-01

    The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).

  12. Content Analysis of the "Journal of Counseling & Development": Volumes 74 to 84

    Science.gov (United States)

    Blancher, Adam T.; Buboltz, Walter C.; Soper, Barlow

    2010-01-01

    A content analysis of the research published in the "Journal of Counseling & Development" ("JCD") was conducted for Volumes 74 (1996) through 84 (2006). Frequency distributions were used to identify the most published authors and their institutional affiliations, as well as some basic characteristics (type of sample, gender, and ethnicity) of the…

  13. Introduction to Subject Indexing; A Programmed Text. Volume One: Subject Analysis and Practical Classification.

    Science.gov (United States)

    Brown, Alan George

    This programed text presents the basic principles and practices of subject indexing--limited to the area of precoordinate indexing. This first of two volumes deals with the subject analysis of documents, primarily at the level of summarization, and the basic elements of translation into classification schemes. The text includes regular self-tests…

  14. Geotechnical Analysis Report for July 2004 - June 2005, Volume 2, Supporting Data

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions LLC

    2006-03-20

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2005. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  15. Waste Isolation Pilot Plant Geotechnical Analysis Report for July 2005 - June 2006, Volume 2, Supporting Data

    Energy Technology Data Exchange (ETDEWEB)

    Washington TRU Solutions LLC

    2007-03-25

    This report is a compilation of geotechnical data presented as plots for each active instrument installed in the underground at the Waste Isolation Pilot Plant (WIPP) through June 30, 2006. A summary of the geotechnical analyses that were performed using the enclosed data is provided in Volume 1 of the Geotechnical Analysis Report (GAR).

  16. SOLVENT-BASED TO WATERBASED ADHESIVE-COATED SUBSTRATE RETROFIT - VOLUME I: COMPARATIVE ANALYSIS

    Science.gov (United States)

    This volume represents the analysis of case study facilities' experience with waterbased adhesive use and retrofit requirements. (NOTE: The coated and laminated substrate manufacturing industry was selected as part of NRMRL'S support of the 33/50 Program because of its significan...

  17. Comparison of gray matter volume and thickness for analysis of cortical changes in Alzheimer's disease

    Science.gov (United States)

    Liu, Jiachao; Li, Ziyi; Chen, Kewei; Yao, Li; Wang, Zhiqun; Li, Kunchen; Guo, Xiaojuan

    2011-03-01

    Gray matter volume and cortical thickness are two indices of concern in brain structure magnetic resonance imaging research. Gray matter volume reflects mixed-measurement information of the cerebral cortex, while cortical thickness reflects only the distance between the inner and outer surfaces of the cerebral cortex. Using Scaled Subprofile Modeling based on Principal Component Analysis (SSM_PCA) and Pearson's Correlation Analysis, this study further provided quantitative comparisons and depicted both global relevance and local relevance to comprehensively investigate morphometrical abnormalities in the cerebral cortex in Alzheimer's disease (AD). Thirteen patients with AD and thirteen age- and gender-matched healthy controls were included in this study. Results showed that factor scores from the first 8 principal components accounted for ~53.38% of the total variance for gray matter volume, and ~50.18% for cortical thickness. Factor scores from the fifth principal component showed significant correlation. In addition, gray matter voxel-based volume was closely related to cortical thickness alterations in most cortical regions, especially in some typically abnormal brain regions such as the insula and the parahippocampal gyrus in AD. These findings suggest that these two measurements are effective indices for understanding the neuropathology in AD. Studies using both gray matter volume and cortical thickness can separate the causes of the discrepancy, provide complementary information and carry out a comprehensive description of the morphological changes of brain structure.
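
    A minimal sketch of the two analysis steps described above, applied to synthetic data: PCA (via SVD) extracts factor scores and explained variance from a subjects-by-regions matrix, and Pearson correlation relates gray matter volume to cortical thickness region by region. The data are random and correlated by construction; this is not the SSM_PCA implementation used in the study.

    import numpy as np

    def pca_factor_scores(data, n_components):
        """PCA via SVD of the column-centred data matrix (subjects x regions)."""
        centred = data - data.mean(axis=0)
        u, s, vt = np.linalg.svd(centred, full_matrices=False)
        scores = u[:, :n_components] * s[:n_components]
        explained = (s ** 2 / np.sum(s ** 2))[:n_components]
        return scores, explained

    rng = np.random.default_rng(42)
    n_subjects, n_regions = 26, 40
    volume = rng.normal(size=(n_subjects, n_regions))
    thickness = 0.6 * volume + 0.8 * rng.normal(size=(n_subjects, n_regions))  # correlated by construction

    vol_scores, vol_var = pca_factor_scores(volume, 8)
    print("variance explained by 8 components (volume):", float(vol_var.sum()))

    # region-wise Pearson correlation between the two morphometric measures
    r_per_region = [np.corrcoef(volume[:, j], thickness[:, j])[0, 1] for j in range(n_regions)]
    print("mean regional correlation:", float(np.mean(r_per_region)))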

  18. Doppler sonography of diabetic feet: Quantitative analysis of blood flow volume

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Young Lan; Kim, Ho Chul; Choi, Chul Soon; Yoon, Dae Young; Han, Dae Hee; Moon, Jeung Hee; Bae, Sang Hoon [Hallym University College of Medicine, Seoul (Korea, Republic of)

    2002-09-15

    To analyze Doppler sonographic findings of diabetic feet by estimating the quantitative blood flow volume and by analyzing the Doppler waveform. Doppler sonography was performed in thirty-four patients (10 diabetic patients with foot ulceration, 14 diabetic patients without ulceration and 10 normal patients as the normal control group) to measure the flow volume of the arteries of the lower extremities (posterior and anterior tibial arteries, and distal femoral artery). Analysis of Doppler waveforms was also done to evaluate the nature of the changed blood flow volume of diabetic patients, and the waveforms were classified into triphasic, biphasic-1, biphasic-2 and monophasic patterns. Flow volume of arteries in diabetic patients with foot ulceration was increased with statistical significance when compared to that of diabetic patients without foot ulceration or that of the normal control group (P<0.05). Analysis of Doppler waveforms revealed that the frequency of the biphasic-2 pattern was significantly higher in diabetic patients than in the normal control group (P<0.05). Doppler sonography in diabetic feet showed increased flow volume and biphasic Doppler waveform, and these findings suggest neuropathy rather than ischemic changes in diabetic feet.

  19. HARDMAN Comparability Analysis Methodology Guide. Volume 3. Requirements Analysis. Step 2 - Manpower Requirements Analysis. Step 3 - Training Resource Requirements Analysis. Step 4 - Personnel Requirements Analysis

    Science.gov (United States)

    1985-04-01

    [OCR fragments of the indirect-cost worksheets: cost elements for 9. Base Operations and 10. Support Costs (A. Training Aids, C. Other), with cost-element codes such as BOOMA, BOMPA, TAOMA, TAMPA, OSCOMA and OSCMPA; for example, TAOMA is defined the same as BOOMA except that [10A:OMA] is substituted for [9:OMA], and TAMPA the same as BOMPA with [10A:MPA] substituted for [9:MPA].]

  20. COMPARATIVE ANALYSIS OF TRAINING METHODOLOGY EFFICIENCY ON THE MOTOR SPHERE OF JUNIOR I DANCERS

    Directory of Open Access Journals (Sweden)

    Grigore Virgil

    2015-10-01

    Full Text Available The purpose of this paper is to highlight the influence of the training methodology on the motor sphere of junior I dancers. This scientific approach involved the organization of an experimental study at the ”Two Step” Club of Bucharest. The research activity was conducted from January 2012 to November 2013, by investigating two groups of athletes, an experimental group and a control group; each group included 12 dancers, aged 12 to 13, corresponding to the sports classification category Junior I. The results of the research show that, thanks to the training methodology applied to the Junior I dancers in the experimental group, these dancers improved the strength of their abdominal and arm muscles, increased the mobility of the spine and coxo-femoral joint, and improved their strength under speed conditions as well.

  1. Coal gasification systems engineering and analysis. Appendix E: Cost estimation and economic evaluation methodology

    Science.gov (United States)

    1980-01-01

    The cost estimation and economic evaluation methodologies presented are consistent with industry practice for assessing capital investment requirements and operating costs of coal conversion systems. All values stated are based on January 1980 dollars with appropriate recognition of the time value of money. Evaluation of project economic feasibility can be considered a two-step process (subject to considerable refinement). First, the costs of the project must be quantified and, second, the price at which the product can be manufactured must be determined. These two major categories are discussed. The summary of methodology is divided into five parts: (1) systems costs, (2) instant plant costs, (3) annual operating costs, (4) escalation and discounting process, and (5) product pricing.
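
    A brief sketch of parts (4) and (5) of the summary, the escalation and discounting process and product pricing: annual costs escalating at a constant rate are discounted to present value, and a first-cut required product price is the ratio of discounted costs to discounted output. All plant figures below are hypothetical and, following the text, expressed in January 1980 dollars.

    def pv_escalating_cost(base_cost, escalation, discount, years):
        """Present value of an annual cost escalating at a constant rate."""
        return sum(base_cost * (1 + escalation) ** t / (1 + discount) ** t
                   for t in range(1, years + 1))

    def required_product_price(capital, annual_om, annual_output, escalation, discount, years):
        """Levelized product price: PV of all costs divided by PV of output."""
        pv_costs = capital + pv_escalating_cost(annual_om, escalation, discount, years)
        pv_output = sum(annual_output / (1 + discount) ** t for t in range(1, years + 1))
        return pv_costs / pv_output

    # hypothetical coal gasification plant: output in Btu/year, price reported per million Btu
    price = required_product_price(capital=1.5e9, annual_om=1.0e8, annual_output=80e12,
                                   escalation=0.06, discount=0.12, years=20)
    print(f"Required gas price: {price * 1e6:.2f} $/MMBtu")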

  2. An Aegean island earthquake protection strategy: an integrated analysis and policy methodology.

    Science.gov (United States)

    Delladetsima, Pavlos Marinos; Dandoulaki, Miranda; Soulakellis, Nikos

    2006-12-01

    Viewing an insular setting as a distinct risk environment, an effort is made here to develop a methodology for identifying core issues related to earthquake risk and disaster protection policy, adjusted to the 'specificities' of such a context. The methodology's point of departure is the inherent condition of the 'island operating as a closed system', requiring an attempt to assess and optimise local capacity (social, political, economic, institutional and technical) to deal with an earthquake emergency. The island is then treated as an 'open system', implying that in the event of a disaster, it should be able to maximise its ability to receive and distribute external aid and to manage effectively population evacuation and inflows/outflows of aid resources. Hence, an appropriate strategic policy approach could be developed by integrating the 'open' and 'closed' system components of an island setting. Three islands from the Aegean Archipelagos in Greece--Chios, Kos and Nissyros--serve as case study areas.

  3. The total scattering atomic pair distribution function: New methodology for nanostructure analysis

    Science.gov (United States)

    Masadeh, Ahmad

    Conventional X-ray diffraction (XRD) methods probe for the presence of long-range order (periodic structure), which is reflected in the Bragg peaks. Local structural deviations or disorder mainly affect the diffuse scattering intensity. In order to obtain structural information about both long-range order and local structural disorder, a technique that takes into account both Bragg and diffuse scattering needs to be employed, such as the atomic pair distribution function (PDF) technique. This work introduces a PDF-based methodology to quantitatively investigate nanostructured materials in general. The introduced methodology can be applied to quantitatively extract structural information about structure, crystallinity level, core/shell size, nanoparticle size, and inhomogeneous internal strain in the measured nanoparticles. This method is generally applicable to the characterization of nano-scale solids, many of which may exhibit complex disorder and strain.

  4. Reliability Analysis of Phased Mission Systems by the Considering the Sensitivity Analysis, Uncertainty and Common Cause Failure Analysis using the GO-FLOW Methodology

    Directory of Open Access Journals (Sweden)

    Muhammad Hashim

    2013-04-01

    Full Text Available Reliability is the probability that a device will perform its required function under stated conditions for a specified period of time. Common Cause Failures (CCFs) are multiple failures that have long been recognized (U.S. NRC, 1975) as an important issue in Probabilistic Safety Assessment (PSA), and uncertainty and sensitivity analyses provide important information for the evaluation of system reliability. In this study, two cases have been considered. In the first case, the authors analyse the reliability of a PWR safety system by the GO-FLOW methodology as an alternative to Fault Tree Analysis and Event Tree Analysis, because it is a success-oriented system analysis technique and makes it comparatively easy to conduct the reliability analysis of complex systems. In the second case, sensitivity analysis is carried out in order to prioritize the parameters with the largest contribution to system reliability, together with common cause failure analysis and uncertainty analysis. As an example of a phased mission system, the PWR containment spray system has been considered.
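
    To make the common cause failure treatment concrete, the sketch below evaluates the unavailability of a redundant standby system with a beta-factor CCF model and varies beta as a one-at-a-time sensitivity check. The two-train configuration, failure probability and beta values are illustrative assumptions, not the GO-FLOW model of the containment spray system analysed in the article.

    from math import comb

    def redundant_unavailability(q_train, beta, n_trains, k_required=1):
        """Unavailability of an n-train system with a beta-factor common cause model.

        q_train    : total failure probability of one train
        beta       : fraction of q_train attributed to common cause (fails all trains)
        k_required : number of trains that must work for system success"""
        q_ind = (1 - beta) * q_train
        q_ccf = beta * q_train
        # probability that independent failures leave fewer than k_required trains working
        p_indep = sum(comb(n_trains, m) * q_ind ** m * (1 - q_ind) ** (n_trains - m)
                      for m in range(n_trains - k_required + 1, n_trains + 1))
        return q_ccf + (1 - q_ccf) * p_indep

    # hypothetical 2-train spray system with a 1-of-2 success criterion
    for beta in (0.01, 0.05, 0.10):
        print(f"beta = {beta:.2f}: unavailability = {redundant_unavailability(1e-2, beta, 2):.2e}")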

  5. A Novel Water Supply Network Sectorization Methodology Based on a Complete Economic Analysis, Including Uncertainties

    Directory of Open Access Journals (Sweden)

    Enrique Campbell

    2016-04-01

    Full Text Available The core idea behind sectorization of Water Supply Networks (WSNs) is to establish areas partially isolated from the rest of the network to improve operational control. Besides the benefits associated with sectorization, some drawbacks must be taken into consideration by water operators: the economic investment associated with both boundary valves and flowmeters and the reduction of both pressure and system resilience. The target of sectorization is to properly balance these negative and positive aspects. Sectorization methodologies addressing the economic aspects mainly consider costs of valves and flowmeters and of energy, and the benefits in terms of water saving linked to pressure reduction. However, sectorization entails other benefits, such as the reduction of domestic consumption, the reduction of burst frequency and the enhanced capacity to detect and intervene over future leakage events. We implement a development proposed by the International Water Association (IWA) to estimate the aforementioned benefits. Such a development is integrated into a novel sectorization methodology based on a social network community detection algorithm, combined with a genetic algorithm optimization method and Monte Carlo simulation. The methodology is implemented over a fraction of the WSN of Managua city, capital of Nicaragua, generating a net benefit of 25,572 $/year.
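
    The sketch below illustrates the Monte Carlo part of the economic analysis: uncertain benefit terms (water savings, avoided burst repairs) and cost terms (hardware, energy) are sampled and combined into a distribution of annual net benefit. The distributions and unit values are placeholders and are unrelated to the Managua figures reported above.

    import random

    def annual_net_benefit(n_samples=10000, seed=1):
        """Monte Carlo estimate of the annual net benefit of sectorizing a network."""
        random.seed(seed)
        results = []
        for _ in range(n_samples):
            water_saved = random.triangular(20_000, 60_000, 35_000)    # m3/year from pressure reduction
            bursts_avoided = random.triangular(5, 25, 12)              # repairs avoided per year
            benefit = water_saved * 0.45 + bursts_avoided * 900        # $/year
            hardware = random.uniform(8_000, 15_000)                   # annualized valve/flowmeter cost
            energy_penalty = random.uniform(1_000, 4_000)              # $/year
            results.append(benefit - hardware - energy_penalty)
        results.sort()
        mean = sum(results) / n_samples
        return mean, results[int(0.05 * n_samples)], results[int(0.95 * n_samples)]

    mean, p5, p95 = annual_net_benefit()
    print(f"Net benefit: mean {mean:,.0f} $/year (5th to 95th percentile: {p5:,.0f} to {p95:,.0f})")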

  6. Methodology for Simulation and Analysis of Complex Adaptive Supply Network Structure and Dynamics Using Information Theory

    Directory of Open Access Journals (Sweden)

    Joshua Rodewald

    2016-10-01

    Full Text Available Supply networks existing today in many industries can behave as complex adaptive systems, making them more difficult to analyze and assess. Being able to fully understand both the complex static and dynamic structures of a complex adaptive supply network (CASN) is key to being able to make more informed management decisions and prioritize resources and production throughout the network. Previous efforts to model and analyze CASN have been impeded by the complex, dynamic nature of the systems. However, drawing from other complex adaptive systems sciences, information theory provides a model-free methodology removing many of those barriers, especially concerning complex network structure and dynamics. With minimal information about the network nodes, transfer entropy can be used to reverse engineer the network structure while local transfer entropy can be used to analyze the network structure’s dynamics. Both simulated and real-world networks were analyzed using this methodology. Applying the methodology to CASNs allows the practitioner to capitalize on observations from the highly multidisciplinary field of information theory, which provides insights into CASN’s self-organization, emergence, stability/instability, and distributed computation. This not only provides managers with a more thorough understanding of a system’s structure and dynamics for management purposes, but also opens up research opportunities into eventual strategies to monitor and manage emergence and adaptation within the environment.
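
    A minimal sketch of the information-theoretic core of the methodology: discrete transfer entropy with history length one, estimated by binning two time series and counting joint occurrences. On the toy signals below, where node B lags node A, TE(A -> B) should clearly exceed TE(B -> A); a real CASN analysis would use longer histories and local transfer entropy.

    import numpy as np
    from collections import Counter

    def transfer_entropy(source, target, bins=2):
        """Discrete transfer entropy T(source -> target), history length 1, in bits."""
        s = np.digitize(source, np.quantile(source, np.linspace(0, 1, bins + 1)[1:-1]))
        x = np.digitize(target, np.quantile(target, np.linspace(0, 1, bins + 1)[1:-1]))
        triples = Counter(zip(x[1:], x[:-1], s[:-1]))   # (x_{t+1}, x_t, y_t)
        pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_{t+1}, x_t)
        pairs_xy = Counter(zip(x[:-1], s[:-1]))         # (x_t, y_t)
        singles = Counter(x[:-1])                       # x_t
        n = len(x) - 1
        te = 0.0
        for (x1, x0, y0), count in triples.items():
            p_joint = count / n
            p_cond_both = count / pairs_xy[(x0, y0)]            # p(x_{t+1} | x_t, y_t)
            p_cond_self = pairs_xx[(x1, x0)] / singles[x0]      # p(x_{t+1} | x_t)
            te += p_joint * np.log2(p_cond_both / p_cond_self)
        return te

    # toy supply-network signals: node B lags node A, so information should flow A -> B
    rng = np.random.default_rng(0)
    a = rng.normal(size=5000)
    b = np.roll(a, 1) + 0.5 * rng.normal(size=5000)
    print(f"TE(A->B) = {transfer_entropy(a, b):.3f} bits, TE(B->A) = {transfer_entropy(b, a):.3f} bits")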

  7. LINEAR INFRASTRUCTURES THAT CHARACTERIZE A PAST LAND MANAGEMENT: THE MONTAGNOLA SENESE DRY STONE WALLS. A METHODOLOGICAL APPROACH OF ANALYSIS

    Directory of Open Access Journals (Sweden)

    Emanuele Vazzano

    2012-06-01

    Full Text Available The aim of this paper is to highlight the development of a methodology for studying linear infrastructures such as dry stone walls, characteristic of an earlier land management in the Siena countryside. The study area on which this methodology was tested is located in the Site of Community Importance (SCI) “Montagnola Senese”. It was chosen as an example of a historical form of agricultural and forest land management, partly related to the key presence of the above-mentioned artifacts. The methodology was based on the analysis of a historical cadastre and the concurrent construction and updating of a L.I.S. (Land Information System) processed in a GIS environment. In order to compare against the 1825 data on land use, land ownership and parcel boundaries, the current walls were surveyed during fieldwork with handheld GPS equipment. The results show quite a good correspondence between wall lines and cadastral parcel boundary lines, mostly in the woodland. The analysis of the study area shows that the walls were designed to carry out different functions: to mark property boundaries, to enclose fields and defend them from livestock grazing in the woodland, and to subdivide a single land property into different management portions, both as cultivated fields and as woodland.

  8. Development of thermal-hydraulic analysis methodology for multiple modules of water-cooled breeder blanket in fusion DEMO reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Geon-Woo; Lee, Jeong-Hun [Department of Nuclear Engineering, Seoul National University 1 Gwanak-ro, Gwanak-gu, Seoul 151-744 (Korea, Republic of); Cho, Hyoung-Kyu, E-mail: chohk@snu.ac.kr [Department of Nuclear Engineering, Seoul National University 1 Gwanak-ro, Gwanak-gu, Seoul 151-744 (Korea, Republic of); Park, Goon-Cherl [Department of Nuclear Engineering, Seoul National University 1 Gwanak-ro, Gwanak-gu, Seoul 151-744 (Korea, Republic of); Im, Kihak [National Fusion Research Institute, 169-148, Yuseong-gu, Daejeon 305-806 (Korea, Republic of)

    2016-02-15

    Highlights: • A methodology to simulate the K-DEMO blanket system was proposed. • The results were compared with the CFD, to verify the prediction capability of MARS. • 46 Blankets in a single sector in K-DEMO were simulated using MARS-KS. • Supervisor program was devised to handle each blanket module individually. • The calculation results showed the flow rates, pressure drops, and temperatures. - Abstract: According to the conceptual design of the fusion DEMO reactor proposed by the National Fusion Research Institute of Korea, the water-cooled breeding blanket system incorporates a total of 736 blanket modules. The heat flux and neutron wall loading to each blanket module vary along their poloidal direction, and hence, thermal analysis for at least one blanket sector is required to confirm that the temperature limitations of the materials are satisfied in all the blanket modules. The present paper proposes a methodology of thermal analysis for multiple modules of the blanket system using a nuclear reactor thermal-hydraulic analysis code, MARS-KS. In order to overcome the limitations of the code, caused by the restriction on the number of computational nodes, a supervisor program was devised, which handles each blanket module separately at first, and then corrects the flow rate, considering pressure drops that occur in each module. For a feasibility test of the proposed methodology, 46 blankets in a single sector were simulated; the calculation results of the parameters, such as mass flow, pressure drops, and temperature distribution in the multiple blanket modules showed that the multi-module analysis method can be used for efficient thermal-hydraulic analysis of the fusion DEMO reactor.

  9. Study for the optimization of a transport aircraft wing for maximum fuel efficiency. Volume 1: Methodology, criteria, aeroelastic model definition and results

    Science.gov (United States)

    Radovcich, N. A.; Dreim, D.; Okeefe, D. A.; Linner, L.; Pathak, S. K.; Reaser, J. S.; Richardson, D.; Sweers, J.; Conner, F.

    1985-01-01

    Work performed in the design of a transport aircraft wing for maximum fuel efficiency is documented with emphasis on design criteria, design methodology, and three design configurations. The design database includes complete finite element model description, sizing data, geometry data, loads data, and inertial data. A design process which satisfies the economics and practical aspects of a real design is illustrated. The cooperative study relationship between the contractor and NASA during the course of the contract is also discussed.

  10. Leukocyte telomere length and hippocampus volume: a meta-analysis [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Gustav Nilsonne

    2015-10-01

    Full Text Available Leukocyte telomere length has been shown to correlate to hippocampus volume, but effect estimates differ in magnitude and are not uniformly positive. This study aimed primarily to investigate the relationship between leukocyte telomere length and hippocampus gray matter volume by meta-analysis and secondarily to investigate possible effect moderators. Five studies were included with a total of 2107 participants, of which 1960 were contributed by one single influential study. A random-effects meta-analysis estimated the effect to r = 0.12 [95% CI -0.13, 0.37] in the presence of heterogeneity and a subjectively estimated moderate to high risk of bias. There was no evidence that apolipoprotein E (APOE) genotype was an effect moderator, nor that the ratio of leukocyte telomerase activity to telomere length was a better predictor than leukocyte telomere length for hippocampus volume. This meta-analysis, while not proving a positive relationship, also is not able to disprove the earlier finding of a positive correlation in the one large study included in the analyses. We propose that a relationship between leukocyte telomere length and hippocampus volume may be mediated by transmigrating monocytes which differentiate into microglia in the brain parenchyma.
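A random-effects pooling of correlations, as used in the record above, can be sketched with Fisher's z transform and the DerSimonian-Laird estimate of between-study variance. The per-study correlations and sample sizes below are invented for illustration and are not the meta-analysis data.

```python
import numpy as np

def random_effects_meta_correlation(r, n):
    """DerSimonian-Laird random-effects pooling of correlations via Fisher's z.

    `r` and `n` are per-study correlations and sample sizes; returns the pooled
    correlation and its 95% confidence interval on the original r scale.
    """
    r, n = np.asarray(r, float), np.asarray(n, float)
    z = np.arctanh(r)                  # Fisher z-transformed effect sizes
    v = 1.0 / (n - 3.0)                # within-study variance of z
    w = 1.0 / v

    z_fixed = np.sum(w * z) / np.sum(w)
    q = np.sum(w * (z - z_fixed) ** 2)                 # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(z) - 1)) / c)            # between-study variance

    w_star = 1.0 / (v + tau2)
    z_re = np.sum(w_star * z) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    ci = (np.tanh(z_re - 1.96 * se), np.tanh(z_re + 1.96 * se))
    return np.tanh(z_re), ci

# Invented per-study correlations and sample sizes (one large study dominating),
# not the values from the meta-analysis above.
print(random_effects_meta_correlation([0.12, 0.05, 0.30, -0.08, 0.15],
                                      [1960, 50, 40, 30, 27]))
```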

  11. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small-volume samples. Pulsed-dc-ESI uses a constant high voltage to remotely induce the generation of a single-polarity pulsed electrospray. The method significantly boosts sample economy, providing several minutes of MS signal from merely picoliters of sample. The elongated MS signal duration enables abundant MS(2) information to be collected on components of interest in a small-volume sample for systematic analysis. The method was successfully applied to single-cell metabolomics analysis. We obtained 2-D profiles of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells, covering 1034 components for Allium cepa and 656 components for HeLa cells, respectively. Further identification revealed 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples.

  12. Improving the clinical correlation of multiple sclerosis black hole volume change by paired-scan analysis.

    Science.gov (United States)

    Tam, Roger C; Traboulsee, Anthony; Riddehough, Andrew; Li, David K B

    2012-01-01

    The change in T1-hypointense lesion ("black hole") volume is an important marker of pathological progression in multiple sclerosis (MS). Black hole boundaries often have low contrast and are difficult to determine accurately, so most (semi-)automated segmentation methods first compute the T2-hyperintense lesions, which are a superset of the black holes and are typically more distinct, to form a search space for the T1w lesions. Two main potential sources of measurement noise in longitudinal black hole volume computation are partial volume and variability in the T2w lesion segmentation. A paired analysis approach is proposed herein that uses registration to equalize partial volume and lesion mask processing to combine T2w lesion segmentations across time. The scans of 247 MS patients are used to compare a selected black hole computation method with an enhanced version incorporating paired analysis, using rank correlation to a clinical variable (MS functional composite) as the primary outcome measure. The comparison is done at nine different levels of intensity, as a previous study suggests that darker black holes may yield stronger correlations. The results demonstrate that paired analysis can strongly improve longitudinal correlation (from -0.148 to -0.303 in this sample) and may produce segmentations that are more sensitive to clinically relevant changes.

  13. Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 2: Data

    Science.gov (United States)

    Waszak, M. R.; Schmidt, D. K.

    1985-01-01

    Two analysis methods are applied to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first type of analysis is an open loop modal analysis technique. This method considers the effect of modal residue magnitudes on determining vehicle handling qualities. The second method is a pilot-in-the-loop analysis procedure that considers several closed loop system characteristics. Both analyses indicated that dynamic aeroelastic effects caused a degradation in vehicle tracking performance, based on the evaluation of some simulation results. Volume 2 presents the state variable models of the flexible aircraft configurations used in the analysis applications, mode shape plots for the structural modes, numerical results from the modal analysis, frequency response plots from the pilot-in-the-loop analysis, and a listing of the modal analysis computer program.

  14. Sequential flow injection analysis system on-line coupled to high intensity focused ultrasound: green methodology for trace analysis applications as demonstrated for the determination of inorganic and total mercury in waters and urine by cold vapor atomic absorption spectrometry.

    Science.gov (United States)

    Fernandez, C; Conceição, Antonio C L; Rial-Otero, R; Vaz, C; Capelo, J L

    2006-04-15

    A new concept is presented for green analytical applications based on coupling on-line high-intensity focused ultrasound (HIFU) with a sequential injection/flow injection analysis (SIA/FIA) system. The potential of the SIA/HIFU/FIA scheme is demonstrated by taking mercury as a model analyte. Using inorganic mercury, methylmercury, phenylmercury, and diphenylmercury, the usefulness of the proposed methodology for the determination of inorganic and total mercury in waters and urine was demonstrated. The procedure requires low sample volumes and reagents and can be further applied to all chemical reactions involving HIFU. The inherent advantages of SIA, FIA, and HIFU applications in terms of high throughput, automation, low reagent consumption, and green chemistry are accomplished together for the first time in the present work.

  15. Laser Raman spectroscopic analysis of polymorphic forms in microliter fluid volumes.

    Science.gov (United States)

    Anquetil, Patrick A; Brenan, Colin J H; Marcolli, Claudia; Hunter, Ian W

    2003-01-01

    Knowledge and control of the polymorphic phase of chemical compounds are important aspects of drug development in the pharmaceutical industry. We report herein in situ and real-time Raman spectroscopic polymorphic analysis of optically trapped microcrystals in a microliter volume format. The system studied in particular was the recrystallization of carbamazepine (CBZ) in methanol. Raman spectrometry enabled noninvasive measurement of the amount of dissolved CBZ in a sample as well as polymorphic characterization, whereas exclusive recrystallization of either CBZ form I or CBZ form III from saturated solutions was achieved by specific selection of sample cell cooling profiles. Additionally, using a microcell versus a macroscopic volume gives the advantage of reaching equilibrium much faster while using little compound quantity. We demonstrate that laser Raman spectral polymorphic analysis in a microliter cell is a potentially viable screening platform for polymorphic analysis and could lead to a new high throughput method for polymorph screening.

  16. Methodology of complexity analysis of Emergency Operating Procedures for Nuclear Power Plants; Metodologia de analisis de complejidad de Procedimientos de Operacion de Emergencia de Centrales Nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Martorell, P.; Martorell, S.; Marton, I.; Pelayo, F.; Mendizabal, R.

    2013-07-01

    The Emergency Operating Procedures (EOPs) set out the stages and contain the actions to be executed by an operator to respond to an emergency situation. Methodologies are being developed to assess aspects such as the complexity, completeness and vulnerability of these procedures. This paper presents a methodology to represent an EOP as a network topology and to analyze its complexity as a fundamental attribute.

  17. A Tool and Methodology for AC-Stability Analysis of Continuous-Time Closed-Loop Systems

    CERN Document Server

    Milev, Momchil

    2011-01-01

    Presented are a methodology and a DFII-based tool for AC-stability analysis of a wide variety of closed-loop continuous-time circuits (operational amplifiers and other linear circuits). The methodology used allows for easy identification and diagnostics of AC-stability problems, including not only main-loop effects but also local-instability loops in current mirrors, bias circuits and emitter or source followers, without breaking the loop. The results of the analysis are easy to interpret. Estimated phase margin is readily available. Instability nodes and loops, along with their respective oscillation frequencies, are immediately identified and mapped to the existing circuit nodes, thus offering significant advantages compared to traditional "black-box" methods of stability analysis (transient overshoot, Bode and phase margin plots, etc.). The tool for AC-stability analysis is written in SKILL and is fully integrated in the DFII environment. Its "push-button" graphical user interface (GUI) is easy to use and understand.

  18. Development of Thermal-hydraulic Analysis Methodology for Multi-module Breeding Blankets in K-DEMO

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Geon-Woo; Lee, Jeong-Hun; Park, Goon-Cherl; Cho, Hyoung-Kyu [Seoul National University, Seoul (Korea, Republic of); Im, Kihak [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    In this paper, the purpose of the analyses is to extend the capability of MARS-KS to the entire blanket system, which includes a few hundred single blanket modules. Afterwards, the plan for the whole blanket system analysis using MARS-KS is introduced and the result of the multiple blanket module analysis is summarized. A thermal-hydraulic analysis code for nuclear reactor safety, MARS-KS, was applied to the conceptual design of the K-DEMO breeding blanket thermal analysis. Then, a methodology to simulate multiple blanket modules was proposed, which uses a supervisor program to handle each blanket module individually at first and then distribute the flow rate considering the pressure drops arising in each module. For a feasibility test of the proposed methodology, 10 outboard blankets in a toroidal field sector were simulated, which are connected with each other through common inlet and outlet headers. The calculation results of flow rates, pressure drops, and temperatures showed the validity of the calculation, and thanks to parallelization using MPI, an almost linear speed-up could be obtained.

  19. Nonlinear bivariate dependency of price-volume relationships in agricultural commodity futures markets: A perspective from Multifractal Detrended Cross-Correlation Analysis

    Science.gov (United States)

    He, Ling-Yun; Chen, Shu-Peng

    2011-01-01

    Nonlinear dependency between characteristic financial and commodity market quantities (variables) is crucially important, especially between trading volume and market price. Studies on nonlinear dependency between price and volume can provide practical insights into market trading characteristics, as well as a theoretical understanding of market dynamics. Nonlinear dependency and its underlying dynamical mechanisms between price and volume can help researchers and technical analysts understand market dynamics by integrating the market variables, instead of investigating them individually, as in the current literature. Therefore, to investigate the nonlinear dependency of price-volume relationships in agricultural commodity futures markets in China and the US, we perform a new statistical test to detect cross-correlations and apply a new methodology called Multifractal Detrended Cross-Correlation Analysis (MF-DCCA), which is an efficient algorithm for analyzing two spatially or temporally correlated time series. We discuss theoretically the relationship between the bivariate cross-correlation exponent and the generalized Hurst exponents of the time series of the respective variables. We also perform an empirical study and find that there exists a power-law cross-correlation between them, and that multifractal features are significant in all the analyzed agricultural commodity futures markets.
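For readers unfamiliar with MF-DCCA, the sketch below implements only its q = 2 special case (plain detrended cross-correlation analysis) to show how the bivariate scaling exponent is obtained from window-wise detrended covariances; the multifractal generalization over other q values is omitted, and the correlated random walks are synthetic stand-ins for price and volume series.

```python
import numpy as np

def dcca_exponent(x, y, scales):
    """Bivariate scaling exponent via DCCA (the q = 2 case of MF-DCCA).

    Profiles of both demeaned series are split into non-overlapping windows of
    size s; a linear trend is removed in each window and the detrended
    covariance is averaged. The slope of log F(s) against log s is returned.
    """
    X = np.cumsum(np.asarray(x, float) - np.mean(x))
    Y = np.cumsum(np.asarray(y, float) - np.mean(y))
    fs = []
    for s in scales:
        t = np.arange(s)
        cov = []
        for i in range(len(X) // s):
            seg = slice(i * s, (i + 1) * s)
            res_x = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)
            res_y = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
            cov.append(np.mean(res_x * res_y))
        fs.append(np.sqrt(np.abs(np.mean(cov))))
    slope, _ = np.polyfit(np.log(scales), np.log(fs), 1)
    return slope

# Two correlated random walks as synthetic stand-ins for price and volume series;
# uncorrelated Gaussian increments should give an exponent near 0.5.
rng = np.random.default_rng(1)
common = rng.normal(size=4000)
price = common + 0.5 * rng.normal(size=4000)
volume = common + 0.5 * rng.normal(size=4000)
print(dcca_exponent(price, volume, scales=[16, 32, 64, 128, 256]))
```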

  20. Development of a Reliable Fuel Depletion Methodology for the HTR-10 Spent Fuel Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Kiwhan [Los Alamos National Laboratory; Beddingfield, David H. [Los Alamos National Laboratory; Geist, William H. [Los Alamos National Laboratory; Lee, Sang-Yoon [unaffiliated

    2012-07-03

    A technical working group was formed in 2007 between NNSA and CAEA to develop a reliable fuel depletion method for HTR-10 based on MCNPX and to analyze the isotopic inventory and radiation source terms of the HTR-10 spent fuel. Conclusions of this presentation are: (1) Established a fuel depletion methodology and demonstrated its safeguards application; (2) Proliferation resistant at high discharge burnup (approximately 80 GWD/MtHM) - unfavorable isotopics, high number of pebbles needed, harder to reprocess pebbles; (3) SF should remain under safeguards comparable to that of LWR; and (4) Diversion scenarios not considered, but can be performed.

  1. Perfusion analysis using a wide coverage flat-panel volume CT: feasibility study

    Science.gov (United States)

    Grasruck, M.; Gupta, R.; Reichardt, B.; Klotz, E.; Schmidt, B.; Flohr, T.

    2007-03-01

    We developed a flat-panel detector based Volume CT (VCT) prototype scanner with large z-coverage. In this prototype scanner, a Varian 4030CB a-Si flat-panel detector was mounted in a multi-slice CT gantry (Siemens Medical Solutions), which provides a 25 cm field of view with 18 cm z-coverage at isocenter. The large volume covered in one rotation can be used for visualization of complete organs of small animals, e.g., rabbits. By implementing a mode with continuous scanning, we are able to reconstruct the complete volume at any point in time during the propagation of a contrast bolus. Multiple volumetric reconstructions over time elucidate the first-pass dynamics of a contrast bolus, resulting in 4-D angiography and potentially allowing whole-organ perfusion analysis. We studied to what extent pixel-based permeability and blood volume calculation with a modified Patlak approach is possible. Experimental validation was performed by imaging the evolution of a contrast bolus in New Zealand rabbits. Despite the short circulation time of a rabbit, the temporal resolution was sufficient to visually resolve the various phases of the first pass of the contrast bolus. Perfusion imaging required substantial spatial smoothing but allowed a qualitative discrimination of different types of parenchyma in brain and liver. Whether a true quantitative analysis is possible requires further study.
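The modified Patlak approach mentioned above fits a straight line to transformed tissue and arterial curves. A minimal sketch of a standard Patlak fit is shown below, under the usual model assumptions; the gamma-variate input function and the chosen Ki and v_b values are synthetic, not scanner data.

```python
import numpy as np

def patlak_fit(tissue, arterial, times):
    """Patlak fit: slope ~ permeability (Ki), intercept ~ blood volume fraction (v_b).

    Under the Patlak model C_t(t)/C_a(t) = Ki * (integral of C_a dt)/C_a(t) + v_b,
    so a straight-line fit of y = C_t/C_a against x = (cumulative C_a)/C_a yields
    Ki and v_b for each pixel or region.
    """
    tissue, arterial, times = (np.asarray(a, float) for a in (tissue, arterial, times))
    cum = np.concatenate(([0.0],
                          np.cumsum(0.5 * (arterial[1:] + arterial[:-1]) * np.diff(times))))
    x = cum / arterial
    y = tissue / arterial
    ki, vb = np.polyfit(x, y, 1)
    return ki, vb

# Synthetic curves built from a known Ki and v_b (not scanner data); the fit
# should recover approximately (0.02, 0.05).
t = np.linspace(1.0, 60.0, 60)                  # seconds
ca = t ** 2 * np.exp(-t / 8.0)                  # gamma-variate arterial input
cum_ca = np.concatenate(([0.0], np.cumsum(0.5 * (ca[1:] + ca[:-1]) * np.diff(t))))
ct = 0.02 * cum_ca + 0.05 * ca
print(patlak_fit(ct, ca, t))
```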

  2. Thermal Hydraulics Design and Analysis Methodology for a Solid-Core Nuclear Thermal Rocket Engine Thrust Chamber

    Science.gov (United States)

    Wang, Ten-See; Canabal, Francisco; Chen, Yen-Sen; Cheng, Gary; Ito, Yasushi

    2013-01-01

    Nuclear thermal propulsion is a leading candidate for in-space propulsion for human Mars missions. This chapter describes a thermal hydraulics design and analysis methodology developed at the NASA Marshall Space Flight Center in support of the nuclear thermal propulsion development effort. The objective of this campaign is to bridge the design methods of the Rover/NERVA era with a modern computational fluid dynamics and heat transfer methodology, to predict the thermal, fluid, and hydrogen environments of a hypothetical solid-core nuclear thermal engine, the Small Engine, designed in the 1960s. The computational methodology is based on an unstructured-grid, pressure-based, all-speeds, chemically reacting, computational fluid dynamics and heat transfer platform, while formulations of flow and heat transfer through porous and solid media were implemented to describe those of the hydrogen flow channels inside the solid core. Design analyses of a single flow element and the entire solid-core thrust chamber of the Small Engine were performed and the results are presented herein.

  3. Understanding Scientific Methodology in the Historical and Experimental Sciences via Language Analysis

    Science.gov (United States)

    Dodick, Jeff; Argamon, Shlomo; Chase, Paul

    2009-08-01

    A key focus of current science education reforms involves developing inquiry-based learning materials. However, without an understanding of how working scientists actually do science, such learning materials cannot be properly developed. Until now, research on scientific reasoning has focused on cognitive studies of individual scientific fields. However, the question remains as to whether scientists in different fields fundamentally rely on different methodologies. Although many philosophers and historians of science do indeed assert that there is no single monolithic scientific method, this has never been tested empirically. We therefore approach this problem by analyzing patterns of language used by scientists in their published work. Our results demonstrate systematic variation in language use between types of science that are thought to differ in their characteristic methodologies. The features of language use that were found correspond closely to a proposed distinction between Experimental Sciences (e.g., chemistry) and Historical Sciences (e.g., paleontology); thus, different underlying rhetorical and conceptual mechanisms likely operate for scientific reasoning and communication in different contexts.

  4. Uncertainty Determination Methodology, Sampling Maps Generation and Trend Studies with Biomass Thermogravimetric Analysis

    Directory of Open Access Journals (Sweden)

    Joaquín Collazo

    2010-09-01

    Full Text Available This paper investigates a method for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG analysis) for several lignocellulosic materials (ground olive stone, almond shell, pine pellets and oak pellets), completing previous work of the same authors. A comparison has been made between results of TG analysis and prompt analysis. Levels of uncertainty and errors were obtained, demonstrating that properties evaluated by TG analysis were representative of the overall fuel composition, and no correlation between prompt and TG analysis exists. Additionally, a study of trends and time correlations is indicated. These results are particularly interesting for biomass energy applications.

  5. Uncertainty determination methodology, sampling maps generation and trend studies with biomass thermogravimetric analysis.

    Science.gov (United States)

    Pazó, Jose A; Granada, Enrique; Saavedra, Angeles; Eguía, Pablo; Collazo, Joaquín

    2010-09-28

    This paper investigates a method for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG analysis) for several lignocellulosic materials (ground olive stone, almond shell, pine pellets and oak pellets), completing previous work of the same authors. A comparison has been made between results of TG analysis and prompt analysis. Levels of uncertainty and errors were obtained, demonstrating that properties evaluated by TG analysis were representative of the overall fuel composition, and no correlation between prompt and TG analysis exists. Additionally, a study of trends and time correlations is indicated. These results are particularly interesting for biomass energy applications.

  6. Development of chiral methodologies by capillary electrophoresis with ultraviolet and mass spectrometry detection for duloxetine analysis in pharmaceutical formulations.

    Science.gov (United States)

    Sánchez-López, Elena; Montealegre, Cristina; Marina, María Luisa; Crego, Antonio L

    2014-10-10

    Two chiral methodologies were developed by capillary electrophoresis (CE) with UV and mass spectrometry (MS) detection to ensure the quality control of the drug duloxetine, commercialized as a pure enantiomer. Both methods were optimized to achieve a high baseline enantioresolution (Rs > 2) and an acceptable precision in terms of RSD values. The developed methods were validated and applied for the first time to the analysis of four pharmaceutical formulations. The content of R-duloxetine in all these samples was below the detection limit and the amount of S-duloxetine was in good agreement with the labeled content, the results obtained by the two methods not differing significantly (p-values > 0.05).

  7. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies, and the corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and the probabilistic seismic hazard. Using the PSHA computer program, the Cumulative Probability Functions (CPDF) and Probability Functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared to the results from the different input parameter spaces.
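A very reduced sketch of the annual-exceedance calculation at the heart of a PSHA is shown below, assuming a single Poisson source and a lognormal ground-motion model; the rate and ground-motion parameters are illustrative and are not taken from the study.

```python
import numpy as np
from math import erf

def annual_exceedance(pga_levels, source_rate, mean_lnpga, sigma_lnpga):
    """Annual probability that peak ground acceleration exceeds each level.

    Assumes a single seismic source with Poisson rate `source_rate` events/year
    and a lognormal ground-motion model at the site, so that
    P_annual = 1 - exp(-rate * P[PGA > level | event]).
    """
    levels = np.asarray(pga_levels, float)
    z = (np.log(levels) - mean_lnpga) / sigma_lnpga
    p_exceed_given_event = 0.5 * (1.0 - np.vectorize(erf)(z / np.sqrt(2.0)))
    return 1.0 - np.exp(-source_rate * p_exceed_given_event)

# Hazard levels spanning the 0.1 g to 0.99 g range cited in the record;
# the source rate and ground-motion parameters are illustrative only.
levels = np.linspace(0.1, 0.99, 10)
print(annual_exceedance(levels, source_rate=0.2,
                        mean_lnpga=np.log(0.15), sigma_lnpga=0.6))
```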

  8. Stochastic capture zone analysis of an arsenic-contaminated well using the generalized likelihood uncertainty estimator (GLUE) methodology

    Science.gov (United States)

    Morse, Brad S.; Pohll, Greg; Huntington, Justin; Rodriguez Castillo, Ramiro

    2003-06-01

    In 1992, Mexican researchers discovered concentrations of arsenic in excess of World Health Organization (WHO) standards in several municipal wells in the Zimapan Valley of Mexico. This study describes a method to delineate a capture zone for one of the most highly contaminated wells to aid in future well siting. A stochastic approach was used to model the capture zone because of the high level of uncertainty in several input parameters. Two stochastic techniques were performed and compared: "standard" Monte Carlo analysis and the generalized likelihood uncertainty estimator (GLUE) methodology. The GLUE procedure differs from standard Monte Carlo analysis in that it incorporates a goodness of fit (termed a likelihood measure) in evaluating the model. This allows for more information (in this case, head data) to be used in the uncertainty analysis, resulting in smaller prediction uncertainty. Two likelihood measures are tested in this study to determine which is in better agreement with the observed heads. While the standard Monte Carlo approach does not aid in parameter estimation, the GLUE methodology indicates best-fit models when hydraulic conductivity is approximately 10^-6.5 m/s, with vertically isotropic conditions and large quantities of interbasin flow entering the basin. Probabilistic isochrones (capture zone boundaries) are then presented, and as predicted, the GLUE-derived capture zones are significantly smaller in area than those from the standard Monte Carlo approach.
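The GLUE procedure described above can be sketched as Monte Carlo sampling with an informal likelihood that both rejects non-behavioral parameter sets and weights the predictions of the retained ones. The toy one-parameter groundwater-style model, the Nash-Sutcliffe likelihood measure, and the acceptance threshold below are assumptions for illustration only.

```python
import numpy as np

def glue(simulate, observed, prior_sampler, n_samples=5000, threshold=0.9):
    """Minimal GLUE sketch: Monte Carlo sampling with an informal likelihood.

    `simulate(params)` returns (simulated_observations, prediction_of_interest).
    Parameter sets whose Nash-Sutcliffe efficiency exceeds `threshold` are kept
    as 'behavioral'; their predictions are weighted by the likelihood measure
    to give 5/50/95 percent prediction quantiles.
    """
    observed = np.asarray(observed, float)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    weights, preds = [], []
    for _ in range(n_samples):
        sim_obs, pred = simulate(prior_sampler())
        nse = 1.0 - np.sum((observed - sim_obs) ** 2) / ss_tot
        if nse > threshold:                      # behavioral parameter set
            weights.append(nse)
            preds.append(pred)
    weights = np.asarray(weights) / np.sum(weights)
    order = np.argsort(preds)
    cdf = np.cumsum(weights[order])
    return np.interp([0.05, 0.5, 0.95], cdf, np.asarray(preds)[order])

# Toy one-parameter groundwater-style model: heads depend linearly on log K,
# and the prediction of interest is the conductivity itself.
rng = np.random.default_rng(2)
x = np.linspace(0, 1, 20)
obs_heads = 10.0 + (-6.5) * x + rng.normal(0, 0.2, 20)
model = lambda logk: (10.0 + logk * x, 10.0 ** logk)
print(glue(model, obs_heads, lambda: rng.uniform(-8, -5)))
```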

  9. Predicting Nonauditory Adverse Radiation Effects Following Radiosurgery for Vestibular Schwannoma: A Volume and Dosimetric Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hayhurst, Caroline; Monsalves, Eric; Bernstein, Mark; Gentili, Fred [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada); Heydarian, Mostafa; Tsao, May [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Schwartz, Michael [Radiation Oncology Program and Division of Neurosurgery, Sunnybrook Hospital, Toronto (Canada); Prooijen, Monique van [Radiation Medicine Program, Princess Margaret Hospital, Toronto (Canada); Millar, Barbara-Ann; Menard, Cynthia [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Kulkarni, Abhaya V. [Division of Neurosurgery, Hospital for Sick Children, University of Toronto (Canada); Laperriere, Norm [Radiation Oncology Program, Princess Margaret Hospital, Toronto (Canada); Zadeh, Gelareh, E-mail: Gelareh.Zadeh@uhn.on.ca [Gamma Knife Unit, Division of Neurosurgery, University Health Network, Toronto (Canada)

    2012-04-01

    Purpose: To define clinical and dosimetric predictors of nonauditory adverse radiation effects after radiosurgery for vestibular schwannoma treated with a 12 Gy prescription dose. Methods: We retrospectively reviewed our experience of vestibular schwannoma patients treated between September 2005 and December 2009. Two hundred patients were treated at a 12 Gy prescription dose; 80 had complete clinical and radiological follow-up for at least 24 months (median, 28.5 months). All treatment plans were reviewed for target volume and dosimetry characteristics; gradient index; homogeneity index, defined as the maximum dose in the treatment volume divided by the prescription dose; conformity index; brainstem; and trigeminal nerve dose. All adverse radiation effects (ARE) were recorded. Because the intent of our study was to focus on the nonauditory adverse effects, hearing outcome was not evaluated in this study. Results: Twenty-seven (33.8%) patients developed ARE, 5 (6%) developed hydrocephalus, 10 (12.5%) reported new ataxia, 17 (21%) developed trigeminal dysfunction, 3 (3.75%) had facial weakness, and 1 patient developed hemifacial spasm. The development of edema within the pons was significantly associated with ARE (p = 0.001). On multivariate analysis, only target volume is a significant predictor of ARE (p = 0.001). There is a target volume threshold of 5 cm3, above which ARE are more likely. The treatment plan dosimetric characteristics are not associated with ARE, although the maximum dose to the 5th nerve is a significant predictor of trigeminal dysfunction, with a threshold of 9 Gy. The overall 2-year tumor control rate was 96%. Conclusions: Target volume is the most important predictor of adverse radiation effects, and we identified the significant treatment volume threshold to be 5 cm3. We also established through our series that the maximum tolerable dose to the 5th nerve is 9 Gy.

  10. Effects of Intraday Patterns on Analysis of STOCK Market Index and Trading Volume

    Science.gov (United States)

    Choi, Hyung Wooc; Maeng, Seong Eun; Lee, Jae Woo

    We review the stylized properties of the stock market and consider the effects of intraday patterns on the analysis of the time series of the stock index and the trading volume in the Korean stock market. In the stock market, the probability distribution function (pdf) of the return and the volatility follow a power law for the stock index and the change of the volume traded. The volatility of the stock index shows long-time memory, and the autocorrelation function follows a power law. We applied two methods of eliminating the intraday patterns: the intraday pattern of the time series itself, and the intraday pattern of the absolute return for the index or the absolute volume change. We scaled the index and return by the two types of intraday patterns and considered the probability distribution function and the autocorrelation function (ACF) of the time series scaled by the intraday patterns. The cumulative probability distribution function of the returns scaled by the intraday patterns follows a power law, P>(r) ~ r^(-α±), where α± corresponds to the exponents of the positive and negative fat tails. The pdf of the return scaled by the intraday pattern of the absolute return decays much more steeply than that of the return scaled by the intraday pattern of the index itself. The pdf of the volume change also follows a power law for both methods of eliminating the intraday patterns. However, the exponents of the power law at the fat tails do not depend on the intraday patterns. The ACF of the absolute return shows long-time correlation and follows a power law for the scaled index and for the scaled volume. The daily periodicity of the ACF is removed for the time series scaled by the intraday patterns of the absolute return or the absolute volume change.
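A minimal sketch of the two steps discussed above, removing an intraday pattern by scaling and then estimating the fat-tail exponent of P>(r) ~ r^(-α), is given below; the day layout, the choice of a Hill estimator, and the synthetic Student-t returns are assumptions, not the Korean market data.

```python
import numpy as np

def deseasonalize_intraday(returns, bins_per_day):
    """Divide each return by the average absolute return of its intraday time slot.

    Assumes `returns` covers whole trading days laid end to end, with
    `bins_per_day` observations per day (a hypothetical layout).
    """
    r = np.asarray(returns, float).reshape(-1, bins_per_day)
    pattern = np.mean(np.abs(r), axis=0)          # intraday pattern of |return|
    return (r / pattern).ravel()

def tail_exponent(x, tail_fraction=0.05):
    """Hill estimate of alpha in the complementary CDF P(>r) ~ r^(-alpha)."""
    x = np.sort(np.abs(np.asarray(x, float)))[::-1]
    k = max(2, int(len(x) * tail_fraction))
    tail = x[:k]
    return 1.0 / np.mean(np.log(tail[:-1] / tail[-1]))

# Synthetic heavy-tailed returns (Student-t, alpha ~ 3) with an imposed U-shaped
# intraday pattern of 48 bins per day over 250 trading days.
rng = np.random.default_rng(3)
raw = rng.standard_t(df=3, size=250 * 48).reshape(250, 48)
u_shape = 1.0 + 0.5 * np.cos(np.linspace(0.0, 2.0 * np.pi, 48))
returns = (raw * u_shape).ravel()
print(tail_exponent(deseasonalize_intraday(returns, 48)))
```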

  11. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.

  12. HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.; Lessor, D.L.

    1987-09-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs.

  13. GGO nodule volume-preserving nonrigid lung registration using GLCM texture analysis.

    Science.gov (United States)

    Park, Seongjin; Kim, Bohyoung; Lee, Jeongjin; Goo, Jin Mo; Shin, Yeong-Gil

    2011-10-01

    In lung cancer screening, benign and malignant nodules can be classified through nodule growth assessment by the registration and, then, subtraction between follow-up computed tomography scans. During the registration, the volume of nodule regions in the floating image should be preserved, whereas the volume of other regions in the floating image should be aligned to that in the reference image. However, ground glass opacity (GGO) nodules are very elusive to automatically segment due to their inhomogeneous interior. In other words, it is difficult to automatically define the volume-preserving regions of GGO nodules. In this paper, we propose an accurate and fast nonrigid registration method. It applies the volume-preserving constraint to candidate regions of GGO nodules, which are automatically detected by gray-level cooccurrence matrix (GLCM) texture analysis. Considering that GGO nodules can be characterized by their inner inhomogeneity and high intensity, we identify the candidate regions of GGO nodules based on the homogeneity values calculated by the GLCM and the intensity values. Furthermore, we accelerate our nonrigid registration by using Compute Unified Device Architecture (CUDA). In the nonrigid registration process, the computationally expensive procedures of the floating-image transformation and the cost-function calculation are accelerated by using CUDA. The experimental results demonstrated that our method almost perfectly preserves the volume of GGO nodules in the floating image as well as effectively aligns the lung between the reference and floating images. Regarding the computational performance, our CUDA-based method delivers about 20× faster registration than the conventional method. Our method can be successfully applied to a GGO nodule follow-up study and can be extended to the volume-preserving registration and subtraction of specific diseases in other organs (e.g., liver cancer).
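The GLCM homogeneity measure used above to flag candidate GGO regions can be sketched as follows; the quantization depth, the pixel offset, and the thresholding hinted at in the comments are assumptions, and a real pipeline would operate on CT patches rather than the synthetic arrays shown.

```python
import numpy as np

def glcm_homogeneity(patch, levels=16, offset=(0, 1)):
    """Homogeneity of a gray-level co-occurrence matrix for one image patch.

    The patch is quantized to `levels` gray values, co-occurrences are counted
    for the given pixel offset, and homogeneity = sum P(i,j) / (1 + (i-j)^2).
    In a GGO-detection scheme, low homogeneity combined with high mean
    intensity would flag a candidate region (thresholds are assumptions).
    """
    p = patch.astype(float)
    q = np.clip((p - p.min()) / (np.ptp(p) + 1e-9) * levels, 0, levels - 1).astype(int)
    dy, dx = offset
    a = q[:q.shape[0] - dy, :q.shape[1] - dx].ravel()
    b = q[dy:, dx:].ravel()
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (a, b), 1)                 # accumulate co-occurrence counts
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    return np.sum(glcm / (1.0 + (i - j) ** 2))

# A smooth gradient patch scores much higher homogeneity than a noisy one.
rng = np.random.default_rng(5)
smooth = np.tile(np.linspace(0, 1, 32), (32, 1))
noisy = rng.random((32, 32))
print(glcm_homogeneity(smooth), glcm_homogeneity(noisy))
```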

  14. Fluid Vessel Quantity Using Non-invasive PZT Technology Flight Volume Measurements Under Zero G Analysis

    Science.gov (United States)

    Garofalo, Anthony A

    2013-01-01

    The purpose of the project is to perform analysis of data using the Systems Engineering Educational Discovery (SEED) program data from 2011 and 2012 Fluid Vessel Quantity using Non-Invasive PZT Technology flight volume measurements under Zero G conditions (parabolic Plane flight data). Also experimental planning and lab work for future sub-orbital experiments to use the NASA PZT technology for fluid volume measurement. Along with conducting data analysis of flight data, I also did a variety of other tasks. I provided the lab with detailed technical drawings, experimented with 3d printers, made changes to the liquid nitrogen skid schematics, and learned how to weld. I also programmed microcontrollers to interact with various sensors and helped with other things going on around the lab.

  15. Fluid Vessel Quantity using Non-Invasive PZT Technology Flight Volume Measurements Under Zero G Analysis

    Science.gov (United States)

    Garofalo, Anthony A.

    2013-01-01

    The purpose of the project is to perform analysis of data using the Systems Engineering Educational Discovery (SEED) program data from 2011 and 2012 Fluid Vessel Quantity using Non-Invasive PZT Technology flight volume measurements under Zero G conditions (parabolic Plane flight data). Also experimental planning and lab work for future sub-orbital experiments to use the NASA PZT technology for fluid volume measurement. Along with conducting data analysis of flight data, I also did a variety of other tasks. I provided the lab with detailed technical drawings, experimented with 3d printers, made changes to the liquid nitrogen skid schematics, and learned how to weld. I also programmed microcontrollers to interact with various sensors and helped with other things going on around the lab.

  16. FINITE VOLUME NUMERICAL ANALYSIS FOR PARABOLIC EQUATION WITH ROBIN BOUNDARY CONDITION

    Institute of Scientific and Technical Information of China (English)

    Xia Cui

    2005-01-01

    In this paper, a finite volume method on unstructured meshes is studied for a parabolic convection-diffusion problem on an open bounded set of R^d (d = 2 or 3) with a Robin boundary condition. Upwinding approximations are adapted to treat both the convection term and the Robin boundary condition. Starting directly from the formulation of the finite volume scheme, the numerical analysis is carried out. By using several discrete functional analysis techniques, such as summation by parts and discrete norm inequalities, the stability and error estimates of the approximate solution are established, and existence and uniqueness of the approximate solution as well as first-order convergence in the temporal norm and in the L2 and H1 spatial norms are obtained.
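To make the finite volume treatment of a Robin boundary condition concrete, here is a one-dimensional explicit sketch; the paper itself works on unstructured 2-D/3-D meshes with convection, so this is a deliberately reduced illustration with assumed coefficients.

```python
import numpy as np

def fv_parabolic_robin(nx=50, L=1.0, D=1.0, beta=2.0, u_env=0.0, u0=1.0, t_end=0.1):
    """Explicit finite volume solution of u_t = D u_xx on (0, L) with Robin BCs.

    The condition -D du/dn = beta (u - u_env) on both boundaries is imposed as
    a flux on the outer faces of the first and last control volumes; interior
    faces carry the usual two-point diffusive flux. Uniform mesh, explicit Euler.
    """
    dx = L / nx
    dt = 0.4 * dx ** 2 / D                       # satisfies the explicit stability limit
    u = np.full(nx, float(u0))
    t = 0.0
    while t < t_end:
        flux = np.empty(nx + 1)                  # flux in +x direction at each face
        flux[1:-1] = -D * (u[1:] - u[:-1]) / dx  # interior diffusive fluxes
        flux[0] = beta * (u_env - u[0])          # Robin flux at the left face
        flux[-1] = beta * (u[-1] - u_env)        # Robin flux at the right face
        u -= dt / dx * (flux[1:] - flux[:-1])    # finite volume update per cell
        t += dt
    return u

# The slab cools toward u_env through both Robin boundaries.
print(fv_parabolic_robin()[:5])
```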

  17. A methodology for the development of RESTful semantic web services for gene expression analysis

    NARCIS (Netherlands)

    Guardia, Gabriela D.A.; Ferreira Pires, Luis; Vêncio, Ricardo Z.N.; Malmegrim, Kelen C.R.; Farias, de Cléver R.G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In r

  18. Wind power projects in the CDM: Methodologies and tools for baselines, carbon financing and substainability analysis

    DEFF Research Database (Denmark)

    Ringius, L.; Grohnheit, Poul Erik; Nielsen, Lars Henrik;

    2002-01-01

    The report is intended to be a guidance document for project developers, investors, lenders, and CDM host countries involved in wind power projects in the CDM. The report explores in particular those issues that are important in CDM project assessment and development - that is, baseline development, carbon financing, and environmental sustainability. It does not deal in detail with those issues that are routinely covered in a standard wind power project assessment. The report tests, compares, and recommends methodologies for and approaches to baseline development. To present the application … the lowest (0.5496 tCO2/MWh) and the highest emission rate (0.6868 tCO2/MWh) estimated in accordance with these three standardized approaches to baseline development according to the Marrakesh Accord. This difference in emission factors comes about partly as a result of including hydroelectric power…

  19. A methodology for the analysis of a thermal-hydraulic phenomenon investigated in a test facility

    Energy Technology Data Exchange (ETDEWEB)

    D`Auria, F. [Dept. of Mechanical and Nuclear Constructions, Pisa Univ. (Italy); Faluomi, V. [Dept. of Mechanical and Nuclear Constructions, Pisa Univ. (Italy); Aksan, N. [Lab. for Thermal-Hydraulics, Paul Scherrer Inst., Villigen (Switzerland)

    1995-08-01

    A methodology for analysing non-homogeneous sets of experimental data for a selected phenomenon from separate effect test facilities and integral test facilities is presented in this paper. The critical heat flux from the validation matrices was chosen as the phenomenon to be studied; the results obtained in three test facilities are analysed. The method presented is applied for estimating the accuracy with which a thermalhydraulic transient code can predict the critical heat flux in an actual nuclear power plant. (orig.)

  20. Portable laserscan for in-ditch dent profiling and strain analysis: methodology and application development

    Energy Technology Data Exchange (ETDEWEB)

    Arumugam, Udayasankar; Tandon, Samarth; Gao, Ming; Krishnamurthy, Ravi [Blade Energy Partners, Houston, Texas (United States); Hanson, Ben; Rehman, Hamood; Fingerhut, Martin [Applus RTD, Houston, Texas (United States)

    2010-07-01

    Mechanical damage to pipelines has been assessed with two methodologies. The traditional one was a depth-based assessment, whose limitations could result in an underestimation of the dent severity. In more recent years, therefore, operators have preferred to complement this method with the use of strain-based criteria. The data from an ILI caliper tool can be used to calculate strain in order to determine dent severity. After every ILI run, verification of inspection performance is necessary, but this has been a challenge for the industry because of the lack of a unified protocol and other causes. According to a recent study, LaserScan 3D technology provides an accurate profile and is an ideal tool for verification of ILI performance. This paper introduces a portable LaserScan 3D mapping technology to measure dents, alone or associated with other anomalies. It discusses the accuracy and resolution of this technology and its appropriateness for pipelines.

  1. Factor Analysis in Assessing the Research Methodology Quality of Systematic Reviews

    Directory of Open Access Journals (Sweden)

    Andrada Elena URDA-CÎMPEAN

    2011-12-01

    Full Text Available Introduction: Many high-quality systematic reviews available from medical journals, databases and other electronic sources differ in quality and provide different answers to the same question. The literature recommends the use of a checklist-type approach, which overcomes many of the problems associated with measurements. Aim: This study proposes to identify, in a checklist-type approach, the most commonly used factors (from a methodological point of view) in assessing the quality of systematic reviews, and then to reflect the current state of medical writing. We want to analyze the factors' occurrence and/or their development in the text and in the abstract of systematic reviews published in 2011. Methods: The present study randomly selected only free full-text systematic reviews published in 2011, found in PubMed and in the Cochrane Database. The most commonly used factors were identified in the PRISMA statement and in quality measurement tools. Results: The evaluated systematic reviews mentioned or developed several of the factors studied. Only 78% of the papers surveyed used the correct IMRAD format and 59% of them mentioned the sample size used. The correspondence between the content of the paper and its abstract was 54.63% and 51.85% for the two sets of factors, which can lead to poor appreciation of an article when only the abstract is read. Conclusions: Researchers do not properly take into consideration scientific articles and the assessment tools used for quality evaluation. They should place more value on methodological factors that help assess systematic review quality, while journals are the only party that can enforce quality standards in medical writing.

  2. Development and application of an analysis methodology for interpreting ambiguous historical pressure data in the WIPP gas-generation experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Felicione, F. S.

    2006-01-23

    the headspace volume caused by thermal expansion and contraction within the brine and waste. A further effort was directed at recovering useful results from the voluminous archived pressure data. An analytic methodology to do this was developed. This methodology was applied to each archived pressure measurement to nullify temperature and other effects to yield an adjusted pressure, from which gas-generation rates could be calculated. A review of the adjusted-pressure data indicated that generated-gas concentrations among these containers after approximately 3.25 years of test operation ranged from zero to over 17,000 ppm by volume. Four test containers experienced significant gas generation. All test containers that showed evidence of significant gas generation contained carbon-steel in the waste, indicating that corrosion was the predominant source of gas generation.

  3. Effect of varicocelectomy on testis volume and semen parameters in adolescents: a meta-analysis

    Directory of Open Access Journals (Sweden)

    Tie Zhou

    2015-01-01

    Full Text Available Varicocele repair in adolescents remains controversial. Our aim is to identify and combine the clinical trial results published thus far to ascertain the efficacy of varicocelectomy in improving testis volume and semen parameters compared with non-treatment controls. A literature search was performed using Medline, Embase and Web of Science, which included results obtained from meta-analyses, randomized and nonrandomized controlled studies. The study population was adolescents with clinically palpable varicocele with or without testicular asymmetry or abnormal semen parameters. Cases were allocated to treatment and observation groups, and testis volume or semen parameters were adopted as outcome measures. As a result, seven randomized controlled trials (RCTs) and nonrandomized controlled trials studying bilateral testis volume or semen parameters in both treatment and observation groups were identified. Using a random-effects model, the mean difference in testis volume between the treatment group and the observation group was 2.9 ml (95% confidence interval [CI]: 0.6, 5.2; P < 0.05) for the varicocele side and 1.5 ml (95% CI: 0.3, 2.7; P < 0.05) for the healthy side. The random-effects model analysis demonstrated that the mean difference in semen concentration, total semen motility, and normal morphology between the two groups was 13.7 × 10^6 ml^-1 (95% CI: -1.4, 28.8; P = 0.075), 2.5% (95% CI: -3.6, 8.6; P = 0.424), and 2.9% (95% CI: -3.0, 8.7; P = 0.336), respectively. In conclusion, although varicocelectomy significantly improved bilateral testis volume in adolescents with varicocele compared with observation cases, semen parameters did not show any statistically significant difference between the two groups. Well-planned, properly conducted RCTs are needed in order to further confirm the above-mentioned conclusion and to explore whether varicocele repair in adolescents could subsequently improve spontaneous pregnancy rates.

  4. Volume and Direction of the Atlantic Slave Trade, 1650-1870: Estimates by Markov Chain Monte Carlo Analysis

    Directory of Open Access Journals (Sweden)

    Patrick Manning

    2016-03-01

    Full Text Available This article presents methods and results in the application of Markov Chain Monte Carlo (MCMC) analysis to a problem in missing data. The data used here are from The Atlantic Slave Trade Database (TASTD, 2010 version), available online. Of the currently known 35,000 slaving voyages, original data on the size of the cargo of captives exist for some 25 percent of voyage embarkations in Africa and for about 50 percent of arrivals in the Americas. Previous efforts to estimate the missing data (and project the total number of captives who made the transatlantic migration) have proceeded through eclectic projections of maximum likelihood estimates of captives per voyage, without error margins. This paper creates new estimates of total migrant flow through two methods: one is a formally frequentist set of multiple methods, and the other is the Markov Chain Monte Carlo methodology. Comparison of the three methods, all based on the same raw data, shows that the results of the two new methods are fairly close to one another and yield total flows of migrant captives more than 20 percent higher than the previous estimates. Quantitative results, presented in simplified graphs and tables within the text and in detailed spreadsheets available online, provide a new estimate of the volume of African embarkations and American arrivals in the transatlantic slave trade for the period from 1650 to 1870, by decade, for eleven African regions of embarkation and seven American and European regions of arrival.
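The record above combines posterior simulation with imputation of unrecorded voyage cargos. A heavily simplified stand-in is sketched below: per-record counts are modelled as lognormal, parameters are drawn from their posterior under flat priors, and missing records are imputed in each iteration. The counts, the number of missing records, and the lognormal model are all assumptions, not the TASTD data or the paper's model.

```python
import numpy as np

def posterior_total_with_missing(observed, n_missing, n_iter=5000, rng=None):
    """Posterior simulation of a grand total when many records are unobserved.

    Observed per-record counts are modelled as lognormal with unknown mean and
    variance of the log-counts; each iteration draws (variance, mean) from the
    standard posterior under flat priors and then imputes the missing records,
    yielding a posterior distribution of the total.
    """
    rng = rng or np.random.default_rng()
    obs = np.asarray(observed, float)
    logs = np.log(obs)
    n, mean = len(logs), logs.mean()
    ss = np.sum((logs - mean) ** 2)
    totals = []
    for _ in range(n_iter):
        var = 1.0 / rng.gamma((n - 1) / 2.0, 2.0 / ss)      # sigma^2 | data
        mu = rng.normal(mean, np.sqrt(var / n))             # mu | sigma^2, data
        imputed = rng.lognormal(mu, np.sqrt(var), size=n_missing)
        totals.append(obs.sum() + imputed.sum())
    return np.percentile(totals, [2.5, 50, 97.5])

# Hypothetical example: 250 recorded cargos, 750 voyages with no surviving record.
rng = np.random.default_rng(4)
recorded = rng.lognormal(mean=5.5, sigma=0.4, size=250)
print(posterior_total_with_missing(recorded, n_missing=750, rng=rng))
```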

  5. Magnetic resonance velocity imaging derived pressure differential using control volume analysis

    Directory of Open Access Journals (Sweden)

    Cohen Benjamin

    2011-03-01

    Full Text Available Background: Diagnosis and treatment of hydrocephalus is hindered by a lack of systemic understanding of the interrelationships between pressures and flow of cerebrospinal fluid in the brain. Control volume analysis provides a fluid-physics approach to quantify and relate pressure and flow information. The objective of this study was to use control volume analysis and magnetic resonance velocity imaging to non-invasively estimate pressure differentials in vitro. Method: A flow phantom was constructed and water was the experimental fluid. The phantom was connected to a high-resolution differential pressure sensor and a computer-controlled pump producing sinusoidal flow. Magnetic resonance velocity measurements were taken and subsequently analyzed to derive pressure differential waveforms using momentum conservation principles. Independent sensor measurements were obtained for comparison. Results: Using magnetic resonance data, the momentum balance in the phantom was computed. The measured differential pressure force had an amplitude of 14.4 dynes (pressure gradient amplitude 0.30 Pa/cm). A 12.5% normalized root mean square deviation between the derived and directly measured pressure differential was obtained. These experiments demonstrate one example of the potential utility of control volume analysis and the concepts involved in its application. Conclusions: This study validates a non-invasive measurement technique for relating velocity measurements to pressure differential. These methods may be applied to clinical measurements to estimate pressure differentials in vivo which could not be obtained with current clinical sensors.
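Stripped to its simplest form, the momentum-based derivation above relates the pressure differential across a rigid tube to the time derivative of the mean velocity. The sketch below uses that reduced balance (viscous and convective terms dropped); the geometry, fluid density, and sinusoidal waveform are assumptions, not the phantom measurements.

```python
import numpy as np

def pressure_differential(mean_velocity, dt, length, density=1000.0):
    """Pressure differential (Pa) across a rigid tube from a mean-velocity waveform.

    A one-dimensional control volume momentum balance with equal inlet/outlet
    areas and negligible friction reduces to dp = -rho * L * d<u>/dt, where <u>
    is the cross-section mean velocity; convective and viscous terms are dropped.
    """
    dudt = np.gradient(np.asarray(mean_velocity, float), dt)
    return -density * length * dudt

# Sinusoidal flow like the phantom experiment; amplitude and geometry are made up.
t = np.arange(0.0, 2.0, 0.01)                      # s
u_mean = 0.05 * np.sin(2.0 * np.pi * 1.0 * t)      # m/s, 1 Hz waveform
dp = pressure_differential(u_mean, dt=0.01, length=0.1)
print(round(float(dp.max()), 2), "Pa peak pressure differential")
```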

  6. Coal conversion processes and analysis methodologies for synthetic fuels production. [technology assessment and economic analysis of reactor design for coal gasification

    Science.gov (United States)

    1979-01-01

    Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of technology and data needed to improve gasification feasibility and economies are examined.

  7. Scenario development methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Eng, T. [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Hudson, J. [Rock Engineering Consultants, Welwyn Garden City, Herts (United Kingdom); Stephansson, O. [Royal Inst. of Tech., Stockholm (Sweden). Div. of Engineering Geology; Skagius, K.; Wiborgh, M. [Kemakta, Stockholm (Sweden)

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are (a) Event tree analysis, (b) Influence diagrams and (c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs.

  8. Big Cat Coalitions: A Comparative Analysis of Regional Brain Volumes in Felidae

    Science.gov (United States)

    Sakai, Sharleen T.; Arsznov, Bradley M.; Hristova, Ani E.; Yoon, Elise J.; Lundrigan, Barbara L.

    2016-01-01

    Broad-based species comparisons across mammalian orders suggest a number of factors that might influence the evolution of large brains. However, the relationship between these factors and total and regional brain size remains unclear. This study investigated the relationship between relative brain size and regional brain volumes and sociality in 13 felid species in hopes of revealing relationships that are not detected in more inclusive comparative studies. In addition, a more detailed analysis was conducted of four focal species: lions (Panthera leo), leopards (Panthera pardus), cougars (Puma concolor), and cheetahs (Acinonyx jubatus). These species differ markedly in sociality and behavioral flexibility, factors hypothesized to contribute to increased relative brain size and/or frontal cortex size. Lions are the only truly social species, living in prides. Although cheetahs are largely solitary, males often form small groups. Both leopards and cougars are solitary. Of the four species, leopards exhibit the most behavioral flexibility, readily adapting to changing circumstances. Regional brain volumes were analyzed using computed tomography. Skulls (n = 75) were scanned to create three-dimensional virtual endocasts, and regional brain volumes were measured using either sulcal or bony landmarks obtained from the endocasts or skulls. Phylogenetic least squares regression analyses found that sociality does not correspond with larger relative brain size in these species. However, the sociality/solitary variable significantly predicted anterior cerebrum (AC) volume, a region that includes frontal cortex. This latter finding is despite the fact that the two social species in our sample, lions and cheetahs, possess the largest and smallest relative AC volumes, respectively. Additionally, an ANOVA comparing regional brain volumes in four focal species revealed that lions and leopards, while not significantly different from one another, have relatively larger AC volumes

  9. Big Cat Coalitions: A comparative analysis of regional brain volumes in Felidae

    Directory of Open Access Journals (Sweden)

    Sharleen T Sakai

    2016-10-01

    Full Text Available Broad-based species comparisons across mammalian orders suggest a number of factors that might influence the evolution of large brains. However, the relationship between these factors and total and regional brain size remains unclear. This study investigated the relationship between relative brain size and regional brain volumes and sociality in 13 felid species in hopes of revealing relationships that are not detected in more inclusive comparative studies. In addition, a more detailed analysis was conducted of 4 focal species: lions (Panthera leo), leopards (Panthera pardus), cougars (Puma concolor), and cheetahs (Acinonyx jubatus). These species differ markedly in sociality and behavioral flexibility, factors hypothesized to contribute to increased relative brain size and/or frontal cortex size. Lions are the only truly social species, living in prides. Although cheetahs are largely solitary, males often form small groups. Both leopards and cougars are solitary. Of the four species, leopards exhibit the most behavioral flexibility, readily adapting to changing circumstances. Regional brain volumes were analyzed using computed tomography (CT). Skulls (n = 75) were scanned to create three-dimensional virtual endocasts, and regional brain volumes were measured using either sulcal or bony landmarks obtained from the endocasts or skulls. Phylogenetic least squares (PGLS) regression analyses found that sociality does not correspond with larger relative brain size in these species. However, the sociality/solitary variable significantly predicted anterior cerebrum (AC) volume, a region that includes frontal cortex. This latter finding is despite the fact that the two social species in our sample, lions and cheetahs, possess the largest and smallest relative AC volumes, respectively. Additionally, an ANOVA comparing regional brain volumes in 4 focal species revealed that lions and leopards, while not significantly different from one another, have relatively larger AC volumes

  10. Fabrication, testing, and analysis of anisotropic carbon/glass hybrid composites: volume 1: technical report.

    Energy Technology Data Exchange (ETDEWEB)

    Wetzel, Kyle K. (Wetzel Engineering, Inc. Lawrence, Kansas); Hermann, Thomas M. (Wichita state University, Wichita, Kansas); Locke, James (Wichita state University, Wichita, Kansas)

    2005-11-01

    ° from the long axis for approximately two-thirds of the laminate volume (discounting skin layers), with reinforcing carbon fibers oriented axially comprising the remaining one-third of the volume. Finite element analysis of each laminate has been performed to examine first ply failure. Three failure criteria--maximum stress, maximum strain, and Tsai-Wu--have been compared. Failure predicted by all three criteria proves generally conservative, with the stress-based criteria the most conservative. For laminates that respond nonlinearly to loading, large error is observed in the prediction of failure using maximum strain as the criterion. This report documents the methods and results in two volumes. Volume 1 contains descriptions of the laminates, their fabrication and testing, the methods of analysis, the results, and the conclusions and recommendations. Volume 2 contains a comprehensive summary of the individual test results for all laminates.
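    As a small illustration of one of the three first-ply failure criteria compared above, the sketch below evaluates a plane-stress Tsai-Wu failure index for a single ply; the strength values and ply stresses are placeholders, not data from these laminates.

```python
# Hedged sketch of a plane-stress Tsai-Wu first-ply failure check;
# strengths and stresses are illustrative assumptions.
import numpy as np

# Lamina strengths [MPa]: tension/compression in fibre (1) and transverse (2)
# directions, and in-plane shear.
Xt, Xc, Yt, Yc, S = 1500.0, 1200.0, 50.0, 200.0, 70.0

F1  = 1.0 / Xt - 1.0 / Xc
F2  = 1.0 / Yt - 1.0 / Yc
F11 = 1.0 / (Xt * Xc)
F22 = 1.0 / (Yt * Yc)
F66 = 1.0 / S**2
F12 = -0.5 * np.sqrt(F11 * F22)      # common default interaction term

def tsai_wu_index(s1, s2, s6):
    """Return the Tsai-Wu failure index; failure is predicted when it reaches 1."""
    return (F1 * s1 + F2 * s2 + F11 * s1**2 + F22 * s2**2
            + F66 * s6**2 + 2.0 * F12 * s1 * s2)

# Ply stresses [MPa] from, e.g., a finite element solution (illustrative values).
print(tsai_wu_index(800.0, 20.0, 30.0))
```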

  11. Performance Prediction Modelling for Flexible Pavement on Low Volume Roads Using Multiple Linear Regression Analysis

    Directory of Open Access Journals (Sweden)

    C. Makendran

    2015-01-01

    Full Text Available Prediction models for low volume village roads in India are developed to evaluate the progression of different types of distress such as roughness, cracking, and potholes. Even though the Government of India invests a large amount of money in road construction every year, poor control over construction quality and subsequent maintenance leads to faster road deterioration. It is therefore essential that scientific maintenance procedures be developed on the basis of the performance of low volume flexible pavements. Considering the above, an attempt has been made in this research to develop prediction models to understand the progression of roughness, cracking, and potholes in flexible pavements exposed to little or no routine maintenance. Distress data were collected from low volume rural roads covering about 173 stretches spread across Tamil Nadu state in India. Based on the collected data, distress prediction models have been developed using multiple linear regression analysis. Further, the models have been validated using independent field data. It can be concluded that the models developed in this study can serve as useful tools for practicing engineers maintaining flexible pavements on low volume roads.
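    A minimal sketch of the kind of multiple linear regression distress model described above follows; the predictors, data, and coefficients are invented for illustration and do not reproduce the models fitted to the Tamil Nadu data.

```python
# Toy multiple linear regression of a roughness-like distress index on
# assumed predictors (age, cumulative traffic, rainfall).
import numpy as np

# Columns: age [years], cumulative traffic [msa], annual rainfall [m]
X = np.array([[2, 0.3, 0.9],
              [4, 0.7, 1.1],
              [6, 1.2, 0.8],
              [8, 1.8, 1.3],
              [10, 2.5, 1.0]], dtype=float)
roughness = np.array([2400, 3100, 3900, 4800, 5600], dtype=float)  # IRI-like index

X1 = np.column_stack([np.ones(len(X)), X])          # add intercept
coef, *_ = np.linalg.lstsq(X1, roughness, rcond=None)
pred = X1 @ coef
r2 = 1 - np.sum((roughness - pred)**2) / np.sum((roughness - roughness.mean())**2)
print("coefficients:", np.round(coef, 1), " R^2 =", round(r2, 3))
```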

  12. Coupled Structural, Thermal, Phase-Change and Electromagnetic Analysis for Superconductors. Volume 1

    Science.gov (United States)

    Felippa, C. A.; Farhat, C.; Park, K. C.; Militello, C.; Schuler, J. J.

    1996-01-01

    Described are the theoretical development and computer implementation of reliable and efficient methods for the analysis of coupled mechanical problems that involve the interaction of mechanical, thermal, phase-change and electromagnetic subproblems. The focus application has been the modeling of superconductivity and associated quantum-state phase-change phenomena. In support of this objective the work has addressed the following issues: (1) development of variational principles for finite elements, (2) finite element modeling of the electromagnetic problem, (3) coupling of thermal and mechanical effects, and (4) computer implementation and solution of the superconductivity transition problem. The main accomplishments have been: (1) the development of the theory of parametrized and gauged variational principles, (2) the application of those principles to the construction of electromagnetic, thermal and mechanical finite elements, (3) the coupling of electromagnetic finite elements with thermal and superconducting effects, and (4) the first detailed finite element simulations of bulk superconductors, in particular the Meissner effect and the nature of the normal conducting boundary layer. The theoretical development is described in two volumes. This volume, Volume 1, describes mostly formulations for specific problems. Volume 2 describes generalization of those formulations.

  13. New seismic attributes and methodology for automated stratigraphic, structural, and reservoir analysis

    Energy Technology Data Exchange (ETDEWEB)

    Randen, Trygve; Reymond, Benoit; Sjulstad, Hans Ivar; Soenneland, Lars

    1998-12-31

    Seismic stratigraphy represents an attractive framework for interpretation of 3-D data. This presentation is an introduction to a set of primitives that will enable guided interpretation of seismic signals in the framework of seismic stratigraphy. A method capable of automatic detection of terminations is proposed. The new procedure can be run on the entire seismic volume or it may be restricted to a limited time interval and detects terminations in an unguided manner without prior interpretation. The density of terminations can be computed. The procedure may alternatively be guided by pre-existing interpretation, e.g. detecting terminations onto an interpreted horizon. In such a case, the density of terminations will be a new surface attribute. 6 refs., 3 figs.

  14. Independent Orbiter Assessment (IOA): Analysis of the Electrical Power Distribution and Control Subsystem, Volume 2

    Science.gov (United States)

    Schmeckpeper, K. R.

    1987-01-01

    The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA approach features a top-down analysis of the hardware to determine failure modes, criticality, and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. This report documents the independent analysis results corresponding to the Orbiter Electrical Power Distribution and Control (EPD and C) hardware. The EPD and C hardware performs the functions of distributing, sensing, and controlling 28 volt DC power and of inverting, distributing, sensing, and controlling 117 volt 400 Hz AC power to all Orbiter subsystems from the three fuel cells in the Electrical Power Generation (EPG) subsystem. Volume 2 continues the presentation of IOA analysis worksheets and contains the potential critical items list.

  15. EXPLANATORY METHODS OF MARKETING DATA ANALYSIS – THEORETICAL AND METHODOLOGICAL CONSIDERATIONS

    Directory of Open Access Journals (Sweden)

    Rozalia GABOR

    2010-01-01

    Full Text Available Explanatory methods of data analysis – also called supervised learning methods by some authors – enable researchers to identify and analyse configurations of relations between two or several variables, most of them with high accuracy, as it is possible to test statistical significance by calculating the confidence level associated with validating the relation of interest across the entire population and not only the surveyed sample. The paper presents some of these methods, namely variance analysis, covariance analysis, segmentation, and discriminant analysis, noting for each method its area of applicability in marketing research.
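    As a toy illustration of one of the explanatory methods listed above (variance analysis), the following sketch runs a one-way ANOVA comparing mean purchase intent across three hypothetical advertising treatments; the data are made up.

```python
# One-way analysis of variance on invented marketing data.
import numpy as np
from scipy import stats

group_a = np.array([6.1, 5.8, 6.4, 6.0, 5.9])
group_b = np.array([6.9, 7.2, 6.8, 7.0, 7.4])
group_c = np.array([5.2, 5.5, 5.1, 5.4, 5.0])

f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")   # small p: group means differ
```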

  16. Use of PFMEA methodology as a competitive advantage for the analysis of improvements in an experimental procedure

    Directory of Open Access Journals (Sweden)

    Fernando Coelho

    2015-12-01

    Full Text Available The methodology of Failure Modes and Effects Analysis (FMEA), utilized by industries to investigate potential failures, contributes to ensuring the robustness of the project and the manufacturing process, even before production starts. Thus, there is a reduced likelihood of errors and a higher level of efficiency and effectiveness at high productivity. This occurs through the elimination or reduction of production problems. In this context, this study is based on the structured application of PFMEA (Process Failure Mode and Effects Analysis), associated with other quality tools, in a simulation of the assembly of an electro-pneumatic system. This study was performed at the Experimental Laboratory of the Botucatu Technology Faculty (FATEC), with the support of five undergraduate students from the Industrial Production Technology Course. The methodology applied contributed to forecasting 24 potential failures and improvement opportunities and to investigating their causes, proving to be a standard applicable to any production process, with gains in efficiency and effectiveness. Therefore, the final strategy was to evaluate and minimize the potential failures, to reduce production costs and to increase the performance of the process.

  17. Effectiveness of the random sequential absorption algorithm in the analysis of volume elements with nanoplatelets

    DEFF Research Database (Denmark)

    Pontefisso, Alessandro; Zappalorto, Michele; Quaresimin, Marino

    2016-01-01

    In this work, a study of the Random Sequential Absorption (RSA) algorithm in the generation of nanoplatelet Volume Elements (VEs) is carried out. The effect of the algorithm input parameters on the reinforcement distribution is studied through the implementation of statistical tools, showing that the platelet distribution is systematically affected by these parameters. The consequence is that a parametric analysis of the VE input parameters may be biased by hidden differences in the filler distribution. The same statistical tools used in the analysis are implemented in a modified RSA algorithm...
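    A much-simplified sketch of a random sequential absorption step is given below: platelets are idealised as equal circles placed without overlap in a unit cell, and placement stops when a target count or an attempt limit is reached. The radius, target count, and attempt limit are assumptions for illustration, not the parameters studied in the paper.

```python
# Simplified 2-D random sequential absorption (RSA) sketch with circular "platelets".
import numpy as np

rng = np.random.default_rng(1)
radius, target, max_attempts = 0.03, 150, 100_000
centres = []

attempts = 0
while len(centres) < target and attempts < max_attempts:
    attempts += 1
    candidate = rng.uniform(radius, 1.0 - radius, size=2)
    # accept only if the candidate does not overlap any previously placed circle
    if all(np.hypot(*(candidate - c)) >= 2 * radius for c in centres):
        centres.append(candidate)

centres = np.array(centres)
print(f"placed {len(centres)} platelets in {attempts} attempts "
      f"(area fraction {len(centres) * np.pi * radius**2:.2f})")
```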

  18. A control-volume method for analysis of unsteady thrust augmenting ejector flows

    Science.gov (United States)

    Drummond, Colin K.

    1988-01-01

    A method for predicting transient thrust augmenting ejector characteristics is presented. The analysis blends classic self-similar turbulent jet descriptions with a control volume mixing region discretization to capture transient effects in a new way. Division of the ejector into an inlet, diffuser, and mixing region corresponds with the assumption of viscous-dominated phenomena in the latter. Inlet and diffuser analyses are simplified by a quasi-steady treatment, justified by the assumption that pressure is the forcing function in those regions. Details of the theoretical foundation, the solution algorithm, and sample calculations are given.

  19. Methodologies of tissue preservation and analysis of the glycogen content of the broiler chick liver.

    Science.gov (United States)

    Bennett, L W; Keirs, R W; Peebles, E D; Gerard, P D

    2007-12-01

    The current study was performed to develop convenient, rapid, reliable, and pragmatic methodologies by which to harvest and preserve liver tissue glycogen and to analyze its levels within reasonable limits of quantification and with extended chromophore stability. Absorbance values decreased by 2 h and again by 24 h after preparation of the iodine-potassium iodide chromophore, whereas absorbance values of the phenol-sulfuric acid chromophore remained constant over the same time period. These absorbance trends for each chromophore followed full color development within 5 min after combining the analyte with the respective chromophore reagent. Use of the phenol-sulfuric acid reagent allowed for a 10-fold reduction in assay limits of detection and quantification when compared with the iodine-potassium iodide reagent. Furthermore, glycogen concentration-absorbance relationships were affected by the source (i.e., rabbit liver vs. bovine liver) of glycogen standards when the iodine-potassium iodide chromophore was used, but the source of the standards had no influence when the phenol-sulfuric acid chromophore was used. The indifference of the phenol-sulfuric acid method to the glycogen source, as exhibited by similar linear regressions of absorbance, may be attributed to actual determination of glucose subunit concentrations after complete glycogen hydrolysis by sulfuric acid. This is in contrast to the actual measurement of whole glycogen, which may exhibit source- or time-related molecular structural differences. The iodine-potassium iodide methodology is a test of whole glycogen concentrations; therefore, it may be influenced by glycogen structural differences. Liver tissue sample weight (between 0.16 and 0.36 g) and processing, which included mincing, immediate freezing, or refrigeration in 10% perchloric acid for 1 wk prior to tissue grinding, had no effect on glycogen concentrations that were analyzed by using the phenol-sulfuric acid reagent. These results

  20. A regional comparative analysis of empirical and theoretical flood peak-volume relationships

    Directory of Open Access Journals (Sweden)

    Szolgay Ján

    2016-12-01

    Full Text Available This paper analyses the bivariate relationship between flood peaks and corresponding flood event volumes modelled by empirical and theoretical copulas in a regional context, with a focus on flood generation processes in general, the regional differentiation of these processes, and the effect of the sample size on reliable discrimination among models. A total of 72 catchments in the north-west of Austria are analysed for the period 1976–2007. From the hourly runoff data set, 25 697 flood events were isolated and assigned to one of three flood process types: synoptic floods (including long- and short-rain floods), flash floods, or snowmelt floods (both rain-on-snow and snowmelt floods). The first step of the analysis examines whether the empirical peak-volume copulas of different flood process types are regionally statistically distinguishable, separately for each catchment, and what role the sample size plays in the strength of these statements. The results indicate that the empirical copulas of flash floods tend to be different from those of the synoptic and snowmelt floods. The second step examines how similar the empirical flood peak-volume copulas are between catchments for a given flood type across the region. Empirical copulas of synoptic floods are the least similar between the catchments; however, as the sample size decreases, the difference between the performances of the process types becomes small. The third step examines the goodness-of-fit of different commonly used copula types to the data samples that represent the annual maxima of flood peaks and the respective volumes, both regardless of flood generating processes (the traditional engineering approach) and also considering the three process-based classes. Extreme value copulas (Galambos, Gumbel and Hüsler-Reiss) show the best performance both for synoptic and flash floods, while the Frank copula shows the best performance for snowmelt floods. It is concluded that there is merit in treating flood
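    As a hedged illustration of fitting one of the copula families mentioned above, the sketch below estimates a Gumbel copula parameter for synthetic peak-volume pairs by inverting Kendall's tau; the data are not the Austrian runoff data used in the study.

```python
# Fit a Gumbel copula to synthetic flood peak-volume pairs via Kendall's tau.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
peaks = rng.gumbel(loc=50, scale=15, size=200)
volumes = 0.8 * peaks + rng.normal(0, 8, size=200)    # correlated event volumes

tau, _ = stats.kendalltau(peaks, volumes)
theta = 1.0 / (1.0 - tau)                             # Gumbel copula: tau = 1 - 1/theta

def gumbel_copula_cdf(u, v, theta):
    """C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta))."""
    s = (-np.log(u))**theta + (-np.log(v))**theta
    return np.exp(-s**(1.0 / theta))

# Joint non-exceedance probability of the marginal 90th percentiles.
print(f"tau = {tau:.2f}, theta = {theta:.2f}, "
      f"C(0.9, 0.9) = {gumbel_copula_cdf(0.9, 0.9, theta):.3f}")
```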

  1. Linear and volume measurements of pulmonary nodules at different CT dose levels. Intrascan and interscan analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hein, P.A.; Romano, V.C.; Rogalla, P.; Klessen, C.; Lembcke, A.; Bauknecht, H.C. [Charite-Universitaetsmedizin Berlin (Germany). Inst. fuer Radiologie; Dicken, V.; Bornemann, L. [MeVis Research, Bremen (Germany)

    2009-01-15

    Purpose: To compare the interobserver variability of unidimensional diameter and volume measurements of pulmonary nodules in an intrascan and interscan analysis using semi-automated segmentation software on ultra-low-dose computed tomography (ULD-CT) and standard dose CT (SD-CT) data. Materials and Methods: In 33 patients with pulmonary nodules, two chest multi-slice CT (MSCT) datasets (1 mm slice thickness; 20% reconstruction overlap) had been consecutively acquired with an ultra-low dose (120 kV, 5 mAs) and standard dose technique (120 kV, 75 mAs). MSCT data were retrospectively analyzed using the segmentation software OncoTREAT (MeVis, Bremen, Germany, version 1.3). The volume of the 229 solid pulmonary nodules included in the analysis, as well as the largest diameter according to RECIST (Response Evaluation Criteria in Solid Tumors), was measured by two radiologists. Interobserver variability was calculated and SD-CT and ULD-CT data compared in an intrascan and interscan analysis. Results: The median nodule diameter (n = 229 nodules) was 8.2 mm (range: 2.8 to 43.6 mm, mean: 10.8 mm). Nodule volume ranged between 0.01 and 49.1 ml (median 0.1 ml, mean 1.5 ml). With respect to interobserver variability, the intrascan analysis did not reveal statistically significant differences (p > 0.05) between ULD-CT and SD-CT, with broader limits of agreement for relative differences of RECIST measurements (-31.0% to +27.0%, mean -2.0% for SD-CT; -27.0% to +38.6%, mean 5.8% for ULD-CT) than for volume measurements (-9.4% to +8.0%, mean 0.7% for SD-CT; -13% to +13%, mean 0.0% for ULD-CT). The interscan analysis showed broadened 95% confidence intervals for volume measurements (-26.5% to +29.1%, mean 1.3%, and -25.2% to +29.6%, mean 2.2%) but yielded comparable limits of agreement for RECIST measurements. Conclusion: The variability of nodule volumetry assessed by semi-automated segmentation software as well as nodule size determination by RECIST appears to be
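    The agreement statistics reported above (relative differences with limits of agreement, and a normalized root mean square deviation) can be computed roughly as in the following sketch; the two readers' volumes are invented for illustration, and the mean of the two readings is used as the reference, which is one common convention.

```python
# Bland-Altman-style limits of agreement and normalized RMS deviation
# for two observers' nodule volume measurements (toy data).
import numpy as np

obs1 = np.array([0.10, 0.25, 0.08, 1.40, 0.55, 2.10])   # volumes [ml], reader 1
obs2 = np.array([0.11, 0.23, 0.09, 1.52, 0.50, 2.02])   # volumes [ml], reader 2

rel_diff = 100.0 * (obs2 - obs1) / ((obs1 + obs2) / 2.0)  # relative differences [%]
mean_diff = rel_diff.mean()
loa = (mean_diff - 1.96 * rel_diff.std(ddof=1),
       mean_diff + 1.96 * rel_diff.std(ddof=1))

nrmsd = 100.0 * np.sqrt(np.mean((obs2 - obs1) ** 2)) / obs1.mean()
print(f"mean diff {mean_diff:.1f}%, limits of agreement {loa[0]:.1f}% to {loa[1]:.1f}%, "
      f"nRMSD {nrmsd:.1f}%")
```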

  2. A Review and Application of Citation Analysis Methodology to Reading Research Journal Literature.

    Science.gov (United States)

    Summers, Edward G.

    1984-01-01

    Reviews citation analysis literature and explores use of citation analysis to identify core journals and indicate disciplinary structure and interrelationships of journals reporting on reading research. Use of 1980 "Journal Citation Reports" data to generate criterion for selecting high impact journals and list of 33 core journals is…

  3. A Methodology for Cost-Effective Analysis of In-Place Software Processes

    Science.gov (United States)

    1997-01-01

    routinely collected repositories of readily available data that can be mined for information useful in empirical process analysis. The ... brings the people involved in the process on board by showing early results of what may be learned from empirical process analysis, and smooths the way for

  4. Get Real in Individual Participant Data (IPD) Meta-Analysis: A Review of the Methodology

    Science.gov (United States)

    Debray, Thomas P. A.; Moons, Karel G. M.; van Valkenhoef, Gert; Efthimiou, Orestis; Hummel, Noemi; Groenwold, Rolf H. H.; Reitsma, Johannes B.

    2015-01-01

    Individual participant data (IPD) meta-analysis is an increasingly used approach for synthesizing and investigating treatment effect estimates. Over the past few years, numerous methods for conducting an IPD meta-analysis (IPD-MA) have been proposed, often making different assumptions and modeling choices while addressing a similar research…

  5. Get real in individual participant data (IPD) meta-analysis : a review of the methodology

    NARCIS (Netherlands)

    Debray, Thomas P. A.; Moons, Karel G. M.; van Valkenhoef, Gert; Efthimiou, Orestis; Hummel, Noemi; Groenwold, Rolf H. H.; Reitsma, Johannes B.

    2015-01-01

    Individual participant data (IPD) meta-analysis is an increasingly used approach for synthesizing and investigating treatment effect estimates. Over the past few years, numerous methods for conducting an IPD meta-analysis (IPD-MA) have been proposed, often making different assumptions and modeling choices while addressing a similar research…

  6. Methodology for systematic analysis and improvement of manufacturing unit process life-cycle inventory (UPLCI)—CO2PE! initiative (cooperative effort on process emissions in manufacturing). Part 1: Methodology description

    DEFF Research Database (Denmark)

    Kellens, Karel; Dewulf, Wim; Overcash, Michael

    2012-01-01

    This report proposes a life-cycle analysis (LCA)-oriented methodology for systematic inventory analysis of the use phase of manufacturing unit processes, providing unit process datasets to be used in life-cycle inventory (LCI) databases and libraries. The methodology has been developed ... and resource efficiency improvements of the manufacturing unit process. To ensure optimal reproducibility and applicability, documentation guidelines for data and metadata are included in both approaches. Guidance on definition of functional unit and reference flow as well as on determination of system ... the provision of high-quality data for LCA studies of products using these unit process datasets for the manufacturing processes, as well as the in-depth analysis of individual manufacturing unit processes. In addition, the accruing availability of data for a range of similar machines (same process, different ...

  7. Development of human reliability analysis methodology and its computer code during low power/shutdown operation

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hyun; You, Young Woo; Huh, Chang Wook; Kim, Ju Yeul; Kim Do Hyung; Kim, Yoon Ik; Yang, Hui Chang [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hansung University, Seoul (Korea, Republic of)

    1997-07-01

    The objective of this study is to develop an appropriate procedure that can evaluate human error in LP/S (low power/shutdown) operation, together with a computer code that calculates the human error probabilities (HEPs) within this framework. The applicability of typical HRA methodologies to LP/S is assessed, and a new HRA procedure, SEPLOT (Systematic Evaluation Procedure for LP/S Operation Tasks), which reflects the characteristics of LP/S, is developed by selecting and categorizing human actions from a review of existing studies. This procedure is applied to evaluate the LOOP (Loss of Off-site Power) sequence, and the HEPs obtained using SEPLOT are used for quantitative evaluation of the core uncovery frequency. In this evaluation, DYLAM-3, one of the dynamic reliability computer codes, which has advantages over the ET/FT approach, is used. The SEPLOT procedure developed in this study provides a basis and framework for human error evaluation. This procedure also makes it possible to assess the dynamic aspects of accidents leading to core uncovery by applying the HEPs obtained with SEPLOT as input data to the DYLAM-3 code. Eventually, it is expected that the results of this study will contribute to improved safety in LP/S and reduced uncertainties in risk. 57 refs. 17 tabs., 33 figs. (author)

  8. Extremely low frequency electromagnetic field measurements at the Hylaty station and methodology of signal analysis

    Science.gov (United States)

    Kulak, Andrzej; Kubisz, Jerzy; Klucjasz, Slawomir; Michalec, Adam; Mlynarczyk, Janusz; Nieckarz, Zenon; Ostrowski, Michal; Zieba, Stanislaw

    2014-06-01

    We present the Hylaty geophysical station, a high-sensitivity and low-noise facility for extremely low frequency (ELF, 0.03-300 Hz) electromagnetic field measurements, which enables a variety of geophysical and climatological research related to atmospheric, ionospheric, magnetospheric, and space weather physics. The first systematic observations of ELF electromagnetic fields at the Jagiellonian University were undertaken in 1994. At the beginning the measurements were carried out sporadically, during expeditions to sparsely populated areas of the Bieszczady Mountains in the southeast of Poland. In 2004, an automatic Hylaty ELF station was built there, in a very low electromagnetic noise environment, which enabled continuous recording of the magnetic field components of the ELF electromagnetic field in the frequency range below 60 Hz. In 2013, after 8 years of successful operation, the station was upgraded by extending its frequency range up to 300 Hz. In this paper we show the station's technical setup, and how it has changed over the years. We discuss the design of ELF equipment, including antennas, receivers, the time control circuit, and power supply, as well as antenna and receiver calibration. We also discuss the methodology we developed for observations of the Schumann resonance and wideband observations of ELF field pulses. We provide examples of various kinds of signals recorded at the station.

  9. Evaluation of the methodology in publications describing epidemiological design for dental research: a critical analysis

    Directory of Open Access Journals (Sweden)

    Telmo Oliveira Bittar

    2011-01-01

    Full Text Available Introduction and objective: To describe, analyze, and critically review the methodology employed in dental epidemiological research available on electronic databases, evaluating study structure according to the STROBE and CONSORT initiatives. Material and methods: The ISI Web of Knowledge, Scopus, and PubMed electronic databases were selected for the literature search, gathering publications in the dental epidemiological area using the following designs: cross-sectional, cohort, case-control, descriptive, experimental, and quasi-experimental. Subsequently, five specific dentistry journals were selected and their abstract content was analyzed against the STROBE and CONSORT statement criteria. Results: From a universe of 10,160 articles from PubMed (the greatest number), only 3,198 could be classified according to their epidemiological design by the electronic database search tool. The most common designs were cross-sectional, cohort, case-control, descriptive, experimental, and quasi-experimental publications, showing a tendency towards bias and confounding factors in the literature due to missing words in the structure of the papers. Even though the CONSORT and STROBE initiatives have been in place since 2001 and 2004, respectively, some publications do not meet their checklists. Conclusion: The CONSORT and STROBE statements must be reinforced by dental journals, editors, and reviewers to improve the quality of studies and to avoid bias or confounding factors in literature searches performed on electronic databases.

  10. Landscape analysis as a theoretical-methodological tool and bridge for territory socio-environmental management

    Directory of Open Access Journals (Sweden)

    Susana Barrera Lobatón

    2013-12-01

    Full Text Available This paper attempts to understand how the concept of landscape is transformed from its academic production and discussion to its adaptation by the State, and also how this concept ends up being applied to the territory by various professionals interested in it. In order to do this we look at some definitions developed by various disciplines: Geography, Anthropology, Archeology, Ecology, Agronomy and Architecture, highlighting key points; and we make a comparison between their proposals and the way these concepts have been adopted by the State, specifically through the so-called “terms of reference”. Then we analyze how these concepts and methodologies are put into practice by professionals working in different consulting companies, often disregarding how local inhabitants understand and live the landscape from their territory. We conclude that there is a clear need to build bridges between academia, the State, the people, and the business sector, through professional practice and based on the need to acknowledge the way local inhabitants live the concept from their territory. Only in this way will we manage to make a real impact and improve people’s quality of life.

  11. English Language Teaching in Spain: Do Textbooks Comply with the Official Methodological Regulations? A Sample Analysis

    Directory of Open Access Journals (Sweden)

    Aquilino Sánchez

    2009-06-01

    Full Text Available The goal of this paper is to verify to what extent ELT textbooks used in Spanish educational settings comply with the prescribed official regulations, which fully advocate the Communicative Language Teaching Method (CLT). For that purpose, seven representative coursebooks of different educational levels and modalities in Spain – secondary, upper secondary, teenager and adult textbooks – were selected to be analysed. A full unit randomly selected from each coursebook was examined through the parameters of the communicative potential of the activities – measured on a scale from 0 to 10 – and the communicative nature of the methodological strategies implemented – measured on a dichotomous scale (yes/no). Global results per educational level point to the prevailing communicative nature of all the materials, which was shown to be above 50%. The remaining non-communicative block was covered by activities focused on the formal features of language (grammar and vocabulary). This resulting degree of dissociation between official regulations and what is really found in teaching materials may be positive, since the learning of languages is complex and results from the intervention of multiple factors and learning styles, as is evidenced by the professional experience of teachers from different backgrounds and beliefs.

  12. Rocket Engine Nozzle Side Load Transient Analysis Methodology: A Practical Approach

    Science.gov (United States)

    Shi, John J.

    2005-01-01

    At sea level, a phenomenon common to all rocket engines, especially those with highly over-expanded nozzles, during ignition and shutdown is flow separation as the plume fills and empties the nozzle. Since the flow separates randomly, it generates side loads, i.e., non-axial forces. Because rocket engines are designed to produce axial thrust to power the vehicle, it is not desirable for them to be excited by non-axial forcing functions. In the past, several engine failures were attributed to side loads. During the development stage, in order to design and size the rocket engine components and to reduce risk, the local dynamic environments as well as dynamic interface loads have to be defined. The methodology developed here provides a way to determine the peak loads and shock environments for new engine components. Previously it was not feasible to predict the shock environments, e.g., shock response spectra, from one engine to another, because they are not scalable. This problem has now been resolved, and the shock environments can be defined in the early stage of new engine development. Additional information is included in the original extended abstract.

  13. Electric power transport costs - methodologies analysis; Custos de transporte de energia eletrica: analise de metodologias

    Energy Technology Data Exchange (ETDEWEB)

    Takahata, Dario

    1997-07-01

    The dissertation presents aspects related to the restructuring of power systems in terms of international experience, and the possible implications for the definition of the new power system in Brazil. Experience shows that the reform in various countries started with sector deverticalization, together with an open access scheme for transmission. A retrospective of the countries studied indicates that transmission remuneration is based on a methodology that recovers the operating cost of transmission transactions, along with an additional amount that takes into account the cost of the existing transmission system. The following countries have been analyzed: Chile, Norway, England and Argentina. This work also describes the current situation in Brazil, both in terms of tariffs and the organizational structure of the power system, as well as a preliminary proposal conceived by SINTREL (National System of Electrical Energy Transmission) to evaluate transmission transaction costs. The dissertation ends with comments and conclusions, outlining a future program that might be followed, considering the aspects cited above and the peculiarities of the Brazilian power system. (author)

  14. PRESSURE-VOLUME ANALYSIS OF THE LUNG WITH AN EXPONENTIAL AND LINEAR-EXPONENTIAL MODEL IN ASTHMA AND COPD

    NARCIS (Netherlands)

    BOGAARD, JM; OVERBEEK, SE; VERBRAAK, AFM; VONS, C; FOLGERING, HTM; VANDERMARK, TW; ROOS, CM; STERK, PJ

    1995-01-01

    The prevalence of abnormalities in lung elasticity in patients with asthma or chronic obstructive pulmonary disease (COPD) is still unclear. This might be due to uncertainties concerning the method of analysis of quasistatic deflation lung pressure-volume curves. Pressure-volume curves were obtained

  15. Development of methodology for horizontal-axis wind-turbine dynamic analysis. Summary report

    Energy Technology Data Exchange (ETDEWEB)

    Dugundji, J.

    1982-09-01

    The 3-year effort reported included: (1) review of the MOSTAS computer programs for dynamic analysis of horizontal-axis wind turbines; (2) review of various analysis methods for rotating systems with periodic coefficients; (3) review of structural dynamics analysis tools for large wind turbines; (4) experiments on the yaw characteristics of a rotating rotor; (5) development of a finite element model for rotors; (6) development of simple models for aeroelasticity; and (7) development of simple models for the stability and response of wind turbines on flexible towers.

  16. Fast implementation of kernel simplex volume analysis based on modified Cholesky factorization for endmember extraction

    Institute of Scientific and Technical Information of China (English)

    Jing LI; Xiao-run LI; Li-jiao WANG; Liao-ying ZHAO

    2016-01-01

    Endmember extraction is a key step in the hyperspectral image analysis process. The kernel new simplex growing algorithm (KNSGA), recently developed as a nonlinear alternative to the simplex growing algorithm (SGA), has proven a promising endmember extraction technique. However, KNSGA still suffers from two issues limiting its application. First, its random initialization leads to inconsistency in final results; second, excessive computation is caused by the iterations of a simplex volume calculation. To solve the first issue, the spatial pixel purity index (SPPI) method is used in this study to extract the first endmember, eliminating the initialization dependence. A novel approach tackles the second issue by initially using a modified Cholesky factorization to decompose the volume matrix into triangular matrices, in order to avoid directly computing the determinant tautologically in the simplex volume formula. Theoretical analysis and experiments on both simulated and real spectral data demonstrate that the proposed algorithm significantly reduces computational complexity, and runs faster than the original algorithm.
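    The core numerical idea, computing a simplex volume from a Cholesky factorization of the Gram matrix of edge vectors rather than from a direct determinant of the full volume formula, can be sketched as follows; the data are random vectors, not real hyperspectral endmembers, and this is not the authors' full KNSGA implementation.

```python
# Simplex volume from a Cholesky factorization of the Gram matrix of edge vectors.
import numpy as np
from math import factorial

rng = np.random.default_rng(3)
d, n = 20, 5                                   # spectral bands, simplex with n+1 vertices
vertices = rng.normal(size=(n + 1, d))         # candidate endmember signatures

E = (vertices[1:] - vertices[0]).T             # edge vectors as columns, d x n
G = E.T @ E                                    # n x n Gram matrix
L = np.linalg.cholesky(G)                      # G = L L^T
volume = np.prod(np.diag(L)) / factorial(n)    # sqrt(det G) / n!

# Reference: same volume via the determinant of the Gram matrix.
volume_det = np.sqrt(np.linalg.det(G)) / factorial(n)
print(volume, volume_det)
```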

  17. Final safety analysis report for the Galileo Mission: Volume 1, Reference design document

    Energy Technology Data Exchange (ETDEWEB)

    1988-05-01

    The Galileo mission uses nuclear power sources called Radioisotope Thermoelectric Generators (RTGs) to provide the spacecraft's primary electrical power. Because these generators contain nuclear material, a Safety Analysis Report (SAR) is required. A preliminary SAR and an updated SAR were previously issued that provided an evolving status report on the safety analysis. As a result of the Challenger accident, the launch dates for both Galileo and Ulysses missions were later rescheduled for November 1989 and October 1990, respectively. The decision was made by agreement between the DOE and the NASA to have a revised safety evaluation and report (FSAR) prepared on the basis of these revised vehicle accidents and environments. The results of this latest revised safety evaluation are presented in this document (Galileo FSAR). Volume I, this document, provides the background design information required to understand the analyses presented in Volumes II and III. It contains descriptions of the RTGs, the Galileo spacecraft, the Space Shuttle, the Inertial Upper Stage (IUS), the trajectory and flight characteristics including flight contingency modes, and the launch site. There are two appendices in Volume I which provide detailed material properties for the RTG.

  18. Conceptual design and systems analysis of photovoltaic power systems. Final report. Volume III(2). Technology

    Energy Technology Data Exchange (ETDEWEB)

    Pittman, P.F.

    1977-05-01

    Conceptual designs were made and analyses were performed on three types of solar photovoltaic power systems. Included were Residential (1 to 10 kW), Intermediate (0.1 to 10 MW), and Central (50 to 1000 MW) Power Systems to be installed in the 1985 to 2000 time period. The following analyses and simulations are covered: residential power system computer simulations, intermediate power systems computer simulation, central power systems computer simulation, array comparative performance, utility economic and margin analyses, and financial analysis methodology.

  19. STATISTICAL METHODOLOGY FOR THE SIMULTANEOUS ANALYSIS OF MULTIPLE TYPES OF OUTCOMES IN NONLINEAR THRESHOLD MODELS.

    Science.gov (United States)

    Multiple outcomes are often measured on each experimental unit in toxicology experiments. These multiple observations typically imply the existence of correlation between endpoints, and a statistical analysis that incorporates it may result in improved inference. When both disc...

  20. Methodological considerations for global analysis of cellular FLIM/FRET measurements

    Science.gov (United States)

    Adbul Rahim, Nur Aida; Pelet, Serge; Kamm, Roger D.; So, Peter T. C.

    2012-02-01

    Global algorithms can improve the analysis of fluorescence energy transfer (FRET) measurement based on fluorescence lifetime microscopy. However, global analysis of FRET data is also susceptible to experimental artifacts. This work examines several common artifacts and suggests remedial experimental protocols. Specifically, we examined the accuracy of different methods for instrument response extraction and propose an adaptive method based on the mean lifetime of fluorescent proteins. We further examined the effects of image segmentation and a priori constraints on the accuracy of lifetime extraction. Methods to test the applicability of global analysis on cellular data are proposed and demonstrated. The accuracy of global fitting degrades with lower photon count. By systematically tracking the effect of the minimum photon count on lifetime and FRET prefactors when carrying out global analysis, we demonstrate a correction procedure to recover the correct FRET parameters, allowing us to obtain protein interaction information even in dim cellular regions with photon counts as low as 100 per decay curve.
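    A minimal global-analysis sketch in the spirit of the study is shown below: several synthetic decay curves share two lifetimes (donor alone and donor undergoing FRET) while each curve keeps its own FRET fraction, and all parameters are fitted jointly. Lifetimes, photon counts, and the noise model are assumptions for illustration, not the instrument response handling or correction procedure of the paper.

```python
# Toy global fit: shared lifetimes across decay curves, per-curve FRET fraction.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(4)
t = np.linspace(0.0, 10.0, 128)                 # time bins [ns]
tau_true = (2.5, 0.8)                            # donor, FRET lifetimes [ns]
frac_true = np.array([0.2, 0.5, 0.8])            # FRET fraction per pixel region

def model(tau1, tau2, fracs, total=500.0):
    decays = [(1 - f) * np.exp(-t / tau1) + f * np.exp(-t / tau2) for f in fracs]
    return np.array([total * d / d.sum() for d in decays])

data = rng.poisson(model(*tau_true, frac_true))  # Poisson photon-counting noise

def residuals(p):
    tau1, tau2 = p[:2]
    fracs = p[2:]
    return (model(tau1, tau2, fracs) - data).ravel()

fit = least_squares(residuals, x0=[2.0, 1.0, 0.5, 0.5, 0.5],
                    bounds=([0.1, 0.1, 0, 0, 0], [10, 10, 1, 1, 1]))
print("lifetimes:", np.round(fit.x[:2], 2), "FRET fractions:", np.round(fit.x[2:], 2))
```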

  1. Implementation of a methodology to perform the uncertainty and sensitivity analysis of the control rod drop in a BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reyes F, M. del C.

    2015-07-01

    A methodology to perform uncertainty and sensitivity analysis for the cross sections used in a Trace/PARCS coupled model for a control rod drop transient of a BWR-5 reactor was implemented with the neutronics code PARCS. A model of the nuclear reactor detailing all assemblies located in the core was developed. However, the thermohydraulic model designed in Trace was a simple model: one channel representing all the assembly types in the core was placed inside a simple vessel model, and boundary conditions were established. The thermohydraulic model was coupled with the neutronics model, first for the steady state; a Control Rod Drop (CRD) transient was then performed in order to carry out the uncertainty and sensitivity analysis. To analyze the cross sections used in the Trace/PARCS coupled model during the transient, Probability Density Functions (PDFs) were generated for the 22 cross-section parameters selected from the neutronics parameters that PARCS requires, thus obtaining 100 different cases for the Trace/PARCS coupled model, each with a different cross-section database. All these cases were executed with the coupled model, yielding 100 different outputs for the CRD transient, with special emphasis on four responses per output: 1) the reactivity, 2) the percentage of rated power, 3) the average fuel temperature, and 4) the average coolant density. For each response during the transient an uncertainty analysis was performed in which the corresponding uncertainty bands were generated. With this analysis it is possible to observe the range of results of the chosen responses as the selected uncertain parameters are varied. This is very useful and important for maintaining safety in nuclear power plants and for verifying whether the uncertainty band is within safety margins. The sensitivity analysis complements the uncertainty analysis by identifying the parameter or parameters with the most influence on the
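    The sampling-and-banding idea described above can be sketched as follows: inputs are drawn from assumed probability density functions, the model is run once per sample, and percentile bands are formed over the responses. The toy response function and parameter distributions below are placeholders, not the Trace/PARCS model or its cross-section PDFs.

```python
# Toy Monte Carlo uncertainty bands for a transient response.
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0.0, 5.0, 200)                      # transient time [s]

def toy_response(reactivity_worth, doppler_coeff):
    """Toy rated-power fraction following a control rod drop (illustrative only)."""
    return 1.0 - reactivity_worth * (1.0 - np.exp(-doppler_coeff * t))

n_cases = 100
worth = rng.normal(0.6, 0.05, n_cases)              # sampled parameter 1
doppler = rng.normal(1.5, 0.15, n_cases)            # sampled parameter 2

runs = np.array([toy_response(w, d) for w, d in zip(worth, doppler)])
lower, upper = np.percentile(runs, [2.5, 97.5], axis=0)  # uncertainty band
print("band width at end of transient:", round(upper[-1] - lower[-1], 3))
```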

  2. An Analysis Methodology for Stochastic Characteristic of Volumetric Error in Multiaxis CNC Machine Tool

    Directory of Open Access Journals (Sweden)

    Qiang Cheng

    2013-01-01

    Full Text Available Traditional approaches to error modeling and analysis of machine tools rarely consider the probability characteristics of the geometric and volumetric errors systematically. However, the individual geometric error measured at different points is variational and stochastic, and therefore the resultant volumetric error is also stochastic and uncertain. In order to address the stochastic characteristic of the volumetric error for multiaxis machine tools, a new probability analysis mathematical model of volumetric error is proposed in this paper. According to multibody system theory, a mean value analysis model for volumetric error is established with consideration of geometric errors. The probability characteristics of geometric errors are obtained by statistical analysis of the measured sample data. Based on probability statistics and stochastic process theory, the variance analysis model of volumetric error is established in matrix form, which avoids the complex mathematical operations of direct differentiation. A four-axis horizontal machining center is selected as an illustrative example. The analysis results reveal the stochastic characteristic of volumetric error and are also helpful for making full use of the best workspace to reduce the random uncertainty of the volumetric error and improve the machining accuracy.
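    A minimal sketch of the probabilistic idea is given below: per-axis geometric errors are sampled from assumed distributions and combined into a volumetric error whose mean and variance are then estimated. The simple vector-sum combination stands in for the paper's multibody kinematic model, and all numbers are invented.

```python
# Monte Carlo propagation of stochastic per-axis errors to a volumetric error.
import numpy as np

rng = np.random.default_rng(6)
n = 10_000

# Assumed per-axis positioning errors [micrometres], mean and standard deviation.
ex = rng.normal(3.0, 1.2, n)
ey = rng.normal(-1.5, 0.9, n)
ez = rng.normal(0.5, 1.6, n)

volumetric = np.sqrt(ex**2 + ey**2 + ez**2)
print(f"volumetric error: mean {volumetric.mean():.2f} um, "
      f"std {volumetric.std(ddof=1):.2f} um")
```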

  3. Methodological and technological implications of quantitative human movement analysis in long term space flights.

    Science.gov (United States)

    Ferrigno, G; Baroni, G; Pedotti, A

    1999-04-01

    In the frame of the 179-day EUROMIR '95 space mission, two in-flight experiments foresaw the analysis of three-dimensional human movements in microgravity. For this aim, a space-qualified opto-electronic motion analyser based on passive markers was installed onboard the MIR Space Station. The paper describes the experimental procedures designed to cope with the technical and operational limitations imposed by the critical environment of the orbital module. The reliability of the performed analysis is discussed, focusing on two related aspects: accuracy in three-dimensional marker localisation and data comparability among different experimental sessions. The effect of the critical experimental set-up and of TV camera optical distortions is evaluated on in-flight acquired data by analysing the conservation of Euclidean distances on rigid bodies. An optimisation method for recovering a unique reference frame throughout the whole mission is described. Results highlight the potential that opto-electronics and close-range photogrammetry have for automatic motion analysis onboard orbital modules. The discussion of the obtained results provides general suggestions for the implementation of experimental human movement analysis in critical environments, based on a suitable trade-off between external constraints and achievable analysis reliability.

  4. Does bioimpedance analysis or measurement of natriuretic peptides aid volume assessment in peritoneal dialysis patients?

    Science.gov (United States)

    Davenport, Andrew

    2013-01-01

    Cardiovascular mortality remains the commonest cause of death for peritoneal dialysis patients. As such, preventing persistent hypervolemia is important. On the other hand, hypovolemia may potentially risk episodes of acute kidney injury and loss of residual renal function, a major determinant of peritoneal dialysis technique survival. Bioimpedance has developed from a single-frequency research tool to a multi-frequency bioelectrical impedance analysis readily available in the clinic and capable of measuring extracellular, intracellular, and total body water. Similarly, natriuretic peptides released from the heart because of myocardial stretch and increased intracardiac volume have also been variously reported to be helpful in assessing volume status in peritoneal dialysis patients. The question then arises whether these newer technologies and biomarkers have supplanted the time-honored clinical assessment of hydration status or whether they are merely adjuncts that aid the experienced clinician.

  5. In vitro analysis of neurospheres derived from glioblastoma primary culture: a novel methodology paradigm.

    Directory of Open Access Journals (Sweden)

    Lorena Favaro Pavon

    2014-01-01

    Full Text Available Glioblastomas, the most lethal primary brain tumours, frequently relapse or progress as focal masses after radiation, suggesting that only a fraction of tumour cells is responsible for tumour regrowth. The identification of a brain tumour cell subpopulation with potent tumorigenic activity supports the cancer stem cell hypothesis in solid tumours. The goal of this study was to determine a methodology for the establishment of primary human glioblastoma stem cell lines. Our aim was achieved through the following approaches: (i) establishment of a primary glioblastoma cell culture; (ii) isolation of neurospheres derived from the glioblastoma primary culture and directly from the tumour; (iii) purification of CD133+ neurospheres by MACS using CD133 microbeads; (iv) formation of subspheres in the CD133+ population; (v) study of the expression levels of the GFAP, CD133, Nestin, Nanog, CD34 and Sox2 markers on tumour subspheres. Here, we describe a successful method for isolation of the CD133+ cell population and establishment of glioblastoma neurospheres from this primary culture, which are more robust than the ones derived directly from the tumour. Notably, the neurospheres derived from the glioblastoma primary culture showed 89% CD133+ cells, whereas tumour-derived neurospheres showed 60% CD133+ cells. These results show a higher concentration of CD133+ cells in neurospheres derived from the glioblastoma primary culture. These CD133+ fractions were able to further generate subspheres. The subspheres derived from the glioblastoma primary culture presented a well-defined morphology, while the ones derived from the fresh tumour were sparse and less robust. The negative fraction of CD133 cells was unable to generate subspheres. The tumour subspheres expressed GFAP, CD133, Nestin and Nanog. The present study describes an optimization of the isolation of neurospheres/subspheres derived from glioblastoma primary culture by a process of selection of CD133+ adherent stem

  6. Comparison of quenching and extraction methodologies for metabolome analysis of Lactobacillus plantarum

    Directory of Open Access Journals (Sweden)

    Faijes Magda

    2007-08-01

    Full Text Available Abstract Background A reliable quenching and metabolite extraction method has been developed for Lactobacillus plantarum. The energy charge value was used as a critical indicator for fixation of metabolism. Results Four different aqueous quenching solutions, all containing 60% of methanol, were compared for their efficiency. Only the solutions containing either 70 mM HEPES or 0.85% (w/v) ammonium carbonate (pH 5.5) caused less than 10% cell leakage and the energy charge of the quenched cells was high, indicating rapid inactivation of the metabolism. The efficiency of extraction of intracellular metabolites from cell cultures depends on the extraction methods, and is expected to vary between micro-organisms. For L. plantarum, we have compared five different extraction methodologies based on (i) cold methanol, (ii) perchloric acid, (iii) boiling ethanol, (iv) chloroform/methanol (1:1), and (v) chloroform/water (1:1). Quantification of representative intracellular metabolites showed that the best extraction efficiencies were achieved with cold methanol, boiling ethanol and perchloric acid. Conclusion The ammonium carbonate solution was selected as the most suitable quenching buffer for metabolomics studies in L. plantarum because (i) leakage is minimal, (ii) the energy charge indicates good fixation of metabolism, and (iii) all components are easily removed during freeze-drying. A modified procedure based on cold methanol extraction combined good extractability with mild extraction conditions and high enzymatic inactivation. These features make the combination of these quenching and extraction protocols very suitable for metabolomics studies with L. plantarum.

  7. Ethnozoology in Brazil: analysis of the methodological risks in published studies

    Directory of Open Access Journals (Sweden)

    R. M. Lyra-Neves

    Full Text Available Abstract There has been a growth in the field of Ethnozoology throughout the years, especially in Brazil, where a considerable number of scientific articles pertaining to this subject has been published in recent decades. With this increase in publications comes the opportunity to assess the quality of these publications, as there are no known studies assessing the methodological risks in this area. Based on this observation, our objectives were to compile the papers published on the subject of ethnozoology and to answer the following questions: 1) Do the Brazilian ethnozoological studies use sound sampling methods? 2) Is the sampling quality influenced by characteristics of the studies/publications? The studies found in databases and using web search engines were compiled to answer these questions. The studies were assessed based on their nature, sampling methods, use of hypotheses and tests, journal’s impact factor, and animal group studied. The majority of the studies analyzed exhibited problems associated with the samples, as 144 (66.98%) studies were classified as having a high risk of bias. With regard to the characteristics analyzed, we determined that a quantitative nature and the use of tests are essential components of good sampling. Most studies classified as moderate and low risk either did not provide these data or provided data that were not clear; therefore, these studies were classified as being of a quali-quantitative nature. Studies performed with vertebrate groups were of high risk. Most of the papers analyzed here focused on fish, insects, and/or mollusks, thus highlighting the difficulties associated with conducting interviews regarding tetrapod vertebrates. Such difficulties are largely related to the extremely strict Brazilian laws, justified by the decline and extinction of some species, related to the use of wild tetrapod vertebrates.

  8. Analysis of simulation methodology for calculation of the heat of transport for vacancy thermodiffusion

    Energy Technology Data Exchange (ETDEWEB)

    Tucker, William C.; Schelling, Patrick K., E-mail: patrick.schelling@ucf.edu [Advanced Material Processing and Analysis Center and Department of Physics, University of Central Florida, 4000 Central Florida Blvd., Orlando, Florida 32816 (United States)

    2014-07-14

    Computation of the heat of transport Q_a^* in monatomic crystalline solids is investigated using the methodology first developed by Gillan [J. Phys. C: Solid State Phys. 11, 4469 (1978)] and further developed by Grout and coworkers [Philos. Mag. Lett. 74, 217 (1996)], referred to as the Grout-Gillan method. In the case of pair potentials, the hopping of a vacancy results in a heat wave that persists for up to 10 ps, consistent with previous studies. This leads to generally positive values for Q_a^* which can be quite large and are strongly dependent on the specific details of the pair potential. By contrast, when the interactions are described using the embedded atom model, there is no evidence of a heat wave, and Q_a^* is found to be negative. This demonstrates that the dynamics of vacancy hopping depends strongly on the details of the empirical potential. However, the results obtained here are in strong disagreement with experiment. Arguments are presented which demonstrate that there is a fundamental error made in the Grout-Gillan method due to the fact that the ensemble of states only includes successful atom hops and hence does not represent an equilibrium ensemble. This places the interpretation of the quantity computed in the Grout-Gillan method as the heat of transport in doubt. It is demonstrated that trajectories which do not yield hopping events are nevertheless relevant to computation of the heat of transport Q_a^*.
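
    The paper's central objection is that the Grout-Gillan average is taken only over trajectories containing a successful hop, which is not an equilibrium ensemble. The toy sketch below is purely illustrative (it is not an MD simulation, and none of its numbers come from the paper); it only shows how conditioning an average on the occurrence of an activated event biases it relative to the unconditional, equilibrium-like average.

```python
# Toy illustration (not the Grout-Gillan method itself, and not an MD code):
# averaging a quantity only over trajectories that contain a successful hop
# need not equal the equilibrium (unconditional) average.
# Hypothetical model: each short trajectory has a local excess energy drawn
# from an equilibrium-like distribution, and the chance that a hop occurs
# grows with that energy (Boltzmann-like activation). All numbers are made up.
import math
import random

random.seed(1)
kT = 0.05          # assumed temperature scale (eV), illustrative only
barrier = 0.10     # assumed migration barrier (eV), illustrative only

energies, hop_energies = [], []
for _ in range(200_000):
    e = random.expovariate(1.0 / kT)                    # equilibrium-like excess energy
    p_hop = math.exp(-max(barrier - e, 0.0) / kT)       # hops favour high-energy states
    energies.append(e)
    if random.random() < p_hop:
        hop_energies.append(e)

print(f"unconditional mean excess energy : {sum(energies) / len(energies):.3f} eV")
print(f"mean over hop trajectories only  : {sum(hop_energies) / len(hop_energies):.3f} eV")
# The hop-conditioned mean is systematically higher: selecting only successful
# hops biases the ensemble, which is the concern raised about interpreting the
# Grout-Gillan average as the heat of transport.
```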

  9. Ethnozoology in Brazil: analysis of the methodological risks in published studies.

    Science.gov (United States)

    Lyra-Neves, R M; Santos, E M; Medeiros, P M; Alves, R R N; Albuquerque, U P

    2015-11-01

    There has been a growth in the field of Ethnozoology throughout the years, especially in Brazil, where a considerable number of scientific articles pertaining to this subject has been published in recent decades. With this increase in publications comes the opportunity to assess the quality of these publications, as there are no known studies assessing the methodological risks in this area. Based on this observation, our objectives were to compile the papers published on the subject of ethnozoology and to answer the following questions: 1) Do the Brazilian ethnozoological studies use sound sampling methods?; 2) Is the sampling quality influenced by characteristics of the studies/publications? The studies found in databases and using web search engines were compiled to answer these questions. The studies were assessed based on their nature, sampling methods, use of hypotheses and tests, journal's impact factor, and animal group studied. The majority of the studies analyzed exhibited problems associated with the samples, as 144 (66.98%) studies were classified as having a high risk of bias. With regard to the characteristics analyzed, we determined that a quantitative nature and the use of tests are essential components of good sampling. Most studies classified as moderate and low risk either did not provide these data or provided data that were not clear; therefore, these studies were classified as being of a quali-quantitative nature. Studies performed with vertebrate groups were of high risk. Most of the papers analyzed here focused on fish, insects, and/or mollusks, thus highlighting the difficulties associated with conducting interviews regarding tetrapod vertebrates. Such difficulties are largely related to the extremely strict Brazilian laws, justified by the decline and extinction of some species, related to the use of wild tetrapod vertebrates.

  10. Thermodynamic Equilibrium Analysis of Methanol Conversion to Hydrocarbons Using Cantera Methodology

    Directory of Open Access Journals (Sweden)

    Duminda A. Gunawardena

    2012-01-01

    Full Text Available Reactions associated with removal of oxygen from oxygenates (deoxygenation) are an important aspect of the process of producing hydrocarbon fuels from biorenewable substrates. Here we report the equilibrium composition of the methanol-to-hydrocarbon system, obtained by minimizing the total Gibbs energy of the system using the Cantera methodology. The system was treated as a mixture of 14 components: CH3OH, C6H6, C7H8, C8H10 (ethyl benzene), C8H10 (xylenes), C2H4, C2H6, C3H6, CH4, H2O, C, CO2, CO and H2. The carbon in the equilibrium mixture was used as a measure of coke formation, which causes deactivation of the catalysts used in aromatization reaction(s). Equilibrium compositions of each species were analyzed for temperatures ranging from 300 to 1380 K and pressures of 0–15 atm gauge. It was observed that, as the temperature increases, the mole fractions of benzene, toluene, ethylbenzene, and xylene pass through a maximum around 1020 K. At 300 K the most abundant species in the system were CH4, CO2, and H2O, with mole fractions of 50%, 16.67%, and 33.33%, respectively. Similarly, at high temperature (1380 K), the most abundant species in the system were H2 and CO, with mole fractions of 64.5% and 32.6%, respectively. The pressure in the system has a significant impact on the composition of species.
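
    Cantera's built-in equilibrate routine performs exactly this kind of Gibbs-energy minimization at fixed temperature and pressure. The sketch below is only a hedged illustration: it uses the GRI-Mech 3.0 mechanism that ships with Cantera as a stand-in species set, which lacks the aromatics and solid carbon of the paper's 14-component system, so it will not reproduce the reported compositions.

```python
# Minimal Gibbs-minimization sketch with Cantera (illustrative only).
# GRI-Mech 3.0 is used as a stand-in mechanism: it contains CH3OH, CH4, CO,
# CO2, H2, H2O, C2H4, C2H6, ... but not the aromatics or solid carbon of the
# paper's 14-component system, so the numbers differ from those reported.
import cantera as ct

gas = ct.Solution("gri30.yaml")

for T in (300.0, 1020.0, 1380.0):          # temperatures highlighted in the abstract
    gas.TPX = T, ct.one_atm, "CH3OH:1.0"   # pure methanol feed at 0 atm gauge
    gas.equilibrate("TP")                  # minimize Gibbs energy at fixed T and P
    top = sorted(gas.mole_fraction_dict().items(), key=lambda kv: -kv[1])[:5]
    print(f"T = {T:6.0f} K :", ", ".join(f"{s}={x:.3f}" for s, x in top))
```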

  11. Qualitative Secondary Analysis in Austere Times: Ethical, Professional and Methodological Considerations

    Directory of Open Access Journals (Sweden)

    Carrie Coltart

    2013-01-01

    Full Text Available Recent debates in qualitative secondary analysis (QSA) have sought to move beyond polarising arguments in order to develop more nuanced perspectives on the epistemological, analytical and practical opportunities and challenges associated with its methods. This is generally to be welcomed, although there are also signs of unhelpful primary/secondary divisions finding new forms of expression. Focusing on definitional issues and wider contexts of QSA helps to explain the possible sources of ongoing tensions while affording tentative insights into potential opportunities and synergies across the primary/secondary spectrum. Building on work undertaken within the Timescapes Qualitative Longitudinal study, the article also highlights some under-examined costs and risks that may come along with new opportunities created by secondary analysis. Issues of over-privileging secondary analysis claims-making and the timing of qualitative secondary analysis are foregrounded as requiring further consideration if researchers are to take seriously lingering suspicions and fears about qualitative secondary analysis and not dismiss them as simply reactionary or self-serving. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1301181

  12. A discourse analysis methodology based on semantic principles - an application to brands, journalists and consumers discourses

    Directory of Open Access Journals (Sweden)

    Luc Grivel

    2011-12-01

    Full Text Available This is an R&D paper. It describes an analysis arising from a research project on opinion measurement and monitoring on the Internet. This research is carried out within the "Paragraphe" laboratory, in partnership with the market research institute Harris Interactive (CIFRE grant beginning July 2010). The purpose of the study was to define CRM possibilities. The targets of the study were self-employed workers and very small businesses. The discourse analysis is linked to a qualitative study and revolves around three types of discourse: those of brands, journalists and clients. In the brand discourse analysis, we benchmarked brand websites belonging to several businesses. In this first step, we tried to identify the words and promises most frequently used by brands when addressing the target we were studying. For that benchmark, we downloaded the "Professionals" sections of the websites. The clients' discourse analysis is based on open-ended answers from satisfaction questionnaires. The questions we are studying were asked after a call to a hotline or after a technician's intervention. The journalists' discourse analysis is based on articles published on news websites specializing in Harris Interactive's client sector. These websites were chosen because we considered them representative of the information sources the target could consult.

  13. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration, usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  14. Durability of recycled aggregate concrete designed with the Equivalent Mortar Volume (EMV) method: Validation under the Spanish context and its adaptation to Bolomey methodology

    Directory of Open Access Journals (Sweden)

    Jiménez, C.

    2014-03-01

    Full Text Available Some durability properties are analyzed in concretes made with a novel method for recycled aggregate concrete (RAC) proportioning, in order to validate it under the Spanish context. Two types of concrete mixes were elaborated: one following the guidelines of the new method, and the other based on an adaptation of that method to the Bolomey methodology. Two types of recycled concrete aggregates (RCA) were used. RCA replacement of natural aggregates (NA) ranged from 20% to 100%. The 20% level was chosen in order to comply with Spanish recommendations on RAC. Water penetration under pressure, water absorption and resistance to chloride attack were the studied properties. It is verified that the new method and the developed adaptation result in concrete mixes with properties better than or similar to those of the natural aggregate concrete (NAC) and the conventional RAC, while saving significant amounts of cement.

  15. Methodological comparison of DNA extraction from Holcocerrus hippophaecolus (Lepidoptera: Cossidae) for AFLP analysis

    Institute of Scientific and Technical Information of China (English)

    CHEN Min; ZHU Yang-yu; TAO Jing; LUO You-qing

    2008-01-01

    Amplified fragment length polymorphism (AFLP) is a powerful DNA fingerprinting technique for studying genetic relationships and genetic diversity in insects. However, the crucial prerequisite for AFLP analysis is to extract DNA of high quality. In this study, we evaluated four different protocols (the SDS method, an improved SDS method, the CTAB method, and a combined method using SDS and CTAB) for isolating DNA from the seabuckthorn carpenter moth (Holcocerrus hippophaecolus (Lepidoptera: Cossidae)). The results indicate that the CTAB method does not produce DNA suitable for AFLP analysis. The SDS method and the combined SDS-CTAB method were comparatively time-consuming, resulted in low DNA yields, and were therefore not used for the AFLP assay. The improved SDS method is recommended for preparing DNA templates from H. hippophaecolus for AFLP analysis.

  16. Imaging and finite element analysis: a methodology for non-invasive characterization of aortic tissue.

    Science.gov (United States)

    Flamini, Vittoria; Creane, Arthur P; Kerskens, Christian M; Lally, Caitríona

    2015-01-01

    Characterization of the mechanical properties of arterial tissues usually involves an invasive procedure requiring tissue removal. In this work we propose a non-invasive method to perform a biomechanical analysis of cardiovascular aortic tissue. This method is based on combining medical imaging and finite element analysis (FEA). Magnetic resonance imaging (MRI) was chosen since it presents relatively low risks for human health. A finite element model was created from the MRI images and loaded with systolic physiological pressures. By means of an optimization routine, the structural material properties were changed until average strains matched those measured by MRI. The method outlined in this work produced an estimate of the in situ properties of cardiovascular tissue based on non-invasive image datasets and finite element analysis.
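
    At its core the approach is an inverse problem: adjust the constitutive parameters of the image-derived finite element model until the predicted average strains under systolic pressure match the strains measured from MRI. The sketch below only illustrates that optimization loop; the one-parameter linear "forward model" is a deliberately crude stand-in for the authors' FEA, and the pressure, strain and stiffness values are hypothetical.

```python
# Minimal inverse-identification sketch (illustrative only).
# A toy forward model stands in for the finite element solve: strain is taken
# as pressure / stiffness. The real method runs an FEA of the MRI-derived
# aortic geometry at every iteration; here we only show the optimization loop.
from scipy.optimize import minimize_scalar

systolic_pressure_kpa = 16.0        # ~120 mmHg, illustrative loading
measured_strain = 0.10              # hypothetical MRI-derived average strain

def forward_model(stiffness_kpa: float) -> float:
    """Toy stand-in for the FEA: predicted average strain under load."""
    return systolic_pressure_kpa / stiffness_kpa

def mismatch(stiffness_kpa: float) -> float:
    """Objective: squared error between predicted and measured strain."""
    return (forward_model(stiffness_kpa) - measured_strain) ** 2

result = minimize_scalar(mismatch, bounds=(10.0, 1000.0), method="bounded")
print(f"identified effective stiffness: {result.x:.1f} kPa")  # ~160 kPa for these numbers
```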

  17. Gas chromatography analysis with olfactometric detection (GC-O) as a useful methodology for chemical characterization of odorous compounds.

    Science.gov (United States)

    Brattoli, Magda; Cisternino, Ezia; Dambruoso, Paolo Rosario; de Gennaro, Gianluigi; Giungato, Pasquale; Mazzone, Antonio; Palmisani, Jolanda; Tutino, Maria

    2013-12-05

    The gas chromatography-olfactometry (GC-O) technique couples traditional gas chromatographic analysis with sensory detection in order to study complex mixtures of odorous substances and to identify odor active compounds. The GC-O technique is already widely used for the evaluation of food aromas and its application in environmental fields is increasing, thus moving the odor emission assessment from the solely olfactometric evaluations to the characterization of the volatile components responsible for odor nuisance. The aim of this paper is to describe the state of the art of gas chromatography-olfactometry methodology, considering the different approaches regarding the operational conditions and the different methods for evaluating the olfactometric detection of odor compounds. The potentials of GC-O are described highlighting the improvements in this methodology relative to other conventional approaches used for odor detection, such as sensoristic, sensorial and the traditional gas chromatographic methods. The paper also provides an examination of the different fields of application of the GC-O, principally related to fragrances and food aromas, odor nuisance produced by anthropic activities and odorous compounds emitted by materials and medical applications.

  18. Gas Chromatography Analysis with Olfactometric Detection (GC-O) as a Useful Methodology for Chemical Characterization of Odorous Compounds

    Directory of Open Access Journals (Sweden)

    Magda Brattoli

    2013-12-01

    Full Text Available The gas chromatography-olfactometry (GC-O) technique couples traditional gas chromatographic analysis with sensory detection in order to study complex mixtures of odorous substances and to identify odor active compounds. The GC-O technique is already widely used for the evaluation of food aromas and its application in environmental fields is increasing, thus moving the odor emission assessment from the solely olfactometric evaluations to the characterization of the volatile components responsible for odor nuisance. The aim of this paper is to describe the state of the art of gas chromatography-olfactometry methodology, considering the different approaches regarding the operational conditions and the different methods for evaluating the olfactometric detection of odor compounds. The potentials of GC-O are described highlighting the improvements in this methodology relative to other conventional approaches used for odor detection, such as sensoristic, sensorial and the traditional gas chromatographic methods. The paper also provides an examination of the different fields of application of the GC-O, principally related to fragrances and food aromas, odor nuisance produced by anthropic activities and odorous compounds emitted by materials and medical applications.

  19. A methodology to incorporate life cycle analysis and the triple bottom line mechanism for sustainable management of industrial enterprises

    Science.gov (United States)

    Wang, Ling; Lin, Li

    2004-02-01

    Since the 1970s, the environmental protection movement has challenged industries to increase their investment in Environmentally Conscious Manufacturing (ECM) techniques and management tools. Social considerations for global citizens and their descendants have also motivated examination of the complex issues of sustainable development beyond the immediate economic impact. Consequently, industrial enterprises have started to understand sustainable development in terms of the Triple Bottom Line (TBL): economic prosperity, environmental quality and social justice. For management, however, a lack of systematic ECM methodologies hinders their efforts in planning, evaluating, reporting and auditing of sustainability. To address this critical need, this research develops a framework for a sustainable management system by incorporating a Life Cycle Analysis (LCA) of industrial operations with the TBL mechanism. A TBL metric system with seven sets of indices for the TBL elements and their complex relations is identified for the comprehensive evaluation of a company's sustainability performance. Utilities of the TBL indices are estimated to represent the views of various stakeholders, including the company, investors, employees and society at large. Costs of these indices are also captured to reflect the company's effort in meeting the utilities. An optimization model is formulated to maximize the economic, environmental and social benefits of the company's effort in developing sustainable strategies. To promote environmental and social consciousness, the methodology can significantly facilitate management decisions through its capability to include "non-business" values and external costs that the company has not contemplated before.
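
    The abstract mentions an optimization model that maximizes economic, environmental and social benefits subject to the cost of the company's effort, but does not give its formulation. The linear program below is therefore only a hedged illustration of what such a model might look like; the initiatives, utilities, costs and budget are all hypothetical.

```python
# Hypothetical illustration of a TBL optimization (not the authors' model).
# Decision variables x_i in [0, 1]: level of effort on each sustainability
# initiative. Maximize total stakeholder utility subject to a cost budget.
from scipy.optimize import linprog

utilities = [5.0, 3.0, 4.0]   # hypothetical utility per initiative (TBL indices)
costs     = [4.0, 1.0, 2.0]   # hypothetical cost of full effort per initiative
budget    = 5.0               # hypothetical total budget

# linprog minimizes, so negate the utilities to maximize them.
res = linprog(
    c=[-u for u in utilities],
    A_ub=[costs], b_ub=[budget],
    bounds=[(0.0, 1.0)] * len(utilities),
    method="highs",
)
print("effort levels:", [round(x, 2) for x in res.x])
print("total utility:", round(-res.fun, 2))
```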

  20. Modelling extrudate expansion in a twin-screw food extrusion cooking process through dimensional analysis methodology

    DEFF Research Database (Denmark)

    Cheng, Hongyuan; Friis, Alan

    2010-01-01

    A new phenomenological model is proposed to correlate extrudate expansion and extruder operation parameters in a twin-screw food extrusion cooking process. Buckingham's pi dimensional analysis method is applied to establish the model. Three dimensionless groups, i.e. pump efficiency, water content and temperature, are formed to model the extrusion process from dimensional analysis. The model is evaluated with experimental data for extrusion of whole wheat flour and fish feed. The average deviations of the model correlations are 5.9% and 9% based on experimental data for the whole wheat flour and fish feed...
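
    This (truncated) record names the three dimensionless groups but not the functional form of the correlation. A generic power-law relation among the groups, of the kind commonly fitted after a Buckingham pi analysis, would take the following form; the constant and exponents are placeholders, not values from the paper.

```latex
% Generic power-law correlation among the named dimensionless groups
% (placeholder constant k and exponents a, b, c; not the paper's fitted values).
% \Pi_E: extrudate expansion, \Pi_1: pump efficiency,
% \Pi_2: water content group, \Pi_3: temperature group.
\Pi_E = k \, \Pi_1^{a} \, \Pi_2^{b} \, \Pi_3^{c}
```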