WorldWideScience

Sample records for analysis methodology based

  1. Compliance strategy for statistically based neutron overpower protection safety analysis methodology

    International Nuclear Information System (INIS)

    Holliday, E.; Phan, B.; Nainer, O.

    2009-01-01

    The methodology employed in the safety analysis of the slow Loss of Regulation (LOR) event in the OPG and Bruce Power CANDU reactors, referred to as Neutron Overpower Protection (NOP) analysis, is a statistically based methodology. Further enhancement to this methodology includes the use of Extreme Value Statistics (EVS) for the explicit treatment of aleatory and epistemic uncertainties, and probabilistic weighting of the initial core states. A key aspect of this enhanced NOP methodology is to demonstrate adherence, or compliance, with the analysis basis. This paper outlines a compliance strategy capable of accounting for the statistical nature of the enhanced NOP methodology. (author)
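
    As a rough illustration of the extreme-value step described above, the sketch below fits a Generalized Extreme Value distribution (one common EVS choice, assumed here) to hypothetical Monte Carlo overpower-margin maxima and reads off a high percentile as a candidate statistical limit. scipy is assumed available; this is not the OPG/Bruce Power NOP implementation itself:

        # Sketch: fit a GEV distribution to per-trial maxima and extract a
        # high percentile. All data here are synthetic placeholders.
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(42)
        # Hypothetical: 500 Monte Carlo trials, each yielding the maximum
        # channel overpower ratio across the sampled core states.
        trial_maxima = rng.normal(loc=0.95, scale=0.02, size=500)

        shape, loc, scale = genextreme.fit(trial_maxima)
        # 95th percentile of the fitted distribution: a candidate statistical
        # limit to compare against the analysis basis (e.g. a trip setpoint).
        limit_95 = genextreme.ppf(0.95, shape, loc=loc, scale=scale)
        print(f"95th percentile overpower ratio: {limit_95:.4f}")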

  2. A methodology for strain-based fatigue reliability analysis

    International Nuclear Information System (INIS)

    Zhao, Y.X.

    2000-01-01

    A significant scatter of the cyclic stress-strain (CSS) responses should be noted for a nuclear reactor material, 1Cr18Ni9Ti pipe-weld metal. The existence of this scatter implies that a random applied cyclic strain history will be introduced under any loading mode, even a deterministic loading history. Without accounting for the scatter, a non-conservative evaluation might be produced in practice. A methodology for strain-based fatigue reliability analysis that takes this scatter into account is developed. The responses are approximately modeled by probability-based CSS curves of the Ramberg-Osgood relation. The strain-life data are similarly modeled by probability-based strain-life curves of the Coffin-Manson law. The reliability assessment is constructed by considering the interference of the random applied fatigue strain and capacity histories. Probability density functions of the applied and capacity histories are given analytically. The methodology can be conveniently reduced to the case of a deterministic CSS relation, as existing methods assume. The non-conservatism of the deterministic CSS relation and the applicability of the present methodology are demonstrated by an analysis of the material test results.
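
    For reference, the two deterministic curve families named above can be written down and solved directly; the sketch below uses illustrative placeholder parameters, not measured 1Cr18Ni9Ti properties (scipy assumed available):

        # Sketch: deterministic Ramberg-Osgood (CSS) and Coffin-Manson
        # (strain-life) relations with assumed, illustrative constants.
        from scipy.optimize import brentq

        E       = 193e3   # Young's modulus, MPa (assumed)
        K_prime = 1200.0  # cyclic strength coefficient, MPa (assumed)
        n_prime = 0.15    # cyclic strain hardening exponent (assumed)
        sf, b = 900.0, -0.09  # fatigue strength coeff./exponent (assumed)
        ef, c = 0.30,  -0.55  # fatigue ductility coeff./exponent (assumed)

        def ramberg_osgood(stress_amp):
            """Total strain amplitude for a given stress amplitude (CSS)."""
            return stress_amp / E + (stress_amp / K_prime) ** (1.0 / n_prime)

        def coffin_manson_life(strain_amp):
            """Solve the strain-life law for reversals to failure 2Nf."""
            f = lambda two_nf: sf / E * two_nf**b + ef * two_nf**c - strain_amp
            return brentq(f, 1.0, 1e12)

        eps = ramberg_osgood(350.0)   # strain amplitude at 350 MPa
        print(f"strain amplitude {eps:.5f}, 2Nf = {coffin_manson_life(eps):.3g}")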

  3. Safety analysis methodology for OPR 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun

    2005-01-01

    Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology based on the codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for the Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. Also, for non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)

  4. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Because tie breakers are set in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis with fault tree methodology was also performed. The results showed that the setup of the tie breakers was rational and necessary, and that, compared with fault tree methodology, the GO model was much easier to build and the resulting chart much more succinct for analyzing the reliability of the power supply system. (author)
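
    In the rare-event approximation, the quantitative step reduces to summing products of component unavailabilities over the minimal cut sets; a toy version with hypothetical components, values and cut sets:

        # Sketch: system unavailability from minimal cut sets (rare-event
        # approximation). Components, values and cut sets are hypothetical.
        import math

        q = {"busbar_A": 1e-4, "busbar_B": 1e-4,
             "tie_breaker": 5e-3, "diesel_gen": 2e-2}

        # Each minimal cut set is a set of components whose joint failure
        # fails the power supply system.
        minimal_cut_sets = [
            {"busbar_A", "busbar_B"},
            {"busbar_A", "tie_breaker", "diesel_gen"},
        ]

        # Q_sys ~ sum over cut sets of the product of unavailabilities.
        q_sys = sum(math.prod(q[c] for c in cs) for cs in minimal_cut_sets)
        print(f"average system unavailability ~ {q_sys:.2e}")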

  5. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As an interdisciplinary domain requiring advanced and innovative methodologies, the computational forensics domain is characterized by data that are simultaneously large-scale and uncertain, multidimensional and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper a methodology and an automatic procedure based on fuzzy set theory and designed to infer precis...
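
    A bare-bones fuzzy c-means loop of the kind such a methodology builds on (illustrative only; the paper's features and inference procedure are not reproduced here):

        # Sketch: minimal fuzzy c-means clustering in numpy.
        import numpy as np

        def fuzzy_c_means(X, c=3, m=2.0, iters=100, seed=0):
            rng = np.random.default_rng(seed)
            U = rng.dirichlet(np.ones(c), size=X.shape[0])  # memberships
            for _ in range(iters):
                W = U ** m                                  # fuzzified weights
                centers = (W.T @ X) / W.sum(axis=0)[:, None]
                d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
                d = np.fmax(d, 1e-12)                # avoid division by zero
                inv = d ** (-2.0 / (m - 1.0))
                U = inv / inv.sum(axis=1, keepdims=True)    # membership update
            return centers, U

        # Stand-in for multidimensional crime-record features.
        X = np.random.default_rng(1).normal(size=(200, 4))
        centers, U = fuzzy_c_means(X)
        print(U[:3].round(2))   # soft memberships of the first records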

  6. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    International Nuclear Information System (INIS)

    Licu, Tony; Cioran, Florin; Hayward, Brent; Lowe, Andrew

    2007-01-01

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture, in which people are encouraged to provide full and open information about how incidents occurred, and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability.

  7. Methodology for cost analysis of film-based and filmless portable chest systems

    Science.gov (United States)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.
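
    The roll-up itself is simple arithmetic once the component costs are gathered; a sketch with invented numbers (real inputs would come from the FTE- and time-based labor studies described above):

        # Sketch: annual cost components -> cost per exam, hypothetical values.
        annual_exams = 6000

        film_based = {
            "labor":     2.1 * 45000,          # 2.1 FTE at $45k/yr (assumed)
            "equipment": 180000 / 7,           # straight-line depreciation, 7 yr
            "materials": annual_exams * 3.25,  # film + chemistry per exam
            "storage":   12000,                # film library space per year
        }
        filmless = {
            "labor":     1.4 * 45000,
            "equipment": 420000 / 7,           # CR + PACS share, depreciated
            "materials": annual_exams * 0.40,
            "storage":   9000,                 # digital archive per year
        }

        for name, costs in (("film-based", film_based), ("filmless", filmless)):
            print(f"{name}: ${sum(costs.values()) / annual_exams:.2f} per exam")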

  8. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance-related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation modes suited to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the control logic developed for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of PRDBEs chosen for each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. This report, in which the overall details of the SMART performance analysis are specified based on the current SMART design, can therefore be utilized as a guide for the detailed performance analysis.

  9. Development of analysis methodology on turbulent thermal striping

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Geun Jong; Jeon, Won Dae; Han, Jin Woo; Gu, Byong Kook [Changwon National University, Changwon (Korea)]

    2001-03-01

    For developing an analysis methodology, the important governing factors of the thermal striping phenomenon are identified as geometric configuration and flow characteristics such as velocity. Along these factors, the performance of the turbulence models in the existing analysis methodology is evaluated against experimental data. The status of DNS application is also assessed based on the literature. The evaluation results are reflected in setting up the new analysis methodology. From the evaluation of the existing analysis methodology, the full Reynolds stress (FRS) model is identified as the best among the turbulence models considered, and LES is found to be able to provide time-dependent turbulence values. Further improvements in the near-wall region and the temperature variance equation are required for FRS, and implementation of new sub-grid scale models is also required for LES. Through these improvements, a new, reliable analysis methodology for thermal striping can be developed. 30 refs., 26 figs., 6 tabs. (Author)

  10. Value and Vision-based Methodology in Integrated Design

    DEFF Research Database (Denmark)

    Tollestrup, Christian

    on empirical data from workshop where the Value and Vision-based methodology has been taught. The research approach chosen for this investigation is Action Research, where the researcher plays an active role in generating the data and gains a deeper understanding of the investigated phenomena. The result...... of this thesis is the value transformation from an explicit set of values to a product concept using a vision based concept development methodology based on the Pyramid Model (Lerdahl, 2001) in a design team context. The aim of this thesis is to examine how the process of value transformation is occurring within...... is divided in three; the systemic unfolding of the Value and Vision-based methodology, the structured presentation of practical implementation of the methodology and finally the analysis and conclusion regarding the value transformation, phenomena and learning aspects of the methodology....

  11. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article. The fault tree analysis, deemed to be one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in a nuclear power plant is given to illustrate the proposed method.
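
    In its simplest form, such an allocation distributes a system failure-rate target over subsystems in series using weights (for instance derived from fault tree cut set importances); a sketch with hypothetical numbers, not the paper's model:

        # Sketch: weighted failure-rate allocation for subsystems in series.
        # The target and the weights are hypothetical placeholders.
        target_system_rate = 1.0e-5   # required system failure rate, per hour

        # Weights could come from fault tree importance measures or
        # engineering judgment; they must sum to 1 for a series system.
        weights = {"fuel_supply": 0.4, "cooling": 0.2,
                   "excitation": 0.1, "control": 0.3}

        allocated = {name: w * target_system_rate for name, w in weights.items()}
        total = sum(allocated.values())  # rates add for independent series parts
        print(allocated, f"sum = {total:.2e} (target {target_system_rate:.2e})")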

  12. The methodology of semantic analysis for extracting physical effects

    Science.gov (United States)

    Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.

    2017-01-01

    The paper presents a new methodology of semantic analysis for extracting physical effects. This methodology is based on the Tuzov ontology, which formally describes the Russian language. Semantic patterns are described for extracting structural physical information in the form of physical effects, and a new text analysis algorithm is described.

  13. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    ) weighted-least-square regression. 3) Initialization of the estimation by use of linear algebra, providing a first guess. 4) Sequential and simultaneous GC parameter estimation using 4 different minimization algorithms. 5) Thorough uncertainty analysis: a) based on asymptotic approximation of the parameter...... covariance matrix, b) based on the bootstrap method, providing 95% confidence intervals of the parameters and the predicted property. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict the lower flammability limit (LFL) for refrigerants...... their credibility and robustness in wider industrial and scientific applications.

  14. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    Shen, S.-H.; Smidts, C.; Mosleh, A.

    1997-01-01

    This paper presents a model-based human error taxonomy and data collection. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of nuclear power plant operating crews during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem solving strategies'; and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics, which extract generic characteristics of error-prone behaviors and error-prone situations, are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures, with associated training, to minimize or preclude certain human errors. It also helps in the design of control rooms and in the assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  15. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. Too many effective exploits and tools exist, however, and are easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, within both the black hat and white hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  16. ANALYSIS OF EFFECTIVENESS OF METHODOLOGICAL SYSTEM FOR PROBABILITY AND STOCHASTIC PROCESSES COMPUTER-BASED LEARNING FOR PRE-SERVICE ENGINEERS

    Directory of Open Access Journals (Sweden)

    E. Chumak

    2015-04-01

    The author substantiates that only methodological training systems for mathematical disciplines that implement information and communication technologies (ICT) can meet the requirements of the modern educational paradigm and make it possible to increase educational efficiency. Due to this fact, the necessity of developing a methodology for computer-based learning of probability theory and stochastic processes for pre-service engineers is underlined in the paper. The results of the experimental study analyzing the efficiency of this methodological system are shown. The analysis includes three main stages: ascertaining, searching and forming. The key criteria of the efficiency of the designed methodological system are the level of students' probabilistic and stochastic skills and their learning motivation. The effect of implementing the methodological system on the level of students' IT literacy is shown in the paper, and the expanding range of objectives for which students apply ICT is described. The level of formation of students' learning motivation at the ascertaining and forming stages of the experiment is analyzed, and the level of intrinsic learning motivation of the pre-service engineers is defined at these stages. For this purpose, the methodology of testing the students' learning motivation in the chosen specialty is presented in the paper. The increase in intrinsic learning motivation of the experimental group students (E group) over the control group students (C group) is demonstrated.

  17. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels

    2010-01-01

    to nodes with simple functions such as liquid transport, gas transport, liquid storage, gas-liquid contacting etc. From the functions of the nodes the selection of relevant process variables and deviation variables follows directly. The knowledge required to perform the pre-meeting HAZOP task of dividing...... the plant along functional lines is that of chemical unit operations and transport processes, plus some familiarity with the plant at hand. Thus the preparatory work may be performed by a chemical engineer with just an introductory course in risk assessment. The goal based methodology lends itself directly...

  18. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
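
    The core combination step pairs a population distribution over shelter categories with each category's protection factor (PF); a minimal sketch with invented numbers, not the LLNL data:

        # Sketch: population-mean dose = sum of fraction * outdoor dose / PF.
        # All values are hypothetical placeholders.
        outdoor_dose = 100.0  # unsheltered external gamma dose, arbitrary units

        shelter_mix = {  # category: (population fraction, protection factor)
            "light_residential":  (0.45,  3.0),
            "masonry_multistory": (0.35, 10.0),
            "basement":           (0.15, 40.0),
            "outdoors":           (0.05,  1.0),
        }

        mean_dose = sum(f * outdoor_dose / pf for f, pf in shelter_mix.values())
        reduction = 1.0 - mean_dose / outdoor_dose
        print(f"population-mean dose: {mean_dose:.1f} ({reduction:.0%} reduction)")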

  19. The development of a safety analysis methodology for the optimized power reactor 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun; Yo-Han, Kim

    2005-01-01

    Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology based on the codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for the Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. Also, for non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)

  20. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The author addresses the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information.

  1. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    Fujii-e, Y.; Kozawa, Y.; Namba, C.

    1987-03-01

    Fusion systems, which are under development as future energy systems, have reached a stage at which break-even is expected to be realized in the near future. It is desirable to demonstrate that fusion systems are acceptable to the societal environment. There are three crucial viewpoints from which to measure this acceptability: technological feasibility, economy and safety. These three points are closely interrelated. The safety problem has become more important since the three large-scale tokamaks, JET, TFTR and JT-60, started experiments, and tritium will be introduced into some of them as the fusion fuel. It is desirable to establish a methodology to resolve the safety-related issues in harmony with the technological evolution. The most promising fusion system concept for reactors is not yet settled. This study has the objective to develop an adequate methodology which promotes the safety design of general fusion systems and to present a basis for proposing R and D themes and establishing the data base. A framework for the methodology, the understanding and modeling of fusion systems, the principle of ensuring safety, the safety analysis based on function, and the application of the methodology are discussed. As the result of this study, the methodology for the safety analysis and evaluation of fusion systems was developed. New ideas and approaches were presented in the course of the methodology development. (Kako, I.)

  2. Nondestructive assay methodologies in nuclear forensics analysis

    International Nuclear Information System (INIS)

    Tomar, B.S.

    2016-01-01

    In the present chapter, the nondestructive assay (NDA) methodologies used for analysis of nuclear materials as a part of nuclear forensic investigation have been described. These NDA methodologies are based on (i) measurement of the passive gamma rays and neutrons emitted by the radioisotopes present in the nuclear materials, and (ii) measurement of gamma rays and neutrons emitted after the active interrogation of the nuclear materials with a source of X-rays, gamma rays or neutrons.

  3. Methodological Approach to Company Cash Flows Target-Oriented Forecasting Based on Financial Position Analysis

    OpenAIRE

    Sergey Krylov

    2012-01-01

    The article treats a new methodological approach to target-oriented forecasting of company cash flows based on analysis of the company's financial position. The approach is featured to be universal and presumes application of the following techniques developed by the author: a financial ratio values correction technique and a correcting cash flows technique. The financial ratio values correction technique is used to analyze and forecast the company's financial position, while the correcting cash flows technique i...

  4. The analysis of RWAP(Rod Withdrawal at Power) using the KEPRI methodology

    International Nuclear Information System (INIS)

    Yang, C. K.; Kim, Y. H.

    2001-01-01

    KEPRI developed a new methodology based on RASP (Reactor Analysis Support Package). In this paper, an analysis of the RWAP (Rod Withdrawal at Power) accident, which can result in reactivity and power distribution anomalies, was performed using the KEPRI methodology. The calculation describes the RWAP transient and documents the analysis, including the computer code modeling assumptions and input parameters used in the analysis. To validate the new methodology, the calculation results were compared with the FSAR. The results obtained using the KEPRI methodology are similar to those of the FSAR, and the sensitivity results for the postulated parameters were similar to those of the existing methodology.

  5. Comparative analysis of proliferation resistance assessment methodologies

    International Nuclear Information System (INIS)

    Takaki, Naoyuki; Kikuchi, Masahiro; Inoue, Naoko; Osabe, Takeshi

    2005-01-01

    A comparative analysis of the methodologies was performed based on the discussions in the international workshop on 'Assessment Methodology of Proliferation Resistance for Future Nuclear Energy Systems' held in Tokyo in March 2005. Through the workshop and succeeding considerations, it was clarified that proliferation resistance assessment methodologies are affected by the broader nuclear options being pursued and also by the political situation of the state. Even the definition of proliferation resistance, despite the commonality of fundamental issues, derives from the perceived threat and the implementation circumstances inherent to the larger programs. A deeper recognition of the 'differences' among communities would help in making further essential and progressive discussion with harmonization. (author)

  6. Two methodologies for optical analysis of contaminated engine lubricants

    International Nuclear Information System (INIS)

    Aghayan, Hamid; Yang, Jun; Bordatchev, Evgueni

    2012-01-01

    The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, are intensively utilized in engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase the drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in engine lubricants. The methodologies are based on optical analysis of the distortion effect when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant–object optical appearance, where an a priori known periodic structure of the object is distorted by the contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image, such as the gray scale intensity of lubricant and object, shape width at object and lubricant levels, object relative intensity and width non-uniformity coefficient, are newly proposed. Variations in the contaminant concentration and the use of different contaminants lead to changes in these parameters, measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectra, transfer function
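
    As a toy version of the statistical variant, one can compare a clean profile of the known periodic object with its profile seen through the lubricant film and use the loss of zero-lag correlation as a distortion indicator (synthetic signals only; the paper's parameter set is much richer):

        # Sketch: zero-lag normalized cross-correlation between a reference
        # periodic profile and a noise-distorted view of it.
        import numpy as np

        x = np.linspace(0.0, 8 * np.pi, 2000)
        reference = np.sign(np.sin(x))        # known periodic object shape

        rng = np.random.default_rng(0)
        distorted = reference + 0.6 * rng.normal(size=x.size)  # contaminated view

        def zero_lag_xcorr(a, b):
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return float(np.correlate(a, b, mode="valid")[0] / a.size)

        print(f"correlation: {zero_lag_xcorr(reference, distorted):.3f}")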

  7. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenario Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology and the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies.
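
    The recursive enumerate-and-multiply idea is easy to show on a toy object model with a fixed event order (OBEST's strength is precisely that it also handles variable ordering, which this sketch does not attempt; the model below is hypothetical):

        # Sketch: recursively enumerate all scenarios and their probabilities
        # from objects with probabilistic branches.
        def enumerate_scenarios(objects, path=(), prob=1.0):
            """Yield (scenario, probability) for every branch combination."""
            if not objects:
                yield path, prob
                return
            name, branches = objects[0]
            for outcome, p in branches.items():
                yield from enumerate_scenarios(
                    objects[1:], path + ((name, outcome),), prob * p)

        model = [
            ("pump",     {"runs": 0.98, "fails": 0.02}),
            ("valve",    {"opens": 0.95, "stuck": 0.05}),
            ("operator", {"recovers": 0.90, "omits": 0.10}),
        ]

        for scenario, p in enumerate_scenarios(model):
            print(f"{p:.5f}  {scenario}")   # probabilities sum to 1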

  8. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)]

    2000-03-01

    This technical report was prepared to present a preliminary safety analysis methodology for the 330 MWt SMART (System-integrated Modular Advanced ReacTor), which has been developed by the Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, a further validated final safety analysis methodology will be developed. The current licensing safety analysis methodologies of the Westinghouse and KSNPP PWRs operating and under development in Korea, as well as the Russian licensing safety analysis methodology for integral reactors, have been reviewed and compared in order to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against the licensing practices of the PWRs operating or the KNGR (Korean Next Generation Reactor) under construction in Korea. A detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary-to-secondary pipe break and the small break loss of coolant accident. The SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. The validated safety analysis methodology will be submitted to MOST as a Topical Report for review of the SMART licensing safety analysis methodology. It is thus recommended that the nuclear regulatory authority establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  9. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    The article deals with the investigation of the theoretical and methodological principles of situational analysis, and the necessity of situational analysis in modern conditions is argued. The notion of "situational analysis" is defined: situational analysis is a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions to eliminate the adverse effects of the system's exposure to the situation now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative and searching functions in the process of situational analysis is shown. The basic methodological elements of situational analysis are grounded. The substantiation of the principal methodological elements of system analysis will enable the analyst to develop adaptive methods able to take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system; to diagnose such a situation and subject it to systematic, in-depth analysis; to identify risks and opportunities; and to make timely management decisions as required by the particular period.

  10. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    The Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to their influencing factors.

  11. [Methodological novelties applied to the anthropology of food: agent-based models and social networks analysis].

    Science.gov (United States)

    Díaz Córdova, Diego

    2016-01-01

    The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social network analysis. In order to illustrate these methods in action, two cases based on materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.
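
    A minimal flavor of the social network strategy, using a hypothetical household food-sharing network (networkx assumed available; this is not the article's dataset):

        # Sketch: degree centrality on a toy food-sharing network.
        import networkx as nx

        edges = [("hh1", "hh2"), ("hh1", "hh3"), ("hh2", "hh3"),
                 ("hh3", "hh4"), ("hh4", "hh5")]
        g = nx.Graph(edges)

        centrality = nx.degree_centrality(g)
        for household, c in sorted(centrality.items(), key=lambda kv: -kv[1]):
            print(f"{household}: degree centrality {c:.2f}")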

  12. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)

  13. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • A Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • The Dimensional Variation Model implementation workflow is described. • The methodology phases are described in detail, and the application of this methodology to the tolerance analysis of the ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCRs (Project Change Requests), DRs (Deviation Requests) and NCRs (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex, highly integrated systems with critical functional requirements and reduced design clearances to minimize the impact on cost and performance. Tolerances and assembly accuracies in critical areas could have a serious impact on the final performance, compromising machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity in the project life cycle. A 3D tolerance simulation analysis of the ITER Tokamak machine has been developed based on the dedicated 3DCS software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation in critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the maturity of the VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on
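
    The underlying idea can be shown on a one-dimensional Monte Carlo stack-up, standing in for the 3DCS model (nominals, tolerances and the acceptance band below are invented):

        # Sketch: Monte Carlo tolerance stack-up for an assembly clearance.
        import numpy as np

        rng = np.random.default_rng(7)
        N = 100_000

        # Two parts in a housing; tolerances treated as +/-3-sigma spreads.
        a = rng.normal(120.0, 0.30 / 3, N)  # part A: 120.0 +/- 0.30 (assumed)
        b = rng.normal(80.0,  0.20 / 3, N)  # part B:  80.0 +/- 0.20 (assumed)
        housing = 201.0                     # housing length (assumed)
        gap = housing - (a + b)             # resulting assembly clearance

        lo, hi = 0.4, 1.6                   # required clearance band (assumed)
        out_of_spec = np.mean((gap < lo) | (gap > hi))
        print(f"mean gap {gap.mean():.3f}, sigma {gap.std():.3f}, "
              f"out of spec {out_of_spec:.2%}")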

  14. The API methodology for risk-based inspection (RBI) analysis for the petroleum and petrochemical industry

    International Nuclear Information System (INIS)

    Reynolds, J.T.

    1998-01-01

    Twenty-one petroleum and petrochemical companies are currently sponsoring a project within the American Petroleum Institute (API) to develop risk-based inspection (RBI) methodology for application in the refining and petrochemical industry. This paper describes that particular RBI methodology and provides a summary of the three levels of RBI analysis developed by the project. Also included is a review of the first pilot project to validate the methodology by applying RBI to several existing refining units. The failure of pressure equipment in a process unit can have several undesirable effects. For the purpose of RBI analysis, the API RBI program categorizes these effects into four basic risk outcomes: flammable events, toxic releases, major environmental damage, and business interruption losses. API RBI is a strategic process, both qualitative and quantitative, for understanding and reducing these risks associated with operating pressure equipment. This paper will show how API RBI assesses the potential consequences of a failure of the pressure boundary, as well as assessing the likelihood (probability) of failure. Risk-based inspection also prioritizes risk levels in a systematic manner so that the owner-user can then plan an inspection program that focuses more resources on the higher-risk equipment, while possibly saving inspection resources that are not doing an effective job of reducing risk. At the same time, if consequence of failure is a significant driving force for high-risk equipment items, plant management also has the option of applying consequence mitigation steps to minimize the impact of a hazardous release, should one occur. The target audience for this paper is engineers, inspectors, and managers who want to understand what API Risk-Based Inspection is all about, what the benefits and limitations of RBI are, and how inspection practices can be changed to reduce risks and/or save costs without impacting safety risk. (Author)
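
    At its core the ranking multiplies a likelihood of failure by a consequence measure per equipment item; a toy version with hypothetical items and scores (the API levels define far more detailed scoring):

        # Sketch: risk = probability of failure x consequence, then rank.
        items = {
            # name: (probability of failure per year, consequence score)
            "crude_column":  (1e-3, 90.0),
            "reboiler":      (5e-3, 20.0),
            "overhead_drum": (1e-4, 60.0),
            "transfer_line": (2e-3, 75.0),
        }

        ranked = sorted(items.items(),
                        key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
        for name, (pof, cof) in ranked:   # top entries get inspection focus
            print(f"{name:14s} risk = {pof * cof:.3f}")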

  15. Development of seismic risk analysis methodologies at JAERI

    International Nuclear Information System (INIS)

    Tanaka, T.; Abe, K.; Ebisawa, K.; Oikawa, T.

    1988-01-01

    The usefulness of probabilistic safety assessment (PSA) for the balanced design and regulation of nuclear power plants is recognized worldwide. In Japan, the Japan Atomic Energy Research Institute (JAERI) has been engaged in developing the methodologies necessary for carrying out PSA. The research and development program was started in 1980, at which time the effort covered only internal-initiator PSA. In 1985 the program was expanded to include external event analysis. Although this expanded program is to cover various external initiators, the current effort is dedicated to seismic risk analysis. There are three levels of seismic PSA, as for internal-initiator PSA: Level 1, evaluation of core damage frequency; Level 2, evaluation of radioactive release frequency and source terms; and Level 3, evaluation of environmental consequences. In JAERI's program, only the methodologies for Level 1 seismic PSA are under development. The methodology development for seismic risk analysis is divided into two phases. The Phase I study is to establish a whole set of simple methodologies based on currently available data. In Phase II, a sensitivity study will be carried out to identify the parameters whose uncertainty may result in large uncertainty in seismic risk, and for such parameters the methodology will be upgraded. The Phase I study has now almost been completed. In this report, outlines of the study and some of its outcomes are described.

  16. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once the most sensitive parameters are identified, research effort should be directed to them to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics applications, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large-scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
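
    A cheap screening step in the spirit of step (2) can be done with standardized regression coefficients (SRCs); note that a purely linear screen misses the quadratic x4 effect below, which is what the later quantitative stage and the interaction technique are for (toy response model, not the PSUADE implementation):

        # Sketch: parameter screening via standardized regression coefficients.
        import numpy as np

        rng = np.random.default_rng(3)
        n, names = 2000, ["x1", "x2", "x3", "x4"]
        X = rng.uniform(-1.0, 1.0, size=(n, 4))       # sampled inputs

        # Hypothetical simulator response: x2 dominates, x3 is inert,
        # x4 acts only through a quadratic (nonlinear) term.
        y = 0.5 * X[:, 0] + 3.0 * X[:, 1] + 0.8 * X[:, 3] ** 2

        Xs = (X - X.mean(0)) / X.std(0)
        ys = (y - y.mean()) / y.std()
        src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        for name, s in sorted(zip(names, src), key=lambda t: -abs(t[1])):
            print(f"{name}: SRC = {s:+.3f}")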

  17. Analysis of market competitive structure: A new methodological approach based on usage

    International Nuclear Information System (INIS)

    Romero de la Fuente, J.; Yague Guillen, M. J.

    2007-01-01

    This paper proposes a new methodological approach to identify market competitive structure, applying the usage situation concept in positioning analysis. The dimensions used by consumers to classify products are identified using Correspondence Analysis, and competitive groups are formed. The results are validated with Discriminant Analysis. (Author) 23 refs.

  18. Failure mode effect analysis and fault tree analysis as a combined methodology in risk management

    Science.gov (United States)

    Wessiani, N. A.; Yoshio, F.

    2018-04-01

    Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most of these studies choose only one of the two methods in their risk management methodology. On the other hand, combining the two methods reduces the drawbacks each has when implemented separately. This paper aims to combine the methodologies of FMEA and FTA in assessing risk. A case study in a metal company illustrates how this methodology can be implemented. In the case study, the combined methodology assesses the internal risks that occur in the production process. Those internal risks should then be mitigated based on their level of risk.
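
    A compact illustration of the combination: FMEA's risk priority number (RPN = severity x occurrence x detection) picks the dominant failure mode, and a small fault tree then quantifies it. All modes, scores and probabilities below are hypothetical:

        # Sketch: FMEA RPN ranking feeding a two-gate fault tree.
        failure_modes = {
            # mode: (severity, occurrence, detection), each scored 1-10
            "furnace_overheat": (9, 3, 4),
            "mold_misalign":    (6, 5, 3),
            "crane_drop":       (10, 2, 2),
        }
        rpn = {m: s * o * d for m, (s, o, d) in failure_modes.items()}
        worst = max(rpn, key=rpn.get)
        print("RPNs:", rpn, "-> analyze further:", worst)

        # Fault tree for the worst mode:
        # TOP = (sensor_fail OR controller_fail) AND operator_misses,
        # with independent basic events.
        p = {"sensor_fail": 0.01, "controller_fail": 0.005,
             "operator_misses": 0.1}
        p_or = 1 - (1 - p["sensor_fail"]) * (1 - p["controller_fail"])
        p_top = p_or * p["operator_misses"]
        print(f"P(top event) = {p_top:.5f}")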

  19. Risk analysis methodologies for the transportation of radioactive materials

    International Nuclear Information System (INIS)

    Geffen, C.A.

    1983-05-01

    Different methodologies have evolved for consideration of each of the many steps required in performing a transportation risk analysis. Although there are techniques that attempt to consider the entire scope of the analysis in depth, most applications of risk assessment to the transportation of nuclear fuel cycle materials develop specific methodologies for only one or two parts of the analysis. The remaining steps are simplified for the analyst by narrowing the scope of the effort (such as evaluating risks for only one material, or a particular set of accident scenarios, or movement over a specific route); performing a qualitative rather than a quantitative analysis (probabilities may be simply ranked as high, medium or low, for instance); or assuming some generic, conservative conditions for potential release fractions and consequences. This paper presents a discussion of the history and present state of the art of transportation risk analysis methodologies. Many reports in this area were reviewed as background for this presentation. The literature review, while not exhaustive, did result in a complete representation of the major methods used today in transportation risk analysis. These methodologies primarily include the use of severity categories based on historical accident data, the analysis of specifically assumed accident sequences for the transportation activity of interest, and the use of fault or event tree analysis. Although the focus of this work has generally been on potential impacts to public groups, some effort has been expended in the estimation of risks to occupational groups in transportation activities.

  1. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej; Perešíni, Peter; Kostić, Dejan; Canini, Marco

    2018-01-01

    and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a

  2. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The main methodology of this paper proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new design of computer mouse using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were considered to determine the environmental impact categories. Sustainability analysis was conducted using the SolidWorks software. As a result, high-density PE gives the lowest amounts in the environmental impact categories with a great maximum stress value.
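
    Concept scoring of the kind used to select concept B is a weighted sum over criteria; a sketch with invented criteria, weights and scores:

        # Sketch: weighted concept-scoring matrix (hypothetical values).
        criteria = {"part count": 0.30, "disassembly time": 0.30,
                    "ergonomics": 0.20, "cost": 0.20}
        scores = {               # 1-5 per criterion, in the order above
            "concept A": [3, 3, 4, 4],
            "concept B": [5, 4, 4, 3],
            "concept C": [4, 3, 3, 5],
        }

        weights = list(criteria.values())
        totals = {c: sum(w * s for w, s in zip(weights, ss))
                  for c, ss in scores.items()}
        best = max(totals, key=totals.get)
        print(totals, "-> select", best)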

  3. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  4. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  5. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    Science.gov (United States)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are generated daily in plants, which costs the industry tremendously in waste treatment and disposal and hinders its further development. It has therefore become urgent for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules; these form the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and

  6. Analysis of Interbrand, BrandZ and BAV brand valuation methodologies

    Directory of Open Access Journals (Sweden)

    Krstić Bojan

    2011-01-01

    Brand valuation is considered one of the most significant challenges not only for the theory and practice of contemporary marketing but for other disciplines as well. The complex nature of this issue implies the need for a multidisciplinary approach and for a methodology that goes beyond the borders of marketing as a discipline, drawing on knowledge from accounting, finance and other areas. However, mostly one-sided approaches, oriented towards determining brand value either from research on consumer behavior and attitudes or from the financial success of the brand, dominate the marketing and financial literature. In parallel with these theoretical methodologies, consultancy and marketing agencies and other organizations have been developing their own brand valuation methods and models. Some of these follow a comprehensive approach to brand valuation, which overcomes the problem of one-sided analysis of brand value. The comprehensive approach presumes brand valuation based on the benefits the brand provides both to customers and to the enterprise that owns it; in other words, it rests on qualitative and quantitative measures reflecting, respectively, the behavior and attitudes of consumers and the assumed financial value of the brand, or, more precisely, the capitalization of brand value. In line with the defined research subject, this paper is structured as follows: the importance and the problem of brand value are reviewed in the Introduction; the three best-known brand valuation methodologies developed by consultancy agencies (the Interbrand methodology and the BrandZ and BAV models) are analyzed in the next section; and in the further considerations the results of a comparative analysis of these methodologies are presented and implications for adequate brand valuation are suggested.

  7. An Efficient Power Estimation Methodology for Complex RISC Processor-based Platforms

    OpenAIRE

    Rethinagiri , Santhosh Kumar; Ben Atitallah , Rabie; Dekeyser , Jean-Luc; Niar , Smail; Senn , Eric

    2012-01-01

    In this contribution, we propose an efficient power estimation methodology for complex RISC processor-based platforms. In this methodology, Functional Level Power Analysis (FLPA) is used to set up generic power models for the different parts of the system. Then, a simulation framework based on a virtual platform is developed to accurately evaluate the activities used in the related power models. The combination of the two parts above leads to a heterogeneous...

  8. Development of Audit Calculation Methodology for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joosuk; Kim, Gwanyoung; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The interim criteria contain more stringent limits than the previous ones. For example, pellet-to-cladding mechanical interaction (PCMI) was introduced as a new failure criterion. Both short-term (e.g., fuel-to-coolant interaction, rod burst) and long-term (e.g., fuel rod ballooning, flow blockage) phenomena should be addressed to assure core coolability. For dose calculations, transient-induced fission gas release must additionally be accounted for. Traditionally, the RIA analysis methodologies approved for licensing application have been developed on a conservative basis, but the newly introduced safety criteria tend to reduce the margins to the criteria, so licensees are trying to improve the margins by utilizing a less conservative approach. To cope with this trend, a new audit calculation methodology needs to be developed. In this paper, the new methodology, currently under development at KINS, is introduced. For the development of an audit calculation methodology for RIA safety analysis based on the realistic evaluation approach, a preliminary calculation using a best-estimate code has been performed for the initial core of APR1400. The main conclusions follow. - With the assumption of a single full-strength control rod ejection at HZP conditions, rod failure due to PCMI is not predicted. - Coolability can be assured in terms of enthalpy and fuel melting. - However, rod failure due to DNBR is expected, and there is also a possibility of fuel failure at rated power conditions.

  9. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology, I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a precise description of the procedure for collecting business requirements while following the business strategy.

  10. Network meta-analysis-highly attractive but more methodological research is needed

    Directory of Open Access Journals (Sweden)

    Singh Sonal

    2011-06-01

    Network meta-analysis, in the context of a systematic review, is a meta-analysis in which multiple treatments (that is, three or more) are compared using both direct comparisons of interventions within randomized controlled trials and indirect comparisons across trials based on a common comparator. To ensure the validity of findings from network meta-analyses, the systematic review must be designed rigorously and conducted carefully. Aspects of designing and conducting a systematic review for network meta-analysis include defining the review question, specifying eligibility criteria, searching for and selecting studies, assessing risk of bias and quality of evidence, conducting the network meta-analysis, and interpreting and reporting findings. This commentary summarizes the methodologic challenges and research opportunities for network meta-analysis relevant to each aspect of the systematic review process, based on discussions at a network meta-analysis methodology meeting we hosted in May 2010 at the Johns Hopkins Bloomberg School of Public Health. Since this commentary reflects the discussion at that meeting, it is not intended to provide an overview of the field.
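    A simple worked example of the indirect-comparison building block that underlies network meta-analysis is the adjusted indirect comparison (Bucher method), in which two treatments are compared through a common comparator and the variances of the direct estimates add. The numbers below are made up for illustration:

    ```python
    import math

    # Adjusted indirect comparison (Bucher method): compare A vs B through a
    # common comparator C. The effect estimates below are invented log odds ratios.

    d_AC, se_AC = -0.40, 0.12   # direct estimate, A vs C
    d_BC, se_BC = -0.15, 0.10   # direct estimate, B vs C

    d_AB = d_AC - d_BC                        # indirect A-vs-B effect
    se_AB = math.sqrt(se_AC**2 + se_BC**2)    # variances of the directs add

    lo, hi = d_AB - 1.96 * se_AB, d_AB + 1.96 * se_AB
    print(f"log OR, A vs B (indirect): {d_AB:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
    print(f"OR scale: {math.exp(d_AB):.2f} [{math.exp(lo):.2f}, {math.exp(hi):.2f}]")
    ```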

  11. Exploring Participatory Methodologies in Organizational Discourse Analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches such as single-level vs. multi-level, critical vs. participatory, and discursive vs. material methods. They raise methodological issues of combining such approaches to embrace multimodality in order to enable new contributions. Conceptual efforts have been made, but further exploration of methodological combinations and their practical implications is called for. This paper argues 1) to combine methodologies by approaching this as scholarly subjectification processes, and 2) to perform combinations in both...

  12. A methodology for the data energy regional consumption consistency analysis

    International Nuclear Information System (INIS)

    Canavarros, Otacilio Borges; Silva, Ennio Peres da

    1999-01-01

    The article introduces a methodology for consistency analysis of regional energy consumption data. The work is based on recent studies by the cited authors and addresses Brazilian energy matrices and regional energy balances. The results are compared and analyzed.

  13. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanopa, Amornchai; Gani, Rafiqul

    2013-01-01

    A systematic design methodology is developed for producing multiple main products plus side products starting from one or more bio-based renewable sources. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. ... Economic analysis and net present value calculations are performed to find the most economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol production.

  14. Methodology for Design and Analysis of Reactive Distillation Involving Multielement Systems

    DEFF Research Database (Denmark)

    Jantharasuk, Amnart; Gani, Rafiqul; Górak, Andrzej

    2011-01-01

    A new methodology for the design and analysis of reactive distillation has been developed. In this work, the element-based approach, coupled with a driving force diagram, has been extended and applied to the design of a reactive distillation column involving multielement (multicomponent) systems ... consisting of two components. Based on this methodology, an optimal design configuration is identified using the equivalent binary-element driving force diagram. Two case studies, methyl acetate (MeOAc) synthesis and methyl tert-butyl ether (MTBE) synthesis, have been considered to demonstrate the successful application of the methodology. Moreover, energy requirements for various column configurations corresponding to different feed locations...

  15. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies for the analysis of uncertainties employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). Uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probabilities of occurrence. The results obtained from Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared according to the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET, the methodology for quantification of APET uncertainty inputs and its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes.
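    As a schematic illustration of item (3), statistical propagation of uncertainty inputs, the sketch below samples uncertain branch probabilities of a miniature two-branch event tree and reports the resulting distribution for one source term category; the tree structure and distributions are invented for illustration, not taken from the report:

    ```python
    import random
    import statistics

    # Schematic Monte Carlo propagation of uncertain branch probabilities
    # through a miniature two-branch APET; structure and distributions are
    # invented for illustration.

    random.seed(0)
    samples = []
    for _ in range(10_000):
        # Analyst's epistemic uncertainty on phenomenological branch
        # probabilities, expressed here as beta distributions.
        p_melt_through = random.betavariate(2, 8)    # vessel melt-through
        p_early_fail   = random.betavariate(1, 19)   # early containment failure

        # Conditional probability of an 'early release' source term category.
        samples.append(p_melt_through * p_early_fail)

    samples.sort()
    print(f"mean: {statistics.fmean(samples):.4f}")
    print(f"5th percentile:  {samples[int(0.05 * len(samples))]:.5f}")
    print(f"95th percentile: {samples[int(0.95 * len(samples))]:.4f}")
    ```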

  16. Design methodology for bio-based processing: Biodiesel and fatty alcohol production

    DEFF Research Database (Denmark)

    Simasatikul, Lida; Arpornwichanop, Amornchai; Gani, Rafiqul

    2012-01-01

    A systematic design methodology is developed for producing two main products plus side products starting from one or more bio-based renewable sources. A superstructure that includes all possible reaction and separation operations is generated through thermodynamic insights and available data. The ... Economic analysis and net present value calculations are performed to find the most economically and operationally feasible process. The application of the methodology is presented through a case study involving biodiesel and fatty alcohol production.

  17. Constructive Analysis : A Study in Epistemological Methodology

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    The present study is concerned with the viability of the primary method in contemporary philosophy, i.e., conceptual analysis. Starting out by tracing the roots of this methodology to Platonic philosophy, the study questions whether such a methodology makes sense when divorced from Platonic philosophy, and develops a framework for a kind of analysis that is more in keeping with recent psychological research on categorization. Finally, it is shown that this kind of analysis can be applied to the concept of justification in a manner that furthers the epistemological goal of providing intellectual guidance.

  18. A solar reserve methodology for renewable energy integration studies based on sub-hourly variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, Eduardo; Brinkman, Gregory; Hummon, Marissa [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lew, Debra

    2012-07-01

    Increasing penetration of wind and solar energy is raising concerns among electric system operators because of the variability and uncertainty associated with these power sources. Previous work focused on the quantification of reserves for systems with wind power. This paper presents a new methodology that allows the determination of the reserves necessary for high penetrations of photovoltaic power and compares it to the wind-based methodology. The solar reserve methodology was applied to Phase 2 of the Western Wind and Solar Integration Study. A summary of the results is included. (orig.)
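    A sketch of the general idea, holding enough reserve to cover a chosen percentile of sub-hourly ramps in solar output, is shown below; the synthetic 10-minute PV series and the 95% coverage target are illustrative assumptions, not the paper's actual procedure:

    ```python
    import math
    import random

    # Reserve sized to cover a chosen percentile of short-term solar ramps.
    # The 10-minute PV series below is synthetic; real studies would use
    # measured or modeled sub-hourly production data.

    random.seed(1)
    pv_mw = [max(0.0, 60 + 25 * math.sin(i / 20) + random.gauss(0, 6))
             for i in range(1000)]                    # 10-minute resolution

    ramps = sorted(abs(b - a) for a, b in zip(pv_mw, pv_mw[1:]))
    reserve = ramps[int(0.95 * len(ramps))]           # 95th-percentile ramp

    print(f"Reserve covering 95% of 10-minute solar ramps: {reserve:.1f} MW")
    ```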

  19. Development of Advanced Non-LOCA Analysis Methodology for Licensing

    International Nuclear Information System (INIS)

    Jang, Chansu; Um, Kilsup; Choi, Jaedon

    2008-01-01

    KNF is developing a new design methodology for Non-LOCA analysis for licensing purposes. The code chosen is the best-estimate transient analysis code RETRAN, and the OPR1000 is the target plant. For this purpose, KNF prepared a simple nodal scheme appropriate to the licensing analyses and developed the designer-friendly analysis tool ASSIST (Automatic Steady-State Initialization and Safety analysis Tool). To check the validity of the newly developed methodology, the single CEA withdrawal and locked rotor accidents were analyzed using the new methodology and compared with current design results. The comparisons show good agreement, and it is concluded that the new design methodology can be applied to licensing calculations for OPR1000 Non-LOCA events.

  20. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  1. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations; to collect high-quality, unambiguous empirical data for validation; and to apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, a literature survey, the validation methodology, comparative studies, analytical verification, empirical validation, a comparative evaluation of the codes, and conclusions.
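    Empirical validation of this kind typically compares simulated against measured energy use with summary statistics. The sketch below computes two commonly used metrics, normalized mean bias error and the coefficient of variation of RMSE; the hourly loads are made-up numbers, and the metrics are a generic illustration rather than the report's specific procedure:

    ```python
    import math

    # Two summary statistics often used when comparing simulated and
    # measured building energy use; the hourly loads are made-up numbers.

    measured  = [12.1, 14.0, 15.2, 13.8, 11.9, 10.5]   # kWh
    simulated = [11.5, 14.6, 15.9, 13.1, 12.4, 10.0]   # kWh

    n = len(measured)
    mean_meas = sum(measured) / n
    residuals = [m - s for m, s in zip(measured, simulated)]

    nmbe = 100 * sum(residuals) / (n * mean_meas)                     # bias, %
    cv_rmse = 100 * math.sqrt(sum(r * r for r in residuals) / n) / mean_meas

    print(f"NMBE = {nmbe:+.1f}%, CV(RMSE) = {cv_rmse:.1f}%")
    ```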

  2. MS-based analytical methodologies to characterize genetically modified crops.

    Science.gov (United States)

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need of new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  3. Risk-based methodology for USNRC inspections

    International Nuclear Information System (INIS)

    Wong, S.M.; Holahan, G.M.; Chung, J.W.; Johnson, M.R.

    1995-01-01

    This paper describes the development and trial applications of a risk-based methodology to enhance the inspection processes for US nuclear power plants. Objectives of risk-based methods to complement prescriptive engineering approaches in US Nuclear Regulatory Commission (USNRC) inspection programs are presented. Insights from time-dependent risk profiles of plant configurations from Individual Plant Evaluation (IPE) studies were integrated to develop a framework for optimizing inspection efforts in NRC regulatory initiatives. Lessons learned from NRC pilot applications of the risk-based methodology for evaluating the effectiveness of operational risk management programs at US nuclear power plant sites are also discussed.

  4. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Horton, D.G.

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda.

  5. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    D.G. Horton

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda.

  6. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    Hennig, D.; Nechvatal, L.

    1996-09-01

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operating points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs.

  7. A methodology for sunlight urban planning: a computer-based solar and sky vault obstruction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Fernando Oscar Ruttkay; Silva, Carlos Alejandro Nome [Federal Univ. of Santa Catarina (UFSC), Dept. of Architecture and Urbanism, Florianopolis, SC (Brazil); Turkienikz, Benamy [Federal Univ. of Rio Grande do Sul (UFRGS), Faculty of Architecture, Porto Alegre, RS (Brazil)

    2001-07-01

    The main purpose of the present study is to describe a planning methodology to improve the quality of the built environment based on the rational control of solar radiation and the view of the sky vault. The main criterion used to control the access to and obstruction of solar radiation is the concept of the desirability and undesirability of solar radiation. A case study implementing the proposed methodology is developed. Although further development is needed for it to find its way into regulations and practical applications, the methodology has shown strong potential to deal with an aspect that would otherwise be almost impossible to address. (Author)

  8. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    International Nuclear Information System (INIS)

    Temple, S.M.; Robbins, T.R.

    1986-09-01

    This report provides an overview of the basis and methodology requirements for determining Pressurized Water Reactor (PWR) technical-specification-related setpoints and focuses on the development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970s to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. Descriptions of the typical setpoint methodologies are provided for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse. The description of the methods of analysis includes a discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR.

  9. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    The article is devoted to the theoretical and methodological principles of the strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for the study of strategies is proved under modern conditions of a high level of dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, and the sources of incoming and outgoing information) are justified in the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of strategic financial analysis. A classification of the factors determining the size and structure of a company's capital is grounded. The economic nature of the company's capital is clarified. We consider capital to be a stock of economic resources in the form of cash and tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners' prosperity and to achieve a social effect.

  10. Multiple-Fault Detection Methodology Based on Vibration and Current Analysis Applied to Bearings in Induction Motors and Gearboxes on the Kinematic Chain

    Directory of Open Access Journals (Sweden)

    Juan Jose Saucedo-Dorantes

    2016-01-01

    Gearboxes and induction motors are important components in industrial applications, and monitoring their condition is critical in the industrial sector in order to reduce costs and maintenance downtime. There are several techniques for fault diagnosis in rotating machinery; however, vibration and stator current analysis are commonly used due to their proven reliability. Indeed, vibration and current analysis provide fault condition information by identifying the fault-related spectral components. This work presents a methodology based on vibration and current analysis for diagnosing wear in a gearbox and detecting bearing defects in an induction motor, both linked to the same kinematic chain; in addition, the location of the fault-related components is supported by the corresponding theoretical models. The theoretical models are based on the calculation of characteristic gearbox and bearing fault frequencies, in order to locate the spectral components of the faults. In this work, the influence of vibrations on the system is observed by performing motor current signal analysis to detect the presence of faults. The obtained results show the feasibility of detecting multiple faults in a kinematic chain, making the proposed methodology suitable for use in industrial machinery diagnosis.
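    The characteristic bearing-defect frequencies referred to above follow standard kinematic formulas. The sketch below computes them for a hypothetical bearing geometry; only the formulas are standard, the numbers are illustrative:

    ```python
    import math

    # Standard kinematic formulas for characteristic bearing-defect
    # frequencies; the bearing geometry below is hypothetical.

    def bearing_fault_freqs(shaft_hz, n_elements, ball_d, pitch_d, contact_deg=0.0):
        r = (ball_d / pitch_d) * math.cos(math.radians(contact_deg))
        return {
            "FTF":  shaft_hz / 2 * (1 - r),                  # cage (train) frequency
            "BPFO": n_elements * shaft_hz / 2 * (1 - r),     # outer-race defect
            "BPFI": n_elements * shaft_hz / 2 * (1 + r),     # inner-race defect
            "BSF":  pitch_d / (2 * ball_d) * shaft_hz * (1 - r * r),  # ball spin
        }

    # Example: 30 Hz shaft speed, 9 rolling elements, dimensions in mm.
    for name, f in bearing_fault_freqs(30.0, 9, 7.94, 38.5).items():
        print(f"{name}: {f:.1f} Hz")
    ```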

  11. Analysis Planning Methodology: For Thesis, Joint Applied Project, & MBA Research Reports

    OpenAIRE

    Naegle, Brad R.

    2010-01-01

    Acquisition Research Handbook Series Purpose: This guide provides the graduate student researcher—you—with techniques and advice on creating an effective analysis plan, and it provides methods for focusing the data-collection effort based on that analysis plan. As a side benefit, this analysis planning methodology will help you to properly scope the research effort and will provide you with insight for changes in that effort. The information presented herein was supported b...

  12. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    This theoretical essay aims to reflect on three models of text interpretation used in qualitative research that are often confused in their concepts and methodologies (Content Analysis, Discourse Analysis, and Conversation Analysis). After presenting the concepts, the essay proposes a preliminary discussion of the conceptual and theoretical-methodological differences perceived among them. A literature review was performed to support this discussion. It was verified that the models differ in the type of strategy used in the treatment of texts, the type of approach, and the appropriate theoretical position.

  13. Systemic design methodologies for electrical energy systems analysis, synthesis and management

    CERN Document Server

    Roboam, Xavier

    2012-01-01

    This book proposes systemic design methodologies applied to electrical energy systems, in particular analysis and system management, and modeling and sizing tools. It includes 8 chapters: after an introduction to the systemic approach (history, basics & fundamental issues, index terms) for designing energy systems, the book presents two different graphical formalisms especially dedicated to multidisciplinary device modeling, synthesis and analysis: Bond Graph and COG/EMR. Other systemic analysis approaches for the quality and stability of systems, as well as for safety and robustness analysis tools, are also proposed. One chapter is dedicated to energy management and another is focused on Monte Carlo algorithms for sizing electrical systems and networks. The aim of this book is to summarize design methodologies based in particular on a systemic viewpoint, considering the system as a whole. These methods and tools are proposed by the most important French research laboratories, which have many scientific partners...

  14. Probabilistic methodology for turbine missile risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Frank, R.A.

    1984-01-01

    A methodology has been developed for estimating the probabilities of turbine-generated missile damage to nuclear power plant structures and systems. Mathematical models of the missile generation, transport, and impact events have been developed and sequenced to form an integrated turbine missile simulation methodology. Probabilistic Monte Carlo techniques are used to estimate the plant impact and damage probabilities. The methodology has been coded in the TURMIS computer code to facilitate numerical analysis and plant-specific turbine missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and probabilities have been estimated for a hypothetical nuclear power plant case study. (orig.)
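    A minimal sketch of the probabilistic chain, P(damage) = P(generation) x P(strike | generation) x P(damage | strike), propagated by Monte Carlo sampling over assumed input distributions, is given below; the distributions are placeholders and do not reflect TURMIS models or data:

    ```python
    import math
    import random

    # Monte Carlo propagation of the turbine-missile chain
    # P(damage) = P(generation) * P(strike | generation) * P(damage | strike);
    # all input distributions are placeholders, not TURMIS models.

    random.seed(0)
    samples = []
    for _ in range(50_000):
        p_gen    = random.lognormvariate(math.log(1e-4), 0.5)  # missiles per year
        p_strike = random.uniform(0.01, 0.10)                  # geometry-dependent
        p_damage = random.betavariate(3, 7)                    # barrier fragility
        samples.append(p_gen * p_strike * p_damage)

    samples.sort()
    print(f"median:          {samples[len(samples) // 2]:.2e} per year")
    print(f"95th percentile: {samples[int(0.95 * len(samples))]:.2e} per year")
    ```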

  15. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    The article is devoted to the theoretical and methodological principles of the strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for the study of strategies is proved under modern conditions of a high level of dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, and the sources of incoming and outgoing information) are justified in the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of strategic financial analysis. A classification of the factors determining the size and structure of a company's capital is grounded. The economic nature of the company's capital is clarified. We consider capital to be a stock of economic resources in the form of cash and tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners' prosperity and to achieve a social effect.

  16. PROBLEMS AND METHODOLOGY OF THE PETROLOGIC ANALYSIS OF COAL FACIES.

    Science.gov (United States)

    Chao, Edward C.T.

    1983-01-01

    This condensed synthesis gives a broad outline of the methodology of coal facies analysis, procedures for constructing sedimentation and geochemical formation curves, and micro- and macrostratigraphic analysis. The hypothetical coal bed profile has a 3-fold cycle of material characteristics. Based on studies of other similar profiles of the same coal bed, and on field studies of the sedimentary rock types and their facies interpretation, one can assume that the 3-fold subdivision is of regional significance.

  17. Modernising educational programmes in ICT based on the Tuning methodology

    Directory of Open Access Journals (Sweden)

    Alexander Bedny

    2014-07-01

    An analysis is presented of the experience of modernising undergraduate educational programmes using the TUNING methodology, based on the example of the area of studies "Fundamental computer science and information technology" (FCSIT) implemented at Lobachevsky State University of Nizhni Novgorod (Russia). The algorithm for reforming curricula in the subject area of information technology in accordance with the TUNING methodology is explained. A comparison is drawn between the existing Russian and European standards in the area of ICT education, including the European e-Competence Framework, with a focus on the relevant competences. Some guidelines for the preparation of educational programmes are also provided.

  18. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has recently been developed and adopted for small-break LOCA EEQ. This methodology is now extended to the M/E release analysis for containment design for large-break LOCA and main steam line break (MSLB) accidents, and is named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K, with an enhanced M/E model and a LOCA long-term model, to CONTEMPT4/MOD5. The KIMERA methodology is applied to the MSLB M/E release analysis to validate it for MSLB in containment design. The results are compared with the OPR1000 FSAR.

  19. Formation of the methodological matrix of the strategic analysis of the enterprise

    Directory of Open Access Journals (Sweden)

    N.H. Vygovskaya

    2018-04-01

    The article is devoted to the study of the methodological matrix of the strategic analysis of the enterprise. The aim of this article is to analyze the influence of methodological changes in the 20th century on the methodology of strategic analysis, and to critically assess and generalize scientific approaches to its methods. An evaluation of scientific works on analysis made it possible to identify the following problems in the methodology of strategic analysis: the failure to consider the features of strategic analysis in the formation of its methods, which often leads to confusion with the methods of financial (economic) analysis; the failure to use the fact that strategic analysis contains, besides the methods for analyzing the internal and external environment, methods of forecast analysis aimed at forming the strategy for the development of the enterprise; the conflation of the concepts «image», «reception» and «method» of analysis; the multidirectionality and indistinctness of the classification criteria for methods of strategic analysis; and the blind copying of foreign techniques and methods of strategic analysis without taking into account the specifics of domestic economic conditions. The expediency of using the system approach in forming the methodological design of strategic analysis is proved, which allows methodology as a science of methods (a broad approach to the methods of strategic analysis) to be combined with methodology as a set of applied methods and techniques of analysis (a narrow approach to methodology). The use of the system approach allowed three levels of the methodology of strategic analysis to be distinguished. The first and second levels of methodology correspond to the level of science, the third level to practice. In developing the third level of special methods of strategic analysis, an approach is applied that differentiates them depending on the stages of strategic analysis (methods of the stage

  20. Taipower's transient analysis methodology for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, Pinghue

    1998-01-01

    The methodology presented in this paper is a part of 'Taipower's Reload Design and Transient Analysis Methodologies for Light Water Reactors' developed by the Taiwan Power Company (TPC) and the Institute of Nuclear Energy Research. This methodology utilizes four computer codes developed or sponsored by the Electric Power Research Institute: the system transient analysis code RETRAN-02, the core thermal-hydraulic analysis code COBRAIIIC, the three-dimensional spatial kinetics code ARROTTA, and the fuel rod evaluation code FREY. Each of the computer codes was extensively validated. Analysis methods and modeling techniques were conservatively established for each application through a systematic evaluation assisted by sensitivity studies. The qualification results and analysis methods were documented in detail in TPC topical reports. The topical reports for COBRAIIIC, ARROTTA, and FREY have been reviewed and approved by the Atomic Energy Council (AEC). TPC's in-house transient methodology has been successfully applied to provide valuable support for many operational issues and plant improvements for TPC's Maanshan Units 1 and 2. Major applications include the removal of the resistance temperature detector bypass system, the relaxation of the hot-full-power moderator temperature coefficient design criteria imposed by the ROCAEC due to a concern about Anticipated Transient Without Scram, the reduction of the boron injection tank concentration and the elimination of the associated heat tracing, and the reduction of reactor coolant system flow. (author)

  1. Damage detection methodology under variable load conditions based on strain field pattern recognition using FBGs, nonlinear principal component analysis, and clustering techniques

    Science.gov (United States)

    Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham

    2018-01-01

    Structural health monitoring consists of using sensors integrated within structures together with algorithms to perform load monitoring, damage detection, damage location, damage size and severity assessment, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions other than damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with Q and nonlinear T2 damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously submitted to variations in its pitch angle. The results demonstrated the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS and detecting six different damages induced in a cumulative way. The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28% for a
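    A condensed sketch of the index computation is given below, using ordinary linear PCA in place of the hierarchical nonlinear PCA described in the paper; the strain data are synthetic stand-ins for FBG measurements:

    ```python
    import numpy as np

    # Baseline strain-field model with linear PCA, plus Q (squared prediction
    # error) and Hotelling T^2 damage indices; data are synthetic stand-ins
    # for FBG strain measurements.

    rng = np.random.default_rng(0)
    baseline = rng.normal(size=(200, 32))        # 200 samples x 32 sensors

    mu = baseline.mean(axis=0)
    X = baseline - mu
    _, s, Vt = np.linalg.svd(X, full_matrices=False)
    k = 5                                        # retained principal components
    P = Vt[:k].T                                 # loading matrix (32 x 5)
    var = s[:k] ** 2 / (len(X) - 1)              # variances of the retained scores

    def damage_indices(x):
        xc = x - mu
        t = P.T @ xc                             # projection onto the model
        residual = xc - P @ t                    # part the model cannot explain
        Q = float(residual @ residual)           # squared prediction error
        T2 = float(np.sum(t ** 2 / var))         # Hotelling's T^2
        return Q, T2

    print(damage_indices(baseline[0]))           # pristine sample
    print(damage_indices(baseline[0] + 0.8))     # perturbed strain field
    ```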

  2. Simplified methodology for Angra 1 containment analysis

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1991-08-01

    A simplified analysis methodology was developed to simulate a Large Break Loss of Coolant Accident at the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report and with the results calculated by a detailed model. The results obtained by this new methodology, together with its small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  3. METHODOLOGICAL APPROACH TO ANALYSIS AND EVALUATION OF INFORMATION PROTECTION IN INFORMATION SYSTEMS BASED ON VULNERABILITY DANGER

    Directory of Open Access Journals (Sweden)

    Y. M. Krotiuk

    2008-01-01

    The paper considers a methodological approach to the analysis and estimation of information security in information systems, based on the analysis of vulnerabilities and the extent of their hazard. By vulnerability hazard is meant the complexity of exploiting the vulnerability as a part of an information system. The necessary and sufficient conditions for exploiting a vulnerability have been determined in the paper. The paper proposes a generalized attack realization model, which is used as a basis for constructing attack realization models for the exploitation of particular vulnerabilities. A criterion for estimating information protection in information systems, based on the estimation of vulnerability hazard, is formulated. The proposed approach allows a quantitative estimation of information system security to be obtained on the basis of the proposed schemes for the realization of typical attacks against the distinguished classes of vulnerabilities. The methodological approach can be used for choosing among variants for the realization of protection mechanisms in information systems, as well as for estimating information security in operating information systems.

  4. Disposal criticality analysis methodology for fissile waste forms

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1998-03-01

    A general methodology has been developed to evaluate the criticality potential of the wide range of waste forms planned for geologic disposal. The range of waste forms includes commercial spent fuel, high level waste, DOE spent fuel (including highly enriched), MOX using weapons grade plutonium, and immobilized plutonium. The disposal of these waste forms will be in a container with sufficiently thick corrosion resistant barriers to prevent water penetration for up to 10,000 years. The criticality control for DOE spent fuel is primarily provided by neutron absorber material incorporated into the basket holding the individual assemblies. For the immobilized plutonium, the neutron absorber material is incorporated into the waste form itself. The disposal criticality analysis methodology includes the analysis of geochemical and physical processes that can breach the waste package and affect the waste forms within. The basic purpose of the methodology is to guide the criticality control features of the waste package design, and to demonstrate that the final design meets the criticality control licensing requirements. The methodology can also be extended to the analysis of criticality consequences (primarily increased radionuclide inventory), which will support the total performance assessment for the repository.

  5. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants; the GO methodology is one of them. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu in order to examine its applicability to piping systems. Through this analysis, the authors identified some disadvantages of the GO methodology: in GO, a signal is either on-to-off or off-to-on, so GO finds the time point at which the state of a system changes but cannot treat a system whose state changes as off-on-off, and several computer runs are required to obtain the time-dependent failure probability of a system. To overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those of the GO methodology, but the meanings of signals and time points, and the definitions of the operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given.
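    The flavor of signal-based reliability modeling can be conveyed with a toy calculation in which a "signal" carries the probability that flow exists at each time point, so a system state can go off-on-off across the mission; the operator definitions below are deliberately simplified and are not the actual GO-FLOW operators:

    ```python
    # Toy signal-based reliability calculation in the spirit of GO-FLOW:
    # a "signal" is the probability that flow exists at each time point,
    # so the system state can go off-on-off across the mission. The
    # operators below are simplified, not the actual GO-FLOW operators.

    time_points = [0, 1, 2, 3]               # standby, start, run, shutdown

    pump_a = [0.0, 0.95, 0.93, 0.0]          # P(pump A delivers flow)
    pump_b = [0.0, 0.90, 0.90, 0.0]
    valve  = [1.0, 0.99, 0.99, 1.0]

    def or_gate(s1, s2):
        """Redundant trains, assumed independent."""
        return [1 - (1 - a) * (1 - b) for a, b in zip(s1, s2)]

    def and_gate(s1, s2):
        """Series elements, assumed independent."""
        return [a * b for a, b in zip(s1, s2)]

    system = and_gate(or_gate(pump_a, pump_b), valve)
    for t, p in zip(time_points, system):
        print(f"t={t}: P(flow) = {p:.4f}")
    ```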

  6. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    International Nuclear Information System (INIS)

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-01-01

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and of the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. As the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events at Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report.

  7. A Review of Citation Analysis Methodologies for Collection Management

    Science.gov (United States)

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  8. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Andrade Almeida, João; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  9. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application to lending operations in the Republic of Serbia. With the developing credit market, there is a growing need for a well-developed risk and loss prevention system. In the introduction, the process by which a bank analyzes a loan applicant in order to minimize and manage credit risk is presented. In examining the subject matter, the processing of the credit application is described, along with the procedure for analyzing financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework applied in a concrete company is presented. In the third part, models are presented which banks should use to protect themselves against risk exposure, i.e., whose goal is to reduce losses on loan operations in our country, as well as to adjust to market conditions in an optimal way.

  10. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    Science.gov (United States)

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and reveals problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to theory development…

  11. SNE's methodological basis - web-based software in entrepreneurial surveys

    DEFF Research Database (Denmark)

    Madsen, Henning

    This overhead based paper gives an introduction to the research methodology applied in the surveys carried out in the SNE-project.

  12. Preparing for budget-based payment methodologies: global payment and episode-based payment.

    Science.gov (United States)

    Hudson, Mark E

    2015-10-01

    Use of budget-based payment methodologies (capitation and episode-based bundled payment) has been demonstrated to drive value in healthcare delivery. With a focus on high-volume, high-cost surgical procedures, inclusion of anaesthesiology services in these methodologies is likely. This review provides a summary of budget-based payment methodologies and practical information necessary for anaesthesiologists to prepare for participation in these programmes. Although few examples of anaesthesiologists' participation in these models exist, an understanding of the structure of these programmes and opportunities for participation are available. Prospective preparation in developing anaesthesiology-specific bundled payment profiles and early participation in pathway development associated with selected episodes of care are essential for successful participation as a gainsharing partner. With significant opportunity to contribute to care coordination and cost management, anaesthesiology can play an important role in budget-based payment programmes and should expect to participate as full gainsharing partners. Precise costing methodologies and accurate economic modelling, along with identification of quality management and cost control opportunities, will help identify participation opportunities and appropriate payment and gainsharing agreements. Anaesthesiology-specific examples with budget-based payment models are needed to help guide increased participation in these programmes.

  13. A methodology for uncertainty analysis of reference equations of state

    DEFF Research Database (Denmark)

    Cheung, Howard; Frutiger, Jerome; Bell, Ian H.

    We present a detailed methodology for the uncertainty analysis of reference equations of state (EOS) based on Helmholtz energy. In recent years there has been an increased interest in uncertainties of property data and process models of thermal systems. In the literature there are various...... for uncertainty analysis is suggested as a tool for EOS. The uncertainties of the EOS properties are calculated from the experimental values and the EOS model structure through the parameter covariance matrix and subsequent linear error propagation. This allows reporting the uncertainty range (95% confidence...
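
    The linear error propagation step described above can be sketched in a few lines. The following illustration, with a hypothetical two-parameter property model and invented numbers (not the reference EOS of the paper), propagates a parameter covariance matrix through the Jacobian of a fitted property to obtain an approximate 95% confidence half-width:

        import numpy as np

        def property_variance(jacobian, covariance):
            """First-order (delta-method) variance of a scalar property."""
            j = np.asarray(jacobian, dtype=float)    # shape (n_params,)
            s = np.asarray(covariance, dtype=float)  # shape (n_params, n_params)
            return j @ s @ j

        # Toy model: y = a*T + b with fitted parameters theta = (a, b).
        theta = np.array([2.0, 0.5])
        cov = np.array([[1e-4, 2e-5],
                        [2e-5, 4e-4]])               # parameter covariance matrix
        T = 300.0
        J = np.array([T, 1.0])                       # dy/da = T, dy/db = 1
        half_width = 1.96 * np.sqrt(property_variance(J, cov))
        print(f"y = {theta @ J:.1f} +/- {half_width:.1f} (approx. 95% confidence)")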

  14. Methodology for Mode Selection in Corridor Analysis of Freight Transportation

    OpenAIRE

    Kanafani, Adib

    1984-01-01

    The purpose of this report is to outline a methodology for the analysis of mode selection in freight transportation. This methodology is intended to form part of transportation corridor analysis, a component of demand analysis that is part of a national transportation process. The methodological framework presented here provides a basis on which specific models and calculation procedures might be developed. It also provides a basis for the development of a data management system suitable for co...

  15. Supplement to the Disposal Criticality Analysis Methodology

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The methodology for evaluating criticality potential for high-level radioactive waste and spent nuclear fuel after the repository is sealed and permanently closed is described in the Disposal Criticality Analysis Methodology Topical Report (DOE 1998b). The topical report provides a process for validating various models that are contained in the methodology and states that validation will be performed to support License Application. The Supplement to the Disposal Criticality Analysis Methodology provides a summary of data and analyses that will be used for validating these models and will be included in the model validation reports. The supplement also summarizes the process that will be followed in developing the model validation reports. These reports will satisfy commitments made in the topical report, and thus support the use of the methodology for Site Recommendation and License Application. It is concluded that this report meets the objective of presenting additional information, along with references, that supports the methodology presented in the topical report and can be used both in validation reports and in answering requests for additional information received from the Nuclear Regulatory Commission concerning the topical report. The data and analyses summarized in this report and presented in the references are not sufficient to complete a validation report. However, this information will provide a basis for several of the validation reports. Data from several references in this report have been identified with TBV-1349. Release of the TBV governing this data is required prior to its use in quality-affecting activities and for use in analyses affecting procurement, construction, or fabrication. Subsequent to the initiation of TBV-1349, DOE issued a concurrence letter (Mellington 1999) approving the request to identify information taken from the references specified in Section 1.4 as accepted data.

  16. Applications of a methodology for the analysis of learning trends in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, Hang Youn; Choi, Sung Nam; Yun, Won Yong

    1995-01-01

    A methodology is applied to identify learning trends related to the safety and availability of U.S. commercial nuclear power plants. The application is intended to aid in reducing the likelihood of human errors. To assure that the methodology can be easily adapted to various types of classification schemes of operation data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme was selected for the methodology. The significance criteria for human-initiated events affecting the systems and for events caused by human deficiencies were used. Clustering analysis was used to identify the learning trend in multi-dimensional histograms. A computer code was developed based on the K-Means algorithm and applied to find the learning period, in which error rates are monotonically decreasing with plant age.
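
    As an illustration of the clustering step, a minimal K-Means sketch is given below. The yearly error rates are invented stand-ins for TRACE-classified event data, and the monotonicity check mirrors the decreasing-error-rate criterion described above:

        import numpy as np
        from sklearn.cluster import KMeans

        # Synthetic illustration: error rates decaying with plant age plus noise.
        rng = np.random.default_rng(0)
        plant_age = np.arange(1, 16)                       # years of operation
        error_rate = 8.0 * np.exp(-0.25 * plant_age) + rng.normal(0, 0.2, 15)

        X = np.column_stack([plant_age, error_rate])
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

        # A cluster qualifies as a learning period if its rates fall with age.
        for k in range(2):
            ages, rates = plant_age[labels == k], error_rate[labels == k]
            order = np.argsort(ages)
            decreasing = bool(np.all(np.diff(rates[order]) < 0))
            print(f"cluster {k}: ages {ages.min()}-{ages.max()}, "
                  f"monotonically decreasing: {decreasing}")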

  17. Methodological Choices in Muscle Synergy Analysis Impact Differentiation of Physiological Characteristics Following Stroke

    Directory of Open Access Journals (Sweden)

    Caitlin L. Banks

    2017-08-01

    Full Text Available Muscle synergy analysis (MSA) is a mathematical technique that reduces the dimensionality of electromyographic (EMG) data. Used increasingly in biomechanics research, MSA requires methodological choices at each stage of the analysis. Differences in methodological steps affect the overall outcome, making it difficult to compare results across studies. We applied MSA to EMG data collected from individuals post-stroke identified as either responders (RES) or non-responders (nRES) on the basis of a critical post-treatment increase in walking speed. Importantly, no clinical or functional indicators identified differences between the cohort of RES and nRES at baseline. For this exploratory study, we selected the five highest RES and five lowest nRES available from a larger sample. Our goal was to assess how the methodological choices made before, during, and after MSA affect the ability to differentiate two groups with intrinsic physiologic differences based on MSA results. We investigated 30 variations in MSA methodology to determine which choices allowed differentiation of RES from nRES at baseline. Trial-to-trial variability in time-independent synergy vectors (SVs) and time-varying neural commands (NCs) was measured as a function of: (1) the number of synergies computed; (2) the EMG normalization method before MSA; (3) whether SVs were held constant across trials or allowed to vary during MSA; and (4) the synergy analysis output normalization method after MSA. MSA methodology had a strong effect on our ability to differentiate RES from nRES at baseline. Across all 10 individuals and MSA variations, two synergies were needed to reach an average of 90% variance accounted for (VAF). Based on effect sizes, differences in SV and NC variability between groups were greatest using two synergies with SVs that varied from trial-to-trial. Differences in SV variability were clearest using unit magnitude per trial EMG normalization, while NC variability was less sensitive to EMG
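
    The record does not name the factorization algorithm, but MSA is commonly implemented with non-negative matrix factorization (NMF). The hedged sketch below, on synthetic EMG data, extracts synergy vectors and neural commands and computes the variance accounted for (VAF) criterion mentioned above:

        import numpy as np
        from sklearn.decomposition import NMF

        # Synthetic rectified EMG matrix E (muscles x time), built from two
        # known synergies; the real study's data and pipeline are more complex.
        rng = np.random.default_rng(1)
        W_true = rng.random((8, 2))                    # 8 muscles, 2 synergies
        H_true = np.abs(np.sin(rng.random((2, 1)) + np.linspace(0, 6, 500)))
        E = W_true @ H_true + 0.01 * rng.random((8, 500))

        model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
        W = model.fit_transform(E)     # time-independent synergy vectors (SVs)
        H = model.components_          # time-varying neural commands (NCs)

        vaf = 1.0 - np.sum((E - W @ H) ** 2) / np.sum(E ** 2)
        print(f"VAF with 2 synergies: {100 * vaf:.1f}%")   # cf. the 90% criterion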

  18. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  19. Disposal criticality analysis methodology's principal isotope burnup credit

    International Nuclear Information System (INIS)

    Doering, T.W.; Thomas, D.A.

    2001-01-01

    This paper presents the burnup credit aspects of the United States Department of Energy Yucca Mountain Project's methodology for performing criticality analyses for commercial light-water-reactor fuel. The disposal burnup credit methodology uses a 'principal isotope' model, which takes credit for the reduced reactivity associated with the build-up of the primary principal actinides and fission products in irradiated fuel. Burnup credit is important to the disposal criticality analysis methodology and to the design of commercial fuel waste packages. The burnup credit methodology developed for disposal of irradiated commercial nuclear fuel can also be applied to storage and transportation of irradiated commercial nuclear fuel. For all applications, a series of loading curves is developed using a best-estimate methodology and, depending on the application, an additional administrative safety margin may be applied. The burnup credit methodology better represents the 'true' reactivity of the irradiated fuel configuration, and hence the real safety margin, than do evaluations using the 'fresh fuel' assumption. (author)

  20. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  1. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W. (US Nuclear Regulatory Commission, Washington, DC 20555)

    1985-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 regulation to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  2. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    OpenAIRE

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  3. Internal fire analysis screening methodology for the Salem Nuclear Generating Station

    International Nuclear Information System (INIS)

    Eide, S.; Bertucio, R.; Quilici, M.; Bearden, R.

    1989-01-01

    This paper reports on an internal fire analysis screening methodology that has been utilized for the Salem Nuclear Generating Station (SNGS) Probabilistic Risk Assessment (PRA). The methodology was first developed and applied in the Brunswick Steam Electric Plant (BSEP) PRA. The SNGS application includes several improvements and extensions to the original methodology. The SNGS approach differs significantly from traditional fire analysis methodologies by providing a much more detailed treatment of transient combustibles. This level of detail results in a model which is more usable for assisting in the management of fire risk at the plant

  4. The Development Of Learning Sets And Research Methodology Module Using Problem Based Learning For Accounting Education Students

    OpenAIRE

    Thomas, Partono; Nurkhin, Ahmad

    2016-01-01

    Improving the learning process by implementing innovative learning methods or media is very important for every lecturer. The purpose of this study is to develop research methodology learning instruction and a module based on problem-based learning for accounting education students. This research applied a research and development design in the research methodology course in the Economics Education (Accounting) Department, Faculty of Economics, Semarang State University. Data analysis was used to test...

  5. Simplified methodology for analysis of the Angra-1 containment

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1988-01-01

    A simplified analysis methodology was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The data obtained were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained by this new methodology, such as the small computational simulation time, were satisfactory for the preliminary evaluation of the Angra 1 global parameters. (author) [pt

  6. Radiation protection optimization using a knowledge based methodology

    International Nuclear Information System (INIS)

    Reyes-Jimenez, J.; Tsoukalas, L.H.

    1991-01-01

    This paper presents a knowledge based methodology for radiological planning and radiation protection optimization. The cost-benefit methodology described in International Commission on Radiological Protection Report No. 37 is employed within a knowledge based framework for the purpose of planning maintenance activities while optimizing radiation protection. 1, 2 The methodology is demonstrated through an application to a heating, ventilation and air conditioning (HVAC) system. HVAC is used to reduce radioactivity concentration levels in selected contaminated multi-compartment models at nuclear power plants when higher than normal radiation levels are detected. The overall objective is to reduce personnel exposure resulting from airborne radioactivity when routine or maintenance access is required in contaminated areas. 2 figs, 15 refs

  7. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed
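
    A sketch of the kind of standby-component unavailability model such a code evaluates is given below; this is a generic textbook form with assumed numbers, not the SOCRATES equations themselves:

        # For a standby component with failure rate lam (per hour) tested every
        # T hours, the time-average unavailability is roughly the per-demand
        # failure probability q0, plus lam*T/2 for failures between tests, plus
        # tau/T if the component is unavailable for tau hours during each test.

        def standby_unavailability(q0, lam, T, tau=0.0):
            """Average unavailability as a function of the test interval T."""
            return q0 + 0.5 * lam * T + tau / T

        lam = 1.0e-5     # assumed failure rate (per hour)
        q0 = 1.0e-3      # assumed per-demand failure probability
        for T in (168.0, 720.0, 2190.0):    # weekly, monthly, quarterly STIs
            q = standby_unavailability(q0, lam, T, tau=2.0)
            print(f"STI = {T:6.0f} h  ->  average unavailability = {q:.2e}")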

  8. A study on safety analysis methodology in spent fuel dry storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Che, M. S.; Ryu, J. H.; Kang, K. M.; Cho, N. C.; Kim, M. S. [Hanyang Univ., Seoul (Korea, Republic of)

    2004-02-15

    Collection and review of the domestic and foreign technology related to spent fuel dry storage facility. Analysis of a reference system. Establishment of a framework for criticality safety analysis. Review of accident analysis methodology. Establishment of accident scenarios. Establishment of scenario analysis methodology.

  9. Evaluation of analytical performance based on partial order methodology.

    Science.gov (United States)

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparisons of objects due to their data profile (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present study, data as provided, e.g., by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable for the comparison of analytical methods or simply as a tool for the optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate the analytical performance taking into account all indicators simultaneously and thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which simultaneously are used for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently, (2) a "distance" to the Reference laboratory and (3) a classification due to the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
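
    A minimal sketch of the ordinal idea follows, with invented laboratory profiles (|bias from the true value|, standard deviation, |skewness|, all taken as "smaller is better"); it shows how dominance relations and incomparabilities emerge:

        import numpy as np

        profiles = {                      # invented indicator profiles
            "lab1": np.array([0.10, 0.05, 0.2]),
            "lab2": np.array([0.20, 0.04, 0.5]),
            "lab3": np.array([0.25, 0.09, 0.6]),
        }

        def dominates(a, b):
            """True if profile a is at least as good as b on every indicator."""
            return bool(np.all(a <= b) and np.any(a < b))

        for a in profiles:
            for b in profiles:
                if a != b and dominates(profiles[a], profiles[b]):
                    print(f"{a} < {b} (better or equal on all indicators)")
        # lab1 and lab2 disagree across indicators, so they stay incomparable.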

  10. Methodology for flood risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Casada, M.L.; Fussell, J.B.

    1984-01-01

    The methodology for flood risk analysis described here addresses the effects of a flood on nuclear power plant safety systems. Combining the results of this method with the probability of a flood allows the effects of flooding to be included in a probabilistic risk assessment. The five-step methodology includes accident sequence screening to focus the detailed analysis efforts on the accident sequences that are significantly affected by a flood event. The quantitative results include the flood's contribution to system failure probability, accident sequence occurrence frequency and consequence category occurrence frequency. The analysis can be added to existing risk assessments without a significant loss in efficiency. The results of two example applications show the usefulness of the methodology. Both examples rely on the Reactor Safety Study for the required risk assessment inputs and present changes in the Reactor Safety Study results as a function of flood probability

  11. ECO-ENVIRONMENTAL ASSESSMENT AND ANALYSIS OF TONGLVSHAN MINING AREA IN DAYE CITY, HUBEI PROVINCE BASED ON SPATIOTEMPORAL METHODOLOGY

    Directory of Open Access Journals (Sweden)

    X. M. Zhang

    2015-07-01

    Full Text Available Mine exploitation has a significant impact on the ecological environment of the surroundings. To analyze the impact of the Tonglvshan mining area on its surroundings, this paper adopts a spatiotemporal methodology based on the extracted Eco-environmental Quality Index (EQI) to assess the extent and degree of the effect. The spatiotemporal methodology operates on two scales: buffers and administrative units. The EQI includes the Biological Abundance Index (BAI), Vegetation Index (VI), Water Network Density Index (WNDI), and Land Degradation Index (LDI). The weight of each index was determined by the analytic hierarchy process (AHP) and scores of the experts. The calculation of the EQI followed the standard "Technical criterion for Eco-environment Status Evaluation" (HJ/T192-2006) and the "Standards for Classification and Gradation of Soil Erosion" (SL 190-96). Considering ecological and environmental characteristics relevant to China, this method has been widely used to study the environmental status of specific regions in China. The buffer-based assessment adopted radii of 300 m, 500 m, 700 m, 1000 m, 1500 m, 2000 m, 2500 m, 3000 m, 3500 m, and 4000 m around three typical mines. The calculated results indicate that the EQI increases with the radius and the rate of increase becomes smaller until the EQI is stable, which means the effect of a mine weakens as the distance to the mine increases and diminishes when the distance is far enough. The analysis of the three typical mines shows that the extent and degree of a mine's effect relate not only to the area of the mine, but also to the type of mineral resource, the status of mining and the ecological restoration. The assessment was also carried out by calculating the EQI in 14 administrative units in Daye city in 2000, 2005, and 2010. The study shows that the EQI decreased in the 14 units from 2000 to 2010. The spatiotemporal
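
    The index construction can be sketched as follows. The pairwise-comparison matrix and sub-index values below are invented for illustration; they are not the weights or data of the cited standards:

        import numpy as np

        # AHP: weights are the normalized principal eigenvector of a
        # pairwise-comparison matrix over (BAI, VI, WNDI, LDI).
        A = np.array([[1.0, 2.0, 3.0, 3.0],
                      [1/2, 1.0, 2.0, 2.0],
                      [1/3, 1/2, 1.0, 1.0],
                      [1/3, 1/2, 1.0, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        w = w / w.sum()                          # AHP weights, summing to 1

        indices = np.array([62.0, 71.0, 48.0, 55.0])   # sub-indices, one unit
        eqi = float(w @ indices)                 # EQI as the weighted sum
        print("weights:", np.round(w, 3), " EQI:", round(eqi, 1))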

  12. A neural network based methodology to predict site-specific spectral acceleration values

    Science.gov (United States)

    Kamatchi, P.; Rajasankar, J.; Ramana, G. V.; Nagpal, A. K.

    2010-12-01

    A general neural network based methodology that has the potential to replace the computationally-intensive site-specific seismic analysis of structures is proposed in this paper. The basic framework of the methodology consists of a feed forward back propagation neural network algorithm with one hidden layer to represent the seismic potential of a region and soil amplification effects. The methodology is implemented and verified with parameters corresponding to Delhi city in India. For this purpose, strong ground motions are generated at bedrock level for a chosen site in Delhi due to earthquakes considered to originate from the central seismic gap of the Himalayan belt using necessary geological as well as geotechnical data. Surface level ground motions and corresponding site-specific response spectra are obtained by using a one-dimensional equivalent linear wave propagation model. Spectral acceleration values are considered as a target parameter to verify the performance of the methodology. Numerical studies carried out to validate the proposed methodology show that the errors in predicted spectral acceleration values are within acceptable limits for design purposes. The methodology is general in the sense that it can be applied to other seismically vulnerable regions and also can be updated by including more parameters depending on the state-of-the-art in the subject.
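
    A hedged sketch of the model class described (one hidden layer trained by backpropagation) is given below using scikit-learn. The inputs (magnitude, distance, a soil proxy) and the target are synthetic placeholders for the study's seismological and geotechnical data:

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(2)
        X = rng.uniform([5.0, 10.0, 0.2], [8.5, 300.0, 1.0], size=(500, 3))
        y = 1.2 * X[:, 0] - 0.01 * X[:, 1] + 2.0 * X[:, 2]   # toy attenuation law

        scaler = StandardScaler().fit(X)
        net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000, random_state=0)
        net.fit(scaler.transform(X), y)           # one hidden layer of 10 units

        query = np.array([[7.5, 120.0, 0.6]])     # one hypothetical site scenario
        print(f"predicted spectral acceleration (toy units): "
              f"{net.predict(scaler.transform(query))[0]:.2f}")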

  13. A new methodology for fault detection in rolling element bearings using singular spectrum analysis

    Directory of Open Access Journals (Sweden)

    Bugharbee Hussein Al

    2018-01-01

    Full Text Available This paper proposes a vibration-based methodology for fault detection in rolling element bearings, which is based on pure data analysis via the singular spectrum method. The method suggests building a baseline space from feature vectors made of the signals measured in the healthy/baseline bearing condition. The feature vectors are made using the Euclidean norms of the first three PCs found for the signals measured. Then, the lagged version of any new signal corresponding to a new (possibly faulty) condition is projected onto this baseline feature space in order to assess its similarity to the baseline condition. The category of a new signal vector is determined based on the Mahalanobis distance (MD) of its feature vector to the baseline space. A validation of the methodology is suggested based on the results from an experimental test rig. The results obtained confirm the effective performance of the suggested methodology. It is made of simple steps and is easy to apply, with a perspective to make it automatic and suitable for commercial applications.
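
    A minimal sketch of the pipeline follows, with synthetic vibration signals in place of test-rig data; the singular values of the lagged (trajectory) matrix stand in for the norms of the leading principal components:

        import numpy as np

        def ssa_features(signal, window=30, n_pc=3):
            """Leading singular values of the lagged (trajectory) matrix."""
            traj = np.lib.stride_tricks.sliding_window_view(signal, window)
            return np.linalg.svd(traj, compute_uv=False)[:n_pc]

        rng = np.random.default_rng(3)
        healthy = [np.sin(0.2 * np.arange(400)) + 0.1 * rng.standard_normal(400)
                   for _ in range(20)]
        baseline = np.array([ssa_features(s) for s in healthy])
        mu = baseline.mean(axis=0)
        cov = np.cov(baseline.T) + 1e-9 * np.eye(3)   # small ridge for stability

        # A "faulty" signal: same tone, much heavier broadband noise.
        faulty = np.sin(0.2 * np.arange(400)) + 0.8 * rng.standard_normal(400)
        f = ssa_features(faulty)
        d = float(np.sqrt((f - mu) @ np.linalg.inv(cov) @ (f - mu)))
        print(f"Mahalanobis distance of the new signal to baseline: {d:.1f}")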

  14. Phenomena based Methodology for Process Synthesis incorporating Process Intensification

    DEFF Research Database (Denmark)

    Lutze, Philip; Babi, Deenesh Kavi; Woodley, John

    2013-01-01

    Process intensification (PI) has the potential to improve existing as well as conceptual processes, in order to achieve a more sustainable production. PI can be achieved at different levels, that is, the unit operations, functional and/or phenomena level. The highest impact is expected by looking at processes at the lowest level of aggregation, which is the phenomena level. In this paper, a phenomena based synthesis/design methodology incorporating process intensification is presented. Using this methodology, a systematic identification of necessary and desirable (integrated) phenomena as well as generation and screening of phenomena based flowsheet options are presented using a decomposition based solution approach. The developed methodology as well as necessary tools and supporting methods are highlighted through a case study involving the production of isopropyl-acetate.

  15. AREVA main steam line break fully coupled methodology based on CATHARE-ARTEMIS - 15496

    International Nuclear Information System (INIS)

    Denis, L.; Jasserand, L.; Tomatis, D.; Segond, M.; Royere, C.; Sauvage, J.Y.

    2015-01-01

    The CATHARE code, developed since 1979 by AREVA, CEA, EDF and IRSN, is one of the major thermal-hydraulic system codes worldwide. In order to have realistic CATHARE-based methodologies at its disposal for the whole transient and accident analysis in Chapter 15 of Safety Reports, a coupling with the ARTEMIS code was developed. ARTEMIS is the core code in AREVA's new reactor simulator system ARCADIA, using COBRA-FLX to model the thermal-hydraulics in the core. The Fully Coupled Methodology was adapted to the CATHARE-ARTEMIS coupling to perform Main Steam Line Break studies. This methodology, originally applied to the MANTA-SMART-FLICA coupling, is dedicated to Main Steam Line Break transients at zero power. The aim of this paper is to present the coupling between CATHARE and ARTEMIS and the application of the Fully Coupled Methodology in a different code environment. (authors)

  16. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  17. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    Science.gov (United States)

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  18. Scenario aggregation and analysis via Mean-Shift Methodology

    International Nuclear Information System (INIS)

    Mandelli, D.; Yilmaz, A.; Metzroth, K.; Aldemir, T.; Denning, R.

    2010-01-01

    A new generation of dynamic methodologies is being developed for nuclear reactor probabilistic risk assessment (PRA) which explicitly account for the time element in modeling the probabilistic system evolution and use numerical simulation tools to account for possible dependencies between failure events. The dynamic event tree (DET) approach is one of these methodologies. One challenge with dynamic PRA methodologies is the large amount of data they produce which may be difficult to analyze without appropriate software tools. The concept of 'data mining' is well known in the computer science community and several methodologies have been developed in order to extract useful information from a dataset with a large number of records. Using the dataset generated by the DET analysis of the reactor vessel auxiliary cooling system (RVACS) of an ABR-1000 for an aircraft crash recovery scenario and the Mean-Shift Methodology for data mining, it is shown how clusters of transients with common characteristics can be identified and classified. (authors)
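
    A hedged sketch of the clustering step is shown below; the two scenario features (a peak temperature and a failure time) are invented stand-ins for the quantities extracted from dynamic event tree branches:

        import numpy as np
        from sklearn.cluster import MeanShift, estimate_bandwidth

        rng = np.random.default_rng(4)
        group_a = rng.normal([900.0, 2.0], [15.0, 0.3], size=(60, 2))
        group_b = rng.normal([650.0, 6.0], [20.0, 0.5], size=(40, 2))
        scenarios = np.vstack([group_a, group_b])

        # Mean-Shift finds the cluster count itself from a kernel bandwidth.
        bw = estimate_bandwidth(scenarios, quantile=0.3)
        ms = MeanShift(bandwidth=bw).fit(scenarios)
        print("clusters found:", len(ms.cluster_centers_))
        print("centers (peak T, failure time):\n", ms.cluster_centers_.round(1))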

  19. A Network Based Methodology to Reveal Patterns in Knowledge Transfer

    Directory of Open Access Journals (Sweden)

    Orlando López-Cruz

    2015-12-01

    Full Text Available This paper motivates, presents and demonstrates in use a methodology based on complex network analysis to support research aimed at the identification of sources in the process of knowledge transfer at the interorganizational level. The importance of this methodology is that it states a unified model to reveal knowledge sharing patterns and to compare results from multiple researches on data from different periods of time and different sectors of the economy. This methodology does not address the underlying statistical processes; for those, national statistics departments (NSDs) provide documents and tools at their websites. Rather, this proposal provides a guide to model information inferences gathered from data processing, revealing links between sources and recipients of knowledge being transferred, where the recipient identifies the source as the main source for new knowledge creation. Some national statistics departments set as the objective for these surveys the characterization of innovation dynamics in firms and the analysis of the use of public support instruments. From this characterization scholars conduct different researches. Measures of the dimensions of the network composed of manufacturing firms and other organizations form the basis for inquiry into the structure that emerges from taking ideas from other organizations to incept innovations. These two sets of actors compose a two-mode network; a link between two actors (network nodes), one acting as the source of the idea and the second as the destination, arises when organizations, or events organized by organizations, "provide" ideas to another group of firms. The resulting demonstrated design satisfies the objective of being a methodological model to identify sources, in knowledge transfer, of knowledge effectively used in innovation.

  20. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis

    OpenAIRE

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-01-01

    In this paper a new methodology for the diagnosis of skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using filters such as the classic, inverse and k-law nonlinear filters. The sample images were obtained by a medical specialist, and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in can...
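
    Of the filters named, the k-law nonlinearity is straightforward to sketch: keep the Fourier phase and raise the spectral magnitude to a power k in [0, 1], so k = 0 gives a phase-only filter and k = 1 returns the original image. The input below is random data standing in for a dermatologic photograph:

        import numpy as np

        def k_law_filter(image, k=0.3):
            """Apply a k-law nonlinear filter in the Fourier domain."""
            F = np.fft.fft2(image)
            mag, phase = np.abs(F), np.angle(F)
            return np.real(np.fft.ifft2((mag ** k) * np.exp(1j * phase)))

        rng = np.random.default_rng(5)
        img = rng.random((128, 128))          # placeholder for a lesion image
        out = k_law_filter(img, k=0.3)
        print("output shape:", out.shape, " energy:", float(np.sum(out ** 2)))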

  1. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW methodology is a success oriented system analysis technique, and is capable of evaluating a large system with complex operational sequences. Recently an integrated analysis framework of the GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism, Japanese Government. This paper describes (a) an Overview of the GO-FLOW methodology, (b) Procedure of treating a phased mission problem, (c) Common cause failure analysis, (d) Uncertainty analysis, and (e) Integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  2. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    Full Text Available The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be the feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills this gap, contributing to the analysis of the implementation of feedback processes while students discuss collaboratively in a specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship between written and asynchronous group interaction, students' activity and changes incorporated into the final text. The model proposed includes the following dimensions: (1) student participation, (2) nature of student learning and (3) quality of student learning. The main contribution of this article is to present the methodological model and also to ascertain the model's operativity regarding how students incorporate such feedback into their essays.

  3. An evolving systems-based methodology for healthcare planning.

    Science.gov (United States)

    Warwick, Jon; Bell, Gary

    2007-01-01

    Healthcare planning seems beset with problems at all hierarchical levels. These are caused by the 'soft' nature of many of the issues present in healthcare planning and the high levels of complexity inherent in healthcare services. There has, in recent years, been a move to utilize systems thinking ideas in an effort to gain a better understanding of the forces at work within the healthcare environment and these have had some success. This paper argues that systems-based methodologies can be further enhanced by metrication and modeling which assist in exploring the changed emergent behavior of a system resulting from management intervention. The paper describes the Holon Framework as an evolving systems-based approach that has been used to help clients understand complex systems (in the education domain) that would have application in the analysis of healthcare problems.

  4. An Evaluation Methodology for Protocol Analysis Systems

    Science.gov (United States)

    2007-03-01

    Main Memory Requirement. NS: Needham-Schroeder; NSL: Needham-Schroeder-Lowe; OCaml: Objective Caml; POSIX: Portable Operating System... methodology is needed. As with any field, there is a specialized language used within the protocol analysis community. ... ProVerif requires that Objective Caml (OCaml) be installed on the system; OCaml version 3.09.3 was installed.

  5. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    Science.gov (United States)

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV) infection. Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining a 1.31% mean absolute CPA error with a 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Methodological challenges of optical tweezers-based X-ray fluorescence imaging of biological model organisms at synchrotron facilities.

    Science.gov (United States)

    Vergucht, Eva; Brans, Toon; Beunis, Filip; Garrevoet, Jan; Bauters, Stephen; De Rijcke, Maarten; Deruytter, David; Janssen, Colin; Riekel, Christian; Burghammer, Manfred; Vincze, Laszlo

    2015-07-01

    Recently, a radically new synchrotron radiation-based elemental imaging approach for the analysis of biological model organisms and single cells in their natural in vivo state was introduced. The methodology combines optical tweezers (OT) technology for non-contact laser-based sample manipulation with synchrotron radiation confocal X-ray fluorescence (XRF) microimaging for the first time at ESRF-ID13. The optical manipulation possibilities and limitations of biological model organisms, the OT setup developments for XRF imaging and the confocal XRF-related challenges are reported. In general, the applicability of the OT-based setup is extended with the aim of introducing the OT XRF methodology in all research fields where highly sensitive in vivo multi-elemental analysis is of relevance at the (sub)micrometre spatial resolution level.

  7. An Innovative Fuzzy-Logic-Based Methodology for Trend Identification

    International Nuclear Information System (INIS)

    Wang Xin; Tsoukalas, Lefteri H.; Wei, Thomas Y.C.; Reifman, Jaques

    2001-01-01

    A new fuzzy-logic-based methodology for on-line signal trend identification is introduced. The methodology may be used for detecting the onset of nuclear power plant (NPP) transients at the earliest possible time and could be of great benefit to diagnostic, maintenance, and performance-monitoring programs. Although signal trend identification is complicated by the presence of noise, fuzzy methods can help capture important features of on-line signals, integrate the information included in these features, and classify incoming NPP signals into increasing, decreasing, and steady-state trend categories. A computer program named PROTREN is developed and tested for the purpose of verifying this methodology using NPP and simulation data. The results indicate that the new fuzzy-logic-based methodology is capable of detecting transients accurately, it identifies trends reliably and does not misinterpret a steady-state signal as a transient one
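
    A minimal sketch of the trend-classification idea (not the PROTREN code itself) maps a least-squares slope estimate through triangular fuzzy membership functions and reports the strongest class:

        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function rising from a, peaking at b."""
            return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0, 1))

        def classify_trend(window, dt=1.0, s0=0.02):
            t = np.arange(len(window)) * dt
            slope = np.polyfit(t, window, 1)[0]      # smoothed slope estimate
            mu = {"decreasing": tri(slope, -10 * s0, -2 * s0, 0.0),
                  "steady":     tri(slope, -2 * s0, 0.0, 2 * s0),
                  "increasing": tri(slope, 0.0, 2 * s0, 10 * s0)}
            return max(mu, key=mu.get), mu

        rng = np.random.default_rng(6)
        signal = 0.05 * np.arange(50) + rng.normal(0, 0.2, 50)  # noisy slow ramp
        label, memberships = classify_trend(signal)
        print(label, {k: round(v, 2) for k, v in memberships.items()})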

  8. A novel registration-based methodology for prediction of trabecular bone fabric from clinical QCT: A comprehensive analysis.

    Directory of Open Access Journals (Sweden)

    Vimal Chandran

    Full Text Available Osteoporosis leads to hip fractures in aging populations and is diagnosed by modern medical imaging techniques such as quantitative computed tomography (QCT). Hip fracture sites involve trabecular bone, whose strength is determined by volume fraction and orientation, known as fabric. However, bone fabric cannot be reliably assessed in clinical QCT images of the proximal femur. Accordingly, we propose a novel registration-based estimation of bone fabric designed to preserve tensor properties of bone fabric and to map bone fabric by a global and local decomposition of the gradient of a non-rigid image registration transformation. Furthermore, no comprehensive analysis of the critical components of this methodology has been previously conducted. Hence, the aim of this work was to identify the best registration-based strategy to assign bone fabric to the QCT image of a patient's proximal femur. The normalized correlation coefficient and curvature-based regularization were used for image-based registration, and the Frobenius norm of the stretch tensor of the local gradient was selected to quantify the distance among the proximal femora in the population. Based on this distance, the closest, farthest and mean femora, with a distinction by sex, were chosen as alternative atlases to evaluate their influence on bone fabric prediction. Second, we analyzed different tensor mapping schemes for bone fabric prediction: identity, rotation-only, and rotation and stretch tensor. Third, we investigated the use of a population average fabric atlas. A leave-one-out (LOO) evaluation study was performed with a dual QCT and HR-pQCT database of 36 pairs of human femora. The quality of the fabric prediction was assessed with three metrics: the tensor norm (TN) error, the degree of anisotropy (DA) error and the angular deviation of the principal tensor direction (PTD). The closest femur atlas (CTP) with a full rotation (CR) for fabric mapping delivered the best results with a TN error of 7

  9. Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective

    Science.gov (United States)

    Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.

    2016-06-01

    We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).

  10. Design of an Emulsion-based Personal Detergent through a Model-based Chemical Product Design Methodology

    DEFF Research Database (Denmark)

    Mattei, Michele; Hill, Michael; Kontogeorgis, Georgios

    2013-01-01

    An extended systematic methodology for the design of emulsion-based Chemical products is presented. The methodology consists of a model-based framework involving seven sequential hierarchical steps: starting with the identification of the needs to be satisfied by the product and then adding one-b...... to obtain one or more candidate formulations. A conceptual casestudy representing a personal detergent is presented to highlight the methodology....

  12. Methodology for repeated load analysis of composite structures with embedded magnetic microwires

    Directory of Open Access Journals (Sweden)

    K. Semrád

    2017-01-01

    Full Text Available The article addresses the strength of cyclically loaded composite structures with the possibility of contactless stress measurement inside the material. For this purpose a contactless tensile stress sensor using an improved induction principle based on magnetic microwires embedded in the composite structure has been developed. A methodology based on the E-N approach was applied for the analysis of the repeated loading of the wing hinge connection, including finite element method (FEM) fatigue strength analysis. The results proved that composites, in comparison with metal structures, offer a significant weight reduction of small aircraft constructions, whereas the required strength, stability and lifetime of the components are retained.

  13. Comparative analysis as a basic research orientation: Key methodological problems

    Directory of Open Access Journals (Sweden)

    N P Narbut

    2015-12-01

    Full Text Available To date, the Sociological Laboratory of the Peoples' Friendship University of Russia has accumulated a vast experience in the field of cross-cultural studies, reflected in the publications based on the results of mass surveys conducted in Moscow, Maikop, Beijing, Guangzhou, Prague, Belgrade, and Pristina. However, these publications mainly focus on the comparisons of the empirical data rather than methodological and technical issues; that is why the aim of this article is to identify key problems of the comparative analysis in cross-cultural studies that become evident only if you conduct an empirical research yourself, from the first step of setting the problem and approving it by all the sides (countries) involved to the last step of interpreting and comparing the data obtained. The authors are sure that no sociologist would ever doubt the necessity and importance of comparative analysis in the broadest sense of the word, but at the same time very few are ready to discuss its key methodological challenges and prefer to ignore them completely. We summarize the problems of comparative analysis in sociology as follows: (1) applying research techniques to the sample in another country, both in translating and adapting them to different social realities and worldviews (in particular, the problematic status of standardization and the qualitative approach); (2) choosing the "right" respondents to question and relevant cases (cultures) to study; (3) designing the research scheme, i.e. justifying the sequence of steps (what should go first: methodology or techniques); (4) accepting the procedures that are correct within one country for cross-cultural work (whether or not that is an appropriate choice).

  14. A new methodology based on functional principal component analysis to study postural stability post-stroke.

    Science.gov (United States)

    Sánchez-Sánchez, M Luz; Belda-Lois, Juan-Manuel; Mena-Del Horno, Silvia; Viosca-Herrero, Enrique; Igual-Camacho, Celedonia; Gisbert-Morant, Beatriz

    2018-05-05

    A major goal in stroke rehabilitation is the establishment of more effective physical therapy techniques to recover postural stability. Functional Principal Component Analysis provides greater insight into recovery trends. However, when missing values exist, obtaining functional data presents some difficulties. The purpose of this study was to reveal an alternative technique for obtaining the Functional Principal Components without requiring the conversion to functional data beforehand, and to investigate this methodology to determine the effect of specific physical therapy techniques on balance recovery trends in elderly subjects with hemiplegia post-stroke. A randomized controlled pilot trial was developed. Thirty inpatients post-stroke were included. Control and target groups were treated with the same conventional physical therapy protocol based on functional criteria, but specific techniques were added to the target group depending on the subjects' functional level. Postural stability during standing was quantified by posturography. The assessments were performed once a month from the moment the participants were able to stand up to six months post-stroke. The target group showed a significant improvement in postural control recovery trend six months after stroke that was not present in the control group. Some of the assessed parameters revealed significant differences between treatment groups. The alternative technique allowed Functional Principal Component Analysis to be performed when data is scarce. Moreover, it allowed the dynamics of recovery of two different treatment groups to be determined, showing that the techniques added in the target group increased postural stability compared to the base protocol. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Automated knowledge acquisition for second generation knowledge base systems: A conceptual analysis and taxonomy

    Energy Technology Data Exchange (ETDEWEB)

    Williams, K.E.; Kotnour, T.

    1991-01-01

    In this paper, we present a conceptual analysis of knowledge-base development methodologies. The purpose of this research is to help overcome the high cost and lack of efficiency in developing knowledge base representations for artificial intelligence applications. To accomplish this purpose, we analyzed the available methodologies and developed a knowledge-base development methodology taxonomy. We review manual, machine-aided, and machine-learning methodologies. A set of developed characteristics allows description and comparison among the methodologies. We present the results of this conceptual analysis of methodologies and recommendations for development of more efficient and effective tools.

  17. Multiple Methodologies: Using Community-Based Participatory Research and Decolonizing Methodologies in Kenya

    Science.gov (United States)

    Elder, Brent C.; Odoyo, Kenneth O.

    2018-01-01

    In this project, we examined the development of a sustainable inclusive education system in western Kenya by combining community-based participatory research (CBPR) and decolonizing methodologies. Through three cycles of qualitative interviews with stakeholders in inclusive education, participants explained what they saw as foundational components…

  18. Sensor-based activity recognition using extended belief rule-based inference methodology.

    Science.gov (United States)

    Calzada, A; Liu, J; Nugent, C D; Wang, H; Martinez, L

    2014-01-01

    The recently developed extended belief rule-based inference methodology (RIMER+) recognizes the need to model different types of information and uncertainty that usually coexist in real environments. A home setting with sensors located in different rooms and on different appliances can be considered a particularly relevant example of such an environment, which brings a range of challenges for sensor-based activity recognition. Although RIMER+ has been designed as a generic decision model that could be applied in a wide range of situations, this paper discusses how this methodology can be adapted to recognize human activities using binary sensors within smart environments. The evaluation of RIMER+ against other state-of-the-art classifiers in terms of accuracy, efficiency and applicability was found to be significantly relevant, especially in situations of input data incompleteness, and it demonstrates the potential of this methodology and underpins the basis for further research on the topic.

  19. Damage detection methodology on beam-like structures based on combined modal Wavelet Transform strategy

    Science.gov (United States)

    Serra, Roger; Lopez, Lautaro

    2018-05-01

    Different approaches to the detection of damage based on dynamic measurement of structures have appeared in the last decades. They were based, amongst others, on changes in natural frequencies, modal curvatures, strain energy or flexibility. Wavelet analysis has also been used to detect the abnormalities in modal shapes induced by damage. However, the majority of previous work was done with signals not corrupted by noise. Moreover, the damage influence for each mode shape was studied separately. This paper proposes a new methodology based on a combined modal wavelet transform strategy to cope with noisy signals while, at the same time, being able to extract the relevant information from each mode shape. The proposed methodology is then compared with the most frequently used and widely studied methods from the bibliography. To evaluate the performance of each method, their capacity to detect and localize damage is analyzed in different cases. The comparison is done by simulating the oscillations of a cantilever steel beam with and without defect as a numerical case. The proposed methodology proved to outperform classical methods in terms of noisy signals.
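
    A hedged sketch of the core idea follows: a damage-induced slope discontinuity in a mode shape produces a local extremum in the wavelet coefficients near the damage. The beam data is simulated and a Mexican-hat transform is implemented directly; the paper's combined-mode strategy is more elaborate than this single-scale illustration:

        import numpy as np

        def mexican_hat(n, s):
            """Mexican-hat (Ricker) wavelet sampled on n points at scale s."""
            t = (np.arange(n) - n // 2) / s
            return (1 - t ** 2) * np.exp(-t ** 2 / 2)

        x = np.linspace(0, 1, 500)
        healthy = 1 - np.cos(np.pi * x / 2)            # smooth first-mode shape
        damaged = healthy.copy()
        damaged[300:] += 0.002 * (x[300:] - x[300])    # kink (damage) at node 300

        scale = 20
        w = mexican_hat(8 * scale, scale)
        coeffs = np.convolve(damaged - healthy, w, mode="same")

        # Exclude the right boundary, where zero-padding creates a false edge.
        loc = int(np.argmax(np.abs(coeffs[:400])))
        print("damage localized near index", loc, "(true: 300)")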

  20. A deviation based assessment methodology for multiple machine health patterns classification and fault detection

    Science.gov (United States)

    Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay

    2018-01-01

    Successful applications of Diffusion Map (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize the high-dimensional, complex and nonlinear machine data, and thus suggests more knowledge about the machine under monitoring. In this paper, a DM based methodology named as DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring have been analyzed and addressed. Based on the proposed DM-EVD, a deviation based methodology is then proposed to include more dimension reduction methods. In this work, the incorporation of Laplacian Eigen-map and Principal Component Analysis (PCA) are explored, and the latter algorithm is named as PCA-Dev and is validated in the case study. To show the successful application of the proposed methodology, case studies from diverse fields are presented and investigated in this work. Improved results are reported by benchmarking with other machine learning algorithms.
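
    The deviation idea is easy to prototype. The sketch below is a loose stand-in for the paper's PCA-Dev variant (its exact scoring is not reproduced here): it fits PCA to healthy-regime features and scores new samples by their reconstruction error against a threshold set on healthy data; all data are synthetic.

      # Hedged sketch of a PCA-based deviation index for health assessment.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      healthy = rng.normal(size=(200, 10))                     # baseline feature vectors
      faulty = healthy[:20] + rng.normal(3.0, 1.0, (20, 10))   # shifted regime

      pca = PCA(n_components=3).fit(healthy)

      def deviation(x):
          """Squared reconstruction error relative to the healthy subspace."""
          recon = pca.inverse_transform(pca.transform(x))
          return np.sum((x - recon) ** 2, axis=1)

      threshold = np.percentile(deviation(healthy), 99)        # healthy-data limit
      print("alarms:", np.sum(deviation(faulty) > threshold), "of", len(faulty))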

  1. A SAS2H/KENO-V methodology for 3D fuel burnup analysis

    International Nuclear Information System (INIS)

    Milosevic, M.; Greenspan, E.; Vujic, J.

    2002-01-01

    An efficient methodology for 3D fuel burnup analysis of LWR reactors is described in this paper. This methodology is founded on coupling a Monte Carlo method for 3D calculation of the node power distribution with a transport method for the depletion calculation in a 1D Wigner-Seitz equivalent cell for each node independently. The proposed fuel burnup modeling, based on application of the SCALE-4.4a control modules SAS2H and KENO-V.a, is verified for the case of a 2D x-y model of an IRIS 15 x 15 fuel assembly (with reflective boundary conditions) by using two well-benchmarked code systems. One is MOCUP, a coupled MCNP-4C and ORIGEN2.1 utility code, and the second is the KENO-V.a/ORIGEN2.1 code system recently developed by the authors of this paper. The proposed SAS2H/KENO-V.a methodology was applied for 3D burnup analysis of the IRIS-1000 benchmark.44 core. Detailed k_eff and power density evolution with burnup are reported. (author)

  2. Analysis of Combined Data from Heterogeneous Study Designs: A Methodological Proposal from the Patient Navigation Research program

    Science.gov (United States)

    Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; Design, PNRP; Committee, Analysis

    2013-01-01

    Background: The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose: To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods: The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed included: (1) analyzing and reporting each project separately, (2) combining data from all projects and performing an individual-level analysis, (3) pooling data from projects having similar study designs, (4) analyzing pooled data using a prospective meta-analytic technique, (5) analyzing pooled data utilizing a novel simulated group-randomized design. Results: Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and in their impact from differing project sample sizes. Limitations: The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions: The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become

  3. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    Oliveira, Laura Cristina de

    2003-01-01

    In the present work, the viability of using neutron activation analysis to perform urine and blood clinical analyses was checked. The aim of this study is to investigate the biological behavior of animals that have been fed chow doped with natural uranium for a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentrations in biological samples. The quantitative results for urine sediment using NAA were compared with conventional clinical analysis, and the results were compatible. This methodology was also used on bone and body organs such as the liver and muscles to help the interpretation of possible anomalies. (author)

  4. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K of PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin as compared with a classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, generally two kinds of uncertainties are required to be identified and quantified: model uncertainties and plant status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best-estimate LOCA code, such as RELAP5 and TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Generally, DRHM can generate about 80-100 K of margin on PCT as compared to an Appendix K bounding-state LOCA analysis.
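
    The abstract does not say which statistical treatment DRHM applies to plant status uncertainties, but CSAU-type analyses commonly use nonparametric (Wilks) tolerance limits, so a minimal sketch of that calculation is given below for orientation; it reproduces the familiar 59-run requirement for a first-order, one-sided 95/95 PCT statement.

      # Hedged sketch: minimum number of code runs for a one-sided 95/95
      # nonparametric (Wilks) tolerance limit. The paper does not state that
      # DRHM uses Wilks; this is illustrative only.
      def wilks_n(coverage=0.95, confidence=0.95):
          """Smallest n with P(max of n runs bounds the coverage quantile) >= confidence."""
          n = 1
          while 1.0 - coverage ** n < confidence:
              n += 1
          return n

      print(wilks_n())  # -> 59 runs for a first-order 95/95 PCT estimate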

  5. A Model-Based Prognostics Methodology For Electrolytic Capacitors Based On Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...
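
    As a hedged sketch of the idea (the NASA model and its parameters are not reproduced here; the degradation law, noise levels, and end-of-life threshold below are assumptions), a scalar Kalman filter can track capacitance loss and extrapolate to an end-of-life threshold for a remaining-useful-life estimate:

      # Minimal sketch: scalar Kalman filter tracking capacitance degradation,
      # assuming a linear drift model (illustrative values throughout).
      import numpy as np

      dt, decay = 1.0, -0.0005          # aging step (hours) and assumed drift (uF/h)
      q, r = 1e-8, 1e-4                 # process and measurement noise variances
      c_est, p = 100.0, 1.0             # initial capacitance estimate (uF) and variance
      eol = 80.0                        # assumed end-of-life threshold (uF)

      rng = np.random.default_rng(1)
      truth = 100.0
      for k in range(1, 201):
          truth += decay * dt
          z = truth + rng.normal(0.0, r ** 0.5)       # noisy capacitance measurement
          # predict
          c_pred, p_pred = c_est + decay * dt, p + q
          # update
          gain = p_pred / (p_pred + r)
          c_est = c_pred + gain * (z - c_pred)
          p = (1.0 - gain) * p_pred

      rul = (c_est - eol) / (-decay)    # hours until the threshold at current drift
      print(f"estimated RUL: {rul:.0f} h")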

  6. A Duration Prediction Using a Material-Based Progress Management Methodology for Construction Operation Plans

    Directory of Open Access Journals (Sweden)

    Yongho Ko

    2017-04-01

    Full Text Available Precise and accurate prediction models for duration and cost enable contractors to improve their decision making for effective resource management in terms of sustainability in construction. Previous studies have been limited to cost-based estimations, but this study focuses on a material-based progress management method. Cost-based estimations typically used in construction, such as the earned value method, rely on comparing the planned budget with the actual cost. However, accurately planning budgets requires analysis of many factors, such as the financial status of the sectors involved. Furthermore, there is a higher possibility of changes in the budget than in the total amount of material used during construction, which is deduced from the quantity take-off from drawings and specifications. Accordingly, this study proposes a material-based progress management methodology, which was developed using different predictive analysis models (regression, neural network, and auto-regressive moving average) as well as datasets on material and labor, which can be extracted from daily work reports from contractors. A case study on actual datasets was conducted, and the results show that the proposed methodology can be efficiently used for progress management in construction.
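
    A minimal sketch of the regression branch of such a model (synthetic data; the variable set is assumed, not taken from the paper's daily work reports) might look like this:

      # Illustrative sketch: predict duration from material and labor quantities
      # using ordinary regression, one of the model families the study compares.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(0)
      material = rng.uniform(0, 1000, size=(60, 2))   # e.g., concrete (m3), rebar (t)
      labor = rng.uniform(10, 50, size=(60, 1))       # crew-days per report
      X = np.hstack([material, labor])
      days = 0.05 * material[:, 0] + 0.3 * material[:, 1] + 0.8 * labor[:, 0] \
             + rng.normal(0, 5, 60)                   # synthetic observed durations

      model = LinearRegression().fit(X, days)
      print("R^2 on training data:", round(model.score(X, days), 3))
      print("predicted duration:", model.predict([[500.0, 20.0, 30.0]])[0])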

  7. A Model-based Prognostics Methodology for Electrolytic Capacitors Based on Electrical Overstress Accelerated Aging

    Data.gov (United States)

    National Aeronautics and Space Administration — A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical...

  8. Analysis of core damage frequency from internal events: Methodology guidelines: Volume 1

    International Nuclear Information System (INIS)

    Drouin, M.T.; Harper, F.T.; Camp, A.L.

    1987-09-01

    NUREG-1150 examines the risk to the public from a selected group of nuclear power plants. This report describes the methodology used to estimate the internal event core damage frequencies of four plants in support of NUREG-1150. In principle, this methodology is similar to methods used in past probabilistic risk assessments; however, based on past studies and using analysts that are experienced in these techniques, the analyses can be focused in certain areas. In this approach, only the most important systems and failure modes are modeled in detail. Further, the data and human reliability analyses are simplified, with emphasis on the most important components and human actions. Using these methods, an analysis can be completed in six to nine months using two to three full-time systems analysts and part-time personnel in other areas, such as data analysis and human reliability analysis. This is significantly faster and less costly than previous analyses and provides most of the insights that are obtained by the more costly studies. 82 refs., 35 figs., 27 tabs

  9. Risk-based decision analysis for groundwater operable units

    International Nuclear Information System (INIS)

    Chiaramonte, G.R.

    1995-01-01

    This document proposes a streamlined approach and methodology for performing risk assessment in support of interim remedial measure (IRM) decisions involving the remediation of contaminated groundwater on the Hanford Site. This methodology, referred to as "risk-based decision analysis," also supports the specification of target cleanup volumes and provides a basis for design and operation of the groundwater remedies. The risk-based decision analysis can be completed within a short time frame and concisely documented. The risk-based decision analysis is more versatile than the qualitative risk assessment (QRA), because it not only supports the need for IRMs, but also provides criteria for defining the success of the IRMs and provides the risk basis for decisions on final remedies. For these reasons, it is proposed that, for groundwater operable units, the risk-based decision analysis should replace the more elaborate, costly, and time-consuming QRA

  10. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard for uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample sizes, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness at lower cost.
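
    The end product the abstract describes, a probability distribution of total delay along an adversary path, can be illustrated with a small Monte Carlo sketch; the lognormal task-time distributions and their parameters below are assumptions, not the Bayesian models the report develops.

      # Minimal sketch: Monte Carlo distribution of total path delay from
      # assumed per-task lognormal time distributions.
      import numpy as np

      rng = np.random.default_rng(42)
      # (median seconds, dispersion) for each sequential delay task -- illustrative
      tasks = [(30.0, 0.4), (120.0, 0.6), (45.0, 0.5)]

      n = 100_000
      total = np.zeros(n)
      for median, sigma in tasks:
          total += rng.lognormal(mean=np.log(median), sigma=sigma, size=n)

      print("mean delay:", total.mean())
      print("10th percentile (defender-conservative):", np.percentile(total, 10))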

  11. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies

    International Nuclear Information System (INIS)

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-01-01

    Highlights: • Several methods based on nanotechnology achieve limits of detection in the pM and nM ranges for mercury (II) analysis. • Most of these methods are validated in filtered water samples and/or spiked samples. • Thiols in real samples constitute real competition for any sensor based on the binding of mercury (II) ions. • Future research should include the study of matrix interferences, including thiols and dissolved organic matter. -- Abstract: In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals allow reaching detection limits within the pM and nM ranges. Most of these developments proved their suitability for detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies is still behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is highly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative incidence of the ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis

  12. ALARA cost/benefit analysis at Union Electric company using the ARP/AI methodology

    International Nuclear Information System (INIS)

    Williams, M.C.

    1987-01-01

    This paper describes the development of a specific method for justification of expenditures associated with reducing occupational radiation exposure to as low as reasonably achievable (ALARA). This methodology is based on the concepts of the Apparent Reduction Potential (ARP) and Achievability Index (AI) as described in NUREG/CR-0446, Union Electric's corporate planning model, and the EPRI model for dose rate buildup with reactor operating life. The ARP provides a screening test to determine if there is a need for ALARA expenditures based on actual or predicted exposure rates and/or dose experience. The AI is a means of assessing all costs and all benefits, even though they are expressed in different units of measurement such as person-rem and dollars, to determine whether ALARA expenditures are justified and their value. This method of cost/benefit analysis can be applied by any company or organization utilizing site-specific exposure and dose rate data, and incorporating consideration of administrative exposure controls, which may vary from organization to organization. Specific example cases are presented and compared to other methodologies for ALARA cost/benefit analysis

  13. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  14. An ontological case base engineering methodology for diabetes management.

    Science.gov (United States)

    El-Sappagh, Shaker H; El-Masri, Samir; Elmogy, Mohammed; Riad, A M; Saddik, Basema

    2014-08-01

    Ontology engineering covers issues related to ontology development and use. In a Case-Based Reasoning (CBR) system, ontology plays two main roles: the first as the case base and the second as the domain ontology. However, the ontology engineering literature does not provide adequate guidance on how to build, evaluate, and maintain ontologies. This paper proposes an ontology engineering methodology to generate case bases in the medical domain. It mainly focuses on the representation of cases in the form of an ontology to support semantic case retrieval and to enhance all knowledge-intensive CBR processes. A case study on a diabetes diagnosis case base is provided to evaluate the proposed methodology.

  15. Teaching Research Methodology Using a Project-Based Three Course Sequence: Critical Reflections on Practice

    Science.gov (United States)

    Braguglia, Kay H.; Jackson, Kanata A.

    2012-01-01

    This article presents a reflective analysis of teaching research methodology through a three course sequence using a project-based approach. The authors reflect critically on their experiences in teaching research methods courses in an undergraduate business management program. The introduction of a range of specific techniques including student…

  16. A new methodology of spatial cross-correlation analysis.

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes.
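
    A hedged sketch of the global coefficient is given below: by analogy with Moran's index written as a quadratic form, it evaluates z_x' W z_y with standardized variables and a unit-sum spatial weight matrix. The paper's exact normalization may differ, and the four-region data are illustrative.

      # Minimal sketch of a global spatial cross-correlation coefficient
      # as a quadratic form (cross-Moran statistic).
      import numpy as np

      def global_cross_correlation(x, y, w):
          zx = (x - x.mean()) / x.std()
          zy = (y - y.mean()) / y.std()
          w = w / w.sum()               # scale spatial weights to unit total
          return zx @ w @ zy

      # four regions on a line: rook-style contiguity weights
      w = np.array([[0, 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)
      urbanization = np.array([0.3, 0.4, 0.6, 0.7])
      gdp_per_cap = np.array([1.0, 1.4, 2.1, 2.4])
      print(global_cross_correlation(urbanization, gdp_per_cap, w))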

  17. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    International Nuclear Information System (INIS)

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for Analysis of External Events in the PSA of the Jose Cabrera NPP. In the Fire Analysis, a version of the COMPBRN3 code, modified by Empresarios Agrupados according to the guidelines of Appendix D of NUREG/CR-5088, has been used. Generic cases were modelled and general conclusions obtained, applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The Flood Analysis methodology is based on the construction of event trees to represent flood propagation, dependent on the condition of the communication paths between areas, and of trees showing propagation stages as a function of affected areas and damaged mitigation equipment. To determine the temporal evolution of the flood level in each area, the CAINZO-EA code has been developed, adapted to specific plant characteristics. In both the Fire and Flood Analyses a quantification methodology has been adopted which consists of analysing the damage caused at each stage of growth or propagation and identifying, in the Internal Events models, the gates, basic events or headers to which guaranteed failure (probability 1) due to damage is assigned. (Author)

  18. An efficient methodology for the analysis of primary frequency control of electric power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D.P. [Nikola Tesla Institute, Belgrade (Yugoslavia); Mijailovic, S.V. [Electricity Coordinating Center, Belgrade (Yugoslavia)

    2000-06-01

    The paper presents an efficient methodology for the analysis of primary frequency control of electric power systems. This methodology continuously monitors the electromechanical transient processes, with durations of up to 30 s, occurring after characteristic disturbances. It covers the period of short-term dynamic processes appearing immediately after the disturbance, in which the dynamics of the individual synchronous machines are dominant, as well as the period with uniform movement of all generators and restoration of their voltages. The characteristics of the developed methodology were determined using the example of the real electric power interconnection formed by the electric power systems of Yugoslavia, a part of the Republic of Srpska, Romania, Bulgaria, the former Yugoslav Republic of Macedonia, Greece and Albania (the second UCPTE synchronous zone). (author)

  19. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

    2008-01-01

    In the present paper an uncertainty analysis of an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology, the model is conditioned on observation time series from two flow gauges as well as on the occurrence of combined sewer overflow. The GLUE methodology is used to test different conceptual setups in order to determine if one model setup gives a better goodness of fit, conditional on the observations, than the other. Moreover, different methodological investigations of GLUE are conducted in order to test if the uncertainty analysis is unambiguous. It is shown that the GLUE methodology is very applicable in uncertainty analysis of this application of an urban drainage model, although it proved quite difficult to get good fits of the whole time series.
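
    A toy GLUE loop conveys the mechanics (MOUSE is commercial software, so a one-parameter synthetic runoff model and a Nash-Sutcliffe likelihood stand in; the 0.5 behavioral threshold is an assumption):

      # Minimal GLUE sketch: sample parameter sets, score them with a
      # Nash-Sutcliffe likelihood, keep the behavioral ones, weight predictions.
      import numpy as np

      rng = np.random.default_rng(7)
      rain = rng.gamma(2.0, 1.0, 100)                     # synthetic rainfall series
      obs = np.convolve(rain, [0.5, 0.3, 0.2])[:100] + rng.normal(0, 0.2, 100)

      def model(k):                                       # one-parameter toy runoff model
          kernel = k ** np.arange(3)
          return np.convolve(rain, kernel / kernel.sum())[:100]

      ks = rng.uniform(0.1, 1.0, 2000)                    # sampled parameter sets
      sims = np.array([model(k) for k in ks])
      nse = 1 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()

      behavioral = nse > 0.5                              # GLUE acceptance threshold
      weights = nse[behavioral] / nse[behavioral].sum()
      pred = weights @ sims[behavioral]                   # likelihood-weighted prediction
      print("behavioral sets:", behavioral.sum(), "of", len(ks))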

  20. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    Science.gov (United States)

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  1. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  2. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  3. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project

    International Nuclear Information System (INIS)

    Roberds, W.J.; Plum, R.J.; Visca, P.J.

    1984-11-01

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab

  4. Methodology for thermal-hydraulics analysis of pool type MTR fuel research reactors

    International Nuclear Information System (INIS)

    Umbehaun, Pedro Ernesto

    2000-01-01

    This work presents a methodology developed for the thermal-hydraulic analysis of pool-type MTR-fuel research reactors. For this methodology, a computational program, FLOW, and a model, MTRCR-IEAR1, were developed. FLOW calculates the cooling flow distribution in the fuel elements, control elements and irradiators, and through the channels formed among the fuel elements and among the irradiators and reflectors. This computer program was validated against experimental data for the IEA-R1 research reactor core at IPEN-CNEN/SP. MTRCR-IEAR1 is a model based on the commercial program Engineering Equation Solver (EES). Besides the steady-state thermal-hydraulic analyses of the core performed by traditional computational programs like COBRA-3C/RERTR and PARET, this model allows the analysis of parallel channels with different cooling flows and/or geometries. Uncertainty factors for the variables from the neutronic and thermal-hydraulic calculations, and also from the fabrication of the fuel element, are introduced in the model. For steady-state analyses, MTRCR-IEAR1 showed good agreement with the results of COBRA-3C/RERTR and PARET. The developed methodology was used for the calculation of the cooling flow distribution and the thermal-hydraulic analysis of a typical configuration of the IEA-R1 research reactor core. (author)

  5. Fire risk analysis for nuclear power plants: Methodological developments and applications

    International Nuclear Information System (INIS)

    Kazarians, M.; Apostolakis, G.; Siu, N.O.

    1985-01-01

    A methodology to quantify the risk from fires in nuclear power plants is described. This methodology combines engineering judgment, statistical evidence, fire phenomenology, and plant system analysis. It can be divided into two major parts: (1) fire scenario identification and quantification, and (2) analysis of the impact on plant safety. This article primarily concentrates on the first part. Statistical analysis of fire occurrence data is used to establish the likelihood of ignition. The temporal behaviors of the two competing phenomena, fire propagation and fire detection and suppression, are studied and their characteristic times are compared. Severity measures are used to further specialize the frequency of the fire scenario. The methodology is applied to a switchgear room of a nuclear power plant

  6. An overview of farming system typology methodologies and its use in the study of pasture-based farming system: a review

    Energy Technology Data Exchange (ETDEWEB)

    Madry, W.; Mena, Y.; Roszkowska, B.; Gozdowski, D.; Hryniewski, R.; Castel, J. M.

    2013-06-01

    The main objective of this paper is to critically review the use of typology methodologies for pasture-based farming systems (PBFS), especially those situated in less favoured areas, showing in each case the most relevant variables or indicators determining the farming system classification. Another objective is to provide an overview of the farming system typology methodologies most used in general. First, some considerations about the concept of a farming system and the approaches to its study are given. Next, farming system typology methodologies are presented for farming systems in general, but with emphasis on PBFS. The different tools integrated in these methodologies are considered: sampling methods, sources of data, variables or indicators obtained from available data, and techniques of analysis (statistical or not). Methods for farming system classification are presented (expert methods, analytical methods, or a combination of both). Among the statistical methods, multivariate analysis is treated in most detail, including principal component analysis and cluster analysis; a sketch of this route follows below. Finally, the use of farming system typology methodologies on different pasture-based farming systems is presented. The most important aspects considered are the following: the main objective of the typology, the main animal species, the employed methods of classification, and the main variables involved in this classification. (Author) 56 refs.
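
    For the multivariate route mentioned above, a minimal sketch (synthetic farms; the structural indicators are assumed) chains standardization, principal component analysis, and cluster analysis into a two-type typology:

      # Illustrative sketch: PCA + k-means farm typology on synthetic data.
      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(3)
      # columns: herd size, grazing area (ha), share of pasture in the diet
      farms = np.vstack([
          rng.normal([30, 20, 0.8], [5, 4, 0.05], (40, 3)),    # extensive pasture farms
          rng.normal([120, 15, 0.3], [20, 4, 0.05], (40, 3)),  # intensive farms
      ])

      scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(farms))
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
      print("farms per type:", np.bincount(labels))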

  7. A scenario-based procedure for seismic risk analysis

    International Nuclear Information System (INIS)

    Kluegel, J.-U.; Mualchin, L.; Panza, G.F.

    2006-12-01

    A new methodology for seismic risk analysis based on a probabilistic interpretation of deterministic or scenario-based hazard analysis, in full compliance with the likelihood principle and therefore meeting the requirements of modern risk analysis, has been developed. The proposed methodology can easily be adjusted to deliver its output in a format required by safety analysts and civil engineers. The scenario-based approach allows the incorporation of all available information collected in a geological, seismotectonic and geotechnical database of the site of interest, as well as advanced physical modelling techniques, to provide a reliable and robust deterministic design basis for civil infrastructures. The robustness of this approach is of special importance for critical infrastructures. At the same time, a scenario-based seismic hazard analysis allows the development of the required input for probabilistic risk assessment (PRA), as required by safety analysts and insurance companies. The scenario-based approach removes the ambiguity in the results of probabilistic seismic hazard analysis (PSHA), which relies on the projections of the Gutenberg-Richter (G-R) equation. The problems with the validity of G-R projections, owing to incomplete or totally absent data for making the projections, are still unresolved. Consequently, information from G-R must not be used in decisions on the design of critical structures or critical elements in a structure. The scenario-based methodology is strictly based on observable facts and data, complemented by physical modelling techniques that can be submitted to a formalised validation process. By means of sensitivity analysis, knowledge gaps related to lack of data can be dealt with easily, due to the limited number of scenarios to be investigated. The proposed seismic risk analysis can be used with confidence for planning, insurance and engineering applications. (author)

  8. Field programmable gate array reliability analysis using the dynamic flow graph methodology

    Energy Technology Data Exchange (ETDEWEB)

    McNelles, Phillip; Lu, Lixuan [Faculty of Energy Systems and Nuclear Science, University of Ontario Institute of Technology (UOIT), Ontario (Canada)

    2016-10-15

    Field programmable gate array (FPGA)-based systems are thought to be a practical option to replace certain obsolete instrumentation and control systems in nuclear power plants. An FPGA is a type of integrated circuit, which is programmed after being manufactured. FPGAs have some advantages over other electronic technologies, such as analog circuits, microprocessors, and Programmable Logic Controllers (PLCs), for nuclear instrumentation and control, and safety system applications. However, safety-related issues for FPGA-based systems remain to be verified. Owing to this, modeling FPGA-based systems for safety assessment has now become an important point of research. One potential methodology is the dynamic flowgraph methodology (DFM). It has been used for modeling software/hardware interactions in modern control systems. In this paper, FPGA logic was analyzed using DFM. Four aspects of FPGAs are investigated: the 'IEEE 1164 standard', registers (D flip-flops), configurable logic blocks, and an FPGA-based signal compensator. The ModelSim simulations confirmed that DFM was able to accurately model those four FPGA properties, proving that DFM has the potential to be used in the modeling of FPGA-based systems. Furthermore, advantages of DFM over traditional reliability analysis methods and FPGA simulators are presented, along with a discussion of potential issues with using DFM for FPGA-based system modeling.

  9. CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL

    Energy Technology Data Exchange (ETDEWEB)

    Vinson, D.

    2010-07-11

    Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in the Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF even with severe cladding breaches for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U. S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and

  10. A GIS-based methodology for the estimation of potential volcanic damage and its application to Tenerife Island, Spain

    Science.gov (United States)

    Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.

    2014-05-01

    This paper presents a GIS-based methodology to estimate the damage produced by volcanic eruptions. The methodology comprises four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment, and estimation of expected damage. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex and simulated with the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies of the built environment and is complemented with an analysis of transportation and urban infrastructures. Damage assessment is performed by associating a qualitative damage rating with each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each element. The methodology is then automated in a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected-damage maps. The tool is applied to the Icod Valley (north of Tenerife Island), which is likely to be affected by volcanic phenomena in case of an eruption from either the Teide-Pico Viejo volcanic complex or the North-West basaltic rift. Results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on the user's preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.
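
    The damage-assessment overlay reduces to a lookup over co-registered class grids. A minimal raster sketch is shown below; the rating table and class grids are illustrative, not the tool's calibrated values.

      # Minimal sketch: overlay a hazard class grid with a vulnerability class
      # grid and look up a qualitative damage rating for each cell.
      import numpy as np

      # damage_rating[hazard_class, vulnerability_class] -> 0 none ... 3 severe
      damage_rating = np.array([[0, 0, 1],
                                [1, 2, 2],
                                [2, 3, 3]])

      hazard = np.array([[0, 1, 2],          # e.g., ash load / lava invasion classes
                         [1, 2, 2]])
      vulnerability = np.array([[2, 1, 0],   # e.g., building typology classes
                                [0, 1, 2]])

      expected_damage = damage_rating[hazard, vulnerability]
      print(expected_damage)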

  11. Final Report, Nuclear Energy Research Initiative (NERI) Project: An Innovative Reactor Analysis Methodology Based on a Quasidiffusion Nodal Core Model

    International Nuclear Information System (INIS)

    Anistratov, Dmitriy Y.; Adams, Marvin L.; Palmer, Todd S.; Smith, Kord S.; Clarno, Kevin; Hikaru Hiruta; Razvan Nes

    2003-01-01

    OAK (B204) Final Report, NERI Project: "An Innovative Reactor Analysis Methodology Based on a Quasidiffusion Nodal Core Model". The present generation of reactor analysis methods uses few-group nodal diffusion approximations to calculate full-core eigenvalues and power distributions. The cross sections, diffusion coefficients, and discontinuity factors (collectively called "group constants") in the nodal diffusion equations are parameterized as functions of many variables, ranging from the obvious (temperature, boron concentration, etc.) to the more obscure (spectral index, moderator temperature history, etc.). These group constants, and their variations as functions of the many variables, are calculated by assembly-level transport codes. The current methodology has two main weaknesses that this project addressed. The first weakness is the diffusion approximation in the full-core calculation; this can be significantly inaccurate at interfaces between different assemblies. This project used the nodal diffusion framework to implement nodal quasidiffusion equations, which can capture transport effects to an arbitrary degree of accuracy. The second weakness is in the parameterization of the group constants; current models do not always perform well, especially at interfaces between unlike assemblies. The project developed a theoretical foundation for parameterization and homogenization models and used that theory to devise improved models. The new models were extended to tabulate information that the nodal quasidiffusion equations can use to capture transport effects in full-core calculations

  12. Design process dynamics in an experience-based context : a design methodological analysis of the Brabantia corkscrew development

    NARCIS (Netherlands)

    Vries, de M.J.

    1994-01-01

    In design methodology, the influence of various factors on design processes is studied. In this article the design of the Brabantia corkscrew is presented as a case study in which these factors are analysed. The aim of the analysis is to gain insight into the way Brabantia took these factors into account.

  13. A Java-Web-Based-Learning Methodology, Case Study ...

    African Journals Online (AJOL)

    A Java-Web-Based-Learning Methodology, Case Study: Waterborne diseases. The recent advances in web technologies have opened new opportunities for computer-based education. One can learn independently of time and place constraints, and have instantaneous access to relevant updated material at minimal cost.

  14. Requirements Analysis in the Value Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Conner, Alison Marie

    2001-05-01

    The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. With the varied backgrounds and experiences the team brings to the study come different perspectives on the problem and on the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, whether by regulation or by customer prescription. This paper provides insight into the level of rigor applied to a requirements analysis step and gives some examples of tools and techniques utilized to ease the management of the requirements, and of the functions those requirements support, for highly complex problems.

  15. A Novel Water Supply Network Sectorization Methodology Based on a Complete Economic Analysis, Including Uncertainties

    Directory of Open Access Journals (Sweden)

    Enrique Campbell

    2016-04-01

    Full Text Available The core idea behind sectorization of Water Supply Networks (WSNs is to establish areas partially isolated from the rest of the network to improve operational control. Besides the benefits associated with sectorization, some drawbacks must be taken into consideration by water operators: the economic investment associated with both boundary valves and flowmeters and the reduction of both pressure and system resilience. The target of sectorization is to properly balance these negative and positive aspects. Sectorization methodologies addressing the economic aspects mainly consider costs of valves and flowmeters and of energy, and the benefits in terms of water saving linked to pressure reduction. However, sectorization entails other benefits, such as the reduction of domestic consumption, the reduction of burst frequency and the enhanced capacity to detect and intervene over future leakage events. We implement a development proposed by the International Water Association (IWA to estimate the aforementioned benefits. Such a development is integrated in a novel sectorization methodology based on a social network community detection algorithm, combined with a genetic algorithm optimization method and Monte Carlo simulation. The methodology is implemented over a fraction of the WSN of Managua city, capital of Nicaragua, generating a net benefit of 25,572 $/year.
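
    The community detection stage can be illustrated on a toy pipe graph; here networkx's greedy modularity routine stands in for the social network community detection algorithm the paper combines with a genetic algorithm and Monte Carlo simulation:

      # Minimal sketch: candidate sectors (district metered areas) as graph
      # communities in a small pipe network.
      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities

      G = nx.Graph()
      pipes = [("src", "a"), ("a", "b"), ("b", "c"), ("a", "c"),   # one neighborhood
               ("c", "d"), ("d", "e"), ("e", "f"), ("d", "f")]     # another
      G.add_edges_from(pipes)

      sectors = greedy_modularity_communities(G)
      for i, sector in enumerate(sectors):
          print(f"sector {i}: {sorted(sector)}")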

  16. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments

  17. Physical data generation methodology for return-to-power steam line break analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zee, Sung Kyun; Lee, Chung Chan; Lee, Chang Kue [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-02-01

    The current methodology to generate physics data for the steam line break accident analysis of CE-type nuclear plants such as Yonggwang Unit 3 is valid only if the core does not return to criticality after shutdown. Therefore, the methodology requires a tremendous amount of net scram worth, especially at the end of the cycle, when the moderator temperature coefficient is most negative. The current methodology uses ROCS, which includes only a closed channel model, but it is well known that a closed channel model predicts the core reactivity to be too negative if the core flow rate is low. Therefore, a conservative methodology is presented which utilizes an open channel 3D HERMITE model to obtain reasonably conservative physics data when the reactor returns to a power condition. Return-to-power reactivity credit is produced to supplement the reactivity table generated by the closed channel model. Other data include the hot channel axial power shape, peaking factor, and maximum quality for DNBR analysis, as well as pin census for radiological consequence analysis. 48 figs., 22 tabs., 18 refs. (Author)

  18. Model identification methodology for fluid-based inerters

    Science.gov (United States)

    Liu, Xiaofu; Jiang, Jason Zheng; Titurus, Branislav; Harrison, Andrew

    2018-06-01

    The inerter is the mechanical dual of the capacitor via the force-current analogy. It has the property that the force across its terminals is proportional to their relative acceleration. Compared with flywheel-based inerters, fluid-based forms have the advantages of improved durability, inherent damping and simplicity of design. In order to improve the understanding of the physical behaviour of this fluid-based device, especially the behaviour caused by hydraulic resistance and inertial effects in the external tube, this work proposes a comprehensive model identification methodology. Firstly, a modelling procedure is established which allows the topological arrangement of the mechanical networks to be obtained by mapping the damping, inertance and stiffness effects directly to their respective hydraulic counterparts. Secondly, an experimental sequence is followed which separates the identification of friction, stiffness and various damping effects. Furthermore, an experimental set-up is introduced in which two pressure gauges are used to accurately measure the pressure drop across the external tube. Theoretical models with improved confidence are obtained using the proposed methodology for a helical-tube fluid inerter prototype. The sources of the remaining discrepancies are further analysed.
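
    One ingredient of such an identification, fitting the inertance and damping of a candidate parallel network by least squares on measured signals, can be sketched as follows (synthetic signals and parameter values; the paper's hydraulic model is richer):

      # Minimal sketch: identify inertance b and damping c of a parallel
      # inerter-damper candidate from force, relative velocity and acceleration.
      import numpy as np

      rng = np.random.default_rng(5)
      t = np.linspace(0, 2, 2000)
      disp = 0.01 * np.sin(2 * np.pi * 5 * t)          # relative displacement (m)
      vel = np.gradient(disp, t)
      acc = np.gradient(vel, t)

      b_true, c_true = 150.0, 40.0                     # inertance (kg), damping (Ns/m)
      force = b_true * acc + c_true * vel + rng.normal(0, 0.5, t.size)

      A = np.column_stack([acc, vel])
      (b_hat, c_hat), *_ = np.linalg.lstsq(A, force, rcond=None)
      print(f"identified inertance {b_hat:.1f} kg, damping {c_hat:.1f} Ns/m")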

  19. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    The human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that come together for an error to occur in the human tasks performed under normal operating conditions and in those performed after an abnormal event. Additionally, analysis of various accidents in history has found the human component to be a contributing cause. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the first generation of HRA methodologies. Subsequently, methods were developed to include additional performance shaping factors, and the interactions between them, in their models, so that by the mid-1990s what are considered the second-generation methodologies had appeared. Among these is A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required independent evaluation of the two related human failure events. Thus, gathering the new human error probabilities involves quantification of the nominal scenario and of the cases of significant deviations, considered for their potential impact on the analyzed human failure events. As in the probabilistic safety analysis, the analysis of the sequences yielded the more specific factors with the highest contribution to the human error probabilities. (Author)

  20. Interaction between core analysis methodology and nuclear design: some PWR examples

    International Nuclear Information System (INIS)

    Rothleder, B.M.; Eich, W.J.

    1982-01-01

    The interaction between core analysis methodology and nuclear design is exemplified by PSEUDAX, a major improvement related to the Advanced Recycle Methodology Program (ARMP) computer code system, still undergoing development by the Electric Power Research Institute. The mechanism of this interaction is explored by relating several specific nuclear design changes to the demands these changes place on the ARMP system, and by examining how these demands are met, first within the standard ARMP methodology and then through augmentation of the standard methodology by the development of PSEUDAX

  1. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  2. SINGULAR SPECTRUM ANALYSIS: METHODOLOGY AND APPLICATION TO ECONOMICS DATA

    Institute of Scientific and Technical Information of China (English)

    Hossein HASSANI; Anatoly ZHIGLJAVSKY

    2009-01-01

    This paper describes the methodology of singular spectrum analysis (SSA) and demonstrates that it is a powerful method of time series analysis and forecasting, particularly for economic time series. The authors consider the application of SSA to the analysis and forecasting of the Iranian national accounts data as provided by the Central Bank of the Islamic Republic of Iran.
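
    Basic SSA is compact enough to sketch directly: embed the series in a trajectory (Hankel) matrix, take its SVD, and reconstruct a smooth component by diagonal averaging. The window length, rank, and synthetic monthly series below are illustrative choices, not the authors' settings.

      # Minimal SSA sketch: Hankel embedding, SVD, rank-r reconstruction.
      import numpy as np

      def ssa_reconstruct(series, window, rank):
          n = len(series)
          k = n - window + 1
          X = np.column_stack([series[i:i + window] for i in range(k)])  # trajectory matrix
          U, s, Vt = np.linalg.svd(X, full_matrices=False)
          Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]                      # rank-r approximation
          # diagonal averaging (Hankelization) back to a series
          out, counts = np.zeros(n), np.zeros(n)
          for j in range(k):
              out[j:j + window] += Xr[:, j]
              counts[j:j + window] += 1
          return out / counts

      t = np.arange(240)
      series = 0.05 * t + np.sin(2 * np.pi * t / 12) + np.random.normal(0, 0.3, t.size)
      trend_plus_cycle = ssa_reconstruct(series, window=24, rank=3)
      print(trend_plus_cycle[:5])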

  3. Coal conversion processes and analysis methodologies for synthetic fuels production. [technology assessment and economic analysis of reactor design for coal gasification

    Science.gov (United States)

    1979-01-01

    Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support the design and implementation of coal-based synthetic fuels complexes are identified. The potential market in the southeastern United States for coal-based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for the development of technology and data needed to improve gasification feasibility and economics are examined.

  4. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies.

    Science.gov (United States)

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-10-24

    In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals allow reaching detection limits within the pM and nM ranges. Most of these developments proved their suitability for detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies is still behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is highly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative incidence of the ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis.

  5. A New Methodology of Spatial Cross-Correlation Analysis

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
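
    As a rough illustration of the quadratic form behind the global coefficient, the sketch below computes a bivariate Moran-style statistic with NumPy. The normalization follows the conventional Moran's I scaling; the paper's exact definitions of the global and local coefficients may differ in detail, and the weights matrix and data are toy values.

    ```python
    import numpy as np

    def global_cross_correlation(x, y, W):
        """Moran-style global spatial cross-correlation of variables x and y.

        W is an n-by-n spatial weights matrix with a zero diagonal.
        """
        n = len(x)
        zx, zy = x - x.mean(), y - y.mean()
        num = zx @ W @ zy                        # spatial quadratic form
        den = np.sqrt((zx @ zx) * (zy @ zy))
        return (n / W.sum()) * num / den

    def local_cross_correlation(x, y, W):
        """Local coefficients: one value per spatial unit."""
        zx = (x - x.mean()) / x.std()
        zy = (y - y.mean()) / y.std()
        return zx * (W @ zy)

    # Toy example: 4 regions on a line, rook-style neighbours.
    W = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
    urbanization = np.array([0.3, 0.4, 0.6, 0.7])
    gdp_per_cap = np.array([1.0, 1.5, 2.2, 3.0])
    print(global_cross_correlation(urbanization, gdp_per_cap, W))
    print(local_cross_correlation(urbanization, gdp_per_cap, W))
    ```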

  6. Best-estimate methodology for analysis of anticipated transients without scram in pressurized water reactors

    International Nuclear Information System (INIS)

    Rebollo, L.

    1993-01-01

    Union Fenosa, a utility company in Spain, has performed research on pressurized water reactor (PWR) safety with respect to the development of a best-estimate methodology for the analysis of anticipated transients without scram (ATWS), i.e., those anticipated transients for which failure of the reactor protection system is postulated. A scientific and technical approach is adopted with respect to the ATWS phenomenon as it affects a PWR, specifically the Zorita nuclear power plant, a single-loop Westinghouse-designed PWR in Spain. In this respect, an ATWS sequence analysis methodology based on published codes that is generically applicable to any PWR is proposed, which covers all the anticipated phenomena and defines the applicable acceptance criteria. The areas contemplated are cell neutron analysis, core thermal hydraulics, and plant dynamics, which are developed, qualified, and validated by comparison with reference calculations and measurements obtained from integral or separate-effects tests.

  7. Methodological developments and applications of neutron activation analysis

    International Nuclear Information System (INIS)

    Kucera, J.

    2007-01-01

    The paper reviews the author's experience acquired and achievements made in methodological developments of neutron activation analysis (NAA) of mostly biological materials. These involve epithermal neutron activation analysis, radiochemical neutron activation analysis using both single- and multi-element separation procedures, use of various counting modes, and the development and use of the self-verification principle. The role of NAA in the detection of analytical errors is discussed and examples of applications of the procedures developed are given. (author)

  8. Risk-based Regulatory Evaluation Program methodology

    International Nuclear Information System (INIS)

    DuCharme, A.R.; Sanders, G.A.; Carlson, D.D.; Asselin, S.V.

    1987-01-01

    The objectives of this DOE-supported Regulatory Evaluation Program are to analyze and evaluate the safety importance and economic significance of existing regulatory guidance in order to assist in the improvement of the regulatory process for current-generation and future-design reactors. A risk-based cost-benefit methodology was developed to evaluate the safety benefit and cost of specific regulations or Standard Review Plan sections. Risk-based methods can be used in lieu of or in combination with deterministic methods in developing regulatory requirements and reaching regulatory decisions.

  9. A methodology for comprehensive breast cancer Ki67 labeling index with intra-tumor heterogeneity appraisal based on hexagonal tiling of digital image analysis data.

    Science.gov (United States)

    Plancoulaine, Benoit; Laurinaviciene, Aida; Herlin, Paulette; Besusparis, Justinas; Meskauskas, Raimundas; Baltrusaityte, Indra; Iqbal, Yasir; Laurinavicius, Arvydas

    2015-10-19

    Digital image analysis (DIA) enables higher accuracy, reproducibility, and capacity to enumerate cell populations by immunohistochemistry; however, the most unique benefits may be obtained by evaluating the spatial distribution and intra-tissue variance of markers. The proliferative activity of breast cancer tissue, estimated by the Ki67 labeling index (Ki67 LI), is a prognostic and predictive biomarker requiring robust measurement methodologies. We performed DIA on whole-slide images (WSI) of 302 surgically removed Ki67-stained breast cancer specimens; the tumour classifier algorithm was used to automatically detect tumour tissue but was not trained to distinguish between invasive and non-invasive carcinoma cells. The WSI DIA-generated data were subsampled by hexagonal tiling (HexT). Distribution and texture parameters were compared to conventional WSI DIA and pathology report data. Factor analysis of the data set, including total numbers of tumor cells, the Ki67 LI and Ki67 distribution, and texture indicators, extracted 4 factors, identified as entropy, proliferation, bimodality, and cellularity. The factor scores were further utilized in cluster analysis, outlining subcategories of heterogeneous tumors with predominant entropy, bimodality, or both at different levels of proliferative activity. The methodology also allowed the visualization of Ki67 LI heterogeneity in tumors and the automated detection and quantitative evaluation of Ki67 hotspots, based on the upper quintile of the HexT data, conceptualized as the "Pareto hotspot". We conclude that systematic subsampling of DIA-generated data into HexT enables comprehensive Ki67 LI analysis that reflects aspects of intra-tumor heterogeneity and may serve as a methodology to improve digital immunohistochemistry in general.
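
    A minimal sketch of the HexT subsampling step is given below, using Matplotlib's hexagonal binning as a stand-in for the authors' tiling. The data layout (one row per detected tumor cell with centroid coordinates and a 0/1 Ki67-positivity flag), the tile size, and the minimum-cell cutoff are all illustrative assumptions.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    n = 50000
    x, y = rng.uniform(0, 10000, n), rng.uniform(0, 8000, n)   # cell centroids (px)
    ki67 = (rng.uniform(size=n) < 0.2).astype(float)           # 0/1 positivity flags

    # Hexagonal tiling: per-hexagon Ki67 labeling index (mean positivity),
    # dropping tiles that contain too few cells to be informative.
    hb = plt.hexbin(x, y, C=ki67, reduce_C_function=np.mean, gridsize=40, mincnt=50)
    li_per_hex = np.asarray(hb.get_array())

    # "Pareto hotspot": tiles whose LI falls in the upper quintile of the HexT data.
    threshold = np.quantile(li_per_hex, 0.8)
    hotspots = li_per_hex >= threshold
    print(f"{hotspots.sum()} hotspot tiles above LI {threshold:.3f}")
    ```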

  10. A risk-based sensor placement methodology

    International Nuclear Information System (INIS)

    Lee, Ronald W.; Kulesz, James J.

    2008-01-01

    A risk-based sensor placement methodology is proposed to solve the problem of optimal location of sensors to protect population against the exposure to, and effects of, known and/or postulated chemical, biological, and/or radiological threats. Risk is calculated as a quantitative value representing population at risk from exposure at standard exposure levels. Historical meteorological data are used to characterize weather conditions as the frequency of wind speed and direction pairs. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate risk values. Sensor locations are determined via an iterative dynamic programming algorithm whereby threats detected by sensors placed in prior iterations are removed from consideration in subsequent iterations. In addition to the risk-based placement algorithm, the proposed methodology provides a quantification of the marginal utility of each additional sensor. This is the fraction of the total risk accounted for by placement of the sensor. Thus, the criteria for halting the iterative process can be the number of sensors available, a threshold marginal utility value, and/or a minimum cumulative utility achieved with all sensors
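
    The iterative logic is easy to prototype. The sketch below is a greedy simplification of the placement loop under hypothetical inputs (a detection-coverage map per candidate site and a risk value per threat scenario); in the actual methodology these inputs come from dispersion modeling driven by historical wind-speed/direction frequencies.

    ```python
    # Greedy sketch of risk-based sensor placement with marginal utilities.
    def place_sensors(coverage, risk, max_sensors, min_marginal_utility=0.01):
        total_risk = sum(risk.values())
        remaining = dict(risk)            # threats not yet detected by any sensor
        placed, utilities = [], []
        for _ in range(max_sensors):
            # Pick the site that removes the most residual risk.
            best = max(coverage,
                       key=lambda s: sum(remaining.get(t, 0.0) for t in coverage[s]))
            gain = sum(remaining.get(t, 0.0) for t in coverage[best])
            marginal = gain / total_risk  # fraction of total risk this sensor covers
            if marginal < min_marginal_utility:
                break                     # halting criterion: marginal utility
            placed.append(best)
            utilities.append(marginal)
            for t in coverage[best]:
                remaining.pop(t, None)    # detected threats drop out of later rounds
        return placed, utilities

    coverage = {"siteA": {"t1", "t2"}, "siteB": {"t2", "t3"}, "siteC": {"t4"}}
    risk = {"t1": 900.0, "t2": 400.0, "t3": 250.0, "t4": 30.0}
    print(place_sensors(coverage, risk, max_sensors=3))
    ```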

  11. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSAL OF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Content analysis (CA) is a popular approach among researchers from different areas, but incipient in project management (PM). However, the volume of usage apparently does not translate into application quality. The method receives constant criticism about the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigates the use of CA in PM research. To accomplish this goal, a systematic literature review is combined with CA of 23 articles from the EBSCO base over the last 20 years (1996-2016). The findings showed that the proposed framework can help researchers better apply CA, and they suggest that the use of the method, in terms of both quantity and quality, should be expanded in PM research. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM over the last 20 years.

  12. DEVELOPING FINAL COURSE MONOGRAPHS USING A TEAM-BASED LEARNING METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ani Mari Hartz

    2016-04-01

    This article describes an experience with the Team-Based Learning (TBL) methodology in courses designed to support the planning and execution of final course monographs. It contains both professors' and students' perceptions, gathered through observation and assessment. A qualitative approach using observation techniques and desk research was used in conjunction with a quantitative approach based on a questionnaire. The sample consisted of 49 students from a higher education institution, 27 of them in a Communication course and the remaining 22 in a Business Administration course. Qualitative data analysis was performed through simple categorization with back-defined categories, while the quantitative data analysis employed descriptive statistics and cluster analysis using Minitab 17.1 software. The main findings include the identification of three student profiles (designated as traditional, collaborative and practical); a preference for guidance and feedback from the professor rather than from other students; and the need for a professor-led closing discussion when applying the TBL method. As regards the main benefits to students, they recognized that discussion in groups allowed them to realize how much they really know about the subject studied. Finally, most students seemed to like the TBL approach.

  13. Application of fault tree methodology in the risk analysis of complex systems

    International Nuclear Information System (INIS)

    Vasconcelos, V. de.

    1984-01-01

    This study describes the fault tree methodology and applies it to the risk assessment of complex facilities. In the methodology description, all the pertinent basic information is provided, pointing out its more important aspects such as fault tree construction, evaluation techniques and their use in risk and reliability assessment of a system. In view of their importance, topics like common mode failures, human errors, the databases used in the calculations, and uncertainty evaluation of the results are discussed separately, each one in its own chapter. For the purpose of applying the methodology, it was necessary to implement computer codes normally used for this kind of analysis. The computer codes PREP, KITT and SAMPLE, written in FORTRAN IV, were chosen, due to their availability and to the fact that they have been used in important studies of the nuclear area, like WASH-1400. With these codes, the probability of occurrence of excessive pressure in the main system of the component test loop (CTC) of CDTN was evaluated. (Author)
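
    To make the quantification step concrete, the sketch below computes a top-event probability from minimal cut sets using the rare-event approximation, in the spirit of what PREP/KITT-style codes report. The events, probabilities and cut sets are invented for illustration.

    ```python
    from math import prod

    basic_event_prob = {"pump_fails": 1e-3, "valve_stuck": 5e-4,
                        "operator_error": 1e-2, "power_loss": 2e-4}

    minimal_cut_sets = [("pump_fails", "operator_error"),   # AND within a cut set
                        ("valve_stuck",),
                        ("power_loss", "operator_error")]

    cut_set_probs = [prod(basic_event_prob[e] for e in cs) for cs in minimal_cut_sets]
    p_top = sum(cut_set_probs)      # OR across cut sets (rare-event approximation)
    print(f"Top event probability ~ {p_top:.2e}")
    ```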

  14. METHODOLOGICAL BASES OF PUBLIC ADMINISTRATION OF PUBLIC DEVELOPMENT IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Kyrylo Ohdanskyi

    2016-11-01

    In this article the author examines the theoretical bases of the dynamics of community development. According to classic canons, a dynamic process at any level of the management hierarchy can be presented as a complex of changes in its ecological, economic and social components. To date, national policy in the field of community development does not take most theoretical works into account, and these works testify that an effective adjustment mechanism has not yet been created in our country. The author therefore stresses the necessity of using effective approaches to the government control of community development in modern Ukrainian realities. As the subject of the research, the author chose the analysis of the process of community development and the methodological bases for choosing options to manage this process; the system approach was chosen as the research methodology. The aim is the analysis of theoretical bases and the development of new approaches to the government administration of community development. The author divides the process of community development into social, economic and ecological components, and argues for the objective necessity of developing new conceptual approaches to the tools for adjusting community development. To solve this task, the author suggests using the category "dynamics", analyzes different interpretations of the term, and offers his own interpretation in the context of community development. The research confirms that it is methodologically possible to form blocks of quantitative and qualitative factors from the diverse information of ecological, economic and social character. The author's research also confirms that it is methodologically…

  15. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    Science.gov (United States)

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  16. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.

    Science.gov (United States)

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-10-01

    In this paper a new methodology for diagnosing skin cancer on images of dermatologic spots using image processing is presented. Currently, skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis using filters such as the classic, inverse and nonlinear k-law filters. The sample images were obtained by a medical specialist, and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally, a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%.
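
    The k-law step can be sketched briefly: raise the Fourier magnitude to a power k < 1 while preserving phase, then summarize the filtered spectrum as a scalar index. The exponent and the index definition below are illustrative placeholders, not the calibrated values behind the reported 95.4% confidence level.

    ```python
    import numpy as np

    def klaw_spectral_index(image, k=0.3):
        """Apply a k-law nonlinear filter in the Fourier domain and return
        a scalar index summarizing the spot's spectral content."""
        F = np.fft.fftshift(np.fft.fft2(image))
        filtered = np.abs(F) ** k * np.exp(1j * np.angle(F))  # |F|^k, phase kept
        power = np.abs(filtered) ** 2
        # Illustrative index: fraction of spectral energy away from the DC region.
        cy, cx = power.shape[0] // 2, power.shape[1] // 2
        yy, xx = np.indices(power.shape)
        high = np.hypot(yy - cy, xx - cx) > min(power.shape) / 8
        return power[high].sum() / power.sum()

    # Usage on a hypothetical grayscale spot image:
    print(klaw_spectral_index(np.random.rand(256, 256)))
    ```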

  17. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective
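
    For readers unfamiliar with the evidence-theory machinery, the sketch below shows Dempster's rule of combination, the basic operation such a method builds on. The frame of discernment (dependence levels) and the two expert mass assignments are illustrative, not taken from the paper.

    ```python
    from itertools import product

    def dempster_combine(m1, m2):
        """Combine two mass functions whose focal elements are frozensets."""
        combined, conflict = {}, 0.0
        for (a, w1), (b, w2) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2           # mass assigned to disjoint elements
        return {k: v / (1.0 - conflict) for k, v in combined.items()}

    ZD, LD, MD = frozenset({"zero"}), frozenset({"low"}), frozenset({"moderate"})
    m_expert1 = {ZD: 0.5, LD: 0.3, ZD | LD: 0.2}
    m_expert2 = {LD: 0.4, MD: 0.2, ZD | LD | MD: 0.4}
    for fs, mass in dempster_combine(m_expert1, m_expert2).items():
        print(set(fs), round(mass, 3))
    ```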

  18. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  19. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method is in a positive aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  20. INSTALLING AN ERP SYSTEM WITH A METHODOLOGY BASED ON THE PRINCIPLES OF GOAL DIRECTED PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ioannis Zafeiropoulos

    2010-01-01

    This paper describes a generic methodology to support the process of modelling, adaptation and implementation (MAI) of Enterprise Resource Planning Systems (ERPS), based on the principles of goal directed project management (GDPM). The proposed methodology guides the project manager through specific stages in order to successfully complete the ERPS implementation. The development of a proper MAI methodology is deemed necessary because it simplifies the installation process of ERPS. The goal directed project management method was chosen since it provides a way of focusing all changes towards a predetermined goal. The main stages of the methodology are the promotion and preparation steps, the proposal, the contract, the implementation and the completion. The methodology was applied as a pilot application by a major ERPS development company. Important benefits were the easy and effective guidance through all installation and analysis stages, faster installation of the ERPS, and control and cost reduction for the installation in terms of time, manpower, technological equipment and other resources.

  1. FPGA hardware acceleration for high performance neutron transport computation based on agent methodology - 318

    International Nuclear Information System (INIS)

    Shanjie, Xiao; Tatjana, Jevremovic

    2010-01-01

    The accurate, detailed and 3D neutron transport analysis for Gen-IV reactors is still time-consuming despite the advanced computational hardware available in developed countries. This paper introduces a new concept to address the computational time while preserving the detailed and accurate modeling: a specifically designed FPGA co-processor accelerates the robust AGENT methodology for complex reactor geometries. For the first time this approach is applied to accelerate neutronics analysis. The AGENT methodology solves the neutron transport equation using the method of characteristics. The performance of the AGENT methodology was carefully analyzed before the hardware design based on the FPGA co-processor was adopted. The most time-consuming kernel part was then transplanted into the FPGA co-processor. The FPGA co-processor is designed with a data-flow-driven, non-von-Neumann architecture and has much higher efficiency than the conventional computer architecture. Details of the FPGA co-processor design are introduced and the design is benchmarked using two different examples. The advanced chip architecture helps the FPGA co-processor obtain a more than 20-fold speedup with a working frequency much lower than the CPU frequency. (authors)

  2. Methodology of evaluation of value created in the productive processes

    OpenAIRE

    M.T. Roszak

    2008-01-01

    Purpose: The purpose of this paper was to present the methodology of analysis of productive processes with the application of value analysis and multi-criterion analysis, which allow the technology and organization of productive processes to be evaluated. Design/methodology/approach: The methodology of evaluation of productive processes presented in the paper is based on the analysis of activities in the productive processes and their characteristics with reference to the value created in the productive chain. Findings…

  3. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

    A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. Firstly, a conceptual framework is built, which is used to analyze the causal relationships between the organizational factors and human reliability. Then, the inference model for Human Reliability Analysis is built by combining the conceptual framework with Bayesian networks, which is used to execute the causal inference and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the specific application of the proposed methodology. The results show that the proposed methodology of combining the conceptual model with Bayesian networks can not only easily model the causal relationship between organizational factors and human reliability but, in a given context, also quantitatively measure human operational reliability and identify the most likely root causes of human error and their prioritization. (authors)
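
    A toy illustration of the causal and diagnostic inference the paper combines: a three-node chain (organizational culture -> training quality -> human error) evaluated by brute-force enumeration. The structure and conditional probabilities are invented for illustration; the paper's network is larger and derived from its conceptual framework.

    ```python
    # P(culture good), P(training good | culture), P(error | training).
    p_good_culture = 0.7
    p_good_training = {True: 0.9, False: 0.4}   # keyed by "culture is good"
    p_error = {True: 0.01, False: 0.08}         # keyed by "training is good"

    # Causal inference: marginal probability of a human error.
    p_err = sum(
        (p_good_culture if c else 1 - p_good_culture)
        * (p_good_training[c] if t else 1 - p_good_training[c])
        * p_error[t]
        for c in (True, False) for t in (True, False)
    )

    # Diagnostic inference: P(poor training | error) via Bayes' rule.
    p_err_and_poor_t = sum(
        (p_good_culture if c else 1 - p_good_culture)
        * (1 - p_good_training[c]) * p_error[False]
        for c in (True, False)
    )
    print(f"P(error) = {p_err:.4f}, "
          f"P(poor training | error) = {p_err_and_poor_t / p_err:.3f}")
    ```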

  4. Performance-based, cost- and time-effective PCB analytical methodology

    International Nuclear Information System (INIS)

    Alvarado, J. S.

    1998-01-01

    Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently with low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval

  5. Validation of a Methodology to Predict Micro-Vibrations Based on Finite Element Model Approach

    Science.gov (United States)

    Soula, Laurent; Rathband, Ian; Laduree, Gregory

    2014-06-01

    This paper presents the second part of the ESA R&D study called "METhodology for Analysis of structure-borne MICro-vibrations" (METAMIC). After defining an integrated analysis and test methodology to help predict micro-vibrations [1], a full-scale validation test campaign was carried out. It is based on a bread-board representative of a typical spacecraft (S/C) platform, consisting of a versatile structure made of aluminium sandwich panels equipped with different disturbance sources and a dummy payload made of a silicon carbide (SiC) bench. The bread-board was instrumented with a large set of sensitive accelerometers, and tests were performed including background noise measurement, modal characterization and micro-vibration tests. The results provided responses to the perturbation coming from a reaction wheel or cryo-cooler compressors, operated independently and then simultaneously with different operation modes. Using consistent modelling and associated experimental characterization techniques, a correlation status was assessed by comparing test results with predictions based on the FEM approach. Very good results were achieved, particularly for the case of a wheel in sweeping-rate operation, with test results over-predicted within a reasonable margin of less than a factor of two. Some limitations of the methodology were also identified for sources operating at a fixed rate or exhibiting a small number of dominant harmonics, and recommendations were issued on how to deal with model uncertainties and stay conservative.

  6. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  7. Application of transient analysis methodology to heat exchanger performance monitoring

    International Nuclear Information System (INIS)

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial-scale heat exchangers. A Galerkin-based numerical method with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady-state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady-state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown, when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to in-situ transient testing carried out at a nuclear power plant. The method, however, is applicable to any transient testing application.

  8. Guidelines for reporting evaluations based on observational methodology.

    Science.gov (United States)

    Portell, Mariona; Anguera, M Teresa; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2015-01-01

    Observational methodology is one of the most suitable research designs for evaluating fidelity of implementation, especially in complex interventions. However, the conduct and reporting of observational studies is hampered by the absence of specific guidelines, such as those that exist for other evaluation designs. This lack of specific guidance poses a threat to the quality and transparency of these studies and also constitutes a considerable publication hurdle. The aim of this study thus was to draw up a set of proposed guidelines for reporting evaluations based on observational methodology. The guidelines were developed by triangulating three sources of information: observational studies performed in different fields by experts in observational methodology, reporting guidelines for general studies and studies with similar designs to observational studies, and proposals from experts in observational methodology at scientific meetings. We produced a list of guidelines grouped into three domains: intervention and expected outcomes, methods, and results. The result is a useful, carefully crafted set of simple guidelines for conducting and reporting observational studies in the field of program evaluation.

  9. A safety assessment methodology applied to CNS/ATM-based air traffic control system

    Energy Technology Data Exchange (ETDEWEB)

    Vismari, Lucio Flavio; Batista Camargo Junior, Joao [Safety Analysis Group (GAS), School of Engineering at University of Sao Paulo (Poli-USP), Sao Paulo (Brazil)]

    2011-07-15

    In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the 'absolute' and 'relative' safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689, using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from simulation of both the proposed (under analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the 'Automatic Dependent Surveillance-Broadcast' (ADS-B) based air traffic control system. In conclusion, the proposed methodology proved able to assess the safety properties of CNS/ATM systems, with the FSPN formalism providing important modeling capabilities and discrete-event simulation allowing the estimation of the desired safety metrics.

  10. A safety assessment methodology applied to CNS/ATM-based air traffic control system

    International Nuclear Information System (INIS)

    Vismari, Lucio Flavio; Batista Camargo Junior, Joao

    2011-01-01

    In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the 'absolute' and 'relative' safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689, using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from simulation of both the proposed (under analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the 'Automatic Dependent Surveillance-Broadcast' (ADS-B) based air traffic control system. In conclusion, the proposed methodology proved able to assess the safety properties of CNS/ATM systems, with the FSPN formalism providing important modeling capabilities and discrete-event simulation allowing the estimation of the desired safety metrics.

  11. Export Potential of the Enterprise: Essence and Methodological Bases of the Analysis

    Directory of Open Access Journals (Sweden)

    Melnyk Olga G.

    2017-03-01

    The article considers theoretical and methodological aspects of the analysis of the enterprise's export potential and the methodological basis for its measurement. By analyzing and summarizing scientific works on the problem, the views of researchers on the definition of the concept of "export potential of the enterprise" are systematized. The article considers the economic content of the enterprise's export potential from the standpoint of the system-structural approach, defining it as a complex systemic formation of interrelated and interacting elements of economic and non-economic origin, of internal and external action. It is found that in the international economic space the export potential of the enterprise acquires new qualitative features, reflecting not just the resource potential of the national economic entity but also the needs and interests of foreign countries and their economic agents. The functional role of the export potential is to implement the targets of the foreign economic activity of the enterprise. The nature of these targets can differ and is formed on the principle of meeting the needs of external markets. The level of satisfaction of these needs by an individual enterprise can be evaluated through such indicators as the volume of exports, the quality of exported products and the level of export diversification, which determine the result of the export activity and, in relation to its purpose, serve as a criterion of the efficiency of the enterprise's export potential. As a result of the study, the components of the export potential of the enterprise are singled out and a model of their interrelationships is presented. The prospects of the research are connected with the branch-specific aspects of the formation of the enterprise's export potential, which allow highlighting its structural elements and directions of development.

  12. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-01-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing, where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  13. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-10-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  14. Stakeholder analysis methodologies resource book

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, W.M.; Farhar, B.C.

    1994-03-01

    Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, 'uncooperative' stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.

  15. Energy use in the Greek manufacturing sector: A methodological framework based on physical indicators with aggregation and decomposition analysis

    International Nuclear Information System (INIS)

    Salta, Myrsine; Polatidis, Heracles; Haralambopoulos, Dias

    2009-01-01

    A bottom-up methodological framework was developed and applied, for the period 1985-2002, to selected manufacturing sub-sectors in Greece, namely food, beverages and tobacco, iron and steel, non-ferrous metals, non-metallic minerals, and paper. Disaggregate physical data were aggregated according to their specific energy consumption (SEC) values, and physical energy efficiency indicators were estimated. The Logarithmic Mean Divisia Index method was also used, and the effects of production, structure and energy efficiency on changes in sub-sectoral manufacturing energy use were further assessed. Primary physical energy efficiency improved by 28% for the iron and steel and by 9% for the non-metallic minerals industries, compared to the base year 1990. For the food, beverages and tobacco and the paper sub-sectors, primary efficiency deteriorated by 20% and by 15%, respectively; finally, electricity efficiency deteriorated by 7% for the non-ferrous metals. Sub-sectoral energy use is mainly driven by production output and energy efficiency changes. Sensitivity analysis showed that alternative SEC values do not influence the results, whereas the selected base year is more critical for this analysis. Significant efficiency improvements refer to 'heavy' industry; 'light' industry needs further attention from energy policy to modernize its production plants and improve its efficiency.
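
    The additive LMDI-I decomposition used in such studies is compact enough to sketch. The two-period, two-sub-sector numbers below are invented; by construction the three effects sum exactly to the total energy change.

    ```python
    import math

    def L(a, b):  # logarithmic mean
        return a if a == b else (a - b) / (math.log(a) - math.log(b))

    # Per sub-sector: (production share s, energy intensity i); Q is total output.
    Q0, Q1 = 100.0, 120.0
    subs0 = {"steel": (0.4, 2.0), "food": (0.6, 0.8)}   # period 0
    subs1 = {"steel": (0.35, 1.7), "food": (0.65, 0.9)} # period 1

    def energy(Q, subs):
        return {k: Q * s * i for k, (s, i) in subs.items()}

    E0, E1 = energy(Q0, subs0), energy(Q1, subs1)
    act = struct = inten = 0.0
    for k in E0:
        w = L(E1[k], E0[k])
        act += w * math.log(Q1 / Q0)                        # production effect
        struct += w * math.log(subs1[k][0] / subs0[k][0])   # structure effect
        inten += w * math.log(subs1[k][1] / subs0[k][1])    # intensity effect

    print(f"dE = {sum(E1.values()) - sum(E0.values()):.2f} "
          f"= {act:.2f} (activity) + {struct:.2f} (structure) "
          f"+ {inten:.2f} (intensity)")
    ```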

  16. Energy use in the Greek manufacturing sector: A methodological framework based on physical indicators with aggregation and decomposition analysis

    Energy Technology Data Exchange (ETDEWEB)

    Salta, Myrsine; Polatidis, Heracles; Haralambopoulos, Dias [Energy Management Laboratory, Department of Environment, University of the Aegean, University Hill, Mytilene 81100 (Greece)

    2009-01-15

    A bottom-up methodological framework was developed and applied, for the period 1985-2002, to selected manufacturing sub-sectors in Greece, namely food, beverages and tobacco, iron and steel, non-ferrous metals, non-metallic minerals, and paper. Disaggregate physical data were aggregated according to their specific energy consumption (SEC) values, and physical energy efficiency indicators were estimated. The Logarithmic Mean Divisia Index method was also used, and the effects of production, structure and energy efficiency on changes in sub-sectoral manufacturing energy use were further assessed. Primary physical energy efficiency improved by 28% for the iron and steel and by 9% for the non-metallic minerals industries, compared to the base year 1990. For the food, beverages and tobacco and the paper sub-sectors, primary efficiency deteriorated by 20% and by 15%, respectively; finally, electricity efficiency deteriorated by 7% for the non-ferrous metals. Sub-sectoral energy use is mainly driven by production output and energy efficiency changes. Sensitivity analysis showed that alternative SEC values do not influence the results, whereas the selected base year is more critical for this analysis. Significant efficiency improvements refer to 'heavy' industry; 'light' industry needs further attention from energy policy to modernize its production plants and improve its efficiency. (author)

  17. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes, such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools that have been most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, the event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research was supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of Probabilistic Safety Assessment (PSA), an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased mission problem, a function for common cause failure analysis, a function for uncertainty analysis, a function for common cause failure analysis with uncertainty, and a system for printing the results of GO-FLOW analysis in figure or table form. The above functions are explained by analyzing sample systems, such as the PWR AFWS and the BWR ECCS. In the appendices, the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in the GO-FLOW programs are described. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. With the development of the total system, GO-FLOW has become a powerful tool in a living PSA. (author) 54 refs

  18. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  19. Analysis of offsite dose calculation methodology for a nuclear power reactor

    International Nuclear Information System (INIS)

    Moser, D.M.

    1995-01-01

    This technical study reviews the methodology for calculating offsite dose estimates as described in the offsite dose calculation manual (ODCM) for Pennsylvania Power and Light - Susquehanna Steam Electric Station (SSES). An evaluation of the SSES ODCM dose assessment methodology indicates that it conforms with methodology accepted by the US Nuclear Regulatory Commission (NRC). Using 1993 SSES effluent data, dose estimates are calculated according to the SSES ODCM methodology and compared to the reported 1993 dose estimates produced with the SSES ODCM and its associated computer model. The 1993 SSES dose estimates are based on the axioms of Publication 2 of the International Commission on Radiological Protection (ICRP). SSES dose estimates based on the axioms of ICRP Publications 26 and 30 reveal the total body estimates to be the most affected.

  20. Vulnerability and Risk Analysis Program: Overview of Assessment Methodology

    National Research Council Canada - National Science Library

    2001-01-01

    Over the last three years, a team of national laboratory experts, working in partnership with the energy industry, has successfully applied the methodology as part of OCIP's Vulnerability and Risk Analysis Program (VRAP)…

  1. Environmental impact statement analysis: dose methodology

    International Nuclear Information System (INIS)

    Mueller, M.A.; Strenge, D.L.; Napier, B.A.

    1981-01-01

    Standardized sections and methodologies are being developed for use in environmental impact statements (EIS) for activities to be conducted on the Hanford Reservation. Five areas for standardization have been identified: routine operations dose methodologies, accident dose methodology, Hanford Site description, health effects methodology, and socioeconomic environment for Hanford waste management activities

  2. Performance-based methodology for assessing seismic vulnerability and capacity of buildings

    Science.gov (United States)

    Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li

    2010-06-01

    This paper presents a performance-based methodology for the assessment of seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and then a new vulnerability function is expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess seismic damage of a large building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrates the relationship between the seismic capacity of buildings and seismic action. The estimated result is compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.
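
    The expected seismic capacity index is a probability-weighted average over damage states. A minimal sketch follows; the damage-state probabilities (which would come from the fragility/capacity-demand analysis) and the index value per state are illustrative placeholders.

    ```python
    # SCev = sum over damage states of P(state) * capacity index(state).
    damage_probs = {"none": 0.35, "slight": 0.30, "moderate": 0.20,
                    "extensive": 0.10, "complete": 0.05}
    capacity_index = {"none": 1.0, "slight": 0.8, "moderate": 0.55,
                      "extensive": 0.3, "complete": 0.0}

    sc_ev = sum(p * capacity_index[ds] for ds, p in damage_probs.items())
    print(f"Expected seismic capacity index SCev = {sc_ev:.3f}")
    ```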

  3. Application of a new methodology to the multicycle analysis for the Laguna Verde NPP in Mexico

    International Nuclear Information System (INIS)

    Cortes C, Carlos C.

    1997-01-01

    This paper describes the improvements made to the physical and economic methodologies for the multicycle analysis of the Boiling Water Reactors of the Laguna Verde NPP in Mexico, based on commercial codes and in-house developed computational tools. With these changes in our methodology, three feasible scenarios are generated for the operation of Laguna Verde Nuclear Power Plant Unit 2 with 12-, 18-, and 24-month cycles. The physical and economic results obtained are shown. Furthermore, the effect of replacement power is included in the economic evaluation. (author). 11 refs., 3 figs., 7 tabs

  4. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    Silva, F.C.A. da.

    1990-01-01

    A critical review of 34 published severe radiological accidents in industrial gamma radiography, which happened in 15 countries from 1960 to 1988, was performed. The most frequent causes, consequences and dose estimation methods were analysed, aiming to establish better procedures for radiation safety and accident analysis. The objective of this work is to elaborate a methodology for analysing radiological accidents in industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of the event and to estimate the dose with good certainty. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programs, was adopted in the elaboration of the suggested methodology. The viability of using the Electron Gamma Shower 4 computer code system to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in situations involving the handling of ¹⁹²Ir radioactive sources, was also studied. (author)

  5. Multicriteria decision-making analysis based methodology for predicting carbonate rocks' uniaxial compressive strength

    Directory of Open Access Journals (Sweden)

    Ersoy Hakan

    2012-10-01

    Uniaxial compressive strength (UCS) deals with a material's ability to withstand axially-directed pushing forces and is considered one of the most important mechanical properties of rock materials. However, the UCS test is an expensive, very time-consuming test to perform in the laboratory and requires high-quality core samples with regular geometry. Empirical equations have thus been proposed for predicting UCS as a function of rocks' index properties. A methodology based on the analytic hierarchy process and multiple regression analysis (as opposed to traditional linear regression methods) was used on data sets obtained from carbonate rocks in NE Turkey. Limestone samples ranging from Devonian to late Cretaceous ages were chosen; travertine-onyx samples were selected from morphological environments considering their surface environmental conditions. Test results from experiments carried out on about 250 carbonate rock samples were used in deriving the model. While the hierarchy model focused on determining the most important index properties affecting UCS, regression analysis established meaningful relationships between UCS and index properties; positive correlation coefficients of 0.85 and 0.83 between the variables were determined by regression analysis. The methodology provided an appropriate alternative for the quantitative estimation of UCS and avoided the need for tedious and time-consuming laboratory testing.
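
    The regression stage reduces to an ordinary least-squares fit of UCS on the selected index properties. The sketch below uses two hypothetical index properties and invented sample values; the paper's fitted models and its reported correlations of 0.85 and 0.83 come from roughly 250 carbonate samples.

    ```python
    import numpy as np

    # Hypothetical index properties per sample: porosity (%), P-wave velocity (km/s).
    X = np.array([[2.1, 5.8], [4.3, 5.1], [8.9, 4.2], [1.2, 6.1], [6.5, 4.6]])
    ucs = np.array([142.0, 110.0, 61.0, 155.0, 84.0])      # measured UCS (MPa)

    A = np.column_stack([np.ones(len(X)), X])              # add intercept column
    coef, *_ = np.linalg.lstsq(A, ucs, rcond=None)         # least-squares fit
    pred = A @ coef
    r = np.corrcoef(pred, ucs)[0, 1]
    print(f"UCS ~ {coef[0]:.1f} + {coef[1]:.2f}*porosity + {coef[2]:.2f}*Vp, r = {r:.2f}")
    ```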



  6. Opening Remarks of the Acquisition Path Analysis Methodology Session

    International Nuclear Information System (INIS)

    Renis, T.

    2015-01-01

    An overview of the recent development work that has been done on acquisition path analysis, implementation of the methodologies within the Department of Safeguards, lessons learned and future areas for development will be provided. (author)

  7. An economic analysis methodology for project evaluation and programming.

    Science.gov (United States)

    2013-08-01

    Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key : quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...

  8. Eco-efficiency analysis methodology on the example of the chosen polyolefins production

    OpenAIRE

    K. Czaplicka-Kolarz; D. Burchart-Korol; P. Krawczyk

    2010-01-01

    Purpose: The purpose of this paper was to present an eco-efficiency analysis methodology using the example of the production of chosen polyolefins. The article also presents the main tools of eco-efficiency analysis: Life Cycle Assessment (LCA) and Net Present Value (NPV). Design/methodology/approach: On the basis of the LCA and NPV of high-density polyethylene (HDPE) and low-density polyethylene (LDPE) production, an eco-efficiency analysis is conducted. Findings: In this article the environmental and economic performance of the chosen polyolefins production was presented. The basic phases of the eco-efficiency methodology…

  9. A Metadata based Knowledge Discovery Methodology for Seeding Translational Research.

    Science.gov (United States)

    Kothari, Cartik R; Payne, Philip R O

    2015-01-01

    In this paper, we present a semantic, metadata based knowledge discovery methodology for identifying teams of researchers from diverse backgrounds who can collaborate on interdisciplinary research projects: projects in areas that have been identified as high-impact areas at The Ohio State University. This methodology involves the semantic annotation of keywords and the postulation of semantic metrics to improve the efficiency of the path exploration algorithm as well as to rank the results. Results indicate that our methodology can discover groups of experts from diverse areas who can collaborate on translational research projects.

  10. Value-Based Assessment of New Medical Technologies: Towards a Robust Methodological Framework for the Application of Multiple Criteria Decision Analysis in the Context of Health Technology Assessment.

    Science.gov (United States)

    Angelis, Aris; Kanavos, Panos

    2016-05-01

    In recent years, multiple criteria decision analysis (MCDA) has emerged as a likely alternative to address shortcomings in health technology assessment (HTA) by offering a more holistic perspective to value assessment and acting as an alternative priority setting tool. In this paper, we argue that MCDA needs to subscribe to robust methodological processes related to the selection of objectives, criteria and attributes in order to be meaningful in the context of healthcare decision making and fulfil its role in value-based assessment (VBA). We propose a methodological process, based on multi-attribute value theory (MAVT) methods comprising five distinct phases, outline the stages involved in each phase and discuss their relevance in the HTA process. Importantly, criteria and attributes need to satisfy a set of desired properties, otherwise the outcome of the analysis can produce spurious results and misleading recommendations. Assuming the methodological process we propose is adhered to, the application of MCDA presents three very distinct advantages to decision makers in the context of HTA and VBA: first, it acts as an instrument for eliciting preferences on the performance of alternative options across a wider set of explicit criteria, leading to a more complete assessment of value; second, it allows the elicitation of preferences across the criteria themselves to reflect differences in their relative importance; and, third, the entire process of preference elicitation can be informed by direct stakeholder engagement, and can therefore reflect their own preferences. All features are fully transparent and facilitate decision making.
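    A minimal sketch of the additive aggregation step that MAVT-style MCDA typically relies on is given below; the criteria, weights and performance scores are invented for illustration and are not drawn from the paper.

```python
# Hedged sketch of an additive multi-attribute value model: each option gets
# a weighted sum of normalized (0-1) partial value scores. Criteria names,
# weights and scores are hypothetical.
def mavt_score(performance: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of partial value scores, with weights normalized to 1."""
    total_w = sum(weights.values())
    return sum(weights[c] * performance[c] for c in weights) / total_w

option_a = {"efficacy": 0.8, "safety": 0.6, "cost": 0.4}
option_b = {"efficacy": 0.6, "safety": 0.9, "cost": 0.7}
w = {"efficacy": 5.0, "safety": 3.0, "cost": 2.0}  # elicited relative importance

print(f"A: {mavt_score(option_a, w):.3f}, B: {mavt_score(option_b, w):.3f}")
```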

  11. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    ...technique involve model structure, system representation and the degree of validity, coupled with the simplicity, of the overall model. ABM is best suited... system representation of the air combat system. We feel that a simulation model that combines ABM with equation-based representation of weapons and... AGENT-BASED MODELING METHODOLOGY FOR ANALYZING WEAPONS SYSTEMS THESIS Casey D. Connors, Major, USA

  12. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the revision of 10CFR50.46 in 1988, which allowed the BE (Best-Estimate) method in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have changed continuously from conservative EM (Evaluation Model) approaches to BE ones. In this context, LSC (Long-Term core Cooling) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and improper aspects of the old methodology were identified, and as a result, the USNRC suspended the approval of CENPD-254-P-A, which is the old LSC methodology for CE-designed NPPs. Regulatory bodies requested that the non-conservatisms be removed and that system transient behaviors be reflected in all the LSC methodologies used. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full range of the break spectrum is considered and the applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the new methodology could be applied with no significant changes to current LSC plans.

  13. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  14. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1997-01-01

    The heuristically based generalized perturbation theory (HGPT) methodology has been extensively used in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is discussed here, with particular emphasis on problems involving an intensive system control variable. (author)

  15. Methodology of Maqamat Hamadani and Hariri Based on Buseman’s statistical methodology

    Directory of Open Access Journals (Sweden)

    Hamed Sedghi

    2016-02-01

    Full Text Available Abstract  Stylistics can be defined as the analysis and interpretation of expression and different forms of speech, based on linguistic elements. The German theorist Buseman devised his theses on statistical style based on the ratio of verbs to adjectives, generalizing across a variety of literary and non-literary genres in German language and literature. According to Buseman, increasing the number of verbs in verses makes the text closer to literary style, whereas increasing the number of adjectives makes the text closer to scientific or subjective style. The most important achievements of Buseman's statistical methodology can be considered as: a) comparison of authors' styles, literary periods and genres; b) studying the language system and variety of words; and c) classification and grading of differences or similarities of literary works, periods and genres. The purpose of this study: stylistic analysis of the Maqamat of Hamadani and al-Hariri based on the statistical model of Buseman. The questions proposed in this study: a) How effective is the use of statistical methods in identifying and analyzing a variety of literature, including Maqamat? b) How effective is the quantitative scope of verbs and adjectives, as two key parameters, in determining the style of literary works? And c) Which element of fiction is most effective in enriching the literary approach? The specific research method is statistical-analytical; we initially discovered and classified the number of verbs and adjectives in the fifty-one Maqamehs of Hamadani and the fifty Maqamehs of Hariri; then the scope of verb and adjective usage is shown in the form of tables and graphs. After that, we assess the style of the literary works based on the use of verbs and adjectives. The research findings show: all Hamadani and Hariri Maqamat quantitatively benefit from a highly active approach in the use of verbs. At 46 Maqameh Hamadani and 48 Maqameh Hariri the

  16. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej

    2018-02-15

    Software-Defined Networking (SDN) and OpenFlow are actively being standardized and deployed. These deployments rely on switches that come from various vendors and differ in terms of performance and available features. Understanding these differences and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a stream of rule updates while observing both the control plane view as reported by the switch and the data plane state as determined by probing, and it characterizes the switch by comparing these views. We measure, report and explain the performance characteristics of flow table updates in six hardware OpenFlow switches. Our results describing rule update rates can help SDN designers make their controllers efficient. Further, we also highlight differences between the OpenFlow specification and its implementations that, if ignored, pose a serious threat to network security and correctness.
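    The core comparison the methodology describes, contrasting when the switch reports a rule installed with when traffic actually matches it, can be pictured with a toy post-processing step; the timestamps below are synthetic, and real measurements would come from switch acknowledgements and data-plane probe packets.

```python
# Hedged sketch of the control-plane vs data-plane comparison: for each rule
# update, compare the control-plane acknowledgement time with the time the
# rule became effective in the data plane. All timestamps are synthetic.
from statistics import mean

updates = [  # (rule_id, control_plane_ack_s, data_plane_effective_s)
    (1, 0.010, 0.012),
    (2, 0.018, 0.030),
    (3, 0.025, 0.031),
]

lags = [dp - cp for _, cp, dp in updates]
print(f"mean control/data plane lag: {mean(lags)*1e3:.1f} ms, "
      f"max: {max(lags)*1e3:.1f} ms")
# A persistent positive lag means the control-plane view overstates how
# quickly rules take effect -- the security-relevant gap the paper measures.
```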

  17. Spatial analysis of electricity demand patterns in Greece: Application of a GIS-based methodological framework

    Science.gov (United States)

    Tyralis, Hristos; Mamassis, Nikos; Photis, Yorgos N.

    2016-04-01

    We investigate various uses of electricity demand in Greece (agricultural, commercial, domestic and industrial use, as well as use by public and municipal authorities and street lighting) and we examine their relation with variables such as population, total area, population density and the Gross Domestic Product. The analysis is performed on data which span from 2008 to 2012 and have annual temporal resolution and spatial resolution down to the prefecture level. We visualize the results of the analysis and perform cluster and outlier analysis using the Anselin local Moran's I statistic, as well as hot spot analysis using the Getis-Ord Gi* statistic. The definition of the spatial patterns and relationships of the aforementioned variables in a GIS environment provides meaningful insight and better understanding of the regional development model in Greece and lays the basis for an energy demand forecasting methodology. Acknowledgement: This research has been partly financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARISTEIA II: Reinforcement of the interdisciplinary and/or inter-institutional research and innovation (CRESSENDO project; grant number 5145).
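    As an illustration of the cluster and outlier step, the sketch below computes the Anselin local Moran's I statistic on invented regional demand values with a toy contiguity matrix; it is not the paper's Greek prefecture data.

```python
# Sketch of the Anselin local Moran's I statistic used for cluster/outlier
# analysis. Demand values and the binary contiguity weights are placeholders.
import numpy as np

def local_morans_i(x: np.ndarray, w: np.ndarray) -> np.ndarray:
    """I_i = (z_i / m2) * sum_j w_ij z_j, with z the deviations from the mean."""
    z = x - x.mean()
    m2 = (z ** 2).sum() / len(x)
    return (z / m2) * (w @ z)

demand = np.array([5.2, 4.8, 9.1, 8.7, 1.2])   # e.g. GWh per region (toy)
w = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)
w /= w.sum(axis=1, keepdims=True)               # row-standardized weights

# Positive values flag local clusters; negative values flag spatial outliers.
print(local_morans_i(demand, w))
```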

  18. Accidental safety analysis methodology development in decommission of the nuclear facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. H.; Hwang, J. H.; Jae, M. S.; Seong, J. H.; Shin, S. H.; Cheong, S. J.; Pae, J. H.; Ang, G. R.; Lee, J. U. [Seoul National Univ., Seoul (Korea, Republic of)

    2002-03-15

    Decontamination and Decommissioning (D and D) of a nuclear reactor costs about 20% of the construction expense, and the production of nuclear wastes during decommissioning raises environmental issues. Decommissioning of nuclear reactors in Korea is just beginning, and clear standards and regulations for decommissioning are lacking. This work on accident safety analysis in decommissioning of nuclear facilities can provide a solid ground for such standards and regulations. For the source term analysis of the Kori-1 reactor vessel, an MCNP/ORIGEN calculation methodology was applied. The activity of each important nuclide in the vessel was estimated for times after 2008, the year the Kori-1 plant is supposed to be decommissioned. In addition, a methodology for risk analysis assessment in decommissioning was developed.

  19. Development of analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kim, Cheol Woo; Kwon, Young Min; Kim, Sook Kwan

    1995-04-01

    A study for the development of an analysis methodology for hot leg break mass and energy release is performed. For the blowdown period a modified CEFLASH-4A methodology is suggested. For the post-blowdown period a modified CONTRAST boil-off model is suggested. By using these computer codes, improved mass and energy release data are generated. Also, a RELAP5/MOD3 analysis is performed, and finally the FLOOD-3 computer code has been modified for use in the analysis of hot leg breaks. The results of the analysis using the modified FLOOD-3 are reasonable, as expected, and their trends are good. 66 figs., 8 tabs. (Author)

  20. Using functional analysis in archival appraisal a practical and effective alternative to traditional appraisal methodologies

    CERN Document Server

    Robyns, Marcus C

    2014-01-01

    In an age of scarcity and the challenge of electronic records, can archivists and records managers continue to rely upon traditional methodology essentially unchanged since the early 1950s? Using Functional Analysis in Archival Appraisal: A Practical and Effective Alternative to Traditional Appraisal Methodologies shows how archivists in other countries are already using functional analysis, which offers a better, more effective, and eminently more practical alternative to traditional appraisal methodologies that rely upon an analysis of the records themselves.

  1. A powerful methodology for reactor vessel pressurized thermal shock analysis

    International Nuclear Information System (INIS)

    Boucau, J.; Mager, T.

    1994-01-01

    The recent operating experience of the Pressurized Water Reactor (PWR) industry has focused increasing attention on the issue of reactor vessel pressurized thermal shock (PTS). More specifically, the review of the old WWER-type reactors (WWER 440/230) has indicated sensitive behaviour with respect to neutron embrittlement. This has already led to some remedial actions, including safety injection water preheating or vessel annealing. Such measures are usually taken based on the analysis of a selected number of conservative PTS events. Consideration of all postulated cooldown events would draw attention to the impact of operator action and control system effects on reactor vessel PTS. Westinghouse has developed a methodology which couples event sequence analysis with probabilistic fracture mechanics analyses to identify those events that are of primary concern for reactor vessel integrity. Operating experience is utilized to aid in defining the appropriate event sequences and event frequencies of occurrence for the evaluation. Once the event sequences of concern are identified, detailed deterministic thermal-hydraulic and structural evaluations can be performed to determine the conditions required to minimize the extension of postulated flaws or enhance flaw arrest in the reactor vessel. The results of these analyses can then be used to better define further modifications in vessel and plant system design and to operating procedures. The purpose of the present paper is to describe this methodology and to show its benefits for decision making. (author). 1 ref., 3 figs

  2. Phosphoproteomics-based systems analysis of signal transduction networks

    Directory of Open Access Journals (Sweden)

    Hiroko eKozuka-Hata

    2012-01-01

    Full Text Available Signal transduction systems coordinate complex cellular information to regulate biological events such as cell proliferation and differentiation. Although the accumulating evidence on widespread association of signaling molecules has revealed essential contribution of phosphorylation-dependent interaction networks to cellular regulation, their dynamic behavior is mostly yet to be analyzed. Recent technological advances regarding mass spectrometry-based quantitative proteomics have enabled us to describe the comprehensive status of phosphorylated molecules in a time-resolved manner. Computational analyses based on the phosphoproteome dynamics accelerate generation of novel methodologies for mathematical analysis of cellular signaling. Phosphoproteomics-based numerical modeling can be used to evaluate regulatory network elements from a statistical point of view. Integration with transcriptome dynamics also uncovers regulatory hubs at the transcriptional level. These omics-based computational methodologies, which have firstly been applied to representative signaling systems such as the epidermal growth factor receptor pathway, have now opened up a gate for systems analysis of signaling networks involved in immune response and cancer.

  3. Using HABIT to Establish the Chemicals Analysis Methodology for Maanshan Nuclear Power Plant

    OpenAIRE

    J. R. Wang; S. W. Chen; Y. Chiang; W. S. Hsu; J. H. Yang; Y. S. Tseng; C. Shih

    2017-01-01

    In this research, the HABIT analysis methodology was established for Maanshan nuclear power plant (NPP). The Final Safety Analysis Report (FSAR), reports, and other data were used in this study. To evaluate the control room habitability under the CO2 storage burst, the HABIT methodology was used to perform this analysis. The HABIT result was below the R.G. 1.78 failure criteria. This indicates that Maanshan NPP habitability can be maintained. Additionally, the sensitivity study of the paramet...

  4. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.

  5. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method, which is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, but the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  6. Active teaching-learning methodologies: medical students' views of problem-based learning

    Directory of Open Access Journals (Sweden)

    José Roberto Bittencourt Costa

    Full Text Available The prevailing undergraduate medical training process still favors disconnection and professional distancing from social needs. The Brazilian Ministries of Education and Health, through the National Curriculum Guidelines, the Incentives Program for Changes in the Medical Curriculum (PROMED), and the National Program for Reorientation of Professional Training in Health (PRO-SAÚDE), promoted the stimulus for an effective connection between medical institutions and the Unified National Health System (SUS). In accordance with the new paradigm for medical training, the Centro Universitário Serra dos Órgãos (UNIFESO) established a teaching plan in 2005 using active methodologies, specifically problem-based learning (PBL). Research was conducted through semi-structured interviews with third-year undergraduate students at the UNIFESO Medical School. The results were categorized as proposed by Bardin's thematic analysis, with the purpose of verifying the students' impressions of the new curriculum. Active methodologies proved to be well-accepted by students, who defined them as exciting and inclusive of theory and practice in medical education.

  7. A pattern-based methodology for optimizing stitches in double-patterning technology

    Science.gov (United States)

    Wang, Lynn T.; Madhavan, Sriram; Dai, Vito; Capodieci, Luigi

    2015-03-01

    A pattern-based methodology for optimizing stitches is developed based on identifying stitch topologies and replacing them with pre-characterized fixing solutions in decomposed layouts. A topology-based library of stitches with predetermined fixing solutions is built. A pattern-based engine searches for matching topologies in the decomposed layouts. When a match is found, the engine opportunistically applies the predetermined fixing solution: only a design rule check error-free replacement is preserved. The methodology is demonstrated on a 20nm layout design that contains over 67 million first-metal-layer stitches. Results show that a small library containing 3 stitch topologies improves the stitch area regularity by 4x.
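    The replacement loop can be pictured roughly as follows; the topology keys, the fixing solutions and the design-rule check are stand-ins for the engine described in the paper, not its actual implementation.

```python
# Hedged sketch of a pattern-based stitch replacement loop: look up a stitch
# topology in a pre-characterized library, apply its fix, and keep the fix
# only if the design-rule check (DRC) passes. Names are hypothetical.
FIX_LIBRARY = {
    "L-bend": "fix_L",
    "T-junction": "fix_T",
    "straight-run": "fix_S",
}

def drc_clean(layout, fix) -> bool:
    """Placeholder for the design-rule check on the candidate replacement."""
    return True

def optimize_stitches(layout, stitches) -> int:
    replaced = 0
    for stitch in stitches:
        fix = FIX_LIBRARY.get(stitch["topology"])
        if fix is not None and drc_clean(layout, fix):
            stitch["solution"] = fix   # opportunistic, DRC-safe replacement
            replaced += 1
    return replaced

stitches = [{"topology": "L-bend"}, {"topology": "unknown-topo"}]
print(optimize_stitches({}, stitches), "stitches replaced")
```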

  8. Health economic assessment: a methodological primer.

    Science.gov (United States)

    Simoens, Steven

    2009-12-01

    This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for those readers who wish a more advanced understanding of health economic assessments.
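    Two of the calculations named above, discounting and incremental analysis, can be illustrated in a few lines; the cost streams, discount rate and effect values are invented numbers, not figures from the article.

```python
# Worked sketch: discount a cost stream to present value, then form an
# incremental cost-effectiveness ratio (ICER). All inputs are illustrative.
def present_value(costs: list[float], rate: float) -> float:
    """Discount a yearly cost stream; year 0 is undiscounted by convention."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(costs))

costs_new, costs_old = [1000.0, 400.0, 400.0], [300.0, 300.0, 300.0]
qalys_new, qalys_old = 2.3, 1.9          # hypothetical health effects

pv_new = present_value(costs_new, 0.03)  # 3% discount rate (assumption)
pv_old = present_value(costs_old, 0.03)
icer = (pv_new - pv_old) / (qalys_new - qalys_old)  # cost per QALY gained
print(f"ICER: {icer:.0f} per QALY")
```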

  9. Health Economic Assessment: A Methodological Primer

    Directory of Open Access Journals (Sweden)

    Steven Simoens

    2009-11-01

    Full Text Available This review article aims to provide an introduction to the methodology of health economic assessment of a health technology. Attention is paid to defining the fundamental concepts and terms that are relevant to health economic assessments. The article describes the methodology underlying a cost study (identification, measurement and valuation of resource use, calculation of costs), an economic evaluation (type of economic evaluation, the cost-effectiveness plane, trial- and model-based economic evaluation, discounting, sensitivity analysis, incremental analysis), and a budget impact analysis. Key references are provided for those readers who wish a more advanced understanding of health economic assessments.

  10. Methodology for reactor core physics analysis - part 2

    International Nuclear Information System (INIS)

    Ponzoni Filho, P.; Fernandes, V.B.; Lima Bezerra, J. de; Santos, T.I.C.

    1992-12-01

    The computer codes used for reactor core physics analysis are described. The modifications introduced in the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameter involved in qualifying the methodology is included. (F.E.). 5 ref, 7 figs, 5 tabs

  11. Fusion integral experiments and analysis and the determination of design safety factors - I: Methodology

    International Nuclear Information System (INIS)

    Youssef, M.Z.; Kumar, A.; Abdou, M.A.; Oyama, Y.; Maekawa, H.

    1995-01-01

    The role of neutronics experimentation and analysis in fusion neutronics research and development programs is discussed. A new methodology was developed to arrive at estimates of design safety factors based on the experimental and analytical results from design-oriented integral experiments. In this methodology, and for a particular nuclear response, R, a normalized density function (NDF) is constructed from the prediction uncertainties, and their associated standard deviations, as found in the various integral experiments where that response, R, is measured. Important statistical parameters are derived from the NDF, such as the global mean prediction uncertainty and the possible spread around it. The method of deriving safety factors from many possible NDFs based on various calculational and measuring methods (among other variants) is also described. Associated with each safety factor is a confidence level, which designers may choose, that the calculated response, R, will not exceed (or will not fall below) the actual measured value. An illustrative example is given on how to construct the NDFs. The methodology is applied in two areas, namely the line-integrated tritium production rate and bulk shielding integral experiments. Conditions under which these factors can be derived and the validity of the method are discussed. 72 refs., 17 figs., 4 tabs

  12. Directed Graph Methodology for Acquisition Path Analysis: a possible tool to support the state-level approach

    International Nuclear Information System (INIS)

    Vincze, Arpad; Nemeth, Andras

    2013-01-01

    According to a recent statement, the IAEA seeks to develop a more effective safeguards system to achieve greater deterrence, because deterrence of proliferation is much more effective than detection. To achieve this goal, a less predictive safeguards system is being developed based on the advanced state-level approach that is driven by all available safeguards-relevant information. The 'directed graph analysis' is recommended as a possible methodology to implement acquisition path analysis by the IAEA to support the State evaluation process. The basic methodology is simple, well established, powerful, and its adaptation to the modelling of the nuclear profile of a State requires minimum software development. Based on this methodology the material flow network model has been developed under the Hungarian Support Programme to the IAEA, which is described in detail. In the proposed model, materials in different chemical and physical form can flow through pipes representing declared processes, material transports, diversions or undeclared processes. The nodes of the network are the material types, while the edges of the network are the pipes. A state parameter (p) is assigned to each node and edge representing the probability of their existence in the State. The possible application of this model in the State-level analytical approach will be discussed and outlook for further work will be given. The paper is followed by the slides of the presentation
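    A toy version of the material flow network idea is sketched below, scoring each acquisition path by the product of the existence probabilities of its edges; the network, node names and probabilities are invented for illustration and carry no safeguards meaning.

```python
# Hedged sketch of a directed material-flow graph: nodes are material types,
# edges are processes/transports with an existence probability p, and a
# path's plausibility is scored as the product of its edge probabilities.
import math
import networkx as nx

G = nx.DiGraph()
G.add_edge("natural U", "UF6", p=0.9)    # declared conversion (toy value)
G.add_edge("UF6", "LEU", p=0.8)          # declared enrichment (toy value)
G.add_edge("UF6", "HEU", p=0.05)         # hypothetical undeclared branch
G.add_edge("LEU", "HEU", p=0.02)         # hypothetical undeclared branch

def path_probability(G: nx.DiGraph, path: list[str]) -> float:
    return math.prod(G[u][v]["p"] for u, v in zip(path, path[1:]))

for path in nx.all_simple_paths(G, "natural U", "HEU"):
    print(path, f"p = {path_probability(G, path):.4f}")
```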

  13. Health Data Entanglement and artificial intelligence-based analysis: a brand new methodology to improve the effectiveness of healthcare services.

    Science.gov (United States)

    Capone, A; Cicchetti, A; Mennini, F S; Marcellusi, A; Baio, G; Favato, G

    2016-01-01

    Healthcare expenses will be the most relevant policy issue for most governments in the EU and in the USA. This expenditure can be associated with two major key categories: demographic and economic drivers. The factors driving healthcare expenditure have rarely been recognised, measured and comprehended. An improvement of health data generation and analysis is mandatory, and in order to tackle healthcare spending growth, it may be useful to design and implement an effective, advanced system to generate and analyse these data. A methodological approach relying upon Health Data Entanglement (HDE) can be a suitable option. By definition, in the HDE a large number of data sets having several sources are functionally interconnected and computed through learning machines that generate patterns of highly probable future health conditions of a population. The entanglement concept is borrowed from quantum physics and means that multiple particles (information) are linked together in such a way that the measurement of one particle's quantum state (individual health conditions and related economic requirements) determines the possible quantum states of other particles (population health forecasts to predict their impact). The value created by the HDE is based on the combined evaluation of clinical, economic and social effects generated by health interventions. To predict the future health conditions of a population, analyses of data are performed using self-learning AI, in which sequential decisions are based on Bayesian algorithmic probabilities. HDE and AI-based analysis can be adopted to improve the effectiveness of the health governance system in ways that also lead to better quality of care.

  14. Development of Proliferation Resistance Assessment Methodology Based on International Standards

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Jung Won; Lee, Kwang Seok

    2009-03-01

    Proliferation resistance is one of the requirements to be met in GEN IV and INPRO for next-generation nuclear energy systems. Internationally, evaluation methodologies for PR were already being initiated in the 1980s, but systematic development started in the 2000s. In Korea, for the export of nuclear energy systems and to increase the international credibility and transparency of domestic nuclear system and fuel cycle development, the independent development of a PR evaluation methodology was started in 2007 as a long-term nuclear R and D project, and a model for PR evaluation methodology is being developed. In the first year, a comparative study of GEN-IV/INPRO, PR indicator development, quantification of indicators, evaluation model development, and analysis of technology systems and international technology development trends were performed. In the second year, a feasibility study of the indicators, allowable limits of the indicators, and a review of the technical requirements of the indicators were completed. The results of PR evaluation must be applied at the beginning of the conceptual design of a nuclear system. Through the development of the PR evaluation methodology, the methodology will be applied in the regulatory requirements for authorization and permission to be developed

  15. Critical infrastructure systems of systems assessment methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a "willingness to pay" avoidance approach.

  16. Methodology for comparative statistical analysis of Russian industry based on cluster analysis

    Directory of Open Access Journals (Sweden)

    Sergey S. Shishulin

    2017-01-01

    Full Text Available The article is devoted to researching the possibilities of applying multidimensional statistical analysis in the study of industrial production on the basis of comparing its growth rates and structure with other developed and developing countries of the world. The purpose of this article is to determine the optimal set of statistical methods and the results of their application to industrial production data, which would give the best access to the analysis of the results. The data include such indicators as output, gross value added, the number of employed and other indicators of the system of national accounts and operational business statistics. The objects of observation are the industries of the countries of the Customs Union, the United States, Japan and Europe in 2005-2015. As research tools, both the simplest methods of transformation, graphical and tabular visualization of data, and methods of statistical analysis were used. In particular, based on a specialized software package (SPSS), the principal components method, discriminant analysis, hierarchical methods of cluster analysis, Ward's method and k-means were applied. The application of the principal components method to the initial data makes it possible to substantially and effectively reduce the initial space of industrial production data. Thus, for example, in analyzing the structure of industrial production, the reduction was from fifteen industries to three basic, well-interpreted factors: relatively extractive industries (with a low degree of processing), high-tech industries, and consumer goods (medium-technology sectors). At the same time, as a result of comparing the results of applying cluster analysis to the initial data and to data obtained on the basis of the principal components method, it was established that clustering industrial production data on the basis of the new factors significantly improves the results of clustering. As a result of analyzing the parameters of
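    The two-stage analysis described, principal components followed by clustering on the factor scores, can be sketched as follows; the data matrix is random stand-in data, not the article's industry statistics.

```python
# Sketch of the two-stage pipeline: reduce industry-structure indicators with
# PCA, then cluster countries on the factor scores with k-means. The data
# matrix is random placeholder data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 15))          # 12 countries x 15 industry indicators

scores = PCA(n_components=3).fit_transform(X)   # 15 indicators -> 3 factors
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print(labels)                          # cluster membership per country
```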

  17. A design methodology for unattended monitoring systems

    International Nuclear Information System (INIS)

    SMITH, JAMES D.; DELAND, SHARON M.

    2000-01-01

    The authors presented a high-level methodology for the design of unattended monitoring systems, focusing on a system to detect diversion of nuclear materials from a storage facility. The methodology is composed of seven, interrelated analyses: Facility Analysis, Vulnerability Analysis, Threat Assessment, Scenario Assessment, Design Analysis, Conceptual Design, and Performance Assessment. The design of the monitoring system is iteratively improved until it meets a set of pre-established performance criteria. The methodology presented here is based on other, well-established system analysis methodologies and hence they believe it can be adapted to other verification or compliance applications. In order to make this approach more generic, however, there needs to be more work on techniques for establishing evaluation criteria and associated performance metrics. They found that defining general-purpose evaluation criteria for verifying compliance with international agreements was a significant undertaking in itself. They finally focused on diversion of nuclear material in order to simplify the problem so that they could work out an overall approach for the design methodology. However, general guidelines for the development of evaluation criteria are critical for a general-purpose methodology. A poor choice in evaluation criteria could result in a monitoring system design that solves the wrong problem

  18. Latest developments on safety analysis methodologies at the Juzbado plant

    International Nuclear Information System (INIS)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A.

    2010-01-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, namely the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, such as regulatory requirements, operational experience and, of course, the strong commitment of ENUSA to maintain the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, and within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated earlier in 2009 to assess the actual operating conditions of all the systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points related to these three methodologies as well as an outline of the results obtained so far. (authors)

  19. Benefits of Exergy-Based Analysis for Aerospace Engineering Applications—Part I

    Directory of Open Access Journals (Sweden)

    John H. Doty

    2009-01-01

    Full Text Available This paper compares the analysis of systems from two different perspectives: an energy-based focus and an exergy-based focus. A complex system was simply modeled as interacting thermodynamic systems to illustrate the differences in analysis methodologies and results. The energy-based analysis had combinations of calculated states that are infeasible. On the other hand, the exergy-based analyses only allow feasible states. More importantly, the exergy-based analyses provide clearer insight to the combination of operating conditions for optimum system-level performance. The results strongly suggest changing the analysis/design paradigm used in aerospace engineering from energy-based to exergy-based. This methodology shift is even more critical in exploratory research and development where previous experience may not be available to provide guidance. Although the models used herein may appear simplistic, the message is very powerful and extensible to higher-fidelity models: the 1st Law is only a necessary condition for design, whereas the 1st and 2nd Laws provide the sufficiency condition.
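    The paper's central point, that an energy balance alone admits infeasible states while a second-law check rejects them, can be illustrated with a minimal sketch; the property values below are illustrative numbers under assumed units (kJ/kg, kJ/kg-K), not results from the paper.

```python
# Minimal sketch: an energy balance cannot flag infeasible states, but an
# entropy-generation (2nd-law) check can. Property values are illustrative.
T0 = 298.15  # assumed dead-state temperature, K

def flow_exergy(h: float, s: float, h0: float, s0: float) -> float:
    """Specific flow exergy relative to the dead state: (h-h0) - T0*(s-s0)."""
    return (h - h0) - T0 * (s - s0)

def feasible(s_in: float, s_out: float, q: float, t_boundary: float) -> bool:
    """2nd law for a steady control volume: S_gen = s_out - s_in - q/Tb >= 0."""
    return (s_out - s_in - q / t_boundary) >= 0.0

print(flow_exergy(h=3200.0, s=6.8, h0=104.9, s0=0.367))      # kJ/kg
# An adiabatic process with decreasing entropy passes an energy balance but
# fails the 2nd-law check:
print(feasible(s_in=6.8, s_out=6.5, q=0.0, t_boundary=T0))   # False
```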

  20. A grey-based group decision-making methodology for the selection of hydrogen technologies in Life Cycle Sustainability perspective

    DEFF Research Database (Denmark)

    Manzardo, Alessandro; Ren, Jingzheng; Mazzi, Anna

    2012-01-01

    The objective of this research is to develop a grey-based group decision-making methodology for the selection of the best renewable energy technology (including hydrogen) using a life cycle sustainability perspective. The traditional grey relational analysis has been modified to better address the issue of uncertainty. The proposed methodology allows multiple persons to participate in the decision-making process and to give linguistic evaluations on the weights of the criteria and the performance of the alternative technologies. In this paper, twelve hydrogen production technologies have been assessed using the proposed methodology; electrolysis of water technology by hydropower has been considered to be the best technology for hydrogen production according to the decision-making group.
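    For orientation, a plain (unmodified) grey relational analysis step is sketched below, since this is the computation the authors adapt; the alternatives matrix and the assumption that larger criterion values are better are invented for illustration.

```python
# Sketch of a standard grey relational analysis step: score alternatives by
# their grey relational grade against an ideal reference sequence.
# xi(k) = (d_min + rho*d_max) / (d_i(k) + rho*d_max), rho = 0.5 by convention.
import numpy as np

def grey_relational_grades(X: np.ndarray, rho: float = 0.5) -> np.ndarray:
    # Normalize each criterion to [0, 1], assuming larger is better (toy choice).
    X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    ref = X.max(axis=0)                     # ideal reference sequence
    delta = np.abs(X - ref)                 # deviations from the ideal
    xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return xi.mean(axis=1)                  # equal criteria weights (toy choice)

X = np.array([[0.6, 30.0, 0.8],   # rows: technologies, cols: criteria (toy)
              [0.9, 45.0, 0.6],
              [0.7, 25.0, 0.9]])
print(grey_relational_grades(X))   # higher grade = closer to the ideal
```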

  1. A methodology for analysis of impacts of grid integration of renewable energy

    International Nuclear Information System (INIS)

    George, Mel; Banerjee, Rangan

    2011-01-01

    Present electricity grids are predominantly thermal (coal, gas) and hydro based. Conventional power planning involves hydro-thermal scheduling and merit order dispatch. In the future, modern renewables (hydro, solar and biomass) are likely to have a significant share in the power sector. This paper presents a method to analyse the impacts of renewables in the electricity grid. A load duration curve based approach has been developed. Renewable energy sources have been treated as negative loads to obtain a modified load duration curve from which capacity savings in terms of base and peak load generation can be computed. The methodology is illustrated for solar, wind and biomass power for Tamil Nadu (a state in India). The trade-offs and interactions between renewable sources are analysed. The impacts on capacity savings of varying the wind regime have also been shown. Scenarios for 2021-22 have been constructed to illustrate the methodology proposed. This technique can be useful for power planners for an analysis of renewables in future electricity grids. - Research highlights: → A new method to analyse impacts of renewables in the electricity grid. → Effects of wind, solar PV and biomass power on the load duration curve and capacity savings are shown. → Illustration of intermittent renewables and their interplay for sites in India and the UK. → Future scenarios constructed for generation expansion planning with higher levels of renewables.
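    The negative-load construction can be sketched directly; the demand and wind profiles below are synthetic placeholders, not the Tamil Nadu data used in the paper.

```python
# Sketch of the negative-load idea: subtract renewable output from hourly
# demand, sort descending to get the modified load duration curve (LDC), and
# read off peak and base capacity savings. Profiles are synthetic.
import numpy as np

hours = 8760
rng = np.random.default_rng(1)
load = 900 + 150 * np.sin(np.linspace(0, 8 * np.pi, hours)) \
       + rng.normal(0, 30, hours)                    # MW, synthetic demand
wind = np.clip(rng.normal(80, 50, hours), 0, None)   # MW, synthetic wind

ldc = np.sort(load)[::-1]                 # conventional LDC
mod_ldc = np.sort(load - wind)[::-1]      # renewables treated as negative load

print(f"peak capacity saving: {ldc[0] - mod_ldc[0]:.0f} MW")
print(f"base capacity saving: {ldc[-1] - mod_ldc[-1]:.0f} MW")
```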

  2. Methodology for the analysis of pollutant emissions from a city bus

    International Nuclear Information System (INIS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-01-01

    In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As test circuit, a passenger transportation line at a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determine the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel–air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least. (paper)

  3. Methodology for the analysis of pollutant emissions from a city bus

    Science.gov (United States)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work a methodology is proposed for measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As test circuit, a passenger transportation line at a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determine the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least.

  4. Towards a Multimodal Methodology for the Analysis of Translated/Localised Games

    Directory of Open Access Journals (Sweden)

    Bárbara Resende Coelho

    2016-12-01

    Full Text Available Multimedia materials require research methodologies that are able to comprehend all of their assets. Videogames are the epitome of multimedia, joining image, sound, video, animation, graphics and text with the interactivity factor. A methodology for conducting research into the translation and localisation of videogames should be able to analyse all of their assets and features. This paper sets out to develop a research methodology for games and their translations/localisations that goes beyond the collection and analysis of "screenshots" and includes as many of their assets as possible. Using the fully localised version of the game Watchdogs, this paper shows how tools and technologies allow for transcending the mere analysis of linguistic contents within multimedia materials. Using the ELAN Language Archive software to analyse Portuguese-language dubbed and English-language subtitled excerpts from the videogame, it was possible to identify patterns in both linguistic and audio-visual elements, as well as to correlate them.

  5. Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor

    Science.gov (United States)

    Mkhabela, Peter Tshepo

    The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (based on neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be a HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, HTR transient analysis is more demanding, since it requires proper treatment of both slower, much longer transients (of time scale in hours and days) and fast, short transients (of time scale in minutes and seconds). There is limited operational and experimental data available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study contributed to greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and various multi-physics effects that can influence it. Consideration of the feedback due to the influence of leakage was taken into account by development and implementation of improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate for proper interpolation from cross section tables. To achieve this, the NEM/THERMIX coupled code system was developed to create a system that is efficient and stable over the duration of transient calculations that last over several tens of hours. Another achievement of the PhD thesis was the development and demonstration of full-physics, three-dimensional safety analysis

  6. A modified GO-FLOW methodology with common cause failure based on Discrete Time Bayesian Network

    International Nuclear Information System (INIS)

    Fan, Dongming; Wang, Zili; Liu, Linlin; Ren, Yi

    2016-01-01

    Highlights: • Identification of particular causes of failure for common cause failure analysis. • Comparison of two formalisms (GO-FLOW and Discrete Time Bayesian Network) and establishment of the correlation between them. • Mapping of the GO-FLOW model into a Bayesian network model. • Calculation of the GO-FLOW model with common cause failures based on DTBN. - Abstract: The GO-FLOW methodology is a success-oriented system reliability modelling technique for multi-phase missions involving complex time-dependent, multi-state and common cause failure (CCF) features. However, the analysis algorithm cannot easily handle multiple shared signals and CCFs. In addition, the simulative algorithm is time-consuming when vast numbers of multi-state components exist in the model, and the multiple time points of phased-mission problems increase the difficulty of the analysis method. In this paper, the Discrete Time Bayesian Network (DTBN) and the GO-FLOW methodology are integrated by unified mapping rules. Based on these rules, the operators can be mapped into DTBN; subsequently, a complete GO-FLOW model with complex characteristics (e.g. phased mission, multi-state, and CCF) can be converted to the isomorphic DTBN and easily analyzed by utilizing the DTBN. With mature algorithms and tools, the multi-phase mission reliability parameter can be efficiently obtained via the proposed approach without considering the shared signals and the various complex logic operations. Meanwhile, CCF can also arise in the computing process.

  7. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.; Carroll, R. J.; Dabney, A. R.

    2012-01-01

    positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon

  8. An Image Analysis-Based Methodology for Chromite Exploration through Opto-Geometric Parameters; a Case Study in Faryab Area, SE of Iran

    Directory of Open Access Journals (Sweden)

    Mansur Ziaii

    2017-06-01

    Full Text Available Traditional methods of chromite exploration are mostly based on geophysical techniques and drilling operations. They are expensive and time-consuming. Furthermore, they suffer from several shortcomings, such as lack of sufficient geophysical density contrast. In order to overcome these drawbacks, the current research work introduces a novel, automatic and opto-geometric image analysis (OGIA) technique for extracting the structural properties of chromite minerals using polished thin sections prepared from outcrops. Several images are taken from polished thick sections through a reflected-light microscope equipped with a digital camera. The images are processed in filtering and segmentation steps to extract the worthwhile information of chromite minerals. The directional density of chromite minerals, as a textural property, is studied in different inclinations, and the main trend of chromite growth is identified. The microscopic inclination of chromite veins can be generalized for exploring the macroscopic layers of chromite buried under either the surface quaternary alluvium or overburden rocks. The performance of the OGIA methodology is tested in a real case study, where several exploratory boreholes were drilled. The results obtained show that the microscopic investigation outlines derived through image analysis are in good agreement with the results obtained from interpretation of the boreholes. The OGIA method produces a reliable map of the absence or existence of chromite ore deposits at different horizontal surfaces. Directing exploration investigations toward more susceptible zones (potentials) and preventing the waste of time and money are the major contributions of the OGIA methodology. It leads to optimal managerial and economical decisions.

  9. The TE coupled RELAP5/PANTHER/COBRA code package and methodology for integrated PWR accident analysis

    International Nuclear Information System (INIS)

    Schneidesch, Christophe R.; Zhang, Jinzhao; Ammirabile, Luca; Dalleur, Jean-Paul

    2006-01-01

    At Tractebel Engineering (TE), a dynamic coupling has been developed between the best estimate thermal hydraulics system code RELAP5 and the 3-dimensional neutronics code PANTHER via the transient analysis code linkage program TALINK. An interface between PANTHER and the subchannel core thermal-hydraulic analysis code COBRA 3C has been established for on-line calculation of the Departure from Nucleate Boiling Ratio (DNBR). In addition to the standard RELAP5-PANTHER coupling, the fully dynamic coupling of the RELAP5/PANTHER/COBRA3C-TE code package can be activated for evaluation purposes in which the PANTHER close-channel thermal-hydraulics module is replaced by the COBRA3C-TE with cross flow modelling and extended T-H flow conditions capabilities. The qualification of the RELAP5-PANTHER coupling demonstrated the robustness achieved by the combined 3-D neutron kinetics/system T-H code package for transient simulations. The coupled TE code package has been approved by the Belgian Safety Authorities and is used at TE for analyzing asymmetric PWR accidents with strong core-system interactions. In particular, the TE coupled code package was first used to develop a main steam line break in hot shutdown conditions (SLBHZP) accident analysis methodology based on the TE deterministic bounding approach. This methodology has been reviewed and accepted by the Belgian Safety Authorities for specific applications. Those specific applications are related to the power up-rate and steam generator replacement project of the Doel 2 plant or to the Tihange-3 SLB accident re-analysis. A coupled feedwater line break (FLB) accident analysis methodology is currently being reviewed for application approval. The results of coupled thermal-hydraulic and neutronic analysis of SLB and FLB show that there exist important margins in the traditional final safety analysis report (FSAR) accident analysis. Those margins can be used to increase the operational flexibility of the plants. Moreover, the

  10. The TE coupled RELAP5/PANTHER/COBRA code package and methodology for integrated PWR accident analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schneidesch, Christophe R.; Zhang, Jinzhao; Ammirabile, Luca; Dalleur, Jean-Paul [Suez-Tractebel Engineering, Avenue Ariane 7, B-1200 Brussels (Belgium)

    2006-07-01

    At Tractebel Engineering (TE), a dynamic coupling has been developed between the best-estimate thermal-hydraulics system code RELAP5 and the 3-dimensional neutronics code PANTHER via the transient analysis code linkage program TALINK. An interface between PANTHER and the subchannel core thermal-hydraulic analysis code COBRA 3C has been established for on-line calculation of the Departure from Nucleate Boiling Ratio (DNBR). In addition to the standard RELAP5-PANTHER coupling, the fully dynamic coupling of the RELAP5/PANTHER/COBRA3C-TE code package can be activated for evaluation purposes, in which the PANTHER closed-channel thermal-hydraulics module is replaced by COBRA3C-TE with cross-flow modelling and extended T-H flow condition capabilities. The qualification of the RELAP5-PANTHER coupling demonstrated the robustness achieved by the combined 3-D neutron kinetics/system T-H code package for transient simulations. The coupled TE code package has been approved by the Belgian Safety Authorities and is used at TE for analyzing asymmetric PWR accidents with strong core-system interactions. In particular, the TE coupled code package was first used to develop a main steam line break in hot shutdown conditions (SLBHZP) accident analysis methodology based on the TE deterministic bounding approach. This methodology has been reviewed and accepted by the Belgian Safety Authorities for specific applications related to the power up-rate and steam generator replacement project of the Doel 2 plant and to the Tihange-3 SLB accident re-analysis. A coupled feedwater line break (FLB) accident analysis methodology is currently being reviewed for application approval. The results of coupled thermal-hydraulic and neutronic analysis of SLB and FLB show that there are important margins in the traditional final safety analysis report (FSAR) accident analysis. Those margins can be used to increase the operational flexibility of the plants. Moreover, the

  11. Fully Stochastic Distributed Methodology for Multivariate Flood Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Isabel Flores-Montoya

    2016-05-01

    Full Text Available An adequate estimation of the extreme behavior of basin response is essential both for designing river structures and for evaluating their risk. The aim of this paper is to develop a new methodology to generate extreme hydrograph series of thousands of years using an event-based model. To this end, a spatial-temporal synthetic rainfall generator (RainSimV3) is combined with a distributed, physically-based, event-based rainfall-runoff model (RIBS). The use of an event-based model allows simulating longer hydrograph series with lower computational and data requirements, but requires characterizing the initial basin state, which depends on the initial basin moisture distribution. To overcome this problem, this paper proposes a probabilistic calibration-simulation approach, which considers the initial state and the model parameters as random variables characterized by a probability distribution through a Monte Carlo simulation. This approach is compared with two other approaches, the deterministic and the semi-deterministic approaches. Both use a unique initial state. The deterministic approach also uses a unique value of the model parameters, while the semi-deterministic approach obtains these values from their probability distribution through a Monte Carlo simulation, considering the basin variability. This methodology has been applied to the Corbès and Générargues basins, in the southeast of France. The results show that the probabilistic approach offers the best fit. This means that the proposed methodology can be successfully used to characterize the extreme behavior of the basin, considering the basin variability and overcoming the problem of the initial basin state.
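
    A minimal sketch of the probabilistic calibration-simulation idea follows: for each synthetic event, the initial basin state and the model parameters are drawn from distributions rather than fixed at single values. The one-line runoff "model" and all distributions are invented stand-ins for RainSimV3/RIBS, and one event per year is assumed for the empirical return period.

        import random

        def peak_flow(rain_depth, soil_moisture, runoff_coeff):
            # placeholder event-based response, not the RIBS model
            return runoff_coeff * soil_moisture * rain_depth

        random.seed(1)
        peaks = []
        for _ in range(10000):                     # synthetic annual maxima
            rain = random.expovariate(1 / 50.0)    # storm depth (mm), invented
            moisture = random.betavariate(2, 5)    # random initial basin state
            coeff = random.gauss(0.6, 0.1)         # random model parameter
            peaks.append(peak_flow(rain, moisture, coeff))

        peaks.sort()
        print("empirical 100-year peak:", round(peaks[int(len(peaks) * 0.99)], 1))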

  12. The bounds on tracking performance utilising a laser-based linear and angular sensing and measurement methodology for micro/nano manipulation

    International Nuclear Information System (INIS)

    Clark, Leon; Shirinzadeh, Bijan; Tian, Yanling; Zhong, Yongmin

    2014-01-01

    This paper presents an analysis of the tracking performance of a planar three degrees of freedom (DOF) flexure-based mechanism for micro/nano manipulation, utilising a tracking methodology for the measurement of coupled linear and angular motions. The methodology permits trajectories over a workspace with large angular range through the reduction of geometric errors. However, when combining this methodology with feedback control systems, the accuracy of performed manipulations can only be stated within the bounds of the uncertainties in measurement. The dominant sources of error and uncertainty within each sensing subsystem are therefore identified, which leads to a formulation of the measurement uncertainty in the final system outputs, in addition to methods of reducing their magnitude. Specific attention is paid to the analysis of the vision-based subsystem utilised for the measurement of angular displacement. Furthermore, a feedback control scheme is employed to minimise tracking errors, and the coupling of certain measurement errors is shown to have a detrimental effect on the controller operation. The combination of controller tracking errors and measurement uncertainty provides the bounds on the final tracking performance. (paper)

  13. Motivating Students for Project-based Learning for Application of Research Methodology Skills.

    Science.gov (United States)

    Tiwari, Ranjana; Arya, Raj Kumar; Bansal, Manoj

    2017-12-01

    Project-based learning (PBL) is motivational for students learning research methodology skills. It is a way to engage students and give them ownership over their own learning. The aim of this study is to use PBL for the application of research methodology skills for better learning, by encouraging an all-inclusive approach to teaching and learning rather than an individually tailored approach. The present study was carried out for MBBS 6th- and 7th-semester students of community medicine. Students and faculty were sensitized about PBL and the components of research methodology skills. Students worked in small groups. They were asked to fill in a student feedback questionnaire, and the faculty were asked to fill in a faculty feedback questionnaire. Both questionnaires were assessed on a 5-point Likert scale. After the projects were submitted, document analysis was done. A total of 99 students of the 6th and 7th semesters participated in the PBL. About 90.91% of students agreed that PBL should be continued in subsequent batches; 73.74% felt satisfied and motivated with PBL, whereas 76.77% felt that they would be able to use research methodology in the near future. PBL requires considerable knowledge, effort, persistence, and self-regulation on the part of the students. They need to devise plans, gather information, and evaluate both their findings and their approach. The facilitator plays a critical role in helping students through the process by shaping opportunities for learning, guiding students' thinking, and helping them construct new understanding.

  14. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

    The availability of reliable tools and an associated methodology able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After having outlined the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, on its extended experimental database and on the assessment of the GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO model uncertainties is of the utmost importance for accurate evaluation of fuel design margins, as illustrated in some application examples. With the submittal of the GALILEO Topical Report to the U.S. NRC in 2013, GALILEO and its methodology are on the way to being used industrially in a wide range of irradiation conditions. (authors)
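
    The sketch below illustrates this style of Monte Carlo propagation: every run samples the inputs and model-uncertainty factors from their distributions, and a high percentile of the output population is compared with the limit. The one-line fuel "model", the distributions and the limit are invented for illustration and are unrelated to the actual GALILEO models.

        import random

        def clad_temperature(power, gap_bias, cond_bias):
            # placeholder thermal model with two model-uncertainty factors
            return 320.0 + 0.08 * power * gap_bias / cond_bias

        random.seed(0)
        results = []
        for _ in range(1000):                          # one code run per sample
            power = random.gauss(450.0, 15.0)          # local power, sampled
            gap_bias = random.lognormvariate(0.0, 0.05)    # model uncertainties
            cond_bias = random.lognormvariate(0.0, 0.08)
            results.append(clad_temperature(power, gap_bias, cond_bias))

        results.sort()
        p95 = results[int(0.95 * len(results))]
        print(f"95th-percentile temperature: {p95:.1f} C; margin to a 400 C limit: {400.0 - p95:.1f} C")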

  15. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    Energy Technology Data Exchange (ETDEWEB)

    Hennig, D.; Nechvatal, L. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1996-09-01

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. Within the framework of this work, the stability properties of some operating points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs.

  16. A Methodology for Equitable Performance Assessment and Presentation of Wave Energy Converters Based on Sea Trials

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Pecher, Arthur; Margheritini, Lucia

    2013-01-01

    This paper provides a methodology for the analysis and presentation of data obtained from sea trials of wave energy converters (WEC). The equitable aspect of this methodology lies in its wide application, as any WEC at any scale or stage of development can be considered as long as the tests ... How ... parameters influence the performance of the WEC can also be investigated using this methodology. ... leads to testing campaigns that are not as extensive as desired. Therefore, the performance analysis should be robust enough to allow for not fully complete sea trials and sub-optimal performance data. In other words, this methodology is focused on retrieving the maximum amount of useful information out ...

  17. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    Science.gov (United States)

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims at a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose an FCA methodology that uses standard costs and actual quantities to calculate the collection costs of separated and undifferentiated waste. The methodology allows cost-efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. It supports variance analysis that can be used to identify the causes of off-standard performance and to guide managers in deploying resources more efficiently, and it can be implemented by companies lacking a sophisticated management accounting system. Copyright © 2015 Elsevier Ltd. All rights reserved.
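
    A pocket illustration of the "standard costs and actual quantities" idea follows: the full cost of each waste stream is built from standard unit costs applied to actual activity drivers, so results stay comparable across firms with different accounting choices. All figures and drivers below are invented.

        STANDARD_COSTS = {            # EUR per unit of activity driver (invented)
            "truck_hour": 55.0,
            "crew_hour": 28.0,
            "disposal_tonne": 95.0,
        }

        def full_cost(truck_hours, crew_hours, tonnes):
            return (truck_hours * STANDARD_COSTS["truck_hour"]
                    + crew_hours * STANDARD_COSTS["crew_hour"]
                    + tonnes * STANDARD_COSTS["disposal_tonne"])

        streams = {
            "separated_paper": dict(truck_hours=420, crew_hours=840, tonnes=310),
            "undifferentiated": dict(truck_hours=980, crew_hours=1960, tonnes=2100),
        }
        for name, q in streams.items():
            cost = full_cost(**q)
            print(f"{name}: {cost:,.0f} EUR total, {cost / q['tonnes']:.1f} EUR/tonne")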

  18. Proposition of a modeling and an analysis methodology of integrated reverse logistics chain in the direct chain

    Directory of Open Access Journals (Sweden)

    Faycal Mimouni

    2016-04-01

    Full Text Available Purpose: To propose a modeling and analysis methodology, based on the combination of Bayesian networks and Petri networks, for the reverse logistics chain integrated into the direct supply chain. Design/methodology/approach: Network modeling by combining Petri and Bayesian networks. Findings: Modeling with a Bayesian network, complemented with a Petri network to break the cycle problem in the Bayesian network. Research limitations/implications: Demands are independent of returns. Practical implications: The model can only be used for non-perishable products. Social implications: Legislative aspects: recycling laws; protection of the environment; client satisfaction via after-sales service. Originality/value: A Bayesian network with a cycle, combined with a Petri network.

  19. PIXE methodology of rare earth element analysis and its applications

    International Nuclear Information System (INIS)

    Ma Xinpei

    1992-01-01

    The Proton Induced X-ray Emission (PIXE) methodology for rare earth element (REE) analysis is discussed, including the significance of REE analysis, the principle of PIXE applied to REEs, the selection of characteristic X-rays for lanthanide-series elements, the deconvolution of highly overlapped PIXE spectra, and the minimum detection limit (MDL) of REEs. Some practical applications are presented, and the special features of PIXE analysis of high-purity REE chemicals are discussed. (author)

  20. A methodology for eliciting, representing, and analysing stakeholder knowledge for decision making on complex socio-ecological systems: from cognitive maps to agent-based models.

    Science.gov (United States)

    Elsawah, Sondoss; Guillaume, Joseph H A; Filatova, Tatiana; Rook, Josefine; Jakeman, Anthony J

    2015-03-15

    This paper aims to contribute to developing better ways of incorporating essential human elements into decision making processes for the modelling of complex socio-ecological systems. It presents a step-wise methodology for integrating perceptions of stakeholders (qualitative) into formal simulation models (quantitative), with the ultimate goal of improving understanding and communication about decision making in complex socio-ecological systems. The methodology integrates cognitive mapping and agent-based modelling. It cascades through a sequence of qualitative/soft and numerical methods comprising: (1) interviews to elicit mental models; (2) cognitive maps to represent and analyse individual and group mental models; (3) time-sequence diagrams to chronologically structure the decision making process; (4) an all-encompassing conceptual model of decision making; and (5) a computational (in this case agent-based) model. We apply the proposed methodology (labelled ICTAM) in a case study of viticulture irrigation in South Australia. Finally, we use strengths-weaknesses-opportunities-threats (SWOT) analysis to reflect on the methodology. Results show that the methodology leverages the use of cognitive mapping to capture the richness of decision making and mental models, and provides a combination of divergent and convergent analysis methods leading to the construction of an agent-based model. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Methodology for assessing laser-based equipment

    Science.gov (United States)

    Pelegrina-Bonilla, Gabriel; Hermsdorf, Jörg; Thombansen, Ulrich; Abels, Peter; Kaierle, Stefan; Neumann, Jörg

    2017-10-01

    Methodologies for assessing a technology's maturity are widely used in industry and research. Probably the best known are technology readiness levels (TRLs), initially pioneered by the National Aeronautics and Space Administration (NASA). At the beginning, only descriptively defined TRLs existed, but over time automated assessment techniques in the form of questionnaires emerged in order to determine TRLs. Originally TRLs targeted equipment for space applications, but the demands on industrially relevant equipment are partly different in terms of, for example, overall costs, product quantities, or the presence of competitors. Therefore, we present a generally valid assessment methodology with the aim of assessing laser-based equipment for industrial use. The assessment is carried out with the help of a questionnaire, which allows for a user-friendly and easily accessible way to monitor the progress from the lab-proven state to the application-ready product throughout the complete development period. The assessment result is presented in a multidimensional metric in order to reveal the current specific strengths and weaknesses of the equipment development process, which can be used to steer the remaining development of the equipment in the right direction.

  2. A Simulation-Based Soft Error Estimation Methodology for Computer Systems

    OpenAIRE

    Sugihara, Makoto; Ishihara, Tohru; Hashimoto, Koji; Muroyama, Masanori

    2006-01-01

    This paper proposes a simulation-based soft error estimation methodology for computer systems. Accumulating soft error rates (SERs) of all memories in a computer system results in pessimistic soft error estimation. This is because memory cells are used spatially and temporally and not all soft errors in them make the computer system faulty. Our soft-error estimation methodology considers the locations and the timings of soft errors occurring at every level of memory hierarchy and estimates th...
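
    A toy sketch of the timing-aware idea follows: a bit flip only contributes to system failure if it lands between a write and the last read of the affected location, so summing vulnerable time rather than total time gives a less pessimistic estimate. The trace, the raw SER and the word size are invented; the methodology in the paper is simulation-based and far more detailed.

        RAW_SER_PER_BIT = 1e-12      # upsets per bit per cycle (invented)

        # (write_cycle, last_read_cycle) lifetimes for each memory word
        trace = {
            "word_a": [(0, 400), (500, 520)],
            "word_b": [(100, 110)],
            "word_c": [],            # written but never read: never vulnerable
        }
        TOTAL_CYCLES, BITS_PER_WORD = 1000, 32

        naive = RAW_SER_PER_BIT * BITS_PER_WORD * TOTAL_CYCLES * len(trace)
        vulnerable = sum(r - w for spans in trace.values() for w, r in spans)
        timing_aware = RAW_SER_PER_BIT * BITS_PER_WORD * vulnerable

        print(f"naive estimate:        {naive:.3e} failures")
        print(f"timing-aware estimate: {timing_aware:.3e} failures")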

  3. submitter Methodologies for the Statistical Analysis of Memory Response to Radiation

    CERN Document Server

    Bosser, Alexandre L; Tsiligiannis, Georgios; Frost, Christopher D; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigne, Frederic; Virtanen, Ari; Wrobel, Frederic; Dilillo, Luigi

    2016-01-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].
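
    As a small example of the kind of statistic such an analysis starts from, the sketch below computes a per-bit SEU cross-section with an approximate 95% Poisson confidence interval on the upset count. The counts, fluence and device size are invented, and the in-depth analysis in the paper goes well beyond this.

        from math import sqrt

        n_upsets = 184                    # observed single event upsets
        fluence = 1.0e11                  # particles per cm^2
        n_bits = 8 * 1024 * 1024          # device size in bits

        sigma = n_upsets / (fluence * n_bits)     # cross-section, cm^2 per bit
        # normal approximation to the Poisson interval (fine for large counts)
        half = 1.96 * sqrt(n_upsets)
        lo = (n_upsets - half) / (fluence * n_bits)
        hi = (n_upsets + half) / (fluence * n_bits)
        print(f"sigma = {sigma:.3e} cm^2/bit (95% CI {lo:.3e} .. {hi:.3e})")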

  4. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    Science.gov (United States)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computational cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques for performing probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to use FORM directly in numerical slope stability evaluations, as it requires the definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on the response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is demonstrated by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy with which the developed performance function represents the true limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with the Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while its accuracy is lower, with an error of 24%.
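
    The sketch below shows the response-surface idea in its simplest form: a handful of "simulator" runs are fitted with an explicit performance function, for which the reliability index is then available in closed form (the fitted surface here is linear in independent normal variables, so FORM reduces to a single formula). The simulator, distributions and coefficients are invented placeholders, not the Sumela Monastery model.

        import numpy as np
        from scipy.stats import norm

        def simulator(cohesion, friction_deg):
            # stand-in for the numerical slope model: returns g (g < 0 = failure)
            fs = 0.018 * cohesion + 0.032 * friction_deg   # factor of safety
            return fs - 1.0

        mu = np.array([28.0, 34.0])    # means: cohesion (kPa), friction (deg)
        sd = np.array([6.0, 4.0])      # standard deviations (invented)

        # small design of experiments, then least-squares fit g ~ a0 + a1*c + a2*phi
        X = mu + np.array([[0, 0], [1, 0], [-1, 0], [0, 1], [0, -1]]) * sd
        y = np.array([simulator(*x) for x in X])
        A = np.column_stack([np.ones(len(X)), X])
        a0, a1, a2 = np.linalg.lstsq(A, y, rcond=None)[0]

        # linear surface + independent normal inputs: FORM is closed-form
        g_mean = a0 + a1 * mu[0] + a2 * mu[1]
        g_sd = np.hypot(a1 * sd[0], a2 * sd[1])
        beta = g_mean / g_sd
        print(f"reliability index beta = {beta:.2f}, Pf = {norm.cdf(-beta):.2e}")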

  5. Methodological Aspects of Building Science-based Sports Training System for Taekwondo Sportsmen

    Directory of Open Access Journals (Sweden)

    Ananchenko Konstantin

    2016-10-01

    Full Text Available The authors address topical scientific problems in the article: (1) the research base for constructing the theoretical and methodological foundations of sports training in taekwondo is analysed; (2) the organizational and methodological requirements for taekwondo training sessions are researched; (3) the necessity of interaction between the processes of natural development and adaptation to physical activity of young taekwondo sportsmen is grounded; and (4) the necessity of scientific evidence for building young fighters' training loads in microcycles, based on their individualization, is proved.

  6. Morphological images analysis and chromosomic aberrations classification based on fuzzy logic

    International Nuclear Information System (INIS)

    Souza, Leonardo Peres

    2011-01-01

    This work implemented a methodology for the automated image analysis of chromosomes of human cells irradiated at the IEA-R1 nuclear reactor (located at IPEN, Sao Paulo, Brazil), and therefore subject to morphological aberrations. This methodology is intended as a tool for helping cytogeneticists in the identification, characterization and classification of chromosomes in metaphase analysis. The development of the methodology included the creation of a software application based on artificial intelligence techniques, using fuzzy logic combined with image processing techniques. The developed application, named CHRIMAN, is composed of modules that implement the methodological steps needed to achieve an automated analysis. The first step is the standardization of the two-dimensional digital image acquisition procedure, through coupling a simple digital camera to the ocular of the conventional metaphase analysis microscope. The second step is image treatment, achieved through the application of digital filters, and the storage and organization of information obtained both from the image content itself and from selected extracted features, for further use in pattern recognition algorithms. The third step consists of characterizing, counting and classifying the stored digital images and extracted feature information. The accuracy in the recognition of chromosome images is 93.9%. The classification is based on the classical standards of Buckton [1973], and supports geneticists in the chromosomal analysis procedure, decreasing analysis time and creating conditions to include this method in a broader system for the evaluation of human cell damage due to ionizing radiation exposure. (author)

  7. Thermodynamic and exergoeconomic analysis of a cement plant: Part I – Methodology

    International Nuclear Information System (INIS)

    Atmaca, Adem; Yumrutaş, Recep

    2014-01-01

    Highlights: • Energy, exergy and exergoeconomic analyses of a complete cement plant are presented. • First- and second-law efficiencies based on the energy and exergy analyses are defined for the entire cement plant. • The specific energy consumption of all sections of the cement plant has been analyzed. • The specific manufacturing costs of farine, clinker and cement have been determined by the cost analysis. - Abstract: The energy, exergy and exergoeconomic analysis of a cement factory is presented in two parts. This paper is the first part of the study, which includes the thermodynamic and exergoeconomic methodology and the formulations developed for such a comprehensive and detailed analysis. The second part of the study covers the application of the developed formulation to an actual cement plant located in Gaziantep, Turkey. Energy consumption by the cement industry is about 5% of total global industrial energy consumption, and the industry is also one of the world's largest industrial sources of CO2 emissions. In this paper, a cement plant is considered with all of its main manufacturing units. Mass, energy, and exergy balances are applied to each system. First- and second-law efficiencies based on the energy and exergy analyses, together with performance assessment parameters, are defined for the entire cement plant. The formulations for the cost of products, and for cost formation and allocation within the system, are developed based on exergoeconomic analysis. The cost analysis formulated here is of substantial importance for obtaining the optimal market price of cement and for decreasing the specific energy consumption of the whole plant.
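
    As a pocket illustration of the first- and second-law efficiency definitions used in this kind of analysis, the sketch below evaluates both for a single hypothetical heat-driven unit, valuing heat exergy with the Carnot factor relative to a dead state T0. All stream values are invented.

        T0 = 298.15                                 # dead-state temperature (K)

        inputs = [(5000.0, 1700.0)]                 # (heat flow kW, T in K), invented
        useful_outputs = [(3100.0, 1200.0)]
        carnot = lambda T: 1.0 - T0 / T             # exergy fraction of heat at T

        energy_in = sum(q for q, _ in inputs)
        energy_out = sum(q for q, _ in useful_outputs)
        exergy_in = sum(q * carnot(T) for q, T in inputs)
        exergy_out = sum(q * carnot(T) for q, T in useful_outputs)

        print(f"first-law (energy) efficiency:  {energy_out / energy_in:.2%}")
        print(f"second-law (exergy) efficiency: {exergy_out / exergy_in:.2%}")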

  8. Methodological remarks on contraction theory

    DEFF Research Database (Denmark)

    Jouffroy, Jerome; Slotine, Jean-Jacques E.

    Because contraction analysis stems from a differential and incremental framework, the nature and methodology of contraction-based proofs are significantly different from those of their Lyapunov-based counterparts. This paper specifically studies this issue, and illustrates it by revisiting some classical examples traditionally addressed using Lyapunov theory. Even in these cases, contraction tools can often yield significantly simplified analysis. The examples include adaptive control, robotics, and a proof of convergence of the deterministic Extended Kalman Filter.

  9. Consensus-based methodology for detection of communities in multilayered networks

    Science.gov (United States)

    Karimi-Majd, Amir-Mohsen; Fathian, Mohammad; Makrehchi, Masoud

    2018-03-01

    Finding groups of network users who are densely connected with each other has emerged as an interesting problem in the area of social network analysis. These groups, or so-called communities, are hidden behind the behavior of users. Most studies assume that such behavior can be understood by focusing on user interfaces, on behavioral attributes, or on a combination of these network layers (i.e., interfaces with their attributes). They also assume that all network layers refer to the same behavior. However, in real-life networks, users' behavior in one layer may differ from their behavior in another one. In order to cope with these issues, this article proposes a consensus-based community detection approach (CBC). CBC finds communities among nodes at each layer, in parallel, and the results of the layers are then aggregated using a consensus clustering method. This means that different behaviors can be detected and used in the analysis. As another significant advantage, the methodology is able to handle missing values. Three experiments on real-life and computer-generated datasets have been conducted in order to evaluate the performance of CBC. The results indicate the superiority and stability of CBC in comparison to other approaches.
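
    A compact sketch of the consensus step follows: communities are detected on each layer separately (given here as ready-made partitions), a co-association matrix counts how often two nodes share a community, and a final grouping is read off the thresholded matrix. The partitions and the 0.5 threshold are invented; the specific consensus clustering method used in the paper is not reproduced here.

        import numpy as np

        layer_partitions = [                 # node -> community id, per layer
            {0: 0, 1: 0, 2: 0, 3: 1, 4: 1},
            {0: 0, 1: 0, 2: 1, 3: 1, 4: 1},
            {0: 0, 1: 0, 2: 0, 3: 1, 4: 1},
        ]
        n = 5
        co = np.zeros((n, n))
        for part in layer_partitions:
            for i in range(n):
                for j in range(n):
                    co[i, j] += part[i] == part[j]
        co /= len(layer_partitions)          # fraction of layers agreeing

        # naive consensus: connected components of the thresholded graph
        labels, next_label = {}, 0
        for i in range(n):
            if i not in labels:
                stack, labels[i] = [i], next_label
                while stack:
                    u = stack.pop()
                    for v in range(n):
                        if co[u, v] >= 0.5 and v not in labels:
                            labels[v] = next_label
                            stack.append(v)
                next_label += 1
        print(labels)                        # {0: 0, 1: 0, 2: 0, 3: 1, 4: 1}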

  10. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    Science.gov (United States)

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data, but PO gives a distinctive insight, revealing what people are really doing instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. The use of the grounded theory method and of PO as a data collection tool is discussed. The following methodological tools are considered: an observational protocol, the jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging: it was time-consuming, and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis through to building theory. Using PO as a data collection method in qualitative nursing research provides insights that cannot be seen or revealed by other data collection methods, yet it is not commonly discussed in nursing research. This study can therefore provide useful guidance for those who intend to use PO and grounded theory in their nursing research.

  11. Cost analysis methodology of spent fuel storage

    International Nuclear Information System (INIS)

    1994-01-01

    The report deals with the cost analysis of interim spent fuel storage; however, it is not intended either to give a detailed cost analysis or to compare the costs of the different options. This report provides a methodology for calculating the costs of different options for interim storage of the spent fuel produced in the reactor cores. Different technical features and storage options (dry and wet, away from reactor and at reactor) are considered and the factors affecting all options defined. The major cost categories are analysed. Then the net present value of each option is calculated and the levelized cost determined. Finally, a sensitivity analysis is conducted taking into account the uncertainty in the different cost estimates. Examples of current storage practices in some countries are included in the Appendices, with description of the most relevant technical and economic aspects. 16 figs, 14 tabs
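
    As a toy illustration of the net-present-value and levelized-cost steps the report describes, the sketch below discounts an invented 30-year cost stream and divides by the discounted quantity of fuel stored. The discount rate, cash flows and throughput are illustrative assumptions only.

        RATE = 0.05                                   # real discount rate

        capital = [120.0, 80.0] + [0.0] * 28          # M$ per year, invented
        operating = [0.0, 0.0] + [6.0] * 28           # M$ per year
        fuel_stored = [0.0, 0.0] + [150.0] * 28       # tonnes HM per year

        def pv(flows):
            return sum(f / (1 + RATE) ** t for t, f in enumerate(flows))

        npv_cost = pv(capital) + pv(operating)
        levelized = npv_cost / pv(fuel_stored)        # discounted cost / output
        print(f"NPV of costs: {npv_cost:.1f} M$")
        print(f"levelized cost: {levelized * 1000:.0f} $/kg HM")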

  12. Applications of a damage tolerance analysis methodology in aircraft design and production

    Science.gov (United States)

    Woodward, M. R.; Owens, S. D.; Law, G. E.; Mignery, L. A.

    1992-01-01

    Objectives of customer mandated aircraft structural integrity initiatives in design are to guide material selection, to incorporate fracture resistant concepts in the design, to utilize damage tolerance based allowables and planned inspection procedures necessary to enhance the safety and reliability of manned flight vehicles. However, validated fracture analysis tools for composite structures are needed to accomplish these objectives in a timely and economical manner. This paper briefly describes the development, validation, and application of a damage tolerance methodology for composite airframe structures. A closed-form analysis code, entitled SUBLAM was developed to predict the critical biaxial strain state necessary to cause sublaminate buckling-induced delamination extension in an impact damaged composite laminate. An embedded elliptical delamination separating a thin sublaminate from a thick parent laminate is modelled. Predicted failure strains were correlated against a variety of experimental data that included results from compression after impact coupon and element tests. An integrated analysis package was developed to predict damage tolerance based margin-of-safety (MS) using NASTRAN generated loads and element information. Damage tolerance aspects of new concepts are quickly and cost-effectively determined without the need for excessive testing.

  13. PETA: Methodology of Information Systems Security Penetration Testing

    Directory of Open Access Journals (Sweden)

    Tomáš Klíma

    2016-12-01

    Full Text Available Current methodologies for penetration testing of information systems focus mainly on high-level and technical descriptions of the testing process. Unfortunately, there is no methodology focused primarily on the management of these tests. This often results in a situation in which tests are badly planned and managed, and the vulnerabilities found are remediated unsystematically. The goal of this article is to present a new methodology, called PETA, which is focused mainly on the management of penetration tests. The development of this methodology was based on a comparative analysis of current methodologies. The new methodology incorporates current best practices of IT governance and project management, represented by COBIT and PRINCE2 principles. The presented methodology has been quantitatively evaluated.

  14. State of the art in HGPT (Heuristically Based Generalized Perturbation) methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1993-01-01

    A distinctive feature of the heuristically based generalized perturbation theory (HGPT) methodology consists in the systematic use of importance conservation concepts. As is well known, this use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived. The state of the art of the HGPT methodology is illustrated here, and its application to a number of specific nonlinear fields of interest is commented upon. (author)

  15. Methodology for the analysis and retirement of assets: Power transformers

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Gómez-Ramírez

    2015-09-01

    Full Text Available The following article develops a high-voltage engineering methodology for the analysis and retirement of repaired power transformers, based on engineering criteria, in order to establish a correlation between the conditions of the transformer from several points of view: electrical, mechanical, dielectric and thermal. An analysis of the state of the art reveals two situations of great significance. First, the international procedures are a "guide" for the acceptance of new transformers, so they cannot be applied to the letter for repaired transformers, owing to the degradation process the transformer has undergone over the years and to all the factors that may have led to the repair. Second, based on the most recent technical literature, articles analysing dielectric oil and insulating paper have been reviewed, in which correlations are established between the quality of the insulating paper and the furan concentrations in the oils. Finally, a great part of the research carried out so far has focused on analysing the transformer from the condition of the dielectric oil, so in most cases it has not been possible to perform forensic engineering inside a transformer in operation and thereby analyse the design components that can compromise its integrity and operability.

  16. METHODOLOGICAL APPROACHES TO THE ANALYSIS OF EFFICIENCY OF CASH FLOW MANAGEMENT IN INVESTMENT ACTIVITY OF THE ENTERPRISES

    OpenAIRE

    I. Magdych

    2015-01-01

    The article explores methodological approaches to the analysis of cash flows in the investment activity of an enterprise: the system of net cash flow movements, reflecting the impact of cash management efficiency on the amount and sources of the investment cash flows of the enterprise; and a proposed analytical model for determining the effectiveness of the enterprise's cash management, based on selected modeling principles and a comprehensive analysis of cash flows in investing activities and their o...

  17. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a level-I probabilistic safety assessment. The work covers: the correct association of distributions to parameters, the importance and qualification of expert opinions, the generation of samples according to sample size, and the study of the relationships among system variables and the system response. A series of statistical-mathematical techniques is recommended along the development of the analysis methodology, as well as different graphical visualizations for the control of the study. (author) [es
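
    Two of the listed steps lend themselves to a short sketch: generating a parameter sample of a given size, and studying the relationship between system variables and the system response via rank correlation. The response function and the distributions below are invented placeholders, not a real level-I PSA model.

        import random
        from scipy.stats import spearmanr

        random.seed(7)
        N = 500
        lam_pump = [random.lognormvariate(-9.0, 0.7) for _ in range(N)]
        lam_valve = [random.lognormvariate(-10.0, 0.5) for _ in range(N)]
        p_human = [random.betavariate(2, 200) for _ in range(N)]

        # placeholder "system response": a toy two-cut-set core damage frequency
        cdf = [lp * ph + lv for lp, lv, ph in zip(lam_pump, lam_valve, p_human)]

        for name, xs in [("pump", lam_pump), ("valve", lam_valve), ("human", p_human)]:
            rho, _ = spearmanr(xs, cdf)
            print(f"rank correlation of {name} parameter with response: {rho:+.2f}")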

  18. A methodology for selection of wind energy system locations using multicriterial analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sansevic, M.; Rabadan, Lj. Pilic [Croatia Univ., Faculty of Electrical Engineering, Mechanical Engineering and Naval Architecture, Split (Croatia)

    1996-12-31

    The effectiveness of a wind turbine generator depends not only on its performance but also on the site's wind resource. Thus, the problem of location selection should be approached systematically, by considering a set of relevant parameters, particularly those having a significant economic and ecological impact. This paper presents the methodology used in selecting locations for the operation of wind energy systems. It is based on a multicriterial analysis which enables the comparison and ranking of locations according to a set of different parameters. The principal objectives (criteria) in location selection are: energy-economic, technical-technological, physical planning, and environment and life protection objectives. For the mathematical modeling of this multicriterial problem the PROMETHEE method is chosen, which was developed especially for the solution of rather "poorly" structured problems, thus justifying its application in the preliminary stage of site selection for wind energy systems. The developed methodology is applied to selecting locations on the island of Rhodes, using the available database of the Geographic Information System and the wind potential data obtained by means of the AIOLOS program. (Author)
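
    A compact sketch of PROMETHEE II with the simplest (usual, type I) preference function is given below: each pair of sites is compared criterion by criterion, and the weighted net outranking flow ranks the sites. The criteria values and weights are invented; the actual Rhodes study drew on GIS and AIOLOS wind data.

        import numpy as np

        sites = ["A", "B", "C"]
        # columns: wind potential (benefit), grid distance (cost), impact (cost)
        scores = np.array([[7.2, 12.0, 3.0],
                           [6.1,  4.0, 2.0],
                           [6.8,  8.0, 5.0]])
        maximize = np.array([True, False, False])
        weights = np.array([0.5, 0.3, 0.2])

        n = len(sites)
        phi = np.zeros(n)                       # net outranking flows
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                diff = np.where(maximize, scores[i] - scores[j], scores[j] - scores[i])
                pref = weights[diff > 0].sum()  # usual criterion: 0/1 preference
                phi[i] += pref / (n - 1)
                phi[j] -= pref / (n - 1)
        print(sorted(zip(phi, sites), reverse=True))   # net flow -> ranking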

  19. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    Science.gov (United States)

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.

  20. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Full Text Available Many cost overruns in the world of construction are attributable either to unforeseen events or to foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement in project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for the appropriate allocation of contingency in project cost estimation. In the first step, project risks are identified. The influence diagramming technique is employed to identify and to show how the risks affect the project cost elements, as well as the relationships among the risks themselves. The second step is to assess the project costs with regard to the risks under consideration. Using a linguistic approach, the degree of uncertainty of the identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. In the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingency, i.e., project contingency and management reserve, are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.
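
    A toy Monte Carlo version of the final step is sketched below: each run applies the identified risks to the affected cost elements, and the two contingency types are read from percentiles of the simulated total. The cost elements, risk probabilities, impacts and percentile choices are all invented for illustration.

        import random

        BASE = {"earthworks": 1.2e6, "structure": 3.4e6, "finishes": 0.9e6}

        random.seed(11)
        totals = []
        for _ in range(20000):
            ground = random.random() < 0.15     # ground-condition risk occurs?
            weather = random.random() < 0.30    # weather risk occurs?
            cost = BASE["earthworks"] * (1.25 if ground else 1.0)
            cost += BASE["structure"] * (1.10 if weather else 1.0)
            cost += BASE["finishes"] * random.uniform(0.95, 1.15)
            totals.append(cost)

        totals.sort()
        base = sum(BASE.values())
        p70 = totals[int(0.70 * len(totals))]
        p95 = totals[int(0.95 * len(totals))]
        print(f"project contingency (P70 - base): {p70 - base:,.0f}")
        print(f"management reserve (P95 - P70):   {p95 - p70:,.0f}")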

  1. Vulnerability curves vs. vulnerability indicators: application of an indicator-based methodology for debris-flow hazards

    Science.gov (United States)

    Papathoma-Köhle, Maria

    2016-08-01

    The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most of the cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable amount of studies argue that vulnerability assessment should focus on the identification of these variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where in the past a vulnerability curve has been developed. The relatively "new" indicator-based method is being scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since both methodological approaches deal with vulnerability in a different way. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.

  2. Development of the fire PSA methodology and the fire analysis computer code system

    International Nuclear Information System (INIS)

    Katsunori, Ogura; Tomomichi, Ito; Tsuyoshi, Uchida; Yusuke, Kasagawa

    2009-01-01

    A fire PSA methodology has been developed and applied to NPPs in Japan for power operation and low-power and shutdown (LPSD) states. The CDFs of the preliminary fire PSA for power operation were higher than those of internal events. A fire propagation analysis code system (CFAST/FDS Network) is being developed and verified through the OECD PRISME project. Extension of the scope to the LPSD state is planned in order to figure out the risk level. In order to figure out the fire risk level precisely, enhancements of the methodology are planned: verification and validation of the phenomenological fire propagation analysis code (CFAST/FDS Network) in the context of fire PSA, and application of the "electric circuit analysis" of NUREG/CR-6850 and related tests in order to quantify the hot-short effect precisely. Development of a seismic-induced fire PSA method, integrating the existing seismic PSA and fire PSA methods, is ongoing. The fire PSA will be applied to review the validity of fire prevention and mitigation measures.

  3. The GDOR model. A new methodology for the analysis of training needs: The case of Andorra

    Directory of Open Access Journals (Sweden)

    Marc Eguiguren Huerta

    2012-09-01

    Full Text Available Purpose: This article investigates the status and importance of continuing training in companies in the Principality of Andorra and its impact on the economic development of the country. Design/methodology/approach: The analysis is based on GDOR, a methodology developed by the authors that links training decisions to economic indicators and ratios. By using GDOR, the authors explore and understand the current situation and the training needs of the main sectors of the Andorran economy. Findings: The findings incorporate a different view of lifelong learning training needs in Andorra, much more closely related to the development needs of the country. Originality/value: With reference to best practice from different countries, particularly those in Europe, an original proposal to address those training needs is presented, including recommendations to the country's authorities on how to manage lifelong learning policies.

  4. Methodology for Selecting Best Management Practices Integrating Multiple Stakeholders and Criteria. Part 1: Methodology

    Directory of Open Access Journals (Sweden)

    Mauricio Carvallo Aceves

    2016-02-01

    Full Text Available The implementation of stormwater Best Management Practices (BMPs) could help re-establish the natural hydrological cycle of watersheds after urbanization, with each BMP presenting a different performance across a range of criteria (flood prevention, pollutant removal, etc.). Additionally, conflicting views from the relevant stakeholders may arise, resulting in a complex selection process. This paper proposes a methodology for BMP selection based on the application of multi-criteria decision aid (MCDA) methods, integrating multiple stakeholder priorities and BMP combinations. First, in the problem definition, the MCDA methods, relevant criteria and design guidelines are selected. Next, information from the preliminary analysis of the watershed is used to obtain a list of relevant BMPs. The third step comprises the watershed modeling and the analysis of the BMP alternatives to obtain performance values across purely objective criteria. Afterwards, a stakeholder analysis based on surveys is carried out to obtain social performance values and criteria priorities. Then, the MCDA methods are applied to obtain the final BMP rankings. The last step considers the sensitivity analysis and rank comparisons in order to draw the final conclusions and recommendations. Future improvements to the methodology could explore the inclusion of multi-objective analysis and alternative means of obtaining social performance values.

  5. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    Science.gov (United States)

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and the reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in this analysis, including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offer the potential to objectively analyze report uncertainty in real time and to correlate it with outcomes analysis for the purpose of context- and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
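
    The simplest of the strategies listed above, string search, already yields a crude uncertainty score, as the sketch below shows; the hedging lexicon, weights and normalization are invented, and a production system would need NLP to handle negation and context.

        import re

        HEDGES = {"possibly": 2, "cannot be excluded": 3, "suggestive of": 2,
                  "likely": 1, "consistent with": 1}

        def uncertainty_score(report: str) -> float:
            text = report.lower()
            hits = {h: len(re.findall(re.escape(h), text)) for h in HEDGES}
            score = sum(HEDGES[h] * n for h, n in hits.items())
            words = len(text.split())
            return 100.0 * score / max(words, 1)   # weighted hits per 100 words

        report = ("Findings are suggestive of early pneumonia; a small effusion "
                  "cannot be excluded. Appearance is consistent with prior study.")
        print(f"uncertainty score: {uncertainty_score(report):.1f}")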

  6. Methodological frontier in operational analysis for roundabouts: a review

    Directory of Open Access Journals (Sweden)

    Orazio Giuffre'

    2016-11-01

    Full Text Available Several studies have shown that modern roundabouts are safe and effective as engineering countermeasures for traffic calming, and they are now widely used worldwide. The increasing use of roundabouts and, more recently, of turbo and flower roundabouts has produced a great variety of experience in the fields of intersection design, traffic safety and capacity modelling. As for unsignalized intersections, which represent the starting point for extending knowledge about operational analysis to roundabouts, the general situation in capacity estimation is still characterized by the debate between gap acceptance models and empirical regression models. However, capacity modelling must contain both the analytical construction and solution of the model and the implementation of driver behavior. Thus, issues concerning the realistic modelling of driver behavior through the parameters included in the models are always of interest for practitioners and analysts in transportation and road infrastructure engineering. Based on these considerations, this paper presents a literature review of the key methodological issues in the operational analysis of modern roundabouts. Focus is placed on the aspects associated with gap acceptance behavior, the derivation of the analytically based models and the calculation of the parameters included in the capacity equations, as well as on steady-state and non-steady-state conditions and uncertainty in entry capacity estimation. Finally, insights into future developments of research in this field are outlined.
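
    As one concrete member of the gap-acceptance family discussed above, the sketch below evaluates Siegloch's entry-capacity formula, in which the critical gap tc and the follow-up time tf are exactly the driver-behaviour parameters whose realistic estimation the review emphasizes. The parameter values are typical textbook figures, not site data.

        from math import exp

        def entry_capacity(q_circ, tc=4.1, tf=2.9):
            """Siegloch formula: q_circ in veh/h, returns capacity in veh/h."""
            t0 = tc - tf / 2.0
            return (3600.0 / tf) * exp(-q_circ / 3600.0 * t0)

        for q in (0, 300, 600, 900):
            print(f"circulating {q:4d} veh/h -> entry capacity {entry_capacity(q):6.0f} veh/h")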

  7. A methodology of SiP testing based on boundary scan

    Science.gov (United States)

    Qin, He; Quan, Haiyang; Han, Yifei; Zhu, Tianrui; Zheng, Tuo

    2017-10-01

    System in Package (SiP) plays an important role in portable, aerospace and military electronics thanks to its microminiaturization, light weight, high density, and high reliability. At present, SiP system testing encounters problems of system complexity and fault location as system scale increases exponentially. For SiP systems, this paper proposes a testing methodology and testing process based on boundary scan technology. Combining the characteristics of SiP systems and referencing boundary scan theory for PCB circuits and embedded core test, a specific testing methodology and process is proposed. The hardware requirements of the SiP system under test are provided, and the hardware platform for the testing has been constructed. The testing methodology has the characteristics of high test efficiency and accurate fault location.

  8. Structural health monitoring methodology for aircraft condition-based maintenance

    Science.gov (United States)

    Saniger, Jordi; Reithler, Livier; Guedra-Degeorges, Didier; Takeda, Nobuo; Dupuis, Jean Pierre

    2001-06-01

    Reducing maintenance costs while keeping a constant level of safety is a major issue for air forces and airlines. The long-term perspective is to implement condition-based maintenance to guarantee a constant safety level while decreasing maintenance costs. For this purpose, the development of a generalized Structural Health Monitoring System (SHMS) is needed. The objective of such a system is to localize damage and to assess its severity with enough accuracy to allow low-cost corrective actions. The present paper describes an SHMS based on acoustic emission technology; this choice was driven by its reliability and wide use in the aerospace industry. The described SHMS uses a new learning methodology which relies on the generation of artificial acoustic emission events on the structure and on an acoustic emission sensor network. The calibrated acoustic emission events picked up by the sensors constitute the knowledge set that the system relies on. With this methodology, the anisotropy of composite structures is taken into account, thus avoiding the major cause of error of classical localization methods. Moreover, it is adaptive to different structures, as it does not rely on any particular model but on measured data. The acquired data are processed, and the event's location and corrected amplitude are computed. The methodology has been demonstrated, and experimental tests on elementary samples showed an accuracy of 1 cm.
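
    The learning methodology lends itself to a simple nearest-neighbour reading, sketched below: artificial AE events at known positions provide reference arrival-time patterns, and a new event is located by matching its pattern to the closest references, so whatever anisotropy is present in the measured patterns is captured automatically. The geometry, wave speed and noise level are invented, and the reference patterns here are computed rather than measured.

        import numpy as np

        rng = np.random.default_rng(0)
        sensors = np.array([[0, 0], [500, 0], [0, 500], [500, 500]])   # mm
        ref_positions = rng.uniform(0, 500, size=(200, 2))  # calibration grid
        # arrival-time patterns (in reality measured, which captures anisotropy)
        ref_patterns = np.linalg.norm(ref_positions[:, None] - sensors, axis=2) / 5.0

        def locate(pattern, k=3):
            d = np.linalg.norm(ref_patterns - pattern, axis=1)
            nearest = np.argsort(d)[:k]
            return ref_positions[nearest].mean(axis=0)   # average of k matches

        true = np.array([120.0, 340.0])
        measured = np.linalg.norm(true - sensors, axis=1) / 5.0 + rng.normal(0, 0.2, 4)
        print("estimated position (mm):", locate(measured).round(1))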

  9. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

    An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector, and with gamma attenuation factors calculated using MCNP-5. Both the relative and the absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.

  10. FRACTAL ANALYSIS OF TRABECULAR BONE: A STANDARDISED METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ian Parkinson

    2011-05-01

    Full Text Available A standardised methodology for the fractal analysis of histological sections of trabecular bone has been established. A modified box-counting method has been developed for use on a PC-based image analyser (Quantimet 500MC, Leica Cambridge). The effect of image analyser settings, magnification, image orientation and threshold levels was determined. Also, the range of scale over which trabecular bone is effectively fractal was determined, and a method was formulated to objectively calculate more than one fractal dimension from the modified Richardson plot. The results show that magnification, image orientation and threshold settings have little effect on the estimate of fractal dimension. Trabecular bone has a lower limit below which it is not fractal (λ < 25 μm), and the upper limit is 4250 μm. There are three distinct fractal dimensions for trabecular bone (sectional fractals), with magnitudes greater than 1.0 and less than 2.0. It has been shown that trabecular bone is effectively fractal over a defined range of scale. Also, within this range there is more than one fractal dimension, describing spatial structural entities. Fractal analysis is a model-independent method for describing a complex multifaceted structure, which can be adapted for the study of other biological systems. This may be at the cell, tissue or organ level, and it complements conventional histomorphometric and stereological techniques.
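
    A minimal box-counting sketch in the spirit of the modified method described above follows: occupied boxes are counted at several scales, and the fractal dimension is the slope of log(count) against log(1/size) over the range where the structure is effectively fractal. The test image is synthetic noise, not histology, so the estimate simply approaches 2.

        import numpy as np

        def box_count(img, size):
            h, w = img.shape
            count = 0
            for y in range(0, h, size):
                for x in range(0, w, size):
                    if img[y:y + size, x:x + size].any():
                        count += 1
            return count

        rng = np.random.default_rng(3)
        img = rng.random((256, 256)) < 0.08     # placeholder binary "section"

        sizes = np.array([2, 4, 8, 16, 32])
        counts = np.array([box_count(img, s) for s in sizes])
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        print(f"estimated (sectional) fractal dimension: {slope:.2f}")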

  11. A Generalizable Methodology for Quantifying User Satisfaction

    Science.gov (United States)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction, and discuss how they can be exploited to improve user experience and optimize resource allocation.
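
    The survival-analysis view is easy to make concrete: session times are treated as lifetimes, sessions still open at the end of the measurement are censored, and a Kaplan-Meier curve summarizes how long users stay. The data below are invented; the models in the paper go further than this nonparametric summary.

        def kaplan_meier(durations, observed):
            """durations: session lengths; observed: False if censored."""
            events = sorted(zip(durations, observed))
            at_risk, survival, curve = len(events), 1.0, []
            for t, obs in events:
                if obs:                          # a session actually ended at t
                    survival *= (at_risk - 1) / at_risk
                    curve.append((t, survival))
                at_risk -= 1                     # either way, leaves the risk set
            return curve

        durations = [3, 8, 12, 15, 15, 22, 40, 41, 60, 75]    # minutes
        observed = [True, True, False, True, True, True, False, True, True, False]
        for t, s in kaplan_meier(durations, observed):
            print(f"P(session > {t:2d} min) = {s:.2f}")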

  12. Methodology for cloud-based design of robots

    Science.gov (United States)

    Ogorodnikova, O. M.; Vaganov, K. A.; Putimtsev, I. D.

    2017-09-01

    This paper presents some important results from the cloud-based design of a robot arm by a group of students. A methodology for cloud-based design was developed and used to initiate an interdisciplinary project on the research and development of a specific manipulator. All project data files were hosted by the Ural Federal University data center. The 3D (three-dimensional) model of the robot arm was created using Siemens PLM (Product Lifecycle Management) software and structured as a complex mechatronics product by means of the Siemens Teamcenter thin client; all processes were performed in the cloud. The robot arm was designed to load blanks of up to 1 kg into the work space of a milling machine for student research.

  13. Abnormal condition and events analysis for instrumentation and control systems. Volume 1: Methodology for nuclear power plant digital upgrades. Final report

    International Nuclear Information System (INIS)

    McKemy, S.; Marcelli, M.; Boehmer, N.; Crandall, D.

    1996-01-01

    The ACES project was initiated to identify a cost-effective methodology for addressing abnormal conditions and events (ACES) in digital upgrades to nuclear power plant systems, as introduced by IEEE Standard 7-4.3.2-1993. Several methodologies and techniques currently in use in the defense, aerospace, and other communities for the assurance of digital safety systems were surveyed, and although several were shown to possess desirable qualities, none sufficiently met the needs of the nuclear power industry. This report describes a tailorable methodology for performing ACES analysis that is based on the more desirable aspects of the reviewed methodologies and techniques. The methodology is applicable to both safety- and non-safety-grade systems; addresses hardware, software, and system-level concerns; and can be applied in either a lifecycle or post-design timeframe. Employing this methodology for safety systems should facilitate the digital upgrade licensing process.

  14. Application of best estimate and uncertainty safety analysis methodology to loss of flow events at Ontario's Power Generation's Darlington Nuclear Generating Station

    International Nuclear Information System (INIS)

    Huget, R.G.; Lau, D.K.; Luxat, J.C.

    2001-01-01

    Ontario Power Generation (OPG) is currently developing a new safety analysis methodology based on best estimate and uncertainty (BEAU) analysis. The framework and elements of the new safety analysis methodology are defined. The evolution of safety analysis technology at OPG has been thoroughly documented. Over the years, the use of conservative limiting assumptions in OPG safety analyses has led to gradual erosion of predicted safety margins. The main purpose of the new methodology is to provide a more realistic quantification of safety margins within a probabilistic framework, using best estimate results, with an integrated accounting of the underlying uncertainties. Another objective of the new methodology is to provide a cost-effective means for on-going safety analysis support of OPG's nuclear generating stations. Discovery issues and plant aging effects require that the safety analyses be periodically revised and, in the past, the cost of reanalysis at OPG has been significant. As OPG enters the new competitive marketplace for electricity, there is a strong need to conduct safety analysis in a less cumbersome manner. This paper presents the results of the first licensing application of the new methodology in support of planned design modifications to the shutdown systems (SDSs) at Darlington Nuclear Generating Station (NGS). The design modifications restore dual trip parameter coverage over the full range of reactor power for certain postulated loss-of-flow (LOF) events. The application of BEAU analysis to the single heat transport pump trip event provides a realistic estimation of the safety margins for the primary and backup trip parameters. These margins are significantly larger than those predicted by conventional limit of the operating envelope (LOE) analysis techniques. (author)
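
    A standard ingredient of best estimate and uncertainty analyses of this kind is a non-parametric tolerance bound on the code output, commonly obtained from Wilks' formula. The snippet below computes the familiar first-order result; it is a generic illustration, not the OPG methodology itself.

```python
import math

def wilks_n(beta=0.95, gamma=0.95):
    """Smallest number of code runs n such that the largest observed
    output bounds the beta-quantile with confidence gamma:
    1 - beta**n >= gamma (first-order, one-sided)."""
    return math.ceil(math.log(1.0 - gamma) / math.log(beta))

print(wilks_n())  # -> 59 runs for a one-sided 95%/95% tolerance bound
```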

  15. A methodology for developing high-integrity knowledge base using document analysis and ECPN matrix analysis with backward simulation

    International Nuclear Information System (INIS)

    Park, Joo Hyun

    1999-02-01

    When transitions occur in large systems such as nuclear power plants (NPPs) or industrial process plants, it is often difficult to diagnose them. Various computer-based operator-aiding systems have been developed to help operators diagnose plant transitions. In the procedures for developing knowledge base systems such as operator-aiding systems, knowledge acquisition and knowledge base verification are the core activities. This dissertation describes a knowledge acquisition method and a knowledge base verification method for developing high-integrity knowledge bases for NPP expert systems. Knowledge acquisition is one of the most difficult and time-consuming activities in developing knowledge base systems. There are two kinds of knowledge acquisition methods with respect to knowledge sources. One is acquisition from human experts. This method, however, is not adequate for acquiring the knowledge of NPP expert systems because the number of experts is not sufficient. In this work, we propose a novel knowledge acquisition method based on document analysis. The knowledge base can be built correctly, rapidly, and partially automatically through this method, which is especially useful when it is difficult to find domain experts. The reliability of knowledge base systems depends on the quality of their knowledge base. Petri Nets have been used to verify knowledge bases because of their formal outputs. Methods using Petri Nets, however, are difficult to apply to large and complex knowledge bases because the net becomes very large and complex. Also, with Petri Nets, it is difficult to find the input patterns that make anomalies occur. To overcome this difficulty, in this work, anomaly candidate detection methods are developed based on Extended CPN (ECPN) matrix analysis. This work also defines the backward simulation of CPN to find compact input patterns for anomaly detection, which starts simulation from the anomaly candidates

  16. Methodological Analysis of Gregarious Behaviour of Agents in the Financial Markets

    OpenAIRE

    Solodukhin Stanislav V.

    2013-01-01

    The article considers methodological approaches to the analysis of gregarious behaviour of agents in the financial markets, and also studies the foundations of agent-based modelling of decision-making processes that take the gregarious instinct into consideration.

  17. Electroencephalogram-based methodology for determining unconsciousness during depopulation.

    Science.gov (United States)

    Benson, E R; Alphin, R L; Rankin, M K; Caputo, M P; Johnson, A L

    2012-12-01

    When an avian influenza or virulent Newcastle disease outbreak occurs within commercial poultry, key steps involved in managing a fast-moving poultry disease can include: education; biosecurity; diagnostics and surveillance; quarantine; elimination of infected poultry through depopulation or culling, disposal, and disinfection; and decreasing host susceptibility. Available mass emergency depopulation procedures include whole-house gassing, partial-house gassing, containerized gassing, and water-based foam. To evaluate potential depopulation methods, it is often necessary to determine the time to the loss of consciousness (LOC) in poultry. Many current approaches to evaluating LOC are qualitative and require visual observation of the birds. This study outlines an electroencephalogram (EEG) frequency domain-based approach for determining the point at which a bird loses consciousness. In this study, commercial broilers were used to develop the methodology, and the methodology was validated with layer hens. In total, 42 data sets from 13 broilers aged 5-10 wk and 12 data sets from four spent hens (age greater than 1 yr) were collected and analyzed. A wireless EEG transmitter was surgically implanted, and each bird was monitored during individual treatment with isoflurane anesthesia. EEG data were evaluated using a frequency-based approach. The alpha/delta (A/D, alpha: 8-12 Hz, delta: 0.5-4 Hz) ratio and loss of posture (LOP) were used to determine the point at which the birds became unconscious. Unconsciousness, regardless of the method of induction, causes suppression in alpha and a rise in the delta frequency component, and this change is used to determine unconsciousness. There was no statistically significant difference between time to unconsciousness as measured by A/D ratio or LOP, and the A/D values were correlated at the times of unconsciousness. The correlation between LOP and A/D ratio indicates that the methodology is appropriate for determining
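
    The alpha/delta ratio used above is straightforward to compute from an EEG power spectrum. The following Python sketch shows one plausible implementation using Welch's method; the sampling rate, window length and synthetic signal are assumptions, not the study's recording parameters.

```python
import numpy as np
from scipy.signal import welch

def band_power(f, pxx, lo, hi):
    """Integrate the power spectral density over a frequency band."""
    mask = (f >= lo) & (f < hi)
    return np.trapz(pxx[mask], f[mask])

def alpha_delta_ratio(eeg, fs):
    """A/D ratio as described above: alpha 8-12 Hz over delta 0.5-4 Hz."""
    f, pxx = welch(eeg, fs=fs, nperseg=int(4 * fs))  # 4 s windows (assumed)
    return band_power(f, pxx, 8, 12) / band_power(f, pxx, 0.5, 4)

# Synthetic 30 s trace: strong 10 Hz alpha component plus noise.
fs = 250
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
print(f"A/D ratio: {alpha_delta_ratio(eeg, fs):.2f}")
```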

  18. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus-based Real Time Computer (RTC) systems are deployed for the Instrumentation & Control of PFBR. RTC systems have to perform safety functions within stipulated times, which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for RTC systems, starting from the requirement capture phase up to the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics, such as cyclomatic complexity, nesting depth and comment-to-code ratio, are checked. Test cases are generated using equivalence class partitioning, boundary value analysis, and cause-and-effect graphing techniques. System integration testing is carried out wherein the functional and performance requirements of the system are monitored.

  19. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus-based Real Time Computer (RTC) systems are deployed for the Instrumentation & Control of PFBR. RTC systems have to perform safety functions within stipulated times, which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for RTC systems, starting from the requirement capture phase up to the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics, such as cyclomatic complexity, nesting depth and comment-to-code ratio, are checked. Test cases are generated using equivalence class partitioning, boundary value analysis, and cause-and-effect graphing techniques. System integration testing is carried out wherein the functional and performance requirements of the system are monitored.
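
    Of the test design techniques listed above, boundary value analysis is the easiest to illustrate. The sketch below generates the classic boundary picks for a numeric input range; the 0-100% setpoint range and class representatives are invented for illustration.

```python
def boundary_values(lo, hi):
    """Classic boundary-value picks for a valid integer range [lo, hi]."""
    return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]

# Hypothetical requirement: a setpoint input accepts 0..100 (%).
# One representative per equivalence class plus the boundary cases:
cases = {"invalid_low": -10, "valid": 50, "invalid_high": 140}
print(sorted(set(boundary_values(0, 100))), cases)
```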

  20. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map, with its unidirectional causality feature. Despite its apparent popularity, this causality assumption has been rigorously criticised by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using an econometric analysis, the Granger causality test, on 45 data points. However, well-established causality models proved insufficient, as only 40% of the causal linkages were supported by the data. Expert knowledge has been suggested for situations where historical data are insufficient. The Delphi method was therefore selected and conducted to obtain consensus on the existence of causality among 15 selected experts over three rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. Both methods found evidence of bidirectional causality, demonstrating significant dynamic environmental complexity through interaction among the measures. A computer model and simulation using the System Dynamics (SD) methodology was therefore developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its ability to mimic reality and its robustness and validity as a platform for causality analysis. This study applied a theoretical service management model within the BSC domain to a practical situation, using the SD methodology, where very limited work has been done.
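
    For readers unfamiliar with the Granger test mentioned above, the sketch below runs it on two synthetic series with statsmodels; the 45-point sample size mirrors the study, but the data, lag choice and variable meanings are invented.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Toy stand-in for the 45 BSC data points: does measure x (e.g., employee
# satisfaction) Granger-cause measure y (e.g., customer satisfaction)?
rng = np.random.default_rng(1)
x = rng.standard_normal(45)
y = np.r_[0.0, 0.8 * x[:-1]] + 0.2 * rng.standard_normal(45)  # y lags x

# Column order matters: the test asks whether column 2 Granger-causes
# column 1, here whether x helps predict y beyond y's own past.
res = grangercausalitytests(np.column_stack([y, x]), maxlag=2)
p = res[1][0]["ssr_ftest"][1]  # F-test p-value at lag 1
print(f"lag-1 p-value: {p:.4f}")
```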

  1. Extending the input–output energy balance methodology in agriculture through cluster analysis

    International Nuclear Information System (INIS)

    Bojacá, Carlos Ricardo; Casilimas, Héctor Albeiro; Gil, Rodrigo; Schrevens, Eddie

    2012-01-01

    The input–output balance methodology has been applied to characterize the energy balance of agricultural systems. This study proposes to extend the methodology with multivariate analysis to reveal particular patterns in the energy use of a system. The objective was to demonstrate the usefulness of multivariate exploratory techniques for analyzing the variability found in a farming system and for establishing efficiency categories that can be used to improve the energy balance of the system. For this purpose, an input–output analysis was applied to the major greenhouse tomato production area in Colombia. Individual energy profiles were built, and the k-means clustering method was applied to the production factors. On average, the production system in the study zone consumes 141.8 GJ ha−1 to produce 96.4 GJ ha−1, resulting in an energy efficiency of 0.68. With the k-means clustering analysis, three clusters of farmers were identified, with energy efficiencies of 0.54, 0.67 and 0.78. The most energy-efficient cluster grouped 56.3% of the farmers. It is possible to optimize the production system by improving the management practices of those with the lowest energy use efficiencies. Multivariate analysis techniques proved to be a complementary pathway to improving the energy efficiency of a system. -- Highlights: ► An input–output energy balance was estimated for greenhouse tomatoes in Colombia. ► We used the k-means clustering method to classify growers based on their energy use. ► Three clusters of growers were found, with energy efficiencies of 0.54, 0.67 and 0.78. ► Overall system optimization is possible by improving the energy use of the less efficient growers.
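
    A grower classification of the kind described can be reproduced with scikit-learn's k-means. In the sketch below, the energy profiles are randomly generated stand-ins for the Colombian farm data, and the column layout is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy energy profiles (GJ/ha) per farm: three input columns (e.g.,
# fertiliser, fuel, electricity) and one crop energy output column.
rng = np.random.default_rng(2)
inputs = rng.uniform(80, 200, size=(30, 3))
output = rng.uniform(60, 130, size=(30, 1))
X = np.hstack([inputs, output])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for k in range(3):
    members = X[km.labels_ == k]
    eff = members[:, 3] / members[:, :3].sum(axis=1)  # output / total input
    print(f"cluster {k}: {len(members)} farms, mean efficiency {eff.mean():.2f}")
```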

  2. Bioclim deliverable D8a: development of the rule-based down-scaling methodology for BIOCLIM Work-package 3

    International Nuclear Information System (INIS)

    2003-01-01

    …BidiC output. The original rule-based down-scaling methodology has been modified in order to be as objective and quantitative as possible. However, some of the steps still inevitably involve subjective judgement and qualitative analysis. Thus, in order to provide complete traceability and detailed guidance for implementation of the methodology in other regions, the detailed region-specific work involved at Steps 1, 3 and 4 is reported in appendices to this deliverable. The rule-based down-scaling methodology developed here is based on an earlier study for Central England. Thus, development initially focused on Central England, with similar work then undertaken for the other two regions (drawing on the expertise of the French and Spanish country teams and modifying the analyses as appropriate to reflect inter-regional climatic variations). The Central England work therefore tends to be reported in somewhat greater detail. According to the BIOCLIM proposal, it was intended to apply the rule-based down-scaling methodology to MoBidiC output only. However, after completion of Step 5: Identification of appropriate MoBidiC simulations and variables for the identification of down-scaling rules/thresholds, it became clear that the methodology could also be applied to the BIOCLIM simulations for the next one million years performed with LLN 2-D NH. The application of the down-scaling methodology to these simulations is outlined in Section 3. Finally, Section 4 provides a summary and conclusions of the rule-based down-scaling methodology.

  3. Keeping it pure – a pedagogical case study of teaching soft systems methodology in scenario and policy analysis

    Directory of Open Access Journals (Sweden)

    Ian Yeoman

    2016-09-01

    Full Text Available Purpose – Soft systems methodology (SSM) is well documented in the academic and management literature. Over the last 40 years, the methodology has come to be adapted depending on the tool users’ skills and experience in order to fit the problem. The purpose of this paper is to demonstrate good teaching and learning practice from a pedagogical perspective. Design/methodology/approach – Dr Ian Yeoman of Victoria University of Wellington provides a personal reflection on how the methodology is used in the teaching and learning of TOUR301 Tourism Policy and Planning as a policy and scenario analysis method. Findings – The paper articulates the seven stages of SSM, from the unstructured problem situation through Rich Pictures, vision and guiding principles, policy solutions, comparisons, and feasibility, to the implementation stage. The paper uses a series of teaching tasks to break down the complexity of the methodology, thus guiding students and teachers in how to deploy the methodology in the classroom. Originality/value – The paper demonstrates the reflective practice of SSM in action as an exemplar of good practice. It clearly articulates the stages of the methodology so that students and teachers can adopt this approach in classroom environments following a scaffolding learning approach. The use of teaching tasks throughout the paper helps bring clarity and order, enabling the teacher to effectively teach the subject and the students to learn. The most significant contribution of this paper is its articulation of good teaching practice in policy and scenario analysis, expressed through five learning lessons: facilitating a learning environment; the impact of visual thinking; political theory; the importance of incremental learning; and problem-based learning and international students.

  4. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error, constructing the technical basis for human reliability analysis (HRA). This study produced three main results. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the event diagnosis phase. The task complexity (TACOM) measure and a methodology for optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results (OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods) will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  5. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-01

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error, constructing the technical basis for human reliability analysis (HRA). This study produced three main results. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the event diagnosis phase. The task complexity (TACOM) measure and a methodology for optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results (OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods) will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  6. A methodology for the analysis of differential coexpression across the human lifespan.

    Science.gov (United States)

    Gillis, Jesse; Pavlidis, Paul

    2009-09-22

    Differential coexpression is a change in coexpression between genes that may reflect 'rewiring' of transcriptional networks. It has previously been hypothesized that such changes might occur over time in the lifespan of an organism. While both coexpression and differential expression of genes have previously been studied in life stage change or aging, differential coexpression has not. Generalizing differential coexpression analysis to many time points presents a methodological challenge. Here we introduce a method for analyzing changes in coexpression across multiple ordered groups (e.g., over time) and extensively test its validity and usefulness. Our method is based on the use of the Haar basis set to efficiently represent changes in coexpression at multiple time scales, and thus represents a principled and generalizable extension of the idea of differential coexpression to life stage data. We used published microarray studies categorized by age to test the methodology. We validated the methodology by testing our ability to reconstruct Gene Ontology (GO) categories using our measure of differential coexpression, and compared this result to using coexpression alone. Our method allows significant improvement in characterizing these groups of genes. Further, we examine the statistical properties of our measure of differential coexpression and establish that the results are significant both statistically and in terms of improved semantic similarity. In addition, we found that our method detects more significant changes in gene relationships than several other methods of expressing temporal relationships between genes, such as coexpression over time. Differential coexpression over age generates significant and biologically relevant information about the genes producing it. Our Haar basis methodology for determining age-related differential coexpression performs better than the other tested methods. The Haar basis set also lends itself to ready interpretation.
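
    The Haar representation at the heart of the method is easy to sketch: a correlation-versus-age series is decomposed into an overall mean plus detail coefficients at successively finer scales, so a life-stage 'rewiring' event shows up as a large coefficient at the matching scale. The Python below is a generic Haar transform on invented data, not the authors' pipeline.

```python
import numpy as np

def haar_transform(x):
    """Full Haar decomposition of a length-2**k series: returns the
    overall mean followed by detail coefficients, coarse to fine."""
    x = np.asarray(x, float)
    out = []
    while len(x) > 1:
        avg = (x[0::2] + x[1::2]) / 2.0
        det = (x[0::2] - x[1::2]) / 2.0
        out = list(det) + out          # coarser details end up in front
        x = avg
    return np.array([x[0]] + out)

# Toy data: correlation of one gene pair in 8 age-ordered groups, with a
# step change mid-life; the large coarse-scale coefficient flags it.
corr_by_age = np.array([0.8, 0.75, 0.82, 0.78, 0.2, 0.15, 0.22, 0.18])
print(np.round(haar_transform(corr_by_age), 3))
```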

  7. Methodological triangulation in work life research

    DEFF Research Database (Denmark)

    Warring, Niels

    Based on examples from two research projects on preschool teachers' work, the paper discusses potentials and challenges in methodological triangulation in work life research. Analysis of ethnographic and phenomenologically inspired observations of everyday life in day care centers formed the basis for individual interviews and informal talks with employees. The interviews and conversations were based on a critical hermeneutic approach. The analysis of observations and interviews constituted a knowledge base as the project went into its last phase: action research workshops. In the workshops, findings from...

  8. Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications. [computational fluid dynamics

    Science.gov (United States)

    Taylor, Arthur C., III; Hou, Gene W.

    1992-01-01

    Fundamental equations of aerodynamic sensitivity analysis and approximate analysis for the two dimensional thin layer Navier-Stokes equations are reviewed, and special boundary condition considerations necessary to apply these equations to isolated lifting airfoils on 'C' and 'O' meshes are discussed in detail. An efficient strategy which is based on the finite element method and an elastic membrane representation of the computational domain is successfully tested, which circumvents the costly 'brute force' method of obtaining grid sensitivity derivatives, and is also useful in mesh regeneration. The issue of turbulence modeling is addressed in a preliminary study. Aerodynamic shape sensitivity derivatives are efficiently calculated, and their accuracy is validated on two viscous test problems, including: (1) internal flow through a double throat nozzle, and (2) external flow over a NACA 4-digit airfoil. An automated aerodynamic design optimization strategy is outlined which includes the use of a design optimization program, an aerodynamic flow analysis code, an aerodynamic sensitivity and approximate analysis code, and a mesh regeneration and grid sensitivity analysis code. Application of the optimization methodology to the two test problems in each case resulted in a new design having a significantly improved performance in the aerodynamic response of interest.
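
    As context for the grid sensitivity discussion above, the 'brute force' baseline is plain finite differencing of the response with respect to each design variable, which costs two solver runs per variable. A toy Python illustration (with an analytic stand-in for the flow solver) is given below; the elastic-membrane strategy in the paper is designed precisely to avoid this cost.

```python
def central_diff_gradient(f, x, h=1e-6):
    """Brute-force sensitivity: one pair of evaluations per design
    variable; the costly baseline the paper's approach circumvents."""
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2.0 * h))
    return grad

# Toy "aerodynamic response": a smooth function of two shape variables.
f = lambda x: x[0] ** 2 + 3.0 * x[0] * x[1]
print(central_diff_gradient(f, [1.0, 2.0]))  # ~[8.0, 3.0]
```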

  9. A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children

    Science.gov (United States)

    Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

    2012-01-01

    Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

  10. Building a Cultural Heritage Corridor Based on Geodesign Theory and Methodology

    Directory of Open Access Journals (Sweden)

    Yang Chen

    Full Text Available ABSTRACT: Geodesign is a type of methodology that integrates dynamic environment modeling based on GIS with planning and design in order to support relevant decision making. It has substantially changed the dominant ways of thinking in planning and design, and has solved spatial issues relating to cultural and natural resources from a new perspective. Taking the Qionglai section of the Southern Silk Road as an example, the present study implemented geodesign theory and methods to investigate the technical approach to building a cultural heritage corridor based on GIS spatial analysis and overlay analysis. Firstly, we analyzed the various data layers of the cultural and natural features in the planning region. We organized all the data on the principle of classification, into categories such as natural, cultural, and recreational data. On this basis, we defined the theme of the Southern Silk Road as a historical cultural heritage corridor. Secondly, based on this background, the heritage corridor boundary was defined according to its natural, cultural, and administrative spatial characteristics, with the three thematic boundaries overlaid in order to define a boundary location area covering about 852 square kilometers. Next, we divided all of the resources into three categories: natural heritage resources, cultural heritage resources, and intangible heritage resources and recreational spaces. The elements which could be used to build up the cultural heritage corridor were selected by evaluation and spatial analysis. In this way, we obtained conclusive spatial information, such as element structures, the heritage density distribution, and the heritage number distribution. Finally, within the heritage boundary, we connected the tangible and intangible heritage resources to form various kinds of linear spaces, with the aim of obtaining the spatial pattern of the heritage corridor. KEYWORDS: Geodesign, heritage corridor, heritage

  11. A situational analysis methodology to inform comprehensive HIV prevention and treatment programming, applied in rural South Africa.

    Science.gov (United States)

    Treves-Kagan, Sarah; Naidoo, Evasen; Gilvydis, Jennifer M; Raphela, Elsie; Barnhart, Scott; Lippman, Sheri A

    2017-09-01

    Successful HIV prevention programming requires engaging communities in the planning process and responding to the social environmental factors that shape health and behaviour in a specific local context. We conducted two community-based situational analyses to inform a large, comprehensive HIV prevention programme in two rural districts of North West Province South Africa in 2012. The methodology includes: initial partnership building, goal setting and background research; 1 week of field work; in-field and subsequent data analysis; and community dissemination and programmatic incorporation of results. We describe the methodology and a case study of the approach in rural South Africa; assess if the methodology generated data with sufficient saturation, breadth and utility for programming purposes; and evaluate if this process successfully engaged the community. Between the two sites, 87 men and 105 women consented to in-depth interviews; 17 focus groups were conducted; and 13 health facilities and 7 NGOs were assessed. The methodology succeeded in quickly collecting high-quality data relevant to tailoring a comprehensive HIV programme and created a strong foundation for community engagement and integration with local health services. This methodology can be an accessible tool in guiding community engagement and tailoring future combination HIV prevention and care programmes.

  12. A multicriteria-based methodology for site prioritisation in sediment management.

    Science.gov (United States)

    Alvarez-Guerra, Manuel; Viguri, Javier R; Voulvoulis, Nikolaos

    2009-08-01

    Decision-making for sediment management is a complex task that incorporates the selection of areas for remediation and the assessment of options for any mitigation required. The application of Multicriteria Analysis (MCA) to rank different areas according to their need for sediment management provides a great opportunity for prioritisation, the first step in an integrated methodology that ultimately aims to assess and select suitable alternatives for managing the identified priority sites. This paper develops a methodology that starts with the delimitation of management units within the areas of study, followed by the application of MCA methods that allow these management units to be ranked according to their need for remediation. The proposed process considers not only scientific evidence on sediment quality, but also other relevant aspects such as the social and economic criteria associated with such decisions. The methodology is illustrated by its application to the case study area of the Bay of Santander in northern Spain, highlighting some of the implications of utilising different MCA methods in the process. It also uses site-specific data to assess the subjectivity in the decision-making process, mainly reflected in the assignment of the criteria weights and the uncertainties in the criteria scores. Analysis of the sensitivity of the results to these factors is used as a way to assess the stability and robustness of the ranking as a first step of the sediment management decision-making process.
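
    A minimal illustration of the MCA ranking step is a normalised weighted sum, sketched below in Python; the criteria, scores and weights are invented, and the paper's actual MCA methods (and its weight-sensitivity analysis) are more elaborate.

```python
import numpy as np

# Illustrative management-units x criteria matrix (all values invented);
# columns: sediment contamination, ecological value, social use, cost.
scores = np.array([[0.9, 0.4, 0.7, 0.3],
                   [0.5, 0.8, 0.2, 0.6],
                   [0.7, 0.6, 0.9, 0.8]])
weights = np.array([0.4, 0.3, 0.2, 0.1])  # sums to 1; a key value judgment

# Min-max normalise each criterion, then rank units by weighted sum.
norm = (scores - scores.min(0)) / (scores.max(0) - scores.min(0))
priority = norm @ weights
ranking = np.argsort(priority)[::-1]
print("priority order of units:", ranking, np.round(priority, 2))
```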

  13. MEGASTAR: The meaning of growth. An assessment of systems, technologies, and requirements. [methodology for display and analysis of energy production and consumption

    Science.gov (United States)

    1974-01-01

    A methodology for the display and analysis of postulated energy futures for the United States is presented. A systems approach, including the methodology of technology assessment, is used to examine three energy scenarios: the Westinghouse Nuclear Electric Economy, the Ford Technical Fix Base Case, and a MEGASTAR-generated alternative to the Ford Technical Fix Base Case. The three scenarios represent different paths of energy consumption from the present to the year 2000. Associated with these paths are various mixes of fuel, conversion, distribution, conservation and end-use technologies. MEGASTAR presents the estimated times and unit requirements to supply the fuels and the conversion and distribution systems for the postulated end uses of the three scenarios, and then estimates the aggregate manpower, materials, and capital requirements needed to develop the energy system described by each scenario.

  14. 3-D rod ejection analysis using a conservative methodology

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Ho; Park, Jin Woo; Park, Guen Tae; Um, Kil Sup; Ryu, Seok Hee; Lee, Jae Il; Choi, Tong Soo [KEPCO, Daejeon (Korea, Republic of)

    2016-05-15

    The point kinetics model, which simplifies the core phenomena and physical specifications, is used for conventional rod ejection accident analysis. The point kinetics model makes it convenient to assume conservative core parameters, but this simplification sacrifices a large amount of safety margin. The CHASER system couples the three-dimensional core neutron kinetics code ASTRA, the sub-channel analysis code THALES, and the fuel performance analysis code FROST. The validation study for the CHASER system is addressed using the NEACRP three-dimensional PWR core transient benchmark problem. A series of conservative rod ejection analyses for the APR1400 type plant was performed for both hot full power (HFP) and hot zero power (HZP) conditions to determine the most limiting cases. The conservative rod ejection analysis methodology is designed to properly account for the important phenomena and physical parameters.
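
    For contrast with the coupled 3-D treatment, the point kinetics model the abstract refers to can be written down in a few lines. Below is a one-delayed-group sketch in Python; the kinetics parameters and the reactivity step are illustrative assumptions, not APR1400 data.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-delayed-group point kinetics: the simplified model that the coupled
# 3-D analysis replaces. All parameter values are illustrative only.
beta, lam, Lam = 0.0065, 0.08, 2.0e-5   # delayed fraction, decay, gen. time
rho = 0.5 * beta                        # step reactivity insertion of 0.5 $

def rhs(t, y):
    n, c = y                            # relative power, precursor density
    dn = (rho - beta) / Lam * n + lam * c
    dc = beta / Lam * n - lam * c
    return [dn, dc]

y0 = [1.0, beta / (lam * Lam)]          # steady state at n = 1 before the step
sol = solve_ivp(rhs, (0.0, 5.0), y0, method="LSODA", t_eval=[0.0, 1.0, 5.0])
print(np.round(sol.y[0], 2))            # relative power rise over time
```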

  15. Methodological challenges in qualitative content analysis: A discussion paper.

    Science.gov (United States)

    Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit

    2017-09-01

    This discussion paper aims to map content analysis in the qualitative paradigm and to explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic by which categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Typology of end-of-life priorities in Saudi females: averaging analysis and Q-methodology

    Science.gov (United States)

    Hammami, Muhammad M; Hammami, Safa; Amer, Hala A; Khodr, Nesrine A

    2016-01-01

    Background Understanding culture- and sex-related end-of-life preferences is essential to provide quality end-of-life care. We have previously explored end-of-life choices in Saudi males and found important culture-related differences and that Q-methodology is useful in identifying intraculture, opinion-based groups. Here, we explore Saudi females’ end-of-life choices. Methods A volunteer sample of 68 females rank-ordered 47 opinion statements on end-of-life issues into a nine-category symmetrical distribution. The ranking scores of the statements were analyzed by averaging analysis and Q-methodology. Results The mean age of the females in the sample was 30.3 years (range, 19–55 years). Among them, 51% reported average religiosity, 78% reported very good health, 79% reported very good life quality, and 100% reported high-school education or more. The extreme five overall priorities were to be able to say the statement of faith, be at peace with God, die without having the body exposed, maintain dignity, and resolve all conflicts. The extreme five overall dis-priorities were to die in the hospital, die well dressed, be informed about impending death by family/friends rather than doctor, die at peak of life, and not know if one has a fatal illness. Q-methodology identified five opinion-based groups with qualitatively different characteristics: “physical and emotional privacy concerned, family caring” (younger, lower religiosity), “whole person” (higher religiosity), “pain and informational privacy concerned” (lower life quality), “decisional privacy concerned” (older, higher life quality), and “life quantity concerned, family dependent” (high life quality, low life satisfaction). Out of the extreme 14 priorities/dis-priorities for each group, 21%–50% were not represented among the extreme 20 priorities/dis-priorities for the entire sample. Conclusion Consistent with the previously reported findings in Saudi males, transcendence and dying in

  17. Typology of end-of-life priorities in Saudi females: averaging analysis and Q-methodology.

    Science.gov (United States)

    Hammami, Muhammad M; Hammami, Safa; Amer, Hala A; Khodr, Nesrine A

    2016-01-01

    Understanding culture- and sex-related end-of-life preferences is essential to provide quality end-of-life care. We have previously explored end-of-life choices in Saudi males and found important culture-related differences and that Q-methodology is useful in identifying intraculture, opinion-based groups. Here, we explore Saudi females' end-of-life choices. A volunteer sample of 68 females rank-ordered 47 opinion statements on end-of-life issues into a nine-category symmetrical distribution. The ranking scores of the statements were analyzed by averaging analysis and Q-methodology. The mean age of the females in the sample was 30.3 years (range, 19-55 years). Among them, 51% reported average religiosity, 78% reported very good health, 79% reported very good life quality, and 100% reported high-school education or more. The extreme five overall priorities were to be able to say the statement of faith, be at peace with God, die without having the body exposed, maintain dignity, and resolve all conflicts. The extreme five overall dis-priorities were to die in the hospital, die well dressed, be informed about impending death by family/friends rather than doctor, die at peak of life, and not know if one has a fatal illness. Q-methodology identified five opinion-based groups with qualitatively different characteristics: "physical and emotional privacy concerned, family caring" (younger, lower religiosity), "whole person" (higher religiosity), "pain and informational privacy concerned" (lower life quality), "decisional privacy concerned" (older, higher life quality), and "life quantity concerned, family dependent" (high life quality, low life satisfaction). Out of the extreme 14 priorities/dis-priorities for each group, 21%-50% were not represented among the extreme 20 priorities/dis-priorities for the entire sample. Consistent with the previously reported findings in Saudi males, transcendence and dying in the hospital were the extreme end-of-life priority and dis
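
    Q-methodology's by-person factor analysis can be sketched briefly: participants' rank-ordered sorts are correlated with each other, and components of that person-by-person correlation matrix define the opinion groups. The Python below is a generic illustration with random sorts standing in for the participants' data; it is not the authors' analysis.

```python
import numpy as np

# Toy Q data: 10 participants x 47 statements, ranks -4..+4 (invented).
rng = np.random.default_rng(3)
sorts = rng.integers(-4, 5, size=(10, 47)).astype(float)

# Q-methodology correlates *persons*, not variables: a 10x10 matrix.
r = np.corrcoef(sorts)

# Principal components of the person correlation matrix; participants
# loading strongly on the same component form an opinion-based group.
eigvals, eigvecs = np.linalg.eigh(r)
order = np.argsort(eigvals)[::-1]
top = order[:5]
loadings = eigvecs[:, top] * np.sqrt(np.clip(eigvals[top], 0.0, None))
print(np.round(loadings, 2))  # 10 persons x 5 candidate factors
```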

  18. Two-step rating-based 'double-faced applicability' test for sensory analysis of spread products as an alternative to descriptive analysis with trained panel.

    Science.gov (United States)

    Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong

    2018-03-01

    Descriptive analysis with a trained sensory panel has thus far been the most well-defined methodology for characterizing various products. However, in practical terms, the intensive training required for descriptive analysis has been recognized as a serious drawback. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions showed results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) have proposed a novel attribute-based methodology termed the two-step rating-based 'double-faced applicability' test, with a novel output measure of applicability magnitude (d'_A), for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d'_A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and the two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test can be used to provide a direct measure of the applicability magnitude of sensory attributes of the samples tested, in terms of d'_A, for sensory characterization of individual samples and multiple sample comparisons. This suggests that when the appropriate list of attributes to be used in the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d'_A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process. Copyright © 2017 Elsevier Ltd. All rights reserved.
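
    The d'_A measure above is a signal-detection-style index; its exact definition is given in the cited papers. As a hedged analogue only, the snippet below computes the classic d' from two 'applicable' response proportions; the attribute and percentages are invented.

```python
from scipy.stats import norm

def d_prime(p_applicable_target, p_applicable_nontarget):
    """Classic signal-detection d' from two 'applicable' proportions,
    shown as an analogue of the paper's applicability magnitude d'_A
    (whose exact definition is given in the cited papers)."""
    return norm.ppf(p_applicable_target) - norm.ppf(p_applicable_nontarget)

# Illustrative: 80% of panellists rate "creamy" applicable to sample A,
# 30% to sample B -> how strongly the attribute separates the samples.
print(f"d' = {d_prime(0.80, 0.30):.2f}")
```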

  19. Typology of end-of-life priorities in Saudi females: averaging analysis and Q-methodology

    Directory of Open Access Journals (Sweden)

    Hammami MM

    2016-05-01

    Full Text Available Muhammad M Hammami,1,2 Safa Hammami,1 Hala A Amer,1 Nesrine A Khodr1 1Clinical Studies and Empirical Ethics Department, King Faisal Specialist Hospital and Research Centre, 2College of Medicine, Alfaisal University, Riyadh, Saudi Arabia Background: Understanding culture- and sex-related end-of-life preferences is essential to provide quality end-of-life care. We have previously explored end-of-life choices in Saudi males and found important culture-related differences and that Q-methodology is useful in identifying intraculture, opinion-based groups. Here, we explore Saudi females’ end-of-life choices. Methods: A volunteer sample of 68 females rank-ordered 47 opinion statements on end-of-life issues into a nine-category symmetrical distribution. The ranking scores of the statements were analyzed by averaging analysis and Q-methodology. Results: The mean age of the females in the sample was 30.3 years (range, 19–55 years). Among them, 51% reported average religiosity, 78% reported very good health, 79% reported very good life quality, and 100% reported high-school education or more. The extreme five overall priorities were to be able to say the statement of faith, be at peace with God, die without having the body exposed, maintain dignity, and resolve all conflicts. The extreme five overall dis-priorities were to die in the hospital, die well dressed, be informed about impending death by family/friends rather than doctor, die at peak of life, and not know if one has a fatal illness. Q-methodology identified five opinion-based groups with qualitatively different characteristics: “physical and emotional privacy concerned, family caring” (younger, lower religiosity), “whole person” (higher religiosity), “pain and informational privacy concerned” (lower life quality), “decisional privacy concerned” (older, higher life quality), and “life quantity concerned, family dependent” (high life quality, low life satisfaction). Out of the

  20. A Local Approach Methodology for the Analysis of Ultimate Strength ...

    African Journals Online (AJOL)

    The local approach methodology, in contrast to classical fracture mechanics, can be used to predict the onset of tearing fracture and the effects of geometry in tubular joints. Finite element analysis of T-joint plate geometries and of tubular joints has been performed. The parameters of constraint, equivalent stress, plastic strain and ...

  1. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    Science.gov (United States)

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  2. Structured Intuition: A Methodology to Analyse Entity Authentication

    DEFF Research Database (Denmark)

    Ahmed, Naveed

    ...and the level of abstraction used in the analysis. Thus, the goal of developing a high-level methodology that can be used with different notions of security, authentication, and abstraction is worth considering. In this thesis, we propose a new methodology, called structured intuition (SI), which addresses ... in our methodology, which is called canonicity, a weaker form of message authenticity. As compared to many contemporary analysis techniques, an SI-based analysis provides detailed results regarding the design rationales and entity authentication goals of a protocol. ... consequences for the security of the system, e.g., private information of legitimate parties may be leaked or the security policy of a trusted system may be violated. At a corporate level, such a failure of authentication may result in loss of proprietary technology or customers' credit card information...

  3. Nuclear power plant simulation facility evaluation methodology

    International Nuclear Information System (INIS)

    Haas, P.M.; Carter, R.J.; Laughery, K.R. Jr.

    1985-01-01

    A methodology for the evaluation of nuclear power plant simulation facilities with regard to their acceptability for use in the US Nuclear Regulatory Commission (NRC) operator licensing exam is described. The evaluation is based primarily on simulator fidelity, but incorporates some aspects of direct operator/trainee performance measurement. The panel presentation and paper discuss data requirements, data collection, data analysis, and criteria for conclusions regarding the fidelity evaluation, and summarize the proposed use of direct performance measurement. While field testing and refinement of the methodology are recommended, this initial effort provides a firm basis for the NRC to fully develop the necessary methodology.

  4. Methodological approaches to analysis of agricultural countermeasures on radioactive contaminated areas: Estimation of effectiveness and comparison of different alternatives

    DEFF Research Database (Denmark)

    Yatsalo, B.I.; Hedemann Jensen, P.; Alexakhin, R.M.

    1997-01-01

    Methodological aspects of countermeasure analysis in the long-term period after a nuclear accident are discussed, using agricultural countermeasures for illustrative purposes. Estimates of the effectiveness of specific countermeasures, methods for assessing justified action levels, and the comparison of different alternatives (countermeasures) based on the use of several criteria are considered.

  5. Analysis of gaming community using Soft System Methodology

    OpenAIRE

    Hurych, Jan

    2015-01-01

    This diploma thesis analyses a virtual gaming community and its problems, in the case of the community belonging to the EU server of the game World of Tanks. To solve these problems, the Soft Systems Methodology (SSM) of P. Checkland is used. The thesis includes an analysis of the significance of gaming communities for the gaming industry as a whole. The gaming community is then defined as a soft system. Three problems are analysed in the practical part of the thesis using the newer version of SSM. One iteration of...

  6. Applying distance-to-target weighing methodology to evaluate the environmental performance of bio-based energy, fuels, and materials

    International Nuclear Information System (INIS)

    Weiss, Martin; Patel, Martin; Heilmeier, Hermann; Bringezu, Stefan

    2007-01-01

    The enhanced use of biomass for the production of energy, fuels, and materials is one of the key strategies towards sustainable production and consumption. Various life cycle assessment (LCA) studies demonstrate the great potential of bio-based products to reduce both the consumption of non-renewable energy resources and greenhouse gas emissions. However, the production of biomass requires agricultural land and is often associated with adverse environmental effects such as eutrophication of surface and ground water. Decision making in favor of or against bio-based and conventional fossil product alternatives therefore often requires weighing of environmental impacts. In this article, we apply distance-to-target weighing methodology to aggregate LCA results obtained in four different environmental impact categories (i.e., non-renewable energy consumption, global warming potential, eutrophication potential, and acidification potential) to one environmental index. We include 45 bio- and fossil-based product pairs in our analysis, which we conduct for Germany. The resulting environmental indices for all product pairs analyzed range from -19.7 to +0.2 with negative values indicating overall environmental benefits of bio-based products. Except for three options of packaging materials made from wheat and cornstarch, all bio-based products (including energy, fuels, and materials) score better than their fossil counterparts. Comparing the median values for the three options of biomass utilization reveals that bio-energy (-1.2) and bio-materials (-1.0) offer significantly higher environmental benefits than bio-fuels (-0.3). The results of this study reflect, however, subjective value judgments due to the weighing methodology applied. Given the uncertainties and controversies associated not only with distance-to-target methodologies in particular but also with weighing approaches in general, the authors strongly recommend using weighing for decision finding only as a
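
    In the spirit of the distance-to-target weighing described above, the sketch below aggregates four impact-category differences into a single index whose sign follows the paper's convention (negative favours the bio-based product). All impact values and targets are invented, and the paper's actual normalisation and weighting factors are not reproduced.

```python
import numpy as np

# Distance-to-target sketch (all values invented): each impact difference
# is divided by a normative target, so categories that are further from
# their target weigh more heavily, and the ratios are summed to one index.
bio_based = np.array([ 8.0, 0.6, 0.30, 0.9])   # energy, GWP, eutroph., acid.
fossil    = np.array([14.0, 1.1, 0.10, 1.2])   # per functional unit
targets   = np.array([10.0, 1.0, 0.20, 1.0])   # policy targets (assumed)

index = ((bio_based - fossil) / targets).sum()
print(f"environmental index: {index:+.2f} (negative favours bio-based)")
```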

  7. Framework for the Economic Analysis of Hybrid Systems Based on Exergy Consumption

    Energy Technology Data Exchange (ETDEWEB)

    Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cherry, Robert S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Deason, Wesley R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bragg-Sitton, Shannon M. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Boardman, Richard D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-08-01

    Starting from an overview of the dynamic behavior of the electricity market, the need to introduce energy users that can provide a damping capability to the system is derived, and a qualitative analysis of the impact of uncertainty, on both the demand and the supply side, is performed. There follows an introduction to investment analysis methodologies based on discounting of cash flows, and the work concludes with an illustration and application of exergonomic principles to provide a sound methodology for the cost accounting of plant components to be used in the cash flow analysis.
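
    The discounted-cash-flow core of such an investment analysis reduces to a net present value computation, sketched below; the discount rate and cash flows are invented for illustration.

```python
def npv(rate, cashflows):
    """Net present value of yearly cash flows, with year 0 first."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Illustrative hybrid-plant retrofit: an outlay now, level returns after.
flows = [-1000.0] + [180.0] * 10
print(f"NPV at 7%: {npv(0.07, flows):+.1f}")  # positive -> invest
```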

  8. Proposal of methodology of tsunami accident sequence analysis induced by earthquake using DQFM methodology

    International Nuclear Information System (INIS)

    Muta, Hitoshi; Muramatsu, Ken

    2017-01-01

    Since the Fukushima-Daiichi nuclear power station accident, the Japanese regulatory body has improved and upgraded the regulation of nuclear power plants, and continuous effort is required to enhance risk management in the mid to long term. Earthquakes and tsunamis are considered the most important risks, and the establishment of probabilistic risk assessment (PRA) methodologies for these events is a major issue in current PRA. The Nuclear Regulation Authority (NRA) addressed the PRA methodology for tsunamis induced by earthquakes, which is one of the methodologies that should be enhanced step by step as PRA techniques improve and mature. The 2015 AESJ standard for the procedure of seismic PRA for nuclear power plants provides the basic concept of the methodology; however, details of its application to an actual plant PRA model have not been sufficiently provided. This study proposes a detailed PRA methodology for tsunamis induced by earthquakes using the DQFM methodology, which contributes to improving the safety of nuclear power plants. Furthermore, this study identifies issues that require further research. (author)

  9. Optimized planning methodologies of ASON implementation

    Science.gov (United States)

    Zhou, Michael M.; Tamil, Lakshman S.

    2005-02-01

    Advanced network planning concerns effective network-resource allocation for a dynamic and open business environment. Planning methodologies for ASON implementation based on qualitative analysis and mathematical modeling are presented in this paper. The methodology includes methods for rationalizing technology and architecture, building network and nodal models, and developing dynamic programming for multi-period deployment. The multi-layered nodal architecture proposed here can accommodate various nodal configurations for a multi-plane optical network, and the network modeling presented here computes the required network elements for optimizing resource allocation.

  10. Methodology for the analysis of self-tensioned wooden structural floors

    Directory of Open Access Journals (Sweden)

    F. Suárez-Riestra

    2017-09-01

    Full Text Available A self-tensioning system is described, constituted by a force-multiplying device which, attached to the supports at the ends of the structural element, is able to convert the vertical resultant of the gravitational actions into an effective tensioning action, through the movement induced by a set of rods. The self-tensioning system is able to offer a high performance, thanks to the beneficial effect of the opposite deflection generated by the tensioning, in proportion to the increase of the gravitational action. This allows long-span timber ribbed floors to be designed with reduced depths. The complexity of calculation due to the non-linearity of the system can be obviated with the methodology of analysis developed in the article. In order to illustrate the advantages of the self-tensioning system and the methodology of analysis developed, six cases of ribbed floors have been analysed, with spans of 9, 12 and 15 m and variable imposed loads of 3.00 kN/m2 and 5.00 kN/m2.

  11. A framework for characterizing usability requirements elicitation and analysis methodologies (UREAM)

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Mannaert, H.

    2012-01-01

    Dedicated methodologies for the elicitation and analysis of usability requirements have been proposed in literature, usually developed by usability experts. The usability of these approaches by non-expert software engineers is not obvious. In this paper, the objective is to support developers and

  12. Systematic screening methodology and energy efficient design of ionic liquid-based separation processes

    DEFF Research Database (Denmark)

    Kulajanpeng, Kusuma; Suriyapraphadilok, Uthaiporn; Gani, Rafiqul

    2016-01-01

    A systematic methodology for the screening of ionic liquids (ILs) as entrainers and for the design of IL-based separation processes for various homogeneous binary azeotropic mixtures has been developed. The methodology focuses on homogeneous binary aqueous azeotropic systems (for example, water+ethanol and water+isopropanol). The effect of an increase in size of the target solute was investigated using the same separation process and IL entrainer to obtain the same product purity. The proposed methodology has been evaluated through a case study of binary alcoholic aqueous azeotropic separation: water+ethanol and water+isopropanol.

  13. Functional community analysis of brain: a new approach for EEG-based investigation of the brain pathology.

    Science.gov (United States)

    Ahmadlou, Mehran; Adeli, Hojjat

    2011-09-15

    Analysis of the structure of brain functional connectivity (SBFC) is a fundamental issue for the understanding of brain cognition as well as the pathology of brain disorders. Analysis of communities among sub-parts of a system is increasingly used for social, ecological, and other networks. This paper presents a new methodology for investigation of the SBFC and understanding of the brain based on graph theory and community pattern analysis of the functional connectivity graph of the brain obtained from electroencephalograms (EEGs). The methodology consists of three main parts: fuzzy synchronization likelihood (FSL), community partitioning, and decisions based on partitions. As an example application, the methodology is applied to analysis of the brains of patients with attention deficit/hyperactivity disorder (ADHD) and the problem of discriminating ADHD EEGs from healthy (non-ADHD) EEGs. Copyright © 2011. Published by Elsevier Inc.
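
    The community-partitioning step can be illustrated on a small functional connectivity graph. In the Python sketch below, the synchronization matrix is random stand-in data and the modularity-based partitioner is one common choice; the paper's fuzzy synchronization likelihood (FSL) computation is not reproduced.

        # Sketch of community partitioning on a functional connectivity graph.
        # The synchronization matrix is random stand-in data, not FSL output.
        import numpy as np
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        rng = np.random.default_rng(0)
        n_channels = 19                          # a standard 10-20 EEG montage size
        sync = rng.random((n_channels, n_channels))
        sync = (sync + sync.T) / 2.0             # symmetric pairwise synchronization

        G = nx.Graph()
        G.add_nodes_from(range(n_channels))
        for i in range(n_channels):
            for j in range(i + 1, n_channels):
                if sync[i, j] > 0.7:             # keep only strongly synchronized pairs
                    G.add_edge(i, j, weight=sync[i, j])

        communities = greedy_modularity_communities(G)
        print([sorted(c) for c in communities])  # channel groups = functional communities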

  14. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S., E-mail: dayane.silva@usp.br, E-mail: gdjian@ipen.br, E-mail: aclima@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The aim of this work is to present a methodology for analyzing the Angra 2 reactor containment during accidents of the Loss of Coolant Accident (LOCA) type. This study will make it possible to ensure the safety of the surrounding population upon the occurrence of such accidents. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool used for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system of a light-water reactor during an accident. The containment of a PWR plant is a concrete building covered internally with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  16. Effects of methodology and analysis strategy on robustness of pestivirus phylogeny.

    Science.gov (United States)

    Liu, Lihong; Xia, Hongyan; Baule, Claudia; Belák, Sándor; Wahlberg, Niklas

    2010-01-01

    Phylogenetic analysis of pestiviruses is a useful tool for classifying novel pestiviruses and for revealing their phylogenetic relationships. In this study, robustness of pestivirus phylogenies has been compared by analyses of the 5'UTR, and complete N(pro) and E2 gene regions separately and combined, performed by four methods: neighbour-joining (NJ), maximum parsimony (MP), maximum likelihood (ML), and Bayesian inference (BI). The strategy of analysing the combined sequence dataset by BI, ML, and MP methods resulted in a single, well-supported tree topology, indicating a reliable and robust pestivirus phylogeny. By contrast, the single-gene analysis strategy resulted in 12 trees of different topologies, revealing different relationships among pestiviruses. These results indicate that the strategies and methodologies are two vital aspects affecting the robustness of the pestivirus phylogeny. The strategy and methodologies outlined in this paper may have a broader application in inferring phylogeny of other RNA viruses.

  17. Conjugate gradient based projection - A new explicit methodology for frictional contact

    Science.gov (United States)

    Tamma, Kumar K.; Li, Maocheng; Sha, Desong

    1993-01-01

    With special attention towards the applicability to parallel computation or vectorization, a new and effective explicit approach for linear complementary formulations involving a conjugate gradient based projection methodology is proposed in this study for contact problems with Coulomb friction. The overall objectives are focussed towards providing an explicit methodology of computation for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementary formulations stems from an established search direction which is projected to a feasible region determined by the non-negative constraint condition; this direction is then applied to the Fletcher-Reeves conjugate gradient method resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics, fast computational speed and is relatively simple to implement for contact problems involving Coulomb friction.
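
    The projection step at the heart of the method can be illustrated on a small linear complementarity problem (LCP). The Python sketch below uses plain projected-gradient iterations rather than the paper's Fletcher-Reeves conjugate-gradient search direction, so it shows only the projection-onto-the-feasible-region idea; the matrix and vector are invented.

        # Simplified sketch of the projection idea for an LCP: find z >= 0 with
        # w = M z + q >= 0 and z'w = 0, via the equivalent bound-constrained
        # quadratic program. The paper couples the projection with Fletcher-
        # Reeves conjugate gradients; plain projected-gradient steps suffice here.
        import numpy as np

        def projected_gradient_lcp(M, q, step=0.1, iters=500):
            z = np.zeros_like(q)
            for _ in range(iters):
                grad = M @ z + q                        # gradient of 0.5 z'Mz + q'z
                z = np.maximum(0.0, z - step * grad)    # project onto the region z >= 0
            return z

        M = np.array([[2.0, 1.0], [1.0, 2.0]])          # symmetric positive definite
        q = np.array([-1.0, 1.0])
        z = projected_gradient_lcp(M, q)
        print("z =", z, " w =", M @ z + q)              # expect z=[0.5, 0], w=[0, 1.5]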

  18. Development of a heat exchanger root-cause analysis methodology

    International Nuclear Information System (INIS)

    Jarrel, D.B.

    1989-01-01

    The objective of this work is to determine a generic methodology for approaching the accurate identification of the root cause of component failure. Root-cause determinations are an everyday challenge to plant personnel, but they are handled with widely differing degrees of success due to the approaches, levels of diagnostic expertise, and documentation. The criterion for success is simple: If the root cause of the failure has truly been determined and corrected, the same causal failure relationship will not be demonstrated again in the future. The approach to root-cause analysis (RCA) element definition was to first selectively choose and constrain a functionally significant component (in this case a component cooling water to service water heat exchanger) that has demonstrated prevalent failures. Then a root cause of failure analysis was performed by a systems engineer on a large number of actual failure scenarios. The analytical process used by the engineer was documented and evaluated to abstract the logic model used to arrive at the root cause. For the case of the heat exchanger, the actual root-cause diagnostic approach is described. A generic methodology for the solution of the root cause of component failure is demonstrable for this general heat exchanger sample

  19. Methodology of the Integrated Analysis of Company's Financial Status and Its Performance Results

    OpenAIRE

    Mackevičius, Jonas; Valkauskas, Romualdas

    2010-01-01

    Information about company's financial status and its performance results is very important for the objective evaluation of company's position in the market and competitive possibilities in the future. Such information is provided in the financial statement. It is important to apply and investigate this information properly. The methodology of company's financial status and performance results integrated analysis is recommended in this article. This methodology consists of these three elements...

  20. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.

    1992-01-01

    This paper reports on a study intended to rigorously examine all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2500) to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to the classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in time and manpower required to complete the study were also realized.

  1. Summer 2012 Testing and Analysis of the Chemical Mixture Methodology -- Part I

    Energy Technology Data Exchange (ETDEWEB)

    Glantz, Clifford S.; Yu, Xiao-Ying; Coggin, Rebekah L.; Ponder, Lashaundra A.; Booth, Alexander E.; Petrocchi, Achille J.; Horn, Sarah M.; Yao, Juan

    2012-07-01

    This report presents the key findings made by the Chemical Mixture Methodology (CMM) project team during the first stage of their summer 2012 testing and analysis of the CMM. The study focused on answering the following questions: o What is the percentage of the chemicals in the CMM Rev 27 database associated with each Health Code Number (HCN)? How does this result influence the relative importance of acute HCNs and chronic HCNs in the CMM data set? o What is the benefit of using the HCN-based approach? Which Modes of Action and Target Organ Effects tend to be important in determining the HCN-based Hazard Index (HI) for a chemical mixture? o What are some of the potential issues associated with the current HCN-based approach? What are the opportunities for improving the performance and/or technical defensibility of the HCN-based approach? How would those improvements increase the benefit of using the HCN-based approach? o What is the Target Organ System Effect approach and how can it be used to improve upon the current HCN-based approach? How do the benefits users would derive from using the Target Organ System Approach compare to the benefits available from the current HCN-based approach?
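
    As background for the HCN discussion, the basic hazard-index arithmetic can be sketched compactly: for each health effect, the ratios of predicted chemical concentration to the corresponding protective limit are summed. The chemicals, limits, and concentrations below are hypothetical and only illustrate the bookkeeping, not the CMM database itself.

        # Illustrative hazard-index bookkeeping in the spirit of an HCN-based
        # approach. Chemicals, limits (mg/m3), and concentrations are invented.

        mixture = [
            # (chemical, {health_effect: protective_limit}, predicted_concentration)
            ("ammonia",  {"irritation": 140.0, "systemic": 350.0}, 35.0),
            ("chlorine", {"irritation": 2.8},                      1.4),
        ]

        hazard_index = {}
        for name, limits, conc in mixture:
            for effect, limit in limits.items():
                hazard_index[effect] = hazard_index.get(effect, 0.0) + conc / limit

        for effect, hi in sorted(hazard_index.items()):
            print(f"{effect}: HI = {hi:.2f}")   # HI >= 1 flags a potential concern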

  2. Nuclear energy cost data base. A reference data base for nuclear and coal-fired powerplant power-generation cost analysis

    International Nuclear Information System (INIS)

    1982-10-01

    A reference data base and standard methodology are needed for performing comparative nuclear and fossil power generation cost analyses for DOE/NE. Proposals are presented for such a methodology and for reference assumptions and data to be used with the methodology. This report is intended to provide basic guidelines or a starting point for analysis and to serve as a focal point in establishing parameters and methods to be used in economic comparisons of nuclear systems with alternatives. The data base is applicable for economic comparisons of new base-load light water reactors on either a current once-through cycle or self-generated recycle, high- and low-sulfur coal-fired plants, and oil and natural gas-fired electric generating plants coming on line in the last decade of this century. This report includes a data base containing proposed technical and economic assumptions to be used in analyses, discussions of a recommended methodology to be used in calculating power generation costs, and a sample calculation for illustrative and benchmark purposes

  3. Computer Aided Methodology for Simultaneous Synthesis, Design & Analysis of Chemical Products-Processes

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2006-01-01

    A new combined methodology for computer aided molecular design and process flowsheet design is presented. The methodology is based on the group contribution approach for prediction of molecular properties and design of molecules. Using the same principles, process groups have been developed together with their corresponding flowsheet property models. To represent the process flowsheets in the same way as molecules, a unique but simple notation system has been developed. The methodology has been converted into a prototype software, which has been tested with several case studies covering a wide range of problems. In this paper, only the computer aided flowsheet design related features are presented.

  4. Discovering the Effects-Endstate Linkage: Using Soft Systems Methodology to Perform EBO Mission Analysis

    National Research Council Canada - National Science Library

    Young, Jr, William E

    2005-01-01

    EBO mission analysis is shown to be more problem structuring than problem solving. A new mission analysis process is proposed using a modified version of Soft Systems Methodology to meet these challenges...

  5. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES

    International Nuclear Information System (INIS)

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H.

    2005-01-01

    Several of the new generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model the embedment effect when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at the Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) the description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOBs, (2) the comparison of the analysis results for the different DOBs between the two methods, and (3) the performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The resulting assessment from this study indicates that simplified methods may be capable of capturing the seismic response for much more deeply embedded structures than would normally be allowed by standard practice.

  6. Site-conditions map for Portugal based on VS measurements: methodology and final model

    Science.gov (United States)

    Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, carlos

    2017-04-01

    In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (Vs) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 Vs profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies involved in characterizing the Vs structure at the sites in the database include seismic refraction, multichannel analysis of surface waves and refraction microtremor. Invasive measurements were performed in selected locations in order to compare the Vs profiles obtained from both invasive and non-invasive techniques. In general there was good agreement in the subsurface Vs structure obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50.000 and 1:500.000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is based on a three-step process: defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether or not the differences in the distributions of Vs30 are statistically significant, and merging the geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling and therefore a declustering algorithm was applied. The final model includes three geological units: 1) igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations; and 3) Holocene formations. The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and
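
    The standard definition of Vs30 underlying such models, the travel-time-averaged shear-wave velocity over the top 30 m, is easy to state in code. The Python sketch below computes it for a made-up layered profile; the profile values are illustrative, not entries from the Portuguese database.

        # Time-averaged shear-wave velocity to 30 m depth (Vs30) from a layered
        # profile: Vs30 = 30 / sum(h_i / Vs_i). The example profile is invented.

        def vs30(layers):
            """layers: list of (thickness_m, vs_m_per_s), top layer first."""
            depth, travel_time = 0.0, 0.0
            for thickness, vs in layers:
                h = min(thickness, 30.0 - depth)   # only count the top 30 m
                travel_time += h / vs
                depth += h
                if depth >= 30.0:
                    break
            return 30.0 / travel_time

        profile = [(5.0, 180.0), (10.0, 320.0), (25.0, 600.0)]
        print(f"Vs30 = {vs30(profile):.0f} m/s")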

  7. A Consistent Methodology Based Parameter Estimation for a Lactic Acid Bacteria Fermentation Model

    DEFF Research Database (Denmark)

    Spann, Robert; Roca, Christophe; Kold, David

    2017-01-01

    Lactic acid bacteria are used in many industrial applications, e.g. as starter cultures in the dairy industry or as probiotics, and research on their cell production is highly required. A first principles kinetic model was developed to describe and understand the biological, physical, and chemical mechanisms in a lactic acid bacteria fermentation. We present here a consistent approach for a methodology based parameter estimation for a lactic acid fermentation. In the beginning, just an initial knowledge based guess of parameters was available and an initial parameter estimation of the complete set of parameters was performed in order to get a good model fit to the data. However, not all parameters are identifiable with the given data set and model structure. Sensitivity, identifiability, and uncertainty analysis were completed and a relevant identifiable subset of parameters was determined for a new...

  8. Interpretive Phenomenological Analysis: An Appropriate Methodology for Educational Research?

    Directory of Open Access Journals (Sweden)

    Edward John Noon

    2018-04-01

    Full Text Available Interpretive phenomenological analysis (IPA is a contemporary qualitative methodology, first developed by psychologist Jonathan Smith (1996. Whilst its roots are in psychology, it is increasingly being drawn upon by scholars in the human, social and health sciences (Charlick, Pincombe, McKellar, & Fielder, 2016. Despite this, IPA has received limited attention across educationalist literature. Drawing upon my experiences of using IPA to explore the barriers to the use of humour in the teaching of Childhood Studies (Noon, 2017, this paper will discuss its theoretical orientation, sampling and methods of data collection and analysis, before examining the strengths and weaknesses of IPA’s employment in educational research.

  9. Optimal (Solvent) Mixture Design through a Decomposition Based CAMD methodology

    DEFF Research Database (Denmark)

    Achenie, L.; Karunanithi, Arunprakash T.; Gani, Rafiqul

    2004-01-01

    Computer Aided Molecular/Mixture design (CAMD) is one of the most promising techniques for solvent design and selection. A decomposition based CAMD methodology has been formulated where the mixture design problem is solved as a series of molecular and mixture design sub-problems. This approach is...

  10. An inexpensive, interdisciplinary, methodology to conduct an impact study of homeless persons on hospital based services.

    Science.gov (United States)

    Parker, R David; Regier, Michael; Brown, Zachary; Davis, Stephen

    2015-02-01

    Homelessness is a primary concern for community health. Scientific literature on homelessness is wide ranging and diverse. One opportunity to add to the existing literature is the development and testing of affordable, easily implemented methods for measuring the impact of homelessness on the healthcare system. Such methodological approaches rely on the strengths of a multidisciplinary approach, including providers of both healthcare and homeless services, and applied clinical researchers. This paper is a proof of concept for a methodology which is easily adaptable nationwide, given the mandated implementation of homeless management information systems in the United States and other countries, the medical billing systems used by hospitals, and the research methods of applied researchers. Adaptation is independent of geographic region, budget constraints, specific agency skill sets, and many other factors that impact the application of a consistent, science-based methodological approach to assess and address homelessness. We conducted a secondary data analysis merging homeless service utilization data with hospital case-based data. These data detailed care utilization among homeless persons in a small, Appalachian city in the United States. In our sample of 269 persons who received at least one hospital based service and one homeless service between July 1, 2012 and June 30, 2013, the total billed costs were $5,979,463, with 10 people costing more than one-third ($1,957,469) of the total. Those persons were primarily men, living in an emergency shelter, with pre-existing disabling conditions. We theorize that targeted services, including Housing First, would be an effective intervention. This is proposed in a future study.

  11. Measuring the Differences between Traditional Learning and Game-Based Learning Using Electroencephalography (EEG) Physiologically Based Methodology

    Science.gov (United States)

    Chen, Ching-Huei

    2017-01-01

    Students' cognitive states can reflect a learning experience that results in engagement in an activity. In this study, we used electroencephalography (EEG) physiologically based methodology to evaluate students' levels of attention and relaxation, as well as their learning performance within a traditional and game-based learning context. While no…

  12. Combining Project-Based Learning and Community-Based Research in a Research Methodology Course: The Lessons Learned

    Science.gov (United States)

    Arantes do Amaral, João Alberto; Lino dos Santos, Rebeca Júlia Rodrigues

    2018-01-01

    In this article, we present our findings regarding the course "Research Methodology," offered to 22 first-year undergraduate students studying Administration at the Federal University of São Paulo, Osasco, Brazil. The course, which combined community-based research and project-based learning, was developed during the second semester of…

  13. A SystemC-Based Design Methodology for Digital Signal Processing Systems

    Directory of Open Access Journals (Sweden)

    Christian Haubelt

    2007-03-01

    Full Text Available Digital signal processing algorithms are of great importance in many embedded systems. Due to their complexity and the restrictions imposed on implementations, new design methodologies are needed. In this paper, we present a SystemC-based solution supporting automatic design space exploration, automatic performance evaluation, as well as automatic system generation for mixed hardware/software solutions mapped onto FPGA-based platforms. Our proposed hardware/software codesign approach is based on a SystemC-based library called SysteMoC that permits the expression of different models of computation well known in the domain of digital signal processing. It combines the advantages of executability and analyzability of many important models of computation that can be expressed in SysteMoC. We use the example of an MPEG-4 decoder throughout this paper to introduce our novel methodology. Results from a five-dimensional design space exploration and from automatically mapping parts of the MPEG-4 decoder onto a Xilinx FPGA platform demonstrate the effectiveness of our approach.

  14. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
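
    The Monte Carlo structure of such a missile-risk estimate is straightforward to sketch, though the real TORMIS models are far more detailed. In the toy Python version below, every distribution, rate, and damage threshold is invented; it shows only how sampled tornado winds, a crude transport model, and an impact test combine into an annual frequency estimate.

        # Toy Monte Carlo in the spirit of TORMIS: sample a tornado wind speed,
        # a crude missile transport model, and an impact test, and fold the
        # result into an annual damage frequency. All numbers are invented.
        import random

        def annual_damage_frequency(n_trials=200_000):
            random.seed(42)
            strike_rate = 1.0e-3                 # assumed tornado strikes per year
            p_impact = 1.0e-2                    # assumed chance a missile hits the target
            damaging = 0
            for _ in range(n_trials):
                wind = random.weibullvariate(70.0, 2.5)      # sampled wind speed, m/s
                v_missile = 0.4 * wind * random.random()     # crude transport model
                if random.random() < p_impact and v_missile > 30.0:
                    damaging += 1                # impact above the damage-speed threshold
            return strike_rate * damaging / n_trials

        print(f"~{annual_damage_frequency():.2e} damaging impacts per year")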

  15. D2.1 - An EA Active, Problem Based Learning Methodology - EAtrain2

    DEFF Research Database (Denmark)

    Ryberg, Thomas; Georgsen, Marianne; Buus, Lillian

    This deliverable reports on the work undertaken in work package 2, with the key objective to develop a learning methodology for web 2.0 mediated Enterprise Architecture (EA) learning building on a problem based learning (PBL) approach. The deliverable reports not only on the methodology but also on the activities leading to its development (literature review, workshops, etc.) and on further outcomes of this work relevant to the platform specification and pilot courses preparation.

  16. A progressive methodology for seismic safety evaluation of gravity dams

    International Nuclear Information System (INIS)

    Ghrib, F.; Leger, P.; Tinawi, R.; Lupien, R.; Veilleux, M.

    1995-01-01

    A progressive methodology for the seismic safety evaluation of existing concrete gravity dams was described. The methodology was based on five structural analysis levels of increasing complexity to represent inertia forces, dam-foundation and dam-reservoir interaction mechanisms, as well as concrete cracking. The five levels were (1) preliminary screening, (2) pseudo-static method, (3) pseudo-dynamic method, (4) linear time history analysis, and (5) non-linear time history analysis. The first four levels of analysis were applied for the seismic safety evaluation of the Paugan gravity dam (Quebec). Results showed that internal forces from pseudo-dynamic, response spectra and transient finite element analyses could be used to interpret the dynamic stability of dams from familiar strength-based criteria. However, as soon as the base was cracked, the seismically induced forces were modified, and level IV analyses proved more suitable to handle these complexities rationally. 8 refs., 7 figs., 1 tab

  17. Methodology for national risk analysis and prioritization of toxic industrial chemicals.

    Science.gov (United States)

    Taxell, Piia; Engström, Kerstin; Tuovila, Juha; Söderström, Martin; Kiljunen, Harri; Vanninen, Paula; Santonen, Tiina

    2013-01-01

    The identification of chemicals that pose the greatest threat to human health from incidental releases is a cornerstone in public health preparedness for chemical threats. The present study developed and applied a methodology for the risk analysis and prioritization of industrial chemicals to identify the most significant chemicals that pose a threat to public health in Finland. The prioritization criteria included acute and chronic health hazards, physicochemical and environmental hazards, national production and use quantities, the physicochemical properties of the substances, and the history of substance-related incidents. The presented methodology enabled a systematic review and prioritization of industrial chemicals for the purpose of national public health preparedness for chemical incidents.

  18. Predicting Dissertation Methodology Choice among Doctoral Candidates at a Faith-Based University

    Science.gov (United States)

    Lunde, Rebecca

    2017-01-01

    Limited research has investigated dissertation methodology choice and the factors that contribute to this choice. Quantitative research is based in mathematics and scientific positivism, and qualitative research is based in constructivism. These underlying philosophical differences pose the question of whether certain factors predict dissertation…

  19. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Science.gov (United States)

    Toddi A. Steelman; Branda Nowell; Deena Bayoumi; Sarah McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  20. Snapshot analysis for rhodium fixed incore detector using BEACON methodology

    International Nuclear Information System (INIS)

    Cha, Kyoon Ho; Choi, Yu Sun; Lee, Eun Ki; Park, Moon Ghu; Morita, Toshio; Heibel, Michael D.

    2004-01-01

    The purpose of this report is to process the rhodium detector data of the Yonggwang nuclear unit 4 cycle 5 core to obtain the measured power distribution by using the BEACON methodology. Rhodium snapshots of YGN 4 cycle 5 have been analyzed by both BEACON/SPINOVA and CECOR to compare the results of the two codes over a large number of snapshots obtained during normal plant operation. Reviewing the results of this analysis, BEACON/SPINOVA can be used for the snapshot analysis of Korean Standard Nuclear Power (KSNP) plants.

  1. A New Methodology for Open Pit Slope Design in Karst-Prone Ground Conditions Based on Integrated Stochastic-Limit Equilibrium Analysis

    Science.gov (United States)

    Zhang, Ke; Cao, Ping; Ma, Guowei; Fan, Wenchen; Meng, Jingjing; Li, Kaihui

    2016-07-01

    Using the Chengmenshan Copper Mine as a case study, a new methodology for open pit slope design in karst-prone ground conditions is presented based on integrated stochastic-limit equilibrium analysis. The numerical modeling and optimization design procedure comprises the collection of drill core data, karst cave stochastic model generation, SLIDE simulation and bisection method optimization. Borehole investigations are performed, and the statistical result shows that the length of the karst cave fits a negative exponential distribution model, but the length of carbonatite does not exactly follow any standard distribution. The inverse transform method and acceptance-rejection method are used to reproduce the length of the karst cave and carbonatite, respectively. A code for karst cave stochastic model generation, named KCSMG, is developed. The stability of the rock slope with the karst cave stochastic model is analyzed by combining the KCSMG code and the SLIDE program. This approach is then applied to study the effect of the karst cave on the stability of the open pit slope, and a procedure to optimize the open pit slope angle is presented.
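
    Sampling a negative exponential length distribution with the inverse transform method, as the cave-generation step does, takes only a few lines; the acceptance-rejection step for the non-standard carbonatite lengths is not shown. In the Python sketch below, the 2.5 m mean length is an assumed value, not the mine's statistic.

        # Inverse transform sampling for an exponential length model:
        # invert the CDF F(x) = 1 - exp(-x / mean_length). Mean is assumed.
        import math
        import random

        def sample_cave_length(mean_length):
            u = random.random()
            return -mean_length * math.log(1.0 - u)

        random.seed(0)
        print([round(sample_cave_length(2.5), 2) for _ in range(5)])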

  2. Methodology and application of surrogate plant PRA analysis to the Rancho Seco Power Plant: Final report

    International Nuclear Information System (INIS)

    Gore, B.F.; Huenefeld, J.C.

    1987-07-01

    This report presents the development and the first application of generic probabilistic risk assessment (PRA) information for identifying systems and components important to public risk at nuclear power plants lacking plant-specific PRAs. A methodology is presented for using the results of PRAs for similar (surrogate) plants, along with plant-specific information about the plant of interest and the surrogate plants, to infer important failure modes for systems of the plant of interest. This methodology, and the rationale on which it is based, is presented in the context of its application to the Rancho Seco plant. The Rancho Seco plant has been analyzed using PRA information from two surrogate plants. This analysis has been used to guide development of considerable plant-specific information about Rancho Seco systems and components important to minimizing public risk, which is also presented herein

  3. Cell-based land use screening procedure for regional siting analysis

    International Nuclear Information System (INIS)

    Jalbert, J.S.; Dobson, J.E.

    1976-01-01

    An energy facility site-screening methodology which permits the land resource planner to identify candidate siting areas was developed. Through the use of spatial analysis procedures and computer graphics, a selection of candidate areas is obtained. Specific sites then may be selected from among candidate areas for environmental impact analysis. The computerized methodology utilizes a cell-based geographic information system for specifying the suitability of candidate areas for an energy facility. The criteria to be considered may be specified by the user and weighted in terms of importance. Three primary computer programs have been developed. These programs produce thematic maps, proximity calculations, and suitability calculations. Programs are written so as to be transferable to regional planning or regulatory agencies to assist in rational and comprehensive power plant site identification and analysis
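
    A minimal version of the cell-based suitability calculation can be sketched as a user-weighted overlay of criterion grids. The Python example below uses invented scores, weights, and an exclusion mask; it mirrors the general idea of weighted criteria over grid cells rather than the three programs themselves.

        # Minimal cell-based suitability calculation: a user-weighted overlay of
        # criterion grids with an exclusion mask. All values are stand-ins.
        import numpy as np

        slope_score = np.array([[0.9, 0.4], [0.7, 0.1]])         # per-cell scores, 1 = best
        water_score = np.array([[0.8, 0.9], [0.2, 0.5]])
        excluded    = np.array([[False, False], [False, True]])  # e.g. protected land

        weights = {"slope": 0.6, "water": 0.4}                   # user-assigned importance
        suitability = weights["slope"] * slope_score + weights["water"] * water_score
        suitability[excluded] = 0.0                              # excluded cells drop out

        print(suitability)   # high-scoring cells are the candidate siting areas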

  4. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    Science.gov (United States)

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-02

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided a precise control of the reaction conditions which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed higher impact on V. fischeri, evidenced by lower EC50. The proposed methodology was validated through statistical analysis which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as alternative to microplate based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. An Evidence Based Methodology to Facilitate Public Library Non-fiction Collection Development

    Directory of Open Access Journals (Sweden)

    Matthew Kelly

    2015-12-01

    Full Text Available Objective – This research was designed as a pilot study to test a methodology for subject based collection analysis for public libraries. Methods – WorldCat collection data from eight Australian public libraries was extracted using the Collection Evaluation application. The data was aggregated and filtered to assess how the sample’s titles could be compared against the OCLC Conspectus subject categories. A hierarchy of emphasis emerged and this was divided into tiers ranging from 1% of the sample. These tiers were further analysed to quantify their representativeness against both the sample’s titles and the subject categories taken as a whole. The interpretive aspect of the study sought to understand the types of knowledge embedded in the tiers and was underpinned by hermeneutic phenomenology. Results – The study revealed that there was a marked tendency for a small percentage of subject categories to constitute a large proportion of the potential topicality that might have been represented in these types of collections. The study also found that the distribution of the aggregated collection conformed to a power law distribution (80/20, so that approximately 80% of the collection was represented by 20% of the subject categories. The study also found that there were significant commonalities in the types of subject categories that were found in the designated tiers and that it may be possible to develop ontologies that correspond to the collection tiers. Conclusions – The evidence-based methodology developed in this pilot study has the potential for further development to help to improve the practice of collection development. The introduction of the concept of the epistemic role played by collection tiers is a promising aid to inform our understanding of knowledge organization for public libraries. The research shows a way forward to help to link subjective decision making with a scientifically based approach to managing knowledge

  6. The continuous analysis of nitrate and ammonium in aerosols by the steam jet aerosol collector (SJAC): extension and validation of the methodology

    NARCIS (Netherlands)

    Slanina, J.; Brink, ten H.M.; Otjes, R.P.; Even, A.; Jongejan, P.; Khlystov, A.; Waijers-IJpelaan, A.; Hu, M.; Lu, Y.

    2001-01-01

    Classical methodology based on the application of filters for sampling, followed by extraction and analysis, introduces severe artifacts for semi-volatile compounds like ammonium nitrate. These filter methods do not meet the requirements for the assessment of the impact of aerosols on acidification,

  7. ''Training plan optimized design'' methodology application to IBERDROLA - Power generation

    International Nuclear Information System (INIS)

    Gil, S.; Mendizabal, J.L.

    1996-01-01

    The trend in both Europe and the United States, towards the understanding that no training plan may be considered suitable if not backed by the results of application of the S.A.T. (Systematic Approach to Training) methodology, led TECNATOM, S.A. to apply the methodology through development of an application specific to the conditions of the Spanish working system. The requirement that design of the training be coherent with the realities of the working environment is met by systematic application of the SAT methodology as part of the work analysis and job-based task analysis processes, this serving as a basis for design of the training plans

  8. Performance-Based Technology Selection Filter description report. INEL Buried Waste Integrated Demonstration System Analysis project

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process where technologies and systems are rated and assessments made based on performance measures, and regulatory and technical requirements. The results are auditable, and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  9. Discourse analysis: A useful methodology for health-care system researches.

    Science.gov (United States)

    Yazdannik, Ahmadreza; Yousefy, Alireza; Mohammadi, Sepideh

    2017-01-01

    Discourse analysis (DA) is an interdisciplinary field of inquiry and is becoming an increasingly popular research strategy for researchers in various disciplines, but it has been little employed by health-care researchers. The methodology involves a focus on the sociocultural and political context in which text and talk occur. DA adds a linguistic approach to an understanding of the relationship between language and ideology, exploring the way in which theories of reality and relations of power are encoded in such aspects as the syntax, style, and rhetorical devices used in texts. DA is a useful and productive qualitative methodology but has been underutilized within health-care system research. Without a clear understanding of discourse theory and DA, it is difficult to comprehend important research findings and impossible to use DA as a research strategy. To redress this deficiency, this article presents an introduction to the concepts of discourse and DA, the history of DA, its philosophical background, its types, and its analysis strategy. Finally, we discuss how DA relates to the ideological dimension of such phenomena as discourse in the health-care system, health beliefs, and intra-disciplinary relationships in the health-care system.

  10. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Brownson, D. A.

    2002-01-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) Waste form/waste package degradation; (2) Waste package isotopic inventory; (3) Criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) Probability of criticality (for each potential critical configuration as well as total event); and (5) Criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information relative to the model report content and validation. The model reports and their revisions are being generated as a result of: (1) Commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) Open Items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during the DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002)

  11. ORGANIZATION OF FUTURE ENGINEERS' PROJECT-BASED LEARNING WHEN STUDYING THE PROJECT MANAGEMENT METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Halyna V. Lutsenko

    2015-02-01

    Full Text Available The peculiarities of the modern world experience of implementing project-based learning in engineering education are considered. The potential role and place of projects in learning activity are analyzed. A methodology for organizing the project-based activity of engineering students when studying the project management methodology and computer systems for project management is proposed. The requirements for the documentation and actual results of the students' projects are described in detail. The requirements for computer-aided project management systems developed using Microsoft Project, covering schedule and resource planning, are formulated.

  12. Development of methodology for the analysis of fuel behavior in light water reactor in design basis accidents

    International Nuclear Information System (INIS)

    Salatov, A. A.; Goncharov, A. A.; Eremenko, A. S.; Kuznetsov, V. I.; Bolnov, V. A.; Gusev, A. S.; Dolgov, A. B.; Ugryumov, A. V.

    2013-01-01

    The report attempts to analyze the current experience with the safety of fuel for light-water reactors (LWRs) under design-basis accident conditions in terms of its compliance with international requirements for licensing nuclear power plants. The components of a fuel behavior analysis methodology for design basis accidents in LWRs were considered: the classification of design basis accidents, the phenomenology of fuel behavior in design basis accidents, the system of fuel safety criteria and their experimental support, the applicability of the computer codes and input data used for computational analysis of fuel behavior in accidents, and the way of accounting for the uncertainty of calculation models and input data. A brief history of the development of probabilistic safety analysis methodology for nuclear power plants abroad is considered. Examples of a conservative approach to the safety analysis of VVER fuel and of a probabilistic approach to the safety analysis of TVS-K fuel are presented. In the authors' opinion, the actual problems in developing the methodology for analyzing the behavior of VVER fuel under design basis accident conditions are the following: 1) development of a common methodology for analyzing the behavior of VVER fuel in design basis accidents, implementing a realistic approach to the analysis of uncertainty; in the future this is necessary for the licensing of operating VVER fuel abroad; 2) experimental and analytical support of the methodology: experimental studies to identify and characterize the key uncertainties of the computational models of the fuel and cladding, development of computational models of key events in the codes, and validation of the codes on the basis of integral experiments

  13. Conceptual design of a thermo-electrical energy storage system based on heat integration of thermodynamic cycles – Part A: Methodology and base case

    International Nuclear Information System (INIS)

    Morandin, Matteo; Maréchal, François; Mercangöz, Mehmet; Buchter, Florian

    2012-01-01

    The interest in large scale electricity storage (ES) with discharging times longer than 1 h and nominal power greater than 1 MW is increasing worldwide as the increasing share of renewable energy, typically solar and wind energy, imposes severe load management issues. Thermo-electrical energy storage (TEES) based on thermodynamic cycles is currently under investigation at ABB corporate research as an alternative solution to pumped hydro and compressed air energy storage. TEES is based on the conversion of electricity into thermal energy during charge by means of a heat pump and on the conversion of thermal energy into electricity during discharge by means of a thermal engine. The synthesis and the thermodynamic optimization of a TEES system based on hot water, ice storage and transcritical CO2 cycles is discussed in two papers. In this first paper a methodology for the conceptual design of a TEES system based on the analysis of the thermal integration between charging and discharging cycles through Pinch Analysis tools is introduced. According to this methodology, the heat exchanger network and the temperatures and volumes of the storage tanks are not defined a priori but are determined after the cycle parameters are optimized. For this purpose a heuristic procedure based on the interpretation of the composite curves obtained by optimizing the thermal integration between the cycles was developed. These heuristic rules were implemented in a code that automatically finds the complete system design for given values of the intensive parameters of the charging and discharging cycles only. A base case system configuration is introduced and the results of its thermodynamic optimization are discussed here. A maximum roundtrip efficiency of 60% was obtained for the base case configuration assuming turbomachinery and heat exchanger performances in line with indications from manufacturers.

  14. Effectiveness of the management of price risk methodologies for the corn market based on trading signals

    Directory of Open Access Journals (Sweden)

    W. Rossouw

    2013-03-01

    Full Text Available Corn production is scattered geographically over various continents, but most of it is grown in the United States. As such, the world price of corn futures contracts is largely dominated by North American corn prices as traded on the Chicago Board of Trade. In recent years, this market has been characterised by an increase in price volatility and magnitude of price movement as a result of decreasing stock levels. The development and implementation of an effective and successful derivative price risk management strategy based on the Chicago Board of Trade corn futures contract will therefore be of inestimable value to market stakeholders worldwide. The research focused on the efficient market hypothesis and the possibility of contesting this phenomenon through an application of a derivative price risk management methodology. The methodology is based on a combination of an analysis of market trends and technical oscillators with the objective of generating returns superior to those of a market benchmark. The study found that market participants are currently unable to exploit price movement in a manner which results in returns that contest the notion of efficient markets. The methodology proposed, however, does allow the user to consistently achieve returns superior to those of a predetermined market benchmark. The benchmark price for the purposes of this study was the average price offered by the market over the contract lifetime, and as such, the efficient market hypothesis was successfully contested.
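
    A trend-following signal of the kind such methodologies combine with oscillators can be sketched as a fast/slow moving-average crossover. In the Python example below, the price series and the window lengths are invented, not CBOT corn data, and the rule is a generic stand-in rather than the study's actual trading system.

        # Hedged sketch of a moving-average crossover trading signal.
        # Prices and window lengths are invented illustration data.

        def moving_average(prices, window):
            return [sum(prices[i - window + 1:i + 1]) / window
                    for i in range(window - 1, len(prices))]

        prices = [400, 402, 405, 403, 408, 412, 415, 413, 410, 416, 420, 418]
        fast = moving_average(prices, 3)
        slow = moving_average(prices, 5)

        offset = len(fast) - len(slow)   # align both series on their common tail
        for day, (f, s) in enumerate(zip(fast[offset:], slow), start=5):
            print(f"day {day}: fast={f:.1f} slow={s:.1f} -> {'long' if f > s else 'short'}")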

  15. Methodology for thermal hydraulic conceptual design and performance analysis of KALIMER core

    International Nuclear Information System (INIS)

    Young-Gyun Kim; Won-Seok Kim; Young-Jin Kim; Chang-Kue Park

    2000-01-01

    This paper summarizes the methodology for thermal hydraulic conceptual design and performance analysis used for the KALIMER core, covering in detail the preliminary methodology for flow grouping and peak pin temperature calculation. The major technical results of the conceptual design for the KALIMER 98.03 core are shown and compared with those of the KALIMER 97.07 design core. The KALIMER 98.03 design core proved to be more optimized than the 97.07 design core: the number of flow groups was reduced from 16 to 11, and the equalized peak cladding midwall temperature from 654 deg. C to 628 deg. C. This was achieved through the nuclear and thermal hydraulic design optimization study, i.e., core power flattening and an increased radial blanket power fraction. Coolant flow distribution to the assemblies and core coolant/component temperatures should be determined in core thermal hydraulic analysis. Sodium flow is distributed to core assemblies with the overall goal of equalizing the peak cladding midwall temperatures for the peak temperature pin of each bundle, and thus pin cladding damage accumulation and pin reliability. The flow grouping and the peak pin temperature calculation for the preliminary conceptual design are performed with the modules ORFCE-F60 and ORFCE-T60, respectively. The basic subchannel analysis will be performed with the SLTHEN code, and the detailed subchannel analysis will be done with the MATRA-LMR code, which is under development for the K-Core system. The methodology proved practical for KALIMER core thermal hydraulic design in the related benchmark calculation studies, and it is used for the KALIMER core thermal hydraulic conceptual design. (author)
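
    The flow-grouping step can be caricatured in a few lines: give each assembly the flow that equalizes its coolant temperature rise, then snap those ideal flows to a small number of discrete groups, which is what orificing hardware permits. The Python sketch below uses invented powers, temperatures, and a rough sodium specific heat, not KALIMER design data, and stands in for what the ORFCE modules do far more carefully.

        # Sketch of flow grouping: ideal per-assembly flow from an energy
        # balance, then rounding to a few discrete orifice groups.
        # All numbers are illustrative assumptions.

        CP = 1250.0                      # J/(kg K), approximate sodium specific heat
        T_IN, T_OUT = 390.0, 545.0       # deg C, core inlet and target outlet

        powers_mw = [6.2, 5.8, 5.1, 4.4, 3.0, 1.2]
        ideal = [p * 1.0e6 / (CP * (T_OUT - T_IN)) for p in powers_mw]   # kg/s

        n_groups = 3                     # orificing allows only a few discrete flows
        lo, hi = min(ideal), max(ideal)
        step = (hi - lo) / (n_groups - 1)
        grouped = [lo + round((m - lo) / step) * step for m in ideal]

        for p, m in zip(powers_mw, grouped):
            t_out = T_IN + p * 1.0e6 / (CP * m)   # outlet temperature with grouped flow
            print(f"{p:4.1f} MW -> {m:5.1f} kg/s, outlet {t_out:5.1f} C")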

  16. The Nuclear Organization and Management Analysis Concept methodology: Four years later

    International Nuclear Information System (INIS)

    Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

    1992-01-01

    The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology have been demonstrated at two commercial nuclear power plants (NPPs) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations

  17. A Quantitative Analysis of an EEG Epileptic Record Based on Multiresolution Wavelet Coefficients

    Directory of Open Access Journals (Sweden)

    Mariel Rosenblatt

    2014-11-01

    Full Text Available The characterization of the dynamics associated with the electroencephalogram (EEG) signal, combining an orthogonal discrete wavelet transform analysis with quantifiers originating from information theory, is reviewed. In addition, an extension of this methodology based on multiresolution quantities, called wavelet leaders, is presented. In particular, the temporal evolution of the Shannon entropy and the statistical complexity evaluated with different sets of multiresolution wavelet coefficients is considered. Both methodologies are applied to the quantitative EEG time series analysis of a tonic-clonic epileptic seizure, and comparative results are presented. Even when both methods describe the dynamical changes of the EEG time series, the one based on wavelet leaders presents a better time resolution.
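
    A minimal sketch of the wavelet-quantifier part of the methodology, using the PyWavelets package on a synthetic signal: relative wavelet energies per detail level and the normalized Shannon (wavelet) entropy of that distribution. The statistical complexity and the wavelet-leader extension are omitted; the signal, wavelet and level choices are assumptions.

```python
# Hedged sketch: relative wavelet energies per resolution level and the
# normalized Shannon entropy of that distribution, on a toy "EEG" signal.
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(1)
t = np.linspace(0, 4, 1024)
eeg = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.normal(size=t.size)  # toy signal

coeffs = pywt.wavedec(eeg, "db4", level=5)                 # orthogonal DWT
energies = np.array([np.sum(c ** 2) for c in coeffs[1:]])  # detail levels only
p = energies / energies.sum()                              # relative energies

shannon = -np.sum(p * np.log(p))                           # wavelet entropy
normalized = shannon / np.log(len(p))                      # scaled to [0, 1]
print(np.round(p, 3), round(normalized, 3))
```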

  18. Methodology of Maqamat Hamadani and Hariri Based on Buseman’s statistical methodology

    Directory of Open Access Journals (Sweden)

    Hamed Sedghi

    2016-01-01

    Full Text Available Abstract: Stylistics can be defined as the analysis and interpretation of expression and of different forms of speech, based on linguistic elements. The German theorist Buseman devised his statistical theses on style based on the ratio of verbs to adjectives, generalizing over a variety of literary and non-literary genres in German language and literature. According to Buseman, increasing the number of verbs in verses brings a text closer to the literary style, whereas increasing the number of adjectives brings it closer to a scientific or descriptive style. The most important achievements of Buseman's statistical methodology can be considered as: a) comparison of authors' styles, literary periods and genres; b) study of the language system and the variety of words; and c) classification and grading of differences or similarities among literary works, periods and genres. The purpose of this study: a stylistic analysis of the Maqamat of Hamadani and al-Hariri based on the statistical model of Buseman. The questions proposed in this study: a) How effective is the use of statistical methods in identifying and analyzing a variety of literature, including Maqamat? b) How effective is the quantitative scope of verbs and adjectives, as two key parameters, in determining the style of literary works? And c) Which element of fiction is most effective in enriching the literary approach? The specific research method is statistical-analytical; we initially identified and classified the number of verbs and adjectives in the fifty-one Maqamehs of Hamadani and the fifty Maqamehs of Hariri; then the scope of verb and adjective usage is shown in the form of tables and graphs. After that, we assess the style of the literary works based on the use of verbs and adjectives. The research findings show: all Hamadani and Hariri Maqamat quantitatively benefit from a highly active approach in the use of the verb. At 46 Maqameh Hamadani, and
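
    Buseman's quotient itself is simple to compute once a text is part-of-speech tagged; the toy sketch below counts verbs and adjectives in an invented tagged token list. A real replication would require an Arabic POS tagger and the actual Maqamat corpora.

```python
# Toy illustration of Buseman's verb/adjective quotient on pre-tagged text.
# The token/tag pairs are invented for the example.
tagged = [("rose", "VERB"), ("eloquent", "ADJ"), ("spoke", "VERB"),
          ("wandered", "VERB"), ("strange", "ADJ"), ("begged", "VERB")]

n_verbs = sum(1 for _, tag in tagged if tag == "VERB")
n_adjs = sum(1 for _, tag in tagged if tag == "ADJ")

# Buseman's quotient: ratio of verbs to adjectives; higher values are read
# as a more "active" (literary) style, lower values as a more descriptive one.
q = n_verbs / n_adjs
print(f"verbs={n_verbs}, adjectives={n_adjs}, Buseman quotient={q:.2f}")
```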

  19. Methodologies for risk analysis in slope instability

    International Nuclear Information System (INIS)

    Bernabeu Garcia, M.; Diaz Torres, J. A.

    2014-01-01

    This paper is an overview of the different methodologies used in producing landslide risk maps, intended to give the reader a basic understanding of how to proceed in their development. Landslide hazard maps are increasingly demanded by governments because, owing to climate change, deforestation and the pressure exerted by the growth of urban centers, the damage caused by natural phenomena is increasing each year, making this a field of study of growing importance. To explain the mapping process, the paper walks through each of its phases: from the study of the types of slope movements and the necessary handling of geographic information systems (GIS), through landslide inventories and susceptibility analysis, to hazard, vulnerability and risk. (Author)

  20. Putting Order into Our Universe: The Concept of Blended Learning—A Methodology within the Concept-based Terminology Framework

    Directory of Open Access Journals (Sweden)

    Joana Fernandes

    2016-06-01

    Full Text Available This paper aims at discussing the advantages of a methodology design grounded in a concept-based approach to Terminology, applied to the most prominent scenario of current Higher Education: blended learning. Terminology is a discipline that aims at representing, describing and defining specialized knowledge through language, putting order into our universe (Nuopponen, 2011). Concepts, as elements of the structure of knowledge (Sager, 1990), emerge as a complex research object. Can they be found in language? A concept-based approach to Terminology implies a clear-cut view of the role of language in terminological work: though language is postulated as a fundamental tool to grasp, describe and organize knowledge, an isomorphic relationship between language and knowledge cannot be taken for granted. In other words, the foundational premise of a concept-based approach is that there is no one-to-one correspondence between atomic elements of knowledge and atomic elements of linguistic expression. This is why a methodological approach to Terminology based merely upon specialized text research is regarded as biased (Costa, 2013). As a consequence, we argue that interactional strategies between terminologist and domain expert deserve particular research attention. To our mind, the key to concept-based terminological work is to carry out a concept analysis of data gathered from a specialised text corpus, combined with an elicitation process of tacit knowledge and concept-oriented discursive negotiation. Following this view, we put forward a methodology to answer the question: how is blended learning defined in the post-Bologna scenario? Even though there are numerous high-quality models and practical descriptions for its implementation (similarly to other concepts related to distance learning), the need to understand, demarcate and harmonize the concept of blended learning against the current Higher Education background results from the premise that

  1. Reporting and methodological quality of survival analysis in articles published in Chinese oncology journals.

    Science.gov (United States)

    Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying

    2017-12-01

    Survival analysis methods have gained widespread use in the field of oncology. For reliable results, the methodological process and reporting quality are crucial. This review provides the first examination of the methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. The aims are to examine the methodological and reporting quality of survival analysis, to identify common deficiencies, to recommend desirable precautions in the analysis, and to offer related advice for authors, readers, and editors. A total of 242 survival analysis articles were included for evaluation from 1492 articles published in 4 leading Chinese oncology journals in 2013. Articles were evaluated according to 16 established items for proper use and reporting of survival analysis. The application rates of Kaplan-Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used a parametric method for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 articles were under 80% (the lowest being 75.25%) and 55 articles were 100%. The reporting rates of all types of survival endpoint were lower than 10%. Eleven of the 100 articles that reported a loss to follow-up stated how it was treated in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks for the proportional hazards assumption; no report of testing for interactions and collinearity between independent variables; and no report of the sample size calculation method. Thirty-six articles (32.74%) reported the methods of independent variable selection. The above defects could make potentially inaccurate
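
    A sketch of the core analyses the review audits (Kaplan-Meier estimation, log-rank test, Cox model with a proportional hazards check), using the lifelines package on invented toy data; column names and values are assumptions.

```python
# Hedged sketch of the audited analyses on toy data, using lifelines.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "time": [5, 8, 12, 20, 24, 30, 33, 40],   # months; survival time defined
    "event": [1, 1, 0, 1, 0, 1, 1, 0],        # 1 = death, 0 = censored
    "arm": [0, 0, 0, 0, 1, 1, 1, 1],
    "age": [60, 72, 55, 66, 58, 70, 63, 61],
})

kmf = KaplanMeierFitter().fit(df["time"], df["event"])     # KM estimate
res = logrank_test(df.loc[df.arm == 0, "time"], df.loc[df.arm == 1, "time"],
                   df.loc[df.arm == 0, "event"], df.loc[df.arm == 1, "event"])

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.check_assumptions(df)   # pertinent check of the proportional hazards assumption
print(res.p_value, cph.hazard_ratios_)
```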

  2. SPRINT RA 230: Methodology for knowledge based developments

    International Nuclear Information System (INIS)

    Wallsgrove, R.; Munro, F.

    1991-01-01

    SPRINT RA 230: A Methodology for Knowledge Based Developments, funded by the European Commission, was set up to investigate the use of KBS in the engineering industry. Its aim was to find out how KBS were currently used and what people's conceptions of them were, to disseminate current knowledge and to recommend further research into this area. A survey (by post and face-to-face interviews) was carried out under SPRINT RA 230 to investigate requirements for more intelligent software. In the survey we looked both at how people think about Knowledge Based Systems (KBS), what they find useful and what is not useful, and at what current expertise problems or limitations of conventional software might suggest KBS solutions. (orig./DG)

  3. Source apportionment and sensitivity analysis: two methodologies with two different purposes

    Science.gov (United States)

    Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe

    2017-11-01

    This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is placed on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and the decoupled direct method (DDM). A simple theoretical example is used to compare these approaches, highlighting differences and potential implications for policy. When the relationships between concentration and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used indifferently for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
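
    The central point can be reproduced numerically with an assumed quadratic concentration-emission relationship: under nonlinearity, the impact of fully removing a source (sensitivity analysis) no longer equals its apportioned contribution. All numbers below are invented.

```python
# Numerical illustration with an assumed nonlinear (quadratic)
# concentration-emission relationship for two sources.
def concentration(e1, e2):
    return (e1 + e2) ** 2          # toy nonlinear chemistry

e1, e2 = 3.0, 1.0
c = concentration(e1, e2)

# Sensitivity-analysis "impact" of source 1: brute-force 100% reduction.
impact_1 = c - concentration(0.0, e2)

# Tagged-species-like "contribution": share of the total attributed to source 1.
contribution_1 = c * e1 / (e1 + e2)

print(c, impact_1, contribution_1)   # 16.0, 15.0, 12.0 -> impact != contribution
```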

  4. Comparison of IRMS Delhi Methodology with WHO Methodology on Immunization Coverage

    Directory of Open Access Journals (Sweden)

    Singh Padam

    1996-01-01

    Full Text Available Research question: What are the merits of the IRMS model over the WHO model for coverage evaluation surveys? Which method is superior and appropriate for coverage evaluation surveys of immunization in our setting? Objective: To compare the IRMS Delhi methodology with the WHO methodology on immunization coverage. Study design: Cross-sectional. Setting: Both urban and rural. Participants: Mothers & children. Sample size: 300 children between 1-2 years and 300 mothers in rural areas; 75 children and 75 mothers in urban areas. Study variables: Rural, urban, caste group, size of the stratum, literacy, sex and cost effectiveness. Outcome variables: Coverage level of immunization. Analysis: Routine statistical analysis. Results: The IRMS-developed methodology scores a better rating than the WHO methodology, especially when coverage evaluation is attempted in medium-size villages with socio-economic segregation, which remains the main characteristic of Indian villages.

  5. Development of risk assessment methodology against natural external hazards for sodium-cooled fast reactors: project overview and strong wind PRA methodology - 15031

    International Nuclear Information System (INIS)

    Yamano, H.; Nishino, H.; Kurisaka, K.; Okano, Y.; Sakai, T.; Yamamoto, T.; Ishizuka, Y.; Geshi, N.; Furukawa, R.; Nanayama, F.; Takata, T.; Azuma, E.

    2015-01-01

    This paper describes mainly the strong wind probabilistic risk assessment (PRA) methodology development, in addition to the project overview. In this project, to date, the PRA methodologies against snow, tornado and strong wind were developed, as well as the hazard evaluation methodologies. For the volcanic eruption hazard, an ash fallout simulation was carried out to contribute to the development of the hazard evaluation methodology. For the forest fire hazard, the concept of the hazard evaluation methodology was developed based on fire simulation. An event sequence assessment methodology was also developed, based on plant dynamics analysis coupled with a continuous Markov chain Monte Carlo method, for application to the event sequence against snow. In developing the strong wind PRA methodology, hazard curves were estimated by using Weibull and Gumbel distributions based on weather data recorded in Japan. The obtained hazard curves were divided into five discrete categories for event tree quantification. Next, failure probabilities for decay heat removal related components were calculated as a product of two probabilities: i.e., a probability for the missiles to enter the intake or out-take in the decay heat removal system, and the fragility caused by the missile impacts. Finally, based on the event tree, the core damage frequency was estimated at about 6×10⁻⁹/year by multiplying the discrete hazard probabilities in the Gumbel distribution by the conditional decay heat removal failure probabilities. A dominant sequence was led by the assumption that the operators could not extinguish a fuel tank fire caused by the missile impacts and that the fire induced a loss of the decay heat removal system. (authors)
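
    A sketch of the quantification chain described above, with all numbers invented: fit a Gumbel distribution to annual-maximum wind speeds, discretize the hazard into five categories, and sum hazard probability times conditional failure probability to approximate a core damage frequency.

```python
# Sketch of the hazard-curve quantification chain (all numbers invented).
import numpy as np
from scipy import stats

annual_max_wind = np.array([28, 31, 25, 35, 30, 27, 33, 29, 36, 26])  # m/s, toy
loc, scale = stats.gumbel_r.fit(annual_max_wind)

edges = np.array([30, 35, 40, 45, 50, 60])        # category bounds, m/s
exceed = stats.gumbel_r.sf(edges, loc, scale)     # annual exceedance probabilities
cat_prob = exceed[:-1] - exceed[1:]               # probability of each category

# Assumed conditional probabilities of losing decay heat removal per category.
p_dhr_fail = np.array([1e-6, 1e-5, 1e-4, 1e-3, 1e-2])

cdf_per_year = np.sum(cat_prob * p_dhr_fail)      # core damage frequency
print(f"CDF ~ {cdf_per_year:.1e} /year")
```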

  6. National Waste Repository Novi Han operational safety analysis report. Safety assessment methodology

    International Nuclear Information System (INIS)

    2003-01-01

    The scope of the safety assessment (SA) presented includes: waste management functions (acceptance, conditioning, storage, disposal), inventory (current and expected in the future), hazards (radiological and non-radiological) and normal and accidental modes. The stages in the development of the SA are: criteria selection, information collection, safety analysis and safety assessment documentation. After reviewing the facility's functions and the national and international requirements, the criteria for safety level assessment are set. As a result of the 2nd stage, the actual facility parameters necessary for the safety analysis are obtained. The methodology is selected on the basis of the comparability of the results with those of previous safety assessments and with existing standards and requirements. The procedure and requirements for scenario selection are described. A radiological hazard categorisation of the facilities is presented. A qualitative hazards and operability analysis is applied. The resulting list of events is subjected to a prioritization procedure by the method of 'criticality analysis', so that an estimation of the risk is given for each event. The events whose risk falls on the boundary of acceptability, or is unacceptable, are subjected to the next steps of the analysis. As a result, the lists of scenarios for PSA and possible design scenarios are established. PSA logical modeling and quantitative calculations of accident sequences are presented.

  7. Pressure-based high-order TVD methodology for dynamic stall control

    Science.gov (United States)

    Yang, H. Q.; Przekwas, A. J.

    1992-01-01

    The quantitative prediction of the dynamics of separating unsteady flows, such as dynamic stall, is of crucial importance. This six-month SBIR Phase 1 study has developed several new pressure-based methodologies for solving the 3D Navier-Stokes equations in both stationary and moving (body-conforming) coordinates. The present pressure-based algorithm is equally efficient for low speed incompressible flows and high speed compressible flows. The discretization of convective terms by the newly developed high-order TVD schemes requires no artificial dissipation and can properly resolve the concentrated vortices in the wing-body flow with minimal numerical diffusion. It is demonstrated that the proposed Newton's iteration technique not only increases the convergence rate but also strongly couples the iteration between pressure and velocities. The proposed hyperbolization of the pressure correction equation is shown to increase the solver's efficiency. The above methodologies were implemented in an existing CFD code, REFLEQS. The modified code was used to simulate both static and dynamic stall on two- and three-dimensional wing-body configurations. Three-dimensional effects and flow physics are discussed.
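
    As a minimal illustration of the TVD ingredient (not the paper's pressure-based Navier-Stokes solver), the sketch below applies a minmod-limited second-order upwind scheme to 1-D linear advection; the limiter suppresses spurious oscillations without adding explicit artificial dissipation.

```python
# Minmod-limited second-order upwind (TVD) scheme for 1-D linear advection.
import numpy as np

def minmod(a, b):
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

nx, c = 200, 0.5                       # cells, Courant number (c <= 1 for stability)
u = np.zeros(nx); u[40:80] = 1.0       # square-wave initial condition

for _ in range(100):
    du = np.diff(u)                                   # u[i+1] - u[i]
    slope = np.zeros(nx)
    slope[1:-1] = minmod(du[:-1], du[1:])             # limited cell slopes
    u_face = u + 0.5 * (1 - c) * slope                # 2nd-order face values
    flux = c * u_face                                 # upwind flux (wave speed > 0)
    u[1:] -= flux[1:] - flux[:-1]                     # conservative update
    # (left boundary cell held fixed for simplicity)

print(u.min(), u.max())   # stays within [0, 1]: no new extrema (TVD)
```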

  8. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the two methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
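
    The uncertainty quantities involved can be illustrated on toy replicate data, with the maximum sampling error taken as the half-width of the Student-t confidence interval of the mean for a TG-derived property such as ash content. The values and the 95% level are assumptions.

```python
# Confidence interval of the mean and maximum sampling error (toy replicates).
import numpy as np
from scipy import stats

ash = np.array([4.9, 5.3, 5.1, 4.8, 5.4, 5.0])   # % ash from replicate TG runs
n = ash.size
mean, s = ash.mean(), ash.std(ddof=1)

t_crit = stats.t.ppf(0.975, df=n - 1)            # 95% two-sided
max_sampling_error = t_crit * s / np.sqrt(n)     # half-width of the CI

print(f"ash = {mean:.2f} +/- {max_sampling_error:.2f} % (95% CI)")
```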

  9. A dose to curie conversion methodology

    International Nuclear Information System (INIS)

    Stowe, P.A.

    1987-01-01

    Development of the computer code RadCAT (Radioactive waste Classification And Tracking) has led to the development of a simple dose rate to curie content conversion methodology for containers with internally distributed radioactive material. It was determined early on that, if possible, the computerized dose rate to curie evaluation model employed in RadCAT should yield the same results as the hand method utilized and specified in plant procedures. A review of current industry practices indicated that two distinct types of computational methodologies are presently in use. The most common are computer-based calculations utilizing complex mathematical models specifically established for various container geometries. This type of evaluation is tedious, however, and does not lend itself to repetition by hand. The second method of evaluation is simplified expressions that sacrifice accuracy for ease of computation and generally overestimate container curie content. To meet the aforementioned criterion, current computer-based models were deemed unacceptably complex and hand computational methods too inaccurate for serious consideration. The contact dose rate/curie content analysis methodology presented herein provides an equation that is easy to use in hand calculations yet offers accuracy equivalent to other computer-based computations.
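
    The RadCAT equation itself is not given in the record. As a generic, heavily simplified stand-in, the sketch below inverts the textbook point-source relation D = Γ A / r² to infer activity from a measured dose rate; the gamma constant is an approximate value for Cs-137, and self-absorption in the container is ignored.

```python
# Generic point-source dose-rate-to-activity inversion (not the RadCAT equation).
gamma_const = 0.33   # R*m^2/(h*Ci), approximate gamma constant for Cs-137
dose_rate = 0.050    # measured dose rate at 1 m, R/h (assumed)
r = 1.0              # source-to-detector distance, m

activity_ci = dose_rate * r**2 / gamma_const
print(f"inferred activity ~ {activity_ci:.3f} Ci")
```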

  10. The decade 1989-1998 in Spanish psychology: an analysis of research in statistics, methodology, and psychometric theory.

    Science.gov (United States)

    García-Pérez, M A

    2001-11-01

    This paper presents an analysis of research published in the decade 1989-1998 by Spanish faculty members in the areas of statistical methods, research methodology, and psychometric theory. Database search and direct correspondence with faculty members in Departments of Methodology across Spain rendered a list of 193 papers published in these broad areas by 82 faculty members. These and other faculty members had actually published 931 papers over the decade of analysis, but 738 of them addressed topics not appropriate for description in this report. Classification and analysis of these 193 papers revealed topics that have attracted the most interest (psychophysics, item response theory, analysis of variance, sequential analysis, and meta-analysis) as well as other topics that have received less attention (scaling, factor analysis, time series, and structural models). A significant number of papers also dealt with various methodological issues (software, algorithms, instrumentation, and techniques). A substantial part of this report is devoted to describing the issues addressed across these 193 papers--most of which are written in the Spanish language and published in Spanish journals--and some representative references are given.

  11. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela D A Guardia

    Full Text Available Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  12. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.
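
    A hedged sketch of what a RESTful wrapper around one gene expression analysis step might look like, using Flask; the endpoint name, payload format and toy normalization are invented and are not the GEAS API.

```python
# Minimal RESTful wrapper around a toy normalization step (hypothetical API).
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/normalize", methods=["POST"])
def normalize():
    payload = request.get_json()
    values = payload["expression"]                 # list of raw expression values
    total = sum(values) or 1.0
    scaled = [v / total * 1e6 for v in values]     # toy counts-per-million scaling
    return jsonify({"normalized": scaled})

if __name__ == "__main__":
    app.run(port=5000)
```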

  13. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  14. Finite Element Based Response Surface Methodology to Optimize Segmental Tunnel Lining

    Directory of Open Access Journals (Sweden)

    A. Rastbood

    2017-04-01

    Full Text Available The main objective of this paper is to optimize the geometrical and engineering characteristics of the concrete segments of tunnel lining using Finite Element (FE) based Response Surface Methodology (RSM). Input data for the RSM statistical analysis were obtained using FEM. In the RSM analysis, the thickness (t) and elasticity modulus (E) of the concrete segments, tunnel height (H), horizontal to vertical stress ratio (K) and position of the key segment in the tunnel lining ring (θ) were considered as independent input variables. Maximum values of the von Mises and Tresca stresses and the tunnel ring displacement (UMAX) were set as responses. Analysis of variance (ANOVA) was carried out to investigate the influence of each input variable on the responses. Second-order polynomial equations in terms of the influencing input variables were obtained for each response. It was found that the elasticity modulus and key segment position variables were not included in the yield stress and ring displacement equations, and that only the tunnel height and stress ratio variables were included in the ring displacement equation. Finally, an optimization analysis of the tunnel lining ring was performed. Since the elasticity modulus and key segment position variables were absent from the equations, their values were kept at the average level and the other variables were varied over their related ranges. Response parameters were set to minimum. It was concluded that to obtain optimum values for the responses, the ring thickness and tunnel height must be near their maximum and minimum values, respectively, and the ground state must be similar to hydrostatic conditions.
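
    The RSM step can be sketched as fitting a second-order polynomial response surface to FE-style input/output samples by least squares; the sketch below does this for two of the five variables (t and H) against UMAX, with invented numbers.

```python
# Fit a second-order response surface to FE-style samples (invented numbers).
import numpy as np

# columns: thickness t (m), tunnel height H (m); response: UMAX (mm)
X = np.array([[0.25, 8.0], [0.25, 12.0], [0.35, 8.0],
              [0.35, 12.0], [0.30, 10.0], [0.30, 8.0]])
y = np.array([14.2, 21.5, 10.1, 16.3, 14.8, 11.0])

t, H = X[:, 0], X[:, 1]
# Second-order model terms: 1, t, H, t^2, H^2, t*H
A = np.column_stack([np.ones_like(t), t, H, t**2, H**2, t * H])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def umax(t, H):
    return coef @ np.array([1.0, t, H, t**2, H**2, t * H])

print(np.round(coef, 3), round(umax(0.32, 9.0), 2))
```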

  15. Transuranium analysis methodologies for biological and environmental samples

    International Nuclear Information System (INIS)

    Wessman, R.A.; Lee, K.D.; Curry, B.; Leventhal, L.

    1978-01-01

    Analytical procedures for the most abundant transuranium nuclides in the environment (i.e., plutonium and, to a lesser extent, americium) are available. There is a lack of procedures for doing sequential analysis for Np, Pu, Am, and Cm in environmental samples, primarily because of current emphasis on Pu and Am. Reprocessing requirements and waste disposal connected with the fuel cycle indicate that neptunium and curium must be considered in environmental radioactive assessments. Therefore it was necessary to develop procedures that determine all four of these radionuclides in the environment. The state of the art of transuranium analysis methodology as applied to environmental samples is discussed relative to different sample sources, such as soil, vegetation, air, water, and animals. Isotope-dilution analysis with 243Am (239Np) and 236Pu or 242Pu radionuclide tracers is used. Americium and curium are analyzed as a group, with 243Am as the tracer. Sequential extraction procedures employing bis(2-ethylhexyl)orthophosphoric acid (HDEHP) were found to result in lower yields and higher Am-Cm fractionation than ion-exchange methods

  16. Application of System Dynamics Methodology in Population Analysis

    Directory of Open Access Journals (Sweden)

    August Turina

    2009-09-01

    Full Text Available The goal of this work is to present the application of system dynamics and system thinking, as well as the advantages and possible defects of this analytic approach, in order to improve the analysis of complex systems such as population and, thereby, to monitor more effectively the underlying causes of migrations. This methodology has long been present in interdisciplinary scientific circles, but its scientific contribution has not been sufficiently applied in analysis practice in Croatia. Namely, the major part of system analysis is focused on detailed complexity rather than on dynamic complexity. Generally, the science of complexity deals with emergence, innovation, learning and adaptation. Complexity is viewed according to the number of system components, or through a number of combinations that must be continually analyzed in order to understand and consequently provide adequate decisions. Simulations containing thousands of variables and complex arrays of details distract overall attention from the basic cause patterns and key inter-relations emerging and prevailing within an analyzed population. Systems thinking offers a holistic and integral perspective for observation of the world.

  17. The impact of methodology in innovation measurement

    Energy Technology Data Exchange (ETDEWEB)

    Wilhelmsen, L.; Bugge, M.; Solberg, E.

    2016-07-01

    Innovation surveys and rankings such as the Community Innovation Survey (CIS) and Innovation Union Scoreboard (IUS) have developed into influential diagnostic tools that are often used to categorize countries according to their innovation performance and to legitimise innovation policies. Although a number of ongoing processes are seeking to improve existing frameworks for measuring innovation, there are large methodological differences across countries in the way innovation is measured. This causes great uncertainty regarding a) the coherence between data from innovation surveys, b) actual innovativeness of the economy, and c) the validity of research based on innovation data. Against this background we explore empirically how different survey methods for measuring innovation affect reported innovation performance. The analysis is based on a statistical exercise comparing the results from three different methodological versions of the same survey for measuring innovation in the business enterprise sector in Norway. We find striking differences in reported innovation performance depending on how the surveys are carried out methodologically. The paper concludes that reported innovation performance is highly sensitive to and strongly conditioned by methodological context. This represents a need for increased caution and awareness around data collection and research based on innovation data, and not least in terms of aggregation of data and cross-country comparison. (Author)

  18. RANS based CFD methodology for a real scale 217-pin wire-wrapped fuel assembly of KAERI PGSFR

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Jae-Ho, E-mail: jhjeong@kaeri.re.kr [Korea Atomic Energy Research Institute, 989-111 Daedeok-daero, Yuseoung-gu, Daejeon (Korea, Republic of); Song, Min-Seop [Department of Nuclear Engineering, Seoul National University, 559 Gwanak-ro, Gwanak-gu, Seoul (Korea, Republic of); Lee, Kwi-Lim [Korea Atomic Energy Research Institute, 989-111 Daedeok-daero, Yuseoung-gu, Daejeon (Korea, Republic of)

    2017-03-15

    Highlights: • This paper presents a suitable way toward a practical RANS based CFD methodology applicable to the real scale 217-pin wire-wrapped fuel assembly of the KAERI PGSFR. • A key point of differentiation of the RANS based CFD methodology in this study is the adoption of an innovative grid generation method using a Fortran-based in-house code with a GGI function in a general-purpose commercial CFD code, CFX. • The RANS based CFD methodology is implemented with a high resolution scheme and the SST turbulence model in the 7-pin, 37-pin, and 127-pin wire-wrapped fuel assemblies of PNC and JNC. Furthermore, the RANS based CFD methodology can be successfully extended to the real scale 217-pin wire-wrapped fuel bundles of the KAERI PGSFR. • Three-dimensional thermal-hydraulic characteristics have also been investigated briefly. - Abstract: This paper presents a suitable way toward a practical RANS (Reynolds Averaged Navier-Stokes simulation) based CFD (Computational Fluid Dynamics) methodology applicable to the real scale 217-pin wire-wrapped fuel assembly of the KAERI (Korea Atomic Energy Research Institute) PGSFR (Prototype Gen-IV Sodium-cooled Fast Reactor). The main purpose of the current study is to support licensing issues for the KAERI PGSFR core safety and to elucidate the thermal-hydraulic characteristics in a 217-pin wire-wrapped fuel assembly of the KAERI PGSFR. A key point of differentiation of the RANS based CFD methodology in this study is the adoption of an innovative grid generation method using a Fortran-based in-house code with a GGI (General Grid Interface) function in a general-purpose commercial CFD code, CFX. The innovative grid generation method with the GGI function makes it possible to simulate a real wire shape while minimizing cell skewness. The RANS based CFD methodology is implemented with a high resolution scheme for the convection term and the SST (Shear Stress Transport) turbulence model in the 7-pin, 37-pin, and 127-pin wire-wrapped fuel assemblies of PNC (Power reactor and Nuclear fuel

  19. The Spirit of OMERACT: Q Methodology Analysis of Conference Characteristics Valued by Delegates.

    Science.gov (United States)

    Flurey, Caroline A; Kirwan, John R; Hadridge, Phillip; Richards, Pamela; Grosskleg, Shawna; Tugwell, Peter S

    2015-10-01

    To identify the major features of OMERACT meetings as valued by frequent participants, and to explore whether there are groups of participants with different opinions. Using Q methodology (a qualitative and quantitative approach to grouping people according to subjective opinion), participants (who had attended more than 1 OMERACT conference) sorted 66 statements relating to the "spirit of OMERACT" according to level of agreement across a normal distribution grid. Data were examined using Q factor analysis. Of 226 potential participants, 105 responded (46%). All participants highly ranked the focus on global standardization of methods, outcome measures, data-driven research, methodological discussion, and international collaboration. Four factors describing the "spirit of OMERACT" were identified: "Evidence not eminence" (n = 31) valued data- and evidence-driven research above personality and status; "Collaboration and collegiality" (n = 19) valued the international and cross-stakeholder collaboration, interaction, and collegiality; "Equal voices, equal votes, common goals" (n = 12) valued equality in discussion and voting, with everyone striving toward the same goal; "Principles and product, not process" (n = 8) valued the principles of focusing on outcome measures and the product of guiding clinical trials, but was unsure whether the process is necessary to reach this. The factors did not segregate different stakeholder groups. Delegates value different elements of OMERACT, and thus the "spirit of OMERACT" encompasses evidence-based research, collaboration, and equality, although a small group is unsure whether the process is necessary to achieve the end result. Q methodology may prove useful for conference organizers to identify their delegates' different needs and tailor conference content.
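
    As an illustration of the Q-factor-analysis step (not the authors' software or data), the sketch below correlates invented Q-sorts person by person and extracts principal factors, with loadings scaled by the square roots of the eigenvalues.

```python
# Bare-bones Q-methodology computation: persons, not items, are the variables.
import numpy as np

# rows = statements, columns = participants; entries are grid positions
sorts = np.array([[ 2,  2, -1,  1],
                  [ 1,  2, -2,  0],
                  [ 0, -1,  2, -1],
                  [-1, -2,  2,  0],
                  [-2, -1, -1,  0]])

R = np.corrcoef(sorts, rowvar=False)        # person-by-person correlations
eigvals, eigvecs = np.linalg.eigh(R)        # principal factors of R
order = np.argsort(eigvals)[::-1]

explained = eigvals[order] / eigvals.sum()
loadings = eigvecs[:, order] * np.sqrt(np.maximum(eigvals[order], 0))
print(np.round(explained, 2))
print(np.round(loadings[:, :2], 2))         # participants' loadings, two factors
```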

  20. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary business process modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical points of business process modeling identified are process states, process hierarchy and the granularity of process description. The methodology has been evaluated through use in a real project. Using examples from this project, the main features of the methodology are explained together with the significant problems met during the project. Drawing on these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  1. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    Science.gov (United States)

    2017-11-01

    Report front matter (recoverable information only): Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques, by Canh Ly, Nghia Tran, and Ozlem Kilic. Approved for public release; distribution is...

  2. An overall methodology for reliability prediction of mechatronic systems design with industrial application

    International Nuclear Information System (INIS)

    Habchi, Georges; Barthod, Christine

    2016-01-01

    We propose in this paper an overall ten-step methodology dedicated to the analysis and quantification of reliability during the design phase of a mechatronic system, considered as a complex system. The ten steps of the methodology are detailed according to the downward side of the V-development cycle usually used for the design of complex systems. Two complementary phases of analysis cover the ten steps: qualitative analysis and quantitative analysis. The qualitative phase analyzes the functional and dysfunctional behavior of the system and then determines its different failure modes and degradation states, based on external and internal functional analysis, organic and physical implementation, and dependencies between components, with consideration of customer specifications and the mission profile. The quantitative phase is used to calculate the reliability of the system and its components, based on the qualitative behavior patterns and considering data gathering and processing and reliability targets. A systemic approach is used to calculate the reliability of the system, taking into account: the different technologies of a mechatronic system (mechanics, electronics, electrical, etc.), dependencies and interactions between components, and external influencing factors. To validate the methodology, the ten steps are applied to an industrial system, the smart actuator of Pack'Aero Company. - Highlights: • A ten-step methodology for reliability prediction of mechatronic systems design. • Qualitative and quantitative analysis for reliability evaluation using PN and RBD. • A dependency matrix proposal, based on the collateral and functional interactions. • Models consider mission profile, deterioration, interactions and influencing factors. • Application and validation of the methodology on the “Smart Actuator” of PACK’AERO.
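
    The quantitative phase ultimately reduces to computations like the following reliability-block-diagram sketch: independent components with assumed constant failure rates combined in series and parallel. The values and the architecture are illustrative only.

```python
# Reliability of a toy mechatronic chain from a reliability block diagram,
# assuming independent components with constant failure rates.
import math

def r_exp(lam, t):
    return math.exp(-lam * t)          # component reliability at time t

t = 5000.0                             # hours
sensor = r_exp(2e-6, t)
controller = r_exp(1e-6, t)
motor_a = r_exp(5e-6, t)
motor_b = r_exp(5e-6, t)

# Series: sensor -> controller -> (motor A parallel motor B)
motors = 1 - (1 - motor_a) * (1 - motor_b)     # redundant pair
system = sensor * controller * motors
print(f"R_system({t:.0f} h) = {system:.4f}")
```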

  3. Methodology of safety assessment and sensitivity analysis for geologic disposal of high-level radioactive waste

    International Nuclear Information System (INIS)

    Kimura, Hideo; Takahashi, Tomoyuki; Shima, Shigeki; Matsuzuru, Hideo

    1995-01-01

    A deterministic safety assessment methodology has been developed to evaluate the long-term radiological consequences associated with geologic disposal of high-level radioactive waste, and to demonstrate the generic feasibility of geologic disposal. The exposure scenario considered here is based on a normal evolution scenario, which excludes events attributable to probabilistic alterations in the environment. The computer code system GSRW thus developed is based on a non-site-specific model, and consists of a set of sub-modules for calculating the release of radionuclides from engineered barriers, the transport of radionuclides in and through the geosphere, the behavior of radionuclides in the biosphere, and the radiation exposures of the public. In order to identify the important parameters of the assessment models, an automated procedure for sensitivity analysis based on the Differential Algebra method has been developed and applied to GSRW. (author)

  4. Performance analysis for disposal of mixed low-level waste. 1: Methodology

    International Nuclear Information System (INIS)

    Waters, R.D.; Gruebel, M.M.

    1999-01-01

    A simple methodology has been developed for evaluating the technical capabilities of potential sites for disposal of mixed low-level radioactive waste. The results of the evaluation are expressed as permissible radionuclide concentrations in disposed waste. The methodology includes an analysis of three separate pathways: (1) releases of radionuclides to groundwater; (2) releases of potentially volatile radionuclides to the atmosphere; and (3) the consequences of inadvertent intrusion into a disposal facility. For each radionuclide, its limiting permissible concentration in disposed waste is the lowest of the permissible concentrations determined from each of the three pathways. These permissible concentrations in waste at an evaluated site can be used to assess the capability of the site to dispose of waste streams containing multiple radionuclides
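
    The limiting-concentration rule reduces to a minimum over pathways, as the sketch below shows for invented per-pathway permissible concentrations.

```python
# For each radionuclide, take the lowest permissible concentration across
# the three pathways (all concentrations invented; units e.g. Bq/m^3 of waste).
limits = {
    "Tc-99":  {"groundwater": 4e5, "atmosphere": 1e9,  "intruder": 2e7},
    "H-3":    {"groundwater": 3e8, "atmosphere": 6e7,  "intruder": 5e9},
    "Cs-137": {"groundwater": 8e8, "atmosphere": 1e10, "intruder": 1e6},
}

for nuclide, per_pathway in limits.items():
    pathway, value = min(per_pathway.items(), key=lambda kv: kv[1])
    print(f"{nuclide}: {value:.1e} (limited by {pathway})")
```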

  5. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency for the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.

  6. Analysis methodology for the post-trip return to power steam line break event

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chul Shin; Kim, Chul Woo; You, Hyung Keun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-06-01

    An analysis of Steam Line Break (SLB) events which result in a Return-to-Power (RTP) condition after reactor trip was performed for a postulated Yonggwang Nuclear Power Plant Unit 3 cycle 8. The analysis methodology for post-trip RTP SLB is quite different from that for non-RTP SLB and is more difficult. Therefore, it is necessary to develop a methodology to analyze the response of the NSSS parameters to post-trip RTP SLB events and the fuel performance after the total reactivity exceeds criticality. In this analysis, the cases with and without offsite power were simulated crediting the 3-D reactivity feedback effect due to a local heatup in the vicinity of the stuck CEA, and compared with the cases without 3-D reactivity feedback with respect to post-trip fuel performance: Departure to Nucleate Boiling Ratio (DNBR) and Linear Heat Generation Rate (LHGR). 36 tabs., 32 figs., 11 refs. (Author)

  7. Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base

    Science.gov (United States)

    Mcruer, Duane T.; Myers, Thomas T.

    1988-01-01

    The development of a comprehensive and eclectic methodology for the conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.

  8. Combining a weighted caseload study with an organisational analysis in courts: first experiences with a new methodological approach in Switzerland

    Directory of Open Access Journals (Sweden)

    Daniela Winkler

    2015-07-01

    Full Text Available Determining the weighted caseload, i.e. the average amount of work time used for processing cases of different case categories, using the different methodological approaches of weighted caseload studies results in case weights that indicate the current performance of a court. However, as the weighted caseload is often used in allocating resources or cases, the results of a weighted caseload study may be contested with the argument that it is not clear whether they are based on an average good performance, or whether higher or lower values could be assumed if operational management were optimised or qualitative aspects taken into account. Suitable methods therefore usually include quality adjustments of the weighted caseload. Also, the values can be validated using benchmarking. In Switzerland there is a general lack of workload measurement in courts. Therefore, in an analysis of the courts and the Cantonal Prosecutor's Office of a Swiss canton, another method of validating weighted caseload values has been applied: the combination of a weighted caseload study with an organisational analysis. This paper introduces the new methodological approach and outlines preliminary methodological findings.

  9. Methodology for adding and amending glycaemic index values to a nutrition analysis package.

    LENUS (Irish Health Repository)

    Levis, Sharon P

    2011-04-01

    Since its introduction in 1981, the glycaemic index (GI) has been a useful tool for classifying the glycaemic effects of carbohydrate foods. Consumption of a low-GI diet has been associated with a reduced risk of developing CVD, diabetes mellitus and certain cancers. WISP (Tinuviel Software, Llanfechell, Anglesey, UK) is a nutrition software package used for the analysis of food intake records and 24 h recalls. Within its database, WISP contains the GI values of foods based on the International Tables 2002. The aim of the present study is to describe in detail a methodology for adding and amending GI values to the WISP database in a clinical or research setting, using data from the updated International Tables 2008.

  10. Studying creativity training programs: A methodological analysis

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

    Throughout decades of creativity research, a range of creativity training programs have been developed, tested, and analyzed. In 2004 Scott and colleagues published a meta-analysis of all creativity training programs to date, and the review presented here set out to identify and analyze studies published since the seminal 2004 review. Focusing on quantitative studies of creativity training programs for adults, our systematic review resulted in 22 publications. All studies were analyzed, but comparing the reported effectiveness of training across studies proved difficult due to methodological inconsistencies, variations in reporting of results as well as the types of measures used. Thus a consensus for future studies is called for to answer the question: Which elements make one creativity training program more effective than another? This is a question of equal relevance to academia and industry...

  11. A Combined Methodology to Eliminate Artifacts in Multichannel Electrogastrogram Based on Independent Component Analysis and Ensemble Empirical Mode Decomposition.

    Science.gov (United States)

    Sengottuvel, S; Khan, Pathan Fayaz; Mariyappa, N; Patel, Rajesh; Saipriya, S; Gireesan, K

    2018-06-01

    Cutaneous measurements of electrogastrogram (EGG) signals are heavily contaminated by artifacts due to cardiac activity, breathing, motion artifacts, and electrode drifts whose effective elimination remains an open problem. A common methodology is proposed by combining independent component analysis (ICA) and ensemble empirical mode decomposition (EEMD) to denoise gastric slow-wave signals in multichannel EGG data. Sixteen electrodes are fixed over the upper abdomen to measure the EGG signals under three gastric conditions, namely, preprandial, postprandial immediately, and postprandial 2 h after food for three healthy subjects and a subject with a gastric disorder. Instantaneous frequencies of intrinsic mode functions that are obtained by applying the EEMD technique are analyzed to individually identify and remove each of the artifacts. A critical investigation on the proposed ICA-EEMD method reveals its ability to provide a higher attenuation of artifacts and lower distortion than those obtained by the ICA-EMD method and conventional techniques, like bandpass and adaptive filtering. Characteristic changes in the slow-wave frequencies across the three gastric conditions could be determined from the denoised signals for all the cases. The results therefore encourage the use of the EEMD-based technique for denoising gastric signals to be used in clinical practice.
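
    A hedged sketch of the combined ICA + EEMD idea on synthetic two-channel data, using scikit-learn's FastICA and the PyEMD package: unmix the channels, decompose one independent component with EEMD, and rebuild it without the fastest modes. The mixing model, mode-selection rule and all parameters are arbitrary choices, not the paper's tuned procedure.

```python
# Combined ICA + EEMD sketch on toy two-channel "EGG" data.
import numpy as np
from sklearn.decomposition import FastICA
from PyEMD import EEMD

rng = np.random.default_rng(2)
t = np.linspace(0, 60, 3000)
slow_wave = np.sin(2 * np.pi * 0.05 * t)            # ~3 cpm gastric rhythm
ecg_like = 0.8 * np.sin(2 * np.pi * 1.2 * t)        # cardiac artifact
X = np.column_stack([slow_wave + 0.5 * ecg_like,
                     0.3 * slow_wave + ecg_like]) + 0.05 * rng.normal(size=(t.size, 2))

ica = FastICA(n_components=2, random_state=0)
S = ica.fit_transform(X)                            # independent components

imfs = EEMD(trials=20).eemd(S[:, 0])                # ensemble EMD of one IC
# Keep only the slower modes (toy rule: drop the fastest IMFs as artifacts).
denoised = imfs[2:].sum(axis=0)
print(imfs.shape, denoised.shape)
```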

  12. A non-linear reduced order methodology applicable to boiling water reactor stability analysis

    International Nuclear Information System (INIS)

    Prill, Dennis Paul

    2013-01-01

    Thermal-hydraulic coupling between power, flow rate and density, intensified by neutronics feedback, is the main driver of boiling water reactor (BWR) stability behavior. High-power low-flow conditions in connection with unfavorable power distributions can lead the BWR system into unstable regions where power oscillations can be triggered. This important threat to operational safety requires careful analysis for proper understanding. Analyzing an exhaustive parameter space of the non-linear BWR system becomes feasible with methodologies based on reduced order models (ROMs), saving computational cost and improving the physical understanding. Presently, within reactor dynamics, no general and automatic derivation of high-dimensional ROMs based on detailed BWR models is available. In this thesis a systematic, self-contained model order reduction (MOR) technique is derived which is applicable to several classes of dynamical problems, and in particular to BWRs of any degree of detail. Expert knowledge can be given by operational, experimental or numerical transient data and is transferred into an optimal basis function representation. The methodology is mostly automated and provides the framework for the reduction of various different systems of any level of complexity. Only little effort is necessary to attain a reduced version within this self-written code, which is based on the coupling of sophisticated commercial software. The methodology reduces a complex system in a grid-free manner to a small system able to capture even non-linear dynamics. It is based on an optimal choice of basis functions given by the so-called proper orthogonal decomposition (POD). The steps required to achieve reliable and numerically stable ROMs are given by a distinct calibration road-map. In validation and verification steps, a wide spectrum of representative test examples is systematically studied with regard to a later BWR application. The first example is non-linear and has a dispersive character.
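
    The POD ingredient of such ROM methodologies can be sketched with plain linear algebra: collect transient snapshots, take the SVD, and keep the leading modes that capture almost all of the energy. The snapshot matrix below is synthetic, not BWR data.

```python
# POD basis extraction from transient snapshot data via the SVD.
import numpy as np

x = np.linspace(0, 1, 200)
snapshots = np.column_stack([
    np.sin(np.pi * x) * np.cos(0.3 * k) + 0.3 * np.sin(2 * np.pi * x) * np.sin(0.2 * k)
    for k in range(50)
])                                    # space x time snapshot matrix

U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999) + 1)   # modes capturing 99.9% of the energy

basis = U[:, :r]                              # POD basis functions
reduced = basis.T @ snapshots                 # reduced (r-dimensional) dynamics
print(r, reduced.shape)
```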

  13. Ruling the Commons. Introducing a new methodology for the analysis of historical commons

    Directory of Open Access Journals (Sweden)

    Tine de Moor

    2016-10-01

    Full Text Available Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area within commons studies. Over the last years an international team of historians has worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge of the dynamics of institutions for collective action – in particular commons. This project aims to contribute to the current debate on commons on three different fronts. Theoretically, it explicitly draws our attention to issues of change and adaptation in the commons – contrasting with more static analyses. Empirically, it highlights the value of historical records as a rich source of information for longitudinal analysis of the functioning of commons. Methodologically, it develops a systematic way of analyzing and comparing commons’ regulations across regions and time, setting a number of variables that have been defined on the basis of the “most common denominators” in commons regulation across countries and time periods. In this paper we introduce the project, describe our sources and methodology, and present the preliminary results of our analysis.

  14. The Integration of Project-Based Methodology into Teaching in Machine Translation

    Science.gov (United States)

    Madkour, Magda

    2016-01-01

    This quantitative-qualitative analytical research aimed at investigating the effect of integrating project-based teaching methodology into teaching machine translation on students' performance. Data was collected from the graduate students in the College of Languages and Translation, at Imam Muhammad Ibn Saud Islamic University, Riyadh, Saudi…

  15. A methodological approach for designing a usable ontology-based GUI in healthcare.

    Science.gov (United States)

    Lasierra, N; Kushniruk, A; Alesanco, A; Borycki, E; García, J

    2013-01-01

    This paper presents a methodological approach to the design and evaluation of an interface for an ontology-based system used for designing care plans for monitoring patients at home. In order to define the care plans, physicians need a tool for creating instances of the ontology and configuring some rules. Our purpose is to develop an interface that allows clinicians to interact with the ontology. Although ontology-driven applications do not necessarily present the ontology in the user interface, our hypothesis is that showing selected parts of the ontology in a "usable" way could enhance clinicians' understanding and make the definition of the care plans easier. Based on prototyping and iterative testing, this methodology combines visualization techniques and usability methods. Preliminary results obtained after a formative evaluation indicate the effectiveness of the suggested combination.

  16. Universal Verification Methodology Based Register Test Automation Flow.

    Science.gov (United States)

    Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu

    2016-05-01

    In today's SoC design, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task. Therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard methodology, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models in a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe the register specification in the IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use register models without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
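
    The spreadsheet-to-IP-XACT translation step can be pictured with a few lines of scripting. A minimal sketch, assuming a hypothetical CSV column layout (name, offset, width, access) and emitting simplified IP-XACT-style XML rather than the full schema:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical spreadsheet export: one row per register.
sheet = io.StringIO(
    "name,offset,width,access\n"
    "CTRL,0x00,32,read-write\n"
    "STATUS,0x04,32,read-only\n"
)

root = ET.Element("memoryMap")            # simplified IP-XACT-like structure
block = ET.SubElement(root, "addressBlock")
for row in csv.DictReader(sheet):
    reg = ET.SubElement(block, "register")
    ET.SubElement(reg, "name").text = row["name"]
    ET.SubElement(reg, "addressOffset").text = row["offset"]
    ET.SubElement(reg, "size").text = row["width"]
    ET.SubElement(reg, "access").text = row["access"]

print(ET.tostring(root, encoding="unicode"))
```

    A real flow would emit the full IP-XACT namespaces and then hand the file to a commercial register-model generator, as the abstract describes.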

  17. Development and application of a methodology for the analysis of significant human related event trends in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, H.Y.

    1981-01-01

    A methodology is developed to identify and flag significant trends related to the safety and availability of U.S. commercial nuclear power plants. The development is intended to aid in reducing the likelihood of human errors. To ensure that the methodology can be easily adapted to various types of classification schemes for operating data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme was selected for the methodology. Significance criteria were developed for human-initiated events affecting the systems and for events caused by human deficiencies. Clustering analysis was used to verify the learning trend in multidimensional histograms. A computer code based on the K-Means algorithm was developed and applied to find the learning period, in which error rates decrease monotonically with plant age. The Freeman-Tukey (F-T) deviates are used to select generic problems, identified by a large positive value of the deviate (here approximately above 2.0). The identified generic problems are: decision errors, which are highly associated with reactor startup operations in the learning period of PWR plants (PWRs); response errors, which are highly associated with Secondary Non-Nuclear Systems (SNS) in PWRs; and significant errors affecting systems and caused by response actions, which are highly associated with the reactor startup mode in BWRs.
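
    The Freeman-Tukey deviate used as the significance screen is computed directly from observed and expected counts. A minimal sketch (the 2.0 cutoff follows the abstract; the counts are invented):

```python
import numpy as np

def freeman_tukey_deviate(observed, expected):
    """Variance-stabilized residual for Poisson-distributed counts."""
    observed = np.asarray(observed, dtype=float)
    expected = np.asarray(expected, dtype=float)
    return np.sqrt(observed) + np.sqrt(observed + 1.0) - np.sqrt(4.0 * expected + 1.0)

# Invented example: error counts per category versus the counts expected
# under a homogeneous (no generic problem) model.
categories = ["decision", "manipulation", "verification", "response"]
observed = np.array([14, 3, 7, 25])
expected = np.array([6.1, 4.0, 6.8, 12.5])

for cat, dev in zip(categories, freeman_tukey_deviate(observed, expected)):
    flag = "<- candidate generic problem" if dev > 2.0 else ""
    print(f"{cat:13s} deviate = {dev:+.2f} {flag}")
```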

  18. The methodology of root cause analysis for equipment failure and its application at Guangdong nuclear power stations

    International Nuclear Information System (INIS)

    Gao Ligang; Lu Qunxian

    2004-01-01

    The methodology of Equipment Failure Root Cause Analysis (RCA) is described; as a systematic analysis methodology, it comprises nine steps. The process is explained through real examples, and six precautions in applying RCA are pointed out. The paper also summarizes the experience of applying RCA at the Daya Bay Nuclear Power Station and emphasizes seven key factors for RCA success, mainly concerning organization, objectives, analysts, analysis techniques, the external technical support system, the development of corrective actions, and the system for monitoring corrective actions. (authors)

  19. METHODOLOGICAL ASPECTS OF CONTENT ANALYSIS OF CONVERGENCE BETWEEN UKRAINIAN GAAP AND INTERNATIONAL FINANCIAL REPORTING STANDARDS

    Directory of Open Access Journals (Sweden)

    R. Kuzina

    2015-06-01

    Full Text Available The objective conditions of Ukraine’s integration into the global business environment create the need to strengthen accounting and financial reporting. At the stage of attracting investment into the country, there is a need to prepare financial statements whose generally accepted basic principles rest on common International Financial Reporting Standards (IFRS). An assessment of the convergence of national standards with International Financial Reporting Standards is therefore relevant. However, before conducting a content analysis it is necessary to determine methodological approaches to the selection of key indicators for the assessment of convergence. The aim of the article is to define methodological approaches to the selection and development of a list of key IFRS elements for the further evaluation of convergence between national and international standards. To assess convergence, 187 basic key elements measuring the level of convergence to IFRS were selected. The sampling was carried out based on the professional judgment of the author, taking the key indicators of each standard and the usefulness of accounting information as evaluation criteria. These figures make it possible to calculate the specific level of convergence of international and national standards and to determine to what extent statements prepared under domestic standards correspond to IFRS. In other words, can one assert with some certainty that Ukraine has achieved “good practices in IFRS implementation” or not? This calculation will allow an assessment of the regulatory efforts of government agencies (the Ministry of Finance) on the approximation of Ukrainian standards to IFRS.

  20. Breast cancer statistics and prediction methodology: a systematic review and analysis.

    Science.gov (United States)

    Dubey, Ashutosh Kumar; Gupta, Umesh; Jain, Sonal

    2015-01-01

    Breast cancer is a menacing cancer, primarily affecting women. Research into detecting breast cancer at an early stage is ongoing, as the possibility of a cure in the early stages is high. This study has two main objectives: first, to establish statistics for breast cancer and, second, to identify methodologies from previous studies that can be helpful for early-stage detection of breast cancer. The breast cancer statistics for incidence and mortality in the UK, US, India and Egypt were considered for this study. The findings of this study show that the overall mortality rates of the UK and US have improved because of awareness, improved medical technology and screening, but in the case of India and Egypt the situation is less positive because of a lack of awareness. The methodological findings of this study suggest a combined framework based on data mining and evolutionary algorithms, which provides a strong basis for improving the classification and detection accuracy of breast cancer data.
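
    As one concrete reading of "data mining combined with evolutionary algorithms", the sketch below wraps a genetic-algorithm feature search around a scikit-learn classifier. It illustrates the flavor of such a framework on the classic Wisconsin breast cancer dataset; the dataset choice, classifier, and all hyperparameters are assumptions, not the code of any study in the review:

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def fitness(mask):
    """Cross-validated accuracy of a classifier on the selected features."""
    if not mask.any():
        return 0.0
    clf = LogisticRegression(max_iter=5000)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = rng.random((12, n_features)) < 0.5         # random binary feature masks
for generation in range(10):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:6]]  # truncation selection
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(6)], parents[rng.integers(6)]
        cut = rng.integers(1, n_features)        # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_features) < 0.05     # bit-flip mutation
        children.append(np.where(flip, ~child, child))
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
print("features selected:", int(best.sum()), "CV accuracy:", round(fitness(best), 3))
```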

  1. Development of a methodology for the application of the analysis of human reliability to individualized temporary storage facility; Desarrollo de una metodologia de aplicacion del Analisis de Fiabilidad Humana a una instalacion de Almacen Temporal Individualizado

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, P.; Dies, J.; Tapia, C.; Blas, A. de

    2014-07-01

    The paper presents a methodology developed for applying Human Reliability Analysis (HRA) to an individualized temporary storage (ATI) facility without the need for experts during the modelling and quantification stages of the analysis. The developed methodology is based on ATHEANA and relies on the use of other methods for the analysis of human action and on in-depth analysis. (Author)

  2. METHODOLOGY FOR ANALYSIS OF DECISION MAKING IN AIR NAVIGATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2011-03-01

    Full Text Available Abstract. For research on the Air Navigation System as a complex socio-technical system, a methodology for the analysis of the human operator's decision-making has been developed. The significance of individual psychological factors, as well as the impact of socio-psychological factors on the professional activities of a human operator during the development of a flight situation from normal to catastrophic, were analyzed. On the basis of the reflexive theory of bipolar choice, the expected risks of decision-making by the Air Navigation System operator as influenced by the external environment, previous experience and intentions were identified. Methods for the analysis of decision-making by the human operator of the Air Navigation System using stochastic networks have been developed. Keywords: Air Navigation System, bipolar choice, human operator, decision-making, expected risk, individual psychological factors, methodology of analysis, reflexive model, socio-psychological factors, stochastic network.

  3. A METHODOLOGICAL APPROACH TO THE STRATEGIC ANALYSIS OF FOOD SECURITY

    Directory of Open Access Journals (Sweden)

    Anastasiia Mostova

    2017-12-01

    Full Text Available The objective of the present work is to substantiate the use of tools for strategic analysis in order to develop a strategy for the country’s food security under current conditions, and to devise the author’s original technique for performing a strategic analysis of food security using a SWOT-analysis. The methodology of the study. The article substantiates the need for strategic planning of food security. The author considers the stages of strategic planning and explains the importance of the stage of strategic analysis of the country’s food security. It is proposed to apply a SWOT-analysis when running a strategic analysis of food security. The study is based on a system of indicators and characteristics of the country’s economy, agricultural sector, market trends, and material-technical, financial and human resources, which are essential to obtain an objective assessment of the impact of trends and factors on food security, and to further develop the procedure for conducting a strategic analysis of the country’s food security. Results of the study. The procedure for the strategic analysis of food security is developed based on the SWOT-analysis tool, which implies three stages: a strategic analysis of weaknesses and strengths, opportunities and threats; construction of the matrix of weaknesses, strengths, opportunities, and threats (the SWOT-analysis matrix); and formation of the food security strategy based on the SWOT-analysis matrix. A list of characteristics was compiled in order to conduct a strategic analysis of food security and to categorize them as strengths or weaknesses, threats, and opportunities. The characteristics are systemized into strategic groups: production, market, resources, and consumption; this is necessary for the objective establishment of strategic directions, responsible performers, allocation of resources, and effective control, for the purpose of further development and implementation of the strategy. A strategic analysis

  4. Practical implementation of a methodology for digital images authentication using forensics techniques

    OpenAIRE

    Francisco Rodríguez-Santos; Guillermo Delgado-Gutierréz; Leonardo Palacios-Luengas; Rubén Vázquez Medina

    2015-01-01

    This work presents a forensic analysis methodology implemented to detect modifications in JPEG digital images by analyzing the image’s metadata, thumbnail, camera traces and compression signatures. Best practices related to digital evidence and forensic analysis are considered to determine whether the technical attributes and the qualities of an image are consistent with each other. This methodology is defined according to the recommendations of the Good Practice Guide for Computer-Based Elect...
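
    One building block of such metadata consistency checking can be sketched with Pillow. The tag choices and the tampering heuristics below are illustrative assumptions, not the paper's exact procedure:

```python
from PIL import Image, ExifTags   # Pillow >= 9.4 assumed for ExifTags.IFD

def exif_findings(path):
    """Collect EXIF tags commonly cross-checked in image forensics."""
    exif = Image.open(path).getexif()
    named = {ExifTags.TAGS.get(t, t): v for t, v in exif.items()}
    # DateTimeOriginal lives in the Exif sub-IFD, not in IFD0.
    sub = exif.get_ifd(ExifTags.IFD.Exif)
    named.update({ExifTags.TAGS.get(t, t): v for t, v in sub.items()})

    findings = []                 # illustrative heuristics only
    if "Software" in named:
        findings.append(f"processed with: {named['Software']}")
    dt, dto = named.get("DateTime"), named.get("DateTimeOriginal")
    if dt and dto and dt != dto:
        findings.append(f"file date {dt} differs from capture date {dto}")
    return findings

# Hypothetical input file.
print(exif_findings("photo.jpg") or ["no obvious metadata inconsistencies"])
```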

  5. AN INDUCTIVE, INTERACTIVE AND ADAPTIVE HYBRID PROBLEM-BASED LEARNING METHODOLOGY: APPLICATION TO STATISTICS

    Directory of Open Access Journals (Sweden)

    ADA ZHENG

    2011-10-01

    Full Text Available We have developed an innovative hybrid problem-based learning (PBL) methodology. The methodology has the following distinctive features: (i) each complex question was decomposed into a set of coherent finer subquestions, following carefully designed criteria to maintain a delicate balance between guiding the students and inspiring them to think independently; this learning methodology enabled the students to solve the complex questions progressively in an inductive context. (ii) Facilitated by the utilization of our web-based learning systems, the teacher was able to interact with the students intensively and could allocate more teaching time to providing tailor-made feedback for individual students; the students were actively engaged in the learning activities, stimulated by the intensive interaction. (iii) The answers submitted by the students could be automatically consolidated in the report of the Moodle system in real time; the teacher could adjust the teaching schedule and the focus of the class to adapt to the learning progress of the students by analysing the automatically generated report and log files of the web-based learning system. As a result, the attendance rate of the students increased from about 50% to more than 90%, and the students’ learning motivation was significantly enhanced.

  6. Use of PFMEA methodology as a competitive advantage for the analysis of improvements in an experimental procedure

    Directory of Open Access Journals (Sweden)

    Fernando Coelho

    2015-12-01

    Full Text Available The methodology of Failure Modes and Effects Analysis (FMEA), utilized by industries to investigate potential failures, contributes to ensuring the robustness of a project and its manufacturing process even before production starts. It reduces the likelihood of errors and raises efficiency and effectiveness at high productivity, through the elimination or reduction of production problems. In this context, this study is based on the structured application of PFMEA (Process Failure Mode and Effects Analysis), associated with other quality tools, in a simulation of the assembly of an electro-pneumatic system. The study was performed at the Experimental Laboratory of the Botucatu Technology Faculty (FATEC), with the support of five undergraduate students from the Industrial Production Technology Course. The methodology applied contributed to the identification of 24 potential failures and improvement opportunities and to the investigation of their causes, proving to be a standard applicable to any production process, with gains in efficiency and effectiveness. The final strategy was therefore to evaluate and minimize the potential failures, reduce production costs and increase the performance of the process.
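
    The ranking arithmetic at the heart of a PFMEA is the risk priority number, RPN = severity x occurrence x detection. A minimal sketch with invented entries for an electro-pneumatic assembly step:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    description: str
    severity: int     # 1-10 rating scales, as in common FMEA practice
    occurrence: int
    detection: int

    @property
    def rpn(self) -> int:
        # Risk Priority Number: higher means address first.
        return self.severity * self.occurrence * self.detection

# Invented failure modes for an electro-pneumatic assembly step.
modes = [
    FailureMode("air leak at fitting", severity=6, occurrence=4, detection=3),
    FailureMode("valve wired to wrong port", severity=8, occurrence=3, detection=5),
    FailureMode("missing seal", severity=7, occurrence=2, detection=6),
]
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {m.rpn:4d}  {m.description}")
```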

  7. Proposition of a modeling and an analysis methodology of integrated reverse logistics chain in the direct chain

    Energy Technology Data Exchange (ETDEWEB)

    Mimouni, F.; Abouabdellah, A.

    2016-07-01

    This paper proposes a modeling and analysis methodology, based on the combination of Bayesian networks and Petri nets, for a reverse logistics chain integrated into the direct supply chain. The network is modeled by combining Petri and Bayesian networks: the Bayesian network model is complemented with a Petri net in order to break the cycle problem in the Bayesian network. Demands are assumed to be independent from returns, and the model can only be used for non-perishable products. Legislative aspects considered include recycling laws, protection of the environment, and client satisfaction via after-sale service. (Author)

  8. Coupling Computer Codes for The Analysis of Severe Accident Using A Pseudo Shared Memory Based on MPI

    International Nuclear Information System (INIS)

    Cho, Young Chul; Park, Chang-Hwan; Kim, Dong-Min

    2016-01-01

    As there are four codes for the analysis of severe accidents, the in-vessel analysis code (CSPACE), the ex-vessel analysis code (SACAP), the corium behavior analysis code (COMPASS), and the fission product behavior analysis code, it is complex to implement the coupling of these codes with methodologies similar to those used for RELAP and CONTEMPT or SPACE and CAP. Because of that, an efficient coupling, the so-called pseudo shared memory architecture, was introduced. In this paper, coupling methodologies are compared and the methodology used for the analysis of severe accidents is discussed in detail. The barrier between in-vessel and ex-vessel has been removed for the analysis of severe accidents through the implementation of coupled computer codes with a pseudo shared memory architecture based on MPI. What remains is the proper choice and checking of variables and values for the selected severe accident scenarios, e.g., the TMI accident. Even though it is possible to couple more than two computer codes with the pseudo shared memory architecture, the methodology should be revised to couple parallel codes, especially when they are programmed using MPI
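
    The coupling pattern can be pictured with mpi4py. The toy exchange below uses plain point-to-point messages between two stand-in "codes"; the pseudo shared memory of the paper would instead expose the interface data through MPI one-sided windows, and all names here are invented:

```python
# Run with: mpiexec -n 2 python couple.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Rank 0 plays the in-vessel code, rank 1 the ex-vessel code (toy stand-ins).
boundary = np.zeros(4)
for step in range(5):
    if rank == 0:
        boundary[:] = np.sin(step + np.arange(4))   # "computed" interface data
        comm.Send(boundary, dest=1, tag=step)       # publish to the other code
    else:
        comm.Recv(boundary, source=0, tag=step)     # read the shared interface
        print(f"step {step}: ex-vessel sees mean boundary {boundary.mean():+.3f}")
```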

  9. Coupling Computer Codes for The Analysis of Severe Accident Using A Pseudo Shared Memory Based on MPI

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Young Chul; Park, Chang-Hwan; Kim, Dong-Min [FNC Technology Co., Yongin (Korea, Republic of)

    2016-10-15

    As there are four codes for the analysis of severe accidents, the in-vessel analysis code (CSPACE), the ex-vessel analysis code (SACAP), the corium behavior analysis code (COMPASS), and the fission product behavior analysis code, it is complex to implement the coupling of these codes with methodologies similar to those used for RELAP and CONTEMPT or SPACE and CAP. Because of that, an efficient coupling, the so-called pseudo shared memory architecture, was introduced. In this paper, coupling methodologies are compared and the methodology used for the analysis of severe accidents is discussed in detail. The barrier between in-vessel and ex-vessel has been removed for the analysis of severe accidents through the implementation of coupled computer codes with a pseudo shared memory architecture based on MPI. What remains is the proper choice and checking of variables and values for the selected severe accident scenarios, e.g., the TMI accident. Even though it is possible to couple more than two computer codes with the pseudo shared memory architecture, the methodology should be revised to couple parallel codes, especially when they are programmed using MPI.

  10. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    Science.gov (United States)

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time (AFT) model with log-normal, log-logistic and Weibull distributions, were used to detect differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
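
    The flavor of an AFT analysis with left-censoring can be sketched by maximizing a censored log-normal likelihood with SciPy. This is a simplified illustration on simulated intensities, not the authors' R code:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
group = np.repeat([0, 1], 50)                    # two experimental conditions
log_true = 5.0 + 0.8 * group + rng.normal(0.0, 1.0, 100)
limit = 5.0                                      # detection limit (log scale)
censored = log_true < limit                      # low-abundance peaks go missing
log_obs = np.where(censored, limit, log_true)

def neg_loglik(params):
    """Log-normal AFT model: log intensity = b0 + b1*group + sigma*eps."""
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)
    z = (log_obs - (b0 + b1 * group)) / sigma
    ll = np.where(censored,
                  stats.norm.logcdf(z),               # P(intensity below limit)
                  stats.norm.logpdf(z) - log_sigma)   # density of observed values
    return -ll.sum()

fit = optimize.minimize(neg_loglik, x0=[4.0, 0.0, 0.0], method="Nelder-Mead")
print(f"estimated group effect on log intensity: {fit.x[1]:.2f} (true 0.8)")
```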

  11. Design-Based Research: Is This a Suitable Methodology for Short-Term Projects?

    Science.gov (United States)

    Pool, Jessica; Laubscher, Dorothy

    2016-01-01

    This article reports on a design-based methodology of a thesis in which a fully face-to-face contact module was converted into a blended learning course. The purpose of the article is to report on how design-based phases, in the form of micro-, meso- and macro-cycles were applied to improve practice and to generate design principles. Design-based…

  12. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  13. RAMA Methodology for the Calculation of Neutron Fluence

    International Nuclear Information System (INIS)

    Villescas, G.; Corchon, F.

    2013-01-01

    The neutron fluence plays an important role in the study of the structural integrity of the reactor vessel after a certain time of neutron irradiation. The NRC defined, in Regulatory Guide 1.190, the way neutron fluence must be estimated, including the uncertainty analysis of the validation process (accepted uncertainty is ≤ 20%). TRANSWARE Enterprises Inc. developed a methodology for calculating the neutron fluence based on Guide 1.190, known as RAMA. Uncertainty values obtained with this methodology, for about 18 vessels, are less than 10%.

  14. Methodology to estimate parameters of an excitation system based on experimental conditions

    Energy Technology Data Exchange (ETDEWEB)

    Saavedra-Montes, A.J. [Carrera 80 No 65-223, Bloque M8 oficina 113, Escuela de Mecatronica, Universidad Nacional de Colombia, Medellin (Colombia); Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Ramirez-Scarpetta, J.M. [Calle 13 No 100-00, Escuela de Ingenieria Electrica y Electronica, Universidad del Valle, Cali, Valle (Colombia); Malik, O.P. [2500 University Drive N.W., Electrical and Computer Engineering Department, University of Calgary, Calgary, Alberta (Canada)

    2011-01-15

    A methodology to estimate the parameters of a potential-source controlled rectifier excitation system model is presented in this paper. The proposed parameter estimation methodology is based on the characteristics of the excitation system. A comparison of two pseudo random binary signals, two sampling periods for each one, and three estimation algorithms is also presented. Simulation results from an excitation control system model and experimental results from an excitation system of a power laboratory setup are obtained. To apply the proposed methodology, the excitation system parameters are identified at two different levels of the generator saturation curve. The results show that it is possible to estimate the parameters of the standard model of an excitation system, recording two signals and the system operating in closed loop with the generator. The normalized sum of squared error obtained with experimental data is below 10%, and with simulation data is below 5%. (author)
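
    The estimation step can be illustrated as least-squares fitting of a discrete-time ARX model to PRBS input-output records. A toy sketch under that assumption (a first-order stand-in system, not the authors' excitation system model or algorithms):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 500
u = np.where(rng.random(N) < 0.5, -1.0, 1.0)   # crude PRBS-like excitation

# Simulated "true" plant y[k] = a*y[k-1] + b*u[k-1] + noise, standing in
# for the excitation system identified around an operating point.
a_true, b_true = 0.9, 0.2
y = np.zeros(N)
for k in range(1, N):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1] + 0.01 * rng.normal()

# ARX least squares: regress y[k] on [y[k-1], u[k-1]].
Phi = np.column_stack([y[:-1], u[:-1]])
a_hat, b_hat = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]
print(f"a = {a_hat:.3f} (true {a_true}), b = {b_hat:.3f} (true {b_true})")
```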

  15. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    International Nuclear Information System (INIS)

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    The Framatome ANP Realistic Large-Break LOCA methodology (FANP RLBLOCA) is an analysis approach approved by the US NRC for supporting the licensing basis of 3- and 4-loop Westinghouse PWRs and CE 2x4 PWRs. It was developed consistent with the NRC's Code Scaling, Applicability, and Uncertainty (CSAU) methodology for performing best-estimate large-break LOCA analyses. The CSAU methodology consists of three key elements, with the second and third elements addressing uncertainty identification and application. Unique to the CSAU methodology is the use of engineering judgment and the Process Identification and Ranking Table (PIRT), defined in the first element, to lay the groundwork for achieving the ultimate goal of quantifying the total uncertainty in predicted measures of interest associated with the large-break LOCA. It is the PIRT that directs not only the methodology development but also the methodology review. While the FANP RLBLOCA methodology was generically approved, a plant-specific application is customized in two ways, addressing how the unique plant characterization 1) is translated to code input and 2) relates to the unique methodology licensing requirements. Related to the former, plants are required by 10 CFR 50.36 to define a technical specification limiting condition for operation based on the following criteria: 1. Installed instrumentation that is used in the control room to detect, and indicate, a significant abnormal degradation of the reactor coolant pressure boundary. 2. A process variable, design feature, or operating restriction that is an initial condition of a design basis accident or transient analysis that either assumes the failure of or presents a challenge to the integrity of a fission product barrier. 3. A structure, system, or component that is part of the primary success path and which functions or actuates to mitigate a design basis accident or transient that either assumes the failure of or presents a challenge to the integrity of a

  16. EA Training 2.0 Newsletter #3 - EA Active, Problem Based Learning Methodology

    DEFF Research Database (Denmark)

    Buus, Lillian; Ryberg, Thomas; Sroga, Magdalena

    2010-01-01

    The main products of the project are innovative, active problem-based learning methodology for EA education and training, EA courses for university students and private and public sector employees, and an Enterprise Architecture competence ontology including a complete specification of skills...

  17. Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design

    Science.gov (United States)

    Tajino, Akira; James, Robert; Kijima, Kyoichi

    2005-01-01

    Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

  18. Knowledge-based operation guidance system for nuclear power plants based on generic task methodology

    International Nuclear Information System (INIS)

    Yamada, Naoyuki; Chandrasekaran, B.; Bhatnager, R.

    1989-01-01

    A knowledge-based system for operation guidance of nuclear power plants is proposed. The Dynamic Procedure Management System (DPMS) is designed and developed to assist human operators interactively by selecting and modifying predefined operation procedures in a dynamic situation. Unlike most operation guidance systems, DPMS has been built based on Generic Task Methodology, which makes the overall framework of the system perspicuous and also lets domain knowledge be represented in a natural way. This paper describes the organization of the system, the definition of each task, and the form and organization of knowledge, followed by an application example. (author)

  19. Different methodologies in neutron activation to approach the full analysis of environmental and nutritional samples

    International Nuclear Information System (INIS)

    Freitas, M.C.; Dionisio, I.; Dung, H.M.

    2008-01-01

    Different methodologies of neutron activation analysis (NAA) are now available at the Technological and Nuclear Institute (Sacavem, Portugal), namely Compton suppression, epithermal activation, replicate and cyclic activation, and low energy photon measurement. Prompt gamma activation analysis (PGAA) will be implemented soon. Results by instrumental NAA and PGAA on environmental and nutritional samples are discussed herein, showing that PGAA - carried out at the Institute of Isotope Research (Budapest, Hungary) - brings about an effective input to assessing relevant elements. Sensitivity enhancement in NAA by Compton suppression is also illustrated. Through a judicious combination of methodologies, practically all elements of interest in pollution and nutrition terms can be determined. (author)

  20. Core design methodology and software for Temelin NPP

    International Nuclear Information System (INIS)

    Havluj, F; Hejzlar, J.; Klouzal, J.; Stary, V.; Vocka, R.

    2011-01-01

    In the frame of the fuel vendor change at Temelin NPP in the Czech Republic, where, starting in 2010, TVEL TVSA-T fuel is loaded instead of Westinghouse VVANTAGE-6 fuel, new methodologies for core design and core reload safety evaluation have been developed. These documents are based on the methodologies delivered by TVEL within the fuel contract, further adapted to Temelin NPP operational needs and to current practice at the NPP. Along with the methodology development, the 3D core analysis code ANDREA, licensed for core reload safety evaluation in 2010, has been upgraded in order to optimize the safety evaluation process. New calculation sequences were implemented to simplify the evaluation of different limiting parameters, and output visualization tools were developed to make the verification process user friendly. Interfaces to the fuel performance code TRANSURANUS and the sub-channel analysis code SUBCAL were developed as well. (authors)

  1. Acceleration-based methodology to assess the blast mitigation performance of explosive ordnance disposal helmets

    Science.gov (United States)

    Dionne, J. P.; Levine, J.; Makris, A.

    2018-01-01

    To design the next generation of blast mitigation helmets that offer increasing levels of protection against explosive devices, manufacturers must be able to rely on appropriate test methodologies and human surrogates that will differentiate the performance level of various helmet solutions and ensure user safety. Ideally, such test methodologies and associated injury thresholds should be based on widely accepted injury criteria relevant within the context of blast. Unfortunately, even though significant research has taken place over the last decade in the area of blast neurotrauma, there is currently no agreement on the injury mechanisms of blast-induced traumatic brain injury. In the absence of such widely accepted test methods and injury criteria, the current study presents a specific blast test methodology focusing on explosive ordnance disposal protective equipment, involving the readily available Hybrid III mannequin, initially developed for the automotive industry. The unlikely applicability of the associated brain injury criteria (based on both linear and rotational head acceleration) is discussed in the context of blast. Test results encompassing a large number of blast configurations and personal protective equipment are presented, emphasizing the possibility of developing useful correlations between blast parameters, such as the scaled distance, and mannequin engineering measurements (head acceleration). Suggestions are put forward for a practical standardized blast testing methodology that takes into account limitations in the applicability of acceleration-based injury criteria as well as the inherent variability in blast testing results.

  2. Drag & Drop, Mixed-Methodology-based Lab-on-Chip Design Optimization Software, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The overall objective is to develop a "mixed-methodology", drag and drop, component library (fluidic-lego)-based system design and optimization tool for complex...

  3. Methodology for development of risk indicators for offshore platforms

    International Nuclear Information System (INIS)

    Oeien, K.; Sklet, S.

    1999-01-01

    This paper presents a generic methodology for development of risk indicators for petroleum installations and a specific set of risk indicators established for one offshore platform. The risk indicators should be used to control the risk during operation of platforms. The methodology is purely risk-based and the basis for development of risk indicators is the platform specific quantitative risk analysis (QRA). In order to identify high risk contributing factors, platform personnel are asked to assess whether and how much the risk influencing factors will change. A brief comparison of probabilistic safety assessment (PSA) for nuclear power plants and quantitative risk analysis (QRA) for petroleum platforms is also given. (au)

  4. Investment Strategies Optimization based on a SAX-GA Methodology

    CERN Document Server

    Canelas, António M L; Horta, Nuno C G

    2013-01-01

    This book presents a new computational finance approach combining a Symbolic Aggregate approXimation (SAX) technique with an optimization kernel based on genetic algorithms (GA). While the SAX representation is used to describe the financial time series, the evolutionary optimization kernel is used to identify the most relevant patterns and generate investment rules. The proposed approach considers several different chromosome structures in order to achieve better results on the trading platform. The methodology presented in this book has great potential on investment markets.
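
    The SAX step can be pictured in a few lines: z-normalize the series, average it piecewise (PAA), then map segment means to symbols through Gaussian breakpoints. A minimal sketch, with alphabet size and segment count as arbitrary choices:

```python
import numpy as np
from scipy.stats import norm

def sax(series, n_segments=8, alphabet="abcd"):
    x = (series - series.mean()) / series.std()   # z-normalization
    paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])
    # Breakpoints place equal probability mass under N(0,1) per symbol.
    breakpoints = norm.ppf(np.linspace(0.0, 1.0, len(alphabet) + 1)[1:-1])
    return "".join(alphabet[np.searchsorted(breakpoints, v)] for v in paa)

prices = np.cumsum(np.random.default_rng(3).normal(size=120)) + 100.0
print(sax(prices))   # a short word the GA can match trading patterns against
```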

  5. Base line definitions and methodological lessons from Zimbabwe

    International Nuclear Information System (INIS)

    Maya, R.S.

    1995-01-01

    The UNEP Greenhouse Gas Abatement Costing Studies, carried out under the management of the UNEP Collaborating Centre on Energy and Environment at Risoe National Laboratory in Denmark, have placed effort into generating methodological approaches for assessing the cost of abatement activities to reduce CO 2 emissions. These efforts have produced perhaps the most comprehensive set of methodological approaches to defining and assessing the cost of greenhouse gas abatement. Perhaps the most important aspect of the UNEP study, which involved teams of researchers from ten countries, is the mix of countries in which the studies were conducted and hence the representation of views and concepts from researchers in these countries, particularly those from developing countries, namely Zimbabwe, India, Venezuela, Brazil, Thailand and Senegal. Methodological lessons from Zimbabwe, therefore, would have benefited from interaction with the methodological experiences of the other participating countries. Methodological lessons from the Zimbabwean study can be placed in two categories. One relates to the modelling tools used to analyze economic trends and the various factors studied in order to determine the unit cost of CO 2 abatement. The other is the definition of factors influencing the levels of emissions reducible and those realised under specific economic trends. (au)

  6. PLACE-BASED GREEN BUILDING: INTEGRATING LOCAL ENVIRONMENTAL AND PLANNING ANALYSIS INTO GREEN BUILDING GUIDELINES

    Science.gov (United States)

    This project will develop a model for place-based green building guidelines based on an analysis of local environmental, social, and land use conditions. The ultimate goal of this project is to develop a methodology and model for placing green buildings within their local cont...

  7. A human error analysis methodology, AGAPE-ET, for emergency tasks in nuclear power plants and its application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    This report presents a procedurised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), for both the qualitative error analysis and the quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. AGAPE-ET is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified considering the characteristics of the performance of each cognitive function and the influencing mechanism of performance influencing factors (PIFs) on the cognitive function. Error analysis items have then been determined from the identified error causes or error-likely situations to cue or guide the analysts in the overall human error analysis. A human error analysis procedure based on the error analysis items is organised. The basic scheme for the quantification of HEP consists in multiplying the basic HEP (BHEP) assigned to the error analysis item by the weight from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method is characterised by the structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analysts need only deal with the relevant cognitive functions. The report also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results. 42 refs., 7 figs., 36 tabs. (Author)

  8. A network-based analysis of CMIP5 "historical" experiments

    Science.gov (United States)

    Bracco, A.; Foudalis, I.; Dovrolis, C.

    2012-12-01

    In computer science, "complex network analysis" refers to a set of metrics, modeling tools and algorithms commonly used in the study of complex nonlinear dynamical systems. Its main premise is that the underlying topology or network structure of a system has a strong impact on its dynamics and evolution. By allowing the investigation of local and non-local statistical interactions, network analysis provides a powerful, but only marginally explored, framework to validate climate models and investigate teleconnections, assessing their strength, range, and impacts on the climate system. In this work we propose a new, fast, robust and scalable methodology to examine, quantify, and visualize climate sensitivity, while constraining general circulation model (GCM) outputs with observations. The goal of our novel approach is to uncover relations in the climate system that are not (or not fully) captured by more traditional methodologies used in climate science and often adopted from nonlinear dynamical systems analysis, and to explain known climate phenomena in terms of the network structure or its metrics. Our methodology is based on a solid theoretical framework and employs mathematical and statistical tools that have been exploited only tentatively in climate research so far. Suitably adapted to the climate problem, these tools can assist in visualizing the trade-offs in representing global links and teleconnections among different data sets. Here we present the methodology and compare network properties for different reanalysis data sets and a suite of CMIP5 coupled GCM outputs. With an extensive model intercomparison in terms of the climate network that each model leads to, we quantify how each model reproduces major teleconnections, rank model performances, and identify common or specific errors in comparing model outputs and observations.
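
    A common entry point to such climate networks is thresholding correlations between grid-point time series; nodes are grid points and edges are strong statistical links. A toy sketch on synthetic data (the methodology in the abstract is considerably more elaborate):

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(4)
n_nodes, n_time = 40, 200               # grid points x time steps (toy sizes)
common = rng.normal(size=n_time)        # shared "teleconnection" signal
data = rng.normal(size=(n_nodes, n_time))
data[:10] += 0.8 * common               # the first 10 nodes share the signal

corr = np.corrcoef(data)
G = nx.Graph()
G.add_nodes_from(range(n_nodes))
for i in range(n_nodes):
    for j in range(i + 1, n_nodes):
        if abs(corr[i, j]) > 0.5:       # a strong link becomes a network edge
            G.add_edge(i, j, weight=corr[i, j])

degrees = dict(G.degree())
hubs = sorted(degrees, key=degrees.get, reverse=True)[:5]
print("highest-degree nodes (candidate teleconnection hubs):", hubs)
```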

  9. Safety analysis methodologies for radioactive waste repositories in shallow ground

    International Nuclear Information System (INIS)

    1984-01-01

    The report is part of the IAEA Safety Series and is addressed to authorities and specialists responsible for or involved in planning, performing and/or reviewing safety assessments of shallow ground radioactive waste repositories. It discusses approaches that are applicable for safety analysis of a shallow ground repository. The methodologies, analysis techniques and models described are pertinent to the task of predicting the long-term performance of a shallow ground disposal system. They may be used during the processes of selection, confirmation and licensing of new sites and disposal systems or to evaluate the long-term consequences in the post-sealing phase of existing operating or inactive sites. The analysis may point out need for remedial action, or provide information to be used in deciding on the duration of surveillance. Safety analysis both general in nature and specific to a certain repository, site or design concept, are discussed, with emphasis on deterministic and probabilistic studies

  10. Patient's radioprotection and analysis of DPC practices and certification of health facilities - Methodological guide

    International Nuclear Information System (INIS)

    Bataillon, Remy; Lafont, Marielle; Rousse, Carole; Vuillez, Jean-Philippe; Ducou Le Pointe, Hubert; Grenier, Nicolas; Lartigau, Eric; Orcel, Philippe; Dujarric, Francis; Beaupin, Alain; Bar, Olivier; Blondet, Emmanuelle; Combe, Valerie; Pages, Frederique

    2012-11-01

    This methodological guide has been published in compliance with French and European regulatory texts to define the modalities of implementation of the assessment of clinical practices resulting in exposure to ionizing radiation in medical environment (radiotherapy, radio-surgery, interventional radiology, nuclear medicine), to promote clinical audits, and to ease the implementation of programs of continuous professional development in radiotherapy, radiology and nuclear medicine. This guide proposes an analysis of professional practices through analysis sheets which address several aspects: scope, practice data, objectives in terms of improvement of radiation protection, regulatory and institutional references, operational objectives, methods, approaches and tools, follow-up indicators, actions to improve practices, professional target, collective approach, program organisation, and program valorisation in existing arrangements. It also gives 20 program proposals which notably aim at a continuous professional development, 5 of them dealing with diagnosis-oriented imagery-based examinations, 9 with radiology and risk management, 4 with radiotherapy, and 2 with nuclear medicine

  11. Study of methodology for low power/shutdown fire PSA

    International Nuclear Information System (INIS)

    Yan Zhen; Li Zhaohua; Li Lin; Song Lei

    2014-01-01

    As a probability-based risk assessment technology, fire PSA is accepted abroad by the nuclear industry for application to the risk assessment of nuclear power plants. Industry experience shows that the fire-induced impact on plant safety during low power and shutdown operation cannot be neglected; fire PSA can therefore be used to assess the corresponding fire risk. However, to date there is no corresponding domestic guidance or standard, nor an accepted analysis methodology. By investigating the latest developments in fire PSA for low power and shutdown operation, and integrating their characteristics with the corresponding engineering experience, an engineering methodology to evaluate the fire risk during low power and shutdown operation for nuclear power plants is established in this paper. In addition, an analysis demonstration is given as an example. (authors)

  12. [Activity-based costing methodology to manage resources in intensive care units].

    Science.gov (United States)

    Alvear V, Sandra; Canteros G, Jorge; Jara M, Juan; Rodríguez C, Patricia

    2013-11-01

    An accurate estimation of resource use by individual patients is crucial in hospital management. To measure the financial costs of health care actions in intensive care units of two public regional hospitals in Chile. Prospective follow up of 716 patients admitted to two intensive care units during 2011. The financial costs of health care activities were calculated using the Activity-Based Costing methodology. The main activities recorded were procedures and treatments, monitoring, response to patient needs, patient maintenance and coordination. Activity-based costs, including human resources and assorted indirect costs, correspond to 81 to 88% of costs per disease in one hospital and 69 to 80% in the other. The costs associated with procedures and treatments are the most significant and amount to approximately $100,000 (Chilean pesos) per day of hospitalization. The second most significant cost corresponds to coordination activities, which fluctuates between $86,000 and $122,000 (Chilean pesos). There are significant differences in resource use between the two hospitals studied. Therefore, cost estimation methodologies should be incorporated in the management of these clinical services.

  13. Sociocultural Meanings of Nanotechnology: Research Methodologies

    Science.gov (United States)

    Bainbridge, William Sims

    2004-06-01

    This article identifies six social-science research methodologies that will be useful for charting the sociocultural meaning of nanotechnology: web-based questionnaires, vignette experiments, analysis of web linkages, recommender systems, quantitative content analysis, and qualitative textual analysis. Data from a range of sources are used to illustrate how the methods can delineate the intellectual content and institutional structure of the emerging nanotechnology culture. Such methods will make it possible in future to test hypotheses such as that there are two competing definitions of nanotechnology - the technical-scientific and the science-fiction - that are influencing public perceptions by different routes and in different directions.

  14. Development of Thermal-hydraulic Analysis Methodology for Multi-module Breeding Blankets in K-DEMO

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Geon-Woo; Lee, Jeong-Hun; Park, Goon-Cherl; Cho, Hyoung-Kyu [Seoul National University, Seoul (Korea, Republic of); Im, Kihak [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    In this paper, the purpose of the analyses is to extend the capability of MARS-KS to the entire blanket system, which includes a few hundred single blanket modules. The plan for the whole blanket system analysis using MARS-KS is then introduced and the result of the multiple blanket module analysis is summarized. MARS-KS, a thermal-hydraulic analysis code for nuclear reactor safety, was applied to the thermal analysis of the conceptual design of the K-DEMO breeding blanket. A methodology to simulate multiple blanket modules was then proposed, which uses a supervisor program to first handle each blanket module individually and then distribute the flow rate considering the pressure drop arising in each module. For a feasibility test of the proposed methodology, 10 outboard blankets in a toroidal field sector, connected with each other through common inlet and outlet headers, were simulated. The calculated flow rates, pressure drops, and temperatures showed the validity of the calculation, and thanks to parallelization using MPI, an almost linear speed-up could be obtained.
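
    The supervisor's flow-split logic can be pictured with a simple hydraulic analogy: for modules in parallel between common headers, the distributed flows must equalize the module pressure drops. A minimal sketch assuming a quadratic loss law with invented coefficients (not the MARS-KS coupling itself):

```python
import numpy as np

# Parallel modules between common headers: dp = k_i * m_i**2 must be the
# same for every module, with the module flows summing to the loop total.
k = np.array([1.0, 1.2, 0.9, 1.5, 1.1])   # invented loss coefficients
m_total = 50.0                             # total mass flow [kg/s]

# With a common dp, m_i = sqrt(dp / k_i); imposing the sum gives dp.
inv_sqrt_k = 1.0 / np.sqrt(k)
dp = (m_total / inv_sqrt_k.sum()) ** 2
m = np.sqrt(dp / k)

print("common pressure drop:", round(dp, 2))
print("module flows:", m.round(2), "sum =", round(m.sum(), 2))
```

    In the actual coupling the supervisor has to iterate, because each module's pressure drop comes from a full MARS-KS solution rather than a closed-form loss law.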

  15. Methodology based on genetic heuristics for in-vivo characterizing the patient-specific biomechanical behavior of the breast tissues.

    Science.gov (United States)

    Lago, M A; Rúperez, M J; Martínez-Martínez, F; Martínez-Sanchis, S; Bakic, P R; Monserrat, C

    2015-11-30

    This paper presents a novel methodology for the in-vivo estimation of the elastic constants of a constitutive model proposed to characterize the mechanical behavior of breast tissues. An iterative search algorithm based on genetic heuristics was constructed to estimate these parameters in vivo using only medical images, thus avoiding invasive measurements of the mechanical response of the breast tissues. For the first time, a combination of overlap and distance coefficients was used for the evaluation of the similarity between a deformed MRI of the breast and a simulation of that deformation. The methodology was validated using breast software phantoms for virtual clinical trials, compressed to mimic MRI-guided biopsies. The biomechanical model chosen to characterize the breast tissues was an anisotropic neo-Hookean hyperelastic model. Results from this analysis showed that the algorithm is able to find the elastic constants of the constitutive equations of the proposed model with a mean relative error of about 10%. Furthermore, the overlap between the reference deformation and the simulated deformation was around 95%, showing the good performance of the proposed methodology. This methodology can be easily extended to characterize the real biomechanical behavior of breast tissues, which is a great novelty in the field of breast behavior simulation for applications such as surgical planning, surgical guidance and cancer diagnosis. This reveals the impact and relevance of the presented work.
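
    The similarity measure driving the genetic search, a combination of overlap and distance coefficients between the deformed MRI and the simulated deformation, can be sketched as follows. The Dice overlap, the mean landmark-distance proxy and the weighting are assumptions; the paper's exact combination may differ:

```python
import numpy as np

def dice_overlap(a, b):
    """Dice coefficient between two binary segmentation masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def similarity(ref_mask, sim_mask, ref_pts, sim_pts, w=0.5):
    """Score a candidate set of elastic constants by image similarity."""
    overlap = dice_overlap(ref_mask, sim_mask)
    # Distance term: mean nearest-neighbour distance between landmark sets.
    d = np.linalg.norm(ref_pts[:, None, :] - sim_pts[None, :, :], axis=2)
    distance = d.min(axis=1).mean()
    return w * overlap - (1.0 - w) * distance   # higher is better

# Invented toy masks and landmarks standing in for MRI segmentations.
ref = np.zeros((32, 32), int); ref[8:24, 8:24] = 1
sim = np.zeros((32, 32), int); sim[10:26, 8:24] = 1
pts_r, pts_s = np.argwhere(ref)[::50], np.argwhere(sim)[::50]
print("similarity score:", round(similarity(ref, sim, pts_r, pts_s), 3))
```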

  16. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Science.gov (United States)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

  17. Steel catenary riser design based on coupled analysis methodology; Projeto de um riser rigido em catenaria baseado em metodologia de analise acoplada

    Energy Technology Data Exchange (ETDEWEB)

    Bahiense, Rodrigo A.; Rodrigues, Marcos V. [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil)

    2008-07-01

    Due to the move of oil and gas production systems into deep and ultra-deep water, reaching frontiers that were not conceivable before, research in computational methods has contributed to the implementation of sophisticated and efficient numerical tools to perform simulations under those operational conditions. This work presents the design of a steel catenary riser (SCR) connected to a semi-submersible platform, where the motions applied to the SCR are obtained from analyses based on de-coupled and coupled methodologies. As the number of lines in the system and the water depth increase, the effects of line coupling on the platform motions become significant. This can be observed in the evaluations performed for a floating production system, comparing the results obtained for the SCR under extreme and fatigue conditions when it was subjected to the platform motions from both methodologies. From the results obtained it can be concluded that, for the production system evaluated, the de-coupled methodology yields more conservative results. The coupled model, associated with the design practice of production lines, is thus a more realistic and accurate alternative that enables optimization in the design of these structures. (author)

  18. Research on Methodology to Prioritize Critical Digital Assets based on Nuclear Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Wonjik; Kwon, Kookheui; Kim, Hyundoo [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2016-10-15

    Digital systems are used in nuclear facilities to monitor and control various types of field devices, as well as to obtain and store vital information. It is therefore increasingly important for nuclear facilities to protect digital systems from cyber-attack, in terms of safe operation and public health, since a cyber compromise of these systems could lead to unacceptable radiological consequences. Based on KINAC/RS-015, a cyber security regulatory standard, regulatory activities for cyber security at nuclear facilities generally focus on critical digital assets (CDAs), which are safety, security, and emergency preparedness related digital assets. Critical digital assets are estimated to account for over 60% of all digital assets in a nuclear power plant. It was therefore necessary to prioritize critical digital assets to improve the efficiency of regulation and implementation. In this paper, the status of research on a methodology for prioritizing critical digital assets based on nuclear risk assessment is introduced. To derive the digital assets that directly affect accidents, PRA results (event trees, fault trees, and minimal cut sets) are analyzed. According to the results of the analysis, the digital systems related to core damage are the ESF-CCS (safety-related component control system) and the Process-CCS (non-safety-related component control system), as well as the Engineered Safety Features Actuation System (ESFAS). These digital assets can be identified as Vital Digital Assets (VDAs). Hereafter, to develop a general methodology for identifying accident-related VDAs among CDAs, (1) a method using the minimal cut set results of the PRA model will be studied, and (2) a method quantifying the results of a digital I and C PRA, performed to reflect all system-related digital cabinets in the fault trees, will be studied.
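
    The minimal-cut-set screening mentioned above can be illustrated with a tiny fault-tree expansion, in the style of the classic MOCUS algorithm, on an invented tree (not the plant PRA model):

```python
from itertools import product

# Invented fault tree: TOP = OR(AND(esfas_fail, ccs_fail), vda_common)
tree = {
    "TOP": ("OR",  ["G1", "vda_common"]),
    "G1":  ("AND", ["esfas_fail", "ccs_fail"]),
}

def cut_sets(node):
    """Expand a gate into its cut sets (each a frozenset of basic events)."""
    if node not in tree:                      # leaf: a basic event
        return [frozenset([node])]
    kind, inputs = tree[node]
    child_sets = [cut_sets(c) for c in inputs]
    if kind == "OR":                          # union of the children's cut sets
        return [cs for sets in child_sets for cs in sets]
    # AND gate: cross-product, merging one cut set from every child.
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimize(sets):
    """Keep only minimal cut sets (discard proper supersets)."""
    return [s for s in sets if not any(other < s for other in sets)]

for mcs in minimize(cut_sets("TOP")):
    print(sorted(mcs))
```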

  19. Research on Methodology to Prioritize Critical Digital Assets based on Nuclear Risk Assessment

    International Nuclear Information System (INIS)

    Kim, Wonjik; Kwon, Kookheui; Kim, Hyundoo

    2016-01-01

    Digital systems are used in nuclear facilities to monitor and control various types of field devices, as well as to obtain and store vital information. It is therefore increasingly important for nuclear facilities to protect digital systems from cyber-attack, in terms of safe operation and public health, since cyber compromise of these systems could lead to unacceptable radiological consequences. Based on KINAC/RS-015, a cyber security regulatory standard, regulatory activities for cyber security at nuclear facilities generally focus on critical digital assets (CDAs), which are digital assets related to safety, security, and emergency preparedness. Critical digital assets are estimated to account for over 60% of all digital assets in a nuclear power plant. It is therefore necessary to prioritize critical digital assets to improve the efficiency of regulation and implementation. This paper introduces the status of research on a methodology for prioritizing critical digital assets based on nuclear risk assessment. To identify the digital assets that directly affect accidents, PRA results (event trees, fault trees, and minimal cut sets) are analyzed. According to the results of this analysis, the digital systems related to core damage are the ESF-CCS (safety-related component control system) and the Process-CCS (non-safety-related component control system), as well as the Engineered Safety Features Actuation System (ESFAS). These digital assets can be identified as Vital Digital Assets (VDAs). To develop a general methodology that identifies accident-related VDAs among the CDAs, (1) a method using the minimal cut sets of the PRA model and (2) a method quantifying the results of a digital I and C PRA, performed so that all digital cabinets related to a system are reflected in the fault tree, will be studied.

  20. A methodology for stochastic analysis of share prices as Markov chains with finite states.

    Science.gov (United States)

    Mettle, Felix Okoe; Quaye, Enoch Nii Boi; Laryea, Ravenhill Adjetey

    2014-01-01

    Price volatility makes stock investments risky, leaving investors in a critical position when decisions are made under uncertainty. To improve investors' confidence in evaluating exchange markets, while not using time series methodology, we specify equity price changes as a stochastic process assumed to possess Markov dependency, with state transition probability matrices following the identified state space (i.e. decrease, stable or increase). We establish that the identified states communicate and that the chains are aperiodic and ergodic, thus possessing limiting distributions. We develop a methodology for determining the expected mean return time for stock price increases, and establish criteria for improving investment decisions based on the highest transition probabilities, lowest mean return times and highest limiting distributions. We further develop an R algorithm for running the methodology introduced. The established methodology is applied to selected equities from Ghana Stock Exchange weekly trading data.
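
    A minimal sketch of this kind of chain follows, written in Python (the authors' R implementation is not reproduced here); the weekly prices are fabricated for illustration only.

```python
# Minimal sketch, assuming fabricated weekly prices: estimate a 3-state
# Markov chain (decrease/stable/increase), then compute its limiting
# distribution and the mean return times of the states.
import numpy as np

def classify(prev, curr, tol=1e-9):
    """State index for a weekly change: 0=decrease, 1=stable, 2=increase."""
    if curr < prev - tol:
        return 0
    if curr > prev + tol:
        return 2
    return 1

def transition_matrix(prices):
    states = [classify(p, q) for p, q in zip(prices, prices[1:])]
    counts = np.zeros((3, 3))
    for s, t in zip(states, states[1:]):
        counts[s, t] += 1
    return counts / counts.sum(axis=1, keepdims=True)  # row-normalize

# Fabricated weekly closing prices (not Ghana Stock Exchange data).
prices = [10.0, 10.2, 10.2, 9.9, 10.1, 10.4, 10.4, 10.3, 10.6, 10.8]
P = transition_matrix(prices)

# Limiting distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi /= pi.sum()

# For an ergodic chain, the mean return time of state i is 1 / pi_i.
for name, p_i in zip(["decrease", "stable", "increase"], pi):
    print(f"{name}: limiting prob = {p_i:.3f}, mean return time = {1/p_i:.2f} weeks")
```

    Under the stated criteria, an equity whose "increase" state combines a high transition probability, a short mean return time and a high limiting probability would be preferred.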

  1. More Than Just a Discursive Practice? Conceptual Principles and Methodological Aspects of Dispositif Analysis

    Directory of Open Access Journals (Sweden)

    Andrea D. Bührmann

    2007-05-01

    Full Text Available This article gives an introduction to the conceptual and practical field of dispositif analysis, a field that is of great importance but is as yet underdeveloped. To frame this introduction, we first explain the terms discourse and dispositif. We then examine the conceptual instruments and methodological procedures of dispositif analysis. In this way, we define the relations between discourse and (a) non-discursive practices, (b) subjectification, (c) everyday orders of knowledge and (d) institutional practices such as societal changes as the central issues of dispositif analysis. Furthermore, we point out the methodological possibilities and limitations of dispositif analysis and demonstrate them with some practical examples. In general, this article aims to extend the perspectives of discourse theory and research by stressing the relations between normative orders of knowledge, their effects on interactions, and the individual self-reflections connected with them. URN: urn:nbn:de:0114-fqs0702281

  2. Methodological exploratory study applied to occupational epidemiology

    Energy Technology Data Exchange (ETDEWEB)

    Carneiro, Janete C.G. Gaburo; Vasques, Monica Heloisa B.; Fontinele, Ricardo S.; Sordi, Gian Maria A. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]. E-mail: janetegc@ipen.br

    2007-07-01

    The utilization of epidemiologic methods and techniques has been the object of practical experimentation and theoretical-methodological reflection in the health planning and programming process. Occupational Epidemiology is the study of the causes and prevention of diseases and injuries arising from exposure and risks in the work environment. In this context, there is no intention to exhaust such a complex theme, but rather to deal with basic concepts of Occupational Epidemiology, presenting the main characteristics of the analysis methods used in epidemiology and investigating the possible determinants of exposure (chemical, physical and biological agents). For this study, the socio-demographic profile of the IPEN-CNEN/SP work force was used. The characterization of this reference population is based on sex, age, educational level, marital status and occupation, aiming to establish the relation between health-aggravating factors and these variables. The methodology refers to non-experimental research based on theoretical-methodological practice. The work has an exploratory character, aiming at a later survey of health indicators in order to analyze possible correlations related to epidemiologic issues. (author)

  3. Methodological exploratory study applied to occupational epidemiology

    International Nuclear Information System (INIS)

    Carneiro, Janete C.G. Gaburo; Vasques, Monica Heloisa B.; Fontinele, Ricardo S.; Sordi, Gian Maria A.

    2007-01-01

    The utilization of epidemiologic methods and techniques has been the object of practical experimentation and theoretical-methodological reflection in the health planning and programming process. Occupational Epidemiology is the study of the causes and prevention of diseases and injuries arising from exposure and risks in the work environment. In this context, there is no intention to exhaust such a complex theme, but rather to deal with basic concepts of Occupational Epidemiology, presenting the main characteristics of the analysis methods used in epidemiology and investigating the possible determinants of exposure (chemical, physical and biological agents). For this study, the socio-demographic profile of the IPEN-CNEN/SP work force was used. The characterization of this reference population is based on sex, age, educational level, marital status and occupation, aiming to establish the relation between health-aggravating factors and these variables. The methodology refers to non-experimental research based on theoretical-methodological practice. The work has an exploratory character, aiming at a later survey of health indicators in order to analyze possible correlations related to epidemiologic issues. (author)

  4. GO methodology. Volume 1. Overview manual

    International Nuclear Information System (INIS)

    1983-06-01

    The GO methodology is a success-oriented probabilistic system performance analysis technique. The methodology can be used to quantify system reliability and availability, identify and rank critical components and the contributors to system failure, construct event trees, and perform statistical uncertainty analysis. Additional capabilities of the method currently under development will enhance its use in evaluating the effects of external events and common cause failures on system performance. This Overview Manual provides a description of the GO methodology, how it can be used, and the benefits of using it in the analysis of complex systems.
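
    As a hedged illustration of what a success-oriented quantification looks like (this is generic series/parallel success propagation, not the GO code's operator set), consider:

```python
# Minimal sketch, assuming a hypothetical system: propagating component
# success probabilities through series and parallel (redundant) blocks.
def series(*probs):
    """All components must succeed."""
    p = 1.0
    for x in probs:
        p *= x
    return p

def parallel(*probs):
    """At least one of the redundant components must succeed."""
    q = 1.0
    for x in probs:
        q *= (1.0 - x)
    return 1.0 - q

# Hypothetical system: a source feeding two redundant pump trains and a valve.
p_source, p_pump_a, p_pump_b, p_valve = 0.99, 0.95, 0.95, 0.98
p_system = series(p_source, parallel(p_pump_a, p_pump_b), p_valve)
print(f"System success probability: {p_system:.4f}")
```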

  5. Project based learning in organizations: towards a methodology for learning in groups

    NARCIS (Netherlands)

    Poell, R.F.; Krogt, F.J. van der

    2003-01-01

    This article introduces a methodology for employees in organizations to set up and carry out their own group learning projects. It is argued that employees can use project-based learning to make their everyday learning more systematic at times, without necessarily formalizing it. The article

  6. Project-based learning in organizations : Towards a methodology for learning in groups

    NARCIS (Netherlands)

    Poell, R.F.; van der Krogt, F.J.

    2003-01-01

    This article introduces a methodology for employees in organizations to set up and carry out their own group learning projects. It is argued that employees can use project-based learning to make their everyday learning more systematic at times, without necessarily formalizing it. The article

  7. A case study for INPRO methodology based on Indian advanced heavy water reactor

    International Nuclear Information System (INIS)

    Anantharaman, K.; Saha, D.; Sinha, R.K.

    2004-01-01

    Under Phase 1A of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO), a methodology (the INPRO methodology) has been developed which can be used to evaluate a given energy system, or a component of such a system, on a national and/or global basis. An INPRO study can be used to assess the potential of an innovative reactor in terms of economics, sustainability and environment, safety, waste management, proliferation resistance and cross-cutting issues. India, a participant in the INPRO program, is engaged in a case study applying the INPRO methodology to the Advanced Heavy Water Reactor (AHWR). The AHWR is a 300 MWe, boiling light water cooled, heavy water moderated, vertical pressure tube type reactor. Thorium utilization is essential for the Indian nuclear power program considering indigenous resource availability, and the AHWR is designed to produce most of its power from thorium, aided by a small input of plutonium-based fuel. The features of the AHWR are described in the paper. The case study covers the fuel cycle to be followed for the AHWR in the near future. The paper presents initial observations of the case study with regard to fuel cycle issues. (authors)

  8. ICP-MS/MS-Based Ionomics: A Validated Methodology to Investigate the Biological Variability of the Human Ionome.

    Science.gov (United States)

    Konz, Tobias; Migliavacca, Eugenia; Dayon, Loïc; Bowman, Gene; Oikonomidi, Aikaterini; Popp, Julius; Rezzi, Serge

    2017-05-05

    We here describe the development, validation and application of a quantitative methodology for the simultaneous determination of 29 elements in human serum using state-of-the-art inductively coupled plasma triple quadrupole mass spectrometry (ICP-MS/MS). This new methodology offers high-throughput elemental profiling using simple dilution of a minimal quantity of serum. We report the outcomes of the validation procedure, including limits of detection/quantification, linearity of calibration curves, precision, recovery and measurement uncertainty. ICP-MS/MS-based ionomics was used to analyze the serum of 120 older adults. Following a metabolomic data mining approach, the generated ionome profiles were subjected to principal component analysis, revealing gender- and age-specific differences. The ionome of female individuals was marked by higher levels of calcium, phosphorus, copper and copper-to-zinc ratio, while iron concentration was lower with respect to male subjects. Age was associated with lower concentrations of zinc. These findings were complemented with additional readouts to interpret micronutrient status, including ceruloplasmin, ferritin and inorganic phosphate. Our data support a gender-specific compartmentalization of the ionome that may reflect different bone remodelling in female individuals. Our ICP-MS/MS methodology enriches the panel of validated "Omics" approaches to study molecular relationships between the exposome and the ionome in relation to nutrition and health.

  9. An Indicator Based Assessment Methodology Proposal for the Identification of Domestic Systemically Important Banks within the Turkish Banking Sector

    OpenAIRE

    Ozge ULKUTAS SACCI; Guven SAYILGAN

    2014-01-01

    This study aims to identify domestic systemically important banks (D-SIBs) operating within the Turkish Banking Sector. In this regard, adopting an indicator-based assessment methodology together with a cluster analysis application, the banks in the sample are classified in terms of their degree of systemic importance using publicly available year-end data for 2012. The study shows that a total of 7 banks with the highest systemic importance clustered away from the remaining 21 banks in the sample.
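
    A minimal sketch of the indicator-plus-clustering idea follows; the indicator matrix is randomly generated and the equal weighting is an assumption, not the study's calibration.

```python
# Minimal sketch, assuming fabricated indicator data for 28 banks:
# composite systemic-importance scores clustered into two groups.
import numpy as np

rng = np.random.default_rng(0)
# Rows = banks; columns = normalized indicators (e.g. size,
# interconnectedness, substitutability, complexity).
indicators = rng.random((28, 4))
composite = indicators.mean(axis=1)  # equal-weighted composite score

def kmeans_1d(x, k=2, iters=100):
    """Tiny k-means on scalar scores; returns cluster labels."""
    centers = np.quantile(x, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return labels

labels = kmeans_1d(composite)
high = np.argmax([composite[labels == j].mean() for j in range(2)])
print("Candidate D-SIBs (bank indices):", np.where(labels == high)[0])
```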

  10. A proposed descriptive methodology for environmental geologic (envirogeologic) site characterization

    International Nuclear Information System (INIS)

    Schwarz, D.L.; Snyder, W.S.

    1994-01-01

    We propose a descriptive methodology for use in environmental geologic (envirogeologic) site characterization. The method uses traditional sedimentologic descriptions augmented by environmental data needs, and facies analysis. Most other environmental methodologies for soil and sediment characterization use soil engineering and engineering geology techniques that classify by texture and engineering properties. Such techniques are inadequate for envirogeologic characterization of sediments, in part because of differences in grain-size between the Unified Soil Classification and the Udden-Wentworth scales. Use of the soil grain-size classification could easily cause confusion when attempting to relate descriptions based on it to our basic understanding of sedimentary depositional systems. The proposed envirogeologic method uses descriptive parameters to characterize a sediment sample, suggests specific tests on samples for adequate characterization, and provides guidelines for subsurface facies analysis, based on data retrieved from shallow boreholes, that allow better predictive models to be developed. This methodology should allow for a more complete site assessment and provide sufficient data for selection of the appropriate remediation technology, including bioremediation. 50 refs

  11. The physical vulnerability of elements at risk: a methodology based on fluid and classical mechanics

    Science.gov (United States)

    Mazzorana, B.; Fuchs, S.; Levaggi, L.

    2012-04-01

    The impacts of the flood events that occurred in autumn 2011 in the Italian regions of Liguria and Tuscany revived the engagement of public decision makers to enhance, in synergy, flood control and land use planning. In this context, the design of efficient flood risk mitigation strategies and their subsequent implementation critically rely on a careful vulnerability analysis of both the immobile and the mobile elements at risk potentially exposed to flood hazards. Based on notions of fluid and classical mechanics, we developed computation schemes enabling a dynamic vulnerability and risk analysis for a broad typological variety of elements at risk. The methodological skeleton consists of (1) hydrodynamic computation of the time-varying flood intensities resulting, for each element at risk, in a succession of loading configurations; (2) modelling the mechanical response of the impacted elements through static, elasto-static and dynamic analyses; (3) characterising the mechanical response through proper structural damage variables; and (4) economic valuation of the expected losses as a function of the quantified damage variables. From a computational perspective, we coupled the description of the hydrodynamic flow behaviour and the induced structural modifications of the exposed elements at risk. Valuation methods suitable to support a correct mapping from the value domains of the physical damage variables to economic loss values are discussed. In this way we aim to complement, from a methodological perspective, the existing, mainly empirical, vulnerability and risk assessment approaches and to refine the conceptual framework of cost-benefit analysis. Moreover, we aim to support the design of effective flood risk mitigation strategies by diminishing the main criticalities within systems prone to flood risk.
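
    The chain from loading to loss can be made concrete with a deliberately simplified sketch; the pressure formula is textbook hydrostatics plus dynamic pressure, while the resistance, damage and loss mappings below are assumed placeholders rather than the paper's computation schemes.

```python
# Minimal sketch, assuming a linear damage-to-loss mapping: from a
# time-varying flood loading to a damage variable and an expected loss.
RHO = 1000.0  # water density, kg/m^3
G = 9.81      # gravitational acceleration, m/s^2

def hydrodynamic_pressure(depth_m, velocity_ms):
    """Hydrostatic plus dynamic pressure on the exposed face, in Pa."""
    return RHO * G * depth_m + 0.5 * RHO * velocity_ms**2

def damage_variable(peak_pressure_pa, resistance_pa):
    """Dimensionless damage in [0, 1]: demand over capacity, capped at 1."""
    return min(1.0, peak_pressure_pa / resistance_pa)

def expected_loss(damage, reconstruction_value):
    """Assumed linear mapping from physical damage to monetary loss."""
    return damage * reconstruction_value

# Hypothetical loading history for one element at risk: (depth m, velocity m/s).
history = [(0.5, 1.0), (1.2, 2.5), (0.8, 1.5)]
peak = max(hydrodynamic_pressure(d, v) for d, v in history)
dmg = damage_variable(peak, resistance_pa=20_000.0)
print(f"Damage = {dmg:.2f}, expected loss = {expected_loss(dmg, 250_000.0):,.0f} EUR")
```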

  12. Qualitative safety analysis in accelerator based systems

    International Nuclear Information System (INIS)

    Sarkar, P.K.; Chowdhury, Lekha M.

    2006-01-01

    In recent developments connected to high energy and high current accelerators, accelerator driven systems (ADS) and Radioactive Ion Beam (RIB) facilities have come to the forefront of application. For medical and industrial applications, high current accelerators often need to be located in populated areas. These facilities pose significant radiological hazards during operation and in accidental situations. We have performed a qualitative radiological safety analysis using probabilistic safety analysis (PSA) methods for accelerator-based systems. The major contribution to hazard comes from a target rupture scenario in both ADS and RIB facilities. Other significant contributors to hazard in these facilities are also discussed using fault tree and event tree methodologies. (author)
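
    To make the event tree idea concrete, the sketch below propagates a target-rupture initiating frequency through two mitigating systems; all names and numbers are hypothetical, not values from the analysis.

```python
# Minimal sketch, assuming hypothetical frequencies: an event tree for a
# target-rupture initiating event in an accelerator facility.
initiating_freq = 1e-3  # assumed target ruptures per year

# Each branch: (mitigating system, probability of failure on demand).
branches = [("beam trip", 1e-2), ("ventilation isolation", 5e-2)]

def event_tree(freq, branches):
    """Enumerate end states as (sequence description, frequency per year)."""
    sequences = [("", freq)]
    for system, p_fail in branches:
        expanded = []
        for name, f in sequences:
            expanded.append((name + f"{system}: ok; ", f * (1 - p_fail)))
            expanded.append((name + f"{system}: fail; ", f * p_fail))
        sequences = expanded
    return sequences

for seq, f in event_tree(initiating_freq, branches):
    print(f"{f:.2e}/yr  {seq}")
```

    The all-fail sequence has the lowest frequency but the worst consequences, which is why it tends to dominate the hazard discussion.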

  13. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albeanu, Grigore; Popenţiu-Vlădicescu, Florin

    2012-01-01

    Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing reliability and performability. This paper illustrates the usage of the intuitionistic fuzzy degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed for complex software systems reliability optimization under various constraints.
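
    A small sketch of the intuitionistic fuzzy representation (membership, non-membership and hesitation) follows; the component names, values and the defuzzification rule are assumptions for illustration, not the paper's algorithms.

```python
# Minimal sketch, assuming illustrative assessments: intuitionistic fuzzy
# quality of software components via membership (mu), non-membership (nu)
# and the hesitation margin pi = 1 - mu - nu.
components = {
    # name: (mu = evidence the component is reliable, nu = evidence it is not)
    "parser":    (0.80, 0.10),
    "scheduler": (0.60, 0.25),
    "ui":        (0.70, 0.20),
}

for name, (mu, nu) in components.items():
    assert 0.0 <= mu + nu <= 1.0, "intuitionistic fuzzy constraint"
    pi = 1.0 - mu - nu  # hesitation: the share that is simply unknown
    # Assumed defuzzification: credit half of the hesitation to reliability.
    score = mu + 0.5 * pi
    print(f"{name}: mu={mu}, nu={nu}, hesitation={pi:.2f}, score={score:.2f}")
```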

  14. Optimum extrusion-cooking conditions for improving physical properties of fish-cereal based snacks by response surface methodology.

    Science.gov (United States)

    Singh, R K Ratankumar; Majumdar, Ranendra K; Venkateshwarlu, G

    2014-09-01

    To establish the effect of barrel temperature, screw speed, total moisture and fish flour content on the expansion ratio and bulk density of fish-based extrudates, response surface methodology was adopted in this study. The experiments were optimized using a five-level, four-factor central composite design. Analysis of variance was carried out to study the main effects and interaction effects of the various factors, and regression analysis was carried out to explain the variability. A second-order model in the coded variables was fitted for each response. Response surface plots were developed as a function of two independent variables while keeping the other two independent variables at their optimal values. Based on the ANOVA, the fitted model was confirmed for both dependent variables. The highest organoleptic score was obtained with the combination of temperature 110 °C, screw speed 480 rpm, moisture 18 % and fish flour 20 %.
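
    The second-order fit at the heart of response surface methodology can be sketched as follows; the design points and responses are fabricated, and only two of the four factors are shown to keep the example small.

```python
# Minimal sketch, assuming fabricated data: fit a second-order response
# surface in two coded factors and locate its stationary point.
import numpy as np

# Coded factor levels (central-composite-like design) and one response.
x1 = np.array([-1, -1, 1, 1, 0, 0, -1.41, 1.41, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, -1.41, 1.41])
y = np.array([2.1, 2.4, 2.6, 2.5, 3.0, 2.9, 2.0, 2.7, 2.2, 2.6])

# Model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["b0", "b1", "b2", "b11", "b22", "b12"], np.round(coef, 3))))

# Stationary point (candidate optimum in coded units): solve grad = 0.
b = coef
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
g = -np.array([b[1], b[2]])
print("Stationary point (coded):", np.round(np.linalg.solve(H, g), 2))
```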

  15. METHODOLOGICAL APPROACHES TO THE ANALYSIS OF EFFICIENCY OF CASH FLOW MANAGEMENT IN INVESTMENT ACTIVITY OF THE ENTERPRISES

    Directory of Open Access Journals (Sweden)

    I. Magdych

    2015-06-01

    Full Text Available The article explores methodological approaches to the analysis of cash flows in the investment activity of an enterprise. It describes the system of net cash flow movements that reflects the impact of cash management efficiency on the amount and sources of the enterprise's investment cash flows. An analytical model for determining the effectiveness of the enterprise's cash management is proposed, based on selected modelling principles, a comprehensive analysis of cash flows in investing activities, and their optimization for the purpose of maximizing social and economic benefit. The research performed here allowed the stages of the analysis of the enterprise's investing cash flows to be generalized and defined, with appropriate reasoning. Ongoing research in this direction on the valuation of the effectiveness of cash flow management in the investing activity of enterprises is needed.

  16. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    Science.gov (United States)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
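
    A weighted-sum MCDM comparison of the two architectures might look like the sketch below; the criteria weights and scores are assumptions for illustration, not the case study's data.

```python
# Minimal sketch, assuming illustrative weights and normalized scores:
# a weighted-sum trade study over mission performance, resilience,
# risk and cost (risk and cost inverted so that higher is better).
weights = {
    "mission performance": 0.35,
    "resilience": 0.30,
    "risk (inverted)": 0.20,
    "cost (inverted)": 0.15,
}

alternatives = {
    "distributed": {"mission performance": 0.80, "resilience": 0.90,
                    "risk (inverted)": 0.70, "cost (inverted)": 0.55},
    "monolithic":  {"mission performance": 0.85, "resilience": 0.60,
                    "risk (inverted)": 0.50, "cost (inverted)": 0.75},
}

for name, scores in alternatives.items():
    total = sum(scores[c] * w for c, w in weights.items())
    print(f"{name}: weighted score = {total:.3f}")
```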

  17. An analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kwon, Young Min; Kim, Taek Mo; Chung, Hae Yong; Lee, Sang Jong

    1996-07-01

    An analysis methodology for hot leg break mass and energy release is developed. For the blowdown period, a modified CEFLASH-4A analysis is suggested. For the post-blowdown period, a new computer model named COMET is developed. Unlike the previous post-blowdown analysis model FLOOD3, COMET is capable of analyzing both cold leg and hot leg break cases. The cold leg break model is essentially the same as that of FLOOD3, with some improvements. The results of the newly proposed hot leg break model in COMET follow the same trend as that observed in a scaled-down integral experiment. The analysis results for UCN 3 and 4 obtained with COMET are qualitatively and quantitatively in good agreement with those predicted by best-estimate analysis using RELAP5/MOD3. Therefore, the COMET code is validated and can be used for licensing analysis. 6 tabs., 82 figs., 9 refs. (Author)

  18. Evaluation of methodologies for assessing the overall diet: dietary quality scores and dietary pattern analysis.

    Science.gov (United States)

    Ocké, Marga C

    2013-05-01

    This paper aims to describe different approaches for studying the overall diet, with their advantages and limitations. Studies of the overall diet have emerged because the relationship between dietary intake and health is very complex, with all kinds of interactions that cannot be captured well by studying single dietary components. Three main approaches to studying the overall diet can be distinguished. The first is researcher-defined scores or indices of diet quality, usually based on guidelines for a healthy diet or on diets known to be healthy. The second approach, using principal component or cluster analysis, is driven by the underlying dietary data. In principal component analysis, scales are derived based on the underlying relationships between food groups, whereas in cluster analysis, subgroups of the population are created of people who cluster together based on their dietary intake. A third approach includes methods that are driven by a combination of biological pathways and the underlying dietary data. Reduced rank regression defines linear combinations of food intakes that maximally explain nutrient intakes or intermediate markers of disease. Decision tree analysis identifies subgroups of a population whose members share dietary characteristics that influence (intermediate markers of) disease. It is concluded that all approaches have advantages and limitations and essentially answer different questions. The third approach is still in an exploration phase but seems to have great potential and complementary value. More insight into the utility of conducting studies on the overall diet can be gained if more attention is given to methodological issues.
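
    The second approach can be sketched compactly; the intake matrix below is randomly generated, so the "patterns" it yields are placeholders for what real dietary data would produce.

```python
# Minimal sketch, assuming fabricated intake data: data-driven dietary
# patterns from food-group intakes via principal component analysis.
import numpy as np

rng = np.random.default_rng(1)
food_groups = ["vegetables", "fruit", "red meat", "whole grains", "soft drinks"]
X = rng.random((200, len(food_groups)))  # rows = participants

# Standardize, then eigendecompose the correlation matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order[:2]]  # first two dietary patterns
scores = Z @ loadings             # per-participant pattern scores

print("Pattern 1 loadings:", dict(zip(food_groups, np.round(loadings[:, 0], 2))))
print("First participant's pattern scores:", np.round(scores[0], 2))
```

    Cluster analysis would instead assign each participant to one subgroup, and reduced rank regression would derive combinations of the food groups that maximally explain a set of response variables such as nutrient intakes.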

  19. Development of a methodology of analysis of instabilities in BWR reactors

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Fenoll, M.; Abarca, A.; Barrachina, T.; Miro, R.; Verdu, G.

    2012-07-01

    This paper presents a methodology for the analysis of instabilities in BWR-type reactors. The methodology covers modal analysis of the operating point, signal analysis techniques, and the simulation of transients with the coupled 3D RELAP5/PARCSv2.7 code.
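
    One standard signal-analysis ingredient in such stability studies is the decay ratio; the sketch below estimates it from the autocorrelation of a synthetic flux signal, and reflects generic practice rather than the authors' tool.

```python
# Minimal sketch, assuming a synthetic signal: estimate the decay ratio,
# a common BWR stability indicator, from the autocorrelation function.
import numpy as np

def decay_ratio(signal):
    """Ratio of the second to the first autocorrelation peak after lag 0."""
    x = np.asarray(signal) - np.mean(signal)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf /= acf[0]
    peaks = [i for i in range(1, len(acf) - 1)
             if acf[i - 1] < acf[i] > acf[i + 1]]
    return acf[peaks[1]] / acf[peaks[0]] if len(peaks) >= 2 else None

# Synthetic damped oscillation standing in for a measured flux signal.
t = np.arange(0, 50, 0.1)
flux = np.exp(-0.05 * t) * np.sin(2 * np.pi * 0.5 * t)
print(f"Decay ratio ~ {decay_ratio(flux):.2f}")
```

    A decay ratio well below 1 indicates damped oscillations; values approaching 1 signal the onset of instability.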

  20. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    International Nuclear Information System (INIS)

    Barabas, Roberta de C.; Sabundjian, Gaiane

    2015-01-01

    When compared to other energy sources such as fossil fuels, coal, oil, and gas, nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, educational research on nuclear energy has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme and consequently improve public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify the implicit attitudes of science teachers concerning nuclear energy, and the results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this is an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)
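
    A simplified version of IAT scoring (the conventional D-score) can be sketched as follows; the latencies are fabricated, and the full scoring algorithm includes additional filtering steps not shown here.

```python
# Minimal sketch, assuming fabricated latencies: a simplified IAT D-score
# from response times in compatible vs. incompatible pairing blocks.
import statistics

# Hypothetical reaction times in milliseconds for one participant.
compatible = [620, 580, 640, 600, 610, 590]    # e.g. "nuclear" + "good"
incompatible = [720, 760, 700, 740, 710, 730]  # e.g. "nuclear" + "bad"

pooled_sd = statistics.stdev(compatible + incompatible)
d_score = (statistics.mean(incompatible) - statistics.mean(compatible)) / pooled_sd
# A positive D suggests an implicit association consistent with the
# compatible pairing; magnitudes are conventionally read against
# thresholds of roughly 0.15, 0.35 and 0.65.
print(f"D-score = {d_score:.2f}")
```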