WorldWideScience

Sample records for methodological developments large

  1. Large sample NAA facility and methodology development

    International Nuclear Information System (INIS)

    Roth, C.; Gugiu, D.; Barbos, D.; Datcu, A.; Aioanei, L.; Dobrea, D.; Taroiu, I. E.; Bucsa, A.; Ghinescu, A.

    2013-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) facility has been developed at the TRIGA Annular Core Pulsed Reactor (ACPR) operated by the Institute for Nuclear Research in Pitesti, Romania. The central irradiation cavity of the ACPR core can accommodate a large irradiation device. The ACPR neutron flux characteristics are well known, and spectrum adjustment techniques have been successfully applied to enhance the thermal component of the neutron flux in the central irradiation cavity. An analysis methodology was developed using the MCNP code to estimate counting efficiency and correction factors for the major perturbing phenomena. Test experiments, comparisons with classical instrumental neutron activation analysis (INAA) methods, and an international inter-comparison exercise have been performed to validate the new methodology. (authors)
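
    As a rough illustration only (not the Pitesti facility's actual analysis chain), the sketch below shows how MCNP-derived counting efficiency and perturbation correction factors of the kind mentioned above might enter a simplified activation-analysis mass estimate; every function name and numerical value here is hypothetical.

```python
# Hypothetical sketch: applying MCNP-derived corrections in large-sample NAA.
# All names and numbers are illustrative, not values from the Pitesti facility.
import math

def element_mass(net_peak_area, count_time, decay_time, half_life,
                 efficiency, gamma_yield, sat_activity_per_gram,
                 f_neutron_self_shielding, f_gamma_attenuation):
    """Estimate element mass (g) from a measured gamma peak (saturation
    irradiation assumed, so the buildup factor is omitted for brevity)."""
    lam = math.log(2.0) / half_life
    decay_corr = math.exp(-lam * decay_time)                 # decay before counting
    count_corr = (1.0 - math.exp(-lam * count_time)) / lam   # decay during counting
    counts_per_gram = (sat_activity_per_gram * gamma_yield * efficiency *
                       f_neutron_self_shielding *            # flux depression in the sample
                       f_gamma_attenuation *                 # gamma self-absorption
                       decay_corr * count_corr)
    return net_peak_area / counts_per_gram

m = element_mass(net_peak_area=1.2e5, count_time=3600, decay_time=86400,
                 half_life=15.0 * 3600, efficiency=0.004, gamma_yield=0.99,
                 sat_activity_per_gram=5.0e6,
                 f_neutron_self_shielding=0.85, f_gamma_attenuation=0.78)
print(f"estimated mass: {m:.3g} g")
```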

  2. Development of a Methodology for Predicting Forest Area for Large-Area Resource Monitoring

    Science.gov (United States)

    William H. Cooke

    2001-01-01

    The U.S. Department of Agriculture, Forest Service, Southern Research Station, appointed a remote-sensing team to develop an image-processing methodology for mapping forest lands over large geographic areas. The team has presented a repeatable methodology, which is based on regression modeling of Advanced Very High Resolution Radiometer (AVHRR) and Landsat Thematic...
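
    A minimal sketch of the kind of regression modeling described above, assuming percent-forest reference values (e.g. from Landsat classifications) are regressed on coarse AVHRR predictors; the predictors, data and coefficients below are invented and are not the Forest Service model.

```python
# Illustrative only: regressing percent-forest reference values against
# hypothetical AVHRR NDVI composites, then applying the model over new pixels.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: one row per sample location
ndvi_summer = rng.uniform(0.2, 0.8, 200)
ndvi_winter = rng.uniform(0.1, 0.6, 200)
pct_forest = np.clip(120 * ndvi_summer - 40 * ndvi_winter - 20
                     + rng.normal(0, 5, 200), 0, 100)   # Landsat-derived reference

# Ordinary least squares: pct_forest ~ b0 + b1*ndvi_summer + b2*ndvi_winter
X = np.column_stack([np.ones_like(ndvi_summer), ndvi_summer, ndvi_winter])
coef, *_ = np.linalg.lstsq(X, pct_forest, rcond=None)

# Predict percent forest for a few new AVHRR pixels
new_pixels = np.column_stack([np.ones(3), [0.70, 0.40, 0.55], [0.30, 0.20, 0.35]])
print("predicted percent forest:", np.round(new_pixels @ coef, 1))
```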

  3. Development of an Evaluation Methodology for Loss of Large Area induced from extreme events

    International Nuclear Information System (INIS)

    Kim, Sok Chul; Park, Jong Seuk; Kim, Byung Soon; Jang, Dong Ju; Lee, Seung Woo

    2015-01-01

    The USNRC has issued several regulatory requirements and guidance documents regarding the event of loss of large area (LOLA), including 10 CFR 50.54(hh), Regulatory Guide 1.214 and SRP 19.4. In Korea, LOLA has been considered only to a limited extent, and on a voluntary basis, for newly constructed NPPs. In general, information on the methodology and key assumptions for the assessment of LOLA is hardly available because of the 'need to know' based approach. There is an urgent need for countries to develop their own regulatory requirements, guidance and evaluation methodology, taking into account their specific geographical, nuclear safety and security environments. Korea Hydro and Nuclear Power Company (KHNP) has developed an Extended Damage Mitigation Guideline (EDMG) for the APR1400 under contract with a foreign consulting company. The submittal guidance NEI 06-12, related to B.5.b Phases 2 and 3, focuses on unit-level mitigation strategy rather than site-level mitigation or response strategy. The Phase 1 mitigating strategy and guideline for LOLA (Loss of Large Area) emphasizes site-level arrangements, including a cooperative network with outside organizations and an agile command and control system. The Korea Institute of Nuclear Safety has carried out a pilot in-house research project since 2014 to develop a methodology and guideline for the evaluation of LOLA. This paper summarizes the major results and outcomes of that research project. After the Fukushima Dai-ichi accident, awareness of the need to counter loss of large area induced by extreme man-made hazards or extreme beyond-design-basis external events has grown. An urgent need exists to develop regulatory guidance for coping with this undesirable situation, which has been left out of the existing nuclear safety regulatory framework because of the expected rarity of such events.

  4. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important area of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  5. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    Science.gov (United States)

    Freeman, Michael S.

    1987-01-01

    The primary research objectives of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) are to develop a methodology for constructing and maintaining large-scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real-world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  6. Reference Management Methodologies for Large Structural Models at Kennedy Space Center

    Science.gov (United States)

    Jones, Corey; Bingham, Ryan; Schmidt, Rick

    2011-01-01

    There have been many challenges associated with modeling some of NASA KSC's largest structures. Given the size of the welded structures here at KSC, it was critically important to properly organize model structure and carefully manage references. Additionally, because of the amount of hardware to be installed on these structures, it was very important to have a means to coordinate between different design teams and organizations, check for interferences, produce consistent drawings, and allow for simple release processes. Facing these challenges, the modeling team developed a unique reference management methodology and model fidelity methodology. This presentation will describe the techniques and methodologies that were developed for these projects. The attendees will learn about KSC's reference management and model fidelity methodologies for large structures. The attendees will understand the goals of these methodologies. The attendees will appreciate the advantages of developing a reference management methodology.

  7. Practical implications of rapid development methodologies

    CSIR Research Space (South Africa)

    Gerber, A

    2007-11-01

    …as the acceleration of the system development phases through an iterative construction approach. These methodologies also claim to manage the changing nature of requirements. However, during the development of large and complex systems by a small and technically...

  8. Development of an Evaluation Methodology for Loss of Large Area Induced from Extreme Events with Malicious Origin

    International Nuclear Information System (INIS)

    Kim, S.C.; Park, J.S.; Chang, D.J.; Kim, D.H.; Lee, S.W.; Lee, Y.J.; Kim, H.W.

    2016-01-01

    The event of loss of large area (LOLA) induced by an extreme external event at a multi-unit nuclear installation has emerged as a new challenge in the realm of nuclear safety and regulation after the Fukushima Dai-ichi accident. Relevant information and experience on evaluation methodology and regulatory requirements are rarely available and are difficult to share because of security sensitivity. Most countries have prepared their own regulatory requirements and methodologies to evaluate the impact of LOLA at nuclear power plants. In Korea, the newly amended Nuclear Safety Act requires LOLA to be assessed in terms of an EDMG (Extended Damage Mitigation Guideline). The Korea Institute of Nuclear Safety (KINS) has performed a pilot research project since 2014 to develop a methodology and regulatory review guidance on LOLA at multi-unit nuclear power plants. Through this research, we proposed a methodology to identify strategies for preventing and mitigating the consequences of LOLA utilizing PSA techniques or their results. The proposed methodology comprises 8 steps, including policy consideration, threat evaluation, identification of damage path sets, SSC capacity evaluation, and identification of mitigation measures and strategies. The consequences of LOLA due to a malevolent aircraft crash may be significantly sensitive to analysis assumptions, including the type of aircraft, the amount of residual fuel, and the possible impact angle, which cannot be shared overtly. This paper introduces an evaluation methodology for LOLA using PSA techniques and their results. We also provide a case study to evaluate possible access angles using a flight simulator for two types of aircraft and to identify potential path sets leading to core damage through the SSCs affected within the damaged area. (author)
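
    One step named above, identification of damage path sets, can be illustrated with a toy screening of PSA minimal cut sets against the SSCs assumed lost inside the damaged area; the cut sets and component names below are invented and do not come from the KINS project.

```python
# Hypothetical sketch of screening PSA minimal cut sets against a LOLA damage
# footprint. Component names and cut sets are invented for illustration.
minimal_cut_sets = [
    {"EDG-A", "EDG-B", "GT-GEN"},           # loss of all on-site AC power
    {"AFW-MDP-A", "AFW-MDP-B", "AFW-TDP"},  # loss of auxiliary feedwater
    {"CCW-PUMP-A", "CCW-PUMP-B"},           # loss of component cooling water
]

def damage_induced_cut_sets(cut_sets, ssc_lost_in_area):
    """Cut sets whose components all lie inside the damaged area, i.e.
    combinations the postulated event alone could complete."""
    return [cs for cs in cut_sets if cs <= ssc_lost_in_area]

# Example damage footprint for one postulated impact scenario
lost = {"EDG-A", "EDG-B", "GT-GEN", "CCW-PUMP-A"}
for cs in damage_induced_cut_sets(minimal_cut_sets, lost):
    print("path to core damage completed by the damage footprint:", sorted(cs))
```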

  9. Setting up fuel supply strategies for large-scale bio-energy projects using agricultural and forest residues. A methodology for developing countries

    International Nuclear Information System (INIS)

    Junginger, M.

    2000-08-01

    The objective of this paper is to develop a coherent methodology for setting up fuel supply strategies for large-scale biomass-conversion units. The method explicitly takes into account risks and uncertainties regarding availability and costs in relation to time. The paper aims at providing general guidelines, which are not country-specific. These guidelines cannot provide 'perfect fit' solutions, but aim to give general help to overcome barriers and to set up supply strategies. It mainly focuses on residues from the agricultural and forestry sectors. This study focuses on electricity production, or combined electricity and heat production (CHP), with plant scales between 10 and 40 MWe. This range is chosen due to the rules of economies of scale. In large-scale plants the benefits of increased efficiency outweigh increased transportation costs, allowing a lower price per kWh, which in turn may allow higher biomass costs. However, fuel-supply risks tend to get higher with increasing plant size, which makes it more important to assess them for large(r) conversion plants. Although the methodology does not focus on a specific conversion technology, it should be stressed that the technology must be able to handle a wide variety of biomass fuels with different characteristics, because many biomass residues are not available year-round and various fuels are needed for a constant supply. The methodology allows for comparing different technologies (with known investment and operation and maintenance costs from literature) and evaluation for different fuel supply scenarios. In order to demonstrate the methodology, a case study was carried out for the north-eastern part of Thailand (Isaan), an agricultural region. The research was conducted in collaboration with the Regional Wood Energy Development Programme in Asia (RWEDP), a project of the UN Food and Agriculture Organization (FAO) in Bangkok, Thailand. In Section 2 of this paper the methodology is presented. In Section 3 the economic
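
    The scale trade-off described above (specific capital cost falling with plant size while the biomass collection radius and transport cost grow) can be sketched with a toy cost model; every parameter name and value below is an assumption for illustration, not a figure from the Thai case study.

```python
# Toy scale trade-off: economies of scale vs. growing transport distance.
# All numbers are invented for illustration.
import math

def cost_of_electricity(capacity_mwe, ref_capacity=10.0, ref_capex_per_kw=2500.0,
                        scale_exponent=0.7, capacity_factor=0.8, crf=0.12,
                        fuel_cost_at_gate=3.0, transport_cost_per_mwh_km=0.08,
                        biomass_yield_mwh_per_km2=400.0, efficiency=0.30):
    """Rough $/MWh estimate for a biomass plant of a given size."""
    annual_mwh = capacity_mwe * 8760 * capacity_factor
    # capital cost with economies of scale (exponent < 1)
    capex = ref_capex_per_kw * 1000 * ref_capacity * (capacity_mwe / ref_capacity) ** scale_exponent
    capital_per_mwh = crf * capex / annual_mwh
    # collection radius grows with the square root of annual fuel demand
    fuel_mwh = annual_mwh / efficiency
    radius_km = math.sqrt(fuel_mwh / (math.pi * biomass_yield_mwh_per_km2))
    fuel_per_mwh = (fuel_cost_at_gate + transport_cost_per_mwh_km * radius_km) / efficiency
    return capital_per_mwh + fuel_per_mwh

for size in (10, 20, 40):
    print(f"{size:>2} MWe -> {cost_of_electricity(size):5.1f} $/MWh")
```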

  10. A study on methodology of software development for HEP

    International Nuclear Information System (INIS)

    Ding Yuzheng; Dai Guiliang

    1999-01-01

    HEP-related software systems are large ones, comprising mainly detector simulation software, DAQ software and offline systems. The authors discuss the advantages of applying object-oriented (OO) methodologies to such software systems, and the basic strategy for the use of OO methodologies, languages and tools in the development of HEP-related software is given

  11. Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies

    International Nuclear Information System (INIS)

    Budnitz, R.J.; Lambert, H.E.; Hill, E.E.

    1987-08-01

    The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and its use is demonstrated through analysis of the Zion-1 and LaSalle-2 reactors as case studies. This demonstration analysis is intended to show that the methodology can be applied in actual cases; the numerical values of core-damage frequency are not intended to be realistic. The analysis relies on SSMRP-based methodologies and databases. For both Zion-1 and LaSalle-2, assuming that loss of offsite power (LOSP) occurs after a large earthquake and that there are no operator recovery actions, the analysis finds very many combinations (Boolean minimal cut sets) involving chatter of three or four relays and/or pressure switch contacts. The analysis finds that the number of min-cut-set combinations is so large that there is a very high likelihood (of the order of unity) that at least one combination will occur after earthquake-caused LOSP. This conclusion depends in detail on the fragility curves and response assumptions used for chatter. Core-damage frequencies are calculated, but they are probably pessimistic because assuming zero credit for operator recovery is pessimistic. The project has also developed an improved PRA methodology for quantifying operator error under high-stress conditions such as after a large earthquake. Single-operator and multiple-operator error rates are developed, and a case study involving an 8-step procedure (establishing feed-and-bleed in a PWR after an earthquake-initiated accident) is used to demonstrate the methodology
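
    The report's finding that a very large number of three-relay cut sets drives the likelihood of at least one occurrence toward unity can be reproduced with a toy calculation; the chatter probabilities and cut sets below are invented, and independence between cut sets is assumed purely for simplicity (the actual analysis treats dependencies more carefully).

```python
# Toy illustration (invented numbers; cut sets treated as independent for
# simplicity) of why many small chatter cut sets give a likelihood near unity.
from itertools import combinations

# Hypothetical per-component chatter probabilities given the earthquake + LOSP
chatter_prob = {f"RELAY-{i:03d}": 0.30 for i in range(20)}

# Pretend every 3-relay combination is a Boolean minimal cut set (1140 of them)
cut_sets = list(combinations(chatter_prob, 3))

def prob_cut_set(cs):
    p = 1.0
    for relay in cs:
        p *= chatter_prob[relay]
    return p                       # 0.3**3 = 0.027 for each cut set here

p_none = 1.0
for cs in cut_sets:
    p_none *= 1.0 - prob_cut_set(cs)
print("P(at least one cut set occurs) ~", round(1.0 - p_none, 4))   # ~1.0
```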

  12. Development methodology for industrial diesel engines; Entwicklungsmethode fuer Industrie-Dieselmotoren

    Energy Technology Data Exchange (ETDEWEB)

    Bergmann, Dirk; Kech, Johannes [MTU Friedrichshafen GmbH (Germany)

    2011-11-15

    In order to remain cost-effective at relatively low production volumes in spite of the high requirements regarding emissions and durability, MTU uses a clearly structured development methodology, with close interlinking of technology and product development, for the development of its large engines. For the new engine of the 4000 Series with cooled EGR, MTU applied this methodology to take the emissions concept from the initial idea right through to series production. (orig.)

  13. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  14. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user, project owner and project manager’s point of view. The main components of a software development methodology are identified. Thus a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation oriented software development methodology is emphasized by highlighting shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  15. A review of methodologies used in research on cadastral development

    DEFF Research Database (Denmark)

    Silva, Maria Augusta; Stubkjær, Erik

    2002-01-01

    World-wide, much attention has been given to cadastral development. As a consequence of experiences made during the last decades, several authors have stated the need of research in the domain of cadastre and proposed methodologies to be used. The purpose of this paper is to contribute to the acceptance of research methodologies needed for cadastral development, and thereby enhance theory in the cadastral domain. The paper reviews nine publications on cadastre and identifies the methodologies used. The review focuses on the institutional, social political and economic aspects of cadastral development, rather than on the technical aspects. The main conclusion of this paper is that the methodologies used are largely those of the social sciences. That agrees with the notion that cadastre relates as much to people and institutions, as it relates to land, and that cadastral systems are shaped...

  16. Comparative study on software development methodologies

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2014-12-01

    This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation orientated IT projects. The paper starts by depicting specific characteristics in software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. Also the software development project manager handles challenges and risks that are predominantly encountered in business and research areas that involve state of the art technology. Conventional software development stages are defined and briefly described. Development stages are the building blocks of any software development methodology so it is important to properly research this aspect. Current software development methodologies are presented. Development stages are defined for every showcased methodology. For each methodology a graphic representation is illustrated in order to better individualize its structure. Software development methodologies are compared by highlighting strengths and weaknesses from the stakeholder's point of view. Conclusions are formulated and a research direction aimed at formalizing a software development methodology dedicated to innovation orientated IT projects is enunciated.

  17. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  18. A development methodology for scientific software

    International Nuclear Information System (INIS)

    Cort, G.; Barrus, D.M.; Goldstone, J.A.; Miller, L.; Nelson, R.O.; Poore, R.V.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed

  19. Formalizing the ISDF Software Development Methodology

    OpenAIRE

    Mihai Liviu DESPA

    2015-01-01

    The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of wha...

  20. Scrum of scrums solution for large size teams using scrum methodology

    OpenAIRE

    Qurashi, Saja Al; Qureshi, M. Rizwan Jameel

    2014-01-01

    Scrum is a structured framework to support complex product development. However, Scrum methodology faces a challenge of managing large teams. To address this challenge, in this paper we propose a solution called Scrum of Scrums. In Scrum of Scrums, we divide the Scrum team into teams of the right size, and then organize them hierarchically into a Scrum of Scrums. The main goals of the proposed solution are to optimize communication between teams in Scrum of Scrums; to make the system work aft...

  1. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  2. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    Science.gov (United States)

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
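
    For orientation only, the sketch below implements a plain global-best PSO loop, the kind of swarm update the first stage builds on; it is not the authors' consensus-based variant, and the Trust-Tech and local-refinement stages are omitted.

```python
# Generic global-best PSO sketch (not the consensus-based PSO of the paper).
import numpy as np

def pso(f, dim=10, n_particles=30, iters=200, w=0.72, c1=1.5, c2=1.5, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)   # velocity update
        x = x + v
        val = np.apply_along_axis(f, 1, x)
        improved = val < pbest_val                                   # personal bests
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        gbest = pbest[pbest_val.argmin()].copy()                     # global best
    return gbest, pbest_val.min()

sphere = lambda z: float(np.sum(z ** 2))
best_x, best_val = pso(sphere)
print("best value found:", best_val)
```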

  3. Application of agile methodologies in software development

    Directory of Open Access Journals (Sweden)

    Jovanović Aca D.

    2016-01-01

    The paper presents the potential for the development of software using agile methodologies. Special consideration is devoted to the potential and advantages of using the Scrum methodology in the development of software, and to the relationship between the implementation of agile methodologies and software development projects.

  4. Comparative study on software development methodologies

    OpenAIRE

    Mihai Liviu DESPA

    2014-01-01

    This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation orientated IT projects. The paper starts by depicting specific characteristics in software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. Also the software development project manager han...

  5. Extending statistical boosting. An overview of recent methodological developments.

    Science.gov (United States)

    Mayr, A; Binder, H; Gefeller, O; Schmid, M

    2014-01-01

    Boosting algorithms to simultaneously estimate and select predictor effects in statistical models have gained substantial interest during the last decade. This review highlights recent methodological developments regarding boosting algorithms for statistical modelling especially focusing on topics relevant for biomedical research. We suggest a unified framework for gradient boosting and likelihood-based boosting (statistical boosting) which have been addressed separately in the literature up to now. The methodological developments on statistical boosting during the last ten years can be grouped into three different lines of research: i) efforts to ensure variable selection leading to sparser models, ii) developments regarding different types of predictor effects and how to choose them, iii) approaches to extend the statistical boosting framework to new regression settings. Statistical boosting algorithms have been adapted to carry out unbiased variable selection and automated model choice during the fitting process and can nowadays be applied in almost any regression setting in combination with a large amount of different types of predictor effects.
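
    The variable-selection behaviour mentioned in point i) can be illustrated with a minimal componentwise L2 gradient boosting sketch: each iteration fits every candidate predictor to the current residuals but takes a small step only along the best-fitting one, so coefficients of predictors that are never selected remain exactly zero. This is a generic illustration, not code from the packages discussed in the review.

```python
# Minimal componentwise L2 gradient boosting (illustrative only).
import numpy as np

def componentwise_l2_boost(X, y, n_iter=150, nu=0.1):
    n, p = X.shape
    coef, intercept = np.zeros(p), y.mean()
    resid = y - intercept
    for _ in range(n_iter):
        # univariate least-squares fit of each predictor to the residuals
        betas = (X * resid[:, None]).sum(axis=0) / (X ** 2).sum(axis=0)
        sse = ((resid[:, None] - X * betas) ** 2).sum(axis=0)
        j = sse.argmin()             # best-fitting component only ...
        coef[j] += nu * betas[j]     # ... gets a small update (step length nu)
        resid = y - intercept - X @ coef
    return intercept, coef

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(scale=0.5, size=200)
b0, b = componentwise_l2_boost(X, y)
print("non-zero coefficients:", {j: round(v, 2) for j, v in enumerate(b) if abs(v) > 1e-8})
```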

  6. Methodology for developing new test methods

    Directory of Open Access Journals (Sweden)

    A. I. Korobko

    2017-06-01

    The paper describes a methodology for developing new test methods and for forming solutions for their development. The basis of the methodology is individual elements of the system and process approaches, which contribute to the development of an effective research strategy for the object, the study of interrelations, and the synthesis of an adequate model of the test method. The effectiveness of the developed test method is determined by the correct choice of the set of concepts and of their interrelations and mutual influence; this makes it possible to solve the assigned tasks and achieve the goal. The methodology is based on the use of fuzzy cognitive maps. The choice of the method on which the model for forming solutions is based is also considered. The methodology provides for recording a model of a new test method as a finite set of objects that represent characteristics significant for the test method. A causal relationship is then established between the objects, and the values of the fitness indicators, the observability of the method and the metrological tolerance for each indicator are established. The work is aimed at the overall goal of ensuring the quality of tests by improving the methodology for developing test methods.
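
    Since the methodology above is described as being based on fuzzy cognitive maps, the sketch below shows a generic FCM iteration (concept activations propagated through a causal weight matrix with a sigmoid squashing function); the concepts and weights are invented for illustration and are not taken from the paper.

```python
# Generic fuzzy cognitive map iteration (concepts and weights are invented).
import numpy as np

concepts = ["test adequacy", "measurement error", "fitness indicator",
            "observability", "metrological tolerance"]

# W[i, j]: causal influence of concept i on concept j, in [-1, 1]
W = np.array([
    [0.0, -0.4,  0.6,  0.3,  0.0],
    [0.0,  0.0, -0.5,  0.0,  0.4],
    [0.0,  0.0,  0.0,  0.5,  0.0],
    [0.2,  0.0,  0.3,  0.0,  0.0],
    [0.0, -0.3,  0.0,  0.0,  0.0],
])

sigmoid = lambda x: 1.0 / (1.0 + np.exp(-5.0 * x))

state = np.array([0.8, 0.2, 0.5, 0.5, 0.5])   # initial concept activations
for _ in range(30):                            # iterate toward a fixed point
    state = sigmoid(state @ W + state)         # FCM update with self-memory term

for name, value in zip(concepts, state):
    print(f"{name:22s} {value:.2f}")
```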

  7. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund

    2013-01-01

    In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate measurement technique, and then accurate calculation of the radiation pattern and gain of the entire antenna including support and satellite structure with an appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation...

  8. What the Current System Development Trends tell us about Systems Development Methodologies: Toward explaining SSADM, Agile and IDEF0 Methodologies

    Directory of Open Access Journals (Sweden)

    Abdulla F. Ally

    2015-03-01

    Systems integration, customization and component-based development approaches are attracting increasing attention. This trend has drawn research attention to systems development methodologies as well. The availability of systems development tools, rapid change in technologies, the evolution of mobile computing and the growth of cloud computing have necessitated a move toward systems integration and customization rather than developing systems from scratch. This tendency encourages component-based development and discourages the traditional systems development approach. The paper presents and evaluates the SSADM, IDEF0 and Agile systems development methodologies. More specifically, it examines how well they fit into the current competitive market of systems development. From this perspective, it is anticipated that, despite its popularity, the SSADM methodology is becoming obsolete, while the Agile and IDEF0 methodologies are still gaining acceptance in the current competitive market of systems development. The present study is likely to enrich our understanding of systems development methodology concepts and to draw attention to where current trends in systems development are heading.

  9. Photovoltaic module energy rating methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L. [National Renewable Energy Lab., Golden, CO (United States); Whitaker, C.; Newmiller, J. [Endecon Engineering, San Ramon, CA (United States)

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.
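
    A hedged sketch of the kind of calculation an energy rating involves: summing hourly module output over a reference weather profile with a simple irradiance- and temperature-dependent power model. This is not the consensus methodology described above; the model form, coefficients and weather values are all assumptions.

```python
# Illustrative module energy estimate over a short weather profile.
# The power model and all numbers are invented, not the consensus methodology.
def module_power_w(irradiance_w_m2, ambient_temp_c,
                   p_stc_w=100.0, gamma_per_c=-0.004, noct_c=45.0):
    """DC power from a crude STC-rating + temperature-coefficient model."""
    if irradiance_w_m2 <= 0:
        return 0.0
    cell_temp = ambient_temp_c + (noct_c - 20.0) * irradiance_w_m2 / 800.0
    return p_stc_w * (irradiance_w_m2 / 1000.0) * (1.0 + gamma_per_c * (cell_temp - 25.0))

# Hypothetical hourly profile: (plane-of-array irradiance W/m2, ambient temp C)
weather_profile = [(0, 12), (150, 14), (420, 18), (760, 24),
                   (950, 28), (820, 29), (510, 27), (120, 22)]

energy_wh = sum(module_power_w(g, t) for g, t in weather_profile)  # 1-hour steps
print(f"energy over profile: {energy_wh:.0f} Wh")
```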

  10. New computational methodology for large 3D neutron transport problems

    International Nuclear Information System (INIS)

    Dahmani, M.; Roy, R.; Koclas, J.

    2004-01-01

    We present a new computational methodology, based on the 3D characteristics method, dedicated to solving very large 3D problems without spatial homogenization. In order to eliminate the input/output problems occurring when solving these large problems, we set up a new computing scheme that requires more CPU resources than the usual scheme, which is based on sweeps over large tracking files. The huge storage capacity needed for some problems and the related I/O queries needed by the characteristics solver are replaced by on-the-fly recalculation of tracks at each iteration step. Using this technique, large 3D problems are no longer I/O-bound, and distributed CPU resources can be used efficiently. (authors)

  11. Development of assessment methodology for plant configuration control

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hyun; You, Young Woo; Kim, Yoon Ik; Yang, Hui Chang; Huh, Byeong Gill; Lee, Dong Won; Ahn, Gwan Won [Seoul National Univ., Seoul (Korea, Republic of)

    2001-03-15

    The purpose of this study is the development of an effective and comprehensive assessment methodology which reflects the characteristics of plants for the surveillance, maintenance, repair and operation of nuclear power plants. In this study, recent research is surveyed and concept definitions, procedures, current PSA methodologies and the implementation of various models are evaluated. Through this survey, a systematic assessment methodology is suggested. The configuration control assessment methodology suggested in this study for the purpose of developing a configuration control methodology reflecting the characteristics of Korean NPPs can be utilized as a supplement to current PSA methodologies.

  12. Inkjet printed large-area flexible circuits: a simple methodology for optimizing the printing quality

    Science.gov (United States)

    Cheng, Tao; Wu, Youwei; Shen, Xiaoqin; Lai, Wenyong; Huang, Wei

    2018-01-01

    In this work, a simple methodology was developed to enhance the patterning resolution of inkjet printing, involving process optimization as well as substrate modification and treatment. The line width of the inkjet-printed silver lines was successfully reduced to 1/3 of the original value using this methodology. Large-area flexible circuits with delicate patterns and good morphology were thus fabricated. The resultant flexible circuits showed excellent electrical conductivity, with a sheet resistance as low as 4.5 Ω/□, and strong tolerance to mechanical bending. The simple methodology is also applicable to substrates with various wettability, which suggests a general strategy to enhance the printing quality of inkjet printing for manufacturing high-performance large-area flexible electronics. Project supported by the National Key Basic Research Program of China (Nos. 2014CB648300, 2017YFB0404501), the National Natural Science Foundation of China (Nos. 21422402, 21674050), the Natural Science Foundation of Jiangsu Province (Nos. BK20140060, BK20130037, BK20140865, BM2012010), the Program for Jiangsu Specially-Appointed Professors (No. RK030STP15001), the Program for New Century Excellent Talents in University (No. NCET-13-0872), the NUPT "1311 Project" and Scientific Foundation (Nos. NY213119, NY213169), the Synergetic Innovation Center for Organic Electronics and Information Displays, the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD), the Leading Talent of Technological Innovation of National Ten-Thousands Talents Program of China, the Excellent Scientific and Technological Innovative Teams of Jiangsu Higher Education Institutions (No. TJ217038), the Program for Graduate Students Research and Innovation of Jiangsu Province (No. KYZZ16-0253), and the 333 Project of Jiangsu Province (Nos. BRA2017402, BRA2015374).

  13. A Methodology for Estimating Large-Customer Demand Response Market Potential

    Energy Technology Data Exchange (ETDEWEB)

    Goldman, Charles; Hopper, Nicole; Bharvirkar, Ranjit; Neenan, Bernie; Cappers, Peter

    2007-08-01

    Demand response (DR) is increasingly recognized as an essential ingredient to well-functioning electricity markets. DR market potential studies can answer questions about the amount of DR available in a given area and from which market segments. Several recent DR market potential studies have been conducted, most adapting techniques used to estimate energy-efficiency (EE) potential. In this scoping study, we: reviewed and categorized seven recent DR market potential studies; recommended a methodology for estimating DR market potential for large, non-residential utility customers that uses price elasticities to account for behavior and prices; compiled participation rates and elasticity values from six DR options offered to large customers in recent years, and demonstrated our recommended methodology with large customer market potential scenarios at an illustrative Northeastern utility. We observe that EE and DR have several important differences that argue for an elasticity approach for large-customer DR options that rely on customer-initiated response to prices, rather than the engineering approaches typical of EE potential studies. Base-case estimates suggest that offering DR options to large, non-residential customers results in 1-3% reductions in their class peak demand in response to prices or incentive payments of $500/MWh. Participation rates (i.e., enrollment in voluntary DR programs or acceptance of default hourly pricing) have the greatest influence on DR impacts of all factors studied, yet are the least well understood. Elasticity refinements to reflect the impact of enabling technologies and response at high prices provide more accurate market potential estimates, particularly when arc elasticities (rather than substitution elasticities) are estimated.
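
    A hedged sketch of the elasticity-based aggregation recommended above: class peak reduction is estimated as a participation rate times a constant-elasticity price response. The formula and parameter values are illustrative assumptions, not the report's own model or estimates.

```python
# Illustrative elasticity-based DR market-potential estimate (invented numbers).
def peak_reduction_pct(participation_rate, substitution_elasticity,
                       peak_price, off_peak_price, reference_price_ratio=1.0):
    """Percent reduction in class peak demand for customers facing a peak /
    off-peak price ratio, using a constant-elasticity-of-substitution response."""
    price_ratio = peak_price / off_peak_price
    # relative change in the peak/off-peak consumption ratio
    load_ratio_change = (price_ratio / reference_price_ratio) ** (-substitution_elasticity)
    return participation_rate * (1.0 - load_ratio_change) * 100.0

# e.g. 20% of large customers enrolled, elasticity 0.10, $500/MWh vs $50/MWh
print(round(peak_reduction_pct(0.20, 0.10, 500.0, 50.0), 1), "% of class peak")
```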

  14. A methodology for developing distributed programs

    NARCIS (Netherlands)

    Ramesh, S.; Mehndiratta, S.L.

    1987-01-01

    A methodology, different from the existing ones, for constructing distributed programs is presented. It is based on the well-known idea of developing distributed programs via synchronous and centralized programs. The distinguishing features of the methodology are: 1) specification include process

  15. The prosa methodology for scenario development

    International Nuclear Information System (INIS)

    Grupa, J.B.

    2001-01-01

    In this paper a methodology for scenario development is proposed. The method is developed in an effort to convince ourselves (and others) that all conceivable future developments of a waste repository have been covered. To be able to assess all conceivable future developments, the method needs to be comprehensive. To convince us and others the method should be structured in such a way that the treatment of each conceivable future development is traceable. The methodology is currently being applied to two Dutch disposal designs. Preliminary results show that the elaborated method functions better than the original method. However, some elements in the method will need further refinement. (author)

  16. Development of assessment methodology for plant configuration control

    Energy Technology Data Exchange (ETDEWEB)

    Cheong, Chang Hyeon; Yu, Yeong Woo; Cho, Jae Seon; Kim, Ju Yeol; Kim, Yun Ik; Yang, Hui Chang; Park, Gang Min; Hur, Byeong Gil [Seoul National Univ., Seoul (Korea, Republic of)

    1999-03-15

    The purpose of this study is the development of an effective and comprehensive assessment methodology which reflects the characteristics of plants for the surveillance, maintenance, repair and operation of nuclear power plants. The development of this methodology can contribute to enhanced safety. In the first year of this study, recent research is surveyed and concept definitions, procedures, current PSA methodologies and the implementation of various models are evaluated. Through this survey, a systematic assessment methodology is suggested.

  17. Discontinuous Galerkin methodology for Large-Eddy Simulations of wind turbine airfoils

    DEFF Research Database (Denmark)

    Frére, A.; Sørensen, Niels N.; Hillewaert, K.

    2016-01-01

    This paper aims at evaluating the potential of the Discontinuous Galerkin (DG) methodology for Large-Eddy Simulation (LES) of wind turbine airfoils. The DG method has shown high accuracy, excellent scalability and capacity to handle unstructured meshes. It is however not used in the wind energy sector yet. The present study aims at evaluating this methodology on an application which is relevant for that sector and focuses on blade section aerodynamics characterization. To be pertinent for large wind turbines, the simulations would need to be at low Mach numbers (M ≤ 0.3) where compressible… at low and high Reynolds numbers and compares the results to state-of-the-art models used in industry, namely the panel method (XFOIL with boundary layer modeling) and Reynolds Averaged Navier-Stokes (RANS). At low Reynolds number (Re = 6 × 10⁴), involving laminar boundary layer separation and transition...

  18. Development of a methodology of evaluation of financial stability of commercial banks

    Directory of Open Access Journals (Sweden)

    Brauers Willem Karel M.

    2014-01-01

    The evaluation of the financial stability of commercial banks, an issue that stems from the persistent recurrence of financial crises, has attracted the interest of researchers for over a century. The span of prevailing methodologies stretches from over-simplified risk-return approaches to ones comprising a large number of economic variables at the micro- and/or macro-economic level. The methodologies of rating agencies and the current methodologies reviewed and applied by the ECB are not intended to reduce information asymmetry in the market of commercial banks. In the paper it is shown that the Lithuanian financial system is bank-based, with household deposits being its primary source, and that its stability depends primarily on the behaviour of depositors. A methodology for evaluating commercial banks with features that decrease information asymmetry in the market of commercial banks is developed by comparing different MCDA methods.

  19. GENESIS OF METHODOLOGY OF MANAGEMENT BY DEVELOPMENT OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Z.N. Varlamova

    2007-06-01

    In this article, the genesis of the methodology of managing the development of organizations, understood as the set of methodological approaches and methods used, is investigated. The results of a comparative analysis of the methodological approaches to the management of organizational development are presented. The traditional methodological approaches are complemented by strategic experiment and case-study methodology. Approaches to the formation of a new methodology and technique for researching the sources of an organization's competitive advantages are considered.

  20. An automated methodology development. [software design for combat simulation

    Science.gov (United States)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  1. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.

  2. PSA methodology development and application in Japan

    International Nuclear Information System (INIS)

    Kazuo Sato; Toshiaki Tobioka; Kiyoharu Abe

    1987-01-01

    The outlines of Japanese activities on the development and application of probabilistic safety assessment (PSA) methodologies are described. First, the activities on methodology development are described for system reliability analysis, operational data analysis, core melt accident analysis, environmental consequence analysis and seismic risk analysis. Then, examples of methodology application by the regulatory side and the industry side are described. (author)

  3. Pro website development and operations streamlining DevOps for large-scale websites

    CERN Document Server

    Sacks, Matthew

    2012-01-01

    Pro Website Development and Operations gives you the experience you need to create and operate a large-scale production website. Large-scale websites have their own unique set of problems regarding their design, problems that can get worse when agile methodologies are adopted for rapid results. Managing large-scale websites, deploying applications, and ensuring they are performing well often requires a full-scale team involving the development and operations sides of the company, two departments that don't always see eye to eye. When departments struggle with each other, it adds unnecessary comp…

  4. Developing educational hypermedia applications: a methodological approach

    Directory of Open Access Journals (Sweden)

    Jose Miguel Nunes

    1996-01-01

    This paper proposes a hypermedia development methodology with the aim of integrating the work of educators, who will be primarily responsible for the instructional design, with that of software experts, responsible for the software design and development. Hence, it is proposed that the educators and programmers should interact in an integrated and systematic manner following a methodological approach.

  5. PLANNING QUALITY ASSURANCE PROCESSES IN A LARGE SCALE GEOGRAPHICALLY SPREAD HYBRID SOFTWARE DEVELOPMENT PROJECT

    Directory of Open Access Journals (Sweden)

    Святослав Аркадійович МУРАВЕЦЬКИЙ

    2016-02-01

    Key points of operational activities in large-scale, geographically distributed software development projects are discussed, and the required structure of QA processes in such projects is examined. Up-to-date methods for integrating quality assurance processes into software development processes are presented. Existing groups of software development methodologies are reviewed: sequential, agile and PRINCE2-based. A condensed overview of quality assurance processes in each group is given, followed by a review of the common challenges that sequential and agile models face in a large, geographically distributed hybrid software development project. Recommendations are given for tackling those challenges, and conclusions are drawn about the choice of the best methodology and its application to the particular project.

  6. The Health Behaviour in School-aged Children (HBSC) study: methodological developments and current tensions

    DEFF Research Database (Denmark)

    Roberts, Chris; Freeman, John; Samdal, Oddrun

    2009-01-01

    OBJECTIVES: To describe the methodological development of the HBSC survey since its inception and explore methodological tensions that need to be addressed in the ongoing work on this and other large-scale cross-national surveys. METHODS: Using archival data and conversations with members of the network, we collaboratively analysed our joint understandings of the survey's methodology. RESULTS: We identified four tensions that are likely to be present in upcoming survey cycles: (1) maintaining quality standards against a background of rapid growth, (2) continuous improvement with limited financial… in working through such challenges renders it likely that HBSC can provide a model of other similar studies facing these tensions…

  7. Development of risk-informed assessment (RIA) design methodology

    International Nuclear Information System (INIS)

    Ji, S. K.; Park, S. J.; Park, B. R.; Kim, M. R.; Choi, C. J.

    2001-01-01

    It has been assessed that the capital cost of future nuclear power plants needs to be reduced by on the order of 35% to 40% for Advanced Light Water Reactors such as the KNGR and System 80+. Such a reduction in capital cost will require a fundamental re-evaluation of the industry standards and regulatory basis under which nuclear plants are designed and licensed. The objective of this study is to develop a risk-informed assessment (RIA) design methodology for future nuclear power plants. In order to meet this objective, a design simplification method is developed and the RIA design methodology is exercised for a conceptual system. For methodology verification, simplified conceptual ECCS and feedwater systems are developed, and then LOCA sensitivity analyses and aggressive secondary cooldown analyses for these systems are performed. In addition, a probabilistic safety assessment (PSA) model for LOCA is developed and the validation of the RIA design methodology is demonstrated

  8. A methodology for development of biocatalytic processes

    DEFF Research Database (Denmark)

    Lima Ramos, Joana

    are available. The first case study presents a rational approach for defining a development strategy for multi-enzymatic processes. The proposed methodology requires a profound and structured knowledge of the multi-enzyme systems, integrating chemistry, biological and process engineering. In order to suggest… These process metrics can often be attained by improvements in the reaction chemistry, the biocatalyst, and/or by process engineering, which often requires a complex process development strategy. Interestingly this complexity, which arises from the need for integration of biological and process technologies and their relationship with the overall process is not clear. The work described in this thesis presents a methodological approach for early stage development of biocatalytic processes, understanding and dealing with the reaction, biocatalyst and process constraints. When applied, this methodology has a decisive role...

  9. Information technology security system engineering methodology

    Science.gov (United States)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  10. CATHARE code development and assessment methodologies

    International Nuclear Information System (INIS)

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-01-01

    The CATHARE thermal-hydraulic code has been developed jointly by the Commissariat a l'Energie Atomique (CEA), Electricite de France (EdF), and Framatome for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate effects tests and integral effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation

  11. Development of analysis methodology on turbulent thermal striping

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Geun Jong; Jeon, Won Dae; Han, Jin Woo; Gu, Byong Kook [Changwon National University, Changwon(Korea)

    2001-03-01

    For developing the analysis methodology, the important governing factors of thermal striping phenomena are identified as geometric configuration and flow characteristics such as velocity. Along these factors, the performance of the turbulence models in the existing analysis methodology is evaluated against experimental data. The status of DNS application is also assessed based on the literature. The evaluation results are reflected in setting up the new analysis methodology. From the evaluation of the existing analysis methodology, the Full Reynolds Stress (FRS) model is identified as the best among the turbulence models, and LES is found to be able to provide time-dependent turbulence values. Further improvements in the near-wall region and the temperature variance equation are required for FRS, and the implementation of new sub-grid scale models is also required for LES. Through these improvements, a new reliable analysis methodology for thermal striping can be developed. 30 refs., 26 figs., 6 tabs. (Author)

  12. Physical protection evaluation methodology program development and application

    International Nuclear Information System (INIS)

    Seo, Janghoon; Yoo, Hosik

    2015-01-01

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not a trivial task, since there are many interconnected factors affecting overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodologies are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has a strong point in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, vital area analysis and a realistic threat scenario assessment are required. Like

  13. Physical protection evaluation methodology program development and application

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Janghoon; Yoo, Hosik [Korea Institute of Nuclear Non-proliferation and Control, Daejeon (Korea, Republic of)]

    2015-10-15

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept to the design stage. The methodology can be used to assess weak points and improve performance not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not trivial, since there are many interconnected factors affecting overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. INPRO (The International Project on Innovative Nuclear Reactors and Fuel Cycles) and the GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodology are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has a strong point in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, COMPRE program development and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, vital area analysis and realistic threat scenario assessment are required. Like

  14. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    International Nuclear Information System (INIS)

    Lahtinen, J.; Launiainen, T.; Heljanko, K.; Ropponen, J.

    2012-01-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)
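
    As an illustration of the kind of safety-property verification described above, the sketch below shows a minimal explicit-state reachability check in Python over a toy two-variable state machine. It is not the SARANA tooling or its modelling language; the state space, transition function and property are hypothetical stand-ins, and real I and C models would be analyzed with a dedicated model checker.

```python
from collections import deque

def check_safety(initial, transitions, is_safe):
    """Explicit-state reachability check: returns a counterexample path to an
    unsafe state, or None if every reachable state satisfies the property."""
    queue = deque([(initial, [initial])])
    visited = {initial}
    while queue:
        state, path = queue.popleft()
        if not is_safe(state):
            return path                      # counterexample trace
        for nxt in transitions(state):
            if nxt not in visited:
                visited.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None                              # property holds on all reachable states

# Toy function-block model: state = (pump_on, tank_level); the safety property
# forbids an empty tank while the pump is running.
def transitions(s):
    pump, level = s
    succ = {(not pump, level)}                                        # toggle pump command
    succ.add((pump, max(0, level - 1) if pump else min(3, level + 1)))  # level dynamics
    return succ

trace = check_safety((False, 3), transitions, lambda s: not (s[0] and s[1] == 0))
print("counterexample:", trace)
```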

  15. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    Energy Technology Data Exchange (ETDEWEB)

    Lahtinen, J. [VTT Technical Research Centre of Finland, Espoo (Finland)]; Launiainen, T.; Heljanko, K.; Ropponen, J. [Aalto Univ., Espoo (Finland). Dept. of Information and Computer Science]

    2012-07-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)

  16. A Methodology for Measuring Microplastic Transport in Large or Medium Rivers

    Directory of Open Access Journals (Sweden)

    Marcel Liedermann

    2018-04-01

    Full Text Available Plastic waste as a persistent contaminant of our environment is a matter of increasing concern due to the largely unknown long-term effects on biota. Although freshwater systems are known to be the transport paths of plastic debris to the ocean, most research has been focused on marine environments. In recent years, freshwater studies have advanced rapidly, but they rarely address the spatial distribution of plastic debris in the water column. A methodology for measuring microplastic transport at various depths that is applicable to medium and large rivers is needed. We present a new methodology offering the possibility of measuring microplastic transport at different depths of verticals that are distributed within a profile. The net-based device is robust and can be applied at high flow velocities and discharges. Nets with different mesh sizes (41 µm, 250 µm, and 500 µm) are exposed at three different depths of the water column. The methodology was tested in the Austrian Danube River, showing a high heterogeneity of microplastic concentrations within one cross section. Due to turbulent mixing, the different densities of the polymers, aggregation, and the growth of biofilms, plastic transport cannot be limited to the surface layer of a river and must be examined within the whole water column, as for suspended sediments. These results imply that multipoint measurements are required for obtaining the spatial distribution of plastic concentration and are therefore a prerequisite for calculating the passing transport. The analysis of filtration efficiency and side-by-side measurements with different mesh sizes showed that the 500 µm nets led to optimal results.
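
    A minimal sketch of the transport calculation implied by the multipoint approach above: point concentrations and velocities at several depths are combined with the sub-areas they represent to estimate the microplastic flux through the cross section. All numbers are hypothetical, and the simple sub-area weighting is an assumption for illustration, not the authors' procedure.

```python
# Estimate total microplastic transport through a river cross section from
# multipoint measurements: sum of concentration x velocity x represented sub-area.
samples = [
    # (depth label, concentration [particles/m^3], velocity [m/s], sub-area [m^2])
    ("surface", 12.0, 1.1, 150.0),
    ("middle",   7.5, 0.9, 150.0),
    ("bottom",   3.0, 0.6, 150.0),
]

transport = sum(c * v * a for _, c, v, a in samples)   # particles per second
print(f"cross-sectional transport: {transport:.0f} particles/s")
```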

  17. Development of theoretical methodology for large molecules

    International Nuclear Information System (INIS)

    Maggiora, G.M.; Christoffersen, R.E.; Yoffe, J.A.; Petke, J.D.

    1981-01-01

    A major advantage of the use of floating spherical Gaussian orbitals (FSGOs) is the extreme rapidity with which the necessary quantum mechanical integrals can be evaluated. This advantage has been exploited in several quantum mechanical procedures for molecular electronic structure calculations, as described below. Several other properties of these functions have also been exploited, and have led to the development of semiclassical point charge and harmonic oscillator models capable of describing first- and second-order electromagnetic properties and intermolecular forces with reasonable accuracy in all cases, and with considerably better accuracy than much more elaborate theoretical procedures in some cases. These applications are also described below. The primary intent of the current paper is to present an overview of some of the uses of FSGOs in the study of molecular electronic structure and properties and to indicate possible directions for future applications. No attempt will be made to include all possible applications; rather, those applications of interest to the authors have been stressed. Hopefully, this paper will further stimulate the development of additional uses of these remarkable functions

  18. Development of a Seismic Setpoint Calculation Methodology Using a Safety System Approach

    International Nuclear Information System (INIS)

    Lee, Chang Jae; Baik, Kwang Il; Lee, Sang Jeong

    2013-01-01

    The Automatic Seismic Trip System (ASTS) automatically actuates a reactor trip when it detects seismic activity whose magnitude is comparable to the Safe Shutdown Earthquake (SSE), the maximum hypothetical earthquake at the nuclear power plant site. To ensure that the reactor is tripped before the magnitude of an earthquake exceeds the SSE, it is crucial to determine the seismic setpoint reasonably. The trip setpoint and allowable value of the ASTS for Advanced Power Reactor (APR) 1400 Nuclear Power Plants (NPPs) were determined by the methodology presented in this paper. The ASTS, which trips the reactor when a large earthquake occurs, is categorized as a non-safety system because the system is not required by design basis event criteria. This means the ASTS has neither a specific analytical limit nor a dedicated setpoint calculation methodology. Therefore, we developed the ASTS setpoint calculation methodology by conservatively adapting that of the PPS. By incorporating the developed methodology into the ASTS for APR1400, a more conservative trip setpoint and allowable value were determined. In addition, the ZPA from the Operating Basis Earthquake (OBE) FRS of the floor where the sensor module is located is 0.1 g. Thus, the allowance of 0.17 g between the OBE of 0.1 g and the ASTS trip setpoint of 0.27 g is sufficient to prevent a reactor trip before the magnitude of the earthquake exceeds the OBE. As a result, the developed ASTS setpoint calculation methodology is evaluated as reasonable with respect to both the safety and the performance of the NPPs. It will be used to determine the ASTS trip setpoint and allowable value for newly constructed plants
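
    The sketch below illustrates, under assumed generic setpoint practice rather than the approved APR1400 methodology, how a trip setpoint might be backed off from an analytical limit by a square-root-sum-of-squares combination of channel uncertainties, and how the remaining margin to the OBE level quoted above could be checked. The analytical limit and all uncertainty terms are hypothetical.

```python
import math

# Illustrative only: trip setpoint = analytical limit minus the SRSS combination
# of independent instrument uncertainties (hypothetical values, not plant data).
analytical_limit = 0.30              # g, assumed ZPA limit for the seismic channel
uncertainties = [0.02, 0.01, 0.01]   # g, assumed sensor/process/rack uncertainty terms

total_uncertainty = math.sqrt(sum(u ** 2 for u in uncertainties))
trip_setpoint = analytical_limit - total_uncertainty
obe_zpa = 0.10                       # g, OBE floor-response-spectrum ZPA from the abstract

print(f"trip setpoint : {trip_setpoint:.3f} g")
print(f"margin to OBE : {trip_setpoint - obe_zpa:.3f} g")
```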

  19. A methodology for the synthesis of heat exchanger networks having large numbers of uncertain parameters

    International Nuclear Information System (INIS)

    Novak Pintarič, Zorka; Kravanja, Zdravko

    2015-01-01

    This paper presents a robust computational methodology for the synthesis and design of flexible HENs (Heat Exchanger Networks) having large numbers of uncertain parameters. The methodology combines several heuristic methods which progressively lead to a flexible HEN design at a specific level of confidence. In the first step, a HEN topology is generated under nominal conditions, followed by determination of the points critical for flexibility. A significantly reduced multi-scenario model for flexible HEN design is then formulated at the nominal point with flexibility constraints at the critical points. The optimal design obtained is tested by stochastic Monte Carlo optimization and by the flexibility index, through solving one-scenario problems within a loop. The presented methodology is novel in the enormous reduction of scenarios, and hence of computational effort, in HEN design problems. Despite several simplifications, the capability of designing flexible HENs with large numbers of uncertain parameters, which are typical throughout industry, is not compromised. An illustrative case study is presented for flexible HEN synthesis comprising 42 uncertain parameters. - Highlights: • Methodology for HEN (Heat Exchanger Network) design under uncertainty is presented. • The main benefit is solving HENs having large numbers of uncertain parameters. • A drastically reduced multi-scenario HEN design problem is formulated through several steps. • Flexibility of the HEN is guaranteed at a specific level of confidence.
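
    To make the stochastic test step concrete, the following sketch estimates a confidence-level-style flexibility measure for a single heat exchanger by Monte Carlo sampling of an uncertain duty, heat-transfer coefficient and temperature difference. It is a toy stand-in for the authors' reduced multi-scenario HEN model; the distributions and sizing rule are assumptions made for illustration.

```python
import random

def required_area(duty_kw, u_kw_m2k, dt_lm):
    """Area needed to transfer a given duty: Q = U * A * dT_lm."""
    return duty_kw / (u_kw_m2k * dt_lm)

def flexibility(area_installed, n_samples=10_000, seed=1):
    """Fraction of sampled uncertain operating points the installed area can serve."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n_samples):
        duty = rng.uniform(800.0, 1200.0)   # kW, uncertain heat duty
        u = rng.uniform(0.45, 0.55)         # kW/(m^2 K), uncertain coefficient
        dt = rng.uniform(18.0, 25.0)        # K, uncertain log-mean temperature difference
        if required_area(duty, u, dt) <= area_installed:
            ok += 1
    return ok / n_samples

print(f"fraction of feasible scenarios: {flexibility(120.0):.3f}")
```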

  20. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application to lending operations in the Republic of Serbia. With a developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, carried out in order to minimize and manage credit risk. The paper then describes the processing of the credit application and the procedure for analyzing financial statements in order to gain insight into the borrower's creditworthiness. The second part of the paper presents the theoretical and methodological framework applied in a concrete company. The third part presents models that banks should use to protect themselves against risk exposure, i.e. to reduce losses on lending operations in the Republic of Serbia and to adjust to market conditions in an optimal way.

  1. Methodology for economic evaluation of software development projects

    International Nuclear Information System (INIS)

    Witte, D.M.

    1990-01-01

    Many oil and gas exploration and production companies develop computer software in-house or with contract programmers to support their exploration activities. Software development projects compete for funding with exploration and development projects, though most companies lack valid comparison measures for the two types of projects. This paper presents a methodology of pro forma cash flow analysis for software development proposals intended for internal use. This methodology, based on estimates of development and support costs, exploration benefits, and the probability of successful development and implementation, can be used to compare proposed software development projects directly with competing exploration proposals
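
    A minimal sketch of the pro forma comparison described above: the expected net present value of a software development proposal is computed from assumed development and support costs, annual exploration benefits, and a probability of successful development and implementation. The figures and discounting convention are hypothetical and not taken from the paper.

```python
def expected_npv(dev_cost, support_cost, annual_benefit, p_success, years, rate):
    """Expected NPV: up-front development cost, then benefit weighted by the
    probability of success minus recurring support cost, discounted each year."""
    npv = -dev_cost
    for t in range(1, years + 1):
        cash_flow = p_success * annual_benefit - support_cost
        npv += cash_flow / (1.0 + rate) ** t
    return npv

# Hypothetical proposal: $500k to build, $50k/yr to support, $300k/yr benefit,
# 70% chance of successful development, 5-year horizon, 10% discount rate.
print(f"expected NPV: {expected_npv(500_000, 50_000, 300_000, 0.7, 5, 0.10):,.0f} USD")
```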

  2. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Andrade Almeida, João; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  3. Development of seismic risk analysis methodologies at JAERI

    International Nuclear Information System (INIS)

    Tanaka, T.; Abe, K.; Ebisawa, K.; Oikawa, T.

    1988-01-01

    The usefulness of probabilistic safety assessment (PSA) is recognized worldwide for balanced design and regulation of nuclear power plants. In Japan, the Japan Atomic Energy Research Institute (JAERI) has been engaged in developing the methodologies necessary for carrying out PSA. The research and development program was started in 1980. In those days the effort was devoted only to internal-initiator PSA. In 1985 the program was expanded to include external event analysis. Although this expanded program is to cover various external initiators, the current effort is dedicated to seismic risk analysis. There are three levels of seismic PSA, similar to internal-initiator PSA: Level 1: evaluation of core damage frequency; Level 2: evaluation of radioactive release frequency and source terms; and Level 3: evaluation of environmental consequences. In the JAERI program, only the methodologies for Level 1 seismic PSA are under development. The methodology development for seismic risk analysis is divided into two phases. The Phase I study is to establish a whole set of simple methodologies based on currently available data. In Phase II, a sensitivity study will be carried out to identify the parameters whose uncertainty may result in large uncertainty in seismic risk, and for such parameters the methodology will be upgraded. The Phase I study has now almost been completed. In this report, outlines of the study and some of its outcomes are described

  4. Development of Advanced Non-LOCA Analysis Methodology for Licensing

    International Nuclear Information System (INIS)

    Jang, Chansu; Um, Kilsup; Choi, Jaedon

    2008-01-01

    KNF is developing a new design methodology for Non-LOCA analysis for licensing purposes. The code chosen is the best-estimate transient analysis code RETRAN, and the OPR1000 is the target plant. For this purpose, KNF prepared a simple nodal scheme appropriate to the licensing analyses and developed the designer-friendly analysis tool ASSIST (Automatic Steady-State Initialization and Safety analysis Tool). To check the validity of the newly developed methodology, the single CEA withdrawal and locked rotor accidents are analyzed using the new methodology and compared with current design results. The comparison shows good agreement, and it is concluded that the new design methodology can be applied to licensing calculations for OPR1000 Non-LOCA analyses

  5. ISE System Development Methodology Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  6. Enabling Psychiatrists to be Mobile Phone App Developers: Insights Into App Development Methodologies.

    Science.gov (United States)

    Zhang, Melvyn Wb; Tsang, Tammy; Cheow, Enquan; Ho, Cyrus Sh; Yeong, Ng Beng; Ho, Roger Cm

    2014-11-11

    The use of mobile phones, and specifically smartphones, in the last decade has become more and more prevalent. The latest mobile phones are equipped with comprehensive features that can be used in health care, such as providing rapid access to up-to-date evidence-based information, provision of instant communications, and improvements in organization. The estimated number of health care apps for mobile phones is increasing tremendously, but previous research has highlighted the lack of critical appraisal of new apps. This lack of appraisal of apps has largely been due to the lack of clinicians with technical knowledge of how to create an evidence-based app. We discuss two freely available methodologies for developing Web-based mobile phone apps: a website builder and an app builder. With these, users can program not just a Web-based app, but also integrate multimedia features within their app, without needing to know any programming language. We present techniques for creating a mobile Web-based app using two well-established online mobile app websites. We illustrate how to integrate text-based content within the app, as well as integration of interactive videos and rich site summary (RSS) feed information. We will also briefly discuss how to integrate a simple questionnaire survey into the mobile-based app. A questionnaire survey was administered to students to collate their perceptions towards the app. These two methodologies for developing apps have been used to convert an online electronic psychiatry textbook into two Web-based mobile phone apps for medical students rotating through psychiatry in Singapore. Since the inception of our mobile Web-based app, a total of 21,991 unique users have used the mobile app and online portal provided by WordPress, and another 717 users have accessed the app via a Web-based link. The user perspective survey results (n=185) showed that a high proportion of students valued the textbook and objective structured clinical

  7. Development of Proliferation Resistance Assessment Methodology Based on International Standards

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Jung Won; Lee, Kwang Seok

    2009-03-01

    Proliferation resistance is one of the requirements to be met by GEN IV and INPRO for next-generation nuclear energy systems. Internationally, evaluation methodologies for PR were first initiated around 1980, but systematic development started in the 2000s. In Korea, for the export of nuclear energy systems and to increase the international credibility and transparency of the domestic nuclear system and fuel cycle development, independent development of a PR evaluation methodology was started in 2007 as a nuclear long-term R and D project, and development of a PR evaluation model is being performed. In the first year, a comparative study of GEN-IV/INPRO, PR indicator development, quantification of indicators and evaluation model development, and an analysis of the technology system and international technology development trends were performed. In the second year, a feasibility study of the indicators, allowable limits of the indicators, and a review of the technical requirements of the indicators were carried out. The results of a PR evaluation must be applied from the beginning of the conceptual design of a nuclear system. Through the development of the PR evaluation methodology, the methodology will also be applied to the regulatory requirements for authorization and permission to be developed

  8. Service Innovation Methodologies II : How can new product development methodologies be applied to service innovation and new service development? : Report no 2 from the TIPVIS-project

    OpenAIRE

    Nysveen, Herbjørn; Pedersen, Per E.; Aas, Tor Helge

    2007-01-01

    This report presents various methodologies used in new product development and product innovation and discusses the relevance of these methodologies for service development and service innovation. The service-innovation relevance of all the methodologies presented is evaluated along several service-specific dimensions, such as intangibility, inseparability, heterogeneity, perishability, information intensity, and co-creation. The methodologies discussed are mainly collect...

  9. Risk-Informed Assessment Methodology Development and Application

    International Nuclear Information System (INIS)

    Sung Goo Chi; Seok Jeong Park; Chul Jin Choi; Ritterbusch, S.E.; Jacob, M.C.

    2002-01-01

    Westinghouse Electric Company (WEC) has been working with Korea Power Engineering Company (KOPEC) on a US Department of Energy (DOE) sponsored Nuclear Energy Research Initiative (NERI) project through a collaborative agreement established for the domestic NERI program. The project deals with Risk-Informed Assessment (RIA) of regulatory and design requirements of future nuclear power plants. An objective of the RIA project is to develop a risk-informed design process, which focuses on identifying and incorporating advanced features into future nuclear power plants (NPPs) that would meet risk goals in a cost-effective manner. The RIA design methodology is proposed to accomplish this objective. This paper discusses the development of this methodology and demonstrates its application in the design of plant systems for future NPPs. Advanced conceptual plant systems consisting of an advanced Emergency Core Cooling System (ECCS) and Emergency Feedwater System (EFWS) for a NPP were developed and the risk-informed design process was exercised to demonstrate the viability and feasibility of the RIA design methodology. Best estimate Loss-of-Coolant Accident (LOCA) analyses were performed to validate the PSA success criteria for the NPP. The results of the analyses show that the PSA success criteria can be met using the advanced conceptual systems and that the RIA design methodology is a viable and appropriate means of designing key features of risk-significant NPP systems. (authors)

  10. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level this scaling determines the effects various processes have on a state variable, and it ranks the processes according to their importance by the magnitude of the fractional change they cause on that state variable. At the system level the scaling determines the governing processes and corresponding components, ranking these in the order of importance according to their effect on the fractional change of system-wide state variables. The scaling methodology reveals on all levels the fractional change of state variables and is therefore called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate to the content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering of the component effect metrics provides the hierarchy of processes in a component, then in all components and the system. FSM separates quantitatively dominant from minor processes and components and
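
    A short worked example of the effect metric defined above, with hypothetical numbers: ω is the process transport rate divided by the content of the conserved quantity in the component, and Ω = ωt quantifies the fractional change over the characteristic time.

```python
# Worked example of the FSM effect metric (all values hypothetical).
transport_rate = 50.0        # e.g. kg/s transported out of the component
content = 2.0e4              # kg of the conserved quantity held in the component
characteristic_time = 100.0  # s, duration of the scenario phase of interest

omega = transport_rate / content              # specific rate of fractional change [1/s]
effect_metric = omega * characteristic_time   # dimensionless Omega = omega * t

print(f"omega = {omega:.2e} 1/s, Omega = {effect_metric:.2f}")
```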

  11. The Typology of Methodological Approaches to Development of Innovative Clusters

    Directory of Open Access Journals (Sweden)

    Farat Olexandra V.

    2017-06-01

    Full Text Available The aim of the article is to study the existing methodological approaches to assessing the development of enterprises in order to substantiate the possibilities of their use by cluster associations. As a result of the research, based on an analysis of the scientific literature, the most applicable methodological approaches to assessing the development of enterprises are characterized. Eight methodical approaches to assessing the level of development of enterprises and four methodological approaches to assessing the level of development of clusters are singled out. Each of the approaches has certain advantages and disadvantages, but none of them allows obtaining a systematic assessment of all areas of cluster functioning, identifying possible reserves for cluster competitiveness growth, or characterizing possible strategies for their future development. Taking into account the peculiarities of the functioning and development of cluster associations of enterprises, we propose our own methodological approach for assessing the development of innovative cluster structures.

  12. Vedic division methodology for high-speed very large scale integration applications

    Directory of Open Access Journals (Sweden)

    Prabir Saha

    2014-02-01

    Full Text Available A transistor-level implementation of a division methodology using ancient Vedic mathematics is reported in this Letter. The potential of the 'Dhvajanka (on top of the flag)' formula from Vedic mathematics was adopted to implement this type of divider for practical very large scale integration applications. The division methodology was implemented using half of the divisor bits instead of the actual divisor, subtraction and a little multiplication. The propagation delay and dynamic power consumption of the divider circuitry were minimised significantly by stage reduction through the Vedic division methodology. The functionality of the division algorithm was checked, and performance parameters such as propagation delay and dynamic power consumption were calculated through Spice Spectre with 90 nm complementary metal oxide semiconductor technology. The propagation delay of the resulting (32 ÷ 16) bit divider circuitry was only ∼300 ns, and it consumed ∼32.5 mW of power for a layout area of 17.39 mm^2. By combining Boolean arithmetic with ancient Vedic mathematics, a substantial number of iterations were eliminated, resulting in ∼47%, ∼38% and 34% reductions in delay and ∼34%, ∼21% and ∼18% reductions in power compared with the most commonly used architectures (e.g. digit-recurrence, Newton–Raphson, Goldschmidt).
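
    For contrast with the iterative baseline architectures named in the abstract (rather than the Vedic Dhvajanka scheme itself, whose hardware details are not reproduced here), the sketch below shows Newton-Raphson division: the reciprocal of the scaled divisor is refined as x_{n+1} = x_n(2 - d·x_n) and then multiplied by the dividend. Hardware dividers trade such iterations against area and delay.

```python
def nr_divide(dividend, divisor, iterations=5):
    """Newton-Raphson division via reciprocal refinement (divisor must be nonzero)."""
    # Scale |divisor| into [0.5, 1) so the initial reciprocal guess converges.
    shift = 0
    d = abs(divisor)
    while d >= 1.0:
        d /= 2.0
        shift += 1
    while d < 0.5:
        d *= 2.0
        shift -= 1
    x = 48.0 / 17.0 - 32.0 / 17.0 * d        # classic initial approximation of 1/d
    for _ in range(iterations):
        x = x * (2.0 - d * x)                # quadratically convergent refinement
    sign = -1.0 if (dividend < 0) != (divisor < 0) else 1.0
    return sign * abs(dividend) * x / (2.0 ** shift)

print(nr_divide(355.0, 113.0))               # ~3.14159...
```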

  13. A Comparison of Various Software Development Methodologies: Feasibility and Methods of Integration

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2016-12-01

    Full Text Available System development methodologies that have been used in academic and commercial environments during the last two decades have advantages and disadvantages. Researchers have tried to identify the objectives, scope, etc. of the methodologies by following different approaches. Each approach has its limitations, specific interests, coverage, etc. In this paper, we perform a comparative study of those methodologies which are popular and commonly used in banking and commercial environments. In our study we try to determine the objectives, scope, tools and other features of the methodologies. We also try to determine how, and to what extent, the methodologies incorporate facilities such as project management, cost-benefit analysis, documentation, etc. One of the most important aspects of our study is how to integrate the methodologies and develop a global methodology which covers the complete span of the software development life cycle. A prototype system which integrates the selected methodologies has been developed. The developed system helps analysts and designers choose suitable tools or obtain guidelines on what to do in a particular situation. The prototype system was tested during the development of software for an ATM "Auto Teller Machine" by selecting and applying the SASD methodology during software development. This resulted in the development of a high quality and well documented software system.

  14. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States)]; Denning, Richard [The Ohio State Univ., Columbus, OH (United States)]; Catalyurek, Umit [The Ohio State Univ., Columbus, OH (United States)]; Unwin, Stephen [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  15. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    International Nuclear Information System (INIS)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit; Unwin, Stephen

    2015-01-01

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  16. Evolution of courseware development methodology : recent issues

    NARCIS (Netherlands)

    Moonen, J.C.M.M.; Schoenmaker, Jan

    1992-01-01

    To improve the quality of courseware products and the efficiency of the courseware development process, a methodology based upon "courseware engineering", being a combination of instructional systems development and software engineering, has emerged over the last 10–15 years. Recently, software

  17. Cooperative learning as a methodology for inclusive education development

    Directory of Open Access Journals (Sweden)

    Yolanda Muñoz Martínez

    2017-06-01

    Full Text Available This paper presents the methodology of cooperative learning as a strategy for developing the principles of inclusive education. It has a very practical orientation, with the intention of providing tools for teachers who want to implement this methodology in the classroom: it starts with a theoretical review and then describes a case in which this methodology has been applied for five years. We describe specific activities and ways of working with students, and finally draw conclusions on the implementation of the methodology.

  18. Projected large flood event sensitivity to projection selection and temporal downscaling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Raff, D. [U.S. Dept. of the Interior, Bureau of Reclamation, Denver, Colorado (United States)]

    2008-07-01

    Large flood events, which influence regulatory guidelines as well as dam safety decisions, are likely to be affected by climate change. This talk will evaluate the use of climate projections, downscaled and run through a rainfall-runoff model, and their influence on large flood events. The climate spatial downscaling is performed statistically, and a re-sampling and scaling methodology is used to temporally downscale from monthly to daily signals. The signals are run through a National Weather Service operational rainfall-runoff model to produce 6-hour flows. The flows will be evaluated for changes in large events at look-ahead horizons of 2011-2040, 2041-2070, and 2071-2099. The sensitivity of the results will be evaluated with respect to the projection selection criteria and the re-sampling and scaling criteria for the Boise River in Idaho near Lucky Peak Dam. (author)

  19. Projected large flood event sensitivity to projection selection and temporal downscaling methodology

    International Nuclear Information System (INIS)

    Raff, D.

    2008-01-01

    Large flood events, which influence regulatory guidelines as well as dam safety decisions, are likely to be affected by climate change. This talk will evaluate the use of climate projections, downscaled and run through a rainfall-runoff model, and their influence on large flood events. The climate spatial downscaling is performed statistically, and a re-sampling and scaling methodology is used to temporally downscale from monthly to daily signals. The signals are run through a National Weather Service operational rainfall-runoff model to produce 6-hour flows. The flows will be evaluated for changes in large events at look-ahead horizons of 2011-2040, 2041-2070, and 2071-2099. The sensitivity of the results will be evaluated with respect to the projection selection criteria and the re-sampling and scaling criteria for the Boise River in Idaho near Lucky Peak Dam. (author)

  20. Large-Scale Demand Driven Design of a Customized Bus Network: A Methodological Framework and Beijing Case Study

    Directory of Open Access Journals (Sweden)

    Jihui Ma

    2017-01-01

    Full Text Available In recent years, an innovative public transportation (PT) mode known as the customized bus (CB) has been proposed and implemented in many cities in China to efficiently and effectively shift private car users to PT in order to alleviate traffic congestion and traffic-related environmental pollution. The route network design activity plays an important role in the CB operation planning process because it serves as the basis for other operation planning activities, for example, timetable development, vehicle scheduling, and crew scheduling. In this paper, according to the demand characteristics and operational purpose, a methodological framework that includes large-scale travel demand data processing and analysis, hierarchical clustering-based route origin-destination (OD) region division, route OD region pairing, and a route selection model is proposed for CB network design. Considering operating cost and social benefits, a route selection model is proposed and a branch-and-bound-based solution method is developed. In addition, a computer-aided program is developed to analyze a real-world Beijing CB route network design problem. The results of the case study demonstrate that the current CB network of Beijing can be significantly improved, thus demonstrating the effectiveness of the proposed methodology.
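
    An illustrative sketch of the hierarchical-clustering step in the framework above (not the authors' implementation): hypothetical pick-up coordinates are grouped into candidate origin regions with SciPy's agglomerative clustering, after which region pairing and route selection would follow.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Hypothetical pick-up points (x, y in km) drawn around three demand hot spots.
origins = np.vstack([
    rng.normal(loc=(0.0, 0.0), scale=0.5, size=(40, 2)),
    rng.normal(loc=(5.0, 1.0), scale=0.5, size=(40, 2)),
    rng.normal(loc=(2.0, 6.0), scale=0.5, size=(40, 2)),
])

Z = linkage(origins, method="ward")              # agglomerative clustering tree
labels = fcluster(Z, t=3, criterion="maxclust")  # cut the tree into three origin regions
for region in sorted(set(labels)):
    centroid = origins[labels == region].mean(axis=0)
    print(f"region {region}: {np.sum(labels == region)} trips, centroid {centroid.round(2)}")
```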

  1. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criteria of the CDM, namely that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are expected to contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)

  2. Baseline methodologies for clean development mechanism projects

    International Nuclear Information System (INIS)

    Lee, M.K.; Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-01

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criteria of the CDM, namely that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are expected to contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)

  3. A Methodology for Integrating Maintainability Using Software Metrics

    OpenAIRE

    Lewis, John A.; Henry, Sallie M.

    1989-01-01

    Maintainability must be integrated into software early in the development process. But for practical use, the techniques used must be as unobtrusive to the existing software development process as possible. This paper defines a methodology for integrating maintainability into large-scale software and describes an experiment which implemented the methodology into a major commercial software development environment.

  4. Developing a methodology for identifying correlations between LERF and early fatality

    International Nuclear Information System (INIS)

    Kang, Kyung Min; Jae, Moo Sung; Ahn, Kwang Il

    2009-01-01

    The correlations between Large Early Release Frequency (LERF) and Early Fatality need to be investigated for risk-informed application and regulation. In RG-1.174, there are decision-making criteria using the measures of CDF and LERF, but there are no specific criteria on LERF itself. Since off-site consequence calculations involve both large uncertainties and high cost, a LERF assessment methodology needs to be developed and its correlation factor identified for risk-informed decision-making. In this regard, a robust method for estimating off-site consequences has been applied to assess the health effects caused by radioisotopes released during severe accidents of nuclear power plants. In addition, the MACCS2 code is used to quantitatively validate the source term with respect to health effects, depending on the release characteristics of the radioisotopes during severe accidents. This study developed a method for identifying correlations between LERF and Early Fatality and validates the results of the model using the MACCS2 code. The results of this study may contribute to defining LERF and finding a measure for risk-informed regulations and risk-informed decision making
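
    A minimal sketch of the kind of correlation factor the study aims at, using purely hypothetical paired results of release-category frequency and calculated early fatalities; a real application would draw these pairs from Level 2/3 PSA results and MACCS2 consequence runs.

```python
import statistics

# Hypothetical paired results: (frequency of release category [/yr], early fatalities).
results = [(1.0e-7, 0.0), (5.0e-7, 2.0), (2.0e-6, 11.0), (8.0e-6, 45.0), (1.5e-5, 90.0)]

freqs = [f for f, _ in results]
fatalities = [x for _, x in results]
r = statistics.correlation(freqs, fatalities)   # Pearson correlation (Python 3.10+)
print(f"correlation factor: {r:.3f}")
```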

  5. Developing knowledge management systems with an active expert methodology

    International Nuclear Information System (INIS)

    Sandahl, K.

    1992-01-01

    Knowledge management, understood as the ability to store, distribute and utilize human knowledge in an organization, is the subject of this dissertation. In particular, we have studied the design of methods and supporting software for this process. Detailed and systematic descriptions of the design and development processes of three case-study implementations of knowledge management software are provided. The outcome of the projects is explained in terms of an active expert development methodology, which is centered around support for a domain expert to take substantial responsibility for the design and maintenance of a knowledge management system in a given area of application. Based on the experiences from the case studies and the resulting methodology, an environment for automatically supporting knowledge management was designed in the KNOWLEDGE-LINKER research project. The vital part of this architecture is a knowledge acquisition tool, used directly by the experts in creating and maintaining a knowledge base. An elaborated version of the active expert development methodology was then formulated as the result of applying the KNOWLEDGE-LINKER approach in a fourth case study. This version of the methodology is also accounted for and evaluated together with the supporting KNOWLEDGE-LINKER architecture. (au)

  6. Development of a Teaching Methodology for Undergraduate Human Development in Psychology

    Science.gov (United States)

    Rodriguez, Maria A.; Espinoza, José M.

    2015-01-01

    The development of a teaching methodology for the undergraduate Psychology course Human Development II in a private university in Lima, Peru is described. The theoretical framework consisted of an integration of Citizen Science and Service Learning, with the application of Information and Communications Technology (ICT), specifically Wikipedia and…

  7. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the revision of 10CFR50.46 in 1988, which allowed best-estimate (BE) methods in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have changed continuously from conservative evaluation model (EM) approaches to BE ones. In this context, LSC (Long-Term core Cooling) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and improper elements of the old methodology were identified and, as a result, the USNRC suspended the approval of CENPD-254-P-A, which is the old LSC methodology for CE-designed NPPs. The regulatory bodies requested that the non-conservatisms be removed and that system transient behavior be reflected in all the LSC methodologies used. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full range of the break spectrum is considered, and the applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the methodology could be applied with no significant changes to current LSC plans

  8. Scenario aggregation and analysis via Mean-Shift Methodology

    International Nuclear Information System (INIS)

    Mandelli, D.; Yilmaz, A.; Metzroth, K.; Aldemir, T.; Denning, R.

    2010-01-01

    A new generation of dynamic methodologies is being developed for nuclear reactor probabilistic risk assessment (PRA) which explicitly account for the time element in modeling the probabilistic system evolution and use numerical simulation tools to account for possible dependencies between failure events. The dynamic event tree (DET) approach is one of these methodologies. One challenge with dynamic PRA methodologies is the large amount of data they produce which may be difficult to analyze without appropriate software tools. The concept of 'data mining' is well known in the computer science community and several methodologies have been developed in order to extract useful information from a dataset with a large number of records. Using the dataset generated by the DET analysis of the reactor vessel auxiliary cooling system (RVACS) of an ABR-1000 for an aircraft crash recovery scenario and the Mean-Shift Methodology for data mining, it is shown how clusters of transients with common characteristics can be identified and classified. (authors)
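
    A minimal flat-kernel mean-shift sketch (not the authors' implementation) applied to hypothetical two-dimensional scenario features: each dynamic-event-tree transient, summarized as a feature vector, migrates toward the local mean of its neighbours until cluster modes emerge.

```python
import numpy as np

def mean_shift(points, bandwidth, iterations=50):
    """Flat-kernel mean shift: shift each point toward the mean of its neighbours."""
    modes = points.copy()
    for _ in range(iterations):
        for i, p in enumerate(modes):
            neighbours = points[np.linalg.norm(points - p, axis=1) < bandwidth]
            modes[i] = neighbours.mean(axis=0)
    return modes

rng = np.random.default_rng(2)
# Hypothetical scaled scenario features, e.g. (time of core damage, peak temperature).
scenarios = np.vstack([rng.normal((0, 0), 0.3, (50, 2)), rng.normal((3, 3), 0.3, (50, 2))])
modes = mean_shift(scenarios, bandwidth=1.0)
print(np.unique(modes.round(1), axis=0))         # distinct modes ~ scenario clusters
```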

  9. Applying of component system development in object methodology, case study

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    Full Text Available Creating target software as a component system has been a very strong requirement for the last 20 years of software development. Architectural components are self-contained units that present not only partial and overall system behavior, but also cooperate with each other on the basis of their interfaces. Among other things, components allow flexible modification of the processes whose behavior is the foundation of component behavior, without changing the life of the component system. On the other hand, the component system makes it possible, at design time, to create numerous new connections between components and thus create modified system behaviors. All this enables company management to perform, at design time, the required behavioral changes of processes in accordance with the requirements of changing production and markets. Software development, generally referred to as SDP (Software Development Process), contains two directions. The first, called CBD (Component-Based Development), is dedicated to the development of component-based systems (CBS, Component-based System); the second targets the development of software under the influence of SOA (Service-Oriented Architecture). Both directions are equipped with their own development methodologies. The subject of this paper is only the first direction and the application of component-based system development in its object-oriented methodologies. The requirement of today is to carry out the development of component-based systems within developed object-oriented methodologies precisely as a dominant style. In some of the known methodologies, however, this development is not completely transparent and is not even recognized as dominant. In some cases, it is corrected by special meta-integration models of component system development into an object methodology. This paper presents a case study

  10. Large-Scale Urban Decontamination; Developments, Historical Examples and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Rick Demmer

    2007-02-01

    Recent terrorist threats and actual events have led to a renewed interest in the technical field of large-scale, urban-environment decontamination. One of the driving forces for this interest is the real potential for the cleanup and removal of radioactive dispersal device (RDD, or “dirty bomb”) residues. In response, the U.S. Government has spent many millions of dollars investigating RDD contamination and novel decontamination methodologies. Interest in chemical and biological (CB) cleanup has also peaked with the threat of terrorist action, such as the anthrax attack at the Hart Senate Office Building, and with catastrophic natural events such as Hurricane Katrina. The efficiency of cleanup response will be improved with these new developments and a better understanding of the “old reliable” methodologies. Perhaps the most interesting area of investigation for large-area decontamination is that of the RDD. While it is primarily an economic and psychological weapon, the need to clean up and return valuable or culturally significant resources to the public is nonetheless valid. Several private companies, universities and National Laboratories are currently developing novel RDD cleanup technologies. Because of their longstanding association with radioactive facilities, the U.S. Department of Energy National Laboratories are at the forefront of developing and testing new RDD decontamination methods. However, such cleanup technologies are likely to be fairly task specific, while many different contamination mechanisms, substrates and environmental conditions will make actual application more complicated. Some major efforts have also been made to model potential contamination, to evaluate both old and new decontamination techniques, and to assess their readiness for use. Non-radioactive CB threats each have unique decontamination challenges, and recent events have provided some examples. The U.S. Environmental Protection Agency (EPA), as lead agency for these emergency

  11. Development of a comprehensive management site evaluation methodology

    International Nuclear Information System (INIS)

    Rodgers, J.C.; Onishi, Y.

    1981-01-01

    The Nuclear Regulatory Commission is in the process of preparing regulations that will define the necessary conditions for adequate disposal of low-level waste (LLW) by confinement in an LLW disposal facility. These proposed regulations form the context in which the motivation for the joint Los Alamos National Laboratory Battelle Pacific Northwest Laboratory program to develop a site-specific, LLW site evaluation methodology is discussed. The overall effort is divided into three development areas: land-use evaluation, environmental transport modelling, and long term scenario development including long-range climatology projections. At the present time four steps are envisioned in the application of the methodology to a site: site land use suitability assessment, land use-ecosystem interaction, contaminant transport simulation, and sensitivity analysis. Each of these steps is discussed in the paper. 12 refs

  12. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes, such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools that have been most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research was supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of Probabilistic Safety Assessment (PSA), an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased-mission problem, a function for common cause failure analysis, a function for uncertainty analysis, a function for common cause failure analysis with uncertainty, and a system for printing out the results of a GO-FLOW analysis in the form of figures or tables. The above functions are explained by analyzing sample systems, such as the PWR AFWS and the BWR ECCS. The appendices describe the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in the GO-FLOW programs. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. With the development of the total GO-FLOW system, this methodology has become a powerful tool for a living PSA. (author) 54 refs
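
    Since event trees and fault trees are cited above as the basic PSA tools that GO-FLOW complements, the sketch below shows a plain fault-tree top-event calculation with independent basic events and OR/AND gates; it is not the GO-FLOW formalism, and the probabilities and system layout are hypothetical.

```python
def gate_or(*p):
    """OR gate for independent basic events: 1 - product of survival probabilities."""
    q = 1.0
    for x in p:
        q *= (1.0 - x)
    return 1.0 - q

def gate_and(*p):
    """AND gate for independent basic events: product of failure probabilities."""
    q = 1.0
    for x in p:
        q *= x
    return q

# Hypothetical two-train example: each train fails if its pump fails to start
# OR its discharge valve fails to open; the system fails only if both trains fail.
train_a = gate_or(3.0e-3, 1.0e-3)
train_b = gate_or(3.0e-3, 1.0e-3)
top = gate_and(train_a, train_b)
print(f"top event probability: {top:.2e}")
```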

  13. Adaptation of a software development methodology to the implementation of a large-scale data acquisition and control system. [for Deep Space Network]

    Science.gov (United States)

    Madrid, G. A.; Westmoreland, P. T.

    1983-01-01

    A progress report is presented on a program to upgrade the existing NASA Deep Space Network in terms of a redesigned computer-controlled data acquisition system for channelling tracking, telemetry, and command data between a California-based control center and three signal processing centers in Australia, California, and Spain. The methodology for the improvements is oriented towards single-subsystem development with consideration for a multi-system and multi-subsystem network of operational software. Details of the existing hardware configurations and data transmission links are provided. The program methodology includes data flow design, interface design and coordination, incremental capability availability, increased inter-subsystem developmental synthesis and testing, system and network level synthesis and testing, and system verification and validation. The software has been implemented thus far to a 65 percent completion level, and the methodology being used to effect the changes, which will permit enhanced tracking of and communication with spacecraft, is concluded to feature effective techniques.

  14. Development of Audit Calculation Methodology for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joosuk; Kim, Gwanyoung; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The interim criteria contain more stringent limits than previous ones. For example, pellet-to-cladding mechanical interaction (PCMI) was introduced as a new failure criterion, and both short-term (e.g., fuel-to-coolant interaction, rod burst) and long-term (e.g., fuel rod ballooning, flow blockage) phenomena should be addressed for core coolability assurance. For dose calculations, transient-induced fission gas release additionally has to be accounted for. Traditionally, the approved RIA analysis methodologies for licensing application are developed based on a conservative approach, but the newly introduced safety criteria tend to reduce the margins to the criteria. Licensees are therefore trying to improve the margins by utilizing a less conservative approach. To cope with this trend, a new audit calculation methodology needs to be developed. In this paper, the new methodology, which is currently under development at KINS, is introduced. For the development of an audit calculation methodology for RIA safety analysis based on a realistic evaluation approach, a preliminary calculation using a best-estimate code has been performed for the initial core of APR1400. The main conclusions follow. - With the assumption of a single full-strength control rod ejection in the hot zero power (HZP) condition, rod failure due to PCMI is not predicted. - Coolability can be assured in view of enthalpy and fuel melting. - However, rod failure due to DNBR is expected, and there is also a possibility of fuel failure at rated power conditions.

  15. Radioimmunoassay of h-TSH - methodological suggestions for dealing with medium to large numbers of samples

    International Nuclear Information System (INIS)

    Mahlstedt, J.

    1977-01-01

    The article deals with practical aspects of establishing a TSH-RIA for patients, with particular regard to predetermined quality criteria. Methodological suggestions are made for medium to large numbers of samples, with the aim of reducing monotonous precision working steps by means of simple aids. The required quality criteria are well met, while the test procedure is well adapted to the rhythm of work and may be carried out without loss of precision even with large numbers of samples. (orig.) [de

  16. Methodology for plastic fracture - a progress report

    International Nuclear Information System (INIS)

    Wilkinson, J.P.D.; Smith, R.E.E.

    1977-01-01

    This paper describes the progress of a study to develop a methodology for plastic fracture. Such a fracture mechanics methodology, having application in the plastic region, is required to assess the margin of safety inherent in nuclear reactor pressure vessels. The initiation and growth of flaws in pressure vessels under overload conditions is distinguished by a number of unique features, such as large scale yielding, three-dimensional structural and flaw configurations, and failure instabilities that may be controlled by either toughness or plastic flow. In order to develop a broadly applicable methodology of plastic fracture, these features require the following analytical and experimental studies: development of criteria for crack initiation and growth under large scale yielding; the use of the finite element method to describe elastic-plastic behaviour of both the structure and the crack tip region; and extensive experimental studies on laboratory scale and large scale specimens, which attempt to reproduce the pertinent plastic flow and crack growth phenomena. This discussion centers on progress to date on the selection, through analysis and laboratory experiments, of viable criteria for crack initiation and growth during plastic fracture. (Auth.)

  17. A framework for assessing the adequacy and effectiveness of software development methodologies

    Science.gov (United States)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.

  18. APPLICATION OF METHODOLOGY OF STRATEGIC PLANNING IN DEVELOPING NATIONAL PROGRAMMES ON DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Inna NOVAK

    2015-07-01

    Full Text Available Actuality: The main purpose of strategic planning is that the long-term interests of sustainable development of a market economy require the use of effective measures of state regulation of economic and social processes. Objective: The aim of the article is to analyze the development of strategic planning methodology and the practical experience of its application in the design of national development programs. Methods: In writing the article the following research methods were used: analysis and synthesis, target-oriented and monographic. Results: In Ukraine, strategies for the development of branches, regions, cities, etc. are being developed at the level of state and local government authorities, but given the lack of state funding a unified investment strategy for the country has not been developed. After analyzing the development of the strategic planning methodology and examples of its application in the design of state development programs, we identified the need to develop an investment strategy of the state (sectors, regions, etc.), as its defined directions and guidelines will increase the level of investment in the country and support the national strategy “Ukraine-2020”.

  19. Energy indicators for sustainable development: Guidelines and methodologies

    International Nuclear Information System (INIS)

    2005-04-01

    This publication is the product of an international initiative to define a set of Energy Indicators for Sustainable Development (EISD) and corresponding methodologies and guidelines. The successful completion of this work is the result of an intensive effort led by the International Atomic Energy Agency (IAEA) in cooperation with the United Nations Department of Economic and Social Affairs (UNDESA), the International Energy Agency (IEA), Eurostat and the European Environment Agency (EEA). The thematic framework, guidelines, methodology sheets and energy indicators set out in this publication reflect the expertise of these various agencies, recognized worldwide as leaders in energy and environmental statistics and analysis. While each agency has an active indicator programme, one goal of this joint endeavour has been to provide users with a consensus by leading experts on definitions, guidelines and methodologies for the development and worldwide use of a single set of energy indicators. No set of energy indicators can be final and definitive. To be useful, indicators must evolve over time to fit country-specific conditions, priorities and capabilities. The purpose of this publication is to present one set of EISD for consideration and use, particularly at the national level, and to serve as a starting point in the development of a more comprehensive and universally accepted set of energy indicators relevant to sustainable development. It is hoped that countries will use the EISD to assess their energy systems and to track their progress towards nationally defined sustainable development goals and objectives. It is also hoped that users of the information presented in this publication will contribute to refinements of energy indicators for sustainable development by adding their own unique perspectives to what is presented herein

  20. Energy indicators for sustainable development: Guidelines and methodologies

    International Nuclear Information System (INIS)

    2008-01-01

    This publication is the product of an international initiative to define a set of Energy Indicators for Sustainable Development (EISD) and corresponding methodologies and guidelines. The successful completion of this work is the result of an intensive effort led by the International Atomic Energy Agency (IAEA) in cooperation with the United Nations Department of Economic and Social Affairs (UNDESA), the International Energy Agency (IEA), Eurostat and the European Environment Agency (EEA). The thematic framework, guidelines, methodology sheets and energy indicators set out in this publication reflect the expertise of these various agencies, recognized worldwide as leaders in energy and environmental statistics and analysis. While each agency has an active indicator programme, one goal of this joint endeavour has been to provide users with a consensus by leading experts on definitions, guidelines and methodologies for the development and worldwide use of a single set of energy indicators. No set of energy indicators can be final and definitive. To be useful, indicators must evolve over time to fit country-specific conditions, priorities and capabilities. The purpose of this publication is to present one set of EISD for consideration and use, particularly at the national level, and to serve as a starting point in the development of a more comprehensive and universally accepted set of energy indicators relevant to sustainable development. It is hoped that countries will use the EISD to assess their energy systems and to track their progress towards nationally defined sustainable development goals and objectives. It is also hoped that users of the information presented in this publication will contribute to refinements of energy indicators for sustainable development by adding their own unique perspectives to what is presented herein

  1. Development of advanced methodology for defect assessment in FBR power plants

    International Nuclear Information System (INIS)

    Meshii, Toshiyuki; Asayama, Tai

    2001-03-01

    As a preparation for developing an FBR post-construction code, (a) JSME Code NA1-2000 was reviewed from the standpoint of applying it to FBR power plants and the methodologies necessary for defect assessment in FBR plants were pointed out, and (b) a large-capacity, high-speed fatigue crack propagation (FCP) testing system was developed and some data were acquired to evaluate the FCP characteristics under thermal stresses. The results showed that extended research on the following items is necessary for developing an FBR post-construction code. (1) Development of assessment for multiple defects due to creep damage. Multiple defects due to creep damage are not considered in the existing code, which was established for nuclear power plants in service at negligible-creep temperatures. Therefore, a method to assess the integrity of components with such multiple creep-induced defects is necessary. (2) FCP resistance for small loads. Since components of FBR power plants are designed to minimize thermal stresses, the accuracy of FCP resistance data for small loads is important for estimating crack propagation under thermal stresses accurately. However, sufficient FCP data for small loads are not available, perhaps because acquiring the data is time consuming. Therefore, we developed a large-capacity, high-speed FCP testing system, prepared a guideline for accelerated testing, and acquired some data to meet these needs. Continuous efforts to accumulate small-load FCP data for various materials are necessary. (author)
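
    The abstract does not state which crack-growth law the acquired FCP data are fitted to; purely as an illustration, the sketch below integrates the commonly used Paris law, da/dN = C(ΔK)^m, to show how small-load (small ΔK) FCP resistance feeds a crack-propagation estimate under cyclic thermal stress. The constants, geometry factor and stress range are hypothetical.

        # Hedged sketch: integrate a Paris-law crack-growth model cycle by cycle.
        # C, m, the geometry factor Y and the stress range are hypothetical values,
        # not data from the paper.
        import math

        C, m = 1.0e-11, 3.0          # Paris constants (da/dN in m/cycle, dK in MPa*sqrt(m))
        Y = 1.12                     # geometry factor for a shallow surface crack
        delta_sigma = 60.0           # cyclic (thermal) stress range, MPa

        a = 0.001                    # initial crack depth, m
        for cycle in range(200_000):
            delta_K = Y * delta_sigma * math.sqrt(math.pi * a)   # stress-intensity range
            a += C * delta_K ** m                                # Paris-law increment
        print(f"crack depth after 200,000 cycles: {a * 1000:.2f} mm")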

  2. Development Methodology for an Integrated Legal Cadastre

    NARCIS (Netherlands)

    Hespanha, J.P.

    2012-01-01

    This Thesis describes the research process followed in order to achieve a development methodology applicable to the reform of cadastral systems with a legal basis. It was motivated by the author’s participation in one of the first surveying and mapping operations for a digital cadastre in Portugal,

  3. The development of a safety analysis methodology for the optimized power reactor 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun; Yo-Han, Kim

    2005-01-01

    Korea Electric Power Research Institute (KEPRI) has been developing in-house safety analysis methodology based on the delicate codes available to KEPRI to overcome the problems arising from the currently used vendor-oriented methodologies. For the Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for the Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. Also, for the Non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents mentioned in the final safety analysis report (FSAR) were analyzed. (author)

  4. Methodology for Developing a Diesel Exhaust After Treatment Simulation Tool

    DEFF Research Database (Denmark)

    Christiansen, Tine; Jensen, Johanne; Åberg, Andreas

    2018-01-01

    A methodology for the development of catalyst models is presented. Also, a methodology of the implementation of such models into a modular simulation tool, which simulates the units in succession, is presented. A case study is presented illustrating how suitable models can be found and used for s...

  5. Development of seismic PSA methodology at JAERI

    International Nuclear Information System (INIS)

    Muramatsu, K.; Ebisawa, K.; Matsumoto, K.; Oikawa, T.; Kondo, M.

    1995-01-01

    The Japan Atomic Energy Research Institute (JAERI) is developing a methodology for seismic probabilistic safety assessment (PSA) of nuclear power plants, aiming at providing a set of procedures, computer codes and data suitable for performing seismic PSAs in Japan. In order to demonstrate the usefulness of JAERI's methodology and to obtain a better understanding of the factors controlling the results of seismic PSAs, a seismic PSA for a BWR is in progress. In the course of this PSA, various improvements were made to the methodology. In the area of hazard analysis, the application of the current method to the model plant site is being carried out. In the area of response analysis, the response factor method was modified to consider the non-linear response effect of the building. As for the capacity evaluation of components, since capacity data for PSA in Japan are very scarce, the capacities of selected components used in Japan were evaluated. In the systems analysis, the SECOM2 code was improved to perform importance analysis and sensitivity analysis for the effect of correlation of responses and correlation of capacities. This paper summarizes the recent progress of the seismic PSA research at JAERI, with emphasis on the evaluation of component capacity and the improvement of the systems reliability analysis methodology. (author)
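
    As a hedged illustration of the component capacity (fragility) evaluation mentioned above, the sketch below evaluates the lognormal fragility curve commonly used in seismic PSA, P(failure | a) = Φ(ln(a / A_m) / β). The median capacity A_m and logarithmic standard deviation β are hypothetical values, and this standard form is not necessarily the exact formulation used by JAERI.

        # Hedged sketch: lognormal fragility curve commonly used in seismic PSA.
        # A_m (median ground-acceleration capacity) and beta (composite logarithmic
        # standard deviation) are hypothetical values for illustration only.
        import math

        def fragility(a, A_m=0.9, beta=0.45):
            """Conditional failure probability at peak ground acceleration a (in g)."""
            z = math.log(a / A_m) / beta
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF

        for a in (0.2, 0.4, 0.6, 0.9, 1.2):
            print(f"PGA = {a:.1f} g  ->  P(failure) = {fragility(a):.3f}")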

  6. An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits

    Science.gov (United States)

    Corliss, Walter F., II

    1989-03-01

    The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that was previously designed, was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented and the chip was fabricated by MOSIS. This fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full-custom VLSI chip designed at NPS to be tested with the NPS digital analysis system, the Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.

  7. Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.

    Science.gov (United States)

    Demchak, Barry; Krüger, Ingolf

    2012-07-01

    The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus enabling policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost effectively evolve such applications over a long lifetime.

  8. Development of risk assessment methodology against natural external hazards for sodium-cooled fast reactors: project overview and strong Wind PRA methodology - 15031

    International Nuclear Information System (INIS)

    Yamano, H.; Nishino, H.; Kurisaka, K.; Okano, Y.; Sakai, T.; Yamamoto, T.; Ishizuka, Y.; Geshi, N.; Furukawa, R.; Nanayama, F.; Takata, T.; Azuma, E.

    2015-01-01

    This paper mainly describes the development of a strong wind probabilistic risk assessment (PRA) methodology, in addition to giving a project overview. In this project, to date, PRA methodologies against snow, tornado and strong wind have been developed, as well as the corresponding hazard evaluation methodologies. For the volcanic eruption hazard, an ash fallout simulation was carried out to contribute to the development of the hazard evaluation methodology. For the forest fire hazard, the concept of the hazard evaluation methodology was developed based on fire simulation. An event sequence assessment methodology was also developed, based on plant dynamics analysis coupled with a continuous Markov chain Monte Carlo method, for application to the event sequence against snow. In developing the strong wind PRA methodology, hazard curves were estimated by using Weibull and Gumbel distributions based on weather data recorded in Japan. The obtained hazard curves were divided into five discrete categories for event tree quantification. Next, failure probabilities for decay heat removal related components were calculated as the product of two probabilities: a probability for the missiles to enter the intake or outtake of the decay heat removal system, and the fragility caused by the missile impacts. Finally, based on the event tree, the core damage frequency was estimated at about 6×10⁻⁹/year by multiplying the discrete hazard probabilities in the Gumbel distribution by the conditional decay heat removal failure probabilities. A dominant sequence arose from the assumption that the operators could not extinguish a fuel tank fire caused by the missile impacts and that the fire induced loss of the decay heat removal system. (authors)
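
    A minimal sketch of the final quantification step described above follows: the annual frequency of each discrete hazard category is multiplied by the conditional decay heat removal failure probability for that category, and the products are summed into a core damage frequency. All numbers are hypothetical placeholders, not the study's data.

        # Hedged sketch: combine discrete wind-hazard frequencies with conditional
        # failure probabilities to estimate a core damage frequency (CDF).
        # The five hazard categories and all numbers below are hypothetical.

        hazard_freq_per_year = [1e-2, 1e-3, 1e-4, 1e-5, 1e-6]        # wind-speed bins
        p_dhr_failure_given_hazard = [1e-7, 1e-6, 1e-5, 1e-4, 1e-3]  # conditional DHR failure

        cdf = sum(f * p for f, p in zip(hazard_freq_per_year,
                                        p_dhr_failure_given_hazard))
        print(f"estimated core damage frequency: {cdf:.2e} /year")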

  9. Methodology for development of risk indicators for offshore platforms

    International Nuclear Information System (INIS)

    Oeien, K.; Sklet, S.

    1999-01-01

    This paper presents a generic methodology for development of risk indicators for petroleum installations and a specific set of risk indicators established for one offshore platform. The risk indicators should be used to control the risk during operation of platforms. The methodology is purely risk-based and the basis for development of risk indicators is the platform specific quantitative risk analysis (QRA). In order to identify high risk contributing factors, platform personnel are asked to assess whether and how much the risk influencing factors will change. A brief comparison of probabilistic safety assessment (PSA) for nuclear power plants and quantitative risk analysis (QRA) for petroleum platforms is also given. (au)

  10. Large rotorcraft transmission technology development program

    Science.gov (United States)

    Mack, J. C.

    1983-01-01

    Testing of a U.S. Army XCH-62 HLH aft rotor transmission under NASA Contract NAS 3-22143 was successfully completed. This test establishes the feasibility of large, high-power rotorcraft transmissions as well as demonstrating the resolution of deficiencies identified during the HLH advanced technology programs and reported in USAAMRDL-TR-77-38. Over 100 hours of testing was conducted. At the 100% design power rating of 10,620 horsepower, the power transferred through a single spiral bevel gear mesh is more than twice that of current helicopter bevel gearing. In the original design of these gears, industry-wide design methods were employed and failures were experienced which identified problem areas unique to gear size. To remedy this technology shortfall, a program was developed to predict gear stresses using finite element analysis for a complete and accurate representation of the gear tooth and supporting structure. To validate the finite element methodology, gear strain data from the existing U.S. Army HLH aft transmission were acquired, and existing data from smaller gears were made available.

  11. Non-economic determinants of economic development: methodology and influence

    OpenAIRE

    Barashov, N.

    2011-01-01

    The paper deals with research methodology of non-economic determinants of economic development. The author considers various theoretical approaches to definition of economic growth factors. Considerable attention is given to studying possible influence of non-economic determinants on quality of economic development.

  12. Application of low-cost methodologies for mobile phone app development.

    Science.gov (United States)

    Zhang, Melvyn; Cheow, Enquan; Ho, Cyrus Sh; Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon

    2014-12-09

    The usage of mobile phones and mobile phone apps has become markedly more prevalent in the recent decade. Previous research has highlighted a method of using just the Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have not been any disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data of the apps and share initial users' self-rated perceptions of the apps. In this study, we present two techniques for creating a mobile app using two of the well-established online mobile app building websites. The costs of development are specified and the methodology of dissemination of the apps is shared. The application of the low-cost methodologies in the creation of the "Mastering Psychiatry" app for undergraduates and the "Déjà vu" app for postgraduates is discussed. A questionnaire survey was administered to undergraduate students collating their perceptions towards the app. For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrated the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low-cost self-developed apps. This is one of the few studies that have demonstrated the low

  13. Developing Agent-Oriented Video Surveillance System through Agent-Oriented Methodology (AOM)

    Directory of Open Access Journals (Sweden)

    Cheah Wai Shiang

    2016-12-01

    Full Text Available Agent-oriented methodology (AOM) is a comprehensive and unified agent methodology for agent-oriented software development. Although AOM is claimed to be able to cope with complex system development, it has not yet been determined to what extent this may be true. Therefore, it is vital to conduct an investigation to validate this methodology. This paper presents the adoption of AOM in developing an agent-oriented video surveillance system (VSS). An intruder handling scenario is designed and implemented through AOM. AOM provides an alternative method to engineer a distributed security system in a systematic manner. It presents the security system from a holistic view, provides a better conceptualization of an agent-oriented security system, and supports rapid prototyping as well as simulation of the video surveillance system.

  14. Large-scale computation in solid state physics - Recent developments and prospects

    International Nuclear Information System (INIS)

    DeVreese, J.T.

    1985-01-01

    During the past few years an increasing interest in large-scale computation has been developing. Several initiatives have been taken to evaluate and exploit the potential of "supercomputers" like the CRAY-1 (or XMP) or the CYBER-205. In the U.S.A., there first appeared the Lax report in 1982 and subsequently (1984) the National Science Foundation announced a program to promote large-scale computation at the universities. Also, in Europe several CRAY and CYBER-205 systems have been installed. Although the presently available mainframes are the result of a continuous growth in speed and memory, they might have induced a discontinuous transition in the evolution of the scientific method; between theory and experiment a third methodology, "computational science", has become or is becoming operational

  15. Environmental quality indexing of large industrial development alternatives using AHP

    International Nuclear Information System (INIS)

    Solnes, Julius

    2003-01-01

    Two industrial development alternatives have been proposed for the East Coast of Iceland in order to strengthen its socio-economic basis. The favoured option is to build a large aluminium smelter, which requires massive hydropower development in the nearby highlands. Another viable option is the construction of a 6-million-ton oil refinery, following the planned exploitation of the Timan Pechora oil reserves in the Russian Arctic. A third 'fictitious' alternative could be general development of existing regional industry and new knowledge-based industries, development of ecotourism, and establishment of national parks, accompanied by infrastructure improvement (roads, tunnels, communications, schools, etc.). The three alternatives will have different environmental consequences. The controversial hydropower plant for the smelter requires a large water reservoir as well as considerable land disturbance in this unique mountain territory, considered to be the largest uninhabited wilderness in Western Europe. The aluminium smelter and the oil refinery will give rise to a substantial increase in the greenhouse gas (GHG) emissions of the country (about 20%). There is also a potential environmental risk associated with the refinery regarding oil spills at sea, which could have a disastrous impact on the fisheries industry. However, the oil refinery does not require any hydropower development, which is a positive factor. Finally, the third alternative could be defined as a "green" solution whereby the detrimental environmental consequences of the two industrial solutions are mostly avoided. In order to compare the three alternatives in an orderly manner, the analytic hierarchy process (AHP) methodology of Saaty was applied to calculate the environmental quality index of each alternative, which is defined as a weighted sum of selected environmental and socio-economic factors. These factors are evaluated on a comparison basis, applying the AHP methodology, and the weights in the quality
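
    As a hedged illustration of the AHP step described above, the sketch below derives factor weights from a hypothetical pairwise comparison matrix (using the geometric-mean approximation of Saaty's priority vector) and forms a weighted-sum quality index for each alternative. The comparison values and factor scores are invented for illustration only.

        # Hedged sketch: AHP-style weighting and a weighted-sum quality index.
        # The pairwise comparisons and factor scores are hypothetical.
        import math

        # Pairwise comparison of three factors: GHG emissions, land disturbance, jobs.
        A = [[1.0,   3.0,   5.0],
             [1/3.0, 1.0,   2.0],
             [1/5.0, 1/2.0, 1.0]]

        # Geometric-mean approximation of the principal eigenvector (Python 3.8+).
        gm = [math.prod(row) ** (1.0 / len(row)) for row in A]
        weights = [g / sum(gm) for g in gm]

        # Factor scores (0-1, higher = better) for three hypothetical alternatives.
        scores = {"smelter":  [0.3, 0.2, 0.8],
                  "refinery": [0.4, 0.7, 0.6],
                  "green":    [0.9, 0.9, 0.4]}

        for name, s in scores.items():
            index = sum(w * x for w, x in zip(weights, s))
            print(f"{name:9s} quality index = {index:.3f}")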

  16. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  17. Methodology for developing evidence-based clinical imaging guidelines: Joint recommendations by Korea society of radiology and national evidence-based healthcare collaborating agency

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Sol Ji; Jo, Ae Jeong; Choi, Jin A [Div. for Healthcare Technology Assessment Research, National Evidence-Based Healthcare Collaborating Agency, Seoul (Korea, Republic of); and others

    2017-01-15

    This paper is a summary of the methodology, including the protocol, used to develop evidence-based clinical imaging guidelines (CIGs) in Korea, led by the Korean Society of Radiology and the National Evidence-based Healthcare Collaborating Agency. This is the first protocol to reflect the process of developing diagnostic guidelines in Korea. The development protocol is largely divided into the following sections: set-up, process of adaptation, and finalization. The working group is composed of clinical imaging experts, and the developmental committee is composed of multidisciplinary experts to validate the methodology. The Korean CIGs will continue to develop based on this protocol, and these guidelines will act as decision-supporting tools for clinicians as well as reduce medical radiation exposure.

  18. Methodology for developing evidence-based clinical imaging guidelines: Joint recommendations by Korea society of radiology and national evidence-based healthcare collaborating agency

    International Nuclear Information System (INIS)

    Choi, Sol Ji; Jo, Ae Jeong; Choi, Jin A

    2017-01-01

    This paper is a summary of the methodology, including the protocol, used to develop evidence-based clinical imaging guidelines (CIGs) in Korea, led by the Korean Society of Radiology and the National Evidence-based Healthcare Collaborating Agency. This is the first protocol to reflect the process of developing diagnostic guidelines in Korea. The development protocol is largely divided into the following sections: set-up, process of adaptation, and finalization. The working group is composed of clinical imaging experts, and the developmental committee is composed of multidisciplinary experts to validate the methodology. The Korean CIGs will continue to develop based on this protocol, and these guidelines will act as decision-supporting tools for clinicians as well as reduce medical radiation exposure

  19. Prometheus Reactor I&C Software Development Methodology, for Action

    Energy Technology Data Exchange (ETDEWEB)

    T. Hamilton

    2005-07-30

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I&C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I&C Software Development Process Manual and Reactor Module Software Development Plan to NR for information.

  20. Methodological developments and applications of neutron activation analysis

    International Nuclear Information System (INIS)

    Kucera, J.

    2007-01-01

    The paper reviews the author's experience acquired and achievements made in methodological developments of neutron activation analysis (NAA) of mostly biological materials. These involve epithermal neutron activation analysis, radiochemical neutron activation analysis using both single- and multi-element separation procedures, use of various counting modes, and the development and use of the self-verification principle. The role of NAA in the detection of analytical errors is discussed and examples of applications of the procedures developed are given. (author)

  1. Methodological Grounds of Managing Innovation Development of Restaurants

    OpenAIRE

    Naidiuk V. S.

    2013-01-01

    The goal of the article lies in the identification and further development of the methodological grounds of managing the innovation development of restaurants. Based on a critical analysis of existing scientific views on the interpretation of the essence of the "managing innovation development of an enterprise" notion, the article clarifies this definition. As a result of the study, the article builds up a cause-effect diagram of the solution of the problem of ensuring efficien...

  2. A Quantitative, Non-Destructive Methodology for Habitat Characterisation and Benthic Monitoring at Offshore Renewable Energy Developments

    Science.gov (United States)

    Sheehan, Emma V.; Stevens, Timothy F.; Attrill, Martin J.

    2010-01-01

    Following governments' policies to tackle global climate change, the development of offshore renewable energy sites is likely to increase substantially over coming years. All such developments interact with the seabed to some degree and so a key need exists for suitable methodology to monitor the impacts of large-scale Marine Renewable Energy Installations (MREIs). Many of these will be situated on mixed or rocky substrata, where conventional methods to characterise the habitat are unsuitable. Traditional destructive sampling is also inappropriate in conservation terms, particularly as safety zones around MREIs could function as Marine Protected Areas, with positive benefits for biodiversity. Here we describe a technique developed to effectively monitor the impact of MREIs and report the results of its field testing, enabling large areas to be surveyed accurately and cost-effectively. The methodology is based on a high-definition video camera, plus LED lights and laser scale markers, mounted on a “flying array” that maintains itself above the seabed grounded by a length of chain, thus causing minimal damage. Samples are taken by slow-speed tows of the gear behind a boat (200 m transects). The HD video and randomly selected frame grabs are analysed to quantify species distribution. The equipment was tested over two years in Lyme Bay, UK (25 m depth), then subsequently successfully deployed in demanding conditions at the deep (>50 m) high-energy Wave Hub site off Cornwall, UK, and a potential tidal stream energy site in Guernsey, Channel Islands (1.5 m s−1 current), the first time remote samples from such a habitat have been achieved. The next stage in the monitoring development process is described, involving the use of Remote Operated Vehicles to survey the seabed post-deployment of MREI devices. The complete methodology provides the first quantitative, relatively non-destructive method for monitoring mixed-substrate benthic communities beneath MPAs and

  3. A quantitative, non-destructive methodology for habitat characterisation and benthic monitoring at offshore renewable energy developments.

    Directory of Open Access Journals (Sweden)

    Emma V Sheehan

    2010-12-01

    Full Text Available Following governments' policies to tackle global climate change, the development of offshore renewable energy sites is likely to increase substantially over coming years. All such developments interact with the seabed to some degree and so a key need exists for suitable methodology to monitor the impacts of large-scale Marine Renewable Energy Installations (MREIs). Many of these will be situated on mixed or rocky substrata, where conventional methods to characterise the habitat are unsuitable. Traditional destructive sampling is also inappropriate in conservation terms, particularly as safety zones around MREIs could function as Marine Protected Areas, with positive benefits for biodiversity. Here we describe a technique developed to effectively monitor the impact of MREIs and report the results of its field testing, enabling large areas to be surveyed accurately and cost-effectively. The methodology is based on a high-definition video camera, plus LED lights and laser scale markers, mounted on a "flying array" that maintains itself above the seabed grounded by a length of chain, thus causing minimal damage. Samples are taken by slow-speed tows of the gear behind a boat (200 m transects). The HD video and randomly selected frame grabs are analysed to quantify species distribution. The equipment was tested over two years in Lyme Bay, UK (25 m depth), then subsequently successfully deployed in demanding conditions at the deep (>50 m) high-energy Wave Hub site off Cornwall, UK, and a potential tidal stream energy site in Guernsey, Channel Islands (1.5 m s⁻¹ current), the first time remote samples from such a habitat have been achieved. The next stage in the monitoring development process is described, involving the use of Remote Operated Vehicles to survey the seabed post-deployment of MREI devices. The complete methodology provides the first quantitative, relatively non-destructive method for monitoring mixed-substrate benthic communities beneath

  4. Theoretical and methodological basis of the comparative historical and legal method development

    Directory of Open Access Journals (Sweden)

    Д. А. Шигаль

    2015-05-01

    Full Text Available Problem setting. The development of any scientific method always raises both the question of its structural and functional characteristics and its place in the system of scientific methods, and the question of the practicability of such methodological work. This paper attempts to give a detailed response to the major comments and objections arising with respect to the separation of the comparative historical and legal method as an independent means of special scientific knowledge. Recent research and publications analysis. Analyzing research and publications within the theme of this article, it should be noted that attention to the methodological issues of both general and legal science was paid at the time by such prominent foreign and domestic scholars as I. D. Andreev, Yu. Ya. Baskin, O. L. Bygych, M. A. Damirli, V. V. Ivanov, I. D. Koval'chenko, V. F. Kolomyitsev, D. V. Lukyanov, L. A. Luts, J. Maida, B. G. Mogilnytsky, N. M. Onishchenko, N. M. Parkhomenko, O. V. Petryshyn, S. P. Pogrebnyak, V. I. Synaisky, V. M. Syryh, O. F. Skakun, A. O. Tille, D. I. Feldman and others. It should be noted that, despite a large number of scientific papers in this field, the interest of the research community in the methodology of the history of state and law remains unjustifiably low. Paper objective. The purpose of this paper is to provide a theoretical and methodological rationale for the need to separate and develop the comparative historical and legal method, in the form of answers to the most common questions and objections that arise in the research community in this regard. Paper main body. The development of comparative historical and legal means of knowledge is quite justified because it meets the requirements of scientific method efficiency, the criteria of which are the speed of achieving the goal, the ease of use of a given way of scientific knowledge, the universality of research methods, the convenience of the techniques that are used, and so on. Combining the

  5. NPA4K development system using object-oriented methodology

    International Nuclear Information System (INIS)

    Jeong, Kwang Seong; Hahn, Do Hee

    2000-11-01

    NPA4K consists of module programs with several components for various functions. Software components have to be developed systematically according to compartment criteria and a design method. In this paper, an understanding of a typical object-oriented methodology, UML (Unified Modeling Language), the procedure for NPA4K program development and the architecture for long-term development of NPA4K are introduced

  6. NPA4K development system using object-oriented methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwang Seong; Hahn, Do Hee

    2000-11-01

    NPA4K consists of module programs with several components for various functions. Software components have to be developed systematically according to compartment criteria and a design method. In this paper, an understanding of a typical object-oriented methodology, UML (Unified Modeling Language), the procedure for NPA4K program development and the architecture for long-term development of NPA4K are introduced.

  7. Development of enterprise architecture management methodology for teaching purposes

    Directory of Open Access Journals (Sweden)

    Dmitry V. Kudryavtsev

    2017-01-01

    Full Text Available Enterprise architecture is considered both as an object of management, providing in business a general view of the enterprise and the mutual alignment of the parts of this enterprise into a single whole, and as the discipline that arose based on this object. The architectural approach to the modeling and design of the enterprise originally arose in the field of information technology and was used to design information systems and technical infrastructure, as well as to formalize business requirements. Since the early 2000s enterprise architecture has been increasingly used in organizational development and business transformation projects, especially if information technologies are involved. Enterprise architecture allows describing, analyzing and designing the company from the point of view of its structure, functioning and goal setting (motivation). In the context of this approach, the enterprise is viewed as a system of services, processes, goals and performance indicators, organizational units, information systems, data, technical facilities, etc. Enterprise architecture implements the idea of a systematic approach to managing and changing organizations in the digital economy, where business is strongly dependent on information technologies. This increases the relevance of the suggested approach at the present time, when companies need to create and successfully implement a digital business strategy. Teaching enterprise architecture in higher educational institutions is a difficult task due to the interdisciplinary nature of this subject, its generalized character and its close connection with practical experience. In addition, modern enterprise architecture management methodologies are complex for students and contain many details that are relevant only for individual situations. The paper proposes a simplified methodology for enterprise architecture management, which on the one hand will be comprehensible to students, and on the other hand will allow students to apply

  8. METHODOLOGICAL APPROACHES FOR MODELING THE RURAL SETTLEMENT DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Gorbenkova Elena Vladimirovna

    2017-10-01

    Full Text Available Subject: the paper describes the research results on validation of a rural settlement developmental model. The basic methods and approaches for solving the problem of assessing the efficiency of urban and rural settlement development are considered. Research objectives: determination of methodological approaches to modeling and creating a model for the development of rural settlements. Materials and methods: domestic and foreign experience in modeling the territorial development of urban and rural settlements and settlement structures was generalized. The motivation for using the Pentagon-model for solving similar problems was demonstrated. Based on a systematic analysis of existing development models of urban and rural settlements, as well as the authors' method for assessing the level of agro-town development, the systems/factors that are necessary for the sustainable development of a rural settlement are identified. Results: we created a rural development model which consists of five major systems that include critical factors essential for achieving sustainable development of a settlement system: the ecological system, economic system, administrative system, anthropogenic (physical) system and social system (supra-structure). The methodological approaches for creating an evaluation model of rural settlement development were revealed; the basic motivating factors that provide the interrelations of the systems were determined; the critical factors for each subsystem were identified and substantiated. Such an approach was justified by the composition of tasks for territorial planning at the local and state administration levels. The feasibility of applying the basic Pentagon-model, which has been successfully used for solving analogous problems of sustainable development, was shown. Conclusions: the resulting model can be used for identifying and substantiating the critical factors for rural sustainable development and also become the basis of

  9. Application of a methodology for the development and validation of reliable process control software

    International Nuclear Information System (INIS)

    Ramamoorthy, C.V.; Mok, Y.R.; Bastani, F.B.; Chin, G.

    1980-01-01

    The necessity of a good methodology for the development of reliable software, especially with respect to the final software validation and testing activities, is discussed. A formal specification development and validation methodology is proposed. This methodology has been applied to the development and validation of pilot software incorporating typical features of critical software for nuclear power plant safety protection. The main features of the approach include the use of a formal specification language and the independent development of two sets of specifications. 1 ref

  10. A methodology to support the development of 4-year pavement management plan.

    Science.gov (United States)

    2014-07-01

    A methodology for forming and prioritizing pavement maintenance and rehabilitation (M&R) projects was developed. The Texas Department of Transportation (TxDOT) can use this methodology to generate defensible and cost-effective 4-year pavement man...

  11. Selecting a software development methodology. [of digital flight control systems]

    Science.gov (United States)

    Jones, R. E.

    1981-01-01

    The state of the art in analytical techniques for the development and verification of digital flight control software is studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.

  12. Territory development as economic and geographical activity (theory, methodology, practice

    Directory of Open Access Journals (Sweden)

    Vitaliy Nikolaevich Lazhentsev

    2013-03-01

    Full Text Available The emphasis in the description of the theory and methodology of territory development is shifted from the distribution of national benefits to the formation of territorial natural and economic systems and the organization of economic and geographical activity. The author reveals the concept of «territory development» and reviews its place in the theory and methodology of human geography and regional economy. In the article the individual directions of economic activity are considered. The author has made an attempt to define the subject matter of five levels of «ideal» territorial and economic systems comprising objects of nature, society, population settlement, production, infrastructure and management. The author's position on the interpretation of the sequence of mechanisms of territory development working according to a Nested Doll principle (mechanism of economy, economic management mechanism, controlling mechanism of economy) is presented. The author shows the indicators which authentically define territory development

  13. Recent developments in methodology for dynamic qualification of nuclear plant equipment

    International Nuclear Information System (INIS)

    Kana, D.D.; Pomerening, D.J.

    1984-01-01

    Dynamic qualification of nuclear plant electrical and mechanical equipment is performed basically under guidelines given in IEEE Standards 323 and 344, and a variety of NRC regulatory guides. Over the last fifteen years qualification methodology prescribed by these documents has changed significantly as interpretations, equipment capability, and imagination of the qualification engineers have progressed. This progress has been sparked by concurrent NRC and industry sponsored research programs that have identified anomalies and developed new methodologies for resolving them. Revisions of the standards have only resulted after a lengthy debate of all such new information and subsequent judgment of its validity. The purpose of this paper is to review a variety of procedural improvements and developments in qualification methodology that are under current consideration as revisions to the standards. Many of the improvements and developments have resulted from recent research programs. All are very likely to appear in one type of standard or another in the near future

  14. Development of a novel methodology for indoor emission source identification

    DEFF Research Database (Denmark)

    Han, K.H.; Zhang, J.S.; Knudsen, H.N.

    2011-01-01

    The objective of this study was to develop and evaluate a methodology to identify individual sources of emissions based on the measurements of mixed air samples and the emission signatures of individual materials previously determined by Proton Transfer Reaction-Mass Spectrometry (PTR-MS), an on-line analytical device. The methodology, based on signal processing principles, was developed by employing the method of multiple regression least squares (MRLS) and a normalization technique. Samples of nine typical building materials were tested individually and in combination, including carpet, ceiling material... experiments and investigation are needed for cases where the relative emission rates among different compounds may change over a long-term period....
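
    A hedged sketch of the regression step follows: given emission signatures of individual materials across a few ion masses and a measured mixture signal, a least-squares fit recovers the contribution of each material. The matrix, the numbers and the plain least-squares solver are illustrative assumptions, not the authors' PTR-MS data or their exact MRLS and normalization procedure.

        # Hedged sketch: least-squares source apportionment of a mixed air sample.
        # Rows of S are normalized emission signatures (one per material) across a
        # few ion masses; y is the measured mixture. All numbers are hypothetical.
        import numpy as np

        S = np.array([[0.7, 0.2, 0.1, 0.0],    # carpet
                      [0.1, 0.6, 0.2, 0.1],    # ceiling material
                      [0.0, 0.1, 0.3, 0.6]])   # linoleum

        true_weights = np.array([2.0, 0.5, 1.0])   # hidden contributions
        y = true_weights @ S                       # synthetic mixture measurement

        # Solve y ~= w @ S for w in the least-squares sense.
        w, *_ = np.linalg.lstsq(S.T, y, rcond=None)
        print("estimated contributions:", np.round(w, 3))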

  15. Methodologies for local development in smart society

    Directory of Open Access Journals (Sweden)

    Lorena BĂTĂGAN

    2012-07-01

    Full Text Available All the digital devices that are connected through the Internet are producing a large quantity of data. All this information can be turned into knowledge because we now have the computational power and the solutions for advanced analytics to make sense of it. With this knowledge, cities could reduce costs, cut waste, and improve efficiency, productivity and quality of life for their citizens. Efficient/smart cities are characterized by the greater importance given to the environment, resources, globalization and sustainable development. This paper presents a study of the methodologies for urban development that have become a central element of our society.

  16. Intelligent systems engineering methodology

    Science.gov (United States)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  17. Development and application of a methodology for the analysis of significant human related event trends in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, H.Y.

    1981-01-01

    A methodology is developed to identify and flag significant trends related to the safety and availability of U.S. commercial nuclear power plants. The development is intended to aid in reducing the likelihood of human errors. To assure that the methodology can be easily adapted to various types of classification schemes for operating data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme is selected for the methodology. Significance criteria were developed for human-initiated events affecting the systems and for events caused by human deficiencies. Clustering analysis was used to verify the learning trend in multidimensional histograms. A computer code based on the K-Means algorithm was developed and applied to find the learning period, in which error rates decrease monotonically with plant age. The Freeman-Tukey (F-T) deviates are used to select generic problems, identified by a large positive value of the deviate (here, above approximately 2.0). The identified generic problems are: decision errors, which are highly associated with reactor startup operations in the learning period of PWR plants (PWRs); response errors, which are highly associated with Secondary Non-Nuclear Systems (SNS) in PWRs; and significant errors affecting systems caused by response actions, which are highly associated with the startup reactor mode in BWRs
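
    For illustration, the sketch below computes Freeman-Tukey deviates for observed versus expected event counts and flags cells whose deviate exceeds 2.0, the approximate threshold quoted above. The formula is the standard Freeman-Tukey form; the counts and category labels are hypothetical.

        # Hedged sketch: Freeman-Tukey deviates for observed vs. expected counts.
        # Cells with a deviate above ~2.0 are flagged as potential generic problems.
        # The observed/expected counts below are hypothetical.
        import math

        def ft_deviate(observed, expected):
            return math.sqrt(observed) + math.sqrt(observed + 1) - math.sqrt(4 * expected + 1)

        cells = {                       # (error category, plant mode): (observed, expected)
            ("decision error", "startup"):  (14, 6.2),
            ("response error", "SNS"):      (9, 3.8),
            ("maintenance error", "power"): (5, 4.9),
        }

        for key, (obs, exp) in cells.items():
            d = ft_deviate(obs, exp)
            flag = "GENERIC PROBLEM" if d > 2.0 else "not significant"
            print(f"{key}: deviate = {d:.2f}  ->  {flag}")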

  18. An approach to SOA development methodology: SOUP comparison with RUP and XP

    Directory of Open Access Journals (Sweden)

    Sandra Svanidzaitė

    2014-08-01

    Full Text Available Service oriented architecture (SOA) is an architecture for distributed applications composed of loosely coupled distributed services that are designed to meet business requirements. One of the research priorities in the field of SOA is creating a software design and development methodology (SDDM) that takes into account all principles of this architecture and allows for effective and efficient application development. A lot of investigation has been carried out to find out whether one of the popular SDDMs, such as agile methodologies or RUP, can be adapted for SOA, or whether there is a need to create a new SOA-oriented SDDM. This paper compares one SOA-oriented SDDM – SOUP – with the RUP and XP methodologies. The aim is to find out whether the SOUP methodology is already mature enough to assure successful development of SOA applications. This aim is accomplished by comparing the activities and artifacts of SOUP and RUP and emphasizing which XP practices are used in SOUP. DOI: http://dx.doi.org/10.15181/csat.v2i1.77

  19. Methodological guidelines for developing accident modification functions

    DEFF Research Database (Denmark)

    Elvik, Rune

    2015-01-01

    This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines, and an example is given of how to use them. The importance of exploratory analysis and an iterative approach in developing accident modification functions is stressed. The example shows that strict compliance with all the guidelines may be difficult, but represents a level of stringency that should be strived for. Currently the main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures.
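
    As an illustration of what fitting an accident modification function can involve (a generic sketch, not the paper's own procedure or data), the code below fits a log-linear function to hypothetical effect estimates from several evaluation studies, weighting each estimate by the inverse of its standard error:

        # Hypothetical example: fit an accident modification function of the form
        # ln(effect) = b0 + b1 * x, where x is some dose variable (e.g. extent of a measure)
        # and each study's log effect estimate is weighted by 1/SE (inverse standard error).
        import numpy as np

        x = np.array([0.5, 1.0, 2.0, 3.0, 5.0])            # hypothetical dose variable
        effect = np.array([0.95, 0.90, 0.82, 0.76, 0.65])  # hypothetical accident modification factors
        se_log = np.array([0.10, 0.08, 0.12, 0.09, 0.15])  # standard errors of ln(effect)

        b1, b0 = np.polyfit(x, np.log(effect), deg=1, w=1.0 / se_log)

        def amf(dose):
            """Fitted accident modification function: predicted effect at a given dose."""
            return np.exp(b0 + b1 * dose)

        print("predicted effect at dose 4.0:", round(float(amf(4.0)), 3))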

  20. An Evidence Based Methodology to Facilitate Public Library Non-fiction Collection Development

    Directory of Open Access Journals (Sweden)

    Matthew Kelly

    2015-12-01

    Full Text Available Objective – This research was designed as a pilot study to test a methodology for subject-based collection analysis for public libraries. Methods – WorldCat collection data from eight Australian public libraries were extracted using the Collection Evaluation application. The data were aggregated and filtered to assess how the sample's titles could be compared against the OCLC Conspectus subject categories. A hierarchy of emphasis emerged, and this was divided into tiers ranging from 1% of the sample. These tiers were further analysed to quantify their representativeness against both the sample's titles and the subject categories taken as a whole. The interpretive aspect of the study sought to understand the types of knowledge embedded in the tiers and was underpinned by hermeneutic phenomenology. Results – The study revealed a marked tendency for a small percentage of subject categories to constitute a large proportion of the potential topicality that might have been represented in these types of collections. The study also found that the distribution of the aggregated collection conformed to a Power Law (80/20) distribution, so that approximately 80% of the collection was represented by 20% of the subject categories. There were also significant commonalities in the types of subject categories found in the designated tiers, and it may be possible to develop ontologies that correspond to the collection tiers. Conclusions – The evidence-based methodology developed in this pilot study has the potential for further development to help improve the practice of collection development. The introduction of the concept of the epistemic role played by collection tiers is a promising aid to understanding knowledge organization for public libraries. The research shows a way forward to help link subjective decision making with a scientifically based approach to managing knowledge.
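
    A minimal sketch of the tiering and 80/20 check described above, using hypothetical subject-category counts in place of the WorldCat data:

        # Hypothetical counts of titles per Conspectus-style subject category.
        counts = {"Fiction": 4200, "Cooking": 1800, "History": 1500, "Health": 900,
                  "Travel": 600, "Gardening": 300, "Philosophy": 120, "Linguistics": 80}

        total = sum(counts.values())
        ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

        cumulative = 0.0
        for i, (category, n) in enumerate(ranked, start=1):
            cumulative += n / total
            share_of_categories = i / len(ranked)
            # An approximately 80/20 (Power Law) pattern shows up when ~20% of the
            # categories already account for ~80% of the cumulative title count.
            print(f"top {share_of_categories:.0%} of categories -> {cumulative:.0%} of titles ({category})")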

  1. Synthesis of methodology development and case studies

    OpenAIRE

    Roetter, R.P.; Keulen, van, H.; Laar, van, H.H.

    2000-01-01

    The Systems Research Network for Ecoregional Land Use Planning in Support of Natural Resource Management in Tropical Asia (SysNet) was financed under the Ecoregional Fund, administered by the International Service for National Agricultural Research (ISNAR). The objective of the project was to develop and evaluate methodologies and tools for land use analysis, and apply them at the subnational scale to support agricultural and environmental policy formulation. In the framework of this projec...

  2. A Comparative Analysis of Two Software Development Methodologies: Rational Unified Process and Extreme Programming

    Directory of Open Access Journals (Sweden)

    Marcelo Rafael Borth

    2014-01-01

    Full Text Available Software development methodologies were created to meet the great market demand for innovation, productivity, quality and performance. With the use of a methodology, it is possible to reduce the cost, the risk and the development time, and even to increase the quality of the final product. This article compares two of these development methodologies: the Rational Unified Process and Extreme Programming. The comparison shows the main differences and similarities between the two approaches, and highlights and comments on some of their predominant features.

  3. A vision on methodology for integrated sustainable urban development: bequest

    NARCIS (Netherlands)

    Bentivegna, V.; Curwell, S.; Deakin, M.; Lombardi, P.; Mitchell, G.; Nijkamp, P.

    2002-01-01

    The concepts and visions of sustainable development that have emerged in the post-Brundtland era are explored in terms of laying the foundations for a common vision of sustainable urban development (SUD). The described vision and methodology for SUD resulted from the activities of an international

  4. Development of a methodology for conducting an integrated HRA/PRA --

    Energy Technology Data Exchange (ETDEWEB)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S. (Brookhaven National Lab., Upton, NY (United States)); Wreathall, J. (Wreathall (John) and Co., Dublin, OH (United States)); Cooper, S.E. (Science Applications International Corp., McLean, VA (United States))

    1993-01-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S-related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S-related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  5. Development of Management Methodology for Engineering Production Quality

    Science.gov (United States)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors of the paper propose four directions for developing a methodology of quality management of engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organization's context taking into account stakeholders, the use of risk management, management of in-house knowledge, and assessment of the enterprise activity according to effectiveness criteria.

  6. A Methodology of Estimation on Air Pollution and Its Health Effects in Large Japanese Cities

    OpenAIRE

    Hirota, Keiko; Shibuya, Satoshi; Sakamoto, Shogo; Kashima, Shigeru

    2012-01-01

    The correlation between air pollution and health effects in large Japanese cities presents a great challenge owing to the limited availability of data on exposure to pollution and on health effects, and to the uncertainty of mixed causes. In this article, a methodology for quantitative relationships (between emission volume and air quality, and between air quality and health effects) is analysed with a statistical method, applied to the correlation of air pollution reduction policy in Japan from 1974 to 2007. ...

  7. Methodology for inferring initial flaw distribution

    International Nuclear Information System (INIS)

    Jouris, G.M.; Shaffer, D.H.

    1980-01-01

    It has been common practice in both deterministic and probabilistic assessment of the integrity of a pressure vessel to assume the presence of a rather large flaw (usually 1/4 the thickness of the vessel wall) in the belt-line region. Although it is highly unlikely that such a large flaw would be present, the assumption is adopted in order to be conservative. A more realistic approach, which can be incorporated in the probabilistic analysis of integrity, is to characterize the depth of a flaw as a random variable and thus allow the probabilities associated with the presence of various size flaws to be reflected in the final estimated probability of vessel failure. This is precisely the motivation for developing the methodology to obtain the distribution of initial flaw depth, which is presented in this paper. It should be mentioned that the methodology developed here is not an end in itself but rather provides an input distribution to be used in a comprehensive integrity assessment. (orig.)
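
    A minimal Monte Carlo sketch of how an inferred initial flaw depth distribution can feed a probabilistic integrity assessment; the exponential depth distribution, its mean, and the critical depth below are illustrative assumptions, not the distribution inferred in the paper:

        # Illustrative only: sample initial flaw depths from an assumed distribution and
        # estimate the fraction exceeding an assumed critical depth.
        import numpy as np

        rng = np.random.default_rng(0)
        wall_thickness_mm = 200.0
        mean_flaw_depth_mm = 3.0     # assumed parameter of an exponential depth distribution
        critical_depth_mm = 20.0     # assumed critical depth, for illustration only

        depths = rng.exponential(scale=mean_flaw_depth_mm, size=1_000_000)
        depths = np.clip(depths, 0.0, wall_thickness_mm)

        p_exceed = float(np.mean(depths >= critical_depth_mm))
        analytic = float(np.exp(-critical_depth_mm / mean_flaw_depth_mm))
        print(f"P(flaw depth >= {critical_depth_mm} mm): sampled {p_exceed:.2e}, analytic {analytic:.2e}")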

  8. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    Science.gov (United States)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  9. Prometheus Reactor I and C Software Development Methodology, for Action

    International Nuclear Information System (INIS)

    T. Hamilton

    2005-01-01

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I and C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I and C Software Development Process Manual and Reactor Module Software Development Plan to NR for information

  10. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

    Full Text Available In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world have introduced new methodological principles into their spatial development models. These methodologies and spatial planning guidelines focus not only on the controlled development of high-rise buildings, but on the spatial modelling of the whole city, by defining main development criteria and estimating possible consequences. Vilnius is no exception; however, the re-establishment of Lithuanian independence was followed by an uncontrolled urbanization process, so most of the city's development regulations emerged as a consequence of the unmanaged legalization of investors' expectations. The importance of a consistent urban fabric, and of conserving and representing the city's most important objects, gained attention only when an actual threat emerged of overshadowing them with new architecture, alongside unmanaged urbanization in the city center and urban sprawl in the suburbs caused by land-use projects. Current Vilnius spatial planning documents clearly define the urban structure and key development principles; however, the definitions are relatively abstract, resulting in uniform building coverage requirements for territories with distinct qualities and in simplified planar designs that do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modeling methods, their individual parts and principles, the criteria for quality assessment, and their applicability in Vilnius. The text contains an outline of possible building coverage regulations and impact assessment criteria for new development. The article contains a compendium of requirements for high-quality spatial planning and building design.

  11. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin as compared with classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in a BELOCA analysis, two kinds of uncertainties generally need to be identified and quantified: model uncertainties and plant status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best estimate LOCA code such as RELAP5 or TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Generally, the DRHM methodology can generate about 80-100 K margin on PCT as compared to an Appendix K bounding state LOCA analysis.
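
    One widely used statistical treatment of plant status uncertainty in CSAU-type analyses is the non-parametric (Wilks) approach: run N code calculations with plant parameters sampled from their uncertainty ranges and take the maximum PCT as a 95%/95% tolerance limit, where N is the smallest sample size satisfying 1 - 0.95^N >= 0.95. The sketch below is a generic illustration of that idea, not the paper's specific implementation; the code run is a placeholder.

        # Illustrative sketch of a first-order, one-sided 95/95 tolerance limit (Wilks' formula).
        import random

        def wilks_sample_size(prob=0.95, conf=0.95):
            """Smallest N such that the sample maximum bounds the 'prob' quantile with 'conf' confidence."""
            n = 1
            while 1.0 - prob ** n < conf:
                n += 1
            return n

        N = wilks_sample_size()            # = 59 for 95%/95%
        random.seed(1)

        def run_loca_case():
            """Placeholder for one code run (e.g. conservative models with sampled plant status parameters)."""
            return 1000.0 + random.gauss(0.0, 40.0)   # hypothetical PCT in K

        pct_runs = [run_loca_case() for _ in range(N)]
        pct_95_95 = max(pct_runs)
        print(f"N = {N} runs, 95/95 PCT estimate = {pct_95_95:.1f} K")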

  12. Developing Large Web Applications

    CERN Document Server

    Loudon, Kyle

    2010-01-01

    How do you create a mission-critical site that provides exceptional performance while remaining flexible, adaptable, and reliable 24/7? Written by the manager of a UI group at Yahoo!, Developing Large Web Applications offers practical steps for building rock-solid applications that remain effective even as you add features, functions, and users. You'll learn how to develop large web applications with the extreme precision required for other types of software. Avoid common coding and maintenance headaches as small websites add more pages, more code, and more programmers; get comprehensive soluti...

  13. Remotely controlled large container disposal methodology

    International Nuclear Information System (INIS)

    Amir, S.J.

    1994-09-01

    Remotely Handled Large Containers (RHLC), also called drag-off boxes, have been used at the Hanford Site since the 1940s to dispose of large pieces of radioactively contaminated equipment. These containers are typically large steel-reinforced concrete boxes, which weigh as much as 40 tons. Because large quantities of high-dose waste can produce radiation levels as high as 200 mrem/hour at 200 ft, the containers are remotely handled (either lifted off the railcar by crane or dragged off with a cable). Many of the existing containers do not meet existing structural and safety design criteria and some of the transportation requirements. The drag-off method of pulling the box off the railcar using a cable and a tractor is also not considered a safe operation, especially in view of past mishaps

  14. A large-scale study of epilepsy in Ecuador: methodological aspects.

    Science.gov (United States)

    Placencia, M; Suarez, J; Crespo, F; Sander, J W; Shorvon, S D; Ellison, R H; Cascante, S M

    1992-01-01

    The methodology is presented of a large-scale study of epilepsy carried out in a highland area in northern Ecuador, South America, covering a population of 72,121 people. The study was carried out in two phases. The first, a cross-sectional phase, consisted of a house-to-house survey of all persons in this population, screening for epileptic seizures using a specially designed questionnaire. Possible cases identified in screening were assessed in a cascade diagnostic procedure applied by general doctors and neurologists. Its objectives were: to establish a comprehensive epidemiological profile of epileptic seizures; to describe the clinical phenomenology of this condition in the community; to validate methods for diagnosis and classification of epileptic seizures by a non-specialised team; and to ascertain the community's knowledge, attitudes and practices regarding epilepsy. A sample was selected in this phase in order to study the social aspects of epilepsy in this community. The second phase, which was longitudinal, assessed the capability of non-specialist care in the treatment of epilepsy. It consisted of a prospective clinical trial of antiepileptic therapy in untreated patients using two standard anti-epileptic drugs. Patients were followed for 12 months by a multidisciplinary team consisting of a primary health worker, rural doctor, neurologist, anthropologist, and psychologist. Standardised, reproducible instruments and methods were used. This study was carried out through co-operation between the medical profession, political agencies and the pharmaceutical industry, at an international level. We consider this a model for further large-scale studies of this type.

  15. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    Science.gov (United States)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  16. The economics of climate change mitigation in developing countries - methodological and empirical results

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.

    1997-12-01

    This thesis presents a methodological and empirical discussion of the costs associated with implementing greenhouse gas reduction strategies in developing countries. It presents a methodological framework for national costing studies and evaluates a number of associated valuation methods. The methodological framework has been applied in several developing countries as part of a UNEP project in which the author has participated, and reference is made to the results of these country studies. Some of the theoretical issues associated with the determination of the costs of emission reductions are discussed with reference to a number of World Bank and UN guidelines for project analysis in developing countries. The use of several accounting prices is recommended for mitigation projects, with a distinction being made between internationally and domestically traded goods. The consequences of using different accounting prices are discussed with respect to the methodology applied in the UNEP country studies. In conclusion the thesis reviews the results of some of the most important international studies of greenhouse gas emissions in developing countries. The review, which encompasses a total of 27 country studies, was undertaken by the author for the Intergovernmental Panel on Climate Change, the IPCC. Its conclusion is that the UNEP methodological framework and associated country study results are consistent with the recommendations and conclusions of the IPCC. (EG) 23 refs.

  17. The economics of climate change mitigation in developing countries - methodological and empirical results

    International Nuclear Information System (INIS)

    Halsnaes, K.

    1997-12-01

    This thesis presents a methodological and empirical discussion of the costs associated with implementing greenhouse gas reduction strategies in developing countries. It presents a methodological framework for national costing studies and evaluates a number of associated valuation methods. The methodological framework has been applied in several developing countries as part of a UNEP project in which the author has participated, and reference is made to the results of these country studies. Some of the theoretical issues associated with the determination of the costs of emission reductions are discussed with reference to a number of World Bank and UN guidelines for project analysis in developing countries. The use of several accounting prices is recommended for mitigation projects, with a distinction being made between internationally and domestically traded goods. The consequences of using different accounting prices are discussed with respect to the methodology applied in the UNEP country studies. In conclusion the thesis reviews the results of some of the most important international studies of greenhouse gas emissions in developing countries. The review, which encompasses a total of 27 country studies, was undertaken by the author for the Intergovernmental Panel on Climate Change, the IPCC. Its conclusion is that the UNEP methodological framework and associated country study results are consistent with the recommendations and conclusions of the IPCC. (EG) 23 refs

  18. European methodology for qualification of NDT as developed by ENIQ

    International Nuclear Information System (INIS)

    Champigny, F.; Sandberg, U.; Engl, G.; Crutzen, S.; Lemaitre, P.

    1997-01-01

    The European Network for Inspection Qualification (ENIQ) groups together most of the nuclear power plant operators in the European Union (and Switzerland). The main objective of ENIQ is to co-ordinate and manage at European level the expertise and resources for the qualification of NDE inspection systems, primarily for nuclear components. In the framework of ENIQ the European methodology for qualification of NDT has been developed. In this paper the main principles of the European methodology are given, along with the main activities and organisation of ENIQ. (orig.)

  19. Research and assessment of competitiveness of large engineering complexes

    Directory of Open Access Journals (Sweden)

    Krivorotov V.V.

    2017-01-01

    Full Text Available The urgency of the problem of ensuring the competitiveness of manufacturing and high-tech sectors is shown. The decisive role of large industrial complexes in shaping the results of the national economy is substantiated, and the authors' interpretation of the concept of an “industrial complex” with regard to current economic systems is given. Current approaches to assessing the competitiveness of enterprises and industrial complexes are analyzed, and their main advantages and disadvantages are shown. A scientific-methodological approach to the study and management of the competitiveness of a large industrial complex is provided, and its main units are described. As the central element of this approach, a methodology for assessing the competitiveness of a large industrial complex based on the Pattern method is proposed; a modular system of competitiveness indicators is developed and adapted to large engineering complexes. Using the developed methodology, the competitiveness of one of the largest engineering complexes, the group of companies Uralelectrotyazhmash, one of the leading enterprises in the electrotechnical industry of Russia, is assessed. The evaluation identified the main problems and bottlenecks in the development of these enterprises and compared them with leading competitors. Based on the results of the study, the main conclusions and recommendations are formed.

  20. Development and delivery of a workshop methodology: planning for biomass power plant projects

    Energy Technology Data Exchange (ETDEWEB)

    Gray, A.J.; Delbridge, P.; Trevorrow, E.; Pile, C.

    2001-07-01

    This report gives details of the approach used to develop a workshop methodology to help planners and stakeholders address key issues that may arise when submitting a planning application for a biomass power plant in the light of the UK government's energy and climate change targets. The results of interviews with stakeholders (central government, regulatory authorities, developers, planners, non-governmental organisations, local community, resident groups) are summarised, and the NIMBY (not in my back yard) syndrome, the lack of trust in the developer, and lack of awareness of the use of biomass are discussed. Details are given of the design and testing of the workshop methodology and the resulting workshop methodology and workbook guide aimed at understanding the stakeholder issues and concerns through stakeholder discussions.

  1. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems, increasing an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard to uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample sizes, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness with lower cost.
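
    The abstract does not give the algorithm itself; the sketch below is only a simplified Monte Carlo illustration of the underlying idea of replacing a single worst-case sum with a distribution of total delay along an adversary path. The tasks and lognormal parameters are hypothetical.

        # Illustrative only: propagate per-task time uncertainty along an adversary path
        # to obtain a distribution of total delay, instead of a single worst-case sum.
        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical tasks on one adversary path: (name, median seconds, geometric std. dev.)
        tasks = [("breach fence", 30.0, 1.6), ("cross yard", 45.0, 1.3),
                 ("defeat door", 120.0, 1.8), ("open inner barrier", 90.0, 1.5)]

        n_trials = 100_000
        total = np.zeros(n_trials)
        for _, median, gsd in tasks:
            total += rng.lognormal(mean=np.log(median), sigma=np.log(gsd), size=n_trials)

        p10, p50, p90 = np.percentile(total, [10, 50, 90])
        print(f"total delay: 10th {p10:.0f} s, median {p50:.0f} s, 90th {p90:.0f} s")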

  2. The development of a checklist to enhance methodological quality in intervention programs

    Directory of Open Access Journals (Sweden)

    Salvador Chacón-Moscoso

    2016-11-01

    Full Text Available The methodological quality of primary studies is an important issue when performing meta-analyses or systematic reviews. Nevertheless, there are no clear criteria for how methodological quality should be analyzed. Controversies emerge when considering the various theoretical and empirical definitions, especially in relation to three interrelated problems: the lack of representativeness, utility, and feasibility. In this article, we (a) systematize and summarize the available literature about methodological quality in primary studies; (b) propose a specific, parsimonious, 12-item checklist to empirically define the methodological quality of primary studies based on a content validity study; and (c) present an inter-coder reliability study for the resulting 12 items. This paper provides a precise and rigorous description of the development of this checklist, highlighting the clearly specified criteria for the inclusion of items and a substantial inter-coder agreement on the different items. Rather than simply proposing another checklist, however, it then argues that the list constitutes an assessment tool with respect to the representativeness, utility, and feasibility of the most frequent methodological quality items in the literature, one that provides practitioners and researchers with clear criteria for choosing items that may be adequate to their needs. We propose individual methodological features as indicators of quality, arguing that these need to be taken into account when designing, implementing, or evaluating an intervention program. This enhances the methodological quality of intervention programs and fosters cumulative knowledge based on meta-analyses of these interventions. Future development of the checklist is discussed.
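
    The abstract does not name the agreement statistic; Cohen's kappa is one common choice for quantifying inter-coder reliability on a dichotomous checklist item, and the sketch below computes it for two coders over hypothetical ratings:

        # Cohen's kappa for two coders rating one checklist item (1 = criterion met, 0 = not met).
        # The ratings are hypothetical.
        from collections import Counter

        coder_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1]
        coder_b = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1]

        n = len(coder_a)
        observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

        freq_a = Counter(coder_a)
        freq_b = Counter(coder_b)
        expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))

        kappa = (observed - expected) / (1 - expected)
        print(f"observed agreement {observed:.2f}, chance agreement {expected:.2f}, kappa {kappa:.2f}")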

  3. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    International Nuclear Information System (INIS)

    Dorp, F. van

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all the steps in the recommended methodology. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example, because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The methodology has three starting points. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development. The International FEP List includes FEPs to do with the assessment context. The context examined in detail by

  4. Gamma ray auto absorption correction evaluation methodology

    International Nuclear Information System (INIS)

    Gugiu, Daniela; Roth, Csaba; Ghinescu, Alecse

    2010-01-01

    Neutron activation analysis (NAA) is a well established nuclear technique, suited to investigating microstructural or elemental composition, and can be applied to studies of a large variety of samples. Work with large samples involves, besides the development of large irradiation devices with well known neutron field characteristics, knowledge of the perturbing phenomena and adequate evaluation of correction factors such as neutron self-shielding, extended source correction, and gamma ray auto-absorption. The objective of the work presented in this paper is to validate an appropriate methodology for evaluating the gamma ray auto-absorption correction for large inhomogeneous samples. For this purpose a benchmark experiment has been defined - a simple gamma ray transmission experiment, easy to reproduce. The gamma ray attenuation in pottery samples has been measured and computed using the MCNP5 code. The results show a good agreement between the computed and measured values, proving that the proposed methodology is able to evaluate the correction factors. (authors)
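
    As a simplified illustration of the kind of correction factor being validated (the paper itself relies on a full MCNP5 transport calculation), the mean gamma transmission through a homogeneous slab of thickness t with linear attenuation coefficient mu can be written as (1 - exp(-mu*t)) / (mu*t). The values below are assumptions for illustration only.

        # Simplified slab-geometry illustration of a gamma-ray auto-absorption correction factor.
        # A real large, inhomogeneous sample requires a transport calculation (e.g. MCNP5).
        import math

        def slab_self_absorption(mu_cm: float, thickness_cm: float) -> float:
            """Mean transmission of photons born uniformly in a slab, viewed through its thickness:
            f = (1 - exp(-mu*t)) / (mu*t)."""
            x = mu_cm * thickness_cm
            return (1.0 - math.exp(-x)) / x if x > 0 else 1.0

        mu = 0.15          # assumed linear attenuation coefficient at the gamma energy, 1/cm
        t = 4.0            # assumed sample thickness, cm
        f = slab_self_absorption(mu, t)
        corrected_rate = 1250.0 / f    # hypothetical measured count rate divided by the correction
        print(f"self-absorption factor {f:.3f}, corrected count rate {corrected_rate:.0f} counts/s")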

  5. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  6. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.; US Nuclear Regulatory Commission, Washington, DC 20555)

    1985-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 regulation to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  7. Methodology for Developing the REScheckTM Software through Version 4.2

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Connell, Linda M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gowri, Krishnan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lucas, R. G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schultz, Robert W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Taylor, Zachary T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wiberg, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2009-08-01

    This report explains the methodology used to develop Version 4.2 of the REScheck software developed for the 1992, 1993, and 1995 editions of the MEC, and the 1998, 2000, 2003, and 2006 editions of the IECC, and the 2006 edition of the International Residential Code (IRC). Although some requirements contained in these codes have changed, the methodology used to develop the REScheck software for these five editions is similar. REScheck assists builders in meeting the most complicated part of the code - the building envelope Uo-, U-, and R-value requirements in Section 502 of the code. This document details the calculations and assumptions underlying the treatment of the code requirements in REScheck, with a major emphasis on the building envelope requirements.
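
    A minimal sketch of the kind of whole-envelope UA trade-off calculation that such compliance software is built around; the U-factors and areas below are hypothetical, and the actual REScheck defaults and code tables are not reproduced here. A proposed design passes when its total UA does not exceed the UA computed from the code's maximum U-factors.

        # Simplified building-envelope UA comparison (illustrative values only).
        # UA = sum over components of U-factor [Btu/h·ft²·°F] × area [ft²].

        def total_ua(components):
            return sum(u * area for _, u, area in components)

        # (name, U-factor, area) -- hypothetical proposed design
        proposed = [("ceiling", 0.030, 1200.0), ("walls", 0.060, 1400.0),
                    ("windows", 0.350, 220.0), ("floor", 0.047, 1200.0)]

        # Hypothetical code-maximum U-factors applied to the same areas
        code_budget = [("ceiling", 0.035, 1200.0), ("walls", 0.082, 1400.0),
                       ("windows", 0.400, 220.0), ("floor", 0.047, 1200.0)]

        ua_proposed, ua_code = total_ua(proposed), total_ua(code_budget)
        print(f"proposed UA {ua_proposed:.1f}, code UA {ua_code:.1f}, "
              f"{'passes' if ua_proposed <= ua_code else 'fails'}")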

  8. Development of a methodology for conducting an integrated HRA/PRA --

    International Nuclear Information System (INIS)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S.; Wreathall, J.; Cooper, S.E.

    1993-01-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S-related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S-related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  9. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

    An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and gamma attenuation factors calculated using MCNP-5. Both the relative and absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.
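
    A minimal sketch of the relative-method normalization described above, with all numbers hypothetical: the unknown concentration follows from the ratio of prompt gamma count rates after each measurement is corrected for its thermal neutron flux and its calculated gamma attenuation factor.

        # Relative-method normalization for PGNAA of a large liquid sample (hypothetical values).
        def corrected_rate(count_rate, thermal_flux, attenuation_factor):
            """Prompt gamma count rate normalized to unit flux and corrected for gamma attenuation."""
            return count_rate / (thermal_flux * attenuation_factor)

        # Reference (standard) sample with known Cd concentration
        c_ref = 10.0                     # mg/L, known
        r_ref = corrected_rate(count_rate=420.0, thermal_flux=1.0e4, attenuation_factor=0.78)

        # Unknown industrial phosphoric acid sample
        r_unknown = corrected_rate(count_rate=305.0, thermal_flux=0.9e4, attenuation_factor=0.71)

        c_unknown = c_ref * r_unknown / r_ref
        print(f"estimated Cd concentration: {c_unknown:.1f} mg/L")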

  10. Load shape development for Swedish commercial and public buildings - methodologies and results

    Energy Technology Data Exchange (ETDEWEB)

    Noren, C.

    1999-06-01

    The knowledge concerning electricity consumption, and especially load demand, in Swedish commercial buildings is very limited. The current study deals with methods for developing electricity consumption indicators and the application of the different methodologies to measured data. Typical load shapes and consumption indicators are developed for four different types of commercial buildings: schools, hotels, grocery stores and department stores. Two different methodologies for consumption indicator development are presented and discussed. The influence on load demand of different factors, such as installations, outdoor temperature and building activities, is studied. It is suggested that building floor area is not an accurate determinant of building electricity consumption, and that it is necessary to consider other factors such as those just mentioned to understand commercial building electricity consumption. The application of the two methodologies to measured data shows that typical load shapes can be developed with reasonable accuracy. For most of the categories it is possible to use the typical load shapes for approximation of whole-building load shapes with error rates of about 10-25% depending on day-type and building type. Comparisons of the developed load shapes with measured data show good agreement. 49 refs, 22 figs, 3 tabs
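
    A minimal sketch of one way to build a typical load shape of the kind described above: average the metered hourly demand by day-type and hour. The metering records are hypothetical, and the report's own indicator definitions are not reproduced here.

        # Build a typical load shape by averaging hourly demand per day-type and hour.
        # The metering records are hypothetical: (day_type, hour, kW).
        from collections import defaultdict

        records = [("weekday", 9, 42.0), ("weekday", 9, 45.5), ("weekday", 13, 51.0),
                   ("weekday", 13, 49.0), ("weekend", 9, 18.0), ("weekend", 13, 20.5)]

        sums = defaultdict(float)
        counts = defaultdict(int)
        for day_type, hour, kw in records:
            sums[(day_type, hour)] += kw
            counts[(day_type, hour)] += 1

        typical_shape = {key: sums[key] / counts[key] for key in sums}
        for (day_type, hour), kw in sorted(typical_shape.items()):
            print(f"{day_type} hour {hour:02d}: {kw:.1f} kW")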

  11. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    Fujii-e, Y.; Kozawa, Y.; Namba, C.

    1987-03-01

    Fusion systems, which are under development as future energy systems, have reached a stage at which break-even is expected to be realized in the near future. It is desirable to demonstrate that fusion systems are well accepted by the societal environment. There are three crucial viewpoints for measuring this acceptability: technological feasibility, economy and safety. These three points are closely interrelated. The safety problem has become more important since the three large scale tokamaks, JET, TFTR and JT-60, started experiments, and tritium will be introduced into some of them as the fusion fuel. It is desirable to establish a methodology to resolve the safety-related issues in harmony with the technological evolution. The promising fusion system toward reactors is not yet settled. This study has the objective of developing an adequate methodology which promotes the safety design of general fusion systems and of presenting a basis for proposing R and D themes and establishing the data base. A framework of the methodology, the understanding and modeling of fusion systems, the principle of ensuring safety, the safety analysis based on function, and the application of the methodology are discussed. As the result of this study, the methodology for the safety analysis and evaluation of fusion systems was developed. New ideas and approaches were presented in the course of the methodology development. (Kako, I.)

  12. Preparation of standard hair material and development of analytical methodology

    International Nuclear Information System (INIS)

    Gangadharan, S.; Ganapathi Iyer, S.; Ali, M.M.; Thantry, S.S.; Verma, R.; Arunachalam, J.; Walvekar, A.P.

    1992-01-01

    In 1976 Indian researchers suggested the possible use of hair as an indicator of environmental exposure and established, through a study of a country-wide student population and the general population of the metropolitan city of Bombay, that human scalp hair could indeed be an effective first-level monitor in a scheme of multilevel monitoring of environmental exposure to inorganic pollutants. It was in this context, and in view of the ready availability of large quantities of scalp hair subjected to minimal chemical treatment, that they proposed to participate in the preparation of a standard material of hair. It was also recognized that measurements of trace element concentrations at very low levels require cross-validation by different analytical techniques, even within the same laboratory. The programme of work that has been carried out since the first meeting of the CRP has been aimed at two objectives: the preparation of a standard material of hair and the development of analytical methodologies for the determination of elements and species of interest. 1 refs., 3 tabs

  13. Development Risk Methodology for Whole Systems Trade Analysis

    Science.gov (United States)

    2016-08-01

    ... (WSTAT). In the early stages of the V&V for development risk, it was discovered that the original risk rating and methodology did not actually ... WSTA has opened trade space exploration by allowing the tool to evaluate trillions of potential system configurations to then return a handful of ...

  14. Development of methodology for separation and recovery of uranium from nuclear wastewater

    International Nuclear Information System (INIS)

    Satpati, S.K.; Roy, S.B.; Pal, Sangita; Tewari, P.K.

    2015-01-01

    Uranium plays a key role in nuclear power supply, and demand for it is growing with time because of its prospective features. The persistent increase in different nuclear activities leads to the increased generation of nuclear wastewater containing uranium. Separation and recovery of uranium from an unconventional source like nuclear wastewater is worth exploring to address the reutilisation of this uranium source. It is also necessary to improve the remediation technology of nuclear industries for environmental protection. Development of a suitable process methodology to supersede the conventional methodology is essential for this purpose. In this article, recent developments in several possible methodologies for the separation of uranium from dilute solution are discussed, with their merits and demerits. A sorption technique, as a solid phase extraction (SPE) methodology, has been chosen with a suitable polymer matrix and functional moiety based on the wastewater characteristics. Polyhydroxamic acid (PHOA), a sorbent synthesized following an eco-friendly procedure, is a promising polymeric chelating sorbent for the remediation of nuclear wastewaters and the recovery of uranium. Sorption and elution characteristics of the PHOA have been evaluated and illustrated for the separation and recovery of uranium from a sample nuclear wastewater. For the remediation of nuclear wastewater, the SPE technique applying the PHOA polymeric sorbent is found to be a potentially suitable methodology. (author)

  15. Developing Foucault's Discourse Analytic Methodology

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-01-01

    Full Text Available A methodological position for a FOUCAULTian discourse analysis is presented. A sequence of analytical steps is introduced and an illustrating example is offered. It is emphasized that discourse analysis has to discover the system-level of discursive rules and the deeper structure of the discursive formation; otherwise the analysis will be unfinished. Michel FOUCAULT's work is theoretically grounded in French structuralism and (so-called) post-structuralism. In this paper, post-structuralism is conceived not as a means of overcoming structuralism, but as a way of critically continuing the structural perspective. In this way, discursive structures can be related to discursive practices and the concept of structure can be disclosed (e.g. to inter-discourse or DERRIDA's concept of structurality). In this way, the structural methodology is continued and radicalized, but not given up. In this paper, FOUCAULT's theory is combined with the works of Michel PÊCHEUX and (especially for the sociology of knowledge and the sociology of culture) Pierre BOURDIEU. The practice of discourse analysis is theoretically grounded. This practice can be conceived as a reflexive coupling of deconstruction and reconstruction in the material to be analyzed. This methodology can therefore be characterized as a reconstructive qualitative methodology. At the end of the article, forms of discourse analysis are criticized that do not intend to recover the system level of discursive rules and do not intend to discover the deeper structure of the discursive formation (i.e. episteme, socio-episteme). These forms are merely commentaries on discourses (not analyses of them); they remain phenomenological and are therefore pre-structuralist. URN: urn:nbn:de:0114-fqs060168

  16. Overview on hydrogen risk research and development activities: Methodology and open issues

    Energy Technology Data Exchange (ETDEWEB)

    Bentaib, Ahmed; Meynet, Nicolas; Bleyer, Alexande [Institut de Radioprotection et de Surete Nucleaire (IRSN), Severe Accident Department, Fontenay-aux-Roses (France)

    2015-02-15

    During the course of a severe accident in a light water nuclear reactor, large amounts of hydrogen can be generated and released into the containment during reactor core degradation. Additional burnable gases [hydrogen (H2) and carbon monoxide (CO)] may be released into the containment from the corium/concrete interaction. This could subsequently raise a combustion hazard. As the Fukushima accidents revealed, hydrogen combustion can cause high pressure spikes that could challenge the reactor building and lead to failure of the surrounding buildings. To prevent the gas explosion hazard, most mitigation strategies adopted by European countries are based on the implementation of passive autocatalytic recombiners (PARs). Studies of representative accident sequences indicate that, despite the installation of PARs, it is difficult to prevent at all times and locations the formation of a combustible mixture that potentially leads to local flame acceleration. Complementary research and development (R and D) projects were recently launched to better understand the phenomena associated with the combustion hazard and to address the issues highlighted after the Fukushima Daiichi events, such as the explosion hazard in the venting system and the potential migration of flammable mixtures into spaces beyond the primary containment. The expected results will be used to improve the modeling tools and methodology for hydrogen risk assessment and severe accident management guidelines. The present paper aims to present the methodology adopted by the Institut de Radioprotection et de Su...

  17. Development of the methodology and approaches to validate safety and accident management

    International Nuclear Information System (INIS)

    Asmolov, V.G.

    1997-01-01

    The article compares the development of the methodology and approaches to validate nuclear power plant safety and accident management in Russia and in advanced industrial countries. It demonstrates that the development of methods of safety validation is dialectically related to the accumulation of the knowledge base on processes and events during NPP normal operation, transients and emergencies, including severe accidents. The article describes the Russian severe accident research program (1987-1996), the implementation of which allowed Russia to reach the world level of safety validation efforts, and presents future high-priority study areas. Problems related to possible approaches to the methodological development of accident management are discussed. (orig.)

  18. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    Science.gov (United States)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response planning. Large commercial shopping areas are typical service systems, and their emergency evacuation is one of the hot research topics. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation within a commercial shopping mall. Pedestrian walking is based on Cellular Automata and the event driven model. In this paper, the event driven model is adopted to simulate pedestrian movement patterns, and the simulation process is divided into a normal situation and emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For the simulation of the movement routes of pedestrians, the model takes into account the purchase intention of customers and the density of pedestrians. Based on the evacuation model of Cellular Automata with a Dynamic Floor Field and the event driven model, we can reflect the behavior characteristics of customers and clerks in normal situations and during emergency evacuation. The distribution of individual evacuation time as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model using the combination of Cellular Automata with a Dynamic Floor Field and event driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
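
    A minimal sketch of the static part of a floor-field CA of the kind described above, assuming a small grid with one exit: a breadth-first search builds a distance-to-exit field, and a pedestrian steps to the neighbouring free cell with the smallest distance. The dynamic floor field, event-driven scheduling and the customer/clerk layers of the actual model are omitted.

        # Minimal floor-field CA sketch: compute distance-to-exit with BFS, then step one
        # pedestrian toward the neighboring free cell with the smallest distance.
        from collections import deque

        grid = ["#########",
                "#.......#",
                "#..###..#",
                "#.......E",
                "#########"]   # '#' wall, '.' free cell, 'E' exit

        H, W = len(grid), len(grid[0])
        INF = float("inf")
        dist = [[INF] * W for _ in range(H)]
        queue = deque()
        for r in range(H):
            for c in range(W):
                if grid[r][c] == "E":
                    dist[r][c] = 0
                    queue.append((r, c))

        while queue:                                   # BFS builds the static floor field
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < H and 0 <= nc < W and grid[nr][nc] != "#" and dist[nr][nc] == INF:
                    dist[nr][nc] = dist[r][c] + 1
                    queue.append((nr, nc))

        def step(pos):
            """Move one pedestrian to the neighboring cell with the smallest distance-to-exit."""
            r, c = pos
            candidates = [(dist[r + dr][c + dc], (r + dr, c + dc))
                          for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                          if dist[r + dr][c + dc] < dist[r][c]]
            return min(candidates)[1] if candidates else pos

        pedestrian = (1, 1)
        while dist[pedestrian[0]][pedestrian[1]] > 0:
            pedestrian = step(pedestrian)
            print("pedestrian at", pedestrian)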

  19. The Development and Significance of Standards for Smoking-Machine Methodology

    Directory of Open Access Journals (Sweden)

    Baker R

    2014-12-01

    Full Text Available Bialous and Yach have recently published an article in Tobacco Control in which they claim that all smoking-machine standards stem from a method developed unilaterally by the tobacco industry within the Cooperation Centre for Scientific Research Relative to Tobacco (CORESTA). Using a few highly selective quotations from internal tobacco company memos, they allege, inter alia, that the tobacco industry has changed the method to suit its own needs, that because humans do not smoke like machines the standards are of little value, and that the tobacco industry has unjustifiably made health claims about low “tar” cigarettes. The objectives of this paper are to review the development of smoking-machine methodology and standards and the involvement of the relevant parties, to outline the significance of the results, and to explore the validity of Bialous and Yach's claims. The large volume of published scientific information on the subject, together with other information in the public domain, has been consulted. When this information is taken into account it becomes obvious that the very narrow and restricted literature base of Bialous and Yach's analysis has resulted in them, perhaps inadvertently, making factual errors, drawing wrong conclusions and writing inaccurate statements on many aspects of the subject. The first smoking-machine standard was specified by the Federal Trade Commission (FTC), a federal government agency in the USA, in 1966. The CORESTA Recommended Method, similar in many respects to that of the FTC, was developed in the late 1960s and published in 1969. Small differences in the butt lengths, smoke collection and analytical procedures in methods used in various countries, including Germany, Canada and the UK, developed later, resulted in about a 10% difference in smoke “tar” yields. These differences in methodology were harmonised in a common International Organisation for Standardisation (ISO) Standard Method in 1991, after a considerable amount...

  20. Development of Testing Methodologies for the Mechanical Properties of MEMS

    Science.gov (United States)

    Ekwaro-Osire, Stephen

    2003-01-01

    This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS), as well as to investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength, as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. This will be a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.
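
    For context on the Weibull size effect mentioned above (a generic illustration, not MEMS test data): under two-parameter Weibull statistics the failure probability of a uniformly stressed volume is P_f = 1 - exp[-(V/V0)(sigma/sigma0)^m], so the characteristic strength of a larger specimen scales as (V0/V)^(1/m). The parameter values below are assumptions.

        # Weibull size-effect illustration with assumed parameters (not MEMS test data).
        import math

        m = 10.0            # assumed Weibull modulus
        sigma0 = 1.0e9      # characteristic strength (Pa) of the reference volume V0
        V0 = 1.0e-15        # reference stressed volume (m^3)

        def failure_probability(sigma, volume):
            """Two-parameter Weibull failure probability for a uniformly stressed volume."""
            return 1.0 - math.exp(-(volume / V0) * (sigma / sigma0) ** m)

        def characteristic_strength(volume):
            """Stress at which failure probability reaches 1 - 1/e for the given volume."""
            return sigma0 * (V0 / volume) ** (1.0 / m)

        for scale in (1.0, 10.0, 100.0):
            V = scale * V0
            print(f"V = {scale:5.0f} x V0 -> characteristic strength "
                  f"{characteristic_strength(V) / 1e9:.2f} GPa, "
                  f"P_f at 0.8 GPa = {failure_probability(0.8e9, V):.3f}")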

  1. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  2. Development of a design methodology for hydraulic pipelines carrying rectangular capsules

    International Nuclear Information System (INIS)

    Asim, Taimoor; Mishra, Rakesh; Abushaala, Sufyan; Jain, Anuj

    2016-01-01

    The scarcity of fossil fuels is affecting the efficiency of established modes of cargo transport within the transportation industry. Efforts have been made to develop innovative modes of transport that can be adopted for economical and environmentally friendly operating systems. Solid material, for instance, can be packed in rectangular containers (commonly known as capsules), which can then be transported in different concentrations very effectively using the fluid energy in pipelines. For economical and efficient design of such systems, both the local flow characteristics and the global performance parameters need to be carefully investigated. Published literature is severely limited in establishing the effects of local flow features on the system characteristics of Hydraulic Capsule Pipelines (HCPs). The present study focuses on using a well validated Computational Fluid Dynamics (CFD) tool to numerically simulate the solid-liquid mixture flow in both on-shore and off-shore HCP applications, including bends. Discrete Phase Modelling (DPM) has been employed to calculate the velocity of the rectangular capsules. Numerical predictions have been used to develop novel semi-empirical prediction models for pressure drop in HCPs, which have then been embedded into a robust and user-friendly pipeline optimisation methodology based on the Least-Cost Principle. - Highlights: • Local flow characteristics in a pipeline transporting rectangular capsules. • Development of prediction models for the pressure drop contribution of capsules. • Methodology developed for sizing of Hydraulic Capsule Pipelines. • Implementation of the developed methodology to obtain optimal pipeline diameter.
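    The Least-Cost sizing idea can be sketched as follows; note that the paper's semi-empirical capsule pressure-drop models are not available here, so a plain Darcy-Weisbach carrier-fluid estimate stands in for them, and every cost coefficient and operating figure is hypothetical.

```python
import numpy as np

# Least-cost pipeline sizing sketch. The paper's capsule pressure-drop models
# are not reproduced; a Darcy-Weisbach estimate for the carrier fluid stands in
# for them, and every cost figure below is hypothetical.
rho, mu = 1000.0, 1.0e-3            # water density (kg/m^3) and viscosity (Pa s)
Q, L = 0.05, 5000.0                 # carrier flow rate (m^3/s), pipeline length (m)
t_wall = 0.01                       # assumed pipe wall thickness (m)
capital_per_m3 = 2.0e5              # assumed cost per m^3 of pipe material
energy_cost = 0.12 / 3.6e6          # assumed cost per joule of pumping energy
operating_s = 20 * 365 * 24 * 3600  # 20-year operating time (s)

def pressure_drop(D):
    """Darcy-Weisbach pressure drop with the Blasius friction factor."""
    V = Q / (np.pi * D ** 2 / 4)
    Re = rho * V * D / mu
    f = 0.316 * Re ** -0.25
    return f * (L / D) * 0.5 * rho * V ** 2

def total_cost(D):
    wall_volume = (np.pi / 4) * ((D + 2 * t_wall) ** 2 - D ** 2) * L
    capital = capital_per_m3 * wall_volume
    pumping = energy_cost * pressure_drop(D) * Q * operating_s
    return capital + pumping

diameters = np.linspace(0.1, 0.6, 51)
best = diameters[np.argmin([total_cost(D) for D in diameters])]
print(f"least-cost diameter under these assumptions: {best:.2f} m")
```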

  3. System study methodology development and potential utilization for fusion

    International Nuclear Information System (INIS)

    Djerassi, H.; Rouillard, J.; Leger, D.; Sarto, S.; Zappellini, G.; Gambi, G.

    1989-01-01

    The objective of this new methodology is to combine systemics with heuristics for engineering applications. The system method considers as a whole a set of dynamically interacting elements organized for tasks, while heuristics describes the rules to apply in scientific research. Compared with conventional analytical methods, this methodology is a powerful tool for evaluating options, since a higher number of parameters can be taken into account and the possible options can be compared to a higher quality standard. The system method handles interacting data and random relationships by means of simulation modelling; thus a dynamic approach can be derived and a sensitivity analysis can be performed over a very large number of options and basic data. The method can be restricted to a specific objective, such as a fusion reactor safety analysis, while taking into account other major constraints such as the economic environment. The sophisticated architecture of a fusion reactor includes a large number of interacting systems, and the novelty of the fusion domain and the wide spectrum of possible options strongly increase the advantages of a system study, as a complete safety analysis can be defined before starting the design. (orig.)

  4. Embracing Agile methodology during DevOps Developer Internship Program

    OpenAIRE

    Patwardhan, Amol; Kidd, Jon; Urena, Tiffany; Rajgopalan, Aishwarya

    2016-01-01

    The DevOps team adopted agile methodologies during the summer internship program as an initiative to move away from waterfall. The DevOps team implemented the Scrum software development strategy to create an internal data dictionary web application. This article reports on the transition process and lessons learned from the pilot program.

  5. Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities

    Science.gov (United States)

    Shivanand M., Handigund; Shweta, Bhat

    The Software Requirements Specification (SRS) of an organization is a text document prepared by strategic management that incorporates the organization's requirements. The requirements of the ongoing business and project development processes involve software tools, hardware devices, manual procedures, application programs and communication commands. These components are appropriately ordered for achieving the mission of the process concerned, both in project development and in ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though both methodologies are independent, each complements the other in authenticating its correctness and completeness.
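    Determining the critical path of an activity chart, as mentioned above, amounts to a longest-path computation over a directed acyclic graph; the sketch below shows one standard way to do this, with activity names and durations that are purely hypothetical.

```python
from collections import defaultdict

# Critical-path computation for an activity chart (illustrative sketch only).
# The activity names and durations are hypothetical; the paper's own
# diagram-generation methodology is not reproduced here.
durations = {"spec": 3, "design": 5, "code": 8, "test": 4, "deploy": 1}
edges = [("spec", "design"), ("design", "code"), ("design", "test"),
         ("code", "test"), ("test", "deploy")]

def critical_path(durations, edges):
    succ, indeg = defaultdict(list), {a: 0 for a in durations}
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    order, queue = [], [a for a, d in indeg.items() if d == 0]
    while queue:                       # Kahn's topological sort
        u = queue.pop()
        order.append(u)
        for v in succ[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    finish, pred = {}, {}
    for u in order:                    # longest path = latest earliest-finish time
        start = max((finish[p] for p, q in edges if q == u), default=0)
        finish[u] = start + durations[u]
        pred[u] = max((p for p, q in edges if q == u), key=finish.get, default=None)
    end = max(finish, key=finish.get)
    path = []
    while end is not None:
        path.append(end)
        end = pred[end]
    return list(reversed(path)), max(finish.values())

print(critical_path(durations, edges))   # (['spec', 'design', 'code', 'test', 'deploy'], 21)
```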

  6. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

    This document presents the security automated Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CITM). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  7. Reaching the grassroots: publishing methodologies for development organizations.

    Science.gov (United States)

    Zielinski, C

    1987-01-01

    There are 3 major distinctions between the traditional form of academic publishing and publishing for the grassroots as a development-organization activity, particularly in developing countries. Whereas academic publishing seeks to cover the target audience in its entirety, grassroots publishing can only cover a sampling. Academic publishing fulfills a need, while grassroots publishing demonstrates a need and a way to fulfill it. Finally, whereas academic publishing is largely a support activity aimed at facilitating the dissemination of information as a relatively minor part of a technical program, grassroots publishing is a more substantive activity aimed at producing a catalytic effect. Publication for the grassroots further calls for a different methodological approach. Given the constraint of numbers, publications aimed at the grassroots can only be examples or prototypes. The function of a prototype is to serve both as a basis for translation, adaptation, and replication and as a model end result. The approach to the use and promotion of prototypes differs according to the specific country situation. In countries with a heterogenous culture or several different languages, 2 items should be produced: a prototype of the complete text, which should be pretested and evaluated, and a prototype adaptation kit stripped of cultural and social biases. Promotion of the translation and replication of a publication can be achieved by involving officials at the various levels of government, interesting international and voluntary funding agencies, and stimulating indigenous printing capacities at the community level. The most important factors are the appropriateness of the publication in solving specific priority problems and the interest and involvement of national and state authorities at all stages of the project.

  8. Diffraction or Reflection? Sketching the Contours of Two Methodologies in Educational Research

    Science.gov (United States)

    Bozalek, Vivienne; Zembylas, Michalinos

    2017-01-01

    Internationally, an interest is emerging in a growing body of work on what has become known as "diffractive methodologies" drawing attention to ontological aspects of research. Diffractive methodologies have largely been developed in response to a dissatisfaction with practices of "reflexivity", which are seen to be grounded in…

  9. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Development of 3D pseudo pin-by-pin calculation methodology in ANC

    International Nuclear Information System (INIS)

    Zhang, B.; Mayhue, L.; Huria, H.; Ivanov, B.

    2012-01-01

    Advanced core and fuel assembly designs have been developed to improve operational flexibility and economic performance and to further enhance the safety features of nuclear power plants. The simulation of these new designs, along with strongly heterogeneous fuel loading, has brought new challenges to the reactor physics methodologies currently employed in the industrial codes for core analyses. Control rod insertion during normal operation is one operational feature of the AP1000® plant, Westinghouse's next-generation Pressurized Water Reactor (PWR) design. This design improves operational flexibility and efficiency but significantly challenges the conventional reactor physics methods, especially in pin power calculations. The mixed loading of fuel assemblies with significantly different neutron spectra causes a strong interaction between fuel assembly types that is not fully captured by the current core design codes. To overcome the weaknesses of the conventional methods, Westinghouse has developed a state-of-the-art 3D Pin-by-Pin Calculation Methodology (P3C) and successfully implemented it in the Westinghouse core design code ANC. The new methodology has been qualified and licensed for pin power prediction. The 3D P3C methodology, along with its application and validation, is discussed in the paper. (authors)

  11. Development of performance assessment methodology for nuclear waste isolation in geologic media

    International Nuclear Information System (INIS)

    Bonano, E.J.; Chu, M.S.Y.; Cranwell, R.M.; Davis, P.A.

    1986-01-01

    The analysis of the processes involved in the burial of nuclear wastes can be performed only with reliable mathematical models and computer codes as opposed to conducting experiments because the time scales associated are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium. These are ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the US Nuclear Regulatory Commission

  12. Enhanced Methodologies to Enumerate Persons Experiencing Homelessness in a Large Urban Area.

    Science.gov (United States)

    Troisi, Catherine L; D'Andrea, Ritalinda; Grier, Gary; Williams, Stephen

    2015-10-01

    Homelessness is a public health problem, and persons experiencing homelessness are a vulnerable population. Estimates of the number of persons experiencing homelessness inform funding allocations and services planning and directly determine the ability of a community to intervene effectively in homelessness. The point-in-time (PIT) count presents a logistical problem in large urban areas, particularly those covering a vast geographical area. Working together, academia, local government, and community organizations improved the methodology for the count. Specific enhancements include use of incident command system (ICS), increased number of staging areas/teams, specialized outreach and Special Weapons and Tactics teams, and day-after surveying to collect demographic information. This collaboration and enhanced methodology resulted in a more accurate estimate of the number of persons experiencing homelessness and allowed comparison of findings for 4 years. While initial results showed an increase due to improved counting, the number of persons experiencing homelessness counted for the subsequent years showed significant decrease during the same time period as a "housing first" campaign was implemented. The collaboration also built capacity in each sector: The health department used ICS as a training opportunity; the academics enhanced their community health efforts; the service sector was taught and implemented more rigorous quantitative methods; and the community was exposed to public health as a pragmatic and effective discipline. Improvements made to increase the reliability of the PIT count can be adapted for use in other jurisdictions, leading to improved counts and better evaluation of progress in ending homelessness. © The Author(s) 2015.

  13. Critical infrastructure systems of systems assessment methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a "willingness to pay" avoidance approach.

  14. Development and validation of a new turbocharger simulation methodology for marine two stroke diesel engine modelling and diagnostic applications

    International Nuclear Information System (INIS)

    Sakellaridis, Nikolaos F.; Raptotasios, Spyridon I.; Antonopoulos, Antonis K.; Mavropoulos, Georgios C.; Hountalas, Dimitrios T.

    2015-01-01

    Engine cycle simulation models are increasingly used in diesel engine simulation and diagnostic applications, reducing experimental effort. Turbocharger simulation plays an important role in a model's ability to accurately predict engine performance and emissions. The present work describes the development of a complete engine simulation model for marine diesel engines based on a new methodology for turbocharger modelling that utilizes physically based meanline models for the compressor and turbine. Simulation accuracy is evaluated against engine bench measurements. The methodology was developed to overcome the problem of limited availability of experimental maps for the compressor and turbine, often encountered in large marine diesel engine simulation and diagnostic studies. Data from the engine bench are used to calibrate the models, as well as to estimate the turbocharger shaft mechanical efficiency. The closed cycle and gas exchange are modelled using an existing multizone thermodynamic model. The proposed methodology is applied to a 2-stroke marine diesel engine and its evaluation is based on the comparison of predictions against measured engine data. The model's ability to predict the engine response to load variation is demonstrated, regarding both turbocharger performance and closed cycle parameters as well as NOx emission trends, making it an effective tool for both engine diagnostic and optimization studies. - Highlights: • Marine two stroke diesel engine simulation model. • Turbine and compressor simulation using physical meanline models. • Methodology to derive T/C component efficiency and T/C shaft mechanical efficiency. • Extensive validation of predictions against experimental data.
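    A meanline turbocharger model ultimately has to satisfy a shaft power balance between compressor and turbine; the sketch below shows that balance in its simplest steady-state form, with constant gas properties and efficiencies assumed for illustration rather than taken from the paper.

```python
# Steady-state turbocharger power balance (illustrative sketch only).
# The meanline compressor/turbine models of the paper are not reproduced;
# constant isentropic efficiencies and gas properties are assumed, and all
# numerical values are hypothetical.
cp_air, gamma_air = 1005.0, 1.4      # J/(kg K), specific heat ratio (air)
cp_exh, gamma_exh = 1100.0, 1.35     # exhaust gas properties (assumed)
eta_c, eta_t, eta_mech = 0.78, 0.80, 0.97

def compressor_power(m_dot, T_in, pressure_ratio):
    """Power absorbed by the compressor for a given pressure ratio."""
    dT_is = T_in * (pressure_ratio ** ((gamma_air - 1) / gamma_air) - 1.0)
    return m_dot * cp_air * dT_is / eta_c

def turbine_power(m_dot, T_in, expansion_ratio):
    """Power delivered by the turbine for a given expansion ratio."""
    dT_is = T_in * (1.0 - expansion_ratio ** (-(gamma_exh - 1) / gamma_exh))
    return m_dot * cp_exh * dT_is * eta_t

# Check how closely the shaft power balance P_c = eta_mech * P_t is satisfied.
P_c = compressor_power(m_dot=0.50, T_in=300.0, pressure_ratio=3.0)
P_t = turbine_power(m_dot=0.51, T_in=800.0, expansion_ratio=2.6)
print(f"compressor {P_c / 1e3:.1f} kW, turbine (to shaft) {eta_mech * P_t / 1e3:.1f} kW")
```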

  15. Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach.

    Science.gov (United States)

    Aromataris, Edoardo; Fernandez, Ritin; Godfrey, Christina M; Holly, Cheryl; Khalil, Hanan; Tungpunkom, Patraporn

    2015-09-01

    With the increase in the number of systematic reviews available, a logical next step to provide decision makers in healthcare with the evidence they require has been the conduct of reviews of existing systematic reviews. Syntheses of existing systematic reviews are referred to by many different names, one of which is an umbrella review. An umbrella review allows the findings of reviews relevant to a review question to be compared and contrasted. An umbrella review's most characteristic feature is that this type of evidence synthesis only considers for inclusion the highest level of evidence, namely other systematic reviews and meta-analyses. A methodology working group was formed by the Joanna Briggs Institute to develop methodological guidance for the conduct of an umbrella review, including diverse types of evidence, both quantitative and qualitative. The aim of this study is to describe the development and guidance for the conduct of an umbrella review. Discussion and testing of the elements of methods for the conduct of an umbrella review were held over a 6-month period by members of a methodology working group. The working group comprised six participants who corresponded via teleconference, e-mail and face-to-face meeting during this development period. In October 2013, the methodology was presented in a workshop at the Joanna Briggs Institute Convention. Workshop participants, review authors and methodologists provided further testing, critique and feedback on the proposed methodology. This study describes the methodology and methods developed for the conduct of an umbrella review that includes published systematic reviews and meta-analyses as the analytical unit of the review. Details are provided regarding the essential elements of an umbrella review, including presentation of the review question in a Population, Intervention, Comparator, Outcome format, nuances of the inclusion criteria and search strategy. A critical appraisal tool with 10 questions to

  16. The Application of Best Estimate and Uncertainty Analysis Methodology to Large LOCA Power Pulse in a CANDU 6 Reactor

    International Nuclear Information System (INIS)

    Abdul-Razzak, A.; Zhang, J.; Sills, H.E.; Flatt, L.; Jenkins, D.; Wallace, D.J.; Popov, N.

    2002-01-01

    The paper briefly describes a best estimate plus uncertainty analysis (BE+UA) methodology and presents its prototyping application to the power pulse phase of a limiting large Loss-of-Coolant Accident (LOCA) for a CANDU 6 reactor fuelled with CANFLEX® fuel. The methodology is consistent with and builds on world practice. The analysis is divided into two phases to focus on the dominant parameters for each phase and to allow for the consideration of all identified highly ranked parameters in the statistical analysis and response surface fits for margin parameters. The objective of this analysis is to quantify improvements in predicted safety margins under best estimate conditions. (authors)

  17. Status of Methodology Development for the Evaluation of Proliferation Resistance

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Ko, Won Il; Lee, Jung Won

    2010-01-01

    Concerning the increasing energy demand and the greenhouse effect, nuclear energy is now the most feasible option; recently, even oil-producing countries have planned to build nuclear power plants for energy production. If nuclear systems are to make a major and sustainable contribution to the world's energy supply, future nuclear energy systems must meet specific requirements. One of these requirements is to satisfy proliferation resistance conditions across the entire nuclear system. Therefore, from the beginning of future nuclear energy system development, it is important to consider proliferation resistance to prevent the diversion of nuclear materials, and the misuse of a nuclear system must be considered as well. Moreover, in the import and export of nuclear systems, the evaluation of the proliferation resistance of the nuclear system becomes a key factor. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) program initiated by the IAEA proposed proliferation resistance (PR) as a key component of a future innovative nuclear system (INS), together with sustainability, economics, safety of nuclear installations and waste management. The technology goals for Generation IV (Gen IV) nuclear energy systems (NESs) likewise highlight Proliferation Resistance and Physical Protection (PR and PP), as well as sustainability, safety, reliability and economics. Based on the INPRO and Gen IV studies, a methodology for the evaluation of proliferation resistance has been developed at KAERI. A systematic evaluation procedure was set up and the indicators for the procedure were decided; the methodology covers the evaluation from the total nuclear system down to individual processes. Therefore, in this study, the detailed procedure for the evaluation of proliferation resistance and the newly proposed additional indicators are described, and several conditions are proposed to increase the proliferation resistance of future nuclear systems. The assessment of PR

  18. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations are those that involve such a variety of scales and such physical complexity that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, the analysis of the uncertainty included in simulations is needed to reveal the sensitivity of uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to derive a degree of certainty by verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as a set of computing processes with deductive and inductive approaches, by reference to human reasoning processes. Our idea is to execute deductive and inductive simulations corresponding to deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  19. A pattern recognition methodology for evaluation of load profiles and typical days of large electricity customers

    International Nuclear Information System (INIS)

    Tsekouras, G.J.; Kotoulas, P.B.; Tsirekis, C.D.; Dialynas, E.N.; Hatziargyriou, N.D.

    2008-01-01

    This paper describes a pattern recognition methodology for the classification of the daily chronological load curves of each large electricity customer, in order to estimate the customer's typical days and the respective representative daily load profiles. It is based on pattern recognition methods, such as k-means, self-organizing maps (SOM), fuzzy k-means and hierarchical clustering, which are theoretically described and properly adapted. The parameters of each clustering method are selected by an optimization process, which is applied separately for each of six adequacy measures. The results can be used for the short-term and mid-term load forecasting of each consumer, for the choice of the proper tariffs and for feasibility studies of demand side management programs. The methodology is applied analytically to one medium voltage industrial customer and synoptically to a set of medium voltage customers of the Greek power system. The results of the clustering methods are presented and discussed. (author)
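    A minimal version of the clustering step, classifying synthetic 24-hour load curves with k-means, is sketched below; the load shapes, the cluster count and the absence of the paper's adequacy-measure optimization are all simplifications of this example.

```python
import numpy as np

# K-means clustering of daily chronological load curves (illustrative sketch).
# The adequacy measures and parameter-selection loop of the paper are not
# reproduced; synthetic 24-hour load curves and the cluster count are assumed.
rng = np.random.default_rng(1)
hours = np.arange(24)
weekday = 50 + 40 * np.exp(-((hours - 11) ** 2) / 18)   # synthetic daily shapes
weekend = 35 + 10 * np.exp(-((hours - 19) ** 2) / 8)
curves = np.vstack([weekday + rng.normal(0, 3, 24) for _ in range(200)] +
                   [weekend + rng.normal(0, 3, 24) for _ in range(100)])

def kmeans(X, k, iters=50):
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.vstack([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

labels, profiles = kmeans(curves, k=2)
# Each row of `profiles` is a representative daily load profile of one typical day.
print(np.bincount(labels), profiles.shape)
```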

  20. Development of a simplified statistical methodology for nuclear fuel rod internal pressure calculation

    International Nuclear Information System (INIS)

    Kim, Kyu Tae; Kim, Oh Hwan

    1999-01-01

    A simplified statistical methodology is developed in order both to reduce the over-conservatism of the deterministic methodologies employed for PWR fuel rod internal pressure (RIP) calculation and to simplify the complicated calculation procedure of the widely used statistical methodology, which employs the response surface method and Monte Carlo simulation. The simplified statistical methodology employs the system moment method with a deterministic approach in determining the maximum variance of RIP. The maximum RIP variance is determined as the square sum, over all input variables considered, of the maximum value of the mean RIP times the corresponding RIP sensitivity factor. This approach makes the simplified statistical methodology much more efficient in routine reload core design analysis, since it eliminates the numerous calculations required for the power history-dependent RIP variance determination. The simplified statistical methodology is shown to be more conservative in generating the RIP distribution than the widely used statistical methodology. Comparison of the significance of each input variable to RIP indicates that the fission gas release model is the most significant input variable. (author). 11 refs., 6 figs., 2 tabs
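    The system moment method described above propagates input uncertainties to the RIP variance through first-order sensitivity factors; the sketch below shows the arithmetic, with sensitivity factors and input uncertainties that are invented for illustration.

```python
import math

# First-order system moment (propagation-of-moments) sketch for rod internal
# pressure (RIP). The sensitivity factors and input uncertainties below are
# hypothetical; the paper's plant- and fuel-specific values are not reproduced.
mean_rip = 10.0                      # MPa, nominal (mean) RIP
# relative sensitivity factor s_i = (dRIP/RIP)/(dx_i/x_i) and relative 1-sigma
# uncertainty of each input variable (all values assumed)
inputs = {
    "fission_gas_release": (1.20, 0.10),
    "fuel_swelling":       (0.40, 0.08),
    "clad_creep":          (0.30, 0.05),
}

# sigma_RIP^2 = sum_i (mean_RIP * s_i * sigma_i)^2
var_rip = sum((mean_rip * s * sig) ** 2 for s, sig in inputs.values())
sigma_rip = math.sqrt(var_rip)
print(f"RIP = {mean_rip:.2f} +/- {sigma_rip:.2f} MPa (1 sigma), "
      f"95th percentile ~ {mean_rip + 1.645 * sigma_rip:.2f} MPa")
```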

  1. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dorp, F van [NAGRA (Switzerland); and others

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all the steps in the recommended methodology. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example, because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The starting point for the methodology has three elements. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development. The International FEP List includes FEPs to do with the assessment context. The context examined in detail by

  3. Development of an Improved Methodology to Assess Potential Unconventional Gas Resources

    International Nuclear Information System (INIS)

    Salazar, Jesus; McVay, Duane A.; Lee, W. John

    2010-01-01

    Considering the important role played today by unconventional gas resources in North America and their enormous potential for the future around the world, it is vital to both policy makers and industry that the volumes of these resources and the impact of technology on these resources be assessed. To provide for optimal decision making regarding energy policy, research funding, and resource development, it is necessary to reliably quantify the uncertainty in these resource assessments. Since the 1970s, studies to assess potential unconventional gas resources have been conducted by various private and governmental agencies, the most rigorous of which was by the United States Geological Survey (USGS). The USGS employed a cell-based, probabilistic methodology which used analytical equations to calculate distributions of the resources assessed. USGS assessments have generally produced distributions for potential unconventional gas resources that, in our judgment, are unrealistically narrow for what are essentially undiscovered, untested resources. In this article, we present an improved methodology to assess potential unconventional gas resources. Our methodology is a stochastic approach that includes Monte Carlo simulation and correlation between input variables. Application of the improved methodology to the Uinta-Piceance province of Utah and Colorado with USGS data validates the means and standard deviations of resource distributions produced by the USGS methodology, but reveals that these distributions are not right skewed, as expected for a natural resource. Our investigation indicates that the unrealistic shape and width of the gas resource distributions are caused by the use of narrow triangular input parameter distributions. The stochastic methodology proposed here is more versatile and robust than the USGS analytic methodology. Adoption of the methodology, along with a careful examination and revision of input distributions, should allow a more realistic
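    The stochastic approach described in the record, Monte Carlo simulation with correlated inputs, can be sketched in a few lines; the distribution shapes, parameter values and correlation coefficient below are hypothetical and are not those used by the USGS or the authors.

```python
import numpy as np

# Monte Carlo aggregation of a cell-based resource assessment with correlated
# inputs (illustrative sketch). All distributions and values are hypothetical.
rng = np.random.default_rng(42)
n = 100_000
rho = 0.6                                        # assumed correlation: cells vs EUR

# correlated standard normals via Cholesky factorisation of the correlation matrix
z = rng.standard_normal((n, 2)) @ np.linalg.cholesky([[1.0, rho], [rho, 1.0]]).T
untested_cells = np.exp(np.log(5000) + 0.40 * z[:, 0])    # lognormal cell count
eur_per_cell = np.exp(np.log(0.5) + 0.60 * z[:, 1])       # lognormal EUR, Bcf
success_ratio = rng.triangular(0.2, 0.4, 0.7, size=n)     # fraction of productive cells

resource = untested_cells * success_ratio * eur_per_cell   # total potential resource, Bcf
p10, p50, p90 = np.percentile(resource, [90, 50, 10])      # oil-industry P10/P50/P90 convention
print(f"mean {resource.mean():,.0f} Bcf, P90 {p90:,.0f}, P50 {p50:,.0f}, P10 {p10:,.0f}")
```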

  4. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    International Nuclear Information System (INIS)

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-01-01

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report

  5. Use of New Methodologies for Students Assessment in Large Groups in Engineering Education

    Directory of Open Access Journals (Sweden)

    B. Tormos

    2014-03-01

    Full Text Available In this paper, a student evaluation methodology which applies the concept of continuous assessment proposed by Bologna is presented for new degrees in higher education. An important part of the student's final grade is based on the performance of several individual assignments throughout the semester. The paper describes the correction system used, which is based on a spreadsheet with macros and a template in which the student provides the solution to each task. The use of this correction system together with the available e-learning platform allows teachers to perform automatic task evaluations compatible with courses with a large number of students. The paper also discusses the different solutions adopted to avoid plagiarism and to ensure that the final grade reflects, as closely as possible, the knowledge acquired by the students.

  6. METHODOLOGICAL BASES OF PUBLIC ADMINISTRATION OF PUBLIC DEVELOPMENT IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Kyrylo Ohdanskyi

    2016-11-01

    Full Text Available The author examines the theoretical bases of the dynamics of community development. According to classical canons, a dynamic process at any level of the management hierarchy can be presented as a complex of changes in its ecological, economic and social components. At present, national policy on implementing the concept of community development does not take into account most theoretical works, which indicate that a mechanism for its effective regulation has not yet been created in Ukraine. In this connection, the author stresses the need to apply effective approaches to the public administration of community development under modern Ukrainian conditions. The subject of the article is the analysis of the process of community development and the methodological bases for choosing options for managing this process; the systems approach is chosen as the research methodology. The aim is to analyse the theoretical bases and to develop new approaches to the public administration of community development. The author divides the process of community development into its social, economic and ecological components and argues that new conceptual approaches to designing tools for regulating community development are objectively necessary. To address this task, the author proposes to use the category of “dynamics”, analyses different interpretations of the term and offers his own interpretation in the context of community development. The research confirms that it is, in the main, methodologically possible to form blocks of quantitative and qualitative factors from specific information of an ecological, economic and social character. The author's research confirms that it is methodologically

  7. The status of proliferation resistance evaluation methodology development in GEN IV international forum

    International Nuclear Information System (INIS)

    Inoue, Naoko; Kawakubo, Yoko; Seya, Michio; Suzuki, Mitsutoshi; Kuno, Yusuke; Senzaki, Masao

    2010-01-01

    The Generation IV Nuclear Energy Systems International Forum (GIF) Proliferation Resistance and Physical Protection Working Group (PR and PP WG) was established in December 2002 in order to develop the PR and PP evaluation methodology for GEN IV nuclear energy systems. The methodology has been studied and established by international consensus. The PR and PP WG activities include development of the measures and metrics; establishment of the framework of PR and PP evaluation; the demonstration study using the Example Sodium Fast Reactor (ESFR), which included the development of three evaluation approaches; the Case Study using the ESFR and four kinds of threat scenarios; the joint study with GIF System Steering Committees (SSCs) of the six reactor design concepts; and the harmonization study with the IAEA's International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO). This paper reviews the status of GIF PR and PP studies and identifies the challenges and directions for applying the methodology to evaluate future nuclear energy systems in Japan. (author)

  8. Development of the fire PSA methodology and the fire analysis computer code system

    International Nuclear Information System (INIS)

    Katsunori, Ogura; Tomomichi, Ito; Tsuyoshi, Uchida; Yusuke, Kasagawa

    2009-01-01

    Fire PSA methodology has been developed and applied to NPPs in Japan for power operation and low-power/shutdown (LPSD) states. The CDFs of the preliminary fire PSA for power operation were higher than those of internal events. A fire propagation analysis code system (CFAST/FDS Network) is being developed and verified through the OECD PRISME Project. Extension of the scope to the LPSD state is planned in order to establish the risk level. To establish the fire risk level precisely, enhancements of the methodology are planned: verification and validation of the phenomenological fire propagation analysis code (CFAST/FDS Network) in the context of fire PSA, and application of the 'Electric Circuit Analysis' of NUREG/CR-6850 and related tests in order to quantify the hot-short effect precisely. Development of a seismic-induced fire PSA method, integrating the existing seismic PSA and fire PSA methods, is ongoing. Fire PSA will be applied to review the validity of fire prevention and mitigation measures.

  9. The Methodology of the Process of Formation of Innovation Management of Enterprises’ Development

    Directory of Open Access Journals (Sweden)

    Prokhorova Viktoriia V.

    2017-12-01

    Full Text Available The article aims to form a methodology for the process of innovation management of enterprise development under modern conditions. A study of the essence of methodology was carried out, and the stages of development of the methods and means of scientific cognition were analyzed. The basic components of a methodology for the innovation management of enterprise development have been defined, namely: methods, types, principles, components, and their systematized aggregate. The relations between empirical and theoretical methods of scientific cognition were considered and defined. It has been determined that the growing volume and scope of scientific views, as well as the deepening of scientific knowledge about the laws and regularities governing the real natural and social world, lead scientists to analyze the methods and means by which modern innovative knowledge and views can be acquired and formed within the enterprise management system.

  10. Non-linear mixed effects modeling - from methodology and software development to driving implementation in drug development science.

    Science.gov (United States)

    Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis

    2005-04-01

    Few scientific contributions have made a significant impact without a champion who had the vision to see their potential in seemingly disparate areas, and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research, and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented, as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.
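    The hierarchical structure at the heart of NLME modeling, population (fixed) effects, between-subject random effects and residual error, can be illustrated with a simple simulation; the one-compartment model and all parameter values below are assumptions chosen for the example, not taken from the paper.

```python
import numpy as np

# Simulation of the hierarchical structure underlying non-linear mixed effects
# (NLME) models: fixed effects, between-subject random effects and residual
# error (illustrative sketch; all parameter values are hypothetical).
rng = np.random.default_rng(7)
n_subjects, dose = 30, 100.0                # mg
t = np.array([0.5, 1, 2, 4, 8, 12, 24.0])   # sampling times (h)

cl_pop, v_pop = 5.0, 50.0                   # population clearance (L/h), volume (L)
omega_cl, omega_v = 0.3, 0.2                # between-subject SD (log scale)
sigma_prop = 0.15                           # proportional residual error

records = []
for i in range(n_subjects):
    cl_i = cl_pop * np.exp(rng.normal(0, omega_cl))   # individual parameters
    v_i = v_pop * np.exp(rng.normal(0, omega_v))
    conc = dose / v_i * np.exp(-cl_i / v_i * t)        # one-compartment IV bolus
    obs = conc * (1 + rng.normal(0, sigma_prop, t.size))
    records.append(obs)

data = np.array(records)                    # subjects x times, ready for NLME fitting
print(data.shape, data[:2].round(2))
```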

  11. Developing new methodology for nuclear power plants vulnerability assessment

    International Nuclear Information System (INIS)

    Kostadinov, Venceslav

    2011-01-01

    New methodology and solution methods for vulnerability assessment can help the overall national energy sector to identify and understand the terrorist threats to, and vulnerabilities of, its critical infrastructure. Moreover, the adopted methodology could help national regulators and agencies to develop and implement vulnerability awareness and education programs for their critical assets, to enhance the security and safe operation of the entire energy infrastructure. New methods can also assist nuclear power plants to develop, validate and disseminate assessments and surveys of new, efficient countermeasures. Consequently, a concise description of the newly developed quantitative method and the adapted methodology for regulatory vulnerability assessment of nuclear power plants is presented.

  12. Development of a heat exchanger root-cause analysis methodology

    International Nuclear Information System (INIS)

    Jarrel, D.B.

    1989-01-01

    The objective of this work is to determine a generic methodology for approaching the accurate identification of the root cause of component failure. Root-cause determinations are an everyday challenge to plant personnel, but they are handled with widely differing degrees of success due to the approaches, levels of diagnostic expertise, and documentation. The criterion for success is simple: If the root cause of the failure has truly been determined and corrected, the same causal failure relationship will not be demonstrated again in the future. The approach to root-cause analysis (RCA) element definition was to first selectively choose and constrain a functionally significant component (in this case a component cooling water to service water heat exchanger) that has demonstrated prevalent failures. Then a root cause of failure analysis was performed by a systems engineer on a large number of actual failure scenarios. The analytical process used by the engineer was documented and evaluated to abstract the logic model used to arrive at the root cause. For the case of the heat exchanger, the actual root-cause diagnostic approach is described. A generic methodology for the solution of the root cause of component failure is demonstrable for this general heat exchanger sample

  13. Cernavoda NPP risk - Based test and maintenance planning - Methodology development

    International Nuclear Information System (INIS)

    Georgescu, G.; Popa, P.; Petrescu, A.; Naum, M.; Gutu, M.

    1997-01-01

    The Cernavoda Power Plant started commercial operation in November 1996. During operation of the nuclear power plant, several mandatory tests and maintenance activities are performed on stand-by safety system components to ensure their availability in case of accident. The basic purpose of such activities is the early detection of any failure or degradation, and the timely correction of deterioration. Because of the large number of such activities, maintaining the emphasis on plant safety and allocating resources becomes difficult. The probabilistic model and methodology can be used effectively to obtain the risk significance of these activities so that resources are directed to the most important areas. The proposed Research Contract activity is strongly connected with other safety-related areas under development. Since the Cernavoda Probabilistic Safety Evaluation Level 1 PSA Study (CPSE) has been performed and is now being revised to take into account as-built information, it is recommended to implement into the model the features needed to support further PSA applications, especially those related to Test and Maintenance optimization. Methods need to be developed to apply the PSA model, including risk information together with other needed information, to Test and Maintenance optimization. Also, in parallel with the CPSE study update, the software interface for the PSA model is under development (Risk Monitor Software class), and methods and models need to be developed so that it can be used for qualified monitoring of the efficiency of the Test and Maintenance Strategy. Similarly, the Data Collection System needs to be appropriate for an ongoing implementation of a risk-based Test and Maintenance Strategy. (author). 4 refs, 1 fig

  14. Development of probabilistic assessment methodology for geologic disposal of radioactive wastes

    International Nuclear Information System (INIS)

    Kimura, H.; Takahashi, T.

    1998-01-01

    The probabilistic assessment methodology is essential to evaluate uncertainties of the long-term radiological consequences associated with geologic disposal of radioactive wastes. We have developed a probabilistic assessment methodology to estimate the influences of parameter uncertainties and variabilities. The exposure scenario considered here is based on a groundwater migration scenario. The computer code system GSRW-PSA thus developed is based on a non-site-specific model and consists of a set of sub-modules for sampling model parameters, calculating the release of radionuclides from engineered barriers, calculating the transport of radionuclides through the geosphere, calculating radiation exposures of the public, and calculating the statistical values relating to the uncertainties and sensitivities. The results of uncertainty analyses for α-nuclides quantitatively indicate that the natural uranium (238U) concentration is suitable as an alternative safety indicator for long-lived radioactive waste disposal, because the estimated range of individual dose equivalent due to the 238U decay chain is narrower than that due to the other decay chain considered (the 237Np decay chain). Detailed international discussion is needed on the PDFs of model parameters and on the PSA methodology used to evaluate the uncertainties due to conceptual models and scenarios. (author)

  15. Progress in the development of methodology for fusion safety systems studies

    International Nuclear Information System (INIS)

    Ho, S.K.; Cambi, G.; Ciattaglia, S.; Fujii-e, Y.; Seki, Y.

    1994-01-01

    The development of a fusion safety systems-study methodology by a consortium of international efforts is presented, including the schematic classification of the overall fusion safety system, qualitative assessment of the fusion system to identify critical accident scenarios, quantitative analysis of accident consequences and risk for safety design evaluation, and system-level analysis of accident consequences and risk for design optimization. The potential application of this methodology to reactor design studies will facilitate the systematic assessment of the safety performance of reactor designs and enhance the impact of safety considerations on the selection of design configurations.

  16. Applying of component system development in object methodology

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    software system and referred to as a software alliance. Both of the mentioned publications deliver a deep treatment of the issues relevant to SWC/SWA, such as creating copies of components (cloning), the establishment and destruction of components at software run-time (dynamic reconfiguration), cooperation of autonomous components, and programmable management of component interfaces depending on internal component functionality and customer requirements (functionality, security, versioning). Nevertheless, even today numerous SWC/SWA systems exist with a highly developed architecture that accepts the vast majority of these requests. On the other hand, development practice for component-based systems with a dynamic architecture (i.e. an architecture with dynamic reconfiguration) and, finally, with a mobile architecture (i.e. an architecture with dynamic component mobility) confirms the inadequacy of the design methods contained in UML 2.0; this is shown in particular by the dissertation thesis (Rych, Weis, 2008). Software Engineering currently has two different approaches to SWC/SWA systems. The first approach is known as component-based software development, CBD (Component Based Development). According to (Szyper, 2002) it is a collection of CBD methodologies that are heavily focused on the construction and re-usability of software components within the architecture. Although CBD does not present a strong theoretical approach, it is nevertheless classified within the general evolution of the SDP (Software Development Process, see (Sommer, 2010)) as one of its two dominant directions. From a structural point of view, a software system consists of self-contained, interoperable architectural units – components based on well-defined interfaces. Classical procedural and object-oriented methodologies largely do not use the component meta-models on which the target component systems are then formed. Component meta-models describe the syntax, semantics of

  17. Development of performance assessment methodology for nuclear waste isolation in geologic media

    Science.gov (United States)

    Bonano, E. J.; Chu, M. S. Y.; Cranwell, R. M.; Davis, P. A.

    The burial of nuclear wastes in deep geologic formations as a means for their disposal is an issue of significant technical and social impact. The analysis of the processes involved can be performed only with reliable mathematical models and computer codes as opposed to conducting experiments because the time scales associated are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium. These are ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the U.S. Nuclear Regulatory Commission.

  18. Contracting Selection for the Development of the Range Rule Risk Methodology

    National Research Council Canada - National Science Library

    1997-01-01

    ...-Effectiveness Risk Tool and contractor selection for the development of the Range Rule Risk Methodology. The audit objective was to determine whether the Government appropriately used the Ordnance and Explosives Cost-Effectiveness Risk Tool...

  19. The Helicobacter Eradication Aspirin Trial (HEAT): A Large Simple Randomised Controlled Trial Using Novel Methodology in Primary Care

    Directory of Open Access Journals (Sweden)

    Jennifer S. Dumbleton

    2015-09-01

    Discussion: HEAT is important medically, because aspirin is so widely used, and methodologically, as a successful trial would show that large-scale studies of important clinical outcomes can be conducted at a fraction of the cost of those conducted by industry, which in turn will help to ensure that trials of primarily medical rather than commercial interest can be conducted successfully in the UK.

  20. Development of a new methodology for quantifying nuclear safety culture

    International Nuclear Information System (INIS)

    Han, Kiyoon; Jae, Moosung

    2017-01-01

    The present study developed a Safety Culture Impact Assessment Model (SCIAM), which consists of a safety culture assessment methodology and a safety culture impact quantification methodology. The SCIAM uses a safety culture impact index (SCII) to monitor the status of the safety culture of nuclear power plants (NPPs) periodically, and it uses the relative core damage frequency (RCDF) to present the impact of safety culture on the safety of NPPs. As a result of applying the SCIAM to the reference plant (Kori 3), a standard for the healthy safety culture of the reference plant is suggested. The SCIAM might help improve the safety of NPPs by monitoring the status of safety culture periodically and presenting a standard for healthy safety culture.

  1. Development of a new methodology for quantifying nuclear safety culture

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of). Dept. of Nuclear Engineering

    2017-01-15

    The present study developed a Safety Culture Impact Assessment Model (SCIAM), which consists of a safety culture assessment methodology and a safety culture impact quantification methodology. The SCIAM uses a safety culture impact index (SCII) to monitor the status of the safety culture of NPPs periodically, and it uses the relative core damage frequency (RCDF) to present the impact of safety culture on the safety of NPPs. As a result of applying the SCIAM to the reference plant (Kori 3), a standard for a healthy safety culture at the reference plant is suggested. The SCIAM might contribute to improving the safety of NPPs (nuclear power plants) by monitoring the status of safety culture periodically and presenting the standard of a healthy safety culture.

  2. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    Science.gov (United States)

    Scott, Elaine P.

    1996-01-01

    A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. A thermal stress analysis therefore requires knowledge of the temperature distributions within the structures, which in turn requires accurate knowledge of the thermal properties, boundary conditions and thermal interface conditions associated with the structural materials. The goal of this multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify these methodologies to analyze complex structures; this can be thought of as a building-block approach. This strategy was intended to promote maximum usability of the resulting estimation procedures by NASA-LaRC researchers through the design of in-house experimentation procedures and through the use of existing general-purpose finite element software.

  3. A New Methodology of Design and Development of Serious Games

    Directory of Open Access Journals (Sweden)

    André F. S. Barbosa

    2014-01-01

    The development of a serious game requires perfect knowledge of the learning domain to obtain the desired results. But it is also true that this may not be enough to develop a successful serious game. First of all, the player has to feel that he is playing a game in which the learning is only a consequence of the playing actions; otherwise, the game is perceived as boring rather than as a fun and engaging activity. For example, the player can catch some items in the scenario and then separate them according to their type (i.e., recycle them). Thus, the main action for the player is catching the items in the scenario, while recycling is a secondary action viewed as a consequence of the first. Sometimes the game design relies on a detailed approach based on the ideas of the developers, because some educational content is difficult to integrate into the games while maintaining the fun factor in first place. In this paper we propose a new methodology for the design and development of serious games that facilitates the integration of educational content into the games. Furthermore, we present a serious game, called “Clean World”, created using this new methodology.

  4. Simplified methodology for Angra 1 containment analysis

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1991-08-01

    A simplified methodology of analysis was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, together with the small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  5. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency for the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document

  6. Project management methodology in the public and private sector: The case of an emerging market

    Directory of Open Access Journals (Sweden)

    Azamat Oinarov

    2017-03-01

    The application of project management methodologies varies from country to country. The preference for a particular methodology largely depends on the specific features of the project management system in use. The aim of the paper is to draw the attention of project-involved readers to the need to develop, not a guide, but a specific project management methodology for projects in the public-private sector. The objective pursued by the paper is to provide useful recommendations for improving the existing methodologies of project management in the public-private sector. Kazakhstan's experience in implementing project management methodologies in its public sector is sporadic, while its private sector's use of modern methodologies builds on practices proven by external investors. Against the background of the public sector's low exposure to the best project management methodologies, the paper reviews existing international project management methodologies and develops recommendations on the methodology most suitable for a developing country's public-private sector, using Kazakhstan as an example.

  7. Methodology Development for SiC Sensor Signal Modelling in the Nuclear Reactor Radiation Environments

    International Nuclear Information System (INIS)

    Cetnar, J.; Krolikowski, I.P.

    2013-06-01

    This paper deals with a SiC detector simulation methodology for signal formation by neutrons and induced secondary radiation, as well as its inverse interpretation. The primary goal is to achieve the SiC capability of simultaneous spectroscopic measurements of neutrons and gamma-rays, for which an appropriate methodology for detector signal modelling and interpretation must be adopted. The process of detector simulation is divided into two basically separate but actually interconnected sections. The first is the forward simulation of detector signal formation in the field of the primary neutron and secondary radiations, whereas the second is the inverse problem of finding a representation of the primary radiation based on the measured detector signals. The applied methodology under development is based on the Monte Carlo description of radiation transport and analysis of the reactor physics. The methodology of SiC detector signal interpretation will be based on the existing experience in neutron metrology developed in the past for various neutron and gamma-ray detection systems. Since the novel sensors based on SiC are characterised by a new structure, yet to be finally designed, the methodology for spectroscopic particle fluence measurement must be developed while giving productive feedback to the design process of the SiC sensor, in order to arrive at the best possible design. (authors)

  8. Development of methodology and direction of practice administrative neuromarketing

    OpenAIRE

    Glushchenko V.; Glushchenko I.

    2018-01-01

    The development of the methodology and of practical aspects of the application of administrative neuromarketing is the subject of this work; the subject of the article is administrative neuromarketing in the organization. The article investigates the concept and content of administrative neuromarketing, its philosophy, culture, functions, tasks and principles, and the technique of the logical analysis of the possibility of applying methods of administrative neuromarketing for incre...

  9. Development and Attestation of Gamma-Ray Measurement Methodologies for use by Rostekhnadzor Inspectors in the Russian Federation

    International Nuclear Information System (INIS)

    Jeff Sanders

    2006-01-01

    Development and attestation of gamma-ray non-destructive assay measurement methodologies for use by inspectors of the Russian Federal Service for Environmental, Technological, and Nuclear Oversight (Rostekhnadzor, formerly Gosatomnadzor or GAN), as well as for use by Russian nuclear facilities, has been completed. Specifically, a methodology utilizing the gamma-ray multi group analysis (MGA) method for determining plutonium isotopic composition has been developed, while existing methodologies to determining uranium enrichment and isotopic composition have been revised to make them more appropriate to the material types and conditions present in nuclear facilities in the Russian Federation. This paper will discuss the development and revision of these methodologies, the metrological characteristics of the final methodologies, as well as the limitations and concerns specific to the utilization of these analysis methods in the Russian Federation

  10. A methodology to modify land uses in a transit oriented development scenario.

    Science.gov (United States)

    Sahu, Akshay

    2018-05-01

    Developing nations are adopting transit oriented development (TOD) strategies to decongest their transportation systems. These strategies are often adopted after the preparation of land use plans. The goal of this study was to build a methodology to modify these land uses using soft computing, which can help to produce alternative land use plans relevant to TOD. The methodology incorporates TOD characteristics and objectives. Global TOD parameters (density, diversity, and distance to transit) were studied. Expert opinions gave weights and ranges for the parameters in an Indian TOD scenario. Rules to allocate land use were developed and objective functions were defined. Four objectives were used: first, to maximize employment density, residential density and the percentage of mixed land use; second, to shape density and diversity with respect to distance; third, to minimize the degree of land use change; and fourth, to increase the compactness of the land use allocation. The methodology was applied to two sectors of Naya Raipur, the new planned administrative capital of the state of Chhattisgarh, India. The city has implemented TOD in the form of a Bus Rapid Transit System (BRTS) over an existing land use. One thousand random plans were generated through the methodology. The top 30 plans were selected as the parent population for modification through a genetic algorithm (GA), and alternative plans were generated at the end of the GA cycle. The best alternative plan was compared with successful BRTS and TOD land uses for its merits and demerits. It was also compared with the initial land use plan for empirical validation. Copyright © 2017 Elsevier Ltd. All rights reserved.
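
    A minimal sketch of the genetic-algorithm step described above is given below. It is not the authors' implementation: the parcel grid, land-use codes, fitness weights and GA settings are hypothetical, and the fitness function only mimics the flavour of the density, diversity and degree-of-change objectives.

        import random

        LAND_USES = ["residential", "commercial", "mixed", "open"]
        N_PARCELS = 100                  # hypothetical plan size
        POP_SIZE, GENERATIONS, MUTATION_RATE = 30, 200, 0.02

        def fitness(plan, initial_plan):
            mixed_share = plan.count("mixed") / N_PARCELS                          # diversity proxy
            density_proxy = (plan.count("residential") + plan.count("commercial")) / N_PARCELS
            change = sum(a != b for a, b in zip(plan, initial_plan)) / N_PARCELS   # degree of change
            return 0.4 * density_proxy + 0.4 * mixed_share - 0.2 * change

        def crossover(a, b):
            cut = random.randrange(1, N_PARCELS)
            return a[:cut] + b[cut:]

        def mutate(plan):
            return [random.choice(LAND_USES) if random.random() < MUTATION_RATE else u
                    for u in plan]

        def evolve(initial_plan):
            population = [mutate(initial_plan) for _ in range(POP_SIZE)]
            for _ in range(GENERATIONS):
                population.sort(key=lambda p: fitness(p, initial_plan), reverse=True)
                parents = population[:10]                       # best plans become parents
                children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                            for _ in range(POP_SIZE - len(parents))]
                population = parents + children
            return max(population, key=lambda p: fitness(p, initial_plan))

        if __name__ == "__main__":
            base_plan = [random.choice(LAND_USES) for _ in range(N_PARCELS)]
            best = evolve(base_plan)
            print("best fitness:", round(fitness(best, base_plan), 3))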

  11. Astatine-211 Radiochemistry: The Development Of Methodologies For High Activity Level Radiosynthesis

    International Nuclear Information System (INIS)

    Zalutsky, Michael R.

    2012-01-01

    Targeted radionuclide therapy is emerging as a viable approach for cancer treatment because of its potential for delivering curative doses of radiation to malignant cell populations while sparing normal tissues. Alpha particles such as those emitted by 211At are particularly attractive for this purpose because of their short path length in tissue and high energy, making them highly effective in killing cancer cells. The current impact of targeted radiotherapy in the clinical domain remains limited despite the fact that in many cases, potentially useful molecular targets and labeled compounds have already been identified. Unfortunately, putting these concepts into practice has been impeded by limitations in radiochemistry methodologies. A critical problem is that the synthesis of therapeutic radiopharmaceuticals provides additional challenges in comparison to diagnostic reagents because of the need to perform radio-synthesis at high levels of radioactivity. This is particularly important for α-particle emitters such as 211At because they deposit large amounts of energy in a highly focal manner. The overall objective of this project is to develop convenient and reproducible radiochemical methodologies for the radiohalogenation of molecules with the α-particle emitter 211At at the radioactivity levels needed for clinical studies. Our goal is to address two problems in astatine radiochemistry: First, a well known characteristic of 211At chemistry is that yields for electrophilic astatination reactions decline as the time interval after radionuclide isolation from the cyclotron target increases. This is a critical problem that must be addressed if cyclotrons are to be able to efficiently supply 211At to remote users. And second, when the preparation of high levels of 211At-labeled compounds is attempted, the radiochemical yields can be considerably lower than those encountered at tracer dose. For these reasons, clinical evaluation of promising 211At-labeled targeted

  12. ASTATINE-211 RADIOCHEMISTRY: THE DEVELOPMENT OF METHODOLOGIES FOR HIGH ACTIVITY LEVEL RADIOSYNTHESIS

    Energy Technology Data Exchange (ETDEWEB)

    MICHAEL R. ZALUTSKY

    2012-08-08

    Targeted radionuclide therapy is emerging as a viable approach for cancer treatment because of its potential for delivering curative doses of radiation to malignant cell populations while sparing normal tissues. Alpha particles such as those emitted by 211At are particularly attractive for this purpose because of their short path length in tissue and high energy, making them highly effective in killing cancer cells. The current impact of targeted radiotherapy in the clinical domain remains limited despite the fact that in many cases, potentially useful molecular targets and labeled compounds have already been identified. Unfortunately, putting these concepts into practice has been impeded by limitations in radiochemistry methodologies. A critical problem is that the synthesis of therapeutic radiopharmaceuticals provides additional challenges in comparison to diagnostic reagents because of the need to perform radio-synthesis at high levels of radioactivity. This is particularly important for α-particle emitters such as 211At because they deposit large amounts of energy in a highly focal manner. The overall objective of this project is to develop convenient and reproducible radiochemical methodologies for the radiohalogenation of molecules with the α-particle emitter 211At at the radioactivity levels needed for clinical studies. Our goal is to address two problems in astatine radiochemistry: First, a well known characteristic of 211At chemistry is that yields for electrophilic astatination reactions decline as the time interval after radionuclide isolation from the cyclotron target increases. This is a critical problem that must be addressed if cyclotrons are to be able to efficiently supply 211At to remote users. And second, when the preparation of high levels of 211At-labeled compounds is attempted, the radiochemical yields can be considerably lower than those encountered at tracer dose. For these reasons, clinical evaluation of promising 211At

  13. Bibliographic survey on methodologies for development of health database of the population in case of cancer occurrences

    International Nuclear Information System (INIS)

    Cavinato, Christianne C.; Andrade, Delvonei A. de; Sabundjian, Gaiane; Diz, Maria Del Pilar E.

    2014-01-01

    The objective is to survey existing methodologies for the development of a public health database focusing on the health (fatal and nonfatal cancer occurrences) of the population surrounding a nuclear facility, for the purpose of calculating the facility's environmental cost. From the methodologies found for developing this type of database, a methodology will be developed and applied to the internal public of IPEN/CNEN-SP, Brazil, as a pre-test for the acquisition of the desired health information.

  14. Contextual assessment of organisational culture - methodological development in two case studies

    International Nuclear Information System (INIS)

    Reiman, T.; Oedewald, P.

    2002-01-01

    Despite the acknowledged significance of organisational culture in the nuclear field, previous cultural studies have concentrated on purely safety related matters, or been only descriptive in nature. New kinds of methods, taking into account the overall objectives of the organisation, were needed to assess culture and develop its working practices appropriately. VTT developed the Contextual Assessment of Organisational Culture (CAOC) methodology during the FINNUS programme. The methodology utilises two concepts, organisational culture and core task. The core task can be defined as the core demands and content of work that the organisation has to accomplish in order to be effective. The core task concept is used in assessing the central dimensions of the organisation's culture. Organisational culture is defined as a solution the company has generated in order to fulfil the perceived demands of its core task. The CAOC-methodology was applied in two case studies, in the Radiation and Nuclear Safety Authority of Finland and in the maintenance unit of Loviisa NPP. The aim of the studies was not only to assess the given culture, but also to give the personnel new concepts and new tools for reflecting on their organisation, their jobs and on appropriate working practices. The CAOC-methodology contributes to the design and redesign of work in complex sociotechnical systems. It strives to enhance organisations' capability to assess their current working practices and the meanings attached to them and compare these to the actual demands of their basic mission and so change unadaptive practices. (orig.)

  15. Methodology for considering environments and culture in developing information security systems

    OpenAIRE

    Mwakalinga, G Jeffy; Kowalski, Stewart; Yngström, Louise

    2009-01-01

    In this paper, we describe a methodology for considering culture of users and environments when developing information security systems. We discuss the problem of how researchers and developers of security for information systems have had difficulties in considering culture of users and environments when they develop information security systems. This has created environments where people serve technology instead of technology serving people. Users have been considered just as any other compo...

  16. Systematic methodological review: developing a framework for a qualitative semi-structured interview guide.

    Science.gov (United States)

    Kallio, Hanna; Pietilä, Anna-Maija; Johnson, Martin; Kangasniemi, Mari

    2016-12-01

    To produce a framework for the development of a qualitative semi-structured interview guide. Rigorous data collection procedures fundamentally influence the results of studies. The semi-structured interview is a common data collection method, but methodological research on the development of a semi-structured interview guide is sparse. Systematic methodological review. We searched PubMed, CINAHL, Scopus and Web of Science for methodological papers on semi-structured interview guides from October 2004-September 2014. Having examined 2,703 titles and abstracts and 21 full texts, we finally selected 10 papers. We analysed the data using the qualitative content analysis method. Our analysis resulted in new synthesized knowledge on the development of a semi-structured interview guide, including five phases: (1) identifying the prerequisites for using semi-structured interviews; (2) retrieving and using previous knowledge; (3) formulating the preliminary semi-structured interview guide; (4) pilot testing the guide; and (5) presenting the complete semi-structured interview guide. Rigorous development of a qualitative semi-structured interview guide contributes to the objectivity and trustworthiness of studies and makes the results more plausible. Researchers should consider using this five-step process to develop a semi-structured interview guide and justify the decisions made during it. © 2016 John Wiley & Sons Ltd.

  17. Technical Support Document: Development of the Advanced Energy Design Guide for Large Hospitals - 50% Energy Savings

    Energy Technology Data Exchange (ETDEWEB)

    Bonnema, E.; Leach, M.; Pless, S.

    2013-06-01

    This Technical Support Document describes the process and methodology for the development of the Advanced Energy Design Guide for Large Hospitals: Achieving 50% Energy Savings Toward a Net Zero Energy Building (AEDG-LH) ASHRAE et al. (2011b). The AEDG-LH is intended to provide recommendations for achieving 50% whole-building energy savings in large hospitals over levels achieved by following Standard 90.1-2004. The AEDG-LH was created for a 'standard' mid- to large-size hospital, typically at least 100,000 ft2, but the strategies apply to all sizes and classifications of new construction hospital buildings. Its primary focus is new construction, but recommendations may be applicable to facilities undergoing total renovation, and in part to many other hospital renovation, addition, remodeling, and modernization projects (including changes to one or more systems in existing buildings).

  18. Development of a consistent Monte Carlo-deterministic transport methodology based on the method of characteristics and MCNP5

    International Nuclear Information System (INIS)

    Karriem, Z.; Ivanov, K.; Zamonsky, O.

    2011-01-01

    This paper presents work performed to develop an integrated Monte Carlo-deterministic transport methodology in which the two methods make use of exactly the same general geometry and multigroup nuclear data. The envisioned application of this methodology is in reactor lattice physics methods development and shielding calculations. The methodology will be based on the Method of Long Characteristics (MOC) and the Monte Carlo N-Particle transport code MCNP5. Important initial developments pertaining to ray tracing and the development of an MOC flux solver for the proposed methodology are described. Results showing the viability of the methodology are presented for two 2-D general geometry transport problems. The essential developments presented are the use of MCNP as a geometry construction and ray tracing tool for the MOC, the verification of the ray tracing indexing scheme developed to represent the MCNP geometry in the MOC, and the verification of the prototype 2-D MOC flux solver. (author)
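
    The core of an MOC flux solver of the kind mentioned above is a sweep along pre-traced characteristic segments, attenuating the angular flux across each flat-source region and tallying its contribution to the scalar flux. The following single-group Python sketch illustrates that sweep; it is not the prototype solver described in the paper, and the segment list, cross sections and quadrature weights are hypothetical.

        import math
        from collections import namedtuple

        Segment = namedtuple("Segment", "region length")   # flat-source region index, chord length [cm]

        def moc_sweep(rays, sigma_t, source, weights, volumes):
            """One transport sweep: returns region-averaged scalar fluxes."""
            scalar_flux = [0.0] * len(sigma_t)
            for ray, w in zip(rays, weights):              # w = combined angular/spatial quadrature weight
                psi = 0.0                                  # vacuum incoming angular flux
                for seg in ray:
                    r = seg.region
                    q_over_sig = source[r] / sigma_t[r]
                    att = math.exp(-sigma_t[r] * seg.length)
                    psi_out = psi * att + q_over_sig * (1.0 - att)
                    # segment-averaged angular flux contributes to the region scalar flux
                    seg_avg = q_over_sig + (psi - psi_out) / (sigma_t[r] * seg.length)
                    scalar_flux[r] += w * seg.length * seg_avg
                    psi = psi_out
            return [phi / v for phi, v in zip(scalar_flux, volumes)]

        if __name__ == "__main__":
            rays = [[Segment(0, 0.5), Segment(1, 1.0)],
                    [Segment(1, 0.8), Segment(0, 0.4)]]
            print(moc_sweep(rays, sigma_t=[0.6, 1.2], source=[1.0, 0.2],
                            weights=[0.5, 0.5], volumes=[1.0, 1.0]))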

  19. Latest developments on safety analysis methodologies at the Juzbado plant

    International Nuclear Information System (INIS)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A.

    2010-01-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, namely the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, such as regulator requirements, operational experience and, of course, the strong commitment of ENUSA to maintain the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, and within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and the investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated earlier in 2009 to assess the actual operating conditions of all the systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points related to these three methodologies as well as an outline of the results obtained so far. (authors)

  20. A probabilistic assessment of large scale wind power development for long-term energy resource planning

    Science.gov (United States)

    Kennedy, Scott Warren

    A steady decline in the cost of wind turbines and increased experience in their successful operation have brought this technology to the forefront of viable alternatives for large-scale power generation. Methodologies for understanding the costs and benefits of large-scale wind power development, however, are currently limited. In this thesis, a new and widely applicable technique for estimating the social benefit of large-scale wind power production is presented. The social benefit is based upon wind power's energy and capacity services and the avoidance of environmental damages. The approach uses probabilistic modeling techniques to account for the stochastic interaction between wind power availability, electricity demand, and conventional generator dispatch. A method for including the spatial smoothing effect of geographically dispersed wind farms is also introduced. The model has been used to analyze potential offshore wind power development to the south of Long Island, NY. If natural gas combined cycle (NGCC) and integrated gasifier combined cycle (IGCC) are the alternative generation sources, wind power exhibits a negative social benefit due to its high capacity cost and the relatively low emissions of these advanced fossil-fuel technologies. Environmental benefits increase significantly if charges for CO2 emissions are included. Results also reveal a diminishing social benefit as wind power penetration increases. The dependence of wind power benefits on natural gas and coal prices is also discussed. In power systems with a high penetration of wind generated electricity, the intermittent availability of wind power may influence hourly spot prices. A price responsive electricity demand model is introduced that shows a small increase in wind power value when consumers react to hourly spot prices. The effectiveness of this mechanism depends heavily on estimates of the own- and cross-price elasticities of aggregate electricity demand. This work makes a valuable

  1. Summary of FY-1978 consultation input for Scenario Methodology Development

    International Nuclear Information System (INIS)

    Scott, B.L.; Benson, G.L.; Craig, R.A.; Harwell, M.A.

    1979-11-01

    The Scenario Methodology Development task is concerned with evaluating the geologic system surrounding an underground repository and describing the phenomena (volcanic, seismic, meteorite, hydrologic, tectonic, climate, etc.) which could perturb the system and possibly cause loss of repository integrity. This document includes 14 individual papers. Separate abstracts were prepared for all 14 papers

  2. Why did humans develop a large brain?

    OpenAIRE

    Muscat Baron, Yves

    2012-01-01

    "Of all animals, man has the largest brain in proportion to his size"- Aristotle. Dr Yves Muscat Baron shares his theory on how humans evolved large brains. The theory outlines how gravity could have helped humans develop a large brain- the author has named the theory 'The Gravitational Vascular Theory'. http://www.um.edu.mt/think/why-did-humans-develop-a-large-brain/

  3. Organizational and technological genesis as a tool for strategic planning of large-scale real estate development projects

    Directory of Open Access Journals (Sweden)

    Gusakova Elena

    2018-01-01

    Conceptual planning and implementation of large-scale real estate development projects is one of the most difficult tasks in the organization of construction. Russian practice has accumulated extensive experience in the development, complex reorganization and redevelopment of large development areas. The methodological basis for solving such problems is organizational and technological genesis, which considers the development of the project over its full life cycle. An analysis of this experience points to the formation of new and effective approaches and methods within organizational and technological genesis. Among them, the most significant and universal approaches should be highlighted: the concept of real estate development, which explains the reasons and objective needs for project transformations during the life cycle, and serves to increase the adaptive capabilities of design decisions and the project's suitability for the most likely future changes; the development project of joint action, which is based on a balance of the interests of project participants; and master planning of the life cycle stages of the project and its subprojects, based on a rethinking of the theory and methods of construction organization, which allows construction sites and related subprojects to be rationally localized while keeping the rest of the development area free of the negative effects of construction on comfortable living and work.

  4. Development of a flight software testing methodology

    Science.gov (United States)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
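
    The assertion technique can be pictured with a short sketch. The original experiment instrumented digital flight control software in Ada with a watchdog task; the Python fragment below is only a hypothetical illustration of executable assertions that check the reasonableness of control parameters each cycle and record violations instead of silently propagating a fault.

        import math

        VIOLATIONS = []

        def check(condition, message):
            """Executable assertion: record a violation rather than abort the control loop."""
            if not condition:
                VIOLATIONS.append(message)

        def control_step(altitude_m, pitch_deg, commanded_elevator_deg):
            # assertions on inputs also exercise upstream computations (collateral testing)
            check(0.0 <= altitude_m <= 20000.0, "altitude out of range")
            check(-30.0 <= pitch_deg <= 30.0, "pitch out of range")
            check(math.isfinite(commanded_elevator_deg), "elevator command not finite")

            elevator = max(-25.0, min(25.0, commanded_elevator_deg))   # actuator limit
            check(abs(elevator - commanded_elevator_deg) < 10.0,
                  "command saturated by more than 10 degrees")
            return elevator

        if __name__ == "__main__":
            control_step(3000.0, 5.0, 40.0)      # deliberately unreasonable command
            print("assertion violations:", VIOLATIONS)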

  5. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    Science.gov (United States)

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

    Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. In particular, we advocate that the scale of image retrieval systems should be significantly increased, to the point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity, and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.

  6. Methodology, status and plans for development and assessment of Cathare code

    Energy Technology Data Exchange (ETDEWEB)

    Bestion, D.; Barre, F.; Faydide, B. [CEA - Grenoble (France)

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for the development and assessment of the code. Analytical experiments with separate effect tests and component tests are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of each Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future developments of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the 3-D model.

  7. Large shaft development test plan

    International Nuclear Information System (INIS)

    Krug, A.D.

    1984-03-01

    This test plan proposes the conduct of a large shaft development test at the Hanford site in support of the repository development program. The purpose and objective of the test plan is to obtain the information necessary to establish feasibility and to predict the performance of the drilling system used to drill large diameter shafts. The test plan is based upon drilling a 20 ft diameter shaft to a depth of 1,000 feet. The test plan specifies a series of tests to evaluate the performance of the downhole assembly, the performance of the rig, and the ability of the system to cope with geologic hazards. The quality of the hole produced will also be determined. This test plan is considered to be preliminary in that it was prepared as input for the decision to determine if development testing is required in this area. Should the decision be made to proceed with development testing, this test plan shall be updated and revised. 6 refs., 2 figs., 3 tabs

  8. Development of extreme rainfall PRA methodology for sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Nishino, Hiroyuki; Kurisaka, Kenichi; Yamano, Hidemasa

    2016-01-01

    The objective of this study is to develop a probabilistic risk assessment (PRA) methodology for extreme rainfall, focusing on the decay heat removal system of a sodium-cooled fast reactor. For the extreme rainfall, the annual excess probability as a function of hazard intensity was statistically estimated based on meteorological data. To identify core damage sequences, event trees were developed by assuming scenarios in which structures, systems and components (SSCs) important to safety are flooded by rainwater entering the buildings through gaps in the doors, and the SSCs fail when the level of rainwater on the ground or on the roof of the building rises above the thresholds of the doors on the first floor or on the roof during the rainfall. To estimate the failure probability of the SSCs, the rise in water level was estimated from the difference between precipitation and drainage capacity. By combining the annual excess probability with the failure probability of the SSCs, the event trees were quantified to obtain the core damage frequency, and the PRA methodology for rainfall was thereby developed. (author)
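
    The quantification step, combining the annual excess probability of a given rainfall intensity with the conditional failure probability of the SSCs, can be sketched as follows. The hazard bins, drainage capacity, door threshold and rainfall duration below are illustrative assumptions, not values from the study.

        # hazard bins: (precipitation rate [mm/h], annual excess probability of reaching that rate)
        HAZARD_BINS = [(50.0, 1e-1), (100.0, 1e-2), (200.0, 1e-3), (400.0, 1e-5)]
        DRAINAGE_CAPACITY_MM_H = 150.0     # hypothetical site drainage capacity
        DOOR_THRESHOLD_MM = 100.0          # water level at which rainwater reaches the SSCs
        DURATION_H = 2.0                   # assumed duration of the extreme rainfall

        def conditional_failure_probability(rate_mm_h):
            """SSC failure assumed when the accumulated water level exceeds the door threshold."""
            water_level = max(0.0, rate_mm_h - DRAINAGE_CAPACITY_MM_H) * DURATION_H
            return 1.0 if water_level > DOOR_THRESHOLD_MM else 0.0

        def core_damage_frequency(bins):
            cdf = 0.0
            for i, (rate, excess_prob) in enumerate(bins):
                next_excess = bins[i + 1][1] if i + 1 < len(bins) else 0.0
                bin_frequency = excess_prob - next_excess      # events/yr with intensity in this bin
                cdf += bin_frequency * conditional_failure_probability(rate)
            return cdf

        if __name__ == "__main__":
            print(f"core damage frequency ~ {core_damage_frequency(HAZARD_BINS):.1e} per year")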

  9. Methodological quality of systematic reviews on influenza vaccination.

    Science.gov (United States)

    Remschmidt, Cornelius; Wichmann, Ole; Harder, Thomas

    2014-03-26

    There is a growing body of evidence on the risks and benefits of influenza vaccination in various target groups. Systematic reviews are of particular importance for policy decisions. However, their methodological quality can vary considerably. To investigate the methodological quality of systematic reviews on influenza vaccination (efficacy, effectiveness, safety) and to identify influencing factors. A systematic literature search for systematic reviews on influenza vaccination was performed, using MEDLINE, EMBASE and three additional databases (1990-2013). Review characteristics were extracted and the methodological quality of the reviews was evaluated using the assessment of multiple systematic reviews (AMSTAR) tool. The U-test, Kruskal-Wallis test, chi-square test, and multivariable linear regression analysis were used to assess the influence of review characteristics on the AMSTAR score. Forty-six systematic reviews fulfilled the inclusion criteria. Average methodological quality was high (median AMSTAR score: 8), but variability was large (AMSTAR range: 0-11). Quality did not differ significantly according to vaccination target group. Cochrane reviews had higher methodological quality than non-Cochrane reviews (p=0.001). Detailed analysis showed that this was due to better study selection and data extraction, inclusion of unpublished studies, and better reporting of study characteristics (all p<0.05). In the adjusted analysis, no other factor, including industry sponsorship or journal impact factor, had an influence on the AMSTAR score. Systematic reviews on influenza vaccination showed large differences regarding their methodological quality. Reviews conducted by the Cochrane collaboration were of higher quality than others. When using systematic reviews to guide the development of vaccination recommendations, the methodological quality of a review should be considered in addition to its content. Copyright © 2014 Elsevier Ltd. All rights reserved.
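
    For illustration, the group comparison referred to as the U-test above corresponds to the Mann-Whitney U test; a minimal sketch with hypothetical AMSTAR scores (not the study data) is shown below.

        from scipy.stats import mannwhitneyu

        cochrane = [9, 10, 11, 8, 10, 9, 11, 10]        # hypothetical AMSTAR scores (0-11)
        non_cochrane = [4, 7, 8, 5, 6, 9, 3, 7, 6]

        stat, p_value = mannwhitneyu(cochrane, non_cochrane, alternative="two-sided")
        print(f"U = {stat:.1f}, p = {p_value:.4f}")     # p < 0.05 would indicate a quality difference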

  10. U.S. Geological Survey Methodology Development for Ecological Carbon Assessment and Monitoring

    Science.gov (United States)

    Zhu, Zhi-Liang; Stackpoole, S.M.

    2009-01-01

    Ecological carbon sequestration refers to transfer and storage of atmospheric carbon in vegetation, soils, and aquatic environments to help offset the net increase from carbon emissions. Understanding capacities, associated opportunities, and risks of vegetated ecosystems to sequester carbon provides science information to support formulation of policies governing climate change mitigation, adaptation, and land-management strategies. Section 712 of the Energy Independence and Security Act (EISA) of 2007 mandates the Department of the Interior to develop a methodology and assess the capacity of our nation's ecosystems for ecological carbon sequestration and greenhouse gas (GHG) flux mitigation. The U.S. Geological Survey (USGS) LandCarbon Project is responding to the Department of Interior's request to develop a methodology that meets specific EISA requirements.

  11. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

    In this thesis a validation methodology to be used in the assessment of vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between the experimental test data and the validity analysis of the simulation model is still lacking. The developed validation paradigm has a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers, although they are aimed at real-world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project, and data handling techniques, validation metrics and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, the simulation is deemed "not invalid". If the simulation model fails to meet the criteria, the model is deemed invalid and model iteration should be performed. The results are analyzed to determine whether they indicate a modeling error or a modeling inadequacy, and whether a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model and defined the limits of application. The tested simulation model is found to be acceptable but valid only in a certain dynamical range. Several insights into the deficiencies of the model are reported in the analysis, but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time- and cost-efficient simulation projects with
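
    The pass/fail logic of the "not invalid"/invalid verdict can be sketched as follows. The metric (normalized RMS error of yaw rate over a maneuver) and the acceptance threshold are assumptions for illustration, not the metrics or criteria defined in the thesis.

        import numpy as np

        def normalized_rms_error(measured, simulated):
            measured, simulated = np.asarray(measured), np.asarray(simulated)
            return np.sqrt(np.mean((simulated - measured) ** 2)) / (np.max(np.abs(measured)) + 1e-12)

        def verdict(measured, simulated, threshold=0.15):
            """Return 'not invalid' if the criterion is met for this maneuver, else 'invalid'."""
            return "not invalid" if normalized_rms_error(measured, simulated) <= threshold else "invalid"

        if __name__ == "__main__":
            t = np.linspace(0.0, 4.0, 200)
            measured_yaw_rate = 0.3 * np.sin(1.5 * t)                 # rad/s, hypothetical test data
            simulated_yaw_rate = 0.28 * np.sin(1.5 * t + 0.05)        # hypothetical model output
            print(verdict(measured_yaw_rate, simulated_yaw_rate))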

  12. Development and Current Status of Skull-Image Superimposition - Methodology and Instrumentation.

    Science.gov (United States)

    Lan, Y

    1992-12-01

    This article presents a review of the literature and an evaluation of the development and application of skull-image superimposition technology, both instrumentation and methodology, contributed by a number of scholars since 1935. Along with a comparison of the methodologies involved in the two superimposition techniques, photographic and video, the author characterizes the techniques in use and the recent advances in computer-based image superimposition processing technology. The major disadvantage of conventional approaches is their reliance on subjective interpretation. Through painstaking comparison and analysis, computer image processing technology can make more conclusive identifications by directly testing and evaluating the various programmed indices. Copyright © 1992 Central Police University.

  13. The SIMRAND methodology - Simulation of Research and Development Projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
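
    In the spirit of the resource-allocation decision described above, a toy Monte Carlo comparison of candidate task sets might look like the sketch below. It is not the SIMRAND code; the task names, cost and payoff distributions and budget are hypothetical.

        import random

        TASKS = {                      # name: (cost_low, cost_high, success_prob, payoff)
            "cell_design_A": (2.0, 5.0, 0.7, 10.0),
            "cell_design_B": (1.0, 3.0, 0.5, 8.0),
            "process_upgrade": (3.0, 6.0, 0.9, 7.0),
        }
        BUDGET = 8.0

        def simulate_set(task_names, n_trials=10000):
            """Expected net value of undertaking a given set of tasks."""
            total = 0.0
            for _ in range(n_trials):
                cost = sum(random.uniform(*TASKS[t][:2]) for t in task_names)
                if cost > BUDGET:                       # over budget: no payoff this trial
                    total -= cost
                    continue
                payoff = sum(TASKS[t][3] for t in task_names if random.random() < TASKS[t][2])
                total += payoff - cost
            return total / n_trials

        if __name__ == "__main__":
            candidate_sets = [("cell_design_A", "process_upgrade"),
                              ("cell_design_B", "process_upgrade"),
                              ("cell_design_A", "cell_design_B")]
            best = max(candidate_sets, key=simulate_set)
            print("preferred task set:", best)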

  14. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Modern challenges of the theory and methodology of accounting are addressed through the formation and implementation of new concepts, the purpose of which is to meet users' needs for both standard and unique information. The development of a methodology for sustainability accounting is a key aspect of the management of an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and to determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for accounting for sustainable development is offered in the article. The set of methods and principles of sustainable development accounting, for both standard and non-standard provisions, has been systematized. The new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objective, subject, object, methods, functions and key aspects.

  15. Addressing the “It Is Just Placebo” Pitfall in CAM: Methodology of a Project to Develop Patient-Reported Measures of Nonspecific Factors in Healing

    Directory of Open Access Journals (Sweden)

    Carol M. Greco

    2013-01-01

    CAM therapies are often dismissed as “no better than placebo”; however, this belief may be overcome through careful analysis of nonspecific factors in healing. To improve trial methodology, we propose that CAM (and conventional) RCTs should evaluate and adjust for the effects of intrapersonal, interpersonal, and environmental factors on outcomes. However, measurement of these is challenging, and there are no brief, precise instruments that are suitable for widespread use in trials and clinical settings. This paper describes the methodology of a project to develop a set of patient-reported instruments that will quantify the nonspecific or “placebo” effects that are in fact specific and active ingredients in healing. The project uses the rigorous instrument-development methodology of the NIH-PROMIS initiative. The methods include (1) integration of patients' and clinicians' opinions with existing literature; (2) development of relevant items; (3) calibration of items on large samples; (4) classical test theory and modern psychometric methods to select the most useful items; (5) development of computerized adaptive tests (CATs) that maximize information while minimizing patient burden; and (6) initial validation studies. The instruments will have the potential to revolutionize clinical trials in both CAM and conventional medicine by quantifying contextual factors that contribute to healing.
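
    The computerized adaptive testing step in item (5) can be illustrated with a small sketch: under a two-parameter logistic IRT model, the next item administered is the unasked item with maximum Fisher information at the current trait estimate. The item parameters, responses and the simple one-step trait update below are assumptions, not the PROMIS calibration.

        import math

        ITEMS = [  # (discrimination a, difficulty b), hypothetical calibrated parameters
            (1.2, -1.0), (0.8, -0.3), (1.5, 0.0), (1.0, 0.6), (1.8, 1.2),
        ]

        def prob_endorse(theta, a, b):
            return 1.0 / (1.0 + math.exp(-a * (theta - b)))

        def information(theta, a, b):
            p = prob_endorse(theta, a, b)
            return a * a * p * (1.0 - p)

        def next_item(theta, asked):
            candidates = [i for i in range(len(ITEMS)) if i not in asked]
            return max(candidates, key=lambda i: information(theta, *ITEMS[i]))

        def update_theta(theta, item, response, step=0.5):
            """Crude one-step ascent on the log-likelihood (real CATs use MLE or EAP estimation)."""
            a, b = ITEMS[item]
            return theta + step * a * (response - prob_endorse(theta, a, b))

        if __name__ == "__main__":
            theta, asked = 0.0, []
            for simulated_response in [1, 1, 0]:          # hypothetical endorsements
                item = next_item(theta, asked)
                asked.append(item)
                theta = update_theta(theta, item, simulated_response)
                print(f"administered item {item}, theta now {theta:+.2f}")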

  16. Simplified methodology for analysis of Angra-1 containment

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1988-01-01

    A simplified methodology of analysis was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, together with the small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author) [pt

  17. Methodological Grounds of Managing Innovation Development of Restaurants

    Directory of Open Access Journals (Sweden)

    Naidiuk V. S.

    2013-12-01

    The goal of the article lies in the identification and further development of the methodological grounds of managing the innovation development of restaurants. Based on a critical analysis of existing scientific views on the interpretation of the essence of the “managing innovation development of an enterprise” notion, the article clarifies this definition. As a result of the study, the article builds a cause-effect diagram for solving the problem of ensuring efficient management of the innovation development of a restaurant, and develops a conceptual scheme for the development and realisation of a strategy of innovation development in a restaurant. It experimentally confirms the hypothesis of a very strong feedback relation between resistance to innovation changes and the share of qualified personnel capable of permanent development (learning and generation of new ideas) in restaurants, and builds a model of the dependency between them. Prospects for further studies in this direction include scientific studies directed at the development of methodical approaches to identifying the level of innovation potential and assessing the efficiency of managing the innovation development of different (by type, class, size, etc.) restaurants. The obtained data could also be used for the development of new, or the improvement of existing, tools of strategic management of innovation development at the micro-level.

  18. Methodologies Developed for EcoCity Related Projects: New Borg El Arab, an Egyptian Case Study

    Directory of Open Access Journals (Sweden)

    Carmen Antuña-Rozado

    2016-08-01

    The aim of the methodologies described here is to propose measures and procedures for developing concepts and technological solutions, adapted to local conditions, to build sustainable communities in developing countries and emerging economies. These methodologies are linked to the EcoCity framework outlined by VTT Technical Research Centre of Finland Ltd. for sustainable community and neighbourhood regeneration and development. The framework is the result of long experience in numerous EcoCity related projects, mainly Nordic and European in scope, which has been reformulated in recent years to respond to local needs in the previously mentioned countries. There is also a particular emphasis on close collaboration with local partners and major stakeholders. In order to illustrate how these methodologies can support EcoCity concept development and implementation, results from a case study in Egypt are discussed. The case study relates to the transformation of New Borg El Arab (NBC), near Alexandria, into an EcoCity. The viability of the idea was explored making use of different methodologies (Roadmap, Feasibility Study, and Residents Energy Survey and Building Consumption Assessment) and considering the Residential, Commercial/Public Facilities, Industrial, Services/Utilities, and Transport sectors.

  19. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    Science.gov (United States)

    2017-11-01

    Canh Ly, Nghia Tran, and Ozlem Kilic (Research Laboratory report; approved for public release, distribution is...)

  20. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin Leigh [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  1. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    International Nuclear Information System (INIS)

    Coleman, Justin Leigh

    2016-01-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  2. Safety-related operator actions: methodology for developing criteria

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Gray, L.H.; Beare, A.N.; Barks, D.B.; Gomer, F.E.

    1984-03-01

    This report presents a methodology for developing criteria for design evaluation of safety-related actions by nuclear power plant reactor operators, and identifies a supporting data base. It is the eleventh and final NUREG/CR Report on the Safety-Related Operator Actions Program, conducted by Oak Ridge National Laboratory for the US Nuclear Regulatory Commission. The operator performance data were developed from training simulator experiments involving operator responses to simulated scenarios of plant disturbances; from field data on events with similar scenarios; and from task analytic data. A conceptual model to integrate the data was developed and a computer simulation of the model was run, using the SAINT modeling language. Proposed is a quantitative predictive model of operator performance, the Operator Personnel Performance Simulation (OPPS) Model, driven by task requirements, information presentation, and system dynamics. The model output, a probability distribution of predicted time to correctly complete safety-related operator actions, provides data for objective evaluation of quantitative design criteria
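
    The kind of output attributed to the OPPS model, a probability distribution of the time to correctly complete a safety-related action, can be sketched with a small Monte Carlo simulation. The lognormal diagnosis and execution time parameters and the 30-minute time window below are assumptions for illustration only.

        import math
        import random

        N_TRIALS = 100_000
        TIME_AVAILABLE_MIN = 30.0          # assumed time window before the action is ineffective

        def sampled_completion_time():
            diagnosis = random.lognormvariate(math.log(8.0), 0.5)    # median 8 min (assumed)
            execution = random.lognormvariate(math.log(10.0), 0.4)   # median 10 min (assumed)
            return diagnosis + execution

        if __name__ == "__main__":
            times = sorted(sampled_completion_time() for _ in range(N_TRIALS))
            failures = sum(t > TIME_AVAILABLE_MIN for t in times)
            print("median time to complete [min]:", round(times[N_TRIALS // 2], 1))
            print("P(action not completed in time):", failures / N_TRIALS)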

  3. Development of margin assessment methodology of decay heat removal function against external hazards. (2) Tornado PRA methodology

    International Nuclear Information System (INIS)

    Nishino, Hiroyuki; Kurisaka, Kenichi; Yamano, Hidemasa

    2014-01-01

    Probabilistic risk assessment (PRA) for external events has been recognized as an important safety assessment method since the TEPCO Fukushima Daiichi nuclear power station accident. PRA should be performed not only for earthquakes and tsunamis, which are especially important events in Japan, but the methodology should also be developed for other external hazards (e.g. tornadoes). In this study, the methodology was developed for sodium-cooled fast reactors, paying attention to the fact that the ambient air is their final heat sink for removing decay heat under accident conditions. First, a tornado hazard curve was estimated by using data recorded in Japan. Second, important structures and components for decay heat removal were identified, and an event tree leading to core damage was developed in terms of wind load and missiles (i.e. steel pipes, boards and cars) generated by a tornado. The main damage cause for the important structures and components is the missiles; the tornado missiles that can reach components and structures placed at high elevations were identified, and the failure probabilities of the components and structures against the tornado missiles were calculated as the product of two probabilities: a probability for the missiles to enter the intake or outtake of the decay heat removal system, and a probability of failure caused by the missile impacts. Finally, the event tree was quantified. As a result, the core damage frequency was sufficiently low, well below 10^-10/ry. (author)
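
    The quantification described above, a hazard curve combined with a per-train failure probability formed as the product of a missile-entry probability and a failure-given-hit probability, can be sketched as follows. All numerical values and the assumption of three independent redundant trains are illustrative, not taken from the study.

        # tornado hazard: (wind speed class [m/s], annual exceedance frequency)
        HAZARD = [(50.0, 1e-4), (70.0, 1e-5), (90.0, 1e-6)]

        P_MISSILE_ENTERS_INTAKE = {50.0: 1e-3, 70.0: 3e-3, 90.0: 1e-2}   # assumed
        P_FAILURE_GIVEN_HIT = {50.0: 0.1, 70.0: 0.3, 90.0: 0.6}          # assumed
        N_REDUNDANT_TRAINS = 3            # all trains must fail; independence is assumed

        def core_damage_frequency():
            cdf = 0.0
            for i, (wind, exceed_freq) in enumerate(HAZARD):
                next_freq = HAZARD[i + 1][1] if i + 1 < len(HAZARD) else 0.0
                bin_freq = exceed_freq - next_freq
                p_train_fails = P_MISSILE_ENTERS_INTAKE[wind] * P_FAILURE_GIVEN_HIT[wind]
                cdf += bin_freq * p_train_fails ** N_REDUNDANT_TRAINS
            return cdf

        if __name__ == "__main__":
            print(f"core damage frequency ~ {core_damage_frequency():.2e} /ry")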

  4. Chapter 43: Assessment of NE Greenland: Prototype for development of Circum-Arctic Resource Appraisal methodology

    Science.gov (United States)

    Gautier, D.L.; Stemmerik, L.; Christiansen, F.G.; Sorensen, K.; Bidstrup, T.; Bojesen-Koefoed, J. A.; Bird, K.J.; Charpentier, R.R.; Houseknecht, D.W.; Klett, T.R.; Schenk, C.J.; Tennyson, Marilyn E.

    2011-01-01

    Geological features of NE Greenland suggest large petroleum potential, as well as high uncertainty and risk. The area was the prototype for development of methodology used in the US Geological Survey (USGS) Circum-Arctic Resource Appraisal (CARA), and was the first area evaluated. In collaboration with the Geological Survey of Denmark and Greenland (GEUS), eight "assessment units" (AU) were defined, six of which were probabilistically assessed. The most prospective areas are offshore in the Danmarkshavn Basin. This study supersedes a previous USGS assessment, from which it differs in several important respects: oil estimates are reduced and natural gas estimates are increased to reflect revised understanding of offshore geology. Despite the reduced estimates, the CARA indicates that NE Greenland may be an important future petroleum province. © 2011 The Geological Society of London.

  5. Methodological and Methodical Principles of the Empirical Study of Spiritual Development of a Personality

    Directory of Open Access Journals (Sweden)

    Olga Klymyshyn

    2017-06-01

    Full Text Available The article reveals the essence of the methodological principles of the spiritual development of a personality. The results of a theoretical analysis of the psychological content of spirituality are taken into consideration from the standpoint of a system-structural approach to the study of personality, age patterns of mental development, the sacramental nature of the human person, and the mechanisms of human spiritual development. An interpretation of spirituality and of the spiritual development of a personality is given. The initial principles of the organization of empirical research on the spiritual development of a personality (ontogenetic, sociocultural, self-determination, system) are presented. Parameters for estimating a personality's spiritual development are described, such as a general index of the development of spiritual potential; indexes of the development of the ethical, aesthetical, cognitive and existential components of spirituality; and an index of the religiousness of a personality. Methodological support of the psychological diagnostic research is defined.

  6. Development of an accident sequence precursor methodology and its application to significant accident precursors

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seung Hyun; Park, Sung Hyun; Jae, Moo Sung [Dept. of of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of)

    2017-03-15

    The systematic management of plant risk is crucial for enhancing the safety of nuclear power plants and for designing new nuclear power plants. Accident sequence precursor (ASP) analysis may be able to provide the risk significance of operational experience by using probabilistic risk assessment to evaluate an operational event quantitatively in terms of its impact on core damage. In this study, an ASP methodology for two operation modes, full power and low power/shutdown operation, has been developed and applied to significant accident precursors that may occur during the operation of nuclear power plants. Two operational events, loss of feedwater and steam generator tube rupture, are identified as ASPs. Therefore, the ASP methodology developed in this study may contribute to identifying plant risk significance as well as to enhancing the safety of nuclear power plants by applying this methodology systematically.
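
    The basic ASP screening arithmetic implied by this abstract can be sketched as follows: an operational event is mapped to a conditional core damage probability (CCDP), and its risk significance is judged by the increase over the nominal core damage probability for the same exposure time. The CDF values and exposure time below are invented for illustration; the screening threshold is only the commonly cited convention, not a value from the study.

```python
# Hedged sketch of conditional core damage probability (CCDP) screening.

def ccdp_importance(conditional_cdf_per_yr: float,
                    nominal_cdf_per_yr: float,
                    exposure_time_hr: float) -> float:
    """Return the increase in core damage probability attributable to the event."""
    exposure_yr = exposure_time_hr / 8760.0
    ccdp = conditional_cdf_per_yr * exposure_yr
    nominal_cdp = nominal_cdf_per_yr * exposure_yr
    return ccdp - nominal_cdp

# Example: a 72-hour degraded condition that raises CDF from 2e-5/yr to 4e-4/yr (assumed).
delta = ccdp_importance(4.0e-4, 2.0e-5, 72.0)
print(f"Delta CDP ~ {delta:.2e}  (events above ~1e-6 are typically flagged as precursors)")
```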

  7. Development of design and analysis methodology for composite bolted joints

    Science.gov (United States)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/+/-45/90) family of laminates using filled hole and unnotched test data.

  8. Accidental safety analysis methodology development in decommission of the nuclear facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. H.; Hwang, J. H.; Jae, M. S.; Seong, J. H.; Shin, S. H.; Cheong, S. J.; Pae, J. H.; Ang, G. R.; Lee, J. U. [Seoul National Univ., Seoul (Korea, Republic of)

    2002-03-15

    Decontamination and Decommissioning (D and D) of a nuclear reactor costs about 20% of the construction expense, and the production of nuclear wastes during decommissioning raises environmental issues. Decommissioning of nuclear reactors in Korea is just beginning, and clear standards and regulations for decommissioning are lacking. This work on accident safety analysis in decommissioning of nuclear facilities can provide a solid ground for such standards and regulations. For the source term analysis of the Kori-1 reactor vessel, an MCNP/ORIGEN calculation methodology was applied. The activity of each important nuclide in the vessel was estimated at a time after 2008, the year the Kori-1 plant is supposed to be decommissioned. In addition, a methodology for risk analysis assessment in decommissioning was developed.
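
    The "activity at a time after 2008" step mentioned above amounts to decaying a reference inventory forward in time. The sketch below illustrates that arithmetic only; the nuclides are common activation/fission products and the reference activities are invented placeholders, not Kori-1 data or MCNP/ORIGEN output.

```python
import math

# Half-lives in years (physical constants) and an assumed reference inventory in Bq.
HALF_LIFE_YR = {"Co-60": 5.27, "Ni-63": 101.2, "Cs-137": 30.08}
REFERENCE_ACTIVITY_BQ = {"Co-60": 1.0e12, "Ni-63": 5.0e11, "Cs-137": 2.0e10}  # assumed

def decayed_activity(nuclide: str, elapsed_yr: float) -> float:
    """Activity after elapsed_yr years of pure radioactive decay."""
    lam = math.log(2.0) / HALF_LIFE_YR[nuclide]
    return REFERENCE_ACTIVITY_BQ[nuclide] * math.exp(-lam * elapsed_yr)

if __name__ == "__main__":
    for nuc in HALF_LIFE_YR:
        print(f"{nuc}: {decayed_activity(nuc, 10.0):.3e} Bq after 10 y")
```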

  9. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

    In this paper we discuss the notion of a learning methodology and situate this within the wider frame of learning design or 'Designing for Learning'. We discuss existing work within this broad area by trying to categorize different approaches and interpretations and we present our development...... of particular 'mediating design artefacts'. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design; the COllaborative E-learning Design method (Co......Ed). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...

  10. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

    2017-01-01

    In this paper we discuss the notion of a learning methodology and situate this within the wider frame of learning design or 'Designing for Learning'. We discuss existing work within this broad area by trying to categorize different approaches and interpretations and we present our development...... of particular 'mediating design artefacts'. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design; the COllaborative E-learning Design method (Co......Ed). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...

  11. Development of performance assessment methodology for establishment of quantitative acceptance criteria of near-surface radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Kim, C. R.; Lee, E. Y.; Park, J. W.; Chang, G. M.; Park, H. Y.; Yeom, Y. S. [Korea Hydro and Nuclear Power Co., Ltd., Seoul (Korea, Republic of)

    2002-03-15

    The contents and the scope of this study are as follows: review of the state-of-the-art on the establishment of waste acceptance criteria in foreign near-surface radioactive waste disposal facilities, investigation of radiological assessment methodologies and scenarios, investigation of existing models and computer codes used in performance/safety assessment, development of a performance assessment methodology (draft) to quantitatively derive radionuclide acceptance criteria for a domestic near-surface disposal facility, and preliminary performance/safety assessment in accordance with the developed methodology.

  12. Large packages for reactor decommissioning waste

    International Nuclear Information System (INIS)

    Price, M.S.T.

    1991-01-01

    This study was carried out jointly by the Atomic Energy Establishment at Winfrith (now called the Winfrith Technology Centre), Windscale Laboratory and Ove Arup and Partners. The work involved the investigation of the design of large transport containers for intermediate level reactor decommissioning waste, ie waste which requires shielding, and is aimed at European requirements (ie for both LWR and gas cooled reactors). It proposes a design methodology for such containers covering the whole lifetime of a waste disposal package. The design methodology presented takes account of various relevant constraints. Both large self shielded and returnable shielded concepts were developed. The work was generic, rather than specific; the results obtained, and the lessons learned, remain to be applied in practice

  13. Development of a Standardized Methodology for the Use of COSI-Corr Sub-Pixel Image Correlation to Determine Surface Deformation Patterns in Large Magnitude Earthquakes.

    Science.gov (United States)

    Milliner, C. W. D.; Dolan, J. F.; Hollingsworth, J.; Leprince, S.; Ayoub, F.

    2014-12-01

    Coseismic surface deformation is typically measured in the field by geologists and with a range of geophysical methods such as InSAR, LiDAR and GPS. Current methods, however, either fail to capture the near-field coseismic surface deformation pattern where vital information is needed, or lack pre-event data. We develop a standardized and reproducible methodology to fully constrain the surface, near-field, coseismic deformation pattern in high resolution using aerial photography. We apply our methodology using the program COSI-corr to successfully cross-correlate pairs of aerial, optical imagery before and after the 1992, Mw 7.3 Landers and 1999, Mw 7.1 Hector Mine earthquakes. This technique allows measurement of the coseismic slip distribution and magnitude and width of off-fault deformation with sub-pixel precision. This technique can be applied in a cost-effective manner for recent and historic earthquakes using archive aerial imagery. We also use synthetic tests to constrain and correct for the bias imposed on the result due to use of a sliding window during correlation. Correcting for artificial smearing of the tectonic signal allows us to robustly measure the fault zone width along a surface rupture. Furthermore, the synthetic tests have constrained for the first time the measurement precision and accuracy of estimated fault displacements and fault-zone width. Our methodology provides the unique ability to robustly understand the kinematics of surface faulting while at the same time accounting for both off-fault deformation and measurement biases that typically complicate such data. For both earthquakes we find that our displacement measurements derived from cross-correlation are systematically larger than the field displacement measurements, indicating the presence of off-fault deformation. We show that the Landers and Hector Mine earthquakes accommodated 46% and 38% of displacement away from the main primary rupture as off-fault deformation, over a mean
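
    The core idea of sub-pixel correlation can be sketched with a toy example: correlate a pre-event signal against a post-event signal and refine the integer correlation peak with a parabolic fit. COSI-Corr itself operates on 2-D imagery (typically in the frequency domain); the 1-D version below only illustrates the principle and is not the program's algorithm.

```python
import numpy as np

def subpixel_shift(pre: np.ndarray, post: np.ndarray) -> float:
    """Estimate the shift of `post` relative to `pre` with sub-sample precision."""
    pre = (pre - pre.mean()) / pre.std()
    post = (post - post.mean()) / post.std()
    corr = np.correlate(post, pre, mode="full")
    k = int(np.argmax(corr))
    # Parabolic interpolation around the integer peak for sub-pixel precision.
    if 0 < k < len(corr) - 1:
        y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
        k_frac = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
    else:
        k_frac = 0.0
    return (k + k_frac) - (len(pre) - 1)

if __name__ == "__main__":
    x = np.linspace(0.0, 20.0, 400)
    pre = np.sin(x) + 0.05 * np.random.default_rng(0).normal(size=x.size)
    post = np.roll(pre, 3)  # simulate a 3-sample coseismic offset
    print(f"estimated shift ~ {subpixel_shift(pre, post):.2f} samples")
```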

  14. Development of methodology for evaluating and monitoring steam generator feedwater nozzle cracking in PWRs

    International Nuclear Information System (INIS)

    Shvarts, S.; Gerber, D.A.; House, K.; Hirschberg, P.

    1994-01-01

    The objective of this paper is to describe a methodology for evaluating and monitoring steam generator feedwater nozzle cracking in PWR plants. This methodology is based in part on plant test data obtained from a recent Diablo Canyon Power Plant (DCPP) Unit 1 heatup. Temperature sensors installed near the nozzle-to-pipe weld were monitored during the heatup, along with operational parameters such as auxiliary feedwater (AFW) flow rate and steam generator temperature. A thermal stratification load definition was developed from this data. Steady state characteristics of this data were used in a finite element analysis to develop a relationship between AFW flow and stratification interface level. Fluctuating characteristics of this data were used to determine transient parameters through the application of a Green's Function approach. The thermal stratification load definition from the test data was used in a three-dimensional thermal stress analysis to determine stress cycling and consequent fatigue damage or crack growth during AFW flow fluctuations. The implementation of the developed methodology in the DCPP and Sequoyah Nuclear Plant (SNP) fatigue monitoring systems is described.

  15. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failure) and the integrated situation of mechanical failure and human errors. Meanwhile, new possibilities are emerging for improved PSA by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take the technological advantages aforementioned should be dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are substantial, it seems less interesting from the industrial and regulatory viewpoint. The authors expect this can contribute to better understanding of dynamic PSA in terms of algorithm, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, DDET seems to be a backbone for most of the methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering
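
    A toy discrete dynamic event tree (DDET), in the spirit of the approach surveyed above, is sketched below: at fixed branching times the simulation splits on whether a cooling function survives, a simple physical state variable is advanced deterministically, and sequences are binned by whether a damage threshold is exceeded. Branching times, probabilities and the temperature model are illustrative assumptions, not taken from any specific DDET tool.

```python
from dataclasses import dataclass

BRANCH_TIMES_H = [1.0, 2.0, 3.0]        # assumed branching points
P_COOLING_FAILS_AT_BRANCH = 0.05        # assumed per-branch failure probability
HEATUP_RATE_NO_COOLING = 150.0          # K/h, assumed
COOLDOWN_RATE = -50.0                   # K/h, assumed
DAMAGE_TEMP_K = 900.0
T0_K = 600.0

@dataclass
class Sequence:
    prob: float
    temp: float
    cooling_ok: bool

def expand(sequences, dt):
    """Branch every sequence on cooling success/failure and advance the state."""
    new = []
    for s in sequences:
        for fails, p in ((False, 1.0 - P_COOLING_FAILS_AT_BRANCH),
                         (True, P_COOLING_FAILS_AT_BRANCH)):
            ok = s.cooling_ok and not fails
            rate = COOLDOWN_RATE if ok else HEATUP_RATE_NO_COOLING
            temp = max(T0_K, s.temp + rate * dt)
            new.append(Sequence(s.prob * p, temp, ok))
    return new

if __name__ == "__main__":
    seqs = [Sequence(1.0, T0_K, True)]
    prev_t = 0.0
    for t in BRANCH_TIMES_H:
        seqs = expand(seqs, t - prev_t)
        prev_t = t
    p_damage = sum(s.prob for s in seqs if s.temp >= DAMAGE_TEMP_K)
    print(f"{len(seqs)} sequences, P(damage) ~= {p_damage:.2e}")
```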

  16. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failure) and the integrated situation of mechanical failure and human errors. Meanwhile, new possibilities are emerging for improved PSA by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take the technological advantages aforementioned should be dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are substantial, it seems less interesting from the industrial and regulatory viewpoint. The authors expect this can contribute to better understanding of dynamic PSA in terms of algorithm, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, DDET seems to be a backbone for most of the methodologies since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  17. Development of methodologies for coupled water-hammer analysis of piping systems and supports

    International Nuclear Information System (INIS)

    Kamil, H.; Gantayat, A.; Attia, A.; Goulding, H.

    1983-01-01

    The paper presents the results of an investigation on the development of methodologies for coupled water-hammer analyses. The study was conducted because the present analytical methods for calculation of loads on piping systems and supports resulting from water-hammer phenomena are overly conservative. This is mainly because the methods do not usually include interaction between the fluid and the piping and thus predict high loads on piping systems and supports. The objective of the investigation presented in this paper was to develop methodologies for coupled water-hammer analyses, including fluid-structure interaction effects, to be able to obtain realistic loads on piping systems and supports, resulting in production of more economical designs. (orig./RW)

  18. Qualitative methodology in developmental psychology

    DEFF Research Database (Denmark)

    Demuth, Carolin; Mey, Günter

    2015-01-01

    Qualitative methodology presently is gaining increasing recognition in developmental psychology. Although the founders of developmental psychology to a large extent already used qualitative procedures, the field was long dominated by a (post) positivistic quantitative paradigm. The increasing rec...... in qualitative research offers a promising avenue to advance the field in this direction.

  19. Coal resources available for development; a methodology and pilot study

    Science.gov (United States)

    Eggleston, Jane R.; Carter, M. Devereux; Cobb, James C.

    1990-01-01

    Coal accounts for a major portion of our Nation's energy supply in projections for the future. A demonstrated reserve base of more than 475 billion short tons, as the Department of Energy currently estimates, indicates that, on the basis of today's rate of consumption, the United States has enough coal to meet projected energy needs for almost 200 years. However, the traditional procedures used for estimating the demonstrated reserve base do not account for many environmental and technological restrictions placed on coal mining. A new methodology has been developed to determine the quantity of coal that might actually be available for mining under current and foreseeable conditions. This methodology is unique in its approach, because it applies restrictions to the coal resource before it is mined. Previous methodologies incorporated restrictions into the recovery factor (a percentage), which was then globally applied to the reserve (minable coal) tonnage to derive a recoverable coal tonnage. None of the previous methodologies define the restrictions and their area and amount of impact specifically. Because these restrictions and their impacts are defined in this new methodology, it is possible to achieve more accurate and specific assessments of available resources. This methodology has been tested in a cooperative project between the U.S. Geological Survey and the Kentucky Geological Survey on the Matewan 7.5-minute quadrangle in eastern Kentucky. Pertinent geologic, mining, land-use, and technological data were collected, assimilated, and plotted. The National Coal Resources Data System was used as the repository for data, and its geographic information system software was applied to these data to eliminate restricted coal and quantify that which is available for mining. This methodology does not consider recovery factors or the economic factors that would be considered by a company before mining. Results of the pilot study indicate that, of the estimated
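
    The contrast drawn above between the new methodology (restrictions removed from the resource before mining is considered) and the traditional one (a single recovery factor applied globally) can be illustrated with a trivial arithmetic sketch. All tonnages, restriction categories and percentages below are invented placeholders, not figures from the Matewan pilot study.

```python
# Hedged sketch contrasting the two accounting orders described above.

original_resource_mt = 500.0                        # million short tons, assumed
restrictions_mt = {                                 # assumed restricted tonnages
    "town/land-use buffers": 60.0,
    "oil/gas well buffers": 25.0,
    "too-thin or too-deep coal": 90.0,
}

# New methodology: subtract specifically mapped restrictions first.
available_mt = original_resource_mt - sum(restrictions_mt.values())
print(f"available for mining: {available_mt:.0f} Mt "
      f"({available_mt / original_resource_mt:.0%} of the original resource)")

# Older style, for comparison: one global recovery factor applied to everything.
global_recovery_factor = 0.50                       # assumed
print(f"old-style recoverable estimate: "
      f"{original_resource_mt * global_recovery_factor:.0f} Mt")
```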

  20. The use and effectiveness of information system development methodologies in health information systems / Pieter Wynand Conradie.

    OpenAIRE

    Conradie, Pieter Wynand

    2010-01-01

    Abstract The main focus of this study is the identification of factors influencing the use and effectiveness of information system development methodologies (i.e., systems development methodologies) in health information systems. In essence, it can be viewed as exploratory research, utilizing a conceptual research model to investigate the relationships among the hypothesised factors. More specifically, classified as behavioural science, it combines two theoretical models, namely...

  1. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for the equipment environmental qualification (EEQ) following a loss-of-coolant accident (LOCA) has recently been developed and adopted for small break LOCA EEQ. The new methodology for the M/E release analysis is extended to the M/E release analysis for the containment design for large break LOCA and the main steam line break (MSLB) accident, and is named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K with an enhanced M/E model and a LOCA long term model, and CONTEMPT4/MOD5. This KIMERA methodology is applied to the MSLB M/E release analysis to evaluate its validity for MSLB in containment design. The results are compared with the OPR1000 FSAR.

  2. Blending critical realist and emancipatory practice development methodologies: making critical realism work in nursing research.

    LENUS (Irish Health Repository)

    Parlour, Randal

    2012-12-01

    This paper examines the efficacy of facilitation as a practice development intervention in changing practice within an Older Person setting and in implementing evidence into practice. It outlines the influences exerted by the critical realist paradigm in guiding emancipatory practice development activities and, in particular, how the former may be employed within an emancipatory practice development study to elucidate and increase understanding pertinent to causation and outcomes. The methodology is based upon an emancipatory practice development approach set within a realistic evaluation framework. This allows for systematic analysis of the social and contextual elements that influence the explication of outcomes associated with facilitation. The study is concentrated upon five practice development cycles, within which a sequence of iterative processes is integrated. The authors assert that combining critical realist and emancipatory processes offers a robust and practical method for translating evidence and implementing changes in practice, as the former affirms or falsifies the influence that emancipatory processes exert on attaining culture shift, and enabling transformation towards effective clinical practice. A new framework for practice development is proposed that establishes methodological coherency between emancipatory practice development and realistic evaluation. This augments the existing theoretical bases for both these approaches by contributing new theoretical and methodological understandings of causation.

  3. Methodology for oil field development; Metodologia para o desenvolvimento de campos de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Galeano, Yadira Diaz

    1998-07-01

    The main scope of this work is to study and develop a methodology which allows the elaboration of projects for oil field development. Therefore it is necessary to consider the integration of the human, technological and economic issues that are important parameters in the engineering project. The spiral concept was applied to the project in order to coordinate, in a reasonable and logical way, the activities involved in the field development, together with the hierarchical analysis method for the decision making process. The development of an oil field is divided into viability study, preliminary project, final project, project implementation, production and field abandonment cycles. The main components of each cycle are external aspects, environmental criteria, reservoir management, drilling, completion and well workover, production systems, exportation systems, and risk and economic analysis. The proposed methodology establishes a general scheme for planning and presents procedures applicable to any field. (author)

  4. CHARACTERISTICS OF RESEARCH METHODOLOGY DEVELOPMENT IN SPECIAL EDUCATION AND REHABILITATION

    Directory of Open Access Journals (Sweden)

    Natasha ANGELOSKA-GALEVSKA

    2004-12-01

    Full Text Available The aim of the text is to point out the developmental tendencies in the research methodology of special education and rehabilitation worldwide and in our country, and to emphasize the importance of methodological training of students in special education and rehabilitation at the Faculty of Philosophy in Skopje. Scientific knowledge achieved through research is the fundamental precondition for the development of special education and rehabilitation theory and practice. The results of scientific work sometimes cause small, insignificant changes, but, at times, they make radical changes. Thanks to scientific research and knowledge, certain prejudices were rejected. For example, in the sixth decade of the last century there was a strong prejudice that mentally retarded children should be segregated from society as aggressive and unfriendly, or that deaf children should not learn sign language because they would not be motivated to learn lip-reading and would hardly adapt. Piaget and his colleagues from the Geneva institute were pioneers in researching this field, and they advanced the belief that handicapped children were not handicapped in every field and had potentials that could be developed and improved by systematic and organized work. It is important to initiate further research in the field of special education and rehabilitation, as well as a critical analysis of the research already carried out. Further development of scientific research in special education and rehabilitation should be a base for education policy on people with disabilities and for the development of institutional and non-institutional treatment of this population.

  5. Application of code scaling applicability and uncertainty methodology to the large break loss of coolant

    International Nuclear Information System (INIS)

    Young, M.Y.; Bajorek, S.M.; Nissley, M.E.

    1998-01-01

    In the late 1980s, after completion of an extensive research program, the United States Nuclear Regulatory Commission (USNRC) amended its regulations (10CFR50.46) to allow the use of realistic physical models to analyze the loss of coolant accident (LOCA) in light water reactors. Prior to this time, the evaluation of this accident was subject to a prescriptive set of rules (appendix K of the regulations) requiring conservative models and assumptions to be applied simultaneously, leading to very pessimistic estimates of the impact of this accident on the reactor core. The rule change therefore promised to provide significant benefits to owners of power reactors, allowing them to increase output. In response to the rule change, a method called code scaling, applicability and uncertainty (CSAU) was developed to apply realistic methods while properly taking into account data uncertainty, uncertainty in physical modeling and plant variability. The method was claimed to be structured, traceable, and practical, but was met with some criticism when first demonstrated. In 1996, the USNRC approved a methodology, based on CSAU, developed by a group led by Westinghouse. The lessons learned in this application of CSAU will be summarized. Some of the issues raised concerning the validity and completeness of the CSAU methodology will also be discussed. (orig.)
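
    The uncertainty-propagation step at the heart of CSAU-style methods can be sketched as follows: sample the important model and plant parameters from assigned ranges, evaluate the response calculation for each sample, and report an upper percentile of peak cladding temperature (PCT). The response function and parameter ranges below are invented for illustration; only the 1478 K (2200 °F) acceptance limit is the actual regulatory figure.

```python
import random

def toy_pct_response(power_factor: float, chf_multiplier: float,
                     gap_conductance: float) -> float:
    """Stand-in for a thermal-hydraulic code run; returns a PCT in K (toy model)."""
    return (1000.0
            + 400.0 * (power_factor - 1.0)
            + 250.0 * (1.0 - chf_multiplier)
            + 120.0 * (1.0 - gap_conductance))

def pct_percentile(n_samples: int = 5000, percentile: float = 0.95,
                   seed: int = 42) -> float:
    """Propagate assumed parameter ranges and return the requested PCT percentile."""
    rng = random.Random(seed)
    results = [toy_pct_response(rng.uniform(0.98, 1.02),   # assumed power uncertainty
                                rng.uniform(0.80, 1.20),   # assumed CHF multiplier range
                                rng.uniform(0.70, 1.30))   # assumed gap conductance range
               for _ in range(n_samples)]
    results.sort()
    return results[int(percentile * n_samples) - 1]

if __name__ == "__main__":
    print(f"95th-percentile PCT ~ {pct_percentile():.0f} K (regulatory limit: 1478 K)")
```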

  6. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 Standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used together with the ESA PSS-05-0 Standards. Our outcomes, in general, may be used by teams who need to build small satellites, and, in particular, they will be used when we build the on-board software applications for the SATEX-II.

  7. Application of realistic (best- estimate) methodologies for large break loss of coolant (LOCA) safety analysis: licensing of Westinghouse ASTRUM evaluation model in Spain

    International Nuclear Information System (INIS)

    Lage, Carlos; Frepoli, Cesare

    2010-01-01

    When the LOCA Final Acceptance Criteria for Light Water Reactors was issued in Appendix K of 10CFR50, both the USNRC and the industry recognized that the rule was highly conservative. At that time, however, the degree of conservatism in the analysis could not be quantified. As a result, the USNRC began a research program to identify the degree of conservatism in those models permitted in the Appendix K rule and to develop improved thermal-hydraulic computer codes so that realistic accident analysis calculations could be performed. The overall results of this research program quantified the conservatism in the Appendix K rule and confirmed that some relaxation of the rule can be made without a loss in safety to the public. Also, from a risk-informed perspective it is recognized that conservatism is not always a complete defense for lack of sophistication in models. In 1988, as a result of the improved understanding of LOCA phenomena, the USNRC staff amended the requirements of 10 CFR 50.46 and Appendix K, 'ECCS Evaluation Models', so that a realistic evaluation model may be used to analyze the performance of the ECCS during a hypothetical LOCA. Under the amended rules, best-estimate plus uncertainty (BEPU) thermal-hydraulic analysis may be used in place of the overly prescriptive set of models mandated by the Appendix K rule. Further guidance for the use of best-estimate codes was provided in Regulatory Guide 1.157. To demonstrate use of the revised ECCS rule, the USNRC and its consultants developed a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology as an approach for defining and qualifying a best-estimate thermal-hydraulic code and quantifying the uncertainties in a LOCA analysis. More recently the CSAU principles have been generalized in the Evaluation Model Development and Assessment Process (EMDAP) of Regulatory Guide 1.203. ASTRUM is the Westinghouse Best Estimate Large Break LOCA evaluation model applicable to two-, three
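
    BEPU sampling plans of this kind are commonly justified with Wilks' order-statistics argument: if N code runs are drawn at random, the probability that the largest computed PCT bounds the true 95th percentile is 1 - 0.95^N, so the smallest N giving 95% confidence follows directly. The sketch below reproduces the well-known N = 59 result for a single output and a first-order statistic; plans covering several outputs simultaneously (as ASTRUM does) require more runs.

```python
# Minimal sketch of the Wilks first-order, one-sided tolerance-limit calculation.

def wilks_runs(coverage: float = 0.95, confidence: float = 0.95) -> int:
    """Smallest N such that the sample maximum bounds the `coverage` quantile
    with at least `confidence` probability: 1 - coverage**N >= confidence."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

if __name__ == "__main__":
    print(f"runs needed for 95%/95% (first-order, one-sided): {wilks_runs()}")
```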

  8. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    The KALIMER Database is an advanced database for the integrated management of Liquid Metal Reactor Design Technology Development using Web applications. The KALIMER Design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation system, and Reserved Documents. The Results Database holds the research results produced during phase II of Liquid Metal Reactor Design Technology Development under the mid-term and long-term nuclear R and D program. IOC is a linkage control system between subprojects to share and integrate the research results for KALIMER. The 3D CAD Database provides a schematic design overview for KALIMER. The Team Cooperation System informs team members of research cooperation and meetings. Finally, KALIMER Reserved Documents is developed to manage collected data and various documents accumulated since project accomplishment. This report describes the features of the hardware and software and the database design methodology for KALIMER.

  9. Security Testing in Agile Web Application Development - A Case Study Using the EAST Methodology

    CERN Document Server

    Erdogan, Gencer

    2010-01-01

    There is a need for improved security testing methodologies specialized for Web applications and their agile development environment. The number of web application vulnerabilities is drastically increasing, while security testing tends to be given a low priority. In this paper, we analyze and compare Agile Security Testing with two other common methodologies for Web application security testing, and then present an extension of this methodology. We present a case study showing how our Extended Agile Security Testing (EAST) performs compared to a more ad hoc approach used within an organization. Our working hypothesis is that the detection of vulnerabilities in Web applications will be significantly more efficient when using a structured security testing methodology specialized for Web applications, compared to existing ad hoc ways of performing security tests. Our results show a clear indication that our hypothesis is on the right track.

  10. System study methodology. Development and potential utilization for fusion

    International Nuclear Information System (INIS)

    Djerassi, H.; Rouillard, J.; Leger, D.; Zappellini, G.; Gambi, G.

    1988-01-01

    The objective of this new methodology is to combine systemics with heuristics for engineering applications. The system method considers as a whole a set of dynamically interacting elements organized for tasks. Heuristics tries to make explicit the rules to apply in scientific research. This methodology is a powerful tool for evaluating the options to be taken: compared with conventional analytical methods, a higher number of parameters can be taken into account, with a higher quality standard when comparing the possible options. The system method takes into account interacting data or random relationships by means of simulation modelling. Thus, a dynamical approach can be deduced and a sensitivity analysis can be performed for a very high number of options and basic data. Collection of experimental values, analysis of the problem, search for solutions, sizing of the installation from defined functions, cost evaluation (planning and operating) and ranking of the options with regard to all the constraints are the main points considered for the system's application. This method can be limited to a specific objective such as a fusion reactor safety analysis. The possibility of taking into account all the options, possible accidents, quality assurance, exhaustivity of the safety analysis, identification of the residual risk and modelisation of the results are the main advantages of this approach. The sophisticated architecture of a fusion reactor includes a large number of interacting systems. The new character of the fusion domain and the wide spectrum of possible options strongly increase the advantages of a system study, as a complete safety analysis can be defined before starting the design.

  11. Methodology for Analyzing and Developing Information Management Infrastructure to Support Telerehabilitation

    Directory of Open Access Journals (Sweden)

    Andi Saptono

    2009-09-01

    Full Text Available The proliferation of advanced technologies led researchers within the Rehabilitation Engineering Research Center on Telerehabilitation (RERC-TR) to devise an integrated infrastructure for clinical services using the University of Pittsburgh (PITT) model. This model describes five required characteristics for a telerehabilitation (TR) infrastructure: openness, extensibility, scalability, cost-effectiveness, and security. The infrastructure is to deliver clinical services over distance to improve access to health services for people living in underserved or remote areas. The methodological approach to design, develop, and employ this infrastructure is explained and detailed for the remote wheelchair prescription project, a research task within the RERC-TR. The availability of this specific clinical service and personnel outside of metropolitan areas is limited due to the lack of specialty expertise and access to resources. The infrastructure is used to deliver expertise in wheeled mobility and seating through teleconsultation to remote clinics, and has been successfully deployed to five rural clinics in Western Pennsylvania. Keywords: Telerehabilitation, Information Management, Infrastructure Development Methodology, Videoconferencing, Online Portal, Database

  12. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    Science.gov (United States)

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  13. Development of a predictive methodology for identifying high radon exhalation potential areas

    International Nuclear Information System (INIS)

    Ielsch, G.

    2001-01-01

    Radon 222 is a radioactive natural gas originating from the decay of radium 226 which itself originates from the decay of uranium 238 naturally present in rocks and soil. Inhalation of radon gas and its decay products is a potential health risk for man. Radon can accumulate in confined environments such as buildings, and is responsible for one third of the total radiological exposure of the general public to radiation. The problem of how to manage this risk then arises. The main difficulty encountered is due to the large variability of exposure to radon across the country. A prediction needs to be made of areas with the highest density of buildings with high radon levels. Exposure to radon varies depending on the degree of confinement of the habitat, the lifestyle of the occupants and particularly emission of radon from the surface of the soil on which the building is built. The purpose of this thesis is to elaborate a methodology for determining areas presenting a high potential for radon exhalation at the surface of the soil. The methodology adopted is based on quantification of radon exhalation at the surface, starting from a precise characterization of the main local geological and pedological parameters that control the radon source and its transport to the ground/atmosphere interface. The methodology proposed is innovative in that it combines a cartographic analysis, parameters integrated into a Geographic Information system, and a simplified model for vertical transport of radon by diffusion through pores in the soil. This methodology has been validated on two typical areas, in different geological contexts, and gives forecasts that generally agree with field observations. This makes it possible to identify areas with a high exhalation potential within a range of a few square kilometers. (author)
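
    The simplified diffusive-transport step mentioned above is often expressed, in textbook form, as a steady-state surface flux that depends on the radium content, emanation coefficient, bulk density and an effective diffusion length L = sqrt(D/lambda). The sketch below uses that generic expression with typical parameter magnitudes; it is not the specific model developed in the thesis.

```python
import math

LAMBDA_RN222 = 2.1e-6   # 1/s, Rn-222 decay constant (half-life ~3.82 d)

def exhalation_flux(c_ra_bq_per_kg: float, emanation: float,
                    bulk_density_kg_m3: float, d_eff_m2_s: float,
                    soil_depth_m: float) -> float:
    """Steady-state radon flux at the soil surface, in Bq m^-2 s^-1 (generic model)."""
    diffusion_length = math.sqrt(d_eff_m2_s / LAMBDA_RN222)
    return (c_ra_bq_per_kg * emanation * bulk_density_kg_m3
            * LAMBDA_RN222 * diffusion_length
            * math.tanh(soil_depth_m / diffusion_length))

if __name__ == "__main__":
    # e.g. 40 Bq/kg radium, 20% emanation, 1600 kg/m3, D = 2e-6 m2/s, 5 m of soil (assumed)
    flux = exhalation_flux(40.0, 0.2, 1600.0, 2.0e-6, 5.0)
    print(f"flux ~ {flux * 1e3:.1f} mBq m^-2 s^-1")
```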

  14. Development of analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kim, Cheol Woo; Kwon, Young Min; Kim, Sook Kwan

    1995-04-01

    A study for the development of an analysis methodology for hot leg break mass and energy release is performed. For the blowdown period a modified CEFLASH-4A methodology is suggested. For the post-blowdown period a modified CONTRAST boil-off model is suggested. By using these computer codes, improved mass and energy release data are generated. A RELAP5/MOD3 analysis was also performed, and finally the FLOOD-3 computer code has been modified for use in the analysis of hot leg break. The results of the analysis using the modified FLOOD-3 are reasonable, as expected, and their trends are good. 66 figs., 8 tabs. (Author)

  15. Frescoed Vaults: Accuracy Controlled Simplified Methodology for Planar Development of Three-Dimensional Textured Models

    Directory of Open Access Journals (Sweden)

    Marco Giorgio Bevilacqua

    2016-03-01

    Full Text Available In the field of documentation and preservation of cultural heritage, there is keen interest in 3D metric viewing and rendering of architecture for both formal appearance and color. On the other hand, operative steps of restoration interventions still require full-scale, 2D metric surface representations. The transition from 3D to 2D representation, with the related geometric transformations, has not yet been fully formalized for planar development of frescoed vaults. Methodologies proposed so far on this subject provide for transitioning from point cloud models to ideal mathematical surfaces and for projecting textures using software tools. The methodology used for geometry and texture development in the present work does not require any dedicated software. The different processing steps can be individually checked for any error introduced, which can then be quantified. A direct accuracy check of the planar development of the frescoed surface has been carried out by qualified restorers, yielding a result of 3 mm. The proposed methodology, although requiring further studies to improve automation of the different processing steps, allowed extracting 2D drafts fully usable by operators restoring the vault frescoes.

  16. Development of a standard methodology for optimizing remote visual display for nuclear-maintenance tasks

    International Nuclear Information System (INIS)

    Clarke, M.M.; Garin, J.; Preston-Anderson, A.

    1981-01-01

    The aim of the present study is to develop a methodology for optimizing remote viewing systems for a fuel recycle facility (HEF) being designed at Oak Ridge National Laboratory (ORNL). An important feature of this design involves the Remotex concept: advanced servo-controlled master/slave manipulators, with remote television viewing, will totally replace direct human contact with the radioactive environment. Therefore, the design of optimal viewing conditions is a critical component of the overall man/machine system. A methodology has been developed for optimizing remote visual displays for nuclear maintenance tasks. The usefulness of this approach has been demonstrated by preliminary specification of optimal closed circuit TV systems for such tasks

  17. In-house developed methodologies and tools for decommissioning projects

    International Nuclear Information System (INIS)

    Detilleux, Michel; Centner, Baudouin

    2007-01-01

    The paper describes different methodologies and tools developed in-house by Tractebel Engineering to facilitate the engineering work to be carried out especially in the frame of decommissioning projects. Three examples of tools with their corresponding results are presented: - The LLWAA-DECOM code, a software developed for the radiological characterization of contaminated systems and equipment. The code constitutes a specific module of more general software that was originally developed to characterize radioactive waste streams in order to be able to declare the radiological inventory of critical nuclides, in particular difficult-to-measure radionuclides, to the Authorities. In the case of LLWAA-DECOM, deposited activities inside contaminated equipment (piping, tanks, heat exchangers...) and scaling factors between nuclides, at any given time of the decommissioning time schedule, are calculated on the basis of physical characteristics of the systems and of operational parameters of the nuclear power plant. This methodology was applied to assess decommissioning costs of Belgian NPPs, to characterize the primary system of Trino NPP in Italy, to characterize the equipment of miscellaneous circuits of Ignalina NPP and of Kozloduy unit 1, and to calculate remaining dose rates around equipment in the frame of the preparation of decommissioning activities; - The VISIMODELLER tool, a user friendly CAD interface developed to ease the introduction of lay-out areas in a software named VISIPLAN. VISIPLAN is a 3D dose rate assessment tool for ALARA work planning, developed by the Belgian Nuclear Research Centre SCK.CEN. Both software tools were used for projects such as the steam generator replacements in Belgian NPPs or the preparation of the decommissioning of units 1 and 2 of Kozloduy NPP; - The DBS software, a software developed to manage the different kinds of activities that are part of the general time schedule of a decommissioning project. For each activity, when relevant
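
    The "scaling factors between nuclides" mentioned for LLWAA-DECOM can be illustrated with a trivial sketch: the activity of a difficult-to-measure nuclide is inferred from a measurable key nuclide (typically Co-60 or Cs-137) through a ratio established for the contamination type. The ratios and the measured activity below are invented placeholders, not values computed by the code.

```python
# Hedged sketch of scaling-factor use; all ratios and measurements are assumed.

SCALING_FACTORS_VS_CO60 = {"Ni-63": 2.5, "Fe-55": 1.8, "C-14": 0.01}   # assumed ratios

def infer_inventory(measured_co60_bq: float) -> dict:
    """Estimate difficult-to-measure nuclide activities from a measured Co-60 activity."""
    inventory = {"Co-60": measured_co60_bq}
    for nuclide, ratio in SCALING_FACTORS_VS_CO60.items():
        inventory[nuclide] = ratio * measured_co60_bq
    return inventory

if __name__ == "__main__":
    for nuc, act in infer_inventory(4.0e6).items():
        print(f"{nuc:6s}: {act:.2e} Bq")
```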

  18. DEVELOPMENT OF METHODOLOGY FOR DESIGNING TESTABLE COMPONENT STRUCTURE OF DISCIPLINARY COMPETENCE

    Directory of Open Access Journals (Sweden)

    Vladimir I. Freyman

    2014-01-01

    Full Text Available The aim of the study is to present new methods for assessing the quality of educational results corresponding to the requirements of the third-generation Federal State Educational Standards (FSES) developed for higher education. The urgency of the search for adequate tools for measuring competency and the elements of it formed in the course of specialists' training is pointed out. Methods. It is necessary to consider the interplay of competency components such as knowledge, abilities and mastery in order to make procedures for assessing students' achievements within a separate discipline or curriculum section more convenient, effective and exact. In modelling the component structure of a disciplinary competence, a testable design of components is used; the approach is borrowed from technical diagnostics. Results. The research outcomes include the definition and analysis of a general iterative methodology for the testable design of the component structure of a disciplinary competence. Application of the proposed methodology is illustrated with the example of an abstract academic discipline with specified data and an index of labour requirement. Methodology restrictions are noted; practical recommendations are given. Scientific novelty. Basic data and a detailed step-by-step implementation phase of the proposed common iterative approach to the development of a testable component structure of disciplinary competence are considered. Tests and diagnostic tables for different design options are proposed. Practical significance. The research findings can help promote an increase in learning efficiency, the choice of adequate control devices, accuracy of assessment, and also the efficient use of personnel, temporal and material resources of higher education institutions. The proposed algorithms, methods and approaches to the organization and realization of control over the results of developed competences and their components can be used as a methodical base while

  19. A Methodology of Health Effects Estimation from Air Pollution in Large Asian Cities

    Directory of Open Access Journals (Sweden)

    Keiko Hirota

    2017-09-01

    Full Text Available The increase in health effects caused by air pollution is a growing concern in Asian cities with increasing motorization. This paper discusses methods of estimating the health effects of air pollution in large Asian cities. Due to the absence of statistical data in Asia, this paper carefully chooses a methodology using data from the Japanese compensation system. A basic idea of health effects is captured from simple indicators, such as population and air quality, in a correlation model. This correlation model yields more estimates of respiratory mortality caused by air pollution than the relative risk model. The correlation model could be an alternative method for estimating mortality besides the relative risk model, since the results of the correlation model are comparable with those of the relative risk model by city and by time series. The classification of respiratory diseases is not known from the statistical yearbooks in many countries. The estimation results could support policy decision-making with respect to public health in a cost-effective way.
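
    The two estimation routes compared above can be sketched side by side: a relative-risk model attributes excess respiratory deaths through a concentration-response coefficient, while a correlation (regression) model predicts mortality directly from simple city indicators such as population and air quality. The coefficients and city data below are invented placeholders, not values from the paper or from the Japanese compensation system.

```python
import math

def relative_risk_excess_deaths(baseline_deaths: float, pm10_ug_m3: float,
                                reference_ug_m3: float = 20.0,
                                beta_per_10ug: float = 0.006) -> float:
    """Attributable deaths = baseline * (RR - 1) / RR, with RR = exp(beta * dC / 10)."""
    rr = math.exp(beta_per_10ug * max(pm10_ug_m3 - reference_ug_m3, 0.0) / 10.0)
    return baseline_deaths * (rr - 1.0) / rr

def correlation_model_deaths(population_millions: float, pm10_ug_m3: float,
                             a: float = 15.0, b: float = 0.8) -> float:
    """Toy linear model, hypothetically fitted across cities from simple indicators."""
    return a * population_millions + b * population_millions * pm10_ug_m3

if __name__ == "__main__":
    print(f"relative-risk route: {relative_risk_excess_deaths(30000.0, 80.0):.0f} excess deaths")
    print(f"correlation route  : {correlation_model_deaths(10.0, 80.0):.0f} respiratory deaths")
```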

  20. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  1. Preliminary methodological proposal for estimating environmental flows in projects approved by the ministry of environment and sustainable development (MADS), Colombia

    International Nuclear Information System (INIS)

    Pinilla Agudelo, Gabriel A; Rodriguez Sandoval, Erasmo A; Camacho Botero, Luis A

    2014-01-01

    A methodological proposal for estimating environmental flows in large projects approved by the Agencia Nacional de Licencias Ambientales (ANLA) in Colombian rivers was developed. The project is the result of an agreement between the MADS and the Universidad Nacional de Colombia, Bogota (UNC). The proposed method begins with an evaluation of hydrological criteria, continues with a hydraulic and water quality validation, and follows with the determination of habitat integrity. This is an iterative process that compares conditions before and after the project construction and allows obtaining the magnitude of a monthly flow that, besides preserving the ecological functions of the river, guarantees the water uses downstream. Regarding the biotic component, the proposal includes the establishment and monitoring of biotic integrity indices for four aquatic communities (periphyton, macroinvertebrates, riparian vegetation, and fish). The effects that flow reduction may produce in the medium and long term can be assessed by these indices. We present the results of applying the methodology to several projects licensed by the MADS.

  2. Practical Aspects of Research Monitoring: Methodological and Functional Solutions

    Directory of Open Access Journals (Sweden)

    A A Onosov

    2013-12-01

    Full Text Available The article describes the experience of designing, testing and implementing the National system of monitoring the quality of meteorological services in Russia. Within the framework of this project a large-scale research program was carried out, aimed at developing the concept, methodology, research tools and design of customer assessment of Roshydromet services.

  3. TECHNOLOGY FOR DEVELOPMENT OF ELECTRONIC TEXTBOOK ON HANDICRAFTS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Iryna V. Androshchuk

    2017-10-01

    Full Text Available The main approaches to defining the concept of electronic textbook have been analyzed in the article. The main advantages of electronic textbooks in the context of future teachers’ training have been outlined. They are interactivity, feedback provision, availability of navigation and search engine. The author has presented and characterized the main stages in the technology of development of an electronic textbook on Handicraft and Technology Training Methodology: determination of its role and significance in the process of mastering the discipline; justification of its structure; outline of the stages of its development in accordance with the defined structure. The characteristic feature of the developed electronic textbook is availability of macro- and microstructure. Macrostructure is viewed as a sequence of components of the electronic textbook that are manifested in its content; microstructure is considered to be an internal pattern of each component of macrostructure.

  4. D4.1 Learning analytics: theoretical background, methodology and expected results

    NARCIS (Netherlands)

    Tammets, Kairit; Laanpere, Mart; Eradze, Maka; Brouns, Francis; Padrón-Nápoles, Carmen; De Rosa, Rosanna; Ferrari, Chiara

    2014-01-01

    The purpose of the EMMA project is to showcase excellence in innovative teaching methodologies and learning approaches through the large-scale piloting of MOOCs on different subjects. The main objectives related to the implementation of learning analytics in the EMMA project are to: ● develop the

  5. IDENTIFICATION OF THE IMPACT OF GLOBALIZATION ON THE DEVELOPMENT OF ACCOUNTING METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Tetiana Osadcha

    2017-12-01

    Full Text Available The relevance of the research lies in the significant lag of the economies of post-Soviet countries behind the countries of the world that act as the driving forces of economic globalization. The incomplete transformation of the institution of property rights, including the rights to specific factors of production, and a distorted perception of state participation in the regulation of economic relationships cause a fragmentary reflection of the facts of business life of business units by the accounting system. This shortcoming can be eliminated by identifying the advantages of, the feasibility of resisting, and the measures for involving the economy of Ukraine in the process of economic globalization, as well as by determining the priority trends of the transformation of the property institution with a subsequent revision of the accounting methodology. The purpose of the research was to identify the impact of globalization on the development of accounting methodology due to institutional changes in property ownership in the field of management and to outline the prerequisites for recognizing rent as an accounting object. The methodological basis of the research is the dialectical method of knowledge of the essence, preconditions and consequences of economic globalization as a whole, as well as for its individual subjects; and general scientific methods of scientific knowledge (analysis, abstraction, synthesis, generalization) of the possible influence of the fundamentals and ideas of institutional theory, agency theory and rent theory on accounting methodology, in order to outline grounds for recognizing rent as a component of the income of a business unit. Scientific results. It is established that the development of globalization processes was accompanied by significant transformations of property relations and changing approaches to the analysis of activities, in particular, with regard to the use of resources, distribution and redistribution

  6. Developing a business analytics methodology: a case study in the foodbank sector

    OpenAIRE

    Hindle, Giles; Vidgen, Richard

    2017-01-01

    The current research seeks to address the following question: how can organizations align their business analytics development projects with their business goals? To pursue this research agenda we adopt an action research framework to develop and apply a business analytics methodology (BAM). The four-stage BAM (problem situation structuring, business model mapping, analytics leverage analysis, and analytics implementation) is not a prescription. Rather, it provides a logical structure and log...

  7. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large-scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain many career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  8. Design Methodology of Process Layout considering Various Equipment Types for Large scale Pyro processing Facility

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang; Lee, Hyo Jik

    2016-01-01

    At present, each item of process equipment required for integrated processing is being examined, based on experience acquired during the Pyroprocess Integrated Inactive Demonstration Facility (PRIDE) project, and considering the requirements and desired performance enhancement of KAPF as a new facility beyond PRIDE. Essentially, KAPF will be required to handle hazardous materials such as spent nuclear fuel, which must be processed in an isolated and shielded area separate from the operator location. Moreover, an inert-gas atmosphere must be maintained, because of the radiation and deliquescence of the materials. KAPF must also achieve the goal of significantly increased yearly production beyond that of the previous facility; therefore, several parts of the production line must be automated. This article presents the method considered for the conceptual design of both the production line and the overall layout of the KAPF process equipment. This study has proposed a design methodology that can be utilized as a preliminary step for the design of a hot-cell-type, large-scale facility, in which the various types of processing equipment operated by the remote handling system are integrated. The proposed methodology applies to part of the overall design procedure and contains various weaknesses. However, if the designer is required to maximize the efficiency of the installed material-handling system while considering operation restrictions and maintenance conditions, this kind of design process can accommodate the essential components that must be employed simultaneously in a general hot-cell system

  9. The development of a methodology to assess population doses from multiple sources and exposure pathways of radioactivity

    International Nuclear Information System (INIS)

    Hancox, J.; Stansby, S.; Thorne, M.

    2002-01-01

    The Environment Agency (EA) has new duties in accordance with the Basic Safety Standards Directive under which it is required to ensure that doses to individuals received from exposure to anthropogenic sources of radioactivity are within defined limits. In order to assess compliance with these requirements, the EA needs to assess the doses to members of the most highly exposed population groups ('critical' groups) from all relevant potential sources of anthropogenic radioactivity and all relevant potential exposure pathways to such radioactivity. The EA has identified a need to develop a methodology for the retrospective assessment of effective doses from multiple sources of radioactive materials and exposure pathways associated with those sources. Under contract to the EA, AEA Technology has undertaken the development of a suitable methodology as part of EA R and D Project P3-070. The methodology developed under this research project has been designed to support the EA in meeting its obligations under the Euratom Basic Safety Standards Directive and is consistent with UK and international approaches to radiation dosimetry and radiological protection. The development and trial application of the methodology is described in this report

  10. Soviet-designed pressurized water reactor symptomatic emergency operating instruction analytical procedure: approach, methodology, development and application

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1999-01-01

    A symptom approach to the analytical validation of symptom-based EOPs includes: (1) Identification of critical safety functions to the maintenance of fission product barrier integrity; (2) Identification of the symptoms which manifest an impending challenge to critical safety function maintenance; (3) Development of a symptomatic methodology to delineate bounding plant transient response modes; (4) Specification of bounding scenarios; (5) Development of a systematic calculational approach consistent with the objectives of the methodology; (6) Performance of thermal-hydraulic computer code calculations implementing the analytical methodology; (7) Interpretation of the analytical results on the basis of information available to the operator; (8) Application of the results to the validation of the proposed operator actions; (9) Production of a technical basis document justifying the proposed operator actions. (author)

  11. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    Bachelor thesis "Scrum methodology in banking environment" is focused on one of agile methodologies called Scrum and description of the methodology used in banking environment. Its main goal is to introduce the Scrum methodology and outline a real project placed in a bank focused on software development through a case study, address problems of the project, propose solutions of the addressed problems and identify anomalies of Scrum in software development constrained by the banking environmen...

  12. Development of Cost Estimation Methodology of Decommissioning for PWR

    International Nuclear Information System (INIS)

    Lee, Sang Il; Yoo, Yeon Jae; Lim, Yong Kyu; Chang, Hyeon Sik; Song, Geun Ho

    2013-01-01

    The permanent closure of a nuclear power plant should be conducted under strict laws and with thorough planning, including cost and schedule estimation, because the plant is heavily contaminated with radioactivity. In Korea, there are two types of nuclear power plant: the pressurized light water reactor (PWR) and the pressurized heavy water reactor (PHWR), known as the CANDU reactor. About 50% of the operating nuclear power plants in Korea are PWRs originally designed by CE (Combustion Engineering). There is experience with the decommissioning of Westinghouse-type PWRs, but little with CE-type PWRs. Therefore, the purpose of this paper is to develop a cost estimation methodology and evaluate the technical level of decommissioning for application to CE-type PWRs based on system engineering. Through the study, the following conclusions are obtained: · Based on system engineering, the decommissioning work can be classified into Set, Subset, Task, Subtask and Work cost units. · The Set and Task structures are grouped into 29 Sets and 15 Tasks, respectively. · The final result provides the cost and project schedule for project control and risk management. · The present results are preliminary and should be refined and improved based on modeling and cost data reflecting available technology and current costs such as labor and waste data.
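
    The Set/Subset/Task/Subtask/Work breakdown described above maps naturally onto a recursive cost roll-up. The sketch below is only a minimal illustration of such a hierarchy in Python; the item names and unit costs are hypothetical placeholders and are not taken from the study.

        # Minimal illustrative sketch of a hierarchical decommissioning cost roll-up
        # (Set -> Subset -> Task -> Subtask -> Work). All names and figures are
        # hypothetical placeholders, not data from the study.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class CostItem:
            name: str
            cost: float = 0.0                      # direct cost of this item
            children: List["CostItem"] = field(default_factory=list)

            def total(self) -> float:
                # roll up the direct cost plus the cost of all descendants
                return self.cost + sum(child.total() for child in self.children)

        # hypothetical fragment: one Set with one Task and two Work units
        reactor_removal = CostItem("Set: reactor vessel removal", children=[
            CostItem("Task: segmentation", children=[
                CostItem("Work: cutting labour", cost=1.2e6),
                CostItem("Work: waste packaging", cost=0.8e6),
            ]),
        ])
        print(f"Estimated set cost: {reactor_removal.total():,.0f}")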

  13. Development of Geometry Optimization Methodology with In-house CFD code, and Challenge in Applying to Fuel Assembly

    International Nuclear Information System (INIS)

    Jeong, J. H.; Lee, K. L.

    2016-01-01

    The wire spacer plays important roles in avoiding collisions between adjacent rods, mitigating vortex-induced vibration, and enhancing convective heat transfer through wire-spacer-induced secondary flow. Many experimental and numerical studies have been conducted to understand the thermal-hydraulics of wire-wrapped fuel bundles, and the enormous growth in computing capability now allows three-dimensional simulation of their thermal-hydraulics. In this study, a geometry optimization methodology based on a RANS in-house CFD (Computational Fluid Dynamics) code has been successfully developed under air conditions. In order to apply the developed methodology to a fuel assembly, a GGI (General Grid Interface) function was developed for the in-house CFD code, in the same manner as in CFX. Furthermore, three-dimensional flow fields calculated with the in-house CFD code were compared with those calculated with the general-purpose commercial CFD solver CFX. Even though both analyses were conducted with the same computational meshes, numerical error due to the GGI function occurred locally only in the CFX solver, around the rod surface and the boundary region between the inner and outer fluid regions.

  14. Methodologies for Social Life Cycle Assessment

    DEFF Research Database (Denmark)

    Jørgensen, Andreas; Le Bocq, Agathe; Nazakina, Liudmila

    2008-01-01

    Goal, Scope and Background. In recent years several different approaches towards Social Life Cycle Assessment (SLCA) have been developed. The purpose of this review is to compare these approaches in order to highlight methodological differences and general shortcomings. SLCA has several similarities with other social assessment tools, but in order to limit the review, only approaches claiming to address social impacts from an LCA-like framework are considered. Main Features. The review is to a large extent based on conference proceedings and reports, some of which are not easily accessible, since very… stage in the product life cycle. Another very important difference among the proposals is their position towards the use of generic data. Several of the proposals argue that social impacts are connected to the conduct of the company, leading to the conclusion that each individual company in the product…

  15. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical 'signal-to-noise' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  16. New well pattern optimization methodology in mature low-permeability anisotropic reservoirs

    Science.gov (United States)

    Qin, Jiazheng; Liu, Yuetian; Feng, Yueli; Ding, Yao; Liu, Liu; He, Youwei

    2018-02-01

    In China, many well patterns were designed before the principal permeability direction of low-permeability anisotropic reservoirs was known. After several years of production, it often turns out that the well line direction is not parallel to the principal permeability direction. However, traditional well location optimization methods (in terms of objective functions such as net present value and/or ultimate recovery) are inapplicable, since wells are not free to move around in a mature oilfield. Thus, the well pattern optimization (WPO) of mature low-permeability anisotropic reservoirs is a significant but challenging task, since the original well pattern (WP) will be distorted and reconstructed due to permeability anisotropy. In this paper, we investigate the destruction and reconstruction of the WP when the principal permeability direction and the well line direction are not parallel. A new methodology was developed to quantitatively optimize the well locations of a mature large-scale WP through a WPO algorithm based on coordinate transformation (i.e. rotating and stretching). For a mature oilfield the large-scale WP has settled, so it is not economically viable to carry out further infill drilling; this paper circumvents the difficulty by combining the WPO algorithm with well status (open or shut-in) and schedule adjustment. Finally, the methodology is applied to an example: cumulative oil production of the optimized WP is higher and water-cut is lower, which highlights the potential of applying the WPO methodology in mature large-scale field development projects.
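
    A standard way to handle permeability anisotropy of this kind is to rotate well coordinates into the principal permeability axes and stretch one axis by the square root of the permeability ratio, so that spacing can be compared in an equivalent isotropic system. The sketch below illustrates only that generic rotate-and-stretch step; it is not the authors' WPO algorithm, and the angle and permeability values are hypothetical.

        import numpy as np

        def rotate_and_stretch(xy, theta_deg, kx, ky):
            """Map well coordinates into an equivalent isotropic system.

            xy        : (n, 2) array of well locations
            theta_deg : azimuth of the principal (maximum) permeability axis
            kx, ky    : principal permeabilities along and across that axis
            """
            theta = np.radians(theta_deg)
            rot = np.array([[np.cos(theta), np.sin(theta)],
                            [-np.sin(theta), np.cos(theta)]])
            xy_rot = xy @ rot.T                         # rotate into principal axes
            stretch = np.diag([1.0, np.sqrt(kx / ky)])  # stretch the low-permeability direction
            return xy_rot @ stretch

        # hypothetical five-spot fragment, principal axis at 30 degrees, 10:1 anisotropy
        wells = np.array([[0, 0], [300, 0], [0, 300], [300, 300], [150, 150]], float)
        print(rotate_and_stretch(wells, 30.0, kx=100.0, ky=10.0))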

  17. A methodology for analyzing precursors to earthquake-initiated and fire-initiated accident sequences

    International Nuclear Information System (INIS)

    Budnitz, R.J.; Lambert, H.E.; Apostolakis, G.

    1998-04-01

    This report covers work to develop a methodology for analyzing precursors to both earthquake-initiated and fire-initiated accidents at commercial nuclear power plants. Currently, the U.S. Nuclear Regulatory Commission sponsors a large ongoing project, the Accident Sequence Precursor project, to analyze the safety significance of other types of accident precursors, such as those arising from internally-initiated transients and pipe breaks, but earthquakes and fires are not within the current scope. The results of this project are that: (1) an overall step-by-step methodology has been developed for precursors to both fire-initiated and seismic-initiated potential accidents; (2) some stylized case-study examples are provided to demonstrate how the fully-developed methodology works in practice; and (3) a generic seismic-fragility data base for equipment is provided for use in seismic-precursor analyses. 44 refs., 23 figs., 16 tabs
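
    Equipment seismic fragilities of the kind collected in such a data base are conventionally represented as lognormal fragility curves, i.e. the conditional probability of failure at ground-motion level a is Phi(ln(a/A_m)/beta), with median capacity A_m and composite logarithmic standard deviation beta. The snippet below evaluates that standard form for hypothetical parameters; it illustrates the general representation rather than the report's specific data.

        from math import log, sqrt, erf

        def lognormal_fragility(a, a_median, beta):
            """Conditional probability of failure at ground-motion level `a`
            for a lognormal fragility with median capacity `a_median` (same units)
            and composite logarithmic standard deviation `beta`."""
            z = log(a / a_median) / beta
            return 0.5 * (1.0 + erf(z / sqrt(2.0)))   # standard normal CDF

        # hypothetical component: median capacity 0.9 g, beta 0.4
        for pga in (0.1, 0.3, 0.6, 0.9):
            print(f"PGA {pga:.1f} g -> P(failure) = {lognormal_fragility(pga, 0.9, 0.4):.3f}")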

  18. A Methodology and a Web Platform for the Collaborative Development of Context-Aware Systems

    Science.gov (United States)

    Martín, David; López-de-Ipiña, Diego; Alzua-Sorzabal, Aurkene; Lamsfus, Carlos; Torres-Manzanera, Emilio

    2013-01-01

    Information and services personalization is essential for an optimal user experience. Systems have to be able to acquire data about the user's context, process them in order to identify the user's situation and finally, adapt the functionality of the system to that situation, but the development of context-aware systems is complex. Data coming from distributed and heterogeneous sources have to be acquired, processed and managed. Several programming frameworks have been proposed in order to simplify the development of context-aware systems. These frameworks offer high-level application programming interfaces for programmers that complicate the involvement of domain experts in the development life-cycle. The participation of users that do not have programming skills but are experts in the application domain can speed up and improve the development process of these kinds of systems. Apart from that, there is a lack of methodologies to guide the development process. This article presents as main contributions, the implementation and evaluation of a web platform and a methodology to collaboratively develop context-aware systems by programmers and domain experts. PMID:23666131

  19. International developments on implementation of Wog risk-informed inservice inspection methodology

    International Nuclear Information System (INIS)

    Balkey, K.R.; Bishop, B.A.; Canton, M.A.; Closky, N.B.; Haessler, R.L.; Kolonay, J.F.; Sharp, G.L.; Stevenson, P.R.

    2001-01-01

    The Westinghouse Owners Group (WOG) risk-informed inservice inspection (ISI) methodology was granted approval by the U.S. Nuclear Regulatory Commission in 1998 thereby providing an alternative to ASME Section XI Code requirements for the selection of examination locations in nuclear plant piping systems. This paper builds upon a technical paper presented at ICONE-8 that reported on the first wave of risk-informed ISI applications under development primarily focusing on those underway within the U.S. Since that time, many applications have continued within the U.S., however, much progress has been made in applying the WOG risk-informed ISI approach in several other countries. While a summary of results across the various applications will be provided, the paper will focus on the development and implementation of the WOG risk-informed ISI methodology across Europe and in Asia for both full scope and limited Class 1 scope applications. An update on future risk-informed applications, such as modifying requirements for augmented examinations for high energy line break exclusion regions and in risk-informing the safety classification of pressure boundary components in support of risk-informed regulation initiatives, will also be provided. (authors)

  20. An approach to SOA development methodology: SOUP comparison with RUP and XP

    OpenAIRE

    Sandra Svanidzaitė

    2014-01-01

    Service oriented architecture (SOA) is an architecture for distributed applications composed of distributed services with weak coupling that are designed to meet business requirements. One of the research priorities in the field of SOA is creating such software design and development methodology (SDDM) that takes into account all principles of this architecture and allows for effective and efficient application development. A lot of investigation has been carried out to find out whether can o...

  1. On-Line Maintenance Methodology Development

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyo Won; Kim, Jae Ho; Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2012-05-15

    Most domestic maintenance activities for nuclear power plants are performed during overhaul. On-Line Maintenance (OLM) is therefore one of the appropriate risk-informed application techniques for relieving the maintenance burden during overhaul while the safety of the plant is secured. NUMARC 93-01 (Rev. 3) presents the state of the art of OLM and provides a methodology. This study adopts NUMARC 93-01 (Rev. 3) and presents OLM. The reference component is the Emergency Diesel Generator (EDG) of Ulchin Units 3 and 4.

  2. SPRINT RA 230: Methodology for knowledge based developments

    International Nuclear Information System (INIS)

    Wallsgrove, R.; Munro, F.

    1991-01-01

    SPRINT RA 230: A Methodology for Knowledge Based Developments, funded by the European Commission, was set up to investigate the use of KBS in the engineering industry. Its aim was to find out how KBS were currently used and what people's conceptions of them were, to disseminate current knowledge and to recommend further research into this area. A survey (by post and face-to-face interviews) was carried out under SPRINT RA 230 to investigate requirements for more intelligent software. In the survey we looked both at how people think about Knowledge Based Systems (KBS), what they find useful and what is not useful, and at what current expertise problems or limitations of conventional software might suggest KBS solutions. (orig./DG)

  3. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    The GO-FLOW methodology is a success-oriented system analysis technique capable of evaluating large systems with complex operational sequences. Recently, an integrated analysis framework for GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism of the Japanese Government. This paper describes (a) an overview of the GO-FLOW methodology, (b) the procedure for treating a phased mission problem, (c) common cause failure analysis, (d) uncertainty analysis, and (e) the integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)

  4. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  5. The Development Of Learning Sets And Research Methodology Module Using Problem Based Learning For Accounting Education Students

    OpenAIRE

    Thomas, Partono; Nurkhin, Ahmad

    2016-01-01

    Improving the learning process by implementing innovative learning methods or media is very important for every lecturer. The purpose of this study is to develop learning instruction and a module for the research methodology course based on problem-based learning for accounting education students. The research applied a research and development design in the research methodology course of the Economics Education (Accounting) Department, Faculty of Economics, Semarang State University. Data analysis was used to test...

  6. The Desired Image of the Future Economy of the Industrial Region: Development Trends and Evaluation Methodology

    Directory of Open Access Journals (Sweden)

    Olga Aleksandrovna Romanova

    2017-09-01

    In the article, the authors emphasize that industrial regions play an important role in increasing the technological independence of Russia. We show that the decline in the share of processing industries in the gross regional product cannot be treated as negative de-industrialization of the economy. The article proves that the increasing speed of change, the instability of socio-economic systems and diverse risks predetermine the need to develop new methodological approaches to predictive research. Studies aimed at developing a technology for designing the desired image of the future and a methodology for its evaluation are therefore of high importance. For the initial stage of the research, the authors propose a methodological approach for assessing the desired image of the future of metallurgy as one of the most important industries of the region, and introduce the term «technological image of the regional metallurgy». We show that repositioning the image of the regional metallurgical complex is quite a long process, which has determined the need to define the stages of repositioning. The proposed methodology for evaluating the desired future includes methodological provisions to quantify the characteristics of the goals achieved at the respective stages of the repositioning of the metallurgy. The methodological approach to designing the desired image of the future comprises the following stages: identification of the priority areas of technological development of regional metallurgy on the basis of bibliometric and patent analysis; evaluation and forecasting of the dynamics of the structure of domestic consumption of metal products based on comparative analysis and relevant analytical methods; design of a factor model, based on the principal components method, allowing the parameters quantifying the technological image of the regional metallurgy to be identified; systematization of

  7. Preliminary methodology to assess the national and regional impact of U.S. wind energy development on birds and bats

    Science.gov (United States)

    Diffendorfer, James E.; Beston, Julie A.; Merrill, Matthew D.; Stanton, Jessica C.; Corum, Margo D.; Loss, Scott R.; Thogmartin, Wayne E.; Johnson, Douglas H.; Erickson, Richard A.; Heist, Kevin W.

    2015-01-01

    The U.S. Geological Survey has developed a methodology to assess the impacts of wind energy development on wildlife; it is a probabilistic, quantitative assessment methodology that can communicate to decision makers and the public the magnitude of these effects on species populations. The methodology is currently applicable to birds and bats, focuses primarily on the effects of collisions, and can be applied to any species that breeds in, migrates through, or otherwise uses any part of the United States. The methodology is intended to assess species at the national scale and is fundamentally different from existing methods focusing on impacts at individual facilities.

  8. Towards a Public Sector GIS Evaluation Methodology | Kurwakumire ...

    African Journals Online (AJOL)

    Towards a Public Sector GIS Evaluation Methodology. … However, evaluation methodologies for public sector GIS are largely lacking.

  9. An architecture and methodology for the design and development of Technical Information Systems

    NARCIS (Netherlands)

    Capobianchi, R.; Mautref, M.; van Keulen, Maurice; Balsters, H.

    In order to meet demands in the context of Technical Information Systems (TIS) pertaining to reliability, extensibility, maintainability, etc., we have developed an architectural framework with accompanying methodological guidelines for designing such systems. With the framework, we aim at complex

  10. Architecture for large-scale automatic web accessibility evaluation based on the UWEM methodology

    DEFF Research Database (Denmark)

    Ulltveit-Moe, Nils; Olsen, Morten Goodwin; Pillai, Anand B.

    2008-01-01

    The European Internet Accessibility Observatory project (EIAO) has developed an Observatory for performing large-scale automatic web accessibility evaluations of public sector web sites in Europe. The architecture includes a distributed web crawler that crawls web sites for links until either a given budget… of web pages has been identified or the web site has been crawled exhaustively. Subsequently, a uniform random subset of the crawled web pages is sampled and sent for accessibility evaluation, and the evaluation results are stored in a Resource Description Format (RDF) database that is later loaded… challenges that the project faced and the solutions developed towards building a system capable of regular large-scale accessibility evaluations with sufficient capacity and stability. It also outlines some possible future architectural improvements…
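
    The "uniform random subset of the crawled web pages" step can be realized, for example, with reservoir sampling, which draws a fixed-size uniform sample from a stream of URLs without holding the whole crawl in memory. The sketch below shows that generic technique; it is not the Observatory's actual implementation, and the URLs and sample size are placeholders.

        import random
        from typing import Iterable, List

        def reservoir_sample(urls: Iterable[str], k: int, seed: int = 0) -> List[str]:
            """Return k URLs drawn uniformly at random from an arbitrarily long stream."""
            rng = random.Random(seed)
            sample: List[str] = []
            for i, url in enumerate(urls):
                if i < k:
                    sample.append(url)
                else:
                    j = rng.randint(0, i)   # keep each later URL with probability k/(i+1)
                    if j < k:
                        sample[j] = url
            return sample

        # usage with a hypothetical crawl frontier
        crawled = (f"http://example.org/page{i}" for i in range(10_000))
        subset = reservoir_sample(crawled, k=100)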

  11. Development and application of the methodology to establish life extension and modernization plan of aged hydropower plants

    International Nuclear Information System (INIS)

    Kim, Jong Sung; Kwon, Hyuck Cheon; Song, Byung Hun; Kwon, Chang Seop

    2009-01-01

    This paper describes how to establish an integrated plan for LE (Life Extension) and MD (MoDernization) of aged hydropower plants. The methodology was developed through a review of overseas and domestic LE/MD histories, investigation of previous overseas methodologies and consideration of domestic practices. It includes reviews of various factors such as condition, operation and maintenance history, up-to-date technology, and economic benefit. In order to establish the life extension/modernization plan, the methodology is applied to domestic aged hydropower plants. Finally, priority rankings and draft practice plans for LE/MD are derived.

  12. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela D A Guardia

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  13. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.
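
    As a rough illustration of wrapping an analysis step as a RESTful service of the kind this methodology targets, the sketch below exposes a toy log2 fold-change computation over HTTP using Flask. The endpoint name, payload fields and computation are hypothetical placeholders and are not part of the GEAS services.

        # Hedged sketch: a minimal RESTful wrapper around a toy gene-expression step.
        from flask import Flask, request, jsonify
        import math

        app = Flask(__name__)

        @app.route("/fold-change", methods=["POST"])
        def fold_change():
            # expects JSON such as {"gene": "BRG1", "treated": [..], "control": [..]}
            data = request.get_json(force=True)
            treated = data["treated"]
            control = data["control"]
            mean_t = sum(treated) / len(treated)
            mean_c = sum(control) / len(control)     # assumed non-zero in this toy example
            log2fc = math.log2(mean_t / mean_c)
            return jsonify({"gene": data.get("gene", "unknown"),
                            "log2_fold_change": log2fc})

        if __name__ == "__main__":
            app.run(port=5000)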

  14. Establishing a methodology to develop complex sociotechnical systems

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2013-02-01

    Many modern management systems, such as military command and control, tend to be large and highly interconnected sociotechnical systems operating in a complex environment. Successful development, assessment and implementation of these systems...

  15. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1992-01-01

    The Analysis and Testing Group (WX-11) of the Design Engineering Division at Los Alamos National Laboratory (LANL) is developing methodology for designing and providing a basis for certification of Type B shipping containers. This methodology will include design, analysis, testing, fabrication, procurement, and obtaining certification of the Type B containers, allowing usage in support of the United States Department of Energy programs. While all aspects of the packaging development are included in this methodology, this paper focuses on the use of analysis and testing techniques for enhancing the design and providing a basis for certification. This methodology is based on concurrent engineering principles. Multidisciplinary teams within LANL are responsible for the design and certification of specific Type B Radioactive Material Shipping Containers. These teams include personnel with the various backgrounds and areas of expertise required to support the design, testing, analysis and certification tasks. To demonstrate that a package can pass all the performance requirements, the design needs to be characterized as completely as possible. Understanding package responses to the various environments and how these responses influence the effectiveness of the packaging requires expertise in several disciplines. In addition to characterizing the shipping container designs, these multidisciplinary teams should be able to provide insight into improving new package designs

  16. Developing a methodology to assess the impact of research grant funding: a mixed methods approach.

    Science.gov (United States)

    Bloch, Carter; Sørensen, Mads P; Graversen, Ebbe K; Schneider, Jesper W; Schmidt, Evanthia Kalpazidou; Aagaard, Kaare; Mejlgaard, Niels

    2014-04-01

    This paper discusses the development of a mixed methods approach to analyse research funding. Research policy has taken on an increasingly prominent role in the broader political scene, where research is seen as a critical factor in maintaining and improving growth, welfare and international competitiveness. This has motivated growing emphasis on the impacts of science funding, and how funding can best be designed to promote socio-economic progress. Meeting these demands for impact assessment involves a number of complex issues that are difficult to fully address in a single study or in the design of a single methodology. However, they point to some general principles that can be explored in methodological design. We draw on a recent evaluation of the impacts of research grant funding, discussing both key issues in developing a methodology for the analysis and subsequent results. The case of research grant funding, involving a complex mix of direct and intermediate effects that contribute to the overall impact of funding on research performance, illustrates the value of a mixed methods approach to provide a more robust and complete analysis of policy impacts. Reflections on the strengths and weaknesses of the methodology are used to examine refinements for future work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
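
    For orientation, the large deviation principle referred to above is conventionally stated as follows (this is the textbook formulation with rate function I and scale n, not a quotation from the book):

        -\inf_{x \in A^{\circ}} I(x)
          \;\le\; \liminf_{n\to\infty} \tfrac{1}{n}\log P_n(A)
          \;\le\; \limsup_{n\to\infty} \tfrac{1}{n}\log P_n(A)
          \;\le\; -\inf_{x \in \overline{A}} I(x)

    for every measurable set A, where A° and the closure of A bound the estimate and I is lower semicontinuous with compact level sets (a good rate function).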

  18. Development of methodology for the analysis of fuel behavior in light water reactor in design basis accidents

    International Nuclear Information System (INIS)

    Salatov, A. A.; Goncharov, A. A.; Eremenko, A. S.; Kuznetsov, V. I.; Bolnov, V. A.; Gusev, A. S.; Dolgov, A. B.; Ugryumov, A. V.

    2013-01-01

    The report analyzes the current experience in assessing fuel safety for light-water reactors (LWRs) under design-basis accident conditions in terms of its compliance with international requirements for licensing nuclear power plants. The components of a fuel behavior analysis methodology for design basis accidents in LWRs are considered: classification of design basis accidents, phenomenology of fuel behavior in design basis accidents, the system of fuel safety criteria and their experimental support, applicability of the computer codes and input data used for computational analysis of fuel behavior in accidents, and the way of accounting for the uncertainty of calculation models and input data. A brief history of the development of probabilistic safety analysis methodology for nuclear power plants abroad is given, and examples of a conservative approach to the safety analysis of VVER fuel and of a probabilistic approach to the safety analysis of TVS-K fuel are presented. In the authors' opinion, the actual problems in developing the methodology for analyzing the behavior of VVER fuel under design basis accident conditions are the following: 1) development of a common methodology for analyzing the behavior of VVER fuel in design basis accidents implementing a realistic approach to the analysis of uncertainty, which will be necessary in the future for the licensing of operating VVER fuel abroad; 2) experimental and analytical support of the methodology: experimental studies to identify and characterize the key uncertainties of computational models of the fuel and cladding, development of computational models of key events in the codes, and validation of the codes on the basis of integral experiments

  19. Navigating the Process of Ethical Approval: A methodological note

    Directory of Open Access Journals (Sweden)

    Eileen Carey, RNID, BSc. (Hons), MSc.

    2010-12-01

    Classic grounded theory (CGT) methodology is a general methodology whereby the researcher aims to develop an emergent conceptual theory from empirical data collected during the research study. Gaining ethical approval from the relevant ethics committees to access such data is the starting point for a CGT study. The adoption of the Universal Declaration on Bioethics and Human Rights (UNESCO, 2005) is an indication of global consensus on the importance of research ethics. There is, however, wide variation in health research systems across countries and disciplines (Hearnshaw, 2004). Institutional Research Boards (IRBs) or Research Ethics Committees (RECs) have been established in many countries to regulate ethical research, ensuring that researchers agree to, and adhere to, specific ethical and methodological conditions prior to ethical approval being granted. Interestingly, both the processes and the outcomes through which the methodological aspects pertinent to CGT studies are agreed between the researcher and the ethics committee remain largely ambiguous and vague. Therefore, meeting the requirements for ethical approval from ethics committees, while enlisting the CGT methodology as the chosen research approach, can be daunting for novice researchers embarking upon their first CGT study.

  20. SITE-94. Scenario development FEP audit list preparation: methodology and presentation

    International Nuclear Information System (INIS)

    Stenhouse, M.; Chapman, N.; Sumerling, T.

    1993-04-01

    This report concerns a study which is part of the SKI performance assessment project SITE-94. SITE-94 is a performance assessment of a hypothetical repository at a real site. The main objective of the project is to determine how site specific data should be assimilated into the performance assessment process and to evaluate how uncertainties inherent in site characterization will influence performance assessment results. Other important elements of SITE-94 are the development of a practical and defensible methodology for defining, constructing and analyzing scenarios, the development of approaches for treatment of uncertainties, evaluation of canister integrity, and the development and application of an appropriate quality assurance plan for performance assessments

  1. Development of a calculation methodology for potential flow over irregular topographies

    International Nuclear Information System (INIS)

    Del Carmen, Alejandra F.; Ferreri, Juan C.; Boutet, Luis I.

    2003-01-01

    Computer codes for the calculation of potential flow fields over surfaces with irregular topographies have been developed. The flows past multiple simple obstacles and past the region neighboring the Embalse Nuclear Power Station have been considered. The codes developed allow the calculation of velocities quite near the surface, which in turn required the development of high-accuracy techniques. The Boundary Element Method, using a linear approximation on triangular plane elements and an analytical integration methodology, has been applied. A particular and quite efficient technique for the calculation of the solid angle at each node vertex was also considered. The results so obtained will be applied to predict the dispersion of passive pollutants coming from discontinuous emissions. (authors)
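
    A common analytical route to the solid angle subtended at a node by a plane triangular element is the Van Oosterom-Strackee arctangent formula. The sketch below implements that general formula and checks it against a cube face seen from the cube centre; it is offered only as an illustration of the kind of calculation described, not as the authors' code.

        import numpy as np

        def triangle_solid_angle(obs, v1, v2, v3):
            """Signed solid angle (steradians) subtended at point `obs` by the plane
            triangle (v1, v2, v3), via the Van Oosterom-Strackee arctangent formula."""
            r1, r2, r3 = (np.asarray(v, float) - np.asarray(obs, float) for v in (v1, v2, v3))
            l1, l2, l3 = np.linalg.norm(r1), np.linalg.norm(r2), np.linalg.norm(r3)
            numerator = np.dot(r1, np.cross(r2, r3))
            denominator = (l1 * l2 * l3 + np.dot(r1, r2) * l3
                           + np.dot(r1, r3) * l2 + np.dot(r2, r3) * l1)
            return 2.0 * np.arctan2(numerator, denominator)

        # sanity check: one face of a unit cube viewed from its centre subtends
        # 4*pi/6 sr; split the face into two triangles and sum their magnitudes.
        centre = (0.5, 0.5, 0.5)
        a, b, c, d = (0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)
        omega = abs(triangle_solid_angle(centre, a, b, c)) + abs(triangle_solid_angle(centre, a, c, d))
        print(omega, 4 * np.pi / 6)   # both approximately 2.0944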

  2. Theoretical and methodological foundations of sustainable development of Geosystems

    Science.gov (United States)

    Mandryk, O. M.; Arkhypova, L. M.; Pukish, A. V.; Zelmanovych, A.; Yakovlyuk, Kh

    2017-05-01

    The theoretical and methodological foundations of sustainable development of geosystems were further developed. The new scientific direction of “constructive Hydroecology” was substantiated - the science that studies the hydrosphere from the standpoint of natural and technogenic safety on the basis of a geosystem approach. A structural division of constructive Hydroecology based on objective, subjective and application characteristics was established. The main object of study of the new scientific field is the hydroecological environment, understood as the part of the hydrosphere that belongs to a multicomponent dynamic system influenced by engineering and economic human activities and, in turn, determining that activity to some extent.

  3. Using Digital Archives in Quantitative Discourse Studies: Methodological Reflections

    Directory of Open Access Journals (Sweden)

    Kobie Van Krieken

    2015-12-01

    This methodological essay discusses the possibilities of using digital archives in quantitative discourse studies. I illustrate these possibilities by discussing a study in which the digital archive Delpher was used to build a relatively large corpus of newspaper narratives (N=300) in order to test hypotheses about the historical development of linguistic features associated with objective and subjective reporting. The large amount of data collected in digital archives like Delpher facilitates the construction of corpora for such hypothesis-driven studies. However, the collection of newspaper articles on Delpher in fact constitutes only a small, non-random and continuously changing selection of all available data. Due to these characteristics, the use of Delpher jeopardizes two core values of quantitative empirical research: the generalizability and the replicability of findings. Although these issues cannot be easily overcome, I argue that digital archives have the potential to broaden the methodological scope of discourse studies and increase the overall significance of the field.

  4. Development of a field measurement methodology for studying the thermal indoor environment in hybrid GEOTABS buildings

    DEFF Research Database (Denmark)

    Kazanci, Ongun Berk; Khovalyg, Dolaana; Olesen, Bjarne W.

    2018-01-01

    … buildings. The three demonstration buildings were an office building in Luxembourg, an elderly care home in Belgium, and an elementary school in the Czech Republic. All of these buildings are equipped with hybrid GEOTABS systems; however, they vary in size and function, which requires a unique measurement methodology for studying them. These buildings already have advanced Building Management Systems (BMS); however, a more detailed measurement plan was needed for the purposes of the project to document the current performance of these systems regarding thermal indoor environment and energy performance, and to be able to document the improvements after the implementation of the MPC. This study provides the details of the field measurement methodology developed for each of these buildings to study the indoor environmental quality (IEQ) in detail. The developed measurement methodology can be applied to other...

  5. Development of a low-level waste risk methodology

    International Nuclear Information System (INIS)

    Fisher, J.E.; Falconer, K.L.

    1984-01-01

    A probabilistic risk assessment method is presented for performance evaluation of low-level waste disposal facilities. The associated program package calculates the risk associated with postulated radionuclide release and transport scenarios. Risk is computed as the mathematical product of two statistical variables: the dose consequence of a given release scenario, and its occurrence probability. A sample risk calculation is included which demonstrates the method. This PRA method will facilitate evaluation of facility performance, including identification of high risk scenarios and their mitigation via optimization of site parameters. The method is intended to be used in facility licensing as a demonstration of compliance with the performance objectives set forth in 10 CFR Part 61, or in corresponding state regulations. The Low-Level Waste Risk Methodology is being developed under sponsorship of the Nuclear Regulatory Commission
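
    The risk definition quoted above (dose consequence times occurrence probability, aggregated over postulated release-and-transport scenarios) amounts to a very small calculation. The sketch below shows that aggregation with entirely hypothetical scenario names, probabilities and doses; it is not data from the methodology.

        # Hedged sketch: expected risk as the probability-weighted sum of scenario doses.
        # Scenario names, probabilities (per year) and dose consequences (Sv) are hypothetical.
        scenarios = [
            {"name": "intruder well drilling", "prob": 1e-4, "dose_sv": 5e-3},
            {"name": "cap erosion / leaching", "prob": 1e-2, "dose_sv": 2e-4},
            {"name": "flooding of trenches",   "prob": 1e-3, "dose_sv": 1e-3},
        ]

        risk = sum(s["prob"] * s["dose_sv"] for s in scenarios)   # Sv per year
        for s in scenarios:
            print(f'{s["name"]:28s} risk contribution: {s["prob"] * s["dose_sv"]:.2e} Sv/yr')
        print(f"total expected dose (risk): {risk:.2e} Sv/yr")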

  6. Large shaft development test plan

    International Nuclear Information System (INIS)

    Krug, A.D.

    1984-03-01

    This test plan proposes the conduct of shaft liner tests as part of the large shaft development test proposed for the Hanford Site in support of the repository development program. The objectives of these tests are to develop techniques for measuring liner alignment (straightness), both construction assembly alignment and downhole cumulative alignment, and to assess the alignment information as a real time feedback to aid the installation procedure. The test plan is based upon installing a 16 foot ID shaft liner into a 20 foot diameter shaft to a depth of 1000 feet. This test plan is considered to be preliminary in that it was prepared as input for the decision to determine if development testing is required in this area. Should the decision be made to proceed with development testing, this test plan shall be updated and revised. 6 refs., 2 figs

  7. Urban Agglomerations in Regional Development: Theoretical, Methodological and Applied Aspects

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Shmidt

    2016-09-01

    The article focuses on the analysis of a major process of modern socio-economic development: the functioning of urban agglomerations. A short background of the economic literature on this phenomenon is given, covering the traditional conceptions (the concentration of urban types of activities, the grouping of urban settlements by intensive production and labour communications) and the modern ones (cluster theories, theories of the network society). Two methodological principles of studying agglomeration are emphasized: the principle of the unity of the spatial concentration of economic activity and the principle of compact living of the population. The positive and negative effects of agglomeration in the economic and social spheres are studied. It is therefore concluded that agglomeration is helpful when it brings agglomerative economies (the positive benefits from it exceed the additional costs). A methodology for examining an urban agglomeration and its role in regional development is offered. The approbation of this methodology on the example of Chelyabinsk and the Chelyabinsk region has allowed a comparative analysis of the regional centre and the whole region by the main socio-economic indexes under static and dynamic conditions, and conclusions to be drawn on the position of the city and the region based on such socio-economic indexes as the average monthly nominal accrued wage, the cost of fixed assets, investments into fixed capital, new housing supply, retail turnover, and the volume of self-produced shipped goods, works and services performed in the region. The study also analyses a launching site of the Chelyabinsk agglomeration, revealing the main characteristics of the core of the agglomeration in Chelyabinsk (structural features, population, level of centralization of the core) as well as of the Chelyabinsk agglomeration in general (coefficient of agglomeration

  8. Development of performance assessment methodology for nuclear waste isolation in geologic media

    International Nuclear Information System (INIS)

    Bonano, E.J.; Chu, M.S.Y.; Cranwell, R.M.; Davis, P.A.

    1985-01-01

    The burial of nuclear wastes in deep geologic formations as a means for their disposal is an issue of significant technical and social impact. The analysis of the processes involved can be performed only with reliable mathematical models and computer codes, as opposed to conducting experiments, because the associated time scales are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium. These are ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the US Nuclear Regulatory Commission. The approach followed consists of a description of the overall system (waste, facility, and site), scenario selection and screening, consequence modeling (source term, ground-water flow, radionuclide transport, biosphere transport, and health effects), and uncertainty and sensitivity analysis
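
    As a greatly simplified illustration of one link in the consequence-modelling chain (ground-water flow feeding radionuclide transport), the snippet below estimates the fraction of a radionuclide surviving plug-flow transport along a single path, accounting only for sorption retardation and radioactive decay. The parameter values are hypothetical, and dispersion and other processes handled by the real codes are ignored.

        import math

        def decay_attenuation(path_length_m, darcy_velocity_m_per_yr, porosity,
                              bulk_density_kg_per_m3, kd_m3_per_kg, half_life_yr):
            """Fraction of the released activity surviving plug-flow transport
            along one path, accounting for sorption retardation and decay."""
            pore_velocity = darcy_velocity_m_per_yr / porosity
            retardation = 1.0 + bulk_density_kg_per_m3 * kd_m3_per_kg / porosity
            travel_time = path_length_m * retardation / pore_velocity   # years
            decay_const = math.log(2.0) / half_life_yr
            return math.exp(-decay_const * travel_time)

        # hypothetical path: 500 m, 0.1 m/yr Darcy flux, 10% porosity,
        # Kd = 1e-3 m3/kg, a 30-year half-life nuclide
        print(decay_attenuation(500, 0.1, 0.10, 1800, 1e-3, 30.0))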

  9. Comprehensive Psychopathological Assessment Based on the Association for Methodology and Documentation in Psychiatry (AMDP) System: Development, Methodological Foundation, Application in Clinical Routine, and Research

    Science.gov (United States)

    Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang

    2017-01-01

    The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, both in initial as well as further training and specialization in their fields, this particular aspect of their work receives scanty attention only. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been in existence and available as a tool to serve precisely the purpose of offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was based on the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in the context of investigations into a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important also in clinical practice and, today, represents the most used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper intends to offer an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion will in turn focus on the System’s underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored. PMID:28439242

  10. Methodological principles to study formation and development of floristic law in Ukraine

    Directory of Open Access Journals (Sweden)

    А. К. Соколова

    2014-06-01

    The paper investigates the problems associated with determining the methods used to study the establishment of floristic law in Ukraine. It examines the types of methods and establishes their interrelation and functional value. In addition, it analyzes the system of methodological grounds for the development of ecological and floristic law and provides additional ones.

  11. The impact of methodology in innovation measurement

    Energy Technology Data Exchange (ETDEWEB)

    Wilhelmsen, L.; Bugge, M.; Solberg, E.

    2016-07-01

    Innovation surveys and rankings such as the Community Innovation Survey (CIS) and Innovation Union Scoreboard (IUS) have developed into influential diagnostic tools that are often used to categorize countries according to their innovation performance and to legitimise innovation policies. Although a number of ongoing processes are seeking to improve existing frameworks for measuring innovation, there are large methodological differences across countries in the way innovation is measured. This causes great uncertainty regarding a) the coherence between data from innovation surveys, b) actual innovativeness of the economy, and c) the validity of research based on innovation data. Against this background we explore empirically how different survey methods for measuring innovation affect reported innovation performance. The analysis is based on a statistical exercise comparing the results from three different methodological versions of the same survey for measuring innovation in the business enterprise sector in Norway. We find striking differences in reported innovation performance depending on how the surveys are carried out methodologically. The paper concludes that reported innovation performance is highly sensitive to and strongly conditioned by methodological context. This represents a need for increased caution and awareness around data collection and research based on innovation data, and not least in terms of aggregation of data and cross-country comparison. (Author)

  12. Development of Large Sample Neutron Activation Technique for New Applications in Thailand

    International Nuclear Information System (INIS)

    Laoharojanaphand, S.; Tippayakul, C.; Wonglee, S.; Channuie, J.

    2018-01-01

    The development of Large Sample Neutron Activation Analysis (LSNAA) in Thailand is presented in this paper. The technique was first developed with rice as the test sample. The Thai Research Reactor-1/Modification 1 (TRR-1/M1) was used as the neutron source. The first step was to select and characterize an appropriate irradiation facility for the research. An out-core irradiation facility (A4 position) was attempted first. The results obtained with the A4 facility were then used as guides for the subsequent experiments with the thermal column facility. The characterization of the thermal column was performed with Cu wire to determine the spatial flux distribution without and with the rice sample. The flux depression without the rice sample was observed to be less than 30%, while the flux depression with the rice sample increased to about 60%. Flux monitors internal to the rice sample were used to determine the average flux over the sample. The gamma self-shielding effect during gamma measurement was corrected using Monte Carlo simulation: the ratio between the efficiencies of the volume source and the point source for each energy point was calculated with the MCNPX code. The research team adopted the k0-NAA methodology to calculate the element concentrations. The k0-NAA program developed by the IAEA was set up to simulate the conditions of the irradiation and measurement facilities used in this research. The element concentrations in the bulk rice sample were then calculated taking into account the flux depression and gamma efficiency corrections. At the moment, the results still show large discrepancies with the reference values; however, more research on validation will be performed to identify the sources of error. Moreover, this LSNAA technique was introduced for the activation analysis of the IAEA archaeological mock-up. The results are provided in this report. (author)
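
    As a purely illustrative aid to the corrections mentioned above (and not the authors' code), the sketch below shows how a point-source peak efficiency, an MCNPX-style volume-to-point efficiency ratio and an average flux-depression factor might be combined when converting a net peak area from a bulk sample into a specific activity. All function names and numbers are hypothetical placeholders.

    # Illustrative sketch only: combining a point-source detector efficiency with a
    # volume/point efficiency ratio and an average flux-depression factor for a
    # bulk rice sample. Numbers below are hypothetical placeholders.

    def corrected_efficiency(eff_point, volume_to_point_ratio):
        """Full-energy peak efficiency for the bulk sample at one gamma energy."""
        return eff_point * volume_to_point_ratio

    def corrected_specific_activity(net_peak_counts, live_time_s, mass_g,
                                    eff_point, volume_to_point_ratio,
                                    flux_depression_factor):
        """Net peak counts -> decay rate per gram, corrected for geometry and flux depression."""
        eff = corrected_efficiency(eff_point, volume_to_point_ratio)
        activity = net_peak_counts / (live_time_s * eff)   # decays per second in the sample
        activity /= flux_depression_factor                 # undo the average in-sample flux loss
        return activity / mass_g

    # Example: a hypothetical 1332 keV peak measured from a 1 kg rice sample.
    print(corrected_specific_activity(net_peak_counts=5.0e4, live_time_s=3600,
                                      mass_g=1000.0, eff_point=1.2e-3,
                                      volume_to_point_ratio=0.85,
                                      flux_depression_factor=0.7))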

  13. The use of agile systems development methodologies in the telecommunication industry in South Africa / B.M. Mazengera

    OpenAIRE

    Mazengera, Bruce Mwai Analinafe

    2009-01-01

    Over the last decade, systems development professionals have recognised the need to use agile systems development methodologies (ASDMs) in the telecommunication industry. This is partly due to the barriers identified by Mansurov (2000), which suggest that the use of agile methodologies in the telecommunication industry would reduce time-to-market. In the South African context, the industry has cemented its position as a major driving force of the economy as a whole. The industry's...

  14. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    Energy Technology Data Exchange (ETDEWEB)

    1987-04-01

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  15. Reliability-Based Optimal Design for Very Large Floating Structure

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shu-hua(张淑华); FUJIKUBO Masahiko

    2003-01-01

    Costs and losses induced by possible future extreme environmental conditions, and the difficulty of repairing post-yielding damage, strongly suggest that such damage should be properly considered in design rather than designing only for the prevention of loss of life. This can be addressed through the development of a design methodology that balances the initial cost of the very large floating structure (VLFS) against the expected potential losses resulting from future extreme wave-induced structural damage. Here, the development of a methodology for determining an optimal, cost-effective design is presented and applied to a VLFS located in Tokyo Bay. Optimal design criteria are determined based on the total expected life-cycle cost and the acceptable damage probability and curvature of the structure, and a set of structural sizes is obtained. The methodology and applications require expressions for the initial cost and the expected life-cycle damage cost as functions of the design variables. This study includes the methodology, the total life-cycle cost function, structural damage modeling, and reliability analysis.
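
    The cost-balancing idea described above can be summarized, under simplifying assumptions, as choosing the design that minimizes the expected total life-cycle cost (initial cost plus damage probability times damage cost) while respecting an acceptable damage probability. The sketch below illustrates that selection over a few hypothetical candidate sizes; it is not the paper's actual cost model or reliability analysis.

    # Hedged sketch: pick the candidate structural size with the lowest expected
    # total life-cycle cost, subject to an acceptable damage probability.
    # All candidate figures are hypothetical.

    def expected_total_cost(initial_cost, damage_probability, damage_cost):
        return initial_cost + damage_probability * damage_cost

    # (deck depth in m, initial cost, damage probability Pf, expected damage cost)
    candidates = [
        (2.0, 1.00e9, 5e-3, 4e9),
        (2.5, 1.15e9, 1e-3, 4e9),
        (3.0, 1.35e9, 2e-4, 4e9),
    ]
    acceptable_pf = 2e-3

    feasible = [c for c in candidates if c[2] <= acceptable_pf]
    best = min(feasible, key=lambda c: expected_total_cost(c[1], c[2], c[3]))
    print("optimal size:", best[0], "expected cost:",
          expected_total_cost(best[1], best[2], best[3]))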

  16. Methodology to develop a training program as a tool for energy management

    Directory of Open Access Journals (Sweden)

    Mónica Rosario Berenguer-Ungaro

    2017-12-01

    Full Text Available The paper presents a methodology for developing a training program to improve the labor skills that support the efficient use of energy resources. The aim is to make training timely, to meet training needs as they arise, and to place the trainee at the center of the process. It is based on the training-action approach, the action-research method, and Kirkpatrick's model for evaluating training, which assesses four levels: reaction, learning, behavior and results. The methodology is structured in three stages: (1) diagnosis of knowledge, (2) intervention based on the results, and (3) evaluation and feedback for continuous improvement. Objectives and implementation tools are identified for each stage. Evaluation cuts across the entire program, and it is through evaluation that decisions for feedback loops are taken.

  17. Development of a methodology for assessing the safety of embedded software systems

    Science.gov (United States)

    Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.

    1993-01-01

    A Dynamic Flowgraph Methodology (DFM) based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety-critical software functions.

  18. Methodology for quantitative evaluation of diagnostic performance. Project III

    International Nuclear Information System (INIS)

    Metz, C.E.

    1985-01-01

    Receiver Operating Characteristic (ROC) methodology is now widely recognized as the most satisfactory approach to the problem of measuring and specifying the performance of a diagnostic procedure. The primary advantage of ROC analysis over alternative methodologies is that it separates differences in diagnostic accuracy that are due to actual differences in discrimination capacity from those that are due to decision threshold effects. Our effort during the past year has been devoted to developing digital computer programs for fitting ROC curves to diagnostic data by maximum likelihood estimation and to developing meaningful and valid statistical tests for assessing the significance of apparent differences between measured ROC curves. FORTRAN programs previously written here for ROC curve fitting and statistical testing have been refined to make them easier to use and to allow them to be run on a large variety of computer systems. We have also attempted to develop two new curve-fitting programs: one for conventional ROC data that assumes a different functional form for the ROC curve, and one that can be used for "free-response" ROC data. Finally, we have cooperated with other investigators to apply our techniques to analyze ROC data generated in clinical studies, and we have sought to familiarize the medical community with the advantages of ROC methodology. 36 refs
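
    For readers unfamiliar with ROC analysis, the short sketch below computes an empirical ROC curve and its area from hypothetical confidence ratings. It only illustrates what the fitted curves quantify; the programs described above fit smooth (e.g., binormal) ROC models by maximum likelihood, which this sketch does not reproduce.

    # Illustrative only: empirical ROC curve and area under it from synthetic ratings.
    import numpy as np

    def empirical_roc(scores_diseased, scores_normal):
        thresholds = np.unique(np.concatenate([scores_diseased, scores_normal]))[::-1]
        tpf = [(scores_diseased >= t).mean() for t in thresholds]   # sensitivity
        fpf = [(scores_normal >= t).mean() for t in thresholds]     # 1 - specificity
        return np.array([0.0] + fpf + [1.0]), np.array([0.0] + tpf + [1.0])

    rng = np.random.default_rng(0)
    diseased = rng.normal(1.5, 1.0, 200)   # hypothetical confidence ratings, actually positive
    normal = rng.normal(0.0, 1.0, 200)     # hypothetical ratings, actually negative
    fpf, tpf = empirical_roc(diseased, normal)
    auc = np.trapz(tpf, fpf)               # area under the empirical curve
    print(f"empirical AUC = {auc:.3f}")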

  19. Simulation and Optimization Methodologies for Military Transportation Network Routing and Scheduling and for Military Medical Services

    National Research Council Canada - National Science Library

    Rodin, Ervin Y

    2005-01-01

    The purpose of this present research was to develop a generic model and methodology for analyzing and optimizing large-scale air transportation networks including both their routing and their scheduling...

  20. Application of Master Curve Methodology for Structural Integrity Assessments of Nuclear Components

    Energy Technology Data Exchange (ETDEWEB)

    Sattari-Far, Iradj [Det Norske Veritas, Stockholm (Sweden); Wallin, Kim [VTT, Esbo (Finland)

    2005-10-15

    The objective was to perform an in-depth investigation of the Master Curve methodology and, based on this method, to develop a procedure for fracture assessments of nuclear components. The project has sufficiently illustrated the capabilities of the Master Curve methodology for fracture assessments of nuclear components. Within the scope of this work, the theoretical background of the methodology and its validation on small and large specimens have been studied and presented to a sufficient extent, as well as the correlations between Charpy-V data and the Master Curve T0 reference temperature in the evaluation of fracture toughness. The work gives a comprehensive report of the background theory and the different applications of the Master Curve methodology. The main results show that cleavage fracture toughness exhibits a large amount of statistical scatter in the transition region, is specimen-size dependent, and should be treated statistically rather than deterministically. The Master Curve methodology is able to make use of statistical data in a consistent way. Furthermore, the Master Curve methodology provides a more precise prediction of the fracture toughness of embrittled materials than the ASME K_IC reference curve, which often gives over-conservative results. The procedure suggested in this study, concerning the application of the Master Curve method in fracture assessments of ferritic steels in the transition and lower-shelf regions, is valid for the temperature range T0 - 50 deg C <= T <= T0 + 50 deg C. If only approximate information is required, the Master Curve may well be extrapolated outside this temperature range. The suggested procedure has also been illustrated with some examples.
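
    For orientation, the sketch below evaluates the Master Curve toughness percentiles in the form commonly given in ASTM E1921 for 1T-size specimens, K_Jc(p) = 20 + [-ln(1-p)]^(1/4) * (11 + 77*exp(0.019*(T - T0))) in MPa*sqrt(m); the standard itself should be consulted for the authoritative expressions, and the reference temperature used here is a hypothetical value.

    # Sketch of the commonly cited Master Curve percentile relation (ASTM E1921 form).
    import math

    def kjc_percentile(T, T0, p=0.5):
        """Fracture toughness K_Jc (MPa*sqrt(m)) at temperature T (deg C) for
        cumulative failure probability p, given the reference temperature T0 (deg C)."""
        return 20.0 + (-math.log(1.0 - p)) ** 0.25 * (11.0 + 77.0 * math.exp(0.019 * (T - T0)))

    T0 = -60.0                              # hypothetical reference temperature
    for T in (-110.0, -60.0, -10.0):        # within the T0 +/- 50 deg C validity window
        print(T, round(kjc_percentile(T, T0, 0.05), 1), round(kjc_percentile(T, T0, 0.5), 1))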

  1. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures at occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  2. METHODOLOGY OF RESEARCH AND DEVELOPMENT MANAGEMENT OF REGIONAL NETWORK ECONOMY

    Directory of Open Access Journals (Sweden)

    O.I. Botkin

    2007-06-01

    Full Text Available The informatization of practically all branches of the Russian regional economy and the adoption by economic actors of information and communication Internet technologies exert a huge influence on the development of economic relations in regional business: new forms of interaction between economic actors emerge, and the information and organizational structures of regional business management change. The integrated expression of these innovations is the regional network economy: an interactive environment in which social, economic and commodity-monetary relations between the economic actors of a region are carried out at high speed and with minimal transaction costs (in R. H. Coase's sense), using the interactive opportunities of the global Internet. The relevance of research into the phenomenon of the regional network economy is driven, first of all, by the need to substantiate the development of a regional network economy methodology and of mechanisms for managing its infrastructure, with the purpose of increasing the efficiency of regional business. In our opinion, the solution of these problems will be the defining factor in ensuring effective economic development and growth of the Russian regions' economies in the near future.

  3. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    Science.gov (United States)

    1995-01-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety-critical functions in high-speed rail or magnetic levitation ...

  4. Safety analysis methodology for OPR 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun

    2005-01-01

    Full text: Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology, based on the codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. Also, for non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)

  5. Calculation and evaluation methodology of the flawed pipe and the compute program development

    International Nuclear Information System (INIS)

    Liu Chang; Qian Hao; Yao Weida; Liang Xingyun

    2013-01-01

    Background: A crack in a pressurized pipe will grow gradually under alternating load, even when the load is below the fatigue strength limit. Purpose: Both the calculation and the evaluation methodology for a flawed pipe detected during in-service inspection are elaborated here, based on Elastic Plastic Fracture Mechanics (EPFM) criteria. Methods: In the computation, the interaction of flaw depth and length has been considered, and a computer program was developed in Visual C++. Results: The fluctuating loads of the Reactor Coolant System transients, the initial flaw shape, and the initial flaw orientation are all accounted for. Conclusions: The calculation and evaluation methodology presented here is an important basis for deciding whether the flawed pipe can remain in service. (authors)
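
    The abstract does not state which crack-growth law the program applies; as a hedged illustration of how a flaw might be propagated cycle by cycle under an alternating load, the sketch below uses the common Paris relation da/dN = C*(dK)^m with a simple stress-intensity estimate. The constants and geometry factor are hypothetical placeholders, not values from the evaluation methodology described above.

    # Hedged illustration, not the program described above: fatigue growth of a flaw
    # of depth 'a' under a cyclic stress range, using the Paris law
    # da/dN = C * dK^m with dK = Y * dSigma * sqrt(pi * a).
    import math

    def grow_flaw(a0_m, dsigma_mpa, cycles, C=1.0e-11, m=3.0, Y=1.12):
        a = a0_m
        for _ in range(cycles):
            dK = Y * dsigma_mpa * math.sqrt(math.pi * a)   # stress intensity range, MPa*sqrt(m)
            a += C * dK ** m                               # crack extension per cycle, m
        return a

    # Hypothetical 2 mm flaw, 120 MPa stress range, 200,000 transient cycles.
    print(grow_flaw(a0_m=0.002, dsigma_mpa=120.0, cycles=200_000))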

  6. Development and application of a methodology for identifying and characterising scenarios

    International Nuclear Information System (INIS)

    Billington, D.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed performance assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. It is intended that assessment of the base scenario would form the core of any future performance assessment. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs which are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. Variant scenarios are defined by FEPs which represent a significant perturbation to the natural system evolution, for example the occurrence of a large seismic event. A variant scenario defined by a single initiating FEP is characterised by a sequence of events. This is represented as a 'timeline' which forms the basis for modelling that scenario. To generate a variant scenario defined by two initiating FEPs, a methodology is presented for combining the timelines for the two underlying 'single-FEP' variants. The resulting series of event sequences can be generated automatically. These sequences are then reviewed, in order to reduce the number of timelines requiring detailed consideration. This is achieved in two ways: by aggregating sequences which have similar consequences in terms of safety performance; and by combining successive intervals along a timeline where appropriate. In the context of a performance assessment, the aim is to determine the conditional risk and appropriate weight for each

  7. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formations; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection, waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology and the experience gained from methodology demonstrations, and provides an overview of the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties

  8. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    International Nuclear Information System (INIS)

    Barabas, Roberta de C.; Sabundjian, Gaiane

    2015-01-01

    When compared to other energy sources such as fossil fuels (coal, oil, and gas), nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme and consequently improve public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. Results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this is an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  9. Evaluation of probable maximum snow accumulation: Development of a methodology for climate change studies

    Science.gov (United States)

    Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick

    2016-06-01

    Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so the ensuing spring PMF is a reasonable estimation. This is of particular importance in times of climate change (CC) since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology; precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool to estimate the relative change of the PMSA. Absolute results are in the same order of magnitude as those obtained with the traditional method and observed data; but are also found to depend strongly on the climate projection used and show spatial variability.
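
    As a simplified illustration of the moisture-maximization concept at the core of the methodology (the paper's non-stationary treatment of monthly maximum precipitable water is not reproduced here), the sketch below scales hypothetical snowstorm events by the ratio of the monthly maximum precipitable water to the event precipitable water and takes the largest result as a PMSA estimate.

    # Hedged sketch of moisture maximization for snowstorm events; all values are hypothetical.

    def maximized_snowfall(event_snowfall_mm, event_precipitable_water_mm,
                           monthly_max_precipitable_water_mm):
        """Scale an observed/simulated snowfall event by the moisture-maximization ratio."""
        ratio = monthly_max_precipitable_water_mm / event_precipitable_water_mm
        return event_snowfall_mm * ratio

    # Hypothetical snowstorms: (snowfall over the storm in mm w.e., precipitable water in mm)
    events = [(45.0, 8.0), (60.0, 12.0), (38.0, 6.5)]
    pw_monthly_max = 15.0
    pmsa_estimate = max(maximized_snowfall(s, pw, pw_monthly_max) for s, pw in events)
    print(f"PMSA estimate: {pmsa_estimate:.1f} mm water equivalent")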

  10. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    Energy Technology Data Exchange (ETDEWEB)

    Barabas, Roberta de C.; Sabundjian, Gaiane, E-mail: robertabarabas@usp.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    When compared to other energy sources such as fossil fuels (coal, oil, and gas), nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme and consequently improve public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. Results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this is an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  11. Quality assurance in a large research and development laboratory

    International Nuclear Information System (INIS)

    Neill, F.H.

    1980-01-01

    Developing a quality assurance program for a large research and development laboratory provided a unique opportunity for innovative planning. The quality assurance program that emerged has been tailored to meet the requirements of several sponsoring organizations and contains the flexibility for experimental programs ranging from large engineering-scale development projects to bench-scale basic research programs

  12. Turbofan Engine Core Compartment Vent Aerodynamic Configuration Development Methodology

    Science.gov (United States)

    Hebert, Leonard J.

    2006-01-01

    This paper presents an overview of the design methodology used in the development of the aerodynamic configuration of the nacelle core compartment vent for a typical Boeing commercial airplane together with design challenges for future design efforts. Core compartment vents exhaust engine subsystem flows from the space contained between the engine case and the nacelle of an airplane propulsion system. These subsystem flows typically consist of precooler, oil cooler, turbine case cooling, compartment cooling and nacelle leakage air. The design of core compartment vents is challenging due to stringent design requirements, mass flow sensitivity of the system to small changes in vent exit pressure ratio, and the need to maximize overall exhaust system performance at cruise conditions.

  13. Methods for assessing the socioeconomic impacts of large-scale resource developments: implications for nuclear repository siting

    International Nuclear Information System (INIS)

    Murdock, S.H.; Leistritz, F.L.

    1983-03-01

    This report provides an overview of the major methods presently available for assessing the socioeconomic impacts of large-scale resource developments and discusses the implications and applications of such methods for nuclear-waste-repository siting. The report: (1) summarizes the conceptual approaches underlying, and the methodological alternatives for, the conduct of impact assessments in each substantive area, and enumerates the advantages and disadvantages of each alternative; (2) describes factors related to the impact-assessment process, impact events, and the characteristics of rural areas that affect the magnitude and distribution of impacts and the assessment of impacts in each area; (3) provides a detailed review of those methodologies actually used in impact assessment for each area, describes the advantages and problems encountered in the use of each method, and identifies the frequency of use and the general level of acceptance of each technique; and (4) summarizes the implications of each area of projection for the repository-siting process and the applicability of the methods to the special and standard features of repositories, and makes general recommendations concerning specific methods and procedures that should be incorporated in assessments for siting areas.

  14. Fast crawling methods of exploring content distributed over large graphs

    KAUST Repository

    Wang, Pinghui; Zhao, Junzhou; Lui, John C. S.; Towsley, Don; Guan, Xiaohong

    2018-01-01

    Despite recent efforts to estimate topology characteristics of large graphs (e.g., online social networks and peer-to-peer networks), little attention has been given to developing a formal crawling methodology to characterize the vast amount of content

  15. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    Science.gov (United States)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. It has therefore become an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel WM methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and
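
    As a hedged example of the kind of first-principles balance the quantitative part of such a methodology builds on (a textbook steady-state mixing model, not the authors' dynamic models), the sketch below estimates the contaminant concentration that drag-out sustains in a single, perfectly mixed running rinse tank.

    # Hedged sketch: steady-state mass balance for one running rinse tank.
    # Mass carried in by drag-out equals mass carried out with overflow plus drag-out.

    def rinse_concentration(c_process_g_per_l, dragout_l_per_h, rinse_flow_l_per_h):
        """Steady-state contaminant concentration in the rinse tank (g/L)."""
        return c_process_g_per_l * dragout_l_per_h / (rinse_flow_l_per_h + dragout_l_per_h)

    # Hypothetical plating line figures.
    c_rinse = rinse_concentration(c_process_g_per_l=150.0,
                                  dragout_l_per_h=0.5,
                                  rinse_flow_l_per_h=200.0)
    print(f"rinse tank concentration ~ {c_rinse:.2f} g/L")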

  16. Nirex methodology for scenario and conceptual model development. An international peer review

    International Nuclear Information System (INIS)

    1999-06-01

    Nirex has responsibilities for nuclear waste management in the UK. The company's top level objectives are to maintain technical credibility on deep disposal, to gain public acceptance for a deep geologic repository, and to provide relevant advice to customers on the safety implications of their waste packaging proposals. Nirex utilizes peer reviews as appropriate to keep its scientific tools up-to-date and to periodically verify the quality of its products. The NEA formed an International Review Team (IRT) consisting of four internationally recognised experts plus a member of the NEA Secretariat. The IRT performed an in-depth analysis of five Nirex scientific reports identified in the terms of reference of the review. The review was to primarily judge whether the Nirex methodology provides an adequate framework to support the building of a future licensing safety case. Another objective was to judge whether the methodology could aid in establishing a better understanding, and, ideally, enhance acceptance of a repository among stakeholders. Methodologies for conducting safety assessments include at a very basic level the identification of features, events, and processes (FEPs) relevant to the system at hand, their convolution in scenarios for analysis, and the formulation of conceptual models to be addressed through numerical modelling. The main conclusion of the IRT is that Nirex has developed a potentially sound methodology for the identification and analysis of FEPs and for the identification of conceptual model needs and model requirements. The work is still in progress and is not yet complete. (R.P.)

  17. Integrated structure/control design - Present methodology and future opportunities

    Science.gov (United States)

    Weisshaar, T. A.; Newsom, J. R.; Zeiler, T. A.; Gilbert, M. G.

    1986-01-01

    Attention is given to current methodology applied to the integration of the optimal design process for structures and controls. Multilevel linear decomposition techniques proved to be most effective in organizing the computational efforts necessary for ISCD (integrated structures and control design) tasks. With the development of large orbiting space structures and actively controlled, high performance aircraft, there will be more situations in which this concept can be applied.

  18. Development of a methodology for the detection of hospital financial outliers using information systems.

    Science.gov (United States)

    Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki

    2014-01-01

    Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers. Copyright © 2013 John Wiley & Sons, Ltd.
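
    One common way to realize "distance between cases in multi-dimensional space" is the Mahalanobis distance with a chi-square cut-off; the sketch below uses that assumed form for illustration and does not reproduce the paper's own case model or distance definition.

    # Illustrative sketch (assumed form): flag cases whose vector of financial indices
    # lies far from the rest, using the squared Mahalanobis distance.
    import numpy as np
    from scipy import stats

    def financial_outliers(X, alpha=0.01):
        """X: (n_cases, n_indices) array of financial indices. Returns a boolean mask."""
        mean = X.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
        diff = X - mean
        d2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)   # squared Mahalanobis distance
        return d2 > stats.chi2.ppf(1.0 - alpha, df=X.shape[1])

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))       # hypothetical standardized indices for 200 cases
    X[0] = [6.0, -5.0, 7.0]             # an injected outlier case
    print(np.where(financial_outliers(X))[0])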

  19. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

    Full Text Available Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed; they are classified, according to their differing viewpoints on problem-solving, as first generation and second generation. Accident analysis can be carried out using three types of techniques: sequential techniques, epidemiological techniques and systemic techniques, with marine accidents falling under the epidemiological techniques. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model, which are applied to assess marine accidents. Furthermore, the MOP model can effectively describe the relationships among the other factors that affect accidents, whereas the HEART methodology focuses only on human factors.

  20. Cognitive models of executive functions development: methodological limitations and theoretical challenges

    Directory of Open Access Journals (Sweden)

    Florencia Stelzer

    2014-01-01

    Full Text Available Executive functions (EF) have been defined as a series of higher-order cognitive processes which allow the control of thought, behavior and affect in the service of goal achievement. Such processes show a lengthy postnatal development and mature completely only by the end of adolescence. In this article we review some of the main models of EF development during childhood. The aim of this work is to describe the state of the art on the topic, identifying the main theoretical difficulties and methodological limitations associated with the different proposed paradigms. Finally, some suggestions are given for coping with such difficulties, emphasizing that the development of an ontology of EF could be a viable alternative for countering them. We believe that future research should direct its efforts toward the development of such an ontology.

  1. Development and application of a decision methodology for the planning of nuclear research and development in Saudi Arabia

    International Nuclear Information System (INIS)

    Abulfaraj, W.H.

    1983-01-01

    The thesis adapts two formal decision methodologies for the planning and development of a Nuclear Research Center for the Kingdom of Saudi Arabia. The methodologies are useful for selecting among alternative nuclear energy strategies. Multiattribute utility theory (MUT) and fuzzy set theory (FST) are selected to accommodate decision makers' preferences and to support rapid decisions. MUT is employed to evaluate four appropriate research center facilities and to determine the optimal choice among them in order to meet the needs of the proposed center. FST is used to handle the site selection decision for the center. Procurement and siting decisions for equipment in Saudi Arabia are based on the demonstrated performance of operating units. Certain procedures discussed in detail in the dissertation reflect this modus operandi. 106 refs
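
    A minimal sketch of the additive scoring that multiattribute utility evaluations of this kind typically use is shown below; the attributes, weights and single-attribute utilities are hypothetical, since the abstract does not give the thesis' actual values.

    # Hedged sketch of additive multiattribute utility ranking of candidate facilities.

    def additive_utility(weights, single_utilities):
        """weights sum to 1; each single-attribute utility is scaled to [0, 1]."""
        return sum(w * u for w, u in zip(weights, single_utilities))

    # Hypothetical facilities and their single-attribute utilities.
    facilities = {
        "research reactor":   [0.9, 0.6, 0.5, 0.7],
        "accelerator lab":    [0.6, 0.8, 0.7, 0.6],
        "radioisotope plant": [0.5, 0.7, 0.9, 0.8],
    }
    weights = [0.4, 0.3, 0.2, 0.1]    # e.g. research value, cost, staffing, safety (hypothetical)
    ranked = sorted(facilities,
                    key=lambda f: additive_utility(weights, facilities[f]), reverse=True)
    print(ranked)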

  2. Methodology for assessment of undiscovered oil and gas resources for the 2008 Circum-Arctic Resource Appraisal

    Science.gov (United States)

    Charpentier, Ronald R.; Moore, Thomas E.; Gautier, D.L.

    2017-11-15

    The methodological procedures used in the geologic assessments of the 2008 Circum-Arctic Resource Appraisal (CARA) were based largely on the methodology developed for the 2000 U.S. Geological Survey World Petroleum Assessment. The main variables were probability distributions for numbers and sizes of undiscovered accumulations with an associated risk of occurrence. The CARA methodology expanded on the previous methodology in providing additional tools and procedures more applicable to the many Arctic basins that have little or no exploration history. Most importantly, geologic analogs from a database constructed for this study were used in many of the assessments to constrain numbers and sizes of undiscovered oil and gas accumulations.

  3. Monitoring sustainable biomass flows : General methodology development

    NARCIS (Netherlands)

    Goh, Chun Sheng; Junginger, Martin; Faaij, André

    Transition to a bio-based economy will create new demand for biomass, e.g. the increasing use of bioenergy, but the impacts on existing markets are unclear. Furthermore, there is a growing public concern on the sustainability of biomass. This study proposes a methodological framework for mapping

  4. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor) which has been developed by Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, further validated final safety analysis methodology will be developed. Current licensing safety analysis methodology of the Westinghouse and KSNPP PWRs operating and under development in Korea as well as the Russian licensing safety analysis methodology for the integral reactors have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. Validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended for the nuclear regulatory authority to establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  5. Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.

    Science.gov (United States)

    Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M

    2017-07-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.

  6. Large Instrument Development for Radio Astronomy

    Science.gov (United States)

    Fisher, J. Richard; Warnick, Karl F.; Jeffs, Brian D.; Norrod, Roger D.; Lockman, Felix J.; Cordes, James M.; Giovanelli, Riccardo

    2009-03-01

    This white paper offers cautionary observations about the planning and development of new, large radio astronomy instruments. Complexity is a strong cost driver so every effort should be made to assign differing science requirements to different instruments and probably different sites. The appeal of shared resources is generally not realized in practice and can often be counterproductive. Instrument optimization is much more difficult with longer lists of requirements, and the development process is longer and less efficient. More complex instruments are necessarily further behind the technology state of the art because of longer development times. Including technology R&D in the construction phase of projects is a growing trend that leads to higher risks, cost overruns, schedule delays, and project de-scoping. There are no technology breakthroughs just over the horizon that will suddenly bring down the cost of collecting area. Advances come largely through careful attention to detail in the adoption of new technology provided by industry and the commercial market. Radio astronomy instrumentation has a very bright future, but a vigorous long-term R&D program not tied directly to specific projects needs to be restored, fostered, and preserved.

  7. Proposed Methodology for Developing a National Strategy for Human Resource Development: Lessons Learned from a NNSA Workshop

    International Nuclear Information System (INIS)

    Elkhamri, Oksana O.; Frazar, Sarah L.; Essner, Jonathan; Vergino, Eileen; Bissani, Mo; Apt, Kenneth E.; McClelland-Kerr, John; Mininni, Margot; VanSickle, Matthew; Kovacic, Donald

    2009-01-01

    This paper describes a recent National Nuclear Security Administration (NNSA) workshop on Human Resource Development, which focused on a potential methodology for developing a National Human Resource strategy for nuclear power in emerging nuclear states. The need for indigenous human resource development (HRD) has been singled out as a key milestone by the International Atomic Energy Agency (IAEA) in its 2007 Milestones document. A number of countries considering nuclear energy have reiterated this need for experts and specialists to support a national nuclear program that is sustainable and secure. Many have expressed concern over how best to assure the long-term availability of crucial human resources, how to approach the workforce planning process, and how to determine the key elements of developing a national strategy.

  8. Development of a novel set of criteria to select methodology for designing product service systems

    Directory of Open Access Journals (Sweden)

    Tuananh Tran

    2016-04-01

    Full Text Available This paper proposes eight groups of twenty-nine scoring criteria that can help designers and practitioners compare and select an appropriate methodology for a given problem in designing product service systems (PSS). PSS has been researched for more than a decade and is now becoming more and more popular in academia as well as industry. Despite that, the adoption of PSS is still limited relative to its potential. One of the main reasons is that designing a PSS is itself a challenge. Designers and developers face difficulties in choosing appropriate PSS design methodologies for their projects so that they can design effective PSS offerings. By proposing eight groups of twenty-nine scoring criteria, this paper enables a step-by-step process for identifying the most appropriate design methodology for a company's PSS problem. An example is also introduced to illustrate the use of the proposed scoring criteria and to provide a clear picture of how different design methodologies can best be utilized in terms of application.

  9. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    Science.gov (United States)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world that tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and the effective technical and human implementation of computer-based systems (ETHICS). The characteristics of these methodologies are examined to assess the possibility of a co-design or combination of them for developing an information system. To this end, four different aspects are analyzed: social versus technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using a quantitative method is analyzed in order to examine the possibility of co-design in terms of these factors. The paper concludes that RAD and ETHICS are suitable for co-design and offers some suggestions for carrying it out.

  10. Developing the P2/6 methodology [to assess the security capability of modern distributed generation

    Energy Technology Data Exchange (ETDEWEB)

    Allan, Ron; Strbac, Goran; Djapic, Predrag; Jarrett, Keith [Manchester Univ. Inst. of Science and Technology, Manchester (United Kingdom)

    2004-04-29

    The main objective of the project was to use the methodology developed in the previous Methodology project (ETSU/FES Project K/EL/00287) to assess the security capability of modern distributed generation, in order to review Table 2 and the related text of Engineering Recommendation P2/5 and to propose information and results that could be used to create a new P2/6 that takes into account modern types of generating units, unit numbers, unit availabilities and capacities. Technical issues raised in the previous study but held over until this project include: treatment of single-unit generation systems; effect of the shape of load duration curves; persistence of intermittent generation, Tm; time resolution of intermittent generation output profiles; ride-through capability; and risk of loss of supply. Three main ways of implementing the methodology were recommended: look-up table(s), graphical, and computer program. The specification for the computer program was to produce a simple spreadsheet application package that an engineer with a reasonable knowledge of the approach could use. This prototype package has been developed in conjunction with Workstream 3. Its objective is to calculate the capability contribution to security of supply from distributed generation connected to a particular demand group. The application has been developed using Microsoft Excel and Visual Basic for Applications. New tables for inclusion in P2/6 are included. (UK)

  11. HRS Clinical Document Development Methodology Manual and Policies: Executive summary.

    Science.gov (United States)

    Indik, Julia H; Patton, Kristen K; Beardsall, Marianne; Chen-Scarabelli, Carol A; Cohen, Mitchell I; Dickfeld, Timm-Michael L; Haines, David E; Helm, Robert H; Krishnan, Kousik; Nielsen, Jens Cosedis; Rickard, John; Sapp, John L; Chung, Mina

    2017-10-01

    The Heart Rhythm Society (HRS) has been developing clinical practice documents in collaboration and partnership with other professional medical societies since 1996. The HRS formed a Scientific and Clinical Documents Committee (SCDC) with the sole purpose of managing the development of these documents from conception through publication. The SCDC oversees the process for developing clinical practice documents, with input and approval from the HRS Executive Committee and the Board of Trustees. As of May 2017, the HRS has produced more than 80 publications with other professional organizations. This process manual is produced to publicly and transparently declare the standards by which the HRS develops clinical practice documents, which include clinical practice guidelines, expert consensus statements, scientific statements, clinical competency statements, task force policy statements, and proceedings statements. The foundation for this process is informed by the Institute of Medicine's standards for developing trustworthy clinical practice guidelines; the new criteria from the National Guidelines Clearinghouse, effective June 2014; SCDC member discussions; and a review of guideline policies and methodologies used by other professional organizations. Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  12. Methodological Aspects in Forecasting Innovation Development of Dairy Cattle Breeding in the Region

    Directory of Open Access Journals (Sweden)

    Natal’ya Aleksandrovna Medvedeva

    2016-07-01

    Full Text Available Now that Russia is a member of the World Trade Organization, long-term forecasting has become an objectively necessary condition for choosing an effective, science-based long-term strategy for the development of dairy cattle breeding that takes intellectual and innovative characteristics into consideration. The current structure of available statistical information does not meet the modern challenges of innovation development and does not adequately reflect the trends of ongoing changes. The paper proposes a system of indicators for analyzing the status, development and prospects of dairy cattle breeding in the region; this system provides timely identification of emerging risks and threats of deviation from the specified parameters. The system includes indicators contained in current statistical reporting and new indicators of the innovation development of the industry, the quality of human capital and the level of government support. When designing the system of indicators, we used several methodological aspects of the Oslo Manual, which the Federal State Statistics Service considers to be an official methodological document concerning the collection of information about innovation activities. The structured system of indicators shifts the emphasis of the analysis from final results to the conditions and prerequisites for achieving the forecast performance indicators under WTO rules, and supports substantiated management decisions.

  13. Large superconducting coil fabrication development

    International Nuclear Information System (INIS)

    Brown, R.L.; Allred, E.L.; Anderson, W.C.; Burn, P.B.; Deaderick, R.I.; Henderson, G.M.; Marguerat, E.F.

    1975-01-01

    Toroidal fields for some fusion devices will be produced by an array of large superconducting coils. Their size, space limitation, and field requirements dictate that they be high performance coils. Once installed, accessibility for maintenance and repairs is severely restricted; therefore, good reliability is an obvious necessity. Sufficient coil fabrication will be undertaken to develop and test methods that are reliable, fast, and economical. Industrial participation will be encouraged from the outset to insure smooth transition from development phases to production phases. Initially, practice equipment for three meter bore circular coils will be developed. Oval shape coil forms will be included in the practice facility later. Equipment that is more automated will be developed with the expectation of winding faster and obtaining good coil quality. Alternate types of coil construction, methods of winding and insulating, will be investigated. Handling and assembly problems will be studied. All technology developed must be feasible for scaling up when much larger coils are needed. Experimental power reactors may need coils having six meter or larger bores

  14. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    Science.gov (United States)

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, the less developed of the two, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into practical and effective territorial management. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in this process, but few researchers are yet investing in the development of a proper methodology. The assessment methodologies published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. The current work aims at the development of an effective methodology for the assessment of as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. It is expected to attain the proper procedures in order to assess geodiversity
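
    One widely used quantitative approach (not necessarily the procedure under development here) is a grid-based richness index: for each grid cell, count the distinct classes of each abiotic layer and sum the counts. The sketch below illustrates this with hypothetical class rasters.

    # Hedged sketch: grid-based geodiversity richness index over hypothetical class rasters.
    import numpy as np

    def geodiversity_index(layers, cell=4):
        """layers: list of 2-D integer class rasters with identical shape.
        Returns a coarser grid whose value is the summed class richness per cell."""
        rows, cols = layers[0].shape
        out = np.zeros((rows // cell, cols // cell), dtype=int)
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                window = (slice(i * cell, (i + 1) * cell), slice(j * cell, (j + 1) * cell))
                out[i, j] = sum(len(np.unique(layer[window])) for layer in layers)
        return out

    rng = np.random.default_rng(2)
    lithology = rng.integers(0, 5, size=(16, 16))    # hypothetical class rasters
    landforms = rng.integers(0, 7, size=(16, 16))
    soils = rng.integers(0, 4, size=(16, 16))
    print(geodiversity_index([lithology, landforms, soils]))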

  15. Multi-objective and multi-physics optimization methodology for SFR core: application to CFV concept

    International Nuclear Information System (INIS)

    Fabbris, Olivier

    2014-01-01

    Nuclear reactor core design is a highly multidisciplinary task in which neutronics, thermal-hydraulics, fuel thermo-mechanics and the fuel cycle are involved. The problem is moreover multi-objective (several performances) and highly dimensional (several tens of design parameters). As the reference deterministic calculation codes for core characterization require significant computing resources, the classical design method is not well suited to investigating and optimizing new innovative core concepts. To cope with these difficulties, a new methodology has been developed in this thesis. Our work is based on the development and validation of simplified neutronics and thermal-hydraulics calculation schemes allowing the full characterization of a Sodium-cooled Fast Reactor core regarding both neutronics performance and behavior during thermal-hydraulic dimensioning transients. The developed methodology uses surrogate models (or meta-models) able to replace the neutronics and thermal-hydraulics calculation chain. Advanced mathematical methods for the design of experiments, and for building and validating meta-models, allow this calculation chain to be substituted by regression models with high prediction capabilities. The methodology is applied over a very large design space to a challenging core called CFV (French acronym for low void effect core) with a large gain on the sodium void effect. Global sensitivity analysis identifies the significant design parameters for the core design and its behavior during unprotected transients, which can lead to severe accidents. Multi-objective optimizations lead to alternative core configurations with significantly improved performances. Validation results demonstrate the relevance of the methodology at the pre-design stage of a Sodium-cooled Fast Reactor core. (author) [fr]
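
    The record above describes replacing an expensive neutronics/thermal-hydraulics calculation chain with meta-models trained on a design of experiments. The sketch below illustrates that general idea only; the Latin hypercube sampler, the quadratic response surface, and the placeholder expensive_model are illustrative assumptions, not the schemes or codes used in the thesis.

```python
# Minimal sketch of the meta-model idea: replace an expensive core-physics
# calculation with a cheap regression surrogate trained on a space-filling
# design of experiments.  "expensive_model" is a hypothetical stand-in.
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_dims):
    """Simple Latin hypercube design on the unit cube."""
    cut = (np.arange(n_samples) + rng.random((n_dims, n_samples))) / n_samples
    return np.array([rng.permutation(c) for c in cut]).T  # (n_samples, n_dims)

def expensive_model(x):
    """Placeholder for the reference calculation chain (assumption)."""
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 1]

def quadratic_features(x):
    """Full quadratic basis: 1, x_i, x_i * x_j."""
    n, d = x.shape
    cols = [np.ones(n)] + [x[:, i] for i in range(d)]
    cols += [x[:, i] * x[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

# Train the surrogate on a small design of experiments.
x_train = latin_hypercube(60, 2)
y_train = expensive_model(x_train)
coef, *_ = np.linalg.lstsq(quadratic_features(x_train), y_train, rcond=None)

# Validate prediction quality on independent points before using the surrogate.
x_test = rng.random((200, 2))
y_true = expensive_model(x_test)
y_pred = quadratic_features(x_test) @ coef
q2 = 1 - np.sum((y_true - y_pred) ** 2) / (np.var(y_true) * len(y_true))
print(f"predictivity coefficient Q2 ~ {q2:.3f}")
```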

  16. Further developments of multiphysics and multiscale methodologies for coupled nuclear reactor simulations

    International Nuclear Information System (INIS)

    Gomez Torres, Armando Miguel

    2011-01-01

    This doctoral thesis describes the methodological development of coupled neutron-kinetics/thermal-hydraulics codes for the design and safety analysis of reactor systems taking into account the feedback mechanisms at the fuel rod level, according to different approaches. A central part of this thesis is the development and validation of a high-fidelity simulation tool, DYNSUB, which results from the "two-way coupling" of DYN3D-SP3 and SUBCHANFLOW. It allows the determination of local safety parameters through a detailed description of the core behavior under stationary and transient conditions at the fuel rod level.
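
    As a rough illustration of what a "two-way coupling" between a neutron-kinetics solver and a thermal-hydraulics solver involves, the sketch below runs an under-relaxed fixed-point (Picard) iteration between two toy feedback functions. The function names and feedback coefficients are assumptions for illustration; this is not the DYNSUB coupling scheme.

```python
# Illustrative sketch of the fixed-point iteration a two-way coupling typically
# performs: power shapes the fuel temperature, which feeds back on the power
# (e.g. via Doppler), until both fields converge.
import numpy as np

def neutronics_solve(fuel_temp):
    """Hypothetical stand-in: power rises where fuel is colder."""
    return 1.0 + 0.2 * (900.0 - fuel_temp) / 900.0

def thermal_hydraulics_solve(power):
    """Hypothetical stand-in: fuel temperature increases with local power."""
    return 600.0 + 350.0 * power

power = np.ones(10)             # normalized power in 10 axial nodes
fuel_temp = np.full(10, 900.0)  # K, initial guess
relax = 0.5                     # under-relaxation for stability

for iteration in range(50):
    new_power = neutronics_solve(fuel_temp)
    new_temp = thermal_hydraulics_solve(new_power)
    if np.max(np.abs(new_power - power)) < 1e-8:
        break
    power = relax * new_power + (1 - relax) * power
    fuel_temp = relax * new_temp + (1 - relax) * fuel_temp

print(f"converged after {iteration} iterations, peak power = {power.max():.4f}")
```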

  17. Multi-Layer Integration Methodology for development of ICT competences in SMEs

    DEFF Research Database (Denmark)

    Nunez, Heilyn Camacho

    2013-01-01

    To modern enterprises, information technology has multiple values: it improves products, customer satisfaction and quality, facilitates administrative processes, reduces cost and improves competitiveness, among other things. Taking advantage of information and communication technologies (ICT) is a challenge for many small and medium-sized enterprises (SMEs). As the important role of SMEs in national development is clear, this paper proposes a methodology that aims to interconnect external forces that can facilitate the removal or mitigation of some of the constraints...

  18. Environmental impact statement analysis: dose methodology

    International Nuclear Information System (INIS)

    Mueller, M.A.; Strenge, D.L.; Napier, B.A.

    1981-01-01

    Standardized sections and methodologies are being developed for use in environmental impact statements (EIS) for activities to be conducted on the Hanford Reservation. Five areas for standardization have been identified: routine operations dose methodologies, accident dose methodology, Hanford Site description, health effects methodology, and socioeconomic environment for Hanford waste management activities

  19. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly driven by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers... Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 1960s) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  20. Development and Application of a Clinical Microsystem Simulation Methodology for Human Factors-Based Research of Alarm Fatigue.

    Science.gov (United States)

    Kobayashi, Leo; Gosbee, John W; Merck, Derek L

    2017-07-01

    (1) To develop a clinical microsystem simulation methodology for alarm fatigue research with a human factors engineering (HFE) assessment framework and (2) to explore its application to the comparative examination of different approaches to patient monitoring and provider notification. Problems with the design, implementation, and real-world use of patient monitoring systems result in alarm fatigue. A multidisciplinary team is developing an open-source tool kit to promote bedside informatics research and mitigate alarm fatigue. Simulation, HFE, and computer science experts created a novel simulation methodology to study alarm fatigue. Featuring multiple interconnected simulated patient scenarios with scripted timelines, "distractor" patient care tasks, and triggered true and false alarms, the methodology incorporated objective metrics to assess provider and system performance. Developed materials were implemented during institutional review board-approved study sessions that assessed and compared an experimental multiparametric alerting system with a standard monitor telemetry system for subject response, use characteristics, and end-user feedback. A four-patient simulation setup featuring objective metrics for participant task-related performance and response to alarms was developed along with an accompanying structured HFE assessment (questionnaire and interview) for monitor system use testing. Two pilot and four study sessions with individual nurse subjects elicited true alarm and false alarm responses (including diversion from assigned tasks) as well as nonresponses to true alarms. In-simulation observation and subject questionnaires were used to test the experimental system's approach to suppressing false alarms and alerting providers. A novel investigative methodology applied simulation and HFE techniques to replicate and study alarm fatigue in controlled settings for systems assessment and experimental research purposes.

  1. Development and Application of Urban Landslide Vulnerability Assessment Methodology Reflecting Social and Economic Variables

    Directory of Open Access Journals (Sweden)

    Yoonkyung Park

    2016-01-01

    An urban landslide vulnerability assessment methodology is proposed with a major focus on urban social and economic aspects. The proposed methodology was developed based on the landslide susceptibility maps that the Korean Forest Service utilizes to identify landslide source areas. First, debris flows are propagated to urban areas from such source areas by Flow-R (flow path assessment of gravitational hazards at a regional scale), and then urban vulnerability is assessed in two categories: physical and socioeconomic. The physical vulnerability is related to buildings that can be impacted by a landslide event. This study considered two popular building structure types, reinforced-concrete frame and non-reinforced-concrete frame, to assess the physical vulnerability. The socioeconomic vulnerability is considered a function of the resistance level of the vulnerable population, the trigger factor of secondary damage, and the preparedness level of the local government. An index-based model is developed to evaluate the life and indirect damage under a landslide as well as the resilience against disasters. To illustrate the validity of the proposed methodology, physical and socioeconomic vulnerability levels are analyzed for Seoul, Korea, using the suggested approach. The general trend found in this study indicates that higher population density areas under a weaker fiscal condition that are located downstream of mountainous areas are more vulnerable than areas in the opposite conditions.
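
    A minimal sketch of an index-based vulnerability aggregation of the kind described above is given below: a physical score and a socioeconomic score are combined with weights. The weights, the normalization, and the example inputs are illustrative assumptions, not the index actually used in the study.

```python
# Hedged sketch of an index-based vulnerability aggregation: a physical score
# (building structure type) is combined with a socioeconomic score (population
# resistance, secondary-damage triggers, local-government preparedness).
# Weights and scores are illustrative assumptions, not the paper's values.
def landslide_vulnerability(frac_nonreinforced, resistance, trigger, preparedness,
                            w_physical=0.5, w_socio=0.5):
    """All inputs normalized to [0, 1]; higher output means more vulnerable."""
    physical = frac_nonreinforced  # non-reinforced buildings assumed weaker
    socio = ((1 - resistance) + trigger + (1 - preparedness)) / 3.0
    return w_physical * physical + w_socio * socio

# Example district: many non-reinforced buildings, weak fiscal preparedness.
print(round(landslide_vulnerability(0.7, resistance=0.3, trigger=0.6, preparedness=0.2), 3))
```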

  2. Distributed computing methodology for training neural networks in an image-guided diagnostic application.

    Science.gov (United States)

    Plagianakos, V P; Magoulas, G D; Vrahatis, M N

    2006-03-01

    Distributed computing is a process through which a set of computers connected by a network is used collectively to solve a single problem. In this paper, we propose a distributed computing methodology for training neural networks for the detection of lesions in colonoscopy. Our approach is based on partitioning the training set across multiple processors using a parallel virtual machine. In this way, interconnected computers of varied architectures can be used for the distributed evaluation of the error function and gradient values, and, thus, training neural networks utilizing various learning methods. The proposed methodology has large granularity and low synchronization, and has been implemented and tested. Our results indicate that the parallel virtual machine implementation of the training algorithms developed leads to considerable speedup, especially when large network architectures and training sets are used.
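
    The sketch below illustrates the data-parallel pattern the abstract describes: the training set is partitioned across workers, each worker returns a partial error/gradient for its chunk, and the master sums them before updating the model. Python's multiprocessing pool stands in for the parallel virtual machine, and a linear least-squares model stands in for the neural network, so the details are assumptions rather than the authors' implementation.

```python
# Sketch of distributed error/gradient evaluation over a partitioned training set.
import numpy as np
from multiprocessing import Pool

def partial_gradient(args):
    """Gradient of squared error for a linear model on one data chunk."""
    w, x_chunk, y_chunk = args
    residual = x_chunk @ w - y_chunk
    return x_chunk.T @ residual, residual @ residual  # (partial gradient, SSE)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.normal(size=(10_000, 5))
    y = x @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.1 * rng.normal(size=10_000)

    w = np.zeros(5)
    chunks = 4
    x_parts, y_parts = np.array_split(x, chunks), np.array_split(y, chunks)

    with Pool(chunks) as pool:
        for epoch in range(200):
            results = pool.map(partial_gradient,
                               [(w, xp, yp) for xp, yp in zip(x_parts, y_parts)])
            grad = sum(g for g, _ in results) / len(y)
            sse = sum(e for _, e in results)
            w -= 0.1 * grad  # simple gradient-descent step on the summed gradient
    print("fitted weights:", np.round(w, 2), " final MSE:", round(sse / len(y), 4))
```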

  3. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    International Nuclear Information System (INIS)

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    The Framatome ANP Realistic Large-Break LOCA methodology (FANP RLBLOCA) is an analysis approach approved by the US NRC for supporting the licensing basis of 3- and 4-loop Westinghouse PWRs and CE 2x4 PWRs. It was developed consistent with the NRC's Code Scaling, Applicability, and Uncertainty (CSAU) methodology for performing best-estimate large-break LOCAs. The CSAU methodology consists of three key elements, with the second and third elements addressing uncertainty identification and application. Unique to the CSAU methodology is the use of engineering judgment and the Phenomena Identification and Ranking Table (PIRT) defined in the first element to lay the groundwork for achieving the ultimate goal of quantifying the total uncertainty in predicted measures of interest associated with the large-break LOCA. It is the PIRT that not only directs the methodology development, but also directs the methodology review. While the FANP RLBLOCA methodology was generically approved, a plant-specific application is customized in two ways, addressing how the unique plant characterization (1) is translated to code input and (2) relates to the unique methodology licensing requirements. Related to the former, plants are required by 10 CFR 50.36 to define a technical specification limiting condition for operation based on the following criteria: (1) installed instrumentation that is used in the control room to detect, and indicate, a significant abnormal degradation of the reactor coolant pressure boundary; (2) a process variable, design feature, or operating restriction that is an initial condition of a design basis accident or transient analysis that either assumes the failure of or presents a challenge to the integrity of a fission product barrier; (3) a structure, system, or component that is part of the primary success path and which functions or actuates to mitigate a design basis accident or transient that either assumes the failure of or presents a challenge to the integrity of a
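
    One widely used way to turn a set of sampled best-estimate code runs into a bounding statement for a figure of merit such as peak cladding temperature is a non-parametric (order-statistics) tolerance limit; whether this is exactly the statistical treatment of the FANP RLBLOCA methodology is not stated in the abstract, so the sketch below is a generic illustration with synthetic numbers.

```python
# Generic best-estimate-plus-uncertainty sketch (hedged: not necessarily the
# FANP treatment): run the code N times with sampled inputs and use a
# non-parametric (Wilks) tolerance limit.  With N = 59 runs, the maximum
# observed value bounds the 95th percentile with 95% confidence.
import math
import random

def wilks_runs(beta=0.95, gamma=0.95):
    """Smallest N such that 1 - beta**N >= gamma (first-order, one-sided)."""
    return math.ceil(math.log(1.0 - gamma) / math.log(beta))

random.seed(0)
n = wilks_runs()  # -> 59
# Placeholder for N code calculations of peak cladding temperature (K):
pct_samples = [random.gauss(1300.0, 60.0) for _ in range(n)]
print(f"N = {n} runs; 95/95 bounding PCT estimate = {max(pct_samples):.1f} K")
```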

  4. Development and application of a hybrid transport methodology for active interrogation systems

    Energy Technology Data Exchange (ETDEWEB)

    Royston, K.; Walters, W.; Haghighat, A. [Nuclear Engineering Program, Department of Mechanical Engineering, Virginia Tech., 900 N Glebe Rd., Arlington, VA 22203 (United States); Yi, C.; Sjoden, G. [Nuclear and Radiological Engineering, Georgia Tech, 801 Ferst Drive, Atlanta, GA 30332 (United States)

    2013-07-01

    A hybrid Monte Carlo and deterministic methodology has been developed for application to active interrogation systems. The methodology consists of four steps: i) neutron flux distribution due to neutron source transport and subcritical multiplication; ii) generation of gamma source distribution from (n, γ) interactions; iii) determination of gamma current at a detector window; iv) detection of gammas by the detector. This paper discusses the theory and results of the first three steps for the case of a cargo container with a sphere of HEU in third-density water cargo. To complete the first step, a response-function formulation has been developed to calculate the subcritical multiplication and neutron flux distribution. Response coefficients are pre-calculated using the MCNP5 Monte Carlo code. The second step uses the calculated neutron flux distribution and Bugle-96 (n, γ) cross sections to find the resulting gamma source distribution. In the third step the gamma source distribution is coupled with a pre-calculated adjoint function to determine the gamma current at a detector window. The AIMS (Active Interrogation for Monitoring Special-Nuclear-Materials) software has been written to output the gamma current for a source-detector assembly scanning across a cargo container using the pre-calculated values and taking significantly less time than a reference MCNP5 calculation. (authors)
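
    The sketch below mimics steps (ii) and (iii) numerically: a gamma source is formed from a neutron flux and an (n, γ) cross section in each mesh cell, and the detector-window current is estimated by folding that source with a pre-calculated adjoint (importance) function. All values are illustrative assumptions, not AIMS inputs or results.

```python
# Minimal numerical sketch of gamma-source generation and adjoint folding.
import numpy as np

rng = np.random.default_rng(2)
n_cells = 1000

neutron_flux = rng.random(n_cells) * 1e8  # n/cm^2/s per cell (assumed)
sigma_n_gamma = 0.02                      # macroscopic (n, gamma) xs, 1/cm (assumed)
cell_volume = 8.0                         # cm^3 (assumed)

# Step (ii): gamma source from radiative capture in each cell.
gamma_source = neutron_flux * sigma_n_gamma * cell_volume  # gammas/s per cell

# Step (iii): fold with a pre-computed adjoint (importance) function, i.e. the
# expected contribution of one source gamma in that cell to the detector current.
adjoint = np.exp(-rng.random(n_cells) * 12.0)  # dimensionless importance (assumed)

detector_current = float(np.sum(gamma_source * adjoint))
print(f"estimated gamma current at detector window: {detector_current:.3e} gammas/s")
```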

  5. METHODOLOGICAL GUIDELINES FOR THE TRANSPROFESSIONALISM DEVELOPMENT AMONG VOCATIONAL EDUCATORS

    Directory of Open Access Journals (Sweden)

    E. F. Zeer

    2017-01-01

    Introduction. Nowadays, in view of the sixth wave of technological innovations and the emergence of the phenomenon of «transfession», there is a need for modernization of vocational staff training in our country. A transfession is a type of labour activity realized on the basis of the synthesis and convergence of professional competences that involve different specialized areas. Thus, the authors of the present article propose to use the professional and educational platform they have developed, taking into account the specialists' training specialty. The aims of the article are the following: to describe the phenomenon of «transprofessionalism» and determine the initial attitudes towards its understanding; and to present the block-modular model of the platform for the formation of transprofessionalism among teachers of vocational schools. Methodology and research methods. The research is based on the following theoretical and scientific methods: analysis, synthesis, concretization, generalization; the hypothetical-deductive method; and the project-based method. The design of the transprofessionalism platform model was based on multidimensional, transdisciplinary, network and project approaches. Results and scientific novelty. The relevance of the discussed phenomenon to the productive-economic sphere is demonstrated. Transprofessionalism requires brand-new content-related and technological training of specialists. In particular, the concept of «profession» has lost its original meaning as an area of the social division of labour during the socio-technological development of the Russian economy. Therefore, transprofessionals are becoming more competitive and in demand in the employment market, being capable of performing a wide range of specialized types of professional activities. The structure, principles and mechanisms of the professional-educational platform functioning for transprofessionalism formation among the members of professional

  6. Development and evaluation of clicker methodology for introductory physics courses

    Science.gov (United States)

    Lee, Albert H.

    Many educators understand that lectures are cost effective but not learning efficient, so they continue to search for ways to increase active student participation in this traditionally passive learning environment. In-class polling systems, or "clickers", are inexpensive and reliable tools allowing students to actively participate in lectures by answering multiple-choice questions. Students assess their learning in real time by observing instant polling summaries displayed in front of them. This in turn motivates additional discussions which increase the opportunity for active learning. We wanted to develop a comprehensive clicker methodology that creates an active lecture environment for a broad spectrum of students taking introductory physics courses. We wanted our methodology to incorporate many findings of contemporary learning science. It is recognized that learning requires active construction; students need to be actively involved in their own learning process. Learning also depends on preexisting knowledge; students construct new knowledge and understandings based on what they already know and believe. Learning is context dependent; students who have learned to apply a concept in one context may not be able to recognize and apply the same concept in a different context, even when both contexts are considered to be isomorphic by experts. On this basis, we developed question sequences, each involving the same concept but having different contexts. Answer choices are designed to address students' preexisting knowledge. These sequences are used with the clickers to promote active discussions and multiple assessments. We have created, validated, and evaluated sequences sufficient in number to populate all introductory physics courses. Our research has found that using clickers with our question sequences significantly improved student conceptual understanding. Our research has also found how to best measure student conceptual gain using research-based instruments

  7. Understanding care in the past to develop caring science of the future: a historical methodological approach.

    Science.gov (United States)

    Nyborg, Vibeke N; Hvalvik, Sigrun; McCormack, Brendan

    2018-05-31

    In this paper, we explore how the development of historical research methodologies during the last centuries can contribute to more diverse and interdisciplinary research in future caring science, especially towards a care focus that is more person-centred. Adding a historical approach by professional historians to the theory of person-centredness and person-centred care can develop knowledge that enables a more holistic understanding of the patient and of the development of the patient perspective from the past until today. Thus, the aim was to show how developments within historical methodology can help us to understand elements of care in the past in order to further develop caring science in the future. Historical research methodologies have advocated a "history from below" perspective, and this has enabled the evolution of systematic approaches to historical research that can be explored and critically analysed. Linked with this, the development of a more socially and culturally oriented understanding of historical research has enabled historians to explore and add knowledge from a broader societal perspective. By focusing on the life of ordinary people and taking social and cultural aspects into account when trying to reconstruct the past, we can get a deeper understanding of health, care and medical development. However, an interdisciplinary research focus on person-centredness and person-centred care that includes professional historians can be challenging. In this paper, we argue that a historical perspective is necessary to meet the challenges we face in the future delivery of health care to all people, in all parts of society, in an ever more global world. © 2018 Nordic College of Caring Science.

  8. Development of a methodology for the safety assessment of near surface disposal facilities for radioactive waste

    International Nuclear Information System (INIS)

    Simon, I.; Cancio, D.; Alonso, L.F.; Agueero, A.; Lopez de la Higuera, J.; Gil, E.; Garcia, E.

    2000-01-01

    The Project on the Environmental Radiological Impact at CIEMAT is developing, for the Spanish regulatory body Consejo de Seguridad Nuclear (CSN), a methodology for the safety assessment of near surface disposal facilities. This method has been developed incorporating some elements developed through participation in the IAEA's ISAM Programme (Improving Long Term Safety Assessment Methodologies for Near Surface Radioactive Waste Disposal Facilities). The first step of the approach is the consideration of the assessment context, including the purpose of the assessment, the end-points, the philosophy, the disposal system, the source term and the temporal scales, as well as the hypothesis about the critical group. Once the context has been established, and considering the peculiarities of the system, a specific list of features, events and processes (FEPs) is produced. These will be incorporated into the assessment scenarios. The set of scenarios will be represented in the conceptual and mathematical models. By the use of mathematical codes, calculations are performed to obtain results (i.e. in terms of doses) to be analysed and compared against the criteria. The methodology is being tested by application to a hypothetical engineered disposal system based on an exercise within the ISAM Programme, and will finally be applied to the Spanish case. (author)

  9. Development methodology for the software life cycle process of the safety software

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Lee, S. S. [BNF Technology, Taejon (Korea, Republic of); Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B. [KAERI, Taejon (Korea, Republic of)

    2002-05-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). The software life cycle model selected is a hybrid model mixing waterfall, prototyping, and spiral models, and is composed of two stages: the development of a prototype ESF-CCS and the development of the ESF-CCS itself. To produce the software life cycle (SLC) for the Development of the Digital Reactor Safety System, the activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available OPAs (Organizational Process Assets) are applied to the SLC activities and the known constraints are reconciled. The established SLCP describes well the software life cycle activities with which the Regulatory Authority is provided.

  10. Development methodology for the software life cycle process of the safety software

    International Nuclear Information System (INIS)

    Kim, D. H.; Lee, S. S.; Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B.

    2002-01-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). The software life cycle model selected is a hybrid model mixing waterfall, prototyping, and spiral models, and is composed of two stages: the development of a prototype ESF-CCS and the development of the ESF-CCS itself. To produce the software life cycle (SLC) for the Development of the Digital Reactor Safety System, the activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available OPAs (Organizational Process Assets) are applied to the SLC activities and the known constraints are reconciled. The established SLCP describes well the software life cycle activities with which the Regulatory Authority is provided

  11. Accounting Research Methodology Textbook Development to Provide College Students in Accounting Subject

    OpenAIRE

    Muchson, Mochamad

    2015-01-01

    Due to the lack of an accounting research methodology book, college students have difficulty understanding the steps of research in accounting; a textbook is therefore needed to complement the accounting research methodology books that already exist and to guide students in composing proposals or accounting research reports, so that the subject is easier for students to study. This textbook contains the definition of accounting research methodology, accounting ...

  12. H2POWER: Development of a methodology to calculate life cycle cost of small and medium-scale hydrogen systems

    International Nuclear Information System (INIS)

    Verduzco, Laura E.; Duffey, Michael R.; Deason, Jonathan P.

    2007-01-01

    At this time, hydrogen-based power plants and large hydrogen production facilities are capital intensive and unable to compete financially against hydrocarbon-based energy production facilities. An option to overcome this problem and foster the introduction of hydrogen technology is to introduce small and medium-scale applications such as residential and community hydrogen refueling units. Such units could potentially be used to generate both electricity and heat for the home, as well as hydrogen fuel for the automobile. Cost modeling for the integration of these three forms of energy presents several methodological challenges. This is particularly true since the technology is still in the development phase and both the financial and the environmental cost must be calculated using mainly secondary sources. In order to address these issues and aid in the design of small and medium-scale hydrogen systems, this study presents a computer model to calculate financial and environmental costs of this technology using different hydrogen pathways. The model can design and compare hydrogen refueling units against hydrocarbon-based technologies, including the 'gap' between financial and economic costs. Using the methodology, various penalties and incentives that can foster the introduction of hydrogen-based technologies can be added to the analysis to study their impact on financial cost
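
    As a hedged illustration of the kind of life-cycle-cost arithmetic such a model performs, the sketch below discounts capital, O&M, and feedstock costs over an assumed unit lifetime and levelizes them over the hydrogen delivered. The cost figures, lifetime, and discount rate are placeholders, not H2POWER data.

```python
# Hedged life-cycle-cost sketch: discount all costs over the unit lifetime and
# spread them over the discounted hydrogen output.  All numbers are placeholders.
def levelized_cost(capital, annual_om, annual_feedstock, annual_output_kg,
                   lifetime_yr=20, discount_rate=0.07):
    """Levelized cost per kg of hydrogen delivered (simple NPV formulation)."""
    npv_costs = capital
    npv_output = 0.0
    for year in range(1, lifetime_yr + 1):
        factor = (1 + discount_rate) ** -year
        npv_costs += (annual_om + annual_feedstock) * factor
        npv_output += annual_output_kg * factor
    return npv_costs / npv_output

# Example: small residential refueling unit (all inputs assumed).
print(f"${levelized_cost(80_000, 2_000, 3_500, 1_200):.2f} per kg H2")
```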

  13. Megastudies, crowdsourcing, and large datasets in psycholinguistics: An overview of recent developments.

    Science.gov (United States)

    Keuleers, Emmanuel; Balota, David A

    2015-01-01

    This paper introduces and summarizes the special issue on megastudies, crowdsourcing, and large datasets in psycholinguistics. We provide a brief historical overview and show how the papers in this issue have extended the field by compiling new databases and making important theoretical contributions. In addition, we discuss several studies that use text corpora to build distributional semantic models to tackle various interesting problems in psycholinguistics. Finally, as is the case across the papers, we highlight some methodological issues that are brought forth via the analyses of such datasets.

  14. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    Science.gov (United States)

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-02

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided a precise control of the reaction conditions which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed higher impact on V. fischeri, evidenced by lower EC50. The proposed methodology was validated through statistical analysis which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as alternative to microplate based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
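
    The EC50 values mentioned above are typically obtained by fitting a dose-response curve to the measured inhibition data. The sketch below fits a two-parameter log-logistic curve by a brute-force least-squares search; the concentration and inhibition points are synthetic placeholders, not measurements from the cited work.

```python
# Sketch of EC50 extraction from luminescence-inhibition data (synthetic points).
import numpy as np

conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])        # mg/L (assumed)
inhibition = np.array([0.05, 0.12, 0.33, 0.62, 0.85, 0.95])  # fraction inhibited

def log_logistic(c, ec50, slope):
    """Two-parameter log-logistic dose-response curve."""
    return 1.0 / (1.0 + (ec50 / c) ** slope)

# Brute-force least-squares over a parameter grid (simple and dependency-free).
ec50_grid = np.logspace(0, 2.5, 400)
slope_grid = np.linspace(0.5, 4.0, 200)
best = min(((np.sum((log_logistic(conc, e, s) - inhibition) ** 2), e, s)
            for e in ec50_grid for s in slope_grid))
print(f"EC50 ~ {best[1]:.1f} mg/L, Hill slope ~ {best[2]:.2f}")
```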

  15. Scenario Methodology for Modelling of Future Landscape Developments as Basis for Assessing Ecosystem Services

    Directory of Open Access Journals (Sweden)

    Matthias Rosenberg

    2014-04-01

    The ecosystems of our intensively used European landscapes produce a variety of natural goods and services for the benefit of humankind, and secure the basics and quality of life. Because these ecosystems are still undergoing fundamental changes, the interest of society is to know more about future developments and their ecological impacts. To describe and analyze these changes, scenarios can be developed and an assessment of the ecological changes can be carried out subsequently. In the project "Landscape Saxony 2050", a methodology for the construction of exploratory scenarios was worked out. The presented methodology provides a possibility to identify the driving forces (socio-cultural, economic and ecological conditions) of landscape development. It allows possible future paths to be indicated which lead to a change of structures and processes in the landscape and can influence the capability to provide ecosystem services. One essential component of the applied technique is that an approach for the assessment of the effects of landscape changes on ecosystem services is integrated into the developed scenario methodology. Another is that the methodology is strongly participatory by design, i.e. stakeholders are actively integrated. The method is a seven-phase model which provides the option of integrating stakeholder participation at all levels of scenario development. The scenario framework was applied to the district of Görlitz, an area of 2100 sq km located at the eastern border of Germany. The region is affected by strong demographic as well as economic changes. The core issue focused on the examination of landscape change in terms of biodiversity. Together with stakeholders, a trend scenario and two alternative scenarios were developed. The changes of the landscape structure are represented in storylines, maps and tables. On the basis of the driving forces of the issue areas "cultural / social values" and

  16. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    Science.gov (United States)

    Williams, Lawrence H., Jr.

    2013-01-01

    This qualitative study analyzed the experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches, others allow software requirements and design to evolve and facilitate ambiguity and uncertainty by…

  17. Trends in scenario development methodologies and integration in NUMO's approach

    International Nuclear Information System (INIS)

    Ebashi, Takeshi; Ishiguro, Katsuhiko; Wakasugi, Keiichiro; Kawamura, Hideki; Gaus, Irina; Vomvoris, Stratis; Martin, Andrew J.; Smith, Paul

    2011-01-01

    The development of scenarios for quantitative or qualitative analysis is a key element of the assessment of the safety of geological disposal systems. As an outcome of an international workshop attended by European and the Japanese implementers, a number of features common to current methodologies could be identified, as well as trends in their evolution over time. In the late nineties, scenario development was often described as a bottom-up process, whereby scenarios were said to be developed in essence from FEP databases. Nowadays, it is recognised that, in practice, the approaches actually adopted are better described as top-down or 'hybrid', taking as their starting point an integrated (top-down) understanding of the system under consideration including uncertainties in initial state, sometimes assisted by the development of 'storyboards'. A bottom-up element remains (hence the term 'hybrid') to the extent that FEP databases or FEP catalogues (including interactions) are still used, but the focus is generally on completeness checking, which occurs parallel to the main assessment process. Recent advances focus on the consistent treatment of uncertainties throughout the safety assessment and on the integration of operational safety and long term safety. (author)

  18. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error, constructing a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  19. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-01

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error, constructing a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants

  20. Systems engineering for very large systems

    Science.gov (United States)

    Lewkowicz, Paul E.

    Very large integrated systems have always posed special problems for engineers. Whether they are power generation systems, computer networks or space vehicles, whenever there are multiple interfaces, complex technologies or just demanding customers, the challenges are unique. 'Systems engineering' has evolved as a discipline in order to meet these challenges by providing a structured, top-down design and development methodology for the engineer. This paper attempts to define the general class of problems requiring the complete systems engineering treatment and to show how systems engineering can be utilized to improve customer satisfaction and profitability. Specifically, this work will focus on a design methodology for the largest of systems, not necessarily in terms of physical size, but in terms of complexity and interconnectivity.

  1. Selection methodology for LWR safety programs and proposals. Volume 2. Methodology application

    International Nuclear Information System (INIS)

    Ritzman, R.L.; Husseiny, A.A.

    1980-08-01

    The results of work done to update and apply a methodology for selecting (prioritizing) LWR safety technology R and D programs are described. The methodology is based on multiattribute utility (MAU) theory. Application of the methodology to rank-order a group of specific R and D programs included development of a complete set of attribute utility functions, specification of individual attribute scaling constants, and refinement and use of an interactive computer program (MAUP) to process decision-maker inputs and generate overall (multiattribute) program utility values. The output results from several decision-makers are examined for consistency and conclusions and recommendations regarding general use of the methodology are presented. 3 figures, 18 tables
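
    The core of an additive multiattribute utility (MAU) ranking like the one described can be written in a few lines: each program receives an attribute-level utility in [0, 1], and the overall utility is the weighted combination using the decision-maker's scaling constants. The attributes, weights, and scores below are illustrative assumptions only.

```python
# Minimal sketch of an additive multiattribute utility (MAU) ranking step.
def overall_utility(attribute_utilities, scaling_constants):
    """Additive MAU form: U = sum_i k_i * u_i, with sum(k_i) = 1."""
    assert abs(sum(scaling_constants.values()) - 1.0) < 1e-9
    return sum(scaling_constants[a] * u for a, u in attribute_utilities.items())

# Illustrative scaling constants (decision-maker inputs) and program scores.
weights = {"safety_benefit": 0.5, "cost": 0.2, "schedule": 0.15, "feasibility": 0.15}
programs = {
    "Program A": {"safety_benefit": 0.9, "cost": 0.4, "schedule": 0.6, "feasibility": 0.8},
    "Program B": {"safety_benefit": 0.6, "cost": 0.8, "schedule": 0.9, "feasibility": 0.7},
}

ranking = sorted(programs, key=lambda p: overall_utility(programs[p], weights), reverse=True)
for p in ranking:
    print(p, round(overall_utility(programs[p], weights), 3))
```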

  2. Methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Mazumdar, M.; Marshall, J.A.; Chay, S.C.; Gay, R.

    1976-07-01

    In February 1975, Westinghouse Electric Corporation, under contract to Electric Power Research Institute, started a one-year program to develop methodology for statistical evaluation of nuclear-safety-related engineering analyses. The objectives of the program were to develop an understanding of the relative efficiencies of various computational methods which can be used to compute probability distributions of output variables due to input parameter uncertainties in analyses of design basis events for nuclear reactors and to develop methods for obtaining reasonably accurate estimates of these probability distributions at an economically feasible level. A series of tasks was set up to accomplish these objectives. Two of the tasks were to investigate the relative efficiencies and accuracies of various Monte Carlo and analytical techniques for obtaining such estimates for a simple thermal-hydraulic problem whose output variable of interest is given in a closed-form relationship of the input variables and to repeat the above study on a thermal-hydraulic problem in which the relationship between the predicted variable and the inputs is described by a short-running computer program. The purpose of the report presented is to document the results of the investigations completed under these tasks, giving the rationale for choices of techniques and problems, and to present interim conclusions
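
    The Monte Carlo option investigated in the report amounts to sampling the uncertain inputs of a closed-form relationship and building the output distribution empirically, as in the sketch below. The thermal-hydraulic relationship and the input distributions are assumptions chosen for illustration, not those studied in the report.

```python
# Sketch of Monte Carlo propagation of input uncertainties through a
# closed-form output relationship (all distributions and constants assumed).
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical closed-form output: cladding temperature as a function of
# coolant temperature, linear power, and a heat-transfer coefficient.
t_coolant = rng.normal(560.0, 5.0, n)                       # K
linear_power = rng.normal(20.0, 1.5, n)                     # kW/m
htc = rng.lognormal(mean=np.log(30.0), sigma=0.1, size=n)   # kW/m^2-K

t_clad = t_coolant + linear_power / (np.pi * 0.0095 * htc)  # 9.5 mm rod diameter

print(f"mean = {t_clad.mean():.1f} K, "
      f"95th percentile = {np.percentile(t_clad, 95):.1f} K")
```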

  3. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

    Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program

  4. Safety Assessment Methodologies and Their Application in Development of Near Surface Waste Disposal Facilities--ASAM Project

    International Nuclear Information System (INIS)

    Batandjieva, B.; Metcalf, P.

    2003-01-01

    The safety of near surface disposal facilities is a primary focus and objective of stakeholders involved in the management of low and intermediate level radioactive waste, and safety assessment is an important tool contributing to the evaluation and demonstration of the overall safety of these facilities. It plays a significant role in the different stages of development of these facilities (site characterization, design, operation, closure), especially for those facilities for which a safety assessment has not been performed, safety has not yet been demonstrated, or the future has not been decided. Safety assessments also create the basis for the safety arguments presented to nuclear regulators, the public and other interested parties in respect of the safety of existing facilities, the measures to upgrade existing facilities and the development of new facilities. The International Atomic Energy Agency (IAEA) has initiated a number of coordinated research projects in the field of the development and improvement of approaches and methodologies for the safety assessment of near surface disposal facilities, such as the NSARS (Near Surface Radioactive Waste Disposal Safety Assessment Reliability Study) and ISAM (Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities) projects. These projects were very successful and showed that there is a need to promote the consistent application of safety assessment methodologies and to explore approaches to the regulatory review of safety assessments and safety cases in order to make safety related decisions. These objectives have been the basis of the IAEA follow-up coordinated research project, ASAM (Application of Safety Assessment Methodologies for Near Surface Disposal Facilities), which will commence in November 2002 and continue for a period of three years

  5. Effective diagnosis of Alzheimer’s disease by means of large margin-based methodology

    Directory of Open Access Journals (Sweden)

    Chaves Rosa

    2012-07-01

    Background: Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in Alzheimer's Disease (AD) diagnosis. However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. Methods: A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to be located within a predefined brain activation mask. In order to address the small sample-size problem, the dimension of the feature space was further reduced by: Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the two latter also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and Energy-based metrics were compared. Results: Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: (i) linear transformation of the PLS- or PCA-reduced data, (ii) feature reduction technique, and (iii) classifier (with Euclidean, Mahalanobis or Energy-based methodology). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when an NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. Conclusions: All the proposed methods turned out to be a valid solution for the presented problem. One of the advances is the robustness of the LMNN algorithm, which not only provides a higher separation rate between
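
    The overall pipeline shape reported above (feature reduction to cope with the small sample size, followed by a kernel SVM evaluated with k-fold cross-validation) can be sketched with standard tools as below. LMNN and PLS are not reproduced here; PCA stands in for the reduction step, and the data are synthetic, so the printed accuracy has no clinical meaning.

```python
# Hedged sketch of a "feature reduction + kernel SVM + k-fold CV" pipeline.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for voxel-intensity features within a brain mask.
X, y = make_classification(n_samples=120, n_features=500, n_informative=25,
                           random_state=0)

pipeline = make_pipeline(StandardScaler(),
                         PCA(n_components=20),   # small-sample-size mitigation
                         SVC(kernel="rbf", C=1.0))

scores = cross_val_score(pipeline, X, y, cv=10)  # k-fold cross-validation
print(f"mean CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```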

  6. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND Computer I Program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
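
    The sketch below illustrates the SIMRAND idea in miniature (it is not the SIMRAND I code): each alternative path through the network is simulated by sampling its uncertain task variables, the resulting measure of preference is passed through a utility function, and paths are compared on expected utility. The paths, distributions, and utility function are assumptions for illustration.

```python
# Illustrative Monte Carlo comparison of alternative paths on expected utility.
import random

random.seed(4)

def utility(cost):
    """Hypothetical decreasing utility over total project cost (M$)."""
    return max(0.0, 1.0 - cost / 100.0)

# Each alternative path is a list of tasks with triangular cost distributions
# given as (low, mode, high) in M$ -- an assumption for illustration.
paths = {
    "path_1": [(5, 8, 15), (10, 14, 25), (20, 28, 45)],
    "path_2": [(8, 12, 18), (15, 22, 40)],
}

n_trials = 20_000
for name, tasks in paths.items():
    expected_u = sum(
        utility(sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks))
        for _ in range(n_trials)) / n_trials
    print(name, "expected utility:", round(expected_u, 3))
```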

  7. Nonlinear Time Domain Seismic Soil-Structure Interaction (SSI) Deep Soil Site Methodology Development

    International Nuclear Information System (INIS)

    Spears, Robert Edward; Coleman, Justin Leigh

    2015-01-01

    Currently the Department of Energy (DOE) and the nuclear industry perform seismic soil-structure interaction (SSI) analysis using equivalent linear numerical analysis tools. For lower levels of ground motion, these tools should produce reasonable in-structure response values for the evaluation of existing and new facilities. For larger levels of ground motion these tools likely overestimate the in-structure response (and therefore structural demand) since they do not consider geometric nonlinearities (such as gapping and sliding between the soil and structure) and are limited in their ability to model nonlinear soil behavior. The current equivalent linear SSI (SASSI) analysis approach either joins the soil and structure together in both tension and compression or releases the soil from the structure for both tension and compression. It also makes linear approximations for material nonlinearities and generalizes energy absorption with viscous damping. This produces the potential for inaccurately establishing where the structural concerns exist and/or inaccurately establishing the amplitude of the in-structure responses. Seismic hazard curves at nuclear facilities have continued to increase over the years as more information has been developed on seismic sources (i.e. faults), additional information has been gathered on seismic events, and additional research has been performed to determine local site effects. Seismic hazard curves are used to develop design basis earthquakes (DBEs) that are used to evaluate nuclear facility response. As the seismic hazard curves increase, the input ground motions (DBEs) used to numerically evaluate nuclear facility response increase, causing larger in-structure responses. As ground motions increase, so does the importance of including nonlinear effects in numerical SSI models. To include material nonlinearity in the soil and geometric nonlinearity using contact (gapping and sliding), it is necessary to develop a nonlinear time domain methodology. This

  8. Methodology of analysis sustainable development of Ukraine by using the theory fuzzy logic

    Directory of Open Access Journals (Sweden)

    Methodology of analysis sustainable development of Ukraine by using the theory fuzzy logic

    2016-02-01

    The objective of the article is to analyse the theoretical and methodological aspects of assessing sustainable development in times of crisis. A methodical approach to the analysis of the sustainable development of a territory, taking into account the assessment of the level of economic security, is proposed. The necessity of developing a complex methodical approach that accounts for uncertainty and multi-criteria properties in tasks of ensuring economic security, on the basis of fuzzy logic theory (the theory of fuzzy sets), is demonstrated. Results of applying the fuzzy set method to the dynamics of sustainable development in Ukraine during 2002-2012 are presented.
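
    A minimal sketch of the fuzzy-set step referred to above is given below: crisp indicator values are mapped to membership degrees in linguistic terms via triangular membership functions and then aggregated. The membership breakpoints, the indicators, and the simple averaging are illustrative assumptions, not the authors' model.

```python
# Hedged sketch of fuzzification of a composite sustainability indicator.
def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def sustainability_memberships(score):  # score normalized to [0, 1]
    return {
        "low": triangular(score, -0.01, 0.0, 0.5),
        "medium": triangular(score, 0.0, 0.5, 1.0),
        "high": triangular(score, 0.5, 1.0, 1.01),
    }

# Aggregate several normalized indicators (economic, social, ecological) by
# averaging, then fuzzify the composite score.
indicators = {"economic": 0.42, "social": 0.58, "ecological": 0.35}
composite = sum(indicators.values()) / len(indicators)
print(round(composite, 2), sustainability_memberships(composite))
```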

  9. Development of a Methodology to Gather Seated Anthropometry in a Microgravity Environment

    Science.gov (United States)

    Rajulu, Sudhakar; Young, Karen; Mesloh, Miranda

    2009-01-01

    The Constellation Program's Crew Exploration Vehicle (CEV) is required to accommodate the full population range of crewmembers according to the anthropometry requirements stated in the Human-Systems Integration Requirement (HSIR) document (CxP70024). Seated height is one of many critical dimensions of importance to the CEV designers in determining the optimum seat configuration in the vehicle. Changes in seated height may have a large impact on the design, accommodation, and safety of the crewmembers. Seated height can change due to elongation of the spine when crewmembers are exposed to microgravity. Spinal elongation is the straightening of the natural curvature of the spine and the expansion of inter-vertebral disks. This straightening occurs due to fluid shifts in the body and the lack of compressive forces on the spinal vertebrae. Previous studies have shown that as the natural curvature of the spine straightens, an increase in overall height of 3% of stature occurs, which has been the basis of the current HSIR requirements. However, due to variations in the torso/leg ratio and the impact of soft tissue, data are nonexistent as to how spinal elongation specifically affects the measurement of seated height. In order to obtain these data, an experiment was designed to collect spinal elongation data while in a seated posture in microgravity. The purpose of this study was to provide quantitative data that represent the amount of change that occurs in seated height due to spinal elongation in microgravity environments. Given the schedule and budget constraints of ISS and Shuttle missions and the uniqueness of the problem, a methodology had to be developed to ensure that the seated height measurements were accurately collected. Therefore, simulated microgravity evaluations were conducted to test the methodology and procedures of the experiment. This evaluation obtained seat pan pressure and seated height data to a) ensure that the lap restraint provided sufficient

  10. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    Science.gov (United States)

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  11. Development of a methodology for radionuclide impurity analysis in radiopharmaceuticals using gamma spectrometry

    International Nuclear Information System (INIS)

    Paula, Eduardo Bonfim de; Araujo, Miriam Taina Ferreira de; Delgado, Jose Ubiratan; Poledna, Roberto; Lins, Ronaldo; Leiras, Anderson; Silva, Carlos Jose da; Oliveira, Antonio Eduardo de

    2016-01-01

    The LNMRI has sought to develop a methodology for the identification and accurate detection of gamma-emitting impurities at the metrological level, aiming to meet the recommendations not only of the international pharmacopoeias but also of CNEN and ANVISA regarding quality control, so that the doses patients receive from these practices are as low as feasible. As an initial target, it was possible to obtain an efficiency curve with an uncertainty of around 1%, which is necessary to initiate future measurements of interest to nuclear medicine and to start the development of the impurity analysis technique. (author)

  12. Screening radon risks: A methodology for policymakers

    International Nuclear Information System (INIS)

    Eisinger, D.S.; Simmons, R.A.; Lammering, M.; Sotiros, R.

    1991-01-01

    This paper provides an easy-to-use screening methodology to estimate potential excess lifetime lung cancer risk resulting from indoor radon exposure. The methodology was developed under U.S. EPA Office of Policy, Planning, and Evaluation sponsorship of the agency's Integrated Environmental Management Projects (IEMP) and State/Regional Comparative Risk Projects. These projects help policymakers understand and use scientific data to develop environmental problem-solving strategies. This research presents the risk assessment methodology, discusses its basis, and identifies appropriate applications. The paper also identifies assumptions built into the methodology and qualitatively addresses methodological uncertainties, the direction in which these uncertainties could bias analyses, and their relative importance. The methodology draws from several sources, including risk assessment formulations developed by the U.S. EPA's Office of Radiation Programs, the EPA's Integrated Environmental Management Project (Denver), the International Commission on Radiological Protection, and the National Institute for Occupational Safety and Health. When constructed as a spreadsheet program, the methodology easily facilitates analyses and sensitivity studies (the paper includes several sensitivity study options). The methodology will be most helpful to those who need to make decisions concerning radon testing, public education, and exposure prevention and mitigation programs. 26 references
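
    The screening arithmetic such a spreadsheet methodology implements can be sketched as below: the indoor radon concentration is converted to a cumulative exposure over an assumed occupancy and lifetime and multiplied by a unit-risk factor. The conversion and risk coefficients here are assumptions for illustration, not the values adopted in the cited methodology.

```python
# Hedged radon screening sketch: concentration -> cumulative exposure (WLM)
# -> excess lifetime lung cancer risk.  Coefficients are illustrative only.
def excess_lifetime_risk(radon_pci_per_l,
                         occupancy_fraction=0.7,
                         lifetime_years=70,
                         wlm_per_pci_l_year=0.23,   # assumed conversion at full occupancy
                         risk_per_wlm=3.5e-4):      # assumed lifetime risk per WLM
    """Rough screening estimate of excess lifetime lung cancer risk."""
    cumulative_wlm = (radon_pci_per_l * wlm_per_pci_l_year
                      * occupancy_fraction * lifetime_years)
    return cumulative_wlm * risk_per_wlm

for level in (2.0, 4.0, 10.0):  # pCi/L
    print(f"{level:>4} pCi/L -> excess lifetime risk ~ {excess_lifetime_risk(level):.3f}")
```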

  13. DEVELOPMENT OF METHODOLOGY FOR TRAFFIC ACCIDENT FORECASTING AT VARIOUS TYPICAL URBAN AREAS

    OpenAIRE

    D. V. Kapsky

    2012-01-01

    The paper provides investigation results pertaining to the development of a methodology for forecasting traffic accidents using a “conflict zone” method that considers potential danger for two typical urban areas, namely signalized crossings and humps installed in the areas of zebra crossings, and it also considers various types and kinds of conflicts. The investigations have made it possible to obtain various indices of threshold sensitivity in respect of potential risks and in relation to tra...

  14. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    Directory of Open Access Journals (Sweden)

    Johnston Marie

    2007-03-01

    Full Text Available Abstract Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its (i) theoretical framework (was there a definition of what was being assessed, and was it part of a theoretical model?) and (ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest), and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of a theoretical framework for both clinician report and patient self-report measures. This review also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological

  15. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    Science.gov (United States)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, which raised the potential for a catastrophic collision during the rendezvous and the docking or berthing of the spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of adding future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper will describe the methodology used for developing a visiting vehicle risk model.

  16. Determining the directions of increasing the innovative potential of the region by developing innovative technologies and competences when preparing and implementing large investment projects

    Directory of Open Access Journals (Sweden)

    Timur V. Kramin

    2015-12-01

    Full Text Available Objective: to determine the directions of increasing the innovative potential of a region through the development of innovative technologies and competences in the process of preparation and implementation of large investment projects in the Republic of Tatarstan. Methods: project management methodology; institutional approach. Results: it is proved that the main large-scale directions of innovative potential development in the Republic of Tatarstan, as a result of the preparation and implementation of the 2013 Universiade in Kazan, are knowledge management, information technologies and risk management. It is shown that, within the framework of the considered innovative areas, a complete system was formed of competences of employees and managers in the fields of education, trade, hospitality and service. Scientific novelty: the key directions were defined for increasing the innovative potential of a region through the development of innovative technologies and competences in the process of preparation and implementation of large investment projects, by the example of the World Summer Student Games in Kazan in 2013. Practical significance: on the basis of specific examples, the authors illustrate the practice-oriented mechanism of innovative potential development of a region as a result of the implementation of large investment projects.

  17. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    Energy Technology Data Exchange (ETDEWEB)

    Babu, Sudarsanam Suresh [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Love, Lonnie J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Peter, William H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility; Dehoff, Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Manufacturing Demonstration Facility

    2016-05-01

    ) systems development, (iv) material feedstock, (v) process planning, (vi) residual stress & distortion, (vii) post-processing, (viii) qualification of parts, (ix) supply chain and (x) business case. Furthermore, an open innovation network methodology was proposed to accelerate the development and deployment of new large-scale metal additive manufacturing technology with the goal of creating a new generation of high deposition rate equipment, affordable feed stocks, and large metallic components to enhance America’s economic competitiveness.

  18. Comparison of IRMS Delhi Methodology with WHO Methodology on Immunization Coverage

    Directory of Open Access Journals (Sweden)

    Singh Padam

    1996-01-01

    Full Text Available Research question: What are the merits of the IRMS Model over the WHO Model for coverage evaluation surveys? Which method is superior and appropriate for coverage evaluation surveys of immunization in our setting? Objective: To compare the IRMS Delhi methodology with the WHO methodology on immunization coverage. Study Design: Cross-sectional. Setting: Both urban and rural. Participants: Mothers & children. Sample Size: 300 children between 1-2 years and 300 mothers in rural areas, and 75 children and 75 mothers in urban areas. Study Variables: Rural, urban, caste group, size of the stratum, literacy, sex and cost effectiveness. Outcome Variables: Coverage level of immunization. Analysis: Routine statistical analysis. Results: The IRMS-developed methodology scores a better rating than the WHO methodology, especially when coverage evaluation is attempted in medium-size villages with socio-economic segregation, which remains the main characteristic of Indian villages.

  19. A methodological approach to development of circular economy options in businesses

    DEFF Research Database (Denmark)

    Jørgensen, Michael Søgaard; Remmen, Arne

    2018-01-01

    Three types of re-design processes are described in the development of circular economy options in businesses: 1) Re-design of the provided services considering changes in the roles of products, users, service, infrastructure, etc.; 2) Re-design of the value chain relations up-stream to suppliers...... of the business based on an environmental mapping in life cycle perspective and an organizational analysis mapping of value chain relations, competitive position, innovation practices, user practices and relations to regulation and civil society. The article shows that circular economy can be relevant...... and down-stream to customers and users; 3) Redesign of internal business organization considering necessary changes in tasks, competences, structures and technologies. These redesign processes have been developed as methodology within a research project based on cooperation with businesses in Denmark about...

  20. Development of proliferation resistance assessment methodology based on international standard

    International Nuclear Information System (INIS)

    Ko, W. I.; Chang, H. L.; Lee, Y. D.; Lee, J. W.; Park, J. H.; Kim, Y. I.; Ryu, J. S.; Ko, H. S.; Lee, K. W.

    2012-04-01

    Nonproliferation is one of the main requirements to be satisfied by the advanced future nuclear energy systems that have been developed in the Generation IV and INPRO studies. Methodologies to evaluate proliferation resistance have been developed since the 1980s; however, the systematic evaluation approach began around 2000. Domestically, a study to develop a national method to evaluate the proliferation resistance (PR) of advanced future nuclear energy systems started in 2007 as one of the long-term nuclear R and D subjects, in order to promote export and the international credibility and transparency of national nuclear energy systems and the nuclear fuel cycle technology development program. In the first phase (2007-2010), development and improvement of intrinsic evaluation parameters for the evaluation of proliferation resistance, quantification of evaluation parameters, development of evaluation models, and development of permissible ranges of evaluation parameters were carried out. In the second phase (2010-2012), the generic principle for evaluating PR was established, and technical guidelines, a nuclear material diversion pathway analysis method, and a method to integrate evaluation parameters were developed, which were applied to 5 alternative nuclear fuel cycles to estimate their applicability and objectivity. In addition, measures to enhance the PR of advanced future nuclear energy systems and technical guidelines for PR assessment using intrinsic PR evaluation parameters were developed. Lastly, regulatory requirements were addressed to secure the nonproliferation of nuclear energy systems from the early design stage through operation to decommissioning, which will support the export of newly developed advanced future nuclear energy systems

  1. Development of a database: DACTARI for a radio-toxic element ranking methodology

    International Nuclear Information System (INIS)

    Ansoborlo, E.; Santucci, C.; Grouiller, J.P.; Boucher, L.; Fluery-Herard, A.; Menetrier, F.; Comte, A.; Cook, E.; Moulin, V.

    2007-01-01

    Dosimetric impact studies aim at evaluating potential radiological effects of chronic or acute releases from nuclear facilities. A methodology for ranking radionuclides (RN) in terms of their health-related impact on the human population was first developed at CEA with specific criteria for each RN that could be applied to a variety of situations. It is based, in particular, on applying physico-chemical criteria to the complete RN inventory (present in the release or in the source term) and on applying norms related to radiation protection and chemical toxicology. The initial step consisted in identifying and collecting data necessary to apply the methodology, with reference to a previous database of long-lived radionuclides (LLRN, with half-lives ranging from 30 to 10^14 y) containing 95 radionuclides. The initial results have allowed us to identify missing data and revealed the need to complete the study for both toxic and radio-toxic aspects. This led us to the next step, developing a specific database, Database for Chemical Toxicity and Radiotoxicity Assessment of RadIonuclides (DACTARI), to collect data on chemical toxicity and radiotoxicity, including acute or chronic toxicity, the chemical form of the compounds, the contamination route (ingestion, inhalation), lethal doses, target organs, intestinal and maternal-foetal transfer, drinking water guidelines and the mutagenic and carcinogenic properties. (authors)
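
    To make the kind of data the abstract enumerates concrete, the sketch below shows one possible record layout for such a database. The field names, types and example values are assumptions for illustration; they are not the actual DACTARI schema.

```python
# One possible record layout for a radionuclide chemical/radiotoxicity
# database, using the kinds of fields the abstract lists. Field names and the
# example values are illustrative assumptions, not the actual DACTARI schema.
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class RadionuclideRecord:
    nuclide: str                                   # e.g. "I-129"
    half_life_years: float
    chemical_forms: List[str]                      # compounds considered
    contamination_routes: List[str]                # "ingestion", "inhalation", ...
    target_organs: List[str]
    lethal_dose_mg_per_kg: Optional[float]         # chemical toxicity, if known
    dose_coefficients_Sv_per_Bq: Dict[str, float]  # radiotoxicity, per route
    drinking_water_guideline_Bq_per_L: Optional[float]
    mutagenic: bool = False
    carcinogenic: bool = False

example = RadionuclideRecord(
    nuclide="I-129", half_life_years=1.57e7,
    chemical_forms=["iodide", "iodate"],
    contamination_routes=["ingestion", "inhalation"],
    target_organs=["thyroid"],
    lethal_dose_mg_per_kg=None,
    dose_coefficients_Sv_per_Bq={"ingestion": 1.1e-7, "inhalation": 3.6e-8},
    drinking_water_guideline_Bq_per_L=None,
    carcinogenic=True,
)
print(example.nuclide, example.dose_coefficients_Sv_per_Bq["ingestion"])
```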

  2. The study of methodologies of software development for the next generation of HEP detector software

    International Nuclear Information System (INIS)

    Ding Yuzheng; Wang Taijie; Dai Guiliang

    1997-01-01

    The author discusses the characteristics of the next generation of HEP (High Energy Physics) detector software, and describes the basic strategy for the usage of object oriented methodologies, languages and tools in the development of the next generation of HEP detector software

  3. DEVELOPMENT OF A METHODOLOGY TO ASSESS PROLIFERATION RESISTANCE AND PHYSICAL PROTECTION FOR GENERATION IV SYSTEMS

    International Nuclear Information System (INIS)

    Nishimura, R.; Bari, R.; Peterson, P.; Roglans-Ribas, J.; Kalenchuk, D.

    2004-01-01

    Enhanced proliferation resistance and physical protection (PR and PP) is one of the technology goals for advanced nuclear concepts, such as Generation IV systems. Under the auspices of the Generation IV International Forum, the Office of Nuclear Energy, Science and Technology of the U.S. DOE, the Office of Nonproliferation Policy of the National Nuclear Security Administration, and participating organizations from six other countries are sponsoring an international working group to develop an evaluation methodology for PR and PP. This methodology will permit an objective PR and PP comparison between alternative nuclear systems (e.g., different reactor types or fuel cycles) and support design optimization to enhance robustness against proliferation, theft and sabotage. The paper summarizes the proposed assessment methodology including the assessment framework, measures used to express the PR and PP characteristics of the system, threat definition, system element and target identification, pathway identification and analysis, and estimation of the measures

  4. Comparing Internet Probing Methodologies Through an Analysis of Large Dynamic Graphs

    Science.gov (United States)

    2014-06-01

    ...including DIMES, IPlane, Ark IPv4 All Prefix /24, and recently the NPS probing methodology. The NPS probing methodology is different from the others because it... From each trace, a history of the forward interface-level path and the time to send and acknowledge are available to analyze. However, traceroute may not return

  5. Computational Methodologies for Developing Structure–Morphology–Performance Relationships in Organic Solar Cells: A Protocol Review

    KAUST Repository

    Do, Khanh

    2016-09-08

    We outline a step-by-step protocol that incorporates a number of theoretical and computational methodologies to evaluate the structural and electronic properties of pi-conjugated semiconducting materials in the condensed phase. Our focus is on methodologies appropriate for the characterization, at the molecular level, of the morphology in blend systems consisting of an electron donor and electron acceptor, of importance for understanding the performance properties of bulk-heterojunction organic solar cells. The protocol is formulated as an introductory manual for investigators who aim to study the bulk-heterojunction morphology in molecular detail, thereby facilitating the development of structure-morphology-property relationships when used in tandem with experimental results.

  6. Bioclim deliverable D8a: development of the rule-based down-scaling methodology for BIOCLIM Work-package 3

    International Nuclear Information System (INIS)

    2003-01-01

    The BIOCLIM project on modelling sequential Biosphere systems under Climate change for radioactive waste disposal is part of the EURATOM fifth European framework programme. The project was launched in October 2000 for a three-year period. The project aims at providing a scientific basis and practical methodology for assessing the possible long term impacts on the safety of radioactive waste repositories in deep formations due to climate and environmental change. Five work packages (WP) have been identified to fulfill the project objectives. One of the tasks of BIOCLIM WP3 was to develop a rule-based approach for down-scaling from the MoBidiC model of intermediate complexity in order to provide consistent estimates of monthly temperature and precipitation for the specific regions of interest to BIOCLIM (Central Spain, Central England and Northeast France, together with Germany and the Czech Republic). A statistical down-scaling methodology has been developed by Philippe Marbaix of CEA/LSCE for use with the second climate model of intermediate complexity used in BIOCLIM - CLIMBER-GREMLINS. The rule-based methodology assigns climate states or classes to a point on the time continuum of a region according to a combination of simple threshold values which can be determined from the coarse scale climate model. Once climate states or classes have been defined, monthly temperature and precipitation climatologies are constructed using analogue stations identified from a data base of present-day climate observations. The most appropriate climate classification for BIOCLIM purposes is the Koeppen/Trewartha scheme. This scheme has the advantage of being empirical, but only requires monthly averages of temperature and precipitation as input variables. Section 2 of this deliverable (D8a) outlines how each of the eight methodological steps has been undertaken for each of the three main BIOCLIM study regions (Central England, Northeast France and Central Spain) using Mo
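
    To illustrate what a rule-based assignment of climate classes from coarse model output can look like, the sketch below applies a few simple thresholds to monthly temperature and precipitation. The thresholds and class names are simplified assumptions in the spirit of a Koeppen/Trewartha-type scheme; they are not the rules developed in the deliverable.

```python
# Illustrative rule-based climate classification from monthly mean
# temperature (deg C) and precipitation (mm). Thresholds and class names are
# simplified assumptions, not the BIOCLIM rules.

def climate_class(monthly_t, monthly_p):
    t_min, t_max = min(monthly_t), max(monthly_t)
    p_total = sum(monthly_p)
    warm_months = sum(1 for t in monthly_t if t >= 10.0)

    if t_max < 10.0:
        return "polar"
    if p_total < 400.0:
        return "dry"
    if t_min < -3.0:
        return "boreal" if warm_months < 4 else "continental"
    return "temperate" if warm_months >= 8 else "cool temperate"

# Hypothetical monthly climatology for a mid-latitude site
t = [2, 3, 6, 9, 13, 16, 18, 18, 15, 11, 6, 3]
p = [60, 50, 55, 50, 55, 60, 55, 60, 60, 70, 70, 65]
print(climate_class(t, p))   # -> "cool temperate"
```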

  7. Developing methodological awareness of reading, thinking and writing as knowledge producing practices

    DEFF Research Database (Denmark)

    Katan, Lina Hauge; Baarts, Charlotte

    Developing methodological awareness among university students about reading, thinking and writing as knowledge producing practices Integrated acts of reading, thinking and writing comprise an extensive and extremely significant part of the learning processes through which we produce knowledge...... text books on method and classes too. As a consequence, students have few chances of encountering the practices of reading, thinking and writing depicted as those imperative parts of knowledge making that we as researchers of the humanities and social sciences know them to be. Subsequently, students...... are not taught to understand reading, thinking and writing as central practices of research, nor do they come to develop methodological awareness about them as such. In this paper, we report on our endeavour to design and develop a course offered for under- and graduate students, with the aim...

  8. System assessment using modular logic fault tree methodology

    International Nuclear Information System (INIS)

    Troncoso Fleitas, M.

    1996-01-01

    In the process of a Probabilistic Safety Analysis (PSA) study, a large number of fault trees are generated by different specialists. The Modular Logic Fault Tree Methodology paves the way to systematize the procedures and to unify the criteria in the process of system modelling. An example of the application of this methodology is shown
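
    As a minimal illustration of evaluating fault-tree modules, the sketch below combines basic-event probabilities through AND/OR gates under an independence assumption. The example system and numbers are hypothetical, and the code is not the PSA tooling referred to in the abstract.

```python
# Evaluating a small fault tree of AND/OR gates from basic-event
# probabilities, assuming independent events. Hypothetical example only.

def or_gate(*probs):
    """P(at least one event occurs) for independent events."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(*probs):
    """P(all events occur) for independent events."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical top event: (pump A fails AND pump B fails) OR power fails.
p_pump_a, p_pump_b, p_power = 1e-2, 1e-2, 1e-4
p_top = or_gate(and_gate(p_pump_a, p_pump_b), p_power)
print(f"top event probability ~ {p_top:.2e}")
```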

  9. Using model based systems engineering for the development of the Large Synoptic Survey Telescope's operational plan

    Science.gov (United States)

    Selvy, Brian M.; Claver, Charles; Willman, Beth; Petravick, Don; Johnson, Margaret; Reil, Kevin; Marshall, Stuart; Thomas, Sandrine; Lotz, Paul; Schumacher, German; Lim, Kian-Tat; Jenness, Tim; Jacoby, Suzanne; Emmons, Ben; Axelrod, Tim

    2016-08-01

    We provide an overview of the Model Based Systems Engineering (MBSE) language, tool, and methodology being used in our development of the Operational Plan for Large Synoptic Survey Telescope (LSST) operations. LSST's Systems Engineering (SE) team is using a model-based approach to operational plan development to: 1) capture the top-down stakeholders' needs and functional allocations defining the scope, required tasks, and personnel needed for operations, and 2) capture the bottom-up operations and maintenance activities required to conduct the LSST survey across its distributed operations sites for the full ten-year survey duration. To accomplish these complementary goals and ensure that they result in self-consistent results, we have developed a holistic approach using the Sparx Enterprise Architect modeling tool and Systems Modeling Language (SysML). This approach utilizes SysML Use Cases, Actors, associated relationships, and Activity Diagrams to document and refine all of the major operations and maintenance activities that will be required to successfully operate the observatory and meet stakeholder expectations. We have developed several customized extensions of the SysML language, including the creation of a custom stereotyped Use Case element with unique tagged values, as well as unique association connectors and Actor stereotypes. We demonstrate that this customized MBSE methodology enables us to define: 1) the roles each human Actor must take on to successfully carry out the activities associated with the Use Cases; 2) the skills each Actor must possess; 3) the functional allocation of all required stakeholder activities and Use Cases to organizational entities tasked with carrying them out; and 4) the organization structure required to successfully execute the operational survey. Our approach allows for continual refinement utilizing the systems engineering spiral method to expose finer levels of detail as necessary. For example, the bottom-up, Use Case

  10. Diffusion of innovative agricultural production systems for sustainable development of small islands: A methodological approach based on the science of complexity

    Science.gov (United States)

    Barbera, Guiseppe; Butera, Federico M.

    1992-09-01

    In order to develop small islands, not only must a vital agricultural system be maintained, but the range of opportunities for tourism must be increased with respect to both the seaside and the environmental features of the rural landscape. As an alternative to the traditional and economically declining ones, many innovative production processes can be identified, but their success depends on their interaction with the physical, biological, economic and social environment. In order to identify the main nodes and the most critical interactions, so as to increase the probability of success of a new productive process, a methodological approach based on the science of complexity is proposed for the cultivation of capers (Capparis spinosa L.) on the island of Pantelleria. The methodology encompasses the identification of the actors and factors involved, the quantitative evaluation of their interactions with the different stages of the productive process, and a quasi-quantitative evaluation of the probability that a particular action will be performed successfully. The study of “traditional,” “modernized,” and “modernized-sustainable” processes shows that the modernized-sustainable process offers mutually reinforcing opportunities in terms of an integrated development of high-quality agricultural products and the enhancement of environmental features, in conjunction with high-efficiency production techniques, in a way that suits the development of Pantelleria. There is a high probability of failure, however, as a result of the large number of critical factors. Nevertheless, the present study indicates which activities will enhance the probability of successful innovation in the production process.

  11. Methodological development of the process of appreciation of photography: Conceptions

    Directory of Open Access Journals (Sweden)

    Yovany Álvarez García

    2012-12-01

    Full Text Available This article discusses the different conceptions that are used in the methodological appreciation of photography. Since photography is one of the manifestations of the visual arts with which people most commonly interact daily, being found in books, magazines and other publications, the article discusses various methodologies to assess the photographic image. It also addresses the classic themes of photography as well as some expressive elements.

  12. Methodology for urban rail and construction technology research and development planning

    Science.gov (United States)

    Rubenstein, L. D.; Land, J. E.; Deshpande, G.; Dayman, B.; Warren, E. H.

    1980-01-01

    A series of transit system visits, organized by the American Public Transit Association (APTA), was conducted in which the system operators identified the most pressing development needs. These varied by property and were reformulated into a series of potential projects. To assist in the evaluation, a data base useful for estimating the present capital and operating costs of various transit system elements was generated from published data. An evaluation model was developed which considered the rate of deployment of the research and development project, potential benefits, development time and cost. An outline of an evaluation methodology that considered benefits other than capital and operating cost savings was also presented. During the course of the study, five candidate projects were selected for detailed investigation; (1) air comfort systems; (2) solid state auxiliary power conditioners; (3) door systems; (4) escalators; and (5) fare collection systems. Application of the evaluation model to these five examples showed the usefulness of modeling deployment rates and indicated a need to increase the scope of the model to quantitatively consider reliability impacts.

  13. Development of a system dynamics model based on Six Sigma methodology

    Directory of Open Access Journals (Sweden)

    José Jovani Cardiel Ortega

    2017-01-01

    Full Text Available A dynamic model to analyze the complexity associated with manufacturing systems and to improve the performance of the process through the Six Sigma philosophy is proposed. The research focuses on the implementation of the system dynamics tool to comply with each of the phases of the DMAIC methodology. In the first phase, define, the problem is articulated, collecting data, selecting the variables, and representing them in a mental map that helps build the dynamic hypothesis. In the second phase, measure, the model is formulated, equations are developed, and the Forrester diagram is built to carry out the simulation. In the third phase, analyze, the simulation results are studied. In the fourth phase, improve, the model is validated through a sensitivity analysis. Finally, in the control phase, operation policies are proposed. This paper presents the development of a dynamic model of a knitted textile production system; the implementation was done in a textile company in southern Guanajuato. The results show an improvement in process performance by increasing the sigma level, allowing the validation of the proposed approach.
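
    The measure-phase step of turning the Forrester diagram into equations can be pictured with a toy stock-and-flow simulation integrated with Euler steps, as in the sketch below. The structure (a single work-in-process stock with a production inflow and a capacity-limited outflow split by a defect fraction) and all parameter values are illustrative assumptions, not the company's model.

```python
# Toy stock-and-flow sketch, Euler-integrated: a work-in-process (WIP) stock
# fed by a production inflow and drained by a capacity-limited outflow that
# is split into good output and defects. All parameters are illustrative.

dt = 1.0                      # time step, days
production_rate = 100.0       # pieces/day entering the line
capacity = 95.0               # pieces/day that can be finished
defect_fraction = 0.08        # share of finished pieces that are defective

wip = 0.0
good_total = 0.0
for day in range(30):
    finished = min(capacity, wip / dt + production_rate)    # outflow
    wip += (production_rate - finished) * dt                 # stock update
    good_total += finished * (1.0 - defect_fraction) * dt

print(f"WIP after 30 days: {wip:.1f} pieces, good output: {good_total:.0f}")
```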

  14. Quantifying reactor safety margins: Application of code scaling, applicability, and uncertainty evaluation methodology to a large-break, loss-of-coolant accident

    International Nuclear Information System (INIS)

    Boyack, B.; Duffey, R.; Wilson, G.; Griffith, P.; Lellouche, G.; Levy, S.; Rohatgi, U.; Wulff, W.; Zuber, N.

    1989-12-01

    The US Nuclear Regulatory Commission (NRC) has issued a revised rule for loss-of-coolant accident/emergency core cooling system (ECCS) analysis of light water reactors to allow the use of best-estimate computer codes in safety analysis as an option. A key feature of this option requires the licensee to quantify the uncertainty of the calculations and include that uncertainty when comparing the calculated results with acceptance limits provided in 10 CFR Part 50. To support the revised ECCS rule and illustrate its application, the NRC and its contractors and consultants have developed and demonstrated an uncertainty evaluation methodology called code scaling, applicability, and uncertainty (CSAU). The CSAU methodology and an example application described in this report demonstrate that uncertainties in complex phenomena can be quantified. The methodology is structured, traceable, and practical, as is needed in the regulatory arena. The methodology is systematic and comprehensive as it addresses and integrates the scenario, experiments, code, and plant to resolve questions concerned with: (a) code capability to scale-up processes from test facility to full-scale nuclear power plants; (b) code applicability to safety studies of a postulated accident scenario in a specified nuclear power plant; and (c) quantifying uncertainties of calculated results. 127 refs., 55 figs., 40 tabs
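
    One way to picture the uncertainty-quantification step is a simple Monte Carlo propagation: sample the uncertain inputs, run the code for each sample, and report an upper percentile of the figure of merit. In the sketch below the system code is replaced by an invented surrogate for peak cladding temperature, and the input ranges, coefficients and percentile choice are illustrative assumptions; none of it reproduces the CSAU models or values.

```python
# Monte Carlo propagation sketch: sample uncertain inputs, evaluate a
# stand-in surrogate for peak cladding temperature (PCT), report an upper
# percentile. Surrogate, ranges and coefficients are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

power_peaking = rng.normal(1.0, 0.05, n)      # relative (hypothetical)
gap_conductance = rng.uniform(0.8, 1.2, n)    # relative multiplier
break_discharge = rng.uniform(0.9, 1.1, n)    # relative multiplier

def surrogate_pct(peaking, gap, discharge):
    """Stand-in for a full system-code run; returns a fictitious PCT in K."""
    return (1000.0 + 400.0 * (peaking - 1.0) - 150.0 * (gap - 1.0)
            + 200.0 * (discharge - 1.0) + rng.normal(0.0, 10.0))

pct = np.array([surrogate_pct(p, g, d)
                for p, g, d in zip(power_peaking, gap_conductance,
                                   break_discharge)])
print(f"mean PCT: {pct.mean():.0f} K, 95th percentile: "
      f"{np.percentile(pct, 95):.0f} K")
```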

  15. Use of response surface methodology for development of new microwell-based spectrophotometric method for determination of atorvastatin calcium in tablets

    Directory of Open Access Journals (Sweden)

    Wani Tanveer A

    2012-11-01

    Full Text Available Abstract Background Response surface methodology by Box–Behnken design employing the multivariate approach enables substantial improvement in method development using fewer experiments, without wastage of large volumes of organic solvents, which leads to high analysis cost. This methodology has not been employed for development of a method for analysis of atorvastatin calcium (ATR-Ca). Results The present research study describes the use of response surface methodology in the optimization and validation of a new microwell-based UV-Visible spectrophotometric method for the determination of ATR-Ca in its tablets. By the use of quadratic regression analysis, equations were developed to describe the behavior of the response as simultaneous functions of the selected independent variables. Accordingly, the optimum conditions were determined, which included the concentration of 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ), the time of reaction and the temperature. The absorbance of the colored CT complex was measured at 460 nm by a microwell-plate absorbance reader. The method was validated, in accordance with ICH guidelines, for accuracy, precision, selectivity and linearity (r² = 0.9993) over the concentration range of 20–200 μg/ml. The assay was successfully applied to the analysis of ATR-Ca in its pharmaceutical dosage forms with good accuracy and precision. Conclusion The assay described herein has great practical value in the routine analysis of ATR-Ca in quality control laboratories, as it has high throughput, consumes a minimum volume of organic solvent (thus reducing the analysts' exposure to the toxic effects of organic solvents and offering an environmentally friendly "Green" approach) and reduces the analysis cost by 50-fold.
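
    The quadratic-regression step can be sketched as fitting a second-order response surface to the three coded factors (DDQ concentration, reaction time, temperature) and searching it for the predicted optimum. The design points and responses below are invented for illustration and the fit is a generic least-squares one, not the authors' actual data or software.

```python
# Sketch of the response-surface step: fit a quadratic model of absorbance
# versus three coded factors and locate the predicted optimum on a grid.
# Design points and responses are invented for illustration.
import itertools
import numpy as np

def quad_features(x):
    x1, x2, x3 = x
    return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1*x1, x2*x2, x3*x3]

# Coded levels (-1, 0, +1): edge midpoints plus centre points, in the style
# of a three-factor Box-Behnken design.
X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
              [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
              [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
              [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
y = np.array([0.41, 0.52, 0.47, 0.58, 0.39, 0.50, 0.45, 0.55,
              0.43, 0.49, 0.46, 0.53, 0.60, 0.59, 0.61])

A = np.array([quad_features(x) for x in X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

grid = np.linspace(-1, 1, 21)
best = max(itertools.product(grid, repeat=3),
           key=lambda x: float(np.dot(quad_features(x), beta)))
print("predicted optimum (coded DDQ, time, temperature):", best)
```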

  16. Use of response surface methodology for development of new microwell-based spectrophotometric method for determination of atorvastatin calcium in tablets

    Science.gov (United States)

    2012-01-01

    Background Response surface methodology by Box–Behnken design employing the multivariate approach enables substantial improvement in method development using fewer experiments, without wastage of large volumes of organic solvents, which leads to high analysis cost. This methodology has not been employed for development of a method for analysis of atorvastatin calcium (ATR-Ca). Results The present research study describes the use of response surface methodology in the optimization and validation of a new microwell-based UV-Visible spectrophotometric method for the determination of ATR-Ca in its tablets. By the use of quadratic regression analysis, equations were developed to describe the behavior of the response as simultaneous functions of the selected independent variables. Accordingly, the optimum conditions were determined, which included the concentration of 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ), the time of reaction and the temperature. The absorbance of the colored CT complex was measured at 460 nm by a microwell-plate absorbance reader. The method was validated, in accordance with ICH guidelines, for accuracy, precision, selectivity and linearity (r² = 0.9993) over the concentration range of 20–200 μg/ml. The assay was successfully applied to the analysis of ATR-Ca in its pharmaceutical dosage forms with good accuracy and precision. Conclusion The assay described herein has great practical value in the routine analysis of ATR-Ca in quality control laboratories, as it has high throughput, consumes a minimum volume of organic solvent (thus reducing the analysts' exposure to the toxic effects of organic solvents and offering an environmentally friendly "Green" approach) and reduces the analysis cost by 50-fold. PMID:23146143

  17. Development of a Graphical Tool to integrate the Prometheus AEOlus methodology and Jason Platform

    Directory of Open Access Journals (Sweden)

    Rafhael CUNHA

    2017-07-01

    Full Text Available Software Engineering (SE) is an area that intends to build high-quality software in a systematic way. However, traditional software engineering techniques and methods do not support the demand for developing Multiagent Systems (MAS). Therefore a new subarea has been studied, called Agent Oriented Software Engineering (AOSE). The AOSE area proposes solutions to issues related to the development of agent oriented systems. There is still no standardization in this subarea, resulting in several methodologies. Another issue in this subarea is that there are very few tools that are able to automatically generate code. In this work we propose a tool to support the Prometheus AEOlus Methodology, because it provides modelling artifacts for all MAS dimensions: agents, environment, interaction, and organization. The tool supports all Prometheus AEOlus artifacts and can automatically generate code for the agent and interaction dimensions in the AgentSpeak Language, which is the language used in the Jason Platform. We have done some validations with the proposed tool, and a case study is presented.

  18. CONSIDERATIONS ON THE METHODOLOGY FOR IDENTIFYING AND PRIORITIZING PUBLIC INVESTMENT PROJECTS IN ROMANIA

    Directory of Open Access Journals (Sweden)

    Marilena BOGHEANU

    2012-01-01

    Full Text Available The optimization of the investment project pipeline of public administration is a key issue for member states, including Romania, in the current European Union policy. Solving this problem depends largely on the establishment of an appropriate methodological framework for identifying and prioritizing projects and development programs. Based on the new requirements imposed by the European Commission for the next financial cycle 2014-2020, in this article we propose solutions for improving the methodological framework by establishing general and specific criteria for evaluating projects, especially in the ex-ante stage. Our research is based on a critical analysis of the current situation, including a sample survey. The results obtained were materialized in an improved methodology for selecting and prioritizing projects that can contribute to a stable and uniform mechanism for planning at the local level.

  19. Contextual factors, methodological principles and teacher cognition

    Directory of Open Access Journals (Sweden)

    Rupert Walsh

    2014-01-01

    Full Text Available Teachers in various contexts worldwide are sometimes unfairly criticized for not putting teaching methods developed for the well-resourced classrooms of Western countries into practice. Factors such as the teachers’ “misconceptualizations” of “imported” methods, including Communicative Language Teaching (CLT, are often blamed, though the challenges imposed by “contextual demands,” such as large class sizes, are sometimes recognised. Meanwhile, there is sometimes an assumption that in the West there is a happy congruence between policy supportive of CLT or Task-Based Language Teaching, teacher education and supervision, and curriculum design with teachers’ cognitions and their practices. Our case study of three EFL teachers at a UK adult education college is motivated by a wish to question this assumption. Findings from observational and interview data suggest the practices of two teachers were largely consistent with their methodological principles, relating to stronger and weaker forms of CLT respectively, as well as to more general educational principles, such as a concern for learners; the supportive environment seemed to help. The third teacher appeared to put “difficult” contextual factors, for example, tests, ahead of methodological principles without, however, obviously benefiting. Implications highlight the important role of teacher cognition research in challenging cultural assumptions.

  20. Mapping plant species ranges in the Hawaiian Islands: developing a methodology and associated GIS layers

    Science.gov (United States)

    Price, Jonathan P.; Jacobi, James D.; Gon, Samuel M.; Matsuwaki, Dwight; Mehrhoff, Loyal; Wagner, Warren; Lucas, Matthew; Rowe, Barbara

    2012-01-01

    This report documents a methodology for projecting the geographic ranges of plant species in the Hawaiian Islands. The methodology consists primarily of the creation of several geographic information system (GIS) data layers depicting attributes related to the geographic ranges of plant species. The most important spatial-data layer generated here is an objectively defined classification of climate as it pertains to the distribution of plant species. By examining previous zonal-vegetation classifications in light of spatially detailed climate data, broad zones of climate relevant to contemporary concepts of vegetation in the Hawaiian Islands can be explicitly defined. Other spatial-data layers presented here include the following: substrate age, as large areas of the island of Hawai'i, in particular, are covered by very young lava flows inimical to the growth of many plant species; biogeographic regions of the larger islands that are composites of multiple volcanoes, as many of their species are restricted to a given topographically isolated mountain or a specified group of them; and human impact, which can reduce the range of many species relative to where they formerly were found. Other factors influencing the geographic ranges of species that are discussed here but not developed further, owing to limitations in rendering them spatially, include topography, soils, and disturbance. A method is described for analyzing these layers in a GIS, in conjunction with a database of species distributions, to project the ranges of plant species, which include both the potential range prior to human disturbance and the projected present range. Examples of range maps for several species are given as case studies that demonstrate different spatial characteristics of range. Several potential applications of species-range maps are discussed, including facilitating field surveys, informing restoration efforts, studying range size and rarity, studying biodiversity, managing

  1. Future development of large steam turbines

    International Nuclear Information System (INIS)

    Chevance, A.

    1975-01-01

    An attempt is made to forecast the future of large steam turbines up to 1985. Three parameters affect the development of large turbines: 1) unit output: an output of 2000 to 2500 MW may be scheduled; 2) steam quality: two steam qualities may be considered, medium-pressure saturated or slightly superheated steam (light water, heavy water) with a low enthalpic drop, and high-pressure, high-temperature steam with a high enthalpic drop; and 3) the quality of the cooling supply. The largest range to be considered might be: open-system cooling for sea sites, wet-tower cooling and dry-tower cooling. Bi-fluid cooling cycles should also be mentioned. From the study of these influencing factors, it appears that the constructor, for an output of about 2500 MW, should have at his disposal the following: two construction technologies for inlet parts and for high- and intermediate-pressure parts corresponding to both steam qualities; and exhaust sections suitable for the different qualities of cooling supply. The two construction technologies with the two steam qualities already exist and involve no major developments. But the exhaust section raises the question of rotational speed [fr

  2. Demonstration of an infiltration evaluation methodology

    International Nuclear Information System (INIS)

    Smyth, J.D.; Gee, G.W.; Kincaid, C.T.; Nichols, W.M.; Bresler, E.

    1990-07-01

    An Infiltration Evaluation Methodology (IEM) was developed for the US Nuclear Regulatory Commission (NRC) by Pacific Northwest Laboratory (PNL) to provide a consistent, well formulated approach for evaluating drainage through engineered covers at low-level radioactive waste (LLW) sites. The methodology is designed to help evaluate the ability of proposed waste site covers to minimize drainage for LLW site license applications and for sites associated with the Uranium Mill Tailings Remedial Action (UMTRA) program. The objective of this methodology is to estimate the drainage through an engineered burial site cover system. The drainage estimate can be used as an input to a broader performance assessment methodology currently under development by the NRC. The methodology is designed to simulate, at the field scale, significant factors and hydrologic conditions which determine or influence estimates of infiltration, long-term moisture content profiles, and drainage from engineered covers and barriers. The IEM developed under this study acknowledges the uncertainty inherent in soil properties and quantifies the influence of such uncertainty on the estimates of drainage in engineered cover systems at waste disposal sites. 6 refs., 1 fig
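
    At its simplest, a screening estimate of drainage through a cover is an annual water balance, as in the sketch below. This is only a crude stand-in for the field-scale unsaturated-flow simulation the IEM actually calls for, and every number in the example is invented.

```python
# Crude annual water-balance sketch of drainage through an engineered cover:
# drainage ~ precipitation - runoff - evapotranspiration - storage change.
# All numbers are invented; the IEM itself relies on field-scale simulation.

def annual_drainage_mm(precip_mm: float, runoff_mm: float,
                       evapotranspiration_mm: float,
                       storage_change_mm: float) -> float:
    return max(0.0, precip_mm - runoff_mm - evapotranspiration_mm
               - storage_change_mm)

# Hypothetical semi-arid site
d = annual_drainage_mm(precip_mm=220.0, runoff_mm=15.0,
                       evapotranspiration_mm=190.0, storage_change_mm=5.0)
print(f"estimated drainage through the cover: {d:.0f} mm/yr")
```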

  3. Development of the processing software package for RPV neutron fluence determination methodology

    International Nuclear Information System (INIS)

    Belousov, S.; Kirilova, K.; Ilieva, K.

    2001-01-01

    According to the INRNE methodology, the neutron transport calculation is carried out in two steps. In the first step, a reactor core eigenvalue calculation is performed. This calculation is used for determination of the fixed source for the next step, the calculation of neutron transport from the reactor core to the RPV. Both calculation steps are performed by state-of-the-art, tested codes. The interface software package DOSRC developed at INRNE is used as a link between these two calculations. The package transforms reactor core calculation results into neutron source input data in a format appropriate for the neutron transport codes (DORT, TORT and ASYNT) based on the discrete ordinates method. These codes are applied for calculation of the RPV neutron flux and its responses - induced activity, radiation damage, neutron fluence etc. For more precise estimation of the neutron fluence, the INRNE methodology has been supplemented by the following improvements: - implementation of more advanced codes (PYTHIA/DERAB) for neutron-physics parameter calculations; - more detailed neutron source presentation; - verification of neutron fluence by statistically treated experimental data. (author)

  4. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  5. A gamma heating calculation methodology for research reactor application

    International Nuclear Information System (INIS)

    Lee, Y.K.; David, J.C.; Carcreff, H.

    2001-01-01

    Gamma heating is an important issue in research reactor operation and fuel safety. Heat deposition in irradiation targets and temperature distribution in irradiation facility should be determined so as to obtain the optimal irradiation conditions. This paper presents a recently developed gamma heating calculation methodology and its application on the research reactors. Based on the TRIPOLI-4 Monte Carlo code under the continuous-energy option, this new calculation methodology was validated against calorimetric measurements realized within a large ex-core irradiation facility of the 70 MWth OSIRIS materials testing reactor (MTR). The contributions from prompt fission neutrons, prompt fission γ-rays, capture γ-rays and inelastic γ-rays to heat deposition were evaluated by a coupled (n, γ) transport calculation. The fission product decay γ-rays were also considered but the activation γ-rays were neglected in this study. (author)

  6. On Methodology and (Post)Colonialism in Tourism Promotional Language

    Directory of Open Access Journals (Sweden)

    Ana Crăciunescu

    2016-07-01

    Full Text Available The greatest challenge of the present paper is the research paradox that tourism embeds within its economic and historical realities. Whilst tributary to Western research methodologies, we realized that an impartial analysis of tourism studies, which is essentially postcolonial, presupposes a pertinent review of the "marginal" state of the art in the field. As scholars argue, tourism has developed a language of its own in order to promote cultural patterns of authenticity. This universal lexicon is largely rooted in myth and otherness, which leads us to the exemplification, from a semio-linguistic perspective, of a paradisiacal destination (Jamaica) included in such categories by the normative Westerner. We also touch upon gender issues on the larger colonialist background that the relation Self-Other, Orient-Occident, Colonizer-Colonized raises, even at a methodological level, in recent literature.

  7. Water Quality Research Program: Development of Unstructured Grid Linkage Methodology and Software for CE-QUAL-ICM

    National Research Council Canada - National Science Library

    Chapman, Raymond

    1997-01-01

    This study was conducted for the purpose of developing a methodology and associated software for linking hydrodynamic output from the RMA10 finite element model to the CE-QUAL-ICM finite volume water quality model...

  8. Development of safety analysis methodology for moderator system failure of CANDU-6 reactor by thermal-hydraulics/physics coupling

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Jin, Dong Sik; Chang, Soon Heung

    2013-01-01

    Highlights: • Developed a new safety analysis methodology for moderator system failures of CANDU-6. • The new methodology uses a thermalhydraulics-physics coupling concept. • The thermalhydraulic code is CATHENA; the physics code is RFSP-IST. • A moderator system failure ends in subcriticality through self-shutdown. -- Abstract: A new safety analysis methodology for moderator system failure of the CANDU-6 nuclear power plant (NPP) has been developed using coupling technology between the thermalhydraulic code CATHENA and the reactor core physics code RFSP-IST. This sophisticated methodology can replace the legacy methodology using MODSTBOIL and SMOKIN-G2 in the fields of thermalhydraulics and reactor physics, respectively. The CATHENA thermalhydraulic model of the moderator system can simulate the thermalhydraulic behavior of all the moderator systems, such as the calandria tank, head tank, moderator circulating circuit and cover gas circulating circuit, and can also predict the thermalhydraulic properties of the moderator, such as moderator density, temperature and water level in the calandria tank, as the moderator system failure progresses. These calculated moderator thermalhydraulic properties are provided as inputs to the 3-dimensional neutron kinetics solution module – CERBRRS – of RFSP-IST, which predicts the change of the reactor power and provides the calculated reactor power back to CATHENA. These coupling calculations are performed at 2 s time steps, which are equivalent to the slow control of the CANDU-6 reactor regulating system (RRS). The safety analysis results using this coupling methodology reveal that reactor operation enters a self-shutdown mode, without any engineered safety system and/or human intervention, for the postulated moderator system failures of loss of heat sink and loss of moderator inventory, respectively
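
    The explicit two-code exchange described above can be pictured as a simple loop that alternates a thermal-hydraulics step and a kinetics step every 2 s, passing moderator properties one way and reactor power the other. The two step functions in the sketch are crude placeholders with made-up feedback coefficients; they stand in for, and do not represent, CATHENA or RFSP-IST.

```python
# Schematic coupling loop: every 2 s the thermal-hydraulics step returns
# moderator properties, which feed the kinetics step, whose power feeds the
# next thermal-hydraulics step. Both step functions are crude placeholders.

def thermalhydraulics_step(power_mw, state):
    """Placeholder: moderator heats up and level drops as power is deposited."""
    state["temperature_c"] += 0.002 * power_mw
    state["level_m"] -= 0.0005
    return state

def kinetics_step(state, power_mw):
    """Placeholder: power falls as moderator-level feedback grows."""
    feedback = 1.0 - 0.01 * max(0.0, 70.0 - state["level_m"] * 10.0)
    return power_mw * max(feedback, 0.0)

state = {"temperature_c": 60.0, "level_m": 7.0}
power = 2000.0                       # MW, illustrative initial power
for t in range(0, 60, 2):            # 2 s coupling interval, 60 s simulated
    state = thermalhydraulics_step(power, state)
    power = kinetics_step(state, power)

print(f"power after 60 s: {power:.0f} MW, "
      f"moderator level: {state['level_m']:.2f} m")
```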

  9. Engendering Development: Some Methodological Perspectives on Child Labour

    Directory of Open Access Journals (Sweden)

    Erica Burman

    2006-01-01

    Full Text Available In this article I address when and why it is useful to focus on gender in the design and conceptualisation of developmental psychological research. Since methodological debates treated in the abstract tend to lack both the specificity and rigour that application to a particular context or topic imports, I take a particular focus for my discussion: child labour. In doing so I hope to highlight the analytical and practical gains of bringing gendered agendas alongside, and into, developmental research. While child labour may seem a rather curious topic for discussion of developmental psychological research practice, this article will show how it indicates with particular clarity issues that mainstream psychological research often occludes or forgets. In particular, I explore analytical and methodological benefits of exploring the diverse ways gender structures notions of childhood, alongside the developmental commonalities and asymmetries of gender and age as categories. I suggest that the usually assumed elision between women and children is often unhelpful for both women and children. Instead, an analytical attention to the shifting forms and relations of children's work facilitates more differentiated perspectives on how its meanings reflect economic and cultural (including gendered) conditions, and so attends better to social inequalities. These inequalities also structure the methodological conditions and paradigms for research with children, and so the article finishes by elaborating from this discussion of child labour four key principles for engendering psychological research with and about children, which also have broader implications for conceptualisations of the relations between gender, childhood, culture and families. URN: urn:nbn:de:0114-fqs060111

  10. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects, such as the construction of roads, railways and other civil engineering (water)works, are tendered differently today than a decade ago. The traditional workflow requested quotes from construction companies for construction works where the works to be

  11. Development of methodology for early detection of BWR instabilities

    International Nuclear Information System (INIS)

    Alessandro Petruzzi; Shin Chin; Kostadin Ivanov; Asok Ray; Fan-Bill Cheung

    2005-01-01

    Full text of publication follows: The objective of the research presented in this paper, which is supported by the US Department of Energy under the NEER program, is to develop an early anomaly detection methodology in order to enhance the safety, availability, and operational flexibility of Boiling Water Reactor (BWR) nuclear power plants. The technical approach relies on suppression of potential power oscillations in BWRs by detecting small anomalies at an early stage and taking appropriate prognostic actions based on an anticipated operation schedule. The model of coupled (two-phase) thermal-hydraulic and neutron flux dynamics, based on the US NRC coupled code TRACE/PARCS, is being utilized as a generator of time series data for anomaly detection at an early stage. The concept of the methodology is based on the fact that nonlinear systems show bifurcation, which is a change in the qualitative behavior as the system parameters vary. Some of these parameters may change of their own accord and account for the anomaly, while certain parameters can be altered in a controlled fashion. The non-linear, non-autonomous BWR system model considered in this research exhibits phenomena at two time scales. Anomalies occur at the slow time scale, while the observation of the dynamical behavior, based on which inferences are made, takes place at the fast time scale. It is assumed that: (i) the system behavior is stationary at the fast time scale; and (ii) any observable non-stationary behavior is associated with parametric changes evolving at the slow time scale. The goal is to make inferences about evolving anomalies based on the asymptotic behavior derived from the computer simulation. However, only sufficient changes in the slowly varying parameter may lead to a detectable difference in the asymptotic behavior. The need to detect such small changes in parameters, and hence the early detection of an anomaly, motivates the stimulus-response approach utilized here. In this approach, the model

  12. Development of new methodology for dose calculation in photographic dosimetry

    International Nuclear Information System (INIS)

    Daltro, T.F.L.

    1994-01-01

    A new methodology for equivalent dose calculations has been developed at IPEN-CNEN/SP to be applied at the Photographic Dosimetry Laboratory, using artificial intelligence techniques by means of a neural network. The research was orientated towards the optimization of the whole set of parameters involved in the film processing, going from the irradiation in order to obtain the calibration curve up to the optical density readings. The learning of the neural network was performed by taking the readings of optical density from the calibration curve as input and the effective energy and equivalent dose as output. The results obtained in the intercomparison show an excellent agreement with the actual values of dose and energy given by the National Metrology Laboratory of Ionizing Radiation. (author)
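
    The idea of the calibration can be sketched as a small regression network mapping optical-density readings (for example under several filters) to equivalent dose and effective energy. The sketch below trains a generic scikit-learn multilayer perceptron on synthetic data; the filter responses, ranges and network size are all assumptions, not the laboratory's actual calibration model.

```python
# Sketch: a regression network mapping film optical densities under three
# filters to (equivalent dose, effective energy). Training data are synthetic
# and the network is a generic MLP, not the laboratory's calibration model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n = 500
dose = rng.uniform(0.2, 50.0, n)        # mSv (illustrative range)
energy = rng.uniform(20.0, 1250.0, n)   # keV (illustrative range)

# Fake optical densities under three filters with energy-dependent response
od = np.column_stack([
    np.log1p(dose) * (1.0 + 30.0 / energy),
    np.log1p(dose) * (1.0 + 5.0 / np.sqrt(energy)),
    np.log1p(dose) * (1.0 + 0.0005 * energy),
]) + rng.normal(0.0, 0.01, (n, 3))

targets = np.column_stack([dose, energy])
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000,
                     random_state=0).fit(od, targets)

pred_dose, pred_energy = model.predict(od[:1])[0]
print(f"predicted {pred_dose:.1f} mSv, {pred_energy:.0f} keV "
      f"(true {dose[0]:.1f} mSv, {energy[0]:.0f} keV)")
```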

  13. Development of new methodology for dose calculation in photographic dosimetry

    International Nuclear Information System (INIS)

    Daltro, T.F.L.; Campos, L.L.

    1994-01-01

    A new methodology for equivalent dose calculation has been developed at IPEN-CNEN/SP to be applied at the Photographic Dosimetry Laboratory, using artificial intelligence techniques by means of a neural network. The research was oriented towards the optimization of the whole set of parameters involved in the film processing, going from the irradiation, performed in order to obtain the calibration curve, up to the optical density readings. The learning of the neural network was performed by taking the optical density readings from the calibration curve as input and the effective energy and equivalent dose as output. The results obtained in the intercomparison show an excellent agreement with the actual values of dose and energy given by the National Metrology Laboratory of Ionizing Radiation
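
    The mapping described above (optical density readings in, effective energy and equivalent dose out) can be sketched with a small multi-layer perceptron. The sketch below uses scikit-learn on synthetic data; the toy energy and dose rules, the number of filters and the units are assumptions for illustration, not the laboratory's calibration curves.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for calibration-curve readings: each row holds optical
# densities behind a set of film filters; targets are effective energy (keV)
# and equivalent dose (mSv), generated from toy rules.
rng = np.random.default_rng(42)
n_films, n_filters = 200, 4
X = rng.uniform(0.1, 3.0, size=(n_films, n_filters))        # optical densities
y = np.column_stack([
    30 + 50 * X[:, 0] / X[:, 1],                             # toy "energy" rule
    0.2 * X.sum(axis=1),                                     # toy "dose" rule
])

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
)
model.fit(X, y)

new_readings = np.array([[0.8, 1.1, 0.5, 0.9]])
energy_kev, dose_msv = model.predict(new_readings)[0]
print(f"estimated effective energy ~ {energy_kev:.1f} keV, dose ~ {dose_msv:.2f} mSv")
```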

  14. Methodological development for selection of significant predictors explaining fatal road accidents.

    Science.gov (United States)

    Dadashova, Bahar; Arenas-Ramírez, Blanca; Mira-McWilliams, José; Aparicio-Izquierdo, Francisco

    2016-05-01

    Identification of the most relevant factors for explaining road accident occurrence is an important issue in road safety research, particularly for future decision-making processes in transport policy. However, model selection for this particular purpose is still ongoing research. In this paper we propose a methodological development for model selection which addresses both explanatory variable selection and adequate model selection. A variable selection procedure, the TIM (two-input model) method, is carried out by combining neural network design and statistical approaches. The error structure of the fitted model is assumed to follow an autoregressive process. All models are estimated using the Markov Chain Monte Carlo method, where the model parameters are assigned non-informative prior distributions. The final model is built using the results of the variable selection. For the application of the proposed methodology, the number of fatal accidents in Spain during 2000-2011 was used. This indicator has experienced the largest reduction internationally during the indicated years, making it an interesting time series from a road safety policy perspective. Hence the identification of the variables that have affected this reduction is of particular interest for future decision making. The results of the variable selection process show that the selected variables are main subjects of road safety policy measures. Published by Elsevier Ltd.
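
    A minimal sketch of the two-input screening idea follows: every pair of candidate predictors is fitted with a small neural network and the pairs are ranked by cross-validated error, so that variables appearing in the best pairs are carried forward. This is a simplified reading of the TIM step only; the Bayesian autoregressive estimation by MCMC described above is not reproduced, and the predictor names and toy data are assumptions.

```python
import itertools
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

def rank_variable_pairs(X, y, names, cv=5):
    """Score every two-input model and rank predictor pairs by CV error."""
    results = []
    for i, j in itertools.combinations(range(X.shape[1]), 2):
        net = MLPRegressor(hidden_layer_sizes=(4,), max_iter=3000, random_state=0)
        score = cross_val_score(net, X[:, [i, j]], y, cv=cv,
                                scoring="neg_mean_squared_error").mean()
        results.append(((names[i], names[j]), -score))
    return sorted(results, key=lambda r: r[1])   # lowest MSE first

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 120
    names = ["vehicle_fleet", "fuel_price", "enforcement", "gdp", "rainfall"]
    X = rng.normal(size=(n, len(names)))
    # Toy "fatalities" signal driven by two of the candidates only.
    y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.1 * rng.normal(size=n)
    for pair, mse in rank_variable_pairs(X, y, names)[:3]:
        print(pair, round(mse, 3))
```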

  15. DEVELOPMENT OF METHODOLOGY FOR THE CALCULATION OF THE PROJECT INNOVATION INDICATOR AND ITS CRITERIA COMPONENTS

    Directory of Open Access Journals (Sweden)

    Mariya Vishnevskaya

    2017-12-01

    Full Text Available Two main components of the problem studied in the article are revealed. At the practical level, there is a need for convenient tools allowing a comprehensive evaluation of a proposed innovative project in terms of its suitability for inclusion in a portfolio or development program; at the level of science, there is a need to improve and complement the existing methodology for assessing the attractiveness of innovative projects in the context of their properties and a specific set of components. The research is applied science, since solving the problem involves the science-based development of a set of techniques allowing the practical use, at the initialization stage, of knowledge gained from large information arrays. The purpose of the study is the formation of an integrated indicator of project innovation, with a substantive justification of the calculation method, as a tool for the evaluation and selection of projects to be included in a portfolio of projects and programs. The theoretical and methodological basis of the research consists of the conceptual provisions and scientific developments of experts on project management issues, published in monographs, periodicals, and materials of scientific and practical conferences on the topic of the research. The tasks were solved using general scientific and special methods and mathematical modelling methods based on the system approach. Results. A balanced system of parametric single indicators of innovation is presented: risks, personnel, quality, innovation, resources, and performers. This allows a comprehensive view of any project to be obtained already in its initial stages. The choice of risk tolerance as the key criterion of the “risks” element, and of the reference characteristics against which it can be argued that a potential project holds promise, is substantiated. A tool for calculating risk tolerance based on the use of matrices and vector analysis is proposed
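
    As a sketch of how such an integrated indicator could be computed, the snippet below aggregates the six single indicators named above into one value by a weighted mean of normalized scores. The weights and the weighted-mean rule are assumptions for illustration; the article's own calculation method, including the matrix and vector analysis of risk tolerance, is not reproduced.

```python
import numpy as np

# Single indicators named in the abstract; the weights and the aggregation
# rule are placeholders, not the article's formula.
WEIGHTS = {
    "risks": 0.25, "personnel": 0.15, "quality": 0.15,
    "innovation": 0.20, "resources": 0.15, "performers": 0.10,
}

def integrated_innovation_indicator(scores):
    """Aggregate single indicators, each normalized to [0, 1], into one value."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[k] * float(np.clip(scores[k], 0.0, 1.0)) for k in WEIGHTS)

project = {"risks": 0.6, "personnel": 0.8, "quality": 0.7,
           "innovation": 0.9, "resources": 0.5, "performers": 0.75}
print(f"integrated indicator: {integrated_innovation_indicator(project):.3f}")
```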

  16. Establishment of Requirements and Methodology for the Development and Implementation of GreyMatters, a Memory Clinic Information System.

    Science.gov (United States)

    Tapuria, Archana; Evans, Matt; Curcin, Vasa; Austin, Tony; Lea, Nathan; Kalra, Dipak

    2017-01-01

    The aim of the paper is to establish the requirements and methodology for the development process of GreyMatters, a memory clinic system, outlining the conceptual, practical, technical and ethical challenges, and the experiences of capturing clinical and research-oriented data, along with the implementation of the system. The methodology for development of the information system involved phases of requirements gathering, modeling and prototype creation, and 'bench testing' the prototype with experts. The standard Institute of Electrical and Electronics Engineers (IEEE) recommended approach for the specification of software requirements was adopted. An electronic health record (EHR) standard, EN13606, was used, clinical modelling was done through archetypes, and the project complied with data protection and privacy legislation. The requirements for GreyMatters were established. Though the initial development was complex, the requirements, methodology and standards adopted made the construction, deployment, adoption and population of a memory clinic and research database feasible. The electronic patient data, including the assessment scales, provide a rich source of objective data for audits and research and help to establish study feasibility and identify potential participants for clinical trials. The establishment of requirements and methodology, addressing issues of data security and confidentiality, future data compatibility and interoperability, and medico-legal aspects such as access controls and audit trails, led to a robust and useful system. The evaluation supports that the system is an acceptable tool for clinical, administrative, and research use and forms a useful part of the wider information architecture.

  17. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    International Nuclear Information System (INIS)

    Fonseca, T C Ferreira; Vanhavere, F; Bogaerts, R; Hunt, John

    2014-01-01

    A Whole Body Counter (WBC) is a facility used to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done by using anthropomorphic physical phantoms representing the human body. Due to the challenge of constructing representative physical phantoms, virtual calibration has been introduced. The use of computational phantoms and of the Monte Carlo method to simulate radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations, which in turn helps to study the major source of uncertainty associated with the in vivo measurement routine: the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. The open source MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. In addition, a home-made software tool was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms, called MaMP and FeMP (Male and Female Mesh Phantoms), to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium. (paper)

  18. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    Science.gov (United States)

    Ferreira Fonseca, T. C.; Bogaerts, R.; Hunt, John; Vanhavere, F.

    2014-11-01

    A Whole Body Counter (WBC) is a facility used to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done by using anthropomorphic physical phantoms representing the human body. Due to the challenge of constructing representative physical phantoms, virtual calibration has been introduced. The use of computational phantoms and of the Monte Carlo method to simulate radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations, which in turn helps to study the major source of uncertainty associated with the in vivo measurement routine: the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. The open source MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. In addition, a home-made software tool was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms, called MaMP and FeMP (Male and Female Mesh Phantoms), to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium.
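
    The voxel-grid conversion step mentioned above can be illustrated with a small converter that run-length encodes a 3-D array of material IDs into a compact text listing. This is only a sketch of the data-reduction stage under an assumed output format; it does not emit actual MCNPX cell, surface or lattice cards.

```python
import numpy as np

def write_voxel_listing(voxels, path):
    """Write a 3-D array of material IDs as a run-length-encoded text listing.

    Illustrates only the data-reduction step of a voxel-phantom converter;
    a real MCNPX input deck would additionally need proper cell, surface and
    lattice cards, which are not generated here.
    """
    nz, ny, nx = voxels.shape
    with open(path, "w") as f:
        f.write(f"c voxel phantom {nx} x {ny} x {nz}\n")
        f.write("c format per line: material_id run_length\n")
        for z in range(nz):
            f.write(f"c slice {z}\n")
            flat = voxels[z].ravel()
            run_val, run_len = flat[0], 1
            for v in flat[1:]:
                if v == run_val:
                    run_len += 1
                else:
                    f.write(f"{run_val} {run_len}\n")
                    run_val, run_len = v, 1
            f.write(f"{run_val} {run_len}\n")

if __name__ == "__main__":
    # Toy phantom: air (0) everywhere, a soft-tissue block (1) in the middle.
    phantom = np.zeros((10, 20, 20), dtype=int)
    phantom[3:7, 5:15, 5:15] = 1
    write_voxel_listing(phantom, "phantom_listing.txt")
```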

  19. Large-scale CO2 injection demos for the development of monitoring and verification technology and guidelines (CO2ReMoVe)

    Energy Technology Data Exchange (ETDEWEB)

    Wildenborg, T.; David, P. [TNO Built Environment and Geosciences, Princetonlaan 6, 3584 CB Utrecht (Netherlands); Bentham, M.; Chadwick, A.; Kirk, K. [British Geological Survey, Kingsley Dunham Centre, Keyworth, Nottingham NG12 5GG (United Kingdom); Dillen, M. [SINTEF Petroleum Research, Trondheim (Norway); Groenenberg, H. [Unit Policy Studies, Energy Research Centre of the Netherlands ECN, Amsterdam (Netherlands); Deflandre, J.P.; Le Gallo, J. [Institut Francais du Petrole, Rueil-Malmaison (France)

    2009-04-15

    The objectives of the EU project CO2ReMoVe are to undertake the research and development necessary to establish scientifically based standards for monitoring future CCS operations and to develop the performance assessment methodologies necessary to demonstrate the long-term reliability of geological storage of CO2. This could in turn lead to guidelines for the certification of sites suitable for CCS on a wide scale. Crucial to the project portfolio are the continuing large-scale CO2 injection operation at Sleipner, the injection operation at In Salah (Algeria) and the recently started injection project at Snoehvit (Norway). Two pilot sites are also currently in the project portfolio, Ketzin in Germany and K12-B in the offshore continental shelf of the Netherlands.

  20. A methodology for laser diagnostics in large-bore marine two-stroke diesel engines

    International Nuclear Information System (INIS)

    Hult, J; Mayer, S

    2013-01-01

    Large two-stroke diesel engines for marine propulsion offer several challenges to successful implementation of the laser diagnostic techniques applied extensively in smaller automotive engines. For this purpose a fully operational large-bore engine has been modified to allow flexible optical access, through 24 optical ports with clear diameters of 40 mm. By mounting the entire optical set-up directly to the engine, effects of the vigorous vibrations and thermal drifts on alignment can be minimized. Wide-angle observation and illumination, as well as relatively large aperture detection, is made possible through mounting of optical modules and relays inside optical ports. This allows positioning of the last optical element within 10 mm from the cylinder wall. Finally, the implementation on a multi-cylinder engine allows for flexible and independent operation of the optically accessible cylinder for testing purposes. The performance of the integrated optical engine and imaging system developed is demonstrated through laser Mie scattering imaging of fuel jet structures, from which information on liquid penetration and spray angles can be deduced. Double pulse laser-sheet imaging of native in-cylinder structures is also demonstrated, for the purpose of velocimetry. (paper)

  1. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    Oliveira, Laura Cristina de

    2003-01-01

    In the present work the viability of using neutron activation analysis to perform urine and blood clinical analyses was checked. The aim of this study is to investigate the biological behavior of animals that have been fed chow doped with natural uranium for a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentrations in biological samples. The quantitative results for urine sediment obtained using NAA were compared with those of the conventional clinical analysis and found to be compatible. This methodology was also applied to bone and body organs such as liver and muscle to help the interpretation of possible anomalies. (author)
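
    In the absolute (parametric) method, the element mass is obtained directly from the activation equation and the measured gamma peak rather than from co-irradiated standards. The sketch below gives the textbook form of that calculation; the parameter names and the example values are illustrative, not the ones used in this work.

```python
import numpy as np

N_A = 6.02214076e23  # Avogadro constant, 1/mol

def element_mass_absolute_naa(net_peak_counts, t_irr, t_decay, t_count,
                              half_life, sigma_cm2, flux, isotopic_abundance,
                              atomic_mass, gamma_yield, efficiency):
    """Solve the standard activation equation for the irradiated element mass (g).

    All times in seconds, sigma in cm^2, flux in n/cm^2/s. Evaluated nuclear
    data must replace the illustrative inputs for real work.
    """
    lam = np.log(2.0) / half_life
    saturation = 1.0 - np.exp(-lam * t_irr)          # build-up during irradiation
    decay = np.exp(-lam * t_decay)                   # decay before counting
    counting = (1.0 - np.exp(-lam * t_count)) / lam  # integration over counting
    counts_per_atom = (sigma_cm2 * flux * saturation * decay * counting
                       * gamma_yield * efficiency)
    n_target_atoms = net_peak_counts / counts_per_atom
    return n_target_atoms * atomic_mass / (isotopic_abundance * N_A)

# Illustrative call with placeholder nuclear data (not evaluated values):
mass_g = element_mass_absolute_naa(
    net_peak_counts=1.2e4, t_irr=3600.0, t_decay=600.0, t_count=1800.0,
    half_life=9000.0, sigma_cm2=1.0e-23, flux=1.0e13,
    isotopic_abundance=1.0, atomic_mass=55.0, gamma_yield=0.9, efficiency=0.05)
print(f"{mass_g:.3e} g")
```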

  2. Dosimetric methodology of the ICRP

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    1994-01-01

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples
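
    In the ICRP formalism summarized above, internal doses follow from intakes via dose coefficients, and the effective dose is a weighted sum of tissue equivalent doses. The sketch below expresses those two relations in code; the tissue names, weighting factors and dose coefficient in the example are generic placeholders, not tabulated ICRP values.

```python
def committed_effective_dose(intake_bq, dose_coefficient_sv_per_bq):
    """E(tau) = I * e(tau): intake multiplied by the committed dose coefficient."""
    return intake_bq * dose_coefficient_sv_per_bq

def effective_dose(tissue_equivalent_doses_sv, tissue_weighting_factors):
    """E = sum over tissues T of w_T * H_T."""
    return sum(w_t * tissue_equivalent_doses_sv.get(tissue, 0.0)
               for tissue, w_t in tissue_weighting_factors.items())

# Illustrative call with placeholder numbers:
print(committed_effective_dose(intake_bq=1.0e4, dose_coefficient_sv_per_bq=2.0e-8))
print(effective_dose({"tissue_a": 1.0e-4, "tissue_b": 2.0e-5},
                     {"tissue_a": 0.10, "tissue_b": 0.05, "remainder": 0.10}))
```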

  3. Methodology development for plutonium categorization and enhancement of proliferation resistance by P3 mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Saito, M.; Kimura, Y.; Sagara, H.; Han, C. Y. [Tokyo Institute of Technology, Tokyo (Japan); Koyama, S. [Japan Atomic Energy Agency, Ibaraki (Japan)

    2012-03-15

    'Protected Plutonium Production (P3)' has been proposed to enhance the proliferation resistance of plutonium by the transmutation of Minor Actinides (MA). For example, adding a small amount of Minor Actinides such as {sup 237}Np or {sup 241}Am, which have large neutron capture cross-sections, to uranium fuel enhances the production of {sup 238}Pu, whose high spontaneous fission neutron rate deteriorates the quality of a nuclear weapon and makes its manufacture and maintenance technologically difficult; this is very effective for improving the isotopic barrier against the proliferation of plutonium. To demonstrate the P3 mechanism experimentally, U samples with 2, 5 and 10% {sup 237}Np doping were irradiated in the Advanced Test Reactor (ATR) of INL. The fuel test samples were removed from the core at 100, 200 and 300 effective full power days (EFPD), and post-irradiation examination was then completed at the Chemical Laboratory of Idaho National Laboratory (INL). The theoretical results for the P3 mechanism predict the experimental ones quite well. The evaluation function 'Attractiveness' was introduced as the ratio of a function of Rossi-alpha to the 'Technical Difficulties for Fission Explosive Device Use'. Rossi-alpha, defined as the ratio of super-criticality to prompt neutron lifetime, is a meaningful feature of the explosive yield. The Technical Difficulties for Fission Explosive Device Use can be expressed as a function of the specific decay heat, spontaneous fission neutron rate and radiation of plutonium metal. The original methodology for evaluating the Attractiveness of plutonium has been improved by considering the effect of the compression of the plutonium isotopes and also the pre-detonation probability due to the spontaneous fission neutron rate, and was applied to the categorization of the plutonium from conventional reactors and from innovative reactors based on the P3 mechanism. In the present paper, the fundamentals of P3 mechanism, the experimental demonstration of P3

  4. Methodology development for plutonium categorization and enhancement of proliferation resistance by P3 mechanism

    International Nuclear Information System (INIS)

    Saito, M.; Kimura, Y.; Sagara, H.; Han, C. Y.; Koyama, S.

    2012-01-01

    'Protected Plutonium Production (P3)' has been proposed to enhance the proliferation resistance of plutonium by the transmutation of Minor Actinides (MA). For example, adding a small amount of Minor Actinides such as 237 Np or 241 Am, which have large neutron capture cross-sections, to uranium fuel enhances the production of 238 Pu, whose high spontaneous fission neutron rate deteriorates the quality of a nuclear weapon and makes its manufacture and maintenance technologically difficult; this is very effective for improving the isotopic barrier against the proliferation of plutonium. To demonstrate the P3 mechanism experimentally, U samples with 2, 5 and 10% 237 Np doping were irradiated in the Advanced Test Reactor (ATR) of INL. The fuel test samples were removed from the core at 100, 200 and 300 effective full power days (EFPD), and post-irradiation examination was then completed at the Chemical Laboratory of Idaho National Laboratory (INL). The theoretical results for the P3 mechanism predict the experimental ones quite well. The evaluation function 'Attractiveness' was introduced as the ratio of a function of Rossi-alpha to the 'Technical Difficulties for Fission Explosive Device Use'. Rossi-alpha, defined as the ratio of super-criticality to prompt neutron lifetime, is a meaningful feature of the explosive yield. The Technical Difficulties for Fission Explosive Device Use can be expressed as a function of the specific decay heat, spontaneous fission neutron rate and radiation of plutonium metal. The original methodology for evaluating the Attractiveness of plutonium has been improved by considering the effect of the compression of the plutonium isotopes and also the pre-detonation probability due to the spontaneous fission neutron rate, and was applied to the categorization of the plutonium from conventional reactors and from innovative reactors based on the P3 mechanism. In the present paper, the fundamentals of P3 mechanism, the experimental demonstration of P3 mechanism in ATR of INL and the methodology

  5. Current status and future directions of development of PR/PP evaluation methodology

    International Nuclear Information System (INIS)

    Kim, D. Y.; Kwon, E. H.; Kim, H. D.

    2012-01-01

    Proliferation resistance (PR), a mandatory design requirement for the introduction of Generation IV nuclear energy systems (NESs), is defined as the characteristic of a nuclear energy system that impedes the diversion or undeclared production of nuclear material, or misuse of technology, by a State in order to acquire nuclear weapons or other nuclear explosive devices. The same report also defines physical protection (PP) as the use of technical, administrative, and operational measures to prevent the theft of nuclear or radioactive material for the purpose of producing nuclear weapons, producing nuclear devices for nuclear terrorism, or using the facility or transportation system for radiological sabotage. Since the early 1970s, right after the Indian nuclear test, the international community has recognized the limits of political and diplomatic means to prevent overt proliferation by states and has looked for ways to incorporate technical features that are inherent in NESs. As a first step, active research has been conducted to develop methodologies to evaluate the PR and PP components of NESs, and this work has now converged into two main R and D streams: the Generation IV International Forum (GIF) and the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO). (Currently, GIF and INPRO are leading the debate as the major projects for PR and PP evaluation methods.) This paper presents an overview of the R and D accomplishments during the development of the PR and PP evaluation methodology. It also suggests some directions for future research

  6. Current status and future directions of development of PR/PP evaluation methodology

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. Y.; Kwon, E. H.; Kim, H. D. [KAERI, Daejeon (Korea, Republic of)

    2012-10-15

    Proliferation resistance (PR), a mandatory design requirement for the introduction of Generation IV nuclear energy systems (NESs), is defined as the characteristic of a nuclear energy system that impedes the diversion or undeclared production of nuclear material, or misuse of technology, by a State in order to acquire nuclear weapons or other nuclear explosive devices. The same report also defines physical protection (PP) as the use of technical, administrative, and operational measures to prevent the theft of nuclear or radioactive material for the purpose of producing nuclear weapons, producing nuclear devices for nuclear terrorism, or using the facility or transportation system for radiological sabotage. Since the early 1970s, right after the Indian nuclear test, the international community has recognized the limits of political and diplomatic means to prevent overt proliferation by states and has looked for ways to incorporate technical features that are inherent in NESs. As a first step, active research has been conducted to develop methodologies to evaluate the PR and PP components of NESs, and this work has now converged into two main R and D streams: the Generation IV International Forum (GIF) and the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO). (Currently, GIF and INPRO are leading the debate as the major projects for PR and PP evaluation methods.) This paper presents an overview of the R and D accomplishments during the development of the PR and PP evaluation methodology. It also suggests some directions for future research.

  7. Organizing the Methodology Work at Higher School

    Directory of Open Access Journals (Sweden)

    O. A. Plaksina

    2012-01-01

    Full Text Available The paper considers the methodology components of organizing higher school training. The authors' study and analysis of the existing methodology systems reveal that their advantages and disadvantages are related to the type of the system-creating element of the methodology system's organizational structure. The optimal scheme of such a system has been developed in the context of Vocational School Reorganization, implying the specification and expansion of the set of basic design principles of any control system. Following the suggested organizational approach provides the grounds for teachers' self-development and professional growth. The methodology of the approach allows the given structure to be used in any higher educational institution, providing for the system's transition from simple functioning to a sustainable development mode.

  8. Assessment of the MDNBR enhancement methodologies for the SMART control rods banks withdrawal event

    International Nuclear Information System (INIS)

    Yang, Soo Hyung; Chung, Young-Jong; Kim, Hee Cheol

    2005-01-01

    For electricity generation and seawater desalination, a 330 MW System-integrated Modular Advanced ReacTor (SMART) was developed by KAERI. The safety level of the SMART is enhanced compared to that of typical commercial reactors, with the aid of the elimination of the large-break loss-of-coolant accident, achieved by placing the major components of the primary system in the reactor vessel, and the adoption of new technology and a passive design concept in the safety system. However, events related to reactivity and power distribution anomalies have been evaluated as vulnerable points when compared to the other initiating events in the SMART, since the reactivity worth of the control rod (CR) banks is quite large due to the boron-free core concept. In particular, safety margins, i.e. the minimum departure from nucleate boiling ratio (MDNBR), are significantly threatened during the CR banks withdrawal event. Therefore, MDNBR enhancement methodologies for the CR banks withdrawal event should be considered to further enhance the safety level of the SMART design. Two methodologies have been suggested to enhance the MDNBR during the CR banks withdrawal event: the application of a DNBR trip function in the core protection system and a turbine trip delay methodology. Sensitivity studies are performed to evaluate the two MDNBR enhancement methodologies and show that the suggested methodologies could enhance the MDNBR during the CR banks withdrawal event of the SMART

  9. Methodological Aspects of Modeling Development and Viability of Systems and Counterparties in the Digital Economy

    Directory of Open Access Journals (Sweden)

    Vitlinskyy Valdemar V.

    2018-03-01

    Full Text Available The aim of the article is to study and generalize methodological approaches to modeling the economic development and viability of economic systems with consideration for risk and for changes in their goals, status, and behavior in the digital economy. Definitions of the categories of economic development and viability are offered, and the directions of their research by means of mathematical modeling are substantiated. The system of characteristics and markers of the external economic environment under conditions of digitalization of economic activity is analyzed. The theoretical foundations and methodology of the mathematical modeling of the development of economic systems, as well as of ensuring their viability and security under conditions of introducing the infrastructure of the information society and the digital economy on the principles of the information and knowledge approach, are considered. It is shown that in an information society, predictive model technologies are a growing safety resource. Prerequisites are studied for replacing the traditional integration concept of evaluation, analysis, modeling, management, and administration of economic development based on a threat-oriented approach to the definition of security protectors, information, and knowledge. A concept is proposed for creating a database of models for examining trends and patterns of economic development which, unlike traditional trend models of dynamics, identifies and iteratively conceptualizes processes based on a set of knowledgeable predictors, using data mining and machine learning tools, including deep learning.

  10. THEORETIC AND METHODOLOGIC BASICS OF DEVELOPMENT OF THE NATIONAL LOGISTICS SYSTEM IN THE REPUBLIC OF BELARUS

    Directory of Open Access Journals (Sweden)

    R. B. Ivut

    2016-01-01

    Full Text Available The article presents the results of a study whose aim is the formation of theoretical and methodological foundations, within the framework of scientific support, for the further development of the national logistics system in the Republic of Belarus. The relevance of the study relates to the fact that at present the introduction of the logistics concept and the formation of the optimal infrastructure for its implementation are key factors for the economic development of Belarus as a transit country. At the same time, the pace of development of logistics activities in the country is currently slightly lower than in the neighboring countries, as evidenced by the dynamics of the country's position in international rankings (in particular, according to the LPI index). Overcoming these gaps requires improved competitiveness of the logistics infrastructure in the international market. This, in turn, is possible through the clear formulation of, and adherence to, the principles of effective functioning of the macro-logistics system of Belarus, as well as by increasing the quality of logistics design by means of the econometric models and methods presented in the article. The authors' approach differentiates between the general principles of logistics common to logistics systems of all levels and the specific principles of development of the macro-level logistics system related to improving its transit attractiveness for international freight carriers. The study also systematizes the models for determining the optimal location of logistics facilities. Particular attention is paid to the methodological basis of the analysis of the functioning of transport terminals as part of logistics centers, both at the design and operation stages. The developed theoretical and methodological recommendations are universal and can be used in the design of logistics infrastructure for various purposes and functions

  11. Radiological risk assessment for the public under the loss of medium and large sources using bayesian methodology

    International Nuclear Information System (INIS)

    Kim, Joo Yeon; Jang, Han Ki; Lee, Jai Ki

    2005-01-01

    Bayesian methodology is appropriate for use in PRA because subjective knowledge as well as objective data can be applied in the assessment. In this study, radiological risk is assessed on the basis of Bayesian methodology for the loss of a source in field radiography. The exposure scenario for the lost source presented by the U.S. NRC is reconstructed by considering the domestic situation, and Bayes' theorem is applied to the updating of the failure probabilities of safety functions. For the updating of failure probabilities, it is shown that the 5% Bayesian credible bounds obtained using the Jeffreys prior distribution are lower than those obtained using a vague prior distribution. It is noted that the Jeffreys prior distribution is appropriate in risk assessments of systems having very low failure probabilities. It is also shown that the mean of the expected annual dose to the public based on the Bayesian methodology is higher than the dose based on the classical methodology, because the means of the updated probabilities are higher than the classical probabilities. Databases for radiological risk assessment are sparse domestically. In summary, Bayesian methodology can be applied as a useful alternative for risk assessment, and this study on risk assessment will contribute to risk-informed regulation in the field of radiation safety
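
    The prior comparison discussed above can be illustrated with a conjugate beta-binomial update of a failure probability. The sketch below uses SciPy; the demand and failure counts in the example are illustrative, not the study's data.

```python
from scipy import stats

def update_failure_probability(failures, demands, prior="jeffreys"):
    """Beta-binomial update of a safety-function failure probability.

    Jeffreys prior: Beta(0.5, 0.5); a vague (uniform) prior: Beta(1, 1).
    Returns the posterior mean and the 5th and 95th percentiles of the
    posterior distribution.
    """
    a0, b0 = (0.5, 0.5) if prior == "jeffreys" else (1.0, 1.0)
    a, b = a0 + failures, b0 + demands - failures
    posterior = stats.beta(a, b)
    return posterior.mean(), posterior.ppf(0.05), posterior.ppf(0.95)

# Illustrative numbers only: 0 failures observed in 200 demands.
for prior in ("jeffreys", "vague"):
    mean, lo5, hi95 = update_failure_probability(0, 200, prior=prior)
    print(f"{prior:8s} mean {mean:.2e}  5% {lo5:.2e}  95% {hi95:.2e}")
```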

  12. Methodology of gender research and local development concepts: report on workshop, 11-12 November 1999

    OpenAIRE

    Klein-Hessling, Ruth

    2000-01-01

    The workshop "Methodology of Gender Research and Local Development Concepts" was organized by the Gender Division of the Sociology of Development Research Centre at Bielefeld University on the occasion of a visit by two members of the Ahfad University of Women, Omdurman, Sudan, and was attended by around 30 participants. Experiences from empirical field research in Sudan, Kenya, Rwanda, West Africa and South Asia formed the starting point for discussions on methodological...

  13. Development of a methodology for automated assessment of the quality of digitized images in mammography

    International Nuclear Information System (INIS)

    Santana, Priscila do Carmo

    2010-01-01

    The process of evaluating the quality of radiographic images in general, and mammography images in particular, can be much more accurate, practical and fast with the help of computer analysis tools. The purpose of this study is to develop a computational methodology to automate the process of assessing the quality of mammography images through digital image processing (PDI) techniques, using an existing image processing environment (ImageJ). With the application of PDI techniques it was possible to extract geometric and radiometric characteristics of the evaluated images. The evaluated parameters include spatial resolution, high-contrast detail, low-contrast threshold, linear detail of low contrast, tumor masses, contrast ratio and background optical density. The results obtained by this method were compared with the results of the visual evaluations performed by the Health Surveillance of Minas Gerais. Through this comparison it was possible to demonstrate that the automated methodology is a promising alternative for reducing or eliminating the subjectivity present in the visual assessment methodology currently in use. (author)
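
    Metrics such as the contrast ratio and a background optical density can be extracted from a digitized phantom image with a few region-of-interest statistics. The sketch below uses NumPy on a grayscale array; the ROI positions, the 8-bit normalization and the optical-density proxy are assumptions, and the study itself worked within ImageJ.

```python
import numpy as np

def roi_mean(image, center, half_size):
    """Mean pixel value in a square region of interest."""
    r, c = center
    return float(image[r - half_size:r + half_size + 1,
                       c - half_size:c + half_size + 1].mean())

def contrast_ratio(image, object_roi, background_roi, half_size=10):
    """Relative contrast between a phantom detail and its background."""
    obj = roi_mean(image, object_roi, half_size)
    bkg = roi_mean(image, background_roi, half_size)
    return abs(bkg - obj) / bkg

def optical_density_proxy(image, roi, half_size=10, max_value=255.0):
    """-log10 of the normalized signal; a proxy only, since true optical
    density requires the digitizer's calibration curve."""
    return -np.log10(max(roi_mean(image, roi, half_size), 1.0) / max_value)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    img = rng.normal(200.0, 5.0, size=(256, 256))   # synthetic flat background
    img[120:140, 120:140] -= 40.0                   # darker square "detail"
    print("contrast ratio:", round(contrast_ratio(img, (130, 130), (40, 40)), 3))
    print("background OD proxy:", round(optical_density_proxy(img, (40, 40)), 3))
```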

  14. Hadoop Cluster Deployment: A Methodological Approach

    Directory of Open Access Journals (Sweden)

    Ronaldo Celso Messias Correia

    2018-05-01

    Full Text Available For a long time, data were treated as a general problem because they merely represented fractions of an event without any relevant purpose. However, the last decade has been all about information and how to get it. Seeking meaning in data and trying to solve scalability problems, many frameworks have been developed to improve data storage and analysis. As a framework, Hadoop was presented as a powerful tool for dealing with large amounts of data. However, doubts remain about how to handle its deployment and whether there is any reliable method to compare the performance of distinct Hadoop clusters. This paper presents a methodology based on benchmark analysis to guide Hadoop cluster deployment. The experiments employed Apache Hadoop and the Hadoop distributions of Cloudera, Hortonworks, and MapR, analyzing the architectures locally and in the cloud, using centralized and geographically distributed servers. The results show that the methodology can be applied dynamically for a reliable comparison among different architectures. Additionally, the study suggests that the knowledge acquired can be used to improve the data analysis process by understanding the Hadoop architecture.
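
    A benchmark-driven comparison of this kind ultimately reduces to collecting repeated job timings per deployment and ranking them on a common footing. The sketch below shows one such aggregation step; the cluster names and timing numbers are hypothetical, and the actual workloads and metrics used in the paper are not reproduced.

```python
import statistics

# Hypothetical benchmark timings in seconds (e.g., repeated sort jobs on a
# fixed data set); cluster names and numbers are illustrative only.
RESULTS = {
    "apache_local":         [812, 795, 830],
    "cloudera_cloud":       [604, 615, 598],
    "hortonworks_cloud":    [640, 655, 632],
    "mapr_geo_distributed": [720, 705, 731],
}

def compare_clusters(results):
    """Rank deployments by median runtime and report speed-up vs. the slowest."""
    medians = {name: statistics.median(times) for name, times in results.items()}
    slowest = max(medians.values())
    ranked = sorted(medians.items(), key=lambda kv: kv[1])
    return [(name, t, slowest / t) for name, t in ranked]

for name, median_s, speedup in compare_clusters(RESULTS):
    print(f"{name:22s} median {median_s:6.0f} s  speed-up x{speedup:.2f}")
```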

  15. Ranking of Higher Education Institutions: Ideology and Methodology of Development (Russian Practice

    Directory of Open Access Journals (Sweden)

    I V Trotsuk

    2009-03-01

    Full Text Available The article comprises the second part of an analytical review of the ideology, methodology and actual practice of developing rankings of higher education institutions (the first part, covering the international experience, was published in the second issue of the journal in 2008). The author examines the current circumstances of the ranking of higher education institutions and of particular education programmes in Russia. In particular, the main approaches to ranking elaboration, primarily associated with the authors' and clients' «status» and the corresponding goals of higher education institution rankings, are discussed in the paper.

  16. An increasing number of qualitative research papers in oncology and palliative care: does it mean a thorough development of the methodology of research?

    Directory of Open Access Journals (Sweden)

    Brunelli Cinzia

    2004-01-01

    Full Text Available Abstract Background In the second half of the nineties, a scientific debate about the usefulness of qualitative research in medicine began in the main medical journals, and the number of "qualitative" papers published in peer-reviewed journals has noticeably increased in recent years. Nevertheless, the label of qualitative methodology has been assigned to a heterogeneous collection of studies. Some of them show a complete awareness of the specificity of this kind of research, while others are still largely influenced by the quantitative paradigm prevailing in the medical field. The concern with the rigour and credibility of qualitative methods has led to the development of a number of checklists for assessing qualitative research. The purposes of this review were to describe the quality of the development of qualitative research in the medical field, focusing on oncology and palliative care, and to discuss the applicability of a descriptive checklist. Methods A review was conducted in the Medline and PsycINFO databases. On the basis of their abstracts, the papers found were classified by publication year, kind of journal, paper type, data gathering method, sample size and declared methodological approach. A subsample of these papers was then selected and their methodological characteristics were evaluated against a descriptive checklist. Results 351 abstracts and 26 full papers were analysed. An increase over time in the number of qualitative studies is evident. While most of the papers before 1999 were published in nursing journals (43%), afterwards medical journals were also largely represented. Psychological journals increased from 7% to 12%. 22% of the studies used a sample size lower than 15 and 15% did not specify the sample size in the abstract. The methodological approach was also often not specified, and this percentage increased in the second time period (from 73% to 80%). Grounded theory was the most

  17. Risk-assessment methodology development for waste isolation in geologic media

    International Nuclear Information System (INIS)

    Stevens, C.A.; Fullwood, R.R.; Amirijafari, B.; Basin, S.L.; Cohen, J.

    1982-12-01

    A review of three documents prepared for the USNRC by Sandia National Laboratories (SNL) is presented. These are NUREG/CR-1634, Volume 4, concerned with the effects of variable hydrology on waste migration; NUREG/CR-2324, a user's manual for SWIFT; and NUREG/2343, a user's manual for DNET. This review completes Task 4 of the detailed technical review of the SNL program for Risk Assessment Methodology Development for Waste Isolation in Geologic Media. In general, these reports exhibit the high technical quality that characterizes the SNL work. They are tersely written, with little concession to the non-expert reader in explaining the physical situation being modeled. Indeed, the emphasis is on the mathematical procedures rather than the repository physics, leaving the adequacy of the results, presented in many computer plots, largely to the interpretation of the reader. Other general comments have been presented previously, such as the conservatisms in the data, the need for data that cannot be measured without disturbing the geometry, and the overall plan for use of the many codes developed in the program

  18. Methodological and Pedagogical Potential of Reflection in Development of Contemporary Didactics

    Science.gov (United States)

    Chupina, Valentina A.; Pleshakova, Anastasiia Yu.; Konovalova, Maria E.

    2016-01-01

    The applicability of the issue under research is preconditioned by the need of practical pedagogy to expand the methodological and methodical tools of contemporary didactics. The purpose of the article is to identify the methodological core of reflection as a form of thinking and to provide insight into it on the basis of the systematic attributes of the…

  19. Methodology for Monitoring Sustainable Development of Isolated Microgrids in Rural Communities

    Directory of Open Access Journals (Sweden)

    Claudia Rahmann

    2016-11-01

    Full Text Available Microgrids are a rapidly evolving and increasingly common form of local power generation used to serve the needs of both rural and urban communities. In this paper, we present a methodology to evaluate the evolution of the sustainability of stand-alone microgrid projects. The proposed methodology considers a composite sustainability index (CSI) that includes both positive and negative impacts of the operation of the microgrid in a given community. The CSI is constructed along the environmental, social, economic and technical dimensions of the microgrid. The sub-indexes of each dimension are aggregated into the CSI via a set of adaptive weighting factors, which indicate the relative importance of the corresponding dimension in the sustainability goals. The proposed methodology aims to be a support instrument for policy makers, especially when defining sound corrective measures to guarantee the sustainability of small, isolated microgrid projects. To validate the performance of the proposed methodology, a microgrid installed in the northern part of Chile (Huatacondo) has been used as a benchmarking project.
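
    The aggregation step described above can be sketched as a weighted combination of normalized sub-indexes. In the snippet below the dimension values and weighting factors are illustrative placeholders, not the adaptive weights or Huatacondo data from the paper.

```python
import numpy as np

def composite_sustainability_index(sub_indexes, weights):
    """CSI as a weighted aggregation of normalized dimension sub-indexes.

    sub_indexes : dict of dimension -> value already scaled to [0, 1], where
                  higher means more sustainable (negative impacts are assumed
                  to have been inverted during normalization).
    weights     : dict of dimension -> weighting factor; renormalized here so
                  that the factors sum to one.
    """
    w = np.array([weights[d] for d in sub_indexes])
    s = np.array([sub_indexes[d] for d in sub_indexes])
    return float(np.dot(w / w.sum(), s))

# Illustrative values only:
dims = {"environmental": 0.80, "social": 0.60, "economic": 0.55, "technical": 0.70}
weights = {"environmental": 1.0, "social": 1.5, "economic": 1.0, "technical": 1.2}
print(f"CSI = {composite_sustainability_index(dims, weights):.2f}")
```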

  20. Estimation of the Joint Patient Condition Occurrence Frequencies from Operation Iraqi Freedom and Operation Enduring Freedom. Volume I: Development of Methodology

    Science.gov (United States)

    2011-03-28

    [Report excerpt; only fragments of the body text and an ICD-9 mapping table survived extraction, interleaved with repeated page headers.] Supplemented BM: to more accurately describe combat trauma, a slight modification was made to the BM. The recoverable table entries map ICD-9 codes to a trauma category and anatomical location, for example Traumatic Pneumothorax without Open Wound into Thorax and 860.1, Traumatic Pneumothorax with Open Wound into Thorax, both assigned to trauma category Internal Organ and anatomical location Chest; the table columns are DMMPO ICD-9 codes, trauma category and anatomical location.