WorldWideScience

Sample records for methodological developments large

  1. Large sample NAA facility and methodology development

    International Nuclear Information System (INIS)

    Roth, C.; Gugiu, D.; Barbos, D.; Datcu, A.; Aioanei, L.; Dobrea, D.; Taroiu, I. E.; Bucsa, A.; Ghinescu, A.

    2013-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) facility has been developed at the TRIGA Annular Core Pulsed Reactor (ACPR) operated by the Institute for Nuclear Research in Pitesti, Romania. The central irradiation cavity of the ACPR core can accommodate a large irradiation device. The ACPR neutron flux characteristics are well known, and spectrum adjustment techniques have been successfully applied to enhance the thermal component of the neutron flux in the central irradiation cavity. An analysis methodology was developed using the MCNP code in order to estimate counting efficiency and correction factors for the major perturbing phenomena. Test experiments, comparisons with classical instrumental neutron activation analysis (INAA) methods, and an international inter-comparison exercise have been performed to validate the new methodology. (authors)
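    The counting efficiency and correction factors mentioned above feed into the standard activation equation. As a rough illustration only (all parameter values are hypothetical, and the large-sample corrections are collapsed into a single efficiency term), the element mass can be recovered from a measured gamma peak as follows:

```python
import math

def naa_mass_estimate(peak_counts, half_life_s, t_irr, t_decay, t_count,
                      eff, gamma_yield, flux, sigma_cm2, theta, molar_mass):
    """Estimate element mass (g) from a gamma peak measured after irradiation.

    `eff` stands in for the full-energy peak efficiency, which for large
    samples must come from modelling (e.g. MCNP) together with
    self-shielding and attenuation correction factors.
    """
    lam = math.log(2.0) / half_life_s
    S = 1.0 - math.exp(-lam * t_irr)                         # saturation factor
    D = math.exp(-lam * t_decay)                             # decay during cooling
    C = (1.0 - math.exp(-lam * t_count)) / (lam * t_count)   # decay while counting
    # Disintegration rate at the end of irradiation
    a0 = peak_counts / (eff * gamma_yield * D * C * t_count)
    n_target = a0 / (flux * sigma_cm2 * S)                   # number of target nuclei
    n_avogadro = 6.02214076e23
    return n_target * molar_mass / (n_avogadro * theta)
```

    Because the inferred mass scales as 1/efficiency, any error in the modelled efficiency propagates directly into the result, which is why validation against classical INAA and inter-comparison exercises matters.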

  2. Development of theoretical methodology for large molecules

    International Nuclear Information System (INIS)

    Maggiora, G.M.; Christoffersen, R.E.; Yoffe, J.A.; Petke, J.D.

    1981-01-01

    A major advantage of the use of floating spherical Gaussian orbitals (FSGOs) is the extreme rapidity with which the necessary quantum mechanical integrals can be evaluated. This advantage has been exploited in several quantum mechanical procedures for molecular electronic structure calculations, as described below. Several other properties of these functions have also been exploited, and have led to the development of semiclassical point charge and harmonic oscillator models capable of describing first and second order electromagnetic properties and intermolecular forces with reasonable accuracy in all cases, and with considerably better accuracy than much more elaborate theoretical procedures in some cases. These applications are also described below. The primary intent of the current paper is to present an overview of some of the uses of FSGOs in the study of molecular electronic structure and properties and to indicate possible directions for future applications. No attempt will be made to include all possible applications; rather, those applications of interest to the authors have been stressed. It is hoped that this paper will further stimulate the development of additional uses of these remarkable functions.

  3. Development of a Methodology for Predicting Forest Area for Large-Area Resource Monitoring

    Science.gov (United States)

    William H. Cooke

    2001-01-01

    The U.S. Department of Agriculture, Forest Service, Southern Research Station, appointed a remote-sensing team to develop an image-processing methodology for mapping forest lands over large geographic areas. The team has presented a repeatable methodology, which is based on regression modeling of Advanced Very High Resolution Radiometer (AVHRR) and Landsat Thematic...
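    The regression step of such a mapping methodology can be sketched in a few lines. The variable names and data values below are invented for illustration: a per-cell AVHRR greenness index regressed against percent forest derived from a classified Landsat scene, with the fitted model then usable wall-to-wall where only AVHRR coverage exists:

```python
import numpy as np

# Hypothetical calibration data (illustrative values, not real measurements):
# AVHRR-derived greenness per coarse cell vs. percent forest from Landsat.
avhrr_ndvi = np.array([0.21, 0.35, 0.42, 0.55, 0.61, 0.70, 0.78])
landsat_pct_forest = np.array([5.0, 22.0, 31.0, 48.0, 57.0, 71.0, 83.0])

# Degree-1 least-squares fit, then prediction and a simple error measure.
slope, intercept = np.polyfit(avhrr_ndvi, landsat_pct_forest, 1)
predicted = slope * avhrr_ndvi + intercept
rmse = float(np.sqrt(np.mean((predicted - landsat_pct_forest) ** 2)))
```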

  4. Development of an Evaluation Methodology for Loss of Large Area induced from extreme events

    International Nuclear Information System (INIS)

    Kim, Sok Chul; Park, Jong Seuk; Kim, Byung Soon; Jang, Dong Ju; Lee, Seung Woo

    2015-01-01

    The USNRC has announced several regulatory requirements and guidance documents regarding the event of loss of large area, including 10CFR 50.54(hh), Regulatory Guide 1.214 and SRP 19.4. In Korea, loss of large area has been taken into account only in a limited way, on a voluntary basis, for newly constructed NPPs. In general, it is hardly possible to find available information on methodology and key assumptions for the assessment of LOLA due to the 'need to know'-based approach. An urgent need exists for each country to develop its own regulatory requirements, guidance and evaluation methodology, taking into consideration its own geographical and nuclear safety and security environments. Currently, Korea Hydro and Nuclear Power Company (KHNP) has developed an Extended Damage Mitigation Guideline (EDMG) for APR1400 under contract with a foreign consulting company. The submittal guidance NEI 06-12, related to B.5.b Phase 2 and 3, focused on unit-wise mitigation strategy instead of site-level mitigation or response strategy. The Phase 1 mitigating strategy and guideline for LOLA (Loss of Large Area) places emphasis on site-level arrangements, including cooperative networking with outside organizations and an agile command and control system. The Korea Institute of Nuclear Safety has carried out a pilot in-house research project to develop the methodology and guideline for evaluation of LOLA since 2014. This paper introduces a summary of the major results and outcomes of the aforementioned research project. After the Fukushima Dai-Ichi accident, awareness has grown of the need to counter the event of loss of large area induced by extreme man-made hazards or extreme beyond-design-basis external events. An urgent need exists to develop regulatory guidance for coping with this undesirable situation, which has been left out of consideration in the existing nuclear safety regulatory framework due to the expectation that its occurrence would be rare

  5. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    Science.gov (United States)

    Freeman, Michael S.

    1987-01-01

    The primary research objectives of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) are to develop a methodology for constructing and maintaining large scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  6. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies could work for structural, procedural, or object-oriented applications, but fail to capture…

  7. Development of an Evaluation Methodology for Loss of Large Area Induced from Extreme Events with Malicious Origin

    International Nuclear Information System (INIS)

    Kim, S.C.; Park, J.S.; Chang, D.J.; Kim, D.H.; Lee, S.W.; Lee, Y.J.; Kim, H.W.

    2016-01-01

    The event of loss of large area (LOLA) induced by an extreme external event at a multi-unit nuclear installation has emerged as a new challenge in the realm of nuclear safety and regulation after the Fukushima Dai-Ichi accident. The relevant information and experience on evaluation methodology and regulatory requirements are rarely available and difficult to share due to their security sensitivity. Most countries have prepared their own regulatory requirements and methodologies to evaluate the impact of LOLA at nuclear power plants. In Korea, the newly amended Nuclear Safety Act requires assessment of LOLA in terms of an EDMG (Extended Damage Mitigation Guideline). The Korea Institute of Nuclear Safety (KINS) has performed a pilot research project to develop the methodology and regulatory review guidance on LOLA at multi-unit nuclear power plants since 2014. Through this research, we proposed a methodology to identify strategies for prevention and mitigation of the consequences of LOLA utilizing PSA techniques or their results. The proposed methodology comprises 8 steps, including policy consideration, threat evaluation, identification of damage path sets, SSC capacity evaluation, and identification of mitigation measures and strategies. The consequences of LOLA due to a malevolent aircraft crash may be significantly sensitive to analysis assumptions, including the type of aircraft, the amount of residual fuel, the feasible impact angle and so on, which cannot be shared overtly. This paper introduces an evaluation methodology for LOLA using PSA techniques and their results. We also provide a case study to evaluate feasible impact angles using a flight simulator for two types of aircraft and to identify potential path sets leading to core damage through affected SSCs within the damaged area. (author).
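    One way to read the "identification of damage path sets" step is as a screening of PSA minimal cut sets against the physical damage footprint. A minimal sketch under that reading, with purely illustrative SSC names and cut sets (not taken from any plant model):

```python
def core_damage_paths(minimal_cut_sets, damaged_sscs):
    """Return the cut sets whose SSCs all lie inside the damaged area.

    If any minimal cut set is fully contained in the damage footprint,
    the corresponding accident path leads to core damage with no
    surviving mitigation along that path.
    """
    damaged = set(damaged_sscs)
    return [cs for cs in minimal_cut_sets if set(cs) <= damaged]

# Illustrative cut sets and footprint only.
cut_sets = [
    {"EDG-A", "EDG-B"},             # loss of both emergency diesels
    {"AFW-PUMP", "EDG-B"},
    {"SW-PUMP-A", "SW-PUMP-B"},
]
footprint = {"EDG-A", "EDG-B", "AFW-PUMP"}
hits = core_damage_paths(cut_sets, footprint)
```

    Iterating this screening over candidate footprints (e.g. for different impact angles) is one way to rank which damage scenarios demand mitigation strategies first.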

  8. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  9. Setting up fuel supply strategies for large-scale bio-energy projects using agricultural and forest residues. A methodology for developing countries

    International Nuclear Information System (INIS)

    Junginger, M.

    2000-08-01

    The objective of this paper is to develop a coherent methodology to set up fuel supply strategies for large-scale biomass-conversion units. This method explicitly takes into account risks and uncertainties regarding availability and costs in relation to time. This paper aims at providing general guidelines, which are not country-specific. These guidelines cannot provide 'perfect fit' solutions, but aim to give general help to overcome barriers and to set up supply strategies. It will mainly focus on residues from the agricultural and forestry sector. This study focuses on electricity or combined electricity and heat production (CHP) with plant scales between 10 and 40 MWe. This range is chosen due to the rules of economies of scale. In large-scale plants the benefits of increased efficiency outweigh increased transportation costs, allowing a lower price per kWh, which in turn may allow higher biomass costs. However, fuel-supply risks tend to get higher with increasing plant size, which makes it more important to assess them for large(r) conversion plants. Although the methodology does not focus on a specific conversion technology, it should be stressed that the technology must be able to handle a wide variety of biomass fuels with different characteristics, because many biomass residues are not available year-round and various fuels are needed for a constant supply. The methodology allows for comparing different technologies (with known investment and operation and maintenance costs from literature) and evaluation for different fuel supply scenarios. In order to demonstrate the methodology, a case study was carried out for the north-eastern part of Thailand (Isaan), an agricultural region. The research was conducted in collaboration with the Regional Wood Energy Development Programme in Asia (RWEDP), a project of the UN Food and Agricultural Organization (FAO) in Bangkok, Thailand. In Section 2 of this paper the methodology will be presented. In Section 3 the economic
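    The scale trade-off invoked above (unit capital cost falling with size, transport cost rising as the biomass collection radius grows) can be sketched with a dimensionless cost index. The exponent and coefficient below are illustrative assumptions, not calibrated values:

```python
import math

def relative_cost_index(capacity_mwe, scale_exp=0.7, transport_coeff=0.05):
    """Dimensionless unit-cost index for a biomass plant of given capacity.

    Capital term: unit cost falls as capacity**(scale_exp - 1), a standard
    economies-of-scale form. Transport term: the collection radius (and so
    the average haul) grows roughly with the square root of fuel demand.
    Both coefficients are assumptions for illustration only.
    """
    capital = capacity_mwe ** (scale_exp - 1.0)
    transport = transport_coeff * math.sqrt(capacity_mwe)
    return capital + transport
```

    With these assumptions the index falls from small to medium scales and rises again at very large ones, which is the shape of the argument for the 10-40 MWe range while acknowledging that fuel-supply risk grows with size.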

  10. Development by a Large Integrated Health Care System of an Objective Methodology for Evaluation of Medical Oncology Service Sites.

    Science.gov (United States)

    Bjegovich-Weidman, Marija; Kahabka, Jill; Bock, Amy; Frick, Jacob; Kowalski, Helga; Mirro, Joseph

    2012-03-01

    Aurora Health Care (AHC) is the largest health care system in Wisconsin, with 14 acute care hospitals. In early 2010, a group of 18 medical oncologists became affiliated with AHC. This affiliation added 13 medical oncology infusion clinics to our existing 12 sites. In the era of health care reform and declining reimbursement, we need an objective method and criteria to evaluate our 25 outpatient medical oncology sites. We developed financial, clinical, and strategic tools for the evaluation and management of our cancer subservice lines and outpatient sites. The key to our success has been the direct involvement of stakeholders with a vested interest in the services in the selection of the criteria and evaluation process. We developed our objective metrics for evaluation based on strategic, financial, operational, and patient experience criteria. Strategic criteria included: population trends, full-time equivalent (FTE) medical oncologists/primary care physicians, FTE radiation oncologists, FTE oncologic surgeons, new annual cases of patients with cancer, and market share trends. Financial criteria per site included: physician work relative value units, staff FTE by type, staff salaries, and profit and loss. Operational criteria included: facility by type (clinic v hospital based), hours of operation, and facility detail (eg, No. of chairs, No. of procedure and examination rooms, square footage). Patient experience criteria included: nursing model primary/nurse navigators, multidisciplinary support at site, Press Ganey (South Bend, IN; health care performance improvement company) results, and employee engagement score. The outcome of our data analysis has resulted in the development of recommendations for AHC senior leadership and geographic market leadership to consider the consolidation of four sites (phase one, four sites; phase two, two sites) and priority strategic sites to address capacity issues that limit growth. The recommendations if implemented would
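    A multi-criteria site evaluation of this kind typically reduces to a weighted score per site. The weights and site data below are hypothetical, purely to illustrate the mechanics of combining the four criteria families named above:

```python
# Hypothetical weights over the four criteria families from the abstract.
WEIGHTS = {"strategic": 0.30, "financial": 0.35,
           "operational": 0.15, "patient_experience": 0.20}

def site_score(scores):
    """Combine per-category scores (0-100) into one weighted site score."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

# Illustrative site data, not real AHC figures.
sites = {
    "site_A": {"strategic": 80, "financial": 55,
               "operational": 70, "patient_experience": 90},
    "site_B": {"strategic": 40, "financial": 35,
               "operational": 60, "patient_experience": 75},
}
ranked = sorted(sites, key=lambda s: site_score(sites[s]), reverse=True)
```

    Ranking sites this way gives leadership an objective shortlist of consolidation candidates, with the stakeholder buy-in living in the choice of weights rather than in the arithmetic.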

  11. Safeguards methodology development history

    International Nuclear Information System (INIS)

    Chapman, L.D.; Bennett, H.A.; Engi, D.; Grady, L.M.; Hulme, B.L.; Sasser, D.W.

    1979-01-01

    The development of models for the evaluation and design of physical protection systems for fixed-site nuclear facilities was under way in 1974 at Sandia Laboratories and has continued to the present. A history of the evolution of these models and the model descriptions are presented. Several models have been, and continue to be, applied to evaluate and design facility protection systems

  12. Practical implications of rapid development methodologies

    CSIR Research Space (South Africa)

    Gerber, A

    2007-11-01

    Full Text Available as the acceleration of the system development phases through an iterative construction approach. These methodologies also claim to manage the changing nature of requirements. However, during the development of large and complex systems by a small and technically...

  13. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  14. Adaptation of a software development methodology to the implementation of a large-scale data acquisition and control system. [for Deep Space Network

    Science.gov (United States)

    Madrid, G. A.; Westmoreland, P. T.

    1983-01-01

    A progress report is presented on a program to upgrade the existing NASA Deep Space Network in terms of a redesigned computer-controlled data acquisition system for channelling tracking, telemetry, and command data between a California-based control center and three signal processing centers in Australia, California, and Spain. The methodology for the improvements is oriented towards single-subsystem development with consideration for a multi-system and multi-subsystem network of operational software. Details of the existing hardware configurations and data transmission links are provided. The program methodology includes data flow design, interface design and coordination, incremental capability availability, increased inter-subsystem developmental synthesis and testing, system and network level synthesis and testing, and system verification and validation. The software has thus far been implemented to a 65 percent completion level, and the methodology being used to effect the changes, which will permit enhanced tracking and communication with spacecraft, has been found to provide effective techniques.

  15. ISE System Development Methodology Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  16. Remotely controlled large container disposal methodology

    International Nuclear Information System (INIS)

    Amir, S.J.

    1994-09-01

    Remotely Handled Large Containers (RHLC), also called drag-off boxes, have been used at the Hanford Site since the 1940s to dispose of large pieces of radioactively contaminated equipment. These containers are typically large steel-reinforced concrete boxes, which weigh as much as 40 tons. Because large quantities of high-dose waste can produce radiation levels as high as 200 mrem/hour at 200 ft, the containers are remotely handled (either lifted off the railcar by crane or dragged off with a cable). Many of the existing containers do not meet existing structural and safety design criteria and some of the transportation requirements. The drag-off method of pulling the box off the railcar using a cable and a tractor is also not considered a safe operation, especially in view of past mishaps

  17. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application in lending operations in the Republic of Serbia. With the developing credit market, there is a growing need for a well-developed risk and loss prevention system. In the introduction, the process by which a bank analyzes a loan applicant in order to minimize and manage credit risk is presented. By examining the subject matter, the process of handling the credit application is described, as is the procedure of analyzing the financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework is presented as applied in a concrete company. In the third part, models are presented which banks should use to protect against exposure to risks; that is, their goal is to reduce losses on loan operations in our country, as well as to adjust to market conditions in an optimal way.

  18. Development of a Standardized Methodology for the Use of COSI-Corr Sub-Pixel Image Correlation to Determine Surface Deformation Patterns in Large Magnitude Earthquakes.

    Science.gov (United States)

    Milliner, C. W. D.; Dolan, J. F.; Hollingsworth, J.; Leprince, S.; Ayoub, F.

    2014-12-01

    Coseismic surface deformation is typically measured in the field by geologists and with a range of geophysical methods such as InSAR, LiDAR and GPS. Current methods, however, either fail to capture the near-field coseismic surface deformation pattern where vital information is needed, or lack pre-event data. We develop a standardized and reproducible methodology to fully constrain the surface, near-field, coseismic deformation pattern in high resolution using aerial photography. We apply our methodology using the program COSI-corr to successfully cross-correlate pairs of aerial, optical imagery before and after the 1992, Mw 7.3 Landers and 1999, Mw 7.1 Hector Mine earthquakes. This technique allows measurement of the coseismic slip distribution and of the magnitude and width of off-fault deformation with sub-pixel precision. This technique can be applied in a cost-effective manner for recent and historic earthquakes using archive aerial imagery. We also use synthetic tests to constrain and correct for the bias imposed on the result by the use of a sliding window during correlation. Correcting for artificial smearing of the tectonic signal allows us to robustly measure the fault zone width along a surface rupture. Furthermore, the synthetic tests have constrained for the first time the measurement precision and accuracy of estimated fault displacements and fault-zone width. Our methodology provides the unique ability to robustly understand the kinematics of surface faulting while at the same time accounting for both off-fault deformation and the measurement biases that typically complicate such data. For both earthquakes we find that our displacement measurements derived from cross-correlation are systematically larger than the field displacement measurements, indicating the presence of off-fault deformation. We show that the Landers and Hector Mine earthquakes accommodated 46% and 38% of displacement away from the primary rupture as off-fault deformation, over a mean
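    The core idea of sub-pixel correlation can be illustrated with a toy frequency-domain correlator (a stand-in for, not a reproduction of, COSI-Corr's algorithm): FFT cross-correlation of an image pair followed by parabolic refinement of the correlation peak to sub-pixel precision:

```python
import numpy as np

def subpixel_shift(ref, post):
    """Estimate the (dy, dx) displacement of `post` relative to `ref`
    via FFT cross-correlation with parabolic sub-pixel peak refinement."""
    spec = np.conj(np.fft.fft2(ref)) * np.fft.fft2(post)
    corr = np.fft.fftshift(np.real(np.fft.ifft2(spec)))
    py, px = np.unravel_index(np.argmax(corr), corr.shape)

    def refine(c_m, c_0, c_p):
        # Vertex of the parabola through three samples around the peak.
        denom = c_m - 2.0 * c_0 + c_p
        return 0.0 if denom == 0 else 0.5 * (c_m - c_p) / denom

    dy = py - ref.shape[0] // 2 + refine(corr[py - 1, px], corr[py, px], corr[py + 1, px])
    dx = px - ref.shape[1] // 2 + refine(corr[py, px - 1], corr[py, px], corr[py, px + 1])
    return dy, dx
```

    Running such a correlator in a sliding window over a pre/post image pair yields the dense displacement field; the smearing bias discussed in the abstract arises precisely from that windowing.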

  19. Application of agile methodologies in software development

    Directory of Open Access Journals (Sweden)

    Jovanović Aca D.

    2016-01-01

    Full Text Available The paper presents the potentials for the development of software using agile methodologies. Special consideration is devoted to the potentials and advantages of use of the Scrum methodology in the development of software and the relationship between the implementation of agile methodologies and the software development projects.

  20. Developing Large Web Applications

    CERN Document Server

    Loudon, Kyle

    2010-01-01

    How do you create a mission-critical site that provides exceptional performance while remaining flexible, adaptable, and reliable 24/7? Written by the manager of a UI group at Yahoo!, Developing Large Web Applications offers practical steps for building rock-solid applications that remain effective even as you add features, functions, and users. You'll learn how to develop large web applications with the extreme precision required for other types of software. Avoid common coding and maintenance headaches as small websites add more pages, more code, and more programmers. Get comprehensive soluti

  1. Formalizing the ISDF Software Development Methodology

    OpenAIRE

    Mihai Liviu DESPA

    2015-01-01

    The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of wha...

  2. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user, project owner and project manager’s point of view. The main components of a software development methodology are identified. Thus a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation oriented software development methodology is emphasized by highlighting shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  3. A development methodology for scientific software

    International Nuclear Information System (INIS)

    Cort, G.; Barrus, D.M.; Goldstone, J.A.; Miller, L.; Nelson, R.O.; Poore, R.V.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed

  4. Developing Foucault's Discourse Analytic Methodology

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-01-01

    Full Text Available A methodological position for a FOUCAULTian discourse analysis is presented. A sequence of analytical steps is introduced and an illustrating example is offered. It is emphasized that discourse analysis has to discover the system-level of discursive rules and the deeper structure of the discursive formation; otherwise the analysis will be unfinished. Michel FOUCAULT's work is theoretically grounded in French structuralism and (so-called) post-structuralism. In this paper, post-structuralism is not conceived as a means for overcoming structuralism, but as a way of critically continuing the structural perspective. In this way, discursive structures can be related to discursive practices and the concept of structure can be disclosed (e.g. to inter-discourse or DERRIDA's concept of structurality). In this way, the structural methodology is continued and radicalized, but not given up. In this paper, FOUCAULT's theory is combined with the works of Michel PÊCHEUX and (especially for the sociology of knowledge and the sociology of culture) Pierre BOURDIEU. The practice of discourse analysis is theoretically grounded. This practice can be conceived as a reflexive coupling of deconstruction and reconstruction in the material to be analyzed. This methodology therefore can be characterized as a reconstructive qualitative methodology. At the end of the article, forms of discourse analysis are criticized that do not intend to recover the system level of discursive rules and that do not intend to discover the deeper structure of the discursive formation (i.e. episteme, socio-episteme). These forms are merely commentaries on discourses (not their analyses); they remain phenomenological and are therefore pre-structuralist. URN: urn:nbn:de:0114-fqs060168

  5. Comparative study on software development methodologies

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2014-12-01

    Full Text Available This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation orientated IT projects. The paper starts by depicting specific characteristics in software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. Also the software development project manager handles challenges and risks that are predominantly encountered in business and research areas that involve state of the art technology. Conventional software development stages are defined and briefly described. Development stages are the building blocks of any software development methodology so it is important to properly research this aspect. Current software development methodologies are presented. Development stages are defined for every showcased methodology. For each methodology a graphic representation is illustrated in order to better individualize its structure. Software development methodologies are compared by highlighting strengths and weaknesses from the stakeholder's point of view. Conclusions are formulated and a research direction aimed at formalizing a software development methodology dedicated to innovation orientated IT projects is enunciated.

  6. A study on methodological of software development for HEP

    International Nuclear Information System (INIS)

    Ding Yuzheng; Dai Guiliang

    1999-01-01

    The HEP-related software system is a large one. It comprises mainly detector simulation software, DAQ software and an offline system. The author discusses the advantages of applying object-oriented (OO) methodologies to such a software system, and the basic strategy for the usage of OO methodologies, languages and tools in the development of HEP-related software is given

  7. PSA methodology development and application in Japan

    International Nuclear Information System (INIS)

    Kazuo Sato; Toshiaki Tobioka; Kiyoharu Abe

    1987-01-01

    The outlines of Japanese activities on development and application of probabilistic safety assessment (PSA) methodologies are described. First, the activities on methodology development are described for system reliability analysis, operational data analysis, core melt accident analysis, environmental consequence analysis and seismic risk analysis. Then the methodology application examples by the regulatory side and the industry side are described. (author)

  8. Developing educational hypermedia applications: a methodological approach

    Directory of Open Access Journals (Sweden)

    Jose Miguel Nunes

    1996-01-01

    Full Text Available This paper proposes a hypermedia development methodology with the aim of integrating the work of educators, who will be primarily responsible for the instructional design, with that of software experts, responsible for the software design and development. Hence, it is proposed that the educators and programmers should interact in an integrated and systematic manner following a methodological approach.

  9. Comparative study on software development methodologies

    OpenAIRE

    Mihai Liviu DESPA

    2014-01-01

    This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation orientated IT projects. The paper starts by depicting specific characteristics in software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. Also the software development project manager han...

  10. Reference Management Methodologies for Large Structural Models at Kennedy Space Center

    Science.gov (United States)

    Jones, Corey; Bingham, Ryan; Schmidt, Rick

    2011-01-01

    There have been many challenges associated with modeling some of NASA KSC's largest structures. Given the size of the welded structures at KSC, it was critically important to properly organize model structure and carefully manage references. Additionally, because of the amount of hardware to be installed on these structures, it was very important to have a means to coordinate between different design teams and organizations, check for interferences, produce consistent drawings, and allow for simple release processes. Facing these challenges, the modeling team developed a unique reference management methodology and model fidelity methodology. This presentation describes the techniques and methodologies that were developed for these projects. Attendees will learn about KSC's reference management and model fidelity methodologies for large structures, understand the goals of these methodologies, and appreciate the advantages of developing a reference management methodology.

  11. A methodology for developing distributed programs

    NARCIS (Netherlands)

    Ramesh, S.; Mehndiratta, S.L.

    1987-01-01

    A methodology, different from the existing ones, for constructing distributed programs is presented. It is based on the well-known idea of developing distributed programs via synchronous and centralized programs. The distinguishing features of the methodology are: 1) specification include process

  12. New computational methodology for large 3D neutron transport problems

    International Nuclear Information System (INIS)

    Dahmani, M.; Roy, R.; Koclas, J.

    2004-01-01

    We present a new computational methodology, based on the 3D characteristics method, dedicated to solving very large 3D problems without spatial homogenization. In order to eliminate the input/output problems that occur when solving these large problems, we set up a new computing scheme that requires more CPU resources than the usual approach based on sweeps over large tracking files. The huge storage capacity needed for some problems, and the related I/O queries issued by the characteristics solver, are replaced by on-the-fly recalculation of the tracks at each iteration step. Using this technique, large 3D problems are no longer I/O-bound, and distributed CPU resources can be used efficiently. (authors)
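
    The storage-versus-recompute trade described above can be sketched in miniature: instead of sweeping over a stored tracking file, the tracks are regenerated at each iteration. The sketch below is a toy illustration only; the functions and data stand in for the real ray tracing and transport sweep.

```python
# Toy illustration of the storage-vs-recompute trade-off: identical results,
# different resource profiles. "Tracks" here are stand-ins for characteristics
# tracking data; real solver geometry is far more complex.

def generate_tracks(n):
    """Deterministically recompute tracking data (stand-in for ray tracing)."""
    return [(i * 0.1, i % 7) for i in range(n)]

def sweep(tracks, flux):
    """One transport sweep over the tracks (toy update)."""
    return flux + sum(length for length, _ in tracks) * 1e-6

# I/O-bound scheme: tracks computed once and stored (a large file in practice)
stored = generate_tracks(10_000)
flux_a = 1.0
for _ in range(3):
    flux_a = sweep(stored, flux_a)

# CPU-bound scheme: tracks regenerated on the fly at each iteration,
# trading extra CPU work for the elimination of large tracking files
flux_b = 1.0
for _ in range(3):
    flux_b = sweep(generate_tracks(10_000), flux_b)

assert flux_a == flux_b  # same answer either way
```

    Because track generation is deterministic, both schemes produce identical iterates; the choice is purely one of CPU time against storage and I/O.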

  13. Photovoltaic module energy rating methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L. [National Renewable Energy Lab., Golden, CO (United States); Whitaker, C.; Newmiller, J. [Endecon Engineering, San Ramon, CA (United States)

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.

  14. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of the various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level, this scaling determines the effects various processes have on a state variable, and it ranks the processes in importance by the magnitude of the fractional change they cause in that state variable. At the system level, the scaling determines the governing processes and corresponding components, ranking these in order of importance according to their effect on the fractional change of system-wide state variables. Because the scaling reveals the fractional change of state variables on all levels, the methodology is called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate to the content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering the component effect metrics provides the hierarchy of processes in a component, then in all components and the system. FSM separates quantitatively dominant from minor processes and components and
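
    The effect metric lends itself to a small numerical sketch. Only the relations Ω = ωt and ω = transport rate / content come from the abstract; the process names, rates, contents and the 100 s scenario below are invented for illustration.

```python
# Illustrative sketch of the FSM effect metric: Omega = omega * t, with
# omega = transport rate / conserved content. All numbers are hypothetical.

def effect_metric(transport_rate, content, duration):
    """Dimensionless effect metric Omega = omega * t."""
    omega = transport_rate / content   # specific rate of fractional change [1/s]
    return omega * duration

# Hypothetical processes acting on one component over a 100 s scenario
processes = {
    "break_flow":    {"rate": 50.0, "content": 1000.0},  # kg/s over kg
    "ECC_injection": {"rate": 20.0, "content": 1000.0},
    "wall_heat":     {"rate": 2.0,  "content": 1000.0},
}

t = 100.0
ranking = sorted(
    ((name, effect_metric(p["rate"], p["content"], t))
     for name, p in processes.items()),
    key=lambda kv: kv[1], reverse=True,
)
for name, omega_t in ranking:
    print(f"{name}: Omega = {omega_t:.2f}")
```

    Sorting by Ω reproduces, in miniature, the ranking of dominant versus minor processes that the methodology performs at each hierarchical level.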

  15. The prosa methodology for scenario development

    International Nuclear Information System (INIS)

    Grupa, J.B.

    2001-01-01

    In this paper a methodology for scenario development is proposed. The method was developed in an effort to convince ourselves (and others) that all conceivable future developments of a waste repository have been covered. To be able to assess all conceivable future developments, the method needs to be comprehensive; to convince ourselves and others, it should be structured in such a way that the treatment of each conceivable future development is traceable. The methodology is currently being applied to two Dutch disposal designs. Preliminary results show that the elaborated method functions better than the original method, although some elements will need further refinement. (author)

  16. Methodology for developing new test methods

    Directory of Open Access Journals (Sweden)

    A. I. Korobko

    2017-06-01

    This paper describes a methodology for developing new test methods and forming solutions for their development. The basis of the methodology is formed by individual elements of the systems and process approaches, which contribute to an effective research strategy for the object under test, the study of interrelations, and the synthesis of an adequate model of the test method. The effectiveness of a developed test method is determined by the correct choice of the set of concepts, their interrelations and their mutual influence; this is what allows the assigned tasks to be solved and the goal to be achieved. The methodology is based on the use of fuzzy cognitive maps, and the choice of the method on which the decision-forming model is based is considered. The methodology provides for recording a model of a new test method as a finite set of objects that are significant for the characteristics of the test method; a causal relationship is then established between the objects, and the values of fitness indicators, the observability of the method, and the metrological tolerance for each indicator are established. The work is aimed at the overall goal of ensuring the quality of tests by improving the methodology for developing test methods.
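
    As one possible realization of the fuzzy-cognitive-map idea described above, the toy model below iterates concept activations through a weighted causal matrix until they settle. The concepts and weights are invented for illustration, not taken from the paper.

```python
import math

# Minimal fuzzy cognitive map: concepts hold activations in (0, 1), and
# weights[i][j] is the (hypothetical) causal influence of concept j on
# concept i. Iterating with a sigmoid squashing function drives the map
# towards a fixed point representing the settled state of the system.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Concepts (invented): 0 = method fitness, 1 = observability, 2 = tolerance
weights = [
    [0.0, 0.6, 0.4],
    [0.5, 0.0, 0.0],
    [0.3, 0.2, 0.0],
]

state = [0.5, 0.8, 0.2]   # initial activation of each concept
for _ in range(20):        # synchronous updates towards a fixed point
    state = [
        sigmoid(sum(w * s for w, s in zip(row, state)))
        for row in weights
    ]
print([round(s, 3) for s in state])
```

    Because the sigmoid is a contraction here, the iteration converges quickly; in a real application the weights would encode expert judgements about the causal links between test-method characteristics.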

  17. Evolution of courseware development methodology : recent issues

    NARCIS (Netherlands)

    Moonen, J.C.M.M.; Schoenmaker, Jan

    1992-01-01

    To improve the quality of courseware products and the efficiency of the courseware development process, a methodology based upon "courseware engineering", being a combination of instructional systems development and software engineering, has emerged over the last 10-15 years. Recently, software

  18. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Andrade Almeida, João; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  19. A methodology for development of biocatalytic processes

    DEFF Research Database (Denmark)

    Lima Ramos, Joana

    are available. The first case study presents a rational approach for defining a development strategy for multi-enzymatic processes. The proposed methodology requires a profound and structured knowledge of the multi-enzyme systems, integrating chemistry, biological and process engineering. In order to suggest......). These process metrics can often be attained by improvements in the reaction chemistry, the biocatalyst, and/or by process engineering, which often requires a complex process development strategy. Interestingly this complexity, which arises from the need for integration of biological and process technologies...... and their relationship with the overall process is not clear.The work described in this thesis presents a methodological approach for early stage development of biocatalytic processes, understanding and dealing with the reaction, biocatalyst and process constraints. When applied, this methodology has a decisive role...

  20. CATHARE code development and assessment methodologies

    International Nuclear Information System (INIS)

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-01-01

    The CATHARE thermal-hydraulic code has been developed jointly by the Commissariat a l'Energie Atomique (CEA), Electricite de France (EdF), and Framatome for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate-effects tests and integral-effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation

  1. A review of methodologies used in research on cadastral development

    DEFF Research Database (Denmark)

    Silva, Maria Augusta; Stubkjær, Erik

    2002-01-01

    World-wide, much attention has been given to cadastral development. As a consequence of experiences made during the last decades, several authors have stated the need for research in the domain of cadastre and proposed methodologies to be used. The purpose of this paper is to contribute to the acceptance of research methodologies needed for cadastral development, and thereby enhance theory in the cadastral domain. The paper reviews nine publications on cadastre and identifies the methodologies used. The review focuses on the institutional, social, political and economic aspects of cadastral development, rather than on the technical aspects. The main conclusion of this paper is that the methodologies used are largely those of the social sciences. That agrees with the notion that cadastre relates as much to people and institutions as it relates to land, and that cadastral systems are shaped

  2. Development Methodology for an Integrated Legal Cadastre

    NARCIS (Netherlands)

    Hespanha, J.P.

    2012-01-01

    This Thesis describes the research process followed in order to achieve a development methodology applicable to the reform of cadastral systems with a legal basis. It was motivated by the author’s participation in one of the first surveying and mapping operations for a digital cadastre in Portugal,

  3. Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies

    International Nuclear Information System (INIS)

    Budnitz, R.J.; Lambert, H.E.; Hill, E.E.

    1987-08-01

    The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and its use is demonstrated through analysis of the Zion-1 and LaSalle-2 reactors as case studies. This demonstration analysis is intended to show that the methodology can be applied in actual cases; the numerical values of core-damage frequency are not intended to be realistic. The analysis relies on SSMRP-based methodologies and databases. For both Zion-1 and LaSalle-2, assuming that loss of offsite power (LOSP) occurs after a large earthquake and that there are no operator recovery actions, the analysis finds very many combinations (Boolean minimal cut sets) involving chatter of three or four relays and/or pressure-switch contacts. The number of min-cut-set combinations is so large that there is a very high likelihood (of the order of unity) that at least one combination will occur after earthquake-caused LOSP. This conclusion depends in detail on the fragility curves and response assumptions used for chatter. Core-damage frequencies are calculated, but they are probably pessimistic because no credit is taken for operator recovery. The project has also developed an improved PRA methodology for quantifying operator error under high-stress conditions such as after a large earthquake. Single-operator and multiple-operator error rates are developed, and a case study involving an 8-step procedure (establishing feed-and-bleed in a PWR after an earthquake-initiated accident) is used to demonstrate the methodology

  4. Development of seismic PSA methodology at JAERI

    International Nuclear Information System (INIS)

    Muramatsu, K.; Ebisawa, K.; Matsumoto, K.; Oikawa, T.; Kondo, M.

    1995-01-01

    The Japan Atomic Energy Research Institute (JAERI) is developing a methodology for seismic probabilistic safety assessment (PSA) of nuclear power plants, aiming to provide a set of procedures, computer codes and data suitable for performing seismic PSA in Japan. In order to demonstrate the usefulness of JAERI's methodology and to obtain a better understanding of the factors controlling the results of seismic PSAs, a seismic PSA for a BWR is in progress, and in the course of this PSA various improvements were made to the methodology. In the area of hazard analysis, the current method is being applied to the model plant site. In the area of response analysis, the response factor method was modified to consider the non-linear response effect of the building. As for the capacity evaluation of components, since capacity data for PSA in Japan are very scarce, the capacities of selected components used in Japan were evaluated. In the systems analysis, the SECOM2 code was improved to perform importance analysis and sensitivity analysis for the effects of correlation of responses and correlation of capacities. This paper summarizes the recent progress of seismic PSA research at JAERI, with emphasis on the evaluation of component capacity and the improvement of the systems reliability analysis methodology. (author)

  5. Synthesis of methodology development and case studies

    OpenAIRE

    Roetter, R.P.; Keulen, van, H.; Laar, van, H.H.

    2000-01-01

    The "Systems Research Network for Ecoregional Land Use Planning in Support of Natural Resource Management in Tropical Asia" (SysNet) was financed under the Ecoregional Fund, administered by the International Service for National Agricultural Research (ISNAR). The objective of the project was to develop and evaluate methodologies and tools for land use analysis, and apply them at the subnational scale to support agricultural and environmental policy formulation. In the framework of this projec...

  6. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16 February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in the more than 100 projects that have been submitted to the CDM Executive Board (CDM-EB) for approval of their baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project, and the baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criteria of the CDM, namely that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook', developed under the CD4CDM project. (BA)

  7. Baseline methodologies for clean development mechanism projects

    International Nuclear Information System (INIS)

    Lee, M.K.; Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-01

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16 February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in the more than 100 projects that have been submitted to the CDM Executive Board (CDM-EB) for approval of their baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project, and the baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project towards meeting the important criteria of the CDM, namely that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook', developed under the CD4CDM project. (BA)

  8. Methodologies for local development in smart society

    Directory of Open Access Journals (Sweden)

    Lorena BĂTĂGAN

    2012-07-01

    Digital devices connected through the Internet are producing a large quantity of data, and all this information can be turned into knowledge, because we now have the computational power and advanced analytics solutions to make sense of it. With this knowledge, cities can reduce costs, cut waste, and improve efficiency, productivity and quality of life for their citizens. Efficient, smart cities are characterized by the greater importance they give to environment, resources, globalization and sustainable development. This paper presents a study of the methodologies for urban development that have become a central element of our society.

  9. GENESIS OF THE METHODOLOGY OF MANAGING THE DEVELOPMENT OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Z.N. Varlamova

    2007-06-01

    This paper investigates the genesis of the methodology of managing the development of organizations, understood as the set of methodological approaches and methods in use. The results of a comparative analysis of methodological approaches to the management of organizational development are presented. The traditional methodological approaches are complemented by strategic experiment and case-study methodology. Approaches to the formation of a new methodology, and techniques for investigating the sources of an organization's competitive advantages, are considered.

  10. Methodological guidelines for developing accident modification functions

    DEFF Research Database (Denmark)

    Elvik, Rune

    2015-01-01

    This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines, and an example is given of how to use them. The importance of exploratory analysis and an iterative approach in developing accident modification functions is stressed. The example shows that strict compliance with all the guidelines may be difficult, but represents a level of stringency that should be strived for. Currently the main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures. © 2015 Elsevier Ltd. All rights reserved.

  11. Development of a flight software testing methodology

    Science.gov (United States)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
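
    The executable-assertion technique described above can be illustrated with a small, hedged sketch. The original experiment used Ada and a watchdog task; the control law, parameter ranges and gain below are invented for illustration only.

```python
# Sketch of assertion-based runtime checking in the spirit of the experiment
# above. Asserting the output also indirectly (collaterally) tests other
# parameters, such as the gain, that no input assertion covers directly.

def pitch_command(angle_deg, rate_dps, gain=0.8):
    # Executable assertions on the input parameters (hypothetical ranges)
    assert -30.0 <= angle_deg <= 30.0, "angle out of range"
    assert -60.0 <= rate_dps <= 60.0, "rate out of range"
    cmd = gain * angle_deg + 0.1 * rate_dps
    # Output assertion: collaterally detects a corrupted gain as well
    assert abs(cmd) <= 30.0, "command limit exceeded"
    return cmd

print(pitch_command(10.0, 2.0))          # nominal case passes

try:
    pitch_command(10.0, 2.0, gain=8.0)   # injected error: corrupted gain
except AssertionError as e:
    print("detected:", e)
```

    The injected fault never touches an input parameter, yet the output assertion still catches it; this is the collateral-testing effect the study found made some assertions disproportionately effective.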

  12. Scrum of scrums solution for large size teams using scrum methodology

    OpenAIRE

    Qurashi, Saja Al; Qureshi, M. Rizwan Jameel

    2014-01-01

    Scrum is a structured framework to support complex product development. However, the Scrum methodology faces the challenge of managing large teams. To address this challenge, in this paper we propose a solution called Scrum of Scrums, in which we divide the Scrum team into teams of the right size and then organize them hierarchically into a Scrum of Scrums. The main goals of the proposed solution are to optimize communication between teams in the Scrum of Scrums and to make the system work aft...

  13. Extending statistical boosting. An overview of recent methodological developments.

    Science.gov (United States)

    Mayr, A; Binder, H; Gefeller, O; Schmid, M

    2014-01-01

    Boosting algorithms to simultaneously estimate and select predictor effects in statistical models have gained substantial interest during the last decade. This review highlights recent methodological developments regarding boosting algorithms for statistical modelling especially focusing on topics relevant for biomedical research. We suggest a unified framework for gradient boosting and likelihood-based boosting (statistical boosting) which have been addressed separately in the literature up to now. The methodological developments on statistical boosting during the last ten years can be grouped into three different lines of research: i) efforts to ensure variable selection leading to sparser models, ii) developments regarding different types of predictor effects and how to choose them, iii) approaches to extend the statistical boosting framework to new regression settings. Statistical boosting algorithms have been adapted to carry out unbiased variable selection and automated model choice during the fitting process and can nowadays be applied in almost any regression setting in combination with a large amount of different types of predictor effects.

  14. Large shaft development test plan

    International Nuclear Information System (INIS)

    Krug, A.D.

    1984-03-01

    This test plan proposes the conduct of shaft liner tests as part of the large shaft development test proposed for the Hanford Site in support of the repository development program. The objectives of these tests are to develop techniques for measuring liner alignment (straightness), both construction assembly alignment and downhole cumulative alignment, and to assess the alignment information as a real time feedback to aid the installation procedure. The test plan is based upon installing a 16 foot ID shaft liner into a 20 foot diameter shaft to a depth of 1000 feet. This test plan is considered to be preliminary in that it was prepared as input for the decision to determine if development testing is required in this area. Should the decision be made to proceed with development testing, this test plan shall be updated and revised. 6 refs., 2 figs

  15. Large shaft development test plan

    International Nuclear Information System (INIS)

    Krug, A.D.

    1984-03-01

    This test plan proposes the conduct of a large shaft development test at the Hanford site in support of the repository development program. The purpose and objective of the test plan is to obtain the information necessary to establish feasibility and to predict the performance of the drilling system used to drill large diameter shafts. The test plan is based upon drilling a 20 ft diameter shaft to a depth of 1,000 feet. The test plan specifies series of tests to evaluate the performance of the downhole assembly, the performance of the rig, and the ability of the system to cope with geologic hazards. The quality of the hole produced will also be determined. This test plan is considered to be preliminary in that it was prepared as input for the decision to determine if development testing is required in this area. Should the decision be made to proceed with development testing, this test plan shall be updated and revised. 6 refs., 2 figs., 3 tabs

  16. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  17. Development of assessment methodology for plant configuration control

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hyun; You, Young Woo; Kim, Yoon Ik; Yang, Hui Chang; Huh, Byeong Gill; Lee, Dong Won; Ahn, Gwan Won [Seoul National Univ., Seoul (Korea, Republic of)

    2001-03-15

    The purpose of this study is the development of an effective and comprehensive assessment methodology that reflects the characteristics of plants for the surveillance, maintenance, repair and operation of nuclear power plants. In this study, recent research is surveyed, and concept definitions, procedures, current PSA methodologies and the implementation of various models are evaluated. Through this survey, a systematic assessment methodology is suggested. The configuration control assessment methodology suggested in this study, developed to reflect the characteristics of Korean NPPs, can be utilized as a supplement to current PSA methodologies.

  18. Large superconducting coil fabrication development

    International Nuclear Information System (INIS)

    Brown, R.L.; Allred, E.L.; Anderson, W.C.; Burn, P.B.; Deaderick, R.I.; Henderson, G.M.; Marguerat, E.F.

    1975-01-01

    Toroidal fields for some fusion devices will be produced by an array of large superconducting coils. Their size, space limitations, and field requirements dictate that they be high-performance coils. Once installed, accessibility for maintenance and repair is severely restricted; therefore, good reliability is an obvious necessity. Sufficient coil fabrication will be undertaken to develop and test methods that are reliable, fast, and economical. Industrial participation will be encouraged from the outset to ensure a smooth transition from development phases to production phases. Initially, practice equipment for three meter bore circular coils will be developed. Oval coil forms will be included in the practice facility later. More automated equipment will be developed with the expectation of winding faster and obtaining good coil quality. Alternate types of coil construction and methods of winding and insulating will be investigated. Handling and assembly problems will be studied. All technology developed must be feasible for scaling up when much larger coils are needed. Experimental power reactors may need coils having six meter or larger bores

  19. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    Science.gov (United States)

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
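
The consensus and Trust-Tech stages are specific to this paper, but the PSO component at the methodology's core can be illustrated generically. The sketch below is a minimal global-best PSO in Python (not the authors' consensus-based variant; all parameter values are illustrative), minimizing the sphere benchmark:

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over a box using a basic global-best PSO."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                     # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]    # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Sphere function: global minimum 0 at the origin.
best, val = pso(lambda x: sum(t * t for t in x), dim=5, bounds=(-5.0, 5.0))
```

In the full methodology, a run like this would supply promising regions from which Trust-Tech and local solvers extract the surrounding local optima.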

  20. Development of assessment methodology for plant configuration control

    Energy Technology Data Exchange (ETDEWEB)

    Cheong, Chang Hyeon; Yu, Yeong Woo; Cho, Jae Seon; Kim, Ju Yeol; Kim, Yun Ik; Yang, Hui Chang; Park, Gang Min; Hur, Byeong Gil [Seoul National Univ., Seoul (Korea, Republic of)

    1999-03-15

    The purpose of this study is the development of an effective and comprehensive assessment methodology that reflects the characteristics of plants for the surveillance, maintenance, repair and operation of nuclear power plants. The development of this methodology can contribute to enhanced safety. In the first year of this study, recent research is surveyed, and concept definitions, procedures, current PSA methodologies, and the implementation of various models are evaluated. Based on this survey, a systematic assessment methodology is suggested.

  1. An automated methodology development. [Software design for combat simulation]

    Science.gov (United States)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real time, and one-to-one correlation between object states and real-world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  2. Methodological Developments in Geophysical Assimilation Modeling

    Science.gov (United States)

    Christakos, George

    2005-06-01

    This work presents recent methodological developments in geophysical assimilation research. We revisit the meaning of the term "solution" of a mathematical model representing a geophysical system, and we examine its operational formulations. We argue that an assimilation solution based on epistemic cognition (which assumes that the model describes incomplete knowledge about nature and focuses on conceptual mechanisms of scientific thinking) could lead to more realistic representations of the geophysical situation than a conventional ontologic assimilation solution (which assumes that the model describes nature as is and focuses on form manipulations). Conceptually, the two approaches are fundamentally different. Unlike the reasoning structure of conventional assimilation modeling that is based mainly on ad hoc technical schemes, the epistemic cognition approach is based on teleologic criteria and stochastic adaptation principles. In this way some key ideas are introduced that could open new areas of geophysical assimilation to detailed understanding in an integrated manner. A knowledge synthesis framework can provide the rational means for assimilating a variety of knowledge bases (general and site specific) that are relevant to the geophysical system of interest. Epistemic cognition-based assimilation techniques can produce a realistic representation of the geophysical system, provide a rigorous assessment of the uncertainty sources, and generate informative predictions across space-time. The mathematics of epistemic assimilation involves a powerful and versatile spatiotemporal random field theory that imposes no restriction on the shape of the probability distributions or the form of the predictors (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated) and accounts rigorously for the uncertainty features of the geophysical system. In the epistemic cognition context the assimilation concept may be used to

  3. Photogrammetry Methodology Development for Gossamer Spacecraft Structures

    Science.gov (United States)

    Pappa, Richard S.; Jones, Thomas W.; Black, Jonathan T.; Walford, Alan; Robson, Stuart; Shortis, Mark R.

    2002-01-01

    Photogrammetry--the science of calculating 3D object coordinates from images--is a flexible and robust approach for measuring the static and dynamic characteristics of future ultra-lightweight and inflatable space structures (a.k.a., Gossamer structures), such as large membrane reflectors, solar sails, and thin-film solar arrays. Shape and dynamic measurements are required to validate new structural modeling techniques and corresponding analytical models for these unconventional systems. This paper summarizes experiences at NASA Langley Research Center over the past three years to develop or adapt photogrammetry methods for the specific problem of measuring Gossamer space structures. Turnkey industrial photogrammetry systems were not considered a cost-effective choice for this basic research effort because of their high purchase and maintenance costs. Instead, this research uses mainly off-the-shelf digital-camera and software technologies that are affordable to most organizations and provide acceptable accuracy.
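
At the heart of any photogrammetric measurement is triangulation: recovering a 3D point from rays cast through its image in two or more calibrated cameras. As a hedged illustration (not NASA's pipeline; the midpoint method shown is the simplest textbook variant), the following computes the closest point between two viewing rays:

```python
import math

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def add(a, b): return [a[i] + b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def scale(a, s): return [a[i] * s for i in range(3)]

def triangulate(c1, d1, c2, d2):
    """Midpoint of the shortest segment between two viewing rays.

    c1, c2: camera centres; d1, d2: unit direction vectors of the rays
    from each camera through the imaged point.
    """
    # Solve for ray parameters s, t minimizing |(c1 + s*d1) - (c2 + t*d2)|^2
    w = sub(c1, c2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b          # ~0 only for (near-)parallel rays
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = add(c1, scale(d1, s))     # closest point on ray 1
    p2 = add(c2, scale(d2, t))     # closest point on ray 2
    return scale(add(p1, p2), 0.5)
```

With noisy real imagery the distance between `p1` and `p2` doubles as a cheap per-point residual, which is one reason bundle-adjustment packages report it.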

  4. Monitoring sustainable biomass flows : General methodology development

    NARCIS (Netherlands)

    Goh, Chun Sheng; Junginger, Martin; Faaij, André

    Transition to a bio-based economy will create new demand for biomass, e.g. the increasing use of bioenergy, but the impacts on existing markets are unclear. Furthermore, there is a growing public concern on the sustainability of biomass. This study proposes a methodological framework for mapping

  5. Methodology for analysis and simulation of large multidisciplinary problems

    Science.gov (United States)

    Russell, William C.; Ikeda, Paul J.; Vos, Robert G.

    1989-01-01

    The Integrated Structural Modeling (ISM) program is being developed for the Air Force Weapons Laboratory and will be available for Air Force work. Its goal is to provide a design, analysis, and simulation tool intended primarily for directed energy weapons (DEW), kinetic energy weapons (KEW), and surveillance applications. The code is designed to run on DEC (VMS and UNIX), IRIS, Alliant, and Cray hosts. Several technical disciplines are included in ISM, namely structures, controls, optics, thermal, and dynamics. Four topics from the broad ISM goal are discussed. The first is project configuration management and includes two major areas: the software and database arrangement and the system model control. The second is interdisciplinary data transfer and refers to exchange of data between various disciplines such as structures and thermal. Third is a discussion of the integration of component models into one system model, i.e., multiple discipline model synthesis. Last is a presentation of work on a distributed processing computing environment.

  6. Design Methodology of Large-scale Thermoelectric Generation

    DEFF Research Database (Denmark)

    Chen, Min; Gao, Junling; Zhu, Junpeng

    2011-01-01

    A thermoelectric generation system (TEGS) consists of not only thermoelectric modules (TEMs), but also the external load circuitry and the fluidic heat sources. In this paper, a system-level model is proposed in the SPICE-compatible environment to seamlessly integrate the complete fluid-thermal-electric-circuit multiphysics behaviors. Firstly, a quasi one-dimension numerical model for the thermal fluids and their non-uniform temperature distribution as the boundary condition for TEMs is implemented in SPICE using electrothermal analogy. Secondly, the electric field calculation of the previously proposed device-level SPICE model is upgraded to reflect the resistive behaviors of thermoelements, so that the electric connections among spatially distributed TEMs and the load circuitry can be freely combined in the simulation. Thirdly, a hierarchical and TEM-object oriented strategy is developed to make the system...
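
As a rough illustration of the electrothermal analogy the abstract describes (temperature plays the role of voltage, heat flow the role of current), a first-order lumped TEG model can be written directly, with the thermal path treated as a resistor divider. This is a hedged sketch, not the paper's SPICE model; Peltier and Joule feedback on the thermal side are neglected, and all parameter values below are illustrative:

```python
def teg_power(t_hot, t_cold, n_couples, seebeck, r_internal, r_load,
              k_module, k_contact_hot, k_contact_cold):
    """First-order lumped TEG model (Peltier/Joule feedback neglected).

    Thermal side: a series resistor divider (electrothermal analogy:
    temperature ~ voltage, heat flow ~ current) sets the temperature
    drop actually seen by the thermocouples.
    """
    r_th_module = 1.0 / k_module
    r_th_total = 1.0 / k_contact_hot + r_th_module + 1.0 / k_contact_cold
    dT = (t_hot - t_cold) * r_th_module / r_th_total   # divider rule
    v_oc = n_couples * seebeck * dT                     # open-circuit voltage
    i = v_oc / (r_internal + r_load)
    return i * i * r_load                               # power into the load

# Illustrative numbers only: temperatures in K, conductances in W/K,
# Seebeck coefficient in V/K per couple, resistances in ohms.
p = teg_power(t_hot=500.0, t_cold=300.0, n_couples=127, seebeck=400e-6,
              r_internal=2.0, r_load=2.0, k_module=1.0,
              k_contact_hot=5.0, k_contact_cold=5.0)
```

Even this toy model reproduces the matched-load maximum (power peaks near `r_load == r_internal`), the kind of system-level behavior the paper captures by coupling the thermal and electric networks in SPICE.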

  7. On-Line Maintenance Methodology Development

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyo Won; Kim, Jae Ho; Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2012-05-15

    Most domestic maintenance activities for nuclear power plants are performed during overhaul. On-Line Maintenance (OLM) is therefore a suitable risk-informed application technique for spreading the maintenance burden outside the overhaul period while the safety of the plant is secured. NUMARC 93-01 (Rev. 3) presents the state of the art in OLM and provides a methodology. This study adopts NUMARC 93-01 (Rev. 3) and presents an OLM application. The reference component is the Emergency Diesel Generator (EDG) of Ulchin 3 and 4

  8. Inkjet printed large-area flexible circuits: a simple methodology for optimizing the printing quality

    Science.gov (United States)

    Cheng, Tao; Wu, Youwei; Shen, Xiaoqin; Lai, Wenyong; Huang, Wei

    2018-01-01

    In this work, a simple methodology was developed to enhance the patterning resolution of inkjet printing, involving process optimization as well as substrate modification and treatment. The line width of the inkjet-printed silver lines was successfully reduced to 1/3 of the original value using this methodology. Large-area flexible circuits with delicate patterns and good morphology were thus fabricated. The resultant flexible circuits showed excellent conductivity, with sheet resistance as low as 4.5 Ω/□, and strong tolerance to mechanical bending. The simple methodology is also applicable to substrates with various wettability, which suggests a general strategy to enhance the printing quality of inkjet printing for manufacturing high-performance large-area flexible electronics. Project supported by the National Key Basic Research Program of China (Nos. 2014CB648300, 2017YFB0404501), the National Natural Science Foundation of China (Nos. 21422402, 21674050), the Natural Science Foundation of Jiangsu Province (Nos. BK20140060, BK20130037, BK20140865, BM2012010), the Program for Jiangsu Specially-Appointed Professors (No. RK030STP15001), the Program for New Century Excellent Talents in University (No. NCET-13-0872), the NUPT "1311 Project" and Scientific Foundation (Nos. NY213119, NY213169), the Synergetic Innovation Center for Organic Electronics and Information Displays, the Priority Academic Program Development of Jiangsu Higher Education Institutions (PAPD), the Leading Talent of Technological Innovation of National Ten-Thousands Talents Program of China, the Excellent Scientific and Technological Innovative Teams of Jiangsu Higher Education Institutions (No. TJ217038), the Program for Graduate Students Research and Innovation of Jiangsu Province (No. KYZZ16-0253), and the 333 Project of Jiangsu Province (Nos. BRA2017402, BRA2015374).

  9. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    Oliveira, Laura Cristina de

    2003-01-01

    In the present work, the viability of using neutron activation analysis (NAA) to perform urine and blood clinical analyses was checked. The aim of this study is to investigate the biological behavior of animals that have been fed chow doped with natural uranium for a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentrations in biological samples. The quantitative results for urine sediment using NAA were compared with conventional clinical analysis, and the results were compatible. This methodology was also used on bone and body organs such as liver and muscle to help the interpretation of possible anomalies. (author)
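
The absolute (parametric) method mentioned here computes element mass directly from nuclear data rather than from a co-irradiated comparator standard. A minimal sketch of the underlying activation equation, solved for the element mass, might look as follows (an illustration, not the author's code; the example call uses rough, hypothetical values loosely based on Na-23(n,γ)Na-24):

```python
import math

N_A = 6.02214076e23  # Avogadro's number, 1/mol

def naa_mass(count_rate, efficiency, gamma_intensity, sigma_cm2, flux,
             molar_mass, isotopic_abundance, half_life_s, t_irr_s, t_decay_s):
    """Element mass (g) from the standard NAA activation equation,
    solved for m (absolute/parametric method, no comparator standard):

        A = (m * N_A * theta / M) * sigma * phi
            * (1 - exp(-lambda * t_irr)) * exp(-lambda * t_decay)
    """
    lam = math.log(2.0) / half_life_s
    activity = count_rate / (efficiency * gamma_intensity)  # disintegrations/s
    saturation = 1.0 - math.exp(-lam * t_irr_s)             # build-up factor
    decay = math.exp(-lam * t_decay_s)                      # cooling correction
    n_target_per_gram = N_A * isotopic_abundance / molar_mass
    return activity / (n_target_per_gram * sigma_cm2 * flux * saturation * decay)

# Illustrative, hypothetical numbers only:
mass_g = naa_mass(count_rate=250.0, efficiency=0.05, gamma_intensity=1.0,
                  sigma_cm2=0.53e-24, flux=1e13, molar_mass=22.99,
                  isotopic_abundance=1.0, half_life_s=53856.0,
                  t_irr_s=3600.0, t_decay_s=3600.0)
```

The appeal for clinical batches is clear from the formula: once detector efficiency and flux are characterized, no standards need to be prepared per sample.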

  10. A component-based groupware development methodology

    NARCIS (Netherlands)

    Guareis de farias, Cléver; Ferreira Pires, Luis; van Sinderen, Marten J.

    2000-01-01

    Software development in general and groupware applications in particular can greatly benefit from the reusability and interoperability aspects associated with software components. Component-based software development enables the construction of software artefacts by assembling prefabricated,

  11. The Typology of Methodological Approaches to Development of Innovative Clusters

    Directory of Open Access Journals (Sweden)

    Farat Olexandra V.

    2017-06-01

    The aim of the article is to study the existing methodological approaches to assessing the development of enterprises and to substantiate the possibilities of their use by cluster associations. Based on an analysis of the scientific literature, the most applicable methodological approaches to assessing the development of enterprises are characterized: eight methodical approaches to assessing the level of development of enterprises and four methodological approaches to assessing the level of development of clusters are singled out. Each approach has certain advantages and disadvantages, but none of them provides a systematic assessment of all areas of cluster functioning, identifies possible reserves for cluster competitiveness growth, or characterizes possible strategies for future development. Taking into account the peculiarities of the functioning and development of cluster associations of enterprises, we propose our own methodological approach for assessing the development of innovative cluster structures.

  12. Development methodology for industrial diesel engines; Entwicklungsmethode fuer Industrie-Dieselmotoren

    Energy Technology Data Exchange (ETDEWEB)

    Bergmann, Dirk; Kech, Johannes [MTU Friedrichshafen GmbH (Germany)

    2011-11-15

    In order to remain cost-effective with relatively low production volumes in spite of the high requirements regarding emissions and durability, MTU uses a clearly structured development methodology with a close interlinking of technology and product development in the development of its large engines. For the new engine of the 4000 Series with cooled EGR, MTU applied this methodology in order to implement the emissions concept from the initial idea right through to the serial product. (orig.)

  13. Cooperative learning as a methodology for inclusive education development

    Directory of Open Access Journals (Sweden)

    Yolanda Muñoz Martínez

    2017-06-01

    This paper presents the methodology of cooperative learning as a strategy for developing the principles of inclusive education. It has a practical orientation, intended to provide tools for teachers who want to implement this methodology in the classroom. It starts with a theoretical review, followed by a description of a case in which this methodology has been applied for five years. We describe specific activities and ways of working with students, and then draw conclusions on the implementation of the methodology.

  14. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    International Nuclear Information System (INIS)

    Lahtinen, J.; Launiainen, T.; Heljanko, K.; Ropponen, J.

    2012-01-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)
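
The core of explicit-state model checking of a safety property is a reachability search over the model's state space, returning a counterexample trace when the invariant is violated. The report's methods target industrial-scale function-block models with symbolic tools, but the basic idea can be sketched in a few lines (a toy illustration, not the SARANA tooling):

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explicit-state reachability check of a safety property.

    Explores all states reachable from `initial` via `successors` and
    returns (True, None) if `invariant` holds everywhere, otherwise
    (False, trace) with a shortest counterexample path.
    """
    parent = {initial: None}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            # Reconstruct the path from the initial state to the violation.
            trace = []
            while state is not None:
                trace.append(state)
                state = parent[state]
            return False, list(reversed(trace))
        for nxt in successors(state):
            if nxt not in parent:
                parent[nxt] = state
                queue.append(nxt)
    return True, None

# Toy model: a 3-bit counter that must never reach 7.
ok, trace = check_invariant(
    0,
    lambda s: [(s + 1) % 8],
    lambda s: s != 7,
)
```

The state-explosion problem the report addresses is visible even here: explicit enumeration scales with the number of reachable states, which is why detailed fault models and asynchronous composition require symbolic or modular techniques.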

  15. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    Energy Technology Data Exchange (ETDEWEB)

    Lahtinen, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Launiainen, T.; Heljanko, K.; Ropponen, J. [Aalto Univ., Espoo (Finland). Dept. of Information and Computer Science

    2012-07-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)

  16. Methodology for Developing a Diesel Exhaust After Treatment Simulation Tool

    DEFF Research Database (Denmark)

    Christiansen, Tine; Jensen, Johanne; Åberg, Andreas

    2018-01-01

    A methodology for the development of catalyst models is presented, together with a methodology for implementing such models into a modular simulation tool that simulates the units in succession. A case study illustrates how suitable models can be found and used for s...

  17. Developments in reactor materials science methodology

    International Nuclear Information System (INIS)

    Tsykanov, V.A.; Ivanov, V.B.

    1987-01-01

    Problems related to the organization of investigations in reactor materials science are considered. Currently, the efficiency and reliability of nuclear power units are largely determined by how correctly and quickly conclusions are drawn concerning the parameters of designs and materials that have operated for long periods in reactor cores. To increase the informational value of materials science investigations, it is necessary to create a uniform system that provides for solving methodological, technical and organizational problems. Peculiarities of the current state of reactor materials science are analysed, and recommendations are given on constructing an optimal scheme of investigations and on the interconnection of data flows

  18. Development of Engine Loads Methodology, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR seeks to improve the definition of design loads for rocket engine components such that higher performing, lighter weight engines can be developed more...

  19. Development of analysis methodology on turbulent thermal striping

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Geun Jong; Jeon, Won Dae; Han, Jin Woo; Gu, Byong Kook [Changwon National University, Changwon(Korea)

    2001-03-01

    For developing the analysis methodology, the important governing factors of the thermal striping phenomenon are identified as geometric configuration and flow characteristics such as velocity. For these factors, the performance of the turbulence models in the existing analysis methodology is evaluated against experimental data. The status of DNS applications is also assessed based on the literature. The evaluation results are reflected in setting up the new analysis methodology. From the evaluation of the existing analysis methodology, the Full Reynolds Stress (FRS) model is identified as the best among the turbulence models considered, and LES is found to be able to provide time-dependent turbulence values. Further improvements in the near-wall region and in the temperature variance equation are required for FRS, and implementation of new sub-grid scale models is required for LES. Through these improvements, a new, reliable analysis methodology for thermal striping can be developed. 30 refs., 26 figs., 6 tabs. (Author)

  20. A Review of Roads Data Development Methodologies

    Directory of Open Access Journals (Sweden)

    Taro Ubukawa

    2014-05-01

    There is a clear need for a public domain data set of road networks with high spatial accuracy and global coverage for a range of applications. The Global Roads Open Access Data Set (gROADS), version 1, is a first step in that direction. gROADS relies on data from a wide range of sources and was developed using a range of methods. Traditionally, map development was highly centralized and controlled by government agencies due to the high cost of the required expertise and technology. In the past decade, however, high-resolution satellite imagery and global positioning system (GPS) technologies have come into wide use, and there has been significant innovation in web services, such that a number of new methods to develop geospatial information have emerged, including automated and semi-automated road extraction from satellite/aerial imagery and crowdsourcing. In this paper we review the data sources, methods, and pros and cons of a range of road data development methods: heads-up digitizing, automated/semi-automated extraction from remote sensing imagery, GPS technology, crowdsourcing, and compiling existing data sets. We also consider the implications of each method for the production of open data.

  1. Methodology for economic evaluation of software development projects

    International Nuclear Information System (INIS)

    Witte, D.M.

    1990-01-01

    Many oil and gas exploration and production companies develop computer software in-house or with contract programmers to support their exploration activities. Software development projects compete for funding with exploration and development projects, though most companies lack valid comparison measures for the two types of projects. This paper presents a methodology of pro forma cash flow analysis for software development proposals intended for internal use. This methodology, based on estimates of development and support costs, exploration benefits, and the probability of successful development and implementation, can be used to compare proposed software development projects directly with competing exploration proposals
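
A pro forma cash flow analysis of the kind described reduces to a risk-weighted net present value: development cost is sunk up front, while support costs and exploration benefits accrue only if development and implementation succeed. A minimal sketch (the figures in the example call are purely illustrative, not from the paper):

```python
def expected_npv(dev_cost, support_cost, annual_benefit, years,
                 p_success, discount_rate):
    """Risk-weighted NPV of an internal software project.

    Development cost is sunk at t=0; if development succeeds (probability
    p_success), support costs and benefits accrue at the end of each year.
    """
    npv = -dev_cost
    for t in range(1, years + 1):
        net = p_success * (annual_benefit - support_cost)
        npv += net / (1.0 + discount_rate) ** t
    return npv

# Hypothetical project: $500k to build, $50k/yr support, $300k/yr benefit,
# 5-year life, 70% chance of successful development, 10% discount rate.
npv = expected_npv(dev_cost=500_000, support_cost=50_000,
                   annual_benefit=300_000, years=5,
                   p_success=0.7, discount_rate=0.10)
```

Expressed this way, a software proposal yields the same decision metric (expected NPV) as an exploration prospect, which is exactly the comparability the methodology is after.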

  2. Development of risk-informed assessment (RIA) design methodology

    International Nuclear Information System (INIS)

    Ji, S. K.; Park, S. J.; Park, B. R.; Kim, M. R.; Choi, C. J.

    2001-01-01

    It has been assessed that the capital cost for future nuclear power plants needs to be reduced on the order of 35% to 40% for Advanced Light Water Reactors such as the KNGR and System 80+. Such a reduction in capital cost will require a fundamental re-evaluation of the industry standards and regulatory basis under which nuclear plants are designed and licensed. The objective of this study is to develop a risk-informed assessment (RIA) design methodology for future nuclear power plants. To meet this objective, a design simplification method is developed and the RIA design methodology is exercised on a conceptual system. For methodology verification, simplified conceptual ECCS and feedwater systems are developed, and then LOCA sensitivity analyses and aggressive secondary cooldown analyses for these systems are performed. In addition, the probabilistic safety assessment (PSA) model for LOCA is developed and the validity of the RIA design methodology is demonstrated

  3. Large rotorcraft transmission technology development program

    Science.gov (United States)

    Mack, J. C.

    1983-01-01

    Testing of a U.S. Army XCH-62 HLH aft rotor transmission under NASA Contract NAS 3-22143 was successfully completed. This test establishes the feasibility of large, high power rotorcraft transmissions and demonstrates the resolution of deficiencies identified during the HLH advanced technology programs and reported in USAAMRDL-TR-77-38. Over 100 hours of testing were conducted. At the 100% design power rating of 10,620 horsepower, the power transferred through a single spiral bevel gear mesh is more than twice that of current helicopter bevel gearing. In the original design of these gears, industry-wide design methods were employed, and failures were experienced that identified problem areas unique to gear size. To remedy this technology shortfall, a program was developed to predict gear stresses using finite element analysis for complete and accurate representation of the gear tooth and supporting structure. To validate the finite element methodology, gear strain data from the existing U.S. Army HLH aft transmission were acquired, and existing data from smaller gears were made available.

  4. Why did humans develop a large brain?

    OpenAIRE

    Muscat Baron, Yves

    2012-01-01

    "Of all animals, man has the largest brain in proportion to his size"- Aristotle. Dr Yves Muscat Baron shares his theory on how humans evolved large brains. The theory outlines how gravity could have helped humans develop a large brain- the author has named the theory 'The Gravitational Vascular Theory'. http://www.um.edu.mt/think/why-did-humans-develop-a-large-brain/

  5. Development of Advanced Non-LOCA Analysis Methodology for Licensing

    International Nuclear Information System (INIS)

    Jang, Chansu; Um, Kilsup; Choi, Jaedon

    2008-01-01

    KNF is developing a new design methodology for Non-LOCA analysis for licensing purposes. The code chosen is the best-estimate transient analysis code RETRAN, and the OPR1000 is the target plant. For this purpose, KNF prepared a simple nodal scheme appropriate to the licensing analyses and developed the designer-friendly analysis tool ASSIST (Automatic Steady-State Initialization and Safety analysis Tool). To check the validity of the newly developed methodology, the single CEA withdrawal and locked rotor accidents are analyzed using the new methodology and compared with current design results. The comparison shows good agreement, and it is concluded that the new design methodology can be applied to Non-LOCA licensing calculations for the OPR1000

  6. Development of Proliferation Resistance Assessment Methodology Based on International Standards

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Jung Won; Lee, Kwang Seok

    2009-03-01

    Proliferation resistance (PR) is one of the requirements to be met by next-generation nuclear energy systems under GEN IV and INPRO. Internationally, work on PR evaluation methodology began as early as 1980, but systematic development started in the 2000s. In Korea, independent development of a PR evaluation methodology was started in 2007 as a national long-term R and D project, motivated by the export of nuclear energy systems and the need to increase the international credibility and transparency of domestic nuclear system and fuel cycle development; a model for the PR evaluation methodology is being developed. In the first year, a comparative study of GEN IV/INPRO, PR indicator development, indicator quantification, evaluation model development, and analyses of technology systems and international technology development trends were performed. In the second year, a feasibility study of the indicators, their allowable limits, and a review of their technical requirements were carried out. The results of PR evaluation must be applied from the beginning of the conceptual design of a nuclear system. Through this development of PR evaluation methodology, the methodology will be applied to the regulatory requirements for licensing that are to be developed

  7. Radioimmunoassay of h-TSH - methodological suggestions for dealing with medium to large numbers of samples

    International Nuclear Information System (INIS)

    Mahlstedt, J.

    1977-01-01

    The article deals with practical aspects of establishing a TSH-RIA for patients, with particular regard to predetermined quality criteria. Methodological suggestions are made for medium to large numbers of samples, with the aim of reducing monotonous precision work steps by means of simple aids. The required quality criteria are well met, while the test procedure is well adapted to the rhythm of work and may be carried out without loss of precision even with large numbers of samples. (orig.)

  8. What the Current System Development Trends tell us about Systems Development Methodologies: Toward explaining SSADM, Agile and IDEF0 Methodologies

    Directory of Open Access Journals (Sweden)

    Abdulla F. Ally

    2015-03-01

    Full Text Available Systems integration, customization, and component-based development approaches are receiving increasing attention. This trend directs research attention to systems development methodologies as well. The availability of systems development tools, rapid changes in technology, the evolution of mobile computing, and the growth of cloud computing have necessitated a move toward systems integration and customization rather than developing systems from scratch. This tendency encourages component-based development and discourages the traditional systems development approach. The paper presents and evaluates the SSADM, IDEF0 and Agile systems development methodologies. More specifically, it examines how well each fits into the current competitive market of systems development. From this perspective, it is anticipated that, despite its popularity, the SSADM methodology is becoming obsolete, while the Agile and IDEF0 methodologies are still gaining acceptance in the current competitive market of systems development. The present study is likely to enrich our understanding of systems development methodology concepts and to draw attention to where current trends in systems development are heading.

  9. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the revision of 10CFR50.46 in 1988, which allowed BE (Best-Estimate) methods in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have been changing continuously from conservative EM (Evaluation Model) approaches to BE ones. In this context, LSC (Long-Term core Cooling) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and deficiencies of the old methodology were identified, and as a result, the USNRC suspended the approval of CENPD-254-P-A, the old LSC methodology for CE-designed NPPs. Regulatory bodies requested that the non-conservatisms be removed and that system transient behavior be reflected in all the LSC methodologies in use. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full range of the break spectrum is considered, and applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the new methodology could be applied without significant changes to current LSC plans
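The boric-acid build-up that motivates a code like BACON can be illustrated with a minimal mass-balance sketch. Everything below is an illustrative assumption, not plant data or the licensed model: the function name, all parameter values, and the simplification that steam carries no boron while borated makeup water replaces the boil-off.

```python
def boric_acid_history(mass_water_kg=50_000.0,   # liquid inventory in core
                       boiloff_kg_s=5.0,         # steam leaving the core
                       makeup_ppm=2_500.0,       # boron in injected water
                       hours=24.0,
                       dt_s=60.0):
    """Return (time_h, ppm) lists for the core boron concentration.

    Assumption: boil-off removes essentially boron-free steam while
    makeup water carries boron at a fixed concentration, so boron
    accumulates in the core. The sketch ignores the solubility /
    precipitation limit that a real long-term cooling analysis must
    guard against.
    """
    boron_kg = mass_water_kg * makeup_ppm * 1e-6  # initial boron inventory
    times, conc = [], []
    t = 0.0
    while t <= hours * 3600.0:
        times.append(t / 3600.0)
        conc.append(boron_kg / mass_water_kg * 1e6)
        # Makeup replaces the boiled-off water and brings boron with it;
        # the liquid mass stays constant while the boron inventory grows.
        boron_kg += boiloff_kg_s * dt_s * makeup_ppm * 1e-6
        t += dt_s
    return times, conc

t, c = boric_acid_history()
print(f"boron after {t[-1]:.0f} h: {c[-1]:.0f} ppm")
```

The monotone rise toward the precipitation limit is exactly why the abstract emphasizes tracking boric acid concentration over the full break spectrum.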

  10. Physical protection evaluation methodology program development and application

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Janghoon; Yoo, Hosik [Korea Institute of Nuclear Non-proliferation and Control, Daejeon (Korea, Republic of)

    2015-10-15

    It is essential to develop a reliable physical protection evaluation methodology for applying physical protection concepts at the design stage. The methodology can be used to assess weak points and improve performance, not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not trivial, since many interconnected factors affect overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodologies are among the best known. INPRO adopts a checklist-type questionnaire and is strong in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool for assessing the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, a vital area analysis, and a realistic threat scenario assessment are required.

  11. Physical protection evaluation methodology program development and application

    International Nuclear Information System (INIS)

    Seo, Janghoon; Yoo, Hosik

    2015-01-01

    It is essential to develop a reliable physical protection evaluation methodology for applying physical protection concepts at the design stage. The methodology can be used to assess weak points and improve performance, not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not trivial, since many interconnected factors affect overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodologies are among the best known. INPRO adopts a checklist-type questionnaire and is strong in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool for assessing the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, a vital area analysis, and a realistic threat scenario assessment are required.

  12. Prometheus Reactor I and C Software Development Methodology, for Action

    International Nuclear Information System (INIS)

    T. Hamilton

    2005-01-01

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I and C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I and C Software Development Process Manual and Reactor Module Software Development Plan to NR for information

  13. Prometheus Reactor I&C Software Development Methodology, for Action

    Energy Technology Data Exchange (ETDEWEB)

    T. Hamilton

    2005-07-30

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I&C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I&C Software Development Process Manual and Reactor Module Software Development Plan to NR for information.

  14. A methodology for the synthesis of heat exchanger networks having large numbers of uncertain parameters

    International Nuclear Information System (INIS)

    Novak Pintarič, Zorka; Kravanja, Zdravko

    2015-01-01

    This paper presents a robust computational methodology for the synthesis and design of flexible HENs (Heat Exchanger Networks) having large numbers of uncertain parameters. The methodology combines several heuristic methods which progressively lead to a flexible HEN design at a specified level of confidence. In the first step, a HEN topology is generated under nominal conditions, followed by the determination of the points critical for flexibility. A significantly reduced multi-scenario model for flexible HEN design is then formulated at the nominal point with flexibility constraints at the critical points. The optimal design obtained is tested by stochastic Monte Carlo optimization, and the flexibility index is verified by solving one-scenario problems within a loop. The novelty of the presented methodology lies in the enormous reduction in the number of scenarios in HEN design problems, and hence in computational effort. Despite several simplifications, the capability of designing flexible HENs with large numbers of uncertain parameters, which are typical throughout industry, is not compromised. An illustrative case study is presented for flexible HEN synthesis comprising 42 uncertain parameters. - Highlights: • A methodology for HEN (Heat Exchanger Network) design under uncertainty is presented. • The main benefit is solving HENs having large numbers of uncertain parameters. • A drastically reduced multi-scenario HEN design problem is formulated through several steps. • Flexibility of the HEN is guaranteed at a specified level of confidence.
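The Monte Carlo flexibility test the abstract describes — checking a fixed design against sampled values of the uncertain parameters and reporting the fraction of feasible scenarios — can be sketched for a toy single-exchanger stand-in. The constraint (Q = U·A·ΔT must cover the duty) and all parameter ranges below are illustrative assumptions, not the paper's HEN model:

```python
import random

def flexibility_confidence(design_area, n_samples=10_000, seed=1):
    """Estimate the probability that a fixed heat-exchanger area
    satisfies the duty requirement over sampled uncertain conditions.
    Toy single-exchanger stand-in for a full HEN."""
    rng = random.Random(seed)
    feasible = 0
    for _ in range(n_samples):
        # Uncertain parameters (illustrative nominal +/- ranges):
        U = rng.uniform(0.8, 1.2)          # kW/(m^2 K), heat transfer coeff.
        dT = rng.uniform(20.0, 40.0)       # K, mean temperature difference
        duty = rng.uniform(900.0, 1100.0)  # kW, required heat duty
        if U * design_area * dT >= duty:   # does the design meet the duty?
            feasible += 1
    return feasible / n_samples

# A larger area covers more of the uncertain parameter space:
print(flexibility_confidence(40.0), flexibility_confidence(60.0))
```

The paper's point is that for 42 uncertain parameters one cannot enumerate scenario combinations; the critical-point reduction keeps the design problem small, and sampling like the above is used only to verify the finished design's confidence level.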

  15. Non-economic determinants of economic development: methodology and influence

    OpenAIRE

    Barashov, N.

    2011-01-01

    The paper deals with research methodology of non-economic determinants of economic development. The author considers various theoretical approaches to definition of economic growth factors. Considerable attention is given to studying possible influence of non-economic determinants on quality of economic development.

  16. A vision on methodology for integrated sustainable urban development: bequest

    NARCIS (Netherlands)

    Bentivegna, V.; Curwell, S.; Deakin, M.; Lombardi, P.; Mitchell, G.; Nijkamp, P.

    2002-01-01

    The concepts and visions of sustainable development that have emerged in the post-Brundtland era are explored in terms of laying the foundations for a common vision of sustainable urban development (SUD). The described vision and methodology for SUD resulted from the activities of an international

  17. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  18. Development of seismic risk analysis methodologies at JAERI

    International Nuclear Information System (INIS)

    Tanaka, T.; Abe, K.; Ebisawa, K.; Oikawa, T.

    1988-01-01

    The usefulness of probabilistic safety assessment (PSA) is recognized worldwide for balanced design and regulation of nuclear power plants. In Japan, the Japan Atomic Energy Research Institute (JAERI) has been engaged in developing the methodologies necessary for carrying out PSA. The research and development program was started in 1980; at that time the effort covered only internal-initiator PSA. In 1985 the program was expanded to include external event analysis. Although this expanded program is to cover various external initiators, the current effort is dedicated to seismic risk analysis. There are three levels of seismic PSA, as for internal-initiator PSA: Level 1, evaluation of core damage frequency; Level 2, evaluation of radioactive release frequency and source terms; and Level 3, evaluation of environmental consequences. In JAERI's program, only the methodologies for Level 1 seismic PSA are under development. The methodology development for seismic risk analysis is divided into two phases. The Phase I study is to establish a complete set of simple methodologies based on currently available data. In Phase II, a sensitivity study will be carried out to identify the parameters whose uncertainty may result in large uncertainty in seismic risk, and for such parameters the methodology will be upgraded. The Phase I study has now almost been completed. In this report, outlines of the study and some of its outcomes are described
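A Level 1 seismic PSA of the kind described combines a site hazard curve with component fragility curves. A minimal numerical sketch of that convolution might look like the following; the lognormal fragility parameters and the power-law hazard curve are made up for illustration, not JAERI data:

```python
import math
from statistics import NormalDist

def core_damage_frequency(a_grid, hazard_exceedance, fragility):
    """Numerically convolve a seismic hazard curve with a fragility
    curve: CDF = integral of P(failure | a) * |dH/da| da, where H(a)
    is the annual exceedance frequency of peak acceleration a."""
    cdf = 0.0
    for i in range(len(a_grid) - 1):
        # Annual frequency of ground motions in this acceleration bin:
        dH = hazard_exceedance[i] - hazard_exceedance[i + 1]
        a_mid = 0.5 * (a_grid[i] + a_grid[i + 1])
        cdf += fragility(a_mid) * dH
    return cdf

def lognormal_fragility(a, median=0.6, beta=0.4):
    """Conditional failure probability at acceleration a (illustrative
    lognormal fragility: median capacity 0.6 g, log-std beta 0.4)."""
    return NormalDist().cdf(math.log(a / median) / beta)

a = [0.05 * k for k in range(1, 41)]           # 0.05 g .. 2.0 g
hazard = [1e-3 * (0.1 / x) ** 2 for x in a]    # annual exceedance freq.
cdf = core_damage_frequency(a, hazard, lognormal_fragility)
print(f"core damage frequency: {cdf:.2e} /yr")
```

In a real Level 1 study the fragility term would come from plant system models (fault/event trees over many components), not a single curve, but the hazard-fragility convolution step has this shape.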

  19. Cernavoda NPP risk - Based test and maintenance planning - Methodology development

    International Nuclear Information System (INIS)

    Georgescu, G.; Popa, P.; Petrescu, A.; Naum, M.; Gutu, M.

    1997-01-01

    The Cernavoda Power Plant started commercial operation in November 1996. During operation of a nuclear power plant, several mandatory tests and maintenance activities are performed on stand-by safety system components to ensure their availability in case of an accident. The basic purpose of such activities is the early detection of failures and degradation, and their timely correction. Because of the large number of such activities, maintaining the emphasis on plant safety and allocating resources becomes difficult. Probabilistic models and methodology can be used effectively to obtain the risk significance of these activities so that resources are directed to the most important areas. The proposed Research Contract activity is strongly connected with other safety-related areas under development. Since the Cernavoda Probabilistic Safety Evaluation Level 1 PSA Study (CPSE) has been performed and is now being revised to take into account as-built information, it is recommended to implement into the model the features necessary to support further PSA applications, especially test and maintenance optimization. Methods need to be developed to apply the PSA model, including risk information together with other needed information, to test and maintenance optimization. In parallel with the CPSE study update, a software interface for the PSA model is under development (Risk Monitor Software class); methods and models need to be developed so that it can be used for qualified monitoring of the efficiency of the test and maintenance strategy. Similarly, the data collection system needs to be appropriate for an ongoing implementation of a risk-based test and maintenance strategy. (author). 4 refs, 1 fig

  20. Development of a novel methodology for indoor emission source identification

    DEFF Research Database (Denmark)

    Han, K.H.; Zhang, J.S.; Knudsen, H.N.

    2011-01-01

    The objective of this study was to develop and evaluate a methodology to identify individual sources of emissions based on measurements of mixed air samples and the emission signatures of individual materials previously determined by Proton Transfer Reaction-Mass Spectrometry (PTR-MS), an on-line analytical device. The methodology, based on signal processing principles, was developed by employing the method of multiple regression least squares (MRLS) and a normalization technique. Samples of nine typical building materials were tested individually and in combination, including carpet, ceiling material... experiments and investigation are needed for cases where the relative emission rates among different compounds may change over a long-term period.
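The core idea — expressing a measured mixture as a least-squares combination of previously recorded emission signatures — can be sketched with ordinary least squares via the normal equations. The two signatures below are invented, not PTR-MS data, and the sketch omits the paper's normalization step:

```python
def solve_linear(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting
    (adequate for the small normal-equation systems used here)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

def unmix(signatures, mixture):
    """Least-squares source strengths: minimize ||S w - m||^2 by
    solving the normal equations (S^T S) w = S^T m.
    signatures: one list of compound intensities per material;
    mixture: the measured combined intensities."""
    m = len(mixture)
    StS = [[sum(si[k] * sj[k] for k in range(m)) for sj in signatures]
           for si in signatures]
    Stm = [sum(si[k] * mixture[k] for k in range(m)) for si in signatures]
    return solve_linear(StS, Stm)

# Hypothetical 4-compound signatures for two materials:
carpet  = [1.0, 0.2, 0.0, 0.5]
ceiling = [0.1, 0.9, 0.6, 0.0]
mixed   = [2.0 * c + 0.5 * d for c, d in zip(carpet, ceiling)]
print([round(w, 3) for w in unmix([carpet, ceiling], mixed)])  # → [2.0, 0.5]
```

Recovering the weights 2.0 and 0.5 corresponds to identifying how strongly each material contributes to the mixed air sample; real data would add noise and overlapping signatures, which is why the study evaluates the method's robustness.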

  1. Methodology for development of risk indicators for offshore platforms

    International Nuclear Information System (INIS)

    Oeien, K.; Sklet, S.

    1999-01-01

    This paper presents a generic methodology for the development of risk indicators for petroleum installations, and a specific set of risk indicators established for one offshore platform. The risk indicators are intended for controlling risk during platform operation. The methodology is purely risk-based, and the basis for the development of risk indicators is the platform-specific quantitative risk analysis (QRA). In order to identify high risk-contributing factors, platform personnel are asked to assess whether and by how much the risk influencing factors will change. A brief comparison of probabilistic safety assessment (PSA) for nuclear power plants and quantitative risk analysis (QRA) for petroleum platforms is also given. (au)
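One way such a QRA-based indicator can be rolled up — weighting each risk-influencing factor's assessed change by that factor's contribution to total platform risk — might be sketched as follows. The aggregation rule, the factor names, and all numbers are illustrative assumptions, not the paper's method:

```python
def risk_indicator(importances, factor_changes):
    """Relative change in total risk when risk-influencing factors
    change, weighting each factor by its QRA-derived importance.
    importances: fraction of total risk attributed to each factor
    factor_changes: assessed relative change of each factor, e.g.
    +0.10 means personnel judge the factor 10% worse than at the
    time the QRA was performed."""
    return sum(w * d for w, d in zip(importances, factor_changes))

# Hypothetical factors: maintenance backlog, leak frequency, manning.
importances = [0.40, 0.35, 0.25]   # from the platform-specific QRA
changes     = [+0.20, -0.05, 0.00] # from personnel assessments
print(f"relative risk change: {risk_indicator(importances, changes):+.1%}")
```

A positive value would flag that the assessed deterioration in the dominant factors outweighs improvements elsewhere, directing attention to the largest contributors, which is the stated purpose of the indicators.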

  2. Preparation of standard hair material and development of analytical methodology

    International Nuclear Information System (INIS)

    Gangadharan, S.; Ganapathi Iyer, S.; Ali, M.M.; Thantry, S.S.; Verma, R.; Arunachalam, J.; Walvekar, A.P.

    1992-01-01

    In 1976, Indian researchers suggested the possible use of hair as an indicator of environmental exposure and established, through a study of a countrywide student population and the general population of the metropolitan city of Bombay, that human scalp hair could indeed be an effective first-level monitor in a scheme of multilevel monitoring of environmental exposure to inorganic pollutants. It was in this context, and in view of the ready availability of large quantities of scalp hair subjected to minimal chemical treatment, that they proposed to participate in the preparation of a standard material of hair. It was also recognized that measurements of trace element concentrations at very low levels require cross-validation by different analytical techniques, even within the same laboratory. The programme of work carried out since the first meeting of the CRP has been aimed at these two objectives: the preparation of a standard material of hair, and the development of analytical methodologies for the determination of elements and species of interest. 1 refs., 3 tabs

  3. The Health Behaviour in School-aged Children (HBSC) study: methodological developments and current tensions

    DEFF Research Database (Denmark)

    Roberts, Chris; Freeman, John; Samdal, Oddrun

    2009-01-01

    OBJECTIVES: To describe the methodological development of the HBSC survey since its inception and explore methodological tensions that need to be addressed in the ongoing work on this and other large-scale cross-national surveys. METHODS: Using archival data and conversations with members of the network, we collaboratively analysed our joint understandings of the survey's methodology. RESULTS: We identified four tensions that are likely to be present in upcoming survey cycles: (1) maintaining quality standards against a background of rapid growth, (2) continuous improvement with limited financial... in working through such challenges renders it likely that HBSC can provide a model for other similar studies facing these tensions.

  4. Discontinuous Galerkin methodology for Large-Eddy Simulations of wind turbine airfoils

    DEFF Research Database (Denmark)

    Frére, A.; Sørensen, Niels N.; Hillewaert, K.

    2016-01-01

    This paper aims at evaluating the potential of the Discontinuous Galerkin (DG) methodology for Large-Eddy Simulation (LES) of wind turbine airfoils. The DG method has shown high accuracy, excellent scalability and the capacity to handle unstructured meshes. It is, however, not yet used in the wind energy sector. The present study aims at evaluating this methodology on an application which is relevant for that sector and focuses on blade section aerodynamics characterization. To be pertinent for large wind turbines, the simulations would need to be at low Mach numbers (M ≤ 0.3) where compressible... at low and high Reynolds numbers, and compares the results to state-of-the-art models used in industry, namely the panel method (XFOIL with boundary layer modeling) and Reynolds-Averaged Navier-Stokes (RANS). At low Reynolds number (Re = 6 × 10⁴), involving laminar boundary layer separation and transition...

  5. A Methodology of Estimation on Air Pollution and Its Health Effects in Large Japanese Cities

    OpenAIRE

    Hirota, Keiko; Shibuya, Satoshi; Sakamoto, Shogo; Kashima, Shigeru

    2012-01-01

    The correlation between air pollution and health effects in large Japanese cities presents a great challenge owing to the limited availability of data on exposure to pollution and health effects, and the uncertainty of mixed causes. A methodology for quantitative relationships (between emission volume and air quality, and between air quality and health effects) is analysed with a statistical method in this article, drawing on Japan's air pollution reduction policies from 1974 to 2007. ...

  6. NPA4K development system using object-oriented methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwang Seong; Hahn, Do Hee

    2000-11-01

    NPA4K consists of module programs with several components for various functions. Software components have to be developed systematically, according to partitioning criteria and a design method. In this paper, an overview of a typical object-oriented methodology, UML (Unified Modeling Language), the procedure for NPA4K program development, and the architecture for long-term development of NPA4K are introduced.

  7. Methodological developments and applications of neutron activation analysis

    International Nuclear Information System (INIS)

    Kucera, J.

    2007-01-01

    The paper reviews the author's experience acquired and achievements made in methodological developments of neutron activation analysis (NAA) of mostly biological materials. These involve epithermal neutron activation analysis, radiochemical neutron activation analysis using both single- and multi-element separation procedures, use of various counting modes, and the development and use of the self-verification principle. The role of NAA in the detection of analytical errors is discussed and examples of applications of the procedures developed are given. (author)

  8. NPA4K development system using object-oriented methodology

    International Nuclear Information System (INIS)

    Jeong, Kwang Seong; Hahn, Do Hee

    2000-11-01

    NPA4K consists of module programs with several components for various functions. Software components have to be developed systematically, according to partitioning criteria and a design method. In this paper, an overview of a typical object-oriented methodology, UML (Unified Modeling Language), the procedure for NPA4K program development, and the architecture for long-term development of NPA4K are introduced

  9. Methodological Grounds of Managing Innovation Development of Restaurants

    OpenAIRE

    Naidiuk V. S.

    2013-01-01

    The goal of the article lies in the identification and further development of the methodological grounds of managing the innovation development of restaurants. Based on a critical analysis of existing scientific views on the interpretation of the essence of the "managing innovation development of an enterprise" notion, the article clarifies this definition. As a result of the study, the article builds a cause-effect diagram for solving the problem of ensuring efficien...

  10. Development of Audit Calculation Methodology for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joosuk; Kim, Gwanyoung; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The interim criteria contain more stringent limits than the previous ones. For example, pellet-to-cladding mechanical interaction (PCMI) was introduced as a new failure criterion, and both short-term (e.g. fuel-to-coolant interaction, rod burst) and long-term (e.g. fuel rod ballooning, flow blockage) phenomena should be addressed for core coolability assurance. For dose calculations, transient-induced fission gas release additionally has to be accounted for. Traditionally, the approved RIA analysis methodologies for licensing applications were developed based on a conservative approach, but the newly introduced safety criteria tend to reduce the margins to the criteria. Licensees are therefore trying to improve the margins by utilizing a less conservative approach. To cope with this trend, a new audit calculation methodology needs to be developed. In this paper, the new methodology, which is currently under development at KINS, is introduced. For the development of an audit calculation methodology for RIA safety analysis based on the realistic evaluation approach, a preliminary calculation utilizing a best-estimate code has been performed on the initial core of APR1400. The main conclusions follow. - With the assumption of a single full-strength control rod ejection in the HZP condition, rod failure due to PCMI is not predicted. - Coolability can be assured in view of enthalpy and fuel melting. - However, rod failure due to DNBR is expected, and there is also a possibility of fuel failure at rated power conditions.

  11. Embracing Agile methodology during DevOps Developer Internship Program

    OpenAIRE

    Patwardhan, Amol; Kidd, Jon; Urena, Tiffany; Rajgopalan, Aishwarya

    2016-01-01

    The DevOps team adopted agile methodologies during the summer internship program as an initiative to move away from waterfall. The DevOps team implemented the Scrum software development strategy to create an internal data dictionary web application. This article reports on the transition process and lessons learned from the pilot program.

  12. Development of Management Methodology for Engineering Production Quality

    Science.gov (United States)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors of the paper propose four directions for developing a methodology for the quality management of engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organizational context taking stakeholders into account, the use of risk management, management of in-house knowledge, and assessment of enterprise activity according to effectiveness criteria

  13. Summary of FY-1978 consultation input for Scenario Methodology Development

    International Nuclear Information System (INIS)

    Scott, B.L.; Benson, G.L.; Craig, R.A.; Harwell, M.A.

    1979-11-01

    The Scenario Methodology Development task is concerned with evaluating the geologic system surrounding an underground repository and describing the phenomena (volcanic, seismic, meteorite, hydrologic, tectonic, climate, etc.) which could perturb the system and possibly cause loss of repository integrity. This document includes 14 individual papers. Separate abstracts were prepared for all 14 papers

  14. Reaching the grassroots: publishing methodologies for development organizations.

    Science.gov (United States)

    Zielinski, C

    1987-01-01

    There are 3 major distinctions between the traditional form of academic publishing and publishing for the grassroots as a development-organization activity, particularly in developing countries. Whereas academic publishing seeks to cover the target audience in its entirety, grassroots publishing can only cover a sampling. Academic publishing fulfills a need, while grassroots publishing demonstrates a need and a way to fulfill it. Finally, whereas academic publishing is largely a support activity aimed at facilitating the dissemination of information as a relatively minor part of a technical program, grassroots publishing is a more substantive activity aimed at producing a catalytic effect. Publication for the grassroots further calls for a different methodological approach. Given the constraint of numbers, publications aimed at the grassroots can only be examples or prototypes. The function of a prototype is to serve both as a basis for translation, adaptation, and replication and as a model end result. The approach to the use and promotion of prototypes differs according to the specific country situation. In countries with a heterogenous culture or several different languages, 2 items should be produced: a prototype of the complete text, which should be pretested and evaluated, and a prototype adaptation kit stripped of cultural and social biases. Promotion of the translation and replication of a publication can be achieved by involving officials at the various levels of government, interesting international and voluntary funding agencies, and stimulating indigenous printing capacities at the community level. The most important factors are the appropriateness of the publication in solving specific priority problems and the interest and involvement of national and state authorities at all stages of the project.

  15. European methodology for qualification of NDT as developed by ENIQ

    International Nuclear Information System (INIS)

    Champigny, F.; Sandberg, U.; Engl, G.; Crutzen, S.; Lemaitre, P.

    1997-01-01

    The European Network for Inspection Qualification (ENIQ) brings together most of the nuclear power plant operators in the European Union (and Switzerland). The main objective of ENIQ is to co-ordinate and manage, at the European level, expertise and resources for the qualification of NDE inspection systems, primarily for nuclear components. Within the framework of ENIQ, the European methodology for qualification of NDT has been developed. This paper presents the main principles of the European methodology, together with the main activities and organisation of ENIQ. (orig.)

  16. Service Innovation Methodologies II : How can new product development methodologies be applied to service innovation and new service development? : Report no 2 from the TIPVIS-project

    OpenAIRE

    Nysveen, Herbjørn; Pedersen, Per E.; Aas, Tor Helge

    2007-01-01

    This report presents various methodologies used in new product development and product innovation and discusses their relevance for service development and service innovation. The relevance of each methodology for service innovation is evaluated along several service-specific dimensions: intangibility, inseparability, heterogeneity, perishability, information intensity, and co-creation. The methodologies discussed are mainly collect...

  17. Risk-Informed Assessment Methodology Development and Application

    International Nuclear Information System (INIS)

    Sung Goo Chi; Seok Jeong Park; Chul Jin Choi; Ritterbusch, S.E.; Jacob, M.C.

    2002-01-01

    Westinghouse Electric Company (WEC) has been working with Korea Power Engineering Company (KOPEC) on a US Department of Energy (DOE) sponsored Nuclear Energy Research Initiative (NERI) project through a collaborative agreement established for the domestic NERI program. The project deals with Risk-Informed Assessment (RIA) of regulatory and design requirements of future nuclear power plants. An objective of the RIA project is to develop a risk-informed design process, which focuses on identifying and incorporating advanced features into future nuclear power plants (NPPs) that would meet risk goals in a cost-effective manner. The RIA design methodology is proposed to accomplish this objective. This paper discusses the development of this methodology and demonstrates its application in the design of plant systems for future NPPs. Advanced conceptual plant systems consisting of an advanced Emergency Core Cooling System (ECCS) and Emergency Feedwater System (EFWS) for a NPP were developed and the risk-informed design process was exercised to demonstrate the viability and feasibility of the RIA design methodology. Best estimate Loss-of-Coolant Accident (LOCA) analyses were performed to validate the PSA success criteria for the NPP. The results of the analyses show that the PSA success criteria can be met using the advanced conceptual systems and that the RIA design methodology is a viable and appropriate means of designing key features of risk-significant NPP systems. (authors)

  18. Developing knowledge management systems with an active expert methodology

    International Nuclear Information System (INIS)

    Sandahl, K.

    1992-01-01

    Knowledge management, understood as the ability to store, distribute and utilize human knowledge in an organization, is the subject of this dissertation. In particular, we have studied the design of methods and supporting software for this process. Detailed and systematic descriptions of the design and development processes of three case-study implementations of knowledge management software are provided. The outcome of the projects is explained in terms of an active expert development methodology, which is centered around support for a domain expert taking substantial responsibility for the design and maintenance of a knowledge management system in a given area of application. Based on the experiences from the case studies and the resulting methodology, an environment for automatically supporting knowledge management was designed in the KNOWLEDGE-LINKER research project. The vital part of this architecture is a knowledge acquisition tool, used directly by the experts in creating and maintaining a knowledge base. An elaborated version of the active expert development methodology was then formulated as the result of applying the KNOWLEDGE-LINKER approach in a fourth case study. This version of the methodology is also accounted for and evaluated together with the supporting KNOWLEDGE-LINKER architecture. (au)

  19. Selecting a software development methodology. [of digital flight control systems]

    Science.gov (United States)

    Jones, R. E.

    1981-01-01

    The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques: specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. For techniques that cannot be quantitatively assessed, qualitative judgments are expressed about their effectiveness and cost, and the reasons why quantitative assessments are not possible are documented.

  20. Territory development as economic and geographical activity (theory, methodology, practice)

    Directory of Open Access Journals (Sweden)

    Vitaliy Nikolaevich Lazhentsev

    2013-03-01

    The accents in the description of the theory and methodology of territory development are shifted from the distribution of national benefits to the formation of territorial natural-economic systems and the organization of economic-geographical activity. The author reveals the concept of «territory development» and reviews its place in the theory and methodology of human geography and regional economy. Individual directions of economic activity are considered, and an attempt is made to define the subject matter of five levels of «ideal» territorial-economic systems comprising objects of nature, society, population settlement, production, infrastructure and management. The author's interpretation of the sequence of territory-development mechanisms, which work according to a nested-doll principle (the mechanism of the economy, the economic management mechanism, the controlling mechanism of the economy), is presented, and indicators that reliably characterize territory development are shown.

  1. A pattern recognition methodology for evaluation of load profiles and typical days of large electricity customers

    International Nuclear Information System (INIS)

    Tsekouras, G.J.; Kotoulas, P.B.; Tsirekis, C.D.; Dialynas, E.N.; Hatziargyriou, N.D.

    2008-01-01

    This paper describes a pattern recognition methodology for the classification of the daily chronological load curves of each large electricity customer, in order to estimate its typical days and its representative daily load profiles. It is based on pattern recognition methods, such as k-means, self-organized maps (SOM), fuzzy k-means and hierarchical clustering, which are theoretically described and properly adapted. The parameters of each clustering method are selected by an optimization process, which is applied separately for each of six adequacy measures. The results can be used for the short-term and mid-term load forecasting of each consumer, for the choice of proper tariffs, and for feasibility studies of demand-side management programs. The methodology is applied analytically for one medium-voltage industrial customer and synoptically for a set of medium-voltage customers of the Greek power system. The results of the clustering methods are presented and discussed. (author)
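
    As a toy illustration of the clustering step, here is a minimal k-means sketch on synthetic daily curves; it is not the paper's adapted algorithms or Greek customer data, and the two "day types" are invented:

```python
import numpy as np

def kmeans(curves, init_idx, iters=50):
    """Plain k-means over daily load curves (rows = days, columns = hours)."""
    centers = curves[list(init_idx)].copy()
    labels = np.zeros(len(curves), dtype=int)
    for _ in range(iters):
        # assign each day to the nearest representative profile
        d2 = ((curves[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for j in range(len(centers)):          # update cluster centroids
            if np.any(labels == j):
                centers[j] = curves[labels == j].mean(axis=0)
    return labels, centers

# Synthetic customer: noon-peaked "working day" curves vs. flat "idle day" curves.
hours = np.arange(24)
peaked = 50 + 40 * np.exp(-0.5 * ((hours - 12) / 3.0) ** 2)
flat = np.full(24, 55.0)
rng = np.random.default_rng(1)
days = np.vstack([peaked + rng.normal(0, 2, 24) for _ in range(20)] +
                 [flat + rng.normal(0, 2, 24) for _ in range(8)])

# Seed one center with a day of each suspected type, then cluster.
labels, profiles = kmeans(days, init_idx=(0, 20))
```

    The rows of `profiles` play the role of the representative daily load profiles; an adequacy measure (e.g. the mean squared distance of days to their assigned profile) would then drive the parameter-selection step the paper describes.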

  2. Applying of component system development in object methodology, case study

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    To create computerized target software as a component system has been a very strong requirement of software development for the last 20 years. Architectural components are self-contained units that present not only partial and overall system behavior but also cooperate with each other through their interfaces. Among other things, components allow flexible modification of the processes whose behavior underlies component behavior, without changing the life of the component system. On the other hand, the component system makes it possible, at design time, to create numerous new connections between components and thus modified system behaviors. All this enables company management to perform, at design time, the required behavioral changes of processes in accordance with the requirements of changing production and markets. Software development in general, referred to as SDP (Software Development Process), follows two directions. The first, called CBD (Component-Based Development), is dedicated to the development of component-based systems, CBS (Component-Based System); the second targets the development of software under the influence of SOA (Service-Oriented Architecture). Both directions are equipped with their own development methodologies. The subject of this paper is only the first direction and the application of component-based system development in its object-oriented methodologies. The requirement of today is to carry out the development of component-based systems within developed object-oriented methodologies precisely as a dominant style. In some of the known methodologies, however, this development is not completely transparent and is not even recognized as dominant. In some cases, it is corrected by special meta-integration models of component system development into an object methodology. This paper presents a case study

  3. Projected large flood event sensitivity to projection selection and temporal downscaling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Raff, D. [U.S. Dept. of the Interior, Bureau of Reclamation, Denver, Colorado (United States)

    2008-07-01

    Large flood events, which influence regulatory guidelines as well as dam safety decisions, are likely to be affected by climate change. This talk evaluates the use of climate projections, downscaled and run through a rainfall-runoff model, and their influence on large flood events. The climate spatial downscaling is performed statistically, and a re-sampling and scaling methodology is used to temporally downscale from monthly to daily signals. The signals are run through a National Weather Service operational rainfall-runoff model to produce 6-hour flows. The flows are evaluated for changes in large events over the look-ahead horizons 2011 - 2040, 2041 - 2070, and 2071 - 2099. The sensitivity of the results is evaluated with respect to projection selection criteria and re-sampling and scaling criteria for the Boise River in Idaho near Lucky Peak Dam. (author)
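
    The scaling half of the re-sampling-and-scaling step can be illustrated with a toy sketch (hypothetical numbers, not the Reclamation implementation): a resampled historical month of daily values is rescaled by a single factor so that its total matches the projected monthly signal while the day-to-day pattern is preserved.

```python
def scale_to_monthly_total(daily_hist, projected_total):
    """Rescale a historical month of daily values so the month sums to the
    projected monthly total; the relative day-to-day pattern is unchanged."""
    factor = projected_total / sum(daily_hist)
    return [d * factor for d in daily_hist]

hist = [2.0, 4.0, 6.0, 8.0]                   # toy 4-day "month", total 20
daily = scale_to_monthly_total(hist, 30.0)    # projected monthly total is 30
```

    The rescaled daily series is what would then feed the operational rainfall-runoff model to produce the 6-hour flows.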

  4. Projected large flood event sensitivity to projection selection and temporal downscaling methodology

    International Nuclear Information System (INIS)

    Raff, D.

    2008-01-01

    Large flood events, which influence regulatory guidelines as well as dam safety decisions, are likely to be affected by climate change. This talk evaluates the use of climate projections, downscaled and run through a rainfall-runoff model, and their influence on large flood events. The climate spatial downscaling is performed statistically, and a re-sampling and scaling methodology is used to temporally downscale from monthly to daily signals. The signals are run through a National Weather Service operational rainfall-runoff model to produce 6-hour flows. The flows are evaluated for changes in large events over the look-ahead horizons 2011 - 2040, 2041 - 2070, and 2071 - 2099. The sensitivity of the results is evaluated with respect to projection selection criteria and re-sampling and scaling criteria for the Boise River in Idaho near Lucky Peak Dam. (author)

  5. Development of a comprehensive management site evaluation methodology

    International Nuclear Information System (INIS)

    Rodgers, J.C.; Onishi, Y.

    1981-01-01

    The Nuclear Regulatory Commission is in the process of preparing regulations that will define the necessary conditions for adequate disposal of low-level waste (LLW) by confinement in an LLW disposal facility. These proposed regulations form the context in which the motivation for the joint Los Alamos National Laboratory Battelle Pacific Northwest Laboratory program to develop a site-specific, LLW site evaluation methodology is discussed. The overall effort is divided into three development areas: land-use evaluation, environmental transport modelling, and long term scenario development including long-range climatology projections. At the present time four steps are envisioned in the application of the methodology to a site: site land use suitability assessment, land use-ecosystem interaction, contaminant transport simulation, and sensitivity analysis. Each of these steps is discussed in the paper. 12 refs

  6. A Methodology for Estimating Large-Customer Demand Response Market Potential

    Energy Technology Data Exchange (ETDEWEB)

    Goldman, Charles; Hopper, Nicole; Bharvirkar, Ranjit; Neenan, Bernie; Cappers, Peter

    2007-08-01

    Demand response (DR) is increasingly recognized as an essential ingredient to well-functioning electricity markets. DR market potential studies can answer questions about the amount of DR available in a given area and from which market segments. Several recent DR market potential studies have been conducted, most adapting techniques used to estimate energy-efficiency (EE) potential. In this scoping study, we: reviewed and categorized seven recent DR market potential studies; recommended a methodology for estimating DR market potential for large, non-residential utility customers that uses price elasticities to account for behavior and prices; compiled participation rates and elasticity values from six DR options offered to large customers in recent years, and demonstrated our recommended methodology with large customer market potential scenarios at an illustrative Northeastern utility. We observe that EE and DR have several important differences that argue for an elasticity approach for large-customer DR options that rely on customer-initiated response to prices, rather than the engineering approaches typical of EE potential studies. Base-case estimates suggest that offering DR options to large, non-residential customers results in 1-3% reductions in their class peak demand in response to prices or incentive payments of $500/MWh. Participation rates (i.e., enrollment in voluntary DR programs or acceptance of default hourly pricing) have the greatest influence on DR impacts of all factors studied, yet are the least well understood. Elasticity refinements to reflect the impact of enabling technologies and response at high prices provide more accurate market potential estimates, particularly when arc elasticities (rather than substitution elasticities) are estimated.
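
    The elasticity arithmetic at the core of the recommended approach can be sketched as follows (the customer, prices, and elasticity value are illustrative assumptions, not the study's estimates):

```python
def arc_elasticity(q0, q1, p0, p1):
    """Arc (midpoint) elasticity: % change in quantity per % change in price."""
    dq = (q1 - q0) / ((q0 + q1) / 2.0)
    dp = (p1 - p0) / ((p0 + p1) / 2.0)
    return dq / dp

def demand_at_price(q0, p0, p1, elasticity):
    """Constant-elasticity demand response: Q1 = Q0 * (P1 / P0) ** e."""
    return q0 * (p1 / p0) ** elasticity

# Hypothetical large customer: 100 MW baseline at $50/MWh, $500/MWh event price.
q1 = demand_at_price(100.0, 50.0, 500.0, elasticity=-0.01)
reduction_pct = 100.0 * (1.0 - q1 / 100.0)   # lands in the cited 1-3% range
```

    Under this sketch, even a small elasticity produces a noticeable peak reduction at event prices, which is why participation rates and elasticity refinements dominate the market potential estimates.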

  7. Vedic division methodology for high-speed very large scale integration applications

    Directory of Open Access Journals (Sweden)

    Prabir Saha

    2014-02-01

    Transistor-level implementation of a division methodology using ancient Vedic mathematics is reported in this Letter. The ‘Dhvajanka (on top of the flag)’ formula was adopted from Vedic mathematics to implement this type of divider for practical very-large-scale integration applications. The division methodology was implemented using half of the divisor bits instead of the actual divisor, subtraction and a little multiplication. Propagation delay and dynamic power consumption of the divider circuitry were minimised significantly by stage reduction through the Vedic division methodology. The functionality of the division algorithm was checked, and performance parameters such as propagation delay and dynamic power consumption were calculated through Spice Spectre with 90 nm complementary metal oxide semiconductor technology. The propagation delay of the resulting (32 ÷ 16)-bit divider circuitry was only ∼300 ns, and it consumed ∼32.5 mW power for a layout area of 17.39 mm^2. By combining Boolean arithmetic with ancient Vedic mathematics, a substantial number of iterations was eliminated, yielding reductions of ∼47, ∼38 and 34% in delay and ∼34, ∼21 and ∼18% in power compared with the most widely used architectures (e.g. digit-recurrence, Newton–Raphson, Goldschmidt).
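
    For contrast with one of the architectures the Letter benchmarks against, the Newton–Raphson divider family computes a reciprocal by the iteration x_{k+1} = x_k(2 − d·x_k). A software sketch of that baseline (not the Dhvajanka circuit itself) is:

```python
def nr_divide(n, d, iters=5):
    """Divide n/d via Newton-Raphson reciprocal iteration; hardware units
    normalize d into [0.5, 1) first, which we require here too."""
    assert 0.5 <= d < 1.0, "normalize the divisor into [0.5, 1) first"
    x = 48.0 / 17.0 - (32.0 / 17.0) * d   # classic linear initial estimate
    for _ in range(iters):
        x = x * (2.0 - d * x)             # quadratic convergence: bits double
    return n * x

q = nr_divide(0.6, 0.75)                  # 0.6 / 0.75 = 0.8
```

    Each iteration costs two multiplications and a subtraction; the Letter's delay and power comparison is about how many such hardware stages each architecture needs.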

  8. Development of a methodology of evaluation of financial stability of commercial banks

    Directory of Open Access Journals (Sweden)

    Brauers Willem Karel M.

    2014-01-01

    The evaluation of the financial stability of commercial banks, a field motivated by the persistent recurrence of financial crises, has attracted researchers' interest for over a century. The span of prevailing methodologies stretches from over-simplified risk-return approaches to ones comprising a large number of economic variables at the micro- and/or macro-economic level. The methodologies of rating agencies and the current methodologies reviewed and applied by the ECB are not intended to reduce information asymmetry in the market of commercial banks. The paper shows that the Lithuanian financial system is bank-based, with household deposits as its primary source, and that its stability depends primarily on the behavior of depositors. A methodology for evaluating commercial banks with features that decrease information asymmetry in the market of commercial banks is developed by comparing different MCDA methods.
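
    A minimal sketch of one family of MCDA methods compared in such work, a MOORA-style ratio system: each criterion is normalized by its Euclidean norm, and alternatives are ranked by the weighted sum of beneficial minus cost criteria. The banks and figures below are invented for illustration.

```python
import math

def moora_scores(matrix, beneficial):
    """matrix[i][j]: value of criterion j for bank i;
    beneficial[j]: True if higher is better (e.g. capital ratio),
    False if lower is better (e.g. non-performing loan share)."""
    ncrit = len(matrix[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncrit)]
    scores = []
    for row in matrix:
        s = sum((row[j] / norms[j]) * (1 if beneficial[j] else -1)
                for j in range(ncrit))
        scores.append(s)
    return scores

#                capital%  liquidity%  NPL%
banks = [[18.0, 40.0, 3.0],    # bank A
         [12.0, 35.0, 8.0],    # bank B
         [15.0, 55.0, 5.0]]    # bank C
scores = moora_scores(banks, beneficial=[True, True, False])
ranking = sorted(range(3), key=lambda i: -scores[i])
```

    Comparing several such methods on the same criteria matrix, as the paper does, checks whether the resulting bank rankings are robust to the choice of MCDA technique.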

  9. Development of the affiliate system based on modern development methodologies

    OpenAIRE

    Fajmut, Aljaž

    2016-01-01

    Affiliate partnership is a popular and effective method of online marketing through affiliate partners. The thesis describes the development of a product that allows an affiliate system to be integrated easily into an existing platform (e-commerce or a service). This kind of functionality opens up growth opportunities for the business. The system is designed so that it requires a minimal number of changes for implementation into an existing application. The development of the product is ...

  10. Development Risk Methodology for Whole Systems Trade Analysis

    Science.gov (United States)

    2016-08-01

    (WSTAT). In the early stages of the V&V for development risk, it was discovered that the original risk rating and methodology did not actually... WSTA has opened trade space exploration by allowing the tool to evaluate trillions of potential system configurations and then return a handful of

  11. Development of methodology and direction of practice administrative neuromarketing

    OpenAIRE

    Glushchenko V.; Glushchenko I.

    2018-01-01

    The subject of this work is the development of the methodology and practical aspects of applying administrative neuromarketing; the subject of the article is administrative neuromarketing in the organization. The article investigates the concept and content of administrative neuromarketing, its philosophy, culture, functions, tasks and principles, together with the technique of the logical analysis of the possibility of applying methods of administrative neuromarketing for incre...

  12. Latest developments on safety analysis methodologies at the Juzbado plant

    International Nuclear Information System (INIS)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A.

    2010-01-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, namely the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies: regulator requirements, operational experience and, of course, the strong commitment of ENUSA to maintain the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct radiation surveys and the investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated early in 2009 to assess the actual operating conditions of all systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points of these three methodologies as well as an outline of the results obtained so far. (authors)

  13. Coal resources available for development; a methodology and pilot study

    Science.gov (United States)

    Eggleston, Jane R.; Carter, M. Devereux; Cobb, James C.

    1990-01-01

    Coal accounts for a major portion of our Nation's energy supply in projections for the future. A demonstrated reserve base of more than 475 billion short tons, as the Department of Energy currently estimates, indicates that, on the basis of today's rate of consumption, the United States has enough coal to meet projected energy needs for almost 200 years. However, the traditional procedures used for estimating the demonstrated reserve base do not account for many environmental and technological restrictions placed on coal mining. A new methodology has been developed to determine the quantity of coal that might actually be available for mining under current and foreseeable conditions. This methodology is unique in its approach, because it applies restrictions to the coal resource before it is mined. Previous methodologies incorporated restrictions into the recovery factor (a percentage), which was then globally applied to the reserve (minable coal) tonnage to derive a recoverable coal tonnage. None of the previous methodologies define the restrictions and their area and amount of impact specifically. Because these restrictions and their impacts are defined in this new methodology, it is possible to achieve more accurate and specific assessments of available resources. This methodology has been tested in a cooperative project between the U.S. Geological Survey and the Kentucky Geological Survey on the Matewan 7.5-minute quadrangle in eastern Kentucky. Pertinent geologic, mining, land-use, and technological data were collected, assimilated, and plotted. The National Coal Resources Data System was used as the repository for data, and its geographic information system software was applied to these data to eliminate restricted coal and quantify that which is available for mining. This methodology does not consider recovery factors or the economic factors that would be considered by a company before mining. Results of the pilot study indicate that, of the estimated

  14. Status of Methodology Development for the Evaluation of Proliferation Resistance

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Ko, Won Il; Lee, Jung Won

    2010-01-01

    Given increasing energy demand and the greenhouse effect, nuclear energy is now the most feasible option; even oil-producing countries now plan to build nuclear power plants for energy production. If nuclear systems are to make a major and sustainable contribution to the world's energy supply, future nuclear energy systems must meet specific requirements. One of these is to satisfy the proliferation resistance condition in the entire nuclear system. Therefore, from the beginning of future nuclear energy system development, it is important to consider proliferation resistance to prevent the diversion of nuclear materials; the misuse of a nuclear system must be considered as well. Moreover, in the import and export of nuclear systems, the evaluation of the proliferation resistance of the nuclear system becomes a key factor. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) program initiated by the IAEA proposed proliferation resistance (PR) as a key component of a future innovative nuclear system (INS), together with sustainability, economics, safety of nuclear installations and waste management. The technical goals for Generation IV (Gen IV) nuclear energy systems (NESs) likewise highlight Proliferation Resistance and Physical Protection (PR and PP), sustainability, safety, reliability and economics. Based on the INPRO and Gen IV studies, a methodology for the evaluation of proliferation resistance has been developed at KAERI: a systematic procedure was set up and the indicators for the procedure were decided. The methodology covers evaluation from the total nuclear system down to individual processes. In this study, the detailed procedure for the evaluation of proliferation resistance and the newly proposed additional indicators are described, and several conditions are proposed to increase the proliferation resistance of future nuclear systems.
The assessment of PR

  15. A Methodology for Measuring Microplastic Transport in Large or Medium Rivers

    Directory of Open Access Journals (Sweden)

    Marcel Liedermann

    2018-04-01

    Plastic waste as a persistent contaminant of our environment is a matter of increasing concern due to the largely unknown long-term effects on biota. Although freshwater systems are known to be the transport paths of plastic debris to the ocean, most research has focused on marine environments. In recent years, freshwater studies have advanced rapidly, but they rarely address the spatial distribution of plastic debris in the water column. A methodology for measuring microplastic transport at various depths that is applicable to medium and large rivers is needed. We present a new methodology offering the possibility of measuring microplastic transport at different depths of verticals that are distributed within a profile. The net-based device is robust and can be applied at high flow velocities and discharges. Nets with three mesh sizes (41 µm, 250 µm, and 500 µm) are exposed at three different depths of the water column. The methodology was tested in the Austrian Danube River, showing a high heterogeneity of microplastic concentrations within one cross section. Due to turbulent mixing, the different densities of the polymers, aggregation, and the growth of biofilms, plastic transport is not confined to the surface layer of a river and must be examined within the whole water column, as for suspended sediments. These results imply that multipoint measurements are required for obtaining the spatial distribution of plastic concentration and are therefore a prerequisite for calculating the passing transport. The analysis of filtration efficiency and side-by-side measurements with different mesh sizes showed that the 500 µm nets led to optimal results.
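
    The multipoint transport calculation the authors argue for can be sketched as follows: each net sample yields a local concentration, and the cross-section plastic load is the sum of concentration × flow velocity × represented area over the sampling points. All numbers are illustrative, not Danube data.

```python
def point_concentration(particles, net_area_m2, velocity_ms, duration_s):
    """Particles per m^3 filtered at one vertical/depth."""
    filtered_volume = net_area_m2 * velocity_ms * duration_s
    return particles / filtered_volume

def cross_section_transport(samples):
    """samples: list of (concentration [1/m^3], velocity [m/s], area [m^2]);
    returns particles passing the whole section per second."""
    return sum(c * v * a for c, v, a in samples)

# Three depths at one vertical: 0.25 m^2 net mouth, 10-minute exposures.
c_surface = point_concentration(120, 0.25, 1.2, 600)
c_mid = point_concentration(60, 0.25, 1.0, 600)
c_bottom = point_concentration(30, 0.25, 0.8, 600)
load = cross_section_transport([(c_surface, 1.2, 40.0),
                                (c_mid, 1.0, 40.0),
                                (c_bottom, 0.8, 40.0)])
```

    A surface-only measurement in this toy case would miss the mid- and near-bed contributions entirely, which is the paper's argument for sampling the whole water column.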

  16. Simulating large-scale pedestrian movement using CA and event driven model: Methodology and case study

    Science.gov (United States)

    Li, Jun; Fu, Siyao; He, Haibo; Jia, Hongfei; Li, Yanzhong; Guo, Yi

    2015-11-01

    Large-scale regional evacuation is an important part of national security emergency response plans, and the emergency evacuation of large commercial shopping areas, as typical service systems, is one of the hot research topics. A systematic methodology based on Cellular Automata with a Dynamic Floor Field and an event-driven model has been proposed, and the methodology has been examined within the context of a case study involving evacuation within a commercial shopping mall. The event-driven model is adopted to simulate pedestrian movement patterns, and the simulation process is divided into normal operation and emergency evacuation. The model is composed of four layers: an environment layer, a customer layer, a clerk layer and a trajectory layer. For simulating the movement routes of pedestrians, the model takes into account customers' purchase intentions and pedestrian density. Based on the evacuation model combining Cellular Automata with a Dynamic Floor Field and the event-driven model, the behavioral characteristics of customers and clerks can be reflected in both normal and emergency situations. The distribution of individual evacuation times as a function of initial positions and the dynamics of the evacuation process are studied. Our results indicate that the evacuation model using the combination of Cellular Automata with a Dynamic Floor Field and event-driven scheduling can be used to simulate the evacuation of pedestrian flows in indoor areas with complicated surroundings and to investigate the layout of a shopping mall.
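
    A minimal single-pedestrian sketch of the static part of a floor-field CA (not the authors' four-layer model): a distance-to-exit field is precomputed over the walkable grid, and each pedestrian steps to the free neighbouring cell with the lowest field value.

```python
from collections import deque

def floor_field(grid, exit_cell):
    """BFS distance-to-exit over walkable cells (0 = free, 1 = wall)."""
    rows, cols = len(grid), len(grid[0])
    field = {exit_cell: 0}
    q = deque([exit_cell])
    while q:
        r, c = q.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in field:
                field[(nr, nc)] = field[(r, c)] + 1
                q.append((nr, nc))
    return field

def step(ped, field, occupied):
    """Move one pedestrian greedily down the field; stay put if blocked."""
    r, c = ped
    options = [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]
    free = [p for p in options if p in field and p not in occupied]
    return min(free + [ped], key=lambda p: field[p])

grid = [[0, 0, 0],
        [0, 1, 0],       # one obstacle in the middle of a 3x3 room
        [0, 0, 0]]
field = floor_field(grid, exit_cell=(0, 2))
ped, occupied = (2, 0), set()
for _ in range(10):      # pedestrian walks around the obstacle to the exit
    ped = step(ped, field, occupied)
```

    In the full model, a dynamic component added to this static field and the event-driven scheduler would modulate these greedy moves with crowding and individual agendas.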

  17. An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits

    Science.gov (United States)

    Corliss, Walter F., II

    1989-03-01

    The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that was previously designed was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented, and the chip was fabricated by MOSIS. This fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full-custom VLSI chip designed at NPS to be tested with the NPS digital analysis system, the Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.

  18. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects, such as the construction of roads, railways and other civil engineering (water)works, are tendered differently today than a decade ago. Traditional workflow requested quotes from construction companies for construction works where the works to be

  19. Energy indicators for sustainable development: Guidelines and methodologies

    International Nuclear Information System (INIS)

    2005-04-01

    This publication is the product of an international initiative to define a set of Energy Indicators for Sustainable Development (EISD) and corresponding methodologies and guidelines. The successful completion of this work is the result of an intensive effort led by the International Atomic Energy Agency (IAEA) in cooperation with the United Nations Department of Economic and Social Affairs (UNDESA), the International Energy Agency (IEA), Eurostat and the European Environment Agency (EEA). The thematic framework, guidelines, methodology sheets and energy indicators set out in this publication reflect the expertise of these various agencies, recognized worldwide as leaders in energy and environmental statistics and analysis. While each agency has an active indicator programme, one goal of this joint endeavour has been to provide users with a consensus by leading experts on definitions, guidelines and methodologies for the development and worldwide use of a single set of energy indicators. No set of energy indicators can be final and definitive. To be useful, indicators must evolve over time to fit country-specific conditions, priorities and capabilities. The purpose of this publication is to present one set of EISD for consideration and use, particularly at the national level, and to serve as a starting point in the development of a more comprehensive and universally accepted set of energy indicators relevant to sustainable development. It is hoped that countries will use the EISD to assess their energy systems and to track their progress towards nationally defined sustainable development goals and objectives. It is also hoped that users of the information presented in this publication will contribute to refinements of energy indicators for sustainable development by adding their own unique perspectives to what is presented herein

  20. Energy indicators for sustainable development: Guidelines and methodologies

    International Nuclear Information System (INIS)

    2008-01-01

    This publication is the product of an international initiative to define a set of Energy Indicators for Sustainable Development (EISD) and corresponding methodologies and guidelines. The successful completion of this work is the result of an intensive effort led by the International Atomic Energy Agency (IAEA) in cooperation with the United Nations Department of Economic and Social Affairs (UNDESA), the International Energy Agency (IEA), Eurostat and the European Environment Agency (EEA). The thematic framework, guidelines, methodology sheets and energy indicators set out in this publication reflect the expertise of these various agencies, recognized worldwide as leaders in energy and environmental statistics and analysis. While each agency has an active indicator programme, one goal of this joint endeavour has been to provide users with a consensus by leading experts on definitions, guidelines and methodologies for the development and worldwide use of a single set of energy indicators. No set of energy indicators can be final and definitive. To be useful, indicators must evolve over time to fit country-specific conditions, priorities and capabilities. The purpose of this publication is to present one set of EISD for consideration and use, particularly at the national level, and to serve as a starting point in the development of a more comprehensive and universally accepted set of energy indicators relevant to sustainable development. It is hoped that countries will use the EISD to assess their energy systems and to track their progress towards nationally defined sustainable development goals and objectives. It is also hoped that users of the information presented in this publication will contribute to refinements of energy indicators for sustainable development by adding their own unique perspectives to what is presented herein

  1. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

The charter for adversarial delay is to hinder access to critical resources through the use of physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard for uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample sizes, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness at lower cost.
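The abstract does not give the algorithm itself; the core idea, propagating per-task time uncertainty into a total-delay distribution by Monte Carlo rather than summing point estimates, might be sketched as follows (the task path and lognormal parameters are hypothetical, not from the paper):

```python
import random
import statistics

def sample_path_delay(tasks, rng):
    """Sample one total delay for an adversary path by drawing each
    task time from its own lognormal uncertainty model."""
    return sum(rng.lognormvariate(mu, sigma) for mu, sigma in tasks)

def delay_distribution(tasks, n=10_000, seed=42):
    """Monte Carlo estimate of the delay-time distribution for a path."""
    rng = random.Random(seed)
    samples = sorted(sample_path_delay(tasks, rng) for _ in range(n))
    return {
        "mean": statistics.fmean(samples),
        "p05": samples[int(0.05 * n)],  # near-worst case: a fast adversary
        "p50": samples[n // 2],
    }

# Hypothetical 3-task path: (mu, sigma) of log task time in minutes.
path = [(1.0, 0.4), (0.5, 0.3), (1.5, 0.5)]
result = delay_distribution(path)
```

The 5th-percentile delay plays the role of the traditional "worst-case" figure, but now sits inside a full distribution that an analyst can trade off against cost.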

  2. METHODOLOGICAL APPROACHES FOR MODELING THE RURAL SETTLEMENT DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Gorbenkova Elena Vladimirovna

    2017-10-01

Subject: the paper describes the research results on validation of a rural settlement development model. The basic methods and approaches for solving the problem of assessing the efficiency of urban and rural settlement development are considered. Research objectives: determination of methodological approaches to modeling and creating a model for the development of rural settlements. Materials and methods: domestic and foreign experience in modeling the territorial development of urban and rural settlements and settlement structures was generalized. The motivation for using the Pentagon-model for solving similar problems was demonstrated. Based on a systematic analysis of existing development models of urban and rural settlements, as well as the authors' method for assessing the level of agro-town development, the systems and factors that are necessary for the sustainable development of a rural settlement are identified. Results: we created a rural development model which consists of five major systems that include critical factors essential for achieving the sustainable development of a settlement system: the ecological system, economic system, administrative system, anthropogenic (physical) system and social system (supra-structure). The methodological approaches for creating an evaluation model of rural settlement development were revealed; the basic motivating factors that provide interrelations of systems were determined; and the critical factors for each subsystem were identified and substantiated. Such an approach is justified by the composition of tasks for territorial planning at the local and state administration levels. The feasibility of applying the basic Pentagon-model, which has been successfully used for solving analogous problems of sustainable development, was shown. Conclusions: the resulting model can be used for identifying and substantiating the critical factors for rural sustainable development and also become the basis of

  3. Development of a new methodology for quantifying nuclear safety culture

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of). Dept. of Nuclear Engineering

    2017-01-15

The present study developed a Safety Culture Impact Assessment Model (SCIAM), which consists of a safety culture assessment methodology and a safety culture impact quantification methodology. The SCIAM uses a safety culture impact index (SCII) to monitor the status of the safety culture of NPPs (nuclear power plants) periodically, and it uses relative core damage frequency (RCDF) to present the impact of safety culture on the safety of NPPs. As a result of applying the SCIAM to the reference plant (Kori 3), a standard for the healthy safety culture of the reference plant is suggested. SCIAM might contribute to improving the safety of NPPs by monitoring the status of safety culture periodically and presenting a standard of healthy safety culture.

  4. Development of a new methodology for quantifying nuclear safety culture

    International Nuclear Information System (INIS)

    Han, Kiyoon; Jae, Moosung

    2017-01-01

The present study developed a Safety Culture Impact Assessment Model (SCIAM), which consists of a safety culture assessment methodology and a safety culture impact quantification methodology. The SCIAM uses a safety culture impact index (SCII) to monitor the status of the safety culture of NPPs (nuclear power plants) periodically, and it uses relative core damage frequency (RCDF) to present the impact of safety culture on the safety of NPPs. As a result of applying the SCIAM to the reference plant (Kori 3), a standard for the healthy safety culture of the reference plant is suggested. SCIAM might contribute to improving the safety of NPPs by monitoring the status of safety culture periodically and presenting a standard of healthy safety culture.
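The records above quantify safety-culture impact as a relative core damage frequency. The papers' underlying PSA model is not reproduced here, but the headline quantity, RCDF as a culture-adjusted core damage frequency relative to a baseline, can be sketched in a few lines (all numbers are hypothetical illustrations, not values from the study):

```python
def relative_cdf(baseline_cdf, adjusted_cdf):
    """RCDF: culture-adjusted core damage frequency relative to baseline.
    Values above 1.0 indicate safety culture degrading plant safety."""
    if baseline_cdf <= 0:
        raise ValueError("baseline CDF must be positive")
    return adjusted_cdf / baseline_cdf

# Hypothetical example: a weak safety culture raises basic-event
# probabilities, pushing CDF from 2.0e-6/yr to 2.6e-6/yr.
rcdf = relative_cdf(2.0e-6, 2.6e-6)  # ~ 1.3
```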

  5. Development of enterprise architecture management methodology for teaching purposes

    Directory of Open Access Journals (Sweden)

    Dmitry V. Kudryavtsev

    2017-01-01

Enterprise architecture is considered both as an object of management, providing business with a general view of the enterprise and the mutual alignment of its parts into a single whole, and as the discipline that arose around this object. The architectural approach to modeling and designing the enterprise originally arose in the field of information technology and was used to design information systems and technical infrastructure, as well as to formalize business requirements. Since the early 2000s, enterprise architecture has increasingly been used in organizational development and business transformation projects, especially when information technologies are involved. Enterprise architecture allows describing, analyzing and designing the company from the point of view of its structure, functioning and goal setting (motivation). In the context of this approach, the enterprise is viewed as a system of services, processes, goals and performance indicators, organizational units, information systems, data, technical facilities, etc. Enterprise architecture implements the idea of a systematic approach to managing and changing organizations in the digital economy, where business strongly depends on information technologies. This increases the relevance of the suggested approach at the present time, when companies need to create and successfully implement a digital business strategy. Teaching enterprise architecture in higher educational institutions is a difficult task due to the interdisciplinary nature of this subject, its generalized character and close connection with practical experience. In addition, modern enterprise architecture management methodologies are complex for students and contain many details that are relevant only for individual situations. The paper proposes a simplified methodology for enterprise architecture management which, on the one hand, will be comprehensible to students and, on the other hand, will allow students to apply

  6. Effective diagnosis of Alzheimer’s disease by means of large margin-based methodology

    Directory of Open Access Journals (Sweden)

    Chaves Rosa

    2012-07-01

Background Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in the Alzheimer's Disease (AD) diagnosis. However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. Methods A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to be located within a predefined brain activation mask. In order to address the small sample-size problem, the dimension of the feature space was further reduced by: Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the two latter also analysed with a LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and Energy-based metrics were compared. Results Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: (i) linear transformation of the PLS- or PCA-reduced data, (ii) feature reduction technique, and (iii) classifier (with Euclidean, Mahalanobis or Energy-based metrics). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when an NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. Conclusions All the proposed methods turned out to be a valid solution for the presented problem. One of the advances is the robustness of the LMNN algorithm, which not only provides higher separation rate between
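The reported accuracy, sensitivity and specificity come from k-fold cross-validation. Independent of the LMNN/PLS machinery, those three figures reduce to confusion-matrix counts; a minimal sketch of that reduction (labels are a hypothetical toy example, with 1 = AD, 0 = normal):

```python
def diagnostic_metrics(y_true, y_pred):
    """Accuracy, sensitivity (recall on positives) and specificity
    (recall on negatives) from paired label lists."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Toy 8-scan example: 3 true positives, 3 true negatives,
# 1 false negative, 1 false positive.
truth = [1, 1, 1, 1, 0, 0, 0, 0]
pred  = [1, 1, 1, 0, 0, 0, 0, 1]
m = diagnostic_metrics(truth, pred)
```

In k-fold cross-validation these counts are accumulated over the held-out folds before the three ratios are taken.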

  7. Establishing a methodology to develop complex sociotechnical systems

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2013-02-01

Many modern management systems, such as military command and control, tend to be large and highly interconnected sociotechnical systems operating in a complex environment. Successful development, assessment and implementation of these systems...

  8. Development of Testing Methodologies for the Mechanical Properties of MEMS

    Science.gov (United States)

    Ekwaro-Osire, Stephen

    2003-01-01

This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS), as well as to investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength, as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. This will be a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.
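For a two-parameter Weibull material, the size effect predicts that the characteristic strength scales with stressed volume as (V1/V2)^(1/m), where m is the Weibull modulus: larger specimens contain more flaws and are statistically weaker. A sketch using only the standard library (the modulus, scale and volume ratio are hypothetical illustration values):

```python
import random

def mean_strength(scale, modulus, n=20_000, seed=1):
    """Mean of n sampled strengths from a two-parameter Weibull
    distribution (scale = characteristic strength)."""
    rng = random.Random(seed)
    return sum(rng.weibullvariate(scale, modulus) for _ in range(n)) / n

m = 10.0            # hypothetical Weibull modulus
scale_small = 1.0   # characteristic strength at unit stressed volume
volume_ratio = 8.0  # the larger membrane has 8x the stressed volume
scale_large = scale_small * volume_ratio ** (-1.0 / m)

# Size effect: the larger specimen population is statistically weaker.
weaker = mean_strength(scale_large, m) < mean_strength(scale_small, m)
```

A size-effect test of the kind the abstract proposes would compare measured strength ratios across membrane sizes against this predicted scaling.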

  9. METHODOLOGICAL BASES OF PUBLIC ADMINISTRATION OF PUBLIC DEVELOPMENT IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Kyrylo Ohdanskyi

    2016-11-01

The author examines the theoretical bases of the dynamics of community development. According to classic canons, a dynamic process at any level of the management hierarchy can be presented as a complex of changes in its ecological, economic and social components. To date, national policy in the field of community development does not take most theoretical works into account, which testifies that the mechanism for its effective adjustment has not yet been created in Ukraine. In this connection, the author stresses the necessity of applying effective approaches to the government control of community development in modern Ukrainian realities. As the subject of research, the author chose the analysis of the process of community development and the methodological bases for choosing options for managing this process. The system approach is chosen as the research methodology. The aim is the analysis of theoretical bases and the development of new approaches to the government administration of community development. The author divides the process of community development into social, economic and ecological components. From this follows the objective necessity of developing new conceptual approaches to the elaboration of tools for adjusting community development. To solve this task, the author suggests using the category of "dynamics", analyses different interpretations of this term, and offers his own interpretation in the context of community development. The research confirms that it is methodologically possible to form blocks of quantitative and qualitative factors from specific information of an ecological, economic and social character. The author's research confirms that it is methodologically

  10. Methodological Grounds of Managing Innovation Development of Restaurants

    Directory of Open Access Journals (Sweden)

    Naidiuk V. S.

    2013-12-01

The goal of the article lies in the identification and further development of methodological grounds for managing the innovation development of restaurants. Based on a critical analysis of existing scientific views on the interpretation of the essence of the notion of "managing the innovation development of an enterprise", the article clarifies this definition. As a result of the study, the article builds a cause-effect diagram for solving the problem of ensuring efficient management of the innovation development of a restaurant. The article develops a conceptual scheme for the development and realisation of a strategy of innovation development in a restaurant. It experimentally confirms the hypothesis of a very strong feedback relationship between resistance to innovation changes and the share of qualified personnel capable of permanent development (learning) and generation of new ideas in restaurants, and builds a model of the dependency between them. Prospects for further studies in this direction include the development of methodical approaches to identifying the level of innovation potential and assessing the efficiency of managing the innovation development of different (by type, class, size, etc.) restaurants. The obtained data could also be used for the development of new, or improvement of existing, tools of strategic management of innovation development at the micro-level.

  11. A New Methodology of Design and Development of Serious Games

    Directory of Open Access Journals (Sweden)

    André F. S. Barbosa

    2014-01-01

The development of a serious game requires perfect knowledge of the learning domain to obtain the desired results. But it is also true that this may not be enough to develop a successful serious game. First of all, the player has to feel that he is playing a game in which learning is only a consequence of the playing actions. Otherwise, the game is viewed as boring rather than as a fun and engaging activity. For example, the player can catch some items in the scenario and then separate them according to their type (i.e., recycle them). Thus, the main action for the player is catching the items in the scenario, where the recycle action is a second action, viewed as a consequence of the first. Sometimes the game design relies on a detailed approach based on the ideas of the developers, because some educational contents are difficult to integrate into the games while maintaining the fun factor in the first place. In this paper we propose a new methodology of design and development of serious games that facilitates the integration of educational contents in the games. Furthermore, we present a serious game, called “Clean World”, created using this new methodology.

  12. Development of design and analysis methodology for composite bolted joints

    Science.gov (United States)

    Grant, Peter; Sawicki, Adam

    1991-05-01

This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, to determine joint characteristics which affect joint bearing and bypass strength, and to develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function, due to the variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled-hole and unnotched test data.
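The finding that bearing-bypass interaction cannot be captured by one continuous function suggests a piecewise allowable envelope, with one branch per failure mode (bypass-dominated net-section failure vs. bearing-dominated failure). A minimal sketch of such an envelope check; the branch equations, cutoff points and allowable values are hypothetical, not the paper's data:

```python
def bypass_allowable(bearing_stress):
    """Piecewise bearing-bypass envelope: a bypass-dominated plateau
    followed by a bearing-dominated linear cutoff.
    All stresses in MPa; numbers are hypothetical."""
    bypass_cap = 400.0   # net-section (bypass-dominated) failure plateau
    bearing_cap = 900.0  # pure-bearing failure stress
    knee = 300.0         # transition between the two failure modes
    if bearing_stress >= bearing_cap:
        return 0.0
    if bearing_stress < knee:            # bypass-dominated regime
        return bypass_cap
    # bearing-dominated regime: linear interaction down to zero bypass
    return bypass_cap * (bearing_cap - bearing_stress) / (bearing_cap - knee)

def joint_ok(bearing_stress, bypass_stress):
    """True if the applied stress pair lies inside the envelope."""
    return bypass_stress <= bypass_allowable(bearing_stress)
```

The discontinuity in slope at the knee is exactly why a single smooth fit across all bearing-bypass ratios fails: each branch belongs to a different failure mode.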

  13. Design Methodology of Process Layout considering Various Equipment Types for Large scale Pyro processing Facility

    International Nuclear Information System (INIS)

    Yu, Seung Nam; Lee, Jong Kwang; Lee, Hyo Jik

    2016-01-01

At present, each item of process equipment required for integrated processing is being examined, based on experience acquired during the Pyroprocess Integrated Inactive Demonstration Facility (PRIDE) project, and considering the requirements and desired performance enhancement of KAPF as a new facility beyond PRIDE. Essentially, KAPF will be required to handle hazardous materials such as spent nuclear fuel, which must be processed in an isolated and shielded area separate from the operator location. Moreover, an inert-gas atmosphere must be maintained because of the radiation and deliquescence of the materials. KAPF must also achieve the goal of significantly increased yearly production beyond that of the previous facility; therefore, several parts of the production line must be automated. This article presents the method considered for the conceptual design of both the production line and the overall layout of the KAPF process equipment. This study has proposed a design methodology that can be utilized as a preliminary step for the design of a hot-cell-type, large-scale facility in which the various types of processing equipment operated by the remote handling system are integrated. The proposed methodology applies to part of the overall design procedure and contains various weaknesses. However, if the designer is required to maximize the efficiency of the installed material-handling system while considering operation restrictions and maintenance conditions, this kind of design process can accommodate the essential components that must be employed simultaneously in a general hot-cell system.

  14. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund

    2013-01-01

In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate measurement technique, and then accurate calculation of the radiation pattern and gain of the entire antenna, including support and satellite structure, with appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation...

  15. Theoretical and methodological foundations of sustainable development of Geosystems

    Science.gov (United States)

    Mandryk, O. M.; Arkhypova, L. M.; Pukish, A. V.; Zelmanovych, A.; Yakovlyuk, Kh

    2017-05-01

The theoretical and methodological foundations of sustainable development of geosystems were further evolved. The new scientific direction of "constructive hydroecology", the science that studies the hydrosphere from the standpoint of natural and technogenic safety based on a geosystem approach, was grounded. A structural separation of constructive hydroecology based on objective, subjective, and application characteristics was set out. The main object of study of the new scientific field is the hydroecological environment, understood as the part of the hydrosphere belonging to a multicomponent dynamic system that is influenced by engineering and economic human activities and, in turn, determines this activity to some extent.

  16. System study methodology development and potential utilization for fusion

    International Nuclear Information System (INIS)

    Djerassi, H.; Rouillard, J.; Leger, D.; Sarto, S.; Zappellini, G.; Gambi, G.

    1989-01-01

The objective of this new methodology is to combine systemics with heuristics for engineering applications. The system method considers as a whole a set of dynamically interacting elements organized for tasks; heuristics tries to describe the rules to apply in scientific research. This methodology is a powerful tool for evaluating options: compared with conventional analytical methods, a higher number of parameters can be taken into account, and the possible options can be compared to a higher quality standard. The system method takes into account interacting data and random relationships by means of simulation modelling. Thus, a dynamical approach can be deduced and a sensitivity analysis can be performed for a very high number of options and basic data. The method can be focused on a specific objective, such as a fusion reactor safety analysis, while taking into account other major constraints such as the economic environment. The sophisticated architecture of a fusion reactor includes a large number of interacting systems. The novel character of the fusion domain and the wide spectrum of possible options strongly increase the advantages of a system study, as a complete safety analysis can be defined before starting the design. (orig.)

  17. Development of a heat exchanger root-cause analysis methodology

    International Nuclear Information System (INIS)

    Jarrel, D.B.

    1989-01-01

The objective of this work is to determine a generic methodology for accurately identifying the root cause of component failure. Root-cause determinations are an everyday challenge to plant personnel, but they are handled with widely differing degrees of success owing to differences in approach, level of diagnostic expertise, and documentation. The criterion for success is simple: if the root cause of the failure has truly been determined and corrected, the same causal failure relationship will not be demonstrated again in the future. The approach to root-cause analysis (RCA) element definition was to first selectively choose and constrain a functionally significant component (in this case a component cooling water to service water heat exchanger) that has demonstrated prevalent failures. A root-cause-of-failure analysis was then performed by a systems engineer on a large number of actual failure scenarios. The analytical process used by the engineer was documented and evaluated to abstract the logic model used to arrive at the root cause. For the case of the heat exchanger, the actual root-cause diagnostic approach is described. A generic methodology for determining the root cause of component failure is demonstrable for this general heat exchanger sample.

  18. Development of proliferation resistance assessment methodology based on international standard

    International Nuclear Information System (INIS)

    Ko, W. I.; Chang, H. L.; Lee, Y. D.; Lee, J. W.; Park, J. H.; Kim, Y. I.; Ryu, J. S.; Ko, H. S.; Lee, K. W.

    2012-04-01

Nonproliferation is one of the main requirements to be satisfied by the advanced future nuclear energy systems developed in the Generation IV and INPRO studies. Methodologies to evaluate proliferation resistance (PR) have been developed since the 1980s; however, systematic evaluation approaches date only from around 2000. Domestically, a study to develop a national method to evaluate the PR of advanced future nuclear energy systems started in 2007 as one of the long-term nuclear R and D subjects, in order to promote export and the international credibility and transparency of national nuclear energy systems and the nuclear fuel cycle technology development program. In the first phase (2007-2010), development and improvement of intrinsic evaluation parameters for PR, quantification of evaluation parameters, development of evaluation models, and development of permissible ranges of evaluation parameters were carried out. In the second phase (2010-2012), a generic principle for evaluating PR was established, and technical guidelines, a nuclear material diversion pathway analysis method, and a method to integrate evaluation parameters were developed; these were applied to 5 alternative nuclear fuel cycles to estimate their applicability and objectivity. In addition, measures to enhance the PR of advanced future nuclear energy systems and technical guidelines for PR assessment using intrinsic PR evaluation parameters were developed, together with regulatory requirements to secure the nonproliferation of nuclear energy systems from the early design stage through operation to decommissioning, which will support the export of the newly developed advanced future nuclear energy system.

  19. Use of New Methodologies for Students Assessment in Large Groups in Engineering Education

    Directory of Open Access Journals (Sweden)

    B. Tormos

    2014-03-01

In this paper, a student evaluation methodology which applies the concept of continuous assessment proposed by Bologna is presented for new degrees in higher education. An important part of the student's final grade is based on the performance of several individual works throughout the semester. The paper shows the correction system used, which is based on a spreadsheet with macros and a template in which the student provides the solution of each task. The use of this correction system, together with the available e-learning platform, allows the teachers to perform automatic task evaluations compatible with courses with a large number of students. The paper also describes the different solutions adopted to avoid plagiarism and to ensure that the final grade reflects, as closely as possible, the knowledge acquired by the students.
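The abstract describes spreadsheet-macro grading against a solution template. Its core mechanic, comparing each student's numeric answers to reference values within a tolerance and aggregating a score, can be sketched as follows (task names, values and the 1% tolerance are hypothetical, not from the paper):

```python
def grade_submission(answers, solution, rel_tol=0.01):
    """Score one student: the fraction of tasks whose numeric answer
    falls within a relative tolerance of the reference solution."""
    correct = 0
    for task, ref in solution.items():
        got = answers.get(task)
        if got is not None and abs(got - ref) <= rel_tol * abs(ref):
            correct += 1
    return correct / len(solution)

# Hypothetical template and one student's submitted answers.
solution = {"task1": 12.5, "task2": 0.98, "task3": 440.0}
student = {"task1": 12.48, "task2": 1.10, "task3": 441.0}
score = grade_submission(student, solution)  # task1 and task3 within 1%
```

Running this per row of a class roster reproduces the batch-correction behaviour of the spreadsheet macros the paper relies on.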

  20. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world have introduced new methodological principles into their spatial development models. These methodologies and spatial planning guidelines are focused not only on the controlled development of high-rise buildings, but on the spatial modelling of the whole city, by defining main development criteria and estimating possible consequences. Vilnius is no exception; however, the re-establishment of the independence of Lithuania triggered an uncontrolled urbanization process, so most of the city's development regulations emerged as a consequence of the unmanaged legalization of investors' expectations. The importance of a consistent urban fabric, as well as the conservation and representation of the city's most important objects, gained attention only when an actual threat emerged of overshadowing them with new architecture, along with unmanaged urbanization in the city center and urban sprawl in suburbia caused by land-use projects. Current Vilnius spatial planning documents clearly define the urban structure and key development principles; however, the definitions are relatively abstract, causing uniform building coverage requirements for territories with distinct qualities and simplified planar designs which do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modeling methods, their individual parts, principles, the criteria for quality assessment and their applicability in Vilnius. The text contains an outline of possible building coverage regulations and impact assessment criteria for new development. The article contains a compendium of requirements for high-quality spatial planning and building design.

  1. The SIMRAND methodology - Simulation of Research and Development Projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
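SIMRAND's core loop, simulating candidate task sets under uncertainty and ranking them by an expected-value criterion, can be sketched with a toy Monte Carlo. The two alternatives and their uniform cost intervals below are hypothetical stand-ins for the silicon-solar-cell network the abstract mentions:

```python
import random

def expected_cost(tasks, n=5_000, seed=0):
    """Monte Carlo estimate of the expected total cost of one candidate
    task set, where each task's cost is uniform over (low, high)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        total += sum(rng.uniform(lo, hi) for lo, hi in tasks)
    return total / n

# Two hypothetical task sets for reaching the same project goal.
alternatives = {
    "vendor_cells":  [(4.0, 6.0), (1.0, 3.0)],   # expected cost ~ 7
    "inhouse_cells": [(2.0, 8.0), (0.5, 1.5)],   # expected cost ~ 6
}
best = min(alternatives, key=lambda a: expected_cost(alternatives[a]))
```

The real methodology adds a reduction phase (pruning dominated alternatives) and an evaluation phase on top of this simulation step.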

  2. TECHNOLOGY FOR DEVELOPMENT OF ELECTRONIC TEXTBOOK ON HANDICRAFTS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Iryna V. Androshchuk

    2017-10-01

The main approaches to defining the concept of an electronic textbook have been analyzed in the article. The main advantages of electronic textbooks in the context of future teachers' training have been outlined: interactivity, feedback provision, and the availability of navigation and a search engine. The author has presented and characterized the main stages in the technology of development of an electronic textbook on Handicraft and Technology Training Methodology: determination of its role and significance in the process of mastering the discipline; justification of its structure; and an outline of the stages of its development in accordance with the defined structure. The characteristic feature of the developed electronic textbook is the availability of a macro- and microstructure. The macrostructure is viewed as the sequence of components of the electronic textbook that is manifested in its content; the microstructure is considered to be the internal pattern of each component of the macrostructure.

  3. Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.

    Science.gov (United States)

    Demchak, Barry; Krüger, Ingolf

    2012-07-01

    The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus enabling policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost effectively evolve such applications over a long lifetime.
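
    The core PDD idea of injecting policy decision points into existing workflows at runtime can be sketched generically; all names below are invented for illustration and are not the PDD framework's actual API:

```python
# A workflow encodes *where* a decision is made; the policy behind that
# decision point can be injected or replaced at runtime, without redeploying.

_policies = {}  # decision-point name -> policy function

def policy(name):
    """Register a policy function for a named decision point."""
    def register(fn):
        _policies[name] = fn
        return fn
    return register

def decide(name, context, default=True):
    """Evaluate whatever policy is currently injected at this decision point."""
    fn = _policies.get(name)
    return fn(context) if fn else default

def fetch_record(user, record_id):
    # The decision point is encoded once, at development time.
    if not decide("access", {"user": user, "record": record_id}):
        raise PermissionError(user)
    return {"id": record_id}

# Later, a stakeholder group injects an access-control policy:
@policy("access")
def only_staff(ctx):
    return ctx["user"].endswith("@staff.example.org")

print(fetch_record("alice@staff.example.org", 7))  # allowed
```

    Because the workflow consults the registry each time, oblivious stakeholder groups can each contribute policies after deployment, which is the latency reduction PDD targets.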

  4. Developing new methodology for nuclear power plants vulnerability assessment

    International Nuclear Information System (INIS)

    Kostadinov, Venceslav

    2011-01-01

    A new methodology and solution methods for vulnerability assessment can help the overall national energy sector to identify and understand the terrorist threats to, and vulnerabilities of, its critical infrastructure. Moreover, the adopted methodology could help national regulators and agencies to develop and implement vulnerability awareness and education programs for their critical assets, to enhance the security and safe operation of the entire energy infrastructure. The new methods can also assist nuclear power plants to develop, validate, and disseminate assessments and surveys of new efficient countermeasures. Consequently, a concise description of the newly developed quantitative method and the adapted methodology for nuclear regulatory vulnerability assessment of nuclear power plants is presented.

  5. Enabling Psychiatrists to be Mobile Phone App Developers: Insights Into App Development Methodologies.

    Science.gov (United States)

    Zhang, Melvyn Wb; Tsang, Tammy; Cheow, Enquan; Ho, Cyrus Sh; Yeong, Ng Beng; Ho, Roger Cm

    2014-11-11

    The use of mobile phones, and specifically smartphones, has become more and more prevalent in the last decade. The latest mobile phones are equipped with comprehensive features that can be used in health care, such as providing rapid access to up-to-date evidence-based information, provision of instant communications, and improvements in organization. The estimated number of health care apps for mobile phones is increasing tremendously, but previous research has highlighted a lack of critical appraisal of new apps, largely due to the shortage of clinicians with the technical knowledge to create an evidence-based app. We discuss two freely available methodologies for developing Web-based mobile phone apps: a website builder and an app builder. With these, users can not only program a Web-based app but also integrate multimedia features within it, without needing to know any programming language. We present techniques for creating a mobile Web-based app using two well-established online mobile app websites. We illustrate how to integrate text-based content within the app, as well as interactive videos and rich site summary (RSS) feed information. We also briefly discuss how to integrate a simple questionnaire survey into the mobile-based app. A questionnaire survey was administered to students to collate their perceptions of the app. These two methodologies have been used to convert an online electronic psychiatry textbook into two Web-based mobile phone apps for medical students rotating through psychiatry in Singapore. Since the inception of our mobile Web-based app, a total of 21,991 unique users have used the mobile app and online portal provided by WordPress, and another 717 users have accessed the app via a Web-based link. The user perspective survey results (n=185) showed that a high proportion of students valued the textbook and objective structured clinical

  6. Application of code scaling applicability and uncertainty methodology to the large break loss of coolant

    International Nuclear Information System (INIS)

    Young, M.Y.; Bajorek, S.M.; Nissley, M.E.

    1998-01-01

    In the late 1980s, after completion of an extensive research program, the United States Nuclear Regulatory Commission (USNRC) amended its regulations (10CFR50.46) to allow the use of realistic physical models to analyze the loss of coolant accident (LOCA) in light water reactors. Prior to this time, the evaluation of this accident was subject to a prescriptive set of rules (Appendix K of the regulations) requiring conservative models and assumptions to be applied simultaneously, leading to very pessimistic estimates of the impact of this accident on the reactor core. The rule change therefore promised significant benefits to owners of power reactors, allowing them to increase output. In response to the rule change, a method called code scaling, applicability and uncertainty (CSAU) was developed to apply realistic methods while properly taking into account data uncertainty, uncertainty in physical modeling, and plant variability. The method was claimed to be structured, traceable, and practical, but was met with some criticism when first demonstrated. In 1996, the USNRC approved a methodology, based on CSAU, developed by a group led by Westinghouse. The lessons learned in this application of CSAU will be summarized. Some of the issues raised concerning the validity and completeness of the CSAU methodology will also be discussed. (orig.)
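
    Realistic LOCA methodologies descended from CSAU commonly quantify the combined effect of input uncertainties with nonparametric order statistics (Wilks' formula). The sketch below is a generic illustration of that statistical step with made-up numbers, not the approved Westinghouse methodology:

```python
import random

def wilks_sample_size(beta=0.95, gamma=0.95):
    """Smallest N such that the maximum of N code runs is a one-sided
    beta-content, gamma-confidence tolerance bound: 1 - beta**N >= gamma."""
    n = 1
    while 1.0 - beta ** n < gamma:
        n += 1
    return n

n = wilks_sample_size()  # the classic 95/95 result: 59 runs
rng = random.Random(1)
# Hypothetical code runs: peak clad temperature (K) from sampled inputs.
pct_runs = [rng.gauss(1000.0, 50.0) for _ in range(n)]
bound_95_95 = max(pct_runs)  # upper 95/95 tolerance bound on PCT
print(n, round(bound_95_95, 1))
```

    The appeal of this approach is that the required number of code runs depends only on the coverage and confidence targets, not on the number of uncertain inputs.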

  7. Development of Cost Estimation Methodology of Decommissioning for PWR

    International Nuclear Information System (INIS)

    Lee, Sang Il; Yoo, Yeon Jae; Lim, Yong Kyu; Chang, Hyeon Sik; Song, Geun Ho

    2013-01-01

    The permanent closure of a nuclear power plant must be conducted under strict laws and with thorough planning, including cost and schedule estimation, because the plant is heavily contaminated with radioactivity. In Korea, there are two types of nuclear power plant: the pressurized light water reactor (PWR) and the pressurized heavy water reactor (PHWR), known as the CANDU reactor. About 50% of the operating nuclear power plants in Korea are PWRs originally designed by CE (Combustion Engineering). There is experience with the decommissioning of Westinghouse-type PWRs, but little with CE-type PWRs. Therefore, the purpose of this paper is to develop a cost estimation methodology of decommissioning, and to evaluate the associated technical level, for application to CE-type PWRs based on systems engineering. Through the study, the following conclusions are obtained: · Based on systems engineering, the decommissioning work can be classified into Set, Subset, Task, Subtask and Work cost units. · The Set and Task structures are grouped into 29 Sets and 15 Tasks, respectively. · The final result gives the cost and project schedule for project control and risk management. · The present results are preliminary and should be refined and improved based on modeling and cost data reflecting available technology and current costs, such as labor and waste data
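
    A Set → Subset → Task cost-unit classification of this kind lends itself to a simple recursive roll-up. The sketch below uses an invented, simplified structure with illustrative cost figures (arbitrary units), not the paper's actual decommissioning data:

```python
def rollup(node):
    """Recursively sum unit costs over a Set -> Subset -> Task tree."""
    if isinstance(node, (int, float)):  # leaf: a Task/Work cost unit
        return float(node)
    return sum(rollup(child) for child in node.values())

# Hypothetical fragment of a decommissioning work breakdown structure.
decommissioning = {
    "Set: reactor vessel removal": {
        "Subset: segmentation": {"Task: cutting": 4.2, "Task: handling": 1.1},
        "Subset: packaging": {"Task: containers": 0.8},
    },
    "Set: site restoration": {"Subset: survey": {"Task: final survey": 0.5}},
}

print(round(rollup(decommissioning), 2))  # total project cost estimate
```

    Scheduling and risk figures can be rolled up the same way once each Task carries duration and uncertainty attributes in addition to cost.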

  8. Large-Scale Demand Driven Design of a Customized Bus Network: A Methodological Framework and Beijing Case Study

    Directory of Open Access Journals (Sweden)

    Jihui Ma

    2017-01-01

    Full Text Available In recent years, an innovative public transportation (PT mode known as the customized bus (CB has been proposed and implemented in many cities in China to efficiently and effectively shift private car users to PT to alleviate traffic congestion and traffic-related environmental pollution. The route network design activity plays an important role in the CB operation planning process because it serves as the basis for other operation planning activities, for example, timetable development, vehicle scheduling, and crew scheduling. In this paper, according to the demand characteristics and operational purpose, a methodological framework that includes the elements of large-scale travel demand data processing and analysis, hierarchical clustering-based route origin-destination (OD region division, route OD region pairing, and a route selection model is proposed for CB network design. Considering the operating cost and social benefits, a route selection model is proposed and a branch-and-bound-based solution method is developed. In addition, a computer-aided program is developed to analyze a real-world Beijing CB route network design problem. The results of the case study demonstrate that the current CB network of Beijing can be significantly improved, thus demonstrating the effectiveness of the proposed methodology.
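
    The hierarchical-clustering step used to form route origin-destination regions can be sketched generically; the greedy agglomerative routine and demand points below are illustrative, not the authors' algorithm or the Beijing data:

```python
import math

def cluster(points, max_dist):
    """Greedy agglomerative clustering of demand points (e.g. trip origins):
    repeatedly merge the two closest clusters until the closest pair of
    cluster centroids is farther apart than max_dist."""
    clusters = [[p] for p in points]

    def centroid(c):
        return (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))

    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = math.dist(centroid(clusters[i]), centroid(clusters[j]))
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        if d > max_dist:
            break
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# Two hypothetical pockets of customized-bus travel demand (km coordinates).
origins = [(0, 0), (0.1, 0.2), (0.2, 0.1), (5, 5), (5.1, 5.2)]
regions = cluster(origins, max_dist=1.0)
print(len(regions), "candidate route origin regions")
```

    The resulting origin regions would then be paired with destination regions and fed to a route selection model such as the branch-and-bound formulation described above.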

  9. Future development of large superconducting generators

    International Nuclear Information System (INIS)

    Singh, S.K.; Mole, C.J.

    1989-01-01

    Large superconducting generators are being developed worldwide. The use of superconductors to reduce the electrical power dissipation in power equipment has been a technological possibility ever since the discovery of superconductivity, even though their use in power equipment remained an impractical dream for a long time. However, scientific and technological progress in superconductivity and cryogenics has brought this dream much closer to reality. Results obtained so far establish the technical feasibility of these machines. Analytical developments have been providing a sound basis for the design of superconducting machines and results of these design studies have shown improvements in power density of up to a factor of 10 higher than the power density for conventional machines. This paper describes the recently completed USA programs, the current foreign and USA programs, and then proposes a USA development program to maintain leadership in the field

  10. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States); Denning, Richard [The Ohio State Univ., Columbus, OH (United States); Catalyurek, Umit [The Ohio State Univ., Columbus, OH (United States); Unwin, Stephen [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  11. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    International Nuclear Information System (INIS)

    Aldemir, Tunc; Denning, Richard; Catalyurek, Umit; Unwin, Stephen

    2015-01-01

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.
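
    The treatment of statistically dependent aging failures described above can be caricatured with a small Monte Carlo in which an epistemic (uncertain) degradation scale is sampled once per trial and shared by two redundant components, making their failures dependent. All distributions and parameters below are invented for illustration, not the report's models:

```python
import random

def failure_time(rng, scale, shape):
    """Sample a Weibull failure time; shape > 1 models wear-out (aging)."""
    return rng.weibullvariate(scale, shape)

def p_system_failure(mission_time=40.0, n_trials=20000, seed=0):
    """Nested uncertainty loop: the epistemic scale parameter is drawn per
    trial (lognormal, median ~90 yr), then two component failure times are
    drawn from the same degraded scale (aleatory)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        scale = rng.lognormvariate(4.5, 0.3)  # shared epistemic degradation
        t1 = failure_time(rng, scale, 2.5)
        t2 = failure_time(rng, scale, 2.5)
        if t1 < mission_time and t2 < mission_time:  # both fail -> system fails
            failures += 1
    return failures / n_trials

p = p_system_failure()
print(f"P(system failure by 40 yr) ~ {p:.4f}")
```

    Sharing the sampled scale across components is what introduces the dependence: trials with an unfavorable degradation rate tend to fail both components, which a component-independent PRA model would miss.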

  12. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations are those involving such a variety of scales and such physical complexity that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, analysis of the uncertainty included in simulations is needed to reveal the sensitivity to uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to arrive at a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)': computing processes with deductive and inductive approaches, modeled on human reasoning processes. Our idea is to execute deductive and inductive simulations corresponding to the deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  13. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    The KALIMER Database is an advanced database for the integrated management of Liquid Metal Reactor design technology development using Web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation system, and Reserved Documents. The Results Database holds the research results from phase II of the Liquid Metal Reactor design technology development within the mid- and long-term nuclear R and D program. IOC is a linkage control system between subprojects for sharing and integrating the research results for KALIMER. The 3D CAD database provides a schematic design overview of KALIMER. The Team Cooperation system informs team members of research cooperation and meetings. Finally, the KALIMER Reserved Documents module was developed to manage collected data and other documents produced since the project's accomplishment. This report describes the hardware and software features and the database design methodology for KALIMER

  14. Large Instrument Development for Radio Astronomy

    Science.gov (United States)

    Fisher, J. Richard; Warnick, Karl F.; Jeffs, Brian D.; Norrod, Roger D.; Lockman, Felix J.; Cordes, James M.; Giovanelli, Riccardo

    2009-03-01

    This white paper offers cautionary observations about the planning and development of new, large radio astronomy instruments. Complexity is a strong cost driver so every effort should be made to assign differing science requirements to different instruments and probably different sites. The appeal of shared resources is generally not realized in practice and can often be counterproductive. Instrument optimization is much more difficult with longer lists of requirements, and the development process is longer and less efficient. More complex instruments are necessarily further behind the technology state of the art because of longer development times. Including technology R&D in the construction phase of projects is a growing trend that leads to higher risks, cost overruns, schedule delays, and project de-scoping. There are no technology breakthroughs just over the horizon that will suddenly bring down the cost of collecting area. Advances come largely through careful attention to detail in the adoption of new technology provided by industry and the commercial market. Radio astronomy instrumentation has a very bright future, but a vigorous long-term R&D program not tied directly to specific projects needs to be restored, fostered, and preserved.

  15. CHARACTERISTICS OF RESEARCH METHODOLOGY DEVELOPMENT IN SPECIAL EDUCATION AND REHABILITATION

    Directory of Open Access Journals (Sweden)

    Natasha ANGELOSKA-GALEVSKA

    2004-12-01

    Full Text Available The aim of the text is to point out the developmental tendencies in the research methodology of special education and rehabilitation worldwide and in our country, and to emphasize the importance of the methodological training of students in special education and rehabilitation at the Faculty of Philosophy in Skopje. The scientific knowledge achieved through research is the fundamental precondition for the development of special education and rehabilitation theory and practice. The results of scientific work sometimes cause small, insignificant changes, but at times they make radical ones. Thanks to scientific research and knowledge, certain prejudices have been rejected. For example, in the sixth decade of the last century there was a strong prejudice that mentally retarded children should be segregated from society as aggressive and unfriendly, or that deaf children should not learn sign language because they would not be motivated to learn lip-reading and would adapt with difficulty. Piaget and his colleagues from the Geneva institute were the pioneers in researching this field, and they advanced the view that handicapped children are not handicapped in every domain and have potentials that can be developed and improved by systematic and organized work. It is important to initiate further research in the field of special education and rehabilitation, as well as critical analysis of the research already carried out. Further development of scientific research in special education and rehabilitation should be the basis for education policy on people with disabilities and for the development of institutional and non-institutional treatment of this population.

  16. Enhanced Methodologies to Enumerate Persons Experiencing Homelessness in a Large Urban Area.

    Science.gov (United States)

    Troisi, Catherine L; D'Andrea, Ritalinda; Grier, Gary; Williams, Stephen

    2015-10-01

    Homelessness is a public health problem, and persons experiencing homelessness are a vulnerable population. Estimates of the number of persons experiencing homelessness inform funding allocations and services planning and directly determine the ability of a community to intervene effectively in homelessness. The point-in-time (PIT) count presents a logistical problem in large urban areas, particularly those covering a vast geographical area. Working together, academia, local government, and community organizations improved the methodology for the count. Specific enhancements include use of incident command system (ICS), increased number of staging areas/teams, specialized outreach and Special Weapons and Tactics teams, and day-after surveying to collect demographic information. This collaboration and enhanced methodology resulted in a more accurate estimate of the number of persons experiencing homelessness and allowed comparison of findings for 4 years. While initial results showed an increase due to improved counting, the number of persons experiencing homelessness counted for the subsequent years showed significant decrease during the same time period as a "housing first" campaign was implemented. The collaboration also built capacity in each sector: The health department used ICS as a training opportunity; the academics enhanced their community health efforts; the service sector was taught and implemented more rigorous quantitative methods; and the community was exposed to public health as a pragmatic and effective discipline. Improvements made to increase the reliability of the PIT count can be adapted for use in other jurisdictions, leading to improved counts and better evaluation of progress in ending homelessness. © The Author(s) 2015.

  17. System study methodology. Development and potential utilization for fusion

    International Nuclear Information System (INIS)

    Djerassi, H.; Rouillard, J.; Leger, D.; Zappellini, G.; Gambi, G.

    1988-01-01

    The objective of this new methodology is to combine systemics with heuristics for engineering applications. The system method considers as a whole a set of dynamically interacting elements organized for tasks. Heuristics tries to make explicit the rules to be applied in scientific research. This methodology is a powerful tool for evaluating the options at hand: compared with conventional analytical methods, a higher number of parameters can be taken into account, with a higher quality standard in comparing the possible options. The system method takes interacting data and random relationships into account by means of simulation modelling. Thus a dynamical approach can be deduced, and a sensitivity analysis can be performed for a very high number of options and basic data. The main points considered in the system's application are the collection of experimental values, analysis of the problem, search for solutions, sizing of the installation from defined functions, cost evaluation (planning and operating), and ranking of the options with regard to all the constraints. This method can be limited to a specific objective such as a fusion reactor safety analysis. The main advantages of this approach are the possibility of taking all the options and possible accidents into account, quality assurance, exhaustiveness of the safety analysis, identification of the residual risk, and modelling of the results. The sophisticated architecture of a fusion reactor includes a large number of interacting systems. The novelty of the fusion domain and the wide spectrum of possible options strongly increase the advantages of a system study, as a complete safety analysis can be defined before starting the design

  18. Urban Agglomerations in Regional Development: Theoretical, Methodological and Applied Aspects

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Shmidt

    2016-09-01

    Full Text Available The article focuses on the analysis of a major process of modern socio-economic development: the functioning of urban agglomerations. A short background of the economic literature on this phenomenon is given, covering both the traditional conceptions (the concentration of urban types of activities, the grouping of urban settlements by intensive production and labour communications) and the modern ones (cluster theories, theories of the network society). Two methodological principles of studying agglomeration are emphasized: the principle of the unity of the spatial concentration of economic activity, and the principle of compact living of the population. The positive and negative effects of agglomeration in the economic and social spheres are studied. It is concluded that agglomeration is helpful when it brings agglomerative economies, i.e. when the positive benefits from it exceed the additional costs. A methodology for examining an urban agglomeration and its role in regional development is offered. The approbation of this methodology on the example of Chelyabinsk and the Chelyabinsk region has allowed the authors to carry out a comparative analysis of the regional centre and the whole region by the main socio-economic indexes under static and dynamic conditions, and to draw conclusions on the position of the city and the region based on such socio-economic indexes as the average monthly nominal accrued wage, the cost of fixed assets, investments into fixed capital, new housing supply, retail turnover, and the volume of self-produced shipped goods, works and services performed in the region. In the study, the analysis of a launching site of the Chelyabinsk agglomeration is carried out.
It has revealed the following main characteristics of the core of the agglomeration in Chelyabinsk (structure feature, population, level of centralization of the core as well as the Chelyabinsk agglomeration in general (coefficient of agglomeration

  19. Future development of large steam turbines

    International Nuclear Information System (INIS)

    Chevance, A.

    1975-01-01

    An attempt is made to forecast the future of large steam turbines up to 1985. Three parameters affect the development of large turbines: 1) unit output, for which an output of 2000 to 2500 MW may be scheduled; 2) steam quality, for which two qualities may be considered: medium-pressure saturated or slightly superheated steam (light water, heavy water) with a small enthalpy drop, and high-pressure, high-temperature steam with a large enthalpy drop; and 3) the quality of the cooling supply, for which the largest range to be considered might be open-system cooling for sea sites, wet tower cooling, and dry tower cooling; bi-fluid cooling cycles should also be mentioned. From the study of these influencing factors, it appears that for an output of about 2500 MW the constructor should have at his disposal the following: two construction technologies for inlet parts and for high- and intermediate-pressure parts, corresponding to the two steam qualities; and exhaust sections suitable for the different qualities of cooling supply. The two construction technologies with the two steam qualities already exist and involve no major developments. But the exhaust section raises the question of rotational speed [fr

  20. METHODOLOGY OF RESEARCH AND DEVELOPMENT MANAGEMENT OF REGIONAL NETWORK ECONOMY

    Directory of Open Access Journals (Sweden)

    O.I. Botkin

    2007-06-01

    Full Text Available Information and communication (Internet) technologies now pervade practically all branches of the Russian regional economies and exert a huge influence on the development of economic relations in the environment of regional business: new forms of interaction between economic agents emerge, and the information and organizational structures of regional business management change. The integrated image of these innovations is the regional network economy: an interactive environment in which socio-economic and commodity-money relations between the economic agents of a region are performed at high speed and with minimal transaction costs (R.H. Coase), using the interactive opportunities of the global Internet. The urgency of researching the phenomenon of the regional network economy is caused, first of all, by the necessity of substantiating a methodology for the development of the regional network economy and of developing mechanisms for managing its infrastructure, with the purpose of increasing the efficiency of regional business. In our opinion, the solution of these problems will be the defining factor in maintaining effective economic development and growth of the Russian regions' economies in the near future.

  1. HRS Clinical Document Development Methodology Manual and Policies: Executive summary.

    Science.gov (United States)

    Indik, Julia H; Patton, Kristen K; Beardsall, Marianne; Chen-Scarabelli, Carol A; Cohen, Mitchell I; Dickfeld, Timm-Michael L; Haines, David E; Helm, Robert H; Krishnan, Kousik; Nielsen, Jens Cosedis; Rickard, John; Sapp, John L; Chung, Mina

    2017-10-01

    The Heart Rhythm Society (HRS) has been developing clinical practice documents in collaboration and partnership with other professional medical societies since 1996. The HRS formed a Scientific and Clinical Documents Committee (SCDC) with the sole purpose of managing the development of these documents from conception through publication. The SCDC oversees the process for developing clinical practice documents, with input and approval from the HRS Executive Committee and the Board of Trustees. As of May 2017, the HRS has produced more than 80 publications with other professional organizations. This process manual is produced to publicly and transparently declare the standards by which the HRS develops clinical practice documents, which include clinical practice guidelines, expert consensus statements, scientific statements, clinical competency statements, task force policy statements, and proceedings statements. The foundation for this process is informed by the Institute of Medicine's standards for developing trustworthy clinical practice guidelines; the new criteria from the National Guidelines Clearinghouse, effective June 2014; SCDC member discussions; and a review of guideline policies and methodologies used by other professional organizations. Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  2. Indigenously developed large pumping speed cryoadsorption cryopump

    International Nuclear Information System (INIS)

    Gangradey, Ranjana; Mukherjee, Samiran Shanti; Agarwal, Jyoti

    2015-01-01

    An indigenous cryoadsorption cryopump with large pumping speeds for fusion reactor application has been developed at the Institute for Plasma Research (IPR). Towards its successful realization, technological bottlenecks were identified, studied and resolved. Hydroformed cryopanels were developed from concept through design to product realization, with successful technology transfer to industry; this has built the expertise to develop hydroformed panels of any desired shape, geometry and welding pattern. Activated sorbents were developed and characterized using an experimental setup that measures adsorption isotherms down to 4 K for hydrogen and helium. Special techniques were evolved for coating sorbents on hydroformed cryopanels with suitable cryo-adhesives. Various arrangements of cryopanels at 4 K surrounded by 80 K shields and baffles (also hydroformed) were studied and optimized by transmission probability analysis using Monte Carlo techniques. CFD analysis was used to study the temperature distribution and the flow during cryogen flow through the panels. Integration of the developed technologies into the final product was a challenging task that was meticulously planned and executed. This resulted in a cryoadsorption cryopump offering pumping speeds as high as 50,000 to 70,000 l/s for helium and 150,000 l/s for hydrogen with 3.2 m² of sorbent panel area. The first laboratory-scale pump integrating the developed technologies was a Small Scale CryoPump (SSCP-01) with a pumping speed of 2,000 l/s for helium. Subsequently, a Single Panel CryoPump (SPCP-01) with a pumping speed of 10,000 l/s for helium, and a Multiple Panel CryoPump (MPCP-08) with pumping speeds of 70,000 l/s for helium and 150,000 l/s for hydrogen, were developed. This paper describes the efforts in realizing these products from laboratory to industrial scales. (author)
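
    The transmission-probability analysis mentioned above is classically performed with a test-particle Monte Carlo. The sketch below estimates the Clausing transmission probability of a plain cylindrical duct with cosine-law (diffuse) wall re-emission; it is a generic free-molecular-flow calculation, not the IPR cryopanel geometry:

```python
import math, random

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])

def normalize(v):
    s = math.sqrt(v[0]**2 + v[1]**2 + v[2]**2)
    return (v[0]/s, v[1]/s, v[2]/s)

def cosine_dir(rng, n):
    """Sample a direction from the cosine (Lambert) law about unit normal n."""
    u, phi = rng.random(), 2 * math.pi * rng.random()
    ct, st = math.sqrt(u), math.sqrt(1.0 - u)
    a = (0.0, 0.0, 1.0) if abs(n[2]) < 0.9 else (1.0, 0.0, 0.0)
    t1 = normalize(cross(a, n))
    t2 = cross(n, t1)
    return tuple(st*math.cos(phi)*t1[i] + st*math.sin(phi)*t2[i] + ct*n[i]
                 for i in range(3))

def transmission(L, R=1.0, n=20000, seed=3):
    """Monte Carlo estimate of the transmission probability of a cylindrical
    duct of length L and radius R in free-molecular flow."""
    rng = random.Random(seed)
    passed = 0
    for _ in range(n):
        r, ang = R * math.sqrt(rng.random()), 2 * math.pi * rng.random()
        x, y, z = r * math.cos(ang), r * math.sin(ang), 0.0
        d = cosine_dir(rng, (0.0, 0.0, 1.0))  # enter through the z = 0 plane
        while True:
            a = d[0]*d[0] + d[1]*d[1]
            if a > 1e-12:  # distance along d to the wall r = R
                b = x*d[0] + y*d[1]
                t = (-b + math.sqrt(max(0.0, b*b + a*(R*R - x*x - y*y)))) / a
                z_hit = z + t * d[2]
            else:          # travelling parallel to the axis
                t, z_hit = math.inf, math.copysign(math.inf, d[2])
            if d[2] > 0.0 and z_hit >= L:
                passed += 1          # crossed z = L: transmitted
                break
            if z_hit <= 0.0:
                break                # escaped back through the entrance
            x, y, z = x + t*d[0], y + t*d[1], z_hit
            d = cosine_dir(rng, (-x/R, -y/R, 0.0))  # diffuse wall re-emission
    return passed / n

print(f"L/R = 1 transmission ~ {transmission(1.0):.3f}")  # Clausing: ~0.672
```

    Cryopanel arrangements are optimized the same way: candidate geometries are ray-traced with diffuse re-emission, and the arrangement with the best transmission probability to the sorbent surfaces is retained.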

  3. Large Engine Technology (LET) Short Haul Civil Tiltrotor Contingency Power Materials Knowledge and Lifing Methodologies

    Science.gov (United States)

    Spring, Samuel D.

    2006-01-01

    This report documents the results of an experimental program conducted on two advanced metallic alloy systems (the directionally solidified (DS) alloy Rene' 142 and the single-crystal alloy Rene' N6) and the characterization of two distinct internal state variable inelastic constitutive models. The long-term objective of the study was to develop a computational life prediction methodology that can integrate the obtained material data. A specialized test matrix for characterizing advanced unified viscoplastic models was specified and conducted. This matrix included strain-controlled tensile tests with intermittent relaxation tests with 2 hr hold times, constant stress creep tests, stepped creep tests, mixed creep and plasticity tests, cyclic temperature creep tests, and tests in which temperature overloads were present to simulate actual operating conditions for validation of the models. The selected internal state variable models were shown to be capable of representing the material behavior exhibited by the experimental results; however, the program ended prior to final validation of the models.

  4. Development of new methodology for dose calculation in photographic dosimetry

    International Nuclear Information System (INIS)

    Daltro, T.F.L.

    1994-01-01

    A new methodology for equivalent dose calculation has been developed at IPEN-CNEN/SP, to be applied at the Photographic Dosimetry Laboratory, using artificial intelligence techniques by means of a neural network. The research was oriented towards the optimization of the whole set of parameters involved in the film processing, from the irradiation used to obtain the calibration curve up to the optical density readings. The training of the neural network was performed by taking the optical density readings from the calibration curve as input and the effective energy and equivalent dose as output. The results obtained in the intercomparison show an excellent agreement with the actual values of dose and energy given by the National Metrology Laboratory of Ionizing Radiation. (author)

  5. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. The applicability of the developed GOTHIC methodology is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of just three compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. Under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC calculation shows very good results: pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses, given the inherent conservatism of the CONTEMPT-LT code.

  6. Methodology on the sparger development for Korean next generation reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hwan Yeol; Hwang, Y.D.; Kang, H.S.; Cho, B.H.; Park, J.K

    1999-06-01

    In case of an accident, the safety depressurization system of the Korean Next Generation Reactor (KNGR) efficiently depressurizes the reactor by directly discharging high-pressure, high-temperature steam from the pressurizer into the in-containment refuelling water storage tank (IRWST) through spargers. This report was generated for the purpose of developing the KNGR sparger, and presents the methodology for applying the ABB-Atom design. Many thermal hydraulic parameters affecting the maximum bubble cloud pressure were obtained, the maximum bubble cloud pressure transient curve (the so-called forcing function of the KNGR) was suggested, and design inputs for the IRWST (bubble cloud radius vs. time, bubble cloud velocity vs. time, bubble cloud acceleration vs. time, etc.) were generated analytically using the Rayleigh-Plesset equation. (author). 17 refs., 6 tabs., 27 figs.
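
    The Rayleigh-Plesset equation used for such bubble-cloud design inputs can be integrated numerically. The sketch below uses a simplified inviscid, zero-surface-tension form with invented initial conditions (a 10 cm steam bubble at 3 bar released into water at 1 bar), not the KNGR design values.

```python
import math

def rayleigh_plesset_histories(r0, p_gas0, p_inf, rho=1000.0, gamma=1.4,
                               dt=1.0e-6, n_steps=40000):
    """Integrate the simplified (inviscid, zero surface tension) Rayleigh-Plesset
    equation  R*R'' + 1.5*R'**2 = (p_B - p_inf)/rho  with a polytropic gas
    pressure p_B = p_gas0*(r0/R)**(3*gamma), using semi-implicit Euler."""
    radius, velocity = r0, 0.0
    radii, velocities = [], []
    for _ in range(n_steps):
        p_bubble = p_gas0 * (r0 / radius) ** (3.0 * gamma)
        accel = ((p_bubble - p_inf) / rho - 1.5 * velocity ** 2) / radius
        velocity += dt * accel          # update velocity first (semi-implicit)
        radius += dt * velocity
        radii.append(radius)
        velocities.append(velocity)
    return radii, velocities

# toy transient: over-pressurized bubble expands past equilibrium and oscillates
radii, velocities = rayleigh_plesset_histories(r0=0.10, p_gas0=3.0e5, p_inf=1.0e5)
```

    The radius and velocity histories are exactly the kind of "bubble cloud radius vs. time" and "velocity vs. time" curves cited above as IRWST design inputs.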

  7. Turbofan Engine Core Compartment Vent Aerodynamic Configuration Development Methodology

    Science.gov (United States)

    Hebert, Leonard J.

    2006-01-01

    This paper presents an overview of the design methodology used in the development of the aerodynamic configuration of the nacelle core compartment vent for a typical Boeing commercial airplane together with design challenges for future design efforts. Core compartment vents exhaust engine subsystem flows from the space contained between the engine case and the nacelle of an airplane propulsion system. These subsystem flows typically consist of precooler, oil cooler, turbine case cooling, compartment cooling and nacelle leakage air. The design of core compartment vents is challenging due to stringent design requirements, mass flow sensitivity of the system to small changes in vent exit pressure ratio, and the need to maximize overall exhaust system performance at cruise conditions.

  8. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. The applicability of the developed GOTHIC methodology is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of just three compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. Under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC calculation shows very good results: pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses, given the inherent conservatism of the CONTEMPT-LT code.

  9. Development of a low-level waste risk methodology

    International Nuclear Information System (INIS)

    Fisher, J.E.; Falconer, K.L.

    1984-01-01

    A probabilistic risk assessment method is presented for performance evaluation of low-level waste disposal facilities. The associated program package calculates the risk associated with postulated radionuclide release and transport scenarios. Risk is computed as the mathematical product of two statistical variables: the dose consequence of a given release scenario, and its occurrence probability. A sample risk calculation is included which demonstrates the method. This PRA method will facilitate evaluation of facility performance, including identification of high risk scenarios and their mitigation via optimization of site parameters. The method is intended to be used in facility licensing as a demonstration of compliance with the performance objectives set forth in 10 CFR Part 61, or in corresponding state regulations. The Low-Level Waste Risk Methodology is being developed under sponsorship of the Nuclear Regulatory Commission
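
    The risk measure described here, the product of a scenario's occurrence probability and its dose consequence, can be sketched in a few lines; the scenario names and numbers below are invented for illustration, not values from the methodology.

```python
# hypothetical release-and-transport scenarios:
# (annual occurrence probability, dose consequence in person-Sv)
scenarios = {
    "trench cap failure and infiltration": (1e-3, 0.50),
    "container corrosion and leaching":    (1e-2, 0.004),
    "inadvertent-intruder borehole":       (1e-5, 12.0),
}

def risk(probability, dose):
    """Risk as the mathematical product of the two statistical variables."""
    return probability * dose

# rank scenarios so the high-risk ones can be targeted for mitigation
ranked = sorted(scenarios, key=lambda name: risk(*scenarios[name]), reverse=True)
for name in ranked:
    p, d = scenarios[name]
    print(f"{name:38s} risk = {risk(p, d):.1e} person-Sv/yr")
```

    Note that the highest-consequence scenario is not the highest-risk one; the ranking is what identifies which site parameters are worth optimizing.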

  10. Methodology on the sparger development for Korean next generation reactor

    International Nuclear Information System (INIS)

    Kim, Hwan Yeol; Hwang, Y.D.; Kang, H.S.; Cho, B.H.; Park, J.K.

    1999-06-01

    In case of an accident, the safety depressurization system of the Korean Next Generation Reactor (KNGR) efficiently depressurizes the reactor by directly discharging high-pressure, high-temperature steam from the pressurizer into the in-containment refuelling water storage tank (IRWST) through spargers. This report was generated for the purpose of developing the KNGR sparger, and presents the methodology for applying the ABB-Atom design. Many thermal hydraulic parameters affecting the maximum bubble cloud pressure were obtained, the maximum bubble cloud pressure transient curve (the so-called forcing function of the KNGR) was suggested, and design inputs for the IRWST (bubble cloud radius vs. time, bubble cloud velocity vs. time, bubble cloud acceleration vs. time, etc.) were generated analytically using the Rayleigh-Plesset equation. (author). 17 refs., 6 tabs., 27 figs.

  11. Development of new methodology for dose calculation in photographic dosimetry

    International Nuclear Information System (INIS)

    Daltro, T.F.L.; Campos, L.L.

    1994-01-01

    A new methodology for equivalent dose calculation has been developed at IPEN-CNEN/SP, to be applied at the Photographic Dosimetry Laboratory, using artificial intelligence techniques by means of a neural network. The research was oriented towards the optimization of the whole set of parameters involved in the film processing, from the irradiation used to obtain the calibration curve up to the optical density readings. The training of the neural network was performed by taking the optical density readings from the calibration curve as input and the effective energy and equivalent dose as output. The results obtained in the intercomparison show an excellent agreement with the actual values of dose and energy given by the National Metrology Laboratory of Ionizing Radiation.
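
    As a sketch of the approach, a small network trained to map optical-density readings back to dose, the toy example below fits an invented saturating calibration curve; the curve constants, network size and learning rate are assumptions, not the IPEN data.

```python
import numpy as np

rng = np.random.default_rng(0)

# invented film calibration curve: net optical density saturates with dose
dose = np.linspace(0.1, 10.0, 50)                 # equivalent dose, mSv
od = 2.5 * (1.0 - np.exp(-0.25 * dose))           # net optical density

x = (od / 2.5).reshape(-1, 1)                     # input: normalised OD
y = (dose / 10.0).reshape(-1, 1)                  # target: normalised dose

# one hidden layer of 8 tanh units, trained by full-batch gradient descent
W1 = rng.normal(0.0, 1.0, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)
lr = 0.5
for _ in range(20000):
    h = np.tanh(x @ W1 + b1)
    err = (h @ W2 + b2) - y                       # prediction error
    grad_W2 = h.T @ err / len(x); grad_b2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)            # backpropagate through tanh
    grad_W1 = x.T @ dh / len(x); grad_b1 = dh.mean(axis=0)
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

def dose_from_od(od_reading):
    """Equivalent dose (mSv) predicted from a film optical-density reading."""
    h = np.tanh(np.array([[od_reading / 2.5]]) @ W1 + b1)
    return float((h @ W2 + b2).item() * 10.0)
```

    Once trained, the network inverts the calibration curve directly, which is the appeal of the approach: no closed-form inverse of the film response is needed.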

  12. SPRINT RA 230: Methodology for knowledge based developments

    International Nuclear Information System (INIS)

    Wallsgrove, R.; Munro, F.

    1991-01-01

    SPRINT RA 230: A Methodology for Knowledge Based Developments, funded by the European Commission, was set up to investigate the use of knowledge based systems (KBS) in the engineering industry. Its aim was to find out how KBS were currently used and what people's conceptions of them were, to disseminate current knowledge, and to recommend further research in this area. A survey (by post and face-to-face interviews) was carried out under SPRINT RA 230 to investigate requirements for more intelligent software. In the survey we looked at how people think about KBS, at what they find useful and what is not useful, and at what current expertise problems or limitations of conventional software might suggest KBS solutions. (orig./DG)

  13. Development and evaluation of clicker methodology for introductory physics courses

    Science.gov (United States)

    Lee, Albert H.

    Many educators understand that lectures are cost effective but not learning efficient, so they continue to search for ways to increase active student participation in this traditionally passive learning environment. In-class polling systems, or "clickers", are inexpensive and reliable tools that allow students to participate actively in lectures by answering multiple-choice questions. Students assess their learning in real time by observing instant polling summaries displayed in front of them. This in turn motivates additional discussions which increase the opportunity for active learning. We wanted to develop a comprehensive clicker methodology that creates an active lecture environment for a broad spectrum of students taking introductory physics courses, and we wanted it to incorporate many findings of contemporary learning science. It is recognized that learning requires active construction; students need to be actively involved in their own learning process. Learning also depends on preexisting knowledge; students construct new knowledge and understandings based on what they already know and believe. Learning is context dependent; students who have learned to apply a concept in one context may not be able to recognize and apply the same concept in a different context, even when both contexts are considered isomorphic by experts. On this basis, we developed question sequences, each involving the same concept but having different contexts. Answer choices are designed to address students' preexisting knowledge. These sequences are used with the clickers to promote active discussions and multiple assessments. We have created, validated, and evaluated sequences sufficient in number to cover all of the introductory physics courses. Our research has found that using clickers with our question sequences significantly improved student conceptual understanding, and has also shown how best to measure student conceptual gain using research-based instruments.

  14. In-house developed methodologies and tools for decommissioning projects

    International Nuclear Information System (INIS)

    Detilleux, Michel; Centner, Baudouin

    2007-01-01

    The paper describes different methodologies and tools developed in-house by Tractebel Engineering to facilitate the engineering work to be carried out, especially in the frame of decommissioning projects. Three examples of tools, with their corresponding results, are presented: - The LLWAA-DECOM code, software developed for the radiological characterization of contaminated systems and equipment. The code constitutes a specific module of more general software that was originally developed to characterize radioactive waste streams, in order to be able to declare to the Authorities the radiological inventory of critical nuclides, in particular difficult-to-measure radionuclides. In the case of LLWAA-DECOM, deposited activities inside contaminated equipment (piping, tanks, heat exchangers...) and scaling factors between nuclides, at any given time of the decommissioning schedule, are calculated on the basis of the physical characteristics of the systems and of the operational parameters of the nuclear power plant. This methodology was applied to assess decommissioning costs of Belgian NPPs, to characterize the primary system of Trino NPP in Italy, to characterize the equipment of miscellaneous circuits of Ignalina NPP and of Kozloduy unit 1, and to calculate remaining dose rates around equipment in the frame of the preparation of decommissioning activities; - The VISIMODELLER tool, a user-friendly CAD interface developed to ease the introduction of lay-out areas into VISIPLAN, a 3D dose rate assessment tool for ALARA work planning developed by the Belgian Nuclear Research Centre SCK.CEN. Both tools were used for projects such as the steam generator replacements in Belgian NPPs and the preparation of the decommissioning of units 1 and 2 of Kozloduy NPP; - The DBS software, developed to manage the different kinds of activities that are part of the general time schedule of a decommissioning project. 
For each activity, when relevant

  15. METHODOLOGICAL GUIDELINES FOR THE TRANSPROFESSIONALISM DEVELOPMENT AMONG VOCATIONAL EDUCATORS

    Directory of Open Access Journals (Sweden)

    E. F. Zeer

    2017-01-01

    Introduction. Nowadays, in view of the 6th wave of technological innovations and the emergence of the phenomenon of «transfession», there is a need to modernize vocational staff training in our country. A transfession is a type of labour activity realized on the basis of the synthesis and convergence of professional competences belonging to different specialized areas. The authors therefore propose to use the professional and educational platform developed by them, taking into account the specialists' training specialty. The aims of the article are the following: to describe the phenomenon of «transprofessionalism» and to determine the initial attitudes towards its understanding; and to present the block-modular model of the platform for the formation of the transprofessionalism of the teachers of the vocational school. Methodology and research methods. The research is based on the following theoretical and scientific methods: analysis, synthesis, concretization, generalization; the hypothetical-deductive method; and the project-based method. The projecting of the transprofessionalism platform model was constructed on the basis of multidimensional, transdisciplinary, network and project approaches. Results and scientific novelty. The relevance of the discussed phenomenon in the productive-economic sphere is proved. Transprofessionalism requires a brand new content-informative and technological training of specialists. In particular, the concept «profession» has lost its original meaning as an area of the social division of labour during the socio-technological development of the Russian economy. Therefore, transprofessionals, being capable of performing a wide range of specialized types of professional activity, are becoming more competitive and demanded in the employment market. The structure, principles and mechanisms of functioning of the professional-educational platform for transprofessionalism formation among the members of professional

  16. Trends in scenario development methodologies and integration in NUMO's approach

    International Nuclear Information System (INIS)

    Ebashi, Takeshi; Ishiguro, Katsuhiko; Wakasugi, Keiichiro; Kawamura, Hideki; Gaus, Irina; Vomvoris, Stratis; Martin, Andrew J.; Smith, Paul

    2011-01-01

    The development of scenarios for quantitative or qualitative analysis is a key element of the assessment of the safety of geological disposal systems. As an outcome of an international workshop attended by European and Japanese implementers, a number of features common to current methodologies could be identified, as well as trends in their evolution over time. In the late nineties, scenario development was often described as a bottom-up process, whereby scenarios were said to be developed, in essence, from FEP databases. Nowadays, it is recognised that, in practice, the approaches actually adopted are better described as top-down or 'hybrid', taking as their starting point an integrated (top-down) understanding of the system under consideration, including uncertainties in its initial state, sometimes assisted by the development of 'storyboards'. A bottom-up element remains (hence the term 'hybrid') to the extent that FEP databases or FEP catalogues (including interactions) are still used, but the focus is generally on completeness checking, which occurs in parallel to the main assessment process. Recent advances focus on the consistent treatment of uncertainties throughout the safety assessment and on the integration of operational safety and long term safety. (author)

  17. Qualified software development methodologies for nuclear class 1E equipment

    International Nuclear Information System (INIS)

    Koch, Shlomo; Ruether, J.

    1992-01-01

    This article describes the experience gained at Northern States Power and Spectrum Technologies during the development of a computer-based Safeguard Load Sequencer for Prairie Island Nuclear Generating Plant. The Safeguard Load Sequencer (SLS) performs the functions of 4kV emergency bus voltage restoration, load shedding, and emergency diesel generator loading. The system is designed around an Allen-Bradley PLC-5 programmable controller, and the Safeguard Load Sequencer is the vehicle used to demonstrate the software engineering procedures and methodologies. The article analyzes the requirements imposed by the NUREG 4640 handbook and the relevant IEEE standards. It asks what software engineering is, and describes the waterfall life cycle phases of software development. The effect of each phase on software quality and the V and V plan is described. Issues in designing a V and V plan are addressed, and considerations of the cost and time to implement the program are described. The article also addresses tools that can increase productivity and reduce the cost and time of an extensive V and V plan; it describes the tools the authors used and, more importantly, presents a wish list of tools that they as developers would like to have. The role of testing is presented. The authors show that testing at the final stage has a lower impact on software quality than generally assumed. Full coverage in testing is almost always impossible, and they demonstrate how audits and tests during the development phase can improve software reliability.

  18. A large-scale study of epilepsy in Ecuador: methodological aspects.

    Science.gov (United States)

    Placencia, M; Suarez, J; Crespo, F; Sander, J W; Shorvon, S D; Ellison, R H; Cascante, S M

    1992-01-01

    The methodology is presented of a large-scale study of epilepsy carried out in a highland area of northern Ecuador, South America, covering a population of 72,121 people. The study was carried out in two phases. The first, cross-sectional phase consisted of a house-to-house survey of all persons in this population, screening for epileptic seizures using a specially designed questionnaire. Possible cases identified in screening were assessed in a cascade diagnostic procedure applied by general doctors and neurologists. Its objectives were: to establish a comprehensive epidemiological profile of epileptic seizures; to describe the clinical phenomenology of this condition in the community; to validate methods for diagnosis and classification of epileptic seizures by a non-specialised team; and to ascertain the community's knowledge, attitudes and practices regarding epilepsy. A sample was selected in this phase in order to study the social aspects of epilepsy in this community. The second, longitudinal phase assessed the ability of non-specialists to treat epilepsy. It consisted of a prospective clinical trial of antiepileptic therapy in untreated patients using two standard anti-epileptic drugs. Patients were followed for 12 months by a multidisciplinary team consisting of a primary health worker, rural doctor, neurologist, anthropologist, and psychologist. Standardised, reproducible instruments and methods were used. This study was carried out through co-operation between the medical profession, political agencies and the pharmaceutical industry at an international level. We consider this a model for further large-scale studies of this type.

  19. A Methodology of Health Effects Estimation from Air Pollution in Large Asian Cities

    Directory of Open Access Journals (Sweden)

    Keiko Hirota

    2017-09-01

    Health effects caused by air pollution are a growing concern in Asian cities with increasing motorization. This paper discusses methods of estimating the health effects of air pollution in large Asian cities. Given the absence of suitable statistical data in Asia, the paper carefully builds its methodology on data from the Japanese compensation system. A basic estimate of health effects is captured from simple indicators, such as population and air quality, in a correlation model. This correlation model yields estimates of respiratory mortality caused by air pollution in more cases than the relative risk model does. The correlation model could thus be an alternative method for estimating mortality besides the relative risk model, since the results of the two models are comparable by city and by time series. The classification of respiratory diseases cannot be obtained from the statistical yearbooks of many countries. The estimation results could support policy decision-making with respect to public health in a cost-effective way.
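
    A minimal version of such a correlation model, regressing city-level respiratory mortality on a simple air-quality indicator, might look as follows; all numbers are invented for illustration, not data from the compensation system.

```python
import numpy as np

# invented city-level data: annual-mean PM10 (ug/m3) and
# respiratory mortality (deaths per 100,000 population)
pm10      = np.array([35.0, 48.0, 60.0, 72.0, 90.0, 110.0])
mortality = np.array([22.0, 28.0, 33.0, 40.0, 47.0, 58.0])

slope, intercept = np.polyfit(pm10, mortality, 1)   # the correlation (linear) model
corr = np.corrcoef(pm10, mortality)[0, 1]           # strength of the correlation

def estimate_mortality(pm10_level):
    """Respiratory mortality per 100,000 estimated from air quality alone."""
    return intercept + slope * pm10_level

print(f"slope = {slope:.2f} deaths/100k per ug/m3, r = {corr:.3f}")
```

    Once fitted on cities where both quantities are known, the model can estimate mortality for cities where only population and air-quality indicators are available, which is exactly the data-scarce situation the paper addresses.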

  20. Applying of component system development in object methodology

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    software system and referred to as a software alliance. In both of these publications a deep philosophy is delivered of the relevant issues relating to SWC/SWA, such as creating copies of components (cloning), the establishment and destruction of components at software run-time (dynamic reconfiguration), the cooperation of autonomous components, and the programmable management of component interfaces depending on internal component functionality and customer requirements (functionality, security, versioning). Nevertheless, even today we can meet numerous cases of SWC/SWA with a highly developed architecture that accepts the vast majority of these requests. On the other hand, development practice with component-based systems with a dynamic architecture (i.e. architecture with dynamic reconfiguration) and, finally, with a mobile architecture (i.e. architecture with dynamic component mobility) confirms the inadequacy of the design methods contained in UML 2.0; this is shown especially by the dissertation thesis (Rych, Weis, 2008). Software Engineering currently has two different approaches to SWC/SWA systems. The first approach is known as component-oriented software development, CBD (Component Based Development). According to (Szyper, 2002), this is a collection of CBD methodologies that are heavily focused on the construction and re-usability of software components within the architecture. Although CBD does not show a high theoretical approach, it is nevertheless classified within the general evolution of the SDP (Software Development Process; see (Sommer, 2010)) as one of its two dominant directions. From a structural point of view, a software system consists of self-contained, interoperable architectural units: components based on well-defined interfaces. Classical procedural object-oriented methodologies make no significant use of the component meta-models on which the target component systems are then formed. Component meta-models describe the syntax, semantics of

  1. Development of very large helicon plasma source

    International Nuclear Information System (INIS)

    Shinohara, Shunjiro; Tanikawa, Takao

    2004-01-01

    We have developed a very large volume, high-density helicon plasma source, 75 cm in diameter and 486 cm in axial length; the full width at half maximum of the plasma density profile is up to ∼42 cm, with good plasma uniformity along the z axis. By the use of a spiral antenna located just outside the end of the vacuum chamber, behind a quartz-glass window, the plasma can be initiated with a very low value of radio frequency (rf) power, and a high plasma density of more than 10¹² cm⁻³ is successfully produced with less than several hundred watts, achieving excellent discharge efficiency. It is possible to control the radial density profile in this device by changing the magnetic field configurations near the antenna and/or the antenna radiation-field patterns.

  2. Development of Accident Scenarios and Quantification Methodology for RAON Accelerator

    International Nuclear Information System (INIS)

    Lee, Yongjin; Jae, Moosung

    2014-01-01

    The Rare Isotope Science Project (RISP) plans to provide neutron-rich rare isotope (RI) and stable heavy-ion beams. The accelerator is defined as a radiation production system according to the Nuclear Safety Law; it therefore needs strict operating procedures and safety assurance to prevent radiation exposure. To satisfy this condition, the potential risk of the accelerator needs to be evaluated from the design stage itself. Though some PSA research has been conducted for accelerators, most of it focuses not on general accident sequences but on simple descriptions of accidents. In this paper, general accident scenarios are developed with Event Trees, and a new quantification methodology for the Event Tree is deduced. In this study, some initiating events which may occur in the accelerator are selected. Using the selected initiating events, the accident scenarios of the accelerator facility are developed with Event Trees. These results can be used as basic data for future risk assessments of the accelerator. After analyzing the probability of each heading, it is possible to conduct quantification and evaluate the significance of the accident consequences. If accident scenarios are also developed for external events, the risk assessment of the entire accelerator facility will be complete. To reduce the uncertainty of the Event Trees, reliable data can be produced via the presented quantification techniques.
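
    Event Tree quantification of this kind reduces to multiplying the initiating-event frequency by success or failure probabilities along each branch. A minimal sketch follows; the headings and probabilities are invented for illustration, not RISP values.

```python
from itertools import product

INIT_FREQ = 1.0e-2          # assumed initiating-event frequency, per year
HEADINGS = {                # safety functions and failure-on-demand probabilities
    "beam interlock":  1.0e-3,
    "local shielding": 1.0e-4,
    "area evacuation": 1.0e-2,
}

def sequence_frequency(failed):
    """Frequency of one accident sequence: the initiator frequency times the
    success or failure probability of each heading along the branch."""
    freq = INIT_FREQ
    for heading, p_fail in HEADINGS.items():
        freq *= p_fail if heading in failed else (1.0 - p_fail)
    return freq

names = list(HEADINGS)
sequences = {}
for outcome in product((False, True), repeat=len(names)):   # False = success
    failed = frozenset(n for n, is_failed in zip(names, outcome) if is_failed)
    sequences[failed] = sequence_frequency(failed)

# in this toy tree, an exposure release needs both the interlock
# and the shielding to fail, regardless of the evacuation branch
release_freq = sum(f for failed, f in sequences.items()
                   if {"beam interlock", "local shielding"} <= failed)
```

    Summing all sequence frequencies recovers the initiating-event frequency, a useful consistency check on the quantified tree.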

  3. Safety-related operator actions: methodology for developing criteria

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Gray, L.H.; Beare, A.N.; Barks, D.B.; Gomer, F.E.

    1984-03-01

    This report presents a methodology for developing criteria for design evaluation of safety-related actions by nuclear power plant reactor operators, and identifies a supporting data base. It is the eleventh and final NUREG/CR Report on the Safety-Related Operator Actions Program, conducted by Oak Ridge National Laboratory for the US Nuclear Regulatory Commission. The operator performance data were developed from training simulator experiments involving operator responses to simulated scenarios of plant disturbances; from field data on events with similar scenarios; and from task analytic data. A conceptual model to integrate the data was developed and a computer simulation of the model was run, using the SAINT modeling language. Proposed is a quantitative predictive model of operator performance, the Operator Personnel Performance Simulation (OPPS) Model, driven by task requirements, information presentation, and system dynamics. The model output, a probability distribution of predicted time to correctly complete safety-related operator actions, provides data for objective evaluation of quantitative design criteria
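
    A sketch in the spirit of such a task-network simulation, sampling individual task times and building the distribution of total time to complete a safety-related action, is shown below; the task names and lognormal parameters are invented, not OPPS data.

```python
import math
import random
import statistics

# serial task network: (task, median time in seconds, lognormal sigma)
TASKS = [("detect alarm", 20.0, 0.3),
         ("diagnose event", 90.0, 0.5),
         ("execute action", 45.0, 0.4)]

def one_trial(rng):
    """Total completion time for one simulated operator run."""
    return sum(rng.lognormvariate(math.log(median), sigma)
               for _task, median, sigma in TASKS)

rng = random.Random(3)
times = sorted(one_trial(rng) for _ in range(10000))
p90 = times[int(0.9 * len(times))]
within_300 = sum(t <= 300.0 for t in times) / len(times)
print(f"mean = {statistics.fmean(times):.0f} s, 90th percentile = {p90:.0f} s, "
      f"P(complete within 300 s) = {within_300:.3f}")
```

    The output is the same kind of object the OPPS model produces, a probability distribution of time to complete the action, which can then be compared against a design criterion such as a required completion time.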

  4. Methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Mazumdar, M.; Marshall, J.A.; Chay, S.C.; Gay, R.

    1976-07-01

    In February 1975, Westinghouse Electric Corporation, under contract to the Electric Power Research Institute, started a one-year program to develop methodology for the statistical evaluation of nuclear-safety-related engineering analyses. The objectives of the program were to develop an understanding of the relative efficiencies of various computational methods which can be used to compute probability distributions of output variables due to input parameter uncertainties in analyses of design basis events for nuclear reactors, and to develop methods for obtaining reasonably accurate estimates of these probability distributions at an economically feasible level. A series of tasks was set up to accomplish these objectives. Two of the tasks were to investigate the relative efficiencies and accuracies of various Monte Carlo and analytical techniques for obtaining such estimates for a simple thermal-hydraulic problem whose output variable of interest is given in a closed-form relationship of the input variables, and to repeat the above study on a thermal-hydraulic problem in which the relationship between the predicted variable and the inputs is described by a short-running computer program. The purpose of the present report is to document the results of the investigations completed under these tasks, giving the rationale for the choices of techniques and problems, and to present interim conclusions.
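
    The first of those tasks, Monte Carlo sampling versus an analytical estimate of an output distribution for a closed-form relationship, can be sketched as follows; the surrogate model and input uncertainties are invented for illustration.

```python
import math
import random
import statistics

def model(q, m, cp):
    """Closed-form surrogate output, e.g. a temperature rise q/(m*cp)."""
    return q / (m * cp)

# assumed independent normal input uncertainties: (mean, standard deviation)
Q, M, CP = (1000.0, 50.0), (10.0, 0.5), (4.0, 0.1)

rng = random.Random(7)
samples = sorted(model(rng.gauss(*Q), rng.gauss(*M), rng.gauss(*CP))
                 for _ in range(20000))
mc_mean = statistics.fmean(samples)
mc_sd = statistics.stdev(samples)
mc_p95 = samples[int(0.95 * len(samples))]       # empirical 95th percentile

# first-order (Taylor) error propagation about the means, for comparison
y0 = model(Q[0], M[0], CP[0])
analytic_sd = y0 * math.sqrt((Q[1] / Q[0]) ** 2
                             + (M[1] / M[0]) ** 2
                             + (CP[1] / CP[0]) ** 2)
```

    For this mildly nonlinear model the two standard-deviation estimates nearly agree; the trade-off the program investigated is that Monte Carlo also yields the full distribution (and hence percentiles) at the cost of many model evaluations.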

  5. Development and testing of the methodology for performance requirements

    International Nuclear Information System (INIS)

    Rivers, J.D.

    1989-01-01

    The U.S. Department of Energy (DOE) is in the process of implementing a set of materials control and accountability (MC ampersand A) performance requirements. These graded requirements set a uniform level of performance for similar materials at various facilities against the threat of an insider adversary stealing special nuclear material (SNM). These requirements are phrased in terms of detecting the theft of a goal quantity of SNM within a specified time period and with a probability greater than or equal to a special value and include defense-in-depth requirements. The DOE has conducted an extensive effort over the last 2 1/2 yr to develop a practical methodology to be used in evaluating facility performance against the performance requirements specified in DOE order 5633.3. The major participants in the development process have been the Office of Safeguards and Security (OSS), Brookhaven National Laboratory, and Los Alamos National Laboratory. The process has included careful reviews of related evaluation systems, a review of the intent of the requirements in the order, and site visits to most of the major facilities in the DOE complex. As a result of this extensive effort to develop guidance for the MC ampersand A performance requirements, OSS was able to provide a practical method that will allow facilities to evaluate the performance of their safeguards systems against the performance requirements. In addition, the evaluations can be validated by the cognizant operations offices in a systematic manner

  6. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

The purpose of this project is to build a systematic framework that can evaluate the effect of human factors related problems on the safety of nuclear power plants (NPPs), as well as to develop technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error along with construction of a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of software, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  7. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-01

The purpose of this project is to build a systematic framework that can evaluate the effect of human factors related problems on the safety of nuclear power plants (NPPs), as well as to develop technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error along with construction of a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of software, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants

  8. Development of methodology for early detection of BWR instabilities

    International Nuclear Information System (INIS)

    Alessandro Petruzzi; Shin Chin; Kostadin Ivanov; Asok Ray; Fan-Bill Cheung

    2005-01-01

Full text of publication follows: The objective of the research presented in this paper, which is supported by the US Department of Energy under the NEER program, is to develop an early anomaly detection methodology in order to enhance safety, availability, and operational flexibility of Boiling Water Reactor (BWR) nuclear power plants. The technical approach relies on suppression of potential power oscillations in BWRs by detecting small anomalies at an early stage and taking appropriate prognostic actions based on an anticipated operation schedule. The model of coupled (two-phase) thermal-hydraulic and neutron flux dynamics, based on the US NRC coupled code TRACE/PARCS, is utilized as a generator of time series data for anomaly detection at an early stage. The methodology builds on the fact that nonlinear systems show bifurcation, a change in qualitative behavior as the system parameters vary. Some of these parameters may change of their own accord and account for the anomaly, while certain parameters can be altered in a controlled fashion. The non-linear, non-autonomous BWR system model considered in this research exhibits phenomena at two time scales. Anomalies occur at the slow time scale, while the observation of the dynamical behavior, based on which inferences are made, takes place at the fast time scale. It is assumed that: (i) the system behavior is stationary at the fast time scale; and (ii) any observable non-stationary behavior is associated with parametric changes evolving at the slow time scale. The goal is to make inferences about evolving anomalies based on the asymptotic behavior derived from the computer simulation. However, only sufficient changes in the slowly varying parameter may lead to a detectable difference in the asymptotic behavior. The need to detect such small changes in parameters, and hence the early detection of an anomaly, motivates the stimulus-response approach utilized. In this approach, the model

  9. Architecture for large-scale automatic web accessibility evaluation based on the UWEM methodology

    DEFF Research Database (Denmark)

    Ulltveit-Moe, Nils; Olsen, Morten Goodwin; Pillai, Anand B.

    2008-01-01

    The European Internet Accessibility project (EIAO) has developed an Observatory for performing large scale automatic web accessibility evaluations of public sector web sites in Europe. The architecture includes a distributed web crawler that crawls web sites for links until either a given budget...... of web pages have been identified or the web site has been crawled exhaustively. Subsequently, a uniform random subset of the crawled web pages is sampled and sent for accessibility evaluation and the evaluation results are stored in a Resource Description Format (RDF) database that is later loaded...... challenges that the project faced and the solutions developed towards building a system capable of regular large-scale accessibility evaluations with sufficient capacity and stability. It also outlines some possible future architectural improvements....

  10. A Design Science Research Methodology for Expert Systems Development

    Directory of Open Access Journals (Sweden)

    Shah Jahan Miah

    2016-11-01

Full Text Available The knowledge of design science research (DSR) can have applications for improving expert systems (ES) development research. Although significant progress in utilising DSR has been observed in particular information systems design – such as decision support systems (DSS) studies – only rare attempts can be found in the ES design literature. Therefore, the aim of this study is to investigate the use of DSR for ES design. First, we explore the ES development literature to reveal the presence of DSR as a research methodology. For this, we select relevant literature criteria and apply a qualitative content analysis in order to generate themes inductively to match the DSR components. Second, utilising the findings of the comparison, we determine a new DSR approach for designing a specific ES that is guided by another result – the findings of a content analysis of examination scripts in Mathematics. The specific ES artefact for a case demonstration is designed to address the requirement of a ‘wicked’ problem, in that its key purpose is to assist human assessors when evaluating multi-step question (MSQ) solutions. It is anticipated that the proposed design knowledge, in terms of both the problem class and the functions of ES artefacts, will help ES designers and researchers to address similar issues when designing information system solutions.

  11. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    Science.gov (United States)

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. In particular, we advocate that the scale of image retrieval systems should be significantly increased, to a point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity, and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.

  12. Development of a Teaching Methodology for Undergraduate Human Development in Psychology

    Science.gov (United States)

    Rodriguez, Maria A.; Espinoza, José M.

    2015-01-01

    The development of a teaching methodology for the undergraduate Psychology course Human Development II in a private university in Lima, Peru is described. The theoretical framework consisted of an integration of Citizen Science and Service Learning, with the application of Information and Communications Technology (ICT), specifically Wikipedia and…

  13. Development of a General Modelling Methodology for Vacuum Residue Hydroconversion

    Directory of Open Access Journals (Sweden)

    Pereira de Oliveira L.

    2013-11-01

Full Text Available This work concerns the development of a methodology for kinetic modelling of refining processes, and more specifically for vacuum residue conversion. The proposed approach makes it possible to overcome the lack of molecular detail of the petroleum fractions and to simulate the transformation of the feedstock molecules into effluent molecules by means of a two-step procedure. In the first step, a synthetic mixture of molecules representing the feedstock for the process is generated via a molecular reconstruction method, termed SR-REM molecular reconstruction. In the second step, a kinetic Monte-Carlo method (kMC) is used to simulate the conversion reactions on this mixture of molecules. The molecular reconstruction was applied to several petroleum residues and is illustrated for an Athabasca (Canada) vacuum residue. The kinetic Monte-Carlo method is then described in detail. In order to validate this stochastic approach, a lumped deterministic model for vacuum residue conversion was simulated using Gillespie’s Stochastic Simulation Algorithm. Despite the fact that the two approaches are based on very different hypotheses, the stochastic simulation algorithm simulates the conversion reactions with the same accuracy as the deterministic approach. The full-scale stochastic simulation approach using molecular-level reaction pathways provides a high amount of detail on the effluent composition and is briefly illustrated for Athabasca VR hydrocracking.
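The deterministic-vs-stochastic validation step can be illustrated with a minimal Gillespie Stochastic Simulation Algorithm run on an assumed lumped chain A → B → C (standing in for, say, residue → intermediate → product). The species and rate constants below are invented for illustration, not taken from the paper.

```python
import random

def gillespie(counts, k1, k2, t_end):
    """Gillespie SSA for the assumed lumped first-order chain A -> B -> C."""
    a, b, c = counts
    t = 0.0
    while True:
        propensities = (k1 * a, k2 * b)        # first-order reaction rates
        total = propensities[0] + propensities[1]
        if total == 0.0:
            break                              # no reactions left to fire
        dt = random.expovariate(total)         # waiting time to next event
        if t + dt > t_end:
            break                              # next event falls past t_end
        t += dt
        if random.random() * total < propensities[0]:
            a, b = a - 1, b + 1                # A -> B fires
        else:
            b, c = b - 1, c + 1                # B -> C fires
    return a, b, c

random.seed(0)
a, b, c = gillespie((1000, 0, 0), k1=1.0, k2=0.5, t_end=10.0)
```

Averaging many such trajectories should reproduce the deterministic lumped-model solution, which is the comparison the paper uses for validation.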

  14. Development of methodology for the characterization of radioactive sealed sources

    International Nuclear Information System (INIS)

    Ferreira, Robson de Jesus

    2010-01-01

Sealed radioactive sources are widely used in many applications of nuclear technology in industry, medicine, research and other fields. The International Atomic Energy Agency (IAEA) estimates there are tens of millions of sources in the world. In Brazil, the number is about 500 thousand sources, if the americium-241 sources present in radioactive lightning rods and smoke detectors are included in the inventory. At the end of their useful life, most sources become disused, constitute radioactive waste, and are then termed spent sealed radioactive sources (SSRS). In Brazil, this waste is collected by the research institutes of the National Nuclear Energy Commission and kept under centralized storage, awaiting definition of the final disposal route. The Waste Management Laboratory (WML) at the Nuclear and Energy Research Institute is the main storage center, having received, by July 2010, about 14,000 disused sources, not including the tens of thousands of lightning-rod and smoke-detector sources. A program is underway in the WML to replace the original shieldings with a standard disposal package and to determine the radioisotope content and activity of each one. The identification of the radionuclides and the measurement of activities will be carried out with a well-type ionization chamber. This work aims to develop a methodology for determining the activity of the SSRS stored in the WML in accordance with their geometry, and to determine the associated uncertainties. (author)

  15. Development of a Seismic Setpoint Calculation Methodology Using a Safety System Approach

    International Nuclear Information System (INIS)

    Lee, Chang Jae; Baik, Kwang Il; Lee, Sang Jeong

    2013-01-01

The Automatic Seismic Trip System (ASTS) automatically actuates a reactor trip when it detects seismic activity whose magnitude is comparable to a Safe Shutdown Earthquake (SSE), the maximum hypothetical earthquake at the nuclear power plant site. To ensure that the reactor is tripped before the magnitude of an earthquake exceeds the SSE, it is crucial to reasonably determine the seismic setpoint. The trip setpoint and allowable value of the ASTS for Advanced Power Reactor (APR) 1400 Nuclear Power Plants (NPPs) were determined by the methodology presented in this paper. The ASTS, which trips the reactor when a large earthquake occurs, is categorized as a non-safety system because the system is not required by design basis event criteria. This means the ASTS has neither a specific analytical limit nor a dedicated setpoint calculation methodology. Therefore, we developed the ASTS setpoint calculation methodology by conservatively adapting that of the Plant Protection System (PPS). By incorporating the developed methodology into the ASTS for APR1400, a more conservative trip setpoint and allowable value were determined. In addition, the zero-period acceleration (ZPA) from the Operating Basis Earthquake (OBE) floor response spectrum (FRS) of the floor where the sensor module is located is 0.1 g. Thus, the allowance of 0.17 g between the OBE of 0.1 g and the ASTS trip setpoint of 0.27 g is sufficient to prevent a reactor trip before the magnitude of the earthquake exceeds the OBE. As a result, the developed ASTS setpoint calculation methodology is judged reasonable with respect to both the safety and the performance of the NPPs. It will be used to determine the ASTS trip setpoint and allowable value for newly constructed plants
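A generic setpoint calculation of the kind adapted from the PPS typically subtracts a statistical combination of channel uncertainties from an analytical limit. The sketch below shows that pattern with assumed numbers; only the 0.1 g OBE value comes from the abstract, and the limit and uncertainty terms are hypothetical.

```python
import math

# Assumed analytical limit and channel uncertainty terms (illustrative only);
# uncertainties are combined by square-root-sum-of-squares (SRSS).
analytical_limit = 0.40                 # g, assumed
uncertainties = [0.05, 0.03, 0.02]      # g: sensor, rack, drift terms (assumed)

total_unc = math.sqrt(sum(u * u for u in uncertainties))
allowable_value = analytical_limit - total_unc
trip_setpoint = allowable_value - 0.05  # additional margin term (assumed)

obe = 0.10                              # g, OBE ZPA cited in the abstract
margin_to_obe = trip_setpoint - obe     # headroom above the OBE
```

The conservatism the abstract describes corresponds to keeping `trip_setpoint` below `allowable_value`, which in turn stays below the limit, while preserving positive margin above the OBE.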

  16. A methodology to support the development of 4-year pavement management plan.

    Science.gov (United States)

    2014-07-01

    A methodology for forming and prioritizing pavement maintenance and rehabilitation (M&R) projects was developed. : The Texas Department of Transportation (TxDOT) can use this methodology to generate defensible and cost-effective : 4-year pavement man...

  17. Development of an aeroelastic methodology for surface morphing rotors

    Science.gov (United States)

    Cook, James R.

Helicopter performance capabilities are limited by maximum lift characteristics and vibratory loading. In high-speed forward flight, dynamic stall and transonic flow greatly increase the amplitude of vibratory loads. Experiments and computational simulations alike have indicated that a variety of active rotor control devices are capable of reducing vibratory loads. For example, periodic blade twist and flap excitation have been optimized to reduce vibratory loads in various rotors. Airfoil geometry can also be modified in order to increase lift coefficient, delay stall, or weaken transonic effects. To explore the potential benefits of active controls, computational methods are being developed for aeroelastic rotor evaluation, including coupling between computational fluid dynamics (CFD) and computational structural dynamics (CSD) solvers. In many contemporary CFD/CSD coupling methods it is assumed that the airfoil is rigid, to reduce the interface by a single dimension. Some methods retain the conventional one-dimensional beam model while prescribing an airfoil shape to simulate active chord deformation. However, to simulate the actual response of a compliant airfoil it is necessary to include deformations that originate not only from control devices (such as piezoelectric actuators), but also from inertial forces, elastic stresses, and aerodynamic pressures. An accurate representation of the physics requires an interaction with a more complete representation of loads and geometry. A CFD/CSD coupling methodology capable of communicating three-dimensional structural deformations and a distribution of aerodynamic forces over the wetted blade surface has not yet been developed. In this research an interface is created within the Fully Unstructured Navier-Stokes (FUN3D) solver that communicates aerodynamic forces on the blade surface to the University of Michigan's Nonlinear Active Beam Solver (UM/NLABS -- referred to as NLABS in this thesis). Interface routines are developed for

  18. A methodology for laser diagnostics in large-bore marine two-stroke diesel engines

    International Nuclear Information System (INIS)

    Hult, J; Mayer, S

    2013-01-01

    Large two-stroke diesel engines for marine propulsion offer several challenges to successful implementation of the laser diagnostic techniques applied extensively in smaller automotive engines. For this purpose a fully operational large-bore engine has been modified to allow flexible optical access, through 24 optical ports with clear diameters of 40 mm. By mounting the entire optical set-up directly to the engine, effects of the vigorous vibrations and thermal drifts on alignment can be minimized. Wide-angle observation and illumination, as well as relatively large aperture detection, is made possible through mounting of optical modules and relays inside optical ports. This allows positioning of the last optical element within 10 mm from the cylinder wall. Finally, the implementation on a multi-cylinder engine allows for flexible and independent operation of the optically accessible cylinder for testing purposes. The performance of the integrated optical engine and imaging system developed is demonstrated through laser Mie scattering imaging of fuel jet structures, from which information on liquid penetration and spray angles can be deduced. Double pulse laser-sheet imaging of native in-cylinder structures is also demonstrated, for the purpose of velocimetry. (paper)

  19. Developing a Validation Methodology for TACAIR Soar Agents in EAAGLES

    National Research Council Canada - National Science Library

    Alford III, Lewis E; Dudas, Brian A

    2005-01-01

    ...) environment, but have potential for use in EAAGLES. SIMAF requested research be conducted on a validation methodology to apply to the agents' behavior once they have been successfully imported into the EAAGLES environment...

  20. Development of a computational methodology for internal dose calculations

    International Nuclear Information System (INIS)

    Yoriyaz, Helio

    2000-01-01

A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body and a more precise tool for the radiation transport simulation. The present technique shows the capability to build a patient-specific phantom from tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods such as the MCNP-4B code. In order to utilize the segmented human anatomy as a computational model for the simulation of radiation transport, an interface program, SCMS, was developed to build the geometric configurations for the phantom from tomographic images. This procedure makes it possible to calculate not only average dose values but also the spatial distribution of dose in regions of interest. With the present methodology, absorbed fractions for photons and electrons in various organs of the Zubal segmented phantom were calculated and compared to those reported for the mathematical phantoms of Snyder and Cristy-Eckerman. Although the differences in organ geometry between the phantoms are quite evident, the results generally show small discrepancies; in some cases, however, considerable discrepancies were found, due to two major causes: differences in the organ masses between the phantoms and the occurrence of organ overlap in the Zubal segmented phantom, which is not considered in the mathematical phantoms. This effect was quite evident for organ cross-irradiation from electrons. The determination of the spatial dose distribution demonstrated the possibility of evaluating more detailed dose data than those obtained by conventional methods, which will provide important information for clinical analysis in therapeutic procedures and in radiobiologic studies of the human body. (author)
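Post-processing a voxel-phantom Monte Carlo run into absorbed fractions and organ doses reduces to simple ratios. The tally values, organ masses and photon energy below are assumed for illustration, not results from the SCMS/MCNP-4B study.

```python
# Convert assumed per-decay energy-deposition tallies from a voxel-phantom
# Monte Carlo run into absorbed fractions and organ doses. All numbers are
# illustrative, not results from the study above.
E_PHOTON_MEV = 0.364                               # e.g. I-131 principal gamma
deposited_mev = {"thyroid": 0.12, "lung": 0.003}   # MeV deposited/decay (assumed)
organ_mass_kg = {"thyroid": 0.02, "lung": 1.0}     # organ masses (assumed)

MEV_TO_J = 1.602e-13
# Absorbed fraction: energy deposited in the organ per unit energy emitted.
absorbed_fraction = {organ: e / E_PHOTON_MEV
                     for organ, e in deposited_mev.items()}
# Organ dose per decay: deposited energy (J) divided by organ mass (kg).
dose_gy_per_decay = {organ: deposited_mev[organ] * MEV_TO_J / organ_mass_kg[organ]
                     for organ in organ_mass_kg}
```

Differences in the organ-mass denominator are exactly why the abstract reports discrepancies between the voxel and mathematical phantoms even when deposited energies agree.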

  1. Methodological development of the process of appreciation of photography Conceptions

    Directory of Open Access Journals (Sweden)

    Yovany Álvarez García

    2012-12-01

Full Text Available This article discusses the different conceptions used in the methodological appreciation of photography. Since photography is one of the manifestations of the visual arts with which we most commonly interact daily, being found in books, magazines and other publications, the article discusses various methodologies for assessing the photographic image. It also addresses the classic themes of photography as well as some expressive elements.

  2. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.

  3. Environmental quality indexing of large industrial development alternatives using AHP

    International Nuclear Information System (INIS)

    Solnes, Julius

    2003-01-01

Two industrial development alternatives have been proposed for the East Coast of Iceland in order to strengthen its socio-economic basis. The favoured option is to build a large aluminium smelter, which requires massive hydropower development in the nearby highlands. Another viable option is the construction of a 6-million-ton oil refinery, following the planned exploitation of the Timan Pechora oil reserves in the Russian Arctic. A third 'fictitious' alternative could be general development of existing regional industry and new knowledge-based industries, development of ecotourism, and establishment of national parks, accompanied by infrastructure improvement (roads, tunnels, communications, schools, etc.). The three alternatives would have different environmental consequences. The controversial hydropower plant for the smelter requires a large water reservoir as well as considerable land disturbance in this unique mountain territory, considered to be the largest uninhabited wilderness in Western Europe. The aluminium smelter and the oil refinery would each give rise to a substantial increase in the greenhouse gas (GHG) emissions of the country (about 20%). There is also a potential environmental risk associated with the refinery regarding oil spills at sea, which could have a disastrous impact on the fisheries industry. However, the oil refinery does not require any hydropower development, which is a positive factor. Finally, the third alternative could be characterized as a "green" solution whereby the detrimental environmental consequences of the two industrial solutions are mostly avoided. In order to compare the three alternatives in an orderly manner, the analytic hierarchy process methodology of Saaty was applied to calculate the environmental quality index of each alternative, which is defined as a weighted sum of selected environmental and socio-economic factors. These factors are evaluated on a comparison basis, applying the AHP methodology, and the weights in the quality
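The AHP step described above, deriving criterion weights from a pairwise comparison matrix and forming a weighted index per alternative, can be sketched as follows. The comparison matrix, criteria and alternative scores are invented for illustration, not Solnes's actual data.

```python
# Derive AHP criterion weights by power iteration on a pairwise comparison
# matrix (converging to the principal eigenvector), then rank alternatives
# by their weighted environmental quality index.

def ahp_weights(m, iters=200):
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]            # renormalize each iteration
    return w

# Assumed criteria: GHG emissions, land disturbance, socio-economic benefit.
pairwise = [
    [1.0,     3.0,     5.0],
    [1 / 3.0, 1.0,     3.0],
    [1 / 5.0, 1 / 3.0, 1.0],
]
weights = ahp_weights(pairwise)

# Assumed 0-1 scores of each alternative against each criterion
# (higher = less environmental harm / more benefit).
alternatives = {
    "smelter":  [0.2, 0.1, 0.8],
    "refinery": [0.3, 0.7, 0.7],
    "green":    [0.9, 0.9, 0.4],
}
index = {name: sum(w * s for w, s in zip(weights, scores))
         for name, scores in alternatives.items()}
```

With Saaty's 1-9 scale, a full study would also check the consistency ratio of the pairwise matrix before trusting the weights.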

  4. Development of Fuzzy Logic and Soft Computing Methodologies

    Science.gov (United States)

    Zadeh, L. A.; Yager, R.

    1999-01-01

Our earlier research on computing with words (CW) has led to a new direction in fuzzy logic which points to a major enlargement of the role of natural languages in information processing, decision analysis and control. This direction is based on the methodology of computing with words and embodies a new theory which is referred to as the computational theory of perceptions (CTP). An important feature of this theory is that it can be added to any existing theory - especially to probability theory, decision analysis, and control - and enhance the ability of the theory to deal with real-world problems in which the decision-relevant information is a mixture of measurements and perceptions. The new direction is centered on an old concept - the concept of a perception - a concept which plays a central role in human cognition. The ability to reason with perceptions - perceptions of time, distance, force, direction, shape, intent, likelihood, truth and other attributes of physical and mental objects - underlies the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Everyday examples of such tasks are parking a car, driving in city traffic, cooking a meal, playing golf and summarizing a story. Perceptions are intrinsically imprecise. Imprecision of perceptions reflects the finite ability of sensory organs and, ultimately, the brain, to resolve detail and store information. More concretely, perceptions are both fuzzy and granular, or, for short, f-granular. Perceptions are f-granular in the sense that: (a) the boundaries of perceived classes are not sharply defined; and (b) the elements of classes are grouped into granules, with a granule being a clump of elements drawn together by indistinguishability, similarity, proximity or functionality. F-granularity of perceptions may be viewed as a human way of achieving data compression. In large measure, scientific progress has been, and continues to be

  5. Development on design methodology of PWR passive containment system

    International Nuclear Information System (INIS)

    Lee, Seong Wook

    1998-02-01

The containment is the most important barrier against the release of radioactive materials into the environment during accident conditions of nuclear power plants. Therefore, the development of a reliable containment cooling system is one of the key areas in advanced reactor development. To enhance the safety of the containment system, many new containment system designs have been proposed and developed in the world. Several passive containment cooling system (PCCS) concepts for both steel and concrete containment systems are overviewed and assessed comparatively. Major concepts considered are: (a) the spray of water on the outer surface of a steel containment from an elevated tank, (b) an external moat for a steel containment, (c) a suppression pool for a concrete containment, and (d) a combination of the internal spray and internal or external condensers for a concrete containment. Emphasis is given to the heat removal principles, the required heat transfer area, system complexity and operational reliability. As one of the conceptual design steps of the containment, a methodology based on scaling principles is proposed to determine the containment size according to the power level. The AP600 containment system is selected as the reference containment to which the scaling laws are applied. Governing equations of containment pressure are set up in consideration of containment behavior in accident conditions. Then, the dimensionless numbers, which characterize the containment phenomena, are derived for the blowdown dominant and decay heat dominant stages, respectively. The important phenomena in the blowdown stage are mass and energy sources and their absorption in the containment atmosphere or containment structure, while heat transfer to the outer environment becomes important in the decay heat stage. Based on the similarity between the prototype and the model, the containment sizes are determined for higher power levels and are compared with the SPWR containment design values available

  6. Development of radiation risk assessment simulator using system dynamics methodology

    International Nuclear Information System (INIS)

    Kang, Kyung Min; Jae, Moosung

    2008-01-01

The potential magnitudes of radionuclide releases under severe accident loadings and offsite consequences, as well as the overall risk (the product of accident frequencies and consequences), are analyzed and evaluated quantitatively in this study. The system dynamics methodology has been applied to predict time-dependent behaviors such as feedback and dependency, as well as to model the uncertain behavior of complex physical systems. It is used to construct the transfer mechanisms of time-dependent radioactivity concentrations and to evaluate them. Dynamic variations of radioactivity are simulated by considering several effects such as deposition, weathering, washout, re-suspension, root uptake, translocation, leaching, senescence, intake, and excretion of soil. A time-dependent radio-ecological model applicable to the Korean-specific environment has been developed in order to assess the radiological consequences following the short-term deposition of radionuclides during severe accidents at nuclear power plants. An ingestion food chain model can estimate time-dependent radioactivity concentrations in foodstuffs. It is also shown that the system dynamics approach is useful for analyzing the phenomena of the complex system as well as the behavior of structure values with respect to time. The output of this model (Bq ingested per Bq m⁻² deposited) may be multiplied by the deposition and a dose conversion factor (Gy Bq⁻¹) to yield organ-specific doses. The model may be run deterministically to yield a single estimate, or stochastically by Monte Carlo calculation to yield distributions that reflect parameter and model uncertainties. The results of this study may contribute to identifying the relative importance of various parameters in consequence analysis, as well as to assessing risk reduction effects in accident management. (author)
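The transfer mechanisms listed above (deposition, weathering, decay, and so on) form a compartment model that can be integrated forward in time. The two-compartment toy below uses assumed rate constants, not the Korean-specific parameters of the study.

```python
import math

# Two-compartment toy: a unit deposit on vegetation moves to soil by
# weathering while both compartments decay radioactively. Rate constants
# are assumed values for illustration only.
LAMBDA = math.log(2) / 30.0   # decay constant for a ~30 y half-life (1/y)
WEATHERING = 18.0             # plant-to-soil weathering rate (1/y)
DT = 0.001                    # explicit Euler time step (y)

plant, soil = 1.0, 0.0        # Bq m^-2, unit deposition on plant surfaces
for _ in range(int(1.0 / DT)):            # integrate over one year
    d_plant = -(WEATHERING + LAMBDA) * plant
    d_soil = WEATHERING * plant - LAMBDA * soil
    plant += d_plant * DT
    soil += d_soil * DT
```

A full food-chain model chains many such compartments (soil, root uptake, foodstuffs, intake), which is what system dynamics tools express as stocks and flows.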

  7. Strain-Based Design Methodology of Large Diameter Grade X80 Linepipe

    Energy Technology Data Exchange (ETDEWEB)

    Lower, Mark D. [ORNL

    2014-04-01

    developed from test data. The results are intended to enhance SBD and analysis methods for producing safe and cost effective pipelines capable of accommodating large plastic strains in seismically active arctic areas.

  8. A combination of streamtube and geostatical simulation methodologies for the study of large oil reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Chakravarty, A.; Emanuel, A.S.; Bernath, J.A. [Chevron Petroleum Technology Company, LaHabra, CA (United States)

    1997-08-01

    The application of streamtube models for reservoir simulation has an extensive history in the oil industry. Although these models are strictly applicable only to fields under voidage balance, they have proved useful in a large number of fields provided that there is no solution-gas evolution and production. These models combine the benefit of very fast computational time with the practical ability to model a large reservoir over the course of its history. They do not, however, directly incorporate the detailed geological information that recent experience has taught is important. This paper presents a technique for mapping the saturation information contained in a history-matched streamtube model onto a detailed, geostatistically derived finite difference grid. With this technique, the saturation information in a streamtube model, data that is essentially statistical in nature, can be identified with actual physical locations in a field, and a picture of the remaining oil saturation can be determined. Alternatively, the streamtube model can be used to simulate the early development history of a field and the saturation data then used to initialize detailed late-time finite difference models. The proposed method is presented through an example application to the Ninian reservoir. This reservoir, located in the North Sea (UK), is a heterogeneous sandstone characterized by a line-drive waterflood, about 160 wells, and a 16-year history. The reservoir was satisfactorily history matched and mapped for remaining oil saturation. A comparison with a 3-D seismic survey and recently drilled wells has provided preliminary verification.

  9. The Application of Best Estimate and Uncertainty Analysis Methodology to Large LOCA Power Pulse in a CANDU 6 Reactor

    International Nuclear Information System (INIS)

    Abdul-Razzak, A.; Zhang, J.; Sills, H.E.; Flatt, L.; Jenkins, D.; Wallace, D.J.; Popov, N.

    2002-01-01

    The paper briefly describes a best estimate plus uncertainty analysis (BE+UA) methodology and presents its prototyping application to the power pulse phase of a limiting large Loss-of-Coolant Accident (LOCA) for a CANDU 6 reactor fuelled with CANFLEX(R) fuel. The methodology is consistent with, and builds on, world practice. The analysis is divided into two phases, to focus on the dominant parameters for each phase and to allow for the consideration of all identified highly ranked parameters in the statistical analysis and response surface fits for margin parameters. The objective of this analysis is to quantify improvements in predicted safety margins under best estimate conditions. (authors)
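As a rough illustration of the statistical step (not the actual CANDU analysis), highly ranked input parameters can be sampled and propagated through a fitted response surface for a margin parameter, with a conservative lower percentile read off the resulting distribution. The surface, its coefficients, and the input distributions below are invented for the sketch:

```python
import random

def margin_surface(x1, x2):
    # hypothetical response surface fitted to code runs for one margin
    # parameter (e.g. a temperature margin); coefficients are illustrative
    return 100.0 - 8.0 * x1 - 5.0 * x2 - 2.0 * x1 * x2

def conservative_margin(n=10000, q=0.05, seed=42):
    # sample the uncertain inputs, evaluate the surface, and report the
    # lower q-quantile of the margin as the best-estimate-plus-uncertainty value
    rng = random.Random(seed)
    samples = sorted(margin_surface(rng.gauss(0.0, 1.0), rng.uniform(-1.0, 1.0))
                     for _ in range(n))
    return samples[int(q * n)]
```

The response surface stands in for the full system code, which is what makes the many thousands of samples affordable.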

  10. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

    An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and with gamma attenuation factors calculated using MCNP-5. Both the relative and the absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.
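The relative variant of the proposed normalization can be written out as below; the function and parameter names are ours, not the paper's, and the idea is simply that each count rate is divided by its measured flux and calculated attenuation factor before the sample is ratioed against a standard of known concentration:

```python
def cd_concentration(rate_sample, flux_sample, atten_sample,
                     rate_standard, flux_standard, atten_standard,
                     conc_standard):
    # normalise each prompt-gamma count rate by the measured thermal neutron
    # flux and the calculated gamma attenuation factor, then scale the known
    # concentration of the standard by the ratio of normalised rates
    norm_sample = rate_sample / (flux_sample * atten_sample)
    norm_standard = rate_standard / (flux_standard * atten_standard)
    return conc_standard * norm_sample / norm_standard
```

The normalization is what lets samples of different geometry and matrix be compared against the same standard.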

  11. Development of CANDU ECCS performance evaluation methodology and guides

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Kwang Hyun; Park, Kyung Soo; Chu, Won Ho [Korea Maritime Univ., Jinhae (Korea, Republic of)

    2003-03-15

    The objectives of the present work are to carry out a technical evaluation and review of CANDU safety analysis methods in order to assist the development of performance evaluation methods and review guides for the CANDU ECCS. The applicability of PWR ECCS analysis models is examined, and the review suggests that unique data or models for CANDU are required for the following phenomena: break characteristics and flow, frictional pressure drop, post-CHF heat transfer correlations, core flow distribution during blowdown, containment pressure, and reflux rate. For safety analysis of CANDU, either conservative analysis or best estimate (BE) analysis can be used. The main advantage of BE analysis is a more realistic prediction of margins to acceptance criteria; the expectation is that margins demonstrated with BE methods would be larger than when a conservative approach is applied. Some outstanding safety analysis issues can be resolved by demonstrating that accident consequences are more benign than previously predicted. Success criteria for the analysis and review of a large LOCA can be developed by a top-down approach: the highest-level success criteria can be extracted from C-6, and from them the lower-level criteria can be developed step by step, in a logical fashion. The overall objective of the analysis and review is to verify that the radiological consequence and frequency criteria are met.

  12. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    Science.gov (United States)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  13. Development of a methodology for the detection of Ra226 in large volumes of water by gamma spectrometry; modification and validation of the method for detection and quantification of Ra226 in small volumes of water by alpha spectrometry, used by the Centro de Investigacion en Ciencias Atomicas, Nucleares y Moleculares (CICANUM, UCR)

    International Nuclear Information System (INIS)

    Molina Porras, Arnold

    2011-01-01

    The test method for quantifying the specific activity of Ra-226 in water by alpha spectrometry has been validated. CICANUM has used this method as part of the proposed harmonization of methods under ARCAL (IAEA). The method is based on a first separation and preconcentration of Ra-226 by coprecipitation with MnO2 and subsequent microprecipitation as Ba(Ra)SO4. Samples were prepared and the counting was then performed by alpha spectrometry. In parallel, a radium sampling methodology for large volumes of water was tested, using acrylic fibers impregnated with manganese(IV) oxide to determine the amount of Ra-226 present by gamma spectrometry. Small-scale tests determined that the best way to prepare the fiber is the reference method found in the literature, using an oven at 60 degrees Celsius. (author) [es

  14. Comparing Internet Probing Methodologies Through an Analysis of Large Dynamic Graphs

    Science.gov (United States)

    2014-06-01

    Several probing methodologies are compared, including DIMES, IPlane, Ark IPv4 All Prefix /24 and, recently, the NPS probing methodology. The NPS probing methodology is different from the others because it...trace, a history of the forward interface-level path and the time to send and acknowledge are available to analyze. However, traceroute may not return

  15. Development of Testing Methodologies to Evaluate Postflight Locomotor Performance

    Science.gov (United States)

    Mulavara, A. P.; Peters, B. T.; Cohen, H. S.; Richards, J. T.; Miller, C. A.; Brady, R.; Warren, L. E.; Bloomberg, J. J.

    2006-01-01

    Crewmembers experience locomotor and postural instabilities during ambulation on Earth following their return from space flight. Gait training programs designed to facilitate recovery of locomotor function following a transition to a gravitational environment need to be accompanied by relevant assessment methodologies to evaluate their efficacy. The goal of this paper is to demonstrate the operational validity of two tests of locomotor function that were used to evaluate performance after long duration space flight missions on the International Space Station (ISS).

  16. MANAGING LARGE CLASSES IN DEVELOPING COUNTRIES

    African Journals Online (AJOL)

    PROF. BARTH EKWUEME

    GLOBAL JOURNAL OF EDUCATIONAL RESEARCH VOL 15, 2016: 31-39. COPYRIGHT© ... classes or overcrowded classrooms affect the quality of education delivered in the school system. ... central to their national development strategy.

  17. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools that have been most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research was supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of Probabilistic Safety Assessment (PSA), an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased mission problem, a function for common cause failure analysis, a function for uncertainty analysis, a function for common cause failure analysis with uncertainty, and a system for printing out the results of a GO-FLOW analysis in figure or table form. These functions are explained by analyzing sample systems, such as the PWR AFWS and the BWR ECCS. The appendices describe the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in them. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications; with the development of the total system, it has become a powerful tool in a living PSA. (author) 54 refs
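GO-FLOW itself is defined by signal-flow operators that the abstract does not detail, so no attempt is made to reproduce them here. Purely as a generic illustration of success-oriented evaluation combined with a simple beta-factor common-cause treatment (our assumption, not the paper's model), one might write:

```python
def redundant_success(p_success, n_trains, beta=0.0):
    # success probability of n redundant trains when a fraction beta of each
    # train's failure probability is a shared (common-cause) failure that
    # defeats all trains at once
    q = 1.0 - p_success
    q_ccf = beta * q              # common-cause part, shared by all trains
    q_indep = (1.0 - beta) * q    # independent part, defeated by redundancy
    return (1.0 - q_ccf) * (1.0 - q_indep ** n_trains)
```

The common-cause term is what caps the benefit of adding trains: with beta > 0, the system success probability cannot exceed 1 - beta*q no matter how large n_trains becomes.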

  18. Large area electron beam diode development

    International Nuclear Information System (INIS)

    Helava, H.; Gilman, C.M.; Stringfield, R.M.; Young, T.

    1983-01-01

    A large-area annular electron beam diode has been tested at Physics International Co. on the multi-terawatt PITHON generator. A twelve-element post-hole convolute converted the coaxial MITL into a triaxial arrangement of anode current return structures both inside and outside the cathode structure. The presence of both inner and outer current return paths provides magnetic pressure balance for the beam, as determined by diode current measurements. X-ray pinhole photographs indicated uniform emission with intensity maxima between the post positions. Current losses in the post-hole region were negligible, as evidenced by the absence of damage to the aluminum hardware. Radial electron flow near the cathode ring, however, did damage the inner anode cylinder between the post positions; cutting away these regions prevented further damage to the transmission lines.

  19. A Comparison of Various Software Development Methodologies: Feasibility and Methods of Integration

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2016-12-01

    Full Text Available System development methodologies that have been used in academic and commercial environments during the last two decades have advantages and disadvantages. Researchers have tried to identify the objectives, scope, etc. of the methodologies by following different approaches, each of which has its limitations, specific interests, coverage, etc. In this paper, we perform a comparative study of those methodologies that are popular and commonly used in banking and commercial environments. We tried in our study to determine the objectives, scope, tools and other features of the methodologies. We also tried to determine how, and to what extent, the methodologies incorporate facilities such as project management, cost-benefit analysis, and documentation. One of the most important aspects of our study was how to integrate the methodologies and develop a global methodology that covers the complete span of the software development life cycle. A prototype system that integrates the selected methodologies has been developed. The developed system helps analysts and designers choose suitable tools or obtain guidelines on what to do in a particular situation. The prototype system was tested during the development of software for an ATM (Automated Teller Machine) by selecting and applying the SASD methodology during software development. This resulted in the development of a high-quality and well-documented software system.

  20. Large wind turbine development in Europe

    Energy Technology Data Exchange (ETDEWEB)

    Zervos, A. [Center for Renewable Energy Sources, Attikis (Greece)

    1996-12-31

    During the last few years we have witnessed in Europe the development of a new generation of wind turbines in the 1000-1500 kW range. They are presently being tested and are scheduled to reach the market in late 1996 or early 1997. The European Commission has played a key role by funding the research leading to the development of these turbines. The most visible initiative at present is the WEGA program: the development, together with Europe's leading wind industry players, of a new generation of turbines in the MW range. By 1997, European manufacturers will have introduced almost a dozen new MW machine types to the international market, half of them rated at 1.5 MW. 3 refs., 3 tabs.

  1. The Helicobacter Eradication Aspirin Trial (HEAT: A Large Simple Randomised Controlled Trial Using Novel Methodology in Primary Care

    Directory of Open Access Journals (Sweden)

    Jennifer S. Dumbleton

    2015-09-01

    Discussion: HEAT is important medically, because aspirin is so widely used, and methodologically, as a successful trial would show that large-scale studies of important clinical outcomes can be conducted at a fraction of the cost of those conducted by industry, which in turn will help to ensure that trials of primarily medical rather than commercial interest can be conducted successfully in the UK.

  2. APPLICATION OF METHODOLOGY OF STRATEGIC PLANNING IN DEVELOPING NATIONAL PROGRAMMES ON DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Inna NOVAK

    2015-07-01

    Full Text Available Actuality: The main purpose of strategic planning is that the long-term interests of sustainable development of a market economy require the use of effective measures of state regulation of economic and social processes. Objective: The aim of the article is to analyze the development of strategic planning methodology and the practical experience of its application in the design of national development programs. Methods: In writing the article, the following research methods were used: analysis and synthesis, target-oriented and monographic. Results: In Ukraine, strategies for the development of branches, regions, cities, etc. are being developed at the level of state and local government authorities, but, given the lack of state funding, a unified investment strategy for the country has not been developed. After analyzing the development of strategic planning methodology and examples of its application in the design of state development programs, we identified the need to develop an investment strategy for the state (sectors, regions, etc.), since, through defined directions and guidelines of activity, it will increase the level of investment in the country and support the national strategy "Ukraine-2020".

  3. Engendering Development: Some Methodological Perspectives on Child Labour

    Directory of Open Access Journals (Sweden)

    Erica Burman

    2006-01-01

    Full Text Available In this article I address when and why it is useful to focus on gender in the design and conceptualisation of developmental psychological research. Since methodological debates treated in the abstract tend to lack both the specificity and rigour that application to a particular context or topic imports, I take a particular focus for my discussion: child labour. In doing so I hope to highlight the analytical and practical gains of bringing gendered agendas alongside, and into, developmental research. While child labour may seem a rather curious topic for a discussion of developmental psychological research practice, this article will show how it indicates with particular clarity issues that mainstream psychological research often occludes or forgets. In particular, I explore the analytical and methodological benefits of exploring the diverse ways gender structures notions of childhood, alongside the developmental commonalities and asymmetries of gender and age as categories. I suggest that the usual assumed elision between women and children is often unhelpful for both women and children. Instead, an analytical attention to the shifting forms and relations of children's work facilitates more differentiated perspectives on how its meanings reflect economic and cultural (including gendered) conditions, and so attends better to social inequalities. These inequalities also structure the methodological conditions and paradigms for research with children, and so the article finishes by elaborating from this discussion of child labour four key principles for engendering psychological research with and about children, which also have broader implications for conceptualisations of the relations between gender, childhood, culture and families. URN: urn:nbn:de:0114-fqs060111

  4. The Methodology of Management for Long Term Energy Efficiency Development

    International Nuclear Information System (INIS)

    Zebergs, V.; Kehris, O.; Savickis, J.; Zeltins, N.

    2010-01-01

    The paper shows that the Member States of the European Union (EU) are doing what they can to accelerate improvements in energy efficiency (EE). In each EU Member State, investigations are conducted into planning and management methods with a view to achieving faster and greater EE gains. In Latvia, which imports almost 70% of the total energy resources consumed, saving each tonne of oil equivalent ('toe') is of great importance. The adaptation of general policy assessment methodology for the planning and management of the EE process is being studied. Twelve EE management methods have been analysed and recommendations worked out for the introduction of several of the most topical methods. (author)

  5. Development of new assessment methodology for locally corroded pipe

    International Nuclear Information System (INIS)

    Lim, Hwan; Shim, Do Jun; Kim, Yun Jae; Kim, Young Jin

    2002-01-01

    In this paper, a unified methodology based on the local stress concept is proposed to estimate the residual strength of locally wall-thinned pipes. The underlying idea of the proposed methodology is that the local stress in the minimum section of a locally thinned pipe is related to the reference stress popularly used in creep problems. The problem then reduces to defining the reference stress, that is, the reference load. Extensive three-dimensional finite element (FE) analyses were performed to simulate full-scale pipe tests conducted for various shapes of wall-thinned area under internal pressure and bending moment. Based on these FE results, a reference load is proposed that is independent of the material. A natural outcome of this method is the maximum load capacity. By comparison with existing test results, it is shown that the reference stress is related to the fracture stress, which in turn can be posed as the fracture criterion of locally thinned pipes. The proposed method is powerful, as it can easily be generalised to more complex problems such as pipe bends and tee-joints.
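The reference stress idea can be made concrete with the standard relation sigma_ref = sigma_y * P / P_L; the functions below are our paraphrase of that textbook relation, not the paper's proposed material-independent reference load, whose form is not given in the abstract:

```python
def reference_stress(applied_load, reference_load, yield_stress):
    # sigma_ref = sigma_y * P / P_L: the applied load P scaled by the
    # geometry-dependent reference (limit) load P_L of the thinned section
    return yield_stress * applied_load / reference_load

def maximum_load(reference_load, fracture_stress, yield_stress):
    # invert the relation at the point where the reference stress reaches
    # the fracture (flow) stress to obtain the maximum load capacity
    return reference_load * fracture_stress / yield_stress
```

All of the geometry dependence (thinning depth, length, circumferential extent) is packed into the reference load, which is why proposing a material-independent form for it is the core of the methodology.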

  6. Development of Risk Assessment Methodology for State's Nuclear Security Regime

    International Nuclear Information System (INIS)

    Jang, Sung Soon; Seo, Hyung Min; Lee, Jung Ho; Kwak, Sung Woo

    2011-01-01

    Threats of nuclear terrorism have increased since the 9/11 terrorist attack. Threats include a nuclear explosive device (NED) made by terrorist groups, radiological damage caused by sabotage aimed at nuclear facilities, and a radiological dispersion device (RDD), also called a 'dirty bomb'. In 9/11, Al Qaeda planned to cause radiological consequences by crashing a captured airplane into a nuclear power plant. Evidence of a dirty bomb experiment was found in Afghanistan by the UK intelligence agency. Thus, the international community, including the IAEA, is making substantial efforts. The leaders of 47 nations attended the 2010 nuclear security summit hosted by President Obama, and the next global nuclear summit will be held in Seoul in 2012. Most states have established and maintain a state nuclear security regime because of the increasing threat and their international obligations. However, each state's nuclear security regime is different and depends on the state's environment. A methodology for the assessment of a state's nuclear security regime is necessary to design and implement an efficient regime and to identify weak points. The IAEA's INPRO project suggests a checklist method for a state's nuclear security regime. The IAEA is now researching more quantitative methods in cooperation with several countries, including Korea. In this abstract, methodologies to evaluate a state's nuclear security regime by risk assessment are addressed.

  7. The development of a safety analysis methodology for the optimized power reactor 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun; Yo-Han, Kim

    2005-01-01

    Korea Electric Power Research Institute (KEPRI) has been developing in-house safety analysis methodologies based on the codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for Westinghouse 3-loop plants by the Korean regulatory organization, and a project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. For non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)

  8. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  9. The use and effectiveness of information system development methodologies in health information systems / Pieter Wynand Conradie.

    OpenAIRE

    Conradie, Pieter Wynand

    2010-01-01

    Abstract The main focus of this study is the identification of factors influencing the use and effectiveness of information system development methodologies (i.e., systems development methodologies) in health information systems. In essence, it can be viewed as exploratory research, utilizing a conceptual research model to investigate the relationships among the hypothesised factors. More specifically, classified as behavioural science, it combines two theoretical models, namely...

  10. Development of a Methodology for VHTR Accident Consequence Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joeun; Kim, Jintae; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-05-15

    Substituting the VHTR for the burning of fossil fuels conserves hydrocarbon resources for other uses and eliminates greenhouse gas emissions. In Korea, for these reasons, a plan to construct a VHTR for hydrogen production is in progress. In this study, the consequence analysis for off-site releases of radioactive materials during severe accidents has been performed using Level 3 PRA technology. The off-site consequence analysis for a VHTR has been performed using the MACCS code. Since passive systems such as the RCCS (Reactor Cavity Cooling System) are equipped, the frequency of occurrence of accidents has been evaluated to be very low. For further study, an assessment of the characteristics of the VHTR safety systems and a more precise quantification of its accident scenarios are expected to yield a more reliable consequence analysis. The methodology shown in this study may contribute to enhancing the safety of the VHTR design; the results show a far lower effect on the environment than for LWRs.

  11. Development of a methodology for classifying software errors

    Science.gov (United States)

    Gerhart, S. L.

    1976-01-01

    A mathematical formalization of the intuition behind classification of software errors is devised and then extended to a classification discipline: Every classification scheme should have an easily discernible mathematical structure and certain properties of the scheme should be decidable (although whether or not these properties hold is relative to the intended use of the scheme). Classification of errors then becomes an iterative process of generalization from actual errors to terms defining the errors together with adjustment of definitions according to the classification discipline. Alternatively, whenever possible, small scale models may be built to give more substance to the definitions. The classification discipline and the difficulties of definition are illustrated by examples of classification schemes from the literature and a new study of observed errors in published papers of programming methodologies.

  12. Towards a general object-oriented software development methodology

    Science.gov (United States)

    Seidewitz, ED; Stark, Mike

    1986-01-01

    Object diagrams were used to design a 5,000-statement team training exercise and to design the entire dynamics simulator. The object diagrams are also being used to design another 50,000-statement Ada system and a personal-computer-based system that will be written in Modula-2. The design methodology evolves out of these experiences as well as the limitations of the other methods that were studied. Object diagrams, abstraction analysis, and associated principles provide a unified framework which encompasses concepts from Yourdon, Booch, and Cherry. This general object-oriented approach handles high-level system design, possibly with concurrency, through object-oriented decomposition down to a completely functional level. How object-oriented concepts can be used in other phases of the software life cycle, such as specification and testing, is being studied concurrently.

  13. Development of a methodology for life cycle building energy ratings

    International Nuclear Information System (INIS)

    Hernandez, Patxi; Kenny, Paul

    2011-01-01

    Traditionally the majority of building energy use has been linked to its operation (heating, cooling, lighting, etc.), and much attention has been directed to reducing this energy use through technical innovation, regulatory control and a wide range of rating methods. However, buildings generally employ an increasing amount of materials and systems to reduce the energy use in operation, and the energy embodied in these can constitute an important part of the building's life cycle energy use. For buildings with 'zero-energy' use in operation, the embodied energy is indeed the only life cycle energy use. This is not addressed by current building energy assessment and rating methods. This paper proposes a methodology to extend building energy assessment and rating methods to account for the embodied energy of building components and systems. The methodology is applied to the EU Building Energy Rating method and, as an illustration, as implemented in Irish domestic buildings. A case study dwelling is used to illustrate the importance of embodied energy on life cycle energy performance, particularly relevant when energy use in operation tends to zero. The use of the Net Energy Ratio as an indicator to select appropriate building improvement measures is also presented and discussed. Highlights: The definitions of 'zero energy buildings' and current building energy ratings are examined. There is a need to integrate a life cycle perspective within building energy ratings. A life cycle building energy rating method (LC-BER), including embodied energy, is presented. The Net Energy Ratio is proposed as an indicator to select building energy improvement options.
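The accounting the abstract describes (operational energy plus embodied energy spread over the service life, and a Net Energy Ratio for improvement measures) can be sketched as follows; the 50-year life and all numbers are illustrative assumptions, not values from the paper:

```python
def life_cycle_energy(operational_kwh_per_yr, embodied_kwh, lifetime_yr=50):
    # annualised life cycle energy use: operation plus embodied energy
    # spread over an assumed service life
    return operational_kwh_per_yr + embodied_kwh / lifetime_yr

def net_energy_ratio(annual_saving_kwh, embodied_kwh, lifetime_yr=50):
    # NER > 1: the measure saves more energy over its life than it embodies
    return annual_saving_kwh * lifetime_yr / embodied_kwh
```

For a building with zero operational energy use, the first function reduces to the annualised embodied energy alone, which is exactly the case the paper argues current rating methods ignore.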

  14. Development of advanced methodology for defect assessment in FBR power plants

    International Nuclear Information System (INIS)

    Meshii, Toshiyuki; Asayama, Tai

    2001-03-01

    In preparation for developing an FBR post-construction code, (a) JSME Code NA1-2000 was reviewed from the standpoint of applying it to FBR power plants, and the methodologies necessary for defect assessment in FBR plants were identified; and (b) a large-capacity, high-speed fatigue crack propagation (FCP) testing system was developed and data were acquired to evaluate the FCP characteristics under thermal stresses. The results showed that extended research on the following items is necessary for developing an FBR post-construction code. (1) Development of assessment for multiple defects due to creep damage. Multiple defects due to creep damage are not considered in the existing code, which was established for nuclear power plants in service below the negligible-creep temperature; a method to assess the integrity of such multiple defects is therefore necessary. (2) FCP resistance for small loads. Since the components of FBR power plants are designed to minimize thermal stresses, the accuracy of the FCP resistance for small loads is important for accurately estimating crack propagation under thermal stresses. However, sufficient FCP data for small loads are not available, perhaps because acquiring such data is time-consuming. We therefore developed a large-capacity, high-speed FCP testing system, prepared a guideline for accelerated testing, and acquired data to meet these needs. Continued efforts to accumulate small-load FCP data for various materials are necessary. (author)

  15. Pro website development and operations streamlining DevOps for large-scale websites

    CERN Document Server

    Sacks, Matthew

    2012-01-01

    Pro Website Development and Operations gives you the experience you need to create and operate a large-scale production website. Large-scale websites have their own unique set of problems regarding their design: problems that can get worse when agile methodologies are adopted for rapid results. Managing large-scale websites, deploying applications, and ensuring they are performing well often requires a full-scale team involving the development and operations sides of the company, two departments that don't always see eye to eye. When departments struggle with each other, it adds unnecessary complexity

  16. Research Proposal: Methodology for Assessment Frameworks in Large-scale Infrastructural Water Projects

    NARCIS (Netherlands)

    Hommes, Saskia

    2005-01-01

    Water management is a central and ongoing issue in the Netherlands. Large infrastructural projects are being carried out and planned in a number of water systems. These initiatives operate within a complex web of interactions, between short- and long-term, economic costs and benefits, technical

  17. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  18. Theoretical and methodological basis of the comparative historical and legal method development

    Directory of Open Access Journals (Sweden)

    Д. А. Шигаль

    2015-05-01

    Full Text Available Problem setting. The development of any scientific method is always both a question of its structural and functional characteristics and of its place in the system of scientific methods, as well as of the practicability of such methodological work. This paper attempts to give a detailed response to the major comments and objections arising with respect to the separation of the comparative historical and legal method as an independent means of special scientific knowledge. Recent research and publications analysis. Analyzing research and publications within the theme of this scientific article, it should be noted that attention to the methodological issues of both general and legal science was in its time paid by such prominent foreign and domestic scholars as I. D. Andreev, Yu. Ya. Baskin, O. L. Bygych, M. A. Damirli, V. V. Ivanov, I. D. Koval'chenko, V. F. Kolomyitsev, D. V. Lukyanov, L. A. Luts, J. Maida, B. G. Mogilnytsky, N. M. Onishchenko, N. M. Parkhomenko, O. V. Petryshyn, S. P. Pogrebnyak, V. I. Synaisky, V. M. Syryh, O. F. Skakun, A. O. Tille, D. I. Feldman and others. It should be noted that, despite a large number of scientific papers in this field, the interest of the research community in the methodology of the science of the history of state and law still unfairly remains very low. Paper objective. The purpose of this scientific paper is the theoretical and methodological rationale for the need to separate and develop the comparative historical and legal method, in the form of answers to the most common questions and objections that arise in the scientific community in this regard. Paper main body. The development of comparative historical and legal means of knowledge is quite justified because it meets the requirements of scientific method efficiency, whose criteria are the speed of achieving the research goal, the ease of use of one or another way of scientific knowledge, the universality of research methods, the convenience of the techniques that are used and so on. Combining the

  19. Development of Six Sigma methodology for CNC milling process improvements

    Science.gov (United States)

    Ismail, M. N.; Rose, A. N. M.; Mohammed, N. Z.; Rashid, M. F. F. Ab

    2017-10-01

    Quality and productivity play an important role in any organization, especially in manufacturing sectors seeking more profit, as they lead to the success of a company. This paper reports a work improvement project at Kolej Kemahiran Tinggi MARA Kuantan. It involves problem identification in the production of the "Khufi" product and the proposal of an effective framework to improve the current situation. Based on observation and data collection on the work-in-progress (WIP) product, the major problem was identified as relating to the function of the product: the parts cannot be assembled properly because a dimension of the product is out of specification. Six Sigma was used as the methodology to study and resolve the identified problems. Six Sigma is a highly statistical, data-driven approach to solving complex business problems. It uses a methodical five-phase approach of define, measure, analyse, improve and control (DMAIC) to help understand the process and the variables that affect it so that the processes can be optimized. Finally, the root cause of and solution to the "Khufi" production problem were identified and implemented, after which the product successfully met the fitting specification.
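
    The "measure" phase of DMAIC described above typically converts defect counts into a process sigma level. A minimal sketch, with hypothetical defect numbers and the conventional 1.5-sigma shift (not figures from the "Khufi" study):

```python
# Minimal sketch of the DMAIC "measure" step: convert observed defect
# counts into DPMO (defects per million opportunities) and an approximate
# short-term sigma level.  The 1.5-sigma shift is the conventional
# adjustment; the defect numbers below are hypothetical.
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit=1):
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value):
    # Long-term yield converted to a short-term sigma via the 1.5 shift.
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + 1.5

d = dpmo(defects=12, units=500)     # 12/500 -> 24,000 DPMO
print(round(d), round(sigma_level(d), 2))
```

    Tracking the sigma level before and after the "improve" phase is one way to quantify whether a change such as the fitting-dimension fix actually worked.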

  20. Development of methodology of financial assets accounting in IFRS context

    Directory of Open Access Journals (Sweden)

    V.I. Tsurkanu

    2018-04-01

    Full Text Available In the innovation economy the proportion of resources directed to investment is significantly increasing, and investment therefore becomes an integral part of the economic activities of modern organizations. In this situation organizations acquire another type of asset, called financial assets, which differ in their characteristics from tangible and intangible assets. The authors of the present study first prove the need for an economic interpretation of financial assets and for allocating them their own positions in the balance sheet; after recognition, such assets should be measured for accounting and reporting on the basis of their characteristics. In this context, we reveal the methods that organizations can choose, using the business management models implemented by IFRS 9 «Financial Instruments», for the evaluation of financial assets depending on their category. Special attention is paid to improving the methodology of accounting for financial assets in accordance with the specific characteristics of their recognition and measurement. These issues are investigated not only in theoretical terms but also by comparing the normative and legislative acts of the Republic of Moldova and Ukraine with the regulations of IFRS. In addition, since the accounting and financial reporting systems of these countries are changing in accordance with the requirements of Directive 2013/34/EU, its impact on the accounting of financial assets is also taken into account. The conclusions and suggestions drawn from the research are theoretical in nature and of practical importance.

  1. PLANNING QUALITY ASSURANCE PROCESSES IN A LARGE SCALE GEOGRAPHICALLY SPREAD HYBRID SOFTWARE DEVELOPMENT PROJECT

    Directory of Open Access Journals (Sweden)

    Святослав Аркадійович МУРАВЕЦЬКИЙ

    2016-02-01

    Full Text Available Key points of operational activities in large-scale, geographically spread software development projects are discussed, and the required structure of QA processes in such projects is examined. Up-to-date methods for integrating quality assurance processes into software development processes are given. Existing groups of software development methodologies are reviewed: sequential, agile and PRINCE2-based, with a condensed overview of the quality assurance processes in each group. Common challenges that sequential and agile models face in the case of a large, geographically spread hybrid software development project are reviewed, and recommendations are given for tackling those challenges. Conclusions are drawn about the choice of the best methodology and its application to the particular project.

  2. Application of a methodology for the development and validation of reliable process control software

    International Nuclear Information System (INIS)

    Ramamoorthy, C.V.; Mok, Y.R.; Bastani, F.B.; Chin, G.

    1980-01-01

    The necessity of a good methodology for the development of reliable software, especially with respect to the final software validation and testing activities, is discussed. A formal specification development and validation methodology is proposed. This methodology has been applied to the development and validation of pilot software incorporating typical features of critical software for nuclear power plant safety protection. The main features of the approach include the use of a formal specification language and the independent development of two sets of specifications. 1 ref

  3. Development of margin assessment methodology of decay heat removal function against external hazards. (2) Tornado PRA methodology

    International Nuclear Information System (INIS)

    Nishino, Hiroyuki; Kurisaka, Kenichi; Yamano, Hidemasa

    2014-01-01

    Probabilistic Risk Assessment (PRA) for external events has been recognized as an important safety assessment method after the TEPCO Fukushima Daiichi nuclear power station accident. PRA should be performed not only for earthquakes and tsunamis, which are especially key events in Japan; the PRA methodology should also be developed for other external hazards (e.g. tornadoes). In this study, such a methodology was developed for Sodium-cooled Fast Reactors, taking into account that the ambient air is their final heat sink for removing decay heat under accident conditions. First, a tornado hazard curve was estimated using data recorded in Japan. Second, structures and components important for decay heat removal were identified, and an event tree resulting in core damage was developed in terms of wind load and missiles (i.e. steel pipes, boards and cars) generated by a tornado. The main damage cause for the important structures and components is the missiles; the tornado missiles that can reach components and structures placed at high elevations were identified, and the failure probabilities of the components and structures against the tornado missiles were calculated as the product of two probabilities: the probability that a missile enters the intake or outtake of the decay heat removal system, and the probability of failure caused by the missile impact. Finally, the event tree was quantified. As a result, the core damage frequency was well below 10⁻¹⁰/ry. (author)
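
    The quantification described above, a failure probability built as the product of an entry probability and an impact-failure probability, fed with a hazard frequency through an event tree, can be sketched as follows. All numbers are hypothetical, and independence of redundant trains is assumed only for illustration:

```python
# Hedged sketch of the tornado event-tree quantification: core damage
# frequency = hazard frequency x conditional failure probabilities, where
# each component failure probability is the product of (a) the probability
# a missile enters the intake/outtake and (b) the probability the impact
# causes failure.  All numbers are hypothetical.

def component_failure_prob(p_enter, p_fail_given_hit):
    return p_enter * p_fail_given_hit

def core_damage_frequency(hazard_freq, train_failure_probs):
    # Simplest event tree: core damage requires every redundant
    # decay-heat-removal train to fail (independence assumed).
    p_all_fail = 1.0
    for p in train_failure_probs:
        p_all_fail *= p
    return hazard_freq * p_all_fail

trains = [component_failure_prob(1e-2, 1e-1) for _ in range(3)]  # 1e-3 each
cdf = core_damage_frequency(hazard_freq=1e-5, train_failure_probs=trains)
# On the order of 1e-14 per reactor-year: consistent with a result
# reported as "well below 1e-10/ry".
assert 0 < cdf < 1e-10
```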

  4. Quality assurance in a large research and development laboratory

    International Nuclear Information System (INIS)

    Neill, F.H.

    1980-01-01

    Developing a quality assurance program for a large research and development laboratory provided a unique opportunity for innovative planning. The quality assurance program that emerged has been tailored to meet the requirements of several sponsoring organizations and contains the flexibility for experimental programs ranging from large engineering-scale development projects to bench-scale basic research programs

  5. Demand bidding construction for a large consumer through a hybrid IGDT-probability methodology

    International Nuclear Information System (INIS)

    Zare, Kazem; Moghaddam, Mohsen Parsa; Sheikh El Eslami, Mohammad Kazem

    2010-01-01

    This paper provides a technique to derive the day-ahead market bidding strategy for a large consumer that procures its electricity demand in both the day-ahead market and a subsequent adjustment market. Hourly market prices are considered to be normally distributed, with their correlation modeled by a variance-covariance matrix. The uncertainty of the procurement cost is modeled using concepts derived from information gap decision theory, which allows deriving bidding strategies that are robust with respect to price volatility. The First Order Reliability Method is applied to construct the robust bidding curve. The proposed technique is illustrated through a realistic case study. (author)
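
    The price model in the abstract, hourly prices that are jointly normal with correlation encoded in a variance-covariance matrix, can be sampled via a Cholesky factor. A minimal two-hour sketch with hypothetical means, variances and correlation:

```python
# Sketch of the price model above: hourly day-ahead prices drawn from a
# multivariate normal whose correlation is given by a variance-covariance
# matrix.  Sampling uses a Cholesky factor; the means, variances and
# correlation value are hypothetical.
import math
import random

def cholesky2(cov):
    # 2x2 Cholesky factorisation, enough for a two-hour illustration.
    l11 = math.sqrt(cov[0][0])
    l21 = cov[1][0] / l11
    l22 = math.sqrt(cov[1][1] - l21 ** 2)
    return [[l11, 0.0], [l21, l22]]

def sample_prices(mean, cov, rng):
    L = cholesky2(cov)
    z = [rng.gauss(0, 1), rng.gauss(0, 1)]
    return [mean[i] + L[i][0] * z[0] + L[i][1] * z[1] for i in range(2)]

rng = random.Random(42)
mean = [50.0, 55.0]                # EUR/MWh for two hours (hypothetical)
cov = [[16.0, 9.6], [9.6, 16.0]]   # std 4 each hour, correlation 0.6
draws = [sample_prices(mean, cov, rng) for _ in range(20000)]
m0 = sum(p[0] for p in draws) / len(draws)
assert abs(m0 - 50.0) < 0.2        # sample mean near the model mean
```

    A bidding-strategy study would evaluate procurement cost over such draws; the robust-curve construction itself (IGDT plus FORM) is beyond this sketch.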

  6. Development of performance assessment methodology for nuclear waste isolation in geologic media

    International Nuclear Information System (INIS)

    Bonano, E.J.; Chu, M.S.Y.; Cranwell, R.M.; Davis, P.A.

    1986-01-01

    The analysis of the processes involved in the burial of nuclear wastes can be performed only with reliable mathematical models and computer codes, as opposed to experiments, because the associated time scales are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium: ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and the health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the US Nuclear Regulatory Commission

  7. Chapter 43: Assessment of NE Greenland: Prototype for development of Circum-Arctic Resource Appraisal methodology

    Science.gov (United States)

    Gautier, D.L.; Stemmerik, L.; Christiansen, F.G.; Sorensen, K.; Bidstrup, T.; Bojesen-Koefoed, J. A.; Bird, K.J.; Charpentier, R.R.; Houseknecht, D.W.; Klett, T.R.; Schenk, C.J.; Tennyson, Marilyn E.

    2011-01-01

    Geological features of NE Greenland suggest large petroleum potential, as well as high uncertainty and risk. The area was the prototype for development of methodology used in the US Geological Survey (USGS) Circum-Arctic Resource Appraisal (CARA), and was the first area evaluated. In collaboration with the Geological Survey of Denmark and Greenland (GEUS), eight "assessment units" (AU) were defined, six of which were probabilistically assessed. The most prospective areas are offshore in the Danmarkshavn Basin. This study supersedes a previous USGS assessment, from which it differs in several important respects: oil estimates are reduced and natural gas estimates are increased to reflect revised understanding of offshore geology. Despite the reduced estimates, the CARA indicates that NE Greenland may be an important future petroleum province. © 2011 The Geological Society of London.

  8. METHODOLOGICAL ASPECTS OF RURAL DEVELOPMENT GOVERNANCE CASE STUDY

    Directory of Open Access Journals (Sweden)

    Vitalina TSYBULYAK

    2014-01-01

    Full Text Available The article discusses current approaches to assessing rural development governance and reveals their advantages and disadvantages. It also presents a system of performance indicators for the governance process built on two elements: assessment of the dynamics of rural development (economy, finance, social sphere, ecology and population health) and of the management process itself (assessment of the strategic plan or concept of development, the program of socio-economic development of rural areas, and the current activity of local authorities in particular). Moreover, it suggests a typology of approaches (objective (evolutionary), command-and-control, economic (infrastructural), complex, and qualitative) to defining the essence of the process of rural development governance, and a correlation of the traditional functions performed by the subjects of that process (state authorities, local authorities, economic entities, and the community). Adjusting the traditional functions performed by the subjects of local development governance and supplementing them with new ones relevant to the present-day model of «shared governance» is an important element of the analysis of assessment tools for the effectiveness of rural development governance. In addition, the author identifies two forms of rural population involvement in rural development governance: active and passive. The active form means that the rural population participates in making and implementing governance decisions (public meetings, organization of social discussions, and development of territorial community self-governance); the passive form means that the emphasis is placed only on distributing information among the population (meetings with members of parliament, direct phone lines with territory governors, publication of normative and legal acts and of reports on budget execution

  9. Integration of the Scrum methodology in mechatronic product development

    OpenAIRE

    Mauri Also, Joan Josep

    2015-01-01

    The purpose of this study was to determine whether it would be possible for a mechatronic product development team to use Scrum, an Agile development framework, with both the students of UVIC-UCC and the company ITQ GmbH behind the student project called Mi5. The Agile philosophy and methods have revolutionized the software development industry in the last decade, and it was therefore of interest to see whether this new way of working would be applicable in other disciplines. Thus, the study focu...

  10. Proposal for product development model focused on ce certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro

    2015-09-01

    Full Text Available This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands of Product Certification of the European Community (CE). Furthermore, it presents a product development model comprising the steps of the models analyzed, including improved activities for the referred product certification. The proposed improvements are justified by the growing quest for the internationalization of products and processes within companies.

  11. Learning challenges and sustainable development: A methodological perspective.

    Science.gov (United States)

    Seppänen, Laura

    2017-01-01

    Sustainable development requires learning, but the contents of learning are often complex and ambiguous. This requires new integrated approaches from research. It is argued that investigation of people's learning challenges in every-day work is beneficial for research on sustainable development. The aim of the paper is to describe a research method for examining learning challenges in promoting sustainable development. This method is illustrated with a case example from organic vegetable farming in Finland. The method, based on Activity Theory, combines historical analysis with qualitative analysis of need expressions in discourse data. The method linking local and subjective need expressions with general historical analysis is a promising way to overcome the gap between the individual and society, so much needed in research for sustainable development. Dialectically informed historical frameworks have practical value as tools in collaborative negotiations and participatory designs for sustainable development. The simultaneous use of systemic and subjective perspectives allows researchers to manage the complexity of practical work activities and to avoid too simplistic presumptions about sustainable development.

  12. The application of post yield fracture methodology to the evaluation of large structures

    International Nuclear Information System (INIS)

    Landes, J.D.

    1979-01-01

    The objective of this work is to determine how to use small-specimen test results to measure fracture toughness values for application to the evaluation of large structural components. Linear elastic fracture mechanics concepts based on the crack tip stress intensity factor, K, have been extended into the post-yield regime by the use of elastic-plastic characterizing parameters such as the J integral and COD. One of the primary applications of this technology is the determination of fracture toughness values from small-specimen tests, taken primarily in the post-yield regime, which can be used to evaluate structures operating in an essentially linear elastic regime. The fracture toughness values may be either conservative or unconservative depending on the fracture mode; extreme care must be taken in interpreting these results. (orig.)

  13. SED fitting with MCMC: methodology and application to large galaxy surveys

    Science.gov (United States)

    Acquaviva, Viviana; Gawiser, Eric; Guaita, Lucia

    2012-08-01

    We present GalMC (Acquaviva et al. 2011), our publicly available Markov Chain Monte Carlo algorithm for SED fitting, show the results obtained for a stacked sample of Lyman Alpha Emitting galaxies at z ~ 3, and discuss the dependence of the inferred SED parameters on the assumptions made in modeling the stellar populations. We also introduce SpeedyMC, a version of GalMC based on interpolation of pre-computed template libraries. While the flexibility and the number of SED fitting parameters are reduced with respect to GalMC, the average running time decreases by a factor of 20,000, enabling SED fitting of each galaxy in about one second on a 2.2 GHz MacBook Pro laptop and making SpeedyMC the ideal instrument to analyze data from large photometric galaxy surveys.
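
    The Markov Chain Monte Carlo machinery behind GalMC can be illustrated with a minimal random-walk Metropolis sampler. This toy one-parameter "fit" with a Gaussian likelihood is only a sketch of the general technique, not GalMC's actual stellar-population model:

```python
# Minimal Metropolis sketch of the MCMC idea: random-walk proposals
# accepted with probability min(1, L'/L).  A toy one-parameter Gaussian
# likelihood stands in for a real SED model; all numbers are illustrative.
import math
import random

def log_like(theta, data, sigma=1.0):
    return -sum((d - theta) ** 2 for d in data) / (2 * sigma ** 2)

def metropolis(data, n_steps=20000, step=0.5, seed=1):
    rng = random.Random(seed)
    theta = 0.0
    ll = log_like(theta, data)
    chain = []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0, step)        # random-walk proposal
        ll_prop = log_like(prop, data)
        if math.log(rng.random()) < ll_prop - ll:  # accept/reject in log space
            theta, ll = prop, ll_prop
        chain.append(theta)
    return chain

data = [2.1, 1.9, 2.2, 2.0, 1.8]   # toy "observations" centred on 2.0
chain = metropolis(data)
posterior_mean = sum(chain[5000:]) / len(chain[5000:])  # drop burn-in
assert abs(posterior_mean - 2.0) < 0.3
```

    SpeedyMC's speed-up comes from replacing the expensive likelihood evaluation (here trivial) with interpolation over pre-computed templates, while the accept/reject loop stays the same.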

  14. A Comparative Analysis of Two Software Development Methodologies: Rational Unified Process and Extreme Programming

    Directory of Open Access Journals (Sweden)

    Marcelo Rafael Borth

    2014-01-01

    Full Text Available Software development methodologies were created to meet the great market demand for innovation, productivity, quality and performance. With the use of a methodology, it is possible to reduce the cost, the risk and the development time, and even increase the quality of the final product. This article compares two of these development methodologies: the Rational Unified Process and Extreme Programming. The comparison shows the main differences and similarities between the two approaches, and highlights and comments on some of their predominant features.

  15. Overview on hydrogen risk research and development activities: Methodology and open issues

    Energy Technology Data Exchange (ETDEWEB)

    Bentaib, Ahmed; Meynet, Nicolas; Bleyer, Alexande [Institut de Radioprotection et de Surete Nucleaire (IRSN), Severe Accident Department, Fontenay-aux-Roses (France)

    2015-02-15

    During the course of a severe accident in a light water nuclear reactor, large amounts of hydrogen can be generated and released into the containment during reactor core degradation. Additional burnable gases [hydrogen (H2) and carbon monoxide (CO)] may be released into the containment in the corium/concrete interaction. This could subsequently raise a combustion hazard. As the Fukushima accidents revealed, hydrogen combustion can cause high pressure spikes that could challenge the reactor buildings and lead to failure of the surrounding buildings. To prevent the gas explosion hazard, most mitigation strategies adopted by European countries are based on the implementation of passive autocatalytic recombiners (PARs). Studies of representative accident sequences indicate that, despite the installation of PARs, it is difficult to prevent at all times and locations, the formation of a combustible mixture that potentially leads to local flame acceleration. Complementary research and development (R and D) projects were recently launched to understand better the phenomena associated with the combustion hazard and to address the issues highlighted after the Fukushima Daiichi events such as explosion hazard in the venting system and the potential flammable mixture migration into spaces beyond the primary containment. The expected results will be used to improve the modeling tools and methodology for hydrogen risk assessment and severe accident management guidelines. The present paper aims to present the methodology adopted by Institut de Radioprotection et de Su.

  16. An Evidence Based Methodology to Facilitate Public Library Non-fiction Collection Development

    Directory of Open Access Journals (Sweden)

    Matthew Kelly

    2015-12-01

    Full Text Available Objective – This research was designed as a pilot study to test a methodology for subject-based collection analysis for public libraries. Methods – WorldCat collection data from eight Australian public libraries were extracted using the Collection Evaluation application. The data were aggregated and filtered to assess how the sample's titles could be compared against the OCLC Conspectus subject categories. A hierarchy of emphasis emerged, and this was divided into tiers ranging from 1% of the sample. These tiers were further analysed to quantify their representativeness against both the sample's titles and the subject categories taken as a whole. The interpretive aspect of the study sought to understand the types of knowledge embedded in the tiers and was underpinned by hermeneutic phenomenology. Results – The study revealed that there was a marked tendency for a small percentage of subject categories to constitute a large proportion of the potential topicality that might have been represented in these types of collections. The study also found that the distribution of the aggregated collection conformed to a Power Law distribution (80/20), so that approximately 80% of the collection was represented by 20% of the subject categories. The study also found that there were significant commonalities in the types of subject categories found in the designated tiers and that it may be possible to develop ontologies that correspond to the collection tiers. Conclusions – The evidence-based methodology developed in this pilot study has the potential for further development to help improve the practice of collection development. The introduction of the concept of the epistemic role played by collection tiers is a promising aid to inform our understanding of knowledge organization for public libraries. The research shows a way forward to help link subjective decision making with a scientifically based approach to managing knowledge
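
    The Power Law (80/20) finding above amounts to asking what share of titles the top fifth of subject categories accounts for. A minimal sketch with hypothetical category counts standing in for the aggregated WorldCat data:

```python
# Sketch of the 80/20 check: sort subject categories by title count and
# compute the share of titles held by the top 20% of categories.  The
# counts below are hypothetical stand-ins for the aggregated data.

def top_share(counts, fraction=0.2):
    counts = sorted(counts, reverse=True)
    k = max(1, int(len(counts) * fraction))
    return sum(counts[:k]) / sum(counts)

# A steep, power-law-like distribution of titles over 10 categories.
category_counts = [900, 500, 120, 60, 40, 30, 20, 15, 10, 5]
share = top_share(category_counts)
print(round(share, 2))   # prints 0.82: the top 2 categories hold ~82% of titles
```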

  17. CFD methodology development for Singapore Green Mark Building application

    NARCIS (Netherlands)

    Chiu, P.H.; Raghavan, V.S.G.; Poh, H.J.; Tan, E.; Gabriela, O.; Wong, N.H.; van Hooff, T.; Blocken, B.; Li, R.; Leong-Kok, S.M.

    2017-01-01

    In the recent decade, investigation on the total building performance has become increasingly important for the environmental modelling community. With the advance of integrated design and modelling tool and Building Information Modelling (BIM) development, it is now possible to simulate and predict

  18. Development of a methodology for accident causation research

    Science.gov (United States)

    1983-06-01

    The objective of this study was to fully develop and apply a methodology to study accident causation, which was outlined in a previous study. "Causal" factors are those pre-crash factors which are statistically related to the accident rate...

  19. A Methodology For Developing an Agent Systems Reference Architecture

    Science.gov (United States)

    2010-05-01

    agent frameworks, we create an abstraction noting similarities and differences. The differences are documented as points of variation. The result...situated in the physical environment. Addressing the conceptual components of an agent system is beneficial to agent system architects, developers, and

  20. Advances in Artificial Neural Networks - Methodological Development and Application

    Science.gov (United States)

    Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other ne...

  1. Organizational Culture and Scale Development: Methodological Challenges and Future Directions

    Directory of Open Access Journals (Sweden)

    Bavik Ali

    2014-12-01

    Full Text Available Defining and measuring organizational culture (OC) is of paramount importance to organizations because a strong culture could potentially increase service quality and yield sustainable competitive advantages. However, such a process can be challenging to managers because the scope of OC has been defined differently across disciplines and industries, which has led to the development of various scales for measuring OC. In addition, previously developed OC scales may not be fully applicable in the hospitality and tourism context. Therefore, by highlighting the key factors affecting the business environment and the unique characteristics of the hospitality industry, this paper aims to align the scope of OC closely with the industry and to put forth the need for a new OC scale that accurately responds to the context of the hospitality industry.

  2. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  3. Additional methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Marshall, J.A.; Shore, R.W.; Chay, S.C.; Mazumdar, M.

    1977-03-01

    The project described is motivated by the desire for methods to quantify uncertainties and to identify conservatisms in nuclear power plant safety analysis. The report examines statistical methods useful for assessing the probability distribution of output response from complex nuclear computer codes, considers sensitivity analysis and several other topics, and also sets the path for using the developed methods for realistic assessment of the design basis accident
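
    The core task described above, assessing the probability distribution of output response from a complex computer code, is often approached by Monte Carlo propagation of input uncertainties. A hedged sketch in which a simple algebraic surrogate stands in for a real reactor code:

```python
# Sketch of uncertainty propagation: sample uncertain inputs, run them
# through the model, and examine the output distribution.  The "reactor
# code" here is a hypothetical algebraic surrogate, not a real safety
# code, and all distributions are illustrative.
import random

def surrogate_peak_temp(power, flow):
    # Toy response: peak temperature rises with power, falls with flow.
    return 300.0 + 2.0 * power / flow

rng = random.Random(0)
samples = []
for _ in range(10000):
    power = rng.gauss(100.0, 5.0)   # uncertain input 1 (hypothetical)
    flow = rng.gauss(4.0, 0.2)      # uncertain input 2 (hypothetical)
    samples.append(surrogate_peak_temp(power, flow))

samples.sort()
p95 = samples[int(0.95 * len(samples))]   # 95th-percentile response
mean = sum(samples) / len(samples)
assert mean < p95                          # upper tail exceeds the mean
```

    Comparing such an upper percentile against a conservative bounding calculation is one way to quantify the conservatism the abstract mentions.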

  4. Regular website transformation to mobile friendly methodology development

    OpenAIRE

    Miščenkov, Ilja

    2017-01-01

    Nowadays, the rate of technology improvement grows faster than ever, which results in increased mobile device usage. Internet users often choose to browse their favorite websites via computers as well as mobile devices; however, not every website is suited to be displayed on both types of technology. An example is the website of Vilnius University's Faculty of Mathematics and Informatics. Therefore the objective of this work is to develop a step-by-step procedure which is used to turn a regular websi...

  5. An Eulerian two-phase model for steady sheet flow using large-eddy simulation methodology

    Science.gov (United States)

    Cheng, Zhen; Hsu, Tian-Jian; Chauchat, Julien

    2018-01-01

    A three-dimensional Eulerian two-phase flow model for sediment transport in sheet flow conditions is presented. To resolve turbulence and turbulence-sediment interactions, the large-eddy simulation approach is adopted. Specifically, a dynamic Smagorinsky closure is used for the subgrid fluid and sediment stresses, while the subgrid contribution to the drag force is included using a drift velocity model with a similar dynamic procedure. The contribution of sediment stresses due to intergranular interactions is modeled by the kinetic theory of granular flow at low to intermediate sediment concentration, while at high sediment concentration of enduring contact, a phenomenological closure for particle pressure and frictional viscosity is used. The model is validated with a comprehensive high-resolution dataset of unidirectional steady sheet flow (Revil-Baudard et al., 2015, Journal of Fluid Mechanics, 767, 1-30). At a particle Stokes number of about 10, simulation results indicate a reduced von Kármán coefficient of κ ≈ 0.215 obtained from the fluid velocity profile. A fluid turbulence kinetic energy budget analysis further indicates that the drag-induced turbulence dissipation rate is significant in the sheet flow layer, while in the dilute transport layer, the pressure work plays a similar role as the buoyancy dissipation, which is typically used in the single-phase stratified flow formulation. The present model also reproduces the sheet layer thickness and mobile bed roughness similar to measured data. However, the resulting mobile bed roughness is more than two times larger than that predicted by the empirical formulae. Further analysis suggests that through intermittent turbulent motions near the bed, the resolved sediment Reynolds stress plays a major role in the enhancement of mobile bed roughness. Our analysis on near-bed intermittency also suggests that the turbulent ejection motions are highly correlated with the upward sediment suspension flux, while
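    The reduced von Kármán coefficient reported above is the slope parameter of a log-law fit to the fluid velocity profile, u(z) = (u*/κ) ln(z/z0); a minimal sketch of that extraction on synthetic data (u* taken as known, all values illustrative) is:

    ```python
    import math

    # Synthetic velocity profile generated from a known log-law,
    # u(z) = (u_star / kappa) * ln(z / z0); values are illustrative only.
    u_star, kappa_true, z0 = 0.05, 0.215, 1e-4
    heights = [0.01 * i for i in range(1, 21)]
    velocities = [(u_star / kappa_true) * math.log(z / z0) for z in heights]

    # Least-squares slope of u against ln(z) recovers u*/kappa.
    xs = [math.log(z) for z in heights]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(velocities) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, velocities))
             / sum((x - xbar) ** 2 for x in xs))
    kappa_est = u_star / slope
    ```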

  6. Development and new applications of quantum chemical simulation methodology

    International Nuclear Information System (INIS)

    Weiss, A. K. H.

    2012-01-01

    The Division of Theoretical Chemistry at the University of Innsbruck is focused on the study of chemical compounds in aqueous solution, in terms of mainly hybrid quantum mechanical / molecular mechanical molecular dynamics simulations (QM/MM MD). Besides the standard means of data analysis employed for such simulations, this study presents several advanced and capable algorithms for the description of structural and dynamic properties of the simulated species and its hydration. The first part of this thesis further presents selected exemplary simulations, in particular a comparative study of Formamide and N-methylformamide, Guanidinium, and Urea. An included review article further summarizes the major advances of these studies. The computer programs developed in the course of this thesis are by now well established in the research field. The second part of this study presents the theory and a development guide for a quantum chemical program, QuMuLuS, that is by now used as a QM program for recent QM/MM simulations at the division. In its course, this part presents newly developed algorithms for electron integral evaluation and point charge embedding. This program is validated in terms of benchmark computations. The associated theory is presented on a detailed level, to serve as a source for contemporary and future studies in the division. In the third and final part, further investigations of related topics are addressed. This covers additional schemes of molecular simulation analysis, new software, as well as a mathematical investigation of a non-standard two-electron integral. (author)

  7. Combinations of options: Methodology for impact analysis. Development plan 1993

    International Nuclear Information System (INIS)

    1992-01-01

    The orientations favored by Hydro-Quebec in terms of electricity supply and demand are based on a few key selection criteria. These criteria, as described in its development plan, pertain to economic benefit for the utility and its customers, compatibility with sustainable development, minimization of costs to customers, preservation of the utility's financial health, generation of economic spinoffs, and ease of adaptation. Impacts are calculated to illustrate the selection criteria. The main methods, assumptions, and components used in evaluating the various impacts are described. The discounted overall cost for Hydro-Quebec and all of its customers, means of meeting electricity requirements, and the economic benefit for Hydro-Quebec of the various market development options are discussed. The indicators chosen for environmental impact assessment are set forth and the method used to calculate long-term supply costs is presented, along with the methods for calculating economic spinoffs. Finally, the concepts of energy mix and energy self-sufficiency are outlined. 1 tab

  8. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  9. Methodology Development for Assessment of Spaceport Technology Returns and Risks

    Science.gov (United States)

    Joglekar, Prafulla; Zapata, Edgar

    2001-01-01

    As part of Kennedy Space Center's (KSC's) challenge to open the space frontier, new spaceport technologies must be developed, matured, and successfully transitioned to operational systems. R&D investment decisions can be considered from multiple perspectives. Near-, mid-, and far-term technology horizons must be understood. Because a multitude of technology investment opportunities are available, we must identify choices that promise the greatest likelihood of significant lifecycle benefits. At the same time, the costs and risks of any choice must be well understood and balanced against its potential returns. The problem is not one of simply rank-ordering projects in terms of their desirability. KSC wants to determine a portfolio of projects that simultaneously satisfies multiple goals, such as getting the biggest bang for the buck, supporting projects that may be too risky for private funding, staying within annual budget cycles without foregoing the requirements of a long-term technology vision, and ensuring the development of a diversity of technologies that support the variety of operational functions involved in space transportation. This work aims to assist in the development of methods and techniques that support strategic technology investment decisions and ease the process of determining an optimal portfolio of spaceport R&D investments. The available literature on risks and returns to R&D is reviewed and the most useful pieces are brought to the attention of the Spaceport Technology Development Office (STDO). KSC's current project management procedures are reviewed. It is found that the "one size fits all" nature of KSC's existing procedures and project selection criteria is not conducive to prudent decision-making. Directions for improving KSC's procedures and criteria are outlined. With the help of a contractor, STDO is currently developing a tool, named Change Management Analysis Tool (CMAT)/Portfolio Analysis Tool (PAT), to assist KSC's R&D portfolio determination.
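    The budget-constrained portfolio selection described above can be framed, in its simplest form, as a 0/1 knapsack problem; the project names and dollar figures in this sketch are invented for illustration:

    ```python
    def best_portfolio(projects, budget):
        """Exhaustive 0/1 knapsack: maximize total expected return of the
        chosen projects subject to a budget constraint."""
        n = len(projects)
        best_value, best_set = 0.0, []
        for mask in range(1 << n):
            chosen = [p for i, p in enumerate(projects) if mask >> i & 1]
            cost = sum(p[1] for p in chosen)
            value = sum(p[2] for p in chosen)
            if cost <= budget and value > best_value:
                best_value, best_set = value, chosen
        return best_value, [p[0] for p in best_set]

    # (name, cost in $M, expected lifecycle return in $M) -- invented numbers.
    projects = [("cryo-loading", 4, 10), ("robotic-inspection", 3, 7),
                ("smart-umbilicals", 5, 13), ("telemetry-upgrade", 2, 4)]
    value, chosen = best_portfolio(projects, budget=9)
    ```

    Real portfolio tools add the multi-goal aspects the record mentions (risk, diversity, multi-year budgets) as extra constraints or objectives, but the core trade-off has this shape.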

  10. Developing a methodology for identifying correlations between LERF and early fatality

    International Nuclear Information System (INIS)

    Kang, Kyung Min; Jae, Moo Sung; Ahn, Kwang Il

    2009-01-01

    The correlations between Large Early Release Frequency (LERF) and early fatality need to be investigated for risk-informed application and regulation. RG 1.174 provides decision-making criteria using the measures of CDF and LERF, but no specific criteria on LERF itself. Since off-site consequence calculations involve both large uncertainties and high costs, a LERF assessment methodology needs to be developed and its correlation factor identified for risk-informed decision-making. To this end, a robust method for estimating off-site consequences has been applied to assess the health effects caused by radioisotopes released during severe accidents of nuclear power plants. In addition, the MACCS2 code is used to quantitatively validate source terms with respect to health effects as a function of the release characteristics of radioisotopes during severe accidents. This study developed a method for identifying correlations between LERF and early fatality and validates the results of the model using the MACCS2 code. The results of this study may contribute to defining LERF and finding a measure for risk-informed regulations and risk-informed decision making.

  11. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin as compared with classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, generally there are two kinds of uncertainties required to be identified and quantified, which involve model uncertainties and plant status uncertainties. Particularly, it will take huge effort to systematically quantify individual model uncertainty of a best estimate LOCA code, such as RELAP5 and TRAC. Instead of applying a full ranged BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. Regarding the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while CSAU methodology is applied to quantify the effect of plant status uncertainty on PCT calculation. Generally, DRHM methodology can generate about 80-100 K margin on PCT as compared to Appendix K bounding state LOCA analysis.
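    The statistical treatment of plant-status uncertainties in CSAU-style methodologies is commonly based on Wilks' order-statistic formula for choosing the number of code runs; the abstract does not state the sampling scheme actually used, so the first-order 95/95 criterion below is an assumption for illustration:

    ```python
    def wilks_sample_size(coverage=0.95, confidence=0.95):
        """Smallest N such that the maximum of N random code runs bounds
        the `coverage` quantile of the output (e.g. PCT) with at least
        `confidence` (first-order Wilks): 1 - coverage**N >= confidence."""
        n = 1
        while 1.0 - coverage ** n < confidence:
            n += 1
        return n

    n_runs = wilks_sample_size()  # classic 95/95 result: 59 runs
    ```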

  12. Advances in Artificial Neural Networks – Methodological Development and Application

    Directory of Open Access Journals (Sweden)

    Yanbo Huang

    2009-08-01

    Full Text Available Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred the development of training algorithms for other networks such as radial basis function, recurrent, feedback, and unsupervised Kohonen self-organizing networks. These networks, especially the multilayer perceptron network with a backpropagation training algorithm, have gained recognition in research and applications in various scientific and engineering areas. In order to accelerate the training process and overcome data over-fitting, research has been conducted to improve the backpropagation algorithm. Further, artificial neural networks have been integrated with other advanced methods such as fuzzy logic and wavelet analysis, to enhance the ability of data interpretation and modeling and to avoid subjectivity in the operation of the training algorithm. In recent years, support vector machines have emerged as a set of high-performance supervised generalized linear classifiers in parallel with artificial neural networks. A review of the development history of artificial neural networks is presented and the standard architectures and algorithms of artificial neural networks are described. Furthermore, advanced artificial neural networks will be introduced alongside support vector machines, and the limitations of ANNs will be identified. The future of artificial neural network development in tandem with support vector machines will be discussed in conjunction with further applications to food science and engineering, soil-water relationships for crop management, and decision support for precision agriculture. Along with the network structures and training algorithms, the applications of artificial neural networks will be reviewed as well, especially in the fields of agricultural and biological
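    The backpropagation training described in this review can be illustrated with a minimal multilayer perceptron; the network size, learning rate, and XOR task below are illustrative choices, not taken from the article:

    ```python
    import math
    import random

    random.seed(0)

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    # A 2-2-1 multilayer perceptron trained by backpropagation on XOR.
    w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]  # hidden (w1, w2, bias)
    w_o = [random.uniform(-1, 1) for _ in range(3)]                      # output (w1, w2, bias)
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    lr = 0.5

    def predict(x):
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
        return sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])

    err_before = sum((predict(x) - t) ** 2 for x, t in data)

    for _ in range(20000):
        for x, t in data:
            h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_h]
            y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + w_o[2])
            d_o = (y - t) * y * (1 - y)  # output delta (squared-error loss)
            d_h = [d_o * w_o[i] * h[i] * (1 - h[i]) for i in range(2)]  # hidden deltas
            for i in range(2):
                w_o[i] -= lr * d_o * h[i]
                for j in range(2):
                    w_h[i][j] -= lr * d_h[i] * x[j]
                w_h[i][2] -= lr * d_h[i]
            w_o[2] -= lr * d_o

    err_after = sum((predict(x) - t) ** 2 for x, t in data)
    ```

    The deltas are propagated backward from the output layer to the hidden layer before any weight is updated, which is the defining step of the algorithm the review surveys.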

  13. Contracting Selection for the Development of the Range Rule Risk Methodology

    National Research Council Canada - National Science Library

    1997-01-01

    ...-Effectiveness Risk Tool and contractor selection for the development of the Range Rule Risk Methodology. The audit objective was to determine whether the Government appropriately used the Ordnance and Explosives Cost-Effectiveness Risk Tool...

  14. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    International Nuclear Information System (INIS)

    Coleman, Justin Leigh

    2016-01-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  15. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin Leigh [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  16. An architecture and methodology for the design and development of Technical Information Systems

    NARCIS (Netherlands)

    Capobianchi, R.; Mautref, M.; van Keulen, Maurice; Balsters, H.

    In order to meet demands in the context of Technical Information Systems (TIS) pertaining to reliability, extensibility, maintainability, etc., we have developed an architectural framework with accompanying methodological guidelines for designing such systems. With the framework, we aim at complex

  17. A Software Planning and Development Methodology with Resource Allocation Capability

    Science.gov (United States)

    1986-01-01

    [OCR-garbled front matter; recoverable fragments: acknowledgements of support during a graduate program at Texas A&M; "... acquisition, research/development, and operations/maintenance sources. The concept of a resource ..."; and citations including "James, Unpublished ICAM Industry Days address, New Orleans, Louisiana, May 1982" and "Ledbetter, William N., et al., 'Education..."]

  18. Value at risk methodologies: Developments, implementation and evaluation

    OpenAIRE

    Dong, Simin

    2006-01-01

    Value at Risk (VaR) is a useful concept in risk disclosure, especially for financial institutions. In this paper, the origin and development as well as the regulatory requirement of VaR are discussed. Furthermore, a hypothetical foreign currency forward contract is used as an example to illustrate the implementation of VaR. Back testing is conducted to test the soundness of each VaR model. Analysis in this paper shows that historical simulation and Monte Carlo simulation approaches have more ...
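    The historical simulation approach mentioned in this record can be sketched in a few lines: VaR is read off as an empirical quantile of historical losses. The return series and quantile convention below are illustrative:

    ```python
    def historical_var(returns, confidence=0.99):
        """One-day Value at Risk by historical simulation: the loss level
        exceeded on only a (1 - confidence) fraction of historical days.
        Losses are expressed as positive numbers."""
        losses = sorted(-r for r in returns)
        idx = int(confidence * len(losses))  # simple upper-quantile convention
        return losses[min(idx, len(losses) - 1)]

    # Illustrative daily returns of a hypothetical position.
    rets = [0.01, -0.02, 0.005, -0.015, 0.002, -0.03, 0.012, -0.007, 0.004, -0.001]
    var_90 = historical_var(rets, confidence=0.9)  # worst-decile loss: 0.03
    ```

    Back testing, as in the paper, then counts how often realized losses exceed the reported VaR and compares that frequency with 1 - confidence.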

  19. Development of multitracer methodology for the characterization of petroleum reservoirs

    International Nuclear Information System (INIS)

    Pereira, E.H.T.; Moreira, R.M.; Ferreira Pinto, A.M.; Floresta, D.L.

    2004-01-01

    Amongst other candidate tracers, the use of potassium thiocyanate labelled with ³⁵S (K³⁵SCN) has been investigated. This species is highly water soluble, temperature resistant, and is not adsorbed on the extended solid surfaces of the formation pores. Being a beta emitter, it minimizes radiological protection problems but requires sampling for activity measurement in the laboratory. The paper describes the extraction of elemental radiosulfur from the KCl lattice and the development of an optimized route to synthesize the thiocyanate that avoids lengthy and numerous intermediate reactions and separations. Laboratory and ongoing field tests designed to validate the tracer are also described. (author)

  20. Contribution to developing the environment radiation protection methodology

    Energy Technology Data Exchange (ETDEWEB)

    Oudalova, A. [Institute of Atomic Power Engineering NRNU MEPhI (Russian Federation); Alexakhin, R.; Dubynina, M. [Russian Institute of Agricultural Radiology and Agroecology (Russian Federation)

    2014-07-01

    Sustainable development of the environment and biota protection, including environmental radiation protection, are issues of current interest in society. Work is ongoing on the development of a system of radiation protection for non-human biota. Anthropocentric and eco-centric principles are widely discussed. ICRP Publications 103, 108, and 114 and many other reports and articles address the topic of environmental protection, the set of reference animals and plants, corresponding transfer parameters, dose models, and derived consideration reference levels. There is still an open field for discussion of methods and approaches to establish a well-founded procedure for assessing the environmental risks of radiation impacts on different organisms, populations, and ecosystems. A great deal of work has been done by the ICRP and other organizations and research groups to develop and systematize approaches to this difficult subject. This activity, however, is not widely known and accepted, and more effort is needed to bring the ideas of an eco-centric strategy in environmental radiation protection not only to the public but to specialists in many countries as well. One of the main points of interest is the assessment of critical doses and dose rates for flora and fauna species. Some aspects of a possible procedure for estimating them are studied in this work, including criteria for datasets of good quality, models of dose dependence, the sensitivity of different umbrella endpoints, and methods for treating the original massive datasets. Estimates are based on information gathered in a database of radiation-induced effects in plants. Data on biological effects in plants (umbrella endpoints of reproductive potential, survival, morbidity, and morphological, biochemical, and genetic effects) as a function of dose and dose rate of ionizing radiation have been collected from reviewed publications and maintained in MS Access format. The database now contains about 7000 datasets and 25000 records
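    Critical-dose estimates of the kind discussed here are often obtained by fitting a dose-effect model to such datasets and reading off an effective dose; the report does not prescribe a model, so the two-parameter log-logistic fit below, on synthetic data, is only an assumed example:

    ```python
    def log_logistic(dose, ed50, slope):
        """Fraction of maximal effect at a given dose under a
        two-parameter log-logistic model; parameters are illustrative."""
        return 1.0 / (1.0 + (ed50 / dose) ** slope)

    # Synthetic effect data generated from ED50 = 10 Gy, slope = 2.
    doses = [1, 2, 5, 10, 20, 50, 100]
    effects = [log_logistic(d, 10.0, 2.0) for d in doses]

    # A coarse grid search recovers ED50 from the data by least squares.
    best = min(
        (sum((log_logistic(d, c, 2.0) - e) ** 2 for d, e in zip(doses, effects)), c)
        for c in [x * 0.5 for x in range(2, 81)]  # candidate ED50: 1 to 40 Gy
    )
    ed50_est = best[1]
    ```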

  1. Development cooperation as methodology for teaching social responsibility to engineers

    Science.gov (United States)

    Lappalainen, Pia

    2011-12-01

    The role of engineering in promoting global well-being has become accentuated, turning the engineering curriculum into a means of distributing well-being equally. The gradually intensifying calls for humanitarian engineering have resulted in the incorporation of social responsibility themes into the university curriculum. Cooperation, communication, teamwork, intercultural cooperation, sustainability, and social and global responsibility represent the socio-cultural dimensions that are becoming increasingly important as globalisation intensifies the demands on socially and globally adept engineering communities. This article describes an experiment, the Development Cooperation Project, which was conducted at Aalto University in Finland to integrate social responsibility themes into higher engineering education.

  2. Methodology development to support NPR strategic planning. Final report

    International Nuclear Information System (INIS)

    1996-01-01

    This report covers the work performed in support of the Office of New Production Reactors during the 9-month period from January through September 1990. Because of the rapid pace of program activities during this period, the emphasis of the work shifted from strategic planning toward supporting initiatives requiring more immediate consideration and response. Consequently, the work performed concentrated on researching and helping identify and resolve those issues considered to be of most immediate concern. Although strongly interrelated, they can be separated into two broad categories. The first category encompasses program-internal concerns. Included are issues associated with the current demand for accelerating staff growth, satisfying the immediate need for appropriate skill and experience levels, team-building efforts necessary to assure the development of an effective operating organization, the ability of people and organizations to satisfactorily understand and execute their assigned roles and responsibilities, and the general facilitation of inter- and intra-organization communications and working relationships. The second category encompasses program-execution concerns. These include the efforts required to develop realistic execution plans and to implement appropriate control mechanisms that provide for effective forecasting, planning, managing, and controlling of ongoing (or imminent) program substantive activities according to the master integrated schedule and budget

  3. Development of a plastic fracture methodology for nuclear systems

    International Nuclear Information System (INIS)

    Marston, T.U.; Jones, R.L.; Kanninen, M.F.; Mowbray, D.F.

    1981-01-01

    This paper describes research conducted to develop a fundamental basis for flaw tolerance assessment procedures suitable for components exhibiting ductile behavior. The research was composed of an integrated combination of stable crack growth experiments and elastic-plastic analyses. A number of candidate fracture criteria were assembled and investigated to determine the proper basis for plastic fracture mechanics assessments. The results demonstrate that many different fracture criteria can be used as the basis of a resistance curve approach to predicting stable crack growth and fracture instability. While all have some disadvantages, none is completely unacceptable. On balance, the best criteria were found to be the J-integral for initiation and limited amounts of stable crack growth and the local crack-tip opening angle for extended amounts of stable growth. A combination of the two, which may preserve the advantages of each while reducing their disadvantages, also was suggested by these results. The influence of biaxial and mixed flat/shear fracture behavior was investigated and found to not alter the basic results. Further work in the development of simplified ductile fracture analyses for routine engineering assessments of nuclear pressure vessels and piping evolving from this research is also described

  4. Developments in the Tools and Methodologies of Synthetic Biology

    Science.gov (United States)

    Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul

    2014-01-01

    Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788

  5. Development Proliferation Resistance Assessment Methodology for Regulation Purposes

    International Nuclear Information System (INIS)

    Ham, Taekyu; Seo, Janghoon; Lee, Nayoung; Yoo, Hosik

    2015-01-01

    More than 45 countries are considering embarking on nuclear power programs. As a result, the world's nuclear power generating capacity is projected to continue to grow through 2030. The total installed nuclear capacity of 373 GWe in 2012 would reach 435 GWe and 722 GWe by 2030 under the low and high scenario predictions, respectively. In Korea, there are 23 nuclear power plants in operation. Thirteen more plants are either under construction or being planned for completion by 2027. In addition, active research is taking place into pyroprocessing technology for treating spent fuel and reducing storage needs. Measures for analyzing the PR of a nuclear energy system were derived by collecting attributes that influence PR, which were then categorized into groups. Three measures were then developed by a series of processes: legal and institutional framework, material characteristics, and safeguardability. The extrinsic features are the more practical ones for a regulatory body to evaluate when assessing a system
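    Aggregating PR attributes into measures, as outlined above, is often done with a weighted-sum multi-attribute score; the attribute names, scores, and weights below are invented for illustration and do not come from this record:

    ```python
    def pr_measure(scores, weights):
        """Weighted-sum aggregation of normalized attribute scores (0-1)
        into a single proliferation-resistance measure."""
        if abs(sum(weights.values()) - 1.0) > 1e-9:
            raise ValueError("weights must sum to 1")
        return sum(weights[k] * scores[k] for k in weights)

    scores = {"legal_framework": 0.8, "material_attractiveness": 0.6,
              "safeguardability": 0.9}
    weights = {"legal_framework": 0.3, "material_attractiveness": 0.3,
               "safeguardability": 0.4}
    measure = pr_measure(scores, weights)  # 0.3*0.8 + 0.3*0.6 + 0.4*0.9 = 0.78
    ```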

  6. Development of a methodology for maintenance optimization at Kozloduy NPP

    International Nuclear Information System (INIS)

    Kitchev, E.

    1997-01-01

    The paper presents an overview of a project to develop an applicable strategy and methods for Kozloduy NPP (KNPP) to optimize its maintenance program in order to meet current risk-based maintenance requirements. The strategy, in the format of an Integrated Maintenance Program (IMP) manual, will define the targets of the optimization process, the major stages and elements of this process, and their relationships. The IMP embodies the aspects of compliance with the US NRC Maintenance Rule and facilitates the integration of KNPP programs and processes which impact plant maintenance and safety. The methods, in the format of IMP Instructions (IM-PI), will define how the different IMP stages can be implemented and the IMP targets achieved in the KNPP environment. (author). 8 refs

  7. Developments in the tools and methodologies of synthetic biology

    Directory of Open Access Journals (Sweden)

    Richard eKelwick

    2014-11-01

    Full Text Available Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices or systems. However, biological systems are generally complex and unpredictable and are therefore intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a ‘body of knowledge’ from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled and its functionality tested. At each stage of the design cycle an expanding repertoire of tools is being developed. In this review we highlight several of these tools in terms of their applications and benefits to the synthetic biology community.

  8. Methodological NMR imaging developments to measure cerebral perfusion

    International Nuclear Information System (INIS)

    Pannetier, N.

    2010-12-01

    This work focuses on acquisition techniques and physiological models that allow the characterization of cerebral perfusion by MRI. The arterial input function (AIF), on which many models are based, is measured by an optical imaging technique at the carotid artery in rats. The reproducibility and repeatability of the AIF are discussed and a model function is proposed. We then compare two techniques for measuring the vessel size index (VSI) in rats bearing a glioma. The reference technique, using a USPIO contrast agent (CA), is compared with the dynamic approach, which estimates this parameter during the passage of a bolus of Gd. The latter technique has the advantage of being usable clinically. The results obtained at 4.7 T by both approaches are similar, and the use of VSI in clinical protocols at high field is strongly encouraged. The mechanisms involved (R1 and R2* relaxivities) were then studied using a multi-gradient-echo approach. A multi-echo spiral sequence was developed and a method that allows refocusing between echoes is presented. This sequence is used to characterize the impact of R1 effects during the passage of two successive injections of Gd. Finally, we developed a tool for simulating the NMR signal on a 2D geometry, taking into account the permeability of the BBB and the diffusion of the CA in the interstitial space. At short TE, the effect of diffusion on the signal is negligible. In contrast, the effects of diffusion and permeability may be separated at long echo times. Finally, we show that during extravasation of the CA, the local magnetic field homogenization due to the decrease of the magnetic susceptibility difference at vascular interfaces is quickly balanced by the perturbations induced by the increase of the magnetic susceptibility difference at the cellular interfaces in the extravascular compartment. (author)
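    AIF time courses of the kind measured here are commonly summarized with a gamma-variate model function; the thesis's proposed model function is not given in this abstract, so the parameterization below is a generic assumption with illustrative values:

    ```python
    import math

    def gamma_variate(t, t0=2.0, A=1.0, alpha=3.0, beta=1.5):
        """Generic gamma-variate bolus model,
        C(t) = A * (t - t0)**alpha * exp(-(t - t0) / beta) for t > t0;
        t0 is the bolus arrival time, all parameters illustrative."""
        if t <= t0:
            return 0.0
        return A * (t - t0) ** alpha * math.exp(-(t - t0) / beta)

    # For this form the peak occurs at t = t0 + alpha * beta.
    t_peak = 2.0 + 3.0 * 1.5  # = 6.5
    curve = [gamma_variate(0.1 * i) for i in range(200)]  # sampled every 0.1 s
    ```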

  9. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation

    Science.gov (United States)

    Tang, Liang; Cheng, Pengle

    2017-01-01

    Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic–plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis. PMID:28746390

  10. Homogenized modeling methodology for 18650 lithium-ion battery module under large deformation.

    Directory of Open Access Journals (Sweden)

    Liang Tang

Full Text Available Effective lithium-ion battery module modeling has become a bottleneck for full-size electric vehicle crash safety numerical simulation. Modeling every single cell in detail would be costly. However, computational accuracy could be lost if the module is modeled by using a simple bulk material or rigid body. To solve this critical engineering problem, a general method to establish a computational homogenized model for the cylindrical battery module is proposed. A single battery cell model is developed and validated through radial compression and bending experiments. To analyze the homogenized mechanical properties of the module, a representative unit cell (RUC) is extracted with the periodic boundary condition applied on it. An elastic-plastic constitutive model is established to describe the computational homogenized model for the module. Two typical packing modes, i.e., cubic dense packing and hexagonal packing for the homogenized equivalent battery module (EBM) model, are targeted for validation compression tests, as well as the models with detailed single cell description. Further, the homogenized EBM model is confirmed to agree reasonably well with the detailed battery module (DBM) model for different packing modes with a length scale of up to 15 × 15 cells and 12% deformation where the short circuit takes place. The suggested homogenized model for battery module makes way for battery module and pack safety evaluation for full-size electric vehicle crashworthiness analysis.

  11. Development of IRMA reagent and methodology for PSA

    International Nuclear Information System (INIS)

    Najafi, R.

    1997-01-01

The PSA test is a solid-phase two-site immunoassay. Rabbit anti-PSA is coated or bound to the surface of the solid phase, and monoclonal anti-PSA is labeled with I-125. The PSA molecules present in the standard solution or serum are 'sandwiched' between the two antibodies. After formation of the coated antibody-antigen-labeled antibody complex, the unbound labeled antibody is removed by washing. The complex is measured with a gamma counter; the concentration of analyte is proportional to the counts of the test sample. To develop an IRMA PSA kit, three essential reagents must be prepared: antibody-coated solid phase, labeled antibody, and standards. These are then optimized to obtain a standard curve suitable for measuring specimen PSA over the desired concentration range. The type of solid phase, and the procedure used to coat or bind antibody to it, remains the main point of debate in developing and setting up RIA/IRMA kits. In our experiments, polystyrene beads, because they are easy to coat with antibody as well as easy to use, can be considered a suitable solid phase. Most antibodies are passively adsorbed onto a plastic surface (e.g. polystyrene, polypropylene, or polyvinyl chloride) from a dilute buffer; the antibody-coated plastic surface then acts as the solid-phase reagent. Poor efficiency, the time required to reach equilibrium, and lack of reproducibility, especially batch-to-batch variation between materials, are disadvantages of this simple coating procedure. Improvements can be made by coating a second antibody on the surface of the beads, which then reacts with the primary antibody. It is also possible to further enhance the coating efficiency of the beads by using Staphylococcus aureus Protein A. Protein A is a major component of the Staphylococcus aureus cell wall that has affinity for the Fc segment of immunoglobulin G (IgG) of several species, including human, rabbit, and mouse. This property of staphylococcal Protein A has made it a very useful tool in the purification of classes and subclasses

  12. The development of evaluation methodology for advanced interactive communication

    International Nuclear Information System (INIS)

    Okamoto, K.

    2005-01-01

Face-to-face communication is one of the essential styles of communication. Through face-to-face communication, people exchange a great deal of information at a time, both verbal and non-verbal, which makes it highly effective for learning about each other. The authors focused on face-to-face communication and developed an evaluation method to quantify its effectiveness. We regard conversation as an exchange of keywords; the effectiveness of a conversation is valued by the amount of keywords exchanged and the achievement of mutual understanding. For two people's face-to-face communication, the authors quantified the shared information by measuring the change in the amount of the participants' knowledge, where a participant's knowledge is counted by the words they can give. We measured the change in their shared knowledge (the number of words they gave associated with the theme). We also quantified the discords in their understanding of their partners by measuring the discrepancy between the knowledge they think they share and the knowledge they really share. From these data, we evaluated the effectiveness of communication and analyzed trends in mutual understanding. (authors)
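Treating conversation as an exchange of keywords, the measures described (growth of shared knowledge, and discord between believed and actual shared knowledge) reduce to simple set arithmetic; the word sets below are invented purely for illustration:

```python
def communication_metrics(before_a, after_a, before_b, after_b, believed_shared_a):
    """Quantify a conversation as an exchange of keywords (sets of words)."""
    shared_before = before_a & before_b
    shared_after = after_a & after_b
    gain = len(shared_after) - len(shared_before)      # growth of common knowledge
    discord_a = believed_shared_a - shared_after       # words A wrongly believes are shared
    return gain, discord_a

gain, discord = communication_metrics(
    before_a={"reactor", "neutron"}, after_a={"reactor", "neutron", "flux", "dose"},
    before_b={"reactor", "dose"},   after_b={"reactor", "dose", "flux"},
    believed_shared_a={"reactor", "flux", "neutron"},
)
```

Here the pair gains two shared keywords, while participant A still wrongly believes one word is common knowledge.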

  13. A Methodology For The Development Of Complex Domain Specific Languages

    CERN Document Server

    Risoldi, Matteo; Falquet, Gilles

    2010-01-01

The term Domain-Specific Modeling Language is used in software development to indicate a modeling (and sometimes programming) language dedicated to a particular problem domain, a particular problem representation technique and/or a particular solution technique. The concept is not new -- special-purpose programming languages and all kinds of modeling/specification languages have always existed, but the term DSML has become more popular due to the rise of domain-specific modeling. Domain-specific languages are considered 4GL programming languages. Domain-specific modeling techniques have been adopted for a number of years now. However, the techniques and frameworks used still suffer from problems of complexity of use and fragmentation. Although in recent times some integrated environments are seeing the light, it is not common to see many concrete use cases in which domain-specific modeling has been put to use. The main goal of this thesis is tackling the domain of interactive systems and applying a DSML-based...

  14. Recent progress and developments in LWR-PV calculational methodology

    International Nuclear Information System (INIS)

    Maerker, R.E.; Broadhead, B.L.; Williams, M.L.

    1984-01-01

New and improved techniques for calculating beltline surveillance activities and pressure vessel fluences with reduced uncertainties have recently been developed. These techniques involve the combining of monitored in-core power data with diffusion theory calculated pin-by-pin data to yield absolute source distributions in R-THETA and R-Z geometries suitable for discrete ordinates transport calculations. Effects of finite core height, whenever necessary, can be considered by the use of a three-dimensional fluence rate synthesis procedure. The effects of a time-dependent spatial source distribution may be readily evaluated by applying the concept of the adjoint function, and simplifying the procedure to such a degree that only one forward and one adjoint calculation are required to yield all the dosimeter activities for all beltline surveillance locations at once. The addition of several more adjoint calculations using various fluence rates as responses is all that is needed to determine all the pressure vessel group fluences for all beltline locations for an arbitrary source distribution
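The adjoint idea described above amounts to folding one precomputed importance function with any source distribution. A toy sketch on a discretized R-Z mesh (all numbers are hypothetical; the actual work uses discrete ordinates transport, not random fields):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical discretized quantities on an R-Z mesh (nr x nz):
nr, nz = 8, 10
adjoint = rng.uniform(0.5, 2.0, (nr, nz))   # importance of a source neutron in each cell
source = rng.uniform(0.0, 1.0, (nr, nz))    # fission-source density (arbitrary units)
volume = np.ones((nr, nz))                  # cell volumes (uniform here for simplicity)

# Dosimeter activity = <adjoint, source>: one adjoint run serves every source distribution.
activity = float(np.sum(adjoint * source * volume))

# A changed source needs no new transport calculation, only a new inner product:
activity_2x = float(np.sum(adjoint * (2.0 * source) * volume))
```

The linearity of the inner product is exactly what lets a single adjoint calculation cover arbitrary (including time-dependent) source distributions.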

  15. A methodology for developing anisotropic AAA phantoms via additive manufacturing.

    Science.gov (United States)

    Ruiz de Galarreta, Sergio; Antón, Raúl; Cazón, Aitor; Finol, Ender A

    2017-05-24

    An Abdominal Aortic Aneurysm (AAA) is a permanent focal dilatation of the abdominal aorta at least 1.5 times its normal diameter. The criterion of maximum diameter is still used in clinical practice, although numerical studies have demonstrated the importance of biomechanical factors for rupture risk assessment. AAA phantoms could be used for experimental validation of the numerical studies and for pre-intervention testing of endovascular grafts. We have applied multi-material 3D printing technology to manufacture idealized AAA phantoms with anisotropic mechanical behavior. Different composites were fabricated and the phantom specimens were characterized by biaxial tensile tests while using a constitutive model to fit the experimental data. One composite was chosen to manufacture the phantom based on having the same mechanical properties as those reported in the literature for human AAA tissue; the strain energy and anisotropic index were compared to make this choice. The materials for the matrix and fibers of the selected composite are, respectively, the digital materials FLX9940 and FLX9960 developed by Stratasys. The fiber proportion for the composite is equal to 0.15. The differences between the composite behavior and the AAA tissue are small, with a small difference in the strain energy (0.4%) and a maximum difference of 12.4% in the peak Green strain ratio. This work represents a step forward in the application of 3D printing technology for the manufacturing of AAA phantoms with anisotropic mechanical behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Pediatric hospital medicine core competencies: development and methodology.

    Science.gov (United States)

    Stucky, Erin R; Ottolini, Mary C; Maniscalco, Jennifer

    2010-01-01

    Pediatric hospital medicine is the most rapidly growing site-based pediatric specialty. There are over 2500 unique members in the three core societies in which pediatric hospitalists are members: the American Academy of Pediatrics (AAP), the Academic Pediatric Association (APA) and the Society of Hospital Medicine (SHM). Pediatric hospitalists are fulfilling both clinical and system improvement roles within varied hospital systems. Defined expectations and competencies for pediatric hospitalists are needed. In 2005, SHM's Pediatric Core Curriculum Task Force initiated the project and formed the editorial board. Over the subsequent four years, multiple pediatric hospitalists belonging to the AAP, APA, or SHM contributed to the content of and guided the development of the project. Editors and collaborators created a framework for identifying appropriate competency content areas. Content experts from both within and outside of pediatric hospital medicine participated as contributors. A number of selected national organizations and societies provided valuable feedback on chapters. The final product was validated by formal review from the AAP, APA, and SHM. The Pediatric Hospital Medicine Core Competencies were created. They include 54 chapters divided into four sections: Common Clinical Diagnoses and Conditions, Core Skills, Specialized Clinical Services, and Healthcare Systems: Supporting and Advancing Child Health. Each chapter can be used independently of the others. Chapters follow the knowledge, skills, and attitudes educational curriculum format, and have an additional section on systems organization and improvement to reflect the pediatric hospitalist's responsibility to advance systems of care. These competencies provide a foundation for the creation of pediatric hospital medicine curricula and serve to standardize and improve inpatient training practices. (c) 2010 Society of Hospital Medicine.

  17. Development of a simplified statistical methodology for nuclear fuel rod internal pressure calculation

    International Nuclear Information System (INIS)

    Kim, Kyu Tae; Kim, Oh Hwan

    1999-01-01

A simplified statistical methodology is developed in order both to reduce the over-conservatism of the deterministic methodologies employed for PWR fuel rod internal pressure (RIP) calculation and to simplify the complicated calculation procedure of the widely used statistical methodology, which employs the response surface method and Monte Carlo simulation. The simplified statistical methodology employs the system moment method with a deterministic approach in determining the maximum variance of RIP. The maximum RIP variance is determined as the square sum, over all input variables considered, of the maximum value of each input variable's deviation times its RIP sensitivity factor. This approach makes the simplified statistical methodology much more efficient in routine reload core design analysis, since it eliminates the numerous calculations required for the power-history-dependent RIP variance determination. The simplified statistical methodology is shown to be more conservative in generating the RIP distribution than the widely used statistical methodology. Comparison of the significance of each input variable to RIP indicates that the fission gas release model is the most significant input variable. (author). 11 refs., 6 figs., 2 tabs
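The system moment method in this deterministic variant reduces to a square sum of sensitivity-times-deviation terms. A minimal sketch with hypothetical sensitivities and deviations (not values from the paper):

```python
import math

def max_rip_variance(sensitivities, max_deviations):
    """Square sum of (sensitivity_i * max deviation_i) over all input variables."""
    return sum((s * d) ** 2 for s, d in zip(sensitivities, max_deviations))

# Hypothetical input variables, e.g. fission gas release, fill pressure, pellet density:
sens = [4.0, 1.2, 0.8]     # RIP sensitivity factors (MPa per unit of each variable)
dev = [0.5, 0.3, 0.2]      # maximum deviations of each variable
var_max = max_rip_variance(sens, dev)
sigma_max = math.sqrt(var_max)   # bounding standard deviation of RIP
```

Because the square sum takes each term at its maximum, the resulting variance bounds the true one, which is why the method is conservative while avoiding Monte Carlo sampling.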

  18. Development of a methodology to evaluate material accountability in pyroprocess

    Science.gov (United States)

    Woo, Seungmin

This study investigates the effect of non-uniform nuclide composition in spent fuel on material accountancy in the pyroprocess. High-fidelity depletion simulations are performed using the Monte Carlo code SERPENT in order to determine nuclide composition as a function of axial and radial position within fuel rods and assemblies, and of burnup. For improved accuracy, the simulations use short burnup steps (25 days or less), an equilibrium Xe treatment (to avoid oscillations over burnup steps), an axial moderator temperature distribution, and 30 axial meshes. Analytical solutions of simplified depletion equations are derived to understand the axial non-uniformity of nuclide composition in spent fuel. The cosine shape of the axial neutron flux distribution dominates the axial non-uniformity of the nuclide composition. Combined cross sections and time also generate axial non-uniformity, as the exponential term in the analytical solution consists of the neutron flux, the cross section, and time. The axial concentration distribution for a nuclide with a small cross section is steeper than that for a nuclide with a large cross section, because the axial flux is weighted by the cross section in the exponential term of the analytical solution. Similarly, the non-uniformity becomes flatter with increasing burnup, because the time term in the exponential increases. Based on the developed numerical recipes, and by decoupling the axial distributions from predetermined representative radial distributions matched by axial height, the axial and radial composition distributions for representative spent nuclear fuel assemblies (the Type-0, -1, and -2 assemblies after 1, 2, and 3 depletion cycles) are obtained. These data are then modified to depict the processing of materials in the head-end steps of the pyroprocess, i.e., chopping, voloxidation, and granulation. The expectation and standard deviation of the Pu-to-244Cm-ratio by the single granule
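The simplified burnout equation dN/dt = -σφ(z)N with a chopped-cosine axial flux reproduces the qualitative behavior described above: depletion is strongest at the midplane where the flux peaks, and the resulting axial non-uniformity depends on the product of cross section, flux, and time. A small sketch, with all reactor parameters hypothetical:

```python
import numpy as np

H = 366.0                                    # active fuel height, cm (hypothetical)
z = np.linspace(-H / 2, H / 2, 31)           # axial mesh, midplane at index 15
phi = 3e13 * np.cos(np.pi * z / (1.1 * H))   # chopped-cosine flux, n/cm^2-s (hypothetical)

def depleted_fraction(sigma_cm2, t_s):
    """N(z, t)/N(z, 0) for simple burnout: dN/dt = -sigma * phi(z) * N."""
    return np.exp(-sigma_cm2 * phi * t_s)

t = 3 * 365 * 24 * 3600.0                    # roughly three years of irradiation
small = depleted_fraction(2e-24, t)          # nuclide with a small (2 b) cross section
large = depleted_fraction(1e-21, t)          # nuclide with a large (1000 b) cross section
```

For the small cross section the axial profile stays nearly flat, while for the large one the midplane is strongly burned out relative to the ends.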

  19. The development of a checklist to enhance methodological quality in intervention programs

    Directory of Open Access Journals (Sweden)

    Salvador Chacón-Moscoso

    2016-11-01

Full Text Available The methodological quality of primary studies is an important issue when performing meta-analyses or systematic reviews. Nevertheless, there are no clear criteria for how methodological quality should be analyzed. Controversies emerge when considering the various theoretical and empirical definitions, especially in relation to three interrelated problems: the lack of representativeness, utility, and feasibility. In this article, we (a) systematize and summarize the available literature about methodological quality in primary studies; (b) propose a specific, parsimonious, 12-item checklist to empirically define the methodological quality of primary studies based on a content validity study; and (c) present an inter-coder reliability study for the resulting 12 items. This paper provides a precise and rigorous description of the development of this checklist, highlighting the clearly specified criteria for the inclusion of items and a substantial inter-coder agreement in the different items. Rather than simply proposing another checklist, however, it then argues that the list constitutes an assessment tool with respect to the representativeness, utility, and feasibility of the most frequent methodological quality items in the literature, one that provides practitioners and researchers with clear criteria for choosing items that may be adequate to their needs. We propose individual methodological features as indicators of quality, arguing that these need to be taken into account when designing, implementing, or evaluating an intervention program. This enhances the methodological quality of intervention programs and fosters cumulative knowledge based on meta-analyses of these interventions. Future development of the checklist is discussed.
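An inter-coder reliability study of a binary checklist is commonly summarized by a chance-corrected agreement statistic such as Cohen's kappa; the record does not specify which statistic was used, so this is a generic sketch with made-up codings:

```python
def cohens_kappa(coder_a, coder_b):
    """Inter-coder agreement corrected for chance, for binary item codings."""
    n = len(coder_a)
    po = sum(a == b for a, b in zip(coder_a, coder_b)) / n    # observed agreement
    pa1, pb1 = sum(coder_a) / n, sum(coder_b) / n             # marginal "present" rates
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)                    # agreement expected by chance
    return (po - pe) / (1 - pe)

a = [1, 1, 1, 0, 0, 0, 1, 0, 1, 1]   # coder A, one entry per checklist item (invented)
b = [1, 1, 0, 0, 0, 0, 1, 0, 1, 1]   # coder B
kappa = cohens_kappa(a, b)           # 0.8: substantial agreement for this toy coding
```

Values above roughly 0.6 to 0.8 are conventionally read as substantial agreement, the level the authors report for their 12 items.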

  20. A framework for assessing the adequacy and effectiveness of software development methodologies

    Science.gov (United States)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.

  1. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    Science.gov (United States)

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  2. Astatine-211 Radiochemistry: The Development Of Methodologies For High Activity Level Radiosynthesis

    International Nuclear Information System (INIS)

    Zalutsky, Michael R.

    2012-01-01

    Targeted radionuclide therapy is emerging as a viable approach for cancer treatment because of its potential for delivering curative doses of radiation to malignant cell populations while sparing normal tissues. Alpha particles such as those emitted by 211At are particularly attractive for this purpose because of their short path length in tissue and high energy, making them highly effective in killing cancer cells. The current impact of targeted radiotherapy in the clinical domain remains limited despite the fact that in many cases, potentially useful molecular targets and labeled compounds have already been identified. Unfortunately, putting these concepts into practice has been impeded by limitations in radiochemistry methodologies. A critical problem is that the synthesis of therapeutic radiopharmaceuticals provides additional challenges in comparison to diagnostic reagents because of the need to perform radio-synthesis at high levels of radioactivity. This is particularly important for α-particle emitters such as 211At because they deposit large amounts of energy in a highly focal manner. The overall objective of this project is to develop convenient and reproducible radiochemical methodologies for the radiohalogenation of molecules with the α-particle emitter 211At at the radioactivity levels needed for clinical studies. Our goal is to address two problems in astatine radiochemistry: First, a well known characteristic of 211At chemistry is that yields for electrophilic astatination reactions decline as the time interval after radionuclide isolation from the cyclotron target increases. This is a critical problem that must be addressed if cyclotrons are to be able to efficiently supply 211At to remote users. And second, when the preparation of high levels of 211At-labeled compounds is attempted, the radiochemical yields can be considerably lower than those encountered at tracer dose. For these reasons, clinical evaluation of promising 211At-labeled targeted

  3. ASTATINE-211 RADIOCHEMISTRY: THE DEVELOPMENT OF METHODOLOGIES FOR HIGH ACTIVITY LEVEL RADIOSYNTHESIS

    Energy Technology Data Exchange (ETDEWEB)

    MICHAEL R. ZALUTSKY

    2012-08-08

Targeted radionuclide therapy is emerging as a viable approach for cancer treatment because of its potential for delivering curative doses of radiation to malignant cell populations while sparing normal tissues. Alpha particles such as those emitted by 211At are particularly attractive for this purpose because of their short path length in tissue and high energy, making them highly effective in killing cancer cells. The current impact of targeted radiotherapy in the clinical domain remains limited despite the fact that in many cases, potentially useful molecular targets and labeled compounds have already been identified. Unfortunately, putting these concepts into practice has been impeded by limitations in radiochemistry methodologies. A critical problem is that the synthesis of therapeutic radiopharmaceuticals provides additional challenges in comparison to diagnostic reagents because of the need to perform radio-synthesis at high levels of radioactivity. This is particularly important for α-particle emitters such as 211At because they deposit large amounts of energy in a highly focal manner. The overall objective of this project is to develop convenient and reproducible radiochemical methodologies for the radiohalogenation of molecules with the α-particle emitter 211At at the radioactivity levels needed for clinical studies. Our goal is to address two problems in astatine radiochemistry: First, a well known characteristic of 211At chemistry is that yields for electrophilic astatination reactions decline as the time interval after radionuclide isolation from the cyclotron target increases. This is a critical problem that must be addressed if cyclotrons are to be able to efficiently supply 211At to remote users. And second, when the preparation of high levels of 211At-labeled compounds is attempted, the radiochemical yields can be considerably lower than those encountered at tracer dose. For these reasons, clinical evaluation of promising 211At

  4. Methodology development for plutonium categorization and enhancement of proliferation resistance by P3 mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Saito, M.; Kimura, Y.; Sagara, H.; Han, C. Y. [Tokyo Institute of Technology, Tokyo (Japan); Koyama, S. [Japan Atomic Energy Agency, Ibaraki (Japan)

    2012-03-15

'Protected Plutonium Production (P3)' has been proposed to enhance the proliferation resistance of plutonium by the transmutation of Minor Actinides (MA). For example, adding a small amount of Minor Actinides with large neutron capture cross-sections, such as 237Np or 241Am, to uranium fuel enhances the production of 238Pu, whose high spontaneous fission neutron rate degrades the quality of a nuclear weapon and makes its manufacture and maintenance technologically difficult; this is very effective for improving the isotopic barrier against the proliferation of plutonium. To demonstrate the P3 mechanism experimentally, U samples with 2, 5 and 10% 237Np doping were irradiated in the Advanced Test Reactor (ATR) of INL. The fuel test samples were removed from the core at 100, 200 and 300 effective full power days (EFPD), and post-irradiation examination was then completed at the chemistry laboratory of Idaho National Laboratory (INL). The theoretical results for the P3 mechanism predict the experimental ones quite well. An evaluation function, 'Attractiveness', was introduced as the ratio of a function of the Rossi-alpha to the 'Technical Difficulties for Fission Explosive Device Use'. The Rossi-alpha, defined as the ratio of super-criticality to prompt neutron lifetime, is a meaningful measure of the explosive yield. The Technical Difficulties for Fission Explosive Device Use can be expressed as a function of the specific decay heat, the spontaneous fission neutron rate, and the radiation of plutonium metal. The original methodology for evaluating the Attractiveness of plutonium has been improved by considering the effect of the compression of the plutonium isotopes and also the pre-detonation probability due to the spontaneous fission neutron rate, and was applied to the categorization of plutonium from conventional reactors and from innovative reactors based on the P3 mechanism. In the present paper, the fundamentals of P3 mechanism, the experimental demonstration of P3

  5. Methodology development for plutonium categorization and enhancement of proliferation resistance by P3 mechanism

    International Nuclear Information System (INIS)

    Saito, M.; Kimura, Y.; Sagara, H.; Han, C. Y.; Koyama, S.

    2012-01-01

'Protected Plutonium Production (P3)' has been proposed to enhance the proliferation resistance of plutonium by the transmutation of Minor Actinides (MA). For example, adding a small amount of Minor Actinides with large neutron capture cross-sections, such as 237Np or 241Am, to uranium fuel enhances the production of 238Pu, whose high spontaneous fission neutron rate degrades the quality of a nuclear weapon and makes its manufacture and maintenance technologically difficult; this is very effective for improving the isotopic barrier against the proliferation of plutonium. To demonstrate the P3 mechanism experimentally, U samples with 2, 5 and 10% 237Np doping were irradiated in the Advanced Test Reactor (ATR) of INL. The fuel test samples were removed from the core at 100, 200 and 300 effective full power days (EFPD), and post-irradiation examination was then completed at the chemistry laboratory of Idaho National Laboratory (INL). The theoretical results for the P3 mechanism predict the experimental ones quite well. An evaluation function, 'Attractiveness', was introduced as the ratio of a function of the Rossi-alpha to the 'Technical Difficulties for Fission Explosive Device Use'. The Rossi-alpha, defined as the ratio of super-criticality to prompt neutron lifetime, is a meaningful measure of the explosive yield. The Technical Difficulties for Fission Explosive Device Use can be expressed as a function of the specific decay heat, the spontaneous fission neutron rate, and the radiation of plutonium metal. The original methodology for evaluating the Attractiveness of plutonium has been improved by considering the effect of the compression of the plutonium isotopes and also the pre-detonation probability due to the spontaneous fission neutron rate, and was applied to the categorization of plutonium from conventional reactors and from innovative reactors based on the P3 mechanism. In the present paper, the fundamentals of the P3 mechanism, the experimental demonstration of the P3 mechanism in the ATR of INL, and the methodology
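The 'technical difficulty' side of the attractiveness argument rests on the decay heat and spontaneous-fission neutron background that 238Pu doping adds. A minimal sketch of that effect, using approximate per-gram isotopic values from the open literature; treat both the values and the isotopic vectors as illustrative, not as data from this paper:

```python
# Approximate per-gram properties of Pu isotopes (open-literature values, illustrative):
# specific decay heat [W/g] and spontaneous fission neutron rate [n/(s*g)].
HEAT = {"Pu238": 0.567, "Pu239": 0.0019, "Pu240": 0.0070, "Pu241": 0.0032, "Pu242": 0.0001}
SF   = {"Pu238": 2.6e3, "Pu239": 2.2e-2, "Pu240": 9.1e2,  "Pu241": 4.9e-2, "Pu242": 1.7e3}

def bulk_properties(fractions):
    """Mass-weighted decay heat and SF neutron rate of an isotopic mixture."""
    heat = sum(f * HEAT[i] for i, f in fractions.items())
    sfn = sum(f * SF[i] for i, f in fractions.items())
    return heat, sfn

# Hypothetical isotopic vectors: undoped weapons-grade vs. 5% 238Pu "protected" material.
weapons = {"Pu238": 0.000, "Pu239": 0.930, "Pu240": 0.065, "Pu241": 0.005, "Pu242": 0.000}
protected = {"Pu238": 0.050, "Pu239": 0.880, "Pu240": 0.065, "Pu241": 0.005, "Pu242": 0.000}

h_w, s_w = bulk_properties(weapons)
h_p, s_p = bulk_properties(protected)   # doping raises heat and SF neutron background
```

Even a few percent of 238Pu multiplies the decay heat by an order of magnitude and substantially raises the SF neutron rate, which is the quantitative basis of the isotopic barrier.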

  6. Mapping plant species ranges in the Hawaiian Islands: developing a methodology and associated GIS layers

    Science.gov (United States)

    Price, Jonathan P.; Jacobi, James D.; Gon, Samuel M.; Matsuwaki, Dwight; Mehrhoff, Loyal; Wagner, Warren; Lucas, Matthew; Rowe, Barbara

    2012-01-01

    This report documents a methodology for projecting the geographic ranges of plant species in the Hawaiian Islands. The methodology consists primarily of the creation of several geographic information system (GIS) data layers depicting attributes related to the geographic ranges of plant species. The most important spatial-data layer generated here is an objectively defined classification of climate as it pertains to the distribution of plant species. By examining previous zonal-vegetation classifications in light of spatially detailed climate data, broad zones of climate relevant to contemporary concepts of vegetation in the Hawaiian Islands can be explicitly defined. Other spatial-data layers presented here include the following: substrate age, as large areas of the island of Hawai'i, in particular, are covered by very young lava flows inimical to the growth of many plant species; biogeographic regions of the larger islands that are composites of multiple volcanoes, as many of their species are restricted to a given topographically isolated mountain or a specified group of them; and human impact, which can reduce the range of many species relative to where they formerly were found. Other factors influencing the geographic ranges of species that are discussed here but not developed further, owing to limitations in rendering them spatially, include topography, soils, and disturbance. A method is described for analyzing these layers in a GIS, in conjunction with a database of species distributions, to project the ranges of plant species, which include both the potential range prior to human disturbance and the projected present range. Examples of range maps for several species are given as case studies that demonstrate different spatial characteristics of range. Several potential applications of species-range maps are discussed, including facilitating field surveys, informing restoration efforts, studying range size and rarity, studying biodiversity, managing
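The range-projection logic (intersect the climate, substrate-age, and biogeographic-region layers, then subtract human impact) can be sketched as a boolean raster overlay; the tiny 3×3 rasters below are purely illustrative stand-ins for the GIS layers:

```python
import numpy as np

# Hypothetical rasters over a small area (True = condition met in that cell).
climate_ok = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]], dtype=bool)   # suitable climate zone
substrate_ok = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=bool) # lava flow old enough
in_region = np.array([[1, 1, 1], [1, 1, 1], [1, 1, 0]], dtype=bool)    # biogeographic region
intact = np.array([[0, 1, 1], [1, 1, 1], [1, 1, 1]], dtype=bool)       # not human-converted

potential_range = climate_ok & substrate_ok & in_region   # range prior to human disturbance
present_range = potential_range & intact                  # projected present range
```

In a real GIS the same intersection is performed per species, with the climate layer selected from the species' known zonal affinities.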

  7. DEVELOPMENT OF METHODOLOGY FOR THE CALCULATION OF THE PROJECT INNOVATION INDICATOR AND ITS CRITERIA COMPONENTS

    Directory of Open Access Journals (Sweden)

    Mariya Vishnevskaya

    2017-12-01

Full Text Available The problem studied in the article has two main components. On the practical level, there is a need for convenient tools allowing comprehensive evaluation of a proposed innovative project in terms of its suitability for inclusion in a portfolio or development program; on the scientific level, there is a need to improve and complement the existing methodology for assessing the attractiveness of innovative projects in the context of their properties and a specific set of components. The research is applied science, since the solution of the problem involves the science-based development of a set of techniques allowing the practical use of knowledge extracted from large information arrays at the initialization stage. The purpose of the study is the formation of an integrated indicator of project innovation, with a substantive justification of the calculation method, as a tool for the evaluation and selection of projects to be included in a portfolio of projects and programs. The theoretical and methodological basis of the research consists of the conceptual provisions and scientific developments of experts on project management issues, published in monographs, periodicals, and the proceedings of scientific and practical conferences on the topic of the research. The tasks were solved using general scientific and special methods, including mathematical modelling methods based on the system approach. Results. A balanced system of parametric single indicators of innovation is presented, covering risks, personnel, quality, innovation, resources, and performers, which provides a comprehensive picture of any project already at the initial stages. The choice of risk tolerance as the key criterion of the 'risks' element, and of the reference characteristics against which a potential project can be judged promising, is substantiated. A tool for calculating risk tolerance based on the use of matrices and vector analysis is proposed

  8. Development and validation of a new turbocharger simulation methodology for marine two stroke diesel engine modelling and diagnostic applications

    International Nuclear Information System (INIS)

    Sakellaridis, Nikolaos F.; Raptotasios, Spyridon I.; Antonopoulos, Antonis K.; Mavropoulos, Georgios C.; Hountalas, Dimitrios T.

    2015-01-01

    Engine cycle simulation models are increasingly used in diesel engine simulation and diagnostic applications, reducing experimental effort. Turbocharger simulation plays an important role in a model's ability to accurately predict engine performance and emissions. The present work describes the development of a complete engine simulation model for marine diesel engines based on a new methodology for turbocharger modelling that utilizes physically based meanline models for the compressor and turbine. Simulation accuracy is evaluated against engine bench measurements. The methodology was developed to overcome the limited availability of experimental compressor and turbine maps, often encountered in large marine diesel engine simulation and diagnostic studies. Data from the engine bench are used to calibrate the models and to estimate the turbocharger shaft mechanical efficiency. The closed cycle and gas exchange are modelled using an existing multizone thermodynamic model. The proposed methodology is applied to a 2-stroke marine diesel engine and evaluated by comparing predictions against measured engine data. The model is shown to predict engine response to load variation for both turbocharger performance and closed cycle parameters, as well as NOx emission trends, making it an effective tool for both engine diagnostic and optimization studies. - Highlights: • Marine two-stroke diesel engine simulation model. • Turbine and compressor simulation using physical meanline models. • Methodology to derive T/C component efficiency and T/C shaft mechanical efficiency. • Extensive validation of predictions against experimental data.
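The shaft mechanical efficiency estimation mentioned in the highlights can be illustrated with standard isentropic power relations. This is a generic power-balance sketch, not the authors' meanline model; all operating-point values (mass flows, temperatures, pressure ratios, component efficiencies) are invented for illustration.

```python
# Generic turbocharger power-balance sketch; all numbers are illustrative.

def compressor_power(m_dot, T_in, pr, eta_c, cp=1005.0, gamma=1.4):
    """Power absorbed by the compressor [W], from isentropic relations."""
    dT_ideal = T_in * (pr ** ((gamma - 1.0) / gamma) - 1.0)
    return m_dot * cp * dT_ideal / eta_c

def turbine_power(m_dot, T_in, pr, eta_t, cp=1100.0, gamma=1.33):
    """Power delivered by the turbine [W]; pr is the expansion ratio."""
    dT_ideal = T_in * (1.0 - pr ** (-(gamma - 1.0) / gamma))
    return m_dot * cp * dT_ideal * eta_t

# At a steady operating point the shaft balance gives eta_mech = Wc / Wt,
# so bench data that fix both powers also fix the mechanical efficiency.
w_c = compressor_power(m_dot=8.0, T_in=300.0, pr=3.2, eta_c=0.80)
w_t = turbine_power(m_dot=8.2, T_in=750.0, pr=2.9, eta_t=0.82)
eta_m = w_c / w_t
```

In a diagnostic setting, a drift of the inferred eta_m away from its calibrated value would flag a turbocharger fault.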

  9. A methodology for fault diagnosis in large chemical processes and an application to a multistage flash desalination process: Part I

    International Nuclear Information System (INIS)

    Tarifa, Enrique E.; Scenna, Nicolas J.

    1998-01-01

    This work presents a new strategy for fault diagnosis in large chemical processes (E.E. Tarifa, Fault diagnosis in complex chemical plants: plants of large dimensions and batch processes. Ph.D. thesis, Universidad Nacional del Litoral, Santa Fe, 1995). The plant is first decomposed into sectors, and each sector is then studied independently. These steps are carried out off-line and produce vital information for the diagnosis system. The diagnosis system works on-line and is based on a two-tier strategy: when a fault occurs, the upper level identifies the faulty sector, and the lower level then carries out an in-depth study, focused only on the critical sectors, to identify the fault. The loss of information caused by the process partition may produce spurious diagnoses; this problem is overcome at the second level using qualitative simulation and fuzzy logic. In the second part of this work, the new methodology is tested to evaluate its performance in practical cases. A multistage flash desalination system (MSF) is chosen because it is a complex system, with many recycles and many variables to be supervised. The steps of knowledge base generation and all the blocks included in the diagnosis system are analyzed. The diagnosis performance is evaluated using a rigorous dynamic simulator

  10. A methodology for fault diagnosis in large chemical processes and an application to a multistage flash desalination process: Part II

    International Nuclear Information System (INIS)

    Tarifa, Enrique E.; Scenna, Nicolas J.

    1998-01-01

    In Part I, an efficient method for identifying faults in large processes was presented. The whole plant is divided into sectors using structural, functional, or causal decomposition. A signed directed graph (SDG), representing the interactions among process variables, is the model used for each sector. This qualitative model is used to carry out qualitative simulation of all possible faults; the output of this step is information about the process behaviour, which is used to build rules. When a symptom is detected in a sector, its rules are evaluated using on-line data and fuzzy logic to yield the diagnosis. In this paper the proposed methodology is applied to a multistage flash (MSF) desalination process, composed of sequential flash chambers. It was designed for a pilot plant that produces drinkable water for a community in Argentina; that is, it is a real case. Due to the large number of variables, recycles, phase changes, etc., this process is a good challenge for the proposed diagnosis method
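The rule-evaluation step described above can be sketched as follows. The variables, fault patterns, deviation scale and membership functions are all hypothetical stand-ins for the SDG-derived rules of the paper: each rule maps an expected qualitative deviation pattern ('+' high, '-' low) to a fault, the rule truth is the fuzzy AND (minimum) of its antecedent memberships, and the diagnosis is the best-supported fault.

```python
# Illustrative fuzzy evaluation of SDG-derived rules; names are invented.

def mu_high(dev):
    """Membership of 'deviation is high' (simple ramp saturating at +2)."""
    return max(0.0, min(1.0, dev / 2.0))

def mu_low(dev):
    """Membership of 'deviation is low' (ramp saturating at -2)."""
    return max(0.0, min(1.0, -dev / 2.0))

RULES = {
    "pump_failure":   {"flow": "-", "pressure": "-"},
    "valve_blockage": {"flow": "-", "pressure": "+"},
}

def diagnose(deviations):
    scores = {}
    for fault, pattern in RULES.items():
        memberships = [
            mu_high(deviations[var]) if sign == "+" else mu_low(deviations[var])
            for var, sign in pattern.items()
        ]
        scores[fault] = min(memberships)   # fuzzy AND over antecedents
    return max(scores, key=scores.get), scores

# Low flow with high pressure matches the blockage pattern.
fault, scores = diagnose({"flow": -1.8, "pressure": 1.5})
```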

  11. Development of the methodology and approaches to validate safety and accident management

    International Nuclear Information System (INIS)

    Asmolov, V.G.

    1997-01-01

    The article compares the development of methodologies and approaches for validating nuclear power plant safety and accident management in Russia and in advanced industrial countries. It demonstrates that the development of safety validation methods is dialectically related to the accumulation of the knowledge base on processes and events during NPP normal operation, transients, and emergencies, including severe accidents. The article describes the Russian severe accident research program (1987-1996), whose implementation allowed Russia to reach the world level in safety validation efforts, and presents future high-priority study areas. Problems related to possible approaches to the development of accident management methodology are discussed. (orig.)

  12. Development and application of a methodology for identifying and characterising scenarios

    International Nuclear Information System (INIS)

    Billington, D.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed performance assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. It is intended that assessment of the base scenario would form the core of any future performance assessment. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment; it is defined to include all those FEPs which are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. Variant scenarios are defined by FEPs which represent a significant perturbation to the natural system evolution, for example the occurrence of a large seismic event. A variant scenario defined by a single initiating FEP is characterised by a sequence of events, represented as a 'timeline' which forms the basis for modelling that scenario. To generate a variant scenario defined by two initiating FEPs, a methodology is presented for combining the timelines of the two underlying 'single-FEP' variants. The resulting series of event sequences can be generated automatically. These sequences are then reviewed in order to reduce the number of timelines requiring detailed consideration. This is achieved in two ways: by aggregating sequences which have similar consequences in terms of safety performance, and by combining successive intervals along a timeline where appropriate. In the context of a performance assessment, the aim is to determine the conditional risk and appropriate weight for each
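The combination of two single-FEP timelines can be read as generating every order-preserving merge of the two event sequences, which are then reviewed and aggregated. A minimal sketch under that reading, with invented event names:

```python
# Generate all merges of two event sequences that preserve each sequence's
# internal order: one combined timeline per choice of positions for the
# first sequence's events.
from itertools import combinations

def interleavings(a, b):
    """All order-preserving merges of sequences a and b."""
    n, m = len(a), len(b)
    results = []
    for positions in combinations(range(n + m), n):
        pos_set = set(positions)
        ai, bi = iter(a), iter(b)
        merged = [next(ai) if i in pos_set else next(bi) for i in range(n + m)]
        results.append(merged)
    return results

# Two hypothetical single-FEP variants: a seismic event and a borehole
# seal failure. Three combined timelines result (C(3,1) placements).
seismic  = ["seismic_event", "fracture_change"]
borehole = ["borehole_seal_failure"]
sequences = interleavings(seismic, borehole)
```

The review step described in the abstract would then prune or aggregate these candidate timelines before detailed consequence modelling.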

  13. Which spatial discretization for distributed hydrological models? Proposition of a methodology and illustration for medium to large-scale catchments

    Directory of Open Access Journals (Sweden)

    J. Dehotin

    2008-05-01

    Full Text Available Distributed hydrological models are valuable tools to derive distributed estimation of water balance components or to study the impact of land-use or climate change on water resources and water quality. In these models, the choice of an appropriate spatial discretization is a crucial issue. It is obviously linked to the available data, their spatial resolution and the dominant hydrological processes. For a given catchment and a given data set, the "optimal" spatial discretization should be adapted to the modelling objectives, as the latter determine the dominant hydrological processes considered in the modelling. For small catchments, landscape heterogeneity can be represented explicitly, whereas for large catchments such fine representation is not feasible and simplification is needed. The question is thus: is it possible to design a flexible methodology to represent landscape heterogeneity efficiently, according to the problem to be solved? This methodology should allow a controlled and objective trade-off between available data, the scale of the dominant water cycle components and the modelling objectives.

    In this paper, we propose a general methodology for such catchment discretization. It is based on the use of nested discretizations. The first level of discretization is composed of the sub-catchments, organised by the river network topology. The sub-catchment variability can be described using a second level of discretization, called hydro-landscape units. This level of discretization is only performed if it is consistent with the modelling objectives, the active hydrological processes and data availability. The hydro-landscapes take into account different geophysical factors such as topography, land-use and pedology, as well as suitable hydrological discontinuities such as ditches, hedges, dams, etc. For numerical reasons these hydro-landscapes can be further subdivided into smaller elements that will constitute the
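One plausible reading of the nested discretization, sketched with invented attribute names: model elements are grouped first by sub-catchment, then, within each sub-catchment, into hydro-landscape units defined by a shared geophysical signature (here land use and soil class).

```python
# Hypothetical two-level grouping of model cells into sub-catchments and
# hydro-landscape units; the attributes and ids are illustrative only.
from collections import defaultdict

cells = [
    {"id": 1, "subcatchment": "A", "landuse": "forest", "soil": "loam"},
    {"id": 2, "subcatchment": "A", "landuse": "forest", "soil": "loam"},
    {"id": 3, "subcatchment": "A", "landuse": "crop",   "soil": "clay"},
    {"id": 4, "subcatchment": "B", "landuse": "crop",   "soil": "clay"},
]

def discretize(cells):
    """Nested mapping: sub-catchment -> hydro-landscape unit -> cell ids."""
    nested = defaultdict(lambda: defaultdict(list))
    for c in cells:
        unit = (c["landuse"], c["soil"])   # hydro-landscape signature
        nested[c["subcatchment"]][unit].append(c["id"])
    return nested

units = discretize(cells)
# Sub-catchment "A" splits into two hydro-landscape units; "B" into one.
```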

  14. Development of a Methodology to Gather Seated Anthropometry in a Microgravity Environment

    Science.gov (United States)

    Rajulu, Sudhakar; Young, Karen; Mesloh, Miranda

    2009-01-01

    The Constellation Program's Crew Exploration Vehicle (CEV) is required to accommodate the full population range of crewmembers according to the anthropometry requirements stated in the Human-Systems Integration Requirement (HSIR) document (CxP70024). Seated height is one of many critical dimensions of importance to the CEV designers in determining the optimum seat configuration in the vehicle. Changes in seated height may have a large impact on the design, accommodation, and safety of the crewmembers. Seated height can change due to elongation of the spine when crewmembers are exposed to microgravity. Spinal elongation is the straightening of the natural curvature of the spine and the expansion of inter-vertebral disks; this straightening occurs due to fluid shifts in the body and the lack of compressive forces on the spinal vertebrae. Previous studies have shown that as the natural curvature of the spine straightens, overall height increases by 3% of stature, which has been the basis of the current HSIR requirements. However, due to variations in the torso/leg ratio and the impact of soft tissue, no data exist on how spinal elongation specifically affects the measurement of seated height. In order to obtain such data, an experiment was designed to collect spinal elongation data in a seated posture in microgravity. The purpose of this study was to provide quantitative data on the amount of change that occurs in seated height due to spinal elongation in microgravity environments. Given the schedule and budget constraints of ISS and Shuttle missions and the uniqueness of the problem, a methodology had to be developed to ensure that the seated height measurements were accurately collected. Therefore, simulated microgravity evaluations were conducted to test the methodology and procedures of the experiment. This evaluation obtained seat pan pressure and seated height data to a) ensure that the lap restraint provided sufficient
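The 3%-of-stature basis quoted above translates directly into an upper bound on the microgravity height gain; how that gain splits between seated height and leg length is exactly what the experiment measures, so this sketch stops at the stature-based figure. The stature value is illustrative.

```python
# Stature-based upper bound on height gain from spinal elongation,
# using the 3%-of-stature figure cited as the HSIR basis.

def microgravity_height_gain(stature_cm, elongation_fraction=0.03):
    """Height increase [cm] implied by the 3%-of-stature rule."""
    return stature_cm * elongation_fraction

stature = 185.0                            # cm, illustrative crewmember
gain = microgravity_height_gain(stature)   # 3% of 185 cm = 5.55 cm
```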

  15. Security Testing in Agile Web Application Development - A Case Study Using the EAST Methodology

    CERN Document Server

    Erdogan, Gencer

    2010-01-01

    There is a need for improved security testing methodologies specialized for Web applications and their agile development environment. The number of web application vulnerabilities is drastically increasing, while security testing tends to be given a low priority. In this paper, we analyze and compare Agile Security Testing with two other common methodologies for Web application security testing, and then present an extension of this methodology. We present a case study showing how our Extended Agile Security Testing (EAST) performs compared to a more ad hoc approach used within an organization. Our working hypothesis is that the detection of vulnerabilities in Web applications will be significantly more efficient when using a structured security testing methodology specialized for Web applications, compared to existing ad hoc ways of performing security tests. Our results show a clear indication that our hypothesis is on the right track.

  16. The Development and Significance of Standards for Smoking-Machine Methodology

    Directory of Open Access Journals (Sweden)

    Baker R

    2014-12-01

    Full Text Available Bialous and Yach have recently published an article in Tobacco Control in which they claim that all smoking-machine standards stem from a method developed unilaterally by the tobacco industry within the Cooperation Centre for Scientific Research Relative to Tobacco (CORESTA). Using a few highly selective quotations from internal tobacco company memos, they allege, inter alia, that the tobacco industry has changed the method to suit its own needs, that because humans do not smoke like machines the standards are of little value, and that the tobacco industry has unjustifiably made health claims about low “tar” cigarettes. The objectives of this paper are to review the development of smoking-machine methodology and standards and the involvement of the relevant parties, to outline the significance of the results, and to explore the validity of Bialous and Yach's claims. The large volume of published scientific information on the subject, together with other information in the public domain, has been consulted. When this information is taken into account it becomes obvious that the very narrow and restricted literature base of Bialous and Yach's analysis has resulted in them, perhaps inadvertently, making factual errors, drawing wrong conclusions and writing inaccurate statements on many aspects of the subject. The first smoking-machine standard was specified by the Federal Trade Commission (FTC), a federal government agency in the USA, in 1966. The CORESTA Recommended Method, similar in many respects to that of the FTC, was developed in the late 1960s and published in 1969. Small differences in the butt lengths, smoke collection and analytical procedures of the methods later used in various countries, including Germany, Canada and the UK, resulted in about a 10% difference in smoke “tar” yields. These differences in methodology were harmonised in a common International Organisation for Standardisation (ISO) Standard Method in 1991, after a considerable amount

  17. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    Science.gov (United States)

    Williams, Lawrence H., Jr.

    2013-01-01

    This qualitative study analyzed the experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, follow traditional, plan-driven approaches, others allow software requirements and design to evolve, facilitating ambiguity and uncertainty by…

  18. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  19. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    Science.gov (United States)

    1995-01-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  20. The economics of climate change mitigation in developing countries - methodological and empirical results

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.

    1997-12-01

    This thesis presents a methodological and empirical discussion of the costs associated with implementing greenhouse gas reduction strategies in developing countries. It presents a methodological framework for national costing studies and evaluates a number of associated valuation methods. The methodological framework has been applied in several developing countries as part of a UNEP project in which the author has participated, and reference is made to the results of these country studies. Some of the theoretical issues associated with the determination of the costs of emission reductions are discussed with reference to a number of World Bank and UN guidelines for project analysis in developing countries. The use of several accounting prices is recommended for mitigation projects, with a distinction being made between internationally and domestically traded goods. The consequences of using different accounting prices are discussed with respect to the methodology applied in the UNEP country studies. In conclusion the thesis reviews the results of some of the most important international studies of greenhouse gas emissions in developing countries. The review, which encompasses a total of 27 country studies, was undertaken by the author for the Intergovernmental Panel of Climate Change, the IPCC. Its conclusion is that the UNEP methodological framework and associated country study results are consistent with the recommendations and conclusions of the IPCC. (EG) 23 refs.
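The accounting-price distinction described above, between internationally and domestically traded goods, can be sketched as follows: traded inputs are valued at border prices, non-traded inputs at domestic prices scaled by a conversion factor. The conversion factor and all input data are invented for illustration.

```python
# Hedged sketch of project costing with accounting (shadow) prices:
# border prices for traded goods, a conversion factor for non-traded ones.

def economic_cost(inputs, conversion_factor=0.85):
    """Total project cost using accounting prices; figures illustrative."""
    total = 0.0
    for item in inputs:
        if item["traded"]:
            total += item["qty"] * item["border_price"]
        else:
            total += item["qty"] * item["domestic_price"] * conversion_factor
    return total

inputs = [
    {"qty": 100, "traded": True,  "border_price": 50.0},    # e.g. imported equipment
    {"qty": 200, "traded": False, "domestic_price": 10.0},  # e.g. local labour
]
cost = economic_cost(inputs)   # 100*50 + 200*10*0.85 = 6700.0
```

Varying the conversion factor shows how the choice of accounting prices shifts the apparent cost of a mitigation project, which is the sensitivity the thesis discusses for the UNEP country studies.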

  1. The economics of climate change mitigation in developing countries - methodological and empirical results

    International Nuclear Information System (INIS)

    Halsnaes, K.

    1997-12-01

    This thesis presents a methodological and empirical discussion of the costs associated with implementing greenhouse gas reduction strategies in developing countries. It presents a methodological framework for national costing studies and evaluates a number of associated valuation methods. The methodological framework has been applied in several developing countries as part of a UNEP project in which the author has participated, and reference is made to the results of these country studies. Some of the theoretical issues associated with the determination of the costs of emission reductions are discussed with reference to a number of World Bank and UN guidelines for project analysis in developing countries. The use of several accounting prices is recommended for mitigation projects, with a distinction being made between internationally and domestically traded goods. The consequences of using different accounting prices are discussed with respect to the methodology applied in the UNEP country studies. In conclusion the thesis reviews the results of some of the most important international studies of greenhouse gas emissions in developing countries. The review, which encompasses a total of 27 country studies, was undertaken by the author for the Intergovernmental Panel of Climate Change, the IPCC. Its conclusion is that the UNEP methodological framework and associated country study results are consistent with the recommendations and conclusions of the IPCC. (EG) 23 refs

  2. Methodological and Methodical Principles of the Empirical Study of Spiritual Development of a Personality

    Directory of Open Access Journals (Sweden)

    Olga Klymyshyn

    2017-06-01

    Full Text Available The article reveals the essence of the methodological principles of the spiritual development of a personality. It takes into consideration the results of a theoretical analysis of the psychological content of spirituality from the standpoint of a systemic and structural approach to the study of personality, the age patterns of mental development, the sacramental nature of the human person, and the mechanisms of human spiritual development. An interpretation of spirituality and the spiritual development of a personality is given. The initial principles for organizing empirical research on the spiritual development of a personality (ontogenetic, sociocultural, self-determination, systemic) are presented. Parameters for estimating a personality's spiritual development are described: a general index of the development of spiritual potential; indexes of the development of the ethical, aesthetical, cognitive and existential components of spirituality; and an index of the religiousness of a personality. The methodological support of the psychological diagnostic research is defined.

  3. The status of proliferation resistance evaluation methodology development in GEN IV international forum

    International Nuclear Information System (INIS)

    Inoue, Naoko; Kawakubo, Yoko; Seya, Michio; Suzuki, Mitsutoshi; Kuno, Yusuke; Senzaki, Masao

    2010-01-01

    The Generation IV Nuclear Energy Systems International Forum (GIF) Proliferation Resistance and Physical Protection Working Group (PR and PP WG) was established in December 2002 in order to develop the PR and PP evaluation methodology for GEN IV nuclear energy systems. The methodology has been studied and established by international consensus. The PR and PP WG activities include the development of measures and metrics; the establishment of the framework of PR and PP evaluation; a demonstration study using the Example Sodium Fast Reactor (ESFR), which included the development of three evaluation approaches; a case study using the ESFR and four kinds of threat scenarios; a joint study with the GIF System Steering Committees (SSCs) of the six reactor design concepts; and a harmonization study with the IAEA's International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO). This paper reviews the status of the GIF PR and PP studies and identifies the challenges and directions for applying the methodology to evaluate future nuclear energy systems in Japan. (author)

  4. Recent developments in methodology for dynamic qualification of nuclear plant equipment

    International Nuclear Information System (INIS)

    Kana, D.D.; Pomerening, D.J.

    1984-01-01

    Dynamic qualification of nuclear plant electrical and mechanical equipment is performed basically under guidelines given in IEEE Standards 323 and 344, and a variety of NRC regulatory guides. Over the last fifteen years qualification methodology prescribed by these documents has changed significantly as interpretations, equipment capability, and imagination of the qualification engineers have progressed. This progress has been sparked by concurrent NRC and industry sponsored research programs that have identified anomalies and developed new methodologies for resolving them. Revisions of the standards have only resulted after a lengthy debate of all such new information and subsequent judgment of its validity. The purpose of this paper is to review a variety of procedural improvements and developments in qualification methodology that are under current consideration as revisions to the standards. Many of the improvements and developments have resulted from recent research programs. All are very likely to appear in one type of standard or another in the near future

  5. Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities

    Science.gov (United States)

    Shivanand M., Handigund; Shweta, Bhat

    The Software Requirements Specification (SRS) of an organization is a text document prepared by strategic management incorporating the requirements of the organization. The requirements of the ongoing business/project development process involve software tools, hardware devices, manual procedures, application programs and communication commands. These components are appropriately ordered, to achieve the mission of the concerned process in both project development and ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though the two methodologies are independent, each complements the other in authenticating its correctness and completeness.

  6. Developing Agent-Oriented Video Surveillance System through Agent-Oriented Methodology (AOM)

    Directory of Open Access Journals (Sweden)

    Cheah Wai Shiang

    2016-12-01

    Full Text Available Agent-oriented methodology (AOM) is a comprehensive and unified agent methodology for agent-oriented software development. Although AOM is claimed to be able to cope with complex system development, it has not yet been determined to what extent this is true. It is therefore vital to conduct an investigation to validate the methodology. This paper presents the adoption of AOM in developing an agent-oriented video surveillance system (VSS). An intruder-handling scenario is designed and implemented through AOM. AOM provides an alternative method to engineer a distributed security system in a systematic manner: it presents the security system from a holistic view, provides a better conceptualization of an agent-oriented security system, and supports rapid prototyping as well as simulation of the video surveillance system.

  7. Development and delivery of a workshop methodology: planning for biomass power plant projects

    Energy Technology Data Exchange (ETDEWEB)

    Gray, A.J.; Delbridge, P.; Trevorrow, E.; Pile, C.

    2001-07-01

    This report gives details of the approach used to develop a workshop methodology to help planners and stakeholders address key issues that may arise when submitting a planning application for a biomass power plant in the light of the UK government's energy and climate change targets. The results of interviews with stakeholders (central government, regulatory authorities, developers, planners, non-governmental organisations, local community, resident groups) are summarised, and the NIMBY (not in my back yard) syndrome, the lack of trust in the developer, and lack of awareness of the use of biomass are discussed. Details are given of the design and testing of the workshop methodology and the resulting workshop methodology and workbook guide aimed at understanding the stakeholder issues and concerns through stakeholder discussions.

  8. The Methodology of the Process of Formation of Innovation Management of Enterprises’ Development

    Directory of Open Access Journals (Sweden)

    Prokhorova Viktoriia V.

    2017-12-01

    Full Text Available The article aims to form a methodology for the process of innovation management of enterprises' development under modern conditions. The formation of the essence of methodology is studied, and the stages of development of the methods and means of scientific cognition are analyzed. The basic components of a methodology for innovation management of enterprise development are defined: methods, types, principles, components, and a systematized aggregate. The relations between empirical and theoretical methods of scientific cognition are considered and defined. It is determined that the growing volume and scope of scientific views, and the deepening of scientific knowledge in disclosing the laws and regularities of the functioning of the real natural and social world, lead objectively to scientists' desire to analyze the methods and means by which modern innovative knowledge and views can be acquired and formed in the enterprise management system.

  9. Development and application of a methodology for the analysis of significant human related event trends in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, H.Y.

    1981-01-01

    A methodology is developed to identify and flag significant trends related to the safety and availability of U.S. commercial nuclear power plants, with the intent of reducing the likelihood of human errors. To ensure that the methodology can be easily adapted to various classification schemes for operating data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme was selected. Significance criteria were developed for human-initiated events affecting the systems and for events caused by human deficiencies. Clustering analysis was used to verify the learning trend in multidimensional histograms. A computer code based on the K-Means algorithm was developed and applied to find the learning period, in which error rates decrease monotonically with plant age. Freeman-Tukey (F-T) deviates are used to flag generic problems, identified by a large positive deviate (here approximately above 2.0). The identified generic problems are: decision errors, highly associated with reactor startup operations during the learning period of PWR plants (PWRs); response errors, highly associated with Secondary Non-Nuclear Systems (SNS) in PWRs; and significant system-affecting errors caused by response actions, highly associated with the startup reactor mode in BWRs
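The Freeman-Tukey screening step can be sketched with the standard F-T deviate formula and the 2.0 cutoff quoted above; the event categories, observed counts and expected counts here are invented for illustration.

```python
# Flag histogram cells whose Freeman-Tukey deviate exceeds the cutoff.
import math

def freeman_tukey_deviate(observed, expected):
    """F-T deviate: sqrt(o) + sqrt(o + 1) - sqrt(4e + 1)."""
    return (math.sqrt(observed) + math.sqrt(observed + 1.0)
            - math.sqrt(4.0 * expected + 1.0))

# Hypothetical (observed, expected) counts for two event categories.
counts = {
    "decision/startup": (30, 12.0),   # far above expectation
    "response/SNS":     (8, 9.5),     # near expectation
}

CUTOFF = 2.0
generic = {cell: freeman_tukey_deviate(o, e)
           for cell, (o, e) in counts.items()
           if freeman_tukey_deviate(o, e) > CUTOFF}
```

Cells left in `generic` are the candidate generic problems; the variance-stabilizing F-T form behaves better than the raw Pearson residual at the small counts typical of event data.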

  10. Methods for the development of large computer codes under LTSS

    International Nuclear Information System (INIS)

    Sicilian, J.M.

    1977-06-01

    TRAC is a large computer code being developed by Group Q-6 for the analysis of the transient thermal hydraulic behavior of light-water nuclear reactors. A system designed to assist the development of TRAC is described. The system consists of a central HYDRA dataset, R6LIB, containing files used in the development of TRAC, and a file maintenance program, HORSE, which facilitates the use of this dataset

  11. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    International Nuclear Information System (INIS)

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-01-01

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report

  12. Methodology Development for SiC Sensor Signal Modelling in the Nuclear Reactor Radiation Environments

    International Nuclear Information System (INIS)

    Cetnar, J.; Krolikowski, I.P.

    2013-06-01

    This paper deals with a SiC detector simulation methodology covering signal formation by neutrons and induced secondary radiation, as well as its inverse interpretation. The primary goal is to achieve SiC capability for simultaneous spectroscopic measurement of neutrons and gamma-rays, for which an appropriate methodology for detector signal modelling and interpretation must be adopted. The detector simulation is divided into two distinct but interconnected parts. The first is the forward simulation of detector signal formation in the field of the primary neutron and secondary radiations; the second is the inverse problem of finding a representation of the primary radiation based on the measured detector signals. The methodology under development is based on Monte Carlo radiation transport and reactor physics analysis. The interpretation of SiC detector signals will draw on existing experience in neutron metrology developed for various neutron and gamma-ray detection systems. Since novel SiC-based sensors have a new structure yet to be finalized, the methodology for spectroscopic particle fluence measurement must be developed while giving productive feedback to the SiC sensor design process, in order to arrive at the best possible design. (authors)

  13. Development of performance assessment methodology for nuclear waste isolation in geologic media

    International Nuclear Information System (INIS)

    Bonano, E.J.; Chu, M.S.Y.; Cranwell, R.M.; Davis, P.A.

    1985-01-01

    The burial of nuclear wastes in deep geologic formations as a means for their disposal is an issue of significant technical and social impact. The analysis of the processes involved can be performed only with reliable mathematical models and computer codes, as opposed to experiments, because the associated time scales are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium: ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the US Nuclear Regulatory Commission. The approach followed consists of a description of the overall system (waste, facility, and site), scenario selection and screening, consequence modeling (source term, ground-water flow, radionuclide transport, biosphere transport, and health effects), and uncertainty and sensitivity analysis
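    The transport chain summarized above can be bounded with a back-of-envelope calculation combining sorption retardation and radioactive decay along the ground-water path. This is only a screening sketch under invented site parameters, not the consequence models of the methodology itself; dispersion and fracture flow are neglected.

```python
import math

def peak_arrival_and_decay(length_m, velocity_m_per_yr, retardation, half_life_yr):
    """Back-of-envelope screening for radionuclide migration:
    sorption retards the plume (t = R * L / v) and the inventory
    decays en route by exp(-lambda * t)."""
    travel_time = retardation * length_m / velocity_m_per_yr
    decay_const = math.log(2) / half_life_yr
    surviving_fraction = math.exp(-decay_const * travel_time)
    return travel_time, surviving_fraction

# Hypothetical parameters: 5 km path, 10 m/yr pore velocity,
# retardation factor R = 50 for a sorbing nuclide, 30-yr half-life.
t, f = peak_arrival_and_decay(5000, 10, 50, 30)
print(f"travel time {t:.0f} yr, surviving fraction {f:.2e}")
```

A strongly sorbed, short-lived nuclide decays away almost entirely in transit, which is why such screening often eliminates nuclides before detailed modeling.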

  14. Development of performance assessment methodology for nuclear waste isolation in geologic media

    Science.gov (United States)

    Bonano, E. J.; Chu, M. S. Y.; Cranwell, R. M.; Davis, P. A.

    The burial of nuclear wastes in deep geologic formations as a means for their disposal is an issue of significant technical and social impact. The analysis of the processes involved can be performed only with reliable mathematical models and computer codes, as opposed to experiments, because the associated time scales are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium: ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the U.S. Nuclear Regulatory Commission.

  15. Development of an Improved Methodology to Assess Potential Unconventional Gas Resources

    International Nuclear Information System (INIS)

    Salazar, Jesus; McVay, Duane A.; Lee, W. John

    2010-01-01

    Considering the important role played today by unconventional gas resources in North America and their enormous potential for the future around the world, it is vital to both policy makers and industry that the volumes of these resources and the impact of technology on these resources be assessed. To provide for optimal decision making regarding energy policy, research funding, and resource development, it is necessary to reliably quantify the uncertainty in these resource assessments. Since the 1970s, studies to assess potential unconventional gas resources have been conducted by various private and governmental agencies, the most rigorous of which was by the United States Geological Survey (USGS). The USGS employed a cell-based, probabilistic methodology which used analytical equations to calculate distributions of the resources assessed. USGS assessments have generally produced distributions for potential unconventional gas resources that, in our judgment, are unrealistically narrow for what are essentially undiscovered, untested resources. In this article, we present an improved methodology to assess potential unconventional gas resources. Our methodology is a stochastic approach that includes Monte Carlo simulation and correlation between input variables. Application of the improved methodology to the Uinta-Piceance province of Utah and Colorado with USGS data validates the means and standard deviations of resource distributions produced by the USGS methodology, but reveals that these distributions are not right skewed, as expected for a natural resource. Our investigation indicates that the unrealistic shape and width of the gas resource distributions are caused by the use of narrow triangular input parameter distributions. The stochastic methodology proposed here is more versatile and robust than the USGS analytic methodology. Adoption of the methodology, along with a careful examination and revision of input distributions, should allow a more realistic
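    The stochastic approach described, Monte Carlo simulation with correlation between input variables, can be sketched as follows. The distribution parameters, correlation, and cell counts are invented placeholders, not USGS inputs; the run simply illustrates why the resulting resource distribution comes out right-skewed (mean above median).

```python
import math
import random
import statistics

def sample_resource(n_draws=100_000, rho=0.5, seed=1):
    """Monte Carlo resource-assessment sketch: total gas =
    untested cells x fraction productive x mean EUR per cell, with
    the two uncertain factors drawn as correlated lognormals via a
    Gaussian copula (z2 = rho*z1 + sqrt(1 - rho^2)*eps)."""
    rng = random.Random(seed)
    cells = 10_000                       # untested cells in the play
    draws = []
    for _ in range(n_draws):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1 - rho * rho) * rng.gauss(0.0, 1.0)
        frac = min(1.0, math.exp(math.log(0.05) + 0.6 * z1))  # productive fraction
        eur = math.exp(math.log(0.3) + 0.8 * z2)              # Bcf per cell
        draws.append(cells * frac * eur)
    return draws

gas = sample_resource()
mean, median = statistics.fmean(gas), statistics.median(gas)
print(f"mean {mean:.0f} Bcf > median {median:.0f} Bcf -> right-skewed")
```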

  16. Development of an accident sequence precursor methodology and its application to significant accident precursors

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seung Hyun; Park, Sung Hyun; Jae, Moo Sung [Dept. of of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of)

    2017-03-15

    The systematic management of plant risk is crucial for enhancing the safety of nuclear power plants and for designing new nuclear power plants. Accident sequence precursor (ASP) analysis may be able to provide the risk significance of operational experience by using probabilistic risk assessment to evaluate an operational event quantitatively in terms of its impact on core damage. In this study, an ASP methodology for two operation modes, full power and low power/shutdown, has been developed and applied to significant accident precursors that may occur during the operation of nuclear power plants. Two operational events, loss of feedwater and steam generator tube rupture, are identified as ASPs. Therefore, the ASP methodology developed in this study may contribute to identifying plant risk significance as well as to enhancing the safety of nuclear power plants by applying this methodology systematically.
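    The core quantity in ASP analysis is the conditional core damage probability (CCDP) of an observed event. A minimal sketch for the loss-of-feedwater precursor mentioned above, using a drastically simplified two-branch event tree with invented branch probabilities:

```python
def ccdp_loss_of_feedwater(p_afw=1e-2, p_bleed_feed=1e-1):
    """Conditional core damage probability for an observed
    loss-of-feedwater event: the initiator has already happened
    (probability 1), and in this sketch core damage requires both
    auxiliary feedwater AND feed-and-bleed cooling to fail.
    Branch probabilities are illustrative placeholders."""
    return p_afw * p_bleed_feed

ccdp = ccdp_loss_of_feedwater()
# A common ASP screening practice is to flag events with CCDP >= 1e-6
# as precursors; 1e-3 here would rank as a significant precursor.
print(f"CCDP = {ccdp:.0e}, precursor: {ccdp >= 1e-6}")
```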

  17. Development of a field measurement methodology for studying the thermal indoor environment in hybrid GEOTABS buildings

    DEFF Research Database (Denmark)

    Kazanci, Ongun Berk; Khovalyg, Dolaana; Olesen, Bjarne W.

    2018-01-01

    buildings. The three demonstration buildings were an office building in Luxembourg, an elderly care home in Belgium, and an elementary school in the Czech Republic. All of these buildings are equipped with hybrid GEOTABS systems; however, they vary in size and function, which requires a unique measurement methodology for studying them. These buildings already have advanced Building Management Systems (BMS); however, a more detailed measurement plan was needed for the purposes of the project to document the current performance of these systems regarding thermal indoor environment and energy performance, and to be able to document the improvements after the implementation of the MPC. This study provides the details of the developed field measurement methodology for each of these buildings to study the indoor environmental quality (IEQ) in detail. The developed measurement methodology can be applied to other...

  18. Fatigue methodology for life predictions for the wheel-rail contact area in large offshore turret bearings

    Directory of Open Access Journals (Sweden)

    T. Lassen

    2016-10-01

    Full Text Available This paper presents a fatigue life prediction method for the large roller bearings applied in the turret turntables of large loading buoy units. The contact points between wheel and rail in these bearings are subjected to multi-axial fluctuating stresses, and both surface wear and fatigue cracking may occur. A methodology based on the Dang Van fatigue criterion is adopted. The criterion is based on an equivalent stress defined as a combination of the fluctuation of the shear stress from its mean value on a critical plane and the associated hydrostatic stress at the given time. The present work supports the theoretical model with extensive laboratory testing: both full-scale testing of wheel on rail and small-scale testing to characterize the steel material were carried out. An experimental program was conducted with the high-strength stainless steel S165M. The Dang Van stress concept is applied in combination with the Random Fatigue Limit Method (RFLM) for life data analysis. This approach makes it possible to include both finite lives and run-outs in a rational manner, without presuming the existence of a fatigue limit in advance of the data. This yields a non-linear S-N curve on a log-log scale in the very-high-cycle regime close to the fatigue limit. It is demonstrated how the scatter in the fatigue limit decreases when the Dang Van stress concept is applied, and that the fatigue limit occurs beyond 10⁷ cycles
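    The Dang Van criterion described above can be sketched numerically: combine the shear-stress fluctuation about its mid-range value on the critical plane with the instantaneous hydrostatic stress, and compare the maximum against a material limit. The constants a and b and the stress history below are illustrative, not the calibrated values for S165M.

```python
def dang_van_utilization(shear, hydrostatic, a=0.3, b=180.0):
    """Dang Van multiaxial fatigue check (sketch).  For a sampled
    stress history on the critical plane, the shear fluctuation about
    its mid-range is combined with the instantaneous hydrostatic
    stress: sigma_eq(t) = |tau(t) - tau_mid| + a * sigma_h(t).
    Returns the utilization sigma_eq,max / b; values below 1.0 lie
    under the assumed fatigue limit."""
    tau_mid = 0.5 * (max(shear) + min(shear))
    sigma_eq = max(abs(t - tau_mid) + a * h for t, h in zip(shear, hydrostatic))
    return sigma_eq / b

# Hypothetical rolling-contact history sampled at four instants (MPa):
shear = [-120.0, -20.0, 80.0, 140.0]
hydro = [-300.0, -50.0, 100.0, 250.0]
print(f"utilization = {dang_van_utilization(shear, hydro):.2f}")
```

Note how the compressive (negative) hydrostatic stress at the first instant offsets a large shear excursion, which is the criterion's characteristic behavior in rolling contact.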

  19. A methodology for understanding the impacts of large-scale penetration of micro-combined heat and power

    International Nuclear Information System (INIS)

    Tapia-Ahumada, K.; Pérez-Arriaga, I.J.; Moniz, E.J.

    2013-01-01

    Co-generation at the small-kWe scale has been stimulated in recent years by governments and energy regulators as one way of increasing energy efficiency and reducing CO2 emissions. If widespread adoption is to be realized, understanding the effects of this technology from a system's point of view is crucial. Based on a methodology that uses long-term capacity expansion planning, this paper explores some of the implications for an electric power system of having a large number of micro-CHPs. Results show that fuel-cell-based micro-CHPs have the best and most consistent performance for different residential demands from both the customer's and the system's perspectives. As penetration reaches significant levels, gas-based technologies, particularly combined cycle units, are displaced in capacity and production, which impacts the operation of the electric system during summer peak hours. Other results suggest that tariff design impacts the economic efficiency of the system and the operation of micro-CHPs under a price-based strategy. Finally, policies aimed at micro-CHPs should consider the suitability of the technology (in size and heat-to-power ratio) to meet individual demands, the operational complexities of a large penetration, and the adequacy of the economic signals to incentivize efficient and sustainable operation. - Highlights: • Capacity displacements and daily operation of an electric power system are explored. • Benefits depend on energy mix, prices, and micro-CHP technology and control scheme. • Benefits are observed mostly in winter when micro-CHP heat and power are fully used. • Micro-CHPs mostly displace installed capacity from natural gas combined cycle units. • Tariff design impacts economic efficiency of the system and operation of micro-CHPs

  20. The Methodology Applied in DPPH, ABTS and Folin-Ciocalteau Assays Has a Large Influence on the Determined Antioxidant Potential.

    Science.gov (United States)

    Abramovič, Helena; Grobin, Blaž; Poklar, Nataša; Cigić, Blaž

    2017-06-01

    Antioxidant potential (AOP) is not only a property of the matrix analyzed but also depends greatly on the methodology used. The chromogenic radicals 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS•+) and 2,2-diphenyl-1-picrylhydrazyl (DPPH•), and the Folin-Ciocalteu (FC) assay, were applied to estimate how the method and the composition of the assay solvent influence the AOP determined for coffee, tea, beer, apple juice and dietary supplements. Large differences in AOP values depending on the reaction medium were observed, with the highest AOP determined mostly in the FC assay. In reactions with the chromogenic radicals, severalfold higher AOP values were obtained in buffer at pH 7.4 than in water or methanol. The type of assay and the solvent composition have similar influences on the reactivity of a particular antioxidant, whether pure or part of a complex matrix. The reaction kinetics of the radicals with antioxidants in the samples reveal that AOP depends strongly on incubation time, yet differently for each sample analyzed and assay applied.
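    Both radical assays reduce to the same arithmetic: the drop in absorbance of the chromogenic radical, often re-expressed against a Trolox calibration. A minimal sketch with invented absorbances and an invented calibration slope:

```python
def percent_inhibition(a_control, a_sample):
    """Radical-scavenging activity from absorbance quenching in a
    DPPH or ABTS assay: the fraction of chromogenic radical consumed,
    100 * (A_control - A_sample) / A_control."""
    return 100.0 * (a_control - a_sample) / a_control

def trolox_equivalents(inhibition, slope_trolox):
    """Convert inhibition to a Trolox-equivalent concentration using
    the slope (inhibition % per mM Trolox) of a calibration line
    forced through the origin.  The slope below is illustrative."""
    return inhibition / slope_trolox

inh = percent_inhibition(0.80, 0.36)   # 55% of the radical quenched
print(f"{inh:.0f}% inhibition = {trolox_equivalents(inh, 110.0):.2f} mM TE")
```

Since incubation time shifts A_sample, the same arithmetic applied at different read times yields different AOP values, which is the kinetic effect the abstract reports.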

  1. METHODOLOGY & CALCULATIONS FOR THE ASSIGNMENT OF WASTE GROUPS FOR THE LARGE UNDERGROUND WASTE STORAGE TANKS AT THE HANFORD SITE

    Energy Technology Data Exchange (ETDEWEB)

    BARKER, S.A.

    2006-07-27

    Waste stored within tank farm double-shell tanks (DSTs) and single-shell tanks (SSTs) generates flammable gas (principally hydrogen) to varying degrees, depending on the type, amount, geometry, and condition of the waste. The waste generates hydrogen through the radiolysis of water and organic compounds, the thermolytic decomposition of organic compounds, and corrosion of a tank's carbon steel walls. Radiolysis and thermolytic decomposition also generate ammonia. Nonflammable gases that act as diluents (such as nitrous oxide) are also produced. Additional flammable gases (e.g., methane) are generated by chemical reactions between various degradation products of organic chemicals present in the tanks. Volatile and semi-volatile organic chemicals in the tanks also produce organic vapors. The generated gases are either released continuously to the tank headspace or retained in the waste matrix. Retained gas may be released in a spontaneous or induced gas release event (GRE) that can significantly increase the flammable gas concentration in the tank headspace, as described in RPP-7771. This document categorizes each of the large waste storage tanks into one of several waste groups based on its waste characteristics. These waste group assignments reflect a tank's propensity to retain a significant volume of flammable gases and the potential of the waste to release retained gas in a buoyant displacement event. Revision 5 is the annual update of the methodology and calculations of the flammable gas waste groups for DSTs and SSTs.
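    Headspace flammability assessments of this kind typically combine the fuel species via Le Chatelier's rule. The sketch below uses textbook lower flammability limits and an invented headspace composition; it is an illustration of the rule, not the document's waste-group calculation.

```python
def le_chatelier_lfl(fractions, lfls):
    """Lower flammability limit of a fuel mixture in air (vol%) from
    Le Chatelier's rule: LFL_mix = 1 / sum(y_i / LFL_i), where y_i are
    the fuel fractions normalized to sum to 1."""
    total = sum(fractions)
    return 1.0 / sum((f / total) / lfl for f, lfl in zip(fractions, lfls))

# Fuel split of 70% H2, 20% NH3, 10% CH4 with textbook LFLs of
# 4.0, 15.0 and 5.0 vol%; the tank concentrations are hypothetical.
lfl_mix = le_chatelier_lfl([0.70, 0.20, 0.10], [4.0, 15.0, 5.0])
headspace_vol_pct = 1.2   # assumed total fuel concentration, vol%
print(f"LFL of mixture = {lfl_mix:.1f} vol%; "
      f"headspace at {100 * headspace_vol_pct / lfl_mix:.0f}% of LFL")
```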

  2. Development of methodology for separation and recovery of uranium from nuclear wastewater

    International Nuclear Information System (INIS)

    Satpati, S.K.; Roy, S.B.; Pal, Sangita; Tewari, P.K.

    2015-01-01

    Uranium plays a key role in nuclear power supply, and demand for it is growing with time because of its prospective features. A persistent increase in nuclear activities leads to increased generation of nuclear wastewater containing uranium. Separation and recovery of uranium from an unconventional source such as nuclear wastewater is worth exploring, both to reutilize the uranium and to improve the remediation technology of nuclear industries for environmental protection. Development of a suitable process methodology is essential to supersede the conventional methodology. In this article, recent developments in several possible methodologies for the separation of uranium from dilute solution are discussed, with their merits and demerits. A sorption technique, as a solid phase extraction (SPE) methodology, was chosen with a suitable polymer matrix and functional moiety based on the wastewater characteristics. Polyhydroxamic acid (PHOA), a sorbent synthesized following an eco-friendly procedure, is a promising polymeric chelating sorbent for the remediation of nuclear wastewaters and the recovery of uranium. The sorption and elution characteristics of PHOA have been evaluated and illustrated for the separation and recovery of uranium from a sample nuclear wastewater. For the remediation of nuclear wastewater, the SPE technique applying PHOA, a polymeric sorbent, is found to be a potentially suitable methodology. (author)

  3. Methodology for considering environments and culture in developing information security systems

    OpenAIRE

    Mwakalinga, G Jeffy; Kowalski, Stewart; Yngström, Louise

    2009-01-01

    In this paper, we describe a methodology for considering the culture of users and their environments when developing information security systems. We discuss how researchers and developers of information security systems have had difficulty taking the culture of users and environments into account during development. This has created environments where people serve technology instead of technology serving people. Users have been considered just as any other compo...

  4. Application of realistic (best- estimate) methodologies for large break loss of coolant (LOCA) safety analysis: licensing of Westinghouse ASTRUM evaluation model in Spain

    International Nuclear Information System (INIS)

    Lage, Carlos; Frepoli, Cesare

    2010-01-01

    When the LOCA Final Acceptance Criteria for Light Water Reactors was issued in Appendix K of 10CFR50, both the USNRC and the industry recognized that the rule was highly conservative. At that time, however, the degree of conservatism in the analysis could not be quantified. As a result, the USNRC began a research program to identify the degree of conservatism in the models permitted by the Appendix K rule and to develop improved thermal-hydraulic computer codes so that realistic accident analysis calculations could be performed. The overall results of this research program quantified the conservatism in the Appendix K rule and confirmed that some relaxation of the rule could be made without a loss in safety to the public. Also, from a risk-informed perspective, it is recognized that conservatism is not always a complete defense for lack of sophistication in models. In 1988, as a result of the improved understanding of LOCA phenomena, the USNRC staff amended the requirements of 10 CFR 50.46 and Appendix K, 'ECCS Evaluation Models', so that a realistic evaluation model may be used to analyze the performance of the ECCS during a hypothetical LOCA. Under the amended rules, best-estimate plus uncertainty (BEPU) thermal-hydraulic analysis may be used in place of the overly prescriptive set of models mandated by the Appendix K rule. Further guidance for the use of best-estimate codes was provided in Regulatory Guide 1.157. To demonstrate use of the revised ECCS rule, the USNRC and its consultants developed the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology as an approach for defining and qualifying a best-estimate thermal-hydraulic code and quantifying the uncertainties in a LOCA analysis. More recently, the CSAU principles have been generalized in the Evaluation Model Development and Assessment Process (EMDAP) of Regulatory Guide 1.203.
ASTRUM is the Westinghouse Best Estimate Large Break LOCA evaluation model applicable to two-, three
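    BEPU methods of the ASTRUM type rely on non-parametric order statistics to turn a modest number of code runs into a tolerance bound on the output. The classic first-order Wilks result, 59 runs for a one-sided 95/95 bound on a single output, can be computed directly; ASTRUM's actual sample size accounts for refinements (multiple acceptance criteria, higher-order statistics) not modelled in this sketch.

```python
import math

def wilks_first_order_runs(coverage=0.95, confidence=0.95):
    """Smallest n such that the largest of n random code runs bounds
    the 'coverage' quantile with the given confidence (one-sided,
    non-parametric Wilks formula): 1 - coverage**n >= confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(coverage))

print(wilks_first_order_runs())   # 59 runs for a 95/95 tolerance bound
```

Raising the confidence to 99% pushes the requirement to 90 runs, which shows how cheaply the non-parametric approach scales compared with response-surface methods.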

  5. Development of high performance core for large fast breeder reactors

    International Nuclear Information System (INIS)

    Inoue, Kotaro; Kawashima, Katsuyuki; Watari, Yoshio.

    1982-01-01

    Following the prototype fast breeder reactor ''Monju'', the construction of a demonstration reactor with 1000 MWe output is planned. This research aims to establish the concept of a large core with excellent fuel breeding properties and safety for demonstration and commercial reactors. To this end, the optimum fuel design specification for a large core was clarified, and a new core configuration was examined in which a disk-shaped blanket with a thin peripheral edge is introduced at the center of the core. As a result, the prospect was obtained that the fuel doubling time would be halved and the energy generated in a core disruptive accident would be about one fifth, compared with a large core using the same fuel as ''Monju''. Generally, as a core is enlarged, the breeding ratio decreases, and if a worst-case core disruptive accident occurred, its scale would be very large in a ''Monju''-type large core. In a heterogeneous core, an internal blanket is provided within the core to improve breeding and safety. Hitachi Ltd. developed and proposed the concept of an axially heterogeneous large core. The research on fuel design for a large core, on the heterogeneous core, and on its core disruptive accident behavior is reported. (Kako, I.)

  6. Human behaviour can trigger large carnivore attacks in developed countries.

    Science.gov (United States)

    Penteriani, Vincenzo; Delgado, María del Mar; Pinchera, Francesco; Naves, Javier; Fernández-Gil, Alberto; Kojola, Ilpo; Härkönen, Sauli; Norberg, Harri; Frank, Jens; Fedriani, José María; Sahlén, Veronica; Støen, Ole-Gunnar; Swenson, Jon E; Wabakken, Petter; Pellegrini, Mario; Herrero, Stephen; López-Bao, José Vicente

    2016-02-03

    The media and scientific literature are increasingly reporting an escalation of large carnivore attacks on humans in North America and Europe. Although rare compared to human fatalities by other wildlife, the media often overplay large carnivore attacks on humans, causing increased fear and negative attitudes towards coexisting with and conserving these species. Although large carnivore populations are generally increasing in developed countries, increased numbers are not solely responsible for the observed rise in the number of attacks by large carnivores. Here we show that an increasing number of people are involved in outdoor activities and, when doing so, some people engage in risk-enhancing behaviour that can increase the probability of a risky encounter and a potential attack. About half of the well-documented reported attacks have involved risk-enhancing human behaviours, the most common of which is leaving children unattended. Our study provides unique insight into the causes, and as a result the prevention, of large carnivore attacks on people. Prevention and information that can encourage appropriate human behaviour when sharing the landscape with large carnivores are of paramount importance to reduce both potentially fatal human-carnivore encounters and their consequences to large carnivores.

  7. Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach.

    Science.gov (United States)

    Aromataris, Edoardo; Fernandez, Ritin; Godfrey, Christina M; Holly, Cheryl; Khalil, Hanan; Tungpunkom, Patraporn

    2015-09-01

    With the increase in the number of systematic reviews available, a logical next step to provide decision makers in healthcare with the evidence they require has been the conduct of reviews of existing systematic reviews. Syntheses of existing systematic reviews are referred to by many different names, one of which is an umbrella review. An umbrella review allows the findings of reviews relevant to a review question to be compared and contrasted. An umbrella review's most characteristic feature is that this type of evidence synthesis only considers for inclusion the highest level of evidence, namely other systematic reviews and meta-analyses. A methodology working group was formed by the Joanna Briggs Institute to develop methodological guidance for the conduct of an umbrella review, including diverse types of evidence, both quantitative and qualitative. The aim of this study is to describe the development and guidance for the conduct of an umbrella review. Discussion and testing of the elements of methods for the conduct of an umbrella review were held over a 6-month period by members of a methodology working group. The working group comprised six participants who corresponded via teleconference, e-mail and face-to-face meeting during this development period. In October 2013, the methodology was presented in a workshop at the Joanna Briggs Institute Convention. Workshop participants, review authors and methodologists provided further testing, critique and feedback on the proposed methodology. This study describes the methodology and methods developed for the conduct of an umbrella review that includes published systematic reviews and meta-analyses as the analytical unit of the review. Details are provided regarding the essential elements of an umbrella review, including presentation of the review question in a Population, Intervention, Comparator, Outcome format, nuances of the inclusion criteria and search strategy. 
A critical appraisal tool with 10 questions to

  8. Optimization methodology for large scale fin geometry on the steel containment of a Public Acceptable Simple SMR (PASS)

    International Nuclear Information System (INIS)

    Kim, Do Yun; NO, Hee Cheon; Kim, Ho Sik

    2015-01-01

    Highlights: • Optimization methodology for fin geometry on the steel containment is established. • Optimum spacing is 7 cm in the PASS containment. • Optimum thickness is 0.9–1.8 cm when the fin height is 10–25 cm. • Optimal fin geometry is determined for a given fin height by an overall effectiveness correlation. • 13% of material volume and 43% of containment volume are reduced by using fins. - Abstract: Heat removal capability through a steel containment is important in accident situations to preserve the integrity of a nuclear power plant that adopts a steel containment concept. The heat transfer rate can be enhanced by using fins on the external surface of the steel containment. The fins, however, also increase flow resistance, which degrades the heat transfer rate. This study therefore investigates an optimization methodology for large-scale fin geometry on a vertical base where the natural convection flow regime is turbulent. The rectangular plate fins adopted in the steel containment of a Public Acceptable Simple SMR (PASS) are used as a reference. The heat transfer rate through the fins is obtained from CFD tools. To optimize the fin geometry, an overall effectiveness concept is introduced as a fin performance parameter. The optimization procedure starts by finding the optimum spacing; the optimum thickness is then calculated, and finally the optimal fin geometry is suggested. Scale analysis shows the existence of an optimum spacing, which turns out to be 7 cm in the case of PASS. The optimum thickness is obtained from the overall effectiveness correlation, which is derived from a total heat transfer coefficient correlation for a vertical fin array that considers both natural convection and radiation. However, the optimum thickness changes as the fin height varies; therefore, the optimal fin geometry is obtained as a function of fin height. With the assumption that the heat
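    The overall effectiveness parameter used in the optimization compares the heat transfer of the finned surface with that of the bare base. A simplified sketch using a one-dimensional adiabatic-tip fin efficiency, with invented heat transfer coefficients standing in for the CFD results and radiation neglected:

```python
import math

def overall_effectiveness(h_fin, h_bare, fin_height, fin_thickness,
                          spacing, k_fin=40.0):
    """Overall effectiveness of a rectangular plate-fin array, per fin
    pitch and unit width of base: heat transfer with fins (exposed base
    strip + fin surface weighted by fin efficiency) divided by that of
    the bare base.  All numeric inputs here are illustrative."""
    m = math.sqrt(2.0 * h_fin / (k_fin * fin_thickness))
    eta = math.tanh(m * fin_height) / (m * fin_height)   # fin efficiency
    pitch = spacing + fin_thickness
    q_fin = h_fin * (spacing + eta * 2.0 * fin_height)   # per unit dT
    q_bare = h_bare * pitch
    return q_fin / q_bare

# PASS-like numbers: 7 cm spacing, 1.5 cm thick, 20 cm tall steel fins;
# h_fin < h_bare reflects the flow resistance added by the fin channels.
eps = overall_effectiveness(h_fin=4.0, h_bare=5.0, fin_height=0.20,
                            fin_thickness=0.015, spacing=0.07)
print(f"overall effectiveness = {eps:.1f}")
```

An effectiveness well above 1 despite the reduced per-area coefficient is exactly the trade-off the abstract describes: added area wins as long as the spacing does not choke the buoyant flow.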

  9. Development of Methodologies for the Estimation of Thermal Properties Associated with Aerospace Vehicles

    Science.gov (United States)

    Scott, Elaine P.

    1996-01-01

    A thermal stress analysis is an important aspect in the design of aerospace structures and vehicles such as the High Speed Civil Transport (HSCT) at the National Aeronautics and Space Administration Langley Research Center (NASA-LaRC). These structures are complex and are often composed of numerous components fabricated from a variety of different materials. The thermal loads on these structures induce temperature variations within the structure, which in turn result in the development of thermal stresses. Therefore, a thermal stress analysis requires knowledge of the temperature distributions within the structures, which consequently necessitates accurate knowledge of the thermal properties, boundary conditions and thermal interface conditions associated with the structural materials. The goal of this multi-year research effort was to develop estimation methodologies for the determination of the thermal properties and interface conditions associated with aerospace vehicles. Specific objectives focused on the development and implementation of optimal experimental design strategies and methodologies for the estimation of thermal properties associated with simple composite and honeycomb structures. The strategy used in this multi-year research effort was to first develop methodologies for relatively simple systems and then systematically modify them to analyze complex structures; this can be thought of as a building-block approach. This strategy was intended to promote maximum usability of the resulting estimation procedures by NASA-LaRC researchers through the design of in-house experimentation procedures and through the use of an existing general-purpose finite element software package.

  10. Cognitive Sensitivity in Sibling Interactions: Development of the Construct and Comparison of Two Coding Methodologies

    Science.gov (United States)

    Prime, Heather; Perlman, Michal; Tackett, Jennifer L.; Jenkins, Jennifer M.

    2014-01-01

    Research Findings: The goal of this study was to develop a construct of sibling cognitive sensitivity, which describes the extent to which children take their siblings' knowledge and cognitive abilities into account when working toward a joint goal. In addition, the study compared 2 coding methodologies for measuring the construct: a thin…

  11. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    Science.gov (United States)

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  12. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel was formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. To accomplish this goal, the ACRAM Panel was divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE-contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining aircraft crash frequency. It should be used to provide the generic data needed to calculate the frequency of aircraft crashes into the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but this data is not provided by this document
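    Aircraft crash frequency methodologies of this kind are commonly built on the DOE-STD-3014 "four-factor formula", summed over flight categories. A sketch with invented traffic and site values; the real standard supplies tabulated generic data for each factor.

```python
def crash_frequency(categories):
    """Four-factor crash-frequency sketch: annual frequency of a crash
    into a facility, summed over flight categories as N * P * f * A,
    where N is operations per year, P the crash rate per operation,
    f(x, y) the crash-location probability per square mile at the
    site, and A the facility's effective (fly-in plus skid) area in
    square miles.  The numbers below are invented for illustration."""
    return sum(n * p * f * a for n, p, f, a in categories)

# (N ops/yr, P crashes/op, f per mi^2, A mi^2) for two hypothetical streams:
categories = [
    (50_000, 1e-7, 1e-2, 4e-3),   # general aviation near a small airport
    (5_000,  5e-8, 2e-3, 4e-3),   # commercial overflights
]
freq = crash_frequency(categories)
print(f"{freq:.1e} crashes/yr")   # compared against a screening threshold
```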

  13. Methodological principles to study formation and development of floristic law in Ukraine

    Directory of Open Access Journals (Sweden)

    А. К. Соколова

    2014-06-01

The paper investigates the problems associated with determining the methods used to study the establishment of floristic law in Ukraine. It examines the types of methods and establishes their interrelation and functional value. In addition, it analyzes the system of methodological grounds for the development of ecological and floristic law and offers additional ones.

  14. Methodologies Developed for EcoCity Related Projects: New Borg El Arab, an Egyptian Case Study

    Directory of Open Access Journals (Sweden)

    Carmen Antuña-Rozado

    2016-08-01

The aim of the methodologies described here is to propose measures and procedures for developing concepts and technological solutions, adapted to local conditions, to build sustainable communities in developing countries and emerging economies. These methodologies are linked to the EcoCity framework outlined by VTT Technical Research Centre of Finland Ltd. for sustainable community and neighbourhood regeneration and development. The framework is the result of long experience in numerous EcoCity related projects, mainly Nordic and European in scope, which has been reformulated in recent years to respond to local needs in the previously mentioned countries. There is also a particular emphasis on close collaboration with local partners and major stakeholders. In order to illustrate how these methodologies can support EcoCity concept development and implementation, results from a case study in Egypt are discussed. The case study relates to the transformation of New Borg El Arab (NBC), near Alexandria, into an EcoCity. The viability of the idea was explored using different methodologies (Roadmap, Feasibility Study, and Residents Energy Survey and Building Consumption Assessment) and considering the Residential, Commercial/Public Facilities, Industrial, Services/Utilities, and Transport sectors.

  15. The study of methodologies of software development for the next generation of HEP detector software

    International Nuclear Information System (INIS)

    Ding Yuzheng; Wang Taijie; Dai Guiliang

    1997-01-01

    The author discusses the characteristics of the next generation of HEP (High Energy Physics) detector software, and describes the basic strategy for the usage of object oriented methodologies, languages and tools in the development of the next generation of HEP detector software

  16. Trends in Large Proposal Development at Major Research Institutions

    Science.gov (United States)

    Mulfinger, Lorraine M.; Dressler, Kevin A.; James, L. Eric; Page, Niki; Serrano, Eduardo; Vazquez, Jorge

    2016-01-01

    Research administrator interest in large research proposal development and submission support is high, arguably in response to the bleak funding landscape for research and federal agency trends toward making more frequent larger awards. In response, a team from Penn State University and Huron Consulting Group initiated a baseline study to…

  17. Development and Current Status of Skull-Image Superimposition - Methodology and Instrumentation.

    Science.gov (United States)

    Lan, Y

    1992-12-01

This article presents a review of the literature and an evaluation of the development and application of skull-image superimposition technology - both instrumentation and methodology - contributed by a number of scholars since 1935. Along with a comparison of the methodologies involved in the two superimposition techniques - photographic and video - the author characterizes the techniques in use and the recent advances in computer image superimposition processing technology. The major disadvantage of conventional approaches is their reliance on subjective interpretation. Through painstaking comparison and analysis, computer image processing technology can produce more conclusive identifications by directly testing and evaluating the various programmed indices. Copyright © 1992 Central Police University.

  18. Computational Methodologies for Developing Structure–Morphology–Performance Relationships in Organic Solar Cells: A Protocol Review

    KAUST Repository

    Do, Khanh

    2016-09-08

We outline a step-by-step protocol that incorporates a number of theoretical and computational methodologies to evaluate the structural and electronic properties of pi-conjugated semiconducting materials in the condensed phase. Our focus is on methodologies appropriate for the characterization, at the molecular level, of the morphology in blend systems consisting of an electron donor and an electron acceptor, of importance for understanding the performance properties of bulk-heterojunction organic solar cells. The protocol is formulated as an introductory manual for investigators who aim to study the bulk-heterojunction morphology in molecular detail, thereby facilitating the development of structure–morphology–property relationships when used in tandem with experimental results.

  19. Progress in the development of methodology for fusion safety systems studies

    International Nuclear Information System (INIS)

    Ho, S.K.; Cambi, G.; Ciattaglia, S.; Fujii-e, Y.; Seki, Y.

    1994-01-01

The development of a fusion safety systems-study methodology by an international consortium is presented, covering the schematic classification of the overall fusion safety system, qualitative assessment of the fusion system to identify critical accident scenarios, quantitative analysis of accident consequences and risk for safety design evaluation, and system-level analysis of accident consequences and risk for design optimization. The potential application of this methodology to reactor design studies will facilitate the systematic assessment of the safety performance of reactor designs and enhance the impact of safety considerations on the selection of design configurations

  20. Development of analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kim, Cheol Woo; Kwon, Young Min; Kim, Sook Kwan

    1995-04-01

A study for the development of an analysis methodology for hot leg break mass and energy release is performed. For the blowdown period a modified CEFLASH-4A methodology is suggested. For the post-blowdown period a modified CONTRAST boil-off model is suggested. Improved mass and energy release data are generated using these computer codes. A RELAP5/MOD3 analysis is also performed, and finally the FLOOD-3 computer code has been modified for use in the analysis of hot leg break. The results of the analysis using the modified FLOOD-3 are reasonable, as expected, and their trends are good. 66 figs., 8 tabs. (Author)

  1. Development of risk assessment methodology against natural external hazards for sodium-cooled fast reactors: project overview and strong Wind PRA methodology - 15031

    International Nuclear Information System (INIS)

    Yamano, H.; Nishino, H.; Kurisaka, K.; Okano, Y.; Sakai, T.; Yamamoto, T.; Ishizuka, Y.; Geshi, N.; Furukawa, R.; Nanayama, F.; Takata, T.; Azuma, E.

    2015-01-01

This paper describes mainly the strong wind probabilistic risk assessment (PRA) methodology development, in addition to the project overview. In this project, to date, the PRA methodologies against snow, tornado and strong wind were developed, as well as the hazard evaluation methodologies. For the volcanic eruption hazard, ash fallout simulation was carried out to contribute to the development of the hazard evaluation methodology. For the forest fire hazard, the concept of the hazard evaluation methodology was developed based on fire simulation. An event sequence assessment methodology was also developed, based on plant dynamics analysis coupled with a continuous Markov chain Monte Carlo method, in order to apply it to the event sequence against snow. In developing the strong wind PRA methodology, hazard curves were estimated by using Weibull and Gumbel distributions based on weather data recorded in Japan. The obtained hazard curves were divided into five discrete categories for event tree quantification. Next, failure probabilities for decay heat removal related components were calculated as a product of two probabilities: a probability for the missiles to enter the intake or out-take in the decay heat removal system, and the fragility caused by the missile impacts. Finally, based on the event tree, the core damage frequency was estimated to be about 6×10⁻⁹/year by multiplying the discrete hazard probabilities in the Gumbel distribution by the conditional decay heat removal failure probabilities. A dominant sequence was led by the assumption that the operators could not extinguish a fuel tank fire caused by the missile impacts and the fire-induced loss of the decay heat removal system. (authors)
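The quantification step described above (a Gumbel wind hazard discretized into five categories, each multiplied by a conditional decay heat removal failure probability, then summed) can be sketched as follows. All parameter values here are hypothetical illustrations, not the study's actual Japanese weather data or fragilities.

```python
import math

def gumbel_cdf(x, mu, beta):
    """CDF of the Gumbel (Type I extreme value) distribution."""
    return math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical annual-maximum wind speed parameters (m/s)
mu, beta = 25.0, 5.0

# Five discrete wind-speed categories (m/s) for event tree quantification
edges = [30, 40, 50, 60, 70, 200]

# Hypothetical conditional failure probabilities of the decay heat
# removal system per category (missile entry probability x fragility)
p_fail = [1e-7, 1e-6, 1e-5, 1e-4, 1e-3]

cdf = 0.0
for lo, hi, pf in zip(edges[:-1], edges[1:], p_fail):
    # Annual probability that the maximum wind speed falls in this bin
    p_bin = gumbel_cdf(hi, mu, beta) - gumbel_cdf(lo, mu, beta)
    cdf += p_bin * pf

print(f"Core damage frequency: {cdf:.1e} per year")
```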

  2. A quantitative, non-destructive methodology for habitat characterisation and benthic monitoring at offshore renewable energy developments.

    Directory of Open Access Journals (Sweden)

    Emma V Sheehan

    2010-12-01

Following governments' policies to tackle global climate change, the development of offshore renewable energy sites is likely to increase substantially over coming years. All such developments interact with the seabed to some degree, so a key need exists for suitable methodology to monitor the impacts of large-scale Marine Renewable Energy Installations (MREIs). Many of these will be situated on mixed or rocky substrata, where conventional methods to characterise the habitat are unsuitable. Traditional destructive sampling is also inappropriate in conservation terms, particularly as safety zones around MREIs could function as Marine Protected Areas, with positive benefits for biodiversity. Here we describe a technique developed to effectively monitor the impact of MREIs and report the results of its field testing, enabling large areas to be surveyed accurately and cost-effectively. The methodology is based on a high-definition video camera, plus LED lights and laser scale markers, mounted on a "flying array" that maintains itself above the seabed grounded by a length of chain, thus causing minimal damage. Samples are taken by slow-speed tows of the gear behind a boat (200 m transects). The HD video and randomly selected frame grabs are analysed to quantify species distribution. The equipment was tested over two years in Lyme Bay, UK (25 m depth), then subsequently successfully deployed in demanding conditions at the deep (>50 m) high-energy Wave Hub site off Cornwall, UK, and a potential tidal stream energy site in Guernsey, Channel Islands (1.5 ms⁻¹ current), the first time remote samples from such a habitat have been achieved. The next stage in the monitoring development process is described, involving the use of Remote Operated Vehicles to survey the seabed post-deployment of MREI devices. The complete methodology provides the first quantitative, relatively non-destructive method for monitoring mixed-substrate benthic communities beneath

  3. A Quantitative, Non-Destructive Methodology for Habitat Characterisation and Benthic Monitoring at Offshore Renewable Energy Developments

    Science.gov (United States)

    Sheehan, Emma V.; Stevens, Timothy F.; Attrill, Martin J.

    2010-01-01

    Following governments' policies to tackle global climate change, the development of offshore renewable energy sites is likely to increase substantially over coming years. All such developments interact with the seabed to some degree and so a key need exists for suitable methodology to monitor the impacts of large-scale Marine Renewable Energy Installations (MREIs). Many of these will be situated on mixed or rocky substrata, where conventional methods to characterise the habitat are unsuitable. Traditional destructive sampling is also inappropriate in conservation terms, particularly as safety zones around (MREIs) could function as Marine Protected Areas, with positive benefits for biodiversity. Here we describe a technique developed to effectively monitor the impact of MREIs and report the results of its field testing, enabling large areas to be surveyed accurately and cost-effectively. The methodology is based on a high-definition video camera, plus LED lights and laser scale markers, mounted on a “flying array” that maintains itself above the seabed grounded by a length of chain, thus causing minimal damage. Samples are taken by slow-speed tows of the gear behind a boat (200 m transects). The HD video and randomly selected frame grabs are analysed to quantify species distribution. The equipment was tested over two years in Lyme Bay, UK (25 m depth), then subsequently successfully deployed in demanding conditions at the deep (>50 m) high-energy Wave Hub site off Cornwall, UK, and a potential tidal stream energy site in Guernsey, Channel Islands (1.5 ms−1 current), the first time remote samples from such a habitat have been achieved. The next stage in the monitoring development process is described, involving the use of Remote Operated Vehicles to survey the seabed post-deployment of MREI devices. The complete methodology provides the first quantitative, relatively non-destructive method for monitoring mixed-substrate benthic communities beneath MPAs and

  4. DEVELOPMENT OF A METHODOLOGY TO ASSESS PROLIFERATION RESISTANCE AND PHYSICAL PROTECTION FOR GENERATION IV SYSTEMS

    International Nuclear Information System (INIS)

    Nishimura, R.; Bari, R.; Peterson, P.; Roglans-Ribas, J.; Kalenchuk, D.

    2004-01-01

    Enhanced proliferation resistance and physical protection (PR and PP) is one of the technology goals for advanced nuclear concepts, such as Generation IV systems. Under the auspices of the Generation IV International Forum, the Office of Nuclear Energy, Science and Technology of the U.S. DOE, the Office of Nonproliferation Policy of the National Nuclear Security Administration, and participating organizations from six other countries are sponsoring an international working group to develop an evaluation methodology for PR and PP. This methodology will permit an objective PR and PP comparison between alternative nuclear systems (e.g., different reactor types or fuel cycles) and support design optimization to enhance robustness against proliferation, theft and sabotage. The paper summarizes the proposed assessment methodology including the assessment framework, measures used to express the PR and PP characteristics of the system, threat definition, system element and target identification, pathway identification and analysis, and estimation of the measures

  5. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed, classified as first and second generation according to their differing viewpoints on problem-solving. Accident analysis can be performed using three techniques: sequential, epidemiological and systemic, with marine accidents falling under the epidemiological technique. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model, which are applied to assess marine accidents. The MOP model can effectively describe the relationships of other factors that affect accidents, whereas the HEART methodology focuses only on human factors.

  6. Insight into model mechanisms through automatic parameter fitting: a new methodological framework for model development.

    Science.gov (United States)

    Tøndel, Kristin; Niederer, Steven A; Land, Sander; Smith, Nicolas P

    2014-05-20

Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input-output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard deviation of on
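The iterative "zooming" idea (generate a design, look up simulations near the measured data, shrink the feasible region) can be caricatured in a few lines. The scalar toy model and all settings below are invented for illustration; the actual framework uses multivariate metamodels of a cardiac mechanics model, not this stand-in.

```python
import math
import random

random.seed(0)

def simulate(k):
    """Stand-in for an expensive deterministic simulation (hypothetical)."""
    return k * (1.0 - math.exp(-k))   # monotone for k > 0

measured = simulate(2.5)   # pretend the unknown true parameter is 2.5

lo, hi = 0.0, 10.0         # initial biologically feasible range
for _ in range(6):
    # New experimental design: sample candidate parameter values
    design = [random.uniform(lo, hi) for _ in range(50)]
    # Look-up: keep the simulations closest to the measured data
    best = sorted(design, key=lambda k: abs(simulate(k) - measured))[:5]
    # Zoom the design region into the neighbourhood of the best fits,
    # with some padding so the true value is not excluded prematurely
    pad = 0.5 * (max(best) - min(best))
    lo = max(0.0, min(best) - pad)
    hi = max(best) + pad

estimate = sum(best) / len(best)
print(f"Recovered parameter: {estimate:.3f}")
```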

  7. Blending critical realist and emancipatory practice development methodologies: making critical realism work in nursing research.

    LENUS (Irish Health Repository)

    Parlour, Randal

    2012-12-01

    This paper examines the efficacy of facilitation as a practice development intervention in changing practice within an Older Person setting and in implementing evidence into practice. It outlines the influences exerted by the critical realist paradigm in guiding emancipatory practice development activities and, in particular, how the former may be employed within an emancipatory practice development study to elucidate and increase understanding pertinent to causation and outcomes. The methodology is based upon an emancipatory practice development approach set within a realistic evaluation framework. This allows for systematic analysis of the social and contextual elements that influence the explication of outcomes associated with facilitation. The study is concentrated upon five practice development cycles, within which a sequence of iterative processes is integrated. The authors assert that combining critical realist and emancipatory processes offers a robust and practical method for translating evidence and implementing changes in practice, as the former affirms or falsifies the influence that emancipatory processes exert on attaining culture shift, and enabling transformation towards effective clinical practice. A new framework for practice development is proposed that establishes methodological coherency between emancipatory practice development and realistic evaluation. This augments the existing theoretical bases for both these approaches by contributing new theoretical and methodological understandings of causation.

  8. Surreptitious, Evolving and Participative Ontology Development: An End-User Oriented Ontology Development Methodology

    Science.gov (United States)

    Bachore, Zelalem

    2012-01-01

    Ontology not only is considered to be the backbone of the semantic web but also plays a significant role in distributed and heterogeneous information systems. However, ontology still faces limited application and adoption to date. One of the major problems is that prevailing engineering-oriented methodologies for building ontologies do not…

  9. Developing a methodology for identifying action zones to protect and manage groundwater well fields

    Science.gov (United States)

    Bellier, Sandra; Viennot, Pascal; Ledoux, Emmanuel; Schott, Celine

    2013-04-01

Implementation of a long-term action plan to manage and protect well fields is a complex and very expensive process. In this context, the relevance and efficiency of such action plans with respect to water quality should be evaluated. The objective of this study is to set up a methodology to identify relevant action zones in which environmental changes may significantly impact the quantity or quality of pumped water. In the Seine-et-Marne department (France), three sectors encompassing numerous well fields pumping in the Champigny limestone aquifer are considered priorities under French environmental law. This aquifer, located south-east of Paris, supplies more than one million people with drinking water. The catchment areas of these abstractions are very large (2000 km²), and their intrinsic vulnerability was established by a simple parametric approach that cannot account for the complexity of the hydrosystem. Consequently, a methodology based on distributed modelling of the aquifer processes was developed. The basin is modelled using the hydrogeological model MODCOU, developed at MINES ParisTech since the 1980s. It simulates surface and groundwater flow in aquifer systems and represents the local characteristics of the hydrosystem (aquifers communicating by leakage, river infiltration, supply from sinkholes, and locally perched or dewatering aquifers). The model was calibrated by matching simulated river discharge hydrographs and piezometric heads with those observed since the 1970s. With this modelling tool, a methodology based on the transfer of a theoretical tracer through the hydrosystem, from the ground surface to the outlets, was implemented to evaluate the spatial distribution of the contribution areas during contrasted (wet or dry) recharge periods. The results show that the surface of the areas contributing to supply most catchments is lower than 300 km² and that the major contributory zones are located along rivers. This finding illustrates the importance of

  10. An approach to SOA development methodology: SOUP comparison with RUP and XP

    Directory of Open Access Journals (Sweden)

    Sandra Svanidzaitė

    2014-08-01

Service oriented architecture (SOA) is an architecture for distributed applications composed of loosely coupled distributed services that are designed to meet business requirements. One of the research priorities in the field of SOA is creating a software design and development methodology (SDDM) that takes into account all principles of this architecture and allows for effective and efficient application development. Much investigation has been carried out to find out whether one of the popular SDDMs, such as agile methodologies or RUP, can be adapted for SOA, or whether there is a need to create a new SOA-oriented SDDM. This paper compares one SOA-oriented SDDM, SOUP, with the RUP and XP methodologies. The aim is to find out whether the SOUP methodology is already mature enough to ensure successful development of SOA applications. This aim is accomplished by comparing the activities and artifacts of SOUP and RUP and identifying which XP practices are used in SOUP. DOI: http://dx.doi.org/10.15181/csat.v2i1.77

  11. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

This document presents the automated security Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI™). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  12. Methodology for oil field development; Metodologia para o desenvolvimento de campos de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Galeano, Yadira Diaz

    1998-07-01

The main scope of this work is to study and develop a methodology that allows the elaboration of projects for oil field development. Therefore it is necessary to consider the integration of the human, technological and economic issues that are important parameters in the engineering project. The spiral concept was applied to the project in order to coordinate, in a reasonable and logical way, the activities involved in field development, along with the hierarchical analysis method for the decision-making process. The development of an oil field is divided into the viability study, preliminary project, final project, project implementation, production and field abandonment cycles. The main components of each cycle are external aspects, environmental criteria, reservoir management, drilling, completion and well workover, production systems, exportation systems, and risk and economic analysis. The proposed methodology establishes a general scheme for planning and presents procedures applicable to any field. (author)
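The "hierarchical analysis method" mentioned above for decision making is presumably the Analytic Hierarchy Process (AHP). The sketch below derives priority weights for three hypothetical field-development criteria from a pairwise comparison matrix via power iteration; the matrix values are illustrative, not taken from the thesis.

```python
# AHP-style priority weights from a pairwise comparison matrix.
# Rows/columns: reservoir management, drilling cost, economic risk
# (hypothetical criteria with illustrative pairwise judgements;
# M[i][j] is how strongly criterion i is preferred over j).
M = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

# The principal eigenvector of M, found by power iteration,
# gives the criterion weights (normalised to sum to 1).
w = [1.0, 1.0, 1.0]
for _ in range(100):
    w = [sum(M[i][j] * w[j] for j in range(3)) for i in range(3)]
    s = sum(w)
    w = [x / s for x in w]

print([round(x, 3) for x in w])  # largest weight first
```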

  13. Non-linear mixed effects modeling - from methodology and software development to driving implementation in drug development science.

    Science.gov (United States)

    Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis

    2005-04-01

    Few scientific contributions have made significant impact unless there was a champion who had the vision to see the potential for its use in seemingly disparate areas-and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.

  14. Development of cryogenic installations for large liquid argon neutrino detectors

    CERN Document Server

    Adamowski, M; Geynisman, M; Hentschel, S; Montanari, D; Nessi, M; Norris, B

    2015-01-01

    A proposal for a very large liquid argon (68,000 kg) based neutrino detector is being studied. To validate the design principles and the detector technology, and to gain experience in the development of the cryostats and the cryogenic systems needed for such large experiments, several smaller scale installations will be developed and implemented, at Fermilab and CERN. The cryogenic systems for these installations will be developed, constructed, installed and commissioned by an international engineering team. These installations shall bring the required cooling power under specific conditions to the experiments for the initial cool-down and the long term operation, and shall also guarantee the correct distribution of the cooling power within the cryostats to ensure a homogeneous temperature distribution within the cryostat itself. The cryogenic systems shall also include gaseous and liquid phase argon purification devices to be used to reach and maintain the very stringent purity requirements needed for these...

  15. An approach to SOA development methodology: SOUP comparison with RUP and XP

    OpenAIRE

    Sandra Svanidzaitė

    2014-01-01

    Service oriented architecture (SOA) is an architecture for distributed applications composed of distributed services with weak coupling that are designed to meet business requirements. One of the research priorities in the field of SOA is creating such software design and development methodology (SDDM) that takes into account all principles of this architecture and allows for effective and efficient application development. A lot of investigation has been carried out to find out whether can o...

  16. Methodology of gender research and local development concepts: report on workshop, 11-12 November 1999

    OpenAIRE

    Klein-Hessling, Ruth

    2000-01-01

The workshop "Methodology of Gender Research and Local Development Concepts" was organised by the Gender Division of the Sociology of Development Research Centre at Bielefeld University on the occasion of a visit by two members of the Ahfad University of Women, Omdurman, Sudan, and was attended by approximately 30 participants. Experiences from empirical field research in Sudan, Kenya, Rwanda, West Africa and South Asia formed the starting point for discussions on methodolo...

  17. Developing a business analytics methodology: a case study in the foodbank sector

    OpenAIRE

    Hindle, Giles; Vidgen, Richard

    2017-01-01

    The current research seeks to address the following question: how can organizations align their business analytics development projects with their business goals? To pursue this research agenda we adopt an action research framework to develop and apply a business analytics methodology (BAM). The four-stage BAM (problem situation structuring, business model mapping, analytics leverage analysis, and analytics implementation) is not a prescription. Rather, it provides a logical structure and log...

  18. Technical Support Document: Development of the Advanced Energy Design Guide for Large Hospitals - 50% Energy Savings

    Energy Technology Data Exchange (ETDEWEB)

    Bonnema, E.; Leach, M.; Pless, S.

    2013-06-01

This Technical Support Document describes the process and methodology for the development of the Advanced Energy Design Guide for Large Hospitals: Achieving 50% Energy Savings Toward a Net Zero Energy Building (AEDG-LH; ASHRAE et al. 2011b). The AEDG-LH is intended to provide recommendations for achieving 50% whole-building energy savings in large hospitals over levels achieved by following Standard 90.1-2004. The AEDG-LH was created for a 'standard' mid- to large-size hospital, typically at least 100,000 ft², but the strategies apply to all sizes and classifications of new construction hospital buildings. Its primary focus is new construction, but recommendations may be applicable to facilities undergoing total renovation, and in part to many other hospital renovation, addition, remodeling, and modernization projects (including changes to one or more systems in existing buildings).

  19. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

    In this thesis a validation methodology to be used in the assessment of vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between the experimental test data and validity analysis of the simulation model is still lacking. The developed validation paradigm has a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers, even though they are ultimately intended to represent real-world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project, and data handling techniques, validation metrics and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, the simulation is deemed "not invalid". If the simulation model fails to meet the criteria, the model is deemed invalid and model iteration should be performed. The results are analyzed to determine whether they indicate a modeling error or a modeling inadequacy, and whether a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model and defined the limits of application. The tested simulation model is found to be acceptable, but valid only in a certain dynamical range. Several insights into the deficiencies of the model are reported in the analysis, but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time- and cost-efficient simulation projects with

  20. Bibliographic survey on methodologies for development of health database of the population in case of cancer occurrences

    International Nuclear Information System (INIS)

    Cavinato, Christianne C.; Andrade, Delvonei A. de; Sabundjian, Gaiane; Diz, Maria Del Pilar E.

    2014-01-01

    The objective is to survey existing methodologies for developing public health databases, focusing on the health (fatal and nonfatal cancer occurrences) of the population surrounding a nuclear facility, for the purpose of calculating its environmental cost. From the methodologies found for developing this type of database, a methodology will be developed and applied to the internal public of IPEN/CNEN-SP, Brazil, as a pre-test for acquiring the desired health information

  1. Evaluation of probable maximum snow accumulation: Development of a methodology for climate change studies

    Science.gov (United States)

    Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick

    2016-06-01

    Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so the ensuing spring PMF is a reasonable estimation. This is of particular importance in times of climate change (CC) since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology; precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool to estimate the relative change of the PMSA. Absolute results are in the same order of magnitude as those obtained with the traditional method and observed data; but are also found to depend strongly on the climate projection used and show spatial variability.
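    The moisture-maximization step at the core of the methodology described in this abstract can be sketched as follows. This is an illustrative sketch of the general concept, not the paper's implementation; the function name and units are assumptions:

    ```python
    def maximize_snowfall(event_snowfall_mm, event_pw_mm, max_monthly_pw_mm):
        """Moisture maximization: scale a large snowfall event (water
        equivalent, mm) by the ratio of the monthly maximum precipitable
        water to the precipitable water associated with the event itself."""
        if event_pw_mm <= 0:
            raise ValueError("precipitable water must be positive")
        maximization_ratio = max_monthly_pw_mm / event_pw_mm
        return event_snowfall_mm * maximization_ratio

    # A 40 mm snowstorm that occurred with 8 mm of precipitable water,
    # maximized against a monthly maximum of 12 mm:
    pmsa_candidate = maximize_snowfall(40.0, 8.0, 12.0)  # 40 * 1.5 = 60.0
    ```

    In the paper's non-stationary setting, the monthly maximum precipitable water would itself be a function of time under climate change rather than a fixed climatological value.
    
    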

  2. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1992-01-01

    The Analysis and Testing Group (WX-11) of the Design Engineering Division at Los Alamos National Laboratory (LANL) is developing methodology for designing and providing a basis for certification of Type B shipping containers. This methodology will include design, analysis, testing, fabrication, procurement, and obtaining certification of the Type B containers, allowing usage in support of the United States Department of Energy programs. While all aspects of the packaging development are included in this methodology, this paper focuses on the use of analysis and testing techniques for enhancing the design and providing a basis for certification. This methodology is based on concurrent engineering principles. Multidisciplinary teams within LANL are responsible for the design and certification of specific Type B Radioactive Material Shipping Containers. These teams include personnel with the various backgrounds and areas of expertise required to support the design, testing, analysis and certification tasks. To demonstrate that a package can pass all the performance requirements, the design needs to be characterized as completely as possible. Understanding package responses to the various environments and how these responses influence the effectiveness of the packaging requires expertise in several disciplines. In addition to characterizing the shipping container designs, these multidisciplinary teams should be able to provide insight into improving new package designs

  3. Development of 3D pseudo pin-by-pin calculation methodology in ANC

    International Nuclear Information System (INIS)

    Zhang, B.; Mayhue, L.; Huria, H.; Ivanov, B.

    2012-01-01

    Advanced cores and fuel assembly designs have been developed to improve operational flexibility, economic performance and further enhance safety features of nuclear power plants. The simulation of these new designs, along with strong heterogeneous fuel loading, has brought new challenges to the reactor physics methodologies currently employed in the industrial codes for core analyses. Control rod insertion during normal operation is one operational feature in the AP1000® plant, Westinghouse's next-generation Pressurized Water Reactor (PWR) design. This design improves operational flexibility and efficiency but significantly challenges the conventional reactor physics methods, especially in pin power calculations. The mixed loading of fuel assemblies with significantly different neutron spectra causes a strong interaction between different fuel assembly types that is not fully captured with the current core design codes. To overcome the weaknesses of the conventional methods, Westinghouse has developed a state-of-the-art 3D Pin-by-Pin Calculation Methodology (P3C) and successfully implemented it in the Westinghouse core design code ANC. The new methodology has been qualified and licensed for pin power prediction. The 3D P3C methodology along with its application and validation will be discussed in the paper. (authors)

  4. Developing a methodology to assess the impact of research grant funding: a mixed methods approach.

    Science.gov (United States)

    Bloch, Carter; Sørensen, Mads P; Graversen, Ebbe K; Schneider, Jesper W; Schmidt, Evanthia Kalpazidou; Aagaard, Kaare; Mejlgaard, Niels

    2014-04-01

    This paper discusses the development of a mixed methods approach to analyse research funding. Research policy has taken on an increasingly prominent role in the broader political scene, where research is seen as a critical factor in maintaining and improving growth, welfare and international competitiveness. This has motivated growing emphasis on the impacts of science funding, and how funding can best be designed to promote socio-economic progress. Meeting these demands for impact assessment involves a number of complex issues that are difficult to fully address in a single study or in the design of a single methodology. However, they point to some general principles that can be explored in methodological design. We draw on a recent evaluation of the impacts of research grant funding, discussing both key issues in developing a methodology for the analysis and subsequent results. The case of research grant funding, involving a complex mix of direct and intermediate effects that contribute to the overall impact of funding on research performance, illustrates the value of a mixed methods approach to provide a more robust and complete analysis of policy impacts. Reflections on the strengths and weaknesses of the methodology are used to examine refinements for future work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Contextual assessment of organisational culture - methodological development in two case studies

    International Nuclear Information System (INIS)

    Reiman, T.; Oedewald, P.

    2002-01-01

    Despite the acknowledged significance of organisational culture in the nuclear field, previous cultural studies have concentrated on purely safety related matters, or been only descriptive in nature. New kinds of methods, taking into account the overall objectives of the organisation, were needed to assess culture and develop its working practices appropriately. VTT developed the Contextual Assessment of Organisational Culture (CAOC) methodology during the FINNUS programme. The methodology utilises two concepts, organisational culture and core task. The core task can be defined as the core demands and content of work that the organisation has to accomplish in order to be effective. The core task concept is used in assessing the central dimensions of the organisation's culture. Organisational culture is defined as a solution the company has generated in order to fulfil the perceived demands of its core task. The CAOC-methodology was applied in two case studies, in the Radiation and Nuclear Safety Authority of Finland and in the maintenance unit of Loviisa NPP. The aim of the studies was not only to assess the given culture, but also to give the personnel new concepts and new tools for reflecting on their organisation, their jobs and on appropriate working practices. The CAOC-methodology contributes to the design and redesign of work in complex sociotechnical systems. It strives to enhance organisations' capability to assess their current working practices and the meanings attached to them and compare these to the actual demands of their basic mission and so change unadaptive practices. (orig.)

  6. Methodology for Developing the REScheckTM Software through Version 4.2

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Connell, Linda M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Gowri, Krishnan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lucas, R. G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schultz, Robert W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Taylor, Zachary T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wiberg, John D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2009-08-01

    This report explains the methodology used to develop Version 4.2 of the REScheck software developed for the 1992, 1993, and 1995 editions of the MEC, the 1998, 2000, 2003, and 2006 editions of the IECC, and the 2006 edition of the International Residential Code (IRC). Although some requirements contained in these codes have changed, the methodology used to develop the REScheck software for these editions is similar. REScheck assists builders in meeting the most complicated part of the code: the building envelope Uo-, U-, and R-value requirements in Section 502 of the code. This document details the calculations and assumptions underlying the treatment of the code requirements in REScheck, with a major emphasis on the building envelope requirements.

  7. Impacts of Outer Continental Shelf (OCS) development on recreation and tourism. Volume 3. Detailed methodology

    Energy Technology Data Exchange (ETDEWEB)

    1987-04-01

    The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.

  8. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the process of hazard analysis, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  9. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the process of hazard analysis, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  10. Development of a design methodology for hydraulic pipelines carrying rectangular capsules

    International Nuclear Information System (INIS)

    Asim, Taimoor; Mishra, Rakesh; Abushaala, Sufyan; Jain, Anuj

    2016-01-01

    The scarcity of fossil fuels is affecting the efficiency of established modes of cargo transport within the transportation industry. Efforts have been made to develop innovative modes of transport that can be adopted for economic and environmental friendly operating systems. Solid material, for instance, can be packed in rectangular containers (commonly known as capsules), which can then be transported in different concentrations very effectively using the fluid energy in pipelines. For economical and efficient design of such systems, both the local flow characteristics and the global performance parameters need to be carefully investigated. Published literature is severely limited in establishing the effects of local flow features on system characteristics of Hydraulic Capsule Pipelines (HCPs). The present study focuses on using a well validated Computational Fluid Dynamics (CFD) tool to numerically simulate the solid-liquid mixture flow in both on-shore and off-shore HCPs applications including bends. Discrete Phase Modelling (DPM) has been employed to calculate the velocity of the rectangular capsules. Numerical predictions have been used to develop novel semi-empirical prediction models for pressure drop in HCPs, which have then been embedded into a robust and user-friendly pipeline optimisation methodology based on Least-Cost Principle. - Highlights: • Local flow characteristics in a pipeline transporting rectangular capsules. • Development of prediction models for the pressure drop contribution of capsules. • Methodology developed for sizing of Hydraulic Capsule Pipelines. • Implementation of the developed methodology to obtain optimal pipeline diameter.
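    The Least-Cost Principle underlying the sizing methodology in this abstract can be illustrated with a minimal sketch. The cost functions below are hypothetical placeholders, not the paper's semi-empirical pressure-drop models:

    ```python
    def optimal_diameter(candidate_diameters_m, capital_cost, pumping_cost):
        """Least-Cost Principle: choose the pipeline diameter minimizing
        total cost = capital (pipe) cost + operating (pumping) cost.

        capital_cost and pumping_cost are callables mapping a diameter (m)
        to a cost; in practice the pumping cost would come from a pressure
        drop model for the capsule-laden flow."""
        return min(candidate_diameters_m,
                   key=lambda d: capital_cost(d) + pumping_cost(d))

    # Hypothetical cost trends: pipe cost grows with diameter, while
    # pumping cost falls as a larger bore reduces velocity and pressure drop.
    diameters = [0.2, 0.3, 0.4, 0.5]
    best = optimal_diameter(diameters,
                            capital_cost=lambda d: 1000.0 * d ** 1.5,
                            pumping_cost=lambda d: 50.0 / d ** 2)
    ```

    A real application would sweep a continuous diameter range and embed the CFD-derived pressure-drop correlations for capsules, bends, and on-shore versus off-shore routing.
    
    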

  11. Load shape development for Swedish commercial and public buildings - methodologies and results

    Energy Technology Data Exchange (ETDEWEB)

    Noren, C.

    1999-06-01

    The knowledge concerning electricity consumption, and especially load demand, in Swedish commercial buildings is very limited. The current study deals with methods for electricity consumption indicator development and the application of the different methodologies to measured data. Typical load shapes and consumption indicators are developed for four different types of commercial buildings: schools, hotels, grocery stores and department stores. Two different methodologies for consumption indicator development are presented and discussed. The influence on load demand of different factors, such as installations, outdoor temperature and building activities, is studied. It is suggested that building floor area is not an accurate determinant of building electricity consumption, and it is necessary to consider other factors such as those just mentioned to understand commercial building electricity consumption. The application of the two methodologies to measured data shows that typical load shapes can be developed with reasonable accuracy. For most of the categories it is possible to use the typical load shapes for approximation of whole-building load shapes, with error rates of about 10-25% depending on day-type and building type. Comparisons of the developed load shapes with measured data show good agreement. 49 refs, 22 figs, 3 tabs

  12. Systematic methodological review: developing a framework for a qualitative semi-structured interview guide.

    Science.gov (United States)

    Kallio, Hanna; Pietilä, Anna-Maija; Johnson, Martin; Kangasniemi, Mari

    2016-12-01

    To produce a framework for the development of a qualitative semi-structured interview guide. Rigorous data collection procedures fundamentally influence the results of studies. The semi-structured interview is a common data collection method, but methodological research on the development of a semi-structured interview guide is sparse. Systematic methodological review. We searched PubMed, CINAHL, Scopus and Web of Science for methodological papers on semi-structured interview guides from October 2004-September 2014. Having examined 2,703 titles and abstracts and 21 full texts, we finally selected 10 papers. We analysed the data using the qualitative content analysis method. Our analysis resulted in new synthesized knowledge on the development of a semi-structured interview guide, including five phases: (1) identifying the prerequisites for using semi-structured interviews; (2) retrieving and using previous knowledge; (3) formulating the preliminary semi-structured interview guide; (4) pilot testing the guide; and (5) presenting the complete semi-structured interview guide. Rigorous development of a qualitative semi-structured interview guide contributes to the objectivity and trustworthiness of studies and makes the results more plausible. Researchers should consider using this five-step process to develop a semi-structured interview guide and justify the decisions made during it. © 2016 John Wiley & Sons Ltd.

  13. Methodological developments in searching for studies for systematic reviews: past, present and future?

    Science.gov (United States)

    Lefebvre, Carol; Glanville, Julie; Wieland, L Susan; Coles, Bernadette; Weightman, Alison L

    2013-09-25

    The Cochrane Collaboration was established in 1993, following the opening of the UK Cochrane Centre in 1992, at a time when searching for studies for inclusion in systematic reviews was not well-developed. Review authors largely conducted their own searches or depended on medical librarians, who often possessed limited awareness and experience of systematic reviews. Guidance on the conduct and reporting of searches was limited. When work began to identify reports of randomized controlled trials (RCTs) for inclusion in Cochrane Reviews in 1992, there were only approximately 20,000 reports indexed as RCTs in MEDLINE and none indexed as RCTs in Embase. No search filters had been developed with the aim of identifying all RCTs in MEDLINE or other major databases. This presented The Cochrane Collaboration with a considerable challenge in identifying relevant studies. Over time, the number of studies indexed as RCTs in the major databases has grown considerably and the Cochrane Central Register of Controlled Trials (CENTRAL) has become the best single source of published controlled trials, with approximately 700,000 records, including records identified by the Collaboration from Embase and MEDLINE. Search filters for various study types, including systematic reviews and the Cochrane Highly Sensitive Search Strategies for RCTs, have been developed. There have been considerable advances in the evidence base for methodological aspects of information retrieval. The Cochrane Handbook for Systematic Reviews of Interventions now provides detailed guidance on the conduct and reporting of searches. 
Initiatives across The Cochrane Collaboration to improve the quality of, inter alia, information retrieval include the recently introduced Methodological Expectations for Cochrane Intervention Reviews (MECIR) programme, which stipulates 'mandatory' and 'highly desirable' standards for various aspects of review conduct and reporting, including searching, and the development of Standard Training

  14. Application of low-cost methodologies for mobile phone app development.

    Science.gov (United States)

    Zhang, Melvyn; Cheow, Enquan; Ho, Cyrus Sh; Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon

    2014-12-09

    The use of mobile phones and mobile phone apps has become markedly more prevalent over the past decade. Previous research has highlighted a method of using just the Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have been no disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data of the apps and share initial users' self-rated perceptions of the apps. In this study, we present two techniques for creating a mobile app using two well-established online mobile app building websites. The costs of development are specified and the methodology for dissemination of the apps is shared. The application of the low-cost methodologies in the creation of the "Mastering Psychiatry" app for undergraduates and the "Déjà vu" app for postgraduates is discussed. A questionnaire survey was administered to undergraduate students to collate their perceptions of the app. For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrated the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low

  15. Development of automated analytical systems for large throughput

    International Nuclear Information System (INIS)

    Ernst, P.C.; Hoffman, E.L.

    1982-01-01

    The need to be able to handle a large throughput of samples for neutron activation analysis has led to the development of automated counting and sample handling systems. These are coupled with available computer-assisted INAA techniques to perform a wide range of analytical services on a commercial basis. A fully automated delayed neutron counting system and a computer controlled pneumatic transfer for INAA use are described, as is a multi-detector gamma-spectroscopy system. (author)

  16. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    International Nuclear Information System (INIS)

    Dorp, F. van

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all the steps in the recommended methodology. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example, because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The starting point for the methodology has three elements. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development. The International FEP List includes FEPs to do with the assessment context. 
The context examined in detail by

  17. Development of large area resistive electrodes for ATLAS NSW Micromegas

    Science.gov (United States)

    Ochi, Atsuhiko

    2018-02-01

    Micromegas with resistive anodes will be used for the NSW upgrades of the ATLAS experiment at LHC. Resistive electrodes are used in MPGD devices to prevent sparks in high-rate operation. Large-area resistive electrodes for Micromegas have been developed using two different technologies: screen printing and carbon sputtering. The maximum resistive foil size is 45 × 220 cm with a printed pattern of 425-μm pitch strips. These technologies are also suitable for mass production. Prototypes of a production model series have been successfully produced. In this paper, we report the development, the production status, and the test results of resistive Micromegas.

  18. The Desired Image of the Future Economy of the Industrial Region: Development Trends and Evaluation Methodology

    Directory of Open Access Journals (Sweden)

    Olga Aleksandrovna Romanova

    2017-09-01

    Full Text Available In the article, the authors emphasize that industrial regions play an important role in increasing the technological independence of Russia. We show that the decline in the share of processing industries in the gross regional product cannot be treated as a negative de-industrialization of the economy. The article argues that the increasing speed of change, the instability of socio-economic systems and diverse risks predetermine the need to develop new methodological approaches to predictive research. Studies aimed at developing a technology for designing the desired image of the future, and a methodology for its evaluation, are of high importance. For the initial stage of the research, the authors propose a methodological approach for assessing the desired image of the future of metallurgy as one of the most important industries of the region. We propose the term «technological image of the regional metallurgy». We show that repositioning the image of the regional metallurgical complex is quite a long process, which has determined the need to define the stages of repositioning. The proposed methodology for evaluating the desired future includes methodological provisions to quantify the characteristics of goals achieved at the respective stages of the repositioning of the metallurgy. The methodological approach to designing the desired image of the future comprises the following stages: identification of the priority areas of technological development of regional metallurgy on the basis of bibliometric and patent analysis; evaluation and forecasting of the dynamics of the structure of domestic consumption of metal products, based on comparative analysis and relevant analytical methods; design of a factor model, based on the principal components method, to identify the parameters quantifying the technological image of the regional metallurgy; systematization of

  19. Methodologies Related to Computational models in View of Developing Anti-Alzheimer Drugs: An Overview.

    Science.gov (United States)

    Baheti, Kirtee; Kale, Mayura Ajay

    2018-04-17

    Over the last two decades, there has been an increasing focus on development strategies in anti-Alzheimer's drug research. This may be attributed to the fact that the causes of most Alzheimer's cases are still unknown, except for the few cases where genetic differences have been identified. As the disease progresses, the symptoms involve intellectual deterioration, memory impairment, abnormal personality and behavioural patterns, confusion, aggression, mood swings and irritability. Current therapies available for this disease give only symptomatic relief and do not target the underlying biomolecular processes. Nearly all the therapies to treat Alzheimer's disease seek to alter the amyloid cascade, which is considered to be important in AD pathogenesis. New drug regimens are not able to keep pace with the ever-increasing understanding of dementia at the molecular level. In view of these problems, we put forward molecular modeling as a drug discovery approach for developing novel drugs to treat Alzheimer's disease. The disease is incurable; it worsens as it advances and finally causes death. The design of drugs to treat this disease has therefore become an utmost priority for research. One of the most important emerging technologies applied here is computer-assisted drug design (CADD), a research tool that employs large-scale computing strategies to develop a model receptor site which can be used for designing anti-Alzheimer drugs. Various models of amyloid-based calcium channels have been computationally optimized. Docking and de novo evolution are used to design the compounds, which are further subjected to absorption, distribution, metabolism, excretion and toxicity (ADMET) studies to identify active compounds that are able to cross the BBB. Many novel compounds have been designed which might be promising ones for the treatment of AD. The present review describes the research

  20. Nirex methodology for scenario and conceptual model development. An international peer review

    International Nuclear Information System (INIS)

    1999-06-01

    Nirex has responsibilities for nuclear waste management in the UK. The company's top level objectives are to maintain technical credibility on deep disposal, to gain public acceptance for a deep geologic repository, and to provide relevant advice to customers on the safety implications of their waste packaging proposals. Nirex utilizes peer reviews as appropriate to keep its scientific tools up-to-date and to periodically verify the quality of its products. The NEA formed an International Review Team (IRT) consisting of four internationally recognised experts plus a member of the NEA Secretariat. The IRT performed an in-depth analysis of five Nirex scientific reports identified in the terms of reference of the review. The review was to primarily judge whether the Nirex methodology provides an adequate framework to support the building of a future licensing safety case. Another objective was to judge whether the methodology could aid in establishing a better understanding, and, ideally, enhance acceptance of a repository among stakeholders. Methodologies for conducting safety assessments include at a very basic level the identification of features, events, and processes (FEPs) relevant to the system at hand, their convolution in scenarios for analysis, and the formulation of conceptual models to be addressed through numerical modelling. The main conclusion of the IRT is that Nirex has developed a potentially sound methodology for the identification and analysis of FEPs and for the identification of conceptual model needs and model requirements. The work is still in progress and is not yet complete. (R.P.)

  1. Development of a novel set of criteria to select methodology for designing product service systems

    Directory of Open Access Journals (Sweden)

    Tuananh Tran

    2016-04-01

    Full Text Available This paper proposes eight groups of twenty-nine scoring criteria that can help designers and practitioners to compare and select an appropriate methodology for a given problem in designing product service systems (PSS). PSS has been researched for more than a decade and is becoming increasingly popular in academia as well as industry. Despite that fact, the adoption of PSS still falls short of its potential. One of the main reasons is that designing a PSS is itself a challenge: designers and developers face difficulties in choosing appropriate PSS design methodologies for their projects so that they can design effective PSS offerings. By proposing eight groups of twenty-nine scoring criteria, this paper enables a “step by step” process to identify the most appropriate design methodology for a company’s PSS problem. An example is also introduced to illustrate the use of the proposed scoring criteria and provide a clear picture of how different design methodologies can best be utilized in application.
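
A criteria-based selection of this kind can be sketched as a simple weighted-scoring computation. The criteria names, weights, and scores below are invented for illustration and are not taken from the paper.

```python
# Hypothetical illustration of criteria-based methodology selection: score each
# candidate PSS design methodology against weighted criteria and rank the totals.
# Criteria names, weights, and scores are invented, not taken from the paper.
criteria_weights = {"user involvement": 0.3, "tool support": 0.2,
                    "lifecycle coverage": 0.3, "learning effort": 0.2}

scores = {  # methodology -> criterion -> score on a 1-5 scale
    "Methodology A": {"user involvement": 4, "tool support": 2,
                      "lifecycle coverage": 5, "learning effort": 3},
    "Methodology B": {"user involvement": 3, "tool support": 5,
                      "lifecycle coverage": 3, "learning effort": 4},
}

totals = {m: sum(criteria_weights[c] * s for c, s in cs.items())
          for m, cs in scores.items()}
best = max(totals, key=totals.get)
```

The weighted totals make the trade-offs between candidate methodologies explicit before one is chosen.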

  2. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    Science.gov (United States)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world which tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and effective technical and human implementation of computer-based systems (ETHICS). We examine the characteristics of these methodologies to assess the possibility of co-designing or combining them for developing an information system. To this end, four different aspects are analyzed: social or technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using a quantitative method is analyzed in order to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are appropriate candidates for co-design and offers some suggestions for carrying it out.

  3. Development of a standard methodology for optimizing remote visual display for nuclear-maintenance tasks

    International Nuclear Information System (INIS)

    Clarke, M.M.; Garin, J.; Preston-Anderson, A.

    1981-01-01

    The aim of the present study is to develop a methodology for optimizing remote viewing systems for a fuel recycle facility (HEF) being designed at Oak Ridge National Laboratory (ORNL). An important feature of this design involves the Remotex concept: advanced servo-controlled master/slave manipulators, with remote television viewing, will totally replace direct human contact with the radioactive environment. Therefore, the design of optimal viewing conditions is a critical component of the overall man/machine system. A methodology has been developed for optimizing remote visual displays for nuclear maintenance tasks. The usefulness of this approach has been demonstrated by preliminary specification of optimal closed circuit TV systems for such tasks

  4. Development of a methodology for the detection of hospital financial outliers using information systems.

    Science.gov (United States)

    Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki

    2014-01-01

    Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers. Copyright © 2013 John Wiley & Sons, Ltd.
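
The distance-based detection described above can be illustrated with a minimal sketch using Mahalanobis distances from the case centroid; the financial indices, values, and threshold below are hypothetical and stand in for the paper's case model.

```python
import numpy as np

def mahalanobis_outliers(cases, threshold):
    """Flag cases whose Mahalanobis distance from the centroid exceeds threshold."""
    X = np.asarray(cases, dtype=float)
    diff = X - X.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))  # pinv guards near-singularity
    d = np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))
    return d, d > threshold

# Hypothetical financial indices per hospital case: (profit margin %, cost ratio %)
cases = [[5.1, 55], [4.8, 57], [5.5, 54], [5.0, 56], [15.0, 30]]  # last is atypical
d, flags = mahalanobis_outliers(cases, threshold=1.6)
```

Because the distance accounts for the covariance between indices, a case with an atypical income structure is flagged even when each index alone looks plausible.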

  5. U.S. Geological Survey Methodology Development for Ecological Carbon Assessment and Monitoring

    Science.gov (United States)

    Zhu, Zhi-Liang; Stackpoole, S.M.

    2009-01-01

    Ecological carbon sequestration refers to transfer and storage of atmospheric carbon in vegetation, soils, and aquatic environments to help offset the net increase from carbon emissions. Understanding capacities, associated opportunities, and risks of vegetated ecosystems to sequester carbon provides science information to support formulation of policies governing climate change mitigation, adaptation, and land-management strategies. Section 712 of the Energy Independence and Security Act (EISA) of 2007 mandates the Department of the Interior to develop a methodology and assess the capacity of our nation's ecosystems for ecological carbon sequestration and greenhouse gas (GHG) flux mitigation. The U.S. Geological Survey (USGS) LandCarbon Project is responding to the Department of Interior's request to develop a methodology that meets specific EISA requirements.

  6. Development of methodologies for coupled water-hammer analysis of piping systems and supports

    International Nuclear Information System (INIS)

    Kamil, H.; Gantayat, A.; Attia, A.; Goulding, H.

    1983-01-01

    The paper presents the results of an investigation on the development of methodologies for coupled water-hammer analyses. The study was conducted because the present analytical methods for calculation of loads on piping systems and supports resulting from water-hammer phenomena are overly conservative. This is mainly because the methods do not usually include interaction between the fluid and the piping and thus predict high loads on piping systems and supports. The objective of the investigation presented in this paper was to develop methodologies for coupled water-hammer analyses, including fluid-structure interaction effects, to be able to obtain realistic loads on piping systems and supports, resulting in production of more economical designs. (orig./RW)
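
Uncoupled (rigid-pipe) water-hammer loads of the kind the paper seeks to improve upon are classically computed with the method of characteristics. The frictionless single-pipe sketch below (hypothetical pipe data, instantaneous valve closure, no fluid-structure coupling) reproduces the Joukowsky surge a·V0/g.

```python
import numpy as np

# Frictionless method-of-characteristics sketch for water hammer in a single
# pipe: reservoir upstream, instantaneous valve closure downstream.
# All pipe data are hypothetical; no fluid-structure coupling is modelled.
a = 1200.0            # pressure-wave speed, m/s
g = 9.81              # gravitational acceleration, m/s^2
L_pipe = 600.0        # pipe length, m
N = 20                # number of reaches
dx = L_pipe / N
dt = dx / a           # Courant condition dt = dx/a makes the scheme exact here
H0, V0 = 100.0, 2.0   # reservoir head (m) and initial velocity (m/s)

H = np.full(N + 1, H0)
V = np.full(N + 1, V0)
max_head = H[N]

for _ in range(200):                      # march ~2.5 wave periods in time
    Hn, Vn = H.copy(), V.copy()
    for i in range(1, N):                 # interior nodes: combine C+ and C-
        Hn[i] = 0.5 * (H[i-1] + H[i+1]) + (a / (2 * g)) * (V[i-1] - V[i+1])
        Vn[i] = 0.5 * (V[i-1] + V[i+1]) + (g / (2 * a)) * (H[i-1] - H[i+1])
    Hn[0] = H0                                  # upstream reservoir holds head
    Vn[0] = V[1] + (g / a) * (H0 - H[1])        # C- characteristic
    Vn[N] = 0.0                                 # closed valve
    Hn[N] = H[N-1] + (a / g) * V[N-1]           # C+ characteristic
    H, V = Hn, Vn
    max_head = max(max_head, H[N])
```

The peak head at the valve matches the Joukowsky estimate H0 + a·V0/g ≈ 344.6 m; a coupled analysis of the kind developed in the paper would relax this upper bound by letting the pipe structure absorb part of the surge.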

  7. Methodological Guidelines for Reducing the Complexity of Data Warehouse Development for Transactional Blood Bank Systems.

    Science.gov (United States)

    Takecian, Pedro L; Oikawa, Marcio K; Braghetto, Kelly R; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S; Acker, Susan; Carneiro-Proietti, Anna B F; Sabino, Ester C; Custer, Brian; Busch, Michael P; Ferreira, João E

    2013-06-01

    Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development.

  8. SITE-94. Scenario development FEP audit list preparation: methodology and presentation

    International Nuclear Information System (INIS)

    Stenhouse, M.; Chapman, N.; Sumerling, T.

    1993-04-01

    This report concerns a study which is part of the SKI performance assessment project SITE-94. SITE-94 is a performance assessment of a hypothetical repository at a real site. The main objective of the project is to determine how site specific data should be assimilated into the performance assessment process and to evaluate how uncertainties inherent in site characterization will influence performance assessment results. Other important elements of SITE-94 are the development of a practical and defensible methodology for defining, constructing and analyzing scenarios, the development of approaches for treatment of uncertainties, evaluation of canister integrity, and the development and application of an appropriate quality assurance plan for performance assessments

  9. Large-Scale Atmospheric Circulation Patterns Associated with Temperature Extremes as a Basis for Model Evaluation: Methodological Overview and Results

    Science.gov (United States)

    Loikith, P. C.; Broccoli, A. J.; Waliser, D. E.; Lintner, B. R.; Neelin, J. D.

    2015-12-01

    Anomalous large-scale circulation patterns often play a key role in the occurrence of temperature extremes. For example, large-scale circulation can drive horizontal temperature advection or influence local processes that lead to extreme temperatures, such as by inhibiting moderating sea breezes, promoting downslope adiabatic warming, and affecting the development of cloud cover. Additionally, large-scale circulation can influence the shape of temperature distribution tails, with important implications for the magnitude of future changes in extremes. As a result of the prominent role these patterns play in the occurrence and character of extremes, the way in which temperature extremes change in the future will be highly influenced by if and how these patterns change. It is therefore critical to identify and understand the key patterns associated with extremes at local to regional scales in the current climate and to use this foundation as a target for climate model validation. This presentation provides an overview of recent and ongoing work aimed at developing and applying novel approaches to identifying and describing the large-scale circulation patterns associated with temperature extremes in observations and using this foundation to evaluate state-of-the-art global and regional climate models. Emphasis is given to anomalies in sea level pressure and 500 hPa geopotential height over North America using several methods to identify circulation patterns, including self-organizing maps and composite analysis. Overall, evaluation results suggest that models are able to reproduce observed patterns associated with temperature extremes with reasonable fidelity in many cases. Model skill is often highest when and where synoptic-scale processes are the dominant mechanisms for extremes, and lower where sub-grid scale processes (such as those related to topography) are important. 
Where model skill in reproducing these patterns is high, it can be inferred that extremes are
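
Composite analysis, one of the pattern-identification methods mentioned above, can be sketched on synthetic data: average the circulation-anomaly field over the days whose temperature exceeds the 95th percentile. All fields and the co-varying "ridge" pattern below are fabricated for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic demo of composite analysis (all data fabricated): 1000 "days" of a
# sea-level-pressure anomaly field co-varying with a station temperature series
# through a fixed ridge-like pattern plus noise.
ny, nx, ndays = 8, 10, 1000
ridge = np.exp(-((np.arange(nx) - 7) ** 2) / 8.0)        # fixed anomaly pattern
amplitude = rng.normal(size=ndays)                       # daily pattern strength
slp_anom = (amplitude[:, None, None] * ridge[None, None, :]
            + 0.3 * rng.normal(size=(ndays, ny, nx)))
temp = 2.0 * amplitude + 0.5 * rng.normal(size=ndays)    # warm when ridge is strong

# Composite: mean circulation anomaly over days above the 95th temperature percentile
hot = temp > np.quantile(temp, 0.95)
composite = slp_anom[hot].mean(axis=0)                   # recovers the ridge pattern
```

Averaging over extreme days suppresses the noise and recovers the circulation pattern that accompanies the extremes, which is the quantity compared between observations and models.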

  10. Radiological risk assessment for the public under the loss of medium and large sources using bayesian methodology

    International Nuclear Information System (INIS)

    Kim, Joo Yeon; Jang, Han Ki; Lee, Jai Ki

    2005-01-01

    Bayesian methodology is appropriate for use in PRA because subjective knowledge as well as objective data can be applied to the assessment. In this study, radiological risk based on Bayesian methodology is assessed for the loss of a source in field radiography. The exposure scenario for the lost source presented by the U.S. NRC is reconstructed to reflect the domestic situation, and Bayes' theorem is applied to update the failure probabilities of safety functions. The updating shows that 5% Bayes credible intervals using the Jeffreys prior distribution are lower than those using a vague prior distribution, indicating that the Jeffreys prior is appropriate in risk assessment for systems having very low failure probabilities. It also shows that the mean of the expected annual dose for the public based on Bayesian methodology is higher than the dose based on classical methodology, because the means of the updated probabilities are higher than the classical probabilities. Domestic databases for radiological risk assessment are sparse. It is concluded that Bayesian methodology can be applied as a useful alternative for risk assessment, and that this study will contribute to risk-informed regulation in the field of radiation safety
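
The Beta-Binomial conjugate update underlying such an analysis can be sketched as follows; the failure counts are hypothetical, and the 5% credible bounds are estimated by Monte Carlo draws from the posterior.

```python
import numpy as np

rng = np.random.default_rng(42)

def beta_posterior_summary(failures, demands, a0, b0, n_draws=200_000):
    """Conjugate Beta-Binomial update: posterior mean and 5% credible bound."""
    a, b = a0 + failures, b0 + demands - failures
    draws = rng.beta(a, b, size=n_draws)          # Monte Carlo posterior sample
    return a / (a + b), np.quantile(draws, 0.05)

failures, demands = 1, 50   # hypothetical safety-function failure record

mean_j, q05_j = beta_posterior_summary(failures, demands, 0.5, 0.5)  # Jeffreys prior
mean_v, q05_v = beta_posterior_summary(failures, demands, 1.0, 1.0)  # vague prior
```

With a Jeffreys prior Beta(0.5, 0.5) the posterior mean and 5% bound come out lower than with the vague (uniform) prior, matching the behaviour reported in the abstract for systems with very low failure probabilities.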

  11. Impression management : developing and illustrating a scheme of analysis for narrative disclosures – a methodological note

    OpenAIRE

    Brennan, Niamh; Guillamon-Saorin, Encarna; Pierce, Aileen

    2009-01-01

    Purpose – This paper develops a holistic measure for analysing impression management and for detecting bias introduced into corporate narratives as a result of impression management. Design/methodology/approach – Prior research on the seven impression management methods in the literature is summarised. Four of the less-researched methods are described in detail, and are illustrated with examples from UK Annual Results’ Press Releases (ARPRs). A method of computing a holistic composite impr...

  12. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    OpenAIRE

    Ludfi Pratiwi Bowo; Wanginingastuti Mutmainnah; Masao Furusho

    2017-01-01

    Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) have been developed; these methodologies are classified into a first and a second generation according to their differing viewpoints on problem-solving. The accident analysis can be determined using three techniques of analysis; sequen...

  13. Development of a Pattern Recognition Methodology for Determining Operationally Optimal Heat Balance Instrumentation Calibration Schedules

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Beran; John Christenson; Dragos Nica; Kenny Gross

    2002-12-15

    The goal of the project is to enable plant operators to detect with high sensitivity and reliability the onset of decalibration drifts in all of the instrumentation used as input to the reactor heat balance calculations. To achieve this objective, the collaborators developed and implemented at DBNPS an extension of the Multivariate State Estimation Technique (MSET) pattern recognition methodology pioneered by ANL. The extension was implemented during the second phase of the project and fully achieved the project goal.
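
A similarity-based state estimator in the spirit of MSET can be sketched as below. This is a simplified illustration, not the patented MSET formulation: a new observation is estimated as a similarity-weighted blend of stored "memory" states from normal operation, so a decalibrating sensor shows up as a growing residual. All data and the kernel bandwidth are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simplified similarity-based state estimation (MSET-like sketch): build a
# memory matrix of correlated sensor states from normal operation, then
# estimate each new observation from its similarity to the stored states.
n_mem = 200
t = rng.uniform(0.0, 1.0, n_mem)                       # hidden operating point
memory = (np.column_stack([t, 2 * t, t**2, 1 - t])
          + 0.01 * rng.normal(size=(n_mem, 4)))        # four correlated sensors

def estimate(obs, memory, bandwidth=0.1):
    dist = np.linalg.norm(memory - obs, axis=1)
    w = np.exp(-(dist / bandwidth) ** 2)               # Gaussian similarity kernel
    return (w / w.sum()) @ memory                      # weighted blend of states

healthy = np.array([0.5, 1.0, 0.25, 0.5])              # consistent sensor state
drifted = healthy + np.array([0.0, 0.3, 0.0, 0.0])     # sensor 2 decalibrated

res_healthy = np.linalg.norm(healthy - estimate(healthy, memory))
res_drifted = np.linalg.norm(drifted - estimate(drifted, memory))
```

The healthy observation is reconstructed almost exactly, while the drifted one cannot be matched by any stored state, and the enlarged residual is the drift alarm signal.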

  14. DEVELOPMENT OF METHODOLOGY FOR TRAFFIC ACCIDENT FORECASTING AT VARIOUS TYPICAL URBAN AREAS

    OpenAIRE

    D. V. Kapsky

    2012-01-01

    The paper provides investigation results pertaining to the development of a methodology for forecasting traffic accidents using a “conflict zone” method that considers potential danger for two typical urban areas, namely signalized crossings and bumps installed in the areas of zebra crossings, and it also considers various types and kinds of conflicts. The investigations have made it possible to obtain various indices of threshold sensitivity in respect of potential risks and in relation to tra...

  15. Analysis and development of numerical methodologies for simulation of flow control with dielectric barrier discharge actuators

    OpenAIRE

    Abdollahzadehsangroudi, Mohammadmahdi

    2014-01-01

    The aim of this thesis is to investigate and develop different numerical methodologies for modeling the Dielectric Barrier discharge (DBD) plasma actuators for flow control purposes. Two different modeling approaches were considered; one based on Plasma-fluid model and the other based on a phenomenological model. A three component Plasma fluid model based on the transport equations of charged particles was implemented in this thesis in OpenFOAM, using several techniques to redu...

  16. Development of high purity large forgings for nuclear power plants

    International Nuclear Information System (INIS)

    Tanaka, Yasuhiko; Sato, Ikuo

    2011-01-01

    The recent increase in the size of energy plants has been supported by the development of manufacturing technology for high purity large forgings for the key components of the plant. To assure the reliability and performance of the large forgings, refining technology to make high purity steels, casting technology for gigantic ingots, forging technology to homogenize the material and consolidate porosity are essential, together with the required heat treatment and machining technologies. To meet these needs, the double degassing method to reduce impurities, multi-pouring methods to cast the gigantic ingots, vacuum carbon deoxidization, the warm forging process and related technologies have been developed and further improved. Furthermore, melting facilities including vacuum induction melting and electro slag re-melting furnaces have been installed. By using these technologies and equipment, large forgings have been manufactured and shipped to customers. These technologies have also been applied to the manufacture of austenitic steel vessel components of the fast breeder reactors and components for fusion experiments.

  17. Development of high purity large forgings for nuclear power plants

    Science.gov (United States)

    Tanaka, Yasuhiko; Sato, Ikuo

    2011-10-01

    The recent increase in the size of energy plants has been supported by the development of manufacturing technology for high purity large forgings for the key components of the plant. To assure the reliability and performance of the large forgings, refining technology to make high purity steels, casting technology for gigantic ingots, forging technology to homogenize the material and consolidate porosity are essential, together with the required heat treatment and machining technologies. To meet these needs, the double degassing method to reduce impurities, multi-pouring methods to cast the gigantic ingots, vacuum carbon deoxidization, the warm forging process and related technologies have been developed and further improved. Furthermore, melting facilities including vacuum induction melting and electro slag re-melting furnaces have been installed. By using these technologies and equipment, large forgings have been manufactured and shipped to customers. These technologies have also been applied to the manufacture of austenitic steel vessel components of the fast breeder reactors and components for fusion experiments.

  18. Developing the P2/6 methodology [to assess the security capability of modern distributed generation

    Energy Technology Data Exchange (ETDEWEB)

    Allan, Ron; Strbac, Goran; Djapic, Predrag; Jarrett, Keith [Manchester Univ. Inst. of Science and Technology, Manchester (United Kingdom)

    2004-04-29

    The main objective of the project was to use the methodology developed in the previous Methodology project (ETSU/FES Project K/EL/00287) to assess the security capability of modern distributed generation in order to review Table 2 and related text of Engineering Recommendation P2/5, and to propose information and results that could be used to create a new P2/6 that takes into account modern types of generating units; unit numbers; unit availabilities; and capacities. Technical issues raised in the previous study but held over until this project include: Treatment of single unit generation systems; Effect of shape of load duration curves; Persistence of intermittent generation, T{sub m}; Time resolution of intermittent generation output profiles; Ride-through capability; Risk to loss of supply. Three main ways of implementing the methodology were recommended: Look-up table(s), Graphical, and Computer program. The specification for the computer program was to produce a simple spreadsheet application package that an engineer with a reasonable knowledge of the approach could use. This prototype package has been developed in conjunction with Workstream 3. Its objective is to calculate the capability contribution to security of supply from distributed generation connected to a particular demand group. The application has been developed using Microsoft Excel and Visual Basic for Applications. New Tables for inclusion in P2/6 are included. (UK)
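
The capability contribution computed by such a package rests on enumerating unit availability states. A minimal sketch (hypothetical unit data, not the P2/6 tables) computes the expected available capacity of a small generation group exhaustively.

```python
import itertools

# Illustrative only (not the P2/6 tables): expected available capacity of a small
# group of independent generating units, by exhaustive state enumeration.
units = [(5.0, 0.9), (5.0, 0.9), (3.0, 0.85)]   # (capacity MW, availability)

expected = 0.0
for states in itertools.product([0, 1], repeat=len(units)):
    p, cap = 1.0, 0.0
    for on, (c, avail) in zip(states, units):
        p *= avail if on else 1.0 - avail        # probability of this unit state
        cap += c if on else 0.0                  # capacity available in this state
    expected += p * cap
# By linearity of expectation this equals sum(c * avail) = 11.55 MW
```

The full state distribution (not just the mean) is what supports risk-based measures such as the probability of failing to meet a given demand level.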

  19. Development of methodology for evaluating and monitoring steam generator feedwater nozzle cracking in PWRs

    International Nuclear Information System (INIS)

    Shvarts, S.; Gerber, D.A.; House, K.; Hirschberg, P.

    1994-01-01

    The objective of this paper is to describe a methodology for evaluating and monitoring steam generator feedwater nozzle cracking in PWR plants. This methodology is based in part on plant test data obtained from a recent Diablo Canyon Power Plant (DCPP) Unit 1 heatup. Temperature sensors installed near the nozzle-to-pipe weld were monitored during the heatup, along with operational parameters such as auxiliary feedwater (AFW) flow rate and steam generator temperature. A thermal stratification load definition was developed from this data. Steady state characteristics of this data were used in a finite element analysis to develop a relationship between AFW flow and stratification interface level. Fluctuating characteristics of this data were used to determine transient parameters through the application of a Green's Function approach. The thermal stratification load definition from the test data was used in a three-dimensional thermal stress analysis to determine stress cycling and consequent fatigue damage or crack growth during AFW flow fluctuations. The implementation of the developed methodology in the DCPP and Sequoyah Nuclear Plant (SNP) fatigue monitoring systems is described
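
The Green's-function step can be sketched as a Duhamel superposition: the stress response to a unit step in fluid temperature (here an assumed decaying curve standing in for an FEA-derived one) is convolved with the temperature rate. All numbers are hypothetical.

```python
import numpy as np

# Duhamel (Green's-function) sketch: the stress response to a unit step in fluid
# temperature is assumed known (a hypothetical decaying curve standing in for an
# FEA-derived one); stress for an arbitrary temperature history follows by
# convolving it with the temperature rate. All values are invented.
dt = 1.0                                   # time step, s
t = np.arange(0.0, 600.0, dt)
G = 1.5 * np.exp(-t / 60.0)                # assumed unit-step stress response, MPa/degC

T_fluid = np.where(t < 300.0, 20.0 + 0.5 * t, 170.0)   # ramp then hold, degC
dT = np.gradient(T_fluid, dt)              # temperature rate, degC/s

sigma = np.convolve(dT, G)[:t.size] * dt   # Duhamel superposition integral, MPa
```

During the ramp the stress approaches the quasi-steady value 0.5 × 1.5 × 60 ≈ 45 MPa and decays once the temperature holds; with a measured stratification history in place of the ramp, the same convolution yields the stress cycling fed into the fatigue evaluation.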

  20. Development of a methodology for conducting an integrated HRA/PRA --

    International Nuclear Information System (INIS)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S.; Wreathall, J.; Cooper, S.E.

    1993-01-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR)

  1. Development of a methodology for conducting an integrated HRA/PRA --

    Energy Technology Data Exchange (ETDEWEB)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S. (Brookhaven National Lab., Upton, NY (United States)); Wreathall, J. (Wreathall (John) and Co., Dublin, OH (United States)); Cooper, S.E. (Science Applications International Corp., McLean, VA (United States))

    1993-01-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  2. Frescoed Vaults: Accuracy Controlled Simplified Methodology for Planar Development of Three-Dimensional Textured Models

    Directory of Open Access Journals (Sweden)

    Marco Giorgio Bevilacqua

    2016-03-01

Full Text Available In the field of documentation and preservation of cultural heritage, there is keen interest in 3D metric viewing and rendering of architecture, for both formal appearance and color. On the other hand, the operative steps of restoration interventions still require full-scale, 2D metric surface representations. The transition from 3D to 2D representation, with the related geometric transformations, has not yet been fully formalized for the planar development of frescoed vaults. Methodologies proposed so far on this subject provide for transitioning from point cloud models to ideal mathematical surfaces and projecting textures using software tools. The methodology used for geometry and texture development in the present work does not require any dedicated software. The different processing steps can be individually checked for any error introduced, which can then be quantified. A direct accuracy check of the planar development of the frescoed surface, carried out by qualified restorers, yielded an accuracy of 3 mm. The proposed methodology, although requiring further studies to improve the automation of the different processing steps, allowed the extraction of 2D drafts fully usable by the operators restoring the vault frescoes.
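The geometric core of such a planar development can be illustrated for the ideal case that the methodologies above project onto: a point lying on a circular barrel vault is unrolled into the plane by arc length. This is only a hedged sketch with a hypothetical function name, not the paper's procedure (which works on measured point clouds and quantifies the error of each step):

```python
import math

def develop_barrel_vault(points, radius):
    """Unroll points on a circular barrel vault (axis along y, crown at
    z = radius) into the plane: the planar abscissa is the arc length
    R*theta measured from the crown, the ordinate the axial coordinate."""
    flat = []
    for x, y, z in points:
        theta = math.atan2(x, z)          # angular position from the crown
        flat.append((radius * theta, y))  # (arc length, axial coordinate)
    return flat
```

For example, a point at the crown maps to arc length 0, while a point at the springing (90° around a vault of radius 3 m) maps to arc length 3·π/2 ≈ 4.71 m; real pipelines would add the texture resampling step on top of this purely geometric mapping.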

  3. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

    Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program
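The component-failure side of the methodology rests on standard fault tree algebra applied to passive-system hardware. The sketch below shows how independent basic-event probabilities combine through AND/OR gates for a purely hypothetical passive-injection system; the valve arrangement and failure probabilities are illustrative inventions, not values from the Sandia study:

```python
from math import prod

def and_gate(*p):
    """Probability of the intersection of independent basic events."""
    return prod(p)

def or_gate(*p):
    """Probability of the union of independent basic events."""
    return 1.0 - prod(1.0 - q for q in p)

# Hypothetical passive-injection fault tree: each of two redundant lines
# fails if either of its two series check valves sticks closed; the
# system fails if both lines fail or the common discharge line plugs.
P_VALVE = 1.0e-3  # illustrative check-valve failure-to-open probability
P_PLUG = 1.0e-5   # illustrative discharge-line plugging probability

p_line = or_gate(P_VALVE, P_VALVE)                    # one line fails
p_system = or_gate(and_gate(p_line, p_line), P_PLUG)  # top event
```

The natural-process side of the methodology (success-criteria uncertainty from thermal-hydraulics and expert judgment) does not reduce to gate logic like this and is treated separately in Phases 2 and 3.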

  4. Development of a methodology for the safety assessment of near surface disposal facilities for radioactive waste

    International Nuclear Information System (INIS)

    Simon, I.; Cancio, D.; Alonso, L.F.; Agueero, A.; Lopez de la Higuera, J.; Gil, E.; Garcia, E.

    2000-01-01

The Project on the Environmental Radiological Impact at CIEMAT is developing, for the Spanish regulatory body Consejo de Seguridad Nuclear (CSN), a methodology for the safety assessment of near surface disposal facilities. This methodology incorporates elements developed through participation in the IAEA's ISAM Programme (Improving Long Term Safety Assessment Methodologies for Near Surface Radioactive Waste Disposal Facilities). The first step of the approach is the consideration of the assessment context, including the purpose of the assessment, the end-points, the philosophy, the disposal system, the source term and the temporal scales, as well as the hypotheses about the critical group. Once the context has been established, and considering the peculiarities of the system, a specific list of features, events and processes (FEPs) is produced. These are incorporated into the assessment scenarios. The set of scenarios is represented in the conceptual and mathematical models. Using mathematical codes, calculations are performed to obtain results (i.e. in terms of doses) to be analysed and compared against the criteria. The methodology is being tested by application to a hypothetical engineered disposal system based on an exercise within the ISAM Programme, and will finally be applied to the Spanish case. (author)

  5. Development of the fire PSA methodology and the fire analysis computer code system

    International Nuclear Information System (INIS)

    Katsunori, Ogura; Tomomichi, Ito; Tsuyoshi, Uchida; Yusuke, Kasagawa

    2009-01-01

A fire PSA methodology has been developed and applied to NPPs in Japan for both power operation and low power and shutdown (LPSD) states. The CDFs from the preliminary fire PSA for power operation were higher than those for internal events. A fire propagation analysis code system (CFAST/FDS Network) is being developed and verified through the OECD PRISME Project. Extension of the scope to the LPSD state is planned in order to establish the risk level. To quantify the fire risk level precisely, several enhancements of the methodology are planned: verification and validation of the phenomenological fire propagation analysis code (CFAST/FDS Network) in the context of fire PSA, and application of the 'Electric Circuit Analysis' of NUREG/CR-6850 and related tests in order to quantify the hot-short effect precisely. Development of a seismic-induced fire PSA method, integrating the existing seismic PSA and fire PSA methods, is ongoing. The fire PSA will be applied to review the validity of fire prevention and mitigation measures.

  6. Development of large size NC trepanning and honing machine

    International Nuclear Information System (INIS)

    Wada, Yoshiei; Aono, Fumiaki; Siga, Toshihiko; Sudo, Eiichi; Takasa, Seiju; Fukuyama, Masaaki; Sibukawa, Koichi; Nakagawa, Hirokatu

    2010-01-01

Due to the recent increase in world energy demand, the construction of a considerable number of nuclear and fossil power plants has proceeded and more are planned. High-capacity plants require large forged components, such as monoblock turbine rotor shafts, whose dimensions tend to increase. Some of these components have a center bore for material testing, NDE and other uses. In order to cope with the increased production of these large forgings with center bores, a new trepanning machine, dedicated exclusively to boring deep holes, was developed at JSW, taking into account the accumulated experience and know-how of experts. The machine is the world's largest 400 t trepanning and honing machine with numerical control and has many advantages in safety, machining precision, machining efficiency, operability, labor saving, and energy saving. Furthermore, the transfer of technical skill became easier through a concentrated monitoring system based on the numerically analysed know-how of experts. (author)

  7. Information technology security system engineering methodology

    Science.gov (United States)

    Childs, D.

    2003-01-01

A methodology is described for engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. It is intended for use by security system engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  8. Development of a predictive methodology for identifying high radon exhalation potential areas

    International Nuclear Information System (INIS)

    Ielsch, G.

    2001-01-01

Radon-222 is a natural radioactive gas originating from the decay of radium-226, which itself originates from the decay of uranium-238 naturally present in rocks and soil. Inhalation of radon gas and its decay products is a potential health risk for man. Radon can accumulate in confined environments such as buildings, and is responsible for one third of the total radiological exposure of the general public. The problem of how to manage this risk then arises. The main difficulty encountered is due to the large variability of exposure to radon across the country, so a prediction needs to be made of the areas with the highest density of buildings with high radon levels. Exposure to radon varies depending on the degree of confinement of the habitat, the lifestyle of the occupants and, particularly, the emission of radon from the surface of the soil on which the building is built. The purpose of this thesis is to elaborate a methodology for determining areas presenting a high potential for radon exhalation at the surface of the soil. The methodology adopted is based on quantification of radon exhalation at the surface, starting from a precise characterization of the main local geological and pedological parameters that control the radon source and its transport to the ground/atmosphere interface. The methodology proposed is innovative in that it combines a cartographic analysis, parameters integrated into a Geographic Information System, and a simplified model for the vertical transport of radon by diffusion through the pores of the soil. This methodology has been validated on two typical areas, in different geological contexts, and gives forecasts that generally agree with field observations. This makes it possible to identify areas with a high exhalation potential within a range of a few square kilometers. (author)
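The simplified vertical diffusion model referred to above has a classic closed form for a homogeneous soil layer, which gives a feel for the parameters the thesis maps (radium content, emanation, porosity, effective diffusion coefficient). The sketch below is a generic textbook version with illustrative parameter values, not the author's calibrated model:

```python
import math

LAMBDA_RN = 2.1e-6  # Rn-222 decay constant [1/s]

def exhalation_flux(c_radium, emanation, bulk_density, porosity,
                    d_eff, thickness):
    """Steady-state diffusive radon flux at the surface of a homogeneous
    soil layer [Bq m^-2 s^-1]: J = C_inf * sqrt(lambda * D_e) * tanh(d / L),
    where L = sqrt(D_e / lambda) is the diffusion length and C_inf the
    deep pore-air activity fixed by the production/decay balance."""
    c_inf = emanation * c_radium * bulk_density / porosity  # [Bq/m3 pore air]
    diff_length = math.sqrt(d_eff / LAMBDA_RN)              # [m]
    return c_inf * math.sqrt(LAMBDA_RN * d_eff) * math.tanh(thickness / diff_length)

# Illustrative soil: 40 Bq/kg radium, 20% emanation, 1600 kg/m3 bulk
# density, 40% porosity, D_e = 2e-6 m2/s, 5 m thick layer
flux = exhalation_flux(40.0, 0.2, 1600.0, 0.4, 2e-6, 5.0)
```

With these invented but plausible values the flux comes out near typical continental measurements of a few tens of mBq m⁻² s⁻¹, which is the kind of quantity the GIS layers are used to map spatially.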

  9. Risk-informed analysis of the large break loss of coolant accident and PCT margin evaluation with the RISMC methodology

    International Nuclear Information System (INIS)

    Liang, T.H.; Liang, K.S.; Cheng, C.K.; Pei, B.S.; Patelli, E.

    2016-01-01

Highlights: • With the RISMC methodology, both aleatory and epistemic uncertainties have been considered. • 14 probabilistically significant sequences have been identified and quantified. • A load spectrum for LBLOCA has been constructed with the CPCT and SP of each dominant sequence. • Compared to deterministic methodologies, the risk-informed PCT margin can be greater by 44–62 K. • The SP of the referred sequence needed to cover 99% of the load spectrum is only 5.07 × 10⁻³. • The occurrence probability of the deterministic licensing sequence is 5.46 × 10⁻⁵. - Abstract: For general design basis accidents, such as SBLOCA and LBLOCA, traditional deterministic safety analysis methodologies are always applied to analyze events based on a so-called surrogate or licensing sequence, without considering how low this sequence's occurrence probability is. In the to-be-issued 10 CFR 50.46a, the LBLOCA will be categorized as an accident beyond design basis and the PCT margin shall be evaluated in a risk-informed manner. According to the risk-informed safety margin characterization (RISMC) methodology, a process has been suggested to evaluate the risk-informed PCT margin. Following the RISMC methodology, a load spectrum of PCT for LBLOCA has been generated for Taiwan's Maanshan Nuclear Power Plant and 14 probabilistically significant sequences have been identified. It was observed in the load spectrum that the conditional PCT generally ascends with descending sequence occurrence probability. With the load spectrum covering both aleatory and epistemic uncertainties, the risk-informed PCT margin can be evaluated by either the expected value estimation method or the sequence probability coverage method. It was found that, compared with the traditional deterministic methodology, the PCT margin evaluated by the RISMC methodology can be greater by 44–62 K. Besides, to have a cumulated occurrence probability over 99% in the load spectrum, the occurrence probability of the referred sequence is only 5.07 × 10⁻³.

  10. Risk-informed analysis of the large break loss of coolant accident and PCT margin evaluation with the RISMC methodology

    Energy Technology Data Exchange (ETDEWEB)

    Liang, T.H. [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101 Sec. 2, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Liang, K.S., E-mail: ksliang@alum.mit.edu [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101 Sec. 2, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Cheng, C.K.; Pei, B.S. [Institute of Nuclear Engineering and Science, National Tsing Hua University, 101 Sec. 2, Kuang-Fu Road, Hsinchu 30013, Taiwan (China); Patelli, E. [Institute of Risk and Uncertainty, University of Liverpool, Room 610, Brodie Tower, L69 3GQ (United Kingdom)

    2016-11-15

Highlights: • With the RISMC methodology, both aleatory and epistemic uncertainties have been considered. • 14 probabilistically significant sequences have been identified and quantified. • A load spectrum for LBLOCA has been constructed with the CPCT and SP of each dominant sequence. • Compared to deterministic methodologies, the risk-informed PCT margin can be greater by 44–62 K. • The SP of the referred sequence needed to cover 99% of the load spectrum is only 5.07 × 10⁻³. • The occurrence probability of the deterministic licensing sequence is 5.46 × 10⁻⁵. - Abstract: For general design basis accidents, such as SBLOCA and LBLOCA, traditional deterministic safety analysis methodologies are always applied to analyze events based on a so-called surrogate or licensing sequence, without considering how low this sequence's occurrence probability is. In the to-be-issued 10 CFR 50.46a, the LBLOCA will be categorized as an accident beyond design basis and the PCT margin shall be evaluated in a risk-informed manner. According to the risk-informed safety margin characterization (RISMC) methodology, a process has been suggested to evaluate the risk-informed PCT margin. Following the RISMC methodology, a load spectrum of PCT for LBLOCA has been generated for Taiwan's Maanshan Nuclear Power Plant and 14 probabilistically significant sequences have been identified. It was observed in the load spectrum that the conditional PCT generally ascends with descending sequence occurrence probability. With the load spectrum covering both aleatory and epistemic uncertainties, the risk-informed PCT margin can be evaluated by either the expected value estimation method or the sequence probability coverage method. It was found that, compared with the traditional deterministic methodology, the PCT margin evaluated by the RISMC methodology can be greater by 44–62 K. Besides, to have a cumulated occurrence probability over 99% in the load spectrum, the occurrence probability of the referred sequence is only 5.07 × 10⁻³.
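The two margin-evaluation routes described in these RISMC abstracts can be sketched on a toy load spectrum. The sequence probabilities and conditional PCTs below are invented for illustration; 1477 K corresponds to the 2200 °F acceptance limit of 10 CFR 50.46:

```python
def pct_margins(load_spectrum, limit_k=1477.0, coverage=0.99):
    """load_spectrum: list of (sequence_probability, conditional_pct_K).

    Expected-value method: margin to the probability-weighted mean PCT.
    Sequence-probability-coverage method: margin to the conditional PCT
    of the sequence at which the cumulative probability first covers
    `coverage` of the total, accumulating from the most probable
    sequence downward."""
    total = sum(p for p, _ in load_spectrum)
    expected_pct = sum(p * pct for p, pct in load_spectrum) / total

    cum = 0.0
    coverage_pct = 0.0
    for p, pct in sorted(load_spectrum, key=lambda s: s[0], reverse=True):
        cum += p
        coverage_pct = pct
        if cum >= coverage * total:
            break
    return limit_k - expected_pct, limit_k - coverage_pct

# Invented 5-sequence spectrum: CPCT rises as sequence probability falls,
# mirroring the trend reported in the abstract
spectrum = [(5e-3, 950.0), (1e-3, 1050.0), (5e-4, 1150.0),
            (5e-5, 1250.0), (5e-6, 1350.0)]
m_expected, m_coverage = pct_margins(spectrum)
```

On this toy spectrum the coverage method stops at the third sequence (cumulative probability first exceeds 99% of the total there), so its margin is smaller, i.e. more conservative, than the expected-value margin, which is the qualitative behaviour the abstract's 44–62 K comparison exploits.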

  11. Understanding care in the past to develop caring science of the future: a historical methodological approach.

    Science.gov (United States)

    Nyborg, Vibeke N; Hvalvik, Sigrun; McCormack, Brendan

    2018-05-31

In this paper, we explore how the development of historical research methodologies during the last centuries can contribute to more diverse and interdisciplinary research in future caring science, especially towards a care focus that is more person-centred. Adding a historical approach, pursued by professional historians, to the theory of person-centredness and person-centred care can develop knowledge that enables a more holistic understanding of the patient and of the development of the patient perspective from the past until today. Thus, the aim was to show how developments within historical methodology can help us to understand elements of care in the past in order to further develop caring science in the future. Historical research methodologies have advocated a "history from below" perspective, and this has enabled the evolution of systematic approaches to historical research that can be explored and critically analysed. Linked with this, the development of a more socially and culturally oriented understanding of historical research has enabled historians to explore and add knowledge from a broader societal perspective. By focusing on the lives of ordinary people and taking social and cultural aspects into account when trying to reconstruct the past, we can gain a deeper understanding of health, care and medical development. However, an interdisciplinary research focus on person-centredness and person-centred care that includes professional historians can be challenging. In this paper, we argue that a historical perspective is necessary to meet the challenges we face in the future delivery of health care to all people, in all parts of society, in an ever more global world. © 2018 Nordic College of Caring Science.

  12. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.
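As a hedged illustration of the kind of RESTful wrapping the methodology formalizes (the endpoint name, statistic and payload shape below are invented here, not taken from GEAS), a legacy analysis function can be exposed as a JSON resource using only the standard library:

```python
import json
import math
from http.server import BaseHTTPRequestHandler, HTTPServer

def log2_fold_change(case, control):
    """Toy gene-expression statistic served by the endpoint below:
    log2 ratio of case vs. control expression for shared genes."""
    return {g: math.log2(case[g] / control[g]) for g in case if g in control}

class FoldChangeHandler(BaseHTTPRequestHandler):
    # Hypothetical RESTful resource:
    # POST /fold-change with JSON body {"case": {...}, "control": {...}}
    def do_POST(self):
        if self.path != "/fold-change":
            self.send_error(404)
            return
        length = int(self.headers["Content-Length"])
        body = json.loads(self.rfile.read(length))
        payload = json.dumps(log2_fold_change(body["case"], body["control"])).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    HTTPServer(("", 8000), FoldChangeHandler).serve_forever()
```

The semantic-annotation half of the methodology (describing what such a resource means, not just how to call it) sits on top of this plumbing and is not shown here.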

  13. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela D A Guardia

Full Text Available Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  14. Methodology, status and plans for development and assessment of Cathare code

    Energy Technology Data Exchange (ETDEWEB)

    Bestion, D.; Barre, F.; Faydide, B. [CEA - Grenoble (France)

    1997-07-01

This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for the development and assessment of the code. Analytical experiments with separate effect tests and component tests are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of each Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code, using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for the future development of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronics codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the 3-D model.

  15. A methodology to modify land uses in a transit oriented development scenario.

    Science.gov (United States)

    Sahu, Akshay

    2018-05-01

Developing nations are adopting transit oriented development (TOD) strategies to decongest their transportation systems. These strategies are often adopted after the preparation of land use plans. The goal of this study was to build a methodology to modify these land uses using soft computing, which can help to produce alternate land use plans relevant to TOD. The methodology incorporates TOD characteristics and objectives. Global TOD parameters (density, diversity, and distance to transit) were studied, and expert opinions provided weights and ranges for the parameters in an Indian TOD scenario. Rules to allocate land use were developed and objective functions were defined. Four objectives were used: the first was to maximize employment density, residential density and the percentage of mixed land use; the second was to shape density and diversity with respect to distance; the third was to minimize the degree of land use change; and the fourth was to increase the compactness of the land use allocation. The methodology was applied to two sectors of Naya Raipur, the new planned administrative capital of the state of Chhattisgarh, India. The city has implemented TOD in the form of a bus rapid transit system (BRTS) over an existing land use. A thousand random plans were generated through the methodology, and the top 30 plans were selected as the parent population for modification through a genetic algorithm (GA). Alternate plans were generated at the end of the GA cycle. The best alternate plan was compared with successful BRTS and TOD land uses for its merits and demerits. It was also compared with the initial land use plan for empirical validation. Copyright © 2017 Elsevier Ltd. All rights reserved.
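The generate-select-recombine cycle described above can be sketched with a toy genetic algorithm over a one-dimensional strip of cells. The land-use codes, weights and fitness terms are invented stand-ins for the paper's density/diversity/distance objectives and its land-use-change penalty:

```python
import random

USES = ["R", "C", "M", "G"]  # residential, commercial, mixed, green
random.seed(42)              # reproducible for this sketch

def fitness(plan, initial, dist):
    """Toy multi-objective score: reward mixed use near transit and
    use diversity, penalise deviation from the initial land-use plan."""
    near_mixed = sum(1 for u, d in zip(plan, dist) if u == "M" and d <= 1)
    diversity = len(set(plan)) / len(USES)
    change = sum(a != b for a, b in zip(plan, initial)) / len(plan)
    return 2.0 * near_mixed / len(plan) + diversity - 0.5 * change

def evolve(initial, dist, pop_size=30, generations=50, p_mut=0.05):
    """Elitist GA: keep the better half, refill with one-point
    crossover children subject to per-cell mutation."""
    pop = [[random.choice(USES) for _ in initial] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, initial, dist), reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            child = a[:cut] + b[cut:]
            children.append([random.choice(USES) if random.random() < p_mut
                             else u for u in child])
        pop = parents + children
    return max(pop, key=lambda p: fitness(p, initial, dist))

# Toy strip of 10 cells with distance-to-transit rings
dist = [0, 1, 1, 2, 2, 3, 3, 4, 4, 5]
initial = ["R"] * 10  # existing plan: all residential
best = evolve(initial, dist)
```

The study's actual formulation is richer (2-D sectors, four weighted objectives, a 1000-plan seeding step), but the selection pressure works the same way: elitism guarantees the best score never degrades between generations.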

  16. Development of Methodology and Field Deployable Sampling Tools for Spent Nuclear Fuel Interrogation in Liquid Storage

    International Nuclear Information System (INIS)

    Berry, T.; Milliken, C.; Martinez-Rodriguez, M.; Hathcock, D.; Heitkamp, M.

    2012-01-01

    This project developed methodology and field deployable tools (test kits) to analyze the chemical and microbiological condition of the fuel storage medium and determine the oxide thickness on the spent fuel basin materials. The overall objective of this project was to determine the amount of time fuel has spent in a storage basin to determine if the operation of the reactor and storage basin is consistent with safeguard declarations or expectations. This project developed and validated forensic tools that can be used to predict the age and condition of spent nuclear fuels stored in liquid basins based on key physical, chemical and microbiological basin characteristics. Key parameters were identified based on a literature review, the parameters were used to design test cells for corrosion analyses, tools were purchased to analyze the key parameters, and these were used to characterize an active spent fuel basin, the Savannah River Site (SRS) L-Area basin. The key parameters identified in the literature review included chloride concentration, conductivity, and total organic carbon level. Focus was also placed on aluminum based cladding because of their application to weapons production. The literature review was helpful in identifying important parameters, but relationships between these parameters and corrosion rates were not available. Bench scale test systems were designed, operated, harvested, and analyzed to determine corrosion relationships between water parameters and water conditions, chemistry and microbiological conditions. The data from the bench scale system indicated that corrosion rates were dependent on total organic carbon levels and chloride concentrations. The highest corrosion rates were observed in test cells amended with sediment, a large microbial inoculum and an organic carbon source. A complete characterization test kit was field tested to characterize the SRS L-Area spent fuel basin. The sampling kit consisted of a TOC analyzer, a YSI

  17. DEVELOPMENT OF METHODOLOGY AND FIELD DEPLOYABLE SAMPLING TOOLS FOR SPENT NUCLEAR FUEL INTERROGATION IN LIQUID STORAGE

    Energy Technology Data Exchange (ETDEWEB)

    Berry, T.; Milliken, C.; Martinez-Rodriguez, M.; Hathcock, D.; Heitkamp, M.

    2012-06-04

    This project developed methodology and field deployable tools (test kits) to analyze the chemical and microbiological condition of the fuel storage medium and determine the oxide thickness on the spent fuel basin materials. The overall objective of this project was to determine the amount of time fuel has spent in a storage basin to determine if the operation of the reactor and storage basin is consistent with safeguard declarations or expectations. This project developed and validated forensic tools that can be used to predict the age and condition of spent nuclear fuels stored in liquid basins based on key physical, chemical and microbiological basin characteristics. Key parameters were identified based on a literature review, the parameters were used to design test cells for corrosion analyses, tools were purchased to analyze the key parameters, and these were used to characterize an active spent fuel basin, the Savannah River Site (SRS) L-Area basin. The key parameters identified in the literature review included chloride concentration, conductivity, and total organic carbon level. Focus was also placed on aluminum based cladding because of their application to weapons production. The literature review was helpful in identifying important parameters, but relationships between these parameters and corrosion rates were not available. Bench scale test systems were designed, operated, harvested, and analyzed to determine corrosion relationships between water parameters and water conditions, chemistry and microbiological conditions. The data from the bench scale system indicated that corrosion rates were dependent on total organic carbon levels and chloride concentrations. The highest corrosion rates were observed in test cells amended with sediment, a large microbial inoculum and an organic carbon source. A complete characterization test kit was field tested to characterize the SRS L-Area spent fuel basin. The sampling kit consisted of a TOC analyzer, a YSI

  18. Development of a consistent Monte Carlo-deterministic transport methodology based on the method of characteristics and MCNP5

    International Nuclear Information System (INIS)

    Karriem, Z.; Ivanov, K.; Zamonsky, O.

    2011-01-01

This paper presents work performed to develop an integrated Monte Carlo-deterministic transport methodology in which the two methods make use of exactly the same general geometry and multigroup nuclear data. The envisioned application of this methodology is in reactor lattice physics methods development and shielding calculations. The methodology will be based on the Method of Long Characteristics (MOC) and the Monte Carlo N-Particle transport code MCNP5. Important initial developments pertaining to ray tracing and the development of an MOC flux solver for the proposed methodology are described. Results showing the viability of the methodology are presented for two 2-D general geometry transport problems. The essential developments presented are the use of MCNP as a geometry construction and ray tracing tool for the MOC, the verification of the ray tracing indexing scheme developed to represent the MCNP geometry in the MOC, and the verification of the prototype 2-D MOC flux solver. (author)
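The essence of a characteristics flux solver is transporting angular flux along each track with exponential attenuation and accumulating track-averaged fluxes into the scalar flux. The sketch below does this for one energy group in a homogeneous slab with a two-direction (S2) quadrature; it is a hedged stand-in for the paper's 2-D general-geometry solver and involves no MCNP ray tracing:

```python
import math

def moc_slab_flux(width=10.0, n_cells=50, sigma_t=1.0, sigma_s=0.5,
                  source=1.0, tol=1e-8, max_iter=500):
    """Source iteration with step-characteristic sweeps for a homogeneous
    slab with a uniform isotropic source and vacuum boundaries.
    Returns the converged scalar flux per cell."""
    # Two-direction Gauss-Legendre quadrature: mu = +/- 1/sqrt(3), w = 1
    angles = [(1 / math.sqrt(3), 1.0), (-1 / math.sqrt(3), 1.0)]
    dx = width / n_cells
    phi = [0.0] * n_cells
    for _ in range(max_iter):
        q = [0.5 * (sigma_s * f + source) for f in phi]  # isotropic emission
        phi_new = [0.0] * n_cells
        for mu, w in angles:
            tau = sigma_t * dx / abs(mu)  # optical length along the track
            att = math.exp(-tau)
            order = range(n_cells) if mu > 0 else range(n_cells - 1, -1, -1)
            psi = 0.0  # vacuum incoming boundary flux
            for i in order:
                psi_out = psi * att + (q[i] / sigma_t) * (1.0 - att)
                # track-averaged angular flux from the cell balance equation
                psi_avg = q[i] / sigma_t + (psi - psi_out) / tau
                phi_new[i] += w * psi_avg
                psi = psi_out
        if max(abs(a - b) for a, b in zip(phi_new, phi)) < tol:
            return phi_new
        phi = phi_new
    return phi
```

For this symmetric fixed-source problem the converged flux peaks at the slab centre, stays below the infinite-medium limit source/(sigma_t − sigma_s), and is mirror-symmetric about the midplane; a 2-D MOC replaces the trivial cell ordering here with the ray-traced track segments the paper obtains from MCNP.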

  19. Large-scale computation in solid state physics - Recent developments and prospects

    International Nuclear Information System (INIS)

    DeVreese, J.T.

    1985-01-01

During the past few years an increasing interest in large-scale computation has been developing. Several initiatives were taken to evaluate and exploit the potential of "supercomputers" like the CRAY-1 (or XMP) or the CYBER-205. In the U.S.A., the Lax report appeared first, in 1982, and subsequently (1984) the National Science Foundation announced a program to promote large-scale computation at the universities. In Europe, too, several CRAY and CYBER-205 systems have been installed. Although the presently available mainframes are the result of a continuous growth in speed and memory, they might have induced a discontinuous transition in the evolution of the scientific method: between theory and experiment a third methodology, "computational science", has become or is becoming operational.

  20. Status of large scale wind turbine technology development abroad

    Institute of Scientific and Technical Information of China (English)

    Ye LI; Lei DUAN

    2016-01-01

To facilitate large-scale (multi-megawatt) wind turbine development in China, the foreign efforts and achievements in the area are reviewed and summarized. Not only the popular horizontal-axis wind turbines on land, but also offshore wind turbines, vertical-axis wind turbines, airborne wind turbines, and shrouded wind turbines are discussed. The purpose of this review is to provide a comprehensive commentary and assessment of the basic working principles, economic aspects, and environmental impacts of these turbines.

  1. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    Science.gov (United States)

    2017-11-01

Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques, by Canh Ly, Nghia Tran, and Ozlem Kilic; Army Research Laboratory. Approved for public release; distribution is unlimited.

  2. Developing A Large-Scale, Collaborative, Productive Geoscience Education Network

    Science.gov (United States)

    Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.

    2012-12-01

Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources. Building Strong

3. Development of a methodology for dose assessment viewing the use of NORM in building materials

    International Nuclear Information System (INIS)

    Souza, Antonio Fernando Costa de

    2009-01-01

The objective of this study was to develop a methodology for estimating the radiological impact on man of the residues of naturally occurring radioactive materials (NORMs) that potentially can be used for the construction of homes and roads. Residues of this type, which are being produced in great quantities by the Brazilian mining industry, are typically deposited in inappropriate conditions such that they may have a long-term adverse impact on the environment, and hence on man. A mathematical model was developed to calculate the doses resulting from the use of NORM residues, thus allowing a preliminary analysis of the possibility of recycling the residues. The model was used to evaluate the external dose due to gamma radiation, the dose to skin caused by beta radiation, and the internal dose due to inhalation of radon and its decay products. The model was verified by comparisons with results of other studies about doses due to gamma and beta radiation from finite and infinite radioactive sources, with relatively good agreement. In order to validate the proposed methodology, a comparison was made against experimental results for a house constructed in accordance with CNEN regulations using building materials containing NORM residues. Comparisons were made of the dose due to gamma radiation and the radon concentration in the internal environment. Finally, the methodology was used also to estimate the dose caused by gamma radiation from a road constructed in the state of Rondonia, Brazil, which made use of another NORM residue. (author)

  4. Development and Application of Urban Landslide Vulnerability Assessment Methodology Reflecting Social and Economic Variables

    Directory of Open Access Journals (Sweden)

    Yoonkyung Park

    2016-01-01

Full Text Available An urban landslide vulnerability assessment methodology is proposed with major focus on considering urban social and economic aspects. The proposed methodology was developed based on the landslide susceptibility maps that the Korean Forest Service utilizes to identify landslide source areas. First, debris flows are propagated to urban areas from such source areas by Flow-R (flow path assessment of gravitational hazards at a regional scale), and then urban vulnerability is assessed in two categories: physical and socioeconomic aspects. The physical vulnerability is related to buildings that can be impacted by a landslide event. This study considered two popular building structure types, reinforced-concrete frame and nonreinforced-concrete frame, to assess the physical vulnerability. The socioeconomic vulnerability is considered a function of the resistance levels of the vulnerable people, the trigger factors of secondary damage, and the preparedness level of the local government. An index-based model is developed to evaluate the life and indirect damage under landslides as well as the resilience against disasters. To illustrate the validity of the proposed methodology, physical and socioeconomic vulnerability levels are analyzed for Seoul, Korea, using the suggested approach. The general trend found in this study indicates that higher population density areas under a weaker fiscal condition that are located downstream of mountainous areas are more vulnerable than areas in the opposite conditions.
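An index-based vulnerability model of the kind described above is typically a weighted sum of normalized indicators. The sketch below is a minimal illustration; the indicators, values, weights, and direction flags are hypothetical, not those of the Seoul case study:

```python
import numpy as np

# Hypothetical indicators for three districts: population density [1/km2],
# fiscal capacity (higher = more resilient), relative elevation (higher =
# farther upstream of the debris-flow runout). Values are illustrative.
indicators = np.array([
    [12000.0, 0.8, 0.2],   # district A: dense, downstream
    [ 4000.0, 0.5, 0.6],   # district B: weak fiscal condition
    [ 1500.0, 0.9, 0.9],   # district C: sparse, upstream, well funded
])
direction = np.array([+1, -1, -1])     # +1: raises vulnerability, -1: lowers it
weights   = np.array([0.5, 0.3, 0.2])  # assumed expert weights, sum to 1

# Min-max normalize each indicator to [0, 1], then flip the protective ones
# so that 1 always means "more vulnerable".
span = indicators.max(axis=0) - indicators.min(axis=0)
x = (indicators - indicators.min(axis=0)) / span
x[:, direction < 0] = 1.0 - x[:, direction < 0]

vulnerability = x @ weights  # composite index per district
```

With these assumed inputs, the dense downstream district scores highest, matching the qualitative trend reported in the record.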

  5. Development of probabilistic assessment methodology for geologic disposal of radioactive wastes

    International Nuclear Information System (INIS)

    Kimura, H.; Takahashi, T.

    1998-01-01

The probabilistic assessment methodology is essential to evaluate uncertainties of long-term radiological consequences associated with geologic disposal of radioactive wastes. We have developed a probabilistic assessment methodology to estimate the influences of parameter uncertainties/variabilities. The exposure scenario considered here is based on a groundwater migration scenario. The computer code system GSRW-PSA thus developed is based on a non-site-specific model and consists of a set of submodules for sampling of model parameters, calculating the release of radionuclides from engineered barriers, calculating the transport of radionuclides through the geosphere, calculating radiation exposures of the public, and calculating the statistical values relating to the uncertainties and sensitivities. The results of uncertainty analyses for α-nuclides quantitatively indicate that the natural uranium (238U) concentration is suitable as an alternative safety indicator for long-lived radioactive waste disposal, because the estimated range of individual dose equivalent due to the 238U decay chain is narrower than that due to the other decay chain (the 237Np decay chain). It is internationally necessary to have detailed discussion on the PDFs of model parameters and on the PSA methodology to evaluate the uncertainties due to conceptual models and scenarios. (author)
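The sampling-and-propagation structure of such a probabilistic assessment can be sketched as follows. The parameter distributions and the one-line transfer model are illustrative stand-ins for the GSRW-PSA submodules, not the actual code:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo realizations

# Sample uncertain parameters from assumed PDFs (illustrative choices,
# not the actual GSRW-PSA parameter set).
k_d     = rng.lognormal(np.log(1.0), 0.5, N)    # sorption coefficient [m3/kg]
v_gw    = rng.lognormal(np.log(10.0), 0.3, N)   # groundwater velocity [m/a]
release = rng.uniform(1e3, 1e4, N)              # release from barrier [Bq/a]

# Toy transfer chain: sorption retards the nuclide relative to the water,
# and dose scales with the release rate. The real code chains engineered-
# barrier, geosphere, and biosphere submodules and sums over decay chains.
retardation = 1.0 + (2650.0 / 0.3) * k_d        # assumed bulk density / porosity
dose = 1e-8 * release * v_gw / retardation      # annual dose [Sv/a], toy factor

# Statistical summary of the propagated uncertainty.
p05, p50, p95 = np.percentile(dose, [5, 50, 95])
```

The percentile spread is the kind of "estimated range of individual dose equivalent" the abstract compares between decay chains.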

  6. A Methodology and a Web Platform for the Collaborative Development of Context-Aware Systems

    Science.gov (United States)

    Martín, David; López-de-Ipiña, Diego; Alzua-Sorzabal, Aurkene; Lamsfus, Carlos; Torres-Manzanera, Emilio

    2013-01-01

    Information and services personalization is essential for an optimal user experience. Systems have to be able to acquire data about the user's context, process them in order to identify the user's situation and finally, adapt the functionality of the system to that situation, but the development of context-aware systems is complex. Data coming from distributed and heterogeneous sources have to be acquired, processed and managed. Several programming frameworks have been proposed in order to simplify the development of context-aware systems. These frameworks offer high-level application programming interfaces for programmers that complicate the involvement of domain experts in the development life-cycle. The participation of users that do not have programming skills but are experts in the application domain can speed up and improve the development process of these kinds of systems. Apart from that, there is a lack of methodologies to guide the development process. This article presents as main contributions, the implementation and evaluation of a web platform and a methodology to collaboratively develop context-aware systems by programmers and domain experts. PMID:23666131

  7. Methodology to develop a training program as a tool for energy management

    Directory of Open Access Journals (Sweden)

    Mónica Rosario Berenguer-Ungaro

    2017-12-01

Full Text Available The paper aims to present a methodology to develop a training program to improve labor skills that enhance the efficient use of energy resources, which seeks to make training timely, to meet training needs as they arise, and to make the trainee the protagonist of the training. It is based on the training-action approach, the action-research method, and the Kirkpatrick model for evaluating training, which evaluates four levels: reaction, learning, behavior, and results. The methodology is structured in three stages: (1) diagnosis of knowledge, (2) intervention based on the results, and (3) evaluation and feedback for continuous improvement. Each stage has identified objectives and implementation tools. Evaluation is transverse to the entire program, and it is through evaluation that decisions for feedback loops are taken.

  8. Accidental safety analysis methodology development in decommission of the nuclear facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. H.; Hwang, J. H.; Jae, M. S.; Seong, J. H.; Shin, S. H.; Cheong, S. J.; Pae, J. H.; Ang, G. R.; Lee, J. U. [Seoul National Univ., Seoul (Korea, Republic of)

    2002-03-15

Decontamination and Decommissioning (D and D) of a nuclear reactor costs about 20% of the construction expense, and the production of nuclear wastes during decommissioning raises environmental issues. Decommissioning of nuclear reactors in Korea is just beginning, and clear standards and regulations for decommissioning are lacking. The accident safety analysis for decommissioning of a nuclear facility presented in this work can be a solid ground for such standards and regulations. For the source term analysis of the Kori-1 reactor vessel, the MCNP/ORIGEN calculation methodology was applied. The activity of each important nuclide in the vessel was estimated at a time after 2008, the year the Kori-1 plant is supposed to be decommissioned. In addition, a methodology for risk assessment in decommissioning was developed.

  9. Calculation and evaluation methodology of the flawed pipe and the compute program development

    International Nuclear Information System (INIS)

    Liu Chang; Qian Hao; Yao Weida; Liang Xingyun

    2013-01-01

Background: A crack will grow gradually under alternating load in a pressurized pipe, even when the load is less than the fatigue strength limit. Purpose: Both the calculation and the evaluation methodology for a flawed pipe detected during in-service inspection are elaborated here, based on Elastic-Plastic Fracture Mechanics (EPFM) criteria. Methods: In the computation, the interaction between the depth and length of a flaw has been considered, and a computer program has been developed in Visual C++. Results: The fluctuating loads of the Reactor Coolant System transients, the initial flaw shape, and the initial flaw orientation are all accounted for here. Conclusions: The calculation and evaluation methodology presented here is an important basis for deciding whether continued operation is acceptable. (authors)

  10. Development of a methodology for assessing the safety of embedded software systems

    Science.gov (United States)

    Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.

    1993-01-01

A Dynamic Flowgraph Methodology (DFM) based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety critical software functions.

  11. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

In this paper we discuss the notion of a learning methodology and situate this within the wider frame of learning design or 'Designing for Learning'. We discuss existing work within this broad area by trying to categorize different approaches and interpretations and we present our development... of particular 'mediating design artefacts'. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design; the COllaborative E-learning Design method (CoED). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...

  12. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

    2017-01-01

In this paper we discuss the notion of a learning methodology and situate this within the wider frame of learning design or 'Designing for Learning'. We discuss existing work within this broad area by trying to categorize different approaches and interpretations and we present our development... of particular 'mediating design artefacts'. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design; the COllaborative E-learning Design method (CoED). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...

  13. Development of a methodology for radionuclide impurity analysis in radiopharmaceuticals using gamma spectrometry

    International Nuclear Information System (INIS)

    Paula, Eduardo Bonfim de; Araujo, Miriam Taina Ferreira de; Delgado, Jose Ubiratan; Poledna, Roberto; Lins, Ronaldo; Leiras, Anderson; Silva, Carlos Jose da; Oliveira, Antonio Eduardo de

    2016-01-01

The LNMRI has sought to develop a methodology for the identification and accurate measurement of gamma-emitting radionuclidic impurities at a metrological level, aiming to meet the recommendations not only of the international pharmacopoeias but also of CNEN and ANVISA, whose quality control requirements ensure that the doses patients receive from these practices are as low as feasible. As an initial target, it was possible to obtain an efficiency curve with an uncertainty of around 1%, necessary to initiate future measurements of interest applied to nuclear medicine and to start the development of the impurity analysis technique. (author)
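A common way to build the efficiency curve mentioned above is a low-order polynomial fit in log-log space to calibration-point data. The energies below are those of common calibration standards, but the efficiency values and fit order are illustrative assumptions, not LNMRI's actual calibration:

```python
import numpy as np

# Full-energy-peak efficiency calibration points for an HPGe detector
# (illustrative values at energies of common standard sources).
energy_kev = np.array([ 59.5, 121.8, 344.3, 661.7, 1173.2, 1332.5])
peak_eff   = np.array([0.062, 0.071, 0.038, 0.022, 0.0135, 0.0121])

# Fit ln(eff) = sum_i a_i * ln(E)^i, a standard gamma-spectrometry form.
coeffs = np.polyfit(np.log(energy_kev), np.log(peak_eff), deg=3)

def eff_at(e_kev):
    """Interpolated full-energy-peak efficiency at an arbitrary energy."""
    return float(np.exp(np.polyval(coeffs, np.log(e_kev))))

# With the curve, an impurity activity follows from a peak's net counts:
#   A = net_counts / (eff_at(E) * emission_probability * live_time)
```

Propagating the calibration-source and counting uncertainties through this fit is what yields the roughly 1% curve uncertainty the record reports.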

  14. Further developments of multiphysics and multiscale methodologies for coupled nuclear reactor simulations

    International Nuclear Information System (INIS)

    Gomez Torres, Armando Miguel

    2011-01-01

    This doctoral thesis describes the methodological development of coupled neutron-kinetics/thermal-hydraulics codes for the design and safety analysis of reactor systems taking into account the feedback mechanisms on the fuel rod level, according to different approaches. A central part of this thesis is the development and validation of a high fidelity simulation tool, DYNSUB, which results from the ''two-way-coupling'' of DYN3D-SP3 and SUBCHANFLOW. It allows the determination of local safety parameters through a detailed description of the core behavior under stationary and transient conditions at fuel rod level.

  15. Health effects of ambient air pollution – recent research development and contemporary methodological challenges

    Directory of Open Access Journals (Sweden)

    Ren Cizao

    2008-11-01

    Full Text Available Abstract Exposure to high levels of air pollution can cause a variety of adverse health outcomes. Air quality in developed countries has been generally improved over the last three decades. However, many recent epidemiological studies have consistently shown positive associations between low-level exposure to air pollution and health outcomes. Thus, adverse health effects of air pollution, even at relatively low levels, remain a public concern. This paper aims to provide an overview of recent research development and contemporary methodological challenges in this field and to identify future research directions for air pollution epidemiological studies.

  16. Preliminary methodology to assess the national and regional impact of U.S. wind energy development on birds and bats

    Science.gov (United States)

    Diffendorfer, James E.; Beston, Julie A.; Merrill, Matthew D.; Stanton, Jessica C.; Corum, Margo D.; Loss, Scott R.; Thogmartin, Wayne E.; Johnson, Douglas H.; Erickson, Richard A.; Heist, Kevin W.

    2015-01-01

    The U.S. Geological Survey has developed a methodology to assess the impacts of wind energy development on wildlife; it is a probabilistic, quantitative assessment methodology that can communicate to decision makers and the public the magnitude of these effects on species populations. The methodology is currently applicable to birds and bats, focuses primarily on the effects of collisions, and can be applied to any species that breeds in, migrates through, or otherwise uses any part of the United States. The methodology is intended to assess species at the national scale and is fundamentally different from existing methods focusing on impacts at individual facilities.

  17. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    International Nuclear Information System (INIS)

    Barabas, Roberta de C.; Sabundjian, Gaiane

    2015-01-01

When compared to other energy sources such as fossil fuels, coal, oil, and gas, nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme, consequently improving public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. The results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this is an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  18. The development of a neuroscience-based methodology for the nuclear energy learning/teaching process

    Energy Technology Data Exchange (ETDEWEB)

    Barabas, Roberta de C.; Sabundjian, Gaiane, E-mail: robertabarabas@usp.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

When compared to other energy sources such as fossil fuels, coal, oil, and gas, nuclear energy has perhaps the lowest impact on the environment. Moreover, nuclear energy has also benefited other fields such as medicine, the pharmaceutical industry, and agriculture, among others. However, despite all the benefits that result from the peaceful uses of nuclear energy, the theme is still addressed with prejudice. Education may be the starting point for public acceptance of nuclear energy, as it provides pedagogical approaches, learning environments, and human resources, which are essential conditions for effective learning. So far, nuclear energy educational research has been conducted using only conventional assessment methods. The global educational scenario has demonstrated an absence of neuroscience-based methods for the teaching of nuclear energy, and that may be an opportunity for developing new strategic teaching methods that will help demystify the theme, consequently improving public acceptance of this type of energy. This work aims to present the first step of a methodology in progress, based on research in neuroscience, to be applied to Brazilian science teachers in order to contribute to an effective teaching/learning process. This research will use the Implicit Association Test (IAT) to verify implicit attitudes of science teachers concerning nuclear energy. The results will provide data for the next steps of the research. The literature has not reported a similar neuroscience-based methodology applied to the nuclear energy learning/teaching process; therefore, this is an innovative methodology. The development of the methodology is in progress and the results will be presented in future works. (author)

  19. Development of a Methodology for Hydrogeological Characterization of Faults: Progress of the Project in Berkeley, California

    Science.gov (United States)

    Goto, J.; Moriya, T.; Yoshimura, K.; Tsuchi, H.; Karasaki, K.; Onishi, T.; Ueta, K.; Tanaka, S.; Kiho, K.

    2010-12-01

The Nuclear Waste Management Organization of Japan (NUMO), in collaboration with Lawrence Berkeley National Laboratory (LBNL), has carried out a project since 2007 to develop an efficient and practical methodology to characterize the hydrologic properties of faults, exclusively for the early stage of siting a deep underground repository. A preliminary flowchart of the characterization program and a classification scheme of fault hydrology based on geological features have been proposed. These have been tested through the field characterization program on the Wildcat Fault in Berkeley, California. The Wildcat Fault is a relatively large non-active strike-slip fault which is believed to be a subsidiary of the active Hayward Fault. Our classification scheme assumes contrasting hydrologic features between the linear northern part and the split/spread southern part of the Wildcat Fault. The field characterization program to date has been concentrated in and around the LBNL site on the southern part of the fault. Several lines of electrical and reflection seismic surveys, and subsequent trench investigations, have revealed the approximate distribution and near-surface features of the Wildcat Fault (see also Onishi et al. and Ueta et al.). Three 150-m-deep boreholes, WF-1 to WF-3, have been drilled on a line normal to the trace of the fault in the LBNL site. Two vertical holes were placed to characterize the undisturbed Miocene sedimentary formations on the eastern and western sides of the fault (WF-1 and WF-2, respectively). WF-2 on the western side intersected the rock formation that was expected only in WF-1, as well as several faults of various intensities. Therefore, WF-3, originally planned as an inclined hole to penetrate the fault, was replaced by a vertical hole further to the west. It again encountered unexpected rocks and faults. Preliminary results of in-situ hydraulic tests suggested that the transmissivity of WF-1 is ten to one hundred times higher than that of WF-2.
The monitoring

  20. THEORETIC AND METHODOLOGIC BASICS OF DEVELOPMENT OF THE NATIONAL LOGISTICS SYSTEM IN THE REPUBLIC OF BELARUS

    Directory of Open Access Journals (Sweden)

    R. B. Ivut

    2016-01-01

Full Text Available The article presents the results of a study whose aim is the formation of theoretical and methodological foundations, within the framework of scientific support, for the further development of the national logistics system of the Republic of Belarus. The relevance of the study relates to the fact that at present the introduction of the concept of logistics and the formation of the optimal infrastructure for its implementation are key factors for the economic development of Belarus as a transit country. At the same time, the pace of development of logistic activities in the country is currently somewhat lower than in the neighboring countries, as evidenced by the dynamics of the country’s position in international rankings (in particular, according to the LPI index). Overcoming these gaps requires improved competitiveness of the logistics infrastructure in the international market. This, in turn, is possible through the clear formulation of, and adherence to, effective functioning principles for the macro-logistics system of Belarus, as well as by increasing the quality of logistics design by means of the econometric models and methods presented in the article. The authors' proposed approach differentiates between the general principles of logistics common to logistics systems of all levels and the specific principles of development of the macro-level logistics system related to improving its transit attractiveness for international freight carriers. The study also systematizes the models for determining the optimal location of logistics facilities. Particular attention is paid to the methodological basis of the analysis of transport terminals functioning as part of logistics centers, both at the design and operation stages. The developed theoretical and methodological recommendations are universal and can be used in the design of logistics infrastructure for various purposes and functions

  1. Development of large-scale functional brain networks in children.

    Directory of Open Access Journals (Sweden)

    Kaustubh Supekar

    2009-07-01

Full Text Available The ontogeny of large-scale functional organization of the human brain is not well understood. Here we use network analysis of intrinsic functional connectivity to characterize the organization of brain networks in 23 children (ages 7-9 y) and 22 young-adults (ages 19-22 y). Comparison of network properties, including path-length, clustering-coefficient, hierarchy, and regional connectivity, revealed that although children and young-adults' brains have similar "small-world" organization at the global level, they differ significantly in hierarchical organization and interregional connectivity. We found that subcortical areas were more strongly connected with primary sensory, association, and paralimbic areas in children, whereas young-adults showed stronger cortico-cortical connectivity between paralimbic, limbic, and association areas. Further, combined analysis of functional connectivity with wiring distance measures derived from white-matter fiber tracking revealed that the development of large-scale brain networks is characterized by weakening of short-range functional connectivity and strengthening of long-range functional connectivity. Importantly, our findings show that the dynamic process of over-connectivity followed by pruning, which rewires connectivity at the neuronal level, also operates at the systems level, helping to reconfigure and rebalance subcortical and paralimbic connectivity in the developing brain. Our study demonstrates the usefulness of network analysis of brain connectivity to elucidate key principles underlying functional brain maturation, paving the way for novel studies of disrupted brain connectivity in neurodevelopmental disorders such as autism.

  2. Development of large-scale functional brain networks in children.

    Science.gov (United States)

    Supekar, Kaustubh; Musen, Mark; Menon, Vinod

    2009-07-01

    The ontogeny of large-scale functional organization of the human brain is not well understood. Here we use network analysis of intrinsic functional connectivity to characterize the organization of brain networks in 23 children (ages 7-9 y) and 22 young-adults (ages 19-22 y). Comparison of network properties, including path-length, clustering-coefficient, hierarchy, and regional connectivity, revealed that although children and young-adults' brains have similar "small-world" organization at the global level, they differ significantly in hierarchical organization and interregional connectivity. We found that subcortical areas were more strongly connected with primary sensory, association, and paralimbic areas in children, whereas young-adults showed stronger cortico-cortical connectivity between paralimbic, limbic, and association areas. Further, combined analysis of functional connectivity with wiring distance measures derived from white-matter fiber tracking revealed that the development of large-scale brain networks is characterized by weakening of short-range functional connectivity and strengthening of long-range functional connectivity. Importantly, our findings show that the dynamic process of over-connectivity followed by pruning, which rewires connectivity at the neuronal level, also operates at the systems level, helping to reconfigure and rebalance subcortical and paralimbic connectivity in the developing brain. Our study demonstrates the usefulness of network analysis of brain connectivity to elucidate key principles underlying functional brain maturation, paving the way for novel studies of disrupted brain connectivity in neurodevelopmental disorders such as autism.
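The network properties compared in these studies (path length and clustering coefficient, the ingredients of "small-world" organization) can be computed directly from an adjacency structure. The toy graph below stands in for a thresholded functional-connectivity network and is not real fMRI data:

```python
from collections import deque

# Toy undirected graph: nodes = brain regions, edges = supra-threshold
# correlations between regional BOLD time series (illustrative only).
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3}, 3: {0, 2, 4}, 4: {3, 5}, 5: {4}}

def avg_path_length(adj):
    """Mean shortest-path length over all connected node pairs (BFS per source)."""
    total = pairs = 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())  # dist[src] = 0 contributes nothing
        pairs += len(dist) - 1
    return total / pairs

def clustering(adj, n):
    """Fraction of node n's neighbour pairs that are themselves linked."""
    nbrs, k = adj[n], len(adj[n])
    if k < 2:
        return 0.0
    links = sum(1 for u in nbrs for v in nbrs if u < v and v in adj[u])
    return 2 * links / (k * (k - 1))
```

A "small-world" network combines a short average path length with clustering well above that of a random graph of the same density; group comparisons in the studies above are made on exactly these quantities.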

  3. Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.

    Science.gov (United States)

    Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M

    2017-07-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.
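
    A quantity underlying much of GRT analysis is the design effect, which inflates variance relative to individual randomization because members of the same group are correlated; a minimal sketch (the group size and ICC values are illustrative):

```python
def design_effect(m, icc):
    """Variance inflation factor for group randomization:
    DE = 1 + (m - 1) * ICC, with m the average group size."""
    return 1.0 + (m - 1) * icc

def effective_sample_size(n_total, m, icc):
    """Individually randomized sample size equivalent to a GRT with n_total subjects."""
    return n_total / design_effect(m, icc)

# Illustrative scenario: 20 groups of 50 subjects, ICC = 0.02.
de = design_effect(50, 0.02)                   # 1.98
n_eff = effective_sample_size(1000, 50, 0.02)  # ~505 effective subjects
```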

  4. Teaching Theory of Science and Research Methodology to Nursing Students: A Practice-Developing Approach

    DEFF Research Database (Denmark)

    Sievert, Anne; Chaiklin, Seth

    , in a principled way, to select subject-matter content for a course for nursing students on theory of science and research methodology. At the same time, the practical organisation of the project was motivated by a practice-developing research perspective. The purpose of the presentation is to illustrate how...... the idea of practice-developing research was realised in this concrete project. A short introduction is first given to explain the practical situation that motivated the need and interest to select subject matter for teaching. Then, the main part of the presentation explains the considerations involved...... developed. On the basis of this presentation, it should be possible to get a concrete image of one form for practice-developing research. The presentation concludes with a discussion that problematises the sense in which general knowledge about development of nursing school teaching practice has been...

  5. Using practice development methodology to develop children's centre teams: ideas for the future.

    Science.gov (United States)

    Hemingway, Ann; Cowdell, Fiona

    2009-09-01

    The Children's Centre Programme is a recent development in the UK and brings together multi-agency teams to work with disadvantaged families. Practice development methods enable teams to work together in new ways. Although the term practice development remains relatively poorly defined, its key properties suggest that it embraces engagement, empowerment, evaluation and evolution. This paper introduces the Children's Centre Programme and practice development methods and aims to discuss the relevance of using this method to develop teams in children's centres through considering the findings from an evaluation of a two-year project to develop inter-agency public health teams. The evaluation showed that practice development methods can enable successful team development and showed that through effective facilitation, teams can change their practice to focus on areas of local need. The team came up with their own process to develop a strategy for their locality.

  6. Methodological Aspects of Modeling Development and Viability of Systems and Counterparties in the Digital Economy

    Directory of Open Access Journals (Sweden)

    Vitlinskyy Valdemar V.

    2018-03-01

    The aim of the article is to study and generalize methodological approaches to modeling the economic development and viability of economic systems with consideration for risk and for changes in their goals, status, and behavior in the digital economy. A definition of the categories of economic development and viability is offered, and the directions of their research by means of mathematical modeling are grounded. The system of characteristics and markers of the external economic environment under conditions of digitalization of economic activity is analyzed. The theoretical foundations and methodology for mathematical modeling of the development of economic systems, as well as for ensuring their viability and security under conditions of introducing the infrastructure of the information society and digital economy on the principles of the information and knowledge approach, are considered. It is proved that in an information society, predictive model technologies are a growing safety resource. Prerequisites are studied for replacing the traditional integration concept of evaluation, analysis, modeling, management, and administration of economic development with one based on a threat-oriented approach to the definition of security protectors, information, and knowledge. A concept is proposed for creating a database of models for examining trends and patterns of economic development which, unlike traditional trend models of dynamics, identifies and iteratively conceptualizes processes based on a set of knowledgeable predictors, using data mining and machine learning tools, including deep learning.

  7. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dorp, F van [NAGRA (Switzerland); and others

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all the steps in the recommended methodology. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example, because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The starting point for the methodology has three components. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development. The International FEP List includes FEPs to do with the assessment context.

  9. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    Directory of Open Access Journals (Sweden)

    Johnston Marie

    2007-03-01

    Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its (i) theoretical framework (was there a definition of what was being assessed, and was it part of a theoretical model?) and (ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest), and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of theoretical framework for both clinician report and patient self-report measures. This review also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological

  10. Application of code scaling, applicability and uncertainty methodology to large break LOCA analysis of two loop PWR

    International Nuclear Information System (INIS)

    Mavko, B.; Stritar, A.; Prosek, A.

    1993-01-01

    In NED 119, No. 1 (May 1990), a series of six papers published by a Technical Program Group presented a new methodology for the safety evaluation of emergency core cooling systems in nuclear power plants. This paper describes the application of that new methodology to the LB LOCA analysis of a two-loop Westinghouse power plant. Results of the original work were used wherever possible, so that the analysis was finished in less than one man-year of work. Steam generator plugging level and safety injection flow rate were used as additional uncertainty parameters, which had not been used in the original work. The computer code RELAP5/MOD2 was used. The response surface was generated by regression analysis and by the artificial-neural-network-like Optimal Statistical Estimator method. Results were also compared to the analytical calculation. (orig.)
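
    The response-surface step can be sketched as an ordinary least-squares fit of a quadratic surrogate over normalized uncertainty parameters; the parameter names, coefficients, and "code runs" below are synthetic stand-ins, not RELAP5/MOD2 output.

```python
import numpy as np

# Hypothetical CSAU-style response surface: fit a peak cladding temperature
# surrogate as a full quadratic in two normalized uncertainty parameters x1, x2.
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 40)
x2 = rng.uniform(-1, 1, 40)
pct = 1200 + 80 * x1 - 30 * x2 + 15 * x1 * x2 + 10 * x1**2  # surrogate "code runs"

# Design matrix for the quadratic model, solved by least squares.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, pct, rcond=None)
```

With noise-free surrogate data the fit recovers the generating coefficients exactly; in practice the regression would be run on actual code calculations and the fitted surface sampled by Monte Carlo.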

  11. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    Science.gov (United States)

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into territorial management in a practical and effective way. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in all this process, but still few researchers are investing in the development of a proper methodology. The assessment methodologies published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. The current work aims at the development of an effective methodology for the assessment of as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. It is expected to attain the proper procedures in order to assess geodiversity
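
    One common style of quantitative geodiversity assessment sums per-layer class richness over a moving window; a minimal sketch with hypothetical class-code rasters (not the authors' GIS routines):

```python
import numpy as np

def richness(grid, radius=1):
    """Number of distinct classes in the (2r+1)x(2r+1) window around each cell."""
    rows, cols = grid.shape
    out = np.zeros(grid.shape, dtype=int)
    for i in range(rows):
        for j in range(cols):
            win = grid[max(i - radius, 0):i + radius + 1,
                       max(j - radius, 0):j + radius + 1]
            out[i, j] = len(np.unique(win))
    return out

# Hypothetical class-code rasters for two geodiversity layers.
lithology = np.array([[1, 1, 2], [1, 2, 2], [3, 3, 2]])
landforms = np.array([[4, 4, 4], [4, 5, 5], [4, 5, 5]])

# Simple additive geodiversity index: sum of per-layer richness maps.
index = richness(lithology) + richness(landforms)
```

In a GIS workflow each raster would come from a thematic map resampled to a common grid, and the window radius would be chosen according to map scale.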

  12. Development of a new methodology for the creation of water temperature scenarios using frequency analysis tool.

    Science.gov (United States)

    Val, Jonatan; Pino, María Rosa; Chinarro, David

    2018-03-15

    Thermal quality in river ecosystems is a fundamental property for the development of biological processes and many of the human activities linked to the aquatic environment. In the future, this property is going to be threatened by global change impacts, and basin managers will need useful tools to evaluate these impacts. Currently, future projections in temperature modelling are based on historical data for air and water temperatures and on the relationship with past temperature scenarios; however, this presents a problem when evaluating future scenarios with new thermal impacts. Here, we analysed the thermal impacts produced by several human activities and linked them with the degree of decoupling of the thermal transfer mechanism from natural systems, measured with frequency analysis tools (wavelet coherence). Once this relationship was established, we developed a new methodology for simulating different thermal impact scenarios in order to project them into the future. Finally, we validated this methodology using a site that changed its thermal quality during the studied period due to human impacts. Results showed a high correlation (r² = 0.84) between the degree of decoupling of the thermal transfer mechanisms and the quantified human impacts, yielding 3 thermal impact scenarios. Furthermore, the graphic representation of these thermal scenarios with their wavelet coherence spectra showed the impacts of an extreme drought period and of agricultural management. The inter-conversion between the scenarios gave high morphological similarities in the obtained wavelet coherence spectra, and the validation process clearly showed high efficiency of the developed model against older methodologies when compared using the Nash-Sutcliffe criterion. Although there is a need for further investigation under different climatic and anthropic management conditions, the developed frequency models could be useful in decision-making processes by managers when faced with future global
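
    As a simpler stand-in for the wavelet coherence used in the study, magnitude-squared coherence (scipy.signal.coherence) can illustrate the coupling/decoupling idea on synthetic air and water temperature series; the signals, amplitudes, and windowing parameters below are invented for illustration only.

```python
import numpy as np
from scipy.signal import coherence

# One year of synthetic daily temperatures: water follows the same seasonal
# cycle as air (coupled), while "noise" shares nothing with it (decoupled).
rng = np.random.default_rng(1)
t = np.arange(365)
air = 15 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, t.size)
water = 12 + 7 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, t.size)
noise = rng.normal(0, 3, t.size)

# High low-frequency coherence indicates a preserved thermal transfer mechanism.
f, cxy_coupled = coherence(air, water, fs=1.0, nperseg=128)
f, cxy_noise = coherence(air, noise, fs=1.0, nperseg=128)
```

Wavelet coherence additionally localizes this coupling in time, which is what allows the drought and management impacts above to be seen as transient decoupling episodes.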

  13. Developing high coercivity in large diameter cobalt nanowire arrays

    Science.gov (United States)

    Montazer, A. H.; Ramazani, A.; Almasi Kashi, M.; Zavašnik, J.

    2016-11-01

    Regardless of the synthetic method, developing high magnetic coercivity in ferromagnetic nanowires (NWs) with large diameters has been a challenge over the past two decades. Here, we report on the synthesis of highly coercive cobalt NW arrays with diameters of 65 and 80 nm, embedded in porous anodic alumina templates with high-aspect-ratio pores. Using a modified electrochemical deposition method enabled us to reach room-temperature coercivity and remanent ratio of up to 3000 Oe and 0.70, respectively, for highly crystalline as-synthesized hcp cobalt NW arrays with a length of 8 μm. First-order reversal curve (FORC) analysis showed the presence of both soft and hard magnetic phases along the length of the resulting NWs. To develop higher coercive fields, the length of the NWs was then gradually reduced from bottom to top, thereby reaching NW sections governed by the hard phase. Consequently, this resulted in record-high coercivities of 4200 and 3850 Oe at NW diameters of 65 and 80 nm, respectively. In this case, the FORC diagrams confirmed a significant reduction in interactions between the magnetic phases of the remaining sections of the NWs. At this stage, x-ray diffraction (XRD) and dark-field transmission electron microscopy analyses indicated the formation of highly crystalline bamboo-like sections along the [0 0 2] direction during progressive pulse-controlled electrochemical growth of the NW arrays under optimized parameters. Our results provide new insights into the growth process, crystalline characteristics and magnetic phases along the length of large-diameter NW arrays and, furthermore, advance the performance of pure 3d transition-metal magnetic NWs.
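
    The FORC distribution referred to above is conventionally defined as ρ(Ha, Hb) = -(1/2) ∂²M/∂Ha∂Hb, evaluated on a grid of first-order reversal curves; a numerical sketch on a synthetic single-phase magnetization surface (the field ranges, units, and switching model are hypothetical, not the measured cobalt data):

```python
import numpy as np

# Synthetic magnetization surface M(Ha, Hb) on a reversal-field grid:
# Ha is the reversal field, Hb the applied field on the return branch.
ha = np.linspace(-5, 0, 50)    # reversal fields (arbitrary units)
hb = np.linspace(-5, 5, 100)   # return-branch applied fields
HA, HB = np.meshgrid(ha, hb, indexing="ij")

# Toy smooth single-phase switching surface.
M = np.tanh(HB - 0.5 * HA - 1.0)

# FORC distribution: rho = -(1/2) * d2M / (dHa dHb), via nested finite differences.
dM_dHb = np.gradient(M, hb, axis=1)
d2M = np.gradient(dM_dHb, ha, axis=0)
rho = -0.5 * d2M
```

On measured data the (Ha, Hb) grid is transformed to coercivity/interaction-field coordinates, where soft and hard phases appear as separate ridges of rho.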

  14. Physiological and methodological aspects of rate of force development assessment in human skeletal muscle.

    Science.gov (United States)

    Rodríguez-Rosell, David; Pareja-Blanco, Fernando; Aagaard, Per; González-Badillo, Juan José

    2017-12-20

    Rate of force development (RFD) refers to the ability of the neuromuscular system to increase contractile force from a low or resting level when muscle activation is performed as quickly as possible, and it is considered an important muscle strength parameter, especially for athletes in sports requiring high-speed actions. The assessment of RFD has been used for strength diagnosis, to monitor the effects of training interventions in both healthy populations and patients, to discriminate high-level athletes from those of lower levels, to evaluate the impairment in mechanical muscle function after acute bouts of eccentric muscle actions, and to estimate the degree of fatigue and recovery after acute exhausting exercise. Notably, the evaluation of RFD in human skeletal muscle is a complex task, as it is influenced by numerous distinct methodological factors, including mode of contraction, type of instruction, method used to quantify RFD, devices used for force/torque recording and ambient temperature. Another important aspect is our limited understanding of the mechanisms underpinning rapid muscle force production. Therefore, this review is primarily focused on (i) describing the main mechanical characteristics of RFD; (ii) analysing various physiological factors that influence RFD; and (iii) presenting and discussing central biomechanical and methodological factors affecting the measurement of RFD. The intention of this review is to provide more methodological and analytical coherency on the RFD concept, which may help clarify the thinking of coaches and sports scientists in this area. © 2017 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
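
    A common way to quantify RFD is the average slope of the force-time curve over fixed windows after contraction onset; a minimal sketch on a synthetic force trace (the sampling rate, rise time constant, and window choices are assumptions, not prescriptions from the review):

```python
import numpy as np

# Synthetic isometric force trace sampled at 1 kHz (assumed device rate).
fs = 1000.0
t = np.arange(0, 0.3, 1 / fs)             # 300 ms from contraction onset
force = 600 * (1 - np.exp(-t / 0.08))     # N, toy exponential rise toward 600 N

def avg_rfd(force, fs, t0_ms, t1_ms):
    """Average RFD (N/s) between two time points after onset, given in ms."""
    i0, i1 = int(t0_ms * fs / 1000), int(t1_ms * fs / 1000)
    return (force[i1] - force[i0]) / ((i1 - i0) / fs)

rfd_0_50 = avg_rfd(force, fs, 0, 50)     # early phase, often linked to neural factors
rfd_0_200 = avg_rfd(force, fs, 0, 200)   # later phase, closer to maximal strength
```

The window choice matters: for this toy trace the 0-50 ms slope exceeds the 0-200 ms slope, which is one reason the review stresses reporting the exact quantification method.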

  15. Development of a simplified methodology for the isotopic determination of fuel spent in Light Water Reactors

    International Nuclear Information System (INIS)

    Hernandez N, H.; Francois L, J.L.

    2005-01-01

    This work presents a simplified methodology to quantify the isotopic content of the spent fuel of light water reactors; its application is specific to the Laguna Verde nuclear power plant, by means of an 18-month equilibrium cycle. The methodology is divided into two parts: the first consists of the development of a simplified cell model for the isotopic quantification of the irradiated fuel. With this model, a burnup of 48,000 MWd/tU of the fuel in the core of the reactor is simulated, taking as a basis one fuel assembly of type 10x10 and using a two-dimensional simulator for a fuel cell of a light water reactor (CPM-3). The second part of the methodology is based on the creation of an isotopic decay model through an algorithm in C++ (decay) to evaluate the amount, by decay of the radionuclides, after the fuel has been irradiated and until the time at which reprocessing is performed. Finally, the method used for the quantification of the kilograms of uranium and plutonium obtained from a normalized quantity (1000 kg) of irradiated fuel is presented. These results will later allow analyses of the final disposition of the irradiated fuel. (Author)
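
    The decay step can be illustrated with simple exponential decay, N(t) = N0·e^(−λt), applied per nuclide; this sketch stands in for the C++ "decay" algorithm described above but ignores decay chains (no Bateman terms). The inventories are hypothetical, while the half-lives are approximate literature values.

```python
import math

# Approximate half-lives in years (independent decay only, no daughters tracked).
HALF_LIFE_Y = {"Pu-241": 14.3, "Cs-137": 30.1}

def decay(n0, nuclide, years):
    """Mass remaining after the cooling time: N(t) = N0 * exp(-ln(2) * t / T_half)."""
    lam = math.log(2) / HALF_LIFE_Y[nuclide]
    return n0 * math.exp(-lam * years)

# Grams remaining after 10 years of cooling, per hypothetical 1000 g of inventory.
pu241 = decay(1000.0, "Pu-241", 10.0)   # ~616 g
cs137 = decay(1000.0, "Cs-137", 10.0)   # ~794 g
```

A production calculation would also accumulate the daughter nuclides (e.g. Am-241 from Pu-241), which is exactly what a decay-chain solver adds over this sketch.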

  16. Development of a methodology for automated assessment of the quality of digitized images in mammography

    International Nuclear Information System (INIS)

    Santana, Priscila do Carmo

    2010-01-01

    The process of evaluating the quality of radiographic images in general, and mammography in particular, can be much more accurate, practical and fast with the help of computer analysis tools. The purpose of this study is to develop a computational methodology to automate the process of assessing the quality of mammography images through digital image processing (PDI) techniques, using an existing image processing environment (ImageJ). With the application of PDI techniques, it was possible to extract geometric and radiometric characteristics of the evaluated images. The evaluated parameters include spatial resolution, high-contrast detail, low-contrast threshold, linear detail of low contrast, tumor masses, contrast ratio and background optical density. The results obtained by this method were compared with the results of the visual evaluations performed by the Health Surveillance of Minas Gerais. Through this comparison, it was possible to demonstrate that the automated methodology is a promising alternative for the reduction or elimination of the subjectivity present in the visual assessment methodology currently in use. (author)
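
    Quantities such as contrast ratio can be extracted from region-of-interest statistics; a minimal sketch on a synthetic phantom image (this is not the ImageJ-based pipeline of the thesis, the ROI positions and intensities are invented, and the contrast formula is one of several definitions in use):

```python
import numpy as np

# Toy phantom: uniform background at 100 with a square "detail" at 130,
# plus Gaussian acquisition noise.
img = np.full((64, 64), 100.0)
img[24:40, 24:40] = 130.0
img += np.random.default_rng(2).normal(0, 2, img.shape)

detail = img[28:36, 28:36].mean()      # ROI fully inside the detail
background = img[0:16, 0:16].mean()    # ROI in the background

contrast = (detail - background) / background            # relative contrast, ~0.3
snr_detail = (detail - background) / img[0:16, 0:16].std()
```

An automated QA tool repeats this for each phantom insert and compares the numbers against acceptance thresholds, removing the observer from the loop.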

  17. Methodological Aspects in Forecasting Innovation Development of Dairy Cattle Breeding in the Region

    Directory of Open Access Journals (Sweden)

    Natal’ya Aleksandrovna Medvedeva

    2016-07-01

    Because Russia is now a member of the World Trade Organization, long-term forecasting becomes an objectively necessary condition for choosing an effective, science-based long-term strategy for the development of dairy cattle breeding that takes intellectual and innovative characteristics into consideration. The current structure of available statistical information does not meet the modern challenges of innovation development and does not adequately reflect the trends of ongoing changes. The paper suggests a system of indicators to analyze the status, development and prospects of dairy cattle breeding in the region; this system provides timely identification of emerging risks and threats of deviation from the specified parameters. The system includes indicators contained in current statistical reporting and new indicators of the innovation development of the industry, the quality of human capital and the level of government support. When designing the system of indicators, we used several methodological aspects of the Oslo Manual, which the Federal State Statistics Service considers an official methodological document concerning the collection of information about innovation activities. The structured system of indicators shifts the emphasis in the analysis of final results to the conditions and prerequisites that help achieve forecast performance indicators in the functioning of Russia’s economy under WTO rules and supports substantiated management decisions.

  18. Methodology of analysis sustainable development of Ukraine by using the theory fuzzy logic

    Directory of Open Access Journals (Sweden)

    2016-02-01

    The objective of the article is the analysis of theoretical and methodological aspects of the assessment of sustainable development in times of crisis. A methodical approach to the analysis of the sustainable development of a territory, taking into account the assessment of the level of economic security, is proposed. The necessity of developing a complex methodical approach that accounts for indeterminacy and multi-criteria properties in the tasks of providing economic security, on the basis of fuzzy logic theory (the fuzzy sets theory), is proved. The results of using the fuzzy sets method to study the dynamics of sustainable development in Ukraine during the years 2002-2012 are presented.
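
    Fuzzy-set assessments typically map a crisp indicator value to membership degrees in linguistic terms; a minimal sketch with triangular membership functions (the term names and breakpoints are hypothetical, not the paper's calibration):

```python
def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms for a normalized security indicator in [0, 1].
terms = {
    "low":    (0.0, 0.0, 0.4),
    "medium": (0.2, 0.5, 0.8),
    "high":   (0.6, 1.0, 1.0),
}

def classify(x):
    """Membership degree of x in each linguistic term (rounded for display)."""
    return {name: round(triangular(x, *abc), 3) for name, abc in terms.items()}

memberships = classify(0.35)   # partly "low", mostly "medium"
```

A full assessment would aggregate such memberships across many indicators via fuzzy rules before defuzzifying into a single sustainability score.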

  19. Development methodology for the software life cycle process of the safety software

    International Nuclear Information System (INIS)

    Kim, D. H.; Lee, S. S.; Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B.

    2002-01-01

    A methodology for developing software life cycle processes (SLCP) is proposed in order to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). The software life cycle model is selected as a hybrid model mixing the waterfall, prototyping, and spiral models, and is composed of two stages: development of the prototype of the ESF-CCS, and development of the ESF-CCS itself. To produce the software life cycle (SLC) for the development of the digital reactor safety system, the activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available OPAs (Organizational Process Assets) are applied to the SLC activities and the known constraints are reconciled. The established SLCP describes well the software life cycle activities with which the Regulatory Authority is provided.

  20. Cognitive models of executive functions development: methodological limitations and theoretical challenges

    Directory of Open Access Journals (Sweden)

    Florencia Stelzer

    2014-01-01

    Executive functions (EF) have been defined as a series of higher-order cognitive processes which allow the control of thought, behavior and affect in accordance with the achievement of a goal. Such processes follow a lengthy postnatal development which matures completely only by the end of adolescence. In this article we review some of the main models of EF development during childhood. The aim of this work is to describe the state of the art on the topic, identifying the main theoretical difficulties and methodological limitations associated with the different proposed paradigms. Finally, some suggestions are given for coping with such difficulties, emphasizing that the development of an ontology of EF could be a viable alternative for countering them. We believe that future research should be directed toward the development of that ontology.