WorldWideScience

Sample records for modeling methodology development

  1. Methodological Developments in Geophysical Assimilation Modeling

    Science.gov (United States)

    Christakos, George

    2005-06-01

    This work presents recent methodological developments in geophysical assimilation research. We revisit the meaning of the term "solution" of a mathematical model representing a geophysical system, and we examine its operational formulations. We argue that an assimilation solution based on epistemic cognition (which assumes that the model describes incomplete knowledge about nature and focuses on conceptual mechanisms of scientific thinking) could lead to more realistic representations of the geophysical situation than a conventional ontologic assimilation solution (which assumes that the model describes nature as is and focuses on form manipulations). Conceptually, the two approaches are fundamentally different. Unlike the reasoning structure of conventional assimilation modeling that is based mainly on ad hoc technical schemes, the epistemic cognition approach is based on teleologic criteria and stochastic adaptation principles. In this way some key ideas are introduced that could open new areas of geophysical assimilation to detailed understanding in an integrated manner. A knowledge synthesis framework can provide the rational means for assimilating a variety of knowledge bases (general and site specific) that are relevant to the geophysical system of interest. Epistemic cognition-based assimilation techniques can produce a realistic representation of the geophysical system, provide a rigorous assessment of the uncertainty sources, and generate informative predictions across space-time. The mathematics of epistemic assimilation involves a powerful and versatile spatiotemporal random field theory that imposes no restriction on the shape of the probability distributions or the form of the predictors (non-Gaussian distributions, multiple-point statistics, and nonlinear models are automatically incorporated) and accounts rigorously for the uncertainty features of the geophysical system. In the epistemic cognition context the assimilation concept may be used to
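    A compact way to see the structural contrast: in a knowledge-synthesis (epistemic) setting, a prior law is obtained from general knowledge G by entropy maximization and is then conditioned on site-specific knowledge S, with no Gaussian or linearity assumption entering at either stage. A hedged LaTeX sketch of this two-stage structure (the symbols and operator names below are illustrative, not quoted from the paper):

      f_G(\chi) \;\propto\; \exp\!\Big(\sum_{\alpha} \mu_{\alpha}\, g_{\alpha}(\chi)\Big),
      \qquad
      f_K(\chi_k) \;=\; A^{-1} \int \xi_S(\chi_{\mathrm{soft}})\,
        f_G(\chi_{\mathrm{hard}}, \chi_{\mathrm{soft}}, \chi_k)\, d\chi_{\mathrm{soft}},

    where the g_alpha are moment functions encoding the general knowledge, mu_alpha their Lagrange multipliers, xi_S a (possibly non-Gaussian) weighting function for the soft site-specific data, and A a normalization constant.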

  2. METHODOLOGICAL APPROACHES FOR MODELING THE RURAL SETTLEMENT DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Gorbenkova Elena Vladimirovna

    2017-10-01

    Subject: the paper describes the research results on validation of a rural settlement development model. The basic methods and approaches for solving the problem of assessing the efficiency of urban and rural settlement development are considered. Research objectives: determination of methodological approaches to modeling and creating a model for the development of rural settlements. Materials and methods: domestic and foreign experience in modeling the territorial development of urban and rural settlements and settlement structures was generalized. The motivation for using the Pentagon-model for solving similar problems was demonstrated. Based on a systematic analysis of existing development models of urban and rural settlements, as well as the authors' method for assessing the level of agro-town development, the systems/factors that are necessary for the sustainable development of a rural settlement are identified. Results: we created a rural development model which consists of five major systems that include critical factors essential for achieving sustainable development of a settlement system: the ecological system, economic system, administrative system, anthropogenic (physical) system and social system (supra-structure). The methodological approaches for creating an evaluation model of rural settlement development were revealed; the basic motivating factors that provide interrelations of systems were determined; the critical factors for each subsystem were identified and substantiated. Such an approach is justified by the composition of territorial planning tasks at the local and state administration levels. The feasibility of applying the basic Pentagon-model, which was successfully used for solving analogous problems of sustainable development, was shown. Conclusions: the resulting model can be used for identifying and substantiating the critical factors for rural sustainable development and also become the basis of

  3. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

    In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world have introduced new methodological principles into their spatial development models. These methodologies and spatial planning guidelines focus not only on the controlled development of high-rise buildings, but also on the spatial modelling of the whole city, by defining main development criteria and estimating possible consequences. Vilnius is no exception; however, the re-establishment of Lithuania's independence was followed by an uncontrolled urbanization process, so most of the city's development regulations emerged as a consequence of unmanaged processes of legalizing investors' expectations. The importance of a consistent urban fabric, as well as the conservation and representation of the city's most important objects, gained attention only when an actual threat emerged of overshadowing them with new architecture, alongside unmanaged urbanization in the city center and urban sprawl in suburbia caused by land-use projects. Vilnius's current spatial planning documents clearly define the urban structure and key development principles; however, the definitions are relatively abstract, leading to uniform building coverage requirements for territories with distinct qualities and to simplified planar designs which do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modeling methods, their individual parts, principles, the criteria for quality assessment and their applicability in Vilnius. The text contains an outline of possible building coverage regulations and impact assessment criteria for new development. The article contains a compendium of requirements for high-quality spatial planning and building design.

  4. Development of a General Modelling Methodology for Vacuum Residue Hydroconversion

    Directory of Open Access Journals (Sweden)

    Pereira de Oliveira L.

    2013-11-01

    This work concerns the development of a methodology for kinetic modelling of refining processes, and more specifically for vacuum residue conversion. The proposed approach makes it possible to overcome the lack of molecular detail of the petroleum fractions and to simulate the transformation of the feedstock molecules into effluent molecules by means of a two-step procedure. In the first step, a synthetic mixture of molecules representing the feedstock for the process is generated via a molecular reconstruction method, termed SR-REM molecular reconstruction. In the second step, a kinetic Monte-Carlo method (kMC) is used to simulate the conversion reactions on this mixture of molecules. The molecular reconstruction was applied to several petroleum residues and is illustrated for an Athabasca (Canada) vacuum residue. The kinetic Monte-Carlo method is then described in detail. In order to validate this stochastic approach, a lumped deterministic model for vacuum residue conversion was simulated using Gillespie's Stochastic Simulation Algorithm. Despite the fact that the two approaches are based on very different hypotheses, the stochastic simulation algorithm simulates the conversion reactions with the same accuracy as the deterministic approach. The full-scale stochastic simulation approach using molecular-level reaction pathways provides a high level of detail on the effluent composition and is briefly illustrated for Athabasca VR hydrocracking.
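    To make the validation step concrete: Gillespie's Stochastic Simulation Algorithm draws exponentially distributed waiting times and selects each next reaction in proportion to its propensity. A minimal Python sketch for a lumped, first-order conversion network (the lumps and rate constants are invented for illustration; the paper's molecular-level network is far larger):

      import math
      import random

      # Lumped reactions: VR -> Distillate (k1), VR -> Gas (k2), Distillate -> Gas (k3)
      k = [0.8, 0.2, 0.1]                        # illustrative rate constants, 1/h
      stoich = [("VR", "Dist"), ("VR", "Gas"), ("Dist", "Gas")]
      state = {"VR": 10000, "Dist": 0, "Gas": 0}

      t, t_end = 0.0, 5.0
      while t < t_end:
          a = [k[j] * state[src] for j, (src, _) in enumerate(stoich)]  # propensities
          a0 = sum(a)
          if a0 == 0.0:
              break                                  # nothing left to react
          t += -math.log(1.0 - random.random()) / a0  # exponential waiting time
          r, acc = random.random() * a0, 0.0
          for j, (src, dst) in enumerate(stoich):
              acc += a[j]
              if r <= acc:                           # reaction j fires with prob a_j/a0
                  state[src] -= 1
                  state[dst] += 1
                  break

      print(state)                                   # lumped composition at t_end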

  5. Proposal for product development model focused on CE certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro

    2015-09-01

    This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands of European Community (CE) product certification. Furthermore, it presents a product development model comprising the steps in the models analyzed, including improvements in the activities required for the referred product certification. The proposed improvements are justified by the growing quest for the internationalization of products and processes within companies.

  6. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes have been reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIM, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also be a reference for the development of CIMs using any other formalism.

  7. A generic methodology for developing fuzzy decision models

    NARCIS (Netherlands)

    Bosma, R.; Berg, van den J.; Kaymak, U.; Udo, H.; Verreth, J.

    2012-01-01

    An important paradigm in decision-making models is utility-maximization, where most models do not include actors' motives. Fuzzy set theory, on the other hand, offers a method to simulate human decision-making. However, the literature describing expert-driven fuzzy logic models rarely gives precise
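    For readers unfamiliar with the mechanics, an expert-driven fuzzy model encodes motives as linguistic rules and defuzzifies the aggregated conclusion into a crisp decision. A minimal Python sketch with invented rules and membership functions (purely illustrative, not the rule base of the cited study):

      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function with peak at b (requires a < b < c)."""
          x = np.asarray(x, dtype=float)
          return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

      profit = 0.62                                   # normalized expected profit
      mu_low, mu_high = tri(profit, -0.01, 0.25, 0.5), tri(profit, 0.5, 0.75, 1.01)

      y = np.linspace(0, 1, 101)                      # output: investment intensity
      # Rule 1: IF profit low  THEN intensity low    (Mamdani min-implication)
      # Rule 2: IF profit high THEN intensity high
      agg = np.maximum(np.minimum(mu_low, tri(y, -0.01, 0.25, 0.5)),
                       np.minimum(mu_high, tri(y, 0.5, 0.75, 1.01)))

      decision = float((y * agg).sum() / agg.sum()) if agg.sum() > 0 else 0.5
      print(round(decision, 3))                       # centroid defuzzification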

  8. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  9. Insight into model mechanisms through automatic parameter fitting: a new methodological framework for model development.

    Science.gov (United States)

    Tøndel, Kristin; Niederer, Steven A; Land, Sander; Smith, Nicolas P

    2014-05-20

    Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations, is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input-output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model, are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard deviation of on
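    The "zoom-in" loop can be pictured in a few lines: sample a design in the current parameter box, approximate the simulator with a cheap regression metamodel, keep the parameter vectors whose outputs land near the measured data, and shrink the box around them. A hedged Python sketch (toy simulator and tolerances; the published pipeline uses richer multivariate metamodels):

      import numpy as np

      rng = np.random.default_rng(0)

      def simulator(theta):                    # stand-in for the expensive model
          k1, k2 = theta
          return k1 * np.exp(-k2)              # single scalar output for simplicity

      y_meas, tol = 0.45, 0.05                 # "measured" value and acceptance band
      lo, hi = np.array([0.0, 0.0]), np.array([2.0, 2.0])  # initial parameter box

      for it in range(5):
          thetas = rng.uniform(lo, hi, size=(200, 2))      # new experimental design
          outs = np.array([simulator(t) for t in thetas])
          X = np.column_stack([np.ones(len(thetas)), thetas])
          beta, *_ = np.linalg.lstsq(X, outs, rcond=None)  # linear metamodel y ~ X @ beta
          keep = thetas[np.abs(outs - y_meas) < tol]       # points near the data
          if len(keep) < 5:
              break                                        # region too tight to shrink further
          lo, hi = keep.min(axis=0), keep.max(axis=0)      # zoom the design box
          print(it, lo.round(3), hi.round(3))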

  10. Development and demonstration of a validation methodology for vehicle lateral dynamics simulation models

    Energy Technology Data Exchange (ETDEWEB)

    Kutluay, Emir

    2013-02-01

    In this thesis a validation methodology to be used in the assessment of vehicle dynamics simulation models is presented. Simulation of vehicle dynamics is used to estimate the dynamic responses of existing or proposed vehicles and has a wide array of applications in the development of vehicle technologies. Although simulation environments, measurement tools and mathematical theories on vehicle dynamics are well established, the methodical link between the experimental test data and the validity analysis of the simulation model is still lacking. The developed validation paradigm has a top-down approach to the problem. It is ascertained that vehicle dynamics simulation models can only be validated using test maneuvers, even though they are ultimately aimed at real-world maneuvers. Test maneuvers are determined according to the requirements of the real event at the start of the model development project, and data handling techniques, validation metrics and criteria are declared for each of the selected maneuvers. If the simulation results satisfy these criteria, the simulation is deemed "not invalid". If the simulation model fails to meet the criteria, the model is deemed invalid, and model iteration should be performed. The results are analyzed to determine whether they indicate a modeling error or a modeling inadequacy, and whether a conditional validity in terms of system variables can be defined. Three test cases are used to demonstrate the application of the methodology. The developed methodology successfully identified the shortcomings of the tested simulation model and defined the limits of application. The tested simulation model is found to be acceptable, but valid only in a certain dynamical range. Several insights into the deficiencies of the model are reported in the analysis, but the iteration step of the methodology is not demonstrated. Utilizing the proposed methodology will help to achieve more time and cost efficient simulation projects with
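    The maneuver-wise verdict logic lends itself to a compact check. A sketch in Python, with an illustrative metric and threshold (the thesis defines specific metrics and criteria per test maneuver):

      import numpy as np

      def validate(measured, simulated, threshold=0.15):
          measured, simulated = np.asarray(measured), np.asarray(simulated)
          # normalized RMSE as an example validation metric
          nrmse = np.sqrt(np.mean((simulated - measured) ** 2)) / (np.ptp(measured) + 1e-12)
          return ("not invalid" if nrmse <= threshold else "invalid"), nrmse

      # lateral-acceleration trace from one steady-state cornering maneuver (made up)
      meas = [0.0, 0.5, 1.2, 2.1, 3.0, 3.6]
      sim  = [0.0, 0.6, 1.3, 2.0, 2.8, 3.5]
      print(validate(meas, sim))   # an 'invalid' verdict would trigger model iteration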

  11. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object oriented applications, but fail to capture…

  12. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

    Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors in accidents. Two generations of Human Reliability Assessment (HRA) have been developed; these methodologies are classified into a first and a second generation according to their viewpoints on problem-solving. Accident analysis can be carried out with three techniques: sequential techniques, epidemiological techniques and systemic techniques, with marine accidents falling under the epidemiological technique. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model, which are applied to assess marine accidents. The MOP model can effectively describe the relationships of the other factors which affect an accident, whereas the HEART methodology focuses on human factors only.

  13. Methodology Development for SiC Sensor Signal Modelling in the Nuclear Reactor Radiation Environments

    International Nuclear Information System (INIS)

    Cetnar, J.; Krolikowski, I.P.

    2013-06-01

    This paper deals with SiC detector simulation methodology for signal formation by neutrons and induced secondary radiation, as well as its inverse interpretation. The primary goal is to achieve the SiC capability of simultaneous spectroscopic measurements of neutrons and gamma-rays, for which an appropriate methodology for detector signal modelling and interpretation must be adopted. The process of detector simulation is divided into two basically separate but actually interconnected sections. The first one is the forward simulation of detector signal formation in the field of the primary neutron and secondary radiations, whereas the second one is the inverse problem of finding a representation of the primary radiation based on the measured detector signals. The applied methodology under development is based on the Monte Carlo description of radiation transport and analysis of the reactor physics. The methodology of SiC detector signal interpretation will be based on the existing experience in neutron metrology developed in the past for various neutron and gamma-ray detection systems. Since the novel sensors based on SiC are characterised by a new structure, yet to be finally designed, the methodology for spectroscopic particle fluence measurement must be developed while giving productive feedback to the design process of the SiC sensor, in order to arrive at the best possible design.
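    The inverse step described above is a classic unfolding problem: given a simulated response matrix R and a measured channel-count vector c, recover a non-negative coarse spectrum phi with R·phi ≈ c. A hedged Python sketch (the 3×3 response matrix and counts are placeholders; in practice R would come from the Monte Carlo transport step):

      import numpy as np
      from scipy.optimize import nnls

      R = np.array([[0.80, 0.15, 0.02],    # rows: pulse-height channels
                    [0.15, 0.70, 0.10],    # cols: incident energy groups
                    [0.05, 0.15, 0.88]])
      c = np.array([120.0, 95.0, 60.0])    # measured channel counts

      phi, residual = nnls(R, c)           # non-negative least-squares unfolding
      print(phi.round(1), round(residual, 3))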

  14. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings of the joint SKI/SKB work on scenario development presented in SKB Technical Report 89-35. The methodologies studied are a) event tree analysis, b) influence diagrams and c) Rock Engineering Systems (RES) matrices. Each of the methodologies is explained in this report, together with examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely influence diagrams and the RES methodology. In conclusion, a combination of parts of the influence diagram and RES methodologies is likely to be a promising approach.

  15. Methodological Aspects of Modeling Development and Viability of Systems and Counterparties in the Digital Economy

    Directory of Open Access Journals (Sweden)

    Vitlinskyy Valdemar V.

    2018-03-01

    The aim of the article is to study and generalize methodological approaches to modeling the economic development and viability of economic systems with consideration for risk and for changes in their goals, status, and behavior in the digital economy. A definition of the categories of economic development and viability is offered, and the directions of their research by means of mathematical modeling are grounded. The system of characteristics and markers of the external economic environment under conditions of digitalization of economic activity is analyzed. The theoretical foundations and methodology for mathematical modeling of the development of economic systems, as well as for ensuring their viability and security under conditions of introducing the infrastructure of the information society and digital economy on the principles of the information and knowledge approach, are considered. It is argued that in an information society, predictive model technologies are a growing safety resource. Prerequisites are studied for replacing the traditional integration concept of evaluation, analysis, modeling, management, and administration of economic development based on a threat-oriented approach to the definition of security protectors, information, and knowledge. A concept is proposed for creating a database of models for examining trends and patterns of economic development which, unlike traditional trend models of dynamics, identifies and iteratively conceptualizes processes based on a set of knowledgeable predictors, using data mining and machine learning tools, including deep learning.

  16. Nirex methodology for scenario and conceptual model development. An international peer review

    International Nuclear Information System (INIS)

    1999-06-01

    Nirex has responsibilities for nuclear waste management in the UK. The company's top-level objectives are to maintain technical credibility on deep disposal, to gain public acceptance for a deep geologic repository, and to provide relevant advice to customers on the safety implications of their waste packaging proposals. Nirex utilizes peer reviews as appropriate to keep its scientific tools up to date and to periodically verify the quality of its products. The NEA formed an International Review Team (IRT) consisting of four internationally recognised experts plus a member of the NEA Secretariat. The IRT performed an in-depth analysis of five Nirex scientific reports identified in the terms of reference of the review. The review was primarily to judge whether the Nirex methodology provides an adequate framework to support the building of a future licensing safety case. Another objective was to judge whether the methodology could aid in establishing a better understanding and, ideally, enhance acceptance of a repository among stakeholders. Methodologies for conducting safety assessments include, at a very basic level, the identification of features, events, and processes (FEPs) relevant to the system at hand, their convolution in scenarios for analysis, and the formulation of conceptual models to be addressed through numerical modelling. The main conclusion of the IRT is that Nirex has developed a potentially sound methodology for the identification and analysis of FEPs and for the identification of conceptual model needs and model requirements. The work is still in progress and is not yet complete.

  17. Non-linear mixed effects modeling - from methodology and software development to driving implementation in drug development science.

    Science.gov (United States)

    Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis

    2005-04-01

    Few scientific contributions have made significant impact unless there was a champion who had the vision to see the potential for their use in seemingly disparate areas, and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research, and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented, as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.

  18. Development in methodologies for modelling of human and ecotoxic impacts in LCA

    DEFF Research Database (Denmark)

    Hauschild, Michael Zwicky; Huijbregts, Mark; Jolliet, Olivier

    2009-01-01

    Under the UNEP-SETAC Life Cycle Initiative there is an aim to develop an internationally backed recommended practice of life cycle impact assessment, addressing methodological issues like the choice of characterization model and characterization factors. In this context, an international comparison was performed of characterization models for toxic impacts from chemicals in life cycle assessment. Six commonly used characterization models were compared in a sequence of workshops. Crucial fate, exposure and effect aspects were identified for which the models differed in their treatment. The models were … The USEtox™ model has been used to calculate characterization factors for several thousand substances and is currently under review with the intention that it shall form the basis of the recommendations from the UNEP-SETAC Life Cycle Initiative regarding characterization of toxic impacts in Life Cycle Assessment.
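    For orientation, characterisation factors in USEtox-type models have a multiplicative fate-exposure-effect structure, which is where the "crucial fate, exposure and effect aspects" above enter. A hedged LaTeX sketch in the usual notation:

      CF = FF \times XF \times EF, \qquad IS = \sum_i CF_i \, m_i,

    where FF is the fate factor (compartment residence and transfer), XF the exposure factor (e.g., intake fraction), EF the effect factor (impact per unit intake), m_i the emitted mass of substance i in the inventory, and IS the resulting impact score.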

  19. Frescoed Vaults: Accuracy Controlled Simplified Methodology for Planar Development of Three-Dimensional Textured Models

    Directory of Open Access Journals (Sweden)

    Marco Giorgio Bevilacqua

    2016-03-01

    In the field of documentation and preservation of cultural heritage, there is keen interest in 3D metric viewing and rendering of architecture, for both formal appearance and color. On the other hand, the operative steps of restoration interventions still require full-scale, 2D metric surface representations. The transition from 3D to 2D representation, with the related geometric transformations, has not yet been fully formalized for the planar development of frescoed vaults. Methodologies proposed so far on this subject provide for transitioning from point cloud models to ideal mathematical surfaces and projecting textures using software tools. The methodology used for geometry and texture development in the present work does not require any dedicated software. The different processing steps can be individually checked for any error introduced, which can then be quantified. A direct accuracy check of the planar development of the frescoed surface has been carried out by qualified restorers, yielding a result of 3 mm. The proposed methodology, although requiring further studies to improve the automation of the different processing steps, allowed extracting 2D drafts fully usable by operators restoring the vault frescoes.
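    The geometric core of such a planar development is easiest to see for the special case of a barrel (cylindrical) vault of radius R, where a surface point is flattened by the arc-length-preserving map

      (R\cos\theta,\; R\sin\theta,\; z) \;\longmapsto\; (u, v) = (R\theta,\; z),

    so distances along the directrix are preserved exactly and the photographic texture can be resampled onto the (u, v) plane with controllable error. This simple case is only an illustration; the methodology above is concerned with real, surveyed vault surfaces and checks the error introduced at each processing step.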

  1. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    Science.gov (United States)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, raising the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempts of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards added by future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper describes the methodology used for developing a visiting vehicle risk model.
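    At the top level, such traffic-risk assessments roll individual mission risks up into a station-level number. A sketch of that arithmetic in Python (the per-mission collision probabilities are purely illustrative placeholders, not values from the NASA models):

      # assumed per-rendezvous collision probabilities, one entry per planned mission
      p = {"Soyuz": 1e-5, "Progress": 1e-5, "ATV": 2e-5, "HTV": 2e-5, "Shuttle": 1e-5}

      survive = 1.0
      for p_i in p.values():
          survive *= 1.0 - p_i           # assumes independent rendezvous events
      print(f"P(at least one collision) = {1.0 - survive:.2e}")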

  2. Cognitive models of executive functions development: methodological limitations and theoretical challenges

    Directory of Open Access Journals (Sweden)

    Florencia Stelzer

    2014-01-01

    Executive functions (EF) have been defined as a series of higher-order cognitive processes which allow the control of thought, behavior and affect in pursuit of a goal. Such processes show a lengthy postnatal development, maturing completely only by the end of adolescence. In this article we review some of the main models of EF development during childhood. The aim of this work is to describe the state of the art on the topic, identifying the main theoretical difficulties and methodological limitations associated with the different proposed paradigms. Finally, some suggestions are given to cope with such difficulties, emphasizing that the development of an ontology of EF could be a viable alternative to counter them. We believe that future research should guide its efforts toward the development of that ontology.

  3. Topobathymetric elevation model development using a new methodology: Coastal National Elevation Database

    Science.gov (United States)

    Danielson, Jeffrey J.; Poppenga, Sandra K.; Brock, John C.; Evans, Gayla A.; Tyler, Dean; Gesch, Dean B.; Thatcher, Cindy A.; Barras, John

    2016-01-01

    During the coming decades, coastlines will respond to widely predicted sea-level rise, storm surge, and coastal inundation flooding from disastrous events. Because physical processes in coastal environments are controlled by the geomorphology of over-the-land topography and underwater bathymetry, many applications of geospatial data in coastal environments require detailed knowledge of the near-shore topography and bathymetry. In this paper, an updated methodology used by the U.S. Geological Survey Coastal National Elevation Database (CoNED) Applications Project is presented for developing coastal topobathymetric elevation models (TBDEMs) from multiple topographic data sources together with adjacent intertidal topobathymetric and offshore bathymetric sources to generate seamlessly integrated TBDEMs. This repeatable, updatable, and logically consistent methodology assimilates topographic data (land elevation) and bathymetry (water depth) into a seamless coastal elevation model. Within the overarching framework, vertical datum transformations are standardized in a workflow that interweaves spatially consistent interpolation (gridding) techniques with a land/water boundary mask delineation approach. Output gridded raster TBDEMs are stacked into a file storage system of mosaic datasets within an Esri ArcGIS geodatabase for efficient updating while maintaining current and updated spatially referenced metadata. Topobathymetric data provide a required seamless elevation product for several science application studies, such as shoreline delineation, coastal inundation mapping, sediment transport, sea-level rise, storm surge models, and tsunami impact assessment. These detailed coastal elevation data are critical to depict regions prone to climate change impacts and are essential to planners and managers responsible for mitigating the associated risks and costs to both human communities and ecosystems. The CoNED methodology approach has been used to construct integrated TBDEM models.

  4. Development of a system dynamics model based on Six Sigma methodology

    Directory of Open Access Journals (Sweden)

    José Jovani Cardiel Ortega

    2017-01-01

    A dynamic model to analyze the complexity associated with manufacturing systems and to improve the performance of the process through the Six Sigma philosophy is proposed. The research focuses on the implementation of the system dynamics tool to support each of the phases of the DMAIC methodology. In the first phase, define, the problem is articulated by collecting data, selecting the variables, and representing them in a mental map that helps build the dynamic hypothesis. In the second phase, measure, the model is formulated, equations are developed, and a Forrester diagram is constructed to carry out the simulation. In the third phase, analyze, the simulation results are studied. In the fourth phase, improve, the model is validated through a sensitivity analysis. Finally, in the control phase, operation policies are proposed. This paper presents the development of a dynamic model of a knitted textile production system; the implementation was done in a textile company in southern Guanajuato. The results show an improvement in process performance by increasing the sigma level, validating the proposed approach.
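    The measure-phase simulation amounts to numerically integrating the stock-and-flow equations behind the Forrester diagram. A minimal Python sketch using Euler integration (the process parameters are invented for illustration, not taken from the study):

      dt, T = 0.25, 40.0                 # time step and horizon, days
      wip, finished = 0.0, 0.0           # stocks
      production_rate = 100.0            # units/day flowing into work-in-process
      first_pass_yield = 0.93            # yield tied to the current sigma level

      t = 0.0
      while t < T:
          completion = wip / 2.0                          # outflow: 2-day cycle time
          wip      += dt * (production_rate - completion)
          finished += dt * completion * first_pass_yield  # defective units are scrapped
          t += dt

      print(round(wip, 1), round(finished, 1))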

  5. MODELS AND METHODS OF SAFETY-ORIENTED PROJECT MANAGEMENT OF DEVELOPMENT OF COMPLEX SYSTEMS: METHODOLOGICAL APPROACH

    Directory of Open Access Journals (Sweden)

    Олег Богданович ЗАЧКО

    2016-03-01

    Methods and models for safety-oriented project management of the development of complex systems are proposed, resulting from the convergence of existing approaches in project management, in contrast to the mechanism of value-oriented management. A cognitive model of safety-oriented project management of the development of complex systems is developed, which provides a synergistic effect, namely moving the system from the original (pre-project) state to one that is optimal from the viewpoint of life safety (the post-project state). An approach to assessing project complexity is proposed, which consists in taking into account the seasonal component of the time characteristics of the life cycles of complex organizational and technical systems with occupancy. This made it possible to account for the seasonal component in simulation models of the life cycle of product operation in a complex organizational and technical system, modeling the critical points of operation of systems with occupancy, which forms a new methodology for safety-oriented management of projects, programs and portfolios of projects with formalization of the elements of complexity.

  6. The use of mental models in chemical risk protection: developing a generic workplace methodology.

    Science.gov (United States)

    Cox, Patrick; Niewöhmer, Jörg; Pidgeon, Nick; Gerrard, Simon; Fischhoff, Baruch; Riley, Donna

    2003-04-01

    We adopted a comparative approach to evaluate and extend a generic methodology to analyze the different sets of beliefs held about chemical hazards in the workplace. Our study mapped existing knowledge structures about the risks associated with the use of perchloroethylene and rosin-based solder flux in differing workplaces. "Influence diagrams" were used to represent beliefs held by chemical experts; "user models" were developed from data elicited from open-ended interviews with the workplace users of the chemicals. The juxtaposition of expert and user understandings of chemical risks enabled us to identify knowledge gaps and misunderstandings and to reinforce appropriate sets of safety beliefs and behavior relevant to chemical risk communications. By designing safety information to be more relevant to the workplace context of users, we believe that employers and employees may gain improved knowledge about chemical hazards in the workplace, such that better chemical risk management, self-protection, and informed decision making develop over time.

  7. Development of a practical methodology for integrating shoreline oil-holding capacity into modeling

    International Nuclear Information System (INIS)

    Schmidt Etkin, D.; French-McCay, D.; Rowe, J.; Michel, J.; Boufadel, M.; Li, H.

    2008-01-01

    The factors that influence the behaviour of oil in the aftermath of an oil spill on water include oil type and characteristics; oil thickness on the shoreline; time until shoreline impact; timing with regard to tides; weathering during and after the spill; and nearshore wave energy. The oil behaviour also depends on the shoreline characteristics, particularly porosity and permeability. The interactions of spilled oil with sediments on beaches must be well understood in order to model the oil spill trajectory, fate and risk. The movement of oil can be most accurately simulated if the algorithm incorporates an estimate of shoreline oil retention. This paper presented a literature review of relevant shoreline oiling studies and considered the relevance of the findings for inclusion in modelling. Survey data from detailed shoreline cleanup assessment team (SCAT) work were analyzed for patterns in oil penetration and oil-holding capacity by shoreline sediment type and oil type, for potential use in modelling algorithms. A theoretical beach hydraulics model was then developed for use in a stochastic spill model. Gaps in information were identified, including the manner in which wave action and other environmental variables affect the dynamic processes involved in shoreline oiling. The methodology presented in this paper can be used to estimate the amount of oil held by a shoreline upon impact, allowing a trajectory model to more accurately project the total spread of oil.
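    The retention bookkeeping this enables is simple: oil held on a shoreline segment is capped by a holding capacity that depends on substrate and oil type, and the remainder stays available to the trajectory model. A Python sketch with placeholder capacities (the paper derives the real values from SCAT survey data):

      # maximum holding thickness in mm, keyed by (substrate, oil type) - placeholders
      CAPACITY_MM = {("sand", "light"): 1.0, ("sand", "heavy"): 4.0,
                     ("gravel", "heavy"): 15.0}

      def retained_m3(length_m, width_m, substrate, oil, arriving_m3):
          cap = CAPACITY_MM[(substrate, oil)] / 1000.0 * length_m * width_m  # m^3
          held = min(arriving_m3, cap)
          return held, arriving_m3 - held        # held volume, volume re-floated

      print(retained_m3(500, 10, "gravel", "heavy", arriving_m3=120.0))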

  8. Safeguards methodology development history

    International Nuclear Information System (INIS)

    Chapman, L.D.; Bennett, H.A.; Engi, D.; Grady, L.M.; Hulme, B.L.; Sasser, D.W.

    1979-01-01

    The development of models for the evaluation and design of physical protection systems for fixed-site nuclear facilities was under way in 1974 at Sandia Laboratories and has continued to the present. A history of the evolution of these models, together with the model descriptions, is presented. Several models have been, and continue to be, applied to evaluate and design facility protection systems.

  9. Competence development organizations in project management on the basis of genomic model methodologies

    OpenAIRE

    Бушуев, Сергей Дмитриевич; Рогозина, Виктория Борисовна; Ярошенко, Юрий Федерович

    2013-01-01

    A matrix technology for the identification of organisational competencies in project management is presented in the article. The matrix elements are the components of organizational competence in the field of project management and of the project management methodology, represented in the structure of the genome. The matrix model of competence within the adopted methodologies, and a scanning method for identifying organizational competences, are formalised. Methods are proposed for building effective proj...

  10. Scenario Methodology for Modelling of Future Landscape Developments as Basis for Assessing Ecosystem Services

    Directory of Open Access Journals (Sweden)

    Matthias Rosenberg

    2014-04-01

    The ecosystems of our intensively used European landscapes produce a variety of natural goods and services for the benefit of humankind, and secure the basics and quality of life. Because these ecosystems are still undergoing fundamental changes, the interest of society is to know more about future developments and their ecological impacts. To describe and analyze these changes, scenarios can be developed and an assessment of the ecological changes can be carried out subsequently. In the project "Landscape Saxony 2050", a methodology for the construction of exploratory scenarios was worked out. The presented methodology provides a way to identify the driving forces (socio-cultural, economic and ecological conditions) of landscape development. It allows indicating possible future paths which lead to a change of structures and processes in the landscape and can influence the capability to provide ecosystem services. One essential component of the applied technique is that an approach for assessing the effects of landscape changes on ecosystem services is integrated into the developed scenario methodology. Another is that the methodology is strongly participatory, i.e. stakeholders are actively involved. The method is a seven-phase model which provides the option of integrating stakeholder participation at all levels of scenario development. The scenario framework was applied to the district of Görlitz, an area of 2100 sq km located at the eastern border of Germany. The region is affected by strong demographic as well as economic changes. The core issue focused on the examination of landscape change in terms of biodiversity. Together with stakeholders, a trend scenario and two alternative scenarios were developed. The changes of the landscape structure are represented in storylines, maps and tables. On the basis of the driving forces of the issue areas "cultural/social values" and

  11. Development of A Methodology for Assessing Various Accident Management Strategies Using Decision Tree Models

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Nam Yeong; Kim, Jin Tae; Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of); Jerng, Dong Wook [Chung-Ang University, Seoul (Korea, Republic of)

    2016-05-15

    The purpose of ASP (Accident Sequence Precursor) analysis is to evaluate operational accidents in full power and low power operation by using PRA (Probabilistic Risk Assessment) technologies. Awareness of the importance of ASP analysis has been on the rise. A methodology for ASP analysis has been developed in Korea, and KINS (Korea Institute of Nuclear Safety) has managed the KINS-ASP program since it was developed. In this study, we applied ASP analysis to operational accidents in full power and low power operation to quantify the CCDP (Conditional Core Damage Probability). To reflect these two cases in the PRA model, we modified the fault trees and event trees of the existing PRA model. We also suggest an ASP regulatory system in the conclusion. In this study, we reviewed previous studies on ASP analysis and, based on them, applied it to operational accidents in full power and low power operation. The CCDPs of these two cases are 1.195E-06 and 2.261E-03. Unlike other countries, Korea has no regulatory basis for ASP analysis. ASP analysis can detect risk by assessing existing operational accidents, and can improve the safety of nuclear power plants by detecting and reviewing operational accidents and finally removing potential risk. The operator has to notify the regulatory institute of an operational accident before undertaking recovery work for the accident. When following up on an accident, precursors in the database have to be checked to find similar accidents.
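    The CCDP computation itself has a simple core: re-quantify the PRA model with the observed degraded condition set to probability 1. A toy Python sketch using a rare-event cut-set approximation (the cut sets and probabilities are invented, not taken from the Korean PRA model):

      import math

      cut_sets = [{"AFW", "HPSI"}, {"AFW", "FEED_BLEED"}, {"DG_A", "DG_B"}]
      p = {"AFW": 1e-3, "HPSI": 5e-4, "FEED_BLEED": 2e-3, "DG_A": 2e-2, "DG_B": 2e-2}

      def cdp(prob, observed_failed=()):
          q = dict(prob, **{e: 1.0 for e in observed_failed})
          # rare-event approximation: sum of minimal-cut-set products
          return sum(math.prod(q[e] for e in cs) for cs in cut_sets)

      print(f"baseline CDP   = {cdp(p):.3e}")
      print(f"CCDP, AFW down = {cdp(p, observed_failed=['AFW']):.3e}")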

  12. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States); Denning, Richard [The Ohio State Univ., Columbus, OH (United States); Catalyurek, Umit [The Ohio State Univ., Columbus, OH (United States); Unwin, Stephen [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  13. Development and validation of a new turbocharger simulation methodology for marine two stroke diesel engine modelling and diagnostic applications

    International Nuclear Information System (INIS)

    Sakellaridis, Nikolaos F.; Raptotasios, Spyridon I.; Antonopoulos, Antonis K.; Mavropoulos, Georgios C.; Hountalas, Dimitrios T.

    2015-01-01

    Engine cycle simulation models are increasingly used in diesel engine simulation and diagnostic applications, reducing experimental effort. Turbocharger simulation plays an important role in a model's ability to accurately predict engine performance and emissions. The present work describes the development of a complete engine simulation model for marine Diesel engines based on a new methodology for turbocharger modelling utilizing physically based meanline models for the compressor and turbine. Simulation accuracy is evaluated against engine bench measurements. The methodology was developed to overcome the problem of limited availability of experimental maps for the compressor and turbine, often encountered in large marine diesel engine simulation and diagnostic studies. Data from the engine bench are used to calibrate the models, as well as to estimate the turbocharger shaft mechanical efficiency. The closed cycle and gas exchange are modelled using an existing multizone thermodynamic model. The proposed methodology is applied to a 2-stroke marine diesel engine and its evaluation is based on the comparison of predictions against measured engine data. The model's ability to predict engine response to load variation is demonstrated for both turbocharger performance and closed cycle parameters, as well as NOx emission trends, making it an effective tool for both engine diagnostic and optimization studies. - Highlights: • Marine two stroke diesel engine simulation model. • Turbine and compressor simulation using physical meanline models. • Methodology to derive T/C component efficiency and T/C shaft mechanical efficiency. • Extensive validation of predictions against experimental data.
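    The shaft mechanical efficiency estimation mentioned above rests on the steady-state power balance between compressor and turbine. A hedged Python sketch of that calculation (gas properties and bench readings are illustrative placeholders, not the paper's data):

      cp_air, gamma_air = 1005.0, 1.40     # J/(kg K), specific heat ratio
      cp_gas, gamma_gas = 1150.0, 1.33

      def compressor_power(m_dot, T_in, pr, eta_c):
          """Shaft power absorbed by the compressor (W)."""
          return m_dot * cp_air * T_in * (pr ** ((gamma_air - 1) / gamma_air) - 1) / eta_c

      def turbine_power(m_dot, T_in, er, eta_t):
          """Shaft power delivered by the turbine (W)."""
          return m_dot * cp_gas * T_in * eta_t * (1 - (1 / er) ** ((gamma_gas - 1) / gamma_gas))

      Wc = compressor_power(m_dot=5.0, T_in=300.0, pr=3.2, eta_c=0.78)
      Wt = turbine_power(m_dot=5.1, T_in=800.0, er=2.9, eta_t=0.82)
      print(f"eta_mech = Wc/Wt = {Wc / Wt:.3f}")   # expected to come out slightly below 1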

  14. Methodologies Related to Computational models in View of Developing Anti-Alzheimer Drugs: An Overview.

    Science.gov (United States)

    Baheti, Kirtee; Kale, Mayura Ajay

    2018-04-17

    Over the last two decades, there has been an increasing focus on development strategies in anti-Alzheimer's drug research. This may be attributed to the fact that the causes of most Alzheimer's cases remain unknown, except for a few in which genetic differences have been identified. With the progress of the disease, the symptoms involve intellectual deterioration, memory impairment, abnormal personality and behavioural patterns, confusion, aggression, mood swings and irritability. Current therapies available for this disease give only symptomatic relief and do not address the underlying biomolecular processes. Nearly all the therapies to treat Alzheimer's disease target the amyloid cascade, which is considered to be important in AD pathogenesis. New drug regimens are not able to keep pace with the ever-increasing understanding of dementia at the molecular level. In view of these problems, we put forth molecular modeling as a drug discovery approach for developing novel drugs to treat Alzheimer's disease. The disease is incurable, worsens as it advances, and finally causes death. Because of this, the design of drugs to treat this disease has become an utmost priority for research. One of the most important emerging technologies applied here is computer-assisted drug design (CADD), a research tool that employs large-scale computing strategies in an attempt to develop a model receptor site which can be used for designing anti-Alzheimer drugs. Various models of amyloid-based calcium channels have been computationally optimized. Docking and de novo evolution are used to design the compounds, which are further subjected to absorption, distribution, metabolism, excretion and toxicity (ADMET) studies to finally arrive at active compounds that are able to cross the BBB. Many novel compounds have been designed which might be promising ones for the treatment of AD. The present review describes the research

  15. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    … 2) weighted-least-squares regression. 3) Initialization of the estimation by use of linear algebra, providing a first guess. 4) Sequential and simultaneous GC parameter estimation using four different minimization algorithms. 5) Thorough uncertainty analysis: a) based on asymptotic approximation of the parameter covariance matrix; b) based on the bootstrap method; both providing 95%-confidence intervals of the parameters and the predicted property. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict the lower flammability limit (LFL) of refrigerants … their credibility and robustness in wider industrial and scientific applications.

  16. Development of CCF modeling and analysis methodology for diverse system status

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Tae Jin; Byun, Si Sub; Yoon, Tae Kwan [Soongsil University, Seoul (Korea); Moon, Jae Pil [Seoul National University, Seoul (Korea)

    1999-04-01

    The objective of this project is to develop a procedure for modeling and analyzing CCF efficiently according to various system statuses. CCF events change as the system status changes due to maintenance, accidents, or alternating success criteria for various missions. The objective of the first year's research was to develop a CCF model for various system statuses. We reviewed and evaluated current CCF models and analyzed their merits and deficiencies in modeling various system statuses. An approximate model was developed as the CCF model; the model is compatible with the MGL model. An extensive sensitivity study shows the accuracy and efficiency of the proposed model. The second year's research aimed at the development of an integrated CCF procedure for PSA and risk monitoring. We developed an adaptive method for the approximate model in a k/m/G system with multiple common cause groups. The accuracy of the method is proved by comparison with the implicit method. Next, we developed a method for modeling CCF in a fault tree. Three alternatives were considered, and it proved most efficient to model the CCF events under the gate of the individual component failure. We then provide a method for estimating the CCF probability and develop software for this purpose. We finally provide a fundamental procedure for modeling CCF in a risk monitor. The modeling procedure was applied to the HPSI system, and it proved to be efficient and accurate.
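    For reference, the MGL (Multiple Greek Letters) parametrisation the approximate model is kept compatible with assigns, for a common cause group of size m, a probability to each basic event failing exactly k specific components. A Python sketch for m = 3 (the beta/gamma values are generic illustrative inputs, not the project's estimates):

      from math import comb

      def mgl_q(k, m, q_total, rho):
          """MGL basic-event probability for failure of exactly k specific components.
          rho = [rho_1, rho_2, ...] with rho_1 = 1 (e.g., [1.0, beta, gamma])."""
          rho = list(rho) + [0.0]              # rho_{m+1} = 0 terminates the chain
          prod = 1.0
          for i in range(k):
              prod *= rho[i]                   # rho_1 * ... * rho_k
          return prod * (1.0 - rho[k]) * q_total / comb(m - 1, k - 1)

      q_total, beta, gamma = 1e-3, 0.10, 0.27
      for k in (1, 2, 3):                      # prints (1-b)Qt, b(1-g)Qt/2, b*g*Qt
          print(k, f"{mgl_q(k, 3, q_total, [1.0, beta, gamma]):.3e}")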

  17. Soft Systems Methodology and Problem Framing: Development of an Environmental Problem Solving Model Respecting a New Emergent Reflexive Paradigm.

    Science.gov (United States)

    Gauthier, Benoit; And Others

    1997-01-01

    Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references.

  18. Development of an Evidence-Based Calibration Methodology Dedicated to Energy Audit of Office Buildings. Part 1: Methodology and Modeling.

    OpenAIRE

    Bertagnolio, Stéphane; Andre, Philippe

    2010-01-01

    To promote improvements in the HVAC installations of existing buildings, Article 9 of the EPBD directive establishes mandatory audits and inspections of air-conditioning systems. The development of auditing tools and procedures and the training of future auditors are the main objectives of the HARMONAC project launched in 2007. Four audit stages are generally distinguished: benchmarking, inspection, detailed audit and investment-grade audit. Answering the questions en...

  1. Development of a cross-section methodology and a real-time core model for VVER-1000 simulator application

    Energy Technology Data Exchange (ETDEWEB)

    Georgieva, Emiliya Lyudmilova

    2016-06-06

    The novel academic contributions are summarized as follows. A) A cross-section modelling methodology and a cycle-specific cross-section update procedure are developed to meet fidelity requirements applicable to a cycle-specific reactor core simulation, as well as particular customer needs and practices supporting VVER-1000 operation and safety. B) A real-time version of the Nodal Expansion Method code is developed and implemented into Kozloduy 6 full-scope replica control room simulator.

  2. Development of a new damage function model for power plants: Methodology and applications

    International Nuclear Information System (INIS)

    Levy, J.I.; Hammitt, J.K.; Yanagisawa, Y.; Spengler, J.D.

    1999-01-01

    Recent models have estimated the environmental impacts of power plants, but differences in assumptions and analytical methodologies have led to diverging findings. In this paper, the authors present a new damage function model that synthesizes previous efforts and refines components that have been associated with variations in impact estimates. Their model focuses on end-use emissions and quantifies the direct human health impacts of criteria air pollutants. To compare their model to previous efforts and to evaluate potential policy applications, the authors assess the impacts of an oil- and natural gas-fueled cogeneration power plant in Boston, MA. Impacts under baseline assumptions are estimated to be $0.007/kWh of electricity, $0.23/klb of steam, and $0.004/ton-h of chilled water (representing 2-9% of the market value of outputs). Impacts are largely related to ozone (48%) and particulate matter (42%). Addition of upstream emissions and non-public-health impacts increases externalities by as much as 50%. Sensitivity analyses demonstrate the importance of plant siting, meteorological conditions, epidemiological assumptions, and the monetary value placed on premature mortality, as well as the potential influence of global warming. Comparative analyses demonstrate that the model provides reasonable impact estimates and would therefore be applicable in a broad range of policy settings.
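
    The damage-function chain (emissions, dispersion/exposure, dose-response, monetary valuation) can be made concrete with a toy calculation; every number below is an illustrative placeholder, not a value from the study.

    ```python
    # Toy damage-function chain for a single pollutant and health endpoint.
    emissions_g = 120.0 * 1e6        # annual end-use emissions (120 t -> grams)
    intake_fraction = 1e-6           # grams inhaled per gram emitted (population total)
    cases_per_gram_inhaled = 2e-3    # dose-response slope (placeholder)
    value_per_case_usd = 5.0e6       # monetised health endpoint (placeholder)

    cases = emissions_g * intake_fraction * cases_per_gram_inhaled
    damage_usd = cases * value_per_case_usd
    kwh_per_year = 1.5e9             # annual electricity output (placeholder)
    print(f"externality: {damage_usd / kwh_per_year:.4f} $/kWh")
    ```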

  3. Methodological Development of the Probabilistic Model of the Safety Assessment of Hontomin P.D.T

    International Nuclear Information System (INIS)

    Hurtado, A.; Eguilior, S.; Recreo, F.

    2011-01-01

    In the framework of CO2 capture and geological storage, risk analysis plays an important role, because it is an essential input to the definition and planning of carbon injection strategies at the local, national and supranational levels. Every project carries a risk of failure: even in the early stages, the possible causes of this risk should be taken into account and corrective measures proposed along the process, i.e., the risk should be managed. Proper risk management reduces the negative consequences arising from the project. The main means of reducing or neutralizing risk are its identification, measurement and evaluation, together with the development of decision rules. This report presents the developed methodology for risk analysis and the results of its application. The risk assessment requires determination of the random variables that influence the behaviour of the system. It is very difficult to set up the probability distribution of a random variable in the classical sense (objective probability) when a particular event has rarely occurred or is still incompletely characterized. In this situation we have to determine subjective probabilities, especially at the early stages of a project, when not enough information about the system is available. This subjective probability is constructed from expert judgement, estimating the possibility that certain random events could happen depending on the geological features of the area of application. The proposed methodology is based on the application of Bayesian probabilistic networks for estimating the probability of leakage. These probabilistic networks can graphically define the dependence relations between the variables and express the joint probability function through a local factorization of probability functions. (Author) 98 refs.
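
    A minimal sketch of the local factorization the record refers to, using a hypothetical three-node network (fault presence, seal failure, leakage); all conditional probability values are illustrative.

    ```python
    from itertools import product

    # Hypothetical three-node Bayesian network: Fault -> SealFailure -> Leakage.
    p_fault = {True: 0.1, False: 0.9}
    p_seal_fail = {True: {True: 0.3, False: 0.7},    # P(seal fail | fault)
                   False: {True: 0.02, False: 0.98}}
    p_leak = {True: {True: 0.5, False: 0.5},         # P(leak | seal fail)
              False: {True: 0.001, False: 0.999}}

    # The joint probability factorizes locally: P(F, S, L) = P(F) P(S|F) P(L|S).
    p_leakage = 0.0
    for f, s in product([True, False], repeat=2):
        p_leakage += p_fault[f] * p_seal_fail[f][s] * p_leak[s][True]
    print(f"P(leakage) = {p_leakage:.4f}")
    ```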

  4. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and the software development lifecycle. The ISDF methodology was built especially for innovative software development projects and was developed empirically, by trial and error, in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user's, project owner's and project manager's points of view. The main components of a software development methodology are identified: a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality-measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation-oriented software development methodology is emphasized by highlighting the shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application; the ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component, a dedicated indicator is built, and a template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  5. Development Customer Knowledge Management (Ckm) Models in Purbalingga Hospitality Using Soft Systems Methodology (Ssm)

    OpenAIRE

    Chasanah, Nur; Sensuse, Dana Indra; Lusa, Jonathan Sofian

    2014-01-01

    Development of the tourism sector is part of the national development efforts being implemented in Indonesia. This research was conducted to produce an overview of customer knowledge management models addressing the existing problems in the hospitality sector of Purbalingga, which supports Purbalingga tourism. The model depicts a series of problem-solving activities for the hospitality sector, especially in Purbalingga. This research was action research with methods of S...

  6. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application to lending operations in the Republic of Serbia. With a developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, carried out in order to minimize and manage credit risk. The processing of a credit application is then described, together with the procedure for analyzing financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework is presented as applied in a concrete company. In the third part, models are presented that banks should use to protect themselves against risk exposure, i.e. to reduce losses on lending operations in the Republic of Serbia and to adjust to market conditions in an optimal way.

  7. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.; Price, Phillip N. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
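
    Two accuracy metrics commonly used in this M&V context, and consistent in spirit with ASHRAE Guideline 14, are NMBE and CV(RMSE). The sketch below fits a simple change-point-style baseline on a training period and scores a prediction period; the model form and the data are synthetic stand-ins for the five models and 29 buildings of the study.

    ```python
    import numpy as np

    # Synthetic daily data: outdoor temperature drives consumption (placeholder).
    rng = np.random.default_rng(1)
    temp = rng.uniform(0, 30, 730)
    load = 500 + 12 * np.maximum(temp - 18, 0) + rng.normal(0, 20, 730)

    train, pred = slice(0, 365), slice(365, 730)   # training / prediction periods

    # Fit a change-point-style baseline on the training period.
    X = np.column_stack([np.ones(365), np.maximum(temp[train] - 18, 0)])
    coef, *_ = np.linalg.lstsq(X, load[train], rcond=None)

    Xp = np.column_stack([np.ones(365), np.maximum(temp[pred] - 18, 0)])
    err = load[pred] - Xp @ coef

    # Accuracy metrics on the prediction period:
    nmbe = err.sum() / (len(err) * load[pred].mean()) * 100          # bias, %
    cv_rmse = np.sqrt((err ** 2).mean()) / load[pred].mean() * 100   # scatter, %
    print(f"NMBE = {nmbe:.2f}%  CV(RMSE) = {cv_rmse:.2f}%")
    ```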

  8. Physiologically Based Pharmacokinetic Modeling: Methodology, Applications, and Limitations with a Focus on Its Role in Pediatric Drug Development

    Directory of Open Access Journals (Sweden)

    Feras Khalil

    2011-01-01

    Full Text Available The concept of physiologically based pharmacokinetic (PBPK) modeling was introduced years ago, but it has not been practiced widely. However, interest in and implementation of this modeling technique have grown, as evidenced by the increased number of publications in this field. This paper briefly presents the methodology, applications, and limitations of PBPK modeling, with special attention given to the use of PBPK models in pediatric drug development; some examples are described in detail. Although PBPK models do have some limitations, the potential benefit of the PBPK modeling technique is huge. PBPK models can be applied to investigate drug pharmacokinetics under different physiological and pathological conditions or in different age groups, to support decision-making during drug discovery, and to provide, perhaps most importantly, data that can save time and resources, especially in early drug development phases and in pediatric clinical trials, potentially helping clinical trials become more “confirmatory” rather than “exploratory”.
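
    A minimal sketch of the PBPK idea: compartments with physiological volumes, blood flows and partition coefficients, coupled by mass-balance ODEs. The three-compartment structure and every parameter value below are illustrative placeholders; pediatric application would rescale volumes and flows with age.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy perfusion-limited PBPK sketch: blood + liver + rest-of-body.
    Q_li, Q_rb = 90.0, 210.0             # blood flows (L/h), placeholders
    V_bl, V_li, V_rb = 5.0, 1.8, 60.0    # compartment volumes (L), placeholders
    P_li, P_rb = 2.0, 0.8                # tissue:blood partition coefficients
    CL_int = 30.0                        # hepatic intrinsic clearance (L/h)

    def pbpk(t, y):
        c_bl, c_li, c_rb = y
        dc_bl = (Q_li * (c_li / P_li - c_bl) + Q_rb * (c_rb / P_rb - c_bl)) / V_bl
        dc_li = (Q_li * (c_bl - c_li / P_li) - CL_int * c_li / P_li) / V_li
        dc_rb = Q_rb * (c_bl - c_rb / P_rb) / V_rb
        return [dc_bl, dc_li, dc_rb]

    sol = solve_ivp(pbpk, (0, 24), [10.0, 0.0, 0.0], dense_output=True)  # IV bolus
    print(sol.sol(np.array([1.0, 6.0, 24.0]))[0])  # blood conc. at 1, 6, 24 h
    ```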

  9. Model development and optimization of operating conditions to maximize PEMFC performance by response surface methodology

    International Nuclear Information System (INIS)

    Kanani, Homayoon; Shams, Mehrzad; Hasheminasab, Mohammadreza; Bozorgnezhad, Ali

    2015-01-01

    Highlights:
    • The optimization of the operating parameters in a serpentine PEMFC is done using RSM.
    • The RSM model can predict the cell power over a wide range of operating conditions.
    • St-An, St-Ca and RH-Ca have optimum values for obtaining the best performance.
    • The interactions of the operating conditions affect the output power significantly.
    • The cathode and anode stoichiometries are the most effective parameters for the power.
    Abstract: Optimization of operating conditions to obtain maximum power in PEMFCs could play a significant role in reducing the costs of this emerging technology. In the present experimental study, a single serpentine PEMFC is used to investigate the effects of operating conditions on the electrical power production of the cell. Four significant parameters, namely cathode stoichiometry, anode stoichiometry, gas inlet temperature, and cathode relative humidity, are studied using Design of Experiments (DOE) to obtain an optimal power. A central composite second-order Response Surface Methodology (RSM) is used to model the relationship between the goal function (power) and the considered input parameters (operating conditions). Using this statistical-mathematical method leads to a second-order equation for the cell power. This model considers interactions and quadratic effects of the different operating conditions and predicts the maximum or minimum power production over the entire working range of the parameters. In this range, high cathode stoichiometry combined with low anode stoichiometry results in the minimum cell power, whereas medium ranges of fuel and oxidant stoichiometry lead to the maximum power. Results show that there are optimum values of the anode stoichiometry, cathode stoichiometry and relative humidity for reaching the best performance. The predictions of the model are evaluated against experimental tests and are in good agreement over different ranges of the parameters.
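
    A hedged sketch of the second-order RSM step: fit a quadratic surface with an interaction term to (synthetic) power data in coded units and solve the stationary-point condition. The factor names and data are placeholders, not the study's measurements.

    ```python
    import numpy as np

    # Fit a second-order response surface P(x1, x2) for two coded factors
    # (e.g., cathode and anode stoichiometry); the data are synthetic.
    rng = np.random.default_rng(2)
    x1, x2 = rng.uniform(-1, 1, 40), rng.uniform(-1, 1, 40)
    power = (5 + 1.2*x1 - 0.8*x2 - 1.5*x1**2 - 1.0*x2**2 + 0.6*x1*x2
             + rng.normal(0, 0.05, 40))

    # Design matrix with linear, quadratic and interaction terms.
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
    b, *_ = np.linalg.lstsq(X, power, rcond=None)

    # Stationary point of b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2 = grad 0:
    H = np.array([[2*b[3], b[5]], [b[5], 2*b[4]]])   # Hessian of the surface
    xs = np.linalg.solve(H, -b[1:3])
    print("optimum (coded units):", xs)
    ```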

  10. Methodologies for Development of Patient Specific Bone Models from Human Body CT Scans

    Science.gov (United States)

    Chougule, Vikas Narayan; Mulay, Arati Vinayak; Ahuja, Bharatkumar Bhagatraj

    2016-06-01

    This work deals with the development of an algorithm for the physical replication of patient-specific human bones and the construction of corresponding implant/insert RP models, using a reverse engineering approach applied to non-invasive medical images for surgical purposes. In the medical field, volumetric data, i.e. voxel- and triangular-facet-based models, are primarily used for bio-modelling and visualization, which requires huge memory space. On the other hand, recent advances in Computer Aided Design (CAD) technology provide additional facilities/functions for the design, prototyping and manufacturing of any object having freeform surfaces, based on boundary representation techniques. This work presents a process for the physical replication of 3D rapid prototyping (RP) models of human bone using various CAD modeling techniques, developed from 3D point cloud data obtained from non-invasive CT/MRI scans in DICOM 3.0 format. The point cloud data are used for the construction of a 3D CAD model by fitting B-spline curves through the points and then fitting surfaces between these curve networks using swept blend techniques. Alternatively, a triangular mesh can be generated directly from the 3D point cloud data, without developing any surface model, using commercial CAD software. The STL file generated from the 3D point cloud data is used as the basic input for the RP process; the Delaunay tetrahedralization approach is used to process the 3D point cloud data to obtain the STL file. CT scan data of a metacarpus (human bone) is used as the case study for the generation of the 3D RP model. A 3D physical model of the human bone is generated on a rapid prototyping machine and its virtual reality model is presented for visualization. The CAD models generated by the different techniques are compared for accuracy and reliability, and the results of this research work are assessed for clinical reliability in the replication of human bone in the medical field.
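
    A simplified stand-in for the mesh-generation step: triangulate a 3D point cloud and write an ASCII STL for the RP process. For brevity a convex hull is used here instead of the paper's Delaunay-based surface extraction, and the points are synthetic rather than CT-derived.

    ```python
    import numpy as np
    from scipy.spatial import ConvexHull

    pts = np.random.default_rng(3).normal(size=(500, 3))  # synthetic "scan" points
    hull = ConvexHull(pts)

    # Write the hull triangles as an ASCII STL (normal orientation not checked).
    with open("bone_model.stl", "w") as f:
        f.write("solid bone\n")
        for tri in hull.simplices:                        # triangle vertex indices
            a, b, c = pts[tri]
            n = np.cross(b - a, c - a)
            n = n / np.linalg.norm(n)
            f.write(f"  facet normal {n[0]} {n[1]} {n[2]}\n    outer loop\n")
            for v in (a, b, c):
                f.write(f"      vertex {v[0]} {v[1]} {v[2]}\n")
            f.write("    endloop\n  endfacet\n")
        f.write("endsolid bone\n")
    ```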

  11. MODELS OF THE 5 PORTERS COMPETITIVE FORCES METHODOLOGY CHANGES IN COMPANIES STRATEGY DEVELOPMENT ON COMPETITIVE MARKET

    Directory of Open Access Journals (Sweden)

    Sergey I Zubin

    2014-01-01

    Full Text Available This article presents several different approaches to the development of Porter's Five Forces model. The authors take up the reasons for researchers' negative attitudes toward this instrument and introduce changes to it that can help companies find the best way to grow in a competitive market.

  12. Analysis of methodology for designing education and training model for professional development in the field of radiation technology

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kon Wuk; Lee, Jae Hun; Park, Tai Jin; Song, Myung Jae [Korean Association for Radiation Application, Seoul (Korea, Republic of)

    2015-02-15

    Domestic radiation technology is integrated into and utilized in various areas and is closely related to industrial growth in Korea. The domestic use of radiation and radioisotopes (RI) increases every year; however, the level of technology is poor compared to that of other developed countries. Manpower training is essential for the development of radiation technology. Therefore, this study aimed to propose a methodology for designing a systematic education and training model in the field of radiation measurement and analysis. A survey was conducted to design the education and training model, and the training program for radiation measurement and analysis was developed based on the survey results. The education and training program designed in this study will be utilized as a model for evaluating professional development and the effective recruitment of a professional workforce, and can be further applied to other radiation-related fields.

  13. Analysis of methodology for designing education and training model for professional development in the field of radiation technology

    International Nuclear Information System (INIS)

    Kim, Kon Wuk; Lee, Jae Hun; Park, Tai Jin; Song, Myung Jae

    2015-01-01

    Domestic radiation technology is integrated into and utilized in various areas and is closely related to industrial growth in Korea. The domestic use of radiation and radioisotopes (RI) increases every year; however, the level of technology is poor compared to that of other developed countries. Manpower training is essential for the development of radiation technology. Therefore, this study aimed to propose a methodology for designing a systematic education and training model in the field of radiation measurement and analysis. A survey was conducted to design the education and training model, and the training program for radiation measurement and analysis was developed based on the survey results. The education and training program designed in this study will be utilized as a model for evaluating professional development and the effective recruitment of a professional workforce, and can be further applied to other radiation-related fields.

  14. GREET 1.5 - transportation fuel-cycle model - Vol. 1 : methodology, development, use, and results

    International Nuclear Information System (INIS)

    Wang, M. Q.

    1999-01-01

    This report documents the development and use of the most recent version (Version 1.5) of the Greenhouse Gases, Regulated Emissions, and Energy Use in Transportation (GREET) model. The model, developed in a spreadsheet format, estimates the full fuel-cycle emissions and energy associated with various transportation fuels and advanced vehicle technologies for light-duty vehicles. The model calculates fuel-cycle emissions of five criteria pollutants (volatile organic compounds, carbon monoxide, nitrogen oxides, particulate matter with diameters of 10 micrometers or less, and sulfur oxides) and three greenhouse gases (carbon dioxide, methane, and nitrous oxide). The model also calculates total energy consumption, fossil fuel consumption, and petroleum consumption when various transportation fuels are used. The GREET model includes the following cycles: petroleum to conventional gasoline, reformulated gasoline, conventional diesel, reformulated diesel, liquefied petroleum gas, and electricity via residual oil; natural gas to compressed natural gas, liquefied natural gas, liquefied petroleum gas, methanol, Fischer-Tropsch diesel, dimethyl ether, hydrogen, and electricity; coal to electricity; uranium to electricity; renewable energy (hydropower, solar energy, and wind) to electricity; corn, woody biomass, and herbaceous biomass to ethanol; soybeans to biodiesel; flared gas to methanol, dimethyl ether, and Fischer-Tropsch diesel; and landfill gases to methanol. This report also presents the results of the analysis of fuel-cycle energy use and emissions associated with alternative transportation fuels and advanced vehicle technologies to be applied to passenger cars and light-duty trucks

  15. On the development of a new methodology in sub-surface parameterisation on the calibration of groundwater models

    Science.gov (United States)

    Klaas, D. K. S. Y.; Imteaz, M. A.; Sudiayem, I.; Klaas, E. M. E.; Klaas, E. C. M.

    2017-10-01

    In groundwater modelling, robust parameterisation of sub-surface parameters is crucial to obtaining agreeable model performance. Pilot points are an alternative in the parameterisation step for correctly configuring the distribution of parameters in a model. However, the methodologies given in current studies are considered less practical for application under real catchment conditions. In this study, a practical approach using the geometric features of pilot points and the distribution of the hydraulic gradient over the catchment area is proposed to efficiently configure the pilot point distribution in the calibration step of a groundwater model. A new pilot point distribution technique, the Head Zonation-based (HZB) technique, which is based on the hydraulic gradient distribution of the groundwater flow, is presented. Seven models with seven zone ratios (1, 5, 10, 15, 20, 25 and 30) using the HZB technique were constructed for an eogenetic karst catchment on Rote Island, Indonesia, and their performances were assessed. This study also offers some insights into the trade-off between restricting and maximising the number of pilot points, and proposes a new methodology for selecting pilot point properties and distribution in the development of a physically-based groundwater model.

  16. Sensitivity analysis and development of calibration methodology for near-surface hydrogeology model of Laxemar

    International Nuclear Information System (INIS)

    Aneljung, Maria; Sassner, Mona; Gustafsson, Lars-Goeran

    2007-11-01

    This report describes modelling where the hydrological modelling system MIKE SHE has been used to describe surface hydrology, near-surface hydrogeology, advective transport mechanisms, and the contact between groundwater and surface water within the SKB site investigation area at Laxemar. In the MIKE SHE system, surface water flow is described with the one-dimensional modelling tool MIKE 11, which is fully and dynamically integrated with the groundwater flow module in MIKE SHE. In early 2008, a supplementary data set will be available and a process of updating, rebuilding and calibrating the MIKE SHE model based on this data set will start. Before the calibration on the new data begins, it is important to gather as much knowledge as possible on calibration methods, and to identify critical calibration parameters and areas within the model that require special attention. In this project, the MIKE SHE model has been further developed. The model area has been extended, and the present model also includes an updated bedrock model and a more detailed description of the surface stream network. The numerical model has been updated and optimized, especially regarding the modelling of evapotranspiration and the unsaturated zone, and the coupling between the surface stream network in MIKE 11 and the overland flow in MIKE SHE. An initial calibration has been made and a base case has been defined and evaluated. In connection with the calibration, the most important changes made in the model were the following: The evapotranspiration was reduced. The infiltration capacity was reduced. The hydraulic conductivities of the Quaternary deposits in the water-saturated part of the subsurface were reduced. Data from one surface water level monitoring station, four surface water discharge monitoring stations and 43 groundwater level monitoring stations (SSM series boreholes) have been used to evaluate and calibrate the model. The base case simulations showed a reasonable agreement

  17. Sensitivity analysis and development of calibration methodology for near-surface hydrogeology model of Laxemar

    Energy Technology Data Exchange (ETDEWEB)

    Aneljung, Maria; Sassner, Mona; Gustafsson, Lars-Goeran (DHI Sverige AB, Lilla Bommen 1, SE-411 04 Goeteborg (Sweden))

    2007-11-15

    This report describes modelling where the hydrological modelling system MIKE SHE has been used to describe surface hydrology, near-surface hydrogeology, advective transport mechanisms, and the contact between groundwater and surface water within the SKB site investigation area at Laxemar. In the MIKE SHE system, surface water flow is described with the one-dimensional modelling tool MIKE 11, which is fully and dynamically integrated with the groundwater flow module in MIKE SHE. In early 2008, a supplementary data set will be available and a process of updating, rebuilding and calibrating the MIKE SHE model based on this data set will start. Before the calibration on the new data begins, it is important to gather as much knowledge as possible on calibration methods, and to identify critical calibration parameters and areas within the model that require special attention. In this project, the MIKE SHE model has been further developed. The model area has been extended, and the present model also includes an updated bedrock model and a more detailed description of the surface stream network. The numerical model has been updated and optimized, especially regarding the modelling of evapotranspiration and the unsaturated zone, and the coupling between the surface stream network in MIKE 11 and the overland flow in MIKE SHE. An initial calibration has been made and a base case has been defined and evaluated. In connection with the calibration, the most important changes made in the model were the following: The evapotranspiration was reduced. The infiltration capacity was reduced. The hydraulic conductivities of the Quaternary deposits in the water-saturated part of the subsurface were reduced. Data from one surface water level monitoring station, four surface water discharge monitoring stations and 43 groundwater level monitoring stations (SSM series boreholes) have been used to evaluate and calibrate the model. The base case simulations showed a reasonable agreement

  18. A flexible hydrological modelling system developed using an object oriented methodology

    Energy Technology Data Exchange (ETDEWEB)

    Rinde, Trond

    1998-12-31

    The report presents a software system called Process Integrating Network (PINE). The capabilities, working principles, programming technical design and principles of use of the system are described as are some practical applications. PINE is a simulation tool for modelling of hydrological and hydrologically related phenomena. The system is based on object oriented programming principles and was specially designed to provide freedom in the choice of model structures and algorithms for process descriptions. It supports full freedom with regards to spatial distribution and temporal resolution. Geographical information systems (GIS) may be integrated with PINE in order to provide full spatial distribution in system parametrisation, process simulation and visualisation of simulation results. Simulation models are developed by linking components for process description together in a structure. The system can handle compound working media such as water with chemical or biological constituents. Non-hydrological routines may then be included to describe the responses of such constituents. Features such as extensibility and reuse of program components are emphasised in the program design. Separation between process topology, process descriptions and process data facilitates simple and consistent implementation of components for process description. Such components may be automatically prototyped and their response functions may be implemented without knowledge of other parts of the program system and without the need to program import or export routines or a user interface. Model extension is thus a rapid process that does not require extensive programming skills. Components for process descriptions may further be placed in separate program libraries, which can be included in the program as required. The program system can thus be very compact while it still has a large number of process algorithms available. The system can run on both PC and UNIX platforms. 106 figs., 20
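
    A hedged sketch of the object-oriented idea described above: process components share a common interface and are linked together into a user-defined structure through which the working medium is passed. The class names, the single-step interface and the numbers are invented for illustration; this is not PINE's actual API.

    ```python
    # Minimal object-oriented process network in the spirit of PINE.
    class Process:
        def __init__(self):
            self.downstream = None
        def link(self, other):
            self.downstream = other
            return other
        def step(self, water):                # pass the working medium along
            water = self.transform(water)
            return self.downstream.step(water) if self.downstream else water

    class Snowmelt(Process):
        def transform(self, water):
            return water + 2.0                # melt adds liquid water (mm)

    class SoilStorage(Process):
        def transform(self, water):
            return max(water - 5.0, 0.0)      # storage/infiltration loss (mm)

    class Runoff(Process):
        def transform(self, water):
            return 0.3 * water                # simple runoff coefficient

    model = Snowmelt()
    model.link(SoilStorage()).link(Runoff())  # freely chosen model structure
    print(model.step(water=10.0))             # one time step through the chain
    ```

    New process descriptions are added by subclassing, without touching the rest of the system, which is the reuse and extensibility property the report emphasises.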

  19. Methodology for Developing Hydrological Models Based on an Artificial Neural Network to Establish an Early Warning System in Small Catchments

    Directory of Open Access Journals (Sweden)

    Ivana Sušanj

    2016-01-01

    Full Text Available In some situations there is no possibility of hazard mitigation, especially if the hazard is induced by water. It is therefore important to prevent consequences via an early warning system (EWS) that announces the possible occurrence of a hazard. The aim of this paper is to investigate the possibility of implementing an EWS in a small-scale catchment and to develop a methodology for building a hydrological prediction model based on an artificial neural network (ANN) as an essential part of the EWS. The methodology is implemented in the case study of the Slani Potok catchment, which is historically recognized as a hazard-prone area, by establishing continuous monitoring of meteorological and hydrological parameters to collect data for the training, validation, and evaluation of the prediction capabilities of the ANN model. The model is validated and evaluated by visual inspection, by common calculation approaches, and by a newly proposed evaluation procedure. The new procedure is based on separating the observed data into classes above and below the mean data value, on the percentages of data in those classes, and on the mean absolute error.
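
    As a hedged illustration of the prediction core, the sketch below trains a small feed-forward ANN on lagged (synthetic) rainfall to predict water level and reports the mean absolute error used in the evaluation; the network size, lag structure and data are placeholders, not those of the Slani Potok model.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.metrics import mean_absolute_error

    # Synthetic rainfall-to-level data standing in for the monitored series.
    rng = np.random.default_rng(4)
    rain = rng.gamma(2.0, 3.0, 1200)
    level = np.convolve(rain, [0.5, 0.3, 0.2], mode="same") + rng.normal(0, 0.3, 1200)

    # Lagged rainfall as inputs (t, t-1, t-2), level at t as the target.
    X = np.column_stack([rain[2:], rain[1:-1], rain[:-2]])
    y = level[2:]
    split = 900                                   # train / evaluation split
    ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    ann.fit(X[:split], y[:split])

    mae = mean_absolute_error(y[split:], ann.predict(X[split:]))
    print(f"MAE on held-out period: {mae:.3f}")
    ```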

  20. Enviro-HIRLAM online integrated meteorology–chemistry modelling system: strategy, methodology, developments and applications (v7.2)

    Directory of Open Access Journals (Sweden)

    A. Baklanov

    2017-08-01

    Full Text Available The Environment – High Resolution Limited Area Model (Enviro-HIRLAM) is developed as a fully online integrated numerical weather prediction (NWP) and atmospheric chemical transport (ACT) model for research and forecasting of joint meteorological, chemical and biological weather. The integrated modelling system is developed by the Danish Meteorological Institute (DMI) in collaboration with several European universities. It is the baseline system in the HIRLAM Chemical Branch and is used in several countries and different applications. The development was initiated at DMI more than 15 years ago. The model is based on the HIRLAM NWP model with online integrated pollutant transport and dispersion, chemistry, aerosol dynamics, deposition and atmospheric composition feedbacks. To make the model suitable for chemical weather forecasting in urban areas, the meteorological part was improved by the implementation of urban parameterisations. The dynamical core was improved by implementing a locally mass-conserving semi-Lagrangian numerical advection scheme, which improves forecast accuracy and model performance. The current version (7.2), in comparison with previous versions, has a more advanced and cost-efficient chemistry, an aerosol multi-compound approach, and aerosol feedbacks, direct and semi-direct on radiation and first and second indirect effects on cloud microphysics. Since 2004, Enviro-HIRLAM has been used for different studies, including operational pollen forecasting for Denmark since 2009 and operational forecasting of atmospheric composition with downscaling for China since 2017. Following the main research and development strategy, further model developments will be extended towards the new NWP platform, HARMONIE. Different aspects of the online coupling methodology, the research strategy and possible applications of the modelling system, and fit-for-purpose model configurations for the meteorological and air quality communities are discussed.

  1. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  2. ISE System Development Methodology Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  3. a Novel Methodology for Developing Inundation Maps Under Climate Change Scenarios Using One-Dimensional Model

    Science.gov (United States)

    Vu, M. T.; Liong, S. Y.; Raghavan, V. S.; Liew, S. C.

    2011-07-01

    Climate change is expected to cause increases in extreme climatic events such as heavy rainstorms and rising tidal levels. Heavy rainstorms are known to be serious causes of flooding problems in big cities, so high-density residential and commercial areas along rivers face the risk of being flooded. For that reason, the determination of inundated areas is now considered one of the most important areas of research focus in flood forecasting. In this context, this paper presents the development of a flood map for determining flood-prone areas and flood volumes. The areas and volumes of the flood are computed from the inundation level using the existing digital elevation model (DEM) of a hypothetical catchment chosen for the study. The study focuses on the application of the Flood Early Warning System (Delft-FEWS, Deltares), which is designed to work with SOBEK (Delft) to simulate the extent of stormwater on the ground surface. The results from FEWS consist of time series of inundation maps in image (PNG) and ASCII formats, which are subsequently imported into ArcGIS for further calculations. In addition, FEWS provides options to export a video clip of the water spreading over the catchment. Consequently, the inundated area and volume are determined by the water level on the ground. The final flood map is displayed in colors created by ArcGIS. Various flood map results corresponding to climate change scenarios are displayed in the main part of the paper.
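
    The area and volume step lends itself to a short sketch: given a DEM grid and a simulated water level, inundated cells are those lying below the level. The DEM, cell size and flood stage below are synthetic placeholders.

    ```python
    import numpy as np

    # Inundated area and volume from a DEM grid and a flood water level.
    rng = np.random.default_rng(5)
    dem = rng.uniform(0.0, 10.0, (200, 200))   # ground elevation (m), synthetic
    cell_area = 25.0                           # 5 m x 5 m cells (m^2)
    water_level = 4.0                          # simulated flood stage (m)

    depth = np.clip(water_level - dem, 0.0, None)
    flooded = depth > 0
    area_m2 = flooded.sum() * cell_area
    volume_m3 = depth.sum() * cell_area
    print(f"inundated area: {area_m2:.0f} m^2, volume: {volume_m3:.0f} m^3")
    ```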

  4. Model-based methodology to develop the isochronous stress-strain curves for modified 9Cr steels

    International Nuclear Information System (INIS)

    Kim, Woo Gon; Yin, Song Nan; Kim, Sung Ho; Lee, Chan Bock; Jung, Ik Hee

    2008-01-01

    Since high-temperature materials are designed with a target life based on a specified amount of allowable strain and stress, their Isochronous Stress-Strain Curves (ISSC) are needed to avoid excessive deformation during the intended service life. In this paper, a model-based methodology to develop the isochronous curves for a G91 steel is described. Creep strain-time curves were reviewed for typical high-temperature materials, and Garofalo's model, which conforms well to the primary and secondary creep stages, was found suitable for the G91 steel. Procedures to obtain the instantaneous elastic-plastic strain, ε_i, are given in detail. Also, to accurately determine the P1, P2 and P3 parameters in Garofalo's model, a Nonlinear Least Squares Fitting (NLSF) method was adopted and found useful. The long-term creep curves of the G91 steel can be modeled with Garofalo's model, and the long-term ISSCs can be developed using the modeled creep curves.
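
    A minimal sketch of the NLSF step for the Garofalo form eps(t) = eps_i + P1*(1 - exp(-P2*t)) + P3*t, using synthetic data; the parameter magnitudes are illustrative, not fitted values for G91.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def garofalo(t, P1, P2, P3, eps_i=0.001):
        return eps_i + P1 * (1.0 - np.exp(-P2 * t)) + P3 * t

    t = np.linspace(0, 5000, 80)                       # time (h)
    rng = np.random.default_rng(6)
    eps = garofalo(t, 0.004, 2e-3, 1.5e-6) + rng.normal(0, 5e-5, t.size)

    # Nonlinear least-squares fit (the NLSF step) for P1, P2, P3.
    popt, pcov = curve_fit(garofalo, t, eps, p0=[1e-3, 1e-3, 1e-6])
    print("P1, P2, P3 =", popt)

    # Reading strains off at a fixed time, per stress level, yields the
    # points of an isochronous stress-strain curve.
    print("strain at t = 1000 h:", garofalo(1000.0, *popt))
    ```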

  5. Methodology for developing new test methods

    Directory of Open Access Journals (Sweden)

    A. I. Korobko

    2017-06-01

    Full Text Available The paper describes a methodology for developing new test methods and for forming the solutions on which the development of new test methods is based. The basis of the methodology is formed by individual elements of the systems and process approaches, which contribute to an effective research strategy for the object, the study of interrelations, and the synthesis of an adequate model of the test method. The effectiveness of the developed test method is determined by the correct choice of the set of concepts and of their interrelations and mutual influence; this makes it possible to solve the assigned tasks and achieve the goal. The methodology is based on the use of fuzzy cognitive maps. The question of the choice of the method on which the model for forming the solutions is based is considered. The methodology provides for recording the model of a new test method as a finite set of objects representing characteristics that are significant for the test method. A causal relationship is then established between the objects, and the values of the fitness indicators, the observability of the method, and the metrological tolerance for each indicator are established. The work is aimed at the overall goal of ensuring the quality of tests by improving the methodology used to develop test methods.
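
    The methodology rests on fuzzy cognitive maps; below is a minimal, generic FCM update loop. The three concepts and the signed weights are invented for illustration, not taken from the paper.

    ```python
    import numpy as np

    # Fuzzy cognitive map: concepts are nodes, signed weights encode causal
    # influence, and concept activations are squashed into (0, 1).
    W = np.array([[0.0, 0.6, 0.0],     # accuracy -> repeatability
                  [0.0, 0.0, 0.7],     # repeatability -> fitness of method
                  [-0.3, 0.0, 0.0]])   # fitness feeds back on accuracy demand

    def step(x, W, lam=1.0):
        # Standard FCM update: A_j(t+1) = f(A_j + sum_i W[i, j] * A_i)
        return 1.0 / (1.0 + np.exp(-lam * (x + W.T @ x)))

    x = np.array([0.8, 0.2, 0.1])      # initial activation of the concepts
    for _ in range(30):                # iterate toward a fixed point
        x = step(x, W)
    print("steady concept activations:", x.round(3))
    ```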

  6. Methodological and empirical developments for the Ratcliff diffusion model of response times and accuracy

    NARCIS (Netherlands)

    Wagenmakers, E.-J.

    2009-01-01

    The Ratcliff diffusion model for simple two-choice decisions (e.g., Ratcliff, 1978; Ratcliff & McKoon, 2008) has two outstanding advantages. First, the model generally provides an excellent fit to the observed data (i.e., response accuracy and the shape of RT distributions, both for correct and
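
    A Monte Carlo sketch of the two-choice diffusion process (noisy evidence accumulating toward absorbing boundaries) may help fix ideas; the parameter values are illustrative, and this simulates the process rather than reproducing Ratcliff's fitting machinery.

    ```python
    import numpy as np

    def simulate_trial(v=0.25, a=1.0, z=0.5, s=1.0, dt=1e-3, rng=None):
        """Evidence x drifts at rate v with noise s until it hits 0 or a."""
        rng = rng or np.random.default_rng()
        x, t = z, 0.0
        while 0.0 < x < a:
            x += v * dt + s * np.sqrt(dt) * rng.normal()
            t += dt
        return t, x >= a                   # RT and choice (upper boundary)

    rng = np.random.default_rng(7)
    trials = [simulate_trial(rng=rng) for _ in range(500)]
    rts = np.array([t for t, _ in trials])
    upper = np.array([c for _, c in trials])
    print(f"P(upper): {upper.mean():.3f}, mean RT: {rts.mean():.3f} s")
    ```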

  7. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Andrade Almeida, João; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  8. Developing a Cost Model and Methodology to Estimate Capital Costs for Thermal Energy Storage

    Energy Technology Data Exchange (ETDEWEB)

    Glatzmaier, G.

    2011-12-01

    This report provides an update on the previous cost model for thermal energy storage (TES) systems. The update allows NREL to estimate the costs of such systems that are compatible with the higher operating temperatures associated with advanced power cycles. The goal of the Department of Energy (DOE) Solar Energy Technology Program is to develop solar technologies that can make a significant contribution to the United States domestic energy supply. The recent DOE SunShot Initiative sets a very aggressive cost goal to reach a Levelized Cost of Energy (LCOE) of 6 cents/kWh by 2020 with no incentives or credits for all solar-to-electricity technologies. As this goal is reached, the share of utility power generation that is provided by renewable energy sources is expected to increase dramatically. Because Concentrating Solar Power (CSP) is currently the only renewable technology that is capable of integrating cost-effective energy storage, it is positioned to play a key role in providing renewable, dispatchable power to utilities as the share of power generation from renewable sources increases. Because of this role, future CSP plants will likely have as much as 15 hours of Thermal Energy Storage (TES) included in their design and operation. As such, the cost and performance of the TES system is critical to meeting the SunShot goal for solar technologies. The cost of electricity from a CSP plant depends strongly on its overall efficiency, which is a product of two components - the collection and conversion efficiencies. The collection efficiency determines the portion of incident solar energy that is captured as high-temperature thermal energy. The conversion efficiency determines the portion of thermal energy that is converted to electricity. The operating temperature at which the overall efficiency reaches its maximum depends on many factors, including material properties of the CSP plant components. Increasing the operating temperature of the power generation
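
    For orientation, an LCOE target can be related to cost inputs through the standard simplified fixed-charge-rate form; the sketch below is generic, and every number is a placeholder rather than an NREL or SunShot figure.

    ```python
    # Simplified LCOE relation (fixed charge rate form); all inputs illustrative.
    capital_usd_per_kw = 6000.0      # CSP plant incl. ~15 h TES (placeholder)
    fixed_charge_rate = 0.08         # annualises the capital cost
    om_usd_per_kw_yr = 65.0          # fixed O&M (placeholder)
    capacity_factor = 0.55           # high CF enabled by storage (placeholder)

    annual_kwh_per_kw = 8760 * capacity_factor
    lcoe = (capital_usd_per_kw * fixed_charge_rate + om_usd_per_kw_yr) / annual_kwh_per_kw
    print(f"LCOE ~ {lcoe * 100:.1f} cents/kWh")
    ```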

  9. Recent developments in imaging system assessment methodology, FROC analysis and the search model.

    Science.gov (United States)

    Chakraborty, Dev P

    2011-08-21

    A frequent problem in imaging is assessing whether a new imaging system is an improvement over an existing standard. Observer performance methods, in particular the receiver operating characteristic (ROC) paradigm, are widely used in this context. In ROC analysis lesion location information is not used and consequently scoring ambiguities can arise in tasks, such as nodule detection, involving finding localized lesions. This paper reviews progress in the free-response ROC (FROC) paradigm in which the observer marks and rates suspicious regions and the location information is used to determine whether lesions were correctly localized. Reviewed are FROC data analysis, a search-model for simulating FROC data, predictions of the model and a method for estimating the parameters. The search model parameters are physically meaningful quantities that can guide system optimization.

  10. Recent developments in imaging system assessment methodology, FROC analysis and the search model

    International Nuclear Information System (INIS)

    Chakraborty, Dev P.

    2011-01-01

    A frequent problem in imaging is assessing whether a new imaging system is an improvement over an existing standard. Observer performance methods, in particular the receiver operating characteristic (ROC) paradigm, are widely used in this context. In ROC analysis lesion location information is not used and consequently scoring ambiguities can arise in tasks, such as nodule detection, involving finding localized lesions. This paper reviews progress in the free-response ROC (FROC) paradigm in which the observer marks and rates suspicious regions and the location information is used to determine whether lesions were correctly localized. Reviewed are FROC data analysis, a search model for simulating FROC data, predictions of the model and a method for estimating the parameters. The search model parameters are physically meaningful quantities that can guide system optimization.
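
    The FROC operating points described above can be computed directly from mark-rating data: for each rating threshold, the lesion localization fraction (LLF) and the number of non-lesion marks per image (NLF). The sketch below uses synthetic ratings and assumes each lesion received at most one mark; unmarked lesions would simply count as misses.

    ```python
    import numpy as np

    n_images, n_lesions = 50, 40
    rng = np.random.default_rng(8)
    lesion_ratings = rng.normal(2.0, 1.0, n_lesions)    # marks on true lesions
    nonlesion_ratings = rng.normal(0.0, 1.0, 120)       # false marks, all images

    for thr in np.linspace(-1, 4, 6):
        llf = (lesion_ratings >= thr).mean()            # lesions localized
        nlf = (nonlesion_ratings >= thr).sum() / n_images  # false marks / image
        print(f"thr={thr:4.1f}  NLF={nlf:5.2f}  LLF={llf:4.2f}")
    ```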

  11. Sensitivity analysis and development of calibration methodology for near-surface hydrogeology model of Forsmark

    International Nuclear Information System (INIS)

    Aneljung, Maria; Gustafsson, Lars-Goeran

    2007-04-01

    The hydrological modelling system MIKE SHE has been used to describe near-surface groundwater flow, transport mechanisms and the contact between ground- and surface water at the Forsmark site. The surface water system at Forsmark is described with the 1D modelling tool MIKE 11, which is fully and dynamically integrated with MIKE SHE. In spring 2007, a new data freeze will be available and a process of updating, rebuilding and calibrating the MIKE SHE model will start, based on the latest data set. Prior to this, it is important to gather as much knowledge as possible on calibration methods and to define critical calibration parameters and areas within the model. In this project, an optimization of the numerical description and an initial calibration of the MIKE SHE model has been made, and an updated base case has been defined. Data from 5 surface water level monitoring stations, 4 surface water discharge monitoring stations and 32 groundwater level monitoring stations (SFM soil boreholes) has been used for model calibration and evaluation. The base case simulations generally show a good agreement between calculated and measured water levels and discharges, indicating that the total runoff from the area is well described by the model. Moreover, with two exceptions (SFM0012 and SFM0022) the base case results show very good agreement between calculated and measured groundwater head elevations for boreholes installed below lakes. The model also shows a reasonably good agreement between calculated and measured groundwater head elevations or depths to phreatic surfaces in many other points. The following major types of calculation-measurement differences can be noted: Differences in groundwater level amplitudes due to transpiration processes. Differences in absolute mean groundwater head, due to differences between borehole casing levels and the interpolated DEM. Differences in absolute mean head elevations, due to local errors in hydraulic conductivity values

  12. Sensitivity analysis and development of calibration methodology for near-surface hydrogeology model of Forsmark

    Energy Technology Data Exchange (ETDEWEB)

    Aneljung, Maria; Gustafsson, Lars-Goeran [DHI Water and Environment AB, Goeteborg (Sweden)

    2007-04-15

    The hydrological modelling system MIKE SHE has been used to describe near-surface groundwater flow, transport mechanisms and the contact between ground- and surface water at the Forsmark site. The surface water system at Forsmark is described with the 1D modelling tool MIKE 11, which is fully and dynamically integrated with MIKE SHE. In spring 2007, a new data freeze will be available and a process of updating, rebuilding and calibrating the MIKE SHE model will start, based on the latest data set. Prior to this, it is important to gather as much knowledge as possible on calibration methods and to define critical calibration parameters and areas within the model. In this project, an optimization of the numerical description and an initial calibration of the MIKE SHE model has been made, and an updated base case has been defined. Data from 5 surface water level monitoring stations, 4 surface water discharge monitoring stations and 32 groundwater level monitoring stations (SFM soil boreholes) has been used for model calibration and evaluation. The base case simulations generally show a good agreement between calculated and measured water levels and discharges, indicating that the total runoff from the area is well described by the model. Moreover, with two exceptions (SFM0012 and SFM0022) the base case results show very good agreement between calculated and measured groundwater head elevations for boreholes installed below lakes. The model also shows a reasonably good agreement between calculated and measured groundwater head elevations or depths to phreatic surfaces in many other points. The following major types of calculation-measurement differences can be noted: Differences in groundwater level amplitudes due to transpiration processes. Differences in absolute mean groundwater head, due to differences between borehole casing levels and the interpolated DEM. Differences in absolute mean head elevations, due to local errors in hydraulic conductivity values

  13. On the Development of Methodology for Planning and Cost-Modeling of a Wide Area Network

    OpenAIRE

    Ahmedi, Basri; Mitrevski, Pece

    2014-01-01

    The most important stages in designing a computer network in a wider geographical area include: definition of requirements, topological description, identification and calculation of relevant parameters (i.e. traffic matrix), determining the shortest path between nodes, quantification of the effect of various levels of technical and technological development of urban areas involved, the cost of technology, and the cost of services. These parameters differ for WAN networks in different regions...
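
    The shortest-path stage mentioned above is classically handled with Dijkstra's algorithm; a self-contained sketch follows. The node names and link costs are illustrative (e.g., distance- or tariff-weighted), not data from the paper.

    ```python
    import heapq

    # Small WAN node graph with symmetric link costs (illustrative values).
    graph = {
        "Skopje": {"Tetovo": 4, "Bitola": 9},
        "Tetovo": {"Skopje": 4, "Ohrid": 7},
        "Bitola": {"Skopje": 9, "Ohrid": 3},
        "Ohrid":  {"Tetovo": 7, "Bitola": 3},
    }

    def dijkstra(graph, src):
        dist = {src: 0}
        pq = [(0, src)]
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist.get(u, float("inf")):
                continue                       # stale queue entry
            for v, w in graph[u].items():
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w
                    heapq.heappush(pq, (d + w, v))
        return dist

    print(dijkstra(graph, "Skopje"))   # cheapest paths from one node to all others
    ```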

  14. Development of a fluidized bed agglomeration modeling methodology to include particle-level heterogeneities in ash chemistry and granular physics

    Science.gov (United States)

    Khadilkar, Aditi B.

    The utility of fluidized bed reactors for combustion and gasification can be enhanced if operational issues such as agglomeration are mitigated. The monetary and efficiency losses could be avoided through a mechanistic understanding of the agglomeration process and the prediction of operational conditions that promote agglomeration. Pilot-scale experimentation prior to operation for each specific condition can be cumbersome and expensive, so the development of a mathematical model would aid predictions. With this motivation, the study comprised the following model development stages: 1) development of an agglomeration modeling methodology based on binary particle collisions, 2) study of heterogeneities in ash chemical composition and gaseous atmosphere, 3) computation of a distribution of particle collision frequencies based on granular physics for a poly-disperse particle size distribution, 4) combination of the ash chemistry and granular physics inputs to obtain agglomerate growth probabilities, and 5) validation of the modeling methodology. The modeling methodology comprised testing every binary particle collision in the system for sticking, based on the extent of dissipation of the particles' kinetic energy through viscous dissipation by the slag-liquid (molten ash) covering the particles. In the modeling methodology developed in this study, thermodynamic equilibrium calculations are used to estimate the amount of slag-liquid in the system, and the changes in particle collision frequencies are accounted for by continuously tracking the number density of the various particle sizes. In this study, the heterogeneities in the chemical composition of the fuel ash were studied by separating the bulk fuel into particle classes that are rich in specific minerals. FactSage simulations were performed on two bituminous coals and an anthracite to understand the effect of particle-level heterogeneities on agglomeration. The mineral matter behavior of these constituent classes was studied

  15. Development of LOCA calculation capability with RELAP5-3D in accordance with the evaluation model methodology

    International Nuclear Information System (INIS)

    Liang, T.K.S.; Huan-Jen, Hung; Chin-Jang, Chang; Lance, Wang

    2001-01-01

    In light water reactors, particularly the pressurized water reactor (PWR), the severity of a LOCA (loss of coolant accident) limits how high the reactor power can be set. Although a best-estimate LOCA licensing methodology can provide the greatest margin in the PCT (peak cladding temperature) evaluation during a LOCA, it generally takes more resources to develop. Instead, implementation of the evaluation models required by Appendix K of 10 CFR 50 on an advanced thermal-hydraulic platform can also secure a significant margin between the highest calculated PCT and the safety limit of 2200 F. The compliance of the current RELAP5-3D code with Appendix K of 10 CFR 50 has been evaluated, and it was found that there are ten areas where code assessment and/or further modifications were required to satisfy the requirements set forth in Appendix K of 10 CFR 50. The associated models for the analysis of LOCA consequent phenomena should follow the major concerns of the regulation and are expected to give more conservative results than those of the best-estimate methodology. They are required to predict the decay power level, the blowdown hydraulics, the blowdown heat transfer, the flooding rate, and the flooding heat transfer. All ten areas covered by the above simulations have been further evaluated, and RELAP5-3D has been successfully modified to fulfill the associated requirements. In addition, to verify and assess the development of the Appendix K version of RELAP5-3D, nine separate-effect experiments were adopted. Through the assessments against these separate-effect experiments, the success of the code modification in accordance with Appendix K of 10 CFR 50 was demonstrated. We will apply another six sets of integral-effect experiments in the next step to ensure the integral conservatism of the Appendix K version of RELAP5-3D in LOCA licensing evaluation. (authors)

  16. CATHARE code development and assessment methodologies

    International Nuclear Information System (INIS)

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-01-01

    The CATHARE thermal-hydraulic code has been developed jointly by the Commissariat a l'Energie Atomique (CEA), Electricite de France (EdF), and Framatome for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate effects tests and integral effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation.

  17. A methodological approach to parametric product modelling in motor car development; Ein methodischer Ansatz zur parametrischen Produktmodellierung in der Fahrzeugentwicklung

    Energy Technology Data Exchange (ETDEWEB)

    Boehme, M.

    2004-07-01

    Continuous improvement of processes and methodologies is one key element in shortening development time, reducing costs, and improving quality, and therefore in answering growing customer demands and global competition. This work describes a new concept for introducing the principles of parametric modeling into the entire product data model in the area of automotive development. Based on the idea that not only geometric dimensions can be described by parameters, the method of parametric modeling is applied to the complete product model. The concept rests on four major principles: first, the parameters of the product model are handled independently of their proprietary data formats; secondly, a strictly hierarchical structure is required for the parametric description of the product; the third principle demands an object-based parameterization; finally, the use of parameter sets for the description of logical units of the product model tree is part of the concept. These four principles address the following main objectives: supporting and improving Simultaneous Engineering, achieving data consistency over all development phases, digital approval of product properties, and incorporation of the design intent into the product model. Further improvement of the automotive development process can be achieved by introducing parametric product modeling based on the principles described in this paper. (orig.)

  18. Development of a methodology for probable maximum precipitation estimation over the American River watershed using the WRF model

    Science.gov (United States)

    Tan, Elcin

    A new physically-based methodology for probable maximum precipitation (PMP) estimation is developed over the American River Watershed (ARW) using the Weather Research and Forecast (WRF-ARW) model. A persistent moisture flux convergence pattern, called Pineapple Express, is analyzed for 42 historical extreme precipitation events, and it is found that Pineapple Express causes extreme precipitation over the basin of interest. An average correlation between moisture flux convergence and maximum precipitation is estimated as 0.71 for 42 events. The performance of the WRF model is verified for precipitation by means of calibration and independent validation of the model. The calibration procedure is performed only for the first ranked flood event 1997 case, whereas the WRF model is validated for 42 historical cases. Three nested model domains are set up with horizontal resolutions of 27 km, 9 km, and 3 km over the basin of interest. As a result of Chi-square goodness-of-fit tests, the hypothesis that "the WRF model can be used in the determination of PMP over the ARW for both areal average and point estimates" is accepted at the 5% level of significance. The sensitivities of model physics options on precipitation are determined using 28 microphysics, atmospheric boundary layer, and cumulus parameterization schemes combinations. It is concluded that the best triplet option is Thompson microphysics, Grell 3D ensemble cumulus, and YSU boundary layer (TGY), based on 42 historical cases, and this TGY triplet is used for all analyses of this research. Four techniques are proposed to evaluate physically possible maximum precipitation using the WRF: 1. Perturbations of atmospheric conditions; 2. Shift in atmospheric conditions; 3. Replacement of atmospheric conditions among historical events; and 4. Thermodynamically possible worst-case scenario creation. Moreover, climate change effect on precipitation is discussed by emphasizing temperature increase in order to determine the

  19. Application of agile methodologies in software development

    Directory of Open Access Journals (Sweden)

    Jovanović Aca D.

    2016-01-01

    The paper presents the potential for developing software using agile methodologies. Special consideration is devoted to the potential and advantages of using the Scrum methodology in software development, and to the relationship between the implementation of agile methodologies and software development projects.

  20. Development of assessment methodology for plant configuration control

    Energy Technology Data Exchange (ETDEWEB)

    Cheong, Chang Hyeon; Yu, Yeong Woo; Cho, Jae Seon; Kim, Ju Yeol; Kim, Yun Ik; Yang, Hui Chang; Park, Gang Min; Hur, Byeong Gil [Seoul National Univ., Seoul (Korea, Republic of)

    1999-03-15

    The purpose of this study is to develop an effective and comprehensive assessment methodology that reflects plant characteristics for the surveillance, maintenance, repair and operation of nuclear power plants. The development of this methodology can contribute to enhancing safety. In the first year of this study, recent research is surveyed, and concept definitions, procedures, current PSA methodologies and implementations of various models are evaluated. Based on this survey, a systematic assessment methodology is suggested.

  1. Development of assessment methodology for plant configuration control

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hyun; You, Young Woo; Kim, Yoon Ik; Yang, Hui Chang; Huh, Byeong Gill; Lee, Dong Won; Ahn, Gwan Won [Seoul National Univ., Seoul (Korea, Republic of)

    2001-03-15

    The purpose of this study is to develop an effective and comprehensive assessment methodology that reflects plant characteristics for the surveillance, maintenance, repair and operation of nuclear power plants. In this study, recent research is surveyed, and concept definitions, procedures, current PSA methodologies and implementations of various models are evaluated. Based on this survey, a systematic assessment methodology is suggested. The configuration control assessment methodology suggested in this study, developed to reflect the characteristics of Korean NPPs, can be utilized as a supplement to current PSA methodologies.

  2. Formalizing the ISDF Software Development Methodology

    OpenAIRE

    Mihai Liviu DESPA

    2015-01-01

    The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of wha...

  3. Development of seismic PSA methodology at JAERI

    International Nuclear Information System (INIS)

    Muramatsu, K.; Ebisawa, K.; Matsumoto, K.; Oikawa, T.; Kondo, M.

    1995-01-01

    The Japan Atomic Energy Research Institute (JAERI) is developing a methodology for seismic probabilistic safety assessment (PSA) of nuclear power plants, aiming at providing a set of procedures, computer codes and data suitable for performing seismic PSAs in Japan. In order to demonstrate the usefulness of JAERI's methodology and to obtain a better understanding of the factors controlling the results of seismic PSAs, a seismic PSA for a BWR is in progress. In the course of this PSA, various improvements were made to the methodology. In the area of hazard analysis, the application of the current method to the model plant site is being carried out. In the area of response analysis, the response factor method was modified to consider the non-linear response of the building. As for the capacity evaluation of components, since component capacity data for PSA in Japan are very scarce, the capacities of selected components used in Japan were evaluated. In the systems analysis, the SECOM2 code was improved to perform importance and sensitivity analyses for the effects of correlation of responses and correlation of capacities. This paper summarizes the recent progress of seismic PSA research at JAERI, with emphasis on the evaluation of component capacity and the improvement of the systems reliability analysis methodology. (author)

  4. Methodology for Developing a Diesel Exhaust After Treatment Simulation Tool

    DEFF Research Database (Denmark)

    Christiansen, Tine; Jensen, Johanne; Åberg, Andreas

    2018-01-01

    A methodology for the development of catalyst models is presented, together with a methodology for implementing such models into a modular simulation tool that simulates the units in succession. A case study is presented illustrating how suitable models can be found and used for s...

  5. A development methodology for scientific software

    International Nuclear Information System (INIS)

    Cort, G.; Barrus, D.M.; Goldstone, J.A.; Miller, L.; Nelson, R.O.; Poore, R.V.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed

  6. Developing Foucault's Discourse Analytic Methodology

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-01-01

    A methodological position for a FOUCAULTian discourse analysis is presented. A sequence of analytical steps is introduced and an illustrating example is offered. It is emphasized that discourse analysis has to discover the system-level of discursive rules and the deeper structure of the discursive formation; otherwise the analysis will remain unfinished. Michel FOUCAULT's work is theoretically grounded in French structuralism and (so-called) post-structuralism. In this paper, post-structuralism is conceived not as a means of overcoming structuralism, but as a way of critically continuing the structural perspective. In this way, discursive structures can be related to discursive practices and the concept of structure can be disclosed (e.g. to inter-discourse or DERRIDA's concept of structurality). The structural methodology is thus continued and radicalized, but not given up. In this paper, FOUCAULT's theory is combined with the works of Michel PÊCHEUX and (especially for the sociology of knowledge and the sociology of culture) Pierre BOURDIEU. The practice of discourse analysis is theoretically grounded; this practice can be conceived as a reflexive coupling of deconstruction and reconstruction in the material to be analyzed. This methodology can therefore be characterized as a reconstructive qualitative methodology. At the end of the article, forms of discourse analysis are criticized that do not intend to recover the system level of discursive rules and do not intend to discover the deeper structure of the discursive formation (i.e. episteme, socio-episteme). These forms are merely commentaries on discourses (not analyses of them); they remain phenomenological and are therefore pre-structuralist. URN: urn:nbn:de:0114-fqs060168

  7. Comparative study on software development methodologies

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2014-12-01

    This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation-orientated IT projects. The paper starts by depicting specific characteristics of software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry, and the software development project manager handles challenges and risks that are predominantly encountered in business and research areas involving state-of-the-art technology. Conventional software development stages are defined and briefly described. Development stages are the building blocks of any software development methodology, so it is important to research this aspect properly. Current software development methodologies are presented, and development stages are defined for each showcased methodology. For each methodology a graphic representation is illustrated in order to better individualize its structure. Software development methodologies are compared by highlighting strengths and weaknesses from the stakeholder's point of view. Conclusions are formulated and a research direction aimed at formalizing a software development methodology dedicated to innovation-orientated IT projects is enunciated.

  8. A Model-Based Systems Engineering Methodology for Employing Architecture In System Analysis: Developing Simulation Models Using Systems Modeling Language Products to Link Architecture and Analysis

    Science.gov (United States)

    2016-06-01

    [Abstract garbled by extraction; figure-list and page-number residue removed. Surviving fragments contrast the spiral model of system development, first introduced by Boehm, which assumes that available technologies will change over the course of development, with the waterfall model, into which incorporating evolving system capabilities would prove quite difficult. A citation to a Florida Department of Transportation research memorandum (Tallahassee, FL) also survives.]

  9. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Methodologies and models are reviewed. Three exposure/measurement methodologies are assessed. Estimation methods focus on source evaluation and attribution; sources include those outdoors and indoors, as well as in occupational and in-transit environments. Fate...

  10. PSA methodology development and application in Japan

    International Nuclear Information System (INIS)

    Kazuo Sato; Toshiaki Tobioka; Kiyoharu Abe

    1987-01-01

    The outlines of Japanese activities on the development and application of probabilistic safety assessment (PSA) methodologies are described. First, the methodology development activities are described for system reliability analysis, operational data analysis, core melt accident analysis, environmental consequence analysis and seismic risk analysis. Then, methodology application examples from the regulatory side and the industry side are described. (author)

  11. Developing educational hypermedia applications: a methodological approach

    Directory of Open Access Journals (Sweden)

    Jose Miguel Nunes

    1996-01-01

    This paper proposes a hypermedia development methodology with the aim of integrating the work of educators, who are primarily responsible for the instructional design, with that of software experts, who are responsible for the software design and development. Hence, it is proposed that educators and programmers interact in an integrated and systematic manner following a methodological approach.

  12. Comparative study on software development methodologies

    OpenAIRE

    Mihai Liviu DESPA

    2014-01-01

    This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation orientated IT projects. The paper starts by depicting specific characteristics in software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. Also the software development project manager han...

  13. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the validity of a reliability computation model using the concept of Bayesian hypothesis testing, by comparing the model prediction and the experimental observation, when there is only one computational model available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. In the presence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified by treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides decision-makers with a rational criterion for the acceptance or rejection of the computational model
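
    The core computation the abstract describes, comparing model predictions with experimental observations through a Bayes factor, can be sketched as follows. The error data, noise level and bias prior below are invented for illustration; they are not from the paper.

        import numpy as np
        from scipy import stats
        from scipy.integrate import trapezoid

        # H0: the reliability model is valid (prediction errors centred on zero).
        # H1: the model carries an unknown bias, described by a prior.
        diff = np.array([0.010, -0.020, 0.005, 0.015, -0.010])  # observed - predicted (hypothetical)
        sigma = 0.02                                            # assumed noise level

        lik_h0 = np.prod(stats.norm.pdf(diff, loc=0.0, scale=sigma))

        bias = np.linspace(-0.1, 0.1, 2001)                     # candidate model biases
        prior = stats.norm.pdf(bias, loc=0.0, scale=0.05)       # prior on the bias under H1
        lik_bias = np.array([np.prod(stats.norm.pdf(diff, loc=b, scale=sigma)) for b in bias])
        lik_h1 = trapezoid(lik_bias * prior, bias)              # marginal likelihood under H1

        b01 = lik_h0 / lik_h1
        print(f"Bayes factor B01 = {b01:.2f}; B01 > 1 favours accepting the model")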

  14. Study on fermentation conditions of palm juice vinegar by response surface methodology and development of a kinetic model

    Directory of Open Access Journals (Sweden)

    S. Ghosh

    2012-09-01

    Natural vinegar is one of the fermented products with potential from a nutraceutical standpoint. The present study is an optimization of the fermentation conditions for vinegar production from palm juice (Borassus flabellifer) wine, a biochemical process aided by Acetobacter aceti (NCIM 2251). The physical fermentation parameters temperature, pH and time were investigated by Response Surface Methodology (RSM) with a 2³ factorial central composite design (CCD). The optimum pH, temperature and time were 5.5, 30 °C and 72 h for the highest yield of acetic acid (68.12 g/L). The quadratic model equation had an R² value of 0.992. RSM played an important role in elucidating the basic mechanisms of a complex situation, thus providing better process control by maximizing acetic acid production with respect to the physical parameters. At the optimized temperature, pH and time, and with the help of mathematical kinetic equations, the Monod specific growth rate (μmax = 0.021 h⁻¹), the maximum logistic specific growth rate (μ′max = 0.027 h⁻¹) and various other kinetic parameters were calculated, which helped in validating the experimental data. The established kinetic models may therefore be applied to the production of natural vinegar by fermentation of low-cost palm juice.
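
    A minimal sketch of the kinetic part of such a study: fitting the closed-form logistic growth curve to time-course data and recovering the maximum specific growth rate. The time points and biomass values below are hypothetical, chosen only to give a rate of the same order as the μ′max reported above.

        import numpy as np
        from scipy.optimize import curve_fit

        # Closed-form solution of logistic growth dX/dt = mu * X * (1 - X / xmax).
        def logistic(t, x0, xmax, mu):
            return xmax / (1.0 + (xmax / x0 - 1.0) * np.exp(-mu * t))

        t = np.array([0.0, 12.0, 24.0, 36.0, 48.0, 60.0, 72.0])   # hours
        x = np.array([0.10, 0.13, 0.17, 0.22, 0.28, 0.34, 0.40])  # biomass (g/L), hypothetical

        (x0, xmax, mu), _ = curve_fit(
            logistic, t, x, p0=[0.1, 0.5, 0.03],
            bounds=([0.01, 0.2, 0.0], [0.5, 5.0, 1.0]),
        )
        print(f"fitted maximum specific growth rate: {mu:.3f} 1/h")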

  15. A methodology for developing distributed programs

    NARCIS (Netherlands)

    Ramesh, S.; Mehndiratta, S.L.

    1987-01-01

    A methodology, different from the existing ones, for constructing distributed programs is presented. It is based on the well-known idea of developing distributed programs via synchronous and centralized programs. The distinguishing features of the methodology are: 1) specification include process

  16. A Comprehensive Methodology for Development, Parameter Estimation, and Uncertainty Analysis of Group Contribution Based Property Models -An Application to the Heat of Combustion

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Marcarie, Camille; Abildskov, Jens

    2016-01-01

    A rigorous methodology is developed that addresses numerical and statistical issues when developing group contribution (GC) based property models, such as regression methods, optimization algorithms, performance statistics, outlier treatment, parameter identifiability, and uncertainty of the prediction. The methodology is evaluated through development of a GC method for the prediction of the heat of combustion (ΔHco) for pure components. The results showed that robust regression leads to the best performance statistics for parameter estimation. The bootstrap method is found to be a valid alternative... Owing to parameter identifiability issues, reporting of the 95% confidence intervals of the predicted property values should be mandatory, as opposed to reporting only single-value predictions, currently the norm in the literature. Moreover, inclusion of higher-order groups (additional parameters) does not always lead to improved...

  17. Development of an analysis methodology for turbulent thermal striping

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Geun Jong; Jeon, Won Dae; Han, Jin Woo; Gu, Byong Kook [Changwon National University, Changwon(Korea)

    2001-03-01

    To develop the analysis methodology, geometric configuration and flow characteristics such as velocity are identified as the important governing factors of the thermal striping phenomenon. Along these factors, the performance of the turbulence models in the existing analysis methodology is evaluated against experimental data. The status of DNS application is also assessed based on the literature. The evaluation results are reflected in setting up the new analysis methodology. From the evaluation of the existing methodology, the full Reynolds stress (FRS) model is identified as the best among the turbulence models considered, and LES is found to be able to provide time-dependent turbulence quantities. Further improvements of the near-wall treatment and the temperature variance equation are required for FRS, and implementation of new sub-grid scale models is required for LES. Through these improvements, a new, reliable analysis methodology for thermal striping can be developed. 30 refs., 26 figs., 6 tabs. (Author)

  18. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary business process modeling approaches is identified, and general modeling principles that can fill the gap are discussed. The way these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated through use in a real project. Using examples from this project, the main methodology features are explained, together with the significant problems that were met during the project. Concluding from these problems and from the results of the methodology evaluation, the needed future development of the methodology is outlined.

  19. Photovoltaic module energy rating methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L. [National Renewable Energy Lab., Golden, CO (United States); Whitaker, C.; Newmiller, J. [Endecon Engineering, San Ramon, CA (United States)

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.
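
    The flavour of such an energy rating can be sketched with a toy calculation: integrate a simple irradiance- and temperature-dependent power model over a binned weather profile. The linear power model and the profile numbers below are assumptions for illustration, not the committee's procedure.

        # Each bin: (plane-of-array irradiance W/m^2, cell temperature C, hours/year).
        weather = [
            (200, 20, 900),
            (500, 35, 1100),
            (800, 45, 800),
            (1000, 55, 300),
        ]
        p_stc = 100.0    # module power rating at standard test conditions, W
        gamma = -0.004   # power temperature coefficient, 1/K (typical order of magnitude)

        energy_kwh = sum(
            p_stc * (g / 1000.0) * (1.0 + gamma * (t - 25.0)) * h
            for g, t, h in weather
        ) / 1000.0
        print(f"annual module energy for this profile: {energy_kwh:.0f} kWh")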

  20. Development of effect assessment methodology for the deployment of fast reactor cycle system with dynamic computable general equilibrium model

    International Nuclear Information System (INIS)

    Shiotani, Hiroki; Ono, Kiyoshi

    2009-01-01

    The Global Trade Analysis Project (GTAP) model is a widely used computable general equilibrium (CGE) model developed by Purdue University. Although the GTAP-E, an energy-environmental version of the GTAP model, is useful for surveying the energy-economy-environment-trade linkage in economic policy analysis, it does not have a decomposed model of the electricity sector, and its analyses are comparative-static. In this study, a recursive dynamic CGE model with a detailed electricity technology bundle, including nuclear power generation with FR, was developed based on the GTAP-E to evaluate the long-term socioeconomic effects of FR deployment. Capital stock changes caused by international investments and some dynamic constraints of FR deployment and operation (e.g., load-following capability and plutonium mass balance) were incorporated in the analyses. The long-term socioeconomic effects resulting from the deployment of economically competitive FR with innovative technologies can thus be assessed; the cumulative effects of FR deployment on GDP calculated using this model amounted to over 40 trillion yen in Japan and 400 trillion yen worldwide, several times more than the effects calculated using the conventional cost-benefit analysis tool, because of ripple effects and energy substitutions, among others. (author)

  1. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level, the scaling determines the effects various processes have on a state variable, and it ranks the processes in importance by the magnitude of the fractional change they cause in that state variable. At the system level, the scaling determines the governing processes and corresponding components, ranking these in order of importance according to their effect on the fractional change of system-wide state variables. The scaling methodology reveals, on all levels, the fractional change of state variables and is therefore called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate to the content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering of the component effect metrics provides the hierarchy of processes in a component, then in all components and in the system. FSM separates quantitatively dominant from minor processes and components and
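
    The effect metric itself is simple to compute once transport rates and contents are known. A small sketch, with invented numbers, of ranking processes by Ω = ωt:

        # For each process, omega = transport rate / content of the preserved quantity
        # in the component, and Omega = omega * t ranks the processes. Illustrative only.
        processes = {
            # name: (transport rate [kg/s], component content [kg])
            "break flow":    (50.0, 2.0e4),
            "ECC injection": (30.0, 2.0e4),
            "core boil-off": (5.0,  2.0e4),
        }
        t_char = 100.0  # characteristic time of the scenario phase [s]

        effect = {name: (rate / content) * t_char
                  for name, (rate, content) in processes.items()}
        for name, omega_t in sorted(effect.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{name:14s} Omega = {omega_t:.3f}")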

  2. The PROSA methodology for scenario development

    International Nuclear Information System (INIS)

    Grupa, J.B.

    2001-01-01

    In this paper a methodology for scenario development is proposed. The method has been developed in an effort to convince ourselves (and others) that all conceivable future developments of a waste repository have been covered. To be able to assess all conceivable future developments, the method needs to be comprehensive; to convince ourselves and others, it should be structured in such a way that the treatment of each conceivable future development is traceable. The methodology is currently being applied to two Dutch disposal designs. Preliminary results show that the elaborated method functions better than the original method; however, some elements of the method will need further refinement. (author)

  3. Integrated management model. Methodology and software-enabled tool designed to assist a utility in developing a station-wide optimization

    International Nuclear Information System (INIS)

    Llovet, R.; Ibanez, R.; Woodcock, J.

    2005-01-01

    A key concern for utilities today is optimizing station aging and reliability management activities in a manner that maximizes the value of those activities within an affordable budget. The Westinghouse Proactive Asset Management Model is a methodology and software-enabled tool designed to assist a utility in developing a station-wide optimization of those activities. The process and tool support the development of an optimized, station-wide plan for inspection, testing, maintenance, repair and replacement of aging components. The optimization identifies the benefit and optimal timing of those activities based on minimizing unplanned outage costs (avoided costs) and maximizing station net present value. (Author)

  4. Operations management research methodologies using quantitative modeling

    NARCIS (Netherlands)

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in

  5. Verification of Fault Tree Models with RBDGG Methodology

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

    Currently, fault tree analysis is widely used in the field of probabilistic safety assessment (PSA) of nuclear power plants (NPPs). To guarantee the correctness of fault tree models, which are usually constructed manually by analysts, review by other analysts is widely used for verification. Recently, an extension of the reliability block diagram was developed, named RBDGG (reliability block diagram with general gates). The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system; therefore, modeling a system for system reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that behind the RGGG (reliability graph with general gates) methodology. The difference is that the RBDGG methodology focuses on block failures, while the RGGG methodology focuses on connection-line failures. It is also known that an RGGG model can be converted into an RBDGG model and vice versa. In this paper, a new method for the verification of constructed fault tree models using the RBDGG methodology is proposed and demonstrated

  6. Evolution of courseware development methodology : recent issues

    NARCIS (Netherlands)

    Moonen, J.C.M.M.; Schoenmaker, Jan

    1992-01-01

    To improve the quality of courseware products and the efficiency of the courseware development process, a methodology based upon "courseware engineering", being a combination of instructional systems development and software engineering, has emerged over the last 10–15 years. Recently, software

  7. Practical implications of rapid development methodologies

    CSIR Research Space (South Africa)

    Gerber, A

    2007-11-01

    ...as the acceleration of the system development phases through an iterative construction approach. These methodologies also claim to manage the changing nature of requirements. However, during the development of large and complex systems by a small and technically...

  8. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes
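
    Two of the techniques named above, Latin hypercube sampling and the analysis of multiplicative chain models with independent lognormal inputs, can be combined in a few lines. A sketch with illustrative parameters only:

        import numpy as np
        from scipy.stats import qmc, lognorm

        # Multiplicative chain model y = x1 * x2 * x3, sampled by Latin hypercube
        # for an uncertainty/sensitivity screening of the model output.
        sampler = qmc.LatinHypercube(d=3, seed=1)
        u = sampler.random(n=1000)                      # uniform samples in (0, 1)^3

        sigmas = [0.5, 0.3, 0.8]                        # log-space std devs (illustrative)
        x = np.column_stack([lognorm(s).ppf(u[:, i]) for i, s in enumerate(sigmas)])
        y = x.prod(axis=1)

        # For independent lognormal factors, the product is lognormal with the
        # log-space variances summed, which the sample statistics should reflect.
        print(f"median {np.median(y):.3f}, 95th percentile {np.percentile(y, 95):.3f}")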

  9. A methodology for development of biocatalytic processes

    DEFF Research Database (Denmark)

    Lima Ramos, Joana

    ...are available. The work described in this thesis presents a methodological approach for early-stage development of biocatalytic processes, understanding and dealing with the reaction, biocatalyst and process constraints; when applied, this methodology has a decisive role... These process metrics can often be attained by improvements in the reaction chemistry, the biocatalyst, and/or by process engineering, which often requires a complex process development strategy. Interestingly, this complexity, which arises from the need for integration of biological and process technologies... and their relationship with the overall process is not clear. The first case study presents a rational approach for defining a development strategy for multi-enzymatic processes; the proposed methodology requires a profound and structured knowledge of the multi-enzyme system, integrating chemistry, biology and process engineering. In order to suggest...

  10. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  11. Development Methodology for an Integrated Legal Cadastre

    NARCIS (Netherlands)

    Hespanha, J.P.

    2012-01-01

    This Thesis describes the research process followed in order to achieve a development methodology applicable to the reform of cadastral systems with a legal basis. It was motivated by the author’s participation in one of the first surveying and mapping operations for a digital cadastre in Portugal,

  12. Methodology proposal for the development of Tillage Models - (Part II) Indexes of physical-mechanical characterization of the soil and development of a tillage model

    International Nuclear Information System (INIS)

    Lozano Osorno Fernando; Castillo Herran, Bernardo

    1999-01-01

    A proposal is presented for developing tillage models that support decisions on soil preparation systems (including the zero-tillage option) based on measurements of soil condition. Following a sampling plan covering diverse physical-mechanical soil parameters and a statistical correlation analysis, the representative variables chosen were bulk density, cone index, moisture content and cohesion (torsion box, in situ tests). These parameters are not only strongly correlated with each other but also allow adequate estimation of other variables of interest, such as total porosity, macro-porosity, hydraulic conductivity and, in general, soil resistance, which makes it feasible to choose soil tillage methods as a function of the initial state of the soil. In the case tested, the possibility of establishing reduced-tillage systems was verified

  13. Development of risk-informed assessment (RIA) design methodology

    International Nuclear Information System (INIS)

    Ji, S. K.; Park, S. J.; Park, B. R.; Kim, M. R.; Choi, C. J.

    2001-01-01

    It has been assessed that the capital cost for future nuclear power plants needs to be reduced on the order of 35% to 40% for Advanced Light Water Reactors such as KNGR and System 80+. Such a reduction in capital cost will require a fundamental re-evaluation of the industry standards and regulatory basis under which nuclear plants are designed and licensed. The objective of this study is to develop a risk-informed assessment (RIA) design methodology for future nuclear power plants. To meet this objective, a design simplification method is developed and the RIA design methodology is exercised for a conceptual system. For verification of the methodology, simplified conceptual ECCS and feedwater systems are developed; then LOCA sensitivity analyses and aggressive secondary cooldown analyses for these systems are performed. In addition, a probabilistic safety assessment (PSA) model for LOCA is developed and the validity of the RIA design methodology is demonstrated

  14. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.
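
    A toy version of the indicator-style calculation implied above, expressing regional terrorism risk in expected annual monetary terms as a weighted function of population-concentration and critical-infrastructure attributes; all weights, rates and region names are invented placeholders, not the article's model.

        regions = {
            # name: (population density index, critical infrastructure index), both in [0, 1]
            "Region A": (0.8, 0.6),
            "Region B": (0.3, 0.9),
            "Region C": (0.5, 0.2),
        }
        base_rate = 1e-3     # annual attack likelihood scale (hypothetical)
        consequence = 5e8    # monetary consequence scale in dollars (hypothetical)

        for name, (pop, infra) in regions.items():
            # risk indicator: weighted combination of the two attributes
            annual_risk = base_rate * (0.6 * pop + 0.4 * infra) * consequence
            print(f"{name}: expected annual loss = ${annual_risk:,.0f}")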

  15. Development of Proliferation Resistance Assessment Methodology Based on International Standards

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Jung Won; Lee, Kwang Seok

    2009-03-01

    Proliferation resistance (PR) is one of the requirements to be met by GEN IV and INPRO next-generation nuclear energy systems. Internationally, evaluation methodologies for PR were first initiated around 1980, but systematic development started in the 2000s. In Korea, independent development of a PR evaluation methodology was started in 2007 as a long-term nuclear R&D project, motivated by the export of nuclear energy systems and by the need to increase the international credibility and transparency of domestic nuclear system and fuel cycle development; the work is aimed at a model for the PR evaluation methodology. In the first year, a comparative study of GEN-IV/INPRO, PR indicator development, quantification of indicators, development of an evaluation model, and analysis of technology systems and international technology development trends were performed. In the second year, feasibility studies of the indicators, allowable limits for the indicators, and a review of the technical requirements of the indicators were carried out. The results of a PR evaluation must be applied at the beginning of the conceptual design of a nuclear system. Through this technology development, the PR evaluation methodology will be applied in the regulatory requirements for authorization and permission that are to be developed

  16. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel

    1996-01-01

    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the

  17. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  18. Setting priorities in health research using the model proposed by the World Health Organization: development of a quantitative methodology using tuberculosis in South Africa as a worked example.

    Science.gov (United States)

    Hacking, Damian; Cleary, Susan

    2016-02-09

    Setting priorities is important in health research given the limited resources available for research. Various guidelines exist to assist in the priority setting process; however, priority setting still faces significant challenges such as the clear ranking of identified priorities. The World Health Organization (WHO) proposed a Disability Adjusted Life Year (DALY)-based model to rank priorities by research area (basic, health systems and biomedical) by dividing the DALYs into 'unavertable with existing interventions', 'avertable with improved efficiency' and 'avertable with existing but non-cost-effective interventions', respectively. However, the model has conceptual flaws and no clear methodology for its construction. Therefore, the aim of this paper was to amend the model to address these flaws, and develop a clear methodology by using tuberculosis in South Africa as a worked example. An amended model was constructed to represent total DALYs as the product of DALYs per person and absolute burden of disease. These figures were calculated for all countries from WHO datasets. The lowest figures achieved by any country were assumed to represent 'unavertable with existing interventions' if extrapolated to South Africa. The ratio of 'cost per patient treated' (adjusted for purchasing power and outcome weighted) between South Africa and the best country was used to calculate the 'avertable with improved efficiency section'. Finally, 'avertable with existing but non-cost-effective interventions' was calculated using Disease Control Priorities Project efficacy data, and the ratio between the best intervention and South Africa's current intervention, irrespective of cost. The amended model shows that South Africa has a tuberculosis burden of 1,009,837.3 DALYs; 0.009% of DALYs are unavertable with existing interventions and 96.3% of DALYs could be averted with improvements in efficiency. Of the remaining DALYs, a further 56.9% could be averted with existing but non
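
    The arithmetic of the amended model can be sketched directly from the quantities named in the abstract. The ratios below are placeholders chosen only to reproduce the reported orders of magnitude; they are not the paper's actual inputs or exact mechanics.

        # Total DALYs partitioned by benchmarking against the best-performing country.
        total_dalys = 1_009_837.3              # South Africa TB burden (from the abstract)

        best_daly_per_person_ratio = 0.00009   # best country vs. SA, hypothetical
        unavertable = total_dalys * best_daly_per_person_ratio

        cost_ratio = 0.04                      # outcome-weighted cost per patient, best vs. SA (hypothetical)
        avertable_efficiency = (total_dalys - unavertable) * (1.0 - cost_ratio)

        remaining = total_dalys - unavertable - avertable_efficiency
        efficacy_ratio = 0.569                 # best existing vs. current intervention (hypothetical)
        avertable_noncosteffective = remaining * efficacy_ratio

        print(f"unavertable: {unavertable:,.1f} DALYs")
        print(f"avertable with improved efficiency: {avertable_efficiency:,.1f} DALYs")
        print(f"avertable with existing but non-cost-effective interventions: "
              f"{avertable_noncosteffective:,.1f} DALYs")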

  19. Large sample NAA facility and methodology development

    International Nuclear Information System (INIS)

    Roth, C.; Gugiu, D.; Barbos, D.; Datcu, A.; Aioanei, L.; Dobrea, D.; Taroiu, I. E.; Bucsa, A.; Ghinescu, A.

    2013-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) facility has been developed at the TRIGA Annular Core Pulsed Reactor (ACPR) operated by the Institute for Nuclear Research in Pitesti, Romania. The central irradiation cavity of the ACPR core can accommodate a large irradiation device. The ACPR neutron flux characteristics are well known, and spectrum adjustment techniques have been successfully applied to enhance the thermal component of the neutron flux in the central irradiation cavity. An analysis methodology was developed using the MCNP code to estimate counting efficiency and correction factors for the major perturbing phenomena. Test experiments, comparisons with classical instrumental neutron activation analysis (INAA) methods and an international inter-comparison exercise have been performed to validate the new methodology. (authors)

  20. Synthesis of methodology development and case studies

    OpenAIRE

    Roetter, R.P.; Keulen, van, H.; Laar, van, H.H.

    2000-01-01

    The 'Systems Research Network for Ecoregional Land Use Planning in Support of Natural Resource Management in Tropical Asia' (SysNet) was financed under the Ecoregional Fund, administered by the International Service for National Agricultural Research (ISNAR). The objective of the project was to develop and evaluate methodologies and tools for land use analysis, and apply them at the subnational scale to support agricultural and environmental policy formulation. In the framework of this projec...

  1. Comparing Methodologies for Developing an Early Warning System: Classification and Regression Tree Model versus Logistic Regression. REL 2015-077

    Science.gov (United States)

    Koon, Sharon; Petscher, Yaacov

    2015-01-01

    The purpose of this report was to explicate the use of logistic regression and classification and regression tree (CART) analysis in the development of early warning systems. It was motivated by state education leaders' interest in maintaining high classification accuracy while simultaneously improving practitioner understanding of the rules by…

  2. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16 February 2005 with the Protocol's ratification by Russia. The increasing momentum of this process is reflected in the more than 100 projects that have been submitted to the CDM Executive Board (CDM-EB) for approval of their baselines and monitoring methodologies, the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of the activities that would have been implemented in the absence of the CDM project, and the baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project for meeting the key criterion of the CDM: that a project should result in 'real, measurable, and long-term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' project. It is published as part of the project's effort to develop guidebooks covering important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are intended to help stakeholders better understand the CDM and are believed to contribute, eventually, to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook', developed under the CD4CDM project. (BA)

  3. Baseline methodologies for clean development mechanism projects

    International Nuclear Information System (INIS)

    Lee, M.K.; Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-01

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16 February 2005 with the Protocol's ratification by Russia. The increasing momentum of this process is reflected in the more than 100 projects that have been submitted to the CDM Executive Board (CDM-EB) for approval of their baselines and monitoring methodologies, the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of the activities that would have been implemented in the absence of the CDM project, and the baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, is thus the most critical element of any CDM project for meeting the key criterion of the CDM: that a project should result in 'real, measurable, and long-term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' project. It is published as part of the project's effort to develop guidebooks covering important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are intended to help stakeholders better understand the CDM and are believed to contribute, eventually, to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook', developed under the CD4CDM project. (BA)

  4. Methodologies for local development in smart society

    Directory of Open Access Journals (Sweden)

    Lorena BĂTĂGAN

    2012-07-01

    All the digital devices connected through the Internet are producing a large quantity of data. All this information can be turned into knowledge, because we now have the computational power and the advanced analytics solutions to make sense of it. With this knowledge, cities could reduce costs, cut waste, and improve efficiency, productivity and quality of life for their citizens. Efficient, smart cities are characterized by the greater importance given to environment, resources, globalization and sustainable development. This paper presents a study of the methodologies for urban development, which has become a central element of our society.

  5. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the 1988 revision of 10CFR50.46, which allowed best-estimate (BE) methods in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have changed continuously from conservative evaluation model (EM) approaches to BE ones. In this context, long-term core cooling (LSC) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and deficiencies of the old methodology were identified and, as a result, the USNRC suspended approval of CENPD-254-P-A, the old LSC methodology for CE-designed NPPs. The regulatory bodies requested that the non-conservatisms be removed and that system transient behavior be reflected in all LSC methodologies used. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full range of the break spectrum is considered, and applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the new methodology could be applied without significant changes to current LSC plans
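
    The boric-acid side of such a methodology rests on a simple mass balance: boil-off removes nearly boron-free steam while injection adds borated coolant, so the liquid inventory concentrates over time. A sketch with invented numbers and a constant-inventory assumption, not the BACON code itself:

        # Euler integration of the boron mass balance during long-term cooling.
        m_water = 1.0e5      # liquid water inventory in vessel/sump [kg], held constant
        c = 2500.0           # initial boron concentration [ppm]
        c_inj = 2500.0       # injected coolant concentration [ppm]
        w_boil = 2.0         # boil-off rate, matched by injection [kg/s]
        dt, t_end = 60.0, 24 * 3600.0

        t = 0.0
        while t < t_end:
            # injection adds boron; the steam carries essentially none away
            c += (w_boil * c_inj / m_water) * dt
            t += dt

        print(f"boron concentration after 24 h: {c:.0f} ppm")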

  6. A methodology for spectral wave model evaluation

    Science.gov (United States)

    Siqueira, S. A.; Edwards, K. L.; Rogers, W. E.

    2017-12-01

    climate, omitting the energy in the frequency band between the two lower limits tested can lead to an incomplete characterization of model performance. This methodology was developed to aid in selecting a comparison frequency range that does not needlessly increase computational expense and does not exclude energy to the detriment of model performance analysis.

  7. GENESIS OF THE METHODOLOGY OF MANAGING THE DEVELOPMENT OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Z.N. Varlamova

    2007-06-01

    This article investigates the genesis of the methodology of managing the development of organizations, understood as the set of methodological approaches and methods used. The results of a comparative analysis of the methodological approaches to managing organizational development are presented. The traditional methodological approaches are complemented by strategic experiment and case-study methodology. Approaches to the formation of a new methodology and technique for researching the sources of an organization's competitive advantages are considered.

  8. Methodological guidelines for developing accident modification functions

    DEFF Research Database (Denmark)

    Elvik, Rune

    2015-01-01

    This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines, and an example is given of how to use the guidelines. The importance of exploratory analysis and an iterative approach in developing accident modification functions is stressed. The example shows that strict compliance with all the guidelines may be difficult, but represents a level of stringency that should be strived for. Currently, the main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures. © 2015 Elsevier Ltd. All rights reserved.
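
    As an illustration of what an accident modification function looks like in practice, the sketch below fits a power-function form, a common choice for such functions, to hypothetical crash modification factor estimates taken at different traffic volumes. The functional form, data points and starting values are assumptions for illustration, not the paper's example.

        import numpy as np
        from scipy.optimize import curve_fit

        # Accident modification factor as a power function of traffic volume (AADT).
        def amf(aadt, a, b):
            return a * aadt ** b

        aadt = np.array([2_000, 5_000, 10_000, 20_000, 40_000], dtype=float)
        effect = np.array([0.95, 0.90, 0.86, 0.80, 0.76])   # estimated CMFs, hypothetical

        (a, b), _ = curve_fit(amf, aadt, effect, p0=[1.5, -0.05])
        print(f"fitted AMF: {a:.3f} * AADT^{b:.3f}")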

  9. Development of methodology to construct a generic conceptual model of river-valley evolution for performance assessment of HLW geological disposal

    International Nuclear Information System (INIS)

    Kawamura, Makoto; Tanikawa, Shin-ichi; Yasue, Ken-ichi; Niizato, Tadafumi

    2011-01-01

    In order to assess the long-term safety of a geological disposal system for high-level radioactive waste (HLW), it is important to consider the impact of uplift and erosion, which cannot be precluded on a timescale of the order of several hundred thousand years for many locations in Japan. Geomorphic evolution, caused by uplift and erosion and coupled to climatic and sea-level changes, will impact the geological disposal system through the resulting spatial and temporal changes in the disposal environment. Degradation of HLW barrier performance will be particularly significant when the remnant repository structures near, and are eventually exposed at, the ground surface. In previous studies, fluvial erosion was identified as the key concern in most settings in Japan. Interpretation of the impact of the phenomena at relevant locations in Japan has led to development of a generic conceptual model containing the features typical of the middle reaches of rivers. Here, therefore, we present a methodology for development of a generic conceptual model based on the best current understanding of fluvial erosion in Japan, which identifies the simplifications and uncertainties involved and assesses their consequences in the context of repository performance. (author)

  10. Development of a flight software testing methodology

    Science.gov (United States)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.

  11. Extending statistical boosting. An overview of recent methodological developments.

    Science.gov (United States)

    Mayr, A; Binder, H; Gefeller, O; Schmid, M

    2014-01-01

    Boosting algorithms to simultaneously estimate and select predictor effects in statistical models have gained substantial interest during the last decade. This review highlights recent methodological developments regarding boosting algorithms for statistical modelling especially focusing on topics relevant for biomedical research. We suggest a unified framework for gradient boosting and likelihood-based boosting (statistical boosting) which have been addressed separately in the literature up to now. The methodological developments on statistical boosting during the last ten years can be grouped into three different lines of research: i) efforts to ensure variable selection leading to sparser models, ii) developments regarding different types of predictor effects and how to choose them, iii) approaches to extend the statistical boosting framework to new regression settings. Statistical boosting algorithms have been adapted to carry out unbiased variable selection and automated model choice during the fitting process and can nowadays be applied in almost any regression setting in combination with a large amount of different types of predictor effects.
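
    The gradient-boosting branch of this framework can be illustrated with componentwise L2 boosting: each iteration fits every single covariate to the current residuals and updates only the best-fitting one, which yields the intrinsic variable selection and sparse models mentioned above. A self-contained sketch with simulated data, not any particular package's implementation:

        import numpy as np

        rng = np.random.default_rng(42)
        n, p = 200, 10
        X = rng.normal(size=(n, p))
        beta_true = np.array([2.0, -1.0] + [0.0] * (p - 2))   # only two informative covariates
        y = X @ beta_true + rng.normal(scale=0.5, size=n)

        beta = np.zeros(p)
        nu = 0.1                                   # step length (learning rate)
        for _ in range(300):                       # m_stop, the main tuning parameter
            resid = y - X @ beta
            coefs = X.T @ resid / (X ** 2).sum(axis=0)        # per-component least-squares fits
            sse = ((resid[:, None] - X * coefs) ** 2).sum(axis=0)
            j = int(np.argmin(sse))                # component with the best fit to the residuals
            beta[j] += nu * coefs[j]               # update only the selected component

        print("estimated coefficients:", np.round(beta, 2))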

  12. NPA4K development system using object-oriented methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwang Seong; Hahn, Do Hee

    2000-11-01

    NPA4K consists of module programs with several components for various functions. Software components have to be developed systematically, according to partitioning criteria and a design method. In this paper, an overview of a typical object-oriented methodology, UML (Unified Modeling Language), is given, and the procedure for NPA4K program development and the architecture for long-term development of NPA4K are introduced.

  13. NPA4K development system using object-oriented methodology

    International Nuclear Information System (INIS)

    Jeong, Kwang Seong; Hahn, Do Hee

    2000-11-01

    NPA4K consists of module programs with several components for various functions. Software components have to be developed systematically, according to partitioning criteria and a design method. In this paper, an overview of a typical object-oriented methodology, UML (Unified Modeling Language), the procedure for NPA4K program development and the architecture for long-term development of NPA4K are introduced.

  14. Evaluation of mechanical load in the musculoskeletal system : development of experimental and modeling methodologies for the study of the effect of exercise in human models

    OpenAIRE

    João, Filipa Oliveira da Silva

    2013-01-01

    Doctorate in Human Kinetics, in the specialty of Biomechanics. A major concern of biomechanics research is the evaluation of the mechanical load and power that the human body develops and endures when performing high- to moderate-intensity sport activities. With the purpose of increasing performance and reducing the risk of injury, substantial advances have been made in pursuit of this goal, both in laboratory techniques and in modelling and simulation. Traditionally, the main focus w...

  15. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  16. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    Science.gov (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers with the introduction of techniques, terminology, notation and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are few case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for an epidemiology study and an ecological study, and hence further validates AOM in a qualitative manner.

  17. Women in India with Gestational Diabetes Mellitus Strategy (WINGS): Methodology and development of a model of care for gestational diabetes mellitus (WINGS 4)

    Directory of Open Access Journals (Sweden)

    Arivudainambi Kayal

    2016-01-01

    Full Text Available Aim: The Women In India with GDM Strategy (WINGS) project was conducted with the aim of developing a model of care (MOC) suitable for women with gestational diabetes mellitus (GDM) in low- and middle-income countries. Methodology: The WINGS project was carried out in Chennai, Southern India, in two phases. In Phase I, a situational analysis was conducted to understand the practice patterns of health-care professionals and to determine the best screening criteria through a pilot screening study. Results: Phase II involved developing a MOC based on findings from the situational analysis and evaluating its effectiveness. The model focused on diagnosis, management, and follow-up of women with GDM, who were followed prospectively throughout their pregnancy. An educational booklet was provided to all women with GDM, offering guidance on self-management of GDM, including sample meal plans and physical activity tips. A pedometer was provided to all women to monitor step count. Medical nutrition therapy (MNT) was the first line of treatment given to women with GDM. Women were advised to undergo fasting blood glucose and postprandial blood glucose testing every fortnight. Insulin was indicated when the target blood glucose levels were not achieved with MNT. Women were evaluated for pregnancy outcomes and postpartum glucose tolerance status. Conclusions: The WINGS MOC offers a comprehensive package at every level of care for women with GDM. If successful, this MOC will be scaled up to other resource-constrained settings with the hope of improving the lives of women with GDM.

  18. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation.

    Science.gov (United States)

    Ribba, B; Grimm, H P; Agoram, B; Davies, M R; Gadkar, K; Niederer, S; van Riel, N; Timmis, J; van der Graaf, P H

    2017-08-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicines research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early Development to focus discussions on two critical methodological aspects of QSP model development: optimal structural granularity and parameter estimation. Here we report, in a perspective article, a summary of the presentations and discussions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of the American Society for Clinical Pharmacology and Therapeutics.

  19. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel, and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative real-time process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 refs., 9 tabs.
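
    The transformation to a 'success-oriented' framework is, at its core, the logical dualization of a fault tree: AND gates become OR gates and vice versa, and failure events are negated (De Morgan's laws). A minimal sketch follows, with a hypothetical tree encoding and event names, not the Darlington fault trees themselves.

```python
# Minimal sketch of converting a failure-oriented fault tree into its
# 'success-oriented' dual by De Morgan's laws (AND <-> OR, events negated).
# The tree encoding and event names are hypothetical.

def to_success_tree(node):
    """Return the dual (success) tree of a fault tree node."""
    if isinstance(node, str):                      # basic failure event
        return ("NOT", node)                       # its success counterpart
    gate, children = node
    dual_gate = "OR" if gate == "AND" else "AND"   # De Morgan
    return (dual_gate, [to_success_tree(c) for c in children])

# Top event: loss of injection = (pump A fails AND pump B fails) OR valve fails.
fault_tree = ("OR", [("AND", ["pumpA_fails", "pumpB_fails"]), "valve_fails"])
print(to_success_tree(fault_tree))
# ('AND', [('OR', [('NOT', 'pumpA_fails'), ('NOT', 'pumpB_fails')]),
#          ('NOT', 'valve_fails')])
```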

  20. Development of a comprehensive management site evaluation methodology

    International Nuclear Information System (INIS)

    Rodgers, J.C.; Onishi, Y.

    1981-01-01

    The Nuclear Regulatory Commission is in the process of preparing regulations that will define the necessary conditions for adequate disposal of low-level waste (LLW) by confinement in an LLW disposal facility. These proposed regulations form the context in which the motivation for the joint Los Alamos National Laboratory Battelle Pacific Northwest Laboratory program to develop a site-specific, LLW site evaluation methodology is discussed. The overall effort is divided into three development areas: land-use evaluation, environmental transport modelling, and long term scenario development including long-range climatology projections. At the present time four steps are envisioned in the application of the methodology to a site: site land use suitability assessment, land use-ecosystem interaction, contaminant transport simulation, and sensitivity analysis. Each of these steps is discussed in the paper. 12 refs

  1. Geologic modeling in risk assessment methodology for radioactive waste management

    International Nuclear Information System (INIS)

    Logan, S.E.; Berbano, M.C.

    1977-01-01

    Under contract to the U.S. Environmental Protection Agency (EPA), the University of New Mexico is developing a computer-based assessment methodology for evaluating public health and environmental impacts from the disposal of radioactive waste in geologic formations. The methodology incorporates a release or fault tree model, an environmental model, and an economic model. The release model and its application to a model repository in bedded salt are described. Fault trees are constructed to provide the relationships between various geologic and man-caused events which are potential mechanisms for release of radioactive material beyond the immediate environs of the repository. The environmental model includes: 1) the transport to and accumulation at various receptors in the biosphere, 2) pathways from these environmental concentrations, and 3) radiation dose to man. Finally, economic results are used to compare and assess various disposal configurations as a basis for formulating

  2. METHODOLOGICAL DEVELOPMENTS IN 3D SCANNING AND MODELLING OF AN ARCHAEOLOGICAL FRENCH HERITAGE SITE: THE BRONZE AGE PAINTED CAVE OF "LES FRAUX", DORDOGNE (FRANCE)

    Directory of Open Access Journals (Sweden)

    A. Burens

    2013-07-01

    Full Text Available For six years, an interdisciplinary team of archaeologists, surveyors, environmentalists and archaeometrists has jointly carried out the study of a Bronze Age painted cave, registered as a French Historical Monument. The archaeological cave of Les Fraux (Saint-Martin-de-Fressengeas, Dordogne) forms a wide network of galleries, characterized by the exceptional richness of its archaeological remains, such as ceramic and metal deposits, parietal representations and domestic fireplaces. This cave is the only protohistorical site in Europe in which testimonies of domestic, spiritual and artistic activities are gathered together. Fortunately, the cave was closed at the end of the Bronze Age, following the collapse of its entrance. The site was re-discovered in 1989 and its study started in 2007. The study in progress takes place in a new kind of facility founded by the CNRS's Institute of Ecology and Environment. The purpose of this observatory is the promotion of new methodologies and experimental studies in global ecology. In that framework, 3D models of the cave constitute the common working support and the best means of scientific communication for the various studies conducted on the site by nearly forty researchers. In this specific context, a partnership between archaeologists and surveyors from INSA Strasbourg allows the team to develop, in an interdisciplinary way, new data-acquisition methods based on contact-free measurement techniques in order to acquire full 3D documentation. This work is conducted in compliance with the integrity of the site. Different techniques based on terrestrial laser scanning, digital photogrammetry and a spatial imaging system have been used in order to generate a geometric and photorealistic 3D model from the combination of point clouds and photogrammetric images, for both visualization and accurate documentation purposes. Various scales of acquisition and diverse resolutions have been applied according to the subject

  3. Applying of component system development in object methodology, case study

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    Full Text Available Creating target software as a component system has been a very strong requirement over the last 20 years of software development. Architectural components are self-contained units, presenting not only partial and overall system behavior, but also cooperating with each other on the basis of their interfaces. Among other things, components have allowed flexible modification of the processes whose behavior is the foundation of component behavior, without changing the life of the component system. On the other hand, the component system makes it possible, at design time, to create numerous new connections between components and thus create modified system behaviors. This all enables company management to perform, at design time, required behavioral changes of processes in accordance with the requirements of changing production and markets. The development of software, generally referred to as the SDP (Software Development Process), follows two directions. The first one, called CBD (Component-Based Development), is dedicated to the development of component-based systems (CBS, Component-based System); the second is the development of software under the influence of SOA (Service-Oriented Architecture). Both directions are equipped with their own different development methodologies. The subject of this paper is only the first direction and the application of component-based system development in its object-oriented methodologies. The requirement of today is to carry out the development of component-based systems within developed object-oriented methodologies precisely in the way of a dominant style. In some of the known methodologies, however, this development is not completely transparent and is not even recognized as dominant. In some cases, it is corrected by special meta-integration models of component system development into an object methodology. This paper presents a case study

  4. A methodology for acquiring qualitative knowledge for probabilistic graphical models

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders L.

    2004-01-01

    We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model...

  5. Methodology, models and algorithms in thermographic diagnostics

    CERN Document Server

    Živčák, Jozef; Madarász, Ladislav; Rudas, Imre J

    2013-01-01

    This book presents the methodology and techniques of thermographic applications, with a focus primarily on medical thermography implemented for parametrizing the diagnostics of the human body. The first part of the book describes the basics of infrared thermography, the possibilities of thermographic diagnostics and the physical nature of thermography. The second half includes tools of intelligent engineering applied to the solving of selected applications and projects. Thermographic diagnostics was applied to the problems of paraplegia and tetraplegia and carpal tunnel syndrome (CTS). The results of the research activities were created with the cooperation of four projects within the Ministry of Education, Science, Research and Sport of the Slovak Republic entitled Digital control of complex systems with two degrees of freedom, Progressive methods of education in the area of control and modeling of complex object oriented systems on aircraft turbocompressor engines, Center for research of control of te...

  6. Photogrammetry Methodology Development for Gossamer Spacecraft Structures

    Science.gov (United States)

    Pappa, Richard S.; Jones, Thomas W.; Black, Jonathan T.; Walford, Alan; Robson, Stuart; Shortis, Mark R.

    2002-01-01

    Photogrammetry--the science of calculating 3D object coordinates from images--is a flexible and robust approach for measuring the static and dynamic characteristics of future ultra-lightweight and inflatable space structures (a.k.a., Gossamer structures), such as large membrane reflectors, solar sails, and thin-film solar arrays. Shape and dynamic measurements are required to validate new structural modeling techniques and corresponding analytical models for these unconventional systems. This paper summarizes experiences at NASA Langley Research Center over the past three years to develop or adapt photogrammetry methods for the specific problem of measuring Gossamer space structures. Turnkey industrial photogrammetry systems were not considered a cost-effective choice for this basic research effort because of their high purchase and maintenance costs. Instead, this research uses mainly off-the-shelf digital-camera and software technologies that are affordable to most organizations and provide acceptable accuracy.
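
    The basic computation behind photogrammetry, recovering 3D coordinates from image measurements, can be illustrated with two-view linear (DLT) triangulation. The sketch below uses synthetic camera matrices, not the NASA measurement setup.

```python
import numpy as np

# Minimal sketch of linear (DLT) triangulation: recovering a 3D point from
# its images in two calibrated cameras. The camera matrices are synthetic.

def triangulate(P1, P2, x1, x2):
    """Intersect two image rays: solve A X = 0 for the homogeneous 3D point."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Two synthetic cameras: identity pose and a 1 m translation along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 5.0])
x1 = X_true[:2] / X_true[2]                       # projection in camera 1
x2 = (X_true - [1.0, 0, 0])[:2] / X_true[2]       # projection in camera 2
print(np.round(triangulate(P1, P2, x1, x2), 6))   # recovers X_true
```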

  7. Monitoring sustainable biomass flows : General methodology development

    NARCIS (Netherlands)

    Goh, Chun Sheng; Junginger, Martin; Faaij, André

    Transition to a bio-based economy will create new demand for biomass, e.g. the increasing use of bioenergy, but the impacts on existing markets are unclear. Furthermore, there is a growing public concern on the sustainability of biomass. This study proposes a methodological framework for mapping

  8. The use and effectiveness of information system development methodologies in health information systems / Pieter Wynand Conradie.

    OpenAIRE

    Conradie, Pieter Wynand

    2010-01-01

    Abstract The main focus of this study is the identification of factors influencing the use and effectiveness of information system development methodologies (i.e., systems development methodologies) in health information systems. In essence, it can be viewed as exploratory research, utilizing a conceptual research model to investigate the relationships among the hypothesised factors. More specifically, classified as behavioural science, it combines two theoretical models, namely...

  9. On-Line Maintenance Methodology Development

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyo Won; Kim, Jae Ho; Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2012-05-15

    Most domestic maintenance activities for nuclear power plants are performed during overhaul. On-Line Maintenance (OLM) is therefore one of the appropriate risk-informed application techniques for spreading the maintenance burden accumulated during overhaul while the safety of the plant is secured. NUMARC 93-01 (Rev. 3) presents the OLM state of the art and provides a methodology. This study adopts NUMARC 93-01 (Rev. 3) and presents OLM. The reference component is the Emergency Diesel Generator (EDG) of Ulchin 3, 4

  10. Locally Simple Models Construction: Methodology and Practice

    Directory of Open Access Journals (Sweden)

    I. A. Kazakov

    2017-12-01

    Full Text Available One of the most notable trends associated with the Fourth Industrial Revolution is a significant strengthening of the role played by semantic methods. They are engaged in artificial intelligence, knowledge mining in huge flows of big data, robotization, and the internet of things. Smart contracts can also be mentioned here, although the 'intelligence' of smart contracts still needs to be seriously elaborated. These trends should inevitably lead to an increased role for logical methods working with semantics, and significantly expand the scope of their application in practice. However, there are a number of problems that hinder this process. We are developing an approach which makes the application of logical modeling efficient in some important areas. The approach is based on the concept of locally simple models and is primarily focused on solving tasks in the management of enterprises, organizations and governing bodies. The most important feature of locally simple models is their ability to replace software systems. Replacement of programming by modeling gives huge advantages; for instance, it dramatically reduces development and support costs. Modeling, unlike programming, preserves the explicit semantics of models, allowing integration with artificial intelligence and robots. In addition, models are much more understandable to ordinary people than programs. In this paper we propose an implementation of the concept of locally simple modeling on the basis of the so-called document models which we have developed earlier. It is shown that locally simple modeling is realized through document models with finite submodel coverages. In the second part of the paper, an example of using document models to solve a management problem of real complexity is demonstrated.

  11. Development of a new methodology for quantifying nuclear safety culture

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of). Dept. of Nuclear Engineering

    2017-01-15

    The present study developed a Safety Culture Impact Assessment Model (SCIAM), which consists of a safety culture assessment methodology and a safety culture impact quantification methodology. The SCIAM uses a safety culture impact index (SCII) to monitor the status of the safety culture of NPPs periodically, and it uses relative core damage frequency (RCDF) to present the impact of safety culture on the safety of NPPs. As a result of applying the SCIAM to the reference plant (Kori 3), a standard for the healthy safety culture of the reference plant is suggested. SCIAM might contribute to improving the safety of nuclear power plants (NPPs) by monitoring the status of safety culture periodically and presenting the standard of a healthy safety culture.

  12. Development of a new methodology for quantifying nuclear safety culture

    International Nuclear Information System (INIS)

    Han, Kiyoon; Jae, Moosung

    2017-01-01

    The present study developed a Safety Culture Impact Assessment Model (SCIAM), which consists of a safety culture assessment methodology and a safety culture impact quantification methodology. The SCIAM uses a safety culture impact index (SCII) to monitor the status of the safety culture of NPPs periodically, and it uses relative core damage frequency (RCDF) to present the impact of safety culture on the safety of NPPs. As a result of applying the SCIAM to the reference plant (Kori 3), a standard for the healthy safety culture of the reference plant is suggested. SCIAM might contribute to improving the safety of nuclear power plants (NPPs) by monitoring the status of safety culture periodically and presenting the standard of a healthy safety culture.

  13. Model development for mechanical properties and weld quality class of friction stir welding using multi-objective Taguchi method and response surface methodology

    International Nuclear Information System (INIS)

    Mohamed, Mohamed Ackiel; Manurung, Yupiter HP; Berhan, Mohamed Nor

    2015-01-01

    This study presents the effect of the governing parameters in friction stir welding (FSW) on the mechanical properties and weld quality of a 6 mm thick 6061-T651 aluminum alloy butt joint. The main FSW parameters, the rotational and traverse speeds, were optimized based on multiple mechanical properties and quality features, focusing on the tensile strength, hardness and weld quality class, using the multi-objective Taguchi method (MTM). The multi signal-to-noise ratio (MSNR) was employed to determine the optimum welding parameters for MTM, while further analysis concerning significance level determination was accomplished via the well-established analysis of variance (ANOVA). Furthermore, a first-order model for predicting the mechanical properties and weld quality class is derived by applying response surface methodology (RSM). Based on the experimental confirmation test, the proposed method can effectively estimate the mechanical properties and weld quality class, which can be used to enhance welding performance in FSW or other applications.

  14. Model development for mechanical properties and weld quality class of friction stir welding using multi-objective Taguchi method and response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, Mohamed Ackiel [University Kuala Lumpur Malaysia France Institute, Bandar Baru Bangi (Malaysia); Manurung, Yupiter HP; Berhan, Mohamed Nor [Universiti Teknologi MARA, Shah Alam (Malaysia)

    2015-06-15

    This study presents the effect of the governing parameters in friction stir welding (FSW) on the mechanical properties and weld quality of a 6 mm thick 6061-T651 aluminum alloy butt joint. The main FSW parameters, the rotational and traverse speeds, were optimized based on multiple mechanical properties and quality features, focusing on the tensile strength, hardness and weld quality class, using the multi-objective Taguchi method (MTM). The multi signal-to-noise ratio (MSNR) was employed to determine the optimum welding parameters for MTM, while further analysis concerning significance level determination was accomplished via the well-established analysis of variance (ANOVA). Furthermore, a first-order model for predicting the mechanical properties and weld quality class is derived by applying response surface methodology (RSM). Based on the experimental confirmation test, the proposed method can effectively estimate the mechanical properties and weld quality class, which can be used to enhance welding performance in FSW or other applications.
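
    The two numerical building blocks named in the abstracts above, a Taguchi signal-to-noise ratio and a first-order response-surface fit, are straightforward to sketch. The data below are invented for illustration; they are not the study's FSW measurements.

```python
import numpy as np

# Minimal sketch of (i) a larger-the-better signal-to-noise ratio as used in
# the Taguchi method, and (ii) a first-order response-surface model fitted by
# least squares to rotational speed (x1) and traverse speed (x2).

def sn_larger_better(y):
    """Taguchi S/N ratio (dB) for a larger-the-better response."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical replicated tensile-strength measurements for 4 runs (MPa).
runs = np.array([[1000, 4], [1000, 6], [1400, 4], [1400, 6]])  # rpm, mm/s
strength = [[205, 210], [198, 202], [221, 224], [215, 218]]
sn = np.array([sn_larger_better(y) for y in strength])

# First-order RSM: y = b0 + b1*x1 + b2*x2, fitted to the mean responses.
y_mean = np.mean(strength, axis=1)
A = np.column_stack([np.ones(len(runs)), runs])
b, *_ = np.linalg.lstsq(A, y_mean, rcond=None)
print(np.round(sn, 2))     # pick the run with the highest S/N ratio
print(np.round(b, 4))      # b0, effect of rpm, effect of traverse speed
```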

  15. Development of theoretical methodology for large molecules

    International Nuclear Information System (INIS)

    Maggiora, G.M.; Christoffersen, R.E.; Yoffe, J.A.; Petke, J.D.

    1981-01-01

    A major advantage of the use of floating spherical Gaussian orbitals (FSGO) is the extreme rapidity with which the necessary quantum mechanical integrals can be evaluated. This advantage has been exploited in several quantum mechanical procedures for molecular electronic structure calculations, as described below. Several other properties of these functions have also been exploited, and have led to the development of semiclassical point-charge and harmonic-oscillator models capable of describing first- and second-order electromagnetic properties and intermolecular forces with reasonable accuracy in all cases, and with considerably better accuracy than much more elaborate theoretical procedures in some cases. These applications are also described below. The primary intent of the current paper is to present an overview of some of the uses of FSGOs in the study of molecular electronic structure and properties and to indicate possible directions for future applications. No attempt will be made to include all possible applications. Rather, those applications of interest to the authors have been stressed. Hopefully, this paper will further stimulate the development of additional uses of these remarkable functions
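
    The 'extreme rapidity' of FSGO integral evaluation comes from closed-form expressions such as the overlap of two normalized s-type Gaussians, obtained from the Gaussian product theorem. A minimal sketch follows; the exponents and centers are illustrative.

```python
import numpy as np

# Minimal sketch of why FSGO integrals are so fast to evaluate: the overlap
# of two normalized floating spherical Gaussians has a simple closed form
# via the Gaussian product theorem.

def fsgo_overlap(alpha, A, beta, B):
    """Overlap <g_A|g_B> of normalized s-type Gaussians ~ exp(-a|r-A|^2)."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    r2 = np.sum((A - B) ** 2)
    pref = (2.0 * np.sqrt(alpha * beta) / (alpha + beta)) ** 1.5
    return pref * np.exp(-alpha * beta / (alpha + beta) * r2)

print(fsgo_overlap(0.5, [0, 0, 0], 0.5, [0, 0, 0]))    # 1.0 (same orbital)
print(fsgo_overlap(0.5, [0, 0, 0], 0.5, [0, 0, 1.4]))  # decays with distance
```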

  16. The SIMRAND methodology - Simulation of Research and Development Projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
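
    The simulation phase of a SIMRAND-style analysis can be sketched as Monte Carlo sampling of uncertain task outcomes for each candidate task set, followed by ranking of the alternatives on the resulting distributions. All task parameters below are invented.

```python
import numpy as np

# Minimal sketch of a SIMRAND-style simulation phase: compare two candidate
# task sets by Monte Carlo sampling of uncertain task costs and rank the
# alternatives (here, lower total project cost is better). Numbers invented.

rng = np.random.default_rng(7)
n = 50_000

def simulate(task_costs):
    """Sample total cost for tasks given (low, mode, high) estimates."""
    total = np.zeros(n)
    for low, mode, high in task_costs:
        total += rng.triangular(low, mode, high, size=n)
    return total

alt_A = simulate([(1, 2, 6), (2, 3, 9)])     # riskier, cheaper tasks
alt_B = simulate([(2, 4, 5), (3, 5, 6)])     # safer, costlier tasks
print(f"E[cost] A = {alt_A.mean():.2f}, B = {alt_B.mean():.2f}")
print(f"P(A cheaper than B) = {np.mean(alt_A < alt_B):.3f}")
```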

  17. Status of Methodology Development for the Evaluation of Proliferation Resistance

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Ko, Won Il; Lee, Jung Won

    2010-01-01

    is inherently qualitative and it is difficult to quantify the evaluation result. Therefore, the new evaluation model needs to develop a methodology for quantifying the evaluation results with credibility

  18. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    Oliveira, Laura Cristina de

    2003-01-01

    In the present work the viability of using neutron activation analysis (NAA) to perform clinical analyses of urine and blood was assessed. The aim of this study is to investigate the biological behavior of animals that have been fed chow doped with natural uranium for a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentrations in biological samples. The quantitative results for urine sediment using NAA were compared with conventional clinical analysis and the results were compatible. This methodology was also used on bone and body organs such as liver and muscle to help the interpretation of possible anomalies. (author)
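
    The 'absolute method' referred to above determines element mass directly from the standard activation equation rather than from a comparator standard. A minimal sketch follows; the counting parameters are placeholders, not values from the study (the nuclear data are textbook values for neutron capture on U-238).

```python
import numpy as np

# Minimal sketch of the absolute method in NAA: invert the standard
# activation equation to get element mass from a gamma count, with no
# comparator standard. All counting parameters are placeholders.

NA = 6.022e23  # Avogadro's number

def element_mass(counts, eff, gamma_yield, t_count,
                 sigma_cm2, flux, lam, t_irr, t_decay, M, theta):
    """Invert A = (m/M)*NA*theta*sigma*flux*(1-exp(-lam*t_irr))*exp(-lam*t_d)."""
    # Decays per second, ignoring decay during the counting interval.
    activity = counts / (eff * gamma_yield * t_count)
    saturation = 1.0 - np.exp(-lam * t_irr)
    decay = np.exp(-lam * t_decay)
    return activity * M / (NA * theta * sigma_cm2 * flux * saturation * decay)

# Hypothetical uranium determination via U-238(n,gamma)U-239
# (half-life 23.45 min, thermal capture cross section ~2.7 b).
m = element_mass(counts=1.2e4, eff=0.05, gamma_yield=0.9, t_count=600,
                 sigma_cm2=2.7e-24, flux=1.0e13,
                 lam=np.log(2) / (23.45 * 60),
                 t_irr=300, t_decay=120, M=238.0, theta=0.9927)
print(f"estimated uranium mass: {m:.3e} g")
```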

  19. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems, increasing an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard for uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample sizes, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results to make informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness at lower cost.
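
    The shift from a single worst-case sum to a timeline distribution can be illustrated with a small Monte Carlo over sequential task times. Lognormal task models below stand in for the methodology's Bayesian posteriors; all parameters are hypothetical.

```python
import numpy as np

# Minimal sketch of producing a delay-time distribution for a multi-task
# adversary path instead of a single worst-case sum. Lognormal task models
# stand in for Bayesian posterior distributions; parameters are hypothetical.

rng = np.random.default_rng(42)
# (median seconds, dispersion) for three sequential barrier tasks.
tasks = [(60.0, 0.35), (120.0, 0.50), (45.0, 0.25)]
n = 100_000
total = np.zeros(n)
for median, sigma in tasks:
    total += rng.lognormal(mean=np.log(median), sigma=sigma, size=n)

# Risk-relevant summaries of the total delay timeline.
p5, p50 = np.percentile(total, [5, 50])
print(f"5th percentile: {p5:.0f} s, median: {p50:.0f} s")
print(f"P(total delay < 180 s) = {np.mean(total < 180.0):.3f}")
```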

  20. Modeling, methodologies and tools for molecular and nano-scale communications modeling, methodologies and tools

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront of their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  1. Efficient Modelling Methodology for Reconfigurable Underwater Robots

    DEFF Research Database (Denmark)

    Nielsen, Mikkel Cornelius; Blanke, Mogens; Schjølberg, Ingrid

    2016-01-01

    This paper considers the challenge of applying reconfigurable robots in an underwater environment. The main result presented is the development of a model for a system comprised of N, possibly heterogeneous, robots dynamically connected to each other and moving with 6 Degrees of Freedom (DOF). Th...

  2. Development of Testing Methodologies for the Mechanical Properties of MEMS

    Science.gov (United States)

    Ekwaro-Osire, Stephen

    2003-01-01

    This effort is to investigate and design testing strategies to determine the mechanical properties of MicroElectroMechanical Systems (MEMS), as well as to investigate the development of a MEMS Probabilistic Design Methodology (PDM). One item of potential interest is the design of a test for the Weibull size effect in pressure membranes. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. However, the primary area of investigation will most likely be analysis and modeling of material interfaces for strength, as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. This will be a continuation of the previous year's work. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads.
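
    The Weibull size effect that the proposed membrane test would target follows from weakest-link statistics: for a fixed failure probability, strength scales with stressed area as (A0/A)^(1/m). A minimal sketch with illustrative (non-MEMS) numbers:

```python
import numpy as np

# Minimal sketch of the Weibull size effect: under weakest-link statistics
# the characteristic strength falls as the stressed area grows. Values are
# illustrative, not MEMS data.

def failure_prob(sigma, area, sigma0, m, area0=1.0):
    """Two-parameter Weibull weakest-link failure probability."""
    return 1.0 - np.exp(-(area / area0) * (sigma / sigma0) ** m)

m, sigma0 = 10.0, 1.0          # Weibull modulus and scale (normalized)
for scale in [1.0, 10.0, 100.0]:
    # Stress at which a membrane of this area fails with 63.2% probability.
    sigma_c = sigma0 * scale ** (-1.0 / m)
    print(f"area x{scale:>5.0f}: characteristic strength {sigma_c:.3f},",
          f"Pf check = {failure_prob(sigma_c, scale, sigma0, m):.3f}")
```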

  3. An experimental methodology for a fuzzy set preference model

    Science.gov (United States)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    A flexible fuzzy set preference model first requires approximate methodologies for implementation. Fuzzy sets must be defined for each individual consumer using computer software, requiring a minimum of time and expertise on the part of the consumer. The amount of information needed in defining sets must also be established. The model itself must adapt fully to the subject's choice of attributes (vague or precise), attribute levels, and importance weights. The resulting individual-level model should be fully adapted to each consumer. The methodologies needed to develop this model will be equally useful in a new generation of intelligent systems which interact with ordinary consumers, controlling electronic devices through fuzzy expert systems or making recommendations based on a variety of inputs. The power of personal computers and their acceptance by consumers has yet to be fully utilized to create interactive knowledge systems that fully adapt their function to the user. Understanding individual consumer preferences is critical to the design of new products and the estimation of demand (market share) for existing products, which in turn is an input to management systems concerned with production and distribution. The question of what to make, for whom to make it and how much to make requires an understanding of the customer's preferences and the trade-offs that exist between alternatives. Conjoint analysis is a widely used methodology which decomposes an overall preference for an object into a combination of preferences for its constituent parts (attributes such as taste and price), which are combined using an appropriate combination function. Preferences are often expressed using linguistic terms which cannot be represented in conjoint models. Current models are also not implemented at an individual level, making it difficult to reach meaningful conclusions about the cause of an individual's behavior from an aggregate model. The combination of complex aggregate
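
    The combination of conjoint-style additive part-worths with linguistic attribute levels can be sketched with triangular fuzzy memberships. Everything below (attributes, linguistic terms, weights) is invented for illustration and is not the authors' model.

```python
# Minimal sketch of an additive conjoint model whose attribute levels may be
# linguistic, represented as triangular fuzzy numbers. All values invented.

def tri(x, a, b, c):
    """Triangular fuzzy membership of x in (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic price terms defined on a 0-10 price scale.
price_terms = {"cheap": (0.0, 0.01, 4.0), "moderate": (2.0, 5.0, 8.0),
               "expensive": (6.0, 10.0, 10.01)}
part_worth = {"cheap": 0.9, "moderate": 0.5, "expensive": 0.1}
w_price, w_taste = 0.6, 0.4          # importance weights

def preference(price, taste_score):
    # Fuzzify the crisp price into linguistic terms, then combine the
    # matched part-worths additively with the taste part-worth.
    u_price = sum(tri(price, *abc) * part_worth[t]
                  for t, abc in price_terms.items())
    return w_price * u_price + w_taste * taste_score

print(round(preference(3.0, 0.7), 3))   # mid-priced, fairly tasty option
print(round(preference(9.0, 0.9), 3))   # expensive but very tasty option
```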

  4. Mixed-mode modelling mixing methodologies for organisational intervention

    CERN Document Server

    Clarke, Steve; Lehaney, Brian

    2001-01-01

    The 1980s and 1990s have seen a growing interest in research and practice in the use of methodologies within problem contexts characterised by a primary focus on technology, human issues, or power. During the last five to ten years, this has given rise to challenges regarding the ability of a single methodology to address all such contexts, and the consequent development of approaches which aim to mix methodologies within a single problem situation. This has been particularly so where the situation has called for a mix of technological (the so-called 'hard') and human-centred (so-called 'soft') methods. The approach developed has been termed mixed-mode modelling. The area of mixed-mode modelling is relatively new, the phrase having been coined approximately four years ago by Brian Lehaney in a keynote paper published at the 1996 Annual Conference of the UK Operational Research Society. Mixed-mode modelling, as suggested above, is a new way of considering problem situations faced by organisations. Traditional...

  5. Methodological Grounds of Managing Innovation Development of Restaurants

    Directory of Open Access Journals (Sweden)

    Naidiuk V. S.

    2013-12-01

    Full Text Available The goal of the article lies in the identification and further development of the methodological grounds of managing the innovation development of restaurants. Based on a critical analysis of existing scientific views on the interpretation of the essence of the "managing innovation development of an enterprise" notion, the article clarifies this definition. As a result of the study, the article builds up a cause-effect diagram for solving the problem of ensuring efficient management of the innovation development of a restaurant. The article develops a conceptual scheme for the development and realisation of a strategy of innovation development in a restaurant. It experimentally confirms the hypothesis that there is a very strong feedback relationship between resistance to innovation changes and the share of qualified personnel capable of permanent development (learning and generation of new ideas) in restaurants, and builds a model of the dependency between them. Prospects for further studies in this direction could be scientific studies directed at the development of methodical approaches to identifying the level of innovation potential and assessing the efficiency of managing the innovation development of different (by type, class, size, etc.) restaurants. The obtained data could also be used for the development of new, or the improvement of existing, tools of strategic management of innovation development at the micro-level.

  6. Reflood completion report: Volume 1. A phenomenological thermal-hydraulic model of hot rod bundles experiencing simultaneous bottom and top quenching and an optimization methodology for closure development

    International Nuclear Information System (INIS)

    Nelson, R.A. Jr.; Pimentel, D.A.; Jolly-Woodruff, S.; Spore, J.

    1998-04-01

    In this report, a phenomenological model of simultaneous bottom-up and top-down quenching is developed and discussed. The model was implemented in the TRAC-PF1/MOD2 computer code. Two sets of closure relationships were compared within the study, the Absolute set and the Conditional set. The Absolute set of correlations is frequently viewed as the pure set because the correlations utilize their original coefficients as suggested by the developer. The Conditional set is a modified set of correlations with changes to the correlation coefficients only. Results for these two sets are quite similar. This report also summarizes initial results of an effort to investigate nonlinear optimization techniques applied to closure model development. Results suggest that such techniques can provide advantages for future model development work, but that extensive expertise is required to utilize them (i.e., the model developer must fully understand both the physics of the process being represented and the computational techniques being employed). The computer may then be used to improve the correlation of computational results with experiments
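
    The nonlinear-optimization idea, calibrating closure-relationship coefficients against experimental data, can be sketched with a generic nonlinear least-squares fit. The correlation form and 'data' below are placeholders, not the TRAC-PF1/MOD2 closure relations.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal sketch of calibrating the coefficients of a closure relationship
# against experimental data by nonlinear least squares. The correlation form
# and data are invented placeholders.

def heat_transfer_corr(Re, C, m):
    """Dittus-Boelter-like form Nu = C * Re^m (Pr dependence folded into C)."""
    return C * Re ** m

Re = np.logspace(4, 5, 20)
rng = np.random.default_rng(5)
Nu_exp = 0.023 * Re ** 0.8 * (1 + rng.normal(0, 0.03, Re.size))  # 'data'

popt, pcov = curve_fit(heat_transfer_corr, Re, Nu_exp, p0=[0.02, 0.8])
perr = np.sqrt(np.diag(pcov))
print(f"C = {popt[0]:.4f} +/- {perr[0]:.4f}, m = {popt[1]:.3f} +/- {perr[1]:.3f}")
```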

  7. Safety-related operator actions: methodology for developing criteria

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Gray, L.H.; Beare, A.N.; Barks, D.B.; Gomer, F.E.

    1984-03-01

    This report presents a methodology for developing criteria for design evaluation of safety-related actions by nuclear power plant reactor operators, and identifies a supporting data base. It is the eleventh and final NUREG/CR Report on the Safety-Related Operator Actions Program, conducted by Oak Ridge National Laboratory for the US Nuclear Regulatory Commission. The operator performance data were developed from training simulator experiments involving operator responses to simulated scenarios of plant disturbances; from field data on events with similar scenarios; and from task analytic data. A conceptual model to integrate the data was developed and a computer simulation of the model was run, using the SAINT modeling language. Proposed is a quantitative predictive model of operator performance, the Operator Personnel Performance Simulation (OPPS) Model, driven by task requirements, information presentation, and system dynamics. The model output, a probability distribution of predicted time to correctly complete safety-related operator actions, provides data for objective evaluation of quantitative design criteria

  8. Reference Management Methodologies for Large Structural Models at Kennedy Space Center

    Science.gov (United States)

    Jones, Corey; Bingham, Ryan; Schmidt, Rick

    2011-01-01

    There have been many challenges associated with modeling some of NASA KSC's largest structures. Given the size of the welded structures here at KSC, it was critically important to properly organize model structure and carefully manage references. Additionally, because of the amount of hardware to be installed on these structures, it was very important to have a means to coordinate between different design teams and organizations, check for interferences, produce consistent drawings, and allow for simple release processes. Facing these challenges, the modeling team developed a unique reference management methodology and model fidelity methodology. This presentation will describe the techniques and methodologies that were developed for these projects. The attendees will learn about KSC's reference management and model fidelity methodologies for large structures. The attendees will understand the goals of these methodologies. The attendees will appreciate the advantages of developing a reference management methodology.

  9. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Full Text Available Today, organizations are exposed to a huge diversity of information and information assets, produced in different systems such as KMS, financial and accounting systems, office and industrial automation systems and so on, and protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. The several benefits of this model mean that organizations show a strong tendency toward implementing cloud computing. Maintaining and managing information security is the main challenge in developing and accepting this model. In this paper, at first, following the "design science research methodology" and compatible with the "design process in information systems research", a complete categorization of organizational assets, including 355 different types of information assets in 7 groups and 3 levels, is presented so that managers are able to plan corresponding security controls according to the importance of each group. Then, to guide organizations in architecting their information security in a cloud computing environment, an appropriate methodology is presented. The presented cloud computing security architecture, the resulting proposed methodology, and the presented classification model are discussed and verified according to the Delphi method and expert comments.

  10. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component of software evolution. It requires a huge amount of time, manpower, and financial resources. The challenges are the size, seniority and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  11. A component-based groupware development methodology

    NARCIS (Netherlands)

    Guareis de farias, Cléver; Ferreira Pires, Luis; van Sinderen, Marten J.

    2000-01-01

    Software development in general and groupware applications in particular can greatly benefit from the reusability and interoperability aspects associated with software components. Component-based software development enables the construction of software artefacts by assembling prefabricated,

  12. Bioclim Deliverable D8b: development of the physical/statistical down-scaling methodology and application to climate model Climber for BIOCLIM Work-package 3

    International Nuclear Information System (INIS)

    2003-01-01

    too coarse and simplified. This is why we first need to find these 'physically based' relations between large-scale model outputs and regional-scale predictors. This is a solution to the specific problem of down-scaling from an intermediate-complexity model such as CLIMBER. There are several other types of down-scaling methodologies, such as the dynamical and rule-based methods presented in other BIOCLIM deliverables. A specificity of the present method is the attempt to use physical considerations in the down-scaling while a detailed 'dynamical' approach is out of reach, because CLIMBER mainly provides the average climate. By contrast, an input of time-variability at various scales is necessary for a more dynamical approach. This report is organised as follows: Section 2 relates to the design and validation of the method, while Section 3 reports the application to BIOCLIM simulations. We first present the data sources employed, which are the model results and the observed climatology. We then present the principles of the down-scaling method, the formulation of the predictors and the calibration of the statistical model, including results for the last glacial maximum. In Section 3, the results are first presented as time series for each site, then as maps at specific times, or snapshots
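
    The calibration step of such a physical/statistical down-scaling scheme amounts to fitting a transfer function from large-scale predictors to a site-scale predictand. A minimal regression sketch on synthetic data (not CLIMBER output) follows.

```python
import numpy as np

# Minimal sketch of calibrating a statistical down-scaling model: regress a
# site-scale predictand (local temperature) on large-scale predictors from
# coarse model output. Data are synthetic stand-ins.

rng = np.random.default_rng(1)
n = 300
# Large-scale predictors: grid-cell temperature, humidity, elevation term.
Xl = np.column_stack([rng.normal(10, 3, n), rng.normal(0.6, 0.1, n),
                      rng.normal(200, 50, n)])
# 'Observed' local temperature with a lapse-rate-like elevation effect.
y = 0.9 * Xl[:, 0] + 4.0 * Xl[:, 1] - 0.0065 * Xl[:, 2] \
    + rng.normal(0, 0.3, n)

A = np.column_stack([np.ones(n), Xl])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # calibration
y_hat = A @ coef                                 # down-scaled estimate
print(np.round(coef, 4))                         # intercept + transfer terms
print(f"explained variance: {1 - np.var(y - y_hat) / np.var(y):.3f}")
```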

  13. Development of intelligent model for personalized guidance on wheelchair tilt and recline usage for people with spinal cord injury: methodology and preliminary report.

    Science.gov (United States)

    Fu, Jicheng; Jones, Maria; Jan, Yih-Kuen

    2014-01-01

    Wheelchair tilt and recline functions are two of the most desirable features for relieving seating pressure to decrease the risk of pressure ulcers. Effective guidance on wheelchair tilt and recline usage is therefore critical to pressure ulcer prevention. The aim of this study was to demonstrate the feasibility of using machine learning techniques to construct an intelligent model to provide personalized guidance to individuals with spinal cord injury (SCI). The motivation stems from the clinical evidence that the requirements of individuals vary greatly and that no universal guidance on tilt and recline usage could possibly satisfy all individuals with SCI. We explored all aspects involved in constructing the intelligent model and proposed approaches tailored to suit the characteristics of this preliminary study, such as the way of modeling research participants, the use of machine learning techniques to construct the intelligent model, and the evaluation of its performance. We further improved the intelligent model's prediction accuracy by developing a two-phase feature selection algorithm to identify important attributes. Experimental results demonstrated that our approaches hold promise: they could effectively construct the intelligent model, evaluate its performance, and refine the participant model so that the intelligent model's prediction accuracy was significantly improved.
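
    The paper's own two-phase feature selection algorithm is not reproduced above, so the sketch below shows a generic two-phase scheme of the same flavor: a correlation filter followed by a greedy wrapper around a simple classifier. Data, classifier, and thresholds are synthetic assumptions.

```python
import numpy as np

# Minimal sketch of a generic two-phase feature selection scheme: phase 1
# filters attributes by correlation with the label, phase 2 greedily keeps a
# feature only if it improves a simple classifier's held-out accuracy.

rng = np.random.default_rng(3)
n, p = 120, 8
X = rng.normal(size=(n, p))
y = (X[:, 1] + 0.8 * X[:, 5] + rng.normal(0, 0.5, n) > 0).astype(int)

def accuracy(feats):
    """Split-half accuracy of a nearest-centroid classifier on feats."""
    tr, te = slice(0, 60), slice(60, None)
    c0 = X[tr][y[tr] == 0][:, feats].mean(axis=0)
    c1 = X[tr][y[tr] == 1][:, feats].mean(axis=0)
    d0 = ((X[te][:, feats] - c0) ** 2).sum(axis=1)
    d1 = ((X[te][:, feats] - c1) ** 2).sum(axis=1)
    return np.mean((d1 < d0).astype(int) == y[te])

# Phase 1: rank by absolute correlation with the label, keep the top half.
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(p)])
candidates = list(np.argsort(corr)[::-1][: p // 2])

# Phase 2: greedy wrapper over the filtered candidates.
selected, best = [], 0.0
for j in candidates:
    acc = accuracy(selected + [j])
    if acc > best:
        selected, best = selected + [j], acc
print(selected, round(best, 3))   # informative attributes survive
```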

  14. The Typology of Methodological Approaches to Development of Innovative Clusters

    Directory of Open Access Journals (Sweden)

    Farat Olexandra V.

    2017-06-01

    Full Text Available The aim of the article is to study the existing methodological approaches to assessing the development of enterprises, in order to further substantiate the possibility of their use by cluster associations. As a result of the research, based on an analysis of the scientific literature, the most applicable methodological approaches to assessing the development of enterprises are characterized. Eight methodical approaches to assessing the level of development of enterprises and four methodological approaches to assessing the level of development of clusters are singled out. Each of the approaches is characterized by certain advantages and disadvantages, but none of them makes it possible to obtain a systematic assessment of all areas of cluster functioning, identify possible reserves for cluster competitiveness growth, or characterize possible strategies for future development. Taking into account the peculiarities of the functioning and development of cluster associations of enterprises, we propose our own methodological approach for assessing the development of innovative cluster structures.

  15. Development of enterprise architecture management methodology for teaching purposes

    Directory of Open Access Journals (Sweden)

    Dmitry V. Kudryavtsev

    2017-01-01

    Full Text Available Enterprise architecture is considered both as a certain object of management, providing in business a general view of the enterprise and the mutual alignment of the parts of this enterprise into a single whole, and as the discipline that arose on the basis of this object. The architectural approach to the modeling and design of the enterprise originally arose in the field of information technology and was used to design information systems and technical infrastructure, as well as to formalize business requirements. Since the early 2000s, enterprise architecture has been increasingly used in organizational development and business transformation projects, especially when information technologies are involved. Enterprise architecture allows describing, analyzing and designing the company from the point of view of its structure, functioning and goal setting (motivation). In the context of this approach, the enterprise is viewed as a system of services, processes, goals and performance indicators, organizational units, information systems, data, technical facilities, etc. Enterprise architecture implements the idea of a systematic approach to managing and changing organizations in the digital economy, where business is strongly dependent on information technologies. This increases the relevance of the suggested approach at the present time, when companies need to create and successfully implement a digital business strategy. Teaching enterprise architecture in higher educational institutions is a difficult task due to the interdisciplinarity of this subject, its generalized nature and close connection with practical experience. In addition, modern enterprise architecture management methodologies are complex for students and contain many details that are relevant only for individual situations. The paper proposes a simplified methodology for enterprise architecture management which, on the one hand, will be comprehensible to students, and, on the other hand, will allow students to apply

  16. Development of proliferation resistance assessment methodology based on international standard

    International Nuclear Information System (INIS)

    Ko, W. I.; Chang, H. L.; Lee, Y. D.; Lee, J. W.; Park, J. H.; Kim, Y. I.; Ryu, J. S.; Ko, H. S.; Lee, K. W.

    2012-04-01

    Nonproliferation is one of the main requirements to be satisfied by the advanced future nuclear energy systems that have been developed in the Generation IV and INPRO studies. Methodologies to evaluate proliferation resistance (PR) have been developed since the 1980s; however, systematic evaluation approaches date from around 2000. Domestically, a study to develop a national method to evaluate the proliferation resistance of advanced future nuclear energy systems started in 2007 as one of the long-term nuclear R and D subjects, in order to promote export and the international credibility and transparency of national nuclear energy systems and the nuclear fuel cycle technology development program. In the first phase (2007-2010), development and improvement of intrinsic evaluation parameters for the evaluation of proliferation resistance, quantification of evaluation parameters, development of evaluation models, and development of permissible ranges of evaluation parameters were carried out. In the second phase (2010-2012), a generic principle for evaluating PR was established, and technical guidelines, a nuclear material diversion pathway analysis method, and a method to integrate evaluation parameters were developed, which were applied to 5 alternative nuclear fuel cycles to estimate their applicability and objectivity. In addition, measures to enhance the PR of advanced future nuclear energy systems and technical guidelines for PR assessment using intrinsic PR evaluation parameters were developed. Lastly, regulatory requirements were developed to secure the nonproliferation of nuclear energy systems from the early design stage through operation to decommissioning, which will support the export of newly developed advanced future nuclear energy systems

  17. Cooperative learning as a methodology for inclusive education development

    Directory of Open Access Journals (Sweden)

    Yolanda Muñoz Martínez

    2017-06-01

    Full Text Available This paper presents the methodology of cooperative learning as a strategy to develop the principles of inclusive education. It has a very practical orientation, with the intention of providing tools for teachers who want to implement this methodology in the classroom, starting with a theoretical review, followed by a description of a case in which this methodology has been used for 5 years. We describe specific activities and ways of working with students, and finally reach conclusions on the implementation of the methodology.

  18. A methodology for overall consequence modeling in chemical industry

    International Nuclear Information System (INIS)

    Arunraj, N.S.; Maiti, J.

    2009-01-01

    Risk assessment in the chemical process industry is a very important issue for safeguarding humans and the ecosystem from damage. Consequence assessment is an integral part of risk assessment. However, the commonly used consequence estimation methods involve either time-consuming complex mathematical models or simple summation of losses without considering all the consequence factors. This leads to deterioration in the quality of the estimated risk value. Consequence modeling therefore has to be performed in detail, considering all major losses, in an optimal time, to improve the decision value of the risk estimate. The losses can be broadly categorized into production loss, asset loss, human health and safety loss, and environmental loss. In this paper, a conceptual framework is developed to assess the overall consequence considering all the important components of the major losses. Secondly, a methodology is developed for the calculation of all the major losses, which are normalized to yield the overall consequence. Finally, as an illustration, the proposed methodology is applied to a case study plant involving benzene extraction. The case study results using the proposed consequence assessment scheme are compared with those from existing methodologies.
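
    The normalization-and-aggregation step can be sketched directly: estimate each major loss, map it onto a common 0-1 scale, and combine with weights into an overall consequence index. The values below are purely illustrative, not from the benzene case study.

```python
# Minimal sketch of normalizing major loss categories to a common scale and
# aggregating them into an overall consequence index. Category values,
# scales and weights are purely illustrative.

losses = {                      # raw loss estimates for one scenario
    "production": 2.0e5,        # currency units
    "assets": 5.0e4,
    "human_health_safety": 0.8, # scenario-specific severity score
    "environment": 0.3,
}
scales = {"production": 1.0e6, "assets": 5.0e5,
          "human_health_safety": 1.0, "environment": 1.0}
weights = {"production": 0.2, "assets": 0.1,
           "human_health_safety": 0.5, "environment": 0.2}

normalized = {k: min(v / scales[k], 1.0) for k, v in losses.items()}
overall = sum(weights[k] * normalized[k] for k in losses)
print({k: round(v, 3) for k, v in normalized.items()})
print(f"overall consequence index: {overall:.3f}")
```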

  19. A study on methodology of software development for HEP

    International Nuclear Information System (INIS)

    Ding Yuzheng; Dai Guiliang

    1999-01-01

    The HEP-related software system is a large one, comprising mainly detector simulation software, DAQ software and an offline system. The author discusses the advantages of applying object-oriented (OO) methodologies to such software systems, and gives the basic strategy for the use of OO methodologies, languages and tools in the development of HEP-related software.

  20. METHODOLOGICAL BASES OF PUBLIC ADMINISTRATION OF PUBLIC DEVELOPMENT IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Kyrylo Ohdanskyi

    2016-11-01

    Full Text Available The author examines the theoretical bases of the dynamics of community development. According to classic canons, a dynamic process at any level of the management hierarchy can be presented as a complex of changes in its ecological, economic and social components. At present, national policy in the field of community development does not take most theoretical works into account, which indicates that an effective mechanism for its adjustment has not yet been created in our country. The author therefore stresses the necessity of applying effective approaches to the government control of community development in the realities of modern Ukraine. As the subject of research, the author chose the analysis of the process of community development and the methodological bases for choosing options for managing this process. The system approach was chosen as the research methodology. The aim is the analysis of theoretical bases and the development of new approaches to the government administration of community development. The author divides the process of community development into its social, economic and ecological components, from which follows the objective necessity of developing new conceptual approaches to the toolkit for adjusting community development. To solve this task, the author suggests using the category “dynamics”, analyses different interpretations of the term, and offers his own interpretation in the context of community development. The author's research confirms that it is methodologically possible to form blocks of quantitative and qualitative factors from specific information of an ecological, economic and social character.

  1. Developing new methodology for nuclear power plants vulnerability assessment

    International Nuclear Information System (INIS)

    Kostadinov, Venceslav

    2011-01-01

    Research highlights: → Paper presents a new methodology for vulnerability assessment of nuclear power plants. → First universal quantitative risk assessment model for a terrorist attack on NPPs. → New model enhances security, reliability and safe operation of all energy infrastructure. → Significant research benefits: increased NPP security, reliability and availability. → Useful new tool for PRA application to the evaluation of terrorist threats on NPPs. - Abstract: The fundamental aim of an efficient regulatory emergency preparedness and response system is to provide sustained emergency readiness and to prevent emergency situations and accidents. But when an event occurs, the regulatory mission is to mitigate consequences and to protect people and the environment against nuclear and radiological damage. The regulatory emergency response system, which would be activated in the case of a nuclear and/or radiological emergency and a release of radioactivity to the environment, is an important element of a comprehensive national regulatory system of nuclear and radiation safety. In the past, national emergency systems did not explicitly include vulnerability assessments of the critical nuclear infrastructure as an important part of a comprehensive preparedness framework. But after the terrorist attacks of September 11, 2001, decision-makers became aware that critical nuclear infrastructure could also be an attractive target for terrorism, with the purpose of using the physical and radioactive properties of the nuclear material to cause mass casualties, property damage, and detrimental economic and/or environmental impacts. The need to evaluate critical nuclear infrastructure vulnerability to threats like human errors, terrorist attacks and natural disasters, as well as to prepare emergency response plans with estimation of optimized costs, is of vital importance for the assurance of safe nuclear facility operation and national security. Such a methodology is presented in this paper.

  2. Development of Engine Loads Methodology, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR seeks to improve the definition of design loads for rocket engine components such that higher performing, lighter weight engines can be developed more...

  3. MoPCoM Methodology: Focus on Models of Computation

    Science.gov (United States)

    Koudri, Ali; Champeau, Joël; Le Lann, Jean-Christophe; Leilde, Vincent

    Today, developments of Real Time Embedded Systems have to face new challenges. On the one hand, Time-To-Market constraints require a reliable development process allowing quick design space exploration. On the other hand, rapidly developing technology, as stated by Moore's law, requires techniques to handle the resulting productivity gap. In a previous paper, we have presented our Model Based Engineering methodology addressing those issues. In this paper, we make a focus on Models of Computation design and analysis. We illustrate our approach on a Cognitive Radio System development implemented on an FPGA. This work is part of the MoPCoM research project gathering academic and industrial organizations (http://www.mopcom.fr).

  4. 3D Urban Virtual Models generation methodology for smart cities

    Directory of Open Access Journals (Sweden)

    M. Álvarez

    2018-04-01

    Full Text Available Currently, the use of urban 3D models goes beyond merely providing three-dimensional imagery for the visualization of our urban surroundings. Three-dimensional urban models are in themselves fundamental tools for managing the different phenomena that occur in smart cities. It is therefore necessary to generate realistic models in which BIM building design information can be integrated with GIS and other spatial technologies. The generation of 3D urban models benefits from the volume of data provided by sensors based on the latest technologies, such as airborne sensors, and from the existence of international standards such as CityGML. This paper presents a methodology for the development of a three-dimensional urban model, based on LiDAR data and the CityGML standard, applied to the city of Lorca.
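    A common building block of such LiDAR-to-CityGML pipelines is the LOD1 (block model) step, in which a building footprint is extruded to a height derived from the LiDAR points. The sketch below illustrates that idea only; the function, the median height rule and the numbers are assumptions, not the paper's actual workflow.

```python
# Hypothetical sketch of the LOD1 step common in LiDAR-to-CityGML pipelines:
# a building footprint polygon is extruded to a prismatic solid using a
# height derived from LiDAR returns. Names and the height rule are assumptions.

from statistics import median

def lod1_extrusion(footprint, lidar_z, ground_z):
    """Return (footprint, height) for a LOD1 block model.

    footprint : list of (x, y) vertices of the building outline
    lidar_z   : z-values of LiDAR points falling inside the footprint
    ground_z  : terrain elevation at the building location
    """
    roof_z = median(lidar_z)          # robust against antenna/chimney outliers
    height = max(roof_z - ground_z, 0.0)
    return footprint, height

footprint = [(0, 0), (10, 0), (10, 8), (0, 8)]
lidar_z = [21.9, 22.1, 22.0, 25.3, 22.2]   # one outlier (chimney)
print(lod1_extrusion(footprint, lidar_z, ground_z=15.0))
```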

  5. A Review of Roads Data Development Methodologies

    Directory of Open Access Journals (Sweden)

    Taro Ubukawa

    2014-05-01

    Full Text Available There is a clear need for a public domain data set of road networks with high spatial accuracy and global coverage for a range of applications. The Global Roads Open Access Data Set (gROADS), version 1, is a first step in that direction. gROADS relies on data from a wide range of sources and was developed using a range of methods. Traditionally, map development was highly centralized and controlled by government agencies due to the high cost of the required expertise and technology. In the past decade, however, high-resolution satellite imagery and global positioning system (GPS) technologies have come into wide use, and there has been significant innovation in web services, such that a number of new methods to develop geospatial information have emerged, including automated and semi-automated road extraction from satellite/aerial imagery and crowdsourcing. In this paper we review the data sources, methods, and pros and cons of a range of road data development methods: heads-up digitizing, automated/semi-automated extraction from remote sensing imagery, GPS technology, crowdsourcing, and compiling existing data sets. We also consider the implications of each method for the production of open data.

  6. Methodology for economic evaluation of software development projects

    International Nuclear Information System (INIS)

    Witte, D.M.

    1990-01-01

    Many oil and gas exploration and production companies develop computer software in-house or with contract programmers to support their exploration activities. Software development projects compete for funding with exploration and development projects, though most companies lack valid measures for comparing the two types of projects. This paper presents a methodology of pro forma cash flow analysis for software development proposals intended for internal use. This methodology, based on estimates of development and support costs, exploration benefits, and the probability of successful development and implementation, can be used to compare proposed software development projects directly with competing exploration proposals.

  7. Development of Cost Estimation Methodology of Decommissioning for PWR

    International Nuclear Information System (INIS)

    Lee, Sang Il; Yoo, Yeon Jae; Lim, Yong Kyu; Chang, Hyeon Sik; Song, Geun Ho

    2013-01-01

    The permanent closure of a nuclear power plant must be conducted under strict laws and with profound planning, including cost and schedule estimation, because the plant is heavily contaminated with radioactivity. In Korea, there are two types of nuclear power plant: the pressurized light water reactor (PWR) and the pressurized heavy water reactor (PHWR), known as the CANDU reactor. About 50% of the operating nuclear power plants in Korea are PWRs originally designed by CE (Combustion Engineering). There is experience with the decommissioning of Westinghouse-type PWRs, but little with CE-type PWRs. Therefore, the purpose of this paper is to develop a cost estimation methodology and evaluate the technical level of decommissioning for application to CE-type PWRs based on systems engineering technology. Through the study, the following conclusions are obtained: · Based on systems engineering, the decommissioning work can be classified into Set, Subset, Task, Subtask and Work cost units. · The Set and Task structures are grouped into 29 Sets and 15 Tasks, respectively. · The final result shows the cost and project schedule for project control and risk management. · The present results are preliminary and should be refined and improved based on modeling and cost data reflecting available technology and current costs, such as labor and waste data.

  8. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K of PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin as compared with classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify the uncertainty in a best-estimate LOCA (BELOCA) analysis, two kinds of uncertainty generally have to be identified and quantified: model uncertainties and plant status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best estimate LOCA code, such as RELAP5 or TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Overall, the DRHM methodology can generate about 80-100 K of margin on PCT as compared to an Appendix K bounding state LOCA analysis.
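    The abstract does not spell out how the plant status uncertainties are statistically treated; one widely used realization of the CSAU-style treatment is Wilks' first-order nonparametric 95/95 tolerance limit, sketched below with an invented stand-in for the deterministic PCT calculation.

```python
# Sketch of the statistical treatment of plant-status uncertainty mentioned
# above, using Wilks' first-order nonparametric 95/95 tolerance limit
# (59 random samples, maximum taken as the bounding PCT). The pct() function
# is a stand-in for an Appendix K code run; its form is invented here.

import random

def pct(initial_power, break_area, ecc_temp):
    """Placeholder for a deterministic LOCA calculation returning PCT [K]."""
    return 1100.0 + 150.0 * initial_power + 80.0 * break_area - 0.5 * ecc_temp

random.seed(1)
samples = []
for _ in range(59):                       # 59 runs -> 95% coverage, 95% confidence
    p = random.uniform(0.98, 1.02)        # initial power (fraction of nominal)
    a = random.uniform(0.8, 1.0)          # break area (fraction of design)
    t = random.uniform(290.0, 320.0)      # ECC water temperature [K]
    samples.append(pct(p, a, t))

print(f"95/95 bounding PCT estimate: {max(samples):.1f} K")
```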

  9. Modelization of physical phenomena in research reactors with the help of new developments in transport methods, and methodology validation with experimental data

    International Nuclear Information System (INIS)

    Rauck, St.

    2000-10-01

    The aim of this work is to develop a calculation scheme for experimental reactors based on transport equations. This type of reactor is characterized by a small core, a complex, very heterogeneous geometry and large leakage. The possible insertion of neutron beams in the reflector and the presence of absorbers in the core increase the difficulty of the 3D geometrical description and the physical modeling of the component parameters of the reactor. The Orphee reactor has been chosen for our study. Physical models (homogenization, collapsing of cross sections into few groups, multigroup albedo conditions) have been developed in the APOLLO2 and CRONOS2 codes to calculate flux and power maps in a 3D geometry, at different burnups, through transport equations. Comparisons with experimental measurements have shown the value of taking into account anisotropy and steep flux gradients by using Sn methods, and of using a 12-group cross section library. The modeling of the neutron beams has been done outside the core modeling, through Monte Carlo calculations on the total geometry, including a large thickness of heavy water. Thanks to these calculations, one can evaluate the anti-reactivity of the neutron beams and determine the core cycle. These methods, which are more accurate than the usual transport-diffusion calculations, will be used for the design of new research reactors. (author)

  10. Cernavoda NPP risk - Based test and maintenance planning - Methodology development

    International Nuclear Information System (INIS)

    Georgescu, G.; Popa, P.; Petrescu, A.; Naum, M.; Gutu, M.

    1997-01-01

    The Cernavoda Power Plant started commercial operation in November 1996. During operation of a nuclear power plant, several mandatory tests and maintenance activities are performed on stand-by safety system components to ensure their availability in case of accident. The basic purpose of such activities is the early detection of any failure or degradation, and the timely correction of deterioration. Because of the large number of such activities, maintaining the emphasis on plant safety and allocating resources becomes difficult. The probabilistic model and methodology can be effectively used to determine the risk significance of these activities so that resources are directed to the most important areas. The proposed Research Contract activity is strongly connected with other safety-related areas under development. Since the Cernavoda Probabilistic Safety Evaluation Level 1 PSA Study (CPSE) was performed, and the study is now being revised to take into account the as-built information, it is recommended to implement into the model the features necessary to support further PSA applications, especially those related to test and maintenance optimization. Methods need to be developed in order to apply the PSA model, including risk information together with other needed information, to test and maintenance optimization. In parallel with the CPSE study update, the software interface for the PSA model is under development (Risk Monitor Software class), and methods and models need to be developed for the purpose of qualified monitoring of the efficiency of the test and maintenance strategy. Similarly, the data collection system needs to be appropriate for an ongoing implementation of a risk-based test and maintenance strategy. (author). 4 refs, 1 fig

  11. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  12. Development of a Methodology for Predicting Forest Area for Large-Area Resource Monitoring

    Science.gov (United States)

    William H. Cooke

    2001-01-01

    The U.S. Department of Agriculture, Forest Service, Southern Research Station, appointed a remote-sensing team to develop an image-processing methodology for mapping forest lands over large geographic areas. The team has presented a repeatable methodology, which is based on regression modeling of Advanced Very High Resolution Radiometer (AVHRR) and Landsat Thematic...

  13. Advanced Power Plant Development and Analyses Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    G.S. Samuelsen; A.D. Rao

    2006-02-06

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include ''Zero Emission'' power plants and the ''FutureGen'' H{sub 2} co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the ''Vision 21'' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  14. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  15. Development of Advanced Non-LOCA Analysis Methodology for Licensing

    International Nuclear Information System (INIS)

    Jang, Chansu; Um, Kilsup; Choi, Jaedon

    2008-01-01

    KNF is developing a new design methodology for Non-LOCA analysis for licensing purposes. The code chosen is the best-estimate transient analysis code RETRAN, and the OPR1000 is the target plant. For this purpose, KNF prepared a simple nodal scheme appropriate to the licensing analyses and developed the designer-friendly analysis tool ASSIST (Automatic Steady-State Initialization and Safety analysis Tool). To check the validity of the newly developed methodology, the single CEA withdrawal and locked rotor accidents were analyzed using the new methodology and compared with current design results. The comparison shows good agreement, and it is concluded that the new design methodology can be applied to licensing calculations for OPR1000 Non-LOCA analyses.

  16. A review of methodologies used in research on cadastral development

    DEFF Research Database (Denmark)

    Silva, Maria Augusta; Stubkjær, Erik

    2002-01-01

    World-wide, much attention has been given to cadastral development. As a consequence of experiences made during the last decades, several authors have stated the need for research in the domain of cadastre and proposed methodologies to be used. The purpose of this paper is to contribute to the acceptance of the research methodologies needed for cadastral development, and thereby enhance theory in the cadastral domain. The paper reviews nine publications on cadastre and identifies the methodologies used. The review focuses on the institutional, social, political and economic aspects of cadastral development, rather than on the technical aspects. The main conclusion of this paper is that the methodologies used are largely those of the social sciences. That agrees with the notion that cadastre relates as much to people and institutions as it relates to land, and that cadastral systems are shaped...

  17. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    Given that the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of just 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses, considering the inherent conservatism in the CONTEMPT-LT code.

  18. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    Given that the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of just 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses, considering the inherent conservatism in the CONTEMPT-LT code.
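    For orientation, the single-volume idealization that the multi-compartment GOTHIC model improves upon can be caricatured in a few lines: the blowdown steam and the containment air are mixed in one gas volume and the resulting pressure is read off from the ideal gas law. The ideal-gas treatment and all numbers below are illustrative simplifications, not the CONTEMPT-LT or GOTHIC models.

```python
# Back-of-envelope sketch of a single-volume containment model (the style of
# idealization the abstract contrasts with GOTHIC's multi-compartment one):
# steam mass and energy released by the blowdown pressurize one gas volume.
# Ideal-gas mixing and all numbers are simplifications for illustration.

R_AIR = 287.0      # J/(kg K)
R_STEAM = 462.0    # J/(kg K)

def containment_pressure(V=50000.0, m_air=60000.0, T0=320.0,
                         m_steam=100000.0, T_steam=400.0):
    """Pressure [Pa] after instantaneous mixing of blowdown steam with air."""
    # crude mixed temperature: mass-weighted (ignores latent heat, structures)
    T = (m_air * T0 + m_steam * T_steam) / (m_air + m_steam)
    p = (m_air * R_AIR + m_steam * R_STEAM) * T / V   # sum of partial pressures
    return p

print(f"peak pressure estimate: {containment_pressure() / 1e5:.2f} bar")
```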

  19. Methodological developments and qualification of calculation schemes for the modelling of photonic heating in the experimental devices of the future Jules Horowitz material testing reactor (RJH)

    International Nuclear Information System (INIS)

    Blanchet, D.

    2006-01-01

    The objective of this work is to develop the modelling of the nuclear heating of the experimental devices of the future Jules Horowitz material testing reactor (RJH). The high specific nuclear power produced (460 kW/l) induces photonic fluxes so intense, causing heating and large temperature gradients, that they must be controlled by an adequate design. However, heating calculations are penalized by very large uncertainties, estimated at about 30% (2*σ), coming from the gaps and uncertainties in the gamma emission data present in the libraries of basic nuclear data. The experimental program ADAPh aims at reducing these uncertainties. Measurements by thermoluminescent detectors (TLD) and ionisation chambers are carried out in the critical assemblies EOLE (MOX) and Minerve (UO2). The rigorous interpretation of these measurements requires specific developments based on Monte-Carlo simulations of coupled neutron-gamma and gamma-electron transport. The developments carried out are distinguished in particular by the modelling of cavity phenomena and of delayed gamma emission from the decay of fission products. The calculation-measurement comparisons made it possible to identify a systematic bias, confirming a tendency of the calculations to underestimate the measurements. A Bayesian adjustment method was developed in order to re-estimate the principal components of the gamma heating and to transpose the results obtained to the devices of the RJH, under clearly representative conditions. This work made it possible to reduce the uncertainties on the determination of the gamma heating significantly, from 30 to 15 per cent. (author)

  20. What the Current System Development Trends tell us about Systems Development Methodologies: Toward explaining SSDAM, Agile and IDEF0 Methodologies

    Directory of Open Access Journals (Sweden)

    Abdulla F. Ally

    2015-03-01

    Full Text Available Systems integration, customization and the component-based development approach are receiving increasing attention. This trend has also focused research attention on systems development methodologies. The availability of systems development tools, rapid changes in technology, the evolution of mobile computing and the growth of cloud computing have necessitated a move toward systems integration and customization rather than developing systems from scratch. This tendency encourages component-based development and discourages the traditional systems development approach. The paper presents and evaluates the SSADM, IDEF0 and Agile systems development methodologies. More specifically, it examines how well they fit into the current competitive market of systems development. From this perspective, it is anticipated that despite its popularity, the SSADM methodology is becoming obsolete, while the Agile and IDEF0 methodologies are still gaining acceptance in the current competitive market of systems development. The present study is likely to enrich our understanding of systems development methodology concepts and draw attention to where the current trends in systems development are heading.

  1. METHODOLOGICAL GUIDELINES FOR THE TRANSPROFESSIONALISM DEVELOPMENT AMONG VOCATIONAL EDUCATORS

    Directory of Open Access Journals (Sweden)

    E. F. Zeer

    2017-01-01

    Full Text Available Introduction. Nowadays, in view of the 6th wave of technological innovation and the emergence of the phenomenon of the «transfession», there is a need to modernize vocational staff training in our country. A transfession is a type of labour activity realized on the basis of the synthesis and convergence of professional competences that span different specialized areas. Thus, the authors of the present article propose to use the professional and educational platform developed by them, taking into account a specialist's training specialty. The aims of the article are the following: to describe the phenomenon of «transprofessionalism» and to determine initial attitudes towards its understanding; to present the block-modular model of the platform for the formation of transprofessionalism among teachers of vocational schools. Methodology and research methods. The research is based on the following theoretical and scientific methods: analysis, synthesis, concretization, generalization; the hypothetical-deductive method; and the project-based method. The design of the transprofessionalism platform model was based on multidimensional, transdisciplinary, network and project approaches. Results and scientific novelty. The relevance of the discussed phenomenon in the productive-economic sphere is proved. Transprofessionalism requires brand new content and technological training of specialists. In particular, the concept «profession» has lost its original meaning as an area of the social division of labour during the socio-technological development of the Russian economy. Therefore, transprofessionals, being capable of performing a wide range of specialized types of professional activity, are becoming more competitive and in demand in the employment market. The structure, principles and mechanisms of the functioning of the professional-educational platform for the formation of transprofessionalism among the members of professional

  2. Physical protection evaluation methodology program development and application

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Janghoon; Yoo, Hosik [Korea Institute of Nuclear Non-proliferation and Control, Daejeon (Korea, Republic of)

    2015-10-15

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance, not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not a trivial task, since there are many interconnected factors affecting overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodologies are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has a strong point in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, a vital area analysis and a realistic threat scenario assessment are required.

  3. Physical protection evaluation methodology program development and application

    International Nuclear Information System (INIS)

    Seo, Janghoon; Yoo, Hosik

    2015-01-01

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance, not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not a trivial task, since there are many interconnected factors affecting overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. The INPRO (International Project on Innovative Nuclear Reactors and Fuel Cycles) and GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodologies are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has a strong point in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, a vital area analysis and a realistic threat scenario assessment are required.

  4. Prometheus Reactor I and C Software Development Methodology, for Action

    International Nuclear Information System (INIS)

    T. Hamilton

    2005-01-01

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I and C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I and C Software Development Process Manual and Reactor Module Software Development Plan to NR for information.

  5. Prometheus Reactor I&C Software Development Methodology, for Action

    Energy Technology Data Exchange (ETDEWEB)

    T. Hamilton

    2005-07-30

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I&C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I&C Software Development Process Manual and Reactor Module Software Development Plan to NR for information.

  6. Applying of component system development in object methodology

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    software system and referred to as a software alliance. Both of the publications mentioned deliver a deep philosophy of the relevant issues relating to SWC/SWA, such as the creation of copies of components (cloning), the establishment and destruction of components at software run-time (dynamic reconfiguration), the cooperation of autonomous components, and the programmable management of component interfaces depending on internal component functionality and customer requirements (functionality, security, versioning). Nevertheless, even today we can meet numerous cases of SWC/SWA with a highly developed architecture that accepts the vast majority of these requests. On the other hand, the development practice of component-based systems with a dynamic architecture (i.e. architecture with dynamic reconfiguration) and, finally, with a mobile architecture (i.e. architecture with dynamic component mobility) confirms the inadequacy of the design methods contained in UML 2.0, as proved especially by the dissertation thesis (Rych, Weis, 2008). Software engineering currently has two different approaches to SWC/SWA systems. The first approach is known as component-based development (CBD). According to (Szyper, 2002), this is a collection of CBD methodologies that are heavily focused on the setting up and re-usability of software components within the architecture. Although CBD does not show a high theoretical approach, it is nevertheless classified in the general evolution of the SDP (Software Development Process; see Sommer, 2010) as one of its two dominant directions. From a structural point of view, a software system consists of self-contained, interoperable architectural units – components based on well-defined interfaces. Classical procedural object-oriented methodologies largely do not use the component meta-models on the basis of which the target component systems are then formed. Component meta-models describe the syntax and semantics of

  7. Methodological Approach for Modeling of Multienzyme in-pot Processes

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia; Roman Martinez, Alicia; Sin, Gürkan

    2011-01-01

    This paper presents a methodological approach for modeling multi-enzyme in-pot processes. The methodology is exemplified stepwise through the bi-enzymatic production of N-acetyl-D-neuraminic acid (Neu5Ac) from N-acetyl-D-glucosamine (GlcNAc). In this case study, sensitivity analysis is also used ...
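    The paper's model equations are not reproduced in the abstract; a generic bi-enzymatic cascade with Michaelis-Menten kinetics, as sketched below, conveys the kind of model such a methodology parameterizes. Species names, parameter values and the explicit Euler integration are illustrative assumptions, not the Neu5Ac case-study model.

```python
# Generic sketch of a bi-enzymatic cascade model of the kind described above:
# substrate A is converted to intermediate B and then to product C, each step
# following Michaelis-Menten kinetics. Parameter values are illustrative, not
# taken from the Neu5Ac case study.

def simulate(A=100.0, B=0.0, C=0.0, vmax1=2.0, km1=10.0,
             vmax2=1.5, km2=5.0, dt=0.01, t_end=200.0):
    t = 0.0
    while t < t_end:
        r1 = vmax1 * A / (km1 + A)     # enzyme 1: A -> B
        r2 = vmax2 * B / (km2 + B)     # enzyme 2: B -> C
        A, B, C = A - r1 * dt, B + (r1 - r2) * dt, C + r2 * dt
        t += dt
    return A, B, C

A, B, C = simulate()
print(f"A={A:.2f}, B={B:.2f}, C={C:.2f} mM after 200 time units")
```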

  8. Methodological Aspects of Modelling and Simulation of Robotized Workstations

    Directory of Open Access Journals (Sweden)

    Naqib Daneshjo

    2018-05-01

    Full Text Available From the point of view of the development of application and program products, the key directions that need to be respected in computer support for project activities are quite clearly specified. User interfaces with a high degree of graphical interactive convenience and two-dimensional and three-dimensional computer graphics contribute greatly to streamlining project methodologies and procedures in particular. This is mainly because many of the tasks solved in the modern design of robotic systems are clearly graphical in nature. The automation of graphical tasks is therefore a significant development direction for the subject area. The authors present the results of their research in the area of automation and computer-aided design of robotized systems. A new methodical approach to modelling robotic workstations, consisting of ten steps incorporated into the four phases of the logistics process of creating and implementing a robotic workplace, is presented. The emphasis is placed on the modelling and simulation phase, with verification of the elaborated methodologies on specific projects or elements of a robotized welding plant in automotive production.

  9. Building Modelling Methodologies for Virtual District Heating and Cooling Networks

    Energy Technology Data Exchange (ETDEWEB)

    Saurav, Kumar; Choudhury, Anamitra R.; Chandan, Vikas; Lingman, Peter; Linder, Nicklas

    2017-10-26

    District heating and cooling (DHC) systems are a proven energy solution that has been deployed for many years in a growing number of urban areas worldwide. They comprise a variety of technologies that seek to develop synergies between the production and supply of heat, cooling, domestic hot water and electricity. Although the benefits of DHC systems are significant and have been widely acclaimed, the full potential of modern DHC systems remains largely untapped. There are several opportunities for the development of energy-efficient DHC systems, which will enable the effective exploitation of alternative renewable resources, waste heat recovery, etc., in order to increase overall efficiency and facilitate the transition towards the next generation of DHC systems. This motivates the need for modelling these complex systems. Large-scale modelling of DHC networks is challenging, as they have several components interacting with each other. In this paper we present two methodologies for modelling the consumer buildings. These models will be further integrated with the network model and the control system layer to create a virtual test bed for the entire DHC system. The model is validated using data collected from a real-life DHC system located at Lulea, a city on the coast of northern Sweden. The test bed will then be used to simulate various test cases, such as peak energy reduction, overall demand reduction, etc.
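    The abstract leaves the form of the two building modelling methodologies open; a lumped-parameter (1R1C) thermal network is one form commonly used for consumer buildings in district-heating studies and is sketched below under that assumption, with invented parameters.

```python
# Sketch of a lumped-parameter (1R1C) building model of the kind often used
# for consumer buildings in district-heating studies; whether the paper's two
# methodologies take this exact form is an assumption. Parameters are invented.

def simulate_indoor_temp(T_in, T_out_series, q_heat, R=0.005, C=5.0e7, dt=3600):
    """First-order thermal network: C dT/dt = (T_out - T_in)/R + q_heat.

    R in K/W, C in J/K, q_heat in W, dt in seconds.
    """
    temps = [T_in]
    for T_out in T_out_series:
        dT = ((T_out - T_in) / R + q_heat) * dt / C
        T_in += dT
        temps.append(T_in)
    return temps

outdoor = [-5.0] * 24                       # a cold day, hourly samples
print([round(T, 2) for T in simulate_indoor_temp(20.0, outdoor, q_heat=4000.0)][:6])
```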

  10. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    ...technique involves model structure, system representation and the degree of validity, coupled with the simplicity, of the overall model. ABM is best suited... system representation of the air combat system. We feel that a simulation model that combines ABM with equation-based representation of weapons and...
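    The combination the excerpt alludes to, agent-level decision making with equation-based engagement resolution, can be illustrated with a toy model in which individual aircraft agents suffer attrition governed by a Lanchester-style exchange equation. This is purely illustrative and is not the thesis model.

```python
# Toy sketch of the hybrid approach the abstract alludes to: agents are
# modeled individually (ABM) while engagement outcomes are resolved with an
# equation-based attrition rule (here a Lanchester-style exchange). Entirely
# illustrative; not the thesis model.

import random

class Aircraft:
    def __init__(self, side, effectiveness):
        self.side = side
        self.effectiveness = effectiveness   # kills per engagement step
        self.alive = True

def step(blue, red, dt=1.0):
    """One combat step: aggregate attrition via the Lanchester square law."""
    b = sum(a.effectiveness for a in blue if a.alive)
    r = sum(a.effectiveness for a in red if a.alive)
    # expected losses this step, applied to randomly chosen survivors
    for force, losses in ((blue, r * dt), (red, b * dt)):
        survivors = [a for a in force if a.alive]
        random.shuffle(survivors)
        for a in survivors[:round(losses)]:
            a.alive = False

random.seed(0)
blue = [Aircraft("blue", 0.30) for _ in range(10)]
red = [Aircraft("red", 0.25) for _ in range(12)]
for _ in range(5):
    step(blue, red)
print(sum(a.alive for a in blue), "blue vs", sum(a.alive for a in red), "red alive")
```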

  11. Development of radiation risk assessment simulator using system dynamics methodology

    International Nuclear Information System (INIS)

    Kang, Kyung Min; Jae, Moosung

    2008-01-01

    The potential magnitudes of radionuclide releases under severe accident loadings and offsite consequences, as well as the overall risk (the product of accident frequencies and consequences), are analyzed and evaluated quantitatively in this study. The system dynamics methodology has been applied to predict time-dependent behaviors such as feedback and dependency, as well as to model the uncertain behavior of complex physical systems. It is used to construct the transfer mechanisms of time-dependent radioactivity concentrations and to evaluate them. Dynamic variations of radioactivity are simulated by considering several effects, such as deposition, weathering, washout, re-suspension, root uptake, translocation, leaching, senescence, intake, and excretion. A time-dependent radio-ecological model applicable to the Korean environment has been developed in order to assess the radiological consequences following the short-term deposition of radionuclides during severe nuclear power plant accidents. An ingestion food chain model can estimate time-dependent radioactivity concentrations in foodstuffs. It is also shown that the system dynamics approach is useful for analyzing the phenomena of complex systems as well as the behavior of structure values with respect to time. The output of this model (Bq ingested per Bq/m2 deposited) may be multiplied by the deposition and a dose conversion factor (Gy/Bq) to yield organ-specific doses. The model may be run deterministically to yield a single estimate, or stochastically by 'Monte-Carlo' calculation to yield distributions that reflect parameter and model uncertainties. The results of this study may contribute to identifying the relative importance of the various parameters in the consequence analysis, as well as to assessing risk reduction effects in accident management. (author)
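    As a minimal illustration of the compartment dynamics named above, the sketch below tracks a single plant-surface compartment whose activity declines through radioactive decay, weathering and leaching after a deposition event. The rate constants are invented (roughly Cs-137-like), not the model's Korean-specific parameters.

```python
# Minimal sketch of the kind of compartment dynamics named in the abstract
# (deposition followed by weathering/leaching losses and radioactive decay).
# A single plant-surface compartment is shown; rate constants are invented.

import math

def surface_activity(t_days, A0=1000.0, half_life=30.0 * 365.0,
                     k_weathering=0.049, k_leaching=0.001):
    """Activity [Bq/m2] remaining on vegetation t days after deposition A0."""
    lam = math.log(2) / half_life          # Cs-137-like radioactive decay
    k_total = lam + k_weathering + k_leaching
    return A0 * math.exp(-k_total * t_days)

for t in (0, 7, 14, 30):
    print(f"day {t:3d}: {surface_activity(t):7.1f} Bq/m2")
```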

  12. A methodology for the parametric modelling of the flow coefficients and flow rate in hydraulic valves

    International Nuclear Information System (INIS)

    Valdés, José R.; Rodríguez, José M.; Saumell, Javier; Pütz, Thomas

    2014-01-01

    Highlights: • We develop a methodology for the parametric modelling of flow in hydraulic valves. • We characterize the flow coefficients with a generic function with two parameters. • The parameters are derived from CFD simulations of the generic geometry. • We apply the methodology to two cases from the automotive brake industry. • We validate by comparing with CFD results varying the original dimensions. - Abstract: The main objective of this work is to develop a methodology for the parametric modelling of the flow rate in hydraulic valve systems. This methodology is based on the derivation, from CFD simulations, of the flow coefficient of the critical restrictions as a function of the Reynolds number, using a generalized square root function with two parameters. The methodology is then demonstrated by applying it to two completely different hydraulic systems: a brake master cylinder and an ABS valve. This type of parametric valve model facilitates implementation in dynamic simulation models of complex hydraulic systems.
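    The abstract specifies a generalized square-root function of the Reynolds number with two parameters but not its exact form. One plausible reading is sketched below, combined with the standard turbulent orifice equation; the expression, parameter values and fluid properties are assumptions. In practice Re itself depends on the flow rate, so such a model is solved iteratively; the sketch takes Re as given.

```python
# Sketch of the parametric valve model described above. The two-parameter
# square-root form chosen here, Cd(Re) = cd_inf * sqrt(Re / (Re + re_t)),
# is one plausible reading of "generalized square root function"; the exact
# expression and the parameter values are assumptions.

import math

def discharge_coefficient(re, cd_inf=0.7, re_t=500.0):
    """Flow coefficient as a function of Reynolds number (two parameters)."""
    return cd_inf * math.sqrt(re / (re + re_t))

def flow_rate(area_m2, dp_pa, rho=850.0, re=2000.0):
    """Orifice equation with a Reynolds-dependent flow coefficient."""
    cd = discharge_coefficient(re)
    return cd * area_m2 * math.sqrt(2.0 * dp_pa / rho)   # m^3/s

print(f"Q = {flow_rate(area_m2=1.0e-6, dp_pa=5.0e5):.3e} m^3/s")
```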

  13. Development of methodology for early detection of BWR instabilities

    International Nuclear Information System (INIS)

    Alessandro Petruzzi; Shin Chin; Kostadin Ivanov; Asok Ray; Fan-Bill Cheung

    2005-01-01

    Full text of publication follows: The objective of the work presented in this paper, which is supported by the US Department of Energy under the NEER program, is to develop an early anomaly detection methodology in order to enhance the safety, availability, and operational flexibility of Boiling Water Reactor (BWR) nuclear power plants. The technical approach relies on suppressing potential power oscillations in BWRs by detecting small anomalies at an early stage and taking appropriate prognostic actions based on an anticipated operation schedule. The model of coupled (two-phase) thermal-hydraulic and neutron flux dynamics, based on the US NRC coupled code TRACE/PARCS, is utilized as a generator of time series data for anomaly detection at an early stage. The methodology rests on the fact that nonlinear systems show bifurcation, a change in qualitative behavior as the system parameters vary. Some of these parameters may change of their own accord and account for the anomaly, while certain parameters can be altered in a controlled fashion. The nonlinear, non-autonomous BWR system model considered in this research exhibits phenomena at two time scales. Anomalies occur at the slow time scale, while the observation of the dynamical behavior, on which inferences are based, takes place at the fast time scale. It is assumed that: (i) the system behavior is stationary at the fast time scale; and (ii) any observable non-stationary behavior is associated with parametric changes evolving at the slow time scale. The goal is to make inferences about evolving anomalies based on the asymptotic behavior derived from the computer simulation. However, only sufficient changes in a slowly varying parameter may lead to a detectable difference in the asymptotic behavior. The need to detect such small parameter changes, and hence to detect an anomaly early, motivates the stimulus-response approach used here. In this approach, the model
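    As a concrete, if simpler, example of inferring stability changes from time-series data (the paper's own detector is stimulus-response based), the decay ratio of a power signal can be estimated from successive peaks of its autocorrelation function. The signal below is synthetic.

```python
# Illustrative stand-in for the time-series stage described above: estimating
# a decay ratio from the autocorrelation of a simulated power signal. The
# paper's actual detector is stimulus-response based; this simpler measure is
# shown only to make the idea of inferring stability from data concrete.

import math

def decay_ratio(signal):
    """Ratio of the 2nd to the 1st positive peak of the autocorrelation."""
    n = len(signal)
    mean = sum(signal) / n
    acf = []
    for lag in range(n // 2):
        num = sum((signal[i] - mean) * (signal[i + lag] - mean)
                  for i in range(n - lag))
        acf.append(num / (n - lag))
    peaks = [acf[i] for i in range(1, len(acf) - 1)
             if acf[i] > acf[i - 1] and acf[i] > acf[i + 1] and acf[i] > 0]
    return peaks[1] / peaks[0] if len(peaks) >= 2 else 0.0

# damped oscillation; true per-period decay is exp(-0.1 * 2*pi) ~ 0.53
sig = [math.exp(-0.1 * t) * math.sin(t) for t in
       [0.05 * k for k in range(2000)]]
print(f"estimated decay ratio: {decay_ratio(sig):.2f}")
```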

  14. Non-economic determinants of economic development: methodology and influence

    OpenAIRE

    Barashov, N.

    2011-01-01

    The paper deals with the research methodology of non-economic determinants of economic development. The author considers various theoretical approaches to the definition of economic growth factors. Considerable attention is given to studying the possible influence of non-economic determinants on the quality of economic development.

  15. A vision on methodology for integrated sustainable urban development: bequest

    NARCIS (Netherlands)

    Bentivegna, V.; Curwell, S.; Deakin, M.; Lombardi, P.; Mitchell, G.; Nijkamp, P.

    2002-01-01

    The concepts and visions of sustainable development that have emerged in the post-Brundtland era are explored in terms of laying the foundations for a common vision of sustainable urban development (SUD). The vision and methodology for SUD described here resulted from the activities of an international

  16. Development of seismic risk analysis methodologies at JAERI

    International Nuclear Information System (INIS)

    Tanaka, T.; Abe, K.; Ebisawa, K.; Oikawa, T.

    1988-01-01

    The usefulness of probabilistic safety assessment (PSA) is recognized worldwide for the balanced design and regulation of nuclear power plants. In Japan, the Japan Atomic Energy Research Institute (JAERI) has been engaged in developing the methodologies necessary for carrying out PSA. The research and development program was started in 1980. In those days the effort was devoted only to internal initiator PSA. In 1985 the program was expanded to include external event analysis. Although this expanded program is to cover various external initiators, the current effort is dedicated to seismic risk analysis. There are three levels of seismic PSA, as for internal initiator PSA: Level 1: evaluation of core damage frequency; Level 2: evaluation of radioactive release frequency and source terms; and Level 3: evaluation of environmental consequences. In JAERI's program, only the methodologies for Level 1 seismic PSA are under development. The methodology development for seismic risk analysis is divided into two phases. The Phase I study is to establish a whole set of simple methodologies based on currently available data. In Phase II, a sensitivity study will be carried out to identify the parameters whose uncertainty may result in large uncertainty in seismic risk, and for such parameters the methodology will be upgraded. The Phase I study has now almost been completed. In this report, outlines of the study and some of its outcomes are described

  17. Modeling Methodologies for Representing Urban Cultural Geographies in Stability Operations

    National Research Council Canada - National Science Library

    Ferris, Todd P

    2008-01-01

    ... 2.0.0, in an effort to provide modeling methodologies for a single simulation tool capable of exploring the complex world of urban cultural geographies undergoing Stability Operations in an irregular warfare (IW) environment...

  18. Development of a novel methodology for indoor emission source identification

    DEFF Research Database (Denmark)

    Han, K.H.; Zhang, J.S.; Knudsen, H.N.

    2011-01-01

    The objective of this study was to develop and evaluate a methodology to identify individual sources of emissions based on measurements of mixed air samples and the emission signatures of individual materials previously determined by Proton Transfer Reaction-Mass Spectrometry (PTR-MS), an on-line analytical device. The methodology, based on signal processing principles, was developed by employing the method of multiple regression least squares (MRLS) and a normalization technique. Samples of nine typical building materials were tested individually and in combination, including carpet, ceiling material... experiments and investigation are needed for cases where the relative emission rates among different compounds may change over a long-term period.
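    The MRLS step can be pictured as linear unmixing: the mixed-sample spectrum is modeled as a non-negative combination of previously measured single-material signatures and solved by least squares. The sketch below uses invented signatures and channel counts; the actual PTR-MS mass lists and the paper's normalization details differ.

```python
# Sketch of the MRLS idea described above: a mixed-air PTR-MS spectrum is
# modeled as a linear combination of previously measured single-material
# signatures, and least squares recovers the contribution of each source.
# Signatures and the mixture are invented toy numbers.

import numpy as np

# rows: ion masses (5 channels); columns: material signatures (normalized)
signatures = np.array([[0.60, 0.05, 0.10],
                       [0.20, 0.50, 0.05],
                       [0.10, 0.30, 0.15],
                       [0.05, 0.10, 0.50],
                       [0.05, 0.05, 0.20]])   # carpet, ceiling, sealant (hypothetical)

true_weights = np.array([2.0, 0.5, 1.0])
mixed = signatures @ true_weights + np.random.default_rng(0).normal(0, 0.01, 5)

weights, *_ = np.linalg.lstsq(signatures, mixed, rcond=None)
weights = np.clip(weights, 0.0, None)          # emission rates cannot be negative
print("estimated source contributions:", np.round(weights, 2))
```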

  19. Methodology for development of risk indicators for offshore platforms

    International Nuclear Information System (INIS)

    Oeien, K.; Sklet, S.

    1999-01-01

    This paper presents a generic methodology for development of risk indicators for petroleum installations and a specific set of risk indicators established for one offshore platform. The risk indicators should be used to control the risk during operation of platforms. The methodology is purely risk-based and the basis for development of risk indicators is the platform specific quantitative risk analysis (QRA). In order to identify high risk contributing factors, platform personnel are asked to assess whether and how much the risk influencing factors will change. A brief comparison of probabilistic safety assessment (PSA) for nuclear power plants and quantitative risk analysis (QRA) for petroleum platforms is also given. (au)

  20. Methodological developments and applications of neutron activation analysis

    International Nuclear Information System (INIS)

    Kucera, J.

    2007-01-01

    The paper reviews the author's experience acquired and achievements made in methodological developments of neutron activation analysis (NAA) of mostly biological materials. These involve epithermal neutron activation analysis, radiochemical neutron activation analysis using both single- and multi-element separation procedures, use of various counting modes, and the development and use of the self-verification principle. The role of NAA in the detection of analytical errors is discussed and examples of applications of the procedures developed are given. (author)

  1. Methodological Grounds of Managing Innovation Development of Restaurants

    OpenAIRE

    Naidiuk V. S.

    2013-01-01

    The goal of the article lies in the identification and further development of the methodological grounds for managing the innovation development of restaurants. Based on a critical analysis of existing scientific views on the interpretation of the essence of the notion of "managing the innovation development of an enterprise", the article clarifies this definition. As a result of the study, the article builds a cause-effect diagram for solving the problem of ensuring efficien...

  2. Development of Audit Calculation Methodology for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joosuk; Kim, Gwanyoung; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The interim criteria contain more stringent limits than the previous ones. For example, pellet-to-cladding mechanical interaction (PCMI) was introduced as a new failure criterion. And both short-term (e.g. fuel-to-coolant interaction, rod burst) and long-term (e.g. fuel rod ballooning, flow blockage) phenomena should be addressed for core coolability assurance. For dose calculations, transient-induced fission gas release additionally has to be accounted for. Traditionally, the approved RIA analysis methodologies for licensing application are developed based on a conservative approach. But the newly introduced safety criteria tend to reduce the margins to the criteria. Licensees are therefore trying to improve the margins by utilizing a less conservative approach. To cope with this trend, a new audit calculation methodology needs to be developed. In this paper, the new methodology, which is currently under development at KINS, is introduced. For the development of an audit calculation methodology for RIA safety analysis based on the realistic evaluation approach, a preliminary calculation utilizing the best estimate code has been done on the initial core of the APR1400. The main conclusions follow. - With the assumption of a single full-strength control rod ejection in the HZP condition, rod failure due to PCMI is not predicted. - Coolability can be assured in view of enthalpy and fuel melting. - But rod failure due to DNBR is expected, and there is a possibility of fuel failure at rated power conditions as well.
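    The PCMI screening logic implied above can be sketched as a comparison of the calculated radially averaged fuel enthalpy rise against a failure threshold that degrades with cladding hydrogen content (a proxy for burnup). The threshold curve and all numbers below are placeholders, not the interim criteria's actual limits.

```python
# Sketch of the PCMI screening logic implied above: the calculated radial
# average fuel enthalpy rise is compared against a failure threshold that
# decreases with cladding hydrogen content. The threshold curve here is a
# made-up placeholder, not the actual interim criteria.

def pcmi_limit(hydrogen_ppm: float) -> float:
    """Illustrative PCMI failure limit in cal/g as a function of H content."""
    return max(150.0 - 0.5 * hydrogen_ppm, 30.0)

def pcmi_failure(enthalpy_rise_cal_g: float, hydrogen_ppm: float) -> bool:
    return enthalpy_rise_cal_g > pcmi_limit(hydrogen_ppm)

for h in (0.0, 100.0, 200.0):
    print(f"H = {h:5.0f} wppm: limit = {pcmi_limit(h):5.1f} cal/g, "
          f"60 cal/g rise fails: {pcmi_failure(60.0, h)}")
```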

  3. Embracing Agile methodology during DevOps Developer Internship Program

    OpenAIRE

    Patwardhan, Amol; Kidd, Jon; Urena, Tiffany; Rajgopalan, Aishwarya

    2016-01-01

    The DevOps team adopted agile methodologies during the summer internship program as an initiative to move away from waterfall. The DevOps team implemented the Scrum software development strategy to create an internal data dictionary web application. This article reports on the transition process and lessons learned from the pilot program.

  4. Development of Management Methodology for Engineering Production Quality

    Science.gov (United States)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors of the paper propose four directions for developing the methodology of quality management of engineering products that implement the requirements of the new international standard ISO 9001:2015: analysis of the organizational context taking stakeholders into account, the use of risk management, management of in-house knowledge, and assessment of enterprise activity according to criteria of effectiveness.

  5. Summary of FY-1978 consultation input for Scenario Methodology Development

    International Nuclear Information System (INIS)

    Scott, B.L.; Benson, G.L.; Craig, R.A.; Harwell, M.A.

    1979-11-01

    The Scenario Methodology Development task is concerned with evaluating the geologic system surrounding an underground repository and describing the phenomena (volcanic, seismic, meteorite, hydrologic, tectonic, climate, etc.) which could perturb the system and possibly cause loss of repository integrity. This document includes 14 individual papers. Separate abstracts were prepared for all 14 papers

  6. Conceptual and methodological biases in network models.

    Science.gov (United States)

    Lamm, Ehud

    2009-10-01

    Many natural and biological phenomena can be depicted as networks. Theoretical and empirical analyses of networks have become prevalent. I discuss theoretical biases involved in the delineation of biological networks. The network perspective is shown to dissolve the distinction between regulatory architecture and regulatory state, consistent with the theoretical impossibility of distinguishing a priori between "program" and "data." The evolutionary significance of the dynamics of trans-generational and interorganism regulatory networks is explored and implications are presented for understanding the evolution of the biological categories development-heredity, plasticity-evolvability, and epigenetic-genetic.

  7. European methodology for qualification of NDT as developed by ENIQ

    International Nuclear Information System (INIS)

    Champigny, F.; Sandberg, U.; Engl, G.; Crutzen, S.; Lemaitre, P.

    1997-01-01

    The European Network for Inspection Qualification (ENIQ) brings together the major part of the nuclear power plant operators in the European Union (and Switzerland). The main objective of ENIQ is to co-ordinate and manage at the European level the expertise and resources for the qualification of NDE inspection systems, primarily for nuclear components. In the framework of ENIQ the European methodology for qualification of NDT has been developed. In this paper the main principles of the European methodology are presented, together with the main activities and organisation of ENIQ. (orig.)

  8. Service Innovation Methodologies II : How can new product development methodologies be applied to service innovation and new service development? : Report no 2 from the TIPVIS-project

    OpenAIRE

    Nysveen, Herbjørn; Pedersen, Per E.; Aas, Tor Helge

    2007-01-01

    This report presents various methodologies used in new product development and product innovation and discusses the relevance of these methodologies for service development and service innovation. The service innovation relevance for all of the methodologies presented is evaluated along several service specific dimensions, like intangibility, inseparability, heterogeneity, perishability, information intensity, and co-creation. The methodologies discussed are mainly collect...

  9. Development of a computational methodology for internal dose calculations

    International Nuclear Information System (INIS)

    Yoriyaz, Helio

    2000-01-01

    A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body and a more precise tool for the radiation transport simulation. The present technique shows the capability to build a patient-specific phantom from tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods such as the MCNP-4B code. In order to utilize the segmented human anatomy as a computational model for the simulation of radiation transport, an interface program, SCMS, was developed to build the geometric configurations for the phantom from tomographic images. This procedure allows the calculation not only of average dose values but also of the spatial distribution of dose in regions of interest. With the present methodology, absorbed fractions for photons and electrons in various organs of the Zubal segmented phantom were calculated and compared to those reported for the mathematical phantoms of Snyder and Cristy-Eckerman. Although the differences in organ geometry between the phantoms are quite evident, the results generally show small discrepancies; in some cases, however, considerable discrepancies were found, due to two major causes: differences in the organ masses between the phantoms, and the occurrence of organ overlap in the Zubal segmented phantom, which is not considered in the mathematical phantom. This effect was quite evident for organ cross-irradiation from electrons. The determination of the spatial dose distribution demonstrated the possibility of evaluating more detailed dose data than those obtained with conventional methods, which will provide important information for clinical analysis in therapeutic procedures and in radiobiological studies of the human body. (author)
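    As a minimal illustration of the final step described above (not the SCMS/MCNP-4B workflow itself), the sketch below turns per-voxel energy deposition from a Monte Carlo run into organ absorbed fractions and mean doses; all arrays, organ IDs and the emitted-energy total are synthetic stand-ins.

```python
import numpy as np

# Illustrative sketch: organ absorbed fractions and mean dose from per-voxel
# energy deposition. All data are synthetic placeholders.

rng = np.random.default_rng(0)
organ_id = rng.integers(0, 3, size=(16, 16, 16))   # 0 = other, 1 = liver, 2 = kidneys
energy_dep_MeV = rng.random(organ_id.shape)        # energy deposited per voxel (MeV)
voxel_mass_g = np.full(organ_id.shape, 1.0)        # voxel masses (g), uniform here

E_emitted_MeV = 5.0e3  # total energy emitted by the source region (assumed)

for organ, name in [(1, "liver"), (2, "kidneys")]:
    mask = organ_id == organ
    E_organ = energy_dep_MeV[mask].sum()
    af = E_organ / E_emitted_MeV                   # absorbed fraction for this target
    dose = E_organ / voxel_mass_g[mask].sum()      # mean dose in MeV/g
    print(f"{name}: absorbed fraction {af:.3f}, mean dose {dose:.3f} MeV/g")
```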

  10. Modeling of development and projection of the accumulated recoverable oil volume: methodology and application; Modelagem da evolucao e projecao de volume de oleo recuperavel acumulado: metodologia e aplicacao

    Energy Technology Data Exchange (ETDEWEB)

    Melo, Luciana Cavalcanti de; Ferreira Filho, Virgilio Jose Martins; Rocha, Vinicius Brito [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE)

    2004-07-01

    A relevant problem that petroleum companies deal with is the estimation of future levels of reserves. The objective of reserve forecasting is pursued through the construction of mathematical models. Since exploration is an informed and controlled process, it is conducted, in order to reach the exploration targets, as a sequence of decisions based on the results achieved. Such decisions are taken in an uncertain environment, compounded by the random nature of the process. Another important assumption that must be taken into consideration is the dependency of exploration on the conditions, or structure, of the discovered resources and the final potential. The modeling starts with the establishment of a general problem, when the models are constructed based on suppositions associated with the main concepts, and ends with the attainment of specific solutions, when the best description, or model, is selected through the estimation of the respective parameters and of the measures of adjustment. The result of this approach reflects the essence of the exploration process and how it is reflected in the incorporation of reserves and the history of field discoveries. A case study is used for validation of the models and the estimates. (author)
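    As a concrete illustration of fitting and projecting such a model (the paper does not specify this functional form), a logistic "creaming" curve is a common choice for cumulative discovered volume versus exploration effort; the sketch below fits one to synthetic discovery data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative sketch: logistic growth of cumulative recoverable volume with
# exploration effort (e.g., wildcat count). Data below are synthetic.

def logistic(x, v_ult, k, x0):
    # v_ult: ultimate recoverable volume; k: growth rate; x0: inflection point
    return v_ult / (1.0 + np.exp(-k * (x - x0)))

wells = np.arange(1, 41)
true = logistic(wells, 800.0, 0.25, 15.0)
observed = true + np.random.default_rng(1).normal(0, 10, wells.size)

p_opt, p_cov = curve_fit(logistic, wells, observed, p0=(1000.0, 0.1, 20.0))
print(f"Estimated ultimate volume: {p_opt[0]:.0f} (true value 800)")
print(f"Projection after 60 wells: {logistic(60, *p_opt):.0f}")
```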

  11. Risk-Informed Assessment Methodology Development and Application

    International Nuclear Information System (INIS)

    Sung Goo Chi; Seok Jeong Park; Chul Jin Choi; Ritterbusch, S.E.; Jacob, M.C.

    2002-01-01

    Westinghouse Electric Company (WEC) has been working with Korea Power Engineering Company (KOPEC) on a US Department of Energy (DOE) sponsored Nuclear Energy Research Initiative (NERI) project through a collaborative agreement established for the domestic NERI program. The project deals with Risk-Informed Assessment (RIA) of regulatory and design requirements of future nuclear power plants. An objective of the RIA project is to develop a risk-informed design process, which focuses on identifying and incorporating advanced features into future nuclear power plants (NPPs) that would meet risk goals in a cost-effective manner. The RIA design methodology is proposed to accomplish this objective. This paper discusses the development of this methodology and demonstrates its application in the design of plant systems for future NPPs. Advanced conceptual plant systems consisting of an advanced Emergency Core Cooling System (ECCS) and Emergency Feedwater System (EFWS) for a NPP were developed and the risk-informed design process was exercised to demonstrate the viability and feasibility of the RIA design methodology. Best estimate Loss-of-Coolant Accident (LOCA) analyses were performed to validate the PSA success criteria for the NPP. The results of the analyses show that the PSA success criteria can be met using the advanced conceptual systems and that the RIA design methodology is a viable and appropriate means of designing key features of risk-significant NPP systems. (authors)

  12. Developing knowledge management systems with an active expert methodology

    International Nuclear Information System (INIS)

    Sandahl, K.

    1992-01-01

    Knowledge management, understood as the ability to store, distribute and utilize human knowledge in an organization, is the subject of this dissertation. In particular, we have studied the design of methods and supporting software for this process. Detailed and systematic descriptions of the design and development processes of three case-study implementations of knowledge management software are provided. The outcome of the projects is explained in terms of an active expert development methodology, which is centered around support for a domain expert to take substantial responsibility for the design and maintenance of a knowledge management system in a given area of application. Based on the experiences from the case studies and the resulting methodology, an environment for automatically supporting knowledge management was designed in the KNOWLEDGE-LINKER research project. The vital part of this architecture is a knowledge acquisition tool, used directly by the experts in creating and maintaining a knowledge base. An elaborated version of the active expert development methodology was then formulated as the result of applying the KNOWLEDGE-LINKER approach in a fourth case study. This version of the methodology is also accounted for and evaluated together with the supporting KNOWLEDGE-LINKER architecture. (au)

  13. Selecting a software development methodology. [of digital flight control systems

    Science.gov (United States)

    Jones, R. E.

    1981-01-01

    The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible are documented.

  14. Territory development as economic and geographical activity (theory, methodology, practice

    Directory of Open Access Journals (Sweden)

    Vitaliy Nikolaevich Lazhentsev

    2013-03-01

    Accents in the description of the theory and methodology of territory development are shifted from the distribution of national benefits to the formation of territorial natural-economic systems and the organization of economic and geographical activity. The author reveals the concept of «territory development» and reviews its place in the theory and methodology of human geography and regional economy. The individual directions of economic activity are considered in the article. The author attempts to define the subject matter of five levels of «ideal» territorial and economic systems comprising objects of nature, society, population settlement, production, infrastructure and management. The author's interpretation of the sequence of mechanisms of territory development, working according to a nested-doll principle (mechanism of the economy, economic management mechanism, controlling mechanism of the economy), is presented. The author presents the indicators which authentically define territory development.

  15. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments
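    The sketch below gives a toy Monte Carlo estimate in the spirit of this event-sequence approach; it is not the TORMIS code, and every distribution and threshold in it is a placeholder for the documented data bases the methodology actually uses.

```python
import random

# Toy Monte Carlo sketch: annual probability that a tornado-propelled missile
# strikes a target structure. Distributions and thresholds are placeholders.

random.seed(42)
N = 200_000
tornado_rate = 1.0e-3          # tornadoes per year at the site (assumed)
hits = 0
for _ in range(N):
    wind = random.weibullvariate(50.0, 2.0)   # near-ground wind speed (m/s)
    if wind < 40.0:                           # too weak to inject this missile
        continue
    flight = random.expovariate(1.0 / 100.0)  # missile transport distance (m)
    bearing_ok = random.random() < 0.05       # fraction of headings toward target
    if flight > 150.0 and bearing_ok:         # reaches and is aimed at the target
        hits += 1

p_hit_given_tornado = hits / N
print(f"P(impact | tornado) ~ {p_hit_given_tornado:.2e}")
print(f"Annual impact probability ~ {tornado_rate * p_hit_given_tornado:.2e}")
```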

  16. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana

    2016-06-01

    This paper presents a software engineering approach to a research proposal to build an expert system for scheduling in service systems, using software development methodologies and processes. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We produce UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology, in order to identify the actors, elements and interactions in the research process.

  17. Analysis and development of numerical methodologies for simulation of flow control with dielectric barrier discharge actuators

    OpenAIRE

    Abdollahzadehsangroudi, Mohammadmahdi

    2014-01-01

    The aim of this thesis is to investigate and develop different numerical methodologies for modeling dielectric barrier discharge (DBD) plasma actuators for flow control purposes. Two different modeling approaches were considered: one based on a plasma-fluid model and the other based on a phenomenological model. A three-component plasma-fluid model based on the transport equations of charged particles was implemented in this thesis in OpenFOAM, using several techniques to redu...

  18. A case study in data audit and modelling methodology. Australia

    Energy Technology Data Exchange (ETDEWEB)

    Apelbaum, John [Apelbaum Consulting Group, 750 Blackburn Road, Melbourne VIC 3170 (Australia)

    2009-10-15

    The purpose of the paper is to outline a rigorous, spatially consistent and cost-effective transport planning tool that projects travel demand, energy and emissions for all modes associated with domestic and international transport. The planning tool (Aus-e-Tran) is a multi-modal, multi-fuel and multi-regional macroeconomic and demographic-based computational model of the Australian transport sector that overcomes some of the gaps associated with existing strategic-level transport emission models. The paper also identifies a number of key data issues that need to be resolved prior to model development, with particular reference to the Australian environment. The strategic model structure endogenously derives transport demand, energy and emissions by jurisdiction, vehicle type, emission type and transport service for both freight and passenger transport. Importantly, the analytical framework delineates the national transport task, energy consumed and emissions according to region, state/territory of origin and jurisdictional protocols; provides an audit mechanism for the evaluation of the methodological framework; integrates a mathematical protocol to derive time-series FFC emission factors; and allows for the impact of non-registered road vehicles on transport, fuel and emissions. (author)
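    For readers unfamiliar with this style of model, the sketch below shows the bottom-up accounting such a tool performs at its core: emissions as activity times energy intensity times a fuel emission factor, summed by segment and jurisdiction. The structure and all figures are illustrative, not taken from Aus-e-Tran.

```python
# Bottom-up emissions accounting sketch. All figures below are placeholders,
# not Australian statistics.

segments = [
    # (mode, jurisdiction, activity in 1e9 vehicle-km, MJ per v-km, kg CO2-e per MJ)
    ("road passenger", "NSW", 55.0,  3.1, 0.069),
    ("road freight",   "NSW", 12.0, 12.5, 0.070),
    ("rail freight",   "WA",   4.0,  6.0, 0.070),
]

by_region = {}
for mode, region, vkm_bn, mj_per_vkm, ef in segments:
    energy_pj = vkm_bn * mj_per_vkm        # 1e9 v-km x MJ/v-km = PJ
    emissions_mt = energy_pj * ef          # PJ x kg CO2-e/MJ = Mt CO2-e
    by_region[region] = by_region.get(region, 0.0) + emissions_mt
    print(f"{mode:14s} {region}: {energy_pj:7.1f} PJ, {emissions_mt:5.1f} Mt CO2-e")

print("Totals by jurisdiction:", {r: round(v, 1) for r, v in by_region.items()})
```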

  19. Risk methodology for geologic disposal of radioactive waste: model description and user manual for Pathways model

    International Nuclear Information System (INIS)

    Helton, J.C.; Kaestner, P.C.

    1981-03-01

    A model for the environmental movement and human uptake of radionuclides is presented. This model is designated the Pathways-to-Man Model and was developed as part of a project funded by the Nuclear Regulatory Commission to design a methodology to assess the risk associated with the geologic disposal of high-level radioactive waste. The Pathways-to-Man Model is divided into two submodels. One of these, the Environmental Transport Model, represents the long-term distribution and accumulation of radionuclides in the environment. This model is based on a mixed-cell approach and describes radionuclide movement with a system of linear differential equations. The other, the Transport-to-Man Model, represents the movement of radionuclides from the environment to man. This model is based on concentration ratios. General descriptions of these models are provided in this report. Further, documentation is provided for the computer program which implements the Pathways Model
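    A minimal mixed-cell sketch of the kind of linear system the Environmental Transport Model describes might look as follows; the cells, transfer rates and decay constant are invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative mixed-cell sketch (not the Pathways code): inventories in three
# environmental cells evolve by a linear ODE system dq/dt = A q, where
# off-diagonal entries of A are inter-cell transfer rates and the diagonal
# collects losses (outflow plus radioactive decay).

lam = 1.0e-3                 # decay constant (1/yr), assumed
k_12, k_23 = 0.05, 0.02      # transfer rates: cell1->cell2, cell2->cell3 (1/yr)

A = np.array([
    [-(k_12 + lam),  0.0,          0.0],
    [ k_12,         -(k_23 + lam), 0.0],
    [ 0.0,           k_23,        -lam],
])

q0 = [1.0, 0.0, 0.0]         # initial inventory all in cell 1 (Ci)
sol = solve_ivp(lambda t, q: A @ q, (0.0, 500.0), q0, dense_output=True)
print("Inventories at t = 500 yr:", sol.y[:, -1].round(4))
```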

  20. Development of an aeroelastic methodology for surface morphing rotors

    Science.gov (United States)

    Cook, James R.

    Helicopter performance capabilities are limited by maximum lift characteristics and vibratory loading. In high speed forward flight, dynamic stall and transonic flow greatly increase the amplitude of vibratory loads. Experiments and computational simulations alike have indicated that a variety of active rotor control devices are capable of reducing vibratory loads. For example, periodic blade twist and flap excitation have been optimized to reduce vibratory loads in various rotors. Airfoil geometry can also be modified in order to increase lift coefficient, delay stall, or weaken transonic effects. To explore the potential benefits of active controls, computational methods are being developed for aeroelastic rotor evaluation, including coupling between computational fluid dynamics (CFD) and computational structural dynamics (CSD) solvers. In many contemporary CFD/CSD coupling methods it is assumed that the airfoil is rigid to reduce the interface by a single dimension. Some methods retain the conventional one-dimensional beam model while prescribing an airfoil shape to simulate active chord deformation. However, to simulate the actual response of a compliant airfoil it is necessary to include deformations that originate not only from control devices (such as piezoelectric actuators), but also from inertial forces, elastic stresses, and aerodynamic pressures. An accurate representation of the physics requires an interaction with a more complete representation of loads and geometry. A CFD/CSD coupling methodology capable of communicating three-dimensional structural deformations and a distribution of aerodynamic forces over the wetted blade surface has not yet been developed. In this research an interface is created within the Fully Unstructured Navier-Stokes (FUN3D) solver that communicates aerodynamic forces on the blade surface to University of Michigan's Nonlinear Active Beam Solver (UM/NLABS -- referred to as NLABS in this thesis). Interface routines are developed for
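    Although the thesis couples FUN3D and NLABS, the underlying loose-coupling pattern can be shown with algebraic stand-ins: iterate between a "fluid" map returning surface loads and a "structure" map returning deformations until the exchanged fields settle.

```python
import numpy as np

# Toy fixed-point sketch of a loosely coupled CFD/CSD iteration. The "solvers"
# are algebraic stand-ins, not FUN3D or UM/NLABS.

def fluid_loads(deflection):
    # Stand-in CFD step: surface pressure rises as the surface deflects.
    return 1.0 + 0.3 * np.tanh(deflection)

def structure_deflection(loads):
    # Stand-in CSD step: linear compliance of the surface.
    return 0.5 * loads

defl = np.zeros(8)                        # deformation at 8 surface points
for it in range(50):
    loads = fluid_loads(defl)             # loads over the wetted surface
    new_defl = structure_deflection(loads)  # deformed shape passed back
    if np.max(np.abs(new_defl - defl)) < 1e-10:
        break
    defl = new_defl
print(f"Converged in {it} iterations; deflection {defl[0]:.4f}")
```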

  1. ENTERPRISES DEVELOPMENT: MANAGEMENT MODEL

    Directory of Open Access Journals (Sweden)

    Lina Shenderivska

    2018-01-01

    The paper's purpose is to provide recommendations for effectively managing companies' development, taking into account the transformation of the sector's key elements. Methodology. Econometric simulation of enterprise profits is conducted to determine the most significant factors influencing their development. Model testing revealed multicollinearity among the regressors. To rid the profit models of multicollinearity, isolated regressors are excluded, namely return on assets, material returns, and return on equity. To obtain qualitative models with a small error in the estimation of model parameters and, accordingly, a highly reliable conclusion about the interrelation between the model factors and the resulting feature, the income model includes only factors that are not closely interconnected, that is, not multicollinear. The coefficients of determination R² and the F-criterion were calculated to check model quality. The key elements of modern printing enterprises of Ukraine, connected with integration into the global information space, are analysed. Results. The interrelation between a company's development and its earning capacity is identified in the study. The importance of profit as the main source of enterprise financing is substantiated. The factors that have the greatest impact on the enterprises' development are labour productivity, financial autonomy and working capital turnover, and the character of their influence is most adequately reflected by the power model. Peculiarities of the enterprises' activity include increased competition at the inter-branch level, poorly developed industrial relations, and a shortage of own sources for financing activities. Practical implications. Based on information on the most significant developmental impact factors, directions of perspective development are proposed for enterprises to increase their competitiveness: diversification based on the activity expansion
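    A small sketch of the multicollinearity screening step described above (synthetic data; the paper's actual regressors include labour productivity, financial autonomy and working capital turnover): compute variance inflation factors, drop the collinear regressor, then check R² and the F-statistic.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Synthetic data; "return_on_assets" is built to be nearly collinear with
# labour productivity, mimicking the isolated regressors the paper drops.
rng = np.random.default_rng(0)
n = 120
X = pd.DataFrame({
    "labour_productivity": rng.normal(100, 15, n),
    "financial_autonomy":  rng.uniform(0.2, 0.8, n),
    "capital_turnover":    rng.normal(4, 1, n),
})
X["return_on_assets"] = 0.01 * X["labour_productivity"] + rng.normal(0, 0.02, n)

Xc = sm.add_constant(X)
vif = {c: variance_inflation_factor(Xc.values, i)
       for i, c in enumerate(Xc.columns) if c != "const"}
print({k: round(v, 1) for k, v in vif.items()})  # high VIF -> drop that regressor

y = 2 + 0.5 * X["labour_productivity"] + rng.normal(0, 5, n)
model = sm.OLS(y, sm.add_constant(X.drop(columns="return_on_assets"))).fit()
print(f"R^2 = {model.rsquared:.3f}, F = {model.fvalue:.1f}")
```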

  2. The Health Behaviour in School-aged Children (HBSC) study: methodological developments and current tensions

    DEFF Research Database (Denmark)

    Roberts, Chris; Freeman, John; Samdal, Oddrun

    2009-01-01

    OBJECTIVES: To describe the methodological development of the HBSC survey since its inception and explore methodological tensions that need to be addressed in the ongoing work on this and other large-scale cross-national surveys. METHODS: Using archival data and conversations with members... of the network, we collaboratively analysed our joint understandings of the survey's methodology. RESULTS: We identified four tensions that are likely to be present in upcoming survey cycles: (1) maintaining quality standards against a background of rapid growth, (2) continuous improvement with limited financial... in working through such challenges renders it likely that HBSC can provide a model for other similar studies facing these tensions...

  3. Reaching the grassroots: publishing methodologies for development organizations.

    Science.gov (United States)

    Zielinski, C

    1987-01-01

    There are 3 major distinctions between the traditional form of academic publishing and publishing for the grassroots as a development-organization activity, particularly in developing countries. Whereas academic publishing seeks to cover the target audience in its entirety, grassroots publishing can only cover a sampling. Academic publishing fulfills a need, while grassroots publishing demonstrates a need and a way to fulfill it. Finally, whereas academic publishing is largely a support activity aimed at facilitating the dissemination of information as a relatively minor part of a technical program, grassroots publishing is a more substantive activity aimed at producing a catalytic effect. Publication for the grassroots further calls for a different methodological approach. Given the constraint of numbers, publications aimed at the grassroots can only be examples or prototypes. The function of a prototype is to serve both as a basis for translation, adaptation, and replication and as a model end result. The approach to the use and promotion of prototypes differs according to the specific country situation. In countries with a heterogenous culture or several different languages, 2 items should be produced: a prototype of the complete text, which should be pretested and evaluated, and a prototype adaptation kit stripped of cultural and social biases. Promotion of the translation and replication of a publication can be achieved by involving officials at the various levels of government, interesting international and voluntary funding agencies, and stimulating indigenous printing capacities at the community level. The most important factors are the appropriateness of the publication in solving specific priority problems and the interest and involvement of national and state authorities at all stages of the project.

  4. Development of the affiliate system based on modern development methodologies

    OpenAIRE

    Fajmut, Aljaž

    2016-01-01

    Affiliate partnership is a popular and effective method of online marketing through affiliate partners. The thesis describes the development of a product which allows us to easily integrate an affiliate system into an existing platform (e-commerce or service). This kind of functionality opens up growth opportunities for the business. The system is designed in such a way that it requires a minimal amount of changes for implementation into an existing application. The development of the product is ...

  5. Development Risk Methodology for Whole Systems Trade Analysis

    Science.gov (United States)

    2016-08-01

    WSTAT). In the early stages of the V&V for development risk, it was discovered that the original risk rating and methodology did not actually... WSTA has opened trade space exploration by allowing the tool to evaluate trillions of potential system configurations to then return a handful of

  6. Development of methodology and direction of practice administrative neuromarketing

    OpenAIRE

    Glushchenko V.; Glushchenko I.

    2018-01-01

    The development of the methodology and of practical aspects of the application of administrative neuromarketing is the subject of this work; the subject of the article is administrative neuromarketing in the organization. The article investigates the concept and content of administrative neuromarketing and its philosophy, culture, functions, tasks and principles, together with the technique of logical analysis of the possibility of applying methods of administrative neuromarketing for incre...

  7. Water Quality Research Program: Development of Unstructured Grid Linkage Methodology and Software for CE-QUAL-ICM

    National Research Council Canada - National Science Library

    Chapman, Raymond

    1997-01-01

    This study was conducted for the purpose of developing a methodology and associated software for linking hydrodynamic output from the RMA10 finite element model to the CE-QUAL-ICM finite volume water quality model...

  8. Latest developments on safety analysis methodologies at the Juzbado plant

    International Nuclear Information System (INIS)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A.

    2010-01-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, that is to say, the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, among them regulator requirements, operational experience and, of course, the strong commitment of ENUSA to maintain the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and the investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated early in 2009 to assess the actual operating conditions of all the systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points related to these three methodologies as well as an outline of the results obtained so far. (authors)

  9. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

    Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of 23 models that reported missing data, and the number of events per variable was below recommended levels in several models. Only 2 models presented recommended regression equations. There was significant heterogeneity in the discriminative ability of models with respect to age. This review identified prediction models with sufficient discriminative ability, although few have been externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and the resulting algorithms should be applied with caution.
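    For reference, the headline metric in this review, the C-statistic, is simply the area under the ROC curve of the risk model on (ideally external) validation data; a self-contained sketch with synthetic data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic cohort: a held-out split stands in for external validation.
rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 4))                      # e.g., age, SBP, BMI (standardized)
logit = 0.8 * X[:, 0] + 0.5 * X[:, 1] - 2.0
y = rng.random(n) < 1 / (1 + np.exp(-logit))     # incident HF indicator

X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.3, random_state=1)
model = LogisticRegression().fit(X_dev, y_dev)
c_stat = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"C-statistic on validation split: {c_stat:.3f}  (>0.70 deemed acceptable)")
```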

  10. Coal resources available for development; a methodology and pilot study

    Science.gov (United States)

    Eggleston, Jane R.; Carter, M. Devereux; Cobb, James C.

    1990-01-01

    Coal accounts for a major portion of our Nation's energy supply in projections for the future. A demonstrated reserve base of more than 475 billion short tons, as the Department of Energy currently estimates, indicates that, on the basis of today's rate of consumption, the United States has enough coal to meet projected energy needs for almost 200 years. However, the traditional procedures used for estimating the demonstrated reserve base do not account for many environmental and technological restrictions placed on coal mining. A new methodology has been developed to determine the quantity of coal that might actually be available for mining under current and foreseeable conditions. This methodology is unique in its approach, because it applies restrictions to the coal resource before it is mined. Previous methodologies incorporated restrictions into the recovery factor (a percentage), which was then globally applied to the reserve (minable coal) tonnage to derive a recoverable coal tonnage. None of the previous methodologies define the restrictions and their area and amount of impact specifically. Because these restrictions and their impacts are defined in this new methodology, it is possible to achieve more accurate and specific assessments of available resources. This methodology has been tested in a cooperative project between the U.S. Geological Survey and the Kentucky Geological Survey on the Matewan 7.5-minute quadrangle in eastern Kentucky. Pertinent geologic, mining, land-use, and technological data were collected, assimilated, and plotted. The National Coal Resources Data System was used as the repository for data, and its geographic information system software was applied to these data to eliminate restricted coal and quantify that which is available for mining. This methodology does not consider recovery factors or the economic factors that would be considered by a company before mining. Results of the pilot study indicate that, of the estimated

  11. Development of a simplified statistical methodology for nuclear fuel rod internal pressure calculation

    International Nuclear Information System (INIS)

    Kim, Kyu Tae; Kim, Oh Hwan

    1999-01-01

    A simplified statistical methodology is developed in order both to reduce the over-conservatism of the deterministic methodologies employed for PWR fuel rod internal pressure (RIP) calculation and to simplify the complicated calculation procedure of the widely used statistical methodology, which employs the response surface method and Monte Carlo simulation. The simplified statistical methodology employs the system moment method with a deterministic approach in determining the maximum variance of RIP. The maximum RIP variance is determined as the square sum, over all input variables considered, of the maximum value of the mean RIP times the corresponding RIP sensitivity factor. This approach makes the simplified statistical methodology much more efficient in routine reload core design analysis, since it eliminates the numerous calculations required for the power-history-dependent RIP variance determination. This simplified statistical methodology is shown to be more conservative in generating the RIP distribution than the widely used statistical methodology. Comparison of the significance of each input variable to RIP indicates that the fission gas release model is the most significant input variable. (author). 11 refs., 6 figs., 2 tabs
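    The "square sum" construction described above corresponds to a first-order system moment (error propagation) bound; a hedged reconstruction in generic notation (not the paper's exact formula) is:

```latex
% First-order system moment method with a conservative maximization over
% power histories h (a reconstruction, not the paper's exact notation):
\sigma_{\mathrm{RIP}}^{2,\max}
  \;=\; \sum_{i=1}^{n} \Bigl[\, \max_{h}\; S_i(h)\,\sigma_{x_i} \Bigr]^{2},
\qquad
S_i(h) \;=\; \frac{\partial\,\overline{\mathrm{RIP}}(h)}{\partial x_i},
```

    where the x_i are the input variables and the S_i the RIP sensitivity factors; taking the maximum of each term over power histories h is what removes the need to recompute the variance for every individual history.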

  12. Energy indicators for sustainable development: Guidelines and methodologies

    International Nuclear Information System (INIS)

    2005-04-01

    This publication is the product of an international initiative to define a set of Energy Indicators for Sustainable Development (EISD) and corresponding methodologies and guidelines. The successful completion of this work is the result of an intensive effort led by the International Atomic Energy Agency (IAEA) in cooperation with the United Nations Department of Economic and Social Affairs (UNDESA), the International Energy Agency (IEA), Eurostat and the European Environment Agency (EEA). The thematic framework, guidelines, methodology sheets and energy indicators set out in this publication reflect the expertise of these various agencies, recognized worldwide as leaders in energy and environmental statistics and analysis. While each agency has an active indicator programme, one goal of this joint endeavour has been to provide users with a consensus by leading experts on definitions, guidelines and methodologies for the development and worldwide use of a single set of energy indicators. No set of energy indicators can be final and definitive. To be useful, indicators must evolve over time to fit country-specific conditions, priorities and capabilities. The purpose of this publication is to present one set of EISD for consideration and use, particularly at the national level, and to serve as a starting point in the development of a more comprehensive and universally accepted set of energy indicators relevant to sustainable development. It is hoped that countries will use the EISD to assess their energy systems and to track their progress towards nationally defined sustainable development goals and objectives. It is also hoped that users of the information presented in this publication will contribute to refinements of energy indicators for sustainable development by adding their own unique perspectives to what is presented herein

  13. Energy indicators for sustainable development: Guidelines and methodologies

    International Nuclear Information System (INIS)

    2008-01-01

    This publication is the product of an international initiative to define a set of Energy Indicators for Sustainable Development (EISD) and corresponding methodologies and guidelines. The successful completion of this work is the result of an intensive effort led by the International Atomic Energy Agency (IAEA) in cooperation with the United Nations Department of Economic and Social Affairs (UNDESA), the International Energy Agency (IEA), Eurostat and the European Environment Agency (EEA). The thematic framework, guidelines, methodology sheets and energy indicators set out in this publication reflect the expertise of these various agencies, recognized worldwide as leaders in energy and environmental statistics and analysis. While each agency has an active indicator programme, one goal of this joint endeavour has been to provide users with a consensus by leading experts on definitions, guidelines and methodologies for the development and worldwide use of a single set of energy indicators. No set of energy indicators can be final and definitive. To be useful, indicators must evolve over time to fit country-specific conditions, priorities and capabilities. The purpose of this publication is to present one set of EISD for consideration and use, particularly at the national level, and to serve as a starting point in the development of a more comprehensive and universally accepted set of energy indicators relevant to sustainable development. It is hoped that countries will use the EISD to assess their energy systems and to track their progress towards nationally defined sustainable development goals and objectives. It is also hoped that users of the information presented in this publication will contribute to refinements of energy indicators for sustainable development by adding their own unique perspectives to what is presented herein

  14. K-Means Subject Matter Expert Refined Topic Model Methodology

    Science.gov (United States)

    2017-01-01

    computing environment, the Visual Basic for Applications (VBA) programming language presents the option as our programming language of choice. We propose... background, or access to other computational programming environments, to build topic models from free text datasets using a familiar Excel-based... environment that restricts access to other software-based text analytic tools. Opportunities to deploy developmental versions of the methodology and
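    The core technique can be shown compactly outside Excel; the sketch below (in Python rather than the report's VBA, purely for brevity) clusters TF-IDF document vectors with k-means and reads the top centroid terms off as candidate topics for subject matter expert refinement.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# K-means over TF-IDF vectors; top-weighted terms per centroid are read off
# as candidate "topics" for SME review. Documents below are toy examples.
docs = [
    "engine failure during taxi", "hydraulic leak on engine",
    "radio communication lost", "lost radio contact with tower",
    "fuel pump failure", "fuel leak detected pre-flight",
]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

terms = np.array(vec.get_feature_names_out())
for k, centroid in enumerate(km.cluster_centers_):
    top = terms[np.argsort(centroid)[::-1][:3]]
    print(f"topic {k}: {', '.join(top)}")  # the SME then refines labels/assignments
```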

  15. An automated methodology development. [software design for combat simulation

    Science.gov (United States)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  16. Development on design methodology of PWR passive containment system

    International Nuclear Information System (INIS)

    Lee, Seong Wook

    1998-02-01

    The containment is the most important barrier against the release of radioactive materials into the environment during accident conditions of nuclear power plants. Therefore the development of a reliable containment cooling system is one of the key areas in advanced reactor development. To enhance the safety of the containment system, many new containment system designs have been proposed and developed around the world. Several passive containment cooling system (PCCS) concepts for both steel and concrete containment systems are reviewed and assessed comparatively. The major concepts considered are: (a) the spray of water on the outer surface of a steel containment from an elevated tank, (b) an external moat for a steel containment, (c) a suppression pool for a concrete containment, and (d) a combination of internal sprays and internal or external condensers for a concrete containment. Emphasis is given to the heat removal principles, the required heat transfer area, system complexity and operational reliability. As one of the conceptual design steps for the containment, a methodology based on scaling principles is proposed to determine the containment size according to the power level. The AP600 containment system is selected as the reference containment to which the scaling laws are applied. Governing equations for the containment pressure are set up in consideration of the containment behavior in accident conditions. Then the dimensionless numbers which characterize the containment phenomena are derived for the blowdown-dominant and decay-heat-dominant stages, respectively. The important phenomena in the blowdown stage are the mass and energy sources and their absorption in the containment atmosphere or the containment structure, while heat transfer to the outer environment becomes important in the decay heat stage. Based on the similarity between the prototype and the model, the containment sizes are determined for higher power levels and are compared with the SPWR containment design values available

  17. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

    A brief introduction to plasma simulation using computers and to the difficulties encountered on currently available computers is given. Through the use of an analyzing and measuring methodology, SARA, the control flow and data flow of the particle simulation model REM2-1/2D are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  18. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving.

    Science.gov (United States)

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-10-11

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle's surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  19. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    Directory of Open Access Journals (Sweden)

    Jos Elfring

    2016-10-01

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  20. A New Methodology of Design and Development of Serious Games

    Directory of Open Access Journals (Sweden)

    André F. S. Barbosa

    2014-01-01

    The development of a serious game requires perfect knowledge of the learning domain to obtain the desired results. But it is also true that this may not be enough to develop a successful serious game. First of all, the player has to feel that he is playing a game in which learning is only a consequence of the playing actions. Otherwise, the game is viewed as boring rather than as a fun and engaging activity. For example, the player can catch some items in the scenario and then separate them according to their type (i.e., recycle them). Thus, the main action for the player is catching the items in the scenario, while the recycling action is a secondary action, viewed as a consequence of the first. Sometimes the game design relies on a detailed approach based on the ideas of the developers, because some educational contents are difficult to integrate into games while keeping the fun factor in first place. In this paper we propose a new methodology for the design and development of serious games that facilitates the integration of educational contents into the games. Furthermore, we present a serious game, called “Clean World”, created using this new methodology.

  1. Development of design and analysis methodology for composite bolted joints

    Science.gov (United States)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled hole and unnotched test data.

  2. Theoretical and methodological foundations of sustainable development of Geosystems

    Science.gov (United States)

    Mandryk, O. M.; Arkhypova, L. M.; Pukish, A. V.; Zelmanovych, A.; Yakovlyuk, Kh

    2017-05-01

    The theoretical and methodological foundations of the sustainable development of geosystems were further developed. The new scientific direction of “constructive hydroecology” was substantiated: the science that studies the hydrosphere from the standpoint of natural and technogenic safety on the basis of a geosystem approach. A structural subdivision of constructive hydroecology based on objective, subjective and application characteristics was established. The main object of study of the new scientific field is the hydroecological environment, understood as the part of the hydrosphere that forms a component of the multicomponent dynamic system influenced by engineering and economic human activities and, in turn, determines this activity to some extent.

  3. Modeling myocardial infarction in mice: methodology, monitoring, pathomorphology.

    Science.gov (United States)

    Ovsepyan, A A; Panchenkov, D N; Prokhortchouk, E B; Telegin, G B; Zhigalova, N A; Golubev, E P; Sviridova, T E; Matskeplishvili, S T; Skryabin, K G; Buziashvili, U I

    2011-01-01

    Myocardial infarction is one of the most serious and widespread diseases in the world. In this work, a minimally invasive method for simulating myocardial infarction in mice is described for the very first time in the Russian Federation; the procedure is carried out by ligation of the coronary artery or by controlled electrocoagulation. As part of the methodology, a series of anesthetic, microsurgical and revival protocols were designed, owing to which the postoperative mortality decreased from an initial 94.6% to 13.6%. ECG confirmed the development of large-focal or surface myocardial infarction. Postmortem histological examination confirmed the presence of necrosis foci in the heart muscles of 87.5% of the animals. Altogether, the medical data allow us to conclude that an adequate mouse model of myocardial infarction was generated. A further study is focused on the standardization of the experimental procedure and the use of genetically modified mouse strains, with the purpose of finding the most efficient therapeutic approaches for this disease.

  4. A Methodology to Assess Ionospheric Models for GNSS

    Science.gov (United States)

    Rovira-Garcia, Adria; Juan, José Miguel; Sanz, Jaume; González-Casado, Guillermo; Ibánez, Deimos

    2015-04-01

    Testing the accuracy of the ionospheric models used in Global Navigation Satellite Systems (GNSS) is a long-standing issue. It is still a challenging problem due to the lack of sufficiently accurate slant ionospheric determinations to be used as a reference. The present study proposes a methodology to assess any ionospheric model used in satellite-based applications and, in particular, GNSS ionospheric models. The methodology complements other analyses comparing the navigation based on different models to correct the code and carrier-phase observations. Specifically, the following ionospheric models are assessed: the operational models broadcast in the Global Positioning System (GPS), Galileo and the European Geostationary Navigation Overlay System (EGNOS); the post-process Global Ionospheric Maps (GIMs) from different analysis centers belonging to the International GNSS Service (IGS); and, finally, a new GIM computed by the gAGE/UPC research group. The methodology is based on the comparison between the predictions of the ionospheric model and actual unambiguous carrier-phase measurements from a global distribution of permanent receivers. The differences are separated into the hardware delays (a receiver constant plus a satellite constant) per data interval, e.g., a day. The condition that these Differential Code Biases (DCBs) are commonly shared throughout the worldwide network of receivers and satellites gives a global character to the assessment. This approach generalizes simple tests based on double-differenced Slant Total Electron Contents (STECs) between pairs of satellites and receivers on a much more local scale. The present study was conducted over the entire year 2014, i.e., the last Solar Maximum. The seasonal and latitudinal structures of the results clearly reflect the different strategies used by the different models. On one hand, ionospheric model corrections based on a grid (IGS-GIMs or EGNOS) are shown to be several times better than the models
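    The separation step lends itself to a compact sketch: model each model-minus-measurement residual as a receiver constant plus a satellite constant over a day, estimate the constants by least squares, and use the post-fit scatter as the assessment statistic. Everything below is synthetic.

```python
import numpy as np

# Synthetic residuals: receiver bias + satellite bias + measurement noise.
rng = np.random.default_rng(0)
n_recv, n_sat, n_obs = 5, 8, 400
recv = rng.integers(0, n_recv, n_obs)
sat = rng.integers(0, n_sat, n_obs)
true_r, true_s = rng.normal(0, 2, n_recv), rng.normal(0, 2, n_sat)
resid = true_r[recv] + true_s[sat] + rng.normal(0, 0.1, n_obs)  # TECU

# Design matrix with one column per receiver and per satellite; remove the
# rank deficiency (a common offset) by pinning the first satellite bias to 0.
A = np.zeros((n_obs, n_recv + n_sat))
A[np.arange(n_obs), recv] = 1.0
A[np.arange(n_obs), n_recv + sat] = 1.0
A = np.delete(A, n_recv, axis=1)  # drop the first satellite's column

est, *_ = np.linalg.lstsq(A, resid, rcond=None)
rms = np.sqrt(np.mean((resid - A @ est) ** 2))
print(f"Post-fit RMS (model assessment statistic): {rms:.3f} TECU")
```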

  5. Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology

    International Nuclear Information System (INIS)

    Knezevic, J.; Odoom, E.R.

    2001-01-01

    A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices utilising the fuzzy Lambda-Tau method. Fuzzy set theory is used for representing the failure rate and repair time instead of the classical (crisp) set theory, because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because, unlike the fault tree methodology, they allow efficient simultaneous generation of minimal cut and path sets
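    For two-input gates, the Lambda-Tau expressions are standard (OR gate: λ = λ1 + λ2, τ = (λ1τ1 + λ2τ2)/(λ1 + λ2); AND gate: λ = λ1λ2(τ1 + τ2), τ = τ1τ2/(τ1 + τ2)). The sketch below applies them vertex-wise to triangular fuzzy numbers, a common simplification of the full alpha-cut interval arithmetic; the component data are invented.

```python
# Failure rate (lambda) and repair time (tau) as triangular fuzzy numbers
# (low, mode, high). Gate formulas applied vertex-wise for brevity.

def tri_or(l1, t1, l2, t2):
    """OR gate: lambda = l1+l2, tau = (l1*t1 + l2*t2)/(l1+l2)."""
    lam = tuple(a + b for a, b in zip(l1, l2))
    tau = tuple((a * x + b * y) / (a + b) for a, b, x, y in zip(l1, l2, t1, t2))
    return lam, tau

def tri_and(l1, t1, l2, t2):
    """AND gate: lambda = l1*l2*(t1+t2), tau = t1*t2/(t1+t2)."""
    lam = tuple(a * b * (x + y) for a, b, x, y in zip(l1, l2, t1, t2))
    tau = tuple(x * y / (x + y) for x, y in zip(t1, t2))
    return lam, tau

pump = ((4.5e-4, 5e-4, 5.5e-4), (3.5, 4.0, 4.5))   # (lambda, tau), expert spread
valve = ((0.9e-4, 1e-4, 1.1e-4), (1.8, 2.0, 2.2))

lam, tau = tri_or(pump[0], pump[1], valve[0], valve[1])
print("subsystem lambda (triangular):", lam)
print("subsystem tau (triangular):", tau)
```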

  6. TECHNOLOGY FOR DEVELOPMENT OF ELECTRONIC TEXTBOOK ON HANDICRAFTS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Iryna V. Androshchuk

    2017-10-01

    The main approaches to defining the concept of an electronic textbook are analyzed in the article. The main advantages of electronic textbooks in the context of future teachers' training are outlined: interactivity, feedback provision, and the availability of navigation and a search engine. The author presents and characterizes the main stages in the technology of developing an electronic textbook on Handicraft and Technology Training Methodology: determination of its role and significance in the process of mastering the discipline; justification of its structure; and an outline of the stages of its development in accordance with the defined structure. The characteristic feature of the developed electronic textbook is the availability of a macro- and microstructure. The macrostructure is viewed as the sequence of components of the electronic textbook manifested in its content; the microstructure is the internal pattern of each component of the macrostructure.

  7. CFD methodology development for Singapore Green Mark Building application

    NARCIS (Netherlands)

    Chiu, P.H.; Raghavan, V.S.G.; Poh, H.J.; Tan, E.; Gabriela, O.; Wong, N.H.; van Hooff, T.; Blocken, B.; Li, R.; Leong-Kok, S.M.

    2017-01-01

    In the recent decade, investigation of total building performance has become increasingly important for the environmental modelling community. With the advances in integrated design and modelling tools and Building Information Modelling (BIM) development, it is now possible to simulate and predict

  8. Teaching methodology for modeling reference evapotranspiration with artificial neural networks

    OpenAIRE

    Martí, Pau; Pulido Calvo, Inmaculada; Gutiérrez Estrada, Juan Carlos

    2015-01-01

    Artificial neural networks are a robust alternative to conventional models for estimating different targets in irrigation engineering, among others reference evapotranspiration, a key variable for estimating crop water requirements. This paper presents a didactic methodology for introducing students to the application of artificial neural networks for reference evapotranspiration estimation using MatLab. Apart from learning a specific application of this software wi...
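    A Python transcription of the classroom exercise (the paper itself uses MatLab) could be as small as the following; the weather-to-ET0 "ground truth" is a crude temperature- and radiation-based proxy so that the example stays self-contained.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Train a small neural network to map daily weather inputs to reference
# evapotranspiration (ET0). The "measured" ET0 is a synthetic proxy.
rng = np.random.default_rng(0)
n = 500
T = rng.uniform(5, 35, n)      # mean air temperature (degC)
Rs = rng.uniform(5, 30, n)     # solar radiation (MJ/m2/day)
RH = rng.uniform(20, 90, n)    # relative humidity (%)
et0 = 0.0135 * Rs * (T + 17.8) * (1 - RH / 200) + rng.normal(0, 0.2, n)

X = np.column_stack([T, Rs, RH])
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000,
                                   random_state=0))
model.fit(X[:400], et0[:400])
print(f"Test R^2: {model.score(X[400:], et0[400:]):.3f}")
```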

  9. Selection of low-level radioactive waste disposal sites using screening models versus more complex methodologies

    International Nuclear Information System (INIS)

    Uslu, I.; Fields, D.E.

    1993-01-01

    The task of choosing a waste-disposal site from a set of candidate sites requires an approach capable of objectively handling many environmental variables for each site. Several computer methodologies have been developed to assist in the process of choosing a site for the disposal of low-level radioactive waste; however, most of these models are costly to apply, in terms of computer resources and the time and effort required by professional modelers, geologists, and waste-disposal experts. The authors describe how the relatively simple DRASTIC methodology (a standardized system for evaluating groundwater pollution potential using hydrogeologic settings) may be used for "pre-screening" of sites to determine which subset of candidate sites is worthy of more detailed screening. Results of site comparisons made with DRASTIC are compared with results obtained using PRESTO-II methodology, which is representative of the more complex release-transport-human exposure methodologies. 6 refs., 1 fig., 1 tab
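    For illustration only, a pre-screening pass with the DRASTIC index, a weighted sum of seven hydrogeologic ratings, might look like the following sketch; the weights are the standard published ones, while the site names and ratings are invented:

```python
# DRASTIC index: weighted sum of seven hydrogeologic ratings (1-10):
# Depth to water, net Recharge, Aquifer media, Soil media, Topography,
# Impact of vadose zone, hydraulic Conductivity.
DRASTIC_WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

def drastic_index(ratings):
    return sum(DRASTIC_WEIGHTS[k] * ratings[k] for k in DRASTIC_WEIGHTS)

sites = {  # invented candidate-site ratings
    "site_A": {"D": 3, "R": 4, "A": 5, "S": 6, "T": 9, "I": 4, "C": 2},
    "site_B": {"D": 7, "R": 6, "A": 6, "S": 5, "T": 3, "I": 8, "C": 6},
    "site_C": {"D": 2, "R": 3, "A": 4, "S": 7, "T": 8, "I": 3, "C": 3},
}
# Lower pollution potential first; keep the best two for detailed
# (PRESTO-II-style) modelling.
shortlist = sorted(sites, key=lambda s: drastic_index(sites[s]))[:2]
print([(s, drastic_index(sites[s])) for s in shortlist])
```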

  10. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

    KALIMER Database is an advanced database for the integrated management of Liquid Metal Reactor design technology development using Web applications. The KALIMER design database consists of a Results Database, an Inter-Office Communication (IOC) system, a 3D CAD database, a Team Cooperation System, and Reserved Documents. The Results Database holds the research results of phase II of the mid- and long-term nuclear R and D programme for Liquid Metal Reactor design technology development. IOC is a linkage control system between sub-projects to share and integrate the research results for KALIMER. The 3D CAD database provides a schematic design overview of KALIMER. The Team Cooperation System informs team members of research cooperation and meetings. Finally, the KALIMER Reserved Documents module was developed to manage the data and documents collected over the course of the project. This report describes the features of the hardware and software and the database design methodology for KALIMER

  11. A methodology and supply chain management inspired reference ontology for modeling healthcare teams.

    Science.gov (United States)

    Kuziemsky, Craig E; Yazdi, Sara

    2011-01-01

    Numerous studies and strategic plans are advocating more team based healthcare delivery that is facilitated by information and communication technologies (ICTs). However before we can design ICTs to support teams we need a solid conceptual model of team processes and a methodology for using such a model in healthcare settings. This paper draws upon success in the supply chain management domain to develop a reference ontology of healthcare teams and a methodology for modeling teams to instantiate the ontology in specific settings. This research can help us understand how teams function and how we can design ICTs to support teams.

  12. System study methodology development and potential utilization for fusion

    International Nuclear Information System (INIS)

    Djerassi, H.; Rouillard, J.; Leger, D.; Sarto, S.; Zappellini, G.; Gambi, G.

    1989-01-01

    The objective of this new methodology is to combine systemics with heuristics for engineering applications. The system method considers as a whole a set of dynamically interacting elements, organized for tasks. Heuristics tries to describe the rules to apply in scientific research. This methodology is a powerful tool for evaluating options: compared with conventional analytical methods, a higher number of parameters can be taken into account, and the possible options can be compared to a higher quality standard. The system method takes into account interacting data or random relationships by means of simulation modelling. Thus, a dynamical approach can be deduced and a sensitivity analysis can be performed for a very high number of options and basic data. This method can be limited to a specific objective such as a fusion reactor safety analysis, taking into account other major constraints such as the economic environment. The sophisticated architecture of a fusion reactor includes a large number of interacting systems. The new character of the fusion domain and the wide spectrum of the possible options strongly increase the advantages of a system study, as a complete safety analysis can be defined before starting with the design. (orig.)

  13. Development of a heat exchanger root-cause analysis methodology

    International Nuclear Information System (INIS)

    Jarrel, D.B.

    1989-01-01

    The objective of this work is to determine a generic methodology for approaching the accurate identification of the root cause of component failure. Root-cause determinations are an everyday challenge to plant personnel, but they are handled with widely differing degrees of success due to the approaches, levels of diagnostic expertise, and documentation. The criterion for success is simple: If the root cause of the failure has truly been determined and corrected, the same causal failure relationship will not be demonstrated again in the future. The approach to root-cause analysis (RCA) element definition was to first selectively choose and constrain a functionally significant component (in this case a component cooling water to service water heat exchanger) that has demonstrated prevalent failures. Then a root cause of failure analysis was performed by a systems engineer on a large number of actual failure scenarios. The analytical process used by the engineer was documented and evaluated to abstract the logic model used to arrive at the root cause. For the case of the heat exchanger, the actual root-cause diagnostic approach is described. A generic methodology for the solution of the root cause of component failure is demonstrable for this general heat exchanger sample

  14. CHARACTERISTICS OF RESEARCH METHODOLOGY DEVELOPMENT IN SPECIAL EDUCATION AND REHABILITATION

    Directory of Open Access Journals (Sweden)

    Natasha ANGELOSKA-GALEVSKA

    2004-12-01

    Full Text Available The aim of the text is to point out the developmental tendencies in the research methodology of special education and rehabilitation, worldwide and in our country, and to emphasize the importance of the methodological training of students in special education and rehabilitation at the Faculty of Philosophy in Skopje. Scientific knowledge achieved through research is the fundamental precondition for the development of special education and rehabilitation theory and practice. The results of scientific work sometimes cause small, insignificant changes, but, at times, they make radical changes. Thanks to scientific research and knowledge, certain prejudices have been rejected. For example, in the sixth decade of the last century there was a strong prejudice that mentally retarded children should be segregated from society as aggressive and unfriendly, or that deaf children should not learn sign language because they would not be motivated to learn lip-reading and would hardly adapt. Piaget and his colleagues at the Geneva Institute were the pioneers in researching this field, and they established that handicapped children are not handicapped in every domain and have potentials that can be developed and improved by systematic and organized work. It is important to initiate further research in the field of special education and rehabilitation, as well as a critical analysis of the research already carried out. Further development of scientific research in special education and rehabilitation should be a base for education policy on people with disabilities and for the development of institutional and non-institutional treatment of this population.

  15. Development of a methodology for classifying software errors

    Science.gov (United States)

    Gerhart, S. L.

    1976-01-01

    A mathematical formalization of the intuition behind classification of software errors is devised and then extended to a classification discipline: Every classification scheme should have an easily discernible mathematical structure and certain properties of the scheme should be decidable (although whether or not these properties hold is relative to the intended use of the scheme). Classification of errors then becomes an iterative process of generalization from actual errors to terms defining the errors together with adjustment of definitions according to the classification discipline. Alternatively, whenever possible, small scale models may be built to give more substance to the definitions. The classification discipline and the difficulties of definition are illustrated by examples of classification schemes from the literature and a new study of observed errors in published papers of programming methodologies.

  16. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    Science.gov (United States)

    Mokrova, Nataliya V.

    2018-03-01

    The methodology of system analysis allows us to derive a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process was proposed. The importance of the quenching rate and of the initial temperature decrease time was confirmed for producing the maximum amount of the target product. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal quenching mode. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.

  17. Urban Agglomerations in Regional Development: Theoretical, Methodological and Applied Aspects

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Shmidt

    2016-09-01

    Full Text Available The article focuses on the analysis of a major process of modern socio-economic development: the functioning of urban agglomerations. A short background of the economic literature on this phenomenon is given, covering both the traditional conceptions (the concentration of urban types of activities, the grouping of urban settlements by intensive production and labour communications) and the modern ones (cluster theories, theories of the network society). Two methodological principles of studying agglomeration are emphasized: the principle of the unity of the spatial concentration of economic activity, and the principle of compact living of the population. The positive and negative effects of agglomeration in the economic and social spheres are studied. It is concluded that agglomeration is helpful when it brings an agglomeration economy (the positive benefits from it exceed the additional costs). A methodology for examining an urban agglomeration and its role in regional development is offered. Testing this methodology on the example of Chelyabinsk and the Chelyabinsk region has made it possible to carry out a comparative analysis of the regional centre and the whole region by the main socio-economic indexes under static and dynamic conditions, and to draw conclusions on the position of the city and the region based on such socio-economic indexes as the average monthly nominal accrued wage, the cost of fixed assets, investments into fixed capital, new housing supply, retail turnover, and the volume of self-produced shipped goods, works and services performed in the region. In the study, the analysis of the launching site of the Chelyabinsk agglomeration is carried out. It has revealed the main characteristics of the core of the agglomeration in Chelyabinsk (structural features, population, level of centralization of the core) as well as of the Chelyabinsk agglomeration in general (coefficient of agglomeration

  18. METHODOLOGY OF RESEARCH AND DEVELOPMENT MANAGEMENT OF REGIONAL NETWORK ECONOMY

    Directory of Open Access Journals (Sweden)

    O.I. Botkin

    2007-06-01

    Full Text Available The informatization of practically all branches of the regional Russian economy and of its economic agents means that information and communication Internet technologies exert a huge influence on the development of economic relations in the environment of regional business: new forms of interaction between economic agents appear, and the information and organizational structures of regional business management change. The integrated image of the above innovations is the regional network economy: an interactive environment in which social-economic and commodity-money relations between the economic agents of a region are performed at high speed and with minimal transaction costs (in R. H. Coase's sense), using the interactive opportunities of the global Internet network. The urgency of researching the phenomenon of the regional network economy is caused, first of all, by the necessity of substantiating a methodology for the development of the regional network economy and mechanisms for managing its infrastructure, with the purpose of increasing regional business efficiency. In our opinion, the solution of these problems will be the defining factor in maintaining effective economic development and growth of the Russian regions' economies in the near future.

  19. HRS Clinical Document Development Methodology Manual and Policies: Executive summary.

    Science.gov (United States)

    Indik, Julia H; Patton, Kristen K; Beardsall, Marianne; Chen-Scarabelli, Carol A; Cohen, Mitchell I; Dickfeld, Timm-Michael L; Haines, David E; Helm, Robert H; Krishnan, Kousik; Nielsen, Jens Cosedis; Rickard, John; Sapp, John L; Chung, Mina

    2017-10-01

    The Heart Rhythm Society (HRS) has been developing clinical practice documents in collaboration and partnership with other professional medical societies since 1996. The HRS formed a Scientific and Clinical Documents Committee (SCDC) with the sole purpose of managing the development of these documents from conception through publication. The SCDC oversees the process for developing clinical practice documents, with input and approval from the HRS Executive Committee and the Board of Trustees. As of May 2017, the HRS has produced more than 80 publications with other professional organizations. This process manual is produced to publicly and transparently declare the standards by which the HRS develops clinical practice documents, which include clinical practice guidelines, expert consensus statements, scientific statements, clinical competency statements, task force policy statements, and proceedings statements. The foundation for this process is informed by the Institute of Medicine's standards for developing trustworthy clinical practice guidelines; the new criteria from the National Guidelines Clearinghouse, effective June 2014; SCDC member discussions; and a review of guideline policies and methodologies used by other professional organizations. Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  20. Developing a business analytics methodology: a case study in the foodbank sector

    OpenAIRE

    Hindle, Giles; Vidgen, Richard

    2017-01-01

    The current research seeks to address the following question: how can organizations align their business analytics development projects with their business goals? To pursue this research agenda we adopt an action research framework to develop and apply a business analytics methodology (BAM). The four-stage BAM (problem situation structuring, business model mapping, analytics leverage analysis, and analytics implementation) is not a prescription. Rather, it provides a logical structure and log...

  1. Development of analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kim, Cheol Woo; Kwon, Young Min; Kim, Sook Kwan

    1995-04-01

    A study for the development of an analysis methodology for hot leg break mass and energy release is performed. For the blowdown period a modified CEFLASH-4A methodology is suggested. For the post-blowdown period a modified CONTRAST boil-off model is suggested. By using these computer codes, improved mass and energy release data are generated. In addition, a RELAP5/MOD3 analysis is performed and, finally, the FLOOD-3 computer code has been modified for use in the analysis of hot leg break. The results of the analysis using the modified FLOOD-3 are reasonable as expected and their trends are good. 66 figs., 8 tabs. (Author)

  2. Development of new methodology for dose calculation in photographic dosimetry

    International Nuclear Information System (INIS)

    Daltro, T.F.L.

    1994-01-01

    A new methodology for equivalent dose calculations has been developed at IPEN-CNEN/SP to be applied at the Photographic Dosimetry Laboratory using artificial intelligence techniques by means of a neural network. The research was oriented towards the optimization of the whole set of parameters involved in the film processing, going from the irradiation in order to obtain the calibration curve up to the optical density readings. The learning of the neural network was performed by taking the readings of optical density from the calibration curve as input and the effective energy and equivalent dose as output. The results obtained in the intercomparison show an excellent agreement with the actual values of dose and energy given by the National Metrology Laboratory of Ionizing Radiation. (author)
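    The mapping described, optical-density readings in, effective energy and equivalent dose out, can be sketched with any small feed-forward network; the following uses scikit-learn on synthetic stand-in data (the invented relations below are placeholders for real calibration films, not the laboratory's actual data or network):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Synthetic stand-in for calibration data: 4 optical-density readings per
# film (e.g. under different filters) -> (effective energy, equivalent dose).
X = rng.uniform(0.1, 3.0, size=(200, 4))
energy = 20 + 60 * X[:, 0] / (X[:, 1] + 0.1)   # invented relation (keV)
dose = 0.5 * X[:, 2] + 0.3 * X[:, 3]           # invented relation (mSv)
Y = np.column_stack([energy, dose])

# Small multilayer perceptron trained on the first 150 films,
# evaluated on the held-out 50.
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
net.fit(X[:150], Y[:150])
print("held-out R^2:", net.score(X[150:], Y[150:]))
```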

  3. Methodology on the sparger development for Korean next generation reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hwan Yeol; Hwang, Y.D.; Kang, H.S.; Cho, B.H.; Park, J.K

    1999-06-01

    In case of an accident, the safety depressurization system of the Korean Next Generation Reactor (KNGR) efficiently depressurizes the reactor by directly discharging steam of high pressure and temperature from the pressurizer into the in-containment refuelling water storage tank (IRWST) through spargers. This report was generated for the purpose of developing the sparger of KNGR. It presents the methodology based on the application of the ABB-Atom approach. Many thermal hydraulic parameters affecting the maximum bubble cloud pressure were obtained, the maximum bubble cloud pressure transient curve, the so-called forcing function of KNGR, was suggested, and design inputs for the IRWST (bubble cloud radius vs. time, bubble cloud velocity vs. time, bubble cloud acceleration vs. time, etc.) were generated analytically using the Rayleigh-Plesset equation. (author). 17 refs., 6 tabs., 27 figs.
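    A minimal sketch of the analytic core mentioned above: integrating the classical (inviscid, no surface tension) Rayleigh-Plesset equation for a gas bubble to obtain radius and wall-velocity histories of the kind used as IRWST design inputs. All parameter values are invented and the adiabatic gas law is an assumed closure, not the report's actual model:

```python
import numpy as np
from scipy.integrate import solve_ivp

rho, p_inf = 1000.0, 101325.0       # water density (kg/m^3), far-field pressure (Pa)
p_g0, R0 = 3 * p_inf, 0.05          # invented initial gas pressure and radius
gamma = 1.4                         # polytropic exponent of the bubble gas

def rayleigh_plesset(t, y):
    R, Rdot = y
    p_gas = p_g0 * (R0 / R) ** (3 * gamma)   # adiabatic gas in the bubble
    # Classical inviscid Rayleigh-Plesset: R*R'' + 1.5*R'^2 = (p_gas - p_inf)/rho
    Rddot = ((p_gas - p_inf) / rho - 1.5 * Rdot**2) / R
    return [Rdot, Rddot]

sol = solve_ivp(rayleigh_plesset, (0.0, 0.05), [R0, 0.0], max_step=1e-5)
R, Rdot = sol.y        # radius and wall-velocity histories vs. time
print("max radius:", R.max(), "m, peak collapse speed:", Rdot.min(), "m/s")
```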

  4. Turbofan Engine Core Compartment Vent Aerodynamic Configuration Development Methodology

    Science.gov (United States)

    Hebert, Leonard J.

    2006-01-01

    This paper presents an overview of the design methodology used in the development of the aerodynamic configuration of the nacelle core compartment vent for a typical Boeing commercial airplane together with design challenges for future design efforts. Core compartment vents exhaust engine subsystem flows from the space contained between the engine case and the nacelle of an airplane propulsion system. These subsystem flows typically consist of precooler, oil cooler, turbine case cooling, compartment cooling and nacelle leakage air. The design of core compartment vents is challenging due to stringent design requirements, mass flow sensitivity of the system to small changes in vent exit pressure ratio, and the need to maximize overall exhaust system performance at cruise conditions.

  5. Development of a low-level waste risk methodology

    International Nuclear Information System (INIS)

    Fisher, J.E.; Falconer, K.L.

    1984-01-01

    A probabilistic risk assessment method is presented for performance evaluation of low-level waste disposal facilities. The associated program package calculates the risk associated with postulated radionuclide release and transport scenarios. Risk is computed as the mathematical product of two statistical variables: the dose consequence of a given release scenario, and its occurrence probability. A sample risk calculation is included which demonstrates the method. This PRA method will facilitate evaluation of facility performance, including identification of high risk scenarios and their mitigation via optimization of site parameters. The method is intended to be used in facility licensing as a demonstration of compliance with the performance objectives set forth in 10 CFR Part 61, or in corresponding state regulations. The Low-Level Waste Risk Methodology is being developed under sponsorship of the Nuclear Regulatory Commission
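    The risk definition used here, scenario risk as the mathematical product of dose consequence and occurrence probability, reduces to a few lines; the scenario names and numbers below are invented for illustration only:

```python
# Hypothetical scenarios: (annual occurrence probability, dose consequence in Sv)
scenarios = {
    "cover_erosion":  (1e-3, 2e-4),
    "well_intrusion": (5e-5, 3e-2),
    "groundwater":    (2e-2, 1e-5),
}
# Risk per scenario = probability x consequence; sum gives the facility total.
risks = {name: p * c for name, (p, c) in scenarios.items()}
total_risk = sum(risks.values())                       # expected Sv per year
# Flag the high-risk scenarios that would drive site-parameter optimization.
dominant = [n for n, r in risks.items() if r > 0.2 * total_risk]
print(risks, "dominant:", dominant)
```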

  6. Methodology on the sparger development for Korean next generation reactor

    International Nuclear Information System (INIS)

    Kim, Hwan Yeol; Hwang, Y.D.; Kang, H.S.; Cho, B.H.; Park, J.K.

    1999-06-01

    In case of an accident, the safety depressurization system of the Korean Next Generation Reactor (KNGR) efficiently depressurizes the reactor by directly discharging steam of high pressure and temperature from the pressurizer into the in-containment refuelling water storage tank (IRWST) through spargers. This report was generated for the purpose of developing the sparger of KNGR. It presents the methodology based on the application of the ABB-Atom approach. Many thermal hydraulic parameters affecting the maximum bubble cloud pressure were obtained, the maximum bubble cloud pressure transient curve, the so-called forcing function of KNGR, was suggested, and design inputs for the IRWST (bubble cloud radius vs. time, bubble cloud velocity vs. time, bubble cloud acceleration vs. time, etc.) were generated analytically using the Rayleigh-Plesset equation. (author). 17 refs., 6 tabs., 27 figs

  7. Development of new methodology for dose calculation in photographic dosimetry

    International Nuclear Information System (INIS)

    Daltro, T.F.L.; Campos, L.L.

    1994-01-01

    A new methodology for equivalent dose calculation has been developed at IPEN-CNEN/SP to be applied at the Photographic Dosimetry Laboratory using artificial intelligence techniques by means of a neural network. The research was oriented towards the optimization of the whole set of parameters involved in the film processing, going from the irradiation in order to obtain the calibration curve up to the optical density readings. The learning of the neural network was performed by taking readings of optical density from the calibration curve as input and the effective energy and equivalent dose as output. The results obtained in the intercomparison show an excellent agreement with the actual values of dose and energy given by the National Metrology Laboratory of Ionizing Radiation

  8. SPRINT RA 230: Methodology for knowledge based developments

    International Nuclear Information System (INIS)

    Wallsgrove, R.; Munro, F.

    1991-01-01

    SPRINT RA 230: A Methodology for Knowledge Based Developments, funded by the European Commission, was set up to investigate the use of KBS in the engineering industry. Its aim was to find out how KBS were currently used and what people's conceptions of them were, to disseminate current knowledge and to recommend further research in this area. A survey (by post and face-to-face interviews) was carried out under SPRINT RA 230 to investigate requirements for more intelligent software. In the survey we looked both at how people think about Knowledge Based Systems (KBS), what they find useful and what is not useful, and at what current expertise problems or limitations of conventional software might suggest KBS solutions. (orig./DG)

  9. Development and evaluation of clicker methodology for introductory physics courses

    Science.gov (United States)

    Lee, Albert H.

    Many educators understand that lectures are cost effective but not learning efficient, so they continue to search for ways to increase active student participation in this traditionally passive learning environment. In-class polling systems, or "clickers", are inexpensive and reliable tools allowing students to actively participate in lectures by answering multiple-choice questions. Students assess their learning in real time by observing instant polling summaries displayed in front of them. This in turn motivates additional discussions which increase the opportunity for active learning. We wanted to develop a comprehensive clicker methodology that creates an active lecture environment for a broad spectrum of students taking introductory physics courses. We wanted our methodology to incorporate many findings of contemporary learning science. It is recognized that learning requires active construction; students need to be actively involved in their own learning process. Learning also depends on preexisting knowledge; students construct new knowledge and understandings based on what they already know and believe. Learning is context dependent; students who have learned to apply a concept in one context may not be able to recognize and apply the same concept in a different context, even when both contexts are considered to be isomorphic by experts. On this basis, we developed question sequences, each involving the same concept but having different contexts. Answer choices are designed to address students' preexisting knowledge. These sequences are used with the clickers to promote active discussions and multiple assessments. We have created, validated, and evaluated sequences sufficient in number to populate all introductory physics courses. Our research has found that using clickers with our question sequences significantly improved student conceptual understanding. Our research has also found how to best measure student conceptual gain using research-based instruments.

  10. In-house developed methodologies and tools for decommissioning projects

    International Nuclear Information System (INIS)

    Detilleux, Michel; Centner, Baudouin

    2007-01-01

    The paper describes different methodologies and tools developed in-house by Tractebel Engineering to facilitate the engineering works to be carried out especially in the frame of decommissioning projects. Three examples of tools with their corresponding results are presented: - The LLWAA-DECOM code, software developed for the radiological characterization of contaminated systems and equipment. The code constitutes a specific module of more general software that was originally developed to characterize radioactive waste streams in order to be able to declare the radiological inventory of critical nuclides, in particular difficult-to-measure radionuclides, to the Authorities. In the case of LLWAA-DECOM, deposited activities inside contaminated equipment (piping, tanks, heat exchangers...) and scaling factors between nuclides, at any given time of the decommissioning time schedule, are calculated on the basis of physical characteristics of the systems and of operational parameters of the nuclear power plant. This methodology was applied to assess decommissioning costs of Belgian NPPs, to characterize the primary system of Trino NPP in Italy, to characterize the equipment of miscellaneous circuits of Ignalina NPP and of Kozloduy unit 1 and to calculate remaining dose rates around equipment in the frame of the preparation of decommissioning activities; - The VISIMODELLER tool, a user-friendly CAD interface developed to ease the introduction of lay-out areas in software named VISIPLAN. VISIPLAN is a 3D dose rate assessment tool for ALARA work planning, developed by the Belgian Nuclear Research Centre SCK.CEN. Both software packages were used for projects such as the steam generator replacements in Belgian NPPs or the preparation of the decommissioning of units 1 and 2 of Kozloduy NPP; - The DBS software, developed to manage the different kinds of activities that are part of the general time schedule of a decommissioning project. For each activity, when relevant

  11. Trends in scenario development methodologies and integration in NUMO's approach

    International Nuclear Information System (INIS)

    Ebashi, Takeshi; Ishiguro, Katsuhiko; Wakasugi, Keiichiro; Kawamura, Hideki; Gaus, Irina; Vomvoris, Stratis; Martin, Andrew J.; Smith, Paul

    2011-01-01

    The development of scenarios for quantitative or qualitative analysis is a key element of the assessment of the safety of geological disposal systems. As an outcome of an international workshop attended by European and the Japanese implementers, a number of features common to current methodologies could be identified, as well as trends in their evolution over time. In the late nineties, scenario development was often described as a bottom-up process, whereby scenarios were said to be developed in essence from FEP databases. Nowadays, it is recognised that, in practice, the approaches actually adopted are better described as top-down or 'hybrid', taking as their starting point an integrated (top-down) understanding of the system under consideration including uncertainties in initial state, sometimes assisted by the development of 'storyboards'. A bottom-up element remains (hence the term 'hybrid') to the extent that FEP databases or FEP catalogues (including interactions) are still used, but the focus is generally on completeness checking, which occurs parallel to the main assessment process. Recent advances focus on the consistent treatment of uncertainties throughout the safety assessment and on the integration of operational safety and long term safety. (author)

  12. Qualified software development methodologies for nuclear class 1E equipment

    International Nuclear Information System (INIS)

    Koch, Shlomo; Ruether, J.

    1992-01-01

    This article describes the experience gained at Northern States Power and Spectrum Technologies during the development of a computer-based Safeguard Load Sequencer for Prairie Island Nuclear Generating Plant. The Safeguard Load Sequencer (SLS) performs the functions of 4kV emergency bus voltage restoration, load shedding, and emergency diesel generator loading. The system is designed around an Allen-Bradley PLC-5 programmable controller. The Safeguard Load Sequencer is the vehicle to demonstrate the software engineering procedures and methodologies. The article analyzes the requirements imposed by the NUREG 4640 handbook and the relevant IEEE standards. The article tries to answer the question of what software engineering is, and describes the waterfall life-cycle phases of software development. The effects of each phase on software quality and on the V and V plan are described. Issues in designing a V and V plan are addressed, and considerations of the cost and time to implement the program are described. The article also addresses the subject of tools that can increase productivity and reduce the cost and time of an extensive V and V plan. It describes the tools the authors used and, more importantly, presents a wish list of tools that they as developers would like to have. The role of testing is presented. The authors show that testing at the final stage has a lower impact on software quality than generally assumed. Full coverage of testing is almost always impossible, and they demonstrate how alternative audits and tests during the development phase can improve software reliability

  13. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  14. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    International Nuclear Information System (INIS)

    Dorp, F. van

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all the steps in the recommended methodology. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example, because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The starting point for the methodology has three elements. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development. The International FEP List includes FEPs to do with the assessment context. The context examined in detail by

  15. System study methodology. Development and potential utilization for fusion

    International Nuclear Information System (INIS)

    Djerassi, H.; Rouillard, J.; Leger, D.; Zappellini, G.; Gambi, G.

    1988-01-01

    The objective of this new methodology is to combine systemics with heuristics for engineering applications. The system method considers as a whole a set of dynamically interacting elements, organized for tasks. Heuristics tries to make explicit the rules to apply in scientific research. This methodology is a powerful tool for evaluating options: compared with conventional analytical methods, a higher number of parameters can be taken into account, and the possible options can be compared to a higher quality standard. The system method takes into account interacting data or random relationships by means of simulation modelling. Thus, a dynamical approach can be deduced and a sensitivity analysis can be performed for a very high number of options and basic data. Collection of experimental values, analysis of the problem, search for solutions, sizing of the installation from defined functions, cost evaluation (planning and operating) and ranking of the options with regard to all the constraints are the main points considered in the method's application. This method can be limited to a specific objective such as a fusion reactor safety analysis. The possibility of taking into account all the options, possible accidents, quality assurance, exhaustiveness of the safety analysis, identification of the residual risk and modelling of the results are the main advantages of this approach. The sophisticated architecture of a fusion reactor includes a large number of interacting systems. The new character of the fusion domain and the wide spectrum of the possible options strongly increase the advantages of a system study, as a complete safety analysis can be defined before starting with the design

  16. Development of Accident Scenarios and Quantification Methodology for RAON Accelerator

    International Nuclear Information System (INIS)

    Lee, Yongjin; Jae, Moosung

    2014-01-01

    The RISP (Rare Isotope Science Project) plans to provide neutron-rich isotopes (RIs) and stable heavy ion beams. The accelerator is defined as a radiation production system according to the Nuclear Safety Law. Therefore, it needs strict operating procedures and safety assurance to prevent radiation exposure. In order to satisfy this condition, there is a need to evaluate the potential risk of the accelerator from the design stage itself. Though some PSA research has been conducted for accelerators, most of it focuses not on general accident sequences but on simple descriptions of accidents. In this paper, general accident scenarios are developed by Event Tree analysis and a new quantification methodology for the Event Tree is deduced. In this study, some initiating events, which may occur in the accelerator, are selected. Using the selected initiating events, the accident scenarios of the accelerator facility are developed with Event Trees. These results can be used as basic data of the accelerator for future risk assessments. After analyzing the probability of each heading, it is possible to conduct quantification and evaluate the significance of the accident result. If accident scenarios for external events are also developed, the risk assessment of the entire accelerator facility will be completed. To reduce the uncertainty of the Event Tree, it is possible to produce reliable data via the presented quantification techniques
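    A sketch of the quantification idea: multiply an initiating-event frequency by the success or failure probability of each heading along every branch of the Event Tree. The headings and all numbers below are invented for illustration, not the paper's actual data:

```python
from itertools import product

init_freq = 0.1   # invented initiating-event frequency (per year)
# Invented heading failure probabilities (mitigation systems):
headings = {"beam_stop": 1e-2, "interlock": 5e-3, "shielding": 1e-1}

# Enumerate every branch: 0 = heading succeeds, 1 = heading fails.
sequences = {}
for outcome in product([0, 1], repeat=len(headings)):
    p = init_freq
    for (name, pf), failed in zip(headings.items(), outcome):
        p *= pf if failed else (1.0 - pf)
    sequences[outcome] = p

# e.g. frequency of the sequence in which every mitigation heading fails:
print(sequences[(1, 1, 1)])
```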

  17. Methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Mazumdar, M.; Marshall, J.A.; Chay, S.C.; Gay, R.

    1976-07-01

    In February 1975, Westinghouse Electric Corporation, under contract to Electric Power Research Institute, started a one-year program to develop methodology for statistical evaluation of nuclear-safety-related engineering analyses. The objectives of the program were to develop an understanding of the relative efficiencies of various computational methods which can be used to compute probability distributions of output variables due to input parameter uncertainties in analyses of design basis events for nuclear reactors and to develop methods for obtaining reasonably accurate estimates of these probability distributions at an economically feasible level. A series of tasks was set up to accomplish these objectives. Two of the tasks were to investigate the relative efficiencies and accuracies of various Monte Carlo and analytical techniques for obtaining such estimates for a simple thermal-hydraulic problem whose output variable of interest is given in a closed-form relationship of the input variables and to repeat the above study on a thermal-hydraulic problem in which the relationship between the predicted variable and the inputs is described by a short-running computer program. The purpose of the report presented is to document the results of the investigations completed under these tasks, giving the rationale for choices of techniques and problems, and to present interim conclusions
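    A minimal sketch of the Monte Carlo technique investigated in the first task, propagating input-parameter uncertainty through a closed-form output relation to estimate the output's probability distribution; the relation and the input distributions below are invented stand-ins for the thermal-hydraulic problem:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
# Invented closed-form relation: peak temperature as a function of
# heat flux q, flow rate m and heat-transfer coefficient h (normalized).
q = rng.normal(1.0, 0.05, N)     # uncertain heat flux
m = rng.normal(1.0, 0.08, N)     # uncertain flow rate
h = rng.uniform(0.9, 1.1, N)     # uncertain heat-transfer coefficient
T_peak = 300 + 50 * q / m + 20 * q / h

# Sample statistics estimate the output distribution directly.
print("mean:", T_peak.mean(), "95th percentile:", np.percentile(T_peak, 95))
```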

  18. Development and testing of the methodology for performance requirements

    International Nuclear Information System (INIS)

    Rivers, J.D.

    1989-01-01

    The U.S. Department of Energy (DOE) is in the process of implementing a set of materials control and accountability (MC&A) performance requirements. These graded requirements set a uniform level of performance for similar materials at various facilities against the threat of an insider adversary stealing special nuclear material (SNM). These requirements are phrased in terms of detecting the theft of a goal quantity of SNM within a specified time period and with a probability greater than or equal to a specified value, and they include defense-in-depth requirements. The DOE has conducted an extensive effort over the last 2 1/2 yr to develop a practical methodology to be used in evaluating facility performance against the performance requirements specified in DOE order 5633.3. The major participants in the development process have been the Office of Safeguards and Security (OSS), Brookhaven National Laboratory, and Los Alamos National Laboratory. The process has included careful reviews of related evaluation systems, a review of the intent of the requirements in the order, and site visits to most of the major facilities in the DOE complex. As a result of this extensive effort to develop guidance for the MC&A performance requirements, OSS was able to provide a practical method that will allow facilities to evaluate the performance of their safeguards systems against the performance requirements. In addition, the evaluations can be validated by the cognizant operations offices in a systematic manner

  19. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs) as well as to develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error with the construction of a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software package, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  20. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-01

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs) as well as to develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error with the construction of a technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software package, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants

  1. Development of a design methodology for hydraulic pipelines carrying rectangular capsules

    International Nuclear Information System (INIS)

    Asim, Taimoor; Mishra, Rakesh; Abushaala, Sufyan; Jain, Anuj

    2016-01-01

    The scarcity of fossil fuels is affecting the efficiency of established modes of cargo transport within the transportation industry. Efforts have been made to develop innovative modes of transport that can be adopted for economically and environmentally friendly operating systems. Solid material, for instance, can be packed in rectangular containers (commonly known as capsules), which can then be transported in different concentrations very effectively using the fluid energy in pipelines. For the economical and efficient design of such systems, both the local flow characteristics and the global performance parameters need to be carefully investigated. Published literature is severely limited in establishing the effects of local flow features on the system characteristics of Hydraulic Capsule Pipelines (HCPs). The present study focuses on using a well validated Computational Fluid Dynamics (CFD) tool to numerically simulate the solid-liquid mixture flow in both on-shore and off-shore HCP applications including bends. Discrete Phase Modelling (DPM) has been employed to calculate the velocity of the rectangular capsules. Numerical predictions have been used to develop novel semi-empirical prediction models for pressure drop in HCPs, which have then been embedded into a robust and user-friendly pipeline optimisation methodology based on the Least-Cost Principle. - Highlights: • Local flow characteristics in a pipeline transporting rectangular capsules. • Development of prediction models for the pressure drop contribution of capsules. • Methodology developed for sizing of Hydraulic Capsule Pipelines. • Implementation of the developed methodology to obtain optimal pipeline diameter.
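    A sketch of sizing by the Least-Cost Principle: choose the diameter that minimizes annual capital plus pumping cost. The Darcy-type pressure-drop placeholder below stands in for the paper's capsule-flow prediction models, and all cost figures are invented:

```python
import numpy as np

L, Q, rho = 10_000.0, 0.5, 1000.0   # pipe length (m), flow (m^3/s), density (kg/m^3)
hours, tariff = 8000, 0.1            # operating hours/yr, $/kWh (invented)
capital_rate = 120.0                 # $/(m diameter * m length * yr), invented

def pressure_drop(D):
    """Placeholder Darcy-type drop; the paper's semi-empirical capsule-flow
    correlations would replace this."""
    v = Q / (np.pi * D**2 / 4)       # mean flow velocity
    f = 0.02                         # assumed constant friction factor
    return f * (L / D) * 0.5 * rho * v**2   # Pa

diameters = np.linspace(0.2, 1.0, 81)
# Annual pumping cost: dp*Q gives W; /1000 -> kW; x hours -> kWh; 70% pump eff.
pump_cost = pressure_drop(diameters) * Q * hours / (1000 * 0.7) * tariff
cap_cost = capital_rate * diameters * L
total = pump_cost + cap_cost
print("optimal D ~", diameters[np.argmin(total)], "m")
```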

  2. Application of SADT and ARIS methodologies for modeling and management of business processes of information systems

    Directory of Open Access Journals (Sweden)

    O. V. Fedorova

    2018-01-01

    Full Text Available The article is devoted to the application of the SADT and ARIS methodologies for modeling and managing business processes of information systems. The relevance of this article is beyond doubt, because the design of the architecture of information systems, based on a thorough system analysis of the subject area, is of paramount importance for the development of information systems in general. The authors carried out a thorough analysis of the application of the SADT and ARIS methodologies for modeling and managing business processes of information systems. The analysis was carried out both in terms of modeling business processes (notation and application of the CASE tool) and in terms of business process management. The first point of view reflects the interaction of the business analyst and the programmer in the development of the information system. The second point of view reflects the interaction of the business analyst and the customer. The SADT methodology is the basis of many modern methodologies for modeling business processes. Using the methodologies of the IDEF family, it is possible to efficiently display and analyze the activity models of a wide range of complex information systems in various aspects. The ARIS CASE tool is a suite of tools for analyzing and modeling an organization's activities. The methodological basis of ARIS is a set of different modeling methods that reflect different views on the system under study. The authors' conclusions are fully justified. The results of the work can be useful for specialists in the field of modeling business processes of information systems. In addition, the article can serve as a guide when developing the constituent elements of curricula for students specializing in information technology and management, and it supports updating the content and structure of courses on modeling the architecture of information systems and on model-based organization management.

  3. Methodology and preliminary models for analyzing nuclear safeguards decisions

    International Nuclear Information System (INIS)

    1978-11-01

    This report describes a general analytical tool designed to assist the NRC in making nuclear safeguards decisions. The approach is based on decision analysis--a quantitative procedure for making decisions under uncertain conditions. The report: describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material, demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria), and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant

  4. Methodology and preliminary models for analyzing nuclear-safeguards decisions

    International Nuclear Information System (INIS)

    Judd, B.R.; Weissenberger, S.

    1978-11-01

    This report describes a general analytical tool designed with Lawrence Livermore Laboratory to assist the Nuclear Regulatory Commission in making nuclear safeguards decisions. The approach is based on decision analysis - a quantitative procedure for making decisions under uncertain conditions. The report: describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material; demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria); and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant

  5. Model identification methodology for fluid-based inerters

    Science.gov (United States)

    Liu, Xiaofu; Jiang, Jason Zheng; Titurus, Branislav; Harrison, Andrew

    2018-06-01

    The inerter is the mechanical dual of the capacitor via the force-current analogy. It has the property that the force across the terminals is proportional to their relative acceleration. Compared with flywheel-based inerters, fluid-based forms have the advantages of improved durability, inherent damping and simplicity of design. In order to improve the understanding of the physical behaviour of this fluid-based device, especially the behaviour caused by the hydraulic resistance and inertial effects in the external tube, this work proposes a comprehensive model identification methodology. Firstly, a modelling procedure is established, which allows the topological arrangement of the mechanical networks to be obtained by mapping the damping, inertance and stiffness effects directly to their respective hydraulic counterparts. Secondly, an experimental sequence is followed, which separates the identification of friction, stiffness and various damping effects. Furthermore, an experimental set-up is introduced, where two pressure gauges are used to accurately measure the pressure drop across the external tube. The theoretical models with improved confidence are obtained using the proposed methodology for a helical-tube fluid inerter prototype. The sources of remaining discrepancies are further analysed.
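    The defining property, terminal force proportional to relative acceleration, plus the hydraulic resistance effects discussed above suggests a simple parametric model of the kind one would fit to test data. The structure (inertance plus laminar and quadratic damping terms) and all parameter values below are illustrative assumptions, not the identified model of the paper:

```python
import numpy as np

def fluid_inerter_force(x_rel, t, b, c_lam, c_turb):
    """Terminal force of a simplified fluid-inerter model: an inertance
    term b * a_rel plus laminar and quadratic (turbulent) hydraulic
    resistance terms in the relative velocity. Parameter values would
    come from identification experiments in practice."""
    v = np.gradient(x_rel, t)   # relative velocity of the terminals
    a = np.gradient(v, t)       # relative acceleration
    return b * a + c_lam * v + c_turb * v * np.abs(v)

t = np.linspace(0, 2, 2001)
x = 0.01 * np.sin(2 * np.pi * 5 * t)   # 5 Hz, 10 mm stroke test signal
F = fluid_inerter_force(x, t, b=100.0, c_lam=50.0, c_turb=2000.0)  # invented
print("peak force (N):", F.max())
```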

  6. Integrating FMEA in a Model-Driven Methodology

    Science.gov (United States)

    Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno

    2016-08-01

    Failure Mode and Effects Analysis (FMEA) is a well-known technique for evaluating the effects of potential failures of components of a system. FMEA demands engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system, by integration into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation and effects from SysML design models, by means of custom diagrams, which we name FMEA Diagrams. They offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate the FMEA worksheet and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.

  7. A Design Science Research Methodology for Expert Systems Development

    Directory of Open Access Journals (Sweden)

    Shah Jahan Miah

    2016-11-01

    Full Text Available The knowledge of design science research (DSR) can have applications for improving expert systems (ES) development research. Although significant progress in utilising DSR has been observed in particular areas of information systems design – such as decision support systems (DSS) studies – only rare attempts can be found in the ES design literature. Therefore, the aim of this study is to investigate the use of DSR for ES design. First, we explore the ES development literature to reveal the presence of DSR as a research methodology. For this, we select relevant literature criteria and apply a qualitative content analysis in order to generate themes inductively to match the DSR components. Second, utilising the findings of the comparison, we determine a new DSR approach for designing a specific ES that is guided by another result – the findings of a content analysis of examination scripts in Mathematics. The specific ES artefact for a case demonstration is designed to address the requirements of a ‘wicked’ problem, in that the key purpose is to assist human assessors when evaluating multi-step question (MSQ) solutions. It is anticipated that the proposed design knowledge, in terms of both problem class and functions of ES artefacts, will help ES designers and researchers to address similar issues when designing information system solutions.

  8. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the frequency of aircraft crashes into the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document
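    For orientation, crash-frequency estimates of this kind are often written in a four-factor form, F = N * P * f(x, y) * A_eff. The sketch below evaluates that form with invented numbers; it is an assumed generic illustration, not the ACRAM Standard's actual data or procedure:

```python
# Generic four-factor crash-frequency estimate (illustrative assumption):
#   F = N * P * f(x, y) * A_eff
N = 40_000      # annual aircraft operations near the site (invented)
P = 1e-7        # crash rate per operation (invented)
f_xy = 5e-3     # crash-location probability density at the facility, 1/mi^2
A_eff = 0.02    # effective facility area (mi^2), including skid and shadow

F = N * P * f_xy * A_eff
print(f"{F:.2e} crashes/yr")   # compare against a screening threshold, e.g. 1e-6
```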

  9. Evaluation of probable maximum snow accumulation: Development of a methodology for climate change studies

    Science.gov (United States)

    Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick

    2016-06-01

    Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so the ensuing spring PMF is a reasonable estimation. This is of particular importance in times of climate change (CC) since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology; precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool to estimate the relative change of the PMSA. Absolute results are in the same order of magnitude as those obtained with the traditional method and observed data; but are also found to depend strongly on the climate projection used and show spatial variability.
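
    The moisture-maximization step can be sketched in a few lines: an important snowfall event is scaled by the ratio of the (monthly) maximum precipitable water to the storm's precipitable water. The values and the ratio cap below are hypothetical illustrations, not the paper's data:

```python
def maximize_snowstorm(snowfall_mm, storm_pw_mm, max_pw_mm, ratio_cap=2.0):
    """
    Moisture maximization: scale a snowfall event by the ratio of the
    maximum precipitable water to the storm's precipitable water.
    A cap on the ratio is often applied in practice (assumed here).
    """
    ratio = min(max_pw_mm / storm_pw_mm, ratio_cap)
    return snowfall_mm * ratio

# Illustrative values only: (snowfall, storm precipitable water) in mm
events = [(45.0, 8.0), (60.0, 11.0), (38.0, 6.5)]
max_pw = 14.0  # monthly maximum precipitable water (mm); non-stationary under CC
pmsa = max(maximize_snowstorm(s, pw, max_pw) for s, pw in events)
print(f"PMSA estimate: {pmsa:.1f} mm snow water equivalent")
```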

  10. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    Science.gov (United States)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which costs the industry tremendously for waste treatment and disposal and hinders the further development of the industry. There is, therefore, an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while production competitiveness is maintained. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, and an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis on bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and
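
    As a rough illustration of the kind of first-principles rinse model referred to above, the following sketch treats a single, ideally mixed rinse tank with drag-in from the plating bath; the mass balance form and all parameter values are our own illustrative assumptions, not the dissertation's validated models:

```python
import math

def rinse_tank_conc(c0, c_drag, drag_lph, flow_lph, volume_l, t_h):
    """
    Ideally mixed single rinse tank with drag-in from the plating bath:
        V dC/dt = D * c_drag - (Q + D) * C
    Solved analytically; C approaches the steady state c_ss exponentially.
    """
    k = (flow_lph + drag_lph) / volume_l
    c_ss = drag_lph * c_drag / (flow_lph + drag_lph)  # steady-state level
    return c_ss + (c0 - c_ss) * math.exp(-k * t_h)

# 400 L rinse tank, 2 L/h drag-out at 100 g/L, 60 L/h fresh-water feed
for t in (0.5, 2, 8):
    c = rinse_tank_conc(c0=0.0, c_drag=100.0, drag_lph=2.0,
                        flow_lph=60.0, volume_l=400.0, t_h=t)
    print(f"t = {t:4.1f} h: rinse concentration = {c:.2f} g/L")
```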

  11. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Gavin Hawkley

    2010-07-01

    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of the leak path factor (LPF), i.e., the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. The LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the fraction of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and upon other pathways from the building, such as doorways, both open and closed. This study shows how the multiple LPFs within the building can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a receptor placed downwind can be estimated, and the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). The study also briefly addresses particle characteristics that affect atmospheric particle dispersion, and compares this dispersion with the LPF methodology.
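
    The combinatory evaluation can be sketched as follows: LPFs of compartments in series multiply along each pathway, and parallel pathways add, weighted by the fraction of flow they carry. The structure and numbers below are illustrative assumptions, not MELCOR results:

```python
from math import prod

def total_lpf(paths):
    """
    Combine leak path factors over parallel release pathways.
    Each path is a (flow_fraction, [stage_lpf, ...]) pair: stage LPFs in
    series multiply; parallel paths add, weighted by their flow fraction.
    """
    return sum(frac * prod(stages) for frac, stages in paths)

# Illustrative only: two serial rooms (0.5 each), then either a filtered
# exhaust (99% efficient) or an unfiltered open doorway.
paths = [
    (0.8, [0.5, 0.5, 0.01]),  # ventilation path, HEPA-filtered
    (0.2, [0.5, 0.5, 1.0]),   # unfiltered doorway
]
print(f"Total LPF = {total_lpf(paths):.3f}")
```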

  12. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    Science.gov (United States)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full-scale. However, in some applications enhanced performance is sought at the low range; expressing the accuracy as a percent of reading should therefore be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which has broad applicability to all types of transducers where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
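
    One generic way to realize a percent-of-reading objective is to weight the regression by the inverse square of the reading, so that relative rather than absolute residuals are minimized. The numpy sketch below illustrates that idea with synthetic data; it is not the paper's exact statistical procedure:

```python
import numpy as np

# Weighted least squares with weights ~ 1/y^2 approximately minimizes
# percent-of-reading error rather than percent-of-full-scale error.
rng = np.random.default_rng(1)
x = np.linspace(1, 100, 40)                        # applied load (arbitrary units)
y = 0.02 + 0.98 * x + rng.normal(0, 0.3, x.size)   # simulated transducer response

X = np.column_stack([np.ones_like(x), x])
W = np.diag(1.0 / np.maximum(y, 1e-9) ** 2)        # guard against division by zero
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(f"intercept = {beta[0]:.4f}, slope = {beta[1]:.4f}")
```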

  13. Development of methodology of financial assets accounting in IFRS context

    Directory of Open Access Journals (Sweden)

    V.I. Tsurkanu

    2018-04-01

    Full Text Available In the innovation economy the proportion of resources directed to investment is significantly increasing, and the investment process therefore becomes an integral part of the economic activities of modern organizations. In this situation organizations acquire another type of asset, called financial assets, which differ in their characteristics from tangible and intangible assets. The authors of the present study first prove the need for an economic interpretation of financial assets and for allocating them their own positions in the balance sheet; after recognition, such assets should be assessed on the basis of their characteristics for accounting and reporting purposes. In this context, we reveal the methods that organizations can choose, using the business management models introduced by IFRS 9 «Financial instruments», for the evaluation of financial assets depending on their category. Special attention is paid to improving the methodology of accounting for financial assets in accordance with the specific characteristics of their recognition and measurement. These issues are investigated not only in theoretical terms, but also on the basis of a comparison of the normative and legislative acts of the Republic of Moldova and Ukraine with the regulations of IFRS. In addition, whereas the accounting and financial reporting systems in these countries are changing in accordance with the requirements of Directive 2013/34/EU, its impact on the accounting of financial assets is also taken into account. The conclusions and suggestions drawn from the research are of a theoretical nature and of practical importance.

  14. Development of a methodology for assessing the safety of embedded software systems

    Science.gov (United States)

    Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.

    1993-01-01

    A Dynamic Flowgraph Methodology (DFM), based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety, is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety-critical software functions.

  15. Development of a Teaching Methodology for Undergraduate Human Development in Psychology

    Science.gov (United States)

    Rodriguez, Maria A.; Espinoza, José M.

    2015-01-01

    The development of a teaching methodology for the undergraduate Psychology course Human Development II in a private university in Lima, Peru is described. The theoretical framework consisted of an integration of Citizen Science and Service Learning, with the application of Information and Communications Technology (ICT), specifically Wikipedia and…

  16. A ROADMAP FOR GENERATING SEMANTICALLY ENRICHED BUILDING MODELS ACCORDING TO CITYGML MODEL VIA TWO DIFFERENT METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    G. Floros

    2016-10-01

    Full Text Available 3D modeling techniques have multiplied rapidly due to the advances of new technologies. Nowadays, 3D modeling software focuses not only on the finest visualization of the models, but also on their semantic features during the modeling procedure. As a result, the models thus generated are both realistic and semantically enriched. Additionally, various extensions of modeling software allow for the immediate conversion of the model’s format, via semi-automatic procedures, with respect to the user’s scope. The aim of this paper is to investigate the generation of a semantically enriched CityGML building model via two different methodologies. The first methodology includes modeling in Trimble SketchUp and transformation in FME Desktop Manager, while the second methodology includes the model’s generation in CityEngine and its transformation into the CityGML format via the 3DCitiesProject extension for ArcGIS. Finally, the two aforesaid methodologies are compared and specific characteristics are evaluated, in order to infer the methodology that is best applied depending on the different projects’ purposes.

  17. Development methodology for the software life cycle process of the safety software

    International Nuclear Information System (INIS)

    Kim, D. H.; Lee, S. S.; Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B.

    2002-01-01

    A methodology for developing software life cycle processes (SLCP) is proposed to develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS) successfully. A software life cycle model is selected as a hybrid model mixing waterfall, prototyping, and spiral models, and is composed of two stages: the development of a prototype of the ESF-CCS, and the development of the ESF-CCS itself. To produce the software life cycle (SLC) for the development of the digital reactor safety system, the activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available OPAs (Organizational Process Assets) are applied to the SLC activities, and the known constraints are reconciled. The established SLCP describes well the software life cycle activities with which the Regulatory Authority is provided.

  18. Development methodology for the software life cycle process of the safety software

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Lee, S. S. [BNF Technology, Taejon (Korea, Republic of); Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B. [KAERI, Taejon (Korea, Republic of)

    2002-05-01

    A methodology for developing software life cycle processes (SLCP) is proposed to develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS) successfully. A software life cycle model is selected as a hybrid model mixing waterfall, prototyping, and spiral models, and is composed of two stages: the development of a prototype of the ESF-CCS, and the development of the ESF-CCS itself. To produce the software life cycle (SLC) for the development of the digital reactor safety system, the activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available OPAs (Organizational Process Assets) are applied to the SLC activities, and the known constraints are reconciled. The established SLCP describes well the software life cycle activities with which the Regulatory Authority is provided.

  19. Preparation of standard hair material and development of analytical methodology

    International Nuclear Information System (INIS)

    Gangadharan, S.; Ganapathi Iyer, S.; Ali, M.M.; Thantry, S.S.; Verma, R.; Arunachalam, J.; Walvekar, A.P.

    1992-01-01

    In 1976 Indian researchers suggested the possible use of hair as an indicator of environmental exposure and established, through a study of a country-wide student population and the general population of the metropolitan city of Bombay, that human scalp hair could indeed be an effective first-level monitor in a scheme of multilevel monitoring of environmental exposure to inorganic pollutants. It was in this context, and in view of the ready availability of large quantities of scalp hair subjected to minimal chemical treatment, that they proposed to participate in the preparation of a standard material of hair. It was also recognized that measurements of trace element concentrations at very low levels require cross-validation by different analytical techniques, even within the same laboratory. The programme of work carried out since the first meeting of the CRP has been aimed at these two objectives: the preparation of a standard material of hair, and the development of analytical methodologies for the determination of elements and species of interest. 1 refs., 3 tabs

  20. Development of methodology for the characterization of radioactive sealed sources

    International Nuclear Information System (INIS)

    Ferreira, Robson de Jesus

    2010-01-01

    Sealed radioactive sources are widely used in many applications of nuclear technology in industry, medicine, research and other fields. The International Atomic Energy Agency (IAEA) estimates tens of millions of sources in the world. In Brazil, the number is about 500 thousand sources, if the americium-241 sources present in radioactive lightning rods and smoke detectors are included in the inventory. At the end of their useful life, most sources become disused, constitute radioactive waste, and are then termed spent sealed radioactive sources (SSRS). In Brazil, this waste is collected by the research institutes of the National Commission of Nuclear Energy and kept in centralized storage, awaiting definition of the final disposal route. The Waste Management Laboratory (WML) at the Nuclear and Energy Research Institute is the main storage center, having received by July 2010 about 14,000 disused sources, not including the tens of thousands of lightning rod and smoke detector sources. A program is underway in the WML to replace the original shieldings with a standard disposal package and to determine the radioisotope content and activity of each source. The identification of the radionuclides and the measurement of activities will be carried out with a well-type ionization chamber. This work aims to develop a methodology for measuring or determining the activity of the SSRS stored in the WML in accordance with their geometry, and for determining the associated uncertainties. (author)

  1. Methodologies for quantitative systems pharmacology (QSP) models : Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, Hp; Agoram, B.; Davies, M.R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, Ph.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  2. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, H. P.; Agoram, B.; Davies, M. R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, P. H.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  3. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    Science.gov (United States)

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon the purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study area or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) it includes a variety of geologic models, (2) it uses an analytic methodology instead of Monte Carlo simulation, (3) it possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) it runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.
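
    In the simplest analytic (non-Monte-Carlo) aggregation case, play-level means add and, under an independence assumption, variances add as well. The snippet below illustrates only this basic idea with invented numbers, not the USGS system's full analytic machinery:

```python
import math

# Aggregating independent play-level estimates analytically: means add,
# and for independent plays variances add, so no simulation is needed.
plays = [
    {"mean": 120.0, "sd": 40.0},   # million barrels; illustrative numbers
    {"mean": 75.0,  "sd": 25.0},
    {"mean": 210.0, "sd": 90.0},
]
agg_mean = sum(p["mean"] for p in plays)
agg_sd = math.sqrt(sum(p["sd"] ** 2 for p in plays))
print(f"Aggregate: mean = {agg_mean:.0f}, sd = {agg_sd:.0f} (million barrels)")
```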

  4. [Systemic inflammation: theoretical and methodological approaches to description of general pathological process model. Part 3. Backgroung for nonsyndromic approach].

    Science.gov (United States)

    Gusev, E Yu; Chereshnev, V A

    2013-01-01

    Theoretical and methodological approaches to the description of systemic inflammation as a general pathological process are discussed. It is shown that a wide range of research types needs to be integrated in order to develop a model of systemic inflammation.

  5. Development of probabilistic assessment methodology for geologic disposal of radioactive wastes

    International Nuclear Information System (INIS)

    Kimura, H.; Takahashi, T.

    1998-01-01

    A probabilistic assessment methodology is essential to evaluate the uncertainties of long-term radiological consequences associated with geologic disposal of radioactive wastes. We have developed a probabilistic assessment methodology to estimate the influences of parameter uncertainties/variabilities. The exposure scenario considered here is based on a groundwater migration scenario. The computer code system GSRW-PSA thus developed is based on a non-site-specific model and consists of a set of submodules for sampling model parameters, calculating the release of radionuclides from engineered barriers, calculating the transport of radionuclides through the geosphere, calculating radiation exposures of the public, and calculating the statistical values relating to the uncertainties and sensitivities. The results of uncertainty analyses for α-nuclides quantitatively indicate that the natural uranium (238U) concentration is a suitable alternative safety indicator for long-lived radioactive waste disposal, because the estimated range of individual dose equivalent due to the 238U decay chain is narrower than that due to the other decay chain (the 237Np decay chain). It is internationally necessary to have detailed discussion on the PDFs of model parameters and on the PSA methodology used to evaluate the uncertainties due to conceptual models and scenarios. (author)
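
    The structure of such a sampling-based uncertainty analysis can be sketched as follows; the distributions, parameters and the crude dose transfer are invented placeholders standing in for the GSRW-PSA submodules:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Illustrative parameter distributions (not the GSRW-PSA data):
k_leach = rng.lognormal(mean=np.log(1e-5), sigma=0.5, size=n)  # 1/yr
retard  = rng.uniform(10, 200, size=n)                         # geosphere retardation
dcf     = 4.5e-8                                               # Sv/Bq, fixed here

inventory = 1e12                 # Bq in the engineered barrier
release = inventory * k_leach    # Bq/yr released from the barrier
# Crude stand-in for the geosphere-transport and exposure submodules:
dose = dcf * release / retard

print(f"median dose {np.median(dose):.2e} Sv/yr, "
      f"95th percentile {np.percentile(dose, 95):.2e} Sv/yr")
```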

  6. A methodology to support the development of 4-year pavement management plan.

    Science.gov (United States)

    2014-07-01

    A methodology for forming and prioritizing pavement maintenance and rehabilitation (M&R) projects was developed. The Texas Department of Transportation (TxDOT) can use this methodology to generate defensible and cost-effective 4-year pavement man...

  7. Modeling methodology for a CMOS-MEMS electrostatic comb

    Science.gov (United States)

    Iyer, Sitaraman V.; Lakdawala, Hasnain; Mukherjee, Tamal; Fedder, Gary K.

    2002-04-01

    A methodology for combined modeling of capacitance and force in a multi-layer electrostatic comb is demonstrated in this paper. Conformal mapping-based analytical methods are limited to 2D symmetric cross-sections and cannot account for charge concentration effects at corners. Vertex capacitance can be more than 30% of the total capacitance in a single-layer 2 micrometers thick comb with 10 micrometers overlap. Furthermore, analytical equations are strictly valid only for perfectly symmetrical finger positions. Fringing and corner effects are likely to be more significant in a multi-layered CMOS-MEMS comb because of the presence of more edges and vertices. Vertical curling of CMOS-MEMS comb fingers may also lead to reduced capacitance and vertical forces. Gyroscopes are particularly sensitive to such undesirable forces, which therefore need to be well quantified. In order to address the above issues, a hybrid approach of superposing linear regression models over a set of core analytical models is implemented. Design of experiments is used to obtain data for capacitance and force using a commercial 3D boundary-element solver. Since accurate force values require significantly higher mesh refinement than accurate capacitance, we use numerical derivatives of capacitance values to compute the forces. The model is formulated such that the capacitance and force models use the same regression coefficients. The comb model thus obtained fits the numerical capacitance data to within +/- 3% and force to within +/- 10%. The model is experimentally verified by measuring capacitance change in a specially designed test structure. The capacitance model matches measurements to within 10%. The comb model is implemented in an Analog Hardware Description Language (AHDL) for use in behavioral simulation of manufacturing variations in a CMOS-MEMS gyroscope.
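
    The force-from-capacitance step can be illustrated directly: with any capacitance model C(x), the lateral electrostatic force follows from F = (1/2)(dC/dx)V², evaluated here by central difference. The sketch below uses an assumed simple parallel-plate core model without the paper's regression corrections:

```python
def capacitance(x_overlap_um, n_fingers=50, thickness_um=2.0, gap_um=1.5):
    """Parallel-plate core model of a comb; corner/fringing terms omitted."""
    eps0 = 8.854e-6  # permittivity of free space, pF/um
    return 2 * n_fingers * eps0 * thickness_um * x_overlap_um / gap_um  # pF

# Electrostatic force from the numerical derivative of capacitance:
#   F = 0.5 * (dC/dx) * V^2
x = 10.0   # um overlap
v = 5.0    # volts
h = 0.01   # um step for the central difference
dc_dx = (capacitance(x + h) - capacitance(x - h)) / (2 * h)  # pF/um
force = 0.5 * (dc_dx * 1e-12 / 1e-6) * v**2  # pF/um -> F/m, result in newtons
print(f"C = {capacitance(x):.4f} pF, F = {force:.3e} N")
```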

  8. Developing a Validation Methodology for TACAIR Soar Agents in EAAGLES

    National Research Council Canada - National Science Library

    Alford III, Lewis E; Dudas, Brian A

    2005-01-01

    ...) environment, but have potential for use in EAAGLES. SIMAF requested research be conducted on a validation methodology to apply to the agents' behavior once they have been successfully imported into the EAAGLES environment...

  9. Methodological development of the process of appreciation of photography Conceptions

    Directory of Open Access Journals (Sweden)

    Yovany Álvarez García

    2012-12-01

    Full Text Available This article discusses the different conceptions that are used in the methodological appreciation of photography. Since photography is one of the manifestations of the visual arts with which we most commonly interact daily – it can be found in books, magazines and other publications – the article discusses various methodologies for assessing the photographic image. It also addresses the classic themes of photography, as well as some of its expressive elements.

  10. Methodology for Designing Models Predicting Success of Infertility Treatment

    OpenAIRE

    Alireza Zarinara; Mohammad Mahdi Akhondi; Hojjat Zeraati; Koorsh Kamali; Kazem Mohammad

    2016-01-01

    Abstract Background: Prediction models for infertility treatment success have been presented for 25 years. There are scientific principles for designing and applying prediction models, which are also used to predict the success rate of infertility treatment. The purpose of this study is to provide basic principles for designing models to predict infertility treatment success. Materials and Methods: In this paper, the principles for developing predictive models are explained and...

  11. Methodological Guidelines for Reducing the Complexity of Data Warehouse Development for Transactional Blood Bank Systems.

    Science.gov (United States)

    Takecian, Pedro L; Oikawa, Marcio K; Braghetto, Kelly R; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S; Acker, Susan; Carneiro-Proietti, Anna B F; Sabino, Ester C; Custer, Brian; Busch, Michael P; Ferreira, João E

    2013-06-01

    Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development.

  12. Agile Methodologies and Software Process Improvement Maturity Models, Current State of Practice in Small and Medium Enterprises

    OpenAIRE

    Koutsoumpos, Vasileios; Marinelarena, Iker

    2013-01-01

    Abstract—Background: Software Process Improvement (SPI) maturity models have been developed to assist organizations in enhancing software quality. Agile methodologies are used to ensure the productivity and quality of a software product. Amongst others, they are applied in Small and Medium-sized Enterprises (SMEs). However, little is known about the combination of Agile methodologies and SPI maturity models with regard to SMEs and the results that could emerge, as all the current SPI models are address...

  13. On the fit of models to covariances and methodology to the Bulletin.

    Science.gov (United States)

    Bentler, P M

    1992-11-01

    It is noted that 7 of the 10 top-cited articles in the Psychological Bulletin deal with methodological topics. One of these is the Bentler-Bonett (1980) article on the assessment of fit in covariance structure models. Some context is provided on the popularity of this article. In addition, a citation study of methodology articles appearing in the Bulletin since 1978 was carried out. It verified that publications in design, evaluation, measurement, and statistics continue to be important to psychological research. Some thoughts are offered on the role of the journal in making developments in these areas more accessible to psychologists.
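
    The Bentler-Bonett (1980) article referenced here introduced, among other things, the normed fit index, which compares a hypothesized model's chi-square against that of the null (independence) model. A one-line implementation, with invented chi-square values for illustration:

```python
def normed_fit_index(chi2_null, chi2_model):
    """
    Bentler-Bonett (1980) normed fit index:
        NFI = (chi2_null - chi2_model) / chi2_null
    Values near 1 indicate the hypothesized covariance structure fits the
    data far better than the independence (null) model.
    """
    return (chi2_null - chi2_model) / chi2_null

# Illustrative chi-square values only
print(f"NFI = {normed_fit_index(chi2_null=900.0, chi2_model=85.0):.3f}")
```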

  14. A Methodology For The Development Of Complex Domain Specific Languages

    CERN Document Server

    Risoldi, Matteo; Falquet, Gilles

    2010-01-01

    The term Domain-Specific Modeling Language (DSML) is used in software development to indicate a modeling (and sometimes programming) language dedicated to a particular problem domain, a particular problem representation technique and/or a particular solution technique. The concept is not new -- special-purpose programming languages and all kinds of modeling/specification languages have always existed, but the term DSML has become more popular due to the rise of domain-specific modeling. Domain-specific languages are considered 4GL programming languages. Domain-specific modeling techniques have been adopted for a number of years now. However, the techniques and frameworks used still suffer from problems of complexity of use and fragmentation. Although some integrated environments have recently seen the light, it is not common to see many concrete use cases in which domain-specific modeling has been put to use. The main goal of this thesis is tackling the domain of interactive systems and applying a DSML-based...

  15. METHODOLOGICAL ASPECTS OF RURAL DEVELOPMENT GOVERNANCE CASE STUDY

    Directory of Open Access Journals (Sweden)

    Vitalina TSYBULYAK

    2014-01-01

    Full Text Available The article discusses current approaches to the process of assessing rural development governance and reveals their advantages and disadvantages. The article also presents a system of performance indicators for the governance process built on two elements of assessment: the dynamics of rural development (economic, financial and social spheres, ecology and population health) and the management process itself (assessment of the strategic plan or concept of development, the program of socio-economic development of rural areas, and the current activity of local authorities in particular). Moreover, it is suggested to use a typology of approaches (objective (evolutionary), command-and-control, economic (infrastructural), complex, and qualitative) to define the essence of the process of rural development governance, and to correlate the traditional functions performed by the subjects of that process (state authority institutions, local authority institutions, economic entities, and the community). Adjusting the traditional functions performed by the subjects of local development governance, and supplementing them with new ones relevant to the present-day model of «shared governance», is an important element of the analysis of assessment tools for the effectiveness of rural development governance. In addition, the author defines two forms of rural population involvement in the process of rural development governance: active and passive. The active form suggests that the rural population participates in making and implementing governance decisions (public meetings, organization of social discussions, and development of territorial community self-governance); the passive one suggests that the emphasis is placed only on distributing information among the population (meetings with members of parliament, direct phone lines with territory governors, publication of normative and legal acts and of reports on budget execution)

  16. Development of a methodology for dose assessment regarding the use of NORM in building materials

    International Nuclear Information System (INIS)

    Souza, Antonio Fernando Costa de

    2009-01-01

    The objective of this study was to develop a methodology for estimating the radiological impact on man of residues of naturally occurring radioactive materials (NORM) that can potentially be used for the construction of homes and roads. Residues of this type, which are being produced in great quantities by the Brazilian mining industry, are typically deposited in inappropriate conditions, such that they may have a long-term adverse impact on the environment, and hence on man. A mathematical model was developed to calculate the doses resulting from the use of NORM residues, thus allowing a preliminary analysis of the possibility of recycling the residues. The model was used to evaluate the external dose due to gamma radiation, the dose to skin caused by beta radiation, and the internal dose due to inhalation of radon and its decay products. The model was verified by comparison with the results of other studies of doses due to gamma and beta radiation from finite and infinite radioactive sources, with relatively good agreement. In order to validate the proposed methodology, a comparison was made against experimental results for a house constructed in accordance with CNEN regulations using building materials containing NORM residues. Comparisons were made of the dose due to gamma radiation and the radon concentration in the indoor environment. Finally, the methodology was also used to estimate the dose caused by gamma radiation from a road constructed in the state of Rondonia, Brazil, which made use of another NORM residue. (author)

  17. Development of performance assessment methodology for establishment of quantitative acceptance criteria of near-surface radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Kim, C. R.; Lee, E. Y.; Park, J. W.; Chang, G. M.; Park, H. Y.; Yeom, Y. S. [Korea Hydro and Nuclear Power Co., Ltd., Seoul (Korea, Republic of)

    2002-03-15

    The contents and the scope of this study are as follows: review of the state of the art on the establishment of waste acceptance criteria in foreign near-surface radioactive waste disposal facilities, investigation of radiological assessment methodologies and scenarios, investigation of existing models and computer codes used in performance/safety assessment, development of a performance assessment methodology (draft) to quantitatively derive radionuclide acceptance criteria for a domestic near-surface disposal facility, and a preliminary performance/safety assessment in accordance with the developed methodology.

  18. Development of a methodology for the detection of hospital financial outliers using information systems.

    Science.gov (United States)

    Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki

    2014-01-01

    Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers. Copyright © 2013 John Wiley & Sons, Ltd.
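
    Distance-based outlier detection in a multi-dimensional index space can be sketched with the Mahalanobis distance; the financial indices, data and threshold below are illustrative stand-ins, not the paper's case model:

```python
import numpy as np

def mahalanobis_outliers(cases, threshold=2.5):
    """
    Flag cases whose Mahalanobis distance from the centroid in the
    financial-index space is unusually large. Note that with a sample
    covariance this distance is bounded for small n, so the threshold
    must be chosen accordingly (illustrative here).
    """
    x = np.asarray(cases, dtype=float)
    mu = x.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(x, rowvar=False))
    d = np.sqrt(np.einsum("ij,jk,ik->i", x - mu, cov_inv, x - mu))
    return d, d > threshold

# Rows: hospitals; columns: e.g. profitability (%) and labor-cost ratio (%)
cases = [[4.1, 52], [3.8, 55], [4.5, 50], [3.9, 54], [4.2, 53],
         [4.0, 51], [3.7, 56], [4.4, 49], [4.3, 52], [12.0, 30]]
d, flags = mahalanobis_outliers(cases)
print(np.round(d, 2))
print("outliers:", [i for i, f in enumerate(flags) if f])
```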

  19. A coupled groundwater-flow-modelling and vulnerability-mapping methodology for karstic terrain management

    Science.gov (United States)

    Kavouri, Konstantina P.; Karatzas, George P.; Plagnes, Valérie

    2017-08-01

    A coupled groundwater-flow-modelling and vulnerability-mapping methodology for the management of karst aquifers with spatial variability is developed. The methodology takes into consideration the duality of flow and recharge in karst and introduces a simple method to integrate the effect of temporal storage in the unsaturated zone. In order to investigate the applicability of the developed methodology, simulation results are validated against available field measurement data. The criteria maps from the PaPRIKa vulnerability-mapping method are used to document the groundwater flow model. The FEFLOW model is employed for the simulation of the saturated zone of Palaikastro-Chochlakies karst aquifer, in the island of Crete, Greece, for the hydrological years 2010-2012. The simulated water table reproduces typical karst characteristics, such as steep slopes and preferred drain axes, and is in good agreement with field observations. Selected calculated error indicators—Nash-Sutcliffe efficiency (NSE), root mean squared error (RMSE) and model efficiency (E')—are within acceptable value ranges. Results indicate that different storage processes take place in different parts of the aquifer. The north-central part seems to be more sensitive to diffuse recharge, while the southern part is affected primarily by precipitation events. Sensitivity analysis is performed on the parameters of hydraulic conductivity and specific yield. The methodology is used to estimate the feasibility of artificial aquifer recharge (AAR) at the study area. Based on the developed methodology, guidelines were provided for the selection of the appropriate AAR scenario that has positive impact on the water table.
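
    The two error indicators reported above are straightforward to compute; for reference, a short sketch with invented water-table data:

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    sim, obs = np.asarray(sim), np.asarray(obs)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(sim, obs):
    """Root mean squared error between simulated and observed values."""
    sim, obs = np.asarray(sim), np.asarray(obs)
    return np.sqrt(np.mean((obs - sim) ** 2))

# Illustrative water-table levels (m), not the Palaikastro-Chochlakies data
obs = [12.3, 12.1, 11.8, 11.9, 12.4, 12.6]
sim = [12.2, 12.0, 11.9, 12.0, 12.3, 12.5]
print(f"NSE = {nse(sim, obs):.3f}, RMSE = {rmse(sim, obs):.3f} m")
```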

  20. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    Science.gov (United States)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. There exist a number of data-worth and experimental design strategies developed for this purpose. However, these studies often ignore issues related to real-world groundwater models such as computational expense, existing observation data, high parameter dimension, etc. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established d-optimality criterion, and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification. Finally, a heuristic methodology based on the concept of the greedy algorithm is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model, and subsequently applied to an existing, complex, regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
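
    The greedy selection idea can be sketched as repeatedly adding the candidate observation that most increases a d-optimality objective, here log det(JᵀJ) over the rows chosen so far. The Jacobian below is random stand-in data, not the Perth model's sensitivities:

```python
import numpy as np

def greedy_d_optimal(J, k):
    """
    Greedily pick k candidate observations maximizing log det(X^T X + eps*I),
    where rows of J hold each candidate observation's parameter sensitivities.
    A simple stand-in for the d-optimality-based selection in the paper.
    """
    n, p = J.shape
    chosen, eps = [], 1e-9  # eps regularizes the early, rank-deficient steps
    for _ in range(k):
        best, best_val = None, -np.inf
        for i in range(n):
            if i in chosen:
                continue
            X = J[chosen + [i], :]
            val = np.linalg.slogdet(X.T @ X + eps * np.eye(p))[1]
            if val > best_val:
                best, best_val = i, val
        chosen.append(best)
    return chosen

rng = np.random.default_rng(2)
J = rng.normal(size=(30, 4))  # 30 candidate wells, 4 parameters (illustrative)
print("selected candidates:", greedy_d_optimal(J, 5))
```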

  1. Methodological development for selection of significant predictors explaining fatal road accidents.

    Science.gov (United States)

    Dadashova, Bahar; Arenas-Ramírez, Blanca; Mira-McWilliams, José; Aparicio-Izquierdo, Francisco

    2016-05-01

    Identification of the most relevant factors for explaining road accident occurrence is an important issue in road safety research, particularly for future decision-making processes in transport policy. However, model selection for this particular purpose is still an ongoing research topic. In this paper we propose a methodological development for model selection which addresses both explanatory variable selection and adequate model selection issues. A variable selection procedure, the TIM (two-input model) method, is carried out by combining neural network design and statistical approaches. The error structure of the fitted model is assumed to follow an autoregressive process. All models are estimated using the Markov Chain Monte Carlo method, where the model parameters are assigned non-informative prior distributions. The final model is built using the results of the variable selection. For the application of the proposed methodology, the number of fatal accidents in Spain during 2000-2011 was used. This indicator experienced the maximum reduction internationally during the indicated years, thus making it an interesting time series from a road safety policy perspective. Hence the identification of the variables that have affected this reduction is of particular interest for future decision making. The results of the variable selection process show that the selected variables are main subjects of road safety policy measures. Published by Elsevier Ltd.

  2. Modeling methodology for supply chain synthesis and disruption analysis

    Science.gov (United States)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically (2) the performance of synthesized supply chain system can be evaluated quantitatively (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of effects of a disruption.
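
    Over a discretized state space, the reachability analysis mentioned above reduces to graph search: can the system reach a given (e.g. disrupted or recovered) state from its current one? A minimal sketch with hypothetical states and transitions:

```python
from collections import deque

def reachable(graph, start, target):
    """
    Breadth-first reachability over a discrete state space: returns True
    if `target` can be reached from `start` via the transition relation.
    States and transitions here are illustrative placeholders.
    """
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state == target:
            return True
        for nxt in graph.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

graph = {
    "normal": ["supplier_down"],
    "supplier_down": ["rerouted", "stockout"],
    "rerouted": ["normal"],
}
print(reachable(graph, "normal", "stockout"))  # True: disruption can propagate
print(reachable(graph, "stockout", "normal"))  # False: no recovery transition
```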

  3. The Desired Image of the Future Economy of the Industrial Region: Development Trends and Evaluation Methodology

    Directory of Open Access Journals (Sweden)

    Olga Aleksandrovna Romanova

    2017-09-01

    Full Text Available In the article, the authors emphasize that industrial regions play an important role in increasing the technological independence of Russia. We show that the decline in the share of processing industries in the gross regional product cannot be treated as a negative de-industrialization of the economy. The article proves that the increasing speed of change, the instability of socio-economic systems and diverse risks predetermine the need to develop new methodological approaches to predictive research. Studies aimed at developing a technology for designing the desired image of the future, and a methodology for its evaluation, are therefore of high importance. For the initial stage of the research, the authors propose a methodological approach for assessing the desired image of the future of metallurgy as one of the most important industries of the region. We propose the term «technological image of the regional metallurgy». We show that repositioning the image of the regional metallurgical complex is quite a long process, which has determined the need to define the stages of repositioning. The proposed methodology for the evaluation of the desired future includes methodological provisions to quantify the characteristics of the goals achieved at the respective stages of the repositioning of the metallurgy. The methodological approach to the design of the desired image of the future implies the following stages: the identification of the priority areas of the technological development of regional metallurgy on the basis of bibliometric and patent analysis; the evaluation and forecasting of the dynamics of the structure of domestic consumption of metal products, based on comparative analysis and relevant analytical methods; the design of a factor model based on the principal components method, allowing the identification of the parameters quantifying the technological image of the regional metallurgy; the systematization of

  4. Methodology, status and plans for development and assessment of Cathare code

    Energy Technology Data Exchange (ETDEWEB)

    Bestion, D.; Barre, F.; Faydide, B. [CEA - Grenoble (France)

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for the development and assessment of the code. Analytical experiments with separate-effect tests and component tests are used for the development and validation of closure laws. Successive revisions of constitutive laws are implemented in successive versions of the code and assessed. System tests or integral tests are used to validate the general consistency of each revision. Each delivery of a code version plus revision is fully assessed and documented. A methodology is being developed to determine the uncertainty in all constitutive laws of the code, using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future development of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the modeling for the 3-D model.

  5. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    Science.gov (United States)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  6. Development of CANDU ECCS performance evaluation methodology and guides

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Kwang Hyun; Park, Kyung Soo; Chu, Won Ho [Korea Maritime Univ., Jinhae (Korea, Republic of)

    2003-03-15

    The objectives of the present work are to carry out technical evaluation and review of CANDU safety analysis methods in order to assist the development of performance evaluation methods and review guides for CANDU ECCS. The applicability of PWR ECCS analysis models is examined, and it is suggested that unique data or models for CANDU are required for the following phenomena: break characteristics and flow, frictional pressure drop, post-CHF heat transfer correlations, core flow distribution during blowdown, containment pressure, and reflux rate. For safety analysis of CANDU, conservative analysis or best estimate (BE) analysis can be used. The main advantage of BE analysis is a more realistic prediction of margins to acceptance criteria. The expectation is that margins demonstrated with BE methods would be larger than when a conservative approach is applied. Some outstanding safety analysis issues can be resolved by demonstrating that accident consequences are more benign than previously predicted. Success criteria for the analysis and review of large LOCA can be developed by a top-down approach: the highest-level success criteria can be extracted from C-6, and from them the lower-level criteria can be developed step by step, in a logical fashion. The overall objectives for analysis and review are to verify that the radiological consequence and frequency criteria are met.

  7. A Dynamic Defense Modeling and Simulation Methodology using Semantic Web Services

    Directory of Open Access Journals (Sweden)

    Kangsun Lee

    2010-04-01

    Full Text Available Defense modeling and simulations require interoperable and autonomous federates in order to fully simulate the complex behavior of war-fighters and to dynamically adapt themselves to various war-game events, commands and controls. In this paper, we propose a semantic web service based methodology to develop war-game simulations. Our methodology encapsulates war-game logic in a set of web services with additional semantic information in WSDL (Web Service Description Language and OWL (Web Ontology Language. By utilizing the dynamic discovery and binding power of semantic web services, we are able to dynamically reconfigure federates according to various simulation events. An ASuW (Anti-Surface Warfare simulator is constructed to demonstrate the methodology, and successfully shows that the level of interoperability and autonomy can be greatly improved.

  8. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dorp, F van [NAGRA (Switzerland); and others

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all the steps in the recommended methodology. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The starting point for the methodology has three components. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development. The International FEP List includes FEPs to do with the assessment context. The context examined in detail by

  9. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dorp, F. van [NAGRA (Switzerland)] [and others

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all the steps in the recommended methodology. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The starting point for the methodology has three components. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development. The International FEP List includes FEPs to do with the assessment context. The context examined in

  10. Value at risk methodologies: Developments, implementation and evaluation

    OpenAIRE

    Dong, Simin

    2006-01-01

    Value at Risk (VaR) is a useful concept in risk disclosure, especially for financial institutions. In this paper, the origin and development as well as the regulatory requirement of VaR are discussed. Furthermore, a hypothetical foreign currency forward contract is used as an example to illustrate the implementation of VaR. Back testing is conducted to test the soundness of each VaR model. Analysis in this paper shows that historical simulation and Monte Carlo simulation approaches have more ...

  11. Development of a simplified methodology for the isotopic determination of fuel spent in Light Water Reactors

    International Nuclear Information System (INIS)

    Hernandez N, H.; Francois L, J.L.

    2005-01-01

    The present work presents a simplified methodology to quantify the isotopic content of spent fuel from light water reactors; its application is specific to the Laguna Verde nuclear power plant, by means of an 18-month equilibrium cycle. The methodology is divided into two parts: the first consists of the development of a simplified cell model for the isotopic quantification of the irradiated fuel. With this model, a burnup of 48,000 MWd/tU of the fuel in the reactor core is simulated, taking as a basis a 10x10 fuel assembly and using a two-dimensional simulator for a light water reactor fuel cell (CPM-3). The second part of the methodology is based on the creation of an isotopic decay model, implemented as an algorithm in C++ (decay), to evaluate the amount of radionuclides remaining, by decay, from the time the fuel is irradiated until the time at which reprocessing takes place. Finally, the method used for the quantification of the kilograms of uranium and plutonium obtained from a normalized quantity (1000 kg) of fuel irradiated in a reactor is presented. These results will later allow analyses of the final disposition of the irradiated fuel. (Author)
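
    The decay step can be sketched for single nuclides as simple exponential decay; the C++ 'decay' algorithm itself is not reproduced here, decay chains (Bateman equations) are omitted, and the inventories are invented for illustration. The half-lives used are standard published values:

```python
import math

# Half-lives in years (standard published values)
HALF_LIFE_Y = {"Pu-241": 14.3, "Cs-137": 30.1, "Sr-90": 28.8}

def decayed_amount(n0, nuclide, years):
    """Exponential decay N(t) = N0 * exp(-lambda * t) for a single nuclide;
    ingrowth from parent nuclides (Bateman chains) is omitted here."""
    lam = math.log(2) / HALF_LIFE_Y[nuclide]
    return n0 * math.exp(-lam * years)

# Illustrative: grams per 1000 kg of irradiated fuel after 10 y of cooling
for nuclide, g0 in [("Pu-241", 1200.0), ("Cs-137", 1500.0)]:
    print(f"{nuclide}: {g0:.0f} g -> {decayed_amount(g0, nuclide, 10):.0f} g")
```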

  12. A methodology for ecosystem-scale modeling of selenium

    Science.gov (United States)

    Presser, T.S.; Luoma, S.N.

    2010-01-01

    The main route of exposure for selenium (Se) is dietary, yet regulations lack biologically based protocols for evaluations of risk. We propose here an ecosystem-scale model that conceptualizes and quantifies the variables that determine how Se is processed from water through diet to predators. This approach uses biogeochemical and physiological factors from laboratory and field studies and considers loading, speciation, transformation to particulate material, bioavailability, bioaccumulation in invertebrates, and trophic transfer to predators. Validation of the model is through data sets from 29 historic and recent field case studies of Se-exposed sites. The model links Se concentrations across media (water, particulate, tissue of different food web species). It can be used to forecast toxicity under different management or regulatory proposals or as a methodology for translating a fish-tissue (or other predator tissue) Se concentration guideline to a dissolved Se concentration. The model illustrates some critical aspects of implementing a tissue criterion: 1) the choice of fish species determines the food web through which Se should be modeled, 2) the choice of food web is critical because the particulate material to prey kinetics of bioaccumulation differs widely among invertebrates, 3) the characterization of the type and phase of particulate material is important to quantifying Se exposure to prey through the base of the food web, and 4) the metric describing partitioning between particulate material and dissolved Se concentrations allows determination of a site-specific dissolved Se concentration that would be responsible for that fish body burden in the specific environment. The linked approach illustrates that environmentally safe dissolved Se concentrations will differ among ecosystems depending on the ecological pathways and biogeochemical conditions in that system. Uncertainties and model sensitivities can be directly illustrated by varying exposure
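
    A minimal sketch of the linked-media idea: if each step (water to particulate, particulate to invertebrate, invertebrate to fish) is treated as a linear factor, a fish-tissue criterion can be translated into a site-specific dissolved concentration. The Kd and trophic transfer factor values below are hypothetical placeholders, not the paper's calibrated parameters.

    ```python
    def allowed_dissolved_se(tissue_criterion_ug_g, kd_l_per_g,
                             ttf_invert, ttf_fish):
        """Invert the linked chain
            particulate [ug/g] = Kd [L/g] * dissolved [ug/L]
            invertebrate       = TTF_invert * particulate
            fish tissue        = TTF_fish   * invertebrate
        to find the dissolved Se concentration [ug/L] consistent
        with a fish-tissue criterion [ug/g dry weight]."""
        return tissue_criterion_ug_g / (kd_l_per_g * ttf_invert * ttf_fish)

    # Hypothetical site: Kd = 1.0 L/g, insect TTF = 2.8, fish TTF = 1.1
    print(f"{allowed_dissolved_se(8.0, 1.0, 2.8, 1.1):.2f} ug/L")  # ~2.60
    ```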

  13. SR-Site groundwater flow modelling methodology, setup and results

    International Nuclear Information System (INIS)

    Selroos, Jan-Olof; Follin, Sven

    2010-12-01

    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed: the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in an SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report.

  14. SR-Site groundwater flow modelling methodology, setup and results

    Energy Technology Data Exchange (ETDEWEB)

    Selroos, Jan-Olof (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Follin, Sven (SF GeoLogic AB, Taeby (Sweden))

    2010-12-15

    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed: the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in an SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report.

  15. Methodological Development of the Probabilistic Model of the Safety Assessment of Hontomin P.D.T.; Desarrollo Metodologico del Modelo Probabilista de Evaluacion de Seguridad de la P.D.T. de Hontomin

    Energy Technology Data Exchange (ETDEWEB)

    Hurtado, A; Eguilior, S; Recreo, F

    2011-06-07

    In the framework of CO2 Capture and Geological Storage, Risk Analysis plays an important role, because it is an essential knowledge requirement for the local, national and supranational definition and planning of carbon injection strategies. This is because each project is at risk of failure. Even from the early stages, the possible causes of this risk should be taken into account and corrective methods proposed along the process, i.e., managing risk. Proper risk management reduces the negative consequences arising from the project. The main method of reducing or neutralizing risk is the identification, measurement and evaluation of it, together with the development of decision rules. This report presents the developed methodology for risk analysis and the results of its application. The risk assessment requires determination of the random variables that will influence the functioning of the system. It is very difficult to set up the probability distribution of a random variable in the classical sense (objective probability) when a particular event has rarely occurred or its development is incomplete. In this situation, we have to determine the subjective probability, especially at an early stage of a project, when we do not have enough information about the system. This subjective probability is constructed from expert judgement to estimate the possibility that certain random events could happen depending on the geological features of the area of application. The proposed methodology is based on the application of Bayesian Probabilistic Networks for estimating the probability of leakage risk. These probabilistic networks can graphically define relations of dependence between the variables and the joint probability function through a local factorization of probability functions. (Author) 98 refs.
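
    A toy illustration of the local factorization idea behind Bayesian probabilistic networks: the joint probability is the product of each node's conditional table, and the leakage probability follows by marginalization. The two-parent structure and all probability values below are invented for illustration, not taken from the Hontomin assessment.

    ```python
    from itertools import product

    # Hypothetical two-parent network: fault reactivation (F) and well
    # integrity failure (W) jointly influence CO2 leakage (L).
    p_fault = {True: 0.05, False: 0.95}  # prior, from expert judgement
    p_well = {True: 0.10, False: 0.90}   # prior, from expert judgement
    p_leak_given = {                      # CPT: P(L=True | F, W)
        (True, True): 0.60, (True, False): 0.20,
        (False, True): 0.15, (False, False): 0.01,
    }

    # Joint factorisation P(F, W, L) = P(F) P(W) P(L | F, W);
    # marginalise over F and W to get the overall leakage probability.
    p_leak = sum(
        p_fault[f] * p_well[w] * p_leak_given[(f, w)]
        for f, w in product([True, False], repeat=2)
    )
    print(f"P(leakage) = {p_leak:.4f}")  # -> 0.0348
    ```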

  16. New temperature model of the Netherlands from new data and novel modelling methodology

    Science.gov (United States)

    Bonté, Damien; Struijk, Maartje; Békési, Eszter; Cloetingh, Sierd; van Wees, Jan-Diederik

    2017-04-01

    Deep geothermal energy has grown in interest in Western Europe in the last decades, for direct use but also, as knowledge of the subsurface improves, for electricity generation. In the Netherlands, where the sector took off with the first system in 2005, geothermal energy is seen as a key player for a sustainable future. Knowledge of the subsurface temperature, together with the available flow from the reservoir, is an important factor that can determine the success of a geothermal energy project. To support the development of deep geothermal energy systems in the Netherlands, we made a first assessment of the subsurface temperature based on thermal data but also on geological elements (Bonté et al, 2012). An outcome of this work was ThermoGIS, which uses the temperature model. This work is a revision of the model that is used in ThermoGIS. The improvements over the first model are multiple: we have improved not only the dataset used for the calibration and the structural model, but also the methodology, through improved software (called b3t). The temperature dataset has been updated by integrating temperatures from newly accessible wells. The sedimentary description of the basin has been improved by using an updated and refined structural model and an improved lithological definition. A major improvement comes from the methodology used to perform the modelling: with b3t the calibration is made using not only the lithospheric parameters but also the thermal conductivity of the sediments. The result is a much more accurate definition of the parameters for the model and a perfected handling of the calibration process. The result obtained is a precise and improved temperature model of the Netherlands. The thermal conductivity variation in the sediments associated with the geometry of the layers is an important factor in temperature variations, and the influence of the Zechstein salt in the north of the country is important. In addition, the radiogenic heat

  17. Methodological NMR imaging developments to measure cerebral perfusion

    International Nuclear Information System (INIS)

    Pannetier, N.

    2010-12-01

    This work focuses on acquisition techniques and physiological models that allow characterization of cerebral perfusion by MRI. The arterial input function (AIF), on which many models are based, is measured by a technique of optical imaging at the carotid artery in rats. The reproducibility and repeatability of the AIF are discussed and a model function is proposed. Then we compare two techniques for measuring the vessel size index (VSI) in rats bearing a glioma. The reference technique, using a USPIO contrast agent (CA), is compared with the dynamic approach that estimates this parameter during the passage of a bolus of Gd. The latter technique has the advantage of being usable clinically. The results obtained at 4.7 T by both approaches are similar, and the use of VSI in clinical protocols is strongly encouraged at high field. The mechanisms involved (R1 and R2* relaxivities) were then studied using a multi-gradient-echo approach. A multi-echo spiral sequence is developed and a method that allows refocusing between each echo is presented. This sequence is used to characterize the impact of R1 effects during the passage of two successive injections of Gd. Finally, we developed a tool for simulating the NMR signal on a 2D geometry taking into account the permeability of the BBB and the CA diffusion in the interstitial space. At short TE, the effect of diffusion on the signal is negligible. In contrast, the effects of diffusion and permeability may be separated at long echo time. Finally we show that during the extravasation of the CA, the local magnetic field homogenization due to the decrease of the magnetic susceptibility difference at vascular interfaces is quickly balanced by the perturbations induced by the increase of the magnetic susceptibility difference at the cellular interfaces in the extravascular compartment. (author)

  18. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

    Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g. valve) failure aspect of passive system failure and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program.
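
    The component-failure side of such a methodology uses standard fault tree logic; the sketch below evaluates a top-event probability from independent basic events through OR/AND gates. The system layout and failure probabilities are hypothetical, and the natural-process uncertainty treated by expert judgment in the study is not modeled here.

    ```python
    def or_gate(*p):
        """Probability that at least one independent input event occurs."""
        q = 1.0
        for pi in p:
            q *= (1.0 - pi)
        return 1.0 - q

    def and_gate(*p):
        """Probability that all independent input events occur."""
        q = 1.0
        for pi in p:
            q *= pi
        return q

    # Hypothetical passive-injection system: two parallel valve trains,
    # each failing if its check valve OR its isolation valve fails to open.
    p_train = or_gate(1e-3, 5e-4)           # per-demand failure of one train
    p_system = and_gate(p_train, p_train)   # both trains must fail
    print(f"System failure probability per demand: {p_system:.2e}")
    ```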

  19. Development of a methodology for the safety assessment of near surface disposal facilities for radioactive waste

    International Nuclear Information System (INIS)

    Simon, I.; Cancio, D.; Alonso, L.F.; Agueero, A.; Lopez de la Higuera, J.; Gil, E.; Garcia, E.

    2000-01-01

    The Project on the Environmental Radiological Impact at CIEMAT is developing, for the Spanish regulatory body Consejo de Seguridad Nuclear (CSN), a methodology for the safety assessment of near surface disposal facilities. This method has been developed incorporating elements from participation in the IAEA's ISAM Programme (Improving Long Term Safety Assessment Methodologies for Near Surface Radioactive Waste Disposal Facilities). The first step of the approach is the consideration of the assessment context, including the purpose of the assessment, the end-points, philosophy, disposal system, source term and temporal scales, as well as the hypothesis about the critical group. Once the context has been established, and considering the peculiarities of the system, a specific list of features, events and processes (FEPs) is produced. These will be incorporated into the assessment scenarios. The set of scenarios will be represented in the conceptual and mathematical models. By the use of mathematical codes, calculations are performed to obtain results (i.e. in terms of doses) to be analysed and compared against the criteria. The methodology is being tested by application to a hypothetical engineered disposal system based on an exercise within the ISAM Programme, and will finally be applied to the Spanish case. (author)

  20. Development of Testing Methodologies to Evaluate Postflight Locomotor Performance

    Science.gov (United States)

    Mulavara, A. P.; Peters, B. T.; Cohen, H. S.; Richards, J. T.; Miller, C. A.; Brady, R.; Warren, L. E.; Bloomberg, J. J.

    2006-01-01

    Crewmembers experience locomotor and postural instabilities during ambulation on Earth following their return from space flight. Gait training programs designed to facilitate recovery of locomotor function following a transition to a gravitational environment need to be accompanied by relevant assessment methodologies to evaluate their efficacy. The goal of this paper is to demonstrate the operational validity of two tests of locomotor function that were used to evaluate performance after long duration space flight missions on the International Space Station (ISS).

  1. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdf's), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions whose accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to code models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions.
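
    A schematic sketch of the two ingredients named above, using off-the-shelf stand-ins: a Kruskal-Wallis test decides whether error samples from different conditions share a distribution, and a kernel density estimate (here scipy's gaussian_kde, not the paper's own non-parametric estimator) plays the role of the error pdf. The error samples are synthetic.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde, kruskal

    # Hypothetical model-error samples (predicted minus measured void
    # fraction) from two separate-effect test series at different pressures.
    rng = np.random.default_rng(1)
    err_low_p = rng.normal(0.02, 0.05, 200)
    err_high_p = rng.normal(-0.01, 0.03, 200)

    # Kruskal-Wallis: do the two error samples share a distribution?
    # If not, keep them as separate clusters, each with its own error pdf.
    stat, pval = kruskal(err_low_p, err_high_p)
    if pval < 0.05:
        pdfs = [gaussian_kde(err_low_p), gaussian_kde(err_high_p)]
    else:
        pdfs = [gaussian_kde(np.concatenate([err_low_p, err_high_p]))]
    print(f"H = {stat:.1f}, p = {pval:.3g}, clusters = {len(pdfs)}")
    ```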

  2. Methodology to develop a training program as a tool for energy management

    Directory of Open Access Journals (Sweden)

    Mónica Rosario Berenguer-Ungaro

    2017-12-01

    Full Text Available The paper aims to present a methodology to develop a training program to improve labor skills that enhance the efficient use of energy resources; the goal is to make training timely, to meet training needs as they arise, and to make the person who receives the training its protagonist. It is based on the training-action and action-research methods and on Kirkpatrick's model for evaluating training, which evaluates four levels: reaction, learning, behavior and results. The methodology is structured in three stages: (1) diagnosis of knowledge, (2) intervention based on the results, and (3) evaluation and feedback for continuous improvement. Objectives and implementation tools are identified for each stage. Evaluation is transverse to the entire program, and it is through evaluation that decisions for feedback loops are taken.

  3. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

    In this paper we discuss the notion of a learning methodology and situate this within the wider frame of learning design or 'Designing for Learning'. We discuss existing work within this broad area by trying to categorize different approaches and interpretations and we present our development ... of particular 'mediating design artefacts'. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design; the COllaborative E-learning Design method (CoEd). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...

  4. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

    2017-01-01

    In this paper we discuss the notion of a learning methodology and situate this within the wider frame of learning design or 'Designing for Learning'. We discuss existing work within this broad area by trying to categorize different approaches and interpretations and we present our development ... of particular 'mediating design artefacts'. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design; the COllaborative E-learning Design method (CoEd). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...

  5. Methodology for assessing electric vehicle charging infrastructure business models

    International Nuclear Information System (INIS)

    Madina, Carlos; Zamora, Inmaculada; Zabala, Eduardo

    2016-01-01

    The analysis of economic implications of innovative business models in networked environments, as electro-mobility is, requires a global approach to ensure that all the involved actors obtain a benefit. Although electric vehicles (EVs) provide benefits for the society as a whole, there are a number of hurdles for their widespread adoption, mainly the high investment cost for the EV and for the infrastructure. Therefore, a sound business model must be built up for charging service operators, which allows them to recover their costs while, at the same time, offer EV users a charging price which makes electro-mobility comparable to internal combustion engine vehicles. For that purpose, three scenarios are defined, which present different EV charging alternatives, in terms of charging power and charging station ownership and accessibility. A case study is presented for each scenario and the required charging station usage to have a profitable business model is calculated. We demonstrate that private home charging is likely to be the preferred option for EV users who can charge at home, as it offers a lower total cost of ownership under certain conditions, even today. On the contrary, finding a profitable business case for fast charging requires more intensive infrastructure usage. - Highlights: • Ecosystem is a network of actors who collaborate to create a positive business case. • Electro-mobility (electricity-powered road vehicles and ICT) is a complex ecosystem. • Methodological analysis to ensure that all actors benefit from electro-mobility. • Economic analysis of charging infrastructure deployment linked to its usage. • Comparison of EV ownership cost vs. ICE for vehicle users.
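
    A back-of-the-envelope sketch of the usage-versus-profitability question raised above: how many charging sessions per day are needed to recover investment and operating costs over the station lifetime. All cost figures are hypothetical and the calculation is undiscounted, unlike a full business-case analysis.

    ```python
    def breakeven_sessions_per_day(capex, opex_per_year, margin_per_kwh,
                                   kwh_per_session, years=10):
        """Charging sessions per day needed for the station owner to
        recover investment and operating cost over the lifetime
        (undiscounted, for illustration only)."""
        total_cost = capex + opex_per_year * years
        revenue_per_session = margin_per_kwh * kwh_per_session
        return total_cost / (revenue_per_session * 365 * years)

    # Hypothetical 50 kW fast charger: 40 kEUR capex, 3 kEUR/y opex,
    # 0.15 EUR/kWh margin, 20 kWh delivered per session.
    n = breakeven_sessions_per_day(40_000, 3_000, 0.15, 20.0)
    print(f"{n:.1f} sessions/day to break even")  # -> ~6.4
    ```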

  6. Development of a new methodology for the creation of water temperature scenarios using frequency analysis tool.

    Science.gov (United States)

    Val, Jonatan; Pino, María Rosa; Chinarro, David

    2018-03-15

    Thermal quality in river ecosystems is a fundamental property for the development of biological processes and many of the human activities linked to the aquatic environment. In the future, this property is going to be threatened due to global change impacts, and basin managers will need useful tools to evaluate these impacts. Currently, future projections in temperature modelling are based on the historical data for air and water temperatures, and the relationship with past temperature scenarios; however, this represents a problem when evaluating future scenarios with new thermal impacts. Here, we analysed the thermal impacts produced by several human activities, and linked them with the degree of decoupling of the thermal transfer mechanism from natural systems, measured with frequency analysis tools (wavelet coherence). Once this relationship was established, we developed a new methodology for simulating different thermal impact scenarios in order to project them into the future. Finally, we validated this methodology using a site that changed its thermal quality during the studied period due to human impacts. Results showed a high correlation (r² = 0.84) between the degree of decoupling of the thermal transfer mechanisms and the quantified human impacts, yielding 3 thermal impact scenarios. Furthermore, the graphic representation of these thermal scenarios with their wavelet coherence spectra showed the impacts of an extreme drought period and of agricultural management. The inter-conversion between the scenarios gave high morphological similarities in the obtained wavelet coherence spectra, and the validation process clearly showed the high efficiency of the developed model against older methodologies when compared using the Nash-Sutcliffe criterion. Although there is need for further investigation with different climatic and anthropic management conditions, the developed frequency models could be useful in decision-making processes by managers when faced with future global
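
    A simplified sketch of the coupling idea: the paper uses wavelet coherence (a time-frequency measure), while the example below uses scipy's ordinary magnitude-squared coherence between synthetic air and water temperature series as a frequency-only analogue. A thermally impacted reach would show reduced air-water coherence at the dominant bands; all series and parameters are invented.

    ```python
    import numpy as np
    from scipy.signal import coherence

    # Synthetic daily air (x) and water (y) temperature series
    rng = np.random.default_rng(2)
    t = np.arange(3 * 365)
    air = 12 + 8 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, t.size)
    water = 11 + 6 * np.sin(2 * np.pi * t / 365 - 0.3) + rng.normal(0, 1, t.size)

    # Magnitude-squared coherence as a frequency-only stand-in for
    # the time-frequency wavelet coherence used in the paper.
    f, cxy = coherence(air, water, fs=1.0, nperseg=365)
    annual = cxy[np.argmin(np.abs(f - 1 / 365))]
    print(f"coherence at the annual band: {annual:.2f}")
    ```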

  7. Contribution to developing the environment radiation protection methodology

    Energy Technology Data Exchange (ETDEWEB)

    Oudalova, A. [Institute of Atomic Power Engineering NRNU MEPhI (Russian Federation); Alexakhin, R.; Dubynina, M. [Russian Institute of Agricultural Radiology and Agroecology (Russian Federation)

    2014-07-01

    Sustainable development of the environment and the protection of biota, including radiation protection of the environment, are issues of current interest in society. Work is ongoing on the development of a system of radiation protection for non-human biota. Anthropocentric and eco-centric principles are widely discussed. ICRP Publications 103, 108, 114 and many other reports and articles refer to the topic of environmental protection, the set of reference animals and plants, corresponding transfer parameters, dose models and derived consideration reference levels. There is still an open field for discussion of methods and approaches to arrive at a well-established procedure to assess environmental risks of radiation impacts on different organisms, populations and ecosystems. Substantial work has been done by the ICRP and other organizations and research groups to develop and systematize approaches for this difficult subject. This activity, however, is not well known and perceived everywhere, and more efforts are needed to bring the ideas of an eco-centric strategy in environmental radiation protection not only to the public but to specialists in many countries as well. One of the main points of interest is the assessment of critical doses and dose rates for flora and fauna species. Some aspects of a possible procedure to find their estimates are studied in this work, including criteria for datasets of good quality, models of dose dependence, sensitivity of different umbrella endpoints and methods for treating the original massive datasets. Estimates are made based on information gathered in a database on radiation-induced effects in plants. Data on biological effects in plants (umbrella endpoints of reproductive potential, survival, morbidity, morphological, biochemical, and genetic effects) as a function of dose and dose rate of ionizing radiation have been collected from reviewed publications and maintained in MS Access format. The database now contains about 7000 datasets and 25000 records

  8. A Comparison of Various Software Development Methodologies: Feasibility and Methods of Integration

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2016-12-01

    Full Text Available System development methodologies which have been used in academic and commercial environments during the last two decades have advantages and disadvantages. Researchers have tried to identify the objectives, scope, etc. of the methodologies by following different approaches. Each approach has its limitations, specific interests, coverage, etc. In this paper, we perform a comparative study of those methodologies which are popular and commonly used in banking and commercial environments. In our study we determine the objectives, scope, tools and other features of the methodologies. We also determine how and to what extent the methodologies incorporate facilities such as project management, cost-benefit analysis, documentation, etc. One of the most important aspects of our study was how to integrate the methodologies and develop a global methodology which covers the complete span of the software development life cycle. A prototype system which integrates the selected methodologies has been developed. The developed system helps analysts and designers choose suitable tools or obtain guidelines on what to do in a particular situation. The prototype system was tested during the development of software for an ATM (Automated Teller Machine) by selecting and applying the SASD methodology during software development. This resulted in the development of a high quality and well documented software system.

  9. Developments in the Tools and Methodologies of Synthetic Biology

    Science.gov (United States)

    Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul

    2014-01-01

    Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788

  10. Developments in the tools and methodologies of synthetic biology

    Directory of Open Access Journals (Sweden)

    Richard eKelwick

    2014-11-01

    Full Text Available Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices or systems. However, biological systems are generally complex and unpredictable and are therefore intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a ‘body of knowledge’ from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled and its functionality tested. At each stage of the design cycle an expanding repertoire of tools is being developed. In this review we highlight several of these tools in terms of their applications and benefits to the synthetic biology community.

  11. APPLICATION OF METHODOLOGY OF STRATEGIC PLANNING IN DEVELOPING NATIONAL PROGRAMMES ON DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Inna NOVAK

    2015-07-01

    Full Text Available Actuality: The main purpose of strategic planning is that the long-term interests of sustainable development of a market economy require the use of effective measures of state regulation of economic and social processes. Objective: The aim of the article is to analyze the development of strategic planning methodology and the practical experience of its application in the design of national development programs. Methods: When writing the article the following research methods were used: analysis and synthesis, target-oriented and monographic. Results: In Ukraine, strategies for the development of branches, regions, cities, etc. are being developed at the level of state and local government authorities, but given the lack of state funding, a unified investment strategy for the country has not been developed. After analyzing the development of the strategic planning methodology and examples of its application in the design of state development programs, we identified the need to develop an investment strategy of the state (sectors, regions, etc.), since clearly defined directions and guidelines of activity will increase the investment level in the country and support the national strategy “Ukraine-2020”.

  12. Development and Evaluation of a Methodology for the Generation of Gridded Isotopic Datasets

    Energy Technology Data Exchange (ETDEWEB)

    Argiriou, A. A.; Salamalikis, V [University of Patras, Department of Physics, Laboratory of Atmospheric Physics, Patras (Greece); Lykoudis, S. P. [National Observatory of Athens, Institute of Environmental and Sustainable Development, Athens (Greece)

    2013-07-15

    The accurate knowledge of the spatial distribution of stable isotopes in precipitation is necessary for several applications. Since the number of rain sampling stations is small and they are unevenly distributed around the globe, the global distribution of stable isotopes can be calculated via the generation of gridded isotopic data sets. Several methods have been proposed for this purpose. In this work a methodology is proposed for the development of 10' x 10' gridded isotopic data from precipitation in the central and eastern Mediterranean. Statistical models are developed taking into account geographical and meteorological parameters as regressors. The residuals are interpolated onto the grid using ordinary kriging and thin plate splines. The result is added to the model grids to obtain the final isotopic gridded data sets. Models are evaluated using an independent data set. The overall performance of the procedure is satisfactory and the obtained gridded data reproduce the isotopic parameters successfully. (author)
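
    A compact sketch of the two-step procedure described above: regress the isotopic signal on geographic covariates, interpolate the residuals (here with a thin-plate-spline radial basis function from scipy, one of the two interpolators the record names), and add the interpolated residual back to the regression surface. Station data and coefficients are synthetic.

    ```python
    import numpy as np
    from scipy.interpolate import Rbf

    # Hypothetical station data: longitude, latitude, altitude (m) and
    # precipitation d18O (per mil) at rain-sampling stations.
    rng = np.random.default_rng(3)
    lon, lat = rng.uniform(20, 30, 40), rng.uniform(34, 42, 40)
    alt = rng.uniform(0, 1500, 40)
    d18o = -4.0 - 0.0028 * alt - 0.3 * (lat - 34) + rng.normal(0, 0.3, 40)

    # Step 1: regression on geographic covariates (altitude + latitude).
    X = np.column_stack([np.ones_like(alt), alt, lat])
    beta, *_ = np.linalg.lstsq(X, d18o, rcond=None)
    residuals = d18o - X @ beta

    # Step 2: interpolate the residuals with a thin-plate-spline RBF and
    # add them back to the regression surface at a target grid node.
    tps = Rbf(lon, lat, residuals, function="thin_plate")
    glon, glat, galt = 25.0, 38.0, 600.0
    estimate = beta @ [1.0, galt, glat] + tps(glon, glat)
    print(f"gridded d18O estimate: {estimate:.2f} per mil")
    ```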

  13. Development and Evaluation of a Methodology for the Generation of Gridded Isotopic Datasets

    International Nuclear Information System (INIS)

    Argiriou, A.A.; Salamalikis, V; Lykoudis, S.P.

    2013-01-01

    The accurate knowledge of the spatial distribution of stable isotopes in precipitation is necessary for several applications. Since the number of rain sampling stations is small and they are unevenly distributed around the globe, the global distribution of stable isotopes can be calculated via the generation of gridded isotopic data sets. Several methods have been proposed for this purpose. In this work a methodology is proposed for the development of 10' x 10' gridded isotopic data from precipitation in the central and eastern Mediterranean. Statistical models are developed taking into account geographical and meteorological parameters as regressors. The residuals are interpolated onto the grid using ordinary kriging and thin plate splines. The result is added to the model grids to obtain the final isotopic gridded data sets. Models are evaluated using an independent data set. The overall performance of the procedure is satisfactory and the obtained gridded data reproduce the isotopic parameters successfully. (author)

  14. A methodology model for quality management in a general hospital.

    Science.gov (United States)

    Stern, Z; Naveh, E

    1997-01-01

    A reappraisal is made of the relevance of industrial modes of quality management to the issues of medical care. Analysis of the nature of medical care, which differentiates it from the supplier-client relationships of industry, presents the main intrinsic characteristics, which create problems in application of the industrial quality management approaches to medical care. Several examples are the complexity of the relationship between the medical action and the result obtained, the client's nonacceptance of economic profitability as a value in his medical care, and customer satisfaction biased by variable standards of knowledge. The real problems unique to hospitals are addressed, and a methodology model for their quality management is offered. Included is a sample of indicator vectors, measurements of quality care, cost of medical care, quality of service, and human resources. These are based on the trilogy of planning quality, quality control, and improving quality. The conclusions confirm the inadequacy of industrial quality management approaches for medical institutions and recommend investment in formulation of appropriate concepts.

  15. THEORETIC AND METHODOLOGIC BASICS OF DEVELOPMENT OF THE NATIONAL LOGISTICS SYSTEM IN THE REPUBLIC OF BELARUS

    Directory of Open Access Journals (Sweden)

    R. B. Ivut

    2016-01-01

    Full Text Available The article presents the results of a study whose aim is the formation of theoretical and methodological foundations as scientific support for the further development of the national logistics system in the Republic of Belarus. The relevance of the study relates to the fact that, at present, the introduction of the logistics concept and the formation of the optimal infrastructure for its implementation are key factors for the economic development of Belarus as a transit country. At the same time, the pace of development of logistics activities in the country is currently somewhat lower in comparison with the neighboring countries, as evidenced by the dynamics of the country's position in international rankings (in particular, according to the LPI index). Overcoming these gaps requires improved competitiveness of the logistics infrastructure in the international market. This, in turn, is possible due to the clear formulation of, and adherence to, effective functioning principles for the macro-level logistics system of Belarus, as well as by increasing the quality of logistics design by means of the econometric models and methods presented in the article. The proposed approach differentiates between the general principles of logistics specific to logistics systems at all levels and the specific principles of development of the macro-level logistics system related to improving its transit attractiveness for international freight carriers. The study also systematizes the models for determining the optimal location of logistics facilities. Particular attention is paid to the methodological basis of the analysis of transport terminals functioning as part of logistics centers, both in the design and operation stages. The developed theoretical and methodological recommendations are universal and can be used in the design of logistics infrastructure for various purposes and functions

  16. Advances in Artificial Neural Networks – Methodological Development and Application

    Directory of Open Access Journals (Sweden)

    Yanbo Huang

    2009-08-01

    Full Text Available Artificial neural networks, as a major soft-computing technology, have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of training algorithms for other networks such as radial basis function, recurrent, feedback, and unsupervised Kohonen self-organizing networks. These networks, especially the multilayer perceptron network with a backpropagation training algorithm, have gained recognition in research and applications in various scientific and engineering areas. In order to accelerate the training process and overcome data over-fitting, research has been conducted to improve the backpropagation algorithm. Further, artificial neural networks have been integrated with other advanced methods such as fuzzy logic and wavelet analysis, to enhance the ability of data interpretation and modeling and to avoid subjectivity in the operation of the training algorithm. In recent years, support vector machines have emerged as a set of high-performance supervised generalized linear classifiers in parallel with artificial neural networks. A review of the development history of artificial neural networks is presented, and the standard architectures and algorithms of artificial neural networks are described. Furthermore, advanced artificial neural networks will be introduced together with support vector machines, and limitations of ANNs will be identified. The future of artificial neural network development in tandem with support vector machines will be discussed in conjunction with further applications to food science and engineering, soil and water relationship for crop management, and decision support for precision agriculture. Along with the network structures and training algorithms, the applications of artificial neural networks will be reviewed as well, especially in the fields of agricultural and biological
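
    A minimal side-by-side of the two model families discussed above, using scikit-learn: a backpropagation-trained multilayer perceptron and an RBF-kernel support vector classifier fitted to the same synthetic data. Architectures and hyperparameters are arbitrary illustrative choices.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.svm import SVC

    # Synthetic data standing in for any tabular classification task
    X, y = make_classification(n_samples=600, n_features=10, random_state=0)
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

    # Multilayer perceptron trained by backpropagation
    mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0).fit(Xtr, ytr)
    # Support vector classifier with an RBF kernel
    svm = SVC(kernel="rbf", C=1.0).fit(Xtr, ytr)

    print(f"MLP accuracy: {mlp.score(Xte, yte):.3f}")
    print(f"SVM accuracy: {svm.score(Xte, yte):.3f}")
    ```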

  17. Engendering Development: Some Methodological Perspectives on Child Labour

    Directory of Open Access Journals (Sweden)

    Erica Burman

    2006-01-01

    Full Text Available In this article I address when and why it is useful to focus on gender in the design and conceptualisation of developmental psychological research. Since methodological debates treated in the abstract tend to lack both the specificity and rigour that application to a particular context or topic imports, I take a particular focus for my discussion: child labour. In doing so I hope to highlight the analytical and practical gains of bringing gendered agendas alongside, and into, developmental research. While child labour may seem a rather curious topic for a discussion of developmental psychological research practice, this article will show how it indicates with particular clarity issues that mainstream psychological research often occludes or forgets. In particular, I explore the analytical and methodological benefits of exploring the diverse ways gender structures notions of childhood, alongside the developmental commonalities and asymmetries of gender and age as categories. I suggest that the usual assumed elision between women and children is often unhelpful for both women and children. Instead, an analytical attention to the shifting forms and relations of children's work facilitates more differentiated perspectives on how its meanings reflect economic and cultural (including gendered) conditions, and so attends better to social inequalities. These inequalities also structure the methodological conditions and paradigms for research with children, and so the article finishes by elaborating from this discussion of child labour four key principles for engendering psychological research with and about children, which also have broader implications for conceptualisations of the relations between gender, childhood, culture and families. URN: urn:nbn:de:0114-fqs060111

  18. The Methodology of Management for Long Term Energy Efficiency Development

    International Nuclear Information System (INIS)

    Zebergs, V.; Kehris, O.; Savickis, J.; Zeltins, N.

    2010-01-01

    The paper has shown that the Member States of the European Union (EU) do what they can in order to accelerate the raising of energy efficiency (EE). In each EU Member State, investigations are conducted into planning and management methods with a view to achieving faster and greater EE gains. In Latvia, which imports almost 70% of the total energy resources consumed, the saving of each 'toe' (tonne of oil equivalent) is of great importance. Adaptation of the general policy assessment methodology is being studied for the planning and management of the EE process. 12 EE management methods have been analysed and recommendations worked out for the introduction of several of the most topical methods. (author)

  19. Complex methodology of the model elaboration of the quantified transnationalization process assessment

    Directory of Open Access Journals (Sweden)

    Larysa Rudenko-Sudarieva

    2009-03-01

    Full Text Available The article studies the theoretical fundamentals of transnationalization and the peculiarities of its development, based on a study of world theory and practice; suggests a systematic methodical approach to defining the economic category of «transnationalization», together with the author's own definition; develops a complex methodology for building a model of quantified transnationalization process assessment, based on a seven-milestone algorithm for the formation of key indicators; and systematizes and synthesizes empirical investigations concerning the state and development of current tendencies, with a comparative analysis of transnationalization levels within separate TNC groups.

  20. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    Directory of Open Access Journals (Sweden)

    Johnston Marie

    2007-03-01

    Full Text Available Abstract Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its (i) theoretical framework (was there a definition of what was being assessed and was it part of a theoretical model?) and (ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest) and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of theoretical framework for both clinician report and patient self-report measures. This review also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological

  1. Development and Application of Urban Landslide Vulnerability Assessment Methodology Reflecting Social and Economic Variables

    Directory of Open Access Journals (Sweden)

    Yoonkyung Park

    2016-01-01

    Full Text Available An urban landslide vulnerability assessment methodology is proposed with a major focus on urban social and economic aspects. The proposed methodology was developed based on the landslide susceptibility maps that the Korean Forest Service utilizes to identify landslide source areas. First, debris flows are propagated to urban areas from such source areas by Flow-R (flow path assessment of gravitational hazards at a regional scale), and then urban vulnerability is assessed in two categories: physical and socioeconomic. The physical vulnerability is related to buildings that can be impacted by a landslide event. This study considered two popular building structure types, reinforced-concrete frame and nonreinforced-concrete frame, to assess the physical vulnerability. The socioeconomic vulnerability is considered a function of the resistance levels of the vulnerable people, the trigger factor of secondary damage, and the preparedness level of the local government. An index-based model is developed to evaluate the loss of life and indirect damage under landslides as well as the resilience against disasters. To illustrate the validity of the proposed methodology, physical and socioeconomic vulnerability levels are analyzed for Seoul, Korea, using the suggested approach. The general trend found in this study indicates that areas with higher population density and weaker fiscal conditions located downstream of mountainous areas are more vulnerable than areas in the opposite conditions.
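
    A sketch of what an index-based socioeconomic vulnerability score could look like for the three factors named above (resistance of the population, trigger factor of secondary damage, local-government preparedness). The weights, the linear form, and the inputs are invented for illustration, not the paper's calibrated index.

    ```python
    def socioeconomic_vulnerability(resistance, secondary_trigger, preparedness,
                                    weights=(0.4, 0.3, 0.3)):
        """Index-style score in [0, 1]; all inputs assumed pre-normalised
        to [0, 1]. Low resistance and preparedness and a high secondary-
        damage trigger factor raise the score."""
        w1, w2, w3 = weights
        return (w1 * (1 - resistance)
                + w2 * secondary_trigger
                + w3 * (1 - preparedness))

    # Hypothetical district: low resistance (0.3), moderate secondary-
    # damage trigger (0.5), weak preparedness (0.4)
    print(f"{socioeconomic_vulnerability(0.3, 0.5, 0.4):.2f}")  # -> 0.61
    ```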

  2. DATA MINING METHODOLOGY FOR DETERMINING THE OPTIMAL MODEL OF COST PREDICTION IN SHIP INTERIM PRODUCT ASSEMBLY

    Directory of Open Access Journals (Sweden)

    Damir Kolich

    2016-03-01

    Full Text Available In order to accurately predict the costs of the thousands of interim products that are assembled in shipyards, it is necessary to use skilled engineers to develop detailed Gantt charts for each interim product separately, which takes many hours. It is therefore helpful to develop a prediction tool to estimate the cost of interim products accurately and quickly without the need for skilled engineers. This will drive down shipyard costs and improve competitiveness. Data mining is used extensively for developing prediction models in other industries. Since ships consist of thousands of interim products, it is logical to develop a data mining methodology for a shipyard or any other manufacturing industry where interim products are produced. The methodology involves analysis of existing interim products and data collection. Pre-processing and principal component analysis are done to make the data “user-friendly” for later prediction processing and the development of both accurate and robust models. The support vector machine is demonstrated to be the better model when there is a lower number of tuples. However, as the number of tuples increases to over 10,000, the artificial neural network model is recommended.
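
    A toy version of the comparison workflow just described: standardise, reduce with principal component analysis, then fit both a support vector regressor and a neural network regressor on synthetic interim-product data. Feature meanings, sizes and hyperparameters are invented; the point is the pipeline shape, not the numbers.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    # Synthetic interim-product features (e.g. weld length, plate count,
    # steel weight, ...) against assembly man-hours, for illustration only.
    rng = np.random.default_rng(4)
    X = rng.normal(size=(500, 8))
    y = 40 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(0, 2, 500)

    for name, model in [("SVR", SVR(C=10.0)),
                        ("ANN", MLPRegressor(hidden_layer_sizes=(32,),
                                             max_iter=3000, random_state=0))]:
        pipe = make_pipeline(StandardScaler(), PCA(n_components=5), model)
        pipe.fit(X[:400], y[:400])
        print(name, f"R^2 = {pipe.score(X[400:], y[400:]):.3f}")
    ```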

  3. Development of new assessment methodology for locally corroded pipe

    International Nuclear Information System (INIS)

    Lim, Hwan; Shim, Do Jun; Kim, Yun Jae; Kim, Young Jin

    2002-01-01

    In this paper, a unified methodology based on the local stress concept to estimate residual strength of locally thinned pipes is proposed. An underlying idea of the proposed methodology is that the local stress in the minimum section for locally thinned pipe is related to the reference stress, popularly used in creep problems. Then the problem remains how to define the reference stress, that is the reference load. Extensive three-dimensional Finite Element (FE) analyses were performed to simulate full-scale pipe tests conducted for various shapes of wall thinned area under internal pressure and bending moment. Based on these FE results, the reference load is proposed, which is independent of materials. A natural outcome of this method is the maximum load capacity. By comparing with existing test results, it is shown that the reference stress is related to the fracture stress, which in turn can be posed as the fracture criterion of locally thinned pipes. The proposed method is powerful as it can be easily generalised to more complex problems, such as pipe bends and tee-joints

  4. Development of Risk Assessment Methodology for State's Nuclear Security Regime

    International Nuclear Information System (INIS)

    Jang, Sung Soon; Seo, Hyung Min; Lee, Jung Ho; Kwak, Sung Woo

    2011-01-01

    Threats of nuclear terrorism have been increasing since the 9/11 terrorist attack. Threats include a nuclear explosive device (NED) made by terrorist groups, radiological damage caused by sabotage targeting nuclear facilities, and a radiological dispersion device (RDD), also called a 'dirty bomb'. In 9/11, Al Qaeda planned to cause radiological consequences by crashing a captured airplane into a nuclear power plant. Evidence of a dirty bomb experiment was found in Afghanistan by the UK intelligence agency. Thus, the international community, including the IAEA, is making substantial efforts. The leaders of 47 nations attended the 2010 nuclear security summit hosted by President Obama, while the next global nuclear summit will be held in Seoul in 2012. Most states have established and are maintaining a nuclear security regime because of the increasing threat and their international obligations. However, each state's nuclear security regime is different and depends on the state's environment. A methodology for the assessment of a state's nuclear security regime is necessary to design and implement an efficient nuclear security regime, and to figure out weak points. The IAEA's INPRO project suggests a checklist method for a state's nuclear security regime. The IAEA is now researching more quantitative methods cooperatively with several countries, including Korea. In this abstract, methodologies to evaluate a state's nuclear security regime by risk assessment are addressed.

  5. A methodology to model flow-thermals inside a domestic gas oven

    International Nuclear Information System (INIS)

    Mistry, Hiteshkumar; Ganapathisubbu, S.; Dey, Subhrajit; Bishnoi, Peeush; Castillo, Jose Luis

    2011-01-01

    In this paper, the authors describe the development of a CFD-based methodology to evaluate the performance of a domestic gas oven. This involves modeling a three-dimensional, unsteady, forced convective flow field coupled with radiatively participating media. Various strategies for capturing transient heat transfer coupled with a mixed convection flow field are evaluated considering the trade-off between computational time and accuracy of predictions. A new technique for modeling the gas oven that does not require detailed modeling of flow-thermals through the burner is highlighted. Experiments carried out to support this modeling development show that heat transfer from burners can be represented as non-dimensional false-bottom temperature profiles. Transient validation of this model against experiments shows less than 6% discrepancy in the thermal field during preheating in the bake cycle of the gas oven.

  6. The development of a safety analysis methodology for the optimized power reactor 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun; Yo-Han, Kim

    2005-01-01

    Korea Electric Power Research Institute (KEPRI) has been developing in-house safety analysis methodologies based on the codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For the Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for the Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. Also, for the non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents mentioned in the final safety analysis report (FSAR) were analyzed. (author)

  7. A methodology for developing anisotropic AAA phantoms via additive manufacturing.

    Science.gov (United States)

    Ruiz de Galarreta, Sergio; Antón, Raúl; Cazón, Aitor; Finol, Ender A

    2017-05-24

    An Abdominal Aortic Aneurysm (AAA) is a permanent focal dilatation of the abdominal aorta of at least 1.5 times its normal diameter. The criterion of maximum diameter is still used in clinical practice, although numerical studies have demonstrated the importance of biomechanical factors for rupture risk assessment. AAA phantoms could be used for experimental validation of the numerical studies and for pre-intervention testing of endovascular grafts. We have applied multi-material 3D printing technology to manufacture idealized AAA phantoms with anisotropic mechanical behavior. Different composites were fabricated and the phantom specimens were characterized by biaxial tensile tests, using a constitutive model to fit the experimental data. One composite was chosen to manufacture the phantom based on having the same mechanical properties as those reported in the literature for human AAA tissue; the strain energy and anisotropic index were compared to make this choice. The materials for the matrix and fibers of the selected composite are, respectively, the digital materials FLX9940 and FLX9960 developed by Stratasys. The fiber proportion of the composite is equal to 0.15. The differences between the composite behavior and the AAA tissue are small, with a small difference in the strain energy (0.4%) and a maximum difference of 12.4% in the peak Green strain ratio. This work represents a step forward in the application of 3D printing technology for the manufacturing of AAA phantoms with anisotropic mechanical behavior.
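
    The abstract does not name its constitutive model; a widely used anisotropic form for arterial tissue is the Holzapfel-Gasser-Ogden (HGO) strain energy, sketched below for planar biaxial stretch under an incompressibility assumption. Parameter values are hypothetical and serve only to illustrate the kind of fitting target such a model provides.

    ```python
    import numpy as np

    def hgo_energy(lam1, lam2, c10, k1, k2, kappa, theta_deg):
        """Holzapfel-Gasser-Ogden strain energy (incompressible, planar
        biaxial stretch lam1, lam2); theta_deg is the fiber angle from
        the first principal direction. Not necessarily the paper's model."""
        lam3 = 1.0 / (lam1 * lam2)                # incompressibility
        i1 = lam1**2 + lam2**2 + lam3**2          # first invariant
        th = np.radians(theta_deg)
        i4 = (lam1 * np.cos(th))**2 + (lam2 * np.sin(th))**2  # fiber stretch^2
        e = kappa * (i1 - 3.0) + (1.0 - 3.0 * kappa) * (i4 - 1.0)
        return c10 * (i1 - 3.0) + k1 / (2.0 * k2) * (np.exp(k2 * e**2) - 1.0)

    # Hypothetical parameters fitted to biaxial data (c10, k1 in kPa)
    print(f"W = {hgo_energy(1.10, 1.05, 20.0, 100.0, 5.0, 0.2, 40.0):.2f} kPa")
    ```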

  8. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    Shen, S.-H.; Smidts, C.; Mosleh, A.

    1997-01-01

    This paper presents a model-based human error taxonomy and data collection. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for analysis of the actions of nuclear power plant operating crews during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem solving strategies'; and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics that extract generic characteristics of error-prone behaviors and error-prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures, with associated training, to minimize or preclude certain human errors. It also helps in the design of control rooms and in the assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  9. A hierarchical modeling methodology for the definition and selection of requirements

    Science.gov (United States)

    Dufresne, Stephane

    This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is applied to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work: (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in conceptual design, when the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow down the requirements from the stakeholder expectations to the system alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and system functions and measures of effectiveness are integrated into a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the

  10. A Study on Uncertainty Quantification of Reflood Model using CIRCE Methodology

    International Nuclear Information System (INIS)

    Jeon, Seongsu; Hong, Soonjoon; Oh, Deogyeon; Bang, Youngseok

    2013-01-01

    The CIRCE method is intended to quantify the uncertainties of the correlations of a code and may replace the expert judgment generally used. In this study, an uncertainty quantification of the reflood model was performed using the CIRCE methodology; the application process and the main results are briefly described in this paper. The application of CIRCE provided satisfactory results. This research is expected to be useful in improving the present audit calculation methodology, KINS-REM

  11. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks is to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
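
    To make the mechanics concrete, the sketch below mimics the SIMRAND loop under stated assumptions: each alternative network path is reduced to a set of task variables with expert-assessed distributions (triangular stand-ins here), samples are combined in a Monte Carlo loop, and the alternative with the highest expected cardinal utility is preferred. The path names, distributions and utility function are hypothetical, not taken from the paper.

    ```python
    import random

    # Hedged sketch of the SIMRAND selection loop (hypothetical data, not
    # the original SIMRAND I program). Each alternative network path is
    # reduced to a list of task-cost distributions assessed by experts;
    # triangular distributions stand in for the assessed CDFs.
    ALTERNATIVES = {
        "path_A": [(2.0, 3.0, 6.0), (1.0, 2.0, 4.0)],   # (low, mode, high)
        "path_B": [(1.5, 4.0, 5.0)],
    }

    def utility(cost):
        # Hypothetical risk-averse cardinal utility over the preference measure.
        return -cost ** 1.2

    def expected_utility(tasks, n_samples=10_000):
        total = 0.0
        for _ in range(n_samples):
            cost = sum(random.triangular(low, high, mode)
                       for (low, mode, high) in tasks)
            total += utility(cost)
        return total / n_samples

    best = max(ALTERNATIVES, key=lambda name: expected_utility(ALTERNATIVES[name]))
    print("preferred alternative:", best)
    ```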

  12. Development of a Methodology for VHTR Accident Consequence Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joeun; Kim, Jintae; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-05-15

    The substitution of the VHTR for burning fossil fuels conserves these hydrocarbon resources for other uses and eliminates the emission of greenhouse gases. In Korea, for these reasons, a plan to construct a VHTR for hydrogen production is in progress. In this study, the consequence analysis for off-site releases of radioactive materials during severe accidents has been performed using level 3 PRA technology. The off-site consequence analysis for a VHTR using the MACCS code has been performed. Since passive systems such as the RCCS (Reactor Cavity Cooling System) are equipped, the frequency of occurrence of accidents has been evaluated to be very low. In further study, an assessment of the characteristics of the VHTR safety systems and a more precise quantification of its accident scenarios are expected to yield a more reliable consequence analysis. The methodology shown in this study might contribute to enhancing the safety of the VHTR design, the results showing a far lower effect on the environment than that of LWRs.

  13. Towards a general object-oriented software development methodology

    Science.gov (United States)

    Seidewitz, ED; Stark, Mike

    1986-01-01

    Object diagrams were used to design a 5000-statement team training exercise and to design the entire dynamics simulator. The object diagrams are also being used to design another 50,000-statement Ada system and a personal computer based system that will be written in Modula-2. The design methodology evolves out of these experiences as well as the limitations of other methods that were studied. Object diagrams, abstraction analysis, and associated principles provide a unified framework which encompasses concepts from Yourdon, Booch, and Cherry. This general object-oriented approach handles high-level system design, possibly with concurrency, through object-oriented decomposition down to a completely functional level. How object-oriented concepts can be used in other phases of the software life-cycle, such as specification and testing, is being studied concurrently.

  14. Development of a methodology for life cycle building energy ratings

    International Nuclear Information System (INIS)

    Hernandez, Patxi; Kenny, Paul

    2011-01-01

    Traditionally, the majority of building energy use has been linked to operation (heating, cooling, lighting, etc.), and much attention has been directed to reducing this energy use through technical innovation and regulatory control, assessed through a wide range of rating methods. However, buildings generally employ an increasing amount of materials and systems to reduce the energy use in operation, and the energy embodied in these can constitute an important part of the building's life cycle energy use. For buildings with 'zero-energy' use in operation, the embodied energy is indeed the only life cycle energy use. This is not addressed by current building energy assessment and rating methods. This paper proposes a methodology to extend building energy assessment and rating methods to account for the embodied energy of building components and systems. The methodology is applied to the EU Building Energy Rating method and, as an illustration, as implemented for Irish domestic buildings. A case study dwelling is used to illustrate the importance of embodied energy to life cycle energy performance, particularly relevant when energy use in operation tends to zero. The use of the Net Energy Ratio as an indicator to select appropriate building improvement measures is also presented and discussed. - Highlights: → The definitions for 'zero energy buildings' and current building energy ratings are examined. → There is a need to integrate a life cycle perspective within building energy ratings. → A life cycle building energy rating method (LC-BER), including embodied energy, is presented. → Net Energy Ratio is proposed as an indicator to select building energy improvement options.
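
    In the life-cycle accounting the paper argues for, a rating sums embodied and annual operational energy; the schematic below uses assumed notation, not the paper's own symbols.

    ```latex
    % Life-cycle energy over an n-year service life (assumed notation):
    % as the annual operational terms E_{op,i} tend to zero, the embodied
    % term E_{emb} dominates the rating.
    E_{LC} = E_{emb} + \sum_{i=1}^{n} E_{op,i}
    ```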

  15. A Comparative Study of Three Methodologies for Modeling Dynamic Stall

    Science.gov (United States)

    Sankar, L.; Rhee, M.; Tung, C.; ZibiBailly, J.; LeBalleur, J. C.; Blaise, D.; Rouzaud, O.

    2002-01-01

    During the past two decades, there has been increased reliance on computational fluid dynamics methods for modeling rotors in high-speed forward flight. Computational methods are being developed for modeling the shock-induced loads on the advancing side, for first-principles-based modeling of the trailing wake evolution, and for retreating blade stall. The retreating blade dynamic stall problem has received particular attention, because the large variations in lift and pitching moments encountered in dynamic stall can lead to blade vibrations and pitch link fatigue. Restricting attention to aerodynamics, the numerical prediction of dynamic stall is still a complex and challenging CFD problem that, even in two dimensions at low speed, gathers the major difficulties of aerodynamics: the grid resolution requirements for the viscous phenomena at leading-edge bubbles or in mixing layers, the bias of the numerical viscosity, and the major difficulties of physical modeling, such as the turbulence and transition models, whose determinant influences, already present in static maximum-lift or stall computations, are emphasized by the dynamic aspect of the phenomena.

  16. A methodology for including wall roughness effects in k-ε low-Reynolds turbulence models

    International Nuclear Information System (INIS)

    Ambrosini, W.; Pucciarelli, A.; Borroni, I.

    2015-01-01

    Highlights: • A model for taking into account wall roughness in low-Reynolds k-ε models is presented. • The model is subjected to a first validation to show its potential in general applications. • The application of the model in predicting heat transfer to supercritical fluids is also discussed. - Abstract: A model accounting for wall roughness effects in k-ε low-Reynolds turbulence models is described in the present paper. In particular, the introduction into the transport equations of k and ε of additional source terms related to roughness, based on simple assumptions and dimensional relationships, is proposed. An objective of the present paper, in addition to obtaining more realistic predictions of wall friction, is the application of the proposed model to the study of heat transfer to supercritical fluids. A first validation of the model is reported. The model shows the capability of predicting, at least qualitatively, some of the most important trends observed when dealing with rough pipes in very different flow conditions. Qualitative comparisons with some DNS data available in the literature are also performed. Further analyses provided promising results concerning the ability of the model to reproduce the trend of the friction factor when varying the flow conditions, though improvements are necessary for achieving better quantitative accuracy. First applications of the model in simulating heat transfer to supercritical fluids are also described, showing the capability of the model to affect the predictions of these heat transfer phenomena, in particular in the vicinity of the pseudo-critical conditions. A more extended application of the model to relevant deteriorated heat transfer conditions will clarify the usefulness of this modelling methodology in improving predictions of these difficult phenomena. Whatever the possible success in this particular application that motivated its development, this approach suggests a general methodology for accounting
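
    The abstract does not reproduce the added terms, so the following only indicates where such roughness sources enter a low-Reynolds k-ε model; S_k, S_ε and their dependence on a roughness Reynolds number k_s^+ are assumed placeholders, not the paper's actual closures.

    ```latex
    % Schematic only: roughness-related sources S_k and S_eps are added to
    % the low-Reynolds k-epsilon transport equations; their dependence on
    % y^+ and a roughness Reynolds number k_s^+ is assumed here, not taken
    % from the paper.
    \frac{Dk}{Dt} = \nabla\cdot\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)\nabla k\right]
                  + P_k - \varepsilon + S_k\!\left(y^+, k_s^+\right)
    \frac{D\varepsilon}{Dt} = \nabla\cdot\left[\left(\nu + \frac{\nu_t}{\sigma_\varepsilon}\right)\nabla\varepsilon\right]
                  + C_{\varepsilon 1}\frac{\varepsilon}{k}P_k
                  - C_{\varepsilon 2}\frac{\varepsilon^2}{k} + S_\varepsilon\!\left(y^+, k_s^+\right)
    ```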

  17. Concepts and methodologies for modeling and simulation a tribute to Tuncer Oren

    CERN Document Server

    Yilmaz, Levent

    2015-01-01

    This comprehensive text/reference presents cutting-edge advances in the theory and methodology of modeling and simulation (M&S), and reveals how this work has been influenced by the fundamental contributions of Professor Tuncer Ören to this field. Exploring the synergies among the domains of M&S and systems engineering (SE), the book describes how M&S and SE can help to address the complex problems identified as "Grand Challenges" more effectively under a model-driven and simulation-directed systems engineering framework. Topics and features: examines frameworks for the development of advan

  18. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    International Nuclear Information System (INIS)

    Lahtinen, J.; Launiainen, T.; Heljanko, K.; Ropponen, J.

    2012-01-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)
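
    As a toy illustration of the safety-property checking such tools automate, the sketch below performs an explicit-state breadth-first search over a small hypothetical state machine; real function-block models are verified symbolically and at far larger scale.

    ```python
    from collections import deque

    # Toy explicit-state safety check: breadth-first search over a small
    # hypothetical state machine, reporting any reachable property
    # violation. Industrial model checkers do this symbolically.
    def successors(state):
        x, y = state
        yield ((x + 1) % 4, y)    # hypothetical transition 1
        yield (x, (y + 1) % 3)    # hypothetical transition 2

    def is_safe(state):
        return state != (3, 2)    # hypothetical forbidden state

    def check(initial):
        seen, frontier = {initial}, deque([initial])
        while frontier:
            state = frontier.popleft()
            if not is_safe(state):
                return f"safety property violated in state {state}"
            for nxt in successors(state):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return "safety property holds in every reachable state"

    print(check((0, 0)))
    ```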

  19. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    Energy Technology Data Exchange (ETDEWEB)

    Lahtinen, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Launiainen, T.; Heljanko, K.; Ropponen, J. [Aalto Univ., Espoo (Finland). Dept. of Information and Computer Science

    2012-07-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)

  20. Pathways for scale and discipline reconciliation: current socio-ecological modelling methodologies to explore and reconstitute human prehistoric dynamics

    OpenAIRE

    Saqalli , Mehdi; Baum , Tilman

    2016-01-01

    This communication elaborates a plea for a specific modelling methodology that sacrifices neither of two modelling principles: explanation at the micro level and correlation at the macro level. Three goals are assigned to modelling strategies: to describe, to understand and to predict. One tendency in historical and spatial modelling is to develop models at a micro level in order to describe and, by that way, understand the connection between local ecological contexts, acquired through loc...

  1. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  2. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  3. Development of a Malicious Insider Composite Vulnerability Assessment Methodology

    National Research Council Canada - National Science Library

    King, William H

    2006-01-01

    .... There are very few vulnerability and impact models capable of providing information owners with the ability to comprehensively assess the effectiveness of an organization's malicious insider mitigation strategies...

  4. The Development Methodology of the UML Electronic Guide

    Directory of Open Access Journals (Sweden)

    N.A. Magariu

    2006-09-01

    Full Text Available A technological model for the realization of an electronic guide to the UML language is considered. This model includes a description of the peculiarities of using a special graphic editor for constructing the UML diagrams, XML vocabularies (XMI, DocBook, SVG, XSLT) for representing the text and diagrams, and JavaScript code for constructing the tests.

  5. Development of a methodology to evaluate material accountability in pyroprocess

    Science.gov (United States)

    Woo, Seungmin

    This study investigates the effect of the non-uniform nuclide composition in spent fuel on material accountancy in the pyroprocess. High-fidelity depletion simulations are performed using the Monte Carlo code SERPENT in order to determine nuclide composition as a function of axial and radial position within fuel rods and assemblies, and of burnup. For improved accuracy, the simulations use short burnup steps (25 days or less), Xe-equilibrium treatment (to avoid oscillations over burnup steps), an axial moderator temperature distribution, and 30 axial meshes. Analytical solutions of the simplified depletion equations are built to understand the axial non-uniformity of nuclide composition in spent fuel. The cosine shape of the axial neutron flux distribution dominates the axial non-uniformity of the nuclide composition. Combined cross sections and time also generate axial non-uniformity, as the exponential term in the analytical solution consists of the neutron flux, the cross section and time. The axial concentration distribution for a nuclide with a small cross section becomes steeper than that for a nuclide with a large cross section, because the axial flux is weighted by the cross section in the exponential term of the analytical solution. Similarly, the non-uniformity becomes flatter with increasing burnup, because the time term in the exponential increases. Based on the developed numerical recipes and the decoupling of the results between the axial distributions and the predetermined representative radial distributions by matching the axial height, the axial and radial composition distributions for representative spent nuclear fuel assemblies, the Type-0, -1, and -2 assemblies after 1, 2, and 3 depletion cycles, are obtained. These data are appropriately modified to depict the processing of materials in the head-end steps of the pyroprocess, that is, chopping, voloxidation and granulation. The expectation and standard deviation of the Pu-to-244Cm-ratio by the single granule
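
    The axial behaviour described above follows from the textbook single-nuclide depletion solution; the form below uses assumed notation consistent with the abstract (cosine flux shape, cross section and time in the exponent), not equations quoted from the dissertation.

    ```latex
    % Simplified depletion of a single nuclide at axial position z
    % (assumed notation): the cosine flux shape enters the exponent
    % together with the cross section and time, producing the axial
    % non-uniformity discussed above; the profile flattens as the
    % time term grows with burnup.
    N(z,t) = N_0 \exp\!\bigl(-\sigma\,\phi(z)\,t\bigr),
    \qquad
    \phi(z) = \phi_0 \cos\!\left(\frac{\pi z}{H}\right)
    ```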

  6. Development of Six Sigma methodology for CNC milling process improvements

    Science.gov (United States)

    Ismail, M. N.; Rose, A. N. M.; Mohammed, N. Z.; Rashid, M. F. F. Ab

    2017-10-01

    Quality and productivity have been identified as playing an important role in any organization, especially in manufacturing sectors aiming to gain the profit that leads to the success of a company. This paper reports a work improvement project at Kolej Kemahiran Tinggi MARA Kuantan. It involves identifying problems in the production of the “Khufi” product and proposing an effective framework to improve the current situation. Based on observation and data collection on the work-in-progress (WIP) product, the major problem was identified as relating to the function of the product: the parts cannot be assembled properly because the product dimensions are out of specification. Six Sigma was used as the methodology to study and improve on the problems identified. Six Sigma is a highly statistical and data-driven approach to solving complex business problems. It uses a methodical five-phase approach - define, measure, analyze, improve and control (DMAIC) - to help understand the process and the variables that affect it so that the process can be optimized. Finally, the root cause of and solution to the “Khufi” production problem were identified and implemented, and the product then successfully met the fitting specification.
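
    As a concrete illustration of the Measure/Analyze steps for an out-of-specification dimension, the sketch below computes the standard process capability indices; the specification limits and sample measurements are hypothetical, not data from the project.

    ```python
    import statistics

    # Hedged illustration of a Measure-phase capability check; the spec
    # limits and measurements are hypothetical, not data from the paper.
    LSL, USL = 24.95, 25.05   # hypothetical dimension limits (mm)
    samples = [25.04, 25.06, 24.99, 25.07, 25.03, 25.05, 25.02, 25.06]

    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)

    cp = (USL - LSL) / (6 * sigma)                 # potential capability
    cpk = min(USL - mu, mu - LSL) / (3 * sigma)    # capability incl. centering
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
    # A Cpk well below 1.33 signals parts drifting out of specification,
    # the assembly problem described in the abstract.
    ```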

  7. Methodology for the development of teaching vocational guidance from physics classes in high school

    Directory of Open Access Journals (Sweden)

    Yamila García-Carrión

    2016-11-01

    Full Text Available One of the aims of pre-university education is to achieve the polytechnic and vocational training of students; this calls for vocational guidance toward the professions the country requires, giving priority to teaching, which is therefore a priority in the education system. The scientific research problem is expressed in the shortcomings revealed in the conception and development of the process of vocational orientation toward the physics teaching career from the classes of this subject in high school. The object of research is the process of teaching and learning physics in high school. The development of a methodology for vocational orientation toward the physics teaching career, from the classes of this subject in high school, based on an educational model that theoretically systematizes the research and pre-professional approaches, is proposed.

  8. Methodology for Analyzing and Developing Information Management Infrastructure to Support Telerehabilitation

    Directory of Open Access Journals (Sweden)

    Andi Saptono

    2009-09-01

    Full Text Available The proliferation of advanced technologies led researchers within the Rehabilitation Engineering Research Center on Telerehabilitation (RERC-TR) to devise an integrated infrastructure for clinical services using the University of Pittsburgh (PITT) model. This model describes five required characteristics for a telerehabilitation (TR) infrastructure: openness, extensibility, scalability, cost-effectiveness, and security. The infrastructure is to deliver clinical services over distance to improve access to health services for people living in underserved or remote areas. The methodological approach to design, develop, and employ this infrastructure is explained and detailed for the remote wheelchair prescription project, a research task within the RERC-TR. The availability of this specific clinical service and personnel outside of metropolitan areas is limited due to the lack of specialty expertise and access to resources. The infrastructure is used to deliver expertise in wheeled mobility and seating through teleconsultation to remote clinics, and has been successfully deployed to five rural clinics in Western Pennsylvania. Keywords: Telerehabilitation, Information Management, Infrastructure Development Methodology, Videoconferencing, Online Portal, Database

  9. Application of a methodology for the development and validation of reliable process control software

    International Nuclear Information System (INIS)

    Ramamoorthy, C.V.; Mok, Y.R.; Bastani, F.B.; Chin, G.

    1980-01-01

    The necessity of a good methodology for the development of reliable software, especially with respect to the final software validation and testing activities, is discussed. A formal specification development and validation methodology is proposed. This methodology has been applied to the development and validation of a pilot software, incorporating typical features of critical software for nuclear power plants safety protection. The main features of the approach include the use of a formal specification language and the independent development of two sets of specifications. 1 ref

  10. Further development of the coupling model

    International Nuclear Information System (INIS)

    Kreuser, A.; Stiller, J.C.; Peschke, J.

    2006-01-01

    Uncertainties arising from different sources have to be considered in the quantification of common cause failures (CCFs). At GRS, a CCF model (the coupling model) has been developed for the estimation of CCF probabilities. An essential feature of the coupling model is the consideration of these uncertainties by using Bayesian estimation methods. Experience from applying the coupling model to CCF event data over several years and analyzing the results in detail has led to improvements in the application of the model. In this paper the improved methodology of the coupling model is presented. Special emphasis is given to the description of the sources of uncertainty which are considered in the coupling model and to the mathematical methodology by which these uncertainties are represented and propagated through the model. In closing, topics of future improvement of the coupling model are discussed. (orig.)

  11. A refined methodology for modeling volume quantification performance in CT

    Science.gov (United States)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of a CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.
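
    The repeated-scan subtraction the study relies on can be sketched as follows: anatomy cancels in the difference of two registered scans, and the standard deviation is divided by the square root of 2 because the noise variances of the two acquisitions add. This is a generic illustration with random stand-in arrays, not the authors' code.

    ```python
    import numpy as np

    # Hedged sketch of the repeated-scan subtraction: the fixed background
    # texture cancels in the difference image, leaving only quantum noise;
    # std is divided by sqrt(2) because the two scans' variances add.
    # The arrays below are random stand-ins for real phantom scans.
    rng = np.random.default_rng(0)
    anatomy = rng.normal(0.0, 40.0, size=(256, 256))   # fixed background texture
    scan1 = anatomy + rng.normal(0.0, 10.0, size=(256, 256))
    scan2 = anatomy + rng.normal(0.0, 10.0, size=(256, 256))

    diff = scan1 - scan2                    # texture cancels, noise remains
    quantum_noise = diff.std() / np.sqrt(2.0)
    print(f"estimated quantum noise: {quantum_noise:.1f} HU")
    ```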

  12. A data-driven multi-model methodology with deep feature selection for short-term wind forecasting

    International Nuclear Information System (INIS)

    Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias; Zhang, Jie

    2017-01-01

    Highlights: • An ensemble model is developed to produce both deterministic and probabilistic wind forecasts. • A deep feature selection framework is developed to optimally determine the inputs to the forecasting methodology. • The developed ensemble methodology has improved the forecasting accuracy by up to 30%. - Abstract: With the growing wind penetration into the power system worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated to provide 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared to the single-algorithm models, the developed multi-model framework with the deep feature selection procedure has improved the forecasting accuracy by up to 30%.
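
    A minimal two-layer stacking ensemble in the spirit described above is sketched below; the feature matrix, target, train/blend split and algorithm choices are placeholders, and the paper's deep feature selection procedure is not reproduced.

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.svm import SVR

    # Hedged sketch of a two-layer ensemble: first-layer models make
    # individual forecasts; a second-layer blender combines them.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 6))              # e.g. selected lagged wind features
    y = 2.0 * X[:, 0] + rng.normal(size=500)   # hypothetical 1-h-ahead wind speed

    split = 350                                # train first layer, then blend
    first_layer = [RandomForestRegressor(n_estimators=50, random_state=0),
                   GradientBoostingRegressor(random_state=0),
                   SVR()]
    for model in first_layer:
        model.fit(X[:split], y[:split])

    # Stack the first-layer forecasts as features for the second-layer blender.
    Z = np.column_stack([model.predict(X[split:]) for model in first_layer])
    blender = LinearRegression().fit(Z, y[split:])
    print("blended forecasts:", blender.predict(Z[:3]))
    ```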

  13. Development of a risk monitoring system for nuclear power plants based on GO-FLOW methodology

    International Nuclear Information System (INIS)

    Yang, Jun; Yang, Ming; Yoshikawa, Hidekazu; Yang, Fangqing

    2014-01-01

    Highlights: • A method for developing Living PSA is proposed. • Living PSA is easy to update with online modification to system model file. • A risk monitoring system is designed and developed using the GO-FLOW. • The risk monitoring system is useful for plant daily operation risk management. - Abstract: The paper presents a risk monitoring system developed based on GO-FLOW methodology which is a success-oriented system reliability modeling technique for phased mission as well as time-dependent problems analysis. The risk monitoring system is designed to receive information on plant configuration changes either from equipment failures, operator interventions, or maintenance activities, then update the Living PSA model with online modification to the system GO-FLOW model file which contains all the functional modes of equipment represented by a proposed generalized GO-FLOW modeling structure, and display risk values graphically. The risk monitoring system can be used to assist safety engineers and plant operators in their maintenance management and daily operation risk management at NPPs

  14. Development of a risk monitoring system for nuclear power plants based on GO-FLOW methodology

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Jun, E-mail: youngjun51@hotmail.com [College of Nuclear Science and Technology, Harbin Engineering University, No. 145 Nantong Street, Nangang District, Harbin 150001 (China); Yang, Ming, E-mail: yangming@hrbeu.edu.cn [College of Nuclear Science and Technology, Harbin Engineering University, No. 145 Nantong Street, Nangang District, Harbin 150001 (China); Yoshikawa, Hidekazu, E-mail: yosikawa@kib.biglobe.ne.jp [Symbio Community Forum, Kyoto (Japan); Yang, Fangqing, E-mail: yfq613@163.com [China Nuclear Power Technology Research Institute, 518000 (China)

    2014-10-15

    Highlights: • A method for developing Living PSA is proposed. • Living PSA is easy to update with online modification to system model file. • A risk monitoring system is designed and developed using the GO-FLOW. • The risk monitoring system is useful for plant daily operation risk management. - Abstract: The paper presents a risk monitoring system developed based on GO-FLOW methodology which is a success-oriented system reliability modeling technique for phased mission as well as time-dependent problems analysis. The risk monitoring system is designed to receive information on plant configuration changes either from equipment failures, operator interventions, or maintenance activities, then update the Living PSA model with online modification to the system GO-FLOW model file which contains all the functional modes of equipment represented by a proposed generalized GO-FLOW modeling structure, and display risk values graphically. The risk monitoring system can be used to assist safety engineers and plant operators in their maintenance management and daily operation risk management at NPPs.

  15. Development of margin assessment methodology of decay heat removal function against external hazards. (2) Tornado PRA methodology

    International Nuclear Information System (INIS)

    Nishino, Hiroyuki; Kurisaka, Kenichi; Yamano, Hidemasa

    2014-01-01

    Probabilistic Risk Assessment (PRA) for external events has been recognized as an important safety assessment method since the TEPCO Fukushima Daiichi nuclear power station accident. PRA should be performed not only for earthquakes and tsunamis, which are especially key events in Japan; the PRA methodology should also be developed for other external hazards (e.g. tornadoes). In this study, such a methodology was developed for Sodium-cooled Fast Reactors, paying attention to the fact that the ambient air is their final heat sink for removing decay heat under accident conditions. First, a tornado hazard curve was estimated using data recorded in Japan. Second, important structures and components for decay heat removal were identified, and an event tree resulting in core damage was developed in terms of wind load and missiles (i.e. steel pipes, boards and cars) caused by a tornado. The main damage cause for important structures and components is the missiles; the tornado missiles that can reach components and structures placed at high elevations were identified, and the failure probabilities of the components and structures against the tornado missiles were calculated as a product of two probabilities: i.e., a probability for the missiles to enter the intake or outtake of the decay heat removal system, and a probability of failure caused by the missile impacts. Finally, the event tree was quantified. As a result, the core damage frequency was well below 10^-10/ry. (author)
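
    The quantification step reduces to products of a hazard frequency and conditional probabilities; the sketch below illustrates it with hypothetical numbers (none are values from the study).

    ```python
    # Hedged numerical sketch of the event-tree quantification: component
    # failure probability is the product of two probabilities, and the
    # sequence frequency follows from the tornado hazard frequency.
    # Every number below is a hypothetical placeholder.
    tornado_frequency = 1.0e-6        # /ry, read off the site hazard curve
    p_enter_intake = 1.0e-3           # missile enters intake/outtake
    p_fail_given_impact = 1.0e-2      # component fails given missile impact

    p_component_failure = p_enter_intake * p_fail_given_impact
    core_damage_frequency = tornado_frequency * p_component_failure
    print(f"core damage frequency ~ {core_damage_frequency:.1e} /ry")
    ```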

  16. Methodology for urban rail and construction technology research and development planning

    Science.gov (United States)

    Rubenstein, L. D.; Land, J. E.; Deshpande, G.; Dayman, B.; Warren, E. H.

    1980-01-01

    A series of transit system visits, organized by the American Public Transit Association (APTA), was conducted in which the system operators identified the most pressing development needs. These varied by property and were reformulated into a series of potential projects. To assist in the evaluation, a data base useful for estimating the present capital and operating costs of various transit system elements was generated from published data. An evaluation model was developed which considered the rate of deployment of the research and development project, potential benefits, development time and cost. An outline of an evaluation methodology that considered benefits other than capital and operating cost savings was also presented. During the course of the study, five candidate projects were selected for detailed investigation; (1) air comfort systems; (2) solid state auxiliary power conditioners; (3) door systems; (4) escalators; and (5) fare collection systems. Application of the evaluation model to these five examples showed the usefulness of modeling deployment rates and indicated a need to increase the scope of the model to quantitatively consider reliability impacts.

  17. Developing mathematical modelling competence

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Jensen, Tomas Højgaard

    2003-01-01

    In this paper we introduce the concept of mathematical modelling competence, by which we mean being able to carry through a whole mathematical modelling process in a certain context. Analysing the structure of this process, six sub-competences are identified. Mathematical modelling competence...... cannot be reduced to these six sub-competences, but they are necessary elements in the development of mathematical modelling competence. Experience from the development of a modelling course is used to illustrate how the different nature of the sub-competences can be used as a tool for finding...... the balance between different kinds of activities in a particular educational setting. Obstacles of social, cognitive and affective nature for the students' development of mathematical modelling competence are reported and discussed in relation to the sub-competences....

  18. Development of Behavioral Toxicology Methodology for Interactive Exposure Regimens.

    Science.gov (United States)

    1983-12-01

    Inhalation chambers by an air pump (Arthur H. Thomas, Model No. 1050-A10) through a gas purifier (Alltech Associates, Model 8128) with indicating... require the administration of an air puff stimulus to elicit the response. A second advantage of this test is that it measures both fore- and hindlimb... fixed-ratio schedules appear to be most sensitive and have the additional advantage of being easily shaped; further research on novel schedules like

  19. Development of Management Quality Assessment Methodology in the Public Sector: Problems and Contradictions

    Directory of Open Access Journals (Sweden)

    Olga Vladimirovna Kozhevina

    2015-09-01

    Full Text Available The development of a management quality assessment methodology in the public sector is a relevant scientific and practical problem of economic research. Utilizing the results of assessments based on the authors' methodology allows us to rate public sector organizations, to justify decisions on reorganization and privatization, and to monitor changes in the level of management quality in public sector organizations. The study determined the place of the quality of the control processes of a public sector organization in the system of “Quality of public administration — the effective operation of the public sector organization”; the contradictions associated with the assessment of management quality are revealed; the conditions for effective functioning of public sector organizations are proved; a mechanism of comprehensive assessment and an algorithm for constructing and evaluating control models of management quality are developed; and the criteria for assessing management quality in public sector organizations, including economic, budgetary, social and public, informational, innovation and institutional criteria, are empirically grounded. By utilizing the proposed algorithm, an assessment model of quality management in public sector organizations, including financial, economic, social, innovation, informational and institutional indicators, is developed. For each indicator of quality management, the coefficients of importance in the management quality assessment model, as well as comprehensive and partial evaluation indicators, are determined on the basis of expert evaluations. The main conclusion of the article is that management quality assessment for public sector organizations should be based not only on the indicators achieved in the dynamics and utilized for analyzing the effectiveness of management, but should also take into account the reference levels for the values of these

  20. Methodological aspects of journaling a dynamic adjusting entry model

    Directory of Open Access Journals (Sweden)

    Vlasta Kašparovská

    2011-01-01

    Full Text Available This paper expands the discussion of the importance and function of adjusting entries for loan receivables. Discussion of the cyclical development of adjusting entries, their negative impact on the business cycle and potential solutions has intensified during the financial crisis. These discussions are still ongoing and continue to be relevant to members of the professional public, banking regulators and representatives of international accounting institutions. The objective of this paper is to evaluate a method of journaling dynamic adjusting entries under current accounting law. It also expresses the authors’ opinions on the potential for consistently implementing basic accounting principles in journaling adjusting entries for loan receivables under a dynamic model.

  1. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  2. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  3. A modelling methodology for assessing the impact of climate variability and climatic change on hydroelectric generation

    International Nuclear Information System (INIS)

    Munoz, J.R.; Sailor, D.J.

    1998-01-01

    A new methodology relating basic climatic variables to hydroelectric generation was developed. The methodology can be implemented in large or small basins with any number of hydro plants. The method was applied to the Sacramento, Eel and Russian river basins in northern California, where more than 100 hydroelectric plants are located. The final model predicts the availability of hydroelectric generation for the entire basin, given present and recent-past climate conditions, with about 90% accuracy. The results can be used for water management purposes or for analyzing the effect of climate variability on hydrogeneration availability in the basin. A wide range of results can be obtained depending on the climate change scenario used. (Author)

  4. Integration of the Scrum methodology in mechatronic product development

    OpenAIRE

    Mauri Also, Joan Josep

    2015-01-01

    The purpose of this study was to determine whether it would be possible for a mechatronic product development team to use Scrum, an Agile development framework, with both the students of UVIC-UCC and the company ITQ GmbH behind the student project called Mi5. The Agile philosophy and methods have revolutionized the software development industry in the last decade, and it was therefore of interest to see whether this way of working would be applicable in other disciplines. Thus, the study focu...

  5. Software representation methodology for agile application development: An architectural approach

    Directory of Open Access Journals (Sweden)

    Alejandro Paolo Daza Corredor

    2016-06-01

    Full Text Available The generation of Web applications involves the execution of repetitive tasks: determining information structures, generating different types of components, and finally carrying out deployment and application-tuning tasks. Many applications of this type share components that coincide from application to application. Current trends in software engineering, such as MDE, MDA and MDD, aim to automate the generation of applications based on structuring a model and applying transformations to obtain the application. This document proposes an architectural foundation that facilitates the generation of these applications, relying on model-driven architecture without ignoring the existence and relevance of the existing trends mentioned in this summary.

  6. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • The software development methodology adopted for computer-based I&C systems of the PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction at Kalpakkam, India. Versa Module Europa bus-based Real Time Computer (RTC) systems are deployed for the instrumentation and control of the PFBR. The RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for the RTC systems, from the requirements capture phase through to the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer-aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis, and cause-and-effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.

  7. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • The software development methodology adopted for computer-based I&C systems of the PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction at Kalpakkam, India. Versa Module Europa bus-based Real Time Computer (RTC) systems are deployed for the instrumentation and control of the PFBR. The RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for the RTC systems, from the requirements capture phase through to the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer-aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis, and cause-and-effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored

  8. Development of Fuzzy Logic and Soft Computing Methodologies

    Science.gov (United States)

    Zadeh, L. A.; Yager, R.

    1999-01-01

    Our earlier research on computing with words (CW) has led to a new direction in fuzzy logic which points to a major enlargement of the role of natural languages in information processing, decision analysis and control. This direction is based on the methodology of computing with words and embodies a new theory which is referred to as the computational theory of perceptions (CTP). An important feature of this theory is that it can be added to any existing theory - especially to probability theory, decision analysis, and control - and enhance the ability of the theory to deal with real-world problems in which the decision-relevant information is a mixture of measurements and perceptions. The new direction is centered on an old concept - the concept of a perception - a concept which plays a central role in human cognition. The ability to reason with perceptions - perceptions of time, distance, force, direction, shape, intent, likelihood, truth and other attributes of physical and mental objects - underlies the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Everyday examples of such tasks are parking a car, driving in city traffic, cooking a meal, playing golf and summarizing a story. Perceptions are intrinsically imprecise. Imprecision of perceptions reflects the finite ability of sensory organs and, ultimately, the brain to resolve detail and store information. More concretely, perceptions are both fuzzy and granular, or, for short, f-granular. Perceptions are f-granular in the sense that: (a) the boundaries of perceived classes are not sharply defined; and (b) the elements of classes are grouped into granules, with a granule being a clump of elements drawn together by indistinguishability, similarity, proximity or functionality. F-granularity of perceptions may be viewed as a human way of achieving data compression. In large measure, scientific progress has been, and continues to be

  9. Establishing a methodology to develop complex sociotechnical systems

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2013-02-01

    Full Text Available Many modern management systems, such as military command and control, tend to be large and highly interconnected sociotechnical systems operating in a complex environment. Successful development, assessment and implementation of these systems...

  10. Learning challenges and sustainable development: A methodological perspective.

    Science.gov (United States)

    Seppänen, Laura

    2017-01-01

    Sustainable development requires learning, but the contents of learning are often complex and ambiguous. This requires new integrated approaches from research. It is argued that investigation of people's learning challenges in everyday work is beneficial for research on sustainable development. The aim of the paper is to describe a research method for examining learning challenges in promoting sustainable development. This method is illustrated with a case example from organic vegetable farming in Finland. The method, based on Activity Theory, combines historical analysis with qualitative analysis of need expressions in discourse data. The method, linking local and subjective need expressions with general historical analysis, is a promising way to overcome the gap between the individual and society, so much needed in research for sustainable development. Dialectically informed historical frameworks have practical value as tools in collaborative negotiations and participatory designs for sustainable development. The simultaneous use of systemic and subjective perspectives allows researchers to manage the complexity of practical work activities and to avoid too simplistic presumptions about sustainable development.

  11. Development of a flow structure interaction methodology applicable to a convertible car roof

    International Nuclear Information System (INIS)

    Knight, Jason J.

    2003-01-01

    The current research investigates the flow-induced deformation of a convertible roof of a vehicle using experimental and numerical methods. A computational methodology is developed that entails the coupling of a commercial Computational Fluid Dynamics (CFD) code with an in-house structural code. A model two-dimensional problem is first studied. The CFD code and a Source Panel Method (SPM) code are used to predict the pressure acting on the surface of a rigid roof of a scale model. Good agreement is found between the predicted pressure distribution and that obtained in a parallel wind-tunnel experimental programme. The validated computational modelling of the fluid flow is then used in a coupling strategy with a line-element structural model that incorporates the initial slackness of the flexible roof material. The computed flow-structure interaction yields stable solutions, the aerodynamically loaded flexible roof settling into static equilibrium. The effects of slackness and material properties on deformation and convergence are investigated using the coupled code. The three-dimensional problem is addressed by extending the two-dimensional structural solver to represent a surface by a matrix of line elements with constant tension along their length. This has been successfully coupled with the three-dimensional CFD flow-solution technique. Computed deformations show good agreement with the results of wind tunnel experiments for the well-prescribed geometry. In both two- and three-dimensional computations, the flow-structure interaction is found to yield a static deformation to within 1% difference in the displacement variable after three iterations between the fluid and structural codes. The same computational methodology is applied to a real-car application using a third-party structural solver. The methodology is shown to be robust even under conditions beyond those likely to be encountered. The full methodology could be used as a design tool. The present work
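
    The coupling strategy this record describes (alternating fluid and structural solutions until the loaded roof settles into static equilibrium) can be mimicked with a toy fixed-point iteration. The scalar load and stiffness relations below are invented placeholders standing in for the CFD and line-element structural solvers; only the 1% displacement criterion follows the record.

    ```python
    # Toy fluid-structure coupling loop in the spirit of the methodology above:
    # alternate a "fluid" pressure evaluation and a "structural" deflection solve
    # until the displacement changes by less than 1% between iterations.
    # Both models are invented scalar placeholders, not the thesis's CFD/FE codes.

    def fluid_pressure(displacement: float) -> float:
        """Aerodynamic load that relaxes as the roof bulges (hypothetical)."""
        return 500.0 * (1.0 - 0.5 * displacement)   # Pa

    def structural_deflection(pressure: float) -> float:
        """Linear-spring response of the flexible roof (hypothetical)."""
        return pressure / 2000.0                     # m

    d = 0.0
    for iteration in range(1, 51):
        d_new = structural_deflection(fluid_pressure(d))
        if abs(d_new - d) <= 0.01 * abs(d_new):      # 1% convergence criterion
            print(f"converged after {iteration} iterations: d = {d_new:.4f} m")
            break
        d = d_new
    ```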

  12. Development of performance assessment methodology for nuclear waste isolation in geologic media

    International Nuclear Information System (INIS)

    Bonano, E.J.; Chu, M.S.Y.; Cranwell, R.M.; Davis, P.A.

    1986-01-01

    The analysis of the processes involved in the burial of nuclear wastes can be performed only with reliable mathematical models and computer codes, as opposed to experiments, because the associated time scales are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium. These are ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the US Nuclear Regulatory Commission

  13. Development methodology for industrial diesel engines; Entwicklungsmethode fuer Industrie-Dieselmotoren

    Energy Technology Data Exchange (ETDEWEB)

    Bergmann, Dirk; Kech, Johannes [MTU Friedrichshafen GmbH (Germany)

    2011-11-15

    In order to remain cost-effective with relatively low production volumes in spite of the high requirements regarding emissions and durability, MTU uses a clearly structured development methodology with a close interlinking of technology and product development in the development of its large engines. For the new engine of the 4000 Series with cooled EGR, MTU applied this methodology in order to implement the emissions concept from the initial idea right through to the serial product. (orig.)

  14. Methodological Support to Develop Interoperable Applications for Pervasive Healthcare

    NARCIS (Netherlands)

    Cardoso de Moraes, J.L.

    2014-01-01

    The healthcare model currently being used in most countries will soon be inadequate, due to the increasing care costs of a growing population of elderly people, the rapid increase of chronic diseases, the growing demand for new treatments and technologies, and the relative decrease in the number of

  15. Water level management of lakes connected to regulated rivers: An integrated modeling and analytical methodology

    Science.gov (United States)

    Hu, Tengfei; Mao, Jingqiao; Pan, Shunqi; Dai, Lingquan; Zhang, Peipei; Xu, Diandian; Dai, Huichao

    2018-07-01

    Reservoir operations significantly alter the hydrological regime of the downstream river and river-connected lake, which has far-reaching impacts on the lake ecosystem. To facilitate the management of lakes connected to regulated rivers, the following information must be provided: (1) the response of lake water levels to reservoir operation schedules in the near future and (2) the importance of different rivers in terms of affecting the water levels in different lake regions of interest. We develop an integrated modeling and analytical methodology for the water level management of such lakes. The data-driven method is used to model the lake level as it has the potential of producing quick and accurate predictions. A new genetic algorithm-based synchronized search is proposed to optimize input variable time lags and data-driven model parameters simultaneously. The methodology also involves the orthogonal design and range analysis for extracting the influence of an individual river from that of all the rivers. The integrated methodology is applied to the second largest freshwater lake in China, the Dongting Lake. The results show that: (1) the antecedent lake levels are of crucial importance for the current lake level prediction; (2) the selected river discharge time lags reflect the spatial heterogeneity of the rivers' impacts on lake level changes; (3) the predicted lake levels are in very good agreement with the observed data (RMSE ≤ 0.091 m; R2 ≥ 0.9986). This study demonstrates the practical potential of the integrated methodology, which can provide both the lake level responses to future dam releases and the relative contributions of different rivers to lake level changes.
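
    A minimal sketch of the genetic-algorithm-based search for input time lags is given below. The lake level and river discharge series are synthetic, and only a single discharge lag is tuned with a bare-bones select-and-mutate loop; the study optimizes several lags and the data-driven model parameters simultaneously.

    ```python
    # Sketch of a GA-style synchronized search: choose the discharge time lag
    # whose least-squares model best predicts the lake level. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic daily series: a lake level that responds to river discharge
    # with a true lag of 3 days plus persistence.
    n = 500
    discharge = rng.normal(1000.0, 200.0, n)
    level = np.zeros(n)
    for t in range(3, n):
        level[t] = 0.6 * level[t - 1] + 0.001 * discharge[t - 3] + rng.normal(0, 0.02)

    def fitness(lag: int) -> float:
        """Negative MSE of a model level[t] ~ level[t-1], discharge[t-lag]."""
        t0 = max(lag, 1)
        X = np.column_stack([level[t0 - 1:n - 1],
                             discharge[t0 - lag:n - lag],
                             np.ones(n - t0)])
        y = level[t0:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return -np.mean((y - X @ coef) ** 2)

    # Tiny GA over integer lags 1..10: keep the best half, mutate by +/-1.
    pop = rng.integers(1, 11, size=8)
    for generation in range(20):
        scores = np.array([fitness(int(l)) for l in pop])
        parents = pop[np.argsort(scores)[-4:]]
        children = np.clip(parents + rng.integers(-1, 2, 4), 1, 10)
        pop = np.concatenate([parents, children])

    best = int(pop[np.argmax([fitness(int(l)) for l in pop])])
    print("selected discharge lag:", best)   # expected: 3
    ```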

  16. A novel methodology for interpreting air quality measurements from urban streets using CFD modelling

    Science.gov (United States)

    Solazzo, Efisio; Vardoulakis, Sotiris; Cai, Xiaoming

    2011-09-01

    In this study, a novel computational fluid dynamics (CFD) based methodology has been developed to interpret long-term averaged measurements of pollutant concentrations collected at roadside locations. The methodology is applied to the analysis of pollutant dispersion in Stratford Road (SR), a busy street canyon in Birmingham (UK), where a one-year sampling campaign was carried out between August 2005 and July 2006. Firstly, a number of dispersion scenarios are defined by combining sets of synoptic wind velocity and direction. Assuming neutral atmospheric stability, CFD simulations are conducted for all the scenarios, by applying the standard k-ɛ turbulence model, with the aim of creating a database of normalised pollutant concentrations at specific locations within the street. Modelled concentrations for all wind scenarios were compared with hourly observed NOx data. In order to compare with long-term averaged measurements, a weighted average of the CFD-calculated concentration fields was derived, with the weighting coefficients being proportional to the frequency of each scenario observed during the examined period (either monthly or annually). In summary the methodology consists of (i) identifying the main dispersion scenarios for the street based on wind speed and direction data, (ii) creating a database of CFD-calculated concentration fields for the identified dispersion scenarios, and (iii) combining the CFD results based on the frequency of occurrence of each dispersion scenario during the examined period. The methodology has been applied to calculate monthly and annually averaged benzene concentrations at several locations within the street canyon so that a direct comparison with observations could be made. The results of this study indicate that, within the simplifying assumption of non-buoyant flow, CFD modelling can aid understanding of long-term air quality measurements, and help assess the representativeness of monitoring locations for population
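
    Step (iii) of the methodology reduces to a frequency-weighted average of the per-scenario CFD fields. The sketch below shows that combination step; the scenario fields and hour counts are invented placeholders, not values from the Stratford Road campaign.

    ```python
    # Sketch of step (iii) above: combine CFD-normalised concentration fields
    # as a weighted average, with weights equal to the observed frequency of
    # each wind scenario over the averaging period. All numbers are invented.
    import numpy as np

    # Normalised concentrations at 3 receptor locations for 4 wind scenarios
    # (rows: scenario, columns: receptor), as precomputed by the CFD runs.
    c_star = np.array([[1.8, 0.9, 0.4],
                       [0.7, 1.2, 0.6],
                       [0.3, 0.5, 1.5],
                       [1.0, 1.0, 1.0]])

    # Hours each scenario was observed during the month under analysis.
    hours = np.array([210, 180, 250, 104])
    weights = hours / hours.sum()

    # Long-term averaged concentration at each receptor; a common next step is
    # rescaling by the period's emission rate before comparing with monitors.
    c_avg = weights @ c_star
    print(c_avg)
    ```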

  17. Modeling postpartum depression in rats: theoretic and methodological issues

    Science.gov (United States)

    Ming, LI; Shinn-Yi, CHOU

    2016-01-01

    The postpartum period is when a host of changes occur at molecular, cellular, physiological and behavioral levels to prepare female humans for the challenge of maternity. Alteration or prevention of these normal adaptations is thought to contribute to disruptions of emotion regulation, motivation and cognitive abilities that underlie postpartum mental disorders, such as postpartum depression. Despite the high incidence of this disorder, and the detrimental consequences for both mother and child, its etiology and related neurobiological mechanisms remain poorly understood, partially due to the lack of appropriate animal models. In recent decades, there have been a number of attempts to model postpartum depression disorder in rats. In the present review, we first describe clinical symptoms of postpartum depression and discuss known risk factors, including both genetic and environmental factors. Thereafter, we discuss various rat models that have been developed to capture various aspects of this disorder and knowledge gained from such attempts. In doing so, we focus on the theories behind each attempt and the methods used to achieve their goals. Finally, we point out several understudied areas in this field and make suggestions for future directions. PMID:27469254

  18. Modeling postpartum depression in rats: theoretic and methodological issues

    Directory of Open Access Journals (Sweden)

    Ming LI

    2018-06-01

    Full Text Available The postpartum period is when a host of changes occur at molecular, cellular, physiological and behavioral levels to prepare female humans for the challenge of maternity. Alteration or prevention of these normal adaptations is thought to contribute to disruptions of emotion regulation, motivation and cognitive abilities that underlie postpartum mental disorders, such as postpartum depression. Despite the high incidence of this disorder, and the detrimental consequences for both mother and child, its etiology and related neurobiological mechanisms remain poorly understood, partially due to the lack of appropriate animal models. In recent decades, there have been a number of attempts to model postpartum depression disorder in rats. In the present review, we first describe clinical symptoms of postpartum depression and discuss known risk factors, including both genetic and environmental factors. Thereafter, we discuss various rat models that have been developed to capture various aspects of this disorder and knowledge gained from such attempts. In doing so, we focus on the theories behind each attempt and the methods used to achieve their goals. Finally, we point out several understudied areas in this field and make suggestions for future directions.

  19. A geostatistical methodology to assess the accuracy of unsaturated flow models

    International Nuclear Information System (INIS)

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.

  20. A geostatistical methodology to assess the accuracy of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.
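
    The node-by-node comparison that both versions of this record describe is straightforward to express in code. The sketch below uses the 16 x 16 x 36 grid named in the abstract, but the water-content fields and the measurement-error bound are synthetic assumptions.

    ```python
    # Sketch of the node-by-node comparison described above: classify model
    # error at every grid node against the measurement error. The grid shape
    # matches the record (16 x 16 x 36); the fields and error bound are synthetic.
    import numpy as np

    rng = np.random.default_rng(1)
    shape = (16, 16, 36)

    theta_interp = rng.uniform(0.05, 0.30, shape)              # kriged water content
    theta_model = theta_interp + rng.normal(0.0, 0.02, shape)  # model prediction
    sigma_meas = 0.03                                          # assumed error bound

    error = theta_model - theta_interp
    within = np.abs(error) <= sigma_meas

    print(f"nodes within measurement error: {within.mean():.1%}")
    print(f"worst absolute error: {np.abs(error).max():.3f} (vol. water content)")
    ```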

  1. CALS and the Product State Model - Methodology and Supporting Schools and Paradigms

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1998-01-01

    This paper addresses the preliminary considerations in a research project, initiated February 1997, regarding Continuous Acquisition and Life-cycle Support (CALS), which is a part of the activities in CALS Center Denmark. The CALS concept is presented focusing on the Product State Model (PSM). The PSM incorporates relevant information about each stage of the production process. The paper will describe the research object and the model object, and discuss a part of the methodology for developing a Product State Model. The project is primarily technological; however, the organisational and human aspects the PSM will be developed upon will also be discussed, as will the parameters for evaluating the PSM. In establishing the theoretical body of knowledge with respect to CALS, an identification of schools and paradigms within the research area of applying information technology in a manufacturing environment...

  2. A Comparative Analysis of Two Software Development Methodologies: Rational Unified Process and Extreme Programming

    Directory of Open Access Journals (Sweden)

    Marcelo Rafael Borth

    2014-01-01

    Full Text Available Software development methodologies were created to meet the great market demand for innovation, productivity, quality and performance. With the use of a methodology, it is possible to reduce the cost, the risk, and the development time, and even increase the quality of the final product. This article compares two of these development methodologies: the Rational Unified Process and Extreme Programming. The comparison shows the main differences and similarities between the two approaches, and highlights and comments on some of their predominant features.

  3. Nonlinear Time Domain Seismic Soil-Structure Interaction (SSI) Deep Soil Site Methodology Development

    International Nuclear Information System (INIS)

    Spears, Robert Edward; Coleman, Justin Leigh

    2015-01-01

    Currently the Department of Energy (DOE) and the nuclear industry perform seismic soil-structure interaction (SSI) analysis using equivalent linear numerical analysis tools. For lower levels of ground motion, these tools should produce reasonable in-structure response values for evaluation of existing and new facilities. For larger levels of ground motion these tools likely overestimate the in-structure response (and therefore structural demand) since they do not consider geometric nonlinearities (such as gapping and sliding between the soil and structure) and are limited in their ability to model nonlinear soil behavior. The current equivalent linear SSI (SASSI) analysis approach either joins the soil and structure together in both tension and compression or releases the soil from the structure for both tension and compression. It also makes linear approximations for material nonlinearities and generalizes energy absorption with viscous damping. This produces the potential for inaccurately establishing where the structural concerns exist and/or inaccurately establishing the amplitude of the in-structure responses. Seismic hazard curves at nuclear facilities have continued to increase over the years as more information has been developed on seismic sources (i.e. faults), additional information has been gathered on seismic events, and additional research has been performed to determine local site effects. Seismic hazard curves are used to develop design basis earthquakes (DBEs) that are used to evaluate nuclear facility response. As the seismic hazard curves increase, the input ground motions (DBEs) used to numerically evaluate nuclear facility response increase, causing larger in-structure response. As ground motions increase, so does the importance of including nonlinear effects in numerical SSI models. To include material nonlinearity in the soil and geometric nonlinearity using contact (gapping and sliding), it is necessary to develop a nonlinear time domain methodology. This

  4. Methodology of inspections to carry out the nuclear outages model

    International Nuclear Information System (INIS)

    Aycart, J.; Mortenson, S.; Fourquet, J. M.

    2005-01-01

    Before the nuclear generation industry was deregulated in the United States, refueling and maintenance outages in nuclear power plants usually lasted around 100 days. After deregulation took effect, improved capability factors and performance became more important. As a result, it became essential to reduce the critical path time during the outage, which meant that activities that had typically been done in series had to be executed in parallel. The new outage model required the development of new tools and new processes. The 360-degree platform developed by GE Energy has made it possible to execute multiple activities in parallel. Various in-vessel visual inspection (IVVI) equipment can now simultaneously perform inspections on the reactor pressure vessel (RPV) components. The larger number of inspection equipment in turn results in a larger volume of data, with the risk of increasing the time needed for examining them and postponing the end of the analysis phase, which is critical for the outage. To decrease data analysis times, the IVVI Digitalisation process has been developed. With this process, the IVVI data are sent via a high-speed transmission line to a site outside the plant called the Center of Excellence (COE), where a team of Level III experts is in charge of analyzing them. The tools for the different product lines are being developed to interfere with each other as little as possible, thus minimizing the impact of the critical path on plant refueling activities. Methods are also being developed to increase the intervals between inspections. In accordance with the guidelines of the Boiling Water Reactor Vessel and Internals Project (BWRVIP), the intervals between inspections are typically longer if ultrasound volumetric inspections are performed than if the scope is limited to IVVI. (Author)

  5. Development of a methodology for accident causation research

    Science.gov (United States)

    1983-06-01

    The objective of this study was to fully develop and apply a methodology to study accident causation, which was outlined in a previous study. "Causal" factors are those pre-crash factors which are statistically related to the accident rate...

  6. A Methodology For Developing an Agent Systems Reference Architecture

    Science.gov (United States)

    2010-05-01

    ...agent frameworks, we create an abstraction noting similarities and differences. The differences are documented as points of variation. The result... situated in the physical environment. Addressing how conceptual components of an agent system... is beneficial to agent system architects, developers, and

  7. Advances in Artificial Neural Networks - Methodological Development and Application

    Science.gov (United States)

    Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other ne...

  8. Modeling of Throughput in Production Lines Using Response Surface Methodology and Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Federico Nuñez-Piña

    2018-01-01

    Full Text Available The problem of assigning buffers in a production line to obtain an optimum production rate is a combinatorial problem of type NP-Hard and is known as the Buffer Allocation Problem. It is of great importance for designers of production systems due to the costs involved in terms of space requirements. In this work, the relationship among the number of buffer slots, the number of workstations, and the production rate is studied. Response surface methodology and an artificial neural network were used to develop predictive models to find optimal throughput values. 360 production rate values for different numbers of buffer slots and workstations were used to obtain a fourth-order mathematical model and a four-hidden-layer artificial neural network. Both models have a good performance in predicting the throughput, although the artificial neural network model shows a better fit (R = 1.0000) against the response surface methodology (R = 0.9996). Moreover, the artificial neural network produces better predictions for data not utilized in the models' construction. Finally, this study can be used as a guide to forecast the maximum or near-maximum throughput of production lines taking into account the buffer size and the number of machines in the line.
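
    The response-surface half of this record amounts to fitting a polynomial in the two factors to the simulated production rates. A minimal sketch follows, with a synthetic throughput generator standing in for the 360 simulated values; the fourth-order feature expansion matches the order of the model reported above.

    ```python
    # Sketch of the response-surface step above: fit a polynomial model of
    # throughput as a function of buffer slots and workstations. The data
    # generator is synthetic, not the study's simulated production line.
    import numpy as np
    from itertools import combinations_with_replacement

    rng = np.random.default_rng(2)

    buffers = rng.integers(1, 21, 360).astype(float)
    stations = rng.integers(2, 11, 360).astype(float)
    # Hypothetical saturating throughput curve plus noise.
    throughput = 10 * (1 - np.exp(-0.2 * buffers)) / np.sqrt(stations) \
                 + rng.normal(0, 0.05, 360)

    def poly_features(x1, x2, degree=4):
        """All monomials x1^i * x2^j with i + j <= degree, plus the constant."""
        cols = [np.ones_like(x1)]
        for d in range(1, degree + 1):
            for combo in combinations_with_replacement((x1, x2), d):
                cols.append(np.prod(combo, axis=0))
        return np.column_stack(cols)

    X = poly_features(buffers, stations)
    coef, *_ = np.linalg.lstsq(X, throughput, rcond=None)
    pred = X @ coef
    ss_res = np.sum((throughput - pred) ** 2)
    ss_tot = np.sum((throughput - throughput.mean()) ** 2)
    print(f"R^2 of the fitted surface: {1 - ss_res / ss_tot:.4f}")
    ```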

  9. Efficient methodologies for system matrix modelling in iterative image reconstruction for rotating high-resolution PET

    Energy Technology Data Exchange (ETDEWEB)

    Ortuno, J E; Kontaxakis, G; Rubio, J L; Santos, A [Departamento de Ingenieria Electronica (DIE), Universidad Politecnica de Madrid, Ciudad Universitaria s/n, 28040 Madrid (Spain); Guerra, P [Networking Research Center on Bioengineering, Biomaterials and Nanomedicine (CIBER-BBN), Madrid (Spain)], E-mail: juanen@die.upm.es

    2010-04-07

    A fully 3D iterative image reconstruction algorithm has been developed for high-resolution PET cameras composed of pixelated scintillator crystal arrays and rotating planar detectors, based on the ordered subsets approach. The associated system matrix is precalculated with Monte Carlo methods that incorporate physical effects not included in analytical models, such as positron range effects and interaction of the incident gammas with the scintillator material. Custom Monte Carlo methodologies have been developed and optimized for modelling of system matrices for fast iterative image reconstruction adapted to specific scanner geometries, without redundant calculations. According to the methodology proposed here, only one-eighth of the voxels within two central transaxial slices need to be modelled in detail. The rest of the system matrix elements can be obtained with the aid of axial symmetries and redundancies, as well as in-plane symmetries within transaxial slices. Sparse matrix techniques for the non-zero system matrix elements are employed, allowing for fast execution of the image reconstruction process. This 3D image reconstruction scheme has been compared in terms of image quality to a 2D fast implementation of the OSEM algorithm combined with Fourier rebinning approaches. This work confirms the superiority of fully 3D OSEM in terms of spatial resolution, contrast recovery and noise reduction as compared to conventional 2D approaches based on rebinning schemes. At the same time it demonstrates that fully 3D methodologies can be efficiently applied to the image reconstruction problem for high-resolution rotational PET cameras by applying accurate pre-calculated system models and taking advantage of the system's symmetries.
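
    The reconstruction scheme rests on a precomputed sparse system matrix and ordered-subsets EM updates. The toy sketch below applies the standard OSEM update with a small random sparse matrix standing in for the Monte Carlo-derived, symmetry-compressed matrix of the record.

    ```python
    # Minimal ordered-subsets EM (OSEM) update using a sparse system matrix,
    # in the spirit of the reconstruction scheme above. The matrix here is a
    # tiny random stand-in; in the record it is precalculated by Monte Carlo
    # and compressed via the scanner's symmetries.
    import numpy as np
    from scipy import sparse

    rng = np.random.default_rng(3)

    n_bins, n_voxels, n_subsets = 120, 64, 4
    A = sparse.random(n_bins, n_voxels, density=0.1, random_state=3, format="csr")
    x_true = rng.uniform(0.5, 2.0, n_voxels)
    y = rng.poisson(A @ x_true)                      # noisy projection data

    x = np.ones(n_voxels)                            # uniform initial image
    for iteration in range(10):
        for s in range(n_subsets):
            rows = np.arange(s, n_bins, n_subsets)   # interleaved subset of bins
            A_s = A[rows]
            sens = np.asarray(A_s.sum(axis=0)).ravel()  # subset sensitivity image
            ratio = y[rows] / np.maximum(A_s @ x, 1e-12)
            x *= (A_s.T @ ratio) / np.maximum(sens, 1e-12)

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
    ```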

  10. Development of probabilistic risk assessment methodology against extreme snow for sodium-cooled fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    Yamano, Hidemasa, E-mail: yamano.hidemasa@jaea.go.jp; Nishino, Hiroyuki; Kurisaka, Kenichi

    2016-11-15

    Highlights: • Snow PRA methodology was developed. • Snow hazard category was defined as the combination of daily snowfall depth (speed) and snowfall duration. • Failure probability models of snow removal action, manual operation of the air cooler dampers and the access route were developed. • Snow PRA showed less than 10⁻⁶/reactor-year of core damage frequency. - Abstract: This paper describes snow probabilistic risk assessment (PRA) methodology development through external hazard and event sequence evaluations, mainly in terms of the decay heat removal (DHR) function of a sodium-cooled fast reactor (SFR). Using recent 50-year weather data at a typical Japanese SFR site, snow hazard categories were set for the combination of daily snowfall depth (snowfall speed) and snowfall duration, which can be calculated by dividing the snow depth by the snowfall speed. For each snow hazard category, the event sequence was evaluated by event trees which consist of several headings representing the loss of DHR. Snow removal action and manual operation of the air cooler dampers were introduced into the event trees as accident managements. An access route failure probability model was also developed for the quantification of the event tree. In this paper, the snow PRA showed less than 10⁻⁶/reactor-year of core damage frequency. The dominant snow hazard category was the combination of 1–2 m/day of snowfall speed and 0.5–0.75 day of snowfall duration. Importance and sensitivity analyses indicated a high risk contribution of the securing of the access routes.

  11. Development of probabilistic risk assessment methodology against extreme snow for sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Yamano, Hidemasa; Nishino, Hiroyuki; Kurisaka, Kenichi

    2016-01-01

    Highlights: • Snow PRA methodology was developed. • Snow hazard category was defined as the combination of daily snowfall depth (speed) and snowfall duration. • Failure probability models of snow removal action, manual operation of the air cooler dampers and the access route were developed. • Snow PRA showed less than 10⁻⁶/reactor-year of core damage frequency. - Abstract: This paper describes snow probabilistic risk assessment (PRA) methodology development through external hazard and event sequence evaluations, mainly in terms of the decay heat removal (DHR) function of a sodium-cooled fast reactor (SFR). Using recent 50-year weather data at a typical Japanese SFR site, snow hazard categories were set for the combination of daily snowfall depth (snowfall speed) and snowfall duration, which can be calculated by dividing the snow depth by the snowfall speed. For each snow hazard category, the event sequence was evaluated by event trees which consist of several headings representing the loss of DHR. Snow removal action and manual operation of the air cooler dampers were introduced into the event trees as accident managements. An access route failure probability model was also developed for the quantification of the event tree. In this paper, the snow PRA showed less than 10⁻⁶/reactor-year of core damage frequency. The dominant snow hazard category was the combination of 1–2 m/day of snowfall speed and 0.5–0.75 day of snowfall duration. Importance and sensitivity analyses indicated a high risk contribution of the securing of the access routes.
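
    The hazard categorisation used in both versions of this record (binning events by snowfall speed and duration, with duration = depth / speed) can be illustrated as below. The 50-year event list is synthetic, and the bin edges are chosen to echo, not reproduce, the categories in the abstract.

    ```python
    # Sketch of the snow hazard categorisation above: bin each snowfall event
    # by (snowfall speed, duration) and estimate an annual frequency for every
    # category from the weather record. The event list here is synthetic.
    import numpy as np

    rng = np.random.default_rng(4)
    years = 50

    # Synthetic events: snowfall speed (m/day) and snow depth (m) per event.
    speed = rng.gamma(2.0, 0.4, 300)
    depth = rng.gamma(2.0, 0.3, 300)
    duration = depth / speed                     # days, as defined in the record

    speed_bins = np.array([0.0, 1.0, 2.0, 3.0, np.inf])
    dur_bins = np.array([0.0, 0.25, 0.5, 0.75, 1.0, np.inf])

    counts, _, _ = np.histogram2d(speed, duration, bins=[speed_bins, dur_bins])
    frequency = counts / years                   # events per year per category

    i, j = np.unravel_index(np.argmax(frequency), frequency.shape)
    print(f"most frequent category: {speed_bins[i]}-{speed_bins[i+1]} m/day, "
          f"{dur_bins[j]}-{dur_bins[j+1]} day: {frequency[i, j]:.2f} /yr")
    ```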

  12. A novel methodology improves reservoir characterization models using geologic fuzzy variables

    Energy Technology Data Exchange (ETDEWEB)

    Soto B, Rodolfo [DIGITOIL, Maracaibo (Venezuela); Soto O, David A. [Texas A and M University, College Station, TX (United States)

    2004-07-01

    One of the research projects carried out in the Cusiana field to explain its rapid decline during the last years was aimed at obtaining better permeability models. The reservoir of this field has a complex layered system that is not easy to model using conventional methods. The new technique included the development of porosity and permeability maps from cored wells following the same trend as the sand depositions for each facies or layer, according to the sedimentary facies and depositional system models. Then, we used fuzzy logic to reproduce those maps in three dimensions as geologic fuzzy variables. After multivariate statistical and factor analyses, we found independence and a good correlation coefficient between the geologic fuzzy variables and core permeability and porosity. This means the geologic fuzzy variables could explain the fabric, grain size and pore geometry of the reservoir rock throughout the field. Finally, we developed a neural network permeability model using porosity, gamma ray and the geologic fuzzy variable as input variables. This model has a cross-correlation coefficient of 0.873 and an average absolute error of 33%, compared with the previous model's correlation coefficient of 0.511 and absolute error greater than 250%. We tested different methodologies, and this new one proved to be a far more promising way to obtain better permeability models. The models have had a high impact on the explanation of well performance and workovers, and on reservoir simulation models. (author)
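
    As an illustration of what a "geologic fuzzy variable" can look like, the sketch below builds soft grain-size class memberships with triangular functions and collapses them into one continuous input for a permeability regression. The breakpoints and class scores are invented, not Cusiana values.

    ```python
    # Sketch of encoding a geologic property as a fuzzy variable, in the spirit
    # of the methodology above: soft membership in "fine" / "medium" / "coarse"
    # grain-size classes replaces a hard facies code. Breakpoints are invented.
    import numpy as np

    def triangular(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

    grain_size_mm = np.array([0.05, 0.15, 0.30, 0.60, 1.20])

    memberships = np.column_stack([
        triangular(grain_size_mm, -0.1, 0.0, 0.25),   # fine
        triangular(grain_size_mm, 0.10, 0.35, 0.70),  # medium
        triangular(grain_size_mm, 0.40, 1.00, 2.00),  # coarse
    ])

    # A single fuzzy variable: centroid-style weighted score of the classes,
    # which could then feed a permeability regression together with porosity
    # and gamma ray, as in the record.
    class_scores = np.array([0.0, 0.5, 1.0])
    fuzzy_var = memberships @ class_scores / np.maximum(memberships.sum(axis=1), 1e-12)
    print(fuzzy_var)
    ```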

  13. Development of a Graphical Tool to integrate the Prometheus AEOlus methodology and Jason Platform

    Directory of Open Access Journals (Sweden)

    Rafhael CUNHA

    2017-07-01

    Full Text Available Software Engineering (SE) is an area that intends to build high-quality software in a systematic way. However, traditional software engineering techniques and methods do not support the demand for developing Multiagent Systems (MAS). Therefore a new subarea has been studied, called Agent Oriented Software Engineering (AOSE). The AOSE area proposes solutions to issues related to the development of agent oriented systems. There is still no standardization in this subarea, resulting in several methodologies. Another issue in this subarea is that very few tools are able to automatically generate code. In this work we propose a tool to support the Prometheus AEOlus Methodology, because it provides modelling artifacts for all MAS dimensions: agents, environment, interaction, and organization. The tool supports all Prometheus AEOlus artifacts and can automatically generate code for the agent and interaction dimensions in the AgentSpeak Language, which is the language used by the Jason Platform. We have validated the proposed tool, and a case study is presented.

  14. New methodologies for calculation of flight parameters on reduced scale wings models in wind tunnel =

    Science.gov (United States)

    Ben Mosbah, Abdallah

    In order to improve the quality of wind tunnel tests, and the tools used to perform aerodynamic tests on aircraft wings in the wind tunnel, new methodologies were developed and tested on rigid and flexible wing models. The flexible wing concept consists of replacing a portion (lower and/or upper) of the skin with a flexible portion whose shape can be changed using an actuation system installed inside the wing. The main purpose of this concept is to improve the aerodynamic performance of the aircraft and, especially, to reduce its fuel consumption. Numerical and experimental analyses were conducted to develop and test the methodologies proposed in this thesis. To control the flow inside the test section of the Price-Paidoussis wind tunnel of LARCASE, numerical and experimental analyses were performed. Computational fluid dynamics calculations were made in order to obtain a database used to develop a new hybrid methodology for wind tunnel calibration. This approach allows controlling the flow in the test section of the Price-Paidoussis wind tunnel. For the fast determination of aerodynamic parameters, new hybrid methodologies were proposed. These methodologies were used to control flight parameters through the calculation of the drag, lift and pitching moment coefficients and of the pressure distribution around an airfoil. These aerodynamic coefficients were calculated from known airflow conditions such as the angle of attack and the Mach and Reynolds numbers. In order to modify the shape of the wing skin, electric actuators were installed inside the wing to obtain the desired shape. These deformations provide optimal profiles for different flight conditions in order to reduce fuel consumption. A controller based on neural networks was implemented to obtain the desired actuator displacements. A metaheuristic algorithm was used in hybridization with neural networks and support vector machine approaches, and their

  15. Methodology development for the radioecological monitoring effectiveness estimation

    International Nuclear Information System (INIS)

    Gusev, A.E.; Kozlov, A.A.; Lavrov, K.N.; Sobolev, I.A.; Tsyplyakova, T.P.

    1997-01-01

    A general model for the estimation of programs assuring radiation and ecological protection of the public is described. The complex of purposes and criteria characterizing, and giving an opportunity to estimate, the effectiveness of the composition of environment protection programs is selected. An algorithm for selecting the optimal management decision, from the viewpoint of the cost of work connected with improving population protection, is considered. The position of radiation-ecological monitoring in the general problem of environmental pollution is determined. It is shown that the effectiveness of monitoring organization is closely connected with the radiation and ecological protection of the population

  16. Organizational Culture and Scale Development: Methodological Challenges and Future Directions

    Directory of Open Access Journals (Sweden)

    Bavik Ali

    2014-12-01

    Full Text Available Defining and measuring organizational culture (OC) is of paramount importance to organizations because a strong culture could potentially increase service quality and yield sustainable competitive advantages. However, such a process can be challenging for managers because the scope of OC has been defined differently across disciplines and industries, which has led to the development of various scales for measuring OC. In addition, previously developed OC scales may not be fully applicable in the hospitality and tourism context. Therefore, by highlighting the key factors affecting the business environment and the unique characteristics of the hospitality industry, this paper aims to align the scope of OC closely with the industry and to put forth the need for a new OC scale that accurately responds to the context of the hospitality industry.

  17. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The software processes that have evolved still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are traceable to initial requirements. This paper will give an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how they have provided value to the project.

  18. Additional methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Marshall, J.A.; Shore, R.W.; Chay, S.C.; Mazumdar, M.

    1977-03-01

    The project described is motivated by the desire for methods to quantify uncertainties and to identify conservatisms in nuclear power plant safety analysis. The report examines statistical methods useful for assessing the probability distribution of output response from complex nuclear computer codes, considers sensitivity analysis and several other topics, and also sets the path for using the developed methods for realistic assessment of the design basis accident

  19. Regular website transformation to mobile friendly methodology development

    OpenAIRE

    Miščenkov, Ilja

    2017-01-01

    Nowadays, the rate of technology improvement grows faster than ever, which results in increased mobile device usage. Internet users often choose to browse their favorite websites on computers as well as mobile devices; however, not every website is suited to be displayed on both types of device. An example is the website of Vilnius University's Faculty of Mathematics and Informatics. The objective of this work is therefore to develop a step-by-step procedure that can be used to turn a regular websi...

  20. Development and new applications of quantum chemical simulation methodology

    International Nuclear Information System (INIS)

    Weiss, A. K. H.

    2012-01-01

    The Division of Theoretical Chemistry at the University of Innsbruck is focused on the study of chemical compounds in aqueous solution, in terms of mainly hybrid quantum mechanical / molecular mechanical molecular dynamics simulations (QM/MM MD). Besides the standard means of data analysis employed for such simulations, this study presents several advanced and capable algorithms for the description of structural and dynamic properties of the simulated species and its hydration. The first part of this thesis further presents selected exemplary simulations, in particular a comparative study of Formamide and N-methylformamide, Guanidinium, and Urea. An included review article further summarizes the major advances of these studies. The computer programs developed in the course of this thesis are by now well established in the research field. The second part of this study presents the theory and a development guide for a quantum chemical program, QuMuLuS, that is by now used as a QM program for recent QM/MM simulations at the division. In its course, this part presents newly developed algorithms for electron integral evaluation and point charge embedding. This program is validated in terms of benchmark computations. The associated theory is presented on a detailed level, to serve as a source for contemporary and future studies in the division. In the third and final part, further investigations of related topics are addressed. This covers additional schemes of molecular simulation analysis, new software, as well as a mathematical investigation of a non-standard two-electron integral. (author)

  1. Combinations of options: Methodology for impact analysis. Development plan 1993

    International Nuclear Information System (INIS)

    1992-01-01

    The orientations favored by Hydro-Quebec in terms of electricity supply and demand are based on a few key selection criteria. These criteria, as described in its development plan, pertain to economic benefit for the utility and its customers, compatibility with sustainable development, minimization of costs to customers, preservation of the utility's financial health, generation of economic spinoffs, and ease of adaptation. Impacts are calculated to illustrate the selection criteria. The main methods, assumptions, and components used in evaluating the various impacts are described. The discounted overall cost for Hydro-Quebec and all of its customers, means of meeting electricity requirements, and the economic benefit for Hydro-Quebec of the various market development options are discussed. The indicators chosen for environmental impact assessment are set forth and the method used to calculate long-term supply costs is presented, along with the methods for calculating economic spinoffs. Finally, the concepts of energy mix and energy self-sufficiency are outlined. 1 tab

  2. Using digital models for evaluation of effective solar radiance. Development of methodology and practice application; Empleo de modelos digitales del terreno para la evaluacion de la radiacionsolar effectiva. Desarrollo de una metodologia y aplicacion practica

    Energy Technology Data Exchange (ETDEWEB)

    Izco, E.; De Blas, M.; Torres, J. L.; Garcia, R.

    2004-07-01

    This communication describes the use of advanced tools for determining the effective solar radiance, and its possible passive and active (thermal or photovoltaic) use, in various areas of buildings or urban zones, based on the cartographic information of a Digital Elevation Model and the ECOTEC v5.20 software. An hourly treatment of illuminated and shaded zones has been carried out for several days of the year. A case study showed that the ECOTEC v5.20 software can work with the Digital Elevation Model of the Public University of Navarra, and illuminated and shaded zones were analyzed visually and quantitatively for two days of the year, the summer and winter solstices. (Author)

  3. Methodology Development for Assessment of Spaceport Technology Returns and Risks

    Science.gov (United States)

    Joglekar, Prafulla; Zapata, Edgar

    2001-01-01

    As part of Kennedy Space Center's (KSC's) challenge to open the space frontier, new spaceport technologies must be developed, matured and successfully transitioned to operational systems. R&D investment decisions can be considered from multiple perspectives. Near-, mid- and far-term technology horizons must be understood. Because a multitude of technology investment opportunities are available, we must identify choices that promise the greatest likelihood of significant lifecycle returns. At the same time, the costs and risks of any choice must be well understood and balanced against its potential returns. The problem is not one of simply rank-ordering projects in terms of their desirability. KSC wants to determine a portfolio of projects that simultaneously satisfies multiple goals, such as getting the biggest bang for the buck, supporting projects that may be too risky for private funding, staying within annual budget cycles without foregoing the requirements of a long-term technology vision, and ensuring the development of a diversity of technologies that support the variety of operational functions involved in space transportation. This work aims to assist in the development of methods and techniques that support strategic technology investment decisions and ease the process of determining an optimal portfolio of spaceport R&D investments. Available literature on risks and returns to R&D is reviewed and the most useful pieces are brought to the attention of the Spaceport Technology Development Office (STDO). KSC's current project management procedures are reviewed. It is found that the "one size fits all" nature of KSC's existing procedures and project selection criteria is not conducive to prudent decision-making. Directions for improving KSC's procedures and criteria are outlined. With the help of a contractor, STDO is currently developing a tool, named Change Management Analysis Tool (CMAT)/Portfolio Analysis Tool (PAT), to assist KSC's R&D portfolio determination.

  4. Development of a methodology for post closure radiological risk analysis of underground waste repositories. Illustrative assessment of the Harwell site

    International Nuclear Information System (INIS)

    Gralewski, Z.A.; Kane, P.; Nicholls, D.B.

    1987-06-01

    A probabilistic risk analysis (PRA) is demonstrated for a number of groundwater-mediated release scenarios at the Harwell site for a hypothetical repository at a depth of about 150 metres. This is the second stage of development of an overall risk assessment methodology. A procedure for carrying out multi-scenario assessments using available PRA models is presented, and a general methodology for combining risk contributions is outlined. Appropriate levels of model complexity in PRA are discussed. Modelling requirements for the treatment of multiple simultaneous pathways and of site evolution are outlined. Further development of PRA systems is required to increase the realism of both the models and their mode of application, and hence to improve estimates of risk. (author)
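
    The "general methodology for combining risk contributions" mentioned here is commonly formulated as a frequency-weighted sum over scenarios. A minimal sketch under invented scenario probabilities and consequences:

    ```python
    # Sketch of combining risk contributions from multiple release scenarios,
    # as outlined in the record: total risk is the sum over scenarios of the
    # scenario probability times its conditional consequence. Numbers invented.
    scenarios = {
        # name: (annual probability of scenario, conditional dose consequence, Sv)
        "groundwater_normal_evolution": (1e-2, 3e-6),
        "borehole_short_circuit":       (1e-4, 5e-4),
        "fault_reactivation":           (1e-5, 2e-3),
    }

    total_risk = sum(p * c for p, c in scenarios.values())
    for name, (p, c) in scenarios.items():
        print(f"{name}: contribution {p * c:.2e} Sv/yr "
              f"({100 * p * c / total_risk:.1f}% of total)")
    print(f"total risk: {total_risk:.2e} Sv/yr")
    ```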

  5. Contracting Selection for the Development of the Range Rule Risk Methodology

    National Research Council Canada - National Science Library

    1997-01-01

    ...-Effectiveness Risk Tool and contractor selection for the development of the Range Rule Risk Methodology. The audit objective was to determine whether the Government appropriately used the Ordnance and Explosives Cost-Effectiveness Risk Tool...

  6. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    International Nuclear Information System (INIS)

    Coleman, Justin Leigh

    2016-01-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  7. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin Leigh [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  8. An architecture and methodology for the design and development of Technical Information Systems

    NARCIS (Netherlands)

    Capobianchi, R.; Mautref, M.; van Keulen, Maurice; Balsters, H.

    In order to meet demands in the context of Technical Information Systems (TIS) pertaining to reliability, extensibility, maintainability, etc., we have developed an architectural framework with accompanying methodological guidelines for designing such systems. With the framework, we aim at complex

  9. A Software Planning and Development Methodology with Resource Allocation Capability

    Science.gov (United States)

    1986-01-01

  10. Development of multitracer methodology for the characterization of petroleum reservoirs

    International Nuclear Information System (INIS)

    Pereira, E.H.T.; Moreira, R.M.; Ferreira Pinto, A.M.; Floresta, D.L.

    2004-01-01

    Amongst other candidate tracers, the use of potassium thiocyanate labelled with ³⁵S (K³⁵SCN) has been investigated. This species is highly water soluble, temperature resistant, and is not adsorbed on the extended solid surfaces of the formation pores. Being a beta emitter, it minimizes radiological protection problems but requires sampling for activity measurement in the laboratory. The paper describes the extraction of the elemental radiosulfur from the KCl lattice and the development of an optimized route to synthesize the thiocyanate that avoids lengthy and numerous intermediate reactions and separations. Laboratory and ongoing field tests designed to validate the tracer are also described. (author)

  11. Development cooperation as methodology for teaching social responsibility to engineers

    Science.gov (United States)

    Lappalainen, Pia

    2011-12-01

    The role of engineering in promoting global well-being has become accentuated, turning the engineering curriculum into a means of distributing well-being equally. The gradually strengthening calls for humanitarian engineering have resulted in the incorporation of social responsibility themes into the university curriculum. Cooperation, communication, teamwork, intercultural cooperation, sustainability, and social and global responsibility represent the socio-cultural dimensions that are becoming increasingly important as globalisation intensifies the demands for socially and globally adept engineering communities. This article describes an experiment, the Development Cooperation Project, which was conducted at Aalto University in Finland to integrate social responsibility themes into higher engineering education.

  12. Modeling and Analysis of The Pressure Die Casting Using Response Surface Methodology

    International Nuclear Information System (INIS)

    Kittur, Jayant K.; Herwadkar, T. V.; Parappagoudar, M. B.

    2010-01-01

    Pressure die casting is successfully used in the manufacture of aluminum alloy components for the automobile and many other industries. Die casting is a process involving many process parameters that have a complex relationship with the quality of the cast product. Though various process parameters influence the quality of the die-cast component, the major influence comes from the die casting machine parameters and their proper settings. In the present work, non-linear regression models have been developed for making predictions and analyzing the effect of die casting machine parameters on the performance characteristics of the die casting process. Design of Experiments (DOE) with Response Surface Methodology (RSM) has been used to analyze the effect of input parameters and their interactions on the response, and further to develop non-linear input-output relationships. Die casting machine parameters, namely fast shot velocity, slow shot to fast shot changeover point, intensification pressure and holding time, have been considered as the input variables. The quality characteristics of the cast product were determined by porosity, hardness and surface roughness (outputs/responses). Design of experiments has been used to plan the experiments and analyze the impact of the variables on the quality of the casting. On the other hand, Response Surface Methodology (Central Composite Design) is utilized to develop non-linear input-output relationships (regression models). The developed regression models have been tested for their statistical adequacy through the ANOVA test. The practical usefulness of these models has been tested with some test cases. These models can be used to make predictions about different quality characteristics, for a known set of die casting machine parameters, without conducting the experiments.
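
    The Central Composite Design named in this record has a standard construction: two-level factorial points, axial points at distance alpha, and centre replicates. The sketch below generates such a design in coded units for four factors; the rotatable alpha and the six centre points are conventional choices, not necessarily the paper's.

    ```python
    # Sketch of generating a central composite design in coded units for the
    # four machine parameters above (fast shot velocity, changeover point,
    # intensification pressure, holding time). Rotatable alpha = (2**k)**0.25;
    # the number of centre points is a typical choice, assumed here.
    import itertools
    import numpy as np

    k = 4
    alpha = (2 ** k) ** 0.25          # = 2.0 for k = 4
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))  # 16 runs
    axial = np.vstack([v * alpha * np.eye(k)[i] for i in range(k) for v in (-1, 1)])
    centre = np.zeros((6, k))          # 6 centre-point replicates (assumed)

    design = np.vstack([factorial, axial, centre])
    print(design.shape)                # (30, 4): 16 factorial + 8 axial + 6 centre
    ```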

  13. Methodology development to support NPR strategic planning. Final report

    International Nuclear Information System (INIS)

    1996-01-01

    This report covers the work performed in support of the Office of New Production Reactors during the 9 month period from January through September 1990. Because of the rapid pace of program activities during this time period, the emphasis on work performed shifted from the strategic planning emphasis toward supporting initiatives requiring a more immediate consideration and response. Consequently, the work performed has concentrated on researching and helping identify and resolve those issues considered to be of most immediate concern. Even though they are strongly interrelated, they can be separated into two broad categories as follows: The first category encompasses program internal concerns. Included are issues associated with the current demand for accelerating staff growth, satisfying the immediate need for appropriate skill and experience levels, team building efforts necessary to assure the development of an effective operating organization, ability of people and organizations to satisfactorily understand and execute their assigned roles and responsibilities, and the general facilitation of inter/intra organization communications and working relationships. The second category encompasses program execution concerns. These include those efforts required in development of realistic execution plans and implementation of appropriate control mechanisms which provide for effective forecasting, planning, managing, and controlling of on-going (or soon to be) program substantive activities according to the master integrated schedule and budget

  14. Development of a plastic fracture methodology for nuclear systems

    International Nuclear Information System (INIS)

    Marston, T.U.; Jones, R.L.; Kanninen, M.F.; Mowbray, D.F.

    1981-01-01

    This paper describes research conducted to develop a fundamental basis for flaw tolerance assessment procedures suitable for components exhibiting ductile behavior. The research was composed of an integrated combination of stable crack growth experiments and elastic-plastic analyses. A number of candidate fracture criteria were assembled and investigated to determine the proper basis for plastic fracture mechanics assessments. The results demonstrate that many different fracture criteria can be used as the basis of a resistance curve approach to predicting stable crack growth and fracture instability. While all have some disadvantages, none is completely unacceptable. On balance, the best criteria were found to be the J-integral for initiation and limited amounts of stable crack growth, and the local crack-tip opening angle for extended amounts of stable growth. A combination of the two, which may preserve the advantages of each while reducing their disadvantages, was also suggested by these results. The influence of biaxial and mixed flat/shear fracture behavior was investigated and found not to alter the basic results. Further work evolving from this research on the development of simplified ductile fracture analyses for routine engineering assessments of nuclear pressure vessels and piping is also described.
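
    For reference, the J-integral favored here for crack initiation has the standard path-independent contour definition below (the generic textbook form, not an equation reproduced from this paper); the crack-tip opening angle criterion is, by contrast, a purely geometric measure taken at the advancing crack tip.

```latex
% Standard J-integral over a contour \Gamma surrounding the crack tip:
% W = strain energy density, T_i = traction vector on \Gamma,
% u_i = displacement field, s = arc length along \Gamma.
J = \int_{\Gamma} \left( W \,\mathrm{d}y \;-\; T_i \,\frac{\partial u_i}{\partial x} \,\mathrm{d}s \right)
```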

  15. A methodology to promote business development from research outcomes in food science and technology

    Directory of Open Access Journals (Sweden)

    Eduardo L. Cardoso

    2015-04-01

    Full Text Available Valorization of knowledge produced in research units has been a major challenge for research universities in contemporary societies. The prevailing forces have led these institutions to develop a “third mission”: the facilitation of technology transfer and activity in an entrepreneurial paradigm. Effective management of the challenges encountered in the development of academic entrepreneurship, and the associated valorization of knowledge produced by universities, are major factors in bridging the gap between research and innovation in Europe. The need to improve existing institutional knowledge valorization processes, concerning entrepreneurship and business development, and the processes required were discussed. A case study was designed to describe the institutional knowledge valorization process in a food science and technology research unit and a related incubator, during a five-year evaluation period that ended in 2012. The knowledge valorization processes benefited from the adoption of a structured framework methodology that took ideas and teams from business model generation to client development, in parallel, when possible, with agile product/service development. Although academic entrepreneurship engagement could be improved, this case study demonstrated that stronger skills development was needed to enable researchers to be more aware of business development fundamentals and thereby contribute to research decisions and the valorization of individual and institutional knowledge assets. It was noted that the timing for involvement of companies in research projects or programs varied with the nature of the research.

  16. Enabling Psychiatrists to be Mobile Phone App Developers: Insights Into App Development Methodologies.

    Science.gov (United States)

    Zhang, Melvyn Wb; Tsang, Tammy; Cheow, Enquan; Ho, Cyrus Sh; Yeong, Ng Beng; Ho, Roger Cm

    2014-11-11

    The use of mobile phones, and specifically smartphones, has become more and more prevalent in the last decade. The latest mobile phones are equipped with comprehensive features that can be used in health care, such as providing rapid access to up-to-date evidence-based information, provision of instant communications, and improvements in organization. The estimated number of health care apps for mobile phones is increasing tremendously, but previous research has highlighted a lack of critical appraisal of new apps, largely due to the shortage of clinicians with the technical knowledge needed to create an evidence-based app. We discuss two freely available methodologies for developing Web-based mobile phone apps: a website builder and an app builder. With these, users can program not just a Web-based app, but also integrate multimedia features within their app, without needing to know any programming language. We present techniques for creating a mobile Web-based app using two well-established online mobile app websites. We illustrate how to integrate text-based content within the app, as well as interactive videos and rich site summary (RSS) feed information, and we briefly discuss how to integrate a simple questionnaire survey into the mobile-based app. A questionnaire survey was administered to students to collate their perceptions of the app. These two methodologies have been used to convert an online electronic psychiatry textbook into two Web-based mobile phone apps for medical students rotating through psychiatry in Singapore. Since the inception of our mobile Web-based app, a total of 21,991 unique users have used the mobile app and online portal provided by WordPress, and another 717 users have accessed the app via a Web-based link. The user perspective survey results (n=185) showed that a high proportion of students valued the textbook and objective structured clinical

  17. Development of a Proliferation Resistance Assessment Methodology for Regulation Purposes

    International Nuclear Information System (INIS)

    Ham, Taekyu; Seo, Janghoon; Lee, Nayoung; Yoo, Hosik

    2015-01-01

    More than 45 countries are considering embarking on nuclear power programs, and the world's nuclear power generating capacity is projected to continue to grow through 2030. The installed total nuclear capacity of 373 GWe in 2012 is projected to reach 435 GWe and 722 GWe by 2030 in the low and high scenario predictions, respectively. In Korea, there are 23 nuclear power plants in operation, and thirteen more plants are either under construction or being planned for completion by 2027. In addition, active research is taking place into pyroprocessing technology for use in treating spent fuel and reducing storage requirements. Measures for analyzing the proliferation resistance (PR) of a nuclear energy system were derived by collecting attributes that influence PR and categorizing them into groups. Three measures were then developed through a series of processes: legal and institutional framework, material characteristics, and safeguardability. The extrinsic features are the more practical ones to evaluate when a regulatory body assesses a system

  18. Development of a methodology for maintenance optimization at Kozloduy NPP

    International Nuclear Information System (INIS)

    Kitchev, E.

    1997-01-01

    The paper presents an overview of a project for the development of an applicable strategy and methods for Kozloduy NPP (KNPP) to optimize its maintenance program in order to meet current risk-based maintenance requirements. The strategy, in the format of an Integrated Maintenance Program (IMP) manual, will define the targets of the optimization process, the major stages and elements of this process, and their relationships. The IMP embodies the aspects of US NRC Maintenance Rule compliance and facilitates the integration of KNPP programs and processes that impact plant maintenance and safety. The methods, in the format of IMP Instructions (IM-PI), will define how the different IMP stages can be implemented and the IMP targets achieved in the KNPP environment. (author). 8 refs

  19. Research Activities on Development of Piping Design Methodology of High Temperature Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Nam-Su [Seoul National Univ. of Science and Technology, Seoul (Korea, Republic of)]; Won, Min-Gu [Sungkyunkwan Univ., Suwon (Korea, Republic of)]; Oh, Young-Jin [KEPCO Engineering and Construction Co. Inc., Gimcheon (Korea, Republic of)]; Lee, Hyeog-Yeon; Kim, Yoo-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    A sodium-cooled fast reactor (SFR) is operated at high temperature and low pressure compared with a commercial pressurized water reactor (PWR), and such operating conditions lead to time-dependent damage such as creep rupture, excessive creep deformation, creep-fatigue interaction, and creep crack growth. Thus, a high temperature design and structural integrity assessment methodology should be developed considering such failure mechanisms. In terms of the design of mechanical components of an SFR, the ASME B&PV Code, Sec. III, Div. 5 and RCC-MRx provide high temperature design and assessment procedures for nuclear structural components operated at high temperature, and a Leak-Before-Break (LBB) assessment procedure for high temperature piping is also provided in RCC-MRx, A16. Three web-based evaluation programs based on the current high temperature codes were developed for structural components of high temperature reactors. Moreover, for detailed LBB analyses of high temperature piping, new engineering methods for predicting the creep C*-integral and the creep crack opening displacement (COD) rate, based either on GE/EPRI or on reference stress concepts, were proposed. Finally, numerical methods based on Garofalo's model and RCC-MRx have been developed and implemented into ABAQUS. The predictions based on both models were compared with experimental results, and it was found that the predictions from Garofalo's model described the deformation behavior of Gr. 91 at elevated temperatures reasonably well.
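
    For orientation, Garofalo's model referred to above is commonly written as the hyperbolic-sine creep law below; this is the generic textbook form, since the parameter values fitted for Gr. 91 in the paper are not reproduced here.

```latex
% Garofalo (hyperbolic sine) law for the steady-state creep rate:
% A, alpha, n = material constants, Q = activation energy,
% R = gas constant, T = absolute temperature, sigma = applied stress.
\dot{\varepsilon}_{s} = A \left[ \sinh\!\left( \alpha \sigma \right) \right]^{n} \exp\!\left( -\frac{Q}{RT} \right)
```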

  20. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    Science.gov (United States)

    Ferreira Fonseca, T. C.; Bogaerts, R.; Hunt, John; Vanhavere, F.

    2014-11-01

    A Whole Body Counter (WBC) is a facility for routinely assessing the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done using anthropomorphic physical phantoms representing the human body. Owing to the challenge of constructing representative physical phantoms, virtual calibration has been introduced: the use of computational phantoms together with Monte Carlo simulation of radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms, allowing voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty associated with the in vivo measurement routine, which is the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. The open source MakeHuman and Blender software packages were used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces, and in-house software was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms, called MaMP and FeMP (Male and Female Mesh Phantoms), to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium.
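
    The voxel-grid-to-MCNPX conversion step lends itself to a short sketch. The fragment below is a hypothetical, heavily simplified illustration (the grid, universe numbering, and file handling are assumptions): it run-length encodes voxel tissue IDs into MCNP's nR repeat shorthand for a lattice FILL card. It emits only that card body, not a complete, valid MCNPX input deck.

```python
import numpy as np

# Hypothetical voxel grid: each uint8 value is a tissue ID mapped one-to-one
# onto an MCNP universe number. A real phantom would be loaded with e.g.
# np.fromfile("phantom.raw", dtype=np.uint8).reshape(nz, ny, nx).
rng = np.random.default_rng(0)
grid = rng.integers(0, 4, size=(6, 5, 4), dtype=np.uint8)

def fill_entries(flat):
    """Run-length encode voxel IDs with MCNP's 'nR' repeat shorthand,
    e.g. 3 3 3 3 -> '3 3R' (a value followed by its number of repeats)."""
    out, i = [], 0
    while i < len(flat):
        j = i
        while j + 1 < len(flat) and flat[j + 1] == flat[i]:
            j += 1
        run = j - i + 1
        out.append(str(flat[i]) if run == 1 else f"{flat[i]} {run - 1}R")
        i = j + 1
    return out

# Emit only the lattice FILL card body for the flattened grid.
print("fill " + " ".join(fill_entries(grid.ravel(order="C"))))
```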

  1. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    International Nuclear Information System (INIS)

    Fonseca, T C Ferreira; Vanhavere, F; Bogaerts, R; Hunt, John

    2014-01-01

    A Whole Body Counter (WBC) is a facility for routinely assessing the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done using anthropomorphic physical phantoms representing the human body. Owing to the challenge of constructing representative physical phantoms, virtual calibration has been introduced: the use of computational phantoms together with Monte Carlo simulation of radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms, allowing voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty associated with the in vivo measurement routine, which is the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. The open source MakeHuman and Blender software packages were used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces, and in-house software was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms, called MaMP and FeMP (Male and Female Mesh Phantoms), to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium. (paper)

  2. PWR Facility Dose Modeling Using MCNP5 and the CADIS/ADVANTG Variance-Reduction Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Blakeman, Edward D. [ORNL]; Peplow, Douglas E. [ORNL]; Wagner, John C. [ORNL]; Murphy, Brian D. [ORNL]; Mueller, Don [ORNL]

    2007-09-01

    The feasibility of modeling a pressurized-water-reactor (PWR) facility and calculating dose rates at all locations within the containment and adjoining structures using MCNP5 with mesh tallies is presented. Calculations of dose rates resulting from neutron and photon sources from the reactor (operating and shut down for various periods) and the spent fuel pool, as well as for the photon source from the primary coolant loop, were all of interest. Identification of the PWR facility, development of the MCNP-based model and automation of the run process, calculation of the various sources, and development of methods for visually examining mesh tally files and extracting dose rates were all a significant part of the project. Advanced variance reduction, which was required because of the size of the model and the large amount of shielding, was performed via the CADIS/ADVANTG approach. This methodology uses an automatically generated three-dimensional discrete ordinates model to calculate adjoint fluxes from which MCNP weight windows and source bias parameters are generated. Investigative calculations were performed using a simple block model and a simplified full-scale model of the PWR containment, in which the adjoint source was placed in various regions. In general, it was shown that placement of the adjoint source on the periphery of the model provided adequate results for regions reasonably close to the source (e.g., within the containment structure for the reactor source). A modification to the CADIS/ADVANTG methodology was also studied in which a global adjoint source is weighted by the reciprocal of the dose response calculated by an earlier forward discrete ordinates calculation. This method showed improved results over those using the standard CADIS/ADVANTG approach, and its further investigation is recommended for future efforts.
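
    The CADIS source-biasing step described above can be summarized by the standard defining relations of the method (the generic published form, not equations quoted from this report): the adjoint flux from the discrete ordinates calculation sets both the biased source and the weight-window lower bounds.

```latex
% CADIS biased source density and weight windows, where q(\vec{r},E) is the
% true source, \phi^{\dagger}(\vec{r},E) the adjoint flux, and
% R = \int\!\int \phi^{\dagger} q \, \mathrm{d}V \, \mathrm{d}E the
% adjoint-estimated response (e.g. dose at the detector of interest).
\hat{q}(\vec{r},E) = \frac{\phi^{\dagger}(\vec{r},E)\, q(\vec{r},E)}{R},
\qquad
w(\vec{r},E) \propto \frac{R}{\phi^{\dagger}(\vec{r},E)}
```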

  3. PWR Facility Dose Modeling Using MCNP5 and the CADIS/ADVANTG Variance-Reduction Methodology

    International Nuclear Information System (INIS)

    Blakeman, Edward D.; Peplow, Douglas E.; Wagner, John C.; Murphy, Brian D.; Mueller, Don

    2007-01-01

    The feasibility of modeling a pressurized-water-reactor (PWR) facility and calculating dose rates at all locations within the containment and adjoining structures using MCNP5 with mesh tallies is presented. Calculations of dose rates resulting from neutron and photon sources from the reactor (operating and shut down for various periods) and the spent fuel pool, as well as for the photon source from the primary coolant loop, were all of interest. Identification of the PWR facility, development of the MCNP-based model and automation of the run process, calculation of the various sources, and development of methods for visually examining mesh tally files and extracting dose rates were all a significant part of the project. Advanced variance reduction, which was required because of the size of the model and the large amount of shielding, was performed via the CADIS/ADVANTG approach. This methodology uses an automatically generated three-dimensional discrete ordinates model to calculate adjoint fluxes from which MCNP weight windows and source bias parameters are generated. Investigative calculations were performed using a simple block model and a simplified full-scale model of the PWR containment, in which the adjoint source was placed in various regions. In general, it was shown that placement of the adjoint source on the periphery of the model provided adequate results for regions reasonably close to the source (e.g., within the containment structure for the reactor source). A modification to the CADIS/ADVANTG methodology was also studied in which a global adjoint source is weighted by the reciprocal of the dose response calculated by an earlier forward discrete ordinates calculation. This method showed improved results over those using the standard CADIS/ADVANTG approach, and its further investigation is recommended for future efforts

  4. Groundwater flow simulations in support of the Local Scale Hydrogeological Description developed within the Laxemar Methodology Test Project

    International Nuclear Information System (INIS)

    Follin, Sven; Svensson, Urban

    2002-05-01

    The deduced Site Descriptive Model of the Laxemar area has been parameterised from a hydraulic point of view and subsequently put into practice in terms of a numerical flow model. The intention of the subproject has been to explore the adaptation of a numerical flow model to site-specific surface and borehole data, and to identify potential needs for development and improvement in the planned modelling methodology and tools. The experiences made during this process and the outcome of the simulations have been presented to the methodology test project group in the course of the project. The discussion and conclusions made in this particular report concern mainly two issues: (i) the use of numerical simulations as a means of gaining credibility, e.g. discrimination between alternative geological models, and (ii) calibration and conditioning of probabilistic (Monte Carlo) realisations

  5. Methodology for identifying parameters for the TRNSYS model Type 210 - wood pellet stoves and boilers

    Energy Technology Data Exchange (ETDEWEB)

    Persson, Tomas; Fiedler, Frank; Nordlander, Svante

    2006-05-15

    This report describes a method for performing measurements on boilers and stoves and for identifying parameters from the measurements for the boiler/stove model TRNSYS Type 210. The model can be used for detailed annual system simulations in TRNSYS. Experience from measurements on three different pellet stoves and four boilers was used to develop this methodology. Recommendations for the set-up of measurements are given, together with the combustion theory required for the data evaluation and preparation. The data evaluation showed that the uncertainties are quite large for the measured flue gas flow rate; for boilers and stoves with a high fraction of energy going to the water jacket, the calculated heat rate to the room may also have large uncertainties. A methodology for the parameter identification process is presented, and identified parameters for two different stoves and three boilers are given. Finally, the identified models are compared with measured data, showing that the model generally agrees well with measurements under both stationary and dynamic conditions.
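
    As a hedged sketch of what such a parameter identification step can look like, the fragment below fits a generic first-order thermal response by least squares; the model, data, and parameter names are invented for illustration and are not the actual Type 210 equations.

```python
import numpy as np
from scipy.optimize import least_squares

# Invented measurement record: time (s) and measured heat rate to water (kW)
# following a step in combustion power from 0 to P = 10 kW at t = 0.
t = np.linspace(0.0, 3600.0, 61)
P = 10.0
measured = 0.85 * P * (1.0 - np.exp(-t / 900.0)) + np.random.normal(0.0, 0.1, t.size)

def model(theta, t):
    """First-order response: theta = (eta, tau) = (efficiency, time constant)."""
    eta, tau = theta
    return eta * P * (1.0 - np.exp(-t / tau))

def residuals(theta):
    return model(theta, t) - measured

fit = least_squares(residuals, x0=[0.5, 300.0], bounds=([0.0, 1.0], [1.0, 1e5]))
print("identified efficiency and time constant:", fit.x)
```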

  6. Limitations of JEDI Models | Jobs and Economic Development Impact Models |

    Science.gov (United States)

    JEDI models draw their economic multipliers from the IMPLAN Group's IMPLAN accounting software; for JEDI, these are updated every two years to use the best available data. Input-output modeling remains a widely used methodology for measuring economic development activity, and results depend on the definition of the geographic area under consideration. Datasets of multipliers from IMPLAN are available at ...

  7. Development of IRMA reagent and methodology for PSA

    International Nuclear Information System (INIS)

    Najafi, R.

    1997-01-01

    The PSA test is a solid-phase, two-site immunoassay. Rabbit anti-PSA is coated or bound on the surface of the solid phase, and monoclonal anti-PSA is labeled with I-125. The PSA molecules present in the standard solution or serum are 'sandwiched' between the two antibodies. After formation of the coated antibody-antigen-labeled antibody complex, the unbound labeled antibody is removed by washing, and the complex is measured with a gamma counter. The concentration of analyte is proportional to the counts of the test sample. To develop kits for IRMA PSA, three essential reagents must be prepared (antibody-coated solid phase, labeled antibody, and standards) and then optimized to obtain a standard curve fit to measure specimen PSA over the desired range of concentrations. The type of solid phase and the procedure(s) for coating or binding the antibody are still the main subjects of debate in developing and setting up RIA/IRMA kits. In our experiments, polystyrene beads can be considered a suitable solid phase because they are easy to coat with antibody and easy to use. Most antibodies are passively adsorbed to a plastic surface (e.g., polystyrene, polypropylene, and polyvinyl chloride) from a diluted buffer; the antibody-coated plastic surface then acts as the solid-phase reagent. Poor efficiency, the time required to reach equilibrium, and a lack of reproducibility (especially batch-to-batch variation between materials) are disadvantages of this simple coating procedure. Improvements can be made by coating a second antibody on the surface of the beads and allowing the second and primary antibodies to react. It is also possible to further enhance the coating efficiency of the beads by using Staphylococcus aureus Protein A. Protein A is a major component of the Staphylococcus aureus cell wall that has an affinity for the Fc segment of immunoglobulin G (IgG) of some species, including human, rabbit, and mouse. This property of staphylococcal Protein A has made it a very useful tool in the purification of classes and subclasses
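
    Immunoassay standard curves of this kind are commonly fitted with a four-parameter logistic (4PL) function; the sketch below shows such a fit on invented calibrator data and then inverts the curve to read a specimen concentration. The 4PL choice and all numbers are illustrative assumptions, not taken from this report.

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented calibrators: PSA concentration (ng/mL) vs. gamma counts.
conc = np.array([0.5, 1.0, 2.5, 5.0, 10.0, 25.0, 50.0])
counts = np.array([310.0, 620.0, 1450.0, 2700.0, 4800.0, 8900.0, 12500.0])

def four_pl(x, a, b, c, d):
    """4PL: a = response at zero dose, d = response at infinite dose,
    c = inflection point (EC50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

params, _ = curve_fit(four_pl, conc, counts, p0=[100.0, 1.0, 5.0, 15000.0])

# Invert the fitted curve to read a specimen concentration from its counts.
a, b, c, d = params
specimen_counts = 3500.0
conc_est = c * (((a - d) / (specimen_counts - d)) - 1.0) ** (1.0 / b)
print("estimated PSA concentration (ng/mL):", conc_est)
```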

  8. The development of evaluation methodology for advanced interactive communication

    International Nuclear Information System (INIS)

    Okamoto, K.

    2005-01-01

    Face-to-face communication is one of the essential styles of communication. Through face-to-face communication, people exchange a large amount of information at a time, both verbal and non-verbal, which is the most effective way to learn about each other. The authors focused on face-to-face communication and developed an evaluation method to quantify its effectiveness. We regard conversation as an exchange of keywords: the effectiveness of a conversation is evaluated by the number of keywords exchanged and the degree of mutual understanding achieved. For two people in face-to-face communication, the authors quantified the shared information by measuring the change in the amount of the participants' knowledge, where a participant's knowledge is counted as the number of words he or she can give that are associated with the theme. We measured the change in the participants' shared knowledge and also quantified the discords in their understanding of their partners by measuring the mismatch between the knowledge they think they share and the knowledge they really share. From these data, we evaluated the effectiveness of communication and analyzed the trends in mutual understanding. (authors)
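
    A minimal sketch of the bookkeeping this implies, assuming each participant's knowledge is represented as a set of theme-associated words; the sets and the 'discord' definition below are an illustrative reading of the abstract, not the authors' published procedure.

```python
# Words each participant can give on the theme, before and after the talk.
before_a = {"reactor", "fuel", "coolant"}
before_b = {"reactor", "turbine"}
after_a = {"reactor", "fuel", "coolant", "turbine", "condenser"}
after_b = {"reactor", "turbine", "fuel", "coolant"}

# Shared knowledge = words both participants can give.
shared_before = before_a & before_b
shared_after = after_a & after_b
print("growth in shared knowledge:", len(shared_after) - len(shared_before))

# Discord: words A believes are shared but B cannot actually give.
# 'believed_shared_by_a' is an invented stand-in for whatever elicitation
# the study used to probe what A thinks the pair now has in common.
believed_shared_by_a = {"reactor", "turbine", "fuel", "condenser"}
discord_a = believed_shared_by_a - after_b
print("A's discord with B:", discord_a)
```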

  9. Recent progress and developments in LWR-PV calculational methodology

    International Nuclear Information System (INIS)

    Maerker, R.E.; Broadhead, B.L.; Williams, M.L.

    1984-01-01

    New and improved techniques for calculating beltline surveillance activities and pressure vessel fluences with reduced uncertainties have recently been developed. These techniques involve combining monitored in-core power data with diffusion-theory-calculated pin-by-pin data to yield absolute source distributions in R-THETA and R-Z geometries suitable for discrete ordinates transport calculations. Effects of finite core height, whenever necessary, can be considered by the use of a three-dimensional fluence rate synthesis procedure. The effects of a time-dependent spatial source distribution may be readily evaluated by applying the concept of the adjoint function, simplifying the procedure to such a degree that only one forward and one adjoint calculation are required to yield all the dosimeter activities for all beltline surveillance locations at once. The addition of several more adjoint calculations, using various fluence rates as responses, is all that is needed to determine all the pressure vessel group fluences at all beltline locations for an arbitrary source distribution
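
    The adjoint shortcut described here rests on the standard reciprocity relation below (the generic identity, not an equation quoted from the paper): once the adjoint function for a given dosimeter response has been computed, the activity for any source distribution follows from a single integral, so the source can change without repeating the transport calculation.

```latex
% Dosimeter response R from source density q and adjoint function
% \phi^{\dagger}, where \phi^{\dagger} is computed with the dosimeter
% reaction cross section as its (adjoint) source:
R = \int_{V} \int_{E} \phi^{\dagger}(\vec{r},E)\, q(\vec{r},E)\, \mathrm{d}E\, \mathrm{d}V
```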

  10. Pediatric hospital medicine core competencies: development and methodology.

    Science.gov (United States)

    Stucky, Erin R; Ottolini, Mary C; Maniscalco, Jennifer

    2010-01-01

    Pediatric hospital medicine is the most rapidly growing site-based pediatric specialty. There are over 2500 unique members in the three core societies to which pediatric hospitalists belong: the American Academy of Pediatrics (AAP), the Academic Pediatric Association (APA), and the Society of Hospital Medicine (SHM). Pediatric hospitalists are fulfilling both clinical and system improvement roles within varied hospital systems. Defined expectations and competencies for pediatric hospitalists are needed. In 2005, SHM's Pediatric Core Curriculum Task Force initiated the project and formed the editorial board. Over the subsequent four years, multiple pediatric hospitalists belonging to the AAP, APA, or SHM contributed to the content of and guided the development of the project. Editors and collaborators created a framework for identifying appropriate competency content areas. Content experts from both within and outside of pediatric hospital medicine participated as contributors. A number of selected national organizations and societies provided valuable feedback on chapters. The final product was validated by formal review from the AAP, APA, and SHM. The Pediatric Hospital Medicine Core Competencies were created. They include 54 chapters divided into four sections: Common Clinical Diagnoses and Conditions, Core Skills, Specialized Clinical Services, and Healthcare Systems: Supporting and Advancing Child Health. Each chapter can be used independently of the others. Chapters follow the knowledge, skills, and attitudes educational curriculum format, and have an additional section on systems organization and improvement to reflect the pediatric hospitalist's responsibility to advance systems of care. These competencies provide a foundation for the creation of pediatric hospital medicine curricula and serve to standardize and improve inpatient training practices. (c) 2010 Society of Hospital Medicine.

  11. A Consistent Methodology Based Parameter Estimation for a Lactic Acid Bacteria Fermentation Model

    DEFF Research Database (Denmark)

    Spann, Robert; Roca, Christophe; Kold, David

    2017-01-01

    Lactic acid bacteria are used in many industrial applications, e.g. as starter cultures in the dairy industry or as probiotics, and research on their cell production is much needed. A first-principles kinetic model was developed to describe and understand the biological, physical, and chemical mechanisms in a lactic acid bacteria fermentation. We present here a consistent approach for a methodology-based parameter estimation for a lactic acid fermentation. At the outset, only an initial knowledge-based guess of the parameters was available, and an initial estimation of the complete set of parameters was performed in order to obtain a good model fit to the data. However, not all parameters are identifiable with the given data set and model structure. Sensitivity, identifiability, and uncertainty analyses were completed, and a relevant identifiable subset of parameters was determined for a new...
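
    A hedged sketch of the sensitivity and identifiability step, using a generic Monod-type growth model rather than the authors' actual fermentation model; all parameter names and numbers below are assumptions for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def monod(t, x, mu_max, Ks, Yxs):
    """Generic Monod growth: biomass X grows on substrate S."""
    X, S = x
    mu = mu_max * S / (Ks + S)
    return [mu * X, -mu * X / Yxs]

def simulate(theta, t_eval):
    sol = solve_ivp(monod, (0.0, t_eval[-1]), [0.1, 20.0],
                    args=tuple(theta), t_eval=t_eval)
    return sol.y[0]  # biomass trajectory

theta0 = np.array([0.8, 0.5, 0.4])  # mu_max, Ks, Yxs (nominal guesses)
t_eval = np.linspace(0.0, 12.0, 25)

# Finite-difference local sensitivity matrix S_ij = d(output_i)/d(theta_j).
base = simulate(theta0, t_eval)
S = np.empty((t_eval.size, theta0.size))
for j in range(theta0.size):
    dtheta = theta0.copy()
    dtheta[j] *= 1.01
    S[:, j] = (simulate(dtheta, t_eval) - base) / (0.01 * theta0[j])

# A large condition number flags near-collinear, poorly identifiable parameters.
print("sensitivity matrix condition number:", np.linalg.cond(S))
```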

  12. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    Science.gov (United States)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years and of the state of the art in the power processing design, modeling, and analysis techniques used to optimize power processing systems. A methodology for modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff study and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing system modeling and analysis is presented so that meaningful results can be obtained each year to aid power processing system engineers and power processing equipment circuit designers in their conceptual and detailed design and analysis tasks.

  13. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

    The availability of reliable tools, and of an associated methodology, able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on a Monte Carlo type random sampling of all relevant input variables. After outlining the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, on its extended experimental database, and on the assessment of the GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO model uncertainties is of the utmost importance for accurate fuel design margin evaluation, as illustrated by several application examples. With the submittal of the GALILEO Topical Report to the U.S. NRC in 2013, GALILEO and its methodology are on the way to being used industrially in a wide range of irradiation conditions. (authors)
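
    A minimal sketch of Monte Carlo uncertainty propagation of the sort described, assuming a toy fuel-temperature response in place of the actual GALILEO code; the distributions, bounds, and response function are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of Monte Carlo samples

# Invented input uncertainties: each model input sampled from its distribution.
gap_conductance = rng.normal(5000.0, 500.0, N)       # W/m^2/K
fuel_conductivity = rng.normal(3.0, 0.2, N)          # W/m/K
linear_power = rng.uniform(18.0e3, 22.0e3, N)        # W/m

def fuel_centerline_temp(q, k, h, r=0.0041, t_cool=580.0):
    """Toy steady-state response: coolant temperature plus gap and pellet
    temperature rises for linear power q, conductivity k, gap conductance h."""
    return t_cool + q / (2.0 * np.pi * r * h) + q / (4.0 * np.pi * k)

samples = fuel_centerline_temp(linear_power, fuel_conductivity, gap_conductance)

# A design margin is then read from the upper tail of the response distribution.
print("best estimate (K):", np.median(samples))
print("95th percentile (K):", np.percentile(samples, 95))
```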

  14. New droplet model developments

    International Nuclear Information System (INIS)

    Dorso, C.O.; Myers, W.D.; Swiatecki, W.J.; Moeller, P.; Treiner, J.; Weiss, M.S.

    1985-09-01

    A brief summary is given of three recent contributions to the development of the Droplet Model. The first concerns the electric dipole moment induced in octupole-deformed nuclei by the Coulomb redistribution. The second concerns a study of squeezing in nuclei, and the third is a study of the improved predictive power of the model when an empirical "exponential" term is included. 25 refs., 3 figs

  15. Maturity Models Development in IS Research

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan; Vatrapu, Ravi; Andersen, Kim Normann

    2015-01-01

    Maturity models are widespread in IS research and, in particular, in IT practitioner communities. However, theoretically sound, methodologically rigorous, and empirically validated maturity models are quite rare. This literature review paper focuses on the challenges faced during the development of maturity models. The literature reveals that researchers have primarily focused on developing new maturity models pertaining to domain-specific problems and/or new enterprise technologies. We find rampant re-use of the design structure of widely adopted models such as Nolan's Stage of Growth Model, Crosby's Grid, and the Capability Maturity Model (CMM). Only recently have there been research efforts to standardize maturity model development. We also identify three dominant views of maturity models and provide guidelines for various approaches to constructing maturity models with a standard vocabulary. We finally propose using...

  16. Capturing complexity in work disability research: application of system dynamics modeling methodology.

    Science.gov (United States)

    Jetha, Arif; Pransky, Glenn; Hettinger, Lawrence J

    2016-01-01