WorldWideScience

Sample records for modeling methodology development

  1. Suitability of Modern Software Development Methodologies for Model Driven Development

    Directory of Open Access Journals (Sweden)

    Ruben Picek

    2009-12-01

Full Text Available As an answer to today’s growing challenges in the software industry, a wide spectrum of new software development approaches has emerged. One prominent direction is the currently most promising software development paradigm, Model Driven Development (MDD). Despite considerable skepticism and open problems, the MDD paradigm is being used and improved to realize its many inherent potential benefits. A methodological approach to software development requires some kind of development process. Modern methodologies can be classified into two main categories: formal (heavyweight) and agile (lightweight). When it comes to MDD, however, currently known methodologies say little or nothing about an MDD development process. As the result of this research, the author examines the possibilities of using existing modern software methodologies in the context of the MDD paradigm.

  2. METHODOLOGICAL APPROACHES FOR MODELING THE RURAL SETTLEMENT DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Gorbenkova Elena Vladimirovna

    2017-10-01

Full Text Available Subject: the paper describes research results on validation of a rural settlement development model. The basic methods and approaches for assessing the efficiency of urban and rural settlement development are considered. Research objectives: determination of methodological approaches to modeling and creation of a model of rural settlement development. Materials and methods: domestic and foreign experience in modeling the territorial development of urban and rural settlements and settlement structures was generalized. The motivation for using the Pentagon-model for solving similar problems was demonstrated. Based on a systematic analysis of existing development models of urban and rural settlements, as well as the authors’ method for assessing the level of agro-town development, the systems/factors necessary for sustainable development of a rural settlement are identified. Results: we created a rural development model consisting of five major systems that include critical factors essential for achieving sustainable development of a settlement system: the ecological, economic, administrative, anthropogenic (physical) and social (supra-structure) systems. The methodological approaches for creating an evaluation model of rural settlement development were revealed; the basic motivating factors that provide interrelations of systems were determined; the critical factors for each subsystem were identified and substantiated. This approach is justified by the composition of territorial planning tasks at the local and state administration levels. The feasibility of applying the basic Pentagon-model, which has been used successfully for solving analogous problems of sustainable development, was shown. Conclusions: the resulting model can be used for identifying and substantiating the critical factors for rural sustainable development and can also become the basis of …

  3. Developing the Business Model – a Methodology for Virtual Enterprises

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

This chapter presents a methodology for developing Virtual Enterprises (VEs). This Virtual Enterprise Methodology (VEM) outlines activities to consider when setting up and managing virtual enterprises. As a methodology, the VEM helps companies ask the right questions when preparing for, and setting...

  4. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

Full Text Available In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world have introduced new methodological principles into their spatial development models. These methodologies and spatial planning guidelines focus not only on the controlled development of high-rise buildings, but on spatial modelling of the whole city, defining the main development criteria and estimating possible consequences. Vilnius is no exception; however, the re-establishment of Lithuanian independence triggered an uncontrolled urbanization process, so most of the city’s development regulations emerged as a consequence of unmanaged legalization of investors’ expectations. Consistent urban fabric, and the conservation and representation of the city’s most important objects, gained attention only when an actual threat emerged of overshadowing them with new architecture, alongside unmanaged urbanization in the city center and land-use-driven urban sprawl in the suburbs. Current Vilnius spatial planning documents clearly define the urban structure and key development principles, yet the definitions are relatively abstract, leading to uniform building coverage requirements for territories with distinct qualities and to simplified planar designs that do not meet quality standards. The overall quality of urban architecture is not regulated. The article examines current spatial modeling methods, their individual parts, principles, and criteria for quality assessment, and their applicability in Vilnius. The text outlines possible building coverage regulations and impact assessment criteria for new development, and provides a compendium of requirements for high-quality spatial planning and building design.

  5. Development of a General Modelling Methodology for Vacuum Residue Hydroconversion

    Directory of Open Access Journals (Sweden)

    Pereira de Oliveira L.

    2013-11-01

Full Text Available This work concerns the development of a methodology for kinetic modelling of refining processes, and more specifically of vacuum residue conversion. The proposed approach overcomes the lack of molecular detail in petroleum fractions and simulates the transformation of feedstock molecules into effluent molecules by means of a two-step procedure. In the first step, a synthetic mixture of molecules representing the process feedstock is generated via a molecular reconstruction method, termed SR-REM molecular reconstruction. In the second step, a kinetic Monte Carlo (kMC) method is used to simulate the conversion reactions on this mixture of molecules. The molecular reconstruction was applied to several petroleum residues and is illustrated for an Athabasca (Canada) vacuum residue. The kinetic Monte Carlo method is then described in detail. In order to validate this stochastic approach, a lumped deterministic model for vacuum residue conversion was simulated using Gillespie’s Stochastic Simulation Algorithm. Although the two approaches rest on very different hypotheses, the stochastic simulation algorithm simulates the conversion reactions with the same accuracy as the deterministic approach. The full-scale stochastic simulation approach, using molecular-level reaction pathways, provides a high level of detail on the effluent composition and is briefly illustrated for Athabasca VR hydrocracking.
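The validation step described in this abstract — checking a stochastic simulation against a lumped deterministic model — can be sketched with Gillespie's Stochastic Simulation Algorithm applied to a single first-order conversion A → B. The population size and rate constant below are invented for illustration and have no relation to the paper's kinetic parameters:

```python
import math
import random

def gillespie_first_order(n0, k, t_end, rng):
    """Simulate A -> B (rate constant k) with Gillespie's SSA; return surviving A count."""
    t, n = 0.0, n0
    while n > 0:
        a = k * n                    # total reaction propensity
        t += rng.expovariate(a)      # exponentially distributed time to next event
        if t > t_end:
            break
        n -= 1                       # one A molecule converts to B
    return n

rng = random.Random(42)
n0, k, t_end = 10_000, 0.5, 2.0      # hypothetical values
runs = [gillespie_first_order(n0, k, t_end, rng) for _ in range(20)]
stochastic_mean = sum(runs) / len(runs)
deterministic = n0 * math.exp(-k * t_end)   # lumped ODE solution n(t) = n0 * exp(-k t)
print(stochastic_mean, deterministic)
```

With a large enough initial population, the averaged SSA result agrees with the deterministic exponential decay, which mirrors the consistency check the abstract reports.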

  6. Proposal for product development model focused on ce certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro

    2015-09-01

Full Text Available This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands of Product Certification of the European Community (CE). Furthermore, it presents a product development model comprising the steps in the models analyzed, including improved activities for the referred product certification. The proposed improvements are justified by the growing drive for internationalization of products and processes within companies.

  7. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although the literature reports many experiences of using archetypes, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the participants and tools involved. It also describes the possible strategies for organizing the modeling process. The proposed methodology is inspired by existing best practices of CIM, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also serve as a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. A PROPOSED MODEL OF AGILE METHODOLOGY IN SOFTWARE DEVELOPMENT

    OpenAIRE

    Anjali Sharma*, Karambir

    2016-01-01

Agile software development has been increasing in popularity, replacing traditional methods of software development. This paper presents several neural network techniques, including General Regression Neural Networks (GRNN), Probabilistic Neural Networks (PNN), GMDH polynomial neural networks and cascade correlation neural networks, as well as a machine learning technique, Random Forest. To achieve better prediction of the effort estimation of agile projects, we use Random Forest with the Story Points Approa...

  9. Developing a methodology to predict PM10 concentrations in urban areas using generalized linear models.

    Science.gov (United States)

    Garcia, J M; Teodoro, F; Cerdeira, R; Coelho, L M R; Kumar, Prashant; Carvalho, M G

    2016-09-01

A methodology to predict PM10 concentrations in urban outdoor environments is developed based on generalized linear models (GLMs). The methodology is based on the relationship between atmospheric concentrations of air pollutants (i.e. CO, NO2, NOx, VOCs, SO2) and meteorological variables (i.e. ambient temperature, relative humidity (RH) and wind speed) for a city (Barreiro) in Portugal. The model uses air pollution and meteorological data from the Portuguese air quality monitoring station networks. The developed GLM treats PM10 concentrations as the dependent variable, and both the gaseous pollutants and meteorological variables as explanatory independent variables. A logarithmic link function was used with a Poisson probability distribution. Particular attention was given to cases with air temperatures both below and above 25°C. The best performance of modelled results against the measured data was achieved by the model restricted to air temperatures above 25°C, compared with the model considering all air temperatures and the model considering only temperatures below 25°C. The model was also tested with similar data from another Portuguese city, Oporto, and the results were found to behave similarly. It is concluded that this model and methodology could be adopted for other cities to predict PM10 concentrations when such data are not available from measurements at air quality monitoring stations or by other acquisition means.
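The model class named in this abstract — a Poisson GLM with a logarithmic link — can be sketched with a small iteratively reweighted least squares (IRLS) fit. The two synthetic predictors and all coefficients below are invented stand-ins for the pollutant and meteorological variables; this is not the paper's fitted model:

```python
import numpy as np

def poisson_glm_irls(X, y, n_iter=25):
    """Fit a Poisson GLM with a log link by iteratively reweighted least squares."""
    X = np.column_stack([np.ones(len(y)), X])   # prepend an intercept column
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean() + 1e-9)           # start near the data mean
    for _ in range(n_iter):
        eta = X @ beta                          # linear predictor
        mu = np.exp(eta)                        # inverse log link: E[y] = exp(eta)
        W = mu                                  # Poisson working weights
        z = eta + (y - mu) / mu                 # working response
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# Synthetic data with known (invented) coefficients, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
true_beta = np.array([1.0, 0.5, 0.3])
y = rng.poisson(np.exp(np.column_stack([np.ones(500), X]) @ true_beta))
beta_hat = poisson_glm_irls(X, y)
print(beta_hat)
```

The recovered coefficients land close to the generating values, which is the basic sanity check before applying such a model to real monitoring-station data.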

  10. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

Full Text Available Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed, classified by their differing viewpoints on problem-solving as first generation and second generation. Accident analysis can be carried out using three techniques: sequential techniques, epidemiological techniques and systemic techniques, with marine accidents falling under the epidemiological technique. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model, as applied to the assessment of marine accidents. The MOP model can effectively describe the relationships of the other factors which affect accidents, whereas the HEART methodology focuses only on human factors.

  11. A Methodological Note for the Development of Integrated Aquaculture Production Models

    Directory of Open Access Journals (Sweden)

    Stella Tsani

    2018-01-01

Full Text Available Aquaculture production can yield significant economic, social, and environmental effects. These exceed the financial costs and benefits that aquaculture producers face. We propose a methodology for the development of integrated production models that allows the socio-economic and environmental effects of aquaculture to be included in production management. The methodology builds on a Social Cost-Benefit Analysis framework and includes three parts: (i) environmental, which captures the interactions of aquaculture with the environment; (ii) economic, which provides for the incorporation of economic determinants into the production models; and (iii) social, which introduces social preferences into the production and management process. Alternatives for addressing data availability issues are also discussed. The methodology extends the assessment of the costs and benefits of aquaculture beyond pure financial metrics and beyond the quantification of private costs and benefits. It can also support the development of integrated models of aquaculture production that take into consideration both the private and the social costs and benefits associated with externalities and effects not appropriately captured by market mechanisms. The methodology can support aquaculture management, as well as policies targeting sustainable and efficient aquaculture production and financing, from an economic, financial, social, and environmental point of view.

  12. Methodology Development for SiC Sensor Signal Modelling in the Nuclear Reactor Radiation Environments

    International Nuclear Information System (INIS)

    Cetnar, J.; Krolikowski, I.P.

    2013-06-01

This paper deals with a SiC detector simulation methodology for signal formation by neutrons and induced secondary radiation, as well as its inverse interpretation. The primary goal is to achieve SiC capability for simultaneous spectroscopic measurements of neutrons and gamma-rays, for which an appropriate methodology of detector signal modelling and interpretation must be adopted. The process of detector simulation is divided into two basically separate but in practice interconnected sections. The first is the forward simulation of detector signal formation in the field of the primary neutron and secondary radiations, whereas the second is the inverse problem of finding a representation of the primary radiation based on the measured detector signals. The methodology under development is based on Monte Carlo description of radiation transport and analysis of the reactor physics. The methodology of SiC detector signal interpretation will be based on existing experience in neutron metrology, developed in the past for various neutron and gamma-ray detection systems. Since the novel SiC-based sensors are characterised by a new structure, yet to be finally designed, the methodology for spectroscopic particle fluence measurement must be developed while giving productive feedback to the SiC sensor design process, in order to arrive at the best possible design. (authors)

  13. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

In the period 1981-1994, SKB studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings of the joint SKI/SKB work on scenario development presented in SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each methodology is explained in this report, along with examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely Influence diagrams and the RES methodology. In conclusion, a combination of parts of the Influence diagram and RES methodologies is likely to be a promising approach. 26 refs
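Of the methodologies named in this record, event tree analysis is the most directly mechanisable: each top event branches on occurrence/non-occurrence and the branch probabilities multiply. The sketch below enumerates a toy three-top-event tree; the event names and probabilities are invented for illustration and bear no relation to SKB's actual FEP catalogue:

```python
from itertools import product

# Hypothetical top events with invented occurrence probabilities.
top_events = {
    "canister_breach": 0.01,
    "buffer_erosion": 0.10,
    "fracture_transport": 0.30,
}

# Enumerate all 2^3 branches of the event tree and their probabilities.
branches = {}
for outcome in product([True, False], repeat=len(top_events)):
    p = 1.0
    for (name, prob), occurred in zip(top_events.items(), outcome):
        p *= prob if occurred else (1 - prob)   # branch on occurrence
    branches[outcome] = p

total = sum(branches.values())          # all branches partition the outcome space
worst = branches[(True, True, True)]    # probability that every top event occurs
print(total, worst)
```

The branch probabilities sum to 1, which is the standard completeness check on an event tree before any consequence assessment is attached to its end states.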

  14. Scenario development methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Eng, T. [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Hudson, J. [Rock Engineering Consultants, Welwyn Garden City, Herts (United Kingdom); Stephansson, O. [Royal Inst. of Tech., Stockholm (Sweden). Div. of Engineering Geology; Skagius, K.; Wiborgh, M. [Kemakta, Stockholm (Sweden)

    1994-11-01

In the period 1981-1994, SKB studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings of the joint SKI/SKB work on scenario development presented in SKB Technical Report 89-35. The methodologies studied are (a) Event tree analysis, (b) Influence diagrams and (c) Rock Engineering Systems (RES) matrices. Each methodology is explained in this report, along with examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely Influence diagrams and the RES methodology. In conclusion, a combination of parts of the Influence diagram and RES methodologies is likely to be a promising approach. 26 refs.

  15. Frescoed Vaults: Accuracy Controlled Simplified Methodology for Planar Development of Three-Dimensional Textured Models

    Directory of Open Access Journals (Sweden)

    Marco Giorgio Bevilacqua

    2016-03-01

Full Text Available In the field of documentation and preservation of cultural heritage, there is keen interest in 3D metric viewing and rendering of architecture, for both formal appearance and color. On the other hand, the operative steps of restoration interventions still require full-scale, 2D metric surface representations. The transition from 3D to 2D representation, with the related geometric transformations, has not yet been fully formalized for the planar development of frescoed vaults. Methodologies proposed so far on this subject transition from point cloud models to ideal mathematical surfaces and project textures using software tools. The methodology used for geometry and texture development in the present work does not require any dedicated software. The different processing steps can be individually checked for any error introduced, which can then be quantified. A direct accuracy check of the planar development of the frescoed surface, carried out by qualified restorers, yielded an accuracy of 3 mm. The proposed methodology, although requiring further studies to improve the automation of the different processing steps, allowed the extraction of 2D drafts fully usable by the operators restoring the vault frescoes.

  16. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

…) assessment of property model prediction errors; (iii) effect of outliers and data pre-treatment; (iv) formulation of the parameter estimation problem (e.g. weighted least squares, ordinary least squares, robust regression, etc.). In this study a comprehensive methodology is developed to perform a rigorous … ) weighted-least-squares regression. 3) Initialization of the estimation by use of linear algebra, providing a first guess. 4) Sequential parameter estimation and simultaneous GC parameter estimation using 4 different minimization algorithms. 5) Thorough uncertainty analysis: a) based on asymptotic approximation of parameter …
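The weighted-least-squares step and the asymptotic uncertainty analysis mentioned in this (truncated) record can be sketched for a generic group-contribution (GC) model, where a property is a linear combination of group counts. All group counts, measured values, and uncertainties below are invented for illustration:

```python
import numpy as np

# Hypothetical GC setup: property_i = sum_k N[i, k] * C[k],
# where N[i, k] counts occurrences of group k in molecule i.
N = np.array([[2, 1, 0],
              [1, 2, 1],
              [3, 0, 2],
              [0, 3, 1],
              [2, 2, 2]], dtype=float)        # group counts per molecule (invented)
y = np.array([4.9, 6.1, 8.2, 7.8, 10.1])      # "measured" property values (invented)
sigma = np.array([0.1, 0.2, 0.1, 0.3, 0.2])   # experimental uncertainties (invented)

W = np.diag(1.0 / sigma**2)                   # weights = 1 / variance
# Weighted least squares via the normal equations: C = (N^T W N)^-1 N^T W y
C = np.linalg.solve(N.T @ W @ N, N.T @ W @ y)
residuals = y - N @ C
cov = np.linalg.inv(N.T @ W @ N)              # asymptotic parameter covariance
print(C, residuals, np.sqrt(np.diag(cov)))
```

The diagonal of the covariance matrix gives the asymptotic parameter variances, which is the kind of first-order uncertainty estimate step 5a) of the record alludes to.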

  17. Cognitive models of executive functions development: methodological limitations and theoretical challenges

    Directory of Open Access Journals (Sweden)

    Florencia Stelzer

    2014-01-01

Full Text Available Executive functions (EF) have been defined as a series of higher-order cognitive processes which allow the control of thought, behavior and affect in pursuit of a goal. Such processes show a lengthy postnatal development, maturing completely only by the end of adolescence. In this article we review some of the main models of EF development during childhood. The aim of this work is to describe the state of the art on the topic, identifying the main theoretical difficulties and methodological limitations associated with the different proposed paradigms. Finally, some suggestions are given for coping with these difficulties, emphasizing that the development of an ontology of EF could be a viable alternative for countering them. We believe that future research should direct its efforts toward the development of that ontology.

  18. Topobathymetric elevation model development using a new methodology: Coastal National Elevation Database

    Science.gov (United States)

    Danielson, Jeffrey J.; Poppenga, Sandra K.; Brock, John C.; Evans, Gayla A.; Tyler, Dean; Gesch, Dean B.; Thatcher, Cindy A.; Barras, John

    2016-01-01

During the coming decades, coastlines will respond to widely predicted sea-level rise, storm surge, and coastal inundation flooding from disastrous events. Because physical processes in coastal environments are controlled by the geomorphology of over-the-land topography and underwater bathymetry, many applications of geospatial data in coastal environments require detailed knowledge of the near-shore topography and bathymetry. In this paper, an updated methodology used by the U.S. Geological Survey Coastal National Elevation Database (CoNED) Applications Project is presented for developing coastal topobathymetric elevation models (TBDEMs) from multiple topographic data sources together with adjacent intertidal topobathymetric and offshore bathymetric sources, to generate seamlessly integrated TBDEMs. This repeatable, updatable, and logically consistent methodology assimilates topographic data (land elevation) and bathymetry (water depth) into a seamless coastal elevation model. Within the overarching framework, vertical datum transformations are standardized in a workflow that interweaves spatially consistent interpolation (gridding) techniques with a land/water boundary mask delineation approach. Output gridded raster TBDEMs are stacked into a file storage system of mosaic datasets within an Esri ArcGIS geodatabase for efficient updating, while maintaining current and updated spatially referenced metadata. Topobathymetric data provide a required seamless elevation product for several science application studies, such as shoreline delineation, coastal inundation mapping, sediment transport, sea-level rise, storm surge modeling, and tsunami impact assessment. These detailed coastal elevation data are critical for depicting regions prone to climate change impacts and are essential to planners and managers responsible for mitigating the associated risks and costs to both human communities and ecosystems. The CoNED methodology has been used to construct integrated TBDEM models.
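The core merging idea in this abstract — preferring topographic elevations over land and bathymetric depths over water, delineated by a land/water mask — reduces to simple array logic once both sources share a vertical datum. The toy 1-D transect below is invented, and the shared-datum assumption is exactly the step the CoNED workflow standardizes beforehand:

```python
import numpy as np

# Toy transect: land DEM (NaN over water) and bathymetry (NaN over land),
# both assumed already transformed to a common vertical datum.
topo  = np.array([5.0, 3.2, 1.1, np.nan, np.nan, np.nan])   # land elevations (m)
bathy = np.array([np.nan, np.nan, np.nan, -0.8, -2.5, -6.0])  # water depths as negative elevations (m)

# Land/water mask: take topography where it exists, bathymetry elsewhere.
tbdem = np.where(~np.isnan(topo), topo, bathy)
print(tbdem)
```

A production workflow adds datum transformation, gridding/interpolation, and seam feathering around the land/water boundary; the mask-driven selection above is only the final assimilation step in miniature.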

  19. Development of a system dynamics model based on Six Sigma methodology

    Directory of Open Access Journals (Sweden)

    José Jovani Cardiel Ortega

    2017-01-01

Full Text Available A dynamic model is proposed to analyze the complexity associated with manufacturing systems and to improve process performance through the Six Sigma philosophy. The research focuses on using the system dynamics tool to support each phase of the DMAIC methodology. In the first phase, define, the problem is articulated by collecting data, selecting the variables, and representing them in a mental map that helps build the dynamic hypothesis. In the second phase, measure, the model is formulated, equations are developed, and a Forrester diagram is constructed to carry out the simulation. In the third phase, analyze, the simulation results are studied. In the fourth phase, improve, the model is validated through a sensitivity analysis. Finally, in the control phase, operation policies are proposed. This paper presents the development of a dynamic model of a knitted textile production system; the implementation was carried out in a textile company in southern Guanajuato. The results show an improvement in process performance by increasing the sigma level, validating the proposed approach.
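The system dynamics machinery behind a Forrester diagram is stock-and-flow integration. The minimal sketch below simulates one invented stock (work-in-progress inventory) fed by a production rate and drained by a defect-adjusted shipping rate, using Euler integration as system dynamics tools do; none of the rates or the defect fraction come from the paper's textile model:

```python
# Minimal stock-and-flow sketch (illustrative, not the paper's model):
# a WIP stock integrates the net of its inflow and outflow over time.
def simulate(weeks=30, dt=0.25, production=100.0, defect_rate=0.05):
    wip, shipped, t = 0.0, 0.0, 0.0
    history = []
    while t < weeks:
        # Outflow capped by available stock and by defect-free capacity.
        shipping = min(wip / dt, production * (1 - defect_rate))
        wip += (production - shipping) * dt     # stock integrates net flow (Euler step)
        shipped += shipping * dt
        history.append((t, wip, shipped))
        t += dt
    return history

hist = simulate()
final_t, final_wip, final_shipped = hist[-1]
print(final_wip, final_shipped)
```

Even this toy model shows the DMAIC leverage point: because defects throttle the outflow below the inflow, WIP accumulates steadily, and reducing `defect_rate` (raising the sigma level) is what lets the stock stabilize.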

  20. Safeguards methodology development history

    International Nuclear Information System (INIS)

    Chapman, L.D.; Bennett, H.A.; Engi, D.; Grady, L.M.; Hulme, B.L.; Sasser, D.W.

    1979-01-01

The development of models for the evaluation and design of physical protection systems for fixed-site nuclear facilities was under way in 1974 at Sandia Laboratories and has continued to the present. A history of the evolution of these models, together with model descriptions, is presented. Several models have been, and continue to be, applied to evaluate and design facility protection systems

  1. Scenario Methodology for Modelling of Future Landscape Developments as Basis for Assessing Ecosystem Services

    Directory of Open Access Journals (Sweden)

    Matthias Rosenberg

    2014-04-01

Full Text Available The ecosystems of our intensively used European landscapes produce a variety of natural goods and services for the benefit of humankind, and secure the basics and quality of life. Because these ecosystems are still undergoing fundamental changes, society has an interest in knowing more about future developments and their ecological impacts. To describe and analyze these changes, scenarios can be developed and an assessment of the ecological changes carried out subsequently. In the project “Landscape Saxony 2050”, a methodology for the construction of exploratory scenarios was worked out. The presented methodology provides a means to identify the driving forces (socio-cultural, economic and ecological conditions) of landscape development. It indicates possible future paths which lead to a change of structures and processes in the landscape and can influence the capability to provide ecosystem services. One essential component of the applied technique is that an approach for assessing the effects of landscape changes on ecosystem services is integrated into the scenario methodology. Another is that the methodology is strongly participatory, i.e. stakeholders are actively involved. The method is a seven-phase model which allows stakeholder participation to be integrated at all levels of scenario development. The scenario framework was applied to the district of Görlitz, an area of 2100 sq km at the eastern border of Germany. The region is affected by strong demographic as well as economic changes. The core issue was the examination of landscape change in terms of biodiversity. Together with stakeholders, a trend scenario and two alternative scenarios were developed. The changes of the landscape structure are represented in storylines, maps and tables. On the basis of the driving forces of the issue areas “cultural / social values” and …

  2. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States); Denning, Richard [The Ohio State Univ., Columbus, OH (United States); Catalyurek, Umit [The Ohio State Univ., Columbus, OH (United States); Unwin, Stephen [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  3. Methodologies Related to Computational models in View of Developing Anti-Alzheimer Drugs: An Overview.

    Science.gov (United States)

    Baheti, Kirtee; Kale, Mayura Ajay

    2018-04-17

Over the last two decades, there has been an increasing focus on development strategies in anti-Alzheimer drug research. This may be attributed to the fact that the causes of most Alzheimer's cases are still unknown, except for the few cases where genetic differences have been identified. As the disease progresses, the symptoms involve intellectual deterioration, memory impairment, abnormal personality and behavioural patterns, confusion, aggression, mood swings and irritability. Current therapies available for this disease give only symptomatic relief and do not target biomolecular processes. Nearly all therapies to treat Alzheimer's disease aim to alter the amyloid cascade, which is considered to be important in AD pathogenesis. New drug regimens are not able to keep pace with the ever-increasing understanding of dementia at the molecular level. In view of these aggravated problems, we put forward molecular modeling as a drug discovery approach for developing novel drugs to treat Alzheimer's disease. The disease is incurable; it worsens as it advances and finally causes death. For this reason, the design of drugs to treat this disease has become an utmost priority for research. One of the most important emerging technologies applied here is computer-assisted drug design (CADD), a research tool that employs large-scale computing strategies in an attempt to develop a model receptor site which can be used for designing an anti-Alzheimer drug. Various models of amyloid-based calcium channels have been computationally optimized. Docking and de novo evolution are used to design the compounds, which are further subjected to absorption, distribution, metabolism, excretion and toxicity (ADMET) studies to finally arrive at active compounds able to cross the BBB. Many novel compounds have been designed which might be promising for the treatment of AD. The present review describes the research …

  4. Epilepsy Therapy Development: Technical and Methodological Issues in Studies with Animal Models

    Science.gov (United States)

    Galanopoulou, Aristea S.; Kokaia, Merab; Loeb, Jeffrey A.; Nehlig, Astrid; Pitkänen, Asla; Rogawski, Michael A.; Staley, Kevin J.; Whittemore, Vicky H.; Dudek, F. Edward

    2013-01-01

    SUMMARY The search for new treatments for seizures, epilepsies and their comorbidities faces considerable challenges. Partly, this is due to gaps in our understanding of the etiology and pathophysiology of most forms of epilepsy. An additional challenge is the difficulty of predicting the efficacy, tolerability and impact of potential new treatments on epilepsies and comorbidities in humans, using the available resources. Here we provide a summary of the discussions and proposals of Working Group 2 as presented in the Joint American Epilepsy Society and International League Against Epilepsy Translational Workshop in London (September 2012). We propose methodological and reporting practices that will enhance the uniformity, reliability and reporting of early stage preclinical studies with animal seizure and epilepsy models that aim to develop and evaluate new therapies for seizures or epilepsies, using multi-disciplinary approaches. The topics considered include: (a) implementation of better study design and reporting practices, (b) incorporation in the study design and analysis of covariates that may impact outcomes (including species, age, sex), (c) utilization of approaches to document target relevance, exposure and engagement by the tested treatment, (d) utilization of clinically relevant treatment protocols, (e) optimization of the use of video-EEG recordings to best meet the study goals, and (f) inclusion of outcome measures that address the tolerability of the treatment or study endpoints apart from seizures. We further discuss the different expectations for studies aiming to meet regulatory requirements to obtain approval for clinical testing in humans. Implementation of the rigorous practices discussed in this report will require considerable investment in time, funds and other research resources, which may create challenges for academic researchers seeking to contribute to epilepsy therapy discovery and development. We propose several infrastructure

  5. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    OpenAIRE

    Ludfi Pratiwi Bowo; Wanginingastuti Mutmainnah; Masao Furusho

    2017-01-01

    Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors in accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed; they are classified into a first and a second generation according to their problem-solving viewpoints. The accident analysis can be determined using three techniques of analysis; sequen...

  6. Development of a cross-section methodology and a real-time core model for VVER-1000 simulator application

    Energy Technology Data Exchange (ETDEWEB)

    Georgieva, Emiliya Lyudmilova

    2016-06-06

    The novel academic contributions are summarized as follows. A) A cross-section modelling methodology and a cycle-specific cross-section update procedure are developed to meet fidelity requirements applicable to a cycle-specific reactor core simulation, as well as particular customer needs and practices supporting VVER-1000 operation and safety. B) A real-time version of the Nodal Expansion Method code is developed and implemented into the Kozloduy 6 full-scope replica control room simulator.

  7. Development of a new damage function model for power plants: Methodology and applications

    International Nuclear Information System (INIS)

    Levy, J.I.; Hammitt, J.K.; Yanagisawa, Y.; Spengler, J.D.

    1999-01-01

    Recent models have estimated the environmental impacts of power plants, but differences in assumptions and analytical methodologies have led to diverging findings. In this paper, the authors present a new damage function model that synthesizes previous efforts and refines components that have been associated with variations in impact estimates. Their model focuses on end-use emissions and quantifies the direct human health impacts of criteria air pollutants. To compare their model to previous efforts and to evaluate potential policy applications, the authors assess the impacts of an oil and natural gas-fueled cogeneration power plant in Boston, MA. Impacts under baseline assumptions are estimated to be $0.007/kWh of electricity, $0.23/klb of steam, and $0.004/ton-h of chilled water (representing 2--9% of the market value of outputs). Impacts are largely related to ozone (48%) and particulate matter (42%). Addition of upstream emissions and nonpublic health impacts increases externalities by as much as 50%. Sensitivity analyses demonstrate the importance of plant siting, meteorological conditions, epidemiological assumptions, and the monetary value placed on premature mortality, as well as the potential influence of global warming. Comparative analyses demonstrate that their model provides reasonable impact estimates and would therefore be applicable in a broad range of policy settings
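
    The damage-function chain described above can be sketched end to end: emissions are mapped to ambient concentrations, concentrations to health cases, and cases to dollars. Every coefficient below is a hypothetical placeholder for illustration, not a value from the study.

```python
# Illustrative end-to-end damage-function chain for one plant:
# emissions -> ambient concentration -> health cases -> monetary damages.
# Every coefficient here is a hypothetical placeholder, not a study value.

emissions_tons = {"PM2.5": 120.0, "NOx": 450.0}    # annual emissions (tons)
dispersion = {"PM2.5": 2.0e-3, "NOx": 1.2e-3}      # ug/m3 per ton emitted
dose_response = {"PM2.5": 8.0e-6, "NOx": 1.5e-6}   # cases/person per ug/m3
population = 2.0e6                                 # exposed population
value_per_case = 5.0e3                             # $ per health case

def damages(pollutant):
    """Dollar damages attributable to one pollutant."""
    conc = emissions_tons[pollutant] * dispersion[pollutant]
    cases = conc * dose_response[pollutant] * population
    return cases * value_per_case

total = sum(damages(p) for p in emissions_tons)
kwh_per_year = 5.0e8
print(f"externality: ${total / kwh_per_year:.6f}/kWh")
```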

  8. Methodological Development of the Probabilistic Model of the Safety Assessment of Hontomin P.D.T

    International Nuclear Information System (INIS)

    Hurtado, A.; Eguilior, S.; Recreo, F.

    2011-01-01

    In the framework of CO2 Capture and Geological Storage, Risk Analysis plays an important role, because it is an essential requirement for the local, national and supranational definition and planning of carbon injection strategies. This is because each project is at risk of failure. Even from the early stages, a project should take into account the possible causes of this risk and propose corrective methods along the process, i.e., it should manage risk. Proper risk management reduces the negative consequences arising from the project. The main method of reducing or neutralizing risk is the identification, measurement and evaluation of it, together with the development of decision rules. This report presents the developed methodology for risk analysis and the results of its application. The risk assessment requires determination of the random variables that will influence the functioning of the system. It is very difficult to set up the probability distribution of a random variable in the classical sense (objective probability) when a particular event has rarely occurred or is still incompletely understood. In this situation, we have to determine the subjective probability, especially at an early stage of a project, when not enough information about the system is available. This subjective probability is constructed from expert judgement estimating the possibility that certain random events could happen, depending on the geological features of the area of application. The proposed methodology is based on the application of Bayesian Probabilistic Networks for estimating the probability of leakage risk. These probabilistic networks graphically define the dependence relations between the variables and the joint probability function through a local factorization of probability functions. (Author) 98 refs.
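
    The local factorization that these probabilistic networks rely on can be shown with a toy example. The variables, priors and conditional probabilities below are hypothetical, not taken from the Hontomin assessment.

```python
# Minimal Bayesian-network sketch for leakage risk with three hypothetical
# binary variables: fault presence F, seal integrity S, leak L.
# The joint factorizes locally: P(F, S, L) = P(F) * P(S) * P(L | F, S).

P_F = {True: 0.2, False: 0.8}            # prior: geological fault present
P_S = {True: 0.9, False: 0.1}            # prior: caprock seal intact
P_L_given = {                            # CPT: leak given (fault, seal)
    (True, True): 0.05, (True, False): 0.60,
    (False, True): 0.01, (False, False): 0.30,
}

def joint(f, s, l):
    """Joint probability via the local factorization."""
    p_l = P_L_given[(f, s)]
    return P_F[f] * P_S[s] * (p_l if l else 1.0 - p_l)

# Marginal probability of a leak, summing out F and S.
p_leak = sum(joint(f, s, True) for f in (True, False) for s in (True, False))

# Posterior probability that a fault is present, given an observed leak.
p_fault_given_leak = sum(joint(True, s, True) for s in (True, False)) / p_leak
print(round(p_leak, 4), round(p_fault_given_leak, 4))
```

The same factorization scales to many variables; inference then sums the product of local tables instead of one intractable joint table.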

  9. Development Customer Knowledge Management (Ckm) Models in Purbalingga Hospitality Using Soft Systems Methodology (Ssm)

    OpenAIRE

    Chasanah, Nur; Sensuse, Dana Indra; Lusa, Jonathan Sofian

    2014-01-01

    Development of the tourism sector is part of the national development efforts being implemented in Indonesia. This research was conducted to produce an overview of customer knowledge management models to address the existing problems in the hospitality sector in Purbalingga, which supports Purbalingga tourism. The model depicts a series of problem-solving activities for the hospitality sector, especially in Purbalingga. This research was action research with methods of S...

  10. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and the software development lifecycle. The ISDF methodology was built especially for innovative software development projects and was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user, project owner and project manager's points of view. The main components of a software development methodology are identified. Thus a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation-oriented software development methodology is emphasized by highlighting the shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  11. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application in lending operations in the Republic of Serbia. With the developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, carried out in order to minimize and manage credit risk. In examining the subject matter, the processing of the credit application is described, along with the procedure for analyzing the financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework is presented as applied in a concrete company. In the third part, models are presented which banks should use to protect against risk exposure, i.e. to reduce losses on loan operations in our country, as well as to adjust to market conditions in an optimal way.

  12. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.; Price, Phillip N. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
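
    The train/predict/score loop described above can be sketched with a synthetic building whose daily consumption depends linearly on outdoor temperature; NMBE and CV(RMSE) are two accuracy metrics commonly used in M&V. The model and data here are illustrative, not from the 29-building study.

```python
import numpy as np

# Synthetic "building": daily energy depends linearly on outdoor temperature.
rng = np.random.default_rng(0)
temp = rng.uniform(0.0, 30.0, 365)
energy = 50.0 + 2.5 * temp + rng.normal(0.0, 5.0, 365)   # meter data (kWh/day)

fit_idx, pred_idx = slice(0, 180), slice(180, 365)  # training / prediction periods

# Fit a simple linear baseline model on the training period...
coef = np.polyfit(temp[fit_idx], energy[fit_idx], 1)
# ...and predict consumption over the prediction period.
pred = np.polyval(coef, temp[pred_idx])
obs = energy[pred_idx]

# Two widely used accuracy metrics: normalized mean bias error and CV(RMSE).
nmbe = 100.0 * (obs - pred).sum() / (len(obs) * obs.mean())
cvrmse = 100.0 * np.sqrt(((obs - pred) ** 2).mean()) / obs.mean()
print(f"NMBE = {nmbe:.2f}%  CV(RMSE) = {cvrmse:.2f}%")
```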

  13. Physiologically Based Pharmacokinetic Modeling: Methodology, Applications, and Limitations with a Focus on Its Role in Pediatric Drug Development

    Directory of Open Access Journals (Sweden)

    Feras Khalil

    2011-01-01

    Full Text Available The concept of physiologically based pharmacokinetic (PBPK) modeling was introduced years ago, but it has not been practiced significantly. However, interest in and implementation of this modeling technique have grown, as evidenced by the increased number of publications in this field. This paper briefly demonstrates the methodology, applications, and limitations of PBPK modeling, with special attention to the use of PBPK models in pediatric drug development; some examples are described in detail. Although PBPK models do have some limitations, the potential benefit of the PBPK modeling technique is huge. PBPK models can be applied to investigate drug pharmacokinetics under different physiological and pathological conditions or in different age groups, to support decision-making during drug discovery, to provide data that can save time and resources (perhaps most importantly in early drug development phases and in pediatric clinical trials), and potentially to help clinical trials become more “confirmatory” rather than “exploratory”.
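
    The ODE machinery behind PBPK models can be illustrated with a deliberately minimal flow-limited sketch: one blood and one tissue compartment. A real PBPK model has many organ compartments with physiologically measured parameters; all values below are hypothetical, not drawn from any published pediatric model.

```python
# Minimal flow-limited two-compartment sketch (blood + one tissue) showing
# the ODE structure behind PBPK models. All parameter values are hypothetical.

V_b, V_t = 5.0, 30.0       # blood and tissue volumes (L)
Q = 90.0                   # tissue blood flow (L/h)
Kp = 4.0                   # tissue:blood partition coefficient
CL = 10.0                  # clearance from blood (L/h)
dose = 100.0               # IV bolus dose (mg)

C_b, C_t = dose / V_b, 0.0
dt, t_end = 0.001, 12.0
for _ in range(int(t_end / dt)):        # explicit Euler integration
    flux = Q * (C_b - C_t / Kp)         # blood -> tissue exchange (mg/h)
    C_b += (-flux - CL * C_b) / V_b * dt
    C_t += flux / V_t * dt

print(f"C_blood(12 h) = {C_b:.3f} mg/L, C_tissue(12 h) = {C_t:.3f} mg/L")
```

Scaling such a model to a different age group amounts to swapping in age-appropriate volumes, flows and clearances, which is the core of the pediatric applications described above.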

  14. Methodologies for Development of Patient Specific Bone Models from Human Body CT Scans

    Science.gov (United States)

    Chougule, Vikas Narayan; Mulay, Arati Vinayak; Ahuja, Bharatkumar Bhagatraj

    2016-06-01

    This work deals with the development of an algorithm for physical replication of patient-specific human bones and construction of corresponding implant/insert RP models using a Reverse Engineering approach from non-invasive medical images for surgical purposes. In the medical field, volumetric data, i.e. voxel- and triangular-facet-based models, are primarily used for bio-modelling and visualization, which requires huge memory space. On the other hand, recent advances in Computer Aided Design (CAD) technology provide additional facilities/functions for design, prototyping and manufacturing of any object having freeform surfaces based on boundary representation techniques. This work presents a process for physical replication of 3D rapid prototyping (RP) models of human bone using various CAD modeling techniques developed from 3D point cloud data obtained from non-invasive CT/MRI scans in DICOM 3.0 format. This point cloud data is used for construction of a 3D CAD model by fitting B-spline curves through these points and then fitting surfaces between these curve networks using swept blend techniques. This can also be achieved by generating the triangular mesh directly from the 3D point cloud data, without developing any surface model using any commercial CAD software. The STL file generated from the 3D point cloud data is used as the basic input for the RP process. The Delaunay tetrahedralization approach is used to process the 3D point cloud data to obtain the STL file. CT scan data of a metacarpus (human bone) is used as the case study for the generation of the 3D RP model. A 3D physical model of the human bone is generated on a rapid prototyping machine and its virtual reality model is presented for visualization. The CAD models generated by the different techniques are compared for accuracy and reliability. The results of this research work are assessed for clinical reliability in replication of human bone in the medical field.
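
    The final step of this pipeline, turning triangular facets into an STL file for the RP machine, can be sketched directly: the binary STL layout is an 80-byte header, a facet count, and 50 bytes per triangle. The four-triangle "mesh" below is a stand-in for a tessellated bone surface, not output from the paper's algorithm.

```python
import struct

# Toy facet list: a tetrahedron standing in for a tessellated bone surface.
points = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]      # vertex coordinates
facets = [(0, 2, 1), (0, 1, 3), (1, 2, 3), (0, 3, 2)]      # outward triangles

def normal(a, b, c):
    """Unit normal of triangle (a, b, c) via the cross product."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    n = (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])
    length = sum(x * x for x in n) ** 0.5
    return tuple(x / length for x in n)

def write_stl(path, pts, tris):
    """Binary STL: 80-byte header, uint32 facet count, 50 bytes per facet."""
    with open(path, "wb") as f:
        f.write(b"\0" * 80)
        f.write(struct.pack("<I", len(tris)))
        for i, j, k in tris:
            a, b, c = pts[i], pts[j], pts[k]
            # 12 little-endian floats (normal + 3 vertices) + attribute count
            f.write(struct.pack("<12fH", *normal(a, b, c), *a, *b, *c, 0))

write_stl("bone_sketch.stl", points, facets)
```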

  15. Developing a Validation Methodology for Expert-Informed Bayesian Network Models Supporting Nuclear Nonproliferation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    White, Amanda M.; Gastelum, Zoe N.; Whitney, Paul D.

    2014-05-13

    Under the auspices of Pacific Northwest National Laboratory’s Signature Discovery Initiative (SDI), the research team developed a series of Bayesian Network models to assess multi-source signatures of nuclear programs. A Bayesian network is a mathematical model that can be used to marshal evidence to assess competing hypotheses. The purpose of the models was to allow non-expert analysts to benefit from the use of expert-informed mathematical models to assess nuclear programs, because such assessments require significant technical expertise ranging from the nuclear fuel cycle, construction and engineering, imagery analysis, and so forth. One such model developed under this research was aimed at assessing the consistency of open-source information about a nuclear facility with the facility’s declared use. The model incorporates factors such as location, security and safety features among others identified by subject matter experts as crucial to their assessments. The model includes key features, observables and their relationships. The model also provides documentation, which serves as training materials for the non-experts.
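
    The way such a model marshals evidence for competing hypotheses can be sketched with a naive-Bayes toy example. The hypotheses, observables and likelihoods below are invented for illustration and are not from the PNNL models.

```python
# Toy evidence-marshaling sketch: two competing hypotheses about a facility,
# weighed by independent observables. All probabilities are hypothetical.

priors = {"declared_use": 0.7, "undeclared_use": 0.3}

# P(observable | hypothesis), with observables assumed independent.
likelihoods = {
    "heavy_security":  {"declared_use": 0.30, "undeclared_use": 0.80},
    "remote_location": {"declared_use": 0.40, "undeclared_use": 0.70},
}

def posterior(observed):
    """Posterior over the hypotheses given a list of observed features."""
    score = {}
    for h, prior in priors.items():
        p = prior
        for obs in observed:
            p *= likelihoods[obs][h]
        score[h] = p
    total = sum(score.values())
    return {h: p / total for h, p in score.items()}

post = posterior(["heavy_security", "remote_location"])
print(post)
```

A full Bayesian network additionally encodes dependencies among the observables, which naive multiplication ignores.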

  16. MODELS OF THE 5 PORTERS COMPETITIVE FORCES METHODOLOGY CHANGES IN COMPANIES STRATEGY DEVELOPMENT ON COMPETITIVE MARKET

    Directory of Open Access Journals (Sweden)

    Sergey I Zubin

    2014-01-01

    Full Text Available This article considers several approaches to developing Porter's Five Forces model. The authors take up researchers' reasons for a negative attitude toward this instrument and introduce changes to it that can help companies find the best way to grow in a competitive market.

  17. Analysis of methodology for designing education and training model for professional development in the field of radiation technology

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kon Wuk; Lee, Jae Hun; Park, Tai Jin; Song, Myung Jae [Korean Association for Radiation Application, Seoul (Korea, Republic of)

    2015-02-15

    The domestic Radiation Technology is integrated into and utilized in various areas and is closely related to the industrial growth in Korea. The domestic use of radiation and RI (Radioisotope) increases in quantity every year, however the level of technology is poor when compared to other developed countries. Manpower training is essential for the development of Radiation Technology. Therefore, this study aimed to propose a methodology for designing systemic education and training model in the field of measurement and analysis of radiation. A survey was conducted to design education and training model and the training program for measurement and analysis of radiation was developed based on the survey results. The education and training program designed in this study will be utilized as a model for evaluating the professional development and effective recruitment of the professional workforce, and can be further applied to other radiation-related fields.

  18. Analysis of methodology for designing education and training model for professional development in the field of radiation technology

    International Nuclear Information System (INIS)

    Kim, Kon Wuk; Lee, Jae Hun; Park, Tai Jin; Song, Myung Jae

    2015-01-01

    The domestic Radiation Technology is integrated into and utilized in various areas and is closely related to the industrial growth in Korea. The domestic use of radiation and RI (Radioisotope) increases in quantity every year, however the level of technology is poor when compared to other developed countries. Manpower training is essential for the development of Radiation Technology. Therefore, this study aimed to propose a methodology for designing systemic education and training model in the field of measurement and analysis of radiation. A survey was conducted to design education and training model and the training program for measurement and analysis of radiation was developed based on the survey results. The education and training program designed in this study will be utilized as a model for evaluating the professional development and effective recruitment of the professional workforce, and can be further applied to other radiation-related fields

  19. Sensitivity analysis and development of calibration methodology for near-surface hydrogeology model of Laxemar

    International Nuclear Information System (INIS)

    Aneljung, Maria; Sassner, Mona; Gustafsson, Lars-Goeran

    2007-11-01

    This report describes modelling where the hydrological modelling system MIKE SHE has been used to describe surface hydrology, near-surface hydrogeology, advective transport mechanisms, and the contact between groundwater and surface water within the SKB site investigation area at Laxemar. In the MIKE SHE system, surface water flow is described with the one-dimensional modelling tool MIKE 11, which is fully and dynamically integrated with the groundwater flow module in MIKE SHE. In early 2008, a supplementary data set will be available and a process of updating, rebuilding and calibrating the MIKE SHE model based on this data set will start. Before the calibration on the new data begins, it is important to gather as much knowledge as possible on calibration methods, and to identify critical calibration parameters and areas within the model that require special attention. In this project, the MIKE SHE model has been further developed. The model area has been extended, and the present model also includes an updated bedrock model and a more detailed description of the surface stream network. The numerical model has been updated and optimized, especially regarding the modelling of evapotranspiration and the unsaturated zone, and the coupling between the surface stream network in MIKE 11 and the overland flow in MIKE SHE. An initial calibration has been made and a base case has been defined and evaluated. In connection with the calibration, the most important changes made in the model were the following: The evapotranspiration was reduced. The infiltration capacity was reduced. The hydraulic conductivities of the Quaternary deposits in the water-saturated part of the subsurface were reduced. Data from one surface water level monitoring station, four surface water discharge monitoring stations and 43 groundwater level monitoring stations (SSM series boreholes) have been used to evaluate and calibrate the model. The base case simulations showed a reasonable agreement
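
    Calibration runs like these are typically scored against the monitoring data with a goodness-of-fit measure; one common choice in hydrology is the Nash-Sutcliffe efficiency. The report does not specify its criteria, so this is only an illustrative sketch with made-up water levels.

```python
# Nash-Sutcliffe efficiency: NSE = 1 is a perfect fit; NSE <= 0 means the
# model is no better than simply predicting the observed mean.

def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

# Hypothetical groundwater levels (m) at one monitoring borehole.
obs = [2.1, 2.3, 2.6, 2.4, 2.0, 1.8]
sim = [2.0, 2.4, 2.5, 2.5, 2.1, 1.7]
print(f"NSE = {nse(obs, sim):.3f}")
```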

  20. A flexible hydrological modelling system developed using an object oriented methodology

    Energy Technology Data Exchange (ETDEWEB)

    Rinde, Trond

    1998-12-31

    The report presents a software system called Process Integrating Network (PINE). The capabilities, working principles, programming technical design and principles of use of the system are described as are some practical applications. PINE is a simulation tool for modelling of hydrological and hydrologically related phenomena. The system is based on object oriented programming principles and was specially designed to provide freedom in the choice of model structures and algorithms for process descriptions. It supports full freedom with regards to spatial distribution and temporal resolution. Geographical information systems (GIS) may be integrated with PINE in order to provide full spatial distribution in system parametrisation, process simulation and visualisation of simulation results. Simulation models are developed by linking components for process description together in a structure. The system can handle compound working media such as water with chemical or biological constituents. Non-hydrological routines may then be included to describe the responses of such constituents. Features such as extensibility and reuse of program components are emphasised in the program design. Separation between process topology, process descriptions and process data facilitates simple and consistent implementation of components for process description. Such components may be automatically prototyped and their response functions may be implemented without knowledge of other parts of the program system and without the need to program import or export routines or a user interface. Model extension is thus a rapid process that does not require extensive programming skills. Components for process descriptions may further be placed in separate program libraries, which can be included in the program as required. The program system can thus be very compact while it still has a large number of process algorithms available. The system can run on both PC and UNIX platforms. 106 figs., 20
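
    The component-linking idea can be sketched in a few lines: process components share a common interface and are chained into a model structure, so a new process description can be added without touching the rest of the system. The class and method names below are illustrative, not PINE's actual API.

```python
# Toy version of the component-linking idea behind PINE: each process
# component implements a common interface and components are chained
# into a model structure. Names are illustrative, not PINE's actual API.

class Process:
    """Base class for process components: input flux in, output flux out."""
    def step(self, inflow):
        raise NotImplementedError

class SnowMelt(Process):
    """Passes on a fixed fraction of incoming precipitation as melt."""
    def __init__(self, melt_fraction):
        self.melt_fraction = melt_fraction
    def step(self, precipitation):
        return precipitation * self.melt_fraction

class SoilStore(Process):
    """Stores inflow up to a capacity; the excess leaves as runoff."""
    def __init__(self, capacity):
        self.storage, self.capacity = 0.0, capacity
    def step(self, inflow):
        self.storage += inflow
        runoff = max(0.0, self.storage - self.capacity)
        self.storage -= runoff
        return runoff

def run(chain, forcing):
    """Route each forcing value through the linked components in order."""
    out = []
    for value in forcing:
        for proc in chain:
            value = proc.step(value)
        out.append(value)
    return out

flows = run([SnowMelt(0.5), SoilStore(10.0)], [8.0, 8.0, 8.0, 8.0])
print(flows)
```

Adding, say, a chemistry component means writing one more subclass; the simulation loop and the other components are untouched, which is the extensibility property the abstract emphasizes.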

  1. Enviro-HIRLAM online integrated meteorology-chemistry modelling system: strategy, methodology, developments and applications (v7.2)

    Science.gov (United States)

    Baklanov, Alexander; Smith Korsholm, Ulrik; Nuterman, Roman; Mahura, Alexander; Pagh Nielsen, Kristian; Hansen Sass, Bent; Rasmussen, Alix; Zakey, Ashraf; Kaas, Eigil; Kurganskiy, Alexander; Sørensen, Brian; González-Aparicio, Iratxe

    2017-08-01

    The Environment - High Resolution Limited Area Model (Enviro-HIRLAM) is developed as a fully online integrated numerical weather prediction (NWP) and atmospheric chemical transport (ACT) model for research and forecasting of joint meteorological, chemical and biological weather. The integrated modelling system is developed by the Danish Meteorological Institute (DMI) in collaboration with several European universities. It is the baseline system in the HIRLAM Chemical Branch and is used in several countries and different applications. The development was initiated at DMI more than 15 years ago. The model is based on the HIRLAM NWP model with online integrated pollutant transport and dispersion, chemistry, aerosol dynamics, deposition and atmospheric composition feedbacks. To make the model suitable for chemical weather forecasting in urban areas, the meteorological part was improved by implementation of urban parameterisations. The dynamical core was improved by implementing a locally mass-conserving semi-Lagrangian numerical advection scheme, which improves forecast accuracy and model performance. The current version (7.2), in comparison with previous versions, has a more advanced and cost-efficient chemistry, an aerosol multi-compound approach, and aerosol feedbacks (direct and semi-direct) on radiation and (first and second indirect effects) on cloud microphysics. Since 2004, Enviro-HIRLAM has been used for different studies, including operational pollen forecasting for Denmark since 2009 and operational forecasting of atmospheric composition with downscaling for China since 2017. Following the main research and development strategy, further model developments will be extended towards the new NWP platform, HARMONIE. Different aspects of online coupling methodology, research strategy and possible applications of the modelling system, and fit-for-purpose model configurations for the meteorological and air quality communities are discussed.

  2. Quantitative Developments of Biomolecular Databases, Measurement Methodology, and Comprehensive Transport Models for Bioanalytical Microfluidics

    Science.gov (United States)

    2006-10-01

    …chemistry models (beads and surfaces)[38]; M11. Biochemistry database integrated with electrochemistry; M12. Hydrogel models for surface biochemistry[30]; M13. Least square-based engine for extraction of kinetic coefficients[38]; M14. Rapid ANN … bacteria and λ-phage DNA. This device relies on the balance between electroosmotic flow and DEP force on suspended particles. In another application…

  3. Development of trip coverage analysis methodology - CATHENA trip coverage analysis model

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H.; Huh, J. Y.; Na, Y. H.; Lee, S. Y.; Kim, B. G.; Kim, H. H.; Kim, S. W.; Bae, C. J.; Kim, T. M.; Kim, S. R.; Han, B. S.; Moon, B. J.; Oh, M. T. [Korea Power Engineering Co., Yongin (Korea)

    2001-05-01

    This report describes the CATHENA model for trip coverage analysis. The model is prepared based on the Wolsong 2 design data and consists of the primary heat transport system, shutdown system, steam and feedwater system, reactor regulating system, heat transport pressure and inventory control system, and steam generator level and pressure control system. The new features and parts modified from the Wolsong 2 CATHENA LOCA model required for trip coverage analysis are described. This model is tested by simulation of steady state at 100 % FP and at several low powers. The cases of power rundown and power runup are also tested. 17 refs., 124 figs., 19 tabs. (Author)

  4. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Andrade Almeida, João; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  5. Developing a Cost Model and Methodology to Estimate Capital Costs for Thermal Energy Storage

    Energy Technology Data Exchange (ETDEWEB)

    Glatzmaier, G.

    2011-12-01

    This report provides an update on the previous cost model for thermal energy storage (TES) systems. The update allows NREL to estimate the costs of such systems that are compatible with the higher operating temperatures associated with advanced power cycles. The goal of the Department of Energy (DOE) Solar Energy Technology Program is to develop solar technologies that can make a significant contribution to the United States domestic energy supply. The recent DOE SunShot Initiative sets a very aggressive cost goal to reach a Levelized Cost of Energy (LCOE) of 6 cents/kWh by 2020 with no incentives or credits for all solar-to-electricity technologies. As this goal is reached, the share of utility power generation that is provided by renewable energy sources is expected to increase dramatically. Because Concentrating Solar Power (CSP) is currently the only renewable technology that is capable of integrating cost-effective energy storage, it is positioned to play a key role in providing renewable, dispatchable power to utilities as the share of power generation from renewable sources increases. Because of this role, future CSP plants will likely have as much as 15 hours of Thermal Energy Storage (TES) included in their design and operation. As such, the cost and performance of the TES system is critical to meeting the SunShot goal for solar technologies. The cost of electricity from a CSP plant depends strongly on its overall efficiency, which is a product of two components - the collection and conversion efficiencies. The collection efficiency determines the portion of incident solar energy that is captured as high-temperature thermal energy. The conversion efficiency determines the portion of thermal energy that is converted to electricity. The operating temperature at which the overall efficiency reaches its maximum depends on many factors, including material properties of the CSP plant components. Increasing the operating temperature of the power generation
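
    The efficiency trade-off described above is easy to make concrete: overall efficiency is the product of collection and conversion efficiency, and raising the operating temperature typically lowers the former while raising the latter. The numbers below are illustrative, not from the NREL cost model.

```python
# Sketch of the efficiency product: overall solar-to-electric efficiency =
# collection efficiency * conversion efficiency. Values are illustrative.

def overall_efficiency(collection, conversion):
    return collection * conversion

# Higher operating temperature: more thermal losses in collection,
# but a more efficient power cycle in conversion.
low_temp = overall_efficiency(collection=0.60, conversion=0.35)
high_temp = overall_efficiency(collection=0.55, conversion=0.45)
print(f"low-T: {low_temp:.4f}  high-T: {high_temp:.4f}")
```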

  6. Development in methodologies for modelling of human and ecotoxic impacts in LCA

    DEFF Research Database (Denmark)

    Hauschild, Michael Zwicky; Huijbregts, Mark; Jolliet, Olivier

    2009-01-01

    harmonized in an iterative way removing those identified differences which were unintentional or unnecessary and thereby reducing the inter-model variation. A parsimonious (as simple as possible but as complex as needed) and transparent consensus model, USEtox™, was created containing only the most....... The USEtox™ model has been used to calculate characterization factors for several thousand substances and is currently under review with the intention that it shall form the basis of the recommendations from the UNEP-SETAC Life Cycle Initiative regarding characterization of toxic impacts in Life Cycle...

  7. Development and application of compact models of packages based on DELPHI methodology

    CERN Document Server

    Parry, J; Shidore, S

    1997-01-01

    The accurate prediction of the temperatures of critical electronic parts at the package-, board- and system-level is seriously hampered by the lack of reliable, standardised input data for the characterisation of the thermal behaviour of these parts. The recently completed collaborative European project, DELPHI, has been concerned with the creation and experimental validation of thermal models (both detailed and compact) of a range of electronic parts, including mono-chip packages. This paper demonstrates the reliable performance of thermal compact models in a range of applications, by comparison with the detailed models from which they were derived. (31 refs).

  8. Preliminary Results from a Model-Driven Architecture Methodology for Development of an Event-Driven Space Communications Service Concept

    Science.gov (United States)

    Roberts, Christopher J.; Morgenstern, Robert M.; Israel, David J.; Borky, John M.; Bradley, Thomas H.

    2017-01-01

    NASA's next generation space communications network will involve dynamic and autonomous services analogous to services provided by current terrestrial wireless networks. This architecture concept, known as the Space Mobile Network (SMN), is enabled by several technologies now in development. A pillar of the SMN architecture is the establishment and utilization of a continuous bidirectional control plane space link channel and a new User Initiated Service (UIS) protocol to enable more dynamic and autonomous mission operations concepts, reduced user space communications planning burden, and more efficient and effective provider network resource utilization. This paper provides preliminary results from the application of model driven architecture methodology to develop UIS. Such an approach is necessary to ensure systematic investigation of several open questions concerning the efficiency, robustness, interoperability, scalability and security of the control plane space link and UIS protocol.

  9. Sensitivity analysis and development of calibration methodology for near-surface hydrogeology model of Forsmark

    International Nuclear Information System (INIS)

    Aneljung, Maria; Gustafsson, Lars-Goeran

    2007-04-01

    The hydrological modelling system MIKE SHE has been used to describe near-surface groundwater flow, transport mechanisms and the contact between ground- and surface water at the Forsmark site. The surface water system at Forsmark is described with the 1D modelling tool MIKE 11, which is fully and dynamically integrated with MIKE SHE. In spring 2007, a new data freeze will be available and a process of updating, rebuilding and calibrating the MIKE SHE model will start, based on the latest data set. Prior to this, it is important to gather as much knowledge as possible on calibration methods and to define critical calibration parameters and areas within the model. In this project, an optimization of the numerical description and an initial calibration of the MIKE SHE model have been made, and an updated base case has been defined. Data from 5 surface water level monitoring stations, 4 surface water discharge monitoring stations and 32 groundwater level monitoring stations (SFM soil boreholes) have been used for model calibration and evaluation. The base case simulations generally show a good agreement between calculated and measured water levels and discharges, indicating that the total runoff from the area is well described by the model. Moreover, with two exceptions (SFM0012 and SFM0022) the base case results show very good agreement between calculated and measured groundwater head elevations for boreholes installed below lakes. The model also shows a reasonably good agreement between calculated and measured groundwater head elevations or depths to phreatic surfaces in many other points. The following major types of calculation-measurement differences can be noted: differences in groundwater level amplitudes, due to transpiration processes; differences in absolute mean groundwater head, due to differences between borehole casing levels and the interpolated DEM; and differences in absolute mean head elevations, due to local errors in hydraulic conductivity values

  10. Development of a fluidized bed agglomeration modeling methodology to include particle-level heterogeneities in ash chemistry and granular physics

    Science.gov (United States)

    Khadilkar, Aditi B.

    The utility of fluidized bed reactors for combustion and gasification can be enhanced if operational issues such as agglomeration are mitigated. The monetary and efficiency losses could be avoided through a mechanistic understanding of the agglomeration process and prediction of operational conditions that promote agglomeration. Pilot-scale experimentation prior to operation for each specific condition can be cumbersome and expensive, so the development of a mathematical model would aid predictions. With this motivation, the study comprised the following model development stages: 1) development of an agglomeration modeling methodology based on binary particle collisions, 2) study of heterogeneities in ash chemical composition and gaseous atmosphere, 3) computation of a distribution of particle collision frequencies based on granular physics for a poly-disperse particle size distribution, 4) combining the ash chemistry and granular physics inputs to obtain agglomerate growth probabilities and 5) validation of the modeling methodology. The modeling methodology comprised testing every binary particle collision in the system for sticking, based on the extent of dissipation of the particles' kinetic energy through viscous dissipation by slag-liquid (molten ash) covering the particles. In the modeling methodology developed in this study, thermodynamic equilibrium calculations are used to estimate the amount of slag-liquid in the system, and the changes in particle collision frequencies are accounted for by continuously tracking the number density of the various particle sizes. In this study, the heterogeneities in chemical composition of fuel ash were studied by separating the bulk fuel into particle classes that are rich in specific minerals. FactSage simulations were performed on two bituminous coals and an anthracite to understand the effect of particle-level heterogeneities on agglomeration.
The mineral matter behavior of these constituent classes was studied
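
    The binary-collision sticking test described above, in which particles agglomerate when their kinetic energy is dissipated by the viscous molten-slag layer, can be sketched with the classic viscous Stokes number criterion of Ennis et al. This is a stand-in for the study's own energy-dissipation model, and all material property values below are hypothetical.

```python
import math

# Sketch of a binary-collision sticking test using the classic viscous
# Stokes number criterion (Ennis et al.), a stand-in for the study's own
# energy-dissipation model. All property values are hypothetical.

def viscous_stokes(rho_p, radius, rel_velocity, slag_viscosity):
    """St = 8*rho_p*r*u / (9*mu): collision inertia vs. viscous dissipation."""
    return 8.0 * rho_p * radius * rel_velocity / (9.0 * slag_viscosity)

def critical_stokes(restitution, layer_thickness, asperity_height):
    """St* = (1 + 1/e) * ln(h / h_a): dissipation capacity of the slag layer."""
    return (1.0 + 1.0 / restitution) * math.log(layer_thickness / asperity_height)

def collision_sticks(st, st_crit):
    """Particles agglomerate when inertia is fully dissipated (St < St*)."""
    return st < st_crit

st = viscous_stokes(1500.0, 1e-4, 0.5, 1.0)   # slag-covered ash particle
st_c = critical_stokes(0.8, 1e-5, 1e-7)       # 10 um slag layer, 0.1 um asperities
print(collision_sticks(st, st_c))             # prints True
```

    Applying such a test to every collision, weighted by the granular-physics collision frequencies, is the kind of per-collision bookkeeping the methodology describes.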

  11. The development of an interactionist evaluation methodology

    Science.gov (United States)

    Johnston, Jane Susan

    This is an account of the development of an evaluation methodology termed Interactionist Evaluation. Interactionist Evaluation was developed to effectively evaluate the quality of in-service science courses within Trent Polytechnic (now The Nottingham Trent University) and to ascertain their long term impact on the development of primary science within schools. The evaluation methodology was influenced by the complex interactions with and within schools and the in-service courses and by other qualitative evaluation models. Its development attempted to encompass the needs and difficulties of course evaluation as experienced during the initial evaluations. It represents a novel form of evaluation, not described in the literature, and extends the possibilities of course evaluation. In Interactionist Evaluation the evaluator is committed to the aims of the course being evaluated and participates in the course to establish good working relationships with course members. Subsequent interaction in the school context supports the aims of the course in relation to teacher and child development and attempts to enhance the quality of both by observing the teacher in action and engaging them and other staff in educational conversation. It is the form and intentions of this interaction which establishes Interactionist Evaluation as a distinct evaluation methodology. It recognises three different forms of interaction and uses interaction in a positive way to achieve agreed aims. In this way evaluation interaction is able to contribute to the success of the courses and their long term impact, rather than being a negative influence to be accounted for. In Part 1 of this thesis, the influences which acted upon the developing evaluation methodology are discussed. This is followed in Part 2 by a closer look at one of the major influences, the interactions with schools. 
Each case study represents an important influence on the methodology and this influence is discussed together with

  12. A methodological approach to parametric product modelling in motor car development; Ein methodischer Ansatz zur parametrischen Produktmodellierung in der Fahrzeugentwicklung

    Energy Technology Data Exchange (ETDEWEB)

    Boehme, M.

    2004-07-01

    Continuous improvement of processes and methodologies is one key element to shorten development time, reduce costs, and improve quality, and therefore to answer growing customer demands and global competition. This work describes a new concept of introducing the principles of parametric modeling to the entire product data model in the area of automotive development. Based on the idea that not only geometric dimensions can be described by parameters, the method of parametric modeling is applied to the complete product model. The concept assumes four major principles: First, the parameters of the product model are handled independently from their proprietary data formats. Secondly, a strictly hierarchical structure is required for the parametric description of the product. The third principle demands an object-based parameterization. Finally, the use of parameter sets for the description of logical units of the product model tree is part of the concept. Those four principles address the following main objectives: supporting and improving Simultaneous Engineering, achieving data consistency over all development phases, digital approval of product properties, and incorporation of the design intent into the product model. Further improvement of the automotive development process can be achieved with the introduction of parametric product modeling using the principles described in this paper. (orig.)
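
    The four principles above (format-independent parameters, a strict hierarchy, object-based parameterization, and parameter sets per logical unit) can be sketched as a small data structure. The class and parameter names below are illustrative assumptions, not taken from the cited work.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the four principles: format-independent parameters,
# a strictly hierarchical product model tree, object-based parameterization,
# and a parameter set per logical unit. Names are illustrative only.

@dataclass
class ProductNode:
    """One logical unit in the product model tree, with its parameter set."""
    name: str
    params: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

    def effective_params(self, inherited=None):
        """Object-based parameterization: a node sees its ancestors'
        parameters and may override them locally."""
        merged = dict(inherited or {})
        merged.update(self.params)
        return merged

vehicle = ProductNode("vehicle", {"wheelbase_mm": 2700.0})
door = ProductNode("front_door", {"thickness_mm": 45.0})
vehicle.children.append(door)

print(door.effective_params(vehicle.effective_params()))
# prints {'wheelbase_mm': 2700.0, 'thickness_mm': 45.0}
```

    Because parameters are plain name/value pairs rather than CAD-native dimensions, the same tree can drive geometry, simulation and documentation tools alike, which is the data-consistency objective the abstract names.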

  13. Practical implications of rapid development methodologies

    CSIR Research Space (South Africa)

    Gerber, A

    2007-11-01

    Full Text Available Rapid development methodologies are popular approaches for the development of modern software systems. The goals of these methodologies are the inclusion of the client into the analysis, design and implementation activities, as well...

  14. Application of agile methodologies in software development

    Directory of Open Access Journals (Sweden)

    Jovanović Aca D.

    2016-01-01

    Full Text Available The paper presents the potentials for the development of software using agile methodologies. Special consideration is devoted to the potentials and advantages of using the Scrum methodology in software development, and to the relationship between the implementation of agile methodologies and software development projects.

  15. Developing a Sustainable Model of Oral Health Care for Disadvantaged Aboriginal People Living in Rural and Remote Communities in NSW, Using Collective Impact Methodology.

    Science.gov (United States)

    Gwynne, Kylie; Irving, Michelle J; McCowen, Debbie; Rambaldini, Boe; Skinner, John; Naoum, Steve; Blinkhorn, Anthony

    2016-02-01

    A sustainable model of oral health care for disadvantaged Aboriginal people living in rural and remote communities in New South Wales was developed using collective impact methodology. Collective impact is a structured process which draws together organizations to develop a shared agenda and design solutions which are jointly resourced, measured and reported upon.

  16. Numerical investigation of Marine Hydrokinetic Turbines: methodology development for single turbine and small array simulation, and application to flume and full-scale reference models

    Science.gov (United States)

    Javaherchi Mozafari, Amir Teymour

    A hierarchy of numerical models, the Single Rotating Reference Frame (SRF) model and the Blade Element Model (BEM), was used for numerical investigation of horizontal-axis Marine Hydrokinetic (MHK) turbines. In the initial stage, the SRF and BEM were used to simulate the performance and turbulent wake of a flume-scale and a full-scale MHK turbine reference model. A significant level of understanding and confidence was developed in the implementation of numerical models for simulation of an MHK turbine. This was achieved by simulation of the flume-scale turbine experiments and comparison between numerical and experimental results. The developed numerical methodology was then applied to simulate the performance and wake of the full-scale MHK reference model (DOE Reference Model 1). In the second stage, the BEM was used to simulate the experimental study of two different MHK turbine array configurations (i.e. two and three coaxial turbines). After developing a numerical methodology, validated against the experiments, to simulate the flow field of a turbine array, this methodology was applied toward an array optimization study of a full-scale model with the goal of proposing an optimized MHK turbine configuration with minimal computational cost and time. In the last stage, the BEM was used to investigate one of the potential environmental effects of an MHK turbine. A general methodological approach was developed and experimentally validated to investigate the effect of an MHK turbine wake on the sedimentation process of suspended particles in a tidal channel.
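
    Blade Element Model solvers of the kind used above build on axial-momentum (actuator disk) theory. The sketch below shows only that textbook ingredient, not the study's solver: the power coefficient as a function of the axial induction factor, whose optimum is the Betz limit.

```python
# Textbook axial-momentum (actuator disk) relation underlying BEM-type
# turbine models; this is not the solver used in the study.

def power_coefficient(a):
    """Cp = 4a(1-a)^2 for axial induction factor a (fraction of inflow slowed)."""
    return 4.0 * a * (1.0 - a) ** 2

# Sweep induction factors: the optimum is the Betz limit, Cp = 16/27 at a = 1/3.
best_cp, best_a = max((power_coefficient(i / 1000.0), i / 1000.0) for i in range(500))
print(f"max Cp = {best_cp:.4f} at a = {best_a}")
```

    A full BEM couples this momentum balance with per-element blade forces; the sweep above only locates the ideal operating point against which turbine and array performance is usually normalized.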

  17. Methodology for the treatment of model uncertainty

    Science.gov (United States)

    Droguett, Enrique Lopez

    The development of a conceptual, unified framework and methodology for treating model and parameter uncertainties is the subject of this work. Firstly, a discussion on the philosophical grounds of notions such as reality, modeling, models, and their relation is presented, and on this basis a characterization of the modeling process is given. The concept of uncertainty is then investigated, addressing controversial topics such as the types and sources of uncertainty, arguing that uncertainty is fundamentally a characterization of lack of knowledge and that, as such, all uncertainty is of the same type. A discussion about the roles of model structure and model parameters is presented, in which it is argued that the distinction between them is one of convenience and a function of the stage in the modeling process. From the foregoing discussion, a Bayesian framework for an integrated assessment of model and parameter uncertainties is developed. The methodology has as its central point the treatment of the model as a source of information regarding the unknown of interest. It allows for the assessment of the model characteristics affecting its performance, such as bias and precision. It also permits the assessment of possible dependencies among multiple models. Furthermore, the proposed framework makes possible the use not only of information from models (e.g., point estimates, qualitative assessments), but also of evidence about the models themselves (performance data, confidence in the model, applicability of the model). The methodology is then applied in the context of fire risk models, where several examples with real data are studied. These examples demonstrate how the framework and specific techniques developed in this study can address cases involving multiple models, the use of performance data to update the predictive capabilities of a model, and the case where a model is applied in a context other than the one for which it was designed.
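
    One piece of the framework, treating each model as an information source with its own bias and precision, can be illustrated with a precision-weighted pooling of bias-corrected predictions under independent normal-error assumptions. This is a deliberate simplification of the thesis's Bayesian treatment, and all numbers below are hypothetical.

```python
# Simplified illustration of treating models as information sources with
# assessed bias and precision: precision-weighted pooling of bias-corrected
# predictions under independent normal errors. This simplifies the full
# Bayesian framework; all numbers are hypothetical.

def combine_model_estimates(predictions, biases, sigmas):
    """Return (pooled estimate, pooled variance) for the unknown of interest."""
    weights = [1.0 / s ** 2 for s in sigmas]
    corrected = [y - b for y, b in zip(predictions, biases)]
    estimate = sum(w * c for w, c in zip(weights, corrected)) / sum(weights)
    return estimate, 1.0 / sum(weights)

# two fire-risk models predicting the same failure frequency (per hour)
est, var = combine_model_estimates(
    predictions=[1.2e-3, 0.8e-3],
    biases=[1e-4, -1e-4],   # assessed systematic over-/under-prediction
    sigmas=[2e-4, 4e-4],    # assessed precision of each model
)
print(est, var)
```

    The more precise model dominates the pooled estimate, which mirrors the framework's point that evidence about a model's past performance should shape how much its prediction counts.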

  18. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available foundations, and/or design methodologies. The relevant ecosystem is a novel contribution to assist health system implementers to consider specific components relevant to digital health. The ecosystem may also make some theoretical, methodological... by involving researchers or practitioners from industry and academia to evaluate the relevance of the DHIE. Additions to the knowledge base have assisted in the development of the ecosystem (see Chapter 7, Section B). Guideline 6: Design as a Search...

  19. Methodology for Developing a Diesel Exhaust After Treatment Simulation Tool

    DEFF Research Database (Denmark)

    Christiansen, Tine; Jensen, Johanne; Åberg, Andreas

    2018-01-01

    A methodology for the development of catalyst models is presented, together with a methodology for implementing such models in a modular simulation tool that simulates the units in succession. A case study is presented illustrating how suitable models can be found and used for s...

  20. Four Phase Methodology for Developing Secure Software

    OpenAIRE

    Carlos Gonzalez-Flores; Ernesto Liñan-García

    2016-01-01

    A simple and robust approach for developing secure software. A Four Phase methodology consists in developing the non-secure software in phase one, and for the next three phases, one phase for each of the secure developing types (i.e. self-protected software, secure code transformation, and the secure shield). Our methodology requires first the determination and understanding of the type of security level needed for the software. The methodology proposes the use of several teams to accomplish ...

  1. Developing Foucault's Discourse Analytic Methodology

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-01-01

    Full Text Available A methodological position for a FOUCAULTian discourse analysis is presented. A sequence of analytical steps is introduced and an illustrating example is offered. It is emphasized that discourse analysis has to discover the system-level of discursive rules and the deeper structure of the discursive formation; otherwise the analysis will be unfinished. Michel FOUCAULT's work is theoretically grounded in French structuralism and (so-called) post-structuralism. In this paper, post-structuralism is not conceived as a means of overcoming structuralism, but as a way of critically continuing the structural perspective. In this way, discursive structures can be related to discursive practices and the concept of structure can be disclosed (e.g. to inter-discourse or DERRIDA's concept of structurality). In this way, the structural methodology is continued and radicalized, but not given up. In this paper, FOUCAULT's theory is combined with the works of Michel PÊCHEUX and (especially for the sociology of knowledge and the sociology of culture) Pierre BOURDIEU. The practice of discourse analysis is theoretically grounded. This practice can be conceived as a reflexive coupling of deconstruction and reconstruction in the material to be analyzed. This methodology therefore can be characterized as a reconstructive qualitative methodology. At the end of the article, forms of discourse analysis are criticized that do not intend to recover the system level of discursive rules and that do not intend to discover the deeper structure of the discursive formation (i.e. episteme, socio-episteme). These forms merely are commentaries of discourses (not their analyses); they remain phenomenological and are therefore pre-structuralist. URN: urn:nbn:de:0114-fqs060168

  2. Comparative study on software development methodologies

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2014-12-01

    Full Text Available This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation-orientated IT projects. The paper starts by depicting specific characteristics in software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. Also, the software development project manager handles challenges and risks that are predominantly encountered in business and research areas that involve state-of-the-art technology. Conventional software development stages are defined and briefly described. Development stages are the building blocks of any software development methodology, so it is important to properly research this aspect. Current software development methodologies are presented. Development stages are defined for every showcased methodology. For each methodology a graphic representation is illustrated in order to better individualize its structure. Software development methodologies are compared by highlighting strengths and weaknesses from the stakeholder's point of view. Conclusions are formulated and a research direction aimed at formalizing a software development methodology dedicated to innovation-orientated IT projects is enunciated.

  3. Disentangling early language development: modeling lexical and grammatical acquisition using an extension of case-study methodology.

    Science.gov (United States)

    Robinson, B F; Mervis, C B

    1998-03-01

    The early lexical and grammatical development of 1 male child is examined with growth curves and dynamic-systems modeling procedures. Lexical development followed a pattern of logistic growth (R2 = .98). Lexical and plural development shared the following characteristics: plural growth began only after a threshold was reached in vocabulary size; lexical growth slowed as plural growth increased. As plural use reached full mastery, lexical growth began to increase again. It was hypothesized that a precursor model (P. van Geert, 1991) would fit these data. Subsequent testing indicated that the precursor model, modified to incorporate brief yet intensive plural growth, provided a suitable fit. The value of the modified precursor model for the explication of processes implicated in language development is discussed.
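
    The logistic growth pattern reported for the lexicon, fastest growth mid-course and a slowdown near capacity, can be sketched with the standard logistic equation. The parameter values below are hypothetical, not the fitted values from the case study.

```python
import math

# Standard logistic growth, the curve family reported for lexical development;
# K, r and t0 below are hypothetical, not the fitted values from the study.

def logistic(t, K, r, t0):
    """Vocabulary size at time t, approaching carrying capacity K."""
    return K / (1.0 + math.exp(-r * (t - t0)))

def growth_rate(L, K, r):
    """dL/dt = r*L*(1 - L/K): growth is fastest at L = K/2 and slows near K."""
    return r * L * (1.0 - L / K)

K, r, t0 = 600.0, 0.15, 20.0        # hypothetical: 600-word capacity, time in weeks
print(logistic(20.0, K, r, t0))     # prints 300.0 (half of K at the midpoint)
print(growth_rate(300.0, K, r))     # peak growth rate, at L = K/2
```

    The precursor idea maps naturally onto this form: plural growth switches on only once the lexical curve crosses a threshold, and the two rates then trade off.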

  4. Bayesian methodology for reliability model acceptance

    International Nuclear Information System (INIS)

    Zhang Ruoxue; Mahadevan, Sankaran

    2003-01-01

    This paper develops a methodology to assess the validity of a reliability computation model using the concept of Bayesian hypothesis testing, by comparing model predictions with experimental observations, when only one computational model is available to evaluate system behavior. Time-independent and time-dependent problems are investigated, with consideration of both cases: with and without statistical uncertainty in the model. The case of time-independent failure probability prediction with no statistical uncertainty is a straightforward application of Bayesian hypothesis testing. However, for the life prediction (time-dependent reliability) problem, a new methodology is developed in this paper to make the same Bayesian hypothesis testing concept applicable. With the existence of statistical uncertainty in the model, in addition to the application of a predictor estimator of the Bayes factor, the uncertainty in the Bayes factor is explicitly quantified by treating it as a random variable and calculating the probability that it exceeds a specified value. The developed method provides decision-makers with a rational criterion for the acceptance or rejection of the computational model.
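
    The Bayes factor comparison of a model prediction against experimental observation can be illustrated with a minimal binomial example: H0 fixes the failure probability at the model's predicted value, while H1 spreads it uniformly. The uniform alternative and all reliability numbers below are simplifying assumptions, not the paper's choice of priors.

```python
from math import comb

# Minimal Bayes factor sketch for model acceptance: H0 fixes p at the model's
# predicted failure probability, H1 spreads p uniformly on (0, 1). The uniform
# alternative and all numbers are illustrative assumptions.

def bayes_factor_binomial(k, n, p0):
    """B = P(k failures in n trials | H0) / P(k | H1); under a uniform prior
    on p the H1 marginal likelihood is 1/(n+1)."""
    lik_h0 = comb(n, k) * p0 ** k * (1.0 - p0) ** (n - k)
    return lik_h0 / (1.0 / (n + 1))

# model predicts a 3% failure probability; the test shows 2 failures in 100 trials
B = bayes_factor_binomial(2, 100, 0.03)
print(f"Bayes factor: {B:.1f}")   # B >> 1 favors accepting the model
```

    In the paper's statistical-uncertainty case this B would itself be a random variable, and the decision criterion would be the probability that it exceeds a threshold rather than a single point value.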

  5. Study on fermentation conditions of palm juice vinegar by response surface methodology and development of a kinetic model

    Directory of Open Access Journals (Sweden)

    S. Ghosh

    2012-09-01

    Full Text Available Natural vinegar is one of the fermented products with some potential from a nutraceutical standpoint. The present study is an optimization of the fermentation conditions for vinegar production from palm juice (Borassus flabellifer) wine, a biochemical process aided by Acetobacter aceti (NCIM 2251). The physical parameters of the fermentation conditions, such as temperature, pH and time, were investigated by Response Surface Methodology (RSM) with a 2³ factorial central composite design (CCD). The optimum pH, temperature and time were 5.5, 30 °C and 72 hrs for the highest yield of acetic acid (68.12 g/L). The quadratic model equation had an R² value of 0.992. RSM played an important role in elucidating the basic mechanisms in a complex situation, thus providing better process control by maximizing acetic acid production with the respective physical parameters. At the optimized conditions of temperature, pH and time, and with the help of mathematical kinetic equations, the Monod specific growth rate (μmax = 0.021 h⁻¹), the maximum logistic specific growth rate (μ′max = 0.027 h⁻¹) and various other kinetic parameters were calculated, which helped in validation of the experimental data. Therefore, the established kinetic models may be applied for the production of natural vinegar by fermentation of low-cost palm juice.
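
    The second-order RSM model fitted from a CCD has the form y = b0 + b1*x1 + b2*x2 + b11*x1² + b22*x2² + b12*x1*x2, and its optimum lies where both partial derivatives vanish. The sketch below uses two coded factors and hypothetical coefficients, not the study's fitted three-factor model.

```python
# Second-order response surface in two coded factors and its stationary
# point; coefficients are hypothetical, not the palm-vinegar fit.

def quadratic_response(x1, x2, b):
    """y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2."""
    b0, b1, b2, b11, b22, b12 = b
    return b0 + b1 * x1 + b2 * x2 + b11 * x1 ** 2 + b22 * x2 ** 2 + b12 * x1 * x2

def stationary_point(b):
    """Solve dy/dx1 = dy/dx2 = 0 (a 2x2 linear system) for the optimum."""
    _, b1, b2, b11, b22, b12 = b
    det = 4.0 * b11 * b22 - b12 ** 2
    x1 = (-2.0 * b22 * b1 + b12 * b2) / det
    x2 = (-2.0 * b11 * b2 + b12 * b1) / det
    return x1, x2

b = (60.0, 4.0, 2.0, -8.0, -4.0, 1.0)   # negative b11, b22: surface has a maximum
x1, x2 = stationary_point(b)
print(x1, x2, quadratic_response(x1, x2, b))
```

    With real CCD data the coefficients would come from a least-squares fit; locating the stationary point of the fitted surface is how RSM turns the 0.992-R² quadratic into optimum settings such as pH 5.5 and 30 °C.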

  6. A Probabilistic Ontology Development Methodology

    Science.gov (United States)

    2014-06-01


  7. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. Firstly, the gap in contemporary Business Process Modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems encountered during the project. Concluding from these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  8. Development of effect assessment methodology for the deployment of fast reactor cycle system with dynamic computable general equilibrium model

    International Nuclear Information System (INIS)

    Shiotani, Hiroki; Ono, Kiyoshi

    2009-01-01

    The Global Trade and Analysis Project (GTAP) is a widely used computable general equilibrium (CGE) model developed by Purdue University. Although the GTAP-E, an energy-environmental version of the GTAP model, is useful for surveying the energy-economy-environment-trade linkage in economic policy analysis, it does not have a decomposed model of the electricity sector and its analyses are comparative static. In this study, a recursive dynamic CGE model with a detailed electricity technology bundle, including nuclear power generation with FR, was developed based on the GTAP-E to evaluate the long-term socioeconomic effects of FR deployment. The capital stock changes caused by international investments and some dynamic constraints of the FR deployment and operation (e.g., load following capability and plutonium mass balance) were incorporated in the analyses. The long-term socioeconomic effects resulting from the deployment of economically competitive FR with innovative technologies can be assessed; the cumulative effects of FR deployment on GDP calculated using this model amounted to over 40 trillion yen in Japan and 400 trillion yen worldwide, several times more than the effects calculated using the conventional cost-benefit analysis tool, because of ripple effects and energy substitutions among others. (author)

  9. Modeling Virtual Organization Architecture with the Virtual Organization Breeding Methodology

    Science.gov (United States)

    Paszkiewicz, Zbigniew; Picard, Willy

    While Enterprise Architecture Modeling (EAM) methodologies become more and more popular, an EAM methodology tailored to the needs of virtual organizations (VOs) is still to be developed. Among the most popular EAM methodologies, TOGAF has been chosen as the basis for a new EAM methodology taking into account the characteristics of VOs presented in this paper. In this new methodology, referred to as the Virtual Organization Breeding Methodology (VOBM), concepts developed within the ECOLEAD project, e.g. the concept of a Virtual Breeding Environment (VBE) or the VO creation schema, serve as fundamental elements for the development of VOBM. VOBM is a generic methodology that should be adapted to a given VBE. VOBM defines the structure of VBE and VO architectures in a service-oriented environment, as well as an architecture development method for virtual organizations (ADM4VO). Finally, a preliminary set of tools and methods for VOBM is given in this paper.

  10. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level this scaling determines the effects various processes have on a state variable, and it ranks the processes according to their importance by the magnitude of the fractional change they cause on that state variable. At the system level the scaling determines the governing processes and corresponding components, ranking these in the order of importance according to their effect on the fractional change of system-wide state variables. The scaling methodology reveals on all levels the fractional change of state variables and is therefore called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate over the content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering of the component effect metrics provides the hierarchy of processes in a component, then in all components and the system. FSM separates quantitatively dominant from minor processes and components and
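
    The effect metric defined above, Ω = ωt with ω equal to the process transport rate divided by the content of the preserved quantity, lends itself to a direct sketch. The process names and rate values below are hypothetical, chosen only to show the ranking step.

```python
# FSM effect metric Omega = omega * t, with omega = transport rate / content
# of the preserved quantity; process names and numbers are hypothetical.

def effect_metric(transport_rate, content, characteristic_time):
    """Fractional change a process contributes over the characteristic time."""
    return (transport_rate / content) * characteristic_time

# hypothetical processes acting on one component during one scenario
processes = {
    "break flow":     effect_metric(50.0, 2000.0, 120.0),
    "pump injection": effect_metric(30.0, 2000.0, 120.0),
    "wall heat":      effect_metric(5.0, 2000.0, 120.0),
}
# ordering the effect metrics separates dominant from minor processes
ranking = sorted(processes, key=processes.get, reverse=True)
print(ranking)   # prints ['break flow', 'pump injection', 'wall heat']
```

    Repeating this ordering at the component and then the system level yields the hierarchy of governing processes that FSM uses to separate dominant from minor contributors.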

  11. Development of Methodology for Programming Autonomous Agents

    Science.gov (United States)

    Erol, Kutluhan; Levy, Renato; Lang, Lun

    2004-01-01

    A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of developing the systems. The methodology is also characterized as reducing the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem addressed in the development of the methodology was how to efficiently describe the interfaces between several layers of agent composition using a language that is both familiar to engineers and descriptive enough to specify such interfaces unambiguously.

  12. Integrated management model. Methodology and software-enabled tool designed to assist a utility in developing a station-wide optimization

    International Nuclear Information System (INIS)

    Llovet, R.; Ibanez, R.; Woodcock, J.

    2005-01-01

    A key concern for utilities today is optimizing station aging and reliability management activities in a manner that maximizes the value of those activities within an affordable budget. The Westinghouse Proactive Asset Management Model is a methodology and software-enabled tool designed to assist a utility in developing a station-wide optimization of those activities. The process and tool support the development of an optimized, station-wide plan for inspection, testing, maintenance, repair and replacement of aging components. The optimization identifies the benefit and optimal timing of those activities based on minimizing unplanned outage costs (avoided costs) and maximizing station Net Present Value. (Author)

  13. ENACTED SOFTWARE DEVELOPMENT PROCESS BASED ON AGILE AND AGENT METHODOLOGIES

    OpenAIRE

    DR. NACHAMAI. M; M. SENTHIL VADIVU; VINITA TAPASKAR

    2011-01-01

    Software engineering provides the procedures and practices to be followed in software development and acts as a backbone for computer science engineering techniques. This paper deals with current trends in software engineering methodologies: the Agile and agent-oriented software development processes. The Agile methodology aims to meet the dynamically changing requirements of customers. This model is iterative and incremental and accepts changes in requirements at any stage of development. ...

  14. Scientific modeling: some theoretical and methodological considerations

    Directory of Open Access Journals (Sweden)

    Carlos Tamayo-Roca

    2017-04-01

    Full Text Available Models are now in widespread use as auxiliary systems for penetrating the essence of phenomena across all areas of human cognitive and transformative activity, spanning fields as diverse as the human sciences. In education, their use is becoming ever more common and is essential for transforming school practice and enriching its theoretical toolkit. The paper deals with the development of theoretical modeling as a scientific method for advancing understanding of the process to be transformed, characterized by establishing relationships and links among the structural components that comprise it. In this regard, its objective is to socialize some theoretical and methodological considerations that favor the use of the modeling method in the scientific research activity of teachers.

  15. Development of a methodology for electronic waste estimation: A material flow analysis-based SYE-Waste Model.

    Science.gov (United States)

    Yedla, Sudhakar

    2016-01-01

    With improved living standards and a growing services-sector share of the economy in Asia, the use of electronic equipment is on the rise, resulting in increased electronic waste generation. A peculiarity of electronic waste is that it retains 'significant' value even after its lifetime and, to add complication, even after its extended life in its 'dump' stage. Thus, in Indian conditions, after its lifetime is over, e-material changes hands more than once and finally ends up either in the hands of informal recyclers or in the store rooms of urban dwellings. This character makes it extremely difficult to estimate electronic waste generation. The present study attempts to develop a functional model based on a material flow analysis approach by considering all possible end uses of the material and its transformed goods, finally arriving at disposal. It considers the various degrees of use derived from e-goods: primary use (lifetime), secondary use (first-degree extension of life), third-hand use (second-degree extension of life), donation, retention at the respective places (without discarding), the fraction shifted to scrap vendors, and the components reaching the final dump site from various end points of use. This 'generic functional model', named the SYE-Waste Model and developed based on a material flow analysis approach, can be used to derive 'obsolescence factors' for various degrees of usage of e-goods and also to make a comprehensive estimation of electronic waste in any city/country. © The Author(s) 2015.
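
A staged material-flow calculation of this kind can be sketched as follows. This is a hypothetical illustration in the spirit of the SYE-Waste Model, not the model itself; the sales figure and discard fractions are invented:

```python
# Hypothetical multi-stage e-waste material-flow sketch. At each use stage
# (primary, second-hand, third-hand, ...) a fraction of units is discarded
# as waste; the rest passes on to the next stage. Units surviving all stages
# are retained in households and are not yet counted as waste.

def ewaste_to_dump(units_sold, discard_fractions):
    """discard_fractions[i]: share of units leaving use stage i as waste.

    Returns (cumulative units discarded as waste, units still retained)."""
    in_use = units_sold
    waste = 0.0
    for d in discard_fractions:
        waste += in_use * d          # discarded at this stage
        in_use *= (1.0 - d)          # passed on to the next degree of use
    return waste, in_use

# Invented example: 1 million units, three successive use stages.
waste, retained = ewaste_to_dump(1_000_000, [0.2, 0.5, 0.7])
```

The stage-wise discard fractions play the role of the paper's 'obsolescence factors' for the various degrees of usage.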

  16. Verification of Fault Tree Models with RBDGG Methodology

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

    Currently, fault tree analysis is widely used in the field of probabilistic safety assessment (PSA) of nuclear power plants (NPPs). To guarantee the correctness of fault tree models, which are usually constructed manually by analysts, review by other analysts is widely used to verify constructed fault tree models. Recently, an extension of the reliability block diagram was developed, named RBDGG (reliability block diagram with general gates). The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system; therefore, modeling a system for system reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that behind the RGGG (Reliability Graph with General Gates) methodology. The difference between the two is that the RBDGG methodology focuses on block failures while the RGGG methodology focuses on connection-line failures. It is also known, however, that an RGGG model can be converted into an RBDGG model and vice versa. In this paper, a new method for the verification of constructed fault tree models using the RBDGG methodology is proposed and demonstrated.
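
To illustrate why a block-oriented model that mirrors system structure is intuitive, here is a minimal sketch using plain series/parallel reduction. This is an assumption for illustration; the full RBDGG formalism with general gates is richer than simple series/parallel blocks, and the numbers are invented:

```python
# Minimal series/parallel block-diagram sketch (illustrative assumption,
# not the full RBDGG general-gate formalism).

def series(*availabilities):
    """All blocks in the chain must work."""
    p = 1.0
    for a in availabilities:
        p *= a
    return p

def parallel(*availabilities):
    """At least one of the redundant blocks must work."""
    q = 1.0
    for a in availabilities:
        q *= (1.0 - a)
    return 1.0 - q

# Invented system: two redundant pumps (availability 0.95 each) feeding one
# heat exchanger (availability 0.99). The expression mirrors the diagram.
system_availability = series(parallel(0.95, 0.95), 0.99)
system_unavailability = 1.0 - system_availability
```

Because the nesting of `series` and `parallel` calls mirrors the physical layout, the model is easy to review against the plant drawings, which is the intuition the RBDGG methodology generalizes.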

  17. Model evaluation methodology applicable to environmental assessment models

    International Nuclear Information System (INIS)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes
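
The multiplicative-chain result for independent lognormal inputs can be illustrated directly: the product of independent lognormals is again lognormal, with the log-space means and variances simply summed. This is a standard property; the parameter values below are invented, not taken from the report:

```python
# Lognormal multiplicative-chain sketch: for Y = X1*X2*...*Xn with
# independent Xi ~ LN(mu_i, sigma_i^2), we have ln Y ~ N(sum mu_i, sum sigma_i^2).

import math

def chain_output_distribution(mus, sigmas):
    """Return (mu_Y, sigma_Y) of ln Y for the product of the chain."""
    mu_y = sum(mus)
    sigma_y = math.sqrt(sum(s * s for s in sigmas))
    return mu_y, sigma_y

# Invented three-factor chain (e.g. source term x transport x dose factor):
mu_y, sigma_y = chain_output_distribution([0.0, 1.0, -0.5], [0.3, 0.4, 0.5])
median_y = math.exp(mu_y)  # median of the product
```

The factor with the largest sigma contributes the most to the output variance, which is the kind of parameter-importance statement the methodology's sensitivity-analysis task seeks.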

  18. A Methodology for Phased Array Radar Threshold Modeling Using the Advanced Propagation Model (APM)

    Science.gov (United States)

    2017-10-01

    TECHNICAL REPORT 3079, October 2017. This report summarizes the methodology developed to improve radar threshold modeling for a phased array radar configuration using the Advanced Propagation Model (APM).

  19. FAA Development of Reliable Modeling Methodologies for Fan Blade Out Containment Analysis. Part 2; Ballistic Impact Testing

    Science.gov (United States)

    Revilock, Duane M.; Pereira, J. Michael

    2008-01-01

    This report summarizes the ballistic impact testing that was conducted to provide validation data for the development of numerical models of blade out events in fabric containment systems. The ballistic impact response of two different fiber materials - Kevlar 49 (E.I. DuPont Nemours and Company) and Zylon AS (Toyobo Co., Ltd.) - was studied by firing metal projectiles into dry woven fabric specimens using a gas gun. The shape, mass, orientation and velocity of the projectile were varied and recorded. In most cases the tests were designed such that the projectile would perforate the specimen, allowing measurement of the energy absorbed by the fabric. The results for both Zylon and Kevlar presented here represent a useful set of data for the purposes of establishing and validating numerical models for predicting the response of fabrics under conditions simulating those of a jet engine blade release situation. In addition, some useful empirical observations were made regarding the effects of projectile orientation and the relative performance of the different materials.

  20. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  1. Development and Validation of Methodology to Model Flow in Ventilation Systems Commonly Found in Nuclear Facilities. Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Strons, Philip [Argonne National Lab. (ANL), Argonne, IL (United States); Bailey, James L. [Argonne National Lab. (ANL), Argonne, IL (United States); Davis, John [Argonne National Lab. (ANL), Argonne, IL (United States); Grudzinski, James [Argonne National Lab. (ANL), Argonne, IL (United States); Hlotke, John [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-03-01

    In this work, we apply CFD to model airflow and particulate transport. The modeling results are then compared with field validation studies to both inform and validate the modeling assumptions. Based on the results of the field tests, modeling assumptions and boundary conditions are refined, and the process is repeated until the results are found to be reliable with a high level of confidence.

  2. Development of a methodology for microstructural description

    Directory of Open Access Journals (Sweden)

    Vanderley de Vasconcelos

    1999-07-01

    Full Text Available A systematic methodology for microstructural description can help in obtaining the processing x microstructure x properties x performance relationships. There are, however, some difficulties in performing this task, related mainly to the following three factors: the complexity of the interactions between microstructural features; difficulties in evaluating geometric parameters of microstructural features; and difficulties in relating these geometric parameters to process variables. To solve some of these problems, a methodology is proposed that embodies the following features: it takes into account the different possible types of approaches to the microstructural description problem; includes concepts and tools of Total Quality Management; is supported by techniques of systems analysis; and makes use of computer modeling and simulation and statistical design of experiments tools. The methodology was applied to evaluating some topological parameters during the sintering process, and its results were compared with available experimental data.

  3. Development Methodology for an Integrated Legal Cadastre

    NARCIS (Netherlands)

    Hespanha, J.P.

    2012-01-01

    This Thesis describes the research process followed in order to achieve a development methodology applicable to the reform of cadastral systems with a legal basis. It was motivated by the author’s participation in one of the first surveying and mapping operations for a digital cadastre in Portugal,

  4. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.
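
Expressing risk in expected annual monetary terms can be sketched as follows; the scenarios and figures are invented, not from the article. The common monetary scale is what allows terrorism risk to be compared with accident and natural-hazard risks when allocating mitigation resources:

```python
# Event-based expected annual loss sketch (all scenario data invented).
# Risk = sum over scenarios of (annual occurrence probability) x
# (monetary consequence), giving each hazard type a common scale.

def expected_annual_loss(scenarios):
    """scenarios: iterable of (annual_probability, consequence_usd) pairs."""
    return sum(p * c for p, c in scenarios)

# Invented scenario sets for one hypothetical region:
terror = expected_annual_loss([(0.001, 5e9),   # rare, catastrophic attack
                               (0.01, 2e8)])   # more frequent, smaller attack
flood = expected_annual_loss([(0.1, 5e7)])     # recurring natural hazard
```

A region could then rank `terror` against `flood` (and other hazards) directly in dollars per year, which is the all-hazards comparison the article describes.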

  5. Comparative Study on Agile software development methodologies

    OpenAIRE

    Moniruzzaman, A B M; Hossain, Dr Syed Akhter

    2013-01-01

    Today's business environment is very dynamic, and organisations are constantly changing their software requirements to adjust to the new environment. They also demand fast delivery of software products as well as acceptance of changing requirements. In this respect, traditional plan-driven development fails to meet these requirements. Though traditional software development methodologies, such as life cycle-based structured and object oriented approaches, continue to dominate the sys...

  6. Development of Proliferation Resistance Assessment Methodology Based on International Standards

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Jung Won; Lee, Kwang Seok

    2009-03-01

    Proliferation resistance is one of the requirements to be met by GEN IV and INPRO next-generation nuclear energy systems. Internationally, evaluation methodologies for PR were initiated as early as 1980, but systematic development began in the 2000s. In Korea, to support the export of nuclear energy systems and to increase the international credibility and transparency of the domestic nuclear system and fuel cycle development, independent development of a PR evaluation methodology was started in 2007 as a national long-term R and D project, and a model for the PR evaluation methodology is being developed. In the first year, a comparative study of GEN-IV/INPRO, PR indicator development, quantification of indicators, development of an evaluation model, and analysis of the technology system and international technology development trends were performed. In the second year, a feasibility study of the indicators, allowable limits for the indicators, and a review of their technical requirements were carried out. The results of a PR evaluation must be applied at the beginning of the conceptual design of a nuclear system. Through this development effort, the PR evaluation methodology will be incorporated into the regulatory requirements for authorization and permitting.

  7. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  8. Setting priorities in health research using the model proposed by the World Health Organization: development of a quantitative methodology using tuberculosis in South Africa as a worked example.

    Science.gov (United States)

    Hacking, Damian; Cleary, Susan

    2016-02-09

    Setting priorities is important in health research given the limited resources available for research. Various guidelines exist to assist in the priority setting process; however, priority setting still faces significant challenges such as the clear ranking of identified priorities. The World Health Organization (WHO) proposed a Disability Adjusted Life Year (DALY)-based model to rank priorities by research area (basic, health systems and biomedical) by dividing the DALYs into 'unavertable with existing interventions', 'avertable with improved efficiency' and 'avertable with existing but non-cost-effective interventions', respectively. However, the model has conceptual flaws and no clear methodology for its construction. Therefore, the aim of this paper was to amend the model to address these flaws, and to develop a clear methodology by using tuberculosis in South Africa as a worked example. An amended model was constructed to represent total DALYs as the product of DALYs per person and absolute burden of disease. These figures were calculated for all countries from WHO datasets. The lowest figures achieved by any country were assumed to represent 'unavertable with existing interventions' if extrapolated to South Africa. The ratio of 'cost per patient treated' (adjusted for purchasing power and outcome weighted) between South Africa and the best country was used to calculate the 'avertable with improved efficiency' section. Finally, 'avertable with existing but non-cost-effective interventions' was calculated using Disease Control Priorities Project efficacy data and the ratio between the best intervention and South Africa's current intervention, irrespective of cost. The amended model shows that South Africa has a tuberculosis burden of 1,009,837.3 DALYs; 0.009% of DALYs are unavertable with existing interventions and 96.3% of DALYs could be averted with improvements in efficiency. Of the remaining DALYs, a further 56.9% could be averted with existing but non
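
The partition logic can be sketched as successive splits of the total burden. This is a hedged illustration only; the ratios below are invented placeholders, not the paper's South African tuberculosis figures:

```python
# Illustrative DALY partition in the spirit of the amended WHO model.
# All ratios and totals below are invented, not the paper's results.

def partition_dalys(total, unavertable_ratio, efficiency_ratio, efficacy_ratio):
    """unavertable_ratio: share of the burden even the best-performing
    country could not avert; efficiency_ratio: share of the remainder
    avertable by matching the best country's cost per outcome;
    efficacy_ratio: share of what is then left, avertable with the best
    existing (possibly non-cost-effective) intervention."""
    unavertable = total * unavertable_ratio
    remaining = total - unavertable
    by_efficiency = remaining * efficiency_ratio
    remaining -= by_efficiency
    by_better_interventions = remaining * efficacy_ratio
    return unavertable, by_efficiency, by_better_interventions

# Invented illustrative numbers (not the paper's TB figures):
unavert, by_eff, by_better = partition_dalys(1_000_000.0, 0.001, 0.9, 0.5)
```

Each of the three outputs maps to one research area in the WHO scheme (basic research, health systems research, and biomedical research, respectively), which is what yields a ranked priority list.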

  9. Large sample NAA facility and methodology development

    International Nuclear Information System (INIS)

    Roth, C.; Gugiu, D.; Barbos, D.; Datcu, A.; Aioanei, L.; Dobrea, D.; Taroiu, I. E.; Bucsa, A.; Ghinescu, A.

    2013-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) facility has been developed at the TRIGA Annular Core Pulsed Reactor (ACPR) operated by the Institute for Nuclear Research in Pitesti, Romania. The central irradiation cavity of the ACPR core can accommodate a large irradiation device. The ACPR neutron flux characteristics are well known, and spectrum adjustment techniques have been successfully applied to enhance the thermal component of the neutron flux in the central irradiation cavity. An analysis methodology was developed using the MCNP code in order to estimate counting efficiency and correction factors for the major perturbing phenomena. Test experiments, comparisons with classical instrumental neutron activation analysis (INAA) methods, and an international inter-comparison exercise have been performed to validate the new methodology. (authors)

  10. Comparing Methodologies for Developing an Early Warning System: Classification and Regression Tree Model versus Logistic Regression. REL 2015-077

    Science.gov (United States)

    Koon, Sharon; Petscher, Yaacov

    2015-01-01

    The purpose of this report was to explicate the use of logistic regression and classification and regression tree (CART) analysis in the development of early warning systems. It was motivated by state education leaders' interest in maintaining high classification accuracy while simultaneously improving practitioner understanding of the rules by…
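
The trade-off the report examines can be illustrated with a toy, hand-rolled comparison (entirely invented data, not the REL study's models): logistic regression produces a weighted score, while a single CART-style split produces a directly communicable cut-point rule such as "flag students with attendance below 0.80":

```python
# Toy comparison of logistic regression vs. a CART-style single split on an
# invented early-warning dataset. Illustrative only.

import math

# (attendance_rate, label: 1 = later dropped out). Invented, separable data.
data = [(0.95, 0), (0.90, 0), (0.85, 0), (0.80, 0),
        (0.65, 1), (0.60, 1), (0.55, 1), (0.50, 1)]

# --- logistic regression fitted by batch gradient descent ---
w, b = 0.0, 0.0
for _ in range(5000):
    gw = gb = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        gw += (p - y) * x
        gb += (p - y)
    w -= 0.5 * gw
    b -= 0.5 * gb
logit_acc = sum(((1.0 / (1.0 + math.exp(-(w * x + b)))) > 0.5) == bool(y)
                for x, y in data) / len(data)

# --- CART-style stump: pick the single threshold with best accuracy ---
def stump_accuracy(threshold):
    # rule: predict "at risk" (1) when attendance < threshold
    return sum((x < threshold) == bool(y) for x, y in data) / len(data)

best_threshold = max((x for x, _ in data), key=stump_accuracy)
stump_acc = stump_accuracy(best_threshold)
```

On this separable toy data both approaches classify well, but only the stump yields a plain-language rule, which is the practitioner-understanding advantage the report attributes to CART.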

  11. Thermodynamic modeling of ionic liquid systems: development and detailed overview of novel methodology based on the PC-SAFT.

    Science.gov (United States)

    Paduszyński, Kamil; Domańska, Urszula

    2012-04-26

    We present the results of an extensive study on a novel approach of modeling ionic liquids (ILs) and their mixtures with molecular compounds, incorporating perturbed-chain statistical associating fluid theory (PC-SAFT). PC-SAFT was used to calculate the thermodynamic properties of different homologous series of ILs based on the bis(trifluoromethylsulfonyl)imide anion ([NTf2]). First, pure fluid parameters were obtained for each IL by fitting the model predictions to experimental liquid densities over a broad range of temperature and pressure. The reliability and physical significance of the parameters as well as the employed molecular scheme were tested by calculation of density, vapor pressure, and other properties of pure ILs (e.g., critical properties, normal boiling point). Additionally, the surface tension of pure ILs was calculated by coupling the PC-SAFT equation of state with density gradient theory (DGT). All correlated/predicted results were compared with literature experimental or simulation data. Afterward, we attempted to model various thermodynamic properties of some binary systems composed of IL and organic solvent or water. The properties under study were the binary vapor-liquid, liquid-liquid, and solid-liquid equilibria and the excess enthalpies of mixing. To calculate cross-interaction energies we used the standard combining rules of Lorentz-Berthelot, Kleiner-Sadowski, and Wolbach-Sandler. It was shown that incorporation of temperature-dependent binary corrections was required to obtain much more accurate results than in the case of conventional predictions. Binary corrections were adjusted to infinite dilution activity coefficients of a particular solute in a given IL, determined experimentally or predicted by means of the modified UNIFAC (Dortmund) group contribution method. We concluded that the latter method allows accurate and reliable calculations of bulk-phase properties in a totally predictive manner.
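
The pure-fluid parameter step amounts to least-squares fitting of model densities to experimental ρ(T, p) data. As a stand-in for the full PC-SAFT calculation, this sketch fits a simple linear density correlation ρ(T) = a + bT to invented IL-like density data; all values are illustrative assumptions, not data from the paper:

```python
# Closed-form linear least-squares fit of a density correlation to invented
# ionic-liquid density data. A stand-in for the (much richer) PC-SAFT
# parameter regression described above; all numbers are illustrative.

def fit_linear(ts, rhos):
    """Fit rho = a + b*T by ordinary least squares; return (a, b)."""
    n = len(ts)
    mean_t = sum(ts) / n
    mean_r = sum(rhos) / n
    slope = (sum((t - mean_t) * (r - mean_r) for t, r in zip(ts, rhos))
             / sum((t - mean_t) ** 2 for t in ts))
    intercept = mean_r - slope * mean_t
    return intercept, slope

# Invented [NTf2]-like IL densities [kg/m3] vs. temperature [K]:
T = [293.15, 313.15, 333.15, 353.15]
rho = [1440.0, 1421.0, 1402.0, 1383.0]
a, b = fit_linear(T, rho)
rho_303 = a + b * 303.15  # interpolated density at 303.15 K
```

In the actual methodology the "model density" on the left-hand side comes from solving the PC-SAFT equation of state, and the adjustable quantities are the segment number, segment diameter, and dispersion energy rather than a and b; the regression principle is the same.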

  12. Baseline methodologies for clean development mechanism projects

    International Nuclear Information System (INIS)

    Lee, M.K.; Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-01

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline, along with the baseline methodology, are thus the most critical elements of any CDM project toward meeting the important criteria of the CDM, which are that a CDM project should result in 'real, measurable, and long term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials are aimed at helping stakeholders better understand the CDM and are believed to eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This Guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook' developed under the CD4CDM project. (BA)

  14. Methodologies for local development in smart society

    Directory of Open Access Journals (Sweden)

    Lorena BĂTĂGAN

    2012-07-01

    Full Text Available All the digital devices connected through the Internet are producing a large quantity of data. All this information can be turned into knowledge, because we now have the computational power and advanced analytics solutions to make sense of it. With this knowledge, cities could reduce costs, cut waste, and improve efficiency, productivity and quality of life for their citizens. Efficient/smart cities are characterized by the greater importance given to environment, resources, globalization and sustainable development. This paper presents a study of methodologies for local development, which have become a central element of the smart society.

  15. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the 1988 revision of 10CFR50.46, which allowed BE (Best-Estimate) methods in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have continuously shifted from conservative EM (Evaluation Model) approaches to BE ones. In this context, LSC (Long-Term core Cooling) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and deficiencies of the old methodology were identified, and as a result, the USNRC suspended approval of CENPD-254-P-A, the old LSC methodology for CE-designed NPPs. The regulatory bodies requested that the non-conservatisms be removed and that system transient behaviors be reflected in all the LSC methodologies used. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full range of the break spectrum is considered, and applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the new methodology can be applied with no significant changes to current LSC plans.
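
The physical effect BACON tracks, boric acid concentrating in the core liquid as boil-off removes only steam while injection carries boron in, can be sketched with a simple mass balance. All rates and inventories below are invented round numbers, not plant data:

```python
# Simplified core boric-acid mass balance (illustrative assumptions only,
# not the BACON code). Injected coolant carries boron in; boil-off removes
# boron-free steam at the same rate, so the liquid inventory is constant
# while boron accumulates and its concentration rises.

def boron_concentration(hours, liquid_mass=1.0e5, injection_rate=20.0,
                        c_injection=2500e-6):
    """Boron concentration [kg B / kg water] in the core liquid after
    `hours`. Assumes injection (kg/s) exactly balances boil-off."""
    boron = liquid_mass * c_injection   # start at the injection concentration
    dt = 3600.0                         # one-hour time step [s]
    for _ in range(int(hours)):
        boron += injection_rate * dt * c_injection  # boron carried in; none leaves
    return boron / liquid_mass

c_24h = boron_concentration(24)  # concentration after one day of boil-off
```

Even this crude balance shows why long-term cooling plans must bound boric acid buildup (risking precipitation) and why a realistic system-transient code is needed to supply the boil-off and injection rates.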

  16. A methodology for spectral wave model evaluation

    Science.gov (United States)

    Siqueira, S. A.; Edwards, K. L.; Rogers, W. E.

    2017-12-01

    climate, omitting the energy in the frequency band between the two lower limits tested can lead to an incomplete characterization of model performance. This methodology was developed to aid in selecting a comparison frequency range that does not needlessly increase computational expense and does not exclude energy to the detriment of model performance analysis.

  17. A Model-Based Systems Engineering Methodology for Employing Architecture In System Analysis: Developing Simulation Models Using Systems Modeling Language Products to Link Architecture and Analysis

    Science.gov (United States)

    2016-06-01

    Only reference-list fragments of this document were captured, citing: a Naval Postgraduate School Ph.D. dissertation on space-filling designs for complex system simulations; MacCalman, Kwak, McDonald, and Upton (2015), "Capturing Experimental Design Insights in Support of the Model-Based Systems Engineering Approach"; and "Interpreting 'Systems Architecting'" (Systems Engineering 15(4), 2012).

  18. Methodological guidelines for developing accident modification functions

    DEFF Research Database (Denmark)

    Elvik, Rune

    2015-01-01

    This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines. An example is given of how to use the guidelines. The importance of exploratory analysis and an iterative approach in developing accident modification functions is stressed. The example shows that strict compliance with all the guidelines may be difficult, but represents a level of stringency that should be strived for. Currently the main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures. © 2015 Elsevier Ltd. All rights reserved.
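    As a minimal illustration of what such a function looks like (hypothetical study data, not Elvik's example), a log-linear accident modification function CMF(x) = exp(b·x) can be fitted to reported effect estimates by inverse-variance weighted least squares:

```python
import numpy as np

# Hypothetical meta-data: each study reports a measure "dose" x, an estimated
# crash modification factor (CMF), and its standard error on the log scale.
x   = np.array([0.5, 1.0, 1.5, 2.0, 3.0])
cmf = np.array([0.95, 0.90, 0.83, 0.80, 0.70])
se  = np.array([0.10, 0.08, 0.12, 0.09, 0.15])

w = 1.0 / se ** 2                              # inverse-variance weights
y = np.log(cmf)
b = np.sum(w * x * y) / np.sum(w * x ** 2)     # weighted LS slope through the origin

def amf(dose):
    """Fitted accident modification function CMF(dose) = exp(b * dose)."""
    return np.exp(b * dose)

print(round(amf(2.0), 3))                      # predicted CMF at dose 2.0
```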

  19. Development of a flight software testing methodology

    Science.gov (United States)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
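    A toy sketch of the assertion idea (not the original Ada watchdog system): executable assertions check a control parameter's range and, collaterally, a convergence property of another parameter; a deliberately injected sign error is then caught at run time.

```python
def control_step(alt, cmd, bug=False):
    """Toy altitude-hold update; bug=True injects a sign error in the gain."""
    gain = -0.1 if not bug else 0.1
    return alt + gain * (alt - cmd)

def assert_invariants(alt_prev, alt, cmd):
    """Executable assertions: a range check on the parameter itself, plus a
    convergence check that indirectly (collaterally) tests the gain."""
    ok = 0.0 <= alt <= 50000.0
    ok = ok and abs(alt - cmd) <= abs(alt_prev - cmd)   # tracking error must not grow
    return ok

def run(bug):
    alt, cmd = 1000.0, 900.0
    for _ in range(20):
        nxt = control_step(alt, cmd, bug)
        if not assert_invariants(alt, nxt, cmd):
            return True            # assertion detected the injected fault
        alt = nxt
    return False

print(run(bug=False), run(bug=True))
```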

  20. Extending statistical boosting. An overview of recent methodological developments.

    Science.gov (United States)

    Mayr, A; Binder, H; Gefeller, O; Schmid, M

    2014-01-01

    Boosting algorithms to simultaneously estimate and select predictor effects in statistical models have gained substantial interest during the last decade. This review highlights recent methodological developments regarding boosting algorithms for statistical modelling especially focusing on topics relevant for biomedical research. We suggest a unified framework for gradient boosting and likelihood-based boosting (statistical boosting) which have been addressed separately in the literature up to now. The methodological developments on statistical boosting during the last ten years can be grouped into three different lines of research: i) efforts to ensure variable selection leading to sparser models, ii) developments regarding different types of predictor effects and how to choose them, iii) approaches to extend the statistical boosting framework to new regression settings. Statistical boosting algorithms have been adapted to carry out unbiased variable selection and automated model choice during the fitting process and can nowadays be applied in almost any regression setting in combination with a large amount of different types of predictor effects.
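    A minimal sketch of componentwise gradient boosting with the intrinsic variable selection discussed above (synthetic data; a real analysis would use a dedicated package such as mboost): each iteration fits every single predictor to the current residuals and updates only the best one by a small step length, so uninformative predictors stay out of the model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: only predictors 0 and 4 are informative out of p = 10.
n, p = 200, 10
X = rng.normal(size=(n, p))
y = 3.0 * X[:, 0] - 2.0 * X[:, 4] + rng.normal(scale=0.1, size=n)

# Componentwise L2 boosting: the residuals are the negative gradient of the
# squared-error loss; only the best-fitting predictor is updated per step.
beta = np.zeros(p)
nu = 0.1                                        # step length
for _ in range(150):
    resid = y - X @ beta
    coefs = (X * resid[:, None]).sum(axis=0) / (X ** 2).sum(axis=0)  # univariate LS fits
    j = int(np.argmax(coefs ** 2 * (X ** 2).sum(axis=0)))            # largest SS reduction
    beta[j] += nu * coefs[j]

selected = np.nonzero(np.abs(beta) > 0.05)[0]
print(selected, np.round(beta[selected], 2))    # sparse model: only vars 0 and 4
```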

  1. NPA4K development system using object-oriented methodology

    International Nuclear Information System (INIS)

    Jeong, Kwang Seong; Hahn, Do Hee

    2000-11-01

    NPA4K consists of module programs with several components for various functions. Software components have to be developed systematically, according to partitioning criteria and a design method. In this paper, an overview of a typical object-oriented methodology, UML (Unified Modeling Language), the procedure for NPA4K program development, and the architecture for the long-term development of NPA4K are introduced.

  2. NPA4K development system using object-oriented methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwang Seong; Hahn, Do Hee

    2000-11-01

    NPA4K consists of module programs with several components for various functions. Software components have to be developed systematically, according to partitioning criteria and a design method. In this paper, an overview of a typical object-oriented methodology, UML (Unified Modeling Language), the procedure for NPA4K program development, and the architecture for the long-term development of NPA4K are introduced.

  3. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  4. Validating agent oriented methodology (AOM) for netlogo modelling and simulation

    Science.gov (United States)

    WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan

    2017-10-01

    AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent-oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are not many case studies or success stories for AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecological studies, and hence further validates AOM in a qualitative manner.

  5. CFD methodology of a model quadrotor

    Science.gov (United States)

    Sunan, Burak

    2013-11-01

    This paper presents an analysis of the aerodynamic characteristics of a quadrotor for both steady and unsteady flows. For steady flow cases, aerodynamic behaviour can be characterized readily for any aerial vehicle in a wind tunnel. However, unsteady flow conditions make experimental aerodynamic characterization in wind tunnels difficult. This article describes the determination of lift, drag and thrust forces on a model quadrotor using the CFD (Computational Fluid Dynamics) software ANSYS Fluent. A significant issue is to establish a CFD methodology that can be compared against experimental results. After reaching sufficiently close agreement with benchmarking experiments, the CFD methodology can be applied to more complicated geometries. In this paper, the propeller performance database experiments of Ref. 1 are used to validate the CFD procedure. The results of the study reveal the dynamic characteristics of a quadrotor and demonstrate the feasibility of designing a quadrotor by CFD, which saves time and cost compared to experiments.

  6. A methodology for PSA model validation

    International Nuclear Information System (INIS)

    Unwin, S.D.

    1995-09-01

    This document reports Phase 2 of work undertaken by Science Applications International Corporation (SAIC) in support of the Atomic Energy Control Board's Probabilistic Safety Assessment (PSA) review. A methodology is presented for the systematic review and evaluation of a PSA model. These methods are intended to support consideration of the following question: To within the scope and depth of modeling resolution of a PSA study, is the resultant model a complete and accurate representation of the subject plant? This question was identified as a key PSA validation issue in SAIC's Phase 1 project. The validation methods are based on a model transformation process devised to enhance the transparency of the modeling assumptions. Through conversion to a 'success-oriented' framework, a closer correspondence to plant design and operational specifications is achieved. This can both enhance the scrutability of the model by plant personnel and provide an alternative perspective on the model that may assist in the identification of deficiencies. The model transformation process is defined and applied to fault trees documented in the Darlington Probabilistic Safety Evaluation. A tentative process is outlined for implementation and documentation of a PSA review based on the proposed methods. (author). 11 refs., 9 tabs.
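    The 'success-oriented' conversion can be illustrated by taking the dual of a failure-logic tree via De Morgan's laws (toy events for illustration, not the Darlington model): AND gates become OR gates and vice versa, and basic failure events become basic success events.

```python
# A fault tree is either a basic-event string or a tuple (gate, [children]).
def to_success(node):
    """Recursively build the dual (success-oriented) tree of a fault-tree node."""
    if isinstance(node, str):                   # basic failure event -> success event
        return "NOT " + node
    op, children = node
    dual_op = "AND" if op == "OR" else "OR"     # De Morgan: swap the gate type
    return (dual_op, [to_success(c) for c in children])

# Hypothetical top event: failure if (pump A fails AND pump B fails) OR valve fails.
fault_tree = ("OR", [("AND", ["PUMP_A_FAILS", "PUMP_B_FAILS"]), "VALVE_FAILS"])
print(to_success(fault_tree))
# The dual reads: (pump A works OR pump B works) AND the valve works.
```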

  7. Demand Activated Manufacturing Architecture (DAMA) supply chain collaboration development methodology

    Energy Technology Data Exchange (ETDEWEB)

    PETERSEN,MARJORIE B.; CHAPMAN,LEON D.

    2000-03-15

    The Demand Activated Manufacturing Architecture (DAMA) project, during the last five years of work with the U.S. Integrated Textile Complex (retail, apparel, textile, and fiber sectors), has developed an inter-enterprise supply chain collaboration development methodology. The goal of this methodology is to enable a supply chain to work more efficiently and competitively. The outcomes of this methodology include: (1) a definitive description and evaluation of the role of business cultures and supporting business organizational structures in either inhibiting or fostering change to a more competitive supply chain; (2) "As-Is" and proposed "To-Be" supply chain business process models focusing on information flows and decision-making; and (3) software tools that enable and support a transition to a more competitive supply chain, which result from a business-driven rather than technologically driven approach to software design. This methodology development will continue in FY00 as DAMA engages companies in the soft goods industry in supply chain research and implementation of supply chain collaboration.

  8. METHODOLOGICAL DEVELOPMENTS IN 3D SCANNING AND MODELLING OF THE ARCHAEOLOGICAL FRENCH HERITAGE SITE: THE BRONZE AGE PAINTED CAVE OF "LES FRAUX", DORDOGNE (FRANCE)

    Directory of Open Access Journals (Sweden)

    A. Burens

    2013-07-01

    For six years, an interdisciplinary team of archaeologists, surveyors, environmentalists and archaeometrists has jointly carried out the study of a Bronze Age painted cave registered as a French Historical Monument. The archaeological cave of Les Fraux (Saint-Martin-de-Fressengeas, Dordogne) forms a wide network of galleries, characterized by the exceptional richness of its archaeological remains, such as ceramic and metal deposits, parietal representations and domestic fireplaces. This cave is the only protohistoric site in Europe that gathers testimonies of domestic, spiritual and artistic activities. Fortunately, the cave was closed at the end of the Bronze Age, following the collapse of its entrance. The site was re-discovered in 1989 and its study started in 2007. The study in progress takes place within a new kind of research facility founded by the CNRS's Institute of Ecology and Environment. The purpose of this observatory is the promotion of new methodologies and experimental studies in global ecology. In that framework, 3D models of the cave constitute the common working support and the best medium of scientific communication for the various studies conducted on the site by nearly forty researchers. In this specific context, a partnership between archaeologists and surveyors from INSA Strasbourg allows the team to develop, in an interdisciplinary way, new methods of data acquisition based on contact-free measurement techniques in order to obtain full 3D documentation. This work is conducted in compliance with the integrity of the site. Different techniques based on terrestrial laser scanning, digital photogrammetry and spatial imaging systems have been used to generate a geometric and photorealistic 3D model from the combination of point clouds and photogrammetric images, for both visualization and accurate documentation purposes. Various acquisition scales and resolutions have been applied according to the subject.

  9. Applying of component system development in object methodology, case study

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    Creating target software as a component system has been a strong requirement over the last 20 years of software development. Architectural components are self-contained units that present not only partial and overall system behavior, but also cooperate with each other on the basis of their interfaces. Among other things, components have allowed flexible modification of the processes whose behavior is the foundation of component behavior, without changing the life of the component system. On the other hand, a component system makes it possible, at design time, to create numerous new connections between components and thus create modified system behaviors. This enables company management to perform, at design time, required behavioral changes of processes in accordance with the requirements of changing production and markets. Software development, generally referred to as SDP (Software Development Process), has two directions. The first, called CBD (Component-Based Development), is dedicated to the development of component-based systems (CBS, Component-Based System); the second targets the development of software under the influence of SOA (Service-Oriented Architecture). Both directions are equipped with their own development methodologies. The subject of this paper is only the first direction and the application of component-based system development in object-oriented methodologies. The requirement today is to carry out the development of component-based systems within established object-oriented methodologies as a dominant style. In some well-known methodologies, however, this development is not completely transparent and is not even recognized as dominant. In some cases, it is corrected by special meta-integration models of component system development into an object methodology. This paper presents a case study.

  10. Transformations of summary statistics as input in meta-analysis for linear dose–response models on a logarithmic scale: a methodology developed within EURRECA

    Directory of Open Access Journals (Sweden)

    Souverein Olga W

    2012-04-01

    Background: To derive micronutrient recommendations in a scientifically sound way, it is important to obtain and analyse all published information on the association between micronutrient intake and biochemical proxies for micronutrient status using a systematic approach. It is therefore important to incorporate information from randomized controlled trials as well as observational studies, as both provide information on the association. However, original research papers present their data in various ways. Methods: This paper presents a methodology to obtain an estimate of the dose–response curve, assuming a bivariate normal linear model on the logarithmic scale, incorporating a range of transformations of the original reported data. Results: The simulation study, conducted to validate the methodology, shows that there is no bias in the transformations. Furthermore, it is shown that when the original studies report the mean and standard deviation or the geometric mean and confidence interval, the results are less variable than when the median with IQR or range is reported. Conclusions: The presented methodology, with transformations for various reported data, provides a valid way to estimate the dose–response curve for micronutrient intake and status using both randomized controlled trials and observational studies.
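    One such transformation can be sketched as follows (illustrative numbers; the paper's full set of transformations is broader): a study reporting the median and quartiles of intake is converted to an approximate mean and SD on the log scale, assuming intake is log-normal so that log-intake is normal.

```python
import math

Z75 = 0.6745   # z-score of the 75th percentile of the standard normal

def lognormal_summary(median, q1, q3):
    """Approximate mean and SD of log(X) from the median and quartiles of X,
    assuming X is log-normally distributed."""
    mu = math.log(median)                               # median of X -> mean of log X
    sigma = (math.log(q3) - math.log(q1)) / (2 * Z75)   # log-scale IQR / 1.349
    return mu, sigma

# Hypothetical reported intake: median 10, IQR [7, 14] (arbitrary units).
mu, sigma = lognormal_summary(median=10.0, q1=7.0, q3=14.0)
print(round(mu, 3), round(sigma, 3))
```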

  11. A methodology for acquiring qualitative knowledge for probabilistic graphical models

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders L.

    2004-01-01

    We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model development...

  12. Generalized equilibrium modeling: the methodology of the SRI-Gulf energy model. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Gazalet, E.G.

    1977-05-01

    The report documents the generalized equilibrium modeling methodology underlying the SRI-Gulf Energy Model, focusing entirely on the philosophical, mathematical, and computational aspects of the methodology. The model is a highly detailed regional and dynamic model of the supply and demand for energy in the US. The introduction emphasizes the need to focus modeling efforts on decisions and on the coordinated decomposition of complex decision problems using iterative methods. The conceptual framework is followed by a description of the structure of the current SRI-Gulf model and a detailed development of the process relations that comprise the model. The network iteration algorithm used to compute a solution to the model is described, and the overall methodology is compared with other modeling methodologies. 26 references.
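    The iterative-equilibrium idea can be illustrated with a single toy market cleared by damped price iteration (hypothetical linear curves, far simpler than the SRI-Gulf network, where prices and quantities propagate through a whole process network):

```python
# Assumed linear demand and supply curves for one energy market (illustrative).
def demand(p):
    return max(0.0, 100.0 - 2.0 * p)

def supply(p):
    return 3.0 * p

# Damped fixed-point iteration: raise the price while demand exceeds supply.
p = 1.0
for _ in range(200):
    excess = demand(p) - supply(p)
    p += 0.05 * excess

print(round(p, 2), round(demand(p), 1))   # converges to the market-clearing point
```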

  13. Development of mass and energy release analysis methodology

    International Nuclear Information System (INIS)

    Kim, Cheol Woo; Song, Jeung Hyo; Park, Seok Jeong; Kim, Tech Mo; Han, Kee Soo; Choi, Han Rim

    2009-01-01

    Recently, new approaches to accident analysis using realistic evaluation have been attempted. These new approaches provide more margin for plant safety, design, operation and maintenance. KREM (KEPRI Realistic Evaluation Methodology) for a large break loss-of-coolant accident (LOCA) is performed using the RELAP5/MOD3 computer code, which includes realistic evaluation models. KOPEC has developed KIMERA (KOPEC Improved Mass and Energy Release Analysis methodology), based on realistic evaluation, to improve the analysis method for mass and energy (M/E) release and to obtain adequate margin. KIMERA uses a simplified single code system, unlike conventional M/E release analysis methodologies. This simple code system reduces the computing effort, especially for LOCA analysis. The computer code system of this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which, like the KREM methodology, couples RELAP5/MOD3.1/K and CONTEMPT4/MOD5. The new methodology, KIMERA, based on the same engine as KREM, adopts conservative approaches for the M/E release such as a break spillage model, a multiplier on the heat transfer coefficient (HTC), and a long-term cooling model. KIMERA was developed based on a LOCA, applied to a main steam line break (MSLB), and approved by the Korean government. KIMERA can calculate the various transient stages of LOCAs in a single code system in one continuous run, and can perform the M/E release analysis during the long-term cooling period together with the containment pressure and temperature (P/T) response. The containment P/T analysis results are compared with those of the Ulchin Nuclear Power Plant Units 3 and 4 (UCN 3 and 4) FSAR, an OPR1000 (Optimized Power Reactor 1000) type nuclear power plant. The results for a large break LOCA and an MSLB are similar to those of the FSAR for UCN 3 and 4. However, the containment pressure during the post-blowdown period of a large break LOCA has a much lower second peak than the first peak. The resultant containment peak

  14. Methodology, models and algorithms in thermographic diagnostics

    CERN Document Server

    Živčák, Jozef; Madarász, Ladislav; Rudas, Imre J

    2013-01-01

    This book presents the methodology and techniques of thermographic applications, with a primary focus on medical thermography implemented for parametrizing the diagnostics of the human body. The first part of the book describes the basics of infrared thermography, the possibilities of thermographic diagnostics and the physical nature of thermography. The second half includes tools of intelligent engineering applied to the solving of selected applications and projects. Thermographic diagnostics was applied to the problematics of paraplegia and tetraplegia and carpal tunnel syndrome (CTS). The results of the research activities were created with the cooperation of four projects within the Ministry of Education, Science, Research and Sport of the Slovak Republic entitled Digital control of complex systems with two degrees of freedom, Progressive methods of education in the area of control and modeling of complex object oriented systems on aircraft turbocompressor engines, Center for research of control of te...

  15. Photovoltaic-system costing-methodology development. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1982-07-01

    Presented are the results of a study to expand the use of standardized costing methodologies in the National Photovoltaics Program. The costing standards, which include SAMIS for manufacturing costs and M and D for marketing and distribution costs, have been applied to concentrator collectors and power-conditioning units. The M and D model was also computerized. Finally, a uniform construction cost-accounting structure was developed for use in photovoltaic test and application projects. The appendices contain example cases which demonstrate the use of the models.

  16. Photogrammetry Methodology Development for Gossamer Spacecraft Structures

    Science.gov (United States)

    Pappa, Richard S.; Jones, Thomas W.; Black, Jonathan T.; Walford, Alan; Robson, Stuart; Shortis, Mark R.

    2002-01-01

    Photogrammetry--the science of calculating 3D object coordinates from images--is a flexible and robust approach for measuring the static and dynamic characteristics of future ultra-lightweight and inflatable space structures (a.k.a., Gossamer structures), such as large membrane reflectors, solar sails, and thin-film solar arrays. Shape and dynamic measurements are required to validate new structural modeling techniques and corresponding analytical models for these unconventional systems. This paper summarizes experiences at NASA Langley Research Center over the past three years to develop or adapt photogrammetry methods for the specific problem of measuring Gossamer space structures. Turnkey industrial photogrammetry systems were not considered a cost-effective choice for this basic research effort because of their high purchase and maintenance costs. Instead, this research uses mainly off-the-shelf digital-camera and software technologies that are affordable to most organizations and provide acceptable accuracy.

  17. Threat model framework and methodology for personal networks (PNs)

    DEFF Research Database (Denmark)

    Prasad, Neeli R.

    2007-01-01

    To be able to build a secure network, it is essential to model the threats to the network. A methodology for building a threat model has been proposed in the paper. Several existing threat models and methodologies will be compared to the proposed methodology. The aim of the proposed methodology i...... been used. Also risk assessment methods will be discussed. Threat profiles and vulnerability profiles have been presented....

  18. Fractional Order Modeling of Atmospheric Turbulence - A More Accurate Modeling Methodology for Aero Vehicles

    Science.gov (United States)

    Kopasakis, George

    2014-01-01

    The presentation covers a recently developed methodology to model atmospheric turbulence as disturbances for aero vehicle gust loads and for controls development, such as flutter and inlet shock position. The approach models atmospheric turbulence in its natural fractional-order form, which provides more accuracy than traditional methods like the Dryden model, especially for high-speed vehicles. The presentation provides a historical background on atmospheric turbulence modeling and the approaches utilized for air vehicles. This is followed by the motivation and the methodology utilized to develop the fractional-order atmospheric turbulence modeling approach. Some examples covering the application of this method are also provided, followed by concluding remarks.
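    The distinction between integer-order and fractional-order models can be seen numerically: the classical longitudinal Dryden spectrum rolls off with an integer-order slope of -2 at high frequency, whereas measured atmospheric turbulence follows the fractional -5/3 Kolmogorov law (illustrative parameter values below):

```python
import math

V, L, sigma = 200.0, 500.0, 1.0   # airspeed [m/s], scale length [m], intensity (assumed)

def dryden_psd(omega):
    """Longitudinal Dryden power spectral density."""
    return sigma ** 2 * (2 * L / (math.pi * V)) / (1 + (L * omega / V) ** 2)

# Log-log slope between two frequencies well above the spectral break.
w1, w2 = 10.0, 100.0
slope = math.log(dryden_psd(w2) / dryden_psd(w1)) / math.log(w2 / w1)
print(round(slope, 2))   # close to -2; a von Karman/fractional model gives -5/3
```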

  19. Applying Lean on Agile Scrum Development Methodology

    OpenAIRE

    SurendRaj Dharmapal; K. Thirunadana Sikamani

    2015-01-01

    This journal introduces the reader to Agile and Lean concepts and provides a basic level of understanding of each process. It also provides a brief background on applying Lean concepts to each phase of the agile Scrum methodology and summarizes their primary business advantages for delivering value to the customer.

  20. Monitoring sustainable biomass flows : General methodology development

    NARCIS (Netherlands)

    Goh, Chun Sheng; Junginger, Martin; Faaij, André

    Transition to a bio-based economy will create new demand for biomass, e.g. the increasing use of bioenergy, but the impacts on existing markets are unclear. Furthermore, there is a growing public concern on the sustainability of biomass. This study proposes a methodological framework for mapping

  1. The use and effectiveness of information system development methodologies in health information systems / Pieter Wynand Conradie.

    OpenAIRE

    Conradie, Pieter Wynand

    2010-01-01

    Abstract The main focus of this study is the identification of factors influencing the use and effectiveness of information system development methodologies (i.e., systems development methodologies) in health information systems. In essence, it can be viewed as exploratory research, utilizing a conceptual research model to investigate the relationships among the hypothesised factors. More specifically, classified as behavioural science, it combines two theoretical models, namely...

  2. Fatigue crack growth model RANDOM2 user manual. Appendix 1: Development of advanced methodologies for probabilistic constitutive relationships of material strength models

    Science.gov (United States)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    FORTRAN program RANDOM2 is presented in the form of a user's manual. RANDOM2 is based on fracture mechanics using a probabilistic fatigue crack growth model. It predicts the random lifetime of an engine component to reach a given crack size. Details of the theoretical background, input data instructions, and a sample problem illustrating the use of the program are included.
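    The underlying idea, a random component lifetime derived from probabilistic fatigue crack growth, can be sketched with a Paris-law model and a random material constant (illustrative values only, not RANDOM2's actual model or parameters):

```python
import math
import random

random.seed(1)

def paris_life(C, a0=1e-3, a_crit=5e-3, ds=100e6, m=3.0):
    """Cycles for a crack to grow from a0 to a_crit under the Paris law
    da/dN = C * dK^m with dK = ds * sqrt(pi * a), integrated in closed
    form (valid for m != 2)."""
    k = C * (m / 2 - 1) * (ds * math.sqrt(math.pi)) ** m
    return (a0 ** (1 - m / 2) - a_crit ** (1 - m / 2)) / k

# Scatter in the material constant C produces a random lifetime distribution.
lives = sorted(paris_life(random.lognormvariate(math.log(1e-30), 0.3))
               for _ in range(1000))
print(f"median predicted life ~ {lives[500]:.3g} cycles")
```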

  3. Fatigue strength reduction model: RANDOM3 and RANDOM4 user manual. Appendix 2: Development of advanced methodologies for probabilistic constitutive relationships of material strength models

    Science.gov (United States)

    Boyce, Lola; Lovelace, Thomas B.

    1989-01-01

    FORTRAN programs RANDOM3 and RANDOM4 are documented in the form of a user's manual. Both programs are based on fatigue strength reduction, using a probabilistic constitutive model. The programs predict the random lifetime of an engine component to reach a given fatigue strength. The theoretical backgrounds, input data instructions, and sample problems illustrating the use of the programs are included.

  4. Development of a new methodology for quantifying nuclear safety culture

    International Nuclear Information System (INIS)

    Han, Kiyoon; Jae, Moosung

    2017-01-01

    The present study developed a Safety Culture Impact Assessment Model (SCIAM), which consists of a safety culture assessment methodology and a safety culture impact quantification methodology. The SCIAM uses a safety culture impact index (SCII) to monitor the status of the safety culture of NPPs periodically, and it uses the relative core damage frequency (RCDF) to represent the impact of safety culture on the safety of NPPs. As a result of applying the SCIAM to the reference plant (Kori 3), a standard for the healthy safety culture of the reference plant is suggested. SCIAM might contribute to improving the safety of NPPs (nuclear power plants) by monitoring the status of safety culture periodically and presenting a standard for healthy safety culture.

  5. The SIMRAND methodology - Simulation of Research and Development Projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
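    The decision framing above can be sketched as a small Monte Carlo comparison (hypothetical numbers, not the SIMRAND model itself): each candidate task set yields an uncertain outcome, here a solar-cell cost, and the alternatives are ranked by the simulated probability of meeting the project goal.

```python
import random

random.seed(3)

# Assumed cost outcomes for two candidate task sets: (mean $/W, std).
alternatives = {
    "task_set_A": (1.0, 0.30),
    "task_set_B": (0.9, 0.45),
}
goal = 0.8   # target cost in $/W (illustrative)

# Probability each alternative meets the goal, estimated by simulation.
results = {
    name: sum(random.gauss(mu, sd) <= goal for _ in range(10_000)) / 10_000
    for name, (mu, sd) in alternatives.items()
}
print(results)
```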

  6. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    Oliveira, Laura Cristina de

    2003-01-01

    In the present work the viability of using the neutron activation analysis to perform urine and blood clinical analysis was checked. The aim of this study is to investigate the biological behavior of animals that has been fed with chow doped by natural uranium for a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentration on biological samples. The quantitative results of urine sediment using NAA were compared with the conventional clinical analysis and the results were compatible. This methodology was also used on bone and body organs such as liver and muscles to help the interpretation of possible anomalies. (author)

  7. Efficient Modelling Methodology for Reconfigurable Underwater Robots

    DEFF Research Database (Denmark)

    Nielsen, Mikkel Cornelius; Blanke, Mogens; Schjølberg, Ingrid

    2016-01-01

    This paper considers the challenge of applying reconfigurable robots in an underwater environment. The main result presented is the development of a model for a system comprised of N, possibly heterogeneous, robots dynamically connected to each other and moving with 6 Degrees of Freedom (DOF......). This paper presents an application of the Udwadia-Kalaba Equation for modelling the Reconfigurable Underwater Robots. The constraints developed to enforce the rigid connection between robots in the system is derived through restrictions on relative distances and orientations. To avoid singularities...... in the orientation and, thereby, allow the robots to undertake any relative configuration the attitude is represented in Euler parameters....

  8. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

The charter for adversarial delay is to hinder access to critical resources through the use of physical systems that increase an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating the times required to complete each task, with little regard to uncertainty, complexity, or the decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample sizes, expert judgment, human factors, and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results to make informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness at lower cost.
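The core idea, replacing a single worst-case sum of task times with a probability distribution over the total path delay, can be illustrated with a plain Monte Carlo convolution. The lognormal task-time model and all parameter names below are stand-in assumptions for illustration, not the report's Bayesian machinery:

```python
import math
import random


def simulate_path_delay(tasks, n_samples=20000, seed=1):
    """Monte Carlo sketch: each task's delay is lognormal with a given
    median and geometric standard deviation (gsd > 1); sample the total
    path delay and return the sorted samples for percentile queries."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_samples):
        t = 0.0
        for median, gsd in tasks:
            # lognormal draw with the given median and spread
            t += median * math.exp(rng.gauss(0.0, math.log(gsd)))
        totals.append(t)
    totals.sort()
    return totals


def percentile(sorted_xs, p):
    """p-th quantile (0 < p < 1) of pre-sorted samples."""
    return sorted_xs[min(len(sorted_xs) - 1, int(p * len(sorted_xs)))]
```

A decision-maker would then compare, say, the 5th percentile of delay against the required response time instead of a single worst-case number.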

  9. Modeling, methodologies and tools for molecular and nano-scale communications

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and to advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront of their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  10. Development of a computational methodology for internal dose calculations

    CERN Document Server

    Yoriyaz, H

    2000-01-01

    A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body and a more precise tool for radiation transport simulation. The present technique shows the capability to build a patient-specific phantom from tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods such as the MCNP-4B code. In order to utilize the segmented human anatomy as a computational model for the simulation of radiation transport, an interface program, SCMS, was developed to build the geometric configurations for the phantom from tomographic images. This procedure makes it possible to calculate not only average dose values but also the spatial distribution of dose in regions of interest. With the present methodology, absorbed fractions for photons and electrons in various organs of the Zubal segmented phantom were calculated and compared to those reported for the mathematical phantom.
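With a voxel-based phantom, the organ-level quantities reduce to bookkeeping over labelled voxels: once a transport run has tallied the energy deposited in each voxel, the absorbed fraction for an organ is the energy in its voxels divided by the emitted energy. A minimal sketch of that tally step (array names are illustrative; the transport itself, e.g. an MCNP-4B run, is assumed to have happened already):

```python
import numpy as np


def absorbed_fractions(organ_ids, edep, emitted_energy):
    """Given a voxel phantom (organ_ids: integer organ label per voxel)
    and a matching array of deposited energies (tallied by a Monte Carlo
    transport run), return {organ_id: absorbed fraction}."""
    fractions = {}
    for organ in np.unique(organ_ids):
        organ_energy = float(edep[organ_ids == organ].sum())
        fractions[int(organ)] = organ_energy / emitted_energy
    return fractions
```

The same masking trick also yields per-organ dose maps if the sum is replaced by the voxel-wise values divided by voxel mass.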

  11. Mixed-mode modelling mixing methodologies for organisational intervention

    CERN Document Server

    Clarke, Steve; Lehaney, Brian

    2001-01-01

    The 1980s and 1990s have seen a growing interest in research and practice in the use of methodologies within problem contexts characterised by a primary focus on technology, human issues, or power. During the last five to ten years, this has given rise to challenges regarding the ability of a single methodology to address all such contexts, and the consequent development of approaches which aim to mix methodologies within a single problem situation. This has been particularly so where the situation has called for a mix of technological (the so-called 'hard') and human-centred (so-called 'soft') methods. The approach developed has been termed mixed-mode modelling. The area of mixed-mode modelling is relatively new, with the phrase being coined approximately four years ago by Brian Lehaney in a keynote paper published at the 1996 Annual Conference of the UK Operational Research Society. Mixed-mode modelling, as suggested above, is a new way of considering problem situations faced by organisations. Traditional...

  12. Reference Management Methodologies for Large Structural Models at Kennedy Space Center

    Science.gov (United States)

    Jones, Corey; Bingham, Ryan; Schmidt, Rick

    2011-01-01

    There have been many challenges associated with modeling some of NASA KSC's largest structures. Given the size of the welded structures here at KSC, it was critically important to properly organize model structure and carefully manage references. Additionally, because of the amount of hardware to be installed on these structures, it was very important to have a means to coordinate between different design teams and organizations, check for interferences, produce consistent drawings, and allow for simple release processes. Facing these challenges, the modeling team developed a unique reference management methodology and model fidelity methodology. This presentation will describe the techniques and methodologies that were developed for these projects. The attendees will learn about KSC's reference management and model fidelity methodologies for large structures. The attendees will understand the goals of these methodologies. The attendees will appreciate the advantages of developing a reference management methodology.

  13. Safety-related operator actions: methodology for developing criteria

    International Nuclear Information System (INIS)

    Kozinsky, E.J.; Gray, L.H.; Beare, A.N.; Barks, D.B.; Gomer, F.E.

    1984-03-01

    This report presents a methodology for developing criteria for design evaluation of safety-related actions by nuclear power plant reactor operators, and identifies a supporting data base. It is the eleventh and final NUREG/CR report on the Safety-Related Operator Actions Program, conducted by Oak Ridge National Laboratory for the US Nuclear Regulatory Commission. The operator performance data were developed from training simulator experiments involving operator responses to simulated scenarios of plant disturbances; from field data on events with similar scenarios; and from task analytic data. A conceptual model to integrate the data was developed, and a computer simulation of the model was run using the SAINT modeling language. The report proposes a quantitative predictive model of operator performance, the Operator Personnel Performance Simulation (OPPS) Model, driven by task requirements, information presentation, and system dynamics. The model output, a probability distribution of predicted time to correctly complete safety-related operator actions, provides data for objective evaluation of quantitative design criteria.

  14. Reflood completion report: Volume 1. A phenomenological thermal-hydraulic model of hot rod bundles experiencing simultaneous bottom and top quenching and an optimization methodology for closure development

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.A. Jr.; Pimentel, D.A.; Jolly-Woodruff, S.; Spore, J.

    1998-04-01

    In this report, a phenomenological model of simultaneous bottom-up and top-down quenching is developed and discussed. The model was implemented in the TRAC-PF1/MOD2 computer code. Two sets of closure relationships were compared within the study, the Absolute set and the Conditional set. The Absolute set of correlations is frequently viewed as the pure set because the correlations utilize their original coefficients as suggested by the developer. The Conditional set is a modified set of correlations with changes to the correlation coefficients only. The two sets produce quite similar results. This report also summarizes initial results of an effort to investigate nonlinear optimization techniques applied to closure model development. Results suggest that such techniques can provide advantages for future model development work, but that extensive expertise is required to utilize them (i.e., the model developer must fully understand both the physics of the process being represented and the computational techniques being employed). The computer may then be used to improve the correlation of computational results with experiments.

  15. Methodological Grounds of Managing Innovation Development of Restaurants

    Directory of Open Access Journals (Sweden)

    Naidiuk V. S.

    2013-12-01

    Full Text Available The goal of the article lies in the identification and further development of methodological grounds for managing the innovation development of restaurants. Based on a critical analysis of existing scientific views on the essence of the "managing innovation development of an enterprise" notion, the article clarifies this definition. As a result of the study, the article builds a cause-effect diagram for the problem of ensuring efficient management of the innovation development of a restaurant. The article develops a conceptual scheme for devising and realising a strategy of innovation development in a restaurant. It experimentally confirms the hypothesis that there is a very strong feedback relationship between resistance to innovation changes and the share of qualified personnel in restaurants capable of permanent development (learning) and generation of new ideas, and it builds a model of the dependency between them. Prospects for further studies in this direction include scientific work directed at developing methodical approaches to identifying the level of innovation potential and assessing the efficiency of managing the innovation development of different (by type, class, size, etc.) restaurants. The obtained data could also be used to develop new, or improve existing, tools of strategic management of innovation development at the micro-level.

  16. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component in software evolution. It requires a huge amount of time, manpower, and financial resources. The challenges are the size, seniority and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result the model layer will be the central part in further e

  17. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Full Text Available Today, organizations are exposed to a huge volume and diversity of information and information assets, produced in different systems such as KMS, financial and accounting systems, official and industrial automation systems and so on, and protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. Several benefits of this model cause organizations to have a strong tendency toward implementing cloud computing. Maintaining and managing information security is the main challenge in developing and accepting this model. In this paper, first, following the "design science research methodology" and compatible with the "design process in information systems research", a complete categorization of organizational assets, including 355 different types of information assets in 7 groups and 3 levels, is presented so that managers are able to plan corresponding security controls according to the importance of each group. Then, to direct the organization in architecting its information security in a cloud computing environment, an appropriate methodology is presented. The presented cloud computing security architecture, resulting from the proposed methodology, and the presented classification model are discussed and verified according to the Delphi method and expert comments.

  18. Bioclim Deliverable D8b: development of the physical/statistical down-scaling methodology and application to climate model Climber for BIOCLIM Work-package 3

    International Nuclear Information System (INIS)

    2003-01-01

    too coarse and simplified. This is why we first need to find these 'physically based' relations between large-scale model outputs and regional-scale predictors. This is a solution to the specific problem of down-scaling from an intermediate-complexity model such as CLIMBER. There are several other types of down-scaling methodologies, such as the dynamical and rule-based methods presented in other BIOCLIM deliverables. A specificity of the present method is its attempt to use physical considerations in the down-scaling while a detailed 'dynamical' approach is out of reach, because CLIMBER mainly provides the average climate. By contrast, an input of time variability at various scales is necessary for a more dynamical approach. This report is organised as follows: Section 2 relates to the design and validation of the method, while section 3 reports the application to BIOCLIM simulations. We first present the data sources employed, which are the model results and the observed climatology. We then present the principles of the down-scaling method, the formulation of the predictors and the calibration of the statistical model, including results for the last glacial maximum. In section 3, the results are first presented as time series for each site, then as maps at specific times, or snapshots
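The statistical half of such a scheme is, at its simplest, a regression calibrated between large-scale model outputs (predictors) and the observed local climatology. A least-squares sketch of that calibration step follows; it is purely illustrative, since the actual BIOCLIM predictors are physically derived rather than generic columns:

```python
import numpy as np


def calibrate_downscaling(X_large, y_local):
    """Fit y_local = X_large @ beta + intercept by least squares:
    the statistical step of a physical/statistical down-scaling scheme
    (X_large holds large-scale model outputs, y_local the observed
    local-scale climatology)."""
    X1 = np.column_stack([X_large, np.ones(len(X_large))])
    coef, *_ = np.linalg.lstsq(X1, y_local, rcond=None)
    return coef


def downscale(X_large, coef):
    """Apply a calibrated model to new large-scale outputs."""
    X1 = np.column_stack([X_large, np.ones(len(X_large))])
    return X1 @ coef
```

Calibration against the present-day climatology, followed by application to palaeo-simulations (e.g. the last glacial maximum), is exactly the workflow the deliverable describes, with physically motivated predictors in place of raw columns.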

  19. Development of intelligent model for personalized guidance on wheelchair tilt and recline usage for people with spinal cord injury: methodology and preliminary report.

    Science.gov (United States)

    Fu, Jicheng; Jones, Maria; Jan, Yih-Kuen

    2014-01-01

    Wheelchair tilt and recline functions are two of the most desirable features for relieving seating pressure to decrease the risk of pressure ulcers. Effective guidance on wheelchair tilt and recline usage is therefore critical to pressure ulcer prevention. The aim of this study was to demonstrate the feasibility of using machine learning techniques to construct an intelligent model that provides personalized guidance to individuals with spinal cord injury (SCI). The motivation stems from the clinical evidence that the requirements of individuals vary greatly and that no universal guidance on tilt and recline usage could possibly satisfy all individuals with SCI. We explored all aspects involved in constructing the intelligent model and proposed approaches tailored to the characteristics of this preliminary study, such as the way of modeling research participants, the use of machine learning techniques to construct the intelligent model, and the evaluation of its performance. We further improved the intelligent model's prediction accuracy by developing a two-phase feature selection algorithm to identify important attributes. Experimental results demonstrated that our approaches held promise: they could effectively construct the intelligent model, evaluate its performance, and refine the participant model so that the intelligent model's prediction accuracy was significantly improved.
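A two-phase feature selection of the kind described, a cheap filter pass followed by a wrapper pass evaluated against the actual predictor, can be sketched as below. The correlation filter, the 1-nearest-neighbour predictor, and all names are stand-ins; the paper's own attributes and learner are not reproduced here.

```python
import numpy as np


def loo_1nn_accuracy(X, y):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)          # never match a point to itself
    return float(np.mean(y[d.argmin(axis=1)] == y))


def two_phase_select(X, y, n_filter=5):
    """Phase 1 (filter): keep the n_filter features most correlated with
    the label.  Phase 2 (wrapper): greedy forward selection maximizing
    leave-one-out 1-NN accuracy.  Returns (selected indices, accuracy)."""
    corr = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    candidates = list(np.argsort(corr)[::-1][:n_filter])
    selected, best = [], 0.0
    improved = True
    while improved and candidates:
        improved = False
        for j in list(candidates):       # try adding each remaining feature
            acc = loo_1nn_accuracy(X[:, selected + [j]], y)
            if acc > best:
                best, pick, improved = acc, j, True
        if improved:
            selected.append(pick)
            candidates.remove(pick)
    return selected, best
```

The filter pass keeps the wrapper's expensive inner loop small, which is the usual motivation for splitting selection into two phases on small clinical datasets.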

  20. The Typology of Methodological Approaches to Development of Innovative Clusters

    Directory of Open Access Journals (Sweden)

    Farat Olexandra V.

    2017-06-01

    Full Text Available The aim of the article is to study the existing methodological approaches to assessing the development of enterprises, in order to further substantiate the possibilities of their use by cluster associations. As a result of the research, based on an analysis of the scientific literature, the most applicable methodological approaches to assessing the development of enterprises are characterized. 8 methodical approaches to assessing the level of development of enterprises and 4 methodological approaches to assessing the level of development of clusters are singled out. Each of the approaches has certain advantages and disadvantages, but none of them allows one to obtain a systematic assessment of all areas of cluster functioning, identify possible reserves for cluster competitiveness growth, or characterize possible strategies for future development. Taking into account the peculiarities of the functioning and development of cluster associations of enterprises, we propose our own methodological approach for assessing the development of innovative cluster structures.

  1. Development of enterprise architecture management methodology for teaching purposes

    Directory of Open Access Journals (Sweden)

    Dmitry V. Kudryavtsev

    2017-01-01

    Full Text Available Enterprise architecture is considered as a certain object of management, providing in business a general view of the enterprise and the mutual alignment of parts of this enterprise into a single whole, and as the discipline that arose based on this object. The architectural approach to the modeling and design of the enterprise originally arose in the field of information technology and was used to design information systems and technical infrastructure, as well as to formalize business requirements. Since the early 2000s, enterprise architecture has increasingly been used in organizational development and business transformation projects, especially if information technologies are involved. Enterprise architecture allows describing, analyzing and designing the company from the point of view of its structure, functioning and goal setting (motivation). In the context of this approach, the enterprise is viewed as a system of services, processes, goals and performance indicators, organizational units, information systems, data, technical facilities, etc. Enterprise architecture implements the idea of a systematic approach to managing and changing organizations in the digital economy, where business is strongly dependent on information technologies. This increases the relevance of the suggested approach at the present time, when companies need to create and successfully implement a digital business strategy. Teaching enterprise architecture in higher educational institutions is a difficult task due to the interdisciplinary nature of this subject, its generalized character and close connection with practical experience.
In addition, modern enterprise architecture management methodologies are complex for students and contain many details that are relevant for individual situations. The paper proposes a simplified methodology for enterprise architecture management, which on the one hand will be comprehensible to students, and on the other hand will allow students to apply

  2. Thermal ecological risk assessment - methodology for modeling

    International Nuclear Information System (INIS)

    Markandeya, S.G.

    2007-01-01

    Discharge of hot effluents into natural water bodies is a potential risk to aquatic life. The stipulations imposed by the MoEF, Government of India, for protecting the environment are in place. However, due to a lack of quality scientific information, these stipulations are generally conservative in nature and hence questionable. A Coordinated Research Project on Thermal Ecological Studies, successfully completed recently, came out with the suggestion of implementing a multi-factorially estimated mixing-zone concept. In the present paper, a risk-based assessment methodology is proposed as an alternative approach. The methodology is presented only conceptually and briefly; further refinement may be necessary. The methodology would enable accounting for variations in plant operational conditions, climatic conditions, and the geographical and hydraulic characteristics of the water body in a suitable manner. (author)

  3. A methodology for overall consequence modeling in chemical industry

    International Nuclear Information System (INIS)

    Arunraj, N.S.; Maiti, J.

    2009-01-01

    Risk assessment in the chemical process industry is a very important issue for safeguarding humans and the ecosystem from damage. Consequence assessment is an integral part of risk assessment. However, the commonly used consequence estimation methods involve either time-consuming complex mathematical models or simple summation of losses that ignores important consequence factors, which degrades the quality of the estimated risk value. Consequence modeling therefore has to be performed in detail, considering all major losses in reasonable time, to improve the decision value of the risk estimate. The losses can be broadly categorized into production loss, asset loss, human health and safety loss, and environmental loss. In this paper, a conceptual framework is developed to assess the overall consequence considering all the important components of major losses. Secondly, a methodology is developed for calculating all the major losses, which are normalized to yield the overall consequence. Finally, as an illustration, the proposed methodology is applied to a case study plant involving benzene extraction. The case study results from the proposed consequence assessment scheme are compared with those from existing methodologies.
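The normalize-then-aggregate step described, bringing production, asset, human and environmental losses onto one scale before combining them, can be sketched as follows. The weights, caps and category names are illustrative assumptions, not the paper's calibrated values:

```python
def overall_consequence(losses, max_losses, weights=None):
    """Normalize each major loss category (e.g. production, assets,
    human, environment) by a plausible maximum for the site, then
    combine with weights into a single overall-consequence score
    in [0, 1].  Equal weights are used when none are given."""
    if weights is None:
        weights = {k: 1.0 / len(losses) for k in losses}
    score = 0.0
    for category, value in losses.items():
        # cap each normalized loss at 1 so one category cannot exceed its scale
        score += weights[category] * min(1.0, value / max_losses[category])
    return score
```

Normalization is what makes monetary losses ($) and human health losses (injuries) commensurable; the choice of caps and weights is where site-specific judgment enters.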

  4. Cooperative learning as a methodology for inclusive education development

    Directory of Open Access Journals (Sweden)

    Yolanda Muñoz Martínez

    2017-06-01

    Full Text Available This paper presents the methodology of cooperative learning as a strategy to develop the principles of inclusive education. It has a very practical orientation, with the intention of providing tools for teachers who want to implement this methodology in the classroom, starting with a theoretical review, followed by a description of a case in which this methodology has been applied for 5 years. We describe specific activities and ways of working with students, and close with conclusions on the implementation of the methodology.

  5. Methodology Development for Advocate Team Use for Input Evaluation.

    Science.gov (United States)

    Reinhard, Diane L.

    Methodology for input evaluation, as defined by Daniel L. Stufflebeam, is relatively nonexistent. Advocate teams have recently become a popular means of generating and assessing alternative strategies for a set of objectives. This study was undertaken to develop and evaluate methodology for advocate team use in input evaluation. Steps taken…

  6. MoPCoM Methodology: Focus on Models of Computation

    Science.gov (United States)

    Koudri, Ali; Champeau, Joël; Le Lann, Jean-Christophe; Leilde, Vincent

    Today, developments of Real Time Embedded Systems have to face new challenges. On the one hand, Time-To-Market constraints require a reliable development process allowing quick design space exploration. On the other hand, rapidly developing technology, as stated by Moore's law, requires techniques to handle the resulting productivity gap. In a previous paper, we presented our Model Based Engineering methodology addressing those issues. In this paper, we focus on Models of Computation design and analysis. We illustrate our approach on a Cognitive Radio System development implemented on an FPGA. This work is part of the MoPCoM research project gathering academic and industrial organizations (http://www.mopcom.fr).

  7. METHODOLOGICAL BASES OF PUBLIC ADMINISTRATION OF PUBLIC DEVELOPMENT IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Kyrylo Ohdanskyi

    2016-11-01

    Full Text Available In this article the author examines the theoretical bases of the dynamics of community development. According to classic canons, a dynamic process at any level of the management hierarchy can be presented as a complex of changes in its ecological, economic and social components. Today, national policy in the field of realizing the conception of community development does not take into account most theoretical works, which testify that in our country the mechanism of its effective adjustment has not yet been created. In this connection, the author emphasizes the necessity of using, in modern Ukrainian realities, effective approaches to the government control of community development. As the subject of research the author chose the analysis of the process of community development and the methodological bases for choosing variants of managing this process. The systems approach is chosen as the research methodology. The aim: analysis of theoretical bases and development of new approaches to the government administration of community development. The author divides the process of community development into its social, economic and ecological components. From the indicated premises it is necessary to take into account the objective necessity of developing new conceptual approaches to the elaboration of tools for adjusting community development. To address this task, the author suggests using the category of "dynamics". The author analyses different interpretations of the term "dynamics" and offers his own interpretation in the context of community development. Our research confirms that it is methodologically possible to form blocks of quantitative and qualitative factors from specific information of an ecological, economic and social character. The author's research confirms that it is methodologically

  8. Development of Engine Loads Methodology, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR seeks to improve the definition of design loads for rocket engine components such that higher performing, lighter weight engines can be developed more...

  9. A methodology for development of biocatalytic processes

    DEFF Research Database (Denmark)

    Lima Ramos, Joana

    As a relatively new technology, biocatalytic processes often do not immediately fulfil the required process metrics that are key for an economically and/or environmentally competitive process at an industrial scale (high concentration, high reaction yield, high space-time yield and high biocatalyst yield). These process metrics can often be attained by improvements in the reaction chemistry, the biocatalyst, and/or by process engineering, which often requires a complex process development strategy. Interestingly this complexity, which arises from the need for integration of biological and process technologies … in process development is selecting between different process alternatives. The development effort for a novel process is considerable and thus an increasing number of conceptual process design methods are now applied in chemical industries. Since the natural environment of the biocatalyst is often very…

  10. A Review of Roads Data Development Methodologies

    Directory of Open Access Journals (Sweden)

    Taro Ubukawa

    2014-05-01

    Full Text Available There is a clear need for a public domain data set of road networks with high spatial accuracy and global coverage for a range of applications. The Global Roads Open Access Data Set (gROADS), version 1, is a first step in that direction. gROADS relies on data from a wide range of sources and was developed using a range of methods. Traditionally, map development was highly centralized and controlled by government agencies due to the high cost of required expertise and technology. In the past decade, however, high-resolution satellite imagery and global positioning system (GPS) technologies have come into wide use, and there has been significant innovation in web services, such that a number of new methods to develop geospatial information have emerged, including automated and semi-automated road extraction from satellite/aerial imagery and crowdsourcing. In this paper we review the data sources, methods, and pros and cons of a range of road data development methods: heads-up digitizing, automated/semi-automated extraction from remote sensing imagery, GPS technology, crowdsourcing, and compiling existing data sets. We also consider the implications of each method for the production of open data.

  11. Methodology and basic algorithms of the Livermore Economic Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.B.

    1981-03-17

    The methodology and the basic pricing algorithms used in the Livermore Economic Modeling System (EMS) are described. The report explains the derivations of the EMS equations in detail; however, it could also serve as a general introduction to the modeling system. A brief but comprehensive explanation of what EMS is and does, and how it does it is presented. The second part examines the basic pricing algorithms currently implemented in EMS. Each algorithm's function is analyzed and a detailed derivation of the actual mathematical expressions used to implement the algorithm is presented. EMS is an evolving modeling system; improvements in existing algorithms are constantly under development and new submodels are being introduced. A snapshot of the standard version of EMS is provided and areas currently under study and development are considered briefly.

  12. MODEL-Based Methodology for System of Systems Architecture Development with Application to the Recapitalization of the Future Towing and Salvage Platform

    Science.gov (United States)

    2008-09-01


  13. Modelization of physical phenomena in research reactors with the help of new developments in transport methods, and methodology validation with experimental data

    International Nuclear Information System (INIS)

    Rauck, St.

    2000-10-01

    The aim of this work is to develop a calculation scheme for experimental reactors, based on transport equations. This type of reactor is characterized by a small core; a complex, very heterogeneous geometry; and large leakage. The possible insertion of neutron beams in the reflector and the presence of absorbers in the core increase the difficulty of the 3D geometrical description and the physical modeling of the reactor's component parameters. The Orphee reactor has been chosen for our study. Physical models (homogenization, collapsing of cross sections into few groups, multigroup albedo conditions) have been developed in the APOLLO2 and CRONOS2 codes to calculate flux and power maps in a 3D geometry, at different burnups, through transport equations. Comparisons with experimental measurements have shown the interest of taking into account anisotropy and steep flux gradients by using Sn methods, and of using a 12-group cross-section library. The modeling of neutron beams has been done outside the core modeling, through Monte Carlo calculations on the total geometry, including a large thickness of heavy water. Thanks to these calculations, one can evaluate the neutron beams' anti-reactivity and determine the core cycle. We believe these methods, more accurate than the usual transport-diffusion calculations, will be used for the design of new research reactors. (author)
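The "collapsing of cross sections into few groups" step mentioned above is, in its simplest scalar form, flux-weighted (spectrum-weighted) condensation: Σ_G = Σ_{g∈G} σ_g φ_g / Σ_{g∈G} φ_g. A sketch of that weighting is shown below; it is pure illustration, since APOLLO2's actual condensation also treats scattering matrices and leakage, not just scalar cross sections:

```python
def collapse_cross_sections(sigma, flux, group_map):
    """Flux-weighted condensation of a fine-group cross-section set into
    broad groups:  Sigma_G = sum_{g in G}(sigma_g * phi_g) / sum_{g in G}(phi_g).
    group_map[g] gives the broad-group index of fine group g; the weighting
    spectrum phi is typically taken from a fine-group lattice calculation."""
    n_broad = max(group_map) + 1
    num = [0.0] * n_broad
    den = [0.0] * n_broad
    for g, G in enumerate(group_map):
        num[G] += sigma[g] * flux[g]
        den[G] += flux[g]
    return [n / d for n, d in zip(num, den)]
```

The weighting preserves the fine-group reaction rate within each broad group, which is the criterion behind the 12-group library choice discussed in the abstract.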

  14. Development of risk-based decision methodology for facility design.

    Science.gov (United States)

    2014-06-01

    This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides illustrative examples for the use of the proposed framework. An overview of the current practices and applications to illustrate t...

  15. Development of a design methodology for asphalt treated mixtures.

    Science.gov (United States)

    2013-12-01

    This report summarizes the results of a study that was conducted to develop a simplified design methodology for asphalt treated mixtures that are durable, stable, constructible, and cost effective through the examination of the performance of mix...

  16. Human-Systems Integration (HSI) Methodology Development for NASA Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A technology with game-changing potential for crew to space system interaction will be selected for development using the HSI Methodology created through the efforts of...

  17. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...
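The exceedance probability estimation mentioned for the second chapter reduces, at a single location, to evaluating the Gaussian tail of the process posterior. A minimal sketch, assuming the Gaussian process posterior is already summarized by a mean and standard deviation (the function name is illustrative, not from the book):

```python
import math

def exceedance_probability(mu, sigma, threshold):
    """P(field > threshold) at one location, given a Gaussian process
    posterior with mean mu and standard deviation sigma.
    Uses the standard normal CDF: 1 - Phi((threshold - mu) / sigma)."""
    z = (threshold - mu) / sigma
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))
```

In a sensor-network setting this would be evaluated at each reconstruction point of the field, using the GP posterior computed from the observed node readings.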

  18. Development of economic consequence methodology for process risk analysis.

    Science.gov (United States)

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
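The loss-function step above can be illustrated with an inverted normal loss function; the parameterization below (target, shape, maximum loss) is an assumed textbook form, not necessarily the authors' revised variant:

```python
import math

def inverted_normal_loss(y, target, shape, max_loss):
    """Inverted normal loss function (assumed textbook form): zero loss
    at the target value, rising smoothly toward max_loss as the
    process deviation |y - target| grows."""
    return max_loss * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * shape ** 2)))

def total_loss(deviations, loss_funcs):
    """Step 4 of the methodology: integrate (here, sum) the losses
    contributed by each process deviation in a scenario."""
    return sum(f(y) for f, y in zip(loss_funcs, deviations))
```

Each deviation identified in the scenario would get its own loss function (e.g. one for production loss, one for asset damage), and the total gives the economic consequence term of the risk estimate.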

  19. A generalized methodology to characterize composite materials for pyrolysis models

    Science.gov (United States)

    McKinnon, Mark B.

    The predictive capabilities of computational fire models have improved in recent years such that models have become an integral part of many research efforts. Models improve the understanding of the fire risk of materials and may decrease the number of expensive experiments required to assess the fire hazard of a specific material or designed space. A critical component of a predictive fire model is the pyrolysis sub-model that provides a mathematical representation of the rate of gaseous fuel production from condensed phase fuels given a heat flux incident to the material surface. The modern, comprehensive pyrolysis sub-models that are common today require the definition of many model parameters to accurately represent the physical description of materials that are ubiquitous in the built environment. Coupled with the increase in the number of parameters required to accurately represent the pyrolysis of materials is the increasing prevalence in the built environment of engineered composite materials that have never been measured or modeled. The motivation behind this project is to develop a systematic, generalized methodology to determine the requisite parameters to generate pyrolysis models with predictive capabilities for layered composite materials that are common in industrial and commercial applications. This methodology has been applied to four common composites in this work that exhibit a range of material structures and component materials. The methodology utilizes a multi-scale experimental approach in which each test is designed to isolate and determine a specific subset of the parameters required to define a material in the model. Data collected in simultaneous thermogravimetry and differential scanning calorimetry experiments were analyzed to determine the reaction kinetics, thermodynamic properties, and energetics of decomposition for each component of the composite. Data collected in microscale combustion calorimetry experiments were analyzed to

  20. Development of a Methodology for Predicting Forest Area for Large-Area Resource Monitoring

    Science.gov (United States)

    William H. Cooke

    2001-01-01

    The U.S. Department of Agriculture, Forest Service, Southern Research Station, appointed a remote-sensing team to develop an image-processing methodology for mapping forest lands over large geographic areas. The team has presented a repeatable methodology, which is based on regression modeling of Advanced Very High Resolution Radiometer (AVHRR) and Landsat Thematic...

  1. Advanced Power Plant Development and Analyses Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    G.S. Samuelsen; A.D. Rao

    2006-02-06

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include ''Zero Emission'' power plants and the ''FutureGen'' H{sub 2} co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the ''Vision 21'' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  2. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  3. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    Given that the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartmental containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC could provide reasonable containment pressure and temperature responses, considering the inherent conservatism of the CONTEMPT-LT code

  4. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    Given that the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartmental containment performance analysis. With the developed GOTHIC methodology, its applicability is verified for containment performance analysis of Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC could provide reasonable containment pressure and temperature responses, considering the inherent conservatism of the CONTEMPT-LT code.

  5. METHODOLOGICAL GUIDELINES FOR THE TRANSPROFESSIONALISM DEVELOPMENT AMONG VOCATIONAL EDUCATORS

    Directory of Open Access Journals (Sweden)

    E. F. Zeer

    2017-01-01

    Full Text Available Introduction. Nowadays, regarding the 6th wave of technological innovations and the emergence of the phenomenon of «transfession», there is a need for modernization of vocational staff training in our country. Transfession is a type of labour activity realized on the basis of synthesis and convergence of professional competences that involve different specialized areas. Thus, the authors of the present article propose to use the professional and educational platform developed by them, taking into account a specialist’s training specialty. The aims of the article are the following: to describe the phenomenon of «transprofessionalism» and determine the initial attitudes towards its understanding; to present the block-modular model of the platform for the formation of transprofessionalism among teachers of the vocational school. Methodology and research methods. The research is based on the following theoretical and scientific methods: analysis, synthesis, concretization, generalization; the hypothetical-deductive method; the project-based method. The projecting of the transprofessionalism platform model was constructed on the basis of multidimensional, transdisciplinary, network and project approaches. Results and scientific novelty. The relevance of the discussed phenomenon in the productive-economic sphere is proved. Transprofessionalism requires a brand new content-informative and technological training of specialists. In particular, the concept «profession» has lost its original meaning as an area of the social division of labour during the socio-technological development of the Russian economy. Therefore, transprofessionals are becoming more competitive and demanded in the employment market, being capable of performing a wide range of specialized types of professional activities. The structure, principles and mechanisms of the professional-educational platform functioning for transprofessionalism formation among the members of professional

  6. A methodology for modeling barrier island storm-impact scenarios

    Science.gov (United States)

    Mickey, Rangley C.; Long, Joseph W.; Plant, Nathaniel G.; Thompson, David M.; Dalyander, P. Soupy

    2017-02-16

    A methodology for developing a representative set of storm scenarios based on historical wave buoy and tide gauge data for a region at the Chandeleur Islands, Louisiana, was developed by the U.S. Geological Survey. The total water level was calculated for a 10-year period and analyzed against existing topographic data to identify when storm-induced wave action would affect island morphology. These events were categorized on the basis of the threshold of total water level and duration to create a set of storm scenarios that were simulated, using a high-fidelity, process-based, morphologic evolution model, on an idealized digital elevation model of the Chandeleur Islands. The simulated morphological changes resulting from these scenarios provide a range of impacts that can help coastal managers determine resiliency of proposed or existing coastal structures and identify vulnerable areas within those structures.
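The thresholding-and-duration categorization described above can be sketched as a simple event extraction over a total-water-level time series; the specific threshold/duration criterion and function names below are illustrative assumptions, not the USGS parameters:

```python
def storm_events(twl_series, threshold, min_duration):
    """Group consecutive samples where total water level (TWL) exceeds
    a morphology-impact threshold; keep only exceedances that persist
    for at least `min_duration` samples (illustrative criterion).
    Returns (start, end) index pairs, end exclusive."""
    events, start = [], None
    for i, twl in enumerate(twl_series):
        if twl >= threshold and start is None:
            start = i                      # exceedance begins
        elif twl < threshold and start is not None:
            if i - start >= min_duration:  # long enough to matter
                events.append((start, i))
            start = None
    if start is not None and len(twl_series) - start >= min_duration:
        events.append((start, len(twl_series)))  # event runs to series end
    return events
```

Events extracted this way could then be binned by peak level and duration to form the representative scenario set that drives the morphologic model runs.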

  7. Physical protection evaluation methodology program development and application

    International Nuclear Information System (INIS)

    Seo, Janghoon; Yoo, Hosik

    2015-01-01

    It is essential to develop a reliable physical protection evaluation methodology for applying the physical protection concept at the design stage. The methodology can be used to assess weak points and improve performance not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not a trivial task, since there are many interconnected factors affecting overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. INPRO (The International Project on Innovative Nuclear Reactors and Fuel Cycles) and the GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodology are among the most well-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has a strong point in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, the development of the COMPRE program and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, vital area analysis and a realistic threat scenario assessment are required.

  8. Prometheus Reactor I&C Software Development Methodology, for Action

    Energy Technology Data Exchange (ETDEWEB)

    T. Hamilton

    2005-07-30

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I&C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I&C Software Development Process Manual and Reactor Module Software Development Plan to NR for information.

  9. Prometheus Reactor I and C Software Development Methodology, for Action

    International Nuclear Information System (INIS)

    T. Hamilton

    2005-01-01

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I and C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I and C Software Development Process Manual and Reactor Module Software Development Plan to NR for information

  10. Threat model framework and methodology for personal networks (PNs)

    DEFF Research Database (Denmark)

    Prasad, Neeli R.

    2007-01-01

    is to give a structured, convenient approach for building threat models. A framework for the threat model is presented with a list of requirements for the methodology. The methodology will be applied to build a threat model for Personal Networks. Practical tools like UML sequence diagrams and attack trees have been used. Also risk assessment methods will be discussed. Threat profiles and vulnerability profiles have been presented.
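Attack trees of the kind mentioned are typically evaluated bottom-up; the following generic sketch (not the paper's model) scores a tree whose leaves carry attack success probabilities and whose internal AND/OR nodes combine independent children:

```python
def attack_probability(node):
    """Evaluate an attack tree bottom-up (illustrative model):
    leaves are success probabilities; an AND node succeeds only if
    all children succeed, an OR node if any child succeeds
    (children assumed independent)."""
    if isinstance(node, (int, float)):
        return float(node)
    op, children = node[0], node[1:]
    probs = [attack_probability(c) for c in children]
    if op == "AND":
        p = 1.0
        for q in probs:
            p *= q          # all sub-attacks must succeed
        return p
    # OR: succeeds unless every child fails
    p_fail = 1.0
    for q in probs:
        p_fail *= (1.0 - q)
    return 1.0 - p_fail
```

A tree is then written as nested tuples, e.g. `("OR", ("AND", 0.9, 0.9), 0.1)` for "either both steps of path one, or the single step of path two".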

  11. Applying of component system development in object methodology

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    software system and referred to as a software alliance. Both of these publications deliver a deep philosophy of the relevant issues relating to SWC/SWA, such as creating copies of components (cloning), the establishment and destruction of components at software run-time (dynamic reconfiguration), cooperation of autonomous components, and programmable management of a component's interface depending on internal component functionality and customer requirements (functionality, security, versioning). Nevertheless, even today we can meet numerous cases of SWC/SWA existence with a highly developed architecture that accepts the vast majority of these requests. On the other hand, the development practice of component-based systems with a dynamic architecture (i.e. architecture with dynamic reconfiguration) and, finally, with a mobile architecture (i.e. architecture with dynamic component mobility) confirms the inadequacy of the design methods contained in UML 2.0. This is proved especially by the dissertation thesis (Rych, Weis, 2008). Software Engineering currently has two different approaches to SWC/SWA systems. The first approach is known as component-oriented software development, CBD (Component Based Development). According to (Szyper, 2002), this is a collection of CBD methodologies that are heavily focused on the setting up and re-usability of software components within the architecture. Although CBD does not show a high theoretical approach, it is nevertheless classified under the general evolution of SDP (Software Development Process; see (Sommer, 2010)) as one of its two dominant directions. From a structural point of view, a software system consists of self-contained, interoperable architectural units – components – based on well-defined interfaces. Classical procedural object-oriented methodologies largely do not use the component meta-models on which the target component systems are then based. Component meta-models describe the syntax, semantics of

  12. Efficient methodology of route selection for driving cycle development

    Science.gov (United States)

    Mahayadin, A. R.; Shahriman, A. B.; Hashim, M. S. M.; Razlan, Z. M.; Faizi, M. K.; Harun, A.; Kamarrudin, N. S.; Ibrahim, I.; Saad, M. A. M.; Rani, M. F. H.; Zunaidi, I.; Sahari, M.; Sarip, M. S.; Razali, M. Q. H. A.

    2017-10-01

    A driving cycle is a series of data points representing vehicle speed versus time, used to determine the performance of a vehicle in general. One of the critical portions of driving cycle development is the route selection methodology. This paper describes an efficient methodology of route selection for driving cycle development. Previous data from the JKR Road Traffic Volume Malaysia (RTVM) in 2015 are studied and analysed to propose the route selection methodology. The selected routes are then analysed by using Google Maps. For each region, four (4) routes are selected for each of the urban and rural areas. For this paper, the selection of routes is focused on the northern region of Malaysia, specifically Penang. Penang is chosen for this study because it is one of the most developed states in Malaysia and has many urban and rural routes. The route selection methods constructed in this study could be used by other regions to develop their own driving cycles.

  13. A methodology for the parametric modelling of the flow coefficients and flow rate in hydraulic valves

    International Nuclear Information System (INIS)

    Valdés, José R.; Rodríguez, José M.; Saumell, Javier; Pütz, Thomas

    2014-01-01

    Highlights:
    • We develop a methodology for the parametric modelling of flow in hydraulic valves.
    • We characterize the flow coefficients with a generic function with two parameters.
    • The parameters are derived from CFD simulations of the generic geometry.
    • We apply the methodology to two cases from the automotive brake industry.
    • We validate by comparing with CFD results varying the original dimensions.
    Abstract: The main objective of this work is to develop a methodology for the parametric modelling of the flow rate in hydraulic valve systems. This methodology is based on the derivation, from CFD simulations, of the flow coefficient of the critical restrictions as a function of the Reynolds number, using a generalized square root function with two parameters. The methodology is then demonstrated by applying it to two completely different hydraulic systems: a brake master cylinder and an ABS valve. This type of parametric valve model facilitates its implementation in dynamic simulation models of complex hydraulic systems
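A plausible instance of the two-parameter square-root law described above is sketched below; the functional form and parameter names are assumptions for illustration, not the paper's fitted correlation:

```python
import math

def discharge_coefficient(re, cd_inf, a):
    """Assumed two-parameter square-root law: tends to cd_inf at high
    Reynolds number and decays like sqrt(Re) in the laminar limit.
    Both cd_inf and a would be fitted from CFD simulations."""
    return cd_inf * math.sqrt(re / (re + a))

def flow_rate(area, dp, rho, re, cd_inf, a):
    """Orifice equation with a Reynolds-dependent discharge
    coefficient: Q = Cd(Re) * A * sqrt(2 * dp / rho)."""
    return discharge_coefficient(re, cd_inf, a) * area * math.sqrt(2.0 * dp / rho)
```

The appeal of such a parametric form is that a dynamic simulation of a full hydraulic circuit only needs two fitted constants per restriction, rather than a CFD model of each valve.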

  14. Development of radiation risk assessment simulator using system dynamics methodology

    International Nuclear Information System (INIS)

    Kang, Kyung Min; Jae, Moosung

    2008-01-01

    The potential magnitudes of radionuclide releases under severe accident loadings and offsite consequences, as well as the overall risk (the product of accident frequencies and consequences), are analyzed and evaluated quantitatively in this study. The system dynamics methodology has been applied to predict time-dependent behaviors such as feedback and dependency, as well as to model the uncertain behavior of complex physical systems. It is used to construct the transfer mechanisms of time-dependent radioactivity concentration and to evaluate them. Dynamic variations of radioactivity are simulated by considering several effects such as deposition, weathering, washout, re-suspension, root uptake, translocation, leaching, senescence, intake, and excretion of soil. A time-dependent radio-ecological model applicable to the Korean environment has been developed in order to assess the radiological consequences following the short-term deposition of radionuclides during severe accidents at a nuclear power plant. An ingestion food chain model can estimate time-dependent radioactivity concentrations in foodstuffs. It is also shown that the system dynamics approach is useful for analyzing the phenomena of the complex system as well as the behavior of structure values with respect to time. The output of this model (Bq ingested per Bq m-2 deposited) may be multiplied by the deposition and a dose conversion factor (Gy Bq-1) to yield organ-specific doses. The model may be run deterministically to yield a single estimate, or stochastically by 'Monte-Carlo' calculation to yield distributions that reflect parameter and model uncertainties. The results of this study may contribute to identifying the relative importance of various parameters in consequence analysis, as well as to assessing risk reduction effects in accident management. (author)
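The dose arithmetic in the abstract (model output times deposition times dose conversion factor), together with the Monte Carlo option, can be sketched as follows; function and parameter names, and the example sampler, are illustrative assumptions:

```python
import random

def organ_dose(intake_per_unit_deposit, deposition, dcf):
    """Organ-specific dose: (Bq ingested per Bq m^-2 deposited)
    x deposition (Bq m^-2) x dose conversion factor (Gy Bq^-1)."""
    return intake_per_unit_deposit * deposition * dcf

def monte_carlo_dose(n, sample_params):
    """Stochastic run: sample_params() returns one random draw of the
    three factors; the mean over n draws estimates the dose under
    parameter uncertainty."""
    return sum(organ_dose(*sample_params()) for _ in range(n)) / n

def uniform_sampler():
    """Illustrative sampler: intake factor uncertain by +/-25 percent,
    deposition and dose conversion factor held fixed."""
    return (random.uniform(1.5, 2.5), 10.0, 0.5)
```

A deterministic run is just one call to `organ_dose`; the stochastic run replaces fixed parameters with distributions, as the abstract describes.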

  15. Development of methodology for early detection of BWR instabilities

    International Nuclear Information System (INIS)

    Alessandro Petruzzi; Shin Chin; Kostadin Ivanov; Asok Ray; Fan-Bill Cheung

    2005-01-01

    Full text of publication follows: The objective of the research presented in this paper, which is supported by the US Department of Energy under the NEER program, is to develop an early anomaly detection methodology in order to enhance safety, availability, and operational flexibility of Boiling Water Reactor (BWR) nuclear power plants. The technical approach relies on suppression of potential power oscillations in BWRs by detecting small anomalies at an early stage and taking appropriate prognostic actions based on an anticipated operation schedule. The model of coupled (two-phase) thermal-hydraulic and neutron flux dynamics, based on the US NRC coupled code TRACE/PARCS, is being utilized as a generator of time series data for anomaly detection at an early stage. The concept of the methodology is based on the fact that nonlinear systems show bifurcation, which is a change in the qualitative behavior as the system parameters vary. Some of these parameters may change on their own accord and account for the anomaly, while certain parameters can be altered in a controlled fashion. The non-linear, non-autonomous BWR system model considered in this research exhibits phenomena at two time scales. Anomalies occur at the slow time scale, while the observation of the dynamical behavior, based on which inferences are made, takes place at the fast time scale. It is assumed that: (i) the system behavior is stationary at the fast time scale; and (ii) any observable non-stationary behavior is associated with parametric changes evolving at the slow time scale. The goal is to make inferences about evolving anomalies based on the asymptotic behavior derived from the computer simulation. However, only sufficient changes in the slowly varying parameter may lead to detectable differences in the asymptotic behavior. The need to detect such small changes in parameters, and hence to detect an anomaly early, motivates the utilized stimulus-response approach. In this approach, the model

  16. Modeling Methodologies for Representing Urban Cultural Geographies in Stability Operations

    National Research Council Canada - National Science Library

    Ferris, Todd P

    2008-01-01

    ... 2.0.0, in an effort to provide modeling methodologies for a single simulation tool capable of exploring the complex world of urban cultural geographies undergoing Stability Operations in an irregular warfare (IW) environment...

  17. A vision on methodology for integrated sustainable urban development: bequest

    NARCIS (Netherlands)

    Bentivegna, V.; Curwell, S.; Deakin, M.; Lombardi, P.; Mitchell, G.; Nijkamp, P.

    2002-01-01

    The concepts and visions of sustainable development that have emerged in the post-Brundtland era are explored in terms of laying the foundations for a common vision of sustainable urban development (SUD). The described vision and methodology for SUD resulted from the activities of an international

  18. Methodology for development of risk indicators for offshore platforms

    International Nuclear Information System (INIS)

    Oeien, K.; Sklet, S.

    1999-01-01

    This paper presents a generic methodology for the development of risk indicators for petroleum installations and a specific set of risk indicators established for one offshore platform. The risk indicators should be used to control the risk during operation of platforms. The methodology is purely risk-based, and the basis for the development of risk indicators is the platform-specific quantitative risk analysis (QRA). In order to identify high risk contributing factors, platform personnel are asked to assess whether and how much the risk influencing factors will change. A brief comparison of probabilistic safety assessment (PSA) for nuclear power plants and quantitative risk analysis (QRA) for petroleum platforms is also given. (au)

  19. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare

    NARCIS (Netherlands)

    Bracke, M.B.M.; Edwards, S.A.; Metz, J.H.M.; Noordhuizen, J.P.T.M.; Algers, B.

    2008-01-01

    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally

  20. High level models and methodologies for information systems

    CERN Document Server

    Isaias, Pedro

    2014-01-01

    This book introduces methods and methodologies in Information Systems (IS) by presenting, describing, explaining, and illustrating their uses in various contexts, including website development, usability evaluation, quality evaluation, and success assessment.

  1. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Full Text Available not be the most important pathway of exposure for all pollutants, it is considered the one of major concern for exposure to PM. Related concepts, such as dose, will not be addressed in this chapter. The National Academy of Sciences suggests the following model... over time. Other exposure expressions are used to estimate exposures to pollutants in the ingestion and dermal absorption pathways. Major variables of concern in the estimation of exposure, Eq. (2), are the concentration of PM and its constituents...
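The exposure expressions alluded to above are conventionally built from the time-weighted microenvironmental form, E = sum_j c_j * t_j, where c_j is the pollutant concentration in microenvironment j and t_j the time spent there. A minimal sketch with illustrative function names:

```python
def integrated_exposure(concentrations, durations):
    """Time-integrated exposure over microenvironments:
    E = sum_j c_j * t_j  (e.g. ug m^-3 x h)."""
    return sum(c * t for c, t in zip(concentrations, durations))

def time_weighted_average(concentrations, durations):
    """Average exposure concentration over the total time budget."""
    return integrated_exposure(concentrations, durations) / sum(durations)
```

For example, 8 h at an outdoor PM concentration and 16 h indoors gives a daily average dominated by the indoor term, which is why microenvironment time budgets matter in PM exposure modeling.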

  2. Conceptual and methodological biases in network models.

    Science.gov (United States)

    Lamm, Ehud

    2009-10-01

    Many natural and biological phenomena can be depicted as networks. Theoretical and empirical analyses of networks have become prevalent. I discuss theoretical biases involved in the delineation of biological networks. The network perspective is shown to dissolve the distinction between regulatory architecture and regulatory state, consistent with the theoretical impossibility of distinguishing a priori between "program" and "data." The evolutionary significance of the dynamics of trans-generational and interorganism regulatory networks is explored and implications are presented for understanding the evolution of the biological categories development-heredity, plasticity-evolvability, and epigenetic-genetic.

  3. Summary of FY-1978 consultation input for Scenario Methodology Development

    International Nuclear Information System (INIS)

    Scott, B.L.; Benson, G.L.; Craig, R.A.; Harwell, M.A.

    1979-11-01

    The Scenario Methodology Development task is concerned with evaluating the geologic system surrounding an underground repository and describing the phenomena (volcanic, seismic, meteorite, hydrologic, tectonic, climate, etc.) which could perturb the system and possibly cause loss of repository integrity. This document includes 14 individual papers. Separate abstracts were prepared for all 14 papers.

  4. The Heart of the Matter: Methodological Challenges in Developing a ...

    African Journals Online (AJOL)

    The Heart of the Matter: Methodological Challenges in Developing a Contemporary Reading Programme for Monolingual Lexicography, from the Perspective of the ... This article argues the importance of the reading programme as the pivotal issue in the lexicographic process. ... the definition of South African English,

  5. Embracing Agile methodology during DevOps Developer Internship Program

    OpenAIRE

    Patwardhan, Amol; Kidd, Jon; Urena, Tiffany; Rajgopalan, Aishwarya

    2016-01-01

    The DevOps team adopted agile methodologies during the summer internship program as an initiative to move away from waterfall. The DevOps team implemented the Scrum software development strategy to create an internal data dictionary web application. This article reports on the transition process and lessons learned from the pilot program.

  6. Development of Management Methodology for Engineering Production Quality

    Science.gov (United States)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors propose four directions for a methodology of engineering product quality management that implements the requirements of the new international standard ISO 9001:2015: analysis of the organization's context taking stakeholders into account, the use of risk management, management of in-house knowledge, and assessment of enterprise activity according to the criteria of effectiveness

  7. Novel Computational Methodologies for Structural Modeling of Spacious Ligand Binding Sites of G-Protein-Coupled Receptors: Development and Application to Human Leukotriene B4 Receptor

    OpenAIRE

    Ishino, Yoko; Harada, Takanori

    2012-01-01

    This paper describes a novel method to predict the activated structures of G-protein-coupled receptors (GPCRs) with high accuracy, while aiming for the use of the predicted 3D structures in in silico virtual screening in the future. We propose a new method for modeling GPCR thermal fluctuations, where conformation changes of the proteins are modeled by combining fluctuations on multiple time scales. The core idea of the method is that a molecular dynamics simulation is used to calculate avera...

  8. A review of methodologies used in research on cadastral development

    DEFF Research Database (Denmark)

    Silva, Maria Augusta; Stubkjær, Erik

    2002-01-01

    to the acceptance of research methodologies needed for cadastral development, and thereby enhance theory in the cadastral domain. The paper reviews nine publications on cadastre and identifies the methodologies used. The review focuses on the institutional, social, political and economic aspects of cadastral...... by social, political and economic conditions, as by technology. Since the geodetic survey profession has been the keeper of the cadastre, geodetic surveyors will have to deal ever more with social science matters, a fact that universities will have to consider....

  9. European methodology for qualification of NDT as developed by ENIQ

    International Nuclear Information System (INIS)

    Champigny, F.; Sandberg, U.; Engl, G.; Crutzen, S.; Lemaitre, P.

    1997-01-01

    The European Network for Inspection Qualification (ENIQ) brings together most of the nuclear power plant operators in the European Union (and Switzerland). The main objective of ENIQ is to co-ordinate and manage, at the European level, expertise and resources for the qualification of NDE inspection systems, primarily for nuclear components. In the framework of ENIQ the European methodology for qualification of NDT has been developed. In this paper the main principles of the European methodology are given, together with the main activities and organisation of ENIQ. (orig.)

  10. Service Innovation Methodologies II : How can new product development methodologies be applied to service innovation and new service development? : Report no 2 from the TIPVIS-project

    OpenAIRE

    Nysveen, Herbjørn; Pedersen, Per E.; Aas, Tor Helge

    2007-01-01

    This report presents various methodologies used in new product development and product innovation and discusses the relevance of these methodologies for service development and service innovation. The service innovation relevance for all of the methodologies presented is evaluated along several service specific dimensions, like intangibility, inseparability, heterogeneity, perishability, information intensity, and co-creation. The methodologies discussed are mainly collect...

  11. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana

    2016-06-01

    Full Text Available This paper presents a software engineering approach to a research proposal for an expert system for scheduling in service systems, using methodologies and processes of software development. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We draw UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology, identifying the actors, elements and interactions in the research process.

  12. Tornado missile simulation and design methodology. Volume 2: model verification and data base updates. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments
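    The event-sequenced Monte Carlo simulation described above can be caricatured in a few lines; every distribution, threshold and constant below is an invented placeholder for illustration, not TORMIS data or models.

```python
import random

def missile_impact_probability(n_trials=100_000, seed=1):
    """Toy Monte Carlo in the spirit of the TORMIS approach: each trial
    samples a tornado wind speed and a missile launch, then counts trials
    in which the missile both reaches the target area and exceeds a
    damage threshold. All numbers are hypothetical."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        wind = rng.weibullvariate(50.0, 2.0)                 # wind speed, m/s (assumed)
        launch_p = min(1.0, max(0.0, (wind - 30.0) / 70.0))  # chance missile is lofted
        if rng.random() >= launch_p:
            continue
        impact_speed = wind * rng.uniform(0.3, 0.8)  # fraction of wind speed retained
        target_frac = 0.01                           # fraction of area occupied by target
        if rng.random() < target_frac and impact_speed > 25.0:
            hits += 1
    return hits / n_trials
```

    In the real methodology each of these sampled quantities is drawn from the documented data bases (tornado records, transport observations, wind-tunnel and impact tests) rather than from the placeholder distributions used here.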

  13. Modeling of development and projection of the accumulated recoverable oil volume: methodology and application; Modelagem da evolucao e projecao de volume de oleo recuperavel acumulado: metodologia e aplicacao

    Energy Technology Data Exchange (ETDEWEB)

    Melo, Luciana Cavalcanti de; Ferreira Filho, Virgilio Jose Martins; Rocha, Vinicius Brito [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE)

    2004-07-01

    A relevant problem that petroleum companies deal with is the estimation of future levels of reserves. The objective of reserve forecasting is pursued through the construction of mathematical models. Since exploration is an informed and controlled process, it is conducted, in order to reach the exploration targets, through a sequence of decisions based on the results achieved. Such decisions are taken in an uncertain environment, compounded by the random nature of the process. Another important assumption that must be taken into consideration is the dependency of exploration on the conditions, or structure, of the discovered resources and on the final potential. The modeling starts with the statement of a general problem, when the models are constructed based on suppositions associated with the main concepts, and ends with the attainment of specific solutions, when the best description, or model, is selected through estimation of the respective parameters and assessment of the fit to measurements. The result of this approach reflects the essence of the exploration process and how it is expressed in the incorporation of reserves and the history of field discoveries. A case study is used for validation of the models and the estimates. (author)

  14. Towards a general object-oriented software development methodology

    Science.gov (United States)

    Seidewitz, ED; Stark, Mike

    1986-01-01

    An object is an abstract software model of a problem domain entity. Objects are packages of both data and operations on that data (Goldberg 83, Booch 83). The Ada (tm) package construct is representative of this general notion of an object. Object-oriented design is the technique of using objects as the basic unit of modularity in systems design. The Software Engineering Laboratory at the Goddard Space Flight Center is currently involved in a pilot program to develop a flight dynamics simulator in Ada (approximately 40,000 statements) using object-oriented methods. Several authors have applied object-oriented concepts to Ada (e.g., Booch 83, Cherry 85). It was found that these methodologies are limited. As a result, a more general approach was synthesized which allows a designer to apply powerful object-oriented principles to a wide range of applications and at all stages of design. An overview of this approach is provided. Further, how object-oriented design fits into the overall software life-cycle is considered.

  15. HIERARCHICAL METHODOLOGY FOR MODELING HYDROGEN STORAGE SYSTEMS PART II: DETAILED MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, B.; Anton, D.L.

    2008-12-22

    There is significant interest in hydrogen storage systems that employ a medium which either adsorbs, absorbs or reacts with hydrogen in a nearly reversible manner. In any media-based storage system, the rate of hydrogen uptake and the system capacity are governed by a number of complex, coupled physical processes. To design and evaluate such storage systems, a comprehensive methodology was developed, consisting of a hierarchical sequence of models that range from scoping calculations to numerical models that couple reaction kinetics with heat and mass transfer for both the hydrogen charging and discharging phases. The scoping models were presented in Part I [1] of this two-part series of papers. This paper describes a detailed numerical model that integrates the phenomena occurring when hydrogen is charged and discharged. A specific application of the methodology is made to a system using NaAlH{sub 4} as the storage medium.
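    A toy version of the coupled kinetics/heat-transfer problem that such detailed models solve can be sketched with a single conversion variable and a lumped heat balance; the rate law, all parameter values and the explicit Euler scheme are illustrative assumptions, not the NaAlH{sub 4} kinetics or the numerical methods of the paper.

```python
import math

def charge_uptake(t_end=600.0, dt=0.1):
    """Minimal sketch of coupled uptake kinetics and heat balance for a
    hydride bed (all parameters assumed for illustration):
        dX/dt = k0*exp(-Ea/(R*T))*(1 - X)
        m_cp*dT/dt = Q*dX/dt - hA*(T - T_cool)
    where X is the converted fraction and T the lumped bed temperature.
    """
    R = 8.314
    k0, Ea = 1.0e3, 4.0e4   # pre-exponential (1/s), activation energy (J/mol)
    m_cp = 500.0            # bed heat capacity, J/K
    Q = 3.0e4               # reaction heat per unit conversion, J
    hA = 5.0                # cooling conductance to coolant, W/K
    T, T_cool, X = 300.0, 300.0, 0.0
    t = 0.0
    while t < t_end:
        rate = k0 * math.exp(-Ea / (R * T)) * (1.0 - X)
        X += rate * dt                                   # conversion advances
        T += (Q * rate - hA * (T - T_cool)) / m_cp * dt  # exotherm vs. cooling
        t += dt
    return X, T
```

    The hierarchy in the paper replaces this lumped sketch with spatially resolved heat and mass transfer coupled to measured kinetics.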

  16. Novel computational methodologies for structural modeling of spacious ligand binding sites of G-protein-coupled receptors: development and application to human leukotriene B4 receptor.

    Science.gov (United States)

    Ishino, Yoko; Harada, Takanori

    2012-01-01

    This paper describes a novel method to predict the activated structures of G-protein-coupled receptors (GPCRs) with high accuracy, while aiming for the use of the predicted 3D structures in in silico virtual screening in the future. We propose a new method for modeling GPCR thermal fluctuations, where conformation changes of the proteins are modeled by combining fluctuations on multiple time scales. The core idea of the method is that a molecular dynamics simulation is used to calculate average 3D coordinates of all atoms of a GPCR protein against heat fluctuation on the picosecond or nanosecond time scale, and then evolutionary computation including receptor-ligand docking simulations functions to determine the rotation angle of each helix of a GPCR protein as a movement on a longer time scale. The method was validated using human leukotriene B4 receptor BLT1 as a sample GPCR. Our study demonstrated that the proposed method was able to derive the appropriate 3D structure of the active-state GPCR which docks with its agonists.

  17. Novel Computational Methodologies for Structural Modeling of Spacious Ligand Binding Sites of G-Protein-Coupled Receptors: Development and Application to Human Leukotriene B4 Receptor

    Directory of Open Access Journals (Sweden)

    Yoko Ishino

    2012-01-01

    Full Text Available This paper describes a novel method to predict the activated structures of G-protein-coupled receptors (GPCRs with high accuracy, while aiming for the use of the predicted 3D structures in in silico virtual screening in the future. We propose a new method for modeling GPCR thermal fluctuations, where conformation changes of the proteins are modeled by combining fluctuations on multiple time scales. The core idea of the method is that a molecular dynamics simulation is used to calculate average 3D coordinates of all atoms of a GPCR protein against heat fluctuation on the picosecond or nanosecond time scale, and then evolutionary computation including receptor-ligand docking simulations functions to determine the rotation angle of each helix of a GPCR protein as a movement on a longer time scale. The method was validated using human leukotriene B4 receptor BLT1 as a sample GPCR. Our study demonstrated that the proposed method was able to derive the appropriate 3D structure of the active-state GPCR which docks with its agonists.
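    The two-time-scale idea described above (MD averaging on short time scales, evolutionary search over helix rotations on longer ones) can be illustrated with the evolutionary half only. The (1+1) evolution strategy below optimizes seven rotation angles against a stub fitness standing in for a real receptor-ligand docking score; the strategy choice, mutation width and stub are all assumptions, not the paper's algorithm.

```python
import random

def optimize_helix_rotations(score, n_helices=7, generations=200, seed=0):
    """Toy (1+1) evolution strategy over per-helix rotation angles.
    `score` is a user-supplied fitness (higher is better); here it is a
    stub rather than a docking simulation."""
    rng = random.Random(seed)
    angles = [0.0] * n_helices  # rotation angle of each TM helix, degrees
    best = score(angles)
    for _ in range(generations):
        trial = [(a + rng.gauss(0.0, 10.0)) % 360.0 for a in angles]
        s = score(trial)
        if s >= best:           # greedily keep non-worsening mutations
            angles, best = trial, s
    return angles, best

def stub_score(angles):
    """Stand-in fitness: prefers angles near a hypothetical optimum of 30
    degrees per helix, measured on the circle."""
    return -sum(min(abs(a - 30.0), 360.0 - abs(a - 30.0)) ** 2 for a in angles)
```

    In the published method, evaluating one candidate means docking agonists against the rotated helix bundle, so each fitness call is vastly more expensive than this stub.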

  18. Development of an aeroelastic methodology for surface morphing rotors

    Science.gov (United States)

    Cook, James R.

    Helicopter performance capabilities are limited by maximum lift characteristics and vibratory loading. In high-speed forward flight, dynamic stall and transonic flow greatly increase the amplitude of vibratory loads. Experiments and computational simulations alike have indicated that a variety of active rotor control devices are capable of reducing vibratory loads. For example, periodic blade twist and flap excitation have been optimized to reduce vibratory loads in various rotors. Airfoil geometry can also be modified in order to increase lift coefficient, delay stall, or weaken transonic effects. To explore the potential benefits of active controls, computational methods are being developed for aeroelastic rotor evaluation, including coupling between computational fluid dynamics (CFD) and computational structural dynamics (CSD) solvers. In many contemporary CFD/CSD coupling methods it is assumed that the airfoil is rigid to reduce the interface by a single dimension. Some methods retain the conventional one-dimensional beam model while prescribing an airfoil shape to simulate active chord deformation. However, to simulate the actual response of a compliant airfoil it is necessary to include deformations that originate not only from control devices (such as piezoelectric actuators), but also from inertial forces, elastic stresses, and aerodynamic pressures. An accurate representation of the physics requires an interaction with a more complete representation of loads and geometry. A CFD/CSD coupling methodology capable of communicating three-dimensional structural deformations and a distribution of aerodynamic forces over the wetted blade surface has not yet been developed. In this research an interface is created within the Fully Unstructured Navier-Stokes (FUN3D) solver that communicates aerodynamic forces on the blade surface to University of Michigan's Nonlinear Active Beam Solver (UM/NLABS -- referred to as NLABS in this thesis).
Interface routines are developed for

  19. A Monte Carlo methodology for modelling ashfall hazards

    Science.gov (United States)

    Hurst, Tony; Smith, Warwick

    2004-12-01

    We have developed a methodology for quantifying the probability of particular thicknesses of tephra at any given site, using Monte Carlo methods. This is a part of the development of a probabilistic volcanic hazard model (PVHM) for New Zealand, for hazards planning and insurance purposes. We use an established program (ASHFALL) to model individual eruptions, where the likely thickness of ash deposited at selected sites depends on the location of the volcano, eruptive volume, column height and ash size, and the wind conditions. A Monte Carlo procedure allows us to simulate the variations in eruptive volume and in wind conditions by analysing repeat eruptions, each time allowing the parameters to vary randomly according to known or assumed distributions. Actual wind velocity profiles are used, with randomness included by selection of a starting date. This method can handle the effects of multiple volcanic sources, each source with its own characteristics. We accumulate the tephra thicknesses from all sources to estimate the combined ashfall hazard, expressed as the frequency with which any given depth of tephra is likely to be deposited at selected sites. These numbers are expressed as annual probabilities or as mean return periods. We can also use this method for obtaining an estimate of how often and how large the eruptions from a particular volcano have been. Results from sediment cores in Auckland give useful bounds for the likely total volumes erupted from Egmont Volcano (Mt. Taranaki), 280 km away, during the last 130,000 years.
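    The simulation loop described above (sample an eruptive volume and wind conditions per eruption, convert them to tephra thickness at a site, accumulate an exceedance frequency) can be miniaturized as follows; the attenuation relation and every distribution are invented stand-ins, not ASHFALL's physics.

```python
import math
import random

def ashfall_exceedance(depth_cm, n_sims=20000, distance_km=100.0, seed=42):
    """Toy Monte Carlo exceedance estimate: per simulated eruption, sample
    an eruptive volume and a wind factor, convert to tephra thickness at
    the site with an assumed exponential attenuation, and return the
    fraction of simulations reaching at least `depth_cm`."""
    rng = random.Random(seed)
    count = 0
    for _ in range(n_sims):
        volume_km3 = rng.lognormvariate(math.log(0.1), 1.0)  # eruptive volume (assumed)
        wind = rng.uniform(0.2, 1.8)                         # downwind enhancement factor
        # thickness decays exponentially with distance; the scale is assumed
        thickness_cm = 100.0 * volume_km3 * wind * math.exp(-distance_km / 50.0)
        if thickness_cm >= depth_cm:
            count += 1
    return count / n_sims
```

    Multiplying the conditional exceedance probability by an annual eruption rate gives annual probabilities (or mean return periods), and contributions from multiple volcanic sources are accumulated in the same way.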

  20. Development of a post-processing methodology for reliable, skillful probabilistic quantitative precipitation forecasts with multi-model ensembles and short training data sets.

    Science.gov (United States)

    Hamill, Thomas M.; Scheuerer, Michael

    2017-04-01

    While many previous studies have shown the benefits and improved forecast reliability from combining predictions from multi-model ensemble systems, our experience is that MMEs of global ensemble precipitation forecasts are still highly unreliable when verified against point observations of precipitation or against high-resolution precipitation analyses. This unreliability is caused by a lack of model resolution as well as systematic errors in the mean precipitation amount. These errors may vary from one ensemble prediction system to the next, and perhaps member by member for some ensemble systems. They can vary from one location to the next and the error is commonly different for light vs. heavy precipitation. MMEs also typically under-forecast the precipitation spread. Typically, producing skillful and reliable post-processed forecast guidance of probabilistic precipitation is challenging with short training data sets given the intermittency of precipitation and the relative rarity of high precipitation. Pooling of training data can increase the sample size needed for effective post-processing, but at the expense of providing geographically relevant adjustments for systematic error. A novel approach for generating probabilistic precipitation forecasts is demonstrated here using global MMEs. The key component is the selective supplementation of training data at every location where a forecast is desired using the training data at other "supplemental locations". These supplemental locations are chosen on the basis of a similarity of terrain characteristics and precipitation climatology, under the presumption that the forecast errors from coarse-resolution prediction systems are often related to mis-representation of terrain-related detail. With training sample size thus enlarged, post-processing is based on quantile mapping for removal of amount-dependent bias and best-member dressing for addressing spread issues. Algorithmic details and the results of the
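    The quantile-mapping step named above can be demonstrated with empirical distributions. This is a minimal sketch (no interpolation between quantiles, no special handling of the zero-precipitation mass, names assumed); the "supplemental locations" idea enters simply as extra (forecast, observation) training pairs appended to the two training lists.

```python
def quantile_map(forecast, train_fcst, train_obs):
    """Empirical quantile mapping: find the forecast's quantile within the
    training forecast distribution, then read off the training observation
    at that same quantile, removing amount-dependent bias."""
    fs = sorted(train_fcst)
    os_ = sorted(train_obs)
    # empirical rank of `forecast` among training forecasts
    rank = sum(1 for v in fs if v <= forecast)
    # index of the matching quantile in the sorted observations
    idx = min(rank * len(os_) // len(fs), len(os_) - 1)
    return os_[idx]
```

    With a wet observation climatology relative to the model, mapped values are inflated; with a dry one they are deflated, which is exactly the amount-dependent correction the abstract describes.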

  1. Development of test methodology for dynamic mechanical analysis instrumentation

    Science.gov (United States)

    Allen, V. R.

    1982-01-01

    Dynamic mechanical analysis instrumentation was used to develop specific test methodology for determining engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperatures in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiber-glass-supported resin. The correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system gave a negative result, because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

  2. Towards the development of a global probabilistic tsunami risk assessment methodology

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2017-04-01

    The assessment of tsunami risk is on many levels still ambiguous and under discussion. Over the last two decades, various methodologies and models have been developed to quantify tsunami risk, most of the time on a local or regional level, with either a deterministic or a probabilistic background. Probabilistic modelling faces significant difficulties, as the underlying tsunami hazard modelling demands an immense amount of computational time and thus limits the assessment substantially: it is often restricted to institutes with supercomputing access, or modellers are forced to reduce the modelling resolution quantitatively or qualitatively. Furthermore, data on the vulnerability of infrastructure and buildings are empirically limited to a few disasters in recent years, so a reliable quantification of socio-economic vulnerability remains questionable. Nonetheless, significant improvements have recently been made on both the methodological and the computational side. This study introduces a methodological framework for a globally uniform probabilistic tsunami risk assessment. Here, the power of recently developed hardware for desktop-based parallel computing plays a crucial role in the calculation of numerical tsunami wave propagation, while large-scale parametric models and paleo-seismological data enhance the return-period assessment of tsunamigenic megathrust earthquake events. Adaptation of empirical tsunami vulnerability functions, in conjunction with methodologies from flood modelling, supports a more reliable vulnerability quantification. In addition, methodologies for exposure modelling in coastal areas are introduced, focusing on the diversity of coastal exposure landscapes and data availability. Overall, this study presents a first overview of how a global tsunami risk modelling framework may be accomplished, covering methodological, computational and data-driven aspects.

  3. Development of the affiliate system based on modern development methodologies

    OpenAIRE

    Fajmut, Aljaž

    2016-01-01

    Affiliate partnership is a popular and effective method of online marketing through affiliate partners. The thesis describes the development of a product, which allows us to easily integrate affiliate system into an existing platform (e-commerce or service). This kind of functionality opens up growth opportunities for the business. The system is designed in a way that it requires minimal amount of changes for the implementation into an existing application. The development of the product is ...

  4. Development of methodology and direction of practice administrative neuromarketing

    OpenAIRE

    Glushchenko V.; Glushchenko I.

    2018-01-01

    The development of the methodology and practical aspects of the application of administrative neuromarketing is the subject of this work; its focus is administrative neuromarketing in the organization. The article investigates the concept and content of administrative neuromarketing, together with its philosophy, culture, functions, tasks and principles, as well as the technique of logical analysis of the possibility of applying methods of administrative neuromarketing for incre...

  5. Methodology for Developing Life Tables for Sessile Insects in the Field Using the Whitefly, Bemisia tabaci, in Cotton As a Model System.

    Science.gov (United States)

    Naranjo, Steven E; Ellsworth, Peter C

    2017-11-01

    Life tables provide a means of measuring the schedules of birth and death in populations over time. They can also be used to quantify the sources and rates of mortality in populations, which has a variety of applications in ecology, including agricultural ecosystems. Horizontal, or cohort-based, life tables provide the most direct and accurate method of quantifying vital population rates because they follow a group of individuals in a population from birth to death. Here, protocols are presented for conducting and analyzing cohort-based life tables in the field that take advantage of the sessile nature of the immature life stages of a global insect pest, Bemisia tabaci. Individual insects are located on the underside of cotton leaves and are marked by drawing a small circle around the insect with a non-toxic pen. Each insect can then be observed repeatedly over time with the aid of hand lenses to measure development from one stage to the next and to identify stage-specific causes of death associated with natural and introduced mortality forces. Analyses explain how to correctly measure multiple mortality forces that act contemporaneously within each stage and how to use such data to provide meaningful population dynamic metrics. The method does not directly account for adult survival and reproduction, which limits inference to the dynamics of immature stages. An example is presented that focused on measuring the impact of bottom-up (plant quality) and top-down (natural enemies) effects on the mortality dynamics of B. tabaci in the cotton system.
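    The basic cohort accounting behind such a life table (counts entering each stage, apparent stage mortality, and log-based k-values that sum to a generational total) can be sketched as below; the counts are hypothetical, and the sketch omits the paper's contemporaneous partitioning of deaths among causes within a stage.

```python
import math

def cohort_life_table(stage_counts):
    """Build a simple cohort life table from counts entering each stage
    (the last entry is the number surviving to adulthood). Returns per-stage
    apparent mortality q and k-values, k = log10(entering) - log10(surviving);
    the k-values sum to a total K summarizing generational mortality."""
    rows = []
    for entering, surviving in zip(stage_counts, stage_counts[1:]):
        q = (entering - surviving) / entering   # apparent stage mortality
        k = math.log10(entering) - math.log10(surviving)
        rows.append({"entering": entering, "mortality": q, "k": k})
    total_K = sum(r["k"] for r in rows)
    return rows, total_K

# Hypothetical cohort: eggs -> four nymphal stages -> adults
counts = [1000, 600, 420, 300, 150, 90]
```

    Because the k-values telescope, total K depends only on the first and last counts, while the per-stage rows show where in development the losses occur.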

  6. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

    Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of 23 models that reported missing data, and the number of events per variable was models. Only 2 models presented recommended regression equations. There was significant heterogeneity in discriminative ability of models with respect to age (P prediction models that had sufficient discriminative ability, although few are externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and resulting algorithms should be applied with caution. Copyright © 2017 Elsevier Inc. All rights reserved.
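    The discrimination measure used throughout the review, the C-statistic, is the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case (ties count half). A direct pairwise implementation, adequate for small samples though O(n^2) in general, might look like:

```python
def c_statistic(scores, outcomes):
    """Concordance (C-) statistic for a binary outcome: the fraction of
    case/non-case pairs in which the case received the higher predicted
    risk, with tied scores counted as half-concordant."""
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    if not pos or not neg:
        raise ValueError("need both cases and non-cases")
    concordant = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                concordant += 1.0
            elif p == n:
                concordant += 0.5
    return concordant / (len(pos) * len(neg))
```

    A value of 0.5 is no better than chance and 1.0 is perfect ranking, which is why the review treats C-statistics above 0.70 as acceptable discrimination.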

  7. Development of a simplified statistical methodology for nuclear fuel rod internal pressure calculation

    International Nuclear Information System (INIS)

    Kim, Kyu Tae; Kim, Oh Hwan

    1999-01-01

    A simplified statistical methodology is developed in order both to reduce the over-conservatism of deterministic methodologies employed for PWR fuel rod internal pressure (RIP) calculation and to simplify the complicated calculation procedure of the widely used statistical methodology, which employs the response surface method and Monte Carlo simulation. The simplified statistical methodology employs the system moment method with a deterministic approach in determining the maximum variance of RIP. The maximum RIP variance is determined as the square sum of the maximum values of the mean RIP times a RIP sensitivity factor, taken over all input variables considered. This approach makes the simplified statistical methodology much more efficient in routine reload core design analysis, since it eliminates the numerous calculations required for the power-history-dependent RIP variance determination. The simplified statistical methodology is shown to be more conservative in generating the RIP distribution than the widely used statistical methodology. Comparison of the significance of each input variable to RIP indicates that the fission gas release model is the most significant input variable. (author). 11 refs., 6 figs., 2 tabs
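    The system-moment step described above can be sketched as a root-sum-square of per-variable terms. The reading below is one plausible interpretation: each term combines the mean RIP, a sensitivity factor, and an assumed relative input uncertainty, and the square root of the square sum gives a standard deviation for an upper tolerance bound. All parameter names, values and the z-multiplier are illustrative assumptions, not the paper's published data.

```python
import math

def rip_upper_bound(mean_rip, sensitivities, rel_uncertainties, z=1.645):
    """Sketch of a system-moment RIP bound: the variance is taken as the
    square sum of (mean RIP x sensitivity x relative input uncertainty)
    terms, one per input variable; the bound is mean + z * sigma."""
    var = sum((mean_rip * s * u) ** 2
              for s, u in zip(sensitivities, rel_uncertainties))
    sigma = math.sqrt(var)
    return mean_rip + z * sigma

# Hypothetical inputs: sensitivities of RIP to fission gas release, fill
# pressure, and pellet density, with assumed relative uncertainties.
bound = rip_upper_bound(10.0, [0.8, 0.3, 0.2], [0.10, 0.05, 0.05])
```

    Because the terms combine in quadrature, the variable with the largest sensitivity-uncertainty product (here the stand-in for fission gas release) dominates the bound, mirroring the significance ranking in the abstract.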

  8. Methodology of modeling fiber reinforcement in concrete elements

    NARCIS (Netherlands)

    Stroeven, P.

    2010-01-01

    This paper’s focus is on the modeling methodology for (steel) fiber reinforcement in concrete. The orthogonal values of fiber efficiency are presented. Bulk as well as boundary situations are covered. Fiber structure is assumed, due to external compaction by vibration, to display a partially linear

  9. Energy indicators for sustainable development: Guidelines and methodologies

    International Nuclear Information System (INIS)

    2008-01-01

    This publication is the product of an international initiative to define a set of Energy Indicators for Sustainable Development (EISD) and corresponding methodologies and guidelines. The successful completion of this work is the result of an intensive effort led by the International Atomic Energy Agency (IAEA) in cooperation with the United Nations Department of Economic and Social Affairs (UNDESA), the International Energy Agency (IEA), Eurostat and the European Environment Agency (EEA). The thematic framework, guidelines, methodology sheets and energy indicators set out in this publication reflect the expertise of these various agencies, recognized worldwide as leaders in energy and environmental statistics and analysis. While each agency has an active indicator programme, one goal of this joint endeavour has been to provide users with a consensus by leading experts on definitions, guidelines and methodologies for the development and worldwide use of a single set of energy indicators. No set of energy indicators can be final and definitive. To be useful, indicators must evolve over time to fit country-specific conditions, priorities and capabilities. The purpose of this publication is to present one set of EISD for consideration and use, particularly at the national level, and to serve as a starting point in the development of a more comprehensive and universally accepted set of energy indicators relevant to sustainable development. It is hoped that countries will use the EISD to assess their energy systems and to track their progress towards nationally defined sustainable development goals and objectives. It is also hoped that users of the information presented in this publication will contribute to refinements of energy indicators for sustainable development by adding their own unique perspectives to what is presented herein

  10. Energy indicators for sustainable development: Guidelines and methodologies

    International Nuclear Information System (INIS)

    2005-04-01

    This publication is the product of an international initiative to define a set of Energy Indicators for Sustainable Development (EISD) and corresponding methodologies and guidelines. The successful completion of this work is the result of an intensive effort led by the International Atomic Energy Agency (IAEA) in cooperation with the United Nations Department of Economic and Social Affairs (UNDESA), the International Energy Agency (IEA), Eurostat and the European Environment Agency (EEA). The thematic framework, guidelines, methodology sheets and energy indicators set out in this publication reflect the expertise of these various agencies, recognized worldwide as leaders in energy and environmental statistics and analysis. While each agency has an active indicator programme, one goal of this joint endeavour has been to provide users with a consensus by leading experts on definitions, guidelines and methodologies for the development and worldwide use of a single set of energy indicators. No set of energy indicators can be final and definitive. To be useful, indicators must evolve over time to fit country-specific conditions, priorities and capabilities. The purpose of this publication is to present one set of EISD for consideration and use, particularly at the national level, and to serve as a starting point in the development of a more comprehensive and universally accepted set of energy indicators relevant to sustainable development. It is hoped that countries will use the EISD to assess their energy systems and to track their progress towards nationally defined sustainable development goals and objectives. It is also hoped that users of the information presented in this publication will contribute to refinements of energy indicators for sustainable development by adding their own unique perspectives to what is presented herein

  11. Theater for Development Methodology in Childhood Cataract Case Finding

    Directory of Open Access Journals (Sweden)

    Roseline Ekanem Duke

    2016-03-01

Full Text Available The key informant methodology for case finding for childhood cataract was utilized in a rural population in Nigeria to identify children who would benefit from surgical intervention for cataract and restoration of vision. It was, however, noticed that some parents of children with cataract did not bring their children to the primary health center for examination and referral. The purpose of this study is to investigate the benefits of using the theatre for development approach in childhood cataract case finding. The delay in identification and referral of children with cataract at an appropriate age for surgical intervention and optical rehabilitation is the main cause of poor vision following surgery for the condition, as amblyopia results. Early presentation, identification, referral and surgical intervention, together with appropriate optical rehabilitation, are therefore key to a successful surgical outcome of childhood cataract and a good visual prognosis. The theater for development (TfD) methodology was implemented in a community in Akpabuyo local government area of Cross River State, Nigeria as a means to enhance community participation, health promotion and education, and to complement the key informant methodology in case finding for childhood cataract. Following the TfD intervention, the community referred three children with cataracts for cataract surgery and uptake of follow-up care after surgery. The TfD approach appears to be a useful method for encouraging community participation in childhood cataract case finding.

  12. Methodology of modeling and measuring computer architectures for plasma simulations

    Science.gov (United States)

    Wang, L. P. T.

    1977-01-01

A brief introduction to plasma simulation using computers, and the difficulties encountered on currently available computers, is given. Through the use of an analyzing and measuring methodology, SARA, the control flow and data flow of a particle simulation model, REM2-1/2D, are exemplified. After recursive refinements the total execution time may be greatly shortened and a fully parallel data flow can be obtained. From this data flow, a matched computer architecture or organization could be configured to achieve the computation bound of an application problem. A sequential-type simulation model, an array/pipeline-type simulation model, and a fully parallel simulation model of the code REM2-1/2D are proposed and analyzed. This methodology can be applied to other application problems which have an implicitly parallel nature.

  13. A descriptive framework for investigating problems in the application of systems development methodologies

    OpenAIRE

    Fitzgerald, Brian

    1995-01-01

    peer-reviewed It is generally taken as axiomatic that systems development methodologies (SDMs) play a useful role in guiding the development process, and that their increased adoption would improve the product and process of systems development. This paper begins by briefly reviewing the arguments and pressures in favour of SDMs. Following this, a descriptive model of the system development process is formulated, and this is then used as a framework to map a number of fundam...

  14. A Referential Methodology for Education on Sustainable Tourism Development

    Directory of Open Access Journals (Sweden)

    Burcin Hatipoglu

    2014-08-01

    Full Text Available Sustainable tourism has the potential of contributing to local development while protecting the natural environment and preserving cultural heritage. Implementation of this form of tourism requires human resources that can assume effective leadership in sustainable development. The purpose of the international student program, described in this paper, was to develop and implement an educational methodology to fulfill this need. The study, which was developed and applied by two universities, took place in August 2013, in the study setting of Kastamonu, Turkey. The effectiveness of the program was measured by pre- and post-surveys using the Global Citizenship Scale developed by Morais and Ogden. The findings document a change in intercultural communication, global knowledge and political voice dimensions of the scale.

  15. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving.

    Science.gov (United States)

    Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice

    2016-10-11

    The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle's surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.
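The record describes the fusion architecture only in general terms. A minimal sketch of the kind of step such a pipeline performs is inverse-variance weighting of two independent sensor estimates of the same quantity; the sensor names, values and variances below are illustrative assumptions, not from the paper.

```python
# Inverse-variance fusion of two independent estimates of one quantity
# (e.g. the range to an obstacle). The estimate with the lower variance
# receives the higher weight; the fused variance is always smaller than
# either input variance.

def fuse(est_a, var_a, est_b, var_b):
    """Combine two independent estimates by inverse-variance weighting."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical radar reading (12.0 m, var 0.04) fused with a camera
# reading (12.6 m, var 0.16):
est, var = fuse(12.0, 0.04, 12.6, 0.16)
print(round(est, 2), round(var, 3))  # → 12.12 0.032
```

The same weighting rule underlies the measurement-update step of a Kalman filter, which is a common building block in automotive world-modeling stacks.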

  16. Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving

    Directory of Open Access Journals (Sweden)

    Jos Elfring

    2016-10-01

    Full Text Available The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.

  17. A New Methodology of Design and Development of Serious Games

    Directory of Open Access Journals (Sweden)

    André F. S. Barbosa

    2014-01-01

Full Text Available The development of a serious game requires perfect knowledge of the learning domain to obtain the desired results. But it is also true that this may not be enough to develop a successful serious game. First of all, the player has to feel that he is playing a game in which the learning is only a consequence of the playing actions. Otherwise, the game is viewed as boring rather than as a fun and engaging activity. For example, the player can catch some items in the scenario and then separate them according to their type (i.e., recycle them). Thus, the main action for the player is catching the items in the scenario, whereas the recycle action is a secondary action, viewed as a consequence of the first. Sometimes the game design relies on a detailed approach based on the ideas of the developers, because some educational contents are difficult to integrate in the games while maintaining the fun factor in the first place. In this paper we propose a new methodology of design and development of serious games that facilitates the integration of educational contents in the games. Furthermore, we present a serious game, called “Clean World”, created using this new methodology.

  18. Exploring the possibility of modeling a genetic counseling guideline using agile methodology.

    Science.gov (United States)

    Choi, Jeeyae

    2013-01-01

Increased demand for genetic counseling services has heightened the necessity of a computerized genetic counseling decision support system. In order to develop an effective and efficient computerized system, modeling of the genetic counseling guideline is an essential step. Throughout this pilot study, Agile methodology with the Unified Modeling Language (UML) was utilized to model a guideline. Thirteen tasks and 14 associated elements were extracted. Successfully constructed conceptual class and activity diagrams revealed that Agile methodology with UML is a suitable tool for modeling a genetic counseling guideline.

  19. A computer simulator for development of engineering system design methodologies

    Science.gov (United States)

    Padula, S. L.; Sobieszczanski-Sobieski, J.

    1987-01-01

    A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.

  20. Methodological issues in clinical drug development for essential tremor.

    Science.gov (United States)

    Carranza, Michael A; Snyder, Madeline R; Elble, Rodger J; Boutzoukas, Angelique E; Zesiewicz, Theresa A

    2012-01-01

    Essential tremor (ET) is one of the most common tremor disorders in the world. Despite this, only two medications have received Level A recommendations from the American Academy of Neurology to treat it (primidone and propranolol). Even though these medications provide relief to a large group of ET patients, up to 50% of patients are non-responders. Additional medications to treat ET are needed. This review discusses some of the methodological issues that should be addressed for quality clinical drug development in ET.

  1. Theoretical and methodological foundations of sustainable development of Geosystems

    Science.gov (United States)

    Mandryk, O. M.; Arkhypova, L. M.; Pukish, A. V.; Zelmanovych, A.; Yakovlyuk, Kh

    2017-05-01

The theoretical and methodological foundations of sustainable development of Geosystems were further evolved. The new scientific direction “constructive Hydroecology” was grounded: the science that studies the Hydrosphere from the standpoint of natural and technogenic safety on the basis of a geosystem approach. A structural separation of constructive Hydroecology based on objective, subjective and application characteristics was set out. The main object of study of the new scientific field is the hydroecological environment, understood as the part of the Hydrosphere belonging to a multicomponent dynamic system that is influenced by engineering and economic human activity and, in turn, determines this activity to some extent.

  2. Methodology for developing probabilistic productivity norms in civil engineering

    Directory of Open Access Journals (Sweden)

    Almayouf Khaled Omar

    2016-01-01

Full Text Available Successful implementation of the Critical Path Method requires the availability of a clearly defined duration for each activity, while the PERT method is based on subjective estimation. However, due to the long duration of construction and the unpredicted delays that accompany this process, it is often difficult or almost impossible to predict the exact duration of an activity, and consequently to take it for granted that a given activity will be finished on the very day given in the dynamic construction plan. The aim of the presented research was to establish a methodology for developing new probabilistic productivity norms for construction works, for planning under uncertainty.
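The probabilistic estimation the abstract contrasts with CPM can be illustrated with the classical PERT three-point formula; the activity and its duration triple below are invented for illustration, not taken from the paper.

```python
# PERT-style probabilistic duration estimate for a construction activity,
# from an (optimistic, most likely, pessimistic) triple in days.

def pert_estimate(a, m, b):
    """Return the PERT expected duration and standard deviation."""
    expected = (a + 4 * m + b) / 6.0  # beta-distribution mean approximation
    std_dev = (b - a) / 6.0           # conventional PERT spread estimate
    return expected, std_dev

# Hypothetical concrete-pouring activity: 4 / 6 / 14 days.
mean, sd = pert_estimate(4, 6, 14)
print(mean, round(sd, 2))  # expected ≈ 7.0 days, spread ≈ 1.67 days
```

A probabilistic norm replaces the single deterministic duration fed to CPM with such a distribution, so schedule risk can be quantified instead of assumed away.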

  3. Modeling myocardial infarction in mice: methodology, monitoring, pathomorphology.

    Science.gov (United States)

    Ovsepyan, A A; Panchenkov, D N; Prokhortchouk, E B; Telegin, G B; Zhigalova, N A; Golubev, E P; Sviridova, T E; Matskeplishvili, S T; Skryabin, K G; Buziashvili, U I

    2011-01-01

Myocardial infarction is one of the most serious and widespread diseases in the world. In this work, a minimally invasive method for simulating myocardial infarction in mice is described in the Russian Federation for the very first time; the procedure is carried out by ligation of the coronary heart artery or by controlled electrocoagulation. As a part of the methodology, a series of anesthetic, microsurgical and revival protocols are designed, owing to which a decrease in the postoperational mortality from the initial 94.6% to 13.6% is achieved. ECG confirms the development of large-focal or surface myocardial infarction. Postmortem histological examination confirms the presence of necrosis foci in the heart muscles of 87.5% of the animals. Altogether, the medical data allow us to conclude that an adequate mouse model for myocardial infarction was generated. A further study is focused on the standardization of the experimental procedure and the use of genetically modified mouse strains, with the purpose of finding the most efficient therapeutic approaches for this disease.

  4. Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology

    Energy Technology Data Exchange (ETDEWEB)

    Knezevic, J.; Odoom, E.R

    2001-07-01

A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices utilising the fuzzy Lambda-Tau method. Fuzzy set theory is used for representing the failure rate and repair time instead of the classical (crisp) set theory, because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because, unlike the fault tree methodology, they allow efficient simultaneous generation of minimal cut and path sets.
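The crisp Lambda-Tau expressions that the fuzzy method extends can be sketched as follows; in the fuzzy version these same formulas are applied to alpha-cut intervals of triangular fuzzy numbers rather than to single values. The component failure rates (λ, per hour) and repair times (τ, hours) below are illustrative assumptions.

```python
# Crisp Lambda-Tau combination rules for a repairable system, assuming
# standard AND/OR gate expressions for failure rate and repair time.

def and_gate(l1, t1, l2, t2):
    """Redundant pair: the system fails only when both components fail."""
    lam = l1 * l2 * (t1 + t2)
    tau = (t1 * t2) / (t1 + t2)
    return lam, tau

def or_gate(l1, t1, l2, t2):
    """Series pair: either component failing fails the system."""
    lam = l1 + l2
    tau = (l1 * t1 + l2 * t2) / (l1 + l2)
    return lam, tau

# Two hypothetical components in series:
lam, tau = or_gate(2e-3, 5.0, 1e-3, 8.0)
print(lam, tau)  # system failure rate and mean repair time
```

Evaluating these rules at each alpha-cut of fuzzy λ and τ inputs yields fuzzy system-level reliability indices, which is the essence of the fuzzy Lambda-Tau approach the abstract refers to.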

  5. TECHNOLOGY FOR DEVELOPMENT OF ELECTRONIC TEXTBOOK ON HANDICRAFTS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Iryna V. Androshchuk

    2017-10-01

    Full Text Available The main approaches to defining the concept of electronic textbook have been analyzed in the article. The main advantages of electronic textbooks in the context of future teachers’ training have been outlined. They are interactivity, feedback provision, availability of navigation and search engine. The author has presented and characterized the main stages in the technology of development of an electronic textbook on Handicraft and Technology Training Methodology: determination of its role and significance in the process of mastering the discipline; justification of its structure; outline of the stages of its development in accordance with the defined structure. The characteristic feature of the developed electronic textbook is availability of macro- and microstructure. Macrostructure is viewed as a sequence of components of the electronic textbook that are manifested in its content; microstructure is considered to be an internal pattern of each component of macrostructure.

  6. Methodological questions for the post-2015 development agenda.

    Directory of Open Access Journals (Sweden)

    Jacopo Bonan

    2014-07-01

    Full Text Available In 2015, the Millennium Development Goals are due to end. Academics, practitioners and the general public are eager to see which development agenda will take their place and a variety of different organizations are currently elaborating proposals for the next “round” of goals and targets. Instead of investigating possible topics of the upcoming agenda, we focus on methodological questions that – according to our view – will play a major role in the definition and implementation of future development goals. We focus on the elaboration of some key questions that should be addressed in the realm of poverty and inequality measurement, the definition of targets, the ability to consider complexity and evidence-based policy making.

  7. Methodology of citrate-based biomaterial development and application

    Science.gov (United States)

    Tran, M. Richard

Biomaterials play central roles in modern strategies of regenerative medicine and tissue engineering. Attempts to find tissue-engineered solutions to cure various injuries or diseases have led to an enormous increase in the number of polymeric biomaterials over the past decade. The breadth of new materials arises from the multiplicity of anatomical locations, cell types, and modes of application, which all place application-specific requirements on the biomaterial. Unfortunately, many of the currently available biodegradable polymers are limited in their versatility to meet the wide range of requirements for tissue engineering. Therefore, a methodology of biomaterial development, which is able to address a broad spectrum of requirements, would be beneficial to the biomaterial field. This work presents a methodology of citrate-based biomaterial design and application to meet the multifaceted needs of tissue engineering. We hypothesize that (1) citric acid, a non-toxic metabolic product of the body (Krebs Cycle), can be exploited as a universal multifunctional monomer and reacted with various diols to produce a new class of soft biodegradable elastomers with the flexibility to tune the material properties of the resulting material to meet a wide range of requirements; (2) the newly developed citrate-based polymers can be used as platform biomaterials for the design of novel tissue engineering scaffolding; and (3) microengineering approaches in the form of thin scaffold sheets, microchannels, and a new porogen design can be used to generate complex cell-cell and cell-microenvironment interactions to mimic tissue complexity and architecture.
To test these hypotheses, we first developed a methodology of citrate-based biomaterial development through the synthesis and characterization of a family of in situ crosslinkable and urethane-doped elastomers, which are synthesized using simple, cost-effective strategies and offer a variety of methods to tailor the material properties to

  8. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

objective is to attack the boost phase of ballistic missiles using the Airborne Weapons Layer concept (AWL) (Corbett, 2013) and (Rood, Chilton, Campbell...and analysis techniques used in this research. Chapter 4 provides analysis of the simulation model to illustrate the methodology in Chapter 3 and to... techniques, and procedures. The purpose of our research is to study the use of a new missile system within an air combat environment. Therefore, the

  9. Teaching methodology for modeling reference evapotranspiration with artificial neural networks

    OpenAIRE

    Martí, Pau; Pulido Calvo, Inmaculada; Gutiérrez Estrada, Juan Carlos

    2015-01-01

[EN] Artificial neural networks are a robust alternative to conventional models for estimating different targets in irrigation engineering, among others, reference evapotranspiration, a key variable for estimating crop water requirements. This paper presents a didactic methodology for introducing students to the application of artificial neural networks for reference evapotranspiration estimation using MatLab©. Apart from learning a specific application of this software wi...
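The paper's teaching examples are in MatLab and are not reproduced in this record; as a language-neutral illustration of the underlying idea, the sketch below trains a single linear neuron by gradient descent on synthetic data. The "temperature → ET0" relation and all numbers are toy assumptions; a real reference-evapotranspiration model would use a multilayer network and measured climate inputs.

```python
# Toy gradient-descent training of one linear neuron on noiseless
# synthetic data following y = 0.3*x + 1.0. The learned weight and bias
# should recover those coefficients.

# Hypothetical "temperature -> ET0" pairs.
data = [(x, 0.3 * x + 1.0) for x in range(0, 40, 2)]

w, b, lr = 0.0, 0.0, 0.001
for _ in range(5000):
    for x, y in data:
        err = (w * x + b) - y   # prediction error on this sample
        w -= lr * err * x       # gradient of squared error w.r.t. w
        b -= lr * err           # gradient of squared error w.r.t. b

print(round(w, 2), round(b, 2))  # ≈ 0.3 and 1.0
```

Multilayer networks generalize this loop with hidden layers, nonlinear activations, and backpropagation, but the fit-by-iterative-weight-update principle the students are taught is the same.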

  10. Graphic display development methodology: Volume 1, Theory: Final report

    International Nuclear Information System (INIS)

    Pankrantz, D.

    1986-11-01

The Graphic Display Development Program is intended to develop computer-based displays which support the symptomatic emergency operating procedures for BWRs. The purpose is to provide a technical basis and methodology for linking two post-TMI safety initiatives: improved operating procedures and the Safety Parameter Display System (NUREG-0737, Supplement 1). Whereas consistency between displays and emergency operating procedures is desirable, no assumption of such an NRC requirement is either expressed or implied in this document. Accordingly, this program should be viewed not as the only acceptable approach to SPDS design but as one of many possible approaches which may be pursued. This program has been supported as a generic activity on behalf of the Boiling Water Reactor Owners' Group (BWROG). No endorsement by any individual utility member of the BWROG is either expressed or implied in this document, nor is any utility obligated to implement this program at any plant.

  11. KALIMER database development (database configuration and design methodology)

    International Nuclear Information System (INIS)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Young Bum; Chang, Won Pyo; Hahn, Do Hee

    2001-10-01

The KALIMER Database is an advanced database for the integrated management of Liquid Metal Reactor Design Technology Development using Web applications. The KALIMER design database consists of a Results Database, Inter-Office Communication (IOC), a 3D CAD database, a Team Cooperation system, and Reserved Documents. The Results Database holds the research results of phase II of Liquid Metal Reactor Design Technology Development under the mid- and long-term nuclear R and D programme. IOC is an inter-subproject linkage control system for sharing and integrating the research results for KALIMER. The 3D CAD database provides a schematic design overview of KALIMER. The Team Cooperation system informs team members about research cooperation and meetings. Finally, KALIMER Reserved Documents was developed to manage collected data and other documents accumulated since the project's accomplishment. This report describes the hardware and software features and the database design methodology for KALIMER.

  12. System study methodology development and potential utilization for fusion

    International Nuclear Information System (INIS)

    Djerassi, H.; Rouillard, J.; Leger, D.; Sarto, S.; Zappellini, G.; Gambi, G.

    1989-01-01

The objective of this new methodology is to combine systemics with heuristics for engineering applications. The system method considers as a whole a set of dynamically interacting elements organized for tasks. Heuristics tries to describe the rules to apply in scientific research. This methodology is a powerful tool for evaluating options compared with conventional analytical methods, as a higher number of parameters can be taken into account and the possible options compared to a higher quality standard. The system method takes into account interacting data and random relationships by means of simulation modelling. Thus a dynamic approach can be deduced, and a sensitivity analysis can be performed for a very high number of options and basic data. The method can be limited to a specific objective, such as a fusion reactor safety analysis, while taking into account other major constraints such as the economic environment. The sophisticated architecture of a fusion reactor includes a large number of interacting systems. The novel character of the fusion domain and the wide spectrum of possible options strongly increase the advantages of a system study, as a complete safety analysis can be defined before starting the design. (orig.)

  13. Development of a heat exchanger root-cause analysis methodology

    International Nuclear Information System (INIS)

    Jarrel, D.B.

    1989-01-01

The objective of this work is to determine a generic methodology for accurately identifying the root cause of component failure. Root-cause determinations are an everyday challenge to plant personnel, but they are handled with widely differing degrees of success owing to differences in approach, level of diagnostic expertise, and documentation. The criterion for success is simple: if the root cause of the failure has truly been determined and corrected, the same causal failure relationship will not be demonstrated again in the future. The approach to root-cause analysis (RCA) element definition was first to selectively choose and constrain a functionally significant component (in this case a component cooling water to service water heat exchanger) that has demonstrated prevalent failures. A root-cause-of-failure analysis was then performed by a systems engineer on a large number of actual failure scenarios. The analytical process used by the engineer was documented and evaluated to abstract the logic model used to arrive at the root cause. For the case of the heat exchanger, the actual root-cause diagnostic approach is described, and a generic methodology for determining the root cause of component failure is demonstrated for this general heat exchanger example.

  14. CHARACTERISTICS OF RESEARCH METHODOLOGY DEVELOPMENT IN SPECIAL EDUCATION AND REHABILITATION

    Directory of Open Access Journals (Sweden)

    Natasha ANGELOSKA-GALEVSKA

    2004-12-01

Full Text Available The aim of the text is to point out the developmental tendencies in the research methodology of special education and rehabilitation, worldwide and in our country, and to emphasize the importance of the methodological training of students in special education and rehabilitation at the Faculty of Philosophy in Skopje. Scientific knowledge achieved through research is the fundamental precondition for the development of special education and rehabilitation theory and practice. The results of scientific work sometimes cause small, insignificant changes, but at times they make radical changes. Thanks to scientific research and knowledge, certain prejudices were rejected. For example, in the sixth decade of the last century there was a strong prejudice that mentally retarded children should be segregated from society as aggressive and unfriendly, or that deaf children should not learn sign language because they would not be motivated to learn lip-reading and would hardly adapt. Piaget and his colleagues from the Geneva institute were the pioneers in researching this field, and they established the belief that handicapped children are not handicapped in every field and have potentials that can be developed and improved by systematic and organized work. It is important to initiate further research in the field of special education and rehabilitation, as well as a critical analysis of the research already carried out. Further development of scientific research in special education and rehabilitation should be a base for education policy on people with disabilities and for the development of institutional and non-institutional treatment of this population.

  15. Methodology and Results of Mathematical Modelling of Complex Technological Processes

    Science.gov (United States)

    Mokrova, Nataliya V.

    2018-03-01

The methodology of system analysis allows us to construct a mathematical model of a complex technological process. A mathematical description of the plasma-chemical process was proposed. The importance of the quenching rate and of the initial-temperature decrease time was confirmed for producing the maximum amount of the target product. The results of numerical integration of the system of differential equations can be used to describe reagent concentrations, plasma jet rate and temperature in order to achieve the optimal quenching mode. Such models are applicable both for solving control problems and for predicting future states of sophisticated technological systems.
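The numerical-integration step the abstract mentions can be illustrated on the simplest possible case: a classical fourth-order Runge-Kutta integrator applied to a single first-order quench reaction dc/dt = -k·c. The real model couples many such equations with temperature and jet-rate terms; the rate constant and initial concentration here are invented.

```python
# RK4 integration of dc/dt = -k*c, compared against the exact solution
# c(t) = c0 * exp(-k*t).
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step of size h."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h * k1 / 2)
    k3 = f(t + h / 2, y + h * k2 / 2)
    k4 = f(t + h, y + h * k3)
    return y + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

k, c = 0.5, 1.0               # hypothetical rate constant (1/s), initial concentration
f = lambda t, c: -k * c       # first-order decay of the reagent
t, h = 0.0, 0.01
while t < 2.0 - 1e-12:        # integrate to t = 2 s in 200 steps
    c = rk4_step(f, t, c, h)
    t += h

print(round(c, 6), round(math.exp(-1.0), 6))  # numeric vs exact e^{-kt}
```

For a full plasma-chemical model, c becomes a vector of reagent concentrations plus temperature, and f encodes the coupled kinetics, but the stepping scheme is unchanged.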

  16. [Methodology for the development of policy brief in public health].

    Science.gov (United States)

    Felt, Emily; Carrasco, José Miguel; Vives-Cases, Carmen

    2018-01-10

    A policy brief is a document that summarizes research to inform policy. In a brief and succinct way, it defines a policy problem, presents a synthesis of relevant evidence, identifies possible courses of action and makes recommendations or key points. The objective of this note is to describe the methodology used to produce a policy brief for communicating public health research. This note is based on the model presented by Eugene Bardach in addition to the authors' own experiences. We describe six steps: 1) identifying the audience; 2) defining the problem; 3) gathering information and evidence; 4) consideration of policy alternatives; 5) projecting results and designing recommendations; and 6) telling the story. We make a case for the use of policy briefs as a part of an overall communications strategy for research that aims to bring together research teams and stakeholders. Copyright © 2017 SESPAS. Publicado por Elsevier España, S.L.U. All rights reserved.

  17. Urban Agglomerations in Regional Development: Theoretical, Methodological and Applied Aspects

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Shmidt

    2016-09-01

Full Text Available The article focuses on the analysis of a major process of modern socio-economic development: the functioning of urban agglomerations. A short background of the economic literature on this phenomenon is given, covering both the traditional conceptions (the concentration of urban types of activities, the grouping of urban settlements by intensive production and labour communications) and the modern ones (cluster theories, theories of the network society). Two methodological principles of studying agglomeration are emphasized: the principle of the unity of the spatial concentration of economic activity, and the principle of compact living of the population. The positive and negative effects of agglomeration in the economic and social spheres are studied. It is therefore concluded that agglomeration is helpful when it brings agglomerative economies (the positive benefits from it exceed the additional costs). A methodology for examining an urban agglomeration and its role in regional development is offered. The approbation of this methodology on the example of Chelyabinsk and the Chelyabinsk region has made it possible to carry out a comparative analysis of the regional centre and the whole region by the main socio-economic indexes under static and dynamic conditions, and to draw conclusions on the position of the city and the region based on such socio-economic indexes as the average monthly nominal accrued wage, the cost of fixed assets, investments into fixed capital, new housing supply, retail turnover, and the volume of self-produced shipped goods, works and services performed in the region. In the study, an analysis of a launching site of the Chelyabinsk agglomeration is carried out.
It has revealed the following main characteristics of the core of the agglomeration in Chelyabinsk (structure feature, population, level of centralization of the core as well as the Chelyabinsk agglomeration in general (coefficient of agglomeration

  18. METHODOLOGY OF RESEARCH AND DEVELOPMENT MANAGEMENT OF REGIONAL NETWORK ECONOMY

    Directory of Open Access Journals (Sweden)

    O.I. Botkin

    2007-06-01

    Full Text Available Information and communication Internet technologies now permeate practically all branches of the Russian regional economies and exert a strong influence on the development of economic relations in regional business: new forms of interaction between economic actors are emerging, and the information and organizational structures of regional business management are changing. The integrated expression of these innovations is the regional network economy: an interactive environment in which social, economic and commodity-monetary relations between the economic actors of a region are performed at high speed and with minimal transaction costs (in R. H. Coase's sense), using the interactive opportunities of the global Internet. The relevance of studying the regional network economy phenomenon is driven, first of all, by the need to substantiate a methodology for its development and to develop mechanisms for managing its infrastructure, with the aim of increasing the efficiency of regional business. In our opinion, solving these problems will be the defining factor in sustaining effective economic development and growth of the Russian regional economies in the near future.

  19. Developing a business analytics methodology: a case study in the foodbank sector

    OpenAIRE

    Hindle, Giles; Vidgen, Richard

    2017-01-01

    The current research seeks to address the following question: how can organizations align their business analytics development projects with their business goals? To pursue this research agenda we adopt an action research framework to develop and apply a business analytics methodology (BAM). The four-stage BAM (problem situation structuring, business model mapping, analytics leverage analysis, and analytics implementation) is not a prescription. Rather, it provides a logical structure and log...

  20. Development and evaluation of clicker methodology for introductory physics courses

    Science.gov (United States)

    Lee, Albert H.

    Many educators understand that lectures are cost effective but not learning efficient, so they continue to search for ways to increase active student participation in this traditionally passive learning environment. In-class polling systems, or "clickers", are inexpensive and reliable tools that allow students to actively participate in lectures by answering multiple-choice questions. Students assess their learning in real time by observing instant polling summaries displayed in front of them. This in turn motivates additional discussions which increase the opportunity for active learning. We wanted to develop a comprehensive clicker methodology that creates an active lecture environment for a broad spectrum of students taking introductory physics courses, and we wanted our methodology to incorporate many findings of contemporary learning science. It is recognized that learning requires active construction; students need to be actively involved in their own learning process. Learning also depends on preexisting knowledge; students construct new knowledge and understandings based on what they already know and believe. Learning is context dependent; students who have learned to apply a concept in one context may not be able to recognize and apply the same concept in a different context, even when both contexts are considered isomorphic by experts. On this basis, we developed question sequences, each involving the same concept but having different contexts. Answer choices are designed to address students' preexisting knowledge. These sequences are used with the clickers to promote active discussions and multiple assessments. We have created, validated, and evaluated sequences sufficient in number to cover all of the introductory physics courses. Our research has found that using clickers with our question sequences significantly improved student conceptual understanding. 
Our research has also found how best to measure student conceptual gain using research-based instruments.

  1. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of using such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.
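The backtracking step described above, deriving fault trees whose cut sets are combinations of lower-level events leading to a top event, can be sketched in miniature. This is an illustrative toy, not the LFM tool itself: the gate structure, event names, and the plain AND/OR semantics below are hypothetical assumptions.

```python
# Toy backtracking from a top event through a small AND/OR causal model
# to enumerate the cut sets (combinations of basic events) that produce it.

def cut_sets(event, gates):
    """Return the list of basic-event combinations (cut sets) leading to `event`."""
    if event not in gates:          # basic event: a cut set of one
        return [frozenset([event])]
    kind, inputs = gates[event]
    if kind == "OR":                # any single input suffices
        return [cs for e in inputs for cs in cut_sets(e, gates)]
    # AND: every input must occur -> cross-product of the inputs' cut sets
    sets = [frozenset()]
    for e in inputs:
        sets = [a | b for a in sets for b in cut_sets(e, gates)]
    return sets

# Hypothetical embedded-system model: top event = unsafe actuation
gates = {
    "unsafe_actuation": ("AND", ["command_error", "interlock_failed"]),
    "command_error": ("OR", ["software_fault", "sensor_bias"]),
}

result = cut_sets("unsafe_actuation", gates)
# Two cut sets: {software_fault, interlock_failed} and {sensor_bias, interlock_failed}
```

A real LFM analysis adds timing and state-transition information on top of this purely logical backtracking.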

  2. Development of tyre/road noise assessment methodology in India

    Directory of Open Access Journals (Sweden)

    Mohammed Ali Boodihal

    2014-01-01

    Full Text Available The major objective of this research study was to develop a methodology to evaluate tyre/road noise of the various road types and sections in Bangalore. The scope of the effort included field noise measurements of 17 conventional asphalt concrete (AC), four Portland cement concrete (PCC), and two plastic modified asphalt concrete (PMAC) sections in Bangalore city, covering about 24 km of roadway stretches at varying traffic speeds. Field noise measurements were performed using a noise meter mounted underneath a trailer developed in this study and attached to the parent vehicle. Overall, PMAC sections produced higher noise levels than the PCC sections, followed by the conventional AC sections; the PMAC mix type differed by an average of about 6–8 decibels (dB(A)) compared with the AC mix, and 1–2 dB(A) in comparison with the PCC mix types. It is noteworthy that although many traffic noise studies have been conducted in India, the contribution of tyre/road noise to the overall noise had not been established to date. The approach taken in this study is the first of its kind within the framework of tyre/road noise research and development in India.

  3. Formulación de una Metodología de Formación y Evaluación en Empresarismo, bajo un Modelo de Competencias (Development of an entrepreneurial training and evaluation methodology under the competency - based model

    Directory of Open Access Journals (Sweden)

    Paola Podestá

    2012-12-01

    Full Text Available This article derives from a research project motivated by the interest in having an entrepreneurship training and evaluation model for EAFIT University. A competency-based model was chosen, given the current trend in pedagogy toward this perspective, with the concept of competency understood as "doing in context". Entrepreneurship is currently, together with promotion and guidance processes, one of EAFIT University's development strategies, and the training process is one of the pillars upon which the program is built. The outcome of this research is a competency-based training and evaluation model for entrepreneurship that serves not only the institution internally but also as a replicable methodology for consulting projects on the subject.

  4. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    International Nuclear Information System (INIS)

    Dorp, F. van

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal; nevertheless, many of the basic principles are considered equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts, and includes the justification, arguments and documentation for all of its steps. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence to demonstrate that a biosphere model is fit for its intended purpose. The starting point for the methodology has three components. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development, including FEPs to do with the assessment context. 
The context examined in detail by

  5. Development of a design methodology for hydraulic pipelines carrying rectangular capsules

    International Nuclear Information System (INIS)

    Asim, Taimoor; Mishra, Rakesh; Abushaala, Sufyan; Jain, Anuj

    2016-01-01

    The scarcity of fossil fuels is affecting the efficiency of established modes of cargo transport within the transportation industry. Efforts have been made to develop innovative modes of transport that can be adopted for economically and environmentally friendly operating systems. Solid material, for instance, can be packed in rectangular containers (commonly known as capsules), which can then be transported in different concentrations very effectively using the fluid energy in pipelines. For economical and efficient design of such systems, both the local flow characteristics and the global performance parameters need to be carefully investigated. Published literature is severely limited in establishing the effects of local flow features on system characteristics of Hydraulic Capsule Pipelines (HCPs). The present study uses a well validated Computational Fluid Dynamics (CFD) tool to numerically simulate the solid-liquid mixture flow in both on-shore and off-shore HCP applications, including bends. Discrete Phase Modelling (DPM) has been employed to calculate the velocity of the rectangular capsules. Numerical predictions have been used to develop novel semi-empirical prediction models for pressure drop in HCPs, which have then been embedded into a robust and user-friendly pipeline optimisation methodology based on the Least-Cost Principle. - Highlights: • Local flow characteristics in a pipeline transporting rectangular capsules. • Development of prediction models for the pressure drop contribution of capsules. • Methodology developed for sizing of Hydraulic Capsule Pipelines. • Implementation of the developed methodology to obtain optimal pipeline diameter.
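The Least-Cost Principle for pipeline sizing mentioned above can be sketched as a simple search over candidate diameters, trading capital cost (which grows with diameter) against pumping cost (which falls as pressure drop falls). All coefficients and the Darcy-type pressure-drop correlation below are hypothetical placeholders, not the semi-empirical models developed in the study.

```python
# Least-cost pipeline sizing sketch: pick the diameter minimizing
# annualized capital cost + pumping energy cost. All numbers are illustrative.

def total_cost(D, L=1000.0, Q=0.05, years=20):
    """Annualized cost for diameter D (m), length L (m), water flow Q (m3/s)."""
    capital = 1200.0 * D**1.5 * L / years            # capital cost per year (hypothetical)
    v = Q / (3.14159 / 4 * D**2)                     # mean flow velocity (m/s)
    dp = 0.02 * (L / D) * 1000.0 * v**2 / 2          # Darcy-type pressure drop (Pa)
    pumping = dp * Q / 0.7 * 8000 * 3600 * 1e-8      # yearly energy cost (arbitrary tariff)
    return capital + pumping

candidates = [0.10, 0.15, 0.20, 0.25, 0.30]          # candidate diameters (m)
best = min(candidates, key=total_cost)               # least-cost diameter
```

With these illustrative coefficients the optimum falls at an intermediate diameter: small pipes are dominated by pumping cost, large pipes by capital cost.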

  6. Methodology and preliminary models for analyzing nuclear-safeguards decisions

    International Nuclear Information System (INIS)

    Judd, B.R.; Weissenberger, S.

    1978-11-01

    This report describes a general analytical tool designed at Lawrence Livermore Laboratory to assist the Nuclear Regulatory Commission in making nuclear safeguards decisions. The approach is based on decision analysis, a quantitative procedure for making decisions under uncertain conditions. The report describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material; demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria); and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant.
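The decision-analysis approach described, weighing safeguards costs against the probability and consequences of diversion, can be illustrated with a toy expected-cost calculation. The option names and every number below are hypothetical, not values from the report.

```python
# Toy safeguards decision analysis: choose the option minimizing
# expected total cost = safeguards cost + P(diversion) * consequence cost.

options = {
    # option: (annual safeguards cost, P(successful diversion), consequence cost)
    "baseline": (1.0, 1e-3, 5000.0),
    "upgraded": (3.0, 1e-4, 5000.0),
    "maximum":  (9.0, 1e-5, 5000.0),
}

def expected_total_cost(opt):
    cost, p, consequence = options[opt]
    return cost + p * consequence    # expected value under uncertainty

best = min(options, key=expected_total_cost)
# With these numbers, "upgraded" wins: extra safeguards spending is justified
# only while it buys a larger reduction in expected diversion loss.
```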

  7. Methodology and preliminary models for analyzing nuclear safeguards decisions

    International Nuclear Information System (INIS)

    1978-11-01

    This report describes a general analytical tool designed to assist the NRC in making nuclear safeguards decisions. The approach is based on decision analysis, a quantitative procedure for making decisions under uncertain conditions. The report describes illustrative models that quantify the probability and consequences of diverted special nuclear material and the costs of safeguarding the material; demonstrates a methodology for using this information to set safeguards regulations (safeguards criteria); and summarizes insights gained in a very preliminary assessment of a hypothetical reprocessing plant.

  8. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

    2010-01-01

    In this paper we discuss the notion of a learning methodology and situate this within the wider frame of learning design or "Designing for Learning". We discuss existing work within this broad area by trying to categorize different approaches and interpretations, and we present our development within the EATrain2 project as a way of enabling teachers and practitioners to collaboratively design courses. We describe how the collaborative e-learning design (CoED) method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem-based learning and web 2.0 technologies. As part of the CoED method and the broader learning methodology, the authors held a workshop for the project partners in the EATrain2 project, and the results...

  9. Model continuity in discrete event simulation: A framework for model-driven development of simulation models

    NARCIS (Netherlands)

    Cetinkaya, D; Verbraeck, A.; Seck, MD

    2015-01-01

    Most of the well-known modeling and simulation (M&S) methodologies state the importance of conceptual modeling in simulation studies, and they suggest the use of conceptual models during the simulation model development process. However, only a limited number of methodologies refer to how to

  10. A Design Science Research Methodology for Expert Systems Development

    Directory of Open Access Journals (Sweden)

    Shah Jahan Miah

    2016-11-01

    Full Text Available The knowledge of design science research (DSR) can have applications for improving expert systems (ES) development research. Although significant progress in utilising DSR has been observed in particular information systems design – such as decision support systems (DSS) studies – only rare attempts can be found in the ES design literature. Therefore, the aim of this study is to investigate the use of DSR for ES design. First, we explore the ES development literature to reveal the presence of DSR as a research methodology. For this, we select relevant literature criteria and apply a qualitative content analysis in order to generate themes inductively to match the DSR components. Second, utilising the findings of the comparison, we determine a new DSR approach for designing a specific ES that is guided by another result – the findings of a content analysis of examination scripts in Mathematics. The specific ES artefact for a case demonstration is designed to address the requirements of a 'wicked' problem, in that its key purpose is to assist human assessors when evaluating multi-step question (MSQ) solutions. It is anticipated that the proposed design knowledge, in terms of both the problem class and the functions of ES artefacts, will help ES designers and researchers to address similar issues when designing information system solutions.

  11. Development of a New Methodology for Rock Engineering Design

    Science.gov (United States)

    1994-03-01

    Abstract not available: the extracted text consists of table-of-contents and figure-list fragments from the thesis, citing design methodology flow charts (Bieniawski 1989), the design cycle for rock engineering (Bieniawski 1984), design methodology for rock mechanics including the use of design principles (Bieniawski 1991), and excavations in rock (Hoek & Brown 1980).

  12. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but this data is not provided by this document.

  13. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but this data is not provided by this document.

  14. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    Science.gov (United States)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major industrial waste generators. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are generated daily in plants, which costs the industry tremendously in waste treatment and disposal and hinders the further development of the industry. There is therefore an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules; these form the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. 
The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and

  15. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Gavin Hawkley

    2010-07-01

    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of the leak path factor (LPF), the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. The LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the fraction of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and upon other pathways from the building, such as doorways, both open and closed. The study shows how the multiple LPFs within the building can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a downwind receptor estimated, and the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). The study also briefly addresses particle characteristics that affect atmospheric particle dispersion, and compares this dispersion with the LPF methodology.
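The combinatory evaluation of multiple LPFs that the study proposes (in place of a flat 0.5 × 0.5 assumption) can be sketched as a hand calculation: compartments traversed in series multiply, and parallel exit pathways sum weighted by the flow fraction through each. All numbers below are hypothetical, not MELCOR results.

```python
# Combining leak path factors: series compartments multiply,
# parallel exit pathways sum weighted by flow fraction. Illustrative only.

def serial_lpf(factors):
    """Total LPF for compartments traversed in series."""
    total = 1.0
    for f in factors:
        total *= f
    return total

def parallel_lpf(paths):
    """Total LPF over parallel pathways: sum of (flow fraction * path LPF)."""
    return sum(frac * lpf for frac, lpf in paths)

room_to_corridor = serial_lpf([0.4, 0.6])        # two rooms in series -> 0.24
total = parallel_lpf([
    (0.7, room_to_corridor * 0.01),              # filtered ventilation path
    (0.3, room_to_corridor * 0.5),               # unfiltered doorway path
])
source_term = 1.0e3 * total                      # grams respirable, from 1 kg at release point
```

Note how a filtered majority path plus an unfiltered minority path yields a total LPF (about 0.038 here) far from the flat 0.25 that a 0.5 × 0.5 assumption would give.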

  16. Development of a methodology for assessing the safety of embedded software systems

    Science.gov (United States)

    Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.

    1993-01-01

    A Dynamic Flowgraph Methodology (DFM) based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety-critical software functions.

  17. Development of a Teaching Methodology for Undergraduate Human Development in Psychology

    Science.gov (United States)

    Rodriguez, Maria A.; Espinoza, José M.

    2015-01-01

    The development of a teaching methodology for the undergraduate Psychology course Human Development II in a private university in Lima, Peru is described. The theoretical framework consisted of an integration of Citizen Science and Service Learning, with the application of Information and Communications Technology (ICT), specifically Wikipedia and…

  18. A ROADMAP FOR GENERATING SEMANTICALLY ENRICHED BUILDING MODELS ACCORDING TO CITYGML MODEL VIA TWO DIFFERENT METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    G. Floros

    2016-10-01

    Full Text Available Methodologies for 3D modeling have multiplied with the rapid advance of new technologies. Nowadays, 3D modeling software focuses not only on the finest visualization of the models but also on their semantic features during the modeling procedure, so that the models generated are both realistic and semantically enriched. Additionally, various extensions of modeling software allow for the immediate conversion of a model's format via semi-automatic procedures suited to the user's scope. The aim of this paper is to investigate the generation of a semantically enriched CityGML building model via two different methodologies. The first methodology includes modeling in Trimble SketchUp and transformation in FME Desktop Manager, while the second includes model generation in CityEngine and transformation into the CityGML format via the 3DCitiesProject extension for ArcGIS. Finally, the two methodologies are compared and specific characteristics are evaluated, in order to infer which methodology is best applied depending on the purposes of different projects.

  19. Development of Methodologies for IV and V of Neural Networks

    Science.gov (United States)

    Taylor, Brian; Darrah, Marjorie

    2003-01-01

    Non-deterministic systems often rely upon neural network (NN) technology to "learn" to manage flight systems under controlled conditions using carefully chosen training sets. How can these adaptive systems be certified to ensure that they will become increasingly efficient and behave appropriately in real-time situations? The bulk of Independent Verification and Validation (IV&V) research on non-deterministic software control systems such as Adaptive Flight Controllers (AFCs) addresses NNs in well-behaved and constrained environments such as simulations and strict process control. However, neither substantive research nor effective IV&V techniques have been found to address AFCs learning in real time and adapting to live flight conditions. Adaptive flight control systems offer good extensibility into commercial aviation as well as military aviation and transportation. Consequently, this area of IV&V represents an area of growing interest and urgency. ISR proposes to further the current body of knowledge to meet two objectives: research the current IV&V methods and assess where these methods may be applied toward a methodology for the V&V of neural networks; and identify effective methods for IV&V of NNs that learn in real time, including developing a prototype test bed for IV&V of AFCs. Currently, no practical method exists. ISR will meet these objectives through the tasks identified and described below. First, ISR will conduct a literature review of current IV&V technology. To do this, ISR will collect the existing body of research on IV&V of non-deterministic systems and neural networks. ISR will also develop the framework for disseminating this information through specialized training. This effort will focus on developing NASA's capability to conduct IV&V of neural network systems and to provide training to meet the increasing need for IV&V expertise in such systems.

  20. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, H. P.; Agoram, B.; Davies, M. R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, P. H.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  1. Development methodology for the software life cycle process of the safety software

    International Nuclear Information System (INIS)

    Kim, D. H.; Lee, S. S.; Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B.

    2002-01-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). A software life cycle model is selected as a hybrid of the waterfall, prototyping, and spiral models, and is composed of two stages: development of the ESF-CCS prototype and development of the ESF-CCS itself. To produce the software life cycle (SLC) for the development of the digital reactor safety system, the activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available Organizational Process Assets (OPAs) are applied to the SLC activities and the known constraints are reconciled. The established SLCP describes well the software life cycle activities that are provided to the Regulatory Authority.

  2. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    Science.gov (United States)

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon the purpose of the assessment, the basic geologic assumptions of the area, the type of available data, the time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study area or for cross-checking resource estimates. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.
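    Requirement (3) above, aggregating estimates from independently assessed areas without Monte Carlo simulation, can be illustrated with a minimal sketch: if each play's resource estimate is summarized by a mean and standard deviation and the plays are treated as independent, the aggregate mean and variance follow analytically. The play values below are purely illustrative, not USGS figures.

```python
import math

# Hypothetical per-play resource estimates: (mean, standard deviation),
# e.g. in millions of barrels. Values are illustrative only.
plays = [(120.0, 40.0), (75.0, 30.0), (200.0, 90.0)]

# Analytic aggregation under independence: means and variances add,
# so no Monte Carlo sampling is needed to aggregate the plays.
agg_mean = sum(mean for mean, sd in plays)
agg_sd = math.sqrt(sum(sd ** 2 for mean, sd in plays))

print(agg_mean)          # 395.0
print(round(agg_sd, 1))  # 103.0
```

    Correlated plays would require covariance terms in the variance sum; the independence assumption is what keeps the aggregation closed-form.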

  3. Development of methodology for the characterization of radioactive sealed sources

    International Nuclear Information System (INIS)

    Ferreira, Robson de Jesus

    2010-01-01

    Sealed radioactive sources are widely used in many applications of nuclear technology in industry, medicine, research and other fields. The International Atomic Energy Agency (IAEA) estimates there are tens of millions of sources in the world. In Brazil, the number is about 500 thousand sources, if the Americium-241 sources present in radioactive lightning rods and smoke detectors are included in the inventory. At the end of their useful life, most sources become disused, constitute radioactive waste, and are then termed spent sealed radioactive sources (SSRS). In Brazil, this waste is collected by the research institutes of the National Nuclear Energy Commission (CNEN) and kept under centralized storage, awaiting definition of the final disposal route. The Waste Management Laboratory (WML) at the Nuclear and Energy Research Institute is the main storage center, having received by July 2010 about 14,000 disused sources, not including the tens of thousands of lightning rod and smoke detector sources. A program is underway in the WML to replace the original shieldings by a standard disposal package and to determine the radioisotope content and activity of each source. The identification of the radionuclides and the measurement of activities will be carried out with a well-type ionization chamber. This work aims to develop a methodology for measuring or determining the activity of the SSRS stored in the WML in accordance with their geometry, and to determine the associated uncertainties. (author)

  4. Development of probabilistic assessment methodology for geologic disposal of radioactive wastes

    International Nuclear Information System (INIS)

    Kimura, H.; Takahashi, T.

    1998-01-01

    The probabilistic assessment methodology is essential to evaluate uncertainties of long-term radiological consequences associated with geologic disposal of radioactive wastes. We have developed a probabilistic assessment methodology to estimate the influences of parameter uncertainties/variabilities. The exposure scenario considered here is based on a groundwater migration scenario. The computer code system GSRW-PSA thus developed is based on a non-site-specific model and consists of a set of sub-modules for sampling of model parameters, calculating the release of radionuclides from engineered barriers, calculating the transport of radionuclides through the geosphere, calculating radiation exposures of the public, and calculating the statistical values relating to the uncertainties and sensitivities. The results of uncertainty analyses for α-nuclides quantitatively indicate that natural uranium (238U) concentration is suitable as an alternative safety indicator for long-lived radioactive waste disposal, because the estimated range of individual dose equivalent due to the 238U decay chain is narrower than that due to the other decay chain (the 237Np decay chain). It is internationally necessary to have detailed discussion on the PDFs of model parameters and on the PSA methodology to evaluate the uncertainties due to conceptual models and scenarios. (author)
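    The structure described, sampling model parameters, propagating them through release/transport/exposure calculations, and computing statistics of the result, can be sketched in miniature. The toy dose model and parameter distributions below are invented for illustration and bear no relation to the actual GSRW-PSA parameterization.

```python
import random
import statistics

random.seed(1)

# Toy dose model: dose = release * transport / dilution. The parameter
# distributions are assumptions made for this sketch; a real assessment
# samples parameters of the engineered-barrier release, geosphere
# transport, and exposure sub-modules.
def sample_dose():
    release = random.lognormvariate(0.0, 0.5)    # relative release rate
    transport = random.uniform(0.1, 0.5)         # geosphere transfer factor
    dilution = random.lognormvariate(2.0, 0.3)   # biosphere dilution
    return release * transport / dilution

doses = sorted(sample_dose() for _ in range(10_000))

# Statistical values describing the uncertainty of the estimated dose.
median = statistics.median(doses)
p95 = doses[int(0.95 * len(doses))]
print(median < p95)  # True
```

    Sensitivity measures (e.g. rank correlations between each sampled parameter and the dose) would be computed from the same sample in the same loop.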

  5. A methodology to support the development of 4-year pavement management plan.

    Science.gov (United States)

    2014-07-01

    A methodology for forming and prioritizing pavement maintenance and rehabilitation (M&R) projects was developed. The Texas Department of Transportation (TxDOT) can use this methodology to generate defensible and cost-effective 4-year pavement man...

  6. The Development of Trade Union theory and Mainstream Economic Methodology

    OpenAIRE

    Drakopoulos, Stavros A.; Katselidis, Ioannis

    2012-01-01

    The pre-war approaches to trade unions were mainly based on the theoretical and methodological viewpoints of early institutional economics. Trade unions were conceived of as politico-economic organizations whose members were motivated by relative comparisons and also were concerned with issues of equity and justice. In the post-war period, there was a major theoretical and methodological shift towards the idea of unions as optimizing economic units with well-defined objective functions which ...

  7. Methodological development of the process of appreciation of photography Conceptions

    Directory of Open Access Journals (Sweden)

    Yovany Álvarez García

    2012-12-01

    Full Text Available This article discusses the different concepts used in the methodological appreciation of photography. Since photography is one of the visual arts with which people most commonly interact daily, given that it can be found in books, magazines and other publications, the article discusses various methodologies for assessing the photographic image. It also addresses the classic themes of photography as well as some expressive elements.

  8. Methodological Guidelines for Reducing the Complexity of Data Warehouse Development for Transactional Blood Bank Systems.

    Science.gov (United States)

    Takecian, Pedro L; Oikawa, Marcio K; Braghetto, Kelly R; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S; Acker, Susan; Carneiro-Proietti, Anna B F; Sabino, Ester C; Custer, Brian; Busch, Michael P; Ferreira, João E

    2013-06-01

    Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development.

  9. Methodological Guidelines for Reducing the Complexity of Data Warehouse Development for Transactional Blood Bank Systems

    Science.gov (United States)

    Takecian, Pedro L.; Oikawa, Marcio K.; Braghetto, Kelly R.; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S.; Acker, Susan; Carneiro-Proietti, Anna B. F.; Sabino, Ester C.; Custer, Brian; Busch, Michael P.; Ferreira, João E.

    2013-01-01

    Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development. PMID:23729945

  10. Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process

    Science.gov (United States)

    Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.

    2015-08-01

    An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model, which was developed within the commercial finite element package ABAQUS™ (ABAQUS is a trademark of Dassault Systèmes); (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library, Scipy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to process productivity and product quality improvements.
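    The constrained-optimization setup described, an objective targeting a specific cooling rate plus inequality constraints enforcing directional solidification, can be sketched with Scipy's optimizer. The linear surrogate below stands in for the ABAQUS process model; the coefficients, target rate, and two-zone layout are all assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize

TARGET_RATE = 5.0  # target solidification cooling rate in K/s (assumed)

def cooling_rates(t):
    # Linear surrogate for the casting process model: predicted cooling
    # rate in two die zones as a function of the cooling-channel start
    # times t (s). A real run evaluates the finite element model here.
    return np.array([8.0 - 0.05 * t[0], 7.0 - 0.04 * t[1]])

def objective(t):
    # Target a specific cooling rate in every zone (least squares).
    return float(np.sum((cooling_rates(t) - TARGET_RATE) ** 2))

# Directional solidification constraint (illustrative): zone 0 must start
# cooling no later than zone 1, i.e. t[0] <= t[1].
constraints = [{"type": "ineq", "fun": lambda t: t[1] - t[0]}]
bounds = [(0.0, 120.0), (0.0, 120.0)]

result = minimize(objective, x0=[10.0, 20.0], bounds=bounds,
                  constraints=constraints)
print(np.round(result.x, 1))  # optimal start times under the constraint
```

    Because the unconstrained optimum violates the ordering constraint here, the solver lands on the constraint boundary (both zones start together), which is exactly the trade-off the real methodology negotiates between quality and directional solidification.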

  11. Development of a methodology for dose assessment viewing the use of NORM on building materials

    International Nuclear Information System (INIS)

    Souza, Antonio Fernando Costa de

    2009-01-01

    The objective of this study was to develop a methodology for estimating the radiological impact on man of the residues of naturally occurring radioactive materials (NORMs) that can potentially be used for the construction of homes and roads. Residues of this type, which are being produced in great quantities by the Brazilian mining industry, are typically deposited in inappropriate conditions such that they may have a long-term adverse impact on the environment, and hence on man. A mathematical model was developed to calculate the doses resulting from the use of NORM residues, thus allowing a preliminary analysis of the possibility of recycling the residues. The model was used to evaluate the external dose due to gamma radiation, the dose to skin caused by beta radiation, and the internal dose due to inhalation of radon and its decay products. The model was verified by comparisons with results of other studies of doses due to gamma and beta radiation from finite and infinite radioactive sources, with relatively good agreement. In order to validate the proposed methodology, a comparison was made against experimental results for a house constructed in accordance with CNEN regulations using building materials containing NORM residues. Comparisons were made of the dose due to gamma radiation and the radon concentration in the internal environment. Finally, the methodology was also used to estimate the dose caused by gamma radiation from a road constructed in the state of Rondonia, Brazil, which made use of another NORM residue. (author)

  12. Development of performance assessment methodology for establishment of quantitative acceptance criteria of near-surface radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Kim, C. R.; Lee, E. Y.; Park, J. W.; Chang, G. M.; Park, H. Y.; Yeom, Y. S. [Korea Hydro and Nuclear Power Co., Ltd., Seoul (Korea, Republic of)

    2002-03-15

    The contents and the scope of this study are as follows: review of the state of the art on the establishment of waste acceptance criteria in foreign near-surface radioactive waste disposal facilities; investigation of radiological assessment methodologies and scenarios; investigation of existing models and computer codes used in performance/safety assessment; development of a performance assessment methodology (draft) to derive quantitative radionuclide acceptance criteria for a domestic near-surface disposal facility; and a preliminary performance/safety assessment in accordance with the developed methodology.

  13. Geographic modelling of jaw fracture rates in Australia: a methodological model for healthcare planning.

    Science.gov (United States)

    Kruger, Estie; Heitz-Mayfield, Lisa J A; Perera, Irosha; Tennant, Marc

    2010-06-01

    While Australians are one of the healthiest populations in the world, inequalities in access to health care and health outcomes exist for Indigenous Australians and Australians living in rural or urban areas of the country. Hence, the purpose of this study was to develop an innovative methodological approach for predicting the incidence rates of jaw fractures and estimating the demand for oral health services within Australia. Population data were obtained from the Australian Bureau of Statistics and were divided across Australia by statistical local area and related to a validated remoteness index. Every episode of discharge from all hospitals in Western Australia for the financial years 1999/2000 to 2004/2005 indicating a jaw fracture as the principle oral condition, as classified by the International Classification of Disease (ICD-10AM), was the inclusion criterion for the study. Hospitalization data were obtained from the Western Australian Hospital Morbidity Data System. The model estimated almost 10 times higher jaw fracture rates for Indigenous populations than their non-Indigenous counterparts. Moreover, incidence of jaw fractures was higher among Indigenous people living in rural and remote areas compared with their urban and semi-urban counterparts. In contrast, in the non-Indigenous population, higher rates of jaw fractures were estimated for urban and semi-urban inhabitants compared with their rural and remote counterparts. This geographic modelling technique could be improved by methodological refinements and further research. It will be useful in developing strategies for health management and reducing the burden of jaw fractures and the cost of treatment within Australia. This model will also have direct implications for strategic planning for prevention and management policies in Australia aimed at reducing the inequalities gap both in terms of geography as well as Aboriginality.

  14. Risk assessment methodology applied to counter IED research & development portfolio prioritization

    Energy Technology Data Exchange (ETDEWEB)

    Shevitz, Daniel W [Los Alamos National Laboratory; O' Brien, David A [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Chavez, Gregory M [Los Alamos National Laboratory

    2009-01-01

    In an effort to protect the United States from the ever-increasing threat of domestic terrorism, the Department of Homeland Security, Science and Technology Directorate (DHS S&T), has significantly increased research activities to counter the terrorist use of explosives. Moreover, DHS S&T has established a robust Counter-Improvised Explosive Device (C-IED) Program to Deter, Predict, Detect, Defeat, and Mitigate this imminent threat to the Homeland. The DHS S&T portfolio is complicated and changing. In order to provide the "best answer" for the available resources, DHS S&T would like some "risk-based" process for making funding decisions. There is a definite need for a methodology to compare very different types of technologies on a common basis. A methodology was developed that allows users to evaluate a new "quad chart" and rank it against all other quad charts across S&T divisions. It couples a logic model with an evidential reasoning model using an Excel spreadsheet containing weights of the subjective merits of different technologies. The methodology produces an Excel spreadsheet containing the aggregate rankings of the different technologies. It uses Extensible Logic Modeling (ELM) for logic models combined with LANL software called INFTree for evidential reasoning.

  15. A coupled groundwater-flow-modelling and vulnerability-mapping methodology for karstic terrain management

    Science.gov (United States)

    Kavouri, Konstantina P.; Karatzas, George P.; Plagnes, Valérie

    2017-08-01

    A coupled groundwater-flow-modelling and vulnerability-mapping methodology for the management of karst aquifers with spatial variability is developed. The methodology takes into consideration the duality of flow and recharge in karst and introduces a simple method to integrate the effect of temporal storage in the unsaturated zone. In order to investigate the applicability of the developed methodology, simulation results are validated against available field measurement data. The criteria maps from the PaPRIKa vulnerability-mapping method are used to document the groundwater flow model. The FEFLOW model is employed for the simulation of the saturated zone of Palaikastro-Chochlakies karst aquifer, in the island of Crete, Greece, for the hydrological years 2010-2012. The simulated water table reproduces typical karst characteristics, such as steep slopes and preferred drain axes, and is in good agreement with field observations. Selected calculated error indicators—Nash-Sutcliffe efficiency (NSE), root mean squared error (RMSE) and model efficiency (E')—are within acceptable value ranges. Results indicate that different storage processes take place in different parts of the aquifer. The north-central part seems to be more sensitive to diffuse recharge, while the southern part is affected primarily by precipitation events. Sensitivity analysis is performed on the parameters of hydraulic conductivity and specific yield. The methodology is used to estimate the feasibility of artificial aquifer recharge (AAR) at the study area. Based on the developed methodology, guidelines were provided for the selection of the appropriate AAR scenario that has positive impact on the water table.
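    The error indicators cited above (NSE and RMSE) have standard definitions that are easy to compute; a minimal sketch follows, using an invented water-table series rather than the Palaikastro-Chochlakies data.

```python
import numpy as np

def rmse(observed, simulated):
    # Root mean squared error between observed and simulated series.
    return float(np.sqrt(np.mean((observed - simulated) ** 2)))

def nse(observed, simulated):
    # Nash-Sutcliffe efficiency: 1 is a perfect fit; values <= 0 mean
    # the model predicts no better than the mean of the observations.
    return float(1.0 - np.sum((observed - simulated) ** 2)
                 / np.sum((observed - np.mean(observed)) ** 2))

# Illustrative water-table levels (m above datum), not real field data.
obs = np.array([12.1, 11.8, 11.5, 11.9, 12.4, 12.0])
sim = np.array([12.0, 11.9, 11.6, 11.8, 12.3, 12.1])

print(round(rmse(obs, sim), 3))  # 0.1
print(round(nse(obs, sim), 3))   # 0.868
```

    In calibration workflows both indicators are typically reported together, since RMSE carries the units of the variable while NSE is dimensionless.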

  16. Development of a methodology for the detection of hospital financial outliers using information systems.

    Science.gov (United States)

    Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki

    2014-01-01

    Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, and so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate the subjective elements in detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers. Copyright © 2013 John Wiley & Sons, Ltd.
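    A distance-based outlier rule of the kind described can be sketched as follows: standardize each financial index, compute each case's distance from the centre of the case cloud in the resulting multi-dimensional space, and flag cases beyond a threshold. The indices, values, and threshold below are illustrative assumptions, not the paper's case model.

```python
import numpy as np

# Hypothetical financial indices per case (one row per case): columns
# might be profitability and a personnel-cost ratio. The numbers and the
# threshold are assumptions made for this sketch.
cases = np.array([
    [0.05, 0.55], [0.06, 0.53], [0.04, 0.56],
    [0.05, 0.54], [0.07, 0.52], [0.30, 0.20],  # last case differs markedly
])

# Standardize each index, then measure each case's Euclidean distance
# from the centre of the case cloud in the standardized space.
z = (cases - cases.mean(axis=0)) / cases.std(axis=0)
distance = np.linalg.norm(z, axis=1)

outliers = np.where(distance > 2.0)[0]
print(outliers)  # [5]
```

    Automating this removes the subjective element the abstract mentions: the threshold is fixed once, and every case is screened the same way.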

  17. METHODOLOGICAL BASES OF ECOLOGICAL CULTURE FORMATION OF PUPILS ON THE BASIS OF ECO-DEVELOPMENT IDEAS

    Directory of Open Access Journals (Sweden)

    Natalia F. Vinokurova

    2016-01-01

    Full Text Available Aim. The article describes the methodological bases of the formation of the ecological culture of students as the aim of innovative training for a sustainable future. The authors take into account international and Russian experience connected with the development of ecological culture as an educational resource for society's adaptation to environmental constraints, risks, crises and present-day consolidated actions towards the sustainable development of civilization. Methods. The methodological basis for constructing the model of the formation of pupils' ecological culture is developed from the standpoint of the ideas of eco-development (noosphere, co-evolution, sustainable development) and a set of axiological, cultural, personal-activity, co-evolutionary and cultural-ecological approaches. This methodological basis has allowed the authors to construct an educational model of the formation of the ecological culture of pupils, comprising an interconnected unity of target, substantive, procedural and appraisal components. Results and scientific novelty. The article presents the results of many years of research by the authors on environmental education for sustainable development in the framework of the Nizhny Novgorod scientific school. A characterization of the ecological culture of students as the goal of environmental education based on eco-development ideas is given. It is shown that the ecological culture of students directs them to new values and life meanings and to methods of ecologically oriented action and behavior, changing the attitudes of the consumer society and ensuring the development in the younger generation of co-evolutionary, spiritual guidance in a post-industrial society. The authors' model of the formation of the ecological culture of pupils is represented by the conjugation of philosophical-methodological, theoretical-methodological and pedagogical levels that ensure the integrity and hierarchy of pedagogical research on the issue. The article discloses a pedagogical assessment

  18. Modeling methodology for supply chain synthesis and disruption analysis

    Science.gov (United States)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically (2) the performance of synthesized supply chain system can be evaluated quantitatively (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of effects of a disruption.
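    The reachability analysis mentioned, verifying whether a specific state is still reachable after a disruption, can be sketched as a graph search over a state-transition model. The supply chain states and transitions below are invented for illustration; the paper's hierarchical models are richer than this sketch.

```python
from collections import deque

# Toy state-transition model of a supply chain: states are abstract
# system conditions, edges are possible transitions. A disruption removes
# transitions; reachability analysis then checks whether a desired state
# can still be reached. States here are illustrative, not from the paper.
transitions = {
    "order_placed": ["supplier_ok", "supplier_down"],
    "supplier_ok": ["in_transit"],
    "supplier_down": ["reroute"],
    "reroute": ["in_transit"],
    "in_transit": ["order_fulfilled"],
    "order_fulfilled": [],
}

def reachable(start, goal, graph):
    # Breadth-first search: is `goal` reachable from `start`?
    seen, queue = {start}, deque([start])
    while queue:
        state = queue.popleft()
        if state == goal:
            return True
        for nxt in graph.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reachable("order_placed", "order_fulfilled", transitions))  # True

# Simulate a disruption: the reroute path out of "supplier_down" is lost.
disrupted = dict(transitions, supplier_down=[])
print(reachable("supplier_down", "order_fulfilled", disrupted))  # False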

  19. Methodological development for selection of significant predictors explaining fatal road accidents.

    Science.gov (United States)

    Dadashova, Bahar; Arenas-Ramírez, Blanca; Mira-McWilliams, José; Aparicio-Izquierdo, Francisco

    2016-05-01

    Identification of the most relevant factors for explaining road accident occurrence is an important issue in road safety research, particularly for future decision-making processes in transport policy. However, model selection for this particular purpose is still ongoing research. In this paper we propose a methodological development for model selection which addresses both explanatory variable selection and adequate model selection. A variable selection procedure, the TIM (two-input model) method, is carried out by combining neural network design and statistical approaches. The error structure of the fitted model is assumed to follow an autoregressive process. All models are estimated using the Markov Chain Monte Carlo method, where the model parameters are assigned non-informative prior distributions. The final model is built using the results of the variable selection. For the application of the proposed methodology, the number of fatal accidents in Spain during 2000-2011 was used. This indicator experienced the largest reduction internationally during the indicated years, thus making it an interesting time series from a road safety policy perspective. Hence the identification of the variables that have affected this reduction is of particular interest for future decision making. The results of the variable selection process show that the selected variables are main subjects of road safety policy measures. Published by Elsevier Ltd.
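    The two-input-model idea, scoring small candidate models to select explanatory variables, can be illustrated with a simplified sketch that fits an ordinary least-squares model to every pair of predictors and keeps the best pair. The paper combines neural network design, autoregressive error structure and MCMC estimation; none of that machinery is reproduced here, and the data are synthetic.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(42)

# Synthetic data: 5 candidate predictors, of which only columns 1 and 3
# actually drive the target. Everything here is invented for illustration.
n = 200
X = rng.normal(size=(n, 5))
y = 3.0 * X[:, 1] - 2.0 * X[:, 3] + rng.normal(scale=0.5, size=n)

def r2(pair):
    # Coefficient of determination of an OLS fit on the chosen pair.
    A = np.column_stack([np.ones(n), X[:, list(pair)]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

scores = {pair: r2(pair) for pair in combinations(range(5), 2)}
best = max(scores, key=scores.get)
print(best)  # (1, 3): the truly informative predictors
```

    In the paper's setting the pairwise scores come from trained two-input neural networks rather than OLS, but the screening logic, ranking all variable pairs by predictive fit, is the same.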

  20. The Desired Image of the Future Economy of the Industrial Region: Development Trends and Evaluation Methodology

    Directory of Open Access Journals (Sweden)

    Olga Aleksandrovna Romanova

    2017-09-01

    Full Text Available In the article, the authors emphasize that industrial regions play an important role in increasing the technological independence of Russia. We show that the decline in the share of processing industries in the gross regional product cannot be treated as a negative de-industrialization of the economy. The article argues that the increasing speed of change, the instability of socio-economic systems and diverse risks predetermine the need to develop new methodological approaches to predictive research. Studies aimed at developing a technology for the design of the desired image of the future, and a methodology for its evaluation, are of high importance. For the initial stage of the research, the authors propose a methodological approach for assessing the desired image of the future of metallurgy as one of the most important industries of the region. We propose the term «technological image of the regional metallurgy». We show that repositioning the image of the regional metallurgical complex is quite a long process, which has determined the need to define the stages of repositioning. The proposed methodology for the evaluation of the desired future includes methodological provisions to quantify the characteristics of the goals achieved at the respective stages of the repositioning of the metallurgy. The methodological approach to the design of the desired image of the future implies the following stages: the identification of the priority areas of the technological development of regional metallurgy on the basis of bibliometric and patent analysis; the evaluation and forecasting of the dynamics of the structure of domestic consumption of metal products, based on comparative analysis and relevant analytical methods; the design of the factor model, allowing to identify the parameters quantifying the technological image of the regional metallurgy, based on the principal components method; systematization of

  1. Methodology, status and plans for development and assessment of Cathare code

    Energy Technology Data Exchange (ETDEWEB)

    Bestion, D.; Barre, F.; Faydide, B. [CEA - Grenoble (France)

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for the development and assessment of the code. Analytical experiments with separate effect tests and component tests are used for the development and validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future development of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the 3-D model.

  2. The status of development and practical use of probabilistic safety assessment methodology in CSSR

    International Nuclear Information System (INIS)

    Hrehor, M.

    1987-01-01

    The first part of the paper gives a brief summary of the current status of the nuclear energy programme and its regulatory background in CSSR emphasizing a leading role of the State Nuclear Safety Inspectorate of CsAEC in the development and practical use of probabilistic safety assessment methodology. In the second part a simple practical technique is presented enabling calculation of MTBF and MTTR in the cases which cannot be directly modelled by means of logical operators commonly used in fault tree models. (orig.)

  3. Discussion on experimental methodology for research and development of engineering barrier

    International Nuclear Information System (INIS)

    Yang Zhongtian

    2012-01-01

    According to the concept design for the repository for HLW disposal, near-field environmental conditions during the thermal period are gathered, especially the thermal and radiation conditions. Safety assessment requirements and current methodology problems are also analyzed. Referring to the experimental methodology developed by Sandia National Laboratories, a research methodology and experimental plan are put forward. (author)

  4. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    Science.gov (United States)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  5. Model and code development

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    Progress in model and code development for reactor physics calculations is summarized. The codes included CINDER-10, PHROG, RAFFLE GAPP, DCFMR, RELAP/4, PARET, and KENO. Kinetics models for the PBF were developed

  6. A methodology for least-squares local quasi-geoid modelling using a noisy satellite-only gravity field model

    Science.gov (United States)

    Klees, R.; Slobbe, D. C.; Farahani, H. H.

    2018-04-01

    The paper is about a methodology to combine a noisy satellite-only global gravity field model (GGM) with other noisy datasets to estimate a local quasi-geoid model using weighted least-squares techniques. In this way, we attempt to improve the quality of the estimated quasi-geoid model and to complement it with a full noise covariance matrix for quality control and further data processing. The methodology goes beyond the classical remove-compute-restore approach, which does not account for the noise in the satellite-only GGM. We suggest and analyse three different approaches of data combination. Two of them are based on a local single-scale spherical radial basis function (SRBF) model of the disturbing potential, and one is based on a two-scale SRBF model. Using numerical experiments, we show that a single-scale SRBF model does not fully exploit the information in the satellite-only GGM. We explain this by a lack of flexibility of a single-scale SRBF model to deal with datasets of significantly different bandwidths. The two-scale SRBF model performs well in this respect, provided that the model coefficients representing the two scales are estimated separately. The corresponding methodology is developed in this paper. Using the statistics of the least-squares residuals and the statistics of the errors in the estimated two-scale quasi-geoid model, we demonstrate that the developed methodology provides a two-scale quasi-geoid model, which exploits the information in all datasets.
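    The core estimation step described, a weighted least-squares combination of datasets with different noise levels that yields both an estimate and its full noise covariance matrix, can be sketched in a few lines. The two synthetic datasets below stand in for the satellite-only GGM and the local gravity data; the linear model is a placeholder, not an SRBF parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic datasets observing the same two coefficients x_true: a
# precise "satellite" set and a noisier "local" set. All values are
# illustrative assumptions.
x_true = np.array([2.0, -1.0])
A1 = rng.normal(size=(30, 2)); sigma1 = 0.1   # accurate dataset
A2 = rng.normal(size=(30, 2)); sigma2 = 0.5   # noisier dataset
y1 = A1 @ x_true + rng.normal(scale=sigma1, size=30)
y2 = A2 @ x_true + rng.normal(scale=sigma2, size=30)

# Weighted least squares: stack both systems, weight by inverse noise
# variance, solve the normal equations, and keep the full covariance of
# the estimate for quality control and further data processing.
A = np.vstack([A1, A2])
y = np.concatenate([y1, y2])
W = np.diag([1.0 / sigma1**2] * 30 + [1.0 / sigma2**2] * 30)

N = A.T @ W @ A
x_hat = np.linalg.solve(N, A.T @ W @ y)
cov_x = np.linalg.inv(N)  # full noise covariance matrix of x_hat

print(np.round(x_hat, 2))  # close to [ 2. -1.]
```

    In the paper's setting W would be built from the full (non-diagonal) noise covariance of the satellite-only GGM, which is precisely what the classical remove-compute-restore approach ignores.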

  7. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dorp, F. van [NAGRA (Switzerland)] [and others]

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all the steps in the recommended methodology. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example, because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The starting point for the methodology has three elements. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development. The International FEP List includes FEPs to do with the assessment context. The context examined in

  8. A methodology for ecosystem-scale modeling of selenium.

    Science.gov (United States)

    Presser, Theresa S; Luoma, Samuel N

    2010-10-01

    The main route of exposure for selenium (Se) is dietary, yet regulations lack biologically based protocols for evaluations of risk. We propose here an ecosystem-scale model that conceptualizes and quantifies the variables that determine how Se is processed from water through diet to predators. This approach uses biogeochemical and physiological factors from laboratory and field studies and considers loading, speciation, transformation to particulate material, bioavailability, bioaccumulation in invertebrates, and trophic transfer to predators. Validation of the model is through data sets from 29 historic and recent field case studies of Se-exposed sites. The model links Se concentrations across media (water, particulate, tissue of different food web species). It can be used to forecast toxicity under different management or regulatory proposals or as a methodology for translating a fish-tissue (or other predator tissue) Se concentration guideline to a dissolved Se concentration. The model illustrates some critical aspects of implementing a tissue criterion: 1) the choice of fish species determines the food web through which Se should be modeled, 2) the choice of food web is critical because the particulate material to prey kinetics of bioaccumulation differs widely among invertebrates, 3) the characterization of the type and phase of particulate material is important to quantifying Se exposure to prey through the base of the food web, and 4) the metric describing partitioning between particulate material and dissolved Se concentrations allows determination of a site-specific dissolved Se concentration that would be responsible for that fish body burden in the specific environment. The linked approach illustrates that environmentally safe dissolved Se concentrations will differ among ecosystems depending on the ecological pathways and biogeochemical conditions in that system. Uncertainties and model sensitivities can be directly illustrated by varying exposure
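The linked chain described above (dissolved Se to particulate material, to invertebrate prey, to predator tissue) can be sketched as a pair of forward and inverse functions. The partitioning factor Kd and the trophic transfer factors below are invented placeholders, not values from the 29 case studies.

```python
# Illustrative sketch of the ecosystem-scale linkage: dissolved Se ->
# particulate (partitioning factor Kd) -> invertebrate prey (TTF_inv) ->
# fish tissue (TTF_fish). All coefficient values are hypothetical.

def fish_tissue_se(c_dissolved_ugL, kd_Lkg, ttf_inv, ttf_fish):
    """Forward model: fish tissue Se (ug/g dw) from dissolved Se (ug/L)."""
    c_particulate = kd_Lkg * c_dissolved_ugL / 1000.0   # ug/g dry weight
    c_invert = ttf_inv * c_particulate                  # prey tissue
    return ttf_fish * c_invert                          # predator tissue

def allowed_dissolved_se(tissue_criterion_ugg, kd_Lkg, ttf_inv, ttf_fish):
    """Inverse: translate a fish-tissue criterion to a dissolved concentration."""
    return 1000.0 * tissue_criterion_ugg / (kd_Lkg * ttf_inv * ttf_fish)

# Two hypothetical food webs with different prey bioaccumulation kinetics,
# showing why the safe dissolved concentration is site-specific.
print(allowed_dissolved_se(8.0, kd_Lkg=1000.0, ttf_inv=6.0, ttf_fish=1.1))
print(allowed_dissolved_se(8.0, kd_Lkg=1000.0, ttf_inv=2.8, ttf_fish=1.1))
```

The contrast between the two calls mirrors points 2) and 4) of the abstract: the same tissue criterion maps to different dissolved concentrations depending on the food web and the particulate partitioning.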

  9. A methodology for ecosystem-scale modeling of selenium

    Science.gov (United States)

    Presser, T.S.; Luoma, S.N.

    2010-01-01

    The main route of exposure for selenium (Se) is dietary, yet regulations lack biologically based protocols for evaluations of risk. We propose here an ecosystem-scale model that conceptualizes and quantifies the variables that determine how Se is processed from water through diet to predators. This approach uses biogeochemical and physiological factors from laboratory and field studies and considers loading, speciation, transformation to particulate material, bioavailability, bioaccumulation in invertebrates, and trophic transfer to predators. Validation of the model is through data sets from 29 historic and recent field case studies of Se-exposed sites. The model links Se concentrations across media (water, particulate, tissue of different food web species). It can be used to forecast toxicity under different management or regulatory proposals or as a methodology for translating a fish-tissue (or other predator tissue) Se concentration guideline to a dissolved Se concentration. The model illustrates some critical aspects of implementing a tissue criterion: 1) the choice of fish species determines the food web through which Se should be modeled, 2) the choice of food web is critical because the particulate material to prey kinetics of bioaccumulation differs widely among invertebrates, 3) the characterization of the type and phase of particulate material is important to quantifying Se exposure to prey through the base of the food web, and 4) the metric describing partitioning between particulate material and dissolved Se concentrations allows determination of a site-specific dissolved Se concentration that would be responsible for that fish body burden in the specific environment. The linked approach illustrates that environmentally safe dissolved Se concentrations will differ among ecosystems depending on the ecological pathways and biogeochemical conditions in that system. Uncertainties and model sensitivities can be directly illustrated by varying exposure

  10. SR-Site groundwater flow modelling methodology, setup and results

    Energy Technology Data Exchange (ETDEWEB)

    Selroos, Jan-Olof (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Follin, Sven (SF GeoLogic AB, Taeby (Sweden))

    2010-12-15

    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed; the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in a SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report.

  11. Development of a simplified methodology for the isotopic determination of fuel spent in Light Water Reactors

    International Nuclear Information System (INIS)

    Hernandez N, H.; Francois L, J.L.

    2005-01-01

    This work presents a simplified methodology to quantify the isotopic content of spent fuel from light water reactors; its application is specific to the Laguna Verde nuclear power plant, using an 18-month equilibrium cycle. The methodology is divided in two parts: the first consists of developing a simplified cell model for the isotopic quantification of the irradiated fuel. With this model, a burnup of 48,000 MWd/TU of the fuel in the reactor core is simulated, taking as a basis a 10x10 fuel assembly and using a two-dimensional simulator for a light water reactor fuel cell (CPM-3). The second part of the methodology is based on an isotopic decay model, implemented as a C++ algorithm (decay), to evaluate the radionuclide inventory remaining, by decay, after the fuel has been irradiated and up to the time at which reprocessing takes place. Finally, the method used to quantify the kilograms of uranium and plutonium obtained from a normalized quantity (1000 kg) of fuel irradiated in the reactor is presented. These results will later allow analyses of the final disposition of the irradiated fuel. (Author)
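The post-irradiation decay step described in the abstract (there implemented as a C++ code) can be sketched with the two-member Bateman solution. The half-lives are public nuclear data; the discharge inventory and the 10-year cooling time below are made-up placeholders, not values from the Laguna Verde study.

```python
import math

# Sketch of the decay step: given an isotopic inventory at discharge, decay it
# to the reprocessing date. Half-lives in years; the Pu-241 -> Am-241 pair is
# a convenient example because both nuclides have mass number 241, so grams
# can be propagated like atom counts.
HALF_LIFE_Y = {"Pu241": 14.3, "Am241": 432.6}

def decay_const(nuclide):
    return math.log(2.0) / HALF_LIFE_Y[nuclide]

def decay_chain(n_parent0, n_daughter0, parent, daughter, t_years):
    """Two-member Bateman solution for parent -> daughter over t_years."""
    lp, ld = decay_const(parent), decay_const(daughter)
    n_parent = n_parent0 * math.exp(-lp * t_years)
    n_daughter = (n_daughter0 * math.exp(-ld * t_years)
                  + n_parent0 * lp / (ld - lp)
                  * (math.exp(-lp * t_years) - math.exp(-ld * t_years)))
    return n_parent, n_daughter

# Hypothetical inventory (g per tonne of fuel) decayed 10 years to reprocessing.
pu, am = decay_chain(1000.0, 50.0, "Pu241", "Am241", 10.0)
print(round(pu, 1), round(am, 1))
```

A full spent-fuel code tracks many coupled chains; the single pair above only shows the kernel that such a code evaluates per chain.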

  12. New temperature model of the Netherlands from new data and novel modelling methodology

    Science.gov (United States)

    Bonté, Damien; Struijk, Maartje; Békési, Eszter; Cloetingh, Sierd; van Wees, Jan-Diederik

    2017-04-01

    Deep geothermal energy has grown in interest in Western Europe in the last decades, for direct use but also, as knowledge of the subsurface improves, for electricity generation. In the Netherlands, where the sector took off with the first system in 2005, geothermal energy is seen as a key player for a sustainable future. Knowledge of the subsurface temperature, together with the available flow from the reservoir, is an important factor that can determine the success of a geothermal energy project. To support the development of deep geothermal energy systems in the Netherlands, we made a first assessment of the subsurface temperature based on thermal data but also on geological elements (Bonté et al., 2012). An outcome of this work was ThermoGIS, which uses the temperature model. This work is a revision of the model used in ThermoGIS. The improvements over the first model are multiple: we have improved not only the dataset used for the calibration and the structural model, but also the methodology, through improved software (called b3t). The temperature dataset has been updated by integrating temperatures from newly accessible wells. The sedimentary description of the basin has been improved by using an updated and refined structural model and an improved lithological definition. A major improvement comes from the methodology used to perform the modelling: with b3t the calibration is made not only using the lithospheric parameters but also using the thermal conductivity of the sediments. The result is a much more accurate definition of the parameters for the model and a perfected handling of the calibration process. The outcome is a precise and improved temperature model of the Netherlands. The thermal conductivity variation in the sediments, associated with the geometry of the layers, is an important driver of temperature variations, and the influence of the Zechstein salt in the north of the country is important. In addition, the radiogenic heat

  13. Methodological Development of the Probabilistic Model of the Safety Assessment of Hontomin P.D.T.; Desarrollo Metodologico del Modelo Probabilista de Evaluacion de Seguridad de la P.D.T. de Hontomin

    Energy Technology Data Exchange (ETDEWEB)

    Hurtado, A.; Eguilior, S.; Recreo, F.

    2011-06-07

    In the framework of CO{sub 2} Capture and Geological Storage, risk analysis plays an important role, because it is an essential element of knowledge for defining and planning carbon injection strategies at the local, national and supranational levels. Every project carries a risk of failure. Even from the early stages, the possible causes of this risk should be taken into account and corrective measures proposed along the process, i.e., the risk should be managed. Proper risk management reduces the negative consequences arising from the project. Risk is reduced or neutralized mainly through its identification, measurement and evaluation, together with the development of decision rules. This report presents the methodology developed for risk analysis and the results of its application. The risk assessment requires determining the random variables that influence the behaviour of the system. It is very difficult to set up the probability distribution of a random variable in the classical sense (objective probability) when a particular event has rarely occurred or has an incomplete development history. In this situation, a subjective probability has to be determined, especially in the early stages of a project, when not enough information about the system is available. This subjective probability is constructed from expert judgement, estimating the possibility that certain random events could happen depending on the geological features of the area of application. The proposed methodology is based on the application of Bayesian probabilistic networks for estimating the probability of leakage risk. These probabilistic networks can graphically define the dependence relations between the variables, and the joint probability function, through a local factorization of probability functions. (Author) 98 refs.
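The local factorization that Bayesian networks provide can be shown on a toy three-node chain. The structure (seismic event, barrier failure, leak) and every probability below are invented placeholders standing in for the elicited expert judgement the report describes, not values from the Hontomin assessment.

```python
# Toy Bayesian-network sketch: the joint probability factorizes locally,
# P(seismic, barrier, leak) = P(seismic) P(barrier | seismic) P(leak | barrier),
# and the marginal probability of leakage is obtained by summing out parents.
p_seismic = {True: 0.05, False: 0.95}

# P(barrier_failure | seismic_event); all values are hypothetical
p_barrier = {True: {True: 0.30, False: 0.70},
             False: {True: 0.02, False: 0.98}}

# P(leak | barrier_failure)
p_leak = {True: 0.60, False: 0.001}

p_leak_marginal = sum(
    p_seismic[s] * p_barrier[s][b] * p_leak[b]
    for s in (True, False)
    for b in (True, False)
)
print(round(p_leak_marginal, 5))
```

The same sum-out operation scales to larger networks, where the factorization is what keeps the joint distribution tractable.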

  14. Screening methodologies for the development of spray-dried amorphous solid dispersions.

    Science.gov (United States)

    Duarte, Íris; Santos, José Luís; Pinto, João F; Temtem, Márcio

    2015-01-01

    To present a new screening methodology intended to be used in the early development of spray-dried amorphous solid dispersions. A model that combines thermodynamic, kinetic and manufacturing considerations was implemented to obtain estimates of the miscibility and phase behavior of different itraconazole-based solid dispersions. Additionally, a small-scale solvent casting protocol was developed to enable a fast assessment of the amorphous stability of the different drug-polymer systems. Then, solid dispersions at predefined drug loads were produced in a lab-scale spray dryer for powder characterization and comparison of the results generated by the model and solvent cast samples. The results obtained with the model enabled the ranking of the polymers from a miscibility standpoint. Such ranking was consistent with the experimental data obtained by solvent casting and spray drying. Moreover, the range of optimal drug load determined by the model was likewise consistent with the experimental results. The screening methodology presented in this work showed that a set of amorphous formulation candidates can be assessed in a computer model, enabling not only the determination of the most suitable polymers, but also of the optimal drug load range to be tested in laboratory experiments. The set of formulation candidates can then be further fine-tuned with solvent casting experiments using a small amount of API, which will then inform the decision on the final candidate formulations to be assessed in spray drying experiments.

  15. Methodological NMR imaging developments to measure cerebral perfusion

    International Nuclear Information System (INIS)

    Pannetier, N.

    2010-12-01

    This work focuses on acquisition techniques and physiological models that allow characterization of cerebral perfusion by MRI. The arterial input function (AIF), on which many models are based, is measured by a technique of optical imaging at the carotid artery in rats. The reproducibility and repeatability of the AIF are discussed and a model function is proposed. Then we compare two techniques for measuring the vessel size index (VSI) in rats bearing a glioma. The reference technique, using a USPIO contrast agent (CA), is compared with the dynamic approach, which estimates this parameter during the passage of a bolus of Gd. The latter technique has the advantage of being usable clinically. The results obtained at 4.7 T by both approaches are similar, and use of VSI in clinical protocols at high field is strongly encouraged. The mechanisms involved (R1 and R2* relaxivities) were then studied using a multi-gradient-echo approach. A multi-echo spiral sequence is developed and a method that allows refocusing between each echo is presented. This sequence is used to characterize the impact of R1 effects during the passage of two successive injections of Gd. Finally, we developed a tool for simulating the NMR signal on a 2D geometry, taking into account the permeability of the BBB and the CA diffusion in the interstitial space. At short TE, the effect of diffusion on the signal is negligible. In contrast, the effects of diffusion and permeability may be separated at long echo time. Finally, we show that during the extravasation of the CA, the local magnetic field homogenization due to the decrease of the magnetic susceptibility difference at vascular interfaces is quickly balanced by the perturbations induced by the increase of the magnetic susceptibility difference at the cellular interfaces in the extravascular compartment. (author)

  16. Development and testing of methodology for evaluating the performance of multi-input/multi-output digital control systems

    Science.gov (United States)

    Polotzky, Anthony S.; Wieseman, Carol; Hoadley, Sherwood Tiffany; Mukhopadhyay, Vivek

    1990-01-01

    The development of a controller performance evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems is described. The equations used to obtain the open-loop plant and controller transfer matrices, and the return-difference matrices, are given. Results of applying the CPE methodology to evaluate MIMO digital flutter suppression systems being tested on an active flexible wing wind-tunnel model are presented to demonstrate the CPE capability.
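One of the CPE quantities named above, the return-difference matrix, can be sketched at a single frequency point. The plant and controller matrices below are arbitrary illustrative complex values, not data from the active flexible wing model; the minimum-singular-value reading is the standard MIMO margin indicator, used here only to show the computation.

```python
import numpy as np

# Sketch: given the open-loop plant transfer matrix G(jw) and controller
# K(jw) evaluated at one frequency, form the return-difference matrix
# F = I + G K and inspect its minimum singular value; small values flag
# poor MIMO stability margins at that frequency.
def return_difference(G, K):
    n = G.shape[0]
    return np.eye(n) + G @ K

G = np.array([[0.8 - 0.3j, 0.1 + 0.0j],
              [0.0 + 0.2j, 0.5 - 0.1j]])   # illustrative plant values
K = np.array([[1.2 + 0.0j, 0.0 + 0.0j],
              [0.0 + 0.0j, 0.9 + 0.0j]])   # illustrative controller values

F = return_difference(G, K)
sigma_min = float(np.linalg.svd(F, compute_uv=False).min())
print(round(sigma_min, 3))
```

In an evaluation like the one described, this computation would be repeated over a frequency grid to trace the margin across the band of interest.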

  17. Methodologies, models and parameters for environmental, impact assessment of hazardous and radioactive contaminants

    International Nuclear Information System (INIS)

    Aguero, A.; Cancio, D.; Garcia-Olivares, A.; Romero, L.; Pinedo, P.; Robles, B.; Rodriguez, J.; Simon, I.; Suanez, A.

    2003-01-01

    A methodology has been developed to assess the environmental impact arising from contaminants present in hazardous and radioactive wastes. Taking into account background information on legislation, waste categories, contaminant inventories, and disposal, recycling and waste treatment options, an Environmental Impact Assessment Methodology (MEIA) is proposed. It is applicable to (i) several types of solid waste (hazardous, radioactive and mixed wastes); (ii) several management options (recycling, and temporary and final storage in shallow and deep disposal); and (iii) several levels of data availability. The conceptual and mathematical models and software tools needed for the application of the MEIA have been developed. Bearing in mind that this is a complex process, both the models and the tools have to be developed iteratively, refining the models so that they better correspond to the system described. The selection of suitable parameters for the models is based on information derived from field and laboratory measurements and experiments, followed by a data elicitation protocol. An application is shown for a hypothetical shallow radioactive waste disposal facility (test case), with all the steps of the MEIA applied sequentially. In addition, the methodology is applied to an actual waste management case for hazardous wastes from the coal fuel cycle, demonstrating several possibilities for applying the MEIA from a practical perspective. The experience gained in this work shows that using the MEIA to assess management options for hazardous and radioactive wastes brings important advantages, simplifying the execution of the assessment, its traceability, and the dissemination of assessment results to other interested parties. (Author)

  18. Development of a methodology for the safety assessment of near surface disposal facilities for radioactive waste

    International Nuclear Information System (INIS)

    Simon, I.; Cancio, D.; Alonso, L.F.; Agueero, A.; Lopez de la Higuera, J.; Gil, E.; Garcia, E.

    2000-01-01

    The Project on the Environmental Radiological Impact at CIEMAT is developing, for the Spanish regulatory body Consejo de Seguridad Nuclear (CSN), a methodology for the safety assessment of near surface disposal facilities. The method incorporates elements developed through participation in the IAEA's ISAM Programme (Improving Long Term Safety Assessment Methodologies for Near Surface Radioactive Waste Disposal Facilities). The first step of the approach is the consideration of the assessment context, including the purpose of the assessment, the end-points, the philosophy, the disposal system, the source term and the temporal scales, as well as the hypotheses about the critical group. Once the context has been established, and considering the peculiarities of the system, a specific list of features, events and processes (FEPs) is produced. These are incorporated into the assessment scenarios. The set of scenarios is represented in the conceptual and mathematical models. Using mathematical codes, calculations are performed to obtain results (e.g. in terms of doses) to be analysed and compared against the criteria. The methodology is being tested by application to a hypothetical engineered disposal system based on an exercise within the ISAM Programme, and will finally be applied to the Spanish case. (author)

  19. Assessment of ALWR passive safety system reliability. Phase 1: Methodology development and component failure quantification

    International Nuclear Information System (INIS)

    Hake, T.M.; Heger, A.S.

    1995-04-01

    Many advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive systems to perform safety functions, rather than active systems as in current reactor designs. These passive systems depend to a great extent on physical processes such as natural circulation for their driving force, and not on active components, such as pumps. An NRC-sponsored study was begun at Sandia National Laboratories to develop and implement a methodology for evaluating ALWR passive system reliability in the context of probabilistic risk assessment (PRA). This report documents the first of three phases of this study, including methodology development, system-level qualitative analysis, and sequence-level component failure quantification. The methodology developed addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. Traditional PRA methods, such as fault and event tree modeling, are applied to the component failure aspect. Thermal-hydraulic calculations are incorporated into a formal expert judgment process to address uncertainties in selected natural processes and success criteria. The first phase of the program has emphasized the component failure element of passive system reliability, rather than the natural process uncertainties. Although cursory evaluation of the natural processes has been performed as part of Phase 1, detailed assessment of these processes will take place during Phases 2 and 3 of the program
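The component-failure side of the methodology, handled with traditional fault-tree modeling as the abstract notes, can be sketched on a toy passive-system tree. The structure (two redundant valve trains behind a shared check valve) and the failure probabilities below are invented placeholders, not values from the ALWR study.

```python
# Minimal fault-tree sketch for a passive system: the top event (system
# failure) occurs if both redundant valve trains fail (AND gate) OR a shared
# check valve fails (OR gate). Probabilities are hypothetical per-demand
# values, and the trains are assumed independent.
P_VALVE_TRAIN = 1.0e-3   # per-demand failure, each redundant train
P_CHECK_VALVE = 5.0e-5   # per-demand failure, shared component

p_and = P_VALVE_TRAIN * P_VALVE_TRAIN            # both trains fail
p_top = 1.0 - (1.0 - p_and) * (1.0 - P_CHECK_VALVE)   # OR with shared valve
print(f"{p_top:.3e}")
```

As the abstract emphasizes, this component-level arithmetic is only half the problem for passive systems; the uncertainty in whether natural circulation itself delivers the required function is treated separately, via thermal-hydraulic calculations and expert judgment.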

  20. Methodology for assessing electric vehicle charging infrastructure business models

    International Nuclear Information System (INIS)

    Madina, Carlos; Zamora, Inmaculada; Zabala, Eduardo

    2016-01-01

    The analysis of economic implications of innovative business models in networked environments, as electro-mobility is, requires a global approach to ensure that all the involved actors obtain a benefit. Although electric vehicles (EVs) provide benefits for the society as a whole, there are a number of hurdles for their widespread adoption, mainly the high investment cost for the EV and for the infrastructure. Therefore, a sound business model must be built up for charging service operators, which allows them to recover their costs while, at the same time, offer EV users a charging price which makes electro-mobility comparable to internal combustion engine vehicles. For that purpose, three scenarios are defined, which present different EV charging alternatives, in terms of charging power and charging station ownership and accessibility. A case study is presented for each scenario and the required charging station usage to have a profitable business model is calculated. We demonstrate that private home charging is likely to be the preferred option for EV users who can charge at home, as it offers a lower total cost of ownership under certain conditions, even today. On the contrary, finding a profitable business case for fast charging requires more intensive infrastructure usage. - Highlights: • Ecosystem is a network of actors who collaborate to create a positive business case. • Electro-mobility (electricity-powered road vehicles and ICT) is a complex ecosystem. • Methodological analysis to ensure that all actors benefit from electro-mobility. • Economic analysis of charging infrastructure deployment linked to its usage. • Comparison of EV ownership cost vs. ICE for vehicle users.
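The break-even reasoning in the abstract (charging-station usage required for a profitable business case) can be put in a back-of-the-envelope sketch. Every figure below is a hypothetical placeholder, not a number from the case studies.

```python
# Sketch: how many charging sessions per day must a public fast-charging
# station serve so that revenue covers annualized investment plus operating
# costs? All inputs are assumed illustrative values.
def breakeven_sessions_per_day(capex, lifetime_y, opex_per_y,
                               kwh_per_session, margin_per_kwh):
    """Sessions/day at which yearly margin equals annualized cost."""
    annual_cost = capex / lifetime_y + opex_per_y
    margin_per_session = kwh_per_session * margin_per_kwh
    return annual_cost / (365.0 * margin_per_session)

sessions = breakeven_sessions_per_day(
    capex=40000.0,        # fast-charger investment, EUR (assumed)
    lifetime_y=10.0,
    opex_per_y=2000.0,
    kwh_per_session=20.0,
    margin_per_kwh=0.15,  # charging price minus electricity cost, EUR/kWh
)
print(round(sessions, 1))
```

Lowering the per-kWh margin to keep electro-mobility competitive with internal combustion vehicles pushes the required utilization up, which is the tension the paper's scenarios explore.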

  1. Methodology for evaluating transportation-induced regional development

    OpenAIRE

    Ahn, Seung B.

    1996-01-01

    There has long been a recognition that efficient transport plays a key role in supporting a dynamic economy and a high quality of life. However, traffic increases along with population and income, and traffic congestion and accidents are negative results of this increase, as is environmental damage. There has been a need for a methodology to evaluate the user benefits, nonuser benefits and environmental impacts of transportation investments and policies through rational, objective scien...

  2. Theater for Development Methodology in Childhood Cataract Case Finding

    OpenAIRE

    Roseline Ekanem Duke

    2016-01-01

    The key informant methodology for case finding of childhood cataract was utilized in a rural population in Nigeria to identify children who would benefit from surgical intervention for cataract, and to restore vision in such children. It was however noticed that some parents of children with cataract did not bring their children to the primary health center for examination and recommendation. The purpose of this study is to investigate the benefits of using the theatre for developmen...

  3. CALS and the Product State Model - Methodology and Supporting Schools and Paradigms

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1998-01-01

    This paper addresses the preliminary considerations of a research project, initiated February 1997, regarding Continuous Acquisition and Life-cycle Support (CALS), which is part of the activities of CALS Center Denmark. The CALS concept is presented with a focus on the Product State Model (PSM). The PSM incorporates relevant information about each stage of the production process. The paper describes the research object and the model object, and discusses part of the methodology for developing a Product State Model. The project is primarily technological; however, organisational and human aspects

  4. Methodology to develop a training program as a tool for energy management

    Directory of Open Access Journals (Sweden)

    Mónica Rosario Berenguer-Ungaro

    2017-12-01

    Full Text Available The paper aims to present a methodology for developing a training program to improve the labor skills that enhance the efficient use of energy resources. The goal is to make training timely, to meet training needs as they arise, and to put the person receiving the training at the center of the process. It is based on the action-training and action-research methods, and on Kirkpatrick's model for evaluating training, which evaluates four levels: reaction, learning, behavior and results. The methodology is structured in three stages: (1) diagnosis of knowledge, (2) intervention based on the results, and (3) evaluation and feedback for continuous improvement. For each stage, objectives and implementation tools are identified. Evaluation is transversal to the entire program, and it is through evaluation that feedback decisions are taken.

  5. Contribution to developing the environment radiation protection methodology

    Energy Technology Data Exchange (ETDEWEB)

    Oudalova, A. [Institute of Atomic Power Engineering NRNU MEPhI (Russian Federation)]; Alexakhin, R.; Dubynina, M. [Russian Institute of Agricultural Radiology and Agroecology (Russian Federation)]

    2014-07-01

    The sustainable development of the environment and the protection of biota, including radiation protection of the environment, are issues of current interest in society. Work is ongoing on the development of a system of radiation protection for non-human biota. Anthropocentric and eco-centric principles are widely discussed. ICRP Publications 103, 108, 114 and many other reports and articles refer to the topic of environmental protection, the set of reference animals and plants, the corresponding transfer parameters, dose models and derived consideration reference levels. There is still an open field for discussion of methods and approaches to arrive at a well-established procedure for assessing the environmental risks of radiation impacts on different organisms, populations and ecosystems. A huge amount of work has been done by the ICRP and other organizations and research groups to develop and systematize approaches to this difficult subject. This activity, however, is not everywhere well known and accepted, and more effort is needed to bring the ideas of an eco-centric strategy in environmental radiation protection not only to the public but to specialists in many countries as well. One of the main points of interest is the assessment of critical doses and dose rates for flora and fauna species. Some aspects of a possible procedure to estimate them are studied in this work, including criteria for datasets of good quality, models of dose dependence, the sensitivity of different umbrella endpoints, and methods for treating the original massive datasets. Estimates are based on information gathered in a database on radiation-induced effects in plants. Data on biological effects in plants (umbrella endpoints of reproductive potential, survival, morbidity, and morphological, biochemical and genetic effects) as a function of dose and dose rate of ionizing radiation have been collected from reviewed publications and maintained in MS Access format. The database now contains about 7000 datasets and 25000 records

  6. Developments in the Tools and Methodologies of Synthetic Biology

    Science.gov (United States)

    Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul

    2014-01-01

    Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788

  7. Developments in the tools and methodologies of synthetic biology

    Directory of Open Access Journals (Sweden)

    Richard eKelwick

    2014-11-01

    Full Text Available Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices or systems. However, biological systems are generally complex and unpredictable and are therefore intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a ‘body of knowledge’ from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled and its functionality tested. At each stage of the design cycle an expanding repertoire of tools is being developed. In this review we highlight several of these tools in terms of their applications and benefits to the synthetic biology community.

  8. A Comparison of Various Software Development Methodologies: Feasibility and Methods of Integration

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2016-12-01

    Full Text Available System development methodologies which have been used in academic and commercial environments during the last two decades have advantages and disadvantages. Researchers have tried to identify the objectives, scope, etc. of the methodologies by following different approaches. Each approach has its limitations, specific interests, coverage, etc. In this paper, we perform a comparative study of those methodologies which are popular and commonly used in banking and commercial environments. In our study we try to determine the objectives, scope, tools and other features of the methodologies. We also try to determine how, and to what extent, the methodologies incorporate facilities such as project management, cost-benefit analysis, documentation, etc. One of the most important aspects of our study was how to integrate the methodologies and develop a global methodology which covers the complete span of the software development life cycle. A prototype system which integrates the selected methodologies has been developed. The developed system helps analysts and designers choose suitable tools or obtain guidelines on what to do in a particular situation. The prototype system was tested during the development of software for an ATM (Automated Teller Machine) by selecting and applying the SASD methodology during software development. This resulted in the development of a high-quality and well-documented software system.

  9. A methodology model for quality management in a general hospital.

    Science.gov (United States)

    Stern, Z; Naveh, E

    1997-01-01

    A reappraisal is made of the relevance of industrial modes of quality management to the issues of medical care. Analysis of the nature of medical care, which differentiates it from the supplier-client relationships of industry, presents the main intrinsic characteristics, which create problems in application of the industrial quality management approaches to medical care. Several examples are the complexity of the relationship between the medical action and the result obtained, the client's nonacceptance of economic profitability as a value in his medical care, and customer satisfaction biased by variable standards of knowledge. The real problems unique to hospitals are addressed, and a methodology model for their quality management is offered. Included is a sample of indicator vectors, measurements of quality care, cost of medical care, quality of service, and human resources. These are based on the trilogy of planning quality, quality control, and improving quality. The conclusions confirm the inadequacy of industrial quality management approaches for medical institutions and recommend investment in formulation of appropriate concepts.

  10. APPLICATION OF METHODOLOGY OF STRATEGIC PLANNING IN DEVELOPING NATIONAL PROGRAMMES ON DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Inna NOVAK

    2015-07-01

    Full Text Available Actuality: The main purpose of strategic planning is that the long-term interests of sustainable development of a market economy require the use of effective measures of state regulation of economic and social processes. Objective: The aim of the article is to analyze the development of strategic planning methodology and the practical experience of its application in the design of national development programs. Methods: The following research methods were used in writing the article: analysis and synthesis, target-oriented and monographic. Results: In Ukraine, at the level of state and local government authorities, strategies for the development of branches, regions, cities, etc. are being developed, but given the lack of state funding, a unified investment strategy for the country has not been developed. After analyzing the development of strategic planning methodology and examples of its application in the design of state development programs, we identified the need to develop an investment strategy of the state (sectors, regions, etc.), as its defined directions and guidelines of activity will increase the level of investment in the country and support the national strategy “Ukraine-2020”.

  11. THEORETIC AND METHODOLOGIC BASICS OF DEVELOPMENT OF THE NATIONAL LOGISTICS SYSTEM IN THE REPUBLIC OF BELARUS

    Directory of Open Access Journals (Sweden)

    R. B. Ivut

    2016-01-01

    Full Text Available The article presents the results of a study whose aim is the formation of theoretical and methodological foundations, as part of its scientific underpinning, for the further development of the national logistics system in the Republic of Belarus. The relevance of the study relates to the fact that, at present, the introduction of the logistics concept and the formation of the optimal infrastructure for its implementation are key factors for the economic development of Belarus as a transit country. At the same time, the pace of development of logistics activities in the country is currently somewhat lower than in neighboring countries, as evidenced by the dynamics of the country’s position in international rankings (in particular, the LPI index). Overcoming these gaps requires improved competitiveness of the logistics infrastructure in the international market. This, in turn, is possible through the clear formulation of, and adherence to, principles for the effective functioning of the macro-logistics system of Belarus, as well as by increasing the quality of logistics design by means of the econometric models and methods presented in the article. The authors' proposed approach differentiates the general principles of logistics, common to logistics systems of all levels, from the specific principles of development of the macro-level logistics system related to improving its transit attractiveness for international freight carriers. The study also systematizes models for determining the optimal location of logistics facilities. Particular attention is paid to the methodological basis of the analysis of the functioning of transport terminals as part of logistics centers, both at the design and operation stages. The developed theoretical and methodological recommendations are universal and can be used in the design of logistics infrastructure for various purposes and functions.

  12. Advances in Artificial Neural Networks – Methodological Development and Application

    Directory of Open Access Journals (Sweden)

    Yanbo Huang

    2009-08-01

    Full Text Available Artificial neural networks, as a major soft-computing technology, have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred the development of training algorithms for other networks such as radial basis function, recurrent, feedback, and unsupervised Kohonen self-organizing networks. These networks, especially the multilayer perceptron with a backpropagation training algorithm, have gained recognition in research and applications in various scientific and engineering areas. In order to accelerate the training process and overcome data over-fitting, research has been conducted to improve the backpropagation algorithm. Further, artificial neural networks have been integrated with other advanced methods such as fuzzy logic and wavelet analysis to enhance data interpretation and modeling and to avoid subjectivity in the operation of the training algorithm. In recent years, support vector machines have emerged as a set of high-performance supervised generalized linear classifiers in parallel with artificial neural networks. This paper reviews the development history of artificial neural networks and describes their standard architectures and algorithms. Advanced artificial neural networks are introduced together with support vector machines, and the limitations of ANNs are identified. The future of artificial neural network development in tandem with support vector machines is discussed in conjunction with further applications to food science and engineering, soil and water relationships for crop management, and decision support for precision agriculture. Along with the network structures and training algorithms, the applications of artificial neural networks are reviewed as well, especially in the fields of agricultural and biological
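    The core mechanism the review describes, backpropagation training of a multilayer perceptron, can be sketched in a few lines of NumPy. The architecture, learning rate and toy XOR task below are illustrative choices, not any particular system from the review.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: learn y = x1 XOR x2, a classic non-linearly-separable task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 sigmoid units; weights drawn small and random.
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
lr = 0.5
for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backpropagation of the mean-squared-error gradient.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(0)
    W1 -= lr * X.T @ d_h / len(X);  b1 -= lr * d_h.mean(0)

print(losses[0], losses[-1])  # training error drops over the epochs
```

    The over-fitting and slow-convergence issues mentioned in the abstract are exactly what refinements such as momentum, adaptive learning rates and early stopping address.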

  13. Development of a flow structure interaction methodology applicable to a convertible car roof

    CERN Document Server

    Knight, J J

    2003-01-01

    The current research investigates the flow-induced deformation of a convertible roof of a vehicle using experimental and numerical methods. A computational methodology is developed that entails the coupling of a commercial Computational Fluid Dynamics (CFD) code with an in-house structural code. A model two-dimensional problem is first studied. The CFD code and a Source Panel Method (SPM) code are used to predict the pressure acting on the surface of a rigid roof of a scale model. Good agreement is found between predicted pressure distribution and that obtained in a parallel wind-tunnel experimental programme. The validated computational modelling of the fluid flow is then used in a coupling strategy with a line-element structural model that incorporates initial slackness of the flexible roof material. The computed flow-structure interaction yields stable solutions, the aerodynamically loaded flexible roof settling into static equilibrium. The effects of slackness and material properties on deformation and co...
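    The coupling strategy described above, alternating a fluid solve and a structural solve until the deformation settles into static equilibrium, can be sketched as a fixed-point iteration. The scalar "models" below (a linear load-relief law and a linear stiffness) are invented stand-ins for the CFD/SPM and line-element solvers in the paper, with hypothetical parameter values.

```python
# Minimal partitioned fluid-structure-interaction loop: alternate an
# aerodynamic solve and a structural solve, with under-relaxation,
# until the roof deflection stops changing.

Q = 500.0    # freestream suction load, Pa (hypothetical)
A = 0.02     # load relief per unit deflection, 1/m (hypothetical)
C = 1e-3     # structural compliance, m/Pa (hypothetical)
RELAX = 0.5  # under-relaxation factor for stability

def pressure(w):
    """'Fluid solve': aerodynamic load decreases as the roof deflects."""
    return Q * (1.0 - A * w)

def deflection(p):
    """'Structural solve': linear-elastic response to the applied load."""
    return C * p

w = 0.0
for it in range(200):
    w_new = deflection(pressure(w))
    if abs(w_new - w) < 1e-10:
        break                                 # static equilibrium reached
    w = (1 - RELAX) * w + RELAX * w_new       # under-relaxed update

w_exact = C * Q / (1.0 + A * C * Q)  # analytic fixed point of this toy model
print(w, w_exact)
```

    In the real problem each "solve" is an expensive field computation and the under-relaxation factor matters for stability, but the convergence structure of the coupling is the same.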

  14. Theoretical framework and methodological development of common subjective health outcome measures in osteoarthritis: a critical review

    Directory of Open Access Journals (Sweden)

    Johnston Marie

    2007-03-01

    Full Text Available Abstract Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of (i) its theoretical framework (was there a definition of what was being assessed, and was it part of a theoretical model?) and (ii) its methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest), and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of a theoretical framework for both clinician report and patient self-report measures. It also drew attention to the wide variation in the methodological development of commonly used measures in OA.
While, in general, the patient self-report measures had good methodological

  15. Development and Application of Urban Landslide Vulnerability Assessment Methodology Reflecting Social and Economic Variables

    Directory of Open Access Journals (Sweden)

    Yoonkyung Park

    2016-01-01

    Full Text Available An urban landslide vulnerability assessment methodology is proposed with a major focus on urban social and economic aspects. The proposed methodology was developed based on the landslide susceptibility maps that the Korean Forest Service uses to identify landslide source areas. First, debris flows are propagated to urban areas from such source areas by Flow-R (flow path assessment of gravitational hazards at a regional scale), and then urban vulnerability is assessed in two categories: physical and socioeconomic. The physical vulnerability is related to buildings that can be impacted by a landslide event. This study considered two common building structure types, reinforced-concrete frame and non-reinforced-concrete frame, to assess the physical vulnerability. The socioeconomic vulnerability is considered a function of the resistance levels of the vulnerable people, the trigger factors of secondary damage, and the preparedness level of the local government. An index-based model is developed to evaluate the loss of life and indirect damage under landslides as well as the resilience against disasters. To illustrate the validity of the proposed methodology, physical and socioeconomic vulnerability levels are analyzed for Seoul, Korea, using the suggested approach. The general trend found in this study indicates that higher-population-density areas under weaker fiscal conditions located downstream of mountainous areas are more vulnerable than areas in the opposite conditions.
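    An index-based vulnerability model of the kind described above typically normalises each indicator and combines the categories with weights. The indicator names, weights and normalisation bounds below are illustrative assumptions, not the authors' values for Seoul.

```python
# Hypothetical two-category, index-based vulnerability score.

def normalise(value, worst, best):
    """Map an indicator onto [0, 1], where 1 is most vulnerable."""
    return max(0.0, min(1.0, (value - best) / (worst - best)))

def physical_vulnerability(frame_type, debris_flow_depth_m):
    # Non-reinforced frames are assumed more fragile than RC frames.
    fragility = {"rc": 0.4, "non_rc": 0.9}[frame_type]
    return fragility * normalise(debris_flow_depth_m, worst=3.0, best=0.0)

def socioeconomic_vulnerability(pop_density, fiscal_index, preparedness):
    # Higher density and weaker finances raise vulnerability;
    # preparedness lowers it. Weights are illustrative.
    return (0.5 * normalise(pop_density, worst=30000, best=0)
            + 0.3 * (1 - fiscal_index)
            + 0.2 * (1 - preparedness))

def total_vulnerability(phys, socio, w_phys=0.5):
    return w_phys * phys + (1 - w_phys) * socio

phys = physical_vulnerability("non_rc", debris_flow_depth_m=1.5)
socio = socioeconomic_vulnerability(pop_density=24000, fiscal_index=0.3,
                                    preparedness=0.4)
print(round(total_vulnerability(phys, socio), 3))
```

    The composite score can then be binned into vulnerability classes and mapped district by district, which is how such indices are usually communicated.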

  16. DATA MINING METHODOLOGY FOR DETERMINING THE OPTIMAL MODEL OF COST PREDICTION IN SHIP INTERIM PRODUCT ASSEMBLY

    Directory of Open Access Journals (Sweden)

    Damir Kolich

    2016-03-01

    Full Text Available In order to accurately predict the costs of the thousands of interim products that are assembled in shipyards, it is necessary to use skilled engineers to develop detailed Gantt charts for each interim product separately, which takes many hours. It is therefore helpful to develop a prediction tool that estimates the cost of interim products accurately and quickly without the need for skilled engineers; this will drive down shipyard costs and improve competitiveness. Data mining is used extensively for developing prediction models in other industries. Since ships consist of thousands of interim products, it is logical to develop a data mining methodology for a shipyard or any other manufacturing industry where interim products are produced. The methodology involves analysis of existing interim products and data collection. Pre-processing and principal component analysis are performed to make the data “user-friendly” for later prediction processing and for the development of models that are both accurate and robust. The support vector machine is demonstrated to be the better model when the number of tuples is low. However, as the number of tuples increases beyond 10,000, the artificial neural network model is recommended.
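    The principal component analysis step in the pre-processing pipeline can be sketched directly via the covariance eigendecomposition. The synthetic "interim product" feature matrix below is an invented stand-in for the shipyard data described in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical interim-product attributes (e.g. weight, weld length,
# part count, ...): 200 records driven by 2 latent factors plus noise,
# so the features are strongly correlated.
latent = rng.normal(size=(200, 2))
X = latent @ rng.normal(size=(2, 5)) + 0.05 * rng.normal(size=(200, 5))

def pca(X, n_components):
    """Principal component analysis via the covariance eigendecomposition."""
    Xc = X - X.mean(axis=0)                     # center the data
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]           # sort descending by variance
    components = eigvecs[:, order[:n_components]]
    explained = eigvals[order] / eigvals.sum()  # fraction of variance per PC
    return Xc @ components, explained

scores, explained = pca(X, n_components=2)
print(scores.shape, round(float(explained[:2].sum()), 3))
```

    Feeding the low-dimensional scores, rather than the raw correlated attributes, to the SVM or neural network is what makes the downstream models both smaller and more robust.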

  17. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available perform according to customers’ expectations. Software methodologies are an important aspect in software development companies. Maddison (1984) defines a methodology as a “recommended collection of philosophies, phases, procedures, rules, techniques..., tools, documentation, management, and training for developers of information systems”. Hence, understanding the differences in customer interaction between software methodologies is not only important to the software team but also important...

  18. Modeling the Capacity and Emissions Impacts of Reduced Electricity Demand. Part 1. Methodology and Preliminary Results

    Energy Technology Data Exchange (ETDEWEB)

    Coughlin, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Shen, Hongxia [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Chan, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; McDevitt, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Sturges, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division

    2013-02-07

    Policies aimed at energy conservation and efficiency have broad environmental and economic impacts. Even if these impacts are relatively small, they may be significant compared to the cost of implementing the policy. Methodologies that quantify the marginal impacts of reduced demand for energy have an important role to play in developing accurate measures of both the benefits and costs of a given policy choice. This report presents a methodology for estimating the impacts of reduced demand for electricity on the electric power sector as a whole. The approach uses the National Energy Modeling System (NEMS), a mid-range energy forecast model developed and maintained by the U.S. Department of Energy, Energy Information Administration (EIA) (DOE EIA 2013). The report is organized as follows: in the rest of this section the traditional NEMS-BT approach is reviewed and an outline of the new reduced-form NEMS methodology is presented. Section 2 provides an overview of how the NEMS model works and describes the set of NEMS-BT runs that are used as input to the reduced-form approach. Section 3 presents our NEMS-BT simulation results and post-processing methods. In Section 4 we show how the NEMS-BT output can be generalized to apply to a broader set of end-uses. In Section 5 we discuss the application of this approach to policy analysis and summarize some of the issues that will be further investigated in Part 2 of this study.
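    The marginal-impact calculation at the heart of such a methodology reduces to comparing a reference model run with a reduced-demand run and expressing the difference per unit of avoided demand. All the numbers below are invented placeholders, not NEMS output.

```python
# Back-of-the-envelope marginal impacts from two hypothetical model runs.

reference = {"demand_twh": 3900.0, "co2_mt": 2100.0, "capacity_gw": 1050.0}
reduced   = {"demand_twh": 3880.0, "co2_mt": 2088.0, "capacity_gw": 1046.5}

delta_demand = reference["demand_twh"] - reduced["demand_twh"]  # TWh avoided

# Mt CO2 avoided per TWh of demand reduction, and GW of capacity deferred
# per TWh of demand reduction.
marginal_co2 = (reference["co2_mt"] - reduced["co2_mt"]) / delta_demand
marginal_capacity = (reference["capacity_gw"] - reduced["capacity_gw"]) / delta_demand

print(marginal_co2, marginal_capacity)
```

    The "reduced-form" idea in the report is essentially to pre-compute such marginal factors from a small set of full NEMS-BT runs so later policy cases need only the arithmetic above.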

  19. A methodology for developing anisotropic AAA phantoms via additive manufacturing.

    Science.gov (United States)

    Ruiz de Galarreta, Sergio; Antón, Raúl; Cazón, Aitor; Finol, Ender A

    2017-05-24

    An Abdominal Aortic Aneurysm (AAA) is a permanent focal dilatation of the abdominal aorta at least 1.5 times its normal diameter. The criterion of maximum diameter is still used in clinical practice, although numerical studies have demonstrated the importance of biomechanical factors for rupture risk assessment. AAA phantoms could be used for experimental validation of the numerical studies and for pre-intervention testing of endovascular grafts. We have applied multi-material 3D printing technology to manufacture idealized AAA phantoms with anisotropic mechanical behavior. Different composites were fabricated and the phantom specimens were characterized by biaxial tensile tests while using a constitutive model to fit the experimental data. One composite was chosen to manufacture the phantom based on having the same mechanical properties as those reported in the literature for human AAA tissue; the strain energy and anisotropic index were compared to make this choice. The materials for the matrix and fibers of the selected composite are, respectively, the digital materials FLX9940 and FLX9960 developed by Stratasys. The fiber proportion for the composite is equal to 0.15. The differences between the composite behavior and the AAA tissue are small, with a small difference in the strain energy (0.4%) and a maximum difference of 12.4% in the peak Green strain ratio. This work represents a step forward in the application of 3D printing technology for the manufacturing of AAA phantoms with anisotropic mechanical behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.
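    The anisotropic constitutive fitting described above can be illustrated with a Holzapfel-type fiber-reinforced strain-energy function, a common choice for soft tissue. The parameter values below are placeholders, not the paper's fitted constants for the FLX9940/FLX9960 composite.

```python
import math

# Illustrative fiber-reinforced strain-energy function of Holzapfel type:
# W = C10*(I1 - 3) + K1/(2*K2) * (exp(K2*(I4 - 1)^2) - 1),  I4 > 1.
C10, K1, K2 = 0.05, 1.0, 5.0   # MPa, MPa, dimensionless (hypothetical)

def strain_energy(stretch, along_fibers):
    """W for incompressible uniaxial stretch, fibers in one in-plane direction."""
    lam = stretch
    i1 = lam**2 + 2.0 / lam                  # first invariant, incompressible
    lateral = 1.0 / math.sqrt(lam)           # lateral stretch
    i4 = lam**2 if along_fibers else lateral**2  # squared fiber stretch
    w = C10 * (i1 - 3.0)                     # isotropic matrix contribution
    if i4 > 1.0:                             # fibers bear load only in tension
        w += K1 / (2.0 * K2) * (math.exp(K2 * (i4 - 1.0) ** 2) - 1.0)
    return w

w_fiber = strain_energy(1.2, along_fibers=True)
w_cross = strain_energy(1.2, along_fibers=False)
print(w_fiber, w_cross)  # stiffer response along the fiber direction
```

    The anisotropic index compared in the paper captures exactly this asymmetry: the ratio of the response along versus across the fiber direction, evaluated from biaxial test data.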

  20. Avoiding and identifying errors in health technology assessment models: qualitative study and methodological review.

    Science.gov (United States)

    Chilcott, J; Tappenden, P; Rawdin, A; Johnson, M; Kaltenthaler, E; Paisley, S; Papaioannou, D; Shippam, A

    2010-05-01

    Health policy decisions must be relevant, evidence-based and transparent. Decision-analytic modelling supports this process but its role is reliant on its credibility. Errors in mathematical decision models or simulation exercises are unavoidable but little attention has been paid to processes in model development. Numerous error avoidance/identification strategies could be adopted but it is difficult to evaluate the merits of strategies for improving the credibility of models without first developing an understanding of error types and causes. The study aims to describe the current comprehension of errors in the HTA modelling community and generate a taxonomy of model errors. Four primary objectives are to: (1) describe the current understanding of errors in HTA modelling; (2) understand current processes applied by the technology assessment community for avoiding errors in development, debugging and critically appraising models for errors; (3) use HTA modellers' perceptions of model errors with the wider non-HTA literature to develop a taxonomy of model errors; and (4) explore potential methods and procedures to reduce the occurrence of errors in models. It also describes the model development process as perceived by practitioners working within the HTA community. A methodological review was undertaken using an iterative search methodology. Exploratory searches informed the scope of interviews; later searches focused on issues arising from the interviews. Searches were undertaken in February 2008 and January 2009. In-depth qualitative interviews were performed with 12 HTA modellers from academic and commercial modelling sectors. All qualitative data were analysed using the Framework approach. Descriptive and explanatory accounts were used to interrogate the data within and across themes and subthemes: organisation, roles and communication; the model development process; definition of error; types of model error; strategies for avoiding errors; strategies for

  1. A Study on Uncertainty Quantification of Reflood Model using CIRCE Methodology

    International Nuclear Information System (INIS)

    Jeon, Seongsu; Hong, Soonjoon; Oh, Deogyeon; Bang, Youngseok

    2013-01-01

    The CIRCE method is intended to quantify the uncertainties of the correlations of a code, and may replace the expert judgment generally used. In this study, an uncertainty quantification of a reflood model was performed using the CIRCE methodology. This paper briefly describes the application process of the CIRCE methodology and the main results. The application of CIRCE provided satisfactory results, and this research is expected to be useful for improving the present audit calculation methodology, KINS-REM

  2. A Fault-tolerant Development Methodology for Industrial Control Systems

    DEFF Research Database (Denmark)

    Izadi-Zamanabadi, Roozbeh; Thybo, C.

    2004-01-01

    Developing advanced detection schemes is not the lone factor for obtaining a successful fault diagnosis performance. Acquiring significant achievements in applying fault-tolerance in industrial development requires that fault diagnosis and recovery schemes are developed in a consistent and logical...

  3. A Comparative Study of Three Methodologies for Modeling Dynamic Stall

    Science.gov (United States)

    Sankar, L.; Rhee, M.; Tung, C.; ZibiBailly, J.; LeBalleur, J. C.; Blaise, D.; Rouzaud, O.

    2002-01-01

    During the past two decades, there has been an increased reliance on the use of computational fluid dynamics methods for modeling rotors in high speed forward flight. Computational methods are being developed for modeling the shock induced loads on the advancing side, first-principles based modeling of the trailing wake evolution, and for retreating blade stall. The retreating blade dynamic stall problem has received particular attention, because the large variations in lift and pitching moments encountered in dynamic stall can lead to blade vibrations and pitch link fatigue. Restricting to aerodynamics, the numerical prediction of dynamic stall is still a complex and challenging CFD problem, that, even in two dimensions at low speed, gathers the major difficulties of aerodynamics, such as the grid resolution requirements for the viscous phenomena at leading-edge bubbles or in mixing-layers, the bias of the numerical viscosity, and the major difficulties of the physical modeling, such as the turbulence models, the transition models, whose both determinant influences, already present in static maximal-lift or stall computations, are emphasized by the dynamic aspect of the phenomena.

  4. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND Computer I Program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
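    The Monte Carlo comparison of alternative networks described above can be sketched in miniature: sample the uncertain variables of each alternative from expert-assessed distributions, score each sample with a cardinal utility function, and pick the alternative with the highest expected utility. The distributions, utility function and numbers below are invented for illustration and are not from the SIMRAND report.

```python
import random

random.seed(42)

def utility(performance, cost):
    """Illustrative cardinal utility: reward performance, penalise cost."""
    return performance - 0.1 * cost**1.5

# Two alternative task subsets ("paths") toward the project goal, each
# with (low, mode, high) triangular distributions for performance and cost.
ALTERNATIVES = {
    "A": {"perf": (60, 80, 95), "cost": (10, 14, 25)},
    "B": {"perf": (50, 65, 75), "cost": (5, 8, 12)},
}

def expected_utility(alt, n_trials=20000):
    spec = ALTERNATIVES[alt]
    total = 0.0
    for _ in range(n_trials):
        # random.triangular takes (low, high, mode).
        perf = random.triangular(spec["perf"][0], spec["perf"][2], spec["perf"][1])
        cost = random.triangular(spec["cost"][0], spec["cost"][2], spec["cost"][1])
        total += utility(perf, cost)
    return total / n_trials

results = {alt: expected_utility(alt) for alt in ALTERNATIVES}
best = max(results, key=results.get)
print(best, results)
```

    The full methodology additionally prunes the alternative networks analytically before simulating, so that only the non-dominated subsets reach the Monte Carlo stage.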

  5. Development of a methodology for life cycle building energy ratings

    International Nuclear Information System (INIS)

    Hernandez, Patxi; Kenny, Paul

    2011-01-01

    Traditionally the majority of building energy use has been linked to its operation (heating, cooling, lighting, etc.), and much attention has been directed to reduce this energy use through technical innovation, regulatory control and assessed through a wide range of rating methods. However buildings generally employ an increasing amount of materials and systems to reduce the energy use in operation, and energy embodied in these can constitute an important part of the building's life cycle energy use. For buildings with 'zero-energy' use in operation the embodied energy is indeed the only life cycle energy use. This is not addressed by current building energy assessment and rating methods. This paper proposes a methodology to extend building energy assessment and rating methods accounting for embodied energy of building components and systems. The methodology is applied to the EU Building Energy Rating method and, as an illustration, as implemented in Irish domestic buildings. A case study dwelling is used to illustrate the importance of embodied energy on life cycle energy performance, particularly relevant when energy use in operation tends to zero. The use of the Net Energy Ratio as an indicator to select appropriate building improvement measures is also presented and discussed. - Highlights: → The definitions for 'zero energy buildings' and current building energy ratings are examined. → There is a need to integrate a life cycle perspective within building energy ratings. → A life cycle building energy rating method (LC-BER), including embodied energy is presented. → Net Energy Ratio is proposed as an indicator to select building energy improvement options.
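    The life-cycle extension the paper proposes amounts to adding annualised embodied energy to the operational figure, and its Net Energy Ratio indicator compares the energy an improvement measure saves over its life against the energy embodied in it. All quantities below are hypothetical.

```python
# Sketch of a life-cycle building energy rating and the Net Energy Ratio.

FLOOR_AREA = 100.0   # m2 (hypothetical dwelling)
LIFETIME = 50        # years assumed for annualisation

def life_cycle_intensity(operational_kwh_m2yr, embodied_kwh):
    """Annualised life-cycle energy use per m2: operation plus embodied."""
    return operational_kwh_m2yr + embodied_kwh / (FLOOR_AREA * LIFETIME)

# A 'zero-operational-energy' dwelling still gets a non-zero rating here,
# which is the paper's central point.
print(life_cycle_intensity(0.0, embodied_kwh=250000.0))  # 50.0 kWh/m2/yr

def net_energy_ratio(energy_saved_kwh, embodied_invested_kwh):
    """NER > 1: the measure saves more energy over its life than it embodies."""
    return energy_saved_kwh / embodied_invested_kwh

# Hypothetical insulation upgrade: 1500 kWh/yr saved over the lifetime,
# 12000 kWh embodied in the added materials.
print(net_energy_ratio(1500.0 * LIFETIME, 12000.0))  # 6.25
```

    Ranking candidate improvement measures by NER, rather than by operational savings alone, is the selection use of the indicator discussed in the paper.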

  6. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    International Nuclear Information System (INIS)

    Lahtinen, J.; Launiainen, T.; Heljanko, K.; Ropponen, J.

    2012-01-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)
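    The essence of the model checking described above, deciding automatically whether a state machine model satisfies a safety property, can be sketched as an explicit-state reachability search. The pump/level system below is an invented toy, not one of the report's I&C models, and real tools use symbolic techniques to cope with the large state spaces the report discusses.

```python
from collections import deque

def check_safety(initials, successors, is_bad):
    """BFS over the reachable state space; return a counterexample path or None."""
    queue = deque((s, (s,)) for s in initials)
    seen = set(initials)
    while queue:
        state, path = queue.popleft()
        if is_bad(state):
            return path                      # counterexample trace
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + (nxt,)))
    return None                              # safety property verified

# Toy model. State: (water_level, pump_on); the controller must keep the
# level from exceeding 3, so the pump may only run while the level is low.
def successors(state):
    level, pump = state
    out = []
    if pump and level < 3:
        out.append((level + 1, True))        # pump fills the tank
    out.append((max(level - 1, 0), False))   # pump off, level drains
    if level < 3:
        out.append((level, True))            # controller may re-enable pump
    return out

trace = check_safety({(0, True)}, successors, is_bad=lambda s: s[0] > 3)
print("safe" if trace is None else trace)    # prints "safe"
```

    Faults of the kind the report models (e.g. a stuck sensor) would be added as extra transitions, after which the same search either re-verifies the property or returns a concrete failure trace.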

  7. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    Energy Technology Data Exchange (ETDEWEB)

    Lahtinen, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Launiainen, T.; Heljanko, K.; Ropponen, J. [Aalto Univ., Espoo (Finland). Dept. of Information and Computer Science

    2012-07-01

    Digital instrumentation and control (I and C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)

  8. Concepts and methodologies for modeling and simulation a tribute to Tuncer Oren

    CERN Document Server

    Yilmaz, Levent

    2015-01-01

    This comprehensive text/reference presents cutting-edge advances in the theory and methodology of modeling and simulation (M&S), and reveals how this work has been influenced by the fundamental contributions of Professor Tuncer Ören to this field. Exploring the synergies among the domains of M&S and systems engineering (SE), the book describes how M&S and SE can help to address the complex problems identified as "Grand Challenges" more effectively under a model-driven and simulation-directed systems engineering framework. Topics and features: examines frameworks for the development of advan

  9. Development of a Malicious Insider Composite Vulnerability Assessment Methodology

    National Research Council Canada - National Science Library

    King, William H

    2006-01-01

    .... There are very few vulnerability and impact models capable of providing information owners with the ability to comprehensively assess the effectiveness of an organization's malicious insider mitigation strategies...

  10. Modeling Tourism Sustainable Development

    Science.gov (United States)

    Shcherbina, O. A.; Shembeleva, E. A.

    The basic approaches to decision making and modeling tourism sustainable development are reviewed. The dynamics of sustainable development are considered within Forrester's system dynamics. The multidimensionality of tourism sustainable development and multicriteria issues of sustainable development are analyzed. Decision Support Systems (DSS) and Spatial Decision Support Systems (SDSS) are discussed as effective techniques for examining and visualizing the impacts of policies and sustainable tourism development strategies within an integrated and dynamic framework. Main modules that may be utilized for integrated modeling of sustainable tourism development are proposed.
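
    Forrester-style system dynamics amounts to integrating stocks and flows over time. A minimal sketch with an invented tourist-population/environment feedback loop (all coefficients are illustrative, not from the paper):

```python
# Minimal stock-and-flow simulation: tourist numbers grow with
# attractiveness but degrade the environment, which in turn reduces
# attractiveness. Every coefficient here is an illustrative assumption.

def simulate(years=30, dt=1.0):
    tourists, environment = 1000.0, 1.0          # the two stocks
    history = []
    for _ in range(years):
        attractiveness = environment             # auxiliary variable
        arrivals = 0.10 * tourists * attractiveness   # inflow
        departures = 0.05 * tourists                  # outflow
        degradation = 1e-5 * tourists                 # environmental load
        recovery = 0.02 * (1.0 - environment)         # regeneration
        tourists += dt * (arrivals - departures)
        environment += dt * (recovery - degradation)
        history.append((tourists, environment))
    return history

final_tourists, final_env = simulate()[-1]
print(final_tourists, final_env)
```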

  11. The Development Methodology of the UML Electronic Guide

    Directory of Open Access Journals (Sweden)

    N.A. Magariu

    2006-09-01

    Full Text Available A technological model for the realization of the electronic guide to the UML language is considered. This model includes a description of the peculiarities of using the special graphic editor for constructing the UML diagrams, XML vocabularies (XMI, DocBook, SVG, XSLT) for representing the text and diagrams, and JavaScript code for constructing the tests.

  12. Development of a methodology to evaluate material accountability in pyroprocess

    Science.gov (United States)

    Woo, Seungmin

    This study investigates the effect of the non-uniform nuclide composition in spent fuel on material accountancy in the pyroprocess. High-fidelity depletion simulations are performed using the Monte Carlo code SERPENT in order to determine nuclide composition as a function of axial and radial position within fuel rods and assemblies, and burnup. For improved accuracy, the simulations use short burnup steps (25 days or less), an Xe-equilibrium treatment (to avoid oscillations over burnup steps), an axial moderator temperature distribution, and 30 axial meshes. Analytical solutions of the simplified depletion equations are derived to understand the axial non-uniformity of nuclide composition in spent fuel. The cosine shape of the axial neutron flux distribution dominates the axial non-uniformity of the nuclide composition. The combination of cross sections and time also generates axial non-uniformity, since the exponential term in the analytical solution consists of the neutron flux, the cross section and time. The axial concentration distribution for a nuclide with a small cross section becomes steeper than that for a nuclide with a large cross section, because the axial flux is weighted by the cross section in the exponential term of the analytical solution. Similarly, the non-uniformity becomes flatter with increasing burnup, because the time term in the exponential increases. Based on the developed numerical recipes, and by decoupling the axial distributions from the predetermined representative radial distributions matched by axial height, the axial and radial composition distributions for representative spent nuclear fuel assemblies (the Type-0, -1, and -2 assemblies after 1, 2, and 3 depletion cycles) are obtained. These data are then modified to depict the processing of materials in the head-end stage of the pyroprocess, namely chopping, voloxidation and granulation. The expectation and standard deviation of the Pu-to-244Cm-ratio by the single granule
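
    The flux-weighting argument can be reproduced with the textbook one-group buildup solution N/N_sat = 1 − exp(−σφ(z)t) under a cosine axial flux; this toy (illustrative σ and t values, not SERPENT results) shows the axial profile flattening for larger cross sections and at higher burnup, as the abstract describes:

```python
import math

def buildup_fraction(z, sigma, t, height=1.0):
    """Saturating buildup N/N_sat = 1 - exp(-sigma * phi(z) * t) under a
    chopped-cosine axial flux phi(z) = cos(pi * (z/H - 0.5))."""
    phi = math.cos(math.pi * (z / height - 0.5))
    return 1.0 - math.exp(-sigma * phi * t)

def peak_to_edge(sigma, t):
    """Axial non-uniformity: mid-plane concentration over near-end concentration."""
    return buildup_fraction(0.5, sigma, t) / buildup_fraction(0.05, sigma, t)

small_xs = peak_to_edge(sigma=0.1, t=1.0)      # small cross section: steep profile
large_xs = peak_to_edge(sigma=5.0, t=1.0)      # large cross section: flatter (saturated)
late_burnup = peak_to_edge(sigma=0.1, t=30.0)  # longer irradiation also flattens
print(small_xs, large_xs, late_burnup)  # larger sigma and longer t give flatter profiles
```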

  13. Integrating Social Activity Theory and Critical Discourse Analysis: A Multilayered Methodological Model for Examining Knowledge Mediation in Mentoring

    Science.gov (United States)

    Becher, Ayelet; Orland-Barak, Lily

    2016-01-01

    This study suggests an integrative qualitative methodological framework for capturing complexity in mentoring activity. Specifically, the model examines how historical developments of a discipline direct mentors' mediation of professional knowledge through the language that they use. The model integrates social activity theory and a framework of…

  14. Development of Six Sigma methodology for CNC milling process improvements

    Science.gov (United States)

    Ismail, M. N.; Rose, A. N. M.; Mohammed, N. Z.; Rashid, M. F. F. Ab

    2017-10-01

    Quality and productivity play an important role in any organization, especially in manufacturing sectors seeking the profits that lead to a company's success. This paper reports a work improvement project at Kolej Kemahiran Tinggi MARA Kuantan. It involves problem identification in the production of the “Khufi” product and the proposal of an effective framework to improve the current situation. Based on observation and data collection on the work-in-progress (WIP) product, the major problem identified relates to the function of the product: the parts cannot be assembled properly because the dimensions of the product are out of specification. Six Sigma has been used as the methodology to study and improve the identified problems. Six Sigma is a highly statistical, data-driven approach to solving complex business problems. It uses a methodical five-phase approach of define, measure, analyze, improve and control (DMAIC) to help understand the process and the variables that affect it, so that the process can be optimized. Finally, the root cause of and solution to the “Khufi” production problem were identified and implemented, after which the product successfully met its fitting specification.

  15. A data-driven multi-model methodology with deep feature selection for short-term wind forecasting

    International Nuclear Information System (INIS)

    Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias; Zhang, Jie

    2017-01-01

    Highlights: • An ensemble model is developed to produce both deterministic and probabilistic wind forecasts. • A deep feature selection framework is developed to optimally determine the inputs to the forecasting methodology. • The developed ensemble methodology has improved the forecasting accuracy by up to 30%. - Abstract: With the growing wind penetration into the power system worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated to provide 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared to single-algorithm models, the developed multi-model framework with deep feature selection procedure has improved the forecasting accuracy by up to 30%.
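
    A two-layer ensemble of this kind can be sketched with stand-in base forecasters; here persistence and a moving average play the role of the paper's machine learning models, and the second layer fits a blending weight on a training split (data and models are invented for illustration):

```python
# Two-layer ensemble sketch: layer 1 = two simple wind-speed forecasters
# (persistence and a 3-point moving average); layer 2 = a blend whose
# weight is fit on held-out data. A toy stand-in for the paper's ensemble.

series = [5.0, 5.5, 6.0, 5.8, 6.2, 6.5, 6.4, 6.8, 7.0, 6.9, 7.2, 7.5]

def persistence(history):            # base model 1
    return history[-1]

def moving_average(history, k=3):    # base model 2
    return sum(history[-k:]) / k

def blend_weight(series, split=8):
    """Grid-search the weight w in [0, 1] minimising squared error of
    w*persistence + (1-w)*moving_average on the training portion."""
    best_w, best_err = 0.0, float('inf')
    for i in range(101):
        w = i / 100
        err = sum((w * persistence(series[:t]) +
                   (1 - w) * moving_average(series[:t]) - series[t]) ** 2
                  for t in range(3, split))
        if err < best_err:
            best_w, best_err = w, err
    return best_w

w = blend_weight(series)
forecast = w * persistence(series) + (1 - w) * moving_average(series)
print(w, forecast)
```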

  16. Methodology for urban rail and construction technology research and development planning

    Science.gov (United States)

    Rubenstein, L. D.; Land, J. E.; Deshpande, G.; Dayman, B.; Warren, E. H.

    1980-01-01

    A series of transit system visits, organized by the American Public Transit Association (APTA), was conducted in which the system operators identified the most pressing development needs. These varied by property and were reformulated into a series of potential projects. To assist in the evaluation, a data base useful for estimating the present capital and operating costs of various transit system elements was generated from published data. An evaluation model was developed which considered the rate of deployment of the research and development project, potential benefits, development time and cost. An outline of an evaluation methodology that considered benefits other than capital and operating cost savings was also presented. During the course of the study, five candidate projects were selected for detailed investigation: (1) air comfort systems; (2) solid state auxiliary power conditioners; (3) door systems; (4) escalators; and (5) fare collection systems. Application of the evaluation model to these five examples showed the usefulness of modeling deployment rates and indicated a need to increase the scope of the model to quantitatively consider reliability impacts.

  17. Modeling the Cloud: Methodology for Cloud Computing Strategy and Design

    Science.gov (United States)

    2011-05-17

    …roadmap”. 4. Leverage an enterprise architecture methodology, such as TOGAF and/or DODAF, to build integrated artifacts. 5. Extend the business and … the Patriot Act. Transition Planning: Leveraging TOGAF Phases E, F, G & H: Opportunities and Solutions.

  18. Development of core technology for KNGR system design; development of methodology to determine hydrogen ignitor position in KNGR

    Energy Technology Data Exchange (ETDEWEB)

    Park, Goon Chul; Choi, Hee Dong; Whang, Yong Seok; Lee, Un Jang; Lee, Jung Jae; Kim, Do Yeun [Seoul National University, Seoul (Korea)

    2002-04-01

    The scope of this project is to generate the fundamental engineering data on hydrogen behaviour required for establishing design guidance and regulatory requirements with respect to hydrogen control and severe accident management. The ultimate goal is an engineered hydrogen control scheme that removes threats to the integrity of the containment building from explosion hazards due to the burning of hydrogen generated during a severe accident in APR 1400. On the basis of this study, the aim of the project is to ensure the safety of the nuclear power plant against severe accidents by providing an efficient methodology for positioning the hydrogen igniters. The project proceeded in 6 phases, as follows: 1. Capability examination of the hydrogen controller. 2. Three-dimensional hydrogen mixing experiments and development of a He detector. 3. Verification of the three-dimensional hydrogen analysis model of the GOTHIC code. 4. Development of the three-dimensional hydrogen mixing code HYCA3D. 5. Examination of PAR models. 6. Development of a methodology to determine hydrogen igniter positions. 29 refs., 76 figs., 10 tabs. (Author)

  19. Systematic iteration between model and methodology: A proposed approach to evaluating unintended consequences.

    Science.gov (United States)

    Morell, Jonathan A

    2017-09-18

    This article argues that evaluators could better deal with unintended consequences if they improved their methods of systematically and methodically combining empirical data collection and model building over the life cycle of an evaluation. This process would be helpful because it can increase the timespan from when the need for a change in methodology is first suspected to the time when the new element of the methodology is operational. The article begins with an explanation of why logic models are so important in evaluation, and why the utility of models is limited if they are not continually revised based on empirical evaluation data. It sets the argument within the larger context of the value and limitations of models in the scientific enterprise. Following will be a discussion of various issues that are relevant to model development and revision. What is the relevance of complex system behavior for understanding predictable and unpredictable unintended consequences, and the methods needed to deal with them? How might understanding of unintended consequences be improved with an appreciation of generic patterns of change that are independent of any particular program or change effort? What are the social and organizational dynamics that make it rational and adaptive to design programs around single-outcome solutions to multi-dimensional problems? How does cognitive bias affect our ability to identify likely program outcomes? Why is it hard to discern change as a result of programs being embedded in multi-component, continually fluctuating, settings? The last part of the paper outlines a process for actualizing systematic iteration between model and methodology, and concludes with a set of research questions that speak to how the model/data process can be made efficient and effective. Copyright © 2017. Published by Elsevier Ltd.

  20. Development of margin assessment methodology of decay heat removal function against external hazards. (2) Tornado PRA methodology

    International Nuclear Information System (INIS)

    Nishino, Hiroyuki; Kurisaka, Kenichi; Yamano, Hidemasa

    2014-01-01

    Probabilistic Risk Assessment (PRA) for external events has been recognized as an important safety assessment method since the TEPCO Fukushima Daiichi nuclear power station accident. PRA should be performed not only for earthquakes and tsunamis, which are particularly key events in Japan; the PRA methodology should also be developed for other external hazards (e.g. tornadoes). In this study, the methodology was developed for Sodium-cooled Fast Reactors, noting that the ambient air is their final heat sink for removing decay heat under accident conditions. First, a tornado hazard curve was estimated using data recorded in Japan. Second, structures and components important for decay heat removal were identified, and an event tree leading to core damage was developed in terms of wind load and missiles (i.e. steel pipes, boards and cars) generated by a tornado. The main damage cause for the important structures and components is the missiles; the tornado missiles that can reach components and structures placed at high elevations were identified, and the failure probabilities of the components and structures against the tornado missiles were calculated as a product of two probabilities: a probability for a missile to enter the intake or outtake of the decay heat removal system, and a probability of failure caused by the missile impact. Finally, the event tree was quantified. As a result, the core damage frequency was well below 10⁻¹⁰/ry. (author)
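
    The quantification described above, a product of two probabilities per train feeding an event tree, can be sketched as follows (all probabilities are illustrative placeholders, not values from the study):

```python
# Sketch of the two-factor missile failure model:
# P(component fails) = P(missile enters intake/outtake) * P(failure | impact),
# followed by a one-branch event-tree quantification.

def component_failure_prob(p_enter, p_fail_given_impact):
    return p_enter * p_fail_given_impact

def core_damage_freq(tornado_freq, train_failure_probs):
    """Core damage requires the tornado plus loss of every redundant
    decay-heat-removal train (assumed independent here)."""
    p_all_trains_fail = 1.0
    for p in train_failure_probs:
        p_all_trains_fail *= p
    return tornado_freq * p_all_trains_fail

p_train = component_failure_prob(p_enter=1e-3, p_fail_given_impact=1e-2)
cdf = core_damage_freq(tornado_freq=1e-5, train_failure_probs=[p_train] * 2)
print(cdf)  # about 1e-15 per reactor-year, far below 1e-10
```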

  1. Methodological aspects of journaling a dynamic adjusting entry model

    Directory of Open Access Journals (Sweden)

    Vlasta Kašparovská

    2011-01-01

    Full Text Available This paper expands the discussion of the importance and function of adjusting entries for loan receivables. Discussion of the cyclical development of adjusting entries, their negative impact on the business cycle and potential solutions has intensified during the financial crisis. These discussions are still ongoing and continue to be relevant to members of the professional public, banking regulators and representatives of international accounting institutions. The objective of this paper is to evaluate a method of journaling dynamic adjusting entries under current accounting law. It also expresses the authors’ opinions on the potential for consistently implementing basic accounting principles in journaling adjusting entries for loan receivables under a dynamic model.
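
    A dynamic (expected-loss) adjusting entry of the kind under discussion can be illustrated with the standard PD × LGD × EAD decomposition; the figures below are invented, and the paper's own journaling method may differ:

```python
# Illustrative dynamic adjusting entry: the provision is built from
# expected loss (PD * LGD * EAD) rather than only from incurred losses.

def expected_loss(pd, lgd, ead):
    """Expected loss on a loan receivable: probability of default *
    loss given default * exposure at default."""
    return pd * lgd * ead

def adjusting_entry(current_provision, pd, lgd, ead):
    """Journal amount needed to bring the provision to expected loss
    (positive = additional expense, negative = release)."""
    return expected_loss(pd, lgd, ead) - current_provision

delta = adjusting_entry(current_provision=12000.0, pd=0.02, lgd=0.45, ead=2000000.0)
print(round(delta, 2))  # 6000.0: book an additional provision expense
```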

  2. On-line maintenance methodology development and its applications

    International Nuclear Information System (INIS)

    Kim, J.; Jae, M.

    2012-01-01

    With the increasing economic pressures being faced and the potential for shortening outage times under deregulated electricity markets around the world, licensees are motivated to perform an increasing amount of on-line maintenance (OLM). The benefits of OLM include increased system and plant reliability, reduction of plant equipment and system material condition deficiencies that could adversely impact operations, and reduction of work scope during plant refueling outages. In Korea, guidelines for risk assessment are specified in safety regulation guidelines 16.7 and 16.8 of the Korea Inst. of Nuclear Safety (KINS), namely 'General guidelines of Risk-informed application for requesting permission of changes' and 'Requesting permission of changes of Risk-informed application for Technical Specification'. We selected the emergency diesel generator (EDG) of Ulchin units 3 and 4 for risk assessment analysis by applying configuration changes. The EDG, which is safety class 1E, belongs to the on-site standby power (A and B train EDGs) in the electric distribution system. The EDG is an important component because it must maintain standby status while the plant is operating; therefore we selected it as the target component of the risk assessment analysis. The risk assessment is limited to CDF and is performed using AIMS-PSA Release 2. We evaluate the CDF by applying the configuration changes under some assumptions. Evaluations of full-power operation and low-power/shutdown operation were performed. This study introduces a methodology and performs the corresponding risk assessment. (authors)
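
    The configuration-risk logic behind such an assessment can be sketched with the common incremental core damage probability (ICDP) measure; the CDF values and the 10⁻⁶ screening criterion below are illustrative assumptions, not results from the Ulchin analysis:

```python
# Toy configuration-risk calculation in the spirit of on-line maintenance
# PSA: taking one EDG out of service raises the conditional CDF; the
# incremental core damage probability over the maintenance window is then
# compared with a screening criterion. All numbers are illustrative.

BASE_CDF = 2.0e-6          # per year, baseline configuration
CONDITIONAL_CDF = 8.0e-6   # per year, with one EDG out of service

def icdp(conditional_cdf, base_cdf, hours):
    """Incremental core damage probability for a maintenance duration."""
    return (conditional_cdf - base_cdf) * hours / 8760.0

risk = icdp(CONDITIONAL_CDF, BASE_CDF, hours=72)
print(risk)  # about 4.9e-8, below an assumed 1e-6 screening criterion
```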

  3. Development of Management Quality Assessment Methodology in the Public Sector: Problems and Contradictions

    Directory of Open Access Journals (Sweden)

    Olga Vladimirovna Kozhevina

    2015-09-01

    Full Text Available The development of a management quality assessment methodology for the public sector is a relevant scientific and practical problem of economic research. Utilization of assessment results based on the authors' methodology allows us to rate public sector organizations, to justify decisions on reorganization and privatization, and to monitor changes in the level of management quality of public sector organizations. The study determined the place of the quality of the control processes of a public sector organization in the system of “Quality of public administration — the effective operation of the public sector organization”; revealed the contradictions associated with the assessment of management quality; established the conditions for effective functioning of public sector organizations; developed a mechanism of comprehensive assessment and an algorithm for constructing and evaluating control models of management quality; and empirically grounded the criteria for assessing management quality in public sector organizations, including economic, budgetary, social and public, informational, innovation and institutional criteria. Utilizing the proposed algorithm, an assessment model of quality management in public sector organizations is developed, including financial, economic, social, innovation, informational and institutional indicators. For each indicator of management quality, the coefficients of importance in the management quality assessment model, as well as comprehensive and partial evaluation indicators, are determined on the basis of expert evaluations. The main conclusion of the article is that management quality assessment for public sector organizations should be based not only on the indicators achieved in the dynamics and utilized for analyzing the effectiveness of management, but should also take into account the reference levels for the values of these
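
    The expert-weighted scoring described above can be sketched as a simple weighted sum; the criteria weights and partial scores below are invented for illustration, not the authors' values:

```python
# Sketch of a composite management-quality score: expert-assigned
# importance coefficients weight partial indicators into one index.

weights = {                       # importance coefficients (sum to 1.0)
    'economic': 0.25, 'budgetary': 0.20, 'social': 0.20,
    'informational': 0.15, 'innovation': 0.10, 'institutional': 0.10,
}

def composite_score(partial_scores, weights):
    """Weighted sum of partial indicators, each normalised to [0, 1]."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * partial_scores[k] for k in weights)

org_a = {'economic': 0.8, 'budgetary': 0.7, 'social': 0.6,
         'informational': 0.9, 'innovation': 0.5, 'institutional': 0.7}
score = composite_score(org_a, weights)
print(round(score, 3))  # 0.715, a single index usable for ranking organizations
```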

  4. A Framework for Selection of DSS Development Methodology

    Science.gov (United States)

    1992-03-01

    Foote, Marcus G. Master's Thesis, March 1992, 63 pages (report documentation page garbled in extraction). … (and Carlson, 1982, p. 62) However, once the system is completely developed, it will reach full strength faster than the other approaches, develop the

  5. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  6. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M.; Iglesias, Josué; Casar, José R.

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym. PMID:23012544
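
    The model-to-code step at the heart of MDA-style approaches like ROOD can be sketched as a toy model-to-text transformation; the device-model schema and the emitted accessor API are invented for illustration:

```python
# Toy model-driven generation: a declarative model of a smart-space
# device is transformed into boilerplate accessor code. Both the model
# schema and the generated API are hypothetical.

device_model = {
    'name': 'HeartRateSensor',
    'resources': [('rate', 'int'), ('battery', 'float')],
}

def generate_class(model):
    """Model-to-text transformation: emit a Python class per device model."""
    lines = [f"class {model['name']}:"]
    for res, typ in model['resources']:
        lines.append(f"    def get_{res}(self) -> {typ}:")
        lines.append(f"        return self._read('{res}')")
    return '\n'.join(lines)

code = generate_class(device_model)
print(code)
```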

  7. Developing mathematical modelling competence

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Jensen, Tomas Højgaard

    2003-01-01

    In this paper we introduce the concept of mathematical modelling competence, by which we mean being able to carry through a whole mathematical modelling process in a certain context. Analysing the structure of this process, six sub-competences are identified. Mathematical modelling competence cannot be reduced to these six sub-competences, but they are necessary elements in the development of mathematical modelling competence. Experience from the development of a modelling course is used to illustrate how the different nature of the sub-competences can be used as a tool for finding the balance between different kinds of activities in a particular educational setting. Obstacles of social, cognitive and affective nature for the students' development of mathematical modelling competence are reported and discussed in relation to the sub-competences.

  8. A modelling methodology for assessing the impact of climate variability and climatic change on hydroelectric generation

    International Nuclear Information System (INIS)

    Munoz, J.R.; Sailor, D.J.

    1998-01-01

    A new methodology relating basic climatic variables to hydroelectric generation was developed. The methodology can be implemented in large or small basins with any number of hydro plants. The method was applied to the Sacramento, Eel and Russian river basins in northern California, where more than 100 hydroelectric plants are located. The final model predicts the availability of hydroelectric generation for the entire basin, given present and recent-past climate conditions, with about 90% accuracy. The results can be used for water management purposes or for analyzing the effect of climate variability on hydrogeneration availability in the basin. A wide range of results can be obtained depending on the climate change scenario used. (Author)
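
    A minimal version of such a climate-to-generation model is an ordinary least-squares fit of generation on a climatic variable; the monthly figures below are made up so that the fit is exact, purely for illustration:

```python
# Least-squares fit relating a basic climatic variable (precipitation)
# to hydroelectric generation. Data are invented monthly values that lie
# exactly on gen = 3 * precip + 60, so the fit recovers that line.

precip = [50, 80, 120, 200, 150, 90, 40, 30, 60, 110, 170, 140]        # mm
gen = [210, 300, 420, 660, 510, 330, 180, 150, 240, 390, 570, 480]     # GWh

n = len(precip)
mean_x = sum(precip) / n
mean_y = sum(gen) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(precip, gen))
         / sum((x - mean_x) ** 2 for x in precip))
intercept = mean_y - slope * mean_x

def predict(mm):
    return intercept + slope * mm

print(round(slope, 2), round(predict(100), 1))  # 3.0 360.0
```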

  9. Development of a methodology for analysis of the impact of modifying neutron cross sections

    International Nuclear Information System (INIS)

    Wenner, M. T.; Haghighat, A.; Adams, J. M.; Carlson, A. D.; Grimes, S. M.; Massey, T. N.

    2004-01-01

    Monte Carlo analysis of a Time-of-Flight (TOF) experiment can be utilized to examine the accuracy of nuclear cross section data. Accurate determination of these data is paramount in characterizing reactor lifetime. We have developed a methodology to examine the impact of modifying the current cross section libraries available in ENDF-6 format (1) where deficiencies may exist, and have shown that it may be an effective means of examining the accuracy of nuclear cross section data. The new methodology has been applied to the iron scattering cross sections, and the use of the revised cross sections suggests that reactor pressure vessel fluence may be underestimated. (authors)

  10. Software representation methodology for agile application development: An architectural approach

    Directory of Open Access Journals (Sweden)

    Alejandro Paolo Daza Corredor

    2016-06-01

    Full Text Available The generation of Web applications involves the execution of repetitive tasks: determining information structures, generating different types of components, and finally carrying out deployment and application-tuning tasks. Many applications of this type share components that coincide from application to application. Current trends in software engineering such as MDE, MDA and MDD aim to automate the generation of applications by structuring a model and applying transformations until the application is obtained. This document puts forward an architectural foundation that facilitates the generation of such applications, relying on model-driven architecture without ignoring the existence and relevance of the existing trends mentioned above.

  11. Modeling postpartum depression in rats: theoretic and methodological issues

    Science.gov (United States)

    Ming, LI; Shinn-Yi, CHOU

    2016-01-01

    The postpartum period is when a host of changes occur at molecular, cellular, physiological and behavioral levels to prepare female humans for the challenge of maternity. Alteration or prevention of these normal adaptions is thought to contribute to disruptions of emotion regulation, motivation and cognitive abilities that underlie postpartum mental disorders, such as postpartum depression. Despite the high incidence of this disorder, and the detrimental consequences for both mother and child, its etiology and related neurobiological mechanisms remain poorly understood, partially due to the lack of appropriate animal models. In recent decades, there have been a number of attempts to model postpartum depression disorder in rats. In the present review, we first describe clinical symptoms of postpartum depression and discuss known risk factors, including both genetic and environmental factors. Thereafter, we discuss various rat models that have been developed to capture various aspects of this disorder and knowledge gained from such attempts. In doing so, we focus on the theories behind each attempt and the methods used to achieve their goals. Finally, we point out several understudied areas in this field and make suggestions for future directions. PMID:27469254

  12. Establishing a methodology to develop complex sociotechnical systems

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2013-02-01

    Full Text Available Many modern management systems, such as military command and control, tend to be large and highly interconnected sociotechnical systems operating in a complex environment. Successful development, assessment and implementation of these systems...

  13. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of software requirements and coding phase are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during software design, analysis and testing phase is explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus-based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for RTC systems, starting from the requirement capture phase till the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause-and-effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored
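
    Boundary value analysis, one of the test-generation techniques mentioned, can be sketched directly; the valid range and the unit under test are invented examples (the sketch is in Python, although the PFBR software itself is written in C):

```python
# Boundary value analysis: for a range-checked input, derive test points
# at and around each boundary plus one nominal mid-range value.

def boundary_values(lo, hi):
    """Classic boundary set: just outside, on, and just inside each bound."""
    return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]

def in_range(value, lo=0, hi=100):
    """Unit under test: a defensive range check on a hypothetical input."""
    return lo <= value <= hi

tests = boundary_values(0, 100)
results = [in_range(v) for v in tests]
print(list(zip(tests, results)))
# [(-1, False), (0, True), (1, True), (50, True), (99, True), (100, True), (101, False)]
```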

  14. Learning challenges and sustainable development: A methodological perspective.

    Science.gov (United States)

    Seppänen, Laura

    2017-01-01

    Sustainable development requires learning, but the contents of learning are often complex and ambiguous. This requires new integrated approaches from research. It is argued that investigation of people's learning challenges in everyday work is beneficial for research on sustainable development. The aim of the paper is to describe a research method for examining learning challenges in promoting sustainable development. This method is illustrated with a case example from organic vegetable farming in Finland. The method, based on Activity Theory, combines historical analysis with qualitative analysis of need expressions in discourse data. The method, linking local and subjective need expressions with general historical analysis, is a promising way to overcome the gap between the individual and society, so much needed in research for sustainable development. Dialectically informed historical frameworks have practical value as tools in collaborative negotiations and participatory designs for sustainable development. The simultaneous use of systemic and subjective perspectives allows researchers to manage the complexity of practical work activities and to avoid overly simplistic presumptions about sustainable development.

  15. Development of a flow structure interaction methodology applicable to a convertible car roof

    International Nuclear Information System (INIS)

    Knight, Jason J.

    2003-01-01

    The current research investigates the flow-induced deformation of a convertible roof of a vehicle using experimental and numerical methods. A computational methodology is developed that entails the coupling of a commercial Computational Fluid Dynamics (CFD) code with an in-house structural code. A model two-dimensional problem is first studied. The CFD code and a Source Panel Method (SPM) code are used to predict the pressure acting on the surface of a rigid roof of a scale model. Good agreement is found between the predicted pressure distribution and that obtained in a parallel wind-tunnel experimental programme. The validated computational modelling of the fluid flow is then used in a coupling strategy with a line-element structural model that incorporates the initial slackness of the flexible roof material. The computed flow-structure interaction yields stable solutions, the aerodynamically loaded flexible roof settling into static equilibrium. The effects of slackness and material properties on deformation and convergence are investigated using the coupled code. The three-dimensional problem is addressed by extending the two-dimensional structural solver to represent a surface by a matrix of line elements with constant tension along their length. This has been successfully coupled with the three-dimensional CFD flow-solution technique. Computed deformations show good agreement with the results of wind tunnel experiments for the well-prescribed geometry. In both two- and three-dimensional computations, the flow-structure interaction is found to yield a static deformation to within a 1% difference in the displacement variable after three iterations between the fluid and structural codes. The same computational methodology is applied to a real-car application using a third-party structural solver. The methodology is shown to be robust even under conditions beyond those likely to be encountered. The full methodology could be used as a design tool. The present work
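
    The iterative exchange between the fluid and structural codes can be sketched as a fixed-point iteration with under-relaxation. Everything below — the load and stiffness functions and their constants — is an illustrative scalar stand-in, not the thesis's CFD or line-element solvers:

    ```python
    def fluid_load(d):
        # Toy pressure load that relaxes as the roof bulges outward (illustrative only).
        return 100.0 / (1.0 + d)

    def structural_deflection(p, stiffness=50.0):
        # Toy linear structural response to the aerodynamic load.
        return p / stiffness

    def couple(relax=0.7, tol=1e-6, max_iter=100):
        """Alternate fluid and structural solves until the deflection is converged."""
        d = 0.0
        for i in range(1, max_iter + 1):
            d_new = structural_deflection(fluid_load(d))
            if abs(d_new - d) < tol * max(abs(d_new), 1.0):
                return d_new, i
            d = d + relax * (d_new - d)   # under-relaxation stabilizes the exchange
        return d, max_iter
    ```

    The under-relaxation factor plays the same stabilizing role that staged data exchange plays in the full two-code coupling.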

  16. Numerical simulation methodologies for design and development of Diffuser-Augmented Wind Turbines - analysis and comparison

    Science.gov (United States)

    Michał, Lipian; Maciej, Karczewski; Jakub, Molinski; Krzysztof, Jozwik

    2016-01-01

    Different numerical computation methods used to develop a methodology for fast, efficient, reliable design and comparison of Diffuser-Augmented Wind Turbine (DAWT) geometries are presented. The demand for such methods is evident, following the multitude of geometrical parameters that influence the flow character through ducted turbines. The results of the Actuator Disk Model (ADM) simulations will be confronted with a simulation method of higher order of accuracy, i.e. the 3D Fully-resolved Rotor Model (FRM) in the rotor design point. Both will be checked for consistency with the experimental results measured in the wind tunnel at the Institute of Turbo-machinery (IMP), Lodz University of Technology (TUL). An attempt to find an efficient method (with a compromise between accuracy and design time) for the flow analysis pertinent to the DAWT is a novel approach presented in this paper.

  17. Methodological Support to Develop Interoperable Applications for Pervasive Healthcare

    NARCIS (Netherlands)

    Cardoso de Moraes, J.L.

    2014-01-01

    The healthcare model currently being used in most countries will soon be inadequate, due to the increasing care costs of a growing population of elderly people, the rapid increase of chronic diseases, the growing demand for new treatments and technologies, and the relative decrease in the number of

  18. A Comparative Analysis of Two Software Development Methodologies: Rational Unified Process and Extreme Programming

    Directory of Open Access Journals (Sweden)

    Marcelo Rafael Borth

    2014-01-01

    Full Text Available Software development methodologies were created to meet the great market demand for innovation, productivity, quality and performance. With the use of a methodology, it is possible to reduce the cost, the risk and the development time, and even increase the quality of the final product. This article compares two of these development methodologies: the Rational Unified Process and Extreme Programming. The comparison shows the main differences and similarities between the two approaches, and highlights and comments on some of their predominant features.

  19. Nonlinear Time Domain Seismic Soil-Structure Interaction (SSI) Deep Soil Site Methodology Development

    International Nuclear Information System (INIS)

    Spears, Robert Edward; Coleman, Justin Leigh

    2015-01-01

    Currently the Department of Energy (DOE) and the nuclear industry perform seismic soil-structure interaction (SSI) analysis using equivalent linear numerical analysis tools. For lower levels of ground motion, these tools should produce reasonable in-structure response values for the evaluation of existing and new facilities. For larger levels of ground motion these tools likely overestimate the in-structure response (and therefore structural demand) since they do not consider geometric nonlinearities (such as gaping and sliding between the soil and structure) and are limited in their ability to model nonlinear soil behavior. The current equivalent linear SSI (SASSI) analysis approach either joins the soil and structure together in both tension and compression or releases the soil from the structure for both tension and compression. It also makes linear approximations for material nonlinearities and generalizes energy absorption with viscous damping. This produces the potential for inaccurately establishing where the structural concerns exist and/or inaccurately establishing the amplitude of the in-structure responses. Seismic hazard curves at nuclear facilities have continued to increase over the years as more information has been developed on seismic sources (i.e. faults), additional information has been gathered on seismic events, and additional research has been performed to determine local site effects. Seismic hazard curves are used to develop design basis earthquakes (DBEs) that are used to evaluate nuclear facility response. As the seismic hazard curves increase, the input ground motions (DBEs) used to numerically evaluate nuclear facility response increase, causing larger in-structure response. As ground motions increase, so does the importance of including nonlinear effects in numerical SSI models. To include material nonlinearity in the soil and geometric nonlinearity using contact (gaping and sliding), it is necessary to develop a nonlinear time domain methodology. This
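
    The gaping behaviour described above can be sketched as a compression-only soil spring: the soil resists when the foundation presses into it and carries no load when the structure lifts off. The stiffness value below is purely illustrative:

    ```python
    def soil_contact_force(gap, k=2.0e8):
        """Compression-only soil spring for a time-domain SSI model.
        gap < 0 means the foundation penetrates the soil (compression);
        gap >= 0 means lift-off, so the spring transmits no tension.
        k: illustrative soil stiffness [N/m]."""
        return -k * gap if gap < 0.0 else 0.0
    ```

    An equivalent linear analysis would instead use a single spring active in both tension and compression, which is exactly the approximation the nonlinear time-domain methodology removes.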

  20. Methodology for Evaluating the Rural Tourism Potentials: A Tool to Ensure Sustainable Development of Rural Settlements

    Directory of Open Access Journals (Sweden)

    Alexander Trukhachev

    2015-03-01

    Full Text Available The paper analyses the potentials, challenges and problems of rural tourism from the point of view of its impact on sustainable rural development. It explores alternative sources of income for rural people by means of tourism and investigates the effects of rural tourism on agricultural production in local rural communities. The aim is to identify the existing and potential tourist attractions within the rural areas of Southern Russia and to provide solutions to be introduced in particular rural settlements in order to make them attractive for tourists. The paper includes the elaboration and testing of a methodology for evaluating rural tourism potentials using the case of rural settlements of Stavropol Krai, Russia. The paper concludes with a ranking of the selected rural settlements according to their rural tourist capacity and a substantiation of the tourism models to be implemented to ensure sustainable development of the considered rural areas.

  1. Modeling of Throughput in Production Lines Using Response Surface Methodology and Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Federico Nuñez-Piña

    2018-01-01

    Full Text Available The problem of assigning buffers in a production line to obtain an optimum production rate is a combinatorial problem of type NP-Hard, known as the Buffer Allocation Problem. It is of great importance for designers of production systems due to the costs involved in terms of space requirements. In this work, the relationship among the number of buffer slots, the number of work stations, and the production rate is studied. Response surface methodology and an artificial neural network were used to develop predictive models to find optimal throughput values. 360 production rate values for different numbers of buffer slots and workstations were used to obtain a fourth-order mathematical model and a four-hidden-layer artificial neural network. Both models perform well in predicting the throughput, although the artificial neural network model shows a better fit (R = 1.0000) than the response surface methodology (R = 0.9996). Moreover, the artificial neural network produces better predictions for data not utilized in the construction of the models. Finally, this study can be used as a guide to forecast the maximum or near-maximum throughput of production lines taking into account the buffer size and the number of machines in the line.
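
    A one-factor quadratic response surface fitted by the normal equations illustrates the RSM side of the comparison. The throughput data below are hypothetical, not the study's 360 measured values:

    ```python
    def fit_quadratic(xs, ys):
        """Least-squares fit of y = b0 + b1*x + b2*x^2 (a one-factor
        response surface) via the normal equations."""
        n = 3
        # Build X^T X and X^T y for the design matrix [1, x, x^2].
        A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
        b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
        # Gaussian elimination with partial pivoting.
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            b[col], b[piv] = b[piv], b[col]
            for r in range(col + 1, n):
                f = A[r][col] / A[col][col]
                for c in range(col, n):
                    A[r][c] -= f * A[col][c]
                b[r] -= f * b[col]
        coef = [0.0] * n
        for r in range(n - 1, -1, -1):
            coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, n))) / A[r][r]
        return coef

    # Hypothetical data: throughput rises with buffer slots and saturates.
    buffers = [1, 2, 3, 4, 5, 6]
    rate = [0.62, 0.75, 0.83, 0.88, 0.91, 0.93]
    b0, b1, b2 = fit_quadratic(buffers, rate)
    ```

    The negative curvature term captures the saturation of throughput with buffer size; the paper's fourth-order model and ANN extend this same idea to two factors.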

  2. Advances in Artificial Neural Networks - Methodological Development and Application

    Science.gov (United States)

    Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other ne...

  3. Development of a standardised methodology for event impact ...

    African Journals Online (AJOL)

    The Western Cape Government (WCG) developed an Integrated Events Strategy for Cape Town and the Western Cape, supporting events to maximise brand building potential and triple bottom line benefits. WCG acknowledges that assessing the impacts of events in the province has become increasingly complex and ...

  4. Development of a methodology for accident causation research

    Science.gov (United States)

    1983-06-01

    The objective of this study was to fully develop and apply a methodology to study accident causation, which was outlined in a previous study. "Causal" factors are those pre-crash factors which are statistically related to the accident rate ...

  5. Prioritization Methodology for Development of Required Operational Capabilities

    Science.gov (United States)

    2010-04-01

    developed in the 1960s to meet the increasing requirements of human society and the environment. In 1980 F. Seo [18] suggested an MCDM method that... MCDM Tool for the Acquisition of Military Equipment, RTP-MP-SAS-080, pp. 17/1–17/10. [9] Keeney, R.L. (1976) A group preference axiomatization

  6. A novel methodology improves reservoir characterization models using geologic fuzzy variables

    Energy Technology Data Exchange (ETDEWEB)

    Soto B, Rodolfo [DIGITOIL, Maracaibo (Venezuela); Soto O, David A. [Texas A and M University, College Station, TX (United States)

    2004-07-01

    One of the research projects carried out in the Cusiana field to explain its rapid decline during the last years was aimed at obtaining better permeability models. The reservoir of this field has a complex layered system that is not easy to model using conventional methods. The new technique included the development of porosity and permeability maps from cored wells following the same trend of the sand depositions for each facies or layer, according to the sedimentary facies and depositional system models. Then, we used fuzzy logic to reproduce those maps in three dimensions as geologic fuzzy variables. After multivariate statistical and factor analyses, we found independence and a good correlation coefficient between the geologic fuzzy variables and core permeability and porosity. This means the geologic fuzzy variable could explain the fabric, the grain size and the pore geometry of the reservoir rock through the field. Finally, we developed a neural network permeability model using porosity, gamma ray and the geologic fuzzy variable as input variables. This model has a cross-correlation coefficient of 0.873 and an average absolute error of 33%, compared with the actual model's correlation coefficient of 0.511 and absolute error greater than 250%. We tested different methodologies, and this new one proved to be a promising way to obtain better permeability models. The use of the models has had a high impact on the explanation of well performance and workovers, and on reservoir simulation models. (author)
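
    A sketch of how a geologic property can be encoded as fuzzy variables: each rock attribute is mapped to degrees of membership in overlapping linguistic sets, and those membership values then feed a predictive model as inputs. The triangular membership functions and grain-size sets below are hypothetical, not the Cusiana field model:

    ```python
    def triangular(x, a, b, c):
        """Triangular fuzzy membership: 0 at a, peaks at 1.0 at b, 0 at c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Hypothetical fuzzy sets for grain size [mm] in a facies model.
    def grain_memberships(x):
        return {
            "fine":   triangular(x, 0.0, 0.1, 0.3),
            "medium": triangular(x, 0.1, 0.3, 0.6),
            "coarse": triangular(x, 0.3, 0.6, 1.0),
        }
    ```

    In the paper's workflow, such fuzzy variables join porosity and gamma ray as inputs to the neural network permeability model.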

  7. New methodologies for calculation of flight parameters on reduced scale wings models in wind tunnel =

    Science.gov (United States)

    Ben Mosbah, Abdallah

    In order to improve the quality of wind tunnel tests, and the tools used to perform aerodynamic tests on aircraft wings in the wind tunnel, new methodologies were developed and tested on rigid and flexible wing models. The flexible wing concept consists of replacing a portion (lower and/or upper) of the skin with another, flexible portion whose shape can be changed using an actuation system installed inside the wing. The main purpose of this concept is to improve the aerodynamic performance of the aircraft, and especially to reduce the fuel consumption of the airplane. Numerical and experimental analyses were conducted to develop and test the methodologies proposed in this thesis. To control the flow inside the test sections of the Price-Paidoussis wind tunnel of LARCASE, numerical and experimental analyses were performed. Computational fluid dynamics calculations have been made in order to obtain a database used to develop a new hybrid methodology for wind tunnel calibration. This approach allows controlling the flow in the test section of the Price-Paidoussis wind tunnel. For the fast determination of aerodynamic parameters, new hybrid methodologies were proposed. These methodologies were used to control flight parameters by calculating the drag, lift and pitching moment coefficients and the pressure distribution around an airfoil. These aerodynamic coefficients were calculated from known airflow conditions such as the angle of attack and the Mach and Reynolds numbers. In order to modify the shape of the wing skin, electric actuators were installed inside the wing to obtain the desired shape. These deformations provide optimal profiles according to different flight conditions in order to reduce the fuel consumption. A controller based on neural networks was implemented to obtain the desired actuator displacements.
    A metaheuristic algorithm was used in hybridization with neural networks and support vector machine approaches, and their
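
    As a stand-in for the fast determination of aerodynamic coefficients from known flow conditions, the classical thin-airfoil and parabolic drag-polar estimates can be written directly. The constants are illustrative textbook values, not outputs of the thesis's hybrid neural-network methodology:

    ```python
    import math

    def lift_coefficient(alpha_deg, cl0=0.0):
        """Thin-airfoil estimate Cl = Cl0 + 2*pi*alpha (alpha in radians)."""
        return cl0 + 2.0 * math.pi * math.radians(alpha_deg)

    def drag_coefficient(cl, cd0=0.02, k=0.045):
        """Parabolic drag polar Cd = Cd0 + k*Cl^2 (illustrative constants)."""
        return cd0 + k * cl * cl
    ```

    A surrogate model trained on wind-tunnel data replaces these closed forms when the geometry and flow regime make them inaccurate, which is the role the neural-network and support-vector approaches play in the thesis.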

  8. A Consistent Methodology Based Parameter Estimation for a Lactic Acid Bacteria Fermentation Model

    DEFF Research Database (Denmark)

    Spann, Robert; Roca, Christophe; Kold, David

    2017-01-01

    Lactic acid bacteria are used in many industrial applications, e.g. as starter cultures in the dairy industry or as probiotics, and research on their cell production is much needed. A first-principles kinetic model was developed to describe and understand the biological, physical, and chemical...... mechanisms in a lactic acid bacteria fermentation. We present here a consistent approach for a methodology-based parameter estimation for a lactic acid fermentation. In the beginning, just an initial knowledge-based guess of parameters was available and an initial parameter estimation of the complete set...
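
    A methodology-based parameter estimation can be sketched as least squares on a kinetic rate law. The Monod model, the brute-force grid search, and the synthetic data below are an illustrative stand-in for the paper's first-principles fermentation model, not its actual estimation procedure:

    ```python
    def monod(mu_max, ks, s):
        """Monod specific growth rate: mu = mu_max * s / (ks + s)."""
        return mu_max * s / (ks + s)

    def estimate(substrate, rates):
        """Brute-force least-squares estimation of (mu_max, ks) on a grid."""
        best, best_err = None, float("inf")
        for mu_max in [0.1 + 0.01 * i for i in range(100)]:
            for ks in [0.1 + 0.01 * j for j in range(100)]:
                err = sum((monod(mu_max, ks, s) - r) ** 2
                          for s, r in zip(substrate, rates))
                if err < best_err:
                    best, best_err = (mu_max, ks), err
        return best

    # Synthetic data generated with mu_max = 0.5 1/h and ks = 0.4 g/L.
    s_data = [0.1, 0.2, 0.5, 1.0, 2.0, 5.0]
    r_data = [monod(0.5, 0.4, s) for s in s_data]
    mu_max, ks = estimate(s_data, r_data)
    ```

    In practice the initial knowledge-based guess bounds the search region, and gradient-based optimizers replace the grid; the structure of the workflow is the same.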

  9. Development of a Graphical Tool to integrate the Prometheus AEOlus methodology and Jason Platform

    Directory of Open Access Journals (Sweden)

    Rafhael CUNHA

    2017-07-01

    Full Text Available Software Engineering (SE) is an area that intends to build high-quality software in a systematic way. However, traditional software engineering techniques and methods do not support the demand for developing Multiagent Systems (MAS). Therefore a new subarea has been studied, called Agent Oriented Software Engineering (AOSE). The AOSE area proposes solutions to issues related to the development of agent-oriented systems. There is still no standardization in this subarea, resulting in several methodologies. Another issue of this subarea is that there are very few tools that are able to automatically generate code. In this work we propose a tool to support the Prometheus AEOlus Methodology, because it provides modelling artifacts for all MAS dimensions: agents, environment, interaction, and organization. The tool supports all Prometheus AEOlus artifacts and can automatically generate code for the agent and interaction dimensions in the AgentSpeak Language, which is the language used in the Jason Platform. We have done some validations with the proposed tool, and a case study is presented.

  10. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  11. Additional methodology development for statistical evaluation of reactor safety analyses

    International Nuclear Information System (INIS)

    Marshall, J.A.; Shore, R.W.; Chay, S.C.; Mazumdar, M.

    1977-03-01

    The project described is motivated by the desire for methods to quantify uncertainties and to identify conservatisms in nuclear power plant safety analysis. The report examines statistical methods useful for assessing the probability distribution of output response from complex nuclear computer codes, considers sensitivity analysis and several other topics, and also sets the path for using the developed methods for realistic assessment of the design basis accident
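
    Assessing the probability distribution of output response from a complex code can be sketched by Monte Carlo propagation through a surrogate response function. The closed-form "code" and the input distributions below are purely illustrative, not drawn from the report:

    ```python
    import random
    import statistics

    def code_response(power, flow):
        """Surrogate for a complex safety code's output (e.g. a peak
        temperature); illustrative closed form only."""
        return 300.0 + 0.5 * power / flow

    def propagate(n=20000, seed=1):
        """Monte Carlo estimate of the output's mean and 95th percentile
        given uncertain inputs (normal power, uniform flow)."""
        rng = random.Random(seed)
        samples = [code_response(rng.gauss(3000.0, 60.0),
                                 rng.uniform(95.0, 105.0))
                   for _ in range(n)]
        return statistics.mean(samples), statistics.quantiles(samples, n=20)[-1]
    ```

    Comparing the 95th percentile of the propagated distribution against the deterministic, conservatively biased result is one simple way to quantify the conservatism the report discusses.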

  12. Combinations of options: Methodology for impact analysis. Development plan 1993

    International Nuclear Information System (INIS)

    1992-01-01

    The orientations favored by Hydro-Quebec in terms of electricity supply and demand are based on a few key selection criteria. These criteria, as described in its development plan, pertain to economic benefit for the utility and its customers, compatibility with sustainable development, minimization of costs to customers, preservation of the utility's financial health, generation of economic spinoffs, and ease of adaptation. Impacts are calculated to illustrate the selection criteria. The main methods, assumptions, and components used in evaluating the various impacts are described. The discounted overall cost for Hydro-Quebec and all of its customers, means of meeting electricity requirements, and the economic benefit for Hydro-Quebec of the various market development options are discussed. The indicators chosen for environmental impact assessment are set forth and the method used to calculate long-term supply costs is presented, along with the methods for calculating economic spinoffs. Finally, the concepts of energy mix and energy self-sufficiency are outlined. 1 tab

  13. SHARING ON WEB 3D MODELS OF ANCIENT THEATRES. A METHODOLOGICAL WORKFLOW

    Directory of Open Access Journals (Sweden)

    A. Scianna

    2016-06-01

    Full Text Available In the last few years, the need to share knowledge of Cultural Heritage (CH) on the Web through navigable 3D models has increased. This need requires the availability of Web-based virtual reality systems and 3D WebGIS. In order to make the information available to all stakeholders, these instruments should be powerful and at the same time very user-friendly. However, research and experiments carried out so far show that a standardized methodology does not exist. This is due both to the complexity and size of the geometric models to be published, on the one hand, and to the excessive costs of hardware and software tools, on the other. In light of this background, the paper describes a methodological approach for creating 3D models of CH, freely exportable on the Web, based on HTML5 and free and open source software. HTML5, supporting the WebGL standard, allows the exploration of 3D spatial models using the most widely used Web browsers, such as Chrome, Firefox, Safari and Internet Explorer. The methodological workflow described here has been tested in the construction of a multimedia geo-spatial platform developed for the three-dimensional exploration and documentation of the ancient theatres of Segesta and Carthage, and the surrounding landscapes. The experimental application has allowed us to explore the potential and limitations of sharing 3D CH models on the Web based on the WebGL standard. Sharing capabilities could be extended by defining suitable geospatial Web services based on the capabilities of HTML5 and WebGL technology.

  14. Development of a methodology for post closure radiological risk analysis of underground waste repositories. Illustrative assessment of the Harwell site

    International Nuclear Information System (INIS)

    Gralewski, Z.A.; Kane, P.; Nicholls, D.B.

    1987-06-01

    A probabilistic risk analysis (pra) is demonstrated for a number of ground water mediated release scenarios at the Harwell Site for a hypothetical repository at a depth of about 150 metres. This is the second stage of development of an overall risk assessment methodology. A procedure for carrying out multi-scenario assessment using available probabilistic risk assessment (pra) models is presented and a general methodology for combining risk contributions is outlined. Appropriate levels of model complexity in pra are discussed. Modelling requirements for the treatment of multiple simultaneous pathways and of site evolution are outlined. Further developments of pra systems are required to increase the realism of both the models and their mode of application, and hence to improve estimates of risk. (author)
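
    Combining risk contributions across mutually exclusive release scenarios reduces, in the simplest pra formulation, to summing probability-weighted consequences. The scenario probabilities and doses below are hypothetical, not values from the Harwell assessment:

    ```python
    def combined_risk(scenarios):
        """Combine per-scenario risk contributions as the sum of
        (annual probability x radiological consequence), the usual pra
        aggregation when the scenarios are mutually exclusive."""
        return sum(p * c for p, c in scenarios)

    # Hypothetical groundwater release scenarios: (probability per year, dose [Sv]).
    scenarios = [(1e-4, 1e-3), (1e-5, 5e-2), (1e-6, 1.0)]
    risk = combined_risk(scenarios)
    ```

    Multi-scenario assessment then amounts to keeping this sum consistent as scenarios are added, split, or refined with more detailed pathway models.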

  15. Methodology Development for Assessment of Spaceport Technology Returns and Risks

    Science.gov (United States)

    Joglekar, Prafulla; Zapata, Edgar

    2001-01-01

    As part of Kennedy Space Center's (KSC's) challenge to open the space frontier, new spaceport technologies must be developed, matured and successfully transitioned to operational systems. R&D investment decisions can be considered from multiple perspectives. Near-, mid- and far-term technology horizons must be understood. Because a multitude of technology investment opportunities are available, we must identify choices that promise the greatest likelihood of significant lifecycle returns. At the same time, the costs and risks of any choice must be well understood and balanced against its potential returns. The problem is not one of simply rank-ordering projects in terms of their desirability. KSC wants to determine a portfolio of projects that simultaneously satisfies multiple goals, such as getting the biggest bang for the buck, supporting projects that may be too risky for private funding, staying within annual budget cycles without foregoing the requirements of a long-term technology vision, and ensuring the development of a diversity of technologies that support the variety of operational functions involved in space transportation. This work aims to assist in the development of methods and techniques that support strategic technology investment decisions and ease the process of determining an optimal portfolio of spaceport R&D investments. Available literature on risks and returns to R&D is reviewed and the most useful pieces are brought to the attention of the Spaceport Technology Development Office (STDO). KSC's current project management procedures are reviewed. It is found that the "one size fits all" nature of KSC's existing procedures and project selection criteria is not conducive to prudent decision-making. Directions for improving KSC's procedures and criteria are outlined. With the help of a contractor, STDO is currently developing a tool, named Change Management Analysis Tool (CMAT)/Portfolio Analysis Tool (PAT), to assist KSC's R&D portfolio determination. A
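
    Portfolio determination under a budget constraint can be sketched as an exhaustive search over project subsets. The project names, costs and expected returns are hypothetical, and this sketch is not the CMAT/PAT tool:

    ```python
    from itertools import combinations

    def best_portfolio(projects, budget):
        """Exhaustive search for the project subset that maximizes expected
        return within budget -- a toy stand-in for portfolio analysis.
        projects: list of (name, cost, expected_return)."""
        best, best_value = (), 0.0
        for r in range(1, len(projects) + 1):
            for combo in combinations(projects, r):
                cost = sum(p[1] for p in combo)
                value = sum(p[2] for p in combo)
                if cost <= budget and value > best_value:
                    best, best_value = combo, value
        return [p[0] for p in best], best_value

    # Hypothetical spaceport R&D candidates: (name, cost $M, expected return $M).
    projects = [("cryo-loading", 4, 10), ("health-mgmt", 3, 7),
                ("robotics", 5, 9), ("weather", 2, 4)]
    names, value = best_portfolio(projects, 9)
    ```

    A real portfolio tool adds the other goals from the text — risk tolerance, multi-year budgets, technology diversity — as additional constraints or objectives rather than a single return figure.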

  16. Modeling and Analysis of The Pressure Die Casting Using Response Surface Methodology

    International Nuclear Information System (INIS)

    Kittur, Jayant K.; Herwadkar, T. V.; Parappagoudar, M. B.

    2010-01-01

    Pressure die casting is successfully used in the manufacture of aluminum alloy components for the automobile and many other industries. Die casting is a process involving many process parameters having a complex relationship with the quality of the cast product. Though various process parameters influence the quality of the die-cast component, the major influence is seen from the die casting machine parameters and their proper settings. In the present work, non-linear regression models have been developed for making predictions and analyzing the effect of die casting machine parameters on the performance characteristics of the die casting process. Design of Experiments (DOE) with Response Surface Methodology (RSM) has been used to analyze the effect of input parameters and their interactions on the response, and further used to develop non-linear input-output relationships. Die casting machine parameters, namely fast shot velocity, slow shot to fast shot change-over point, intensification pressure and holding time, have been considered as the input variables. The quality characteristics of the cast product were determined by porosity, hardness and surface roughness (outputs/responses). Design of experiments has been used to plan the experiments and analyze the impact of variables on the quality of casting. On the other hand, Response Surface Methodology (Central Composite Design) is utilized to develop non-linear input-output relationships (regression models). The developed regression models have been tested for their statistical adequacy through an ANOVA test. The practical usefulness of these models has been tested with some test cases. These models can be used to make predictions about different quality characteristics, for a known set of die casting machine parameters, without conducting the experiments.
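
    Generating the runs of a Central Composite Design in coded units can be sketched as follows. For the four machine parameters here, k = 4 and alpha = 2 gives a rotatable design, since the rotatability condition is alpha = (2^k)^(1/4); the number of center runs is a design choice:

    ```python
    from itertools import product

    def central_composite(k, alpha=2.0, center_runs=1):
        """Central composite design in coded units for k factors:
        2^k factorial corners, 2k axial (star) points at +/-alpha,
        plus replicated center runs."""
        corners = [list(p) for p in product([-1.0, 1.0], repeat=k)]
        axial = []
        for i in range(k):
            for s in (-alpha, alpha):
                pt = [0.0] * k
                pt[i] = s
                axial.append(pt)
        centers = [[0.0] * k for _ in range(center_runs)]
        return corners + axial + centers

    design = central_composite(4)   # 16 corner + 8 axial + 1 center = 25 runs
    ```

    Each coded run is then mapped to physical settings (fast shot velocity, change-over point, intensification pressure, holding time) before the castings are produced and measured.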

  17. Development cooperation as methodology for teaching social responsibility to engineers

    Science.gov (United States)

    Lappalainen, Pia

    2011-12-01

    The role of engineering in promoting global well-being has become accentuated, turning the engineering curriculum into a means of distributing well-being equally. The gradually strengthening calls for humanitarian engineering have resulted in the incorporation of social responsibility themes in the university curriculum. Cooperation, communication, teamwork, intercultural cooperation, sustainability, and social and global responsibility represent the socio-cultural dimensions that are becoming increasingly important as globalisation intensifies the demands for socially and globally adept engineering communities. This article describes an experiment, the Development Cooperation Project, which was conducted at Aalto University in Finland to integrate social responsibility themes into higher engineering education.

  18. Applying axiomatic design methodology in developing modified liberation products

    Directory of Open Access Journals (Sweden)

    Bibiana Margarita Vallejo Díaz

    2004-09-01

    Full Text Available Some conceptual elements of the axiomatic design method were applied to a specific case study regarding the development of a modified liberation compressed product (CLM-UN), for use in the agricultural sector as a pH regulating agent in soil. The study was oriented towards defining the functional requirements, design parameters and process variables for manufacturing the product. The independence and information axioms were evaluated, supporting axiomatic design as an alternative for integral product and process design (as a rational and systemic exercise), facilitating the production of products having the quality which future users expect from them.
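
    The independence axiom can be checked mechanically from the design matrix relating functional requirements (FRs) to design parameters (DPs). A minimal sketch, using the standard classification into uncoupled, decoupled and coupled designs:

    ```python
    def coupling(design_matrix):
        """Classify a square axiomatic-design matrix (rows: FRs, cols: DPs):
        'uncoupled' if diagonal, 'decoupled' if triangular, else 'coupled'.
        A non-zero entry means that DP affects that FR."""
        n = len(design_matrix)
        off = [(i, j) for i in range(n) for j in range(n)
               if i != j and design_matrix[i][j] != 0]
        if not off:
            return "uncoupled"
        if all(i > j for i, j in off) or all(i < j for i, j in off):
            return "decoupled"
        return "coupled"
    ```

    An uncoupled design satisfies the independence axiom outright; a decoupled design satisfies it provided the DPs are fixed in the triangular order; a coupled design violates it and calls for a redesign.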

  19. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin Leigh [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  20. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    International Nuclear Information System (INIS)

    Coleman, Justin Leigh

    2016-01-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  1. An architecture and methodology for the design and development of Technical Information Systems

    NARCIS (Netherlands)

    Capobianchi, R.; Mautref, M.; van Keulen, Maurice; Balsters, H.

    In order to meet demands in the context of Technical Information Systems (TIS) pertaining to reliability, extensibility, maintainability, etc., we have developed an architectural framework with accompanying methodological guidelines for designing such systems. With the framework, we aim at complex

  2. Contracting Selection for the Development of the Range Rule Risk Methodology

    National Research Council Canada - National Science Library

    1997-01-01

    ...-Effectiveness Risk Tool and contractor selection for the development of the Range Rule Risk Methodology. The audit objective was to determine whether the Government appropriately used the Ordnance and Explosives Cost-Effectiveness Risk Tool...

  3. Methodology development to support NPR strategic planning. Final report

    International Nuclear Information System (INIS)

    1996-01-01

    This report covers the work performed in support of the Office of New Production Reactors during the nine-month period from January through September 1990. Because of the rapid pace of program activities during this period, the emphasis of the work shifted from strategic planning toward supporting initiatives requiring more immediate consideration and response. Consequently, the work performed concentrated on researching and helping identify and resolve the issues considered to be of most immediate concern. Even though they are strongly interrelated, they can be separated into two broad categories. The first category encompasses program-internal concerns. Included are issues associated with the current demand for accelerating staff growth, satisfying the immediate need for appropriate skill and experience levels, team-building efforts necessary to assure the development of an effective operating organization, the ability of people and organizations to satisfactorily understand and execute their assigned roles and responsibilities, and the general facilitation of inter/intra-organization communications and working relationships. The second category encompasses program-execution concerns. These include the efforts required to develop realistic execution plans and to implement appropriate control mechanisms that provide for effective forecasting, planning, managing, and controlling of ongoing (or soon to be) program substantive activities according to the master integrated schedule and budget

  4. Summary of FY-1978 consultant input for scenario methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Scott, B.L.; Benson, G.L.; Craig, R.A. (eds.); Harwell, M.A.

    1979-11-01

    Associated with commercial nuclear power production in the United States is the generation of potentially hazardous radioactive waste products. The Department of Energy (DOE), through the National Waste Terminal Storage (NWTS) Program, is seeking to develop nuclear waste isolation systems in geologic formations. These underground waste isolation systems will preclude contact with the biosphere of waste radionuclides in concentrations sufficient to cause deleterious impact on humans or their environments. Comprehensive analyses of specific isolation systems are needed to assess the postclosure expectations of the systems. The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) program has been established to develop the capability of making those analyses. The assessment of repository post-closure safety has two basic components: identification and analysis of breach scenarios and the pattern of events and processes causing each breach, and identification and analysis of the environmental consequences of radionuclide transport and interactions subsequent to a repository breach. Specific processes and events which might affect potential repository sites, and the rates and probabilities of those phenomena, are presented. Descriptions of the system interactions and synergisms, and of the repository system as an evolving and continuing process, are included. Much of the preliminary information derived from the FY-1978 research effort is summarized in this document. This summary report contains information pertaining to the following areas of study: climatology, geomorphology, glaciology, hydrology, meteorites, sea level fluctuations, structural geology and volcanology.

  5. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed: a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for use with systems composed of reconfigurable underwater modules. The purpose of the model is to enable design of control strategies for cooperative reconfigurable underwater systems.
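Free-decay tests like those mentioned above are typically reduced to frequency and damping estimates via the logarithmic decrement. A minimal sketch under stated assumptions (the synthetic signal and the naive peak-picking are illustrative, not the authors' procedure):

```python
import numpy as np

def log_decrement_damping(t, x):
    """Estimate natural frequency and damping ratio from a free-decay
    record of an underdamped oscillation (t: time stamps, x: displacement)."""
    # locate successive positive peaks (naive interior-maximum test)
    peaks = [i for i in range(1, len(x) - 1) if x[i] > x[i-1] and x[i] > x[i+1]]
    delta = np.mean(np.log(x[peaks][:-1] / x[peaks][1:]))   # log decrement
    zeta = delta / np.sqrt(4 * np.pi**2 + delta**2)         # damping ratio
    Td = np.mean(np.diff(t[peaks]))                         # damped period
    wn = 2 * np.pi / (Td * np.sqrt(1 - zeta**2))            # natural frequency
    return wn, zeta

# synthetic decay with wn = 2 rad/s and zeta = 0.1
t = np.linspace(0, 30, 5000)
x = np.exp(-0.1 * 2 * t) * np.cos(2 * np.sqrt(1 - 0.1**2) * t)
wn, zeta = log_decrement_damping(t, x)
```

The ratio of successive peak amplitudes depends only on the damping, so the method needs no knowledge of the excitation that started the decay.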

  6. A methodology to promote business development from research outcomes in food science and technology

    Directory of Open Access Journals (Sweden)

    Eduardo L. Cardoso

    2015-04-01

    Full Text Available Valorization of knowledge produced in research units has been a major challenge for research universities in contemporary societies. The prevailing forces have led these institutions to develop a “third mission”: the facilitation of technology transfer and activity in an entrepreneurial paradigm. Effective management of the challenges encountered in the development of academic entrepreneurship, and the associated valorization of knowledge produced by universities, are major factors in bridging the gap between research and innovation in Europe. The need to improve existing institutional knowledge valorization processes concerning entrepreneurship and business development, and the processes required, is discussed. A case study was designed to describe the institutional knowledge valorization process in a food science and technology research unit and a related incubator during a five-year evaluation period that ended in 2012. The knowledge valorization processes benefited from the adoption of a structured framework methodology that led ideas and teams from business model generation to client development, in parallel, when possible, with agile product/service development. Although academic entrepreneurship engagement could be improved, this case study demonstrated that stronger skills development was needed to enable researchers to be more aware of business development fundamentals and thereby contribute to research decisions and the valorization of individual and institutional knowledge assets. It was noted that the timing for involvement of companies in research projects or programs varied with the nature of the research.

  7. External Events Analysis for LWRS/RISMC Project: Methodology Development and Early Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Parisi, Carlo [Idaho National Laboratory; Prescott, Steven Ralph [Idaho National Laboratory; Yorg, Richard Alan [Idaho National Laboratory; Coleman, Justin Leigh [Idaho National Laboratory; Szilard, Ronaldo Henriques [Idaho National Laboratory

    2016-02-01

    The ultimate scope of Industrial Application #2 (IA) of the LWRS/RISMC project is a realistic simulation of natural external hazards that pose a threat to an NPP. This scope requires the development of a methodology and of a qualified set of tools able to perform advanced risk-informed safety analysis. In particular, the methodology should be able to combine results from seismic, flooding and thermal-hydraulic (TH) deterministic calculations with dynamic PRA. This summary presents the key points of the methodology being developed and its very first sample application to a simple problem (a spent fuel pool).

  8. Development of a methodology for maintenance optimization at Kozloduy NPP

    International Nuclear Information System (INIS)

    Kitchev, E.

    1997-01-01

    The paper presents the overview of a project for development of an applicable strategy and methods for Kozloduy NPP (KNPP) to optimize its maintenance program in order to meet the current risk based maintenance requirements. The strategy in a format of Integrated Maintenance Program (IMP) manual will define the targets of the optimization process, the major stages and elements of this process and their relationships. IMP embodies the aspects of the US NRC Maintenance Rule compliance and facilitates the integration of KNPP programs and processes which impact the plant maintenance and safety. The methods in a format of IMP Instructions (IM-PI) will define how the different IMP stages can be implemented and the IMP targets can be achieved at KNPP environment. (author). 8 refs

  9. A methodology for the development of software agent based interoperable telemedicine systems: a tele-electrocardiography perspective.

    Science.gov (United States)

    Ganguly, P; Ray, P

    2000-01-01

    Telemedicine involves the integration of information, human-machine, and healthcare technologies. Because different modalities of patient care require applications running on heterogeneous computing environments, software interoperability is a major issue in telemedicine. Software agent technology provides a range of promising techniques to solve this problem. This article discusses the development of a methodology for the design of interoperable telemedicine systems (illustrated with a tele-electrocardiography application). Software interoperability between different applications can be modeled at different levels of abstraction such as physical interoperability, data-type interoperability, specification-level interoperability, and semantic interoperability. Software agents address the issue of software interoperability at the semantic level. A popular object-oriented software development methodology, the Unified Modeling Language (UML), has been used for this development. This research has demonstrated the feasibility of developing agent-based interoperable telemedicine systems. More research is needed before widespread deployment of such systems can take place.

  10. Research Activities on Development of Piping Design Methodology of High Temperature Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Nam-Su [Seoul National Univ. of Science and Technology, Seoul (Korea, Republic of); Won, Min-Gu [Sungkyunkwan Univ., Suwon (Korea, Republic of); Oh, Young-Jin [KEPCO Engineering and Construction Co. Inc., Gimcheon (Korea, Republic of); Lee, Hyeog-Yeon; Kim, Yoo-Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    An SFR is operated at high temperature and low pressure compared with commercial pressurized water reactors (PWRs), and such operating conditions lead to time-dependent damage mechanisms such as creep rupture, excessive creep deformation, creep-fatigue interaction and creep crack growth. Thus, high-temperature design and structural integrity assessment methodologies should be developed considering such failure mechanisms. For the design of SFR mechanical components, the ASME B&PV Code, Sec. III, Div. 5 and RCC-MRx provide high-temperature design and assessment procedures for nuclear structural components operated at high temperature, and a Leak-Before-Break (LBB) assessment procedure for high-temperature piping is also provided in RCC-MRx, A16. Three web-based evaluation programs based on the current high-temperature codes were developed for structural components of high-temperature reactors. Moreover, for detailed LBB analyses of high-temperature piping, new engineering methods for predicting the creep C*-integral and the creep COD rate, based either on GE/EPRI or on reference stress concepts, were proposed. Finally, numerical methods based on Garofalo's model and RCC-MRx have been developed and implemented into ABAQUS. The predictions based on both models were compared with experimental results, and it was revealed that Garofalo's model described the deformation behavior of Gr. 91 at elevated temperatures reasonably well.

  11. Development of a real-time transport performance optimization methodology

    Science.gov (United States)

    Gilyard, Glenn

    1996-01-01

    The practical application of real-time performance optimization is addressed (using a wide-body transport simulation) based on real-time measurements and calculation of incremental drag from forced-response maneuvers. Various controller combinations can be envisioned, although this study used symmetric outboard aileron and stabilizer. The approach is based on navigation instrumentation and other measurements found on state-of-the-art transports. This information is used to calculate winds and angle of attack. Thrust is estimated from a representative engine model as a function of measured variables. The lift and drag equations are then used to calculate lift and drag coefficients. An expression for the drag coefficient, which is a function of parasite drag, induced drag, and aileron drag, is solved from forced-excitation response data. Estimates of the parasite drag, the curvature of the aileron drag variation, and the minimum-drag aileron position are produced. Minimum drag is then obtained by repositioning the symmetric aileron. Simulation results are also presented which evaluate the effects of measurement bias and resolution.
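The identification step described above, estimating the curvature of the aileron drag variation and the minimum-drag aileron position, can be sketched as a quadratic least-squares fit followed by solving for the vertex. All numbers below are synthetic illustrations, not the simulation's values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic forced-excitation data: drag coefficient sampled while the
# symmetric aileron sweeps through a range of deflections (deg).
delta = np.linspace(-4, 6, 40)
cd_true = 0.0220 + 2.5e-4 * (delta - 1.5) ** 2       # assumed quadratic drag bucket
cd_meas = cd_true + rng.normal(0, 2e-5, delta.size)  # measurement noise

# Least-squares fit CD = c0 + c1*delta + c2*delta^2
c2, c1, c0 = np.polyfit(delta, cd_meas, 2)

delta_min = -c1 / (2 * c2)   # minimum-drag aileron position (vertex of the parabola)
cd_min = c0 + c1 * delta_min + c2 * delta_min ** 2
```

Repositioning the aileron to `delta_min` then realizes the estimated minimum drag, which is the closed-loop action the paper describes.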

  12. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    Science.gov (United States)

    Ferreira Fonseca, T. C.; Bogaerts, R.; Hunt, John; Vanhavere, F.

    2014-11-01

    A Whole Body Counter (WBC) is a facility used to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done using anthropomorphic physical phantoms representing the human body. Because constructing representative physical phantoms is challenging, virtual calibration has been introduced: the use of computational phantoms and the Monte Carlo method to simulate radiation transport has been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations, which in turn helps to study the major source of uncertainty associated with the in vivo measurement routine: the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. The open-source MakeHuman and Blender software packages were used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces, and in-house software was developed to convert the binary 3D voxel grid into an MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms, called MaMP and FeMP (Male and Female Mesh Phantoms), to create sets of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and the AGM laboratory of SCK-CEN in Mol, Belgium.

  13. A methodology to develop computational phantoms with adjustable posture for WBC calibration.

    Science.gov (United States)

    Fonseca, T C Ferreira; Bogaerts, R; Hunt, John; Vanhavere, F

    2014-11-21

    A Whole Body Counter (WBC) is a facility to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done by using anthropomorphic physical phantoms representing the human body. Due to such a challenge of constructing representative physical phantoms a virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport have been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty associated with the in vivo measurement routine which is the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. Open source codes such as MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. Also, a home-made software was developed whose goal is to convert the binary 3D voxel grid into a MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms called MaMP and FeMP (Male and Female Mesh Phantoms) to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and AGM laboratory of SCK-CEN in Mol, Belgium.

  14. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    International Nuclear Information System (INIS)

    Fonseca, T C Ferreira; Vanhavere, F; Bogaerts, R; Hunt, John

    2014-01-01

    A Whole Body Counter (WBC) is a facility to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done by using anthropomorphic physical phantoms representing the human body. Due to such a challenge of constructing representative physical phantoms a virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport have been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty associated with the in vivo measurement routine which is the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. Open source codes such as MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. Also, a home-made software was developed whose goal is to convert the binary 3D voxel grid into a MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms called MaMP and FeMP (Male and Female Mesh Phantoms) to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and AGM laboratory of SCK-CEN in Mol, Belgium. (paper)
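The voxel-grid-to-input-file conversion these records mention can be sketched as a flattening pass with run-length compression, in the spirit of MCNP-style `nR` repeat notation. This is a hedged sketch: the `rle_fill` helper is invented for illustration, and the exact lattice `FILL` card syntax should be checked against the MCNPX manual before use.

```python
import numpy as np

def rle_fill(grid):
    """Flatten a 3D voxel grid (x varying fastest, then y, then z) and
    run-length encode it using 'nR' repeat notation (n extra repeats of
    the previous entry). Sketch only; verify against the MCNPX manual."""
    flat = grid.transpose(2, 1, 0).ravel()  # z-major order, x fastest
    out, i = [], 0
    while i < len(flat):
        j = i
        while j + 1 < len(flat) and flat[j + 1] == flat[i]:
            j += 1
        out.append(str(flat[i]) if j == i else f"{flat[i]} {j - i}R")
        i = j + 1
    return " ".join(out)

# toy example: a 2-voxel-thick slab of material 1 inside air (0)
grid = np.zeros((4, 4, 4), dtype=int)
grid[:, :, 1:3] = 1
fill_card = rle_fill(grid)
```

Run-length compression matters in practice because a whole-body voxel phantom can contain millions of voxels, most of them long runs of the same material.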

  15. A methodology to support multidisciplinary model-based water management

    NARCIS (Netherlands)

    Scholten, H.; Kassahun, A.; Refsgaard, J.C.; Kargas, Th.; Gavardinas, C.; Beulens, A.J.M.

    2007-01-01

    Quality assurance in model based water management is needed because of some frequently perceived shortcomings, e.g. a lack of mutual understanding between modelling team members, malpractice and a tendency of modellers to oversell model capabilities. Initiatives to support quality assurance focus on

  16. Development of IRMA reagent and methodology for PSA

    International Nuclear Information System (INIS)

    Najafi, R.

    1997-01-01

    The PSA test is a solid-phase two-site immunoassay. Rabbit anti-PSA is coated or bound on the surface of the solid phase, and monoclonal anti-PSA is labeled with I-125. The PSA molecules present in the standard solution or serum are 'sandwiched' between the two antibodies. After formation of the coated antibody-antigen-labeled antibody complex, the unbound labeled antibody is removed by washing. The complex is measured with a gamma counter. The concentration of analyte is proportional to the counts of the test sample. To develop kits for IRMA PSA, three essential reagents must be prepared: the antibody-coated solid phase, the labeled antibody, and the standards; these are then optimized to obtain a standard curve fit to measure specimen PSA in the desired concentration range. The type of solid phase, and the procedure(s) to coat or bind it to antibody, is still the main subject of debate in the development and setting up of RIA/IRMA kits. In our experiments, polystyrene beads, because they are easy to coat with antibody as well as easy to use, can be considered a suitable solid phase. Most antibodies are passively adsorbed to a plastic surface (e.g. polystyrene, polypropylene, or polyvinyl chloride) from a diluted buffer. The antibody-coated plastic surface then acts as the solid-phase reagent. Poor efficiency, the time required to reach equilibrium, and a lack of reproducibility, especially batch-to-batch variation between materials, are disadvantages of this simple coating procedure. Improvements can be made by coating second antibody on the surface of the beads, followed by reaction between the second and primary antibodies. It is also possible to further enhance the coating efficiency of the beads by using Staphylococcus aureus Protein A. Protein A is a major component of the Staphylococcus aureus cell wall which has an affinity for the Fc segment of immunoglobulin G (IgG) of some species, including human, rabbit, and mouse.
    This property of Staphylococcal Protein A has made it a very useful tool in the purification of classes and subclasses

  17. State-space models for bio-loggers: A methodological road map

    DEFF Research Database (Denmark)

    Jonsen, I.D.; Basson, M.; Bestley, S.

    2012-01-01

    Ecologists have an unprecedented array of bio-logging technologies available to conduct in situ studies of horizontal and vertical movement patterns of marine animals. These tracking data provide key information about foraging, migratory, and other behaviours that can be linked with bio… development of state-space modelling approaches for animal movement data provides statistical rigor for inferring hidden behavioural states, relating these states to bio-physical data, and ultimately for predicting the potential impacts of climate change. Despite the widespread utility, and current popularity, of state-space models for analysis of animal tracking data, these tools are not simple and require considerable care in their use. Here we develop a methodological “road map” for ecologists by reviewing currently available state-space implementations. We discuss appropriate use of state-space methods…
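As an illustration of the simplest linear-Gaussian end of this model family (a sketch, not a reconstruction of any implementation in the review), a one-dimensional random-walk state-space model for a noisy animal track can be filtered with a few lines of Kalman recursions:

```python
import numpy as np

def kalman_filter(y, q=0.1, r=1.0):
    """1-D random-walk state-space model:
       x_t = x_{t-1} + w_t,  w ~ N(0, q)   (process / movement model)
       y_t = x_t + v_t,      v ~ N(0, r)   (observation model)
    Returns the filtered state means."""
    xhat, P = y[0], r
    out = [xhat]
    for obs in y[1:]:
        P = P + q                        # predict: uncertainty grows
        K = P / (P + r)                  # Kalman gain
        xhat = xhat + K * (obs - xhat)   # update with the new fix
        P = (1 - K) * P
        out.append(xhat)
    return np.array(out)

rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(0, 0.3, 200))   # true (hidden) position
obs = truth + rng.normal(0, 1.0, 200)        # noisy tag locations
est = kalman_filter(obs, q=0.09, r=1.0)
```

Real animal-movement applications replace this scalar model with 2-D positions, behavioural switching, and non-Gaussian errors, which is where the care the authors call for becomes essential.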

  18. The development of evaluation methodology for advanced interactive communication

    International Nuclear Information System (INIS)

    Okamoto, K.

    2005-01-01

    Face-to-face communication is one of the essential styles of communication. Through face-to-face communication, people exchange a great deal of information at a time, both verbal and non-verbal, which makes it highly effective for learning about each other. The authors focused on face-to-face communication and developed an evaluation method to quantify its effectiveness. We regard conversation as an exchange of keywords: the effectiveness of a conversation is evaluated by the number of keywords exchanged and the degree of mutual understanding achieved. For two people's face-to-face communication, the authors quantified the shared information by measuring the change in the amount of the participants' knowledge, where a participant's knowledge is counted as the number of words they can give. We measured the change in their shared knowledge (the number of words they gave associated with the theme). We also quantified the discrepancies in their understanding of their partners by measuring the difference between the knowledge that they think they share and the knowledge that they really share. From these data, we evaluated the effectiveness of communication and analyzed trends in mutual understanding. (authors)
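The keyword-counting metric described above can be illustrated with simple set arithmetic; the keyword sets below are invented for illustration, not taken from the study:

```python
# Hypothetical keyword sets elicited from participants A and B
# before and after a conversation on a shared theme.
a_before = {"reactor", "coolant", "turbine"}
b_before = {"reactor", "fuel", "containment"}
a_after = a_before | {"fuel", "containment"}   # words A can give afterwards
b_after = b_before | {"coolant"}               # words B can give afterwards

# Growth in actually shared knowledge over the conversation
shared_gain = len(a_after & b_after) - len(a_before & b_before)

# Discord: what A believes is shared versus what is actually shared
a_thinks_shared = {"reactor", "fuel", "turbine"}
actual_shared = a_after & b_after
discord = len(a_thinks_shared - actual_shared)
```

The gap between believed and actual shared knowledge is the "discord" quantity the abstract uses to characterize misunderstanding.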

  19. Genetic Algorithm-Based Optimization Methodology of Bézier Curves to Generate a DCI Microscale-Model

    Directory of Open Access Journals (Sweden)

    Jesus A. Basurto-Hurtado

    2017-11-01

    Full Text Available The aim of this article is to develop a methodology capable of generating micro-scale models of ductile cast irons that preserve the smoothness of the graphite nodule contours, which is otherwise lost to discretization errors when the contours are extracted using image processing. The proposed methodology uses image processing to extract the graphite nodule contours and a genetic algorithm-based optimization strategy to select the optimal degree of the Bézier curve that best approximates each graphite nodule contour. To validate the proposed methodology, a Finite Element Analysis (FEA) was carried out using models obtained through three methods: (a) using a fixed Bézier degree for all of the graphite nodule contours, (b) the present methodology, and (c) using commercial software. The results were compared using the relative error of the equivalent stresses computed by the FEA, with the proposed methodology's results used as the reference. The present paper does not aim to define which models are correct and which are not. However, it has been shown that the errors generated in the discretization process should not be ignored when developing geometric models, since they can produce relative errors of up to 35.9% in estimates of the mechanical behavior.
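A toy version of the degree-selection idea can be sketched as a least-squares Bézier fit wrapped in a small evolutionary search. The fitness weighting `w`, the GA operators, and the synthetic contour are all assumptions for illustration, not the authors' implementation:

```python
import numpy as np
from math import comb

def bezier_fit_error(pts, n):
    """Least-squares fit of a degree-n Bezier curve to 2-D contour points;
    returns the RMS residual of the fit."""
    t = np.linspace(0, 1, len(pts))
    B = np.array([[comb(n, i) * s**i * (1 - s)**(n - i) for i in range(n + 1)]
                  for s in t])                    # Bernstein basis matrix
    ctrl, *_ = np.linalg.lstsq(B, pts, rcond=None)
    return np.sqrt(np.mean((B @ ctrl - pts) ** 2))

def ga_best_degree(pts, pop=8, gens=20, dmin=2, dmax=15, w=1e-3, seed=0):
    """Toy genetic search for the Bezier degree minimising
    fitting error + w * degree (a simple complexity penalty)."""
    rng = np.random.default_rng(seed)
    degs = rng.integers(dmin, dmax + 1, pop)
    fit = lambda n: bezier_fit_error(pts, n) + w * n
    for _ in range(gens):
        scores = np.array([fit(n) for n in degs])
        parents = degs[np.argsort(scores)][: pop // 2]          # selection
        children = np.clip(parents + rng.integers(-1, 2, parents.size),
                           dmin, dmax)                          # +/-1 mutation
        degs = np.concatenate([parents, children])
    return int(min(degs, key=fit))

# noisy elliptical "nodule" contour stand-in
theta = np.linspace(0, 2 * np.pi, 80)
pts = np.c_[3 * np.cos(theta), 2 * np.sin(theta)]
best = ga_best_degree(pts)
```

The complexity penalty is what keeps the search from simply returning the maximum degree, mirroring the trade-off between contour fidelity and model size that motivates the paper.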

  20. Pediatric hospital medicine core competencies: development and methodology.

    Science.gov (United States)

    Stucky, Erin R; Ottolini, Mary C; Maniscalco, Jennifer

    2010-01-01

    Pediatric hospital medicine is the most rapidly growing site-based pediatric specialty. There are over 2500 unique members in the three core societies in which pediatric hospitalists are members: the American Academy of Pediatrics (AAP), the Academic Pediatric Association (APA) and the Society of Hospital Medicine (SHM). Pediatric hospitalists are fulfilling both clinical and system improvement roles within varied hospital systems. Defined expectations and competencies for pediatric hospitalists are needed. In 2005, SHM's Pediatric Core Curriculum Task Force initiated the project and formed the editorial board. Over the subsequent four years, multiple pediatric hospitalists belonging to the AAP, APA, or SHM contributed to the content of and guided the development of the project. Editors and collaborators created a framework for identifying appropriate competency content areas. Content experts from both within and outside of pediatric hospital medicine participated as contributors. A number of selected national organizations and societies provided valuable feedback on chapters. The final product was validated by formal review from the AAP, APA, and SHM. The Pediatric Hospital Medicine Core Competencies were created. They include 54 chapters divided into four sections: Common Clinical Diagnoses and Conditions, Core Skills, Specialized Clinical Services, and Healthcare Systems: Supporting and Advancing Child Health. Each chapter can be used independently of the others. Chapters follow the knowledge, skills, and attitudes educational curriculum format, and have an additional section on systems organization and improvement to reflect the pediatric hospitalist's responsibility to advance systems of care. These competencies provide a foundation for the creation of pediatric hospital medicine curricula and serve to standardize and improve inpatient training practices. (c) 2010 Society of Hospital Medicine.

  1. Processing of the GALILEO fuel rod code model uncertainties within the AREVA LWR realistic thermal-mechanical analysis methodology

    International Nuclear Information System (INIS)

    Mailhe, P.; Barbier, B.; Garnier, C.; Landskron, H.; Sedlacek, R.; Arimescu, I.; Smith, M.; Bellanger, P.

    2013-01-01

    The availability of reliable tools and an associated methodology able to accurately predict LWR fuel behavior in all conditions is of great importance for safe and economic fuel usage. For that purpose, AREVA has developed its new global fuel rod performance code GALILEO along with its associated realistic thermal-mechanical analysis methodology. This realistic methodology is based on Monte Carlo-type random sampling of all relevant input variables. After outlining the AREVA realistic methodology, this paper focuses on the GALILEO code benchmarking process, on its extended experimental database and on the assessment of the GALILEO model uncertainties. The propagation of these model uncertainties through the AREVA realistic methodology is also presented. This processing of the GALILEO model uncertainties is of the utmost importance for accurate evaluation of fuel design margins, as illustrated by some application examples. With the submittal of the GALILEO Topical Report to the U.S. NRC in 2013, GALILEO and its methodology are on the way to being used industrially in a wide range of irradiation conditions. (authors)
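The Monte Carlo propagation step described above can be sketched in a few lines: sample every uncertain input from its assumed distribution, evaluate the model for each sample, and read off a high-percentile figure of merit. Here `clad_temperature` is a stand-in response function invented for illustration, not GALILEO, and all distributions are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # number of random samples

# Illustrative fuel-rod response: a stand-in function, NOT the GALILEO code.
def clad_temperature(power, gap_conductance, bias):
    return 600.0 + 8.0 * power / gap_conductance + bias

# Sample each uncertain input from its assumed distribution
power = rng.normal(20.0, 1.0, N)            # linear heat rate (kW/m)
gap = rng.lognormal(np.log(0.5), 0.1, N)    # relative gap conductance
model_bias = rng.normal(0.0, 5.0, N)        # model-uncertainty term

T = clad_temperature(power, gap, model_bias)

# Realistic (best-estimate plus uncertainty) figure of merit:
T95 = np.percentile(T, 95)   # 95th-percentile cladding temperature
```

Comparing a high percentile such as `T95` against the design limit, rather than a single deterministic result, is what makes the margin evaluation "realistic" in the sense used above.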

  2. Capturing complexity in work disability research: application of system dynamics modeling methodology.

    Science.gov (United States)

    Jetha, Arif; Pransky, Glenn; Hettinger, Lawrence J

    2016-01-01

    Work disability (WD) is characterized by variable and occasionally undesirable outcomes. The underlying determinants of WD outcomes include patterns of dynamic relationships among health, personal, organizational and regulatory factors that have been challenging to characterize and are inadequately represented by contemporary WD models. System dynamics modeling (SDM) methodology applies a sociotechnical systems-thinking lens that views WD systems as comprising a range of influential factors linked by feedback relationships. SDM can potentially overcome limitations of contemporary WD models by uncovering causal feedback relationships and conceptualizing dynamic system behaviors. It employs a collaborative, stakeholder-based model-building methodology to create a visual depiction of the system as a whole. SDM can also enable researchers to run dynamic simulations that provide evidence of anticipated or unanticipated outcomes of policy and programmatic interventions. SDM may advance rehabilitation research by providing greater insight into the structure and dynamics of WD systems while helping to understand their inherent complexity. Challenges related to data availability, determining validity, and the extensive time and technical skill requirements for model building may limit SDM's use in the field and should be considered. Contemporary work disability (WD) models provide limited insight into the complexity associated with WD processes. System dynamics modeling (SDM) has the potential to capture complexity through a stakeholder-based approach that generates a simulation model consisting of multiple feedback loops. SDM may enable WD researchers and practitioners to understand the structure and behavior of the WD system as a whole, and inform the development of improved strategies to manage straightforward and complex WD cases.
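A minimal stock-and-flow simulation illustrates the kind of feedback structure SDM captures. All parameters and the feedback form below are hypothetical, chosen only to show the mechanics, not a validated WD model:

```python
# Minimal system-dynamics sketch: a "workers on disability" stock with a
# balancing inflow and a feedback loop in which a high caseload slows the
# effective return-to-work (RTW) rate.
dt, T = 0.1, 52.0                      # time step and horizon (weeks)
steps = int(T / dt)

stock = 100.0                          # workers currently off work
inflow = 5.0                           # new WD cases per week
base_rtw = 0.08                        # baseline RTW fraction per week

history = []
for _ in range(steps):
    # feedback: overloaded case management lowers the effective RTW rate
    rtw_rate = base_rtw / (1.0 + stock / 200.0)
    outflow = rtw_rate * stock
    stock += (inflow - outflow) * dt   # Euler integration of the stock
    history.append(stock)

equilibrium = history[-1]              # caseload approached after a year
```

Even this toy loop shows the characteristic SDM behavior: the caseload settles toward an equilibrium set by the interaction of inflow and the feedback-modulated outflow, rather than by any single factor.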

  3. A new methodology for dynamic modelling of health risks arising from wastewater influenced urban flooding

    Science.gov (United States)

    Jørgensen, Claus; Mark, Ole; Djordjevic, Slobodan; Hammond, Michael; Khan, David M.; Erichsen, Anders; Dorrit Enevoldsen, Ann; Heinicke, Gerald; Helwigh, Birgitte

    2015-04-01

    Introduction: Urban flooding due to rainfall exceeding the design capacity of drainage systems is a global problem with significant economic and social consequences. While the cost of the direct damages of urban flooding is well understood, the indirect damages, such as water-borne diseases, are in general still poorly understood. Climate change is expected to increase the frequency of urban flooding in many countries, which is likely to increase water-borne diseases. Diarrheal diseases are most prevalent in developing countries, where poor sanitation, poor drinking water and poor surface water quality cause a high disease burden and mortality, especially during floods. The level of water-borne diarrhea in countries with well-developed water and wastewater infrastructure has been reduced to an acceptable level, and the population in general does not consider wastewater to be a health risk. Nevertheless, exposure to wastewater-influenced urban flood water still has the potential to cause transmission of diarrheal diseases. When managing urban flooding and planning urban climate change adaptations, health risks are rarely taken into consideration. This paper outlines a novel methodology for linking dynamic urban flood modelling with Quantitative Microbial Risk Assessment (QMRA). This provides a unique possibility for understanding the interaction between urban flooding and the health risks caused by direct human contact with flood water, and provides an option for reducing the burden of disease in the population through intelligent urban flood risk management. Methodology: We have linked hydrodynamic urban flood modelling with quantitative microbial risk assessment (QMRA) to determine the risk of infection caused by exposure to wastewater-influenced urban flood water.
The deterministic model MIKE Flood, which integrates the sewer network model in MIKE Urban and the 2D surface model MIKE21, was used to calculate the concentration of pathogens in the
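
    The QMRA step that consumes such simulated pathogen concentrations can be sketched as follows; the beta-Poisson parameters and ingestion volume below are illustrative placeholders, not values from the study:

```python
# Illustrative QMRA step downstream of a flood model: convert a simulated
# pathogen concentration in flood water into an infection probability with
# an approximate beta-Poisson dose-response curve. The alpha/N50 values
# and ingestion volume are placeholders, not the study's parameters.

def infection_risk(conc_per_litre, ingested_ml=30.0, alpha=0.145, n50=896.0):
    dose = conc_per_litre * ingested_ml / 1000.0   # organisms ingested
    # Approximate beta-Poisson dose-response model
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

risk = infection_risk(conc_per_litre=1.0e4)   # one hypothetical exposure event
```

    In the linked framework, the concentration fed to such a curve would come from the hydrodynamic model cell a person is exposed to, rather than being a fixed input.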

  4. External validation of multivariable prediction models: a systematic review of methodological conduct and reporting

    Science.gov (United States)

    2014-01-01

    Background Before considering whether to use a multivariable (diagnostic or prognostic) prediction model, it is essential that its performance be evaluated in data that were not used to develop the model (referred to as external validation). We critically appraised the methodological conduct and reporting of external validation studies of multivariable prediction models. Methods We conducted a systematic review of articles describing some form of external validation of one or more multivariable prediction models indexed in PubMed core clinical journals published in 2010. Study data were extracted in duplicate on design, sample size, handling of missing data, reference to the original study developing the prediction models and predictive performance measures. Results 11,826 articles were identified and 78 were included for full review, which described the evaluation of 120 prediction models in participant data that were not used to develop the model. Thirty-three articles described both the development of a prediction model and an evaluation of its performance on a separate dataset, and 45 articles described only the evaluation of an existing published prediction model on another dataset. Fifty-seven percent of the prediction models were presented and evaluated as simplified scoring systems. Sixteen percent of articles failed to report the number of outcome events in the validation datasets. Fifty-four percent of studies made no explicit mention of missing data. Sixty-seven percent did not report evaluating model calibration, whilst most studies evaluated model discrimination. It was often unclear whether the reported performance measures were for the full regression model or for the simplified models. Conclusions The vast majority of studies describing some form of external validation of a multivariable prediction model were poorly reported with key details frequently not presented. The validation studies were characterised by poor design, inappropriate handling
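
    The two performance measures the review found most often under-reported, discrimination and calibration, are simple to compute; the following sketch uses a tiny invented validation set:

```python
# Tiny illustration of the two under-reported checks: discrimination
# (c-statistic) and calibration-in-the-large, on an invented validation set.

def c_statistic(preds, outcomes):
    """Probability a random event receives a higher prediction than a
    random non-event (ties count half)."""
    events = [p for p, y in zip(preds, outcomes) if y == 1]
    nonevents = [p for p, y in zip(preds, outcomes) if y == 0]
    wins = sum(1.0 if e > n else 0.5 if e == n else 0.0
               for e in events for n in nonevents)
    return wins / (len(events) * len(nonevents))

def calibration_in_the_large(preds, outcomes):
    """Observed minus expected event rate; zero means perfect on average."""
    return sum(outcomes) / len(outcomes) - sum(preds) / len(preds)

preds = [0.9, 0.8, 0.35, 0.3, 0.2]   # model predictions on new participants
outcomes = [1, 1, 0, 1, 0]           # observed events
auc = c_statistic(preds, outcomes)
cal = calibration_in_the_large(preds, outcomes)
```

    Reporting both measures, rather than discrimination alone, is exactly the gap the review highlights.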

  5. New droplet model developments

    International Nuclear Information System (INIS)

    Dorso, C.O.; Myers, W.D.; Swiatecki, W.J.; Moeller, P.; Treiner, J.; Weiss, M.S.

    1985-09-01

    A brief summary is given of three recent contributions to the development of the Droplet Model. The first concerns the electric dipole moment induced in octupole deformed nuclei by the Coulomb redistribution. The second concerns a study of squeezing in nuclei and the third is a study of the improved predictive power of the model when an empirical ''exponential'' term is included. 25 refs., 3 figs

  6. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

    2017-01-01

    of particular 'mediating design artefacts'. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design; the COllaborative E-learning Design method (Co......Ed). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...... philosophy, and still trying to develop a design method or tool with a certain general applicability is discussed at the end of the paper. Experiences from a recent design workshop are described and discussed, with a focus on what specific steps have been taken in order to apply the method successfully...

  7. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

    2010-01-01

    of particular ‘mediating design artefacts’. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design; the COllaborative E-learning Design method (Co......Ed). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...... philosophy, and still trying to develop a design method or tool with a certain general applicability is discussed at the end of the paper. Experiences from a recent design workshop are described and discussed, with a focus on what specific steps have been taken in order to apply the method successfully...

  8. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Buus, Lillian; Georgsen, Marianne; Ryberg, Thomas

    of particular 'mediating design artefacts'. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design; the COllaborative E-learning Design method (Co......Ed). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...... philosophy, and still trying to develop a design method or tool with a certain general applicability is discussed at the end of the paper. Experiences from a recent design workshop are described and discussed, with a focus on what specific steps have been taken in order to apply the method successfully...

  9. Catalytic Reforming: Methodology and Process Development for a Constant Optimisation and Performance Enhancement

    Directory of Open Access Journals (Sweden)

    Avenier Priscilla

    2016-05-01

    Full Text Available The catalytic reforming process has been used to produce high-octane gasoline since the 1940s. It might appear to be an old, well-established process for which nothing new could be done. This is, however, not the case, and constant improvements are proposed at IFP Energies nouvelles. With a global R&D approach using new concepts and forefront methodology, IFPEN is able to: propose a patented new reactor concept that increases capacity; ensure the efficiency and safety of the reactor's mechanical design using modelling of the structure; develop new catalysts that increase process performance, based on a deep understanding of the catalytic mechanism gained through an experimental and innovative analytical approach (119Sn Mössbauer and X-ray absorption spectroscopies) as well as Density Functional Theory (DFT) calculations; and maintain efficient, reliable and adapted pilot units to validate catalyst performance.

  10. Development of a methodology for assessing the environmental impact of radioactivity in Northern Marine environments

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.E. [Norwegian Radiation Protection Authority, Grini naeringspark 13, N-1332, Osteras (Norway); Hosseini, A. [Norwegian Radiation Protection Authority, Grini naeringspark 13, N-1332, Osteras (Norway); Borretzen, P. [Norwegian Radiation Protection Authority, Grini naeringspark 13, N-1332, Osteras (Norway); Thorring, H. [Norwegian Radiation Protection Authority, Grini naeringspark 13, N-1332, Osteras (Norway)]. E-mail havard.thorring@nrpa.no

    2006-10-15

    The requirement to assess the impacts of radioactivity in the environment explicitly and transparently is now generally accepted by the scientific community. A recently developed methodology for achieving this end for marine ecosystems is presented within this paper. With its clear relationship to an overarching system, the marine impact assessment is built around components of environmental transfer, ecodosimetry and radiobiological effects appraisal relying on the use of 'reference organisms'. Concentration factors (CFs), dynamic models and, in cases where parameters are missing, allometry have been employed in the consideration of radionuclide transfer. Dose conversion coefficients (DCCs) have been derived for selected flora and fauna using, inter alia, dose attenuation and chord distribution functions. The calculated dose-rates can be contextualised through comparison with dose-rates arising from natural background and chronic dose-rates at which biological effects have been observed in selected 'umbrella' endpoints.
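
    The transfer-to-dose chain built from CFs and DCCs can be illustrated with placeholder numbers; none of the values below are from the NRPA methodology itself:

```python
# Placeholder chain from the components named above: water activity
# concentration -> tissue concentration via a concentration factor (CF)
# -> internal dose-rate via a dose conversion coefficient (DCC). All
# numerical values are invented for illustration.

def internal_dose_rate(water_bq_per_l, cf_l_per_kg, dcc_ugy_per_h_per_bq_kg):
    tissue_bq_per_kg = water_bq_per_l * cf_l_per_kg          # transfer step
    return tissue_bq_per_kg * dcc_ugy_per_h_per_bq_kg        # dosimetry step

rate = internal_dose_rate(water_bq_per_l=0.5, cf_l_per_kg=100.0,
                          dcc_ugy_per_h_per_bq_kg=2.0e-4)    # microGy/h
# Contextualise against an assumed natural-background dose-rate benchmark
exceeds_benchmark = rate > 0.1
```

    The final comparison step mirrors the abstract's point that calculated dose-rates are contextualised against natural background and chronic effect thresholds.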

  11. Development of a methodology for assessing the environmental impact of radioactivity in Northern Marine environments

    International Nuclear Information System (INIS)

    Brown, J.E.; Hosseini, A.; Borretzen, P.; Thorring, H. . E-mail havard.thorring@nrpa.no

    2006-01-01

    The requirement to assess the impacts of radioactivity in the environment explicitly and transparently is now generally accepted by the scientific community. A recently developed methodology for achieving this end for marine ecosystems is presented within this paper. With its clear relationship to an overarching system, the marine impact assessment is built around components of environmental transfer, ecodosimetry and radiobiological effects appraisal relying on the use of 'reference organisms'. Concentration factors (CFs), dynamic models and, in cases where parameters are missing, allometry have been employed in the consideration of radionuclide transfer. Dose conversion coefficients (DCCs) have been derived for selected flora and fauna using, inter alia, dose attenuation and chord distribution functions. The calculated dose-rates can be contextualised through comparison with dose-rates arising from natural background and chronic dose-rates at which biological effects have been observed in selected 'umbrella' endpoints

  12. Core melt progression and consequence analysis methodology development in support of the Savannah River Reactor PSA

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Sharp, D.A.; Amos, C.N.; Wagner, K.C.; Bradley, D.R.

    1992-01-01

    A three-level Probabilistic Safety Assessment (PSA) of production reactor operation has been underway since 1985 at the US Department of Energy's Savannah River Site (SRS). The goals of this analysis are to: Analyze existing margins of safety provided by the heavy-water reactor (HWR) design challenged by postulated severe accidents; Compare measures of risk to the general public and onsite workers to guideline values, as well as to those posed by commercial reactor operation; and Develop the methodology and database necessary to prioritize improvements to engineering safety systems and components, operator training, and engineering projects that contribute significantly to improving plant safety. PSA technical staff from the Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) have performed the assessment despite two obstacles: A variable baseline plant configuration and power level; and a lack of technically applicable code methodology to model the SRS reactor conditions. This paper discusses the detailed effort necessary to modify the requisite codes before accident analysis insights for the risk assessment were obtained

  13. A methodology for constructing the calculation model of scientific spreadsheets

    NARCIS (Netherlands)

    Vos, de M.; Wielemaker, J.; Schreiber, G.; Wielinga, B.; Top, J.L.

    2015-01-01

    Spreadsheet models are frequently used by scientists to analyze research data. These models are typically described in a paper or a report, which serves as the single source of information on the underlying research project. As the calculation workflow in these models is not made explicit, readers are

  14. DEVELOPMENT OF METHODOLOGY FOR DESIGNING TESTABLE COMPONENT STRUCTURE OF DISCIPLINARY COMPETENCE

    Directory of Open Access Journals (Sweden)

    Vladimir I. Freyman

    2014-01-01

    Full Text Available The aim of the study is to present new methods for assessing the quality of educational outcomes against the requirements of the third-generation Federal State Educational Standards (FSES) developed for higher education. The urgency of the search for adequate tools to measure the quality of a competence and of the elements formed during specialists' training is specified. Methods. To make the assessment of students' achievements within a separate discipline or curriculum section more convenient, effective and exact, the interplay of competence components such as knowledge, abilities and skills must be considered. In modelling the component structure of a disciplinary competence, a testable design of components is used; the approach is borrowed from technical diagnostics. Results. The research outcomes include the definition and analysis of a general iterative methodology for the testable design of the component structure of a disciplinary competence. Application of the proposed methodology is illustrated with the example of an abstract academic discipline with specified data and labour-intensity index. Methodology restrictions are noted; practical recommendations are given. Scientific novelty. The basic data and a detailed step-by-step implementation of the proposed common iterative approach to developing a testable component structure of a disciplinary competence are considered. Tests and diagnostic tables for different design options are proposed. Practical significance. The research findings can help increase learning efficiency, support the choice of adequate control devices and accurate assessment, and promote efficient use of the personnel, time and material resources of higher education institutions. The proposed algorithms, methods and approaches to organising and carrying out the assessment of developed competences and their components can be used as methodical base while

  15. An interactive boundary layer modelling methodology for aerodynamic flows

    CSIR Research Space (South Africa)

    Smith, L

    2013-01-01

    Full Text Available -of-boundary layer flow, with the inviscid flow approximation:
    Continuity: ∂(ρu_j)/∂x_j = 0 (1)
    Conservation of momentum (Newton's second law): ∂(ρu_i)/∂t + ∂(ρu_j u_i)/∂x_j = −∂p/∂x_i + ∂/∂x_j[μ(∂u_i/∂x_j + ∂u_j/∂x_i − (2/3)δ_ij ∂u_k/∂x_k)]
    Design/methodology/approach: The approach couples two-integral boundary layer solutions to a generic inviscid solver in an iterative fashion. The boundary layer solution is obtained using the two-integral method to solve for the displacement thickness point by point with a local Newton method...
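
    The "local Newton method" named in the approach can be sketched generically; the residual below is a stand-in algebraic function with a known root, not the actual two-integral boundary-layer closure:

```python
# Generic point-by-point Newton iteration of the kind used to solve for the
# displacement thickness at each station. The residual here is a stand-in
# algebraic function with a known root, not the real boundary-layer closure.

def newton(residual, x0, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        r = residual(x)
        if abs(r) < tol:
            break
        h = 1e-7 * max(abs(x), 1.0)
        slope = (residual(x + h) - residual(x - h)) / (2.0 * h)  # numeric derivative
        x -= r / slope          # Newton update
    return x

# Stand-in residual with root delta* = 0.5
delta_star = newton(lambda d: d * d + d - 0.75, x0=1.0)
```

    In the coupled scheme, one such solve would run at each surface station, with the inviscid solver updated between sweeps.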

  16. Development and application of a methodology for identifying and characterising scenarios

    International Nuclear Information System (INIS)

    Billington, D.; Bailey, L.

    1998-01-01

    This report forms part of a suite of documents describing the Nirex model development programme. The programme is designed to provide a clear audit trail from the identification of significant features, events and processes (FEPs) to the models and modelling processes employed within a detailed performance assessment. A scenario approach to performance assessment has been adopted. It is proposed that potential evolutions of a deep geological radioactive waste repository can be represented by a base scenario and a number of variant scenarios. It is intended that assessment of the base scenario would form the core of any future performance assessment. The base scenario is chosen to be broad-ranging and to represent the natural evolution of the repository system and its surrounding environment. The base scenario is defined to include all those FEPs which are certain to occur and those which are judged likely to occur for a significant period of the assessment timescale. Variant scenarios are defined by FEPs which represent a significant perturbation to the natural system evolution, for example the occurrence of a large seismic event. A variant scenario defined by a single initiating FEP is characterised by a sequence of events. This is represented as a 'timeline' which forms the basis for modelling that scenario. To generate a variant scenario defined by two initiating FEPs, a methodology is presented for combining the timelines for the two underlying 'single-FEP' variants. The resulting series of event sequences can be generated automatically. These sequences are then reviewed, in order to reduce the number of timelines requiring detailed consideration. This is achieved in two ways: by aggregating sequences which have similar consequences in terms of safety performance; and by combining successive intervals along a timeline where appropriate. In the context of a performance assessment, the aim is to determine the conditional risk and appropriate weight for each
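
    At its core, combining the timelines of two single-FEP variants into one time-ordered event sequence is a sorted merge; the events and times (years after closure) below are invented for illustration:

```python
# A variant scenario defined by two initiating FEPs is built by merging the
# two underlying single-FEP timelines into one time-ordered sequence. The
# events and times (years after closure) are invented for illustration.
import heapq

seismic_timeline = [(5_000, "seismic event"), (5_100, "fracture network alters")]
glacial_timeline = [(8_000, "glacial advance"), (9_000, "permafrost forms")]

# Each input timeline is already sorted in time, so a sorted merge suffices.
combined_timeline = list(heapq.merge(seismic_timeline, glacial_timeline))
```

    The review step described in the abstract would then aggregate merged sequences with similar safety consequences before any detailed modelling.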

  17. A system-of-systems modeling methodology for strategic general aviation design decision-making

    Science.gov (United States)

    Won, Henry Thome

    General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through development of a system of systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft is complicated by the definition of constraints and requirements, and the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As of yet, these models treat the contributing systems---aircraft manufacturers and service providers---as independently variable assumptions. In this research, a methodology is developed which seeks to build a strategic design decision making environment through the construction of a system of systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and most importantly the travel demand. Thus represented is the behavior of the consumers and the reactive behavior of the suppliers---the manufacturers and transportation service providers---in a common modeling framework. The results indicate an ability to guide the design process---specifically the selection of design requirements---through the optimization of "capability" metrics. 
Additionally, results indicate the ability to find synergistic solutions, that is, solutions in which two systems might collaborate to achieve a better result than acting

  18. Probabilistic Model Development

    Science.gov (United States)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) will not be exceeded at a user-specified confidence level; and 2) will provide reference environments for a) peak flux, b) event-integrated fluence, and c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium and heavier ions.
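
    The "will not be exceeded at a user-specified confidence level" idea reduces, in its simplest empirical form, to a percentile of sampled environments; the fluence values below are made up:

```python
# Simplest empirical reading of a confidence-level reference environment:
# the nearest-rank percentile of sampled event-integrated fluences. The
# sample values are invented for illustration.
import math

def percentile(samples, q):
    s = sorted(samples)
    k = min(len(s) - 1, max(0, math.ceil(q * len(s)) - 1))  # nearest rank
    return s[k]

fluences = [1.2e8, 3.4e8, 8.9e8, 2.2e9, 5.0e7, 7.7e8, 1.6e9, 4.1e8,
            9.5e7, 6.3e8]                        # protons/cm^2, invented
reference_fluence = percentile(fluences, 0.95)   # 95% confidence level
```

    A full probabilistic model would fit a distribution per energy bin and ion species rather than rank raw samples, but the confidence-level semantics are the same.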

  19. MODEL AND METHOD FOR SYNTHESIS OF PROJECT MANAGEMENT METHODOLOGY WITH FUZZY INPUT DATA

    Directory of Open Access Journals (Sweden)

    Igor V. KONONENKO

    2016-02-01

    Full Text Available A literature analysis concerning the selection or creation of a project management methodology is performed. The creation of a "complete" methodology is proposed, which can be applied to managing projects of any complexity, with various degrees of responsibility for results and different predictability of the requirements. For the formation of a "complete" methodology, it is proposed to take the PMBOK standard as the basis and supplement it with processes from the most demanding plan-driven and the flexible agile methodologies. For each knowledge area of the PMBOK standard, the following groups of processes should be provided: initiation, planning, execution, reporting and forecasting, controlling, analysis, decision making, and closing. A method for generating a methodology for a specific project is presented. A multiple-criteria mathematical model and method are developed for the synthesis of a methodology when the initial data about the project and its environment are fuzzy.
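
    A toy version of ranking methodology options under fuzzy input data: criterion weights as triangular fuzzy numbers, crisp scores, and defuzzification by centroid. The criteria, weights and scores are invented, not taken from the paper's model:

```python
# Toy fuzzy multi-criteria ranking: each criterion weight is a triangular
# fuzzy number (low, mode, high), candidate scores are crisp, and
# candidates are ranked by the centroid of the fuzzy weighted sum.
# All criteria and numbers are invented for illustration.

def centroid(tri):
    low, mode, high = tri
    return (low + mode + high) / 3.0

def fuzzy_score(weights, scores):
    total = [0.0, 0.0, 0.0]
    for (lo, m, hi), s in zip(weights, scores):
        total = [total[0] + lo * s, total[1] + m * s, total[2] + hi * s]
    return centroid(total)           # defuzzified overall score

weights = [(0.2, 0.3, 0.4),   # requirements predictability
           (0.3, 0.4, 0.5),   # responsibility for results
           (0.1, 0.2, 0.3)]   # team experience
agile_scores = [0.9, 0.5, 0.8]
plan_scores = [0.4, 0.9, 0.6]

ranking = sorted([("agile", fuzzy_score(weights, agile_scores)),
                  ("plan-driven", fuzzy_score(weights, plan_scores))],
                 key=lambda kv: kv[1], reverse=True)
```

    The paper's synthesis method operates over PMBOK process groups rather than whole methodologies, but the fuzzy weighting and ranking mechanics are of this kind.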

  20. A MAINTENANCE STRATEGY MODEL FOR STATIC EQUIPMENT USING INSPECTION METHODOLOGIES AND RISK MANAGEMENT

    Directory of Open Access Journals (Sweden)

    J.K. Visser

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Mechanical equipment used on process plants can be categorised into two main types, namely static and rotating equipment. A brief survey at a number of chemical process plants indicated that a number of maintenance strategies exist and are used for rotating equipment. However, some of these strategies are not directly applicable to static equipment, although the risk-based inspection (RBI) methodology has been developed for pressure vessels. A generalised risk-based maintenance strategy for all types of static equipment does not currently exist. This paper describes the development of an optimised model of inspection methodologies, maintenance strategies, and risk management principles that is generically applicable to static equipment. It enables maintenance managers and engineers to select an applicable maintenance strategy and inspection methodology, based on the operational and business risks posed by the individual pieces of equipment.

    AFRIKAANSE OPSOMMING (translated): Mechanical equipment used on process plants can be divided into two categories, namely static and rotating equipment. A brief survey at a number of chemical process plants indicated that a number of strategies are used for the maintenance of rotating equipment, while the risk-based inspection methodology is indeed used for pressure vessels. A general risk-based maintenance strategy for all types of static equipment is, however, not currently available. This article describes the development of an optimised model of inspection methodologies, maintenance strategies and risk management principles that can be applied generically to static equipment. It enables maintenance managers and engineers to choose a maintenance strategy and inspection methodology based on the operational and business risks of the individual equipment.

  1. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future
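
    The primary/secondary model distinction drawn above can be made concrete with two classic forms, a logistic primary growth model and a Ratkowsky square-root secondary model; the parameter values below are illustrative, not fitted to data:

```python
import math

# Primary and secondary models in their classic textbook forms: logistic
# growth of the population over time, and a Ratkowsky square-root model
# for the temperature dependence of the growth rate. Parameter values are
# illustrative, not fitted to any data set.

def ratkowsky_mu(temp_c, b=0.025, t_min=-2.0):
    """Secondary model: sqrt(mu) = b * (T - Tmin)."""
    return (b * (temp_c - t_min)) ** 2

def logistic_count(t_h, mu, n0=1.0e3, nmax=1.0e9):
    """Primary model: logistic growth of the count (CFU/ml) over hours."""
    return nmax / (1.0 + (nmax / n0 - 1.0) * math.exp(-mu * t_h))

mu_25c = ratkowsky_mu(25.0)              # growth rate at 25 degC (1/h)
count_24h = logistic_count(24.0, mu_25c)
```

    A tertiary model would wrap such primary and secondary layers in software that answers shelf-life questions directly; spatially explicit IbMs replace the whole-system equations with rules for individual cells.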

  2. A framework for assessing the adequacy and effectiveness of software development methodologies

    Science.gov (United States)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.

  3. RSMASS system model development

    International Nuclear Information System (INIS)

    Marshall, A.C.; Gallup, D.R.

    1998-01-01

    RSMASS system mass models have been used for more than a decade to make rapid estimates of space reactor power system masses. This paper reviews the evolution of the RSMASS models and summarizes present capabilities. RSMASS has evolved from a simple model used to make rough estimates of space reactor and shield masses to a versatile space reactor power system model. RSMASS uses unique reactor and shield models that permit rapid mass optimization calculations for a variety of space reactor power and propulsion systems. The RSMASS-D upgrade of the original model includes algorithms for the balance of the power system, a number of reactor and shield modeling improvements, and an automatic mass optimization scheme. The RSMASS-D suite of codes cover a very broad range of reactor and power conversion system options as well as propulsion and bimodal reactor systems. Reactor choices include in-core and ex-core thermionic reactors, liquid metal cooled reactors, particle bed reactors, and prismatic configuration reactors. Power conversion options include thermoelectric, thermionic, Stirling, Brayton, and Rankine approaches. Program output includes all major component masses and dimensions, efficiencies, and a description of the design parameters for a mass optimized system. In the past, RSMASS has been used as an aid to identify and select promising concepts for space power applications. The RSMASS modeling approach has been demonstrated to be a valuable tool for guiding optimization of the power system design; consequently, the model is useful during system design and development as well as during the selection process. An improved in-core thermionic reactor system model RSMASS-T is now under development. The current development of the RSMASS-T code represents the next evolutionary stage of the RSMASS models. RSMASS-T includes many modeling improvements and is planned to be more user-friendly. RSMASS-T will be released as a fully documented, certified code at the end of

  4. The epistemology of mathematical and statistical modeling: a quiet methodological revolution.

    Science.gov (United States)

    Rodgers, Joseph Lee

    2010-01-01

    A quiet methodological revolution, a modeling revolution, has occurred over the past several decades, almost without discussion. In contrast, the 20th century ended with contentious argument over the utility of null hypothesis significance testing (NHST). The NHST controversy may have been at least partially irrelevant, because in certain ways the modeling revolution obviated the NHST argument. I begin with a history of NHST and modeling and their relation to one another. Next, I define and illustrate principles involved in developing and evaluating mathematical models. Following, I discuss the difference between using statistical procedures within a rule-based framework and building mathematical models from a scientific epistemology. Only the former is treated carefully in most psychology graduate training. The pedagogical implications of this imbalance and the revised pedagogy required to account for the modeling revolution are described. To conclude, I discuss how attention to modeling implies shifting statistical practice in certain progressive ways. The epistemological basis of statistics has moved away from being a set of procedures, applied mechanistically, and moved toward building and evaluating statistical and scientific models. Copyright 2009 APA, all rights reserved.
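
    The shift from NHST toward building and comparing models can be illustrated by fitting two competing models to a small invented data set and comparing them with AIC instead of a p-value:

```python
import math

# Modeling-first inference in miniature: fit a constant model and a
# straight-line model to a small invented data set by least squares, then
# compare them with AIC rather than a significance test.

def aic(rss, n, k):
    # Gaussian log-likelihood up to an additive constant
    return n * math.log(rss / n) + 2 * k

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]
n = len(xs)

# Model 1: y = c (one parameter)
c = sum(ys) / n
rss1 = sum((y - c) ** 2 for y in ys)

# Model 2: y = a + b*x (two parameters), closed-form least squares
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
a = ybar - b * xbar
rss2 = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

better = "line" if aic(rss2, n, 2) < aic(rss1, n, 1) else "constant"
```

    The question answered is "which model describes the data better, penalising complexity?" rather than "can the null be rejected?", which is the epistemological move the article describes.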

  5. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    Science.gov (United States)

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  6. Bootstrap data methodology for sequential hybrid model building

    Science.gov (United States)

    Volponi, Allan J. (Inventor); Brotherton, Thomas (Inventor)

    2007-01-01

    A method for modeling engine operation comprising the steps of: 1. collecting a first plurality of sensory data, 2. partitioning a flight envelope into a plurality of sub-regions, 3. assigning the first plurality of sensory data into the plurality of sub-regions, 4. generating an empirical model of at least one of the plurality of sub-regions, 5. generating a statistical summary model for at least one of the plurality of sub-regions, 6. collecting an additional plurality of sensory data, 7. partitioning the second plurality of sensory data into the plurality of sub-regions, 8. generating a plurality of pseudo-data using the empirical model, and 9. concatenating the plurality of pseudo-data and the additional plurality of sensory data to generate an updated empirical model and an updated statistical summary model for at least one of the plurality of sub-regions.
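
The nine claimed steps can be illustrated end to end. The sketch below is a minimal stand-in, not the patented system: it uses a one-dimensional linear model for a single envelope sub-region, and all names and data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_empirical(x, y):
    # Per-sub-region empirical model: a least-squares line y = a*x + b
    # stands in for whatever regression the real system uses.
    a, b = np.polyfit(x, y, 1)
    return a, b

def update_with_pseudo_data(model, x_new, y_new, n_pseudo=50, x_range=(0.0, 1.0)):
    """Steps 8-9: generate pseudo-data from the existing empirical model,
    concatenate it with the newly collected data, and refit to obtain the
    updated model, without retaining the original raw measurements."""
    a, b = model
    x_pseudo = rng.uniform(*x_range, n_pseudo)
    y_pseudo = a * x_pseudo + b          # pseudo-data from the old model
    x_all = np.concatenate([x_pseudo, x_new])
    y_all = np.concatenate([y_pseudo, y_new])
    return fit_empirical(x_all, y_all)

# Initial sensory data assigned to one sub-region of the flight envelope.
x0 = rng.uniform(0, 1, 100)
y0 = 2.0 * x0 + 1.0 + rng.normal(0, 0.05, 100)
model = fit_empirical(x0, y0)

# A later, smaller batch of sensory data updates the model via pseudo-data.
x1 = rng.uniform(0, 1, 20)
y1 = 2.0 * x1 + 1.0 + rng.normal(0, 0.05, 20)
model = update_with_pseudo_data(model, x1, y1)
print(round(model[0], 1), round(model[1], 1))
```

The pseudo-data trick is what makes the update "sequential": the old model itself serves as a compressed memory of past flights, so only the new batch needs to be stored.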

  7. Developing a methodology for identifying action zones to protect and manage groundwater well fields

    Science.gov (United States)

    Bellier, Sandra; Viennot, Pascal; Ledoux, Emmanuel; Schott, Celine

    2013-04-01

    Implementation of a long term action plan to manage and protect well fields is a complex and very expensive process. In this context, the relevance and efficiency of such action plans on water quality should be evaluated. The objective of this study is to set up a methodology to identify relevant action zones in which environmental changes may significantly impact the quantity or quality of pumped water. In the Seine-et-Marne department (France), under French environmental laws, three sectors containing numerous well fields pumping in the Champigny limestone aquifer are considered a priority. This aquifer, located south-east of Paris, supplies more than one million people with drinking water. Catchment areas of these abstractions are very large (2000 km2) and their intrinsic vulnerability was established by a simple parametric approach that does not permit consideration of the complexity of the hydrosystem. Consequently, a methodology based on distributed modelling of the aquifer processes was developed. The basin is modelled using the hydrogeological model MODCOU, developed at MINES ParisTech since the 1980s. It simulates surface and groundwater flow in aquifer systems and represents the local characteristics of the hydrosystem (aquifers communicating by leakage, river infiltration, supply from sinkholes, and locally perched or dewatering aquifers). The model was calibrated by matching simulated river discharge hydrographs and piezometric heads with those observed since the 1970s. With this modelling tool, a methodology based on the transfer of a theoretical tracer through the hydrosystem, from the ground surface to the outlets, was implemented to evaluate the spatial distribution of the contribution areas at contrasted (wet or dry) recharge periods. The results show that the area contributing to the supply of most catchments is less than 300 km2 and that the major contributory zones are located along rivers. This finding illustrates the importance of

  8. An Effective Methodology with Automated Product Configuration for Software Product Line Development

    Directory of Open Access Journals (Sweden)

    Scott Uk-Jin Lee

    2015-01-01

    Full Text Available The wide adaptation of product line engineering in the software industry has enabled cost effective development of high quality software for diverse market segments. In a software product line (SPL, a family of software is specified with a set of core assets representing reusable features with their variability, dependencies, and constraints. From such core assets, valid software products are configured after thoroughly analysing the represented features and their properties. However, current implementations of SPL lack effective means to configure a valid product, as core assets specified in SPL, being high-dimensional data, are often too complex to analyse. This paper presents a time and cost effective methodology with associated tool support to design a SPL model, analyse features, and configure a valid product. The proposed approach uses eXtensible Markup Language (XML to model SPL, where an adequate schema is defined to precisely specify core assets. Furthermore, it enables automated product configuration by (i) extracting all the properties of required features from a given SPL model and calculating them with Alloy Analyzer; (ii) generating a decision model with appropriate eXtensible Stylesheet Language Transformation (XSLT instructions embedded in each resolution effect; and (iii) processing XSLT instructions of all the selected resolution effects.
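
The validity check at the heart of product configuration can be illustrated independently of the XML/Alloy/XSLT tooling the paper describes. The sketch below is a language-neutral illustration with a hypothetical feature model; the feature names and constraint encoding are invented for the example.

```python
# Hypothetical feature model: each core-asset feature maps to a mandatory
# flag plus 'requires'/'excludes' cross-tree constraints.
FEATURES = {
    "payment": {"mandatory": True,  "requires": set(),       "excludes": set()},
    "paypal":  {"mandatory": False, "requires": {"payment"}, "excludes": set()},
    "card":    {"mandatory": False, "requires": {"payment"}, "excludes": set()},
    "guest":   {"mandatory": False, "requires": set(),       "excludes": {"loyalty"}},
    "loyalty": {"mandatory": False, "requires": set(),       "excludes": {"guest"}},
}

def is_valid_product(selected):
    """A configured product is valid when every mandatory feature is
    present and all requires/excludes constraints hold for the selection."""
    selected = set(selected)
    for name, spec in FEATURES.items():
        if spec["mandatory"] and name not in selected:
            return False
    for name in selected:
        spec = FEATURES[name]
        if not spec["requires"] <= selected:   # all prerequisites selected
            return False
        if spec["excludes"] & selected:        # no mutually exclusive pair
            return False
    return True

print(is_valid_product({"payment", "paypal", "loyalty"}))  # all constraints hold
print(is_valid_product({"paypal"}))                        # mandatory feature missing
```

In the paper's pipeline this check is delegated to Alloy Analyzer, which scales to the high-dimensional constraint sets that make manual analysis impractical.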

  9. Development of methodologies for the estimation of thermal properties associated with aerospace vehicles

    Science.gov (United States)

    Scott, Elaine P.

    1994-01-01

    Thermal stress analyses are an important aspect in the development of aerospace vehicles at NASA-LaRC. These analyses require knowledge of the temperature distributions within the vehicle structures, which consequently necessitates accurate thermal property data. The overall goal of this ongoing research effort is to develop methodologies for the estimation of the thermal property data needed to describe the temperature responses of these complex structures. The research strategy undertaken utilizes a building block approach. The idea here is to first focus on the development of property estimation methodologies for relatively simple conditions, such as isotropic materials at constant temperatures, and then systematically modify the technique for the analysis of more and more complex systems, such as anisotropic multi-component systems. The estimation methodology utilized is a statistically based method which incorporates experimental data and a mathematical model of the system. Several aspects of this overall research effort were investigated during the time of the ASEE summer program. One important aspect involved the calibration of the estimation procedure for the estimation of the thermal properties through the thickness of a standard material. Transient experiments were conducted using a Pyrex standard at various temperatures, and then the thermal properties (thermal conductivity and volumetric heat capacity) were estimated at each temperature. Confidence regions for the estimated values were also determined. These results were then compared to documented values. Another set of experimental tests was conducted on carbon composite samples at different temperatures. Again, the thermal properties were estimated for each temperature, and the results were compared with values obtained using another technique. In both sets of experiments, a 10-15 percent offset between the estimated values and the previously determined values was found. 
Another effort

  10. Development of an Expert Judgement Elicitation and Calibration Methodology for Risk Analysis in Conceptual Vehicle Design

    Science.gov (United States)

    Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina

    2004-01-01

    A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
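
The abstract does not give the aggregation formula. A common choice consistent with its description, calibration-weighted combination of expert distributions into a single probability distribution, is the linear opinion pool, sketched here with hypothetical elicited distributions and weights.

```python
import numpy as np

# Hypothetical elicited triangular distributions (low, mode, high) for one
# uncertain design input, plus calibration-based weights per expert.
experts = [
    {"low": 8.0, "mode": 10.0, "high": 14.0, "weight": 0.5},
    {"low": 9.0, "mode": 12.0, "high": 15.0, "weight": 0.3},
    {"low": 7.0, "mode": 11.0, "high": 16.0, "weight": 0.2},
]

rng = np.random.default_rng(42)
n = 100_000

# Linear opinion pool: sample each expert's distribution in proportion to
# its calibration weight, yielding one aggregate distribution suitable for
# downstream Monte Carlo risk analysis.
weights = np.array([e["weight"] for e in experts])
choice = rng.choice(len(experts), size=n, p=weights)
samples = np.empty(n)
for i, e in enumerate(experts):
    m = choice == i
    samples[m] = rng.triangular(e["low"], e["mode"], e["high"], m.sum())

print(round(samples.mean(), 1))
```

The pooled mean is simply the weight-averaged mean of the individual triangulars; the pooled distribution, however, is wider than any single expert's, which is exactly the point of aggregation.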

  11. Methodology for Modeling Building Energy Performance across the Commercial Sector

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, B.; Long, N.; Torcellini, P.; Judkoff, R.; Crawley, D.; Ryan, J.

    2008-03-01

    This report uses EnergyPlus simulations of each building in the 2003 Commercial Buildings Energy Consumption Survey (CBECS) to document and demonstrate bottom-up methods of modeling the entire U.S. commercial buildings sector (EIA 2006). The ability to use a whole-building simulation tool to model the entire sector is of interest because the energy models enable us to answer subsequent 'what-if' questions that involve technologies and practices related to energy. This report documents how the whole-building models were generated from the building characteristics in 2003 CBECS and compares the simulation results to the survey data for energy use.

  12. DEVELOPMENT OF METHODOLOGY FOR THE CALCULATION OF THE PROJECT INNOVATION INDICATOR AND ITS CRITERIA COMPONENTS

    Directory of Open Access Journals (Sweden)

    Mariya Vishnevskaya

    2017-12-01

    Full Text Available Two main components of the problem studied in the article are revealed. At the practical level, the provision of convenient tools allowing a comprehensive evaluation of the proposed innovative project in terms of its possibilities for inclusion in the portfolio or development program, and at the level of science, the need to improve and complement the existing methodology for assessing the attractiveness of innovative projects in the context of their properties and a specific set of components. The research is scientifically applied, since the problem solution involves the science-based development of a set of techniques allowing the practical use of knowledge gained from large information arrays at the initialization stage. The purpose of the study is the formation of an integrated indicator of project innovation, with a substantive justification of the calculation method, as a tool for the evaluation and selection of projects to be included in the portfolio of projects and programs. The theoretical and methodological basis of the research is the conceptual provisions and scientific developments of experts on project management issues, published in monographs, periodicals, and materials of scientific and practical conferences on the topic of research. The tasks were solved using general scientific and special methods, and mathematical modelling methods based on the system approach. Results. A balanced system of parametric single indicators of innovation is presented (the risks, personnel, quality, innovation, resources, and performers) which allows getting a comprehensive idea of any project already in the initial stages. The choice of risk tolerance as the key criterion of the "risks" element and of the reference characteristics is substantiated, in relation to which it can be argued that the potential project holds promise. A tool for calculating the risk tolerance based on the use of matrices and vector analysis is proposed

  13. Development of a combat aircraft operational and cost-effectiveness design methodology

    OpenAIRE

    Nilubol, Otsin

    2005-01-01

    This study set out to develop an aircraft design methodology which gives combat aircraft more operational and cost-effectiveness by considering these factors early in the design process. In this methodology, an aircraft will be considered as a sub-system of an overall system, representing an entire operation scenario. Measures of operational effectiveness and operational cost-effectiveness indicate the quality of, and relationships between, the major design aspects; i. e. susceptibility, ...

  14. Development and Application of Urban Landslide Vulnerability Assessment Methodology Reflecting Social and Economic Variables

    OpenAIRE

    Park, Yoonkyung; Pradhan, Ananta Man Singh; Kim, Ungtae; Kim, Yun-Tae; Kim, Sangdan

    2016-01-01

    An urban landslide vulnerability assessment methodology is proposed with major focus on considering urban social and economic aspects. The proposed methodology was developed based on the landslide susceptibility maps that the Korean Forest Service utilizes to identify landslide source areas. First, debris flows are propagated to urban areas from such source areas by Flow-R (flow path assessment of gravitational hazards at a regional scale), and then urban vulnerability is assessed by two categori...

  15. Developing pedagogical competency during teacher education through the experiential learning of didactic and methodological approaches

    OpenAIRE

    Zrim Martinjak, Nataša

    2017-01-01

    This paper considers the development of pedagogic competency during teacher education and in particular the experiential learning of didactic and methodological approaches in students of social pedagogy. The decision to focus upon experiential learning is based on the assumption and realization that the study of didactics and methodology cannot take place without experiential learning. The main goals were to gain teaching competences in order to work with the whole class, to mee...

  16. Terminology and methodology in modelling for water quality management

    DEFF Research Database (Denmark)

    Carstensen, J.; Vanrolleghem, P.; Rauch, W.

    1997-01-01

    There is a widespread need for a common terminology in modelling for water quality management. This paper points out sources of confusion in the communication between researchers due to misuse of existing terminology or use of unclear terminology. The paper attempts to clarify the context of the most widely used terms for characterising models and within the process of model building. It is essential to the ever growing society of researchers within water quality management that communication is eased by establishing a common terminology. This should not be done by giving broader definitions...

  17. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    Science.gov (United States)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  18. A development methodology for a remote inspection system with JAVA and socket

    International Nuclear Information System (INIS)

    Choi, Yoo Rark; Lee, Jae Cheol; Kim, Jae Hee

    2004-01-01

    We have developed RISYS (Reactor Inspection System), which inspects reactor vessel welds with an underwater mobile robot. The system consists of a main control computer and an inspection robot that is controlled by the main control computer. Since the environments of inspection tasks in a nuclear plant, as in other industrial fields, are very poor, serious accidents often happen. Therefore the necessity for remote inspection and control systems has increased more and more. We have carried out research on a remote inspection model for RISYS, and have adopted world wide web, java, and socket technologies for it. A client interface to access the main control computer that controls the inspection equipment is essential for the development of a remote inspection system. Such interfaces have traditionally been developed with languages such as Visual C++, Visual Basic, and X-Window. However, it is too expensive to distribute and maintain versions of an interface program for different computer operating systems. Web and java technologies come to the fore to solve these problems, but java, as an interpreted language, could incur a performance problem in operating the remote inspection system. In this paper we suggest a methodology for developing a remote inspection system with java, a traditional programming language, and socket programming that solves the java performance problem

  19. [The general methodological approaches identifying strategic positions in developing healthy lifestyle of population].

    Science.gov (United States)

    Dorofeev, S B; Babenko, A I

    2017-01-01

    The article deals with analysis of national and international publications concerning methodological aspects of elaborating a systematic approach to a healthy lifestyle of the population. This scope of inquiry plays a key role in development of human capital. The costs related to a healthy lifestyle are to be considered as personal investment into future income due to physical incrementation of human capital. The definitions of healthy lifestyle, its categories and supportive factors are to be considered in the process of development of strategies and programs of healthy lifestyle. The implementation of particular strategies entails application of comprehensive information and educational programs meant for various categories of population. Therefore, different motivation techniques are to be considered for children, adolescents, able-bodied population, and the elderly. This approach is to result in establishing particular responsibility for national government, territorial administrations, health care administrations, employers and population itself. The necessity of complex legislative measures is emphasized. The recent social hygienic studies were focused mostly on particular aspects of development of healthy lifestyle of population. Hence, the demand for long term exploration of development of organizational and functional models implementing medical preventive measures on the basis of comprehensive information analysis using statistical, sociological and professional expertise.

  20. Development of Registration methodology to 3-D Point Clouds in Robot Scanning

    Directory of Open Access Journals (Sweden)

    Chen Liang-Chia

    2016-01-01

    Full Text Available The problem of multi-view 3-D point cloud registration is investigated and effectively resolved by the developed methodology. A registration method is proposed to register two series of scans into an object model by using the proposed oriented-bounding-box (OBB) regional area-based descriptor. Robot 3-D scanning is often employed to generate a set of point clouds of physical objects. The automated operation has to successively digitize view-dependent area-scanned point clouds from complex shaped objects by multi-view point cloud registration. To achieve this, the OBB regional area-based descriptor is employed to determine an initial transformation matrix, which is then refined employing the iterative closest point (ICP) algorithm. The developed method can be used to resolve the commonly encountered difficulty in accurately merging two neighbouring area-scanned images when no coordinate reference exists. The developed method has been verified through experimental tests of its registration accuracy. Experimental results have preliminarily demonstrated the feasibility of the developed method.
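
The refinement stage of such a pipeline can be illustrated with a bare-bones point-to-point ICP. This sketch omits the paper's OBB descriptor stage entirely; the small synthetic misalignment below merely stands in for the coarse initial estimate that stage would provide.

```python
import numpy as np

def best_rigid_transform(src, dst):
    # Closed-form least-squares rotation and translation (Kabsch/SVD),
    # the solve step inside each ICP iteration.
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=30):
    """Refine a rough initial alignment: repeatedly match each source point
    to its nearest destination point and solve for the rigid motion."""
    cur = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iters):
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matched = dst[d2.argmin(axis=1)]      # brute-force nearest neighbours
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Synthetic check: apply a small rigid motion to a scan and recover it.
rng = np.random.default_rng(1)
src = rng.uniform(-1.0, 1.0, (200, 3))
angle = 0.03  # rad; small residual misalignment, as left by a coarse estimate
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.01, -0.01, 0.01])
dst = src @ R_true.T + t_true

R_est, t_est = icp(src, dst)
err = np.abs(src @ R_est.T + t_est - dst).max()
print(err < 1e-3)
```

ICP only converges to the correct pose from a nearby starting point, which is why a descriptor-based initial transform (the OBB stage in the paper) is needed before refinement.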

  1. Methodologies in the modeling of combined chemo-radiation treatments

    Science.gov (United States)

    Grassberger, C.; Paganetti, H.

    2016-11-01

    The variety of treatment options for cancer patients has increased significantly in recent years. Not only do we combine radiation with surgery and chemotherapy, new therapeutic approaches such as immunotherapy and targeted therapies are starting to play a bigger role. Physics has made significant contributions to radiation therapy treatment planning and delivery. In particular, treatment plan optimization using inverse planning techniques has improved dose conformity considerably. Furthermore, medical physics is often the driving force behind tumor control and normal tissue complication modeling. While treatment optimization and outcome modeling focus mainly on the effects of radiation, treatment modalities such as chemotherapy are treated independently or are even neglected entirely. This review summarizes the published efforts to model combined modality treatments combining radiation and chemotherapy. These models will play an increasing role in optimizing cancer therapy not only from a radiation and drug dosage standpoint, but also in terms of spatial and temporal optimization of treatment schedules.

  2. Methodology for assessing electric vehicle charging infrastructure business models

    OpenAIRE

    Madina, Carlos; Zamora, Inmaculada; Zabala, Eduardo

    2016-01-01

    The analysis of the economic implications of innovative business models in networked environments, such as electro-mobility, requires a global approach to ensure that all the involved actors obtain a benefit. Although electric vehicles (EVs) provide benefits for the society as a whole, there are a number of hurdles for their widespread adoption, mainly the high investment cost for the EV and for the infrastructure. Therefore, a sound business model must be built up for charging service operators, w...

  3. Box & Jenkins Model Identification: A Comparison of Methodologies

    Directory of Open Access Journals (Sweden)

    Maria Augusta Soares Machado

    2012-12-01

    Full Text Available This paper presents a comparison of a neuro-fuzzy back propagation network and Forecast automatic model identification for automatically identifying Box & Jenkins non-seasonal models. Recently, combinations of neural network and fuzzy logic technologies have been used to deal with uncertain and subjective problems. On the basis of the obtained results, it is concluded that this type of approach is very powerful.

  4. Systematic reviews of animal models: methodology versus epistemology.

    Science.gov (United States)

    Greek, Ray; Menache, Andre

    2013-01-01

    Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions.

  5. LIVING USABILITY LAB METHODOLOGY FOR THE DEVELOPMENT OF AMBIENT ASSISTED LIVING SYSTEMS AND SERVICES

    Directory of Open Access Journals (Sweden)

    Alexandre Queirós

    2013-10-01

    Full Text Available The paper aims to present the research work associated with the consolidation of the Living Usability Lab (LUL), an ecosystem devoted to the development of Ambient Assisted Living (AAL) systems and services. The paper describes the motivations behind the development of LUL and presents its goals and its constituent entities: (i) stakeholders; (ii) methodological approaches; (iii) applications; (iv) development platform; and (v) logical and physical infrastructure. In particular, it presents the Living Usability Lab methodology, which aims at the active involvement of potential end users and other stakeholders in all phases of AAL systems and services development, in order to optimize them in terms of usability, effectiveness and acceptance. Keywords: Usability, Ambient Assisted Living, Living Lab Methodology.

  6. Prototype methodology for obtaining cloud seeding guidance from HRRR model data

    Science.gov (United States)

    Dawson, N.; Blestrud, D.; Kunkel, M. L.; Waller, B.; Ceratto, J.

    2017-12-01

    Weather model data, along with real time observations, are critical to determine whether atmospheric conditions are prime for super-cooled liquid water during cloud seeding operations. Cloud seeding groups can either use operational forecast models, or run their own model on a computer cluster. A custom weather model provides the most flexibility, but is also expensive. For programs with smaller budgets, openly-available operational forecasting models are the de facto method for obtaining forecast data. The new High-Resolution Rapid Refresh (HRRR) model (3 x 3 km grid size), developed by the Earth System Research Laboratory (ESRL), provides hourly model runs with 18 forecast hours per run. While the model cannot be fine-tuned for a specific area or edited to provide cloud-seeding-specific output, model output is openly available on a near-real-time basis. This presentation focuses on a prototype methodology for using HRRR model data to create maps which aid in near-real-time cloud seeding decision making. The R programming language is utilized to run a script on a Windows® desktop/laptop computer either on a schedule (such as every half hour) or manually. The latest HRRR model run is downloaded from NOAA's Operational Model Archive and Distribution System (NOMADS). A GRIB-filter service, provided by NOMADS, is used to obtain surface and mandatory pressure level data for a subset domain which greatly cuts down on the amount of data transfer. Then, a set of criteria, identified by the Idaho Power Atmospheric Science Group, is used to create guidance maps. These criteria include atmospheric stability (lapse rates), dew point depression, air temperature, and wet bulb temperature. The maps highlight potential areas where super-cooled liquid water may exist, reasons as to why cloud seeding should not be attempted, and wind speed at flight level.
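
The decision logic behind such a guidance map can be sketched as a simple conjunction of threshold criteria over gridded fields. The thresholds below are hypothetical stand-ins: the abstract names the criteria (stability, dew point depression, temperature, wind) but does not give the Idaho Power values.

```python
import numpy as np

# Hypothetical thresholds standing in for the Idaho Power criteria.
T_MIN, T_MAX = -15.0, -3.0   # flight-level air temperature (deg C)
DEWPT_DEP_MAX = 2.0          # dew point depression (deg C): near-saturated air
LAPSE_MIN = 6.5              # lapse rate (deg C/km): sufficient instability
WIND_MAX = 25.0              # flight-level wind speed (m/s): operational limit

def seedability_mask(temp, dewpt_dep, lapse, wind):
    """Combine gridded HRRR-style fields into a boolean guidance map:
    True where all criteria for super-cooled liquid water and safe
    flight operations are met simultaneously."""
    return ((temp >= T_MIN) & (temp <= T_MAX)
            & (dewpt_dep <= DEWPT_DEP_MAX)
            & (lapse >= LAPSE_MIN)
            & (wind <= WIND_MAX))

# Toy 2x3 grid in place of a 3 km HRRR subset domain.
temp      = np.array([[-8.0, -20.0, -5.0], [-10.0, -1.0, -7.0]])
dewpt_dep = np.array([[1.0,   1.0,  3.0], [0.5,   1.0,  1.5]])
lapse     = np.array([[7.0,   7.0,  7.0], [5.0,   8.0,  7.5]])
wind      = np.array([[10.0, 10.0, 10.0], [10.0, 10.0, 30.0]])

print(seedability_mask(temp, dewpt_dep, lapse, wind))
```

In the real workflow these arrays would come from the GRIB-filtered NOMADS download, and cells failing a criterion can be annotated with the reason cloud seeding should not be attempted there.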

  7. First- and Second-Order Methodological Developments from the Coleman Report

    Directory of Open Access Journals (Sweden)

    Samuel R. Lucas

    2016-09-01

    Full Text Available Equality of Educational Opportunity was a watershed for sociological engagement with public policy, yet the questions the project addressed drew attention to several challenging methodological issues. Statistical advances, such as the multilevel model, were important first-order developments from the Coleman Report. Second-order developments, however, may be far less visible but perhaps even more important. Second-order developments of the Coleman Report stem from two sources: (1) social scientists' reactions to proposed resolutions of the statistical challenges that the report navigated, and (2) Coleman's own (perhaps implicit) theoretical response to criticisms of such works as Equality of Educational Opportunity. Heightened interest in the challenge of identification serves as an example of the former type of second-order effect, whereas "Coleman's boat" (Coleman 1990)—and the social analytics that adopt, among other approaches, simulation strategies of inquiry consistent with Coleman's typology of causal pathways—serves as an example of the latter. First-order developments take the questions as given and see the challenge as a practical, technical issue; second-order developments explicitly or implicitly reassess the question, treating the challenge as epistemological or social-theoretic. Second-order developments therefore may change the game, upsetting or rejecting routine practice at a fundamental level. I contend that as knowledge of second-order developments and their means of practical implementation in analyses diffuses among social analysts, they will prove of far more value than first-order developments to social understanding, sociology, and social policy.

  8. Development of the methodology and approaches to validate safety and accident management

    International Nuclear Information System (INIS)

    Asmolov, V.G.

    1997-01-01

    The article compares the development of the methodology and approaches to validate the nuclear power plant safety and accident management in Russia and advanced industrial countries. It demonstrates that the development of methods of safety validation is dialectically related to the accumulation of the knowledge base on processes and events during NPP normal operation, transients and emergencies, including severe accidents. The article describes the Russian severe accident research program (1987-1996), the implementation of which allowed Russia to reach the world level of the safety validation efforts, presents future high-priority study areas. Problems related to possible approaches to the methodological accident management development are discussed. (orig.)

  9. Development of performance assessment methodology for nuclear waste isolation in geologic media

    International Nuclear Information System (INIS)

    Bonano, E.J.; Chu, M.S.Y.; Cranwell, R.M.; Davis, P.A.

    1985-01-01

    The burial of nuclear wastes in deep geologic formations as a means for their disposal is an issue of significant technical and social impact. The analysis of the processes involved can be performed only with reliable mathematical models and computer codes, as opposed to conducting experiments, because the associated time scales are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium. These are ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the US Nuclear Regulatory Commission. The approach followed consists of a description of the overall system (waste, facility, and site), scenario selection and screening, consequence modeling (source term, ground-water flow, radionuclide transport, biosphere transport, and health effects), and uncertainty and sensitivity analysis

  10. Conceptualizing sustainable development. An assessment methodology connecting values, knowledge, worldviews and scenarios

    International Nuclear Information System (INIS)

    De Vries, Bert J.M.; Petersen, Arthur C.

    2009-01-01

    Sustainability science poses severe challenges to classical disciplinary science. To bring the perspectives of diverse disciplines together in a meaningful way, we describe a novel methodology for sustainability assessment of a particular social-ecological system, or country. The starting point is that a sustainability assessment should investigate the ability to continue and develop a desirable way of living vis-a-vis later generations and life elsewhere on the planet. Evidently, people hold different values and beliefs about the way societies sustain quality of life for their members. The first step, therefore, is to analyze people's value orientations and the way in which they interpret sustainability problems, i.e. their beliefs. The next step is to translate the resulting worldviews into model-based narratives, i.e. scenarios. The qualitative and quantitative outcomes are then investigated in terms of associated risks and opportunities and the robustness of policy options. The Netherlands Environmental Assessment Agency (PBL) has followed this methodology, using extensive surveys among the Dutch population. In its First Sustainability Outlook (2004), the resulting archetypical worldviews became the basis for four different scenarios for policy analysis, with emphases on the domains of transport, energy and food. The goal of the agency's Sustainability Outlooks is to show that choices are inevitable in policy making for sustainable development, to indicate which positive and negative impacts one can expect of these choices (trade-offs), and to identify options that may be robust under several worldviews. The conceptualization proposed here is both clear and applicable in practical sustainability assessments for policy making. (author)

  11. NEW METHODOLOGY FOR DEVELOPMENT OF ORODISPERSIBLE TABLETS USING HIGH-SHEAR GRANULATION PROCESS.

    Science.gov (United States)

    Ali, Bahaa E; Al-Shedfat, Ramadan I; Fayed, Mohamed H; Alanazi, Fars K

    2017-05-01

    Development of an orodispersible delivery system with high mechanical properties and low disintegration time is a big challenge. The aim of the current work was to assess and optimize the high-shear granulation process as a new methodology for the development of orodispersible tablets of high quality attributes, using a design-of-experiments approach. A two-factor, three-level (3²) full factorial design was carried out to investigate the main and interaction effects of the independent variables, water amount (X1) and granulation time (X2), on the characteristics of the granules and the final product, the tablet. The produced granules were analyzed for granule size, density and flowability. Furthermore, the produced tablets were tested for weight variation, breaking force/crushing strength, friability, disintegration time and drug dissolution. Regression analysis of the multiple linear models showed a high correlation between the adjusted R-squared and predicted R-squared for all granule and tablet characteristics; the difference was less than 0.2. All dependent responses of granules and tablets were found to be impacted significantly (p < 0.05) by both variables, with water amount having the greater impact on granule and tablet characteristics, as shown by its higher coefficient estimates for all selected responses. Numerical optimization using the desirability function was performed to optimize the variables under study so as to provide an orodispersible system within the USP limits with respect to mechanical properties and disintegration time. It was found that the highest desirability (0.915) could be attained at the low level of water (180 g) and a short granulation time (1.65 min). Eventually, this study provides the formulator with helpful information for selecting the proper level of water and granulation time to obtain an orodispersible system of high crushing strength and very low disintegration time when high-shear granulation is used as the method of manufacture.
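    The main-effect analysis of a 3² full factorial design of the kind described above can be sketched in a few lines. The coded levels and the response values below are hypothetical placeholders, not the study's data:

    ```python
    from itertools import product

    # 3^2 full factorial: coded levels -1, 0, +1 for the two factors,
    # water amount (X1) and granulation time (X2).
    levels = [-1, 0, 1]
    design = list(product(levels, levels))  # 9 runs

    # Hypothetical disintegration-time responses (seconds) per run.
    response = {(-1, -1): 28, (-1, 0): 31, (-1, 1): 35,
                (0, -1): 33, (0, 0): 37, (0, 1): 42,
                (1, -1): 40, (1, 0): 46, (1, 1): 53}

    def main_effect(factor_index):
        """Average response at the high level minus at the low level."""
        hi = [response[r] for r in design if r[factor_index] == 1]
        lo = [response[r] for r in design if r[factor_index] == -1]
        return sum(hi) / len(hi) - sum(lo) / len(lo)

    print(main_effect(0))  # effect of water amount (X1)
    print(main_effect(1))  # effect of granulation time (X2)
    ```

    Comparing the two main effects is the simplest version of the coefficient comparison the study uses to conclude that water amount dominates.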

  12. Methodology for the Incorporation of Passive Component Aging Modeling into the RAVEN/ RELAP-7 Environment

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua; Alfonsi, Andrea; Askin Guler; Tunc Aldemir

    2014-11-01

    Passive system, structure and components (SSCs) will degrade over their operation life and this degradation may cause to reduction in the safety margins of a nuclear power plant. In traditional probabilistic risk assessment (PRA) using the event-tree/fault-tree methodology, passive SSC failure rates are generally based on generic plant failure data and the true state of a specific plant is not reflected realistically. To address aging effects of passive SSCs in the traditional PRA methodology [1] does consider physics based models that account for the operating conditions in the plant, however, [1] does not include effects of surveillance/inspection. This paper represents an overall methodology for the incorporation of aging modeling of passive components into the RAVEN/RELAP-7 environment which provides a framework for performing dynamic PRA. Dynamic PRA allows consideration of both epistemic and aleatory uncertainties (including those associated with maintenance activities) in a consistent phenomenological and probabilistic framework and is often needed when there is complex process/hardware/software/firmware/ human interaction [2]. Dynamic PRA has gained attention recently due to difficulties in the traditional PRA modeling of aging effects of passive components using physics based models and also in the modeling of digital instrumentation and control systems. RAVEN (Reactor Analysis and Virtual control Environment) [3] is a software package under development at the Idaho National Laboratory (INL) as an online control logic driver and post-processing tool. It is coupled to the plant transient code RELAP-7 (Reactor Excursion and Leak Analysis Program) also currently under development at INL [3], as well as RELAP 5 [4]. The overall methodology aims to: • Address multiple aging mechanisms involving large number of components in a computational feasible manner where sequencing of events is conditioned on the physical conditions predicted in a simulation

  13. Research methodology workshops evaluation using the Kirkpatrick's model: translating theory into practice.

    Science.gov (United States)

    Abdulghani, Hamza Mohammad; Shaik, Shaffi Ahamed; Khamis, Nehal; Al-Drees, Abdulmajeed Abdulrahman; Irshad, Mohammad; Khalil, Mahmoud Salah; Alhaqwi, Ali Ibrahim; Isnani, Arthur

    2014-04-01

    Qualitative and quantitative evaluation of academic programs can enhance the development, effectiveness, and dissemination of comparative quality reports as well as quality improvement efforts. The objective was to evaluate five research methodology workshops by assessing participants' satisfaction, knowledge and skills gain, and impact on practice, using Kirkpatrick's evaluation model. The four-level Kirkpatrick model was applied for the evaluation. Training feedback questionnaires, pre and post tests, learner development plan reports and behavioral surveys were used to evaluate the effectiveness of the workshop programs. Of the 116 participants, 28 (24.1%) liked the programs with appreciation, 62 (53.4%) liked them with suggestions and 26 (22.4%) disliked them. Pre and post MCQ test mean scores showed significant improvement of relevant basic knowledge and cognitive skills by 17.67% (p ≤ 0.005). Pre and post test scores on workshop sub-topics also improved for manuscript writing (p ≤ 0.031) and proposal writing (p ≤ 0.834). As for the impact, 56.9% of participants started research, and 6.9% published their studies. The results from participants' performance revealed an overall positive feedback, and 79% of participants reported transfer of training skills at their workplace. The achievement of course outcomes and the suggestions given for improvement offer insight into the program; both were encouraging and very useful. Encouraging a "research culture" and work-based learning are probably the most powerful determinants for research promotion. These findings therefore encourage the faculty development unit to continue its training and development in research methodology.

  14. A Comparison Study of a Generic Coupling Methodology for Modeling Wake Effects of Wave Energy Converter Arrays

    Directory of Open Access Journals (Sweden)

    Tim Verbrugghe

    2017-10-01

    Full Text Available Wave Energy Converters (WECs) need to be deployed in large numbers in an array layout in order to have a significant power production. Each WEC has an impact on the incoming wave field, by diffracting, reflecting and radiating waves. Simulating the wave transformations within and around a WEC array is complex; it is difficult, or in some cases impossible, to simulate both these near-field and far-field wake effects using a single numerical model in a time- and cost-efficient way. Within this research, a generic coupling methodology is developed to model both near-field and far-field wake effects caused by floating (e.g., WECs, platforms) or fixed offshore structures. The methodology is based on the coupling of a wave-structure interaction solver (Nemoh) and a wave propagation model. In this paper, this methodology is applied to two wave propagation models (OceanWave3D and MILDwave), which are compared to each other in a wide spectrum of tests. Additionally, the Nemoh-OceanWave3D model is validated by comparing it to experimental wave basin data. The methodology proves to be a reliable instrument to model wake effects of WEC arrays; results demonstrate a high degree of agreement between the numerical simulations, with relative errors lower than 5%, and to a lesser extent for the experimental data, where errors range from 4% to 17%.

  15. Development of a methodology of evaluation of financial stability of commercial banks

    Directory of Open Access Journals (Sweden)

    Brauers Willem Karel M.

    2014-01-01

    Full Text Available The field of evaluation of financial stability of commercial banks, which emanates from the persistent recurrence of financial crises, has attracted researchers' interest for over a century. The span of prevailing methodologies stretches from over-simplified risk-return approaches to ones comprising a large number of economic variables at the micro- and/or macro-economic level. Methodologies of rating agencies and current methodologies reviewed and applied by the ECB are not intended for reducing information asymmetry in the market of commercial banks. The paper shows that the Lithuanian financial system is bank-based, with deposits of households being its primary sources, and that its stability primarily depends on the behavior of depositors. A methodology for the evaluation of commercial banks with features of decreasing information asymmetry in the market of commercial banks is developed by comparing different MCDA methods.

  16. Security Testing in Agile Web Application Development - A Case Study Using the EAST Methodology

    CERN Document Server

    Erdogan, Gencer

    2010-01-01

    There is a need for improved security testing methodologies specialized for Web applications and their agile development environment. The number of web application vulnerabilities is drastically increasing, while security testing tends to be given a low priority. In this paper, we analyze and compare Agile Security Testing with two other common methodologies for Web application security testing, and then present an extension of this methodology. We present a case study showing how our Extended Agile Security Testing (EAST) performs compared to a more ad hoc approach used within an organization. Our working hypothesis is that the detection of vulnerabilities in Web applications will be significantly more efficient when using a structured security testing methodology specialized for Web applications, compared to existing ad hoc ways of performing security tests. Our results show a clear indication that our hypothesis is on the right track.

  17. Multiphysics Simulation of Welding-Arc and Nozzle-Arc System: Mathematical-Model, Solution-Methodology and Validation

    Science.gov (United States)

    Pawar, Sumedh; Sharma, Atul

    2018-01-01

    This work presents a mathematical model and solution methodology for a multiphysics engineering problem: arc formation during welding and inside a nozzle. The general-purpose commercial CFD solver ANSYS FLUENT 13.0.0 is used in this work. Arc formation involves strongly coupled gas dynamics and electro-dynamics, simulated by solution of the coupled Navier-Stokes equations, Maxwell's equations and the radiation heat-transfer equation. Validation of the present numerical methodology is demonstrated with excellent agreement with the published results. The developed mathematical model and the user defined functions (UDFs) are independent of the geometry and are applicable to any system that involves arc formation, in a 2D axisymmetric coordinate system. The high-pressure flow of SF6 gas in the nozzle-arc system resembles the arc chamber of an SF6 gas circuit breaker; thus, this methodology can be extended to simulate the arcing phenomenon during current interruption.

  18. A Roadmap for Generating Semantically Enriched Building Models According to CityGML Model via Two Different Methodologies

    Science.gov (United States)

    Floros, G.; Solou, D.; Pispidikis, I.; Dimopoulou, E.

    2016-10-01

    The number of 3D modeling techniques has increased rapidly with the advance of new technologies. Nowadays, 3D modeling software focuses not only on the finest visualization of the models, but also on their semantic features during the modeling procedure. As a result, the models thus generated are both realistic and semantically enriched. Additionally, various extensions of modeling software allow for the immediate conversion of the model's format, via semi-automatic procedures, with respect to the user's scope. The aim of this paper is to investigate the generation of a semantically enriched CityGML building model via two different methodologies. The first methodology includes modeling in Trimble SketchUp and transformation in FME Desktop Manager, while the second includes the model's generation in CityEngine and its transformation into the CityGML format via the 3DCitiesProject extension for ArcGIS. Finally, the two aforesaid methodologies are compared and specific characteristics are evaluated, in order to infer which methodology is best applied depending on the project's purposes.

  19. Theoretic-methodological approaches to determine the content and classification of innovation-investment development strategies

    Directory of Open Access Journals (Sweden)

    Gerashenkova Tatyana

    2016-01-01

    Full Text Available The article states the necessity of forming an innovation-investment strategy of enterprise development, offers an approach to its classification, determines the place of this strategy within a corporate-wide strategy, and gives the methodology for forming and realizing the innovation-investment development strategy.

  20. Development Methodology for a “Next Generation” Medical Informatics Curriculum for Clinicians

    OpenAIRE

    Rose, Eric; Zeiger, Roni; Corley, Sarah; Gorman, Paul; Yackel, Thomas; Hersh, William

    2003-01-01

    We describe a new methodology for development of a medical informatics curriculum for practicing clinicians. The curriculum is based on a biaxial framework in which information is categorized by type of application and role of the learner in relation to the application. The curriculum development process incorporates feedback from practicing clinicians on an ongoing basis.

  1. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    Science.gov (United States)

    Williams, Lawrence H., Jr.

    2013-01-01

    This qualitative study analyzed experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve; others facilitate ambiguity and uncertainty by…

  2. An improved methodology for dynamic modelling and simulation of ...

    Indian Academy of Sciences (India)

    This presents a real struggle to the engineers who want to design and implement such systems with high performance, efficiency and reliability. For this purpose, engineers need a tool capable of modelling and/or simulating components of diverse nature within the ECDS. However, a majority of the available tools are limited ...

  3. A branch-and-bound methodology within algebraic modelling systems

    NARCIS (Netherlands)

    Bisschop, J.J.; Heerink, J.B.J.; Kloosterman, G.

    1998-01-01

    Through the use of application-specific branch-and-bound directives it is possible to find solutions to combinatorial models that would otherwise be difficult or impossible to find by just using generic branch-and-bound techniques within the framework of mathematical programming. MINTO is an
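    To illustrate the generic branch-and-bound technique that such directives specialize (this is a textbook sketch, not MINTO's actual implementation), here is a minimal 0/1 knapsack solver that prunes with the fractional LP relaxation bound:

    ```python
    # Minimal branch-and-bound for 0/1 knapsack (illustrative only).
    # Assumes items are pre-sorted by value/weight ratio, descending,
    # so the fractional relaxation gives a valid upper bound.
    def knapsack_bb(values, weights, capacity):
        n = len(values)
        best = 0

        def bound(i, value, room):
            # Fractional (LP) relaxation bound from item i onward.
            b = value
            for j in range(i, n):
                if weights[j] <= room:
                    room -= weights[j]
                    b += values[j]
                else:
                    b += values[j] * room / weights[j]
                    break
            return b

        def branch(i, value, room):
            nonlocal best
            if value > best:
                best = value
            if i == n or bound(i, value, room) <= best:
                return                          # prune this subtree
            if weights[i] <= room:              # branch 1: take item i
                branch(i + 1, value + values[i], room - weights[i])
            branch(i + 1, value, room)          # branch 2: skip item i

        branch(0, 0, capacity)
        return best

    # Example: three items, capacity 50 -> optimum takes items 2 and 3.
    print(knapsack_bb([60, 100, 120], [10, 20, 30], 50))
    ```

    An application-specific directive corresponds to replacing the generic `bound` or the branching order with problem knowledge.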

  4. Developing a Model Component

    Science.gov (United States)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) was a piece of Space Shuttle Orbiter support Ground Servicing Equipment (GSE). The initial purpose of the UCTS was to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The UCTS is designed with the capability of servicing future space vehicles, including all Space Station requirements necessary for the MPLM modules. The simulation uses GSE models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at Kennedy Space Center (KSC), my assignment was to develop a model component for the UCTS. I was given a fluid component (dryer) to model in Simulink. I completed training for UNIX and Simulink. The dryer is a Catch-All replaceable-core type filter-dryer. The filter-dryer provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system. The filter-dryer also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. The filter-dryer was modeled by determining the effects it has on the pressure and velocity of the system. I used Bernoulli's equation to calculate the pressure and velocity differential through the dryer. I created my filter-dryer model in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements. I participated in simulation meetings and was involved in the subsystem design process and team collaborations. I gained valuable work experience and insight into a career path as an engineer.
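    The Bernoulli-based pressure/velocity calculation described for the filter-dryer can be sketched as follows. The density and the diameters are hypothetical placeholders, and the sketch ignores friction losses (frictionless, incompressible Bernoulli flow):

    ```python
    import math

    RHO = 1000.0  # fluid density in kg/m^3 (hypothetical coolant)

    def bernoulli_dp(v1, d1, d2):
        """Pressure drop p1 - p2 across a contraction (e.g., a dryer core),
        from Bernoulli's equation combined with the continuity equation.

        v1: upstream velocity (m/s); d1, d2: pipe diameters (m).
        """
        a1 = math.pi * d1 ** 2 / 4.0
        a2 = math.pi * d2 ** 2 / 4.0
        v2 = v1 * a1 / a2                      # continuity: A1*v1 = A2*v2
        return 0.5 * RHO * (v2 ** 2 - v1 ** 2)  # p1 + rho*v1^2/2 = p2 + rho*v2^2/2

    # Halving the diameter quadruples the velocity and raises the drop sharply.
    print(bernoulli_dp(2.0, 0.1, 0.05))  # Pa
    ```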

  5. The economics of climate change mitigation in developing countries - methodological and empirical results

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.

    1997-12-01

    This thesis presents a methodological and empirical discussion of the costs associated with implementing greenhouse gas reduction strategies in developing countries. It presents a methodological framework for national costing studies and evaluates a number of associated valuation methods. The methodological framework has been applied in several developing countries as part of a UNEP project in which the author has participated, and reference is made to the results of these country studies. Some of the theoretical issues associated with the determination of the costs of emission reductions are discussed with reference to a number of World Bank and UN guidelines for project analysis in developing countries. The use of several accounting prices is recommended for mitigation projects, with a distinction being made between internationally and domestically traded goods. The consequences of using different accounting prices are discussed with respect to the methodology applied in the UNEP country studies. In conclusion the thesis reviews the results of some of the most important international studies of greenhouse gas emissions in developing countries. The review, which encompasses a total of 27 country studies, was undertaken by the author for the Intergovernmental Panel of Climate Change, the IPCC. Its conclusion is that the UNEP methodological framework and associated country study results are consistent with the recommendations and conclusions of the IPCC. (EG) 23 refs.

  6. The economics of climate change mitigation in developing countries -methodological and empirical results

    International Nuclear Information System (INIS)

    Halsnaes, K.

    1997-12-01

    This thesis presents a methodological and empirical discussion of the costs associated with implementing greenhouse gas reduction strategies in developing countries. It presents a methodological framework for national costing studies and evaluates a number of associated valuation methods. The methodological framework has been applied in several developing countries as part of a UNEP project in which the author has participated, and reference is made to the results of these country studies. Some of the theoretical issues associated with the determination of the costs of emission reductions are discussed with reference to a number of World Bank and UN guidelines for project analysis in developing countries. The use of several accounting prices is recommended for mitigation projects, with a distinction being made between internationally and domestically traded goods. The consequences of using different accounting prices are discussed with respect to the methodology applied in the UNEP country studies. In conclusion the thesis reviews the results of some of the most important international studies of greenhouse gas emissions in developing countries. The review, which encompasses a total of 27 country studies, was undertaken by the author for the Intergovernmental Panel of Climate Change, the IPCC. Its conclusion is that the UNEP methodological framework and associated country study results are consistent with the recommendations and conclusions of the IPCC. (EG) 23 refs

  7. Platform development for merging various information sources for water management: methodological, technical and operational aspects

    Science.gov (United States)

    Galvao, Diogo

    2013-04-01

    As a result of various economic, social and environmental factors, we can all experience the increasing importance of water resources on a global scale. As a consequence, we can also notice the increasing need for methods and systems capable of efficiently managing and combining the rich and heterogeneous data available that concern, directly or indirectly, these water resources, such as in-situ monitoring station data, Earth Observation images and measurements, meteorological modeling forecasts and hydrological modeling. Under the scope of the MyWater project, we developed a water management system capable of satisfying just such needs, built on a flexible platform capable of accommodating future challenges, not only in terms of sources of data but also in the models applicable to extract information from them. From a methodological point of view, the MyWater platform obtains data from distinct sources and in distinct formats, be they satellite images or meteorological model forecasts, and transforms and combines them in ways that allow them to be fed to a variety of hydrological models (such as MOHID Land, SIMGRO, etc…), which themselves can also be combined, using such approaches as those advocated by the OpenMI standard, to extract information in an automated and time-efficient manner. Such an approach brings its own share of challenges, and further research was conducted under this project on the best ways to combine such data and on novel approaches to hydrological modeling (like the PriceXD model). From a technical point of view, the MyWater platform is structured according to a classical SOA architecture, with a flexible, object-oriented, modular backend service responsible for all model process management and data treatment, while the extracted information can be accessed through a variety of frontends, from a web portal and a desktop client down to mobile phone and tablet applications. From an operational point of view, a user can not only see

  8. Developing Agent-Oriented Video Surveillance System through Agent-Oriented Methodology (AOM

    Directory of Open Access Journals (Sweden)

    Cheah Wai Shiang

    2016-12-01

    Full Text Available Agent-oriented methodology (AOM) is a comprehensive and unified agent methodology for agent-oriented software development. Although AOM is claimed to be able to cope with complex system development, it is not yet determined to what extent this is true. Therefore, it is vital to conduct an investigation to validate this methodology. This paper presents the adoption of AOM in developing an agent-oriented video surveillance system (VSS). An intruder handling scenario is designed and implemented through AOM. AOM provides an alternative method to engineer a distributed security system in a systematic manner. It presents the security system from a holistic view, provides a better conceptualization of the agent-oriented security system, and supports rapid prototyping as well as simulation of the video surveillance system.

  9. Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities

    Science.gov (United States)

    Shivanand M., Handigund; Shweta, Bhat

    The Software Requirements Specification (SRS) of the organization is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve the software tools, the hardware devices, the manual procedures, the application programs and the communication commands. These components are appropriately ordered for achieving the mission of the concerned process, both in project development and in ongoing business processes, in different flow diagrams, viz. the activity chart, workflow diagram, activity diagram, component diagram and deployment diagram. This paper proposes two generic, automatic methodologies for the design of flow diagrams for (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though both methodologies are independent, each complements the other in authenticating its correctness and completeness.
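    Critical-path determination (and deadlock detection) over a flow diagram can be sketched as a longest-path computation on a weighted DAG, where an unprocessable node signals a cycle, i.e. a deadlock. The graph below is a made-up example, not one of the paper's diagrams:

    ```python
    from collections import defaultdict

    def critical_path(edges):
        """Length of the longest (critical) path through a weighted DAG.

        edges: iterable of (u, v, duration) activity arcs.
        Uses Kahn's topological ordering; raises if a cycle (deadlock) remains.
        """
        succ = defaultdict(list)
        indeg = defaultdict(int)
        nodes = set()
        for u, v, w in edges:
            succ[u].append((v, w))
            indeg[v] += 1
            nodes |= {u, v}
        dist = {n: 0 for n in nodes}
        queue = [n for n in nodes if indeg[n] == 0]
        while queue:
            u = queue.pop()
            for v, w in succ[u]:
                dist[v] = max(dist[v], dist[u] + w)
                indeg[v] -= 1
                if indeg[v] == 0:
                    queue.append(v)
        if any(indeg[n] for n in nodes):
            raise ValueError("cycle detected: the flow diagram deadlocks")
        return max(dist.values())

    # A -> B -> D is the critical path (3 + 4 = 7) in this toy activity chart.
    print(critical_path([("A", "B", 3), ("A", "C", 2), ("B", "D", 4), ("C", "D", 1)]))
    ```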

  10. K-Means Subject Matter Expert Refined Topic Model Methodology

    Science.gov (United States)

    2017-01-01

    Latent Dirichlet allocation (Blei, Ng and Jordan, 2003) is the more widely accepted method for clustering unsupervised images or text documents. This work instead focuses on a K-Means, subject-matter-expert refined topic model (KSMERT): modeling and testing with images, evaluating with KL distances, fitting K-Means and KSMERT, Pareto charting, and retrieving top terms. Reference: Blei, D.M., Ng, A.Y., Jordan, M.I. Latent Dirichlet Allocation. The Journal of Machine Learning Research 2003; 3:993-1022.
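    A minimal stdlib-only sketch of the k-means step underlying such a topic-model refinement (plain Lloyd's algorithm on 2-D points; in the report the points would be document feature vectors, and the SME refinement is not shown):

    ```python
    import random

    def kmeans(points, k, iters=20, seed=0):
        """Plain Lloyd's k-means on 2-D points (illustrative sketch)."""
        rnd = random.Random(seed)
        centers = rnd.sample(points, k)
        for _ in range(iters):
            # Assignment step: each point joins its nearest center.
            clusters = [[] for _ in range(k)]
            for p in points:
                i = min(range(k),
                        key=lambda c: (p[0] - centers[c][0]) ** 2
                                      + (p[1] - centers[c][1]) ** 2)
                clusters[i].append(p)
            # Update step: centers move to cluster means.
            centers = [
                (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
                if cl else centers[i]
                for i, cl in enumerate(clusters)
            ]
        return centers
    ```

    With well-separated data the two centers land near the two obvious groups, which is the behavior the subject-matter expert would then refine.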

  11. New Methodologies for the Thermal Modelling of CubeSats

    OpenAIRE

    Reiss, Philip

    2012-01-01

    One of the main threats to the success of a CubeSat mission is the unbalanced distribution of thermal loads caused by internal and external heat sources. In order to design an appropriate thermal subsystem that can cope with these loads, a detailed analysis is required. However, currently available thermal software is considered less convenient for application to CubeSats, mainly due to the complexity of the modelling process. This paper examines thermal engineering issues for ...

  12. A Validation Methodology for Human Behavior Representation Models

    Science.gov (United States)

    2005-05-01

    (Stufflebeam, 2002). There are two means for grounding assessment scales. The first method fixes values for the tails of the scale for each subtask, general... Inaccuracy in the Development of a Best Estimate. Casualty Actuarial Society Forum, 45, 1998. Stufflebeam, D. L. Guidance for Choosing and Applying

  13. Risk methodology for geologic disposal of radioactive waste: asymptotic properties of the environmental transport model

    International Nuclear Information System (INIS)

    Helton, J.C.; Brown, J.B.; Iman, R.L.

    1981-02-01

    The Environmental Transport Model is a compartmental model developed to represent the surface movement of radionuclides. The purpose of the present study is to investigate the asymptotic behavior of the model and to acquire insight with respect to such behavior and the variables which influence it. For four variations of a hypothetical river receiving a radionuclide discharge, the following properties are considered: predicted asymptotic values for environmental radionuclide concentrations and time required for environmental radionuclide concentrations to reach 90% of their predicted asymptotic values. Independent variables of two types are used to define each variation of the river: variables which define physical properties of the river system (e.g., soil depth, river discharge and sediment resuspension) and variables which summarize radionuclide properties (i.e., distribution coefficients). Sensitivity analysis techniques based on stepwise regression are used to determine the dominant variables influencing the behavior of the model. This work constitutes part of a project at Sandia National Laboratories funded by the Nuclear Regulatory Commission to develop a methodology to assess the risk associated with geologic disposal of radioactive waste
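    The asymptotic behavior studied here can be illustrated with the simplest compartmental case: a single well-mixed compartment with a constant source and first-order removal (a sketch of the generic mathematics, not of the Environmental Transport Model itself; the rate constants are hypothetical):

    ```python
    import math

    def asymptote(source, k):
        """Asymptotic concentration of dC/dt = S - k*C, i.e. C(inf) = S/k."""
        return source / k

    def time_to_fraction(k, frac=0.9):
        """Time for C(t) = (S/k)*(1 - exp(-k*t)) to reach `frac` of C(inf)."""
        return -math.log(1.0 - frac) / k

    # E.g. source 2.0 units/yr, removal rate 0.5 /yr:
    print(asymptote(2.0, 0.5))        # asymptotic concentration
    print(time_to_fraction(0.5))      # years to reach 90% of it
    ```

    The 90% times reported in the study are the multi-compartment analogue of `time_to_fraction`, and the sensitivity analysis asks how both quantities vary with the physical and radionuclide parameters.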

  14. Partial least squares path modeling basic concepts, methodological issues and applications

    CERN Document Server

    Noonan, Richard

    2017-01-01

    This edited book presents the recent developments in partial least squares-path modeling (PLS-PM) and provides a comprehensive overview of the current state of the most advanced research related to PLS-PM. The first section of this book emphasizes the basic concepts and extensions of the PLS-PM method. The second section discusses the methodological issues that are the focus of the recent development of the PLS-PM method. The third part discusses the real world application of the PLS-PM method in various disciplines. The contributions from expert authors in the field of PLS focus on topics such as the factor-based PLS-PM, the perfect match between a model and a mode, quantile composite-based path modeling (QC-PM), ordinal consistent partial least squares (OrdPLSc), non-symmetrical composite-based path modeling (NSCPM), modern view for mediation analysis in PLS-PM, a multi-method approach for identifying and treating unobserved heterogeneity, multigroup analysis (PLS-MGA), the assessment of the common method b...

  15. Methodologies for Wind Turbine and STATCOM Integration in Wind Power Plant Models for Harmonic Resonances Assessment

    DEFF Research Database (Denmark)

    Freijedo Fernandez, Francisco Daniel; Chaudhary, Sanjay Kumar; Guerrero, Josep M.

    2015-01-01

    This paper approaches modelling methodologies for integration of wind turbines and STATCOM in harmonic resonance studies. Firstly, an admittance equivalent model representing the harmonic signature of grid connected voltage source converters is provided. A simplified type IV wind turbine modelling......-domain. As an alternative, a power based averaged modelling is also proposed. Type IV wind turbine harmonic signature and STATCOM active harmonic mitigation are considered for the simulation case studies. Simulation results provide a good insight of the features and limitations of the proposed methodologies....

  16. Application of fault tree methodology to modeling of the AP1000 plant digital reactor protection system

    International Nuclear Information System (INIS)

    Teolis, D.S.; Zarewczynski, S.A.; Detar, H.L.

    2012-01-01

    The reactor trip system (RTS) and engineered safety features actuation system (ESFAS) in nuclear power plants utilize instrumentation and control (I&C) to provide automatic protection against unsafe and improper reactor operation during steady-state and transient power operations. During normal operating conditions, various plant parameters are continuously monitored to assure that the plant is operating in a safe state. In response to deviations of these parameters from pre-determined set points, the protection system will initiate actions required to maintain the reactor in a safe state. These actions may include shutting down the reactor by opening the reactor trip breakers and actuation of safety equipment based on the situation. The RTS and ESFAS are represented in probabilistic risk assessments (PRAs) to reflect the impact of their contribution to core damage frequency (CDF). The reactor protection systems (RPS) in existing nuclear power plants are generally analog based and there is general consensus within the PRA community on fault tree modeling of these systems. In new plants, such as the AP1000 plant, the RPS is based on digital technology. Digital systems are more complex combinations of hardware components and software. This combination of complex hardware and software can result in the presence of faults and failure modes unique to a digital RPS. The United States Nuclear Regulatory Commission (NRC) is currently performing research on the development of probabilistic models for digital systems for inclusion in PRAs; however, no consensus methodology exists at this time. Westinghouse is currently updating the AP1000 plant PRA to support initial operation of plants currently under construction in the United States. The digital RPS is modeled using fault tree methodology similar to that used for analog based systems. This paper presents high level descriptions of a typical analog based RPS and of the AP1000 plant digital RPS. Application of current fault

  17. Systematic methodological review : developing a framework for a qualitative semi-structured interview guide

    OpenAIRE

    Kallio, H; Pietila, A; Johnson, M; Kangasniemi, M

    2016-01-01

    Aim: To produce a framework for the development of a qualitative semi-structured interview guide. Background: Rigorous data collection procedures fundamentally influence the results of studies. The semi-structured interview is a common data collection method, but methodological research on the development of a semi-structured interview guide is sparse. Design: Systematic methodological review. Data Sources: We searched PubMed, CINAHL, Scopus and Web of Science for methodo...

  18. Methodology for Outdoor Water Savings Model and Spreadsheet Tool for U.S. and Selected States

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Alison A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chen, Yuting [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dunham, Camilla [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fuchs, Heidi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Price, Sarah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stratton, Hannah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    Green lawns and landscaping are archetypical of the populated American landscape, and typically require irrigation, which corresponds to a significant fraction of residential, commercial, and institutional water use. In North American cities, the estimated portion of residential water used for outdoor purposes ranges from 22-38% in cooler climates up to 59-67% in dry and hot environments, while turfgrass coverage within the United States spans 11.1-20.2 million hectares (Milesi et al. 2009). One national estimate uses satellite and aerial photography data to develop a relationship between impervious surface and lawn surface area, yielding a conservative estimate of 16.4 (± 3.6) million hectares of lawn surface area in the United States—an area three times larger than that devoted to any irrigated crop (Milesi et al. 2005). One approach that holds promise for cutting unnecessary outdoor water use is the increased deployment of “smart” irrigation controllers to increase the water efficiency of irrigation systems. This report describes the methodology and inputs employed in a mathematical model that quantifies the effects of the U.S. Environmental Protection Agency’s WaterSense labeling program for one such type of controller, weather-based irrigation controllers (WBIC). This model builds off that described in “Methodology for National Water Savings Model and Spreadsheet Tool–Outdoor Water Use” and uses a two-tiered approach to quantify outdoor water savings attributable to the WaterSense program for WBIC, as well as net present value (NPV) of that savings. While the first iteration of the model assessed national impacts using averaged national values, this version begins by evaluating impacts in three key large states that make up a sizable portion of the irrigation market: California, Florida, and Texas. These states are considered to be the principal market of “smart” irrigation controllers that may result in the bulk of national savings. Modeled

  19. Fuel cycle assessment: A compendium of models, methodologies, and approaches

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    The purpose of this document is to profile analytical tools and methods which could be used in a total fuel cycle analysis. The information in this document provides a significant step towards: (1) Characterizing the stages of the fuel cycle. (2) Identifying relevant impacts which can feasibly be evaluated quantitatively or qualitatively. (3) Identifying and reviewing other activities that have been conducted to perform a fuel cycle assessment or some component thereof. (4) Reviewing the successes/deficiencies and opportunities/constraints of previous activities. (5) Identifying methods and modeling techniques/tools that are available, tested and could be used for a fuel cycle assessment.

  20. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    International Nuclear Information System (INIS)

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-01-01

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report

  1. Methodology for modeling the microbial contamination of air filters.

    Directory of Open Access Journals (Sweden)

    Yun Haeng Joe

    In this paper, we propose a theoretical model to simulate microbial growth on contaminated air filters and entrainment of bioaerosols from the filters to an indoor environment. Air filter filtration and antimicrobial efficiencies, and effects of dust particles on these efficiencies, were evaluated. The number of bioaerosols downstream of the filter could be characterized according to three phases: initial, transitional, and stationary. In the initial phase, the number was determined by filtration efficiency, the concentration of dust particles entering the filter, and the flow rate. During the transitional phase, the number of bioaerosols gradually increased up to the stationary phase, at which point no further increase was observed. The antimicrobial efficiency and flow rate were the dominant parameters affecting the number of bioaerosols downstream of the filter in the transitional and stationary phase, respectively. It was found that the nutrient fraction of dust particles entering the filter caused a significant change in the number of bioaerosols in both the transitional and stationary phases. The proposed model would be a solution for predicting the air filter life cycle in terms of microbiological activity by simulating the microbial contamination of the filter.
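
The three-phase downstream behaviour described above can be illustrated with a minimal toy model: penetration through the filter plus entrainment from a logistically growing microbial load on the filter. All parameter names and values below are hypothetical, invented for illustration, not taken from the paper.

```python
# Toy three-phase model: downstream count = penetration + entrainment.
import math

eta_filter = 0.95      # filtration efficiency (assumed)
c_in = 1000.0          # bioaerosols entering per unit time (assumed)
k_entrain = 1e-4       # entrained fraction of filter microbes per unit time
r_growth = 0.5         # net growth rate, antimicrobial-adjusted (assumed)
M_max = 1e7            # microbial carrying capacity on the filter (assumed)

def downstream(t, m0=1e3):
    """Downstream bioaerosol count at time t for an initial filter load m0."""
    # Logistic growth of microbes retained on the filter
    m = M_max / (1 + (M_max / m0 - 1) * math.exp(-r_growth * t))
    return c_in * (1 - eta_filter) + k_entrain * m

initial = downstream(0)       # initial phase: filtration-dominated
stationary = downstream(60)   # stationary phase: entrainment-dominated
```

With these numbers the downstream count starts near the filtration-only value and plateaus once the filter load saturates, reproducing the initial/transitional/stationary shape qualitatively.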

  2. Development of an Improved Methodology to Assess Potential Unconventional Gas Resources

    International Nuclear Information System (INIS)

    Salazar, Jesus; McVay, Duane A.; Lee, W. John

    2010-01-01

    Considering the important role played today by unconventional gas resources in North America and their enormous potential for the future around the world, it is vital to both policy makers and industry that the volumes of these resources and the impact of technology on these resources be assessed. To provide for optimal decision making regarding energy policy, research funding, and resource development, it is necessary to reliably quantify the uncertainty in these resource assessments. Since the 1970s, studies to assess potential unconventional gas resources have been conducted by various private and governmental agencies, the most rigorous of which was by the United States Geological Survey (USGS). The USGS employed a cell-based, probabilistic methodology which used analytical equations to calculate distributions of the resources assessed. USGS assessments have generally produced distributions for potential unconventional gas resources that, in our judgment, are unrealistically narrow for what are essentially undiscovered, untested resources. In this article, we present an improved methodology to assess potential unconventional gas resources. Our methodology is a stochastic approach that includes Monte Carlo simulation and correlation between input variables. Application of the improved methodology to the Uinta-Piceance province of Utah and Colorado with USGS data validates the means and standard deviations of resource distributions produced by the USGS methodology, but reveals that these distributions are not right skewed, as expected for a natural resource. Our investigation indicates that the unrealistic shape and width of the gas resource distributions are caused by the use of narrow triangular input parameter distributions. The stochastic methodology proposed here is more versatile and robust than the USGS analytic methodology. Adoption of the methodology, along with a careful examination and revision of input distributions, should allow a more realistic
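
The stochastic approach the authors advocate, Monte Carlo sampling of correlated, right-skewed inputs rather than narrow triangular distributions, can be sketched as follows. The lognormal parameters and the 0.6 correlation are illustrative assumptions, not USGS or Uinta-Piceance values.

```python
# Monte Carlo resource assessment with correlated lognormal inputs
# (Gaussian copula via a Cholesky factor of the correlation matrix).
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
# Assumed correlation between productive area and gas yield per unit area
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])
z = rng.standard_normal((n, 2)) @ np.linalg.cholesky(corr).T
area   = np.exp(4.0 + 0.5 * z[:, 0])   # productive area, lognormal
yield_ = np.exp(1.0 + 0.8 * z[:, 1])   # gas yield per unit area, lognormal
resource = area * yield_               # simulated resource distribution

# A right-skewed natural-resource distribution has mean above median
print(resource.mean(), np.median(resource))
```

Because each draw multiplies two correlated lognormals, the resulting resource distribution is right skewed, which is the shape the article argues the triangular-input analytic method fails to produce.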

  3. Development of safety analysis methodology for moderator system failure of CANDU-6 reactor by thermal-hydraulics/physics coupling

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jong Hyun, E-mail: jhkim@actbest.com [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); ACT Co., Ltd, 705 Gwanpyeong-dong, Yuseong-gu, Daejeon 305-509 (Korea, Republic of); Jin, Dong Sik [ACT Co., Ltd, 705 Gwanpyeong-dong, Yuseong-gu, Daejeon 305-509 (Korea, Republic of); Chang, Soon Heung [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)

    2013-10-15

    Highlights: • Developed a new safety analysis methodology for moderator system failures of CANDU-6. • The new methodology uses a thermalhydraulics/physics coupling concept. • The thermalhydraulic code is CATHENA; the physics code is RFSP-IST. • Moderator system failure ends in subcriticality through self-shutdown. -- Abstract: A new safety analysis methodology for moderator system failure of the CANDU-6 nuclear power plant (NPP) has been developed by coupling the thermalhydraulic code CATHENA with the reactor core physics code RFSP-IST. This methodology can replace the legacy methodology that used MODSTBOIL for thermalhydraulics and SMOKIN-G2 for reactor physics. The CATHENA thermalhydraulic model can simulate the behavior of all the moderator systems, such as the calandria tank, head tank, moderator circulating circuit and cover gas circulating circuit, and can also predict moderator properties such as density, temperature and water level in the calandria tank as a moderator system failure progresses. These calculated moderator thermalhydraulic properties are provided as inputs to the 3-dimensional neutron kinetics solution module of RFSP-IST, CERBRRS, which predicts the change in reactor power and returns the calculated power to CATHENA. These coupled calculations are performed at 2 s time steps, equivalent to the slow control of the CANDU-6 reactor regulating system (RRS). The safety analysis results using this coupling methodology reveal that, for the postulated moderator system failures of loss of heat sink and loss of moderator inventory, reactor operation enters a self-shutdown mode without any engineered safety system actuation or human intervention.

  4. High-Fidelity Modelling Methodology of Light-Limited Photosynthetic Production in Microalgae.

    Directory of Open Access Journals (Sweden)

    Andrea Bernardi

    Reliable quantitative description of light-limited growth in microalgae is key to improving the design and operation of industrial production systems. This article shows how the capability to predict photosynthetic processes can benefit from a synergy between mathematical modelling and lab-scale experiments using systematic design-of-experiment techniques. A model of chlorophyll fluorescence developed by the authors [Nikolaou et al., J Biotechnol 194:91-99, 2015] is used as the starting point, whereby the representation of the non-photochemical quenching (NPQ) process is refined for biological consistency. This model spans multiple time scales ranging from milliseconds to hours, thus calling for a combination of experimental techniques in order to arrive at a sufficiently rich data set and determine statistically meaningful estimates for the model parameters. The methodology is demonstrated for the microalga Nannochloropsis gaditana by combining pulse amplitude modulation (PAM) fluorescence, photosynthesis rate and antenna size measurements. The results show that the calibrated model is capable of accurate quantitative predictions under a wide range of transient light conditions. Moreover, this work provides an experimental validation of the link between fluorescence and photosynthesis-irradiance (PI) curves, which had previously been theorized.

  5. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    Science.gov (United States)

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations, such as when a source of uncertainty is difficult to parameterize, when resources are limited for an ideal exploration of uncertainty, or when evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that influence results most are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
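
As a rough illustration of parameter uncertainty handled the way such guides recommend, the sketch below runs a probabilistic sensitivity analysis on a hypothetical two-strategy decision model: every parameter is a distribution rather than a point estimate, and the output is the probability that the new strategy is cost-effective. The model structure, distributions and willingness-to-pay threshold are all invented for the example.

```python
# Probabilistic sensitivity analysis on a toy two-strategy decision model.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
# Parameter uncertainty expressed as distributions (all assumed)
p_event_std = rng.beta(20, 80, n)                     # event risk, standard care
rel_risk    = rng.lognormal(np.log(0.7), 0.15, n)     # treatment effect
cost_event  = rng.gamma(shape=9, scale=1000, size=n)  # cost per event
cost_treat  = 500.0                                   # treatment cost

p_event_trt = p_event_std * rel_risk
inc_cost   = cost_treat - (p_event_std - p_event_trt) * cost_event
inc_effect = (p_event_std - p_event_trt) * 0.5        # QALYs gained per event avoided

wtp = 20_000.0  # willingness to pay per QALY (assumed)
net_benefit = inc_effect * wtp - inc_cost
prob_cost_effective = np.mean(net_benefit > 0)
```

Reporting `prob_cost_effective` across a range of `wtp` values gives the familiar cost-effectiveness acceptability curve; structural and methodological uncertainty would be layered on top by re-running the analysis under alternative model structures and analytic choices.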

  6. An optimisation methodology of artificial neural network models for predicting solar radiation: a case study

    Science.gov (United States)

    Rezrazi, Ahmed; Hanini, Salah; Laidi, Maamar

    2016-02-01

    The right design and high efficiency of solar energy systems require accurate information on the availability of solar radiation. Because of the cost of purchasing and maintaining radiometers, these data are not readily available, so there is a need to develop alternative ways of generating them. Artificial neural networks (ANNs) are excellent and effective tools for learning, pinpointing or generalising data regularities, as they have the ability to model nonlinear functions; they can also cope with complex `noisy' data. The main objective of this paper is to show how to reach an optimal ANN model for predicting solar radiation. The measured data of the year 2007 in Ghardaïa city (Algeria) are used to demonstrate the optimisation methodology. The performance evaluation and the comparison of the results of the ANN models with measured data are made on the basis of mean absolute percentage error (MAPE). It is found that the MAPE of the optimal ANN model reaches 1.17 %. This model also yields a root mean square error (RMSE) of 14.06 % and an MBE of 0.12. The accuracy of the outputs exceeded 97 % and reached up to 99.29 %. The results obtained indicate that the optimisation strategy satisfies practical requirements. It can successfully be generalised for any location in the world and be used in fields other than solar radiation estimation.
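
The evaluation metrics quoted in the abstract (MAPE, RMSE, MBE) can be computed as below; the sample values are made up for illustration, not the Ghardaïa measurements.

```python
# Standard error metrics for comparing predicted vs. measured series.
import numpy as np

measured  = np.array([420.0, 515.0, 610.0, 700.0])   # e.g. W/m^2 (made-up data)
predicted = np.array([430.0, 500.0, 600.0, 710.0])

err = predicted - measured
mape = np.mean(np.abs(err) / measured) * 100   # mean absolute percentage error, %
rmse = np.sqrt(np.mean(err ** 2))              # root mean square error
mbe  = np.mean(err)                            # mean bias error (sign = over/under)
```

MAPE and RMSE measure overall accuracy, while the MBE's sign shows whether the model systematically over- or under-predicts, which is why all three are reported together.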

  7. Overview on hydrogen risk research and development activities: Methodology and open issues

    Directory of Open Access Journals (Sweden)

    Ahmed Bentaib

    2015-02-01

    During the course of a severe accident in a light water nuclear reactor, large amounts of hydrogen can be generated and released into the containment during reactor core degradation. Additional burnable gases [hydrogen (H2) and carbon monoxide (CO)] may be released into the containment during corium/concrete interaction. This could subsequently raise a combustion hazard. As the Fukushima accidents revealed, hydrogen combustion can cause high pressure spikes that could challenge the reactor buildings and lead to failure of the surrounding buildings. To prevent the gas explosion hazard, most mitigation strategies adopted by European countries are based on the implementation of passive autocatalytic recombiners (PARs). Studies of representative accident sequences indicate that, despite the installation of PARs, it is difficult to prevent, at all times and locations, the formation of a combustible mixture that potentially leads to local flame acceleration. Complementary research and development (R&D) projects were recently launched to better understand the phenomena associated with the combustion hazard and to address the issues highlighted after the Fukushima Daiichi events, such as the explosion hazard in the venting system and the potential migration of flammable mixtures into spaces beyond the primary containment. The expected results will be used to improve the modeling tools and methodology for hydrogen risk assessment and severe accident management guidelines. The present paper aims to present the methodology adopted by the Institut de Radioprotection et de Sûreté Nucléaire to assess hydrogen risk in nuclear power plants, in particular French nuclear power plants, the open issues, and the ongoing R&D programs related to hydrogen distribution, mitigation, and combustion.

  8. Development of a methodology for the evaluation of the thermomechanical behavior of the TRISO fuel

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Lorena P. Rodríguez; Pérez, Daniel Milian; Hernández, Carlos Rafael García; Lorenzo, Daniel E. Milian; Lira, Carlos A. Brayner de Oliveira, E-mail: lorenapilar1109@gmail.com, E-mail: milianperez89@gmail.com, E-mail: cgh@instec.cu, E-mail: dmilian@instec.cu, E-mail: cabol@ufpe.br [Higher Institute of Technologies and Applied Sciences (InSTEC), Habana (Cuba); Universidad Federal de Pernambuco (UFPE), Recife (Brazil). Departamento de Energia Nuclear

    2017-11-01

    The Generation IV Very High Temperature Reactor (VHTR) presents significant prospects for future nuclear energy and hydrogen production. The VHTR has advantages in its low electricity generation costs, short construction periods, high hydrogen production efficiency, safety and reliability, proliferation resistance, and the inherent safety features of the fuel and reactor. However, it faces substantial challenges to be successfully deployed as a sustainable energy source. One of these key challenges is nuclear safety, which relies mainly on the quality and integrity of the coated fuel particles (TRISO) planned to be used in these reactors, taking into consideration the high temperatures (1000°C in normal operation and up to 1800°C in accident conditions) and burnup levels (150-200 GWd/tonU) achievable in these reactors. This paper presents the current state of development of a methodology for evaluating the thermomechanical behavior of the TRISO fuel as a function of the variation of different parameters in the VHTR. To achieve this goal, coupled computational modeling is used, combining analytical methods with Monte Carlo and CFD codes such as MCNPX version 2.6e and ANSYS version 14. The studies performed in this investigation included the evaluation of key parameters of the TRISO fuel such as the release of fission gases and CO, gas pressure, temperature distributions, kernel migration, maximum stress values, and failure probabilities. The results achieved in this investigation contribute to demonstrating the viability of the proposed methodology for the study, design and safety calculations of the VHTR. (author)

  9. Development of a methodology for cost determination of wastewater treatment based on functional diagram

    International Nuclear Information System (INIS)

    Lamas, Wendell Q.; Silveira, Jose L.; Giacaglia, Giorgio E.O.; Reis, Luiz O.M.

    2009-01-01

    This work describes a methodology developed for determining the costs associated with the products generated in a small wastewater treatment station (SWTS) for sanitary wastewater from a university campus. The methodology begins with the identification of the plant's component units, relating their fluid and thermodynamic features for each point marked in its process diagram. Next, its functional diagram is developed and its formulation is elaborated, on an exergetic basis, describing all equations for these points; these equations are the constraints of the exergetic production cost problem and are used to determine the costs associated with the products generated in the SWTS. The methodology was applied to a hypothetical system based on the components of the SWTS and presented consistent results when compared to expected values based on previous exergetic expertise.
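
A minimal sketch of an exergetic cost formulation of this kind: each component contributes a linear cost balance (cost of exiting exergy equals cost of entering exergy plus a capital/O&M charge Z), and the point costs follow from solving the resulting linear system. The two-component layout, exergy values and Z charges below are hypothetical, not the SWTS data.

```python
# Exergetic production cost: solve the component cost balances as a
# linear system for the unit costs at each stream point.
import numpy as np

# Points: 1 = raw wastewater in, 2 = after pump, 3 = treated effluent
E = {1: 50.0, 2: 48.0, 3: 40.0}   # exergy flows, kW (assumed)
c1 = 0.0                          # raw wastewater assumed free, $/kWh
Z_pump, Z_reactor = 0.8, 2.5      # levelized component charges, $/h (assumed)

# Unknowns c2, c3 ($/kWh). Balances:
#   pump:    c2*E2 = c1*E1 + Z_pump
#   reactor: c3*E3 = c2*E2 + Z_reactor
A = np.array([[ E[2],  0.0],
              [-E[2],  E[3]]])
b = np.array([c1 * E[1] + Z_pump, Z_reactor])
c2, c3 = np.linalg.solve(A, b)
```

In a full plant the same pattern scales up: one balance per component plus auxiliary costing relations, giving one equation per unknown stream cost.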

  10. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

    Predictive methodologies for testing expected return models are widely diffused in the international academic environment. However, these methods have not been used in Brazil in a systematic way. Generally, empirical studies conducted with Brazilian stock market data are concentrated only on the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor and 4-factor models using a predictive methodology, considering two steps, time-series and cross-sectional regressions, with standard errors obtained by the techniques of Fama and MacBeth (1973). The results indicated the superiority of the 4-factor model compared to the 3-factor model, and the superiority of the 3-factor model compared to the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects seem not to exist in the Brazilian capital markets, but there is evidence of the value effect and of the relevance of the market factor in explaining expected returns. These findings raise some questions, mainly because of the originality of the methodology in the local market and because this subject is still incipient and polemic in the Brazilian academic environment.
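
A compact sketch of the two-step Fama-MacBeth (1973) procedure the abstract refers to: time-series regressions estimate betas, then period-by-period cross-sectional regressions yield risk-premium estimates whose time-series average and standard error form the test. The data here are simulated with a single market factor (CAPM-style); a 3- or 4-factor version just adds columns.

```python
# Two-step Fama-MacBeth procedure on simulated single-factor data.
import numpy as np

rng = np.random.default_rng(3)
T, N = 120, 25                        # months, portfolios (assumed sizes)
factor = rng.normal(0.005, 0.04, T)   # market factor realizations
true_beta = rng.uniform(0.5, 1.5, N)
returns = 0.002 + np.outer(factor, true_beta) + rng.normal(0, 0.02, (T, N))

# Step 1: time-series OLS per portfolio -> beta estimates
X = np.column_stack([np.ones(T), factor])
betas = np.linalg.lstsq(X, returns, rcond=None)[0][1]   # shape (N,)

# Step 2: cross-sectional regression each period -> lambda_t (risk premium)
Z = np.column_stack([np.ones(N), betas])
lambdas = np.array([np.linalg.lstsq(Z, returns[t], rcond=None)[0][1]
                    for t in range(T)])

lam_mean = lambdas.mean()
lam_se = lambdas.std(ddof=1) / np.sqrt(T)   # Fama-MacBeth standard error
t_stat = lam_mean / lam_se                  # test: is the premium priced?
```

The "first step only" criticism in the abstract corresponds to stopping after the time-series regressions instead of carrying the betas into the cross-sectional stage.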

  11. Developing a methodology for identifying correlations between LERF and early fatality

    International Nuclear Information System (INIS)

    Kang, Kyung Min; Jae, Moo Sung; Ahn, Kwang Il

    2009-01-01

    The correlations between Large Early Release Frequency (LERF) and early fatality need to be investigated for risk-informed application and regulation. RG 1.174 provides decision-making criteria using the measures of CDF and LERF, but there are no specific criteria on LERF itself. Since off-site consequence calculations involve both large uncertainties and high cost, a LERF assessment methodology needs to be developed and its correlation factor identified for risk-informed decision making. In this regard, a robust method for estimating off-site consequences has been applied to assess the health effects caused by radioisotopes released from severe accidents of nuclear power plants. The MACCS2 code is used to quantitatively validate the source term with respect to health effects, depending on the release characteristics of the radioisotopes during severe accidents. This study developed a method for identifying correlations between LERF and early fatality and validates the results of the model using the MACCS2 code. The results of this study may contribute to defining LERF and finding a measure for risk-informed regulation and risk-informed decision making.

  12. An Innovative Structural Mode Selection Methodology: Application for the X-33 Launch Vehicle Finite Element Model

    Science.gov (United States)

    Hidalgo, Homero, Jr.

    2000-01-01

    An innovative methodology for structural target mode selection, based on a specific criterion, is presented. An effective approach to single out modes which interact with specific locations on a structure has been developed for the X-33 Launch Vehicle Finite Element Model (FEM). The Root-Sum-Square (RSS) displacement method presented here computes the resultant modal displacement for each mode at selected degrees of freedom (DOF) and sorts the results to locate the modes with the highest values. This method was used to determine the modes which most influenced specific locations/points on the X-33 flight vehicle, such as avionics control components, aero-surface control actuators, and propellant valve and engine points, for use in flight control stability analysis and flight POGO stability analysis. Additionally, the modal RSS method allows primary or global target vehicle modes to be identified in an accurate and efficient manner.
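
The RSS ranking described can be sketched in a few lines: for each mode, take the root-sum-square of its shape displacements over the DOFs of interest and sort the modes by that value. The mode-shape matrix and DOF indices below are made up, not X-33 data.

```python
# Root-Sum-Square (RSS) displacement ranking of modes at selected DOFs.
import numpy as np

rng = np.random.default_rng(4)
n_dof, n_modes = 300, 12
phi = rng.normal(0, 1, (n_dof, n_modes))   # mode shapes, one column per mode
selected_dof = [10, 11, 12, 45, 46, 47]    # e.g. an actuator attachment (assumed)

# RSS resultant displacement of each mode at the selected DOFs
rss = np.sqrt((phi[selected_dof, :] ** 2).sum(axis=0))

ranking = np.argsort(rss)[::-1]            # most influential modes first
target_modes = ranking[:3]                 # candidate target modes
```

In practice the mode shapes would come from the FEM eigensolution, and the top-ranked modes feed the flight control and POGO stability analyses.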

  13. Methodology for considering environments and culture in developing information security systems

    OpenAIRE

    Mwakalinga, G Jeffy; Kowalski, Stewart; Yngström, Louise

    2009-01-01

    In this paper, we describe a methodology for considering culture of users and environments when developing information security systems. We discuss the problem of how researchers and developers of security for information systems have had difficulties in considering culture of users and environments when they develop information security systems. This has created environments where people serve technology instead of technology serving people. Users have been considered just as any other compo...

  14. Development of methodology for separation and recovery of uranium from nuclear wastewater

    International Nuclear Information System (INIS)

    Satpati, S.K.; Roy, S.B.; Pal, Sangita; Tewari, P.K.

    2015-01-01

    Uranium plays a key role in nuclear power supply, demand for which is growing over time because of its prospective features. A persistent increase in nuclear activities leads to increased generation of uranium-bearing nuclear wastewater. Separation and recovery of uranium from an unconventional source such as nuclear wastewater is worth exploring to address the reutilisation of this uranium source. It is also necessary to improve the remediation technology of the nuclear industry for environmental protection. Development of a suitable process methodology to supersede the conventional methodology is essential for this purpose. In this article, recent developments in several possible methodologies for separating uranium from dilute solution are discussed along with their merits and demerits. A sorption technique, as a solid phase extraction (SPE) methodology, has been chosen with a suitable polymer matrix and functional moiety based on the wastewater characteristics. The polyhydroxamic acid (PHOA) sorbent, synthesized following an eco-friendly procedure, is a promising polymeric chelating sorbent for remediation of nuclear wastewater and recovery of uranium. The sorption and elution characteristics of PHOA have been evaluated and illustrated for the separation and recovery of uranium from a sample of nuclear wastewater. For the remediation of nuclear wastewater, the SPE technique applying PHOA, a polymeric sorbent, is found to be a potentially suitable methodology. (author)

  15. Methodology for Developing the REScheckTM Software through Version 4.4.3

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M; Gowri, Krishnan; Lucas, Robert G; Schultz, Robert W; Taylor, Zachary T; Wiberg, John D

    2012-09-01

    , MECcheck was renamed REScheck™ to better identify it as a residential code compliance tool. The “MEC” in MECcheck was outdated because it was taken from the Model Energy Code, which has been succeeded by the IECC. The “RES” in REScheck is also a better fit with the companion commercial product, COMcheck™. The easy-to-use REScheck compliance materials include a compliance and enforcement manual for all the MEC and IECC requirements and three compliance approaches for meeting the code’s thermal envelope requirements: prescriptive packages, software, and a trade-off worksheet (included in the compliance manual). The compliance materials can be used for single-family and low-rise multifamily dwellings. The materials allow building energy efficiency measures (such as insulation levels) to be “traded off” against each other, allowing a wide variety of building designs to comply with the code. This report explains the methodology used to develop Version 4.4.3 of the REScheck software developed for the 1992, 1993, and 1995 editions of the MEC; the 1998, 2000, 2003, 2006, 2007, 2009, and 2012 editions of the IECC; and the 2006 edition of the International Residential Code (IRC). Although some requirements contained in these codes have changed, the methodology used to develop the REScheck software for these editions is similar. Beginning with REScheck Version 4.4.0, support for the 1992, 1993, and 1995 MEC and the 1998 IECC is no longer included, but those sections remain in this document for reference purposes. REScheck assists builders in meeting the most complicated part of the code: the building envelope Uo-, U-, and R-value requirements in Section 502 of the code. This document details the calculations and assumptions underlying the treatment of the code requirements in REScheck, with a major emphasis on the building envelope requirements.
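
The envelope trade-off idea behind such compliance software can be illustrated by an overall UA check: the sum of U-factor times area over all envelope components is compared against a budget, so extra insulation in one component can offset a weaker one. The components, values and budget below are invented for illustration, not actual code tables.

```python
# Whole-envelope UA trade-off check (illustrative values only).
components = [
    # (name, U-factor, area) in consistent units, e.g. Btu/(h.ft^2.F) and ft^2
    ("ceiling", 0.030, 1200.0),
    ("walls",   0.082, 1500.0),
    ("windows", 0.350,  230.0),
]

def total_ua(comps):
    """Sum of U * A over all envelope components."""
    return sum(u * a for _, u, a in comps)

code_max_ua = 245.0  # hypothetical code budget for this envelope
proposed_ua = total_ua(components)
complies = proposed_ua <= code_max_ua
```

Under this scheme a design with large windows can still comply by lowering the wall or ceiling U-factors until the total UA falls back under the budget.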

  16. A novel methodology to model the cooling processes of packed horticultural produce using 3D shape models

    Science.gov (United States)

    Gruyters, Willem; Verboven, Pieter; Rogge, Seppe; Vanmaercke, Simon; Ramon, Herman; Nicolai, Bart

    2017-10-01

Freshly harvested horticultural produce require proper temperature management to maintain their high economic value. To this end, low-temperature storage is crucial for maintaining high product quality. Optimizing both the package design of packed produce and the different steps in the postharvest cold chain can be achieved by numerical modelling of the relevant transport phenomena. This work presents a novel methodology to accurately model both the random filling of produce in a package and the subsequent cooling process. First, a cultivar-specific database of more than 100 realistic CAD models of apple and pear fruit is built with a validated geometrical 3D shape model generator. To give an accurate representation of a realistic picking season, the model generator also takes into account the biological variability of the produce shape. Next, a discrete element model (DEM) randomly chooses surface-meshed bodies from the database to simulate the gravitational filling process of produce in a box or bin, using actual mechanical properties of the fruit. A computational fluid dynamics (CFD) model is then developed with the final stacking arrangement of the produce to study the cooling efficiency of packages under several conditions and configurations. Here, a typical precooling operation is simulated to demonstrate the large differences between using actual 3D shapes of the fruit and an equivalent-spheres approach that simplifies the problem drastically. From this study, it is concluded that using a simplified representation of the actual fruit shape may lead to a severe overestimation of the cooling behaviour.
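One reason shape matters for cooling can be seen even in a zero-dimensional lumped-capacitance sketch: the cooling time constant is tau = m*c/(h*A), so any shape simplification that changes the exposed surface area A shifts the predicted cooling time. The sketch below is not the paper's CFD model; the mass, heat-transfer coefficient, and the assumed 15% surface-area difference between an irregular fruit and its equivalent sphere are all illustrative assumptions.

```python
import math

# Lumped-capacitance sketch of fruit precooling. Half-cooling time
# t_half = tau * ln(2), with tau = m*c/(h*A): predicted cooling speed
# scales directly with the surface area the shape model provides.
# All values are illustrative, not measurements from the study.

def half_cooling_time(mass_kg, cp, h, area_m2):
    """Half-cooling time (s) of a lumped body at constant air temperature."""
    tau = mass_kg * cp / (h * area_m2)
    return tau * math.log(2.0)

mass, cp, h = 0.18, 3800.0, 25.0        # kg, J/(kg K), W/(m2 K)
area_sphere = 4 * math.pi * 0.035**2    # equivalent sphere, r = 3.5 cm
area_actual = 1.15 * area_sphere        # assumed: irregular shape has ~15% more area

t_sphere = half_cooling_time(mass, cp, h, area_sphere)
t_actual = half_cooling_time(mass, cp, h, area_actual)
print(f"sphere: {t_sphere/60:.1f} min, actual shape: {t_actual/60:.1f} min")
```

In the packed-bed problem the study addresses, shape additionally changes the stacking arrangement and the airflow paths between fruit, which is why the full DEM-plus-CFD chain, rather than a lumped estimate, is needed.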

  17. GRA model development at Bruce Power

    International Nuclear Information System (INIS)

    Parmar, R.; Ngo, K.; Cruchley, I.

    2011-01-01

    In 2007, Bruce Power undertook a project, in partnership with AMEC NSS Limited, to develop a Generation Risk Assessment (GRA) model for its Bruce B Nuclear Generating Station. The model is intended to be used as a decision-making tool in support of plant operations. Bruce Power has recognized the strategic importance of GRA in the plant decision-making process and is currently implementing a pilot GRA application. The objective of this paper is to present the scope of the GRA model development project, methodology employed, and the results and path forward for the model implementation at Bruce Power. The required work was split into three phases. Phase 1 involved development of GRA models for the twelve systems most important to electricity production. Ten systems were added to the model during each of the next two phases. The GRA model development process consists of developing system Failure Modes and Effects Analyses (FMEA) to identify the components critical to the plant reliability and determine their impact on electricity production. The FMEAs were then used to develop the logic for system fault tree (FT) GRA models. The models were solved and post-processed to provide model outputs to the plant staff in a user-friendly format. The outputs consisted of the ranking of components based on their production impact expressed in terms of lost megawatt hours (LMWH). Another key model output was the estimation of the predicted Forced Loss Rate (FLR). (author)
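The ranking output described above, components ordered by production impact in lost megawatt hours, can be sketched with a simple first-order estimate: LMWH per year as failure frequency times mean restoration time times the megawatts lost while the component is down. The component names and numbers below are hypothetical, not Bruce B data.

```python
# Illustrative sketch of a GRA-style component ranking:
# LMWH/yr = failure frequency (1/yr) x mean restoration time (h) x MW lost.
# All components and values are hypothetical.

def lmwh(freq_per_year, restore_hours, mw_lost):
    """Expected lost megawatt hours per year for one failure mode."""
    return freq_per_year * restore_hours * mw_lost

components = {
    "feedwater pump":   lmwh(0.5, 48.0, 900.0),
    "condenser tubing": lmwh(1.2, 10.0, 300.0),
    "turbine governor": lmwh(0.1, 72.0, 900.0),
}

ranked = sorted(components.items(), key=lambda kv: -kv[1])
for name, val in ranked:
    print(f"{name:16s} {val:8.0f} LMWH/yr")
```

A full GRA model solves fault trees built from the FMEAs rather than multiplying per-component point values, but the end product is the same kind of ranked list for plant decision-making.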

  18. The Cultural Analysis of Soft Systems Methodology and the Configuration Model of Organizational Culture

    Directory of Open Access Journals (Sweden)

    Jürgen Staadt

    2015-06-01

Organizations that find themselves within a problematic situation connected with cultural issues such as politics and power require adaptable research and corresponding modeling approaches so as to grasp the arrangements of that situation and their impact on the organizational development. This article originates from an insider-ethnographic intervention into the problematic situation of the leading public housing provider in Luxembourg. Its aim is to describe how the more action-oriented cultural analysis of soft systems methodology and the theory-driven configuration model of organizational culture are mutually beneficial rather than contradictory. The data collected between 2007 and 2013 were analyzed manually as well as by means of ATLAS.ti. Results demonstrate that the cultural analysis enables an in-depth understanding of the power-laden environment within the organization bringing about the so-called “socio-political system” and that the configuration model makes it possible to depict the influence of that system on the whole organization. The overall research approach thus contributes toward a better understanding of the influence and the impact of oppressive social environments and evolving power relations on the development of an organization.

  19. HIERARCHICAL METHODOLOGY FOR MODELING HYDROGEN STORAGE SYSTEMS. PART I: SCOPING MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, B; Anton, D L

    2008-12-22

    Detailed models for hydrogen storage systems provide essential design information about flow and temperature distributions, as well as the utilization of a hydrogen storage medium. However, before constructing a detailed model it is necessary to know the geometry and length scales of the system, along with its heat transfer requirements, which depend on the limiting reaction kinetics. More fundamentally, before committing significant time and resources to the development of a detailed model, it is necessary to know whether a conceptual storage system design is viable. For this reason, a hierarchical system of models progressing from scoping models to detailed analyses was developed. This paper, which discusses the scoping models, is the first in a two-part series that presents a collection of hierarchical models for the design and evaluation of hydrogen storage systems.
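A scoping model of the kind described above can be as simple as an energy balance that checks whether a conceptual design's heat-removal requirement is even feasible before any geometry is drawn: absorbing hydrogen into a metal hydride releases heat at roughly Q = (m_dot / M) * dH. The reaction enthalpy and fill target below are order-of-magnitude assumptions, not values from the paper.

```python
# Scoping-level feasibility check for a hydride storage system:
# average heat load while charging, Q = (m_dot / M_H2) * dH.
# dH and the fill scenario are illustrative assumptions.

M_H2 = 2.016e-3            # kg/mol, molar mass of H2
dH   = 30.0e3              # J/mol H2 absorbed (typical hydride order of magnitude)
target_kg, fill_time_s = 5.0, 300.0   # assumed: 5 kg of H2 in 5 minutes

m_dot = target_kg / fill_time_s       # kg/s average charging rate
q_watts = (m_dot / M_H2) * dH         # required average heat-removal rate

print(f"average heat load during charging: {q_watts/1e3:.0f} kW")
```

A result in the hundreds of kilowatts immediately constrains the heat-exchanger scale and length scales a viable design must have, which is exactly the information the detailed flow-and-temperature models then need as a starting point.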

  20. Digital System Categorization Methodology to Support Integration of Digital Instrumentation and Control Models into PRAs

    International Nuclear Information System (INIS)

    Arndt, Steven A.

    2011-01-01

    It has been suggested that by categorizing the various digital systems used in safety-critical applications in nuclear power plants, it would be possible to determine which systems should be modeled in the analysis of the larger plant-wide PRA, at what level of detail the digital system should be modeled, and using which methods. The research reported in this paper develops a categorization method using system attributes to permit a modeler to more effectively model the systems that will likely have the most critical contributions to the overall plant safety and to more effectively model system interactions for those digital systems where the interactions are most important to the overall accuracy and completeness of the plant PRA. The proposed methodology will categorize digital systems based on certain attributes of the systems themselves and how they will be used in the specific application. This will help determine which digital systems need to be modeled and at what level of detail, and can be used to guide PRA analysis and regulatory reviews. The three-attribute categorization strategy proposed by Arndt is used as the basis for the categorization methodology developed here. The first attribute, digital system complexity, is based on Type II interactions defined by Aldemir and an overall digital system size and complexity index. The size and complexity indices used are previously defined software complexity metrics. Potential sub-attributes of digital system complexity include design complexity, software complexity, hardware complexity, system function complexity, and testability. The second attribute, digital system interactions/inter-connectivity, is a combination of Rushby's coupling and Aldemir's Type I interactions. Digital systems that are loosely coupled and/or have very few Type I interactions would not interact dynamically with the overall system and would have a low interactions/inter-connectivity score. 
Potential sub-attributes of digital system