WorldWideScience

Sample records for modeling methodology development

  1. Suitability of Modern Software Development Methodologies for Model Driven Development

    Directory of Open Access Journals (Sweden)

    Ruben Picek

    2009-12-01

    Full Text Available As an answer to today’s growing challenges in the software industry, a wide spectrum of new software development approaches has emerged. One prominent direction is the currently most promising software development paradigm, Model Driven Development (MDD). Despite considerable skepticism and problems, the MDD paradigm is being used and improved to realize its many inherent potential benefits. A methodological approach to software development requires some kind of development process. Modern methodologies can be classified into two main categories: formal or heavyweight, and agile or lightweight. When it comes to MDD, however, currently known methodologies say little or nothing about an MDD process. As the result of this research, in this paper the author examines the possibilities of using existing modern software methodologies in the context of the MDD paradigm.

  2. General Methodology for developing UML models from UI

    CERN Document Server

    Reddy, Ch Ram Mohan; Srinivasa, K G; Kumar, T V Suresh; Kanth, K Rajani

    2012-01-01

    In the recent past, every discipline and every industry had their own methods of developing products, whether in software development, mechanics, construction, psychology, and so on. These demarcations work fine as long as the requirements stay within one discipline. However, if the project extends over several disciplines, interfaces have to be created and coordinated between the methods of these disciplines. Performance is an important quality aspect of Web Services because of their distributed nature, and predicting the performance of web services during early stages of software development is significant. In industry, a prototype of these applications is developed during the analysis phase of the Software Development Life Cycle (SDLC). Performance models are then generated from UML models, and methodologies for predicting performance from UML models are available. Hence, in this paper, a methodology for developing a Use Case model and an Activity model from the User Interface is presented. The methodology is illustrated with a case...

  3. Developing the Business Model – a Methodology for Virtual Enterprises

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a methodology to develop Virtual Enterprises (VEs). This Virtual Enterprise Methodology (VEM) outlines activities to consider when setting up and managing virtual enterprises. As a methodology the VEM helps companies to ask the right questions when preparing for, and setting...... and Methodology ISO15704:2000)....

  4. Spatial Development Modeling Methodology Application Possibilities in Vilnius

    Directory of Open Access Journals (Sweden)

    Lina Panavaitė

    2017-05-01

    Full Text Available In order to control the continued development of high-rise buildings and their irreversible visual impact on the overall silhouette of the city, the great cities of the world have introduced new methodological principles into their spatial development models. These methodologies and spatial planning guidelines focus not only on the controlled development of high-rise buildings, but on the spatial modelling of the whole city, by defining main development criteria and estimating possible consequences. Vilnius is no exception; however, after the re-establishment of Lithuania's independence, urbanization proceeded in an uncontrolled way, so most of the city's development regulations emerged as after-the-fact legalization of investors' expectations. The importance of a consistent urban fabric, and of conserving and representing the city's most important objects, gained attention only when an actual threat emerged of overshadowing them with new architecture, alongside unmanaged urbanization in the city centre and land-use-driven urban sprawl in the suburbs. Current Vilnius spatial planning documents clearly define the urban structure and key development principles, but the definitions are relatively abstract, leading to uniform building coverage requirements for territories with distinct qualities and to simplistic planar designs that do not meet quality standards. The overall quality of urban architecture is not regulated. The article deals with current spatial modeling methods, their individual parts and principles, the criteria for quality assessment, and their applicability in Vilnius. The text contains an outline of possible building coverage regulations and impact assessment criteria for new development. The article contains a compendium of requirements for high-quality spatial planning and building design.

  5. Modeling and Architectural Design in Agile Development Methodologies

    NARCIS (Netherlands)

    Stojanovic, Z.; Dahanayake, A.; Sol, H

    2003-01-01

    Agile Development Methodologies have been designed to address the problem of delivering high-quality software on time under constantly and rapidly changing requirements in business and IT environments. Agile development processes are characterized by extensive coding practice, intensive communicatio

  6. The Methodology Roles in the Realization of a Model Development Environment

    OpenAIRE

    Arthur, James D.; Nance, Richard E.

    1988-01-01

    The definition of "methodology" is followed by a very brief review of past work in modeling methodologies. The dual role of a methodology is explained: (1) conceptual guidance in the modeling task, and (2) definition of needs for environment designers. A model development environment based on the conical methodology serves as a specific illustration of both roles.

  7. Development of a General Modelling Methodology for Vacuum Residue Hydroconversion

    Directory of Open Access Journals (Sweden)

    Pereira de Oliveira L.

    2013-11-01

    Full Text Available This work concerns the development of a methodology for kinetic modelling of refining processes, and more specifically for vacuum residue conversion. The proposed approach makes it possible to overcome the lack of molecular detail of the petroleum fractions and to simulate the transformation of the feedstock molecules into effluent molecules by means of a two-step procedure. In the first step, a synthetic mixture of molecules representing the feedstock for the process is generated via a molecular reconstruction method, termed SR-REM molecular reconstruction. In the second step, a kinetic Monte-Carlo method (kMC) is used to simulate the conversion reactions on this mixture of molecules. The molecular reconstruction was applied to several petroleum residues and is illustrated for an Athabasca (Canada) vacuum residue. The kinetic Monte-Carlo method is then described in detail. In order to validate this stochastic approach, a lumped deterministic model for vacuum residue conversion was simulated using Gillespie's Stochastic Simulation Algorithm. Despite the fact that the two approaches are based on very different hypotheses, the stochastic simulation algorithm simulates the conversion reactions with the same accuracy as the deterministic approach. The full-scale stochastic simulation approach using molecular-level reaction pathways provides a high level of detail on the effluent composition and is briefly illustrated for Athabasca VR hydrocracking.
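
    Below is a minimal sketch of Gillespie's Stochastic Simulation Algorithm on a toy lumped first-order conversion network (VR -> VGO -> distillate). The lumps, rate constants, and initial counts are invented for illustration; they are not the paper's SR-REM/kMC model.

    ```python
    import numpy as np

    # Gillespie SSA on a hypothetical lumped network: VR -> VGO -> distillate.
    rng = np.random.default_rng(0)
    stoich = np.array([[-1, +1, 0],    # reaction 1: VR -> VGO
                       [0, -1, +1]])   # reaction 2: VGO -> distillate
    k = np.array([1.0e-2, 5.0e-3])     # first-order rate constants, 1/s (illustrative)

    x = np.array([10_000, 0, 0])       # molecule counts: [VR, VGO, distillate]
    t, t_end = 0.0, 600.0
    while t < t_end:
        a = k * x[:2]                  # propensities of the two first-order steps
        a0 = a.sum()
        if a0 == 0.0:
            break                      # nothing left to react
        t += rng.exponential(1.0 / a0) # waiting time to the next reaction event
        j = rng.choice(2, p=a / a0)    # index of the reaction that fires
        x = x + stoich[j]
    print("final counts [VR, VGO, dist]:", x)
    ```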

  8. International orientation on methodologies for modelling developments in road safety.

    NARCIS (Netherlands)

    Reurings, M.C.B. & Commandeur, J.J.F.

    2007-01-01

    This report gives an overview of the models developed in countries other than the Netherlands to evaluate past developments in road traffic safety and to obtain estimates of these developments in the future. These models include classical linear regression and loglinear models as applied in Great Br

  9. Proposal for product development model focused on CE certification methodology

    Directory of Open Access Journals (Sweden)

    Nathalia Marcia Goulart Pinheiro

    2015-09-01

    Full Text Available This paper presents a critical analysis comparing 21 product development models in order to identify whether these structures meet the demands of Product Certification of the European Community (CE). Furthermore, it presents a product development model comprising the steps of the models analyzed, including improvements in activities for the referred product certification. The proposed improvements are justified by the growing quest for the internationalization of products and processes within companies.

  10. Goal Model to Business Process Model: A Methodology for Enterprise Government Tourism System Development

    National Research Council Canada - National Science Library

    Ahmad Nurul Fajar; Imam Marzuki Shofi

    2016-01-01

    .... However, the goal model cannot be used directly to build a business process model. To solve this problem, this paper presents and proposes a methodology for transforming a goal model into a business process model, called the GBPM Methodology...

  11. Improved methodology for developing cost uncertainty models for naval vessels

    OpenAIRE

    Brown, Cinda L.

    2008-01-01

    The purpose of this thesis is to analyze the probabilistic cost model currently in use by NAVSEA 05C to predict cost uncertainty in naval vessel construction and to develop a method that better predicts the ultimate cost risk. The data used to develop the improved approach is collected from analysis of the CG(X) class ship by NAVSEA 05C. The NAVSEA 05C cost risk factors are reviewed and analyzed to determine if different factors are better cost predictors. The impact of data elicitation, t...

  12. A PROPOSED MODEL OF AGILE METHODOLOGY IN SOFTWARE DEVELOPMENT

    OpenAIRE

    Anjali Sharma*, Karambir

    2016-01-01

    Agile software development has been increasing in popularity and replacing traditional methods of software development. This paper presents neural network techniques including General Regression Neural Networks (GRNN), Probabilistic Neural Network (PNN), GMDH Polynomial Neural Network, Cascade Correlation Neural Network, and a machine learning technique, Random Forest. To achieve better prediction of the effort estimation of agile projects, we use Random Forest with the Story Points Approa...
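
    As a concrete illustration of the final step, the sketch below trains a Random Forest on story points to predict effort. The features, values, and target are synthetic; the paper's dataset and exact feature set are not reproduced here.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    # Synthetic agile projects: story points and team velocity -> effort (person-days).
    rng = np.random.default_rng(42)
    story_points = rng.integers(1, 40, size=200).astype(float)
    velocity = rng.uniform(2.0, 5.0, size=200)
    effort = story_points / velocity + rng.normal(0.0, 0.5, size=200)

    X = np.column_stack([story_points, velocity])
    X_tr, X_te, y_tr, y_te = train_test_split(X, effort, random_state=0)

    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("MAE (person-days):", mean_absolute_error(y_te, model.predict(X_te)))
    ```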

  13. A Model-Based Systems Engineering Methodology for Employing Architecture In System Analysis: Developing Simulation Models Using Systems Modeling Language Products to Link Architecture and Analysis

    Science.gov (United States)

    2016-06-01

    ...to model-based systems engineering (MBSE) by formally defining an MBSE methodology for employing architecture in system analysis (MEASA) that presents...

  14. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  15. Development in methodologies for modelling of human and ecotoxic impacts in LCA

    DEFF Research Database (Denmark)

    Hauschild, Michael Zwicky; Huijbregts, Mark; Jolliet, Olivier;

    2009-01-01

    Under the UNEP-SETAC Life Cycle Initiative there is an aim to develop an internationally backed recommended practice of life cycle impact assessment addressing methodological issues like choice of characterization model and characterization factors. In this context, an international comparison wa...

  16. Scenario development methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Eng, T. [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Hudson, J. [Rock Engineering Consultants, Welwyn Garden City, Herts (United Kingdom); Stephansson, O. [Royal Inst. of Tech., Stockholm (Sweden). Div. of Engineering Geology; Skagius, K.; Wiborgh, M. [Kemakta, Stockholm (Sweden)

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are (a) Event tree analysis, (b) Influence diagrams and (c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs.

  17. Developing a methodology to predict PM10 concentrations in urban areas using generalized linear models.

    Science.gov (United States)

    Garcia, J M; Teodoro, F; Cerdeira, R; Coelho, L M R; Kumar, Prashant; Carvalho, M G

    2016-09-01

    A methodology to predict PM10 concentrations in urban outdoor environments is developed based on generalized linear models (GLMs). The methodology is based on the relationship developed between atmospheric concentrations of air pollutants (i.e. CO, NO2, NOx, VOCs, SO2) and meteorological variables (i.e. ambient temperature, relative humidity (RH) and wind speed) for a city (Barreiro) of Portugal. The model uses air pollution and meteorological data from the Portuguese air quality monitoring station networks. The developed GLM considers PM10 concentrations as the dependent variable, and both the gaseous pollutants and meteorological variables as explanatory independent variables. A logarithmic link function was used with a Poisson probability distribution. Particular attention was given to cases with air temperatures below and above 25°C. The best agreement between modelled results and measured data was achieved by the model restricted to air temperatures above 25°C, compared with the model considering all air temperatures and the model considering only temperatures below 25°C. The model was also tested with similar data from another Portuguese city, Oporto, and the results behaved similarly. It is concluded that this model and methodology could be adopted for other cities to predict PM10 concentrations when such data are not available from measurements at air quality monitoring stations or by other acquisition means.
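
    A minimal sketch of the model form described above: a Poisson GLM with a logarithmic link, fitted on the subset with air temperature above 25°C. The data are synthetic placeholders for the monitoring-station records; statsmodels' Poisson family uses the log link by default.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Synthetic stand-in for station data: gaseous pollutants and meteorology -> PM10.
    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "NO2": rng.uniform(5, 60, n),     # ug/m3
        "CO": rng.uniform(0.1, 1.0, n),   # mg/m3
        "temp": rng.uniform(5, 35, n),    # deg C
        "RH": rng.uniform(30, 90, n),     # %
        "wind": rng.uniform(0.5, 8.0, n), # m/s
    })
    df["PM10"] = rng.poisson(np.exp(1.5 + 0.02 * df["NO2"] + 0.5 * df["CO"]
                                    - 0.05 * df["wind"]))

    # Fit only the temp > 25 C cases, the subset the abstract reports performed best.
    subset = df[df["temp"] > 25]
    X = sm.add_constant(subset[["NO2", "CO", "temp", "RH", "wind"]])
    fit = sm.GLM(subset["PM10"], X, family=sm.families.Poisson()).fit()
    print(fit.params)
    ```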

  18. A new methodology for the development of high-latitude ionospheric climatologies and empirical models

    Science.gov (United States)

    Chisham, G.

    2017-01-01

    Many empirical models and climatologies of high-latitude ionospheric processes, such as convection, have been developed over the last 40 years. One common feature in the development of these models is that measurements from different times are combined and averaged on fixed coordinate grids. This methodology ignores the reality that high-latitude ionospheric features are organized relative to the location of the ionospheric footprint of the boundary between open and closed geomagnetic field lines (OCB). This boundary is in continual motion, and the polar cap that it encloses is continually expanding and contracting in response to changes in the rates of magnetic reconnection at the Earth's magnetopause and in the magnetotail. As a consequence, models that are developed by combining and averaging data in fixed coordinate grids heavily smooth the variations that occur near the boundary location. Here we propose that the development of future models should consider the location of the OCB in order to more accurately model the variations in this region. We present a methodology which involves identifying the OCB from spacecraft auroral images and then organizing measurements in a grid where the bins are placed relative to the OCB location. We demonstrate the plausibility of this methodology using ionospheric vorticity measurements made by the Super Dual Auroral Radar Network radars and OCB measurements from the IMAGE spacecraft FUV auroral imagers. This demonstration shows that this new methodology results in sharpening and clarifying features of climatological maps near the OCB location. We discuss the potential impact of this methodology on space weather applications.
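
    The core of the proposed gridding can be sketched in a few lines: each measurement is assigned a latitude relative to the OCB at the same time and local time sector, and averaging is done in that boundary-relative coordinate. The arrays below are synthetic placeholders for the SuperDARN vorticity and IMAGE FUV boundary data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    mlat = rng.uniform(60.0, 90.0, n)     # magnetic latitude of each measurement
    ocb_lat = rng.normal(72.0, 3.0, n)    # OCB latitude at the same time and MLT
    vorticity = rng.normal(0.0, 1e-3, n)  # measured quantity (placeholder)

    rel_lat = mlat - ocb_lat              # boundary-relative coordinate
    edges = np.arange(-10.0, 11.0, 1.0)   # 1-degree bins about the OCB
    idx = np.digitize(rel_lat, edges)

    # Mean vorticity per OCB-relative bin; features near the boundary stay sharp
    # instead of being smeared out as they would be on a fixed-latitude grid.
    profile = {edges[i - 1]: vorticity[idx == i].mean()
               for i in range(1, len(edges)) if np.any(idx == i)}
    print(profile)
    ```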

  19. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    Property prediction models are a fundamental tool of process modeling and analysis, especially at the early stage of process development. Furthermore, property prediction models are the fundamental tool for computer-aided molecular design used for the development of new refrigerants. Group contribution (GC) based prediction methods use structurally dependent parameters in order to determine the property of pure components. The aim of the GC parameter estimation is to find the best possible set of model parameters that fits the experimental data. In that sense, there is often a lack of attention... The GC model uses the Marrero-Gani (MR) method, which considers the group contribution at different levels, both functional and structural. The methodology helps improve the accuracy and reliability of property modeling and provides a rigorous model quality check and assurance. This is expected to further...
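
    In miniature, GC parameter estimation is a least-squares fit of group contributions to experimental property data. The group-occurrence matrix and property values below are invented for illustration; the approximate covariance gives the kind of parameter-uncertainty information the title refers to.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Each row is a molecule, each column a group; p_exp holds measured properties.
    N = np.array([[2.0, 1.0, 0.0],
                  [2.0, 2.0, 0.0],
                  [2.0, 3.0, 1.0],
                  [2.0, 1.0, 1.0]])
    p_exp = np.array([231.1, 261.4, 310.0, 285.7])  # e.g. boiling points, K (made up)

    fit = least_squares(lambda c: N @ c - p_exp, x0=np.zeros(3))
    print("group contributions:", fit.x)

    # Approximate parameter covariance from the Jacobian, cov ~ s2 * (J^T J)^-1,
    # the starting point for confidence intervals on the estimated parameters.
    dof = max(1, len(p_exp) - fit.x.size)
    s2 = 2.0 * fit.cost / dof              # fit.cost = 0.5 * sum(residual**2)
    cov = s2 * np.linalg.inv(fit.jac.T @ fit.jac)
    ```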

  1. A Model-Based Methodology for Spray-Drying Process Development.

    Science.gov (United States)

    Dobry, Dan E; Settell, Dana M; Baumann, John M; Ray, Rod J; Graham, Lisa J; Beyerinck, Ron A

    2009-09-01

    Solid amorphous dispersions are frequently used to improve the solubility and, thus, the bioavailability of poorly soluble active pharmaceutical ingredients (APIs). Spray-drying, a well-characterized pharmaceutical unit operation, is ideally suited to producing solid amorphous dispersions due to its rapid drying kinetics. This paper describes a novel flowchart methodology based on fundamental engineering models and state-of-the-art process characterization techniques that ensure that spray-drying process development and scale-up are efficient and require minimal time and API. This methodology offers substantive advantages over traditional process-development methods, which are often empirical and require large quantities of API and long development times. This approach is also in alignment with the current guidance on Pharmaceutical Development Q8(R1). The methodology is used from early formulation-screening activities (involving milligrams of API) through process development and scale-up for early clinical supplies (involving kilograms of API) to commercial manufacturing (involving metric tons of API). It has been used to progress numerous spray-dried dispersion formulations, increasing bioavailability of formulations at preclinical through commercial scales.

  2. Frescoed Vaults: Accuracy Controlled Simplified Methodology for Planar Development of Three-Dimensional Textured Models

    Directory of Open Access Journals (Sweden)

    Marco Giorgio Bevilacqua

    2016-03-01

    Full Text Available In the field of documentation and preservation of cultural heritage, there is keen interest in 3D metric viewing and rendering of architecture, for both formal appearance and color. On the other hand, the operative steps of restoration interventions still require full-scale, 2D metric surface representations. The transition from 3D to 2D representation, with the related geometric transformations, has not yet been fully formalized for the planar development of frescoed vaults. Methodologies proposed so far on this subject transition from point cloud models to ideal mathematical surfaces and project textures using software tools. The methodology used for geometry and texture development in the present work does not require any dedicated software. The different processing steps can be individually checked for any error introduced, which can then be quantified. A direct accuracy check of the planar development of the frescoed surface has been carried out by qualified restorers, yielding an accuracy of 3 mm. The proposed methodology, although requiring further studies to improve automation of the different processing steps, allowed extracting 2D drafts fully usable by operators restoring the vault frescoes.

  3. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    Science.gov (United States)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from the increase in vehicle traffic, which raised the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper describes the methodology used for developing a visiting vehicle risk model.

  4. A Mapping Model for Transforming Traditional Software Development Methods to Agile Methodology

    Directory of Open Access Journals (Sweden)

    Rashmi Popli

    2013-07-01

    Full Text Available Agility brings responsibility and ownership to individuals, which will eventually bring out effectiveness and efficiency in deliverables. The Agile model is growing in the market at a very good pace. Companies are drifting from traditional Software Development Life Cycle models to an Agile environment for the purpose of attaining quality and for the sake of saving cost and time. The nimble nature of Agile is helpful for frequent releases, satisfying the customer by providing frequent, two-way feedback. In traditional models, the life cycle is properly defined, and the phases are elaborated by specifying the needed input and output parameters. In an Agile environment, on the other hand, the phases are specific to the chosen Agile methodology - Extreme Programming etc. In this paper a common life cycle approach is proposed that is applicable to different kinds of teams. The paper aims to describe a mapping function for mapping traditional methods to Agile methods.

  5. Methodology to develop crash modification functions for road safety treatments with fully specified and hierarchical models.

    Science.gov (United States)

    Chen, Yongsheng; Persaud, Bhagwant

    2014-09-01

    Crash modification factors (CMFs) for road safety treatments are developed as multiplicative factors that are used to reflect the expected changes in safety performance associated with changes in highway design and/or the traffic control features. However, current CMFs have methodological drawbacks. For example, variability with application circumstance is not well understood, and, as important, correlation is not addressed when several CMFs are applied multiplicatively. These issues can be addressed by developing safety performance functions (SPFs) with components of crash modification functions (CM-Functions), an approach that includes all CMF related variables, along with others, while capturing quantitative and other effects of factors and accounting for cross-factor correlations. CM-Functions can capture the safety impact of factors through a continuous and quantitative approach, avoiding the problematic categorical analysis that is often used to capture CMF variability. There are two formulations to develop such SPFs with CM-Function components - fully specified models and hierarchical models. Based on sample datasets from two Canadian cities, both approaches are investigated in this paper. While both model formulations yielded promising results and reasonable CM-Functions, the hierarchical model was found to be more suitable in retaining homogeneity of first-level SPFs, while addressing CM-Functions in sub-level modeling. In addition, hierarchical models better capture the correlations between different impact factors.
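
    A worked miniature of the multiplicative SPF with a CM-Function component (all coefficients invented for illustration): the CM-Function for a continuous factor falls directly out of the fitted model rather than out of a separate categorical analysis.

    ```python
    import numpy as np

    # Hypothetical fitted coefficients of an SPF of the common multiplicative form
    #   mu = exp(b0) * AADT**b1 * exp(b2 * lane_width)
    b0, b1, b2 = -7.5, 0.85, -0.06

    def expected_crashes(aadt, lane_width):
        return np.exp(b0) * aadt ** b1 * np.exp(b2 * lane_width)

    def cm_function(lane_width, ref_width=3.6):
        # CMF(w) = mu(w) / mu(ref) = exp(b2 * (w - ref)): continuous, not categorical.
        return np.exp(b2 * (lane_width - ref_width))

    print(expected_crashes(12_000, 3.3), cm_function(3.3))
    ```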

  6. Scenario Methodology for Modelling of Future Landscape Developments as Basis for Assessing Ecosystem Services

    Directory of Open Access Journals (Sweden)

    Matthias Rosenberg

    2014-04-01

    Full Text Available The ecosystems of our intensively used European landscapes produce a variety of natural goods and services for the benefit of humankind, and secure the basics and quality of life. Because these ecosystems are still undergoing fundamental changes, society has an interest in knowing more about future developments and their ecological impacts. To describe and analyze these changes, scenarios can be developed, and an assessment of the ecological changes can subsequently be carried out. In the project „Landscape Saxony 2050“, a methodology for the construction of exploratory scenarios was worked out. The presented methodology makes it possible to identify the driving forces (socio-cultural, economic and ecological conditions) of landscape development. It indicates possible future paths which lead to a change of structures and processes in the landscape and can influence the capability to provide ecosystem services. One essential feature of the applied technique is that an approach for assessing the effects of landscape changes on ecosystem services is integrated into the developed scenario methodology. Another is that the methodology is strongly participatory, i.e. stakeholders are actively involved. The method is a seven-phase model which provides for the integration of stakeholder participation at all levels of scenario development. The scenario framework was applied to the district of Görlitz, an area of 2100 sq km located on the eastern border of Germany. The region is affected by strong demographic as well as economic changes. The core issue focused on the examination of landscape change in terms of biodiversity. Together with stakeholders, a trend scenario and two alternative scenarios were developed. The changes of the landscape structure are represented in storylines, maps and tables. On the basis of the driving forces of the issue areas „cultural / social values“ and

  7. Methodology Development of a Gas-Liquid Dynamic Flow Regime Transition Model

    Science.gov (United States)

    Doup, Benjamin Casey

    Current reactor safety analysis codes, such as RELAP5, TRACE, and CATHARE, use flow regime maps or flow regime transition criteria that were developed for static, fully-developed two-phase flows to choose the interfacial transfer models that are necessary to solve the two-fluid model. The flow regime is therefore difficult to identify near the flow regime transitions, in developing two-phase flows, and in transient two-phase flows. Interfacial area transport equations were developed to more accurately predict the dynamic nature of two-phase flows. However, other model coefficients are still flow regime dependent. Therefore, an accurate prediction of the flow regime is still important. In the current work, the methodology for the development of a dynamic flow regime transition model is investigated, using the void fraction and interfacial area concentration obtained by solving the three-field two-fluid model and the two-group interfacial area transport equation. To develop this model, detailed local experimental data are obtained, the two-group interfacial area transport equations are revised, and a dynamic flow regime transition model is evaluated using a computational fluid dynamics model. Local experimental data are acquired for 63 different flow conditions in the bubbly, cap-bubbly, slug, and churn-turbulent flow regimes. The measured parameters are the group-1 and group-2 bubble number frequency, void fraction, interfacial area concentration, and interfacial bubble velocities. The measurements are benchmarked by comparing the superficial gas velocities determined using the local measurements with those determined from volumetric flow rate measurements; the agreement is generally within +/-20%. The repeatability of the four-sensor probe construction process is within +/-10%. The repeatability of the measurement process is within +/-7%. The symmetry of the test section is examined and the average agreement is within +/-5.3% at z/D = 10 and +/-3.4% at z/D = 32

  8. Methodology Development for Passive Component Reliability Modeling in a Multi-Physics Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States); Denning, Richard [The Ohio State Univ., Columbus, OH (United States); Catalyurek, Umit [The Ohio State Univ., Columbus, OH (United States); Unwin, Stephen [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-01-23

    Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.

  9. A Model-Based Methodology for Spray-Drying Process Development

    OpenAIRE

    Dobry, Dan E.; Settell, Dana M.; Baumann, John M.; Ray, Rod J.; Graham, Lisa J; Beyerinck, Ron A.

    2009-01-01

    Solid amorphous dispersions are frequently used to improve the solubility and, thus, the bioavailability of poorly soluble active pharmaceutical ingredients (APIs). Spray-drying, a well-characterized pharmaceutical unit operation, is ideally suited to producing solid amorphous dispersions due to its rapid drying kinetics. This paper describes a novel flowchart methodology based on fundamental engineering models and state-of-the-art process characterization techniques that ensure that spray-dr...

  10. Semantic-Driven e-Government: Application of Uschold and King Ontology Building Methodology for Semantic Ontology Models Development

    CERN Document Server

    Fonou-Dombeu, Jean Vincent; 10.5121/ijwest.2011.2401

    2011-01-01

    Electronic government (e-government) has been one of the most active areas of ontology development during the past six years. In e-government, ontologies are being used to describe and specify e-government services (e-services) because they enable easy composition, matching, mapping and merging of various e-government services. More importantly, they also facilitate the semantic integration and interoperability of e-government services. However, it is still unclear in the current literature how an existing ontology building methodology can be applied to develop semantic ontology models in a government service domain. In this paper the Uschold and King ontology building methodology is applied to develop semantic ontology models in a government service domain. Firstly, the Uschold and King methodology is presented, discussed and applied to build a government domain ontology. Secondly, the domain ontology is evaluated for semantic consistency using its semi-formal representation in Description Logic. Thirdly, an...

  11. Epilepsy Therapy Development: Technical and Methodological Issues in Studies with Animal Models

    Science.gov (United States)

    Galanopoulou, Aristea S.; Kokaia, Merab; Loeb, Jeffrey A.; Nehlig, Astrid; Pitkänen, Asla; Rogawski, Michael A.; Staley, Kevin J.; Whittemore, Vicky H.; Dudek, F. Edward

    2013-01-01

    The search for new treatments for seizures, epilepsies and their comorbidities faces considerable challenges. Partly, this is due to gaps in our understanding of the etiology and pathophysiology of most forms of epilepsy. An additional challenge is the difficulty of predicting the efficacy, tolerability and impact of potential new treatments on epilepsies and comorbidities in humans, using the available resources. Here we provide a summary of the discussions and proposals of Working Group 2 as presented in the Joint American Epilepsy Society and International League Against Epilepsy Translational Workshop in London (September 2012). We propose methodological and reporting practices that will enhance the uniformity, reliability and reporting of early stage preclinical studies with animal seizure and epilepsy models that aim to develop and evaluate new therapies for seizures or epilepsies, using multi-disciplinary approaches. The topics considered include: (a) implementation of better study design and reporting practices, (b) incorporation in the study design and analysis of covariates that may impact outcomes (including species, age, sex), (c) utilization of approaches to document target relevance, exposure and engagement by the tested treatment, (d) utilization of clinically relevant treatment protocols, (e) optimization of the use of video-EEG recordings to best meet the study goals, and (f) inclusion of outcome measures that address the tolerability of the treatment or study endpoints apart from seizures. We further discuss the different expectations for studies aiming to meet regulatory requirements to obtain approval for clinical testing in humans. Implementation of the rigorous practices discussed in this report will require considerable investment in time, funds and other research resources, which may create challenges for academic researchers seeking to contribute to epilepsy therapy discovery and development. We propose several infrastructure

  12. Epilepsy therapy development: technical and methodologic issues in studies with animal models.

    Science.gov (United States)

    Galanopoulou, Aristea S; Kokaia, Merab; Loeb, Jeffrey A; Nehlig, Astrid; Pitkänen, Asla; Rogawski, Michael A; Staley, Kevin J; Whittemore, Vicky H; Dudek, F Edward

    2013-08-01

    The search for new treatments for seizures, epilepsies, and their comorbidities faces considerable challenges. This is due in part to gaps in our understanding of the etiology and pathophysiology of most forms of epilepsy. An additional challenge is the difficulty in predicting the efficacy, tolerability, and impact of potential new treatments on epilepsies and comorbidities in humans, using the available resources. Herein we provide a summary of the discussions and proposals of the Working Group 2 as presented in the Joint American Epilepsy Society and International League Against Epilepsy Translational Workshop in London (September 2012). We propose methodologic and reporting practices that will enhance the uniformity, reliability, and reporting of early stage preclinical studies with animal seizure and epilepsy models that aim to develop and evaluate new therapies for seizures or epilepsies, using multidisciplinary approaches. The topics considered include the following: (1) implementation of better study design and reporting practices; (2) incorporation in the study design and analysis of covariates that may influence outcomes (including species, age, sex); (3) utilization of approaches to document target relevance, exposure, and engagement by the tested treatment; (4) utilization of clinically relevant treatment protocols; (5) optimization of the use of video-electroencephalography (EEG) recordings to best meet the study goals; and (6) inclusion of outcome measures that address the tolerability of the treatment or study end points apart from seizures. We further discuss the different expectations for studies aiming to meet regulatory requirements to obtain approval for clinical testing in humans. Implementation of the rigorous practices discussed in this report will require considerable investment in time, funds, and other research resources, which may create challenges for academic researchers seeking to contribute to epilepsy therapy discovery and

  13. Soft Systems Methodology and Problem Framing: Development of an Environmental Problem Solving Model Respecting a New Emergent Reflexive Paradigm.

    Science.gov (United States)

    Gauthier, Benoit; And Others

    1997-01-01

    Identifies the more representative problem-solving models in environmental education. Suggests the addition of a strategy for defining a problem situation using Soft Systems Methodology to environmental education activities explicitly designed for the development of critical thinking. Contains 45 references. (JRH)

  14. Development of CCF modeling and analysis methodology for diverse system status

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Tae Jin; Byun, Si Sub; Yoon, Tae Kwan [Soongsil University, Seoul (Korea); Moon, Jae Pil [Seoul National University, Seoul (Korea)

    1999-04-01

    The objective of this project is to develop a procedure for modeling and analyzing CCF efficiently according to various system statuses. CCF events change as the system status changes due to maintenance, accidents, or alternating success criteria for various missions. The objective of the first year's research was to develop a CCF model for various system statuses. We reviewed and evaluated current CCF models, and analyzed their merits and deficiencies in modeling various system statuses. An approximate model was developed as the CCF model; it is compatible with the MGL model. An extensive sensitivity study shows the accuracy and efficiency of the proposed model. The second year's research aimed at the development of an integrated CCF procedure for PSA and risk monitors. We developed an adaptive method for the approximate model in a k/m/G system with multiple common cause groups. The accuracy of the method is proved by comparison with the implicit method. Next, we developed a method for modeling CCF in a fault tree. Three alternatives were considered, and it proved most efficient to model the CCF events under the gate of the individual component failure. We then provide a method for estimating the CCF probability, and developed software for this purpose. We finally provide a fundamental procedure for modeling CCF in a risk monitor. The modeling procedure was applied to the HPSI system, and proved to be efficient and accurate. (author). 48 refs., 11 figs., 53 tabs.
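
    For reference, a sketch of the Multiple Greek Letter (MGL) parameterization that the report's approximate model is compatible with. The group size and parameter values below are illustrative, not taken from the report.

    ```python
    from math import comb

    def mgl_probabilities(q_total, rho):
        """Q_k for a common cause group of size m under the MGL model:
        Q_k = (1/C(m-1, k-1)) * (prod_{i=1..k} rho_i) * (1 - rho_{k+1}) * Q_t,
        with rho_1 = 1 and rho_{m+1} = 0; rho = [beta, gamma, ...]."""
        m = len(rho) + 1
        r = [1.0] + list(rho) + [0.0]
        qs = []
        for k in range(1, m + 1):
            prod = 1.0
            for i in range(k):
                prod *= r[i]                  # rho_1 * ... * rho_k
            qs.append(prod * (1.0 - r[k]) * q_total / comb(m - 1, k - 1))
        return qs

    # m = 3, beta = 0.1, gamma = 0.2 (illustrative): prints Q_1, Q_2, Q_3
    print(mgl_probabilities(1e-3, [0.1, 0.2]))
    ```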

  15. Computational simulation methodologies for mechanobiological modelling: a cell-centred approach to neointima development in stents.

    Science.gov (United States)

    Boyle, C J; Lennon, A B; Early, M; Kelly, D J; Lally, C; Prendergast, P J

    2010-06-28

    The design of medical devices could be very much improved if robust tools were available for computational simulation of tissue response to the presence of the implant. Such tools require algorithms to simulate the response of tissues to mechanical and chemical stimuli. Available methodologies include those based on the principle of mechanical homeostasis, those which use continuum models to simulate biological constituents, and the cell-centred approach, which models cells as autonomous agents. In the latter approach, cell behaviour is governed by rules based on the state of the local environment around the cell, informed by experiment. Simulating tissue growth and differentiation requires simulating many of these cells together. In this paper, the methodology and applications of cell-centred techniques--with particular application to mechanobiology--are reviewed, and a cell-centred model of tissue formation in the lumen of an artery in response to the deployment of a stent is presented. The method is capable of capturing some of the most important aspects of restenosis, including nonlinear lesion growth with time. The approach taken in this paper provides a framework for simulating restenosis; the next step will be to couple it with more patient-specific geometries and quantitative parameter data.
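
    A minimal cell-centred sketch: cells on a lattice act as autonomous agents that divide into free neighbouring sites with some probability, the basic mechanism behind lesion-growth simulations of this kind. All rules and rates are illustrative, not the paper's calibrated model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    size, p_divide = 50, 0.3
    grid = np.zeros((size, size), dtype=int)
    grid[size // 2, size // 2] = 1           # seed cell at an injured site

    for step in range(100):
        for i, j in np.argwhere(grid == 1):
            if rng.random() < p_divide:      # rule: divide with fixed probability
                ni = (i + rng.integers(-1, 2)) % size
                nj = (j + rng.integers(-1, 2)) % size
                if grid[ni, nj] == 0:
                    grid[ni, nj] = 1         # daughter cell takes the free site
    print("occupied sites after 100 steps:", int(grid.sum()))
    ```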

  16. A Mapping Model for Transforming Traditional Software Development Methods to Agile Methodology

    National Research Council Canada - National Science Library

    Rashmi Popli; Anita; Naresh Chauhan

    2013-01-01

    .... The Agile model is growing in the market at a very good pace. Companies are drifting from traditional Software Development Life Cycle models to an Agile environment for the purpose of attaining quality and for the sake of saving cost and time...

  17. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and the software development lifecycle. The ISDF methodology was built especially for innovative software development projects and was developed empirically, by trial and error, in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user's, project owner's and project manager's points of view. The main components of a software development methodology are identified: a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation-oriented software development methodology is emphasized by highlighting the shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. To properly assess each quality component, a dedicated indicator is built, and a template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  18. Development of a cross-section methodology and a real-time core model for VVER-1000 simulator application

    Energy Technology Data Exchange (ETDEWEB)

    Georgieva, Emiliya Lyudmilova

    2016-06-06

    The novel academic contributions are summarized as follows. A) A cross-section modelling methodology and a cycle-specific cross-section update procedure are developed to meet fidelity requirements applicable to a cycle-specific reactor core simulation, as well as particular customer needs and practices supporting VVER-1000 operation and safety. B) A real-time version of the Nodal Expansion Method code is developed and implemented into Kozloduy 6 full-scope replica control room simulator.

  19. Developing Risk Prediction Models for Postoperative Pancreatic Fistula: a Systematic Review of Methodology and Reporting Quality.

    Science.gov (United States)

    Wen, Zhang; Guo, Ya; Xu, Banghao; Xiao, Kaiyin; Peng, Tao; Peng, Minhao

    2016-04-01

    Postoperative pancreatic fistula is still a major complication after pancreatic surgery, despite improvements in surgical technique and perioperative management. We sought to systematically review and critically assess the conduct and reporting of methods used to develop risk prediction models for predicting postoperative pancreatic fistula. We conducted a systematic search of the PubMed and EMBASE databases to identify articles published before January 1, 2015, which described the development of models to predict the risk of postoperative pancreatic fistula. We extracted information on the development of each prediction model, including study design, sample size and number of events, definition of postoperative pancreatic fistula, risk predictor selection, missing data, model-building strategies, and model performance. Seven studies developing seven risk prediction models were included. In three studies (42 %), the number of events per variable was less than 10. The number of candidate risk predictors ranged from 9 to 32. Five studies (71 %) reported using univariate screening, which is not recommended in building a multivariate model, to reduce the number of risk predictors. Six risk prediction models (86 %) were developed by categorizing all continuous risk predictors. The treatment and handling of missing data were not mentioned in any study. We found use of inappropriate methods that could endanger the development of the model, including univariate pre-screening of variables, categorization of continuous risk predictors, and inadequate model validation. The use of inappropriate methods affects the reliability and the accuracy of the probability estimates for predicting postoperative pancreatic fistula.

  20. Expert Systems Development Methodology

    Science.gov (United States)

    1989-07-28

    ...expert systems has been hardware development. In the middle 1950's, at the very birth of AI, hardware was large, very slow, and extremely expensive. In... ...into another report. For example, MOBPLEX provides output into the Lotus spreadsheet as a semi-automated destination. From the spreadsheet the user of... ...designed on top of the Lotus 1-2-3 interface. Lotus was used because it was decided there was no need to build a powerful ad hoc report generator...

  1. Development and application of a statistical methodology to evaluate the predictive accuracy of building energy baseline models

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.; Price, Phillip N. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Energy Technologies Area Div.

    2014-03-01

    This paper documents the development and application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings. The methodology complements the principles addressed in resources such as ASHRAE Guideline 14 and the International Performance Measurement and Verification Protocol. It requires fitting a baseline model to data from a "training period" and using the model to predict total electricity consumption during a subsequent "prediction period." We illustrate the methodology by evaluating five baseline models using data from 29 buildings. The training period and prediction period were varied, and model predictions of daily, weekly, and monthly energy consumption were compared to meter data to determine model accuracy. Several metrics were used to characterize the accuracy of the predictions, and in some cases the best-performing model as judged by one metric was not the best performer when judged by another metric.
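
    The evaluation loop can be sketched briefly: fit on a training period, predict a later period, and score against meter data with normalized metrics such as NMBE and CV(RMSE) in the spirit of Guideline 14. The simple change-point-style temperature regression below is a stand-in, not one of the paper's five models.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    days = 365
    temp = 15 + 10 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 2, days)
    load = 500 + 20 * np.clip(temp - 18, 0, None) + rng.normal(0, 30, days)  # kWh/day

    train, test = slice(0, 180), slice(180, days)     # training vs prediction period
    X = np.column_stack([np.ones(days), np.clip(temp - 18, 0, None)])
    coef, *_ = np.linalg.lstsq(X[train], load[train], rcond=None)

    resid = load[test] - X[test] @ coef
    nmbe = resid.sum() / (resid.size * load[test].mean())        # bias metric
    cv_rmse = np.sqrt(np.mean(resid ** 2)) / load[test].mean()   # scatter metric
    print(f"NMBE = {nmbe:.2%}, CV(RMSE) = {cv_rmse:.2%}")
    ```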

  2. Physiologically Based Pharmacokinetic Modeling: Methodology, Applications, and Limitations with a Focus on Its Role in Pediatric Drug Development

    Directory of Open Access Journals (Sweden)

    Feras Khalil

    2011-01-01

    Full Text Available The concept of physiologically based pharmacokinetic (PBPK) modeling was introduced years ago, but it has not been practiced widely. However, interest in and implementation of this modeling technique have grown, as evidenced by the increased number of publications in this field. This paper briefly demonstrates the methodology, applications, and limitations of PBPK modeling, with special attention given to the use of PBPK models in pediatric drug development, and some examples are described in detail. Although PBPK models do have some limitations, the potential benefit of the PBPK modeling technique is huge. PBPK models can be applied to investigate drug pharmacokinetics under different physiological and pathological conditions or in different age groups, to support decision-making during drug discovery, to provide, perhaps most importantly, data that can save time and resources, especially in early drug development phases and in pediatric clinical trials, and potentially to help clinical trials become more “confirmatory” rather than “exploratory”.

  3. Methodologies for Development of Patient Specific Bone Models from Human Body CT Scans

    Science.gov (United States)

    Chougule, Vikas Narayan; Mulay, Arati Vinayak; Ahuja, Bharatkumar Bhagatraj

    2016-06-01

    This work deals with the development of an algorithm for the physical replication of patient-specific human bone, and the construction of corresponding implant/insert RP models, using a Reverse Engineering approach from non-invasive medical images for surgical purposes. In the medical field, volumetric data, i.e. voxel and triangular facet based models, are primarily used for bio-modelling and visualization, which requires huge memory space. On the other hand, recent advances in Computer Aided Design (CAD) technology provide additional facilities/functions for the design, prototyping and manufacturing of any object having freeform surfaces, based on boundary representation techniques. This work presents a process for the physical replication of 3D rapid prototyping (RP) models of human bone using various CAD modeling techniques, developed from 3D point cloud data obtained from non-invasive CT/MRI scans in DICOM 3.0 format. This point cloud data is used for the construction of a 3D CAD model by fitting B-spline curves through these points and then fitting surfaces between these curve networks by using swept blend techniques. Alternatively, a triangular mesh can be generated directly from the 3D point cloud data, without developing any surface model, using commercial CAD software. The STL file generated from the 3D point cloud data is used as the basic input for the RP process. The Delaunay tetrahedralization approach is used to process the 3D point cloud data to obtain the STL file. CT scan data of a metacarpus (human bone) is used as the case study for the generation of the 3D RP model. A 3D physical model of the human bone is generated on a rapid prototyping machine and its virtual reality model is presented for visualization. The CAD models generated by the different techniques are compared for accuracy and reliability. The results of this research work are assessed for clinical reliability in the replication of human bone in the medical field.
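
    One plausible pipeline from a CT series to an STL file is sketched below. It substitutes marching cubes for the Delaunay tetrahedralization route described above; the file paths, slice count, and 300 HU bone threshold are assumptions.

    ```python
    import numpy as np
    import pydicom
    from skimage import measure
    from stl import mesh  # numpy-stl

    # Load and sort the CT series, then convert pixel values to Hounsfield units.
    slices = [pydicom.dcmread(f"ct/slice_{i:03d}.dcm") for i in range(120)]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    volume = np.stack([s.pixel_array * s.RescaleSlope + s.RescaleIntercept
                       for s in slices])

    # Extract the bone surface at an assumed threshold of 300 HU.
    verts, faces, _, _ = measure.marching_cubes(volume, level=300.0)

    # Pack the triangles into an STL mesh ready for a rapid prototyping machine.
    bone = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
    for k, f in enumerate(faces):
        bone.vectors[k] = verts[f]
    bone.save("metacarpus.stl")
    ```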

  4. MODELS OF THE 5 PORTERS COMPETITIVE FORCES METHODOLOGY CHANGES IN COMPANIES STRATEGY DEVELOPMENT ON COMPETITIVE MARKET

    Directory of Open Access Journals (Sweden)

    Sergey I Zubin

    2014-01-01

    Full Text Available This article considers different approaches to developing Porter's Five Forces model. The authors take up the reasons why researchers hold a negative attitude toward this instrument, and introduce changes to the model that can help companies find the best way to grow in a competitive market.

  5. A Probabilistic Ontology Development Methodology

    Science.gov (United States)

    2014-06-01


  6. Experimental stress analysis for materials and structures stress analysis models for developing design methodologies

    CERN Document Server

    Freddi, Alessandro; Cristofolini, Luca

    2015-01-01

    This book summarizes the main methods of experimental stress analysis and examines their application to various states of stress of major technical interest, highlighting aspects not always covered in the classic literature. It is explained how experimental stress analysis assists in the verification and completion of analytical and numerical models, the development of phenomenological theories, the measurement and control of system parameters under operating conditions, and identification of causes of failure or malfunction. Cases addressed include measurement of the state of stress in models, measurement of actual loads on structures, verification of stress states in circumstances of complex numerical modeling, assessment of stress-related material damage, and reliability analysis of artifacts (e.g. prostheses) that interact with biological systems. The book will serve graduate students and professionals as a valuable tool for finding solutions when analytical solutions do not exist.

  7. ISE System Development Methodology Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  8. A flexible hydrological modelling system developed using an object oriented methodology

    Energy Technology Data Exchange (ETDEWEB)

    Rinde, Trond

    1998-12-31

    The report presents a software system called Process Integrating Network (PINE). The capabilities, working principles, programming technical design and principles of use of the system are described, as are some practical applications. PINE is a simulation tool for modelling of hydrological and hydrologically related phenomena. The system is based on object-oriented programming principles and was specially designed to provide freedom in the choice of model structures and algorithms for process descriptions. It supports full freedom with regard to spatial distribution and temporal resolution. Geographical information systems (GIS) may be integrated with PINE in order to provide full spatial distribution in system parametrisation, process simulation and visualisation of simulation results. Simulation models are developed by linking components for process description together in a structure. The system can handle compound working media such as water with chemical or biological constituents. Non-hydrological routines may then be included to describe the responses of such constituents. Features such as extensibility and reuse of program components are emphasised in the program design. Separation between process topology, process descriptions and process data facilitates simple and consistent implementation of components for process description. Such components may be automatically prototyped, and their response functions may be implemented without knowledge of other parts of the program system and without the need to program import or export routines or a user interface. Model extension is thus a rapid process that does not require extensive programming skills. Components for process descriptions may further be placed in separate program libraries, which can be included in the program as required. The program system can thus be very compact while still having a large number of process algorithms available. The system can run on both PC and UNIX platforms. 106 figs., 20
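    The component-linking idea behind PINE can be pictured with a minimal object-oriented sketch. The interface and the two toy processes below are invented for illustration and are not PINE's actual classes; the point is that components share one interface and are assembled into a structure, so algorithms can be swapped without touching the rest of the model.

```python
# Minimal sketch of the object-oriented idea the abstract describes:
# process components share one interface and are linked into a structure.
# Class names and process equations here are illustrative only.
from abc import ABC, abstractmethod

class ProcessComponent(ABC):
    @abstractmethod
    def step(self, state: dict) -> dict:
        """Advance the process one time step, reading/updating shared state."""

class DegreeDaySnow(ProcessComponent):
    def step(self, state):
        melt = max(0.0, 2.5 * state["temp_c"])      # mm/day, toy coefficient
        state["water_input"] = state["precip_mm"] + melt
        return state

class LinearReservoir(ProcessComponent):
    def __init__(self, k=0.3):
        self.k, self.storage = k, 0.0
    def step(self, state):
        self.storage += state["water_input"]
        state["runoff_mm"] = self.k * self.storage   # outflow proportional to storage
        self.storage -= state["runoff_mm"]
        return state

# A "model" is just an ordered structure of interchangeable components.
model = [DegreeDaySnow(), LinearReservoir(k=0.3)]
state = {"temp_c": 4.0, "precip_mm": 6.0}
for component in model:
    state = component.step(state)
print(f"runoff = {state['runoff_mm']:.2f} mm")
```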

  9. Methodology for Developing Hydrological Models Based on an Artificial Neural Network to Establish an Early Warning System in Small Catchments

    Directory of Open Access Journals (Sweden)

    Ivana Sušanj

    2016-01-01

    Full Text Available In some situations there is no possibility of hazard mitigation, especially if the hazard is induced by water. It is therefore important to prevent consequences via an early warning system (EWS) that announces the possible occurrence of a hazard. The aim and objective of this paper are to investigate the possibility of implementing an EWS in a small-scale catchment and to develop a methodology for building a hydrological prediction model based on an artificial neural network (ANN) as an essential part of the EWS. The methodology is implemented in the case study of the Slani Potok catchment, which is historically recognized as a hazard-prone area, by establishing continuous monitoring of meteorological and hydrological parameters to collect data for the training, validation, and evaluation of the prediction capabilities of the ANN model. The model is validated and evaluated by visual inspection, by common calculation approaches, and by a newly proposed assessment. This new evaluation is based on separating the observed data into classes around the mean data value, on the percentages of data above or below the mean, and on the mean absolute error.
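    A hedged sketch of this kind of ANN prediction model follows: a multilayer perceptron mapping lagged rainfall and water-level observations to the next water level, evaluated with the mean absolute error and an above/below-mean class check in the spirit of the paper's proposed evaluation. The synthetic data, lag structure and network size are assumptions, not the Slani Potok setup.

```python
# Sketch of an ANN one-step-ahead water-level predictor on synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
rain = rng.gamma(shape=0.8, scale=4.0, size=2000)          # synthetic rainfall (mm)
level = np.convolve(rain, [0.5, 0.3, 0.2], mode="same")    # toy catchment response

# Features: rainfall and level at lags 1..3; target: the next water level.
lags = 3
X = np.column_stack([rain[i:-lags + i] for i in range(lags)] +
                    [level[i:-lags + i] for i in range(lags)])
y = level[lags:]

split = int(0.8 * len(y))
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])

pred = model.predict(X[split:])
print("MAE:", mean_absolute_error(y[split:], pred))
# Class-based check in the spirit of the paper's evaluation: agreement on
# whether values fall above or below the mean of the observations.
mean_level = y[:split].mean()
print("above/below-mean agreement:",
      np.mean((y[split:] > mean_level) == (pred > mean_level)))
```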

  10. Enviro-HIRLAM online integrated meteorology-chemistry modelling system: strategy, methodology, developments and applications (v7.2)

    Science.gov (United States)

    Baklanov, Alexander; Smith Korsholm, Ulrik; Nuterman, Roman; Mahura, Alexander; Pagh Nielsen, Kristian; Hansen Sass, Bent; Rasmussen, Alix; Zakey, Ashraf; Kaas, Eigil; Kurganskiy, Alexander; Sørensen, Brian; González-Aparicio, Iratxe

    2017-08-01

    The Environment - High Resolution Limited Area Model (Enviro-HIRLAM) is developed as a fully online integrated numerical weather prediction (NWP) and atmospheric chemical transport (ACT) model for research and forecasting of joint meteorological, chemical and biological weather. The integrated modelling system is developed by the Danish Meteorological Institute (DMI) in collaboration with several European universities. It is the baseline system in the HIRLAM Chemical Branch and used in several countries and different applications. The development was initiated at DMI more than 15 years ago. The model is based on the HIRLAM NWP model with online integrated pollutant transport and dispersion, chemistry, aerosol dynamics, deposition and atmospheric composition feedbacks. To make the model suitable for chemical weather forecasting in urban areas, the meteorological part was improved by implementation of urban parameterisations. The dynamical core was improved by implementing a locally mass-conserving semi-Lagrangian numerical advection scheme, which improves forecast accuracy and model performance. The current version (7.2), in comparison with previous versions, has a more advanced and cost-efficient chemistry, aerosol multi-compound approach, aerosol feedbacks (direct and semi-direct) on radiation and (first and second indirect effects) on cloud microphysics. Since 2004, the Enviro-HIRLAM has been used for different studies, including operational pollen forecasting for Denmark since 2009 and operational forecasting atmospheric composition with downscaling for China since 2017. Following the main research and development strategy, further model developments will be extended towards the new NWP platform - HARMONIE. Different aspects of online coupling methodology, research strategy and possible applications of the modelling system, and fit-for-purpose model configurations for the meteorological and air quality communities are discussed.

  11. Enviro-HIRLAM online integrated meteorology–chemistry modelling system: strategy, methodology, developments and applications (v7.2)

    Directory of Open Access Journals (Sweden)

    A. Baklanov

    2017-08-01

    Full Text Available The Environment – High Resolution Limited Area Model (Enviro-HIRLAM) is developed as a fully online integrated numerical weather prediction (NWP) and atmospheric chemical transport (ACT) model for research and forecasting of joint meteorological, chemical and biological weather. The integrated modelling system is developed by the Danish Meteorological Institute (DMI) in collaboration with several European universities. It is the baseline system in the HIRLAM Chemical Branch and used in several countries and different applications. The development was initiated at DMI more than 15 years ago. The model is based on the HIRLAM NWP model with online integrated pollutant transport and dispersion, chemistry, aerosol dynamics, deposition and atmospheric composition feedbacks. To make the model suitable for chemical weather forecasting in urban areas, the meteorological part was improved by implementation of urban parameterisations. The dynamical core was improved by implementing a locally mass-conserving semi-Lagrangian numerical advection scheme, which improves forecast accuracy and model performance. The current version (7.2), in comparison with previous versions, has a more advanced and cost-efficient chemistry, aerosol multi-compound approach, aerosol feedbacks (direct and semi-direct) on radiation and (first and second indirect effects) on cloud microphysics. Since 2004, the Enviro-HIRLAM has been used for different studies, including operational pollen forecasting for Denmark since 2009 and operational forecasting of atmospheric composition with downscaling for China since 2017. Following the main research and development strategy, further model developments will be extended towards the new NWP platform – HARMONIE. Different aspects of online coupling methodology, research strategy and possible applications of the modelling system, and fit-for-purpose model configurations for the meteorological and air quality communities are discussed.

  12. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Almeida, João Paolo A.; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  13. Towards an MDA-based development methodology

    NARCIS (Netherlands)

    Gavras, Anastasius; Belaunde, Mariano; Ferreira Pires, Luis; Andrade Almeida, João; Oquendo, Flavio; Warboys, Brian C.; Morrison, Ron

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  14. Development of trip coverage analysis methodology - CATHENA trip coverage analysis model

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H.; Huh, J. Y.; Na, Y. H.; Lee, S. Y.; Kim, B. G.; Kim, H. H.; Kim, S. W.; Bae, C. J.; Kim, T. M.; Kim, S. R.; Han, B. S.; Moon, B. J.; Oh, M. T. [Korea Power Engineering Co., Yongin (Korea)

    2001-05-01

    This report describes the CATHENA model for trip coverage analysis. The model is prepared based on the Wolsong 2 design data and consists of the primary heat transport system, shutdown system, steam and feedwater system, reactor regulating system, heat transport pressure and inventory control system, and steam generator level and pressure control system. The new features and the parts modified from the Wolsong 2 CATHENA LOCA model that are required for trip coverage analysis are described. The model is tested by simulation of steady state at 100% FP and at several low powers. The cases of power rundown and power runup are also tested. 17 refs., 124 figs., 19 tabs. (Author)

  15. Development of a modelling methodology for simulation of long-term morphological evolution of the southern Baltic coast

    Science.gov (United States)

    Zhang, Wenyan; Harff, Jan; Schneider, Ralf; Wu, Chaoyu

    2010-10-01

    The Darss-Zingst peninsula at the southern Baltic Sea is a typical wave-dominated barrier island system comprising an outer barrier island and an inner lagoon. The formation of the Darss-Zingst peninsula dates back to the onset of the Littorina Transgression, about 8,000 cal BP. It originated from several discrete islands, has been reshaped by littoral currents and wind-induced waves during the last 8,000 years, and has evolved into the complex barrier island system seen today; thus, it may serve as an example for studying coastal evolution under long-term climate change. A methodology for developing a long-term (decadal-to-centennial) process-based morphodynamic model for the southern Baltic coastal environment is presented here. The methodology consists of two main components: (1) a preliminary analysis of the key processes driving the morphological evolution of the study area, based on statistical analysis of meteorological data and sensitivity studies; (2) a multi-scale high-resolution process-based model. The process-based model is structured into eight main modules. The two-dimensional vertically integrated circulation module, the wave module, the bottom boundary layer module, the sediment transport module, the cliff erosion module and the nearshore storm module are real-time calculation modules that resolve the short-term processes. A bathymetry update module and a long-term control function set, in which the ‘reduction’ concepts and a technique for morphological update acceleration are implemented, are integrated to up-scale the effects of short-term processes to a decadal-to-centennial scale. A series of multi-scale modelling strategies are implemented in the application of the model to the research area. Successful hindcast of the coastline change of the Darss-Zingst peninsula over the last 300 years validates the modelling methodology. Model results indicate that the coastline change of the Darss-Zingst peninsula is dominated by mechanisms acting on different
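    The morphological update acceleration ('reduction') idea can be sketched as a loop in which bed changes from a short hydrodynamic step are scaled by a morphological factor, so that centuries of evolution are covered by far fewer flow simulations. The bed-change function and the factor value below are illustrative assumptions, not the model's actual modules.

```python
# Sketch of morphological update acceleration: bed changes computed over a
# short hydrodynamic step are multiplied by an acceleration factor so that
# decades-to-centuries of morphology need far fewer flow simulations.
import numpy as np

def short_term_bed_change(bathymetry: np.ndarray, wave_energy: float) -> np.ndarray:
    """Toy stand-in for the circulation/wave/sediment-transport modules:
    change proportional to wave energy, smoothed diffusively."""
    smoothed = np.convolve(bathymetry, [0.25, 0.5, 0.25], mode="same")
    return 1e-4 * wave_energy * (smoothed - bathymetry)

MORFAC = 100.0                       # one simulated day counts as 100 days
bathy = np.linspace(-10.0, 0.0, 50)  # initial cross-shore profile (m)
years = 300
steps = int(years * 365 / MORFAC)    # hydrodynamic steps actually simulated

for step in range(steps):
    wave_energy = 1.0 + 0.5 * np.sin(2 * np.pi * step / 365)  # seasonal forcing
    bathy += MORFAC * short_term_bed_change(bathy, wave_energy)

print("final mean depth:", bathy.mean())
```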

  16. Methodological and empirical developments for the Ratcliff diffusion model of response times and accuracy

    NARCIS (Netherlands)

    Wagenmakers, E.-J.

    2009-01-01

    The Ratcliff diffusion model for simple two-choice decisions (e.g., Ratcliff, 1978; Ratcliff & McKoon, 2008) has two outstanding advantages. First, the model generally provides an excellent fit to the observed data (i.e., response accuracy and the shape of RT distributions, both for correct and error responses).

  17. Developing a Cost Model and Methodology to Estimate Capital Costs for Thermal Energy Storage

    Energy Technology Data Exchange (ETDEWEB)

    Glatzmaier, G.

    2011-12-01

    This report provides an update to the previous cost model for thermal energy storage (TES) systems. The update allows NREL to estimate the costs of systems that are compatible with the higher operating temperatures associated with advanced power cycles. The goal of the Department of Energy (DOE) Solar Energy Technology Program is to develop solar technologies that can make a significant contribution to the United States domestic energy supply. The recent DOE SunShot Initiative sets a very aggressive cost goal: a Levelized Cost of Energy (LCOE) of 6 cents/kWh by 2020, with no incentives or credits, for all solar-to-electricity technologies. As this goal is reached, the share of utility power generation provided by renewable energy sources is expected to increase dramatically. Because Concentrating Solar Power (CSP) is currently the only renewable technology capable of integrating cost-effective energy storage, it is positioned to play a key role in providing renewable, dispatchable power to utilities as the share of power generation from renewable sources increases. Because of this role, future CSP plants will likely have as much as 15 hours of Thermal Energy Storage (TES) included in their design and operation. As such, the cost and performance of the TES system is critical to meeting the SunShot goal for solar technologies. The cost of electricity from a CSP plant depends strongly on its overall efficiency, which is a product of two components: the collection efficiency and the conversion efficiency. The collection efficiency determines the portion of incident solar energy that is captured as high-temperature thermal energy. The conversion efficiency determines the portion of thermal energy that is converted to electricity. The operating temperature at which the overall efficiency reaches its maximum depends on many factors, including material properties of the CSP plant components. Increasing the operating temperature of the power generation
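    The efficiency trade-off stated above can be made concrete with a toy calculation: overall efficiency is the product of a collection efficiency that falls with temperature and a conversion efficiency that rises with it, so an optimum operating temperature exists. Every coefficient below is hypothetical.

```python
# Back-of-envelope sketch of the efficiency relationship: overall
# solar-to-electric efficiency = collection efficiency x conversion
# efficiency, with the two terms pulling in opposite directions.
def collection_efficiency(t_celsius: float) -> float:
    # Receiver losses grow with temperature (radiative/convective losses).
    return max(0.0, 0.75 - 2.0e-4 * t_celsius)

def conversion_efficiency(t_celsius: float) -> float:
    # Power-cycle efficiency rises with temperature (Carnot-like trend).
    t_hot, t_cold = t_celsius + 273.15, 320.0
    return 0.7 * (1.0 - t_cold / t_hot)   # 70% of the Carnot limit, assumed

best = max(range(300, 901, 25),
           key=lambda t: collection_efficiency(t) * conversion_efficiency(t))
print("temperature maximizing overall efficiency:", best, "C")
```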

  18. Development and application of compact models of packages based on DELPHI methodology

    CERN Document Server

    Parry, J; Shidore, S

    1997-01-01

    The accurate prediction of the temperatures of critical electronic parts at the package-, board- and system-level is seriously hampered by the lack of reliable, standardised input data for the characterisation of the thermal behaviour of these parts. The recently completed collaborative European project, DELPHI, has been concerned with the creation and experimental validation of thermal models (both detailed and compact) of a range of electronic parts, including mono-chip packages. This paper demonstrates the reliable performance of thermal compact models in a range of applications, by comparison with the detailed models from which they were derived. (31 refs).

  19. Assessment Methodology, Context, and Empowerment: The ACE Model of Skill Development.

    Science.gov (United States)

    Wagner, Sharon L.; Moffett, Richard G. III

    2000-01-01

    The Assessment, Context, and Empowerment Model provides students with opportunities to practice communication, interpersonal, and problem-solving skills in relevant contexts related to the workplace. They receive developmental feedback from themselves, their peers, and their instructor. (SK)

  20. Recent developments in imaging system assessment methodology, FROC analysis and the search model.

    Science.gov (United States)

    Chakraborty, Dev P

    2011-08-21

    A frequent problem in imaging is assessing whether a new imaging system is an improvement over an existing standard. Observer performance methods, in particular the receiver operating characteristic (ROC) paradigm, are widely used in this context. In ROC analysis, lesion location information is not used, and consequently scoring ambiguities can arise in tasks, such as nodule detection, that involve finding localized lesions. This paper reviews progress in the free-response ROC (FROC) paradigm, in which the observer marks and rates suspicious regions and the location information is used to determine whether lesions were correctly localized. Reviewed are FROC data analysis, a search model for simulating FROC data, predictions of the model, and a method for estimating its parameters. The search model parameters are physically meaningful quantities that can guide system optimization.
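    The FROC operating points themselves are simple to compute from mark-rating data: the sketch below derives the lesion localization fraction (LLF) and non-lesion localizations per image (NLF) as the reporting threshold is varied. The small mark set and the lesion/image totals are invented for illustration.

```python
# Illustrative computation of free-response operating points from
# mark-rating data, in the spirit of the FROC paradigm reviewed above.

# Each mark: (rating, is_lesion_localization). Totals are study-level counts.
marks = [(4, True), (3, False), (5, True), (2, False), (4, False),
         (1, True), (3, True), (5, False), (2, True)]
n_lesions, n_images = 6, 4  # hypothetical totals

for threshold in sorted({r for r, _ in marks}, reverse=True):
    kept = [(r, ll) for r, ll in marks if r >= threshold]
    llf = sum(ll for _, ll in kept) / n_lesions      # fraction of lesions found
    nlf = sum(not ll for _, ll in kept) / n_images   # false marks per image
    print(f"threshold {threshold}: LLF={llf:.2f}, NLF={nlf:.2f}")
```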

  1. Supply Chain Modeling: Downstream Risk Assessment Methodology (DRAM) Summary of Development and Application

    Science.gov (United States)

    2015-04-01

    Stockpile Requirements,” for the Strategic Materials Office of the Defense Logistics Agency (DLA). The views, opinions, and findings should not be... Logistics Agency Strategic Materials (DLA SM) to provide the capability to analyze supply chains of strategic and critical materials and the... material feedstock requirements. The fundamental approach is mass flow analysis, from raw material through each step in production. The model must also

  2. Development of a fluidized bed agglomeration modeling methodology to include particle-level heterogeneities in ash chemistry and granular physics

    Science.gov (United States)

    Khadilkar, Aditi B.

    The utility of fluidized bed reactors for combustion and gasification can be enhanced if operational issues such as agglomeration are mitigated. Monetary and efficiency losses could be avoided through a mechanistic understanding of the agglomeration process and prediction of the operational conditions that promote it. Pilot-scale experimentation prior to operation for each specific condition can be cumbersome and expensive, so the development of a mathematical model would aid predictions. With this motivation, the study comprised the following model development stages: 1) development of an agglomeration modeling methodology based on binary particle collisions; 2) study of heterogeneities in ash chemical composition and gaseous atmosphere; 3) computation of a distribution of particle collision frequencies based on granular physics for a polydisperse particle size distribution; 4) combination of the ash chemistry and granular physics inputs to obtain agglomerate growth probabilities; and 5) validation of the modeling methodology. The modeling methodology comprises testing every binary particle collision in the system for sticking, based on the extent to which the particles' kinetic energy is dissipated viscously by the slag-liquid (molten ash) covering the particles. In the methodology developed in this study, thermodynamic equilibrium calculations are used to estimate the amount of slag-liquid in the system, and changes in particle collision frequencies are accounted for by continuously tracking the number density of the various particle sizes. The heterogeneities in the chemical composition of fuel ash were studied by separating the bulk fuel into particle classes that are rich in specific minerals. FactSage simulations were performed on two bituminous coals and an anthracite to understand the effect of particle-level heterogeneities on agglomeration. The mineral matter behavior of these constituent classes was studied
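    A minimal version of the binary-collision sticking test can be written down with an Ennis-type viscous Stokes number, used here as a stand-in for the study's own energy-dissipation balance: a collision sticks when the kinetic energy scale falls below what the slag layer can dissipate. All parameter values below are assumed.

```python
# Sketch of a binary-collision sticking test: a collision "sticks" when the
# viscous Stokes number falls below its critical value, i.e. the slag layer
# dissipates the particles' kinetic energy. Every value is illustrative.
import math

def sticks(radius_m, rel_velocity, slag_viscosity,
           rho_p=2500.0, layer_thickness=5e-6, asperity=0.5e-6,
           restitution=0.8) -> bool:
    """True if the viscous Stokes number is below its critical value."""
    st_v = (8 * rho_p * radius_m * rel_velocity) / (9 * slag_viscosity)
    st_crit = (1 + 1 / restitution) * math.log(layer_thickness / asperity)
    return st_v < st_crit

# Higher slag viscosity and lower impact velocity favor sticking:
print(sticks(radius_m=200e-6, rel_velocity=0.5, slag_viscosity=10.0))  # True
print(sticks(radius_m=200e-6, rel_velocity=5.0, slag_viscosity=0.1))   # False
```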

  3. Practical implications of rapid development methodologies

    CSIR Research Space (South Africa)

    Gerber, A

    2007-11-01

    Full Text Available Rapid development methodologies are popular approaches for the development of modern software systems. The goals of these methodologies are the inclusion of the client in the analysis, design and implementation activities, as well...

  4. Application of agile methodologies in software development

    Directory of Open Access Journals (Sweden)

    Jovanović Aca D.

    2016-01-01

    Full Text Available The paper presents the potential for developing software using agile methodologies. Special consideration is devoted to the potential and advantages of using the Scrum methodology in software development and to the relationship between the implementation of agile methodologies and software development projects.

  5. Soft-systems thinking for community-development decision making: A participative, computer-based modeling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cook, R.J.

    1987-01-01

    The normative-rational models used to ensure logical decision processes do not capture the complex nature of planning situations, and alternative methodologies that can improve the collection and use of qualitative data are scarce. The intent of this thesis is to design and apply a methodology that may help planners incorporate such data into policy analysis. To guide the application and allow for its evaluation, criteria are gleaned from the literature on computer modeling, human cognition, and group process. From this, a series of individual and group ideation techniques along with two computer-modeling procedures are combined to aid participant understanding and provide computation capabilities. The methodology is applied in the form of a case study in Door County, Wisconsin. The process and its results were evaluated by workshop participants and by three planners who were intent on using this information to help update a county master plan. Based on established criteria, their evaluations indicate that the soft-systems methodology devised in this thesis has potential for improving the collection and use of qualitative data for public-policy purposes.

  6. TOWARDS A NEW METHODOLOGY FOR WEB GIS DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Fanon Ananda

    2016-07-01

    Full Text Available There has been an increasing need for geospatial information delivered through internet technologies. This broad category of systems is referred to as Web Geographic Information Systems (Web GIS). These systems exhibit characteristics common to both stand-alone and web-based systems, making it necessary to apply a hybrid methodology during their development. This paper proposes a methodology for developing Web GIS, referred to here as the Y-Model Web GIS Development Methodology (YWDM), which has been adapted from existing software development methodologies and applied to the context of Web GIS development. The paper outlines the phases of the methodology in detail. Its viability as a methodology has been tested through its use in the implementation of the Emuhaya Web GIS portal. The methodology presented here is not intended as a rigid guide for Web GIS development; instead, it provides a useful framework for guiding the process.

  7. Developing a new methodology to characterize in vivo the passive mechanical behavior of abdominal wall on an animal model.

    Science.gov (United States)

    Simón-Allué, R; Montiel, J M M; Bellón, J M; Calvo, B

    2015-11-01

    The most common surgical repair of an abdominal wall hernia involves implanting a mesh that substitutes for the abdominal muscle/fascia while it is healing. To reduce the risk of relapse or possible complications, this mesh needs to mimic the mechanical behavior of the muscle/fascia, which at present is not fully determined. The aim of this work is to develop a methodology to characterize in vivo the passive mechanical behavior of the abdominal wall. For that purpose, New Zealand rabbits were subjected to pneumoperitoneum tests, taking the inner pressure from 0 mmHg to 12 mmHg, values similar to those used in human laparoscopies. The treated animals were divided into two groups: healthy animals, and herniated animals with a surgical mesh (polypropylene Surgipro(TM), Covidien) previously implanted. All experiments were recorded by a stereo rig composed of two synchronized cameras. During post-processing of the images, several points over the abdominal surface were tracked and their coordinates extracted for different levels of internal pressure. From these, a three-dimensional model of the abdominal wall was reconstructed. Pressure-displacement curves, radii of curvature and strain fields were also analysed. During the experiments, the animals' tissue deformed mostly during the first pressure levels, showing the noticeable hyperelastic passive behavior of the abdominal muscles. Comparison between healthy and herniated specimens displayed a strong stiffening for herniated animals in the zone where the high-density mesh was situated. The cameras were able to discern this change, so this method can be used to measure the possible effect of other meshes.

  8. A Comprehensive Methodology for Development, ParameterEstimation, and Uncertainty Analysis of Group Contribution Based Property Models -An Application to the Heat of Combustion

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Marcarie, Camille; Abildskov, Jens;

    2016-01-01

    A rigorous methodology is developed that addresses numerical and statistical issues when developing group contribution (GC) based property models, such as regression methods, optimization algorithms, performance statistics, outlier treatment, parameter identifiability, and uncertainty analysis of the prediction. The methodology is evaluated through development of a GC method for the prediction of the heat of combustion (ΔHco) for pure components. The results showed that robust regression led to the best performance statistics for parameter estimation. The bootstrap method is found to be a valid alternative for calculating parameter estimation errors when the underlying distribution of residuals is unknown. Inclusion of higher-order groups (additional parameters) does not always improve the prediction accuracy of the GC models; in some cases, it may even increase the prediction error (hence worse prediction accuracy). However, additional parameters do not affect the calculated 95% confidence interval. Last but not least, the newly developed GC model of the heat of combustion (ΔHco) shows...

  9. A NEW PRODUCT DEVELOPMENT METHODOLOGY BASED ON PRODUCT MASTER MODEL

    Institute of Scientific and Technical Information of China (English)

    袁清珂; 李建雨

    2006-01-01

    Product Master Model (PMM) technology is a philosophy and methodology for organizing, managing and controlling product development processes and for sharing the information and knowledge of the product and its processes. In this paper, the concept of PMM is proposed, the basic theory of PMM technology is studied, a blackboard architecture for PMM is proposed, the product development process framework is explored, and a prototype PMM system is developed.

  10. A Structured Methodology for Spreadsheet Modelling

    CERN Document Server

    Knight, Brian; Rajalingham, Kamalesen

    2008-01-01

    In this paper, we discuss the problem of the software engineering of a class of business spreadsheet models. A methodology for structured software development is proposed, which is based on structured analysis of data, represented as Jackson diagrams. It is shown that this analysis allows a straightforward modularisation, and that individual modules may be represented with indentation in the block-structured form of structured programs. The benefits of structured format are discussed, in terms of comprehensibility, ease of maintenance, and reduction in errors. The capability of the methodology to provide a modular overview in the model is described, and examples are given. The potential for a reverse-engineering tool, to transform existing spreadsheet models is discussed.

  11. Photovoltaic module energy rating methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L. [National Renewable Energy Lab., Golden, CO (United States); Whitaker, C.; Newmiller, J. [Endecon Engineering, San Ramon, CA (United States)

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.

  12. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary Business Process Modeling approaches is identified, and general modeling principles that can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained, together with the significant problems met during the project. Drawing on these problems and the results of the methodology evaluation, the needed future development of the methodology is outlined.

  13. Modeling Virtual Organization Architecture with the Virtual Organization Breeding Methodology

    Science.gov (United States)

    Paszkiewicz, Zbigniew; Picard, Willy

    While Enterprise Architecture Modeling (EAM) methodologies become more and more popular, an EAM methodology tailored to the needs of virtual organizations (VO) is still to be developed. Among the most popular EAM methodologies, TOGAF has been chosen as the basis for a new EAM methodology taking into account the characteristics of VOs presented in this paper. In this new methodology, referred to as the Virtual Organization Breeding Methodology (VOBM), concepts developed within the ECOLEAD project, e.g. the concept of a Virtual Breeding Environment (VBE) or the VO creation schema, serve as fundamental elements for the development of VOBM. VOBM is a generic methodology that should be adapted to a given VBE. VOBM defines the structure of VBE and VO architectures in a service-oriented environment, as well as an architecture development method for virtual organizations (ADM4VO). Finally, a preliminary set of tools and methods for VOBM is given in this paper.

  14. Modeling Virtual Organization Architecture with the Virtual Organization Breeding Methodology

    CERN Document Server

    Paszkiewicz, Zbigniew

    2011-01-01

    While Enterprise Architecture Modeling (EAM) methodologies become more and more popular, an EAM methodology tailored to the needs of virtual organizations (VO) is still to be developed. Among the most popular EAM methodologies, TOGAF has been chosen as the basis for a new EAM methodology taking into account the characteristics of VOs presented in this paper. In this new methodology, referred to as the Virtual Organization Breeding Methodology (VOBM), concepts developed within the ECOLEAD project, e.g. the concept of a Virtual Breeding Environment (VBE) or the VO creation schema, serve as fundamental elements for the development of VOBM. VOBM is a generic methodology that should be adapted to a given VBE. VOBM defines the structure of VBE and VO architectures in a service-oriented environment, as well as an architecture development method for virtual organizations (ADM4VO). Finally, a preliminary set of tools and methods for VOBM is given in this paper.

  15. ENACTED SOFTWARE DEVELOPMENT PROCESS BASED ON AGILE AND AGENT METHODOLOGIES

    OpenAIRE

    DR. NACHAMAI. M; M. Senthil Vadivu; VINITA TAPASKAR

    2011-01-01

    Software engineering provides the procedures and practices to be followed in software development and acts as a backbone for computer science engineering techniques. This paper deals with current trends in software engineering methodologies: the agile and agent-oriented software development processes. The agile methodology aims to meet the dynamically changing requirements of customers. This model is iterative and incremental and accepts changes in requirements at any stage of development. ...

  16. Developing Methodologies for Applying TRMM-Estimated Precipitation Data to Hydrological Modeling of a South TX Watershed - Initial Results

    Science.gov (United States)

    Tobin, K. J.; Bennett, M. E.

    2007-05-01

    Previous experience with hydrological modeling in South Texas, which is located along the Texas-Mexico border, suggests that NWS ground measurements are too widely scattered to provide reliable precipitation input for modeling. In addition, a significant fraction of the study region is located at the edge of the coverage envelopes of the NWS NEXRAD weather radars present in the region, limiting the accuracy of these systems in providing reliable precipitation estimates. Therefore, we are exploring whether TRMM-estimated precipitation data (3B42), in some form, can be used to support hydrological modeling in the Middle Rio Grande and Nueces River Basin watersheds. We have begun our modeling efforts by focusing on the middle Nueces watershed (7,770 sq km). To model this largely rural watershed we selected the Soil and Water Assessment Tool (SWAT). Three precipitation datasets were selected for our initial model runs: (1) the nearest NWS cooperative hourly rain gauge data, (2) three-hourly TRMM 3B42 estimated precipitation, and (3) combined TRMM 3B42/NWS rain gauge datasets in which ground measurements are used for three-hourly periods lacking high-quality satellite microwave precipitation estimates, as determined from TRMM 3G68 data. The three datasets were aggregated into average daily precipitation estimates for each TRMM grid cell. Manual calibration was completed, achieving model results that yield realistic monthly and annual water balances with both the gauge and the satellite-estimated precipitation datasets. In the future, we plan to use the newly developed automatic calibration routine for SWAT, which is based on the Shuffled Complex Evolution algorithm, to optimize the modeled discharge results from this study.
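    The automatic calibration step can be sketched as a global search minimizing the error between simulated and observed discharge. SCE-UA itself is not shipped with SciPy, so differential evolution stands in as the optimizer; the two-parameter bucket model and synthetic data are assumptions for illustration.

```python
# Sketch of automatic calibration of a rainfall-runoff model against
# observed discharge, with differential evolution standing in for SCE-UA.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)
rain = rng.gamma(0.7, 5.0, size=365)            # synthetic daily rainfall (mm)

def simulate(params, rain):
    """Toy bucket model: runoff coefficient k, loss fraction f."""
    k, f = params
    storage, flow = 0.0, np.empty_like(rain)
    for t, p in enumerate(rain):
        storage += (1.0 - f) * p
        flow[t] = k * storage
        storage -= flow[t]
    return flow

# Synthetic "observations" from known parameters plus noise.
observed = simulate([0.25, 0.4], rain) + rng.normal(0, 0.1, size=365)

def objective(params):
    # Sum of squared errors between simulated and observed discharge.
    return np.sum((simulate(params, rain) - observed) ** 2)

result = differential_evolution(objective, bounds=[(0.01, 0.99), (0.0, 0.9)],
                                seed=0)
print("calibrated k, f:", result.x)
```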

  17. GAJA: 3D CAD methodology for developing a parametric system for the automatic (re)modeling of the cutting components of compound washer dies

    Institute of Scientific and Technical Information of China (English)

    David POTOČNIK; Bojan DOLŠAK; Miran ULBIN

    2013-01-01

    Because the design of cutting dies is a complex and experience-based process, it is poorly supported by conventional 3D CAD software. Thus, the majority of design activities, including the (re)modeling of those cutting-die components that are directly responsible for performing shaping operations on a sheet-metal stamping part, traditionally still need to be carried out repetitively, separately, and manually by the designer. To eliminate some of these drawbacks and upgrade the capabilities of conventional 3D CAD software, this paper proposes a new methodology for the development of a parametric system capable of automatically performing the (re)modeling process of compound washer dies' cutting components. The presented methodology integrates CATIA V5 built-in modules, including Part Design, Assembly Design and Knowledge Advisor, a publication mechanism, and compound cutting-die design knowledge. The system developed by this methodology represents an 'intelligent' assembly template composed of two modules, GAJA1 and GAJA2. GAJA1 is responsible for the direct input of the die-design problem regarding the shape, dimensions and material of the stamping part, its extraction in the form of geometric features, and the transfer of the relevant design parameters and features to the module GAJA2. GAJA2 interprets the current values of the input parameters and automatically performs the modeling process for the cutting-die components, using die-design knowledge and the company's internal design and manufacturing standards. Experimental results show that this system significantly shortens the modeling time for cutting-die components, improves modeling quality, and enables the training of inexperienced designers.

  18. Development of a methodology for electronic waste estimation: A material flow analysis-based SYE-Waste Model.

    Science.gov (United States)

    Yedla, Sudhakar

    2016-01-01

    With improved living standards and a growing share of the services sector in Asian economies, the use of electronic equipment is on the rise, resulting in increased electronic waste generation. A peculiarity of electronic waste is that it retains 'significant' value even after its lifetime and, to add complication, even after its extended life in its 'dump' stage. Thus, in Indian conditions, after its lifetime is over, e-material changes hands more than once and finally ends up either with informal recyclers or in the store rooms of urban dwellings. This character makes it extremely difficult to estimate electronic waste generation. The present study develops a functional model based on a material flow analysis approach by considering all possible end uses of the material and its transformed goods, finally arriving at disposal. It considers the various degrees of use derived from e-goods: primary use (lifetime), secondary use (first-degree extension of life), third-hand use (second-degree extension of life), donation, retention at the respective places (without discarding), the fraction shifted to scrap vendors, and the components reaching the final dump site from the various end points of use. This generic functional model, named the SYE-Waste Model and developed on a material flow analysis approach, can be used to derive 'obsolescence factors' for the various degrees of usage of e-goods and to make a comprehensive estimation of electronic waste in any city/country.
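    The material-flow logic of such a model can be sketched as a chain of use stages, each routing a fraction of devices onward (reuse), to storage, or to disposal; the obsolescence factors then fall out of the stage fractions. The stage names mirror the abstract, but every numeric fraction below is a made-up assumption.

```python
# Hedged sketch of a material-flow e-waste estimate: sales flow through
# successive use stages, and each stage routes a fraction onward (reuse),
# to storage, or to disposal.
sales_units = 100_000          # devices entering primary use in a year

stages = [
    # (stage name, fraction to next stage, fraction stored, fraction discarded)
    ("primary use",    0.50, 0.20, 0.30),
    ("secondary use",  0.40, 0.25, 0.35),
    ("third-hand use", 0.00, 0.30, 0.70),
]

in_stage, stored, discarded = float(sales_units), 0.0, 0.0
for name, to_next, to_store, to_dump in stages:
    assert abs(to_next + to_store + to_dump - 1.0) < 1e-9
    stored += to_store * in_stage
    discarded += to_dump * in_stage
    in_stage *= to_next                # devices surviving into the next stage

print(f"stored (store rooms): {stored:,.0f}")
print(f"e-waste reaching disposal: {discarded:,.0f}")
```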

  19. Software development methodology for high consequence systems

    Energy Technology Data Exchange (ETDEWEB)

    Baca, L.S.; Bouchard, J.F.; Collins, E.W.; Eisenhour, M.; Neidigk, D.D.; Shortencarier, M.J.; Trellue, P.A.

    1997-10-01

    This document describes a Software Development Methodology for High Consequence Systems. A High Consequence System is a system whose failure could lead to serious injury, loss of life, destruction of valuable resources, unauthorized use, damaged reputation or loss of credibility or compromise of protected information. This methodology can be scaled for use in projects of any size and complexity and does not prescribe any specific software engineering technology. Tasks are described that ensure software is developed in a controlled environment. The effort needed to complete the tasks will vary according to the size, complexity, and risks of the project. The emphasis of this methodology is on obtaining the desired attributes for each individual High Consequence System.

  20. Developing Digital Interventions: A Methodological Guide

    Directory of Open Access Journals (Sweden)

    Katherine Bradbury

    2014-01-01

    Full Text Available Digital interventions are becoming an increasingly popular method of delivering healthcare as they enable and promote patient self-management. This paper provides a methodological guide to the processes involved in developing effective digital interventions, detailing how to plan and develop such interventions to avoid common pitfalls. It demonstrates the need for mixed qualitative and quantitative methods in order to develop digital interventions which are effective, feasible, and acceptable to users and stakeholders.

  1. [Adaptive clinical study methodologies in drug development].

    Science.gov (United States)

    Antal, János

    2015-11-29

    The evolution of drug development in human clinical-phase studies prompts an overview of the technologies and procedures that are labelled adaptive clinical trials. This overview discusses the most relevant procedural and operational aspects from a clinical-methodological point of view.

  2. Development of a methodology for microstructural description

    Directory of Open Access Journals (Sweden)

    Vanderley de Vasconcelos

    1999-07-01

    Full Text Available A systematic methodology for microstructural description can aid the task of obtaining the processing × microstructure × properties × performance relationships. There are, however, some difficulties in performing this task, related mainly to the following three factors: the complexity of the interactions between microstructural features; difficulties in evaluating geometric parameters of microstructural features; and difficulties in relating these geometric parameters to process variables. To solve some of these problems, a methodology is proposed that embodies the following features: it takes into account the different possible types of approaches to the microstructural description problem; includes concepts and tools of Total Quality Management; is supported by systems analysis techniques; and makes use of computer modeling and simulation and statistical design-of-experiments tools. The methodology was applied to evaluating some topological parameters during the sintering process, and its results were compared with available experimental data.

  3. Model evaluation methodology applicable to environmental assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Shaeffer, D.L.

    1979-08-01

    A model evaluation methodology is presented to provide a systematic framework within which the adequacy of environmental assessment models might be examined. The necessity for such a tool is motivated by the widespread use of models for predicting the environmental consequences of various human activities and by the reliance on these model predictions for deciding whether a particular activity requires the deployment of costly control measures. Consequently, the uncertainty associated with prediction must be established for the use of such models. The methodology presented here consists of six major tasks: model examination, algorithm examination, data evaluation, sensitivity analyses, validation studies, and code comparison. This methodology is presented in the form of a flowchart to show the logical interrelatedness of the various tasks. Emphasis has been placed on identifying those parameters which are most important in determining the predictive outputs of a model. Importance has been attached to the process of collecting quality data. A method has been developed for analyzing multiplicative chain models when the input parameters are statistically independent and lognormally distributed. Latin hypercube sampling has been offered as a promising candidate for doing sensitivity analyses. Several different ways of viewing the validity of a model have been presented. Criteria are presented for selecting models for environmental assessment purposes.
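    The Latin hypercube sampling idea offered above pairs naturally with the multiplicative chain model mentioned: the sketch below samples independent lognormal inputs by LHS, forms the product, and ranks parameters by their correlation with the log-output. The parameter count and spreads are illustrative assumptions.

```python
# Sketch of LHS-based sensitivity analysis of a multiplicative chain model
# with independent, lognormally distributed input parameters.
import numpy as np
from scipy.stats import qmc, lognorm

n_params, n_samples = 4, 1000
sampler = qmc.LatinHypercube(d=n_params, seed=0)
u = sampler.random(n=n_samples)                 # uniform samples in [0, 1)^d

# Map uniforms through lognormal inverse CDFs (independent inputs).
sigmas = [0.2, 0.5, 0.3, 0.8]                   # assumed log-scale spreads
x = np.column_stack([lognorm.ppf(u[:, i], s=s) for i, s in enumerate(sigmas)])

y = np.prod(x, axis=1)                          # multiplicative chain model

# For a product of lognormals, the log-output is linear in the log-inputs,
# so correlations on the log scale identify the dominant parameters.
for i, s in enumerate(sigmas):
    r = np.corrcoef(np.log(x[:, i]), np.log(y))[0, 1]
    print(f"parameter {i} (sigma={s}): correlation with log-output {r:.2f}")
```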

  4. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  5. Ontology-Based Classification System Development Methodology

    Directory of Open Access Journals (Sweden)

    Grabusts Peter

    2015-12-01

    Full Text Available The aim of the article is to analyse and develop an ontology-based classification system methodology that uses decision tree learning with statement propositionalized attributes. Classical decision tree learning algorithms, as well as decision tree learning with taxonomy and propositionalized attributes, have been examined. Thus, domain ontology can be extracted from the data sets and used for data classification with the help of a decision tree. The use of ontology methods in decision-tree-based classification systems has been researched. Using such methodologies, the classification accuracy can in some cases be improved.
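    The decision-tree half of the approach is easy to demonstrate; the sketch below trains a tree on a standard dataset and prints the induced split rules, which are the raw material an ontology-based system would lift from attribute values to ontology concepts. The dataset choice is an assumption for illustration only.

```python
# Minimal decision-tree sketch: train a tree and read off its split rules.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(data.data, data.target)

# The learned split rules are what an ontology-based system would map
# to concepts from the domain ontology.
print(export_text(tree, feature_names=list(data.feature_names)))
print("training accuracy:", tree.score(data.data, data.target))
```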

  6. FAA Development of Reliable Modeling Methodologies for Fan Blade Out Containment Analysis. Part 2; Ballistic Impact Testing

    Science.gov (United States)

    Revilock, Duane M.; Pereira, J. Michael

    2008-01-01

    This report summarizes the ballistic impact testing that was conducted to provide validation data for the development of numerical models of blade-out events in fabric containment systems. The ballistic impact response of two different fiber materials, Kevlar 49 (E.I. DuPont Nemours and Company) and Zylon AS (Toyobo Co., Ltd.), was studied by firing metal projectiles into dry woven fabric specimens using a gas gun. The shape, mass, orientation and velocity of the projectile were varied and recorded. In most cases the tests were designed such that the projectile would perforate the specimen, allowing measurement of the energy absorbed by the fabric. The results for both Zylon and Kevlar presented here represent a useful set of data for establishing and validating numerical models for predicting the response of fabrics under conditions simulating those of a jet engine blade release event. In addition, some useful empirical observations were made regarding the effects of projectile orientation and the relative performance of the different materials.

  7. Advances in the Development and Application of Computational Methodologies for Structural Modeling of G-Protein Coupled Receptors

    Science.gov (United States)

    Mobarec, Juan Carlos

    2009-01-01

    Background: Despite the large amount of experimental data accumulated in the past decade on G-protein coupled receptor (GPCR) structure and function, understanding of the molecular mechanisms underlying GPCR signaling is still far from complete, thus impairing the design of effective and selective pharmaceuticals. Objective: Understanding of GPCR function has been challenged even further by more recent experimental evidence that several of these receptors are organized in the cell membrane as homo- or hetero-oligomers, and that they may exhibit unique pharmacological properties. Given the complexity of these new signaling systems, researchers' efforts are turning increasingly to molecular modeling, bioinformatics and computational simulations for mechanistic insights into GPCR functional plasticity. Methods: We review here current advances in the development and application of computational approaches to improve prediction of GPCR structure and dynamics, thus enhancing current understanding of GPCR signaling. Results/Conclusions: Models resulting from the use of these computational approaches, further supported by experiments, are expected to help elucidate the complex allosterism that propagates through GPCR complexes, ultimately aiming at successful structure-based rational drug design. PMID:19672320

  8. Development and Validation of Methodology to Model Flow in Ventilation Systems Commonly Found in Nuclear Facilities. Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Strons, Philip [Argonne National Lab. (ANL), Argonne, IL (United States); Bailey, James L. [Argonne National Lab. (ANL), Argonne, IL (United States); Davis, John [Argonne National Lab. (ANL), Argonne, IL (United States); Grudzinski, James [Argonne National Lab. (ANL), Argonne, IL (United States); Hlotke, John [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-03-01

    In this work, we apply CFD to model airflow and particulate transport. This modeling is then compared with field validation studies to both inform and validate the modeling assumptions. Based on the results of the field tests, the modeling assumptions and boundary conditions are refined and the process is repeated until the results are found to be reliable with a high level of confidence.

  9. Development of artificial neural network models based on experimental data of response surface methodology to establish the nutritional requirements of digestible lysine, methionine, and threonine in broiler chicks.

    Science.gov (United States)

    Mehri, M

    2012-12-01

    An artificial neural network (ANN) approach was used to develop feed-forward multilayer perceptron models to estimate the nutritional requirements of digestible lysine (dLys), methionine (dMet), and threonine (dThr) in broiler chicks. Sixty data lines representing the response of broiler chicks between 3 and 16 d of age to dietary levels of dLys (0.88-1.32%), dMet (0.42-0.58%), and dThr (0.53-0.87%) were obtained from the literature and used to train the networks. The predictions of the ANN were compared with those of response surface methodology to evaluate the fitness of the two methods. The models were tested using R(2), mean absolute deviation, mean absolute percentage error, and absolute average deviation. The random search algorithm was used to optimize the developed ANN models to estimate the optimal dietary levels of dLys, dMet, and dThr. The ANN models were also used to assess the relative importance of each dietary input on bird performance using sensitivity analysis. The statistical evaluations revealed the higher accuracy of the ANN in predicting bird performance compared with the response surface methodology models. The optimization results showed that maximum BW gain may be obtained with dietary levels of 1.11, 0.51, and 0.78% of dLys, dMet, and dThr, respectively. Minimum feed conversion ratio may be achieved with dietary levels of 1.13, 0.54, and 0.78% of dLys, dMet, and dThr, respectively. The sensitivity analysis on the models indicated that dietary Lys is the most important variable in the growth performance of broiler chicks, followed by dietary Thr and Met. The results of this research revealed that the experimental data of a response-surface-methodology design can be successfully used to develop a well-designed ANN for pattern recognition of bird growth and optimization of nutritional requirements. The comparison between the two methods also showed that the statistical methods may have little effect on the ideal ratios of dMet and dThr to dLys in
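    The random-search optimization step reads as follows in miniature: sample candidate diets within the experimental ranges and keep the one the trained response model scores best. Since the fitted ANN is not available here, a peaked quadratic response surface stands in for model.predict(), and all coefficients are invented.

```python
# Sketch of random-search optimization over a trained response model.
import numpy as np

rng = np.random.default_rng(0)
bounds = {"dLys": (0.88, 1.32), "dMet": (0.42, 0.58), "dThr": (0.53, 0.87)}

def predicted_gain(d):
    # Stand-in for a fitted model's predict(): a peaked quadratic surface.
    opt = {"dLys": 1.11, "dMet": 0.51, "dThr": 0.78}   # assumed optimum
    return 100.0 - sum(((d[k] - opt[k]) / 0.1) ** 2 for k in d)

best, best_gain = None, -np.inf
for _ in range(10_000):                                 # random search
    diet = {k: rng.uniform(*b) for k, b in bounds.items()}
    gain = predicted_gain(diet)
    if gain > best_gain:
        best, best_gain = diet, gain

print({k: round(v, 3) for k, v in best.items()}, round(best_gain, 2))
```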

  10. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    Gavras, A.; Belaunde, M.; Ferreira Pires, L.; Andrade Almeida, J.P.; van Sinderen, M.J.; Ferreira Pires, L.

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  11. Towards an MDA-based development methodology for distributed applications

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Gavras, A.; Belaunde, M.; Ferreira Pires, Luis; Andrade Almeida, João

    2004-01-01

    This paper proposes a development methodology for distributed applications based on the principles and concepts of the Model-Driven Architecture (MDA). The paper identifies phases and activities of an MDA-based development trajectory, and defines the roles and products of each activity in accordance

  12. A methodology for modeling regional terrorism risk.

    Science.gov (United States)

    Chatterjee, Samrat; Abkowitz, Mark D

    2011-07-01

    Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States.

  13. Ontology-Based Classification System Development Methodology

    OpenAIRE

    2015-01-01

    The aim of the article is to analyse and develop an ontology-based classification system methodology that uses decision tree learning with statement propositionalized attributes. Classical decision tree learning algorithms, as well as decision tree learning with taxonomy and propositionalized attributes, have been examined. Thus, domain ontology can be extracted from the data sets and used for data classification with the help of a decision tree. The use of ontology methods in decision ...

  14. Methodological guidelines for developing accident modification functions

    DEFF Research Database (Denmark)

    Elvik, Rune

    2015-01-01

    This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines. An example is given of how to use them. ... The main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures.

  15. A Comprehensive Methodology for Development, Parameter Estimation, and Uncertainty Analysis of Group Contribution Based Property Models -An Application to the Heat of Combustion

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Marcarie, Camille; Abildskov, Jens

    2016-01-01

    A rigorous methodology is developed that addresses numerical and statistical issues when developing group contribution (GC) based property models such as regression methods, optimization algorithms, performance statistics, outlier treatment, parameter identifiability, and uncertainty...... to calculate parameter estimation errors when underlying distribution of residuals is unknown. Many parameters (first,second, third order group contributions) are found unidentifiable from the typically available data, with large estimation error bounds and significant correlation. Due to this poor parameter...... identifiability issues, reporting of the 95% confidence intervals of the predicted property values should be mandatory as opposed to reporting only single value prediction, currently the norm in literature. Moreover, inclusion of higher order groups (additional parameters) does not always lead to improved...

  16. Baseline methodologies for clean development mechanism projects

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M.K. (ed.); Shrestha, R.M.; Sharma, S.; Timilsina, G.R.; Kumar, S.

    2005-11-15

    The Kyoto Protocol and the Clean Development Mechanism (CDM) came into force on 16th February 2005 with its ratification by Russia. The increasing momentum of this process is reflected in more than 100 projects having been submitted to the CDM Executive Board (CDM-EB) for approval of the baselines and monitoring methodologies, which is the first step in developing and implementing CDM projects. A CDM project should result in a net decrease of GHG emissions below any level that would have resulted from other activities implemented in the absence of that CDM project. The 'baseline' defines the GHG emissions of activities that would have been implemented in the absence of a CDM project. The baseline methodology is the process/algorithm for establishing that baseline. The baseline and the baseline methodology are thus the most critical elements of any CDM project towards meeting the important criterion of the CDM: that a CDM project should result in 'real, measurable, and long-term benefits related to the mitigation of climate change'. This guidebook is produced within the framework of the United Nations Environment Programme (UNEP) facilitated 'Capacity Development for the Clean Development Mechanism (CD4CDM)' Project. This document is published as part of the project's effort to develop guidebooks that cover important issues such as project finance, sustainability impacts, legal framework and institutional framework. These materials aim to help stakeholders better understand the CDM and should eventually contribute to maximizing the effect of the CDM in achieving the ultimate goal of the UNFCCC and its Kyoto Protocol. This guidebook should be read in conjunction with the information provided in the two other guidebooks entitled 'Clean Development Mechanism: Introduction to the CDM' and 'CDM Information and Guidebook', developed under the CD4CDM project. (BA)

  17. Theories, Models and Methodology in Writing Research

    NARCIS (Netherlands)

    Rijlaarsdam, Gert; Bergh, van den Huub; Couzijn, Michel

    1996-01-01

    Theories, Models and Methodology in Writing Research describes the current state of the art in research on written text production. The chapters in the first part offer contributions to the creation of new theories and models for writing processes. The second part examines specific elements of the w

  18. Methodologies for local development in smart society

    Directory of Open Access Journals (Sweden)

    Lorena BĂTĂGAN

    2012-07-01

    All the digital devices connected through the Internet are producing a large quantity of data. All this information can be turned into knowledge, because we now have the computational power and advanced analytics solutions to make sense of it. With this knowledge, cities could reduce costs, cut waste, and improve efficiency, productivity and quality of life for their citizens. Efficient/smart cities are characterized by the greater importance given to the environment, resources, globalization and sustainable development. This paper presents a study of the methodologies for urban development that are becoming a central element of our society.

  19. Comparing Methodologies for Developing an Early Warning System: Classification and Regression Tree Model versus Logistic Regression. REL 2015-077

    Science.gov (United States)

    Koon, Sharon; Petscher, Yaacov

    2015-01-01

    The purpose of this report was to explicate the use of logistic regression and classification and regression tree (CART) analysis in the development of early warning systems. It was motivated by state education leaders' interest in maintaining high classification accuracy while simultaneously improving practitioner understanding of the rules by…
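
    A minimal sketch of the comparison the report explicates (synthetic data; not the report's own dataset or settings): the two modelling options fitted side by side, with the tree's human-readable cut-point rules printed.

      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier, export_text

      X, y = make_classification(n_samples=500, n_features=4, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      logit = LogisticRegression().fit(X_tr, y_tr)
      cart = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)  # shallow => readable rules

      print("logistic regression accuracy:", logit.score(X_te, y_te))
      print("CART accuracy:", cart.score(X_te, y_te))
      print(export_text(cart))   # the interpretable rules practitioners value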

  20. Intelligent CAD Methodology Research of Adaptive Modeling

    Institute of Scientific and Technical Information of China (English)

    ZHANG Weibo; LI Jun; YAN Jianrong

    2006-01-01

    The key to carrying out ICAD technology is to establish a knowledge-based product model that covers a wide range of domains. This paper puts forward a knowledge-based methodology of adaptive modeling. Guided by ontology and using object-oriented technology, it constitutes a knowledge-based model framework. It involves the diverse domains in product design and realizes multi-domain modeling, embedding the relevant information including standards, rules and expert experience. To test the feasibility of the methodology, the research addresses automotive diaphragm spring clutch design, and an adaptive clutch design model is established using the knowledge-based modeling language AML.

  1. Thermodynamic modeling of ionic liquid systems: development and detailed overview of novel methodology based on the PC-SAFT.

    Science.gov (United States)

    Paduszyński, Kamil; Domańska, Urszula

    2012-04-26

    We present the results of an extensive study on a novel approach of modeling ionic liquids (ILs) and their mixtures with molecular compounds, incorporating perturbed-chain statistical associating fluid theory (PC-SAFT). PC-SAFT was used to calculate the thermodynamic properties of different homologous series of ILs based on the bis(trifluormethylsulfonyl)imide anion ([NTf2]). First, pure fluid parameters were obtained for each IL by means of fitting the model predictions to experimental liquid densities over a broad range of temperature and pressure. The reliability and physical significance of the parameters as well as the employed molecular scheme were tested by calculation of density, vapor pressure, and other properties of pure ILs (e.g., critical properties, normal boiling point). Additionally, the surface tension of pure ILs was calculated by coupling the PC-SAFT equation of state with density gradient theory (DGT). All correlated/predicted results were compared with literature experimental or simulation data. Afterward, we attempted to model various thermodynamic properties of some binary systems composed of IL and organic solvent or water. The properties under study were the binary vapor-liquid, liquid-liquid, and solid-liquid equilibria and the excess enthalpies of mixing. To calculate cross-interaction energies we used the standard combining rules of Lorentz-Berthelot, Kleiner-Sadowski, and Wolbach-Sandler. It was shown that incorporation of temperature-dependent binary corrections was required to obtain much more accurate results than in the case of conventional predictions. Binary corrections were adjusted to infinite dilution activity coefficients of a particular solute in a given IL determined experimentally or predicted by means of the modified UNIFAC (Dortmund) group contribution method. We concluded that the latter method allows accurate and reliable calculations of bulk-phase properties in a totally predictive manner.
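
    For reference, the Lorentz-Berthelot combining rules mentioned above take the standard form (notation assumed here):

      \sigma_{ij} = \tfrac{1}{2}\left(\sigma_i + \sigma_j\right), \qquad
      \varepsilon_{ij} = \left(1 - k_{ij}(T)\right)\sqrt{\varepsilon_i\,\varepsilon_j}

    where k_{ij}(T) is the temperature-dependent binary correction that the authors adjust to infinite-dilution activity coefficients.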

  2. NPA4K development system using object-oriented methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwang Seong; Hahn, Do Hee

    2000-11-01

    NPA4K consists of module programs with several components for various functions. Software components have to be developed systematically, according to compartmentalization criteria and a design method. In this paper, the fundamentals of a typical object-oriented methodology, UML (Unified Modeling Language), the procedure for NPA4K program development, and the architecture for long-term development of NPA4K are introduced.

  3. Women in India with Gestational Diabetes Mellitus Strategy (WINGS): Methodology and development of model of care for gestational diabetes mellitus (WINGS 4)

    Science.gov (United States)

    Kayal, Arivudainambi; Mohan, Viswanathan; Malanda, Belma; Anjana, Ranjit Mohan; Bhavadharini, Balaji; Mahalakshmi, Manni Mohanraj; Maheswari, Kumar; Uma, Ram; Unnikrishnan, Ranjit; Kalaiyarasi, Gunasekaran; Ninov, Lyudmil; Belton, Anne

    2016-01-01

    Aim: The Women In India with GDM Strategy (WINGS) project was conducted with the aim of developing a model of care (MOC) suitable for women with gestational diabetes mellitus (GDM) in low- and middle-income countries. Methodology: The WINGS project was carried out in Chennai, Southern India, in two phases. In Phase I, a situational analysis was conducted to understand the practice patterns of health-care professionals and to determine the best screening criteria through a pilot screening study. Results: Phase II involved developing a MOC-based on findings from the situational analysis and evaluating its effectiveness. The model focused on diagnosis, management, and follow-up of women with GDM who were followed prospectively throughout their pregnancy. An educational booklet was provided to all women with GDM, offering guidance on self-management of GDM including sample meal plans and physical activity tips. A pedometer was provided to all women to monitor step count. Medical nutrition therapy (MNT) was the first line of treatment given to women with GDM. Women were advised to undergo fasting blood glucose and postprandial blood glucose testing every fortnight. Insulin was indicated when the target blood glucose levels were not achieved with MNT. Women were evaluated for pregnancy outcomes and postpartum glucose tolerance status. Conclusions: The WINGS MOC offers a comprehensive package at every level of care for women with GDM. If successful, this MOC will be scaled up to other resource-constrained settings with the hope of improving lives of women with GDM. PMID:27730085

  4. Certification Testing Methodology for Composite Structure. Volume 2. Methodology Development

    Science.gov (United States)

    1986-10-01

    J.E., "F-18 Composites Development Tests," N00019- 79-C-0044 (January 1981). 3. Stenberg , K.V., et al., "YAV-8B Composite Wing Development," Volumes I...Louis, MO 63166 (Attn: K. Stenberg , R. Garrett, R. Riley, J. Doerr). . . 4 MCDONNELL-DOUGLAS CORP., Long Beach, CA 90846 (Attn: J. Palmer

  5. An Adaptive Design Methodology for Reduction of Product Development Risk

    CERN Document Server

    Pakala, Hara Gopal Mani; Kvsvn, Dr Raju; Khan, Dr Ibrahim; 10.5121/ijasuc.2011.2303

    2011-01-01

    An embedded system's interaction with its environment inherently complicates the understanding of requirements and their correct implementation, and product uncertainty is highest during the early stages of development. Design verification is an essential step in the development of any system, especially an embedded system. This paper introduces a novel adaptive design methodology which incorporates step-wise prototyping and verification. With each adaptive step the product-realization level is enhanced while the level of product uncertainty decreases, thereby reducing the overall costs. The backbone of this framework is the development of a Domain Specific Operational (DOP) Model and the associated Verification Instrumentation for Test and Evaluation, developed based on the DOP model. Together they generate a functionally valid test sequence for carrying out prototype evaluation. The application of this method is sketched with the help of a case study, 'Multimode Detection Subsystem'. The design methodologies can be compar...

  6. Data and models in Action. Methodological Issues in Production Ecology

    NARCIS (Netherlands)

    Stein, A.; Penning, de F.W.T.

    1999-01-01

    This book addresses methodological issues of production ecology. A central issue is the combination of the agricultural model with reliable data in relation to scale. A model is developed with data from a single point, whereas decisions are to be made for areas of land. Such an approach requires the

  7. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

  8. Assessing quality in software development: An agile methodology approach

    OpenAIRE

    V. Rodríguez-Hernández; M.C. Espino-Gudiño; J.L. González-Pérez; J. Gudiño-Bazaldúa; Victor Castano

    2015-01-01

    A novel methodology, the result of 10 years of in-field testing, which makes possible the convergence of different types of models and quality standards for Engineering and Computer Science Faculties, is presented. Since most software-developing companies are small and medium sized, the projects developed must focus on SCRUM and Extreme Programming (XP), as opposed to RUP, which is quite heavy, as well as on the Personal Software Process (PSP) and Team Software Process (TSP), which provide students wit...

  9. Simulation and Modeling Methodologies, Technologies and Applications

    CERN Document Server

    Filipe, Joaquim; Kacprzyk, Janusz; Pina, Nuno

    2014-01-01

    This book includes extended and revised versions of a set of selected papers from the 2012 International Conference on Simulation and Modeling Methodologies, Technologies and Applications (SIMULTECH 2012) which was sponsored by the Institute for Systems and Technologies of Information, Control and Communication (INSTICC) and held in Rome, Italy. SIMULTECH 2012 was technically co-sponsored by the Society for Modeling & Simulation International (SCS), GDR I3, Lionphant Simulation, Simulation Team and IFIP and held in cooperation with AIS Special Interest Group of Modeling and Simulation (AIS SIGMAS) and the Movimento Italiano Modellazione e Simulazione (MIMOS).

  10. Evaluation of mechanical load in the musculoskeletal system : development of experimental and modeling methodologies for the study of the effect of exercise in human models

    OpenAIRE

    João, Filipa Oliveira da Silva

    2013-01-01

    Doctorate in Human Kinetics, specialty in Biomechanics. A major concern of biomechanics research is the evaluation of the mechanical load and power that the human body develops and endures when performing high- to moderate-intensity sport activities. With the purpose of increasing performance and reducing the risk of injury, substantial advances have been accomplished in pursuit of this goal, both in laboratory techniques and in modelling and simulation. Traditionally, the main focus w...

  11. Developing collaborative environments - A Holistic software development methodology

    Energy Technology Data Exchange (ETDEWEB)

    PETERSEN,MARJORIE B.; MITCHINER,JOHN L.

    2000-03-08

    Sandia National Laboratories has been developing technologies to support person-to-person collaboration and the efforts of teams in the business and research communities. The technologies developed include knowledge-based design advisors, knowledge management systems, and streamlined manufacturing supply chains. These collaborative environments in which people can work together sharing information and knowledge have required a new approach to software development. The approach includes an emphasis on the requisite change in business practice that often inhibits user acceptance of collaborative technology. Leveraging the experience from this work, they have established a multidisciplinary approach for developing collaborative software environments. They call this approach "A Holistic Software Development Methodology".

  12. Demand Activated Manufacturing Architecture (DAMA) supply chain collaboration development methodology

    Energy Technology Data Exchange (ETDEWEB)

    PETERSEN,MARJORIE B.; CHAPMAN,LEON D.

    2000-03-15

    The Demand Activated Manufacturing Architecture (DAMA) project during the last five years of work with the U.S. Integrated Textile Complex (retail, apparel, textile, and fiber sectors) has developed an inter-enterprise supply chain collaboration development methodology. The goal of this methodology is to enable a supply chain to work more efficiently and competitively. The outcomes of this methodology include: (1) A definitive description and evaluation of the role of business cultures and supporting business organizational structures in either inhibiting or fostering change to a more competitive supply chain; (2) "As-Is" and proposed "To-Be" supply chain business process models focusing on information flows and decision-making; and (3) Software tools that enable and support a transition to a more competitive supply chain, which result from a business-driven rather than technology-driven approach to software design. This methodology development will continue in FY00 as DAMA engages companies in the soft goods industry in supply chain research and implementation of supply chain collaboration.

  14. SYSTEMS METHODOLOGY AND MATHEMATICAL MODELS FOR KNOWLEDGE MANAGEMENT

    Institute of Scientific and Technical Information of China (English)

    Yoshiteru NAKAMORI

    2003-01-01

    This paper first introduces a new discipline, knowledge science, and the role of systems science in its development. Then, after a discussion of the current trend in systems science, the paper proposes a new systems methodology for knowledge management and creation. Finally, the paper discusses mathematical modeling techniques to represent and manage human knowledge that is essentially vague and context-dependent.

  15. Development of a methodology to evaluate probable maximum snow accumulation using a regional climate model: application to Quebec, Canada, under changing climate conditions

    Science.gov (United States)

    Klein, I. M.; Rousseau, A. N.; Gagnon, P.; Frigon, A.

    2012-12-01

    Probable Maximum Snow Accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood. A robust methodology for evaluating the PMSA is imperative so the resulting spring probable maximum flood is neither overestimated, which would mean financial losses, nor underestimated, which could affect public safety. In addition, the impact of climate change needs to be considered since it is known that solid precipitation in some Nordic landscapes will in all likelihood intensify over the next century. In this paper, outputs from different simulations produced by the Canadian Regional Climate Model are used to estimate PMSAs for southern Quebec, Canada (44.1°N - 49.1°N; 68.2°W - 75.5°W). Moisture maximization represents the core concept of the proposed methodology; precipitable water being the key variable. Results of stationary tests indicate that climate change will not only affect precipitation and temperature but also the monthly maximum precipitable water and the ensuing maximization ratio r. The maximization ratio r is used to maximize "efficient" snowfall events; and represents the ratio of the 100-year precipitable water of a given month divided by the snowstorm precipitable water. A computational method was developed to maximize precipitable water using a non-stationary frequency analysis. The method was carefully adapted to the spatial and temporal constraints embedded in the resolution of the available simulation data. For example, for a given grid cell and time step, snow and rain may occur simultaneously. In this case, the focus is restricted to snow and snow-storm-conditions only, thus rainfall and humidity that could lead to rainfall are neglected. Also, the temporal resolution cannot necessarily capture the full duration of actual snow storms. The threshold for a snowstorm to be maximized and the duration resulting from considered time steps are adjusted in order to obtain a high percentage of maximization ratios below
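
    A minimal sketch of the moisture-maximization step described above (variable names, units and the threshold value are assumptions): each "efficient" snowstorm is scaled by the ratio of the 100-year monthly precipitable water to the storm's own precipitable water.

      # Sketch: moisture maximization of one snowstorm (all values hypothetical).
      def maximized_snowfall(snowfall_mm, pw_storm_mm, pw_100yr_month_mm, r_max=2.0):
          r = pw_100yr_month_mm / pw_storm_mm   # maximization ratio r
          if r > r_max:                         # threshold keeps ratios physically plausible
              return None                       # storm is not maximized
          return r * snowfall_mm                # maximized snowfall for this event

      print(maximized_snowfall(snowfall_mm=45.0, pw_storm_mm=12.0, pw_100yr_month_mm=18.0))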

  16. Applying of component system development in object methodology, case study

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    Creating target software as a component system has been a strong requirement of software development for the last 20 years. Architectural components are self-contained units that present not only partial and overall system behavior, but also cooperate with each other on the basis of their interfaces. Among other things, components have allowed flexible modification of the processes whose behavior is the foundation of component behavior, without changing the life of the component system. On the other hand, the component system makes it possible, at design time, to create numerous new connections between components and thus to create modified system behaviors. This enables company management to perform, at design time, required behavioral changes of processes in accordance with the requirements of changing production and markets. The development of software, generally referred to as SDP (Software Development Process), has two directions. The first, called CBD (Component-Based Development), is dedicated to the development of component-based systems, CBS (Component-Based System); the second targets the development of software under the influence of SOA (Service-Oriented Architecture). Both directions are equipped with their own development methodologies. The subject of this paper is only the first direction and the application of component-based system development in object-oriented methodologies. The requirement of today is to carry out the development of component-based systems within developed object-oriented methodologies precisely as a dominant style. In some of the known methodologies, however, this development is not completely transparent and is not even recognized as dominant. In some cases, it is corrected by special meta-integration models of component system development into an object methodology. This paper presents a case study

  17. A methodology for acquiring qualitative knowledge for probabilistic graphical models

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders L.

    2004-01-01

    We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model development...

  18. Photovoltaic-system costing-methodology development. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1982-07-01

    Presented are the results of a study to expand the use of standardized costing methodologies in the National Photovoltaics Program. The costing standards, which include SAMIS for manufacturing costs and M and D for marketing and distribution costs, have been applied to concentrator collectors and power-conditioning units. The M and D model was also computerized. Finally, a uniform construction cost-accounting structure was developed for use in photovoltaic test and application projects. The appendices contain example cases which demonstrate the use of the models.

  19. Availability modeling methodology applied to solar power systems

    Science.gov (United States)

    Unione, A.; Burns, E.; Husseiny, A.

    1981-01-01

    Availability is discussed as a measure for estimating the expected performance for solar- and wind-powered generation systems and for identifying causes of performance loss. Applicable analysis techniques, ranging from simple system models to probabilistic fault tree analysis, are reviewed. A methodology incorporating typical availability models is developed for estimating reliable plant capacity. Examples illustrating the impact of design and configurational differences on the expected capacity of a solar-thermal power plant with a fossil-fired backup unit are given.
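
    As a hedged illustration of the simplest availability model mentioned above (representative numbers, not the paper's data): steady-state availability from failure and repair data, and the expected reliable capacity of a solar plant with a fossil-fired backup unit.

      # Sketch: steady-state availability and expected capacity (hypothetical values).
      def availability(mtbf_h, mttr_h):
          return mtbf_h / (mtbf_h + mttr_h)

      a_solar = availability(mtbf_h=800.0, mttr_h=50.0)
      a_backup = availability(mtbf_h=2000.0, mttr_h=100.0)

      capacity_mw = 50.0
      # Backup covers solar outages; capacity counts on either source being up.
      expected_mw = capacity_mw * (a_solar + (1 - a_solar) * a_backup)
      print(f"solar availability {a_solar:.3f}, expected capacity {expected_mw:.1f} MW")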

  20. Transformations of summary statistics as input in meta-analysis for linear dose–response models on a logarithmic scale: a methodology developed within EURRECA

    Directory of Open Access Journals (Sweden)

    Souverein Olga W

    2012-04-01

    Background: To derive micronutrient recommendations in a scientifically sound way, it is important to obtain and analyse all published information on the association between micronutrient intake and biochemical proxies for micronutrient status using a systematic approach. It is therefore important to incorporate information from randomized controlled trials as well as observational studies, as both provide information on the association. However, original research papers present their data in various ways. Methods: This paper presents a methodology to obtain an estimate of the dose-response curve, assuming a bivariate normal linear model on the logarithmic scale and incorporating a range of transformations of the original reported data. Results: The simulation study, conducted to validate the methodology, shows that there is no bias in the transformations. Furthermore, it is shown that when the original studies report the mean and standard deviation or the geometric mean and confidence interval, the results are less variable than when the median with IQR or range is reported in the original study. Conclusions: The presented methodology, with transformations for various reported data, provides a valid way to estimate the dose-response curve for micronutrient intake and status using both randomized controlled trials and observational studies.
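
    A minimal sketch of two such transformations (standard lognormal relations; the exact formulas used in EURRECA may differ): converting reported summary statistics to a mean and standard deviation on the logarithmic scale.

      import math

      def from_mean_sd(m, s):
          """Arithmetic mean/SD -> mean/SD on the log scale, assuming lognormality."""
          var_ratio = 1.0 + (s / m) ** 2
          return math.log(m / math.sqrt(var_ratio)), math.sqrt(math.log(var_ratio))

      def from_median_iqr(median, q1, q3):
          """Median and quartiles -> mean/SD on the log scale, assuming lognormality."""
          # On the log scale the IQR spans 2 * 0.6745 standard deviations.
          return math.log(median), (math.log(q3) - math.log(q1)) / 1.349

      print(from_mean_sd(10.0, 4.0))
      print(from_median_iqr(9.2, 7.0, 12.5))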

  1. Applying Lean on Agile Scrum Development Methodology

    Directory of Open Access Journals (Sweden)

    SurendRaj Dharmapal

    2015-11-01

    This article introduces the reader to Agile and Lean concepts and provides a basic level of understanding of each process. It also provides a brief background on applying Lean concepts in each phase of the agile Scrum methodology and summarizes their primary business advantages for delivering value to the customer.

  3. Recursive modular modelling methodology for lumped-parameter dynamic systems.

    Science.gov (United States)

    Orsino, Renato Maia Matarazzo

    2017-08-01

    This paper proposes a novel approach to the modelling of lumped-parameter dynamic systems, based on representing them by hierarchies of mathematical models of increasing complexity instead of a single (complex) model. Exploring the multilevel modularity that these systems typically exhibit, a general recursive modelling methodology is proposed, in order to reconcile the use of already existing modelling techniques. The general algorithm is based on a fundamental theorem that states the conditions for computing projection operators recursively. Three procedures for these computations are discussed: orthonormalization, use of orthogonal complements and use of generalized inverses. The novel methodology is also applied to the development of a recursive algorithm based on the Udwadia-Kalaba equation, which proves to be identical to that of a Kalman filter for estimating the state of a static process, given a sequence of noiseless measurements representing the constraints that must be satisfied by the system.
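
    For reference, the Udwadia-Kalaba equation on which the recursive algorithm is built can be written in its standard form (notation assumed here):

      \ddot{q} = a + M^{-1/2}\left(A\,M^{-1/2}\right)^{+}\left(b - A\,a\right),
      \qquad a = M^{-1} Q

    where M is the mass matrix, Q the applied forces, A\ddot{q} = b the constraint equations, and {}^{+} denotes the Moore-Penrose generalized inverse, one of the three projection computations listed above.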

  4. New Methodology for the Development of Chromatographic Methods

    OpenAIRE

    Rozet, Eric; Debrus, Benjamin; Lebrun, Pierre; Boulanger, B.; Hubert, Philippe

    2011-01-01

    As defined by ICH [1] and FDA, Quality by Design (QbD) stands for “a systematic approach to development that begins with predefined objectives and emphasizes product and process understanding and process control, based on sound science and quality risk management”. A risk-based, QbD-compliant approach is proposed for the robust development of analytical methods. This methodology, based on Design of Experiments (DoE) to study the experimental domain, models the retention times at the beginning, t...

  5. Photogrammetry Methodology Development for Gossamer Spacecraft Structures

    Science.gov (United States)

    Pappa, Richard S.; Jones, Thomas W.; Black, Jonathan T.; Walford, Alan; Robson, Stuart; Shortis, Mark R.

    2002-01-01

    Photogrammetry--the science of calculating 3D object coordinates from images--is a flexible and robust approach for measuring the static and dynamic characteristics of future ultra-lightweight and inflatable space structures (a.k.a., Gossamer structures), such as large membrane reflectors, solar sails, and thin-film solar arrays. Shape and dynamic measurements are required to validate new structural modeling techniques and corresponding analytical models for these unconventional systems. This paper summarizes experiences at NASA Langley Research Center over the past three years to develop or adapt photogrammetry methods for the specific problem of measuring Gossamer space structures. Turnkey industrial photogrammetry systems were not considered a cost-effective choice for this basic research effort because of their high purchase and maintenance costs. Instead, this research uses mainly off-the-shelf digital-camera and software technologies that are affordable to most organizations and provide acceptable accuracy.
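
    For orientation, the core of photogrammetric computation is the standard collinearity condition relating an image point to object coordinates (textbook form; the symbols here are assumed, not taken from the paper):

      x - x_0 = -f\,\frac{r_{11}(X - X_c) + r_{12}(Y - Y_c) + r_{13}(Z - Z_c)}
                         {r_{31}(X - X_c) + r_{32}(Y - Y_c) + r_{33}(Z - Z_c)}

    with an analogous expression for y - y_0 using r_{21}, r_{22}, r_{23} in the numerator, where f is the focal length, (X_c, Y_c, Z_c) the camera position and r_{jk} the elements of its rotation matrix; 3D coordinates follow from intersecting such rays from two or more images.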

  6. The Methodology for Ontology Development in Lesson Plan Domain

    Directory of Open Access Journals (Sweden)

    Aslina Saad

    2016-04-01

    Ontology has been recognized as a knowledge representation mechanism that supports semantic web applications. A semantic web application that supports lesson plan construction is crucial for teachers dealing with the massive information sources from various domains on the web. Knowledge in the lesson plan domain therefore needs to be represented so that a web search retrieves only relevant materials; such retrieval needs an appropriate representation of the domain problem. The emergence of semantic web technology provides a promising solution to improve the representation, sharing, and re-use of information to support decision making. Thus, knowledge of the lesson plan domain needs to be represented ontologically to support efficient retrieval by a semantic web application in this domain. This paper presents a new methodology for developing an ontology representation of the lesson plan domain to support semantic web applications. The methodology focuses on the important models, tools, and techniques in each phase of the development. It consists of four phases, namely requirements analysis, development, implementation, and evaluation and maintenance.

  7. Methodology, models and algorithms in thermographic diagnostics

    CERN Document Server

    Živčák, Jozef; Madarász, Ladislav; Rudas, Imre J

    2013-01-01

    This book presents the methodology and techniques of thermographic applications, focusing primarily on medical thermography implemented for parametrizing the diagnostics of the human body. The first part of the book describes the basics of infrared thermography, the possibilities of thermographic diagnostics and the physical nature of thermography. The second half covers tools of intelligent engineering applied to the solving of selected applications and projects. Thermographic diagnostics was applied to the problems of paraplegia, tetraplegia and carpal tunnel syndrome (CTS). The results of the research activities were created in cooperation with four projects within the Ministry of Education, Science, Research and Sport of the Slovak Republic, entitled Digital control of complex systems with two degrees of freedom, Progressive methods of education in the area of control and modeling of complex object oriented systems on aircraft turbocompressor engines, Center for research of control of te...

  8. Agile vs Traditional Methodologies in Developing Information Systems

    Directory of Open Access Journals (Sweden)

    Pere Tumbas

    2006-12-01

    After a review of the principles and concepts of structural and object-oriented development of information systems, the work points to the elements of agile approaches and gives a short description of some selected agile methodologies. After these reviews, their comparison according to a set of criteria is presented. The first criterion reviews the extent to which project management is used by the methodology in developing information systems. The second criterion shows whether the processes defined by the methodology cover the appropriate phases of the life cycle. The last criterion shows whether the methodology initiates the use of skills and tools in the life-cycle phases of developing information systems. Finally, the work compares, according to the key elements of development, traditional (structural and object) methodologies with agile methodologies.

  9. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

    Figure 14: Simulation Study Methodology for the Weapon System Analysis (Metrics Definition and Data Collection). Thesis by Casey D. Connors, Major, USA, presented to the Faculty, Department of Operational Sciences.

  10. Development of a statistically based access delay timeline methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Rivera, W. Gary; Robinson, David Gerald; Wyss, Gregory Dane; Hendrickson, Stacey M. Langfitt

    2013-02-01

    The charter for adversarial delay is to hinder access to critical resources through the use of physical systems increasing an adversary's task time. The traditional method for characterizing access delay has been a simple model focused on accumulating times required to complete each task with little regard to uncertainty, complexity, or decreased efficiency associated with multiple sequential tasks or stress. The delay associated with any given barrier or path is further discounted to worst-case, and often unrealistic, times based on a high-level adversary, resulting in a highly conservative calculation of total delay. This leads to delay systems that require significant funding and personnel resources in order to defend against the assumed threat, which for many sites and applications becomes cost prohibitive. A new methodology has been developed that considers the uncertainties inherent in the problem to develop a realistic timeline distribution for a given adversary path. This new methodology incorporates advanced Bayesian statistical theory and methodologies, taking into account small sample size, expert judgment, human factors and threat uncertainty. The result is an algorithm that can calculate a probability distribution function of delay times directly related to system risk. Through further analysis, the access delay analyst or end user can use the results in making informed decisions while weighing benefits against risks, ultimately resulting in greater system effectiveness with lower cost.
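
    A minimal sketch of the underlying idea (the distribution choices and parameters below are assumptions): instead of summing worst-case task times, sample uncertain task durations to obtain a probability distribution of total adversary delay along one path.

      import numpy as np

      rng = np.random.default_rng(1)
      # (median_s, dispersion) per barrier task; lognormals capture skewed task times.
      tasks = [(60.0, 0.4), (180.0, 0.6), (90.0, 0.5)]

      samples = sum(rng.lognormal(np.log(med), sig, 10_000) for med, sig in tasks)
      p10, p50, p90 = np.percentile(samples, [10, 50, 90])
      print(f"total delay: P10={p10:.0f}s  median={p50:.0f}s  P90={p90:.0f}s")
      print("P(delay > 8 min) =", (samples > 480).mean())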

  11. The SIMRAND methodology - Simulation of Research and Development Projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.

  12. Powerline Communications Channel Modelling Methodology Based on Statistical Features

    CERN Document Server

    Tan, Bo

    2012-01-01

    This paper proposes a new channel modelling method for powerline communications networks based on the multipath profile in the time domain. The new channel model is developed to be applied in a range of Powerline Communications (PLC) research topics such as impulse noise modelling, deployment and coverage studies, and communications theory analysis. To develop the methodology, channels are categorised according to their propagation distance and power delay profile. Statistical multipath parameters such as path arrival time, magnitude and interval are analyzed for each category to build the model. Each channel generated from the proposed statistical model represents a different realisation of a PLC network. Simulation results in the time and frequency domains show that the proposed statistical modelling method, which integrates the impact of network topology, presents the same PLC channel features as the underlying transmission line theory model. Furthermore, two potential application scenarios are d...
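
    A minimal sketch of such a generator (all statistics below are invented, not the paper's fitted values): one channel realisation drawn from statistical multipath parameters of the kind analyzed above, i.e. path arrival times, magnitudes and inter-arrival intervals.

      import numpy as np

      rng = np.random.default_rng(7)

      def random_channel(n_paths=6, mean_interval_us=0.4, decay_per_us=0.8):
          intervals = rng.exponential(mean_interval_us, n_paths)   # inter-arrival times
          arrivals = np.cumsum(intervals)                          # path arrival times (us)
          gains = rng.rayleigh(0.3, n_paths) * np.exp(-decay_per_us * arrivals)
          return arrivals, gains                                   # one PLC realisation

      arrivals, gains = random_channel()
      for t, g in zip(arrivals, gains):
          print(f"path @ {t:5.2f} us  |h| = {g:.3f}")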

  13. Combining Ontology Development Methodologies and Semantic Web Platforms for E-government Domain Ontology Development

    CERN Document Server

    Dombeu, Jean Vincent Fonou; 10.5121/ijwest.2011.2202

    2011-01-01

    One of the key challenges in electronic government (e-government) is the development of systems that can be easily integrated and interoperated to provide seamless services delivery to citizens. In recent years, Semantic Web technologies based on ontology have emerged as promising solutions to the above engineering problems. However, current research practicing semantic development in e-government does not focus on the application of available methodologies and platforms for developing government domain ontologies. Furthermore, only a few of these researches provide detailed guidelines for developing semantic ontology models from a government service domain. This research presents a case study combining an ontology building methodology and two state-of-the-art Semantic Web platforms namely Protege and Java Jena ontology API for semantic ontology development in e-government. Firstly, a framework adopted from the Uschold and King ontology building methodology is employed to build a domain ontology describing th...

  14. An improved methodology for precise geoid/quasigeoid modelling

    Science.gov (United States)

    Nesvadba, Otakar; Holota, Petr

    2016-04-01

    The paper describes recent development of a computational procedure for precise local quasigeoid modelling. The overall methodology is primarily based on a solution of the so-called gravimetric boundary value problem for an ellipsoidal domain (exterior to an oblate spheroid), which means that gravity disturbances on the ellipsoid are used as input data. The problem of the difference between the Earth's topography and the chosen ellipsoidal surface is solved iteratively, by analytical continuation of the gravity disturbances to the computational ellipsoid. The methodology covers an interpolation technique for the discrete gravity data which, given an a priori adopted covariance function, provides the best linear unbiased estimate of the respective quantity; a numerical integration technique developed on the surface of the ellipsoid in the spectral domain; an iterative procedure of analytical continuation in ellipsoidal coordinates; removal and restoration of the atmospheric masses; an estimate of the far-zone contribution (in the case of regional data coverage); and the restoration of the obtained disturbing gravity potential to the target height anomaly. All the computational steps of the procedure are modest in their consumption of computing resources, so the methodology can be used on a common personal computer without any accuracy or resolution penalty. Finally, the performance of the developed methodology is demonstrated on real-case examples for the territories of France (the Auvergne regional quasigeoid) and the Czech Republic.

  15. Model development for mechanical properties and weld quality class of friction stir welding using multi-objective Taguchi method and response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, Mohamed Ackiel [University Kuala Lumpur Malaysia France Institute, Bandar Baru Bangi (Malaysia); Manurung, Yupiter HP; Berhan, Mohamed Nor [Universiti Teknologi MARA, Shah Alam (Malaysia)

    2015-06-15

    This study presents the effect of the governing parameters in friction stir welding (FSW) on the mechanical properties and weld quality of a 6 mm thick 6061-T651 aluminum alloy butt joint. The main FSW parameters, the rotational and traverse speeds, were optimized based on multiple mechanical properties and quality features, focusing on the tensile strength, hardness and weld quality class, using the multi-objective Taguchi method (MTM). The multi signal-to-noise ratio (MSNR) was employed to determine the optimum welding parameters for MTM, while further analysis concerning significance level determination was accomplished via the well-established analysis of variance (ANOVA). Furthermore, a first-order model for predicting the mechanical properties and weld quality class is derived by applying response surface methodology (RSM). Based on the experimental confirmation test, the proposed method can effectively estimate the mechanical properties and weld quality class, and can be used to enhance welding performance in FSW and other applications.
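
    A minimal sketch of the MSNR computation for one parameter setting (toy replicate data; the weights are assumed equal, which may differ from the study): larger-the-better signal-to-noise ratios for two responses, combined into the multi S/N ratio used to rank settings.

      import numpy as np

      def sn_larger_better(y):
          y = np.asarray(y, dtype=float)
          return -10.0 * np.log10(np.mean(1.0 / y**2))   # larger-the-better S/N (dB)

      # Replicated tensile strength (MPa) and hardness (HV) at one FSW setting.
      tensile = [215.0, 221.0, 218.0]
      hardness = [68.0, 71.0, 70.0]

      msnr = 0.5 * sn_larger_better(tensile) + 0.5 * sn_larger_better(hardness)
      print(f"MSNR = {msnr:.2f} dB")   # repeat per run of the orthogonal array, pick the max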

  16. Development of a computational methodology for internal dose calculations

    CERN Document Server

    Yoriyaz, H

    2000-01-01

    A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body and a more precise tool for the radiation transport simulation. The present technique shows the capability to build a patient-specific phantom from tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods such as in the MCNP-4B code. In order to utilize the segmented human anatomy as a computational model for the simulation of radiation transport, an interface program, SCMS, was developed to build the geometric configurations for the phantom through the use of tomographic images. This procedure allows one to calculate not only average dose values but also the spatial distribution of dose in regions of interest. With the present methodology, absorbed fractions for photons and electrons in various organs of the Zubal segmented phantom were calculated and compared to those reported for the mathematical phanto...

  17. Methodology for Modeling and Analysis of Business Processes (MMABP)

    OpenAIRE

    Vaclav Repa; Tomas Bruckner

    2015-01-01

    This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. Firstly, the gap in contemporary Business Process Modeling approaches is identified and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process stat...

  18. Modeling, methodologies and tools for molecular and nano-scale communications modeling, methodologies and tools

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental managements. It addresses advanced graduate students, academics and professionals working at the forefront in their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  19. Safety-related operator actions: methodology for developing criteria

    Energy Technology Data Exchange (ETDEWEB)

    Kozinsky, E.J.; Gray, L.H.; Beare, A.N.; Barks, D.B.; Gomer, F.E.

    1984-03-01

    This report presents a methodology for developing criteria for design evaluation of safety-related actions by nuclear power plant reactor operators, and identifies a supporting data base. It is the eleventh and final NUREG/CR Report on the Safety-Related Operator Actions Program, conducted by Oak Ridge National Laboratory for the US Nuclear Regulatory Commission. The operator performance data were developed from training simulator experiments involving operator responses to simulated scenarios of plant disturbances; from field data on events with similar scenarios; and from task analytic data. A conceptual model to integrate the data was developed and a computer simulation of the model was run, using the SAINT modeling language. Proposed is a quantitative predictive model of operator performance, the Operator Personnel Performance Simulation (OPPS) Model, driven by task requirements, information presentation, and system dynamics. The model output, a probability distribution of predicted time to correctly complete safety-related operator actions, provides data for objective evaluation of quantitative design criteria.
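
    As a hedged illustration of how such a predicted time distribution feeds a quantitative design criterion (all numbers below are invented, not from the report): the criterion is evaluated as the probability that the operator completes the action within the available time.

      import numpy as np

      rng = np.random.default_rng(3)
      # Simulated distribution of operator completion times (s), lognormal by assumption.
      completion_s = rng.lognormal(mean=np.log(120.0), sigma=0.35, size=20_000)

      time_available_s = 240.0
      p_success = (completion_s <= time_available_s).mean()
      print(f"P(complete within {time_available_s:.0f}s) = {p_success:.3f}",
            "PASS" if p_success >= 0.95 else "FAIL")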

  1. Efficient Modelling Methodology for Reconfigurable Underwater Robots

    DEFF Research Database (Denmark)

    Nielsen, Mikkel Cornelius; Blanke, Mogens; Schjølberg, Ingrid

    2016-01-01

    This paper considers the challenge of applying reconfigurable robots in an underwater environment. The main result presented is the development of a model for a system comprised of N, possibly heterogeneous, robots dynamically connected to each other and moving with 6 Degrees of Freedom (DOF). This paper presents an application of the Udwadia-Kalaba Equation for modelling the Reconfigurable Underwater Robots. The constraints developed to enforce the rigid connection between robots in the system are derived through restrictions on relative distances and orientations. To avoid singularities in the orientation and, thereby, allow the robots to undertake any relative configuration, the attitude is represented in Euler parameters.

  2. Reference Management Methodologies for Large Structural Models at Kennedy Space Center

    Science.gov (United States)

    Jones, Corey; Bingham, Ryan; Schmidt, Rick

    2011-01-01

    There have been many challenges associated with modeling some of NASA KSC's largest structures. Given the size of the welded structures here at KSC, it was critically important to properly organize model structure and carefully manage references. Additionally, because of the amount of hardware to be installed on these structures, it was very important to have a means to coordinate between different design teams and organizations, check for interferences, produce consistent drawings, and allow for simple release processes. Facing these challenges, the modeling team developed a unique reference management methodology and model fidelity methodology. This presentation will describe the techniques and methodologies that were developed for these projects. The attendees will learn about KSC's reference management and model fidelity methodologies for large structures. The attendees will understand the goals of these methodologies. The attendees will appreciate the advantages of developing a reference management methodology.

  3. Mixed-mode modelling mixing methodologies for organisational intervention

    CERN Document Server

    Clarke, Steve; Lehaney, Brian

    2001-01-01

    The 1980s and 1990s have seen a growing interest in research and practice in the use of methodologies within problem contexts characterised by a primary focus on technology, human issues, or power. During the last five to ten years, this has given rise to challenges regarding the ability of a single methodology to address all such contexts, and the consequent development of approaches which aim to mix methodologies within a single problem situation. This has been particularly so where the situation has called for a mix of technological (the so-called 'hard') and human-centred (so-called 'soft') methods. The approach developed has been termed mixed-mode modelling. The area of mixed-mode modelling is relatively new, with the phrase being coined approximately four years ago by Brian Lehaney in a keynote paper published at the 1996 Annual Conference of the UK Operational Research Society. Mixed-mode modelling, as suggested above, is a new way of considering problem situations faced by organisations. Traditional...

  4. Methodology of problem space modeling in industrial enterprise management system

    Directory of Open Access Journals (Sweden)

    V.V. Glushchevsky

    2015-03-01

    The aim of the article. The aim of the article is to develop methodological principles for building a problem-space model that can be integrated into an industrial enterprise management system. The results of the analysis. The author developed methodological principles for constructing the problem space of an industrial enterprise as a structural and functional model. These problems appear on the topology of the enterprise's business process network and can be solved by its management system. The centerpiece of the article is a description of the main stages of implementing the modeling methodology for an industrial enterprise's typical management problems. These stages help several units of the enterprise's organizational management structure to solve problems within their functional competence. The author formulated a system of axioms on the structural and characteristic properties of the elements of the problem modeling space and the interconnections between them. This system of axioms is in effect a justification of the correctness and adequacy of the proposed modeling methodology and serves as the theoretical basis for constructing the structural and functional model of the management problem space. With the help of the axiom system, this model generalizes three basic structural components of the enterprise management system: a three-dimensional model of the management problem space (the first dimension is the enterprise business process network, the second dimension is a set of management problems, and the third dimension is four vectors of measurable and qualitative characteristics of management problems, which can be analyzed and managed during enterprise operation); a two-dimensional model of the cybernetic space of analytical problems, which are a formalized form of management problems (multivariate model experiments can be implemented with the help of this model to solve a wide range of problem situations and determine the most effective or optimal management solutions); a two-dimensional model

  5. An experimental methodology for a fuzzy set preference model

    Science.gov (United States)

    Turksen, I. B.; Willson, Ian A.

    1992-01-01

    A flexible fuzzy set preference model first requires approximate methodologies for implementation. Fuzzy sets must be defined for each individual consumer using computer software, requiring a minimum of time and expertise on the part of the consumer. The amount of information needed in defining sets must also be established. The model itself must adapt fully to the subject's choice of attributes (vague or precise), attribute levels, and importance weights. The resulting individual-level model should be fully adapted to each consumer. The methodologies needed to develop this model will be equally useful in a new generation of intelligent systems which interact with ordinary consumers, controlling electronic devices through fuzzy expert systems or making recommendations based on a variety of inputs. The power of personal computers and their acceptance by consumers has yet to be fully utilized to create interactive knowledge systems that fully adapt their function to the user. Understanding individual consumer preferences is critical to the design of new products and the estimation of demand (market share) for existing products, which in turn is an input to management systems concerned with production and distribution. The question of what to make, for whom to make it and how much to make requires an understanding of the customer's preferences and the trade-offs that exist between alternatives. Conjoint analysis is a widely used methodology which decomposes an overall preference for an object into a combination of preferences for its constituent parts (attributes such as taste and price), which are combined using an appropriate combination function. Preferences are often expressed using linguistic terms which cannot be represented in conjoint models. Current models are also not implemented at an individual level, making it difficult to reach meaningful conclusions about the cause of an individual's behavior from an aggregate model. The combination of complex aggregate

  7. Evaluating Supply Chain Management: A Methodology Based on a Theoretical Model

    OpenAIRE

    Alexandre Tadeu Simon; Luiz Carlos Di Serio; Silvio Roberto Ignacio Pires; Guilherme Silveira Martins

    2015-01-01

    Despite the increasing interest in supply chain management (SCM) by researchers and practitioners, there is still a lack of academic literature concerning topics such as methodologies to guide and support SCM evaluation. Most developed methodologies have been provided by consulting companies and are restricted in their publication and use. This article presents a methodology for evaluating companies’ degree of adherence to a SCM conceptual model. The methodology is based on Cooper, Lambert an...

  8. ChOrDa: a methodology for the modeling of business processes with BPMN

    CERN Document Server

    Buferli, Matteo; Montesi, Danilo

    2009-01-01

    In this paper we present a modeling methodology for BPMN, the standard notation for the representation of business processes. Our methodology simplifies the development of collaborative BPMN diagrams, enabling the automated creation of skeleton process diagrams representing complex choreographies. To evaluate and tune the methodology, we have developed a tool supporting it, that we apply to the modeling of an international patenting process as a working example.

  9. Organizational information assets classification model and security architecture methodology

    Directory of Open Access Journals (Sweden)

    Mostafa Tamtaji

    2015-12-01

    Full Text Available Today, organizations are exposed to a huge volume and diversity of information and information assets produced in different systems, such as knowledge management systems (KMS), financial and accounting systems, and office and industrial automation systems, and the protection of this information is necessary. Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released. The several benefits of this model have created a strong trend in organizations toward implementing cloud computing. Maintaining and managing information security are the main challenges in developing and accepting this model. In this paper, first, following the design science research methodology and compatible with the design process in information systems research, a complete categorization of organizational assets is presented, comprising 355 different types of information assets in 7 groups and 3 levels, so that managers are able to plan corresponding security controls according to the importance of each group. Then, to guide the organization in architecting its information security in a cloud computing environment, an appropriate methodology is presented. The presented cloud computing security architecture, the methodology that produced it, and the presented classification model were discussed and verified using the Delphi method and experts' comments.
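
    A group/level taxonomy of this kind is easy to represent programmatically. The sketch below is hypothetical throughout: the group names, level assignments, and control tiers are invented stand-ins for the paper's 355-type, 7-group, 3-level classification.

```python
# Hypothetical asset taxonomy mapping groups and levels to controls.

asset_groups = {
    "financial records":  {"level": 1, "examples": ["ledger", "invoices"]},
    "KMS documents":      {"level": 2, "examples": ["process manuals"]},
    "automation configs": {"level": 3, "examples": ["PLC programs"]},
}

def controls_for(group: str) -> str:
    """Map an asset group's level to a security-control tier
    (the tiering rule itself is an assumption)."""
    tiers = {1: "strict", 2: "standard", 3: "baseline"}
    return tiers[asset_groups[group]["level"]]

print(controls_for("financial records"))  # -> strict
```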

  10. IT Development: Methodology Overload or Crisis?

    Science.gov (United States)

    Korac-Boisvert, Nada; Kouzmin, Alexander

    1995-01-01

    An examination of the management techniques that underlie the information technology (IT) industry reveals a common strategy of dividing organizational functions into tasks in a "top-down" fashion. The research and development approach is recommended for design-in-action development and an open-ended IT strategy to facilitate an…

  11. Microgenetic Analysis of Moral Development: Theoretical and Methodological Issues

    OpenAIRE

    Barrios, Alia; Barbato,Silviane; Branco,Angela

    2012-01-01

    New ideas and methodologies need to be developed to advance our knowledge in the understanding of moral development. The intertwined nature of human activities, communication processes, and the numerous aspects of morality pose a challenge to researchers to construct a methodology that takes into account cognition, affect, sociocultural processes and characteristics, as well as the active role of individuals in their own development. In this paper we aim at suggesting fresh theoretical ideas ...

  12. Assessing quality in software development: An agile methodology approach

    National Research Council Canada - National Science Library

    V Rodríguez-Hernández; M C Espino-Gudiño; J L González-Pérez; J Gudiño-Bazaldúa; V M Castaño

    2015-01-01

      A novel methodology, result of 10 years of in-field testing, which makes possible the convergence of different types of models and quality standards for Engineering and Computer Science Faculties, is presented...

  13. Threat model framework and methodology for personal networks (PNs)

    DEFF Research Database (Denmark)

    Prasad, Neeli R.

    2007-01-01

    is to give a structured, convenient approach for building threat models. A framework for the threat model is presented, together with a list of requirements for the methodology. The methodology will be applied to build a threat model for Personal Networks. Practical tools like UML sequence diagrams and attack trees have...
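
    Attack trees, one of the practical tools named above, lend themselves to a compact programmatic evaluation. The Python sketch below scores a small invented tree for a Personal Network threat; the node names, probabilities, and gate semantics (product for AND, maximum for OR) are illustrative assumptions rather than anything from the paper.

```python
# Minimal attack-tree evaluation sketch with AND/OR gates.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    gate: str = "LEAF"            # "LEAF", "AND", or "OR"
    p: float = 0.0                # success probability for leaves
    children: list = field(default_factory=list)

def likelihood(node: Node) -> float:
    if node.gate == "LEAF":
        return node.p
    probs = [likelihood(c) for c in node.children]
    if node.gate == "AND":        # all child steps must succeed
        result = 1.0
        for q in probs:
            result *= q
        return result
    return max(probs)             # OR: take the most likely branch

tree = Node("eavesdrop on PN traffic", "OR", children=[
    Node("crack link encryption", p=0.05),
    Node("compromise a device", "AND", children=[
        Node("gain physical access", p=0.3),
        Node("bypass device PIN", p=0.4),
    ]),
])
print(likelihood(tree))  # 0.12, via the device-compromise branch
```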

  14. High-throughput measurement methodologies for developing ...

    African Journals Online (AJOL)

    African Journal of Food, Agriculture, Nutrition and Development. When analyzing the minerals Fe and Zn, these same techniques are not suitable, but it is still important to ensure careful sample ...

  15. An assessment methodology for developing countries

    African Journals Online (AJOL)

    life-cycle assessment approach and an expanded definition of sustainability, ... However, these systems tend not to address social or economic aspects and ... development as “... improving the quality of human life while living within the ...

  16. Development of Engine Loads Methodology Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR seeks to improve the definition of design loads for rocket engine components such that higher performing, lighter weight engines can be developed more...

  17. MAPPING OF TRADITIONAL SOFTWARE DEVELOPMENT METHODS TO AGILE METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Rashmi Popli

    2013-02-01

    Full Text Available Agility brings responsibility and ownership to individuals, which eventually brings effectiveness and efficiency to deliverables. Companies are drifting from traditional Software Development Life Cycle models to Agile environments in order to attain quality and to save cost and time. In traditional models, the life cycle is properly defined, and the phases are elaborated by specifying the needed input and output parameters. In an Agile environment, by contrast, the phases are specific to the particular Agile methodology - Extreme Programming, etc. In this paper a common life-cycle approach is proposed that is applicable to different kinds of methods. This paper also describes a mapping function for mapping traditional methods to Agile methods.
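
    The notion of a mapping function can be illustrated with a toy lookup from traditional life-cycle phases to agile practices; the pairings below are generic examples for illustration, not the paper's actual mapping.

```python
# Toy phase-mapping function; the pairings are illustrative only.

phase_map = {
    "requirements analysis": "product backlog grooming",
    "design": "sprint planning and spike stories",
    "implementation": "iterative development in sprints",
    "testing": "continuous integration and test-driven development",
    "maintenance": "backlog-driven incremental releases",
}

def map_to_agile(traditional_phase: str) -> str:
    return phase_map.get(traditional_phase.lower(), "no direct counterpart")

print(map_to_agile("Design"))  # -> sprint planning and spike stories
```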

  18. Reflood completion report: Volume 1. A phenomenological thermal-hydraulic model of hot rod bundles experiencing simultaneous bottom and top quenching and an optimization methodology for closure development

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, R.A. Jr.; Pimentel, D.A.; Jolly-Woodruff, S.; Spore, J.

    1998-04-01

    In this report, a phenomenological model of simultaneous bottom-up and top-down quenching is developed and discussed. The model was implemented in the TRAC-PF1/MOD2 computer code. Two sets of closure relationships were compared within the study, the Absolute set and the Conditional set. The Absolute set of correlations is frequently viewed as the pure set because the correlations utilize their original coefficients as suggested by the developer. The Conditional set is a modified set of correlations with changes to the correlation coefficients only. The two sets produce quite similar results. This report also summarizes initial results of an effort to investigate nonlinear optimization techniques applied to closure model development. Results suggest that such techniques can provide advantages for future model development work, but that extensive expertise is required to utilize them (i.e., the model developer must fully understand both the physics of the process being represented and the computational techniques being employed). The computer may then be used to improve the correlation of computational results with experiments.

  19. Model-driven software migration a methodology

    CERN Document Server

    Wagner, Christian

    2014-01-01

    Today, reliable software systems are the basis of any business or company. The continuous further development of those systems is the central component of software evolution. It requires a huge amount of time and manpower, as well as financial resources. The challenges are the size, age, and heterogeneity of those software systems. Christian Wagner addresses software evolution: the inherent problems and uncertainties in the process. He presents a model-driven method which leads to a synchronization between source code and design. As a result, the model layer will be the central part in further e...

  20. A Review of Roads Data Development Methodologies

    Directory of Open Access Journals (Sweden)

    Taro Ubukawa

    2014-05-01

    Full Text Available There is a clear need for a public-domain data set of road networks with high spatial accuracy and global coverage for a range of applications. The Global Roads Open Access Data Set (gROADS), version 1, is a first step in that direction. gROADS relies on data from a wide range of sources and was developed using a range of methods. Traditionally, map development was highly centralized and controlled by government agencies due to the high cost of the required expertise and technology. In the past decade, however, high-resolution satellite imagery and global positioning system (GPS) technologies have come into wide use, and there has been significant innovation in web services, such that a number of new methods to develop geospatial information have emerged, including automated and semi-automated road extraction from satellite/aerial imagery and crowdsourcing. In this paper we review the data sources, methods, and pros and cons of a range of road data development methods: heads-up digitizing, automated/semi-automated extraction from remote sensing imagery, GPS technology, crowdsourcing, and compiling existing data sets. We also consider the implications of each method for the production of open data.

  1. Human-Systems Integration (HSI) Methodology Development for NASA Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A technology with game-changing potential for crew to space system interaction will be selected to develop using the HSI Methodology created through the efforts of...

  3. Development of economic consequence methodology for process risk analysis.

    Science.gov (United States)

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
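
    To make the loss-function step concrete, here is a minimal Python sketch of an inverted-normal loss of the general kind named above: the loss is zero when the process variable sits at its target and saturates at a maximum loss as the deviation grows. The exact functional form, maximum-loss constant, and shape parameter are assumptions for illustration.

```python
# Sketch of an inverted-normal loss function for one process variable.

import math

def inverted_normal_loss(y, target, C=1_000_000.0, sigma=5.0):
    """Economic loss grows from 0 at the target toward the maximum
    loss C as the deviation (y - target) increases; sigma sets how
    quickly the loss saturates. C and sigma are assumed values."""
    return C * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * sigma ** 2)))

# Loss for a process deviation of 8 units from a target of 100:
print(f"{inverted_normal_loss(y=108.0, target=100.0):,.0f}")
```

    Integrating such per-deviation losses over all the deviations in a scenario then gives the total economic consequence used in the risk calculation.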

  4. A methodology for development of biocatalytic processes

    DEFF Research Database (Denmark)

    Lima Ramos, Joana

    The potential advantages displayed by biocatalytic processes for organic synthesis (such as exquisite selectivity under mild operating conditions) have prompted an increasing number of processes running on a commercial scale. However, biocatalysis is still a fairly underutilised technology ... A key step in process development is selecting between different process alternatives. The development effort for a novel process is considerable, and thus an increasing number of conceptual process design methods are now applied in the chemical industries. Since the natural environment of the biocatalyst is often very ... interpretable results to enable rational design choices among the different available process technologies. In the particular case of the asymmetric synthesis of chiral amines, the reaction constraints (thermodynamic equilibrium) must be resolved prior to implementation, and these fix the hard boundaries of the operating ...

  5. Developing enterprise collaboration: a methodology to implement and improve interoperability

    Science.gov (United States)

    Daclin, Nicolas; Chen, David; Vallespir, Bruno

    2016-06-01

    The aim of this article is to present a methodology for guiding enterprises to implement and improve interoperability. This methodology is based on three components: a framework of interoperability which structures specific solutions of interoperability and is composed of abstraction levels, views and approaches dimensions; a method to measure interoperability including interoperability before (maturity) and during (operational performances) a partnership; and a structured approach defining the steps of the methodology, from the expression of an enterprise's needs to implementation of solutions. The relationship which consistently relates these components forms the methodology and enables developing interoperability in a step-by-step manner. Each component of the methodology and the way it operates is presented. The overall approach is illustrated in a case study example on the basis of a process between a given company and its dealers. Conclusions and future perspectives are given at the end of the article.

  6. A review of methodologies used in research on cadastral development

    DEFF Research Database (Denmark)

    Silva, Maria Augusta; Stubkjær, Erik

    2002-01-01

    World-wide, much attention has been given to cadastral development. As a consequence of experiences made during the last decades, several authors have stated the need for research in the domain of cadastre and proposed methodologies to be used. The purpose of this paper is to contribute to the acceptance of research methodologies needed for cadastral development, and thereby enhance theory in the cadastral domain. The paper reviews nine publications on cadastre and identifies the methodologies used. The review focuses on the institutional, social, political, and economic aspects of cadastral development, rather than on the technical aspects. The main conclusion of this paper is that the methodologies used are largely those of the social sciences. That agrees with the notion that cadastre relates as much to people and institutions as it relates to land, and that cadastral systems are shaped...

  7. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  8. Advanced Power Plant Development and Analyses Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    G.S. Samuelsen; A.D. Rao

    2006-02-06

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  9. Methodology and basic algorithms of the Livermore Economic Modeling System

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.B.

    1981-03-17

    The methodology and the basic pricing algorithms used in the Livermore Economic Modeling System (EMS) are described. The report explains the derivations of the EMS equations in detail; however, it could also serve as a general introduction to the modeling system. A brief but comprehensive explanation of what EMS is and does, and how it does it is presented. The second part examines the basic pricing algorithms currently implemented in EMS. Each algorithm's function is analyzed and a detailed derivation of the actual mathematical expressions used to implement the algorithm is presented. EMS is an evolving modeling system; improvements in existing algorithms are constantly under development and new submodels are being introduced. A snapshot of the standard version of EMS is provided and areas currently under study and development are considered briefly.

  10. Physical protection evaluation methodology program development and application

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Janghoon; Yoo, Hosik [Korea Institute of Nuclear Non-proliferation and Control, Daejeon (Korea, Republic of)

    2015-10-15

    It is essential to develop a reliable physical protection evaluation methodology for applying physical protection concepts at the design stage. The methodology can be used to assess weak points and improve performance, not only at the design stage but also for nuclear facilities in operation. Analyzing the physical protection properties of nuclear facilities is not trivial, since many interconnected factors affect overall performance. Therefore, several international projects have been organized to develop a systematic physical protection evaluation methodology. INPRO (The International Project on Innovative Nuclear Reactors and Fuel Cycles) and the GIF PRPP (Generation IV International Forum Proliferation Resistance and Physical Protection) methodology are among the best-known evaluation methodologies. INPRO adopts a checklist type of questionnaire and has strengths in analyzing the overall characteristics of facilities in a qualitative way. The COMPRE program has been developed to help general users apply the COMPRE methodology to nuclear facilities. In this work, COMPRE program development and a case study of a hypothetical nuclear facility are presented. The case study shows that the COMPRE PP methodology can be a useful tool to assess the overall physical protection performance of nuclear facilities. To obtain meaningful results from the COMPRE PP methodology, detailed information and comprehensive analysis are required. In particular, it is not trivial to calculate reliable values for PPSE (Physical Protection System Effectiveness) and C (Consequence), while it is relatively straightforward to evaluate LI (Legislative and Institutional framework), MC (Material Control) and HR (Human Resources). To obtain a reliable PPSE value, comprehensive information about the physical protection system, a vital area analysis, and a realistic threat scenario assessment are required. Like...

  11. Modern methodology and applications in spatial-temporal modeling

    CERN Document Server

    Matsui, Tomoko

    2015-01-01

    This book provides a modern introductory tutorial on specialized methodological and applied aspects of spatial and temporal modeling. The areas covered involve a range of topics which reflect the diversity of this domain of research across a number of quantitative disciplines. For instance, the first chapter deals with non-parametric Bayesian inference via a recently developed framework known as kernel mean embedding which has had a significant influence in machine learning disciplines. The second chapter takes up non-parametric statistical methods for spatial field reconstruction and exceedance probability estimation based on Gaussian process-based models in the context of wireless sensor network data. The third chapter presents signal-processing methods applied to acoustic mood analysis based on music signal analysis. The fourth chapter covers models that are applicable to time series modeling in the domain of speech and language processing. This includes aspects of factor analysis, independent component an...

  12. Improved Methodology for Parameter Inference in Nonlinear, Hydrologic Regression Models

    Science.gov (United States)

    Bates, Bryson C.

    1992-01-01

    A new method is developed for the construction of reliable marginal confidence intervals and joint confidence regions for the parameters of nonlinear, hydrologic regression models. A parameter power transformation is combined with measures of the asymptotic bias and asymptotic skewness of maximum likelihood estimators to determine the transformation constants which cause the bias or skewness to vanish. These optimized constants are used to construct confidence intervals and regions for the transformed model parameters using linear regression theory. The resulting confidence intervals and regions can be easily mapped into the original parameter space to give close approximations to likelihood method confidence intervals and regions for the model parameters. Unlike many other approaches to parameter transformation, the procedure does not use a grid search to find the optimal transformation constants. An example involving the fitting of the Michaelis-Menten model to velocity-discharge data from an Australian gauging station is used to illustrate the usefulness of the methodology.
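
    For contrast, the conventional linear-theory route that this method improves upon can be sketched in a few lines with SciPy: fit the Michaelis-Menten model and build approximate confidence intervals from the estimated parameter covariance. The velocity-discharge data below are invented for illustration.

```python
# Michaelis-Menten fit with approximate linear-theory intervals.

import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import t

def michaelis_menten(q, vmax, k):
    """Velocity as a saturating function of discharge q."""
    return vmax * q / (k + q)

discharge = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])    # invented data
velocity = np.array([0.20, 0.33, 0.50, 0.66, 0.80, 0.88])

popt, pcov = curve_fit(michaelis_menten, discharge, velocity, p0=[1.0, 1.0])
se = np.sqrt(np.diag(pcov))
tval = t.ppf(0.975, df=len(discharge) - len(popt))
for name, est, s in zip(["vmax", "k"], popt, se):
    print(f"{name} = {est:.3f} +/- {tval * s:.3f}")   # 95% intervals
```

    The paper's contribution is precisely that intervals built this way can be unreliable for strongly nonlinear models, which the bias- and skewness-removing parameter transformation addresses.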

  13. Prometheus Reactor I&C Software Development Methodology, for Action

    Energy Technology Data Exchange (ETDEWEB)

    T. Hamilton

    2005-07-30

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I&C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I&C Software Development Process Manual and Reactor Module Software Development Plan to NR for information.

  14. A generalized methodology to characterize composite materials for pyrolysis models

    Science.gov (United States)

    McKinnon, Mark B.

    The predictive capabilities of computational fire models have improved in recent years such that models have become an integral part of many research efforts. Models improve the understanding of the fire risk of materials and may decrease the number of expensive experiments required to assess the fire hazard of a specific material or designed space. A critical component of a predictive fire model is the pyrolysis sub-model that provides a mathematical representation of the rate of gaseous fuel production from condensed phase fuels given a heat flux incident to the material surface. The modern, comprehensive pyrolysis sub-models that are common today require the definition of many model parameters to accurately represent the physical description of materials that are ubiquitous in the built environment. Coupled with the increase in the number of parameters required to accurately represent the pyrolysis of materials is the increasing prevalence in the built environment of engineered composite materials that have never been measured or modeled. The motivation behind this project is to develop a systematic, generalized methodology to determine the requisite parameters to generate pyrolysis models with predictive capabilities for layered composite materials that are common in industrial and commercial applications. This methodology has been applied to four common composites in this work that exhibit a range of material structures and component materials. The methodology utilizes a multi-scale experimental approach in which each test is designed to isolate and determine a specific subset of the parameters required to define a material in the model. Data collected in simultaneous thermogravimetry and differential scanning calorimetry experiments were analyzed to determine the reaction kinetics, thermodynamic properties, and energetics of decomposition for each component of the composite. Data collected in microscale combustion calorimetry experiments were analyzed to
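
    The reaction-kinetics parameters extracted from thermogravimetry typically feed an Arrhenius-type rate law in the pyrolysis sub-model. The Python sketch below integrates an nth-order Arrhenius conversion rate under a constant heating ramp; the parameter values are illustrative, not measured.

```python
# nth-order Arrhenius decomposition under a constant heating rate.

import numpy as np

A, E, n = 1e10, 150e3, 1.0        # pre-exponential (1/s), activation energy (J/mol), order
R, beta = 8.314, 10.0 / 60.0      # gas constant (J/mol/K); 10 K/min heating rate in K/s

T = np.arange(300.0, 900.0, 0.1)  # temperature ramp, K
alpha = np.zeros_like(T)          # conversion: 0 (virgin) to 1 (fully decomposed)
dt = (T[1] - T[0]) / beta         # seconds spent in each temperature step

for i in range(1, len(T)):
    rate = A * np.exp(-E / (R * T[i - 1])) * (1.0 - alpha[i - 1]) ** n
    alpha[i] = min(1.0, alpha[i - 1] + rate * dt)

print(f"peak mass-loss temperature ~ {T[np.argmax(np.gradient(alpha))]:.0f} K")
```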

  15. Development of nondestructive testing/evaluation methodology for MEMS

    Science.gov (United States)

    Zunino, James L., III; Skelton, Donald R.; Marinis, Ryan T.; Klempner, Adam R.; Hefti, Peter; Pryputniewicz, Ryszard J.

    2008-02-01

    Development of MEMS constitutes one of the most challenging tasks in today's micromechanics. In addition to design, analysis, and fabrication capabilities, this task also requires advanced test methodologies for determination of functional characteristics of MEMS to enable refinement and optimization of their designs as well as for demonstration of their reliability. Until recently, this characterization was hindered by lack of a readily available methodology. However, using recent advances in photonics, electronics, and computer technology, it was possible to develop a NonDestructive Testing (NDT) methodology suitable for evaluation of MEMS. In this paper, an optoelectronic methodology for NDT of MEMS is described and its application is illustrated with representative examples; this description represents work in progress and the results are preliminary. This methodology provides quantitative full-field-of-view measurements in near real-time with high spatial resolution and nanometer accuracy. By quantitatively characterizing performance of MEMS, under different vibration, thermal, and other operating conditions, specific suggestions for their improvements can be made. Then, using the methodology, we can verify the effects of these improvements. In this way, we can develop better understanding of functional characteristics of MEMS, which will ensure that they are operated at optimum performance, are durable, and are reliable.

  16. Methodology Approaches Regarding Classic versus Mobile Enterprise Application Development

    Directory of Open Access Journals (Sweden)

    Vasile-Daniel PAVALOAIA

    2013-01-01

    Full Text Available In today's computerized enterprise context, there is a trend that shifts business applications to the new mobile environments. In light of this, it is highly important to be knowledgeable about the available software development methodologies in order to make the right choice when developing a mobile application. The current research aims to present the methodological approaches to the development cycle of classic enterprise software versus mobile apps. The first part of the paper gives a brief literature review regarding mobile apps, justifying the current research theme. The main part of the article compares the "classical" development methodologies with the new methodologies adapted to the requirements of the new mobile environment trends. The paper also presents the challenges and limitations of mobile applications, as well as a few of the future trends in the researched domain.

  17. A methodology for modeling barrier island storm-impact scenarios

    Science.gov (United States)

    Mickey, Rangley C.; Long, Joseph W.; Plant, Nathaniel G.; Thompson, David M.; Dalyander, P. Soupy

    2017-02-16

    A methodology for developing a representative set of storm scenarios based on historical wave buoy and tide gauge data for a region at the Chandeleur Islands, Louisiana, was developed by the U.S. Geological Survey. The total water level was calculated for a 10-year period and analyzed against existing topographic data to identify when storm-induced wave action would affect island morphology. These events were categorized on the basis of the threshold of total water level and duration to create a set of storm scenarios that were simulated, using a high-fidelity, process-based, morphologic evolution model, on an idealized digital elevation model of the Chandeleur Islands. The simulated morphological changes resulting from these scenarios provide a range of impacts that can help coastal managers determine resiliency of proposed or existing coastal structures and identify vulnerable areas within those structures.
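
    The screening step, building a total water level series and flagging the hours when it exceeds a morphology-affecting elevation, can be sketched as follows. The synthetic inputs, the crude runup proxy, and the threshold are assumptions for illustration, not the USGS implementation.

```python
# Total-water-level screening against a dune-toe threshold
# (all inputs synthetic; runup proxy and threshold are assumed).

import numpy as np

rng = np.random.default_rng(0)
hours = 24 * 30
tide = 0.4 * np.sin(2 * np.pi * np.arange(hours) / 12.42)  # semidiurnal, m
surge = np.clip(rng.normal(0.1, 0.2, hours), 0.0, None)    # invented, m
hs = np.clip(rng.normal(1.0, 0.5, hours), 0.1, None)       # wave height, m

runup = 0.4 * hs                  # crude wave-runup proxy (assumed)
twl = tide + surge + runup        # total water level, m

dune_toe = 1.2                    # assumed threshold elevation, m
events = twl > dune_toe
print(f"{events.sum()} hours of potential storm impact out of {hours}")
```

    Grouping consecutive exceedance hours by peak level and duration then yields the storm categories that the morphologic evolution model is run on.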

  18. Methodological Approach for Modeling of Multienzyme in-pot Processes

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia; Roman Martinez, Alicia; Sin, Gürkan;

    2011-01-01

    This paper presents a methodological approach for modeling multi-enzyme in-pot processes. The methodology is exemplified stepwise through the bi-enzymatic production of N-acetyl-D-neuraminic acid (Neu5Ac) from N-acetyl-D-glucosamine (GlcNAc). In this case study, sensitivity analysis is also used...

  19. Improved rapid prototyping methodology for MPEG-4 IC development

    Science.gov (United States)

    Tang, Clive K. K.; Moseler, Kathy; Levi, Sami

    1998-12-01

    One important factor in deciding the success of a new consumer product or integrated circuit is minimized time-to- market. A rapid prototyping methodology that encompasses algorithm development in the hardware design phase will have great impact on reducing time-to-market. In this paper, a proven hardware design methodology and a novel top-down design methodology based on Frontier Design's DSP Station tool are described. The proven methodology was used during development of the MC149570 H.261/H.263 video codec manufactured by Motorola. This paper discusses an improvement to this method to create an integrated environment for both system and hardware development, thereby further reducing the time-to-market. The software tool chosen is DSP Station tool by Frontier Design. The rich features of DSP Station tool will be described and then it will be shown how these features may be useful in designing from algorithm to silicon. How this methodology may be used in the development of a new MPEG4 Video Communication ASIC will be outlined. A brief comparison with a popular tool, Signal Processing WorkSystem tool by Cadence, will also be given.

  20. MODERN MODELS AND METHODS OF DIAGNOSIS OF METHODOLOGY COMPETENT TEACHERS

    Directory of Open Access Journals (Sweden)

    Loyko V. I.

    2016-06-01

    Full Text Available The purpose of the research is the development of models and methods for diagnosing the methodical competence of a teacher. According to modern views, methodical thinking is the key competence of teachers. Modern experts consider the methodical competence of a teacher to be a personal and professional quality that is a fundamentally important factor in the success of the teacher's professional activity, as well as a subsystem of his or her professional competence. This is because, in today's world, a high level of knowledge of the academic subject and a command of the basics of teaching methods cannot by themselves fully describe the level of professional competence of the teacher. The authors characterize the functional components of the methodical competence of the teacher, its relationship with other personal-professional qualities (first of all, psychological-pedagogical, research, and informational competence), and its levels of formation. In forming a model of methodical competence, the authors proceeded from the fact that contemporary teachers face high demands: they must be ready to conduct independent research, to design learning technologies, and to forecast the results of the training and education of students. The leading component of the methodical competence of the teacher is personal experience in methodical activity, and the requirements on methodical competence are determined by the goals and objectives of methodical activity. In the present study, the formation of models of methodical competence was preceded by a refinement of existing models of the methodical activity of the academic staff of higher education institutions and secondary vocational education institutions. The proposed model of methodical competence of the teacher is the scientific basis of a system for monitoring his or her personal and professional development, and the evaluation criteria and levels of its diagnosis - targets system of ...

  1. Optimal Data Split Methodology for Model Validation

    CERN Document Server

    Morrison, Rebecca; Terejanu, Gabriel; Miki, Kenji; Prudhomme, Serge

    2011-01-01

    The decision to incorporate cross-validation into validation processes of mathematical models raises an immediate question - how should one partition the data into calibration and validation sets? We answer this question systematically: we present an algorithm to find the optimal partition of the data subject to certain constraints. While doing this, we address two critical issues: 1) that the model be evaluated with respect to predictions of a given quantity of interest and its ability to reproduce the data, and 2) that the model be highly challenged by the validation set, assuming it is properly informed by the calibration set. This framework also relies on the interaction between the experimentalist and/or modeler, who understand the physical system and the limitations of the model; the decision-maker, who understands and can quantify the cost of model failure; and the computational scientists, who strive to determine if the model satisfies both the modeler's and decision maker's requirements. We also note...
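
    A brute-force toy version of the question conveys the flavor: among candidate calibration/validation partitions, find the validation set that most challenges a model fit to the calibration data. The linear model and the RMSE-based challenge metric below are illustrative assumptions, not the authors' algorithm or constraints.

```python
# Exhaustive search over calibration/validation splits of a toy data set.

import itertools
import numpy as np

x = np.linspace(0, 10, 12)
y = 2.0 * x + 1.0 + np.random.default_rng(1).normal(0, 0.5, x.size)

best = None
for val_idx in itertools.combinations(range(x.size), 4):
    cal_idx = [i for i in range(x.size) if i not in val_idx]
    coef = np.polyfit(x[cal_idx], y[cal_idx], deg=1)       # calibrate
    resid = y[list(val_idx)] - np.polyval(coef, x[list(val_idx)])
    challenge = float(np.sqrt(np.mean(resid ** 2)))        # validate
    if best is None or challenge > best[0]:
        best = (challenge, val_idx)

print(f"most challenging validation set: {best[1]} (RMSE {best[0]:.3f})")
```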

  2. Development of an integrated methodology for the design and optimization of charging and EGR circuits in modern diesel engines based on 1D-CFD engine modelling

    Energy Technology Data Exchange (ETDEWEB)

    Arrigoni, Stefano; Avolio, Giovanni; Loudjertli, Lydia; Renella, Alfonso; Vassallo, Alberto [General Motors Powertrain Europe, Turin (Italy)

    2011-07-01

    In modern diesel engines, the requirements on the combustion system are very tight, due to an aggressive combination of pollutant emission, fuel economy, NVH and fun-to-drive targets. In particular, the charging and EGR circuits, with their impact on combustion system performance, deserve special attention, both in terms of architecture selection and in terms of component design and specifications. Since most of these choices occur very early in the engine design phase, it is highly important to have a reliable analytical tool capable of predicting the performance of such components before the actual hardware is available for testing. The present paper describes the development, and application to a new diesel engine, of an integrated approach for charging and EGR circuit design optimization, based on a set of high-level targets for emissions, fuel economy and performance. To achieve this goal, a 1D-CFD approach based on the GT-Power suite has been employed: specific sub-routines and semi-empirical models for accurate heat-release and emission prediction have been developed, validated, and finally applied to a light-duty passenger car diesel engine under development. The results show that the tool is capable of predicting the engine indicated cycle as well as NOx and PM emissions depending on the characteristics of the charging and EGR circuits, and can be used to cascade high-level engine targets to component specifications (turbocharger, EGR cooler, intercooler) in an effective way. (orig.)

  3. Methodology for development of risk indicators for offshore platforms

    Energy Technology Data Exchange (ETDEWEB)

    Oeien, K.; Sklet, S. [SINTEF Industrial Management Safety and Reliability (Norway)

    1999-09-01

    This paper presents a generic methodology for development of risk indicators for petroleum installations and a specific set of risk indicators established for one offshore platform. The risk indicators should be used to control the risk during operation of platforms. The methodology is purely risk-based and the basis for development of risk indicators is the platform specific quantitative risk analysis (QRA). In order to identify high risk contributing factors, platform personnel are asked to assess whether and how much the risk influencing factors will change. A brief comparison of probabilistic safety assessment (PSA) for nuclear power plants and quantitative risk analysis (QRA) for petroleum platforms is also given. (au)

  4. The Health Behaviour in School-aged Children (HBSC) study: methodological developments and current tensions

    DEFF Research Database (Denmark)

    Roberts, C; Freeman, J; Samdal, O

    2009-01-01

    OBJECTIVES: To describe the methodological development of the HBSC survey since its inception and explore methodological tensions that need to be addressed in the ongoing work on this and other large-scale cross-national surveys. METHODS: Using archival data and conversations with members of the network, we collaboratively analysed our joint understandings of the survey's methodology. RESULTS: We identified four tensions that are likely to be present in upcoming survey cycles: (1) maintaining quality standards against a background of rapid growth, (2) continuous improvement with limited financial ... Experience in working through such challenges renders it likely that HBSC can provide a model for other similar studies facing these tensions.

  5. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare

    NARCIS (Netherlands)

    Bracke, M.B.M.; Edwards, S.A.; Metz, J.H.M.; Noordhuizen, J.P.T.M.; Algers, B.

    2008-01-01

    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally applied ...

  6. Service Innovation Methodologies II : How can new product development methodologies be applied to service innovation and new service development? : Report no 2 from the TIPVIS-project

    OpenAIRE

    Nysveen, Herbjørn; Pedersen, Per E.; Aas, Tor Helge

    2007-01-01

    This report presents various methodologies used in new product development and product innovation and discusses the relevance of these methodologies for service development and service innovation. The service innovation relevance for all of the methodologies presented is evaluated along several service specific dimensions, like intangibility, inseparability, heterogeneity, perishability, information intensity, and co-creation. The methodologies discussed are mainly collect...

  8. Development of a novel methodology for indoor emission source identification

    DEFF Research Database (Denmark)

    Han, K.H.; Zhang, J.S.; Knudsen, H.N.

    2011-01-01

    -line analytical device. The methodology based on signal processing principles was developed by employing the method of multiple regression least squares (MRLS) and a normalization technique. Samples of nine typical building materials were tested individually and in combination, including carpet, ceiling material...
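
    The regression idea can be illustrated with a nonnegative least-squares unmixing of a measured mixture signature against a library of single-material signatures. Everything in the sketch is invented: the sensor channels, the signature matrix, and the material names.

```python
# Nonnegative least-squares unmixing of an emission signature.

import numpy as np
from scipy.optimize import nnls

# Columns: single-material signatures (e.g., responses in four VOC
# channels) for carpet, ceiling material, and paint (invented values).
library = np.array([
    [0.9, 0.1, 0.2],
    [0.2, 0.8, 0.1],
    [0.1, 0.2, 0.7],
    [0.3, 0.3, 0.3],
])

true_mix = np.array([1.0, 0.0, 0.5])          # carpet + paint present
measured = library @ true_mix + 0.01          # mixture with slight bias

weights, residual = nnls(library, measured)   # nonnegative contributions
for name, w in zip(["carpet", "ceiling", "paint"], weights):
    print(f"{name}: {w:.2f}")
```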

  9. A Fault-tolerant Development Methodology for Industrial Control Systems

    DEFF Research Database (Denmark)

    Izadi-Zamanabadi, Roozbeh; Thybo, C.

    2004-01-01

    and logically sound manner. This paper presents the employed fault-tolerant development methodology and highlights the steps that have been essential for achieving complete and consistent monitoring capabilities. Fault diagnosis for a commercial refrigeration system is treated as a case study.

  10. Development of Management Methodology for Engineering Production Quality

    Science.gov (United States)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors of the paper propose four directions for developing a methodology for the quality management of engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organizational context taking stakeholders into account, the use of risk management, the management of in-house knowledge, and assessment of enterprise activity according to the criteria of effectiveness.

  11. Research Methodology on Language Development from a Complex Systems Perspective

    Science.gov (United States)

    Larsen-Freeman, Diane; Cameron, Lynne

    2008-01-01

    Changes to research methodology motivated by the adoption of a complexity theory perspective on language development are considered. The dynamic, nonlinear, and open nature of complex systems, together with their tendency toward self-organization and interaction across levels and timescales, requires changes in traditional views of the functions…

  12. Prioritization Methodology for Development of Required Operational Capabilities

    Science.gov (United States)

    2010-04-01


  13. MULTIMEDIA MOBILE CONTENT DEVELOPMENT FRAMEWORK AND METHODOLOGY FOR DEVELOPING M-LEARNING APPLICATIONS

    Directory of Open Access Journals (Sweden)

    Wan Sazli Nasaruddin Saifudin

    2012-07-01

    Full Text Available Mobile device limitations such as small screen resolution, limited data space and slow processing speed pose challenges to developers building good m-learning applications. Therefore, aspects such as content design, navigation design and mobile HCI are critical and need greater attention during the development phase. The purpose of this paper is to discuss a Multimedia Mobile Content Development (MMCD) framework and methodology for developing an effective m-learning application that focuses on user needs. MMCD is based on the characteristics of an agile development model, and because it uses Flash Lite (FL) technology, which is widely supported by today's mobile devices, the final output is compatible with the majority of available mobile devices, which encourages mobile-learning activities. Content design and navigation control are two development aspects that help optimize the development of learning applications for mobile devices.

  14. High level models and methodologies for information systems

    CERN Document Server

    Isaias, Pedro

    2014-01-01

    This book introduces methods and methodologies in Information Systems (IS) by presenting, describing, explaining, and illustrating their uses in various contexts, including website development, usability evaluation, quality evaluation, and success assessment.

  15. Applying of component system development in object methodology

    Directory of Open Access Journals (Sweden)

    Milan Mišovič

    2013-01-01

    software system and referred to as a software alliance. Both of the publications mentioned deliver a deep treatment of the relevant issues concerning SWC/SWA, such as creating copies of components (cloning), the creation and destruction of components at software run-time (dynamic reconfiguration), the cooperation of autonomous components, and the programmable management of component interfaces depending on internal component functionality and customer requirements (functionality, security, versioning). Nevertheless, even today we can meet numerous cases of existing SWC/SWA with highly developed architectures that accept the vast majority of these requests. On the other hand, development practice with component-based systems with a dynamic architecture (i.e., architecture with dynamic reconfiguration) and, finally, with a mobile architecture (i.e., architecture with dynamic component mobility) confirms the inadequacy of the design methods contained in UML 2.0; the dissertation thesis (Rych, Weis, 2008) proves this in particular. Software engineering currently has two different approaches to SWC/SWA systems. The first approach is known as component-oriented software development, CBD (Component Based Development). According to (Szyper, 2002), this is a collection of CBD methodologies that are heavily focused on the setting up and re-usability of software components within an architecture. Although CBD does not show a highly theoretical approach, it is nevertheless classified within the general evolution of the SDP (Software Development Process; see Sommer, 2010) as one of its two dominant directions. From a structural point of view, a software system consists of self-contained, interoperable architectural units (components) based on well-defined interfaces. Classical procedural object-oriented methodologies make little use of the component meta-models on which target component systems are then formed. Component meta-models describe the syntax and semantics of ...

  16. Selecting a software development methodology. [of digital flight control systems

    Science.gov (United States)

    Jones, R. E.

    1981-01-01

    The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques; specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.

  17. The Development of Methodology to Support Comprehensive Approach: TMC

    Science.gov (United States)

    2014-05-02

    methodology components, each of which supports a different spectrum of multidisciplinary teamwork. The TMC development has been an iterative process and it ... program that aims at exploring new ideas with high potential but high risks. The concept IMAGE developed in this project aimed at improving the ... increasing understanding of a complex situation and enabling individuals to share their comprehension. IMAGE is a software toolset concept proposed

  18. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana

    2016-06-01

    Full Text Available This paper presents a software engineering approach to a research proposal for building an expert system for scheduling in service systems, using software development methodologies and processes. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We create UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology and to identify the actors, elements and interactions in the research process.

  19. Methodology to develop and evaluate a semantic representation for NLP.

    Science.gov (United States)

    Irwin, Jeannie Y; Harkema, Henk; Christensen, Lee M; Schleyer, Titus; Haug, Peter J; Chapman, Wendy W

    2009-11-14

    Natural language processing applications that extract information from text rely on semantic representations. The objective of this paper is to describe a methodology for creating a semantic representation for information that will be automatically extracted from textual clinical records. We illustrate two of the four steps of the methodology in this paper using the case study of encoding information from dictated dental exams: (1) develop an initial representation from a set of training documents and (2) iteratively evaluate and evolve the representation while developing annotation guidelines. Our approach for developing and evaluating a semantic representation is based on standard principles and approaches that are not dependent on any particular domain or type of semantic representation.

  20. Industrial environmental performance assessment: developing a referential methodology

    Directory of Open Access Journals (Sweden)

    Andréia Marize Rodrigues

    2015-02-01

    Full Text Available This study aimed to develop a methodological framework for the broad evaluation of environmental performance in industrial companies. The framework comprises nine evaluation aspects: Organizational Management, Human Resources, Product, Production Process, Physical Facilities, Emissions, Social Development, Economic and Financial Aspects, and Media. For each of these aspects, evaluation indicators were created, 35 in total. Each indicator is accompanied by a numeric ID, a description, a generic goal, its metric unit of measure, and a scale to measure compliance with the indicator. As an illustration, the proposed framework was applied in an industrial metal-mechanical company, which achieved an environmental performance of 84.0%.
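
    One plausible way to roll indicator scores up into a single performance percentage is a normalized average, sketched below. The aspects, scores, 1-to-5 scale, and equal weighting are assumptions, since the abstract does not state the aggregation rule.

```python
# Normalized-average roll-up of indicator scores (assumed rule).

indicators = {
    # aspect: indicator scores on an assumed 1-to-5 scale
    "Organizational Management": [5, 4, 4, 3],
    "Human Resources": [4, 4, 5],
    "Emissions": [3, 4, 4, 5, 4],
}

def performance(scores_by_aspect, scale_max=5):
    scores = [s for group in scores_by_aspect.values() for s in group]
    return 100.0 * sum(scores) / (scale_max * len(scores))

print(f"environmental performance: {performance(indicators):.1f}%")
```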

  1. Integrated methodology for constructing a quantified hydrodynamic model for application to clastic petroleum reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Honarpour, M. M.; Schatzinger, R. A.; Szpakiewicz, M. J.; Jackson, S. R.; Sharma, B.; Tomutsa, L.; Chang, M. M.

    1990-01-01

    A comprehensive, multidisciplinary, stepwise methodology is developed for constructing and integration geological and engineering information for predicting petroleum reservoir performance. This methodology is based on our experience in characterizing shallow marine reservoirs, but it should also apply to other deposystems. The methodology is presented as Part 1 of this report. Three major tasks that must be studied to facilitate a systematic approach for constructing a predictive hydrodynamic model for petroleum reservoirs are addressed: (1) data collection, organization, evaluation, and integration; (2) hydrodynamic model construction and verification; and (3) prediction and ranking of reservoir parameters by numerical simulation using data derived from the model. 39 refs., 62 figs., 13 tabs.

  2. A cislunar guidance methodology and model for low thrust trajectory generation

    Science.gov (United States)

    Korsmeyer, David J.

    1992-01-01

    A guidance methodology for generating low-thrust cislunar trajectories was developed and incorporated in a computer model. The guidance methodology divides the cislunar transfer into three phases. Each phase is discussed in turn. To establish the effectiveness of the methodology and algorithms the computer model generated three example cases for the cislunar transfer of a low-thrust electric orbital transfer vehicle (EOTV). Transfers from both earth orbit to lunar orbit and from lunar orbit back to earth orbit are considered. The model allows the determination of the low-thrust EOTV's time-of-flight, propellant mass, payload mass, and thrusting history.

  3. Development of an aeroelastic methodology for surface morphing rotors

    Science.gov (United States)

    Cook, James R.

Helicopter performance capabilities are limited by maximum lift characteristics and vibratory loading. In high-speed forward flight, dynamic stall and transonic flow greatly increase the amplitude of vibratory loads. Experiments and computational simulations alike have indicated that a variety of active rotor control devices are capable of reducing vibratory loads. For example, periodic blade twist and flap excitation have been optimized to reduce vibratory loads in various rotors. Airfoil geometry can also be modified in order to increase lift coefficient, delay stall, or weaken transonic effects. To explore the potential benefits of active controls, computational methods are being developed for aeroelastic rotor evaluation, including coupling between computational fluid dynamics (CFD) and computational structural dynamics (CSD) solvers. In many contemporary CFD/CSD coupling methods it is assumed that the airfoil is rigid, to reduce the interface by a single dimension. Some methods retain the conventional one-dimensional beam model while prescribing an airfoil shape to simulate active chord deformation. However, to simulate the actual response of a compliant airfoil, it is necessary to include deformations that originate not only from control devices (such as piezoelectric actuators), but also from inertial forces, elastic stresses, and aerodynamic pressures. An accurate representation of the physics requires an interaction with a more complete representation of loads and geometry. A CFD/CSD coupling methodology capable of communicating three-dimensional structural deformations and a distribution of aerodynamic forces over the wetted blade surface has not yet been developed. In this research an interface is created within the Fully Unstructured Navier-Stokes (FUN3D) solver that communicates aerodynamic forces on the blade surface to University of Michigan's Nonlinear Active Beam Solver (UM/NLABS -- referred to as NLABS in this thesis). Interface routines are developed for
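
One piece of such an interface is the transfer of the aerodynamic surface force distribution onto the structural model. The sketch below shows a deliberately crude version of that step, lumping each CFD surface force onto its nearest structural node so that total load is conserved; the actual FUN3D/NLABS routines are far more sophisticated, and all arrays here are synthetic stand-ins.

```python
import numpy as np
from scipy.spatial import cKDTree

# Synthetic stand-ins for wetted-surface CFD nodes/forces and CSD nodes.
rng = np.random.default_rng(0)
cfd_nodes = rng.random((2000, 3))    # CFD surface node coordinates
cfd_forces = rng.random((2000, 3))   # force vector at each CFD node
csd_nodes = rng.random((150, 3))     # structural node coordinates

# Lump each aerodynamic force onto its nearest structural node.
nearest = cKDTree(csd_nodes).query(cfd_nodes)[1]
csd_loads = np.zeros_like(csd_nodes)
np.add.at(csd_loads, nearest, cfd_forces)

# The mapping conserves the total force by construction.
assert np.allclose(csd_loads.sum(axis=0), cfd_forces.sum(axis=0))
print("total transferred force:", csd_loads.sum(axis=0))
```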

  4. Development of test methodology for dynamic mechanical analysis instrumentation

    Science.gov (United States)

    Allen, V. R.

    1982-08-01

Dynamic mechanical analysis instrumentation was used for the development of specific test methodology in the determination of engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperatures in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplifies the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiber-glass supported resin. The correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system yielded a negative result, because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.

  5. Modeling of development and projection of the accumulated recoverable oil volume: methodology and application; Modelagem da evolucao e projecao de volume de oleo recuperavel acumulado: metodologia e aplicacao

    Energy Technology Data Exchange (ETDEWEB)

    Melo, Luciana Cavalcanti de; Ferreira Filho, Virgilio Jose Martins; Rocha, Vinicius Brito [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE)

    2004-07-01

A relevant problem that petroleum companies deal with is estimating future levels of reserves. Reserve forecasting is pursued through the construction of mathematical models. Because exploration is an informed and controlled process, it proceeds as a sequence of decisions based on the results achieved so far, aimed at reaching the exploration targets. Such decisions are made in an uncertain environment, compounded by the random nature of the process itself. Another important assumption that must be taken into consideration is the dependence of exploration on the conditions, or structure, of the discovered resources and on the final potential. The modeling starts with the statement of a general problem, when the models are constructed from suppositions associated with the main concepts, and ends with the attainment of specific solutions, when the best description, or model, is selected by estimating the respective parameters and the quality of fit. The result of this approach reflects the essence of the exploration process and how it is expressed in the incorporation of reserves and the history of field discoveries. A case study is used for validation of the models and the estimates. (author)

  6. Modeling survival in colon cancer: a methodological review

    Directory of Open Access Journals (Sweden)

    Holbert Don

    2007-02-01

Full Text Available Abstract The Cox proportional hazards model is the most widely used model for survival analysis because of its simplicity. The fundamental assumption in this model is the proportionality of the hazard function. When this condition is not met, modifications or alternative models must be used for the analysis of survival data. We illustrate in this review several methodological approaches to dealing with the violation of the proportionality assumption, using survival in colon cancer as an illustrative example.
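
One practical way to test the proportionality assumption is a Schoenfeld-residual-based test. The sketch below fits a Cox model and runs such a test with the Python lifelines library on its bundled example dataset; it is a generic illustration of the check, not the review's own colon cancer analysis.

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi
from lifelines.statistics import proportional_hazard_test

df = load_rossi()  # bundled example data, standing in for a cancer cohort
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")

# Schoenfeld-residual test: small p-values flag covariates that violate
# the proportional hazards assumption.
result = proportional_hazard_test(cph, df, time_transform="rank")
print(result.summary)
```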

  7. Towards the development of a global probabilistic tsunami risk assessment methodology

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2017-04-01

The assessment of tsunami risk is on many levels still ambiguous and under discussion. Over the last two decades, various methodologies and models have been developed to quantify tsunami risk, mostly on a local or regional level, with either a deterministic or probabilistic background. Probabilistic modelling faces significant difficulties, as the underlying tsunami hazard modelling demands an immense amount of computational time and thus limits the assessment substantially, often restricting it to institutes with supercomputing access or forcing modellers to reduce resolution quantitatively or qualitatively. Furthermore, data on the vulnerability of infrastructure and buildings are empirically limited to a few disasters in recent years. Thus, a reliable quantification of socio-economic vulnerability is still questionable. Nonetheless, significant improvements have recently been made on both the methodological and computational sides. This study introduces a methodological framework for a globally uniform probabilistic tsunami risk assessment. Here, the power of recently developed hardware for desktop-based parallel computing plays a crucial role in the calculation of numerical tsunami wave propagation, while large-scale parametric models and paleo-seismological data enhance the return-period assessment of tsunamigenic megathrust earthquake events. Adaptation of empirical tsunami vulnerability functions, in conjunction with methodologies from flood modelling, supports a more reliable vulnerability quantification. In addition, methodologies for exposure modelling in coastal areas are introduced, focusing on the diversity of coastal exposure landscapes and data availability. Overall, this study gives a first overview of how a global tsunami risk modelling framework may be accomplished, covering methodological, computational and data-driven aspects.

  8. Design Intelligent Model base Online Tuning Methodology for Nonlinear System

    Directory of Open Access Journals (Sweden)

    Ali Roshanzamir

    2014-04-01

Full Text Available In systems whose dynamic parameters vary and must be learned on line, adaptive control methodology is used. In this paper a fuzzy model-based adaptive methodology is used to tune a linear Proportional Integral Derivative (PID) controller. The main objectives in any system are stability, robustness and reliability. The PID controller is used in many applications, but controlling a continuum robot with it poses many challenges. To solve these problems, a nonlinear adaptive methodology based on model-based fuzzy logic is used. This research aims to reduce or eliminate the PID controller's problems by applying model-reference fuzzy logic theory to control a flexible robot manipulator system, and tests the quality of process control in the MATLAB/SIMULINK simulation environment.
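
A heavily simplified sketch of the idea is a PID loop whose gains are reshaped on line by a membership-like weight on the error magnitude. The code below is a toy stand-in for the paper's model-reference fuzzy tuner, run against a first-order plant; the gain-scheduling rule and all constants are invented for illustration.

```python
class AdaptivePID:
    """PID whose gains are scaled by a crude fuzzy-like rule:
    large |error| boosts proportional action, small |error| boosts integral.
    A toy stand-in for a model-reference fuzzy tuner, not the paper's design."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        big = min(abs(error), 1.0)          # membership-like weight in [0, 1]
        kp = self.kp * (0.5 + big)          # more P far from the setpoint
        ki = self.ki * (1.5 - big)          # more I near the setpoint
        self.integral += error * self.dt
        deriv = (error - self.prev_error) / self.dt
        self.prev_error = error
        return kp * error + ki * self.integral + self.kd * deriv

# Closed loop with a first-order plant dy/dt = -y + u, setpoint 1.0
pid, y, dt = AdaptivePID(2.0, 0.5, 0.1, 0.01), 0.0, 0.01
for _ in range(1000):
    u = pid.update(1.0 - y)
    y += (-y + u) * dt
print(f"output after 10 s: {y:.3f}")
```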

  9. HIERARCHICAL METHODOLOGY FOR MODELING HYDROGEN STORAGE SYSTEMS PART II: DETAILED MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, B; Donald L. Anton, D

    2008-12-22

There is significant interest in hydrogen storage systems that employ a medium which either adsorbs, absorbs or reacts with hydrogen in a nearly reversible manner. In any media-based storage system, the rate of hydrogen uptake and the system capacity are governed by a number of complex, coupled physical processes. To design and evaluate such storage systems, a comprehensive methodology was developed, consisting of a hierarchical sequence of models that range from scoping calculations to numerical models that couple reaction kinetics with heat and mass transfer for both the hydrogen charging and discharging phases. The scoping models were presented in Part I [1] of this two part series of papers. This paper describes a detailed numerical model that integrates the phenomena occurring when hydrogen is charged and discharged. A specific application of the methodology is made to a system using NaAlH{sub 4} as the storage media.
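
At the scoping end of such a hierarchy, the coupled behavior can be caricatured by a first-order uptake law with an Arrhenius rate and a lumped heat balance. The sketch below integrates this toy pair of ODEs; every parameter value is invented for illustration and none represents NaAlH4 data.

```python
import numpy as np
from scipy.integrate import solve_ivp

R = 8.314             # gas constant, J/(mol K)
A, Ea = 1.0e4, 4.0e4  # pre-exponential (1/s) and activation energy (J/mol)
dH = -4.0e4           # reaction enthalpy (J/mol H2), exothermic on charging
mcp = 5.0e3           # lumped heat capacity of the bed (J/K)
hA = 50.0             # cooling conductance to coolant (W/K)
T_COOL, N_MAX = 300.0, 10.0  # coolant temperature (K), bed capacity (mol H2)

def rhs(t, y):
    n, T = y  # absorbed hydrogen (mol), bed temperature (K)
    rate = A * np.exp(-Ea / (R * T)) * (N_MAX - n)   # first-order uptake
    dT = (-dH * rate - hA * (T - T_COOL)) / mcp      # reaction heat vs cooling
    return [rate, dT]

sol = solve_ivp(rhs, (0.0, 3600.0), [0.0, 300.0], max_step=1.0)
print(f"uptake after 1 h: {sol.y[0, -1]:.2f} mol, bed T: {sol.y[1, -1]:.1f} K")
```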

  10. Latest developments on safety analysis methodologies at the Juzbado plant

    Energy Technology Data Exchange (ETDEWEB)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A. [ENUSA Industrias Avanzadas S. A., Juzbado Nuclear Fuel Fabrication Plant, Ctra. Salamanca-Ledesma, km. 26, 37015 Juzbado, Salamanca (Spain)

    2010-07-01

Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, namely the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, such as Regulator requirements, operational experience and, of course, the strong commitment of ENUSA to maintaining the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, and within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated early in 2009 to assess the actual operating conditions of all the systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points related to these three methodologies as well as an outline of the results obtained so far. (authors)

  11. Coal resources available for development; a methodology and pilot study

    Science.gov (United States)

    Eggleston, Jane R.; Carter, M. Devereux; Cobb, James C.

    1990-01-01

    Coal accounts for a major portion of our Nation's energy supply in projections for the future. A demonstrated reserve base of more than 475 billion short tons, as the Department of Energy currently estimates, indicates that, on the basis of today's rate of consumption, the United States has enough coal to meet projected energy needs for almost 200 years. However, the traditional procedures used for estimating the demonstrated reserve base do not account for many environmental and technological restrictions placed on coal mining. A new methodology has been developed to determine the quantity of coal that might actually be available for mining under current and foreseeable conditions. This methodology is unique in its approach, because it applies restrictions to the coal resource before it is mined. Previous methodologies incorporated restrictions into the recovery factor (a percentage), which was then globally applied to the reserve (minable coal) tonnage to derive a recoverable coal tonnage. None of the previous methodologies define the restrictions and their area and amount of impact specifically. Because these restrictions and their impacts are defined in this new methodology, it is possible to achieve more accurate and specific assessments of available resources. This methodology has been tested in a cooperative project between the U.S. Geological Survey and the Kentucky Geological Survey on the Matewan 7.5-minute quadrangle in eastern Kentucky. Pertinent geologic, mining, land-use, and technological data were collected, assimilated, and plotted. The National Coal Resources Data System was used as the repository for data, and its geographic information system software was applied to these data to eliminate restricted coal and quantify that which is available for mining. This methodology does not consider recovery factors or the economic factors that would be considered by a company before mining. Results of the pilot study indicate that, of the estimated

  12. Development of methodology to prioritise wildlife pathogens for surveillance.

    Science.gov (United States)

    McKenzie, Joanna; Simpson, Helen; Langstaff, Ian

    2007-09-14

We developed and evaluated a methodology to prioritise pathogens for a wildlife disease surveillance strategy in New Zealand. The methodology, termed 'rapid risk analysis', was based on the import risk analysis framework recommended by the Office International des Epizooties (OIE), and involved hazard identification, risk estimation, and ranking of 48 exotic and 34 endemic wildlife pathogens. The risk assessment was more rapid than a full quantitative assessment through the use of a semi-quantitative approach to score pathogens for probability of entry to NZ (release assessment), likelihood of spread (exposure assessment) and consequences in free-living wildlife, captive wildlife, humans, livestock and companion animals. Risk was estimated by multiplying the scores for the probability of entry to New Zealand by the likelihood of spread by the consequences for free-living wildlife, humans and livestock. The rapid risk analysis methodology produced scores that were sufficiently differentiated between pathogens to be useful for ranking them on the basis of risk. Ranking pathogens on the basis of the risk estimate for each population sector provided an opportunity to identify the priorities within each sector alone, thus avoiding value-laden comparisons between sectors. Ranking pathogens across all three population sectors by summing the risk estimates for each sector provided a comparison of total risk, which may be useful for resource allocation decisions at national level. Ranking pathogens within each wildlife taxonomic group using the total risk estimate was most useful for developing specific surveillance strategies for each group.
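
The scoring scheme described lends itself to a few lines of code: per-sector risk is the product of the entry, spread and consequence scores, and total risk is the sum over sectors. The sketch below reproduces that arithmetic with invented pathogen names and scores.

```python
# Semi-quantitative scores (invented): entry and spread on 1-5 scales,
# consequence scored per population sector.
pathogens = {
    "pathogen A": {"entry": 3, "spread": 4,
                   "consequences": {"wildlife": 5, "humans": 1, "livestock": 4}},
    "pathogen B": {"entry": 2, "spread": 5,
                   "consequences": {"wildlife": 2, "humans": 4, "livestock": 1}},
}

def total_risk(p):
    # sum over sectors of entry x spread x sector consequence
    return sum(p["entry"] * p["spread"] * c for c in p["consequences"].values())

for name in sorted(pathogens, key=lambda n: total_risk(pathogens[n]), reverse=True):
    print(f"{name}: total risk {total_risk(pathogens[name])}")
```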

  13. Novel computational methodologies for structural modeling of spacious ligand binding sites of G-protein-coupled receptors: development and application to human leukotriene B4 receptor.

    Science.gov (United States)

    Ishino, Yoko; Harada, Takanori

    2012-01-01

This paper describes a novel method to predict the activated structures of G-protein-coupled receptors (GPCRs) with high accuracy, while aiming for the use of the predicted 3D structures in future in silico virtual screening. We propose a new method for modeling GPCR thermal fluctuations, in which conformation changes of the proteins are modeled by combining fluctuations on multiple time scales. The core idea of the method is that a molecular dynamics simulation is used to calculate average 3D coordinates of all atoms of a GPCR protein against heat fluctuation on the picosecond or nanosecond time scale, and then evolutionary computation, incorporating receptor-ligand docking simulations, determines the rotation angle of each helix of the GPCR protein as a movement on a longer time scale. The method was validated using human leukotriene B4 receptor BLT1 as a sample GPCR. Our study demonstrated that the proposed method was able to derive the appropriate 3D structure of the active-state GPCR that docks with its agonists.

  14. Novel Computational Methodologies for Structural Modeling of Spacious Ligand Binding Sites of G-Protein-Coupled Receptors: Development and Application to Human Leukotriene B4 Receptor

    Directory of Open Access Journals (Sweden)

    Yoko Ishino

    2012-01-01

Full Text Available This paper describes a novel method to predict the activated structures of G-protein-coupled receptors (GPCRs) with high accuracy, while aiming for the use of the predicted 3D structures in future in silico virtual screening. We propose a new method for modeling GPCR thermal fluctuations, in which conformation changes of the proteins are modeled by combining fluctuations on multiple time scales. The core idea of the method is that a molecular dynamics simulation is used to calculate average 3D coordinates of all atoms of a GPCR protein against heat fluctuation on the picosecond or nanosecond time scale, and then evolutionary computation, incorporating receptor-ligand docking simulations, determines the rotation angle of each helix of the GPCR protein as a movement on a longer time scale. The method was validated using human leukotriene B4 receptor BLT1 as a sample GPCR. Our study demonstrated that the proposed method was able to derive the appropriate 3D structure of the active-state GPCR that docks with its agonists.

  15. An automated methodology development. [software design for combat simulation

    Science.gov (United States)

    Hawley, L. R.

    1985-01-01

The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life-cycle costs and ease program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real time, and one-to-one correlation between the object states and real-world states. The simulation design process was automated by the Problem Statement Language (PSL)/Problem Statement Analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance code efficiency by, e.g., eliminating unused subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  16. Development of tools, technologies, and methodologies for imaging sensor testing

    Science.gov (United States)

    Lowry, H.; Bynum, K.; Steely, S.; Nicholson, R.; Horne, H.

    2013-05-01

Ground testing of space- and air-borne imaging sensor systems is supported by Vis-to-LWIR imaging sensor calibration and characterization, as well as hardware-in-the-loop (HWIL) simulation with high-fidelity complex scene projection to validate sensor mission performance. Accomplishing this successfully requires the development of tools, technologies, and methodologies for use in space simulation chambers for such testing. This paper provides an overview of such efforts being investigated and implemented at Arnold Engineering Development Complex (AEDC).

  17. DIGITAL GEOMETRIC MODELLING OF TEETH PROFILE BY USING CAD METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Krzysztof TWARDOCH

    2014-03-01

Full Text Available This article is devoted to the problem of properly defining a spatial model of the tooth profile with CAD methodologies. It is motivated by the problem of the accuracy with which the defined curves describing tooth geometry are mapped. Particular attention was paid to precise geometric modeling of the involute tooth profile, which has a significant influence on the identification of mesh stiffness in studies of the dynamic phenomena occurring in gear transmission systems conducted using dynamic models.
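
The involute itself has a compact parametric form: for a base circle of radius rb, x = rb(cos t + t sin t) and y = rb(sin t - t cos t). The sketch below generates profile points from these standard equations; the module, tooth count and pressure angle are illustrative.

```python
import numpy as np

def involute_profile(base_radius, n_points=50, t_max=0.8):
    """Points on the involute of a base circle:
    x = rb (cos t + t sin t), y = rb (sin t - t cos t)."""
    t = np.linspace(0.0, t_max, n_points)
    x = base_radius * (np.cos(t) + t * np.sin(t))
    y = base_radius * (np.sin(t) - t * np.cos(t))
    return x, y

# Example gear: module 2 mm, 20 teeth, 20 degree pressure angle
m, z, alpha = 2.0, 20, np.radians(20.0)
rb = 0.5 * m * z * np.cos(alpha)   # base radius = (m z / 2) cos(alpha)
x, y = involute_profile(rb)
print(f"base radius {rb:.3f} mm, profile out to r = {np.hypot(x, y).max():.3f} mm")
```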

  18. A methodology for constructing the calculation model of scientific spreadsheets

    OpenAIRE

    Vos, de, Ans; Wielemaker, J.; Schreiber, G.; Wielinga, B.; Top, J.L.

    2015-01-01

Spreadsheet models are frequently used by scientists to analyze research data. These models are typically described in a paper or a report, which serves as the single source of information on the underlying research project. As the calculation workflow in these models is not made explicit, readers are not able to fully understand how the research results are calculated, or to trace them back to the underlying spreadsheets. This paper proposes a methodology for semi-automatically deriving the calcu...

  19. A Referential Methodology for Education on Sustainable Tourism Development

    Directory of Open Access Journals (Sweden)

    Burcin Hatipoglu

    2014-08-01

    Full Text Available Sustainable tourism has the potential of contributing to local development while protecting the natural environment and preserving cultural heritage. Implementation of this form of tourism requires human resources that can assume effective leadership in sustainable development. The purpose of the international student program, described in this paper, was to develop and implement an educational methodology to fulfill this need. The study, which was developed and applied by two universities, took place in August 2013, in the study setting of Kastamonu, Turkey. The effectiveness of the program was measured by pre- and post-surveys using the Global Citizenship Scale developed by Morais and Ogden. The findings document a change in intercultural communication, global knowledge and political voice dimensions of the scale.

  20. A New Methodology of Design and Development of Serious Games

    Directory of Open Access Journals (Sweden)

    André F. S. Barbosa

    2014-01-01

Full Text Available The development of a serious game requires perfect knowledge of the learning domain to obtain the desired results. But it is also true that this may not be enough to develop a successful serious game. First of all, the player has to feel that he is playing a game in which learning is only a consequence of the playing actions. Otherwise, the game is viewed as boring rather than as a fun and engaging activity. For example, the player can catch some items in the scenario and then separate them according to type (i.e., recycle them). Thus, the main action for the player is catching the items in the scenario, while recycling is a secondary action, viewed as a consequence of the first. Sometimes the game design relies on a detailed approach based on the ideas of the developers, because some educational content is difficult to integrate into a game while keeping the fun factor in first place. In this paper we propose a new methodology for the design and development of serious games that facilitates the integration of educational content into games. Furthermore, we present a serious game, called "Clean World", created using this new methodology.

  1. An ABET assessment model using Six Sigma methodology

    Science.gov (United States)

    Lalovic, Mira

Technical fields are changing so rapidly that even the core of an engineering education must be constantly reevaluated. Today's graduates give more dedication and, almost certainly, more importance to continued learning than to mastery of specific technical concepts. Continued learning shapes a high-quality education, which is what an engineering college must offer its students. The question is how to guarantee the quality of education. In addition, the Accreditation Board for Engineering and Technology is asking that universities commit to continuous and comprehensive education, assuring the quality of the educational process. The research is focused on developing a generic assessment model for a college of engineering as an annual cycle that consists of a systematic assessment of every course in the program, followed by an assessment of the program and of the college as a whole using Six Sigma methodology. This unique approach to assessment in education will provide a college of engineering with valuable information regarding many important curriculum decisions in every accreditation cycle. The Industrial and Manufacturing Engineering (IME) Program in the College of Engineering at the University of Cincinnati will be used as a case example for a preliminary test of the generic model.

  2. Rate of force development: physiological and methodological considerations.

    Science.gov (United States)

    Maffiuletti, Nicola A; Aagaard, Per; Blazevich, Anthony J; Folland, Jonathan; Tillin, Neale; Duchateau, Jacques

    2016-06-01

    The evaluation of rate of force development during rapid contractions has recently become quite popular for characterising explosive strength of athletes, elderly individuals and patients. The main aims of this narrative review are to describe the neuromuscular determinants of rate of force development and to discuss various methodological considerations inherent to its evaluation for research and clinical purposes. Rate of force development (1) seems to be mainly determined by the capacity to produce maximal voluntary activation in the early phase of an explosive contraction (first 50-75 ms), particularly as a result of increased motor unit discharge rate; (2) can be improved by both explosive-type and heavy-resistance strength training in different subject populations, mainly through an improvement in rapid muscle activation; (3) is quite difficult to evaluate in a valid and reliable way. Therefore, we provide evidence-based practical recommendations for rational quantification of rate of force development in both laboratory and clinical settings.

  3. Development of six sigma concurrent parameter and tolerance design method based on response surface methodology

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

Using Response Surface Methodology (RSM), an optimization model for concurrent parameter and tolerance design is proposed in which the response mean equals its target (the target-is-best case). The objective function of the model is the sum of quality loss and tolerance cost, subject to a variance confidence region within which six sigma capability can be assured. An example is presented to compare the developed model with parameter design under minimum variance. The results show ...
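
The trade-off at the heart of such a model can be shown in miniature: with the mean on target, quality loss depends only on variance, while tightening the tolerance raises cost. The sketch below minimizes an assumed total cost k*sigma^2 + c/t, with sigma tied to the tolerance by a six sigma condition (sigma = t/6); the constants and the cost form are invented stand-ins for the paper's RSM-based formulation.

```python
from scipy.optimize import minimize_scalar

K_LOSS, C_TOL = 50.0, 8.0   # invented quality-loss and tolerance-cost constants

def total_cost(t):
    sigma = t / 6.0                        # six sigma: half-tolerance = 6 sigma
    return K_LOSS * sigma**2 + C_TOL / t   # quality loss + tolerance cost

res = minimize_scalar(total_cost, bounds=(0.01, 5.0), method="bounded")
print(f"optimal tolerance: {res.x:.3f}, minimum total cost: {res.fun:.3f}")
```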

  4. Methodological issues in clinical drug development for essential tremor.

    Science.gov (United States)

    Carranza, Michael A; Snyder, Madeline R; Elble, Rodger J; Boutzoukas, Angelique E; Zesiewicz, Theresa A

    2012-01-01

    Essential tremor (ET) is one of the most common tremor disorders in the world. Despite this, only two medications have received Level A recommendations from the American Academy of Neurology to treat it (primidone and propranolol). Even though these medications provide relief to a large group of ET patients, up to 50% of patients are non-responders. Additional medications to treat ET are needed. This review discusses some of the methodological issues that should be addressed for quality clinical drug development in ET.

  5. Methodological questions for the post-2015 development agenda.

    Directory of Open Access Journals (Sweden)

    Jacopo Bonan

    2014-07-01

Full Text Available In 2015, the Millennium Development Goals are due to end. Academics, practitioners and the general public are eager to see which development agenda will take their place, and a variety of different organizations are currently elaborating proposals for the next "round" of goals and targets. Instead of investigating possible topics of the upcoming agenda, we focus on methodological questions that, in our view, will play a major role in the definition and implementation of future development goals. We elaborate some key questions that should be addressed in the realms of poverty and inequality measurement, the definition of targets, the ability to account for complexity, and evidence-based policy making.

  6. Developing a science of land change: Challenges and methodological issues

    Science.gov (United States)

    Rindfuss, Ronald R.; Walsh, Stephen J.; Turner, B. L.; Fox, Jefferson; Mishra, Vinod

    2004-01-01

    Land-change science has emerged as a foundational element of global environment change and sustainability science. It seeks to understand the human and environment dynamics that give rise to changed land uses and covers, not only in terms of their type and magnitude but their location as well. This focus requires the integration of social, natural, and geographical information sciences. Each of these broad research communities has developed different ways to enter the land-change problem, each with different means of treating the locational specificity of the critical variables, such as linking the land manager to the parcel being managed. The resulting integration encounters various data, methodological, and analytical problems, especially those concerning aggregation and inference, land-use pixel links, data and measurement, and remote sensing analysis. Here, these integration problems, which hinder comprehensive understanding and theory development, are addressed. Their recognition and resolution are required for the sustained development of land-change science. PMID:15383671

  7. Methodology of citrate-based biomaterial development and application

    Science.gov (United States)

    Tran, M. Richard

Biomaterials play central roles in modern strategies of regenerative medicine and tissue engineering. Attempts to find tissue-engineered solutions to cure various injuries or diseases have led to an enormous increase in the number of polymeric biomaterials over the past decade. The breadth of new materials arises from the multiplicity of anatomical locations, cell types, and modes of application, which all place application-specific requirements on the biomaterial. Unfortunately, many of the currently available biodegradable polymers are limited in their versatility to meet the wide range of requirements for tissue engineering. Therefore, a methodology of biomaterial development able to address a broad spectrum of requirements would be beneficial to the biomaterial field. This work presents a methodology of citrate-based biomaterial design and application to meet the multifaceted needs of tissue engineering. We hypothesize that (1) citric acid, a non-toxic metabolic product of the body (Krebs Cycle), can be exploited as a universal multifunctional monomer and reacted with various diols to produce a new class of soft biodegradable elastomers with the flexibility to tune the properties of the resulting material to meet a wide range of requirements; (2) the newly developed citrate-based polymers can be used as platform biomaterials for the design of novel tissue engineering scaffolding; and (3) microengineering approaches in the form of thin scaffold sheets, microchannels, and a new porogen design can be used to generate complex cell-cell and cell-microenvironment interactions to mimic tissue complexity and architecture. To test these hypotheses, we first developed a methodology of citrate-based biomaterial development through the synthesis and characterization of a family of in situ crosslinkable and urethane-doped elastomers, which are synthesized using simple, cost-effective strategies and offer a variety of methods to tailor the material properties to

  8. Reliability modelling of repairable systems using Petri nets and fuzzy Lambda-Tau methodology

    Energy Technology Data Exchange (ETDEWEB)

    Knezevic, J.; Odoom, E.R

    2001-07-01

A methodology is developed which uses Petri nets instead of the fault tree methodology and solves for reliability indices using the fuzzy Lambda-Tau method. Fuzzy set theory is used for representing the failure rate and repair time instead of classical (crisp) set theory, because fuzzy numbers allow expert opinions, linguistic variables, operating conditions, uncertainty and imprecision in reliability information to be incorporated into the system model. Petri nets are used because, unlike the fault tree methodology, they allow efficient simultaneous generation of minimal cut and path sets.
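
To give the flavor of the Lambda-Tau part, the sketch below evaluates the commonly quoted two-input gate expressions (AND: lambda = lambda1*lambda2*(tau1+tau2), tau = tau1*tau2/(tau1+tau2); OR: lambda = lambda1+lambda2, tau = (lambda1*tau1+lambda2*tau2)/(lambda1+lambda2)) on triangular fuzzy numbers. Elementwise vertex arithmetic on the (low, mode, high) triples is used as a simplification of full alpha-cut interval arithmetic, and all rates and repair times are invented.

```python
import numpy as np

# Triangular fuzzy number as (low, mode, high); elementwise "vertex"
# arithmetic below is a simplification of alpha-cut interval arithmetic.
def tfn(low, mode, high):
    return np.array([low, mode, high])

def and_gate(lam1, tau1, lam2, tau2):
    # both components must fail (parallel structure)
    return lam1 * lam2 * (tau1 + tau2), (tau1 * tau2) / (tau1 + tau2)

def or_gate(lam1, tau1, lam2, tau2):
    # either failure brings the system down (series structure)
    return lam1 + lam2, (lam1 * tau1 + lam2 * tau2) / (lam1 + lam2)

# Invented fuzzified failure rates (per hour) and repair times (hours)
lam1, tau1 = tfn(1e-4, 2e-4, 3e-4), tfn(2.0, 4.0, 6.0)
lam2, tau2 = tfn(2e-4, 3e-4, 4e-4), tfn(1.0, 2.0, 3.0)

lam_sys, tau_sys = or_gate(lam1, tau1, lam2, tau2)
print("system failure rate (low, mode, high):", lam_sys)
print("system repair time  (low, mode, high):", tau_sys)
```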

  9. Joint intelligence operations centers (JIOC) business process model & capabilities evaluation methodology

    OpenAIRE

    Schacher, Gordon; Irvine, Nelson; Hoyt, Roger

    2012-01-01

    A JIOC Business Process Model has been developed for use in evaluating JIOC capabilities. The model is described and depicted through OV5 and organization swim-lane diagrams. Individual intelligence activities diagrams are included. A JIOC evaluation methodology is described.

  10. Towards a Cognitive Handoff for the Future Internet: Model-driven Methodology and Taxonomy of Scenarios

    CERN Document Server

    Gonzalez-Horta, Francisco A; Ramirez-Cortes, Juan M; Martinez-Carballido, Jorge; Buenfil-Alpuche, Eldamira

    2011-01-01

    A cognitive handoff is a multipurpose handoff that achieves many desirable features simultaneously; e.g., seamlessness, autonomy, security, correctness, adaptability, etc. But, the development of cognitive handoffs is a challenging task that has not been properly addressed in the literature. In this paper, we discuss the difficulties of developing cognitive handoffs and propose a new model-driven methodology for their systematic development. The theoretical framework of this methodology is the holistic approach, the functional decomposition method, the model-based design paradigm, and the theory of design as scientific problem-solving. We applied the proposed methodology and obtained the following results: (i) a correspondence between handoff purposes and quantitative environment information, (ii) a novel taxonomy of handoff mobility scenarios, and (iii) an original state-based model representing the functional behavior of the handoff process.

  11. CHARACTERISTICS OF RESEARCH METHODOLOGY DEVELOPMENT IN SPECIAL EDUCATION AND REHABILITATION

    Directory of Open Access Journals (Sweden)

    Natasha ANGELOSKA-GALEVSKA

    2004-12-01

Full Text Available The aim of the text is to point out developmental tendencies in the research methodology of special education and rehabilitation, worldwide and in our country, and to emphasize the importance of the methodological training of students in special education and rehabilitation at the Faculty of Philosophy in Skopje. Scientific knowledge achieved through research is the fundamental precondition for the development of special education and rehabilitation theory and practice. The results of scientific work sometimes cause small, insignificant changes, but at times they make radical changes. Thanks to scientific research and knowledge, certain prejudices were rejected. For example, in the sixth decade of the last century there was a strong prejudice that mentally retarded children should be segregated from society as aggressive and unfriendly, or that deaf children should not learn sign language because they would not be motivated to learn lip-reading and would adapt with difficulty. Piaget and his colleagues from the Geneva institute were pioneers in researching this field, and they advanced the belief that handicapped children are not handicapped in every domain and have potentials that can be developed and improved by systematic and organized work. It is important to initiate further research in the field of special education and rehabilitation, as well as critical analysis of completed research. Further development of scientific research in special education and rehabilitation should be a basis for education policy on people with disabilities and for the development of institutional and non-institutional treatment of this population.

  12. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dorp, F. van [NAGRA (Switzerland)] [and others

    1996-09-01

The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all of its steps. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The methodology has three starting points. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and the types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development, and includes FEPs to do with the assessment context. The context examined in

  13. Qualitative response models: A survey of methodology and illustrative applications

    Directory of Open Access Journals (Sweden)

    Nojković Aleksandra

    2007-01-01

Full Text Available This paper introduces econometric modeling with discrete (categorical) dependent variables. Such models, commonly referred to as qualitative response (QR) models, have become a standard tool of microeconometric analysis. Microeconometric research represents empirical analysis of microdata, i.e. economic information about individuals, households and firms. Microeconometrics has been most widely adopted in fields such as labour economics, consumer behavior, and the economics of transport. The latest research shows that this methodology can also be successfully transferred to a macroeconomic context and applied to time series and panel data analysis in a wider scope.
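
The workhorse QR specification is the binary logit. The sketch below estimates one with statsmodels on synthetic microdata (an invented employment choice driven by age and education), purely to show the mechanics the survey discusses.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic microdata: binary outcome from a latent-variable logit model.
rng = np.random.default_rng(0)
n = 500
age = rng.uniform(20, 60, n)
educ = rng.integers(8, 18, n)
latent = -6.0 + 0.05 * age + 0.35 * educ + rng.logistic(size=n)
y = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([age, educ]))  # constant, age, education
model = sm.Logit(y, X).fit(disp=False)
print(model.summary())
```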

  14. Rapid development of xylanase assay conditions using Taguchi methodology.

    Science.gov (United States)

    Prasad Uday, Uma Shankar; Bandyopadhyay, Tarun Kanti; Bhunia, Biswanath

    2016-11-01

The present investigation is mainly concerned with the rapid development of extracellular xylanase assay conditions using Taguchi methodology. The extracellular xylanase was produced from Aspergillus niger (KP874102.1), a new strain isolated from a soil sample of the Baramura forest, Tripura West, India. Four physical parameters, including temperature, pH, buffer concentration and incubation time, were considered as key factors for xylanase activity and were optimized using Taguchi robust design methodology for enhanced xylanase activity. The main effects, interaction effects and optimal levels of the process factors were determined using the signal-to-noise (S/N) ratio. The Taguchi method recommends the use of the S/N ratio to measure quality characteristics. Based on analysis of the S/N ratio, optimal levels of the process factors were determined. Analysis of variance (ANOVA) was performed to evaluate statistically significant process factors. ANOVA results showed that temperature contributed the maximum impact (62.58%) on xylanase activity, followed by pH (22.69%), buffer concentration (9.55%) and incubation time (5.16%). Predicted results showed that enhanced xylanase activity (81.47%) can be achieved with pH 2, temperature 50°C, buffer concentration 50 mM and incubation time 10 min.
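
The core Taguchi computation is short: for a larger-the-better response such as enzyme activity, S/N = -10 log10(mean(1/y^2)), and the factor level with the higher S/N is preferred. The sketch below applies this standard formula to invented replicate activities at two temperature levels.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi S/N ratio for a larger-the-better response (dB)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Invented replicate xylanase activities at two temperature levels
print(f"50 C: {sn_larger_is_better([78.2, 81.0, 79.5]):.2f} dB")
print(f"60 C: {sn_larger_is_better([61.4, 64.8, 60.9]):.2f} dB")
# The level with the higher S/N ratio is chosen as optimal for the factor.
```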

  15. Evaluating Supply Chain Management: A Methodology Based on a Theoretical Model

    Directory of Open Access Journals (Sweden)

    Alexandre Tadeu Simon

    2015-01-01

    Full Text Available Despite the increasing interest in supply chain management (SCM by researchers and practitioners, there is still a lack of academic literature concerning topics such as methodologies to guide and support SCM evaluation. Most developed methodologies have been provided by consulting companies and are restricted in their publication and use. This article presents a methodology for evaluating companies’ degree of adherence to a SCM conceptual model. The methodology is based on Cooper, Lambert and Pagh’s original contribution and involves analysis of eleven referential axes established from key business processes, horizontal structures, and initiatives & practices. We analyze the applicability of the proposed model based on findings from interviews with experts - academics and practitioners - as well as from case studies of three focal firms and their supply chains. In general terms, the methodology can be considered a diagnostic instrument that allows companies to evaluate their maturity regarding SCM practices. From this diagnosis, firms can identify and implement activities to improve degree of adherence to the reference model and achieve SCM benefits. The methodology aims to contribute to SCM theory development. It is an initial, but structured, reference for translating a theoretical approach into practical aspects.

  16. Urban Agglomerations in Regional Development: Theoretical, Methodological and Applied Aspects

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Shmidt

    2016-09-01

Full Text Available The article focuses on the analysis of a major process of modern socio-economic development: the functioning of urban agglomerations. A short background of the economic literature on this phenomenon is given, covering both traditional conceptions (the concentration of urban types of activities, the grouping of urban settlements through intensive production and labour communications) and modern ones (cluster theories, theories of the network society). Two methodological principles of studying agglomeration are emphasized: the principle of the unity of the spatial concentration of economic activity, and the principle of compact living of the population. The positive and negative effects of agglomeration in the economic and social spheres are studied. It is concluded that agglomeration is helpful when it yields agglomerative economies, that is, when its positive benefits exceed the additional costs. A methodology for examining an urban agglomeration and its role in regional development is offered. The approbation of this methodology on the example of Chelyabinsk and the Chelyabinsk region has allowed the authors to carry out a comparative analysis of the regional centre and the whole region by the main socio-economic indexes under static and dynamic conditions, and to draw conclusions on the position of the city and the region based on such socio-economic indexes as the average monthly nominal accrued wage, the cost of fixed assets, investments in fixed capital, new housing supply, retail turnover, and the volume of self-produced shipped goods, works and services performed in the region. The study also analyses a launching site of the Chelyabinsk agglomeration, revealing the main characteristics of the agglomeration core in Chelyabinsk (structural features, population, level of centralization of the core) as well as of the Chelyabinsk agglomeration in general (coefficient of agglomeration

  17. Supply Chain Modeling: Downstream Risk Assessment Methodology (DRAM)

    Science.gov (United States)

    2013-12-05

Supply Chain Modeling: Downstream Risk Assessment Methodology (DRAM). Dr. Sean Barnett, December 5, 2013, Institute for Defense Analyses, Alexandria, Virginia. DMSMS Conference 2013. These slides are unclassified and not proprietary.

  18. A Methodology to Develop Ontologies for Emerging Domains

    Directory of Open Access Journals (Sweden)

    Chai Meenorngwar

    2013-05-01

Full Text Available The characteristic of complex, dynamic domains, such as emerging domains, is that the information necessary to describe them is not fully established. Standards are not yet settled for these domains, and hence they are difficult to describe and present; methods are needed that will reflect the changes that occur as the domains develop and mature. This research proposes the Liverpool Metadata (LiMe) methodology to develop an ontology and organise the knowledge that is necessary for developing domain environment descriptions. Its aim is to capture Knowledge Information (KI) from research articles and translate this into semantic information with web description languages such as XML(S), RDF(S), and OWL. LiMe provides an Ontological Framework in which concept characteristics are represented as a concept framework that specifies conceptualisations of the knowledge. LiMe supports Semantic Web development. "e-Learning" has been chosen as an example of an emerging domain in this research. The characteristics of e-Learning concepts are extracted from research articles on journal websites such as ScienceDirect and Springer, and are represented as knowledge. LiMe also explicitly represents how these concepts develop and evolve to describe the domain.
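
A minimal sketch of what such semantic encoding can look like in practice is given below, using the Python rdflib library to declare a toy e-Learning concept hierarchy in RDFS; the namespace and class names are invented, and nothing here reproduces LiMe's actual ontology.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.org/elearning#")  # hypothetical namespace
g = Graph()
g.bind("ex", EX)

# A toy concept hierarchy for the e-Learning domain
g.add((EX.LearningObject, RDF.type, RDFS.Class))
g.add((EX.LearningObject, RDFS.label, Literal("Learning Object")))
g.add((EX.Quiz, RDF.type, RDFS.Class))
g.add((EX.Quiz, RDFS.subClassOf, EX.LearningObject))

print(g.serialize(format="turtle"))
```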

  19. Development of methodology for horizontal axis wind turbine dynamic analysis

    Science.gov (United States)

    Dugundji, J.

    1982-01-01

    Horizontal axis wind turbine dynamics were studied. The following findings are summarized: (1) review of the MOSTAS computer programs for dynamic analysis of horizontal axis wind turbines; (2) review of various analysis methods for rotating systems with periodic coefficients; (3) review of structural dynamics analysis tools for large wind turbine; (4) experiments for yaw characteristics of a rotating rotor; (5) development of a finite element model for rotors; (6) development of simple models for aeroelastics; and (7) development of simple models for stability and response of wind turbines on flexible towers.

  20. Environmental sustainability modeling with exergy methodology for building life cycle

    Institute of Scientific and Technical Information of China (English)

    刘猛; 姚润明

    2009-01-01

As an important human activity, the building industry has created comfortable space for living and work, and at the same time has brought considerable pollution and huge consumption of energy and resources. Since the 1990s, when the first building environmental assessment model, BREEAM, was released in the UK, a number of assessment models have been formulated, analytical and practical in methodology respectively. This paper introduces a generic model of exergy assessment of the environmental impact of the building life cycle, taking previous models into consideration and focusing on the natural environment as well as the building life cycle. Three environmental impacts are analyzed, namely embodied energy exergy, resource chemical exergy and abatement exergy, corresponding to energy consumption, resource consumption and pollutant discharge respectively. The model of exergy assessment of the environmental impact of the building life cycle thus formulated contains two sub-models, one from the aspect of building energy utilization and the other from building materials use. Combining theories by ecologists such as Odum, building environmental sustainability modeling with exergy methodology is put forward, with the index of an exergy footprint of building environmental impacts.

  1. An integrated measurement and modeling methodology for estuarine water quality management

    Institute of Scientific and Technical Information of China (English)

    Michael Hartnett; Stephen Nash

    2015-01-01

This paper describes research undertaken by the authors to develop an integrated measurement and modeling methodology for water quality management of estuaries. The approach developed utilizes modeling and measurement results in a synergistic manner. Modeling results were initially used to inform the field campaign of appropriate sampling locations and times, and field data were used to develop accurate models. Remote sensing techniques were used to capture data for both model development and model validation. Field surveys were undertaken to provide model initial conditions through data assimilation and determine nutrient fluxes into the model domain. From field data, salinity relationships were developed with various water quality parameters, and relationships between chlorophyll a concentrations, transparency, and light attenuation were also developed. These relationships proved to be invaluable in model development, particularly in modeling the growth and decay of chlorophyll a. Cork Harbour, an estuary that regularly experiences summer algal blooms due to anthropogenic sources of nutrients, was used as a case study to develop the methodology. The integration of remote sensing, conventional fieldwork, and modeling is one of the novel aspects of this research and the approach developed has widespread applicability.
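
As a small example of the kind of empirical relationship described, the sketch below recovers a linear light-attenuation model Kd = Kw + k*Chl from synthetic samples by least squares; the data and coefficients are invented and do not reproduce the Cork Harbour relationships.

```python
import numpy as np

# Synthetic field samples: chlorophyll a (mg/m^3) and attenuation Kd (1/m),
# generated from an assumed linear relationship plus noise.
rng = np.random.default_rng(1)
chl = rng.uniform(1.0, 30.0, 40)
kd = 0.2 + 0.016 * chl + rng.normal(0.0, 0.02, 40)

slope, intercept = np.polyfit(chl, kd, 1)   # least-squares line
print(f"fitted: Kd = {intercept:.3f} + {slope:.4f} * Chl  (1/m)")
```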

  2. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    Science.gov (United States)

    Lumpkins, S. B.; Garcia, K. M.; Sargsyan, A. E.; Hamilton, D. R.; Berggren, M. D.; Antonsen, E.; Ebert, D.

    2011-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in several astronauts. This phenomenon is known to affect some terrestrial patient populations, and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semi-quantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology for posterior globe flattening.

  3. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    Science.gov (United States)

    Lumpkins, Sarah B.; Garcia, Kathleen M.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Berggren, Michael D.; Ebert, Douglas

    2012-01-01

Microgravity exposure affects visual acuity in a subset of astronauts and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in the eyes of several astronauts. This phenomenon is known to affect some terrestrial patient populations and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT) or B-mode Ultrasound (US), without consistent objective criteria. NASA uses a semiquantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology to monitor small changes in posterior globe flattening.

  4. Didaktisch-methodisches Modell, Methode und methodisches Instrumentarium im Fremdsprachenunterricht (Pedagogical-Methodological Model, Method and Methodological Arsenal in Foreign Language Teaching)

    Science.gov (United States)

    Guenther, Klaus

    1975-01-01

    Concentrates on (1) an exposition of the categories "pedagogical-methodological model", "method", and "methodological arsenal" from the viewpoint of FL teaching; (2) clearing up the relation between the pedagogical-methodological model and teaching method; (3) explaining an example of the application of the categories mentioned. (Text is in…

  5. Development and evaluation of clicker methodology for introductory physics courses

    Science.gov (United States)

    Lee, Albert H.

Many educators understand that lectures are cost effective but not learning efficient, so they continue to search for ways to increase active student participation in this traditionally passive learning environment. In-class polling systems, or "clickers", are inexpensive and reliable tools allowing students to actively participate in lectures by answering multiple-choice questions. Students assess their learning in real time by observing instant polling summaries displayed in front of them. This in turn motivates additional discussions which increase the opportunity for active learning. We wanted to develop a comprehensive clicker methodology that creates an active lecture environment for a broad spectrum of students taking introductory physics courses. We wanted our methodology to incorporate many findings of contemporary learning science. It is recognized that learning requires active construction; students need to be actively involved in their own learning process. Learning also depends on preexisting knowledge; students construct new knowledge and understandings based on what they already know and believe. Learning is context dependent; students who have learned to apply a concept in one context may not be able to recognize and apply the same concept in a different context, even when both contexts are considered to be isomorphic by experts. On this basis, we developed question sequences, each involving the same concept but having different contexts. Answer choices are designed to address students' preexisting knowledge. These sequences are used with the clickers to promote active discussions and multiple assessments. We have created, validated, and evaluated sequences sufficient in number to populate all introductory physics courses. Our research has found that using clickers with our question sequences significantly improved student conceptual understanding. Our research has also found how to best measure student conceptual gain using research-based instruments.
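
One standard research-based measure of conceptual gain on pre/post instruments is Hake's normalized gain. The short sketch below computes it; the class averages are invented, and the abstract does not state that this particular metric was the one used.

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain <g> = (post - pre) / (100 - pre)."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Invented class averages on a concept inventory (percent correct)
print(f"<g> = {normalized_gain(42.0, 68.0):.2f}")
```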

  6. Regional studies program. Forecasting the local economic impacts of energy resource development: a methodological approach

    Energy Technology Data Exchange (ETDEWEB)

    Stenehjem, E.J.

    1975-12-01

    Emphasis is placed on the nature and magnitude of socio-economic impacts of fossil-fuel development. A model is described that identifies and estimates the magnitude of the economic impacts of anticipated energy resource development in site-specific areas and geographically contiguous areas of unspecified size. The modeling methodology was designed to assist industries and government agencies complying with recent federal and state legislation requiring subregional impact analyses for individual facilities. The model was designed in light of the requirements for accuracy, expandability, and exportability. The methodology forecasts absolute increments in local and regional growth on an annual or biennial basis and transforms these parameters into estimates of the affected area's ability to accommodate growth-induced demands, especially demands for public services. (HLW)

  7. Agent-Oriented Methodology and Modeling Tools%面向主体的开发方法和可视化建模工具

    Institute of Scientific and Technical Information of China (English)

    季强

    2002-01-01

    This paper introduces an agent-oriented methodology and modeling tools based on MAGE. The methodology supports analysis, design and implementation of multi-agent systems. The modeling tools assist the developer in building multi-agent systems using the methodology through a set of visual model editors.

  8. Natural gas production problems : solutions, methodologies, and modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, Christopher Arthur; Herrin, James M.; Cooper, Scott Patrick; Basinski, Paul M. (El Paso Production Company, Houston, TX); Olsson, William Arthur; Arnold, Bill Walter; Broadhead, Ronald F. (New Mexico Bureau of Geology and Mineral Resources, Socorro, NM); Knight, Connie D. (Consulting Geologist, Golden, CO); Keefe, Russell G.; McKinney, Curt (Devon Energy Corporation, Oklahoma City, OK); Holm, Gus (Vermejo Park Ranch, Raton, NM); Holland, John F.; Larson, Rich (Vermejo Park Ranch, Raton, NM); Engler, Thomas W. (New Mexico Institute of Mining and Technology, Socorro, NM); Lorenz, John Clay

    2004-10-01

    Natural gas is a clean fuel that will be the most important domestic energy resource for the first half of the 21st century. Ensuring a stable supply is essential for our national energy security. The research we have undertaken will maximize the extractable volume of gas while minimizing the environmental impact of surface disturbances associated with drilling and production. This report describes a methodology for comprehensive evaluation and modeling of the total gas system within a basin, focusing on problematic horizontal fluid flow variability. This has been accomplished through extensive use of geophysical, core (rock sample) and outcrop data to interpret and predict directional flow and production trends. Side benefits include reduced environmental impact of drilling due to the reduced number of wells required for resource extraction. These results have been accomplished through a cooperative and integrated systems approach involving industry, government, academia and a multi-organizational team within Sandia National Laboratories. Industry has provided essential in-kind support to this project in the forms of extensive core data, production data, maps, seismic data, production analyses, engineering studies, plus equipment and staff for obtaining geophysical data. This approach provides innovative ideas and technologies to bring new resources to market and to reduce the overall environmental impact of drilling. More importantly, the products of this research are not location-specific but can be extended to other areas of gas production throughout the Rocky Mountain area. Thus this project is designed to solve problems associated with natural gas production at developing sites, or at old sites under redevelopment.

  9. A PLM components monitoring framework for SMEs based on a PLM maturity model and FAHP methodology

    OpenAIRE

    Zhang, Haiqing; Sekhari, Aicha; Ouzrout, Yacine; Bouras, Abdelaziz

    2014-01-01

    The right selection of, and investment in, PLM components increases business advantages. This paper develops a PLM components monitoring framework to assess and guide PLM implementation in small and medium enterprises (SMEs). The framework builds upon PLM maturity models and decision-making methodology. A PLM maturity model has the capability to analyze PLM functionalities and evaluate PLM components. A proposed PLM components maturity assessment (PCMA) model can obtain general maturity levels of PLM compo...

  10. Integration of process design and controller design for chemical processes using model-based methodology

    DEFF Research Database (Denmark)

    Abd.Hamid, Mohd-Kamaruddin; Sin, Gürkan; Gani, Rafiqul

    2010-01-01

    In this paper, a novel systematic model-based methodology for performing integrated process design and controller design (IPDC) for chemical processes is presented. The methodology uses a decomposition method to solve the IPDC problem, typically formulated as a mathematical programming (optimization with constraints) problem. Accordingly, the optimization problem is decomposed into four sub-problems: (i) pre-analysis, (ii) design analysis, (iii) controller design analysis, and (iv) final selection and verification, which are relatively easier to solve. The methodology makes use of thermodynamic-process insights and the reverse design approach to arrive at the final process design–controller design decisions. The developed methodology is illustrated through the design of: (a) a single reactor, (b) a single separator, and (c) a reactor–separator–recycle system, and is shown to provide effective solutions…
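
    For illustration, an IPDC problem of this kind is commonly posed as a single constrained optimization; the following is a generic sketch with assumed symbols, not the authors' exact formulation:

```latex
% Integrated process and controller design as one constrained optimization
% (generic sketch; symbols assumed): decision variables are the process
% design d and the controller parameters c, with process states x.
\[
\max_{d,\,c} \; J(x,d,c)
\quad \text{s.t.} \quad
h(x,d,c) = 0 \ \text{(process model)}, \qquad
g(x,d,c) \le 0 \ \text{(design/control constraints)}.
\]
```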

  11. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but this data is not provided by this document.

  12. Logic flowgraph methodology - A tool for modeling embedded systems

    Science.gov (United States)

    Muthukumar, C. T.; Guarro, S. B.; Apostolakis, G. E.

    1991-01-01

    The logic flowgraph methodology (LFM), a method for modeling hardware in terms of its process parameters, has been extended to form an analytical tool for the analysis of integrated (hardware/software) embedded systems. In the software part of a given embedded system model, timing and the control flow among different software components are modeled by augmenting LFM with modified Petri net structures. The objective of the use of such an augmented LFM model is to uncover possible errors and the potential for unanticipated software/hardware interactions. This is done by backtracking through the augmented LFM model according to established procedures which allow the semiautomated construction of fault trees for any chosen state of the embedded system (top event). These fault trees, in turn, produce the possible combinations of lower-level states (events) that may lead to the top event.

  13. Formulación de una Metodología de Formación y Evaluación en Empresarismo, bajo un Modelo de Competencias (Development of an entrepreneurial training and evaluation methodology under the competency - based model

    Directory of Open Access Journals (Sweden)

    Paola Podestá

    2012-12-01

    Full Text Available This article derives from a research project motivated by the interest in having an entrepreneurship training and evaluation model for EAFIT University (Colombia). A competency-based model was chosen, given the current trend in pedagogy toward this perspective, with the concept of competency understood as “doing in context”. Nowadays, entrepreneurship is, along with promotion and guidance processes, one of the development strategies of EAFIT University, and the training process is one of the pillars upon which the program is built. The outcome of this research is a competency-based training and evaluation model for entrepreneurship, a model that serves not only within the institution, but also as a repeatable methodology for consulting projects on the subject.

  14. Setting road safety targets in Cambodia : a methodological demonstration using the latent risk time series model.

    NARCIS (Netherlands)

    Commandeur, J.J.F., Wesemann, P., Bijleveld, F.D., Chhoun, V. & Sann, S.

    2017-01-01

    The authors present the methodology used for estimating forecasts for the number of road traffic fatalities in 2011—2020 in Cambodia based on observed developments in Cambodian road traffic fatalities and motor vehicle ownership in the years 1995—2009. Using the latent risk time series model

  15. Evaluation of probable maximum snow accumulation: Development of a methodology for climate change studies

    Science.gov (United States)

    Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick

    2016-06-01

    Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so that the ensuing spring PMF is a reasonable estimate. This is of particular importance in times of climate change (CC), since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology, precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio used to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (the so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool to estimate the relative change of the PMSA. Absolute results are of the same order of magnitude as those obtained with the traditional method and observed data, but are also found to depend strongly on the climate projection used and to show spatial variability.
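
    For illustration, a generic form of the moisture-maximization ratio used in probable-maximum studies is sketched below; the symbols are assumed and are not taken from the paper:

```latex
% Moisture maximization of a historical snowstorm (generic sketch): the
% observed snowfall S_obs is scaled by the ratio of the maximum precipitable
% water to the precipitable water of the storm itself.
\[
S_{\max} = S_{\mathrm{obs}} \cdot \frac{PW_{\max}}{PW_{\mathrm{storm}}}
\]
% The PMSA is then taken from the maximized events of the snow season.
```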

  16. Development of a methodology for assessing the safety of embedded software systems

    Science.gov (United States)

    Garrett, C. J.; Guarro, S. B.; Apostolakis, G. E.

    1993-01-01

    A Dynamic Flowgraph Methodology (DFM) based on an integrated approach to modeling and analyzing the behavior of software-driven embedded systems for assessing and verifying reliability and safety is discussed. DFM is based on an extension of the Logic Flowgraph Methodology to incorporate state transition models. System models which express the logic of the system in terms of causal relationships between physical variables and temporal characteristics of software modules are analyzed to determine how a certain state can be reached. This is done by developing timed fault trees which take the form of logical combinations of static trees relating the system parameters at different points in time. The resulting information concerning the hardware and software states can be used to eliminate unsafe execution paths and identify testing criteria for safety-critical software functions.
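
    As a hedged illustration of the timed-fault-tree idea (not the DFM tool itself), the sketch below evaluates a top event as a logical combination of basic events at discrete time steps; event names and tree structure are invented for the example:

```python
# Hedged sketch: a timed fault tree as logical combinations of basic events
# evaluated at discrete time steps. All events and the tree are illustrative.

def top_event(state):
    """state maps (event, time_step) -> bool.
    Top event: sensor fault at t=0 AND (software timeout OR stale input) at t=1."""
    return state[("sensor_fault", 0)] and (
        state[("software_timeout", 1)] or state[("stale_input", 1)]
    )

# One candidate combination of lower-level states leading to the top event:
print(top_event({
    ("sensor_fault", 0): True,
    ("software_timeout", 1): False,
    ("stale_input", 1): True,
}))  # True
```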

  17. Development of a Teaching Methodology for Undergraduate Human Development in Psychology

    Science.gov (United States)

    Rodriguez, Maria A.; Espinoza, José M.

    2015-01-01

    The development of a teaching methodology for the undergraduate Psychology course Human Development II in a private university in Lima, Peru is described. The theoretical framework consisted of an integration of Citizen Science and Service Learning, with the application of Information and Communications Technology (ICT), specifically Wikipedia and…

  18. The development of theoretical and methodological foundations of enterprises’ transition to innovative development

    Directory of Open Access Journals (Sweden)

    Yu.S. Shypulina

    2011-12-01

    Full Text Available An analysis is presented of trends in the development of theoretical and methodological approaches to managing the intensification of domestic enterprises' transition to innovative development. The results can serve as the basis for creating the prerequisites of an innovation-friendly environment for domestic enterprises.

  19. Development of the damage assessment methodology for ceiling elements

    Science.gov (United States)

    Nitta, Yoshihiro; Iwasaki, Atsumi; Nishitani, Akira; Wakatabe, Morimasa; Inai, Shinsuke; Ohdomari, Iwao; Tsutsumi, Hiroki

    2012-04-01

    This paper presents the basic concept of a damage assessment methodology for ceiling elements with the aid of smart sensor boards and an inspection robot. In the proposed system, the distributed smart sensor boards first detect the occurrence of damage. Next, the robot inspects the damage location and captures a photographic image of the damage condition. The smart sensor board for the proposed system mainly consists of a microcontroller, a strain gage and a LAN module. The inspection robot integrated into the proposed system has a wireless camera and a wireless LAN device for receiving the signals that manipulate it. First, the effectiveness of the smart sensor board and inspection robot is tested in experiments on a full-scale suspended ceiling using shaking-table facilities. The model ceiling is subjected to several levels of excitation, causing various levels of damage. Next, this robot inspection scheme is applied to the ceiling of a real structure damaged by the 2011 off the Pacific coast of Tohoku Earthquake. The obtained results indicate that the proposed system can detect the location and condition of the damage.

  20. Proposed Methodology for Generation of Building Information Model with Laserscanning

    Institute of Scientific and Technical Information of China (English)

    Shutao Li; Jörg Isele; Georg Bretthauer

    2008-01-01

    For refurbishment and state review of an existing old building, a new model reflecting the current state is often required, especially when the original plans are no longer accessible. Laser scanners are used more and more as surveying instruments for various applications because of their high-precision scanning abilities. For buildings, the most notable and widely accepted product data model is the IFC product data model. It is designed to cover the whole lifecycle, is supported by various software vendors, and enables applications to efficiently share and exchange project information. The models obtained with the laser scanner, normally sets of points ("point clouds"), have to be transferred to an IFC-compatible building information model to serve the needs of different planning states. This paper presents an approach designed by the German Research Center in Karlsruhe (Forschungszentrum Karlsruhe) to create an IFC-compatible building information model from laser range images. The methodology covering the entire process from data acquisition to the IFC-compatible product model is proposed in this paper. In addition, IFC models with different levels of detail (LoDs) are introduced and discussed within the work.

  1. Methodology Using MELCOR Code to Model Proposed Hazard Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Gavin Hawkley

    2010-07-01

    This study demonstrates a methodology for using the MELCOR code to model a proposed hazard scenario within a building containing radioactive powder, and the subsequent evaluation of the leak path factor (LPF), i.e., the fraction of respirable material that escapes the facility into the outside environment, implicit in the scenario. This LPF evaluation analyzes the basis and applicability of an assumed standard multiplication of 0.5 × 0.5 (in which 0.5 represents the fraction of material assumed to leave one area and enter another) for calculating an LPF value. The outside release depends upon the ventilation/filtration system, both filtered and unfiltered, and on other pathways from the building, such as doorways, both open and closed. This study shows how the multiple LPFs within the building interior can be evaluated in a combinatory process in which a total LPF is calculated, thus addressing the assumed multiplication and allowing for the designation and assessment of a respirable source term (ST) for later consequence analysis, in which the propagation of material released into the atmosphere can be modeled, the dose received by a receptor placed downwind can be estimated, and the distance adjusted to maintain such exposures as low as reasonably achievable (ALARA). This study also briefly addresses particle characteristics that affect atmospheric particle dispersion, and compares this dispersion with the LPF methodology.
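
    As a hedged illustration of the combinatory LPF logic discussed above (not the MELCOR calculation itself), the sketch below multiplies per-compartment LPFs for pathways in series and applies the result to a hypothetical respirable source term; all numbers are illustrative:

```python
# Hedged sketch: combining per-compartment leak path factors (LPFs) for
# pathways in series by multiplication, which is the logic behind the
# assumed 0.5 x 0.5 standard multiplication discussed in the text.

def total_lpf(series_lpfs):
    """Total LPF for a release passing through compartments in series."""
    total = 1.0
    for lpf in series_lpfs:
        total *= lpf
    return total

# The assumed standard multiplication: 0.5 of the material leaves each of
# two successive areas, giving a total LPF of 0.25.
print(total_lpf([0.5, 0.5]))  # 0.25

# A respirable source term with illustrative numbers only:
material_at_risk_g = 100.0        # hypothetical quantity of powder
airborne_release_fraction = 1e-3  # hypothetical ARF
source_term_g = material_at_risk_g * airborne_release_fraction * total_lpf([0.5, 0.5])
print(source_term_g)  # 0.025 g of respirable release
```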

  2. Methodological Guidelines for Reducing the Complexity of Data Warehouse Development for Transactional Blood Bank Systems.

    Science.gov (United States)

    Takecian, Pedro L; Oikawa, Marcio K; Braghetto, Kelly R; Rocha, Paulo; Lucena, Fred; Kavounis, Katherine; Schlumpf, Karen S; Acker, Susan; Carneiro-Proietti, Anna B F; Sabino, Ester C; Custer, Brian; Busch, Michael P; Ferreira, João E

    2013-06-01

    Over time, data warehouse (DW) systems have become more difficult to develop because of the growing heterogeneity of data sources. Despite advances in research and technology, DW projects are still too slow for pragmatic results to be generated. Here, we address the following question: how can the complexity of DW development for integration of heterogeneous transactional information systems be reduced? To answer this, we proposed methodological guidelines based on cycles of conceptual modeling and data analysis, to drive construction of a modular DW system. These guidelines were applied to the blood donation domain, successfully reducing the complexity of DW development.

  3. Development of an in-situ soil structure characterization methodology

    Science.gov (United States)

    Debos, Endre; Kriston, Sandor

    2015-04-01

    Agricultural cultivation has several direct and indirect effects on soil properties, among which soil structure degradation is the best known and most detectable one. Soil structure degradation leads to several water and nutrient management problems, which reduce the efficiency of agricultural production. There are several innovative technological approaches aiming to reduce these negative impacts on the soil structure. The testing, validation and optimization of these methods require an adequate technology to measure their impacts on the complex soil system. This study aims to develop an in-situ soil structure and root development testing methodology which can be used in field experiments and which allows one to follow the real-time changes in the soil structure (evolution/degradation) and to characterize them quantitatively. The method is adapted from remote sensing image processing technology. A specially adapted A4-size scanner is placed into the soil at a depth safe from agrotechnical treatments. Only the scanner's USB cable comes to the surface, allowing image acquisition without any soil disturbance. Several images of the same place can be taken throughout the vegetation season to follow the soil consolidation and structure development after the last tillage treatment for seedbed preparation. The scanned image of the soil profile is classified using supervised image classification, namely the maximum likelihood classification algorithm. The resulting image has two principal classes, soil matrix and pore space, and other complementary classes to cover the occurring thematic classes, like roots and stones. The calculated data are calibrated with field-sampled porosity data. As the scanner is buried under the soil with no changes in light conditions, the image processing can be automated for better temporal comparison. Besides the total porosity, each pore-size fraction and its distribution can be calculated for…
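
    As a hedged illustration of the supervised maximum likelihood classification step, the sketch below assigns pixels to per-class Gaussian models; the class names, band counts and statistics are assumptions, not details from the study:

```python
# Hedged sketch of maximum likelihood classification, assuming per-class
# Gaussian statistics estimated from training pixels ("pore" vs "matrix").
import numpy as np

def train_class_stats(samples):
    """samples: dict class -> (n_pixels, n_bands) training array."""
    return {c: (x.mean(axis=0), np.cov(x, rowvar=False)) for c, x in samples.items()}

def ml_classify(pixels, stats):
    """Assign each pixel (n_pixels, n_bands) to the class with maximum
    Gaussian log-likelihood."""
    classes = list(stats)
    scores = []
    for c in classes:
        mean, cov = stats[c]
        diff = pixels - mean
        inv = np.linalg.inv(cov)
        mah = np.einsum('ij,jk,ik->i', diff, inv, diff)  # squared Mahalanobis distance
        scores.append(-0.5 * (mah + np.log(np.linalg.det(cov))))
    return np.array(classes)[np.argmax(np.array(scores), axis=0)]
```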

  4. Modeling of electrohydrodynamic drying process using response surface methodology.

    Science.gov (United States)

    Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin

    2014-05-01

    The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising alternative drying process is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required by the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model in order to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box-Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build up a model that can map the entire response surface. The interrelationships between the parameters were well captured by RSM.
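
    As a hedged illustration of the RSM fitting step, the sketch below fits a second-order response-surface polynomial to coded Box-Behnken factor levels by least squares; the data and factor coding are placeholders, not the paper's measurements:

```python
# Hedged sketch of a second-order response-surface fit of the kind RSM uses.
import numpy as np

def quadratic_design_matrix(X):
    """Columns: intercept, linear terms, two-way interactions, squared terms."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

# X: coded factor levels (-1, 0, +1) from a Box-Behnken-style design;
# y: a response such as energy efficiency. Placeholder data only.
X = np.random.choice([-1.0, 0.0, 1.0], size=(27, 4))
y = np.random.rand(27)
beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)  # OLS fit
```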

  5. Development of performance assessment methodology for establishment of quantitative acceptance criteria of near-surface radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Kim, C. R.; Lee, E. Y.; Park, J. W.; Chang, G. M.; Park, H. Y.; Yeom, Y. S. [Korea Hydro and Nuclear Power Co., Ltd., Seoul (Korea, Republic of)

    2002-03-15

    The contents and the scope of this study are as follows: a review of the state of the art on the establishment of waste acceptance criteria in foreign near-surface radioactive waste disposal facilities; an investigation of radiological assessment methodologies and scenarios; an investigation of existing models and computer codes used in performance/safety assessment; the development of a performance assessment methodology (draft) to derive quantitative radionuclide acceptance criteria for a domestic near-surface disposal facility; and a preliminary performance/safety assessment in accordance with the developed methodology.

  6. A ROADMAP FOR GENERATING SEMANTICALLY ENRICHED BUILDING MODELS ACCORDING TO CITYGML MODEL VIA TWO DIFFERENT METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    G. Floros

    2016-10-01

    Full Text Available 3D modeling techniques have advanced rapidly due to new technologies. Nowadays, 3D modeling software focuses not only on the finest visualization of the models, but also on their semantic features during the modeling procedure. As a result, the models thus generated are both realistic and semantically enriched. Additionally, various extensions of modeling software allow for the immediate conversion of the model's format, via semi-automatic procedures, with respect to the user's scope. The aim of this paper is to investigate the generation of a semantically enriched CityGML building model via two different methodologies. The first methodology includes modeling in Trimble SketchUp and transformation in FME Desktop Manager, while the second methodology includes the model's generation in CityEngine and its transformation into the CityGML format via the 3DCitiesProject extension for ArcGIS. Finally, the two aforesaid methodologies are compared and specific characteristics are evaluated, in order to infer which methodology is best applied depending on the project's purposes.

  7. A Methodology For The Development Of Complex Domain Specific Languages

    CERN Document Server

    Risoldi, Matteo; Falquet, Gilles

    2010-01-01

    The term Domain-Specific Modeling Language (DSML) is used in software development to indicate a modeling (and sometimes programming) language dedicated to a particular problem domain, a particular problem representation technique and/or a particular solution technique. The concept is not new: special-purpose programming languages and all kinds of modeling/specification languages have always existed, but the term DSML has become more popular due to the rise of domain-specific modeling. Domain-specific languages are considered 4GL programming languages. Domain-specific modeling techniques have been adopted for a number of years now. However, the techniques and frameworks used still suffer from problems of complexity of use and fragmentation. Although some integrated environments have recently seen the light, it is not common to see many concrete use cases in which domain-specific modeling has been put to use. The main goal of this thesis is tackling the domain of interactive systems and applying a DSML-based...

  8. An improved methodology for dynamic modelling and simulation of electromechanically coupled drive systems: An experimental validation

    Indian Academy of Sciences (India)

    Nuh Erdogan; Humberto Henao; Richard Grisel

    2015-10-01

    The complexity of electromechanically coupled drive systems (ECDSs), specifically electrical drive systems, makes studying them in their entirety challenging, since they consist of elements of diverse nature, i.e. electric, electronic and mechanical. This presents a real struggle to engineers who want to design and implement such systems with high performance, efficiency and reliability. For this purpose, engineers need a tool capable of modelling and/or simulating components of diverse nature within the ECDS. However, the majority of the available tools are limited in their capacity to describe the characteristics of such components sufficiently. To overcome this difficulty, this paper first proposes an improved modelling and simulation methodology for ECDSs. The approach is based on using domain-based simulators individually, namely electric and mechanic part simulators, and integrating them through co-simulation. As for the modelling of the drive machine, a finely tuned dynamic model is developed by taking the saturation effect into account. In order to validate the developed model as well as the proposed methodology, an industrial ECDS is tested experimentally. Both the experimental and simulation results are then compared to prove the accuracy of the developed model and the relevance of the proposed methodology.

  9. Towards a Pattern-Driven Topical Ontology Modeling Methodology in Elderly Care Homes

    Science.gov (United States)

    Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert; Pudkey, Kevin

    This paper presents a pattern-driven ontology modeling methodology, which is used to create topical ontologies in the human resource management (HRM) domain. An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). We use the Organization for Economic Co-operation and Development (OECD) and the National Vocational Qualification (NVQ) as the resources to create the topical ontologies in this paper. The methodology is implemented in a tool called the PAD-ON suite. The approach is illustrated with a use case from elderly care homes in the UK.

  10. Development of a methodology for the detection of hospital financial outliers using information systems.

    Science.gov (United States)

    Okada, Sachiko; Nagase, Keisuke; Ito, Ayako; Ando, Fumihiko; Nakagawa, Yoshiaki; Okamoto, Kazuya; Kume, Naoto; Takemura, Tadamasa; Kuroda, Tomohiro; Yoshihara, Hiroyuki

    2014-01-01

    Comparison of financial indices helps to illustrate differences in operations and efficiency among similar hospitals. Outlier data tend to influence statistical indices, so detection of outliers is desirable. Development of a methodology for financial outlier detection using information systems will help to reduce the time and effort required, eliminate subjective elements in the detection of outlier data, and improve the efficiency and quality of analysis. The purpose of this research was to develop such a methodology. Financial outliers were defined based on a case model. An outlier-detection method using the distances between cases in multi-dimensional space is proposed. Experiments using three diagnosis groups indicated successful detection of cases for which the profitability and income structure differed from other cases. Therefore, the method proposed here can be used to detect outliers.
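
    As a hedged illustration of the distance-based idea (not the authors' implementation), the sketch below flags cases whose mean distance to all other cases in index space is extreme; the threshold choice is an assumption:

```python
# Hedged sketch of distance-based outlier detection in multi-dimensional
# financial-index space. Threshold and standardization are assumptions.
import numpy as np

def distance_outliers(cases, z_thresh=3.0):
    """cases: (n_cases, n_indices) array of standardized financial indices.
    Flags cases whose mean distance to all other cases is extreme."""
    diffs = cases[:, None, :] - cases[None, :, :]
    dist = np.sqrt((diffs ** 2).sum(axis=2))          # pairwise distances
    mean_dist = dist.sum(axis=1) / (len(cases) - 1)   # exclude self-distance (zero)
    z = (mean_dist - mean_dist.mean()) / mean_dist.std()
    return np.where(z > z_thresh)[0]                  # indices of outlier cases
```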

  11. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    Georgsen, Marianne; Buus, Lillian; Glud, Louise Nørgaard;

    2010-01-01

    In this paper we discuss the notion of a learning methodology and situate this within the wider frame of learning design or “Designing for Learning”. We discuss existing work within this broad area by trying to categorize different approaches and interpretations, and we present our development of particular ‘mediating design artefacts’. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design: the COllaborative E-learning Design method (CoEd). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem-based learning and web 2.0 technologies. The challenge of designing on the basis of an explicit learning…

  12. Risk assessment methodology applied to counter IED research & development portfolio prioritization

    Energy Technology Data Exchange (ETDEWEB)

    Shevitz, Daniel W [Los Alamos National Laboratory]; O'Brien, David A [Los Alamos National Laboratory]; Zerkle, David K [Los Alamos National Laboratory]; Key, Brian P [Los Alamos National Laboratory]; Chavez, Gregory M [Los Alamos National Laboratory]

    2009-01-01

    In an effort to protect the United States from the ever increasing threat of domestic terrorism, the Department of Homeland Security, Science and Technology Directorate (DHS S&T) has significantly increased research activities to counter the terrorist use of explosives. Moreover, DHS S&T has established a robust Counter-Improvised Explosive Device (C-IED) Program to Deter, Predict, Detect, Defeat, and Mitigate this imminent threat to the Homeland. The DHS S&T portfolio is complicated and changing. In order to provide the "best answer" for the available resources, DHS S&T would like a "risk based" process for making funding decisions. There is a definite need for a methodology to compare very different types of technologies on a common basis. A methodology was developed that allows users to evaluate a new "quad chart" and rank it against all other quad charts across S&T divisions. It couples a logic model with an evidential reasoning model, using an Excel spreadsheet containing weights of the subjective merits of different technologies. The methodology produces an Excel spreadsheet containing the aggregate rankings of the different technologies. It uses Extensible Logic Modeling (ELM) for the logic models, combined with LANL software called INFTree for evidential reasoning.

  13. METHODOLOGICAL BASES OF ECOLOGICAL CULTURE FORMATION OF PUPILS ON THE BASIS OF ECO-DEVELOPMENT IDEAS

    Directory of Open Access Journals (Sweden)

    Natalia F. Vinokurova

    2016-01-01

    Full Text Available Aim. The article describes the methodological bases of the formation of pupils' ecological culture as the aim of innovative training for a sustainable future. The authors take into account international and Russian experience connected with the development of ecological culture as an educational resource for society's adaptation to environmental constraints, risks and crises, and for present-day consolidated actions towards the sustainable development of civilization. Methods. The methodological basis for constructing the model of the formation of pupils' ecological culture is developed from the standpoint of the ideas of eco-development (noosphere, co-evolution, sustainable development) and a set of axiological, cultural, personal-activity, co-evolutionary, and cultural-ecological approaches. This methodological basis allowed the authors to construct the educational level of formation of pupils' ecological culture, comprising an interconnected unity of target, substantive, procedural, and appraisal components. Results and scientific novelty. The article presents the results of the authors' long-term research on environmental education for sustainable development in the framework of the Nizhny Novgorod scientific school. A characteristic of pupils' ecological culture as the goal of environmental education based on eco-development ideas is given. It is shown that ecological culture directs pupils to new values in life meanings and methods of ecologically oriented actions and behavior, changing the attitudes of the consumer society and providing the younger generation with co-evolutionary, spiritual guidance in a post-industrial society. The authors' model of the formation of pupils' ecological culture is represented by conjugated philosophical-methodological, theoretical-methodological and pedagogical levels that ensure the integrity and hierarchy of pedagogical research on the issue. The article discloses a pedagogical assessment…

  14. A methodology for semiautomatic generation of finite element models: Application to mechanical devices

    Directory of Open Access Journals (Sweden)

    Jesús López

    2015-02-01

    Full Text Available In this work, a methodology to create parameterized finite element models is presented, particularly focusing on the development of suitable algorithms in order to generate models and meshes with high computational efficiency. The methodology is applied to the modeling of two common mechanical devices: an optical linear encoder and a gear transmission. This practical application constitutes a demanding test of the proposed methodology, given the complexity and the large number of components that make up this high-precision measurement device and the singularity of the internal gears. The geometrical and mechanical particularities of the components lead to multidimensional modeling, which seeks to ensure proper interaction between the different types of finite elements. Besides, modeling criteria to create components such as compression and torsion springs, sheet springs, bearings, and adhesive joints are also presented in the article. The last part of the work validates the simulation results obtained with the proposed methodology against those derived from experimental tests, through white-noise base-driven vibration and hammer impact excitation modal analyses.

  15. A component modes projection and assembly model reduction methodology for articulated, multi-flexible body structures

    Science.gov (United States)

    Lee, Allan Y.; Tsuha, Walter S.

    1993-01-01

    A two-stage model reduction methodology, combining the classical Component Mode Synthesis (CMS) method and the newly developed Enhanced Projection and Assembly (EP&A) method, is proposed in this research. The first stage of this methodology, called the COmponent Modes Projection and Assembly model REduction (COMPARE) method, involves the generation of CMS mode sets, such as the MacNeal-Rubin mode sets. These mode sets are then used to reduce the order of each component model in the Rayleigh-Ritz sense. The resultant component models are then combined to generate reduced-order system models at various system configurations. A composite mode set which retains important system modes at all system configurations is then selected from these reduced-order system models. In the second stage, the EP&A model reduction method is employed to reduce further the order of the system model generated in the first stage. The effectiveness of the COMPARE methodology has been successfully demonstrated on a high-order, finite-element model of the cruise-configured Galileo spacecraft.
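
    As a hedged illustration of the Rayleigh-Ritz reduction step (a generic sketch, not the COMPARE equations themselves), a component model is reduced by projecting onto a truncated CMS mode set:

```latex
% Generic Rayleigh-Ritz component reduction (sketch; symbols assumed):
% physical coordinates x are approximated in a truncated CMS mode set \Phi,
% giving reduced mass and stiffness matrices and equations of motion.
\[
x \approx \Phi q, \qquad
M_r = \Phi^{\mathsf{T}} M \Phi, \qquad
K_r = \Phi^{\mathsf{T}} K \Phi, \qquad
M_r \ddot{q} + K_r q = \Phi^{\mathsf{T}} f .
\]
```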

  16. Optimization-based methodology for the development of wastewater facilities for energy and nutrient recovery.

    Science.gov (United States)

    Puchongkawarin, C; Gomez-Mont, C; Stuckey, D C; Chachuat, B

    2015-12-01

    A paradigm shift is currently underway from an attitude that considers wastewater streams as a waste to be treated, to a proactive interest in recovering materials and energy from these streams. This paper is concerned with the development and application of a systematic, model-based methodology for developing wastewater resource recovery systems that are both economically attractive and sustainable. With the array of available treatment and recovery options growing steadily, a superstructure modeling approach based on rigorous mathematical optimization appears to be a natural approach for tackling these problems. The development of reliable, yet simple, performance and cost models is a key issue with this approach, in order to allow for a reliable solution based on global optimization. We argue that commercial wastewater simulators can be used to derive such models, and we illustrate this approach with a simple resource recovery system. The results show that the proposed methodology is computationally tractable, thereby supporting its application as a decision support system for selecting promising resource recovery systems whose development is worth pursuing.
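
    As a hedged illustration of superstructure-based selection (not the authors' optimization model), the sketch below enumerates combinations of candidate recovery technologies and keeps the cheapest combination meeting assumed recovery targets; all names and numbers are illustrative:

```python
# Hedged sketch: brute-force search over a tiny superstructure of candidate
# technologies. Costs, recovery fractions and targets are placeholders.
from itertools import product

techs = {
    "anaerobic_digestion": {"cost": 5.0, "energy": 3.0, "nutrient": 0.1},
    "struvite_recovery":   {"cost": 2.0, "energy": 0.0, "nutrient": 0.6},
    "membrane_treatment":  {"cost": 3.0, "energy": -0.5, "nutrient": 0.2},
}

best = None
for mask in product([0, 1], repeat=len(techs)):
    chosen = [t for t, on in zip(techs, mask) if on]
    cost = sum(techs[t]["cost"] for t in chosen)
    energy = sum(techs[t]["energy"] for t in chosen)
    nutrient = sum(techs[t]["nutrient"] for t in chosen)
    if energy >= 2.0 and nutrient >= 0.5:  # assumed recovery targets
        if best is None or cost < best[0]:
            best = (cost, chosen)
print(best)  # (7.0, ['anaerobic_digestion', 'struvite_recovery'])
```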

  17. Methodology, status and plans for development and assessment of Cathare code

    Energy Technology Data Exchange (ETDEWEB)

    Bestion, D.; Barre, F.; Faydide, B. [CEA - Grenoble (France)

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented, together with the general strategy used for the development and the assessment of the code. Analytical experiments with separate effect tests and component tests are used for the development and the validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of each Revision. Each delivery of a code Version plus Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code, using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for the future developments of the code are presented. They concern the optimization of the code performance through parallel computing (the code will be used for real-time full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Physical improvements are also required in the field of low-pressure transients and in the 3-D model.

  18. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS

    Science.gov (United States)

    Jadaan, Osama M.

    2003-01-01

    This effort is to investigate probabilistic life prediction methodologies for MicroElectroMechanical Systems (MEMS) and to analyze designs that determine the stochastic properties of MEMS. This includes completion of a literature survey regarding the Weibull size effect in MEMS and strength testing techniques. Also of interest is the design of a proper test for the Weibull size effect in tensile specimens. The Weibull size effect is a consequence of the stochastic strength response predicted by the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. Another potential item of interest is the analysis and modeling of material interfaces for strength, as well as developing a strategy to handle stress singularities at sharp corners, fillets, and material interfaces. The ultimate objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. Along these lines, work may also be performed on transient fatigue life prediction methodologies.
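
    As a hedged illustration of the Weibull size effect mentioned above (a standard weakest-link sketch with assumed symbols, not NASA's specific implementation):

```latex
% Two-parameter weakest-link Weibull form behind the size effect: failure
% probability of a specimen of volume V under uniform stress \sigma, with
% Weibull modulus m and characteristic strength \sigma_0 for volume V_0.
\[
P_f(\sigma, V) = 1 - \exp\!\left[-\frac{V}{V_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right],
\qquad
\frac{\sigma_1}{\sigma_2} = \left(\frac{V_2}{V_1}\right)^{1/m},
\]
% so smaller specimens exhibit higher characteristic strengths.
```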

  19. The Pre-Conceptual Map Methodology: Development and Application

    Science.gov (United States)

    Hipsky, Shellie

    2006-01-01

    The objective of this article is to present the Pre-Conceptual Map methodology as a formalized way to identify, document, and utilize preconceived assumptions on the part of the researcher in qualitative inquiry. This technique can be used as a stand-alone method or in conjunction with other qualitative techniques (i.e., naturalistic inquiry).…

  20. A coupled groundwater-flow-modelling and vulnerability-mapping methodology for karstic terrain management

    Science.gov (United States)

    Kavouri, Konstantina P.; Karatzas, George P.; Plagnes, Valérie

    2017-02-01

    A coupled groundwater-flow-modelling and vulnerability-mapping methodology for the management of karst aquifers with spatial variability is developed. The methodology takes into consideration the duality of flow and recharge in karst and introduces a simple method to integrate the effect of temporal storage in the unsaturated zone. In order to investigate the applicability of the developed methodology, simulation results are validated against available field measurement data. The criteria maps from the PaPRIKa vulnerability-mapping method are used to document the groundwater flow model. The FEFLOW model is employed for the simulation of the saturated zone of the Palaikastro-Chochlakies karst aquifer, on the island of Crete, Greece, for the hydrological years 2010-2012. The simulated water table reproduces typical karst characteristics, such as steep slopes and preferred drain axes, and is in good agreement with field observations. Selected calculated error indicators (Nash-Sutcliffe efficiency (NSE), root mean squared error (RMSE) and model efficiency (E')) are within acceptable value ranges. Results indicate that different storage processes take place in different parts of the aquifer. The north-central part seems to be more sensitive to diffuse recharge, while the southern part is affected primarily by precipitation events. Sensitivity analysis is performed on the parameters of hydraulic conductivity and specific yield. The methodology is used to estimate the feasibility of artificial aquifer recharge (AAR) at the study area. Based on the developed methodology, guidelines are provided for the selection of the appropriate AAR scenario with a positive impact on the water table.
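
    For reference, the two main reported error indicators can be computed as sketched below; the observed and simulated water-table series are assumed NumPy arrays, not data from the paper:

```python
# Hedged sketch of the reported error indicators for model validation.
import numpy as np

def rmse(obs, sim):
    """Root mean squared error between observed and simulated series."""
    return np.sqrt(np.mean((obs - sim) ** 2))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect match; values below 0 mean
    the model performs worse than predicting the observed mean."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)
```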

  2. Development of a reference biospheres methodology for radioactive waste disposal. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Dorp, F. van [NAGRA (Switzerland)] [and others]

    1996-09-01

    The BIOMOVS II Working Group on Reference Biospheres has focused on the definition and testing of a methodology for developing models to analyse radionuclide behaviour in the biosphere and the associated radiological exposure pathways (a Reference Biospheres Methodology). The Working Group limited the scope to the assessment of the long-term implications of solid radioactive waste disposal. Nevertheless, it is considered that many of the basic principles would be equally applicable to other areas of biosphere assessment. The recommended methodology has been chosen to be relevant to different types of radioactive waste and disposal concepts. It includes the justification, arguments and documentation for all the steps in the recommended methodology. The previous experience of members of the Reference Biospheres Working Group was that the underlying premises of a biosphere assessment have often been taken for granted at the early stages of model development, and can therefore fail to be recognized later on when questions of model sufficiency arise, for example, because of changing regulatory requirements. The intention has been to define a generic approach for the formation of an 'audit trail' and hence provide demonstration that a biosphere model is fit for its intended purpose. The methodology has three starting points. The Assessment Context sets out what the assessment has to achieve, e.g. in terms of assessment purpose and related regulatory criteria, as well as information about the repository system and types of release from the geosphere. The Basic System Description includes the fundamental premises about future climate conditions and human behaviour which, to a significant degree, are beyond prediction. The International FEP List is a generically relevant list of Features, Events and Processes potentially important for biosphere model development. The International FEP List includes FEPs to do with the assessment context. The context examined in…

  3. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available Customer interaction is not the only factor that influences a successful project; teamwork among software developers also plays a role. Hence, Moe et al. (2010) examined teamwork among developers in a Scrum project, using Dickson and McIntyre's teamwork model…

  4. Development of CANDU ECCS performance evaluation methodology and guides

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Kwang Hyun; Park, Kyung Soo; Chu, Won Ho [Korea Maritime Univ., Jinhae (Korea, Republic of)

    2003-03-15

    The objectives of the present work are to carry out a technical evaluation and review of CANDU safety analysis methods in order to assist the development of performance evaluation methods and review guides for CANDU ECCS. The applicability of PWR ECCS analysis models is examined, and the results suggest that unique data or models for CANDU are required for the following phenomena: break characteristics and flow, frictional pressure drop, post-CHF heat transfer correlations, core flow distribution during blowdown, containment pressure, and reflux rate. For safety analysis of CANDU, conservative analysis or best estimate (BE) analysis can be used. The main advantage of BE analysis is a more realistic prediction of margins to acceptance criteria. The expectation is that margins demonstrated with BE methods would be larger than when a conservative approach is applied. Some outstanding safety analysis issues can be resolved by demonstrating that accident consequences are more benign than previously predicted. Success criteria for analysis and review of Large LOCA can be developed by a top-down approach. The highest-level success criteria can be extracted from C-6, and from them the lower-level criteria can be developed step by step, in a logical fashion. The overall objectives for analysis and review are to verify that the radiological consequence and frequency criteria are met.

  5. Developing Empirically Based Models of Practice.

    Science.gov (United States)

    Blythe, Betty J.; Briar, Scott

    1985-01-01

    Over the last decade emphasis has shifted from theoretically based models of practice to empirically based models whose elements are derived from clinical research. These models are defined, and a developing model of practice based on single-case methodology is examined. Potential impediments to this new role are identified. (Author/BL)

  6. A Dynamic Defense Modeling and Simulation Methodology using Semantic Web Services

    Directory of Open Access Journals (Sweden)

    Kangsun Lee

    2010-04-01

    Full Text Available Defense modeling and simulations require interoperable and autonomous federates in order to fully simulate the complex behavior of war-fighters and to dynamically adapt to various war-game events, commands and controls. In this paper, we propose a semantic-web-service-based methodology for developing war-game simulations. Our methodology encapsulates war-game logic into a set of web services with additional semantic information in WSDL (Web Service Description Language) and OWL (Web Ontology Language). By utilizing the dynamic discovery and binding power of semantic web services, we are able to dynamically reconfigure federates according to various simulation events. An ASuW (Anti-Surface Warfare) simulator is constructed to demonstrate the methodology and successfully shows that the level of interoperability and autonomy can be greatly improved.

  7. Methodology for optimizing the development and operation of gas storage fields

    Energy Technology Data Exchange (ETDEWEB)

    Mercer, J.C.; Ammer, J.R.; Mroz, T.H.

    1995-04-01

    The Morgantown Energy Technology Center is pursuing the development of a methodology that uses geologic modeling and reservoir simulation for optimizing the development and operation of gas storage fields. Several Cooperative Research and Development Agreements (CRADAs) will serve as the vehicle to implement this product. CRADAs have been signed with National Fuel Gas and Equitrans, Inc. A geologic model is currently being developed for the Equitrans CRADA. Results from the CRADA with National Fuel Gas are discussed here. The first phase of the CRADA, based on original well data, was completed last year and reported at the 1993 Natural Gas RD&D Contractors Review Meeting. Phase 2 analysis was completed based on additional core and geophysical well log data obtained during a deepening/relogging program conducted by the storage operator. Good matches of wellhead pressure, within 10 percent, were obtained using a numerical simulator to history match 2 1/2 injection/withdrawal cycles.

  8. Computational and methodological developments towards 3D full waveform inversion

    Science.gov (United States)

    Etienne, V.; Virieux, J.; Hu, G.; Jia, Y.; Operto, S.

    2010-12-01

    Full waveform inversion (FWI) is one of the most promising techniques for seismic imaging. It relies on a formalism taking into account every piece of information contained in the seismic data, as opposed to more classical techniques such as travel time tomography. As a result, FWI is a high-resolution imaging process able to reach a spatial accuracy of half a wavelength. FWI is based on a local optimization scheme, and therefore its main limitation concerns the starting model, which has to be close enough to the real one in order to converge to the global minimum. Another drawback of FWI is the computational resources required when considering models and frequencies of interest. The task becomes even more tremendous when one attempts to perform the inversion using the elastic equation instead of the acoustic approximation. This is the reason why, until recently, most studies were limited to 2D cases. In the last few years, due to the increase of the available computational power, FWI has attracted a lot of interest and continuous effort towards the inversion of 3D models, leading to remarkable applications up to the continental scale. We investigate the computational burden induced by FWI in 3D elastic media and propose some strategic features leading to the reduction of the numerical cost while providing great flexibility in the inversion parametrization. First, in order to relax the memory requirements, we developed our FWI algorithm in the frequency domain and take advantage of the wave-number redundancy in the seismic data to process a much reduced number of frequencies. To do so, we extract frequency solutions from time-marching techniques, which are efficient for 3D structures. Moreover, this frequency approach permits a multi-resolution strategy by proceeding from low to high frequencies: the final model at one frequency is used as the starting model for the next frequency. This procedure partially overcomes the non-linear behavior of the inversion…
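
    As a hedged illustration of the frequency-domain FWI objective (a generic sketch with assumed notation, not the authors' exact formulation):

```latex
% Standard frequency-domain FWI least-squares misfit: for model m, sources s,
% receivers r and a few discrete angular frequencies \omega, minimized
% iteratively from low to high frequencies (the multi-resolution strategy).
\[
\chi(m) = \frac{1}{2} \sum_{\omega} \sum_{s,r}
\left| d_{\mathrm{calc}}(m;\omega,s,r) - d_{\mathrm{obs}}(\omega,s,r) \right|^{2}
\]
```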

  9. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    2010-01-01

    In this paper we discuss the notion of a learning methodology and situate this within the wider frame of learning design or “Designing for Learning”. We discuss existing work within this broad area by trying to categorize different approaches and interpretations, and we present our development of particular ‘mediating design artefacts’. We discuss what can be viewed as a lack of attention paid to integrating the preferred teaching styles and learning philosophies of practitioners into design tools, and present a particular method for learning design; the COllaborative E-learning Design method (CoEd). We describe how this method has been adopted as part of a learning methodology building on concepts and models presented in the other symposium papers, in particular those of active, problem based learning and web 2.0-technologies. The challenge of designing on the basis of an explicit learning...

  11. Methodology matters: measuring urban spatial development using alternative methods

    OpenAIRE

    Daniel E Orenstein; Amnon Frenkel; Faris Jahshan

    2014-01-01

    The effectiveness of policies implemented to prevent urban sprawl has been a contentious issue among scholars and practitioners for at least two decades. While disputes range from the ideological to the empirical, regardless of the subject of dispute, participants must bring forth reliable data to buttress their claims. In this study we discuss several sources of complexity inherent in measuring sprawl. We then exhibit how methodological decisions can lead to disparate results regarding the q...

  12. A Comparison of Various Software Development Methodologies: Feasibility and Methods of Integration

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2016-12-01

    Full Text Available System development methodologies that have been used in academic and commercial environments during the last two decades have advantages and disadvantages. Researchers have tried to identify the objectives, scope, etc., of the methodologies by following different approaches. Each approach has its limitations, specific interests, coverage, etc. In this paper, we perform a comparative study of those methodologies which are popular and commonly used in banking and commercial environments. We try to determine the objectives, scope, tools, and other features of the methodologies. We also try to determine how, and to what extent, the methodologies incorporate facilities such as project management, cost-benefit analysis, and documentation. One of the most important aspects of our study was how to integrate the methodologies into a global methodology which covers the complete span of the software development life cycle. A prototype system which integrates the selected methodologies has been developed. The developed system helps analysts and designers choose suitable tools or obtain guidelines on what to do in a particular situation. The prototype system was tested during the development of software for an ATM “Auto Teller Machine” by selecting and applying the SASD methodology. This resulted in the development of a high-quality and well-documented software system.

  13. THEORETIC AND METHODOLOGIC BASICS OF DEVELOPMENT OF THE NATIONAL LOGISTICS SYSTEM IN THE REPUBLIC OF BELARUS

    Directory of Open Access Journals (Sweden)

    R. B. Ivut

    2016-01-01

    Full Text Available The article presents the results of a study whose aim is to form the theoretical and methodological foundations for the scientific support of the further development of the national logistics system in the Republic of Belarus. The relevance of the study relates to the fact that the introduction of the logistics concept, and the formation of the optimal infrastructure for its implementation, are currently key factors for the economic development of Belarus as a transit country. At the same time, the pace of development of logistics activities in the country is currently somewhat slower than in the neighboring countries, as evidenced by the dynamics of the country’s position in international rankings (in particular, the LPI index). Overcoming these gaps requires improved competitiveness of the logistics infrastructure in the international market. This, in turn, is possible through the clear formulation of, and adherence to, principles for the effective functioning of the macro-logistics system of Belarus, as well as by increasing the quality of logistics design by means of the econometric models and methods presented in the article. The authors' approach differentiates between the general principles of logistics common to logistics systems of all levels and the specific principles of development of the macro-level logistics system related to improving its transit attractiveness for international freight carriers. The study also systematizes the models for determining the optimal location of logistics facilities. Particular attention is paid to the methodological basis of the analysis of transport terminals functioning as part of logistics centers, both at the design and operation stages. The developed theoretical and methodological recommendations are universal and can be used in the design of logistics infrastructure for various purposes and functions.

  14. SR-Site groundwater flow modelling methodology, setup and results

    Energy Technology Data Exchange (ETDEWEB)

    Selroos, Jan-Olof (Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden)); Follin, Sven (SF GeoLogic AB, Taeby (Sweden))

    2010-12-15

    As a part of the license application for a final repository for spent nuclear fuel at Forsmark, the Swedish Nuclear Fuel and Waste Management Company (SKB) has undertaken three groundwater flow modelling studies. These are performed within the SR-Site project and represent time periods with different climate conditions. The simulations carried out contribute to the overall evaluation of the repository design and long-term radiological safety. Three time periods are addressed: the Excavation and operational phases, the Initial period of temperate climate after closure, and the Remaining part of the reference glacial cycle. The present report is a synthesis of the background reports describing the modelling methodology, setup, and results. It is the primary reference for the conclusions drawn in an SR-Site specific context concerning groundwater flow during the three climate periods. These conclusions are not necessarily provided explicitly in the background reports, but are based on the results provided in these reports. The main results and comparisons presented in the present report are summarised in the SR-Site Main report.

  15. APPLICATION OF METHODOLOGY OF STRATEGIC PLANNING IN DEVELOPING NATIONAL PROGRAMMES ON DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Inna NOVAK

    2015-07-01

    Full Text Available Actuality: The main purpose of strategic planning is that the long-term interests of sustainable development of a market economy require the use of effective measures of state regulation of economic and social processes. Objective: The aim of the article is to analyze the development of strategic planning methodology and the practical experience of its application in the design of national development programs. Methods: When writing the article the following research methods were used: analysis and synthesis, target-oriented and monographic. Results: In Ukraine, strategies for the development of branches, regions, cities, etc. are being developed at the level of state and local government authorities, but given the lack of state funding, a unified investment strategy for the country has not been developed. After analyzing the development of the strategic planning methodology and examples of its application in the design of state development programs, we identified the need to develop an investment strategy of the state (for sectors, regions, etc.), as its defined directions and guidelines will increase the level of investment in the country and support the national strategy “Ukraine-2020”.

  16. Methodological Development of the Probabilistic Model of the Safety Assessment of Hontomin P.D.T.; Desarrollo Metodologico del Modelo Probabilista de Evaluacion de Seguridad de la P.D.T. de Hontomin

    Energy Technology Data Exchange (ETDEWEB)

    Hurtado, A.; Eguilior, S.; Recreo, F.

    2011-06-07

    In the framework of CO{sub 2} Capture and Geological Storage, risk analysis plays an important role, because it provides knowledge essential to the definition and planning of carbon injection strategies at local, national, and supranational levels. This is because every project carries a risk of failure. Even from the early stages, the possible causes of this risk should be taken into account and corrective methods proposed along the way, i.e., the risk should be managed. Proper risk management reduces the negative consequences arising from the project. Risk is reduced or neutralized mainly through its identification, measurement, and evaluation, together with the development of decision rules. This report presents the developed methodology for risk analysis and the results of its application. The risk assessment requires determination of the random variables that will influence the functioning of the system. It is very difficult to set up the probability distribution of a random variable in the classical sense (objective probability) when a particular event has rarely occurred or is incompletely characterized. In this situation, we have to determine subjective probabilities, especially at an early stage of a project, when we do not have enough information about the system. This subjective probability is constructed from expert judgement, estimating the likelihood that certain random events will happen depending on the geological features of the area of application. The proposed methodology is based on the application of Bayesian probabilistic networks for estimating the probability of leakage risk. Such networks graphically define the dependence relations between variables and represent the joint probability function through a local factorization of probability functions. (Author) 98 refs.
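
    The local factorization that makes Bayesian networks tractable can be shown in a few lines. The toy network below (fault presence, seal integrity, leak) and all probability values are hypothetical, chosen only to illustrate how a marginal leakage probability falls out of the factored form.

```python
# Minimal sketch of the local factorization a Bayesian network provides:
# P(leak, fault, seal) = P(fault) * P(seal | fault) * P(leak | fault, seal).
# The network structure and the probability values are hypothetical.

p_fault = {True: 0.1, False: 0.9}
p_seal_ok_given_fault = {True: 0.6, False: 0.95}
p_leak_given = {  # P(leak=True | fault, seal_ok)
    (True, True): 0.05, (True, False): 0.40,
    (False, True): 0.01, (False, False): 0.10,
}

def p_leak() -> float:
    """Marginal probability of leakage, summing over the parent variables."""
    total = 0.0
    for fault in (True, False):
        for seal_ok in (True, False):
            p_seal = (p_seal_ok_given_fault[fault] if seal_ok
                      else 1.0 - p_seal_ok_given_fault[fault])
            total += p_fault[fault] * p_seal * p_leak_given[(fault, seal_ok)]
    return total

print(f"P(leak) = {p_leak():.4f}")
```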

  17. Contribution to developing the environment radiation protection methodology

    Energy Technology Data Exchange (ETDEWEB)

    Oudalova, A. [Institute of Atomic Power Engineering NRNU MEPhI (Russian Federation); Alexakhin, R.; Dubynina, M. [Russian Institute of Agricultural Radiology and Agroecology (Russian Federation)

    2014-07-01

    Sustainable development of the environment and the protection of biota, including radiation protection of the environment, are issues of current interest in society. Work is ongoing on the development of a system of radiation protection for non-human biota. Anthropocentric and eco-centric principles are widely discussed. ICRP Publications 103, 108, 114 and many other reports and articles refer to the topic of environmental protection, the reference animals and plants set, the corresponding transfer parameters, dose models, and derived consideration reference levels. There is still an open field for discussion of methods and approaches to arrive at a well-established procedure for assessing the environmental risks of radiation impacts on different organisms, populations, and ecosystems. A great deal of work has been done by the ICRP and other organizations and research groups to develop and systematize approaches to this difficult subject. This activity, however, is not everywhere well known, and more effort is needed to bring the ideas of an eco-centric strategy for environmental radiation protection not only to the public but also to specialists in many countries. One of the main points of interest is the assessment of critical doses and dose rates for flora and fauna species. Some aspects of a possible procedure for finding their estimates are studied in this work, including criteria for datasets of good quality, models of dose dependence, the sensitivity of different umbrella endpoints, and methods for treating the original massive datasets. Estimates are based on information gathered in a database on radiation-induced effects in plants. Data on biological effects in plants (umbrella endpoints of reproductive potential, survival, morbidity, and morphological, biochemical, and genetic effects) as a function of dose and dose rate of ionizing radiation have been collected from reviewed publications and maintained in MS Access format. The database now contains about 7000 datasets and 25000 records

  18. A methodology for ecosystem-scale modeling of selenium

    Science.gov (United States)

    Presser, T.S.; Luoma, S.N.

    2010-01-01

    The main route of exposure for selenium (Se) is dietary, yet regulations lack biologically based protocols for evaluations of risk. We propose here an ecosystem-scale model that conceptualizes and quantifies the variables that determine how Se is processed from water through diet to predators. This approach uses biogeochemical and physiological factors from laboratory and field studies and considers loading, speciation, transformation to particulate material, bioavailability, bioaccumulation in invertebrates, and trophic transfer to predators. Validation of the model is through data sets from 29 historic and recent field case studies of Se-exposed sites. The model links Se concentrations across media (water, particulate, tissue of different food web species). It can be used to forecast toxicity under different management or regulatory proposals or as a methodology for translating a fish-tissue (or other predator tissue) Se concentration guideline to a dissolved Se concentration. The model illustrates some critical aspects of implementing a tissue criterion: 1) the choice of fish species determines the food web through which Se should be modeled, 2) the choice of food web is critical because the particulate material to prey kinetics of bioaccumulation differs widely among invertebrates, 3) the characterization of the type and phase of particulate material is important to quantifying Se exposure to prey through the base of the food web, and 4) the metric describing partitioning between particulate material and dissolved Se concentrations allows determination of a site-specific dissolved Se concentration that would be responsible for that fish body burden in the specific environment. The linked approach illustrates that environmentally safe dissolved Se concentrations will differ among ecosystems depending on the ecological pathways and biogeochemical conditions in that system. Uncertainties and model sensitivities can be directly illustrated by varying exposure
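
    Assuming the usual linked form of such models, the sketch below chains a dissolved-to-particulate partitioning coefficient (Kd) with trophic transfer factors (TTFs) to propagate a dissolved Se concentration up to fish tissue; the numerical values are illustrative, not site-specific.

```python
# Sketch of the linked media-to-predator calculation: a partitioning
# coefficient (Kd) from dissolved to particulate Se, then trophic transfer
# factors (TTFs) from particulate to invertebrate and invertebrate to fish.
# All numerical values are illustrative, not site-specific.

def fish_tissue_se(dissolved_ug_per_l: float,
                   kd_l_per_kg: float,
                   ttf_invertebrate: float,
                   ttf_fish: float) -> float:
    particulate = dissolved_ug_per_l * kd_l_per_kg / 1000.0  # ug/g particulate
    invertebrate = particulate * ttf_invertebrate            # ug/g prey tissue
    return invertebrate * ttf_fish                           # ug/g fish tissue

# Forward run: what fish burden follows from 2 ug/L dissolved Se?
print(fish_tissue_se(2.0, kd_l_per_kg=1000.0, ttf_invertebrate=2.8, ttf_fish=1.1))
```

    Running the same chain in reverse is what translates a fish-tissue criterion back to a site-specific dissolved concentration.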

  19. Developments in the Tools and Methodologies of Synthetic Biology

    Science.gov (United States)

    Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul

    2014-01-01

    Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788

  20. Developments in the tools and methodologies of synthetic biology

    Directory of Open Access Journals (Sweden)

    Richard Kelwick

    2014-11-01

    Full Text Available Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices or systems. However, biological systems are generally complex and unpredictable and are therefore intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a ‘body of knowledge’ from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled and its functionality tested. At each stage of the design cycle an expanding repertoire of tools is being developed. In this review we highlight several of these tools in terms of their applications and benefits to the synthetic biology community.

  1. Development and Application of Urban Landslide Vulnerability Assessment Methodology Reflecting Social and Economic Variables

    Directory of Open Access Journals (Sweden)

    Yoonkyung Park

    2016-01-01

    Full Text Available An urban landslide vulnerability assessment methodology is proposed with a major focus on urban social and economic aspects. The proposed methodology was developed based on the landslide susceptibility maps that the Korean Forest Service utilizes to identify landslide source areas. First, debris flows are propagated to urban areas from such source areas by Flow-R (flow path assessment of gravitational hazards at a regional scale), and then urban vulnerability is assessed in two categories: physical and socioeconomic. The physical vulnerability is related to buildings that can be impacted by a landslide event. This study considered two popular building structure types, reinforced-concrete frame and non-reinforced-concrete frame, to assess the physical vulnerability. The socioeconomic vulnerability is considered a function of the resistance levels of the vulnerable people, the trigger factors of secondary damage, and the preparedness level of the local government. An index-based model is developed to evaluate the loss of life and indirect damage under landslides, as well as the resilience against disasters. To illustrate the validity of the proposed methodology, physical and socioeconomic vulnerability levels are analyzed for Seoul, Korea, using the suggested approach. The general trend found in this study indicates that areas with higher population density and weaker fiscal conditions, located downstream of mountainous areas, are more vulnerable than areas in the opposite conditions.
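
    An index-based model of this kind typically reduces to a weighted sum of normalized indicators. The sketch below is a generic illustration; the indicator names and weights are hypothetical, not those used for Seoul.

```python
# Illustrative index-based aggregation: normalized socioeconomic indicators
# are combined with weights into a single vulnerability score.
# Indicator names and weights are hypothetical, not those of the paper.

indicators = {            # values already normalized to [0, 1]
    "population_density": 0.8,
    "fiscal_weakness": 0.6,
    "secondary_damage_triggers": 0.4,
    "lack_of_preparedness": 0.5,
}
weights = {
    "population_density": 0.35,
    "fiscal_weakness": 0.25,
    "secondary_damage_triggers": 0.2,
    "lack_of_preparedness": 0.2,
}

score = sum(indicators[k] * weights[k] for k in indicators)
print(f"socioeconomic vulnerability score: {score:.2f}")  # 0 = low, 1 = high
```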

  2. Engendering Development: Some Methodological Perspectives on Child Labour

    Directory of Open Access Journals (Sweden)

    Erica Burman

    2006-01-01

    Full Text Available In this article I address when and why it is useful to focus on gender in the design and conceptualisation of developmental psychological research. Since methodological debates treated in the abstract tend to lack both the specificity and rigour that application to a particular context or topic imports, I take a particular focus for my discussion: child labour. In doing so I hope to highlight the analytical and practical gains of bringing gendered agendas alongside, and into, developmental research. While child labour may seem a rather curious topic for discussion of developmental psychological research practice, this article will show how it indicates with particular clarity issues that mainstream psychological research often occludes or forgets. In particular, I explore analytical and methodological benefits of exploring the diverse ways gender structures notions of childhood, alongside the developmental commonalities and asymmetries of gender and age as categories. I suggest that the usual assumed elision between women and children is often unhelpful for both women and children. Instead, an analytical attention to the shifting forms and relations of children's work facilitates more differentiated perspectives on how its meanings reflect economic and cultural (including gendered) conditions, and so attends better to social inequalities. These inequalities also structure the methodological conditions and paradigms for research with children, and so the article finishes by elaborating from this discussion of child labour four key principles for engendering psychological research with and about children, which also have broader implications for conceptualisations of the relations between gender, childhood, culture and families. URN: urn:nbn:de:0114-fqs060111

  3. New temperature model of the Netherlands from new data and novel modelling methodology

    Science.gov (United States)

    Bonté, Damien; Struijk, Maartje; Békési, Eszter; Cloetingh, Sierd; van Wees, Jan-Diederik

    2017-04-01

    Interest in deep geothermal energy has grown in Western Europe in recent decades, for direct use but also, as knowledge of the subsurface improves, for electricity generation. In the Netherlands, where the sector took off with the first system in 2005, geothermal energy is seen as a key player for a sustainable future. Knowledge of the subsurface temperature, together with the available flow from the reservoir, is an important factor that can determine the success of a geothermal energy project. To support the development of deep geothermal energy systems in the Netherlands, we made a first assessment of the subsurface temperature based on thermal data but also on geological elements (Bonté et al., 2012). An outcome of this work was ThermoGIS, which uses the temperature model. This work is a revision of the model used in ThermoGIS. The improvements over the first model are multiple: we have improved not only the dataset used for the calibration and the structural model, but also the methodology, through improved software (called b3t). The temperature dataset has been updated by integrating temperatures from newly accessible wells. The sedimentary description of the basin has been improved by using an updated and refined structural model and an improved lithological definition. A major improvement comes from the methodology used to perform the modelling: with b3t, the calibration is made not only using the lithospheric parameters but also using the thermal conductivity of the sediments. The result is a much more accurate definition of the parameters for the model and improved handling of the calibration process. The outcome is a precise and improved temperature model of the Netherlands. The thermal conductivity variation in the sediments, associated with the geometry of the layers, is an important factor in temperature variations, and the influence of the Zechstein salt in the north of the country is important. In addition, the radiogenic heat

  4. Advances in Artificial Neural Networks – Methodological Development and Application

    Directory of Open Access Journals (Sweden)

    Yanbo Huang

    2009-08-01

    Full Text Available Artificial neural networks, as a major soft-computing technology, have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred the development of training algorithms for other networks such as radial basis function, recurrent, feedback, and unsupervised Kohonen self-organizing networks. These networks, especially the multilayer perceptron network with a backpropagation training algorithm, have gained recognition in research and applications in various scientific and engineering areas. In order to accelerate the training process and overcome data over-fitting, research has been conducted to improve the backpropagation algorithm. Further, artificial neural networks have been integrated with other advanced methods such as fuzzy logic and wavelet analysis, to enhance the ability of data interpretation and modeling and to avoid subjectivity in the operation of the training algorithm. In recent years, support vector machines have emerged as a set of high-performance supervised generalized linear classifiers in parallel with artificial neural networks. A review of the development history of artificial neural networks is presented, and the standard architectures and algorithms of artificial neural networks are described. Furthermore, advanced artificial neural networks are introduced alongside support vector machines, and the limitations of ANNs are identified. The future of artificial neural network development, in tandem with support vector machines, is discussed in conjunction with further applications to food science and engineering, soil and water relationships for crop management, and decision support for precision agriculture. Along with the network structures and training algorithms, the applications of artificial neural networks are reviewed as well, especially in the fields of agricultural and biological
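
    As a minimal illustration of the network type and training algorithm at the center of this review, the following trains a one-hidden-layer perceptron with backpropagation on the XOR problem.

```python
# Minimal one-hidden-layer perceptron trained with backpropagation on XOR.

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=1.0, size=(2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(scale=1.0, size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(20000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass (gradients of mean squared error)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out.ravel(), 2))  # approaches [0, 1, 1, 0]
```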

  5. Development of a flow structure interaction methodology applicable to a convertible car roof

    CERN Document Server

    Knight, J J

    2003-01-01

    The current research investigates the flow-induced deformation of a convertible roof of a vehicle using experimental and numerical methods. A computational methodology is developed that entails the coupling of a commercial Computational Fluid Dynamics (CFD) code with an in-house structural code. A model two-dimensional problem is first studied. The CFD code and a Source Panel Method (SPM) code are used to predict the pressure acting on the surface of a rigid roof of a scale model. Good agreement is found between predicted pressure distribution and that obtained in a parallel wind-tunnel experimental programme. The validated computational modelling of the fluid flow is then used in a coupling strategy with a line-element structural model that incorporates initial slackness of the flexible roof material. The computed flow-structure interaction yields stable solutions, the aerodynamically loaded flexible roof settling into static equilibrium. The effects of slackness and material properties on deformation and co...

  6. An Intelligent Response Surface Methodology for Modeling of Domain Level Constraints

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An effective method for modeling domain-level constraints in the constraint network for concurrent engineering (CE) was developed. The domain-level constraints were analyzed, and a framework for modeling them based on simulation and approximation technology was given. An intelligent response surface methodology (IRSM) was proposed, in which artificial intelligence technologies are introduced into the optimization process. The design of the crank and connecting rod of a V6 engine was given as an example to show the validity of the modeling method.
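
    The core mechanic of any response surface method, intelligent or not, is fitting a cheap polynomial surrogate to expensive simulation samples. A minimal sketch, with invented sample data:

```python
# Fit a second-order polynomial
#   y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
# to simulation samples, then use the cheap surrogate inside optimization.
# Sample data are invented for illustration.

import numpy as np

x1 = np.array([-1.0, -1.0, 0.0, 0.0, 1.0, 1.0, -1.0, 1.0, 0.0])
x2 = np.array([-1.0, 1.0, -1.0, 1.0, -1.0, 1.0, 0.0, 0.0, 0.0])
y = 5.0 + 2.0 * x1 - x2 + 1.5 * x1**2 + 0.5 * x2**2 + x1 * x2  # "simulation" output

A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surrogate(p1: float, p2: float) -> float:
    return float(np.array([1.0, p1, p2, p1**2, p2**2, p1 * p2]) @ coef)

print(surrogate(0.5, -0.5))  # cheap evaluation replacing the simulation
```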

  7. A holistic methodology for modeling consumer response to innovation.

    Science.gov (United States)

    Bagozzi, R P

    1983-01-01

    A general structural equation model for representing consumer response to innovation is derived and illustrated. The approach both complements and extends an earlier model proposed by Hauser and Urban. Among other benefits, the model is able to take measurement error into account explicitly, to estimate the intercorrelation among exogenous factors if these exist, to yield a unique solution in a statistical sense, and to test complex hypotheses (e.g., systems of relations, simultaneity, feedback) associated with the measurement of consumer responses and their impact on actual choice behavior. In addition, the procedures permit one to model environmental and managerially controllable stimuli as they constrain and influence consumer choice. Limitations of the procedures are discussed and related to existing approaches. Included in the discussion is a development of four generic response models designed to provide a framework for modeling how consumers behave and how managers might better approach the design of products, persuasive appeals, and other controllable factors in the marketing mix.

  8. Develop and demonstrate a methodology using Janus(A) to analyze advanced technologies.

    OpenAIRE

    Wright, Jerry Vernon

    1991-01-01

    Approved for public release; distribution is unlimited. This thesis presents a study of a methodology for analyzing advanced technologies using the Janus(A) High Resolution Combat Model. The goal of this research was to verify that the methodology using Janus(A) gave expected or realistic results. The methodology used a case where the results were known: the addition of a long-range direct-fire weapon into a force-on-force battle. Both the weapon characteristics and force mixes were used as...

  9. Synthesis of semantic modelling and risk analysis methodology applied to animal welfare.

    Science.gov (United States)

    Bracke, M B M; Edwards, S A; Metz, J H M; Noordhuizen, J P T M; Algers, B

    2008-07-01

    Decision-making on animal welfare issues requires a synthesis of information. For the assessment of farm animal welfare based on scientific information collected in a database, a methodology called 'semantic modelling' has been developed. To date, however, this methodology has not been generally applied. Recently, a qualitative Risk Assessment approach has been published by the European Food Safety Authority (EFSA) for the first time, concerning the welfare of intensively reared calves. This paper reports on a critical analysis of this Risk Assessment (RA) approach from a semantic-modelling (SM) perspective, emphasizing the importance of several seemingly self-evident principles, including the definition of concepts, application of explicit methodological procedures and specification of how underlying values and scientific information lead to the RA output. In addition, the need to include positive aspects of welfare and overall welfare assessments are emphasized. The analysis shows that the RA approach for animal welfare could benefit from SM methodology to support transparent and science-based decision-making.

  10. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    Science.gov (United States)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
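
    A toy version of the SIMRAND idea fits in a few lines: sample task-level random variables along each alternative path, apply a cardinal utility function, and rank alternatives by expected utility. The distributions and utility function below are illustrative stand-ins for expert-assessed inputs.

```python
# Toy Monte Carlo comparison of alternative paths: each alternative is a
# chain of tasks whose costs are random variables; expected utility over
# the sampled outcomes ranks the alternatives.

import random

random.seed(1)

def utility(total_cost: float) -> float:
    """Risk-averse cardinal utility: lower cost is better."""
    return -total_cost ** 1.2

alternatives = {  # task cost distributions: (low, mode, high) for triangular
    "path_A": [(8, 10, 15), (4, 6, 9)],
    "path_B": [(5, 12, 20), (3, 5, 7)],
}

def expected_utility(tasks, n_samples=50_000) -> float:
    total = 0.0
    for _ in range(n_samples):
        cost = sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        total += utility(cost)
    return total / n_samples

for name, tasks in alternatives.items():
    print(name, round(expected_utility(tasks), 2))
```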

  11. Methodologies for modelling energy and amino acid responses in poultry

    Directory of Open Access Journals (Sweden)

    Robert Mervyn Gous

    2007-07-01

    Full Text Available The objective of this paper is to present some of the issues faced by those whose interest is to predict responses in poultry, concentrating mainly on those related to the prediction of voluntary food intake, as this should be the basis of models designed to optimise both performance and feeding programmes. The value of models designed to predict growth or reproductive performance has been improved inestimably by making food intake an output from, as opposed to an input to, such models. Predicting voluntary food intake requires the potential of the bird to be known, be this the growth of body protein or lipid, the growth of feather protein, or the rate at which yolk and albumen may be deposited daily in the form of an egg, and some of the issues relating to the description of potentials are discussed. This potential defines the nutrients that would be required by the bird on the day, which can be converted to a desired food intake by dividing each requirement by the content of that nutrient in the feed. There will be occasions when the bird will be unable to consume what is required, and predicting the magnitude of these constraints on intake and performance provides the greatest challenge for modellers. This paper concentrates on some issues raised in defining the nutrient requirements of an individual; on constraints on voluntary food intake such as high temperatures and the social and infectious environment; on some recent differences in the response to dietary protein that have been observed between the major broiler strains; on the methodologies used to deal with populations of birds; and finally on broiler breeder hens, whose food intake is constrained by management, not by the environment. These issues suggest that there are still challenges ahead for those wishing to predict responses to nutrients in poultry. It is imperative, however, that the methods used to measure the numbers that make theories work, and that the
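
    The requirement-to-intake conversion described above can be sketched directly; the nutrient names and values below are hypothetical, and the limiting-nutrient rule (taking the maximum across nutrients) is an assumption about how the individual conversions are combined.

```python
# Sketch of the intake rule: desired intake for each nutrient is the daily
# requirement divided by the dietary content; the most demanding nutrient
# is assumed to set overall intake. Values are hypothetical.

requirements = {"lysine_g": 1.1, "energy_mj": 1.45}          # per bird per day
feed_content = {"lysine_g": 0.012, "energy_mj": 0.0125}      # per g of feed

# Intake needed to satisfy each nutrient, in g feed/day
needed = {n: requirements[n] / feed_content[n] for n in requirements}
desired_intake = max(needed.values())   # the limiting nutrient sets intake

print(needed)                 # lysine -> ~91.7 g/d, energy -> 116 g/d
print(round(desired_intake))  # constrained later by environment, gut capacity
```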

  12. Methodological improvements of geoid modelling for the Austrian geoid computation

    Science.gov (United States)

    Kühtreiber, Norbert; Pail, Roland; Wiesenhofer, Bernadette; Pock, Christian; Wirnsberger, Harald; Hofmann-Wellenhof, Bernhard; Ullrich, Christian; Höggerl, Norbert; Ruess, Diethard; Imrek, Erich

    2010-05-01

    The geoid computation method of Least Squares Collocation (LSC) is usually applied in connection with the remove-restore technique. The basic idea is to remove, before applying LSC, not only the long-wavelength gravity field effect represented by the global gravity field model, but also the high-frequency signals, which are mainly related to topography, by applying a topographic-isostatic reduction. In the current Austrian geoid solution, an Airy-Heiskanen model with a standard density of 2670 kg/m3 was used. A close investigation of the absolute error structure of this solution reveals some correlations with topography, which may be explained by these simplified assumptions. One parameter of the remove-restore process to be investigated in this work is the depth of the reference surface of isostatic compensation, the Mohorovicic discontinuity (Moho). The recently compiled European plate Moho depth model, which is based on 3D-seismic tomography and other geophysical measurements, is used instead of the reference surface derived from the Airy-Heiskanen isostatic model. Additionally, the standard density of 2670 kg/m3 is replaced by a laterally variable (surface) density model. The impact of these two significant modifications of the geophysical conception of the remove-restore procedure on the Austrian geoid solution is investigated and analyzed in detail. In the current Austrian geoid solution the above-described remove-restore concept was used in a first step to derive a pure gravimetric geoid and to predict the geoid height at 161 GPS/levelling points. The difference between measured and predicted geoid heights shows a long-wavelength structure. These systematic distortions are commonly attributed to inconsistencies in the datum, distortions of the orthometric height system, and systematic GPS errors. In order to cope with this systematic term, a polynomial of degree 3 was fitted to the difference of predicted geoid heights and GPS
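
    The final corrector-surface step can be illustrated with a least-squares fit of a bivariate cubic to the GPS/levelling-minus-gravimetric differences. The exact polynomial form used in the Austrian solution is not specified here, so the sketch below, with invented data, is an assumption.

```python
# Illustrative corrector surface: fit a bivariate degree-3 polynomial by
# least squares to the differences between GPS/levelling-derived and
# gravimetric geoid heights. All data are invented for the example.

import numpy as np

rng = np.random.default_rng(42)
lon, lat = rng.uniform(9.5, 17.2, 161), rng.uniform(46.4, 49.0, 161)  # Austria-ish
diff = 0.02 * (lon - 13) - 0.015 * (lat - 47.5) + rng.normal(0, 0.005, 161)

# Design matrix with all monomials x^i * y^j, i + j <= 3
x, y = lon - lon.mean(), lat - lat.mean()
cols = [x**i * y**j for i in range(4) for j in range(4 - i)]
A = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(A, diff, rcond=None)

residual_rms = np.sqrt(np.mean((A @ coef - diff) ** 2))
print(f"RMS after corrector surface: {residual_rms * 100:.2f} cm")
```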

  13. Simulation methodology development for rotating blade containment analysis

    Institute of Scientific and Technical Information of China (English)

    Qing HE; Hai-jun XUAN; Lian-fang LIAO; Wei-rong HONG; Rong-ren WU

    2012-01-01

    An experimental and numerical investigation of aeroengine blade/case containment is presented. Blade-out containment capability analysis is an essential step in new aeroengine design, but containment tests are time-consuming and incur significant costs; thus, developing a short-period and low-cost numerical method is warranted. Using explicit nonlinear dynamic finite element analysis software, the present study numerically investigated the high-speed impact process for simulated blade containment tests which were carried out on a high-speed spin testing facility. A number of simulations were conducted using finite element models with different mesh sizes and different values of both the contact penalty factor and the friction coefficient. Detailed comparisons between the experimental and numerical results reveal that the mesh size and the friction coefficient have a considerable impact on the results produced. It is shown that a finer mesh predicts a lower containment capability of the case, which is closer to the test data. A larger value of the friction coefficient also predicts a lower containment capability. However, the contact penalty factor has little effect on the simulation results if it is large enough to avoid false penetration.

  14. A methodology for developing anisotropic AAA phantoms via additive manufacturing.

    Science.gov (United States)

    Ruiz de Galarreta, Sergio; Antón, Raúl; Cazón, Aitor; Finol, Ender A

    2017-05-24

    An Abdominal Aortic Aneurysm (AAA) is a permanent focal dilatation of the abdominal aorta at least 1.5 times its normal diameter. The criterion of maximum diameter is still used in clinical practice, although numerical studies have demonstrated the importance of biomechanical factors for rupture risk assessment. AAA phantoms could be used for experimental validation of the numerical studies and for pre-intervention testing of endovascular grafts. We have applied multi-material 3D printing technology to manufacture idealized AAA phantoms with anisotropic mechanical behavior. Different composites were fabricated and the phantom specimens were characterized by biaxial tensile tests while using a constitutive model to fit the experimental data. One composite was chosen to manufacture the phantom based on having the same mechanical properties as those reported in the literature for human AAA tissue; the strain energy and anisotropic index were compared to make this choice. The materials for the matrix and fibers of the selected composite are, respectively, the digital materials FLX9940 and FLX9960 developed by Stratasys. The fiber proportion for the composite is equal to 0.15. The differences between the composite behavior and the AAA tissue are small, with a small difference in the strain energy (0.4%) and a maximum difference of 12.4% in the peak Green strain ratio. This work represents a step forward in the application of 3D printing technology for the manufacturing of AAA phantoms with anisotropic mechanical behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. DATA MINING METHODOLOGY FOR DETERMINING THE OPTIMAL MODEL OF COST PREDICTION IN SHIP INTERIM PRODUCT ASSEMBLY

    Directory of Open Access Journals (Sweden)

    Damir Kolich

    2016-03-01

    Full Text Available In order to accurately predict the costs of the thousands of interim products that are assembled in shipyards, it is necessary to use skilled engineers to develop detailed Gantt charts for each interim product separately, which takes many hours. It is helpful to develop a prediction tool that estimates the cost of interim products accurately and quickly without the need for skilled engineers. This will drive down shipyard costs and improve competitiveness. Data mining is used extensively for developing prediction models in other industries. Since ships consist of thousands of interim products, it is logical to develop a data mining methodology for a shipyard or any other manufacturing industry where interim products are produced. The methodology involves analysis of existing interim products and data collection. Pre-processing and principal component analysis are done to make the data “user-friendly” for later prediction processing and the development of both accurate and robust models. The support vector machine is demonstrated to be the better model when the number of tuples is low. However, as the number of tuples increases above 10,000, the artificial neural network model is recommended.
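
    The reported model-selection logic, PCA pre-processing followed by a support vector machine for smaller datasets and a neural network for larger ones, can be sketched with scikit-learn; the data below are synthetic placeholders.

```python
# Sketch of the model-selection logic: PCA pre-processing, then an SVM for
# small datasets and a neural network for large ones. Data are synthetic.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(800, 12))             # interim-product features (synthetic)
y = X[:, 0] * 3 + X[:, 1] ** 2 + rng.normal(scale=0.1, size=800)  # cost proxy

n_tuples = X.shape[0]
regressor = SVR(C=10.0) if n_tuples <= 10_000 else MLPRegressor(
    hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)

model = make_pipeline(StandardScaler(), PCA(n_components=0.95), regressor)
model.fit(X, y)
print(model.score(X, y))   # R^2 on the training data, for illustration only
```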

  16. Modeling the Capacity and Emissions Impacts of Reduced Electricity Demand. Part 1. Methodology and Preliminary Results

    Energy Technology Data Exchange (ETDEWEB)

    Coughlin, Katie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Shen, Hongxia [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Chan, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; McDevitt, Brian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Sturges, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division

    2013-02-07

    Policies aimed at energy conservation and efficiency have broad environmental and economic impacts. Even if these impacts are relatively small, they may be significant compared to the cost of implementing the policy. Methodologies that quantify the marginal impacts of reduced demand for energy have an important role to play in developing accurate measures of both the benefits and costs of a given policy choice. This report presents a methodology for estimating the impacts of reduced demand for electricity on the electric power sector as a whole. The approach uses the National Energy Modeling System (NEMS), a mid-range energy forecast model developed and maintained by the U.S. Department of Energy, Energy Information Administration (EIA) (DOE EIA 2013). The report is organized as follows: in the rest of this section the traditional NEMS-BT approach is reviewed and an outline of the new reduced-form NEMS methodology is presented. Section 2 provides an overview of how the NEMS model works and describes the set of NEMS-BT runs that are used as input to the reduced-form approach. Section 3 presents our NEMS-BT simulation results and post-processing methods. In Section 4 we show how the NEMS-BT output can be generalized to apply to a broader set of end-uses. In Section 5 we discuss the application of this approach to policy analysis, and summarize some of the issues that will be further investigated in Part 2 of this study.

  17. A Methodology to Develop Design Support Tools for Stand-alone Photovoltaic Systems in Developing Countries

    Directory of Open Access Journals (Sweden)

    Stefano Mandelli

    2014-08-01

    Full Text Available As pointed out in several analyses, Stand-Alone Photovoltaic systems may be a relevant option for rural electrification in Developing Countries. In this context, Micro and Small Enterprises which supply customized Stand-Alone Photovoltaic systems play a pivotal role in the last-mile distribution of this technology. Nevertheless, a number of issues limit the development of these enterprises, also curbing potential spin-off benefits. A common business bottleneck is the lack of technical skills, since usually few people have the expertise to design systems and formulate estimates for customers. The long-term solution to this issue implies a capacity-building process, but this solution rarely matches the time-to-market urgency of local enterprises. Therefore, we propose in this study a simple but general methodology which can be used to set up Design Support Tools for Micro and Small Enterprises that supply Stand-Alone Photovoltaic systems in rural areas of Developing Countries. After a brief review of the techniques and commercial software available to design the targeted technology, we describe the methodology, highlighting the structure, the sizing equations, and the main features that should be considered in developing a Design Support Tool. Then, we apply the methodology to set up a tool for use in Uganda and compare the results with two commercial codes (NSolVx and HOMER). The results show that the implemented Design Support Tool develops correct system designs and presents some advantages for dissemination in rural areas. Indeed, it supports the user in providing the input data, selecting the main system components, and delivering estimates to customers.
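
    The sizing equations inside such a tool typically reduce to a handful of first-order formulas. The sketch below uses standard textbook sizing rules, assumed here rather than taken from the paper, with illustrative input values.

```python
# Minimal stand-alone PV sizing sketch of the kind such a Design Support
# Tool automates. First-order textbook formulas; all inputs illustrative.

daily_load_wh = 800.0          # lights, phone charging, radio
peak_sun_hours = 5.0           # site-dependent solar resource
system_derating = 0.75         # wiring, dust, temperature, inverter losses
autonomy_days = 2.0            # days the battery must cover without sun
depth_of_discharge = 0.5       # usable fraction of battery capacity
battery_voltage = 12.0

pv_watts_peak = daily_load_wh / (peak_sun_hours * system_derating)
battery_wh = daily_load_wh * autonomy_days / depth_of_discharge
battery_ah = battery_wh / battery_voltage

print(f"PV array:  {pv_watts_peak:6.0f} Wp")
print(f"Battery:   {battery_ah:6.0f} Ah at {battery_voltage:.0f} V")
```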

  18. A fault diagnosis methodology for rolling element bearings based on advanced signal pretreatment and autoregressive modelling

    Science.gov (United States)

    Al-Bugharbee, Hussein; Trendafilova, Irina

    2016-05-01

    This study proposes a methodology for rolling element bearing fault diagnosis which gives a complete and highly accurate identification of the faults present. It has two main stages: signal pretreatment, which is based on several signal analysis procedures, and diagnosis, which uses a pattern-recognition process. The first stage is principally based on linear time-invariant autoregressive modelling. One of the main contributions of this investigation is the development of a pretreatment signal analysis procedure which subjects the signal to noise cleaning by singular spectrum analysis and then stationarisation by differencing. The signal is thus transformed to bring it close to a stationary one, rather than complicating the model to bring it closer to the signal. This type of pretreatment allows the use of a linear time-invariant autoregressive model and improves its performance when the original signals are non-stationary. This contribution is at the heart of the proposed method, and the high accuracy of the diagnosis is a result of this procedure. The methodology emphasises the importance of preliminary noise cleaning and stationarisation, and it demonstrates that the information needed for fault identification is contained in the stationary part of the measured signal. The methodology is further validated using three different experimental setups, demonstrating very high accuracy for all of the applications. It is able to correctly classify nearly 100 percent of the faults with regard to their type and size. This high accuracy is the other important contribution of this methodology. Thus, this research suggests a highly accurate methodology for rolling element bearing fault diagnosis which is based on relatively simple procedures. This is also an advantage, as the simplicity of the individual processes ensures easy application and the possibility of automating the entire process.
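
    The pretreatment-plus-AR-features pipeline can be sketched as follows: difference the signal towards stationarity, fit an autoregressive model by least squares, and use the AR coefficients as the feature vector for the pattern-recognition stage. The singular spectrum analysis step is omitted here and the signal is synthetic.

```python
# Differencing followed by an AR fit; the AR coefficients become features
# for a downstream classifier. Signal is synthetic; SSA cleaning omitted.

import numpy as np

rng = np.random.default_rng(3)
t = np.arange(4096) / 12_000.0                     # 12 kHz sampling, synthetic
signal = np.sin(2 * np.pi * 157 * t) + 0.3 * rng.normal(size=t.size)

x = np.diff(signal)                                 # differencing

def ar_coefficients(x: np.ndarray, order: int) -> np.ndarray:
    """Least-squares fit of an AR(order) model x[n] = sum a_k x[n-k] + e[n]."""
    A = np.column_stack([x[order - k - 1 : len(x) - k - 1] for k in range(order)])
    b = x[order:]
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coef

features = ar_coefficients(x, order=8)   # feed these to a classifier
print(np.round(features, 3))
```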

  19. Methodology for Constructing Reduced-Order Power Block Performance Models for CSP Applications: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, M.

    2010-10-01

    The inherent variability of the solar resource presents a unique challenge for CSP systems. Incident solar irradiation can fluctuate widely over a short time scale, but plant performance must be assessed for long time periods. As a result, annual simulations with hourly (or sub-hourly) timesteps are the norm in CSP analysis. A highly detailed power cycle model provides accuracy but tends to suffer from prohibitively long run-times; alternatively, simplified empirical models can run quickly but don't always provide enough information, accuracy, or flexibility for the modeler. The ideal model for feasibility-level analysis incorporates both the detail and accuracy of a first-principle model with the low computational load of a regression model. The work presented in this paper proposes a methodology for organizing and extracting information from the performance output of a detailed model, then using it to develop a flexible reduced-order regression model in a systematic and structured way. A similar but less generalized approach for characterizing power cycle performance and a reduced-order modeling methodology for CFD analysis of heat transfer from electronic devices have been presented. This paper builds on these publications and the non-dimensional approach originally described.

  20. Contributions to the Development of a General Methodology for Innovation and Forecasting

    OpenAIRE

    Ionescu, Gabriela; Ion IONIŢĂ

    2011-01-01

    The paper presents the authors’ contributions to the achievement of a first variant of the innovation and forecasting methodology. The various tools of the TRIZ methodology (the laws of systems development set for technical systems, the matrix of contradictions, the 40 inventive principles, the 39 parameters, Su-Field analysis, the method of the 9 screens, etc.) are already available, or can be customised to the specific type of organization system. The TRIZ methodology for economics was embedded in a ...

  2. Development of six sigma concurrent parameter and tolerance design method based on response surface methodology

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Using Response Surface Methodology (RSM), an optimizing model of concurrent parameter and tolerance design is proposed, in which the response mean equals its target in the target-is-best case. The objective function of the model is the sum of quality loss and tolerance cost, subject to a variance confidence region within which six sigma capability can be assured. An example is presented to compare the developed model with minimum-variance parameter design. The results show that the proposed method not only achieves robustness, but also greatly reduces cost. The objectives of high quality and low cost of product and process can be achieved simultaneously by the application of six sigma concurrent parameter and tolerance design.
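
    A toy formulation shows the shape of the optimization: minimize quality loss plus tolerance cost over the design mean and tolerance, subject to a six sigma capability constraint. The loss and cost models below are illustrative assumptions, not the paper's.

```python
# Toy concurrent parameter/tolerance design: choose (mean, tolerance) to
# minimize Taguchi quality loss plus tolerance cost, subject to keeping
# 6 sigma inside the specification. Loss and cost models are illustrative.

from scipy.optimize import minimize

TARGET = 10.0
K_LOSS = 2.0           # quality-loss constant: L = k * (sigma^2 + bias^2)
SPEC_HALF_WIDTH = 0.6  # +/- specification limit around the target

def total_cost(v):
    mean, tol = v
    sigma = tol / 3.0                      # tolerance interpreted as 3-sigma
    quality_loss = K_LOSS * (sigma**2 + (mean - TARGET) ** 2)
    tolerance_cost = 1.0 / tol             # tighter tolerance costs more
    return quality_loss + tolerance_cost

# Six sigma capability: 6 * sigma = 2 * tol must fit inside the spec width
constraints = [{"type": "ineq", "fun": lambda v: SPEC_HALF_WIDTH - 2.0 * v[1]}]
result = minimize(total_cost, x0=[9.5, 0.3], bounds=[(8, 12), (0.01, 1.0)],
                  constraints=constraints)
print(result.x)   # optimal (mean, tolerance)
```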

  3. Development of the Methodology for the Economic Evaluation of Managerial Decisions as a Factor of Economic Security Increase

    Directory of Open Access Journals (Sweden)

    Olga Aleksandrovna Romanova

    2016-09-01

    Full Text Available In the article, it is noted that the emergence of interdependence between security and development, the so-called security-development nexus, has become a determining factor in the drafting of strategic documents at all hierarchical levels. This makes it relevant to search for methodological solutions that allow possible threats to economic security to be considered at the strategic level, and pragmatic actions that do not contradict the strategic vector of economic entities' development at the tactical level. Instability factors which threaten economic security are revealed. A rationale is given for a new model of national economic development whose central element is new industrialization. The most important trends in the development of the world economy influencing the strategic vector of increasing Russian economic security are considered. It is shown that, under new industrialization, the intellectual core of the high-technology sector of the economy is formed by convergent technologies (NBICS technologies). A methodological approach to the economic evaluation of management decisions under uncertainty is offered. The methodological principles that must be taken into account in developing a modern methodology for the economic evaluation of such decisions are identified. Among them are the development of a preferred reality, the so-called «vision of the future»; the priority of network solutions as the basis for the development of new markets; the mass customization and individualization of requirements; basic changes in the profile of competences which provide competitiveness in the labour market; and the use of the ideology of inclusive development and transformative investment creating shared values. The offered methodology is based on an optimum combination of the traditional methods of the economic evaluation of managerial

  4. Development of a standardised methodology for event impact ...

    African Journals Online (AJOL)

    ... Government (WCG) developed an Integrated Events Strategy for Cape Town and ... supporting events to maximise brand building potential and triple bottom line benefits. ... The WCG thus undertook research to develop a standardised set of ...

  5. WRF Model Methodology for Offshore Wind Energy Applications

    Directory of Open Access Journals (Sweden)

    Evangelia-Maria Giannakopoulou

    2014-01-01

    Full Text Available Among the parameters that must be considered for offshore wind farm development, the stability conditions of the marine atmospheric boundary layer (MABL) are of significant importance. Atmospheric stability is a vital parameter in wind resource assessment (WRA) due to its direct relation to wind and turbulence profiles. A better understanding of the stability conditions occurring offshore and of the interaction between the MABL and wind turbines is needed. Accurate simulations of the offshore wind and stability conditions using mesoscale modelling techniques can lead to a more precise WRA. However, the use of any mesoscale model for wind energy applications requires a proper validation process to understand the accuracy and limitations of the model. For this validation process, the Weather Research and Forecasting (WRF) model has been applied over the North Sea during March 2005. The sensitivity of the WRF model performance to the use of different horizontal resolutions, input datasets, PBL parameterisations, and nesting options was examined. Comparison of the model results with other modelling studies and with high-quality observations recorded at the offshore measurement platform FINO1 showed that the ERA-Interim reanalysis data in combination with the 2.5-level MYNN PBL scheme satisfactorily simulate the MABL over the North Sea.

  6. Methodology for Analyzing and Developing Information Management Infrastructure to Support Telerehabilitation

    Directory of Open Access Journals (Sweden)

    Andi Saptono

    2009-09-01

    Full Text Available The proliferation of advanced technologies led researchers within the Rehabilitation Engineering Research Center on Telerehabilitation (RERC-TR) to devise an integrated infrastructure for clinical services using the University of Pittsburgh (PITT) model. This model describes five required characteristics for a telerehabilitation (TR) infrastructure: openness, extensibility, scalability, cost-effectiveness, and security. The infrastructure is to deliver clinical services over distance to improve access to health services for people living in underserved or remote areas. The methodological approach to design, develop, and employ this infrastructure is explained and detailed for the remote wheelchair prescription project, a research task within the RERC-TR. The availability of this specific clinical service and personnel outside of metropolitan areas is limited due to the lack of specialty expertise and access to resources. The infrastructure is used to deliver expertise in wheeled mobility and seating through teleconsultation to remote clinics, and has been successfully deployed to five rural clinics in Western Pennsylvania. Keywords: Telerehabilitation, Information Management, Infrastructure Development Methodology, Videoconferencing, Online Portal, Database

  7. Kokum Fruit Bar Development via Response Surface Methodology (RSM.

    Directory of Open Access Journals (Sweden)

    Pritam Bafna

    2014-02-01

    Full Text Available A response surface methodology (RSM) was used to determine the optimum ingredient levels for preparing a kokum fruit bar. Kokum pulp was extracted using a water extraction method at a temperature of 24.97 °C for 30.42 min. The effects of ingredient levels on sensory parameters such as texture, overall acceptability and calcium content of the prepared fruit bar were studied by employing a Box-Behnken design (BBD). The pulp quantity was kept fixed during the experimental trials. The coefficients of determination R² for texture, overall acceptability and calcium content were 0.8298, 0.9239 and 0.9842, respectively. Analysis of variance (ANOVA) performed on the experimental values showed that sugar and milk powder were the most important factors affecting the characteristics of the kokum fruit bar, exerting a highly significant influence (p < 0.05) on all the dependent variables. Based on surface and contour plots, the optimum ingredient levels for the kokum fruit bar were pulp 50 g, sugar 40 g and milk powder 9.39 g.
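
    As an illustration of the response-surface step, the sketch below fits a second-order polynomial to two factors by ordinary least squares, as is done with Box-Behnken data. The variable names and data values are hypothetical, not the study's.

    ```python
    # Hedged sketch of response-surface fitting: a quadratic model estimated
    # by ordinary least squares. Factors (sugar, milk powder) and responses
    # are invented for illustration.
    import numpy as np

    # x1 = sugar (g), x2 = milk powder (g); y = overall acceptability score
    x1 = np.array([30., 40., 50., 40., 30., 50., 40., 40., 30.])
    x2 = np.array([ 6.,  8., 10., 10.,  8.,  8.,  6.,  8., 10.])
    y  = np.array([6.1, 7.8, 7.0, 7.2, 6.5, 7.4, 6.8, 7.9, 6.9])

    # Design matrix for y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    y_hat = X @ beta
    r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
    print("coefficients:", np.round(beta, 4), "R^2 =", round(r2, 3))
    ```

    The R² reported this way is what the abstract's 0.83 to 0.98 values refer to; contour plots of the fitted surface then locate the optimum ingredient levels.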

  8. Concepts and methodologies for modeling and simulation a tribute to Tuncer Oren

    CERN Document Server

    Yilmaz, Levent

    2015-01-01

    This comprehensive text/reference presents cutting-edge advances in the theory and methodology of modeling and simulation (M&S), and reveals how this work has been influenced by the fundamental contributions of Professor Tuncer Ören to this field. Exploring the synergies among the domains of M&S and systems engineering (SE), the book describes how M&S and SE can help to address the complex problems identified as "Grand Challenges" more effectively under a model-driven and simulation-directed systems engineering framework. Topics and features: examines frameworks for the development of advanced…

  9. Avoiding and identifying errors in health technology assessment models: qualitative study and methodological review.

    Science.gov (United States)

    Chilcott, J; Tappenden, P; Rawdin, A; Johnson, M; Kaltenthaler, E; Paisley, S; Papaioannou, D; Shippam, A

    2010-05-01

    Health policy decisions must be relevant, evidence-based and transparent. Decision-analytic modelling supports this process but its role is reliant on its credibility. Errors in mathematical decision models or simulation exercises are unavoidable, but little attention has been paid to processes in model development. Numerous error avoidance/identification strategies could be adopted, but it is difficult to evaluate the merits of strategies for improving the credibility of models without first developing an understanding of error types and causes. The study aims to describe the current comprehension of errors in the health technology assessment (HTA) modelling community and to generate a taxonomy of model errors. Four primary objectives are to: (1) describe the current understanding of errors in HTA modelling; (2) understand current processes applied by the technology assessment community for avoiding errors in development, debugging and critically appraising models for errors; (3) use HTA modellers' perceptions of model errors together with the wider non-HTA literature to develop a taxonomy of model errors; and (4) explore potential methods and procedures to reduce the occurrence of errors in models. It also describes the model development process as perceived by practitioners working within the HTA community. A methodological review was undertaken using an iterative search methodology. Exploratory searches informed the scope of interviews; later searches focused on issues arising from the interviews. Searches were undertaken in February 2008 and January 2009. In-depth qualitative interviews were performed with 12 HTA modellers from academic and commercial modelling sectors. All qualitative data were analysed using the Framework approach. Descriptive and explanatory accounts were used to interrogate the data within and across themes and subthemes: organisation, roles and communication; the model development process; definition of error; types of model error; and strategies for avoiding errors.

  10. The Development Methodology of the UML Electronic Guide

    Directory of Open Access Journals (Sweden)

    N.A. Magariu

    2006-09-01

    Full Text Available A technological model for the realization of an electronic guide to the UML language is considered. This model includes a description of the peculiarities of using a special graphic editor for constructing UML diagrams, XML vocabularies (XMI, DocBook, SVG, XSLT) for representing the text and diagrams, and JavaScript code for constructing the tests.

  11. Developing an Interactive Augmented Prototyping Methodology to Support Design Reviews

    NARCIS (Netherlands)

    Verlinden, J.C.

    2014-01-01

    Physical prototypes and scale models play an important role in engineering design processes, especially in the field of industrial design. Such models are typically used to explore and discuss design concepts in various stages, from initial idea generation to manufacturing.

  13. On the development of a strength prediction methodology for fibre metal laminates in pin bearing

    Science.gov (United States)

    Krimbalis, Peter Panagiotis

    The development of Fibre Metal Laminates (FMLs) for application in aerospace structures represents a paradigm shift in airframe and material technology. By consolidating monolithic metallic alloys and fibre reinforced composite layers, a new material structure is born, exhibiting desired qualities emerging from its heterogeneous constituency. When mechanically fastened via pins, bolts and rivets, these laminated materials develop damage and ultimately fail via mechanisms that were not entirely understood and that differ from those of either their metallic or composite constituents. The development of a predictive methodology capable of characterizing how FMLs fastened with pins behave and fail would drastically reduce the amount of experimentation necessary for material qualification and would be an invaluable design tool. The body of this thesis discusses the extension of the characteristic dimension approach to FMLs and the subsequent development of a new failure mechanism as part of a progressive damage finite element (FE) modeling methodology, with yielding, delamination and buckling representing the central tenets of the new mechanism. This yielding through delamination buckling (YDB) mechanism and progressive FE model were investigated through multiple experimental studies. The experimental investigations required the development of a protocol with emphasis on measuring deformation at a local scale in addition to a global one. With the extended protocol employed, complete characterization of the material response was possible, and a new definition for yield in a pin bearing configuration was developed and subsequently extended to a tensile testing configuration. The performance of this yield definition was compared directly to existing definitions and was shown to be effective in both quasi-isotropic and orthotropic materials. The results of the experiments and FE simulations demonstrated that yielding (according to the new definition), buckling and delamination…

  14. Contributions to the Development of a General Methodology for Innovation and Forecasting

    Directory of Open Access Journals (Sweden)

    Gabriela IONESCU

    2011-12-01

    Full Text Available The paper presents the authors' contributions to a first variant of an innovation and forecasting methodology. The various tools of the TRIZ methodology (the laws of system development formulated for technical systems, the matrix of contradictions, the 40 inventive principles, the 39 parameters, Su-Field analysis, the method of the 9 screens, etc.) are already available, or can be customised to the specific type of organizational system. The TRIZ methodology for economics was embedded in a more general methodology for innovation and forecasting. The eight laws of system evolution were customised to economics. The authors also make a comparative analysis of the technical TRIZ matrix against the company management matrix. Based on the analysis performed, it can be concluded that a general methodology for innovation and forecasting can be prepared, making use of the TRIZ methodology, by customising some classical instruments of the technical field and bringing in other specific economic tools.

  15. A methodology for including wall roughness effects in k-ε low-Reynolds turbulence models

    Energy Technology Data Exchange (ETDEWEB)

    Ambrosini, W., E-mail: walter.ambrosini@ing.unipi.it; Pucciarelli, A.; Borroni, I.

    2015-05-15

    Highlights: • A model for taking into account wall roughness in low-Reynolds k-ε models is presented. • The model is subjected to a first validation to show its potential in general applications. • The application of the model in predicting heat transfer to supercritical fluids is also discussed. - Abstract: A model accounting for wall roughness effects in k-ε low-Reynolds turbulence models is described in the present paper. In particular, the introduction in the transport equations of k and ε of additional source terms related to roughness, based on simple assumptions and dimensional relationships, is proposed. An objective of the present paper, in addition to obtaining more realistic predictions of wall friction, is the application of the proposed model to the study of heat transfer to supercritical fluids. A first validation of the model is reported. The model shows the capability of predicting, at least qualitatively, some of the most important trends observed when dealing with rough pipes in very different flow conditions. Qualitative comparisons with some DNS data available in the literature are also performed. Further analyses provided promising results concerning the ability of the model to reproduce the trend of the friction factor when varying the flow conditions, though improvements are necessary for achieving better quantitative accuracy. First applications of the model in simulating heat transfer to supercritical fluids are also described, showing the capability of the model to affect the predictions of these heat transfer phenomena, in particular in the vicinity of the pseudo-critical conditions. A more extended application of the model to relevant deteriorated heat transfer conditions will clarify the usefulness of this modelling methodology in improving predictions of these difficult phenomena. Whatever the possible success in this particular application that motivated its development, this approach suggests a general methodology for accounting for wall roughness effects in low-Reynolds turbulence models.
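
    To make the proposal concrete, the schematic equations below show where such roughness source terms would enter a generic low-Reynolds k-ε model. The exact closures for the source terms are the paper's and are not reproduced here; S_k,rough and S_ε,rough simply stand for them, and f1, f2 denote the usual model-specific damping functions.

    ```latex
    % Schematic placement of the roughness source terms in the k and
    % epsilon transport equations (source-term closures not reproduced):
    \begin{align}
    \frac{\partial k}{\partial t} + U_j \frac{\partial k}{\partial x_j}
      &= \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)
         \frac{\partial k}{\partial x_j}\right] + P_k - \varepsilon
         + S_{k,\mathrm{rough}}, \\
    \frac{\partial \varepsilon}{\partial t} + U_j \frac{\partial \varepsilon}{\partial x_j}
      &= \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_\varepsilon}\right)
         \frac{\partial \varepsilon}{\partial x_j}\right]
         + C_{\varepsilon 1} f_1 \frac{\varepsilon}{k} P_k
         - C_{\varepsilon 2} f_2 \frac{\varepsilon^2}{k}
         + S_{\varepsilon,\mathrm{rough}}.
    \end{align}
    ```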

  16. Object-oriented modelling with unified modelling language 2.0 for simple software application based on agile methodology

    CERN Document Server

    Warnars, Spits

    2010-01-01

    Unified modelling language (UML) 2.0, introduced in 2002, has been developing and influencing object-oriented software engineering, and has become a standard and reference for information system analysis and design modelling. There are many concepts and theories for modelling an information system or software application with UML 2.0, which can create ambiguities and inconsistencies for a novice learning how to model a system with UML, especially UML 2.0. This article discusses how to model a simple software application using some of the UML 2.0 diagrams rather than the whole set, as suggested by agile methodology. Agile methodology is considered convenient for novices because it can deliver the information technology environment to the end user quickly and adaptively with minimal documentation. It also has the ability to deliver a software application that performs well according to the customer's needs. Agile methodology favours a simple model with simple documentation and a simple team…

  17. Developing an Item Bank for Use in Testing in Africa: Theory and Methodology

    Science.gov (United States)

    Furtuna, Daniela

    2014-01-01

    The author describes the steps taken by a research team, of which she was part, to develop a specific methodology for assessing student attainment in primary school, working with the Programme for the Analysis of Education Systems (PASEC) of the Conference of Ministers of Education of French-speaking Countries (CONFEMEN). This methodology provides…

  18. Model checking methodology for large systems, faults and asynchronous behaviour. SARANA 2011 work report

    Energy Technology Data Exchange (ETDEWEB)

    Lahtinen, J. [VTT Technical Research Centre of Finland, Espoo (Finland); Launiainen, T.; Heljanko, K.; Ropponen, J. [Aalto Univ., Espoo (Finland). Dept. of Information and Computer Science

    2012-07-01

    Digital instrumentation and control (I&C) systems are challenging to verify. They enable complicated control functions, and the state spaces of the models easily become too large for comprehensive verification through traditional methods. Model checking is a formal method that can be used for system verification. A number of efficient model checking systems are available that provide analysis tools to determine automatically whether a given state machine model satisfies the desired safety properties. This report reviews the work performed in the Safety Evaluation and Reliability Analysis of Nuclear Automation (SARANA) project in 2011 regarding model checking. We have developed new, more exact modelling methods that are able to capture the behaviour of a system more realistically. In particular, we have developed more detailed fault models depicting the hardware configuration of a system, and methodology to model function-block-based systems asynchronously. In order to improve the usability of our model checking methods, we have developed an algorithm for model checking large modular systems. The algorithm can be used to verify properties of a model that could otherwise not be verified in a straightforward manner. (orig.)
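
    At its core, checking a safety property against a state machine is a reachability search: every reachable state must satisfy the invariant. The Python sketch below is a toy illustration of that idea with a hypothetical transition relation; production tools work symbolically over far larger state spaces.

    ```python
    # Minimal sketch of explicit-state model checking: breadth-first
    # exploration of a state machine, verifying an invariant in every
    # reachable state. The transition relation below is hypothetical.
    from collections import deque

    def check_invariant(initial, successors, invariant):
        """Return a counterexample path to a violating state, or None."""
        seen, queue = {initial}, deque([(initial, [initial])])
        while queue:
            state, path = queue.popleft()
            if not invariant(state):
                return path                      # safety property violated
            for nxt in successors(state):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [nxt]))
        return None                              # invariant holds everywhere

    # Toy example: a counter that must never exceed 4.
    succ = lambda s: [(s + 1) % 6]               # hypothetical transitions
    print(check_invariant(0, succ, lambda s: s <= 4))  # -> [0, 1, 2, 3, 4, 5]
    ```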

  19. A Review of Kinetic Modeling Methodologies for Complex Processes

    Directory of Open Access Journals (Sweden)

    de Oliveira Luís P.

    2016-05-01

    Full Text Available In this paper, kinetic modeling techniques for complex chemical processes are reviewed. After a brief historical overview of chemical kinetics, an overview is given of the theoretical background of kinetic modeling of elementary steps and of multistep reactions. Classic lumping techniques are introduced and analyzed. Two examples of lumped kinetic models (atmospheric gasoil hydrotreating and residue hydroprocessing) developed at IFP Energies nouvelles (IFPEN) are presented. The largest part of this review describes advanced kinetic modeling strategies, in which the molecular detail is retained, i.e. the reactions are represented between molecules or even subdivided into elementary steps. To be able to retain this molecular level throughout the kinetic model and the reactor simulations, several hurdles have to be cleared first: (i) the feedstock needs to be described in terms of molecules, (ii) large reaction networks need to be automatically generated, and (iii) a large number of rate equations with their rate parameters need to be derived. For these three obstacles, molecular reconstruction techniques, deterministic or stochastic network generation programs, and single-event micro-kinetics and/or linear free energy relationships have been applied at IFPEN, as illustrated by several examples of kinetic models for industrial refining processes.
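
    A minimal example of the classic lumped approach is a three-lump network A → B → C with first-order kinetics. The sketch below (hypothetical rate constants, not an IFPEN model) integrates the resulting rate equations with SciPy.

    ```python
    # Hedged illustration of a lumped kinetic model: three lumps with
    # first-order steps A -> B -> C, integrated over batch time.
    import numpy as np
    from scipy.integrate import solve_ivp

    k1, k2 = 0.5, 0.2  # hypothetical rate constants, 1/h

    def rhs(t, c):
        ca, cb, cc = c
        return [-k1 * ca, k1 * ca - k2 * cb, k2 * cb]

    sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0],
                    t_eval=np.linspace(0.0, 10.0, 6))
    for t, (ca, cb, cc) in zip(sol.t, sol.y.T):
        print(f"t={t:4.1f} h  A={ca:.3f}  B={cb:.3f}  C={cc:.3f}")
    ```

    The molecule-level strategies the review describes replace these few lumps with thousands of species and automatically generated reaction steps, but the mathematical skeleton, a system of rate equations integrated over the reactor, is the same.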

  20. Population-Development-Environment Modeling in the Philippines: A Review

    OpenAIRE

    1996-01-01

    This article surveys existing forecasting models in the Philippines and discusses several promising alternatives in the process of developing a modeling methodology. Investigation of CGE models reveals the absence of population-environment interactions.

  1. Economic modeling of electricity production from hot dry rock geothermal reservoirs: methodology and analyses. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Cummings, R.G.; Morris, G.E.

    1979-09-01

    An analytical methodology is developed for assessing alternative modes of generating electricity from hot dry rock (HDR) geothermal energy sources. The methodology is used in sensitivity analyses to explore relative system economics. The methodology used a computerized, intertemporal optimization model to determine the profit-maximizing design and management of a unified HDR electric power plant with a given set of geologic, engineering, and financial conditions. By iterating this model on price, a levelized busbar cost of electricity is established. By varying the conditions of development, the sensitivity of both optimal management and busbar cost to these conditions is explored. A plausible set of reference case parameters is established at the outset of the sensitivity analyses. This reference case links a multiple-fracture reservoir system to an organic, binary-fluid conversion cycle. A levelized busbar cost of 43.2 mills/kWh ($1978) was determined for the reference case, which had an assumed geothermal gradient of 40 °C/km, a design well-flow rate of 75 kg/s, an effective heat transfer area per pair of wells of 1.7 × 10⁶ m², and a plant design temperature of 160 °C. Variations in the presumed geothermal gradient, size of the reservoir, drilling costs, real rates of return, and other system parameters yield minimum busbar costs between −40% and +76% of the reference case busbar cost.
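
    A levelized busbar cost is, in essence, discounted lifetime cost divided by discounted lifetime generation. The sketch below illustrates only that arithmetic with hypothetical figures; the study itself obtains the cost by iterating a full profit-maximizing optimization model on price.

    ```python
    # Back-of-envelope levelized busbar cost: present value of costs divided
    # by present value of generation. All inputs are hypothetical.
    def levelized_cost_mills_per_kwh(capex, annual_opex, annual_kwh,
                                     rate, years):
        disc = [(1 + rate) ** -t for t in range(1, years + 1)]
        cost = capex + annual_opex * sum(disc)   # $ (present value)
        energy = annual_kwh * sum(disc)          # kWh (present value)
        return 1000.0 * cost / energy            # mills/kWh (1 mill = $0.001)

    print(round(levelized_cost_mills_per_kwh(
        capex=120e6, annual_opex=6e6, annual_kwh=3.5e8,
        rate=0.08, years=30), 1))
    ```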

  2. Methodology for urban rail and construction technology research and development planning

    Science.gov (United States)

    Rubenstein, L. D.; Land, J. E.; Deshpande, G.; Dayman, B.; Warren, E. H.

    1980-01-01

    A series of transit system visits, organized by the American Public Transit Association (APTA), was conducted in which the system operators identified the most pressing development needs. These varied by property and were reformulated into a series of potential projects. To assist in the evaluation, a database useful for estimating the present capital and operating costs of various transit system elements was generated from published data. An evaluation model was developed which considered the rate of deployment of the research and development project, potential benefits, development time and cost. An outline of an evaluation methodology that considered benefits other than capital and operating cost savings was also presented. During the course of the study, five candidate projects were selected for detailed investigation: (1) air comfort systems; (2) solid state auxiliary power conditioners; (3) door systems; (4) escalators; and (5) fare collection systems. Application of the evaluation model to these five examples showed the usefulness of modeling deployment rates and indicated a need to increase the scope of the model to quantitatively consider reliability impacts.

  3. Development of Management Quality Assessment Methodology in the Public Sector: Problems and Contradictions

    Directory of Open Access Journals (Sweden)

    Olga Vladimirovna Kozhevina

    2015-09-01

    Full Text Available The development of a management quality assessment methodology for the public sector is a relevant scientific and practical problem of economic research. Utilizing assessment results obtained on the basis of the authors' methodology allows us to rate public sector organizations, to justify decisions on reorganization and privatization, and to monitor changes in the level of management quality of public sector organizations. The study determined the place of the quality of the control processes of a public sector organization in the system of “Quality of public administration — the effective operation of the public sector organization,” revealed the contradictions associated with the assessment of management quality, proved the conditions for effective functioning of public sector organizations, developed a mechanism of comprehensive assessment and an algorithm for constructing and evaluating control models of management quality, and empirically grounded the criteria for assessing management quality in public sector organizations, including economic, budgetary, social and public, informational, innovation and institutional criteria. By utilizing the proposed algorithm, an assessment model of quality management in public sector organizations is developed, including financial, economic, social, innovation, informational and institutional indicators. For each indicator of quality management, the coefficients of importance in the management quality assessment model, as well as comprehensive and partial evaluation indicators, are determined on the basis of expert evaluations. The main conclusion of the article is that management quality assessment for public sector organizations should be based not only on the indicators achieved in the dynamics and utilized for analyzing the effectiveness of management, but should also take into account the reference levels for the values of these indicators.

  4. Methodology of the Access to Care and Timing Simulation Model for Traumatic Spinal Cord Injury Care.

    Science.gov (United States)

    Santos, Argelio; Fallah, Nader; Lewis, Rachel; Dvorak, Marcel F; Fehlings, Michael G; Burns, Anthony Scott; Noonan, Vanessa K; Cheng, Christiana L; Chan, Elaine; Singh, Anoushka; Belanger, Lise M; Atkins, Derek

    2017-03-12

    Despite the relatively low incidence, the management and care of persons with traumatic spinal cord injury (tSCI) can be resource intensive and complex, spanning multiple phases of care and disciplines. Using a simulation model built with a system level view of the healthcare system allows for prediction of the impact of interventions on patient and system outcomes from injury through to community reintegration after tSCI. The Access to Care and Timing (ACT) project developed a simulation model for tSCI care using techniques from operations research and its development has been described previously. The objective of this article is to briefly describe the methodology and the application of the ACT Model as it was used in several of the articles in this focus issue. The approaches employed in this model provide a framework to look into the complexity of interactions both within and among the different SCI programs, sites and phases of care.

  5. Methodology for physical modeling of melter electrode power plug

    Energy Technology Data Exchange (ETDEWEB)

    Heath, W.O.

    1984-09-01

    A method is presented for building and testing a one-third scale model of an electrode power plug used to supply up to 3000 amperes to a liquid fed ceramic melter. The method describes how a one-third scale model can be used to verify the ampacity of the power plug, the effectiveness of the power plug cooling system and the effect of the high amperage current on eddy current heating of rebar in the cell wall. Scale-up of the test data, including cooling air flow rate and pressure drop, temperature profiles, melter water jacket heat duty and electrical resistance, is covered. The materials required to build the scale model are specified, as well as scale surface finish and dimensions. The method for designing and testing a model power plug involves developing a way to recreate the thermal conditions, including heat sources, sinks and boundary temperatures, on a scale basis. The major heat sources are the molten glass in contact with the electrode, joule heat generation within the power plug, and eddy current heating of the wall rebar. The melting cavity heat source is modelled using a plate heater to provide radiant heat transfer to a geometrically similar, one-third scale electrode housed in a scale model of a melting cavity having a thermally and geometrically similar wall and floor. The joule heat generation within the power plug is simulated by passing electricity through the model power plug with geometrically similar rebar positioned to simulate the eddy heating phenomenon. The proposed model also features two forced air cooling circuits similar to those on the full design. The interaction of convective, natural and radiant heat transfer in the wall cooling circuit is considered. The cell environment and a melter water jacket, along with the air cooling circuits, constitute the heat sinks and are also simulated.

  6. Integrating Social Activity Theory and Critical Discourse Analysis: A Multilayered Methodological Model for Examining Knowledge Mediation in Mentoring

    Science.gov (United States)

    Becher, Ayelet; Orland-Barak, Lily

    2016-01-01

    This study suggests an integrative qualitative methodological framework for capturing complexity in mentoring activity. Specifically, the model examines how historical developments of a discipline direct mentors' mediation of professional knowledge through the language that they use. The model integrates social activity theory and a framework of…

  7. IMPLEMENTATION OF DATA ASSIMILATION METHODOLOGY FOR PHYSICAL MODEL UNCERTAINTY EVALUATION USING POST-CHF EXPERIMENTAL DATA

    Directory of Open Access Journals (Sweden)

    JAESEOK HEO

    2014-10-01

    Full Text Available The Best Estimate Plus Uncertainty (BEPU) method has been widely used to evaluate the uncertainty of a best-estimate thermal hydraulic system code against a figure of merit. This uncertainty is typically evaluated based on the physical models' uncertainties determined by expert judgment. This paper introduces the application of data assimilation methodology to determine the uncertainty bands of the physical models, e.g., the mean value and standard deviation of the parameters, based upon a statistical approach rather than expert judgment. Data assimilation suggests a mathematical methodology for the best estimate bias and the uncertainties of the physical models which optimize the system response following the calibration of model parameters and responses. The mathematical approaches include deterministic and probabilistic methods of data assimilation to solve both linear and nonlinear problems, with the a posteriori distribution of parameters derived based on Bayes' theorem. The inverse problem was solved analytically to obtain the mean value and standard deviation of the parameters, assuming Gaussian distributions for the parameters and responses, and a sampling method was utilized to illustrate non-Gaussian a posteriori distributions of parameters. SPACE is used to demonstrate the data assimilation method by determining the bias and the uncertainty bands of the physical models, employing Bennett's heated tube test data and Becker's post critical heat flux experimental data. Based on the results of the data assimilation process, the major sources of the modeling uncertainties were identified for further model development.
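
    For the linear-Gaussian case mentioned above, the analytic update is the standard result below; the notation is generic, not the paper's.

    ```latex
    % Linear-Gaussian parameter update underlying deterministic data
    % assimilation (standard result, generic notation):
    \begin{align}
    \hat{\theta} &= \theta_0 + K\,(y - H\theta_0), &
    K &= P H^{\mathsf{T}} \left(H P H^{\mathsf{T}} + R\right)^{-1}, &
    P_{\mathrm{post}} &= (I - K H)\,P,
    \end{align}
    ```

    where θ₀ and P are the prior mean and covariance of the model parameters, y the measured responses, H the sensitivity of responses to parameters, and R the measurement covariance. Non-Gaussian posteriors, as in the paper's sampling approach, require Monte Carlo methods instead.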

  8. A New Mathematical Model for Flank Wear Prediction Using Functional Data Analysis Methodology

    Directory of Open Access Journals (Sweden)

    Sonja Jozić

    2014-01-01

    Full Text Available This paper presents a new approach that improves the reliability of flank wear prediction during the end milling process. In the present work, prediction of flank wear has been achieved by using cutting parameters and force signals as the sensitive carriers of information about the machining process. A series of experiments was conducted to establish the relationship between flank wear and cutting force components as well as cutting parameters such as cutting speed, feed per tooth, and radial depth of cut. In order to predict flank wear, a new linear regression mathematical model has been developed by utilizing functional data analysis methodology. Regression coefficients of the model are in the form of time-dependent functions that have been determined through the use of functional data analysis methodology. The mathematical model has been developed by means of the applied cutting parameters and measured cutting force components during the end milling of a workpiece made of 42CrMo4 steel. The efficiency and flexibility of the developed model have been verified by comparing it with a separate experimental data set.
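
    The sketch below illustrates the idea of time-dependent regression coefficients on synthetic data (not the paper's measurements): a separate least-squares fit at each sampled instant recovers a coefficient function β(t) relating cutting force to flank wear.

    ```python
    # Hedged sketch of functional regression: the coefficient relating
    # cutting force (N) to flank wear (mm) is allowed to vary with time
    # by fitting pointwise least-squares models. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 50)                     # normalized machining time
    force = rng.uniform(100, 300, size=(20, 50))  # 20 runs x 50 time samples
    beta_true = 0.001 + 0.002 * t                 # sensitivity grows in time
    wear = force * beta_true + rng.normal(0, 0.01, size=force.shape)

    # Pointwise fit: estimate beta(t_j) from the 20 runs at each instant j
    beta_hat = np.array([
        np.linalg.lstsq(force[:, [j]], wear[:, j], rcond=None)[0][0]
        for j in range(len(t))
    ])
    print(np.round(beta_hat[::10], 4))            # tracks beta_true(t)
    ```

    Functional data analysis proper would additionally smooth β(t) over a basis (e.g. splines) rather than fit each instant independently.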

  9. A Comparative Analysis of Two Software Development Methodologies: Rational Unified Process and Extreme Programming

    Directory of Open Access Journals (Sweden)

    Marcelo Rafael Borth

    2014-01-01

    Full Text Available Software development methodologies were created to meet the great market demand for innovation, productivity, quality and performance. With the use of a methodology, it is possible to reduce the cost, the risk and the development time, and even increase the quality of the final product. This article compares two of these development methodologies: the Rational Unified Process and Extreme Programming. The comparison shows the main differences and similarities between the two approaches, and highlights and comments on some of their predominant features.

  10. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium cooled reactor which is in the advanced stage of construction at Kalpakkam, India. Versa Module Europa (VME) bus-based Real Time Computer (RTC) systems are deployed for the instrumentation and control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for RTC systems, starting from the requirement capture phase up to the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis, and cause-and-effect graphing techniques. System integration testing is carried out wherein functional and performance requirements of the system are monitored.
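
    One of the metrics named above has a simple closed form: McCabe's cyclomatic complexity M = E − N + 2P over the control-flow graph (edges E, nodes N, connected components P). The sketch below computes it for a hypothetical graph; it is an illustration of the metric, not the PFBR toolchain.

    ```python
    # Illustrative computation of cyclomatic complexity, M = E - N + 2P,
    # from a control-flow graph given as an adjacency list (hypothetical).
    def cyclomatic_complexity(cfg: dict) -> int:
        nodes = set(cfg) | {n for succs in cfg.values() for n in succs}
        edges = sum(len(succs) for succs in cfg.values())
        p = 1                            # one connected component (one routine)
        return edges - len(nodes) + 2 * p

    # if/else followed by a loop: entry->a, a->b|c, b->d, c->d, d->d|exit
    cfg = {"entry": ["a"], "a": ["b", "c"], "b": ["d"],
           "c": ["d"], "d": ["d", "exit"]}
    print(cyclomatic_complexity(cfg))    # 7 edges - 6 nodes + 2 = 3
    ```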

  11. Development of a Methodology for Assessing Military Team Processes

    OpenAIRE

    Fraser, Brent DeWayne

    2003-01-01

    This study is based upon the premise that overall team performance is the sum of the team's performance in several individual team processes. The purpose of this study was to develop a tool to measure performance in each of these individual team processes. This study begins the measurement development cycle by developing a tool that uses direct observation to collect data on team processes. The tool was then tested in a battle simulation being used as a C2 training exercise. The study showed ...

  12. Development of Fuzzy Logic and Soft Computing Methodologies

    Science.gov (United States)

    Zadeh, L. A.; Yager, R.

    1999-01-01

    Our earlier research on computing with words (CW) has led to a new direction in fuzzy logic which points to a major enlargement of the role of natural languages in information processing, decision analysis and control. This direction is based on the methodology of computing with words and embodies a new theory which is referred to as the computational theory of perceptions (CTP). An important feature of this theory is that it can be added to any existing theory, especially to probability theory, decision analysis and control, and enhance the ability of the theory to deal with real-world problems in which the decision-relevant information is a mixture of measurements and perceptions. The new direction is centered on an old concept, the concept of a perception, which plays a central role in human cognition. The ability to reason with perceptions (perceptions of time, distance, force, direction, shape, intent, likelihood, truth and other attributes of physical and mental objects) underlies the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Everyday examples of such tasks are parking a car, driving in city traffic, cooking a meal, playing golf and summarizing a story. Perceptions are intrinsically imprecise. Imprecision of perceptions reflects the finite ability of sensory organs and, ultimately, the brain to resolve detail and store information. More concretely, perceptions are both fuzzy and granular or, for short, f-granular. Perceptions are f-granular in the sense that: (a) the boundaries of perceived classes are not sharply defined; and (b) the elements of classes are grouped into granules, with a granule being a clump of elements drawn together by indistinguishability, similarity, proximity or functionality. F-granularity of perceptions may be viewed as a human way of achieving data compression.

  13. Establishing a methodology to develop complex sociotechnical systems

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2013-02-01

    Full Text Available Many modern management systems, such as military command and control, tend to be large and highly interconnected sociotechnical systems operating in a complex environment. Successful development, assessment and implementation of these systems...

  14. Methodology for analyzing and developing information management infrastructure to support telerehabilitation.

    Science.gov (United States)

    Saptono, Andi; Schein, Richard M; Parmanto, Bambang; Fairman, Andrea

    2009-01-01

    The proliferation of advanced technologies led researchers within the Rehabilitation Engineering Research Center on Telerehabilitation (RERC-TR) to devise an integrated infrastructure for clinical services using the University of Pittsburgh (PITT) model. This model describes five required characteristics for a telerehabilitation (TR) infrastructure: openness, extensibility, scalability, cost-effectiveness, and security. The infrastructure is to deliver clinical services over distance to improve access to health services for people living in underserved or remote areas. The methodological approach to design, develop, and employ this infrastructure is explained and detailed for the remote wheelchair prescription project, a research task within the RERC-TR. The availability of this specific clinical service and personnel outside of metropolitan areas is limited due to the lack of specialty expertise and access to resources. The infrastructure is used to deliver expertise in wheeled mobility and seating through teleconsultation to remote clinics, and has been successfully deployed to five rural clinics in Western Pennsylvania.

  15. Development of suitable solvent system for downstream processing of biopolymer pullulan using response surface methodology.

    Directory of Open Access Journals (Sweden)

    Anirban Roy Choudhury

    Full Text Available Downstream processing is an important aspect of all biotechnological processes and has significant implications for the quality and yield of the final product. Several solvents were examined for their efficacy in precipitating pullulan from fermentation broth. Interactions among four selected solvents and their effect on pullulan yield were studied using response surface methodology. A polynomial model was developed using a D-optimal design, three contour plots were generated by performing 20 different experiments, and the model was validated by performing optimization experiments. The results indicated that a lower concentration of ethanol in combination with the other three solvents resulted in a higher yield of polymer from fermentation broth, and the optimized solvent system was able to recover 1.44 times more pullulan than the conventional ethanolic precipitation method. These observations may help enhance the efficiency of pullulan recovery from fermentation broth and also reduce the production cost of the final product.

  16. Software representation methodology for agile application development: An architectural approach

    Directory of Open Access Journals (Sweden)

    Alejandro Paolo Daza Corredor

    2016-06-01

    Full Text Available The generation of Web applications involves the execution of repetitive tasks: determining information structures, generating different types of components, and finally deploying and tuning the application. In many applications of this type, the same components recur from application to application. Current trends in software engineering such as MDE, MDA and MDD aim to automate the generation of applications by structuring a model and applying transformations until the application is obtained. This document proposes an architectural foundation that facilitates the generation of these applications, relying on model-driven architecture without ignoring the existence and relevance of the existing trends mentioned in this summary.
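
    The recurring-component observation is exactly what model-driven generation exploits. The sketch below shows the idea in miniature, turning a declarative entity model into source text through a template; the model contents and output shape are invented for illustration.

    ```python
    # Minimal sketch of model-to-code transformation: a declarative model
    # of an entity is rendered into boilerplate source text via a template.
    from string import Template

    CLASS_TEMPLATE = Template(
        "class $name:\n"
        "    def __init__(self, $args):\n"
        "$assigns"
    )

    def generate_class(model: dict) -> str:
        """Render one entity of the (hypothetical) application model."""
        fields = model["fields"]
        args = ", ".join(fields)
        assigns = "".join(f"        self.{f} = {f}\n" for f in fields)
        return CLASS_TEMPLATE.substitute(
            name=model["name"], args=args, assigns=assigns)

    print(generate_class({"name": "Customer", "fields": ["id", "email"]}))
    ```

    Real MDA toolchains layer several such transformations (platform-independent model to platform-specific model to code), but each step is essentially this pattern applied systematically.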

  17. Organizational Culture and Scale Development: Methodological Challenges and Future Directions

    OpenAIRE

    Bavik Ali; Duncan Tara

    2014-01-01

    Defining and measuring organizational culture (OC) is of paramount importance to organizations because a strong culture could potentially increase service quality and yield sustainable competitive advantages. However, such process could be challenging to managers because the scope of OC has been defined differently across disciplines and industries, which has led to the development of various scales for measuring OC. In addition, previously developed OC scales may also not be fully applicable...

  18. Learning challenges and sustainable development: A methodological perspective.

    Science.gov (United States)

    Seppänen, Laura

    2017-01-01

    Sustainable development requires learning, but the contents of learning are often complex and ambiguous. This requires new integrated approaches from research. It is argued that investigation of people's learning challenges in everyday work is beneficial for research on sustainable development. The aim of the paper is to describe a research method for examining learning challenges in promoting sustainable development. This method is illustrated with a case example from organic vegetable farming in Finland. The method, based on Activity Theory, combines historical analysis with qualitative analysis of need expressions in discourse data. The method, linking local and subjective need expressions with general historical analysis, is a promising way to overcome the gap between the individual and society, so much needed in research for sustainable development. Dialectically informed historical frameworks have practical value as tools in collaborative negotiations and participatory designs for sustainable development. The simultaneous use of systemic and subjective perspectives allows researchers to manage the complexity of practical work activities and to avoid overly simplistic presumptions about sustainable development.

  19. Nonlinear Time Domain Seismic Soil-Structure Interaction (SSI) Deep Soil Site Methodology Development

    Energy Technology Data Exchange (ETDEWEB)

    Spears, Robert Edward [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coleman, Justin Leigh [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    Currently the Department of Energy (DOE) and the nuclear industry perform seismic soil-structure interaction (SSI) analysis using equivalent linear numerical analysis tools. For lower levels of ground motion, these tools should produce reasonable in-structure response values for evaluation of existing and new facilities. For larger levels of ground motion, these tools likely overestimate the in-structure response (and therefore structural demand) since they do not consider geometric nonlinearities (such as gaping and sliding between the soil and structure) and are limited in their ability to model nonlinear soil behavior. The current equivalent linear SSI (SASSI) analysis approach either joins the soil and structure together in both tension and compression or releases the soil from the structure for both tension and compression. It also makes linear approximations for material nonlinearities and generalizes energy absorption with viscous damping. This produces the potential for inaccurately establishing where the structural concerns exist and/or inaccurately establishing the amplitude of the in-structure responses. Seismic hazard curves at nuclear facilities have continued to increase over the years as more information has been developed on seismic sources (i.e., faults), additional information has been gathered on seismic events, and additional research has been performed to determine local site effects. Seismic hazard curves are used to develop design basis earthquakes (DBEs) that are used to evaluate nuclear facility response. As the seismic hazard curves increase, the input ground motions (DBEs) used to numerically evaluate nuclear facility response increase, causing larger in-structure response. As ground motions increase, so does the importance of including nonlinear effects in numerical SSI models. To include material nonlinearity in the soil and geometric nonlinearity using contact (gaping and sliding), it is necessary to develop a nonlinear time domain methodology.

  20. Testing spectral models for stellar populations with star clusters: I. Methodology

    CERN Document Server

    Fernandes, Roberto Cid

    2009-01-01

    High resolution spectral models for simple stellar populations (SSP) developed in the past few years have become a standard ingredient in studies of the stellar populations of galaxies. As more such models become available, it becomes increasingly important to test them. In this and a companion paper, we test a suite of publicly available evolutionary synthesis models using integrated optical spectra in the blue-near-UV range of 27 well-studied star clusters from the work of Leonardi & Rose (2003), spanning a wide range of ages and metallicities. Most (23) of the clusters are from the Magellanic Clouds. This paper concentrates on methodological aspects of spectral fitting. The data are fitted with SSP spectral models from Vazdekis and collaborators, based on the MILES library. Best-fit and Bayesian estimates of age, metallicity and extinction are presented, and degeneracies between these parameters are mapped. We find that these models can match the observed spectra very well in most cases, with small formal uncertainties…

  1. Comparative Heat Conduction Model of a Cold Storage with Puf & Eps Insulation Using Taguchi Methodology

    Directory of Open Access Journals (Sweden)

    Dr. N. Mukhopadhyay

    2015-05-01

    Full Text Available In this project work a mathematical heat conduction model of a cold storage (developed with the help of a computer program and multiple regression analysis) has been proposed, which can be used for the further development of cold storages in the future. In a cold storage, the refrigeration system brings down the temperature initially during start-up, but thermal insulation maintains the temperature continuously thereafter. In this view, a simple methodology is presented to calculate heat transfer by an analytical method; an attempt has also been made to minimize energy consumption by replacing 150 mm expanded polystyrene (EPS) with 100 mm polyurethane foam (PUF) insulation. The methodology is validated against actual data obtained from the Penguin cold storage situated in Pune, India. Insulation thickness of the side walls (TW), area of the wall (AW), and insulation thickness of the roof (TR) have been chosen as predictor variables of the study.
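
    At its simplest, the insulation comparison reduces to steady one-dimensional conduction, Q = kAΔT/t. The sketch below uses typical handbook conductivities and hypothetical wall data, not the Penguin cold storage figures, to show why thinner PUF can match thicker EPS.

    ```python
    # Back-of-envelope conduction comparison: steady 1-D heat gain through
    # an insulated wall, Q = k * A * dT / t. Conductivities are typical
    # handbook values; area and temperatures are hypothetical.
    def heat_gain_w(k_w_per_mk, thickness_m, area_m2, delta_t_k):
        return k_w_per_mk * area_m2 * delta_t_k / thickness_m

    AREA, DT = 200.0, 35.0                       # m^2; K (30 C ambient, -5 C room)
    q_eps = heat_gain_w(0.035, 0.150, AREA, DT)  # 150 mm EPS, k ~ 0.035 W/m K
    q_puf = heat_gain_w(0.023, 0.100, AREA, DT)  # 100 mm PUF, k ~ 0.023 W/m K
    print(f"EPS: {q_eps:.0f} W, PUF: {q_puf:.0f} W")
    ```

    With these assumed values the 100 mm PUF wall admits slightly less heat than the 150 mm EPS wall, which is the kind of trade-off the regression model above quantifies across wall thickness and area.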

  2. Methodological support to develop interoperable applications for pervasive healthcare

    NARCIS (Netherlands)

    Cardoso de Moraes, João Luís

    2014-01-01

    The healthcare model currently being used in most countries will soon be inadequate, due to the increasing care costs of a growing population of elderly people, the rapid increase of chronic diseases, the growing demand for new treatments and technologies, and the relative decrease in the number of available healthcare professionals.

  3. [Methodology for the development and update of practice guidelines: current state].

    Science.gov (United States)

    Barrera-Cruz, Antonio; Viniegra-Osorio, Arturo; Valenzuela-Flores, Adriana Abigail; Torres-Arreola, Laura Pilar; Dávila-Torres, Javier

    2016-01-01

    The current scenario of health services in Mexico reveals as a priority the implementation of strategies that allow us to better respond to the needs and expectations of individuals and society as a whole, through the provision of efficient and effective alternatives for the prevention, diagnosis and treatment of diseases. In this context, clinical practice guidelines constitute a management element in the health care system, whose objective is to establish a national benchmark for encouraging clinical and management decision making, based on recommendations drawn from the best available evidence, in order to contribute to the quality and effectiveness of health care. The purpose of this document is to show the methodology used for the development and updating of the clinical practice guidelines that the Instituto Mexicano del Seguro Social has developed in line with the sectorial model, in order to serve the users of these guidelines.

  4. Methodology for Evaluating the Rural Tourism Potentials: A Tool to Ensure Sustainable Development of Rural Settlements

    Directory of Open Access Journals (Sweden)

    Alexander Trukhachev

    2015-03-01

    Full Text Available The paper analyses the potentials, challenges and problems of rural tourism from the point of view of its impact on sustainable rural development. It explores alternative sources of income for rural people through tourism and investigates the effects of rural tourism on agricultural production in local rural communities. The aim is to identify the existing and potential tourist attractions within the rural areas of Southern Russia and to provide solutions to be introduced in particular rural settlements in order to make them attractive to tourists. The paper includes the elaboration and testing of a methodology for evaluating rural tourism potentials using the case of rural settlements of Stavropol Krai, Russia. The paper concludes with a ranking of the selected rural settlements according to their rural tourism capacity and a substantiation of the tourism models to be implemented to ensure the sustainable development of the considered rural areas.

  5. Model-driven methodology for rapid deployment of smart spaces based on resource-oriented architectures.

    Science.gov (United States)

    Corredor, Iván; Bernardos, Ana M; Iglesias, Josué; Casar, José R

    2012-01-01

    Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  6. Model-Driven Methodology for Rapid Deployment of Smart Spaces Based on Resource-Oriented Architectures

    Directory of Open Access Journals (Sweden)

    José R. Casar

    2012-07-01

    Full Text Available Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as Internet of Things (IoT) and Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who are applying this Internet-oriented approach need to have solid understanding about specific platforms and web technologies. In order to alleviate this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.

  7. Application of Box-Behnken design and response surface methodology for modeling of some Turkish coals

    Energy Technology Data Exchange (ETDEWEB)

    N. Aslan; Y. Cebeci [Cumhuriyet University, Sivas (Turkey). Mining Engineering Department

    2007-01-15

    The aim of our research was to apply Box-Behnken experimental design and response surface methodology to the modeling of some Turkish coals. As a base for this study, standard Bond grindability tests were initially done and Bond work index (Wi) values were calculated for three Turkish coals. The Box-Behnken experimental design was used to provide data for modeling, and the variables of the model were Bond work index, grinding time and ball diameter of the mill. Coal grinding tests were performed changing these three variables for three size fractions of coals (−3350 + 1700 µm, −1700 + 710 µm and −710 µm). Using these sets of experimental data obtained by a mathematical software package (MATLAB 7.1), mathematical models were then developed to show the effect of each parameter and their interactions on the product 80% passing size (d80). Predicted values of d80 obtained using the model equations were in good agreement with the experimental values of d80 (R² value of 0.96 for −3350 + 1700 µm, R² value of 0.98 for −1700 + 710 µm and R² value of 0.94 for −710 µm). This study proved that Box-Behnken design and response surface methodology can be applied efficiently to the modeling of the grinding of some Turkish coals. 19 refs., 14 figs., 6 tabs.
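
    The Bond work index Wi mentioned above enters through Bond's law, which relates the specific grinding energy to the 80% passing sizes of feed and product (Wi commonly in kWh/t, sizes in micrometres):

    ```latex
    % Bond's law, the basis of the work-index tests mentioned above
    % (W: specific energy; F80, P80: 80% passing sizes of feed and product):
    \begin{equation}
    W = 10\,W_i \left( \frac{1}{\sqrt{P_{80}}} - \frac{1}{\sqrt{F_{80}}} \right)
    \end{equation}
    ```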

  8. Development of an aggregation methodology for risk analysis in aerospace conceptual vehicle design

    Science.gov (United States)

    Chytka, Trina Marsh

    2003-10-01

    The growing complexity of technical systems has emphasized a need to gather as much information as possible regarding specific systems of interest in order to make robust, sound decisions about their design and deployment. Acquiring as much data as possible requires the use of empirical statistics, historical information and expert opinion. In much of the aerospace conceptual design environment, the lack of historical information and infeasibility of gathering empirical data relegates the data collection to expert opinion. The conceptual design of a space vehicle requires input from several disciplines (weights and sizing, operations, trajectory, etc.). In this multidisciplinary environment, the design variables are often not easily quantified and have a high degree of uncertainty associated with their values. Decision-makers must rely on expert assessments of the uncertainty associated with the design variables to evaluate the risk level of a conceptual design. Since multiple experts are often queried for their evaluation of uncertainty, a means to combine/aggregate multiple expert assessments must be developed. Providing decision-makers with a solitary assessment that captures the consensus of the multiple experts would greatly enhance the ability to evaluate risk associated with a conceptual design. The objective of this research has been to develop an aggregation methodology that efficiently combines the uncertainty assessments of multiple experts in multiple disciplines involved in aerospace conceptual design. Bayesian probability augmented by uncertainty modeling and expert calibration was employed in the methodology construction. Appropriate questionnaire techniques were used to acquire expert opinion; the responses served as input distributions to the aggregation algorithm. Application of the derived techniques were applied as part of a larger expert assessment elicitation and calibration study. Results of this research demonstrate that aggregation of
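
    One simple aggregation rule consistent with the stated goal is precision (inverse-variance) weighting of the experts' Gaussian assessments, optionally scaled by a per-expert calibration weight. The sketch below is illustrative only, not the dissertation's Bayesian algorithm, and all numbers are hypothetical.

    ```python
    # Hedged sketch of expert-opinion aggregation: combine Gaussian
    # assessments by calibrated inverse-variance weighting. Illustrative
    # rule and numbers; not the dissertation's method.
    import math

    def aggregate(assessments):
        """assessments: list of (mean, std_dev, calibration_weight)."""
        weights = [c / s**2 for (_, s, c) in assessments]  # calibrated precisions
        total = sum(weights)
        mean = sum(w * m for w, (m, _, _) in zip(weights, assessments)) / total
        return mean, math.sqrt(1.0 / total)

    # Three experts' estimates of a (hypothetical) dry-mass growth factor
    experts = [(1.15, 0.05, 1.0), (1.22, 0.08, 0.7), (1.10, 0.04, 0.9)]
    print(tuple(round(v, 3) for v in aggregate(experts)))  # (1.131, 0.031)
    ```

    Down-weighting poorly calibrated experts, as the calibration factor does here, is the intuition behind the calibration study the abstract describes.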

  9. Development of a Graphical Tool to integrate the Prometheus AEOlus methodology and Jason Platform

    Directory of Open Access Journals (Sweden)

    Rafhael CUNHA

    2017-07-01

    Full Text Available Software Engineering (SE) is an area that aims to build high-quality software in a systematic way. However, traditional software engineering techniques and methods do not support the demands of developing Multiagent Systems (MAS). Therefore a new subarea has been studied, called Agent Oriented Software Engineering (AOSE). The AOSE area proposes solutions to issues related to the development of agent oriented systems. There is still no standardization in this subarea, resulting in several methodologies. Another issue in this subarea is that there are very few tools able to automatically generate code. In this work we propose a tool to support the Prometheus AEOlus methodology, because it provides modelling artifacts for all MAS dimensions: agents, environment, interaction, and organization. The tool supports all Prometheus AEOlus artifacts and can automatically generate code for the agent and interaction dimensions in the AgentSpeak language, which is the language used in the Jason Platform. We have done some validations with the proposed tool, and a case study is presented.

  10. Development Risk Methodology for Whole Systems Trade Analysis

    Science.gov (United States)

    2016-08-01

    potential risks, and estimated development times associated with each technology. The established WSTAT framework utilizes elicitation techniques with...and stakeholder value in order to inform and potentially influence requirements documents and associated specifications. The WSTA tool (WSTAT) can...exploitation, requirements definition, early cost informed trades, requirements analysis, Analysis of Alternatives (AoA), contractor trades, and technology

  11. Methodological issues concerning the development of sustainable industrial parks

    Directory of Open Access Journals (Sweden)

    Marian NASTASE

    2010-12-01

    Full Text Available Nowadays, in order to make sustainable economic growth possible, especially within manufacturing industries, with favourable effects on the standard of living and employment rates, governments should develop and implement projects and strategies aimed at the transition towards a knowledge-based economy. The entities that need investment the most are countries undergoing development, which usually face difficulties in building a high-performance industrial sector. Under these circumstances, industrial parks prove to be important tools for ensuring the competitiveness of national industry. However, the increasingly frequent relocation of industrial companies into industrial parks, as a consequence of tightening environmental regulations in urban areas, is likely to generate uncontrolled pollution centres located near towns and cities, with a highly destructive impact on the environment. This article aims to describe and recommend several strategic options for developing sustainable industrial parks, focused on protecting the environment and promoting the principles of the “new economy”. Some of the recommended strategic options are highly innovative, such as environmental benchmarking or environmental leadership.

  12. Advances in Artificial Neural Networks - Methodological Development and Application

    Science.gov (United States)

    Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other ne...

  13. Methodological Barriers Precluding the Development of Comprehensive Theory.

    Science.gov (United States)

    Wilmot, William; King, Stephen

    The authors examine published research in speech communication and evaluate its potential for theory development. Two major suggestions are advanced that will facilitate the quest for viable theory of speech communication. First, research should begin to focus on relevant communication behaviors rather than merely using them as convenient contexts…

  14. Review of methodological developments in laser Doppler flowmetry

    NARCIS (Netherlands)

    Rajan, Vinayakrishnan; Varghese, Babu; van Leeuwen, Ton; Steenbergen, Wiendelt

    2009-01-01

    Laser Doppler flowmetry is a non-invasive method of measuring microcirculatory blood flow in tissue. In this review the technique is discussed in detail. The theoretical and experimental developments to improve the technique are reviewed. The limitations of the method are elaborated upon, and the re

  15. Postlaunch Monitoring of Functional Foods - Methodology development (I)

    NARCIS (Netherlands)

    Jong N de; Ocke MC; CVG

    2004-01-01

    Already for some years, the development of a Postlaunch Monitoring system for functional foods is on the research agenda of several stakeholders involved, e.g. the industries, the government, and research institutes. Up till now, proposals for such a system have been highly hypothetical and only li

  16. Integrated management model. Methodology and software-enabled tool designed to assist a utility in developing a station-wide optimization; Modelo de gestion integrada del activo. Una Metodologia para la Optimizacion Economica de la Gestion del Envejecimiento y la Fiabilidad de los Equipos de la Central

    Energy Technology Data Exchange (ETDEWEB)

    Llovet, R.; Ibanez, R.; Woodcock, J.

    2005-07-01

    A key concern for utilities today is optimizing station aging and reliability management activities in a manner that maximizes the value of those activities within an affordable budget. The Westinghouse Proactive Asset Management Model is a methodology and software-enabled tool designed to assist a utility in developing a station-wide optimization of those activities. The process and tool support the development of an optimized, station-wide plan for inspection, testing, maintenance, repair and replacement of aging components. The optimization identifies the benefit and optimal timing of those activities based on minimizing unplanned outage costs (avoided costs) and maximizing station Net Present Value. (Author)

  17. A Classification Methodology and Retrieval Model to Support Software Reuse

    Science.gov (United States)

    1988-01-01

    The use of attribute vectors destroys the boolean structure. Caliban is an experimental IR system developed at the Swiss Federal Institute of... To retrieve information using Caliban, the user specifies a "virtual information item" (fills out a template describing the item).

  18. Methodological aspects of journaling a dynamic adjusting entry model

    Directory of Open Access Journals (Sweden)

    Vlasta Kašparovská

    2011-01-01

    Full Text Available This paper expands the discussion of the importance and function of adjusting entries for loan receivables. Discussion of the cyclical development of adjusting entries, their negative impact on the business cycle and potential solutions has intensified during the financial crisis. These discussions are still ongoing and continue to be relevant to members of the professional public, banking regulators and representatives of international accounting institutions. The objective of this paper is to evaluate a method of journaling dynamic adjusting entries under current accounting law. It also expresses the authors’ opinions on the potential for consistently implementing basic accounting principles in journaling adjusting entries for loan receivables under a dynamic model.

  19. Organizational Culture and Scale Development: Methodological Challenges and Future Directions

    Directory of Open Access Journals (Sweden)

    Bavik Ali

    2014-12-01

    Full Text Available Defining and measuring organizational culture (OC) is of paramount importance to organizations because a strong culture could potentially increase service quality and yield sustainable competitive advantages. However, such a process can be challenging to managers, because the scope of OC has been defined differently across disciplines and industries, which has led to the development of various scales for measuring OC. In addition, previously developed OC scales may not be fully applicable in the hospitality and tourism context. Therefore, by highlighting the key factors affecting the business environment and the unique characteristics of the hospitality industry, this paper aims to align the scope of OC closely with the industry and to put forth the need for a new OC scale that accurately responds to the context of the hospitality industry.

  20. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  1. A geostatistical methodology to assess the accuracy of unsaturated flow models

    Energy Technology Data Exchange (ETDEWEB)

    Smoot, J.L.; Williams, R.E.

    1996-04-01

    The Pacific Northwest National Laboratory (PNNL) has developed a Hydrologic Evaluation Methodology (HEM) to assist the U.S. Nuclear Regulatory Commission in evaluating the potential that infiltrating meteoric water will produce leachate at commercial low-level radioactive waste disposal sites. Two key issues are raised in the HEM: (1) evaluation of mathematical models that predict facility performance, and (2) estimation of the uncertainty associated with these mathematical model predictions. The technical objective of this research is to adapt geostatistical tools commonly used for model parameter estimation to the problem of estimating the spatial distribution of the dependent variable to be calculated by the model. To fulfill this objective, a database describing the spatiotemporal movement of water injected into unsaturated sediments at the Hanford Site in Washington State was used to develop a new method for evaluating mathematical model predictions. Measured water content data were interpolated geostatistically to a 16 x 16 x 36 grid at several time intervals. Then a mathematical model was used to predict water content at the same grid locations at the selected times. Node-by-node comparison of the mathematical model predictions with the geostatistically interpolated values was conducted. The method facilitates a complete accounting and categorization of model error at every node. The comparison suggests that model results generally are within measurement error. The worst model error occurs in silt lenses and is in excess of measurement error.
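
    A minimal sketch of the node-by-node accounting idea: compare model predictions against geostatistically interpolated values on the 16 x 16 x 36 grid and flag nodes whose error exceeds an assumed measurement-error band. All arrays below are synthetic stand-ins, not Hanford data.

        import numpy as np

        # Synthetic stand-ins for the 16 x 16 x 36 grid: geostatistically
        # interpolated water contents ("observed") vs. model predictions,
        # with an assumed measurement-error standard deviation.
        rng = np.random.default_rng(0)
        observed = rng.uniform(0.05, 0.30, size=(16, 16, 36))
        predicted = observed + rng.normal(0.0, 0.02, size=observed.shape)
        sigma_meas = 0.03

        error = predicted - observed
        within = np.abs(error) <= 2 * sigma_meas   # inside the error band
        print(f"{within.mean():.1%} of nodes within measurement error")

        # Flag the worst node, e.g. to locate features such as silt lenses
        i, j, k = np.unravel_index(np.argmax(np.abs(error)), error.shape)
        print("largest error at node", (i, j, k), "=", round(error[i, j, k], 4))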

  2. Development cooperation as methodology for teaching social responsibility to engineers

    Science.gov (United States)

    Lappalainen, Pia

    2011-12-01

    The role of engineering in promoting global well-being has become accentuated, turning the engineering curriculum into a means of distributing well-being more equally. Gradually intensifying calls for humanitarian engineering have resulted in the incorporation of social responsibility themes into the university curriculum. Cooperation, communication, teamwork, intercultural cooperation, sustainability, and social and global responsibility represent the socio-cultural dimensions that are becoming increasingly important as globalisation intensifies the demands on socially and globally adept engineering communities. This article describes an experiment, the Development Cooperation Project, which was conducted at Aalto University in Finland to integrate social responsibility themes into higher engineering education.

  3. Applying axiomatic design methodology in developing modified-release products

    Directory of Open Access Journals (Sweden)

    Bibiana Margarita Vallejo Díaz

    2010-04-01

    Full Text Available Some conceptual elements of the axiomatic design method were applied to a specific case study: the development of a modified-release compressed product (CLM-UN) for use in the agricultural sector as a pH-regulating agent in soil. The study was oriented towards defining the functional requirements, design parameters and process variables for manufacturing the product. The independence and information axioms were evaluated, supporting axiomatic design as an alternative for integral product and process design: a rational and systematic exercise that facilitates producing products with the quality that future users expect from them.

  4. A methodology to promote business development from research outcomes in food science and technology

    Directory of Open Access Journals (Sweden)

    Eduardo L. Cardoso

    2015-04-01

    Full Text Available Valorization of knowledge produced in research units has been a major challenge for research universities in contemporary societies. The prevailing forces have led these institutions to develop a “third mission”: the facilitation of technology transfer and activity in an entrepreneurial paradigm. Effective management of the challenges encountered in the development of academic entrepreneurship, and the associated valorization of knowledge produced by universities, are major factors in bridging the gap between research and innovation in Europe. The need to improve the existing institutional knowledge valorization processes, concerning entrepreneurship and business development, and the processes required were discussed. A case study was designed to describe the institutional knowledge valorization process in a food science and technology research unit and a related incubator, during a five-year evaluation period that ended in 2012. The knowledge valorization processes benefited from the adoption of a structured framework methodology that led ideas and teams from business model generation to client development, in parallel, when possible, with agile product/service development. Although academic entrepreneurship engagement could be improved, this case study demonstrated that stronger skills development was needed to enable researchers to be more aware of business development fundamentals and therefore contribute to research decisions and the valorisation of individual and institutional knowledge assets. It was noted that the timing for involvement of companies in research projects or programs varied with the nature of the research.

  6. Efficient methodologies for system matrix modelling in iterative image reconstruction for rotating high-resolution PET

    Energy Technology Data Exchange (ETDEWEB)

    Ortuno, J E; Kontaxakis, G; Rubio, J L; Santos, A [Departamento de Ingenieria Electronica (DIE), Universidad Politecnica de Madrid, Ciudad Universitaria s/n, 28040 Madrid (Spain); Guerra, P [Networking Research Center on Bioengineering, Biomaterials and Nanomedicine (CIBER-BBN), Madrid (Spain)], E-mail: juanen@die.upm.es

    2010-04-07

    A fully 3D iterative image reconstruction algorithm has been developed for high-resolution PET cameras composed of pixelated scintillator crystal arrays and rotating planar detectors, based on the ordered subsets approach. The associated system matrix is precalculated with Monte Carlo methods that incorporate physical effects not included in analytical models, such as positron range effects and interaction of the incident gammas with the scintillator material. Custom Monte Carlo methodologies have been developed and optimized for modelling of system matrices for fast iterative image reconstruction adapted to specific scanner geometries, without redundant calculations. According to the methodology proposed here, only one-eighth of the voxels within two central transaxial slices need to be modelled in detail. The rest of the system matrix elements can be obtained with the aid of axial symmetries and redundancies, as well as in-plane symmetries within transaxial slices. Sparse matrix techniques for the non-zero system matrix elements are employed, allowing for fast execution of the image reconstruction process. This 3D image reconstruction scheme has been compared in terms of image quality to a 2D fast implementation of the OSEM algorithm combined with Fourier rebinning approaches. This work confirms the superiority of fully 3D OSEM in terms of spatial resolution, contrast recovery and noise reduction as compared to conventional 2D approaches based on rebinning schemes. At the same time it demonstrates that fully 3D methodologies can be efficiently applied to the image reconstruction problem for high-resolution rotational PET cameras by applying accurate pre-calculated system models and taking advantage of the system's symmetries.
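
    Only the update structure of ordered-subsets EM is illustrated below: the sparse system matrix is random rather than Monte Carlo-derived, the geometry is a toy, and subsets are simply interleaved rows, so none of the paper's detector modelling or symmetry exploitation is reproduced.

        import numpy as np
        from scipy.sparse import random as sparse_random

        # Toy OSEM iteration with a precomputed sparse system matrix A
        # (rows = lines of response, columns = voxels).
        rng = np.random.default_rng(1)
        n_lors, n_vox, n_subsets = 1200, 400, 4
        A = sparse_random(n_lors, n_vox, density=0.02, random_state=1, format="csr")
        x_true = rng.uniform(0.5, 2.0, n_vox)
        y = rng.poisson(A @ x_true)                    # measured counts

        x = np.ones(n_vox)
        for it in range(5):
            for s in range(n_subsets):
                rows = slice(s, n_lors, n_subsets)     # interleaved subset of LORs
                As = A[rows]
                proj = As @ x
                ratio = np.where(proj > 0, y[rows] / np.maximum(proj, 1e-12), 0.0)
                sens = np.asarray(As.sum(axis=0)).ravel()
                x *= np.where(sens > 0, (As.T @ ratio) / np.maximum(sens, 1e-12), 1.0)
        print("final mean relative error:", np.mean(np.abs(x - x_true) / x_true))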

  7. Methodology development to support NPR strategic planning. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-04-01

    This report covers the work performed in support of the Office of New Production Reactors during the nine-month period from January through September 1990. Because of the rapid pace of program activities during this period, the emphasis of the work shifted from strategic planning toward supporting initiatives requiring more immediate consideration and response. Consequently, the work concentrated on researching and helping to identify and resolve the issues considered to be of most immediate concern. Although strongly interrelated, these can be separated into two broad categories. The first category encompasses program-internal concerns. Included are issues associated with the current demand for accelerating staff growth, satisfying the immediate need for appropriate skill and experience levels, team-building efforts necessary to assure the development of an effective operating organization, the ability of people and organizations to satisfactorily understand and execute their assigned roles and responsibilities, and the general facilitation of inter/intra-organization communications and working relationships. The second category encompasses program-execution concerns. These include the efforts required to develop realistic execution plans and to implement appropriate control mechanisms that provide for effective forecasting, planning, managing, and controlling of ongoing (or soon to be ongoing) program substantive activities according to the master integrated schedule and budget.

  8. Summary of FY-1978 consultant input for scenario methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Scott, B.L.; Benson, G.L.; Craig, R.A. (eds.); Harwell, M.A.

    1979-11-01

    Associated with commercial nuclear power production in the United States is the generation of potentially hazardous radioactive waste products. The Department of Energy (DOE), through the National Waste Terminal Storage (NWTS) Program, is seeking to develop nuclear waste isolation systems in geologic formations. These underground waste isolation systems will preclude contact with the biosphere of waste radionuclides in concentrations sufficient to cause deleterious impact on humans or their environments. Comprehensive analyses of specific isolation systems are needed to assess the post-closure expectations of the systems. The Assessment of Effectiveness of Geologic Isolation Systems (AEGIS) program has been established to develop the capability of making those analyses. The assessment of repository post-closure safety has two basic components: identification and analysis of breach scenarios and the pattern of events and processes causing each breach, and identification and analysis of the environmental consequences of radionuclide transport and interactions subsequent to a repository breach. Specific processes and events which might affect potential repository sites, and the rates and probabilities of those phenomena, are presented. Descriptions of the system interactions and synergisms, and of the repository system as an evolving and continuing process, are included. Much of the preliminary information derived from the FY-1978 research effort is summarized in this document. This summary report contains information pertaining to the following areas of study: climatology, geomorphology, glaciology, hydrology, meteorites, sea level fluctuations, structural geology and volcanology.

  9. Develop a Model Component

    Science.gov (United States)

    Ensey, Tyler S.

    2013-01-01

    During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before it is implemented in hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). The model simulates the physical behavior (function, state, limits and I/O) of each end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library component; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is not empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, they are intertwined together into one integrated model. This integrated model is then tested itself, through a test script and autotest, so that it can be concluded that all models work conjointly for a single purpose. The component I was assigned, specifically, was a

  10. The Relationships of Soft Systems Methodology (SSM, Business Process Modeling and e-Government

    Directory of Open Access Journals (Sweden)

    Arief Ramadhan

    2012-01-01

    Full Text Available e-Government has emerged in several countries. Because of the many aspects that must be considered, and because some soft components exist in e-Government, the Soft Systems Methodology (SSM) can be considered for use in the e-Government systems development process. On the other hand, business process modeling is nowadays essential in many fields, including e-Government. Some researchers have used SSM in e-Government, and several studies relating business process modeling to e-Government have been conducted. This paper tries to reveal the relationship between SSM and business process modeling. Moreover, it also tries to explain how business process modeling is integrated within SSM, and further links that integration to e-Government.

  11. Groundwater flow simulations in support of the Local Scale Hydrogeological Description developed within the Laxemar Methodology Test Project

    Energy Technology Data Exchange (ETDEWEB)

    Follin, Sven [SF GeoLogic AB, Stockholm (Sweden); Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2002-05-01

    The deduced Site Descriptive Model of the Laxemar area has been parameterised from a hydraulic point of view and subsequently put into practice in terms of a numerical flow model. The intention of the subproject has been to explore the adaptation of a numerical flow model to site-specific surface and borehole data, and to identify potential needs for development and improvement in the planned modelling methodology and tools. The experiences made during this process and the outcome of the simulations have been presented to the methodology test project group in the course of the project. The discussion and conclusions in this particular report mainly concern two issues: (i) the use of numerical simulations as a means of gaining credibility, e.g. discrimination between alternative geological models, and (ii) calibration and conditioning of probabilistic (Monte Carlo) realisations.

  12. Research Activities on Development of Piping Design Methodology of High Temperature Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Nam-Su [Seoul National Univ. of Science and Technology, Seoul(Korea, Republic of); Won, Min-Gu [Sungkyukwan Univ., Suwon (Korea, Republic of); Oh, Young-Jin [KEPCO Engineering and Construction Co. Inc., Gimcheon (Korea, Republic of); Lee, Hyeog-Yeon; Kim, Yoo-Gon [Korea Atomic Energy Research Institute, Daejeon(Korea, Republic of)

    2016-10-15

    An SFR is operated at high temperature and low pressure compared with a commercial pressurized water reactor (PWR), and such an operating condition leads to time-dependent damage mechanisms such as creep rupture, excessive creep deformation, creep-fatigue interaction and creep crack growth. Thus, high temperature design and structural integrity assessment methodology should be developed considering such failure mechanisms. In terms of the design of mechanical components of an SFR, ASME B&PV Code, Sec. III, Div. 5 and RCC-MRx provide high temperature design and assessment procedures for nuclear structural components operated at high temperature, and a Leak-Before-Break (LBB) assessment procedure for high temperature piping is also provided in RCC-MRx, A16. Three web-based evaluation programs based on the current high temperature codes were developed for structural components of high temperature reactors. Moreover, for detailed LBB analyses of high temperature piping, new engineering methods for predicting the creep C*-integral and the creep COD rate, based either on GE/EPRI or on reference stress concepts, were proposed. Finally, numerical methods based on Garofalo's model and RCC-MRx have been developed and implemented into ABAQUS. The predictions based on both models were compared with experimental results, and it was revealed that the predictions from Garofalo's model described the deformation behavior of Gr. 91 at elevated temperatures fairly successfully.
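
    For reference, Garofalo's model mentioned above is the hyperbolic-sine creep law eps_dot = A * sinh(alpha * sigma)^n * exp(-Q/(R*T)); the snippet evaluates it with placeholder material constants, not fitted Gr. 91 values.

        import numpy as np

        R_GAS = 8.314  # universal gas constant, J/(mol K)

        def garofalo_creep_rate(stress, temp_k, A=1.0e10, alpha=0.01, n=5.0, Q=4.0e5):
            """Minimum creep rate from Garofalo's hyperbolic-sine law:
            eps_dot = A * sinh(alpha * stress)**n * exp(-Q / (R * T)).
            All constants here are placeholders, not fitted Gr. 91 values."""
            return A * np.sinh(alpha * stress)**n * np.exp(-Q / (R_GAS * temp_k))

        # Example: creep rate of a hypothetical steel at 823 K under 120 MPa
        print(garofalo_creep_rate(120.0, 823.0))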

  13. A Methodology for assessing Agile Software Development Approaches

    CERN Document Server

    Soundararajan, Shvetha

    2011-01-01

    Agile methods provide an organization or a team the flexibility to adopt a selected subset of principles and practices based on their culture, their values, and the types of systems that they develop. More specifically, every organization or team implements a customized agile method, tailored to better accommodate its needs. However, the extent to which a customized method supports the organizational objectives, or rather the 'goodness' of that method is questionable. Existing agile assessment approaches focus on a comparative analysis, or are limited in scope and application. In this research, we propose a structured, systematic and comprehensive approach to assess the 'goodness' of agile methods. We examine an agile method based on (1) its adequacy, (2) the capability of the organization to support the adopted principles and practices specified by the method, and (3) the method's effectiveness. We propose the Objectives, Principles and Practices (OPP) Framework to guide our assessment. The Framework identif...

  14. Development of probabilistic rigid pavement design methodologies for military airfields

    Science.gov (United States)

    Witczak, M. W.; Uzan, J.; Johnson, M.

    1983-12-01

    The current Corps of Engineers design procedure for rigid airfield pavements is based on the Westergaard free-edge stress slab theory, and a proposed procedure is based on multilayer elastic theory. These two design procedures have been expanded to airfield pavement designs expressed in probabilistic and reliability terms. Further developments were required in these procedures to make the analysis more practicable. Two major investigations were conducted: (1) evaluation and use of the composite modulus of elasticity for layers beneath the rigid pavement, and (2) evaluation of the maximum tensile stress at the bottom of the slab for different aircraft types. Derivations obtained from the investigation of the composite modulus and maximum tensile stress are reported and are included in computer programs for probabilistic/reliability analysis of rigid pavements. The approximate closed form (Taylor series expansion) is utilized. Example runs of the computer program are presented.
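
    The "approximate closed form (Taylor series expansion)" mentioned above is essentially a first-order second-moment calculation; a minimal sketch, assuming independent inputs and a simple strength-minus-stress limit state with illustrative numbers rather than Corps of Engineers design values:

        import numpy as np

        # First-order (Taylor series) propagation of input means and
        # standard deviations through the limit state g = strength - stress.
        mu  = {"flex_strength": 5.0, "edge_stress": 3.2}   # MPa, illustrative
        sig = {"flex_strength": 0.6, "edge_stress": 0.45}

        def g(flex_strength, edge_stress):
            return flex_strength - edge_stress

        # Partial derivatives of g evaluated at the means (here simply +1, -1)
        grads = {"flex_strength": 1.0, "edge_stress": -1.0}

        mean_g = g(**mu)
        var_g = sum((grads[k] * sig[k])**2 for k in mu)   # independence assumed
        beta = mean_g / np.sqrt(var_g)                    # reliability index
        print("reliability index beta =", round(beta, 2))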

  15. Sharing on Web 3d Models of Ancient Theatres. a Methodological Workflow

    Science.gov (United States)

    Scianna, A.; La Guardia, M.; Scaduto, M. L.

    2016-06-01

    In the last few years, the need to share knowledge of Cultural Heritage (CH) on the Web through navigable 3D models has increased. This need requires the availability of Web-based virtual reality systems and 3D WebGIS. In order to make the information available to all stakeholders, these instruments should be powerful and at the same time very user-friendly. However, research and experiments carried out so far show that a standardized methodology does not exist. This is due both to the complexity and size of the geometric models to be published and to the excessive costs of hardware and software tools. In light of this background, the paper describes a methodological approach for creating 3D models of CH, freely exportable on the Web, based on HTML5 and free and open source software. HTML5, supporting the WebGL standard, allows the exploration of 3D spatial models using the most widely used Web browsers, such as Chrome, Firefox, Safari and Internet Explorer. The methodological workflow described here has been tested in the construction of a multimedia geo-spatial platform developed for the three-dimensional exploration and documentation of the ancient theatres of Segesta and of Carthage, and the surrounding landscapes. The experimental application has allowed us to explore the potential and limitations of sharing 3D CH models on the Web based on the WebGL standard. Sharing capabilities could be extended by defining suitable geospatial Web services based on the capabilities of HTML5 and WebGL technology.

  16. Methodological choices for the clinical development of medical devices

    Directory of Open Access Journals (Sweden)

    Bernard A

    2014-09-01

    reduce the number of patients, but are not applicable when a learning curve is required. Sequential trials have the advantage of allowing a trial to be stopped early depending on the results of first inclusions, but they require an independent committee. Bayesian methods combine existing information with information from the ongoing trial. These methods are particularly useful in situations where the number of subjects is small. The disadvantage is the risk of including erroneous prior information. Other types of experimental designs exist when conventional trials cannot always be applied to the clinical development of MDs. Keywords: medical device, randomized controlled trials, assessment, clinical development

  17. Development of a methodology for defining whole-building energy design targets for commercial buildings: Phase 2, Development Concept Stage Report

    Energy Technology Data Exchange (ETDEWEB)

    Deringer, J.J. (American Inst. of Architects, Washington, DC (USA)); Hall, J.D. (Deringer Group, Riva, MD (USA)); Jones, J.W. (American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc., New York, NY (USA)); McKay, H.N. (Illuminating Engineering Society of North America, New York, NY (USA)); Alley, P.K. (Pacific Northwest Lab., Richland, WA (USA))

    1990-09-01

    The primary focus of the Whole-Building Energy Design Targets project is to develop a flexible methodology for setting target guidelines with which to assess energy efficiency in commercial building design. The proposed methodology has several innovative features. In this report, the authors document their work to define the software development concepts upon which the overall Targets methodology will be based. Three task reports are included here. Development of the user interface--that critical connection through which the human end-user (architect, engineer, planner, owner) will apply the methodology--is described in Section 2. In Section 3, the use of the software engineering process in Targets model development efforts is described. Section 4 provides details on the data and system integration task, in which interactions between and among all the major components, termed modules, of the Targets model were examined to determine how to put them together to create a methodology that is effective and easy to use. 4 refs., 26 figs.

  18. Evaluation of a proposed expert system development methodology: Two case studies

    Science.gov (United States)

    Gilstrap, Lewey

    1990-01-01

    Two expert system development projects were studied to evaluate a proposed Expert Systems Development Methodology (ESDM). The ESDM was developed to provide guidance to managers and technical personnel and serve as a standard in the development of expert systems. It was agreed that the proposed ESDM must be evaluated before it could be adopted; therefore a study was planned for its evaluation. This detailed study is now underway. Before the study began, however, two ongoing projects were selected for a retrospective evaluation. They were the Ranging Equipment Diagnostic Expert System (REDEX) and the Backup Control Mode Analysis and Utility System (BCAUS). Both projects were approximately 1 year into development. Interviews of project personnel were conducted, and the resulting data was used to prepare the retrospective evaluation. Decision models of the two projects were constructed and used to evaluate the completeness and accuracy of key provisions of ESDM. A major conclusion reached from these case studies is that suitability and risk analysis should be required for all AI projects, large and small. Further, the objectives of each stage of development during a project should be selected to reduce the next largest area of risk or uncertainty on the project.

  19. Geared rotor dynamic methodologies for advancing prognostic modeling capabilities in rotary-wing transmission systems

    Science.gov (United States)

    Stringer, David Blake

    The overarching objective in this research is the development of a robust, rotor dynamic, physics based model of a helicopter drive train as a foundation for the prognostic modeling for rotary-wing transmissions. Rotorcrafts rely on the integrity of their drive trains for their airworthiness. Drive trains rely on gear technology for their integrity and function. Gears alter the vibration characteristics of a mechanical system and significantly contribute to noise, component fatigue, and personal discomfort prevalent in rotorcraft. This research effort develops methodologies for generating a rotor dynamic model of a rotary-wing transmission based on first principles, through (i) development of a three-dimensional gear-mesh stiffness model for helical and spur gears and integration of this model in a finite element rotor dynamic model, (ii) linear and nonlinear analyses of a geared system for comparison and validation of the gear-mesh model, (iii) development of a modal synthesis technique for potentially providing model reduction and faster analysis capabilities for geared systems, and (iv) extension of the gear-mesh model to bevel and epicyclic configurations. In addition to model construction and validation, faults indigenous to geared systems are presented and discussed. Two faults are selected for analysis and seeded into the transmission model. Diagnostic vibration parameters are presented and used as damage indicators in the analysis. The fault models produce results consistent with damage experienced during experimental testing. The results of this research demonstrate the robustness of the physics-based approach in simulating multiple normal and abnormal conditions. The advantages of this physics-based approach, when combined with contemporary probabilistic and time-series techniques, provide a useful method for improving health monitoring technologies in mechanical systems.
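
    To make the gear-mesh coupling concrete, a deliberately tiny sketch: two single-DOF shaft models tied together by one mesh stiffness acting along the line of action, from which coupled natural frequencies follow. All values are arbitrary; the work above uses a full three-dimensional finite element formulation rather than this 2-DOF toy.

        import numpy as np

        # A single mesh stiffness k_mesh couples one DOF of each rotor,
        # appearing on the diagonal of both and as an off-diagonal term.
        k_shaft1, k_shaft2, k_mesh = 2.0e6, 3.0e6, 5.0e7   # N/m, illustrative

        K = np.array([[k_shaft1 + k_mesh, -k_mesh],
                      [-k_mesh, k_shaft2 + k_mesh]])
        M = np.diag([5.0, 8.0])                            # lumped masses, kg (assumed)

        eigvals = np.linalg.eigvals(np.linalg.inv(M) @ K)
        freqs_hz = np.sqrt(np.sort(eigvals.real)) / (2.0 * np.pi)
        print("coupled natural frequencies [Hz]:", freqs_hz.round(1))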

  20. Methodological choices for the clinical development of medical devices.

    Science.gov (United States)

    Bernard, Alain; Vaneau, Michel; Fournel, Isabelle; Galmiche, Hubert; Nony, Patrice; Dubernard, Jean Michel

    2014-01-01

    clinical development of MDs.

  1. Pediatric hospital medicine core competencies: development and methodology.

    Science.gov (United States)

    Stucky, Erin R; Ottolini, Mary C; Maniscalco, Jennifer

    2010-01-01

    Pediatric hospital medicine is the most rapidly growing site-based pediatric specialty. There are over 2500 unique members in the three core societies in which pediatric hospitalists are members: the American Academy of Pediatrics (AAP), the Academic Pediatric Association (APA) and the Society of Hospital Medicine (SHM). Pediatric hospitalists are fulfilling both clinical and system improvement roles within varied hospital systems. Defined expectations and competencies for pediatric hospitalists are needed. In 2005, SHM's Pediatric Core Curriculum Task Force initiated the project and formed the editorial board. Over the subsequent four years, multiple pediatric hospitalists belonging to the AAP, APA, or SHM contributed to the content of and guided the development of the project. Editors and collaborators created a framework for identifying appropriate competency content areas. Content experts from both within and outside of pediatric hospital medicine participated as contributors. A number of selected national organizations and societies provided valuable feedback on chapters. The final product was validated by formal review from the AAP, APA, and SHM. The Pediatric Hospital Medicine Core Competencies were created. They include 54 chapters divided into four sections: Common Clinical Diagnoses and Conditions, Core Skills, Specialized Clinical Services, and Healthcare Systems: Supporting and Advancing Child Health. Each chapter can be used independently of the others. Chapters follow the knowledge, skills, and attitudes educational curriculum format, and have an additional section on systems organization and improvement to reflect the pediatric hospitalist's responsibility to advance systems of care. These competencies provide a foundation for the creation of pediatric hospital medicine curricula and serve to standardize and improve inpatient training practices. (c) 2010 Society of Hospital Medicine.

  2. Development of infrared methodologies applied to the determination of metformin hydrochloride in pharmaceutical formulations

    Directory of Open Access Journals (Sweden)

    Graciele Parisotto

    2008-12-01

    Full Text Available In this work, 20 commercial metformin hydrochloride pill samples and 14 synthetic samples were used, and their spectra were acquired in the 4000–650 cm-1 range by Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS). Multivariate calibration models were built and compared using the whole spectral region (global PLS model) and optimized models applying FTIR spectral sub-regions (iPLS and siPLS models). The best results were obtained using the siPLS algorithm, dividing the spectrum into 40 intervals and combining 3 intervals (37, 39 and 40) with 4 latent variables, which also presented an RMSEP value lower than the ones obtained using the global model and the iPLS algorithm. Finally, the application of selection techniques for infrared spectral regions (iPLS and siPLS) has proven efficient for developing simpler, faster and non-destructive methodologies to analyze pharmaceutical pill formulations containing metformin, highlighting the potential of these techniques for the control and inspection of industrialized medicines.
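
    A compact sketch of the interval-selection idea behind iPLS, using scikit-learn: one PLS model per spectral interval, ranked by cross-validated RMSE. The spectra and concentrations are simulated stand-ins, and a full siPLS search would additionally score combinations of intervals (such as 37, 39 and 40 above).

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        # Simulated spectra: 34 samples x 800 wavenumber channels, with the
        # analyte signal concentrated in channels 600-620.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(34, 800))
        y = X[:, 600:620].mean(axis=1) + rng.normal(0, 0.05, 34)

        n_intervals = 40
        edges = np.linspace(0, X.shape[1], n_intervals + 1, dtype=int)
        best = None
        for i in range(n_intervals):
            Xi = X[:, edges[i]:edges[i + 1]]
            pls = PLSRegression(n_components=4)
            pred = cross_val_predict(pls, Xi, y, cv=5).ravel()
            rmse = np.sqrt(np.mean((pred - y)**2))
            if best is None or rmse < best[1]:
                best = (i + 1, rmse)
        print("best interval:", best[0], "RMSECV:", round(best[1], 4))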

  3. Validation of multi-body modelling methodology for reconfigurable underwater robots

    DEFF Research Database (Denmark)

    Nielsen, M.C.; Eidsvik, O. A.; Blanke, Mogens;

    2016-01-01

    This paper investigates the problem of employing reconfigurable robots in an underwater setting. The main result presented is the experimental validation of a modelling methodology for a system consisting of N dynamically connected robots with heterogeneous dynamics. Two distinct types of experiments are performed, a series of hydrostatic free-decay tests and a series of open-loop trajectory tests. The results are compared to a simulation based on the modelling methodology. The modelling methodology shows promising results for usage with systems composed of reconfigurable underwater modules. The purpose of the model is to enable design of control strategies for cooperative reconfigurable underwater systems.

  4. Core melt progression and consequence analysis methodology development in support of the Savannah River Reactor PSA

    Energy Technology Data Exchange (ETDEWEB)

    O' Kula, K.R.; Sharp, D.A. (Westinghouse Savannah River Co., Aiken, SC (United States)); Amos, C.N.; Wagner, K.C.; Bradley, D.R. (Science Applications International Corp., Albuquerque, NM (United States))

    1992-01-01

    A three-level Probabilistic Safety Assessment (PSA) of production reactor operation has been underway since 1985 at the US Department of Energy's Savannah River Site (SRS). The goals of this analysis are to: analyze existing margins of safety provided by the heavy-water reactor (HWR) design challenged by postulated severe accidents; compare measures of risk to the general public and onsite workers to guideline values, as well as to those posed by commercial reactor operation; and develop the methodology and database necessary to prioritize improvements to engineering safety systems and components, operator training, and engineering projects that contribute significantly to improving plant safety. PSA technical staff from the Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) have performed the assessment despite two obstacles: a variable baseline plant configuration and power level, and a lack of technically applicable code methodology to model the SRS reactor conditions. This paper discusses the detailed effort necessary to modify the requisite codes before accident analysis insights for the risk assessment were obtained.

  5. DEVELOPMENT OF METHODOLOGY FOR DESIGNING TESTABLE COMPONENT STRUCTURE OF DISCIPLINARY COMPETENCE

    Directory of Open Access Journals (Sweden)

    Vladimir I. Freyman

    2014-01-01

    Full Text Available The aim of the study is to present new methods for assessing the quality of educational outcomes in accordance with the requirements of the third-generation Federal State Educational Standards (FSES) developed for higher education. The urgency of finding adequate tools for measuring the quality of competences, and of the elements formed in the course of specialists' training, is specified. The interplay of competency components such as knowledge, abilities and possession must be considered in order to make the assessment of students' achievements within a separate discipline or curriculum section more convenient, effective and exact. In modelling the component structure of disciplinary competence, a testable design of components is used, an approach borrowed from technical diagnostics. The research outcomes include the definition and analysis of a general iterative methodology for the testable design of the component structure of disciplinary competence. Application of the proposed methodology is illustrated with the example of an abstract academic discipline with specified data and labour-requirement indices. Restrictions of the methodology are noted, and practical recommendations are given. Scientific novelty: basic data and a detailed step-by-step implementation phase of the proposed common iterative approach to the development of a testable component structure of disciplinary competence are considered; tests and diagnostic tables for different design options are proposed. Practical significance: the research findings can help promote learning efficiency, the choice of adequate control devices, accuracy of assessment, and the efficient use of personnel, temporal and material resources of higher education institutions. The proposed algorithms, methods and approaches to organizing and implementing the control of developed competences and their components can be used as a methodological basis while

  6. Using digital models for evaluation of effective solar radiance. Development of methodology and practice application; Empleo de modelos digitales del terreno para la evaluacion de la radiacionsolar effectiva. Desarrollo de una metodologia y aplicacion practica

    Energy Technology Data Exchange (ETDEWEB)

    Izco, E.; De Blas, M.; Torres, J. L.; Garcia, R.

    2004-07-01

    This communication describes the use of advanced tools for determining the effective solar radiance, and its possible passive and active (thermal or photovoltaic) exploitation, in various areas of buildings or urban zones. Based on cartographic information from a Digital Elevation Model and on the ECOTEC v5.20 software, an hourly treatment of illuminated and shaded zones has been carried out for several days of the year. In a case study it was shown that the ECOTEC v5.20 software can work with the Digital Elevation Model of the Public University of Navarra, and illuminated and shaded zones were analysed visually and quantitatively for two days of the year, the summer and winter solstices. (Author)

  7. Testing spectral models for stellar populations with star clusters - I. Methodology

    Science.gov (United States)

    Cid Fernandes, Roberto; González Delgado, Rosa M.

    2010-04-01

    High-resolution spectral models for simple stellar populations (SSP) developed in the past few years have become a standard ingredient in studies of stellar population of galaxies. As more such models become available, it becomes increasingly important to test them. In this and a companion paper, we test a suite of publicly available evolutionary synthesis models using integrated optical spectra in the blue-near-UV range of 27 well-studied star clusters from the work of Leonardi and Rose spanning a wide range of ages and metallicities. Most (23) of the clusters are from the Magellanic Clouds. This paper concentrates on the methodological aspects of spectral fitting. The data are fitted with SSP spectral models from Vazdekis and collaborators, based on the Medium-resolution INT Library of Empirical Spectra. Best-fitting and Bayesian estimates of age, metallicity and extinction are presented, and degeneracies between these parameters are mapped. We find that these models can match the observed spectra very well in most cases, with small formal uncertainties in t,Z and AV. In some cases, the spectral fits indicate that the models lack a blue old population, probably associated with the horizontal branch. This methodology, which is mostly based on the publicly available code STARLIGHT, is extended to other sets of models in Paper II, where a comparison with properties derived from spatially resolved data (colour-magnitude diagrams) is presented. The global aim of these two papers is to provide guidance to users of evolutionary synthesis models and empirical feedback to model makers.

  8. Development and extraction optimization of baicalein and pinostrobin from Scutellaria violacea through response surface methodology

    Directory of Open Access Journals (Sweden)

    Shankar Subramaniam

    2015-01-01

    Full Text Available Objective: To optimize the extraction of baicalein and pinostrobin from the hydro-methanolic extract of the leaves of Scutellaria violacea by response surface methodology (RSM). Materials and Methods: The combined influence of various extraction parameters on the extraction yield was investigated by adopting a Box-Behnken experimental design. Preliminary experiments, carried out using traditional one-variable-at-a-time optimization, revealed four operational parameters that play a crucial role in influencing the yield. These four process parameters at three levels were used to construct the Box-Behnken experimental design. Results: The RSM-based model fitted to the resulting experimental data suggested that 52.3% methanol/water, a 12.46:1 solvent-solid ratio, 285 rpm agitation and 6.07 h of extraction time are the optimal conditions, yielding maximum amounts of baicalein and pinostrobin of 2.9 and 4.05 mg/g DM, respectively. Analysis of variance revealed high correlation coefficients (R2 = 0.999 for baicalein and 0.994 for pinostrobin), signifying a good fit between the second-order regression model and the experimental observations. Conclusion: The present study indicates that both metabolites have been extracted from S. violacea for the first time. Further, this study developed an optimized extraction procedure to obtain maximum yields of the metabolites, which is better than the conventional extraction methodology. The operational parameters under optimized conditions account for the lowest cost of the extraction process, thus providing an efficient, rapid and cost-effective method for the isolation and scale-up of these commercially important flavonoids.

  9. The Idea of National HRD: An Analysis Based on Economics and Theory Development Methodology

    Science.gov (United States)

    Wang, Greg G.; Swanson, Richard A.

    2008-01-01

    Recent human resource development (HRD) literature focuses attention on national HRD (NHRD) research and represents problems in both HRD identity and research methodology. Based on a review of development economics and international development literature, this study analyzes the existing NHRD literature with respect to the theory development…

  10. State-space models for bio-loggers: A methodological road map

    DEFF Research Database (Denmark)

    Jonsen, I.D.; Basson, M.; Bestley, S.

    2012-01-01

    …bio-physical datasets to understand physiological and ecological influences on habitat selection. In most cases, however, the behavioural context is not directly observable and therefore must be inferred. Animal movement data are complex in structure, entailing a need for stochastic analysis methods. The recent development of state-space modelling approaches for animal movement data provides statistical rigor for inferring hidden behavioural states, relating these states to bio-physical data, and ultimately for predicting the potential impacts of climate change. Despite the widespread utility, and current popularity, of state-space models for analysis of animal tracking data, these tools are not simple and require considerable care in their use. Here we develop a methodological “road map” for ecologists by reviewing currently available state-space implementations. We discuss appropriate use of state-space methods…
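
    The simplest state-space machinery underlying this road map is the linear-Gaussian model with a Kalman filter; the sketch below filters a simulated one-dimensional track (position-velocity state, noisy position observations). Real animal-movement implementations add hidden behavioural switching, nonlinearity and irregular sampling, all omitted here.

        import numpy as np

        F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (unit time step)
        H = np.array([[1.0, 0.0]])               # observe position only
        Q = 0.01 * np.eye(2)                     # process noise covariance
        Rn = np.array([[0.25]])                  # observation noise covariance

        rng = np.random.default_rng(3)
        x = np.array([0.0, 0.5])
        obs = []
        for _ in range(50):                      # simulate a noisy track
            x = F @ x + rng.multivariate_normal([0.0, 0.0], Q)
            obs.append((H @ x)[0] + rng.normal(0.0, 0.5))

        m, P = np.zeros(2), np.eye(2)            # filter initialisation
        for z in obs:
            m, P = F @ m, F @ P @ F.T + Q        # predict
            S = H @ P @ H.T + Rn
            K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
            m = m + K @ (np.array([z]) - H @ m)  # update mean
            P = (np.eye(2) - K @ H) @ P          # update covariance
        print("final position/velocity estimate:", m.round(3))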

  11. Parallelization of fine-scale computation in Agile Multiscale Modelling Methodology

    Science.gov (United States)

    Macioł, Piotr; Michalik, Kazimierz

    2016-10-01

    Nowadays, multiscale modelling of material behavior is an extensively developed area. An important obstacle to its wide application is its high computational demand. Among other solutions, the parallelization of multiscale computations is promising. Heterogeneous multiscale models are good candidates for parallelization, since communication between sub-models is limited. In this paper, the possibility of parallelizing multiscale models based on the Agile Multiscale Methodology framework is discussed. A sequential, FEM-based macroscopic model has been combined with concurrently computed fine-scale models employing the MatCalc thermodynamic simulator. The main issues investigated in this work are: (i) the speed-up of multiscale models, with special focus on fine-scale computations, and (ii) the decrease in computational quality enforced by parallel execution. Speed-up has been evaluated on the basis of Amdahl's law. The problem of `delay error', arising from the parallel execution of fine-scale sub-models controlled by the sequential macroscopic sub-model, is discussed. Some technical aspects of combining third-party commercial modelling software with an in-house multiscale framework and an MPI library are also discussed.
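
    Since the speed-up evaluation above rests on Amdahl's law, a minimal helper makes the scaling limit explicit; the 90% parallelizable fraction standing in for the fine-scale share of run time is an assumption, not a figure from the paper.

        def amdahl_speedup(parallel_fraction: float, n_workers: int) -> float:
            """Amdahl's law: overall speed-up when only part of the workload
            (here, the fine-scale sub-model computations) runs in parallel."""
            return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_workers)

        # E.g. if 90% of a multiscale run is spent in fine-scale models (assumed):
        for n in (2, 8, 32, 128):
            print(n, "workers ->", round(amdahl_speedup(0.9, n), 2), "x")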

  12. A consistent modelling methodology for secondary settling tanks in wastewater treatment.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Nopens, Ingmar

    2011-03-01

    The aim of this contribution is partly to build consensus on a consistent modelling methodology (CMM) of complex real processes in wastewater treatment by combining classical concepts with results from applied mathematics, and partly to apply it to the clarification-thickening process in the secondary settling tank. In the CMM, the real process should be approximated by a mathematical model (process model; ordinary or partial differential equation (ODE or PDE)), which in turn is approximated by a simulation model (numerical method) implemented on a computer. These steps have often not been carried out in a correct way. The secondary settling tank was chosen as a case since this is one of the most complex processes in a wastewater treatment plant and simulation models developed decades ago have no guarantee of satisfying fundamental mathematical and physical properties. Nevertheless, such methods are still used in commercial tools to date. This particularly becomes of interest as the state-of-the-art practice is moving towards plant-wide modelling. Then all submodels interact and errors propagate through the model and severely hamper any calibration effort and, hence, the predictive purpose of the model. The CMM is described by applying it first to a simple conversion process in the biological reactor yielding an ODE solver, and then to the solid-liquid separation in the secondary settling tank, yielding a PDE solver. Time has come to incorporate established mathematical techniques into environmental engineering, and wastewater treatment modelling in particular, and to use proven reliable and consistent simulation models.
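
    As one concrete instance of the process-model/simulation-model distinction discussed above, the sketch below discretizes the one-dimensional batch sedimentation law phi_t + f(phi)_z = 0 with a first-order Godunov scheme and a Kynch-type batch flux. Parameters and initial data are illustrative, and production clarifier models add compression, inlet and outlet terms omitted here.

        import numpy as np

        v0, phi_max, nrz = 1.0e-3, 1.0, 2.5           # illustrative flux parameters

        def flux(phi):
            """Kynch-type batch flux f(phi) = v0 * phi * (1 - phi/phi_max)^nrz."""
            return v0 * phi * np.maximum(1.0 - phi / phi_max, 0.0)**nrz

        def godunov(phi_l, phi_r):
            """Godunov numerical flux for a possibly non-monotone f."""
            s = np.linspace(min(phi_l, phi_r), max(phi_l, phi_r), 64)
            return flux(s).min() if phi_l <= phi_r else flux(s).max()

        nz, depth, t_end = 100, 1.0, 600.0
        dz = depth / nz
        phi = np.full(nz, 0.1)                        # uniform initial concentration
        dt = 0.5 * dz / v0                            # CFL restriction (|f'| <= v0 here)
        t = 0.0
        while t < t_end:
            F = np.zeros(nz + 1)                      # zero flux through top and bottom
            for j in range(1, nz):
                F[j] = godunov(phi[j - 1], phi[j])
            phi -= dt / dz * (F[1:] - F[:-1])         # conservative update
            t += dt
        print("concentrations near the bottom:", phi[-5:].round(3))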

  13. An analysis of complex multiple-choice science-technology-society items: Methodological development and preliminary results

    Science.gov (United States)

    Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; Acevedo-Díaz, José-Antonio

    2006-07-01

    The scarce attention to assessment and evaluation in science education research has been especially harmful for the teaching of science-technology-society (STS) issues, due to the dialectical, tentative, value-laden, and polemic nature of most STS topics. This paper tackles the methodological difficulties of the instruments that monitor views related to STS topics and rationalizes a quantitative methodology and an analysis technique to improve the utility of an empirically developed multiple-choice item pool, the Questionnaire of Opinions on STS. This methodology embraces item-scaling psychometrics based on judgments by a panel of experts, a multiple response model, a scoring system, and the data analysis. The methodology finally produces normalized attitudinal indices that represent the respondent's reasoned beliefs toward STS statements, the respondent's position on an item that comprises several statements, or the respondent's position on an entire STS topic that encompasses a set of items. Some preliminary results show the methodology's ability to evaluate STS attitudes both qualitatively and quantitatively and to support statistical hypothesis testing. Lastly, some applications for teacher training and STS curriculum development in science classrooms are discussed.

  14. Establishment of Requirements and Methodology for the Development and Implementation of GreyMatters, a Memory Clinic Information System.

    Science.gov (United States)

    Tapuria, Archana; Evans, Matt; Curcin, Vasa; Austin, Tony; Lea, Nathan; Kalra, Dipak

    2017-01-01

    The aim of the paper is to establish the requirements and methodology for the development process of GreyMatters, a memory clinic system, outlining the conceptual, practical, technical and ethical challenges, and the experiences of capturing clinical and research oriented data, along with the implementation of the system. The methodology for development of the information system involved phases of requirements gathering, modeling and prototype creation, and 'bench testing' the prototype with experts. The standard Institute of Electrical and Electronics Engineers (IEEE) recommended approach for the specification of software requirements was adopted. An electronic health record (EHR) standard, EN 13606, was used, clinical modelling was done through archetypes, and the project complied with data protection and privacy legislation. The requirements for GreyMatters were established. Though the initial development was complex, the requirements, methodology and standards adopted made the construction, deployment, adoption and population of a memory clinic and research database feasible. The electronic patient data, including the assessment scales, provide a rich source of objective data for audits and research, to establish study feasibility and to identify potential participants for clinical trials. The establishment of requirements and methodology, addressing issues of data security and confidentiality, future data compatibility and interoperability, and medico-legal aspects such as access controls and audit trails, led to a robust and useful system. The evaluation supports that the system is an acceptable tool for clinical, administrative, and research use and forms a useful part of the wider information architecture.

  15. CALS and the Product State Model - Methodology and Supporting Schools and Paradigms

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1998-01-01

    This paper addresses the preliminary considerations in a research project, initiated February 1997, regarding Continuous Acquisition and Life-cycle Support (CALS), which is part of the activities in CALS Center Denmark. The CALS concept is presented focusing on the Product State Model (PSM). The PSM incorporates relevant information about each stage of the production process. The paper describes the research object and the model object, and discusses part of the methodology in developing a Product State Model. The project is primarily technological; however, organisational and human aspects will be considered, as the intention is that a prototype should be implemented in the production line at Odense Steel Shipyard. Hence, a Multiview approach will be considered, incorporating the informational needs of many actors/machines. Parameter identification, i.e. describing the parameters which the PSM...

  16. Modeling Customer Loyalty by System Dynamics Methodology (Case Study: Internet Service Provider Company)

    Directory of Open Access Journals (Sweden)

    Alireza Bafandeh Zendeh

    2016-03-01

    Full Text Available Due to the complexity of customer loyalty, we provide a conceptual model to explain it in an Internet service provider company using a system dynamics approach. To do so, customer loyalty in the statistical population was analyzed according to Sterman's modeling methodology. First, the reference modes (the historical behavior of customer loyalty) were evaluated. Then dynamic hypotheses were developed by utilizing causal-loop diagrams and stock-flow maps, based on the theoretical literature. In the third stage, the initial conditions of variables, parameters, and the mathematical functions between them were estimated. The model was tested; finally, scenarios of advertising, service quality improvement, and continuation of the current situation were evaluated. Results showed that the service quality improvement scenario is more effective than the others.
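
    The stock-and-flow structure central to Sterman-style modelling can be sketched with a simple Euler integration; the structure and all rate constants below are hypothetical illustrations, not the paper's calibrated model.

```python
# Minimal stock-and-flow sketch of customer loyalty (Euler integration).
# Structure and rates are hypothetical, not the paper's calibrated model.
dt, horizon = 0.25, 24.0           # time step and horizon [months]
customers, churn_rate = 10_000.0, 0.05
acquisition_per_month = 400.0      # driven by e.g. advertising spend

t = 0.0
while t < horizon:
    inflow = acquisition_per_month          # flow in  [customers/month]
    outflow = churn_rate * customers        # flow out [customers/month]
    customers += (inflow - outflow) * dt    # update the stock
    t += dt
print(f"Customers after {horizon:.0f} months: {customers:,.0f}")
```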

  17. Artificial Neural Network and Response Surface Methodology Modeling in Ionic Conductivity Predictions of Phthaloylchitosan-Based Gel Polymer Electrolyte

    Directory of Open Access Journals (Sweden)

    Ahmad Danial Azzahari

    2016-01-01

    Full Text Available A gel polymer electrolyte system based on phthaloylchitosan was prepared. The effects of process variables, such as lithium iodide, caesium iodide, and 1-butyl-3-methylimidazolium iodide, were investigated using a distance-based ternary mixture experimental design. A comparative approach was made between response surface methodology (RSM) and artificial neural network (ANN) modeling to predict the ionic conductivity. The predictive capabilities of the two methodologies were compared in terms of the coefficient of determination R2 based on the validation data set. It was shown that the developed ANN model had a better predictive outcome than the RSM model.
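
    The RSM-versus-ANN comparison by validation R2 can be sketched as follows, on synthetic data and with placeholder model choices rather than the paper's design points.

```python
# Hedged sketch: compare an RSM-style quadratic model with a small ANN by
# validation R^2. Data and model settings are synthetic placeholders.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(120, 3))          # e.g. three salt fractions
y = 2*X[:, 0] + X[:, 1]*X[:, 2] - X[:, 0]**2 + rng.normal(0, 0.05, 120)

X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)

for name, model in (("RSM", rsm), ("ANN", ann)):
    model.fit(X_tr, y_tr)
    print(name, "validation R^2 =", round(r2_score(y_va, model.predict(X_va)), 3))
```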

  18. A data-driven multi-model methodology with deep feature selection for short-term wind forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias; Zhang, Jie

    2017-03-01

    With growing wind penetration into power systems worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by the first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks, and its effectiveness is evaluated in providing 1-hour-ahead wind speed forecasts at seven locations of the Surface Radiation network. Numerical results show that, compared to single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves forecasting accuracy by up to 30%.
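
    A hedged sketch of the two-layer idea using scikit-learn stacking; the models, features and data below are placeholders rather than the paper's deep-feature-selected inputs.

```python
# Two-layer ensemble sketch: first-layer learners produce forecasts that a
# second-layer blender combines. All choices here are illustrative.
import numpy as np
from sklearn.ensemble import StackingRegressor, RandomForestRegressor
from sklearn.svm import SVR
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))                 # lagged wind features (assumed)
y = X[:, 0] + 0.5*np.sin(X[:, 1]) + rng.normal(0, 0.1, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

blender = StackingRegressor(
    estimators=[("rf", RandomForestRegressor(random_state=1)), ("svr", SVR())],
    final_estimator=Ridge(),                  # second-layer blending model
)
blender.fit(X_tr, y_tr)
print("hold-out R^2:", round(blender.score(X_te, y_te), 3))
```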

  19. A new methodology for dynamic modelling of health risks arising from wastewater influenced urban flooding

    Science.gov (United States)

    Jørgensen, Claus; Mark, Ole; Djordjevic, Slobodan; Hammond, Michael; Khan, David M.; Erichsen, Anders; Dorrit Enevoldsen, Ann; Heinicke, Gerald; Helwigh, Birgitte

    2015-04-01

    Introduction: Urban flooding due to rainfall exceeding the design capacity of drainage systems is a global problem with significant economic and social consequences. While the cost of the direct damage from urban flooding is well understood, indirect damage, such as waterborne disease, is in general still poorly understood. Climate change is expected to increase the frequency of urban flooding in many countries, which is likely to increase waterborne disease. Diarrheal diseases are most prevalent in developing countries, where poor sanitation, poor drinking water and poor surface water quality cause a high disease burden and mortality, especially during floods. The level of waterborne diarrhea in countries with well-developed water and wastewater infrastructure has been reduced to an acceptable level, and the population in general does not consider wastewater a health risk. Nevertheless, exposure to wastewater-influenced urban flood water still has the potential to cause transmission of diarrheal diseases. When managing urban flooding and planning urban climate change adaptations, health risks are rarely taken into consideration. This paper outlines a novel methodology for linking dynamic urban flood modelling with Quantitative Microbial Risk Assessment (QMRA). This provides a unique possibility for understanding the interaction between urban flooding and the health risks caused by direct human contact with flood water, and provides an option for reducing the burden of disease in the population through intelligent urban flood risk management. Methodology: We have linked hydrodynamic urban flood modelling with quantitative microbial risk assessment (QMRA) to determine the risk of infection caused by exposure to wastewater-influenced urban flood water. The deterministic model MIKE Flood, which integrates the sewer network model in MIKE Urban and the 2D surface model MIKE 21, was used to calculate the concentration of pathogens in the
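
    The QMRA step typically converts an ingested pathogen dose into an infection probability through a dose-response model; a minimal sketch using the standard exponential model follows. The parameter value and exposure volume are illustrative assumptions, not values from the paper.

```python
# Exponential dose-response model, a standard QMRA building block:
# P_inf = 1 - exp(-r * dose). All numbers below are assumed for illustration.
import math

r = 0.0199                  # pathogen-specific parameter (assumed)
concentration = 1.0e3       # pathogens per litre of flood water (assumed)
ingested_volume = 0.01      # litres swallowed per flood exposure (assumed)

dose = concentration * ingested_volume
p_infection = 1.0 - math.exp(-r * dose)
print(f"dose = {dose:.1f} organisms, P(infection) = {p_infection:.3f}")
```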

  20. An Effective Methodology with Automated Product Configuration for Software Product Line Development

    Directory of Open Access Journals (Sweden)

    Scott Uk-Jin Lee

    2015-01-01

    Full Text Available The wide adoption of product line engineering in the software industry has enabled cost-effective development of high-quality software for diverse market segments. In a software product line (SPL), a family of software is specified with a set of core assets representing reusable features together with their variability, dependencies, and constraints. From such core assets, valid software products are configured after thoroughly analysing the represented features and their properties. However, current implementations of SPL lack effective means to configure a valid product, as the core assets specified in an SPL, being high-dimensional data, are often too complex to analyse. This paper presents a time- and cost-effective methodology with associated tool support to design an SPL model, analyse features, and configure a valid product. The proposed approach uses the eXtensible Markup Language (XML) to model the SPL, where an adequate schema is defined to precisely specify core assets. Furthermore, it enables automated product configuration by (i) extracting all the properties of required features from a given SPL model and calculating them with Alloy Analyzer; (ii) generating a decision model with appropriate eXtensible Stylesheet Language Transformation (XSLT) instructions embedded in each resolution effect; and (iii) processing the XSLT instructions of all the selected resolution effects.
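
    The configuration-validity check can be illustrated generically; the paper implements it with an XML feature model, Alloy Analyzer and XSLT, whereas the sketch below expresses the same dependency and exclusion constraints with plain Python sets. Feature names and rules are hypothetical.

```python
# Generic product-configuration check over a feature set with dependency
# (requires) and mutual-exclusion (excludes) constraints. Names are made up.
features = {"player", "recorder", "hd", "cloud_sync"}
requires = {"cloud_sync": {"recorder"}}        # feature -> its prerequisites
excludes = {("hd", "cloud_sync")}              # pairs that cannot co-occur

def is_valid(selection: set[str]) -> bool:
    if not selection <= features:              # only known features allowed
        return False
    for feat, deps in requires.items():
        if feat in selection and not deps <= selection:
            return False
    return all(not {a, b} <= selection for a, b in excludes)

print(is_valid({"player", "recorder", "cloud_sync"}))  # True
print(is_valid({"hd", "cloud_sync", "recorder"}))      # False (excluded pair)
```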

  1. Teaching Mathematical Modelling in a Design Context: A Methodology Based on the Mechanical Analysis of a Domestic Cancrusher.

    Science.gov (United States)

    Pace, Sydney

    2000-01-01

    Presents a methodology for teaching mathematical modeling skills to A-level students. Gives an example illustrating how mathematics teachers and design teachers can take a joint perspective in devising learning opportunities that develop mathematical and design skills concurrently. (Contains 14 references.) (Author/ASK)

  2. Novel Methodology for Functional Modeling and Simulation of Wireless Embedded Systems

    Directory of Open Access Journals (Sweden)

    Nitasha Jugessur

    2008-07-01

    Full Text Available A novel methodology is presented for the modeling and simulation of wireless embedded systems. Tight interaction between the analog and the digital functionality makes the design and verification of such systems a real challenge. The applied methodology brings together the functional models of the baseband algorithms, written in the C language, with the circuit descriptions at behavioral level in Verilog or Verilog-AMS, for system simulation in a single-kernel environment. The physical layer of an ultrawideband system has been successfully modeled and simulated. The results confirm that this methodology provides a standardized framework for efficiently and accurately simulating complex mixed-signal applications for embedded systems.

  3. Developing mathematical modelling competence

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Jensen, Tomas Højgaard

    2003-01-01

    In this paper we introduce the concept of mathematical modelling competence, by which we mean being able to carry through a whole mathematical modelling process in a certain context. Analysing the structure of this process, six sub-competences are identified. Mathematical modelling competence cannot be reduced to these six sub-competences, but they are necessary elements in the development of mathematical modelling competence. Experience from the development of a modelling course is used to illustrate how the different nature of the sub-competences can be used as a tool for finding the balance between different kinds of activities in a particular educational setting. Obstacles of social, cognitive and affective nature for the students' development of mathematical modelling competence are reported and discussed in relation to the sub-competences.

  4. Modeling and Analysis of MRR, EWR and Surface Roughness in EDM Milling through Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    A. K.M.S. Iqbal

    2010-01-01

    Full Text Available Problem statement: Electrical Discharge Machining (EDM) has grown over the last few decades from a novelty to a mainstream manufacturing process. Though the EDM process is in high demand, its mechanism is complex and far from completely understood. It is difficult to establish a model that can accurately predict performance by correlating the process parameters. Optimum processing parameters are essential to increase the production rate and decrease the machining time, since the materials processed by EDM, and the process itself, are very costly. This research establishes empirical relations between machining parameters and the responses in analyzing the machinability of stainless steel. Approach: The machining factors used are voltage, rotational speed of the electrode and feed rate, over the responses MRR, EWR and Ra. Response surface methodology was used to investigate the relationships and parametric interactions between the three controllable variables and the MRR, EWR and Ra. A central composite experimental design was used to estimate the model coefficients of the three factors. The responses were modeled using a response surface model based on experimental results. The significant coefficients were obtained by performing analysis of variance (ANOVA) at the 95% level of significance. Results: The variation in percentage errors for the developed models was found to be within 5%. Conclusion: The developed models show that voltage and rotary motion of the electrode are the most significant machining parameters influencing MRR, EWR and Ra. These models can be used to obtain the desired responses within the experimental range.
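
    The RSM step for one response can be sketched as fitting a second-order model in voltage, speed and feed with statsmodels and inspecting coefficient significance; the data below are synthetic placeholders, not the paper's measurements.

```python
# Second-order response surface fit for one response (here MRR) with ANOVA-
# style significance via coefficient p-values. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "V": rng.uniform(80, 120, 20),    # voltage (assumed range)
    "N": rng.uniform(100, 500, 20),   # electrode rotational speed (assumed)
    "F": rng.uniform(5, 20, 20),      # feed rate (assumed)
})
df["MRR"] = 0.05*df.V + 0.01*df.N - 0.0002*df.V**2 + rng.normal(0, 0.2, 20)

model = smf.ols("MRR ~ V + N + F + I(V**2) + I(N**2) + I(F**2)"
                " + V:N + V:F + N:F", data=df).fit()
print(model.summary())   # terms with p-value < 0.05 are significant
```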

  5. METHODOLOGICAL APPROACH AND MODEL ANALYSIS FOR IDENTIFICATION OF TOURIST TRENDS

    Directory of Open Access Journals (Sweden)

    Neven Šerić

    2015-06-01

    Full Text Available The draw and diversity of a destination's offer is an antecedent of growth in tourism visits. Destination supply differentiation is carried out through new, specialised tourism products. The usual approach consists of forming specialised tourism products in accordance with the existing tourism destination image. Another approach, prevalent in the practice of developed tourism destinations, is based on innovating the destination supply in accordance with global tourism trends. For this purpose, it is advisable to choose a method for monitoring and analysing tourism trends. The goal is to determine the actual trends governing target markets, differentiating whims from trends during the tourism preseason. When considering the return on investment, modifying the destination's tourism offer on the basis of a tourism whim is a risky endeavour. Adapting the destination's supply to tourism whims can result in a shifted image, one that is unable to ensure long-term interest and growth in tourist vacations. With regard to tourism trend research, and based on the research conducted, a model for evaluating tourism phenomena is proposed, one that determines whether a tourism phenomenon is a tourism trend or a tourism whim.

  6. A system-of-systems modeling methodology for strategic general aviation design decision-making

    Science.gov (United States)

    Won, Henry Thome

    General aviation has long been studied as a means of providing an on-demand "personal air vehicle" that bypasses the traffic at major commercial hubs. This thesis continues this research through development of a system-of-systems modeling methodology applicable to the selection of synergistic product concepts, market segments, and business models. From the perspective of the conceptual design engineer, the design and selection of future general aviation aircraft are complicated by the definition of constraints and requirements, and by the tradeoffs among performance and cost aspects. Qualitative problem definition methods have been utilized, although their accuracy in determining specific requirement and metric values is uncertain. In industry, customers are surveyed, and business plans are created through a lengthy, iterative process. In recent years, techniques have been developed for predicting the characteristics of US travel demand based on travel mode attributes, such as door-to-door time and ticket price. As yet, these models treat the contributing systems---aircraft manufacturers and service providers---as independently variable assumptions. In this research, a methodology is developed which seeks to build a strategic design decision-making environment through the construction of a system-of-systems model. The demonstrated implementation brings together models of the aircraft and manufacturer, the service provider, and most importantly the travel demand. Thus represented are the behavior of the consumers and the reactive behavior of the suppliers---the manufacturers and transportation service providers---in a common modeling framework. The results indicate an ability to guide the design process---specifically the selection of design requirements---through the optimization of "capability" metrics. Additionally, results indicate the ability to find synergetic solutions, that is, solutions in which two systems might collaborate to achieve a better result than acting

  7. A MAINTENANCE STRATEGY MODEL FOR STATIC EQUIPMENT USING INSPECTION METHODOLOGIES AND RISK MANAGEMENT

    Directory of Open Access Journals (Sweden)

    J.K. Visser

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: Mechanical equipment used on process plants can be categorised into two main types, namely static and rotating equipment. A brief survey at a number of chemical process plants indicated that several maintenance strategies exist and are used for rotating equipment. However, some of these strategies are not directly applicable to static equipment, although the risk-based inspection (RBI) methodology has been developed for pressure vessels. A generalised risk-based maintenance strategy for all types of static equipment does not currently exist. This paper describes the development of an optimised model of inspection methodologies, maintenance strategies, and risk management principles that are generically applicable to static equipment. It enables maintenance managers and engineers to select an applicable maintenance strategy and inspection methodology, based on the operational and business risks posed by the individual pieces of equipment.

  8. A methodology for assessing the market benefits of alternative motor fuels: The Alternative Fuels Trade Model

    Energy Technology Data Exchange (ETDEWEB)

    Leiby, P.N.

    1993-09-01

    This report describes a modeling methodology for examining the prospective economic benefits of displacing motor gasoline use by alternative fuels. The approach is based on the Alternative Fuels Trade Model (AFTM). AFTM development was undertaken by the US Department of Energy (DOE) as part of a longer term study of alternative fuels issues. The AFTM is intended to assist with evaluating how alternative fuels may be promoted effectively, and what the consequences of substantial alternative fuels use might be. Such an evaluation of policies and consequences of an alternative fuels program is being undertaken by DOE as required by Section 502(b) of the Energy Policy Act of 1992. Interest in alternative fuels is based on the prospective economic, environmental and energy security benefits from the substitution of these fuels for conventional transportation fuels. The transportation sector is heavily dependent on oil. Increased oil use implies increased petroleum imports, with much of the increase coming from OPEC countries. Conversely, displacement of gasoline has the potential to reduce US petroleum imports, thereby reducing reliance on OPEC oil and possibly weakening OPEC's ability to extract monopoly profits. The magnitude of US petroleum import reduction, the attendant fuel price changes, and the resulting US benefits, depend upon the nature of oil-gas substitution and the supply and demand behavior of other world regions. The methodology applies an integrated model of fuel market interactions to characterize these effects.

  9. A methodology aimed at fostering and sustaining the development processes of an IE-based industry

    Science.gov (United States)

    Corallo, Angelo; Errico, Fabrizio; de Maggio, Marco; Giangreco, Enza

    In the current competitive scenario, where business relationships are fundamental in building successful business models and inter/intra organizational business processes are progressively digitalized, an end-to-end methodology is required that is capable of guiding business networks through the Internetworked Enterprise (IE) paradigm: a new and innovative organizational model able to leverage Internet technologies to perform real-time coordination of intra and inter-firm activities, to create value by offering innovative and personalized products/services and reduce transaction costs. This chapter presents the TEKNE project Methodology of change that guides business networks, by means of a modular and flexible approach, towards the IE techno-organizational paradigm, taking into account the competitive environment of the network and how this environment influences its strategic, organizational and technological levels. Contingency, the business model, enterprise architecture and performance metrics are the key concepts that form the cornerstone of this methodological framework.

  10. Testing the methodology for site descriptive modelling. Application for the Laxemar area

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan [JA Streamflow AB, Älvsjö (Sweden); Berglund, Johan [SwedPower AB, Stockholm (Sweden); Follin, Sven [SF Geologic AB, Stockholm (Sweden); Hakami, Eva [Itasca Geomekanik AB, Stockholm (Sweden); Halvarson, Jan [Swedish Nuclear Fuel and Waste Management Co, Stockholm (Sweden); Hermanson, Jan [Golder Associates AB, Stockholm (Sweden); Laaksoharju, Marcus [Geopoint (Sweden); Rhen, Ingvar [Sweco VBB/VIAK, Stockholm (Sweden); Wahlgren, C.H. [Sveriges Geologiska Undersökning, Uppsala (Sweden)

    2002-08-01

    A special project has been conducted in which the currently available data from the Laxemar area, which is part of the Simpevarp site, have been evaluated and interpreted into a Site Descriptive Model covering geology, hydrogeology, hydrogeochemistry and rock mechanics. Description of the surface ecosystem has been omitted, since it was re-characterised in another, parallel project. Furthermore, there has been no evaluation of transport properties. The project is primarily a methodology test. The lessons learnt will be implemented in the Site Descriptive Modelling during the coming site investigation. The intent of the project has been to explore whether the available methodology for Site Descriptive Modelling based on surface and borehole data is adequate, and to identify potential needs for development and improvement of the methodology. The project has developed, with limitations in scope, a Site Descriptive Model on the local scale, corresponding to the situation after completion of the Initial Site Investigations for the Laxemar area (i.e. 'version 1.2' in the vocabulary of the general execution programme for the site investigations). The Site Descriptive Model should be reasonable, but should not be regarded as a 'real' model. There are limitations both in the input data and in the scope of the analysis. The measured (primary) data constitute a wide range of different measurement results, including data from two deep core-drilled boreholes. These data both need to be checked for consistency and interpreted into a format more amenable to three-dimensional modelling. Examples of such evaluations are estimation of surface geology, lineament interpretation, geological single-hole interpretation, hydrogeological single-hole interpretation and assessment of hydrogeochemical data. Furthermore, while cross-discipline interpretation is encouraged, there is also a need for transparency. This means that the evaluations are first made within each discipline

  11. A new methodology for building energy benchmarking: An approach based on clustering concept and statistical models

    Science.gov (United States)

    Gao, Xuefeng

    Though many building energy benchmarking programs have been developed during the past decades, they have certain limitations. The major concern is that they may produce misleading benchmarks by not fully considering the impact of the multiple features of buildings on energy performance. Existing methods classify buildings according to only one of many features -- the use type -- which may result in a comparison between two buildings that are tremendously different in other features and therefore not properly comparable. This research aims to tackle this challenge by proposing a new methodology based on the clustering concept and statistical analysis. The clustering concept, drawing on machine learning algorithms, classifies buildings based on a multi-dimensional domain of building features, rather than the single dimension of use type. Buildings with the greatest similarity of features that influence energy performance are classified into the same cluster and benchmarked against the centroid reference of the cluster. Statistical analysis is applied to find the most influential features impacting building energy performance, as well as to provide prediction models for new-design energy consumption. The proposed methodology, applicable to both existing-building benchmarking and new-design benchmarking, is discussed in this dissertation. The former contains four steps: feature selection, clustering algorithm adaptation, results validation, and interpretation. The latter consists of three parts: data observation, inverse modeling, and forward modeling. The experimentation and validation were carried out for both perspectives. It was shown that the proposed methodology could account for total building energy performance and provide a more comprehensive approach to benchmarking. In addition, the multi-dimensional clustering concept enables energy benchmarking among different types of buildings, and inspires a new
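
    A minimal sketch of the clustering idea, using scikit-learn on synthetic building features; the features, data and cluster count are illustrative assumptions, not the dissertation's selections.

```python
# Cluster buildings on several features (not use type alone), then benchmark
# each building against its cluster's reference. Data are synthetic.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# columns: floor area [m^2], operating hours/week, occupants, vintage year
X = np.column_stack([rng.uniform(500, 50_000, 200),
                     rng.uniform(40, 168, 200),
                     rng.uniform(5, 2_000, 200),
                     rng.uniform(1950, 2015, 200)])
eui = rng.uniform(80, 400, 200)   # energy use intensity [kWh/m^2/yr]

Xs = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=5, n_init=10, random_state=3).fit_predict(Xs)

# Benchmark: compare each building's EUI with its cluster's mean EUI.
for k in range(5):
    print(f"cluster {k}: reference EUI = {eui[labels == k].mean():.0f} kWh/m^2/yr")
```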

  12. Development of Registration methodology to 3-D Point Clouds in Robot Scanning

    Directory of Open Access Journals (Sweden)

    Chen Liang-Chia

    2016-01-01

    Full Text Available The problem of multi-view 3-D point cloud registration is investigated and effectively resolved by the developed methodology. A registration method is proposed to register two series of scans into an object model by using the proposed oriented-bounding-box (OBB) regional area-based descriptor. Robot 3-D scanning is often employed to generate sets of point clouds of physical objects. The automated operation has to successively digitize view-dependent area-scanned point clouds from complex-shaped objects through multi-view point cloud registration. To achieve this, the OBB regional area-based descriptor is employed to determine an initial transformation matrix, which is then refined employing the iterative closest point (ICP) algorithm. The developed method can be used to resolve the commonly encountered difficulty of accurately merging two neighbouring area-scanned images when no coordinate reference exists. The developed method has been verified through experimental tests of its registration accuracy. Experimental results have preliminarily demonstrated the feasibility of the developed method.
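
    As an illustration of the refinement stage only, the sketch below runs ICP with the Open3D library; the OBB regional area-based descriptor that supplies the initial transformation is specific to this work, so an assumed identity initialization stands in for it, and the point clouds are random stand-ins.

```python
# ICP refinement of two point clouds with Open3D; the initial transform would
# normally come from the paper's OBB descriptor match (identity assumed here).
import numpy as np
import open3d as o3d

source = o3d.geometry.PointCloud()
target = o3d.geometry.PointCloud()
source.points = o3d.utility.Vector3dVector(np.random.rand(1000, 3))
target.points = o3d.utility.Vector3dVector(np.asarray(source.points) + [0.05, 0, 0])

init = np.eye(4)   # placeholder for the OBB-descriptor initial estimate
result = o3d.pipelines.registration.registration_icp(
    source, target, 0.1, init,
    o3d.pipelines.registration.TransformationEstimationPointToPoint())
print("fitness:", result.fitness)
print(result.transformation)
```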

  13. Interval Methods for Model Qualification: Methodology and Advanced Application

    OpenAIRE

    Alexandre dit Sandretto, Julien; Trombettoni, Gilles; Daney, David

    2012-01-01

    It is often too complex to use, and sometimes impossible to obtain, an actual model in the simulation or control field. To handle a system in practice, a simplification of the real model is then necessary. This simplification goes through some hypotheses made on the system or on the modeling approach. In this paper, we deal with all models that can be expressed by real-valued variables involved in analytical relations and depending on parameters. We propose a method that qualifies the simplificatio...

  14. Development of a methodology of evaluation of financial stability of commercial banks

    Directory of Open Access Journals (Sweden)

    Brauers Willem Karel M.

    2014-01-01

    Full Text Available The evaluation of the financial stability of commercial banks, a field that emanates from the persistent recurrence of financial crises, has attracted the interest of researchers for over a century. The span of prevailing methodologies stretches from over-simplified risk-return approaches to ones comprising a large number of economic variables at the micro- and/or macro-economic level. The methodologies of rating agencies, and the current methodologies reviewed and applied by the ECB, are not intended to reduce information asymmetry in the market for commercial banks. In the paper it is shown that the Lithuanian financial system is bank-based, with deposits of households being its primary sources, and that its stability primarily depends on the behavior of depositors. A methodology for evaluating commercial banks, with features that decrease information asymmetry in the market for commercial banks, is developed by comparing different MCDA methods.

  15. New systematic methodology for incorporating dynamic heat transfer modelling in multi-phase biochemical reactors.

    Science.gov (United States)

    Fernández-Arévalo, T; Lizarralde, I; Grau, P; Ayesa, E

    2014-09-01

    This paper presents a new modelling methodology for dynamically predicting the heat produced or consumed in the transformations of any biological reactor using Hess's law. Starting from a complete description of model component stoichiometry and formation enthalpies, the proposed modelling methodology successfully integrates the simultaneous calculation of both the conventional mass balances and the enthalpy change of reaction in an expandable multi-phase matrix structure, which facilitates a detailed prediction of the main heat fluxes in biochemical reactors. The methodology has been incorporated into a plant-wide modelling methodology in order to facilitate the dynamic description of mass and heat throughout the plant. After validation with literature data, two case studies are described as illustrative examples of the capability of the methodology. In the first one, a predenitrification-nitrification dynamic process is analysed, with the aim of demonstrating the easy integration of the methodology into any system. In the second case study, the simulation of a thermal model for an ATAD shows the potential of the proposed methodology for analysing the effect of ventilation and influent characterization.
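
    The Hess's-law step can be illustrated with a short sketch: the enthalpy change of reaction from standard formation enthalpies, ΔH_rxn = Σ ν_i ΔH_f,i. The species and coefficients below form a generic aerobic oxidation example, not the paper's stoichiometric matrix.

```python
# Reaction enthalpy via Hess's law from standard formation enthalpies.
# Example reaction: CH3COOH + 2 O2 -> 2 CO2 + 2 H2O (illustrative choice).
dHf = {  # standard formation enthalpies [kJ/mol] (literature values)
    "CH3COOH": -484.5, "O2": 0.0, "CO2": -393.5, "H2O": -285.8,
}
nu = {"CH3COOH": -1, "O2": -2, "CO2": 2, "H2O": 2}  # stoichiometric coefficients

dH_rxn = sum(n * dHf[s] for s, n in nu.items())
print(f"dH_rxn = {dH_rxn:.1f} kJ/mol")  # negative -> exothermic, heat released
```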

  16. Spreadsheets Grow Up: Three Spreadsheet Engineering Methodologies for Large Financial Planning Models

    CERN Document Server

    Grossman, Thomas A

    2010-01-01

    Many large financial planning models are written in a spreadsheet programming language (usually Microsoft Excel) and deployed as a spreadsheet application. Three groups, FAST Alliance, Operis Group, and BPM Analytics (under the name "Spreadsheet Standards Review Board") have independently promulgated standardized processes for efficiently building such models. These spreadsheet engineering methodologies provide detailed guidance on design, construction process, and quality control. We summarize and compare these methodologies. They share many design practices, and standardized, mechanistic procedures to construct spreadsheets. We learned that a written book or standards document is by itself insufficient to understand a methodology. These methodologies represent a professionalization of spreadsheet programming, and can provide a means to debug a spreadsheet that contains errors. We find credible the assertion that these spreadsheet engineering methodologies provide enhanced productivity, accuracy and maintain...

  17. Object-oriented analysis and design: a methodology for modeling the computer-based patient record.

    Science.gov (United States)

    Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L

    1998-08-01

    The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.

  18. Conceptualizing sustainable development. An assessment methodology connecting values, knowledge, worldviews and scenarios

    Energy Technology Data Exchange (ETDEWEB)

    De Vries, Bert J.M.; Petersen, Arthur C. [Netherlands Environmental Assessment Agency, P.O. Box 303, 3720 AH Bilthoven (Netherlands)

    2009-02-15

    Sustainability science poses severe challenges to classical disciplinary science. To bring the perspectives of diverse disciplines together in a meaningful way, we describe a novel methodology for the sustainability assessment of a particular social-ecological system, or country. The starting point is that a sustainability assessment should investigate the ability to continue and develop a desirable way of living vis-a-vis later generations and life elsewhere on the planet. Evidently, people hold different values and beliefs about the way societies sustain quality of life for their members. The first step, therefore, is to analyze people's value orientations and the way in which they interpret sustainability problems, i.e. their beliefs. The next step is to translate the resulting worldviews into model-based narratives, i.e. scenarios. The qualitative and quantitative outcomes are then investigated in terms of associated risks and opportunities and the robustness of policy options. The Netherlands Environmental Assessment Agency (PBL) has followed this methodology, using extensive surveys among the Dutch population. In its First Sustainability Outlook (2004), the resulting archetypical worldviews became the basis for four different scenarios for policy analysis, with emphases on the domains of transport, energy and food. The goal of the agency's Sustainability Outlooks is to show that choices are inevitable in policy making for sustainable development, to indicate which positive and negative impacts one can expect of these choices (trade-offs), and to identify options that may be robust under several worldviews. The conceptualization proposed here is both clear and applicable in practical sustainability assessments for policy making. (author)

  19. Testing for Equivalence: A Methodology for Computational Cognitive Modelling

    Science.gov (United States)

    Stewart, Terrence; West, Robert

    2010-12-01

    The equivalence test (Stewart and West, 2007; Stewart, 2007) is a statistical measure for evaluating the similarity between a model and the system being modelled. It is designed to avoid over-fitting and to generate an easily interpretable summary of the quality of a model. We apply the equivalence test to two tasks: Repeated Binary Choice (Erev et al., 2010) and Dynamic Stocks and Flows (Gonzalez and Dutt, 2007). In the first case, we find a broad range of statistically equivalent models (and win a prediction competition) while identifying particular aspects of the task that are not yet adequately captured. In the second case, we re-evaluate results from the Dynamic Stocks and Flows challenge, demonstrating how our method emphasizes the breadth of coverage of a model and how it can be used for comparing different models. We argue that the explanatory power of models hinges on numerical similarity to empirical data over a broad set of measures.

  20. Empirical-Statistical Methodology and Methods for Modeling and Forecasting of Climate Variability of Different Temporal Scales

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The main problem of modern climatology is to assess present as well as future climate change. For this aim two approaches are used: physical-mathematical modeling on the basis of GCMs, and palaeoclimatic analogues. A third approach, based on empirical-statistical methodology, is developed in this paper. This approach addresses two main problems: to give a realistic assessment of climate change from observed data, for climate monitoring and for extrapolation of the obtained climate tendencies into the near future (10-15 years); and to give an empirical basis for further development of physical-mathematical models. The basic theory and methodology of the empirical-statistical approach are developed, along with a common model for the description of space-time climate variations that takes the processes of different time scales into account. A way of decreasing present and future uncertainty is suggested: extracting the long-term climate change components in individual time series and spatially generalizing the same climate tendencies over the resulting homogeneous regions. Algorithms and methods for realizing the empirical-statistical methodology are developed, including methods for generalization of intra-annual fluctuations, methods for extraction of homogeneous components on different time scales (interannual, decadal, century), methodology and methods for spatial generalization and modeling, and methods for extrapolation on the basis of two main kinds of time models: stochastic and deterministic-stochastic. Some applications of the developed methodology and methods are given for the longest time series of temperature and precipitation worldwide and for spatial generalization over the European area.

  1. Optimization of spray drying process for developing seabuckthorn fruit juice powder using response surface methodology.

    Science.gov (United States)

    Selvamuthukumaran, Meenakshisundaram; Khanum, Farhath

    2014-12-01

    The response surface methodology was used to optimize the spray drying process for development of seabuckthorn fruit juice powder. The independent variables were different levels of inlet air temperature and maltodextrin concentration. The responses were moisture, solubility, dispersibility, vitamin C and overall color difference value. Statistical analysis revealed that the independent variables significantly affected all the responses. The inlet air temperature showed maximum influence on moisture and vitamin C content, while the maltodextrin concentration showed a similar influence on solubility, dispersibility and overall color difference value. Contour plots for each response were used to generate an optimum area by superimposition. The seabuckthorn fruit juice powder was developed using the derived optimum processing conditions to check the validity of the second-order polynomial model. The experimental values were found to be in close agreement with the predicted values and were within acceptable limits, indicating the suitability of the model in predicting quality attributes of seabuckthorn fruit juice powder. The recommended optimum spray drying conditions for drying 100 g of fruit juice slurry were an inlet air temperature of 162.5 °C and a maltodextrin concentration of 25 g. The spray-dried juice powder contains higher amounts of antioxidants, viz. vitamin C, vitamin E, total carotenoids, total anthocyanins and total phenols, when compared to commercial fruit juice powders, and it was also found to be free flowing without physical alterations such as caking, stickiness, collapse and crystallization, exhibiting a greater glass transition temperature.

  2. RISK ANALYSIS DEVELOPED MODEL

    Directory of Open Access Journals (Sweden)

    Georgiana Cristina NUKINA

    2012-07-01

    Full Text Available Through the developed risk analysis model, it is decided whether control measures are suitable for implementation. Moreover, the analysis determines whether the benefits of a given control option outweigh its implementation cost.

  3. Direct Adaptive Control Methodologies for Flexible-Joint Space Manipulators with Uncertainties and Modeling Errors

    Science.gov (United States)

    Ulrich, Steve

    This work addresses the direct adaptive trajectory tracking control problem associated with lightweight space robotic manipulators that exhibit elastic vibrations in their joints, and which are subject to parametric uncertainties and modeling errors. Unlike existing adaptive control methodologies, the proposed flexible-joint control techniques do not require identification of unknown parameters, or mathematical models of the system to be controlled. The direct adaptive controllers developed in this work are based on the model reference adaptive control approach, and manage modeling errors and parametric uncertainties by time-varying the controller gains using new adaptation mechanisms, thereby reducing the errors between an ideal model and the actual robot system. More specifically, new decentralized adaptation mechanisms derived from the simple adaptive control technique and fuzzy logic control theory are considered in this work. Numerical simulations compare the performance of the adaptive controllers with a nonadaptive and a conventional model-based controller, in the context of 12.6 m × 12.6 m square trajectory tracking. To validate the robustness of the controllers to modeling errors, a new dynamics formulation that includes several nonlinear effects usually neglected in flexible-joint dynamics models is proposed. Results obtained with the adaptive methodologies demonstrate an increased robustness to both uncertainties in joint stiffness coefficients and dynamics modeling errors, as well as highly improved tracking performance compared with the nonadaptive and model-based strategies. Finally, this work considers the partial state feedback problem related to flexible-joint space robotic manipulators equipped only with sensors that provide noisy measurements of motor positions and velocities. An extended Kalman filter-based estimation strategy is developed to estimate all state variables in real-time. The state estimation filter is combined with an adaptive
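
    To illustrate the model reference adaptive control idea at the core of this approach, a textbook MIT-rule sketch for a first-order plant follows; this is a generic illustration, not the thesis's decentralized adaptation laws or fuzzy mechanisms, and all numerical values are assumed.

```python
# MIT-rule model reference adaptive control for a first-order plant:
# the controller gains are time-varied to drive the model-following error
# e = y - ym toward zero. Plant, model and gains are illustrative.
dt, steps = 0.001, 10_000
a, b = 1.0, 0.5            # "unknown" plant: dy/dt = -a*y + b*u
am, bm = 2.0, 2.0          # reference model: dym/dt = -am*ym + bm*r
gamma = 5.0                # adaptation gain (assumed)

y = ym = theta1 = theta2 = 0.0
for k in range(steps):
    r = 1.0 if (k * dt) % 4 < 2 else -1.0      # square-wave reference
    u = theta1 * r - theta2 * y                # adaptive control law
    e = y - ym                                 # model-following error
    # MIT rule: move gains along the negative gradient of e^2/2, using
    # ym and y as approximations of the filtered sensitivity signals.
    theta1 += -gamma * e * ym / bm * dt
    theta2 += +gamma * e * y / bm * dt
    y += (-a * y + b * u) * dt                 # plant update (Euler)
    ym += (-am * ym + bm * r) * dt             # reference model update
print(f"theta1 = {theta1:.2f}, theta2 = {theta2:.2f}, |e| = {abs(e):.4f}")
```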

  4. Application Development Methodology Appropriateness: An Exploratory Case Study Bridging the Gap between Framework Characteristics and Selection

    Science.gov (United States)

    Williams, Lawrence H., Jr.

    2013-01-01

    This qualitative study analyzed experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve; others facilitate ambiguity and uncertainty by…

  5. Knowledge Consolidation Analysis: Toward a Methodology for Studying the Role of Argument in Technology Development

    Science.gov (United States)

    Dyehouse, Jeremiah

    2007-01-01

    Researchers studying technology development often examine how rhetorical activity contributes to technologies' design, implementation, and stabilization. This article offers a possible methodology for studying one role of rhetorical activity in technology development: knowledge consolidation analysis. Applying this method to an exemplar case, the…

  6. Theoretic-methodological approaches to determine the content and classification of innovation-investment development strategies

    Directory of Open Access Journals (Sweden)

    Gerashenkova Tatyana

    2016-01-01

    Full Text Available The article states the necessity of forming an innovation-investment strategy for enterprise development, offers an approach to its classification, determines the place of this strategy within a corporate-wide strategy, and gives the methodology for the formation and the realization form of the innovation-investment development strategy.

  8. Methodological approaches to developing a plan of land and economic unit of the settlement

    OpenAIRE

    Dorosh, O.

    2015-01-01

    This paper deals with problems of the legislation regulating legal relations associated with the use of land in the settlements of Ukraine. Methodological approaches to developing the plan of a land and economic unit of a settlement are suggested. It is proved that land management documentation provides effective planning of the territorial development of urban and rural settlements.

  9. Educational Planning Methodology for the Integrated Development of Rural Areas. Reports Studies... S.83.

    Science.gov (United States)

    United Nations Educational, Scientific, and Cultural Organization, Paris (France).

    A summary of educational planning methodologies tested in Argentina, Guatemala, Brazil, Ecuador, and Bolivia, the document offers opinions and proposals about integrated rural development. Integrated rural development is seen as a social, economic, political, and cultural process in rural areas, designed to improve living conditions. Chapters…

  10. Development of Hydrophobic Coatings for Water-Repellent Surfaces Using Hybrid Methodology

    Science.gov (United States)

    2014-04-01

    ...windows, optical components, protective eyewear, and clothing; this type of surface is desired for the material to be soil repellent and water... (Amanda S. Weerasooriya, Jacqueline Yim, Andres A. ..., Development of Hydrophobic Coatings for Water-Repellent Surfaces Using Hybrid Methodology, ARL-TR-6898, Aberdeen Proving Ground, MD 21005-5069, April 2014)

  11. The economics of climate change mitigation in developing countries - methodological and empirical results

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.

    1997-12-01

    This thesis presents a methodological and empirical discussion of the costs associated with implementing greenhouse gas reduction strategies in developing countries. It presents a methodological framework for national costing studies and evaluates a number of associated valuation methods. The methodological framework has been applied in several developing countries as part of a UNEP project in which the author has participated, and reference is made to the results of these country studies. Some of the theoretical issues associated with the determination of the costs of emission reductions are discussed with reference to a number of World Bank and UN guidelines for project analysis in developing countries. The use of several accounting prices is recommended for mitigation projects, with a distinction being made between internationally and domestically traded goods. The consequences of using different accounting prices are discussed with respect to the methodology applied in the UNEP country studies. In conclusion, the thesis reviews the results of some of the most important international studies of greenhouse gas emissions in developing countries. The review, which encompasses a total of 27 country studies, was undertaken by the author for the Intergovernmental Panel on Climate Change, the IPCC. Its conclusion is that the UNEP methodological framework and associated country study results are consistent with the recommendations and conclusions of the IPCC. (EG) 23 refs.

  12. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future
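
    As a minimal illustration of the whole-system continuous growth models the review classifies, the sketch below integrates a primary logistic model numerically; the parameter values are illustrative, not drawn from the paper.

```python
# Primary growth model: logistic equation dN/dt = mu*N*(1 - N/Nmax),
# integrated with scipy. Parameters are illustrative placeholders.
import numpy as np
from scipy.integrate import solve_ivp

mu, n_max = 0.8, 1e9   # max specific growth rate [1/h], carrying capacity

def logistic(t, n):
    return [mu * n[0] * (1.0 - n[0] / n_max)]

sol = solve_ivp(logistic, (0.0, 24.0), [1e3], t_eval=np.linspace(0, 24, 7))
for t, n in zip(sol.t, sol.y[0]):
    print(f"t = {t:4.1f} h   N = {n:10.3e} CFU/mL")
```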

  13. Towards a methodology for educational modelling: a case in educational assessment

    NARCIS (Netherlands)

    Giesbers, Bas; Van Bruggen, Jan; Hermans, Henry; Joosten-ten Brinke, Desirée; Burgers, Jan; Koper, Rob; Latour, Ignace

    2005-01-01

    Giesbers, B., van Bruggen, J., Hermans, H., Joosten-ten Brinke, D., Burgers, J., Koper, R., & Latour, I. (2007). Towards a methodology for educational modelling: a case in educational assessment. Educational Technology & Society, 10 (1), 237-247.

  14. A changing climate: impacts on human exposures to O3 using an integrated modeling methodology

    Science.gov (United States)

    Predicting the impacts of changing climate on human exposure to air pollution requires future scenarios that account for changes in ambient pollutant concentrations, population sizes and distributions, and housing stocks. An integrated methodology to model changes in human exposu...

  15. Modelling river dune development

    NARCIS (Netherlands)

    Paarlberg, Andries; Weerts, H.J.T.; Dohmen-Janssen, Catarine M.; Ritsema, I.L; Hulscher, Suzanne J.M.H.; van Os, A.G.; Termes, A.P.P.

    2005-01-01

    Since river dunes influence flow resistance, predictions of dune dimensions are required to make accurate water level predictions. A model approach to simulate developing river dunes is presented. The model is set up to be appropriate, i.e. as simple as possible, but with sufficient accuracy for

  16. Developing Agent-Oriented Video Surveillance System through Agent-Oriented Methodology (AOM

    Directory of Open Access Journals (Sweden)

    Cheah Wai Shiang

    2016-12-01

    Full Text Available Agent-oriented methodology (AOM) is a comprehensive and unified agent methodology for agent-oriented software development. Although AOM is claimed to be able to cope with complex system development, it has not yet been determined to what extent this is true. Therefore, it is vital to conduct an investigation to validate this methodology. This paper presents the adoption of AOM in developing an agent-oriented video surveillance system (VSS). An intruder handling scenario is designed and implemented through AOM. AOM provides an alternative method to engineer a distributed security system in a systematic manner. It presents the security system from a holistic view, provides a better conceptualization of an agent-oriented security system, and supports rapid prototyping as well as simulation of the video surveillance system.

  17. Methodology for Modeling Building Energy Performance across the Commercial Sector

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, B.; Long, N.; Torcellini, P.; Judkoff, R.; Crawley, D.; Ryan, J.

    2008-03-01

    This report uses EnergyPlus simulations of each building in the 2003 Commercial Buildings Energy Consumption Survey (CBECS) to document and demonstrate bottom-up methods of modeling the entire U.S. commercial buildings sector (EIA 2006). The ability to use a whole-building simulation tool to model the entire sector is of interest because the energy models enable us to answer subsequent 'what-if' questions that involve technologies and practices related to energy. This report documents how the whole-building models were generated from the building characteristics in 2003 CBECS and compares the simulation results to the survey data for energy use.
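
    The bottom-up aggregation described here scales each simulated sample building by its survey weight; the sketch below shows that pattern with invented numbers (not CBECS data or EnergyPlus output).

        # Bottom-up sector estimate: each surveyed building's simulated energy
        # use is scaled by its survey weight (buildings it represents).
        simulated_energy = [52.0, 210.0, 88.0]   # annual site energy per building (MWh)
        survey_weight = [1200.0, 310.0, 950.0]   # buildings each sample represents

        sector_total = sum(e * w for e, w in zip(simulated_energy, survey_weight))
        print(f"sector energy: {sector_total / 1e6:.2f} TWh")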

  18. A Methodology for Robust Multiproxy Paleoclimate Reconstructions and Modeling of Temperature Conditional Quantiles.

    Science.gov (United States)

    Janson, Lucas; Rajaratnam, Bala

    Great strides have been made in the field of reconstructing past temperatures based on models relating temperature to temperature-sensitive paleoclimate proxies. One of the goals of such reconstructions is to assess if current climate is anomalous in a millennial context. These regression-based approaches model the conditional mean of the temperature distribution as a function of paleoclimate proxies (or vice versa). Some of the recent focus in the area has considered methods which help reduce the uncertainty inherent in such statistical paleoclimate reconstructions, with the ultimate goal of improving the confidence that can be attached to such endeavors. A second important scientific focus is forward models for proxies, the goal of which is to understand the way paleoclimate proxies are driven by temperature and other environmental variables. One of the primary contributions of this paper is novel statistical methodology for (1) quantile regression with autoregressive residual structure, (2) estimation of corresponding model parameters, (3) development of a rigorous framework for specifying uncertainty estimates of quantities of interest, yielding (4) statistical byproducts that address the two scientific foci discussed above. We show that by using the above statistical methodology we can demonstrably produce a more robust reconstruction than is possible by using conditional-mean-fitting methods. Our reconstruction shares some of the common features of past reconstructions, but we also gain useful insights. More importantly, we are able to demonstrate a significantly smaller uncertainty than that from previous regression methods. In addition, the quantile regression component allows us to model, in a more complete and flexible way than least squares, the conditional distribution of temperature given proxies. This relationship can be used to inform forward models relating how proxies are driven by temperature.
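
    A minimal sketch of the quantile regression ingredient, using statsmodels on synthetic data, is shown below; note that it fits independent-error quantile regressions only, and does not implement the paper's autoregressive residual structure.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        proxy = rng.normal(size=200)                           # synthetic proxy series
        temp = 0.5 * proxy + rng.normal(scale=0.3, size=200)   # synthetic temperature

        X = sm.add_constant(proxy)
        for q in (0.05, 0.50, 0.95):
            fit = sm.QuantReg(temp, X).fit(q=q)  # conditional q-th quantile of temp
            print(q, fit.params)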

  19. Modelling and Statistical Optimization of Dilute Acid Hydrolysis of Corn Stover Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Andrew Nosakhare Amenaghawon

    2014-07-01

    Full Text Available Response surface methodology (RSM) was employed for the analysis of the simultaneous effect of acid concentration, pretreatment time and temperature on the total reducing sugar concentration obtained during acid hydrolysis of corn stover. A three-variable, three-level Box-Behnken design (BBD) was used to develop a statistical model for the optimization of the process variables. The optimal hydrolysis conditions that resulted in the maximum total reducing sugar concentration were acid concentration, 1.72% (w/w); temperature, 169.26 °C; and pretreatment time, 48.73 minutes. Under these conditions, the total reducing sugar concentration was 23.41 g/L. Validation of the model indicated no difference between predicted and observed values.
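
    To illustrate the final optimization step that RSM enables, the sketch below maximizes a hypothetical fitted quadratic response surface over the three factors; the coefficients and bounds are invented for illustration, not the paper's fitted model.

        from scipy.optimize import minimize

        # Hypothetical quadratic response surface fitted from a Box-Behnken
        # design (illustrative coefficients); we minimize its negative.
        def neg_sugar(x):
            acid, temp, time = x
            return -(5.0 + 8.0 * acid + 0.08 * temp + 0.15 * time
                     - 2.5 * acid ** 2 - 2.5e-4 * temp ** 2 - 1.6e-3 * time ** 2)

        res = minimize(neg_sugar, x0=[1.5, 160.0, 45.0],
                       bounds=[(0.5, 2.5), (140.0, 190.0), (30.0, 60.0)])
        print(res.x, -res.fun)  # optimal factor settings and predicted sugar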

  20. A new methodology for modelling of health risk from urban flooding exemplified by cholera

    DEFF Research Database (Denmark)

    Mark, Ole; Jørgensen, Claus; Hammond, Michael

    2016-01-01

    The phenomenon of urban flooding due to rainfall exceeding the design capacity of drainage systems is a global problem and can have significant economic and social consequences. This is even more extreme in developing countries, where poor sanitation still causes a high infectious disease burden and mortality, especially during floods. At present, there are no software tools capable of combining hydrodynamic modelling and health risk analyses, and the links between urban flooding and the health risk for the population due to direct contact with the flood water are poorly understood. The present paper outlines a novel methodology for linking dynamic urban flood modelling with quantitative microbial risk assessment (QMRA). This provides a unique possibility for understanding the interaction between urban flooding and health risk caused by direct human contact with the flood water and hence gives...
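
    To make the QMRA step concrete, a minimal, hypothetical dose-response sketch is given below. The approximate beta-Poisson model is a common choice in the QMRA literature; the parameter values and example dose are illustrative assumptions, not values from the paper.

        def beta_poisson(dose, alpha, n50):
            # Approximate beta-Poisson dose-response model, common in QMRA:
            # probability of infection as a function of ingested dose.
            return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

        # Illustrative parameters; in the paper's setting the dose would come
        # from pathogen concentrations predicted by the urban flood model.
        print(beta_poisson(dose=1e4, alpha=0.25, n50=2.4e5))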

  1. Modeling Epistemic and Ontological Cognition: Philosophical Perspectives and Methodological Directions

    Science.gov (United States)

    Greene, Jeffrey A.; Azevedo, Roger A.; Torney-Purta, Judith

    2008-01-01

    We propose an integration of aspects of several developmental and systems of beliefs models of personal epistemology. Qualitatively different positions, including realism, dogmatism, skepticism, and rationalism, are characterized according to individuals' beliefs across three dimensions in a model of epistemic and ontological cognition. This model…

  2. A methodology to calibrate pedestrian walker models using multiple objectives

    NARCIS (Netherlands)

    Campanella, M.C.; Daamen, W.; Hoogendoorn, S.P.

    2012-01-01

    The application of walker models to simulate real situations requires accuracy in several traffic situations. One strategy to obtain a generic model is to calibrate the parameters in several situations using multiple-objective functions in the optimization process. In this paper, we propose a general
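
    A minimal sketch of multiple-objective calibration by scalarization is given below; the two stand-in error functions and equal weights are illustrative assumptions, not the paper's objectives.

        from scipy.optimize import minimize

        # Stand-in error functions for two traffic situations; in practice each
        # would compare simulated and observed walker behaviour.
        bottleneck = lambda p: (p[0] - 1.2) ** 2
        crossing = lambda p: (p[0] * p[1] - 0.8) ** 2

        def total_error(params, objectives, weights):
            # Scalarize the multiple objectives into one weighted sum
            return sum(w * f(params) for f, w in zip(objectives, weights))

        res = minimize(total_error, x0=[1.0, 1.0],
                       args=([bottleneck, crossing], [0.5, 0.5]))
        print(res.x)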

  3. USEPA SHEDS MODEL: METHODOLOGY FOR EXPOSURE ASSESSMENT FOR WOOD PRESERVATIVES

    Science.gov (United States)

    A physically-based, Monte Carlo probabilistic model (SHEDS-Wood: Stochastic Human Exposure and Dose Simulation model for wood preservatives) has been applied to assess the exposure and dose of children to arsenic (As) and chromium (Cr) from contact with chromated copper arsenat...
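
    The record above describes a probabilistic exposure model; the sketch below shows the general Monte Carlo pattern such models follow. The variable names, distributions and parameters are illustrative assumptions, not SHEDS-Wood's actual inputs.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000  # simulated children

        # Hypothetical input distributions (NOT SHEDS-Wood's actual data):
        residue = rng.lognormal(np.log(1.0), 0.8, n)       # ug/cm^2 on wood surface
        transfer = rng.uniform(0.1, 0.4, n)                # residue-to-skin fraction
        hand_area = rng.normal(20.0, 4.0, n)               # cm^2 of skin contacted
        body_weight = rng.lognormal(np.log(18.0), 0.2, n)  # kg

        dose = residue * transfer * hand_area / body_weight  # ug/kg per event
        print(np.percentile(dose, [50, 95, 99]))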

  4. Using soft systems methodology to develop a simulation of out-patient services.

    Science.gov (United States)

    Lehaney, B; Paul, R J

    1994-10-01

    Discrete event simulation is an approach to modelling a system in the form of a set of mathematical equations and logical relationships, usually used for complex problems, which are difficult to address by using analytical or numerical methods. Managing out-patient services is such a problem. However, simulation is not in itself a systemic approach, in that it provides no methodology by which system boundaries and system activities may be identified. The investigation considers the use of soft systems methodology as an aid to drawing system boundaries and identifying system activities, for the purpose of simulating the outpatients' department at a local hospital. The long term aims are to examine the effects that the participative nature of soft systems methodology has on the acceptability of the simulation model, and to provide analysts and managers with a process that may assist in planning strategies for health care.

   5. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMCs

    Science.gov (United States)

    Jadaan, Osama

    2003-01-01

    This effort investigates probabilistic life prediction methodologies for ceramic matrix composites (CMCs) and MicroElectroMechanical Systems (MEMS) and analyzes designs that determine stochastic properties of MEMS. For CMCs, this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS, similar to the GRC-developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and, if feasible, to run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
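
    As a pointer to what the Weibull size effect means in practice, the following minimal sketch (illustrative parameters, not output of the CARES/Life code) evaluates the weakest-link failure probability and the resulting drop in characteristic strength with specimen volume.

        import numpy as np

        def weibull_pof(stress, sigma0, m, volume, v0=1.0):
            # Weakest-link (Weibull) failure probability with volume scaling
            return 1.0 - np.exp(-(volume / v0) * (stress / sigma0) ** m)

        def char_strength(sigma0, m, volume, v0=1.0):
            # Characteristic strength decreases as specimen volume grows
            return sigma0 * (v0 / volume) ** (1.0 / m)

        print(weibull_pof(stress=300.0, sigma0=400.0, m=10.0, volume=2.0))
        print(char_strength(sigma0=400.0, m=10.0, volume=2.0))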

  6. Development of practical methodology and indicators for on-farm animal welfare assessment

    OpenAIRE

    2016-01-01

    163 p. Work described in the doctoral thesis entitled "Development of practical methodology and indicators for on-farm animal welfare assessment" was conducted within the frame of Work Package 1 of the AWIN project by Joanna Marchewka. The research project aimed to optimize strategies for welfare assessment, including pain, in turkeys and sheep. Due to scarce knowledge of turkeys' welfare and the lack of a methodology for its evaluation, the first part of the work concentrated on the development of a ...

  7. A study on development of methodology and guidelines of risk based regulation for optimization of regulation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Hoon; Chang, Soon Hong; Kang, Kee Sik; Paek, Won Phil; Kim, Han Gon; Chang, Hyeon Seop [Korea Association for Nuclear Technology, Taejon (Korea, Republic of)

    1995-12-15

    This project consists of three phases, as follows: defining the RBR (Risk Based Regulation) concept and analysing the state of the art of RBR at the NRC, EPRI, etc.; developing the application areas and guidelines of RBR for the selected areas; and developing the regulatory guideline considering plant-specific conditions in detail. In the first year of this study, elementary work on the risk-based regulation concept was analysed and performed as follows: review of the state of the art of RBR research, establishment of a methodology for the use of PSA (Probabilistic Safety Assessment), establishment of the methodology of risk-based regulation, and identification of application areas for RBR.

  8. Application of infinite model predictive control methodology to other advanced controllers.

    Science.gov (United States)

    Abu-Ayyad, M; Dubay, R; Hernandez, J M

    2009-01-01

    This paper presents an application of a recently developed predictive control algorithm, infinite model predictive control (IMPC), to other advanced control schemes. The IMPC strategy was derived for systems with different degrees of nonlinearity in the process gain and time constant. Also, it was shown that the IMPC structure uses nonlinear open-loop modeling which is conducted while closed-loop control is executed every sampling instant. The main objective of this work is to demonstrate that the methodology of IMPC can be applied to other advanced control strategies, making the methodology generic. The IMPC strategy was implemented on several advanced controllers such as a PI controller using a Smith predictor, a Dahlin controller, simplified predictive control (SPC), dynamic matrix control (DMC), and shifted dynamic matrix control (m-DMC). Experimental work using these approaches combined with IMPC was conducted on both single-input-single-output (SISO) and multi-input-multi-output (MIMO) systems and compared with the original forms of these advanced controllers. Computer simulations were performed on nonlinear plants demonstrating that the IMPC strategy can be readily implemented on other advanced control schemes, providing improved control performance. Practical work included real-time control applications on a DC motor, a plastic injection molding machine and a MIMO three-zone thermal system.
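
    For readers unfamiliar with the baseline schemes mentioned, the sketch below computes a single dynamic matrix control (DMC) move from a step-response model. It is a generic textbook DMC calculation with illustrative process parameters, not the authors' IMPC algorithm.

        import numpy as np

        def dmc_move(step_resp, P, M, lam, error):
            # One DMC move: step_resp holds the unit step-response coefficients
            # a_1..a_P; P and M are the prediction and control horizons; lam is
            # the move-suppression weight; error is the length-P vector of
            # (setpoint - predicted output).
            A = np.zeros((P, M))
            for i in range(P):
                for j in range(M):
                    if i >= j:
                        A[i, j] = step_resp[i - j]
            du = np.linalg.solve(A.T @ A + lam * np.eye(M), A.T @ error)
            return du[0]  # receding horizon: apply only the first move

        # Illustrative first-order process: gain K, time constant tau, step Ts
        K, tau, Ts = 2.0, 10.0, 1.0
        a = K * (1.0 - np.exp(-np.arange(1, 21) * Ts / tau))
        print(dmc_move(a, P=20, M=5, lam=0.1, error=np.ones(20)))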

  9. Platform development for merging various information sources for water management: methodological, technical and operational aspects

    Science.gov (United States)

    Galvao, Diogo

    2013-04-01

    As a result of various economic, social and environmental factors, we can all experience the increasing importance of water resources at a global scale. As a consequence, we can also notice the increasing need for methods and systems capable of efficiently managing and combining the rich and heterogeneous data available that concern, directly or indirectly, these water resources, such as in-situ monitoring station data, Earth Observation images and measurements, meteorological modeling forecasts and hydrological modeling. Under the scope of the MyWater project, we developed a water management system capable of satisfying just such needs, built on a flexible platform capable of accommodating future challenges, not only in terms of sources of data but also of the models applicable to extract information from them. From a methodological point of view, the MyWater platform obtains data from distinct sources, and in distinct formats, be they satellite images or meteorological model forecasts, and transforms and combines them in ways that allow them to be fed to a variety of hydrological models (such as MOHID Land, SIMGRO, etc.), which themselves can also be combined, using approaches such as those advocated by the OpenMI standard, to extract information in an automated and time-efficient manner. Such an approach brings its own share of challenges, and further research was developed under this project on the best ways to combine such data and on novel approaches to hydrological modeling (like the PriceXD model). From a technical point of view, the MyWater platform is structured according to a classical SOA architecture, with a flexible, object-oriented, modular backend service responsible for all the model process management and data treatment, while the information extracted can be interacted with using a variety of frontends, from a web portal and a desktop client down to mobile phone and tablet applications. From an operational point of view, a user can not only see

  10. Methodology for the Incorporation of Passive Component Aging Modeling into the RAVEN/ RELAP-7 Environment

    Energy Technology Data Exchange (ETDEWEB)

    Mandelli, Diego; Rabiti, Cristian; Cogliati, Joshua; Alfonsi, Andrea; Askin Guler; Tunc Aldemir

    2014-11-01

    Passive systems, structures and components (SSCs) degrade over their operating life, and this degradation may cause a reduction in the safety margins of a nuclear power plant. In traditional probabilistic risk assessment (PRA) using the event-tree/fault-tree methodology, passive SSC failure rates are generally based on generic plant failure data, and the true state of a specific plant is not reflected realistically. To address aging effects of passive SSCs in the traditional PRA methodology, [1] does consider physics-based models that account for the operating conditions in the plant; however, [1] does not include the effects of surveillance/inspection. This paper presents an overall methodology for the incorporation of aging modeling of passive components into the RAVEN/RELAP-7 environment, which provides a framework for performing dynamic PRA. Dynamic PRA allows consideration of both epistemic and aleatory uncertainties (including those associated with maintenance activities) in a consistent phenomenological and probabilistic framework and is often needed when there is complex process/hardware/software/firmware/human interaction [2]. Dynamic PRA has gained attention recently due to difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models and also in the modeling of digital instrumentation and control systems. RAVEN (Reactor Analysis and Virtual control Environment) [3] is a software package under development at the Idaho National Laboratory (INL) as an online control logic driver and post-processing tool. It is coupled to the plant transient code RELAP-7 (Reactor Excursion and Leak Analysis Program), also currently under development at INL [3], as well as to RELAP5 [4]. The overall methodology aims to: • address multiple aging mechanisms involving a large number of components in a computationally feasible manner, where sequencing of events is conditioned on the physical conditions predicted in a simulation

  11. Terminology and methodology in modelling for water quality management

    DEFF Research Database (Denmark)

    Carstensen, J.; Vanrolleghem, P.; Rauch, W.

    1997-01-01

    There is a widespread need for a common terminology in modelling for water quality management. This paper points out sources of confusion in the communication between researchers due to misuse of existing terminology or use of unclear terminology. The paper attempts to clarify the context... of the most widely used terms for characterising models and within the process of model building. It is essential to the ever-growing community of researchers within water quality management that communication is eased by establishing a common terminology. This should not be done by giving broader definitions...

  12. Maturity Models Development in IS Research

    DEFF Research Database (Denmark)

    Lasrado, Lester Allan; Vatrapu, Ravi; Andersen, Kim Normann

    2015-01-01

    Maturity models are widespread in IS research and in particular, IT practitioner communities. However, theoretically sound, methodologically rigorous and empirically validated maturity models are quite rare. This literature review paper focuses on the challenges faced during the development...... literature reveals that researchers have primarily focused on developing new maturity models pertaining to domain-specific problems and/or new enterprise technologies. We find rampant re-use of the design structure of widely adopted models such as Nolan’s Stage of Growth Model, Crosby’s Grid, and Capability...... Maturity Model (CMM). Only recently have there been some research efforts to standardize maturity model development. We also identify three dominant views of maturity models and provide guidelines for various approaches of constructing maturity models with a standard vocabulary. We finally propose using...

  13. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    Science.gov (United States)

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT).

  14. Methodology and models in erosion research: discussion and conclusions

    National Research Council Canada - National Science Library

    Shellis, R P; Ganss, C; Ren, Y; Zero, D T; Lussi, A

    2011-01-01

    .... The prospects for clinical trials are also discussed. All models in erosion research require a number of choices regarding experimental conditions, study design and measurement techniques, and these general aspects are discussed first...

  15. Methodologies in the modeling of combined chemo-radiation treatments

    Science.gov (United States)

    Grassberger, C.; Paganetti, H.

    2016-11-01

    The variety of treatment options for cancer patients has increased significantly in recent years. Not only do we combine radiation with surgery and chemotherapy, new therapeutic approaches such as immunotherapy and targeted therapies are starting to play a bigger role. Physics has made significant contributions to radiation therapy treatment planning and delivery. In particular, treatment plan optimization using inverse planning techniques has improved dose conformity considerably. Furthermore, medical physics is often the driving force behind tumor control and normal tissue complication modeling. While treatment optimization and outcome modeling focus mainly on the effects of radiation, treatment modalities such as chemotherapy are treated independently or are even neglected entirely. This review summarizes the published efforts to model combined modality treatments combining radiation and chemotherapy. These models will play an increasing role in optimizing cancer therapy not only from a radiation and drug dosage standpoint, but also in terms of spatial and temporal optimization of treatment schedules.
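
    As one example of the radiation-side outcome modeling referred to above, the sketch below combines the standard linear-quadratic survival model with a Poisson tumor control probability (TCP). All parameter values are illustrative; in combined-modality models a chemotherapy effect is often introduced as additional log cell kill.

        import numpy as np

        def tcp_lq(n_cells, dose_per_fx, n_fx, alpha, beta):
            # Linear-quadratic survival per fraction, Poisson TCP overall
            surv = np.exp(-(alpha * dose_per_fx + beta * dose_per_fx ** 2))
            surviving_cells = n_cells * surv ** n_fx
            return np.exp(-surviving_cells)

        # Illustrative values: 1e9 clonogens, 30 fractions of 2 Gy
        print(tcp_lq(1e9, dose_per_fx=2.0, n_fx=30, alpha=0.3, beta=0.03))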

  16. Systematic reviews of animal models: methodology versus epistemology.

    Science.gov (United States)

    Greek, Ray; Menache, Andre

    2013-01-01

    Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions.

  17. Systematic Reviews of Animal Models: Methodology versus Epistemology

    Directory of Open Access Journals (Sweden)

    Ray Greek, Andre Menache

    2013-01-01

    Full Text Available Systematic reviews are currently favored methods of evaluating research in order to reach conclusions regarding medical practice. The need for such reviews is necessitated by the fact that no research is perfect and experts are prone to bias. By combining many studies that fulfill specific criteria, one hopes that the strengths can be multiplied and thus reliable conclusions attained. Potential flaws in this process include the assumptions that underlie the research under examination. If the assumptions, or axioms, upon which the research studies are based, are untenable either scientifically or logically, then the results must be highly suspect regardless of the otherwise high quality of the studies or the systematic reviews. We outline recent criticisms of animal-based research, namely that animal models are failing to predict human responses. It is this failure that is purportedly being corrected via systematic reviews. We then examine the assumption that animal models can predict human outcomes to perturbations such as disease or drugs, even under the best of circumstances. We examine the use of animal models in light of empirical evidence comparing human outcomes to those from animal models, complexity theory, and evolutionary biology. We conclude that even if legitimate criticisms of animal models were addressed, through standardization of protocols and systematic reviews, the animal model would still fail as a predictive modality for human response to drugs and disease. Therefore, systematic reviews and meta-analyses of animal-based research are poor tools for attempting to reach conclusions regarding human interventions.

  18. A Modeling Methodology to Support Evaluation Public Health Impacts on Air Pollution Reduction Programs

    Science.gov (United States)

    Environmental public health protection requires a good understanding of types and locations of pollutant emissions of health concern and their relationship to environmental public health indicators. Therefore, it is necessary to develop the methodologies, data sources, and tools...

  19. Development of an accident sequence precursor methodology and its application to significant accident precursors

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seung Hyun; Park, Sung Hyun; Jae, Moo Sung [Dept. of of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of)

    2017-03-15

    The systematic management of plant risk is crucial for enhancing the safety of nuclear power plants and for designing new nuclear power plants. Accident sequence precursor (ASP) analysis may be able to provide the risk significance of operational experience by using probabilistic risk assessment to evaluate an operational event quantitatively in terms of its impact on core damage. In this study, an ASP methodology for two operation modes, full power and low power/shutdown operation, has been developed and applied to significant accident precursors that may occur during the operation of nuclear power plants. Two operational events, loss of feedwater and steam generator tube rupture, are identified as ASPs. Therefore, the ASP methodology developed in this study may contribute to identifying plant risk significance as well as to enhancing the safety of nuclear power plants by applying this methodology systematically.

  20. The National Aviation Operational Monitoring Service (NAOMS): A Documentation of the Development of a Survey Methodology

    Science.gov (United States)

    Connors, Mary M.; Mauro, Robert; Statler, Irving C.

    2012-01-01

    The National Aviation Operational Monitoring Service (NAOMS) was a research project under NASA's Aviation Safety Program during the years from 2000 to 2005. The purpose of this project was to develop a methodology for gaining reliable information on changes over time in the rates-of-occurrence of safety-related events as a means of assessing the safety of the national airspace. The approach was a scientifically designed survey of the operators of the aviation system concerning their safety-related experiences. This report presents the results of the methodology developed and a demonstration of the NAOMS concept through a survey of nearly 20,000 randomly selected air-carrier pilots. Results give evidence that the NAOMS methodology can provide a statistically sound basis for evaluating trends of incidents that could compromise safety. The approach and results are summarized in the report, and supporting documentation and complete analyses of results are presented in 14 appendices.

  1. Developing a Design Methodology for Web 2.0 Mediated Learning

    DEFF Research Database (Denmark)

    2010-01-01

    We describe how this method has been adopted as part of a learning methodology, building on concepts and models presented in the other symposium papers, in particular those of active, problem-based learning and Web 2.0 technologies. The challenge of designing on the basis of an explicit learning

  2. Performance Support Engineering: An Emerging Development Methodology for Enabling Organizational Learning.

    Science.gov (United States)

    Raybould, Barry

    1995-01-01

    Discussion of electronic performance support systems (EPSS) focuses on performance support engineering and its role in designing performance support systems. Highlights include the organizational performance/learning cycle model; a systems approach to EPSS; computer-based training and other EPSS methodologies; and future possibilities. (LRW)

  3. A Roadmap for Generating Semantically Enriched Building Models According to CityGML Model via Two Different Methodologies

    Science.gov (United States)

    Floros, G.; Solou, D.; Pispidikis, I.; Dimopoulou, E.

    2016-10-01

    The number of 3D modeling techniques has increased rapidly due to the advances of new technologies. Nowadays, 3D modeling software focuses not only on the finest visualization of the models, but also on their semantic features during the modeling procedure. As a result, the models thus generated are both realistic and semantically enriched. Additionally, various extensions of modeling software allow for the immediate conversion of the model's format, via semi-automatic procedures, with respect to the user's scope. The aim of this paper is to investigate the generation of a semantically enriched CityGML building model via two different methodologies. The first methodology includes modeling in Trimble SketchUp and transformation in FME Desktop Manager, while the second methodology includes the model's generation in CityEngine and its transformation into the CityGML format via the 3DCitiesProject extension for ArcGIS. Finally, the two aforesaid methodologies are compared and specific characteristics are evaluated, in order to infer the methodology that is best applied depending on the different projects' purposes.

  4. Modelling methodology for engineering of complex sociotechnical systems

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2014-10-01

    Full Text Available Different systems engineering techniques and approaches are applied to design and develop complex sociotechnical systems for complex problems. In a complex sociotechnical system cognitive and social humans use information technology to make sense...

  5. Methodology of formation and development of the research-andproduction clusters in the region

    Directory of Open Access Journals (Sweden)

    E. L. Smolyanova

    2012-01-01

    Full Text Available In this article, the author's point of view on the set of means and methods necessary for the formation and development of research-and-production clusters is presented. Indicative planning, which creates conditions for the coordination of the economic interests of business and the state, is considered as the basic doctrine for developing a methodology for the formation and development of research-and-production clusters.

  6. The Backyard Human Performance Technologist: Applying the Development Research Methodology to Develop and Validate a New Instructional Design Framework

    Science.gov (United States)

    Brock, Timothy R.

    2009-01-01

    Development research methodology (DRM) has been recommended as a viable research approach to expand the practice-to-theory/theory-to-practice literature that human performance technology (HPT) practitioners can integrate into the day-to-day work flow they already use to develop instructional products. However, little has been written about how it…

  7. Measuring Team Shared Understanding Using the Analysis-Constructed Shared Mental Model Methodology

    Science.gov (United States)

    Johnson, Tristan E.; O'Connor, Debra L.

    2008-01-01

    Teams are an essential part of successful performance in learning and work environments. Analysis-constructed shared mental model (ACSMM) methodology is a set of techniques where individual mental models are elicited and sharedness is determined not by the individuals who provided their mental models but by an analytical procedure. This method…

  8. A European test of pesticide-leaching models: methodology and major recommendations

    NARCIS (Netherlands)

    Vanclooster, M.; Boesten, J.J.T.I.; Trevisan, M.; Brown, C.D.; Capri, E.; Eklo, O.M.; Gottesbüren, B.; Gouy, V.; Linden, van der A.M.A.

    2000-01-01

    Testing of pesticide-leaching models is important in view of their increasing use in pesticide registration procedures in the European Union. This paper presents the methodology and major conclusions of a test of pesticide-leaching models. Twelve models simulating the vertical one-dimensional moveme

  9. An integrated methodology on the suitability of offshore sites for wind farm development

    Science.gov (United States)

    Patlakas, Platon; Galanis, George; Péray, Marie; Filipot, Jean-François; Kalogeri, Christina; Spyrou, Christos; Diamantis, Dimitris; Kallos, Gerorge

    2016-04-01

    During the last decades, the potential of and interest in wind energy investments have been constantly increasing in the European countries. As technology changes rapidly, more and more areas can be identified as suitable for energy applications. Offshore wind farms perfectly illustrate how new technologies allow the construction of bigger, more efficient wind power plants that are resistant to extreme conditions. The current work proposes an integrated methodology to determine the suitability of an offshore marine area for the development of wind farm structures. More specifically, the region of interest is evaluated based both on the natural resources, connected to the local environmental characteristics, and on potential constraints set by anthropogenic or other activities. State-of-the-art atmospheric and wave models and a 10-year hindcast database are utilized in conjunction with local information on a number of potential constraints, leading to a 5-scale suitability index for the whole area. In this way, sub-regions are characterized, at high resolution, as poorly or highly suitable for wind farm development, providing a new tool for technical/research teams and decision makers. In addition, extreme wind and wave conditions and their 50-year return period are analyzed and used to define the safety level of the wind farm's structural characteristics.
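
    The 50-year return period analysis mentioned here is typically done by fitting an extreme-value distribution to annual maxima. A minimal sketch with a Gumbel fit on synthetic data follows; the numbers are illustrative, not the project's hindcast.

        import numpy as np
        from scipy.stats import gumbel_r

        # Synthetic "annual maximum wind speed" sample (illustrative numbers)
        annual_max = np.random.default_rng(1).gumbel(22.0, 3.0, size=40)

        loc, scale = gumbel_r.fit(annual_max)
        T = 50.0  # return period in years
        return_level = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
        print(f"{T:.0f}-year wind speed: {return_level:.1f} m/s")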

  10. Clinical trials in Huntington's disease: Interventions in early clinical development and newer methodological approaches.

    Science.gov (United States)

    Sampaio, Cristina; Borowsky, Beth; Reilmann, Ralf

    2014-09-15

    Since the identification of the Huntington's disease (HD) gene, knowledge has accumulated about mechanisms directly or indirectly affected by the mutated Huntingtin protein. Transgenic and knock-in animal models of HD facilitate the preclinical evaluation of these targets. Several treatment approaches with varying, but growing, preclinical evidence have been translated into clinical trials. We review major landmarks in clinical development and report on the main clinical trials that are ongoing or have been recently completed. We also review clinical trial settings and designs that influence drug-development decisions, particularly given that HD is an orphan disease. In addition, we provide a critical analysis of the evolution of the methodology of HD clinical trials to identify trends toward new processes and endpoints. Biomarker studies, such as TRACK-HD and PREDICT-HD, have generated evidence for the potential usefulness of novel outcome measures for HD clinical trials, such as volumetric imaging, quantitative motor (Q-Motor) measures, and novel cognitive endpoints. All of these endpoints are currently applied in ongoing clinical trials, which will provide insight into their reliability, sensitivity, and validity, and their use may expedite proof-of-concept studies. We also outline the specific opportunities that could provide a framework for a successful avenue toward identifying and efficiently testing and translating novel mechanisms of action in the HD field.

  11. A branch-and-bound methodology within algebraic modelling systems

    NARCIS (Netherlands)

    Bisschop, J.J.; Heerink, J.B.J.; Kloosterman, G.

    1998-01-01

    Through the use of application-specific branch-and-bound directives it is possible to find solutions to combinatorial models that would otherwise be difficult or impossible to find by just using generic branch-and-bound techniques within the framework of mathematical programming. MINTO is an e
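
    For context, the sketch below is a generic branch-and-bound solver for a 0/1 knapsack model, using a fractional (LP-style) bound for pruning. It illustrates the generic technique the abstract contrasts against, and is unrelated to MINTO's application-specific directives.

        def knapsack_bb(values, weights, capacity):
            # Sort items by value density so the fractional bound is tight
            items = sorted(zip(values, weights),
                           key=lambda vw: vw[0] / vw[1], reverse=True)
            best = 0

            def bound(i, value, room):
                # Optimistic bound: fill the remaining room fractionally
                b = value
                while i < len(items) and items[i][1] <= room:
                    b += items[i][0]
                    room -= items[i][1]
                    i += 1
                if i < len(items):
                    b += items[i][0] * room / items[i][1]
                return b

            def branch(i, value, room):
                nonlocal best
                if i == len(items):
                    best = max(best, value)
                    return
                if bound(i, value, room) <= best:
                    return  # prune: this subtree cannot beat the incumbent
                v, w = items[i]
                if w <= room:
                    branch(i + 1, value + v, room - w)  # take item i
                branch(i + 1, value, room)              # skip item i

            branch(0, 0, capacity)
            return best

        print(knapsack_bb([6, 5, 4], [3, 3, 3], capacity=6))  # -> 11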

  12. Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach.

    Science.gov (United States)

    Aromataris, Edoardo; Fernandez, Ritin; Godfrey, Christina M; Holly, Cheryl; Khalil, Hanan; Tungpunkom, Patraporn

    2015-09-01

    With the increase in the number of systematic reviews available, a logical next step to provide decision makers in healthcare with the evidence they require has been the conduct of reviews of existing systematic reviews. Syntheses of existing systematic reviews are referred to by many different names, one of which is an umbrella review. An umbrella review allows the findings of reviews relevant to a review question to be compared and contrasted. An umbrella review's most characteristic feature is that this type of evidence synthesis only considers for inclusion the highest level of evidence, namely other systematic reviews and meta-analyses. A methodology working group was formed by the Joanna Briggs Institute to develop methodological guidance for the conduct of an umbrella review, including diverse types of evidence, both quantitative and qualitative. The aim of this study is to describe the development and guidance for the conduct of an umbrella review. Discussion and testing of the elements of methods for the conduct of an umbrella review were held over a 6-month period by members of a methodology working group. The working group comprised six participants who corresponded via teleconference, e-mail and face-to-face meeting during this development period. In October 2013, the methodology was presented in a workshop at the Joanna Briggs Institute Convention. Workshop participants, review authors and methodologists provided further testing, critique and feedback on the proposed methodology. This study describes the methodology and methods developed for the conduct of an umbrella review that includes published systematic reviews and meta-analyses as the analytical unit of the review. Details are provided regarding the essential elements of an umbrella review, including presentation of the review question in a Population, Intervention, Comparator, Outcome format, nuances of the inclusion criteria and search strategy. A critical appraisal tool with 10 questions to

  13. Methodology for Developing the REScheckTM Software through Version 4.4.3

    Energy Technology Data Exchange (ETDEWEB)

    Bartlett, Rosemarie; Connell, Linda M; Gowri, Krishnan; Lucas, Robert G; Schultz, Robert W; Taylor, Zachary T; Wiberg, John D

    2012-09-01

    MECcheck was renamed REScheck™ to better identify it as a residential code compliance tool. The “MEC” in MECcheck was outdated because it was taken from the Model Energy Code, which has been succeeded by the IECC. The “RES” in REScheck is also a better fit with the companion commercial product, COMcheck™. The easy-to-use REScheck compliance materials include a compliance and enforcement manual for all the MEC and IECC requirements and three compliance approaches for meeting the code’s thermal envelope requirements: prescriptive packages, software, and a trade-off worksheet (included in the compliance manual). The compliance materials can be used for single-family and low-rise multifamily dwellings. The materials allow building energy efficiency measures (such as insulation levels) to be “traded off” against each other, allowing a wide variety of building designs to comply with the code. This report explains the methodology used to develop Version 4.4.3 of the REScheck software developed for the 1992, 1993, and 1995 editions of the MEC, and the 1998, 2000, 2003, 2006, 2007, 2009, and 2012 editions of the IECC, and the 2006 edition of the International Residential Code (IRC). Although some requirements contained in these codes have changed, the methodology used to develop the REScheck software for these editions is similar. Beginning with REScheck Version 4.4.0, support for the 1992, 1993, and 1995 MEC and the 1998 IECC is no longer included, but those sections remain in this document for reference purposes. REScheck assists builders in meeting the most complicated part of the code: the building envelope Uo-, U-, and R-value requirements in Section 502 of the code. This document details the calculations and assumptions underlying the treatment of the code requirements in REScheck, with a major emphasis on the building envelope requirements.
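
    The envelope trade-off idea can be summarized as a total UA comparison: a proposed design complies if the area-weighted sum of its U-factors does not exceed that of the code reference. The sketch below is a minimal illustration of that principle only; the component names, areas and U-factors are invented, not REScheck's data.

        # Envelope trade-off as a total UA comparison; values are illustrative.
        code_u = {"ceiling": 0.030, "wall": 0.082, "window": 0.350}
        proposed_u = {"ceiling": 0.026, "wall": 0.090, "window": 0.320}
        areas = {"ceiling": 120.0, "wall": 180.0, "window": 25.0}  # m^2

        def total_ua(u_factors):
            # Sum of U * A over all envelope components (W/K)
            return sum(u_factors[c] * areas[c] for c in areas)

        # A worse wall can be traded off against a better ceiling and window
        print("complies:", total_ua(proposed_u) <= total_ua(code_u))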

  14. Methodology Aspects of Quantifying Stochastic Climate Variability with Dynamic Models

    Science.gov (United States)

    Nuterman, Roman; Jochum, Markus; Solgaard, Anna

    2015-04-01

    The paleoclimatic records show that climate has changed dramatically through time. For the past few million years it has been oscillating between ice ages, with large parts of the continents covered with ice, and warm interglacial periods like the present one. It is commonly assumed that these glacial cycles are related to changes in insolation due to periodic changes in Earth's orbit around the Sun (Milankovitch theory). However, this relationship is far from understood. The insolation changes are so small that enhancing feedbacks must be at play. It might even be that the external perturbation only plays a minor role in comparison to internal stochastic variations or internal oscillations. This claim is based on several shortcomings in the Milankovitch theory: prior to one million years ago, the duration of the glacial cycles was indeed 41,000 years, in line with the obliquity cycle of Earth's orbit. This duration changed at the so-called Mid-Pleistocene transition to approximately 100,000 years. Moreover, according to Milankovitch's theory, the interglacial of 400,000 years ago should not have happened. Thus, while prior to one million years ago the pacing of these glacial cycles may be tied to changes in Earth's orbit, we do not understand the current magnitude and phasing of the glacial cycles. In principle it is possible that the glacial/interglacial cycles are not due to variations in Earth's orbit, but due to stochastic forcing or internal modes of variability. We present a new method and preliminary results for a unified framework using a fully coupled Earth System Model (ESM), in which the three leading ice age hypotheses will be investigated together. Was the waxing and waning of ice sheets due to an internal mode of variability, due to variations in Earth's orbit, or simply due to a low-order autoregressive process (i.e., noise integrated by a system with memory)? The central idea is to use Generalized Linear Models (GLM), which can handle both
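
    The "noise integrated by a system with memory" hypothesis can be illustrated with a low-order autoregressive process. The sketch below generates an AR(1) series whose slow excursions arise purely from integrated white noise; the parameters are illustrative, not from the study.

        import numpy as np

        def ar1(n, phi, sigma, seed=0):
            # AR(1): x[t] = phi * x[t-1] + eps[t]. With phi near 1 the series
            # shows long, slow excursions driven by white noise alone.
            rng = np.random.default_rng(seed)
            x = np.zeros(n)
            for t in range(1, n):
                x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
            return x

        series = ar1(n=10_000, phi=0.995, sigma=1.0)
        print(series[:5])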

  15. Methodology for Outdoor Water Savings Model and Spreadsheet Tool for U.S. and Selected States

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Alison A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chen, Yuting [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dunham, Camilla [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fuchs, Heidi [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Price, Sarah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stratton, Hannah [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    Green lawns and landscaping are archetypical of the populated American landscape, and typically require irrigation, which corresponds to a significant fraction of residential, commercial, and institutional water use. In North American cities, the estimated portion of residential water used for outdoor purposes ranges from 22-38% in cooler climates up to 59-67% in dry and hot environments, while turfgrass coverage within the United States spans 11.1-20.2 million hectares (Milesi et al. 2009). One national estimate uses satellite and aerial photography data to develop a relationship between impervious surface and lawn surface area, yielding a conservative estimate of 16.4 (± 3.6) million hectares of lawn surface area in the United States—an area three times larger than that devoted to any irrigated crop (Milesi et al. 2005). One approach that holds promise for cutting unnecessary outdoor water use is the increased deployment of “smart” irrigation controllers to increase the water efficiency of irrigation systems. This report describes the methodology and inputs employed in a mathematical model that quantifies the effects of the U.S. Environmental Protection Agency’s WaterSense labeling program for one such type of controller, weather-based irrigation controllers (WBIC). This model builds off that described in “Methodology for National Water Savings Model and Spreadsheet Tool–Outdoor Water Use” and uses a two-tiered approach to quantify outdoor water savings attributable to the WaterSense program for WBIC, as well as net present value (NPV) of that savings. While the first iteration of the model assessed national impacts using averaged national values, this version begins by evaluating impacts in three key large states that make up a sizable portion of the irrigation market: California, Florida, and Texas. These states are considered to be the principal market of “smart” irrigation controllers that may result in the bulk of national savings. Modeled
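
    The net present value (NPV) calculation mentioned above discounts a stream of annual water-cost savings back to the present; a minimal sketch with invented numbers follows (the savings stream and discount rate are illustrative assumptions, not the report's inputs).

        def npv(annual_savings, rate):
            # Discount annual savings (years 1..N) at the given rate
            return sum(s / (1.0 + rate) ** t
                       for t, s in enumerate(annual_savings, start=1))

        # Illustrative: 10 years of $2M national savings at a 3% discount rate
        print(npv([2.0e6] * 10, rate=0.03))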

  16. Methodology of Remote Control of Competitive Swimmers’ Individual Energetic Profile Development

    Directory of Open Access Journals (Sweden)

    Kh.A. Sanosyan

    2012-06-01

    Full Text Available The offered approach helps to estimate the correlation of a swimmer's major energetic mechanisms remotely, in parallel to competition (or training), based on data obtained by means of GPS and other devices. This methodology promotes the efficient use of means of educational control, developed and tested previously, implementing the concept of parallel control over training and competition.

  17. Research and Development in an ICL Project: A Methodology for Understanding Meaning Making in Economics

    Science.gov (United States)

    Paxton, Moragh

    2011-01-01

    This article focuses on the methodology for an academic literacies research project in an Integrated Content and Language (ICL) collaboration in economics and the ways in which the findings from the research contributed to further development and expansion of the ICL endeavour. The research was conducted independently rather than collaboratively…

  18. A methodology for developing product platforms in the specific setting of the housebuilding industry

    NARCIS (Netherlands)

    Veenstra, Vanessa S.; Halman, Johannes I.M.; Voordijk, Johannes T.

    2006-01-01

    Platform based strategies have proved to be a successful approach for achieving optimum balances between standardization and variation in many industries. However, application of this concept in the housebuilding industry is relatively new. This article describes a new methodology for developing pro

  19. Methodologies Developed for EcoCity Related Projects: New Borg El Arab, an Egyptian Case Study

    Directory of Open Access Journals (Sweden)

    Carmen Antuña-Rozado

    2016-08-01

    Full Text Available The aim of the methodologies described here is to propose measures and procedures for developing concepts and technological solutions, adapted to local conditions, to build sustainable communities in developing countries and emerging economies. These methodologies are linked to the EcoCity framework outlined by VTT Technical Research Centre of Finland Ltd. for sustainable community and neighbourhood regeneration and development. The framework is the result of long experience in numerous EcoCity related projects, mainly Nordic and European in scope, which has been reformulated in recent years to respond to local needs in the previously mentioned countries. There is also a particular emphasis on close collaboration with local partners and major stakeholders. In order to illustrate how these methodologies can support EcoCity concept development and implementation, results from a case study in Egypt are discussed. The referred case study relates to the transformation of New Borg El Arab (NBC), near Alexandria, into an EcoCity. The viability of the idea was explored making use of different methodologies (Roadmap, Feasibility Study, and Residents Energy Survey and Building Consumption Assessment) and considering the Residential, Commercial/Public Facilities, Industrial, Services/Utilities, and Transport sectors.

  20. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    Science.gov (United States)

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…