WorldWideScience

Sample records for modeling tool developed

  1. Developing a Modeling Tool Using Eclipse

    NARCIS (Netherlands)

    Kirtley, Nick; Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Tool development using an open source platform provides autonomy to users to change, use, and develop cost-effective software with freedom from licensing requirements. However, open source tool development poses a number of challenges, such as poor documentation and continuous evolution. In this

  2. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  3. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for the study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results.

  4. Accessing Curriculum Through Technology Tools (ACTTT): A Model Development Project

    Science.gov (United States)

    Daytner, Katrina M.; Johanson, Joyce; Clark, Letha; Robinson, Linda

    2012-01-01

    Accessing Curriculum Through Technology Tools (ACTTT), a project funded by the U.S. Office of Special Education Programs (OSEP), developed and tested a model designed to allow children in early elementary school, including those "at risk" and with disabilities, to better access, participate in, and benefit from the general curriculum.…

  5. Development Life Cycle and Tools for XML Content Models

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL]; Morris, Katherine [National Institute of Standards and Technology (NIST)]; Buhwan, Jeong [POSTECH University, South Korea]; Goyal, Puja [National Institute of Standards and Technology (NIST)]

    2004-11-01

    Many integration projects today rely on shared semantic models based on standards represented using Extensible Markup Language (XML) technologies. Shared semantic models typically evolve and require maintenance. In addition, to promote interoperability and reduce integration costs, the shared semantics should be reused as much as possible. Semantic components must be consistent and valid in terms of agreed upon standards and guidelines. In this paper, we describe an activity model for creation, use, and maintenance of a shared semantic model that is coherent and supports efficient enterprise integration. We then use this activity model to frame our research and the development of tools to support those activities. We provide overviews of these tools primarily in the context of the W3C XML Schema. At present, we focus our work on the W3C XML Schema as the representation of choice, due to its extensive adoption by industry.

  6. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanical analysis. The applicability of the engineering integrity assessment system MASI for evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which limit its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to prediction of ductile failure behaviour of cracked structures. (author)

  7. Development of hydrogeological modelling tools based on NAMMU

    International Nuclear Information System (INIS)

    Marsic, N.; Hartley, L.; Jackson, P.; Poole, M.; Morvik, A.

    2001-09-01

    A number of relatively sophisticated hydrogeological models were developed within the SR 97 project to handle issues such as nesting of scales and the effects of salinity. However, these issues and others are considered of significant importance and generality to warrant further development of the hydrogeological methodology. Several such developments based on the NAMMU package are reported here: - Embedded grid: nesting of the regional- and site-scale models within the same numerical model has given greater consistency in the structural model representation and in the flow between scales. Since there is a continuous representation of the regional- and site-scales the modelling of pathways from the repository no longer has to be contained wholly by the site-scale region. This allows greater choice in the size of the site-scale. - Implicit Fracture Zones (IFZ): this method of incorporating the structural model is very efficient and allows changes to either the mesh or fracture zones to be implemented quickly. It also supports great flexibility in the properties of the structures and rock mass. - Stochastic fractures: new functionality has been added to IFZ to allow arbitrary combinations of stochastic or deterministic fracture zones with the rock-mass. Whether a fracture zone is modelled deterministically or stochastically its statistical properties can be defined independently. - Stochastic modelling: efficient methods for Monte-Carlo simulation of stochastic permeability fields have been implemented and tested on SKB's computers. - Visualisation: the visualisation tool Avizier for NAMMU has been enhanced such that it is efficient for checking models and presentation. - PROPER interface: NAMMU outputs pathlines in PROPER format so that it can be included in PA workflow. The developed methods are illustrated by application to stochastic nested modelling of the Beberg site using data from SR 97. The model properties were in accordance with the regional- and site

  8. Gsflow-py: An integrated hydrologic model development tool

    Science.gov (United States)

    Gardner, M.; Niswonger, R. G.; Morton, C.; Henson, W.; Huntington, J. L.

    2017-12-01

    Integrated hydrologic modeling encompasses a vast number of processes and specifications, variable in time and space, and development of model datasets can be arduous. Model input construction techniques have not been formalized or made easily reproducible. Creating the input files for integrated hydrologic models (IHM) requires complex GIS processing of raster and vector datasets from various sources. Developing stream network topology that is consistent with the model resolution digital elevation model is important for robust simulation of surface water and groundwater exchanges. Distribution of meteorologic parameters over the model domain is difficult in complex terrain at the model resolution scale, but is necessary to drive realistic simulations. Historically, development of input data for IHM models has required extensive GIS and computer programming expertise which has restricted the use of IHMs to research groups with available financial, human, and technical resources. Here we present a series of Python scripts that provide a formalized technique for the parameterization and development of integrated hydrologic model inputs for GSFLOW. With some modifications, this process could be applied to any regular grid hydrologic model. This Python toolkit automates many of the necessary and laborious processes of parameterization, including stream network development and cascade routing, land coverages, and meteorological distribution over the model domain.
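
    As a hedged illustration of one of the parameterization steps described above (distributing meteorological forcing over the model domain), the sketch below applies simple inverse-distance weighting of station precipitation onto a regular grid; the station coordinates, values, and grid are invented and the snippet is not taken from the GSFLOW-py toolkit itself.

      # Sketch: inverse-distance-weighted (IDW) distribution of station precipitation
      # onto a regular model grid. Illustrative only; not the GSFLOW-py implementation.
      import numpy as np

      def idw_grid(stations_xy, station_values, grid_x, grid_y, power=2.0):
          """Interpolate point observations to a grid with inverse-distance weighting."""
          gx, gy = np.meshgrid(grid_x, grid_y)            # grid cell centres
          grid = np.zeros_like(gx, dtype=float)
          weights_sum = np.zeros_like(gx, dtype=float)
          for (sx, sy), val in zip(stations_xy, station_values):
              d = np.hypot(gx - sx, gy - sy)
              d = np.where(d < 1e-6, 1e-6, d)             # avoid divide-by-zero at a station
              w = 1.0 / d**power
              grid += w * val
              weights_sum += w
          return grid / weights_sum

      # Hypothetical stations (x, y in model coordinates) and daily precipitation (mm)
      stations = [(2.0, 3.0), (8.0, 7.5), (5.0, 9.0)]
      precip = [4.2, 1.1, 2.8]
      field = idw_grid(stations, precip, np.arange(0, 10, 1.0), np.arange(0, 10, 1.0))
      print(field.round(2))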

  9. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

    This paper describes a tool for formal modelling of relay interlocking systems and explains how it has been stepwise, formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and produces as a result a state transition system modelling...

  10. Improved Modeling Tools Development for High Penetration Solar

    Energy Technology Data Exchange (ETDEWEB)

    Washom, Byron [Univ. of California, San Diego, CA (United States); Meagher, Kevin [Power Analytics Corporation, San Diego, CA (United States)

    2014-12-11

    One of the significant objectives of the High Penetration solar research is to help the DOE understand, anticipate, and minimize grid operation impacts as more solar resources are added to the electric power system. For Task 2.2, an effective, reliable approach to predicting solar energy availability for energy generation forecasts using the University of California, San Diego (UCSD) Sky Imager technology has been demonstrated. Granular cloud and ramp forecasts for the next 5 to 20 minutes over an area of 10 square miles were developed. Sky images taken every 30 seconds are processed to determine cloud locations and cloud motion vectors, yielding future cloud shadow locations with respect to distributed generation or utility solar power plants in the area. The performance of the method depends on cloud characteristics. On days with more advective cloud conditions, the developed method outperforms persistence forecasts by up to 30% (based on mean absolute error). On days with dynamic conditions, the method performs worse than persistence. Sky Imagers hold promise for ramp forecasting and ramp mitigation in conjunction with inverter controls and energy storage. The pre-commercial Sky Imager solar forecasting algorithm was documented with licensing information and was a Sunshot website highlight.
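
    The reported comparison against persistence "based on mean absolute error" corresponds to a standard skill score; a minimal sketch with made-up normalized output series (not UCSD Sky Imager data) is:

      # Sketch: MAE-based skill of a sky-imager forecast relative to persistence.
      # The series below are illustrative, not UCSD Sky Imager data.
      import numpy as np

      obs      = np.array([0.82, 0.75, 0.40, 0.35, 0.60, 0.90])  # observed plant output (normalized)
      forecast = np.array([0.80, 0.70, 0.45, 0.38, 0.55, 0.88])  # sky-imager forecast, minutes ahead
      persist  = np.roll(obs, 1)[1:]                              # persistence: last observed value
      obs_eval, fc_eval = obs[1:], forecast[1:]

      mae_fc = np.mean(np.abs(fc_eval - obs_eval))
      mae_p  = np.mean(np.abs(persist - obs_eval))
      skill  = 1.0 - mae_fc / mae_p   # > 0 means the forecast beats persistence
      print(f"MAE forecast={mae_fc:.3f}, MAE persistence={mae_p:.3f}, skill={skill:.1%}")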

  11. Model-based development of a course of action scheduling tool

    DEFF Research Database (Denmark)

    Kristensen, Lars Michael; Mechlenborg, Peter; Zhang, Lin

    2008-01-01

    This paper shows how a formal method in the form of Coloured Petri Nets (CPNs) and the supporting CPN Tools have been used in the development of the Course of Action Scheduling Tool (COAST). The aim of COAST is to support human planners in the specification and scheduling of tasks in a Course of Action. CPNs have been used to develop a formal model of the task execution framework underlying COAST. The CPN model has been extracted in executable form from CPN Tools and embedded directly into COAST, thereby automatically bridging the gap between the formal specification and its implementation. The scheduling capabilities of COAST are based on state space exploration of the embedded CPN model. Planners interact with COAST using a domain-specific graphical user interface (GUI) that hides the embedded CPN model and analysis algorithms. This means that COAST is based on a rigorous semantical model...

  12. Development of a surrogate model for elemental analysis using a natural gamma ray spectroscopy tool

    International Nuclear Information System (INIS)

    Zhang, Qiong

    2015-01-01

    A systematic computational method for obtaining accurate elemental standards efficiently for varying borehole conditions was developed based on Monte Carlo simulations, surrogate modeling, and data assimilation. Elemental standards are essential for spectral unfolding in formation evaluation applications commonly used for nuclear well logging tools. Typically, elemental standards are obtained by standardized measurements, but these experiments are expensive and lack the flexibility to address different logging conditions. In contrast, computer-based Monte Carlo simulations provide an accurate and more flexible approach to obtaining elemental standards for formation evaluation. The presented computational method recognizes that in contrast to typical neutron–photon simulations, where the source is typically artificial and well characterized (Galford, 2009), an accurate knowledge of the source is essential for matching the obtained Monte Carlo elemental standards with their experimental counterparts. Therefore, source distributions are adjusted to minimize the L2 difference of the Monte Carlo computed and experimental standards. Subsequently, an accurate surrogate model is developed accounting for different casing and cement thicknesses, and tool positions within the borehole. The adjusted source distributions are then utilized to generate and validate spectra for varying borehole conditions: tool position, casing and cement thickness. The effect of these conditions on the spectra is investigated and discussed in this work. Given that Monte Carlo modeling provides much lower cost and more flexibility, employing Monte Carlo-computed standards could enhance the processing of nuclear logging tool data. - Highlights: • A novel computational model for efficiently computing elemental standards for varying borehole conditions has been developed. • A model of an experimental test pit was implemented in the Monte Carlo code GEANT4 for computing elemental standards.
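
    The source-adjustment step described above amounts to a least-squares fit: choose source-component weights that minimize the L2 difference between simulated and experimental standards. A hedged sketch with synthetic spectra (placeholders, not GEANT4 output) is:

      # Sketch: adjust source-component weights so the simulated spectrum matches the
      # measured elemental standard in a least-squares (L2) sense. Synthetic data only.
      import numpy as np

      rng = np.random.default_rng(0)
      n_channels, n_components = 256, 3

      # Columns are Monte Carlo response spectra for individual source components.
      component_spectra = np.abs(rng.normal(size=(n_channels, n_components)))

      true_weights = np.array([0.6, 0.3, 0.1])
      measured = component_spectra @ true_weights + rng.normal(scale=0.01, size=n_channels)

      # Non-negative weights are physically sensible; ordinary lstsq is shown for brevity.
      weights, residual, *_ = np.linalg.lstsq(component_spectra, measured, rcond=None)
      l2_misfit = np.linalg.norm(component_spectra @ weights - measured)
      print("recovered weights:", weights.round(3), " L2 misfit:", round(l2_misfit, 4))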

  13. Using Model-Eliciting Activities as a Tool to Identify and Develop Mathematically Creative Students

    Science.gov (United States)

    Coxbill, Emmy; Chamberlin, Scott A.; Weatherford, Jennifer

    2013-01-01

    Traditional classroom methods for identifying mathematically creative students have been inadequate. Identifying students who could potentially be mathematically creative is instrumental in the development of students and in meeting their affective and educational needs. One prospective identification tool is the use of model-eliciting activities…

  14. A new framework for modeling decentralized low impact developments using Soil and Water Assessment Tool

    Science.gov (United States)

    Assessing the performance of Low Impact Development (LID) practices at a catchment scale is important in managing urban watersheds. Few modeling tools exist that are capable of explicitly representing the hydrological mechanisms of LIDs while considering the diverse land uses of urban watersheds. ...

  15. Tools for Resilience Management: Multidisciplinary Development of State-and-Transition Models for Northwest Colorado

    Directory of Open Access Journals (Sweden)

    Emily J. Kachergis

    2013-12-01

    Building models is an important way of integrating knowledge. Testing and updating models of social-ecological systems can inform management decisions and, ultimately, improve resilience. We report on the outcomes of a six-year, multidisciplinary model development process in the sagebrush steppe, USA. We focused on creating state-and-transition models (STMs), conceptual models of ecosystem change that represent nonlinear dynamics and are being adopted worldwide as tools for managing ecosystems. STM development occurred in four steps with four distinct sets of models: (1) local knowledge elicitation using semistructured interviews; (2) ecological data collection using an observational study; (3) model integration using participatory workshops; and (4) model simplification upon review of the literature by a multidisciplinary team. We found that different knowledge types are ultimately complementary. Many of the benefits of the STM-building process flowed from the knowledge integration steps, including improved communication, identification of uncertainties, and production of more broadly credible STMs that can be applied in diverse situations. The STM development process also generated hypotheses about sagebrush steppe dynamics that could be tested by future adaptive management and research. We conclude that multidisciplinary development of STMs has great potential for producing credible, useful tools for managing resilience of social-ecological systems. Based on this experience, we outline a streamlined, participatory STM development process that integrates multiple types of knowledge and incorporates adaptive management.

  16. The mesoscale dispersion modeling system a simulation tool for development of an emergency response system

    International Nuclear Information System (INIS)

    Uliasz, M.

    1990-01-01

    The mesoscale dispersion modeling system is under continuous development. The included numerical models require further improvements and evaluation against data from meteorological and tracer field experiments. The system cannot be directly applied to real-time predictions. However, it seems to be a useful simulation tool for solving several problems related to planning the monitoring network and developing the emergency response system for the nuclear power plant located in a coastal area. The modeling system can also be applied to other environmental problems connected with air pollution dispersion in complex terrain. The presented numerical models are designed for use on personal computers and are relatively fast in comparison with similar mesoscale models developed on mainframe computers

  17. A modeling tool to support decision making in future hydropower development in Chile

    Science.gov (United States)

    Vicuna, S.; Hermansen, C.; Cerda, J. P.; Olivares, M. A.; Gomez, T. I.; Toha, E.; Poblete, D.; Mao, L.; Falvey, M. J.; Pliscoff, P.; Melo, O.; Lacy, S.; Peredo, M.; Marquet, P. A.; Maturana, J.; Gironas, J. A.

    2017-12-01

    Modeling tools support planning by providing transparent means to assess the outcome of natural resources management alternatives within technical frameworks in the presence of conflicting objectives. Such tools, when employed to model different scenarios, complement discussion in a policy-making context. Examples of the practical use of this type of tool exist, such as Canadian public forest management, but they are not common, especially in the context of developing countries. We present a tool to support the selection from a portfolio of potential future hydropower projects in Chile. This tool, developed by a large team of researchers under the guidance of the Chilean Energy Ministry, is especially relevant in a context of evident regionalism, skepticism and changing societal values in a country that has achieved sustained growth alongside increased demands from society. The tool operates at the scale of a river reach, between 1 and 5 km long, on a domain that can be defined according to the scale needs of the related discussion, and its application can vary from river basins to regions or other spatial configurations that may be of interest. The tool addresses both the available hydropower potential and the existence (inferred or observed) of other ecological, social, cultural and productive characteristics of the territory which are valuable to society, and provides a means to evaluate their interaction. The occurrence of each of these other valuable characteristics in the territory is measured by generating a presence-density score for each. Considering the level of constraint each characteristic imposes on hydropower development, they are weighted against each other and an aggregate score is computed. With this information, optimal trade-offs are computed between additional hydropower capacity and valuable local characteristics over the entire domain, using the classical 0-1 knapsack optimization algorithm. Various scenarios of different weightings and hydropower
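
    Since the abstract names the classical 0-1 knapsack formulation, a minimal sketch of that trade-off - maximizing added capacity subject to a cap on the aggregate territorial-value score - is given below; the project list and the impact budget are invented for illustration.

      # Sketch: 0-1 knapsack selection of hydropower projects.
      # value  = additional capacity (MW); weight = aggregate score of affected
      # ecological/social/cultural/productive characteristics. Numbers are invented.
      def knapsack_01(capacity_mw, impact_score, impact_budget):
          n = len(capacity_mw)
          dp = [0.0] * (impact_budget + 1)                       # best capacity within impact w
          keep = [[False] * (impact_budget + 1) for _ in range(n)]
          for i in range(n):
              for w in range(impact_budget, impact_score[i] - 1, -1):
                  candidate = dp[w - impact_score[i]] + capacity_mw[i]
                  if candidate > dp[w]:
                      dp[w] = candidate
                      keep[i][w] = True
          chosen, w = [], impact_budget                           # backtrack to recover the set
          for i in range(n - 1, -1, -1):
              if keep[i][w]:
                  chosen.append(i)
                  w -= impact_score[i]
          return dp[impact_budget], sorted(chosen)

      capacity = [120.0, 45.0, 300.0, 80.0]   # MW per candidate project
      impact   = [3, 1, 7, 2]                 # integer aggregate impact scores
      best_mw, projects = knapsack_01(capacity, impact, impact_budget=8)
      print(best_mw, projects)                # -> 345.0, [1, 2]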

  18. A remote sensing computer-assisted learning tool developed using the unified modeling language

    Science.gov (United States)

    Friedrich, J.; Karslioglu, M. O.

    The goal of this work has been to create an easy-to-use and simple-to-make learning tool for remote sensing at an introductory level. Many students struggle to comprehend what seems to be a very basic knowledge of digital images, image processing and image arithmetic, for example. Because professional programs are generally too complex and overwhelming for beginners and often not tailored to the specific needs of a course regarding functionality, a computer-assisted learning (CAL) program was developed based on the unified modeling language (UML), the present standard for object-oriented (OO) system development. A major advantage of this approach is an easier transition from modeling to coding of such an application, if modern UML tools are being used. After introducing the constructed UML model, its implementation is briefly described followed by a series of learning exercises. They illustrate how the resulting CAL tool supports students taking an introductory course in remote sensing at the author's institution.

  19. Neuro-fuzzy models as an IVIVR tool and their applicability in generic drug development.

    Science.gov (United States)

    Opara, Jerneja; Legen, Igor

    2014-03-01

    The usefulness of neuro-fuzzy (NF) models as an alternative in vitro-in vivo relationship (IVIVR) tool and as a support to quality by design (QbD) in generic drug development is presented. For drugs with complicated pharmacokinetics, immediate release drugs or nasal sprays, the suggested level A correlations are not capable of satisfactorily describing the IVIVR. NF systems were recognized as a reasonable method in comparison to the published approaches for the development of IVIVR. Consequently, NF models were built to predict 144 pharmacokinetic (PK) parameter ratios required for demonstration of bioequivalence (BE) for 88 pivotal BE studies. Input parameters of the models included dissolution data and their combinations in different media, presence of food, formulation strength, technology type, particle size, and spray pattern for nasal sprays. Ratios of the PK parameters Cmax or AUC were used as output variables. The prediction performance of the models resulted in the following values: 79% of models have an acceptable external prediction error (PE) below 10%, 13% of models have an inconclusive PE between 10 and 20%, and the remaining 8% of models show an inadequate PE above 20%. Average internal predictability (LE) is 0.3%, and average external predictability of all models is 7.7%. On average, the models have acceptable internal and external predictabilities with PE lower than 10% and are therefore useful for IVIVR needs during formulation development, as a support to QbD and for the prediction of BE study outcome.
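
    For orientation, the external prediction error (PE) quoted above is commonly computed as the percentage deviation of the predicted PK ratio from the observed one; a minimal bookkeeping sketch with invented ratios follows (the exact PE definition used in the paper may differ).

      # Sketch: external prediction error (PE, %) for predicted vs. observed PK
      # parameter ratios (e.g., test/reference Cmax or AUC). Values are invented.
      def prediction_error(observed, predicted):
          """Absolute percentage deviation of the prediction from the observation."""
          return abs(predicted - observed) / observed * 100.0

      studies = {            # observed ratio, model-predicted ratio
          "study_A_Cmax": (0.98, 1.02),
          "study_B_AUC":  (1.05, 0.96),
          "study_C_Cmax": (0.88, 1.07),
      }

      for name, (obs, pred) in studies.items():
          pe = prediction_error(obs, pred)
          verdict = ("acceptable (<10%)" if pe < 10
                     else "inconclusive (10-20%)" if pe < 20
                     else "inadequate (>20%)")
          print(f"{name}: PE = {pe:.1f}%  -> {verdict}")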

  20. A tool for urban soundscape evaluation applying Support Vector Machines for developing a soundscape classification model.

    Science.gov (United States)

    Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F

    2014-06-01

    To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools that enable such a task to be performed. An essential step in managing urban areas from a sound standpoint is the evaluation of the soundscape in the area. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step in evaluating it, providing a basis for designing or adapting it to match people's expectations as well. Accordingly, this work proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended to be used as a tool for comprehensive urban soundscape evaluation. Because of the great complexity associated with the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented in developing the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified). © 2013 Elsevier B.V. All rights reserved.
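
    For orientation only, the snippet below sketches this kind of supervised pipeline with scikit-learn's SVC (whose libsvm backend is itself an SMO-type solver); the features and labels are random stand-ins for the study's acoustical and perceptual descriptors and soundscape categories, and scikit-learn is an assumption, not the toolchain used by the authors.

      # Sketch: SVM classification of soundscape categories from acoustic descriptors.
      # Random features stand in for the real acoustical/perceptual variables.
      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(42)
      X = rng.normal(size=(200, 12))      # 200 recordings, 12 descriptors (e.g., LAeq, loudness)
      y = rng.integers(0, 4, size=200)    # 4 hypothetical soundscape categories

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
      scores = cross_val_score(clf, X, y, cv=5)
      print(f"cross-validated accuracy: {scores.mean():.1%} +/- {scores.std():.1%}")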

  1. ACORNS: a tool for the visualisation and modelling of atypical development.

    Science.gov (United States)

    Moore, D G; George, R

    2011-10-01

    Across many academic disciplines visualisation and notation systems are used for modelling data and developing theory, but in child development visual models are not widely used; yet researchers and students of developmental difficulties may benefit from a visualisation and notation system which can clearly map developmental outcomes and trajectories, and convey hypothesised dynamic causal pathways. Such a system may help understanding of existing accounts and be a tool for developing new theories. We first present criteria that need to be met in order to provide fully nuanced visualisations of development, and discuss strengths and weaknesses of the visualisation system proposed by Morton. Secondly, we present a tool we have designed to give more precise accounts of development while also being accessible, intuitive and visually appealing. We have called this an Accessible Cause-Outcome Representation and Notation System (ACORNS). This system provides a framework for clear mapping and modelling of developmental sequences, illustrating more precisely how functions change over time, how factors interact with the environment, and the absolute and relative nature of causal outcomes. We provide a new template, a set of rules for the appropriate use of boxes and arrows, and a set of visually accessible indicators that can be used to show more precisely relative rates, degrees and variance of functioning over different capacities at different time points. We have designed ACORNS to give a precise and clear visualisation of how development unfolds; allowing the representation of less 'static' and more transactional models of developmental difficulties. We hope ACORNS will help students, clinicians and theoreticians across disciplines to better represent nuances of debates, and be a seed for the development of new theory. © 2011 The Authors. Journal of Intellectual Disability Research © 2011 Blackwell Publishing Ltd.

  2. The development of environmental modeling tools in Brazil for emergency preparedness

    International Nuclear Information System (INIS)

    Rochedo, Elaine R.R.; Vetere, Maria I.C.; Conti, Luiz F.C.; Wasserman, Maria A.V.; Salinas, Isabel C.P.; Pereira, Jose F.; Silva, Diogo N.G.; Vinhas, Denise M

    2008-01-01

    Since the Goiania accident in 1987, the IRD has been developing tools to support decision-making processes after accidents involving radiological public exposure. The Environmental Modelling Project began with the development of the code CORAL, based on the German dynamic model ECOSYS developed by the GSF, with the purpose of assessing the consequences of an accidental contamination of rural areas. Then, in cooperation with the GSF, the IRD developed the model PARATI, based on information from the Chernobyl and Goiania accidents, for the assessment of the exposure of the public due to contamination with Cs-137 in urban areas. This model includes the ability to simulate the implementation of countermeasures and their effectiveness in reducing doses to the public. Subsequently, the SIEM - Integrated Emergency System was developed to include CORAL and PARATI, as well as some generic models developed by the IAEA, for short-term dose estimates and to support protective strategies during the emergency phase of an accident. SIEM also incorporated standardized data on the physical behavior of radionuclides and dose conversion factors. Several improvements have been made in order to better adapt the model to Brazilian social, political, economic and climatic characteristics. Currently a multi-criteria strategy to support decision-making processes after the occurrence of an event of environmental contamination is under development. That work includes the development of a database on countermeasures and a computer model to perform the multi-criteria simulation. At all stages of the work, the pertinent weather and seasonal aspects are considered, in order to obtain a guide to protective actions accounting for social, economic and climatic characteristics, to be used in multi-criteria optimization processes suited to tropical climate areas. (author)

  3. Game development tool essentials

    CERN Document Server

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  4. Open Tools for Integrated Modelling to Understand SDG development - The OPTIMUS program

    Science.gov (United States)

    Howells, Mark; Zepeda, Eduardo; Rogner, H. Holger; Sanchez, Marco; Roehrl, Alexander; Cicowiez, Matrin; Mentis, Dimitris; Korkevelos, Alexandros; Taliotis, Constantinos; Broad, Oliver; Alfstad, Thomas

    2016-04-01

    The recently adopted Sustainable Development Goals (SDGs) - a set of 17 measurable and time-bound goals with 169 associated targets for 2030 - are highly inclusive challenges before the world community, ranging from eliminating poverty to human rights, inequality, a secure world and protection of the environment. Each individual goal or target by itself presents an enormous task; taken together they are overwhelming. There are strong and weak interlinkages, and hence trade-offs and complementarities, among goals and targets. Some targets may affect several goals while other goals and targets may conflict or be mutually exclusive (Ref). Meeting each of these requires the judicious exploitation of resources, with energy playing an important role. Such complexity demands to be addressed in an integrated way, using systems analysis tools to support informed policy formulation, planning, allocation of scarce resources, monitoring of progress, effectiveness and review at different scales. There is no one-size-fits-all methodology that could conceivably include all goals and targets simultaneously. But there are methodologies encapsulating critical subsets of the goals and targets with strong interlinkages, with a 'soft' reflection of the weak interlinkages. Universal food security or sustainable energy for all inherently supports goals and targets on human rights and equality, but possibly at the cost of biodiversity or desertification. Integrated analysis and planning tools are not yet commonplace at national universities - or indeed in many policy making organs. What is needed is a fundamental realignment of institutions and integration of their planning processes and decision making. We introduce a series of open source tools to support the SDG planning and implementation process. The Global User-friendly CLEW Open Source (GLUCOSE) tool optimizes resource interactions and constraints; the Global Electrification Tool kit (GETit) provides the first global spatially explicit

  5. Nongeneric tool support for model-driven product development; Werkzeugunterstuetzung fuer die modellbasierte Produktentwicklung. Maschinenlesbare Spezifikationen selbst erstellen

    Energy Technology Data Exchange (ETDEWEB)

    Bock, C. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Zuehlke, D. [Technische Univ. Kaiserslautern (Germany). Lehrstuhl fuer Produktionsautomatisierung; Deutsches Forschungszentrum fuer Kuenstliche Intelligenz (DFKI), Kaiserslautern (DE). Zentrum fuer Mensch-Maschine-Interaktion (ZMMI)

    2006-07-15

    A well-defined specification process is a central success factor in human-machine-interface development. Consequently, in interdisciplinary development teams specification documents are an important communication instrument. In order to replace today's typically paper-based specifications and to leverage the benefits of their electronic equivalents, developers demand comprehensive and applicable computer-based tool kits. Manufacturers' increasing awareness of appropriate tool support is causing alternative approaches to tool kit creation to emerge. This article therefore introduces meta-modelling as a promising approach to creating nongeneric tool support with justifiable effort. This enables manufacturers to take advantage of electronic specifications in product development processes.

  6. Cost Benefit Analysis Modeling Tool for Electric vs. ICE Airport Ground Support Equipment – Development and Results

    Energy Technology Data Exchange (ETDEWEB)

    James Francfort; Kevin Morrow; Dimitri Hochard

    2007-02-01

    This report documents efforts to develop a computer tool for modeling the economic payback of airport ground support equipment (GSE) propelled either by electric motors or by gasoline and diesel engines. The types of GSE modeled are pushback tractors, baggage tractors, and belt loaders. The GSE modeling tool includes an emissions module that estimates the amount of tailpipe emissions saved by replacing internal combustion engine GSE with electric GSE. This report contains modeling assumptions, methodology, a user's manual, and modeling results. The model was developed based on the operations of two airlines at four United States airports.
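
    A minimal sketch of the economic core of such a tool - the simple payback of an electric unit's purchase premium against annual fuel and maintenance savings - is shown below; all figures are invented placeholders, not values from the report's model.

      # Sketch: simple payback for electric vs. internal-combustion ground support
      # equipment (GSE). All inputs are illustrative placeholders.
      def simple_payback(capital_premium, annual_fuel_saving, annual_maint_saving):
          """Years to recover the extra purchase cost of the electric unit."""
          annual_saving = annual_fuel_saving + annual_maint_saving
          return capital_premium / annual_saving if annual_saving > 0 else float("inf")

      # Hypothetical baggage tractor: electric costs $25,000 more up front,
      # saves $6,000/yr in fuel vs. diesel and $2,500/yr in maintenance.
      years = simple_payback(capital_premium=25_000,
                             annual_fuel_saving=6_000,
                             annual_maint_saving=2_500)
      print(f"simple payback: {years:.1f} years")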

  7. Development of a robust modeling tool for radiation-induced segregation in austenitic stainless steels

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Ying [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Field, Kevin G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Allen, Todd R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Busby, Jeremy T [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-09-01

    Irradiation-assisted stress corrosion cracking (IASCC) of austenitic stainless steels in Light Water Reactor (LWR) components has been linked to changes in grain boundary composition due to radiation-induced segregation (RIS). This work developed a robust RIS modeling tool to account for the thermodynamics and kinetics of atom and defect transport under combined thermal and radiation conditions. The diffusion flux equations were based on the Perks model formulated through the linear theory of the thermodynamics of irreversible processes. Both cross and non-cross phenomenological diffusion coefficients in the flux equations were considered and correlated to tracer diffusion coefficients through Manning's relation. The preferential atom-vacancy coupling was described by the mobility model, whereas the preferential atom-interstitial coupling was described by the interstitial binding model. The composition dependence of the thermodynamic factor was modeled using the CALPHAD approach. Detailed analysis of the diffusion fluxes near and at grain boundaries of irradiated austenitic stainless steels suggested that the dominant diffusion mechanism for chromium and iron is via vacancies, while that for nickel can swing from the vacancy-dominant to the interstitial-dominant mechanism. The diffusion flux in the vicinity of a grain boundary was found to be greatly influenced by the composition gradient formed during the transient state, leading to the oscillatory behavior of alloy compositions in this region. This work confirms that both vacancy and interstitial diffusion, and segregation itself, have important roles in determining the microchemistry of Fe, Cr, and Ni at irradiated grain boundaries in austenitic stainless steels.

  8. Modeling decision making as a support tool for policy making on renewable energy development

    International Nuclear Information System (INIS)

    Cannemi, Marco; García-Melón, Mónica; Aragonés-Beltrán, Pablo; Gómez-Navarro, Tomás

    2014-01-01

    This paper presents the findings of a study on decision making models for the analysis of capital-risk investors' preferences on biomass power plant projects. The aim of the work is to improve the support tools for policy makers in the field of renewable energy development. The Analytic Network Process (ANP) helps to better understand capital-risk investors' preferences towards different kinds of biomass-fueled power plants. The results of the research allow public administration to better foresee the investors' reaction to the incentive system, or to modify the incentive system to better drive investors' decisions. Changing the incentive system is seen as a major risk by investors. Therefore, public administration must design better and longer-term incentive systems, forecasting market reactions. For that, two scenarios have been designed, one showing a typical decision making process and another proposing an improved decision making scenario. A case study conducted in Italy has revealed that ANP allows understanding of how capital-risk investors interpret the situation and make decisions when investing in biomass power plants; the differences between the interests of public administration and promoters; how decision making could be influenced by adding new decision criteria; and which case would be ranked best according to the decision models. - Highlights: • We applied ANP to investors' preferences on biomass power plant projects. • The aim is to improve the advising tools for renewable energy policy making. • A case study has been carried out with the help of two experts. • We designed two scenarios: decision making as it is and how it could be improved. • Results prove ANP is a fruitful tool enhancing participation and transparency

  9. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    Science.gov (United States)

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, current use is limited, and answers to questions such as what methods to use and when remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and the four input resources required (time, money, knowledge and data). The characterisation is presented in matrix form to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce, let alone money and time. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular, it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to obtain from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on method comparison and selection.
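
    The comparison matrices described above map naturally onto a small data structure; the hedged sketch below characterizes a few methods by output type and input resources and filters them against a user's constraints. The attribute values are invented for illustration, not the paper's actual codings.

      # Sketch: filtering modelling/simulation methods by required outputs and
      # available resources. Method attributes are invented placeholders.
      methods = {
          "discrete-event simulation": {"output": "quantitative", "data": "high",   "time": "high"},
          "system dynamics":           {"output": "quantitative", "data": "medium", "time": "medium"},
          "soft systems methodology":  {"output": "qualitative",  "data": "low",    "time": "medium"},
          "process mapping":           {"output": "qualitative",  "data": "low",    "time": "low"},
      }

      def shortlist(methods, wanted_output, max_data, max_time, order=("low", "medium", "high")):
          rank = {level: i for i, level in enumerate(order)}
          return [name for name, m in methods.items()
                  if m["output"] == wanted_output
                  and rank[m["data"]] <= rank[max_data]
                  and rank[m["time"]] <= rank[max_time]]

      print(shortlist(methods, wanted_output="qualitative", max_data="low", max_time="medium"))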

  10. Development of a Monte Carlo multiple source model for inclusion in a dose calculation auditing tool.

    Science.gov (United States)

    Faught, Austin M; Davidson, Scott E; Fontenot, Jonas; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core Houston (IROC-H) (formerly the Radiological Physics Center) has reported varying levels of agreement in their anthropomorphic phantom audits. There is reason to believe one source of error in this observed disagreement is the accuracy of the dose calculation algorithms and heterogeneity corrections used. To audit this component of the radiotherapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Elekta 6 MV and 10 MV therapeutic x-ray beams were commissioned based on measurement of central axis depth dose data for a 10 × 10 cm² field size and dose profiles for a 40 × 40 cm² field size. The models were validated against open field measurements consisting of depth dose data and dose profiles for field sizes ranging from 3 × 3 cm² to 30 × 30 cm². The models were then benchmarked against measurements in IROC-H's anthropomorphic head and neck and lung phantoms. Validation results showed 97.9% and 96.8% of depth dose data passed a ±2% Van Dyk criterion for 6 MV and 10 MV models respectively. Dose profile comparisons showed an average agreement using a ±2%/2 mm criterion of 98.0% and 99.0% for 6 MV and 10 MV models respectively. Phantom plan comparisons were evaluated using a ±3%/2 mm gamma criterion, and averaged passing rates between Monte Carlo and measurements were 87.4% and 89.9% for 6 MV and 10 MV models respectively. Accurate multiple source models for Elekta 6 MV and 10 MV x-ray beams have been developed for inclusion in an independent dose calculation tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
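
    A hedged sketch of the simplest of the reported checks - the fraction of depth-dose points agreeing with measurement within a ±2% (local) criterion - is given below with synthetic curves; the ±2%/2 mm and ±3%/2 mm comparisons used for profiles and phantom plans require a full gamma-index implementation that is not shown.

      # Sketch: fraction of percent-depth-dose points within a +/-2% criterion.
      # Synthetic curves; not IROC-H measurement data.
      import numpy as np

      depth = np.linspace(0.5, 30.0, 60)                          # cm
      measured  = 100.0 * np.exp(-0.045 * (depth - 1.5).clip(0))  # crude PDD shape
      simulated = measured * (1.0 + 0.01 * np.sin(depth))         # model with ~1% wiggle

      percent_diff = 100.0 * (simulated - measured) / measured    # local percent difference
      pass_rate = np.mean(np.abs(percent_diff) <= 2.0)
      print(f"points within +/-2%: {pass_rate:.1%}")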

  11. Development of a Quick Look Pandemic Influenza Modeling and Visualization Tool

    Energy Technology Data Exchange (ETDEWEB)

    Brigantic, Robert T.; Ebert, David S.; Corley, Courtney D.; Maciejewski, Ross; Muller, George; Taylor, Aimee E.

    2010-05-30

    Federal, State, and local decision makers and public health officials must prepare and exercise complex plans to contend with a variety of possible mass casualty events, such as pandemic influenza. Quick look tools (QLTs) focused on mass casualty events allow such planning to be done with greater accuracy and realism by combining interactive simulation and visualization. If an event happens, the QLTs can then be employed to rapidly assess and execute alternative mitigation strategies, and thereby minimize casualties. This can be achieved by conducting numerous 'what-if' assessments prior to any event in order to assess potential health impacts (e.g., number of sick individuals), required community resources (e.g., vaccinations and hospital beds), and optimal mitigative decision strategies (e.g., school closures) during the course of a pandemic. In this presentation, we overview and demonstrate a pandemic influenza QLT; discuss some of the modeling methods, constructs, visual analytic components, and interface; and outline additional development concepts. These include the incorporation of a user-selectable infectious disease palette, simultaneous visualization of decision alternatives, additional resource elements associated with emergency response (e.g., first responders and medical professionals), and provisions for other potential disaster events.

  12. Digital Aquifer - Integrating modeling, technical, software and policy aspects to develop a groundwater management tool

    Science.gov (United States)

    Tirupathi, S.; McKenna, S. A.; Fleming, K.; Wambua, M.; Waweru, P.; Ondula, E.

    2016-12-01

    Groundwater management has traditionally been viewed as a matter of long-term policy measures to ensure that the water resource is sustainable. IBM Research, in association with the World Bank, extended this traditional analysis to include real-time groundwater management by building a context-aware water rights management and permitting system. As part of this effort, one of the primary objectives was to develop a groundwater flow model that can give policy makers a visual overview of the current groundwater distribution. In addition, the system helps policy makers simulate a range of scenarios and check the sustainability of the groundwater resource in a given region. The system also enables a license provider to check the effect of the introduction of a new well on the existing wells in the domain as well as on the groundwater resource in general. This process simplifies how an engineer determines whether a new well should be approved. Distances to the nearest neighbouring wells and the maximum decreases in water levels of nearby wells are continually assessed and presented as evidence for an engineer to make the final judgment on approving the permit. The system also provides updated insights on the amount of groundwater left in an area and advice on how water fees should be structured to balance conservation and economic development goals. In this talk, we will discuss the concept of the Digital Aquifer and the challenges in integrating modeling, technical and software aspects to develop a management system that provides policy makers and license providers with a robust decision making tool. We will concentrate on the groundwater model, developed using the analytic element method, which plays a very important role in the decision making aspects. Finally, the efficiency of this system and methodology is shown through a case study in Laguna Province, Philippines, which was done in collaboration with the National Water Resource Board, Philippines and World
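
    As a rough illustration of the "effect of a new well on existing wells" check, the sketch below estimates the drawdown a proposed well would impose at its neighbours using the Theis solution; this is a simplified stand-in chosen for brevity, not the analytic element model the system actually uses, and all parameters are invented.

      # Sketch: Theis drawdown at neighbouring wells caused by a proposed new well.
      # Simplified stand-in for the analytic element model; parameters are invented.
      import numpy as np
      from scipy.special import exp1   # exponential integral E1(u) = Theis well function W(u)

      def theis_drawdown(Q, T, S, r, t):
          """Drawdown (m) at radius r (m) after pumping time t (s).
          Q: pumping rate (m^3/s), T: transmissivity (m^2/s), S: storativity (-)."""
          u = r**2 * S / (4.0 * T * t)
          return Q / (4.0 * np.pi * T) * exp1(u)

      neighbours = {"well_A": 250.0, "well_B": 600.0, "well_C": 1500.0}   # distances in m
      for name, r in neighbours.items():
          s = theis_drawdown(Q=0.01, T=5e-3, S=2e-4, r=r, t=180 * 86400.0)  # 180 days of pumping
          print(f"{name}: predicted drawdown ~ {s:.2f} m")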

  13. Regulatory odour model development: Survey of modelling tools and datasets with focus on building effects

    DEFF Research Database (Denmark)

    Olesen, H. R.; Løfstrøm, P.; Berkowicz, R.

    A project within the framework of a larger research programme, Action Plan for the Aquatic Environment III (VMP III), aims towards improving an atmospheric dispersion model (OML). The OML model is used for regulatory applications in Denmark, and it is the candidate model to be used also in the future in relation to odour problems due to animal farming. However, the model needs certain improvements and validation in order to be fully suited for that purpose. The report represents a survey of existing literature, models and data sets. It includes a brief overview of the state-of-the-art of atmospheric dispersion models for estimating local concentration levels in general. However, the report focuses on some particular issues, which are relevant for subsequent work on odour due to animal production. An issue of primary concern is the effect that buildings (stables) have on flow and dispersion. The handling...

  14. Green Infrastructure Modeling Tools

    Science.gov (United States)

    Modeling tools support planning and design decisions on a range of scales from setting a green infrastructure target for an entire watershed to designing a green infrastructure practice for a particular site.

  15. Population Density Modeling Tool

    Science.gov (United States)

    2012-06-26

    Population Density Modeling Tool, by Davy Andrew Michael Knott and David Burke, report NAWCADPAX/TR-2012/194, Maryland, 26 June 2012. (Only report cover and front-matter text is available for this record.)

  16. Sobol Sensitivity Analysis: A Tool to Guide the Development and Evaluation of Systems Pharmacology Models

    Science.gov (United States)

    Trame, MN; Lesko, LJ

    2015-01-01

    A systems pharmacology model typically integrates pharmacokinetic, biochemical network, and systems biology concepts into a unifying approach. It typically consists of a large number of parameters and reaction species that are interlinked based upon the underlying (patho)physiology and the mechanism of drug action. The more complex these models are, the greater the challenge of reliably identifying and estimating respective model parameters. Global sensitivity analysis provides an innovative tool that can meet this challenge. CPT Pharmacometrics Syst. Pharmacol. (2015) 4, 69–79; doi:10.1002/psp4.6; published online 25 February 2015 PMID:27548289
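
    A minimal sketch of a global Sobol analysis on a toy two-parameter model follows; it uses the SALib package, which is an assumption of this illustration and is not prescribed by the paper.

      # Sketch: first-order and total Sobol indices for a toy PK-like model.
      # SALib is assumed here; the paper does not prescribe a particular package.
      import numpy as np
      from SALib.sample import saltelli
      from SALib.analyze import sobol

      problem = {
          "num_vars": 2,
          "names": ["clearance", "volume"],
          "bounds": [[0.5, 5.0], [10.0, 100.0]],
      }

      def auc_toy(params):
          cl, v = params
          dose = 100.0
          return dose / cl + 0.01 * v        # toy output: AUC-like quantity

      X = saltelli.sample(problem, 1024)      # N*(2D+2) parameter sets
      Y = np.array([auc_toy(x) for x in X])
      Si = sobol.analyze(problem, Y)
      print("first-order:", dict(zip(problem["names"], Si["S1"].round(2))))
      print("total-order:", dict(zip(problem["names"], Si["ST"].round(2))))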

  17. Applying a Knowledge Management Modeling Tool for Manufacturing Vision (MV) Development

    DEFF Research Database (Denmark)

    Wang, Chengbo; Luxhøj, James T.; Johansen, John

    2004-01-01

    This paper introduces an empirical application of an experimental model for knowledge management within an organization, namely a case-based reasoning model for manufacturing vision development (CBRM). The model integrates the development process of manufacturing vision with the methodology of case-based reasoning. Finds that the CBRM is supportive to the decision-making process of applying and augmenting organizational knowledge. It provides a new angle to tackle strategic management issues within the manufacturing system of a business operation. Explores a new proposition within strategic manufacturing management by enriching and extending the concept of MV while trying to lead the CBR methodology into a new domain by applying it in strategic management.

  18. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in house. The tools covered in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP is discussed, along with the results of the testing executed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continuing development of the tools and process.

  19. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Numerous design decisions are made while developing software systems, which influence the architecture of these systems as well as following decisions. A number of decision management tools already exist for capturing, documenting, and maintaining design decisions, but also for guiding developers through the development process. In this report, we propose an integration of a decision management and a UML-based modeling tool, based on use cases we distill from a case study: the modeling tool shall show all decisions related to a model and allow its users to extend or update them; the decision management tool shall trigger the modeling tool to realize design decisions in the models. We define tool-independent concepts and architecture building blocks supporting these use cases and present how they can be implemented in the IBM Rational Software Modeler and Architectural Decision Knowledge Wiki. This seamless...

  20. Applying a Knowledge Management Modeling Tool for Manufacturing Vision (MV) Development

    DEFF Research Database (Denmark)

    Wang, Chengbo; Luxhøj, James T.; Johansen, John

    2004-01-01

    This paper introduces an empirical application of an experimental model for knowledge management within an organization, namely a case-based reasoning model for manufacturing vision development (CBRM). The model integrates the development process of manufacturing vision with the methodology of case-based reasoning. Finds that the CBRM is supportive to the decision-making process of applying and augmenting organizational knowledge. It provides a new angle to tackle strategic management issues within the manufacturing system of a business operation. Explores a new proposition within strategic manufacturing management by enriching and extending the concept of MV while trying to lead the CBR methodology into a new domain by applying it in strategic management.

  1. An empirical modeling tool and glass property database in development of US-DOE radioactive waste glasses

    International Nuclear Information System (INIS)

    Muller, I.; Gan, H.

    1997-01-01

    An integrated glass database has been developed at the Vitreous State Laboratory of Catholic University of America. The major objective of this tool was to support glass formulation using the MAWS approach (Minimum Additives Waste Stabilization). An empirical modeling capability, based on the properties of over 1000 glasses in the database, was also developed to help formulate glasses from waste streams under multiple user-imposed constraints. The use of this modeling capability, the performance of resulting models in predicting properties of waste glasses, and the correlation of simple structural theories to glass properties are the subjects of this paper. (authors)

  2. Development and application of a Japanese model of the WHO fracture risk assessment tool (FRAX).

    Science.gov (United States)

    Fujiwara, S; Nakamura, T; Orimo, H; Hosoi, T; Gorai, I; Oden, A; Johansson, H; Kanis, J A

    2008-04-01

    The present study estimated the 10-year probability using the Japanese version of the WHO fracture risk assessment tool (FRAX) in order to determine fracture probabilities that correspond to intervention thresholds currently used in Japan and to resolve some issues for its use in Japan. The objective of the present study was to evaluate a Japanese version of the WHO fracture risk assessment (FRAX) tool to compute 10-year probabilities of osteoporotic fracture in Japanese men and women. Since lumbar spine bone mineral density (BMD) is used preferentially as a site for assessment, and densitometers use Japanese reference data, a second aim was to investigate the suitability and impact of this practice in Japan. Fracture probabilities were computed from published data on the fracture and death hazards in Japan. Probabilities took account of age, sex, the presence of clinical risk factors and femoral neck BMD. Fracture probabilities were determined that were equivalent to intervention thresholds currently used in Japan. The difference between T-scores derived from international reference data and those using Japanese-specific normal ranges was estimated from published sources. The gradient of risk of BMD for fracture in Japan was compared to that for BMD at the lumbar spine in the Hiroshima cohort. The 10-year probabilities of a major osteoporosis-related fracture that corresponded to current intervention thresholds ranged from approximately 5% at the age of 50 years to more than 20% at the age of 80 years. The use of femoral neck BMD predicts fracture as well as or better than BMD tests at the lumbar spine. There were small differences in T-scores between those used for the model and those derived from a Japanese reference population. The FRAX tool has been used to determine possible thresholds for therapeutic intervention, based on equivalence of risk with current guidelines. The approach will need to be supported by appropriate health economic analyses. Femoral neck
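
    For context, 10-year probabilities in this family of models are typically obtained by integrating the fracture hazard against survival from both fracture and death; a hedged sketch of that relation (not the exact FRAX implementation, which is not published in closed form) is:

      P_{10} = \int_{0}^{10} h_f(t)\,\exp\!\Big(-\int_{0}^{t}\big(h_f(\tau)+h_d(\tau)\big)\,d\tau\Big)\,dt

    where h_f is the hazard of a major osteoporotic fracture and h_d the death hazard, both depending on age, sex, femoral neck BMD, and the clinical risk factors.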

  3. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    Science.gov (United States)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

    The increasing ship traffic and maritime transport of dangerous substances make it difficult to significantly reduce the environmental, economic and social risks posed by potential spills, even though security rules are becoming more restrictive (double-hull ships, etc.) and surveillance systems more developed (VTS, AIS). Spills remain a central problem: spill events happen continuously, most of them unknown to the general public because of their small-scale impact, while a much smaller number become genuine media phenomena in this information era, owing to their large dimensions, their environmental and socio-economic impacts on ecosystems and local communities, and the spectacular or shocking pictures they generate. The adverse consequences of this type of accident therefore increase the motivation to avoid them in the future, or to minimize their impacts, using not only surveillance and monitoring tools but also an improved capacity to predict the fate and behaviour of bodies, objects, or substances in the hours following an accident. Numerical models can now play a leading role in operational oceanography applied to safety and pollution response in the ocean because of their predictive potential. Search and rescue operations and risk analysis for oil, inert (ship debris or floating containers), and HNS (hazardous and noxious substances) spills are the main areas where models can be used. Model applications have been widely used in emergency or planning issues associated with pollution risks, and in contingency and mitigation measures. Before a spill, in the planning stage, modelling simulations are used in environmental impact studies or risk maps, using historical data, reference situations, and typical scenarios. After a spill, the use of fast and simple modelling applications allows responders to understand the fate and behaviour of the spilt
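
    The "fate and behaviour" prediction described here is typically built on Lagrangian particle transport. As a purely illustrative sketch, not the authors' operational system, the Python fragment below drifts a cloud of surface particles with a current, a windage fraction of the wind, and a random-walk diffusion term; all coefficients and forcing values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def drift(positions, current, wind, t_end=6 * 3600.0, dt=60.0,
          windage=0.03, diffusivity=10.0):
    """Advect surface particles with the current plus a windage fraction of the
    wind, and add a random walk representing horizontal turbulent diffusion.
    positions: (N, 2) array of x, y in metres; current and wind: (2,) in m/s."""
    for _ in range(int(t_end / dt)):
        advection = current + windage * wind                       # m/s
        spread = rng.normal(scale=np.sqrt(2.0 * diffusivity * dt),
                            size=positions.shape)                  # m
        positions = positions + advection * dt + spread
    return positions

# hypothetical forcing: a 0.3 m/s eastward current and an 8 m/s south-westerly wind
cloud = drift(np.zeros((500, 2)),
              current=np.array([0.3, 0.0]),
              wind=np.array([5.7, 5.7]))
print("centre of mass (m):", cloud.mean(axis=0))
print("spread (m):        ", cloud.std(axis=0))
```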

  4. The insect central complex as model for heterochronic brain development-background, concepts, and tools.

    Science.gov (United States)

    Koniszewski, Nikolaus Dieter Bernhard; Kollmann, Martin; Bigham, Mahdiyeh; Farnworth, Max; He, Bicheng; Büscher, Marita; Hütteroth, Wolf; Binzer, Marlene; Schachtner, Joachim; Bucher, Gregor

    2016-06-01

    The adult insect brain is composed of neuropils present in most taxa. However, the relative size, shape, and developmental timing differ between species. This diversity of adult insect brain morphology has been extensively described while the genetic mechanisms of brain development are studied predominantly in Drosophila melanogaster. However, it has remained enigmatic what cellular and genetic mechanisms underlie the evolution of neuropil diversity or heterochronic development. In this perspective paper, we propose a novel approach to study these questions. We suggest using genome editing to mark homologous neural cells in the fly D. melanogaster, the beetle Tribolium castaneum, and the Mediterranean field cricket Gryllus bimaculatus to investigate developmental differences leading to brain diversification. One interesting aspect is the heterochrony observed in central complex development. Ancestrally, the central complex is formed during embryogenesis (as in Gryllus) but in Drosophila, it arises during late larval and metamorphic stages. In Tribolium, it forms partially during embryogenesis. Finally, we present tools for brain research in Tribolium including 3D reconstruction and immunohistochemistry data of first instar brains and the generation of transgenic brain imaging lines. Further, we characterize reporter lines labeling the mushroom bodies and reflecting the expression of the neuroblast marker gene Tc-asense, respectively.

  5. Preclinical models of muscle spasticity: valuable tools in the development of novel treatment for neurological diseases and conditions.

    Science.gov (United States)

    Bespalov, Anton; Mus, Liudmila; Zvartau, Edwin

    2016-05-01

    Poor validity of preclinical animal models is one of the most commonly discussed explanations for the failures to develop novel drugs in general and in neuroscience in particular. However, there are several areas of neuroscience such as injury-induced spasticity where the etiological factor can be adequately recreated and models can focus on specific pathophysiological mechanisms that likely contribute to spasticity syndrome in humans (such as motoneuron hyperexcitability and spinal hyperreflexia). Methods used to study spasticity in preclinical models are expected to have a high translational value (e.g., electromyogram (EMG)-based electrophysiological tools) and can efficiently assist clinical development programs. However, validation of these models is not complete yet. First, true predictive validity of these models is not established as clinically efficacious drugs have been used to reverse validate preclinical models while newly discovered mechanisms effective in preclinical models are yet to be fully explored in humans (e.g., 5-HT2C receptor inverse agonists, fatty acid amide hydrolase inhibitors). Second, further efforts need to be invested into cross-laboratory validation of study protocols and tools, adherence to the highest quality standards (blinding, randomization, pre-specified study endpoints, etc.), and systematic efforts to replicate key sets of data. These appear to be readily achievable tasks that will enable development not only of symptomatic but also of disease-modifying therapy of spasticity, an area that currently seems not to be a focus of research efforts.

  6. DEVELOPMENT OF RIVER FLOOD ROUTING MODEL USING NON-LINEAR MUSKINGUM EQUATION AND EXCEL TOOL 'GANetXL'

    Directory of Open Access Journals (Sweden)

    Briti Sundar Sil

    2016-01-01

    Full Text Available Flood routing is of utmost importance to water resources engineers and hydrologists. The Muskingum model is one of the most popular methods for river flood routing, but it often requires substantial computational work. To solve for the routing parameters, most established methods require knowledge of different computer programs and sophisticated models. It is therefore beneficial to have a tool that is comfortable for users who know more about everyday decision-making problems than about developing computational models and programs. The use of Microsoft Excel and related tools such as Solver by practicing engineers for routine modeling tasks has become common over the last few decades. In the Excel environment, tools are based on a graphical user interface that is convenient for handling databases, modeling, data analysis and programming. GANetXL is an add-in for Microsoft Excel, a leading commercial spreadsheet application for Windows and Mac operating systems, that uses a genetic algorithm to solve a wide range of single- and multi-objective problems. In this study, the non-linear Muskingum routing parameters are solved using GANetXL. Model performance statistics are compared with earlier results and found satisfactory.
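
    For readers who want to see the parameter-estimation idea outside Excel, the sketch below fits the three nonlinear Muskingum parameters (K, x, m) by minimising the squared error between routed and observed outflow with SciPy's differential evolution, an evolutionary optimizer standing in for the genetic algorithm in GANetXL; the hydrograph values are hypothetical.

```python
import numpy as np
from scipy.optimize import differential_evolution

def route(params, inflow, dt=1.0):
    """Nonlinear Muskingum routing: storage S = K * [x*I + (1 - x)*O]**m."""
    K, x, m = params
    outflow = np.empty_like(inflow, dtype=float)
    outflow[0] = inflow[0]                        # assume an initial steady state
    storage = K * (x * inflow[0] + (1.0 - x) * outflow[0]) ** m
    for t in range(1, len(inflow)):
        # continuity dS/dt = I - O, then invert the storage relation for O_t
        storage = max(storage + dt * (inflow[t - 1] - outflow[t - 1]), 1e-9)
        outflow[t] = ((storage / K) ** (1.0 / m) - x * inflow[t]) / (1.0 - x)
    return outflow

def sse(params, inflow, observed):
    return float(np.sum((route(params, inflow) - observed) ** 2))

# hypothetical flood hydrograph (m^3/s); "observed" outflow generated with known parameters
inflow = np.array([22, 23, 35, 71, 103, 111, 109, 100, 86, 71, 59, 47, 39, 32, 28, 24], float)
observed = route((11.0, 0.25, 1.3), inflow)

result = differential_evolution(sse, bounds=[(1, 50), (0.05, 0.45), (1.0, 2.5)],
                                args=(inflow, observed), seed=1)
print(result.x)   # should recover approximately K=11, x=0.25, m=1.3
```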

  7. Existing air sparging model and literature review for the development of an air sparging optimization decision tool

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-08-01

    The objectives of this Report are two-fold: (1) to provide overviews of the state-of-the-art and state-of-the-practice with respect to air sparging technology, air sparging models and related or augmentation technologies (e.g., soil vapor extraction); and (2) to provide the basis for the development of the conceptual Decision Tool. The Project Team conducted an exhaustive review of available literature. The complete listing of the documents, numbering several hundred and reviewed as a part of this task, is included in Appendix A. Even with the large amount of material written regarding the development and application of air sparging, there still are significant gaps in the technical community's understanding of the remediation technology. The results of the literature review are provided in Section 2. In Section 3, an overview of seventeen conceptual, theoretical, mathematical and empirical models is presented. Detailed descriptions of each of the models reviewed are provided in Appendix B. Included in Appendix D is a copy of the questionnaire used to compile information about the models. The remaining sections of the document reflect the analysis and synthesis of the information gleaned during the literature and model reviews. The results of these efforts provide the basis for development of the decision tree and conceptual decision tool for determining applicability and optimization of air sparging. The preliminary decision tree and accompanying information provided in Section 6 describe a three-tiered approach for determining air sparging applicability: comparison with established scenarios; calculation of conceptual design parameters; and the conducting of pilot-scale studies to confirm applicability. The final two sections of this document provide listings of the key success factors which will be used for evaluating the utility of the Decision Tool and descriptions of potential applications for Decision Tool use.

  8. Development of Radiation Damage Models for Irradiated Silicon Sensors Using TCAD Tools

    CERN Document Server

    Bhardwaj, Ashutosh; Lalwani, Kavita; Ranjan, Kirti; Printz, Martin; Ranjeet, Ranjeet; Eber, Robert; Eichhorn, Thomas; Peltola, Timo Hannu Tapani

    2014-01-01

    During the high luminosity upgrade of the LHC (HL-LHC) the CMS tracking system will face a more intense radiation environment than the present system was designed for. In order to design radiation tolerant silicon sensors for the future CMS tracker upgrade it is fundamental to complement the measurements with device simulation. This will help in both the understanding of the device performance and in the optimization of the design parameters. One of the important ingredients of the device simulation is to develop a radiation damage model incorporating both bulk and surface damage. In this paper we will discuss the development of a radiation damage model by using commercial TCAD packages (Silvaco and Synopsys), which successfully reproduces recent measurements such as leakage current, depletion voltage, interstrip capacitance and interstrip resistance, and provides an insight into the performance of irradiated silicon strip sensors.
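
    The paper's TCAD damage model itself is not reproduced here, but the bulk-damage quantities it is validated against follow well-known parametrisations. The sketch below shows the standard linear scaling of leakage current with 1 MeV-neutron-equivalent fluence and the depletion voltage implied by an effective doping concentration; the constants are typical textbook values, not the fitted model of the paper.

```python
# Illustrative bulk-damage relations commonly quoted for irradiated silicon sensors.
ELEMENTARY_CHARGE = 1.602e-19      # C
EPSILON_SI = 1.054e-10             # F/m, permittivity of silicon

def leakage_current_increase(fluence_neq_cm2, volume_cm3, alpha=4e-17):
    """Delta I = alpha * Phi_eq * V (alpha in A/cm, fluence in n_eq/cm^2, V in cm^3)."""
    return alpha * fluence_neq_cm2 * volume_cm3        # amperes

def full_depletion_voltage(n_eff_cm3, thickness_um):
    """V_dep = q * |N_eff| * d^2 / (2 * eps) for a planar sensor."""
    d = thickness_um * 1e-6                            # m
    n_eff = abs(n_eff_cm3) * 1e6                       # m^-3
    return ELEMENTARY_CHARGE * n_eff * d ** 2 / (2.0 * EPSILON_SI)

# example: a 300 um thick, 1 cm^2 sensor after 1e15 n_eq/cm^2 (hypothetical values)
print(leakage_current_increase(1e15, 0.03))            # ~1.2e-3 A
print(full_depletion_voltage(5e12, 300))               # a few hundred volts
```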

  9. Developments of modeling tools for the ultrasonic propagation in bimetallic welds

    International Nuclear Information System (INIS)

    Gardahaut, A.

    2013-01-01

    This study fits into the field of ultrasonic non-destructive evaluation. It consists in the development of a dynamic ray tracing model to simulate the ultrasonic propagation in bimetallic welds. The approach has been organised in three steps. First of all, an image processing technique has been developed and applied on the macro-graphs of the weld in order to obtain a smooth cartography of the crystallographic orientation. These images are used as input data for a dynamic ray tracing model adapted to the study of anisotropic and inhomogeneous media such as bimetallic welds. Based on a kinematic and a dynamic ray tracing model, usually used in geophysics, it allows the evaluation of ray trajectories between a source point and an observation point, and the computation of the ultrasonic amplitude through the geometrical spreading of an elementary ray tube. This model has been validated in 2D by comparison of the results with a hybrid semi-analytical/finite elements code, then in 3D thanks to experimental results made on the mock-ups of the studied bimetallic welds. (author) [fr

  10. Analysis, Design, Implementation and Evaluation of Graphical Design Tool to Develop Discrete Event Simulation Models Using Event Graphs and Simkit

    National Research Council Canada - National Science Library

    San

    2001-01-01

    ... (OR) modeling and analysis. However, designing and implementing DES can be a time-consuming and error-prone task. This thesis designed, implemented and evaluated a tool, the Event Graph Graphical Design Tool (EGGDT...

  11. Applying mathematical tools to accelerate vaccine development: modeling Shigella immune dynamics.

    Directory of Open Access Journals (Sweden)

    Courtney L Davis

    Full Text Available We establish a mathematical framework for studying immune interactions with Shigella, a bacterium that kills over one million people worldwide every year. The long-term goal of this novel approach is to inform Shigella vaccine design by elucidating which immune components and bacterial targets are crucial for establishing Shigella immunity. Our delay differential equation model focuses on antibody and B cell responses directed against antigens like lipopolysaccharide in Shigella's outer membrane. We find that antibody-based vaccines targeting only surface antigens cannot elicit sufficient immunity for protection. Additional boosting prior to infection would require a four-orders-of-magnitude increase in antibodies to sufficiently prevent epithelial invasion. However, boosting anti-LPS B memory can confer protection, which suggests these cells may correlate with immunity. We see that IgA antibodies are slightly more effective per molecule than IgG, but more total IgA is required due to spatial functionality. An extension of the model reveals that targeting both LPS and epithelial entry proteins is a promising avenue to advance vaccine development. This paper underscores the importance of multifaceted immune targeting in creating an effective Shigella vaccine. It introduces mathematical models to the Shigella vaccine development effort and lays a foundation for joint theoretical/experimental/clinical approaches to Shigella vaccine design.

  12. High-throughput micro-scale cultivations and chromatography modeling: Powerful tools for integrated process development.

    Science.gov (United States)

    Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen

    2015-10-01

    Upstream processes are rather complex to design and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimized system must not necessarily be the one with highest product titers, but the one resulting in an overall superior process performance in up- and downstream. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium) which can be fused to any target protein allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using a chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capturing were validated subsequently. The overall best system was chosen based on a combination of excellent up- and downstream performance. © 2015 Wiley Periodicals, Inc.

  13. Mathematical modelling of active safety system functions as tools for development of driverless vehicles

    Science.gov (United States)

    Ryazantsev, V.; Mezentsev, N.; Zakharov, A.

    2018-02-01

    This paper is dedicated to a solution of the issue of synthesis of the vehicle longitudinal dynamics control functions (acceleration and deceleration control) based on the element base of the vehicle active safety system (ESP) - driverless vehicle development tool. This strategy helps to reduce time and complexity of integration of autonomous motion control systems (AMCS) into the vehicle architecture and allows direct control of actuators ensuring the longitudinal dynamics control, as well as reduction of time for calibration works. The “vehicle+wheel+road” longitudinal dynamics control is complicated due to the absence of the required prior information about the control object. Therefore, the control loop becomes an adaptive system, i.e. a self-adjusting monitoring system. Another difficulty is the driver’s perception of the longitudinal dynamics control process in terms of comfort. Traditionally, one doesn’t pay a lot of attention to this issue within active safety systems, and retention of vehicle steerability, controllability and stability in emergency situations are considered to be the quality criteria. This is mainly connected to its operational limits, since it is activated only in critical situations. However, implementation of the longitudinal dynamics control in the AMCS poses another challenge for the developers - providing the driver with comfortable vehicle movement during acceleration and deceleration - while the possible highest safety level in terms of the road grip is provided by the active safety system (ESP). The results of this research are: universal active safety system - AMCS interaction interface; block diagram for the vehicle longitudinal acceleration and deceleration control as one of the active safety system’s integrated functions; ideology of adaptive longitudinal dynamics control, which enables to realize the deceleration and acceleration requested by the AMCS; algorithms synthesised; analytical experiments proving the

  14. Gap Models as Tools for Sustainable Development under Environmental Changes in Northern Eurasia

    Science.gov (United States)

    Shugart, H. H., Jr.; Wang, B.; Brazhnik, K.; Armstrong, A. H.; Foster, A.

    2017-12-01

    Agent-based models of complex systems or as used in this review, Individual-based Models (IBMs), emerged in the 1960s and early 1970s, across diverse disciplines from astronomy to zoology. IBMs arose from a deeply embedded ecological tradition of understanding the dynamics of ecosystems from a "bottom-up" accounting of the interactions of the parts. In this case, individual trees are principal among the parts. Because they are computationally demanding, these models have prospered as the power of digital computers has increased exponentially over the decades following the 1970s. Forest IBMs are no longer computationally bound from developing continental- or global-scale simulations of responses of forests to climate and other changes. Gap models simulate the changes in forests by simulating the birth, growth and death of each individual tree on small plots of land that in summation comprise a forest (or set of sample plots on a forested landscape or region). Currently, gap models have grown from continental-scale and even global-scale applications to assess the potential consequences of climate change on natural forests. These predictions are valuable in the planning and anticipatory decision-making needed to sustainably manage a vast region such as Northern Eurasia. Modifications to the models have enabled simulation of disturbances including fire, insect outbreak and harvest. These disturbances have significant exogenous drivers, notably weather variables, but their effects are also a function of the endogenous conditions involving the structure of forest itself. This feedback between the forest and its environment can in some cases produce hysteresis and multiple-stable operating-regimes for forests. Such responses, often characterized as "tipping points" could play a significant role in increasing risk under environmental change, notably global warming. Such dynamics in a management context imply regional systems that could be "unforgiving" of management
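
    The individual-tree bookkeeping that defines a gap model can be illustrated in a few lines. The sketch below is a deliberately minimal single-species version (diameter growth slowing toward a maximum size, a constant annual mortality probability, and fixed recruitment); all parameter values are hypothetical and far simpler than those in operational gap models.

```python
import random

random.seed(1)

def gap_model_plot(years=200, d_max=80.0, growth=0.6,
                   annual_mortality=0.01, recruits_per_year=2):
    """Minimal individual-based 'gap model' bookkeeping on one plot:
    each tree is represented by its diameter (cm); grow, kill and recruit yearly."""
    trees = []                                                    # diameters of living trees
    for _ in range(years):
        # growth: slows as a tree approaches its maximum diameter
        trees = [d + growth * (1.0 - d / d_max) for d in trees]
        # mortality: small constant annual probability per tree
        trees = [d for d in trees if random.random() > annual_mortality]
        # birth: a few new saplings establish each year
        trees += [1.0] * recruits_per_year
    return trees

stand = gap_model_plot()
print(len(stand), "trees; mean diameter %.1f cm" % (sum(stand) / len(stand)))
```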

  15. Metrics Development for UML Tools evaluation

    OpenAIRE

    Dasso, Aristides; Funes, Ana; Peralta, Mario; Salgado, Carlos Humberto

    2005-01-01

    The Unified Modelling Language (UML) has become a defacto standard for software development practitioners. There are several tools that help the use of UML. Users of those tools must evaluate and compare different versions of the tools they intend to use or are using to assess the possibility of changing or acquiring one. There are several ways to perform this evaluation from the simple rule-of-thumb to numeric or quantitative methods. We present an ongoing project that evaluates UML tools us...

  16. Developing and testing a measurement tool for assessing predictors of breakfast consumption based on a health promotion model.

    Science.gov (United States)

    Dehdari, Tahereh; Rahimi, Tahereh; Aryaeian, Naheed; Gohari, Mahmood Reza; Esfeh, Jabiz Modaresi

    2014-01-01

    To develop an instrument for measuring Health Promotion Model constructs in terms of breakfast consumption, and to identify the constructs that were predictors of breakfast consumption among Iranian female students. A questionnaire on Health Promotion Model variables was developed and potential predictors of breakfast consumption were assessed using this tool. One hundred female students, mean age 13 years (SD ± 1.2 years). Two middle schools from moderate-income areas in Qom, Iran. Health Promotion Model variables were assessed using a 58-item questionnaire. Breakfast consumption was also measured. Internal consistency (Cronbach alpha), content validity index, content validity ratio, multiple linear regression using stepwise method, and Pearson correlation. Content validity index and content validity ratio scores of the developed scale items were 0.89 and 0.93, respectively. Internal consistencies (range, .74-.91) of subscales were acceptable. Prior related behaviors, perceived barriers, self-efficacy, and competing demand and preferences were 4 constructs that could predict 63% variance of breakfast frequency per week among subjects. The instrument developed in this study may be a useful tool for researchers to explore factors affecting breakfast consumption among students. Students with a high level of self-efficacy, more prior related behavior, fewer perceived barriers, and fewer competing demands were most likely to regularly consume breakfast. Copyright © 2014 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  17. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    Science.gov (United States)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models, particularly on high performance computers, and with the advent of ubiquitous multicore processor systems, practically on every system, has been accomplished with basic software tools, typically, command-line based compilers, debuggers, performance tools that have not changed substantially from the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as openMP and MPI) to be able to take full advantage of high performance computers with an increasing core count per shared memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC) seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project to improve Eclipse PTP takes an application-centric view to improve PTP. We are using a set of scientific applications, each with a variety of challenges, and using PTP to drive further improvements to both the scientific application, as well as to understand shortcomings in Eclipse PTP from an application developer perspective, to drive our list of improvements we seek to make. We are also partnering with performance tool providers, to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into

  18. [Mathematical models as tools for studying and developing strategies in the case of a pandemic influenza outbreak].

    Science.gov (United States)

    Huppert, Amit; Katriel, Haggai; Yaari, Rami; Barnea, Oren; Roll, Uri; Stern, Eli; Balicer, Ran; Stone, Lewi

    2010-01-01

    The current spread of swine flu H1N1 raises serious concerns for public health worldwide. Mathematical modelling has proved to be an essential tool for both developing strategies in preparation for an outbreak and for predicting and evaluating the effectiveness of control policies during an outbreak. Given its growing importance, this article outlines some of the fundamental contributions of mathematical modelling in the study of infectious diseases. The authors review the classical SIR model which has become central to epidemiology, demonstrating basic concepts such as outbreak threshold, the reproductive number Ro and herd immunity. The authors show how the model can be expanded to include different intervention and mitigation strategies, and discuss other biological and social complexities that may be introduced. Finally, the paper illustrates different scenarios for the spread of swine flu in Israel and provides estimates for Reproductive rate (Ro).
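
    As a companion to the review of the classical SIR model, the sketch below integrates the SIR equations numerically and reports R0, the final attack rate and the herd-immunity threshold; the transmission and recovery rates are hypothetical and are not the swine-flu estimates of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma):
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

beta, gamma = 0.4, 0.2        # hypothetical rates; R0 = beta / gamma = 2
sol = solve_ivp(sir, (0, 160), [0.999, 0.001, 0.0], args=(beta, gamma))

s_inf = sol.y[0, -1]
print("R0 =", beta / gamma)
print("final susceptible fraction %.2f, attack rate %.2f" % (s_inf, 1 - s_inf))
print("herd immunity threshold %.2f" % (1 - gamma / beta))
```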

  19. Development and application of MCNP auto-modeling tool: Mcam 3.0

    International Nuclear Information System (INIS)

    Liu Xiaoping; Luo Yuetong; Tong Lili

    2005-01-01

    Mcam is an abbreviation of 'MCNP Automatic Modeling', a CAD interface program for MCNP geometry models based on CAD technology. Making use of existing CAD technology is Mcam's major characteristic. Roughly speaking, CAD technology is utilized in two ways: (1) Mcam makes it possible to create MCNP geometry models in CAD software; (2) it accelerates the creation of MCNP geometry models by inheriting existing 3D CAD models. The paper introduces Mcam's major abilities: (1) converting CAD models into MCNP geometry models; (2) converting MCNP geometry models into CAD models; (3) constructing CAD models. At the end of the paper, several models are given to demonstrate each of Mcam's abilities

  20. Development of a CSP plant energy yield calculation tool applying predictive models to analyze plant performance sensitivities

    Science.gov (United States)

    Haack, Lukas; Peniche, Ricardo; Sommer, Lutz; Kather, Alfons

    2017-06-01

    At early project stages, the main CSP plant design parameters such as turbine capacity, solar field size, and thermal storage capacity are varied during the techno-economic optimization to determine most suitable plant configurations. In general, a typical meteorological year with at least hourly time resolution is used to analyze each plant configuration. Different software tools are available to simulate the annual energy yield. Software tools offering a thermodynamic modeling approach of the power block and the CSP thermal cycle, such as EBSILONProfessional®, allow a flexible definition of plant topologies. In EBSILON, the thermodynamic equilibrium for each time step is calculated iteratively (quasi steady state), which requires approximately 45 minutes to process one year with hourly time resolution. For better presentation of gradients, 10 min time resolution is recommended, which increases processing time by a factor of 5. Therefore, analyzing a large number of plant sensitivities, as required during the techno-economic optimization procedure, the detailed thermodynamic simulation approach becomes impracticable. Suntrace has developed an in-house CSP-Simulation tool (CSPsim), based on EBSILON and applying predictive models, to approximate the CSP plant performance for central receiver and parabolic trough technology. CSPsim significantly increases the speed of energy yield calculations by factor ≥ 35 and has automated the simulation run of all predefined design configurations in sequential order during the optimization procedure. To develop the predictive models, multiple linear regression techniques and Design of Experiment methods are applied. The annual energy yield and derived LCOE calculated by the predictive model deviates less than ±1.5 % from the thermodynamic simulation in EBSILON and effectively identifies the optimal range of main design parameters for further, more specific analysis.
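
    The core idea of CSPsim, replacing slow thermodynamic simulations with regression-based predictive models fitted to a designed set of runs, can be sketched with ordinary least squares. The design variables, sample yields and fitted coefficients below are hypothetical stand-ins, not values from the EBSILON study.

```python
import numpy as np

# hypothetical design-of-experiments results: each row is (solar multiple,
# storage hours, turbine MW); the response is annual yield (GWh) from detailed runs
X = np.array([[1.8, 6, 100], [2.0, 8, 100], [2.2, 10, 100],
              [1.8, 6, 120], [2.0, 8, 120], [2.2, 10, 120],
              [2.4, 12, 120], [2.4, 12, 100]], dtype=float)
y = np.array([410., 455., 500., 430., 480., 525., 560., 520.])

# ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(solar_multiple, storage_hours, turbine_mw):
    """Fast surrogate estimate of annual yield (GWh) for a candidate design."""
    return coef @ [1.0, solar_multiple, storage_hours, turbine_mw]

print(predict(2.1, 9, 110))
```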

  1. Developing a Framework to Link Catchment Modelling tools to Decision Support Systems for Catchment Management and Planning

    Science.gov (United States)

    Adams, Russell; Owen, Gareth

    2015-04-01

    Over the past few years a series of catchment monitoring studies in the UK have developed a wide range of tools to enable managers and planners to make informed decisions to target several key outcomes. These outcomes include the mitigation of diffuse pollution and the reduction of flood risk. Good progress has been made, but additional steps are still required to link together more detailed models that represent catchment processes with the decision support systems (often termed matrices, i.e. DSMs) which form the basis of these planning and management tools. Examples include: (i) the FARM tools developed by the PROACTIVE team at Newcastle University to assess different catchment management options for mitigating flooding events; (ii) TOPMANAGE, a suite of algorithms that link with high-resolution DEMs to identify surface flow pathways that have the potential to be mitigated by Natural Flood Management (NFM) features, in order to target diffuse pollution due to nutrients and sediments. To date, these DSMs have not been underpinned by models that can be run in real-time to quantify the benefits in terms of measurable reductions in flood or nutrient pollution risks. Their use has therefore been mostly as qualitative assessment tools. This study aims to adapt an existing spreadsheet-based model, the CRAFT, in order for it to become fully coupled to a DSM approach. Previous catchment scale applications of the CRAFT have focussed on meso-scale studies where any management interventions at a local scale are unlikely to be detectable at the monitoring point (the catchment outlet). The model has however been reasonably successful in identifying potential flow and transport pathways that link the headwater subcatchments to the outlet. Furthermore, recent enhancements to the model enable features such as sedimentation ponds and lagoons that can trap and remove nutrients and sediments to be added, once data become available from different types of NFM

  2. Android development tools for Eclipse

    CERN Document Server

    Shah, Sanjay

    2013-01-01

    A standard tutorial aimed at developing Android applications in a practical manner. Android Development Tools for Eclipse is aimed at beginners and existing developers who want to learn more about Android development. It is assumed that you have experience in Java programming and that you have used an IDE for development.

  3. Development of Modeling Methods and Tools for Predicting Coupled Reactive Transport Processes in Porous Media at Multiple Scales

    Energy Technology Data Exchange (ETDEWEB)

    Clement, T. Prabhakar [Auburn Univ., AL (United States); Barnett, Mark O. [Auburn Univ., AL (United States); Zheng, Chunmiao [Univ. of Alabama, Tuscaloosa, AL (United States); Jones, Norman L. [Brigham Young Univ., Provo, UT (United States)

    2010-05-05

    DE-FG02-06ER64213: Development of Modeling Methods and Tools for Predicting Coupled Reactive Transport Processes in Porous Media at Multiple Scales Investigators: T. Prabhakar Clement (PD/PI) and Mark O. Barnett (Auburn), Chunmiao Zheng (Univ. of Alabama), and Norman L. Jones (BYU). The objective of this project was to develop scalable modeling approaches for predicting the reactive transport of metal contaminants. We studied two contaminants, a radioactive cation [U(VI)] and a metal(loid) oxyanion system [As(III/V)], and investigated their interactions with two types of subsurface materials, iron and manganese oxyhydroxides. We also developed modeling methods for describing the experimental results. Overall, the project supported 25 researchers at three universities. Produced 15 journal articles, 3 book chapters, 6 PhD dissertations and 6 MS theses. Three key journal articles are: 1) Jeppu et al., A scalable surface complexation modeling framework for predicting arsenate adsorption on goethite-coated sands, Environ. Eng. Sci., 27(2): 147-158, 2010. 2) Loganathan et al., Scaling of adsorption reactions: U(VI) experiments and modeling, Applied Geochemistry, 24 (11), 2051-2060, 2009. 3) Phillippi, et al., Theoretical solid/solution ratio effects on adsorption and transport: uranium (VI) and carbonate, Soil Sci. Soci. of America, 71:329-335, 2007

  4. Development of a numerical modelling tool for combined near field and far field wave transformations using a coupling of potential flow solvers

    DEFF Research Database (Denmark)

    Verbrugghe, Tim; Troch, Peter; Kortenhaus, Andreas

    2016-01-01

    is complex; it is difficult to simulate both near field and far field effects with a single numerical model, with relatively fast computing times. Within this research a numerical tool is developed to model near-field and far-field wave transformations caused by WECs. The tool is based on the coupling...

  5. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  6. Multiple High-Fidelity Modeling Tools for Metal Additive Manufacturing Process Development, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Despite the rapid commercialization of additive manufacturing technology such as selective laser melting, SLM, there are gaps in process modeling and material...

  7. Development of a resource modelling tool to support decision makers in pandemic influenza preparedness: The AsiaFluCap Simulator

    Directory of Open Access Journals (Sweden)

    Stein Mart

    2012-10-01

    Full Text Available Background: Health care planning for pandemic influenza is a challenging task which requires predictive models by which the impact of different response strategies can be evaluated. However, current preparedness plans and simulation exercises, as well as freely available simulation models previously made for policy makers, do not explicitly address the availability of health care resources or determine the impact of shortages on public health. Nevertheless, the feasibility of health systems to implement response measures or interventions described in plans and trained in exercises depends on the available resource capacity. As part of the AsiaFluCap project, we developed a comprehensive and flexible resource modelling tool to support public health officials in understanding and preparing for surges in resource demand during future pandemics. Results: The AsiaFluCap Simulator is a combination of a resource model containing 28 health care resources and an epidemiological model. The tool was built in MS Excel© and contains a user-friendly interface which allows users to select mild or severe pandemic scenarios, change resource parameters and run simulations for one or multiple regions. Besides epidemiological estimations, the simulator provides indications on resource gaps or surpluses, and the impact of shortages on public health for each selected region. It allows for a comparative analysis of the effects of resource availability and consequences of different strategies of resource use, which can provide guidance on resource prioritising and/or mobilisation. Simulation results are displayed in various tables and graphs, and can also be easily exported to GIS software to create maps for geographical analysis of the distribution of resources. Conclusions: The AsiaFluCap Simulator is freely available software (http://www.cdprg.org) which can be used by policy makers, policy advisors, donors and other stakeholders involved in preparedness for
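
    The resource-gap bookkeeping the simulator performs can be illustrated with a toy example: given an epidemic curve from the epidemiological model, compare the peak weekly demand for each resource with regional capacity. The case numbers, per-case resource needs and capacities below are hypothetical, and the sketch ignores length of stay and other dynamics the real tool accounts for.

```python
# hypothetical weekly new hospitalisations from an epidemiological model
weekly_admissions = [5, 20, 80, 250, 600, 900, 700, 400, 150, 40]

# per-admission resource use and regional capacity (both hypothetical)
resources = {
    "hospital_beds":       {"per_case": 1.0,  "capacity": 500},
    "ventilators":         {"per_case": 0.05, "capacity": 30},
    "oseltamivir_courses": {"per_case": 1.0,  "capacity": 2000},
}

for name, r in resources.items():
    peak_demand = max(w * r["per_case"] for w in weekly_admissions)
    gap = peak_demand - r["capacity"]
    status = "shortage of %.0f" % gap if gap > 0 else "surplus of %.0f" % -gap
    print("%-20s peak demand %.0f vs capacity %d -> %s"
          % (name, peak_demand, r["capacity"], status))
```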

  8. Developing a Learning Analytics tool

    DEFF Research Database (Denmark)

    Wahl, Christian; Belle, Gianna; Clemmensen, Anita Lykke

    This poster describes how learning analytics and collective intelligence can be combined in order to develop a tool for providing support and feedback to learners and teachers regarding students' self-initiated learning activities.

  9. Developing a learning analytics tool

    DEFF Research Database (Denmark)

    Wahl, Christian; Belle, Gianna; Clemmensen, Anita Lykke

    This poster describes how learning analytics and collective intelligence can be combined in order to develop a tool for providing support and feedback to learners and teachers regarding students' self-initiated learning activities.

  10. Development of tiger habitat suitability model using geospatial tools-a case study in Achankmar Wildlife Sanctuary (AMWLS), Chhattisgarh India.

    Science.gov (United States)

    Singh, R; Joshi, P K; Kumar, M; Dash, P P; Joshi, B D

    2009-08-01

    Geospatial tools supported by ancillary geo-database and extensive fieldwork regarding the distribution of tiger and its prey in Anchankmar Wildlife Sanctuary (AMWLS) were used to build a tiger habitat suitability model. This consists of a quantitative geographical information system (GIS) based approach using field parameters and spatial thematic information. The estimates of tiger sightings, its prey sighting and predicted distribution with the assistance of contextual environmental data including terrain, road network, settlement and drainage surfaces were used to develop the model. Eight variables in the dataset viz., forest cover type, forest cover density, slope, aspect, altitude, and distance from road, settlement and drainage were seen as suitable proxies and were used as independent variables in the analysis. Principal component analysis and binomial multiple logistic regression were used for statistical treatments of collected habitat parameters from field and independent variables respectively. The assessment showed a strong expert agreement between the predicted and observed suitable areas. A combination of the generated information and published literature was also used while building a habitat suitability map for the tiger. The modeling approach has taken the habitat preference parameters of the tiger and potential distribution of prey species into account. For assessing the potential distribution of prey species, independent suitability models were developed and validated with the ground truth. It is envisaged that inclusion of the prey distribution probability strengthens the model when a key species is under question. The results of the analysis indicate that tiger occur throughout the sanctuary. The results have been found to be an important input as baseline information for population modeling and natural resource management in the wildlife sanctuary. The development and application of similar models can help in better management of the protected
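
    To illustrate the statistical core of the habitat model, the sketch below fits a binomial logistic regression of presence/absence on a few terrain and distance covariates and converts the predicted probability into a suitability score for a candidate cell; the covariates and records are synthetic stand-ins, not the AMWLS field data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)

# synthetic stand-ins for field data: elevation (m), slope (deg),
# distance to road (km), distance to settlement (km); y = tiger presence (0/1)
n = 300
X = np.column_stack([rng.uniform(300, 1100, n),      # elevation
                     rng.uniform(0, 40, n),          # slope
                     rng.uniform(0, 10, n),          # distance to road
                     rng.uniform(0, 10, n)])         # distance to settlement
logit = -4.0 + 0.002 * X[:, 0] + 0.4 * X[:, 2] + 0.3 * X[:, 3]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

model = LogisticRegression(max_iter=1000).fit(X, y)

# habitat suitability = predicted probability of presence for a candidate cell
cell = np.array([[800.0, 15.0, 6.0, 4.0]])
print("suitability %.2f" % model.predict_proba(cell)[0, 1])
```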

  11. Data Assimilation Tools for CO2 Reservoir Model Development – A Review of Key Data Types, Analyses, and Selected Software

    Energy Technology Data Exchange (ETDEWEB)

    Rockhold, Mark L.; Sullivan, E. C.; Murray, Christopher J.; Last, George V.; Black, Gary D.

    2009-09-30

    Pacific Northwest National Laboratory (PNNL) has embarked on an initiative to develop world-class capabilities for performing experimental and computational analyses associated with geologic sequestration of carbon dioxide. The ultimate goal of this initiative is to provide science-based solutions for helping to mitigate the adverse effects of greenhouse gas emissions. This Laboratory-Directed Research and Development (LDRD) initiative currently has two primary focus areas—advanced experimental methods and computational analysis. The experimental methods focus area involves the development of new experimental capabilities, supported in part by the U.S. Department of Energy’s (DOE) Environmental Molecular Science Laboratory (EMSL) housed at PNNL, for quantifying mineral reaction kinetics with CO2 under high temperature and pressure (supercritical) conditions. The computational analysis focus area involves numerical simulation of coupled, multi-scale processes associated with CO2 sequestration in geologic media, and the development of software to facilitate building and parameterizing conceptual and numerical models of subsurface reservoirs that represent geologic repositories for injected CO2. This report describes work in support of the computational analysis focus area. The computational analysis focus area currently consists of several collaborative research projects. These are all geared towards the development and application of conceptual and numerical models for geologic sequestration of CO2. The software being developed for this focus area is referred to as the Geologic Sequestration Software Suite or GS3. A wiki-based software framework is being developed to support GS3. This report summarizes work performed in FY09 on one of the LDRD projects in the computational analysis focus area. The title of this project is Data Assimilation Tools for CO2 Reservoir Model Development. Some key objectives of this project in FY09 were to assess the current state

  12. Multiple imputation as one tool to provide longitudinal databases for modelling human height and weight development.

    Science.gov (United States)

    Aßmann, C

    2016-06-01

    Besides large efforts regarding field work, provision of valid databases requires statistical and informational infrastructure to enable long-term access to longitudinal data sets on height, weight and related issues. To foster use of longitudinal data sets within the scientific community, provision of valid databases has to address data-protection regulations. It is, therefore, of major importance to hinder identifiability of individuals from publicly available databases. To reach this goal, one possible strategy is to provide a synthetic database to the public allowing for pretesting strategies for data analysis. The synthetic databases can be established using multiple imputation tools. Given the approval of the strategy, verification is based on the original data. Multiple imputation by chained equations is illustrated to facilitate provision of synthetic databases as it allows for capturing a wide range of statistical interdependencies. Also missing values, typically occurring within longitudinal databases for reasons of item non-response, can be addressed via multiple imputation when providing databases. The provision of synthetic databases using multiple imputation techniques is one possible strategy to ensure data protection, increase visibility of longitudinal databases and enhance the analytical potential.
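
    A minimal sketch of the chained-equations idea is shown below: each incomplete column is repeatedly regressed on the others and refilled with predictions plus residual noise, and running the loop several times yields multiple completed data sets. Production MICE implementations also draw the regression parameters themselves; the height/weight panel here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def chained_imputation(data, n_iter=10):
    """Fill NaNs column by column with linear regressions on the other columns,
    adding residual noise so imputations are draws rather than point estimates."""
    X = np.array(data, dtype=float)
    miss = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[miss] = np.take(col_means, np.where(miss)[1])        # crude starting values
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            if not miss[:, j].any():
                continue
            obs = ~miss[:, j]
            A = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
            beta, *_ = np.linalg.lstsq(A[obs], X[obs, j], rcond=None)
            resid_sd = np.std(X[obs, j] - A[obs] @ beta)
            X[miss[:, j], j] = A[miss[:, j]] @ beta + rng.normal(
                0.0, resid_sd, miss[:, j].sum())
    return X

# synthetic height (cm) / weight (kg) panel with missing weights
panel = np.array([[150, 45], [160, np.nan], [170, 65], [165, np.nan], [155, 50]])
imputations = [chained_imputation(panel) for _ in range(5)]   # 5 completed data sets
print(np.round(imputations[0], 1))
```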

  13. Diamond and cBN hybrid and nanomodified cutting tools with enhanced performance: Development, testing and modelling

    DEFF Research Database (Denmark)

    Loginov, Pavel; Mishnaevsky, Leon; Levashov, Evgeny

    2015-01-01

    with 25% of diamond replaced by cBN grains demonstrate 20% increased performance as compared with pure diamond machining tools, and more than two times higher performance as compared with pure cBN tools. Further, cast iron machining efficiency of the wheels modified by hBN particles was 80% more efficient......The potential of enhancement of superhard steel and cast iron cutting tool performance on the basis of microstuctural modifications of the tool materials is studied. Hybrid machining tools with mixed diamond and cBN grains, as well as machining tool with composite nanomodified metallic binder...... are developed, and tested experimentally and numerically. It is demonstrated that both combination of diamond and cBN (hybrid structure) and nanomodification of metallic binder (with hexagonal boron nitride/hBN platelets) lead to sufficient improvement of the cast iron machining performance. The superhard tools...

  14. Development and testing of an in-stream phosphorus cycling model for the soil and water assessment tool.

    Science.gov (United States)

    White, Michael J; Storm, Daniel E; Mittelstet, Aaron; Busteed, Philip R; Haggard, Brian E; Rossi, Colleen

    2014-01-01

    The Soil and Water Assessment Tool is widely used to predict the fate and transport of phosphorus (P) from the landscape through streams and rivers. The current in-stream P submodel may not be suitable for many stream systems, particularly those dominated by attached algae and those affected by point sources. In this research, we developed an alternative submodel based on the equilibrium P concentration concept coupled with a particulate scour and deposition model. This submodel was integrated with the SWAT model and applied to the Illinois River Watershed in Oklahoma, a basin influenced by waste water treatment plant discharges and extensive poultry litter application. The model was calibrated and validated using measured data. Highly variable in-stream P concentrations and equilibrium P concentration values were predicted spatially and temporally. The model also predicted the gradual storage of P in streambed sediments and the resuspension of this P during periodic high-flow flushing events. Waste water treatment plants were predicted to have a profound effect on P dynamics in the Illinois River due to their constant discharge even under base flow conditions. A better understanding of P dynamics in stream systems using the revised submodel may lead to the development of more effective mitigation strategies to control the impact of P from point and nonpoint sources. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
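
    The equilibrium P concentration (EPC0) concept at the heart of the revised submodel can be sketched as a single exchange term: the bed takes up dissolved P when the water column sits above EPC0 and releases it when below. The rate constant, reach area and concentrations below are hypothetical illustrations, not the calibrated SWAT submodel.

```python
def epc_exchange(c_water_mg_l, epc0_mg_l, exchange_coeff_m_per_day, reach_area_m2):
    """Dissolved-P flux between water column and streambed sediment (kg/day).
    Positive = uptake by the bed (water concentration above EPC0),
    negative = release from the bed (water concentration below EPC0)."""
    # mg/L == g/m^3, so flux (g/day) = k (m/d) * area (m^2) * (C - EPC0) (g/m^3)
    flux_g_per_day = exchange_coeff_m_per_day * reach_area_m2 * (c_water_mg_l - epc0_mg_l)
    return flux_g_per_day / 1000.0

# below a treatment-plant outfall the water column may sit above EPC0 -> bed uptake
print(epc_exchange(c_water_mg_l=0.25, epc0_mg_l=0.08,
                   exchange_coeff_m_per_day=0.3, reach_area_m2=5000))
# during a later low-P period the stored P is released back -> negative flux
print(epc_exchange(c_water_mg_l=0.03, epc0_mg_l=0.08,
                   exchange_coeff_m_per_day=0.3, reach_area_m2=5000))
```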

  15. Ultrafast Laser Diagnostics for Energetic-Material Ignition Mechanisms: Tools for Physics-Based Model Development.

    Energy Technology Data Exchange (ETDEWEB)

    Kearney, Sean Patrick; Jilek, Brook Anton; Kohl, Ian Thomas; Farrow, Darcie; Urayama, Junji

    2014-11-01

    We present the results of an LDRD project to develop diagnostics to perform fundamental measurements of material properties during shock compression of condensed phase materials at micron spatial scales and picosecond time scales. The report is structured into three main chapters, which each focus on a different diagnostic development effort. Direct picosecond laser drive is used to introduce shock waves into thin films of energetic and inert materials. The resulting laser-driven shock properties are probed via Ultrafast Time Domain Interferometry (UTDI), which can additionally be used to generate shock Hugoniot data in tabletop experiments. Stimulated Raman scattering (SRS) is developed as a temperature diagnostic. A transient absorption spectroscopy setup has been developed to probe shock-induced changes during shock compression. UTDI results are presented under dynamic, direct-laser-drive conditions and shock Hugoniots are estimated for inert polystyrene samples and for the explosive hexanitroazobenzene, with results from both Sandia and Lawrence Livermore presented here. SRS and transient absorption diagnostics are demonstrated on static thin-film samples, and paths forward to dynamic experiments are presented.

  16. Development of Computational Tools for Metabolic Model Curation, Flux Elucidation and Strain Design

    Energy Technology Data Exchange (ETDEWEB)

    Maranas, Costas D

    2012-05-21

    An overarching goal of the Department of Energy mission is the efficient deployment and engineering of microbial and plant systems to enable biomass conversion in pursuit of high energy density liquid biofuels. This has spurred the pace at which new organisms are sequenced and annotated. This torrent of genomic information has opened the door to understanding metabolism in not just skeletal pathways and a handful of microorganisms but for truly genome-scale reconstructions derived for hundreds of microbes and plants. Understanding and redirecting metabolism is crucial because metabolic fluxes are unique descriptors of cellular physiology that directly assess the current cellular state and quantify the effect of genetic engineering interventions. At the same time, however, trying to keep pace with the rate of genomic data generation has ushered in a number of modeling and computational challenges related to (i) the automated assembly, testing and correction of genome-scale metabolic models, (ii) metabolic flux elucidation using labeled isotopes, and (iii) comprehensive identification of engineering interventions leading to the desired metabolism redirection.
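
    The flux computation underlying genome-scale metabolic models is flux balance analysis: maximise an objective (biomass) flux subject to steady-state mass balance S·v = 0 and flux bounds, i.e. a linear program. The toy network below is hypothetical and far smaller than the genome-scale reconstructions discussed.

```python
import numpy as np
from scipy.optimize import linprog

# toy stoichiometric matrix (rows = metabolites A, B; columns = reactions):
# R1: -> A,  R2: A -> B,  R3: B -> biomass (objective),  R4: A -> byproduct drain
S = np.array([[ 1, -1,  0, -1],
              [ 0,  1, -1,  0]], dtype=float)

bounds = [(0, 10),                 # R1 uptake limited to 10 units
          (0, None), (0, None), (0, None)]

c = np.zeros(S.shape[1])
c[2] = -1.0                        # linprog minimises, so negate the biomass flux

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)    # all uptake should route through R2 -> R3
print("max biomass flux:", -res.fun)
```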

  17. Computer model for the cardiovascular system: development of an e-learning tool for teaching of medical students.

    Science.gov (United States)

    Warriner, David Roy; Bayley, Martin; Shi, Yubing; Lawford, Patricia Victoria; Narracott, Andrew; Fenner, John

    2017-11-21

    This study combined themes in cardiovascular modelling, clinical cardiology and e-learning to create an on-line environment that would assist undergraduate medical students in understanding key physiological and pathophysiological processes in the cardiovascular system. An interactive on-line environment was developed incorporating a lumped-parameter mathematical model of the human cardiovascular system. The model outputs were used to characterise the progression of key disease processes and allowed students to classify disease severity with the aim of improving their understanding of abnormal physiology in a clinical context. Access to the on-line environment was offered to students at all stages of undergraduate training as an adjunct to routine lectures and tutorials in cardiac pathophysiology. Student feedback was collected on this novel on-line material in the course of routine audits of teaching delivery. Medical students, irrespective of their stage of undergraduate training, reported that they found the models and the environment interesting and a positive experience. After exposure to the environment, there was a statistically significant improvement in student performance on a series of 6 questions on cardiovascular medicine, with a 33% and 22% increase in the number of questions answered correctly. Opportunities exist for development of similar environments in other fields of medicine, refinement of the existing environment and further engagement with student cohorts. This work combines some exciting and developing fields in medical education, but routine adoption of these types of tool will be possible only with the engagement of all stake-holders, from educationalists, clinicians, modellers to, most importantly, medical students.
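
    The lumped-parameter modelling approach behind the e-learning environment can be illustrated with its simplest representative, a two-element Windkessel driven by a half-sine ejection profile; the resistance, compliance and stroke volume below are illustrative textbook-scale values, not the published model.

```python
import numpy as np
from scipy.integrate import solve_ivp

R = 1.0            # peripheral resistance, mmHg*s/mL (illustrative)
C = 1.5            # arterial compliance, mL/mmHg (illustrative)
T, T_SYS = 0.8, 0.3          # cardiac period and systolic duration, s
STROKE_VOLUME = 70.0         # mL

def inflow(t):
    """Half-sine ejection profile repeating every beat (mL/s)."""
    phase = t % T
    if phase > T_SYS:
        return 0.0
    return STROKE_VOLUME * np.pi / (2.0 * T_SYS) * np.sin(np.pi * phase / T_SYS)

def windkessel(t, p):
    # two-element Windkessel: C * dP/dt = Q_in(t) - P/R
    return [(inflow(t) - p[0] / R) / C]

sol = solve_ivp(windkessel, (0, 10 * T), [80.0], max_step=1e-3)
last_beat = sol.t >= 9 * T
print("pressure over the final beat: %.0f-%.0f mmHg"
      % (sol.y[0][last_beat].min(), sol.y[0][last_beat].max()))
```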

  18. Computational Tools to Accelerate Commercial Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David C

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  19. Environmental tools in product development

    DEFF Research Database (Denmark)

    Wenzel, Henrik; Hauschild, Michael Zwicky; Jørgensen, Jørgen

    1994-01-01

    A precondition for design of environmentally friendly products is that the design team has access to methods and tools supporting the introduction of environmental criteria in product development. A large Danish program, EDIP, is being carried out by the Institute for Product Development, Technical...

  20. Urban Development Tools in Denmark

    DEFF Research Database (Denmark)

    Aunsborg, Christian; Enemark, Stig; Sørensen, Michael Tophøj

    2005-01-01

    The article contains the following sections: 1. Urbax and the Danish Planning system 2. Main Challenges in the Urban Development 3. Coordination and Growth (Management) Policies and Spatial Planning Policies 4. Coordination of Market Events and Spatial Planning 5. The application of Urban Development Tools...

  1. Endoscopy nurse-administered propofol sedation performance. Development of an assessment tool and a reliability testing model

    DEFF Research Database (Denmark)

    Jensen, Jeppe Thue; Konge, Lars; Møller, Ann

    2014-01-01

    OBJECTIVE: A gold standard of skills required for nurse-administered propofol sedation (NAPS) for gastroenterological endoscopic procedures has been proposed but not established. Due to the potentially hazardous nature of NAPS, an assessment tool is needed to objectively judge the adequacy of training and for future certification. The aim of this study was to develop an assessment tool for measuring competency in propofol sedation and to explore the reliability and validity of the tool. MATERIAL AND METHODS: The nurse-administered propofol assessment tool (NAPSAT) was developed in a Delphi-like fashion. Consensus was achieved on 17 items. Validity evidence was gathered in a case-control study in a full-scale simulation setting. Six experienced nurses and six novice nurses were filmed in two scenarios for assessment according to the assessment tool by three content expert raters. RESULTS: A total

  2. SU-F-T-405: Development of a Rapid Cardiac Contouring Tool Using Landmark-Driven Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Pelletier, C; Jung, J [East Carolina University Greenville, NC (United States); Mosher, E; Lee, C [National Cancer Institute, Rockville, MD (United States); Lee, C [University of Michigan, Ann Arbor, MI (United States)

    2016-06-15

    Purpose: This study aims to develop a tool to rapidly delineate cardiac substructures for use in dosimetry for large-scale clinical trial or epidemiological investigations. The goal is to produce a system that can semi-automatically delineate nine cardiac structures to a reasonable accuracy within a couple of minutes. Methods: The cardiac contouring tool employs a Most Similar Atlas method, where a selection criterion is used to pre-select the most similar model to the patient from a library of pre-defined atlases. Sixty contrast-enhanced cardiac computed tomography angiography (CTA) scans (30 male and 30 female) were manually contoured to serve as the atlas library. For each CTA 12 structures were delineated. Kabsch algorithm was used to compute the optimum rotation and translation matrices between the patient and atlas. Minimum root mean squared distance between the patient and atlas after transformation was used to select the most-similar atlas. An initial study using 10 CTA sets was performed to assess system feasibility. Leave-one patient out method was performed, and fit criteria were calculated to evaluate the fit accuracy compared to manual contours. Results: For the pilot study, mean dice indices of .895 were achieved for the whole heart, .867 for the ventricles, and .802 for the atria. In addition, mean distance was measured via the chord length distribution (CLD) between ground truth and the atlas structures for the four coronary arteries. The mean CLD for all coronary arteries was below 14mm, with the left circumflex artery showing the best agreement (7.08mm). Conclusion: The cardiac contouring tool is able to delineate cardiac structures with reasonable accuracy in less than 90 seconds. Pilot data indicates that the system is able to delineate the whole heart and ventricles within a reasonable accuracy using even a limited library. We are extending the atlas sets to 60 adult males and females in total.
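
    The atlas-selection step combines the Kabsch algorithm (optimal rotation and translation via SVD) with a minimum-RMSD criterion. The sketch below implements that step on hypothetical landmark coordinates; it is an illustration of the method named in the abstract, not the authors' code.

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """Optimal superposition of landmark set P onto Q (both N x 3):
    returns rotation R, translation t and the post-fit RMSD."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)      # centre both sets
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)                  # covariance H = Pc^T Qc
    d = np.sign(np.linalg.det(Vt.T @ U.T))               # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    rmsd = np.sqrt(np.mean(np.sum((P @ R.T + t - Q) ** 2, axis=1)))
    return R, t, rmsd

# hypothetical cardiac landmarks (mm): apex, valve centres, etc.
patient = np.array([[0., 0., 0.], [40., 5., 0.], [20., 35., 10.], [10., 10., 30.]])
atlas_a = patient @ np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]]).T + [5., 2., 1.]
atlas_b = patient * 1.15 + [20., -10., 3.]

# "most similar atlas" = the one with the smallest RMSD after optimal alignment
for name, atlas in [("atlas_a", atlas_a), ("atlas_b", atlas_b)]:
    print(name, "RMSD %.2f mm" % kabsch_rmsd(patient, atlas)[2])
```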

  3. SU-F-T-405: Development of a Rapid Cardiac Contouring Tool Using Landmark-Driven Modeling

    International Nuclear Information System (INIS)

    Pelletier, C; Jung, J; Mosher, E; Lee, C; Lee, C

    2016-01-01

    Purpose: This study aims to develop a tool to rapidly delineate cardiac substructures for use in dosimetry for large-scale clinical trial or epidemiological investigations. The goal is to produce a system that can semi-automatically delineate nine cardiac structures to a reasonable accuracy within a couple of minutes. Methods: The cardiac contouring tool employs a Most Similar Atlas method, where a selection criterion is used to pre-select the most similar model to the patient from a library of pre-defined atlases. Sixty contrast-enhanced cardiac computed tomography angiography (CTA) scans (30 male and 30 female) were manually contoured to serve as the atlas library. For each CTA, 12 structures were delineated. The Kabsch algorithm was used to compute the optimum rotation and translation matrices between the patient and atlas. Minimum root mean squared distance between the patient and atlas after transformation was used to select the most-similar atlas. An initial study using 10 CTA sets was performed to assess system feasibility. A leave-one-patient-out evaluation was performed, and fit criteria were calculated to evaluate the fit accuracy compared to manual contours. Results: For the pilot study, mean Dice indices of 0.895 were achieved for the whole heart, 0.867 for the ventricles, and 0.802 for the atria. In addition, mean distance was measured via the chord length distribution (CLD) between ground truth and the atlas structures for the four coronary arteries. The mean CLD for all coronary arteries was below 14 mm, with the left circumflex artery showing the best agreement (7.08 mm). Conclusion: The cardiac contouring tool is able to delineate cardiac structures with reasonable accuracy in less than 90 seconds. Pilot data indicate that the system is able to delineate the whole heart and ventricles within a reasonable accuracy using even a limited library. We are extending the atlas sets to 60 adult males and females in total.

  4. Stochastic airspace simulation tool development

    Science.gov (United States)

    2009-10-01

    Modeling and simulation is often used to study the physical world when observation may not be practical. The overall goal of a recent and ongoing simulation tool project has been to provide a documented, lifecycle-managed, multi-processor c...

  5. Molecular Modeling and Simulation Tools in the Development of Peptide-Based Biosensors for Mycotoxin Detection: Example of Ochratoxin

    Directory of Open Access Journals (Sweden)

    Aby A. Thyparambil

    2017-12-01

    Full Text Available Mycotoxin contamination of food and feed is now ubiquitous. Exposures to mycotoxin via contact or ingestion can potentially induce adverse health outcomes. Affordable mycotoxin-monitoring systems are highly desired but are limited by (a) the reliance on technically challenging and costly molecular recognition by immuno-capture technologies; and (b) the lack of predictive tools for directing the optimization of alternative molecular recognition modalities. Our group has been exploring the development of ochratoxin detection and monitoring systems using the peptide NFO4 as the molecular recognition receptor in fluorescence, electrochemical and multimodal biosensors. Using ochratoxin as the model mycotoxin, we share our perspective on addressing the technical challenges involved in biosensor fabrication, namely: (a) peptide receptor design; and (b) performance evaluation. Subsequently, the scope and utility of molecular modeling and simulation (MMS) approaches to address the above challenges are described. Informed and enabled by phage display, the subsequent application of MMS approaches can rationally guide subsequent biomolecular engineering of peptide receptors, including bioconjugation and bioimmobilization approaches to be used in the fabrication of peptide biosensors. MMS approaches thus have the potential to reduce biosensor development cost, extend product life cycle, and facilitate multi-analyte detection of mycotoxins, each of which positively contributes to the overall affordability of mycotoxin biosensor monitoring systems.

  6. Tools for Model Evaluation

    DEFF Research Database (Denmark)

    Olesen, H. R.

    1998-01-01

    Proceedings of the Twenty-Second NATO/CCMS International Technical Meeting on Air Pollution Modeling and Its Application, held June 6-10, 1997, in Clermont-Ferrand, France.

  7. Development of a transportation planning tool

    International Nuclear Information System (INIS)

    Funkhouser, B.R.; Moyer, J.W.; Ballweg, E.L.

    1994-01-01

    This paper describes the application of simulation modeling and logistics techniques to the development of a planning tool for the Department of Energy (DOE). The focus of the Transportation Planning Model (TPM) tool is to aid DOE and Sandia analysts in the planning of future fleet sizes, driver and support personnel sizes, base site locations, and resource balancing among the base sites. The design approach is to develop a rapid modeling environment which will allow analysts to easily set up a shipment scenario and perform multiple "what if" evaluations. The TPM is being developed on personal computers using commercial off-the-shelf (COTS) software tools under the Windows® operating environment. Prototype development of the TPM has been completed.

  8. Employability Skills Assessment Tool Development

    Science.gov (United States)

    Rasul, Mohamad Sattar; Rauf, Rose Amnah Abd; Mansor, Azlin Norhaini; Puvanasvaran, A. P.

    2012-01-01

    Research nationally and internationally found that technical graduates are lacking in employability skills. As employability skills are crucial in outcome-based education, the main goal of this research is to develop an Employability Skill Assessment Tool to help students and lecturers produce competent graduates in employability skills needed by…

  9. Development of accurate UWB dielectric properties dispersion at CST simulation tool for modeling microwave interactions with numerical breast phantoms

    International Nuclear Information System (INIS)

    Maher, A.; Quboa, K. M.

    2011-01-01

    In this paper, the recently published dielectric-property dispersion models of breast tissues are reformulated for use in the CST simulation tool. The reformulation tabulates the real and imaginary parts of these models versus frequency over the ultra-wideband (UWB) range using MATLAB programs. The tables are imported into the CST simulation tool and fitted to first- or second-order general dispersion equations. The results show good agreement between the original and the imported data. The MATLAB programs are included in the appendix.
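
The record does not give the dispersion equations or the table format actually used; as a hedged illustration only, the sketch below tabulates the real and imaginary parts of a single-pole Debye model over the UWB band, the kind of frequency table that could then be imported into a field solver and fitted. The parameter values are placeholders, not the published breast-tissue coefficients.

```python
import numpy as np

# Placeholder single-pole Debye parameters (NOT the published breast-tissue values)
eps_inf, delta_eps, tau, sigma_s = 7.0, 40.0, 10.0e-12, 0.7   # tau in s, sigma in S/m
eps0 = 8.854e-12                                              # vacuum permittivity, F/m

f = np.linspace(3.1e9, 10.6e9, 200)        # UWB band, Hz
w = 2 * np.pi * f
eps_c = eps_inf + delta_eps / (1 + 1j * w * tau) - 1j * sigma_s / (w * eps0)

# Tabulate real and imaginary parts versus frequency for import into a simulator
table = np.column_stack([f, eps_c.real, -eps_c.imag])
np.savetxt("debye_dispersion.txt", table, header="f_Hz eps_real eps_imag")
print(table[:3])
```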

  10. Requirements for clinical information modelling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Kalra, Dipak

    2015-07-01

    This study proposes consensus requirements for clinical information modelling tools that can support modelling tasks in medium/large scale institutions. Rather than identify which functionalities are currently available in existing tools, the study has focused on functionalities that should be covered in order to provide guidance about how to evolve the existing tools. After identifying a set of 56 requirements for clinical information modelling tools based on a literature review and interviews with experts, a classical Delphi study methodology was applied to conduct a two round survey in order to classify them as essential or recommended. Essential requirements are those that must be met by any tool that claims to be suitable for clinical information modelling, and if we one day have a certified tools list, any tool that does not meet essential criteria would be excluded. Recommended requirements are those more advanced requirements that may be met by tools offering a superior product or only needed in certain modelling situations. According to the answers provided by 57 experts from 14 different countries, we found a high level of agreement to enable the study to identify 20 essential and 21 recommended requirements for these tools. It is expected that this list of identified requirements will guide developers on the inclusion of new basic and advanced functionalities that have strong support by end users. This list could also guide regulators in order to identify requirements that could be demanded of tools adopted within their institutions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere- ocean model called ModelE, written in fixed format Fortran.

  12. Applied, theoretical modeling of space-based assembly, using expert system architecture for computer-aided engineering tool development

    Science.gov (United States)

    Jolly, Steven Douglas

    1992-01-01

    The challenges associated with constructing interplanetary spacecraft and space platforms in low earth orbit are such that it is imperative that comprehensive, preliminary process planning analyses be completed before committing funds for Phase B design (detail design, development). Phase A and 'pre-Phase A' design activities will commonly address engineering questions such as mission-design structural integrity, attitude control, thermal control, etc. But the questions of constructability, maintainability and reliability during the assembly phase usually go unaddressed until the more mature stages of design (or very often production) are reached. This is an unacceptable strategy for future space missions whether they be government or commercial ventures. After interviews with expert Aerospace and Construction industry planners, a new methodology was formulated and a Blackboard Metaphor Knowledge-based Expert System synthesis model has been successfully developed that can decompose interplanetary vehicles into deliverable orbital subassemblies. Constraint propagation, including the launch vehicle payload shroud envelope, is accomplished with heuristic and numerical algorithms including a unique adaptation of a reasoning technique used by Stanford researchers in terrestrial automated process planning. The model is a hybrid combination of rule and frame-based representations, designed to integrate into a Computer-Aided Engineering (CAE) environment. Emphasis is placed on the actual joining, rendezvous, and refueling of the orbiting, dynamic spacecraft. Applying this new methodology to a large Mars interplanetary spacecraft (736,000 kg) designed by Boeing shows high correlation with manual decomposition and planning analysis studies, but at a fraction of the time and with little user interaction. Such Computer-Aided Engineering (CAE) tools would greatly leverage the designer's ability to assess constructability.

  13. Development and implementation of a GIS-based tool for spatial modeling of seismic vulnerability of Tehran

    Directory of Open Access Journals (Sweden)

    M. Hashemi

    2012-12-01

    Full Text Available Achieving sustainable development in countries prone to earthquakes is possible by taking effective measures to reduce vulnerability to earthquakes. In this context, damage assessment of hypothetical earthquakes and planning for disaster management are important issues. Having a computer tool capable of estimating structural and human losses from earthquakes in a specific region may facilitate the decision-making process before and during disasters. Interoperability of this tool with widespread spatial analysis frameworks will expedite the data transfer process. In this study, the earthquake damage assessment (EDA) software tool is developed as an embedded extension within a GIS (geographic information system) environment for the city of Tehran, Iran. This GIS-based extension provides users with a familiar environment to estimate and observe the probable damages and fatalities of a deterministic earthquake scenario. The productivity of this tool is later demonstrated for southern Karoon parish, Region 10, Tehran. Three case studies for three active faults in the area and a comparison of the results with other research substantiated the reliability of this tool for additional earthquake scenarios.

  14. Process-Based Quality (PBQ) Tools Development

    Energy Technology Data Exchange (ETDEWEB)

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts.

  15. Program Development Tools and Infrastructures

    International Nuclear Information System (INIS)

    Schulz, M.

    2012-01-01

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

  16. Using Collaborative Simulation Modeling to Develop a Web-Based Tool to Support Policy-Level Decision Making About Breast Cancer Screening Initiation Age

    Directory of Open Access Journals (Sweden)

    Elizabeth S. Burnside MD, MPH, MS

    2017-07-01

    Full Text Available Background: There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. Objective: To use three established simulation models to develop a web-based tool called Mammo OUTPuT. Methods: The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. Results: The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. Conclusions: This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms.

  17. Modifying the Soil and Water Assessment Tool to Simulate Cropland Carbon Flux: Model Development and Initial Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xuesong; Izaurralde, Roberto C.; Arnold, Jeffrey; Williams, Jimmy R.; Srinivasan, Raghavan

    2013-10-01

    Climate change is one of the most compelling modern issues and has important implications for almost every aspect of natural and human systems. The Soil and Water Assessment Tool (SWAT) model has been applied worldwide to support sustainable land and water management in a changing climate. However, the inadequacies of the existing carbon algorithm in SWAT limit its application in assessing impacts of human activities on CO2 emission, one important source of greenhouse gases (GHGs) that traps heat in the earth system and results in global warming. In this research, we incorporate a revised version of the CENTURY carbon model into SWAT to describe dynamics of soil organic matter (SOM)- residue and simulate land-atmosphere carbon exchange.

  18. Modifying the Soil and Water Assessment Tool to simulate cropland carbon flux: model development and initial evaluation.

    Science.gov (United States)

    Zhang, Xuesong; Izaurralde, R César; Arnold, Jeffrey G; Williams, Jimmy R; Srinivasan, Raghavan

    2013-10-01

    Climate change is one of the most compelling modern issues and has important implications for almost every aspect of natural and human systems. The Soil and Water Assessment Tool (SWAT) model has been applied worldwide to support sustainable land and water management in a changing climate. However, the inadequacies of the existing carbon algorithm in SWAT limit its application in assessing impacts of human activities on CO2 emission, one important source of greenhouse gasses (GHGs) that traps heat in the earth system and results in global warming. In this research, we incorporate a revised version of the CENTURY carbon model into SWAT to describe dynamics of soil organic matter (SOM)-residue and simulate land-atmosphere carbon exchange. We test this new SWAT-C model with daily eddy covariance (EC) observations of net ecosystem exchange (NEE) and evapotranspiration (ET) and annual crop yield at six sites across the U.S. Midwest. Results show that SWAT-C simulates well multi-year average NEE and ET across the spatially distributed sites and captures the majority of temporal variation of these two variables at a daily time scale at each site. Our analyses also reveal that performance of SWAT-C is influenced by multiple factors, such as crop management practices (irrigated vs. rainfed), completeness and accuracy of input data, crop species, and initialization of state variables. Overall, the new SWAT-C demonstrates favorable performance for simulating land-atmosphere carbon exchange across agricultural sites with different soils, climate, and management practices. SWAT-C is expected to serve as a useful tool for including carbon flux into consideration in sustainable watershed management under a changing climate. We also note that extensive assessment of SWAT-C with field observations is required for further improving the model and understanding potential uncertainties of applying it across large regions with complex landscapes. © 2013.
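
The SWAT-C abstracts do not state which goodness-of-fit statistics were used to compare simulated and observed daily NEE and ET; purely as an illustration, two metrics commonly used for such comparisons are sketched below with hypothetical flux-tower series.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, values below 0 are worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def percent_bias(obs, sim):
    """Percent bias of simulated totals relative to observed totals."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * (np.sum(sim) - np.sum(obs)) / np.sum(obs)

# Hypothetical daily NEE series (g C m-2 d-1) at one flux-tower site
nee_obs = np.array([-2.1, -1.8, 0.4, 1.2, -0.6])
nee_sim = np.array([-1.9, -1.5, 0.6, 0.9, -0.8])
print("NSE:  ", nash_sutcliffe(nee_obs, nee_sim))
print("PBIAS:", percent_bias(nee_obs, nee_sim))
```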

  19. A Robust Profitability Assessment Tool for Targeting Agricultural Investments in Developing Countries: Modeling Spatial Heterogeneity and Uncertainty

    Science.gov (United States)

    Quinn, J. D.; Zeng, Z.; Shoemaker, C. A.; Woodard, J.

    2014-12-01

    In sub-Saharan Africa, where the majority of the population earns their living from agriculture, government expenditures in many countries are being re-directed to the sector to increase productivity and decrease poverty. However, many of these investments are seeing low returns because they are poorly targeted. A geographic tool that accounts for spatial heterogeneity and temporal variability in the factors of production would allow governments and donors to optimize their investments by directing them to farmers for whom they are most profitable. One application for which this is particularly relevant is fertilizer recommendations. It is well-known that soil fertility in much of sub-Saharan Africa is declining due to insufficient nutrient inputs to replenish those lost through harvest. Since fertilizer application rates in sub-Saharan Africa are several times smaller than in other developing countries, it is often assumed that African farmers are under-applying fertilizer. However, this assumption ignores the risk farmers face in choosing whether or how much fertilizer to apply. Simply calculating the benefit/cost ratio of applying a given level of fertilizer in a particular year over a large, aggregated region (as is often done) overlooks the variability in yield response seen at different sites within the region, and at the same site from year to year. Using Ethiopia as an example, we are developing a 1 km resolution fertilizer distribution tool that provides pre-season fertilizer recommendations throughout the agricultural regions of the country, conditional on seasonal climate forecasts. By accounting for spatial heterogeneity in soil, climate, market and travel conditions, as well as uncertainty in climate and output prices at the time a farmer must purchase fertilizer, this stochastic optimization tool gives better recommendations to governments, fertilizer companies, and aid organizations looking to optimize the welfare benefits achieved by their
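
The abstract argues that a simple benefit/cost ratio ignores the yield and price risk a farmer faces; a minimal Monte Carlo sketch of the kind of stochastic profitability calculation this implies is shown below. All distributions and values are hypothetical and are not taken from the Ethiopian tool.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000                                   # Monte Carlo draws (hypothetical)

# Hypothetical distributions for a single 1-km grid cell
yield_gain = rng.normal(0.8, 0.4, n)         # extra grain from fertilizer, t/ha
grain_price = rng.normal(300.0, 60.0, n)     # output price at harvest, USD/t
fert_cost = 120.0                            # fertilizer cost known at purchase, USD/ha

profit = yield_gain * grain_price - fert_cost
print("expected profit (USD/ha):", profit.mean())
print("probability of a loss:   ", (profit < 0).mean())
```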

  20. Development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool: An Evidence-Based Model for School Garden Integration.

    Science.gov (United States)

    Burt, Kate Gardner; Koch, Pamela; Contento, Isobel

    2017-10-01

    Researchers have established the benefits of school gardens on students' academic achievement, dietary outcomes, physical activity, and psychosocial skills, yet limited research has been conducted about how school gardens become institutionalized and sustained. Our aim was to develop a tool that captures how gardens are effectively established, integrated, and sustained in schools. We conducted a sequential, exploratory, mixed-methods study. Participants were identified with the help of Grow To Learn, the organization coordinating the New York City school garden initiative, and recruited via e-mail. A stratified, purposeful sample of 21 New York City elementary and middle schools participated in this study throughout the 2013/2014 school year. The sample was stratified in their garden budgets and purposeful in that each of the schools' gardens were determined to be well integrated and sustained. The processes and strategies used by school gardeners to establish well-integrated school gardens were assessed via data collected from surveys, interviews, observations, and concept mapping. Descriptive statistics as well as multidimensional scaling and hierarchical cluster analysis were used to examine the survey and concept mapping data. Qualitative data analysis consisted of thematic coding, pattern matching, explanation building and cross-case synthesis. Nineteen components within four domains of school garden integration were found through the mixed-methods concept mapping analysis. When the analyses of other data were combined, relationships between domains and components emerged. These data resulted in the development of the GREEN (Garden Resources, Education, and Environment Nexus) Tool. When schools with integrated and sustained gardens were studied, patterns emerged about how gardeners achieve institutionalization through different combinations of critical components. These patterns are best described by the GREEN Tool, the first framework to identify how to

  1. Modifying the Soil and Water Assessment Tool to simulate cropland carbon flux: Model development and initial evaluation

    International Nuclear Information System (INIS)

    Zhang, Xuesong; Izaurralde, R. César; Arnold, Jeffrey G.; Williams, Jimmy R.; Srinivasan, Raghavan

    2013-01-01

    Climate change is one of the most compelling modern issues and has important implications for almost every aspect of natural and human systems. The Soil and Water Assessment Tool (SWAT) model has been applied worldwide to support sustainable land and water management in a changing climate. However, the inadequacies of the existing carbon algorithm in SWAT limit its application in assessing impacts of human activities on CO2 emission, one important source of greenhouse gasses (GHGs) that traps heat in the earth system and results in global warming. In this research, we incorporate a revised version of the CENTURY carbon model into SWAT to describe dynamics of soil organic matter (SOM)-residue and simulate land–atmosphere carbon exchange. We test this new SWAT-C model with daily eddy covariance (EC) observations of net ecosystem exchange (NEE) and evapotranspiration (ET) and annual crop yield at six sites across the U.S. Midwest. Results show that SWAT-C simulates well multi-year average NEE and ET across the spatially distributed sites and captures the majority of temporal variation of these two variables at a daily time scale at each site. Our analyses also reveal that performance of SWAT-C is influenced by multiple factors, such as crop management practices (irrigated vs. rainfed), completeness and accuracy of input data, crop species, and initialization of state variables. Overall, the new SWAT-C demonstrates favorable performance for simulating land–atmosphere carbon exchange across agricultural sites with different soils, climate, and management practices. SWAT-C is expected to serve as a useful tool for including carbon flux into consideration in sustainable watershed management under a changing climate. We also note that extensive assessment of SWAT-C with field observations is required for further improving the model and understanding potential uncertainties of applying it across large regions with complex landscapes. - Highlights: • Expanding the SWAT

  2. Modifying the Soil and Water Assessment Tool to simulate cropland carbon flux: Model development and initial evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xuesong; Izaurralde, R. César [Joint Global Change Research Institute, Pacific Northwest National Laboratory and University of Maryland, College Park, MD 20740 (United States); Arnold, Jeffrey G. [Grassland, Soil and Water Research Laboratory USDA-ARS, Temple, TX 76502 (United States); Williams, Jimmy R. [Blackland Research and Extension Center, AgriLIFE Research, 720 E. Blackland Road, Temple, TX 76502 (United States); Srinivasan, Raghavan [Spatial Sciences Laboratory in the Department of Ecosystem Science and Management, Texas A and M University, College Stations, TX 77845 (United States)

    2013-10-01

    Climate change is one of the most compelling modern issues and has important implications for almost every aspect of natural and human systems. The Soil and Water Assessment Tool (SWAT) model has been applied worldwide to support sustainable land and water management in a changing climate. However, the inadequacies of the existing carbon algorithm in SWAT limit its application in assessing impacts of human activities on CO2 emission, one important source of greenhouse gasses (GHGs) that traps heat in the earth system and results in global warming. In this research, we incorporate a revised version of the CENTURY carbon model into SWAT to describe dynamics of soil organic matter (SOM)-residue and simulate land–atmosphere carbon exchange. We test this new SWAT-C model with daily eddy covariance (EC) observations of net ecosystem exchange (NEE) and evapotranspiration (ET) and annual crop yield at six sites across the U.S. Midwest. Results show that SWAT-C simulates well multi-year average NEE and ET across the spatially distributed sites and captures the majority of temporal variation of these two variables at a daily time scale at each site. Our analyses also reveal that performance of SWAT-C is influenced by multiple factors, such as crop management practices (irrigated vs. rainfed), completeness and accuracy of input data, crop species, and initialization of state variables. Overall, the new SWAT-C demonstrates favorable performance for simulating land–atmosphere carbon exchange across agricultural sites with different soils, climate, and management practices. SWAT-C is expected to serve as a useful tool for including carbon flux into consideration in sustainable watershed management under a changing climate. We also note that extensive assessment of SWAT-C with field observations is required for further improving the model and understanding potential uncertainties of applying it across large regions with complex landscapes. - Highlights: • Expanding the

  3. Analysis and prediction of agricultural pest dynamics with Tiko'n, a generic tool to develop agroecological food web models

    Science.gov (United States)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Anandaraja, N.; Tuy, H.; Melgar-Quiñonez, H.

    2016-12-01

    While several well-validated crop growth models are currently widely used, very few crop pest models of the same caliber have been developed or applied, and pest models that take trophic interactions into account are even rarer. This may be due to several factors, including 1) the difficulty of representing complex agroecological food webs in a quantifiable model, and 2) the general belief that pesticides effectively remove insect pests from immediate concern. However, pests currently claim a substantial amount of harvests every year (and account for additional control costs), and the impact of insects and of their trophic interactions on agricultural crops cannot be ignored, especially in the context of changing climates and increasing pressures on crops across the globe. Unfortunately, most integrated pest management frameworks rely on very simple models (if at all), and most examples of successful agroecological management remain more anecdotal than scientifically replicable. In light of this, there is a need for validated and robust agroecological food web models that allow users to predict the response of these webs to changes in management, crops or climate, both in order to predict future pest problems under a changing climate as well as to develop effective integrated management plans. Here we present Tiko'n, a Python-based software whose API allows users to rapidly build and validate trophic web agroecological models that predict pest dynamics in the field. The programme uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. We apply the model to the coconut black-headed caterpillar (Opisina arenosella) and associated parasitoid data from Sri Lanka, showing how the modeling framework can be used to rapidly develop, calibrate and validate models that elucidate how the internal structures of food webs
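
Tiko'n is described as Python-based with a Bayesian calibration step, but its API is not shown in the record; the fragment below is only a schematic rejection-sampling calibration of a single hypothetical parameter in a toy pest-parasitoid model, not Tiko'n's actual interface.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_pest(attack_rate, n_weeks=10):
    """Toy pest dynamic: weekly pest counts, growth damped by parasitoid attack."""
    pest, out = 100.0, []
    for _ in range(n_weeks):
        pest = pest * 1.3 * np.exp(-attack_rate)   # growth reduced by parasitism
        out.append(pest)
    return np.array(out)

# Synthetic "field" observations generated with a known attack rate plus noise
observed = simulate_pest(0.35) * rng.normal(1.0, 0.1, 10)

# Rejection-style approximate Bayesian calibration of the attack rate
prior_draws = rng.uniform(0.0, 1.0, 5000)
misfit = np.array([np.mean((simulate_pest(a) - observed) ** 2) for a in prior_draws])
posterior = prior_draws[misfit < np.quantile(misfit, 0.01)]   # keep best-fitting 1%
print("posterior mean attack rate:", posterior.mean())
```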

  4. Development of multiple linear regression models as predictive tools for fecal indicator concentrations in a stretch of the lower Lahn River, Germany.

    Science.gov (United States)

    Herrig, Ilona M; Böer, Simone I; Brennholt, Nicole; Manz, Werner

    2015-11-15

    Since rivers are typically subject to rapid changes in microbiological water quality, tools are needed to allow timely water quality assessment. A promising approach is the application of predictive models. In our study, we developed multiple linear regression (MLR) models in order to predict the abundance of the fecal indicator organisms Escherichia coli (EC), intestinal enterococci (IE) and somatic coliphages (SC) in the Lahn River, Germany. The models were developed on the basis of an extensive set of environmental parameters collected during a 12-months monitoring period. Two models were developed for each type of indicator: 1) an extended model including the maximum number of variables significantly explaining variations in indicator abundance and 2) a simplified model reduced to the three most influential explanatory variables, thus obtaining a model which is less resource-intensive with regard to required data. Both approaches have the ability to model multiple sites within one river stretch. The three most important predictive variables in the optimized models for the bacterial indicators were NH4-N, turbidity and global solar irradiance, whereas chlorophyll a content, discharge and NH4-N were reliable model variables for somatic coliphages. Depending on indicator type, the extended mode models also included the additional variables rainfall, O2 content, pH and chlorophyll a. The extended mode models could explain 69% (EC), 74% (IE) and 72% (SC) of the observed variance in fecal indicator concentrations. The optimized models explained the observed variance in fecal indicator concentrations to 65% (EC), 70% (IE) and 68% (SC). Site-specific efficiencies ranged up to 82% (EC) and 81% (IE, SC). Our results suggest that MLR models are a promising tool for a timely water quality assessment in the Lahn area. Copyright © 2015 Elsevier Ltd. All rights reserved.
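
For orientation, the optimized models described above reduce to an ordinary least-squares fit of three predictors (NH4-N, turbidity, global solar irradiance) to log-transformed indicator counts; a minimal sketch with hypothetical data, not the study's dataset, is given below.

```python
import numpy as np

# Hypothetical monitoring data: columns are NH4-N (mg/L), turbidity (NTU),
# global solar irradiance (W/m2); response is log10 E. coli (MPN/100 mL)
X = np.array([[0.05,  4.2, 310.0],
              [0.12,  9.8, 120.0],
              [0.08,  6.1, 250.0],
              [0.20, 15.3,  80.0],
              [0.03,  3.0, 400.0]])
y = np.log10([420.0, 2300.0, 910.0, 5200.0, 180.0])

# Ordinary least squares with an intercept term
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
r2 = 1 - np.sum((y - A @ coef) ** 2) / np.sum((y - y.mean()) ** 2)
print("intercept and coefficients:", coef)
print("R^2:", r2)
```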

  5. A Tool for Conceptualising in PSS development

    DEFF Research Database (Denmark)

    Matzen, Detlef; McAloone, Timothy Charles

    2006-01-01

    This paper introduces a tool for conceptualising in the development of product/service-systems (PSS), based upon the modelling of service activities. Our argumentation is built on two previous articles by the same author, previously presented at the 16th Symposium “Design for X” [1] and the 9th International Design Conference [2]. In this contribution, we take the step from a fundamental understanding of the phenomenon to creating a normative exploitation of this understanding for PSS concept development. The developed modelling technique is based on the Customer Activity Cycle (CAC) model ... the integrated consideration of the customers’ activities, possible PSS offerings and beneficial partnering options (i.e. between different supplier companies) within the delivery value chain.

  6. Assessment of low contrast detection in CT using model observers. Developing a clinically-relevant tool for characterising adaptive statistical and model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Ott, Julien G.; Ba, Alexandre; Racine, Damien; Viry, Anais; Bochud, Francois O.; Verdun, Francis R. [Univ. Hospital Lausanne (Switzerland). Inst. of Radiation Physics

    2017-08-01

    This study aims to assess CT image quality in a way that would meet specific requirements of clinical practice. Physics metrics, such as Fourier-transform-derived metrics, were traditionally employed for this purpose. However, assessment methods through a detection task have also developed quite extensively lately, and we chose here to rely on this modality for image quality assessment. Our goal was to develop a tool adapted for a fast and reliable CT image quality assessment in order to pave the way for new CT benchmarking techniques in a clinical context. Additionally, we also used this method to estimate the benefits brought by some IR algorithms. A modified QRM chest phantom containing spheres of 5 and 8 mm at contrast levels of 10 and 20 HU at 120 kVp was used. Images of the phantom were acquired at CTDIvol of 0.8, 3.6, 8.2 and 14.5 mGy, before being reconstructed using FBP, ASIR 40 and MBIR on a GE HD 750 CT scanner. They were then assessed by eight human observers undergoing a 4-AFC test. After that, these data were compared with the results obtained from two different model observers (NPWE and CHO with DDoG channels). The study investigated the effects of the acquisition conditions as well as reconstruction methods. NPWE and CHO models both gave coherent results and approximated human observer results well. Moreover, the reconstruction technique used to retrieve the images had a clear impact on the PC values. Both models suggest that switching from FBP to ASIR 40 and particularly to MBIR produces an increase in low-contrast detection, provided a minimum level of exposure is reached. Our work shows that both the CHO model with DDoG channels and the NPWE model approximate the trend of humans performing a detection task. Both models also suggest that the use of MBIR goes along with an increase in the PCs, indicating that further dose reduction is still possible when using those techniques. Eventually, the CHO model associated with the protocol we described in this study
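
The record reports observer performance as percent correct (PC) in a 4-AFC task; as a hedged aside, once a model observer's detectability index d' is known, PC in an M-alternative forced-choice task can be computed with the standard integral sketched below.

```python
from scipy.integrate import quad
from scipy.stats import norm

def pc_mafc(d_prime, m=4):
    """Percent correct of a Gaussian decision variable in an M-AFC task.

    PC is the probability that the signal response exceeds all m-1 noise responses.
    """
    integrand = lambda x: norm.pdf(x - d_prime) * norm.cdf(x) ** (m - 1)
    value, _ = quad(integrand, -10, 10)
    return value

print(pc_mafc(1.5, m=4))   # d' = 1.5 in a 4-AFC task gives PC of roughly 0.74
```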

  7. On the Development of a Java-Based Tool for Multifidelity Modeling of Coupled Systems LDRD Final Report

    CERN Document Server

    Gardner, D R; Gonzáles, M A; Hennigan, G L; Young, M

    2002-01-01

    This report describes research and development of methods to couple vastly different subsystems and physical models and to encapsulate these methods in a Java™-based framework. The work described here focused on developing a capability to enable design engineers and safety analysts to perform multifidelity, multiphysics analyses more simply. In particular this report describes a multifidelity algorithm for thermal radiative heat transfer and illustrates its performance. Additionally, it describes a module-based computer software architecture that facilitates multifidelity, multiphysics simulations. The architecture is currently being used to develop an environment for modeling the effects of radiation on electronic circuits in support of the FY 2003 Hostile Environments Milestone for the Accelerated Strategic Computing Initiative.

  8. A Stochastic Model of Space Radiation Transport as a Tool in the Development of Time-Dependent Risk Assessment

    Science.gov (United States)

    Kim, Myung-Hee Y.; Nounu, Hatem N.; Ponomarev, Artem L.; Cucinotta, Francis A.

    2011-01-01

    A new computer model, the GCR Event-based Risk Model code (GERMcode), was developed to describe biophysical events from high-energy protons and heavy ions that have been studied at the NASA Space Radiation Laboratory (NSRL) [1] for the purpose of simulating space radiation biological effects. In the GERMcode, the biophysical description of the passage of heavy ions in tissue and shielding materials is made with a stochastic approach that includes both ion track structure and nuclear interactions. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model [2]. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections

  9. Evaluation of endourological tools to improve the diagnosis and therapy of ureteral tumors – from model development to clinical application

    Directory of Open Access Journals (Sweden)

    Wagner D.

    2015-09-01

    Full Text Available Adequate diagnosis of upper urinary tract (UUT) tumors is essential for successful local treatment. Organ-sparing approaches are technically difficult and require consistent further development. Appropriate models for investigating new diagnostic and therapeutic methods are not yet available. This study demonstrates the incorporation of a fresh sample model into five different test levels (I-V) for improving the diagnosis and therapy of ureteral tumors. In these test levels, new diagnostic and ablation techniques are evaluated for feasibility, application safety, efficacy and accuracy. An assessment of their suitability for broad preclinical and clinical application also took economic aspects into account.

  10. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.

  11. A model for using a concept inventory as a tool for students' assessment and faculty professional development.

    Science.gov (United States)

    Marbach-Ad, Gili; McAdams, Katherine C; Benson, Spencer; Briken, Volker; Cathcart, Laura; Chase, Michael; El-Sayed, Najib M; Frauwirth, Kenneth; Fredericksen, Brenda; Joseph, Sam W; Lee, Vincent; McIver, Kevin S; Mosser, David; Quimby, B Booth; Shields, Patricia; Song, Wenxia; Stein, Daniel C; Stewart, Richard; Thompson, Katerina V; Smith, Ann C

    2010-01-01

    This essay describes how the use of a concept inventory has enhanced professional development and curriculum reform efforts of a faculty teaching community. The Host Pathogen Interactions (HPI) teaching team is composed of research and teaching faculty with expertise in HPI who share the goal of improving the learning experience of students in nine linked undergraduate microbiology courses. To support evidence-based curriculum reform, we administered our HPI Concept Inventory as a pre- and postsurvey to approximately 400 students each year since 2006. The resulting data include student scores as well as their open-ended explanations for distractor choices. The data have enabled us to address curriculum reform goals of 1) reconciling student learning with our expectations, 2) correlating student learning with background variables, 3) understanding student learning across institutions, 4) measuring the effect of teaching techniques on student learning, and 5) demonstrating how our courses collectively form a learning progression. The analysis of the concept inventory data has anchored and deepened the team's discussions of student learning. Reading and discussing students' responses revealed the gap between our understanding and the students' understanding. We provide evidence to support the concept inventory as a tool for assessing student understanding of HPI concepts and faculty development.

  12. SNL-NUMO collaborative : development of a deterministic site characterization tool using multi-model ranking and inference.

    Energy Technology Data Exchange (ETDEWEB)

    Grace, Matthew; Lowry, Thomas Stephen; Arnold, Bill Walter; James, Scott Carlton; Gray, Genetha Anne; Ahlmann, Michael

    2008-08-01

    Uncertainty in site characterization arises from a lack of data and knowledge about a site and includes uncertainty in the boundary conditions, uncertainty in the characteristics, location, and behavior of major features within an investigation area (e.g., major faults as barriers or conduits), uncertainty in the geologic structure, as well as differences in numerical implementation (e.g., 2-D versus 3-D, finite difference versus finite element, grid resolution, deterministic versus stochastic, etc.). Since the true condition at a site can never be known, selection of the best conceptual model is very difficult. In addition, limiting the understanding to a single conceptualization too early in the process, or before data can support that conceptualization, may lead to confidence in a characterization that is unwarranted as well as to data collection efforts and field investigations that are misdirected and/or redundant. Using a series of numerical modeling experiments, this project examined the application and use of information criteria within the site characterization process. The numerical experiments are based on models of varying complexity that were developed to represent one of two synthetically developed groundwater sites: (1) a fully hypothetical site that represented a complex, multi-layer, multi-faulted site, and (2) a site that was based on the Horonobe site in northern Japan. Each of the synthetic sites was modeled in detail to provide increasingly informative 'field' data over successive iterations to the representing numerical models. The representing numerical models were calibrated to the synthetic site data and then ranked and compared using several different information criteria approaches. Results show that, for the early phases of site characterization, low-parameterized models ranked highest while more complex models generally ranked lowest. In addition, predictive capabilities were also better with the low-parameterized models. For
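
The report does not specify which information criteria were applied; as one generic possibility, AIC and BIC computed from calibration residuals can be used to rank models of different parameterization, as sketched below with hypothetical residuals and parameter counts.

```python
import numpy as np

def aic_bic(residuals, n_params):
    """AIC and BIC from least-squares calibration residuals, assuming Gaussian errors."""
    n = len(residuals)
    rss = float(np.sum(np.square(residuals)))
    log_l = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)   # maximized log-likelihood
    aic = 2 * n_params - 2 * log_l
    bic = n_params * np.log(n) - 2 * log_l
    return aic, bic

# Hypothetical head residuals (m) for a simple and a highly parameterized model
simple_model = np.array([0.40, -0.30, 0.50, -0.20, 0.10, 0.30])
complex_model = np.array([0.35, -0.25, 0.45, -0.15, 0.05, 0.25])
print("simple  (3 params): ", aic_bic(simple_model, 3))
print("complex (12 params):", aic_bic(complex_model, 12))
```

With residual improvements this small, the penalty terms dominate and the lower-parameterized model ranks higher, which is the qualitative behavior the report describes.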

  13. Identification and implementation of end-user needs during development of a state-of-the-art modeling tool-set - 59069

    International Nuclear Information System (INIS)

    Seitz, Roger; Williamson, Mark; Gerdes, Kurt; Freshley, Mark; Dixon, Paul; Collazo, Yvette T.; Hubbard, Susan

    2012-01-01

    The U.S. Department of Energy (US DOE) Office of Environmental Management, Technology Innovation and Development is supporting a multi-National Laboratory effort to develop the Advanced Simulation Capability for Environmental Management (ASCEM). ASCEM is an emerging state-of-the-art scientific approach and software infrastructure for understanding and predicting contaminant fate and transport in natural and engineered systems. These modular and open-source high performance computing tools and user interfaces will facilitate integrated approaches that enable standardized assessments of performance and risk for EM cleanup and closure decisions. The ASCEM team recognized that engaging end-users in the ASCEM development process would lead to enhanced development and implementation of the ASCEM tool-sets in the user community. End-user involvement in ASCEM covers a broad spectrum of perspectives, including: performance assessment (PA) and risk assessment practitioners, research scientists, decision-makers, oversight personnel, and regulators engaged in the US DOE cleanup mission. End-users are primarily engaged in ASCEM via the ASCEM User Steering Committee (USC) and the 'user needs interface' task. Future plans also include user involvement in demonstrations of the ASCEM tools. This paper will describe the details of how end users have been engaged in the ASCEM program and will demonstrate how this involvement has strengthened both the tool development and community confidence. ASCEM tools requested by end-users specifically target modeling challenges associated with US DOE cleanup activities. The demonstration activities involve application of ASCEM tools and capabilities to representative problems at DOE sites. Selected results from the ASCEM Phase 1 demonstrations are discussed to illustrate how capabilities requested by end-users were implemented in prototype versions of the ASCEM tool. The ASCEM team engaged a variety of interested parties early in the development

  14. ANSYS tools in modeling tires

    Science.gov (United States)

    Ali, Ashraf; Lovell, Michael

    1995-01-01

    This presentation summarizes the capabilities in the ANSYS program that relate to the computational modeling of tires. The power and the difficulties associated with modeling nearly incompressible rubber-like materials using hyperelastic constitutive relationships are highlighted from a developer's point of view. The topics covered include a hyperelastic material constitutive model for rubber-like materials, a general overview of contact-friction capabilities, and the acoustic fluid-structure interaction problem for noise prediction. Brief theoretical development and example problems are presented for each topic.
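
The record does not state which hyperelastic strain-energy function is meant; for orientation only, the sketch below evaluates the uniaxial response of an incompressible neo-Hookean solid, one of the simplest hyperelastic models used for rubber-like materials. The shear modulus value is a placeholder.

```python
import numpy as np

def neo_hookean_uniaxial(stretch, mu):
    """Nominal (1st Piola-Kirchhoff) stress of an incompressible neo-Hookean solid
    in uniaxial tension: P = mu * (lambda - lambda**-2)."""
    lam = np.asarray(stretch, float)
    return mu * (lam - lam ** -2)

# Hypothetical shear modulus of 1 MPa, stretches from 1.0 to 2.0
print(neo_hookean_uniaxial([1.0, 1.5, 2.0], mu=1.0e6))
```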

  15. Combining Tools to Design and Develop Software Support for Capabilities

    Directory of Open Access Journals (Sweden)

    Martin Henkel

    2017-04-01

    Full Text Available Analyzing, designing and implementing software systems based on the concept of capabilities has several benefits, such as the ability to design efficient monitoring of capabilities and their execution context. Today, there exist new model-driven methods and development tools that support capability-based analysis, design, and implementation. However, there is also a plethora of existing efficient development tools that are currently in use by organizations. In this article, we examine how a new set of capability-based tools, the Capability Driven Development (CDD) environment, can be combined with model-driven development tools to leverage both novel capability-based functionality and the proven functionality of existing tools. We base the examination on a case study where an existing model-driven tool is combined with the CDD environment.

  16. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry has accumulated more than 15 years of history already. Powered by the advances in the Internet technologies, the current generation of web systems are starting to expand into areas, traditional for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web is compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms, offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of GUI or programmatic access and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable model for information access. The expected future convergence between cheminformatics and bioinformatics databases provides new challenges toward management and analysis of large data sets. The web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  17. Towards a new tool to develop a 3-D shear-wave velocity model from converted waves

    Science.gov (United States)

    Colavitti, Leonardo; Hetényi, György

    2017-04-01

    The main target of this work is to develop a new method in which we exploit converted waves to construct a fully 3-D shear-wave velocity model of the crust. A reliable 3-D model is very important in Earth sciences because geological structures may vary significantly in their lateral dimension. In particular, shear-waves provide valuable complementary information with respect to P-waves because they usually guarantee a much better correlation in terms of rock density and mechanical properties, reducing the interpretation ambiguities. Therefore, it is fundamental to develop a new technique to improve structural images and to describe different lithologies in the crust. In this study we start from the analysis of receiver functions (RF, Langston, 1977), which are nowadays largely used for structural investigations based on passive seismic experiments, to map Earth discontinuities at depth. The RF technique is also commonly used to invert for velocity structure beneath single stations. Here, we plan to combine two strengths of RF method: shear-wave velocity inversion and dense arrays. Starting from a simple 3-D forward model, synthetic RFs are obtained extracting the structure along a ray to match observed data. During the inversion, thanks to a dense stations network, we aim to build and develop a multi-layer crustal model for shear-wave velocity. The initial model should be chosen simple to make sure that the inversion process is not influenced by the constraints in terms of depth and velocity posed at the beginning. The RFs inversion represents a complex problem because the amplitude and the arrival time of different phases depend in a non-linear way on the depth of interfaces and the characteristics of the velocity structure. The solution we envisage to manage the inversion problem is the stochastic Neighbourhood Algorithm (NA, Sambridge, 1999a, b), whose goal is to find an ensemble of models that sample the good data-fitting regions of a multidimensional parameter

  18. Capitalizing on App Development Tools and Technologies

    Science.gov (United States)

    Luterbach, Kenneth J.; Hubbell, Kenneth R.

    2015-01-01

    Instructional developers and others creating apps must choose from a wide variety of app development tools and technologies. Some app development tools have incorporated visual programming features, which enable some drag and drop coding and contextual programming. While those features help novices begin programming with greater ease, questions…

  19. Observation Tools for Professional Development

    Science.gov (United States)

    Malu, Kathleen F.

    2015-01-01

    Professional development of teachers, including English language teachers, empowers them to change in ways that improve teaching and learning (Gall and Acheson 2011; Murray 2010). In their seminal research on staff development--professional development in today's terms--Joyce and Showers (2002) identify key factors that promote teacher change.…

  20. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    Over the last 30 years, numerous model-based software generation approaches have been offered to address problems of development productivity and resulting software quality. The CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction, and it is said that software development using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features: for the more traditional ones, a model editor and a model repository; for the most advanced ones, a code generator (possibly using a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor for defining new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  1. Modeling and Simulation Tools for Heavy Lift Airships

    Science.gov (United States)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed-wing and rotary-wing aircraft, a variety of modeling and simulation tools have been developed to provide designers with the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that differ from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications to the standard design tools are required to fully characterize LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools would mitigate the reliance on the proprietary LTA design tools in use today. A set of well-researched, open-source, high-fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper presents the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available are surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for developing new modeling and analysis capabilities to supplement contemporary tools are also presented.

  2. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been...

  3. Composite Sequence-Structure Stability Models as Screening Tools for Identifying Vulnerable Targets for HIV Drug and Vaccine Development.

    Science.gov (United States)

    Manocheewa, Siriphan; Mittler, John E; Samudrala, Ram; Mullins, James I

    2015-11-04

    Rapid evolution and high sequence diversity enable Human Immunodeficiency Virus (HIV) populations to acquire mutations to escape antiretroviral drugs and host immune responses, and thus are major obstacles for the control of the pandemic. One strategy to overcome this problem is to focus drugs and vaccines on regions of the viral genome in which mutations are likely to cripple function through destabilization of viral proteins. Studies relying on sequence conservation alone have had only limited success in determining critically important regions. We tested the ability of two structure-based computational models to assign sites in the HIV-1 capsid protein (CA) that would be refractory to mutational change. The destabilizing mutations predicted by these models were rarely found in a database of 5811 HIV-1 CA coding sequences, with none being present at a frequency greater than 2%. Furthermore, 90% of variants with the low predicted stability (from a set of 184 CA variants whose replication fitness or infectivity has been studied in vitro) had aberrant capsid structures and reduced viral infectivity. Based on the predicted stability, we identified 45 CA sites prone to destabilizing mutations. More than half of these sites are targets of one or more known CA inhibitors. The CA regions enriched with these sites also overlap with peptides shown to induce cellular immune responses associated with lower viral loads in infected individuals. Lastly, a joint scoring metric that takes into account both sequence conservation and protein structure stability performed better at identifying deleterious mutations than sequence conservation or structure stability information alone. The computational sequence-structure stability approach proposed here might therefore be useful for identifying immutable sites in a protein for experimental validation as potential targets for drug and vaccine development.
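
    A joint metric of the kind described can be sketched in a few lines: per-site sequence conservation and a predicted stability change are each standardized and combined into a single vulnerability score. The weights, toy input values and the sign convention for the stability term are illustrative assumptions, not the scoring scheme used in the study.

      import numpy as np

      def joint_score(conservation, ddg, w_cons=0.5, w_stab=0.5):
          """Combine per-site conservation with predicted destabilization (higher = more vulnerable)."""
          z = lambda x: (np.asarray(x, dtype=float) - np.mean(x)) / np.std(x)   # standardize each signal
          return w_cons * z(conservation) + w_stab * z(ddg)

      # Toy example: 5 capsid sites, consensus-residue frequency and mean predicted ddG (kcal/mol)
      conservation = [0.99, 0.97, 0.80, 0.99, 0.65]
      ddg          = [3.1, 2.4, 0.5, 2.9, 0.2]            # assumed: larger ddG = more destabilizing
      scores = joint_score(conservation, ddg)
      print(np.argsort(scores)[::-1])                      # sites ranked from most to least vulnerable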

  4. Development of monitoring and modelling tools as basis for sustainable thermal management concepts of urban groundwater bodies

    Science.gov (United States)

    Mueller, Matthias H.; Epting, Jannis; Köhler, Mandy; Händel, Falk; Huggenberger, Peter

    2015-04-01

    Increasing groundwater temperatures observed in many urban areas strongly interfere with the demand of thermal groundwater use. The groundwater temperatures in these urban areas are affected by numerous interacting factors: open and closed-loop geothermal systems for heating and cooling, sealed surfaces, constructions in the subsurface (infrastructure and buildings), artificial groundwater recharge, and interaction with rivers. On the one hand, these increasing groundwater temperatures will negatively affect the potential for its use in the future e.g. for cooling purposes. On the other hand, elevated subsurface temperatures can be considered as an energy source for shallow geothermal heating systems. Integrated thermal management concepts are therefore needed to coordinate the thermal use of groundwater in urban areas. These concepts should be based on knowledge of the driving processes which influence the thermal regime of the aquifer. We are currently investigating the processes influencing the groundwater temperature throughout the urban area of Basel City, Switzerland. This involves a three-dimensional numerical groundwater heat-transport model including geothermal use and interactions with the unsaturated zone such as subsurface constructions reaching into the aquifer. The cantonal groundwater monitoring system is an important part of the data base in our model, which will help to develop sustainable management strategies. However, single temperature measurements in conventional groundwater wells can be biased by vertical thermal convection. Therefore, multilevel observation wells are used in the urban areas of the city to monitor subsurface temperatures reaching from the unsaturated zone to the base of the aquifer. These multilevel wells are distributed in a pilot area in order to monitor the subsurface temperatures in the vicinity of deep buildings and to quantify the influence of the geothermal use of groundwater. Based on time series of the conventional

  5. Animal models: an important tool in mycology.

    Science.gov (United States)

    Capilla, Javier; Clemons, Karl V; Stevens, David A

    2007-12-01

    Animal models of fungal infections are, and will remain, a key tool in the advancement of the medical mycology. Many different types of animal models of fungal infection have been developed, with murine models the most frequently used, for studies of pathogenesis, virulence, immunology, diagnosis, and therapy. The ability to control numerous variables in performing the model allows us to mimic human disease states and quantitatively monitor the course of the disease. However, no single model can answer all questions and different animal species or different routes of infection can show somewhat different results. Thus, the choice of which animal model to use must be made carefully, addressing issues of the type of human disease to mimic, the parameters to follow and collection of the appropriate data to answer those questions being asked. This review addresses a variety of uses for animal models in medical mycology. It focuses on the most clinically important diseases affecting humans and cites various examples of the different types of studies that have been performed. Overall, animal models of fungal infection will continue to be valuable tools in addressing questions concerning fungal infections and contribute to our deeper understanding of how these infections occur, progress and can be controlled and eliminated.

  6. Awareness Development Across Perspectives Tool (ADAPT)

    Science.gov (United States)

    2010-10-01

    Individualist and collectivist cultures are described and linked in the generic knowledge base, together with the specific cultural aspects and how they relate to... This paper discusses the development of this Awareness Development Across Perspectives Tool (ADAPT) and the research and development approach taken.

  7. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...

  8. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    Numerous design decisions including architectural decisions are made while developing a software system, which influence the architecture of the system as well as subsequent decisions. Several tools already exist for managing design decisions, i.e. capturing, documenting, and maintaining them......, but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models....... In this paper, we propose an integration of a decision management and a UML-based modeling tool, based on use cases we distill from an example: the UML modeling tool shall show all decisions related to a model and allow extending or updating them; the decision management tool shall trigger the modeling tool...

  9. Development of a flattening filter free multiple source model for use as an independent, Monte Carlo, dose calculation, quality assurance tool for clinical trials.

    Science.gov (United States)

    Faught, Austin M; Davidson, Scott E; Popple, Richard; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core-Houston (IROC-H) Quality Assurance Center (formerly the Radiological Physics Center) has reported varying levels of compliance from their anthropomorphic phantom auditing program. IROC-H studies have suggested that one source of disagreement between institution-submitted calculated doses and measurement is the accuracy of the institution's treatment planning system dose calculations and heterogeneity corrections used. In order to audit this step of the radiation therapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Varian flattening filter free (FFF) 6 MV and FFF 10 MV therapeutic x-ray beams were commissioned based on central axis depth dose data from a 10 × 10 cm² field size and dose profiles for a 40 × 40 cm² field size. The models were validated against open-field measurements in a water tank for field sizes ranging from 3 × 3 cm² to 40 × 40 cm². The models were then benchmarked against IROC-H's anthropomorphic head and neck phantom and lung phantom measurements. Validation results, assessed with a ±2%/2 mm gamma criterion, showed average agreement of 99.9% and 99.0% for central axis depth dose data for FFF 6 MV and FFF 10 MV models, respectively. Dose profile agreement using the same evaluation technique averaged 97.8% and 97.9% for the respective models. Phantom benchmarking comparisons were evaluated with a ±3%/2 mm gamma criterion, and agreement averaged 90.1% and 90.8% for the respective models. Multiple source models for Varian FFF 6 MV and FFF 10 MV beams have been developed, validated, and benchmarked for inclusion in an independent dose calculation quality assurance tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
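
    The ±2%/2 mm and ±3%/2 mm agreement figures above come from the standard gamma-index test. The sketch below shows a minimal one-dimensional, global-normalization gamma calculation in Python; the dose curves, tolerances and pass-rate reporting are illustrative stand-ins for the full 3-D analysis performed in the phantom audits.

      import numpy as np

      def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.02, dist_tol=2.0):
          """1-D gamma index with global dose normalization (dose_tol as a fraction of max reference dose)."""
          dd = dose_tol * d_ref.max()
          gammas = []
          for xr, dr in zip(x_ref, d_ref):
              # capital Gamma for every evaluated point; keep the minimum for this reference point
              g2 = ((x_eval - xr) / dist_tol) ** 2 + ((d_eval - dr) / dd) ** 2
              gammas.append(np.sqrt(g2.min()))
          return np.array(gammas)

      # Toy depth-dose curves on a 1 mm grid (hypothetical numbers)
      x = np.arange(0.0, 100.0, 1.0)                       # depth in mm
      ref = np.exp(-x / 80.0)
      ev  = np.exp(-x / 79.0) * 1.005                      # slightly shifted/scaled "measurement"
      g = gamma_1d(x, ref, x, ev, dose_tol=0.02, dist_tol=2.0)
      print(f"pass rate: {100.0 * np.mean(g <= 1.0):.1f}%")   # fraction of points with gamma <= 1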

  10. Collaboro: a collaborative (meta)modeling tool

    Directory of Open Access Journals (Sweden)

    Javier Luis Cánovas Izquierdo

    2016-10-01

    Software development is becoming more and more collaborative, emphasizing the role of end-users in the development process to make sure the final product will satisfy customer needs. This is especially relevant when developing Domain-Specific Modeling Languages (DSMLs), which are modeling languages specifically designed to carry out the tasks of a particular domain. While end-users are actually the experts of the domain for which a DSML is developed, their participation in the DSML specification process is still rather limited nowadays. In this paper, we propose a more community-aware language development process by enabling the active participation of all community members (both developers and end-users) from the very beginning. Our proposal, called Collaboro, is based on a DSML itself, enabling the representation of change proposals during the language design and the discussion (and trace back) of possible solutions, comments and decisions arising during the collaboration. Collaboro also incorporates a metric-based recommender system to help community members define high-quality notations for the DSMLs. We also show how Collaboro can be used at the model level to facilitate the collaborative specification of software models. Tool support is available both as an Eclipse plug-in and as a web-based solution.

  11. Quality Assurance Project Plan Development Tool

    Science.gov (United States)

    This tool contains information designed to assist in developing a Quality Assurance (QA) Project Plan that meets EPA requirements for projects that involve surface or groundwater monitoring and/or the collection and analysis of water samples.

  12. Developing a Parametric Urban Design Tool

    DEFF Research Database (Denmark)

    Steinø, Nicolai; Obeling, Esben

    2014-01-01

    Parametric urban design is a potentially powerful tool for collaborative urban design processes. Rather than making one-off designs which need to be redesigned from the ground up in case of changes, parametric design tools make it possible to keep the design open while at the same time allowing... for a level of detailing which is high enough to facilitate an understanding of the generic qualities of proposed designs. Starting from a brief overview of parametric design, this paper presents initial findings from the development of a parametric urban design tool with regard to developing a structural... logic which is flexible and expandable. It then moves on to outline and discuss further development work. Finally, it offers a brief reflection on the potentials and shortcomings of the software – CityEngine – which is used for developing the parametric urban design tool....

  13. Development and Psychometric of Assessment Tool of Students' Preventive Behaviors of Cutaneous Leishmaniosis Based on BASNEF Model

    Directory of Open Access Journals (Sweden)

    Musalreza Ghodsi

    2017-08-01

    The results showed that the questionnaire of preventive behaviors of cutaneous leishmaniosis based on the BASNEF model is valid and reliable with 36 items and that, given its sound factor structure and psychometric properties, researchers can use it in related studies.

  14. Developing Tool Support for Problem Diagrams with CPN and VDM++

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe ongoing work on the development of tool support for formal description of domains found in Problem Diagrams. The purpose of the tool is to handle the generation of a CPN model based on a collection of Problem Diagrams. The Problem Diagrams are used for representing the ...

  15. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    Science.gov (United States)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an Engine Condition Monitoring System. Tricia Erhardt and I studied the problem domain for developing an Engine Condition Monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic-algorithm-based search programs, which were written in C++ and used to demonstrate the capability of a GA in searching for an optimal solution in noisy datasets. From the study and discussions with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for a GA-based multi-resolution optimal search.

  16. Development and experimental assessment of a numerical modelling code to aid the design of profile extrusion cooling tools

    Science.gov (United States)

    Carneiro, O. S.; Rajkumar, A.; Fernandes, C.; Ferrás, L. L.; Habla, F.; Nóbrega, J. M.

    2017-10-01

    In the extrusion of thermoplastic profiles, after the forming stage that takes place in the extrusion die, the profile must be cooled in a metallic calibrator. This stage must be performed at a high rate to assure increased productivity, while avoiding the development of high temperature gradients in order to minimize the level of induced thermal residual stresses. In this work, we present a new coupled numerical solver, developed in the framework of the OpenFOAM® computational library, that computes the temperature distribution in both domains (metallic calibrator and plastic profile) simultaneously and whose implementation aimed at minimizing the computational time. The new solver was experimentally assessed with an industrial case study.

  17. Development of a cardiovascular diseases risk prediction model and tools for Chinese patients with type 2 diabetes mellitus: A population-based retrospective cohort study.

    Science.gov (United States)

    Wan, Eric Yuk Fai; Fong, Daniel Yee Tak; Fung, Colman Siu Cheung; Yu, Esther Yee Tak; Chin, Weng Yee; Chan, Anca Ka Chun; Lam, Cindy Lo Kuen

    2018-02-01

    Evidence-based cardiovascular diseases (CVD) risk prediction models and tools specific for Chinese patients with type 2 diabetes mellitus (T2DM) are currently unavailable. This study aimed to develop and validate a CVD risk prediction model for Chinese T2DM patients. A retrospective cohort study was conducted with 137 935 Chinese patients aged 18 to 79 years with T2DM and without prior history of CVD, who had received public primary care services between January 1, 2010 and December 31, 2010. Using the derivation cohort over a median follow-up of 5 years, interaction effects between predictors and age were derived using Cox proportional hazards regression with a forward stepwise approach. Harrell's C statistic and a calibration plot were used on the validation cohort to assess the discrimination and calibration of the models. A web calculator and chart were developed based on the resulting models. For both genders, predictors of higher CVD risk were older age, smoking, longer diabetes duration, usage of anti-hypertensive drugs and insulin, and higher body mass index, haemoglobin A1c (HbA1c), systolic and diastolic blood pressure, total cholesterol to high-density lipoprotein-cholesterol (TC/HDL-C) ratio and urine albumin to creatinine ratio, as well as lower estimated glomerular filtration rate. Interaction factors with age demonstrated a greater weighting of the TC/HDL-C ratio in both younger females and males, and of smoking status and HbA1c in younger males. The developed models, translated into a web calculator and color-coded chart, serve as evidence-based visual aids that help clinicians quickly estimate the 5-year CVD risk for Chinese T2DM patients and guide intervention. © 2017 John Wiley & Sons Ltd.
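
    To illustrate how a fitted Cox proportional hazards model of this kind is turned into a 5-year risk estimate, the sketch below applies the standard survival-model formula risk(5y) = 1 - S0(5)^exp(sum of beta_i * (x_i - mean_i)). All coefficients, covariate means and the baseline survival value are made-up placeholders for illustration, not the published model.

      import math

      # Hypothetical Cox model ingredients (NOT the published coefficients)
      BASELINE_SURV_5Y = 0.95                    # S0(5): baseline 5-year CVD-free survival
      COEFS = {"age": 0.060, "hba1c": 0.150, "sbp": 0.010, "tc_hdl_ratio": 0.120, "smoker": 0.450}
      MEANS = {"age": 62.0, "hba1c": 7.2, "sbp": 135.0, "tc_hdl_ratio": 4.0, "smoker": 0.2}

      def five_year_cvd_risk(patient):
          """5-year risk from a Cox PH model: 1 - S0(5) ** exp(linear predictor centered at cohort means)."""
          lp = sum(COEFS[k] * (patient[k] - MEANS[k]) for k in COEFS)
          return 1.0 - BASELINE_SURV_5Y ** math.exp(lp)

      patient = {"age": 58, "hba1c": 8.1, "sbp": 150, "tc_hdl_ratio": 5.2, "smoker": 1}
      print(f"estimated 5-year CVD risk: {100 * five_year_cvd_risk(patient):.1f}%")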

  18. Development of a tool for modeling snowmobile and snowcoach noise in Yellowstone and Grand Teton National Parks

    Science.gov (United States)

    2010-11-01

    The National Park Service (NPS) develops winter use plans for Yellowstone and Grand Teton National Parks to help manage the use of Over-Snow Vehicles (OSVs), such as snowmobiles and snowcoaches. The use and management of OSVs in the parks is an issue...

  19. A Model for Using a Concept Inventory as a Tool for Students' Assessment and Faculty Professional Development

    Science.gov (United States)

    Marbach-Ad, Gili; McAdams, Katherine C.; Benson, Spencer; Briken, Volker; Cathcart, Laura; Chase, Michael; El-Sayed, Najib M.; Frauwirth, Kenneth; Fredericksen, Brenda; Joseph, Sam W.; Lee, Vincent; McIver, Kevin S.; Mosser, David; Quimby, B. Booth; Shields, Patricia; Song, Wenxia; Stein, Daniel C.; Stewart, Richard; Thompson, Katerina V.; Smith, Ann C.

    2010-01-01

    This essay describes how the use of a concept inventory has enhanced professional development and curriculum reform efforts of a faculty teaching community. The Host Pathogen Interactions (HPI) teaching team is composed of research and teaching faculty with expertise in HPI who share the goal of improving the learning experience of students in…

  20. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db (a database of component, model, and simulation metadata and output); wmt-api (configure and connect components); and wmt-exe (launch simulations on remote execution servers). The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged
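
    Because each WMT server layer exposes a web-service API that exchanges JSON-encoded messages, the interaction pattern can be sketched from any HTTP client. The snippet below is only a schematic illustration of that pattern written with the Python requests library; the base URL, endpoint paths, payload fields and component names are hypothetical placeholders, not the documented WMT API.

      import requests  # third-party HTTP client

      BASE = "https://example.org/wmt-api"       # hypothetical server; not the real WMT endpoint

      def list_components():
          """Fetch JSON metadata describing available model components (illustrative endpoint)."""
          resp = requests.get(f"{BASE}/components", timeout=30)
          resp.raise_for_status()
          return resp.json()

      def submit_model(name, components, parameters):
          """Save a coupled-model description and request a run (illustrative endpoints and fields)."""
          model = {"name": name, "components": components, "parameters": parameters}
          saved = requests.post(f"{BASE}/models", json=model, timeout=30)
          saved.raise_for_status()
          run = requests.post(f"{BASE}/runs", json={"model_id": saved.json()["id"]}, timeout=30)
          run.raise_for_status()
          return run.json()

      # Example call (hypothetical component names):
      # print(submit_model("delta_test", ["hydrotrend", "cem"], {"run_duration": 3650}))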

  1. Energy counts and materials matter in models for sustainable development. Dynamic lifecycle modelling as a tool for design and evalution of long-term environmental strategies

    NARCIS (Netherlands)

    Moll, Henry Coert

    1993-01-01

    In this study I do not adopt one of these perspectives but I develop the perspective of the environmental physiologist instead. The environmental physiologist designs models that simulate the metabolism in the society and between the society and the environment. The four perspectives mentioned

  2. Modeling and Tool Wear in Routing of CFRP

    International Nuclear Information System (INIS)

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.; Lopez de Lacalle, L. N.; Girot, F.

    2011-01-01

    This paper presents the prediction and evaluation of feed force in routing of carbon composite material. In order to extend tool life and improve the quality of the machined surface, a better understanding of uncoated and coated tool behaviors is required. This work describes (1) the optimization of the geometry of multiple-teeth tools minimizing the tool wear and the feed force, (2) the optimization of the tool coating and (3) the development of a phenomenological model relating the feed force to the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple-teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve the tool life while minimizing the axial force, roughness and delamination. A wear model was then developed based on the abrasive behavior of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.

  3. Remote tool development for nuclear dismantling operations

    International Nuclear Information System (INIS)

    Craig, G.; Ferlay, J.C.; Ieracitano, F.

    2003-01-01

    Remote tool systems to undertake nuclear dismantling operations require careful design and development not only to perform their given duty but to perform it safely within the constraints imposed by harsh environmental conditions. Framatome ANP NUCLEAR SERVICES has for a long time developed and qualified equipment to undertake specific maintenance operations of nuclear reactors. The tool development methodology from this activity has since been adapted to resolve some very challenging reactor dismantling operations which are demonstrated in this paper. Each nuclear decommissioning project is a unique case, technical characterisation data is generally incomplete. The development of the dismantling methodology and associated equipment is by and large an iterative process combining design and simulation with feasibility and validation testing. The first stage of the development process involves feasibility testing of industrial tools and examining adaptations necessary to control and deploy the tool remotely with respect to the chosen methodology and environmental constraints. This results in a prototype tool and deployment system to validate the basic process. The second stage involves detailed design which integrates any remaining technical and environmental constraints. At the end of this stage, tools and deployment systems, operators and operating procedures are qualified on full scale mock ups. (authors)

  4. Comparison of two different modelling tools

    DEFF Research Database (Denmark)

    Brix, Wiebke; Elmegaard, Brian

    2009-01-01

    In this paper a test case is solved using two different modelling tools, Engineering Equation Solver (EES) and WinDali, in order to compare the tools. The system of equations solved, is a static model of an evaporator used for refrigeration. The evaporator consists of two parallel channels, and i...

  5. Methodology for Developing a Diesel Exhaust After Treatment Simulation Tool

    DEFF Research Database (Denmark)

    Christiansen, Tine; Jensen, Johanne; Åberg, Andreas

    2018-01-01

    A methodology for the development of catalyst models is presented. Also, a methodology of the implementation of such models into a modular simulation tool, which simulates the units in succession, is presented. A case study is presented illustrating how suitable models can be found and used for s...

  6. Using the IEA ETSAP modelling tools for Denmark

    DEFF Research Database (Denmark)

    Grohnheit, Poul Erik

    -annual workshops focusing on presentations of model analyses and use of the ETSAP' tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project ”NEEDS - New Energy Externalities Developments for Sustainability. ETSAP is contributing to a part of NEEDS that develops......, Environment and Health (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related project, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model...

  7. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  8. Statistical forecast of seasonal discharge in Central Asia using observational records: development of a generic linear modelling tool for operational water resource management

    Science.gov (United States)

    Apel, Heiko; Abdykerimova, Zharkinay; Agalhanova, Marina; Baimaganbetov, Azamat; Gavrilenko, Nadejda; Gerlitz, Lars; Kalashnikova, Olga; Unger-Shayesteh, Katy; Vorogushyn, Sergiy; Gafurov, Abror

    2018-04-01

    The semi-arid regions of Central Asia crucially depend on the water resources supplied by the mountainous areas of the Tien Shan and Pamir and Altai mountains. During the summer months the snow-melt- and glacier-melt-dominated river discharge originating in the mountains provides the main water resource available for agricultural production, but also for storage in reservoirs for energy generation during the winter months. Thus a reliable seasonal forecast of the water resources is crucial for sustainable management and planning of water resources. In fact, seasonal forecasts are mandatory tasks of all national hydro-meteorological services in the region. In order to support the operational seasonal forecast procedures of hydro-meteorological services, this study aims to develop a generic tool for deriving statistical forecast models of seasonal river discharge based solely on observational records. The generic model structure is kept as simple as possible in order to be driven by meteorological and hydrological data readily available at the hydro-meteorological services, and to be applicable for all catchments in the region. As snow melt dominates summer runoff, the main meteorological predictors for the forecast models are monthly values of winter precipitation and temperature, satellite-based snow cover data, and antecedent discharge. This basic predictor set was further extended by multi-monthly means of the individual predictors, as well as composites of the predictors. Forecast models are derived based on these predictors as linear combinations of up to four predictors. A user-selectable number of the best models is extracted automatically by the developed model fitting algorithm, which includes a test for robustness by a leave-one-out cross-validation. Based on the cross-validation the predictive uncertainty was quantified for every prediction model. Forecasts of the mean seasonal discharge of the period April to September are derived every month from
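
    The heart of the fitting algorithm described above, exhaustively testing linear combinations of up to four predictors and screening each candidate with a leave-one-out cross-validation, can be sketched in a few lines of Python. The synthetic predictor matrix and predictor names below are illustrative assumptions, not the operational data used by the hydro-meteorological services.

      import itertools
      import numpy as np

      def fit_loo(X, y):
          """Least-squares fit with intercept plus leave-one-out cross-validated RMSE."""
          A = np.column_stack([np.ones(len(y)), X])
          errs = []
          for i in range(len(y)):
              keep = np.arange(len(y)) != i
              coef, *_ = np.linalg.lstsq(A[keep], y[keep], rcond=None)
              errs.append(y[i] - A[i] @ coef)
          return np.sqrt(np.mean(np.square(errs)))

      def best_models(X, y, names, max_pred=4, n_keep=5):
          """Rank all predictor combinations (up to max_pred predictors) by leave-one-out RMSE."""
          results = []
          for k in range(1, max_pred + 1):
              for cols in itertools.combinations(range(X.shape[1]), k):
                  results.append((fit_loo(X[:, cols], y), [names[c] for c in cols]))
          return sorted(results)[:n_keep]

      # Synthetic example: 20 years of (winter precip, winter temp, snow cover, antecedent discharge)
      rng = np.random.default_rng(1)
      names = ["precip", "temp", "snow", "q_march"]
      X = rng.normal(size=(20, 4))
      y = 2.0 * X[:, 0] + 0.8 * X[:, 2] + rng.normal(scale=0.3, size=20)   # seasonal discharge anomaly
      for rmse, model in best_models(X, y, names):
          print(f"LOO-RMSE {rmse:.2f}: {model}")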

  9. Developing new chemical tools for solvent extraction

    International Nuclear Information System (INIS)

    Moyer, B.A.; Baes, C.F.; Burns, J.H.; Case, G.N.; Sachleben, R.A.; Bryan, S.A.; Lumetta, G.J.; McDowell, W.J.; Sachleben, R.A.

    1993-01-01

    Prospects for innovation and for greater technological impact in the field of solvent extraction (SX) seem as bright as ever, despite the maturation of SX as an economically significant separation method and as an important technique in the laboratory. New industrial, environmental, and analytical problems provide compelling motivation for diversifying the application of SX, developing new solvent systems, and seeking improved properties. Toward this end, basic research must be dedicated to enhancing the tools of SX: physical tools for probing the basis of extraction and molecular tools for developing new SX chemistries. In this paper, the authors describe their progress in developing and applying the general tools of equilibrium analysis and of ion recognition in SX. Nearly half a century after the field of SX began in earnest, coordination chemistry continues to provide the impetus for important advancements in understanding SX systems and in controlling SX chemistry. In particular, the physical tools of equilibrium analysis, X-ray crystallography, and spectroscopy are elucidating the molecular basis of SX in unprecedented detail. Moreover, the principles of ion recognition are providing the molecular tools with which to achieve new selectivities and new applications

  10. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...... provides a description of the wind turbine modelling, both at a component level and at a system level....

  11. International Collaboration Tools for Industrial Development

    CSIR Research Space (South Africa)

    Dan, Nagy

    2017-10-01

    This presentation discusses countries that are ready for Industry 4.0, International Collaboration Tools and Industrial Development, by Dan Nagy at The 6th CSIR Conference: Ideas that work for industrial development, 5-6 October 2017, CSIR...

  12. Information technology tools for curriculum development

    NARCIS (Netherlands)

    McKenney, Susan; Nieveen, N.M.; Strijker, A.; Voogt, Joke; Knezek, Gerald

    2008-01-01

    The widespread introduction and use of computers in the workplace began in the early 1990s. Since then, computer-based tools have been developed to support a myriad of task types, including the complex process of curriculum development. This chapter begins by briefly introducing two concepts that

  13. Probabilistic Model Development

    Science.gov (United States)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) Will not be exceeded at a user-specified confidence level; 2) Will provide reference environments for: a) Peak flux; b) Event-integrated fluence; and c) Mission-integrated fluence. The reference environments will consist of: a) Elemental energy spectra; b) For protons, helium and heavier ions.

  14. Developing mathematical modelling competence

    DEFF Research Database (Denmark)

    Blomhøj, Morten; Jensen, Tomas Højgaard

    2003-01-01

    In this paper we introduce the concept of mathematical modelling competence, by which we mean being able to carry through a whole mathematical modelling process in a certain context. Analysing the structure of this process, six sub-competences are identified. Mathematical modelling competence...... cannot be reduced to these six sub-competences, but they are necessary elements in the development of mathematical modelling competence. Experience from the development of a modelling course is used to illustrate how the different nature of the sub-competences can be used as a tool for finding...... the balance between different kinds of activities in a particular educational setting. Obstacles of social, cognitive and affective nature for the students' development of mathematical modelling competence are reported and discussed in relation to the sub-competences....

  15. Windows Developer Power Tools Turbocharge Windows development with more than 170 free and open source tools

    CERN Document Server

    Avery, James

    2007-01-01

    Software developers need to work harder and harder to bring value to their development process in order to build high quality applications and remain competitive. Developers can accomplish this by improving their productivity, quickly solving problems, and writing better code. A wealth of open source and free software tools are available for developers who want to improve the way they create, build, deploy, and use software. Tools, components, and frameworks exist to help developers at every point in the development process. Windows Developer Power Tools offers an encyclopedic guide to m

  16. Development of a New Spatial and Temporal resizing Tool of Natural and Anthropogenic Emissions for use in WRF/Chem Regional Modeling

    Science.gov (United States)

    Fernandez, Rafael Pedro; Schiavone, Juan Franco; Cremades, Pablo Gabriel; Ruben Santos, Jorge; Lopez Noreña, Ana Isabel; Puliafito, Salvador Enrique

    2017-04-01

    Atmospheric physical and chemical processes can be simulated with different degrees of complexity using global (CAM-Chem) and regional (WRF-Chem) chemical transport models. The proper representation of such processes strongly depends on the quality and temporal resolution of the initial and boundary conditions (IC/BC), as well as on the spatial resolution of the static fields used to represent the land/ocean-atmosphere interaction (e.g., emission sources). This work presents the development of a new spatial and temporal resizing tool for natural and anthropogenic emissions, designed to adapt the global emission inventories used in CAM-Chem to the technical requirements of the regional WRF/Chem model. The new resizing tool, which is based on the anthro_emiss NCAR pre-processor, allows one to (i) spatially interpolate and extrapolate any local or global emissions inventory to a given user-defined WRF/Chem domain (including nested domains), while at the same time (ii) imposing an hourly variation of the surface emission flux based on the superposition of the time-dependent Solar Zenith Angle (SZA) with high-resolution political maps (for anthropogenic sources) or geophysical land/ocean fields (for natural sources). Here we present results for the adaptation of two different emission inventories into a three-nested regional domain located in South America (with 36 x 36, 12 x 12 and 4 x 4 km² spatial resolution, respectively): the global halogenated Very Short-Lived (VSLs) emissions inventory used in CAM-Chem (Ordoñez et al., 2012; with a spatial resolution of 100 x 250 km² and a monthly seasonality); and a local vehicular emissions inventory of GHG for Argentina (Puliafito et al., 2015; which possesses national annual means with a local resolution of 2.5 x 2.5 km²). Different diurnal profiles are analyzed for both natural and anthropogenic sources, assuring an identical total surface flux independently of the spatial resolution and temporal variation imposed on each source
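
    The idea of imposing an hourly variation on a monthly- or annual-mean surface flux via the solar zenith angle can be illustrated with a short sketch: hourly weights proportional to cos(SZA), zero at night, are normalized so that the daily total emission is preserved. The simple solar-geometry approximation and the example coordinates are assumptions for illustration, not the actual implementation of the resizing tool.

      import numpy as np

      def cos_sza(lat_deg, doy, hours_utc, lon_deg=0.0):
          """Approximate cosine of the solar zenith angle (simple declination + hour-angle formula)."""
          lat = np.radians(lat_deg)
          decl = np.radians(23.44) * np.sin(2 * np.pi * (284 + doy) / 365.0)
          hour_angle = np.radians(15.0 * (hours_utc + lon_deg / 15.0 - 12.0))
          mu = np.sin(lat) * np.sin(decl) + np.cos(lat) * np.cos(decl) * np.cos(hour_angle)
          return np.clip(mu, 0.0, None)                  # zero at night

      def hourly_flux(daily_mean_flux, lat_deg, lon_deg, doy):
          """Split a daily-mean emission flux into 24 hourly values weighted by cos(SZA)."""
          hours = np.arange(24)
          w = cos_sza(lat_deg, doy, hours, lon_deg)
          w = w / w.sum() if w.sum() > 0 else np.full(24, 1.0 / 24.0)   # preserve the daily total
          return daily_mean_flux * 24.0 * w              # per-hour fluxes that average to the daily mean

      # Example: a unit daily-mean flux at a hypothetical grid cell (32.9 S, 68.8 W), day-of-year 180
      profile = hourly_flux(daily_mean_flux=1.0, lat_deg=-32.9, lon_deg=-68.8, doy=180)
      print(profile.round(3), profile.mean())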

  17. Participatory data collection and monitoring of agricultural pest dynamics for climate-resilient coffee production using Tiko'n, a generic tool to develop agroecological food web models

    Science.gov (United States)

    Rojas, M.; Malard, J. J.; Adamowski, J. F.; Tuy, H.

    2016-12-01

    Climate variability impacts agricultural processes through many mechanisms. For example, the proliferation of pests and diseases increases with warmer climate and alternated wind patterns, as longer growing seasons allow pest species to complete more reproductive cycles and changes in the weather patterns alter the stages and rates of development of pests and pathogens. Several studies suggest that enhancing plant diversity and complexity in farming systems, such as in agroforestry systems, reduces the vulnerability of farms to extreme climatic events. On the other hand, other authors have argued that vegetation diversity does not necessarily reduce the incidence of pests and diseases, highlighting the importance of understanding how, where and when it is recommendable to diversify vegetation to improve pest and disease control, and emphasising the need for tools to develop, monitor and evaluate agroecosystems. In order to understand how biodiversity can enhance ecosystem services provided by the agroecosystem in the context of climatic variability, it is important to develop comprehensive models that include the role of trophic chains in the regulation of pests, which can be achieved by integrating crop models with pest-predator models, also known as agroecosystem network (AEN) models. Here we present a methodology for the participatory data collection and monitoring necessary for running Tiko'n, an AEN model that can also be coupled to a crop model such as DSSAT. This methodology aims to combine the local and practical knowledge of farmers with the scientific knowledge of entomologists and agronomists, allowing for the simplification of complex ecological networks of plant and insect interactions. This also increases the acceptability, credibility, and comprehension of the model by farmers, allowing them to understand their relationship with the local agroecosystem and their potential to use key agroecosystem principles such as functional diversity to mitigate

  18. New tools for generation IV assemblies modelling

    International Nuclear Information System (INIS)

    Sylvie Aniel-Buchheit; Edwige Richebois

    2005-01-01

    In the framework of the development of generation IV concepts, the need for new assembly modelling tools arises. These concepts present more geometrical and spectral heterogeneities (radially and axially). Moreover, thermal-hydraulics and neutronics aspects are so closely related that coupled computations are necessary. That raises the need for more precise and flexible tools with 3D features. The 3D coupling of the thermal-hydraulic code FLICA4 with the Monte Carlo neutronics code TRIPOLI4 was developed in that frame. This new tool enables, for the first time, realistic axial and radial power profiles to be obtained with real feedback effects in an assembly where thermal-hydraulics and neutronics effects are closely related. The BWR is the existing concept whose heterogeneous characteristics are closest to the various newly proposed concepts. This assembly design is thus chosen to compare this new tool, which presents real 3D characteristics, to the existing ones. For design studies, the evaluation of the assembly behavior currently requires a depletion scheme using a 3D thermal-hydraulics assembly calculation coupled with a 1D axial deterministic neutronics calculation (or an axial power profile chosen as a function of the assembly-averaged burn-up). The 3D neutronics code (CRONOS2) uses neutronic data built by 2D deterministic assembly calculations without feedback. These cross-section libraries enable feedbacks to be taken into account via parameters such as fuel temperature, moderator density and temperature (history parameters such as void and control rod are not useful in design evaluation). Recently, the library build-up has been replaced by on-line multi-2D deterministic assembly calculations performed by a cell code (APOLLO2). That avoids interpolation between pre-determined parameters in the cross-section data used by the 1D axial neutronics calculation and enables a radial power map to be given to the 3D thermal
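
    At its core, the thermal-hydraulics/neutronics coupling discussed above is a fixed-point (Picard) iteration: the neutronics solver's power distribution drives the thermal-hydraulics solver, whose fuel-temperature and moderator-density fields feed back into the neutronics until both converge. The sketch below shows only that generic iteration skeleton with invented single-channel placeholder physics; it is not the FLICA4/TRIPOLI4 or CRONOS2/APOLLO2 scheme itself.

      import numpy as np

      def neutronics(fuel_temp, mod_density):
          """Placeholder neutronics: power rises with moderator density, falls with fuel temperature (Doppler)."""
          shape = 1.0 + 0.8 * (mod_density - 0.45) - 2.0e-4 * (fuel_temp - 900.0)
          shape = np.clip(shape, 0.1, None)
          return shape / shape.mean()                    # normalized axial power shape

      def thermal_hydraulics(power):
          """Placeholder TH: hotter fuel and lower coolant density where local power is high."""
          fuel_temp = 700.0 + 400.0 * power
          mod_density = 0.50 - 0.10 * np.cumsum(power) / len(power)   # coolant heat-up along the channel
          return fuel_temp, mod_density

      def picard_coupling(n_cells=20, tol=1e-6, max_iter=100):
          power = np.ones(n_cells)                       # flat initial guess
          for it in range(max_iter):
              fuel_temp, mod_density = thermal_hydraulics(power)
              new_power = neutronics(fuel_temp, mod_density)
              if np.max(np.abs(new_power - power)) < tol:    # converged feedback loop
                  return new_power, it
              power = 0.5 * power + 0.5 * new_power      # under-relaxation for stability
          return power, max_iter

      profile, iterations = picard_coupling()
      print(iterations, profile.round(3))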

  19. A PSYCHOSOCIAL TOOL FOR CHILD'S DEVELOPMENT Nicholas

    African Journals Online (AJOL)

    the entire life span. Developmental psychology in children includes issues such as the gradual accumulation of knowledge versus stage-like development, or the extent to which children are ... The Practical use of Dance as a Psychological tool for Child's ..... stories to children change their moods and help them to identify.

  20. Research tools | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    Through training materials and guides, we aim to build skills and knowledge to enhance the quality of development research. We also offer free access to our database of funded research projects, known as IDRIS+, and our digital library. Our research tools include: Guide to research databases at IDRC: How to access and ...

  1. Latest Developments in PVD Coatings for Tooling

    Directory of Open Access Journals (Sweden)

    Gabriela Strnad

    2010-06-01

    The paper presents recent developments in the field of PVD coatings for manufacturing tools. Monoblock, multilayer, nanocomposite, DLC and oxynitride coatings are reviewed, with emphasis on coatings which enable manufacturers to implement high-productivity processes such as high-speed cutting and dry machining.

  2. Visualization tool for advanced laser system development

    Science.gov (United States)

    Crockett, Gregg A.; Brunson, Richard L.

    2002-06-01

    Simulation development for Laser Weapon Systems design and system trade analyses has progressed to new levels with the advent of object-oriented software development tools and PC processor capabilities. These tools allow rapid visualization of upcoming laser weapon system architectures and the ability to rapidly respond to what-if scenario questions from potential user commands. These simulations can solve very intensive problems in short time periods to investigate the parameter space of a newly emerging weapon system concept, or can address user mission performance for many different scenario engagements. Equally important to the rapid solution of complex numerical problems is the ability to rapidly visualize the results of the simulation, and to effectively interact with visualized output to glean new insights into the complex interactions of a scenario. Boeing has applied these ideas to develop a tool called the Satellite Visualization and Signature Tool (SVST). This Windows application is based upon a series of C++ coded modules that have evolved from several programs at Boeing-SVS. The SVST structure, extensibility, and some recent results of applying the simulation to weapon system concepts and designs will be discussed in this paper.

  3. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates......, where the experimental effort could be focused. In this project a general modelling framework for systematic model building through modelling templates, which supports the reuse of existing models via its new model import and export capabilities, have been developed. The new feature for model transfer...... has been developed by establishing a connection with an external modelling environment for code generation. The main contribution of this thesis is a creation of modelling templates and their connection with other modelling tools within a modelling framework. The goal was to create a user...

  4. Developing biological resource banks as a supporting tool for wildlife reproduction and conservation: The Iberian lynx bank as a model for other endangered species.

    Science.gov (United States)

    Leon-Quinto, Trinidad; Simon, Miguel A; Cadenas, Rafael; Jones, Jonathan; Martinez-Hernandez, Francisco J; Moreno, Juan M; Vargas, Astrid; Martinez, Fernando; Soria, Bernat

    2009-06-01

    This work presents a Biological Resource Bank generated as a complementary supporting tool for the reproduction and the in situ and ex situ conservation of the Iberian lynx. In its design we prioritized the preservation of a maximum of the current genetic and biological diversity of the population, and the harmless collection of the samples. To provide future reproductive opportunities through any possible technique, we processed and cryopreserved germinal cells and tissues from dead animals, 7 males and 6 females, as well as somatic cells and tissues from 69 different individuals. This somatic cell reserve reflects a very important fraction of the population biodiversity which, furthermore, will allow the development of a wide variety of studies that can be easily extrapolated to the majority of the population. We have developed a new non-destructive method to isolate cells with stem-cell-like properties. If considered convenient in the future, and after proper research, such cells could permit therapeutic applications and perhaps be a good source to be used in somatic cell nuclear transfer. Samples of whole blood and its derivatives, hairs, urine and feces from many different individuals were also preserved. Proper storage of such samples is required to allow epidemiological studies to be performed for the testing of different etiological hypotheses or, in general, to develop any bio-sanitary study to improve conservation strategies within the natural habitat. This work describes the main aspects involved in the practical implementation of the Iberian lynx Biological Resource Bank, as a model that could be useful for the development of similar banks for other endangered species.

  5. Development of bore tools for pipe inspection

    Energy Technology Data Exchange (ETDEWEB)

    Oka, Kiyoshi; Nakahira, Masataka; Taguchi, Kou; Ito, Akira [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-04-01

    In the International Thermonuclear Experimental Reactor (ITER), replacement and maintenance of in-vessel components requires that all connected cooling pipes be cut and removed, that a new component be installed, and that all cooling pipes be rewelded. After welding is completed, the welded area must be inspected for soundness. These tasks require a new work concept for securing the shielded area and for access from narrow ports. Tools had to be developed for nondestructive inspection and leak testing to evaluate pipe welding soundness by accessing areas from inside the pipes using autonomous locomotion, as with the welding and cutting tools. A system was proposed for nondestructive inspection of branch pipes and the main pipe after passing through pipe curves, the same as for the welding and cutting tool development. Nondestructive inspection and leak testing sensors were developed and the basic parameters were obtained. In addition, inspection systems which can move inside pipes and conduct the nondestructive inspection and leak testing were developed. In this paper, an introduction is given to the current situation concerning the development of nondestructive inspection and leak testing machines for the branch pipes. (author)

  6. Towards a generalized energy prediction model for machine tools.

    Science.gov (United States)

    Bhinge, Raunak; Park, Jinkyoo; Law, Kincho H; Dornfeld, David A; Helu, Moneer; Rachuri, Sudarsan

    2017-04-01

    Energy prediction of machine tools can deliver many advantages to a manufacturing enterprise, ranging from energy-efficient process planning to machine tool monitoring. Physics-based, energy prediction models have been proposed in the past to understand the energy usage pattern of a machine tool. However, uncertainties in both the machine and the operating environment make it difficult to predict the energy consumption of the target machine reliably. Taking advantage of the opportunity to collect extensive, contextual, energy-consumption data, we discuss a data-driven approach to develop an energy prediction model of a machine tool in this paper. First, we present a methodology that can efficiently and effectively collect and process data extracted from a machine tool and its sensors. We then present a data-driven model that can be used to predict the energy consumption of the machine tool for machining a generic part. Specifically, we use Gaussian Process (GP) Regression, a non-parametric machine-learning technique, to develop the prediction model. The energy prediction model is then generalized over multiple process parameters and operations. Finally, we apply this generalized model with a method to assess uncertainty intervals to predict the energy consumed to machine any part using a Mori Seiki NVD1500 machine tool. Furthermore, the same model can be used during process planning to optimize the energy-efficiency of a machining process.
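
    A minimal sketch of the data-driven approach described above, using Gaussian Process regression as implemented in scikit-learn. The chosen features (spindle speed, feed rate, depth of cut), the kernel settings and the synthetic training data are assumptions for illustration, not the monitored machine-tool data used by the authors.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      # Synthetic training data: [spindle speed (rpm), feed rate (mm/min), depth of cut (mm)] -> energy (kJ)
      rng = np.random.default_rng(0)
      X = rng.uniform([1000, 100, 0.5], [6000, 1200, 3.0], size=(80, 3))
      y = 0.002 * X[:, 0] + 0.01 * X[:, 1] + 4.0 * X[:, 2] + rng.normal(scale=0.5, size=80)

      kernel = 1.0 * RBF(length_scale=[1000.0, 200.0, 1.0]) + WhiteKernel(noise_level=0.25)
      gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

      # Predict energy with an uncertainty interval for a new set of process parameters
      X_new = np.array([[3500.0, 600.0, 1.5]])
      mean, std = gp.predict(X_new, return_std=True)
      print(f"predicted energy: {mean[0]:.1f} kJ  (95% interval: +/- {1.96 * std[0]:.1f} kJ)")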

  7. Collaborative Inquiry Learning: Models, tools, and challenges

    Science.gov (United States)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  8. A pandemic influenza modeling and visualization tool

    Energy Technology Data Exchange (ETDEWEB)

    Maciejewski, Ross; Livengood, Philip; Rudolph, Stephen; Collins, Timothy F.; Ebert, David S.; Brigantic, Robert T.; Corley, Courtney D.; Muller, George A.; Sanders, Stephen W.

    2011-08-01

    The National Strategy for Pandemic Influenza outlines a plan for community response to a potential pandemic. In this outline, state and local communities are charged with enhancing their preparedness. In order to help public health officials better understand these charges, we have developed a modeling and visualization toolkit (PanViz) for analyzing the effect of decision measures implemented during a simulated pandemic influenza scenario. Spread vectors based on the point of origin and distance traveled over time are calculated, and the factors of age distribution and population density are taken into account. Healthcare officials are able to explore the effects of the pandemic on the population through a spatiotemporal view, moving forward and backward through time and inserting decision points at various days to determine the impact. Linked statistical displays are also shown, providing county-level summaries of data in terms of the number of sick, hospitalized and dead as a result of the outbreak. Currently, this tool has been deployed in Indiana State Department of Health planning and preparedness exercises, and as an educational tool for demonstrating the impact of social distancing strategies during the recent H1N1 (swine flu) outbreak.
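
    For intuition only (this is not the PanViz algorithm), a toy spatial-spread calculation of the kind suggested above might weight each county by distance from the point of origin and by population density; the counties, rates, and functional form below are invented.

      # Toy distance-based spread weight, scaled by population density.
      # Counties, speeds, and rates are hypothetical.
      import math

      counties = [  # (name, distance from origin in km, population density per km2)
          ("Origin County", 0.0, 800.0),
          ("Neighbor A", 40.0, 300.0),
          ("Rural B", 120.0, 25.0),
      ]

      def spread_weight(distance_km, density, day, front_speed_km_per_day=30.0):
          """Crude weight: the outbreak front moves outward at a fixed speed and
          the local impact scales with population density."""
          days_since_arrival = max(0.0, day - distance_km / front_speed_km_per_day)
          return density * (1.0 - math.exp(-0.3 * days_since_arrival))

      for day in (1, 5, 10):
          weights = {name: round(spread_weight(d, rho, day), 1) for name, d, rho in counties}
          print(f"day {day}: {weights}")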

  9. Development of a Model Protein Interaction Pair as a Benchmarking Tool for the Quantitative Analysis of 2-Site Protein-Protein Interactions.

    Science.gov (United States)

    Yamniuk, Aaron P; Newitt, John A; Doyle, Michael L; Arisaka, Fumio; Giannetti, Anthony M; Hensley, Preston; Myszka, David G; Schwarz, Fred P; Thomson, James A; Eisenstein, Edward

    2015-12-01

    A significant challenge in the molecular interaction field is to accurately determine the stoichiometry and stepwise binding affinity constants for macromolecules having >1 binding site. The mission of the Molecular Interactions Research Group (MIRG) of the Association of Biomolecular Resource Facilities (ABRF) is to show how biophysical technologies are used to quantitatively characterize molecular interactions, and to educate the ABRF members and scientific community on the utility and limitations of core technologies [such as biosensor, microcalorimetry, or analytical ultracentrifugation (AUC)]. In the present work, the MIRG has developed a robust model protein interaction pair consisting of a bivalent variant of the Bacillus amyloliquefaciens extracellular RNase barnase and a variant of its natural monovalent intracellular inhibitor protein barstar. It is demonstrated that this system can serve as a benchmarking tool for the quantitative analysis of 2-site protein-protein interactions. The protein interaction pair enables determination of precise binding constants for the barstar protein binding to 2 distinct sites on the bivalent barnase binding partner (termed binase), where the 2 binding sites were engineered to possess affinities that differed by 2 orders of magnitude. Multiple MIRG laboratories characterized the interaction using isothermal titration calorimetry (ITC), AUC, and surface plasmon resonance (SPR) methods to evaluate the feasibility of the system as a benchmarking model. Although general agreement was seen for the binding constants measured using solution-based ITC and AUC approaches, weaker affinity was seen with the surface-based SPR method, with protein immobilization likely affecting the measured affinity. An analysis of the results from multiple MIRG laboratories suggests that the bivalent barnase-barstar system is a suitable model for benchmarking new approaches for the quantitative characterization of complex biomolecular interactions.
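
    The kind of 2-site behaviour the benchmark is designed to probe can be sketched with a simple independent-sites binding model; the dissociation constants below are placeholders chosen roughly two orders of magnitude apart, and the free-ligand concentrations are assumed known rather than solved from a mass balance as a full ITC or SPR fit would require.

      # Fractional occupancy of two independent binding sites (Langmuir isotherms).
      # KD values are illustrative, not the published binase/barstar constants.
      import numpy as np

      KD1, KD2 = 1e-9, 1e-7  # molar; hypothetical high- and low-affinity sites

      def site_occupancy(free_ligand, kd):
          return free_ligand / (kd + free_ligand)

      free = np.logspace(-11, -5, 7)  # assumed free barstar concentrations (M)
      bound_per_binase = site_occupancy(free, KD1) + site_occupancy(free, KD2)
      for conc, n_bound in zip(free, bound_per_binase):
          print(f"[barstar]free = {conc:.1e} M -> sites occupied per binase = {n_bound:.2f}")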

  10. Risk Assessment in Fractured Clayey Tills - Which Modeling Tools?

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia; Bjerg, Poul Løgstrup; Binning, Philip John

    2012-01-01

    The article presents the different tools available for risk assessment in fractured clayey tills and discusses their advantages and limitations. Because of the complex processes occurring during contaminant transport through fractured media, the development of simple practical tools for risk...... assessment is challenging and the inclusion of the relevant processes is difficult. Furthermore, the lack of long-term monitoring data prevents verification of the accuracy of the different conceptual models. Further investigations based on long-term data and numerical modeling are needed to accurately...... describe contaminant transport in fractured media and develop practical tools with the relevant processes and level of complexity....

  11. Development of Computational Tools for Modeling Thermal and Radiation Effects on Grain Boundary Segregation and Precipitation in Fe-Cr-Ni-based Alloys

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Ying [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    This work aims at developing computational tools for modeling thermal and radiation effects on solute segregation at grain boundaries (GBs) and precipitation. This report describes two major efforts. One is the development of computational tools for integrated modeling of thermal equilibrium segregation (TES) and radiation-induced segregation (RIS), in which the synergistic effects of thermal and radiation conditions and of pre-existing GB segregation have been taken into consideration. This integrated modeling was used in describing the Cr and Ni segregation in Fe-Cr-Ni alloys. The other effort is thermodynamic modeling of the Fe-Cr-Ni-Mo system, which includes the major alloying elements of the alloys investigated in the Advanced Radiation Resistant Materials (ARRM) program. Through thermodynamic calculation, we provide the baseline thermodynamic stability of the hardening phase Ni2(Cr,Mo) in selected Ni-based superalloys, and contribute to the mechanistic understanding of the formation of Ni2(Cr,Mo) in irradiated materials. The major outcomes from this work are the following: 1) Under simultaneous thermal and irradiation conditions, radiation-induced segregation plays the dominant role in GB segregation; pre-existing GB segregation only affects the subsequent radiation-induced segregation at short times. For a given element, the segregation tendency of Cr and Ni due to TES is opposite to that from RIS, and the opposing tendencies can lead to the formation of a W-shape profile. These findings are consistent with literature observations of the transitory W-shape profile. 2) While TES only affects a distance of one or two atomic layers from the GB, RIS can affect a much broader region. Therefore, the W-shape due to pre-existing GB segregation is much narrower than that due to the composition gradient formed during the transient state. Considering the measurement resolution of Auger or STEM analysis, the segregation tendency due to RIS should play a dominant

  12. Development of a simulation tool based on a segregated model to optimize the design and the scale up of animal cell culture in fixed-bed bioreactor [abstract

    Directory of Open Access Journals (Sweden)

    Gelbgras, V.

    2010-01-01

    Full Text Available The fixed-bed bioreactor is a promising system for the process intensification of adherent animal cell culture. Nevertheless, the fixed-bed bioreactor presents heterogeneity in the cell and species concentrations, which can complicate its optimization and its scale-up. The aim of this work is to develop a mathematical model of the evolution of the cell concentration and the species concentrations in order to study process optimization and bioreactor scale-up. The developed model is used as a simulation tool to study the influence of different phenomena on the cell heterogeneity. In this work, the importance of the adherent phase is investigated. This phase takes place at the beginning of the process. To realize a good implementation of the process, it is important to control the adherent cell concentration and to minimize the heterogeneity during this phase. If cell concentration heterogeneity appears, it will have repercussions during the whole process. In the model, four cell populations are considered: the viable cells in suspension in the medium, the cells captured by the fixed-bed but still in suspension in the medium, the adherent cells on the fixed-bed, and the dead cells in suspension in the medium. Five extracellular species are considered: glucose, glutamine, oxygen, ammonia and lactate. Five phenomena are modeled: the culture medium flow through the fixed-bed (with axial convection, radial dispersion and axial dispersion), the cell capture by the fixed-bed, the cell adherence on the fixed-bed, the cell growth (with a maximal cell concentration imposed by the specific area of the fixed-bed), and the cell death. The interaction between cells and species is modeled by a Monod equation for the specific growth rate. The model equations are solved with a routine developed in Matlab 6.5, which uses the Finite Volume Method coupled with a Newton-Raphson algorithm. The model parameters are experimentally identified by cell cultures in a pilot
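
    To illustrate the growth kinetics mentioned above, the following minimal sketch couples one adherent cell population to one substrate through a Monod specific growth rate with a carrying capacity set by the carrier surface; all parameter values are invented placeholders, not the experimentally identified ones.

      # Monod growth coupled to substrate consumption (illustrative parameters).
      import numpy as np
      from scipy.integrate import solve_ivp

      MU_MAX = 0.04   # 1/h, maximum specific growth rate
      K_S = 0.5       # g/L, Monod half-saturation constant for glucose
      Y_XS = 2e8      # cells produced per gram of glucose
      X_MAX = 5e9     # cells, maximum adherent cell number set by the fixed-bed area

      def rhs(t, y):
          x, s = y                          # adherent cells, glucose (g/L)
          mu = MU_MAX * s / (K_S + s)       # Monod specific growth rate
          mu *= (1.0 - x / X_MAX)           # saturation of the carrier surface
          return [mu * x, -mu * x / Y_XS]

      sol = solve_ivp(rhs, (0.0, 150.0), [1e8, 4.0])
      print("final cells: %.2e, residual glucose: %.2f g/L" % (sol.y[0, -1], sol.y[1, -1]))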

  13. The european Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

    The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describe passenger as well as freight transport in Europe with all medium and long distance modes (cars, vans, trucks, train, inland

  14. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    , called ForSyDe. ForSyDe is available under an open-source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system level modeling of a simple industrial use case, and we...

  15. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer......’s preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation...

  16. Developing Expert Tools for the LHC

    CERN Document Server

    AUTHOR|(CDS)2160780; Timkó, Helga

    2017-10-12

    This Thesis describes software tools developed for automated, precision setting-up of low-power level radio frequency (LLRF) loops, which will help expert users to have better control and faster setting-up of the radio-frequency (RF) system in the Large Hadron Collider (LHC) experiment. The aim was to completely redesign the software architecture, to add new features, to improve certain algorithms, and to increase the automation.

  17. Development of configuration risk management tool

    International Nuclear Information System (INIS)

    Masuda, Takahiro; Doi, Eiji

    2003-01-01

    Tokyo Electric Power Company (referred to as TEPCO hereinafter), and other Japanese utilities as well, have been trying to improve the capacity factor of their Nuclear Power Plants (NPPs) through modernization of their Operation and Maintenance (O&M) strategy. TEPCO intends to apply risk information to the O&M field while maintaining or even improving both safety and production efficiency. Under these circumstances, TEPCO, together with some other BWR utilities, started to develop a Configuration Risk Management (CRM) tool that can estimate risk in various plant conditions due to configuration changes during an outage. Moreover, we also intend to apply CRM to on-line maintenance (OLM) in the near future. This tool can calculate the Core Damage Frequency (CDF) for a given plant condition, such as SSC availability, decay heat level and coolant inventory, in both the outage state and full-power operation. From a deterministic viewpoint, the tool can also check whether a certain configuration meets the related requirements of the Technical Specifications. A user-friendly interface is one of the important features of this tool, because it enables site engineers with little experience in PSA to quantify and utilize the risk information provided by the tool. (author)

  18. Development of a flexible visualization tool

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M. E-mail: emo@nifs.ac.jp; Shibata, K.; Watanabe, K.; Ohdachi, S.; Ida, K.; Sudo, S

    2002-06-01

    User-friendly visualization tools are indispensable for quick recognition of experimental data. One such tool, the dwscope component of the MDS-Plus system, is widely used to visualize the data that MDS-Plus acquires. However, the National Institute for Fusion Science does not use MDS-Plus, so our researchers on the Large Helical Device (LHD) project cannot use dwscope without modification. Therefore, we developed a new visualization tool, NIFScope. The user interface of NIFScope is based on JavaScope, which is a Java version of dwscope, but NIFScope has its own unique characteristics, including the following: (1) the GUI toolkit is GTK+; (2) Ruby is the equation evaluator; and (3) data loaders are provided as Ruby modules. With these features, NIFScope becomes a multi-purpose and flexible visualization tool. For example, because GTK+ is a multi-platform open source GUI toolkit, NIFScope can run on both MS-Windows and UNIX, and it can be delivered freely. The second characteristic enables users to plot various equations besides experimental data. Furthermore, Ruby is an object-oriented scripting language and is widely used on the Internet, allowing it to serve not only as an equation evaluator but also as an ordinary programming language. This means users can easily add new data loaders for their own data formats.

  19. A tool for model based diagnostics of the AGS Booster

    International Nuclear Information System (INIS)

    Luccio, A.

    1993-01-01

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on a HP-Apollo workstation system, has proved very general and of immediate physical interpretation
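
    The underlying idea can be sketched in a few lines: a suspected kick error propagates to each beam position monitor through the lattice transfer matrices, so its amplitude can be fitted to the measured orbit by least squares. The matrix elements and orbit readings below are invented; in the real tool they come from MAD.

      # Fit a single error kick to orbit data by linear least squares.
      import numpy as np

      # Hypothetical R12 elements (m/rad) from the candidate error location to 4 BPMs
      r12_to_bpm = np.array([3.2, -1.1, 4.7, 0.6])
      measured_orbit = np.array([0.0016, -0.0005, 0.0024, 0.0004])  # metres

      theta, residual, _, _ = np.linalg.lstsq(r12_to_bpm[:, None], measured_orbit, rcond=None)
      print(f"fitted kick: {theta[0] * 1e3:.2f} mrad, residual norm: {residual[0] ** 0.5:.2e} m")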

  20. Static Stiffness Modeling of Parallel Kinematics Machine Tool Joints

    OpenAIRE

    O. K. Akmaev; B. A. Enikeev; A. I. Nigmatullin

    2015-01-01

    The possible variants of an original parallel kinematics machine-tool structure are explored in this article. A new Hooke's universal joint design based on needle roller bearings with the ability of a preload setting is proposed. The bearing stiffness modeling is carried out using a variety of methods. The elastic deformation modeling of a Hooke's joint and a spherical rolling joint has been developed to assess the possibility of using these joints in machine tools with parallel k...

  1. CREST Cost of Renewable Energy Spreadsheet Tool: A Model for Developing Cost-based Incentives in the United States. User Manual Version 1

    Energy Technology Data Exchange (ETDEWEB)

    Gifford, Jason S. [Sustainable Energy Advantage, LLC, Framingham, MA (United States); Grace, Robert C. [Sustainable Energy Advantage, LLC, Framingham, MA (United States)

    2011-03-01

    This user manual helps model users understand how to use the CREST model to support renewable energy incentives, FITs, and other renewable energy rate-setting processes. It reviews the spreadsheet tool, including its layout and conventions, offering context on how and why it was created. It also provides instructions on how to populate the model with inputs that are appropriate for a specific jurisdiction’s policymaking objectives and context. Finally, it describes the results and outlines how these results may inform decisions about long-term renewable energy support programs.

  2. SE Requirements Development Tool User Guide

    Energy Technology Data Exchange (ETDEWEB)

    Benson, Faith Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    The LANL Systems Engineering Requirements Development Tool (SERDT) is a data collection tool created in InfoPath for use with the Los Alamos National Laboratory’s (LANL) SharePoint sites. Projects can fail if the final product requirements are not clearly defined. For projects to be successful, requirements must be defined early in the project, and those requirements must be tracked during execution of the project to ensure the goals of the project are met. Therefore, the focus of this tool is requirements definition. The content of this form is based on International Council on Systems Engineering (INCOSE) and Department of Defense (DoD) process standards and allows for single or collaborative input. The “Scoping” section is where project information is entered by the project team prior to requirements development, and includes definitions and examples to assist the user in completing the forms. The data entered will be used to define the requirements and, once the form is filled out, a “Requirements List” is automatically generated and a Word document is created and saved to a SharePoint document library. SharePoint also includes the ability to download the requirements data defined in the InfoPath form into an Excel spreadsheet. This User Guide will assist you in navigating through the data entry process.

  3. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance...

  4. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives developing tools for clinical information modeling identified, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  5. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  6. Hypermedia as an experiential learning tool: a theoretical model

    OpenAIRE

    Jose Miguel Baptista Nunes; Susan P. Fowell

    1996-01-01

    The process of methodical design and development is of extreme importance in the production of educational software. However, this process will only be effective if it is based on a theoretical model that explicitly defines what educational approach is being used and how specific features of the technology can best support it. This paper proposes a theoretical model of how hypermedia can be used as an experiential learning tool. The development of the model was based on an experiential learni...

  7. Development of IFC based fire safety assesment tools

    DEFF Research Database (Denmark)

    Taciuc, Anca; Karlshøj, Jan; Dederichs, Anne

    2016-01-01

    changes need to be implemented, involving supplementary work and costs with a negative impact on the client. The aim of this project is to create a set of automatic compliance checking rules for prescriptive design and to develop a web application tool for performance-based design that retrieves data from...... Building Information Models (BIM) to evaluate the safety level in the building during the conceptual design stage. The findings show that the developed tools can be useful in the AEC industry. Integrating BIM from the conceptual design stage for analyzing the fire safety level can ensure precision in further...

  8. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily using Fortran 90 code because of its number-crunching power and the capability to directly access real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) Database developed by NIST. A Microsoft Excel front end was implemented to provide convenient portability of PFSAT among a wide variety of potential users, along with a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.

  9. R5 clade C SHIV strains with tier 1 or 2 neutralization sensitivity: tools to dissect env evolution and to develop AIDS vaccines in primate models.

    Directory of Open Access Journals (Sweden)

    Nagadenahalli B Siddappa

    2010-07-01

    Full Text Available HIV-1 clade C (HIV-C) predominates worldwide, and anti-HIV-C vaccines are urgently needed. Neutralizing antibody (nAb) responses are considered important but have proved difficult to elicit. Although some current immunogens elicit antibodies that neutralize highly neutralization-sensitive (tier 1) HIV strains, most circulating HIVs exhibiting a less sensitive (tier 2) phenotype are not neutralized. Thus, both tier 1 and 2 viruses are needed for vaccine discovery in nonhuman primate models. We constructed a tier 1 simian-human immunodeficiency virus, SHIV-1157ipEL, by inserting an "early," recently transmitted HIV-C env into the SHIV-1157ipd3N4 backbone [1] encoding a "late" form of the same env, which had evolved in a SHIV-infected rhesus monkey (RM) with AIDS. SHIV-1157ipEL was rapidly passaged to yield SHIV-1157ipEL-p, which remained exclusively R5-tropic and had a tier 1 phenotype, in contrast to "late" SHIV-1157ipd3N4 (tier 2). After 5 weekly low-dose intrarectal exposures, SHIV-1157ipEL-p systemically infected 16 out of 17 RM with high peak viral RNA loads and depleted gut CD4+ T cells. SHIV-1157ipEL-p and SHIV-1157ipd3N4 env genes diverge mostly in V1/V2. Molecular modeling revealed a possible mechanism for the increased neutralization resistance of SHIV-1157ipd3N4 Env: V2 loops hindering access to the CD4 binding site, shown experimentally with nAb b12. Similar mutations have been linked to decreased neutralization sensitivity in HIV-C strains isolated from humans over time, indicating parallel HIV-C Env evolution in humans and RM. SHIV-1157ipEL-p, the first tier 1 R5 clade C SHIV, and SHIV-1157ipd3N4, its tier 2 counterpart, represent biologically relevant tools for anti-HIV-C vaccine development in primates.

  10. Simulation Tools Model Icing for Aircraft Design

    Science.gov (United States)

    2012-01-01

    the years from strictly a research tool to one used routinely by industry and other government agencies. Glenn contractor William Wright has been the architect of this development, supported by a team of researchers investigating icing physics, creating validation data, and ensuring development according to standard software engineering practices. The program provides a virtual simulation environment for determining where water droplets strike an airfoil in flight, what kind of ice would result, and what shape that ice would take. Users can enter geometries for specific, two-dimensional cross sections of an airfoil or other airframe surface and then apply a range of inputs - different droplet sizes, temperatures, airspeeds, and more - to model how ice would build up on the surface in various conditions. The program's versatility, ease of use, and speed - LEWICE can run through complex icing simulations in only a few minutes - have contributed to it becoming a popular resource in the aviation industry.

  11. Student Model Tools Code Release and Documentation

    DEFF Research Database (Denmark)

    Johnson, Matthew; Bull, Susan; Masci, Drew

    of its strengths and areas of improvement (Section 6). Several key appendices are attached to this report including user manuals for teacher and students (Appendix 3). Fundamentally, all relevant information is included in the report for those wishing to do further development work with the tool...

  12. Graphical tools for model selection in generalized linear models.

    Science.gov (United States)

    Murray, K; Heritier, S; Müller, S

    2013-11-10

    Model selection techniques have existed for many years; however, to date, simple, clear and effective methods of visualising the model building process are sparse. This article describes graphical methods that assist in the selection of models and comparison of many different selection criteria. Specifically, we describe for logistic regression, how to visualize measures of description loss and of model complexity to facilitate the model selection dilemma. We advocate the use of the bootstrap to assess the stability of selected models and to enhance our graphical tools. We demonstrate which variables are important using variable inclusion plots and show that these can be invaluable plots for the model building process. We show with two case studies how these proposed tools are useful to learn more about important variables in the data and how these tools can assist the understanding of the model building process. Copyright © 2013 John Wiley & Sons, Ltd.
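
    In the spirit of the variable inclusion plots described above (though not the authors' exact procedure), the sketch below counts how often each predictor survives an L1-penalized logistic regression fitted to bootstrap resamples; the data set and penalty strength are arbitrary choices.

      # Bootstrap variable-inclusion summary for logistic regression.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      X, y = make_classification(n_samples=300, n_features=8, n_informative=3, random_state=1)

      n_boot, inclusion = 200, np.zeros(X.shape[1])
      for _ in range(n_boot):
          idx = rng.integers(0, len(y), len(y))                    # bootstrap resample
          model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
          model.fit(X[idx], y[idx])
          inclusion += np.abs(model.coef_[0]) > 1e-8               # variable retained?

      for j, frac in enumerate(inclusion / n_boot):
          print(f"x{j}: included in {frac:.0%} of bootstrap fits")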

  13. Development of a biogas planning tool for project owners

    DEFF Research Database (Denmark)

    Fredenslund, Anders Michael; Kjær, Tyge

    A spreadsheet model was developed, which can be used as a tool in the initial phases of planning a centralized biogas plant in Denmark. The model assesses energy production, total plant costs, operational costs and revenues, and the effect on greenhouse gas emissions. Two energy utilization alternatives...... are considered: Combined heat and power and natural gas grid injection. The main input to the model is the amount and types of substrates available for anaerobic digestion. By substituting the model's default values with more project-specific information, the model can be used in a biogas project's later phases...

  14. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user or developer to identify the relevant processing field at the top of the sequence and to send to the computing module only the data related to the requested result; the remaining data is not relevant and would only slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and cost. To do so, the processing sequence must be reviewed and several modeling tools added. Existing processing models do not take this aspect into consideration and focus on raw calculation performance, which increases computing time and cost. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  15. Selected techniques in radioecology: Model development and comparison for internal dosimetry of rainbow trout (Oncorhynchus mykiss) and feasibility assessment of reflectance spectroscopy use as a tool in phytoremediation

    Science.gov (United States)

    Martinez, Nicole

    The first study in Part 1 examines the effects of lake trophic structure on the uptake of iodine-131 (131I) in rainbow trout (Oncorhynchus mykiss) and considers a simple computational model for the estimation of resulting radiation dose. Iodine-131 is a major component of the atmospheric releases following reactor accidents, and the passage of 131I through food chains from grass to human thyroids has been extensively studied. By comparison, the fate and effects of 131I deposition onto lakes and other aquatic systems have been less studied. In this study we reanalyze 1960s data from experimental releases of 131I into two small lakes and compare the effects of differences in lake trophic structures on 131I accumulation in fish. The largest concentrations in the thyroids of trout may occur from 8 to 32 days post initial release. DCFs for trout for whole body as well as thyroid were computed using Monte Carlo modeling with an anatomically-appropriate model of trout thyroid structure. Activity concentration data were used in conjunction with the calculated DCFs to estimate dose rates and ultimately determine cumulative radiation dose (Gy) to the thyroids after 32 days. The estimated cumulative thyroid doses at 32 days post-release ranged from 6 mGy to 18 mGy per 1 Bq mL-1 of initial 131I in the water, depending upon fish size. The subsequent studies in Part 1 seek to develop and compare different, increasingly detailed anatomical phantoms for O. mykiss for the purpose of estimating organ radiation dose and dose rates from 131I uptake and from molybdenum-99 (99Mo) uptake. Model comparison and refinement is important to the process of determining both dose rates and dose effects, and we develop and compare three models for O. mykiss: a simplistic geometry considering a single organ, a more specific geometry employing anatomically relevant organ size and location, and voxel reconstruction of internal anatomy obtained from CT imaging (referred to as CSUTROUT). Dose Conversion
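
    The dose arithmetic behind such estimates can be sketched as the time integral of the thyroid activity multiplied by a dose conversion factor; the DCF, peak activity, and uptake curve below are placeholder values, not the study's Monte Carlo results.

      # Cumulative thyroid dose from an activity time series and a DCF (all values hypothetical).
      import numpy as np

      DCF = 2.0e-12            # Gy per (Bq s) in the thyroid -- assumed value
      T_HALF = 8.02 * 86400.0  # 131I physical half-life, seconds
      LAM = np.log(2.0) / T_HALF

      t = np.linspace(0.0, 32 * 86400.0, 3201)                       # 32 days, in seconds
      A0 = 5.0e3                                                     # Bq at peak uptake -- assumed
      activity = A0 * (1.0 - np.exp(-t / (4 * 86400.0))) * np.exp(-LAM * t)  # uptake x decay

      dose_rate = DCF * activity                                     # Gy/s
      dose = np.sum(0.5 * (dose_rate[1:] + dose_rate[:-1]) * np.diff(t))  # trapezoidal integral
      print(f"cumulative thyroid dose over 32 days: {dose * 1e3:.1f} mGy")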

  16. Development of an Integrated Aeroelastic Multibody Morphing Simulation Tool (Postprint)

    National Research Council Canada - National Science Library

    Reich, Gregory W; Bowman, Jason C; Sanders, Brian; Frank, Geoffrey J

    2007-01-01

    .... Also discussed are current-generation tools for modeling vehicle flight and illustrations of how these tools are as yet too immature for modeling of the flight of an aircraft during morphing of the wings...

  17. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    Full Text Available The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software, from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach for using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures

  18. Development of the SOFIA Image Processing Tool

    Science.gov (United States)

    Adams, Alexander N.

    2011-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a Boeing 747SP carrying a 2.5 meter infrared telescope capable of operating at altitudes between twelve and fourteen kilometers, which is above more than 99 percent of the water vapor in the atmosphere. The ability to make observations above most water vapor, coupled with the ability to make observations from anywhere, anytime, makes SOFIA one of the world's premier infrared observatories. SOFIA uses three visible light CCD imagers to assist in pointing the telescope. The data from these imagers is stored in archive files, as is housekeeping data, which contains information such as boresight and area of interest locations. A tool that could both extract and process data from the archive files was developed.

  19. A tool box for implementing supersymmetric models

    Science.gov (United States)

    Staub, Florian; Ohl, Thorsten; Porod, Werner; Speckner, Christian

    2012-10-01

    We present a framework for performing a comprehensive analysis of a large class of supersymmetric models, including spectrum calculation, dark matter studies and collider phenomenology. To this end, the respective model is defined in an easy and straightforward way using the Mathematica package SARAH. SARAH then generates model files for CalcHep which can be used with micrOMEGAs as well as model files for WHIZARD and O'Mega. In addition, Fortran source code for SPheno is created which facilitates the determination of the particle spectrum using two-loop renormalization group equations and one-loop corrections to the masses. As an additional feature, the generated SPheno code can write out input files suitable for use with HiggsBounds to apply bounds coming from the Higgs searches to the model. Combining all programs provides a closed chain from model building to phenomenology. Program summary Program title: SUSY Phenomenology toolbox. Catalog identifier: AEMN_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMN_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 140206. No. of bytes in distributed program, including test data, etc.: 1319681. Distribution format: tar.gz. Programming language: Autoconf, Mathematica. Computer: PC running Linux, Mac. Operating system: Linux, Mac OS. Classification: 11.6. Nature of problem: Comprehensive studies of supersymmetric models beyond the MSSM are considerably complicated by the number of different tasks that have to be accomplished, including the calculation of the mass spectrum and the implementation of the model into tools for performing collider studies, calculating the dark matter density and checking the compatibility with existing collider bounds (in particular, from the Higgs searches). Solution method: The

  20. Technology development for high temperature logging tools

    Energy Technology Data Exchange (ETDEWEB)

    Veneruso, A.F.; Coquat, J.A.

    1979-01-01

    A set of prototype, high temperature logging tools (temperature, pressure and flow) were tested successfully to temperatures up to 275°C in a Union geothermal well during November 1978 as part of the Geothermal Logging Instrumentation Development Program. This program is being conducted by Sandia Laboratories for the Department of Energy's Division of Geothermal Energy. The progress and plans of this industry based program to develop and apply the high temperature instrumentation technology needed to make reliable geothermal borehole measurements are described. Specifically, this program is upgrading existing sondes for improved high temperature performance, as well as applying new materials (elastomers, polymers, metals and ceramics) and developing component technology such as high temperature cables, cableheads and electronics to make borehole measurements such as formation temperature, flow rate, high resolution pressure and fracture mapping. In order to satisfy critical existing needs, the near term goal is for operation up to 275°C and 7000 psi by the end of FY80. The long term goal is for operation up to 350°C and 20,000 psi by the end of FY84.

  1. The Danube Delta Biosphere Reserve Qualitative Reasoning Model - Education and Decision Support tool for Active Behaviour in Sustainable Development of this area

    Directory of Open Access Journals (Sweden)

    CIOACA Eugenia

    2007-10-01

    Full Text Available This paper presents the three main steps necessary in preparing a new model based on the Qualitative Reasoning (QR) concept for the Danube Delta Biosphere Reserve (DDBR) environmental system. These are: the DDBR system Concept map, the Global causal model, and the Structural model. The DDBR QR model is focused on describing the behaviour of this system in relation to the causes and effects that hamper its sustainable development, especially the behaviour of its aquatic ecosystem as governed by the positive rate of the water pollution process and the negative effects of that process on biodiversity and on the health of people living in or around this area.

  2. RSMASS system model development

    International Nuclear Information System (INIS)

    Marshall, A.C.; Gallup, D.R.

    1998-01-01

    RSMASS system mass models have been used for more than a decade to make rapid estimates of space reactor power system masses. This paper reviews the evolution of the RSMASS models and summarizes present capabilities. RSMASS has evolved from a simple model used to make rough estimates of space reactor and shield masses to a versatile space reactor power system model. RSMASS uses unique reactor and shield models that permit rapid mass optimization calculations for a variety of space reactor power and propulsion systems. The RSMASS-D upgrade of the original model includes algorithms for the balance of the power system, a number of reactor and shield modeling improvements, and an automatic mass optimization scheme. The RSMASS-D suite of codes cover a very broad range of reactor and power conversion system options as well as propulsion and bimodal reactor systems. Reactor choices include in-core and ex-core thermionic reactors, liquid metal cooled reactors, particle bed reactors, and prismatic configuration reactors. Power conversion options include thermoelectric, thermionic, Stirling, Brayton, and Rankine approaches. Program output includes all major component masses and dimensions, efficiencies, and a description of the design parameters for a mass optimized system. In the past, RSMASS has been used as an aid to identify and select promising concepts for space power applications. The RSMASS modeling approach has been demonstrated to be a valuable tool for guiding optimization of the power system design; consequently, the model is useful during system design and development as well as during the selection process. An improved in-core thermionic reactor system model RSMASS-T is now under development. The current development of the RSMASS-T code represents the next evolutionary stage of the RSMASS models. RSMASS-T includes many modeling improvements and is planned to be more user-friendly. RSMASS-T will be released as a fully documented, certified code at the end of

  3. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    Science.gov (United States)

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline the model exchange between tools, which would minimize translation errors and reduce the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers the non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats the entire domain can be covered by complementary standards and, subsequently, by the corresponding tools.

  4. Development and Integration of an Advanced Stirling Convertor Linear Alternator Model for a Tool Simulating Convertor Performance and Creating Phasor Diagrams

    Science.gov (United States)

    Metscher, Jonathan F.; Lewandowski, Edward J.

    2013-01-01

    A simple model of the Advanced Stirling Convertor (ASC) linear alternator and an AC bus controller has been developed and combined with a previously developed thermodynamic model of the convertor for a more complete simulation and analysis of the system performance. The model was developed using Sage, a 1-D thermodynamic modeling program that now includes electro-magnetic components. The convertor, consisting of a free-piston Stirling engine combined with a linear alternator, has sufficiently sinusoidal steady-state behavior to allow for phasor analysis of the forces and voltages acting in the system. A MATLAB graphical user interface (GUI) has been developed to interface with the Sage software for simplified use of the ASC model, calculation of forces, and automated creation of phasor diagrams. The GUI allows the user to vary convertor parameters while fixing different input or output parameters and observe the effect on the phasor diagrams or system performance. The new ASC model and GUI help create a better understanding of the relationship between the electrical component voltages and mechanical forces. This allows better insight into the overall convertor dynamics and performance.
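
    The phasor bookkeeping such a GUI automates can be sketched with complex numbers: at steady state the piston displacement, back-EMF, coil current, and alternator force are all sinusoids represented by complex phasors. The parameter values below are arbitrary placeholders, not ASC data, and the single-loop circuit is a simplification.

      # Steady-state phasors for a free-piston alternator (illustrative values only).
      import numpy as np

      f = 102.0                        # operating frequency, Hz (assumed)
      w = 2.0 * np.pi * f
      X = 4.0e-3 * np.exp(1j * 0.0)    # piston displacement phasor, m

      ALPHA = 60.0                     # transduction constant, N/A (equivalently V s/m)
      R_COIL, L_COIL = 1.5, 0.02       # coil resistance (ohm) and inductance (H)
      V_BUS = 100.0 * np.exp(1j * np.deg2rad(-10.0))  # controller bus voltage phasor

      velocity = 1j * w * X                                   # piston velocity phasor
      emf = ALPHA * velocity                                  # back-EMF phasor
      current = (emf - V_BUS) / (R_COIL + 1j * w * L_COIL)    # coil current phasor
      force = ALPHA * current                                 # alternator force on the piston

      for name, ph in [("velocity", velocity), ("EMF", emf), ("current", current), ("force", force)]:
          print(f"{name}: magnitude {abs(ph):.3g}, phase {np.degrees(np.angle(ph)):.1f} deg")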

  5. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach is becoming more and more popular for the purpose of software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. The result of this analysis reveals the invariant physical structures that we propose to use as the basic abstractions of the physical metamodel. This is a system of geometrical objects, allowing one to build a spatial structure of physical models and to set a distribution of physical properties. To such a geometry of distributed physical properties, different mathematical methods can be applied. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  6. AECL's use of FMEA and OPEX for field service tooling and process development, implementation and improvement: a model for the future

    International Nuclear Information System (INIS)

    Cox, E.; Dam, R.F.; Wilson, E.

    2008-01-01

    Failure Modes and Effects Analysis (FMEA) is a systematic and rigorous process applied to new or complex systems to predict system failures and assist with the development of mitigating strategies. The process is especially beneficial when applied to higher-risk applications such as nuclear systems. FMEA may be used for design verification and maintenance program development. For field service tooling, FMEA is complemented well by operating experience (OPEX) and continuous improvement initiatives. FMEA is generally conducted while developing systems and processes to ensure safe and successful implementation, while OPEX is fed back into the system design and operation to improve those systems and processes for subsequent field applications. This paper will explore these techniques as they have been applied to AECL's CANDUclean system. The portable CANDUclean system is employed to mechanically clean the inside of steam generator (SG) tubes in CANDU nuclear power plants. During normal plant operation, the steam generator tubes in the heat transport system develop a build-up of magnetite on their internal diameter, which decreases heat transfer efficiency, impedes SG maintenance activities and increases the radiation fields in and around the boilers. As part of a regular plant aging management routine, the CANDUclean system is used to remove the magnetite layers. The nature of this work includes risks to personnel safety; however, by continually applying FMEA and other improvement initiatives, safety and system effectiveness are maximized. This paper will provide an overview of the integrated continuous improvement approach applied to the CANDUclean system and consider the value of these strategies when applied to field service tooling and CANDU systems. (author)

  7. Development of tools for optimization of HWC

    International Nuclear Information System (INIS)

    Wikmark, Gunnar; Lundgren, Klas; Wijkstroem, Hjalmar; Pein, Katarina; Ullberg, Mats

    2004-06-01

    An ECP model for the Swedish Boiling Water Reactors (BWRs) was developed in a previous project sponsored by the Swedish Nuclear Power Inspectorate. The present work is an extension of that effort. The model work has been extended in three ways. Some potential problem areas of the ECP sub-model have been treated in full detail. A comprehensive calibration data set has been assembled from plant data and from laboratory and in-plant experiments. The model has been fitted to the calibration data set and the model parameters adjusted. The work on the ECP sub-model has demonstrated that the generalised Butler-Volmer equation, as previously used, adequately describes the electrochemistry. Thus, there is no need to treat the system surface oxides as semiconductors or to take double layer effects into account. The existence of a pseudo potential for the reaction of oxygen on stainless steel is confirmed. The concentration dependence and temperature dependence of the exchange current densities are still unclear. An experimental investigation of these is therefore desirable. An interesting alternative to a conventional experimental set-up is to combine modelling with simpler and more easily controlled experiments. In addition to a calibration data set, the survey of plant data has also led to an improved understanding of the necessary parameters of an ECP model. Thus, variations of the H2 injection rate at constant reactor power level and constant recirculation flow rate were traced to variations of the relative power level of the fuel elements in the core periphery. The power level in the core periphery determines the dose rate in the downcomer and controls the recombination reaction that is fundamental to Hydrogen Water Chemistry (HWC). To accurately model ECP as a function of hydrogen injection rate and other plant parameters, the relative power level of the core periphery is a necessary model parameter that has to be regularly updated from core management codes
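
    As an illustration of what an ECP sub-model of this kind computes (a generic mixed-potential sketch, not the calibrated Swedish BWR model), each redox couple contributes a Butler-Volmer current, here crudely capped by a limiting current, and the ECP is the potential at which the currents sum to zero; all exchange current densities, equilibrium potentials, and limits below are placeholders.

      # Generic mixed-potential (ECP) sketch with Butler-Volmer kinetics; values are placeholders.
      import numpy as np
      from scipy.optimize import brentq

      F, R, T = 96485.0, 8.314, 561.0   # C/mol, J/(mol K), ~288 C coolant temperature in K

      def bv_current(E, E_eq, i0, i_lim, alpha=0.5, n=2):
          """Butler-Volmer current density with a simple mass-transfer limit."""
          eta = E - E_eq
          i_kin = i0 * (np.exp(alpha * n * F * eta / (R * T))
                        - np.exp(-(1.0 - alpha) * n * F * eta / (R * T)))
          return i_kin / (1.0 + abs(i_kin) / i_lim)

      def total_current(E):
          # hypothetical couples: steel dissolution (anodic) and oxidant reduction (cathodic)
          return (bv_current(E, E_eq=-0.50, i0=1e-3, i_lim=1e2)
                  + bv_current(E, E_eq=+0.15, i0=1e-4, i_lim=5e-1))

      ecp = brentq(total_current, -0.8, 0.3)   # potential where the net current vanishes
      print(f"illustrative ECP: {ecp * 1000:.0f} mV(SHE)")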

  8. Development of the Sports Organization Concussion Risk Assessment Tool (SOCRAT).

    Science.gov (United States)

    Yeung, A; Munjal, V; Virji-Babul, N

    2017-01-01

    In this paper, we describe the development of a novel tool, the Sports Organization Concussion Risk Assessment Tool (SOCRAT), to assist sport organizations in assessing the overall risk of concussion at a team level by identifying key risk factors. We first conducted a literature review to identify risk factors of concussion using ice hockey as a model. We then developed an algorithm that combines the severity and the probability of occurrence of concussion for the identified risk factors, adapting a risk assessment tool commonly used in engineering applications. The following risk factors for ice hockey were identified: age, history of previous concussions, previous body checking experience, allowance of body checking, type of helmet worn and the game or practice environment. These risk factors were incorporated into the algorithm, resulting in an individual risk priority number (RPN) for each risk factor and an overall RPN that provides an estimate of the risk in the given circumstances. The SOCRAT can be used to analyse how different risk factors contribute to the overall risk of concussion. The tool may be tailored to organizations to provide: (1) an RPN for each risk factor and (2) an overall RPN that takes into account all the risk factors. Further work is needed to validate the tool based on real data.
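
    A minimal sketch of the scoring idea is shown below: each risk factor gets a severity and an occurrence rating, their product gives the factor's RPN, and the factor RPNs are combined into an overall team-level number. The rating scales, the example values, and the use of a simple sum for the overall RPN are assumptions, not the published SOCRAT algorithm.

      # Per-factor and overall risk priority numbers (illustrative ratings and combination rule).
      from dataclasses import dataclass

      @dataclass
      class RiskFactor:
          name: str
          severity: int    # e.g. 1 (minor) .. 5 (severe)
          occurrence: int  # e.g. 1 (rare) .. 5 (frequent)

          @property
          def rpn(self) -> int:
              return self.severity * self.occurrence

      factors = [
          RiskFactor("age group", 3, 3),
          RiskFactor("history of previous concussions", 5, 2),
          RiskFactor("body checking allowed", 4, 4),
          RiskFactor("helmet type", 2, 3),
      ]

      for f in factors:
          print(f"{f.name}: RPN = {f.rpn}")
      print("overall RPN (sum of factor RPNs):", sum(f.rpn for f in factors))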

  9. Developing security tools of WSN and WBAN networks applications

    CERN Document Server

    A M El-Bendary, Mohsen

    2015-01-01

    This book focuses on two of the most rapidly developing areas in wireless technology (WT) applications, namely, wireless sensors networks (WSNs) and wireless body area networks (WBANs). These networks can be considered smart applications of the recent WT revolutions. The book presents various security tools and scenarios for the proposed enhanced-security of WSNs, which are supplemented with numerous computer simulations. In the computer simulation section, WSN modeling is addressed using MATLAB programming language.

  10. MOOCs as a Professional Development Tool for Librarians

    Directory of Open Access Journals (Sweden)

    Meghan Ecclestone

    2013-11-01

    Full Text Available This article explores how reference and instructional librarians taking over new areas of subject responsibility can develop professional expertise using new eLearning tools called MOOCs. MOOCs – Massive Open Online Courses – are a new online learning model that offers free higher education courses to anyone with an Internet connection and a keen interest to learn. As MOOCs proliferate, librarians have the opportunity to leverage this technology to improve their professional skills.

  11. Integrated landscape/hydrologic modeling tool for semiarid watersheds

    Science.gov (United States)

    Mariano Hernandez; Scott N. Miller

    2000-01-01

    An integrated hydrologic modeling/watershed assessment tool is being developed to aid in determining the susceptibility of semiarid landscapes to natural and human-induced changes across a range of scales. Watershed processes are by definition spatially distributed and are highly variable through time, and this approach is designed to account for their spatial and...

  12. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    Because of human development, water use is increasing in importance, and this worldwide trend is leading to an increasing number of user conflicts, with a strong need for assessment tools to measure the impacts both on the ecosystem and on the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, since impact assessments involve different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured, while impacts on user groups are not included. The paper concentrates on fish habitat simulation models, with methods and examples from Norway. Some ideas on integrated modelling tools for impact assessment studies are included. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored in a multi-disciplinary study. A suitable model choice should be based on available data and possible data acquisition, available manpower, computer, and software resources, and the needed output and accuracy in the output. 58 refs
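
    The core calculation in habitat hydraulic models of this kind can be sketched as a weighted usable area: each computational cell's area is weighted by suitability indices for depth and velocity taken from a hydraulic model. The suitability curves and cell values below are invented placeholders, not calibrated Norwegian data.

      # Weighted usable area from cell areas, depths, velocities, and suitability curves.
      import numpy as np

      def suitability(value, optimum, width):
          """Bell-shaped habitat suitability index in [0, 1] (illustrative form)."""
          return np.exp(-((value - optimum) / width) ** 2)

      # Hypothetical cells from a 2-D hydraulic model: (area m2, depth m, velocity m/s)
      cells = np.array([
          [12.0, 0.35, 0.25],
          [10.0, 0.80, 0.60],
          [15.0, 0.20, 0.10],
          [8.0, 1.20, 0.90],
      ])

      area, depth, velocity = cells.T
      wua = np.sum(area * suitability(depth, 0.4, 0.3) * suitability(velocity, 0.3, 0.25))
      print(f"weighted usable area: {wua:.1f} m2 out of {area.sum():.1f} m2 wetted area")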

  13. Static Stiffness Modeling of Parallel Kinematics Machine Tool Joints

    Directory of Open Access Journals (Sweden)

    O. K. Akmaev

    2015-09-01

    Full Text Available The possible variants of an original parallel kinematics machine-tool structure are explored in this article. A new Hooke's universal joint design based on needle roller bearings with the ability to set a preload is proposed. The bearing stiffness is modelled using a variety of methods. Elastic deformation models of the Hooke's joint and of a spherical rolling joint have been developed to assess the possibility of using these joints in machine tools with parallel kinematics.

  14. European Institutional and Organisational Tools for Maritime Human Resources Development

    OpenAIRE

    Dragomir Cristina

    2012-01-01

    Seafarers need to continuously develop their careers at all stages of their professional life. This paper presents some tools for institutional and organisational career development. At the institutional level, vocational education and training tools provided by European Union institutions are presented, while at the organisational level some tools used by private crewing companies for maritime human resources assessment and development are exemplified.

  15. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Collectively, the practice of managing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.
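
    As a rough illustration of the kind of query a BI layer runs over such a data warehouse, the sketch below aggregates report turnaround times by modality from a toy exam table. The table, column names and metric are hypothetical, not the prototype described in the paper.

    ```python
    # Hypothetical sketch of a BI-style aggregation over a radiology data mart.
    # Table and column names are illustrative, not those of the prototype in the paper.
    import pandas as pd

    # Toy extract of an "exams" fact table as it might sit in a data warehouse.
    exams = pd.DataFrame({
        "modality": ["CT", "CT", "MR", "MR", "XR"],
        "ordered":   pd.to_datetime(["2010-01-04 08:00", "2010-01-04 09:30",
                                     "2010-01-04 10:00", "2010-01-05 11:15",
                                     "2010-01-05 12:00"]),
        "finalized": pd.to_datetime(["2010-01-04 10:30", "2010-01-04 13:00",
                                     "2010-01-04 16:00", "2010-01-05 15:45",
                                     "2010-01-05 12:40"]),
    })

    # Report turnaround time (order to final report) per modality, a typical BI metric.
    exams["turnaround_h"] = (exams["finalized"] - exams["ordered"]).dt.total_seconds() / 3600.0
    print(exams.groupby("modality")["turnaround_h"].agg(["count", "mean", "max"]))
    ```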

  16. A Decision Tool that Combines Discrete Event Software Process Models with System Dynamics Pieces for Software Development Cost Estimation and Analysis

    Science.gov (United States)

    Mizell, Carolyn Barrett; Malone, Linda

    2007-01-01

    The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity and defect injection rates will have substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as human behavior because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics that utilizes continuous simulation. Each has unique strengths and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper will demonstrate how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
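
    A minimal sketch of the coupling idea follows: a continuous, system-dynamics-style state (staff productivity under schedule pressure) is updated each simulated day and feeds a discrete-event view of task completions and rework. All rates and equations are illustrative placeholders, not the model described in the paper.

    ```python
    # Minimal sketch of coupling a continuous "system dynamics" state (staff productivity)
    # with a discrete-event view of task completions. All rates and equations are
    # illustrative placeholders, not the model described in the paper.
    import random

    random.seed(1)

    productivity = 1.0          # tasks per person-day (continuous state)
    backlog = 200.0             # tasks remaining
    staff = 10
    day = 0
    completed_events = []       # discrete events: (day, tasks finished that day)

    while backlog > 0 and day < 365:
        day += 1
        # System-dynamics-style update: pressure builds as the deadline nears,
        # boosting productivity slightly while also raising defect injection.
        schedule_pressure = min(1.0, day / 180.0)
        productivity += 0.01 * schedule_pressure - 0.005 * productivity

        # Discrete-event side: each staff member finishes a stochastic number of tasks.
        finished = sum(1 for _ in range(staff) if random.random() < productivity / 2.0)
        defects = sum(1 for _ in range(finished) if random.random() < 0.1 * (1 + schedule_pressure))
        backlog = backlog - finished + defects   # rework re-enters the backlog
        completed_events.append((day, finished))

    print(f"Project finished on day {day}, last week's throughput: "
          f"{sum(n for d, n in completed_events[-7:])} tasks")
    ```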

  17. Prediction models and development of an easy to use open-access tool for measuring lung function of individuals with motor complete spinal cord injury

    NARCIS (Netherlands)

    Mueller, Gabi; de Groot, Sonja; van der Woude, Lucas H.; Perret, Claudio; Michel, Franz; Hopman, Maria T. E.

    Objective: To develop statistical models to predict lung function and respiratory muscle strength from personal and lesion characteristics of individuals with motor complete spinal cord injury. Design: Cross-sectional, multi-centre cohort study. Subjects: A total of 440 individuals with traumatic,

  18. DIDACTIC TOOLS FOR THE STUDENTS’ ALGORITHMIC THINKING DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    T. P. Pushkaryeva

    2017-01-01

    Full Text Available Introduction. Modern engineers must possess a high potential of cognitive abilities, in particular algorithmic thinking (AT). In this regard, the training of future experts (university graduates of technical specialities) has to provide knowledge of the principles and ways of designing various algorithms, and the ability to analyze them and to choose the optimal variants for implementing engineering activity. For full formation of AT skills it is necessary to consider all channels of psychological perception and cognitive processing of educational information: visual, auditory, and kinesthetic. The aim of the present research is the theoretical basis for the design, development and use of resources for successful development of AT during training in programming. Methodology and research methods. The methodology of the research draws on the basic theses of cognitive psychology and an information approach to organizing the educational process. The research used the following methods: analysis; modeling of cognitive processes; design of training tools that take into account the mentality and peculiarities of information perception; and diagnostics of the efficiency of the didactic tools. Results. A three-level model for training future engineers in programming, aimed at the development of AT skills, was developed. The model includes three components: aesthetic, simulative, and conceptual. Stages of mastering a new discipline are identified. It is proved that, for the development of AT skills in programming training, it is necessary to use kinesthetic tools at the stage of forming mental algorithmic maps, and algorithmic animation and algorithmic mental maps at the stage of forming algorithmic models and conceptual images. Kinesthetic tools for the development of students’ AT skills in algorithmization and programming were designed. The use of kinesthetic training simulators in the educational process provides effective development of an algorithmic style of

  19. Development of a brachytherapy audit checklist tool.

    Science.gov (United States)

    Prisciandaro, Joann; Hadley, Scott; Jolly, Shruti; Lee, Choonik; Roberson, Peter; Roberts, Donald; Ritter, Timothy

    2015-01-01

    The aim was to develop a brachytherapy audit checklist that could be used to prepare for Nuclear Regulatory Commission or agreement state inspections, to aid in readiness for a practice accreditation visit, or to serve as an annual internal audit tool. Six board-certified medical physicists and one radiation oncologist conducted a thorough review of brachytherapy-related literature and practice guidelines published by professional organizations and federal regulations. The team members worked at two facilities that are part of a large, academic health care center. Checklist items were given a score based on their judged importance. Four clinical sites performed an audit of their program using the checklist. The sites were asked to score each item based on a defined severity scale for their noncompliance, and final audit scores were tallied by summing the products of importance score and severity score for each item. The final audit checklist, which is available online, contains 83 items. The audit scores from the beta sites ranged from 17 to 71 (out of 690) and identified a total of 7-16 noncompliance items. The total time to conduct the audit ranged from 1.5 to 5 hours. A comprehensive audit checklist was developed that can be implemented by any facility wishing to perform a program audit in support of its own brachytherapy program. The checklist is designed to allow users to identify areas of noncompliance and to prioritize how these items are addressed to minimize deviations from nationally recognized standards. Copyright © 2015 American Brachytherapy Society. All rights reserved.
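
    The scoring scheme described above is simple enough to sketch directly: each checklist item carries an importance weight, an audit assigns a severity of noncompliance, and the audit score is the sum of the products. A minimal illustration follows; the items, weights and severity scale are invented for the example, not taken from the published checklist.

    ```python
    # Sketch of the scoring scheme described in the abstract: each checklist item has an
    # importance weight, an audit assigns a severity of noncompliance, and the audit score
    # is the sum of the products. The items and weights below are made up for illustration.
    checklist = {
        "Written directive signed before treatment": 10,
        "Source inventory reconciled quarterly":      8,
        "Independent check of treatment plan":        9,
    }

    severity = {   # 0 = compliant .. 3 = major deviation (illustrative scale)
        "Written directive signed before treatment": 0,
        "Source inventory reconciled quarterly":     2,
        "Independent check of treatment plan":       1,
    }

    audit_score = sum(importance * severity[item] for item, importance in checklist.items())
    noncompliant = [item for item, s in severity.items() if s > 0]
    print(f"Audit score: {audit_score}, noncompliant items: {len(noncompliant)}")
    ```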

  20. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  1. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  2. Neural Networks for Hydrological Modeling Tool for Operational Purposes

    Science.gov (United States)

    Bhatt, Divya; Jain, Ashu

    2010-05-01

    Hydrological models are useful in many water resources applications such as flood control, irrigation and drainage, hydropower generation, water supply, and erosion and sediment control. Estimates of runoff are needed in many water resources planning, design, development, operation and maintenance activities. Runoff is generally computed using rainfall-runoff models. Computer-based hydrologic models have become popular for obtaining hydrological forecasts and for managing water systems. The Rainfall-Runoff Library (RRL) is computer software developed by the Cooperative Research Centre for Catchment Hydrology (CRCCH), Australia, consisting of five different conceptual rainfall-runoff models, and it has been used in many water resources applications in Australia. Recently, soft artificial intelligence tools such as Artificial Neural Networks (ANNs) have become popular for research purposes but have not been adopted in operational hydrological forecasting. There is a strong need to develop ANN models based on real catchment data and compare them with the conceptual models actually in use in real catchments. In this paper, the results of an investigation on the use of the RRL and ANNs are presented. Out of the five conceptual models in the RRL toolkit, the SimHyd model has been used. A Genetic Algorithm (GA) has been used as the optimizer in the RRL to calibrate the SimHyd model, and trial-and-error procedures were employed to arrive at the best values of the various parameters involved in the GA optimizer. The results obtained from the best configuration of the SimHyd model are presented here. A feed-forward neural network structure trained by the back-propagation algorithm has been adopted to develop the ANN models. Daily rainfall and runoff data from the Bird Creek Basin, Oklahoma, USA, have been employed to develop all the models included here. A wide range of error statistics has been used to evaluate the performance of all the models
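
    As a rough illustration of the ANN approach described above, the sketch below trains a one-hidden-layer feed-forward network by backpropagation on synthetic rainfall-runoff pairs. It is a minimal sketch only: the network size, the synthetic data and the training settings are assumptions, and it does not reproduce the RRL/SimHyd setup or the Bird Creek data used in the paper.

    ```python
    # Minimal one-hidden-layer feed-forward network trained by backpropagation on
    # synthetic rainfall-runoff data. Illustrative only: not the RRL/SimHyd setup or
    # the Bird Creek data used in the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic daily rainfall (mm) and a toy "runoff" response with memory.
    rain = rng.gamma(shape=0.5, scale=10.0, size=500)
    runoff = 0.3 * rain + 0.2 * np.roll(rain, 1) + 0.1 * np.roll(rain, 2)

    # Inputs: rainfall on days t, t-1, t-2; target: runoff on day t.
    X = np.column_stack([rain, np.roll(rain, 1), np.roll(rain, 2)])[2:]
    y = runoff[2:].reshape(-1, 1)
    X = (X - X.mean(0)) / X.std(0)          # standardize inputs
    y_scale = y.max()
    y = y / y_scale

    W1 = rng.normal(0, 0.1, (3, 8)); b1 = np.zeros(8)
    W2 = rng.normal(0, 0.1, (8, 1)); b2 = np.zeros(1)
    lr = 0.01

    for epoch in range(2000):
        h = np.tanh(X @ W1 + b1)            # hidden layer
        pred = h @ W2 + b2                  # linear output
        err = pred - y
        # Backpropagation of the mean-squared-error gradient.
        dW2 = h.T @ err / len(X); db2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h**2)
        dW1 = X.T @ dh / len(X); db1 = dh.mean(0)
        W2 -= lr * dW2; b2 -= lr * db2; W1 -= lr * dW1; b1 -= lr * db1

    rmse = float(np.sqrt(np.mean((pred - y) ** 2))) * y_scale
    print(f"Training RMSE: {rmse:.2f} (same units as the synthetic runoff)")
    ```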

  3. MODELING OF ANIMATED SIMULATIONS BY MAXIMA PROGRAM TOOLS

    Directory of Open Access Journals (Sweden)

    Nataliya O. Bugayets

    2015-06-01

    Full Text Available The article deals with methodical features of teaching computer simulation of systems and processes using animation. The importance of visual presentation of educational material, which combines the sensory and cognitive sides of cognition, is noted. The concept of modeling and the process of building models are discussed. Attention is paid to the development of skills that are essential for effective learning of animated simulation by means of visual aids. The tools of the graphical environment of the computer mathematics system Maxima for animated simulation are described. Examples of creating animated visual-aid models and of their use for the development of research skills are presented.

  4. Integrating Wind Profiling Radars and Radiosonde Observations with Model Point Data to Develop a Decision Support Tool to Assess Upper-Level Winds for Space Launch

    Science.gov (United States)

    Bauman, William H., III; Flinn, Clay

    2013-01-01

    On the day of launch, the 45th Weather Squadron (45 WS) Launch Weather Officers (LWOs) monitor the upper-level winds for their launch customers. During launch operations, the payload/launch team sometimes asks the LWOs if they expect the upper-level winds to change during the countdown. The LWOs used numerical weather prediction model point forecasts to provide the information, but did not have the capability to quickly retrieve or adequately display the upper-level observations and compare them directly in the same display to the model point forecasts to help them determine which model performed the best. The LWOs requested the Applied Meteorology Unit (AMU) develop a graphical user interface (GUI) that will plot upper-level wind speed and direction observations from the Cape Canaveral Air Force Station (CCAFS) Automated Meteorological Profiling System (AMPS) rawinsondes with point forecast wind profiles from the National Centers for Environmental Prediction (NCEP) North American Mesoscale (NAM), Rapid Refresh (RAP) and Global Forecast System (GFS) models to assess the performance of these models. The AMU suggested adding observations from the NASA 50 MHz wind profiler and one of the US Air Force 915 MHz wind profilers, both located near the Kennedy Space Center (KSC) Shuttle Landing Facility, to supplement the AMPS observations with more frequent upper-level profiles. Figure 1 shows a map of KSC/CCAFS with the locations of the observation sites and the model point forecasts.

  5. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. The model takes the five dimensions of requirements and three of characteristics from the SERVQUAL method, and the application methodology from the QFD method. The originality of the SQ model consists in computing a global index that reflects the level to which the quality characteristics fulfil the customers’ requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.
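
    To make the idea of a global index concrete, the sketch below combines assumed importance weights for SERVQUAL-style dimensions with assumed accomplishment scores into a single weighted index. The dimensions, weights and scores are illustrative placeholders, not the SQ model's actual worksheet.

    ```python
    # Illustrative computation of a global service-quality index: requirement weights
    # (importance to customers) combined with measured accomplishment levels per quality
    # dimension. Dimensions, weights and scores are made up, not the SQ model's data.
    requirements = {          # SERVQUAL-style dimensions with importance weights (sum to 1)
        "tangibles": 0.15, "reliability": 0.30, "responsiveness": 0.20,
        "assurance": 0.20, "empathy": 0.15,
    }
    accomplishment = {        # perceived accomplishment level per dimension, 0..1
        "tangibles": 0.80, "reliability": 0.65, "responsiveness": 0.70,
        "assurance": 0.85, "empathy": 0.75,
    }

    global_index = sum(w * accomplishment[dim] for dim, w in requirements.items())
    print(f"Global service quality index: {global_index:.2f}")  # 1.0 = all requirements met
    ```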

  6. Development and application of a predictive model of Aspergillus candidus growth as a tool to improve shelf life of bakery products.

    Science.gov (United States)

    Huchet, V; Pavan, S; Lochardet, A; Divanac'h, M L; Postollec, F; Thuault, D

    2013-12-01

    Molds are responsible for spoilage of bakery products during storage. A modeling approach to predict the effect of water activity (aw) and temperature on the appearance time of Aspergillus candidus was developed and validated on cakes. The gamma concept of Zwietering was adapted to model fungal growth, taking into account the impact of temperature and aw. We hypothesized that the same model could be used to calculate the time for mycelium to become visible (tv), by substituting the matrix parameter by tv. Cardinal values of A. candidus were determined on potato dextrose agar, and predicted tv were further validated by challenge-tests run on 51 pastries. Taking into account the aw dynamics recorded in pastries during reasonable conditions of storage, high correlation was shown between predicted and observed tv when the aw at equilibrium (after 14 days of storage) was used for modeling (Af = 1.072, Bf = 0.979). Validation studies on industrial cakes confirmed the experimental results and demonstrated the suitability of the model to predict tv in food as a function of aw and temperature. Copyright © 2013 Elsevier Ltd. All rights reserved.
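
    The gamma concept referred to above multiplies dimensionless terms, one per environmental factor, to scale a reference growth (or visibility) time. A minimal sketch follows, using a Rosso-type cardinal temperature term and a simple water-activity term; the cardinal values and reference time are placeholders, not the measured A. candidus parameters.

    ```python
    # Sketch of the gamma-concept idea: the time for mycelium to become visible (tv) at
    # reference conditions is scaled by dimensionless gamma terms for temperature and aw.
    # Cardinal values and tv_opt below are placeholders, not the measured A. candidus data.

    def gamma_temperature(T, Tmin, Topt, Tmax):
        """Rosso-type cardinal temperature term: 0 at Tmin/Tmax, 1 at Topt."""
        if T <= Tmin or T >= Tmax:
            return 0.0
        num = (T - Tmax) * (T - Tmin) ** 2
        den = (Topt - Tmin) * ((Topt - Tmin) * (T - Topt)
                               - (Topt - Tmax) * (Topt + Tmin - 2 * T))
        return num / den

    def gamma_aw(aw, aw_min):
        """Simple water-activity term: 0 at aw_min, 1 at aw = 1."""
        return max(0.0, (aw - aw_min) / (1.0 - aw_min))

    def time_to_visible(T, aw, tv_opt=5.0, Tmin=5.0, Topt=28.0, Tmax=42.0, aw_min=0.75):
        """tv (days) grows as conditions move away from the optimum."""
        g = gamma_temperature(T, Tmin, Topt, Tmax) * gamma_aw(aw, aw_min)
        return float("inf") if g == 0 else tv_opt / g

    print(f"Predicted tv at 20 C, aw 0.85: {time_to_visible(20.0, 0.85):.1f} days")
    ```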

  7. Greenhouse gases from wastewater treatment - A review of modelling tools.

    Science.gov (United States)

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state of the art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge of the processes related to N2O formation, especially due to autotrophic biomass, is still incomplete. The review also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a broad view of WWTPs has to be taken in order to make them as sustainable as possible. Mechanistic dynamic models were demonstrated to be the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs due to the considerable difficulties related to data availability and model complexity. For further improvement of GHG plant-wide modelling, and to favour its use at large real scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition, must be enhanced. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background The profile Hidden Markov Model (HMM) is a powerful statistical model for representing a family of DNA, RNA, or protein sequences. Profile HMMs have been widely used in bioinformatics research, for example in sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMMs are publicly available. Results We developed a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save the HMM and its parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both the HMMEditor software and web service are freely available.
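
    For readers unfamiliar with the Viterbi path mentioned above, the sketch below decodes the most probable state path through a tiny, generic HMM. The states, probabilities and sequence are toy values, not a real profile HMM, but the dynamic-programming recursion is the same one a tool like HMMEditor would visualize.

    ```python
    # Toy Viterbi decoding for a tiny HMM, illustrating the computation behind
    # "align a sequence against the HMM and visualize the Viterbi path".
    # States, probabilities and the sequence are illustrative, not a real profile HMM.
    import math

    states = ["M1", "M2"]                      # pretend "match" states
    start = {"M1": 0.6, "M2": 0.4}
    trans = {"M1": {"M1": 0.7, "M2": 0.3}, "M2": {"M1": 0.4, "M2": 0.6}}
    emit = {"M1": {"A": 0.5, "C": 0.2, "G": 0.2, "T": 0.1},
            "M2": {"A": 0.1, "C": 0.4, "G": 0.4, "T": 0.1}}

    def viterbi(seq):
        V = [{s: math.log(start[s]) + math.log(emit[s][seq[0]]) for s in states}]
        back = []
        for x in seq[1:]:
            scores, ptr = {}, {}
            for s in states:
                best_prev = max(states, key=lambda p: V[-1][p] + math.log(trans[p][s]))
                scores[s] = V[-1][best_prev] + math.log(trans[best_prev][s]) + math.log(emit[s][x])
                ptr[s] = best_prev
            V.append(scores); back.append(ptr)
        # Trace back the most probable state path.
        last = max(states, key=lambda s: V[-1][s])
        path = [last]
        for ptr in reversed(back):
            path.append(ptr[path[-1]])
        return list(reversed(path)), V[-1][last]

    path, logp = viterbi("ACGA")
    print("Viterbi path:", path, "log-probability:", round(logp, 3))
    ```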

  9. Hypersonic Control Modeling and Simulation Tool for Lifting Towed Ballutes, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Aerospace Corporation proposes to develop a hypersonic control modeling and simulation tool for hypersonic aeroassist vehicles. Our control and simulation...

  10. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  11. Specialized case tools for the development of the accounting ...

    African Journals Online (AJOL)

    The paper presents an approach to building specialized CASE tools for the development of accounting applications. These tools form an integrated development environment allowing the computer aided development of the different applications in this field. This development environment consists of a formula interpreter, ...

  12. Multidisciplinary Modelling Tools for Power Electronic Circuits

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad

    This thesis presents multidisciplinary modelling techniques in a Design For Reliability (DFR) approach for power electronic circuits. With increasing penetration of renewable energy systems, the demand for reliable power conversion systems is becoming critical. Since a large part of electricity is processed through power electronics, highly efficient, sustainable, reliable and cost-effective power electronic devices are needed. Reliability of a product is defined as the ability to perform within its predefined functions under given conditions in a specific time. Because power electronic devices ... package, e.g. power module, the DFR approach meets trade-offs in electrical, thermal and mechanical design of the device. Today, virtual prototyping of power electronic circuits using advanced simulation tools is becoming attractive due to cost/time saving in building potential designs. With simulations ...

  13. Information and Communication Technologies: A Tool Empowering and Developing the Horizon of the Learner

    Science.gov (United States)

    Debande, Olivier; Ottersten, Eugenia Kazamaki

    2004-01-01

    In this article, we focus on the implementation and development of ICT in the education sector, challenging and developing the traditional learning environment whilst introducing new educational tools including e-learning. The paper investigates ICT as a tool empowering and developing learners’ lifelong learning opportunities. It defines a model of…

  14. Theoretical Modeling of Rock Breakage by Hydraulic and Mechanical Tool

    Directory of Open Access Journals (Sweden)

    Hongxiang Jiang

    2014-01-01

    Full Text Available Rock breakage by coupled mechanical and hydraulic action has been developed over the past several decades, but theoretical study of rock fragmentation by a mechanical tool with water-pressure assistance has been lacking. A theoretical model of rock breakage by a mechanical tool was developed based on rock fracture mechanics and the solution of Boussinesq’s problem; it can explain the process of rock fragmentation as well as predict the peak reaction force. A theoretical model of rock breakage by coupled mechanical and hydraulic action was then developed based on the superposition principle for stress intensity factors at the crack tip; the reaction force of the mechanical tool assisted by hydraulic action can be reduced considerably if a crack of critical length can be produced by mechanical or hydraulic impact. The experimental results indicated that the peak reaction force could be reduced by about 15% with medium water-pressure assistance, and the quick reduction of the reaction force after the peak value decreased the specific energy consumption of rock fragmentation by the mechanical tool. Crack formation by mechanical or hydraulic impact is the prerequisite for improving the effectiveness of combined breakage.
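
    The superposition idea in the abstract, that the stress intensity contributions of the mechanical tool and the water pressure add at the crack tip and the crack grows once their sum reaches the fracture toughness, can be sketched in a few lines. The edge-crack geometry factor and all numerical values below are assumptions for illustration, not the paper's derivation.

    ```python
    # Sketch of the superposition idea from the abstract: stress intensity factors from the
    # mechanical tool and from hydraulic pressure add at the crack tip, and the crack grows
    # once the sum reaches the fracture toughness K_IC. Geometry factor and numbers are
    # illustrative (simple edge-crack approximation), not the paper's derivation.
    import math

    K_IC = 1.5e6          # fracture toughness of the rock, Pa*sqrt(m) (illustrative)
    a = 0.01              # crack length, m
    Y = 1.12              # geometry factor for an edge crack (assumed)

    def K(stress_pa, crack_len_m):
        return Y * stress_pa * math.sqrt(math.pi * crack_len_m)

    sigma_mech = 6.0e6    # stress from the mechanical tool at the crack, Pa
    p_water = 3.0e6       # assisting water pressure acting on the crack faces, Pa

    K_total = K(sigma_mech, a) + K(p_water, a)
    print(f"K_total = {K_total/1e6:.2f} MPa*sqrt(m), "
          f"{'propagates' if K_total >= K_IC else 'stable'} (K_IC = {K_IC/1e6:.1f})")
    ```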

  15. Demonstration of Decision Support Tools for Sustainable Development

    Energy Technology Data Exchange (ETDEWEB)

    Shropshire, David Earl; Jacobson, Jacob Jordan; Berrett, Sharon; Cobb, D. A.; Worhach, P.

    2000-11-01

    The Demonstration of Decision Support Tools for Sustainable Development project integrated the Bechtel/Nexant Industrial Materials Exchange Planner and the Idaho National Engineering and Environmental Laboratory System Dynamics models, demonstrating their capabilities on alternative fuel applications in the Greater Yellowstone-Teton Park system. The combined model, called the Dynamic Industrial Material Exchange, was used on selected test cases in the Greater Yellowstone-Teton Parks region to evaluate the economic, environmental, and social implications of alternative fuel applications and to identify primary and secondary industries. The test cases included looking at compressed natural gas applications in Teton National Park and Jackson, Wyoming, and studying ethanol use in Yellowstone National Park and gateway cities in Montana. With further development, the system could be used to assist decision-makers (local government, planners, vehicle purchasers, and fuel suppliers) in selecting alternative fuels and vehicles and in developing alternative fuel (AF) infrastructures. The system could become a regional AF market assessment tool that helps decision-makers understand the behavior of the AF market and the conditions in which the market would grow. Based on this high-level market assessment, investors and decision-makers would become more knowledgeable about the AF market opportunity before developing detailed plans and preparing financial analyses.

  16. Atmospheric Model Evaluation Tool for meteorological and air quality simulations

    Science.gov (United States)

    The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.

  17. Developments in the Tools and Methodologies of Synthetic Biology

    Science.gov (United States)

    Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul

    2014-01-01

    Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788

  18. Developments in the tools and methodologies of synthetic biology

    Directory of Open Access Journals (Sweden)

    Richard eKelwick

    2014-11-01

    Full Text Available Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices or systems. However, biological systems are generally complex and unpredictable and are therefore intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a ‘body of knowledge’ from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled and its functionality tested. At each stage of the design cycle an expanding repertoire of tools is being developed. In this review we highlight several of these tools in terms of their applications and benefits to the synthetic biology community.

  19. Ceramic cutting tools materials, development and performance

    CERN Document Server

    Whitney, E Dow

    1994-01-01

    Interest in ceramics as a high-speed cutting tool material is based primarily on favorable material properties. As a class of materials, ceramics possess high melting points, excellent hardness and good wear resistance. Unlike most metals, hardness levels in ceramics generally remain high at elevated temperatures, which means that cutting-tip integrity is relatively unaffected at high cutting speeds. Ceramics are also chemically inert against most work metals.

  20. Long range manipulator development and experiments with dismantling tools

    International Nuclear Information System (INIS)

    Mueller, K.

    1993-01-01

    An existing handling system (EMIR) was used as a carrier system for various tools for concrete dismantling and radiation protection monitoring. It combined the advantages of long reach and high payload with highly dexterous kinematics. The system was enhanced mechanically to allow the use of different tools. Tool attachment devices for automatic tool exchange were investigated, as well as interfaces (electric, hydraulic, compressed air, cooling water and signals). The control system was improved with regard to accuracy and sensor data processing, and programmable logic controller functions for tool control were incorporated. A free-field mockup of the EMIR was built that allowed close simulation of dismantling scenarios without radioactive inventory. Aged concrete was provided for the integration tests. The development schedule included the basic concept investigation; the development of tools and sensors; the EMIR hardware enhancement, including a tool exchange; the adaptation of tools and mockup; and the final evaluation of the system during experiments

  1. Using the IEA ETSAP modelling tools for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Grohnheit, Poul Erik

    2008-12-15

    An important part of the cooperation within the IEA (International Energy Agency) is organised through national contributions to 'Implementation Agreements' on energy technology and energy analyses. One of them is ETSAP (Energy Technology Systems Analysis Programme), started in 1976. Denmark has signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, 'Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems', for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project 'NEEDS - New Energy Externalities Developments for Sustainability'. ETSAP is contributing to a part of NEEDS that develops the TIMES model for 29 European countries with assessment of future technologies. An additional project, 'Monitoring and Evaluation of the RES directives: implementation in EU27 and policy recommendations for 2020' (RES2020) under Intelligent Energy Europe, was added, as well as the Danish 'Centre for Energy, Environment and Health' (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report on the development of a model for Denmark, focusing on the tools and features that allow comparison with other countries and, in particular, evaluation of assumptions and results in international models covering Denmark. (au)

  2. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Cem Sarica; Holden Zhang

    2006-05-31

    The development of oil and gas fields in deep waters (5000 ft and more) will become more common in the future. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, whether at topside, seabed or bottom-hole, to know inlet conditions such as flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. Therefore, the development of a new generation of multiphase flow predictive tools is needed. The overall objective of the proposed study is to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). In the current multiphase modeling approach, flow pattern prediction and flow behavior (pressure gradient and phase fractions) prediction are modeled separately. Thus, different models based on different physics are employed, causing inaccuracies and discontinuities. Moreover, oil and water are treated as a pseudo single phase, ignoring the distinct characteristics of both oil and water, and often resulting in inaccurate designs that lead to operational problems. In this study, a new model is being developed through a theoretical and experimental study employing a revolutionary approach. The

  3. Assessment of COTS IR image simulation tools for ATR development

    Science.gov (United States)

    Seidel, Heiko; Stahl, Christoph; Bjerkeli, Frode; Skaaren-Fystro, Paal

    2005-05-01

    Following the tendency of increased use of imaging sensors in military aircraft, future fighter pilots will need onboard artificial intelligence e.g. ATR for aiding them in image interpretation and target designation. The European Aeronautic Defence and Space Company (EADS) in Germany has developed an advanced method for automatic target recognition (ATR) which is based on adaptive neural networks. This ATR method can assist the crew of military aircraft like the Eurofighter in sensor image monitoring and thereby reduce the workload in the cockpit and increase the mission efficiency. The EADS ATR approach can be adapted for imagery of visual, infrared and SAR sensors because of the training-based classifiers of the ATR method. For the optimal adaptation of these classifiers they have to be trained with appropriate and sufficient image data. The training images must show the target objects from different aspect angles, ranges, environmental conditions, etc. Incomplete training sets lead to a degradation of classifier performance. Additionally, ground truth information i.e. scenario conditions like class type and position of targets is necessary for the optimal adaptation of the ATR method. In Summer 2003, EADS started a cooperation with Kongsberg Defence & Aerospace (KDA) from Norway. The EADS/KDA approach is to provide additional image data sets for training-based ATR through IR image simulation. The joint study aims to investigate the benefits of enhancing incomplete training sets for classifier adaptation by simulated synthetic imagery. EADS/KDA identified the requirements of a commercial-off-the-shelf IR simulation tool capable of delivering appropriate synthetic imagery for ATR development. A market study of available IR simulation tools and suppliers was performed. After that the most promising tool was benchmarked according to several criteria e.g. thermal emission model, sensor model, targets model, non-radiometric image features etc., resulting in a

  4. Development of novel tools to measure food neophobia in children

    DEFF Research Database (Denmark)

    Damsbo-Svendsen, Marie; Frøst, Michael Bom; Olsen, Annemarie

    2017-01-01

    The main tool currently used to measure food neophobia (the Food Neophobia Scale, FNS, developed by Pliner & Hobden, 1992) may not remain optimal forever. It was developed around 25 years ago, and the perception and availability of “novel” and “ethnic” foods may have changed in the meantime....... Consequently, there is a need for developing updated tools for measuring food neophobia....

  5. The development of a post occupancy evaluation tool for primary schools: learner comfort assessment tool (LCAT)

    CSIR Research Space (South Africa)

    Motsatsi, L

    2015-12-01

    Full Text Available in order to facilitate teaching and learning. The aim of this study was to develop a Post Occupancy Evaluation (POE) tool to assess learner comfort in relation to indoor environmental quality in the classroom. The development of the POE tool followed a...

  6. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    Science.gov (United States)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.
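
    As a rough illustration of the Roger-type rational function approximation step, the sketch below fits a scalar frequency-response sample with fixed lag roots by linear least squares. The "data" are synthetic; in the actual tool the samples would come from the doublet-lattice generalized forces, and the approximation is matrix-valued.

    ```python
    # Minimal sketch of a Roger-type rational function approximation (RFA): fit
    # Q(ik) ~ A0 + A1*(ik) + A2*(ik)^2 + sum_j Aj*(ik)/(ik + b_j) to sampled generalized
    # aerodynamic force data by linear least squares with fixed lag roots b_j.
    # The "data" below are synthetic; a real case would use doublet-lattice output.
    import numpy as np

    k = np.linspace(0.05, 1.5, 30)             # reduced frequencies
    s = 1j * k
    Q_data = 1.0 + 0.8 * s + 0.3 * s**2 + 0.5 * s / (s + 0.2)   # synthetic scalar GAF

    lags = np.array([0.2, 0.6])                # assumed (fixed) lag roots b_j
    # Build the regression matrix for the unknown real coefficients A0..A2 and the lag terms.
    cols = [np.ones_like(s), s, s**2] + [s / (s + b) for b in lags]
    M = np.column_stack(cols)

    # Solve the complex least-squares problem by stacking real and imaginary parts.
    A_big = np.vstack([M.real, M.imag])
    b_big = np.concatenate([Q_data.real, Q_data.imag])
    coeffs, *_ = np.linalg.lstsq(A_big, b_big, rcond=None)

    Q_fit = M @ coeffs
    print("Fitted coefficients:", np.round(coeffs, 3))
    print("Max fit error:", float(np.max(np.abs(Q_fit - Q_data))))
    ```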

  7. MODERN TOOLS FOR MODELING ACTIVITY IT-COMPANIES

    Directory of Open Access Journals (Sweden)

    Марина Петрівна ЧАЙКОВСЬКА

    2015-05-01

    Full Text Available Increasing competition in the market for web-based applications increases the importance of service quality and of optimizing the processes of interaction with customers. The purpose of the article is to develop recommendations for improving the business processes of IT enterprises in the web application segment based on technological tools for business modeling; to shape requirements for the development of an information system (IS) for customer interaction; and to analyze effective means of implementation and evaluate the economic effects of its introduction. A scheme of the business process for developing and launching a website was built based on the analysis of business process models and “swim lane” models, and requirements for an IS for customer relationship management for a web studio were established. The market of software for creating such an IS was analyzed, and the products corresponding to the requirements were selected. The IS was developed, tested and implemented in the company, and an appraisal of the economic effect was conducted.

  8. Development and implementation of a dynamic TES dispatch control component in a PV-CSP techno-economic performance modelling tool

    Science.gov (United States)

    Hansson, Linus; Guédez, Rafael; Larchet, Kevin; Laumert, Bjorn

    2017-06-01

    The dispatchability offered by thermal energy storage (TES) in concentrated solar power (CSP) and solar hybrid plants based on such technology is the most important difference compared to power generation based only on photovoltaics (PV). This has also been one reason for recent hybridization efforts of the two technologies and the creation of Power Purchase Agreement (PPA) payment schemes that offer higher payment multiples during daily hours of higher (peak or priority) demand. Recent studies involving plant-level thermal energy storage control strategies are, however, largely based on pre-determined approaches, thereby not taking into account the actual dynamics of thermal energy storage operation. In this study, the implementation of a dynamic dispatch strategy, in the form of a TRNSYS controller for hybrid PV-CSP plants, in the power plant modelling tool DYESOPT is presented. The aim was to gauge the benefits of incorporating a day-ahead approach to dispatch control compared to a fully pre-determined approach that determines hourly dispatch only once prior to the annual simulation. By implementing the dynamic strategy, it was found possible to enhance technical and economic performance for CSP-only plants designed for peaking operation and featuring low values of the solar multiple. This was achieved by enhancing dispatch control, primarily by taking the storage level at the beginning of every simulation day into account. The sequential prediction of the TES level could therefore be improved, notably for plants without integrated PV, for which the predicted storage levels deviated less than when PV was present in the design. While also featuring dispatch performance gains, optimal plant configurations for hybrid PV-CSP were found to present a trade-off in economic performance in the form of an increase in break-even electricity price when using the dynamic strategy, which was offset to some extent by a reduction in
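
    A day-ahead dispatch rule of the kind discussed above can be illustrated with a simple hourly loop: given the storage level at the start of the simulation day and a forecast of solar thermal input, discharge is reserved for the high-tariff hours first. The rule, tariff window and plant numbers below are assumptions for illustration, not the TRNSYS/DYESOPT controller.

    ```python
    # Minimal sketch of a day-ahead TES dispatch rule: given the storage level at the start
    # of the simulation day and the forecast thermal input, reserve discharge for the
    # high-tariff (peak) hours first. The rule, tariffs and plant numbers are illustrative,
    # not the TRNSYS/DYESOPT controller described in the paper.
    tes_level = 400.0                         # MWh_th in storage at start of day
    tes_capacity = 1000.0
    pb_thermal_input = 250.0                  # MWh_th consumed per hour at full power-block load
    forecast_charge = [0]*8 + [150]*8 + [0]*8 # forecast hourly thermal input from the solar field
    peak_hours = set(range(17, 22))           # PPA hours with the highest payment multiple

    dispatch = []                             # planned power-block load fraction per hour
    for hour in range(24):
        tes_level = min(tes_capacity, tes_level + forecast_charge[hour])
        if hour in peak_hours and tes_level >= pb_thermal_input:
            load = 1.0                        # run at full load during priority hours
        elif tes_level >= tes_capacity * 0.9:
            load = 0.5                        # avoid dumping energy when storage is nearly full
        else:
            load = 0.0
        tes_level -= load * pb_thermal_input
        dispatch.append(load)

    print("Planned dispatch (load fraction per hour):", dispatch)
    ```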

  9. The role of customized computational tools in product development.

    Energy Technology Data Exchange (ETDEWEB)

    Heinstein, Martin Wilhelm; Kempka, Steven Norman; Tikare, Veena

    2005-06-01

    Model-based computer simulations have revolutionized product development in the last 10 to 15 years. Technologies that have existed for many decades or even centuries have been improved with the aid of computer simulations. Everything from low-tech consumer goods such as detergents, lubricants and light bulb filaments to the most advanced high-tech products such as airplane wings, wireless communication technologies and pharmaceuticals is engineered with the aid of computer simulations today. In this paper, we present a framework for describing computational tools and their application within the context of product engineering. We examine a few cases of product development that integrate numerical computer simulations into the development stage. We will discuss how the simulations were integrated into the development process, what features made the simulations useful, the level of knowledge and experience that was necessary to run meaningful simulations and other details of the process. Based on this discussion, recommendations for the incorporation of simulations and computational tools into product development will be made.

  10. Numerical Model Metrics Tools in Support of Navy Operations

    Science.gov (United States)

    Dykes, J. D.; Fanguy, P.

    2017-12-01

    Increasing demands for accurate ocean forecasts relevant to Navy mission decision makers call for tools that quickly provide pertinent numerical model metrics to forecasters. Growing modelling capabilities, with ever-higher-resolution domains including coupled and ensemble systems, as well as the increasing volume of observations and other data sources against which to compare model output, require more tools that enable the forecaster to do more with less. These data can be handled in a geographic information system (GIS) and fused together to provide useful information and analyses, and ultimately a better understanding of how the pertinent model performs with respect to ground truth. Oceanographic measurements such as surface elevation, profiles of temperature and salinity, and wave height can all be incorporated into a set of layers correlated to geographic information such as bathymetry and topography. In addition, an automated system that runs concurrently with the models on high performance machines matches routinely available observations to modelled values to form a database of matchups from which statistics can be calculated and displayed, to facilitate validation of forecast state and derived variables. ArcMAP, developed by the Environmental Systems Research Institute, is a GIS application used by the Naval Research Laboratory (NRL) and naval operational meteorological and oceanographic centers to analyse the environment in support of a range of Navy missions. For example, acoustic propagation in the ocean is described with a three-dimensional analysis of sound speed that depends on profiles of temperature, pressure and salinity predicted by the Navy Coastal Ocean Model. The data and model output must include geo-referencing information suitable for accurately placing the data within the ArcMAP framework. NRL has developed tools that facilitate merging these geophysical data and their analyses, including intercomparisons between model
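
    The matchup database mentioned above boils down to pairing each observation with a co-located model value and computing validation statistics. The sketch below does this for synthetic sea-surface temperature data using a nearest-in-time pairing only; a real system would also match on position and depth.

    ```python
    # Minimal sketch of the model-observation "matchup" idea: pair each observation with the
    # nearest-in-time model value, then compute validation statistics (bias, RMSE).
    # The data are synthetic; a real system would also match on location and depth.
    import numpy as np

    rng = np.random.default_rng(42)

    model_times = np.arange(0, 48, 3.0)                       # model output every 3 h
    model_sst = 20 + 0.5 * np.sin(model_times / 24 * 2 * np.pi)

    obs_times = np.sort(rng.uniform(0, 48, 25))               # irregular observation times
    obs_sst = 20.2 + 0.5 * np.sin(obs_times / 24 * 2 * np.pi) + rng.normal(0, 0.1, 25)

    # Nearest-time matchup.
    idx = np.abs(model_times[:, None] - obs_times[None, :]).argmin(axis=0)
    matched_model = model_sst[idx]

    bias = float(np.mean(matched_model - obs_sst))
    rmse = float(np.sqrt(np.mean((matched_model - obs_sst) ** 2)))
    print(f"{len(obs_sst)} matchups, bias = {bias:.2f} degC, RMSE = {rmse:.2f} degC")
    ```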

  11. Developing A SPOT CRM Debriefing Tool

    Science.gov (United States)

    Martin, Lynne; Villeda, Eric; Orasanu, Judith; Connors, Mary M. (Technical Monitor)

    1998-01-01

    In a study of CRM LOFT briefings published in 1997, Dismukes, McDonnell & Jobe reported that briefings were not being utilized as fully as they could be and that crews may not be getting the full possible benefit from LOFT. On the basis of their findings, they suggested a set of general guidelines for briefings for the industry. Our work builds on this study to provide a specific debriefing tool that gives a focus for the strategies that Dismukes et al. suggest.

  12. Model and code development

    International Nuclear Information System (INIS)

    Anon.

    1977-01-01

    Progress in model and code development for reactor physics calculations is summarized. The codes included CINDER-10, PHROG, RAFFLE GAPP, DCFMR, RELAP/4, PARET, and KENO. Kinetics models for the PBF were developed.

  13. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Tulsa Fluid Flow

    2008-08-31

    The development of fields in deep waters (5000 ft and more) is a common occurrence. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique employed at topside, seabed or bottom-hole to know inlet conditions such as the flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. The overall objective was to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict the flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). The project was conducted in two periods. In Period 1 (four years), gas-oil-water flow in pipes was investigated to understand the fundamental physical mechanisms describing the interaction between the gas, oil and water phases under flowing conditions, and a unified model was developed utilizing a novel modeling approach. A gas-oil-water pipe flow database including field and laboratory data was formed in Period 2 (one year). The database was utilized in model performance demonstration. Period 1 primarily consisted of the development of a unified model and software to predict gas-oil-water flow, and experimental studies of the gas-oil-water project, including flow behavior description and

  14. Thermal behaviour modelling of superplastic forming tools

    OpenAIRE

    Velay , Vincent; Cutard , Thierry; Guegan , N.

    2008-01-01

    High-temperature operating conditions of superplastic forming (SPF) tools induce very complex thermomechanical loadings responsible for their failure. Various materials can be used to manufacture forming tools: ceramics, refractory castables or heat-resistant steels. In this paper, an experimental and numerical analysis is performed in order to characterise the environmental loadings undergone by the tool, whatever the considered material. This investigation allows a thermal calculation to be carried out...

  15. Artificial intelligence tool development and applications to nuclear power

    International Nuclear Information System (INIS)

    Naser, J.A.

    1987-01-01

    Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of expert system technology. The first effort is the development of expert system building tools tailored to electric utility industry applications. The second effort is the development of expert system applications. These two efforts complement each other: application development tests the tools and identifies additional tool capabilities that are required, while tool development helps define the applications that can be successfully developed. Artificial intelligence, as demonstrated by the developments described, is being established as a credible technological tool for the electric utility industry. The challenge in transferring artificial intelligence technology, and an understanding of its potential, to the electric utility industry is to gain an understanding of the problems that reduce power plant performance and to identify which can be successfully addressed using artificial intelligence

  16. Modeling, methodologies and tools for molecular and nano-scale communications modeling, methodologies and tools

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    (Preliminary) The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront of their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  17. Some key issues in the development of ergonomic intervention tools

    DEFF Research Database (Denmark)

    Edwards, Kasper; Winkel, Jørgen

    2016-01-01

    Literature reviews suggest that tools facilitating ergonomic intervention processes should be integrated into rationalization tools, particularly if such tools are participative. Such a tool has recently been developed as an add-in module to the Lean tool “Value Stream Mapping” (VSM). However, in the investigated context this module seems not to have any direct impact on the generation of proposals with ergonomic considerations. Contextual factors of importance seem to be, e.g., the allocation of sufficient resources and whether work environment issues are generally accepted as part of the VSM methodology...

  18. Designing a training tool for imaging mental models

    Science.gov (United States)

    Dede, Christopher J.; Jayaram, Geetha

    1990-01-01

    The training process can be conceptualized as the student acquiring an evolutionary sequence of classification-problem-solving mental models. For example, a physician learns (1) classification systems for patient symptoms, diagnostic procedures, diseases, and therapeutic interventions, and (2) interrelationships among these classifications (e.g., how to use diagnostic procedures to collect data about a patient's symptoms in order to identify the disease so that therapeutic measures can be taken). This project developed functional specifications for a computer-based tool, Mental Link, that allows the evaluative imaging of such mental models. The fundamental design approach underlying this representational medium is traversal of virtual cognition space: typically intangible cognitive entities and the links among them become visible as a three-dimensional web that represents a knowledge structure. The tool has a high degree of flexibility and customizability to allow extension to other types of uses, such as a front-end to an intelligent tutoring system, knowledge base, hypermedia system, or semantic network.

  19. ADAS tools for collisional–radiative modelling of molecules

    Energy Technology Data Exchange (ETDEWEB)

    Guzmán, F., E-mail: francisco.guzman@cea.fr [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom); CEA, IRFM, Saint-Paul-lez-Durance 13108 (France); O’Mullane, M.; Summers, H.P. [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom)

    2013-07-15

    New theoretical and computational tools for molecular collisional–radiative models are presented. An application to the hydrogen molecule system has been made. At the same time, a structured database has been created where fundamental cross sections and rates for individual processes as well as derived data (effective coefficients) are stored. Relative populations for the vibrational states of the ground electronic state of H{sub 2} are presented, and this vibronic resolution model is compared with electronic resolution, where vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify the data assembly. Effective (collisional–radiative) rate coefficients versus temperature and density are presented.
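
    As a rough illustration of the collisional–radiative idea described above (this is only a sketch with invented rate coefficients, not the ADAS codes), level populations can be obtained by assembling a linear rate matrix from electron-collision terms, scaled by electron density, plus radiative decay, and solving for the steady state:

    ```python
    import numpy as np

    # Minimal collisional-radiative population balance (illustrative only, not ADAS):
    # coll_rates[i, j] is an invented rate coefficient (cm^3 s^-1) for collisional
    # transitions j -> i; rad_rates[i, j] is an invented radiative decay rate (s^-1).

    def steady_state_populations(coll_rates, rad_rates, n_e):
        """Solve d n/dt = M n = 0 with the normalisation sum(n) = 1."""
        n_lev = coll_rates.shape[0]
        M = n_e * coll_rates + rad_rates          # off-diagonal: rates INTO level i from j
        M[np.diag_indices(n_lev)] = 0.0
        M -= np.diag(M.sum(axis=0))               # diagonal: total loss rate out of each level
        A = M.copy()
        A[-1, :] = 1.0                            # replace one balance equation by normalisation
        b = np.zeros(n_lev)
        b[-1] = 1.0
        return np.linalg.solve(A, b)

    # Hypothetical 3-level example.
    coll = np.array([[0.0, 1e-9, 1e-10],
                     [5e-9, 0.0, 2e-9],
                     [1e-9, 4e-9, 0.0]])
    rad = np.array([[0.0, 1e7, 1e5],
                    [0.0, 0.0, 1e6],
                    [0.0, 0.0, 0.0]])
    print(steady_state_populations(coll, rad, n_e=1e13))
    ```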

  20. ADAS tools for collisional-radiative modelling of molecules

    Science.gov (United States)

    Guzmán, F.; O'Mullane, M.; Summers, H. P.

    2013-07-01

    New theoretical and computational tools for molecular collisional-radiative models are presented. An application to the hydrogen molecule system has been made. At the same time, a structured database has been created where fundamental cross sections and rates for individual processes as well as derived data (effective coefficients) are stored. Relative populations for the vibrational states of the ground electronic state of H2 are presented, and this vibronic resolution model is compared with electronic resolution, where vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify the data assembly. Effective (collisional-radiative) rate coefficients versus temperature and density are presented.

  1. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case where nothing is assumed about the components comprising a platform or the platform topology......, analysis must be employed to determine its capabilities. This kind of analysis is the subject of this dissertation. The main contribution of this work is the Service Relation Model used to describe and analyze the flow of service in models of platforms and systems composed of re-usable components...

  2. Use of System Dynamics Techniques in the Garrison Health Modelling Tool

    Science.gov (United States)

    2010-11-01

    Joint Health Command (JHC) tasked DSTO to develop techniques for modelling Defence health service delivery both in a Garrison environment in Australia ... the Garrison Health Modelling Tool, a prototype software package designed to provide decision-support to JHC health officers and managers in a garrison

  3. Development of a piano learning tool

    OpenAIRE

    Baloh, Matevž

    2012-01-01

    This thesis analyzes the appropriateness of the formula defined by the game 'Guitar Hero' in an application which aims to help its users learn how to play the piano. The appropriateness is determined through the development of an application. The thesis describes an attempt at the development of a game, the primary intention of which is to be fun, with the secondary purpose of teaching how to play the piano. After this, it describes an attempt at the development of an application, the p...

  4. Design and development of progressive tool for manufacturing washer

    Science.gov (United States)

    Annigeri, Ulhas K.; Raghavendra Ravi Kiran, K.; Deepthi, Y. P.

    2017-07-01

    In a progressive tool the raw material is worked at different stations to finally fabricate the component. A progressive tool is a lucrative tool for mass production of components, and many automobile and other transport industries develop progressive tools for the production of components. The design of the tool involves a lot of planning, and a similar amount of process-planning skill is required in the fabrication of the tool. The design also involves the use of thumb rules and standard elements as per experience gained in practice. Manufacturing the press tool is a laborious task, as special jigs and fixtures have to be designed for the purpose. Assembly of all the press tool elements is another task where the use of accurate measuring instruments for the alignment of the various tool elements is important. In the present study, a progressive press tool for the production of washers has been designed and fabricated, and the press tool has been tried out on a mechanical press. The components produced conform to the specified dimensions.

  5. DEVELOPING A TOOL FOR ENVIRONMENTALLY PREFERABLE PURCHASING

    Science.gov (United States)

    LCA-based guidance was developed by EPA under the Framework for Responsible Environmental Decision Making (FRED) effort to demonstrate how to conduct a relative comparison between product types to determine environmental preferability. It identifies data collection needs and iss...

  6. DEVELOPMENT OF SOLUBILITY PRODUCT VISUALIZATION TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    T.F. Turner; A.T. Pauli; J.F. Schabron

    2004-05-01

    Western Research Institute (WRI) has developed software for the visualization of data acquired from solubility tests. The work was performed in conjunction with AB Nynas Petroleum, Nynashamn, Sweden, who participated as the corporate cosponsor for this Jointly Sponsored Research (JSR) task. Efforts in this project were split between software development and solubility test development. The Microsoft Windows-compatible software accepts up to three solubility data sets as input, calculates the parameters for six solid body types to fit the data, and interactively displays the results in three dimensions. Several infrared spectroscopy techniques have been examined for potential use in determining bitumen solubility in various solvents. Reflectance, time-averaged absorbance, and transmittance techniques were applied to bitumen samples in single and binary solvent systems. None of the techniques were found to have wide applicability.

  7. A Thermoelastic Hydraulic Fracture Design Tool for Geothermal Reservoir Development

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad Ghassemi

    2003-06-30

    Geothermal energy is recovered by circulating water through heat exchange areas within a hot rock mass. Geothermal reservoir rock masses generally consist of igneous and metamorphic rocks that have low matrix permeability. Therefore, cracks and fractures play a significant role in the extraction of geothermal energy by providing the major pathways for fluid flow and heat exchange. Thus, knowledge of the conditions leading to the formation of fractures and fracture networks is of paramount importance. Furthermore, in the absence of natural fractures or adequate connectivity, artificial fractures are created in the reservoir using hydraulic fracturing. At times, the practice aims to create a number of parallel fractures connecting a pair of wells. Multiple fractures are preferred because of the large size necessary when using only a single fracture. Although the basic idea is rather simple, hydraulic fracturing is a complex process involving interactions of high pressure fluid injections with a stressed hot rock mass, mechanical interaction of induced fractures with existing natural fractures, and the spatial and temporal variations of in-situ stress. As a result it is necessary to develop tools that can be used to study these interactions as an integral part of a comprehensive approach to geothermal reservoir development, particularly enhanced geothermal systems. In response to this need we have set out to develop advanced thermo-mechanical models for the design of artificial fractures and rock fracture research in geothermal reservoirs. These models consider the significant hydraulic and thermo-mechanical processes and their interaction with the in-situ stress state. Wellbore failure and fracture initiation is studied using a model that fully couples poro-mechanical and thermo-mechanical effects. The fracture propagation model is based on complex variable and regular displacement discontinuity formulations. In the complex variable approach the displacement discontinuities are

  8. Development of a tool for evaluating multimedia for surgical education.

    Science.gov (United States)

    Coughlan, Jane; Morar, Sonali S

    2008-09-01

    Educational multimedia has been designed to provide surgical trainees with expert operative information outside of the operating theater. The effectiveness of multimedia (e.g., CD-ROMs) for learning has been a common research topic since the 1990s. To date, however, little discussion has taken place on the mechanisms to evaluate the quality of multimedia-driven teaching. This may be because of a lack of research into the development of appropriate tools for evaluating multimedia, especially for surgical education. This paper reports on a small-scale pilot and exploratory study (n = 12) that developed a tool for surgical multimedia evaluation. The validity of the developed tool was established through adaptation of an existing tool, which was reviewed using experts in surgery, usability, and education. The reliability of the developed tool was tested with surgical trainees who used it to assess a multimedia CD-ROM created for teaching basic surgical skills. The findings contribute to an understanding of surgical trainees' experience of using educational multimedia, in terms of characteristics of the learning material for interface design and content and the process of developing evaluation tools, in terms of inclusion of appropriate assessment criteria. The increasing use of multimedia in medical education necessitates the development of standardized tools for determining the quality of teaching and learning. Little research exists into the development of such tools and so the present work stimulates discussion on how to evaluate surgical training.

  9. Development and validation of a simulation tool dedicated to eddy current non destructive testing of tubes; Developpement d'un modele electromagnetique 3D pour la simulation du controle par Courants de Foucault de tubes en fabrication

    Energy Technology Data Exchange (ETDEWEB)

    Reboud, Ch

    2006-09-15

    The eddy current testing (ECT) technique is widely used in industrial fields such as the iron and steel industry. Dedicated simulation tools provide great assistance in the optimisation of ECT processes. CEA and the Vallourec Research Center have collaborated in order to develop a simulation tool for the ECT of tubes. The volume integral method has been chosen for the resolution of Maxwell's equations in a stratified medium, in order to get accurate results with a computation time short enough to carry out optimisation or inversion procedures. A fast model has been developed for the simulation of ECT of non-magnetic tubes using specific external probes. New flaw geometries have been modelled: holes and notches with flat bottom. Validations of the developments, which have been integrated into the CIVA platform, have been carried out using experimental data recorded in laboratory conditions and in industrial conditions, successively. The integral equations derived are solved using the Galerkin variant of the method of moments with pulse functions as projection functions. In order to overcome some memory limitations, other projection functions have been considered. A new discretization scheme based on non-uniform B-Splines of degree 1 or 2 has been implemented, which constitutes an original contribution to the existing literature. The decrease of the mesh size needed to get a given accuracy on the result may lead to the simulation of more complex ECT configurations. (author)

  10. Integrating Hydrologic and Water Quality Models as a Decision Support Tool for Implementation of Low Impact Development in a Coastal Urban Watershed under Climate Variability and Sea Level Rise

    Science.gov (United States)

    Chang, N. B.

    2016-12-01

    Many countries are concerned about development and redevelopment efforts in urban regions to reduce flood risk by considering hazards such as high-tide events, storm surge, flash floods, stormwater runoff, and the impacts of sea level rise. Combining these present and future hazards with vulnerable characteristics found throughout coastal communities, such as predominantly low-lying areas and increasing urban development, creates scenarios of increasing exposure to flood hazard. As such, the most vulnerable areas require adaptation strategies and mitigation actions for flood hazard management. In addition, in the U.S., Numeric Nutrient Criteria (NNC) are a critical tool for protecting and restoring the designated uses of a waterbody with regard to nitrogen and phosphorus pollution. Strategies such as low impact development (LID) have been promoted in recent years as an alternative to traditional stormwater management and drainage to control both flooding and water quality impacts. LID utilizes decentralized multifunctional site designs and incorporates on-site storm water management practices rather than conventional storm water management approaches that divert flow toward centralized facilities. How to integrate hydrologic and water quality models to achieve this decision support becomes a challenge. The Cross Bayou Watershed of Pinellas County in Tampa Bay, a highly urbanized coastal watershed, is utilized as a case study due to its sensitivity to flood hazards and water quality management within the watershed. This study will aid the County, as a decision maker, to implement its stormwater management policy and honor recent NNC state policy via demonstration of an integrated hydrologic and water quality model, including the Interconnected Channel and Pond Routing Model v.4 (ICPR4) and the BMPTRAIN model, as a decision support tool. The ICPR4 can be further coupled with the ADCIRC/SWAN model to reflect storm surge and sea level rise in coastal regions.

  11. Developing a 300C Analog Tool for EGS

    Energy Technology Data Exchange (ETDEWEB)

    Normann, Randy

    2015-03-23

    This paper covers the development of a 300°C geothermal well monitoring tool for supporting future EGS (enhanced geothermal systems) power production. This is the first of 3 tools planned. This is an analog tool designed for monitoring well pressure and temperature. There is discussion of 3 different circuit topologies and the development of the supporting surface electronics and software. There is information on testing electronic circuits and components. One of the major components is the cable used to connect the analog tool to the surface.

  12. Rasp Tool on Phoenix Robotic Arm Model

    Science.gov (United States)

    2008-01-01

    This close-up photograph taken at the Payload Interoperability Testbed at the University of Arizona, Tucson, shows the motorized rasp protruding from the bottom of the scoop on the engineering model of NASA's Phoenix Mars Lander's Robotic Arm. The rasp will be placed against the hard Martian surface to cut into the hard material and acquire an icy soil sample for analysis by Phoenix's scientific instruments. The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is led by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  13. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  14. Networks as Tools for Sustainable Urban Development

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole; Tollin, Nicola

    . By applying the GREMI2-theories of “innovative milieux” (Aydalot, 1986; Camagni, 1991) to the case study, we will suggest some reasons for the benefits achieved by the Dogme-network, compared to other networks. This analysis will point to the existence of an “innovative milieu” on sustainability within......, strategies and actions. There has been little theoretical development on the subject. In practice, networks for sustainable development can be seen as combining different theoretical approaches to networks, including governance, urban competition and innovation. To give a picture of the variety...

  15. Software tools for object-based audio production using the Audio Definition Model

    OpenAIRE

    Geier, Matthias; Carpentier, Thibaut; Noisternig, Markus; Warusfel, Olivier

    2017-01-01

    International audience; We present a publicly available set of tools for the integration of the Audio Definition Model (ADM) in production workflows. ADM is an open metadata model for the description of channel-, scene-, and object-based media within a Broadcast Wave Format (BWF) container. The software tools were developed within the European research project ORPHEUS (https://orpheus-audio.eu/) that aims at developing new end-to-end object-based media chains for broadcast. These tools allow ...

  16. Developing Multilateral Surveillance Tools in the EU

    NARCIS (Netherlands)

    de Ruiter, R.

    2008-01-01

    The development of the infrastructure of the Open Method of Coordination (OMC) is an unaddressed topic in scholarly debates. On the basis of secondary literature on the European Employment Strategy, it is hypothesised that a conflict between an incentive and reluctance to act on the EU level on the

  17. Awareness Development Across Perspectives Tool (ADAPT)

    NARCIS (Netherlands)

    Petiet, P.; Maanen, P.P. van; Bemmel, I.E. van; Vliet, A.J. van

    2010-01-01

    Reality can be viewed from several perspectives or disciplines. Due to their background, training and education, soldiers develop a military perspective which is not solely restricted to kinetic activities. In current missions, military personnel are confronted with a reality in which other

  18. Tools for educational change | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-02-03

    Feb 3, 2011 ... One floor above Pinto's office is a relatively new computer lab, part of SchoolNet Mozambique, a project supported by Canada's International Development Research Centre (IDRC) to link schools via the Internet to enhance learning opportunities for students, teachers, and the surrounding community.

  19. Tools for Nanotechnology Education Development Program

    Energy Technology Data Exchange (ETDEWEB)

    Dorothy Moore

    2010-09-27

    The overall focus of this project was the development of reusable, cost-effective educational modules for use with the table top scanning electron microscope (TTSEM). The goal of this project's outreach component was to increase students' exposure to the science and technology of nanoscience.

  20. 78 FR 68459 - Medical Device Development Tools; Draft Guidance for Industry, Tool Developers, and Food and Drug...

    Science.gov (United States)

    2013-11-14

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2013-D-1279] Medical Device Development Tools; Draft Guidance for Industry, Tool Developers, and Food and Drug Administration Staff; Availability AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food...

  1. EXPERT SYSTEMS - DEVELOPMENT OF AGRICULTURAL INSURANCE TOOL

    Directory of Open Access Journals (Sweden)

    NAN Anca-Petruţa

    2013-07-01

    Full Text Available Because specialty agricultural assistance is not always available when farmers need it, we identified expert systems as a strong instrument with extended potential in agriculture. Their use has recently started to grow in scale, covering all socio-economic fields of activity; their role is to collect data on different aspects from human experts in order to assist the user in the steps necessary for solving problems, at the performance level of the expert, making the expert's acquired knowledge and experience available. We opted for a general presentation of expert systems as well as of their necessity because a solution for developing the agricultural system can come from artificial intelligence, by implementing expert systems in the field of agricultural insurance, promoting existing insurance products and helping farmers find options depending on their necessities and possibilities. The objective of this article consists of collecting data on different aspects of specific areas of interest in agricultural insurance, preparing the database, and presenting a conceptual pilot version which will become constantly richer depending on the answers received from agricultural producers, with the clearest possible exposure of the knowledge base. The choice of this theme is justified by the fact that, even though agricultural insurance plays a very important role in agricultural development, the results registered so far are modest, which is why solutions need to be found for developing the agricultural sector. The importance of this work consists in proposing an immediately viable solution corresponding to the current necessities of agricultural producers, and an innovative one, namely the implementation of expert systems in agricultural insurance as a way of promoting insurance products. Our research, even though it treats the subject at a conceptual level, aims to undertake an

  2. Analysis of Sequence Diagram Layout in Advanced UML Modelling Tools

    Directory of Open Access Journals (Sweden)

    Ņikiforova Oksana

    2016-05-01

    Full Text Available System modelling using the Unified Modelling Language (UML) is a task that has to be addressed in software development. The more complex the software becomes, the higher the requirements for demonstrating the system to be developed, especially in its dynamic aspect, which in UML is offered by the sequence diagram. To address this task, the main attention is devoted to the graphical presentation of the system, where diagram layout plays the central role in information perception. The UML sequence diagram, due to its specific structure, is selected for a deeper analysis of element layout. The authors' research examines the abilities of modern UML modelling tools to offer automatic layout of the UML sequence diagram and analyses them according to criteria required for diagram perception.

  3. A new model for the sonic borehole logging tool

    International Nuclear Information System (INIS)

    Oelgaard, P.L.

    1990-12-01

    A number of models for the sonic borehole logging tool have been developed earlier. These models, which are mainly based on experimental data, are discussed and compared. On this background the new model is developed. It is based on the assumptions that the pores of low porosity formations and the grains of high porosity media may be approximated by cylinders, and that the dimensions of these cylinders are given by distribution functions. From these assumptions the transit time Δt_p of low porosity formations and Δt_g of high porosity media are calculated by use of the Monte Carlo method. Combining the Δt_p and Δt_g values obtained, by use of selected weighting functions, seems to permit the determination of the transit time Δt for the full porosity range (0 ≤ φ ≤ 100%). (author)
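
    A highly simplified sketch of the Monte Carlo idea (not Oelgaard's actual cylinder model; the slowness values and the sampling distribution are illustrative assumptions) draws the fluid fraction of the acoustic path around the porosity and averages the resulting transit times:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative Monte Carlo sonic-transit-time estimate: the fraction of the
    # acoustic path spent in fluid-filled cylinders is sampled around the porosity
    # phi, and the transit time is a path-weighted average of the two slownesses.

    DT_MATRIX = 55.5   # us/ft, typical sandstone matrix slowness (assumed)
    DT_FLUID = 189.0   # us/ft, water-filled pore space (assumed)

    def transit_time(phi, n_samples=10_000, spread=0.05):
        """Mean transit time (us/ft) at fractional porosity phi."""
        f = np.clip(rng.normal(phi, spread, n_samples), 0.0, 1.0)
        dt = f * DT_FLUID + (1.0 - f) * DT_MATRIX
        return dt.mean()

    for phi in (0.0, 0.1, 0.3, 0.6, 1.0):
        print(f"phi = {phi:.1f}: dt = {transit_time(phi):6.1f} us/ft")
    ```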

  4. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database...... speed doubly-fed induction generator wind turbine concept 3. Variable speed multi-pole permanent magnet synchronous generator wind turbine concept These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies......, connection of the wind turbine at different types of grid and storage systems. Different control strategies have been developed and implemented for these wind turbine concepts, their performance in normal or fault operation being assessed and discussed by means of simulations. The described control...

  5. Libraries Are Dynamic Tools for National Development

    Directory of Open Access Journals (Sweden)

    Amaoge Dorathy Agbo

    2014-12-01

    Full Text Available Building an ideal nation requires a holistic approach. All facets of human activity must be harnessed, and all indices of nation building must be taken care of. In doing this, all academic and professional disciplines are involved, and libraries are no exception. This paper looks at various types of libraries and their basic functions, their roles in national development, and in particular the challenges facing library services in Nigeria, such as staff inadequately trained to meet the increasing demands of users.

  6. Introduction to genetic algorithms as a modeling tool

    International Nuclear Information System (INIS)

    Wildberger, A.M.; Hickok, K.A.

    1990-01-01

    Genetic algorithms are search and classification techniques modeled on natural adaptive systems. This is an introduction to their use as a modeling tool with emphasis on prospects for their application in the power industry. It is intended to provide enough background information for its audience to begin to follow technical developments in genetic algorithms and to recognize those which might impact on electric power engineering. Beginning with a discussion of genetic algorithms and their origin as a model of biological adaptation, their advantages and disadvantages are described in comparison with other modeling tools such as simulation and neural networks in order to provide guidance in selecting appropriate applications. In particular, their use is described for improving expert systems from actual data and they are suggested as an aid in building mathematical models. Using the Thermal Performance Advisor as an example, it is suggested how genetic algorithms might be used to make a conventional expert system and mathematical model of a power plant adapt automatically to changes in the plant's characteristics
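
    The basic mechanism the abstract refers to can be illustrated with a minimal, self-contained sketch (purely illustrative, not tied to any power-industry application): a population of candidate bit strings evolves through selection, crossover and mutation toward higher fitness:

    ```python
    import random

    # Minimal genetic-algorithm sketch: evolve a bit string toward all ones.
    TARGET_LEN, POP_SIZE, GENERATIONS, MUT_RATE = 20, 30, 60, 0.02

    def fitness(ind):
        return sum(ind)                      # number of 1-bits

    def crossover(a, b):
        cut = random.randrange(1, TARGET_LEN)
        return a[:cut] + b[cut:]

    def mutate(ind):
        return [bit ^ (random.random() < MUT_RATE) for bit in ind]

    population = [[random.randint(0, 1) for _ in range(TARGET_LEN)] for _ in range(POP_SIZE)]
    for gen in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        parents = population[:POP_SIZE // 2]                 # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children

    print("best fitness:", fitness(max(population, key=fitness)))
    ```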

  7. Seductive Atmospheres: Using tools to effectuate spaces for Leadership Development

    DEFF Research Database (Denmark)

    Elmholdt, Kasper Trolle; Clausen, Rune Thorbjørn; Madsen, Mona T

    2018-01-01

    Hospital, this study investigates how a business game is used as a tool to effectuate episodic spaces for leadership development. The study reveals three tool affordances and discusses how they enable and constrain episodic spaces for development and further develops the notion of seductive atmospheres......This study applies an affordance lens to understand the use of management tools and how atmospheres for change and development are created and exploited. Drawing on an ethnographic case study of a consultant-facilitated change intervention among a group of research leaders at a Danish Public...... as an important mechanism. The article suggests that a broader understanding of the use of tools and the role of atmospheres is essential for understanding how episodic spaces for development come to work in relation to organizational change and development....

  8. Prototype of Automated PLC Model Checking Using Continuous Integration Tools

    CERN Document Server

    Lettrich, Michael

    2015-01-01

    To deal with the complexity of operating and supervising large scale industrial installations at CERN, Programmable Logic Controllers (PLCs) are often used. A failure in these control systems can cause a disaster in terms of economic losses, environmental damage or human losses. Therefore the requirements on software quality are very high. To provide PLC developers with a way to verify proper functionality against requirements, a Java tool named PLCverif has been developed which encapsulates and thus simplifies the use of third party model checkers. One of our goals in this project is to integrate PLCverif into the development process of PLC programs. When the developer changes the program, all the requirements should be verified again, as a change in the code can produce collateral effects and violate one or more requirements. For that reason, PLCverif has been extended to work with Jenkins CI in order to automatically trigger the verification cases when the developer changes the PLC program. This prototype has been...

  9. Development, Exploitation, and Transition of Computer Aided Engineering (CAE) Tools

    National Research Council Canada - National Science Library

    Carter, Harold W

    2003-01-01

    .... Tasks include CMOS-based microwave component design and fabrication, parallel and mixed-signal VHDL and VHDL-AMS simulator algorithms, conversion tools for VHDL-AMS models, System-on-a-Chip methods...

  10. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    Directory of Open Access Journals (Sweden)

    Ezio Bartocci

    2016-01-01

    Full Text Available As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  11. Evaluation of the Street Pollution Model OSPM for Measurements at 12 Streets Stations Using a Newly Developed and Freely Available Evaluation Tool

    DEFF Research Database (Denmark)

    Ketzel, Matthias; Jensen, Steen Solvang; Brandt, Jørgen

    2012-01-01

    the observations well, especially for the most recent years, while for NO2 the model over-predicts in two cases. The explanation for this over-prediction is believed to be uncertainties in the traffic or emission input data, but also in model parameters, and the representativeness of the urban background data may...... reproduces the observed basic dependencies of concentrations on meteorological parameters–most notably wind direction and wind speed. However, in some cases the modelled annual trends in NOx and NO2 are slightly different from what is found in the measured concentrations. For NOx the OSPM reproduces......In the present work, the Operational Street Pollution Model (OSPM) has been evaluated in comparison with continuous half-hourly measurements over a multi-year period for five permanent street monitor stations that constitute part of the Danish Air Quality Monitoring Programme as well...

  12. Development of a Microsoft Excel tool for one-parameter Rasch model of continuous items: an application to a safety attitude survey.

    Science.gov (United States)

    Chien, Tsair-Wei; Shao, Yang; Kuo, Shu-Chun

    2017-01-10

    Many continuous item responses (CIRs) are encountered in healthcare settings, but item response theory's (IRT) probabilistic modeling is not used to provide graphical presentations for interpreting CIR results. A computer module that is programmed to deal with CIRs is required. The aims were to present a computer module, validate it, and verify its usefulness in dealing with CIR data, and then to apply the model to real healthcare data in order to show how CIRs can be applied in healthcare settings, with an example regarding a safety attitude survey. Using Microsoft Excel VBA (Visual Basic for Applications), we designed a computer module that minimizes the residuals and calculates the model's expected scores according to person responses across items. Rasch models based on a Wright map and on KIDMAP were demonstrated to interpret the results of the safety attitude survey. The author-made CIR module yielded OUTFIT mean square (MNSQ) and person measures equivalent to those yielded by the professional Rasch Winsteps software. The probabilistic modeling of the CIR module provides messages that are much more valuable to users and shows the CIR advantage over classic test theory. Because of advances in computer technology, healthcare users who are familiar with MS Excel can easily apply the study's CIR module to deal with continuous variables, to the benefit of comparisons of data with a logistic distribution and model fit statistics.
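
    A rough least-squares sketch of the underlying idea (this is not the authors' Excel VBA module; the synthetic data, the scaling of responses to [0, 1] and the anchoring choice are illustrative assumptions) fits person measures and item difficulties so that logistic expected scores match the observed continuous responses:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # One-parameter Rasch-type sketch for continuous responses scaled to [0, 1]:
    # the expected score for person n on item i is logistic(theta_n - b_i).
    rng = np.random.default_rng(1)
    X = rng.uniform(0.05, 0.95, size=(25, 6))     # hypothetical data: 25 persons x 6 items

    def expected(theta, b):
        return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

    def sum_sq_residuals(params, data):
        n_persons, n_items = data.shape
        theta, b = params[:n_persons], params[n_persons:]
        b = b - b.mean()                          # anchor the scale (mean difficulty = 0)
        return np.sum((data - expected(theta, b)) ** 2)

    n_p, n_i = X.shape
    res = minimize(sum_sq_residuals, x0=np.zeros(n_p + n_i), args=(X,), method="L-BFGS-B")
    theta_hat, b_hat = res.x[:n_p], res.x[n_p:] - res.x[n_p:].mean()
    print("item difficulties:", np.round(b_hat, 2))
    ```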

  13. Evaluation and selection of CASE tool for SMART OTS development

    International Nuclear Information System (INIS)

    Park, K. O; Seo, S. M.; Seo, Y. S.; Koo, I. S.; Jang, M. H.

    1999-01-01

    A CASE (Computer-Aided Software Engineering) tool is software that aids software engineering activities such as requirements analysis, design, testing, configuration management, and project management. The evaluation and selection of commercial CASE tools for a specific software development project is not easy, because it requires technical ability on the part of the evaluator and maturity of the software development organization. In this paper, we discuss selection strategies, a characteristics survey, evaluation criteria, and the result of the CASE tool selection for the development of the SMART (System-integrated Modular Advanced ReacTor) OTS (Operator Training Simulator)

  14. Market research companies and new product development tools

    NARCIS (Netherlands)

    Nijssen, Edwin J.; Frambach, Ruud T.

    1998-01-01

    This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies' turnover, (2) MR companies' awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers' perceptions of the influence of client

  15. Market research companies and new product development tools

    NARCIS (Netherlands)

    Nijssen, E.J.; Frambach, R.T.

    1998-01-01

    This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies’ turnover, (2) MR companies’ awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers’ perceptions of the influence of client

  16. Evaluating IMU communication skills training programme: assessment tool development.

    Science.gov (United States)

    Yeap, R; Beevi, Z; Lukman, H

    2008-08-01

    This article describes the development of four assessment tools designed to evaluate the communication skills training (CST) programme at the International Medical University (IMU). The tools measure pre-clinical students' 1) perceived competency in basic interpersonal skills, 2) attitude towards patient-centred communication, 3) conceptual knowledge on doctor-patient communication, and 4) acceptance of the CST programme.

  17. Validation of the 25-Item Stanford Faculty Development Program Tool on Clinical Teaching Effectiveness.

    Science.gov (United States)

    Mintz, Marcy; Southern, Danielle A; Ghali, William A; Ma, Irene W Y

    2015-01-01

    CONSTRUCT: The 25-item Stanford Faculty Development Program Tool on Clinical Teaching Effectiveness assesses clinical teaching effectiveness. Valid and reliable rating of teaching effectiveness is helpful for providing faculty with feedback. The 25-item Stanford Faculty Development Program Tool on Clinical Teaching Effectiveness was intended to evaluate seven dimensions of clinical teaching. Confirmation of the structure of this tool has not been previously performed. This study sought to validate this tool using a confirmatory factor analysis, testing a 7-factor model and comparing its goodness of fit with a modified model. Acceptability of the use of the tool was assessed using a 6-item survey, completed by final year medical students (N = 119 of 156 students; 76%). The testing of the goodness of fit indicated that the 7-factor model performed poorly, χ²(254) = 457.4, p ... tool on their preceptors on a biweekly basis, only 25% were willing to do so on a weekly basis. Our study failed to confirm the factor structure of the 25-item tool. A modified tool with fewer, more conceptually distinct items was best fit by a 5-factor model. Further, the acceptability of use for the 25-item tool may be poor for rotations with a new preceptor weekly. The abbreviated tool may be preferable in that setting.

  18. A participatory decision support tool to access costs and benefits or tourism development scenarios : application of the ADAPTIVE model to Greater Giyani, South Africa

    NARCIS (Netherlands)

    Henkens, R.J.H.G.; Tassone, V.C.; Grafakos, S.; Groot, de R.S.; Luttik, J.

    2007-01-01

    The tourism industry represents a thriving business and offers many opportunities for tourism development all around the world. Each development will have its economic, socio-cultural and ecological costs and benefits. Many of these are difficult to assess and to value, which often leads to

  19. WP3 Prototype development for operational planning tool

    DEFF Research Database (Denmark)

    Kristoffersen, Trine; Meibom, Peter; Apfelbeck, J.

    of electricity load and wind power production, and to cover forced outages of power plants and transmission lines. Work has been carried out to include load uncertainty and forced outages in the two main components of the Wilmar Planning tool, namely the Scenario Tree Tool and the Joint Market Model. This work...... is documented in chapters 1 and 2. The inclusion of load uncertainty and forced outages in the Scenario Tree Tool enables calculation of the demand for reserve power depending on the forecast horizon. The algorithm is given in Section 3.1. The design of a modified version of the Joint Market Model enabling...

  20. Tools for tracking progress. Indicators for sustainable energy development

    International Nuclear Information System (INIS)

    Khan, A.; Rogner, H.H.; Aslanian, G.

    2000-01-01

    A project on 'Indicators for Sustainable Energy Development (ISED)' was introduced by the IAEA as a part of its work programme on Comparative Assessment of Energy Sources for the biennium 1999-2000. It is being pursued by the Planning and Economic Studies Section of the Department of Nuclear Energy. The envisaged tasks are to: (1) identify the main components of sustainable energy development and derive a consistent set of appropriate indicators, keeping in view the indicators for Agenda 21, (2) establish relationship of ISED with those of the Agenda 21, and (3) review the Agency's databases and tools to determine the modifications required to apply the ISED. The first two tasks are being pursued with the help of experts from various international organizations and Member States. In this connection two expert group meetings were held, one in May 1999 and the other in November 1999. The following nine topics were identified as the key issues: social development; economic development; environmental congeniality and waste management; resource depletion; adequate provision of energy and disparities; energy efficiency; energy security; energy supply options; and energy pricing. A new conceptual framework model specifically tuned to the energy sector was developed, drawing upon work by other organizations in the environmental area. Within the framework of this conceptual model, two provisional lists of ISED - a full list and a core list - have been prepared. They cover indicators for the following energy related themes and sub-themes under the economic, social and environmental dimensions of sustainable energy development: Economic dimension: Economic activity levels; End-use energy intensities of selected sectors and different manufacturing industries; energy supply efficiency; energy security; and energy pricing. Social dimension: Energy accessibility and disparities. Environmental dimension: Air pollution (urban air quality; global climate change concern); water

  1. Evaluation of air pollution modelling tools as environmental engineering courseware.

    Science.gov (United States)

    Souto González, J A; Bello Bugallo, P M; Casares Long, J J

    2004-01-01

    The study of phenomena related to the dispersion of pollutants usually takes advantage of mathematical models based on the description of the different processes involved. This educational approach is especially important in air pollution dispersion, where the processes follow non-linear behaviour, so it is difficult to understand the relationships between inputs and outputs, and in a 3D context where it becomes hard to analyze alphanumeric results. In this work, three different software tools, serving as computer solvers for typical air pollution dispersion phenomena, are presented. Each software tool, developed to be implemented on PCs, follows an approach that represents one of three generations of programming languages (Fortran 77, VisualBasic and Java), applied over three different environments: MS-DOS, MS-Windows and the world wide web. The software tools were tested by students of environmental engineering (undergraduate) and chemical engineering (postgraduate), in order to evaluate the ability of these software tools to improve both theoretical and practical knowledge of the air pollution dispersion problem, and the impact of the different environments on the learning process in terms of content, ease of use and visualization of results.
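
    The kind of dispersion calculation such courseware typically solves can be sketched with a minimal Gaussian plume estimate of ground-level concentration downwind of a continuous point source (the dispersion coefficients and source parameters below are illustrative assumptions, not values taken from the tools described):

    ```python
    import numpy as np

    # Gaussian plume ground-level concentration for a continuous point source,
    # with ground reflection included; sigma_y and sigma_z grow linearly with x
    # as a crude neutral-stability approximation (assumed coefficients).

    def ground_level_concentration(q, u, h, x, y, a=0.08, b=0.06):
        """C(x, y, z=0) in g/m^3 for emission q (g/s), wind u (m/s), stack height h (m)."""
        sigma_y, sigma_z = a * x, b * x
        return (q / (np.pi * u * sigma_y * sigma_z)
                * np.exp(-0.5 * (y / sigma_y) ** 2)
                * np.exp(-0.5 * (h / sigma_z) ** 2))

    for x in (200.0, 500.0, 1000.0, 2000.0):
        c = ground_level_concentration(q=10.0, u=4.0, h=50.0, x=x, y=0.0)
        print(f"x = {x:6.0f} m: C = {c * 1e6:8.2f} ug/m^3")
    ```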

  2. Development of a Microsoft Excel tool for one-parameter Rasch model of continuous items: an application to a safety attitude survey

    Directory of Open Access Journals (Sweden)

    Tsair-Wei Chien

    2017-01-01

    Full Text Available Abstract Background Many continuous item responses (CIRs) are encountered in healthcare settings, but item response theory’s (IRT) probabilistic modeling is not used to provide graphical presentations for interpreting CIR results. A computer module that is programmed to deal with CIRs is required. The aims were to present a computer module, validate it, and verify its usefulness in dealing with CIR data, and then to apply the model to real healthcare data in order to show how CIRs can be applied in healthcare settings, with an example regarding a safety attitude survey. Methods Using Microsoft Excel VBA (Visual Basic for Applications), we designed a computer module that minimizes the residuals and calculates the model’s expected scores according to person responses across items. Rasch models based on a Wright map and on KIDMAP were demonstrated to interpret results of the safety attitude survey. Results The author-made CIR module yielded OUTFIT mean square (MNSQ) and person measures equivalent to those yielded by the professional Rasch Winsteps software. The probabilistic modeling of the CIR module provides messages that are much more valuable to users and shows the CIR advantage over classic test theory. Conclusions Because of advances in computer technology, healthcare users who are familiar with MS Excel can easily apply the study’s CIR module to deal with continuous variables, to the benefit of comparisons of data with a logistic distribution and model fit statistics.

  3. ISRU System Model Tool: From Excavation to Oxygen Production

    Science.gov (United States)

    Santiago-Maldonado, Edgardo; Linne, Diane L.

    2007-01-01

    In the late 80's, conceptual designs for an in situ oxygen production plant were documented in a study by Eagle Engineering [1]. In the "Summary of Findings" of this study, it is clearly pointed out that: "reported process mass and power estimates lack a consistent basis to allow comparison." The study goes on to say: "A study to produce a set of process mass, power, and volume requirements on a consistent basis is recommended." Today, approximately twenty years later, as humans plan to return to the moon and venture beyond, the need for flexible up-to-date models of the oxygen extraction production process has become even more clear. Multiple processes for the production of oxygen from lunar regolith are being investigated by NASA, academia, and industry. Three processes that have shown technical merit are molten regolith electrolysis, hydrogen reduction, and carbothermal reduction. These processes have been selected by NASA as the basis for the development of the ISRU System Model Tool (ISMT). In working to develop up-to-date system models for these processes NASA hopes to accomplish the following: (1) help in the evaluation process to select the most cost-effective and efficient process for further prototype development, (2) identify key parameters, (3) optimize the excavation and oxygen production processes, and (4) provide estimates on energy and power requirements, mass and volume of the system, oxygen production rate, mass of regolith required, mass of consumables, and other important parameters. Also, as confidence and high fidelity is achieved with each component's model, new techniques and processes can be introduced and analyzed at a fraction of the cost of traditional hardware development and test approaches. A first generation ISRU System Model Tool has been used to provide inputs to the Lunar Architecture Team studies.

  4. Using complaints to enhance quality improvement: developing an analytical tool.

    Science.gov (United States)

    Hsieh, Sophie Yahui

    2012-01-01

    This study aims to construct an instrument for identifying certain attributes or capabilities that might enable healthcare staff to use complaints to improve service quality. PubMed and ProQuest were searched, which in turn expanded access to other literature. Three paramount dimensions emerged for healthcare quality management systems: managerial, operational, and technical (MOT). The paper reveals that the managerial dimension relates to quality improvement program infrastructure. It contains strategy, structure, leadership, people and culture. The operational dimension relates to implementation processes: organizational changes and barriers when using complaints to enhance quality. The technical dimension emphasizes the skills, techniques or information systems required to achieve successfully continuous quality improvement. The MOT model was developed by drawing from the relevant literature. However, individuals have different training, interests and experiences and, therefore, there will be variance between researchers when generating the MOT model. The MOT components can be the guidelines for examining whether patient complaints are used to improve service quality. However, the model needs testing and validating by conducting further research before becoming a theory. Empirical studies on patient complaints did not identify any analytical tool that could be used to explore how complaints can drive quality improvement. This study developed an instrument for identifying certain attributes or capabilities that might enable healthcare professionals to use complaints and improve service quality.

  5. Analysis and prediction of pest dynamics in an agroforestry context using Tiko'n, a generic tool to develop food web models

    Science.gov (United States)

    Rojas, Marcela; Malard, Julien; Adamowski, Jan; Carrera, Jaime Luis; Maas, Raúl

    2017-04-01

    While it is known that climate change will impact future plant-pest population dynamics, potentially affecting crop damage, agroforestry with its enhanced biodiversity is said to reduce the outbreaks of pest insects by providing natural enemies for the control of pest populations. This premise is known in the literature as the natural enemy hypothesis and has been widely studied qualitatively. However, disagreement still exists on whether biodiversity enhancement reduces pest outbreaks, showing the need of quantitatively understanding the mechanisms behind the interactions between pests and natural enemies, also known as trophic interactions. Crop pest models that study insect population dynamics in agroforestry contexts are very rare, and pest models that take trophic interactions into account are even rarer. This may be due to the difficulty of representing complex food webs in a quantifiable model. There is therefore a need for validated food web models that allow users to predict the response of these webs to changes in climate in agroforestry systems. In this study we present Tiko'n, a Python-based software whose API allows users to rapidly build and validate trophic web models; the program uses a Bayesian inference approach to calibrate the models according to field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection. Tiko'n was run using coffee leaf miner (Leucoptera coffeella) and associated parasitoid data from a shaded coffee plantation, showing the mechanisms of insect population dynamics within a tri-trophic food web in an agroforestry system.
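
    The calibration idea can be illustrated with a rough, self-contained sketch (this is not Tiko'n's API; the Nicholson-Bailey host-parasitoid model, the synthetic data and the Metropolis sampler below are illustrative assumptions): a pest-natural enemy model is fitted to noisy pest counts by sampling the posterior of a single parameter:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Nicholson-Bailey host-parasitoid dynamics; `a` is the parasitoid search efficiency.
    def simulate(a, r=2.0, h0=30.0, p0=5.0, steps=12):
        h, p, traj = h0, p0, []
        for _ in range(steps):
            esc = np.exp(-a * p)                 # fraction of hosts escaping parasitism
            h, p = r * h * esc, h * (1.0 - esc)
            traj.append(h)
        return np.array(traj)

    true_a = 0.05
    observed = simulate(true_a) + rng.normal(0.0, 2.0, 12)   # synthetic "field" counts

    def log_posterior(a, sigma=2.0):
        if not (0.001 < a < 0.5):                # flat prior on a plausible range
            return -np.inf
        resid = observed - simulate(a)
        return -0.5 * np.sum((resid / sigma) ** 2)

    # Plain Metropolis random-walk sampler.
    a_curr, lp_curr, samples = 0.1, log_posterior(0.1), []
    for _ in range(5000):
        a_prop = a_curr + rng.normal(0.0, 0.01)
        lp_prop = log_posterior(a_prop)
        if np.log(rng.uniform()) < lp_prop - lp_curr:
            a_curr, lp_curr = a_prop, lp_prop
        samples.append(a_curr)

    print("posterior mean of a:", np.mean(samples[1000:]))
    ```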

  6. Designing a tool for curriculum leadership development in postgraduate programs

    Directory of Open Access Journals (Sweden)

    M Avizhgan

    2016-07-01

    Full Text Available Introduction: Leadership in the area of curriculum development is increasingly important as we look for ways to improve our programmes and practices. In curriculum studies, leadership has received little attention. Considering the lack of an evaluation tool with objective criteria for the postgraduate curriculum leadership process, this study aimed to design a specific tool and determine its validity and reliability. Method: This is a methodological study. First, the domains and items of the tool were determined through expert interviews and a literature review. Then, using the Delphi technique, 54 important criteria were developed. A panel of experts was used to confirm content and face validity. Reliability was determined in a descriptive study in which 30 faculty members from two universities in Isfahan participated, and was estimated by internal consistency. The data were analyzed by SPSS software, using the Pearson correlation coefficient and reliability analysis. Results: First, the domains and items of the tool were determined in view of the definition of curriculum leadership, and a primary version of the tool was developed. Expert faculty members' views were used in different stages of development and psychometric evaluation. The internal consistency of the tool, expressed as a Cronbach's alpha coefficient, was 96.5. This was determined for each domain separately. Conclusion: Applying this instrument can improve the effectiveness of curriculum leadership. Identifying the characteristics of successful and effective leaders, and utilizing this knowledge in developing and implementing curricula, might help us respond better to the changing needs of our students, teachers and schools of tomorrow.

  7. Development of Nylon Based FDM Filament for Rapid Tooling Application

    Science.gov (United States)

    Singh, R.; Singh, S.

    2014-04-01

    There has been a critical need for the development of a cost-effective nylon-based wire to be used as feedstock filament for fused deposition modelling (FDM) machines. Hitherto, however, very little work has been reported on developing an alternative to the acrylonitrile butadiene styrene (ABS) based wire which is presently used in most FDM machines. The present research work is focused on the development of a nylon-based wire as an alternative to ABS wire (to be used as feedstock filament on FDM) without changing any hardware or software of the machine. For the present study, aluminium oxide (Al2O3) has been used as an additive in different proportions with nylon fibre. A single-screw extruder was used for wire preparation, and the wire thus produced was tested on the FDM machine. The mechanical properties, i.e. tensile strength and percentage elongation, of the finally developed wire have been optimized by the Taguchi L9 technique. The work represents a major development in reducing cost and time in rapid tooling applications.
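
    The Taguchi L9 analysis mentioned above can be sketched as follows (the orthogonal array is the standard L9 layout for three 3-level factors, but the response values are hypothetical, not the paper's measurements): larger-the-better signal-to-noise ratios are computed per run and averaged per factor level to pick the best setting:

    ```python
    import numpy as np

    # Standard L9 orthogonal array (first three columns) for three 3-level factors.
    L9 = np.array([[1, 1, 1], [1, 2, 2], [1, 3, 3],
                   [2, 1, 2], [2, 2, 3], [2, 3, 1],
                   [3, 1, 3], [3, 2, 1], [3, 3, 2]])

    # Hypothetical tensile-strength replicates (MPa) for each of the 9 runs.
    y = np.array([[38.1, 37.6], [40.2, 39.8], [41.0, 41.5],
                  [39.5, 39.0], [42.3, 42.8], [40.1, 40.5],
                  [41.8, 42.0], [43.1, 42.6], [44.0, 44.4]])

    # Larger-the-better S/N ratio: -10 * log10(mean(1 / y^2)).
    sn = -10.0 * np.log10(np.mean(1.0 / y ** 2, axis=1))

    for factor in range(3):
        effects = [sn[L9[:, factor] == level].mean() for level in (1, 2, 3)]
        best = int(np.argmax(effects)) + 1
        print(f"factor {factor + 1}: mean S/N per level = {np.round(effects, 2)}, best level = {best}")
    ```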

  8. Development and commissioning of decision support tools for sewerage management.

    Science.gov (United States)

    Manic, G; Printemps, C; Zug, M; Lemoine, C

    2006-01-01

    Managing sewerage systems is a highly complex task due to the dynamic nature of the facilities. Their performance strongly depends on the know-how applied by the operators. In order to define optimal operational settings, two decision support tools based on mathematical models have been developed. Moreover, easy-to-use interfaces have been created as well, aiding operators who presumably do not have the necessary skills to use modelling software. The two developed programs simulate the behaviour of wastewater treatment plants (WWTP) and sewer network systems, respectively. They have essentially the same structure, including raw data management and statistical analysis, a simulation layer using the application programming interface of the applied software, and a layer responsible for the representation of the obtained results. Four user modes are provided in the two programs, including the simulation of historical data using the applied and novel operational settings, as well as modes concerning the prediction of possible operation periods and updates. The WWTP software was successfully installed in Nantes (France) in June 2004, and the sewer network software was deployed in Saint-Malo (France) in January 2005. This paper presents the structure of the developed software and the first results obtained during the commissioning phase.

  9. Modeling Tourism Sustainable Development

    Science.gov (United States)

    Shcherbina, O. A.; Shembeleva, E. A.

    The basic approaches to decision making and to modeling sustainable tourism development are reviewed. The dynamics of sustainable development are considered within Forrester's system dynamics framework. The multidimensionality of sustainable tourism development and the multicriteria issues of sustainable development are analyzed. Decision Support Systems (DSS) and Spatial Decision Support Systems (SDSS) are discussed as effective techniques for examining and visualizing the impacts of policies and sustainable tourism development strategies within an integrated and dynamic framework. Main modules that may be utilized for integrated modeling of sustainable tourism development are proposed.
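
    As a toy illustration of the Forrester-style stock-and-flow view mentioned above (all stocks, flows and parameter values are hypothetical, not taken from the paper), tourist arrivals and an environmental quality index can be coupled through simple feedbacks and integrated with an Euler scheme:

    ```python
    # Toy system-dynamics sketch: arrivals grow with attractiveness (driven by
    # environmental quality), while quality is degraded by tourism pressure and
    # recovers slowly, illustrating the feedback loops such models capture.

    dt, years = 0.1, 30
    tourists, quality = 10_000.0, 1.0        # stocks: annual arrivals, quality index in [0, 1]
    growth_rate, damage_rate, recovery_rate = 0.08, 4e-6, 0.03

    history = []
    for step in range(int(years / dt)):
        attractiveness = quality                      # higher quality -> faster growth
        d_tourists = growth_rate * attractiveness * tourists
        d_quality = recovery_rate * (1.0 - quality) - damage_rate * tourists
        tourists += d_tourists * dt
        quality = max(0.0, min(1.0, quality + d_quality * dt))
        history.append((step * dt, tourists, quality))

    t, n, q = history[-1]
    print(f"after {t:.0f} years: ~{n:,.0f} arrivals, quality index {q:.2f}")
    ```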

  10. Water Loss Management: Tools and Methods for Developing Countries

    NARCIS (Netherlands)

    Mutikanga, H.E.

    2012-01-01

    Water utilities in developing countries are struggling to provide customers with a reliable level of service due to their peculiar water distribution characteristics including poorly zoned networks with irregular supply operating under restricted budgets. These unique conditions demand unique tools

  11. Water Loss Management : Tools and Methods for Developing Countries

    NARCIS (Netherlands)

    Mutikanga, H.E.

    2012-01-01

    Water utilities in developing countries are struggling to provide customers with a reliable level of service due to their peculiar water distribution characteristics including poorly zoned networks with irregular supply operating under restricted budgets. These unique conditions demand unique tools

  12. Psychometric properties of a Mental Health Team Development Audit Tool.

    LENUS (Irish Health Repository)

    Roncalli, Silvia

    2013-02-01

    To assist in improving team working in Community Mental Health Teams (CMHTs), the Mental Health Commission formulated a user-friendly but yet-to-be validated 25-item Mental Health Team Development Audit Tool (MHDAT).

  13. Right approach to 3D modeling using CAD tools

    Science.gov (United States)

    Baddam, Mounica Reddy

    The thesis provides a step-by-step methodology to enable an instructor dealing with CAD tools to optimally guide his/her students through an understandable 3D modeling approach, which will not only enhance their knowledge of the tool's usage but also enable them to achieve their desired result in comparatively less time. In practice, very little information is available on applying CAD skills in formal beginners' training sessions. Additionally, the advent of new software in the 3D domain makes staying up to date an increasingly difficult task. Keeping up with the industry's advanced requirements emphasizes the importance of more skilled hands in the field of CAD development, rather than just prioritizing manufacturing in terms of complex software features. The thesis analyses different 3D modeling approaches specific to the variety of CAD tools currently available on the market. Utilizing performance-time databases, learning curves were generated to measure performance time, feature count, etc. Based on the results, improvement parameters are also provided (Asperl, 2005).

  14. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    . To illustrate these concepts a number of examples are used. These include models of polymer membranes, distillation and catalyst behaviour. Some detailed considerations within these models are stated and discussed. Model generation concepts are introduced and ideas of a reference model are given that shows...

  15. An Integrated Package of Neuromusculoskeletal Modeling Tools in Simulink (TM)

    National Research Council Canada - National Science Library

    Davoodi, R

    2001-01-01

    .... Blocks representing the skeletal linkage, sensors, muscles, and neural controllers are developed using separate software tools and integrated in the powerful simulation environment of Simulink (Mathworks Inc., USA...

  16. influence.ME: tools for detecting influential data in mixed effects models

    NARCIS (Netherlands)

    Nieuwenhuis, Rense; te Grotenhuis, M.; Pelzer, B.

    2012-01-01

    influence.ME provides tools for detecting influential data in mixed effects models. The application of these models has become common practice, but the development of diagnostic tools has lagged behind. influence.ME calculates standardized measures of influential data for the point estimates of

  17. Developing an integration tool for soil contamination assessment

    Science.gov (United States)

    Anaya-Romero, Maria; Zingg, Felix; Pérez-Álvarez, José Miguel; Madejón, Paula; Kotb Abd-Elmabod, Sameh

    2015-04-01

    In recent decades, huge soil areas have been negatively influenced or altered in multiple forms. Soils and, consequently, underground water have been contaminated by the accumulation of contaminants from agricultural activities (fertilizers and pesticides), industrial activities (harmful material dumping, sludge, fly ash) and urban activities (hydrocarbons, metals from vehicle traffic, urban waste dumping). In the framework of the RECARE project, local partners across Europe are focusing on a wide range of soil threats, such as soil contamination, and aiming to develop effective prevention, remediation and restoration measures by designing and applying targeted land management strategies (van Lynden et al., 2013). In this context, the Guadiamar Green Corridor (Southern Spain) was used as a case study, aiming to obtain soil data and new information in order to assess soil contamination. The main threat in the Guadiamar valley is soil contamination after a mine spill that occurred in April 1998. About four hm3 of acid waters and two hm3 of mud, rich in heavy metals, were released into the Agrio and Guadiamar rivers, affecting more than 4,600 ha of agricultural and pasture land. The main trace elements contaminating soil and water were As, Cd, Cu, Pb, Tl and Zn. The objective of the present research is to develop informatics tools that integrate soil databases, models and interactive platforms for soil contamination assessment. Preliminary results were obtained related to the compilation of harmonized databases including geographical, hydro-meteorological, soil and socio-economic variables, based on spatial analysis and stakeholder consultation. Further research will address modelling and upscaling at the European level, in order to obtain a scientific-technical predictive tool for the assessment of soil contamination.

  18. A student-centered approach for developing active learning: the construction of physical models as a teaching tool in medical physiology.

    Science.gov (United States)

    Rezende-Filho, Flávio Moura; da Fonseca, Lucas José Sá; Nunes-Souza, Valéria; Guedes, Glaucevane da Silva; Rabelo, Luiza Antas

    2014-09-15

    Teaching physiology, a complex and constantly evolving subject, is not a simple task. A considerable body of knowledge about cognitive processes and teaching and learning methods has accumulated over the years, helping teachers to determine the most efficient way to teach and highlighting students' active participation as a means to improve learning outcomes. In this context, this paper describes and qualitatively analyzes an experience of a student-centered teaching-learning methodology based on the construction of physiological-physical models, focusing on their possible application in the practice of teaching physiology. After attending physiology classes and reviewing the literature, students, divided into small groups, built physiological-physical models, predominantly using low-cost materials, for studying different topics in physiology. Groups were followed by monitors and guided by teachers during the whole process, finally presenting the results in a Symposium on Integrative Physiology. Throughout the proposed activities, students efficiently created physiological-physical models (118 in total) that were highly representative of different physiological processes. The implementation of the proposal indicated that students successfully achieved active and meaningful learning in physiology while addressing multiple learning styles. The proposed method has proved to be an attractive, accessible and relatively simple approach to facilitate the physiology teaching-learning process, while facing difficulties imposed by recent requirements, especially those relating to the use of experimental animals and professional training guidelines. Finally, students' active participation in the production of knowledge may result in a holistic education and, possibly, better professional practices.

  19. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    Full Text Available As port performance becomes increasingly correlated with national competitiveness, the issue of port performance evaluation has gained significant attention. Port performance can simply be indicated by port service levels to the ship (e.g., throughput, waiting time for berthing, etc.), as well as by the utilization level of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop related policies for making the port more effective and efficient. However, the evaluation is frequently conducted based on a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impacts of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is further developed using MATLAB and Simulink based on queuing theory.
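
    The abstract does not give implementation details, but a queuing-theory-based port microsimulation of this general kind can be sketched in a few lines. The example below (Python rather than MATLAB/Simulink, with invented arrival, service and berth parameters) simulates an M/M/c berth queue and reports the mean ship waiting time.

```python
import numpy as np

def simulate_port(n_ships=10_000, arrival_rate=2.0, service_rate=0.25, n_berths=10, seed=1):
    """Minimal M/M/c berth queue: returns the mean ship waiting time in hours."""
    rng = np.random.default_rng(seed)
    interarrival = rng.exponential(1.0 / arrival_rate, n_ships)   # hours between arrivals
    service = rng.exponential(1.0 / service_rate, n_ships)        # berth occupancy per ship
    arrival_times = np.cumsum(interarrival)
    berth_free_at = np.zeros(n_berths)                            # when each berth becomes free
    waits = np.empty(n_ships)
    for i, t in enumerate(arrival_times):
        b = int(np.argmin(berth_free_at))                         # earliest-available berth
        start = max(t, berth_free_at[b])
        waits[i] = start - t
        berth_free_at[b] = start + service[i]
    return waits.mean()

print(f"Mean waiting time: {simulate_port():.2f} h")
```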

  20. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    Science.gov (United States)

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT) is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...

  1. Effectiveness of operation tools developed by KEKB operators

    International Nuclear Information System (INIS)

    Sugino, K.; Satoh, Y.; Kitabayashi, T.

    2004-01-01

    The main tasks of KEKB (High Energy Accelerator Research Organization B-physics) operators are beam tuning and injection, operation logging, monitoring of accelerator conditions and safety management. New beam tuning methods are frequently applied to KEKB in order to accomplish high luminosity. In such a situation, various operation tools have been developed by the operators to realize efficient operation. In this paper, we describe effectiveness of tools developed by the operators. (author)

  2. Measuring vaccine hesitancy: The development of a survey tool.

    OpenAIRE

    Larson, HJ; Jarrett, C; Schulz, WS; Chaudhuri, M; Zhou, Y; Dube, E; Schuster, M; MacDonald, NE; Wilson, R; SAGE Working Group on Vaccine Hesitancy,; , COLLABORATORS; Eskola, J; Liang, X; Chaudhuri, M; Dubé, E

    2015-01-01

    : In March 2012, the SAGE Working Group on Vaccine Hesitancy was convened to define the term "vaccine hesitancy", as well as to map the determinants of vaccine hesitancy and develop tools to measure and address the nature and scale of hesitancy in settings where it is becoming more evident. The definition of vaccine hesitancy and a matrix of determinants guided the development of a survey tool to assess the nature and scale of hesitancy issues. Additionally, vaccine hesitancy questi...

  3. Conceptual Models as Tools for Communication Across Disciplines

    Directory of Open Access Journals (Sweden)

    Marieke Heemskerk

    2003-12-01

    Full Text Available To better understand and manage complex social-ecological systems, social scientists and ecologists must collaborate. However, issues related to language and research approaches can make it hard for researchers in different fields to work together. This paper suggests that researchers can improve interdisciplinary science through the use of conceptual models as a communication tool. The authors share lessons from a workshop in which interdisciplinary teams of young scientists developed conceptual models of social-ecological systems using data sets and metadata from Long-Term Ecological Research sites across the United States. Both the process of model building and the models that were created are discussed. The exercise revealed that the presence of social scientists in a group influenced the place and role of people in the models. This finding suggests that the participation of both ecologists and social scientists in the early stages of project development may produce better questions and more accurate models of interactions between humans and ecosystems. Although the participants agreed that a better understanding of human intentions and behavior would advance ecosystem science, they felt that interdisciplinary research might gain more by training strong disciplinarians than by merging ecology and social sciences into a new field. It is concluded that conceptual models can provide an inspiring point of departure and a guiding principle for interdisciplinary group discussions. Jointly developing a model not only helped the participants to formulate questions, clarify system boundaries, and identify gaps in existing data, but also revealed the thoughts and assumptions of fellow scientists. Although the use of conceptual models will not serve all purposes, the process of model building can help scientists, policy makers, and resource managers discuss applied problems and theory among themselves and with those in other areas.

  4. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
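
    As an illustration of the kind of deterministic, physics-based unit test described above (this is not EXOSIMS code; the integration-time function and its photon-noise-only model are simplified assumptions), a test can pin a calculation to a hand-computed value and to an expected scaling law.

```python
import unittest

def integration_time(flux_planet, flux_background, snr_target, collecting_area, throughput):
    """Hypothetical signal-limited integration time (photon noise only):
    SNR = S*t / sqrt((S + B) * t)  =>  t = SNR^2 * (S + B) / S^2,
    where S and B are photon rates after aperture and throughput losses."""
    s = flux_planet * collecting_area * throughput
    b = flux_background * collecting_area * throughput
    return snr_target**2 * (s + b) / s**2

class TestIntegrationTime(unittest.TestCase):
    def test_matches_hand_calculation(self):
        # Deterministic check against a value worked out by hand: 5^2 * 100 / 100 = 25.
        t = integration_time(flux_planet=10.0, flux_background=90.0,
                             snr_target=5.0, collecting_area=1.0, throughput=1.0)
        self.assertAlmostEqual(t, 25.0, places=6)

    def test_scales_with_snr_squared(self):
        # Doubling the target SNR should quadruple the required integration time.
        t1 = integration_time(10.0, 90.0, 5.0, 12.6, 0.3)
        t2 = integration_time(10.0, 90.0, 10.0, 12.6, 0.3)
        self.assertAlmostEqual(t2 / t1, 4.0, places=6)

if __name__ == "__main__":
    unittest.main()
```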

  5. The EDF/SEPTEN crisis team calculation tools and models

    International Nuclear Information System (INIS)

    De Magondeaux, B.; Grimaldi, X.

    1993-01-01

    Electricite de France (EDF) has developed a set of simplified tools and models called TOUTEC and CRISALIDE which are intended to be used by the French utility's National Crisis Team to perform diagnosis and prognosis during an emergency situation. As a severe accident could have important radiological consequences, this method is focused on the diagnosis of the state of the safety barriers and on the prognosis of their behaviour. These tools allow the crisis team to provide public authorities with information on the radiological risk and to provide advice on managing the accident on the damaged unit. At the first level, TOUTEC is intended to complement the handbook with simplified calculation models and predefined relationships, avoiding tedious calculation under stress conditions. The main items are the calculation of the primary circuit breach size and the evaluation of hydrogen overpressurization. The set of models called CRISALIDE is devoted to evaluating the following critical parameters: the delay before core uncovery, which would signify more severe consequences if it occurs; containment pressure behaviour; and finally the source term. With these models, the crisis team is able to take into account combinations of boundary conditions according to safety and auxiliary system availability.

  6. ThermoFit: A Set of Software Tools, Protocols and Schema for the Organization of Thermodynamic Data and for the Development, Maintenance, and Distribution of Internally Consistent Thermodynamic Data/Model Collections

    Science.gov (United States)

    Ghiorso, M. S.

    2013-12-01

    Internally consistent thermodynamic databases are critical resources that facilitate the calculation of heterogeneous phase equilibria and thereby support geochemical, petrological, and geodynamical modeling. These 'databases' are actually derived data/model systems that depend on a diverse suite of physical property measurements, calorimetric data, and experimental phase equilibrium brackets. In addition, such databases are calibrated with the adoption of various models for extrapolation of heat capacities and volumetric equations of state to elevated temperature and pressure conditions. Finally, these databases require specification of thermochemical models for the mixing properties of solid, liquid, and fluid solutions, which are often rooted in physical theory and, in turn, depend on additional experimental observations. The process of 'calibrating' a thermochemical database involves considerable effort and an extensive computational infrastructure. Because of these complexities, the community tends to rely on a small number of thermochemical databases, generated by a few researchers; these databases often have limited longevity and are universally difficult to maintain. ThermoFit is a software framework and user interface whose aim is to provide a modeling environment that facilitates creation, maintenance and distribution of thermodynamic data/model collections. Underlying ThermoFit are data archives of fundamental physical property, calorimetric, crystallographic, and phase equilibrium constraints that provide the essential experimental information from which thermodynamic databases are traditionally calibrated. ThermoFit standardizes schema for accessing these data archives and provides web services for data mining these collections. Beyond simple data management and interoperability, ThermoFit provides a collection of visualization and software modeling tools that streamline the model/database generation process. Most notably, ThermoFit facilitates the

  7. Development and Application of a Protocol for Definition of Process Conditions for Directional Solidification: Integrating Fundamental Theory, Experimentation and Modeling Tools (Preprint)

    Science.gov (United States)

    2012-03-01

    Only fragments of this report (AFRL-RX-WP-TP-2012-0252, "Development and Application of a Protocol for Definition of Process Conditions for Directional Solidification") are available in the record: a figure caption, "Defect map for directional solidification with superimposed predicted preferred solidification conditions for a range of bar thicknesses and mold ...", and partial references, including J.J. Schirra (The Minerals, Metals & Materials Society, 2000) 189-200 and A.J. Elliott, "Directional Solidification of Large Cross-Section Ni-Base ...".

  8. A proposed adaptation of the European Foundation for Quality Management Excellence Model to physical activity programmes for the elderly - development of a quality self-assessment tool using a modified Delphi process

    Directory of Open Access Journals (Sweden)

    Marques Ana I

    2011-09-01

    Full Text Available Abstract Background There has been a growing concern in designing physical activity (PA) programmes for elderly people, since evidence suggests that such health promotion interventions may reduce the deleterious effects of the ageing process. Complete programme evaluations are a necessary prerequisite to continuous quality improvements. Being able to refine, adapt and create tools that are suited to the realities and contexts of PA programmes for the elderly in order to support their continuous improvement is, therefore, crucial. Thus, the aim of this study was to develop a self-assessment tool for PA programmes for the elderly. Methods A 3-round Delphi process was conducted via the Internet with 43 national experts in PA for the elderly, management and delivery of PA programmes for the elderly, sports management, quality management and gerontology, asking experts to identify the propositions that they considered relevant for inclusion in the self-assessment tool. Experts reviewed a list of proposed statements, based on the criteria and sub-criteria from the European Foundation for Quality Management Excellence Model (EFQM) and PA guidelines for older adults, and rated each proposition from 1 to 8 (disagree to agree) and modified and/or added propositions. Propositions receiving either bottom or top scores of greater than 70% were considered to have achieved consensus to drop or retain, respectively. Results In round 1, of the 196 originally-proposed statements (best practice principles), the experts modified 41, added 1 and achieved consensus on 93. In round 2, a total of 104 propositions were presented, of which experts modified 39 and achieved consensus on 53. In the last round, of 51 proposed statements, the experts achieved consensus on 19. After 3 rounds of rating, experts had not achieved consensus on 32 propositions. The resulting tool consisted of 165 statements that assess nine management areas involved in the development of PA programmes for

  9. A proposed adaptation of the European Foundation for Quality Management Excellence Model to physical activity programmes for the elderly - development of a quality self-assessment tool using a modified Delphi process.

    Science.gov (United States)

    Marques, Ana I; Santos, Leonel; Soares, Pedro; Santos, Rute; Oliveira-Tavares, António; Mota, Jorge; Carvalho, Joana

    2011-09-29

    There has been a growing concern in designing physical activity (PA) programmes for elderly people, since evidence suggests that such health promotion interventions may reduce the deleterious effects of the ageing process. Complete programme evaluations are a necessary prerequisite to continuous quality improvements. Being able to refine, adapt and create tools that are suited to the realities and contexts of PA programmes for the elderly in order to support its continuous improvement is, therefore, crucial. Thus, the aim of this study was to develop a self-assessment tool for PA programmes for the elderly. A 3-round Delphi process was conducted via the Internet with 43 national experts in PA for the elderly, management and delivery of PA programmes for the elderly, sports management, quality management and gerontology, asking experts to identify the propositions that they considered relevant for inclusion in the self-assessment tool. Experts reviewed a list of proposed statements, based on the criteria and sub-criteria from the European Foundation for Quality Management Excellence Model (EFQM) and PA guidelines for older adults and rated each proposition from 1 to 8 (disagree to agree) and modified and/or added propositions. Propositions receiving either bottom or top scores of greater than 70% were considered to have achieved consensus to drop or retain, respectively. In round 1, of the 196 originally-proposed statements (best practice principles), the experts modified 41, added 1 and achieved consensus on 93. In round 2, a total of 104 propositions were presented, of which experts modified 39 and achieved consensus on 53. In the last round, of 51 proposed statements, the experts achieved consensus on 19. After 3 rounds of rating, experts had not achieved consensus on 32 propositions. The resulting tool consisted of 165 statements that assess nine management areas involved in the development of PA programmes for the elderly. Based on experts' opinions, a self
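
    The 70% consensus rule used in both Delphi studies above is easy to express programmatically. The sketch below is illustrative only: the definition of the "top" and "bottom" score bands on the 1-8 scale is an assumption, and the ratings are randomly generated.

```python
import numpy as np

def delphi_round(ratings: np.ndarray, threshold=0.70, bottom=(1, 2), top=(7, 8)):
    """Classify each proposition (column) as 'retain', 'drop' or 'next round'.

    ratings: experts x propositions matrix of scores on a 1-8 scale.
    A proposition reaches consensus when more than `threshold` of experts
    rate it in the top band (retain) or the bottom band (drop).
    """
    decisions = []
    n_experts = ratings.shape[0]
    for col in ratings.T:
        frac_top = np.isin(col, top).sum() / n_experts
        frac_bottom = np.isin(col, bottom).sum() / n_experts
        if frac_top > threshold:
            decisions.append("retain")
        elif frac_bottom > threshold:
            decisions.append("drop")
        else:
            decisions.append("next round")
    return decisions

# Hypothetical round with 43 experts and 5 propositions.
rng = np.random.default_rng(3)
scores = rng.integers(1, 9, size=(43, 5))
print(delphi_round(scores))
```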

  10. The scientific modeling assistant: An advanced software tool for scientific model building

    Science.gov (United States)

    Keller, Richard M.; Sims, Michael H.

    1991-01-01

    Viewgraphs on the scientific modeling assistant: an advanced software tool for scientific model building are presented. The objective is to build a specialized software tool to assist in scientific model-building.

  11. KENO3D Visualization Tool for KENO V.a and KENO-VI Geometry Models

    International Nuclear Information System (INIS)

    Horwedel, J.E.; Bowman, S.M.

    2000-01-01

    Criticality safety analyses often require detailed modeling of complex geometries. Effective visualization tools can enhance checking the accuracy of these models. This report describes the KENO3D visualization tool developed at the Oak Ridge National Laboratory (ORNL) to provide visualization of KENO V.a and KENO-VI criticality safety models. The development of KENO3D is part of the current efforts to enhance the SCALE (Standardized Computer Analyses for Licensing Evaluations) computer software system

  12. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    Science.gov (United States)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  13. Development Of Remote Hanford Connector Gasket Replacement Tooling For DWPF

    International Nuclear Information System (INIS)

    Krementz, D.; Coughlin, Jeffrey

    2009-01-01

    The Defense Waste Processing Facility (DWPF) requested the Savannah River National Laboratory (SRNL) to develop tooling and equipment to remotely replace gaskets in mechanical Hanford connectors to reduce personnel radiation exposure as compared to the current hands-on method. It is also expected that radiation levels will continually increase with future waste streams. The equipment is operated in the Remote Equipment Decontamination Cell (REDC), which is equipped with compressed air, two master-slave manipulators (MSM's) and an electro-mechanical manipulator (EMM) arm for operation of the remote tools. The REDC does not provide access to electrical power, so the equipment must be manually or pneumatically operated. The MSM's have a load limit at full extension of ten pounds, which limited the weight of the installation tool. In order to remotely replace Hanford connector gaskets, several operations must be performed remotely; these include: removal of the spent gasket and retaining ring (the retaining ring is also called a snap ring), loading the new snap ring and gasket into the installation tool, and installation of the new gasket into the Hanford connector. SRNL developed and tested tools that successfully perform all of the necessary tasks. Removal of snap rings from horizontal and vertical connectors is performed by separate air-actuated retaining ring removal tools and is manipulated in the cell by the MSM. In order to install a new gasket, the snap ring loader is used to load a new snap ring into a groove in the gasket installation tool. A new gasket is placed on the installation tool and retained by custom springs. An MSM lifts the installation tool and presses the mounted gasket against the connector block. Once the installation tool is in position, the gasket and snap ring are installed onto the connector by pneumatic actuation. All of the tools are located on a custom work table with a pneumatic valve station that directs compressed air to the desired tool and

  14. DEVELOPMENT OF REMOTE HANFORD CONNECTOR GASKET REPLACEMENT TOOLING FOR DWPF

    Energy Technology Data Exchange (ETDEWEB)

    Krementz, D.; Coughlin, Jeffrey

    2009-05-05

    The Defense Waste Processing Facility (DWPF) requested the Savannah River National Laboratory (SRNL) to develop tooling and equipment to remotely replace gaskets in mechanical Hanford connectors to reduce personnel radiation exposure as compared to the current hands-on method. It is also expected that radiation levels will continually increase with future waste streams. The equipment is operated in the Remote Equipment Decontamination Cell (REDC), which is equipped with compressed air, two master-slave manipulators (MSM's) and an electro-mechanical manipulator (EMM) arm for operation of the remote tools. The REDC does not provide access to electrical power, so the equipment must be manually or pneumatically operated. The MSM's have a load limit at full extension of ten pounds, which limited the weight of the installation tool. In order to remotely replace Hanford connector gaskets, several operations must be performed remotely; these include: removal of the spent gasket and retaining ring (the retaining ring is also called a snap ring), loading the new snap ring and gasket into the installation tool, and installation of the new gasket into the Hanford connector. SRNL developed and tested tools that successfully perform all of the necessary tasks. Removal of snap rings from horizontal and vertical connectors is performed by separate air-actuated retaining ring removal tools and is manipulated in the cell by the MSM. In order to install a new gasket, the snap ring loader is used to load a new snap ring into a groove in the gasket installation tool. A new gasket is placed on the installation tool and retained by custom springs. An MSM lifts the installation tool and presses the mounted gasket against the connector block. Once the installation tool is in position, the gasket and snap ring are installed onto the connector by pneumatic actuation. All of the tools are located on a custom work table with a pneumatic valve station that directs compressed air to the desired

  15. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Science.gov (United States)

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  16. Scratch as a computational modelling tool for teaching physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-05-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling programs. In this article, we briefly discuss why Scratch could be a useful tool for computational modelling in the primary or secondary physics classroom, and we present practical examples of how it can be used to build a model.
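
    Scratch itself is block-based, so its programs cannot be reproduced here directly; the snippet below sketches, in Python, the same kind of simple time-stepped physics model (a ball thrown upward, with illustrative values) that the article proposes students assemble from Scratch blocks and then compare against expected behaviour.

```python
# Minimal time-stepped model of a ball thrown upward, analogous to the kind of
# motion model a student might assemble from Scratch blocks (values illustrative).
dt = 0.05                  # time step in seconds
g = -9.8                   # gravitational acceleration, m/s^2
y, v, t = 0.0, 12.0, 0.0   # initial height, initial velocity, elapsed time

while y >= 0.0:
    v += g * dt            # update velocity from acceleration
    y += v * dt            # update position from velocity
    t += dt

print(f"The ball lands after roughly {t:.2f} s")   # analytic answer ~ 2*v0/g = 2.45 s
```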

  17. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  18. Spatial Modeling Tools for Cell Biology

    Science.gov (United States)

    2006-10-01

    of the cell's total volume. The cytosol contains thousands of enzymes that are responsible for the catalyzation of glycolysis and gluconeogenesis ... dog, swine and pig models [Pantely, 1990, 1991; Stanley 1992]. In these studies, blood flow through the left anterior descending (LAD) coronary ... perfusion. In conclusion, even though our model falls within the (rather large) error bounds of experimental dog, pig and swine models, the

  19. SBML qualitative models: a model representation format and infrastructure to foster interactions between qualitative modelling formalisms and tools.

    Science.gov (United States)

    Chaouiya, Claudine; Bérenguier, Duncan; Keating, Sarah M; Naldi, Aurélien; van Iersel, Martijn P; Rodriguez, Nicolas; Dräger, Andreas; Büchel, Finja; Cokelaer, Thomas; Kowal, Bryan; Wicks, Benjamin; Gonçalves, Emanuel; Dorier, Julien; Page, Michel; Monteiro, Pedro T; von Kamp, Axel; Xenarios, Ioannis; de Jong, Hidde; Hucka, Michael; Klamt, Steffen; Thieffry, Denis; Le Novère, Nicolas; Saez-Rodriguez, Julio; Helikar, Tomáš

    2013-12-10

    Qualitative frameworks, especially those based on the logical discrete formalism, are increasingly used to model regulatory and signalling networks. A major advantage of these frameworks is that they do not require precise quantitative data, and that they are well-suited for studies of large networks. While numerous groups have developed specific computational tools that provide original methods to analyse qualitative models, a standard format to exchange qualitative models has been missing. We present the Systems Biology Markup Language (SBML) Qualitative Models Package ("qual"), an extension of the SBML Level 3 standard designed for computer representation of qualitative models of biological networks. We demonstrate the interoperability of models via SBML qual through the analysis of a specific signalling network by three independent software tools. Furthermore, the collective effort to define the SBML qual format paved the way for the development of LogicalModel, an open-source model library, which will facilitate the adoption of the format as well as the collaborative development of algorithms to analyse qualitative models. SBML qual allows the exchange of qualitative models among a number of complementary software tools. SBML qual has the potential to promote collaborative work on the development of novel computational approaches, as well as on the specification and the analysis of comprehensive qualitative models of regulatory and signalling networks.
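
    Independently of the SBML qual syntax itself, the logical formalism it encodes can be illustrated with a toy Boolean network whose species and rules are invented for this sketch; each synchronous step applies every regulatory rule to the current state.

```python
# Toy Boolean network in the spirit of the logical models SBML qual encodes.
# Species and regulatory rules are invented for illustration only.
rules = {
    "A": lambda s: s["C"],                 # A is activated by C
    "B": lambda s: s["A"] and not s["C"],  # B needs A and is inhibited by C
    "C": lambda s: not s["B"],             # C is repressed by B
}

def synchronous_step(state: dict) -> dict:
    """Apply every update rule to the current state simultaneously."""
    return {species: int(rule(state)) for species, rule in rules.items()}

state = {"A": 0, "B": 0, "C": 1}
for step in range(6):
    print(step, state)
    state = synchronous_step(state)
```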

  20. Embedded Systems Development Tools: A MODUS-oriented Market Overview

    Directory of Open Access Journals (Sweden)

    Loupis Michalis

    2014-03-01

    Full Text Available Background: The embedded systems technology has perhaps been the most dominant technology in high-tech industries in the past decade. The industry has correctly identified the potential of this technology and has put its efforts into exploring its full potential. Objectives: The goal of the paper is to explore the versatility of applications in embedded system development based on one FP7-SME project. Methods/Approach: Embedded applications normally demand high resilience and quality, as well as conformity to quality standards and rigid performance requirements. As a result, embedded system developers have adopted software methods that yield high quality. The qualitative approach to examining embedded systems development tools has been applied in this work. Results: This paper presents a MODUS-oriented market analysis in the domains of Formal Verification tools, HW/SW co-simulation tools, Software Performance Optimization tools and Code Generation tools. Conclusions: The versatility of applications this technology serves is remarkable. With all this performance potential, the technology has carried with it a large number of issues that the industry needs to resolve in order to harness its full potential. The MODUS project toolset addressed four discrete domains of the ESD software market, in which corresponding open tools were developed.

  1. Fuzzy regression modeling for tool performance prediction and degradation detection.

    Science.gov (United States)

    Li, X; Er, M J; Lim, B S; Zhou, J H; Gan, O P; Rutkowski, L

    2010-10-01

    In this paper, the viability of using Fuzzy-Rule-Based Regression Modeling (FRM) algorithm for tool performance and degradation detection is investigated. The FRM is developed based on a multi-layered fuzzy-rule-based hybrid system with Multiple Regression Models (MRM) embedded into a fuzzy logic inference engine that employs Self Organizing Maps (SOM) for clustering. The FRM converts a complex nonlinear problem to a simplified linear format in order to further increase the accuracy in prediction and rate of convergence. The efficacy of the proposed FRM is tested through a case study - namely to predict the remaining useful life of a ball nose milling cutter during a dry machining process of hardened tool steel with a hardness of 52-54 HRc. A comparative study is further made between four predictive models using the same set of experimental data. It is shown that the FRM is superior as compared with conventional MRM, Back Propagation Neural Networks (BPNN) and Radial Basis Function Networks (RBFN) in terms of prediction accuracy and learning speed.
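
    The FRM described above combines SOM clustering with local multiple-regression models inside a fuzzy inference engine. The sketch below is a deliberately simplified, crisp analogue (k-means instead of a SOM, hard cluster assignment instead of fuzzy blending, synthetic data) that conveys the "partition the input space, then fit a local linear model per region" idea rather than the authors' algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical features (e.g., cutting time, vibration RMS) and a tool-wear response.
X = rng.uniform(0, 1, size=(300, 2))
y = 0.2 + 1.5 * X[:, 0] + np.where(X[:, 1] > 0.5, 2.0 * X[:, 1], 0.3 * X[:, 1])
y += rng.normal(0, 0.05, size=300)

# 1) Partition the input space (the paper uses a Self-Organizing Map; k-means here).
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# 2) Fit one linear regression (intercept + two slopes) per cluster: the local models.
local_models = {}
for c in np.unique(clusters):
    mask = clusters == c
    Xc = np.column_stack([np.ones(mask.sum()), X[mask]])
    coef, *_ = np.linalg.lstsq(Xc, y[mask], rcond=None)
    local_models[c] = coef

# 3) A prediction would use the model of the nearest cluster (a crisp stand-in
#    for the fuzzy blending of rule outputs used in the paper).
print({c: np.round(coef, 2) for c, coef in local_models.items()})
```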

  2. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  3. Mainstreaming Low-Carbon Climate-Resilient growth pathways into Development Finance Institutions' activities. A research project on the standards, tools and metrics to support transition to the low-carbon climate-resilient development models. Paper 1 - Climate and development finance institutions: linking climate finance, development finance and the transition to low-carbon, climate-resilient economic models

    International Nuclear Information System (INIS)

    Eschalier, Claire; Cochran, Ian; Deheza, Mariana; Risler, Ophelie; Forestier, Pierre

    2015-10-01

    Development finance institutions (DFIs) are in a position to be key actors in aligning development and the 2 deg. challenge. One of the principal challenges today is to scale-up the financial flows to the trillions of dollars per year necessary to achieve the 2 deg. C long-term objectives. Achieving this transition to a low-carbon, climate resilient (LCCR) economic model requires the integration or 'mainstreaming' of climate issues as a prism through which all investment decisions should be made. This paper presents an overview of the opportunities and challenges of linking a LCCR transition with the objectives of development finance. It first presents the two-fold challenge of climate change and development for countries around the world. Second, the paper explores the role of development finance institutions and their support for the transition to a low-carbon, climate-resilient economic model. Finally, it examines a necessary paradigm shift to integrate climate and development objectives to establish a 'LCCR development model' able to simultaneously tackling development priorities and needs for resilient, low-carbon growth. This will necessitate a move from focusing on a 'siloed' vision of climate finance to a means of aligning activities across the economy with the LCCR objectives to ensure that the majority of investments are coherent with this long-term transition. (authors)

  4. Developing Free and Open Source Interactive Teaching Tools

    Science.gov (United States)

    Nelson, E.

    2016-12-01

    Online learning has become an embedded component of education, but existing resources are often provided as institution-hosted content management systems (that may or may not be closed source). Creating interactive online applets to enhance student education is an alternative to these limited-customization systems that can be accomplished on a small budget. This presentation will break down the anatomy of author-developed online teaching tools created with open source packages to provide a survey of the development tools utilized—from the underlying website framework to interfacing with the scientific data. The availability of hosting and maintaining interactive teaching tools, whether static or dynamic, on no- or low-cost platforms will also be discussed. By constructing an interactive teaching tool from the ground up, scientists and educators are afforded complete flexibility and creativity in the design.
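
    As a concrete, hypothetical example of the "small budget, open source" approach described above (the framework choice, the route and the dataset are illustrative assumptions, not taken from the presentation), a minimal Flask service can feed an interactive front-end applet with model data.

```python
# Minimal sketch of the server side of a small interactive teaching applet using
# Flask, an open-source micro-framework; the endpoint and data are illustrative.
import math
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/decay/<float:half_life>")
def decay_curve(half_life):
    """Return points of an exponential-decay curve for the front-end to plot."""
    t = [i * 0.5 for i in range(41)]                               # 0 .. 20 time units
    n = [math.exp(-math.log(2) * ti / half_life) for ti in t]      # remaining fraction
    return jsonify({"t": t, "remaining_fraction": n})

if __name__ == "__main__":
    app.run(debug=True)   # free tiers of many hosting platforms can serve apps like this
```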

  5. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

    The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining system performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy to use computer aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information, a data dictionary interpreter to coordinate analysis; and a data base for storing system designs and results of analysis.

  6. The Development of a Tool for Sustainable Building Design:

    DEFF Research Database (Denmark)

    Tine Ring Hansen, Hanne; Knudstrup, Mary-Ann

    2009-01-01

    architecture will gain more focus in the coming years, thus, establishing the need for the development of a new tool and methodology, The paper furthermore describes the background and considerations involved in the development of a design support tool for sustainable building design. A tool which considers...... for sustainable buildings, as well as, an analysis of the relationship between the different approaches (e.g. low-energy, environmental, green building, solar architecture, bio-climatic architecture etc.) to sustainable building design and these indicators. The paper furthermore discusses how sustainable...... the context that the building is located in, as well as, a tool which facilitates the discussion of which type of sustainability is achieved in specific projects....

  7. Developing and Validating a New Classroom Climate Observation Assessment Tool.

    Science.gov (United States)

    Leff, Stephen S; Thomas, Duane E; Shapiro, Edward S; Paskewich, Brooke; Wilson, Kim; Necowitz-Hoffman, Beth; Jawad, Abbas F

    2011-01-01

    The climate of school classrooms, shaped by a combination of teacher practices and peer processes, is an important determinant for children's psychosocial functioning and is a primary factor affecting bullying and victimization. Given that there are relatively few theoretically-grounded and validated assessment tools designed to measure the social climate of classrooms, our research team developed an observation tool through participatory action research (PAR). This article details how the assessment tool was designed and preliminarily validated in 18 third-, fourth-, and fifth-grade classrooms in a large urban public school district. The goals of this study are to illustrate the feasibility of a PAR paradigm in measurement development, ascertain the psychometric properties of the assessment tool, and determine associations with different indices of classroom levels of relational and physical aggression.

  8. Development of a data capture tool for researching tech entrepreneurship

    DEFF Research Database (Denmark)

    Andersen, Jakob Axel Bejbro; Howard, Thomas J.; McAloone, Tim C.

    2014-01-01

    Startups play a crucial role in exploiting the commercial advantages created by new, advanced technologies. Surprisingly, the processes by which the entrepreneur commercialises these technologies are largely undescribed - partly due to the absence of appropriate process data capture tools....... This paper elucidates the requirements for such tools by drawing on knowledge of the entrepreneurial phenomenon and by building on the existing research tools used in design research. On this basis, the development of a capture method for tech startup processes is described and its potential discussed....

  9. Microfield exposure tool enables advances in EUV lithography development

    Energy Technology Data Exchange (ETDEWEB)

    Naulleau, Patrick

    2009-09-07

    With demonstrated resist resolution of 20 nm half pitch, the SEMATECH Berkeley BUV microfield exposure tool continues to push crucial advances in the areas of BUY resists and masks. The ever progressing shrink in computer chip feature sizes has been fueled over the years by a continual reduction in the wavelength of light used to pattern the chips. Recently, this trend has been threatened by unavailability of lens materials suitable for wavelengths shorter than 193 nm. To circumvent this roadblock, a reflective technology utilizing a significantly shorter extreme ultraviolet (EUV) wavelength (13.5 nm) has been under development for the past decade. The dramatic wavelength shrink was required to compensate for optical design limitations intrinsic in mirror-based systems compared to refractive lens systems. With this significant reduction in wavelength comes a variety of new challenges including developing sources of adequate power, photoresists with suitable resolution, sensitivity, and line-edge roughness characteristics, as well as the fabrication of reflection masks with zero defects. While source development can proceed in the absence of available exposure tools, in order for progress to be made in the areas of resists and masks it is crucial to have access to advanced exposure tools with resolutions equal to or better than that expected from initial production tools. These advanced development tools, however, need not be full field tools. Also, implementing such tools at synchrotron facilities allows them to be developed independent of the availability of reliable stand-alone BUY sources. One such tool is the SEMATECH Berkeley microfield exposure tool (MET). The most unique attribute of the SEMA TECH Berkeley MET is its use of a custom-coherence illuminator made possible by its implementation on a synchrotron beamline. With only conventional illumination and conventional binary masks, the resolution limit of the 0.3-NA optic is approximately 25 nm, however

  10. Method Engineering: Engineering of Information Systems Development Methods and Tools

    OpenAIRE

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e. the configuration of a project approach that is tuned to the project at hand. A language and support tool for the engineering of situational methods are discussed.

  11. MTK: An AI tool for model-based reasoning

    Science.gov (United States)

    Erickson, William K.; Schwartz, Mary R.

    1987-01-01

    A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Center is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control and trend analysis of the space station Thermal Management System (TMS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined, along with examples from the thermal system to highlight the motivating factors behind them. An overview of the capabilities of MTK is given.

  12. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed
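
    Tools such as GRESS instrument FORTRAN source so that derivatives are propagated through a code automatically. The core idea, forward-mode automatic differentiation, can be sketched with a minimal dual-number class; this is an illustration of the concept only, not GRESS itself.

```python
class Dual:
    """Minimal forward-mode AD value: carries f and df/dx together."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

    __rmul__ = __mul__

def model(k, t=2.0):
    # Toy response whose sensitivity to parameter k we want: y = 3*k*k*t + k
    return 3 * k * k * t + k

k0 = Dual(1.5, 1.0)          # seed derivative d(k)/d(k) = 1
y = model(k0)
print(y.value, y.deriv)      # y and dy/dk at k = 1.5 (analytic dy/dk = 12*k + 1 = 19)
```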

  13. A model of tool wear monitoring system for turning

    OpenAIRE

    Šimunović, Goran; Ficko, Mirko; Šarić, Tomislav; Milošević, Mijodrag; Antić, Aco

    2015-01-01

    Acquiring high-quality and timely information on the tool wear condition in real time, presents a necessary prerequisite for identification of tool wear degree, which significantly improves the stability and quality of the machining process. Defined in this paper is a model of tool wear monitoring system with special emphasis on the module for acquisition and processing of vibration acceleration signal by applying discrete wavelet transformations (DWT) in signal decomposition. The paper prese...
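
    The decomposition step described above can be sketched with the PyWavelets package; the synthetic signal, the wavelet choice and the band-energy feature below are illustrative assumptions rather than the authors' configuration.

```python
import numpy as np
import pywt  # PyWavelets

# Synthetic vibration-acceleration signal: a base tone plus broadband noise.
fs = 5000                                    # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
signal = np.sin(2 * np.pi * 120 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)

# Multi-level discrete wavelet decomposition (Daubechies-4, 3 levels).
coeffs = pywt.wavedec(signal, "db4", level=3)   # returns [cA3, cD3, cD2, cD1]

# Relative energy per sub-band is a common feature for wear-state classification.
energies = np.array([np.sum(c**2) for c in coeffs])
print(np.round(energies / energies.sum(), 3))
```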

  14. Models and Modelling Tools for Chemical Product and Process Design

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    2016-01-01

    The design, development and reliability of a chemical product and the process to manufacture it need to be consistent with the end-use characteristics of the desired product. One of the common ways to match the desired product-process characteristics is through trial and error based experiments, which can be expensive and time consuming. An alternative approach is the use of a systematic model-based framework according to an established work-flow in product-process design, replacing some of the time consuming and/or repetitive experimental steps. The advantages of the use of a model ... Illustrative examples highlighting the need for efficient model-based systems will be presented, where the need for predictive models for innovative chemical product-process design will be highlighted. The examples will cover aspects of chemical product-process design where the idea of the grand ...

  15. Developing Anticipatory Life Cycle Assessment Tools to Support Responsible Innovation

    Science.gov (United States)

    Wender, Benjamin

    Several prominent research strategy organizations recommend applying life cycle assessment (LCA) early in the development of emerging technologies. For example, the US Environmental Protection Agency, the National Research Council, the Department of Energy, and the National Nanotechnology Initiative identify the potential for LCA to inform research and development (R&D) of photovoltaics and products containing engineered nanomaterials (ENMs). In this capacity, application of LCA to emerging technologies may contribute to the growing movement for responsible research and innovation (RRI). However, existing LCA practices are largely retrospective and ill-suited to support the objectives of RRI. For example, barriers related to data availability, rapid technology change, and isolation of environmental from technical research inhibit application of LCA to developing technologies. This dissertation focuses on development of anticipatory LCA tools that incorporate elements of technology forecasting, provide robust explorations of uncertainty, and engage diverse innovation actors in overcoming retrospective approaches to environmental assessment and improvement of emerging technologies. Chapter one contextualizes current LCA practices within the growing literature articulating RRI and identifies the optimal place in the stage gate innovation model to apply LCA. Chapter one concludes with a call to develop anticipatory LCA---building on the theory of anticipatory governance---as a series of methodological improvements that seek to align LCA practices with the objectives of RRI. Chapter two provides a framework for anticipatory LCA, identifies where research from multiple disciplines informs LCA practice, and builds off the recommendations presented in the preceding chapter. Chapter two focuses on crystalline and thin film photovoltaics (PV) to illustrate the novel framework, in part because PV is an environmentally motivated technology undergoing extensive R&D efforts and

  16. DEVELOPMENT OF A WIRELINE CPT SYSTEM FOR MULTIPLE TOOL USAGE

    Energy Technology Data Exchange (ETDEWEB)

    Stephen P. Farrington; Martin L. Gildea; J. Christopher Bianchi

    1999-08-01

    The first phase of development of a wireline cone penetrometer system for multiple tool usage was completed under DOE award number DE-AR26-98FT40366. Cone penetrometer technology (CPT) has received widespread interest and is becoming more commonplace as a tool for environmental site characterization activities at several Department of Energy (DOE) facilities. Although CPT already offers many benefits for site characterization, the wireline system can improve CPT technology by offering greater utility and increased cost savings. Currently the use of multiple CPT tools during a site characterization (i.e. piezometric cone, chemical sensors, core sampler, grouting tool) must be accomplished by withdrawing the entire penetrometer rod string to change tools. This results in multiple penetrations being required to collect the data and samples that may be required during characterization of a site, and to subsequently seal the resulting holes with grout. The wireline CPT system allows multiple CPT tools to be interchanged during a single penetration, without withdrawing the CPT rod string from the ground. The goal of the project is to develop and demonstrate a system by which various tools can be placed at the tip of the rod string depending on the type of information or sample desired. Under the base contract, an interchangeable piezocone and grouting tool was designed, fabricated, and evaluated. The results of the evaluation indicate that success criteria for the base contract were achieved. In addition, the wireline piezocone tool was validated against ASTM standard cones, the depth capability of the system was found to compare favorably with that of conventional CPT, and the reliability and survivability of the system were demonstrated.

  17. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Nagao, Saichi; Takigawa, Yoshio; Kumakura, Toshimasa

    1999-03-01

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 with MPI and reduces the effort required for parallelization. This paper describes the development purpose, design, utilization and evaluation of KMtool. (author)

  18. Computer-based tools to support curriculum developers

    NARCIS (Netherlands)

    Nieveen, N.M.; Gustafson, Kent

    2000-01-01

    Since the early 90’s, an increasing number of people have been interested in supporting the complex tasks of the curriculum development process with computer-based tools. ‘Curriculum development’ refers to an intentional process or activity directed at (re) designing, developing and

  19. Agent Based Modeling as an Educational Tool

    Science.gov (United States)

    Fuller, J. H.; Johnson, R.; Castillo, V.

    2012-12-01

    Motivation is a key element in high school education. One way to improve motivation and provide content, while helping address critical thinking and problem solving skills, is to have students build and study agent based models in the classroom. This activity visually connects concepts with their applied mathematical representation. "Engaging students in constructing models may provide a bridge between frequently disconnected conceptual and mathematical forms of knowledge." (Levy and Wilensky, 2011) We wanted to discover the feasibility of implementing a model based curriculum in the classroom given current and anticipated core and content standards. (Accompanying figure captions: simulation using California GIS data; simulation of high school student lunch popularity using an aerial photograph over a terrain value map.)

  20. Development of probabilistic thinking-oriented learning tools for probability materials at junior high school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Hermanto, Didik

    2017-08-01

    This developmental research concerns probabilistic thinking-oriented learning tools for probability material taught to ninth-grade students, and it aims to produce a set of high-quality tools. The subjects were students of class IX-A at MTs Model Bangkalan. The development followed the 4-D model, modified to three stages: define, design and develop. The teaching and learning tools consist of a lesson plan, a students' worksheet, teaching media and a students' achievement test. The research instruments were a learning-tools validation sheet, a teachers' activity sheet, a students' activity sheet, a students' response questionnaire and the students' achievement test; their results were analyzed descriptively to answer the research objectives. The outcome is a validated set of probabilistic thinking-oriented learning tools for teaching probability to ninth-grade students: the tools were revised based on the validation, and the classroom experiment showed that the teacher's classroom management was effective, students' activities were good, students' responses to the tools were positive, and the achievement test met the validity, sensitivity and reliability criteria. In summary, teachers can use these learning tools to teach probability and develop students' probabilistic thinking.

  1. The Integrated Medical Model: A Decision Support Tool for In-flight Crew Health Care

    Science.gov (United States)

    Butler, Doug

    2009-01-01

    This viewgraph presentation reviews the development of an Integrated Medical Model (IMM) decision support tool for in-flight crew health care safety. Clinical methods, resources, and case scenarios are also addressed.

  2. The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods

    NARCIS (Netherlands)

    Verpoorten, Dominique; Poumay, M; Leclercq, D

    2006-01-01

    Please, cite this publication as: Verpoorten, D., Poumay, M., & Leclercq, D. (2006). The 8 Learning Events Model: a Pedagogic Conceptual Tool Supporting Diversification of Learning Methods. Proceedings of International Workshop in Learning Networks for Lifelong Competence Development, TENCompetence

  3. Physics-based Modeling Tools for Life Prediction and Durability Assessment of Advanced Materials, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The technical objectives of this program are: (1) to develop a set of physics-based modeling tools to predict the initiation of hot corrosion and to address pit and...

  4. Graphical Tools for Linear Structural Equation Modeling

    Science.gov (United States)

    2014-06-01

    Excerpts recovered from the report: the regression coefficient β_SA.CQ1 vanishes, which can be used to test whether the specification of Model 2 is compatible with the data; ... because they are all compatible with the graph in Figure 19a, which displays the skeleton and v-structures; note that we cannot reverse the edge from ... Reference: implications of linear structural equation models, R-428, <http://ftp.cs.ucla.edu/pub/stat_ser/r428.pdf>; to appear in Proceedings of AAAI-2014.

  5. Software Engineering Tools for Scientific Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We design and demonstrate the feasibility of extending the open source Eclipse integrated development environment (IDE) to support the full range of capabilities now...

  6. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many of these tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that helps data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can use the shared notebooks to perform analysis tasks or reproduce research results with far less effort.
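
    The record does not name the specific widget toolkit used, so the following is only a minimal sketch of the general practice it describes: wrapping an analytics step in interactive Jupyter Notebook controls so that non-programmers can rerun it. The ipywidgets library, the CSV file name and the column names are assumptions of this example, not details from the paper.

        # Minimal sketch (not the authors' code): wrapping an analytics step in
        # interactive controls inside a Jupyter Notebook using ipywidgets.
        # The CSV path and column names are hypothetical placeholders.
        import pandas as pd
        import ipywidgets as widgets

        df = pd.read_csv("cohort.csv")          # hypothetical healthcare dataset

        def summarize(column, threshold):
            """Recompute a simple summary whenever the user moves a control."""
            subset = df[df[column] >= threshold]
            print(f"{len(subset)} records with {column} >= {threshold}")
            print(subset[column].describe())

        # interact() turns the function into a small APP with a dropdown and a slider.
        widgets.interact(
            summarize,
            column=widgets.Dropdown(options=["age", "hba1c"], value="age"),
            threshold=widgets.FloatSlider(min=0, max=100, step=1, value=50),
        )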

  7. Development of bore tools for pipe welding and cutting

    Energy Technology Data Exchange (ETDEWEB)

    Oka, Kiyoshi; Ito, Akira; Takiguchi, Yuji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-04-01

    In the International Thermonuclear Experimental Reactor (ITER), replacement and maintenance of in-vessel components requires that the connected cooling pipes be cut and removed beforehand and that cooling pipes be rewelded to the newly installed components. All welds must be inspected for soundness after completion. These tasks demand a new working concept because the work takes place in shielded areas with access only through narrow ports. It therefore became necessary to develop autonomously moving welding and cutting tools that reach branch and main pipes by in-pipe access; a system was proposed that cuts and welds branch and main pipes after passing through pipe bends, and its elemental technologies were developed. This paper introduces the current development of tools for welding and cutting branch pipes and of other tools for welding and cutting the main pipe. (author)

  8. Toposcopy : A modelling tool for CITYGML

    NARCIS (Netherlands)

    Groneman, A.; Zlatanova, S.

    2009-01-01

    The new 3D standard CityGML has been attracting a lot of attention in the last few years. Many characteristics of the XML-based format make it suitable for storage and exchange of virtual 3D city models. It provides possibilities to store semantic and geometric information and has the potential to

  9. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, intended for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum operating values that maximize the plant's liquefaction under constraints on the other parameters. The resulting analysis provides a clear basis for choosing parameter values before the actual plant is implemented in the field, and it also indicates the productivity and profitability of the given plant configuration, leading to the design of an efficient, productive plant.
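
    As a hedged illustration of the energy balance that underlies such a Linde-cycle liquefaction analysis (not the Aspen HYSYS model described in the paper), the sketch below estimates the ideal Linde-Hampson liquid yield from three enthalpies. Nitrogen is used as a stand-in for air, the state points are idealized, and the use of the CoolProp property library is an assumption of this example.

        # Hedged sketch of the energy balance behind a simple Linde-Hampson cycle,
        # not the Aspen HYSYS model from the paper. Nitrogen stands in for air and
        # CoolProp supplies the enthalpies; state points are idealized.
        from CoolProp.CoolProp import PropsSI

        T_ambient = 300.0        # K, gas temperature at the warm end of the exchanger
        p_low = 1.013e5          # Pa, return-stream pressure
        p_high = 200e5           # Pa, compressor discharge pressure

        h1 = PropsSI("H", "T", T_ambient, "P", p_low, "Nitrogen")   # low-pressure return gas
        h2 = PropsSI("H", "T", T_ambient, "P", p_high, "Nitrogen")  # high-pressure feed gas
        hf = PropsSI("H", "P", p_low, "Q", 0, "Nitrogen")           # saturated liquid product

        # Steady-state energy balance around heat exchanger + JT valve + separator:
        #   h2 = y * hf + (1 - y) * h1   =>   y = (h1 - h2) / (h1 - hf)
        y = (h1 - h2) / (h1 - hf)
        print(f"Ideal liquid yield per unit mass of gas compressed: {y:.3f}")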

  10. Development of thick wall welding and cutting tools for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Nakahira, Masataka; Takahashi, Hiroyuki; Akou, Kentaro; Koizumi, Koichi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-04-01

    The vacuum vessel, a core component of the International Thermonuclear Experimental Reactor (ITER), must be exchanged remotely in the case of an accident such as a superconducting coil failure. In-vessel components such as the blanket and divertor are also planned to be exchanged or repaired. In these exchange and maintenance operations, thick wall welding and cutting are inevitable and remote handling tools are necessary. Thick wall welding and cutting tools for the blanket are under development in the ITER R and D program. The design requirement is to weld or cut 70 mm thick stainless steel in a narrow space. Tungsten inert gas (TIG) arc welding, plasma cutting and iodine laser welding/cutting were selected as the primary options. Elemental welding and cutting tests, design of compact tools satisfying the space requirement, trial fabrication and performance tests were performed. This paper reports the tool design and gives an overview of the welding and cutting tests. (author)

  11. Development of a Safety Management Web Tool for Horse Stables.

    Science.gov (United States)

    Leppälä, Jarkko; Kolstrup, Christina Lunner; Pinzke, Stefan; Rautiainen, Risto; Saastamoinen, Markku; Särkijärvi, Susanna

    2015-11-12

    Managing a horse stable involves risks, which can have serious consequences for the stable, employees, clients, visitors and horses. Existing industrial or farm production risk management tools are not directly applicable to horse stables and they need to be adapted for use by managers of different types of stables. As a part of the InnoEquine project, an innovative web tool, InnoHorse, was developed to support horse stable managers in business, safety, pasture and manure management. A literature review, empirical horse stable case studies, expert panel workshops and stakeholder interviews were carried out to support the design. The InnoHorse web tool includes a safety section containing a horse stable safety map, stable safety checklists, and examples of good practices in stable safety, horse handling and rescue planning. This new horse stable safety management tool can also help in organizing work processes in horse stables in general.

  12. Slab2 - Updated Subduction Zone Geometries and Modeling Tools

    Science.gov (United States)

    Moore, G.; Hayes, G. P.; Portner, D. E.; Furtney, M.; Flamme, H. E.; Hearne, M. G.

    2017-12-01

    The U.S. Geological Survey database of global subduction zone geometries (Slab1.0) is a highly utilized dataset that has been applied to a wide range of geophysical problems. In 2017, these models were improved and expanded upon as part of the Slab2 modeling effort. With a new data-driven approach that can be applied to a broader range of tectonic settings and geophysical data sets, we have generated a model set that will serve as a more comprehensive, reliable, and reproducible resource for three-dimensional slab geometries at all of the world's convergent margins. The newly developed framework of Slab2 is guided by: (1) a large integrated dataset, consisting of a variety of geophysical sources (e.g., earthquake hypocenters, moment tensors, active-source seismic survey images of the shallow slab, tomography models, receiver functions, bathymetry, trench ages, and sediment thickness information); (2) a dynamic filtering scheme aimed at constraining incorporated seismicity to only slab-related events; (3) a 3-D data interpolation approach which captures both high-resolution shallow geometries and instances of slab rollback and overlap at depth; and (4) an algorithm which incorporates uncertainties of contributing datasets to identify the most probable surface depth over the extent of each subduction zone. Further layers will also be added to the base geometry dataset, such as historic moment release, earthquake tectonic provenance, and interface coupling. Along with access to several queryable data formats, all components have been wrapped into an open source library in Python, such that suites of updated models can be released as further data becomes available. This presentation will discuss the extent of Slab2 development, as well as the current availability of the model and modeling tools.
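
    As an illustration only of how a gridded slab-depth surface can be queried once it is distributed in an open format, the sketch below interpolates depth at an arbitrary point with SciPy; the file name, grid layout and coordinate conventions are hypothetical, and this is not the Slab2 library's own API.

        # Illustrative only: querying a gridded slab-depth model with SciPy.
        # The file name, grid layout, and units are hypothetical; this is not
        # the Slab2 library's API.
        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Hypothetical regular grid: slab depth (km, positive down) on lat/lon axes.
        lats = np.linspace(35.0, 45.0, 101)
        lons = np.linspace(140.0, 150.0, 101)
        depth = np.load("slab_depth_grid.npy")      # shape (101, 101), hypothetical file

        interp = RegularGridInterpolator((lats, lons), depth,
                                         bounds_error=False, fill_value=np.nan)

        # Depth of the slab surface beneath a point of interest.
        print(interp([[38.3, 142.4]]))              # -> interpolated depth in km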

  13. Development of Sustainability Assessment Tool for Malaysian hydropower industry: A case study

    Science.gov (United States)

    Turan, Faiz Mohd; Johan, Kartina; Abu Sofian, Muhammad Irfan

    2018-04-01

    This research deals with the development of a sustainability assessment tool as a medium for assessing how well a hydropower project complies with sustainability practice. With the increasing need to implement sustainability practice, developed countries are using such tools to pursue the sustainable development goals, but their uptake in ASEAN countries, including Malaysia, is still low. The problem with most tools developed in other countries is that they are not comprehensive and their implementation factors are neither suited to the local environment nor quantified. Hence, a suitable sustainability assessment tool is needed for the Malaysian hydropower industry to comply with the sustainable development goals and to bridge the gap between the governing body and the practitioner. The work toward this goal is separated into several parts. The first is to identify sustainability parameters from established tools, used as models for comparison, and to refine new parameters. The second is to transfer equivalent quantification values from those models to the newly developed tool. The last is to develop a software program that gathers energy company feedback through systematic, surveyor-based sustainability reporting, so that sustainability assessment, monitoring and reporting can be integrated for self-improved reporting.

  14. MODEL CAR TRANSPORT SYSTEM - MODERN ITS EDUCATION TOOL

    Directory of Open Access Journals (Sweden)

    Karel Bouchner

    2017-12-01

    Full Text Available The model car transport system is a laboratory intended for practical development work in the area of motor traffic. It is also an important educational tool for students' hands-on training, enabling students to test the results of their own studies. The main part of the model car transportation network is a 1:87 (HO) scale model based on component units of the FALLER Car system, e.g. cars, traffic lights, carriageways, parking spaces, stop sections, branch-off junctions, sensors and control sections. The model makes it possible to simulate real traffic situations. It includes motor traffic in a city, in a small village, and on a carriageway between the city and the village, including a railway crossing. The traffic infrastructure includes different kinds of intersections, such as T-junctions, a classic four-way crossroad and a four-way traffic circle, with and without traffic light control. Another important part of the model is a highway segment which includes an elevated crossing with highway approaches and exits.

  15. Developing the Next Generation of Tools for Simulating Galaxy Outflows

    Science.gov (United States)

    Scannapieco, Evan

    Outflows are observed in starbursting galaxies of all masses and at all cosmological epochs. They play a key role throughout the history of the Universe: shaping the galaxy mass-metallicity relation, drastically affecting the content and number density of dwarf galaxies, and transforming the chemical composition of the intergalactic medium. Yet, a complete model of galaxy outflows has proven to be elusive, as it requires both a better understanding of the evolution of the turbulent, multiphase gas in and around starbursting galaxies, and better tools to reproduce this evolution in galaxy-scale simulations. Here we propose to conduct a detailed series of numerical simulations designed to help develop such next-generation tools for the simulation of galaxy outflows. The program will consist of three types of direct numerical simulations, each of which will be targeted to allow galaxy-scale simulations to more accurately model key microphysical processes and their observational consequences. Our first set of simulations will be targeted at better modeling the starbursting interstellar medium (ISM) from which galaxy outflows are driven. The surface densities in starbursting galaxies are much larger than those in the Milky Way, resulting in larger gravitational accelerations and random velocities exceeding 30 or even 100 km/s. Under these conditions, the thermal stability of the ISM is changed dramatically, due to the sharp peak in gas cooling efficiency near 200,000 K. Our simulations will carefully quantify the key ways in which this medium differs from the local ISM, and the consequences of these differences for when, where, and how outflows are driven. A second set of simulations will be targeted at better modeling the observed properties of rapidly cooling, highly turbulent gas. Because gas cooling in and around starbursts is extremely efficient, turbulent motions are often supersonic, which leads to a distribution of ionization states that is vastly different than

  16. Transparent Model Transformation: Turning Your Favourite Model Editor into a Transformation Tool

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald; Strüber, Daniel

    2015-01-01

    Current model transformation languages are supported by dedicated editors, often closely coupled to a single execution engine. We introduce Transparent Model Transformation, a paradigm enabling modelers to specify transformations using a familiar tool: their model editor. We also present VMTL, th... model transformation tool sharing the model editor’s benefits, transparently...

  17. New droplet model developments

    International Nuclear Information System (INIS)

    Dorso, C.O.; Myers, W.D.; Swiatecki, W.J.; Moeller, P.; Treiner, J.; Weiss, M.S.

    1985-09-01

    A brief summary is given of three recent contributions to the development of the Droplet Model. The first concerns the electric dipole moment induced in octupole deformed nuclei by the Coulomb redistribution. The second concerns a study of squeezing in nuclei and the third is a study of the improved predictive power of the model when an empirical "exponential" term is included. 25 refs., 3 figs

  18. Highly Integrated Model Assessment Technology and Tools

    Science.gov (United States)

    Pirnay-Dummer, Pablo; Ifenthaler, Dirk; Spector, J. Michael

    2010-01-01

    Effective and efficient measurement of the development of skill and knowledge, especially in domains of human activity that involve complex and challenging problems, is important with regard to workplace and academic performance. However, there has been little progress in the area of practical measurement and assessment, due in part to the lack of…

  19. Process development and tooling design for intrinsic hybrid composites

    Science.gov (United States)

    Riemer, M.; Müller, R.; Drossel, W. G.; Landgrebe, D.

    2017-09-01

    Hybrid parts, which combine the advantages of different material classes, are moving into the focus of lightweight applications. This development is amplified by their high potential for use in crash-relevant structures. In the current state of the art, hybrid parts are mainly made in separate, subsequent forming and joining processes. Using the concept of an intrinsic hybrid, the shaping of the part and the joining of the different materials are performed in a single process step, which shortens the overall processing time and thereby reduces manufacturing costs. The investigated hybrid part is made from continuous fibre reinforced plastic (FRP), into which a metallic reinforcement structure is integrated. The connection between these layered components is realized by a combination of adhesive bonding and a geometrical form fit. The form fit elements are generated intrinsically during the forming process. This contribution concerns the development of the forming process and the design of the forming tool for the single-step production of a hybrid part. To this end a forming tool, which combines the thermo-forming and the metal forming process, is developed. The main challenge in designing the tool is the temperature management of the tool elements for the variothermal forming process. The process parameters are determined in basic tests and finite element (FE) simulation studies. On the basis of these investigations a control concept for steering the motion axes and the tool temperature is developed. Forming tests are carried out with the developed tool and the manufactured parts are analysed by computed tomography (CT) scans.

  20. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    Science.gov (United States)

    2011-12-01

    In this FHWA-sponsored pooled-fund study, a set of decision-making tools based on the Analytic Hierarchy Process (AHP) was developed. This tool set is intended for transportation specialists and decision-makers to determine if ABC is more effective ...

  1. Synthesis and Development of Diagnostic Tools for Medical Imaging

    DEFF Research Database (Denmark)

    Schaarup-Jensen, Henrik

    The need for novel diagnostic tools in medical imaging is increasing since they can improve the positive therapeutic outcome as well as patient compliance. In this thesis different diagnostic tools were developed within an interdisciplinary project, whereas the main work reported in this thesis...... of injectable fiducial tissue markers for surgical guidance of non-palpable tumors and brachytherapy. As radioactive tracer, radioiodinated SAIB-derivatives were developed based on the regioselective ipso-iodination of aryl-TMS moieties. Radioiodination was conducted under carrier free conditions in high...... was synthesized. Remote loading of one candidate was successful; however, the proper contrast level was not sufficient to be visible by CT-imaging. Another diagnostic tool for blood pool imaging is DOTA-modified pluronic/cyclodextrin (CD)-based polyrotaxanes (PRs). With the previously reported chelation of Gd and...

  2. Development of culturally sensitive dialog tools in diabetes education

    Directory of Open Access Journals (Sweden)

    Nana Folmann Hempler

    2015-01-01

    Full Text Available Person-centeredness is a goal in diabetes education, and cultural influences are important to consider in this regard. This report describes the use of a design-based research approach to develop culturally sensitive dialog tools to support person-centered dietary education targeting Pakistani immigrants in Denmark with type 2 diabetes. The approach appears to be a promising method to develop dialog tools for patient education that are culturally sensitive, thereby increasing their acceptability among ethnic minority groups. The process also emphasizes the importance of adequate training and competencies in the application of dialog tools and of alignment between researchers and health care professionals with regards to the educational philosophy underlying their use.

  3. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    Energy Technology Data Exchange (ETDEWEB)

    Irsyad, M. Indra al [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Ministry of Energy and Mineral Resources, Jakarta (Indonesia); Halog, Anthony Basco, E-mail: a.halog@uq.edu.au [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Nepal, Rabindra [Massey Business School, Massey University, Palmerston North (New Zealand); Koesrindartoto, Deddy P. [School of Business and Management, Institut Teknologi Bandung, Bandung (Indonesia)

    2017-12-20

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demography transitions, high-income inequality, and informal economy are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of the previous studies by highlighting emerging methods of system thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and found out that not all studies are aware of the emerging critical issues in developing countries. We offer here a guidance to select the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to the analytical tool for renewable energy modeling and analysis in the developing countries.

  4. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    International Nuclear Information System (INIS)

    Irsyad, M. Indra al; Halog, Anthony Basco; Nepal, Rabindra; Koesrindartoto, Deddy P.

    2017-01-01

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demography transitions, high-income inequality, and informal economy are some characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of the previous studies by highlighting emerging methods of system thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and found out that not all studies are aware of the emerging critical issues in developing countries. We offer here a guidance to select the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to the analytical tool for renewable energy modeling and analysis in the developing countries.

  5. Tools and Models for Integrating Multiple Cellular Networks

    Energy Technology Data Exchange (ETDEWEB)

    Gerstein, Mark [Yale Univ., New Haven, CT (United States). Gerstein Lab.

    2015-11-06

    In this grant, we have systematically investigated the integrated networks, which are responsible for the coordination of activity between metabolic pathways in prokaryotes. We have developed several computational tools to analyze the topology of the integrated networks consisting of metabolic, regulatory, and physical interaction networks. The tools are all open source, available to download from GitHub, and can be incorporated in the Knowledgebase. Here, we summarize our work as follows. Understanding the topology of the integrated networks is the first step toward understanding their dynamics and evolution. For Aim 1 of this grant, we have developed a novel algorithm to determine and measure the hierarchical structure of transcriptional regulatory networks [1]. The hierarchy captures the direction of information flow in the network. The algorithm is generally applicable to regulatory networks in prokaryotes, yeast and higher organisms. Integrated datasets are extremely beneficial in understanding the biology of a system in a compact manner due to the conflation of multiple layers of information. Therefore, for Aim 2 of this grant, we have developed several tools and carried out analyses for integrating system-wide genomic information. To make use of the structural data, we have developed DynaSIN for protein-protein interaction networks with various dynamical interfaces [2]. We then examined the association between network topology and phenotypic effects such as gene essentiality. In particular, we have organized E. coli and S. cerevisiae transcriptional regulatory networks into hierarchies. We then correlated gene phenotypic effects by tinkering with different layers to elucidate which layers were more tolerant to perturbations [3]. In the context of evolution, we also developed a workflow to guide the comparison between different types of biological networks across various species using the concept of rewiring [4]. Furthermore, we have developed
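
    As a minimal illustration of the kind of hierarchy computation described above (not the published algorithm), the following sketch assigns a level to each node of a toy acyclic regulatory network so that top-level regulators with no incoming edges sit at level 0 and every target lies below its deepest regulator; the node names are invented.

        # Minimal illustration, not the published algorithm: assign hierarchy levels
        # to a directed acyclic regulatory network (edge = "regulator -> target").
        # Level 0 holds top regulators; information flows downward to deeper levels.
        from collections import defaultdict, deque

        edges = [("tf_a", "tf_b"), ("tf_a", "tf_c"),
                 ("tf_b", "gene_x"), ("tf_c", "gene_x")]    # toy network

        succ, indeg, nodes = defaultdict(list), defaultdict(int), set()
        for u, v in edges:
            succ[u].append(v)
            indeg[v] += 1
            nodes.update((u, v))

        # Kahn's algorithm; a node's level is one more than its deepest regulator.
        level = {n: 0 for n in nodes if indeg[n] == 0}
        queue = deque(level)
        while queue:
            u = queue.popleft()
            for v in succ[u]:
                level[v] = max(level.get(v, 0), level[u] + 1)
                indeg[v] -= 1
                if indeg[v] == 0:
                    queue.append(v)

        for n in sorted(nodes, key=level.get):
            print(n, "level", level[n])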

  6. Modeling Tools Predict Flow in Fluid Dynamics

    Science.gov (United States)

    2010-01-01

    "Because rocket engines operate under extreme temperature and pressure, they present a unique challenge to designers who must test and simulate the technology. To this end, CRAFT Tech Inc., of Pipersville, Pennsylvania, won Small Business Innovation Research (SBIR) contracts from Marshall Space Flight Center to develop software to simulate cryogenic fluid flows and related phenomena. CRAFT Tech enhanced its CRUNCH CFD (computational fluid dynamics) software to simulate phenomena in various liquid propulsion components and systems. Today, both government and industry clients in the aerospace, utilities, and petrochemical industries use the software for analyzing existing systems as well as designing new ones."

  7. Model Fusion Tool - the Open Environmental Modelling Platform Concept

    Science.gov (United States)

    Kessler, H.; Giles, J. R.

    2010-12-01

    The vision of an Open Environmental Modelling Platform - seamlessly linking geoscience data, concepts and models to aid decision making in times of environmental change. Governments and their executive agencies across the world are facing increasing pressure to make decisions about the management of resources in light of population growth and environmental change. In the UK for example, groundwater is becoming a scarce resource for large parts of its most densely populated areas. At the same time river and groundwater flooding resulting from high rainfall events are increasing in scale and frequency and sea level rise is threatening the defences of coastal cities. There is also a need for affordable housing, improved transport infrastructure and waste disposal as well as sources of renewable energy and sustainable food production. These challenges can only be resolved if solutions are based on sound scientific evidence. Although we have knowledge and understanding of many individual processes in the natural sciences it is clear that a single science discipline is unable to answer the questions and their inter-relationships. Modern science increasingly employs computer models to simulate the natural, economic and human system. Management and planning requires scenario modelling, forecasts and ‘predictions’. Although the outputs are often impressive in terms of apparent accuracy and visualisation, they are inherently not suited to simulate the response to feedbacks from other models of the earth system, such as the impact of human actions. Geological Survey Organisations (GSO) are increasingly employing advances in Information Technology to visualise and improve their understanding of geological systems. Instead of 2 dimensional paper maps and reports many GSOs now produce 3 dimensional geological framework models and groundwater flow models as their standard output. Additionally the British Geological Survey have developed standard routines to link geological

  8. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...... need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work...... of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted....

  9. Assessment Tool Development for Extracurricular Smet Programs for Girls

    Science.gov (United States)

    House, Jody; Johnson, Molly; Borthwick, Geoffrey

    Many different programs have been designed to increase girls' interest in and exposure to science, mathematics, engineering, and technology (SMET). Two of these programs are discussed and contrasted in the dimensions of length, level of science content, pedagogical approach, degree of self- vs. parent-selected participants, and amount of communitybuilding content. Two different evaluation tools were used. For one program, a modified version of the University of Pittsburgh's undergraduate engineering attitude assessment survey was used. Program participants' responses were compared to those from a fifth grade, mixed-sex science class. The only gender difference found was in the area of parental encouragement. The girls in the special class were more encouraged to participate in SMET areas. For the second program, a new age-appropriate tool developed specifically for these types of programs was used, and the tool itself was evaluated. The results indicate that the new tool has construct validity. On the basis of these preliminary results, a long-term plan for the continued development of the assessment tool is outlined.

  10. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating
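
    As a greatly simplified, hedged illustration of flagging space-time anomalies in health indicator data (not the project's pattern recognition or GANN algorithms), the sketch below compares recent counts in each region against a historical Poisson baseline; the data are synthetic and the window length and threshold are arbitrary choices.

        # Simplified illustration of space-time cluster flagging, not the project's
        # algorithm: compare recent counts per region against a historical Poisson
        # baseline and flag windows with an unusually small tail probability.
        import numpy as np
        from scipy.stats import poisson

        rng = np.random.default_rng(0)
        # Synthetic data: 60 days x 5 regions of syndromic case counts.
        counts = rng.poisson(lam=4.0, size=(60, 5))
        counts[55:, 2] += 8                      # inject a synthetic cluster in region 2

        window, alpha = 5, 0.001
        baseline = counts[:-window].mean(axis=0)          # expected daily count per region

        for region in range(counts.shape[1]):
            observed = counts[-window:, region].sum()
            expected = window * baseline[region]
            p_value = poisson.sf(observed - 1, expected)  # P(X >= observed)
            if p_value < alpha:
                print(f"Region {region}: {observed} cases vs ~{expected:.1f} expected "
                      f"(p={p_value:.2g}) -- possible outbreak")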

  11. Watershed modeling tools and data for prognostic and diagnostic

    Science.gov (United States)

    Chambel-Leitao, P.; Brito, D.; Neves, R.

    2009-04-01

    When eutrophication is considered an important process to control, this can be accomplished by reducing nitrogen and phosphorus losses from both point and nonpoint sources and by assessing the effectiveness of the pollution reduction strategy. The HARP-NUT guidelines (Guidelines on Harmonized Quantification and Reporting Procedures for Nutrients) (Borgvang & Selvik, 2000) are presented by OSPAR as the best common quantification and reporting procedures for calculating the reduction of nutrient inputs. In 2000, OSPAR adopted the HARP-NUT guidelines on a trial basis. They were intended to serve as a tool for OSPAR Contracting Parties to report, in a harmonized manner, their different commitments, present or future, with regard to nutrients under the OSPAR Convention, in particular the "Strategy to Combat Eutrophication". The HARP-NUT Guidelines (Borgvang and Selvik, 2000; Schoumans, 2003) were developed to quantify and report on the individual sources of nitrogen and phosphorus discharges/losses to surface waters (Source Orientated Approach). These results can be compared with the total riverine nitrogen and phosphorus loads measured at downstream monitoring points (Load Orientated Approach), as a load reconciliation. Nitrogen and phosphorus retention in river systems represents the connecting link between the "Source Orientated Approach" and the "Load Orientated Approach". Both approaches are necessary for verification purposes and both may be needed for providing the information required for the various commitments. Guidelines 2, 3, 4 and 5 are mainly concerned with source estimation; they present a set of simple calculations that allow the origin of loads to be estimated. Guideline 6 is a particular case where the application of a model is advised, in order to estimate the nutrients from diffuse sources associated with land use/land cover. The model chosen for this was the SWAT model (Arnold & Fohrer, 2005), because it is suggested in guideline 6 and because it
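
    As a generic illustration of the source-orientated bookkeeping described above (not the HARP-NUT formulas themselves), the sketch below combines diffuse loads estimated from land-use export coefficients with point-source loads and an in-stream retention factor; all areas, coefficients and the retention fraction are illustrative values only.

        # Generic export-coefficient sketch in the spirit of a source-orientated
        # nutrient budget; not taken from the HARP-NUT guidelines. All numbers are
        # illustrative placeholders.
        land_use_km2 = {"arable": 120.0, "pasture": 80.0, "forest": 300.0, "urban": 25.0}
        export_kg_per_km2 = {"arable": 1500.0, "pasture": 900.0,   # total N per year
                             "forest": 150.0, "urban": 600.0}
        point_sources_kg = 45_000.0        # e.g., wastewater treatment plant discharges
        retention_fraction = 0.30          # in-stream / river-system retention

        diffuse_kg = sum(area * export_kg_per_km2[lu] for lu, area in land_use_km2.items())
        gross_load_kg = diffuse_kg + point_sources_kg
        net_load_kg = gross_load_kg * (1.0 - retention_fraction)   # compare with monitoring

        print(f"diffuse sources: {diffuse_kg:,.0f} kg N/yr")
        print(f"net riverine load (load-orientated check): {net_load_kg:,.0f} kg N/yr")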

  12. ENVIRONMENTAL ACCOUNTING: A MANAGEMENT TOOL FOR SUSTAINABLE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Nicolae Virag

    2014-12-01

    Full Text Available The paper aims to analyze the ways in which accounting as a social science and management information tool can contribute to sustainable development. The paper highlights the emergence of the environmental accounting concept, the applicability of the environmental accounting, types of environmental accounting, scope and benefits of environmental accounting.

  13. Development of the writing readiness inventory tool in context (WRITIC)

    NARCIS (Netherlands)

    Hartingsveldt, M.J. van; Vries, L. de; Cup, E.H.C.; Groot, I.J.M. de; Nijhuis-Van der Sanden, M.W.G.

    2014-01-01

    This article describes the development of the Writing Readiness Inventory Tool in Context (WRITIC), a measurement evaluating writing readiness in Dutch kindergarten children (5 and 6 years old). Content validity was established through 10 expert evaluations in three rounds. Construct validity was

  14. Developing an Intranet: Tool Selection and Management Issues.

    Science.gov (United States)

    Chou, David C.

    1998-01-01

    Moving corporate systems onto an intranet will increase the data traffic within the corporate network, which necessitates a high-quality management process to the intranet. Discusses costs and benefits of adopting an intranet, tool availability and selection criteria, and management issues for developing an intranet. (Author/AEF)

  15. 109 Strategizing Drama as Tool for Advocacy and Rural Development

    African Journals Online (AJOL)

    Nekky Umera

    undertones, most of these organizations have failed to discover and employ drama/theatre as a potent tool for the effective .... accept new innovations and changes. The Longman Dictionary of Contemporary ... Development, on the other hand, is the end product of the success of advocacy. Citing contemporary paradigm shift, ...

  16. Sharpening a Tool for Teaching: The Zone of Proximal Development

    Science.gov (United States)

    Wass, Rob; Golding, Clinton

    2014-01-01

    Vygotsky's Zone of Proximal Development (ZPD) provides an important understanding of learning, but its implications for teachers are often unclear or limited and could be further explored. We use conceptual analysis to sharpen the ZPD as a teaching tool, illustrated with examples from teaching critical thinking in zoology. Our conclusions are…

  17. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  18. Development of a Psychotropic PRN Medication Evaluative Tool

    Science.gov (United States)

    Silk, Larry; Watt, Jackie; Pilon, Nancy; Draper, Chad

    2013-01-01

    This article describes a psychotropic PRN Evaluative Tool developed by interprofessional clinicians to address inconsistent reporting and assessment of the effectiveness of PRN medications used for people who are developmentally disabled. Fifty-nine participants (37 males, 22 females), ages 16 to 60 years, were included in the review, all…

  19. The Limitations of Monetary Tools in a Developing Economy like ...

    African Journals Online (AJOL)

    The Limitations of Monetary Tools in a Developing Economy like Nigeria. ... AFRREV IJAH: An International Journal of Arts and Humanities ... and price flexibility, the belief that the economy was self-adjusting and that equilibrium income always tends towards its full-employment level when disturbed, especially in the long run.

  20. Economy diversification: a potent tool for tourism development in ...

    African Journals Online (AJOL)

    Economy diversification: a potent tool for tourism development in Nigeria. ... AFRREV STECH: An International Journal of Science and Technology ... In this vein, this work reviewed the current state of some sectors in Nigeria, highlighting the effect of dependence on a mono-product economy and emphasizing tourism potential ...

  1. Crash Attenuator Data Collection and Life Cycle Tool Development

    Science.gov (United States)

    2014-06-14

    This research study was aimed at data collection and development of a decision support tool for life cycle cost assessment of crash attenuators. Assessing attenuator life cycle costs based on in-place expected costs and not just the initial cost enha...

  2. Tool development to understand rural resource users' land use and ...

    African Journals Online (AJOL)

    Tool development to understand rural resource users' land use and impacts on land type changes in Madagascar. ... explore and understand decisions and management strategies. We finally report on first outcomes of the game including land use decisions, reaction to market fluctuation and landscape change.

  3. Method Engineering: Engineering of Information Systems Development Methods and Tools

    NARCIS (Netherlands)

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e.

  4. budgeting as a strategic tool for development in the arts

    African Journals Online (AJOL)

    Admin

    This paper examines budgeting as a strategic tool for development in the Arts. Budgeting as a fundamental ... controlling the spending of money. It refers to ... executing adequate control over the many units of the organization, inter alia, towards effective planning and control, best described as “a management tool”. Types Of ...

  5. Millennium Development Goals: Tool or token of global social governance?

    NARCIS (Netherlands)

    Al Raee, M.; Amoateng, Elvis; Avenyo, E.K.; Beshay, Youssef; Bierbaum, M.; Keijser, C.; Sinha, R.

    2014-01-01

    In this paper we argue that the Millennium Development Goals (MDGs) experience suggests that Global Social Governance (GSG) exists and that the MDGs have been an effective tool in creating a global accountability framework despite shortcomings mainly arising in the formulation process. The paper

  6. Reflective Journaling: A Tool for Teacher Professional Development

    Science.gov (United States)

    Dreyer, Lorna M.

    2015-01-01

    This qualitative study explores the introduction of postgraduate education students to reflective journaling as a tool for professional development. Students were purposefully selected to keep a weekly journal in which they reflected in and on the activities (methodologies, techniques, strategies) they engaged in while executing a workplace…

  7. Development of the Writing Readiness Inventory Tool in Context (WRITIC)

    NARCIS (Netherlands)

    van Hartingsveldt, Margo J.; de Vries, Liesbeth; Cup, Edith HC; de Groot, Imelda JM; Nijhuis-van der Sanden, Maria WG

    2014-01-01

    This article describes the development of the Writing Readiness Inventory Tool in Context (WRITIC), a measurement evaluating writing readiness in Dutch kindergarten children (5 and 6 years old). Content validity was established through 10 expert evaluations in three rounds. Construct validity was

  8. Development of the Operational Events Groups Ranking Tool

    International Nuclear Information System (INIS)

    Simic, Zdenko; Banov, Reni

    2014-01-01

    Because of both complexity and ageing, facilities like nuclear power plants require feedback from operating experience in order to further improve safety and operational performance, which is why significant effort is dedicated to operating experience feedback. This paper describes the specification and development of the operating events ranking software tool. A robust and consistent way of selecting the most important events for detailed investigation matters because it is not feasible, or even useful, to investigate all of them. The tool is based on comprehensive event characterisation and methodical prioritization: a rich set of event parameters supports top-level preliminary analysis, different ways of grouping events, and even evaluation of how uncertainty propagates to the ranking results. One distinct feature of the implemented method is that the user (i.e., an expert) can determine how important each ranking parameter is through pairwise comparison. For demonstration and usability, a sample database was also created; for a useful analysis, the whole set of events from a 5-year period was selected and characterised. Based on the preliminary results, the tool appears valuable for gaining a new preliminary perspective on the data as a whole, and especially for identifying the event groups that should have priority in more detailed assessment. The results consist of different informative views of event group importance together with related sensitivity and uncertainty results. This provides a valuable tool for improving the overall picture of specific operating experience and for helping identify the most important event groups for further assessment. Clearly, completeness and consistency of the input data characterisation are essential for a full and valuable importance ranking. The method and tool development described in this paper is part of a continuous effort of
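
    A common way to turn such expert pairwise comparisons into parameter weights is the principal-eigenvector calculation used in the Analytic Hierarchy Process; the sketch below shows that calculation with an illustrative comparison matrix and parameter names, and is not necessarily the tool's exact implementation.

        # Minimal sketch of turning expert pairwise comparisons into ranking-parameter
        # weights via the principal eigenvector (standard AHP practice); the matrix
        # values and parameter names are illustrative, not the tool's data.
        import numpy as np

        params = ["safety significance", "recurrence", "radiological impact"]
        # A[i, j] = how much more important parameter i is than parameter j.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        principal = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, principal].real)
        weights /= weights.sum()

        # Consistency ratio guards against contradictory judgements (CR < 0.1 is ok).
        n = len(params)
        ci = (eigvals.real[principal] - n) / (n - 1)
        cr = ci / 0.58                                  # Saaty's random index for n = 3
        for name, w in zip(params, weights):
            print(f"{name}: {w:.3f}")
        print(f"consistency ratio: {cr:.3f}")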

  9. A Study of Collaborative Software Development Using Groupware Tools

    Science.gov (United States)

    Defranco-Tommarello, Joanna; Deek, Fadi P.

    2005-01-01

    The experimental results of a collaborative problem solving and program development model that takes into consideration the cognitive and social activities occurring during software development are presented in this paper. This collaborative model is based on the Dual Common Model, which focuses on individual cognitive aspects of problem solving and…

  10. A crowdsourcing model for creating preclinical medical education study tools.

    Science.gov (United States)

    Bow, Hansen C; Dattilo, Jonathan R; Jonas, Andrea M; Lehmann, Christoph U

    2013-06-01

    During their preclinical course work, medical students must memorize and recall substantial amounts of information. Recent trends in medical education emphasize collaboration through team-based learning. In the technology world, the trend toward collaboration has been characterized by the crowdsourcing movement. In 2011, the authors developed an innovative approach to team-based learning that combined students' use of flashcards to master large volumes of content with a crowdsourcing model, using a simple informatics system to enable those students to share in the effort of generating concise, high-yield study materials. The authors used Google Drive and developed a simple Java software program that enabled students to simultaneously access and edit sets of questions and answers in the form of flashcards. Through this crowdsourcing model, medical students in the class of 2014 at the Johns Hopkins University School of Medicine created a database of over 16,000 questions that corresponded to the Genes to Society basic science curriculum. An analysis of exam scores revealed that students in the class of 2014 outperformed those in the class of 2013, who did not have access to the flashcard system, and a survey of students demonstrated that users were generally satisfied with the system and found it a valuable study tool. In this article, the authors describe the development and implementation of their crowdsourcing model for creating study materials, emphasize its simplicity and user-friendliness, describe its impact on students' exam performance, and discuss how students in any educational discipline could implement a similar model of collaborative learning.
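
    As a hedged sketch of the kind of record format and merge step such a system needs (not the authors' Java program or its Google Drive integration), the example below represents each flashcard with an identifier and a revision time and merges concurrently edited copies of a deck by keeping the newest revision of every card; the card contents are invented.

        # Hedged sketch, not the authors' Java tool: represent each flashcard as a
        # small record and merge concurrently edited copies of a deck by keeping the
        # newest revision of every card (identified by a card id).
        from dataclasses import dataclass

        @dataclass
        class Card:
            card_id: str
            question: str
            answer: str
            revised: int          # e.g., a Unix timestamp of the last edit

        def merge_decks(*decks):
            """Combine several students' edits into one deck; newest revision wins."""
            merged = {}
            for deck in decks:
                for card in deck:
                    current = merged.get(card.card_id)
                    if current is None or card.revised > current.revised:
                        merged[card.card_id] = card
            return list(merged.values())

        deck_a = [Card("c1", "What carries amino acids to the ribosome?", "tRNA", 100)]
        deck_b = [Card("c1", "What carries amino acids to the ribosome?",
                       "transfer RNA (tRNA)", 160),
                  Card("c2", "Where is mRNA transcribed?", "In the nucleus", 120)]
        for card in merge_decks(deck_a, deck_b):
            print(card.card_id, "->", card.answer)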

  11. Effective Management Tools in Implementing Operational Programme Administrative Capacity Development

    OpenAIRE

    Carmen – Elena DOBROTĂ; Claudia VASILCA

    2015-01-01

    Public administration in Romania and the administrative capacity of central and local government have undergone significant progress since 2007. Developing administrative capacity involves a set of structural and process changes that allow governments to improve the formulation and implementation of policies in order to achieve enhanced results. Identifying, developing and using management tools for a proper implementation of an operational programme dedicated to consolidat...

  12. Development of the Central Dogma Concept Inventory (CDCI) Assessment Tool

    OpenAIRE

    Newman, Dina L.; Snyder, Christopher W.; Fisk, J. Nick; Wright, L. Kate

    2016-01-01

    Scientific teaching requires scientifically constructed, field-tested instruments to accurately evaluate student thinking and gauge teacher effectiveness. We have developed a 23-question, multiple select-format assessment of student understanding of the essential concepts of the central dogma of molecular biology that is appropriate for all levels of undergraduate biology. Questions for the Central Dogma Concept Inventory (CDCI) tool were developed and iteratively revised based on student lan...

  13. Tool for test driven development of JavaScript applications

    OpenAIRE

    Stamać, Gregor

    2015-01-01

    The thesis describes the implementation of a tool for testing JavaScript code. The tool is designed to support test-driven development of JavaScript-based applications, so it is important that it displays test results as quickly as possible. The thesis is divided into four parts. The first part describes the JavaScript environment: it contains a brief history of the JavaScript language, its prevalence, and its strengths and weaknesses. This section also describes the TypeScript programming language, which is a super...

  14. Designing the user experience of game development tools

    CERN Document Server

    Lightbown, David

    2015-01-01

    Table of contents (excerpt): The Big Green Button; My Story; Who Should Read this Book?; Companion Website and Twitter Account; Before we Begin; Welcome to Designing the User Experience of Game Development Tools; What Will We Learn in This Chapter?; What Is This Book About?; Defining User Experience; The Value of Improving the User Experience of Our Tools; Parallels Between User Experience and Game Design; How Do People Benefit From an Improved User Experience?; Finding the Right Balance; Wrapping Up; The User-Centered Design Process; What Will We

  15. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Lerman, Susannah B.; Nislow, Keith H.; Nowak, David J.; DeStefano, Stephen; King, David I.; Jones-Farrand, D. Todd

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat to sustain bird and other wildlife populations. The primary goal of this study was to integrate wildlife suitability indices into an existing national urban forest assessment tool, i-Tree. We quantified available habitat characteristics of urban forests for ten northeastern U.S. cities, and summarized bird habitat relationships from the literature in terms of variables that were represented in the i-Tree datasets. With these data, we generated habitat suitability equations for nine bird species, representing a range of life history traits and conservation status, that predict habitat suitability from i-Tree data. We applied these equations to the urban forest datasets to calculate the overall habitat suitability for each city and the habitat suitability for different types of land-use (e.g., residential, commercial, parkland) for each bird species. The proposed habitat models will help guide wildlife managers, urban planners, and landscape designers who require specific information, such as desirable habitat conditions within an urban management project, to help improve the suitability of urban forests for birds.
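
    The fitted equations themselves are not reproduced in this record, so the sketch below only illustrates the general shape such a habitat suitability index can take: a weighted, 0-1 bounded combination of standardized urban-forest variables. The species, variables and weights are hypothetical, not results from the study.

        # Hedged sketch of the general shape of a habitat suitability index: a
        # weighted combination of standardized urban-forest variables. The variables,
        # weights, and scaling are hypothetical, not the equations fitted in the study.
        def suitability(canopy_cover, native_tree_frac, shrub_cover, impervious_frac):
            """Return a 0-1 suitability score for a hypothetical shrub-nesting bird."""
            score = (0.40 * canopy_cover +
                     0.25 * native_tree_frac +
                     0.25 * shrub_cover +
                     0.10 * (1.0 - impervious_frac))      # more pavement, less habitat
            return max(0.0, min(1.0, score))

        # Example: a residential land-use class summarized from plot data.
        print(suitability(canopy_cover=0.35, native_tree_frac=0.6,
                          shrub_cover=0.2, impervious_frac=0.45))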

  16. A Simple Evacuation Modeling and Simulation Tool for First Responders

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Daniel B [ORNL; Payne, Patricia W [ORNL

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an on-going and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including most importantly time-to-evacuate, to help first responders choose the best course of action.
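
    For readers unfamiliar with agent-based evacuation models, the minimal Python sketch below captures the general idea described above: agents walk toward a single exit with a crude collision-avoidance rule while the loop records time-to-evacuate. It is a toy illustration, not the IMPACT implementation.

    ```python
    # Toy agent-based evacuation sketch (not the IMPACT module): agents move toward
    # one exit at constant speed, skip a step if another agent is too close, and the
    # loop reports the total time-to-evacuate.
    import math, random

    random.seed(1)
    exit_xy = (0.0, 0.0)
    speed, radius, dt = 1.4, 0.4, 0.5          # walking speed (m/s), agent radius (m), time step (s)
    agents = [(random.uniform(5, 50), random.uniform(5, 50)) for _ in range(100)]

    t = 0.0
    while agents:
        t += dt
        moved = []
        for x, y in agents:
            d = math.hypot(x - exit_xy[0], y - exit_xy[1])
            if d < speed * dt:                 # agent reaches the exit this step
                continue
            frac = speed * dt / d
            nx, ny = x - (x - exit_xy[0]) * frac, y - (y - exit_xy[1]) * frac
            # Crude collision avoidance: stay put if the new position is too close
            # to an agent that has already moved this step.
            if any(math.hypot(nx - ox, ny - oy) < 2 * radius for ox, oy in moved):
                nx, ny = x, y
            moved.append((nx, ny))
        agents = moved

    print(f"time to evacuate: {t:.1f} s")
    ```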

  17. Development of computational tools for automatic modeling and FE (Finite Element) analysis of corroded pipelines; Desenvolvimento de ferramentas computacionais para modelagem e analise automatica de defeitos de corrosao em dutos via MEF (Metodo de Elemento Finito)

    Energy Technology Data Exchange (ETDEWEB)

    Cabral, Helder Lima Dias; Willmersdorf, Ramiro Brito; Lyra, Paulo Roberto Maciel [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Engenharia Mecanica], e-mail: hldcabral@yahoo.com.br, e-mail: ramiro@willmersdorf.net, e-mail: prmlyra@ufpe.br; Silva, Silvana Maria Bastos Afonso da [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Dept. de Engenharia Civil], e-mail: smb@ufpe.br

    2008-06-15

    Corrosion is one of the most common causes of accidents involving oil and gas pipelines. Computational simulation through the finite element method (FEM) is one of the most efficient tools to reliably quantify the remaining strength of corroded pipes. However, the modeling process demands intense manual engineering labor, and it is slow and extremely repetitive; it is therefore very prone to errors. The main purpose of this work is to present the PIPEFLAW program, which automatically generates FE pipe models with corrosion defects, ready to be analyzed with commercial FEM programs. The PIPEFLAW computational tools are based on the MSC.Patran pre- and post-processing program and were written in PCL (Patran Command Language). The program has a user-friendly customized graphical interface that allows the user to provide the main parameters of the pipe and of the defect (or of a series of defects). PIPEFLAW can automatically generate FE pipe models with rectangular or elliptical corrosion defects located on the internal or external pipe surface. Defects generated by PIPEFLAW can be configured as an isolated (single) defect or as multiple defects (aligned or located in arbitrary positions). These tools were validated by comparing the results of numerical simulations made with the PIPEFLAW tools with the numerical, experimental and semi-empirical results available in the literature. The results confirmed the robustness of the PIPEFLAW tools, which proved to be a rapid way of generating reliable FE models ready to be used in the structural evaluation of corroded pipelines. (author)

  18. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  19. Monte Carlo tools for Beyond the Standard Model Physics, April 14-16

    DEFF Research Database (Denmark)

    Badger, Simon; Christensen, Christian Holm; Dalsgaard, Hans Hjersing

    2011-01-01

    This workshop aims to gather together theorists and experimentalists interested in developing and using Monte Carlo tools for Beyond the Standard Model Physics in an attempt to be prepared for the analysis of data focusing on the Large Hadron Collider. Since a large number of excellent tools....... To identify promising models (or processes) for which the tools have not yet been constructed and start filling up these gaps. To propose ways to streamline the process of going from models to events, i.e. to make the process more user-friendly so that more people can get involved and perform serious collider...

  20. Ongoing development of digital radiotherapy plan review tools

    International Nuclear Information System (INIS)

    Ebert, M.A.; Hatton, J.; Cornes, D.

    2011-01-01

    Full text: To describe ongoing development of software to support the review of radiotherapy treatment planning system (TPS) data. The 'SWAN' software program was conceived in 2000 and initially developed for the RADAR (TROG 03.04) prostate radiotherapy trial. Validation of the SWAN program has been occurring via implementation by TROG in support of multiple clinical trials. Development has continued and the SWAN software program is now supported by modular components which comprise the 'SWAN system'. This provides a comprehensive set of tools for the review, analysis and archive of TPS exports. The SWAN system has now been used in support of over 20 radiotherapy trials and to review the plans of over 2,000 trial participants. The use of the system for the RADAR trial is now culminating in the derivation of dose-outcomes indices for prostate treatment toxicity. Newly developed SWAN tools include enhanced remote data archive/retrieval, display of dose in both relative and absolute modes, and interfacing to a Matlab-based add-on ('VAST') that allows quantitative analysis of delineated volumes including regional overlap statistics for multi-observer studies. Efforts are continuing to develop the SWAN system in the context of international collaboration aimed at harmonising the quality-assurance activities of collaborative trials groups. Tools such as the SWAN system are essential for ensuring the collection of accurate and reliable evidence to guide future radiotherapy treatments. One of the principal challenges of developing such a tool is establishing a development path that will ensure its validity and applicability well into the future.

  1. Preliminary Development of an Object-Oriented Optimization Tool

    Science.gov (United States)

    Pak, Chan-gi

    2011-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has developed a FORTRAN-based object-oriented optimization (O3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. The object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the central executive module and the discipline modules, or both. Six sample optimization problems are presented. The first four sample problems are based on simple mathematical equations; the fifth and sixth problems consider a three-bar truss, which is a classical example in structural synthesis. Instructions for preparing input data for the O3 tool are presented.

  2. National Energy Audit Tool for Multifamily Buildings Development Plan

    Energy Technology Data Exchange (ETDEWEB)

    Malhotra, Mini [ORNL; MacDonald, Michael [Sentech, Inc.; Accawi, Gina K [ORNL; New, Joshua Ryan [ORNL; Im, Piljae [ORNL

    2012-03-01

    The U.S. Department of Energy's (DOE's) Weatherization Assistance Program (WAP) enables low-income families to reduce their energy costs by providing funds to make their homes more energy efficient. In addition, the program funds Weatherization Training and Technical Assistance (T and TA) activities to support a range of program operations. These activities include measuring and documenting performance, monitoring programs, promoting advanced techniques and collaborations to further improve program effectiveness, and training, including developing tools and information resources. The T and TA plan outlines the tasks, activities, and milestones to support the weatherization network with the program implementation ramp-up efforts. Weatherization of multifamily buildings has been recognized as an effective way to ramp up weatherization efforts. To support this effort, the 2009 National Weatherization T and TA plan includes the task of expanding the functionality of the Weatherization Assistant, a DOE-sponsored family of energy audit computer programs, to perform audits for large and small multifamily buildings. This report describes the planning effort for a new multifamily energy audit tool for DOE's WAP. The functionality of the Weatherization Assistant is being expanded to also perform energy audits of small multifamily and large multifamily buildings. The process covers an assessment of needs that includes input from national experts during two national Web conferences. The assessment of needs is then translated into capability and performance descriptions for the proposed new multifamily energy audit, with some description of what might or should be provided in the new tool. The assessment of needs is combined with our best judgment to lay out a strategy for development of the multifamily tool that proceeds in stages, with features of an initial tool (version 1) and a more capable version 2 handled with currently available resources. Additional

  3. Simulation Tools for Electrical Machines Modelling: Teaching and ...

    African Journals Online (AJOL)

    Simulation tools are used both for research and teaching to allow a good comprehension of the systems under study before practical implementations. This paper illustrates the way MATLAB is used to model non-linearites in synchronous machine. The machine is modeled in rotor reference frame with currents as state ...

  4. Advanced REACH Tool (ART) : Calibration of the mechanistic model

    NARCIS (Netherlands)

    Schinkel, J.; Warren, N.; Fransman, W.; Tongeren, M. van; McDonnell, P.; Voogd, E.; Cherrie, J.W.; Tischer, M.; Kromhout, H.; Tielemans, E.

    2011-01-01

    The mechanistic model of the Advanced Reach Tool (ART) provides a relative ranking of exposure levels from different scenarios. The objectives of the calibration described in this paper are threefold: to study whether the mechanistic model scores are accurately ranked in relation to exposure

  5. Molecular Modeling: A Powerful Tool for Drug Design and Molecular ...

    Indian Academy of Sciences (India)

    Molecular modeling has become a valuable and essential tool to medicinal chemists in the drug design process. Molecular modeling describes the generation, manipulation or representation of three-dimensional structures of molecules and associated physico-chemical properties. It involves a range of computerized ...

  6. The Neonatal Eating Assessment Tool: Development and Content Validation.

    Science.gov (United States)

    Pados, Britt F; Estrem, Hayley H; Thoyre, Suzanne M; Park, Jinhee; McComish, Cara

    2017-11-01

    To develop and content validate the Neonatal Eating Assessment Tool (NeoEAT), a parent-report measure of infant feeding. The NeoEAT was developed in three phases. Phase 1: Items were generated from a literature review, available assessment tools, and parents' descriptions of problematic feeding in infants. Phase 2: Professionals rated items for relevance and clarity. Content validity indices were calculated. Phase 3: Parent understanding was explored through cognitive interviews. Phase 1: Descriptions of infant feeding were obtained from 12 parents of children with diagnosed feeding problems and 29 parents of infants younger than seven months. Phase 2: Nine professionals rated items. Phase 3: Sixteen parents of infants younger than seven months completed the cognitive interview. Content validity of the NeoEAT. Three versions were developed: NeoEAT Breastfeeding (72 items), NeoEAT Bottle Feeding (74 items), and NeoEAT Breastfeeding and Bottle Feeding (89 items).

  7. Searching for Sentient Design Tools for Game Development

    DEFF Research Database (Denmark)

    Liapis, Antonios

    Over the last twenty years, computer games have grown from a niche market targeting young adults to an important player in the global economy, engaging millions of people from different cultural backgrounds. As both the number and the size of computer games continue to rise, game companies handle increasing demand by expanding their cadre, compressing development cycles and reusing code or assets. To limit development time and reduce the cost of content creation, commercial game engines and procedural content generation are popular shortcuts. Content creation tools are means to either generate a large volume of game content or to reduce designer effort by automating the mechanizable aspects of content creation, such as feasibility checking. However elaborate the type of content such tools can create, they remain subservient to their human developers/creators (who have tightly designed all...

  8. [Development and validation of a tool for evaluating core competencies in nursing cancer patients on chemotherapy].

    Science.gov (United States)

    Kim, Sung Hae; Park, Jae Hyun

    2012-10-01

    This study was done to develop a tool to evaluate the core competencies regarding nursing cancer patients on chemotherapy, and to verify the reliability and efficacy of the developed tool. A tool to evaluate the core competencies was developed from a preliminary tool consisting of 112 items verified by expert groups. The adequacy of the preliminary tool was analyzed and it was refined into the final evaluation tool containing 76 items in 8 core competencies and 18 specific competencies. The evaluation tool is in the form of a self-report, and each item is evaluated according to a 3-point scale. From September 22 to October 14, 2011, 349 survey responses were analyzed using SPSS 20.0 and the WINSTEPS program that employs the Rasch model. Results indicated that there were no inappropriate items and the items had low levels of difficulty in comparison with the knowledge levels of the study participants. The results of factor analysis yielded 18 factors, and the reliability of the tool was very high with Cronbach's α=.97. The results of this study can be used for training and evaluation of core competencies for nursing cancer patients, and for standardizing nursing practices associated with chemotherapy.
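
    The internal-consistency statistic quoted above (Cronbach's α = .97) can be computed from an items-by-respondents score matrix as in the short sketch below; the toy responses are simulated and stand in for the study's data.

    ```python
    # Illustrative computation of Cronbach's alpha from a respondents-by-items
    # score matrix. The simulated data below are invented, not the study's responses.
    import numpy as np

    def cronbach_alpha(scores):
        """scores: 2-D array, rows = respondents, columns = items (e.g., 0/1/2)."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)       # variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    rng = np.random.default_rng(0)
    ability = rng.normal(size=(200, 1))                               # latent competence
    toy = (ability + rng.normal(scale=0.7, size=(200, 10)) > 0).astype(int) * 2
    print(round(cronbach_alpha(toy), 3))
    ```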

  9. Assessment of the Clinical Trainer as a Role Model: A Role Model Apperception Tool (RoMAT)

    NARCIS (Netherlands)

    Jochemsen-van der Leeuw, H. G. A. Ria; van Dijk, Nynke; Wieringa-de Waard, Margreet

    2014-01-01

    Purpose Positive role modeling by clinical trainers is important for helping trainees learn professional and competent behavior. The authors developed and validated an instrument to assess clinical trainers as role models: the Role Model Apperception Tool (RoMAT). Method On the basis of a 2011

  10. Developing a Model Component

    Science.gov (United States)

    Fields, Christina M.

    2013-01-01

    The Spaceport Command and Control System (SCCS) Simulation Computer Software Configuration Item (CSCI) is responsible for providing simulations to support test and verification of SCCS hardware and software. The Universal Coolant Transporter System (UCTS) was a Space Shuttle Orbiter support piece of the Ground Servicing Equipment (GSE). The initial purpose of the UCTS was to provide two support services to the Space Shuttle Orbiter immediately after landing at the Shuttle Landing Facility. The UCTS is designed with the capability of servicing future space vehicles, including all Space Station requirements necessary for the MPLM Modules. The Simulation uses GSE Models to stand in for the actual systems to support testing of SCCS systems during their development. As an intern at Kennedy Space Center (KSC), my assignment was to develop a model component for the UCTS. I was given a fluid component (dryer) to model in Simulink. I completed training for UNIX and Simulink. The dryer is a Catch All replaceable core type filter-dryer. The filter-dryer provides maximum protection for the thermostatic expansion valve and solenoid valve from dirt that may be in the system. The filter-dryer also protects the valves from freezing up. I researched fluid dynamics to understand the function of my component. The filter-dryer was modeled by determining the effects it has on the pressure and velocity of the system. I used Bernoulli's Equation to calculate the pressure and velocity differential through the dryer. I created my filter-dryer model in Simulink and wrote the test script to test the component. I completed component testing and captured test data. The finalized model was sent for peer review for any improvements. I participated in Simulation meetings and was involved in the subsystem design process and team collaborations. I gained valuable work experience and insight into a career path as an engineer.
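
    The calculation described, Bernoulli's equation combined with continuity across a flow restriction, can be sketched in a few lines; the geometry and fluid properties below are assumed placeholder values, not the UCTS dryer's actual parameters.

    ```python
    # Hedged sketch: Bernoulli's equation plus continuity across a flow restriction,
    # giving the velocity and pressure change through a filter-dryer-like component.
    # Density, pressures, and areas are invented placeholders.
    RHO = 1100.0          # coolant density, kg/m^3 (assumed)

    def bernoulli_drop(p_in, v_in, area_in, area_out):
        """Return (v_out, p_out) across a restriction for ideal incompressible flow."""
        v_out = v_in * area_in / area_out                 # continuity: A1*v1 = A2*v2
        p_out = p_in + 0.5 * RHO * (v_in**2 - v_out**2)   # Bernoulli, no elevation change
        return v_out, p_out

    v2, p2 = bernoulli_drop(p_in=300_000.0, v_in=1.2, area_in=5e-4, area_out=2e-4)
    print(f"outlet velocity {v2:.2f} m/s, outlet pressure {p2 / 1000:.1f} kPa")
    ```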

  11. Pilot evaluation of a continuing professional development tool for developing leadership skills.

    Science.gov (United States)

    Patterson, Brandon J; Chang, Elizabeth H; Witry, Matthew J; Garza, Oscar W; Trewet, CoraLynn B

    2013-01-01

    Strategies are needed to assure essential nonclinical competencies, such as leadership, can be gained using a continuing professional development (CPD) framework. The objective of this study was to explore student pharmacists' utilization and perceived effectiveness of a CPD tool for leadership development in an elective course. Students completed 2 CPD cycles during a semester-long leadership elective using a CPD tool. A questionnaire was used to measure students' perceptions of utility, self-efficacy, and satisfaction in completing CPD cycles when using a tool to aid in this process. The CPD tool was completed twice by 7 students. On average, students spent nearly 5 hours per CPD cycle. More than half (57.1%) scored themselves as successful or very successful in achieving their learning plans, and most (71.4%) found the tool somewhat useful in developing their leadership skills. Some perceived that the tool provided a systematic way to engage in leadership development, whereas others found it difficult to use. In this pilot study, most student pharmacists successfully achieved a leadership development plan and found the CPD tool useful. Providing students with more guidance may help facilitate use and effectiveness of CPD tools. There is a need to continue to develop and refine tools that assist in the CPD of pharmacy practitioners at all levels. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. The Development of a Humanitarian Health Ethics Analysis Tool.

    Science.gov (United States)

    Fraser, Veronique; Hunt, Matthew R; de Laat, Sonya; Schwartz, Lisa

    2015-08-01

    Introduction: Health care workers (HCWs) who participate in humanitarian aid work experience a range of ethical challenges in providing care and assistance to communities affected by war, disaster, or extreme poverty. Although there is increasing discussion of ethics in humanitarian health care practice and policy, there are very few resources available for humanitarian workers seeking ethical guidance in the field. To address this knowledge gap, a Humanitarian Health Ethics Analysis Tool (HHEAT) was developed and tested as an action-oriented resource to support humanitarian workers in ethical decision making. While ethical analysis tools increasingly have become prevalent in a variety of practice contexts over the past two decades, very few of these tools have undergone a process of empirical validation to assess their usefulness for practitioners. A qualitative study consisting of a series of six case-analysis sessions with 16 humanitarian HCWs was conducted to evaluate and refine the HHEAT. Participant feedback inspired the creation of a simplified and shortened version of the tool and prompted the development of an accompanying handbook. The study generated preliminary insight into the ethical deliberation processes of humanitarian health workers and highlighted different types of ethics support that humanitarian workers might find helpful in supporting the decision-making process.

  13. Development of 3D CAD system as a design tool for PEACER development

    International Nuclear Information System (INIS)

    Lee, H. W.; Jung, K. J.; Jung, S. H.; Hwang, I. S.

    2003-01-01

    In an effort to resolve generic concerns with current power reactors, PEACER[1] has been developed as a proliferation-resistant waste transmutation reactor based on a unique combination of technologies of a proven fast reactor and the heavy liquid metal coolant. In order to develop the engineering design and visualize its performance, a three-dimensional computer aided design (3D CAD) method has been devised. Based on the conceptual design, the systems, structures and components of PEACER are defined. Using results from a finite element stress analyzer, a computational fluid dynamics tool, a nuclear analysis tool, etc., 3D visualization is achieved on the geometric construct based on CATIA[3]. A 3D visualization environment is utilized not only to overcome the integration complexity but also to manipulate data flow such as the meshing information used in analysis codes. The 3D CAD system in this paper includes an open language, Virtual Reality Modeling Language (VRML)[4,5], to deliver analysis results on 3D objects interactively. Such a modeling environment is expected to improve the efficiency of designing the conceptual reactor, PEACER, reducing time and cost. Results of the 3D design and system performance simulation will be presented.

  14. Development of 3D CAD system as a design tool for PEACER development

    Energy Technology Data Exchange (ETDEWEB)

    Lee, H. W.; Jung, K. J.; Jung, S. H.; Hwang, I. S. [Seoul National University, Seoul (Korea, Republic of)

    2003-07-01

    In an effort to resolve generic concerns with current power reactors, PEACER[1] has been developed as a proliferation-resistant waste transmutation reactor based on a unique combination of technologies of a proven fast reactor and the heavy liquid metal coolant. In order to develop the engineering design and visualize its performance, a three-dimensional computer aided design (3D CAD) method has been devised. Based on the conceptual design, the systems, structures and components of PEACER are defined. Using results from a finite element stress analyzer, a computational fluid dynamics tool, a nuclear analysis tool, etc., 3D visualization is achieved on the geometric construct based on CATIA[3]. A 3D visualization environment is utilized not only to overcome the integration complexity but also to manipulate data flow such as the meshing information used in analysis codes. The 3D CAD system in this paper includes an open language, Virtual Reality Modeling Language (VRML)[4,5], to deliver analysis results on 3D objects interactively. Such a modeling environment is expected to improve the efficiency of designing the conceptual reactor, PEACER, reducing time and cost. Results of the 3D design and system performance simulation will be presented.

  15. Development of 3D CAD system as a design tool for PEACER development

    International Nuclear Information System (INIS)

    Jeong, Kwang Jin; Lee, Hyoung Won; Jeong, Seung Ho; Shin, Jong Gye; Hwang, Il Soon

    2003-01-01

    In an effort to resolve generic concerns with current power reactors, PEACER has been developed as a proliferation-resistant waste transmutation reactor based on a unique combination of technologies of a proven fast reactor and the heavy liquid metal coolant. In order to develop engineering design and visualize its performance, a three-dimensional computer aided design (3D CAD) method has been devised. Based on conceptual design, system, structure and components of PEACER are defined. Using results from finite element stress analyzer, computational fluid dynamics tool, nuclear analysis tool, etc, 3D visualization is achieved on the geometric construct based on CATIA. A 3D visualization environment is utilized not only to overcome the integration complexity but also to manipulate data flow such as meshing information used in analysis codes. The 3D CAD system in this paper includes an open language, Virtual Reality Modeling Language (VRML), to deliver analyses results on 3D objects, interactively. Such modeling environment is expected to improve the efficiency of designing the conceptual reactor, PEACER, reducing time and cost. Results of 3D design and stress analysis simulation will be presented as an example case. (author)
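
    For readers unfamiliar with VRML, the snippet below gives the flavour of the format these records mention for delivering analysis results on 3D objects interactively: a short Python script that writes a single colored box as a VRML97 (VRML 2.0) scene. It is a toy example, not output from the PEACER design system.

    ```python
    # Minimal illustration of the VRML97 format: write one colored box to a .wrl file.
    # This is a generic example, unrelated to the PEACER geometry.
    vrml = """#VRML V2.0 utf8
    Shape {
      appearance Appearance {
        material Material { diffuseColor 0.8 0.2 0.2 }
      }
      geometry Box { size 2.0 1.0 1.0 }
    }
    """

    with open("component.wrl", "w") as f:   # .wrl is the conventional VRML extension
        f.write(vrml)
    print("wrote component.wrl")
    ```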

  16. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master thesis was performed at CERN, specifically in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, both based on the CODESYS development tool, into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the SCADA system of Siemens, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and software for automatic generation of the PLC code based on this library, called UAB. The integration aimed to give a solution shared by both PLC platforms and was based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and testing the behavior of the library code.

  17. Developing electronic cooperation tools: a case from Norwegian health care.

    Science.gov (United States)

    Larsen, Eli; Mydske, Per Kristen

    2013-06-19

    Many countries aim to create electronic cooperation tools in health care, but progress is rather slow. The study aimed to uncover how the authorities' financing policies influence the development of electronic cooperation tools within public health care. An interpretative approach was used in this study. We performed 30 semistructured interviews with vendors, policy makers, and public authorities. Additionally, we conducted an extensive documentation study and participated in 18 workshops concerning information and communication technology (ICT) in Norwegian health care. We found that organizations in sectors like health care, which have undergone an independent development of their internal information infrastructure, would find it difficult to create electronic services that interconnect the organizations, because such connections would affect all interconnected organizations within the heterogeneous structure. The organizations would, to a large extent, depend on new functionality in existing information systems. Electronic patient records play a central role in all parts of the health care sector, and a dependence is therefore established on these information systems and their vendors. The Norwegian government authorities, which run more than 80% of Norwegian health care, have not taken extraordinary steps to compensate for this dependency; the government's political philosophy is that each health care institution should pay for further electronic patient record development. However, cooperation tools are complex due to the number of players involved and the way they are intertwined with the overall workflow. The customers are not able to buy new functionalities on the drawing table, while the electronic patient record vendors are not willing to take the economic risk in developing cooperation tools. Thus, the market mechanisms in the domain are challenged. We also found that public projects that were only financed for the first

  18. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    Science.gov (United States)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the ongoing development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space is experienced as a dynamic entity whose spatial properties may vary from one part of a space to another, so the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices, each with its own properties, therefore becomes important, so that the different characteristics in each part of a space can inform the design process. The analytical tool is developed as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful in assisting BIM-based design development, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. It allows the identification of how spatial properties change dynamically throughout a space and the prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby help architects generate better designs and avoid unnecessary costs that are often caused by a failure to identify problems during the design development stages.
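
    A minimal sketch of the multi-slice idea follows: a space represented as a series of slices along one axis, each slice carrying its own properties, so that variation along the space can be inspected and flagged. The data structure, property names, and thresholds are hypothetical, not the tool's actual schema.

    ```python
    # Hypothetical multi-slice representation: each slice along an axis carries its
    # own spatial properties, and slices falling below assumed thresholds are flagged.
    from dataclasses import dataclass

    @dataclass
    class Slice:
        position_m: float      # distance along the slicing axis
        daylight_lux: float    # a per-slice property exported from a BIM model
        clear_width_m: float   # another per-slice property

    def flag_problems(slices, min_lux=300.0, min_width=1.2):
        """Return positions of slices that fall below the assumed design thresholds."""
        return [s.position_m for s in slices
                if s.daylight_lux < min_lux or s.clear_width_m < min_width]

    corridor = [Slice(0.0, 520, 1.8), Slice(2.0, 340, 1.5),
                Slice(4.0, 210, 1.5), Slice(6.0, 180, 1.0)]
    print("review slices at:", flag_problems(corridor))
    ```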

  19. The teaching portfolio as a professional development tool for anaesthetists.

    Science.gov (United States)

    Sidhu, N S

    2015-05-01

    A teaching portfolio (TP) is a document containing a factual description of a teacher's teaching strengths and accomplishments, allowing clinicians to display them for examination by others. The primary aim of a TP is to improve quality of teaching by providing a structure for self-reflection, which in turn aids professional development in medical education. Contents typically include a personal statement on teaching, an overview of teaching accomplishments and activities, feedback from colleagues and learners, a reflective component and some examples of teaching material. Electronic portfolios are more portable and flexible compared to paper portfolios. Clinicians gain the most benefit from a TP when it is used as a tool for self-reflection of their teaching practice and not merely as a list of activities and achievements. This article explains why and how anaesthetists might use a TP as a tool for professional development in medical education.

  20. Innovation tools of economic development of the enterprise

    Directory of Open Access Journals (Sweden)

    Fedor Pavlovich Zotov

    2012-12-01

    Full Text Available Ways to generate new economic and financial benefits from rationalization work in an industrial enterprise are considered. An attempt is made to combine rationalization work practices with the tools and techniques of modern management technologies. It is proposed that members of four types of cross-functional teams learn these tools and techniques through tutorials, and that the tutorials be distributed across the four stages of the PDCA management cycle. It is shown that the creation of the teams and the development of the tutorials will create internal resources for innovation projects and achieve effective changes in the economic development of the enterprise.

  1. Oral Development for LSP via Open Source Tools

    Directory of Open Access Journals (Sweden)

    Alejandro Curado Fuentes

    2015-11-01

    Full Text Available For the development of oral abilities in LSP, few computer-based teaching and learning resources have actually focused intensively on web-based listening and speaking. Many more focus on reading, writing, vocabulary and grammar activities. Our aim in this paper is to approach oral communication in the online environment of Moodle by striving to make it suitable for a learning project which incorporates oral skills. The paper describes a blended process in which both individual and collaborative learning strategies can be combined and exploited through the implementation of specific tools and resources which may go hand in hand with traditional face-to-face conversational classes. The challenge with this new perspective is, ultimately, to provide effective tools for oral LSP development in an apparently writing skill-focused medium.

  2. Tool Support for Collaborative Teaching and Learning of Object-Oriented Modelling

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ratzer, Anne Vinter

    2002-01-01

    Modeling is central to doing and learning object-oriented development. We present a new tool, Ideogramic UML, for gesture-based collaborative modeling with the Unified Modeling Language (UML), which can be used to collaboratively teach and learn modeling. Furthermore, we discuss how we have effectively used Ideogramic UML to teach object-oriented modeling and the UML to groups of students using the UML for project assignments.

  3. Predictive models of moth development

    Science.gov (United States)

    Degree-day models link ambient temperature to insect life-stages, making such models valuable tools in integrated pest management. These models increase management efficacy by predicting pest phenology. In Wisconsin, the top insect pest of cranberry production is the cranberry fruitworm, Acrobasis v...
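
    A degree-day model of this general kind accumulates average daily temperature above a base threshold, as in the sketch below; the base temperature, daily temperatures, and any trigger value are placeholders rather than calibrated cranberry fruitworm parameters.

    ```python
    # Illustrative degree-day accumulation using the simple averaging method:
    # max(0, mean daily temperature - base temperature), summed over the season.
    # Base temperature and daily temperatures are placeholder values.
    def daily_degree_days(t_max, t_min, t_base=10.0):
        return max(0.0, (t_max + t_min) / 2.0 - t_base)

    daily_temps = [(18, 7), (21, 10), (25, 13), (24, 12), (27, 15)]  # (Tmax, Tmin) in C
    accumulated = 0.0
    for t_max, t_min in daily_temps:
        accumulated += daily_degree_days(t_max, t_min)

    # A life-stage event would be predicted when 'accumulated' crosses a calibrated threshold.
    print(f"accumulated degree-days: {accumulated:.1f}")
    ```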

  4. Development Of Dynamic Probabilistic Safety Assessment: The Accident Dynamic Simulator (ADS) Tool

    International Nuclear Information System (INIS)

    Chang, Y.H.; Mosleh, A.; Dang, V.N.

    2003-01-01

    The development of a dynamic methodology for Probabilistic Safety Assessment (PSA) addresses the complex interactions between the behaviour of technical systems and personnel response in the evolution of accident scenarios. This paper introduces the discrete dynamic event tree, a framework for dynamic PSA, and its implementation in the Accident Dynamic Simulator (ADS) tool. Dynamic event tree tools generate and quantify accident scenarios through coupled simulation models of the plant physical processes, its automatic systems, the equipment reliability, and the human response. The current research on the framework, the ADS tool, and on Human Reliability Analysis issues within dynamic PSA, is discussed. (author)

  5. Development and testing of a community stakeholder park audit tool.

    Science.gov (United States)

    Kaczynski, Andrew T; Stanis, Sonja A Wilhelm; Besenyi, Gina M

    2012-03-01

    Parks are valuable community resources, and auditing park environments is important for understanding their influence on physical activity and health. However, few tools exist that engage citizens in this process. The purpose of this study was to develop a user-friendly tool that would enable diverse stakeholders to quickly and reliably audit community parks for their potential to promote physical activity. A secondary aim was to examine community stakeholders' reactions to the process of developing and using the new tool. The study employed a sequential, multiphase process including three workshops and field testing to ensure the new instrument was the product of input and feedback from a variety of potential stakeholders and was psychometrically sound. All study stages, including data collection and analysis, occurred in 2010. Stakeholder recommendations were combined with reviews of existing instruments to create the new Community Park Audit Tool (CPAT). The CPAT contains four sections titled Park Information, Access and Surrounding Neighborhood, Park Activity Areas, and Park Quality and Safety. Inter-rater analyses demonstrated strong reliability for the vast majority of the items in the tool. Further, stakeholders reported a range of positive reactions resulting from their engagement in the project. The CPAT provides a reliable and user-friendly means of auditing parks for their potential to promote physical activity. Future use of the CPAT can facilitate greater engagement of diverse groups in evaluating and advocating for improved parks and overall healthy community design. Copyright © 2012 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  6. CREST Cost of Renewable Energy Spreadsheet Tool: A Model for Developing Cost-Based Incentives in the United States; User Manual Version 4, August 2009 - March 2011 (Updated July 2013)

    Energy Technology Data Exchange (ETDEWEB)

    Gifford, J. S.; Grace, R. C.

    2013-07-01

    The objective of this document is to help model users understand how to use the CREST model to support renewable energy incentives, FITs, and other renewable energy rate-setting processes. This user manual will walk the reader through the spreadsheet tool, including its layout and conventions, offering context on how and why it was created. This user manual will also provide instructions on how to populate the model with inputs that are appropriate for a specific jurisdiction's policymaking objectives and context. Finally, the user manual will describe the results and outline how these results may inform decisions about long-term renewable energy support programs.
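
    As a heavily simplified illustration of the general kind of calculation a cost-of-renewable-energy model performs, the sketch below levelizes capital and operating costs over annual generation using a capital recovery factor; the formula and inputs are generic textbook placeholders, not CREST's methodology or default assumptions.

    ```python
    # Generic levelized-cost-style calculation, for illustration only.
    # Inputs are placeholder values; this is not the CREST model.
    def capital_recovery_factor(rate, years):
        return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

    capex = 2_000_000.0        # installed cost, $
    annual_om = 40_000.0       # fixed O&M, $/yr
    annual_mwh = 3_500.0       # generation, MWh/yr
    crf = capital_recovery_factor(rate=0.07, years=20)

    cost_of_energy = (capex * crf + annual_om) / annual_mwh   # $/MWh
    print(f"indicative cost of energy: ${cost_of_energy:.0f}/MWh")
    ```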

  7. Development of the Writing Readiness Inventory Tool in Context (WRITIC)

    OpenAIRE

    van Hartingsveldt, Margo J.; de Vries, Liesbeth; Cup, Edith HC; de Groot, Imelda JM; Nijhuis-van der Sanden, Maria WG

    2014-01-01

    This article describes the development of the Writing Readiness Inventory Tool in Context (WRITIC), a measurement evaluating writing readiness in Dutch kindergarten children (5 and 6 years old). Content validity was established through 10 expert evaluations in three rounds. Construct validity was established with 251 children following regular education. To identify scale constructs, factor analysis was performed. Discriminative validity was established by examining contrast groups with good ...

  8. EUV source development for high-volume chip manufacturing tools

    Science.gov (United States)

    Stamm, Uwe; Yoshioka, Masaki; Kleinschmidt, Jürgen; Ziener, Christian; Schriever, Guido; Schürmann, Max C.; Hergenhan, Guido; Borisov, Vladimir M.

    2007-03-01

    Xenon-fueled gas discharge produced plasma (DPP) sources were integrated into Micro Exposure Tools as early as 2004. Operation of these tools in a research environment provided early learning for the development of EUV sources for Alpha- and Beta-Tools. Further experiments with these sources were performed to gain a basic understanding of EUV source technology and its limits, especially the achievable power and reliability. The intermediate focus power of the Alpha-Tool sources under development has been measured at values above 10 W. Debris mitigation schemes were successfully integrated into the sources, leading to reasonable collector mirror lifetimes, with a target of 10 billion pulses, due to the effective reduction of debris flux. Source collector mirrors, which withstand the radiation and temperature load of Xenon-fueled sources, have been developed in cooperation with MediaLario Technologies to support an intermediate focus power well above 10 W. To fulfill the requirements for High Volume chip Manufacturing (HVM) applications, a new concept for HVM EUV sources with higher efficiency has been developed at XTREME technologies. The discharge produced plasma (DPP) source concept combines the use of rotating disk electrodes (RDE) with laser-excited droplet targets. The source concept is called the laser assisted droplet RDE source. The fuel of these sources has been selected to be Tin. The conversion efficiency achieved with the laser assisted droplet RDE source is 2-3x higher than with Xenon. Very high pulse energies, well above 200 mJ / 2π sr, have been measured with first prototypes of the laser assisted droplet RDE source. If these high pulse energies can be maintained at higher repetition rates, a 10 kHz EUV source could deliver 2000 W / 2π sr. According to the first experimental data, the new concept is expected to be scalable to an intermediate focus power at the 300 W level.

  9. Personnel training and development as a tool for organizational efficiency

    OpenAIRE

    Shodeinde, Olubukunola

    2015-01-01

    This study examined personnel training and development as a tool for organizational efficiency. Employees of the MTN Corporate Head Office in Lagos State served as the study population. The study adopted a qualitative approach, using a questionnaire as the main instrument of primary data collection. A total of 110 questionnaires were administered to 217 employees of MTN Nigeria. Using bar charts to illustrate the degree of response, the findings show that respondents agreed that there...

  10. Development of a Visual Inspection Data Collection Tool for Evaluation of Fielded PV Module Condition

    Energy Technology Data Exchange (ETDEWEB)

    Packard, C. E.; Wohlgemuth, J. H.; Kurtz, S. R.

    2012-08-01

    A visual inspection data collection tool for the evaluation of fielded photovoltaic (PV) modules has been developed to facilitate describing the condition of PV modules with regard to field performance. The proposed data collection tool consists of 14 sections, each documenting the appearance or properties of a part of the module. This report instructs on how to use the collection tool and defines each attribute to ensure reliable and valid data collection. This tool has been evaluated through the inspection of over 60 PV modules produced by more than 20 manufacturers and fielded at two different sites for varying periods of time. Aggregated data from such a single data collection tool has the potential to enable longitudinal studies of module condition over time, technology evolution, and field location for the enhancement of module reliability models.

  11. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  12. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD) which is a growing trend in the software industry is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools, according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  13. Developing a Malaysia flood model

    Science.gov (United States)

    Haseldine, Lucy; Baxter, Stephen; Wheeler, Phil; Thomson, Tina

    2014-05-01

    Faced with growing exposures in Malaysia, insurers have a need for models to help them assess their exposure to flood losses. The need for an improved management of flood risks has been further highlighted by the 2011 floods in Thailand and recent events in Malaysia. The increasing demand for loss accumulation tools in Malaysia has led to the development of the first nationwide probabilistic Malaysia flood model, which we present here. The model is multi-peril, including river flooding for thousands of kilometres of river and rainfall-driven surface water flooding in major cities, which may cause losses equivalent to river flood in some high-density urban areas. The underlying hazard maps are based on a 30m digital surface model (DSM) and 1D/2D hydraulic modelling in JFlow and RFlow. Key mitigation schemes such as the SMART tunnel and drainage capacities are also considered in the model. The probabilistic element of the model is driven by a stochastic event set based on rainfall data, hence enabling per-event and annual figures to be calculated for a specific insurance portfolio and a range of return periods. Losses are estimated via depth-damage vulnerability functions which link the insured damage to water depths for different property types in Malaysia. The model provides a unique insight into Malaysian flood risk profiles and provides insurers with return period estimates of flood damage and loss to property portfolios through loss exceedance curve outputs. It has been successfully validated against historic flood events in Malaysia and is now being successfully used by insurance companies in the Malaysian market to obtain reinsurance cover.
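
    Two ingredients named above, a depth-damage vulnerability curve and a loss exceedance curve built from per-event losses and event rates, are sketched below; the curve shape, exposures, and rates are invented for illustration and do not reflect the Malaysia model.

    ```python
    # Toy depth-damage vulnerability curve and loss exceedance curve.
    # All values are invented placeholders, not the model's calibration.
    import numpy as np

    def damage_ratio(depth_m):
        """Toy vulnerability curve: damage fraction rises with water depth, capped at 1."""
        return np.clip(0.35 * np.asarray(depth_m) ** 0.7, 0.0, 1.0)

    # Per-event portfolio loss = sum over properties of insured value * damage ratio.
    values = np.array([250_000, 400_000, 150_000])                     # insured values
    event_depths = np.array([[0.2, 0.0, 1.1], [0.8, 0.5, 0.3], [2.0, 1.6, 0.9]])
    event_losses = (values * damage_ratio(event_depths)).sum(axis=1)

    # Loss exceedance: given annual event rates, the rate of exceeding each loss level.
    rates = np.array([0.20, 0.05, 0.01])                               # events per year
    order = np.argsort(event_losses)[::-1]
    exceedance_rate = np.cumsum(rates[order])
    for loss, rate in zip(event_losses[order], exceedance_rate):
        print(f"loss >= {loss:,.0f} at roughly 1-in-{1 / rate:.0f} years")
    ```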

  14. Using an evaluative tool to develop effective mathscasts

    Science.gov (United States)

    Galligan, Linda; Hobohm, Carola; Peake, Katherine

    2017-09-01

    This study is situated in a course designed for both on-campus and online pre-service and in-service teachers, where student-created mathscasts provide a way for university lecturers to assess students' quality of teaching, and understanding of mathematics. Teachers and pre-service teachers, in a university course with 90% online enrolment, were asked to create mathscasts to explain mathematics concepts at middle school level. This paper describes the process of developing and refining a tool for the creation and evaluation of quality student-produced mathscasts. The study then investigates the usefulness of the tool within the context of pedagogy and mathematical understanding. Despite an abundance of mathscasts already available on the web, there is merit in creating mathscasts, not only as a tool for teaching, but also as a means of learning by doing. The premise for creating student-produced mathscasts was to capture the creators' mathematical understanding and pedagogical approach to teaching a mathematical concept, which were then peer-assessed and graded. The analysis included surveys, practice mathscasts with peer- and self-reviews, and students' final assessed mathscasts. The results indicate that the use of the evaluative tool resulted in an improvement in quality of student-created mathscasts and critiques thereof. The paper concludes with a discussion on future directions of student-produced mathscasts.

  15. Analytical Modelling Of Milling For Tool Design And Selection

    Science.gov (United States)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-05-01

    This paper presents an efficient analytical model which allows the simulation of a wide range of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.

  16. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which allows the simulation of a wide range of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.
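
    For orientation, a much simpler mechanistic force formulation than the thermomechanical model above is sketched below: per-flute forces proportional to the instantaneous chip thickness. The cutting coefficients and cutting conditions are placeholder values, not identified from the paper's experiments.

    ```python
    # Classical textbook mechanistic cutting-force estimate for one flute of an end mill.
    # Coefficients and cutting conditions are placeholders; this is not the paper's
    # thermomechanical oblique cutting model.
    import math

    Ktc, Kte = 1800.0, 25.0      # tangential cutting / edge coefficients (N/mm^2, N/mm)
    Krc, Kre = 650.0, 20.0       # radial cutting / edge coefficients
    a_p, f_z = 3.0, 0.10         # axial depth of cut (mm), feed per tooth (mm)

    for deg in range(0, 181, 30):                 # flute immersion angle (slotting: 0-180 deg)
        phi = math.radians(deg)
        h = f_z * math.sin(phi)                   # instantaneous chip thickness (mm)
        f_t = Ktc * a_p * h + Kte * a_p           # tangential force, N
        f_r = Krc * a_p * h + Kre * a_p           # radial force, N
        print(f"phi={deg:3d} deg  Ft={f_t:6.1f} N  Fr={f_r:6.1f} N")
    ```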

  17. Development and content validation of the power mobility training tool.

    Science.gov (United States)

    Kenyon, Lisa K; Farris, John P; Cain, Brett; King, Emily; VandenBerg, Ashley

    2018-01-01

    This paper outlines the development and content validation of the power mobility training tool (PMTT), an observational tool designed to assist therapists in developing power mobility training programs for children who have multiple, severe impairments. Initial items on the PMTT were developed based on a literature review and in consultation with therapists experienced in the use of power mobility. Items were trialled in clinical settings, reviewed, and refined. Items were then operationalized and an administration manual detailing scoring for each item was created. Qualitative and quantitative methods were used to establish content validity via a 15-member international expert panel. The content validity ratio (CVR) was determined for each possible item. Of the 19 original items, 10 achieved minimum required CVR values and were included in the final version of the PMTT. Items related to manoeuvring a power mobility device were merged, and an item related to the number of switches used concurrently to operate a power mobility device was added to the PMTT. The PMTT may assist therapists in developing training programs that facilitate the acquisition of beginning power mobility skills in children who have multiple, severe impairments. Implications for Rehabilitation: The Power Mobility Training Tool (PMTT) was developed to help guide the development of power mobility intervention programs for children who have multiple, severe impairments. The PMTT can be used with children who access a power mobility device using either a joystick or a switch. Therapists who have limited experience with power mobility may find the PMTT to be helpful in setting up and conducting power mobility training interventions as a feasible aspect of a plan of care for children who have multiple, severe impairments.
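
    The content validity ratio used to retain or drop items follows Lawshe's formula, CVR = (n_e - N/2) / (N/2), where n_e is the number of panelists rating an item essential and N is the panel size; the sketch below applies it to invented ratings from a 15-member panel like the one described.

    ```python
    # Lawshe's content validity ratio (CVR) for a few items rated by a 15-member panel.
    # The essentiality counts are invented; the minimum acceptable CVR depends on panel size.
    def cvr(n_essential, n_panelists):
        half = n_panelists / 2.0
        return (n_essential - half) / half

    panel_size = 15
    for item, n_essential in {"item_01": 13, "item_02": 11, "item_03": 8}.items():
        print(item, round(cvr(n_essential, panel_size), 2))
    ```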

  18. GEOQUIMICO : an interactive tool for comparing sorption conceptual models (surface complexation modeling versus K[D])

    International Nuclear Information System (INIS)

    Hammond, Glenn E.; Cygan, Randall Timothy

    2007-01-01

    Within reactive geochemical transport, several conceptual models exist for simulating sorption processes in the subsurface. Historically, the K[D] approach has been the method of choice due to ease of implementation within a reactive transport model and straightforward comparison with experimental data. However, for modeling complex sorption phenomenon (e.g. sorption of radionuclides onto mineral surfaces), this approach does not systematically account for variations in location, time, or chemical conditions, and more sophisticated methods such as a surface complexation model (SCM) must be utilized. It is critical to determine which conceptual model to use; that is, when the material variation becomes important to regulatory decisions. The geochemical transport tool GEOQUIMICO has been developed to assist in this decision-making process. GEOQUIMICO provides a user-friendly framework for comparing the accuracy and performance of sorption conceptual models. The model currently supports the K[D] and SCM conceptual models. The code is written in the object-oriented Java programming language to facilitate model development and improve code portability. The basic theory underlying geochemical transport and the sorption conceptual models noted above is presented in this report. Explanations are provided of how these physicochemical processes are instrumented in GEOQUIMICO and a brief verification study comparing GEOQUIMICO results to data found in the literature is given
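
    The contrast between the two conceptual models can be made concrete with the constant-K[D] idealization: sorbed concentration S = K[D] * C and retardation factor R = 1 + (bulk density / porosity) * K[D], which a surface complexation model replaces with sorption that varies with pH, ionic strength, and surface site chemistry. The sketch below uses generic placeholder values.

    ```python
    # Constant-Kd idealization: retardation factor R = 1 + (rho_b / theta) * Kd.
    # Bulk density, porosity, and Kd values are generic placeholders.
    def retardation_factor(kd_mL_per_g, bulk_density_g_per_mL=1.6, porosity=0.3):
        return 1.0 + (bulk_density_g_per_mL / porosity) * kd_mL_per_g

    for kd in (0.0, 0.5, 5.0, 50.0):                 # distribution coefficients, mL/g
        r = retardation_factor(kd)
        print(f"Kd = {kd:5.1f} mL/g  ->  retardation factor R = {r:.1f}")
    ```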

  19. Development of a tool to evaluate geropsychology knowledge and skill competencies.

    Science.gov (United States)

    Karel, Michele J; Emery, Erin E; Molinari, Victor

    2010-09-01

    Workforce shortages to meet the mental health needs of the world's aging population are well documented. Within the field of professional geropsychology in the U.S.A., a national conference was convened in 2006 to delineate competencies for psychological practice with older adults and a training model for the field. The conference produced the Pikes Peak Model of Geropsychology Training. The Council of Professional Geropsychology Training Programs (CoPGTP) aimed to produce a competency evaluation tool to help individuals define training needs for and evaluate progress in development of the Pikes Peak professional geropsychology competencies. A CoPGTP task force worked for one year to adapt the Pikes Peak Model geropsychology attitude, knowledge, and skill competencies into an evaluation tool for use by supervisors, students and professional psychologists at all levels of geropsychology training. The task force developed a competency rating tool, which included delineation of behavioral anchors for each of the Pikes Peak geropsychology knowledge and skill competencies and use of a developmental rating scale. Pilot testing was conducted, with 13 individuals providing feedback on the clarity and feasibility of the tool for evaluation of oneself or students. The Geropsychology Knowledge and Skills Assessment Tool, Version 1.1, is now posted on the CoPGTP website and is being used by geropsychology training programs in the U.S.A. The evaluation tool has both strengths and limitations. We discuss future directions for its ongoing validation and professional use.

  20. Feasibility assessment tool for urban anaerobic digestion in developing countries.

    Science.gov (United States)

    Lohri, Christian Riuji; Rodić, Ljiljana; Zurbrügg, Christian

    2013-09-15

    This paper describes a method developed to support feasibility assessments of urban anaerobic digestion (AD). The method not only uses technical assessment criteria but takes a broader sustainability perspective and integrates technical-operational, environmental, financial-economic, socio-cultural, institutional, policy and legal criteria into the assessment tool developed. Use of the tool can support decision-makers in selecting the most suitable set-up for the given context. The tool consists of a comprehensive set of questions, structured along four distinct yet interrelated dimensions of sustainability factors, which all influence the success of any urban AD project. Each dimension answers a specific question: I) WHY? What are the driving forces and motivations behind the initiation of the AD project? II) WHO? Who are the stakeholders and what are their roles, power, interests and means of intervention? III) WHAT? What are the physical components of the proposed AD chain and the respective mass and resource flows? IV) HOW? What are the key features of the enabling or disabling environment (sustainability aspects) affecting the proposed AD system? Disruptive conditions within these four dimensions are detected. Multi Criteria Decision Analysis is used to guide the process of translating the answers from six sustainability categories into scores, combining them with the relative importance (weights) attributed by the stakeholders. Risk assessment further evaluates the probability that certain aspects develop differently than originally planned and assesses the data reliability (uncertainty factors). The use of the tool is demonstrated with its application in a case study for Bahir Dar in Ethiopia. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
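
    The multi-criteria step described above can be illustrated with a simple weighted-sum sketch: stakeholder weights combined with 0-1 category scores into one feasibility score per AD set-up. The category names, weights, and scores below are illustrative only, not the tool's actual scheme.

    ```python
    # Toy weighted-sum multi-criteria scoring of two hypothetical AD set-ups.
    # Weights and scores are invented placeholders.
    weights = {"technical": 0.20, "environmental": 0.15, "financial": 0.25,
               "socio-cultural": 0.15, "institutional": 0.15, "legal": 0.10}

    options = {
        "centralized digester": {"technical": 0.8, "environmental": 0.7, "financial": 0.5,
                                 "socio-cultural": 0.6, "institutional": 0.4, "legal": 0.7},
        "decentralized units":  {"technical": 0.6, "environmental": 0.8, "financial": 0.7,
                                 "socio-cultural": 0.7, "institutional": 0.6, "legal": 0.7},
    }

    for name, scores in options.items():
        total = sum(weights[c] * scores[c] for c in weights)   # weights sum to 1.0
        print(f"{name}: weighted score = {total:.2f}")
    ```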

  1. Designing tools for oil exploration using nuclear modeling

    Science.gov (United States)

    Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike

    2017-09-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  2. Designing tools for oil exploration using nuclear modeling

    Directory of Open Access Journals (Sweden)

    Mauborgne Marie-Laure

    2017-01-01

    Full Text Available When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  3. Toward the Development of Virtual Surgical Tools to Aid Orthopaedic FE Analyses

    Directory of Open Access Journals (Sweden)

    Srinivas C. Tadepalli

    2010-01-01

    Full Text Available Computational models of joint anatomy and function provide a means for biomechanists, physicians, and physical therapists to understand the effects of repetitive motion, acute injury, and degenerative diseases. Finite element models, for example, may be used to predict the outcome of a surgical intervention or to improve the design of prosthetic implants. Countless models have been developed over the years to address a myriad of orthopaedic procedures. Unfortunately, few studies have incorporated patient-specific models. Historically, baseline anatomic models have been used due to the demands associated with model development. Moreover, surgical simulations impose additional modeling challenges. Current meshing practices do not readily accommodate the inclusion of implants. Our goal is to develop a suite of tools (virtual instruments and guides) which enable surgical procedures to be readily simulated and to facilitate the development of all-hexahedral finite element mesh definitions.

  4. Simulation Modeling of Lakes in Undergraduate and Graduate Classrooms Increases Comprehension of Climate Change Concepts and Experience with Computational Tools

    Science.gov (United States)

    Carey, Cayelan C.; Gougis, Rebekka Darner

    2017-01-01

    Ecosystem modeling is a critically important tool for environmental scientists, yet is rarely taught in undergraduate and graduate classrooms. To address this gap, we developed a teaching module that exposes students to a suite of modeling skills and tools (including computer programming, numerical simulation modeling, and distributed computing)…

  5. Development and testing of a community flood resilience measurement tool

    Science.gov (United States)

    Keating, Adriana; Campbell, Karen; Szoenyi, Michael; McQuistan, Colin; Nash, David; Burer, Meinrad

    2017-01-01

    Given the increased attention on resilience strengthening in international humanitarian and development work, there is a growing need to invest in its measurement and the overall accountability of resilience strengthening initiatives. The purpose of this article is to present our framework and tool for measuring community-level resilience to flooding and generating empirical evidence and to share our experience in the application of the resilience concept. At the time of writing the tool is being tested in 75 communities across eight countries. Currently 88 potential sources of resilience are measured at the baseline (initial state) and end line (final state) approximately 2 years later. If a flood occurs in the community during the study period, resilience outcome measures are recorded. By comparing pre-flood characteristics to post-flood outcomes, we aim to empirically verify sources of resilience, something which has never been done in this field. There is an urgent need for the continued development of theoretically anchored, empirically verified, and practically applicable disaster resilience measurement frameworks and tools so that the field may (a) deepen understanding of the key components of disaster resilience in order to better target resilience-enhancing initiatives, (b) enhance our ability to benchmark and measure disaster resilience over time, and (c) compare how resilience changes as a result of different capacities, actions and hazards.

  6. Tool flank wear model and parametric optimization in end milling of metal matrix composite using carbide tool: Response surface methodology approach

    Directory of Open Access Journals (Sweden)

    R. Arokiadass

    2012-04-01

    Full Text Available Highly automated CNC end milling machines in the manufacturing industry require a reliable model for prediction of tool flank wear. Such a model can then be used to predict the tool flank wear (VBmax) from the process parameters. In this investigation an attempt was made to develop an empirical relationship to predict the tool flank wear (VBmax) of carbide tools while machining LM25 Al/SiCp, incorporating process parameters such as spindle speed (N), feed rate (f), depth of cut (d) and the weight percentage of silicon carbide (S). Response surface methodology (RSM) was applied to optimize the end milling process parameters and attain minimum tool flank wear. Predicted values obtained from the developed model and experimental results were compared, and an error of less than 5 percent was observed. In addition, it is concluded that flank wear increases with increasing SiCp weight percentage in the MMC.
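
    In RSM studies of this kind, the empirical relationship is usually a second-order (quadratic) response surface in the process parameters. The sketch below fits such a surface with scikit-learn; the design points, wear values and the chosen operating point are hypothetical and are not the paper's data.

    ```python
    # Hypothetical sketch: quadratic response surface for tool flank wear VBmax
    # as a function of spindle speed (N), feed rate (f), depth of cut (d) and
    # SiCp weight percentage (S). All numbers are illustrative.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    # Illustrative design points: columns are N [rpm], f [mm/rev], d [mm], S [%wt]
    X = np.array([
        [2000, 0.02, 0.5,  5],
        [2500, 0.03, 1.0, 10],
        [3000, 0.04, 1.5, 15],
        [3500, 0.05, 2.0, 20],
        [3000, 0.02, 1.0, 20],
        [2500, 0.05, 0.5, 15],
        [2000, 0.04, 2.0, 10],
        [3500, 0.03, 1.5,  5],
    ])
    vb_max = np.array([0.12, 0.16, 0.21, 0.28, 0.22, 0.19, 0.20, 0.17])  # mm, made up

    # Second-order polynomial (squares + interactions), as in standard RSM practice;
    # a real study would use a proper design (e.g. central composite) with more runs.
    model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    model.fit(X, vb_max)

    new_point = np.array([[2750, 0.035, 1.2, 12]])
    print("Predicted VBmax (mm):", model.predict(new_point)[0])
    ```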

  7. The development of a practical tool for risk assessment of manual work – the HAT-tool

    NARCIS (Netherlands)

    Kraker, H. de; Douwes, M.

    2008-01-01

    For the Dutch Ministry of Social Affairs and Employment we developed a tool to assess the risks of developing complaints of the arm, neck or shoulders during manual work. The tool was developed for every type of organization and is easy to use, does not require measurements other than time and can

  8. Requirements Document for Development of a Livermore Tomography Tools Interface

    Energy Technology Data Exchange (ETDEWEB)

    Seetho, I. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-09

    In this document, we outline an exercise performed at LLNL to evaluate the user interface deficits of an LLNL-developed CT reconstruction software package, Livermore Tomography Tools (LTT). We observe that a difficult-to-use command line interface and the lack of support functions compound to generate a bottleneck in the CT reconstruction process when input parameters to key functions are not well known. Through the exercise of systems engineering best practices, we generate key performance parameters for an LTT interface refresh, and specify a combination of back-end (“test-mode” functions) and front-end (graphical user interface visualization and command scripting tools) solutions to LTT’s poor user interface that aim to mitigate issues and lower costs associated with CT reconstruction using LTT. Key functional and non-functional requirements and risk mitigation strategies for the solution are outlined and discussed.

  9. The development of a tool to predict team performance.

    Science.gov (United States)

    Sinclair, M A; Siemieniuch, C E; Haslam, R A; Henshaw, M J D C; Evans, L

    2012-01-01

    The paper describes the development of a tool to predict quantitatively the success of a team when executing a process. The tool was developed for the UK defence industry, though it may be useful in other domains. It is expected to be used by systems engineers in initial stages of systems design, when concepts are still fluid, including the structure of the team(s) which are expected to be operators within the system. It enables answers to be calculated for questions such as "What happens if I reduce team size?" and "Can I reduce the qualifications necessary to execute this process and still achieve the required level of success?". The tool has undergone verification and validation; it predicts fairly well and shows promise. An unexpected finding is that the tool creates a good a priori argument for significant attention to Human Factors Integration in systems projects. The simulations show that if a systems project takes full account of human factors integration (selection, training, process design, interaction design, culture, etc.) then the likelihood of team success will be in excess of 0.95. As the project derogates from this state, the likelihood of team success will drop as low as 0.05. If the team has good internal communications and good individuals in key roles, the likelihood of success rises towards 0.25. Even with a team comprising the best individuals, p(success) will not be greater than 0.35. It is hoped that these results will be useful for human factors professionals involved in systems design. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  10. Combining modelling tools to evaluate a goose management scheme

    NARCIS (Netherlands)

    Baveco, Hans; Bergjord, Anne Kari; Bjerke, Jarle W.; Chudzińska, Magda E.; Pellissier, Loïc; Simonsen, Caroline E.; Madsen, Jesper; Tombre, Ingunn M.; Nolet, Bart A.

    2017-01-01

    Many goose species feed on agricultural land, and with growing goose numbers, conflicts with agriculture are increasing. One possible solution is to designate refuge areas where farmers are paid to leave geese undisturbed. Here, we present a generic modelling tool that can be used to designate the

  11. Combining modelling tools to evaluate a goose management scheme.

    NARCIS (Netherlands)

    Baveco, J.M.; Bergjord, A.K.; Bjerke, J.W.; Chudzińska, M.E.; Pellissier, L.; Simonsen, C.E.; Madsen, J.; Tombre, Ingunn M.; Nolet, B.A.

    2017-01-01

    Many goose species feed on agricultural land, and with growing goose numbers, conflicts with agriculture are increasing. One possible solution is to designate refuge areas where farmers are paid to leave geese undisturbed. Here, we present a generic modelling tool that can be used to designate the

  12. Molecular Modeling: A Powerful Tool for Drug Design and Molecular ...

    Indian Academy of Sciences (India)

    Molecular Modeling: A Powerful Tool for Drug Design and Molecular Docking. Rama Rao Nadendla. General Article, Resonance – Journal of Science Education, Volume 9, Issue 5, May 2004, pp 51-60.

  13. California Geriatric Education Center Logic Model: An Evaluation and Communication Tool

    Science.gov (United States)

    Price, Rachel M.; Alkema, Gretchen E.; Frank, Janet C.

    2009-01-01

    A logic model is a communications tool that graphically represents a program's resources, activities, priority target audiences for change, and the anticipated outcomes. This article describes the logic model development process undertaken by the California Geriatric Education Center in spring 2008. The CGEC is one of 48 Geriatric Education…

  14. A new web-based modelling tool (Websim-MILQ) aimed at optimisation of thermal treatments in the dairy industry

    NARCIS (Netherlands)

    Schutyser, M.A.I.; Straatsma, J.; Keijzer, P.M.; Verschueren, M.; Jong, de P.

    2008-01-01

    In the framework of a cooperative EU research project (MILQ-QC-TOOL) a web-based modelling tool (Websim-MILQ) was developed for optimisation of thermal treatments in the dairy industry. The web-based tool enables optimisation of thermal treatments with respect to product safety, quality and costs.

  15. Development and Assessment of a New 3D Neuroanatomy Teaching Tool for MRI Training

    Science.gov (United States)

    Drapkin, Zachary A.; Lindgren, Kristen A.; Lopez, Michael J.; Stabio, Maureen E.

    2015-01-01

    A computerized three-dimensional (3D) neuroanatomy teaching tool was developed for training medical students to identify subcortical structures on a magnetic resonance imaging (MRI) series of the human brain. This program allows the user to transition rapidly between two-dimensional (2D) MRI slices, 3D object composites, and a combined model in…

  16. A Modeling Tool for Household Biogas Burner Flame Port Design

    Science.gov (United States)

    Decker, Thomas J.

    Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost respectively. To support the ongoing scaling effort of biogas in rural communities, this study has developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created through guidance from previous studies found in the literature. The three highest performing designs identified by the tool were manufactured and tested experimentally to validate tool output and to compare against the original port geometry. The experimental results aligned with the tool's prediction for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.
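
    The closing point about designing for a target range of port hydraulic diameter and velocity can be made concrete with the standard definitions D_h = 4A/P and v = Q/(n·A). The sketch below applies them to one circular and one rectangular port geometry; every number is hypothetical and unrelated to the Lotus stove's actual dimensions.

    ```python
    # Illustrative calculation of port hydraulic diameter and exit velocity
    # for circular vs. rectangular flame ports (values are hypothetical).
    import math

    def hydraulic_diameter_circle(d):
        """D_h of a circular port equals its diameter: 4*(pi d^2/4)/(pi d) = d."""
        return d

    def hydraulic_diameter_rect(w, h):
        """D_h = 4*A/P for a rectangular slot of width w and height h."""
        return 4 * (w * h) / (2 * (w + h))

    def port_exit_velocity(flow_m3_per_h, port_area_m2, n_ports):
        """Mean velocity of the air/biogas mixture leaving each port."""
        flow_m3_per_s = flow_m3_per_h / 3600.0
        return flow_m3_per_s / (n_ports * port_area_m2)

    # Hypothetical burner: 0.45 m^3/h of premixed air/biogas through 30 ports
    flow = 0.45
    d = 2.5e-3                      # 2.5 mm circular ports
    area_circ = math.pi * d**2 / 4
    print("circular D_h  [mm]:", 1e3 * hydraulic_diameter_circle(d))
    print("circular v   [m/s]:", port_exit_velocity(flow, area_circ, 30))

    w, h = 4.0e-3, 1.5e-3           # 4 mm x 1.5 mm rectangular slots
    area_rect = w * h
    print("rect.    D_h  [mm]:", 1e3 * hydraulic_diameter_rect(w, h))
    print("rect.    v   [m/s]:", port_exit_velocity(flow, area_rect, 30))
    ```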

  17. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  18. Development of the physical model

    International Nuclear Information System (INIS)

    Liu Zunqi; Morsy, Samir

    2001-01-01

    Full text: The Physical Model was developed during Program 93+2 as a technical tool to aid enhanced information analysis and now is an integrated part of the Department's on-going State evaluation process. This paper will describe the concept of the Physical Model, including its objectives, overall structure and the development of indicators with designated strengths, followed by a brief description of using the Physical Model in implementing the enhanced information analysis. The work plan for expansion and update of the Physical Model is also presented at the end of the paper. The development of the Physical Model is an attempt to identify, describe and characterize every known process for carrying out each step necessary for the acquisition of weapons-usable material, i.e., all plausible acquisition paths for highly enriched uranium (HEU) and separated plutonium (Pu). The overall structure of the Physical Model has a multilevel arrangement. It includes at the top level all the main steps (technologies) that may be involved in the nuclear fuel cycle from the source material production up to the acquisition of weapons-usable material, and then beyond the civilian fuel cycle to the development of nuclear explosive devices (weaponization). Each step is logically interconnected with the preceding and/or succeeding steps by nuclear material flows. It contains at its lower levels every known process that is associated with the fuel cycle activities presented at the top level. For example, uranium enrichment is broken down into three branches at the second level, i.e., enrichment of UF6, UCl4 and U-metal respectively; and then further broken down at the third level into nine processes: gaseous diffusion, gas centrifuge, aerodynamic, electromagnetic, molecular laser (MLIS), atomic vapor laser (AVLIS), chemical exchange, ion exchange and plasma. Narratives are presented at each level, beginning with a general process description then proceeding with detailed
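
    In data terms, the multilevel arrangement described above is a tree whose nodes are fuel-cycle steps, branches and processes, connected by nuclear material flows. The sketch below encodes just the enrichment example as a nested Python dictionary; the abstract does not state which processes belong to which branch, so the two lower levels are kept as parallel lists, and the encoding itself is purely illustrative rather than part of the IAEA tool.

    ```python
    # Illustrative encoding of one top-level step of the Physical Model hierarchy.
    # The layout is a hypothetical sketch, not the IAEA implementation.
    physical_model = {
        "uranium enrichment": {                       # top-level step (technology)
            "branches": ["enrichment of UF6",         # second level
                         "enrichment of UCl4",
                         "enrichment of U-metal"],
            "processes": ["gaseous diffusion", "gas centrifuge", "aerodynamic",
                          "electromagnetic", "molecular laser (MLIS)",
                          "atomic vapor laser (AVLIS)", "chemical exchange",
                          "ion exchange", "plasma"],  # third level
        },
    }

    def count_processes(model):
        """Count third-level processes across all top-level steps."""
        return sum(len(step["processes"]) for step in model.values())

    print(count_processes(physical_model))  # 9
    ```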

  19. NREL Multiphysics Modeling Tools and ISC Device for Designing Safer Li-Ion Batteries

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, Ahmad A.; Yang, Chuanbo

    2016-03-24

    The National Renewable Energy Laboratory has developed a portfolio of multiphysics modeling tools to aid battery designers better understand the response of lithium ion batteries to abusive conditions. We will discuss this portfolio, which includes coupled electrical, thermal, chemical, electrochemical, and mechanical modeling. These models can simulate the response of a cell to overheating, overcharge, mechanical deformation, nail penetration, and internal short circuit. Cell-to-cell thermal propagation modeling will be discussed.

  20. High-Speed/Hypersonic Weapon Development Tool Integration

    National Research Council Canada - National Science Library

    Duchow, Erin M; Munson, Michael J; Alonge, Jr, Frank A

    2006-01-01

    Multiple tools exist to aid in the design and evaluation of high-speed weapons. This paper documents efforts to integrate several existing tools, including the Integrated Hypersonic Aeromechanics Tool (IHAT) [1-7]...

  1. Modelling Machine Tools using Structure Integrated Sensors for Fast Calibration

    Directory of Open Access Journals (Sweden)

    Benjamin Montavon

    2018-02-01

    Full Text Available Monitoring of the relative deviation between commanded and actual tool tip position, which limits the volumetric performance of the machine tool, enables the use of contemporary methods of compensation to reduce tolerance mismatch and the uncertainties of on-machine measurements. The development of a primarily optical sensor setup capable of being integrated into the machine structure without limiting its operating range is presented. The use of a frequency-modulating interferometer and photosensitive arrays in combination with a Gaussian laser beam allows for fast and automated online measurements of the axes’ motion errors and thermal conditions with comparable accuracy, lower cost, and smaller dimensions as compared to state-of-the-art optical measuring instruments for offline machine tool calibration. The development is tested through simulation of the sensor setup based on raytracing and Monte-Carlo techniques.
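
    The raytracing and Monte-Carlo evaluation mentioned above can be illustrated with a toy one-axis experiment: sample small random tilts of the laser beam and look at the resulting spot displacement on a position-sensitive detector a distance L away. The path length, tilt spread and sample count below are hypothetical, and the real setup is of course three-dimensional.

    ```python
    # Toy Monte-Carlo estimate of spot displacement on a position-sensitive
    # detector caused by small random beam tilts (hypothetical values).
    import numpy as np

    rng = np.random.default_rng(seed=1)

    L = 0.8                     # beam path length to the detector [m]
    tilt_sigma_urad = 20.0      # 1-sigma angular error of the beam [microrad]
    n_samples = 100_000

    tilts = rng.normal(0.0, tilt_sigma_urad * 1e-6, n_samples)   # [rad]
    displacement_um = np.tan(tilts) * L * 1e6                    # [micrometre]

    print(f"mean |displacement|: {np.mean(np.abs(displacement_um)):.2f} um")
    print(f"95th percentile    : {np.percentile(np.abs(displacement_um), 95):.2f} um")
    ```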

  2. Indigenous youth-developed self-assessment: The Personal Balance Tool.

    Science.gov (United States)

    Barraza, Rachelle; Bartgis, Jami

    2016-01-01

    The Fresno American Indian Health Project (FAIHP) Youth Council developed and pilot tested a strength-based, holistic, and youth-friendly self-assessment tool grounded in the Medicine Wheel, a framework and theoretical orientation for teaching wellness in many tribal communities. This paper summarizes the development of the Youth Personal Balance Tool and the methods used for tool revisions through two separate pilot studies and ongoing process evaluations across 3 years. Using a community-based participatory evaluation model, FAIHP leveraged community resources to implement an annual youth Gathering of Native Americans to support youth in healing from historical and intergenerational trauma and restoring communities to balance by making them a part of the solution. This tool is one of many outcomes of their work. The Youth Council is offering the tool as a gift (in line with the cultural value of generosity) to other Indigenous communities that are searching for culturally competent self-assessment tools for youth. The authors believe this tool has the potential to progress the field in strength-based, holistic, youth-friendly assessment as a culturally competent method for Indigenous evaluation and research.

  3. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    Although software usability has long been emphasized, there is a lot of software with poor usability. In Usability Engineering, usability professionals prescribe a classical usability approach to improving software usability. It is essential to prototype and usability test user interfaces before....... However, it is more difficult to follow the classical usability approach for graphical presentation development. These difficulties result from the fact that designers cannot implement user interface with interactions and real data. We developed VisTool – a user interface and visualization development...... interface objects and properties. We built visualizations such as Lifelines, Parallel Coordinates, Heatmap, etc. to show that the formula-based approach is powerful enough for building customized visualizations. The evaluation with Cognitive Dimensions shows that the formula-based approach is cognitively...

  4. Development of dosimetry tools for proton therapy research

    International Nuclear Information System (INIS)

    Kim, Jong-Won; Kim, Dogyun

    2010-01-01

    Dosimetry tools for proton therapy research have been developed to measure the properties of a therapeutic proton beam. A CCD camera-scintillation screen system, which can verify the 2D dose distribution of a scanning beam and can be used for proton radiography, was developed. Also developed were a large area parallel-plate ionization chamber and a multi-layer Faraday cup to monitor the beam current and to measure the beam energy, respectively. To investigate the feasibility of locating the distal dose falloff in real time during patient treatment, a prompt gamma measuring system composed of multi-layer shielding structures was then devised. The system worked well for a pristine proton beam. However, correlation between the distal dose falloff and the prompt gamma distribution was blurred by neutron background for a therapy beam formed by scattering method. We have also worked on the design of a Compton camera to image the 2D distribution of prompt gamma rays.

  5. NEEMO 20: Science Training, Operations, and Tool Development

    Science.gov (United States)

    Graff, T.; Miller, M.; Rodriguez-Lanetty, M.; Chappell, S.; Naids, A.; Hood, A.; Coan, D.; Abell, P.; Reagan, M.; Janoiko, B.

    2016-01-01

    The 20th mission of the National Aeronautics and Space Administration (NASA) Extreme Environment Mission Operations (NEEMO) was a highly integrated evaluation of operational protocols and tools designed to enable future exploration beyond low-Earth orbit. NEEMO 20 was conducted from the Aquarius habitat off the coast of Key Largo, FL in July 2015. The habitat and its surroundings provide a convincing analog for space exploration. A crew of six (comprised of astronauts, engineers, and habitat technicians) lived and worked in and around the unique underwater laboratory over a mission duration of 14 days. Incorporated into NEEMO 20 was a diverse Science Team (ST) comprised of geoscientists from the Astromaterials Research and Exploration Science (ARES/XI) Division at the Johnson Space Center (JSC), as well as marine scientists from the Department of Biological Sciences at Florida International University (FIU). This team trained the crew on the science to be conducted, defined sampling techniques and operational procedures, and planned and coordinated the science-focused Extra Vehicular Activities (EVAs). The primary science objectives of NEEMO 20 were to study planetary sampling techniques and tools in partial gravity environments under realistic mission communication time delays and operational pressures. To facilitate these objectives, two types of science sites were employed: 1) geoscience sites with available rocks and regolith for testing sampling procedures and tools, and 2) marine science sites dedicated to specific research focused on assessing the photosynthetic capability of corals and their genetic connectivity between deep and shallow reefs. These marine sites and associated research objectives included deployment of handheld instrumentation, context descriptions, imaging, and sampling, and thus acted as a suitable proxy for planetary surface exploration activities. This abstract briefly summarizes the scientific training, scientific operations, and tool

  6. Effective Management Tools in Implementing Operational Programme Administrative Capacity Development

    Directory of Open Access Journals (Sweden)

    Carmen – Elena DOBROTĂ

    2015-12-01

    Full Text Available Public administration in Romania and the administrative capacity of central and local government have undergone significant progress since 2007. The development of administrative capacity involves a set of structural and process changes that allow governments to improve the formulation and implementation of policies in order to achieve enhanced results. Identifying, developing and using management tools for the proper implementation of an operational programme dedicated to consolidating a performing public administration was a challenging task, taking into account the types of interventions within Operational Programme Administrative Capacity Development 2007–2013 and the continuous changes in the economic and social environment in Romania and Europe. The aim of this article is to provide a short description of the approach used by the Managing Authority for OPACD within the performance management of the structural funds in Romania between 2008 and 2014. The paper offers a broad picture of the way in which evaluations (ad hoc, intermediate and performance) were used in different stages of OP implementation as a management tool.

  7. Development and first application of an operating events ranking tool

    International Nuclear Information System (INIS)

    Šimić, Zdenko; Zerger, Benoit; Banov, Reni

    2015-01-01

    Highlights: • A method using the analytical hierarchy process for ranking operating events is developed and tested. • The method is applied to 5 years of U.S. NRC Licensee Event Reports (1453 events). • Uncertainty and sensitivity of the ranking results are evaluated. • Assessment of real events shows the potential of the method for operating experience feedback. - Abstract: Operating experience feedback is important for maintaining and improving safety and availability in nuclear power plants. Detailed investigation of all events is challenging since it requires excessive resources, especially in the case of large event databases. This paper presents an event group ranking method to complement the analysis of individual operating events. The basis for the method is the use of an internationally accepted event characterization scheme that allows different ways of grouping and ranking events. The ranking method itself consists of implementing the analytical hierarchy process (AHP) by means of a custom-developed tool which allows events to be ranked based on ranking indexes pre-determined by expert judgment. Following the development phase, the tool was applied to analyze a complete set of 5 years of real nuclear power plant operating events (1453 events). The paper presents the potential of this ranking method to identify possible patterns throughout the event database and therefore to give additional insights into the events, as well as to give quantitative input for the prioritization of further, more detailed investigation of selected event groups.
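
    A central AHP step is turning a pairwise-comparison matrix of expert judgments into weights via its principal eigenvector, followed by a consistency check. The sketch below does this for three hypothetical ranking indexes; the matrix entries are invented and are not the expert judgments used in the paper.

    ```python
    # Minimal AHP weight derivation from a pairwise comparison matrix
    # (illustrative judgments for three hypothetical ranking indexes).
    import numpy as np

    criteria = ["safety relevance", "recurrence potential", "radiological impact"]

    # A[i, j] = how much more important criterion i is than criterion j
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    principal = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, principal].real)
    weights /= weights.sum()

    # Consistency ratio check (Saaty random index RI = 0.58 for a 3x3 matrix)
    lambda_max = eigvals.real[principal]
    ci = (lambda_max - len(A)) / (len(A) - 1)
    cr = ci / 0.58
    print(dict(zip(criteria, weights.round(3))), f"CR = {cr:.3f}")
    ```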

  8. Evaluating EML Modeling Tools for Insurance Purposes: A Case Study

    Directory of Open Access Journals (Sweden)

    Mikael Gustavsson

    2010-01-01

    Full Text Available As with any situation that involves economic risk, refineries may share their risk with insurers. The decision process generally includes modelling to determine to what extent the process area can be damaged. At the extreme end of modelling, the so-called Estimated Maximum Loss (EML) scenarios are found. These scenarios predict the maximum loss a particular installation can sustain. Unfortunately, no standard model for this exists, so insurers reach different results because they apply different models and different assumptions. Therefore, a study was conducted on a case in a Swedish refinery where several scenarios had previously been modelled by two different insurance brokers using two different software packages, ExTool and SLAM. This study reviews the concept of EML and analyses the models used to see which parameters are most uncertain. A third model, EFFECTS, was also employed in an attempt to reach a conclusion with higher reliability.

  9. Using urban forest assessment tools to model bird habitat potential

    Science.gov (United States)

    Susannah B. Lerman; Keith H. Nislow; David J. Nowak; Stephen DeStefano; David I. King; D. Todd. Jones-Farrand

    2014-01-01

    The alteration of forest cover and the replacement of native vegetation with buildings, roads, exotic vegetation, and other urban features pose one of the greatest threats to global biodiversity. As more land becomes slated for urban development, identifying effective urban forest wildlife management tools becomes paramount to ensure the urban forest provides habitat...

  10. Advanced Computing Tools and Models for Accelerator Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  11. Advanced Computing Tools and Models for Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics

  12. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.
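
    The forward-modeling step that SEER automates can be illustrated, in drastically simplified form, with the classical image-series expression for Wenner-array apparent resistivity over a two-layer earth. The sketch below is a textbook toy model, not the engine behind SEER, and all layer values are hypothetical.

    ```python
    # Toy 1D forward model: Wenner-array apparent resistivity over a two-layer
    # earth, using the classical image-series solution (hypothetical layer values).
    import numpy as np

    def wenner_apparent_resistivity(a, rho1, rho2, h, n_terms=200):
        """Apparent resistivity for electrode spacing a over a layer of
        resistivity rho1 and thickness h above a half-space of rho2."""
        k = (rho2 - rho1) / (rho2 + rho1)          # reflection coefficient
        n = np.arange(1, n_terms + 1)
        series = k**n * (1.0 / np.sqrt(1 + (2 * n * h / a) ** 2)
                         - 1.0 / np.sqrt(4 + (2 * n * h / a) ** 2))
        return rho1 * (1 + 4 * series.sum())

    # Conductive overburden (50 ohm-m, 3 m thick) over resistive bedrock (500 ohm-m)
    for a in [1, 2, 5, 10, 20, 50]:                # electrode spacings [m]
        rho_a = wenner_apparent_resistivity(a, rho1=50.0, rho2=500.0, h=3.0)
        print(f"a = {a:>3} m  ->  rho_a = {rho_a:6.1f} ohm-m")
    ```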

  13. A communication tool to improve the patient journey modeling process.

    Science.gov (United States)

    Curry, Joanne; McGregor, Carolyn; Tracy, Sally

    2006-01-01

    Quality improvement is high on the agenda of Health Care Organisations (HCO) worldwide. Patient journey modeling is a relatively recent innovation in healthcare quality improvement that models the patient's movement through the HCO by viewing it from a patient centric perspective. Critical to the success of the redesigning care process is the involvement of all stakeholders and their commitment to actively participate in the process. Tools which promote this type of communication are a critical enabler that can significantly affect the overall process redesign outcomes. Such a tool must also be able to incorporate additional factors such as relevant policies and procedures, staff roles, system usage and measurements such as process time and cost. This paper presents a graphically based communication tool that can be used as part of the patient journey modeling process to promote stakeholder involvement, commitment and ownership as well highlighting the relationship of other relevant variables that contribute to the patient's journey. Examples of how the tool has been used and the framework employed are demonstrated via a midwife-led primary care case study. A key contribution of this research is the provision of a graphical communication framework that is simple to use, is easily understood by a diverse range of stakeholders and enables ready recognition of patient journey issues. Results include strong stakeholder buy-in and significant enhancement to the overall design of the future patient journey. Initial results indicate that the use of such a communication tool can improve the patient journey modeling process and the overall quality improvement outcomes.

  14. Mage: A Tool for Developing Interactive Instructional Graphics

    Science.gov (United States)

    Pavkovic, Stephen F.

    2005-01-01

    Mage is a graphics program developed for visualization of three-dimensional structures of proteins and other macromolecules. An application of the Mage program is reported here for developing interactive instructional graphics files (kinemages) of much smaller scale. Examples are given illustrating features of VSEPR models, permanent dipoles,…

  15. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED Renormalization group evolution below the electroweak scale. (orig.)
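
    Schematically, the one-loop running performed by such a package solves dC_i/d ln μ = (1/16π²) Σ_j γ_ij C_j for the vector of Wilson coefficients. The sketch below integrates a system of this form for a hypothetical 2×2 anomalous-dimension matrix, purely to illustrate the structure of the calculation; it does not use DsixTools, the Warsaw basis, or the actual SMEFT anomalous dimensions.

    ```python
    # Generic one-loop RG running of Wilson coefficients,
    # dC_i/dln(mu) = (1/16 pi^2) * sum_j gamma[i, j] * C_j,
    # for a hypothetical 2x2 anomalous-dimension matrix (not the Warsaw basis).
    import numpy as np
    from scipy.integrate import solve_ivp

    gamma = np.array([[ 4.0, -1.2],
                      [-0.6,  2.0]])          # illustrative anomalous dimensions
    C_high = np.array([1.0, 0.5])             # Wilson coefficients at mu = 1 TeV

    def rge(log_mu, C):
        return gamma @ C / (16 * np.pi**2)

    t_span = (np.log(1000.0), np.log(91.19))  # run from 1 TeV down to m_Z [GeV]
    sol = solve_ivp(rge, t_span, C_high, rtol=1e-8)
    print("C(m_Z) =", sol.y[:, -1])
    ```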

  16. Developing Cyberinfrastructure Tools and Services for Metadata Quality Evaluation

    Science.gov (United States)

    Mecum, B.; Gordon, S.; Habermann, T.; Jones, M. B.; Leinfelder, B.; Powers, L. A.; Slaughter, P.

    2016-12-01

    Metadata and data quality are at the core of reusable and reproducible science. While great progress has been made over the years, much of the metadata collected only addresses data discovery, covering concepts such as titles and keywords. Improving metadata beyond the discoverability plateau means documenting detailed concepts within the data such as sampling protocols, instrumentation used, and variables measured. Given that metadata commonly do not describe their data at this level, how might we improve the state of things? Giving scientists and data managers easy to use tools to evaluate metadata quality that utilize community-driven recommendations is the key to producing high-quality metadata. To achieve this goal, we created a set of cyberinfrastructure tools and services that integrate with existing metadata and data curation workflows which can be used to improve metadata and data quality across the sciences. These tools work across metadata dialects (e.g., ISO19115, FGDC, EML, etc.) and can be used to assess aspects of quality beyond what is internal to the metadata such as the congruence between the metadata and the data it describes. The system makes use of a user-friendly mechanism for expressing a suite of checks as code in popular data science programming languages such as Python and R. This reduces the burden on scientists and data managers to learn yet another language. We demonstrated these services and tools in three ways. First, we evaluated a large corpus of datasets in the DataONE federation of data repositories against a metadata recommendation modeled after existing recommendations such as the LTER best practices and the Attribute Convention for Dataset Discovery (ACDD). Second, we showed how this service can be used to display metadata and data quality information to data producers during the data submission and metadata creation process, and to data consumers through data catalog search and access tools. Third, we showed how the centrally
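
    The idea of expressing quality checks as plain code can be illustrated with a single completeness check written as a Python function; the field names and the example record below are hypothetical and are not taken from the DataONE or LTER recommendations.

    ```python
    # Hypothetical metadata-quality check in the spirit of the suite described
    # above: verify that discovery-level and use-level fields are both present.
    REQUIRED_DISCOVERY = ["title", "abstract", "keywords"]
    REQUIRED_USE = ["methods", "variables", "instrumentation"]

    def check_completeness(metadata: dict) -> dict:
        """Return per-field pass/fail plus an overall score in [0, 1]."""
        results = {}
        for field in REQUIRED_DISCOVERY + REQUIRED_USE:
            results[field] = bool(metadata.get(field))   # present and non-empty
        score = sum(results.values()) / len(results)
        return {"checks": results, "score": round(score, 2)}

    record = {   # toy metadata record
        "title": "Lake temperature profiles 2015-2016",
        "abstract": "Hourly thermistor-chain data from two lakes.",
        "keywords": ["limnology", "temperature"],
        "methods": "",                                   # missing detail -> fails
    }
    print(check_completeness(record))
    ```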

  17. Development and application of explorative tools in the field of architectural geometry: L-systems

    Directory of Open Access Journals (Sweden)

    Petruševski Ljiljana

    2010-01-01

    Full Text Available The concept of L-systems was created as a basis for an axiomatic theory of biological growth. L-systems are applied in computer graphics for fractal generation, as well as in models of biological structures and simulations of their growth. Within generic architecture, the application of L-systems allows natural growth mechanisms to be used as generators of architectural geometry. After mathematical and logical explanations of the chosen generic concept of L-systems, this study examines its generic potential, which is the basis for the development of specific explorative tools in the field of architectural geometry. Within a wider research activity titled 'Generic Explorations', original parametric software tools were developed, allowing the generation of complex architectural geometry based on the concept of L-systems. Varying the parameter values facilitates the creation and further exploration of the generated spatial forms. The paper presents the possibilities of the developed explorative tools, their particularities, and an overview of their initial application results.
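
    At its core, an L-system is iterated parallel string rewriting over an alphabet of symbols. The short sketch below expands the classic Koch-curve rules to show the mechanism such tools build on; the axiom, rules and iteration count are illustrative and unrelated to the authors' software.

    ```python
    # Minimal L-system string rewriting (illustrative rules: Koch curve).
    # F = draw forward, + = turn left, - = turn right.
    def expand(axiom: str, rules: dict, iterations: int) -> str:
        """Apply the production rules to every symbol, `iterations` times."""
        s = axiom
        for _ in range(iterations):
            s = "".join(rules.get(ch, ch) for ch in s)
        return s

    koch_rules = {"F": "F+F-F-F+F"}
    result = expand("F", koch_rules, iterations=2)
    print(result)       # the rewritten string, ready for turtle interpretation
    print(len(result))  # 49 symbols after two rewriting passes
    ```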

  18. Development and Demonstration of The WEC-Sim Wave Energy Converter Simulation Tool

    OpenAIRE

    Lawson, Michael; Yu, Yi-Hsiang; Ruehl, Kelley; Michelen, Carlos

    2014-01-01

    The National Renewable Energy Laboratory (NREL) and Sandia National Laboratories (SNL) have developed WEC-Sim to provide the wave energy converter (WEC) design community with an open-source simulation tool. WEC-Sim models the system dynamics of WEC devices using multi-body dynamics methods and simulates hydrodynamic forces using coefficients predicted from potential flow models. In this paper we describe the methodology used in WEC-Sim and demonstrate the use of the code by simulating three ...

  19. Development of an information retrieval tool for biomedical patents.

    Science.gov (United States)

    Alves, Tiago; Rodrigues, Rúben; Costa, Hugo; Rocha, Miguel

    2018-06-01

    The volume of biomedical literature has been increasing in recent years. Patent documents have followed this trend; they are important sources of biomedical knowledge, technical details and curated data, which are put together during the granting process. The field of Biomedical text mining (BioTM) has been creating solutions for the problems posed by the unstructured nature of natural language, which makes the search for information a challenging task. Several BioTM techniques can be applied to patents. Among these, Information Retrieval (IR) includes processes where relevant data are obtained from collections of documents. In this work, the main goal was to build a patent pipeline addressing IR tasks over patent repositories to make these documents amenable to BioTM tasks. The pipeline was developed within @Note2, an open-source computational framework for BioTM, adding a number of modules to the core libraries, including patent metadata and full text retrieval, PDF to text conversion and optical character recognition. Also, user interfaces were developed for the main operations, materialized in a new @Note2 plug-in. The integration of these tools in @Note2 opens opportunities to run BioTM tools over patent texts, including tasks from Information Extraction, such as Named Entity Recognition or Relation Extraction. We demonstrated the pipeline's main functions with a case study, using an available benchmark dataset from the BioCreative challenges. We also show the use of the plug-in with a user query related to the production of vanillin. This work makes all the relevant content from patents available to the scientific community, drastically decreasing the time required for this task, and provides graphical interfaces to ease the use of these tools. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Development of diamond coated tool and its performance in ...

    Indian Academy of Sciences (India)

    Unknown

    In recent years, low pressure synthesis of diamond coating from gas phase on a suitable tool substrate has opened up new opportunities to expand applications of diamond tools widely. In fact a coated diamond tool combines the strengths of both single crystal diamond and PCD compact in one cutting tool and has better.