WorldWideScience

Sample records for performance modeling tool

  1. Performability Modelling Tools, Evaluation Techniques and Applications

    NARCIS (Netherlands)

    Haverkort, Boudewijn R.H.M.

    1990-01-01

    This thesis deals with three aspects of quantitative evaluation of fault-tolerant and distributed computer and communication systems: performability evaluation techniques, performability modelling tools, and performability modelling applications. Performability modelling is a relatively new

  2. Port performance evaluation tool based on microsimulation model

    Directory of Open Access Journals (Sweden)

    Tsavalista Burhani Jzolanda

    2017-01-01

    As port performance becomes correlated with national competitiveness, the issue of port performance evaluation has gained increasing attention. Port performance can be indicated by the port's service levels to ships (e.g., throughput, waiting time for berthing) as well as by the utilization levels of equipment and facilities within a certain period. The performance evaluation can then be used as a tool to develop policies for making the port more effective and efficient. However, the evaluation is frequently conducted with a deterministic approach, which hardly captures the natural variations of port parameters. Therefore, this paper presents a stochastic microsimulation model for investigating the impact of port parameter variations on port performance. The variations are derived from actual data in order to provide more realistic results. The model is developed in MATLAB and Simulink based on queuing theory.
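The record above describes the approach only at a high level. As a rough illustration of the kind of stochastic queuing microsimulation it refers to, the sketch below estimates ship waiting times at a multi-berth terminal in Python (the paper itself uses MATLAB/Simulink); all parameter values and distributions are illustrative assumptions, not taken from the study.

```python
import random

def simulate_port(n_ships=1000, berths=3, mean_interarrival=6.0,
                  mean_service=15.0, seed=42):
    """Monte Carlo sketch of ship waiting times at a multi-berth port.

    Interarrival and service (berthing) times are drawn from exponential
    distributions, so parameter variation is captured stochastically
    rather than with fixed deterministic values.
    """
    rng = random.Random(seed)
    berth_free_at = [0.0] * berths       # next time each berth becomes idle
    t, waits = 0.0, []
    for _ in range(n_ships):
        t += rng.expovariate(1.0 / mean_interarrival)   # ship arrival
        i = min(range(berths), key=lambda b: berth_free_at[b])
        start = max(t, berth_free_at[i])                # wait if berth busy
        waits.append(start - t)
        berth_free_at[i] = start + rng.expovariate(1.0 / mean_service)
    return sum(waits) / len(waits)

print(f"mean waiting time: {simulate_port():.1f} h")
```

Replacing the exponential draws with distributions fitted to actual arrival and service records is what gives the microsimulation its realism.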

  3. Performance Analysis, Modeling and Scaling of HPC Applications and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Bhatele, Abhinav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-01-13

    Efficient use of supercomputers at DOE centers is vital for maximizing system throughput, minimizing energy costs and enabling science breakthroughs faster. This requires complementary efforts along several directions to optimize the performance of scientific simulation codes and the underlying runtimes and software stacks. This in turn requires providing scalable performance analysis tools and modeling techniques that can provide feedback to physicists and computer scientists developing the simulation codes and runtimes respectively. The PAMS project is using time allocations on supercomputers at ALCF, NERSC and OLCF to further the goals described above by performing research along the following fronts: 1. Scaling Study of HPC applications; 2. Evaluation of Programming Models; 3. Hardening of Performance Tools; 4. Performance Modeling of Irregular Codes; and 5. Statistical Analysis of Historical Performance Data. We are a team of computer and computational scientists funded by both DOE/NNSA and DOE/ASCR programs such as ECRP, XStack (Traleika Glacier, PIPER), ExaOSR (ARGO), SDMAV II (MONA) and PSAAP II (XPACC). This allocation will enable us to study big data issues when analyzing performance on leadership computing class systems and to assist the HPC community in making the most effective use of these resources.

  4. Wave and Wind Model Performance Metrics Tools

    Science.gov (United States)

    Choi, J. K.; Wang, D. W.

    2016-02-01

    Continual improvements and upgrades of Navy ocean wave and wind models are essential to assuring battlespace environment predictability of ocean surface wave and surf conditions in support of Naval global operations. Constant verification and validation of model performance is thus equally essential to assure the progress of model developments and maintain confidence in the predictions. Global and regional scale model evaluations may require large areas and long periods of time. As observational data to compare against, altimeter winds and waves along the tracks of past and current operational satellites, as well as moored/drifting buoys, can be used for global and regional coverage. Using data and model runs from previous trials such as the planned experiment, the Dynamics of the Adriatic in Real Time (DART), we demonstrated the use of altimeter wind and wave data accumulated over several years to obtain an objective evaluation of the performance of the SWAN (Simulating Waves Nearshore) model running in the Adriatic Sea. The assessment provided a detailed picture of wind and wave model performance, using maps of cell-averaged statistical variables with spatial statistics including slope, correlation, and scatter index to summarize model performance. Such a methodology is easily generalized to other regions and to global scales. Operational technology currently used by subject matter experts evaluating the Navy Coastal Ocean Model and the Hybrid Coordinate Ocean Model can be expanded to evaluate wave and wind models using tools developed for ArcMAP, a GIS application developed by ESRI. The recent inclusion of altimeter and buoy data into a common format through the Naval Oceanographic Office's (NAVOCEANO) quality control system, together with the netCDF standards applicable to all model output, makes possible the fusion of these data and direct model verification. Procedures were also developed for the accumulation of match-ups of modelled and observed parameters to form a database.

  5. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  6. Modelling tools for assessing bioremediation performance and risk of chlorinated solvents in clay tills

    DEFF Research Database (Denmark)

    Chambon, Julie Claire Claudia

    design are challenging. This thesis presents the development and application of analytical and numerical models to improve our understanding of transport and degradation processes in clay tills, which is crucial for assessing bioremediation performance and risk to groundwater. A set of modelling tools...... to groundwater and bioremediation performance in low-permeability media....

  7. Modelling and Development of a High Performance Milling Process with Monolithic Cutting Tools

    International Nuclear Information System (INIS)

    Ozturk, E.; Taylor, C. M.; Turner, S.; Devey, M.

    2011-01-01

    Critical aerospace components usually require difficult-to-machine workpiece materials such as nickel-based alloys. Moreover, there is a pressing need to maximize the productivity of machining operations, which can be met by selecting a higher feed velocity and greater axial and radial depths of cut. However, several problems may then arise during machining. Due to the high cutting speeds in high performance machining, tool life may be unacceptably low. If the magnitudes of the cutting forces are high, out-of-tolerance static form errors may result; in extreme cases the cutting tool may break apart. Forced vibrations may deteriorate the surface quality, and chatter vibrations may develop if the selected parameters result in instability. In this study, in order to deal with the tool life issue, several experimental cuts are made with different tool geometries, and the best combination in terms of tool life is selected. A force model is developed and its results are verified against experimental results. The force model is used to predict the effect of process parameters on cutting forces. In order to address the other concerns, such as static form errors and forced and chatter vibrations, additional process models are currently under development.

  8. How to define the tool kit for the corrective maintenance service? : a tool kit definition model under the service performance criterion

    NARCIS (Netherlands)

    Chen, Denise

    2009-01-01

    Currently, the rules for defining tool kits vary and are oriented mostly towards the engineer's perspective. However, the definition of a tool kit is a trade-off between cost and service performance. This project is designed to develop a model that can integrate the engineer's preferences

  9. EnergiTools(R) - a power plant performance monitoring and diagnosis tool

    International Nuclear Information System (INIS)

    Ancion, P.V.; Bastien, R.; Ringdahl, K.

    2000-01-01

    Westinghouse EnergiTools(R) is a performance diagnostic tool for power generation plants that combines the power of on-line process data acquisition with advanced diagnostics methodologies. The system uses analytical models based on thermodynamic principles combined with the knowledge of component diagnostic experts. An issue in modeling expert knowledge is to have a framework that can represent and process uncertainty in complex systems, where it is nearly impossible to build deterministic models for the effects of faults on symptoms. A methodology based on causal probabilistic graphs, more specifically on Bayesian belief networks, has been implemented in EnergiTools(R) to capture the fault-symptom relationships. The methodology estimates the likelihood of the various component failures using the fault-symptom relationships. The system also has the ability to use neural networks for processes that are difficult to model analytically. One application is the estimation of the reactor power in a nuclear power plant by interpreting several plant indicators. EnergiTools(R) is used for on-line performance monitoring and diagnostics at the Vattenfall Ringhals nuclear power plants in Sweden, where it has led to the diagnosis of various performance issues with plant components. Two case studies are presented. In the first case, an overestimate of the thermal power due to a faulty instrument was found, which led to plant operation below its optimal power; the paper shows how the problem was discovered using the analytical thermodynamic calculations. The second case shows an application of EnergiTools(R) to the diagnosis of a condenser failure using causal probabilistic graphs.
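As a rough illustration of the fault-symptom reasoning described above, the sketch below ranks candidate faults by posterior probability under a naive-Bayes simplification of a belief network. The fault names, priors, and likelihoods are invented for the example and do not come from EnergiTools.

```python
# Hypothetical fault priors and symptom likelihoods P(symptom | fault).
priors = {"fouled_condenser": 0.02, "faulty_sensor": 0.05, "none": 0.93}
likelihood = {
    "fouled_condenser": {"high_backpressure": 0.90, "power_low": 0.60},
    "faulty_sensor":    {"high_backpressure": 0.10, "power_low": 0.80},
    "none":             {"high_backpressure": 0.01, "power_low": 0.02},
}

def posterior(observed):
    """Rank candidate faults given a set of observed symptoms.

    Naive-Bayes simplification of a belief network: only the observed
    symptoms contribute, and they are treated as conditionally
    independent given the fault.
    """
    scores = {}
    for fault, p in priors.items():
        for s in observed:
            p *= likelihood[fault][s]
        scores[fault] = p
    z = sum(scores.values())
    return {f: p / z for f, p in scores.items()}

print(posterior({"high_backpressure", "power_low"}))
```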

  10. The cognitive environment simulation as a tool for modeling human performance and reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Pople, H. Jr.; Roth, E.M.

    1990-01-01

    The US Nuclear Regulatory Commission is sponsoring a research program to develop improved methods for modeling the cognitive behavior of nuclear power plant (NPP) personnel. Under this program, a tool for simulating how people form intentions to act in NPP emergency situations was developed using artificial intelligence (AI) techniques. This tool is called the Cognitive Environment Simulation (CES). The Cognitive Reliability Assessment Technique (or CREATE) was also developed to specify how CES can be used to enhance the measurement of the human contribution to risk in probabilistic risk assessment (PRA) studies. The next step in the research program was to evaluate the modeling tool and the method for using the tool for Human Reliability Analysis (HRA) in PRAs. Three evaluation activities were conducted. First, a panel of highly distinguished experts in cognitive modeling, AI, PRA and HRA provided a technical review of the simulation development work. Second, based on panel recommendations, CES was exercised on a family of steam generator tube rupture incidents for which empirical data on operator performance already existed. Third, a workshop with HRA practitioners was held to analyze a worked example of the CREATE method and evaluate the role of CES/CREATE in HRA. The results of all three evaluations indicate that CES/CREATE represents a promising approach to modeling operator intention formation during emergency operations.

  11. Cockpit System Situational Awareness Modeling Tool

    Science.gov (United States)

    Keller, John; Lebiere, Christian; Shay, Rick; Latorella, Kara

    2004-01-01

    This project explored the possibility of predicting pilot situational awareness (SA) using human performance modeling techniques for the purpose of evaluating developing cockpit systems. The Improved Performance Research Integration Tool (IMPRINT) was combined with the Adaptive Control of Thought-Rational (ACT-R) cognitive modeling architecture to produce a tool that can model both the discrete tasks of pilots and the cognitive processes associated with SA. The techniques for using this tool to predict SA were demonstrated using the newly developed Aviation Weather Information (AWIN) system. By providing an SA prediction tool to cockpit system designers, cockpit concepts can be assessed early in the design process while providing a cost-effective complement to the traditional pilot-in-the-loop experiments and data collection techniques.

  12. Model and tool requirements for co-simulation of building performance

    NARCIS (Netherlands)

    Trcka, M.; Hensen, J.L.M.

    2006-01-01

    The use of building performance simulation (BPS) can substantially help in improving building design towards higher occupant comfort and lower fuel consumption, while reducing emission of greenhouse gasses. Unfortunately, current BPS tools do not allow inter-tool communication and thus limit a

  13. Alien wavelength modeling tool and field trial

    DEFF Research Database (Denmark)

    Sambo, N.; Sgambelluri, A.; Secondini, M.

    2015-01-01

    A modeling tool is presented for pre-FEC BER estimation of PM-QPSK alien wavelength signals. A field trial is demonstrated and used as validation of the tool's correctness. A very close correspondence between the performance of the field trial and the one predicted by the modeling tool has been...

  14. A unified tool for performance modelling and prediction

    International Nuclear Information System (INIS)

    Gilmore, Stephen; Kloul, Leila

    2005-01-01

    We describe a novel performability modelling approach, which facilitates the efficient solution of performance models extracted from high-level descriptions of systems. The notation which we use for our high-level designs is the Unified Modelling Language (UML) graphical modelling language. The technology which provides the efficient representation capability for the underlying performance model is the multi-terminal binary decision diagram (MTBDD)-based PRISM probabilistic model checker. The UML models are compiled through an intermediate language, the stochastic process algebra PEPA, before translation into MTBDDs for solution. We illustrate our approach on a real-world analysis problem from the domain of mobile telephony.
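The record above describes a pipeline from UML through the PEPA stochastic process algebra to MTBDD-based solution in PRISM. The numerical core of that pipeline is the steady-state solution of a continuous-time Markov chain; the sketch below solves a toy 3-state chain with dense linear algebra in Python. PRISM's MTBDD representation is far more compact, and the chain here is invented purely for illustration.

```python
import numpy as np

# Generator matrix Q of a toy 3-state CTMC (e.g., idle -> busy -> failed),
# standing in for the Markov chain a PEPA model compiles to. Off-diagonal
# entries are transition rates; each row sums to zero.
Q = np.array([[-0.5,  0.5,  0.0],
              [ 0.4, -0.6,  0.2],
              [ 1.0,  0.0, -1.0]])

# The steady-state vector pi solves pi Q = 0 with sum(pi) = 1. Replace one
# balance equation with the normalisation constraint and solve the system.
A = np.vstack([Q.T[:-1], np.ones(3)])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.solve(A, b)
print("steady-state distribution:", pi)
```

Performance measures (throughput, utilisation, loss probability) are then weighted sums over this distribution; the MTBDD machinery exists to make the same computation feasible when the state space has millions of states.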

  15. HPCToolkit: performance tools for scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M [Department of Computer Science, Rice University, Houston, TX 77005 (United States)

    2008-07-15

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei.

  16. HPCToolkit: performance tools for scientific computing

    International Nuclear Information System (INIS)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M

    2008-01-01

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei

  17. The GEDI Performance Tool

    Science.gov (United States)

    Hancock, S.; Armston, J.; Tang, H.; Patterson, P. L.; Healey, S. P.; Marselis, S.; Duncanson, L.; Hofton, M. A.; Kellner, J. R.; Luthcke, S. B.; Sun, X.; Blair, J. B.; Dubayah, R.

    2017-12-01

    NASA's Global Ecosystem Dynamics Investigation will mount a multi-track, full-waveform lidar on the International Space Station (ISS) that is optimised for the measurement of forest canopy height and structure. GEDI will use ten laser tracks, two 10 mJ "power beams" and eight 5 mJ "coverage beams", to produce global (51.5°S to 51.5°N) maps of above ground biomass (AGB), canopy height, vegetation structure and other biophysical parameters. The mission has a requirement to generate a 1 km AGB map with 80% of pixels having ≤ 20% standard error or 20 Mg·ha⁻¹, whichever is greater. To assess performance against mission requirements, an end-to-end simulator has been developed. The simulator brings together tools to propagate the effects of measurement and sampling error on GEDI data products. It allows us to evaluate the impact of instrument performance, ISS orbits, processing algorithms and losses of data that may occur due to clouds, snow, leaf-off conditions, and areas with an insufficient signal-to-noise ratio (SNR). By evaluating the consequences of operational decisions on GEDI data products, the tool provides a quantitative framework for decision-making and mission planning. Here we demonstrate the performance tool by using it to evaluate the trade-off between measurement and sampling error in the 1 km AGB data product. Results demonstrate that the use of coverage beams during the day (the lowest GEDI SNR case) over very dense forests (>95% canopy cover) will result in some measurement bias, while omitting these low-SNR cases increases the sampling error; from this, an SNR threshold for a given expected canopy cover can be set. Other applications of the performance tool are also discussed, such as assessing the impact of decisions made in the AGB modelling and signal processing stages on the accuracy of final data products.
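The 1 km AGB requirement quoted above translates directly into a per-pixel check. The sketch below, with synthetic biomass and error fields standing in for simulator output, computes the fraction of pixels meeting the "whichever is greater" criterion and compares it with the 80% threshold; all numbers are illustrative assumptions.

```python
import numpy as np

def fraction_meeting_requirement(agb, agb_se):
    """Fraction of 1 km pixels meeting the GEDI requirement that the
    standard error be <= 20% of AGB or <= 20 Mg/ha, whichever is greater.
    `agb` and `agb_se` are arrays of pixel estimates (Mg/ha)."""
    allowed = np.maximum(0.20 * agb, 20.0)
    return np.mean(agb_se <= allowed)

# Illustrative numbers only: 1000 pixels with lognormal biomass and
# combined sampling+measurement error that grows with biomass.
rng = np.random.default_rng(0)
agb = rng.lognormal(mean=4.0, sigma=0.8, size=1000)
agb_se = 0.1 * agb + rng.normal(10.0, 3.0, size=1000).clip(min=0)
print(f"{100 * fraction_meeting_requirement(agb, agb_se):.1f}% of pixels pass"
      " (mission requires >= 80%)")
```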

  18. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
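The discrete-event task-network approach described above lends itself to a compact Monte Carlo sketch. The phase breakdown, durations, and error probabilities below are invented placeholders (the study's actual task network and data are not given in this record); the point is only to show how stochastic phase times and per-phase error probabilities combine into time and human-error estimates.

```python
import random

# Hypothetical phases of a manual fuel-handling task: (name, mean duration
# in minutes, human-error probability per performance). Values are
# illustrative, not those from the INL study.
PHASES = [("walk_to_canal", 2.0, 1e-4),
          ("position_tool", 4.0, 5e-4),
          ("lift_element",  6.0, 2e-3),
          ("inspect",       8.0, 1e-3)]

def run_trial(rng):
    """One simulated performance of the task: accumulate right-skewed
    phase durations and record whether any phase produced an error."""
    t, failed = 0.0, False
    for _, mean_dur, p_err in PHASES:
        t += rng.gammavariate(4.0, mean_dur / 4.0)  # mean = mean_dur
        failed = failed or (rng.random() < p_err)
    return t, failed

rng = random.Random(1)
trials = [run_trial(rng) for _ in range(100_000)]
times = [t for t, _ in trials]
p_error = sum(f for _, f in trials) / len(trials)
print(f"mean task time {sum(times)/len(times):.1f} min, "
      f"P(task error) ~ {p_error:.2e}")
```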

  19. Performative Tools and Collaborative Learning

    DEFF Research Database (Denmark)

    Minder, Bettina; Lassen, Astrid Heidemann

    The use of performative tools can support collaborative learning across knowledge domains (i.e. science and practice), because they create new spaces for dialog. However, so far the innovation literature provides few answers to the important discussion of how to describe the effects and requirements... of performative tools used in transdisciplinary events for collaborative learning. The results of this single case study add to the extant knowledge and learning literature by providing the reader with a rich description of the characteristics and learning functions of performative tools in transdisciplinary events... and a description of how they interrelate with the specific setting of such an event. Furthermore, they complement previous findings by relating performative tools to collaborative learning for knowledge-intensive ideas.

  20. OISI dynamic end-to-end modeling tool

    Science.gov (United States)

    Kersten, Michael; Weidler, Alexander; Wilhelm, Rainer; Johann, Ulrich A.; Szerdahelyi, Laszlo

    2000-07-01

    The OISI Dynamic end-to-end modeling tool is tailored to end-to-end modeling and dynamic simulation of Earth- and space-based actively controlled optical instruments such as optical stellar interferometers. 'End-to-end modeling' denotes that the overall model comprises, besides optical sub-models, also structural, sensor, actuator, controller and disturbance sub-models influencing the optical transmission, so that the system-level instrument performance due to disturbances and active optics can be simulated. This tool has been developed to support performance analysis and prediction as well as control loop design and fine-tuning for OISI, Germany's preparatory program for optical/infrared spaceborne interferometry initiated in 1994 by Dornier Satellitensysteme GmbH in Friedrichshafen.

  1. A tool for standardized collector performance calculations including PVT

    DEFF Research Database (Denmark)

    Perers, Bengt; Kovacs, Peter; Olsson, Marcus

    2012-01-01

    A tool for standardized calculation of solar collector performance has been developed in cooperation between SP Technical Research Institute of Sweden, DTU Denmark and SERC Dalarna University. The tool is designed to calculate the annual performance of solar collectors at representative locations...... can be tested and modeled as a thermal collector, when the PV electric part is active with an MPP tracker in operation. The thermal collector parameters from this operation mode are used for the PVT calculations....

  2. Application of the Performance Validation Tool for the Evaluation of NSSS Control System Performance

    International Nuclear Information System (INIS)

    Sohn, Suk-whun

    2011-01-01

    When a control system is supplied to nuclear power plant (NPP) under construction, static tests and dynamic tests are typically performed for evaluating its performance. The dynamic test is not realistic for validating the performance of the replaced hardware in operating NPPs because of potential risks and economic burden. We have, therefore, developed a performance validation tool which can evaluate the dynamic performances of the control system without undertaking real plant tests. The window-based nuclear plant performance analyzer (Win-NPA) is used as a virtual NPP in the developed tool and provides appropriate control loads to the target control system via hardwired cables in a manner that the interfaces are identical to the field wiring. The outputs from the control system are used as the simulation inputs of the plant model within the Win-NPA. With this closed-loop configuration, major transient events were simulated to check the performance of the implemented control system by comparing it with that of the control system model of the Win-NPA and that of the old hardware. The developed tool and the methodology were successfully applied to the hardware replacement project for Yonggwang (YGN) 3 and 4 feedwater control system (FWCS) in 2008. Several errors in the implemented control system were fixed through the performance validation tests and the operability tests. As a result, the control system of the YGN 3 and 4 has demonstrated an excellent control performance since then. On the basis of YGN 3 and 4 project experiences, we are performing a similar project in Ulchin (UCN) 3 and 4. This methodology can also be applied to other NPPs under construction as a tool for pre-operational dynamic tests. These performance tests before performing power ascension tests (PATs) are conducive to preventing unnecessary plant transients or unwanted reactor trips caused by hidden errors of control systems during PATs. (author)

  3. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    Science.gov (United States)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models on high performance computers, and, with the advent of ubiquitous multicore processors, on practically every system, has long been accomplished with basic software tools: typically, command-line compilers, debuggers, and performance tools that have not changed substantially since the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as OpenMP plus MPI) to take full advantage of high performance computers with an increasing core count per shared memory node, have made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view to improve PTP. We are using a set of scientific applications, each with its own challenges, both to drive improvements to the applications themselves and to understand shortcomings in Eclipse PTP from an application developer's perspective, which in turn drives the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into

  4. Diamond and cBN hybrid and nanomodified cutting tools with enhanced performance: Development, testing and modelling

    DEFF Research Database (Denmark)

    Loginov, Pavel; Mishnaevsky, Leon; Levashov, Evgeny

    2015-01-01

    The potential for enhancing the performance of superhard steel and cast iron cutting tools through microstructural modification of the tool materials is studied. Hybrid machining tools with mixed diamond and cBN grains, as well as machining tools with a composite nanomodified metallic binder... are developed, and tested experimentally and numerically. It is demonstrated that both the combination of diamond and cBN (hybrid structure) and the nanomodification of the metallic binder (with hexagonal boron nitride/hBN platelets) lead to substantial improvement of cast iron machining performance. The superhard tools... with 25% of diamond replaced by cBN grains demonstrate 20% higher performance compared with pure diamond machining tools, and more than two times higher performance compared with pure cBN tools. Further, the cast iron machining efficiency of the wheels modified by hBN particles was 80% higher...

  5. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Modeling tools and operators help the user/developer identify the relevant processing field at the top of the sequence and send to the computing module only the data related to the requested result; the remaining data are not relevant and would slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and cost. To do so, we must review the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and cost. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  6. EnergiTools. A methodology for performance monitoring and diagnosis

    International Nuclear Information System (INIS)

    Ancion, P.; Bastien, R.; Ringdahl, K.

    2000-01-01

    EnergiTools is a performance monitoring and diagnostic tool that combines the power of on-line process data acquisition with advanced diagnosis methodologies. Analytical models based on thermodynamic principles are combined with neural networks to validate sensor data and to estimate missing or faulty measurements. Advanced diagnostic technologies are then applied to point out potential faults and areas to be investigated further. The diagnosis methodologies are based on Bayesian belief networks. Expert knowledge is captured in the form of fault-symptom relationships and includes historical information such as the likelihood of faults and symptoms. The methodology produces the likelihood of component failure root causes using the expert knowledge base. EnergiTools is used at the Ringhals nuclear power plants, where it has led to the diagnosis of various performance issues. Three case studies based on plant data and models are presented to illustrate the diagnosis support methodologies implemented in EnergiTools. In the first case, the analytical data qualification technique points out several faulty measurements. The application of a neural network to estimate nuclear reactor power by interpreting several plant indicators is then illustrated. Finally, the use of the Bayesian belief networks is described. (author)

  7. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    One of the key goals in the SEMI industry is to improve equipment throughput and ensure that equipment production efficiency is maximized. Based on SEMI standards for semiconductor equipment control, this paper defines the transition rules between different tool states and presents a TEA system model, built on a finite state machine, that analyzes tool performance automatically. The system was applied to fab tools and its effectiveness was verified successfully; it yielded the parameter values used to measure equipment performance, as well as suggestions for improvement.

  8. High Performance Grinding and Advanced Cutting Tools

    CERN Document Server

    Jackson, Mark J

    2013-01-01

    High Performance Grinding and Advanced Cutting Tools discusses the fundamentals and advances in high performance grinding processes, and provides a complete overview of newly-developing areas in the field. Topics covered are grinding tool formulation and structure, grinding wheel design and conditioning and applications using high performance grinding wheels. Also included are heat treatment strategies for grinding tools, using grinding tools for high speed applications, laser-based and diamond dressing techniques, high-efficiency deep grinding, VIPER grinding, and new grinding wheels.

  9. Comparison of BrainTool to other UML modeling and model transformation tools

    Science.gov (United States)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    Over the last 30 years, numerous model-generated software systems have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities", and nowadays the Object Management Group (OMG) makes similar claims for Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features: starting with a model editor and a model repository for the traditional ones, and ending with a code generator (possibly using a scripting or domain-specific (DSL) language), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  10. ExaSAT: An exascale co-design tool for performance modeling

    International Nuclear Information System (INIS)

    Unat, Didem; Chan, Cy; Zhang, Weiqun; Williams, Samuel; Bachan, John

    2015-01-01

    One of the emerging challenges to designing HPC systems is understanding and projecting the requirements of exascale applications. In order to determine the performance consequences of different hardware designs, analytic models are essential because they can provide fast feedback to the co-design centers and chip designers without costly simulations. However, current attempts to analytically model program performance typically rely on the user manually specifying a performance model. Here we introduce the ExaSAT framework that automates the extraction of parameterized performance models directly from source code using compiler analysis. The parameterized analytic model enables quantitative evaluation of a broad range of hardware design trade-offs and software optimizations on a variety of different performance metrics, with a primary focus on data movement as a metric. Finally, we demonstrate the ExaSAT framework’s ability to perform deep code analysis of a proxy application from the Department of Energy Combustion Co-design Center to illustrate its value to the exascale co-design process. ExaSAT analysis provides insights into the hardware and software trade-offs and lays the groundwork for exploring a more targeted set of design points using cycle-accurate architectural simulators.
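ExaSAT extracts parameterized models from source code via compiler analysis; the sketch below only evaluates the kind of analytic, data-movement-centred model such a framework produces. A roofline-style bound is assumed, and the flop and byte counts for a 7-point stencil are rough hand estimates, not ExaSAT output.

```python
def predicted_time(flops, bytes_moved, peak_gflops=1000.0, bw_gbs=100.0):
    """Analytic (roofline-style) lower bound on kernel time: the kernel
    is limited either by compute throughput or by data movement."""
    return max(flops / (peak_gflops * 1e9), bytes_moved / (bw_gbs * 1e9))

# 7-point stencil sweep on an N^3 grid in double precision: roughly 8
# flops per point, and per-point traffic depending on cache reuse.
N = 512
points = N ** 3
for words, desc in [(9, "no reuse (8 loads + 1 store)"),
                    (2, "perfect reuse (1 load + 1 store)")]:
    t = predicted_time(8 * points, 8 * words * points)
    print(f"{desc:32s} predicted time {t * 1e3:6.1f} ms")
```

Sweeping hardware parameters such as `bw_gbs` in a model like this, rather than running cycle-accurate simulations, is what lets a co-design study cover a broad range of design points quickly.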

  11. Model Performance Evaluation and Scenario Analysis (MPESA)

    Science.gov (United States)

    Model Performance Evaluation and Scenario Analysis (MPESA) assesses the performance with which models predict time series data. The tool was developed for the Hydrological Simulation Program-Fortran (HSPF) and the Stormwater Management Model (SWMM).

  12. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.
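The lumberjack analogy in the record is qualitative; the toy model below renders it as a function purely for illustration. The functional form and coefficients are invented, not taken from the validated tools described above.

```python
def predicted_performance(doa, failure=False):
    """Toy rendering of the 'lumberjack' effect: a higher degree of
    automation (doa in [0, 1]) helps routine performance but deepens the
    drop when the automation fails, reflecting loss of situation
    awareness and manual skill. Coefficients are purely illustrative."""
    routine = 0.6 + 0.35 * doa
    penalty = 0.5 * doa ** 2          # failure cost grows with reliance
    return routine - penalty if failure else routine

for doa in (0.0, 0.4, 0.8):
    print(f"DOA={doa:.1f}  routine={predicted_performance(doa):.2f}"
          f"  after failure={predicted_performance(doa, failure=True):.2f}")
```

Running it shows routine performance rising with automation while post-failure performance falls, which is exactly the crossover the lumberjack analogy describes.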

  13. An applied artificial intelligence approach towards assessing building performance simulation tools

    Energy Technology Data Exchange (ETDEWEB)

    Yezioro, Abraham [Faculty of Architecture and Town Planning, Technion IIT (Israel); Dong, Bing [Center for Building Performance and Diagnostics, School of Architecture, Carnegie Mellon University (United States); Leite, Fernanda [Department of Civil and Environmental Engineering, Carnegie Mellon University (United States)

    2008-07-01

    With the development of modern computer technology, a large number of building energy simulation tools are available on the market. When choosing which simulation tool to use in a project, the user must consider the tool's accuracy and reliability in light of the building information they have at hand, which will serve as input for the tool. This paper presents an approach to assessing building performance simulation results against actual measurements, using artificial neural networks (ANN) for predicting building energy performance. Training and testing of the ANN were carried out with energy consumption data acquired over 1 week in the case-study building, called the Solar House. The predicted results show a good fit, with a mean absolute error of 0.9%. Moreover, four building simulation tools were selected in this study in order to compare their results with the ANN-predicted energy consumption: Energy-10, the Green Building Studio web tool, eQuest and EnergyPlus. The results showed that the more detailed simulation tools have the best simulation performance in terms of heating and cooling electricity consumption, within 3% mean absolute error. (author)
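As a sketch of the ANN approach described above (not the authors' network or data), the following trains a small multilayer perceptron on synthetic "one-week" hourly data and reports the mean absolute percentage error. The feature choices and the data-generating function are assumptions made for the example.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_percentage_error

# Synthetic stand-in for one week (168 h) of monitoring data: features
# might be outdoor temperature (degC), solar radiation (W/m2) and hour of
# day; the target is building energy consumption (kWh). Real inputs would
# come from the monitored building.
rng = np.random.default_rng(0)
X = rng.uniform([0, 0, 0], [35, 900, 23], size=(168, 3))
y = 5 + 0.8 * X[:, 0] - 0.002 * X[:, 1] + rng.normal(0, 0.3, 168)

# Scale features, then fit a small feedforward network; hold out the
# last 48 hours for testing.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                 random_state=0),
).fit(X[:120], y[:120])
mape = mean_absolute_percentage_error(y[120:], model.predict(X[120:]))
print(f"mean absolute percentage error: {100 * mape:.1f}%")
```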

  14. A flexible tool for hydraulic and water quality performance analysis of green infrastructure

    Science.gov (United States)

    Massoudieh, A.; Alikhani, J.

    2017-12-01

    Models that allow for design considerations of green infrastructure (GI) practices to control stormwater runoff and associated contaminants have received considerable attention in recent years. To be used to evaluate the effect of design configurations on the long-term performance of GIs, models should be able to represent the processes within GIs with good fidelity. In this presentation, a sophisticated yet flexible tool for hydraulic and water quality assessment of GIs will be introduced. The tool can be used by design engineers and researchers to capture and explore the effect of design factors and of the properties of the media employed on the performance of GI systems at a relatively small scale. We deemed it essential to have a flexible GI modeling tool that is capable of accurately simulating GI system components and the specific biogeochemical processes affecting contaminants, such as evapotranspiration, plant uptake, reactions, and particle-associated transport, while maintaining a high degree of flexibility to account for the myriad of GI alternatives. The mathematical framework for a stand-alone GI performance assessment tool has been developed and will be demonstrated. The process-based model framework can be used to model a diverse range of GI practices such as stormwater ponds, green roofs, retention ponds, bioretention systems, infiltration trenches, permeable pavements and other custom-designed combinatory systems. An example application of the system to evaluate the performance of a rain-garden will be demonstrated.

  15. 8th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2015-01-01

    Numerical simulation and modelling using High Performance Computing has evolved into an established technique in academic and industrial research. At the same time, the High Performance Computing infrastructure is becoming ever more complex. For instance, most of the current top systems around the world use thousands of nodes in which classical CPUs are combined with accelerator cards in order to enhance their compute power and energy efficiency. This complexity can only be mastered with adequate development and optimization tools. Key topics addressed by these tools include parallelization on heterogeneous systems, performance optimization for CPUs and accelerators, debugging of increasingly complex scientific applications, and optimization of energy usage in the spirit of green IT. This book represents the proceedings of the 8th International Parallel Tools Workshop, held October 1-2, 2014 in Stuttgart, Germany, a forum to discuss the latest advancements in parallel tools.

  16. Investigation of tool engagement and cutting performance in machining a pocket

    Science.gov (United States)

    Adesta, E. Y. T.; Hamidon, R.; Riza, M.; Alrashidi, R. F. F. A.; Alazemi, A. F. F. S.

    2018-01-01

    This study investigates the variation of tool engagement for different cutting profiles, and explores the behavior of cutting force and cutting temperature under different tool engagements when machining a pocket. Initially, simple tool engagement models were developed for peripheral and slot cutting with different types of corner. Based on these models, the tool engagements of contour and zigzag tool path strategies for a rectangular pocket of 80 mm x 60 mm were analyzed. Experiments were conducted to investigate the effect of tool engagement on cutting force and cutting temperature in the machining of a pocket in AISI H13 material. The cutting parameters used were a cutting speed of 150 m/min, a feed of 0.05 mm/tooth, and a depth of cut of 0.1 mm. The results show a relationship between cutting force, cutting temperature and tool engagement: higher cutting force and cutting temperature are obtained when the cutting tool performs up milling and when it is in full engagement with the workpiece.
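A standard geometric relation underlies such engagement models: for peripheral milling, the engagement angle follows from the radial depth of cut and the tool diameter. The sketch below applies this textbook formula; the 10 mm tool diameter is an assumed example value, not from the paper.

```python
import math

def engagement_angle(radial_depth, tool_diameter):
    """Radial engagement (cutter contact) angle in degrees for peripheral
    milling: theta = arccos(1 - 2*ae/D). Full slotting (ae = D) gives
    180 degrees, i.e. the full engagement seen at pocket corners."""
    ratio = 1.0 - 2.0 * radial_depth / tool_diameter
    return math.degrees(math.acos(max(-1.0, min(1.0, ratio))))

D = 10.0  # mm tool diameter (illustrative)
for ae in (0.5, 2.5, 5.0, 10.0):   # mm radial depth of cut
    print(f"ae = {ae:4.1f} mm -> engagement {engagement_angle(ae, D):6.1f} deg")
```

The jump in engagement angle at corners is why cutting force and temperature peak there, as the experiments in the record report.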

  17. Animal models: an important tool in mycology.

    Science.gov (United States)

    Capilla, Javier; Clemons, Karl V; Stevens, David A

    2007-12-01

    Animal models of fungal infections are, and will remain, a key tool in the advancement of medical mycology. Many different types of animal models of fungal infection have been developed, with murine models the most frequently used, for studies of pathogenesis, virulence, immunology, diagnosis, and therapy. The ability to control numerous variables in performing the model allows us to mimic human disease states and quantitatively monitor the course of the disease. However, no single model can answer all questions, and different animal species or different routes of infection can show somewhat different results. Thus, the choice of which animal model to use must be made carefully, addressing the type of human disease to mimic, the parameters to follow, and the collection of the appropriate data to answer the questions being asked. This review addresses a variety of uses for animal models in medical mycology. It focuses on the most clinically important diseases affecting humans and cites various examples of the different types of studies that have been performed. Overall, animal models of fungal infection will continue to be valuable tools in addressing questions concerning fungal infections and will contribute to our deeper understanding of how these infections occur, progress and can be controlled and eliminated.

  18. Annex 2: performance audit tool PAT

    Energy Technology Data Exchange (ETDEWEB)

    Kaldorf, S.; Gruber, P.

    2000-10-15

    This annex to the report takes a look at experience gained with one of two expert systems used for error-detection and diagnosis in existing buildings. The Performance Audit Tool PAT is based on an expert system for the detection and diagnosis of under-performance. Its aim is to assist the system's operator. It was developed as a stand-alone system which periodically runs a remote detection and diagnosis check. The various types of under-performance that the system can detect are discussed. The first objective of this study was to present the structure of the tool and describe its application to two types of system: a central air-handling unit and individual zones. The second objective was to discuss experience gained with this type of tool.

  19. 10th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Hilbrich, Tobias; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2017-01-01

    This book presents the proceedings of the 10th International Parallel Tools Workshop, held October 4-5, 2016 in Stuttgart, Germany, a forum to discuss the latest advances in parallel tools. High-performance computing plays an increasingly important role for numerical simulation and modelling in academic and industrial research. At the same time, using large-scale parallel systems efficiently is becoming more difficult. A number of tools addressing parallel program development and analysis have emerged from the high-performance computing community over the last decade, and what may have started as a collection of small helper scripts has now matured into production-grade frameworks. Powerful user interfaces and an extensive body of documentation allow easy usage by non-specialists. Some tools have been commercialized, while others are operated as open source by a growing research community.

  20. Simulation of the hydraulic performance of highway filter drains through laboratory models and stormwater management tools.

    Science.gov (United States)

    Sañudo-Fontaneda, Luis A; Jato-Espino, Daniel; Lashford, Craig; Coupe, Stephen J

    2017-05-23

    Road drainage is one of the most relevant assets in transport infrastructure due to its inherent influence on traffic management and road safety. Highway filter drains (HFDs), also known as "French drains", are the main drainage system currently in use in the UK, throughout 7000 km of its strategic road network. Despite being a widespread technique across the whole country, little research has been completed on their design considerations and the subsequent impact on their hydraulic performance, representing a gap in the field. Laboratory experiments have proven to be a reliable indicator for simulating the hydraulic performance of stormwater best management practices (BMPs). In addition, stormwater management tools (SMT) have been preferentially chosen as a design tool for BMPs by practitioners from all over the world. In this context, this research investigates the hydraulic performance of HFDs by comparing the results from laboratory simulation with those from two widely used SMT, the US EPA's stormwater management model (SWMM) and MicroDrainage®. Statistical analyses were applied to a series of simulated rainfall scenarios, showing a high level of agreement between the results obtained in the laboratory and with the SMT, as indicated by the high values of the Nash-Sutcliffe and R² coefficients and the low root-mean-square error (RMSE) reached, which validated the usefulness of SMT for determining the hydraulic performance of HFDs.
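The statistical comparison described above can be reproduced in outline with a few lines of Python. The helper below computes the three reported metrics (Nash-Sutcliffe efficiency, R², RMSE); the observed and simulated hydrographs are made-up illustrative values, not the paper's data.

```python
import numpy as np

def goodness_of_fit(observed, modelled):
    """Nash-Sutcliffe efficiency, R^2 and RMSE between laboratory
    observations and tool-simulated outflow (illustrative helper)."""
    o, m = np.asarray(observed, float), np.asarray(modelled, float)
    nse = 1.0 - np.sum((o - m) ** 2) / np.sum((o - o.mean()) ** 2)
    r = np.corrcoef(o, m)[0, 1]
    rmse = np.sqrt(np.mean((o - m) ** 2))
    return nse, r ** 2, rmse

obs = [0.0, 0.4, 1.2, 2.0, 1.5, 0.8, 0.3]   # l/s, made-up hydrograph
sim = [0.0, 0.5, 1.1, 1.9, 1.6, 0.7, 0.2]
nse, r2, rmse = goodness_of_fit(obs, sim)
print(f"NSE={nse:.3f}  R2={r2:.3f}  RMSE={rmse:.3f} l/s")
```

NSE and R² near 1 together with a small RMSE are the pattern the study reports when validating SWMM and MicroDrainage® against the laboratory rig.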

  1. A Perspective on Computational Human Performance Models as Design Tools

    Science.gov (United States)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  2. Tool wear modeling using abductive networks

    Science.gov (United States)

    Masory, Oren

    1992-09-01

    A tool wear model based on abductive networks, which consist of a network of 'polynomial' nodes, is described. The model relates the cutting parameters, components of the cutting force, and machining time to flank wear, so that real-time measurements of the cutting force can be used to monitor the machining process. The model is obtained by a training process in which the connectivity between the network's nodes and the polynomial coefficients of each node are determined by optimizing a performance criterion. Actual wear measurements of coated and uncoated carbide inserts were used for training and evaluating the established model.
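Abductive (GMDH-type) networks build predictions from layered "polynomial" nodes, each a low-order polynomial of two inputs fitted by least squares. The sketch below fits a single such node to synthetic flank-wear data; the choice of inputs and the data-generating function are invented for illustration, not taken from the record.

```python
import numpy as np

def fit_polynomial_node(x1, x2, y):
    """Least-squares fit of one GMDH-style 'polynomial node':
    y ~ a0 + a1*x1 + a2*x2 + a3*x1^2 + a4*x2^2 + a5*x1*x2.
    An abductive network chains such nodes, keeping the best performers
    at each layer according to a selection criterion."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs, A @ coeffs

# Illustrative data: flank wear vs. cutting force and machining time.
rng = np.random.default_rng(3)
force = rng.uniform(100, 400, 50)            # N
time = rng.uniform(1, 30, 50)                # min
wear = (0.002 * force + 0.004 * time + 1e-5 * force * time
        + rng.normal(0, 0.02, 50))           # mm, synthetic ground truth
coeffs, pred = fit_polynomial_node(force, time, wear)
print("fitted coefficients:", np.round(coeffs, 5))
```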

  3. A Designer’s Guide to Human Performance Modelling (La Modelisation des Performances Humaines: Manuel du Concepteur).

    Science.gov (United States)

    1998-12-01

    ... into the Systems Engineering Process; 5.3 Validation of HPMs; 5.4 Commercialisation of human performance modelling software; 5.5 Model/Tool ... budget) so that inappropriate models/tools are not offered. The WG agreed that another form of 'educating' designers in the use of models was by means ... 5.2.8 Include human performance in system test ... More and more, customers are mandating the provision

  4. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
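Of the three BSA techniques named above, the frequency ratio model is the simplest to illustrate: each class of a conditioning factor is scored by the ratio of its share of hazard occurrences to its share of total area. The sketch below computes those scores for a toy slope-class raster and landslide inventory (invented values, not the paper's Malaysian data).

```python
import numpy as np

def frequency_ratio(factor_class, hazard):
    """Frequency ratio per class of one conditioning factor:
    FR = (share of hazard pixels in the class) / (share of all pixels
    in the class). FR > 1 marks classes associated with the hazard."""
    factor_class, hazard = np.asarray(factor_class), np.asarray(hazard)
    out = {}
    for c in np.unique(factor_class):
        in_class = factor_class == c
        pct_hazard = hazard[in_class].sum() / hazard.sum()
        pct_area = in_class.mean()
        out[int(c)] = pct_hazard / pct_area
    return out

# Toy raster: slope classes 0-2 and a binary landslide inventory.
slope_class = np.array([0, 0, 0, 1, 1, 1, 1, 2, 2, 2])
landslide   = np.array([0, 0, 0, 0, 1, 1, 0, 1, 1, 1])
print(frequency_ratio(slope_class, landslide))   # {0: 0.0, 1: 1.0, 2: 2.0}
```

Summing the FR scores of every factor over a raster stack yields the hazard-susceptibility index whose success and prediction rates the authors measure with the area under the curve.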

  5. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    Science.gov (United States)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  6. Comparing the performance of three digital forensic tools

    Directory of Open Access Journals (Sweden)

    Brian Cusack

    Full Text Available Software used for digital forensic investigations must be verified against reliability and validity criteria. In this paper, three well-known tools are tested against the mandatory features of digital forensic tools published by the National Institute of Standards and Technology (NIST). It was found that a variation in performance existed between the tools, with all having measurable areas of non-performance. The findings have an impact on the professional use of the tools and illustrate the need for benchmarking and testing of the tools before use.

  7. Application of performance assessment as a tool for guiding project work

    International Nuclear Information System (INIS)

    McCombie, C.; Zuidema, P.

    1992-01-01

    The ultimate aim of the performance assessment methodology developed over the last 10-15 years is to predict quantitatively the behavior of disposal systems over periods of time extending into the far future. The methodology can, however, also be applied to a range of tasks during repository development and is used in many programmes as a tool for improving or optimizing the design of subsystem components or for guiding the course of project planning. In the Swiss waste management programme, there are several examples of the use of performance assessment as a tool in the manner mentioned above. The interaction between research models, assessment models and simplified models is considered to be of key importance, and corresponding measures are taken to properly structure the process and to track the data: first, the results of all applications of the models are included in a consistent manner in the scenario analyses for the different sites and systems and, second, consistency in the underlying assumptions and in the data used in the different model calculations is assured by the consistent application of a configuration data management system (CDM). Almost all the applications of performance assessment have featured in Swiss work, but for this paper only two examples have been selected: applications of performance assessment in both the HLW and the LLW programmes, and acceptance of specific waste types and their allocation to an appropriate repository on the basis of simplified safety analyses

  8. An online model composition tool for system biology models.

    Science.gov (United States)

    Coskun, Sarp A; Cicek, A Ercument; Lai, Nicola; Dash, Ranjan K; Ozsoyoglu, Z Meral; Ozsoyoglu, Gultekin

    2013-09-05

    There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models to facilitate the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely, (1) the Model Simulation Interface, which generates a visual plot of the simulation according to the user's input, (2) the iModel Tool, a platform for users to upload their own models to compose, and (3) the SimCom Tool, which provides a side-by-side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. The model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. SBML Test Suite models will be a good starting point for beginners, and, for more advanced purposes, users will be able to access and employ models of the BioModels Database as well.
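    As a rough sketch of one elementary composition step (not the PathCase-SB implementation), the code below uses the third-party python-libsbml package to copy species missing from one SBML model into another; the file names are placeholders, and a real merge would also have to reconcile compartments, units, and reactions.

    ```python
    import libsbml  # pip install python-libsbml

    def merge_species(target_path, source_path, out_path):
        """Naive composition step: copy species that exist only in the source
        model into the target model. Illustrative only; paths are placeholders."""
        target_doc = libsbml.readSBMLFromFile(target_path)
        source_doc = libsbml.readSBMLFromFile(source_path)
        target, source = target_doc.getModel(), source_doc.getModel()

        for i in range(source.getNumSpecies()):
            sp = source.getSpecies(i)
            if target.getSpecies(sp.getId()) is None:
                target.addSpecies(sp)  # libsbml copies the object into the model

        libsbml.writeSBMLToFile(target_doc, out_path)

    merge_species("glycolysis.xml", "tca_cycle.xml", "composed.xml")
    ```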

  9. Integrating a Decision Management Tool with UML Modeling Tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    by proposing potential subsequent design issues. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, these decisions are typically not connected to the models created during...... integration of formerly disconnected tools improves tool usability as well as decision maker productivity....

  10. Physically vapor deposited coatings on tools: performance and wear phenomena

    International Nuclear Information System (INIS)

    Koenig, W.; Fritsch, R.; Kammermeier, D.

    1991-01-01

    Coatings produced by physical vapor deposition (PVD) enhance the performance of tools for a broad variety of production processes. In addition to TiN, nowadays (Ti,Al)N and Ti(C,N) coated tools are available. This gives the opportunity to compare the performance of different coatings under identical machining conditions and to evaluate causes and phenomena of wear. TiN, (Ti,Al)N and Ti(C,N) coatings on high speed steel (HSS) show different performances in milling and turning of heat treated steel. The thermal and frictional properties of the coating materials affect the structure, the thickness and the flow of the chips, the contact area on the rake face and the tool life. Model tests show the influence of internal cooling and the thermal conductivity of coated HSS inserts. TiN and (Ti,Zr)N PVD coatings on cemented carbides were examined in interrupted turning and in milling of heat treated steel. Experimental results show a significant influence of typical time-temperature cycles of PVD and chemical vapor deposition (CVD) coating processes on the physical data and on the performance of the substrates. PVD coatings increase tool life, especially towards lower cutting speeds into ranges which cannot be applied with CVD coatings. The reason for this is the superior toughness of the PVD-coated carbide. The combination of tough, micrograin carbide and PVD coating even enables broaching of case hardened sliding gears at a cutting speed of 66 m min⁻¹. (orig.)

  11. Modeling the dielectric logging tool at high frequency

    International Nuclear Information System (INIS)

    Chew, W.C.

    1987-01-01

    The high frequency dielectric logging tool has been used widely in electromagnetic well logging because, by measuring the dielectric constant at high frequencies (1 GHz), the water saturation of rocks can be determined without measuring the water salinity in the rocks. As such, it can be used to delineate fresh-water-bearing zones, as the dielectric constant of fresh water is much higher than that of oil while they may have the same resistivity. The authors present a computer model that simulates, through electromagnetic field analysis, the response of such a measurement tool in a well logging environment. As the measurement is performed at high frequency, usually with small separation between the transmitter and receivers, some small geological features can be measured by such a tool. They use the computer model to study the behavior of such a tool across geological bed boundaries, and also across thin geological beds. Such a study can be very useful in understanding the limitations on the resolution of the tool. Furthermore, they can study the standoff effect and the depth of investigation of such a tool. This delineates the range of usefulness of the measurement

  12. Effect of micro-scale texturing on the cutting tool performance

    Science.gov (United States)

    Vasumathy, D.; Meena, Anil

    2018-05-01

    The present study is mainly focused on the cutting performance of micro-scale textured carbide tools while turning AISI 304 austenitic stainless steel under a dry cutting environment. The texture on the rake face of the carbide tools was fabricated by laser machining. The cutting performance of the textured tools was further compared with that of conventional tools in terms of cutting forces, tool wear, machined surface quality and chip curl radius. SEM and EDS analyses have also been performed to better understand the tool surface characteristics. Results show that the grooves help in breaking the tool-chip contact, leading to a smaller tool-chip contact area, which results in reduced iron (Fe) adhesion to the tool.

  13. Graph and model transformation tools for model migration : empirical results from the transformation tool contest

    NARCIS (Netherlands)

    Rose, L.M.; Herrmannsdoerfer, M.; Mazanek, S.; Van Gorp, P.M.E.; Buchwald, S.; Horn, T.; Kalnina, E.; Koch, A.; Lano, K.; Schätz, B.; Wimmer, M.

    2014-01-01

    We describe the results of the Transformation Tool Contest 2010 workshop, in which nine graph and model transformation tools were compared for specifying model migration. The model migration problem—migration of UML activity diagrams from version 1.4 to version 2.2—is non-trivial and practically relevant.

  14. Design of a Cognitive Tool to Enhance Problemsolving Performance

    Science.gov (United States)

    Lee, Youngmin; Nelson, David

    2005-01-01

    The design of a cognitive tool to support problem-solving performance through the external representation of knowledge is described. The limitations of conventional knowledge maps are analyzed in proposing the tool. The design principles and specifications are described. This tool is expected to enhance learners' problem-solving performance by allowing…

  15. DiVinE-CUDA - A Tool for GPU Accelerated LTL Model Checking

    Directory of Open Access Journals (Sweden)

    Jiří Barnat

    2009-12-01

    Full Text Available In this paper we present a tool that performs CUDA-accelerated LTL model checking. The tool exploits the parallel algorithm MAP, adjusted to the NVIDIA CUDA architecture, in order to efficiently detect the presence of accepting cycles in a directed graph. Accepting cycle detection is the core algorithmic procedure in automata-based LTL model checking. We demonstrate that the tool outperforms the non-accelerated version of the algorithm, and we discuss where the limits of the tool are and what we intend to do in the future to overcome them.
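    For readers unfamiliar with the underlying problem, the sketch below shows the classic sequential nested depth-first search for accepting cycles in a Büchi-style graph. It illustrates the problem the CUDA MAP algorithm parallelizes, not the MAP algorithm itself, and the toy graph is invented.

    ```python
    def has_accepting_cycle(graph, accepting, start):
        """Sequential nested DFS for accepting-cycle detection.
        graph: dict node -> list of successors; accepting: set of nodes."""
        outer_done, inner_done = set(), set()

        def inner(seed, node):
            # Second DFS: look for a path back to the accepting seed state.
            for nxt in graph.get(node, []):
                if nxt == seed:
                    return True
                if nxt not in inner_done:
                    inner_done.add(nxt)
                    if inner(seed, nxt):
                        return True
            return False

        def outer(node):
            outer_done.add(node)
            for nxt in graph.get(node, []):
                if nxt not in outer_done and outer(nxt):
                    return True
            # Post-order: launch the inner search from accepting states.
            return node in accepting and inner(node, node)

        return outer(start)

    # Tiny graph with an accepting cycle 1 -> 2 -> 1.
    g = {0: [1], 1: [2], 2: [1]}
    print(has_accepting_cycle(g, accepting={1}, start=0))  # True
    ```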

  16. Runtime Performance Monitoring Tool for RTEMS System Software

    Science.gov (United States)

    Cho, B.; Kim, S.; Park, H.; Kim, H.; Choi, J.; Chae, D.; Lee, J.

    2007-08-01

    RTEMS is a commercial-grade real-time operating system that supports multi-processor computers. However, there are not many development tools for RTEMS. In this paper, we report a new RTEMS-based runtime performance monitoring tool. We have implemented a lightweight runtime monitoring task with an extension to the RTEMS APIs. Using our tool, software developers can verify various performance-related parameters during runtime. Our tool can be used during the software development phase as well as in in-orbit operation. Our implemented target agent is lightweight and has small overhead, using a SpaceWire interface. Efforts to reduce overhead and to add other monitoring parameters are currently under research.

  17. Performance Evaluation of Java Based Object Relational Mapping Tools

    Directory of Open Access Journals (Sweden)

    Shoaib Mahmood Bhatti

    2013-04-01

    Full Text Available Object persistence is a hot issue in the form of ORM (Object Relational Mapping) tools in industry, as developers use these tools during software development. This paper presents a performance evaluation of Java-based ORM tools. For this purpose, Hibernate, Ebean and TopLink have been selected as ORM tools that are popular and open source. Their performance has been measured from the execution point of view. The results show that ORM tools are a good option for developers considering system throughput and short setbacks, and that they can be used efficiently and effectively for mapping objects into the relationally dominated world of databases, thus creating hope for a better and well-established future for this technology.

  18. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    Science.gov (United States)

    Franco, P.; Estrems, M.; Faura, F.

    2007-04-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on the cutting tool surface in turning processes due to the wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along the cutting tool surface can be analysed, and the worn surface shape during workpiece machining can be determined. The proposed model analyses the gradual degradation of the cutting tool during the turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained to describe the material loss on the cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from the literature and from experimental observation for AISI 4340 steel turning with WC-Co cutting tools.
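    As a toy illustration of estimating flank wear as a function of cutting time (a generic textbook-style running-in plus steady-wear law with invented coefficients, not the model proposed in this record):

    ```python
    import numpy as np

    # Illustrative flank-wear evolution VB(t): rapid initial (running-in) wear
    # plus a steady abrasive/adhesive term. Coefficients are made up.
    def flank_wear(t_minutes, k_run_in=0.05, tau=0.5, k_steady=0.004):
        return k_run_in * (1.0 - np.exp(-t_minutes / tau)) + k_steady * t_minutes

    t = np.linspace(0.0, 30.0, 7)   # cutting time, minutes
    vb = flank_wear(t)              # wear-land width, mm
    vb_limit = 0.3                  # common tool-life criterion, mm

    for ti, vbi in zip(t, vb):
        status = "worn out" if vbi >= vb_limit else "ok"
        print(f"t = {ti:5.1f} min  VB = {vbi:.3f} mm  ({status})")
    ```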

  19. Numerical modelling of tool wear in turning with cemented carbide cutting tools

    International Nuclear Information System (INIS)

    Franco, P.; Estrems, M.; Faura, F.

    2007-01-01

    A numerical model is proposed for analysing the flank and crater wear resulting from the loss of material on the cutting tool surface in turning processes due to the wear mechanisms of adhesion, abrasion and fracture. By means of this model, the material loss along the cutting tool surface can be analysed, and the worn surface shape during workpiece machining can be determined. The proposed model analyses the gradual degradation of the cutting tool during the turning operation, and tool wear can be estimated as a function of cutting time. Wear-land width (VB) and crater depth (KT) can be obtained to describe the material loss on the cutting tool surface, and the effects of the distinct wear mechanisms on surface shape can be studied. The parameters required for the tool wear model are obtained from the literature and from experimental observation for AISI 4340 steel turning with WC-Co cutting tools

  20. Web tools for predictive toxicology model building.

    Science.gov (United States)

    Jeliazkova, Nina

    2012-07-01

    The development and use of web tools in chemistry have accumulated more than 15 years of history already. Powered by advances in Internet technologies, the current generation of web systems is starting to expand into areas traditionally reserved for desktop applications. The web platforms integrate data storage, cheminformatics and data analysis tools. The ease of use and the collaborative potential of the web are compelling, despite the challenges. The topic of this review is a set of recently published web tools that facilitate predictive toxicology model building. The focus is on software platforms offering web access to chemical structure-based methods, although some of the frameworks could also provide bioinformatics or hybrid data analysis functionalities. A number of historical and current developments are cited. In order to provide a comparable assessment, the following characteristics are considered: support for workflows, descriptor calculations, visualization, modeling algorithms, data management and data sharing capabilities, availability of a GUI or programmatic access, and implementation details. The success of the Web is largely due to its highly decentralized, yet sufficiently interoperable, model for information access. The expected future convergence between cheminformatics and bioinformatics databases poses new challenges for the management and analysis of large data sets. Web tools in predictive toxicology will likely continue to evolve toward the right mix of flexibility, performance, scalability, interoperability, sets of unique features offered, friendly user interfaces, programmatic access for advanced users, platform independence, results reproducibility, curation and crowdsourcing utilities, collaborative sharing and secure access.

  1. Monte Carlo tools for Beyond the Standard Model Physics, April 14-16

    DEFF Research Database (Denmark)

    Badger, Simon; Christensen, Christian Holm; Dalsgaard, Hans Hjersing

    2011-01-01

    This workshop aims to gather together theorists and experimentalists interested in developing and using Monte Carlo tools for Beyond the Standard Model Physics in an attempt to be prepared for the analysis of data focusing on the Large Hadron Collider. Since a large number of excellent tools....... To identify promising models (or processes) for which the tools have not yet been constructed and start filling up these gaps. To propose ways to streamline the process of going from models to events, i.e. to make the process more user-friendly so that more people can get involved and perform serious collider...

  2. Maintenance Personnel Performance Simulation (MAPPS) model: a human reliability analysis tool

    International Nuclear Information System (INIS)

    Knee, H.E.

    1985-01-01

    The Maintenance Personnel Performance Simulation (MAPPS) model is a computerized, stochastic, task-oriented human behavioral model developed to provide estimates of nuclear power plant (NPP) maintenance team performance measures. It is capable of addressing person-machine, person-environment, and person-person relationships, and accounts for interdependencies that exist between the subelements that make up the maintenance task of interest. The primary measures of performance estimated by MAPPS are: (1) the probability of successfully completing the task of interest; and (2) the task duration time. MAPPS also estimates a host of other performance indices, including the probability of an undetected error, identification of the most- and least-likely error-prone subelements, and maintenance team stress profiles during task execution. The MAPPS model was subjected to a number of evaluation efforts that focused upon its practicality, acceptability, usefulness, and validity. Methods used for these efforts included a case method approach, consensus estimation, and comparison with observed task performance measures at an NPP. Favorable results, such as the close agreement between task duration times for two tasks observed in the field (67.0 and 119.8 minutes, respectively) and the corresponding MAPPS estimates (72.0 and 124.0 minutes, respectively), enhance confidence in the future use of MAPPS. 8 refs., 1 fig
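    As a rough, invented illustration of what a stochastic, task-oriented simulation of this kind computes (not the MAPPS model itself), a Monte Carlo sketch over a chain of maintenance subelements:

    ```python
    import random

    # Toy task simulation: a maintenance task as a chain of subelements, each
    # with a success probability and a noisy duration. All numbers invented.
    SUBELEMENTS = [
        ("isolate pump",   0.99, 10.0),
        ("replace seal",   0.95, 45.0),
        ("restore system", 0.98, 15.0),
    ]

    def simulate_task(rng):
        total = 0.0
        for name, p_success, mean_minutes in SUBELEMENTS:
            total += rng.gauss(mean_minutes, 0.2 * mean_minutes)
            if rng.random() > p_success:
                return False, total   # task fails at this subelement
        return True, total

    rng = random.Random(42)
    runs = [simulate_task(rng) for _ in range(10_000)]
    p_ok = sum(ok for ok, _ in runs) / len(runs)
    mean_t = sum(t for _, t in runs) / len(runs)
    print(f"P(task success) ~ {p_ok:.3f}, mean duration ~ {mean_t:.1f} min")
    ```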

  3. A Tool for Performance Modeling of Parallel Programs

    Directory of Open Access Journals (Sweden)

    J.A. González

    2003-01-01

    Full Text Available Current performance prediction analytical models try to characterize the performance behavior of actual machines through a small set of parameters. In practice, substantial deviations are observed. These differences are due to factors such as memory hierarchies and network latency. A natural approach is to associate a different proportionality constant with each basic block and, analogously, to associate different latencies and bandwidths with each "communication block". Unfortunately, using this approach implies that the evaluation of parameters must be done for each algorithm. This is a heavy task, involving experiment design, timing, statistics, pattern recognition and multi-parameter fitting algorithms, so software support is required. We present a compiler that takes as source a C program annotated with complexity formulas and produces as output an instrumented code. The trace files obtained from the execution of the resulting code are analyzed with an interactive interpreter, giving us, among other information, the values of those parameters.
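    The per-block fitting step described above reduces to a linear least-squares problem: run time is modeled as the sum over basic blocks of execution count times per-block cost. A minimal sketch with synthetic counts and timings (not the article's compiler or interpreter):

    ```python
    import numpy as np

    # Rows: timed runs; columns: basic-block execution counts per run.
    counts = np.array([
        [1000,  200,  50],
        [4000,  800, 120],
        [2000, 1600, 100],
        [8000,  400, 300],
    ])
    times = np.array([0.40, 1.44, 1.40, 2.40])  # measured wall times, seconds

    # Solve times ~ counts @ cost for the per-block cost vector.
    block_cost, *_ = np.linalg.lstsq(counts, times, rcond=None)
    print("estimated cost per block execution (s):", block_cost)
    print("predicted times:", counts @ block_cost)
    ```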

  4. Introduction to genetic algorithms as a modeling tool

    International Nuclear Information System (INIS)

    Wildberger, A.M.; Hickok, K.A.

    1990-01-01

    Genetic algorithms are search and classification techniques modeled on natural adaptive systems. This is an introduction to their use as a modeling tool with emphasis on prospects for their application in the power industry. It is intended to provide enough background information for its audience to begin to follow technical developments in genetic algorithms and to recognize those which might impact on electric power engineering. Beginning with a discussion of genetic algorithms and their origin as a model of biological adaptation, their advantages and disadvantages are described in comparison with other modeling tools such as simulation and neural networks in order to provide guidance in selecting appropriate applications. In particular, their use is described for improving expert systems from actual data and they are suggested as an aid in building mathematical models. Using the Thermal Performance Advisor as an example, it is suggested how genetic algorithms might be used to make a conventional expert system and mathematical model of a power plant adapt automatically to changes in the plant's characteristics
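    As a minimal, self-contained illustration of the selection/crossover/mutation loop such algorithms rely on (toy bit-string fitness, not a power-plant application):

    ```python
    import random

    # Minimal genetic algorithm: evolve a bit string toward all ones.
    def evolve(bits=20, pop_size=30, generations=60, p_mut=0.02, seed=0):
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
        fitness = lambda ind: sum(ind)

        for _ in range(generations):
            def pick():  # binary tournament selection
                a, b = rng.sample(pop, 2)
                return a if fitness(a) >= fitness(b) else b
            nxt = []
            while len(nxt) < pop_size:
                mom, dad = pick(), pick()
                cut = rng.randrange(1, bits)               # one-point crossover
                child = mom[:cut] + dad[cut:]
                child = [b ^ (rng.random() < p_mut) for b in child]  # mutation
                nxt.append(child)
            pop = nxt
        return max(pop, key=fitness)

    best = evolve()
    print("best individual:", best, "fitness:", sum(best))
    ```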

  5. Hybrid ABC Optimized MARS-Based Modeling of the Milling Tool Wear from Milling Run Experimental Data.

    Science.gov (United States)

    García Nieto, Paulino José; García-Gonzalo, Esperanza; Ordóñez Galán, Celestino; Bernardo Sánchez, Antonio

    2016-01-28

    Milling cutters are important cutting tools used in milling machines to perform milling operations, which are prone to wear and subsequent failure. In this paper, a practical new hybrid model to predict the milling tool wear in a regular cut, as well as entry cut and exit cut, of a milling tool is proposed. The model was based on the optimization tool termed artificial bee colony (ABC) in combination with the multivariate adaptive regression splines (MARS) technique. This optimization mechanism involved the parameter setting in the MARS training procedure, which significantly influences the regression accuracy. Therefore, an ABC-MARS-based model was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. Regression with optimal hyperparameters was performed and a determination coefficient of 0.94 was obtained. The ABC-MARS-based model's goodness of fit to experimental data confirmed the good performance of this model. This new model also allowed us to ascertain the most influential parameters on the milling tool flank wear with a view to proposing improvements to the milling machine. Finally, the conclusions of this study are presented.
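    As a compact illustration of the ABC half of such a hybrid, the loop below minimizes a generic objective standing in for the cross-validation error of a MARS model as a function of its hyperparameters; the objective, bounds, and limits are invented, not the authors' implementation.

    ```python
    import random

    def objective(x):  # invented stand-in for a CV-error surface
        return (x[0] - 3.0) ** 2 + (x[1] - 0.5) ** 2 + 1.0

    def abc_minimize(dim=2, bounds=(-5.0, 5.0), n_food=10, limit=20, iters=200):
        rng = random.Random(1)
        lo, hi = bounds
        rand_pt = lambda: [rng.uniform(lo, hi) for _ in range(dim)]
        food = [rand_pt() for _ in range(n_food)]
        trials = [0] * n_food

        def neighbor(i):
            k, j = rng.randrange(n_food), rng.randrange(dim)
            x = food[i][:]
            x[j] += rng.uniform(-1, 1) * (food[i][j] - food[k][j])
            x[j] = min(max(x[j], lo), hi)
            return x

        for _ in range(iters):
            for i in range(n_food):      # employed + onlooker phases, folded
                cand = neighbor(i)
                if objective(cand) < objective(food[i]):
                    food[i], trials[i] = cand, 0
                else:
                    trials[i] += 1
                if trials[i] > limit:    # scout phase: abandon stale source
                    food[i], trials[i] = rand_pt(), 0
        return min(food, key=objective)

    best = abc_minimize()
    print("best hyperparameters:", best, "objective:", objective(best))
    ```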

  6. Hybrid ABC Optimized MARS-Based Modeling of the Milling Tool Wear from Milling Run Experimental Data

    Directory of Open Access Journals (Sweden)

    Paulino José García Nieto

    2016-01-01

    Full Text Available Milling cutters are important cutting tools used in milling machines to perform milling operations, which are prone to wear and subsequent failure. In this paper, a practical new hybrid model to predict the milling tool wear in a regular cut, as well as entry cut and exit cut, of a milling tool is proposed. The model was based on the optimization tool termed artificial bee colony (ABC) in combination with the multivariate adaptive regression splines (MARS) technique. This optimization mechanism involved the parameter setting in the MARS training procedure, which significantly influences the regression accuracy. Therefore, an ABC-MARS-based model was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. Regression with optimal hyperparameters was performed and a determination coefficient of 0.94 was obtained. The ABC-MARS-based model's goodness of fit to experimental data confirmed the good performance of this model. This new model also allowed us to ascertain the most influential parameters on the milling tool flank wear with a view to proposing improvements to the milling machine. Finally, the conclusions of this study are presented.

  7. Hybrid ABC Optimized MARS-Based Modeling of the Milling Tool Wear from Milling Run Experimental Data

    Science.gov (United States)

    García Nieto, Paulino José; García-Gonzalo, Esperanza; Ordóñez Galán, Celestino; Bernardo Sánchez, Antonio

    2016-01-01

    Milling cutters are important cutting tools used in milling machines to perform milling operations, which are prone to wear and subsequent failure. In this paper, a practical new hybrid model to predict the milling tool wear in a regular cut, as well as entry cut and exit cut, of a milling tool is proposed. The model was based on the optimization tool termed artificial bee colony (ABC) in combination with the multivariate adaptive regression splines (MARS) technique. This optimization mechanism involved the parameter setting in the MARS training procedure, which significantly influences the regression accuracy. Therefore, an ABC-MARS-based model was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. Regression with optimal hyperparameters was performed and a determination coefficient of 0.94 was obtained. The ABC-MARS-based model's goodness of fit to experimental data confirmed the good performance of this model. This new model also allowed us to ascertain the most influential parameters on the milling tool flank wear with a view to proposing improvements to the milling machine. Finally, the conclusions of this study are presented. PMID:28787882

  8. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  9. Performance improvement tool for Thai make–to–order manufacturing

    Directory of Open Access Journals (Sweden)

    Apichat Sopadang

    2012-02-01

    Full Text Available This paper presents a framework to improve the performance of shop floor control for Thai make-to-order (MTO) small and medium enterprises (SMEs). Integrated definition for function modeling is exploited to explore activities and relations of components. In-depth interviews with experts and practitioners in the case study provided useful information. The empirical study is evaluated for its suitability for using the finalized SHEN model as a benchmark. Factor analysis is performed to extract simplified information from the variables. The data are collected from experienced respondents by using a designed questionnaire. Each observed variable is tested for validity and reliability by factor loading and Cronbach's alpha, respectively. The results show that the finalized SHEN model can be used as a performance improvement tool for Thai MTO SMEs. For example, principle 11 is tested: each observed variable has a covariance value between 0.380 and 0.873, and the reliability (Cronbach's alpha) for this factor is 0.869. Based on the scree plot, it is asserted that 5 observed variables are correctly formed in the same principle.

  10. Agent-based modeling as a tool for program design and evaluation.

    Science.gov (United States)

    Lawlor, Jennifer A; McGirr, Sara

    2017-12-01

    Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
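    For readers new to the technique, here is a tiny agent-based model of program uptake through peer influence; all parameters are invented and it illustrates only the mechanics, not any published model.

    ```python
    import random

    # Each agent reconsiders joining a program based on how many of its
    # randomly assigned peers already participate. Invented parameters.
    rng = random.Random(7)
    N, PEERS, STEPS, BASE_RATE, PEER_PULL = 200, 5, 20, 0.02, 0.08

    peers = {i: rng.sample([j for j in range(N) if j != i], PEERS) for i in range(N)}
    joined = {i: rng.random() < 0.05 for i in range(N)}  # 5% seeded participants

    for step in range(STEPS):
        for i in range(N):
            if not joined[i]:
                influence = sum(joined[p] for p in peers[i]) / PEERS
                if rng.random() < BASE_RATE + PEER_PULL * influence:
                    joined[i] = True
        print(f"step {step + 1:2d}: {sum(joined.values())} participants")
    ```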

  11. The Maintenance Personnel Performance Simulation (MAPPS) model: A human reliability analysis tool

    International Nuclear Information System (INIS)

    Knee, H.E.

    1985-01-01

    The Maintenance Personnel Performance Simulation (MAPPS) model is a computerized, stochastic, task-oriented human behavioral model developed to provide estimates of nuclear power plant (NPP) maintenance team performance measures. It is capable of addressing person-machine, person-environment, and person-person relationships, and accounts for interdependencies that exist between the subelements that make up the maintenance task of interest. The primary measures of performance estimated by MAPPS are: 1) the probability of successfully completing the task of interest and 2) the task duration time. MAPPS also estimates a host of other performance indices, including the probability of an undetected error, identification of the most- and least-likely error-prone subelements, and maintenance team stress profiles during task execution

  12. Scenario Evaluator for Electrical Resistivity survey pre-modeling tool

    Science.gov (United States)

    Terry, Neil; Day-Lewis, Frederick D.; Robinson, Judith L.; Slater, Lee D.; Halford, Keith J.; Binley, Andrew; Lane, John W.; Werkema, Dale D.

    2017-01-01

    Geophysical tools have much to offer users in environmental, water resource, and geotechnical fields; however, techniques such as electrical resistivity imaging (ERI) are often oversold and/or overinterpreted due to a lack of understanding of the limitations of the techniques, such as the appropriate depth intervals or resolution of the methods. The relationship between ERI data and resistivity is nonlinear; therefore, these limitations depend on site conditions and survey design and are best assessed through forward and inverse modeling exercises prior to field investigations. In this approach, proposed field surveys are first numerically simulated given the expected electrical properties of the site, and the resulting hypothetical data are then analyzed using inverse models. Performing ERI forward/inverse modeling, however, requires substantial expertise and can take many hours to implement. We present a new spreadsheet-based tool, the Scenario Evaluator for Electrical Resistivity (SEER), which features a graphical user interface that allows users to manipulate a resistivity model and instantly view how that model would likely be interpreted by an ERI survey. The SEER tool is intended for use by those who wish to determine the value of including ERI to achieve project goals, and is designed to have broad utility in industry, teaching, and research.

  13. An ensemble model of QSAR tools for regulatory risk assessment.

    Science.gov (United States)

    Pradeep, Prachi; Povinelli, Richard J; White, Shannon; Merrill, Stephen J

    2016-01-01

    Quantitative structure activity relationships (QSARs) are theoretical models that relate a quantitative measure of chemical structure to a physical property or a biological effect. QSAR predictions can be used for chemical risk assessment for the protection of human and environmental health, which makes them interesting to regulators, especially in the absence of experimental data. For compatibility with regulatory use, QSAR models should be transparent, reproducible and optimized to minimize the number of false negatives. In silico QSAR tools are gaining wide acceptance as a faster alternative to otherwise time-consuming clinical and animal testing methods. However, different QSAR tools often make conflicting predictions for a given chemical and may also vary in their predictive performance across different chemical datasets. In a regulatory context, conflicting predictions raise interpretation, validation and adequacy concerns. To address these concerns, ensemble learning techniques in the machine learning paradigm can be used to integrate predictions from multiple tools. By leveraging various underlying QSAR algorithms and training datasets, the resulting consensus prediction should yield better overall predictive ability. We present a novel ensemble QSAR model using Bayesian classification. The model allows for varying a cut-off parameter that enables selecting the desirable trade-off between model sensitivity and specificity. The predictive performance of the ensemble model is compared with four in silico tools (Toxtree, Lazar, OECD Toolbox, and Danish QSAR) to predict carcinogenicity for a dataset of air toxins (332 chemicals) and a subset of the gold carcinogenic potency database (480 chemicals). Leave-one-out cross validation results show that the ensemble model achieves the best trade-off between sensitivity and specificity (accuracy: 83.8% and 80.4%, and balanced accuracy: 80.6% and 80.8%) and highest inter-rater agreement [kappa (κ): 0
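    The cut-off mechanism is easy to demonstrate: combine binary predictions from several tools into a weighted consensus score and sweep the threshold. The sketch below uses a simple reliability-weighted vote on synthetic data as a stand-in for the paper's Bayesian classifier:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=500)                  # synthetic true labels
    tool_acc = np.array([0.80, 0.75, 0.70, 0.65])     # assumed tool accuracies
    # Each simulated tool agrees with the truth with probability = its accuracy.
    preds = np.array([np.where(rng.random(500) < a, y, 1 - y) for a in tool_acc])

    weights = np.log(tool_acc / (1 - tool_acc))       # log-odds style weights
    score = weights @ preds / weights.sum()           # consensus score in [0, 1]

    for cutoff in (0.3, 0.5, 0.7):
        pred = (score >= cutoff).astype(int)
        sens = (pred[y == 1] == 1).mean()
        spec = (pred[y == 0] == 0).mean()
        print(f"cutoff {cutoff:.1f}: sensitivity {sens:.2f}, specificity {spec:.2f}")
    ```

    Lowering the cut-off raises sensitivity (fewer false negatives, the regulatory priority named above) at the expense of specificity.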

  14. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  15. DsixTools: the standard model effective field theory toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Celis, Alejandro [Ludwig-Maximilians-Universitaet Muenchen, Fakultaet fuer Physik, Arnold Sommerfeld Center for Theoretical Physics, Munich (Germany); Fuentes-Martin, Javier; Vicente, Avelino [Universitat de Valencia-CSIC, Instituto de Fisica Corpuscular, Valencia (Spain); Virto, Javier [University of Bern, Albert Einstein Center for Fundamental Physics, Institute for Theoretical Physics, Bern (Switzerland)

    2017-06-15

    We present DsixTools, a Mathematica package for the handling of the dimension-six standard model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching to the ΔB = ΔS = 1, 2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED renormalization group evolution below the electroweak scale. (orig.)
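    DsixTools itself is a Mathematica package; as a language-agnostic illustration of what one-loop running means in practice, the sketch below integrates dC/d ln μ = γᵀC/(16π²) for a toy two-coefficient system. The 2×2 anomalous-dimension matrix is invented, unlike the full SMEFT matrix the package implements.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy one-loop RGE: dC/dln(mu) = gamma^T C / (16 pi^2).
    gamma = np.array([[4.0, -0.5],
                      [0.3,  2.0]])   # invented anomalous-dimension matrix

    def rge(log_mu, c):
        return gamma.T @ c / (16 * np.pi**2)

    c_high = np.array([1.0, 0.2])              # C(Lambda) at the high scale
    span = (np.log(1000.0), np.log(91.19))     # run from 1 TeV down to m_Z (GeV)
    sol = solve_ivp(rge, span, c_high)         # solve_ivp integrates backward too
    print("C(m_Z) =", sol.y[:, -1])
    ```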

  16. Challenges in Getting Building Performance Monitoring Tools for Everyday Use: User Experiences with A New Tool

    Directory of Open Access Journals (Sweden)

    Heikki Ihasalo

    2014-05-01

    Full Text Available There is a need for building performance monitoring because it is common that buildings do not perform as intended. A number of advanced tools for this purpose have been developed over the past few decades. However, these tools have not been widely adopted in real use. The new tool presented here utilizes building automation data, transforms the data into a set of performance metrics, and is capable of visualizing building performance from the perspectives of energy, indoor conditions, and the HVAC (heating, ventilation and air conditioning) system. The purpose of this paper is to study users' perceptions of the use of the tool. The research method was semi-structured interviews. Although the users were satisfied with the solution in general, it was not taken into operative use. The main challenges with the use of the solution were related to accessibility, trust, and management practices. The interviewees were struggling to manage numerous information systems and therefore had problems in finding the solution and authenticating to it. Not all the interviewees fully trusted the solution, since they did not entirely understand what the performance metrics meant or because the solution had limitations in assessing building performance. Management practices are needed to support the performance measurement philosophy.

  17. Importance Performance Analysis as a Trade Show Performance Evaluation and Benchmarking Tool

    OpenAIRE

    Tafesse, Wondwesen; Skallerud, Kåre; Korneliussen, Tor

    2010-01-01

    Author's accepted version (post-print). The purpose of this study is to introduce importance performance analysis as a trade show performance evaluation and benchmarking tool. Importance performance analysis considers exhibitors’ performance expectation and perceived performance in unison to evaluate and benchmark trade show performance. The present study uses data obtained from exhibitors of an international trade show to demonstrate how importance performance analysis can be used to eval...
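    As a compact illustration of the importance-performance analysis technique itself (invented attribute names and scores, not the study's data), the quadrant classification can be computed as follows:

    ```python
    import numpy as np

    # IPA: place each attribute on an importance vs. performance grid split at
    # the means, then read off the classic quadrant advice.
    attrs = ["booth location", "staff training", "lead capture", "catalogue"]
    importance = np.array([4.6, 4.2, 3.9, 2.8])   # exhibitor expectation (1-5)
    performance = np.array([3.1, 4.4, 4.3, 2.5])  # perceived performance (1-5)

    quadrant = {
        (True, False): "Concentrate here",
        (True, True): "Keep up the good work",
        (False, True): "Possible overkill",
        (False, False): "Low priority",
    }
    for a, i, p in zip(attrs, importance, performance):
        key = (i >= importance.mean(), p >= performance.mean())
        print(f"{a:15s} -> {quadrant[key]}")
    ```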

  18. An evaluation of BPMN modeling tools

    NARCIS (Netherlands)

    Yan, Z.; Reijers, H.A.; Dijkman, R.M.; Mendling, J.; Weidlich, M.

    2010-01-01

    Various BPMN modeling tools are available and it is close to impossible to understand their functional differences without simply trying them out. This paper presents an evaluation framework and presents the outcomes of its application to a set of five BPMN modeling tools. We report on various

  19. Performance Contracting as a Performance Management Tool in the Public Sector in Kenya: Lessons of learning

    Science.gov (United States)

    Hope, Kempe Ronald, Sr.

    2013-01-01

    The purpose of this article is to provide an assessment and analysis of public sector performance contracting as a performance management tool in Kenya. It aims to demonstrate that performance contracting remains a viable and important tool for improving public sector performance as a key element of the on-going public sector transformation…

  20. Construction Of A Performance Assessment Model For Zakat Management Institutions

    Directory of Open Access Journals (Sweden)

    Sri Fadilah

    2016-12-01

    Full Text Available The objective of this research is to examine performance evaluation using a Balanced Scorecard model. The research is motivated by the big gap between the potential of zakat (alms and religious tax in Islam), estimated at as much as 217 trillion rupiahs, and the realized collection of only about three trillion. This indicates that the performance of zakat management organizations in collecting zakat is still very low, while both the quantity and the quality of zakat management organizations have to be improved. A performance evaluation model is therefore needed as a tool to evaluate performance. The aim is to construct a performance evaluation model that can be implemented in zakat management organizations. Organizational performance evaluation with the Balanced Scorecard model will be effective if it is supported by three aspects, namely PI, BO and TQM. This research uses an explanatory method and the data analysis tool SEM/PLS. Data collection techniques are questionnaires, interviews and documentation. The result of this research shows that PI, BO and TQM, simultaneously and partially, have a significant effect on organizational performance.

  1. Proposal for a Method for Business Model Performance Assessment: Toward an Experimentation Tool for Business Model Innovation

    Directory of Open Access Journals (Sweden)

    Antonio Batocchio

    2017-04-01

    Full Text Available The representation of business models has recently become widespread, especially in the pursuit of innovation. However, defining a company's business model is sometimes limited to discussion and debate. This study observes the need for performance measurement so that business models can be data-driven. To meet this goal, the work proposes, as a hypothesis, a method that combines the practices of the Balanced Scorecard with a method of business model representation, the Business Model Canvas. Such a combination was based on a study of conceptual adaptation, resulting in an application roadmap. A case study application was performed to check the functionality of the proposition, focusing on startup organizations. It was concluded that, based on the performance assessment of the business model, it is possible to propose the search for change through experimentation, a path that can lead to business model innovation.

  2. Performance Support Tools for Space Medical Operations

    Science.gov (United States)

    Byrne, Vicky E.; Schmidt, Josef; Barshi, Immanuel

    2009-01-01

    The early Constellation space missions are expected to have medical capabilities very similar to those currently on the Space Shuttle and International Space Station (ISS). For Crew Exploration Vehicle (CEV) missions to the ISS, medical equipment will be located on the ISS and carried into the CEV in the event of an emergency. Flight Surgeons (FS) on the ground in Mission Control will be expected to direct the Crew Medical Officer (CMO) during medical situations. If there is a loss of signal and the crew is unable to communicate with the ground, a CMO would be expected to carry out medical procedures without the aid of a FS. In these situations, performance support tools can be used to reduce errors and the time needed to perform emergency medical tasks. Human factors personnel at Johnson Space Center have recently investigated medical performance support tools for CMOs on orbit and FSs on the ground. This area of research involved the feasibility of just-in-time (JIT) training techniques and concepts for real-time medical procedures. In Phase 1, preliminary feasibility data were gathered for two types of prototype display technologies: a hand-held PDA and a head-mounted display (HMD). The PDA and HMD were compared while performing a simulated medical procedure using ISS flight-like medical equipment. Based on the outcome of Phase 1, including data on user preferences, further testing was completed using the PDA only. Phase 2 explored a wrist-mounted PDA and compared it to a paper cue card. For each phase, time to complete procedures, errors, and user satisfaction were captured. Information needed by the FS during ISS mission support, especially in an emergency situation (e.g., fire onboard the ISS), may be located in many different places around the FS's console. A performance support tool prototype is being developed to address this issue by bringing all of the relevant information together in one place. The tool is designed to include procedures and other information needed by a FS

  3. System Advisor Model: Flat Plate Photovoltaic Performance Modeling Validation Report

    Energy Technology Data Exchange (ETDEWEB)

    Freeman, Janine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Whitmore, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Kaffine, Leah [National Renewable Energy Lab. (NREL), Golden, CO (United States); Blair, Nate [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron P. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    The System Advisor Model (SAM) is a free software tool that performs detailed analysis of both system performance and system financing for a variety of renewable energy technologies. This report provides detailed validation of the SAM flat plate photovoltaic performance model by comparing SAM-modeled PV system generation data to actual measured production data for nine PV systems ranging from 75 kW to greater than 25 MW in size. The results show strong agreement between SAM predictions and field data, with annualized prediction error below 3% for all fixed-tilt cases and below 8% for all one-axis tracked cases. The analysis concludes that snow cover and system outages are the primary sources of disagreement; other deviations resulting from seasonal biases in the irradiation models and one-axis tracking issues are discussed in detail.
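    The headline metric here, annualized prediction error, is simply the relative difference between modeled and measured annual energy (the report's exact definition may differ slightly; the numbers below are made up):

    ```python
    # One-line check with invented numbers.
    modeled_kwh = 152_300.0    # model-predicted annual generation
    measured_kwh = 148_900.0   # metered annual generation

    error_pct = 100.0 * (modeled_kwh - measured_kwh) / measured_kwh
    print(f"annualized prediction error: {error_pct:+.1f}%")   # +2.3%
    ```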

  4. Modeling and Tool Wear in Routing of CFRP

    International Nuclear Information System (INIS)

    Iliescu, D.; Fernandez, A.; Gutierrez-Orrantia, M. E.; Lopez de Lacalle, L. N.; Girot, F.

    2011-01-01

    This paper presents the prediction and evaluation of feed force in the routing of carbon composite material. In order to extend tool life and improve the quality of the machined surface, a better understanding of uncoated and coated tool behavior is required. This work describes (1) the optimization of the geometry of multiple-teeth tools, minimizing tool wear and feed force, (2) the optimization of the tool coating, and (3) the development of a phenomenological model relating the feed force to the routing parameters and the tool wear. The experimental results indicate that the feed rate, the cutting speed and the tool wear are the most significant factors affecting the feed force. In the case of multiple-teeth tools, a particular geometry with 14 teeth right helix right cut and 11 teeth left helix right cut gives the best results. A thick AlTiN coating or a diamond coating can dramatically improve tool life while minimizing the axial force, roughness and delamination. A wear model has then been developed based on the abrasive behavior of the tool. The model links the feed force to the tool geometry parameters (tool diameter), to the process parameters (feed rate, cutting speed and depth of cut) and to the wear. The model presented has been verified by experimental tests.
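    A phenomenological force model of this kind is commonly fitted as a power law by log-linear least squares. The sketch below does exactly that on synthetic data; the functional form and coefficients are illustrative, not those of the cited model.

    ```python
    import numpy as np

    # Assume F = k * f^a * v^b * (1 + VB)^c and fit (k, a, b, c) by
    # least squares in log space. Synthetic data only.
    rng = np.random.default_rng(3)
    f = rng.uniform(0.02, 0.20, 40)        # feed, mm/tooth
    v = rng.uniform(100, 400, 40)          # cutting speed, m/min
    vb = rng.uniform(0.0, 0.3, 40)         # flank wear, mm
    F = 35.0 * f**0.8 * v**0.15 * (1 + vb)**1.6 * rng.lognormal(0, 0.03, 40)

    X = np.column_stack([np.ones_like(f), np.log(f), np.log(v), np.log1p(vb)])
    coef, *_ = np.linalg.lstsq(X, np.log(F), rcond=None)
    k, a, b, c = np.exp(coef[0]), *coef[1:]
    print(f"F ~ {k:.1f} * f^{a:.2f} * v^{b:.2f} * (1+VB)^{c:.2f}")
    ```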

  5. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged
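    A client for such a JSON-over-REST API is a few lines of Python. Note that the host and route below are placeholders invented for illustration; the real WMT endpoints are documented by CSDMS.

    ```python
    import requests

    # Hypothetical WMT-style API call; BASE and the /components route are
    # placeholders, not the real CSDMS endpoints.
    BASE = "https://csdms.example.edu/wmt-api"

    resp = requests.get(f"{BASE}/components", timeout=10)
    resp.raise_for_status()
    for component in resp.json():   # JSON-encoded component metadata
        print(component.get("id"), "-", component.get("summary"))
    ```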

  6. A design and performance analysis tool for superconducting RF systems

    International Nuclear Information System (INIS)

    Schilcher, T.; Simrock, S.N.; Merminga, L.; Wang, D.X.

    1997-01-01

    Superconducting rf systems are usually operated with continuous rf power or with rf pulse lengths exceeding 1 ms to maximize the overall wall plug power efficiency. Typical examples are CEBAF at the Thomas Jefferson National Accelerator Facility (Jefferson Lab) and the TESLA Test Facility at DESY. The long pulses allow for effective application of feedback to stabilize the accelerating field in the presence of microphonics, Lorentz force detuning, and fluctuations of the beam current. In this paper the authors describe a set of tools to be used with MATLAB and SIMULINK, which allow one to analyze the quality of field regulation for a given design. The tools include models for the cavities, the rf power source, the beam, sources of field perturbations, and the rf feedback system. The electrical and mechanical characteristics of the cavity relevant to rf control are described in the form of time-varying state-space models. The power source is modeled as a current generator and includes saturation characteristics and noise. An arbitrary time structure can be imposed on the beam current to reflect a macro-pulse structure and bunch charge fluctuations. For rf feedback several schemes can be selected: traditional amplitude and phase control as well as I/Q control. The choices for the feedback controller include analog or digital approaches and various choices of frequency response. Feed-forward can be added to further suppress repetitive errors. The results of a performance analysis of the CEBAF and the TESLA linac rf systems using these tools are presented
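    At baseband, the cavity part of such a tool set reduces to a first-order complex state-space equation for the field envelope. A toy forward-Euler integration (illustrative parameter values, not CEBAF or TESLA numbers):

    ```python
    import numpy as np

    # Toy cavity-envelope model: dV/dt = (-w_half + i*dw) V + w_half*R_L*I.
    w_half = 2 * np.pi * 215.0     # half-bandwidth, rad/s
    dw = 2 * np.pi * 50.0          # detuning (e.g., microphonics), rad/s
    R_L = 1.0                      # loaded impedance, normalized
    I_gen, I_beam = 1.0, -0.3      # generator drive and beam loading, normalized

    dt, steps = 1e-5, 3000
    V = 0.0 + 0.0j
    for _ in range(steps):
        dV = (-w_half + 1j * dw) * V + w_half * R_L * (I_gen + I_beam)
        V += dt * dV

    print(f"steady-state |V| = {abs(V):.3f}, "
          f"phase = {np.degrees(np.angle(V)):.1f} deg")
    ```

    Detuning shows up as a steady-state phase error, which is exactly what the feedback and feed-forward schemes described above are designed to suppress.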

  7. Performance Assessment as a Diagnostic Tool for Science Teachers

    Science.gov (United States)

    Kruit, Patricia; Oostdam, Ron; van den Berg, Ed; Schuitema, Jaap

    2018-04-01

    Information on students' development of science skills is essential for teachers to evaluate and improve their own education, as well as to provide adequate support and feedback to the learning process of individual students. The present study explores and discusses the use of performance assessments as a diagnostic tool for formative assessment to inform teachers and guide instruction of science skills in primary education. Three performance assessments were administered to more than 400 students in grades 5 and 6 of primary education. Students performed small experiments using real materials while following the different steps of the empirical cycle. The mutual relationship between the three performance assessments is examined to provide evidence for the value of performance assessments as useful tools for formative evaluation. Differences in response patterns are discussed, and the diagnostic value of performance assessments is illustrated with examples of individual student performances. Findings show that the performance assessments were difficult for grades 5 and 6 students but that much individual variation exists regarding the different steps of the empirical cycle. Evaluation of scores as well as a more substantive analysis of students' responses provided insight into typical errors that students make. It is concluded that performance assessments can be used as a diagnostic tool for monitoring students' skill performance as well as to support teachers in evaluating and improving their science lessons.

  8. A Valid and Reliable Tool to Assess Nursing Students` Clinical Performance

    OpenAIRE

    Mehrnoosh Pazargadi; Tahereh Ashktorab; Sharareh Khosravi; Hamid Alavi majd

    2013-01-01

    Background: The necessity of a valid and reliable assessment tool is one of the most frequently raised issues in nursing students' clinical evaluation. But it is believed that present tools are mostly not valid and cannot assess students' performance properly. Objectives: This study was conducted to design a valid and reliable assessment tool for evaluating nursing students' performance in clinical education. Methods: In this methodological study considering nursing students' performance definition; th...

  9. Bayesian networks modeling for thermal error of numerical control machine tools

    Institute of Scientific and Technical Information of China (English)

    Xin-hua YAO; Jian-zhong FU; Zi-chen CHEN

    2008-01-01

    The interaction between the heat source location, its intensity, the thermal expansion coefficient, the machine system configuration and the running environment creates complex thermal behavior of a machine tool, and also makes thermal error prediction difficult. To address this issue, a novel prediction method for machine tool thermal error based on Bayesian networks (BNs) was presented. The method describes the causal relationships of the factors inducing thermal deformation by graph theory and estimates the thermal error by Bayesian statistical techniques. Due to the effective combination of domain knowledge and sampled data, the BN method can adapt to changes in the running state of the machine and obtain satisfactory prediction accuracy. Experiments on spindle thermal deformation were conducted to evaluate the modeling performance. Experimental results indicate that the BN method performs far better than least squares (LS) analysis in terms of model estimation accuracy.
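    A minimal discrete Bayesian network in the same spirit, with an invented structure and invented probabilities; it uses the third-party pgmpy package, which is an assumption, not the authors' implementation.

    ```python
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Toy causal chain: spindle speed -> spindle temperature -> thermal error.
    model = BayesianNetwork([("Speed", "Temp"), ("Temp", "Error")])

    cpd_speed = TabularCPD("Speed", 2, [[0.5], [0.5]])          # low / high
    cpd_temp = TabularCPD("Temp", 2,
                          [[0.8, 0.3],    # P(Temp=low | Speed)
                           [0.2, 0.7]],   # P(Temp=high | Speed)
                          evidence=["Speed"], evidence_card=[2])
    cpd_err = TabularCPD("Error", 2,
                         [[0.9, 0.4],     # P(Error=small | Temp)
                          [0.1, 0.6]],    # P(Error=large | Temp)
                         evidence=["Temp"], evidence_card=[2])
    model.add_cpds(cpd_speed, cpd_temp, cpd_err)

    infer = VariableElimination(model)
    print(infer.query(["Error"], evidence={"Speed": 1}))  # risk at high speed
    ```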

  10. Flexible global ocean-atmosphere-land system model. A modeling tool for the climate change research community

    International Nuclear Information System (INIS)

    Zhou, Tianjun; Yu, Yongqiang; Liu, Yimin; Wang, Bin

    2014-01-01

    This is the first book available on systematic evaluations of the performance of the global climate model FGOALS. It covers the whole field, ranging from the development to the applications of this climate system model, provides an outlook for the future development of the FGOALS model system, and offers a brief introduction to running FGOALS. Coupled climate system models are of central importance for climate studies. A new model known as FGOALS (the Flexible Global Ocean-Atmosphere-Land System model) has been developed by the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics, Institute of Atmospheric Physics, Chinese Academy of Sciences (LASG/IAP, CAS), a first-tier national geophysical laboratory. It serves as a powerful tool, both for deepening our understanding of fundamental mechanisms of the climate system and for making decadal predictions and scenario projections of future climate change. "Flexible Global Ocean-Atmosphere-Land System Model: A Modeling Tool for the Climate Change Research Community" is comprehensive in scope, covering both developmental and application-oriented aspects of this climate system model, and represents a valuable reference work for researchers and professionals working within the related areas of climate variability and change.

  11. Tools for Measuring and Improving Performance.

    Science.gov (United States)

    Jurow, Susan

    1993-01-01

    Explains the need for meaningful performance measures in libraries and the Total Quality Management (TQM) approach to data collection. Five tools representing different stages of a TQM inquiry are covered (i.e., the Shewhart Cycle, flowcharts, cause-and-effect diagrams, Pareto charts, and control charts), and benchmarking is addressed. (Contains…

  12. A Design Tool for Matching UAV Propeller and Power Plant Performance

    Science.gov (United States)

    Mangio, Arion L.

    A large body of knowledge is available for matching propellers to engines for large propeller-driven aircraft. Small UAVs and model airplanes operate at much lower Reynolds numbers and use fixed-pitch propellers, so the information for large aircraft is not directly applicable. A design tool is needed that takes into account Reynolds number effects and allows for gear reduction and the selection of a propeller optimized for the airframe. The tool developed in this thesis does this using propeller performance data generated from vortex theory or wind tunnel experiments and combines that data with an engine power curve. The thrust, steady-state power, RPM, and tip Mach number vs. velocity curves are generated. The Reynolds number vs. non-dimensional radial station at an operating point is also found. The tool is then used to design a geared power plant for the SAE Aero Design competition. To measure the power plant performance, a purpose-built engine test stand was constructed. The characteristics of the engine test stand are also presented. The engine test stand was then used to characterize the geared power plant. The power plant uses a 26x16 propeller, a 100/13 gear ratio, and an LRP 0.30 cubic inch engine turning at 28,000 RPM and producing 2.2 HP. Lastly, the measured power plant performance is presented. An important result is that 17 lbf of static thrust is produced.
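
    The matching calculation the thesis automates can be illustrated as a power-balance root find: at a given flight speed the propeller settles at the rotation rate where absorbed power equals the power delivered through the gearbox. The coefficient fits and the engine curve below are invented placeholders, not the measured LRP data.

```python
import numpy as np
from scipy.optimize import brentq

RHO = 1.225          # air density, kg/m^3
D = 0.66             # propeller diameter, m (approx. 26 in), illustrative
GEAR = 100 / 13      # gear reduction, engine rev / prop rev

# Hypothetical linear fits of thrust/power coefficients vs advance ratio J = V/(n D).
def ct(J): return 0.12 - 0.10 * J     # thrust coefficient
def cp(J): return 0.05 - 0.02 * J     # power coefficient

def engine_power(n_eng):
    """Illustrative engine power curve (W) vs engine rev/s, peaking near 28,000 RPM."""
    rpm = n_eng * 60.0
    return max(0.0, 1640.0 * (1.0 - ((rpm - 28000.0) / 28000.0) ** 2))

def power_balance(n_prop, V):
    """Propeller absorbed power minus engine power delivered at the prop shaft."""
    J = V / (n_prop * D)
    p_prop = cp(J) * RHO * n_prop**3 * D**5
    return p_prop - engine_power(n_prop * GEAR)

def operating_point(V):
    n = brentq(power_balance, 5.0, 80.0, args=(V,))   # prop rev/s at equilibrium
    J = V / (n * D)
    thrust = ct(J) * RHO * n**2 * D**4
    return n * 60.0, thrust

for V in (5.0, 10.0, 15.0):          # flight speeds, m/s
    rpm, T = operating_point(V)
    print(f"V={V:4.1f} m/s  prop RPM={rpm:7.0f}  thrust={T:5.1f} N")
```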

  13. Performance evaluation of paper embossing tools produced by fused deposition modelling additive manufacturing technology

    Directory of Open Access Journals (Sweden)

    Gordana Delić

    2017-12-01

    From its beginnings until a few years ago, additive manufacturing technology was able to produce only models or prototypes of limited use, because of the mechanical properties of the materials. With the advancement and invention of new materials, this is changing. It is now possible to create 3D prints that can be used as final products or functional tools, using technology and materials with low environmental impact. The goal of this study was to examine opportunities for the production of paper embossing tools by fused deposition modelling (FDM) 3D printing. This study emphasises the use of environmentally friendly poly-lactic acid (PLA) materials in FDM technology, contrary to the conventional method using metal alloys and acids. Embossing of line elements and letters using 3D printed embossing tools was done on six different types of paper. Embossing force was applied using a SHIMADZU EZ-LX Compact Tabletop Testing Machine. Each type of paper was repeatedly embossed using different values of embossing force (in 250 N increments, starting at 1000 N) to determine the optimal embossing force for each specific paper type. Once determined, the optimal embossing force was used on ten samples of each paper type. Results of embossing were analysed and evaluated. The analysis consisted of investigating the effects of the applied embossing force and characteristics such as paper basis weight, paper structure, surface characteristics and fibre direction of the paper. Results show that paper characteristics determine the embossing force required for achieving a good embossing result. This means that with the right amount of embossing force, letters and borderlines can be equally well formed by the embossing process regardless of paper weight, surface characteristics, etc. Embossing tools produced in this manner can be used for embossing elements that are not complex. The reason for this is the limitation of FDM technology and lack of precision needed for fine

  14. NASCENT: an automatic protein interaction network generation tool for non-model organisms.

    Science.gov (United States)

    Banky, Daniel; Ordog, Rafael; Grolmusz, Vince

    2009-04-24

    Large quantities of reliable protein interaction data are available for model organisms in public repositories (e.g., MINT, DIP, HPRD, INTERACT). Most data correspond to experiments with the proteins of Saccharomyces cerevisiae, Drosophila melanogaster, Homo sapiens, Caenorhabditis elegans, Escherichia coli and Mus musculus. For other important organisms the data availability is poor or non-existent. Here we present NASCENT, a completely automatic web-based tool, also available as a downloadable Java program, capable of modeling and generating protein interaction networks even for non-model organisms. The tool performs protein interaction network modeling through gene-name mapping, and outputs the resulting network in graphical form and also in computer-readable graph formats directly applicable by popular network modeling software. http://nascent.pitgroup.org.
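
    The gene-name mapping at the heart of NASCENT can be sketched as an edge-transfer operation; the interaction list and mapping table below are toy placeholders.

```python
# Toy interaction list from a model organism (gene-name pairs).
model_interactions = [("CDC28", "CLN2"), ("CDC28", "CLB5"), ("SIC1", "CDC28")]

# Hypothetical gene-name mapping from the model organism to the target organism;
# CLB5 is deliberately left unmapped.
name_map = {"CDC28": "tgt_cdk1", "CLN2": "tgt_cycA", "SIC1": "tgt_cki1"}

def transfer_network(interactions, mapping):
    """Keep an edge only if both endpoints map to a gene in the target organism."""
    edges = set()
    for a, b in interactions:
        if a in mapping and b in mapping:
            edges.add(tuple(sorted((mapping[a], mapping[b]))))
    return sorted(edges)

print(transfer_network(model_interactions, name_map))
# [('tgt_cdk1', 'tgt_cki1'), ('tgt_cdk1', 'tgt_cycA')]
```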

  15. Modeling the milling tool wear by using an evolutionary SVM-based model from milling runs experimental data

    Science.gov (United States)

    Nieto, Paulino José García; García-Gonzalo, Esperanza; Vilán, José Antonio Vilán; Robleda, Abraham Segade

    2015-12-01

    The main aim of this research work is to build a new practical hybrid regression model to predict milling tool wear in a regular cut, as well as in the entry and exit cuts, of a milling tool. The model was based on Particle Swarm Optimization (PSO) in combination with support vector machines (SVMs). This optimization mechanism involved kernel parameter setting in the SVM training procedure, which significantly influences the regression accuracy. Bearing this in mind, a PSO-SVM-based model, grounded in statistical learning theory, was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. The experimental dataset comprises runs on a milling machine under various operating conditions. Data sampled by three different types of sensors (acoustic emission sensor, vibration sensor and current sensor) were acquired at several positions. A second aim is to determine the factors with the greatest bearing on milling tool flank wear, with a view to proposing improvements to the milling machine. Firstly, this hybrid PSO-SVM-based regression model captures the main insight of statistical learning theory in order to obtain a good prediction of the dependence between the flank wear (output variable) and the input variables (time, depth of cut, feed, etc.). Indeed, regression with optimal hyperparameters was performed and a determination coefficient of 0.95 was obtained. The agreement of this model with experimental data confirmed its good performance. Secondly, the main advantages of this PSO-SVM-based model are its capacity to produce a simple, easy-to-interpret model, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, the main conclusions of this study are presented.
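
    As a minimal sketch of the PSO-SVM idea (not the authors' implementation), the following tunes the C and gamma of an RBF SVR with a small particle swarm against cross-validated error on synthetic stand-in data.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the milling dataset: inputs (time, depth of cut, feed),
# output flank wear. Not the paper's data.
X = rng.uniform(0, 1, size=(120, 3))
y = 0.3 * X[:, 0] + 0.2 * X[:, 1] ** 2 + 0.1 * np.sin(6 * X[:, 2]) + rng.normal(0, 0.02, 120)

def fitness(params):
    """Negative cross-validated MSE of an RBF SVR; the swarm maximises this."""
    C, gamma = params
    model = SVR(kernel="rbf", C=C, gamma=gamma)
    return cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

# Minimal particle swarm over log10(C) in [-2, 3] and log10(gamma) in [-3, 2].
n_particles, n_iters = 12, 25
lo, hi = np.array([-2.0, -3.0]), np.array([3.0, 2.0])
pos = rng.uniform(lo, hi, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(10.0 ** p) for p in pos])
gbest = pbest[pbest_val.argmax()].copy()

for _ in range(n_iters):
    r1, r2 = rng.uniform(size=(2, n_particles, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([fitness(10.0 ** p) for p in pos])
    improved = vals > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmax()].copy()

print("best C=%.3g gamma=%.3g CV-MSE=%.4f" % (*10.0 ** gbest, -pbest_val.max()))
```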

  16. Hybrid ABC Optimized MARS-Based Modeling of the Milling Tool Wear from Milling Run Experimental Data

    OpenAIRE

    García Nieto, Paulino José; García-Gonzalo, Esperanza; Ordóñez Galán, Celestino; Bernardo Sánchez, Antonio

    2016-01-01

    Milling cutters are important cutting tools used in milling machines to perform milling operations, which are prone to wear and subsequent failure. In this paper, a practical new hybrid model to predict the milling tool wear in a regular cut, as well as entry cut and exit cut, of a milling tool is proposed. The model was based on the optimization tool termed artificial bee colony (ABC) in combination with multivariate adaptive regression splines (MARS) technique. This optimization mechanism i...

  17. Comparison of two different modelling tools

    DEFF Research Database (Denmark)

    Brix, Wiebke; Elmegaard, Brian

    2009-01-01

    In this paper a test case is solved using two different modelling tools, Engineering Equation Solver (EES) and WinDali, in order to compare the tools. The system of equations solved is a static model of an evaporator used for refrigeration. The evaporator consists of two parallel channels…, and it is investigated how a non-uniform airflow influences the refrigerant mass flow rate distribution and the total cooling capacity of the heat exchanger. It is shown that the cooling capacity decreases significantly with increasing maldistribution of the airflow. Comparing the two simulation tools it is found…

  18. Machining of high performance workpiece materials with CBN coated cutting tools

    International Nuclear Information System (INIS)

    Uhlmann, E.; Fuentes, J.A. Oyanedel; Keunecke, M.

    2009-01-01

    The machining of high-performance workpiece materials requires significantly harder cutting materials. In hard machining, early tool wear occurs due to high process forces and temperatures. The hardest known material is diamond, but steel materials cannot be machined with diamond tools because of the reactivity of iron with carbon. Cubic boron nitride (cBN) is the second hardest of all known materials. PcBN indexable inserts are available only in geometrically simple forms, and their production requires several work procedures and is cost-intensive. The development of a cBN coating for cutting tools combines the advantages of a thin film system with those of cBN, since cemented carbide tools of flexible geometry can be coated. The cBN films, with a thickness of up to 2 μm on cemented carbide substrates, show excellent mechanical and physical properties. This paper describes the results of the machining of various workpiece materials in turning and milling operations with regard to tool life, resultant cutting force components and workpiece surface roughness. In turning tests of Inconel 718 and milling tests of chrome steel, the high potential of cBN coatings for dry machining was proven. The results of the experiments were compared with commonly used tool coatings for hard machining. Additionally, the wear mechanisms of adhesion, abrasion, surface fatigue and tribo-oxidation were investigated in model wear experiments.

  19. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    Directory of Open Access Journals (Sweden)

    Wilson David P

    2008-02-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT, as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel, but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling, and their usefulness in this context is demonstrated.
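
    Two of the core SaSAT tasks, Latin hypercube sampling and partial rank correlation coefficients, can be sketched with SciPy and NumPy; the parameter ranges and the stand-in model below are invented for illustration.

```python
import numpy as np
from scipy.stats import qmc, rankdata

rng = np.random.default_rng(1)

# Latin hypercube sample of 3 parameters, scaled to illustrative ranges.
sampler = qmc.LatinHypercube(d=3, seed=1)
unit = sampler.random(n=200)
X = qmc.scale(unit, l_bounds=[0.1, 0.0, 1.0], u_bounds=[0.5, 2.0, 5.0])

# Stand-in model output (e.g., epidemic peak size as a function of the parameters).
y = X[:, 0] * X[:, 2] / (1.0 + X[:, 1]) + rng.normal(0, 0.01, 200)

def prcc(X, y):
    """Partial rank correlation of each parameter with the output."""
    R = np.column_stack([rankdata(c) for c in X.T])
    ry = rankdata(y)
    out = []
    for j in range(R.shape[1]):
        # Regress out the other (ranked) parameters, then correlate residuals.
        others = np.column_stack([np.ones(len(ry)), np.delete(R, j, axis=1)])
        res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
        res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
        out.append(np.corrcoef(res_x, res_y)[0, 1])
    return np.array(out)

print(np.round(prcc(X, y), 3))   # sensitivity ranking of the three parameters
```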

  20. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel material with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning with cubic boron nitride tools have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces from oblique cutting theory, the extended Lee and Shaffer force model, chip formation and progressive flank wear are depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be selected according to user requirements in hard turning.
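
    Usui's wear model referenced above is commonly written as dW/dt = A * sigma * v_s * exp(-B/theta), with sigma the normal stress on the tool face, v_s the chip sliding velocity and theta the absolute interface temperature; a minimal evaluation with placeholder constants (not calibrated cBN values) follows.

```python
import numpy as np

def usui_wear_rate(sigma_n, v_s, theta, A=7.8e-9, B=5.3e3):
    """Usui wear rate dW/dt = A * sigma_n * v_s * exp(-B / theta).

    sigma_n : normal stress on the tool face (Pa)
    v_s     : chip sliding velocity (m/s)
    theta   : absolute interface temperature (K)
    A, B    : material constants; the defaults here are placeholders,
              not calibrated cBN/steel values.
    """
    return A * sigma_n * v_s * np.exp(-B / theta)

# Wear rate over a range of interface temperatures at fixed stress and speed.
for theta in (900.0, 1100.0, 1300.0):
    print(theta, usui_wear_rate(sigma_n=1.2e9, v_s=2.5, theta=theta))
```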

  1. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 2 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  2. High Performance Multiphase Combustion Tool Using Level Set-Based Primary Atomization Coupled with Flamelet Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovative methodologies proposed in this STTR Phase 1 project will enhance Loci-STREAM which is a high performance, high fidelity simulation tool already being...

  3. Evaluating the Usability of a Professional Modeling Tool Repurposed for Middle School Learning

    Science.gov (United States)

    Peters, Vanessa L.; Songer, Nancy Butler

    2013-10-01

    This paper reports the results of a three-stage usability test of a modeling tool designed to support learners' deep understanding of the impacts of climate change on ecosystems. The design process involved repurposing an existing modeling technology used by professional scientists into a learning tool specifically designed for middle school students. To evaluate usability, we analyzed students' task performance and task completion time as they worked on an activity with the repurposed modeling technology. In stage 1, we conducted remote testing of an early modeling prototype with urban middle school students (n = 84). In stages 2 and 3, we used screencasting software to record students' mouse and keyboard movements during collaborative think-alouds (n = 22) and conducted a qualitative analysis of their peer discussions. Taken together, the study findings revealed two kinds of usability issues that interfered with students' productive use of the tool: issues related to the use of data and information, and issues related to the use of the modeling technology. The study findings resulted in design improvements that led to stronger usability outcomes and higher task performance among students. In this paper, we describe our methods for usability testing, our research findings, and our design solutions for supporting students' use of the modeling technology and use of data. The paper concludes with implications for the design and study of modeling technologies for science learning.

  4. Performance indicators: A tool for continuous quality improvement

    Directory of Open Access Journals (Sweden)

    Nidhi M Bhatnagar

    2016-01-01

    Background: Performance monitoring is an important tool which can be used for setting priorities for process improvement. At our centre, we have been monitoring every step in the processes, right from the inventory of consumables (both critical and routine) to the number of donors reactive for TTI. We conducted a study to measure the impact of monitoring performance indicators and how they could be used as a tool for Continuous Quality Improvement (CQI). Materials and Methods: The present study was a retrospective study in which the performance indicator (PI) data of the blood bank were analyzed over four years. For certain parameters, benchmarks or thresholds were set that represented warning limits or action limits. The yearly data were collated from monthly data. "Shifts" or "Trends", if any, were identified and Corrective and Preventive Action (CAPA) taken accordingly. At the end, outcomes of the analysis were charted. Results: After the yearly data evaluation, the outcomes obtained were used to plan, correct and amend processes and systems in the blood center. It was observed that the workload of the center showed an upward trend. This helped us to plan for the purchase of consumables and the management of manpower. The monitoring of usage and discard of blood helped in the efficient management of blood stocks. The need for any new equipment could also be judged from the trends in workload. Conclusion: Performance indicators are indispensable tools which the various stakeholders in blood transfusion centres should implement to improve quality performance.

  5. Performance indicators: A tool for continuous quality improvement.

    Science.gov (United States)

    Bhatnagar, Nidhi M; Soni, Shital; Gajjar, Maitrey; Shah, Mamta; Shah, Sangita; Patel, Vaidehi

    2016-01-01

    Performance monitoring is an important tool which can be used for setting priorities for process improvement. At our centre, we have been monitoring every step in the processes, right from the inventory of consumables (both critical and routine) to the number of donors reactive for TTI. We conducted a study to measure the impact of monitoring performance indicators and how they could be used as a tool for Continuous Quality Improvement (CQI). The present study was a retrospective study in which the performance indicator (PI) data of the blood bank were analyzed over four years. For certain parameters, benchmarks or thresholds were set that represented warning limits or action limits. The yearly data were collated from monthly data. Shifts or trends, if any, were identified and Corrective and Preventive Action (CAPA) taken accordingly. At the end, outcomes of the analysis were charted. After the yearly data evaluation, the outcomes obtained were used to plan, correct and amend processes and systems in the blood center. It was observed that the workload of the center showed an upward trend. This helped us to plan for the purchase of consumables and the management of manpower. The monitoring of usage and discard of blood helped in the efficient management of blood stocks. The need for any new equipment could also be judged from the trends in workload. Performance indicators are indispensable tools which the various stakeholders in blood transfusion centres should implement to improve quality performance.

  6. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    Energy Technology Data Exchange (ETDEWEB)

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points to a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.

  7. Transparent Model Transformation: Turning Your Favourite Model Editor into a Transformation Tool

    DEFF Research Database (Denmark)

    Acretoaie, Vlad; Störrle, Harald; Strüber, Daniel

    2015-01-01

    Current model transformation languages are supported by dedicated editors, often closely coupled to a single execution engine. We introduce Transparent Model Transformation, a paradigm enabling modelers to specify transformations using a familiar tool: their model editor. We also present VMTL, th… model transformation tool sharing the model editor's benefits, transparently…

  8. Processes, Performance Drivers and ICT Tools in Human Resources Management

    OpenAIRE

    Oškrdal Václav; Pavlíček Antonín; Jelínková Petra

    2011-01-01

    This article presents an insight to processes, performance drivers and ICT tools in human resources (HR) management area. On the basis of a modern approach to HR management, a set of business processes that are handled by today’s HR managers is defined. Consequently, the concept of ICT-supported performance drivers and their relevance in the area of HR management as well as the relationship between HR business processes, performance drivers and ICT tools are defined. The theoretical outcomes ...

  9. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  10. Flexible global ocean-atmosphere-land system model. A modeling tool for the climate change research community

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Tianjun; Yu, Yongqiang; Liu, Yimin; Wang, Bin (eds.) [Chinese Academy of Sciences, Beijing, (China). Inst. of Atmospheric Physics

    2014-04-01

    First book available on systematic evaluations of the performance of the global climate model FGOALS. Covers the whole field, ranging from the development to the applications of this climate system model. Provides an outlook for the future development of the FGOALS model system. Offers a brief introduction to how to run FGOALS. Coupled climate system models are of central importance for climate studies. A new model known as FGOALS (the Flexible Global Ocean-Atmosphere-Land System model) has been developed by the State Key Laboratory of Numerical Modeling for Atmospheric Sciences and Geophysical Fluid Dynamics, Institute of Atmospheric Physics, Chinese Academy of Sciences (LASG/IAP, CAS), a first-tier national geophysical laboratory. It serves as a powerful tool, both for deepening our understanding of fundamental mechanisms of the climate system and for making decadal predictions and scenario projections of future climate change. "Flexible Global Ocean-Atmosphere-Land System Model: A Modeling Tool for the Climate Change Research Community" is the first book to offer systematic evaluations of this model's performance. It is comprehensive in scope, covering both developmental and application-oriented aspects of this climate system model. It also provides an outlook on the future development of FGOALS and offers an overview of how to employ the model. It represents a valuable reference work for researchers and professionals working within the related areas of climate variability and change.

  11. The impact of process variables and wear characteristics on the cutting tool performance using Finite Element Analysis

    OpenAIRE

    Jiang, Xiaoheng

    2016-01-01

    The frequent failure of cutting tools in the cutting process may cause a huge loss of money and time, especially for hard-to-machine materials such as titanium alloys. This study therefore focuses on the impact of wear characteristics and process variables on the cutting tool, which has been ignored by most researchers. A thermo-mechanical finite element model of orthogonal metal cutting with segmented chip formation is presented. This model can be used to predict the process performance in the ...

  12. Effect of cutting edge preparation on tool performance in hard-turning of DF-3 tool steel with ceramic tools

    Energy Technology Data Exchange (ETDEWEB)

    Davoudinejad, A.; Noordin, M. Y. [Universiti Teknologi Malaysia, Skudai (Malaysia)

    2014-11-15

    This study presents an experimental investigation of turning hardened DF-3 tool steel (∼58 HRC) with PVD-TiN coated mixed ceramic. We focused on the effect of chamfered and honed edge geometry on tool wear, tool life, cutting forces and surface finish of the machined workpiece. The effects of the process parameters on performance characteristics were investigated using ANOVA. It was found that longer tool life was recorded with chamfered edge geometry at various cutting conditions. The typical damage observed was flank and crater wear for the ceramic tools, and abrasive wear was found to be the main mechanism. The optimal cutting speed was 155 m/min, at which a tolerable tool life and volume of material removal were obtained for both edge geometries. A finer machined surface was left by the chamfered tool with feeds and speeds in the ranges of 0.125-0.05 mm/rev and 155-210 m/min, respectively; also, cutting forces decreased with increased cutting speed. The cutting force results show that tool wear has a considerable effect on cutting forces, with greater force values recorded with honed tools.

  13. Effect of cutting edge preparation on tool performance in hard-turning of DF-3 tool steel with ceramic tools

    International Nuclear Information System (INIS)

    Davoudinejad, A.; Noordin, M. Y.

    2014-01-01

    This study presents an experimental investigation of turning hardened DF-3 tool steel (∼58 HRC) with PVD-TiN coated mixed ceramic. We focused on the effect of chamfered and honed edge geometry on tool wear, tool life, cutting forces and surface finish of the machined workpiece. The effects of the process parameters on performance characteristics were investigated using ANOVA. It was found that longer tool life was recorded with chamfered edge geometry at various cutting conditions. The typical damage observed was flank and crater wear for the ceramic tools, and abrasive wear was found to be the main mechanism. The optimal cutting speed was 155 m/min, at which a tolerable tool life and volume of material removal were obtained for both edge geometries. A finer machined surface was left by the chamfered tool with feeds and speeds in the ranges of 0.125-0.05 mm/rev and 155-210 m/min, respectively; also, cutting forces decreased with increased cutting speed. The cutting force results show that tool wear has a considerable effect on cutting forces, with greater force values recorded with honed tools.

  14. Continued development of modeling tools and theory for RF heating

    International Nuclear Information System (INIS)

    1998-01-01

    Mission Research Corporation (MRC) is pleased to present the Department of Energy (DOE) with its renewal proposal for the Continued Development of Modeling Tools and Theory for RF Heating program. The objective of the program is to continue and extend the earlier work done by the proposed principal investigator in the field of modeling radio frequency (RF) heating experiments in large tokamak fusion experiments, particularly the Tokamak Fusion Test Reactor (TFTR) device located at Princeton Plasma Physics Laboratory (PPPL). An integral part of this work is the investigation and, in some cases, resolution of theoretical issues which pertain to accurate modeling. MRC is nearing the successful completion of the specified tasks of the Continued Development of Modeling Tools and Theory for RF Heating project. The following tasks are either completed or nearing completion: (1) anisotropic temperature and rotation upgrades; (2) modeling for relativistic ECRH; (3) further documentation of SHOOT and SPRUCE. As a result of the progress achieved under this project, MRC has been urged to continue this effort. Specifically, during the performance of this project two topics were identified by PPPL personnel as new applications of the existing RF modeling tools. These two topics concern (a) future fast-wave current drive experiments on the large tokamaks, including TFTR, and (b) the interpretation of existing and future RF probe data from TFTR. Addressing each of these topics requires some modification or enhancement of the existing modeling tools, and the first topic requires the resolution of certain theoretical issues to produce self-consistent results. This work falls within the scope of the original project and is more suited to the project's renewal than to the initiation of a new project

  15. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  16. Nanocomposites for Machining Tools

    Directory of Open Access Journals (Sweden)

    Daria Sidorenko

    2017-10-01

    Machining tools are used in many areas of production. To a considerable extent, the performance characteristics of the tools determine the quality and cost of the obtained products. The main materials used for producing machining tools are steel, cemented carbides, ceramics and superhard materials. A promising way to improve the performance characteristics of these materials is to design new nanocomposites based on them. The application of micromechanical modeling during the elaboration of composite materials for machining tools can reduce the financial and time costs of developing new tools with enhanced performance. This article reviews the main groups of nanocomposites for machining tools and their performance.

  17. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    Science.gov (United States)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important element in validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementations and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing the time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To this end, a number of tree and graph visualization tools were researched, and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  18. Model-Based Design Tools for Extending COTS Components To Extreme Environments, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this project is model-based design (MBD) tools for predicting the performance and useful life of commercial-off-the-shelf (COTS) components and...

  19. Assessing teamwork performance in obstetrics: A systematic search and review of validated tools.

    Science.gov (United States)

    Fransen, Annemarie F; de Boer, Liza; Kienhorst, Dieneke; Truijens, Sophie E; van Runnard Heimel, Pieter J; Oei, S Guid

    2017-09-01

    Teamwork performance is an essential component for the clinical efficiency of multi-professional teams in obstetric care. As patient safety is related to teamwork performance, it has become an important learning goal in simulation-based education. In order to improve teamwork performance, reliable assessment tools are required. These can be used to provide feedback during training courses, or to compare learning effects between different types of training courses. The aim of the current study is to (1) identify the available assessment tools to evaluate obstetric teamwork performance in a simulated environment, and (2) evaluate their psychometric properties in order to identify the most valuable tool(s) to use. We performed a systematic search in PubMed, MEDLINE, and EMBASE to identify articles describing assessment tools for the evaluation of obstetric teamwork performance in a simulated environment. In order to evaluate the quality of the identified assessment tools, the standards and grading rules were applied as recommended by the Accreditation Council for Graduate Medical Education (ACGME) Committee on Educational Outcomes. The included studies were also assessed according to the Oxford Centre for Evidence Based Medicine (OCEBM) levels of evidence. This search resulted in the inclusion of five articles describing the following six tools: Clinical Teamwork Scale, Human Factors Rating Scale, Global Rating Scale, Assessment of Obstetric Team Performance, Global Assessment of Obstetric Team Performance, and the Teamwork Measurement Tool. Based on the ACGME guidelines we assigned a Class 3, level C of evidence, to all tools. Regarding the OCEBM levels of evidence, a level 3b was assigned to two studies and a level 4 to four studies. The Clinical Teamwork Scale demonstrated the most comprehensive validation, and the Teamwork Measurement Tool demonstrated promising results; however, it is recommended to further investigate its reliability.

  20. A corpus of full-text journal articles is a robust evaluation tool for revealing differences in performance of biomedical natural language processing tools.

    Science.gov (United States)

    Verspoor, Karin; Cohen, Kevin Bretonnel; Lanfranchi, Arrick; Warner, Colin; Johnson, Helen L; Roeder, Christophe; Choi, Jinho D; Funk, Christopher; Malenkiy, Yuriy; Eckert, Miriam; Xue, Nianwen; Baumgartner, William A; Bada, Michael; Palmer, Martha; Hunter, Lawrence E

    2012-08-17

    We introduce the linguistic annotation of a corpus of 97 full-text biomedical publications, known as the Colorado Richly Annotated Full Text (CRAFT) corpus. We further assess the performance of existing tools for performing sentence splitting, tokenization, syntactic parsing, and named entity recognition on this corpus. Many biomedical natural language processing systems demonstrated large differences between their previously published results and their performance on the CRAFT corpus when tested with the publicly available models or rule sets. Trainable systems differed widely with respect to their ability to build high-performing models based on this data. The finding that some systems were able to train high-performing models based on this corpus is additional evidence, beyond high inter-annotator agreement, that the quality of the CRAFT corpus is high. The overall poor performance of various systems indicates that considerable work needs to be done to enable natural language processing systems to work well when the input is full-text journal articles. The CRAFT corpus provides a valuable resource to the biomedical natural language processing community for evaluation and training of new models for biomedical full text publications.
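
    The tool comparisons described here ultimately reduce to span-level precision, recall and F1 against the gold annotations; a minimal exact-match scorer (with toy spans, not CRAFT data) is sketched below.

```python
def span_prf(gold, predicted):
    """Exact-match precision/recall/F1 over (start, end, label) entity spans."""
    gold, predicted = set(gold), set(predicted)
    tp = len(gold & predicted)
    p = tp / len(predicted) if predicted else 0.0
    r = tp / len(gold) if gold else 0.0
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1

# Toy annotations: one predicted span carries the wrong label.
gold = [(0, 7, "protein"), (12, 19, "cell"), (30, 38, "protein")]
pred = [(0, 7, "protein"), (12, 19, "protein"), (30, 38, "protein")]
print(span_prf(gold, pred))   # (0.666..., 0.666..., 0.666...)
```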

  1. Model-Based Design Tools for Extending COTS Components To Extreme Environments, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation in this Phase I project is to prove the feasibility of using model-based design (MBD) tools to predict the performance and useful life of...

  2. Molecular modeling for the design of novel performance chemicals and materials

    CERN Document Server

    Rai, Beena

    2012-01-01

    Molecular modeling (MM) tools offer significant benefits in the design of industrial chemical plants and material processing operations. While the role of MM in biological fields is well established, in most cases MM works as an accessory in novel product/material development rather than as a tool for direct innovation. As a result, MM engineers and practitioners are often seized with the question: "How do I leverage these tools to develop novel materials or chemicals in my industry?" Molecular Modeling for the Design of Novel Performance Chemicals and Materials answers this important questio

  3. CyNC - towards a General Tool for Performance Analysis of Complex Distributed Real Time Systems

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Jessen, Jan Jakob; Nielsen, Jens F. Dalsgaard

    2005-01-01

    The paper addresses the current state and ongoing activities of a tool for performance analysis of complex real-time systems. The tool, named CyNC, is based on network calculus, allowing for the computation of backlogs and delays in a system from specified lower and upper bounds of external… workflow and computational resources. The current version of the tool implements an extension to previous work in that it allows for general workflow and resource bounds and provides optimal solutions even for systems with cyclic dependencies. Despite the virtues of the current tool, improvements… and extensions still remain, and these are the focus of ongoing activities. Improvements include accounting for phase information to improve bounds, whereas the tool awaits extension to include flow control models, both of which depend on the possibility of accounting for propagation delay. Since the current version…

  4. Paramedir: A Tool for Programmable Performance Analysis

    Science.gov (United States)

    Jost, Gabriele; Labarta, Jesus; Gimenez, Judit

    2004-01-01

    Performance analysis of parallel scientific applications is time consuming and requires great expertise in areas such as programming paradigms, system software, and computer hardware architectures. In this paper we describe a tool that facilitates the programmability of performance metric calculations thereby allowing the automation of the analysis and reducing the application development time. We demonstrate how the system can be used to capture knowledge and intuition acquired by advanced parallel programmers in order to be transferred to novice users.

  5. The Cryosphere Model Comparison Tool (CmCt): Ice Sheet Model Validation and Comparison Tool for Greenland and Antarctica

    Science.gov (United States)

    Simon, E.; Nowicki, S.; Neumann, T.; Tyahla, L.; Saba, J. L.; Guerber, J. R.; Bonin, J. A.; DiMarzio, J. P.

    2017-12-01

    The Cryosphere Model Comparison Tool (CmCt) is a web-based ice sheet model validation tool being developed by NASA to facilitate direct comparison between observational data and various ice sheet models. The CmCt allows the user to take advantage of several decades' worth of observations from Greenland and Antarctica. Currently, the CmCt can be used to compare ice sheet models provided by the user with remotely sensed satellite data from ICESat (Ice, Cloud, and land Elevation Satellite) laser altimetry, the GRACE (Gravity Recovery and Climate Experiment) satellite, and radar altimetry (ERS-1, ERS-2, and Envisat). One or more models can be uploaded through the CmCt website and compared with observational data, or compared to each other or to other models. The CmCt calculates statistics on the differences between the model and observations, and other quantitative and qualitative metrics, which can be used to evaluate the different model simulations against the observations. The qualitative metrics consist of a range of visual outputs, and the quantitative metrics consist of several whole-ice-sheet scalar values that can be used to assign an overall score to a particular simulation. The comparison results from CmCt are useful for quantifying improvements within a specific model (or within a class of models) resulting from differences in model dynamics (e.g., shallow vs. higher-order dynamics approximations), model physics (e.g., representations of ice sheet rheological or basal processes), or model resolution (mesh resolution and/or changes in the spatial resolution of input datasets). The framework and metrics could also be used as a model-to-model intercomparison tool, simply by swapping in outputs from another model as the observational datasets. Future versions of the tool will include comparisons with other datasets that are of interest to the modeling community, such as ice velocity, ice thickness, and surface mass balance.
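
    The record does not list CmCt's exact metric set, but whole-ice-sheet scalar comparisons of the kind described would be computed along these lines; the function and the synthetic grids below are illustrative only.

```python
import numpy as np

def comparison_stats(model, obs):
    """Whole-ice-sheet scalar metrics over co-located grid cells, ignoring gaps."""
    m = np.isfinite(model) & np.isfinite(obs)
    diff = model[m] - obs[m]
    return {
        "mean_bias": diff.mean(),
        "rmse": np.sqrt((diff ** 2).mean()),
        "corr": np.corrcoef(model[m], obs[m])[0, 1],
    }

rng = np.random.default_rng(2)
obs = rng.normal(2000.0, 500.0, size=(50, 50))       # e.g. altimetry elevations (m)
model = obs + rng.normal(5.0, 40.0, size=obs.shape)  # model with bias and noise
model[0, :5] = np.nan                                # simulate missing cells
print(comparison_stats(model, obs))
```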

  6. An Evaluation of Growth Models as Predictive Tools for Estimates at Completion (EAC)

    National Research Council Canada - National Science Library

    Trahan, Elizabeth N

    2009-01-01

    ...) as the Estimates at Completion (EAC). Our research evaluates the prospect of nonlinear growth modeling as an alternative to the current predictive tools used for calculating EAC, such as the Cost Performance Index (CPI...

  7. 7th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Nagel, Wolfgang; Resch, Michael

    2014-01-01

    Current advances in High Performance Computing (HPC) increasingly impact efficient software development workflows. Programmers for HPC applications need to consider trends such as increased core counts, multiple levels of parallelism, reduced memory per core, and I/O system challenges in order to derive well performing and highly scalable codes. At the same time, the increasing complexity adds further sources of program defects. While novel programming paradigms and advanced system libraries provide solutions for some of these challenges, appropriate supporting tools are indispensable. Such tools aid application developers in debugging, performance analysis, or code optimization and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 7th International Parallel Tools Workshop, held in Dresden, Germany, September 3-4, 2013.  

  8. Formal Implementation of a Performance Evaluation Model for the Face Recognition System

    Directory of Open Access Journals (Sweden)

    Yong-Nyuo Shin

    2008-01-01

    Due to its usability features, practical applications, and lack of intrusiveness, face recognition technology, based on information derived from individuals' facial features, has been attracting considerable attention recently. Reported recognition rates of commercialized face recognition systems cannot be taken as official recognition rates, as they are based on assumptions that are beneficial to the specific system and face database. Therefore, performance evaluation methods and tools are necessary to objectively measure the accuracy and performance of any face recognition system. In this paper, we propose and formalize a performance evaluation model for biometric recognition systems, implementing an evaluation tool for face recognition systems based on the proposed model. Furthermore, we performed evaluations objectively by providing guidelines for the design and implementation of a performance evaluation system, formalizing the performance test process.
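
    A performance evaluation model of this kind ultimately rests on error rates computed from genuine and impostor score distributions; the sketch below, using synthetic scores, locates the equal error rate by a threshold sweep.

```python
import numpy as np

def far_frr(genuine, impostor, threshold):
    """False accept / false reject rates at a score threshold
    (higher score = better match)."""
    far = np.mean(np.asarray(impostor) >= threshold)
    frr = np.mean(np.asarray(genuine) < threshold)
    return far, frr

rng = np.random.default_rng(3)
genuine = rng.normal(0.8, 0.1, 1000)    # match scores for same-person pairs
impostor = rng.normal(0.4, 0.1, 1000)   # match scores for different-person pairs

# Sweep thresholds to find the equal error rate (EER), where FAR == FRR.
ts = np.linspace(0, 1, 501)
rates = np.array([far_frr(genuine, impostor, t) for t in ts])
i = np.argmin(np.abs(rates[:, 0] - rates[:, 1]))
print(f"EER ~ {rates[i].mean():.3f} at threshold {ts[i]:.2f}")
```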

  9. Comparisons of Faulting-Based Pavement Performance Prediction Models

    Directory of Open Access Journals (Sweden)

    Weina Wang

    2017-01-01

    Faulting prediction is at the core of concrete pavement maintenance and design. Highway agencies are always faced with the problem of low prediction accuracy, which leads to costly maintenance. Although many researchers have developed performance prediction models, prediction accuracy has remained a challenge. This paper reviews performance prediction models and JPCP faulting models that have been used in past research. Then three models, the multivariate nonlinear regression (MNLR) model, the artificial neural network (ANN) model, and the Markov chain (MC) model, are tested and compared using a set of actual pavement survey data taken on interstate highways with varying design features, traffic, and climate data. It is found that the MNLR model needs further recalibration, while the ANN model needs more data for training the network. The MC model seems a good tool for pavement performance prediction when data are limited, but it is based on visual inspections and is not explicitly related to quantitative physical parameters. This paper therefore suggests that a further direction for developing performance prediction models is to combine the strengths of the different models to obtain better accuracy.
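
    As a minimal illustration of the MC approach, a five-state deterioration chain propagated over a ten-year horizon is shown below; the transition probabilities are hypothetical, not calibrated to the survey data.

```python
import numpy as np

# Hypothetical one-year transition matrix over five condition states
# (1 = excellent ... 5 = failed); rows sum to 1, no maintenance modeled.
P = np.array([
    [0.80, 0.20, 0.00, 0.00, 0.00],
    [0.00, 0.75, 0.25, 0.00, 0.00],
    [0.00, 0.00, 0.70, 0.30, 0.00],
    [0.00, 0.00, 0.00, 0.65, 0.35],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])   # new pavement, all mass in state 1
for year in range(1, 11):
    state = state @ P                          # propagate one year
    expected_condition = state @ np.arange(1, 6)
    print(f"year {year:2d}: expected condition index {expected_condition:.2f}")
```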

  10. Optimizing Performance in Psychology Students - Neurofeedback as a performance enhancing tool

    OpenAIRE

    Elvebredd, Pernille Malene Sandberg

    2014-01-01

    Neurofeedback has been shown to be successful in treating epilepsy and ADHD and in enhancing performance in musicians and dancers. The objective of the current study was to examine the effect of a neurofeedback beta1/theta protocol as a tool for optimizing performance in healthy psychology students. To achieve this, 19-channel EEG was recorded during a visual Go/NoGo task at two time points, both prior to and following either ten sessions of neurofeedback training (10 individuals) or ten sess...

  11. Predicting the Consequences of Workload Management Strategies with Human Performance Modeling

    Science.gov (United States)

    Mitchell, Diane Kuhl; Samma, Charneta

    2011-01-01

    Human performance modelers at the US Army Research Laboratory have developed an approach for establishing Soldier high-workload conditions that can be used in analyses of proposed system designs. Their technique includes three key components. To implement the approach in an experiment, the researcher would create two experimental conditions: a baseline and a design alternative. Next, they would identify a scenario in which the test participants perform all their representative concurrent interactions with the system. This scenario should include any events that would trigger a different set of goals for the human operators. They would collect workload values during both the control and alternative design conditions to see if the alternative increased workload and decreased performance. They have successfully implemented this approach for military vehicle designs using the human performance modeling tool IMPRINT. Although ARL researchers use IMPRINT to implement their approach, it can be applied to any workload analysis. Researchers using other modeling and simulation tools, or conducting experiments or field tests, can use the same approach.

  12. Acoustic/seismic signal propagation and sensor performance modeling

    Science.gov (United States)

    Wilson, D. Keith; Marlin, David H.; Mackay, Sean

    2007-04-01

    Performance, optimal employment, and interpretation of data from acoustic and seismic sensors depend strongly and in complex ways on the environment in which they operate. Software tools for guiding non-expert users of acoustic and seismic sensors are therefore much needed. However, such tools require that many individual components be constructed and correctly connected together. These components include the source signature and directionality, representation of the atmospheric and terrain environment, calculation of the signal propagation, characterization of the sensor response, and mimicking of the data processing at the sensor. Selection of an appropriate signal propagation model is particularly important, as there are significant trade-offs between output fidelity and computation speed. Attenuation of signal energy, random fading, and (for array systems) variations in wavefront angle-of-arrival should all be considered. Characterization of the complex operational environment is often the weak link in sensor modeling: important issues for acoustic and seismic modeling activities include the temporal/spatial resolution of the atmospheric data, knowledge of the surface and subsurface terrain properties, and representation of ambient background noise and vibrations. Design of software tools that address these challenges is illustrated with two examples: a detailed target-to-sensor calculation application called the Sensor Performance Evaluator for Battlefield Environments (SPEBE) and a GIS-embedded approach called Battlefield Terrain Reasoning and Awareness (BTRA).
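
    The fidelity-versus-speed trade-off mentioned above starts at the simplest propagation model: spherical spreading plus linear absorption. The source level, noise level, absorption coefficient and SNR threshold below are placeholders.

```python
import numpy as np

def transmission_loss(r, alpha=0.005, r0=1.0):
    """Spherical spreading plus linear atmospheric absorption (dB);
    alpha in dB/m is an illustrative value, not a measured coefficient."""
    return 20.0 * np.log10(r / r0) + alpha * r

def detectable(source_level_db, noise_db, r, required_snr_db=6.0):
    """Crude single-path detection test: received level minus noise >= SNR."""
    received = source_level_db - transmission_loss(r)
    return received - noise_db >= required_snr_db

for r in (100.0, 500.0, 2000.0):   # target-to-sensor ranges, m
    print(r, detectable(source_level_db=110.0, noise_db=40.0, r=r))
```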

  13. A tool to convert CAD models for importation into Geant4

    Science.gov (United States)

    Vuosalo, C.; Carlsmith, D.; Dasu, S.; Palladino, K.; LUX-ZEPLIN Collaboration

    2017-10-01

    The engineering design of a particle detector is usually performed in a Computer Aided Design (CAD) program, and simulation of the detector’s performance can be done with a Geant4-based program. However, transferring the detector design from the CAD program to Geant4 can be laborious and error-prone. SW2GDML is a tool that reads a design in the popular SOLIDWORKS CAD program and outputs Geometry Description Markup Language (GDML), used by Geant4 for importing and exporting detector geometries. Other methods for outputting CAD designs are available, such as the STEP format, and tools exist to convert these formats into GDML. However, these conversion methods produce very large and unwieldy designs composed of tessellated solids that can reduce Geant4 performance. In contrast, SW2GDML produces compact, human-readable GDML that employs standard geometric shapes rather than tessellated solids. This paper will describe the development and current capabilities of SW2GDML and plans for its enhancement. The aim of this tool is to automate importation of detector engineering models into Geant4-based simulation programs to support rapid, iterative cycles of detector design, simulation, and optimization.

  14. Anesthesia report card - a customizable tool for performance improvement.

    Science.gov (United States)

    Peccora, Christian D; Gimlich, Robert; Cornell, Richard P; Vacanti, Charles A; Ehrenfeld, Jesse M; Urman, Richard D

    2014-09-01

    Measuring and providing performance feedback to physicians has gained momentum not only as a way to comply with regulatory requirements, but also as a way to improve patient care. Measurement of structural, process, and outcome metrics in a reliable, evidence-based, specialty-specific manner maximizes the probability of improving physician performance. The manner in which feedback is provided influences whether the measurement tool will be successful in changing behavior. We created an innovative reporting tool template for anesthesiology practitioners designed to provide detailed, continuous feedback covering many aspects of clinical practice. The literature regarding quality metric measurement and feedback strategies was examined to design a reporting tool that could provide high quality information and result in improved performance of clinical and academic tasks. A committee of department leaders and information technology professionals was tasked with determining the measurement criteria and infrastructure needed to generate these reports. Data was collected in a systematic, unbiased manner, and reports were populated with information from multiple databases and software systems. Feedback would be based on frequently updated information and allow for analysis of historical performance as well as comparison amongst peers. A template for an anesthesia report card was created. Categories included compliance, credentialing and qualifications, education, clinical and operating room responsibilities, and academic achievements. Physicians were able to choose to be evaluated in some of the categories and had to meet a minimum number of criteria within each category. This allowed for customization to each practitioner's practice. Criteria were derived from the measures of academic and clinical proficiency, as well as quality metrics. Criteria were objective measures and data gathering was often automated. Reports could be generated that were updated daily and provided

  15. Influence of the worn tool affected by built-up edge (BUE) on micro end-milling process performance: A 3D finite element modeling investigation

    DEFF Research Database (Denmark)

    Davoudinejad, Ali; Tosello, Guido; Annoni, Massimiliano

    2017-01-01

    Micro milling has been utilized for several decades due to the flexibility of the process in producing complex components. The small scale of the process makes the comprehension of cutting phenomenon details more difficult. This study presents a 3D finite element modeling (3D FEM) approach for the micro end-milling process of aluminum (Al6082-T6). 3D FEM simulations are carried out in full-slot micro end-milling and contour up-milling. The model first implements the actual tool geometry and then the effect of a typical built-up edge (BUE) on the milling tool. The influence of BUE on the process performance is investigated by comparing the predicted 3D chip flow shape, burr formation and cutting forces with experiments conducted on an ultra-high precision micro milling center. Simulations indicate that BUE has a significant impact on the chip shape and chip load for different teeth…

  16. Examining the Effects of Field Dependence-Independence on Learners' Problem-Solving Performance and Interaction with a Computer Modeling Tool: Implications for the Design of Joint Cognitive Systems

    Science.gov (United States)

    Angeli, Charoula

    2013-01-01

    An investigation was carried out to examine the effects of cognitive style on learners' performance and interaction during complex problem solving with a computer modeling tool. One hundred and nineteen undergraduates volunteered to participate in the study. Participants were first administered a test, and based on their test scores they were…

  17. A study on human performance enhancement plan in maintenance field by survey on actual condition of human performance tools - 15035

    International Nuclear Information System (INIS)

    Park, J.; Jeong, H.; Kim, Y.

    2015-01-01

    Human errors in nuclear power plants are one of the important factors that may cause reactor trips. Most operating companies of nuclear power plants manage human factors systematically through tools like HPES (Human Performance Enhancement System), PSR (Periodic Safety Review), OE (Operating Experience), human performance tools, safety culture assessment and CAP (Corrective Action Program). But human factors are managed passively in the maintenance field, because maintenance work is carried out by partner companies. KHNP also contracts maintenance work to partner companies and advises them to use human performance tools, but the actual conditions at the partner companies had not been surveyed. This paper suggests plans that can improve human performance by analyzing the opinions of partner company employees about the causes of and solutions to human errors, by analyzing the utilization of human performance tools, and by comparing the results of the partner company survey with the results of the operating company survey. The survey was conducted at 3 partner companies with similar contents and categories in order to allow comparison of the partner companies with the operating company, and the main analysis fields are the following: 1) the level of understanding and utilization of the human performance tools, 2) difficulties in applying the human performance tools, 3) the level of employee training (or education) in the use of the human performance tools, and 4) root causes of human errors and countermeasures. (authors)

  18. Designing tools for oil exploration using nuclear modeling

    Science.gov (United States)

    Mauborgne, Marie-Laure; Allioli, Françoise; Manclossi, Mauro; Nicoletti, Luisa; Stoller, Chris; Evans, Mike

    2017-09-01

    When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  19. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as cyber-physical systems, which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.
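
    A minimal sketch of the statistical core that a tool like PVeStA parallelizes: estimate the probability that a property holds by sampling independent simulation traces across worker processes and attaching a normal-approximation confidence bound. The toy model, property, and constants below are invented for illustration and do not reflect PVeStA's actual interface or algorithms.

```python
import random
from concurrent.futures import ProcessPoolExecutor
from math import sqrt

def simulate_trace(seed, horizon=100, fail_rate=0.01):
    """Toy probabilistic system: does a failure occur within the horizon?
    Stands in for one Monte Carlo run of the model under analysis."""
    rng = random.Random(seed)
    return any(rng.random() < fail_rate for _ in range(horizon))

def estimate_probability(n_samples=20000, workers=4):
    """Statistical model checking loop: sample traces in parallel and
    report the estimate with a 95% normal-approximation half-width."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        outcomes = list(pool.map(simulate_trace, range(n_samples)))
    p = sum(outcomes) / n_samples
    half_width = 1.96 * sqrt(p * (1 - p) / n_samples)
    return p, half_width

if __name__ == "__main__":
    p, err = estimate_probability()
    print(f"P(failure within horizon) ~ {p:.4f} +/- {err:.4f}")
```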

  20. The effect of ergonomic laparoscopic tool handle design on performance and efficiency.

    Science.gov (United States)

    Tung, Kryztopher D; Shorti, Rami M; Downey, Earl C; Bloswick, Donald S; Merryweather, Andrew S

    2015-09-01

    Many factors can affect a surgeon's performance in the operating room; these may include surgeon comfort, ergonomics of tool handle design, and fatigue. A laparoscopic tool handle designed with ergonomic considerations (pistol grip) was tested against a current market tool with a traditional pinch grip handle. The goal of this study is to quantify the impact that ergonomic design considerations have on surgeon performance. We hypothesized that there would be measurable differences in efficiency when performing FLS surgical trainer tasks with the two tool handle designs in three categories: time to completion, technical skill, and subjective user ratings. The pistol grip incorporates an ergonomic interface intended to reduce contact stress points on the hand and fingers, promote a more neutral operating wrist posture, and reduce hand tremor and fatigue. The traditional pinch grip is a laparoscopic tool developed by Stryker Inc. that is widely used during minimally invasive surgery. Twenty-three (13 M, 10 F) participants with no existing upper extremity musculoskeletal disorders or experience performing laparoscopic procedures were selected to participate in this study. During a training session prior to testing, participants performed practice trials in a SAGES FLS trainer with both tools. During data collection, participants performed three evaluation tasks using both handle designs (order was randomized, and each trial was completed three times). The tasks consisted of FLS peg transfer, cutting, and suturing tasks. Feedback from test participants indicated that they significantly preferred the ergonomic pistol grip in every category (p < 0.05); most notably, participants experienced greater degrees of discomfort in their hands after using the pinch grip tool. Furthermore, participants completed cutting and peg transfer tasks in a shorter time duration (p < 0.05) with the pistol grip than with the pinch grip design; there was no significant difference between completion
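
    The within-subject comparisons reported above (each participant uses both handles) are typically analyzed with paired statistics. A minimal sketch with entirely hypothetical completion-time data standing in for the study's measurements:

```python
import numpy as np
from scipy import stats

# Hypothetical paired completion times (seconds) for 23 participants
# performing the same FLS task with each handle design.
rng = np.random.default_rng(7)
pinch = rng.normal(120.0, 15.0, 23)
pistol = pinch - rng.normal(8.0, 5.0, 23)   # pistol grip assumed faster here

# Paired t-test: the standard analysis for within-subject tool comparisons.
t_stat, p_value = stats.ttest_rel(pistol, pinch)
print(f"mean difference = {np.mean(pistol - pinch):+.1f} s, p = {p_value:.4f}")
```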

  1. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount that is required for a reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model prediction with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been successfully used in a number of astrophysical research projects.

  2. 2B-Alert Web: An Open-Access Tool for Predicting the Effects of Sleep/Wake Schedules and Caffeine Consumption on Neurobehavioral Performance.

    Science.gov (United States)

    Reifman, Jaques; Kumar, Kamal; Wesensten, Nancy J; Tountas, Nikolaos A; Balkin, Thomas J; Ramakrishnan, Sridhar

    2016-12-01

    Computational tools that predict the effects of daily sleep/wake amounts on neurobehavioral performance are critical components of fatigue management systems, allowing for the identification of periods during which individuals are at increased risk for performance errors. However, none of the existing computational tools is publicly available, and the commercially available tools do not account for the beneficial effects of caffeine on performance, limiting their practical utility. Here, we introduce 2B-Alert Web, an open-access tool for predicting neurobehavioral performance, which accounts for the effects of sleep/wake schedules, time of day, and caffeine consumption, while incorporating the latest scientific findings in sleep restriction, sleep extension, and recovery sleep. We combined our validated Unified Model of Performance and our validated caffeine model to form a single, integrated modeling framework instantiated as a Web-enabled tool. 2B-Alert Web allows users to input daily sleep/wake schedules and caffeine consumption (dosage and time) to obtain group-average predictions of neurobehavioral performance based on psychomotor vigilance tasks. 2B-Alert Web is accessible at: https://2b-alert-web.bhsai.org. The 2B-Alert Web tool allows users to obtain predictions for mean response time, mean reciprocal response time, and number of lapses. The graphing tool allows for simultaneous display of up to seven different sleep/wake and caffeine schedules. The schedules and corresponding predicted outputs can be saved as a Microsoft Excel file; the corresponding plots can be saved as an image file. The schedules and predictions are erased when the user logs off, thereby maintaining privacy and confidentiality. The publicly accessible 2B-Alert Web tool is available for operators, schedulers, and neurobehavioral scientists as well as the general public to determine the impact of any given sleep/wake schedule, caffeine consumption, and time of day on performance of a
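
    To make the modeling idea concrete, here is a toy sketch in the spirit of two-process alertness models with a multiplicative caffeine effect. Every equation and constant below is an invented placeholder; this is not the validated Unified Model of Performance or the published caffeine model behind 2B-Alert Web.

```python
import math

def homeostatic_pressure(t_awake_h, tau_rise=18.2):
    """Process S: sleep pressure rising during wakefulness (toy constants)."""
    return 1.0 - math.exp(-t_awake_h / tau_rise)

def circadian(clock_h, phase=16.0, amp=0.15):
    """Process C: a 24 h sinusoid peaking in the late afternoon."""
    return amp * math.cos(2.0 * math.pi * (clock_h - phase) / 24.0)

def caffeine_factor(dose_mg, t_since_dose_h, k=0.6, m=110.0, half_life=5.0):
    """Multiplicative attenuation of impairment, decaying with a ~5 h half-life."""
    if dose_mg <= 0 or t_since_dose_h < 0:
        return 1.0
    conc = dose_mg * 0.5 ** (t_since_dose_h / half_life)
    return 1.0 / (1.0 + k * conc / (conc + m))

def predicted_impairment(t_awake_h, clock_h, dose_mg=0.0, t_since_dose_h=0.0):
    """Toy prediction: higher values mean more impairment (more PVT lapses)."""
    base = homeostatic_pressure(t_awake_h) - circadian(clock_h)
    return base * caffeine_factor(dose_mg, t_since_dose_h)

# Example: 18 h awake at 02:00, with 200 mg of caffeine taken 4 h earlier.
print(predicted_impairment(18.0, 2.0, dose_mg=200.0, t_since_dose_h=4.0))
```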

  3. Prospective performance evaluation of selected common virtual screening tools. Case study: Cyclooxygenase (COX) 1 and 2.

    Science.gov (United States)

    Kaserer, Teresa; Temml, Veronika; Kutil, Zsofia; Vanek, Tomas; Landa, Premysl; Schuster, Daniela

    2015-01-01

    Computational methods can be applied in drug development for the identification of novel lead candidates, but also for the prediction of pharmacokinetic properties and potential adverse effects, thereby helping to prioritize and identify the most promising compounds. In principle, several techniques are available for this purpose; however, which one is the most suitable for a specific research objective still requires further investigation. Within this study, the performance of several programs, representing common virtual screening methods, was compared in a prospective manner. First, we selected top-ranked virtual screening hits from the three methods pharmacophore modeling, shape-based modeling, and docking. For comparison, these hits were then additionally predicted by external pharmacophore- and 2D similarity-based bioactivity profiling tools. Subsequently, the biological activities of the selected hits were assessed in vitro, which allowed for evaluating and comparing the prospective performance of the applied tools. Although all methods performed well, considerable differences were observed concerning hit rates, true positive and true negative hits, and hitlist composition. Our results suggest that a rational selection of the applied method represents a powerful strategy to maximize the success of a research project, tightly linked to its aims. We employed cyclooxygenase as an application example; however, the focus of this study lay on highlighting the differences in virtual screening tool performance and not on the identification of novel COX inhibitors. Copyright © 2015 The Authors. Published by Elsevier Masson SAS. All rights reserved.
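
    A prospective evaluation of this kind reduces to simple bookkeeping once the assay results return: which selected hits turned out to be active. A minimal sketch with hypothetical compound ids and outcomes:

```python
def screening_metrics(selected, assay_results):
    """Score a virtual screening method prospectively.
    selected: set of compound ids the method ranked as hits.
    assay_results: dict of compound id -> True (active) / False (inactive)."""
    tested = {c: active for c, active in assay_results.items() if c in selected}
    true_positives = sum(tested.values())
    hit_rate = true_positives / len(tested) if tested else 0.0
    return {"selected": len(selected), "tested": len(tested),
            "true_positives": true_positives, "hit_rate": hit_rate}

# Hypothetical outcome for one method (e.g., a pharmacophore model).
pharmacophore_hits = {"c01", "c07", "c12", "c19"}
assays = {"c01": True, "c07": False, "c12": True, "c19": True}
print(screening_metrics(pharmacophore_hits, assays))   # hit rate 0.75
```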

  4. Modelling of Tool Wear and Residual Stress during Machining of AISI H13 Tool Steel

    Science.gov (United States)

    Outeiro, José C.; Umbrello, Domenico; Pina, José C.; Rizzuti, Stefania

    2007-05-01

    Residual stresses can enhance or impair the ability of a component to withstand loading conditions in service (fatigue, creep, stress corrosion cracking, etc.), depending on their nature: compressive or tensile, respectively. This poses enormous problems in structural assembly, as it affects the structural integrity of the whole part. In addition, tool wear issues are of critical importance in manufacturing since these affect component quality, tool life and machining cost. Therefore, prediction and control of both tool wear and the residual stresses in machining are absolutely necessary. In this work, a two-dimensional Finite Element model using an implicit Lagrangian formulation with automatic remeshing was applied to simulate the orthogonal cutting process of AISI H13 tool steel. To validate the model, the predicted and experimentally measured chip geometry, cutting forces, temperatures, tool wear and residual stresses on the machined affected layers were compared. The proposed FE model allowed us to investigate the influence of tool geometry, cutting regime parameters and tool wear on the residual stress distribution in the machined surface and subsurface of AISI H13 tool steel. The results obtained permit the conclusion that, in order to reduce the magnitude of surface residual stresses, the cutting speed should be increased, the uncut chip thickness (or feed) should be reduced, and machining with honed tools having large cutting edge radii produces better results than with chamfered tools. Moreover, increasing tool wear increases the magnitude of surface residual stresses.

  5. Modeling and performance assessment in QinetiQ of EO and IR airborne reconnaissance systems

    Science.gov (United States)

    Williams, John W.; Potter, Gary E.

    2002-11-01

    QinetiQ are the technical authority responsible for specifying the performance requirements for the procurement of airborne reconnaissance systems, on behalf of the UK MoD. They are also responsible for acceptance of delivered systems, overseeing and verifying the installed system performance as predicted and then assessed by the contractor. Measures of functional capability are central to these activities. The conduct of these activities utilises the broad technical insight and wide range of analysis tools and models available within QinetiQ. This paper focuses on the tools, methods and models that are applicable to systems based on EO and IR sensors. The tools, methods and models are described, and representative output for systems that QinetiQ has been responsible for is presented. The principal capability applicable to EO and IR airborne reconnaissance systems is the STAR (Simulation Tools for Airborne Reconnaissance) suite of models. STAR generates predictions of performance measures such as GRD (Ground Resolved Distance) and GIQE (General Image Quality Equation) NIIRS (National Imagery Interpretation Rating Scales). It also generates images representing sensor output, using the scene generation software CAMEO-SIM and the imaging sensor model EMERALD. The simulated image 'quality' is fully correlated with the predicted non-imaging performance measures. STAR also generates image and table data that is compliant with STANAG 7023, which may be used to test ground station functionality.

  6. ATR performance modeling concepts

    Science.gov (United States)

    Ross, Timothy D.; Baker, Hyatt B.; Nolan, Adam R.; McGinnis, Ryan E.; Paulson, Christopher R.

    2016-05-01

    Performance models are needed for automatic target recognition (ATR) development and use. ATRs consume sensor data and produce decisions about the scene observed. ATR performance models (APMs) on the other hand consume operating conditions (OCs) and produce probabilities about what the ATR will produce. APMs are needed for many modeling roles of many kinds of ATRs (each with different sensing modality and exploitation functionality combinations); moreover, there are different approaches to constructing the APMs. Therefore, although many APMs have been developed, there is rarely one that fits a particular need. Clarified APM concepts may allow us to recognize new uses of existing APMs and identify new APM technologies and components that better support coverage of the needed APMs. The concepts begin with thinking of ATRs as mapping OCs of the real scene (including the sensor data) to reports. An APM is then a mapping from explicit quantized OCs (represented with less resolution than the real OCs) and latent OC distributions to report distributions. The roles of APMs can be distinguished by the explicit OCs they consume. APMs used in simulations consume the true state that the ATR is attempting to report. APMs used online with the exploitation consume the sensor signal and derivatives, such as match scores. APMs used in sensor management consume neither of those, but estimate performance from other OCs. This paper will summarize the major building blocks for APMs, including knowledge sources, OC models, look-up tables, analytical and learned mappings, and tools for signal synthesis and exploitation.
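
    One of the simplest APM building blocks named above is a look-up table from quantized operating conditions to an empirical distribution over ATR reports. A minimal sketch, with invented OC quantization and report labels:

```python
from collections import Counter

class LookupTableAPM:
    """Minimal ATR performance model: quantized operating conditions (OCs)
    map to an empirical distribution over the reports the ATR produced."""

    def __init__(self):
        self.table = {}   # OC tuple -> Counter of reports

    def train(self, records):
        """records: iterable of (oc_tuple, atr_report) pairs from trials."""
        for oc, report in records:
            self.table.setdefault(oc, Counter())[report] += 1

    def report_distribution(self, oc):
        counts = self.table.get(oc)
        if not counts:
            return {}   # OC combination never observed; no prediction
        total = sum(counts.values())
        return {r: c / total for r, c in counts.items()}

# OCs quantized as (true target class, depression-angle bin, clutter level).
apm = LookupTableAPM()
apm.train([(("tank", "low", "rural"), "tank"),
           (("tank", "low", "rural"), "truck"),
           (("tank", "low", "rural"), "tank")])
print(apm.report_distribution(("tank", "low", "rural")))   # ~2/3 correct
```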

  7. ASPECT (Automated System-level Performance Evaluation and Characterization Tool), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — SSCI has developed a suite of SAA tools and an analysis capability referred to as ASPECT (Automated System-level Performance Evaluation and Characterization Tool)....

  8. Designing tools for oil exploration using nuclear modeling

    Directory of Open Access Journals (Sweden)

    Mauborgne Marie-Laure

    2017-01-01

    Full Text Available When designing nuclear tools for oil exploration, one of the first steps is typically nuclear modeling for concept evaluation and initial characterization. Having an accurate model, including the availability of accurate cross sections, is essential to reduce or avoid time consuming and costly design iterations. During tool response characterization, modeling is benchmarked with experimental data and then used to complement and to expand the database to make it more detailed and inclusive of more measurement environments which are difficult or impossible to reproduce in the laboratory. We present comparisons of our modeling results obtained using the ENDF/B-VI and ENDF/B-VII cross section data bases, focusing on the response to a few elements found in the tool, borehole and subsurface formation. For neutron-induced inelastic and capture gamma ray spectroscopy, major obstacles may be caused by missing or inaccurate cross sections for essential materials. We show examples of the benchmarking of modeling results against experimental data obtained during tool characterization and discuss observed discrepancies.

  9. Tribological and Wear Performance of Carbide Tools with TiB2 PVD Coating under Varying Machining Conditions of TiAl6V4 Aerospace Alloy

    Directory of Open Access Journals (Sweden)

    Jose Mario Paiva

    2017-11-01

    Full Text Available Tribological phenomena and tool wear mechanisms during machining of hard-to-cut TiAl6V4 aerospace alloy have been investigated in detail. Since cutting tool wear is directly affected by tribological phenomena occurring between the surfaces of the workpiece and the cutting tool, the performance of the cutting tool is strongly associated with the conditions of the machining process. The present work shows the effect of different machining conditions on the tribological and wear performance of TiB2-coated cutting tools compared to uncoated carbide tools. FEM modeling of the temperature profile on the friction surface was performed for wet machining conditions under varying cutting parameters. Comprehensive characterization of the TiB2-coated vs. uncoated cutting tool wear performance was made using optical 3D imaging, SEM/EDX and XPS methods. The results obtained were linked to the FEM modeling. The studies carried out show that during machining of the TiAl6V4 alloy, the efficiency of the TiB2 coating application for carbide cutting tools strongly depends on cutting conditions. The TiB2 coating is very efficient under roughing at low speeds (with strong built-up edge formation). In contrast, it shows similar wear performance to the uncoated tool under finishing operations at higher cutting speeds, when cratering wear predominates.

  10. Evaluation of clinical information modeling tools.

    Science.gov (United States)

    Moreno-Conde, Alberto; Austin, Tony; Moreno-Conde, Jesús; Parra-Calderón, Carlos L; Kalra, Dipak

    2016-11-01

    Clinical information models are formal specifications for representing the structure and semantics of the clinical content within electronic health record systems. This research aims to define, test, and validate evaluation metrics for software tools designed to support the processes associated with the definition, management, and implementation of these models. The proposed framework builds on previous research that focused on obtaining agreement on the essential requirements in this area. A set of 50 conformance criteria were defined based on the 20 functional requirements agreed by that consensus and applied to evaluate the currently available tools. Of the 11 initiatives identified as developing tools for clinical information modeling, 9 were evaluated according to their performance on the evaluation metrics. Results show that functionalities related to management of data types, specifications, metadata, and terminology or ontology bindings have a good level of adoption. Improvements can be made in other areas focused on information modeling and associated processes. Other criteria related to displaying semantic relationships between concepts and communication with terminology servers had low levels of adoption. The proposed evaluation metrics were successfully tested and validated against a representative sample of existing tools. The results identify the need to improve tool support for information modeling and software development processes, especially in those areas related to governance, clinician involvement, and optimizing the technical validation of testing processes. This research confirmed the potential of these evaluation metrics to support decision makers in identifying the most appropriate tool for their organization.

  11. Clarity versus complexity: land-use modeling as a practical tool for decision-makers

    Science.gov (United States)

    Sohl, Terry L.; Claggett, Peter

    2013-01-01

    The last decade has seen a remarkable increase in the number of modeling tools available to examine future land-use and land-cover (LULC) change. Integrated modeling frameworks, agent-based models, cellular automata approaches, and other modeling techniques have substantially improved the representation of complex LULC systems, with each method using a different strategy to address complexity. However, despite the development of new and better modeling tools, the use of these tools is limited for actual planning, decision-making, or policy-making purposes. LULC modelers have become very adept at creating tools for modeling LULC change, but complicated models and lack of transparency limit their utility for decision-makers. The complicated nature of many LULC models also makes it impractical or even impossible to perform a rigorous analysis of modeling uncertainty. This paper provides a review of land-cover modeling approaches and the issues caused by the complicated nature of models, and provides suggestions to facilitate the increased use of LULC models by decision-makers and other stakeholders. The utility of LULC models themselves can be improved by 1) providing model code and documentation, 2) using scenario frameworks to frame overall uncertainties, 3) improving methods for generalizing the key LULC processes most important to stakeholders, and 4) adopting more rigorous standards for validating models and quantifying uncertainty. Communication with decision-makers and other stakeholders can be improved by increasing stakeholder participation in all stages of the modeling process, increasing the transparency of model structure and uncertainties, and developing user-friendly decision-support systems to bridge the link between LULC science and policy. By considering these options, LULC science will be better positioned to support decision-makers and increase real-world application of LULC modeling results.
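
    As a deliberately simple illustration of the cellular-automata approach mentioned above, the sketch below converts a cell to urban land cover when enough of its neighbors are urban and a per-cell suitability draw succeeds. Grid size, threshold, and suitability values are arbitrary:

```python
import numpy as np

def grow_urban(grid, suitability, n_steps=10, threshold=3, seed=0):
    """Toy cellular-automaton land-use model: 0 = rural, 1 = urban.
    A rural cell converts when >= threshold of its 8 neighbors are urban
    and a random draw falls below its suitability score."""
    rng = np.random.default_rng(seed)
    g = grid.copy()
    for _ in range(n_steps):
        # Count urban neighbors via shifted copies (edges wrap; toy only).
        nbrs = sum(np.roll(np.roll(g, dy, axis=0), dx, axis=1)
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                   if (dy, dx) != (0, 0))
        candidates = (g == 0) & (nbrs >= threshold)
        converts = candidates & (rng.random(g.shape) < suitability)
        g[converts] = 1
    return g

rural = np.zeros((50, 50), dtype=int)
rural[24:26, 24:26] = 1                    # small urban seed
suitability = np.full(rural.shape, 0.3)    # uniform suitability surface
print(grow_urban(rural, suitability).sum(), "urban cells after 10 steps")
```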

  12. Econometric model as a regulatory tool in electricity distribution - Case Network Performance Assessment Model

    International Nuclear Information System (INIS)

    Honkapuro, S.; Lassila, J.; Viljainen, S.; Tahvanainen, K.; Partanen, J.

    2004-01-01

    Electricity distribution companies operate as natural monopolies, since the building of parallel networks is not cost-effective. Monopoly companies do not have pressure from open markets to keep their prices and costs at a reasonable level. The regulation of these companies is needed to prevent misuse of the monopoly position. Regulation is usually focused either on the profit of the company or on the price of electricity. In this document, the usability of an econometric model in the regulation of electricity distribution companies is evaluated. A regulation method which determines the allowed income for each company with a generic computation model can be seen as an econometric model. As a special case of an econometric model, the method called Network Performance Assessment Model, NPAM (Naetnyttomodellen in Swedish), is analysed. NPAM was developed by the Swedish Energy Agency (STEM) for the regulation of electricity distribution companies. Both a theoretical analysis and calculations for an example network area are presented in this document to find the major directing effects of the model. The parameters of NPAM used in the calculations of this research report were dated 30th of March 2004 and were the most recent available at the time the analysis was done. However, since NPAM is under development, the parameters have been constantly changing; slight changes in the results could therefore occur if the calculations were made with the latest parameters. The main conclusions, however, are the same and do not depend on the exact parameters. (orig.)

  13. The performability tool P'ility

    NARCIS (Netherlands)

    Cloth, L.; Haverkort, Boudewijn R.H.M.

    The performability distribution is the distribution of accumulated reward in a Markov reward model (MRM) with state reward rates. Since its introduction, several algorithms for the numerical evaluation of the performability distribution have been proposed. Many of these algorithms only solve
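
    The definition can be made concrete by brute-force simulation, even though the abstract concerns specialized numerical algorithms: simulate the continuous-time Markov chain and integrate the state reward over time. A toy sketch with an invented two-state up/down model:

```python
import random

def sample_accumulated_reward(exit_rates, rewards, jump_probs, t_end, rng):
    """One sample of accumulated reward Y(t_end) in a Markov reward model:
    simulate the chain, adding reward_rate * dwell_time in each state."""
    state, t, y = 0, 0.0, 0.0
    while t < t_end:
        dwell = rng.expovariate(exit_rates[state])
        dwell = min(dwell, t_end - t)
        y += rewards[state] * dwell
        t += dwell
        if t < t_end:
            state = rng.choices(range(len(jump_probs[state])),
                                weights=jump_probs[state])[0]
    return y

# Two-state up/down model: reward 1 while up (state 0), 0 while down.
exit_rates = [0.1, 2.0]          # failures per hour, repairs per hour
jump_probs = [[0, 1], [1, 0]]    # deterministic jumps between the states
rng = random.Random(42)
samples = [sample_accumulated_reward(exit_rates, [1.0, 0.0], jump_probs,
                                     100.0, rng) for _ in range(5000)]
print("P(at least 95 h of uptime in 100 h) ~",
      sum(y >= 95.0 for y in samples) / len(samples))
```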

  14. Exploring the effect of at-risk case management compensation on hospital pay-for-performance outcomes: tools for change.

    Science.gov (United States)

    Granata, Randy L; Hamilton, Karen

    2015-01-01

    Acute care nurse case managers are charged with compliance oversight, managing throughput, and ensuring safe care transitions. Leveraging the roles of nurse case managers and social workers during care transitions translates into improved fiscal performance under the Affordable Care Act. This article aims to equip leaders in the field of case management with tools to facilitate the alignment of case management systems with hospital pay-for-performance measures. A quality improvement project was implemented at a hospital in south Alabama to examine the question: for acute care case managers, what is the effect of key performance indicators used in an at-risk compensation model, in comparison with past nonincentive models, on hospital readmissions, lengths of stay, and patient satisfaction surrounding the discharge process? Inpatient acute care hospital. The implementation of an at-risk compensation model using key performance indicators, Lean Six Sigma methodology, and Creative Health Care Management's Relationship-Based Care framework demonstrated reduced lengths of stay and hospital readmissions and improved patient experiences. Regulatory changes and new models of reimbursement in the acute care environment have created the perfect storm for case management leaders. Hospital fiscal performance is dependent on effective case management processes and the ability to optimize scarce resources. The quality improvement project aimed to further align case management systems and structures with hospital pay-for-performance measures. Tools for change were presented to assist leaders with the change acceleration process.

  15. 9th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Hilbrich, Tobias; Niethammer, Christoph; Gracia, José; Nagel, Wolfgang; Resch, Michael

    2016-01-01

    High Performance Computing (HPC) remains a driver that offers huge potentials and benefits for science and society. However, a profound understanding of the computational matters and specialized software is needed to arrive at effective and efficient simulations. Dedicated software tools are important parts of the HPC software landscape, and support application developers. Even though a tool is by definition not a part of an application, but rather a supplemental piece of software, it can make a fundamental difference during the development of an application. Such tools aid application developers in the context of debugging, performance analysis, and code optimization, and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 9th International Parallel Tools Workshop held in Dresden, Germany, September 2-3, 2015, which offered an established forum for discussing the latest advances in paral...

  16. Processes, Performance Drivers and ICT Tools in Human Resources Management

    Directory of Open Access Journals (Sweden)

    Oškrdal Václav

    2011-06-01

    Full Text Available This article presents an insight into processes, performance drivers and ICT tools in the human resources (HR) management area. On the basis of a modern approach to HR management, a set of business processes that are handled by today’s HR managers is defined. Consequently, the concept of ICT-supported performance drivers and their relevance in the area of HR management, as well as the relationship between HR business processes, performance drivers and ICT tools, are defined. The theoretical outcomes are further enhanced with results obtained from a survey among Czech companies. This article was written with the kind support of funding provided by VŠE IGA grant „IGA – 32/2010“.

  17. Model Checking Markov Chains: Techniques and Tools

    NARCIS (Netherlands)

    Zapreev, I.S.

    2008-01-01

    This dissertation deals with four important aspects of model checking Markov chains: the development of efficient model-checking tools, the improvement of model-checking algorithms, the efficiency of the state-space reduction techniques, and the development of simulation-based model-checking

  18. Using high-performance mathematical modelling tools to predict erosion and sediment fluxes in peri-urban catchments

    Science.gov (United States)

    Pereira, André; Conde, Daniel; Ferreira, Carla S. S.; Walsh, Rory; Ferreira, Rui M. L.

    2017-04-01

    Deforestation and urbanization generally lead to increased soil erosion, through the indirect effect of increased overland flow and peak flood discharges. Mathematical modelling tools can be helpful for predicting the spatial distribution of erosion and the morphological changes in the channel network. This is especially useful for predicting the impacts of land-use changes in parts of the watershed, namely due to urbanization. However, given the size of the computational domain (normally the watershed itself), the need for high spatial resolution data to model sediment transport processes accurately, and the possible need to model transcritical flows, the computational cost is high and requires high-performance computing techniques. The aim of this work is to present the latest developments of the hydrodynamic and morphological model STAV2D and its applicability to predicting runoff and erosion at the watershed scale. STAV2D was developed at CEris - Instituto Superior Técnico, Universidade de Lisboa - as a tool particularly appropriate for modelling strong transient flows in complex and dynamic geometries. It is based on an explicit, first-order 2DH finite-volume discretization scheme for unstructured triangular meshes, in which a flux-splitting technique is paired with a reviewed Roe-Riemann solver, yielding a model applicable to discontinuous flows over time-evolving geometries. STAV2D features solid transport in both Eulerian and Lagrangian forms, with the aim of describing the transport of fine natural sediments as well as of large individual debris. The model has been validated with theoretical solutions and laboratory experiments (Canelas et al., 2013 & Conde et al., 2015). STAV-2D now supports fully distributed and heterogeneous simulations where multiple different hardware devices can be used to accelerate computation time within a unified Object-Oriented approach: the source code for CPU and GPU has the same compilation units and requires no device-specific branches, like

  19. A CONCEPTUAL TOOL FOR ASSESSING CLIENT PERFORMANCE IN THE CONSTRUCTION PROJECT COALITION

    Directory of Open Access Journals (Sweden)

    Gary D. Holt

    2002-01-01

    Full Text Available Due to the significant impact of client performance on overall project performance and the interdependence of participants' performance in the construction project coalition (i.e. clients, designers and constructors), there is a need to establish client performance measures. Based on data collected from in-depth interviews with nineteen UK architects and nine UK contractors, a generic tool for the on-going formal assessment of client performance is presented. It was found that this approach to performance assessment (i.e. from the viewpoint of other, non-client coalition participants) should lead to improved project relationships. Data analysis showed that in addition to 'harder' measures such as understanding of project requirements and finance, other, 'softer' measures of client performance (e.g. attitude) were worthy of consideration since they determine the quality of participant relationships. It is recommended that the tool be used to promote more effective client performance and thus enhance coalition relationships, enabling continuous improvement. The ultimate aim is to develop similar tools for the assessment of all coalition participants based on a culture of openness and trust.

  20. Performance Analysis Tool for HPC and Big Data Applications on Scientific Clusters

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Wucherl [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Koo, Michelle [Univ. of California, Berkeley, CA (United States); Cao, Yu [California Inst. of Technology (CalTech), Pasadena, CA (United States); Sim, Alex [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Univ. of California, Berkeley, CA (United States); Wu, Kesheng [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-09-17

    Big data is prevalent in HPC computing. Many HPC projects rely on complex workflows to analyze terabytes or petabytes of data. These workflows often require running over thousands of CPU cores and performing simultaneous data accesses, data movements, and computation. It is challenging to analyze the performance involving terabytes or petabytes of workflow data or measurement data of the executions, from complex workflows over a large number of nodes and multiple parallel task executions. To help identify performance bottlenecks or debug performance issues in large-scale scientific applications and scientific clusters, we have developed a performance analysis framework using state-of-the-art open-source big data processing tools. Our tool can ingest system logs and application performance measurements to extract key performance features, and apply sophisticated statistical tools and data mining methods on the performance data. It utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of the big data analysis framework, we conduct case studies on the workflows from an astronomy project known as the Palomar Transient Factory (PTF) and the job logs from a genome analysis scientific cluster. Our study processed many terabytes of system logs and application performance measurements collected on the HPC systems at NERSC. The implementation of our tool is generic enough to be used for analyzing the performance of other HPC systems and Big Data workflows.
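
    At its core, such a framework turns raw logs into per-task and per-node performance features. A minimal pandas sketch with an invented job-log schema (the real framework ingests far larger and messier inputs):

```python
import pandas as pd

# Hypothetical job-log schema: one row per task execution.
logs = pd.DataFrame({
    "job":   ["wf1", "wf1", "wf1", "wf2", "wf2"],
    "task":  ["stage", "compute", "stage", "compute", "stage"],
    "node":  ["n01", "n01", "n02", "n03", "n03"],
    "start": pd.to_datetime(["2016-01-01 00:00", "2016-01-01 00:05",
                             "2016-01-01 00:06", "2016-01-01 01:00",
                             "2016-01-01 01:20"]),
    "end":   pd.to_datetime(["2016-01-01 00:04", "2016-01-01 00:55",
                             "2016-01-01 00:30", "2016-01-01 01:19",
                             "2016-01-01 01:25"]),
})
logs["duration_s"] = (logs["end"] - logs["start"]).dt.total_seconds()

# Extracted features: runtime statistics per task type, plus the nodes
# that accumulate the most busy time -- the usual bottleneck suspects.
by_task = logs.groupby("task")["duration_s"].agg(["count", "mean", "max"])
by_node = logs.groupby("node")["duration_s"].sum().sort_values(ascending=False)
print(by_task, by_node, sep="\n\n")
```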

  1. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed
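
    For contrast with the automated, code-differentiating systems described above (GRESS, EXAP), the brute-force baseline is central-difference sensitivity estimation, sketched here with an invented toy response function:

```python
def sensitivities(model, params, rel_step=1e-4):
    """Central-difference sensitivities dR/dp for each parameter.
    Automated systems differentiate the code itself; this is the
    conventional, more expensive perturbation approach."""
    base = model(params)
    sens = {}
    for name, value in params.items():
        h = rel_step * (abs(value) or 1.0)
        up = dict(params, **{name: value + h})
        down = dict(params, **{name: value - h})
        sens[name] = (model(up) - model(down)) / (2.0 * h)
    return base, sens

# Toy response: dose rate as a function of release, transfer and dilution.
dose = lambda p: p["release"] * p["transfer"] / p["dilution"]
base, s = sensitivities(dose, {"release": 2.0, "transfer": 0.5,
                               "dilution": 10.0})
print(base, s)   # the sensitivities rank which parameters matter most
```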

  2. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...

  3. Prototype road weather performance management tool : installation instructions & user manual.

    Science.gov (United States)

    2016-07-20

    This document is the Installation Instructions and User Manual for the Road Weather Performance Management (RW-PM) Tool developed for the project on Development and Demonstration of a Prototype Road Weather Performance Management Application that Use...

  4. Evaluating hydrological model performance using information theory-based metrics

    Science.gov (United States)

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to use information theory-based metrics to see whether they can be used as a complementary tool for hydrologic m...
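
    One example of an information theory-based metric is the Kullback-Leibler divergence between observed and simulated flow distributions. A minimal sketch with synthetic streamflow data (the work itself may use different metrics and estimators):

```python
import numpy as np

def kl_divergence(obs, sim, bins=20):
    """Discrete KL divergence KL(P_obs || P_sim) between binned flow
    distributions; an information-theoretic complement to RMSE or NSE."""
    lo = min(obs.min(), sim.min())
    hi = max(obs.max(), sim.max())
    p, edges = np.histogram(obs, bins=bins, range=(lo, hi))
    q, _ = np.histogram(sim, bins=edges)
    p = p.astype(float) + 1e-9      # smooth empty bins to avoid log(0)
    q = q.astype(float) + 1e-9
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(1)
observed = rng.gamma(2.0, 5.0, 1000)      # synthetic "measured" streamflow
simulated = rng.gamma(2.2, 4.5, 1000)     # synthetic model output
print(f"KL(obs || sim) = {kl_divergence(observed, simulated):.4f} nats")
```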

  5. Basic performance metrics of in-line inspection tools

    Energy Technology Data Exchange (ETDEWEB)

    Timashev, Sviatoslav A. [Russian Academy of Sciences (Russian Federation). Ural Branch. Science and Engineering Center

    2003-07-01

    The paper discusses current possibilities and drawbacks of in-line inspection (ILI) in detecting, identifying, locating and sizing all types of defects in oil and gas pipelines. A full set of consistent and universal ILI tool performance metrics is constructed. A holistic methodology that extracts maximum value from the ILI measurements in defect detecting, locating, identifying, sizing and verifying the results of ILI is presented. The outlined approach is being implemented as a software component of a multi-purpose HR MFL ILI tool and is proposed for the new API 1163 ILI Qualification Standard. (author)
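
    Typical metrics of the kind discussed include probability of detection (POD) and sizing accuracy, estimated against field-verified defect data. A minimal sketch with invented depths, detection threshold, and tolerance:

```python
import numpy as np

def ili_metrics(true_depths, reported_depths, detect_thresh=10.0, tol=10.0):
    """Basic ILI performance metrics from a verification dig set.
    Depths are in % of wall thickness; NaN in the report means a miss."""
    t = np.asarray(true_depths, dtype=float)
    r = np.asarray(reported_depths, dtype=float)
    reportable = t >= detect_thresh
    detected = reportable & ~np.isnan(r)
    pod = detected.sum() / reportable.sum()
    err = r[detected] - t[detected]
    within_tol = np.abs(err) <= tol
    return pod, within_tol.mean(), err.mean()

pod, sized_ok, bias = ili_metrics([12, 25, 40, 18, 55],
                                  [10, 22, np.nan, 21, 60])
print(f"POD = {pod:.2f}, sized within tolerance = {sized_ok:.2f}, "
      f"mean bias = {bias:+.1f} %wt")
```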

  6. Prototype road weather performance management tool : project report : draft report.

    Science.gov (United States)

    2016-09-30

    This report is the Project Report for the Road Weather Performance Management (RW-PM) Tool developed for the project on Development and Demonstration of a Prototype Road Weather Performance Management Application that Uses Connected Vehicle Data (RW-...

  7. A comparison of tools for modeling freshwater ecosystem services.

    Science.gov (United States)

    Vigerstol, Kari L; Aukema, Juliann E

    2011-10-01

    Interest in ecosystem services has grown tremendously among a wide range of sectors, including government agencies, NGO's and the business community. Ecosystem services entailing freshwater (e.g. flood control, the provision of hydropower, and water supply), as well as carbon storage and sequestration, have received the greatest attention in both scientific and on-the-ground applications. Given the newness of the field and the variety of tools for predicting water-based services, it is difficult to know which tools to use for different questions. There are two types of freshwater-related tools--traditional hydrologic tools and newer ecosystem services tools. Here we review two of the most prominent tools of each type and their possible applications. In particular, we compare the data requirements, ease of use, questions addressed, and interpretability of results among the models. We discuss the strengths, challenges and most appropriate applications of the different models. Traditional hydrological tools provide more detail whereas ecosystem services tools tend to be more accessible to non-experts and can provide a good general picture of these ecosystem services. We also suggest gaps in the modeling toolbox that would provide the greatest advances by improving existing tools. Copyright © 2011 Elsevier Ltd. All rights reserved.

  8. Econometric model as a regulatory tool in electricity distribution. Case network performance assessment model

    International Nuclear Information System (INIS)

    Honkapuro, S.; Lassila, J.; Viljainen, S.; Tahvanainen, K.; Partanen, J.

    2004-01-01

    Electricity distribution companies operate as natural monopolies, since the building of parallel networks is not cost-effective. Monopoly companies do not have pressure from open markets to keep their prices and costs at a reasonable level. The regulation of these companies is needed to prevent misuse of the monopoly position. Regulation is usually focused either on the profit of the company or on the price of electricity. A regulation method which determines the allowed income for each company with a generic computation model can be seen as an econometric model. In this document, the usability of an econometric model in the regulation of electricity distribution companies is evaluated. As a special case of an econometric model, the method called Network Performance Assessment Model, NPAM (Naetnyttomodellen in Swedish), is analysed. NPAM was developed by the Swedish Energy Agency (STEM) for the regulation of electricity distribution companies. Both a theoretical analysis and calculations for an example network area are presented in this document to find the major directing effects of the model. The parameters of NPAM used in the calculations of this research report were dated 30th of March 2004; these were the most recent ones available at the time the analysis was done. However, since NPAM has been under development, the parameters have been constantly changing. Therefore slight changes might occur in the numerical results of the calculations if they were made with the latest set of parameters; the main conclusions, however, are the same and do not depend on the exact parameters

  9. Green Infrastructure Models and Tools

    Science.gov (United States)

    The objective of this project is to modify and refine existing models and develop new tools to support decision making for the complete green infrastructure (GI) project lifecycle, including the planning and implementation of stormwater control in urban and agricultural settings,...

  10. ExEP yield modeling tool and validation test results

    Science.gov (United States)

    Morgan, Rhonda; Turmon, Michael; Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Liu, Xiang Cate; Nunez, Paul

    2017-09-01

    EXOSIMS is an open-source simulation tool for parametric modeling of the detection yield and characterization of exoplanets. EXOSIMS has been adopted by the Exoplanet Exploration Program's Standards Definition and Evaluation Team (ExSDET) as a common mechanism for comparison of exoplanet mission concept studies. To ensure trustworthiness of the tool, we developed a validation test plan that leverages the Python-language unit-test framework, utilizes integration tests for selected module interactions, and performs end-to-end cross-validation with other yield tools. This paper presents the test methods and results, with the physics-based tests such as photometry and integration time calculation treated in detail and the functional tests treated summarily. The test case utilized a 4 m unobscured telescope with an idealized coronagraph and an exoplanet population from the IPAC radial velocity (RV) exoplanet catalog. The known RV planets were set at quadrature to allow deterministic validation of the calculation of physical parameters, such as working angle, photon counts and integration time. The observing keepout region was tested by generating plots and movies of the targets and the keepout zone over a year. Although the keepout integration test required interpretation by a user, the test revealed problems in the L2 halo orbit and the parameterization of keepout applied to some solar system bodies, which the development team was able to address. The validation testing of EXOSIMS was performed iteratively with the developers of EXOSIMS and resulted in a more robust, stable, and trustworthy tool that the exoplanet community can use to simulate exoplanet direct-detection missions from probe class, to WFIRST, up to large mission concepts such as HabEx and LUVOIR.
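
    The deterministic, physics-based style of test described above can be illustrated with Python's unittest framework. The integration-time function below is a hypothetical shot-noise stand-in, not EXOSIMS code; only the testing pattern is the point.

```python
import unittest

def integration_time(flux_planet, flux_background, snr_target, throughput=0.5):
    """Hypothetical photometry routine: time to reach a target SNR under
    Poisson statistics, SNR = S*t / sqrt((S + B)*t), so
    t = SNR^2 * (S + B) / S^2 with S, B in detected photons per second."""
    s = flux_planet * throughput
    b = flux_background * throughput
    return snr_target ** 2 * (s + b) / s ** 2

class TestPhotometry(unittest.TestCase):
    """Deterministic checks in the spirit of the EXOSIMS test plan:
    known inputs must reproduce hand-computed integration times."""

    def test_shot_noise_limit(self):
        # With no background and unit throughput, t = SNR^2 / S exactly.
        self.assertAlmostEqual(integration_time(4.0, 0.0, 10.0, 1.0), 25.0)

    def test_background_penalty(self):
        # Adding background can only increase the required time.
        self.assertGreater(integration_time(4.0, 2.0, 10.0),
                           integration_time(4.0, 0.0, 10.0))

if __name__ == "__main__":
    unittest.main()
```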

  11. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work-flows. The use of model based methods and tools within a computer-aided framework for product-process synthesis-design will be highlighted.

  12. DETRA: Model description and evaluation of model performance

    International Nuclear Information System (INIS)

    Suolanen, V.

    1996-01-01

    The computer code DETRA is a generic tool for environmental transfer analyses of radioactive or stable substances. The code has been applied for various purposes, mainly problems related to the biospheric transfer of radionuclides both in safety analyses of disposal of nuclear wastes and in consideration of foodchain exposure pathways in the analyses of off-site consequences of reactor accidents. For each specific application an individually tailored conceptual model can be developed. The biospheric transfer analyses performed by the code are typically carried out for terrestrial, aquatic and food chain applications. 21 refs, 35 figs, 15 tabs

  13. The european Trans-Tools transport model

    NARCIS (Netherlands)

    Rooijen, T. van; Burgess, A.

    2008-01-01

    The paper presents the use of ArcGIS in the Transtools Transport Model, TRANS-TOOLS, created by an international consortium for the European Commission. The model describes passenger as well as freight transport in Europe with all medium and long distance modes (cars, vans, trucks, train, inland

  14. Modeling Tools for Drilling, Reservoir Navigation, and Formation Evaluation

    Directory of Open Access Journals (Sweden)

    Sushant Dutta

    2012-06-01

    Full Text Available The oil and gas industry routinely uses borehole tools for measuring or logging rock and fluid properties of geologic formations to locate hydrocarbons and maximize their production. Pore fluids in formations of interest are usually hydrocarbons or water. Resistivity logging is based on the fact that oil and gas have a substantially higher resistivity than water. The first resistivity log was acquired in 1927, and resistivity logging is still the foremost measurement used for drilling and evaluation. However, the acquisition and interpretation of resistivity logging data has grown in complexity over the years. Resistivity logging tools operate in a wide range of frequencies (from DC to GHz) and encounter extremely high (several orders of magnitude) conductivity contrast between the metal mandrel of the tool and the geologic formation. Typical challenges include arbitrary angles of tool inclination, full tensor electric and magnetic field measurements, and interpretation of complicated anisotropic formation properties. These challenges combine to form some of the most intractable computational electromagnetic problems in the world. Reliable, fast, and convenient numerical modeling of logging tool responses is critical for tool design, sensor optimization, virtual prototyping, and log data inversion. This spectrum of applications necessitates both depth and breadth of modeling software—from blazing fast one-dimensional (1-D) modeling codes to advanced three-dimensional (3-D) modeling software, and from in-house developed codes to commercial modeling packages. In this paper, with the help of several examples, we demonstrate our approach for using different modeling software to address different drilling and evaluation applications. In one example, fast 1-D modeling provides proactive geosteering information from a deep-reading azimuthal propagation resistivity measurement. In the second example, a 3-D model with multiple vertical resistive fractures

  15. Generating Performance Models for Irregular Applications

    Energy Technology Data Exchange (ETDEWEB)

    Friese, Ryan D.; Tallent, Nathan R.; Vishnu, Abhinav; Kerbyson, Darren J.; Hoisie, Adolfy

    2017-05-30

    Many applications have irregular behavior --- non-uniform input data, input-dependent solvers, irregular memory accesses, unbiased branches --- that cannot be captured using today's automated performance modeling techniques. We describe new hierarchical critical path analyses for the Palm model generation tool. To create a model's structure, we capture tasks along representative MPI critical paths. We create a histogram of critical tasks with parameterized task arguments and instance counts. To model each task, we identify hot instruction-level sub-paths and model each sub-path based on data flow, instruction scheduling, and data locality. We describe application models that generate accurate predictions for strong scaling when varying CPU speed, cache speed, memory speed, and architecture. We present results for the Sweep3D neutron transport benchmark; PageRank on multiple graphs; Support Vector Machine with pruning; and PFLOTRAN's reactive flow/transport solver with domain-induced load imbalance.
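
    The histogram-of-critical-tasks step has a small core, sketched below with an invented task trace; the actual Palm tool instruments real MPI executions and attaches per-task cost models to these counts.

```python
from collections import Counter

# Hypothetical trace: (task name, parameterized-argument signature) pairs
# recorded along an MPI critical path.
critical_path = [
    ("sweep", ("nx=32", "ny=32")), ("mpi_recv", ("bytes=8192",)),
    ("sweep", ("nx=32", "ny=32")), ("solve", ("cells=1e6",)),
    ("sweep", ("nx=16", "ny=32")), ("mpi_recv", ("bytes=8192",)),
]

# Histogram of critical tasks keyed by (name, arguments): the structure
# from which per-task sub-models are built and then combined.
histogram = Counter(critical_path)
for (task, args), count in histogram.most_common():
    print(f"{task}{args}: {count} instance(s) on the critical path")

# A whole-run prediction is then the sum of per-task cost models
# weighted by these instance counts (cost models omitted here).
```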

  16. Tool-Body Assimilation Model Based on Body Babbling and Neurodynamical System

    Directory of Open Access Journals (Sweden)

    Kuniyuki Takahashi

    2015-01-01

    Full Text Available We propose a new method of tool use based on a tool-body assimilation model that builds on body babbling and a neurodynamical system, enabling robots to use tools. Almost all existing studies of robot tool use require predetermined motions and tool features; the motion patterns are limited and the robots cannot use novel tools. Other studies fully search all available parameters for novel tools, but this leads to massive amounts of calculation. To solve these problems, we took the following approach: we used a humanoid robot model to generate random motions based on human body babbling. These rich motion experiences were used to train recurrent and deep neural networks for modeling a body image. Tool features were self-organized in parametric bias, modulating the body image according to the tool in use. Finally, we designed a neural network for the robot to generate motion only from the target image. Experiments were conducted with multiple tools for manipulating a cylindrical target object. The results show that the tool-body assimilation model is capable of motion generation.

  17. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for the study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied

  18. Open source tools for ATR development and performance evaluation

    Science.gov (United States)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools; should I buy off-the-shelf tools or should I develop my own. Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and to maintain licenses, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.

  19. Storm Water Management Model Climate Adjustment Tool (SWMM-CAT)

    Science.gov (United States)

    The US EPA’s newest tool, the Stormwater Management Model (SWMM) – Climate Adjustment Tool (CAT), is meant to help municipal stormwater utilities better address potential climate change impacts affecting their operations. SWMM, first released in 1971, models hydrology and hydrauli...
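
    The adjustment idea itself is easy to sketch: monthly multiplicative factors applied to a precipitation record. The factors below are invented placeholders, not SWMM-CAT's location-specific values derived from downscaled climate projections:

```python
import calendar

# Hypothetical monthly adjustment factors (multiplicative).
ADJUST = {1: 1.05, 2: 1.05, 3: 1.02, 4: 1.00, 5: 0.98, 6: 0.95,
          7: 0.93, 8: 0.93, 9: 0.97, 10: 1.00, 11: 1.03, 12: 1.05}

def adjust_precip(series):
    """series: list of (month, rainfall depth in mm) records."""
    return [(month, depth * ADJUST[month]) for month, depth in series]

record = [(1, 12.0), (7, 30.0), (12, 8.5)]
for month, depth in adjust_precip(record):
    print(calendar.month_abbr[month], f"{depth:.1f} mm")
```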

  20. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept centres on automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis

  1. cmpXLatt: Westinghouse automated testing tool for nodal cross section models

    International Nuclear Information System (INIS)

    Guimaraes, Petri Forslund; Rönnberg, Kristian

    2011-01-01

    The procedure for evaluating the merits of different nodal cross section representation models is normally both cumbersome and time consuming, and includes many manual steps when preparing appropriate benchmark problems. Therefore, a computer tool called cmpXLatt has been developed at Westinghouse in order to facilitate the process of performing comparisons between nodal diffusion theory results and corresponding transport theory results on a single node basis. Due to the large number of state points that can be evaluated by cmpXLatt, a systematic and comprehensive way of performing verification and validation of nodal cross section models is provided. This paper presents the main features of cmpXLatt and demonstrates the benefits of using cmpXLatt in a real life application. (author)

  2. Surface modeling of workpiece and tool trajectory planning for spray painting robot.

    Directory of Open Access Journals (Sweden)

    Yang Tang

    Full Text Available Automated tool trajectory planning for spray-painting robots is still a challenging problem, especially for a large free-form surface. A grid approximation of a free-form surface is adopted in CAD modeling in this paper. A free-form surface model is approximated by a set of flat patches. We describe here an efficient and flexible tool trajectory optimization scheme using T-Bézier curves calculated in a new way from trigonometrical bases. The distance between the spray gun and the free-form surface along the normal vector is varied. Automotive body parts, which are large free-form surfaces, are used to test the scheme. The experimental results show that the trajectory planning algorithm achieves satisfactory performance. This algorithm can also be extended to other applications.
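
    The record's T-Bézier curves are built from trigonometric bases; as a simpler stand-in for the general curve-based gun-path idea, the sketch below evaluates an ordinary cubic Bézier segment with De Casteljau's algorithm, using made-up control points held at a constant gun-to-surface distance.

```python
import numpy as np

def de_casteljau(ctrl, t):
    """Evaluate a Bezier curve with control points ctrl (n x 3) at t in [0, 1]."""
    pts = np.asarray(ctrl, dtype=float)
    while len(pts) > 1:
        pts = (1.0 - t) * pts[:-1] + t * pts[1:]   # repeated linear interpolation
    return pts[0]

# Four control points lifted a constant gun-to-surface distance above a flat patch.
ctrl = [[0.0, 0.0, 0.25], [0.3, 0.1, 0.25], [0.7, 0.1, 0.25], [1.0, 0.0, 0.25]]
path = np.array([de_casteljau(ctrl, t) for t in np.linspace(0.0, 1.0, 50)])
print(path[:3])   # first few gun positions along the segment
```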

  3. Tool to assess teamwork performance in higher education

    Directory of Open Access Journals (Sweden)

    Carmen Jaca García

    2013-01-01

    Full Text Available Purpose: In this article, we present a tool for assessing the performance of teamwork activities. The results obtained from this tool can be used to guide teamwork learning and mentor students by giving them feedback on their work. Design/methodology/approach: The tool was based on a model of teamwork effectiveness from the literature. In this sense, analyzing a series of activities performed by the same team can allow us to know whether the students involved in these activities are acquiring the ability to effectively engage in teamwork. Findings: The tool was used with 53 teams in a Bachelor’s Degree Programme in Industrial Engineering. In this case, the results were used to analyze whether the proposed tool is suitable for assessing teamwork effectiveness in the university context. Practical implications: This tool will serve to provide feedback to students about the acquisition of their teamwork competency throughout the course of a Bachelor’s Degree. Social implications: Teamwork is a professional ability required by a great number of organizations, and universities should incorporate the development and assessment of teamwork skills into their degree programmes. Originality/value: Developing this competence requires different social and personal skills, which are very difficult for students to acquire only by engaging in teamwork-building activities. The crucial step of providing feedback...

  4. Integrating decision management with UML modeling concepts and tools

    DEFF Research Database (Denmark)

    Könemann, Patrick

    2009-01-01

    … but also for guiding the user by proposing subsequent decisions. In model-based software development, many decisions directly affect the structural and behavioral models used to describe and develop a software system and its architecture. However, the decisions are typically not connected to these models … of formerly disconnected tools could improve tool usability as well as decision maker productivity.

  5. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  6. Spatial Modeling Tools for Cell Biology

    National Research Council Canada - National Science Library

    Przekwas, Andrzej; Friend, Tom; Teixeira, Rodrigo; Chen, Z. J; Wilkerson, Patrick

    2006-01-01

    … Scientific potentials and military relevance of computational biology and bioinformatics have inspired DARPA/IPTO's visionary BioSPICE project to develop a computational framework and modeling tools for cell biology...

  7. Force Sensor Based Tool Condition Monitoring Using a Heterogeneous Ensemble Learning Model

    Directory of Open Access Journals (Sweden)

    Guofeng Wang

    2014-11-01

    Full Text Available Tool condition monitoring (TCM) plays an important role in improving machining efficiency and guaranteeing workpiece quality. In order to realize reliable recognition of the tool condition, a robust classifier needs to be constructed to depict the relationship between tool wear states and sensory information. However, because of the complexity of the machining process and the uncertainty of the tool wear evolution, it is hard for a single classifier to fit all the collected samples without sacrificing generalization ability. In this paper, heterogeneous ensemble learning is proposed to realize tool condition monitoring, in which the support vector machine (SVM), hidden Markov model (HMM) and radial basis function (RBF) are selected as base classifiers and a stacking ensemble strategy is further used to reflect the relationship between the outputs of these base classifiers and tool wear states. Based on the heterogeneous ensemble learning classifier, an online monitoring system is constructed in which the harmonic features are extracted from force signals and a minimal redundancy and maximal relevance (mRMR) algorithm is utilized to select the most prominent features. To verify the effectiveness of the proposed method, a titanium alloy milling experiment was carried out and samples with different tool wear states were collected to build the proposed heterogeneous ensemble learning classifier. Moreover, the homogeneous ensemble learning model and majority voting strategy are also adopted to make a comparison. The analysis and comparison results show that the proposed heterogeneous ensemble learning classifier performs better in both classification accuracy and stability.
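
    A rough scikit-learn analogue of the stacking idea (not the authors' implementation) is sketched below. scikit-learn offers no HMM classifier, so an MLP stands in for it, a mutual-information filter stands in for mRMR, and synthetic data replaces the force-signal harmonic features.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in for harmonic features labelled with three wear states.
X, y = make_classification(n_samples=300, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)

base = [
    ("svm", SVC(kernel="rbf", probability=True)),
    ("mlp", MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000)),  # HMM stand-in
]
clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=10),   # mRMR-like relevance filter
    StackingClassifier(estimators=base, final_estimator=LogisticRegression()),
)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```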

  8. Fish habitat simulation models and integrated assessment tools

    International Nuclear Information System (INIS)

    Harby, A.; Alfredsen, K.

    1999-01-01

    Because of human development, water use is increasing in importance, and this worldwide trend is leading to an increasing number of user conflicts, with a strong need for assessment tools to measure the impacts both on the ecosystem and on the different users and user groups. The quantitative tools must allow a comparison of alternatives, different user groups, etc., and the tools must be integrated, as impact assessments include different disciplines. Fish species, especially young ones, are indicators of the environmental state of a riverine system, and monitoring them is a way to follow environmental changes. The direct and indirect impacts on the ecosystem itself are measured; impacts on user groups are not included. The paper concentrates on fish habitat simulation models, with methods and examples from Norway, and includes some ideas on integrated modelling tools for impact assessment studies. One-dimensional hydraulic models are rapidly calibrated and do not require any expert knowledge in hydraulics. Two- and three-dimensional models require somewhat more skilled users, especially if the topography is very heterogeneous. The advantages of using two- and three-dimensional models include: they do not need any calibration, just validation; they are predictive; and they can be more cost-effective than traditional habitat hydraulic models when combined with modern data acquisition systems and tailored in a multi-disciplinary study. Model choice should be based on available data and possible data acquisition, available manpower, computer and software resources, and the needed output and accuracy. 58 refs

  9. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping

    2015-01-01

    Background: Biospecimens are essential resources for advancing basic and translational research. However, there is little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. Methods: To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. Results: A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. Conclusion: A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting. PMID:26697911

  10. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability.

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping; Moore, Helen M

    2015-12-01

    Biospecimens are essential resources for advancing basic and translational research. However, there is little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting.
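
    The kind of arithmetic behind such a cost profile can be illustrated with a hypothetical per-unit cost-recovery calculation; the category names and numbers below are invented and are not taken from the BEMT.

```python
def cost_recovery_fee(direct_cost, indirect_rate, percent_recovered, units):
    """Per-unit fee needed to recover a chosen share of fully loaded costs."""
    fully_loaded = direct_cost * (1.0 + indirect_rate)
    return fully_loaded * percent_recovered / units

# e.g. $120k direct costs, 30% indirect rate, recover 60% across 5,000 vials
print(round(cost_recovery_fee(120_000, 0.30, 0.60, 5_000), 2))  # fee per vial
```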

  11. The relationship between quality management practices and organisational performance: A structural equation modelling approach

    Science.gov (United States)

    Jamaluddin, Z.; Razali, A. M.; Mustafa, Z.

    2015-02-01

    The purpose of this paper is to examine the relationship between quality management practices (QMPs) and organisational performance in the Malaysian manufacturing industry. A QMP and organisational performance framework is developed from a comprehensive literature review covering both hard and soft quality factors in the manufacturing process environment. A total of 11 hypotheses were put forward to test the relationships amongst the six constructs: management commitment, training, process management, quality tools, continuous improvement and organisational performance. The model is analysed using Structural Equation Modeling (SEM) with AMOS software version 18.0, using Maximum Likelihood (ML) estimation. A total of 480 questionnaires were distributed, of which 210 were valid for analysis. The results of the modeling analysis indicate that the fit statistics of the QMP and organisational performance model for the manufacturing industry are admissible. Management commitment was found to have a significant impact on training and process management. Similarly, training had a significant effect on quality tools, process management and continuous improvement. Furthermore, quality tools have a significant influence on process management and continuous improvement, and process management likewise has a significant impact on continuous improvement. In turn, continuous improvement significantly influences organisational performance. However, the results also show no significant relationship between management commitment and quality tools, or between management commitment and continuous improvement. The results of the study can be used by managers to prioritize the implementation of QMPs. For instance, those practices that are found to have a positive impact on organisational performance can be recommended to
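
    The hypothesized path structure can be written compactly in lavaan-style syntax. The sketch below fits it with the semopy package as a stand-in for AMOS (an assumption on tooling); the construct names are shorthand and the data frame is a random placeholder, not the survey data.

```python
import numpy as np
import pandas as pd
from semopy import Model

# Shorthand for the six constructs and the hypothesized paths.
desc = """
training ~ commitment
tools    ~ training
process  ~ commitment + training + tools
improve  ~ training + tools + process
perform  ~ improve
"""

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(210, 6)),   # 210 = number of valid responses
                  columns=["commitment", "training", "process",
                           "tools", "improve", "perform"])
m = Model(desc)
m.fit(df)            # maximum-likelihood estimation by default
print(m.inspect())   # path estimates and p-values
```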

  12. Modeling and Performance Considerations for Automated Fault Isolation in Complex Systems

    Science.gov (United States)

    Ferrell, Bob; Oostdyk, Rebecca

    2010-01-01

    The purpose of this paper is to document the modeling considerations and performance metrics that were examined in the development of a large-scale Fault Detection, Isolation and Recovery (FDIR) system. The FDIR system is envisioned to perform health management functions for both a launch vehicle and the ground systems that support the vehicle during checkout and launch countdown, by using a suite of complementary software tools that alert operators to anomalies and failures in real time. The FDIR team members developed a set of operational requirements for the models that would be used for fault isolation and worked closely with the vendor of the software tools selected for fault isolation to ensure that the software was able to meet the requirements. Once the requirements were established, example models of sufficient complexity were used to test the performance of the software. The results of the performance testing demonstrated the need for enhancements to the software in order to meet the demands of the full-scale ground and vehicle FDIR system. The paper highlights the importance of the development of operational requirements and preliminary performance testing as a strategy for identifying deficiencies in highly scalable systems and rectifying those deficiencies before they imperil the success of the project.

  13. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    Science.gov (United States)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.

  14. Model-based setup assistant for progressive tools

    Science.gov (United States)

    Springer, Robert; Gräler, Manuel; Homberg, Werner; Henke, Christian; Trächtler, Ansgar

    2018-05-01

    In the field of production systems, globalization and technological progress lead to increasing requirements regarding part quality, delivery time and costs. Hence, today's production is challenged much more than a few years ago: it has to be very flexible and produce small batch sizes economically to satisfy consumers' demands and avoid unnecessary stock. Furthermore, a trend towards increasing functional integration continues to lead to an ongoing miniaturization of sheet metal components. In the electric connectivity industry, for example, miniaturized connectors are manufactured by progressive tools, which are usually used for very large batches. These tools are installed in mechanical presses and then set up by a technician, who has to manually adjust a wide range of punch-bending operations. Disturbances like material thickness, temperatures, lubrication or tool wear complicate the setup procedure. Given the increasing demand for production flexibility, this time-consuming process has to be carried out more and more often. In this paper, a new approach for a model-based setup assistant is proposed as a solution, exemplarily applied in combination with a progressive tool. First, progressive tools, and more specifically their setup process, are described, and based on that the challenges are pointed out. As a result, a systematic process to set up the machines is introduced. Next, the process is investigated with an FE analysis regarding the effects of the disturbances. Design of experiments is then used to systematically develop a regression model of the system's behaviour. This model is integrated within an optimization in order to calculate optimal machine parameters and the subsequently necessary adjustment of the progressive tool under the disturbances. Finally, the assistant is tested in a production environment and the results are discussed.
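
    Conceptually, the assistant's last two steps pair a regression model of press behaviour with an optimizer. The sketch below, using entirely invented DoE data and variable names, fits a quadratic response surface and asks scipy for the press setting that compensates a measured disturbance.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(40, 2))   # DoE: [press setting, thickness deviation]
y = 1.5 * X[:, 0]**2 + 0.8 * X[:, 0] * X[:, 1] + X[:, 1]   # measured quality error

# Least-squares fit of a quadratic response surface to the DoE data.
A = np.column_stack([X[:, 0]**2, X[:, 0] * X[:, 1], X[:, 1], np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def quality_error(setting, thickness_dev):
    s = setting[0]
    return (coef @ np.array([s * s, s * thickness_dev, thickness_dev, 1.0])) ** 2

# Given a measured disturbance, propose the press setting minimizing the error.
res = minimize(quality_error, x0=[0.0], args=(0.2,))
print("suggested setting:", round(float(res.x[0]), 3))
```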

  15. [The Brazilian National Health Surveillance Agency performance evaluation at the management contract model].

    Science.gov (United States)

    Moreira, Elka Maltez de Miranda; Costa, Ediná Alves

    2010-11-01

    The Brazilian National Health Surveillance Agency (Anvisa) is supervised by the Ministry of Health by means of a management contract, a performance evaluation tool. This case study was aimed at describing and analyzing Anvisa's performance evaluation model based on the agency's institutional purpose, according to the following analytical categories: the management contract formalization, evaluation tools, evaluators and institutional performance. Semi-structured interviews and document analysis revealed that Anvisa signed only one management contract with the Ministry of Health in 1999, updated by four additive terms. The Collegiate Board of Directors and the Advisory Center for Strategic Management play the role of Anvisa's internal evaluators and an Assessing Committee, comprising the Ministry of Health, constitutes its external evaluator. Three phases were identified in the evaluation model: the structuring of the new management model (1999-2000), legitimation regarding the productive segment (2001-2004) and widespread legitimation (2005). The best performance was presented in 2000 (86.05%) and the worst in 2004 (40.00%). The evaluation model was shown to have contributed little towards the agency's institutional purpose and the effectiveness measurement of the implemented actions.

  16. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    § 385.33 Revisions to models and analytical tools (Navigation and Navigable Waters; Corps of Engineers, Department of the Army; Incorporating New Information Into the Plan). (a) In carrying … on a case-by-case basis what documentation is appropriate for revisions to models and analytic tools…

  17. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Directory of Open Access Journals (Sweden)

    Matthew B Biggs

    Full Text Available Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.

  18. Novel multiscale modeling tool applied to Pseudomonas aeruginosa biofilm formation.

    Science.gov (United States)

    Biggs, Matthew B; Papin, Jason A

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid model correctly recapitulates oxygen-limited biofilm metabolic activity and predicts increased growth rate via anaerobic respiration with the addition of nitrate to the growth media. In addition, a genome-wide survey of metabolic mutants and biofilm formation exemplifies the powerful analyses that are enabled by this computational modeling tool.
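
    The flavour of such a hybrid can be conveyed by a toy loop in which an agent/colony layer grows against an oxygen gradient while a stand-in function plays the role of the constraint-based metabolic model; nothing below comes from MatNet itself.

```python
import numpy as np

def growth_rate(o2, no3):
    # Stand-in for the metabolic model: aerobic growth where oxygen is
    # available, slower nitrate (anaerobic) respiration where it is depleted.
    return 0.5 * o2 / (o2 + 0.1) + 0.2 * no3 / (no3 + 0.1) * (o2 < 0.05)

depth = np.linspace(0.0, 1.0, 50)   # normalized biofilm depth
o2 = np.exp(-8.0 * depth)           # oxygen-limited interior
biomass = np.full(50, 0.01)

for _ in range(100):                # coarse colony-update loop (the "ABM" layer)
    biomass *= 1.0 + 0.1 * growth_rate(o2, no3=0.5)

print("surface vs. base biomass:", biomass[0], biomass[-1])
```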

  19. Identification of human operator performance models utilizing time series analysis

    Science.gov (United States)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.
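
    In modern terms, one common time-series route to such an operator model is a discrete-time ARX fit by least squares. The sketch below uses invented model orders and simulated tracking data, not the Sperry/AMRL experiment.

```python
import numpy as np

rng = np.random.default_rng(2)
u = rng.normal(size=500)      # tracking error presented to the operator (input)
y = np.zeros(500)             # stick deflection (output)
for k in range(2, 500):       # "true" operator dynamics used to make demo data
    y[k] = 0.6 * y[k-1] - 0.1 * y[k-2] + 0.8 * u[k-1] + 0.05 * rng.normal()

# ARX(2,2): y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2]
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print("estimated [a1, a2, b1, b2]:", np.round(theta, 3))
```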

  20. Predictions of titanium alloy properties using thermodynamic modeling tools

    Science.gov (United States)

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.

    2005-12-01

    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  1. Performance Implications of Business Model Change: A Case Study

    Directory of Open Access Journals (Sweden)

    Jana Poláková

    2015-01-01

    Full Text Available The paper deals with changes in performance level introduced by a change of business model. The selected case is a small family business undergoing substantial changes in response to structural changes in its markets. The authors use the business model concept to describe the value creation processes within the selected family business, and by contrasting these processes before and after the change they demonstrate the role of the business model as a performance differentiator. This is illustrated with business model canvases constructed on the basis of interviews, observations and document analysis. The two business model canvases allow explanation of the cause-and-effect relationships within the business that led to the change in performance. The change in performance is assessed by a financial analysis of the business over the period 2006–2012, which shows ROA, ROE and ROS at their lowest levels before the change of business model was introduced and growing after the change, with similar developments in the activity indicators of the family business. The described case study contributes to the concept of business modeling by supporting its value as a strategic tool facilitating decisions related to value creation within the business.

  2. Coaching as a Performance Improvement Tool at School

    Science.gov (United States)

    Yirci, Ramazan; Karakose, Turgut; Kocabas, Ibrahim

    2016-01-01

    The purpose of this study is to examine the current literature and gain insight into coaching as a performance improvement tool at school. In today's world, schools have to survive and keep their organizational success at the highest level because of the high expectations of school stakeholders. Taking place in such a fiercely competitive…

  3. Performance improvement studies for cutting tools with perforated surface in turning of titanium alloy

    Directory of Open Access Journals (Sweden)

    Charitha Rao

    2018-01-01

    Full Text Available In the turning process, the cutting tool is essential for shaping materials, and cutting tools with various perforated surfaces can help to increase tool life. Advances in CNC machining technologies have also enhanced the productivity of the machining process. One promising approach in modern manufacturing engineering is the use of FEM simulation for the metal cutting process. FEM simulation helps in understanding the metal deformation process and reduces the number of physical experiments required. Simulation helps researchers predict the values of the major influencing cutting variables without carrying out experiments, which are time-consuming and expensive. This research presents a simulation study of the performance of a micro-hole-patterned polycrystalline diamond (PCD) cutting insert in machining titanium alloy (Ti-6Al-4V). Micro-holes are drilled on the rake face of the PCD cutting inserts using an electrical discharge wire drilling machine. FEM analysis is carried out to evaluate the effect of the perforations on the mechanical integrity of the insert. The micro-hole-patterned insert is modeled in the PRO-E modeler and simulated using DEFORM-3D software. The effective stress, strain, and temperature distributions are analyzed and the results are compared with those of a normal insert.

  4. MUMAX: A new high-performance micromagnetic simulation tool

    International Nuclear Information System (INIS)

    Vansteenkiste, A.; Van de Wiele, B.

    2011-01-01

    We present MUMAX, a general-purpose micromagnetic simulation tool running on graphical processing units (GPUs). MUMAX is designed for high-performance computations and specifically targets large simulations. In that case speedups of over a factor of 100 can be obtained compared to the CPU-based OOMMF program developed at NIST. MUMAX aims to be general and broadly applicable. It solves the classical Landau-Lifshitz equation taking into account the magnetostatic, exchange and anisotropy interactions, thermal effects and spin-transfer torque. Periodic boundary conditions can optionally be imposed. A spatial discretization using finite differences in two or three dimensions can be employed. MUMAX is publicly available as open-source software. It can thus be freely used and extended by the community. Due to its high computational performance, MUMAX should open up the possibility of running extensive simulations that would be nearly inaccessible with typical CPU-based simulators. - Highlights: → Novel, open-source micromagnetic simulator on GPU hardware. → Speedup of ≈100× compared to other widely used tools. → Extensively validated against standard problems. → Makes previously infeasible simulations accessible.
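
    For orientation, the equation MUMAX integrates can be shown on a single macrospin. The sketch below applies plain Euler steps with renormalization to the Landau-Lifshitz(-Gilbert) form; the field, damping, and step size are arbitrary demo values.

```python
import numpy as np

gamma, alpha = 2.211e5, 0.1        # gyromagnetic ratio (m/(A*s)) and damping
H = np.array([0.0, 0.0, 8.0e4])    # applied field (A/m), along +z
m = np.array([1.0, 0.0, 0.0])      # unit magnetization, initially along +x
dt = 1e-13                         # time step (s)

for _ in range(20000):
    mxH = np.cross(m, H)
    dmdt = -gamma / (1.0 + alpha**2) * (mxH + alpha * np.cross(m, mxH))
    m = m + dt * dmdt
    m /= np.linalg.norm(m)         # keep |m| = 1
print("m after 2 ns:", np.round(m, 3))   # damped precession toward +z
```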

  5. A Spectral Evaluation of Models Performances in Mediterranean Oak Woodlands

    Science.gov (United States)

    Vargas, R.; Baldocchi, D. D.; Abramowitz, G.; Carrara, A.; Correia, A.; Kobayashi, H.; Papale, D.; Pearson, D.; Pereira, J.; Piao, S.; Rambal, S.; Sonnentag, O.

    2009-12-01

    Ecosystem processes are influenced by climatic trends at multiple temporal scales, including diel patterns and other mid-term climatic modes such as interannual and seasonal variability. Because interactions between the biophysical components of ecosystem processes are complex, it is important to test how models perform in the frequency domain (e.g. hours, days, weeks, months, years) and in the time domain (i.e. day of the year), in addition to traditional tests of annual or monthly sums. Here we present a spectral evaluation, using wavelet time series analysis, of model performance in seven Mediterranean oak woodlands that encompass three deciduous and four evergreen sites. We tested the performance of five models (CABLE, ORCHIDEE, BEPS, Biome-BGC, and JULES) on measured variables of gross primary production (GPP) and evapotranspiration (ET). In general, model performance fails at intermediate periods (e.g. weeks to months), likely because these models do not represent the water-pulse dynamics that influence GPP and ET in these Mediterranean systems. To improve the performance of a model it is critical to first identify where and when the model fails. Only by identifying where a model fails can we improve its performance, use models as prognostic tools, and generate further hypotheses that can be tested by new experiments and measurements.
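
    The spectral comparison can be sketched with the PyWavelets package on synthetic series: compute the continuous wavelet power of observed and modelled GPP and look for the periods at which the model loses power (here a deliberately omitted two-week pulse). Data, wavelet, and scales are illustrative only.

```python
import numpy as np
import pywt

t = np.arange(365)                                        # one year of daily data
obs = np.sin(2*np.pi*t/365) + 0.4*np.sin(2*np.pi*t/14)    # annual cycle + pulses
mod = np.sin(2*np.pi*t/365)                               # model misses the pulses

scales = np.arange(2, 64)
cobs, _ = pywt.cwt(obs, scales, "morl", sampling_period=1.0)
cmod, freqs = pywt.cwt(mod, scales, "morl", sampling_period=1.0)

power_gap = (np.abs(cobs)**2 - np.abs(cmod)**2).mean(axis=1)  # per-scale deficit
worst_period = 1.0 / freqs[np.argmax(power_gap)]
print(f"largest power deficit near a period of {worst_period:.0f} days")
```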

  6. Tools for the performance assessment and improvement of food safety management systems ; review

    NARCIS (Netherlands)

    Jacxsens, L.; Luning, P.A.; Marcelis, W.J.; Boekel, van M.A.J.S.; Rovira, J.; Oses Gomez, S.; Kousta, M.; Drosinos, E.H.; Jasson, V.; Uyttendaele, M.

    2011-01-01

    Food business operators are challenged to combine requirements from different stakeholders (e.g. government, retailers) into a company-specific Food Safety Management System (FSMS). Tools to diagnose the performance of an implemented FSMS (diagnostic tools), tools to help a selection process

  7. Development of a Generic Performance Measurement Model in an Emergency Department

    DEFF Research Database (Denmark)

    Sørup, Christian Michel; Lundager Forberg, Jakob

    … and the use of triage. All of the mentioned initiatives are new and not well validated to date. It would be desirable to enable measurement of each initiative's effects. The goal of this PhD project was to develop a performance measurement model for EDs. The new model comprises only the most important performance measures, which together provide an estimate of overall ED performance levels. Furthermore, a thorough analysis of the interdependencies between the included performance measures was conducted in order to gain deeper knowledge of the ED as a system. The model enables monitoring of how well the ED performs over time, including how performance is impacted by the various initiatives. In the end, the developed model will be an important management tool for meeting the management's vision of providing the best possible care for the acute patient while achieving the highest possible utilisation of resources.

  8. Performance Models and Risk Management in Communications Systems

    CERN Document Server

    Harrison, Peter; Rüstem, Berç

    2011-01-01

    This volume covers recent developments in the design, operation, and management of telecommunication and computer network systems in performance engineering and addresses issues of uncertainty, robustness, and risk. Uncertainty regarding loading and system parameters leads to challenging optimization and robustness issues. Stochastic modeling combined with optimization theory ensures the optimum end-to-end performance of telecommunication or computer network systems. In view of the diverse design options possible, supporting models have many adjustable parameters and choosing the best set for a particular performance objective is delicate and time-consuming. An optimization based approach determines the optimal possible allocation for these parameters. Researchers and graduate students working at the interface of telecommunications and operations research will benefit from this book. Due to the practical approach, this book will also serve as a reference tool for scientists and engineers in telecommunication ...

  9. Novel Multiscale Modeling Tool Applied to Pseudomonas aeruginosa Biofilm Formation

    OpenAIRE

    Biggs, Matthew B.; Papin, Jason A.

    2013-01-01

    Multiscale modeling is used to represent biological systems with increasing frequency and success. Multiscale models are often hybrids of different modeling frameworks and programming languages. We present the MATLAB-NetLogo extension (MatNet) as a novel tool for multiscale modeling. We demonstrate the utility of the tool with a multiscale model of Pseudomonas aeruginosa biofilm formation that incorporates both an agent-based model (ABM) and constraint-based metabolic modeling. The hybrid mod...

  10. Performance Analysis of the Capability Assessment Tool for Sustainable Manufacturing

    Directory of Open Access Journals (Sweden)

    Enda Crossin

    2013-08-01

    Full Text Available This paper explores the performance of a novel capability assessment tool, developed to identify capability gaps and associated training and development requirements across the supply chain for environmentally-sustainable manufacturing. The tool was developed to assess 170 capabilities that have been clustered with respect to key areas of concern such as managing energy, water, material resources, carbon emissions and waste as well as environmental management practices for sustainability. Two independent expert teams used the tool to assess a sample group of five first and second tier sports apparel and footwear suppliers within the supply chain of a global sporting goods manufacturer in Asia. The paper addresses the reliability and robustness of the developed assessment method by formulating the expected links between the assessment results. The management practices of the participating suppliers were shown to be closely connected to their performance in managing their resources and emissions. The companies’ initiatives in implementing energy efficiency measures were found to be generally related to their performance in carbon emissions management. The suppliers were also asked to undertake a self-assessment by using a short questionnaire. The large gap between the comprehensive assessment and these in-house self-assessments revealed the suppliers’ misconceptions about their capabilities.

  11. Cool Farm Tool – Potato: Model Description and Performance of Four Production Systems

    NARCIS (Netherlands)

    Haverkort, A.J.; Hillier, J.G.

    2011-01-01

    The Cool Farm Tool – Potato (CFT-Potato) is a spreadsheet programme that calculates the CO2-equivalent emissions involved in producing 1 t of potatoes. The spreadsheet was adapted from an original generic version of the tool, and completed for potato production in diverse production

  12. Prototype road weather performance management (RWPM) tool installation instructions & user manual.

    Science.gov (United States)

    2016-07-20

    This document is the Installation Instructions and User Manual for the Road Weather Performance Management (RW-PM) Tool developed for the project on Development and Demonstration of a Prototype Road Weather Performance Management Application that Use...

  13. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database … connection of the wind turbine to different types of grid and storage systems. Different control strategies have been developed and implemented for these wind turbine concepts, their performance in normal or fault operation being assessed and discussed by means of simulations. The described control … of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report thus provides a description of the wind turbine modelling, both at a component level and at a system level. The report contains both the description of DIgSILENT built…

  14. Introducing the Evaluation Tools for HSE Management System Performance Using Balanced Score Card Model

    Directory of Open Access Journals (Sweden)

    Ali Mohammadi

    2016-12-01

    Full Text Available Background: The performance of HSE units has various dimensions, leading to different levels of performance. Thus, any industry should be capable of evaluating these systems. The aim of this study was to design a standard questionnaire for evaluating the performance of an HSE management system using the Balanced Score Card model. Methods: In this study, we first determined the criteria to be evaluated within the Balanced Score Card framework, based on the objectives and strategies of the HSE management system and existing standards, and then designed questions on every criterion. We used content validity and Cronbach's alpha to determine the validity and reliability of the questionnaire. Results: The primary questionnaire comprised 126 questions, some of which were omitted based on the CVR and CVI values obtained. The average CVR of the environmental dimension was 0.75 and its average CVI 0.71. Conclusion: Given the results on the reliability and validity of this questionnaire, and its standardized design, we can suggest using it for evaluating HSE management system performance in organizations and industries with such a system.
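
    The reliability arithmetic mentioned above is easy to reproduce. The snippet below computes Cronbach's alpha for a block of questionnaire items using the standard formula, with randomly generated demo scores.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (respondents x items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

rng = np.random.default_rng(3)
trait = rng.normal(size=(100, 1))                   # shared latent trait
scores = trait + 0.5 * rng.normal(size=(100, 8))    # 8 correlated items
print(round(cronbach_alpha(scores), 2))             # near 1 for consistent items
```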

  15. Modeling performance measurement applications and implementation issues in DEA

    CERN Document Server

    Cook, Wade D

    2005-01-01

    Addresses advanced/new DEA methodology and techniques developed for modeling unique and new performance evaluation issues; presents new DEA methodology and techniques via discussions on how to solve managerial problems; provides easy-to-use DEA software - DEAFrontier (www.deafrontier.com), an excellent tool for both DEA researchers and practitioners.

  16. Laser Cladding of CPM Tool Steels on Hardened H13 Hot-Work Steel for Low-Cost High-Performance Automotive Tooling

    Science.gov (United States)

    Chen, J.; Xue, L.

    2012-06-01

    This paper summarizes our research on laser cladding of high-vanadium CPM® tool steels (3V, 9V, and 15V) onto the surfaces of low-cost hardened H13 hot-work tool steel to substantially enhance resistance against abrasive wear. The results show great potential for fabricating high-performance automotive tooling (including molds and dies) at affordable cost. The microstructure and hardness development of the laser-clad tool steels so obtained are presented as well.

  17. Effects of Different Cutting Patterns and Experimental Conditions on the Performance of a Conical Drag Tool

    Science.gov (United States)

    Copur, Hanifi; Bilgin, Nuh; Balci, Cemal; Tumac, Deniz; Avunduk, Emre

    2017-06-01

    This study aims at determining the effects of single-, double-, and triple-spiral cutting patterns; the effects of tool cutting speeds on the experimental scale; and the effects of the method of yield estimation on cutting performance by performing a set of full-scale linear cutting tests with a conical cutting tool. The average and maximum normal, cutting and side forces; specific energy; yield; and coarseness index are measured and compared in each cutting pattern at a 25-mm line spacing, at varying depths of cut per revolution, and using two cutting speeds on five different rock samples. The results indicate that the optimum specific energy decreases by approximately 25% with an increasing number of spirals from the single- to the double-spiral cutting pattern for the hard rocks, whereas generally little effect was observed for the soft- and medium-strength rocks. The double-spiral cutting pattern appeared to be more effective than the single- or triple-spiral cutting pattern and had an advantage of lower side forces. The tool cutting speed had no apparent effect on the cutting performance. The estimation of the specific energy by the yield based on the theoretical swept area was not significantly different from that estimated by the yield based on the muck weighing, especially for the double- and triple-spiral cutting patterns and with the optimum ratio of line spacing to depth of cut per revolution. This study also demonstrated that the cutterhead and mechanical miner designs, semi-theoretical deterministic computer simulations and empirical performance predictions and optimization models should be based on realistic experimental simulations. Studies should be continued to obtain more reliable results by creating a larger database of laboratory tests and field performance records for mechanical miners using drag tools.
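
    The two yield conventions compared in the study amount to dividing the same cutting work by two different volume estimates; the back-of-envelope sketch below uses invented values for force, geometry, and muck mass.

```python
mean_cutting_force = 8.5e3    # N (invented)
cut_length = 0.7              # m of cut
swept_area = 25e-3 * 9e-3     # m^2: line spacing x depth of cut per revolution
density = 2600.0              # rock density, kg/m^3
muck_mass = 0.43              # kg of muck collected and weighed

work = mean_cutting_force * cut_length                  # cutting work, J
se_swept = work / (swept_area * cut_length) / 1e6       # MJ/m^3, theoretical yield
se_muck = work / (muck_mass / density) / 1e6            # MJ/m^3, weighed yield
print(f"SE swept-area: {se_swept:.1f} MJ/m^3, SE muck: {se_muck:.1f} MJ/m^3")
```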

  18. IMPACT OF INFORMATION TECHNOLOGY (IT) TOOLS, PARTNER RELATIONSHIP AND SUPPLY CHAIN PERFORMANCE

    Directory of Open Access Journals (Sweden)

    Ramayah T.

    2008-01-01

    Full Text Available This paper proposes a model that assesses the usage of information technology (IT) tools, commitment of partner relationships, and supply chain performance in the Malaysian manufacturing industry. A total of 250 questionnaires were distributed to manufacturing companies located in Penang, which is in the northern region of Malaysia. Applying multiple regression analysis, the study indicated that a higher level of supply chain partner commitment leads to a higher level of supply chain reliability and flexibility. Trust among supply chain partners also contributes to improving supply chain flexibility. The study focused only on inter-organisational relationships. Future research should examine how both inter- and intra-organisational collaboration impact supply chain performance. Although researchers have recognised that IT is an enabler of supply chain management (SCM) activities, there has been limited research that has directly associated the usage of IT and partner relationships to supply chain performance. This research is essential in order to ascertain the impact of IT usage and partner relationships on improving supply chain performance, particularly in the Malaysian manufacturing environment.

  19. Collaborative Inquiry Learning: Models, tools, and challenges

    Science.gov (United States)

    Bell, Thorsten; Urhahne, Detlef; Schanze, Sascha; Ploetzner, Rolf

    2010-02-01

    Collaborative inquiry learning is one of the most challenging and exciting ventures for today's schools. It aims at bringing a new and promising culture of teaching and learning into the classroom where students in groups engage in self-regulated learning activities supported by the teacher. It is expected that this way of learning fosters students' motivation and interest in science, that they learn to perform steps of inquiry similar to scientists and that they gain knowledge on scientific processes. Starting from general pedagogical reflections and science standards, the article reviews some prominent models of inquiry learning. This comparison results in a set of inquiry processes being the basis for cooperation in the scientific network NetCoIL. Inquiry learning is conceived in several ways with emphasis on different processes. For an illustration of the spectrum, some main conceptions of inquiry and their focuses are described. In the next step, the article describes exemplary computer tools and environments from within and outside the NetCoIL network that were designed to support processes of collaborative inquiry learning. These tools are analysed by describing their functionalities as well as effects on student learning known from the literature. The article closes with challenges for further developments elaborated by the NetCoIL network.

  20. Using key performance indicators as knowledge-management tools at a regional health-care authority level.

    Science.gov (United States)

    Berler, Alexander; Pavlopoulos, Sotiris; Koutsouris, Dimitris

    2005-06-01

    The advantages of introducing information and communication technologies into the complex health-care sector are already well known and well stated in the past. It is, nevertheless, paradoxical that although the medical community has embraced with satisfaction most of the technological discoveries allowing the improvement of patient care, this has not happened with health-care informatics. Addressing this concern, our work proposes an information model for knowledge management (KM) based upon the use of key performance indicators (KPIs) in health-care systems. Based upon the balanced scorecard (BSC) framework (Kaplan/Norton) and quality assurance techniques in health care (Donabedian), this paper proposes a patient-journey-centered approach that drives information flow at all levels of the day-to-day process of delivering effective and managed care, toward information assessment and knowledge discovery. In order to persuade health-care decision-makers to assess the added value of KM tools, those tools should be used to propose new performance measurement and performance management techniques at all levels of a health-care system. The proposed KPIs form a complete set of metrics that enable the performance management of a regional health-care system. In addition, the established performance framework is technically realised through state-of-the-art KM tools such as data warehouses and business intelligence information systems. In that sense, the proposed infrastructure is, technologically speaking, an important KM tool that enables knowledge sharing amongst various health-care stakeholders and between different health-care groups. The use of the BSC is an enabling framework toward a KM strategy in health care.

  1. A Micro-Grid Simulator Tool (SGridSim) using Effective Node-to-Node Complex Impedance (EN2NCI) Models

    Energy Technology Data Exchange (ETDEWEB)

    Udhay Ravishankar; Milos Manic

    2013-08-01

    This paper presents a micro-grid simulator tool (SGridSim) useful for implementing and testing multi-agent controllers. As a common engineering practice, it is important to have a tool that simplifies the modeling of the salient features of a desired system. In electric micro-grids, these salient features are the voltage and power distributions within the micro-grid. Current simplified electric power grid simulator tools such as PowerWorld, PowerSim, Gridlab, etc., model only the power distribution features of a desired micro-grid. Other power grid simulators such as Simulink, Modelica, etc., use detailed modeling to accommodate the voltage distribution features. This paper presents the SGridSim micro-grid simulator tool, which simplifies the modeling of both the voltage and power distribution features in a desired micro-grid. The SGridSim tool accomplishes this simplified modeling by using Effective Node-to-Node Complex Impedance (EN2NCI) models of the components that typically make up a micro-grid. The term EN2NCI model means that the impedance-based components of a micro-grid are modeled as single impedances tied between their respective voltage nodes on the micro-grid. Hence the benefits of the presented SGridSim tool are: 1) simulation of a micro-grid is performed strictly in the complex domain; and 2) faster simulation of a micro-grid by avoiding the simulation of detailed transients. An example micro-grid model was built using the SGridSim tool and tested to simulate both the voltage and power distribution features with a total absolute relative error of less than 6%.
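
    The EN2NCI idea as described, with each component a single complex impedance between voltage nodes, reduces to ordinary nodal analysis in the complex domain. The sketch below assembles an admittance matrix for a three-node toy grid with invented values and solves it directly.

```python
import numpy as np

# (node_a, node_b, impedance in pu); node 0 is the slack/source bus at 1.0 pu.
branches = [(0, 1, 0.02 + 0.10j), (1, 2, 0.05 + 0.20j), (0, 2, 0.04 + 0.15j)]
loads = {2: 0.5 + 0.2j}   # constant-admittance load at node 2 (pu)

n = 3
Y = np.zeros((n, n), dtype=complex)
for a, b, z in branches:          # fold each branch admittance into the Y-bus
    y = 1.0 / z
    Y[a, a] += y; Y[b, b] += y
    Y[a, b] -= y; Y[b, a] -= y
for node, y in loads.items():
    Y[node, node] += y

# Fix the slack voltage and solve for the remaining node voltages.
v0 = 1.0 + 0.0j
V = np.linalg.solve(Y[1:, 1:], -Y[1:, 0] * v0)
print("node voltages (pu):", np.round(V, 4))
```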

  2. Selection criteria for building performance simulation tools : contrasting architects' and engineers' needs

    NARCIS (Netherlands)

    Attia, S.G.; Hensen, J.L.M.; Beltran, L.; De Herde, A.

    2012-01-01

    This article summarises a study undertaken to reveal potential challenges and opportunities for using building performance simulation (BPS) tools. The article reviews current trends in building simulation and outlines major criteria for BPS tool selection and evaluation based on analysing users'

  3. Micro-Vibration Performance Prediction of SEPTA24 Using SMeSim (RUAG Space Mechanism Simulator Tool)

    Science.gov (United States)

    Omiciuolo, Manolo; Lang, Andreas; Wismer, Stefan; Barth, Stephan; Szekely, Gerhard

    2013-09-01

    Scientific space missions are currently challenging the performances of their payloads. The performances can be dramatically restricted by micro-vibration loads generated by any moving parts of the satellites, and thus by Solar Array Drive Assemblies too. Micro-vibration prediction of SADAs is therefore very important to support their design and optimization in the early stages of a programme. The Space Mechanism Simulator (SMeSim) tool, developed by RUAG, enhances the capability of analysing the micro-vibration emissivity of a Solar Array Drive Assembly (SADA) under a specified set of boundary conditions. The tool is developed in the Matlab/Simulink® environment through a library of blocks simulating the different components a SADA is made of. The modular architecture of the blocks, assembled by the user, and the set-up of the boundary conditions allow time-domain and frequency-domain analyses of a rigid multi-body model with concentrated flexibilities and coupled electronic control of the mechanism. SMeSim is used to model the SEPTA24 Solar Array Drive Mechanism and predict its micro-vibration emissivity. SMeSim and the return of experience earned throughout its development and use can now support activities like verification by analysis of micro-vibration emissivity requirements and/or design optimization to minimize the micro-vibration emissivity of a SADA.

  4. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of inappropriate BP being implemented is usually performed by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effectual business process simulations software (BPSS) is a prerequisite for accurate analysis of an BP. Characterisation of an BPSS tool is a challenging task due to the complex selection criteria that includes quality of visual aspects, simulation capabilities, statistical facilities, quality reporting etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed aiding the BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking of BPSS based on their technical characteristics by employing DEX and qualitative to quantitative (QQ) methodology. Consequently, the decision expert feeds the required information in a systematic and user friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible for adding new criteria in the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS with respect to currently available results.

  5. The Selected Method and Tools for Performance Measurement in the Green Supply Chain—Survey Analysis in Poland

    Directory of Open Access Journals (Sweden)

    Blanka Tundys

    2018-02-01

    Full Text Available Methods and tools for performance measurement and evaluation are very important elements in the construction and functioning of green supply chain management. The result is a presentation of the considerations underlying a very general model, which presents selected tools without a breakdown by individual industries. The considerations undertaken are important and have scientific added value, as in practice a very large number of tools are usually used to assess the supply chain, and these are not always correlated or adapted to the specificity of the chain. It is worth pointing out which of the already-used or completely new tools and methods will be most useful for assessing the green supply chain. The structure of the paper covers both theoretical and empirical parts. It includes an introduction, our goals and hypotheses, the state of the art, methodology, empirical findings, and discussion. We present the definitional differences between green and sustainable supply chains and focus on the selection and identification of methods for a framework model for evaluating the green supply chain. In the next step, the theoretical considerations and selected methods and tools were compared to a survey from Poland. On the basis of the survey, we present the findings and discussion in this area. The main methodology includes a literature review, a survey analysis using a questionnaire, and statistical tools. The survey was carried out in 2015 in sample organizations in Poland. The research results showed that organizations were aware of the environmental elements of measuring and assessing the supply chain from an environmental point of view, but their use depended on many factors: the area, the size of the organization, or the industry. If certain boundary conditions are met and organizations are aware of the essence of environmental aspects in the chain, then they apply green measures to the supply chain. These findings

  6. Finite Element Modelling of the effect of tool rake angle on tool temperature and cutting force during high speed machining of AISI 4340 steel

    International Nuclear Information System (INIS)

    Sulaiman, S; Roshan, A; Ariffin, M K A

    2013-01-01

    In this paper, a Finite Element Method (FEM) model based on the ABAQUS explicit software, incorporating the Johnson-Cook material model, was used to simulate cutting force and tool temperature during high speed machining (HSM) of AISI 4340 steel. In this simulation work, tool rake angles ranging from 0° to 20° and cutting speeds between 300 and 550 m/min were investigated. The purpose of the simulation analysis was to find the optimum tool rake angle, at which the cutting force is smallest and the tool temperature is lowest during high speed machining. Cutting forces were found to decrease as the rake angle increased in the positive direction. The optimum rake angle was observed between 10° and 18°, where the cutting force decreased by about 20% for all simulated cutting speeds. In addition, increasing the cutting tool rake angle beyond its optimum value had a negative influence on the tool's performance and led to an increase in cutting temperature. The results give a better understanding and recognition of cutting tool design for high speed machining processes
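
    For reference, the Johnson-Cook flow stress has the closed form sigma = (A + B*eps^n) * (1 + C*ln(epsdot/epsdot0)) * (1 - T*^m), where T* is the homologous temperature. The sketch below evaluates it; the constants are common literature values for AISI 4340 and may differ from the exact set used in the paper.

```python
import numpy as np

def johnson_cook_flow_stress(strain, strain_rate, T,
                             A=792.0, B=510.0, n=0.26, C=0.014, m=1.03,
                             eps0=1.0, T_room=298.0, T_melt=1793.0):
    """Johnson-Cook flow stress in MPa. A, B in MPa; strain_rate and the
    reference rate eps0 in 1/s; temperatures in K. Constants here are
    typical published values for AISI 4340, used for illustration only."""
    T_star = (T - T_room) / (T_melt - T_room)  # homologous temperature
    return ((A + B * strain**n)
            * (1.0 + C * np.log(strain_rate / eps0))
            * (1.0 - T_star**m))

# flow stress at 30% strain, 1e4 1/s, 600 K (~1060 MPa with these constants)
print(johnson_cook_flow_stress(0.3, 1e4, 600.0))
```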

  7. Latest Community Coordinated Modeling Center (CCMC) services and innovative tools supporting the space weather research and operational communities.

    Science.gov (United States)

    Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.

    2017-12-01

    The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has been serving as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided innovative web-based points of access, including: the Runs-On-Request System, providing unprecedented global access to the largest collection of state-of-the-art solar and space physics models; Integrated Space Weather Analysis (iSWA), a powerful dissemination system for space weather information; advanced online visualization and analysis tools for more accurate interpretation of model results; standard data formats for simulation data downloads; and mobile apps that let the scientific community view space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we showcase CCMC's latest innovative tools and services, including the CCMC tools that have revolutionized the way we do research and improved our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).

  8. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer's preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation, and envision future directions which focus on personalizing the processes to a designer's particular wishes.

  9. Hybrid Neural Network Approach Based Tool for the Modelling of Photovoltaic Panels

    Directory of Open Access Journals (Sweden)

    Antonino Laudani

    2015-01-01

    Full Text Available A tool based on a hybrid neural network approach for identifying the photovoltaic one-diode model is presented. The generalization capabilities of neural networks are used together with the robustness of the reduced form of the one-diode model. Indeed, from the studies performed by the authors and the works present in the literature, it was found that direct computation of the five parameters via a multiple-input, multiple-output neural network is a very difficult task. The reduced form consists of a series of explicit formulae supporting the neural network, which in our case predicts just two of the five parameters identifying the model: the other three parameters are computed by the reduced form. The present hybrid approach is efficient in terms of computational cost and accurate in the estimation of the five parameters. It constitutes a complete and extremely easy tool suitable for implementation in a microcontroller-based architecture. Validations are made on about 10000 PV panels belonging to the California Energy Commission database.
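
    As a rough sketch of the reduced-form idea (the paper's exact formulae may differ), suppose the network predicts the two nonlinear parameters, say the series resistance Rs and the ideality factor n. The one-diode equation, written at the short-circuit, open-circuit and maximum-power points, then becomes linear in the remaining three parameters Iph, I0 and Gsh = 1/Rsh and can be solved directly. All datasheet values below are placeholders.

```python
import numpy as np

K_B, Q = 1.380649e-23, 1.602176634e-19  # Boltzmann constant, electron charge

def remaining_three(Rs, n, Isc, Voc, Imp, Vmp, Ns, T=298.15):
    """Given predicted Rs (ohm) and ideality factor n, recover
    (Iph, I0, Rsh) by solving the one-diode equation, which is linear
    in Iph, I0 and Gsh = 1/Rsh, at the SC, OC and MP points."""
    Vt = Ns * n * K_B * T / Q  # module-level thermal voltage (Ns cells)
    A = np.array([
        [1.0, -(np.exp(Isc * Rs / Vt) - 1.0), -Isc * Rs],
        [1.0, -(np.exp(Voc / Vt) - 1.0), -Voc],
        [1.0, -(np.exp((Vmp + Imp * Rs) / Vt) - 1.0), -(Vmp + Imp * Rs)],
    ])
    b = np.array([Isc, 0.0, Imp])
    Iph, I0, Gsh = np.linalg.solve(A, b)
    return Iph, I0, 1.0 / Gsh

# placeholder datasheet values for a 60-cell module
print(remaining_three(Rs=0.35, n=1.1, Isc=8.8, Voc=37.8, Imp=8.3, Vmp=30.5, Ns=60))
```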

  10. Dynamic wind turbine models in power system simulation tool DIgSILENT

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, A.C.; Jauch, C.; Soerensen, P.; Iov, F.; Blaabjerg, F.

    2003-12-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are part of the results of a national research project whose overall objective is to create a model database in different simulation tools. This model database should be able to support the analysis of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report provides a description of the wind turbine modelling, both at a component level and at a system level. The report contains both the description of DIgSILENT built-in models for the electrical components of a grid-connected wind turbine (e.g. induction generators, power converters, transformers) and the models developed by the user, in the dynamic simulation language DSL of DIgSILENT, for the non-electrical components of the wind turbine (wind model, aerodynamic model, mechanical model). Issues concerning initialisation of the wind turbine models within the power system simulation are also presented. However, the main attention in this report is drawn to the modelling at the system level of two wind turbine concepts: 1. Active stall wind turbine with induction generator 2. Variable speed, variable pitch wind turbine with doubly-fed induction generator. These wind turbine concept models can be used and even extended for the study of different aspects, e.g. the assessment of power quality, control strategies, connection of the wind turbine to different types of grid and storage systems. For both of these concepts, control strategies are developed and implemented, and their performance assessed and discussed by means of simulations. (au)
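
    To illustrate the kind of non-electrical component referred to here, the sketch below implements a generic aerodynamic rotor block using a common textbook Cp(lambda, beta) approximation. It is not DIgSILENT's DSL model, and the rotor data are placeholders.

```python
import numpy as np

RHO_AIR, R = 1.225, 40.0  # air density (kg/m3), rotor radius (m); made up

def cp(lam, beta):
    """Generic power-coefficient surface Cp(tip-speed ratio, pitch angle),
    a widely used textbook fit rather than any tool-specific model."""
    lam_i = 1.0 / (1.0 / (lam + 0.08 * beta) - 0.035 / (beta**3 + 1.0))
    return 0.22 * (116.0 / lam_i - 0.4 * beta - 5.0) * np.exp(-12.5 / lam_i)

def rotor_power(wind_speed, rotor_speed, beta):
    """Aerodynamic rotor power (W) from wind speed (m/s), rotor speed
    (rad/s) and pitch angle (deg)."""
    lam = rotor_speed * R / wind_speed  # tip-speed ratio
    area = np.pi * R**2
    return 0.5 * RHO_AIR * area * cp(lam, beta) * wind_speed**3

# roughly 1.3 MW at 10 m/s for this placeholder rotor
print(rotor_power(wind_speed=10.0, rotor_speed=1.8, beta=0.0) / 1e6, "MW")
```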

  11. Computer system for identification of tool wear model in hot forging

    Directory of Open Access Journals (Sweden)

    Wilkus Marek

    2016-01-01

    Full Text Available The aim of the research was to create a methodology that enables effective and reliable prediction of tool wear. The idea of a hybrid model, which accounts for various mechanisms of tool material deterioration, is proposed in the paper. The mechanisms considered include abrasive wear, adhesive wear, thermal fatigue, mechanical fatigue, oxidation and plastic deformation. Individual models of various complexity were used for the separate phenomena, and a strategy for combining these models in one hybrid system was developed to account for the synergy of the various mechanisms. The complex hybrid model was built on the basis of these individual models for the various wear mechanisms. The individual models range from phenomenological ones for abrasive wear to multi-scale methods for modelling micro-crack initiation and propagation that utilize virtual representations of granular microstructures. The latter have been intensively developed recently and potentially form a powerful tool that allows modelling of thermal and mechanical fatigue, accounting explicitly for the tool material microstructure.
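
    As an example of the phenomenological end of that spectrum, abrasive wear is often described with an Archard-type law, accumulating wear depth from local contact pressure, sliding velocity and tool hardness. The sketch below is a generic illustration with made-up numbers, not the paper's calibrated model.

```python
def archard_wear_depth(pressure, sliding_velocity, hardness, dt, k=1e-4):
    """Incremental wear depth via Archard's law, dh = k * p * v / H * dt.
    The dimensionless coefficient k and all values are assumptions."""
    return k * pressure * sliding_velocity / hardness * dt

# accumulate wear over a forging cycle sampled at discrete time steps:
# (contact pressure [Pa], sliding velocity [m/s]) per 10 ms step
history = [(250e6, 0.05), (420e6, 0.12), (380e6, 0.08)]
depth = sum(archard_wear_depth(p, v, hardness=6.5e9, dt=0.01)
            for p, v in history)
print(depth * 1e9, "nm of wear this cycle")
```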

  12. Higher mental workload is associated with poorer laparoscopic performance as measured by the NASA-TLX tool.

    Science.gov (United States)

    Yurko, Yuliya Y; Scerbo, Mark W; Prabhu, Ajita S; Acker, Christina E; Stefanidis, Dimitrios

    2010-10-01

    Increased workload during task performance may increase fatigue and facilitate errors. The National Aeronautics and Space Administration-Task Load Index (NASA-TLX) is a previously validated tool for workload self-assessment. We assessed the relationship between workload and performance during simulator training on a complex laparoscopic task. NASA-TLX workload data from three separate trials were analyzed. All participants were novices (n = 28), followed the same curriculum on the fundamentals of laparoscopic surgery suturing model, and were tested in the animal operating room (OR) on a Nissen fundoplication model after training. Performance and workload scores were recorded at baseline, after proficiency achievement, and during the test. Performance, NASA-TLX scores, and inadvertent injuries during the test were analyzed and compared. Workload scores declined during training and mirrored performance changes. NASA-TLX scores correlated significantly with performance scores (r = -0.5, P < …). The NASA-TLX questionnaire accurately reflects workload changes during simulator training and may identify individuals more likely to experience high workload and more prone to errors during skill transfer to the clinical environment.
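
    For readers unfamiliar with the instrument, the standard NASA-TLX score is a weighted mean of six subscale ratings (0-100), with weights taken from 15 pairwise comparisons between the subscales. A minimal sketch with made-up ratings:

```python
# Weighted NASA-TLX scoring: six subscales, each rated 0-100, weighted
# by how many of the 15 pairwise comparisons each subscale "wins".
SUBSCALES = ["mental", "physical", "temporal",
             "performance", "effort", "frustration"]

def tlx_score(ratings, weights):
    """Overall workload, 0 (low) to 100 (high). Weights must sum to 15."""
    assert sum(weights.values()) == 15
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# example ratings/weights for one trainee (invented for illustration)
ratings = {"mental": 70, "physical": 30, "temporal": 55,
           "performance": 40, "effort": 65, "frustration": 50}
weights = {"mental": 5, "physical": 1, "temporal": 2,
           "performance": 3, "effort": 3, "frustration": 1}
print(tlx_score(ratings, weights))
```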

  13. Clinical Prediction Model and Tool for Assessing Risk of Persistent Pain After Breast Cancer Surgery

    DEFF Research Database (Denmark)

    Meretoja, Tuomo J; Andersen, Kenneth Geving; Bruce, Julie

    2017-01-01

    Clinically applicable tools for predicting the risk of persistent pain after breast cancer surgery are missing. The aim was to develop a clinically applicable risk prediction tool. Methods The prediction models were developed and tested using three prospective data sets from Finland (n = 860), Denmark (n = 453), and Scotland (n = 231). Prediction models for persistent pain of moderate to severe intensity were developed ...; high body mass index ( P = .039), axillary lymph node dissection ( P = .008), and more severe acute postoperative pain intensity at the seventh postoperative day ( P = .003) predicted persistent pain in the final prediction model, which performed well in the Danish (ROC-AUC, 0.739) and Scottish (ROC-AUC, 0.740) cohorts. At the 20% risk level, the model had 32.8% and 47.4% sensitivity and 94.4% and 82.4% specificity in the Danish and Scottish cohorts, respectively. Conclusion Our validated prediction models and an online risk calculator provide clinicians and researchers with a simple tool to screen...

  14. Using the IEA ETSAP modelling tools for Denmark

    DEFF Research Database (Denmark)

    Grohnheit, Poul Erik

    An important part of the cooperation within the IEA (International Energy Agency) is organised through national contributions to 'Implementation Agreements' on energy technology and energy analyses. One of them is ETSAP (Energy Technology Systems Analysis Programme). Denmark has signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, "Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems", for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project "NEEDS - New Energy Externalities Developments for Sustainability". ETSAP is contributing to a part of NEEDS that develops the TIMES model for 29 European countries with assessment of future technologies, and work continued under the Danish Centre for Energy, Environment and Health (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model

  15. ePRO-MP: A Tool for Profiling and Optimizing Energy and Performance of Mobile Multiprocessor Applications

    Directory of Open Access Journals (Sweden)

    Wonil Choi

    2009-01-01

    Full Text Available For mobile multiprocessor applications, achieving high performance with low energy consumption is a challenging task. In order to help programmers meet these design requirements, system development tools play an important role. In this paper, we describe one such development tool, ePRO-MP, which profiles and optimizes both the performance and energy consumption of multi-threaded applications running on top of Linux on ARM11 MPCore-based embedded systems. One of the key features of ePRO-MP is that it can accurately estimate the energy consumption of multi-threaded applications without requiring power measurement equipment, using a regression-based energy model. We also describe another key benefit of ePRO-MP, an automatic optimization function, using two example problems. Using the automatic optimization function, ePRO-MP can achieve high performance and low power consumption without programmer intervention. Our experimental results show that ePRO-MP improves performance by 6.1% and energy consumption by 4.1% over a baseline version in the co-running applications optimization example. For the producer-consumer application optimization example, ePRO-MP improves performance by 60.5% and energy consumption by 43.3% over a baseline version.
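
    The regression idea can be sketched as follows: fit energy measured once on an instrumented training rig against hardware performance counters, then reuse the coefficients to estimate energy at profiling time with no power meter attached. The counter choice and data below are illustrative assumptions, not ePRO-MP's actual model.

```python
import numpy as np

# synthetic "training rig" data: counters vs. measured energy (J)
rng = np.random.default_rng(1)
X = rng.random((40, 3)) * [1e9, 2e9, 1e7]      # cycles, instrs, L2 misses
true = np.array([3e-9, 1e-9, 5e-8])            # hidden per-event costs
y = X @ true + 0.2 + rng.normal(0, 0.05, 40)   # energy with static offset

# least-squares fit of the regression-based energy model
X1 = np.column_stack([X, np.ones(len(X))])     # add intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

# later, estimate energy of a profiled run from its counters alone
counters = np.array([8e8, 1.6e9, 6e6])
energy = float(coef[:3] @ counters + coef[3])
print(round(energy, 3), "J (no power meter needed)")
```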

  16. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    Science.gov (United States)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation but are often too complex for the unfamiliar model viewer to use easily. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST) that exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depths of technical involvement. InVEST allows a model user to generate multiple views and reports from an MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to a particular stakeholder, resulting in a view that is consistent with both the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering data to see specialized views, improves the value of the system as a whole, as data becomes information

  17. Simplified Method for Modeling the Impact of Arbitrary Partial Shading Conditions on PV Array Performance: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    MacAlpine, Sara; Deline, Chris

    2015-09-15

    It is often difficult to model the effects of partial shading conditions on PV array performance, as shade losses are nonlinear and depend heavily on a system's particular configuration. This work describes and implements a simple method for modeling shade loss: a database of shade impact results (loss percentages), generated using a validated, detailed simulation tool and encompassing a wide variety of shading scenarios. The database is intended to predict shading losses in crystalline silicon PV arrays and is accessed using basic inputs generally available in any PV simulation tool. Performance predictions using the database are within 1-2% of measured data for several partially shaded PV systems, and within 1% of those predicted by the full, detailed simulation tool on an annual basis. The shade loss database shows potential to considerably improve performance prediction for partially shaded PV systems.
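
    A one-dimensional caricature of such a database query is shown below; the real database is keyed on many configuration inputs (array layout, stringing, shade geometry), but the lookup-and-interpolate mechanics are the same. The tabulated values are invented.

```python
import numpy as np

# toy slice of a precomputed shade-impact database: loss percentage
# tabulated against shaded array fraction (values are made up)
shaded_frac = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
loss_pct = np.array([0.0, 18.0, 41.0, 67.0, 95.0])

def shade_loss(fraction):
    """Linear interpolation into the tabulated loss database."""
    return float(np.interp(fraction, shaded_frac, loss_pct))

# estimated nonlinear loss for a 40% shaded array
print(shade_loss(0.4), "% loss")
```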

  18. Modelling of tunnelling processes and rock cutting tool wear with the particle finite element method

    Science.gov (United States)

    Carbonell, Josep Maria; Oñate, Eugenio; Suárez, Benjamín

    2013-09-01

    Underground construction involves all sorts of challenges in the analysis, design, project and execution phases. The dimensions of tunnels and their structural requirements are growing, and so are safety and security demands. New engineering tools are needed to perform safer planning and design. This work presents advances in the particle finite element method (PFEM) for the modelling and analysis of tunnelling processes, including the wear of the cutting tools. The PFEM has its foundation in the Lagrangian description of the motion of a continuum built from a set of particles with known physical properties. The method uses a remeshing process combined with the alpha-shape technique to detect the contacting surfaces, and a finite element method for the mechanical computations. A contact procedure has been developed for the PFEM which is combined with a constitutive model for predicting the excavation front and the wear of cutting tools. The material parameters govern the coupling of frictional contact and wear between the interacting domains at the excavation front. The PFEM allows the prediction of several parameters which are relevant for estimating the performance of a tunnel boring machine, such as wear in the cutting tools, the pressure distribution on the face of the boring machine, and the vibrations produced in the machinery and the adjacent soil/rock. The final aim is to help in the design of the excavating tools and in the planning of the tunnelling operations. The applications presented show that the PFEM is a promising technique for the analysis of tunnelling problems.

  19. Quantitative Modeling of Human Performance in Information Systems. Technical Research Note 232.

    Science.gov (United States)

    Baker, James D.

    1974-01-01

    A general information system model was developed which focuses on man and considers the computer only as a tool. The ultimate objective is to produce a simulator which will yield measures of system performance under different mixes of equipment, personnel, and procedures. The model is structured around three basic dimensions: (1) data flow and…

  20. Evaluation model based on FAHP for nuclear power project contract performance

    International Nuclear Information System (INIS)

    Liu Bohang; Cheng Jing

    2012-01-01

    Fuzzy Comprehensive Evaluation is a common tool for analyzing comprehensive integration, and the Fuzzy Analytic Hierarchy Process (FAHP) is an improvement on the Analytic Hierarchy Process. The paper first introduces the concept of FAHP and then uses FAHP to set up an evaluation system model for nuclear power project contract performance. Based on this model, all evaluation factors are assigned different weights. By weighting the score of each factor, the output is a result that evaluates the contract performance. On the basis of this research, the paper gives principles for evaluating the contract performance of nuclear power suppliers, which can assure the procurement process. (authors)

  1. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    The paper presents a system level modelling methodology and framework, called ForSyDe. ForSyDe is available under the open source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system level modeling of a simple industrial use case, and we...

  2. SBML-PET-MPI: a parallel parameter estimation tool for Systems Biology Markup Language based models.

    Science.gov (United States)

    Zi, Zhike

    2011-04-01

    Parameter estimation is crucial for the modeling and dynamic analysis of biological systems. However, implementing parameter estimation is time consuming and computationally demanding. Here, we introduced a parallel parameter estimation tool for Systems Biology Markup Language (SBML)-based models (SBML-PET-MPI). SBML-PET-MPI allows the user to perform parameter estimation and parameter uncertainty analysis by collectively fitting multiple experimental datasets. The tool is developed and parallelized using the message passing interface (MPI) protocol, which provides good scalability with the number of processors. SBML-PET-MPI is freely available for non-commercial use at http://www.bioss.uni-freiburg.de/cms/sbml-pet-mpi.html or http://sites.google.com/site/sbmlpetmpi/.
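
    The master/worker pattern behind such MPI parallelization can be sketched with mpi4py: rank 0 scatters candidate parameter vectors, every rank scores its share against all experimental datasets, and the scores are gathered back. The objective function and data here are toy stand-ins, not SBML-PET-MPI's internals.

```python
# run with, e.g.: mpiexec -n 4 python estimate.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

def objective(theta, datasets):
    """Placeholder objective: sum of squared residuals over all datasets
    (a real tool would simulate the SBML model for each theta)."""
    return sum(float(np.sum((d - theta.mean()) ** 2)) for d in datasets)

# every rank holds the (small) experimental datasets
datasets = [np.linspace(0.0, 1.0, 50) ** k for k in (1, 2, 3)]

if rank == 0:
    candidates = np.random.default_rng(0).random((64, 5))  # parameter sets
    chunks = np.array_split(candidates, size)
else:
    chunks = None

chunk = comm.scatter(chunks, root=0)        # each rank gets its share
scores = [objective(theta, datasets) for theta in chunk]
all_scores = comm.gather(scores, root=0)    # collect scores on rank 0

if rank == 0:
    flat = [s for part in all_scores for s in part]
    print("best objective:", min(flat))
```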

  3. Implementation and testing of a fault detection software tool for improving control system performance in a large commercial building

    Energy Technology Data Exchange (ETDEWEB)

    Salsbury, T.I.; Diamond, R.C.

    2000-05-01

    This paper describes a model-based, feedforward control scheme that can detect faults in the controlled process and improve control performance over traditional PID control. The tool uses static simulation models of the system under control to generate feedforward control action, which acts as a reference of correct operation. Faults that occur in the system cause discrepancies between the feedforward models and the controlled process, and the scheme detects faults by monitoring the level of these discrepancies. We present results from the first phase of tests on a dual-duct air-handling unit installed in a large office building in San Francisco. We demonstrate the ability of the tool to detect a number of pre-existing faults in the system and discuss practical issues related to implementation.
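
    The core mechanism can be sketched in a few lines: a static model predicts correct behaviour, and a discrepancy that persists beyond a threshold is flagged as a fault. The toy model, threshold and persistence window below are assumptions for illustration, not the paper's tool.

```python
def detect_fault(model, inputs, measurements, threshold, window=10):
    """Flag a fault when |model(u) - y| exceeds threshold for `window`
    consecutive samples (persistence avoids tripping on transients)."""
    residuals = [abs(model(u) - y) for u, y in zip(inputs, measurements)]
    return any(
        all(r > threshold for r in residuals[i:i + window])
        for i in range(len(residuals) - window + 1)
    )

# toy static model of a coil valve, with a bias fault appearing later on
model = lambda u: 0.8 * u + 2.0
inputs = list(range(100))
measurements = [0.8 * u + 2.0 + (5.0 if u > 60 else 0.0) for u in inputs]
print(detect_fault(model, inputs, measurements, threshold=3.0))  # True
```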

  4. The Fishery Performance Indicators: A Management Tool for Triple Bottom Line Outcomes

    Science.gov (United States)

    Anderson, James L.; Anderson, Christopher M.; Chu, Jingjie; Meredith, Jennifer; Asche, Frank; Sylvia, Gil; Smith, Martin D.; Anggraeni, Dessy; Arthur, Robert; Guttormsen, Atle; McCluney, Jessica K.; Ward, Tim; Akpalu, Wisdom; Eggert, Håkan; Flores, Jimely; Freeman, Matthew A.; Holland, Daniel S.; Knapp, Gunnar; Kobayashi, Mimako; Larkin, Sherry; MacLauchlin, Kari; Schnier, Kurt; Soboil, Mark; Tveteras, Sigbjorn; Uchida, Hirotsugu; Valderrama, Diego

    2015-01-01

    Pursuit of the triple bottom line of economic, community and ecological sustainability has increased the complexity of fishery management; fisheries assessments require new types of data and analysis to guide science-based policy in addition to traditional biological information and modeling. We introduce the Fishery Performance Indicators (FPIs), a broadly applicable and flexible tool for assessing performance in individual fisheries, and for establishing cross-sectional links between enabling conditions, management strategies and triple bottom line outcomes. Conceptually separating measures of performance, the FPIs use 68 individual outcome metrics—coded on a 1 to 5 scale based on expert assessment to facilitate application to data poor fisheries and sectors—that can be partitioned into sector-based or triple-bottom-line sustainability-based interpretative indicators. Variation among outcomes is explained with 54 similarly structured metrics of inputs, management approaches and enabling conditions. Using 61 initial fishery case studies drawn from industrial and developing countries around the world, we demonstrate the inferential importance of tracking economic and community outcomes, in addition to resource status. PMID:25946194

  5. Evaluation and comparison of models and modelling tools simulating nitrogen processes in treatment wetlands

    DEFF Research Database (Denmark)

    Edelfeldt, Stina; Fritzson, Peter

    2008-01-01

    In this paper, two ecological models of nitrogen processes in treatment wetlands have been evaluated and compared. These models were implemented, simulated, and visualized using the Modelica modelling and simulation language [P. Fritzson, Principles of Object-Oriented Modelling and Simulation with Modelica 2.1 (Wiley-IEEE Press, USA, 2004)] and an associated tool. The differences and similarities between the MathModelica Model Editor and three other ecological modelling tools have also been evaluated. The results show that the models can be well modelled and simulated in the MathModelica Model Editor, and that nitrogen decrease in a constructed treatment wetland should be described and simulated using the Nitrification/Denitrification model, as this model has the highest overall quality score and provides a more variable environment.

  6. An artificial intelligence tool for complex age-depth models

    Science.gov (United States)

    Bradley, E.; Anderson, K. A.; de Vesine, L. R.; Lai, V.; Thomas, M.; Nelson, T. H.; Weiss, I.; White, J. W. C.

    2017-12-01

    CSciBox is an integrated software system for age modeling of paleoenvironmental records. It incorporates an array of data-processing and visualization facilities, ranging from 14C calibrations to sophisticated interpolation tools. Using CSciBox's GUI, a scientist can build custom analysis pipelines by composing these built-in components or adding new ones. Alternatively, she can employ CSciBox's automated reasoning engine, Hobbes, which uses AI techniques to perform an in-depth, autonomous exploration of the space of possible age-depth models and presents the results—both the models and the reasoning that was used in constructing and evaluating them—to the user for her inspection. Hobbes accomplishes this using a rulebase that captures the knowledge of expert geoscientists, which was collected over the course of more than 100 hours of interviews. It works by using these rules to generate arguments for and against different age-depth model choices for a given core. Given a marine-sediment record containing uncalibrated 14C dates, for instance, Hobbes tries CALIB-style calibrations using a choice of IntCal curves, with reservoir age correction values chosen from the 14CHRONO database using the lat/long information provided with the core, and finally composes the resulting age points into a full age model using different interpolation methods. It evaluates each model—e.g., looking for outliers or reversals—and uses that information to guide the next steps of its exploration, and presents the results to the user in human-readable form. The most powerful of CSciBox's built-in interpolation methods is BACON, a Bayesian sedimentation-rate algorithm—a powerful but complex tool that can be difficult to use. Hobbes adjusts BACON's many parameters autonomously to match the age model to the expectations of expert geoscientists, as captured in its rulebase. It then checks the model against the data and iteratively re-calculates until it is a good fit to the data.
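
    The simplest building block such a system reasons about can be sketched as follows: compose calibrated age control points into an age-depth model by interpolation and screen it for reversals. The control points are invented, and a real pipeline would typically use BACON or another Bayesian method rather than linear interpolation.

```python
import numpy as np

# invented calibrated control points for a sediment core
depths = np.array([0.0, 0.5, 1.2, 2.0])           # m below core top
ages = np.array([150.0, 1900.0, 5400.0, 9100.0])  # cal yr BP

def age_model(query_depths):
    """Piecewise-linear age-depth model through the control points."""
    return np.interp(query_depths, depths, ages)

# an age reversal (age decreasing with depth) would invalidate the model
assert np.all(np.diff(ages) > 0), "age reversal detected"
print(age_model([0.25, 1.0, 1.8]))  # interpolated ages at query depths
```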

  7. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    Science.gov (United States)

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. © 2018 The Korean Academy of Medical Sciences.

  8. A Performance Measurement Tool Leading Wastewater Treatment Plants toward Economic Efficiency and Sustainability

    Directory of Open Access Journals (Sweden)

    Andrea Guerrini

    2016-11-01

    Full Text Available Wastewater treatment is an important link in the water cycle that allows for water sanitation and reuse, facilitates energy generation, and allows for the recovery of products from waste. The scientific community has paid significant attention to wastewater treatment, especially from a technical point of view. Extensive literature is available on new technologies, processes, and materials to improve wastewater treatment. However, few studies have been conducted in the management field focusing on the development of a performance measurement tool that supports plant managers. The current article addresses this literature gap by developing a reporting tool that integrates technical and cost measures, and by implementing it in a large wastewater utility. The tool successfully identifies cause-and-effect linkages among key plant performance drivers, supports management in finding activities with poor performance, and allows them to delay non-relevant measures of control.

  9. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied to test modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools were first used to identify the parts of the model that can really be tested with the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour was finally obtained by optimisation techniques. This example of application shows how analysis of the model parameter space is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residual analysis techniques. (author)

  10. Interdisciplinary semantic model for managing the design of a steam-assisted gravity drainage tooling system

    Directory of Open Access Journals (Sweden)

    Michael Leitch

    2018-01-01

    Full Text Available Complex engineering systems often require extensive coordination between different expert areas in order to avoid costly design iterations and rework. Cyber-physical systems (CPS) engineering methods could provide valuable insights to help model these interactions and optimize the design of such systems. In this work, steam-assisted gravity drainage (SAGD), a complex oil extraction process that requires deep understanding of several physical-chemical phenomena, is examined, and the complexities and interdependencies of the system are explored. Based on an established unified feature modeling scheme, a software modeling framework is proposed to manage the design process of the production tools used for SAGD oil extraction. Applying CPS methods to unify complex phenomena and engineering models, the proposed CPS model combines effective simulation with embedded knowledge of completion tooling design in order to optimize reservoir performance. The system design is expressed using graphical diagrams following the Unified Modeling Language (UML) convention. To demonstrate the capability of this system, a distributed research group is described and their activities are coordinated using the described CPS model.

  11. A performance assessment review tool for the proposed radioactive waste repository at Yucca Mountain, Nevada, USA

    International Nuclear Information System (INIS)

    Mohanty, Sitakanta; Codell, Richard

    2000-01-01

    The U.S. Nuclear Regulatory Commission (NRC), with the assistance of the Center for Nuclear Waste Regulatory Analyses, has developed a Total-system Performance Assessment (TPA) code to assist in evaluating the performance of the Yucca Mountain (YM) High-Level Waste Repository in Nevada, proposed by the U.S. Department of Energy (DOE). The proposed YM repository would be built in a thick sequence of partially saturated volcanic tuff above the water table. Among the unique challenges of this environment are that (1) the transport of radionuclides would take place partially through highly heterogeneous unsaturated rock; (2) the waste packages (WPs) would generally be exposed to oxidizing conditions; and (3) water either infiltrating from the surface or recirculating because of decay heat may drip onto the WPs. Tools such as the TPA code and its embedded techniques for evaluating YM performance are aimed at (1) determining the parameters and key parts of the repository system that have the most influence on repository performance; (2) performing alternative conceptual model studies, especially with bounding models; (3) estimating the relative importance of the physical phenomena that lead to human exposure to radionuclides; and (4) improving NRC staff capabilities in performance assessment and associated license application reviews. This paper presents an overview of the NRC conceptual framework, the approach to conducting system-level sensitivity analyses for determining influential parameters, and alternative conceptual model studies investigating the effect of model uncertainties. (author)

  12. Using ProModel as a simulation tools to assist plant layout design and planning: Case study plastic packaging factory

    OpenAIRE

    Pochamarn Tearwattanarattikal; Suwadee Namphacharoen; Chonthicha Chamrasporn

    2008-01-01

    This study concerns the application of a simulation model to assist decision making on capacity expansion and plant layout design and planning. The plant layout design concept is performed first to create the physical layouts; the simulation model is then used to test the capability of the plant to meet various demand forecast scenarios. The study employed the ProModel package as a tool, using the model to compare performances in terms of % utilization, characteristics of WIP, and the ability to meet due dates.

  13. Repository Integration Program: RIP performance assessment and strategy evaluation model theory manual and user's guide

    International Nuclear Information System (INIS)

    1995-11-01

    This report describes the theory and capabilities of RIP (Repository Integration Program). RIP is a powerful and flexible computational tool for carrying out probabilistic integrated total system performance assessments for geologic repositories. The primary purpose of RIP is to provide a management tool for guiding system design and site characterization. In addition, the performance assessment model (and the process of eliciting model input) can act as a mechanism for integrating the large amount of available information into a meaningful whole (in a sense, allowing one to keep the "big picture" and the ultimate aims of the project clearly in focus). Such an integration is useful both for project managers and project scientists. RIP is based on a "top down" approach to performance assessment that concentrates on the integration of the entire system and utilizes relatively high-level descriptive models and parameters. The key point in the application of such a "top down" approach is that the simplified models and associated high-level parameters must incorporate an accurate representation of their uncertainty. RIP is designed in a very flexible manner such that details can be readily added to various components of the model without modifying the computer code. Uncertainty is also handled in a very flexible manner, and both parameter and model (process) uncertainty can be explicitly considered. Uncertainty is propagated through the integrated PA model using an enhanced Monte Carlo method. RIP must rely heavily on subjective assessment (expert opinion) for much of its input. The process of eliciting the high-level input parameters required for RIP is critical to its successful application. As a result, in order for any project to successfully apply a tool such as RIP, an enormous amount of communication and cooperation must exist between the data collectors, the process modelers, and the performance assessment modelers
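
    The Monte Carlo propagation idea can be illustrated with a toy top-level model: sample the elicited high-level parameters from their uncertainty distributions and push each realization through the system model to get a distribution of outcomes. The distributions and the response function below are invented placeholders, not RIP's models.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# sampled high-level parameters (hypothetical elicited distributions)
wp_lifetime = rng.lognormal(mean=np.log(5e3), sigma=0.5, size=n)   # years
travel_time = rng.lognormal(mean=np.log(2e4), sigma=1.0, size=n)   # years
release_frac = rng.uniform(1e-6, 1e-3, size=n)                     # fraction

def peak_dose(wp_life, t_travel, frac):
    """Stand-in top-level response: later failure and longer travel
    time dilute and decay the source term (arbitrary units)."""
    return 1e3 * frac * np.exp(-(wp_life + t_travel) / 1e5)

doses = peak_dose(wp_lifetime, travel_time, release_frac)
print("95th-percentile peak dose:", np.percentile(doses, 95))
```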

  14. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing performed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continued development of the tools and process.

  15. A Fast Tool for Assessing the Power Performance of Large WEC arrays

    DEFF Research Database (Denmark)

    Ruiz, Pau Mercadé

    In the present work, a tool for computing wave energy converter array hydrodynamic forces and power performance is developed. The tool leads to a significant reduction in computation time compared with standard boundary element method based codes while keeping similar levels of accuracy. This makes it suitable for array layout optimization, where large numbers of simulations are required. Furthermore, the tool is developed within an open-source environment, Python 2.7, so that it is fully accessible to anyone willing to make use of it.

  16. Model of practical skill performance as an instrument for supervision and formative assessment

    DEFF Research Database (Denmark)

    Nielsen, Carsten; Sommer, Irene; Larsen, Karin

    2012-01-01

    There are still weaknesses in the practical skills of newly graduated nurses. There is also escalating pressure on existing clinical placements due to increasing student numbers and structural changes in health services. Innovative educational practices, and the use of tools that might support learning, are sparsely researched in the field of clinical education for nursing students. This paper reports on an action research study that promoted and investigated use of The Model of Practical Skill Performance as a learning tool during nursing students' clinical placement. Clinical supervisors used the model during practice, performance and formative assessment of practical skills learning. It provided a common language about practical skills and enhanced the participants' understanding of professionalism in practical nursing skill. In conclusion, the model helped to highlight the complexity in mastering practical skills.

  17. COMSOL-PHREEQC: a tool for high performance numerical simulation of reactive transport phenomena

    International Nuclear Information System (INIS)

    Nardi, Albert; Vries, Luis Manuel de; Trinchero, Paolo; Idiart, Andres; Molinero, Jorge

    2012-01-01

    An example of 3D reactive transport simulations in clay material is shown to illustrate the capabilities and potential of the numerical tool. The example is based on the thermo-hydrochemical (THC) evolution of a deposition hole in the SKB KBS-3V concept of a HLRW repository. Figure 1 shows the entire model domain of the performed THC simulation. Figure 2 shows the refined finite element mesh used in the clayey materials of the backfilled gallery and deposition hole, where variably saturated flow and reactive transport simulations were performed to study the expected geochemical behaviour of the clay barriers. (authors)

  18. Modeling, methodologies and tools for molecular and nano-scale communications

    CERN Document Server

    Nakano, Tadashi; Moore, Michael

    2017-01-01

    The book presents the state of the art in the emerging field of molecular and nanoscale communication. It gives special attention to fundamental models, and to advanced methodologies and tools used in the field. It covers a wide range of applications, e.g. nanomedicine, nanorobot communication, bioremediation and environmental management. It addresses advanced graduate students, academics and professionals working at the forefront of their fields and at the interfaces between different areas of research, such as engineering, computer science, biology and nanotechnology.

  19. Aligning building information model tools and construction management methods

    NARCIS (Netherlands)

    Hartmann, Timo; van Meerveld, H.J.; Vossebeld, N.; Adriaanse, Adriaan Maria

    2012-01-01

    Few empirical studies exist that can explain how different Building Information Model (BIM) based tool implementation strategies work in practical contexts. To help overcoming this gap, this paper describes the implementation of two BIM based tools, the first, to support the activities at an

  20. CFD modelling of hydrogen stratification in enclosures: Model validation and application to PAR performance

    Energy Technology Data Exchange (ETDEWEB)

    Hoyes, J.R., E-mail: james.hoyes@hsl.gsi.gov.uk; Ivings, M.J.

    2016-12-15

    Highlights: • The ability of CFD to predict hydrogen stratification phenomena is investigated. • Contrary to expectation, simulations on tetrahedral meshes under-predict mixing. • Simulations on structured meshes give good agreement with experimental data. • CFD model used to investigate the effects of stratification on PAR performance. • Results show stratification can have a significant effect on PAR performance. - Abstract: Computational Fluid Dynamics (CFD) models are maturing into useful tools for supporting safety analyses. This paper investigates the capabilities of CFD models for predicting hydrogen stratification in a containment vessel using data from the NEA/OECD SETH2 MISTRA experiments. Further simulations are then carried out to illustrate the qualitative effects of hydrogen stratification on the performance of Passive Autocatalytic Recombiner (PAR) units. The MISTRA experiments have well-defined initial and boundary conditions which makes them well suited for use in a validation study. Results are presented for the sensitivity to mesh resolution and mesh type. Whilst the predictions are shown to be largely insensitive to the mesh resolution they are surprisingly sensitive to the mesh type. In particular, tetrahedral meshes are found to induce small unphysical convection currents that result in molecular diffusion and turbulent mixing being under-predicted. This behaviour is not unique to the CFD model used here (ANSYS CFX) and furthermore, it may affect simulations run on other non-aligned meshes (meshes that are not aligned perpendicular to gravity), including non-aligned structured meshes. Following existing best practice guidelines can help to identify potential unphysical predictions, but as an additional precaution consideration should be given to using gravity-aligned meshes for modelling stratified flows. CFD simulations of hydrogen recombination in the Becker Technologies THAI facility are presented with high and low PAR positions

  1. Performance and Sizing Tool for Quadrotor Biplane Tailsitter UAS

    Science.gov (United States)

    Strom, Eric

    The Quadrotor-Biplane-Tailsitter (QBT) configuration is the basis for a mechanically simple rotorcraft capable of both long-range, high-speed cruise and hovering flight. This work presents the development and validation of a set of preliminary design tools built specifically for this aircraft to enable its further development, including a QBT weight model, a preliminary sizing framework, and vehicle analysis tools. The preliminary sizing tool presented here shows the advantage afforded by QBT designs in missions with aggressive cruise requirements, such as offshore wind turbine inspections, wherein transition from a quadcopter configuration to a QBT allows for a 5:1 trade of battery weight for wing weight. A 3D, unsteady panel method utilizing a nonlinear implementation of the Kutta-Joukowsky condition is also presented as a means of computing aerodynamic interference effects and, through the implementation of rotor, body, and wing geometry generators, is prepared for coupling with a comprehensive rotor analysis package.

  2. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for the removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values were observed to agree well with extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for the system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
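
    The two goodness-of-fit measures quoted above are straightforward to reproduce; a sketch of both follows (the paper's exact definition of RE may differ slightly from the mean relative error used here).

```python
import numpy as np

def relative_error(obs, pred):
    """Mean relative error between observations and predictions."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return float(np.mean(np.abs(pred - obs) / np.abs(obs)))

def willmott_d(obs, pred):
    """Willmott's index of agreement d (1.0 means perfect agreement):
    d = 1 - sum((P-O)^2) / sum((|P-Om| + |O-Om|)^2), Om = mean(O)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    num = np.sum((pred - obs) ** 2)
    den = np.sum((np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return float(1.0 - num / den)

obs = [12.1, 15.3, 9.8, 20.5]     # invented measured removals
pred = [11.4, 16.0, 10.3, 19.8]   # invented model predictions
print(relative_error(obs, pred), willmott_d(obs, pred))
```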

  3. Incorporation of the equilibrium temperature approach in a Soil and Water Assessment Tool hydroclimatological stream temperature model

    Science.gov (United States)

    Du, Xinzhong; Shrestha, Narayan Kumar; Ficklin, Darren L.; Wang, Junye

    2018-04-01

    Stream temperature is an important indicator for biodiversity and sustainability in aquatic ecosystems. The stream temperature model currently in the Soil and Water Assessment Tool (SWAT) only considers the impact of air temperature on stream temperature, while the hydroclimatological stream temperature model developed within the SWAT model considers hydrology and the impact of air temperature in simulating the water-air heat transfer process. In this study, we modified the hydroclimatological model by including the equilibrium temperature approach to model heat transfer processes at the water-air interface, which reflects the influences of air temperature, solar radiation, wind speed and streamflow conditions on the heat transfer process. The thermal capacity of the streamflow is modeled by the variation of the stream water depth. An advantage of this equilibrium temperature model is the simple parameterization, with only two parameters added to model the heat transfer processes. The equilibrium temperature model proposed in this study is applied and tested in the Athabasca River basin (ARB) in Alberta, Canada. The model is calibrated and validated at five stations throughout different parts of the ARB, where close to monthly samplings of stream temperatures are available. The results indicate that the equilibrium temperature model proposed in this study provided better and more consistent performances for the different regions of the ARB with the values of the Nash-Sutcliffe Efficiency coefficient (NSE) greater than those of the original SWAT model and the hydroclimatological model. To test the model performance for different hydrological and environmental conditions, the equilibrium temperature model was also applied to the North Fork Tolt River Watershed in Washington, United States. The results indicate a reasonable simulation of stream temperature using the model proposed in this study, with minimum relative error values compared to the other two models
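
    The essence of the equilibrium temperature approach can be sketched as a relaxation of water temperature toward an equilibrium value Te, at a rate set by an exchange coefficient and the flow's thermal capacity (which the model ties to water depth). The parameter values below are placeholders, not the calibrated ARB values.

```python
# Explicit daily update of dTw/dt = K * (Te - Tw) / (rho * cp * depth),
# the generic equilibrium-temperature heat-transfer form; K lumps the
# air-temperature, radiation and wind-speed effects into one coefficient.
RHO, CP = 1000.0, 4184.0  # water density (kg/m3), heat capacity (J/kg/K)

def step_stream_temp(Tw, Te, K, depth, dt=86400.0):
    """One daily step; K in W/m2/K, depth in m, temperatures in deg C."""
    return Tw + dt * K * (Te - Tw) / (RHO * CP * depth)

Tw = 8.0  # initial stream temperature
for Te, depth in [(12.0, 1.5), (14.0, 1.2), (13.0, 1.0)]:  # daily forcing
    Tw = step_stream_temp(Tw, Te, K=30.0, depth=depth)
print(round(Tw, 2), "deg C after three days")
```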

  4. Surviving the present: Modeling tools for organizational change

    International Nuclear Information System (INIS)

    Pangaro, P.

    1992-01-01

    The nuclear industry, like the rest of modern American business, is beset by a confluence of economic, technological, competitive, regulatory, and political pressures. For better or worse, business schools and management consultants have leapt to the rescue, offering the most modern conveniences that they can purvey. Recent advances in the study of organizations have led to new tools for their analysis, revision, and repair. There are two complementary tools that do not impose values or injunctions in themselves. One, called the organization modeler, captures the hierarchy of purposes that organizations and their subparts carry out. Any deficiency or pathology is quickly illuminated, and requirements for repair are made clear. The second, called THOUGHTSTICKER, is used to capture the semantic content of the conversations that occur across the interactions of parts of an organization. The distinctions and vocabulary in the language of an organization, and the relations within that domain, are elicited from the participants so that all three are available for debate and refinement. The product of the applications of these modeling tools is not the resulting models but rather the enhancement of the organization as a consequence of the process of constructing them

  5. Study on dynamic team performance evaluation methodology based on team situation awareness model

    International Nuclear Information System (INIS)

    Kim, Suk Chul

    2005-02-01

    The purpose of this thesis is to provide a theoretical framework, and an evaluation methodology, for the dynamic task performance of an operating team at a nuclear power plant under a dynamic and tactical environment such as a radiological accident. The thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation. Its quantification methods use a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. This model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000 MWe pressurized water reactors, with four on-the-job operating groups and one expert group who knew the accident sequences. Simulated team dynamic task performance, referenced against key plant parameter behavior and the team-specific organizational center of gravity and cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for a new plant in a cost-benefit manner. Also, this model can be utilized as a systematic analysis tool for

  6. Study on dynamic team performance evaluation methodology based on team situation awareness model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk Chul

    2005-02-15

    The purpose of this thesis is to provide a theoretical framework, and an evaluation methodology, for the dynamic task performance of an operating team at a nuclear power plant under a dynamic and tactical environment such as a radiological accident. The thesis suggests a team dynamic task performance evaluation model, the so-called team crystallization model, stemming from Endsley's situation awareness model and comprising four elements: state, information, organization, and orientation. Its quantification methods use a system dynamics approach and a communication process model based on a receding horizon control approach. The team crystallization model is a holistic approach for evaluating team dynamic task performance in conjunction with team situation awareness, considering physical system dynamics and team behavioral dynamics for a tactical and dynamic task at a nuclear power plant. This model provides a systematic measure to evaluate time-dependent team effectiveness or performance affected by multiple agents such as plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope power plant simulator for 1,000 MWe pressurized water reactors, with four on-the-job operating groups and one expert group who knew the accident sequences. Simulated team dynamic task performance, referenced against key plant parameter behavior and the team-specific organizational center of gravity and cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for a new plant in a cost-benefit manner. Also, this model can be utilized as a systematic analysis tool for

  7. Tav4SB: integrating tools for analysis of kinetic models of biological systems.

    Science.gov (United States)

    Rybiński, Mikołaj; Lula, Michał; Banasik, Paweł; Lasota, Sławomir; Gambin, Anna

    2012-04-05

    Progress in the modeling of biological systems strongly relies on the availability of specialized computer-aided tools. To that end, the Taverna Workbench eases the integration of software tools for life science research and provides a common workflow-based framework for computational experiments in biology. The Taverna services for Systems Biology (Tav4SB) project provides a set of new Web service operations which extend the functionality of the Taverna Workbench in the domain of systems biology. Tav4SB operations allow the user to perform numerical simulations or model checking of, respectively, the deterministic or stochastic semantics of biological models. On top of this functionality, Tav4SB enables the construction of high-level experiments. As an illustration of the possibilities offered by our project we apply multi-parameter sensitivity analysis. To visualize the results of model analysis, a flexible plotting operation is provided as well. Tav4SB operations are executed in a simple grid environment, integrating heterogeneous software such as Mathematica, PRISM and SBML ODE Solver. The user guide, contact information, full documentation of available Web service operations, workflows and other additional resources can be found at the Tav4SB project's Web page: http://bioputer.mimuw.edu.pl/tav4sb/. The Tav4SB Web service provides a set of integrated tools in a domain for which Web-based applications are still not as widely available as for other areas of computational biology. Moreover, we extend the dedicated hardware base for the computationally expensive task of simulating cellular models. Finally, we promote the standardization of models and experiments as well as the accessibility and usability of remote services.

  8. Solid-state-drives (SSDs) modeling simulation tools & strategies

    CERN Document Server

    2017-01-01

    This book introduces simulation tools and strategies for complex systems of solid-state drives (SSDs), which consist of a flash multi-core microcontroller plus NAND flash memories. It provides a broad overview of the most popular simulation tools, with special focus on open source solutions. VSSIM, NANDFlashSim and DiskSim are benchmarked against the performance of real SSDs under different traffic workloads. The pros and cons of each simulator are analyzed, and it is clearly indicated which kind of answers each of them can give and at what price. It is explained that speed and precision do not go hand in hand, and that it is important to understand when to simulate what, and with which tool. Being able to simulate SSD performance is mandatory to meet time-to-market, product cost and quality targets. Over the last few years the authors developed an advanced simulator named “SSDExplorer” which has been used to evaluate multiple phenomena with great accuracy, from QoS (Quality Of Service) to Read Retry, fr...

  9. Using the IEA ETSAP modelling tools for Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Grohnheit, Poul Erik

    2008-12-15

    An important part of the cooperation within the IEA (International Energy Agency) is organised through national contributions to 'Implementation Agreements' on energy technology and energy analyses. One of them is ETSAP (Energy Technology Systems Analysis Programme), started in 1976. Denmark has signed the agreement and contributed to some early annexes. This project is motivated by an invitation to participate in ETSAP Annex X, 'Global Energy Systems and Common Analyses: Climate friendly, Secure and Productive Energy Systems', for the period 2005 to 2007. The main activity is semi-annual workshops focusing on presentations of model analyses and use of the ETSAP tools (the MARKAL/TIMES family of models). The project was also planned to benefit from the EU project 'NEEDS - New Energy Externalities Developments for Sustainability'. ETSAP is contributing to a part of NEEDS that develops the TIMES model for 29 European countries with assessment of future technologies. An additional project, 'Monitoring and Evaluation of the RES directives: implementation in EU27 and policy recommendations for 2020' (RES2020) under Intelligent Energy Europe, was added, as well as the Danish 'Centre for Energy, Environment and Health' (CEEH), starting from January 2007. This report summarises the activities under ETSAP Annex X and related projects, emphasising the development of modelling tools that will be useful for modelling the Danish energy system. It is also a status report for the development of a model for Denmark, focusing on the tools and features that allow comparison with other countries and, particularly, evaluation of assumptions and results in international models covering Denmark. (au)

  10. A tool for urban soundscape evaluation applying Support Vector Machines for developing a soundscape classification model.

    Science.gov (United States)

    Torija, Antonio J; Ruiz, Diego P; Ramos-Ridao, Angel F

    2014-06-01

    To ensure appropriate soundscape management in urban environments, urban-planning authorities need a range of tools that enable such a task to be performed. An essential step in managing urban areas from a sound standpoint is the evaluation of the soundscape in the area. It has been widely acknowledged that a subjective and acoustical categorization of a soundscape is the first step in evaluating it, providing a basis for designing or adapting it to match people's expectations as well. To this end, this work proposes a model for the automatic classification of urban soundscapes based on underlying acoustical and perceptual criteria, intended to be used as a tool for comprehensive urban soundscape evaluation. Because of the great complexity associated with the problem, two machine learning techniques, Support Vector Machines (SVM) and Support Vector Machines trained with Sequential Minimal Optimization (SMO), are implemented to develop the classification model. The results indicate that the SMO model outperforms the SVM model in the specific task of soundscape classification. With the implementation of the SMO algorithm, the classification model achieves an outstanding performance (91.3% of instances correctly classified). © 2013 Elsevier B.V. All rights reserved.
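
    A minimal sketch of the classification step using scikit-learn, whose SVC is trained internally with an SMO-type solver (libsvm), appears below; the acoustic feature vectors and class labels are synthetic stand-ins for the study's acoustical and perceptual descriptors.

    ```python
    # Minimal sketch of SVM-based soundscape classification. Features and
    # labels are synthetic; scikit-learn's SVC uses an SMO-style solver.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n, d = 300, 12                   # 300 recordings, 12 acoustic descriptors
    X = rng.normal(size=(n, d))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # two soundscape classes

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_tr, y_tr)
    print(f"correctly classified: {100 * clf.score(X_te, y_te):.1f}%")
    ```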

  11. Method and apparatus for characterizing and enhancing the functional performance of machine tools

    Science.gov (United States)

    Barkman, William E; Babelay, Jr., Edwin F; Smith, Kevin Scott; Assaid, Thomas S; McFarland, Justin T; Tursky, David A; Woody, Bethany; Adams, David

    2013-04-30

    Disclosed are various systems and methods for assessing and improving the capability of a machine tool. The disclosure applies to machine tools having at least one slide configured to move along a motion axis. Various patterns of dynamic excitation commands are employed to drive the one or more slides, typically involving repetitive short distance displacements. A quantification of a measurable merit of machine tool response to the one or more patterns of dynamic excitation commands is typically derived for the machine tool. Examples of measurable merits of machine tool performance include workpiece surface finish, and the ability to generate chips of the desired length.

  12. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    Science.gov (United States)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) has tremendous advantages over proprietary software, primarily fuelled by high-level programming languages (JAVA, C++, Python, etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools, etc.). Quantum GIS (QGIS) is a popular open source GIS package, licensed under the GNU GPL and written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps that are carried out repeatedly for the same study region. The developed tool is therefore user friendly and can be used efficiently for these repetitive processes, reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.
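
    As a flavor of the repetitive assimilation step such a plugin automates, the sketch below aggregates a synthetic stack of daily gridded rainfall to per-catchment means with NumPy; a real plugin would read the grids through GDAL/QGIS and derive the catchment mask from vector layers.

    ```python
    # Sketch of the repetitive data-assimilation step: aggregating a stack
    # of daily gridded rainfall to per-catchment mean series for input into
    # a hydrological model. Arrays here are synthetic placeholders.
    import numpy as np

    days, rows, cols = 365, 100, 120
    rainfall = np.random.gamma(2.0, 3.0, size=(days, rows, cols))  # mm/day
    catchment_id = np.random.randint(0, 5, size=(rows, cols))      # 5 basins

    def catchment_means(stack, ids):
        """Return an array (days, n_catchments) of mean rainfall per basin."""
        n = ids.max() + 1
        out = np.empty((stack.shape[0], n))
        for c in range(n):
            mask = ids == c
            out[:, c] = stack[:, mask].mean(axis=1)
        return out

    series = catchment_means(rainfall, catchment_id)
    print(series.shape)      # (365, 5): daily model forcing per catchment
    ```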

  13. Model-Based Fault Diagnosis Techniques Design Schemes, Algorithms and Tools

    CERN Document Server

    Ding, Steven X

    2013-01-01

    Guaranteeing high system performance over a wide operating range is an important issue surrounding the design of automatic control systems with successively increasing complexity. As a key technology in the search for a solution, advanced fault detection and identification (FDI) is receiving considerable attention. This book introduces basic model-based FDI schemes, advanced analysis and design algorithms, and mathematical and control-theoretic tools. This second edition of Model-Based Fault Diagnosis Techniques contains: new material on fault isolation and identification, and fault detection in feedback control loops; extended and revised treatment of systematic threshold determination for systems with both deterministic unknown inputs and stochastic noises; addition of the continuously-stirred tank heater as a representative process-industrial benchmark; and enhanced discussion of residual evaluation in stochastic processes. Model-based Fault Diagno...
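
    The core mechanism of model-based fault detection, residual generation, can be sketched in a few lines: a Luenberger observer tracks the plant model, and the output residual departs from zero when a fault appears. The system matrices, observer gain, and additive actuator fault below are illustrative choices, not taken from the book.

    ```python
    # Observer-based residual generation, the core of model-based FDI.
    # An additive actuator fault is injected at k = 30; the residual,
    # near zero in the fault-free phase, then departs from zero.
    import numpy as np

    A = np.array([[0.9, 0.1], [0.0, 0.8]])
    B = np.array([[0.0], [1.0]])
    C = np.array([[1.0, 0.0]])
    L = np.array([[0.5], [0.3]])        # observer gain (assumed stabilizing)

    x = np.zeros((2, 1))
    xhat = np.zeros((2, 1))
    for k in range(60):
        u = np.array([[1.0]])
        fault = np.array([[0.5]]) if k >= 30 else np.array([[0.0]])
        y = C @ x
        r = y - C @ xhat                # residual: measured minus estimated
        x = A @ x + B @ (u + fault)
        xhat = A @ xhat + B @ u + L @ r
        if k % 10 == 0:
            print(f"k={k:2d}  residual={r[0, 0]:+.4f}")
    ```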

  14. Development of Low Global Warming Potential Refrigerant Solutions for Commercial Refrigeration Systems using a Life Cycle Climate Performance Design Tool

    Energy Technology Data Exchange (ETDEWEB)

    Abdelaziz, Omar [ORNL; Fricke, Brian A [ORNL; Vineyard, Edward Allan [ORNL

    2012-01-01

    Commercial refrigeration systems are known to be prone to high leak rates and to consume large amounts of electricity. As such, direct emissions related to refrigerant leakage and indirect emissions resulting from primary energy consumption contribute greatly to their Life Cycle Climate Performance (LCCP). In this paper, an LCCP design tool is used to evaluate the performance of a typical commercial refrigeration system with alternative refrigerants and minor system modifications to provide lower Global Warming Potential (GWP) refrigerant solutions with improved LCCP compared to baseline systems. The LCCP design tool accounts for system performance, ambient temperature, and system load; system performance is evaluated using a validated vapor compression system simulation tool while ambient temperature and system load are derived from a widely used building energy modeling tool (EnergyPlus). The LCCP design tool also accounts for the change in hourly electricity emission rate to yield an accurate prediction of indirect emissions. The analysis shows that conventional commercial refrigeration system life cycle emissions are largely due to direct emissions associated with refrigerant leaks and that system efficiency plays a smaller role in the LCCP. However, as a transition occurs to low GWP refrigerants, the indirect emissions become more relevant. Low GWP refrigerants may not be suitable for drop-in replacements in conventional commercial refrigeration systems; however some mixtures may be introduced as transitional drop-in replacements. These transitional refrigerants have a significantly lower GWP than baseline refrigerants and, as such, improved LCCP. The paper concludes with a brief discussion on the tradeoffs between refrigerant GWP, efficiency and capacity.
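
    A simplified version of the LCCP bookkeeping can be written directly: direct emissions scale with refrigerant leakage and GWP, indirect emissions with electricity use and a grid emission factor. All numbers below are illustrative placeholders, not values from the ORNL tool, which additionally resolves loads and grid emission rates hourly.

    ```python
    # Simplified LCCP estimate (kg CO2-eq over the system life): direct
    # emissions from refrigerant leakage plus indirect emissions from
    # electricity use. All figures are illustrative assumptions.
    def lccp(charge_kg, annual_leak_rate, eol_loss_rate, gwp,
             annual_kwh, emission_factor_kg_per_kwh, life_years):
        direct = gwp * charge_kg * (annual_leak_rate * life_years
                                    + eol_loss_rate)
        indirect = annual_kwh * emission_factor_kg_per_kwh * life_years
        return direct + indirect

    baseline = lccp(charge_kg=200, annual_leak_rate=0.15, eol_loss_rate=0.10,
                    gwp=3922,  # e.g. R-404A, a common baseline refrigerant
                    annual_kwh=400_000, emission_factor_kg_per_kwh=0.5,
                    life_years=15)
    low_gwp = lccp(200, 0.15, 0.10, gwp=150,
                   annual_kwh=410_000, emission_factor_kg_per_kwh=0.5,
                   life_years=15)
    print(f"baseline: {baseline:.3e} kg CO2-eq, low-GWP: {low_gwp:.3e}")
    ```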

  15. Modeling and Evaluating Pilot Performance in NextGen: Review of and Recommendations Regarding Pilot Modeling Efforts, Architectures, and Validation Studies

    Science.gov (United States)

    Wickens, Christopher; Sebok, Angelia; Keller, John; Peters, Steve; Small, Ronald; Hutchins, Shaun; Algarin, Liana; Gore, Brian Francis; Hooey, Becky Lee; Foyle, David C.

    2013-01-01

    NextGen operations are associated with a variety of changes to the national airspace system (NAS) including changes to the allocation of roles and responsibilities among operators and automation, the use of new technologies and automation, additional information presented on the flight deck, and the entire concept of operations (ConOps). In the transition to NextGen airspace, aviation and air operations designers need to consider the implications of design or system changes on human performance and the potential for error. To ensure continued safety of the NAS, it will be necessary for researchers to evaluate design concepts and potential NextGen scenarios well before implementation. One approach for such evaluations is through human performance modeling. Human performance models (HPMs) provide effective tools for predicting and evaluating operator performance in systems. HPMs offer significant advantages over empirical, human-in-the-loop testing in that (1) they allow detailed analyses of systems that have not yet been built, (2) they offer great flexibility for extensive data collection, and (3) they do not require experimental participants and thus can offer cost and time savings. HPMs differ in their ability to predict performance and safety with NextGen procedures, equipment and ConOps. Models also vary in terms of how they approach human performance (e.g., some focus on cognitive processing, others focus on discrete tasks performed by a human, while others consider perceptual processes), and in terms of their associated validation efforts. The objectives of this research effort were to support the Federal Aviation Administration (FAA) in identifying HPMs that are appropriate for predicting pilot performance in NextGen operations, to provide guidance on how to evaluate the quality of different models, and to identify gaps in pilot performance modeling research that could guide future research opportunities. This research effort is intended to help the FAA

  16. FAST: a combined NOC and transient fuel performance model using a commercial FEM environment

    Energy Technology Data Exchange (ETDEWEB)

    Prudil, A.; Bell, J.; Oussoren, A.; Chan, P. [Royal Military College of Canada, Kingston, ON (Canada); Lewis, B. [Univ. of Ontario Inst. of Tech., Oshawa, ON (Canada)

    2014-07-01

    The Fuel And Sheath modelling Tool (FAST) is a combined normal operating conditions (NOC) and transient fuel performance code developed on the COMSOL Multiphysics finite-element platform. The FAST code has demonstrated excellent performance in proof of concept validation tests against experimental data and comparison to the ELESIM, ELESTRES and ELOCA fuel performance codes. In this paper we discuss ongoing efforts to expand the capabilities of the code to include multiple pellet geometries, model stress-corrosion cracking phenomena and modelling of advanced fuels composed of mixed oxides of thorium, uranium, and plutonium for the Canadian Supercritical Water Reactor (SCWR). (author)

  17. Human performance tools in nuclear power plants. Introduction, implementation and experiences; Human Performance Tools in Kernkraftwerken. Einfuehrung, Umsetzung und Erfahrungen

    Energy Technology Data Exchange (ETDEWEB)

    Dexheimer, Kai; Bassing, Gerd [Dexcon Consulting GmbH, Neuhausen (Switzerland); Kreuzer, Peter [E.ON Kernkraft GmbH, Essenbach (Germany). Kernkraftwerk Isar

    2015-06-01

    The basis of safe nuclear power plant (NPP) operation and a strong safety culture is the professional application of Human Performance Optimisation (HPO) tools. HPO trainings have been carried out by German NPPs for a number of years, and recently also by Swiss NPPs. This article describes the origins, the foundations, the experiences and thereby the special features of the HPO training programme applied by German NPP operators. Moreover, the article provides an outlook on future developments, in particular in view of the requirements of the ongoing nuclear phase-out in Germany.

  18. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  19. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov (United States)

    NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) and find cost-competitive solutions. ADOPT Vehicle Simulator to analyze the performance and fuel economy of conventional and advanced light- and

  20. Comparison of four modeling tools for the prediction of potential distribution for non-indigenous weeds in the United States

    Science.gov (United States)

    Magarey, Roger; Newton, Leslie; Hong, Seung C.; Takeuchi, Yu; Christie, Dave; Jarnevich, Catherine S.; Kohl, Lisa; Damus, Martin; Higgins, Steven I.; Miller, Leah; Castro, Karen; West, Amanda; Hastings, John; Cook, Gericke; Kartesz, John; Koop, Anthony

    2018-01-01

    This study compares four models for predicting the potential distribution of non-indigenous weed species in the conterminous U.S. The comparison focused on evaluating modeling tools and protocols as currently used for weed risk assessment or for predicting the potential distribution of invasive weeds. We used six weed species (three highly invasive and three less invasive non-indigenous species) that have been established in the U.S. for more than 75 years. The experiment involved providing non-U.S. location data to users familiar with one of the four evaluated techniques, who then developed predictive models that were applied to the United States without knowing the identity of the species or its U.S. distribution. We compared a simple GIS climate matching technique known as Proto3, a simple climate matching tool CLIMEX Match Climates, the correlative model MaxEnt, and a process model known as the Thornley Transport Resistance (TTR) model. Two experienced users ran each modeling tool except TTR, which had one user. Models were trained with global species distribution data excluding any U.S. data, and then were evaluated using the current known U.S. distribution. The influence of weed species identity and modeling tool on prevalence and sensitivity effects was compared using a generalized linear mixed model. The choice of modeling tool itself had low statistical significance, while weed species alone accounted for 69.1% and 48.5% of the variance for prevalence and sensitivity, respectively. These results suggest that simple modeling tools might perform as well as complex ones in the case of predicting the potential distribution of a weed not yet present in the United States. Considerations of model accuracy should also be balanced with those of reproducibility and ease of use. More important than the choice of modeling tool is the construction of robust protocols and the testing of both new and experienced users under blind conditions that approximate operational conditions.

  1. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer......-format and COM-objects, are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks....

  2. Assessment of wear dependence parameters in complex model of cutting tool wear

    Science.gov (United States)

    Antsev, A. V.; Pasko, N. I.; Antseva, N. V.

    2018-03-01

    This paper addresses the wear dependence of the generic efficient life period of cutting tools, taken as an aggregate of the law of tool wear rate distribution and the dependence of this law's parameters on the cutting mode, factoring in randomness, as exemplified by the complex model of wear. The complex model of wear takes into account the variance of cutting properties within one batch of tools, the variance in machinability within one batch of workpieces, and the stochastic nature of the wear process itself. A technique for assessing the wear dependence parameters in a complex model of cutting tool wear is provided and supported by a numerical example.
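
    The flavor of such a stochastic life model can be sketched by attaching lognormal tool-batch and workpiece-machinability scatter to a deterministic life law; Taylor's tool-life equation and the variance levels below are illustrative assumptions, not the paper's fitted model.

    ```python
    # Stochastic tool life: a deterministic life law (Taylor's equation
    # V * T**n = C) scattered by tool-batch and workpiece-machinability
    # variation. Constants and variance levels are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    n_tools = 10_000
    V = 150.0                 # cutting speed, m/min
    n_exp, C = 0.25, 400.0    # Taylor exponent and constant (assumed)

    T_nominal = (C / V) ** (1.0 / n_exp)     # deterministic life, min
    tool_scatter = rng.lognormal(0.0, 0.10, n_tools)
    workpiece_scatter = rng.lognormal(0.0, 0.15, n_tools)
    T = T_nominal * tool_scatter * workpiece_scatter

    print(f"nominal life {T_nominal:.1f} min; "
          f"P10/P50/P90 = {np.percentile(T, [10, 50, 90]).round(1)} min")
    ```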

  3. Optimizing the ASC WAN: evaluating network performance tools for comparing transport protocols.

    Energy Technology Data Exchange (ETDEWEB)

    Lydick, Christopher L.

    2007-07-01

    The Advanced Simulation & Computing Wide Area Network (ASC WAN), which is a high delay-bandwidth network connection between US Department of Energy National Laboratories, is constantly being examined and evaluated for efficiency. One of the current transport-layer protocols which is used, TCP, was developed for traffic demands which are different from that on the ASC WAN. The Stream Control Transport Protocol (SCTP), on the other hand, has shown characteristics which make it more appealing to networks such as these. Most important, before considering a replacement for TCP on any network, a testing tool that performs well against certain criteria needs to be found. In order to try to find such a tool, two popular networking tools (Netperf v.2.4.3 & v.2.4.6 (OpenSS7 STREAMS), and Iperf v.2.0.6) were tested. These tools implement both TCP and SCTP and were evaluated using four metrics: (1) How effectively can the tool reach a throughput near the bandwidth? (2) How much of the CPU does the tool utilize during operation? (3) Is the tool freely and widely available? And, (4) Is the tool actively developed? Following the analysis of those tools, this paper goes further into explaining some recommendations and ideas for future work.

  4. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first takes a qualitative focus, reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst's efforts to obtain useful and relevant insights into market dynamics; generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either an IT or a marketing department. The paper contributes to highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  5. Eddy current NDE performance demonstrations using simulation tools

    International Nuclear Information System (INIS)

    Maurice, L.; Costan, V.; Guillot, E.; Thomas, P.

    2013-01-01

    To carry out performance demonstrations of the eddy-current NDE processes applied in French nuclear power plants, EDF is studying the possibility of using simulation tools as an alternative to measurements on steam generator tube mock-ups. This paper focuses on the strategy adopted by EDF to assess and use the codes Carmel3D and Civa for the case of eddy-current NDE of wear problems which may appear in the U-shaped region of steam generator tubes due to the rubbing of anti-vibration bars.

  6. Thermomechanical modelling of laser surface glazing for H13 tool steel

    Science.gov (United States)

    Kabir, I. R.; Yin, D.; Tamanna, N.; Naher, S.

    2018-03-01

    A two-dimensional thermomechanical finite element (FE) model of laser surface glazing (LSG) has been developed for H13 tool steel. The direct coupling technique of ANSYS 17.2 (APDL) has been utilised to solve the transient thermomechanical process. A cylindrical cross-section of H13 tool steel has been modelled for laser powers of 200 W and 300 W at a constant 0.2 mm beam width and 0.15 ms residence time. The model can predict the temperature distribution and the stress-strain increments in the elastic and plastic regions over time and space. The tendency for crack formation can also be assessed by analysing the von Mises stress in the heat-concentrated zone. Isotropic and kinematic hardening models have been applied separately to predict the after-yield phenomena. At 200 W laser power, the peak surface temperature achieved is 1520 K, which is below the melting point (1727 K) of H13 tool steel. For 300 W laser power, the peak surface temperature is 2523 K. Tensile residual stresses have been found on the surface after cooling, in agreement with the literature. The isotropic model shows higher residual stress, which increases with laser power. Conversely, the kinematic model gives lower residual stress, which decreases with laser power. Therefore, both plasticity models could work in LSG of H13 tool steel.
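
    As a rough plausibility check of the thermal side alone, a one-dimensional explicit finite-difference model with a constant absorbed laser flux over the stated residence time reproduces the right order of surface temperature. The H13 properties below are typical handbook values and the absorbed flux is an assumed figure; the paper's model is two-dimensional and fully thermomechanical.

    ```python
    # 1-D explicit finite-difference check of surface heating during laser
    # glazing. Material properties are typical H13 handbook values and the
    # absorbed flux is an illustrative assumption.
    import numpy as np

    k, rho, cp = 25.0, 7800.0, 460.0     # W/m-K, kg/m^3, J/kg-K
    alpha = k / (rho * cp)
    nx, dx = 200, 1e-6                   # 0.2 mm deep domain, 1 um cells
    dt = 0.4 * dx**2 / alpha             # stable explicit time step
    q = 1.5e9                            # absorbed flux, W/m^2 (assumed)
    residence = 0.15e-3                  # 0.15 ms, as in the study

    T = np.full(nx, 293.0)
    t = 0.0
    while t < residence:
        lap = np.zeros_like(T)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
        T[1:-1] += alpha * dt * lap[1:-1]
        # Surface node: absorbed flux plus conduction into the bulk.
        T[0] += dt / (rho * cp) * (q / dx + k * (T[1] - T[0]) / dx**2)
        T[-1] = 293.0                    # far boundary held at ambient
        t += dt
    print(f"peak surface temperature ~ {T[0]:.0f} K "
          f"after {residence * 1e3:.2f} ms")
    ```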

  7. GEOQUIMICO : an interactive tool for comparing sorption conceptual models (surface complexation modeling versus K[D])

    International Nuclear Information System (INIS)

    Hammond, Glenn E.; Cygan, Randall Timothy

    2007-01-01

    Within reactive geochemical transport, several conceptual models exist for simulating sorption processes in the subsurface. Historically, the K_D approach has been the method of choice due to its ease of implementation within a reactive transport model and straightforward comparison with experimental data. However, for modeling complex sorption phenomena (e.g. sorption of radionuclides onto mineral surfaces), this approach does not systematically account for variations in location, time, or chemical conditions, and more sophisticated methods such as a surface complexation model (SCM) must be utilized. It is critical to determine which conceptual model to use, that is, when the material variation becomes important to regulatory decisions. The geochemical transport tool GEOQUIMICO has been developed to assist in this decision-making process. GEOQUIMICO provides a user-friendly framework for comparing the accuracy and performance of sorption conceptual models. The tool currently supports the K_D and SCM conceptual models. The code is written in the object-oriented Java programming language to facilitate model development and improve code portability. The basic theory underlying geochemical transport and the sorption conceptual models noted above is presented in this report, together with explanations of how these physicochemical processes are implemented in GEOQUIMICO and a brief verification study comparing GEOQUIMICO results to data found in the literature.
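
    The practical difference between the two conceptual models is easiest to see on the K_D side, where sorption collapses into a single retardation factor R = 1 + rho_b * K_D / theta; an SCM would instead recompute sorption from the local chemistry at every step. The sketch below, with illustrative values, shows how strongly K_D alone controls arrival times.

    ```python
    # Retardation under the K_D conceptual model: transport is slowed by
    # R = 1 + rho_b * K_D / theta. All values are illustrative.
    rho_b = 1.6e3      # bulk density, kg/m^3
    theta = 0.3        # porosity (volumetric water content)
    v = 1.0            # groundwater velocity, m/yr
    distance = 100.0   # transport distance, m

    for kd in (0.0, 1e-4, 1e-3):            # K_D in m^3/kg
        R = 1.0 + rho_b * kd / theta
        print(f"K_D={kd:g} m^3/kg -> R={R:7.2f}, "
              f"arrival ~ {distance / (v / R):,.0f} yr")
    ```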

  8. Repository Integration Program: RIP performance assessment and strategy evaluation model theory manual and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-11-01

    This report describes the theory and capabilities of RIP (Repository Integration Program). RIP is a powerful and flexible computational tool for carrying out probabilistic integrated total system performance assessments for geologic repositories. The primary purpose of RIP is to provide a management tool for guiding system design and site characterization. In addition, the performance assessment model (and the process of eliciting model input) can act as a mechanism for integrating the large amount of available information into a meaningful whole (in a sense, allowing one to keep the "big picture" and the ultimate aims of the project clearly in focus). Such an integration is useful both for project managers and project scientists. RIP is based on a "top down" approach to performance assessment that concentrates on the integration of the entire system, and utilizes relatively high-level descriptive models and parameters. The key point in the application of such a "top down" approach is that the simplified models and associated high-level parameters must incorporate an accurate representation of their uncertainty. RIP is designed in a very flexible manner such that details can be readily added to various components of the model without modifying the computer code. Uncertainty is also handled in a very flexible manner, and both parameter and model (process) uncertainty can be explicitly considered. Uncertainty is propagated through the integrated PA model using an enhanced Monte Carlo method. RIP must rely heavily on subjective assessment (expert opinion) for much of its input. The process of eliciting the high-level input parameters required for RIP is critical to its successful application. As a result, in order for any project to successfully apply a tool such as RIP, an enormous amount of communication and cooperation must exist between the data collectors, the process modelers, and the performance assessment modelers.
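
    The "top down" idea can be illustrated with a plain Monte Carlo propagation of a few uncertain high-level parameters through a deliberately simple release model; the model form and the distributions below are invented for illustration and are not RIP's actual abstractions.

    ```python
    # Monte Carlo propagation of high-level parameter uncertainty through a
    # toy repository release model. Model form and distributions invented.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    inventory = rng.lognormal(np.log(1e6), 0.5, n)    # Ci in repository
    release_frac = rng.beta(2, 50, n)                 # fraction released
    travel_time = rng.lognormal(np.log(5e4), 0.8, n)  # yr to biosphere
    half_life = 2.4e4                                 # yr (e.g. Pu-239)

    released = inventory * release_frac * 0.5 ** (travel_time / half_life)
    print(f"median release: {np.median(released):.3g} Ci, "
          f"95th percentile: {np.percentile(released, 95):.3g} Ci")
    ```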

  9. Simulation Tools for Electrical Machines Modelling: Teaching and ...

    African Journals Online (AJOL)

    Simulation tools are used both for research and teaching to allow a good comprehension of the systems under study before practical implementations. This paper illustrates the way MATLAB is used to model non-linearities in a synchronous machine. The machine is modeled in the rotor reference frame with currents as state ...

  10. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    Science.gov (United States)

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

    The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z) prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level.The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, the flexibility inherent in this modeling tool makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this modeling tool allows the user to evaluate a range of management options and implications. The goal of this modeling tool is to be a user-friendly modeling tool for developing fish population models useful to natural resource
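
    The report's placement of variance can be illustrated with a toy projection in which one growth-rate draw is made per replicate (iteration level) and another per year (time-step level); the sketch below uses Python rather than the tool's VBA, and the growth rates and variances are hypothetical, not pallid sturgeon parameters.

    ```python
    # Toy illustration of where variance enters the spreadsheet model:
    # one draw per iteration (replicate) and one per time step (year).
    # Growth rates and variances are hypothetical.
    import numpy as np

    rng = np.random.default_rng(7)
    n_iter, n_years = 1000, 20
    mean_r, sd_iter, sd_year = 0.02, 0.05, 0.10

    final = np.empty(n_iter)
    for i in range(n_iter):
        r_i = rng.normal(mean_r, sd_iter)        # iteration-level draw
        pop = 500.0
        for t in range(n_years):
            r_t = rng.normal(r_i, sd_year)       # time-step-level draw
            pop *= np.exp(r_t)
        final[i] = pop
    print(f"median final population: {np.median(final):.0f}; "
          f"90% interval: {np.percentile(final, [5, 95]).round(0)}")
    ```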

  11. Temporal diagnostic analysis of the SWAT model to detect dominant periods of poor model performance

    Science.gov (United States)

    Guse, Björn; Reusser, Dominik E.; Fohrer, Nicola

    2013-04-01

    Hydrological models generally include thresholds and non-linearities, such as snow-rain-temperature thresholds, non-linear reservoirs, infiltration thresholds and the like. When relating observed variables to modelling results, formal methods often calculate performance metrics over long periods, reporting model performance with only a few numbers. Such approaches are not well suited to comparing the dominating processes between reality and model, or to better understanding when thresholds and non-linearities are driving model results. We present a combination of two temporally resolved model diagnostic tools to answer when a model is performing (not so) well and what the dominant processes are during these periods. We look at the temporal dynamics of parameter sensitivities and model performance to answer this question. For this, the eco-hydrological SWAT model is applied in the Treene lowland catchment in Northern Germany. As a first step, temporal dynamics of parameter sensitivities are analyzed using the Fourier Amplitude Sensitivity Test (FAST). The sensitivities of the eight model parameters investigated show strong temporal variations. High sensitivities were detected for two groundwater parameters (GW_DELAY, ALPHA_BF) and one evaporation parameter (ESCO) most of the time. The periods of high parameter sensitivity can be related to different phases of the hydrograph, with the groundwater parameters dominating in the recession phases and ESCO in baseflow and resaturation periods. Surface runoff parameters show high sensitivities in phases of a precipitation event in combination with high soil water contents. The dominant parameters give an indication of the controlling processes during a given period for the hydrological catchment. The second step included the temporal analysis of model performance. For each time step, model performance was characterized with a "finger print" consisting of a large set of performance measures. These finger prints were clustered into
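
    A FAST analysis of the kind described is straightforward with the SALib package; in the paper the analysis is repeated for each time step to obtain time series of sensitivities, whereas the sketch below runs a single analysis on a toy stand-in for SWAT (the parameter names echo SWAT's, but the model function is invented).

    ```python
    # FAST sensitivity analysis with SALib on a toy stand-in for SWAT.
    # Parameter names echo SWAT's; the model function is illustrative.
    import numpy as np
    from SALib.sample.fast_sampler import sample
    from SALib.analyze import fast

    problem = {
        "num_vars": 3,
        "names": ["GW_DELAY", "ALPHA_BF", "ESCO"],
        "bounds": [[0.0, 500.0], [0.0, 1.0], [0.0, 1.0]],
    }
    X = sample(problem, 1000)               # FAST sampling design

    def toy_discharge(x):
        gw_delay, alpha_bf, esco = x
        return np.exp(-gw_delay / 200.0) + 2.0 * alpha_bf + 0.5 * esco**2

    Y = np.array([toy_discharge(row) for row in X])
    Si = fast.analyze(problem, Y)
    for name, s1 in zip(problem["names"], Si["S1"]):
        print(f"first-order sensitivity of {name}: {s1:.2f}")
    ```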

  12. Development of modeling tools for pin-by-pin precise reactor simulation

    International Nuclear Information System (INIS)

    Ma Yan; Li Shu; Li Gang; Zhang Baoyin; Deng Li; Fu Yuanguang

    2013-01-01

    In order to develop large-scale transport simulation and calculation methods (such as simulation of whole-core pin-by-pin problems), the Institute of Applied Physics and Computational Mathematics developed the neutron-photon coupled transport code JMCT and the toolkit JCOGIN. Creating physical calculation models easily and efficiently can substantially reduce problem-solving time. Currently, many visual modeling programs have been developed based on different CAD systems. In this article, the design of a visual modeling tool based on field-oriented development is introduced. Considering the features of physical modeling, fast and convenient operation modules were developed. In order to solve the storage and conversion problems of large-scale models, a data structure and conversion algorithm based on a hierarchical geometry tree were designed. The automatic conversion and generation of physical model input files for JMCT was realized. Using this modeling tool, the whole-core physical model of the Dayawan reactor was created, and the converted file was delivered to JMCT for transport calculation. The results validate the correctness of the visual modeling tool. (authors)
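
    The hierarchical geometry tree and its flattening into an input deck might look like the sketch below; the node fields and the emitted line format are invented for illustration and do not reflect the real JMCT input format.

    ```python
    # Minimal hierarchical geometry tree and its depth-first flattening
    # into input-deck lines. Node fields and output format are invented.
    from dataclasses import dataclass, field

    @dataclass
    class GeomNode:
        name: str
        shape: str                     # e.g. "cylinder", "lattice", "cell"
        params: dict
        children: list = field(default_factory=list)

        def add(self, child):
            self.children.append(child)
            return child

        def emit(self, depth=0, lines=None):
            """Depth-first traversal producing one input line per node."""
            if lines is None:
                lines = []
            lines.append(f"{'  ' * depth}{self.shape} {self.name} "
                         f"{self.params}")
            for c in self.children:
                c.emit(depth + 1, lines)
            return lines

    core = GeomNode("core", "cylinder", {"r": 150.0, "h": 400.0})
    fa = core.add(GeomNode("fa01", "lattice", {"pitch": 1.26, "nx": 17}))
    fa.add(GeomNode("pin", "cell", {"r_fuel": 0.41, "r_clad": 0.475}))
    print("\n".join(core.emit()))
    ```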

  13. Improving Power System Modeling. A Tool to Link Capacity Expansion and Production Cost Models

    Energy Technology Data Exchange (ETDEWEB)

    Diakov, Victor [National Renewable Energy Lab. (NREL), Golden, CO (United States); Cole, Wesley [National Renewable Energy Lab. (NREL), Golden, CO (United States); Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Brinkman, Gregory [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-11-01

    Capacity expansion models (CEM) provide a high-level, long-term view of the prospects of the evolving power system. In simulating the possibilities of long-term capacity expansion, it is important to maintain the viability of power system operation on short-term (daily, hourly and sub-hourly) scales. Production cost models (PCM) simulate routine power system operation on these shorter time scales using detailed load, transmission and generation fleet data, by minimizing production costs while observing reliability requirements. When based on CEM predictions about generating unit retirements and buildup, PCM provide a more detailed simulation of short-term system operation and, consequently, may confirm the validity of the capacity expansion predictions. Further, production cost model simulations of a system based on a capacity expansion model solution are evolutionarily sound: the generator mix is the result of a logical sequence of unit retirement and buildup resulting from policy and incentives. The above has motivated us to bridge CEM with PCM by building a capacity-expansion-to-production-cost-model Linking Tool (CEPCoLT). The Linking Tool is built to map capacity expansion model prescriptions onto production cost model inputs. NREL's ReEDS and Energy Exemplar's PLEXOS are the capacity expansion and the production cost models, respectively. Via the Linking Tool, PLEXOS provides details of operation for the regionally-defined ReEDS scenarios.

  14. Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.

    Science.gov (United States)

    Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel

    2016-01-01

    One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need to transfer the latest results in the field of machine learning to biomedical researchers. We propose web-based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open-access and user-friendly option to obtain discrete-time predictive survival models at the individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets.
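
    The discrete-time, ANN-based approach can be sketched by expanding subjects into person-period records and training a classifier to predict the per-interval hazard, from which the survival curve is the running product of (1 - hazard); the synthetic data and network settings below are illustrative and are not OSA's implementation.

    ```python
    # Discrete-time survival via person-period expansion and an MLP
    # hazard classifier. Data are synthetic; no censoring for simplicity.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(3)
    n, intervals = 400, 10
    x = rng.normal(size=n)                             # one covariate
    event_time = rng.exponential(np.exp(2.0 - 0.5 * x)).astype(int)
    event_time = np.clip(event_time, 0, intervals - 1)

    # Person-period expansion: one row per subject per interval at risk.
    rows, labels = [], []
    for xi, ti in zip(x, event_time):
        for t in range(ti + 1):
            rows.append([xi, t])
            labels.append(1 if t == ti else 0)         # event in interval t?

    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                        random_state=0)
    clf.fit(np.array(rows), np.array(labels))

    # Predicted survival curve for a new individual.
    hazards = clf.predict_proba([[0.5, t] for t in range(intervals)])[:, 1]
    print(np.round(np.cumprod(1.0 - hazards), 3))
    ```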

  15. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    Science.gov (United States)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for maximizing the liquefaction yield of the plant under the constraints of the other parameters. The results so obtained give a clear idea of the appropriate parameter values to choose before implementation of the actual plant in the field, and an indication of the productivity and profitability of the given plant configuration, leading to the design of an efficient and productive plant.

  16. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    International Nuclear Information System (INIS)

    Joshi, D.M.; Patel, H.K.

    2015-01-01

    Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of various components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values for maximizing the liquefaction yield of the plant under the constraints of the other parameters. The results so obtained give a clear idea of the appropriate parameter values to choose before implementation of the actual plant in the field, and an indication of the productivity and profitability of the given plant configuration, leading to the design of an efficient and productive plant.
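
    For orientation, the liquid yield of an idealized Linde-Hampson cycle follows from a first-law balance around the heat exchanger, expansion valve, and separator: y = (h1 - h2) / (h1 - hf). The enthalpies below are rough illustrative values for air, not numbers extracted from the HYSYS model.

    ```python
    # First-law estimate of liquid yield for an idealized Linde-Hampson
    # cycle: y = (h1 - h2) / (h1 - hf), where h1 is the low-pressure return
    # stream at ambient temperature, h2 the compressed stream entering the
    # exchanger, and hf the saturated liquid. Values are rough figures for
    # air, chosen for illustration only.
    h1 = 298.0   # kJ/kg at ~1 bar, ~300 K
    h2 = 270.0   # kJ/kg at ~200 bar, ~300 K
    hf = 0.0     # kJ/kg, saturated liquid (reference)

    y = (h1 - h2) / (h1 - hf)
    print(f"liquid yield per kg of compressed gas: {y:.3f} kg")
    ```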

  17. Rapid State Space Modeling Tool for Rectangular Wing Aeroservoelastic Studies

    Science.gov (United States)

    Suh, Peter M.; Conyers, Howard Jason; Mavris, Dimitri N.

    2015-01-01

    This report introduces a modeling and simulation tool for aeroservoelastic analysis of rectangular wings with trailing-edge control surfaces. The inputs to the code are planform design parameters such as wing span, aspect ratio, and number of control surfaces. Using this information, the generalized forces are computed using the doublet-lattice method. Using Roger's approximation, a rational function approximation is computed. The output, computed in a few seconds, is a state space aeroservoelastic model which can be used for analysis and control design. The tool is fully parameterized with default information so there is little required interaction with the model developer. All parameters can be easily modified if desired. The focus of this report is on tool presentation, verification, and validation. These processes are carried out in stages throughout the report. The rational function approximation is verified against computed generalized forces for a plate model. A model composed of finite element plates is compared to a modal analysis from commercial software and an independently conducted experimental ground vibration test analysis. Aeroservoelastic analysis is the ultimate goal of this tool, therefore, the flutter speed and frequency for a clamped plate are computed using damping-versus-velocity and frequency-versus-velocity analysis. The computational results are compared to a previously published computational analysis and wind-tunnel results for the same structure. A case study of a generic wing model with a single control surface is presented. Verification of the state space model is presented in comparison to damping-versus-velocity and frequency-versus-velocity analysis, including the analysis of the model in response to a 1-cos gust.

  18. FASTBUS simulation tools

    International Nuclear Information System (INIS)

    Dean, T.D.; Haney, M.J.

    1991-10-01

    A generalized model of a FASTBUS master is presented. The model is used with simulation tools to aid in the specification, design, and production of FASTBUS slave modules. The model provides a mechanism to interact with the electrical schematics and software models to predict performance. The model is written in the IEEE std 1076-1987 hardware description language VHDL. A model of the ATC logic is also presented. VHDL was chosen to provide portability to various platforms and simulation tools. The models, in conjunction with most commercially available simulators, will perform all of the transactions specified in IEEE std 960-1989. The models may be used to study the behavior of electrical schematics and other software models and detect violations of the FASTBUS protocol. For example, a hardware design of a slave module could be studied, protocol violations detected and corrected before committing money to prototype development. The master model accepts a stream of high level commands from an ASCII file to initiate FASTBUS transactions. The high level command language is based on the FASTBUS standard routines listed in IEEE std 1177-1989. Using this standard-based command language to direct the model of the master, hardware engineers can simulate FASTBUS transactions in the language used by physicists and programmers to operate FASTBUS systems. 15 refs., 6 figs

  19. Modelling stillbirth mortality reduction with the Lives Saved Tool

    Directory of Open Access Journals (Sweden)

    Hannah Blencowe

    2017-11-01

    Background: The worldwide burden of stillbirths is large, with an estimated 2.6 million babies stillborn in 2015, including 1.3 million dying during labour. The Every Newborn Action Plan set a stillbirth target of ≤12 per 1000 in all countries by 2030. Planning tools will be essential as countries set policy and plan investment to scale up interventions to meet this target. This paper summarises the approach taken to modelling the impact of scaling up health interventions on stillbirths in the Lives Saved Tool (LiST), and potential future refinements. Methods: The specific application to stillbirths of the general method for modelling the impact of interventions in LiST is described. The evidence for the effectiveness of potential interventions to reduce stillbirths is reviewed, and the assumptions on the affected fraction of stillbirths who could potentially benefit from these interventions are presented. The current assumptions and their effects on stillbirth reduction are described and potential future improvements discussed. Results: High-quality evidence is not available for all parameters in the LiST stillbirth model. Cause-specific mortality data are not available for stillbirths; therefore stillbirths are modelled in LiST using an attributable-fraction approach by timing of stillbirth (antepartum/intrapartum). Of 35 potential interventions to reduce stillbirths identified, eight are currently modelled in LiST. These include childbirth care, induction for prolonged pregnancy, multiple micronutrient and balanced energy supplementation, malaria prevention, and detection and management of hypertensive disorders of pregnancy, diabetes and syphilis. For three of the interventions (childbirth care, detection and management of hypertensive disorders of pregnancy, and diabetes), the estimate of effectiveness is based on expert opinion through a Delphi process. Only for malaria is coverage information available, with coverage
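
    The attributable-fraction bookkeeping can be sketched as below; the affected fraction, effectiveness, and coverage figures are invented, and the formula is a simplified reading of the LiST-style approach, not its exact implementation.

    ```python
    # Simplified attributable-fraction impact calculation: stillbirths
    # averted scale with the affected fraction, intervention effectiveness,
    # and the change in coverage. All numbers are invented placeholders.
    def stillbirths_averted(stillbirths, affected_fraction, effectiveness,
                            coverage_old, coverage_new):
        impact_old = affected_fraction * effectiveness * coverage_old
        impact_new = affected_fraction * effectiveness * coverage_new
        residual = (1 - impact_new) / (1 - impact_old)
        return stillbirths * (1 - residual)

    # e.g. scaling one intervention from 40% to 80% coverage (illustrative):
    averted = stillbirths_averted(10_000, affected_fraction=0.5,
                                  effectiveness=0.45,
                                  coverage_old=0.4, coverage_new=0.8)
    print(f"~{averted:.0f} stillbirths averted")
    ```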

  20. THE USE OF NEURAL NETWORK TECHNOLOGY TO MODEL SWIMMING PERFORMANCE

    Directory of Open Access Journals (Sweden)

    António José Silva

    2007-03-01

    The aims of the present study were: to identify the factors able to explain performance in the 200 meters individual medley and 400 meters front crawl events in young swimmers; to model the performance in those events using non-linear mathematical methods, namely artificial neural networks (multi-layer perceptrons); and to assess the precision of the neural network models in predicting performance. A sample of 138 young swimmers (65 males and 73 females) of national level was submitted to a test battery comprising four different domains: kinanthropometric evaluation, dry-land functional evaluation (strength and flexibility), swimming functional evaluation (hydrodynamic, hydrostatic and bioenergetic characteristics) and swimming technique evaluation. To establish a profile of the young swimmer, non-linear combinations between the preponderant variables for each gender and swim performance in the 200 meters medley and 400 meters front crawl events were developed. For this purpose a feed-forward neural network (a multilayer perceptron with three neurons in a single hidden layer) was used. The prognostic precision of the model (error lower than 0.8% between true and estimated performances) is supported by recent evidence. Therefore, we consider that the neural network tool can be a good approach for the resolution of complex problems such as performance modeling and talent identification in swimming and, possibly, in a wide variety of sports
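
    The topology described, a feed-forward perceptron with a single hidden layer of three neurons, is easy to reproduce in scikit-learn; the data below are synthetic stand-ins for the test-battery variables, not the study's measurements.

    ```python
    # Single-hidden-layer (3-neuron) MLP predicting event time from
    # test-battery variables. Data are synthetic stand-ins.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(5)
    n, d = 138, 8                     # 138 swimmers, 8 selected predictors
    X = rng.normal(size=(n, d))
    t200 = 150 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(0, 1, n)  # seconds

    X_tr, X_te, y_tr, y_te = train_test_split(X, t200, random_state=0)
    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(3,),
                                     max_iter=5000, random_state=0))
    net.fit(X_tr, y_tr)
    err = np.abs(net.predict(X_te) - y_te) / y_te
    print(f"mean relative error: {100 * err.mean():.2f}%")
    ```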

  1. Performance Evaluation of UML2-Modeled Embedded Streaming Applications with System-Level Simulation

    Directory of Open Access Journals (Sweden)

    Arpinen Tero

    2009-01-01

    This article presents an efficient method to capture an abstract performance model of streaming-data real-time embedded systems (RTESs). Unified Modeling Language version 2 (UML2) is used for the performance modeling and as a front-end for a tool framework that enables simulation-based performance evaluation and design-space exploration. The adopted application meta-model in UML resembles the Kahn Process Network (KPN) model and is targeted at simulation-based performance evaluation. The application workload modeling is done using UML2 activity diagrams, and the platform is described with structural UML2 diagrams and model elements. These concepts are defined using a subset of the profile for Modeling and Analysis of Real-time and Embedded (MARTE) systems from the OMG and custom stereotype extensions. The goal of the performance modeling and simulation is to achieve early estimates of task response times and of processing element, memory, and on-chip network utilizations, among other information that is used for design-space exploration. As a case study, a video codec application on multiple processors is modeled, evaluated, and explored. In comparison to related work, this is the first proposal that defines a transformation between UML activity diagrams and streaming-data application workload meta-models and successfully adopts it for RTES performance evaluation.
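
    A toy version of such a simulation-based evaluation can be written with the SimPy library: a producer streams frames through a bounded channel to a processing element, and utilization and response time are measured. The arrival rate, service time, and buffer depth are invented; the paper generates comparable simulations from UML2/MARTE models rather than hand-written code.

    ```python
    # KPN-flavoured toy: producer -> bounded FIFO channel -> consumer (PE).
    # Measures PE utilization and frame response time. Numbers invented.
    import simpy

    FRAMES, latencies, busy = 200, [], 0.0

    def producer(env, ch):
        for i in range(FRAMES):
            yield env.timeout(1.0)           # new frame every 1.0 time units
            yield ch.put((i, env.now))

    def consumer(env, ch):
        global busy
        while True:
            frame, t_in = yield ch.get()
            service = 0.8                    # decode time per frame
            yield env.timeout(service)
            busy += service
            latencies.append(env.now - t_in)

    env = simpy.Environment()
    channel = simpy.Store(env, capacity=4)   # bounded FIFO channel
    env.process(producer(env, channel))
    env.process(consumer(env, channel))
    env.run(until=300)

    print(f"PE utilization: {busy / env.now:.2f}, "
          f"mean response time: {sum(latencies) / len(latencies):.2f}")
    ```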

  2. The Aviation Performance Measuring System (APMS): An Integrated Suite of Tools for Measuring Performance and Safety

    Science.gov (United States)

    Statler, Irving C.; Connor, Mary M. (Technical Monitor)

    1998-01-01

    This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS offers the air transport community an open, voluntary standard for flight-data-analysis software; a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of aircrews in mind. APMS tools must serve the needs of the government and air carriers, as well as aircrews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but also through

  3. Development Life Cycle and Tools for XML Content Models

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL; Morris, Katherine [National Institute of Standards and Technology (NIST); Buhwan, Jeong [POSTECH University, South Korea; Goyal, Puja [National Institute of Standards and Technology (NIST)

    2004-11-01

    Many integration projects today rely on shared semantic models based on standards represented using Extensible Markup Language (XML) technologies. Shared semantic models typically evolve and require maintenance. In addition, to promote interoperability and reduce integration costs, the shared semantics should be reused as much as possible. Semantic components must be consistent and valid in terms of agreed-upon standards and guidelines. In this paper, we describe an activity model for the creation, use, and maintenance of a shared semantic model that is coherent and supports efficient enterprise integration. We then use this activity model to frame our research and the development of tools to support those activities. We provide overviews of these tools, primarily in the context of the W3C XML Schema. At present, we focus our work on the W3C XML Schema as the representation of choice, due to its extensive adoption by industry.
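
    One recurring activity in such a life cycle, validating instance documents against the shared schema, is shown below with lxml; the schema and instance documents are inlined toy content, not NIST artifacts.

    ```python
    # Validating XML instances against a W3C XML Schema with lxml.
    # Schema and instances are toy content for illustration.
    from lxml import etree

    schema_doc = etree.XML(b"""
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="order">
        <xs:complexType>
          <xs:sequence>
            <xs:element name="partNumber" type="xs:string"/>
            <xs:element name="quantity" type="xs:positiveInteger"/>
          </xs:sequence>
        </xs:complexType>
      </xs:element>
    </xs:schema>""")
    schema = etree.XMLSchema(schema_doc)

    good = etree.XML(b"<order><partNumber>A-7</partNumber>"
                     b"<quantity>3</quantity></order>")
    bad = etree.XML(b"<order><quantity>3</quantity></order>")

    for doc in (good, bad):
        print(schema.validate(doc), schema.error_log.last_error)
    ```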

  4. Dynamic wind turbine models in power system simulation tool DIgSILENT

    DEFF Research Database (Denmark)

    Hansen, Anca Daniela; Iov, F.; Sørensen, Poul Ejnar

    , connection of the wind turbine at different types of grid and storage systems. Different control strategies have been developed and implemented for these wind turbine concepts, their performance in normal or fault operation being assessed and discussed by means of simulations. The described control......This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database...... of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The report provides thus a description of the wind turbines modelling, both at a component level and at a system level. The report contains both the description of DIgSILENT built...

  5. Multi-category micro-milling tool wear monitoring with continuous hidden Markov models

    Science.gov (United States)

    Zhu, Kunpeng; Wong, Yoke San; Hong, Geok Soon

    2009-02-01

    In-process monitoring of tool conditions is important in micro-machining due to the high precision requirements and the high tool wear rate. Tool condition monitoring in micro-machining poses new challenges compared to conventional machining. In this paper, a multi-category classification approach is proposed for tool flank wear state identification in micro-milling. Continuous Hidden Markov models (HMMs) are adapted for modeling the tool wear process in micro-milling and for estimating the tool wear state given the cutting force features. For noise robustness, the HMM outputs are passed through a median filter to suppress spurious wear-state transitions caused by the high noise level. A detailed study on the selection of HMM structures for tool condition monitoring (TCM) is presented. Case studies on tool state estimation in the micro-milling of pure copper and steel demonstrate the effectiveness and potential of these methods.
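
    A compact version of the pipeline, continuous-observation HMM decoding followed by median filtering of the state sequence, can be put together with hmmlearn and SciPy; the one-dimensional force feature below is synthetic, and the paper's models are trained per wear state on real cutting-force features.

    ```python
    # HMM wear-state decoding plus median filtering of the state sequence.
    # The force feature is synthetic (three drifting wear states).
    import numpy as np
    from hmmlearn.hmm import GaussianHMM
    from scipy.signal import medfilt

    rng = np.random.default_rng(2)
    segments = [rng.normal(m, 0.8, 200) for m in (1.0, 3.0, 5.0)]
    X = np.concatenate(segments).reshape(-1, 1)

    model = GaussianHMM(n_components=3, covariance_type="diag",
                        n_iter=100, random_state=0)
    model.fit(X)
    raw_states = model.predict(X)
    smoothed = medfilt(raw_states, kernel_size=9)  # suppress spurious jumps

    changes_raw = int(np.count_nonzero(np.diff(raw_states)))
    changes_smooth = int(np.count_nonzero(np.diff(smoothed)))
    print(f"state changes before/after median filter: "
          f"{changes_raw}/{changes_smooth}")
    ```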

  6. Scratch as a Computational Modelling Tool for Teaching Physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  7. A tool for model based diagnostics of the AGS Booster

    International Nuclear Information System (INIS)

    Luccio, A.

    1993-01-01

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on a HP-Apollo workstation system, has proved very general and of immediate physical interpretation

  8. ADAS tools for collisional–radiative modelling of molecules

    Energy Technology Data Exchange (ETDEWEB)

    Guzmán, F., E-mail: francisco.guzman@cea.fr [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom); CEA, IRFM, Saint-Paul-lez-Durance 13108 (France); O’Mullane, M.; Summers, H.P. [Department of Physics, University of Strathclyde, Glasgow G4 0NG (United Kingdom)

    2013-07-15

    New theoretical and computational tools for molecular collisional–radiative models are presented. An application to the hydrogen molecule system has been made. At the same time, a structured database has been created where fundamental cross sections and rates for individual processes, as well as derived data (effective coefficients), are stored. Relative populations for the vibrational states of the ground electronic state of H{sub 2} are presented, and this vibronically resolved model is compared with an electronically resolved one, in which vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify the data assembly. Effective (collisional–radiative) rate coefficients versus temperature and density are presented.
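
    Beneath any collisional–radiative model sits a linear balance of populating and depopulating rates. The toy sketch below solves a steady-state population balance for a hypothetical three-level system with NumPy; the rates are invented and carry no physical meaning, and real tools such as ADAS handle far larger vibronic systems.

    import numpy as np

    # R[i, j]: rate (s^-1) into level i from level j; toy numbers only.
    R = np.array([[0.0,   5.0e3, 1.0e2],
                  [2.0e4, 0.0,   4.0e3],
                  [1.0e3, 2.0e2, 0.0  ]])
    A = R - np.diag(R.sum(axis=0))      # add total depopulation on the diagonal

    # Steady state: A @ n = 0, with the normalization sum(n) = 1 replacing
    # one (redundant) balance equation.
    M = np.vstack([A[:-1], np.ones(3)])
    b = np.array([0.0, 0.0, 1.0])
    n = np.linalg.solve(M, b)
    print(n)                            # relative level populations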

  9. ADAS tools for collisional-radiative modelling of molecules

    Science.gov (United States)

    Guzmán, F.; O'Mullane, M.; Summers, H. P.

    2013-07-01

    New theoretical and computational tools for molecular collisional-radiative models are presented. An application to the hydrogen molecule system has been made. At the same time, a structured database has been created where fundamental cross sections and rates for individual processes, as well as derived data (effective coefficients), are stored. Relative populations for the vibrational states of the ground electronic state of H2 are presented, and this vibronically resolved model is compared with an electronically resolved one, in which vibronic transitions are summed over vibrational sub-states. Some new reaction rates are calculated by means of the impact parameter approximation. Computational tools have been developed to automate the process and simplify the data assembly. Effective (collisional-radiative) rate coefficients versus temperature and density are presented.

  10. Multidisciplinary Energy Assessment of Tertiary Buildings: Automated Geomatic Inspection, Building Information Modeling Reconstruction and Building Performance Simulation

    Directory of Open Access Journals (Sweden)

    Faustino Patiño-Cambeiro

    2017-07-01

    There is an urgent need for energy efficiency in buildings within the European framework, considering its environmental implications and Europe's energy dependence. Furthermore, the need to enhance and increase productivity in the building industry turns new technologies and building energy performance simulation environments into extremely interesting solutions for rigorous analysis and decision making in renovation within acceptable risk levels. The present work describes a multidisciplinary approach for estimating the energy performance of an educational building. The research involved data acquisition with advanced geomatic tools, the development of an optimized building information model, and energy assessment in Building Performance Simulation (BPS) software. Interoperability issues were observed in the different steps of the process. The inspection and diagnostic phases were conducted in a timely, accurate manner thanks to automated data acquisition and subsequent analysis using Building Information Modeling based tools (BIM-based tools). Energy simulation was performed using DesignBuilder, and the results obtained were compared with those yielded by the official software tool established by Spanish regulations for energy certification. The discrepancies between the results of the two programs show that the official software is conservative in this sense, which may lead to an undervaluation of the assessed buildings.

  11. PyCoTools: A Python Toolbox for COPASI.

    Science.gov (United States)

    Welsh, Ciaran M; Fullard, Nicola; Proctor, Carole J; Martinez-Guimera, Alvaro; Isfort, Robert J; Bascom, Charles C; Tasseff, Ryan; Przyborski, Stefan A; Shanley, Daryl P

    2018-05-22

    COPASI is an open source software package for constructing, simulating and analysing dynamic models of biochemical networks. COPASI is primarily intended to be used with a graphical user interface, but it is often desirable to access COPASI features programmatically through a high-level interface. PyCoTools is a Python package aimed at providing such a high-level interface to COPASI tasks, with an emphasis on model calibration. PyCoTools enables the construction of COPASI models and the execution of a subset of COPASI tasks, including time courses, parameter scans and parameter estimations. Additional 'composite' tasks, which use COPASI tasks as building blocks, are available for increasing parameter estimation throughput, performing identifiability analysis and performing model selection. PyCoTools supports exploratory data analysis on parameter estimation data to assist with troubleshooting model calibrations. We demonstrate PyCoTools by posing a model selection problem designed to showcase PyCoTools within a realistic scenario. The aim of the model selection problem is to test the feasibility of three alternative hypotheses in explaining experimental data derived from neonatal dermal fibroblasts in response to TGF-β over time. PyCoTools is used to critically analyse the parameter estimations and propose strategies for model improvement. PyCoTools can be downloaded from the Python Package Index (PyPI) using the command 'pip install pycotools' or directly from GitHub (https://github.com/CiaranWelsh/pycotools). Documentation is available at http://pycotools.readthedocs.io. Supplementary data are available at Bioinformatics.

  12. Simulation models: a current indispensable tool in studies of the continuous water-soil-plant - atmosphere

    International Nuclear Information System (INIS)

    Lopez Seijas, Teresa; Gonzalez, Felicita; Cid, G.; Osorio, Maria de los A.; Ruiz, Maria Elena

    2008-01-01

    Full text: This work assesses the current use of simulation models as a useful and indispensable tool for advancing research into the processes of the water-soil-plant-atmosphere continuum. In recent years, many works have been reported in the literature in which these modelling tools are used to support decision-making by companies or organizations in the agricultural sphere, in particular for the design of optimal irrigation and fertilization management strategies for crops. Some of the latest reported applications of water and solute transfer simulation models are summarized, mainly concerning nitrate leaching and groundwater contamination problems. Important applications of crop growth simulation models for predicting the effects of different water-stress conditions on yield are also summarized, together with further applications on the management of different irrigation technologies such as center pivots, surface irrigation and drip irrigation. The main work carried out in Cuba is also referenced. (author)

  13. A Hybrid Evaluation System Framework (Shell & Web) with Standardized Access to Climate Model Data and Verification Tools for a Clear Climate Science Infrastructure on Big Data High Performance Computers

    Science.gov (United States)

    Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Ulbrich, Uwe; Cubasch, Ulrich

    2015-04-01

    The project 'Integrated Data and Evaluation System for Decadal Scale Prediction' (INTEGRATION), part of the German decadal prediction project MiKlip, develops a central evaluation system. The fully operational hybrid system features HPC shell access and a user-friendly web interface. It employs one common system with a variety of verification tools and validation data from different projects inside and outside of MiKlip. The evaluation system is located at the German Climate Computing Centre (DKRZ) and has direct access to the bulk of its ESGF node, including millions of climate model data sets, e.g. from CMIP5 and CORDEX. The database is organized by the international CMOR standard, using the meta information of the self-describing model, reanalysis and observational data sets. Apache Solr is used for indexing the different data projects into one common search environment. This meta data system, with its advanced but easy-to-handle search tool, supports users, developers and their tools in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a MySQL database. Configurations and results of the tools can be shared among scientists via the shell or the web system. Plugged-in tools therefore gain automatically from transparency and reproducibility. Furthermore, when configurations match while starting an evaluation tool, the system suggests using results already produced…

  14. Evaluating the Impact of Business Intelligence Tools on Organizational Performance in Food and Groceries Retail

    Directory of Open Access Journals (Sweden)

    Sailaja Venuturumilli

    2016-01-01

    While retailers are spending a significant portion of their information technology (IT) budgets on BI and related technology in order to handle the ever-increasing volumes of data, the actual benefits derived from these tools need to be explored. The study focuses on organized food and groceries retail, explores the benefits of business intelligence (BI), and hypothesizes a structural causal relationship among its intrinsic attributes and their impact on organizational performance. A focus group of selected senior marketing employees was used to develop and validate the research model. Based on findings from the literature survey and the focus group, a survey instrument was developed to empirically validate the research model. Data collected from senior marketing executives and managers from six organized food and groceries retailers were analyzed using exploratory factor analysis, confirmatory factor analysis, and structural equation modeling. Five major categories of BI benefits were identified: (1) access to data quality, (2) improved managerial effectiveness, (3) improved operational effectiveness, (4) improved customer orientation and (5) improved organizational efficiency. From the structural causal relationship analysis, a significant relationship was found between the intrinsic attributes and the benefits of BI and data quality. The structural equation model also suggests a significant relationship between BI and data quality on organizational performance.

  15. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. The overall concept of managing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.
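
    The following toy example illustrates the data-warehouse pattern behind such a prototype: a fact table of exam events joined to a dimension table and queried for a workflow key performance indicator. The schema, data and KPI are invented for illustration and do not reproduce the authors' design.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE dim_modality (modality_id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE fact_exam (exam_id INTEGER PRIMARY KEY, modality_id INTEGER,
                                ordered_ts REAL, finalized_ts REAL);
        INSERT INTO dim_modality VALUES (1, 'CT'), (2, 'MR');
        INSERT INTO fact_exam VALUES (1, 1, 0.0, 2.5), (2, 1, 1.0, 5.0),
                                     (3, 2, 0.0, 7.0);
    """)

    # KPI: mean order-to-final-report turnaround (hours) per modality
    query = """
        SELECT m.name, AVG(f.finalized_ts - f.ordered_ts) AS tat_hours
        FROM fact_exam f JOIN dim_modality m USING (modality_id)
        GROUP BY m.name"""
    for name, tat in con.execute(query):
        print(name, round(tat, 1))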

  16. Unleashing spatially distributed ecohydrology modeling using Big Data tools

    Science.gov (United States)

    Miles, B.; Idaszak, R.

    2015-12-01

    Physically based spatially distributed ecohydrology models are useful for answering science and management questions related to the hydrology and biogeochemistry of prairie, savanna, forested, as well as urbanized ecosystems. However, these models can produce hundreds of gigabytes of spatial output for a single model run over decadal time scales when run at regional spatial scales and moderate spatial resolutions (~100-km2+ at 30-m spatial resolution) or when run for small watersheds at high spatial resolutions (~1-km2 at 3-m spatial resolution). Numerical data formats such as HDF5 can store arbitrarily large datasets. However, even in HPC environments, there are practical limits on the size of single files that can be stored and reliably backed up. Even when such large datasets can be stored, querying and analyzing these data can suffer from poor performance due to memory limitations and I/O bottlenecks, for example on single workstations where memory and bandwidth are limited, or in HPC environments where data are stored separately from computational nodes. The difficulty of storing and analyzing spatial data from ecohydrology models limits our ability to harness these powerful tools. Big Data tools such as distributed databases have the potential to surmount the data storage and analysis challenges inherent to large spatial datasets. Distributed databases solve these problems by storing data close to computational nodes while enabling horizontal scalability and fault tolerance. Here we present the architecture of and preliminary results from PatchDB, a distributed datastore for managing spatial output from the Regional Hydro-Ecological Simulation System (RHESSys). The initial version of PatchDB uses message queueing to asynchronously write RHESSys model output to an Apache Cassandra cluster. Once stored in the cluster, these data can be efficiently queried to quickly produce spatial visualizations for a particular variable (e.g. maps and animations), as well as…
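
    PatchDB's code is not reproduced here, but the write pattern it describes (model output flowing through a message queue into a Cassandra cluster) can be sketched with the DataStax Python driver. The keyspace, table layout and all names below are assumptions for illustration, not the actual PatchDB schema.

    import queue
    import threading
    from cassandra.cluster import Cluster    # pip install cassandra-driver

    outbox = queue.Queue()                   # stands in for the message queue

    def writer(session, insert_stmt):
        # Drain the queue and issue non-blocking writes to Cassandra.
        while True:
            rec = outbox.get()
            if rec is None:                  # sentinel: shut down
                break
            session.execute_async(insert_stmt, rec)
            outbox.task_done()

    cluster = Cluster(["127.0.0.1"])
    session = cluster.connect("rhessys")     # assumed keyspace
    insert_stmt = session.prepare(
        "INSERT INTO patch_output (patch_id, timestep, variable, value) "
        "VALUES (?, ?, ?, ?)")
    threading.Thread(target=writer, args=(session, insert_stmt),
                     daemon=True).start()

    # The simulation (or a queue consumer) enqueues records without blocking:
    outbox.put((42, 20150101, "streamflow", 1.37))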

  17. Hanford River Protection Project Life cycle Cost Modeling Tool to Enhance Mission Planning - 13396

    International Nuclear Information System (INIS)

    Dunford, Gary; Williams, David; Smith, Rick

    2013-01-01

    The Life cycle Cost Model (LCM) Tool is an overall systems model that incorporates budget and schedule impacts for the entire life cycle of the River Protection Project (RPP) mission, and it is replacing the Hanford Tank Waste Operations Simulator (HTWOS) model as the foundation of the RPP system planning process. Currently, the DOE frequently requests HTWOS simulations of alternative technical and programmatic strategies for completing the RPP mission. Analysis of technical and programmatic changes can be performed with HTWOS; however, life cycle costs and schedules were previously generated by manual transfer of time-based data from HTWOS to Primavera P6. The LCM Tool automates the preparation of life cycle costs and schedules and is needed to provide timely turnaround capability for RPP mission alternative analyses. LCM is the simulation component of the LCM Tool. The simulation component is a replacement of the HTWOS model with new capability to support life cycle cost modeling. It is currently deployed in G22, but has been designed to work in any full object-oriented language with an extensive feature set focused on networking and cross-platform compatibility. The LCM retains existing HTWOS functionality needed to support system planning and alternatives studies going forward. In addition, it incorporates new functionality, coding improvements that streamline programming and model maintenance, and the capability to import/export data to/from the LCM using the LCM Database (LCMDB). The LCM Cost/Schedule (LCMCS) component contains cost and schedule data and logic. The LCMCS is used to generate life cycle costs and schedules for waste retrieval and processing scenarios. It uses time-based output data from the LCM to produce the logic ties in Primavera P6 necessary for shifting activities. The LCM Tool is evolving to address the needs of decision makers who want to understand the broad spectrum of risks facing complex organizations like DOE-RPP and to understand how near…

  18. Hanford River Protection Project Life cycle Cost Modeling Tool to Enhance Mission Planning - 13396

    Energy Technology Data Exchange (ETDEWEB)

    Dunford, Gary [AEM Consulting, LLC, 1201 Jadwin Avenue, Richland, WA 99352 (United States); Williams, David [WIT, Inc., 11173 Oak Fern Court, San Diego, CA 92131 (United States); Smith, Rick [Knowledge Systems Design, Inc., 13595 Quaker Hill Cross Rd, Nevada City, CA 95959 (United States)

    2013-07-01

    The Life cycle Cost Model (LCM) Tool is an overall systems model that incorporates budget and schedule impacts for the entire life cycle of the River Protection Project (RPP) mission, and it is replacing the Hanford Tank Waste Operations Simulator (HTWOS) model as the foundation of the RPP system planning process. Currently, the DOE frequently requests HTWOS simulations of alternative technical and programmatic strategies for completing the RPP mission. Analysis of technical and programmatic changes can be performed with HTWOS; however, life cycle costs and schedules were previously generated by manual transfer of time-based data from HTWOS to Primavera P6. The LCM Tool automates the preparation of life cycle costs and schedules and is needed to provide timely turnaround capability for RPP mission alternative analyses. LCM is the simulation component of the LCM Tool. The simulation component is a replacement of the HTWOS model with new capability to support life cycle cost modeling. It is currently deployed in G22, but has been designed to work in any full object-oriented language with an extensive feature set focused on networking and cross-platform compatibility. The LCM retains existing HTWOS functionality needed to support system planning and alternatives studies going forward. In addition, it incorporates new functionality, coding improvements that streamline programming and model maintenance, and the capability to import/export data to/from the LCM using the LCM Database (LCMDB). The LCM Cost/Schedule (LCMCS) component contains cost and schedule data and logic. The LCMCS is used to generate life cycle costs and schedules for waste retrieval and processing scenarios. It uses time-based output data from the LCM to produce the logic ties in Primavera P6 necessary for shifting activities. The LCM Tool is evolving to address the needs of decision makers who want to understand the broad spectrum of risks facing complex organizations like DOE-RPP and to understand how near…

  19. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    Science.gov (United States)

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  20. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  1. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072 - “Suite of open-source applications and models for advanced synchrophasor analysis”) and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
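
    OBAT's internal algorithms are not spelled out in this record, so the sketch below shows only the flavour of ringdown mode identification: the dominant frequency from an FFT peak and a damping ratio from the decay of the Hilbert envelope. Production tools typically use multi-mode methods such as Prony or matrix-pencil analysis; treat this purely as an illustration.

    import numpy as np
    from scipy.signal import hilbert

    def identify_mode(x, fs):
        # Return (frequency in Hz, damping ratio) of the dominant mode in x.
        x = np.asarray(x, dtype=float) - np.mean(x)
        spec = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        f0 = freqs[np.argmax(spec[1:]) + 1]            # skip the DC bin
        env = np.abs(hilbert(x))                       # amplitude envelope
        t = np.arange(len(x)) / fs
        sigma = -np.polyfit(t, np.log(env + 1e-12), 1)[0]   # decay rate, 1/s
        zeta = sigma / np.hypot(sigma, 2.0 * np.pi * f0)    # damping ratio
        return f0, zeta

    # Example: a 0.3 Hz inter-area-style mode with ~5% damping, 30 Hz sampling
    t = np.arange(0, 20, 1.0 / 30)
    ring = np.exp(-0.094 * t) * np.sin(2 * np.pi * 0.3 * t)
    print(identify_mode(ring, 30.0))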

  2. Tools for model-independent bounds in direct dark matter searches

    DEFF Research Database (Denmark)

    Cirelli, M.; Del Nobile, E.; Panci, P.

    2013-01-01

    We discuss a framework (based on non-relativistic operators) and a self-contained set of numerical tools to derive the bounds from some current direct detection experiments on virtually any arbitrary model of Dark Matter elastically scattering on nuclei.

  3. High-throughput migration modelling for estimating exposure to chemicals in food packaging in screening and prioritization tools

    DEFF Research Database (Denmark)

    Ernstoff, Alexi S; Fantke, Peter; Huang, Lei

    2017-01-01

    Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioriti...... to uncertainty and dramatically decreased model performance (R2 = 0.4, Se = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches....
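
    As a hedged illustration of high-throughput migration screening, the sketch below implements the classic short-time diffusion approximation for transfer from a polymer layer into food, capped at complete transfer. The parameter values and the choice of diffusion coefficient are placeholders, not the paper's parameterization.

    import math

    def migrated_fraction(D_cm2_s, thickness_cm, t_s):
        # Fraction of the initial chemical mass migrated after time t,
        # short-time semi-infinite approximation, capped at total transfer.
        frac = (2.0 / thickness_cm) * math.sqrt(D_cm2_s * t_s / math.pi)
        return min(frac, 1.0)

    # e.g. 10-day storage, 100-micron polymer layer, D = 1e-12 cm^2/s
    print(migrated_fraction(1e-12, 0.01, 10 * 86400))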

  4. Rationale and Resources for Teaching the Mathematical Modeling of Athletic Training and Performance

    Science.gov (United States)

    Clarke, David C.; Skiba, Philip F.

    2013-01-01

    A number of professions rely on exercise prescription to improve health or athletic performance, including coaching, fitness/personal training, rehabilitation, and exercise physiology. It is therefore advisable that the professionals involved learn the various tools available for designing effective training programs. Mathematical modeling of…

  5. Advanced REACH tool: A Bayesian model for occupational exposure assessment

    NARCIS (Netherlands)

    McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.

    2014-01-01

    This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher-tier exposure tool that combines disparate…
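
    The ART methodology itself is a hierarchical Bayesian model; the fragment below shows only the textbook core of such an approach, assuming lognormal exposures: a conjugate normal update on the log scale that combines a mechanistic prior with a handful of measurements. The numbers and the fixed measurement error are illustrative, not ART's actual model.

    import numpy as np

    def posterior_log_exposure(prior_mu, prior_sd, measurements, meas_sd=0.8):
        # All quantities on the natural-log scale; measurements in mg/m3.
        logs = np.log(measurements)
        n = len(logs)
        post_var = 1.0 / (1.0 / prior_sd**2 + n / meas_sd**2)
        post_mu = post_var * (prior_mu / prior_sd**2 + logs.sum() / meas_sd**2)
        return post_mu, np.sqrt(post_var)

    mu, sd = posterior_log_exposure(np.log(0.5), 1.0, [0.9, 1.4, 0.7])
    print(f"posterior GM ~ {np.exp(mu):.2f} mg/m3, GSD ~ {np.exp(sd):.2f}")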

  6. Dynamic wind turbine models in power system simulation tool DIgSILENT

    OpenAIRE

    Hansen, A.D.; Jauch, C.; Sørensen, Poul Ejnar; Iov, F.; Blaabjerg, F.

    2004-01-01

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT (Version 12.0). The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. This model database should be able to support the analysis of the interaction between the mechanical structure of the wind turbine and the electrical grid during different operational modes. The repo…

  7. An Interactive Tool for Automatic Predimensioning and Numerical Modeling of Arch Dams

    Directory of Open Access Journals (Sweden)

    D. J. Vicente

    2017-01-01

    The construction of double-curvature arch dams is an attractive solution from an economic viewpoint due to the reduced volume of concrete necessary for their construction as compared to conventional gravity dams. Due to their complex geometry, many criteria have arisen for their design. However, the most widespread methods are based on recommendations of traditional technical documents, without taking into account the possibilities of computer-aided design. In this paper, an innovative software tool to design FEM models of double-curvature arch dams is presented. Several capabilities are provided: simplified geometry creation (interesting for academic purposes), preliminary geometrical design, high-detail model construction, and stochastic calculation performance (introducing uncertainty associated with material properties and other parameters). This paper especially focuses on geometrical issues, describing the functionalities of the tool and the fundamentals of the design procedure with regard to the following aspects: topography, reference cylinder, excavation depth, crown cantilever thickness and curvature, horizontal arch curvature, excavation and concrete mass volume, and additional elements such as joints or spillways. Examples of application to two Spanish dams are presented and the results obtained analyzed.

  8. Introducing Modeling Transition Diagrams as a Tool to Connect Mathematical Modeling to Mathematical Thinking

    Science.gov (United States)

    Czocher, Jennifer A.

    2016-01-01

    This study contributes a methodological tool to reconstruct the cognitive processes and mathematical activities carried out by mathematical modelers. Represented as Modeling Transition Diagrams (MTDs), individual modeling routes were constructed for four engineering undergraduate students. Findings stress the importance and limitations of using…

  9. OrFPGA: An Empirical Performance Tuning Tool for FPGA Designs, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In this Phase II STTR project, RNET and its subcontractors are proposing to fully develop an empirical performance optimization tool called OrFPGA that efficiently...

  10. Catchment Models and Management Tools for diffuse Contaminants (Sediment, Phosphorus and Pesticides): DIFFUSE Project

    Science.gov (United States)

    Mockler, Eva; Reaney, Simeon; Mellander, Per-Erik; Wade, Andrew; Collins, Adrian; Arheimer, Berit; Bruen, Michael

    2017-04-01

    …state-of-the-art methods and models that are most applicable to Irish conditions and management challenges. All styles of modelling considered useful for water resources management are relevant to this project, and a balance of technical sophistication, data availability and operational practicalities is the ultimate goal. Achievement of this objective will be measured by comparing the performance of the new models developed in the project with models used in other countries. The models and tools developed in the course of the project will be evaluated by comparison with Irish catchment data and with other state-of-the-art models in a model inter-comparison workshop which will be open to other models and the wider research community.

  11. Endoscopy nurse-administered propofol sedation performance. Development of an assessment tool and a reliability testing model

    DEFF Research Database (Denmark)

    Jensen, Jeppe Thue; Konge, Lars; Møller, Ann

    2014-01-01

    …of training and for future certification. The aim of this study was to develop an assessment tool for measuring competency in propofol sedation and to explore the reliability and validity of the tool. MATERIAL AND METHODS: The nurse-administered propofol assessment tool (NAPSAT) was developed in a Delphi…… and good construct validity. This makes NAPSAT fit for formative assessment and proficiency feedback; however, high-stakes and summative assessment cannot be advised.

  12. The systems integration operations/logistics model as a decision-support tool

    International Nuclear Information System (INIS)

    Miller, C.; Vogel, L.W.; Joy, D.S.

    1989-01-01

    Congress has enacted legislation specifying Yucca Mountain, Nevada, for characterization as the candidate site for the disposal of spent fuel and high-level wastes, and has authorized a monitored retrievable storage (MRS) facility if one is warranted. Nevertheless, the exact configuration of the facilities making up the Federal Waste Management System (FWMS) was not specified. This has left the Office of Civilian Radioactive Waste Management (OCRWM) with the responsibility for assuring the design of a safe and reliable disposal system. In order to assist in the analysis of potential configuration alternatives, operating strategies, and other factors for the FWMS and its various elements, a decision-support tool known as the systems integration operations/logistics model (SOLMOD) was developed. SOLMOD is a discrete-event simulation model that emulates the movement and interaction of equipment and radioactive waste as it is processed through the FWMS, from pickup at reactor pools to emplacement. The model can be used to measure the impacts of different operating schedules and rules, system configurations, and equipment and other resource availabilities on the performance of processes comprising the FWMS, and how these factors combine to determine overall system performance. SOLMOD can assist in identifying bottlenecks and can be used to assess capacity utilization of specific equipment and staff as well as overall system resilience.
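
    SOLMOD itself predates today's open-source stacks, but the discrete-event core it describes is easy to sketch with the simpy library: shipments arrive at a facility with a limited number of handling bays, queue for service, and the log reveals queueing delays and potential bottlenecks. All rates and capacities below are invented for illustration.

    import random
    import simpy                      # pip install simpy

    def shipment(env, bays, waits):
        arrive = env.now
        with bays.request() as req:
            yield req                                    # queue for a free bay
            waits.append(env.now - arrive)
            yield env.timeout(random.expovariate(1 / 6.0))   # ~6 h handling

    def arrivals(env, bays, waits):
        while True:
            yield env.timeout(random.expovariate(1 / 4.0))   # ~4 h between casks
            env.process(shipment(env, bays, waits))

    random.seed(1)
    env = simpy.Environment()
    bays = simpy.Resource(env, capacity=2)               # two handling bays
    waits = []
    env.process(arrivals(env, bays, waits))
    env.run(until=24 * 30)                               # 30 simulated days
    print(f"{len(waits)} casks handled, "
          f"mean queue wait {sum(waits) / len(waits):.1f} h")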

  13. Green roof hydrologic performance and modeling: a review.

    Science.gov (United States)

    Li, Yanling; Babcock, Roger W

    2014-01-01

    Green roofs reduce runoff from impervious surfaces in urban development. This paper reviews the technical literature on green roof hydrology. Laboratory experiments and field measurements have shown that green roofs can reduce stormwater runoff volume by 30 to 86%, reduce peak flow rate by 22 to 93%, and delay the peak flow by 0 to 30 min, thereby decreasing pollution, flooding and erosion during precipitation events. However, the effectiveness can vary substantially due to design characteristics, making performance predictions difficult. Evaluation of the most recently published study findings indicates that the major factors affecting green roof hydrology are precipitation volume, precipitation dynamics, antecedent conditions, growth medium, plant species, and roof slope. This paper also evaluates the computer models commonly used to simulate hydrologic processes for green roofs, including the Storm Water Management Model (SWMM), Soil Water Atmosphere and Plant (SWAP), SWMS-2D, HYDRUS, and other models that are shown to be effective for predicting precipitation response and economic benefits. The review findings indicate that green roofs are effective for reduction of runoff volume and peak flow, and delay of peak flow; however, no tool or model is available to predict expected performance for any given anticipated system based on design parameters that directly affect green roof hydrology.
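
    A deliberately minimal storage ("bucket") sketch below illustrates why retention varies with antecedent moisture and storm depth, as the review notes. Real tools such as SWMM or HYDRUS resolve far more physics; the parameter values here are placeholders.

    def green_roof_runoff(rain_mm, et_mm_per_step, capacity_mm=30.0, storage_mm=10.0):
        # rain_mm: per-step rainfall depths; returns (runoff series, final storage).
        runoff, s = [], storage_mm
        for p in rain_mm:
            s = max(0.0, s - et_mm_per_step)   # evapotranspiration losses
            s += p
            q = max(0.0, s - capacity_mm)      # overflow once the medium is full
            s -= q
            runoff.append(q)
        return runoff, s

    # A dry step, two storms and a small event: only the big storm overflows.
    print(green_roof_runoff([0.0, 12.0, 25.0, 5.0], et_mm_per_step=2.0)[0])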

  14. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  15. Modelling fuel cell performance using artificial intelligence

    Science.gov (United States)

    Ogaji, S. O. T.; Singh, R.; Pilidis, P.; Diacakis, M.

    Over the last few years, fuel cell technology has been promisingly increasing its share in the generation of stationary power. Numerous pilot projects are operating worldwide, continuously increasing the number of operating hours either as stand-alone devices or as part of gas turbine combined cycles. An essential tool for the adequate and dynamic analysis of such systems is a software model that enables the user to assess a large number of alternative options in the least possible time. On the other hand, the sphere of application of artificial neural networks has widened, covering such endeavours as medicine, finance and, unsurprisingly, engineering (diagnostics of faults in machines). Artificial neural networks have been described as a diagrammatic representation of a mathematical equation that receives values (inputs) and gives out results (outputs). Artificial neural network systems have the capacity to recognise and associate patterns and, because of their inherent design features, they can be applied to linear and non-linear problem domains. In this paper, the performance of the fuel cell is modelled using artificial neural networks. The inputs to the network are variables that are critical to the performance of the fuel cell, while the outputs are the result of changes in any one or all of the fuel cell design variables on its performance. Critical parameters for the cell include the geometrical configuration as well as the operating conditions. For the neural network, various network design parameters, such as the network size, training algorithm and activation functions, and their effects on the effectiveness of the performance modelling, are discussed. Results from the analysis as well as the limitations of the approach are presented and discussed.
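
    A minimal scikit-learn sketch of the surrogate-modelling idea: a small multilayer perceptron mapping operating and design variables to a performance output such as cell voltage. The input variables, the synthetic "truth" and the network size are assumptions for illustration, not the authors' topology or data.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Columns: current density, temperature (K), pressure (bar), membrane thickness (um)
    X = rng.uniform([0.1, 323, 1.0, 50], [1.5, 363, 3.0, 200], size=(500, 4))
    # Placeholder "truth": voltage falls with current density, rises with T and P
    y = (1.0 - 0.35 * X[:, 0] + 0.001 * (X[:, 1] - 343)
         + 0.02 * X[:, 2] - 0.0004 * X[:, 3] + rng.normal(0, 0.01, 500))

    surrogate = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0))
    surrogate.fit(X, y)
    print(surrogate.predict([[0.8, 353, 2.0, 100]]))   # predicted cell voltage (V)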

  16. Modelling fuel cell performance using artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Ogaji, S.O.T.; Singh, R.; Pilidis, P.; Diacakis, M. [Power Propulsion and Aerospace Engineering Department, Centre for Diagnostics and Life Cycle Costs, Cranfield University (United Kingdom)

    2006-03-09

    Over the last few years, fuel cell technology has been promisingly increasing its share in the generation of stationary power. Numerous pilot projects are operating worldwide, continuously increasing the number of operating hours either as stand-alone devices or as part of gas turbine combined cycles. An essential tool for the adequate and dynamic analysis of such systems is a software model that enables the user to assess a large number of alternative options in the least possible time. On the other hand, the sphere of application of artificial neural networks has widened, covering such endeavours as medicine, finance and, unsurprisingly, engineering (diagnostics of faults in machines). Artificial neural networks have been described as a diagrammatic representation of a mathematical equation that receives values (inputs) and gives out results (outputs). Artificial neural network systems have the capacity to recognise and associate patterns and, because of their inherent design features, they can be applied to linear and non-linear problem domains. In this paper, the performance of the fuel cell is modelled using artificial neural networks. The inputs to the network are variables that are critical to the performance of the fuel cell, while the outputs are the result of changes in any one or all of the fuel cell design variables on its performance. Critical parameters for the cell include the geometrical configuration as well as the operating conditions. For the neural network, various network design parameters, such as the network size, training algorithm and activation functions, and their effects on the effectiveness of the performance modelling, are discussed. Results from the analysis as well as the limitations of the approach are presented and discussed. (author)

  17. AgMIP Training in Multiple Crop Models and Tools

    Science.gov (United States)

    Boote, Kenneth J.; Porter, Cheryl H.; Hargreaves, John; Hoogenboom, Gerrit; Thornburn, Peter; Mutter, Carolyn

    2015-01-01

    The Agricultural Model Intercomparison and Improvement Project (AgMIP) has the goal of using multiple crop models to evaluate climate impacts on agricultural production and food security in developed and developing countries. There are several major limitations that must be overcome to achieve this goal, including the need to train AgMIP regional research team (RRT) crop modelers to use models other than the ones they are currently familiar with, plus the need to harmonize and interconvert the disparate input file formats used for the various models. Two activities were followed to address these shortcomings among AgMIP RRTs to enable them to use multiple models to evaluate climate impacts on crop production and food security. We designed and conducted courses in which participants trained on two different sets of crop models, with emphasis on the model of least experience. In a second activity, the AgMIP IT group created templates for inputting data on soils, management, weather, and crops into AgMIP harmonized databases, and developed translation tools for converting the harmonized data into files that are ready for multiple crop model simulations. The strategies for creating and conducting the multi-model course and developing entry and translation tools are reviewed in this chapter.

  18. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|SpeedShop

    Energy Technology Data Exchange (ETDEWEB)

    Galarowicz, James E. [Krell Institute, Ames, IA (United States); Miller, Barton P. [Univ. of Wisconsin, Madison, WI (United States). Computer Sciences Dept.; Hollingsworth, Jeffrey K. [Univ. of Maryland, College Park, MD (United States). Computer Sciences Dept.; Roth, Philip [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Future Technologies Group, Computer Science and Math Division; Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing (CASC)

    2013-12-19

    In this project we created a community tool infrastructure for program development tools targeting petascale-class machines and beyond. This includes tools for performance analysis, debugging, and correctness, as well as tuning and optimization frameworks. The developed infrastructure provides a comprehensive and extensible set of individual tool-building components. We started with the basic elements necessary across all tools in such an infrastructure, followed by a set of generic core modules that allow a comprehensive performance analysis at scale. Further, we developed a methodology and workflow that allows others to add or replace modules, to integrate parts into their own tools, or to customize existing solutions. In order to form the core modules, we built on the existing Open|SpeedShop infrastructure and decomposed it into individual modules that match the necessary tool components. At the same time, we addressed the challenges found in performance tools for petascale systems in each module. When assembled, this instantiation of the community tool infrastructure provides an enhanced version of Open|SpeedShop which, while completely different in its architecture, provides scalable performance analysis for petascale applications through a familiar interface. This project also built upon and enhanced the capabilities and reusability of project partner components, as specified in the original project proposal. The overall project team's work over the project funding cycle was focused on several areas of research, which are described in the following sections. The remainder of this report also highlights related work as well as preliminary work that supported the project. In addition to the project partners funded by the Office of Science under this grant, the project team included several collaborators who contributed to the overall design of the envisioned tool infrastructure. In particular, the project team worked closely with the other two DOE NNSA…

  19. Tools for macromolecular model building and refinement into electron cryo-microscopy reconstructions

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Alan; Long, Fei; Nicholls, Robert A.; Toots, Jaan; Emsley, Paul; Murshudov, Garib, E-mail: garib@mrc-lmb.cam.ac.uk [MRC Laboratory of Molecular Biology, Francis Crick Avenue, Cambridge CB2 0QH (United Kingdom)

    2015-01-01

    A description is given of new tools to facilitate model building and refinement into electron cryo-microscopy reconstructions. The recent rapid development of single-particle electron cryo-microscopy (cryo-EM) now allows structures to be solved by this method at resolutions close to 3 Å. Here, a number of tools to facilitate the interpretation of EM reconstructions with stereochemically reasonable all-atom models are described. The BALBES database has been repurposed as a tool for identifying protein folds from density maps. Modifications to Coot, including new Jiggle Fit and morphing tools and improved handling of nucleic acids, enhance its functionality for interpreting EM maps. REFMAC has been modified for optimal fitting of atomic models into EM maps. As external structural information can enhance the reliability of the derived atomic models, stabilize refinement and reduce overfitting, ProSMART has been extended to generate interatomic distance restraints from nucleic acid reference structures, and a new tool, LIBG, has been developed to generate nucleic acid base-pair and parallel-plane restraints. Furthermore, restraint generation has been integrated with visualization and editing in Coot, and these restraints have been applied to both real-space refinement in Coot and reciprocal-space refinement in REFMAC.

  20. Tools for macromolecular model building and refinement into electron cryo-microscopy reconstructions

    International Nuclear Information System (INIS)

    Brown, Alan; Long, Fei; Nicholls, Robert A.; Toots, Jaan; Emsley, Paul; Murshudov, Garib

    2015-01-01

    A description is given of new tools to facilitate model building and refinement into electron cryo-microscopy reconstructions. The recent rapid development of single-particle electron cryo-microscopy (cryo-EM) now allows structures to be solved by this method at resolutions close to 3 Å. Here, a number of tools to facilitate the interpretation of EM reconstructions with stereochemically reasonable all-atom models are described. The BALBES database has been repurposed as a tool for identifying protein folds from density maps. Modifications to Coot, including new Jiggle Fit and morphing tools and improved handling of nucleic acids, enhance its functionality for interpreting EM maps. REFMAC has been modified for optimal fitting of atomic models into EM maps. As external structural information can enhance the reliability of the derived atomic models, stabilize refinement and reduce overfitting, ProSMART has been extended to generate interatomic distance restraints from nucleic acid reference structures, and a new tool, LIBG, has been developed to generate nucleic acid base-pair and parallel-plane restraints. Furthermore, restraint generation has been integrated with visualization and editing in Coot, and these restraints have been applied to both real-space refinement in Coot and reciprocal-space refinement in REFMAC

  1. Customer Data Analysis Model using Business Intelligence Tools in Telecommunication Companies

    Directory of Open Access Journals (Sweden)

    Monica LIA

    2015-10-01

    This article presents a customer data analysis model for a telecommunication company, together with business intelligence tools for data modelling, transformation, data visualization and dynamic report building. In a mature market, extracting the knowledge hidden in the data and making forecasts for strategic decisions have become increasingly important in the Romanian market. Business intelligence tools are used in business organizations as support for decision-making.

  2. Improving the performance of the actinic inspection tool with an optimized alignment procedure

    International Nuclear Information System (INIS)

    Mochi, I.; Goldberg, K.A.; Naulleau, P.; Huh, Sungmin

    2009-01-01

    Extreme ultraviolet (EUV) microscopy is an important tool for investigating the performance of EUV masks, for detecting the presence and characteristics of defects, and for evaluating the effectiveness of defect repair techniques. Aerial image measurement bypasses the difficulties inherent to photoresist imaging and enables high data collection speed and flexibility. It provides reliable and quick feedback for the development of masks and of lithography system modeling methods. We operate the SEMATECH Berkeley Actinic Inspection Tool (AIT), an EUV microscope installed at the Advanced Light Source at Lawrence Berkeley National Laboratory. The AIT is equipped with several high-magnification Fresnel zoneplate lenses, with various numerical aperture values, that enable it to image the reflective mask surface with various resolution and magnification settings. Although the AIT has undergone significant recent improvements in terms of imaging resolution and illumination uniformity, there is still room for improvement. In the AIT, an off-axis zoneplate lens collects the light coming from the sample, and an image of the sample is projected onto an EUV-sensitive CCD camera. The simplicity of the optical system is particularly helpful considering that the AIT alignment has to be performed every time a sample or a zoneplate is replaced. The alignment is sensitive to several parameters, such as the lens position and orientation, the illumination direction and the sample characteristics. Since the AIT works in high vacuum, there is no direct access to the optics or to the sample during the alignment and the measurements. For all these reasons the alignment procedures and feedback can be complex, and in some cases can reduce the overall data throughput of the system. In this paper we review the main strategies and procedures that have been developed for quick and reliable alignments, and we describe the performance improvements we have achieved in terms of aberration…

  3. Improving the performance of the actinic inspection tool with an optimized alignment procedure

    Energy Technology Data Exchange (ETDEWEB)

    Mochi, I.; Goldberg, K.A.; Naulleau, P.; Huh, Sungmin

    2009-03-04

    Extreme ultraviolet (EUV) microscopy is an important tool for investigating the performance of EUV masks, for detecting the presence and characteristics of defects, and for evaluating the effectiveness of defect repair techniques. Aerial image measurement bypasses the difficulties inherent to photoresist imaging and enables high data collection speed and flexibility. It provides reliable and quick feedback for the development of masks and of lithography system modeling methods. We operate the SEMATECH Berkeley Actinic Inspection Tool (AIT), an EUV microscope installed at the Advanced Light Source at Lawrence Berkeley National Laboratory. The AIT is equipped with several high-magnification Fresnel zoneplate lenses, with various numerical aperture values, that enable it to image the reflective mask surface with various resolution and magnification settings. Although the AIT has undergone significant recent improvements in terms of imaging resolution and illumination uniformity, there is still room for improvement. In the AIT, an off-axis zoneplate lens collects the light coming from the sample, and an image of the sample is projected onto an EUV-sensitive CCD camera. The simplicity of the optical system is particularly helpful considering that the AIT alignment has to be performed every time a sample or a zoneplate is replaced. The alignment is sensitive to several parameters, such as the lens position and orientation, the illumination direction and the sample characteristics. Since the AIT works in high vacuum, there is no direct access to the optics or to the sample during the alignment and the measurements. For all these reasons the alignment procedures and feedback can be complex, and in some cases can reduce the overall data throughput of the system. In this paper we review the main strategies and procedures that have been developed for quick and reliable alignments, and we describe the performance improvements we have achieved in terms of aberration…

  4. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding, and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best-practice tool engineering design approach. The two…

  5. Development of hydrogeological modelling tools based on NAMMU

    Energy Technology Data Exchange (ETDEWEB)

    Marsic, N. [Kemakta Konsult AB, Stockholm (Sweden); Hartley, L.; Jackson, P.; Poole, M. [AEA Technology, Harwell (United Kingdom); Morvik, A. [Bergen Software Services International AS, Bergen (Norway)

    2001-09-01

    A number of relatively sophisticated hydrogeological models were developed within the SR 97 project to handle issues such as nesting of scales and the effects of salinity. However, these issues and others are considered of significant importance and generality to warrant further development of the hydrogeological methodology. Several such developments based on the NAMMU package are reported here:
    - Embedded grid: nesting of the regional- and site-scale models within the same numerical model has given greater consistency in the structural model representation and in the flow between scales. Since there is a continuous representation of the regional and site scales, the modelling of pathways from the repository no longer has to be contained wholly by the site-scale region. This allows greater choice in the size of the site-scale.
    - Implicit Fracture Zones (IFZ): this method of incorporating the structural model is very efficient and allows changes to either the mesh or fracture zones to be implemented quickly. It also supports great flexibility in the properties of the structures and rock mass.
    - Stochastic fractures: new functionality has been added to IFZ to allow arbitrary combinations of stochastic or deterministic fracture zones with the rock mass. Whether a fracture zone is modelled deterministically or stochastically, its statistical properties can be defined independently.
    - Stochastic modelling: efficient methods for Monte Carlo simulation of stochastic permeability fields have been implemented and tested on SKB's computers.
    - Visualisation: the visualisation tool Avizier for NAMMU has been enhanced such that it is efficient for checking models and presentation.
    - PROPER interface: NAMMU outputs pathlines in PROPER format so that they can be included in the PA workflow.
    The developed methods are illustrated by application to stochastic nested modelling of the Beberg site using data from SR 97. The model properties were in accordance with the regional- and site…

  6. Development of hydrogeological modelling tools based on NAMMU

    International Nuclear Information System (INIS)

    Marsic, N.; Hartley, L.; Jackson, P.; Poole, M.; Morvik, A.

    2001-09-01

    A number of relatively sophisticated hydrogeological models were developed within the SR 97 project to handle issues such as nesting of scales and the effects of salinity. However, these issues and others are considered of significant importance and generality to warrant further development of the hydrogeological methodology. Several such developments based on the NAMMU package are reported here:
    - Embedded grid: nesting of the regional- and site-scale models within the same numerical model has given greater consistency in the structural model representation and in the flow between scales. Since there is a continuous representation of the regional and site scales, the modelling of pathways from the repository no longer has to be contained wholly by the site-scale region. This allows greater choice in the size of the site-scale.
    - Implicit Fracture Zones (IFZ): this method of incorporating the structural model is very efficient and allows changes to either the mesh or fracture zones to be implemented quickly. It also supports great flexibility in the properties of the structures and rock mass.
    - Stochastic fractures: new functionality has been added to IFZ to allow arbitrary combinations of stochastic or deterministic fracture zones with the rock mass. Whether a fracture zone is modelled deterministically or stochastically, its statistical properties can be defined independently.
    - Stochastic modelling: efficient methods for Monte Carlo simulation of stochastic permeability fields have been implemented and tested on SKB's computers.
    - Visualisation: the visualisation tool Avizier for NAMMU has been enhanced such that it is efficient for checking models and presentation.
    - PROPER interface: NAMMU outputs pathlines in PROPER format so that they can be included in the PA workflow.
    The developed methods are illustrated by application to stochastic nested modelling of the Beberg site using data from SR 97. The model properties were in accordance with the regional- and site…

  7. Programming Models and Tools for Intelligent Embedded Systems

    DEFF Research Database (Denmark)

    Sørensen, Peter Verner Bojsen

    Design automation and analysis tools targeting embedded platforms, developed using a component-based design approach, must be able to reason about the capabilities of the platforms. In the general case where nothing is assumed about the components comprising a platform or the platform topology … is used for checking the consistency of a design with respect to the availability of services and resources. In the second application, a tool for automatically implementing the communication infrastructure of a process network application, the Service Relation Model is used for analyzing the capabilities …

  8. USER FRIENDLY OPEN GIS TOOL FOR LARGE SCALE DATA ASSIMILATION – A CASE STUDY OF HYDROLOGICAL MODELLING

    Directory of Open Access Journals (Sweden)

    P. K. Gupta

    2012-08-01

    Full Text Available Open source software (OSS) development has tremendous advantages over proprietary software, fuelled primarily by high-level programming languages (Java, C++, Python, etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools, etc.). Quantum GIS (QGIS) is a popular open source GIS package, licensed under the GNU GPL and written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article focuses on exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language (a minimal plugin skeleton is sketched after the abstract). In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps which are carried out repeatedly for the same study region. The developed tool is user friendly and supports these repetitive processes efficiently by reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.
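
    The article describes building QGIS plugins in Python; the skeleton below shows the interface such a plugin must expose (the class and action names are hypothetical). A real plugin additionally needs a module-level classFactory(iface) function and a metadata.txt file, omitted here for brevity.

        # minimal QGIS 3 plugin skeleton (main module of the plugin package)
        from qgis.PyQt.QtWidgets import QAction, QMessageBox

        class RainfallAssimilationPlugin:          # hypothetical plugin name
            def __init__(self, iface):
                self.iface = iface                 # handle to the QGIS interface
                self.action = None

            def initGui(self):
                # called by QGIS when the plugin is loaded
                self.action = QAction("Assimilate gridded data",
                                      self.iface.mainWindow())
                self.action.triggered.connect(self.run)
                self.iface.addToolBarIcon(self.action)

            def unload(self):
                # called by QGIS when the plugin is unloaded
                self.iface.removeToolBarIcon(self.action)

            def run(self):
                QMessageBox.information(self.iface.mainWindow(), "Assimilation",
                                        "Data preparation would start here.")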

  9. State-of-the-Art for Hygrothermal Simulation Tools

    Energy Technology Data Exchange (ETDEWEB)

    Boudreaux, Philip R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); New, Joshua Ryan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shrestha, Som S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Adams, Mark B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pallin, Simon B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-02-01

    The hygrothermal (heat and moisture) performance of buildings can be assessed using simulation tools. A number of hygrothermal calculation tools are currently available, varying in their degree of sophistication and runtime requirements. This report investigates three of the most commonly used models (WUFI, HAMT, and EMPD) to assess their limitations and their potential to generate physically realistic results, in order to prioritize improvements for EnergyPlus (which uses HAMT and EMPD). The study shows that, of these three tools, WUFI has the greatest hygrothermal capabilities. Limitations of the tools were also assessed, including: WUFI's inability to properly account for air leakage and transfer at surface boundaries; HAMT's inability to handle air leakage, precipitation-related moisture problems, or condensation problems from high relative humidity; and multiple limitations of EMPD, a simplified method to estimate indoor temperature and humidity levels that is generally not used to estimate the hygrothermal performance of building envelope materials. In conclusion, of the three investigated simulation tools, HAMT has the greatest modeling potential and is open source, and we have prioritized specific features that can enable EnergyPlus to model all relevant heat and moisture transfer mechanisms that impact the performance of building envelope components.

  10. Combining SLBL routine with landslide-generated tsunami model for a quick hazard assessment tool

    Science.gov (United States)

    Franz, Martin; Rudaz, Benjamin; Jaboyedoff, Michel; Podladchikov, Yury

    2016-04-01

    Regions with steep topography are potentially subject to landslide-induced tsunamis, because of the proximity between lakes, rivers, sea shores and potential instabilities. The concentration of population and infrastructure on water body shores and in downstream valleys could lead to catastrophic consequences. In order to assess this phenomenon comprehensively, together with the induced risks, we have developed a tool which allows the construction of the landslide geometry and is able to simulate its propagation, the generation and propagation of the wave and, eventually, the spread on the shores or the associated downstream flow. The tool is developed in the Matlab environment, with a graphical user interface (GUI) to select the parameters in a user-friendly manner. The whole process is done in three steps involving different methods. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is performed using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme (a bare one-dimensional sketch of this step follows the abstract). The transition between wet and dry bed is performed by the combination of the two latter sets of equations. The intensity map is based on the flooding criterion used in Switzerland, provided by the OFEG, and results from the multiplication of the velocity and the depth obtained during the simulation. The tool can be used for hazard assessment in the case of well-known landslides, where the SLBL routine can be constrained and checked for realistic construction of the geometrical model. In less well-known cases, various failure plane geometries can be built automatically within a given range, and thus a multi-scenario approach is used. In any case, less well-known parameters such as the landslide velocity, its run-out distance, etc. can also be set to vary within given ranges, leading to multi
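
    The wave-propagation step can be illustrated with a bare one-dimensional Lax-Friedrichs discretisation of the shallow water equations. The published tool is two-dimensional and handles wet/dry transitions; this sketch does neither, and all numbers are hypothetical. The last line forms the velocity-times-depth intensity used in the Swiss flooding criterion.

        import numpy as np

        g, L, N = 9.81, 100.0, 200          # gravity, domain length (m), cells
        dx = L / N
        x = np.linspace(0.5 * dx, L - 0.5 * dx, N)

        h = np.where(x < 20.0, 2.0, 1.0)    # dam-break-like initial wave (m)
        q = np.zeros(N)                     # discharge h*u (m^2/s)

        def flux(h, q):
            return q, q**2 / h + 0.5 * g * h**2

        t, t_end = 0.0, 5.0
        while t < t_end:
            c = np.max(np.abs(q / h) + np.sqrt(g * h))
            dt = 0.4 * dx / c               # CFL-limited time step
            fh, fq = flux(h, q)
            # Lax-Friedrichs update on interior cells
            h[1:-1] = 0.5 * (h[2:] + h[:-2]) - dt / (2 * dx) * (fh[2:] - fh[:-2])
            q[1:-1] = 0.5 * (q[2:] + q[:-2]) - dt / (2 * dx) * (fq[2:] - fq[:-2])
            h[0], h[-1] = h[1], h[-2]       # simple transmissive boundaries
            q[0], q[-1] = q[1], q[-2]
            t += dt

        intensity = np.abs(q / h) * h       # velocity x depth, the flooding criterion
        print("max intensity (m^2/s):", intensity.max())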

  11. The Strategic Analysis as a Management Tool to Improve the Performance of National Enterprises

    Directory of Open Access Journals (Sweden)

    Shtal Tetiana V.

    2018-01-01

    Full Text Available The publication considers the issue of improving the performance of enterprises, in particular of their international activities. To address this problem, the management of international activities draws on a variety of tools, one of which is strategic analysis, which allows analysis of the overall status of an enterprise and determination of the directions in which its efficiency can be improved. The main methods of strategic analysis, and the appropriateness of their use depending on the goals and objectives set, are analyzed. The practical application of individual methods of strategic analysis (such as the Adizes corporate lifecycle model, Porter's five forces model of competitiveness, analysis of financial indicators and costs, PEST analysis and SWOT analysis) is considered on the example of machine-building enterprises specializing in the production of turbo-expanders. Recommendations for improving their efficiency are offered.

  12. Design of a logistics performance measurement model for the automotive component industry to strengthen competitiveness in dealing with AEC 2015

    Science.gov (United States)

    Amran, T. G.; Janitra Yose, Mindy

    2018-03-01

    As free trade under the ASEAN Economic Community (AEC) brings tougher competition, it is important that Indonesia's automotive industry be highly competitive as well. A logistics performance measurement model was designed as an evaluation tool for automotive component companies to improve their logistics performance in order to compete in the AEC. The design of the model was based on the Logistics Scorecard perspectives and divided into two stages: identifying the logistics business strategy to derive the KPIs, and arranging the model. 23 KPIs were obtained. The measurement results can be taken into consideration when determining policies to improve logistics performance and competitiveness.

  13. A Bayesian Performance Prediction Model for Mathematics Education: A Prototypical Approach for Effective Group Composition

    Science.gov (United States)

    Bekele, Rahel; McPherson, Maggie

    2011-01-01

    This research work presents a Bayesian Performance Prediction Model that was created in order to determine the strength of personality traits in predicting the level of mathematics performance of high school students in Addis Ababa. It is an automated tool that can be used to collect information from students for the purpose of effective group…
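
    The cited work builds a Bayesian network from student data; purely as an illustration of the same idea (predicting a performance level from trait scores), a Gaussian naive Bayes classifier can stand in. The traits and data below are entirely synthetic.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB

        rng = np.random.default_rng(0)
        # hypothetical trait scores (e.g. conscientiousness, anxiety, self-efficacy)
        X = rng.normal(size=(120, 3))
        # synthetic performance labels loosely tied to the first trait
        y = (X[:, 0] + 0.3 * rng.normal(size=120) > 0).astype(int)  # 1 = high performer

        model = GaussianNB().fit(X[:100], y[:100])      # train on 100 students
        print("predicted levels:", model.predict(X[100:105]))
        print("class probabilities:", model.predict_proba(X[100:105]).round(2))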

  14. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    Science.gov (United States)

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  15. Virtual Reality Simulation as a Tool to Monitor Surgical Performance Indicators: VIRESI Observational Study.

    Science.gov (United States)

    Muralha, Nuno; Oliveira, Manuel; Ferreira, Maria Amélia; Costa-Maia, José

    2017-05-31

    Virtual reality simulation is a topic of discussion as a complementary tool to traditional laparoscopic surgical training in the operating room. However, it is unclear whether virtual reality training can have an impact on surgical performance in advanced laparoscopic procedures. Our objective was to assess the ability of the virtual reality simulator LAP Mentor to identify and quantify changes in surgical performance indicators after LAP Mentor training for digestive anastomosis. Twelve surgeons from Centro Hospitalar de São João in Porto (Portugal) performed two sessions of advanced task 5: anastomosis in LAP Mentor, before and after completing the tutorial, and were evaluated on 34 surgical performance indicators. The results show that six surgical performance indicators changed significantly after LAP Mentor training; in particular, the surgeons performed the task significantly faster, as the median 'total time' was significantly reduced. The findings support the use of virtual reality training simulation as a benchmark tool to assess the surgical performance of Portuguese surgeons. LAP Mentor is able to identify variations in surgical performance indicators of digestive anastomosis.
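
    For paired pre/post data from twelve surgeons, a Wilcoxon signed-rank test is a natural analysis choice; the sketch below uses made-up numbers and is not the study's actual data or analysis.

        import numpy as np
        from scipy.stats import wilcoxon

        # synthetic paired measurements for one indicator: total task time (s)
        pre  = np.array([412, 388, 455, 402, 390, 430, 418, 401, 444, 396, 421, 409])
        post = np.array([371, 350, 420, 361, 355, 398, 377, 368, 412, 352, 389, 363])

        stat, p = wilcoxon(pre, post)       # paired, non-parametric test (n = 12)
        print(f"W = {stat}, p = {p:.4f}")   # a small p suggests a real change after training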

  16. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing; consequently, there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used. … In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling …; an important part of this technique is the attachment of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration …

  17. Performance model of metallic concentric tube recuperator with counter flow arrangement

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Harshdeep [HIET, Department of Mechanical Engineering, Ghaziabad, Uttar Pradesh (India); Kumar, Anoop; Goel, Varun [NIT, Department of Mechanical Engineering, Hamirpur, Himachal Pradesh (India)

    2010-03-15

    A performance model for a counter flow arrangement in a concentric tube recuperator, which can be used to recover waste heat in the temperature range of 900-1,400 °C, is presented. The arrangement consists of metallic tubular inner and outer concentric shells with a small annular gap between the two shells. Flue gases pass through the inner shell while air passes through the annular gap in the reverse direction (counter flow arrangement). The height of the recuperator is divided into elements and an energy balance is performed on each elemental height. Results give the necessary information about surface, gas and air temperature distributions, and the influence of operating conditions on recuperator performance. The recuperative effectiveness is found to increase with increasing inlet gas temperature and to decrease with increasing fuel flow rate. The present model accounts for all heat transfer processes pertinent to a counterflow radiation recuperator and provides a valuable tool for performance considerations (a quick effectiveness cross-check is sketched below). (orig.)
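
    The paper marches an energy balance over elemental heights; under constant-property assumptions, the resulting duty can be cross-checked against the closed-form counterflow effectiveness-NTU relation. The UA value and capacity rates below are hypothetical.

        import numpy as np

        def counterflow_effectiveness(UA, C_gas, C_air):
            """Effectiveness-NTU relation for a counterflow heat exchanger."""
            C_min, C_max = min(C_gas, C_air), max(C_gas, C_air)
            Cr, NTU = C_min / C_max, UA / C_min
            if abs(Cr - 1.0) < 1e-9:
                return NTU / (1.0 + NTU)
            return (1 - np.exp(-NTU * (1 - Cr))) / (1 - Cr * np.exp(-NTU * (1 - Cr)))

        # hypothetical values: UA in W/K, capacity rates m_dot*cp in W/K
        eps = counterflow_effectiveness(UA=350.0, C_gas=500.0, C_air=420.0)
        T_gas_in, T_air_in = 1200.0, 25.0          # inlet temperatures (degC)
        Q = eps * min(500.0, 420.0) * (T_gas_in - T_air_in)
        print(f"effectiveness = {eps:.3f}, duty = {Q/1e3:.1f} kW")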

  18. Building performance simulation as a design tool for refurbishment of buildings

    NARCIS (Netherlands)

    Hensen, J.L.M.; Bartak, M.; Drkal, F.; Dunovska, T.; Lain, M.; Matuska, T.; Schwarzer, J.; Sourek, B.; Bednar, T.

    2004-01-01

    This paper attempts to outline the current state-of-the-art regarding the use of building performance simulation as a design tool for refurbishment of buildings. This is illus-trated by means of three recent studies for conversion of historical buildings (an early 20th century factory, and a water

  19. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    Science.gov (United States)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision-making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system in which diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes (a stripped-down illustration of this pattern follows the abstract). Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation are performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc.) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
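
    As a stripped-down illustration of the node/institution pattern described above (this is not the actual Pynsim API), consider nodes that step autonomously each time step while an institution allocates a shared resource over its member nodes. Names and numbers are hypothetical.

        class Node:
            def __init__(self, name, demand):
                self.name, self.demand, self.supplied = name, demand, 0.0
            def step(self, t):
                self.supplied = 0.0               # reset before this step's allocation

        class Institution:
            """Groups nodes and takes decisions over the whole set (an 'engine')."""
            def __init__(self, nodes):
                self.nodes = nodes
            def allocate(self, available):
                total = sum(n.demand for n in self.nodes)
                for n in self.nodes:              # proportional rationing rule
                    n.supplied = available * n.demand / total

        nodes = [Node("Amman", 80.0), Node("Irbid", 40.0), Node("Zarqa", 30.0)]
        utility = Institution(nodes)
        for t in range(3):                        # simulation time steps
            for n in nodes:
                n.step(t)                         # autonomous node behaviour
            utility.allocate(available=100.0)     # institution-level decision
            print(t, {n.name: round(n.supplied, 1) for n in nodes})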

  20. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    Science.gov (United States)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

    High speed IP communication is a killer application for 3rd generation (3G) mobile systems. Thus 3G network operators should perform extensive tests to check whether the expected end-to-end performance is provided to customers under various environments. An important objective of such tests is to check whether network nodes fulfill requirements on packet-processing durations, because long processing durations cause performance degradation. This requires testers (the persons carrying out the tests) to know precisely how long a packet is held by each network node. Without any tool's help, this task is time-consuming and error prone. We therefore propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. The recorded packet headers enable testers to calculate such holding durations (a toy calculation is sketched below). A notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm without any packet drops.
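
    Once headers are recorded with synchronized timestamps at two observation points, computing holding durations reduces to a key match between the two captures. A toy version, with a hypothetical flow key of (source IP, port, IP ID):

        # toy records: {packet_key: timestamp_s} captured at two synchronized points
        capture_a = {("10.0.0.1", 4321, 17): 0.001020, ("10.0.0.2", 4322, 18): 0.001500}
        capture_b = {("10.0.0.1", 4321, 17): 0.004850, ("10.0.0.2", 4322, 18): 0.006010}

        # holding duration at the node between the two observation points
        for key, t_in in capture_a.items():
            t_out = capture_b.get(key)
            if t_out is not None:
                print(key, f"holding duration = {(t_out - t_in) * 1e3:.2f} ms")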

  1. Nuclear fuel cycle system simulation tool based on high-fidelity component modeling

    Energy Technology Data Exchange (ETDEWEB)

Ames, David E.

    2014-02-01

    The DOE is currently directing extensive research into developing fuel cycle technologies that will enable the safe, secure, economic, and sustainable expansion of nuclear energy. The task is formidable considering the numerous fuel cycle options, the large dynamic systems that each represent, and the necessity to accurately predict their behavior. The path to successfully develop and implement an advanced fuel cycle is highly dependent on the modeling capabilities and simulation tools available for performing useful relevant analysis to assist stakeholders in decision making. Therefore a high-fidelity fuel cycle simulation tool that performs system analysis, including uncertainty quantification and optimization was developed. The resulting simulator also includes the capability to calculate environmental impact measures for individual components and the system. An integrated system method and analysis approach that provides consistent and comprehensive evaluations of advanced fuel cycles was developed. A general approach was utilized allowing for the system to be modified in order to provide analysis for other systems with similar attributes. By utilizing this approach, the framework for simulating many different fuel cycle options is provided. Two example fuel cycle configurations were developed to take advantage of used fuel recycling and transmutation capabilities in waste management scenarios leading to minimized waste inventories.

  2. Performance Assessment Tools for Distance Learning and Simulation: Knowledge, Models and Tools to Improve the Effectiveness of Naval Distance Learning

    National Research Council Canada - National Science Library

    Baker, Eva L; Munro, Allen; Pizzini, Quentin A; Brill, David G; Michiuye, Joanne K

    2006-01-01

    ... by (a) extending CRESST's knowledge mapping tool's authoring and scoring functionality and (b) providing the capability to embed a knowledge mapping assessment in simulation-based training developed by BTL...

  3. Information Theoretic Tools for Parameter Fitting in Coarse Grained Models

    KAUST Repository

    Kalligiannaki, Evangelia; Harmandaris, Vagelis; Katsoulakis, Markos A.; Plechac, Petr

    2015-01-01

    We study the application of information theoretic tools for model reduction in the case of systems driven by stochastic dynamics out of equilibrium. The model/dimension reduction is considered by proposing parametrized coarse grained dynamics

  4. Modelling of proton exchange membrane fuel cell performance based on semi-empirical equations

    Energy Technology Data Exchange (ETDEWEB)

    Al-Baghdadi, Maher A.R. Sadiq [Babylon Univ., Dept. of Mechanical Engineering, Babylon (Iraq)

    2005-08-01

    The use of semi-empirical equations for modeling a proton exchange membrane fuel cell is proposed, providing a tool for the design and analysis of complete fuel cell systems. The focus of this study is to derive an empirical model, including process variations, to estimate the performance of a fuel cell without extensive calculations. The model takes into account not only the current density but also process variations such as the gas pressure, temperature, humidity, and utilization, which are important factors in determining the real performance of a fuel cell over its operating range. The modelling results compare well with known experimental results; the comparison shows good agreement between the modelling results and the experimental data. The model can be used to investigate the influence of process variables for design optimization of fuel cells, stacks, and complete fuel cell power systems (a generic polarization-curve sketch follows the abstract). (Author)
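
    The paper derives its own semi-empirical correlations; as a generic stand-in, the widely used polarization-curve form V = E0 - b·log10(i) - R·i - m·exp(n·i) captures activation, ohmic and concentration losses. The constants below are illustrative, not the paper's fitted values.

        import numpy as np

        def cell_voltage(i, E0=1.0, b=0.05, R=2.45e-4, m=2.11e-5, n=8e-3):
            """Generic semi-empirical polarization curve (V); i in mA/cm^2.
            E0: open-circuit term, b: Tafel slope, R: area-specific resistance,
            m, n: mass-transport (concentration) loss parameters."""
            return E0 - b * np.log10(i) - R * i - m * np.exp(n * i)

        i = np.linspace(1.0, 1200.0, 5)            # current density sweep
        for ii, v in zip(i, cell_voltage(i)):
            print(f"i = {ii:7.1f} mA/cm^2  ->  V = {v:.3f} V")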

  5. Fluid Survival Tool: A Model Checker for Hybrid Petri Nets

    NARCIS (Netherlands)

    Postema, Björn Frits; Remke, Anne Katharina Ingrid; Haverkort, Boudewijn R.H.M.; Ghasemieh, Hamed

    2014-01-01

    Recently, algorithms for model checking Stochastic Time Logic (STL) on Hybrid Petri nets with a single general one-shot transition (HPNG) have been introduced. This paper presents a tool for model checking HPNG models against STL formulas. A graphical user interface (GUI) not only helps to

  6. Development and implementation of a dynamic TES dispatch control component in a PV-CSP techno-economic performance modelling tool

    Science.gov (United States)

    Hansson, Linus; Guédez, Rafael; Larchet, Kevin; Laumert, Bjorn

    2017-06-01

    The dispatchability offered by thermal energy storage (TES) in concentrated solar power (CSP) plants and in solar hybrid plants based on such technology is the most important difference compared to power generation based only on photovoltaics (PV). This has been one reason for recent efforts to hybridize the two technologies and for the creation of Power Purchase Agreement (PPA) payment schemes offering higher payment multiples during daily hours of higher (peak or priority) demand. Recent studies involving plant-level thermal energy storage control strategies are, however, largely based on pre-determined approaches, which do not take into account the actual dynamics of thermal energy storage operation. In this study, the implementation of a dynamic dispatch strategy, in the form of a TRNSYS controller for hybrid PV-CSP plants, in the power-plant modelling tool DYESOPT is presented (a toy day-ahead dispatch loop is sketched after the abstract). The aim was to gauge the benefits of incorporating a day-ahead approach to dispatch control compared to a fully pre-determined approach that fixes hourly dispatch once, prior to the annual simulation. With the dynamic strategy it was found possible to enhance the technical and economic performance of CSP-only plants designed for peaking operation and featuring low values of the solar multiple. This was achieved by enhancing dispatch control, primarily by taking the storage level at the beginning of every simulation day into account. The sequential prediction of the TES level could therefore be improved, notably for plants without integrated PV, for which the predicted storage levels deviated less than when PV was present in the design. While also showing dispatch performance gains, optimal hybrid PV-CSP plant configurations presented a trade-off in economic performance, in the form of an increase in break-even electricity price when using the dynamic strategy, which was offset to some extent by a reduction in
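
    The essence of a day-ahead TES dispatch rule can be conveyed by a toy chronological loop that stores solar heat and discharges during the higher-value PPA hours. Everything below (multipliers, efficiencies, profiles) is hypothetical and far simpler than the TRNSYS controller described above.

        PPA_MULTIPLIER = {h: (2.0 if 17 <= h <= 21 else 1.0) for h in range(24)}  # peak window

        def plan_day(tes_mwh_th, solar_mwh_th, turbine_mw=50.0, eta=0.40, tes_cap=400.0):
            """Chronological day-ahead plan: store solar heat, discharge in peak hours."""
            plan = {}
            for hour in range(24):
                tes_mwh_th = min(tes_cap, tes_mwh_th + solar_mwh_th[hour])  # charge TES
                if PPA_MULTIPLIER[hour] > 1.0:                              # priority hours
                    e_elec = min(turbine_mw, tes_mwh_th * eta)              # 1 h of dispatch
                    if e_elec > 0.0:
                        tes_mwh_th -= e_elec / eta                          # heat drawn from TES
                        plan[hour] = round(e_elec, 1)
            return plan, tes_mwh_th

        solar = [0.0] * 8 + [55.0] * 9 + [0.0] * 7      # crude clear-day profile (MWh_th)
        plan, end_level = plan_day(tes_mwh_th=100.0, solar_mwh_th=solar)
        print("dispatch (MWe by hour):", plan)
        print("end-of-day TES level (MWh_th):", round(end_level, 1))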

  7. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    International Nuclear Information System (INIS)

    Staub, Florian; Athron, Peter; Basso, Lorenzo

    2016-02-01

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model.

  8. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    Energy Technology Data Exchange (ETDEWEB)

    Staub, Florian [CERN, Geneva (Switzerland). Theoretical Physics Dept.; Athron, Peter [Monash Univ., Melbourne (Australia). ARC Center of Excellence for Particle Physics at the Terascale; Basso, Lorenzo [Aix-Marseille Univ., CNRS-IN2P3, UMR 7346 (France). CPPM; and others

    2016-02-15

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model.

  9. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    International Nuclear Information System (INIS)

    Staub, Florian; Athron, Peter; Basso, Lorenzo; Goodsell, Mark D.; Harries, Dylan; Krauss, Manuel E.; Nickel, Kilian; Opferkuch, Toby; Ubaldi, Lorenzo; Vicente, Avelino; Voigt, Alexander

    2016-01-01

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model. (orig.)

  10. Precision tools and models to narrow in on the 750 GeV diphoton resonance

    Energy Technology Data Exchange (ETDEWEB)

    Staub, Florian [CERN, Theoretical Physics Department, Geneva (Switzerland); Athron, Peter [Monash University, ARC Centre of Excellence for Particle Physics at the Terascale, School of Physics, Melbourne, VIC (Australia); Basso, Lorenzo [CPPM, Aix-Marseille Universite, CNRS-IN2P3, UMR 7346, Marseille Cedex 9 (France); Goodsell, Mark D. [Sorbonne Universites, LPTHE, UMR 7589, CNRS and Universite Pierre et Marie Curie, Paris Cedex 05 (France); Harries, Dylan [The University of Adelaide, Department of Physics, ARC Centre of Excellence for Particle Physics at the Terascale, Adelaide, SA (Australia); Krauss, Manuel E.; Nickel, Kilian; Opferkuch, Toby [Bethe Center for Theoretical Physics and Physikalisches Institut der Universitaet Bonn, Bonn (Germany); Ubaldi, Lorenzo [Tel-Aviv University, Raymond and Beverly Sackler School of Physics and Astronomy, Tel Aviv (Israel); Vicente, Avelino [Instituto de Fisica Corpuscular (CSIC-Universitat de Valencia), Valencia (Spain); Voigt, Alexander [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany)

    2016-09-15

    The hints for a new resonance at 750 GeV from ATLAS and CMS have triggered a significant amount of attention. Since the simplest extensions of the standard model cannot accommodate the observation, many alternatives have been considered to explain the excess. Here we focus on several proposed renormalisable weakly-coupled models and revisit results given in the literature. We point out that physically important subtleties are often missed or neglected. To facilitate the study of the excess we have created a collection of 40 model files, selected from recent literature, for the Mathematica package SARAH. With SARAH one can generate files to perform numerical studies using the tailor-made spectrum generators FlexibleSUSY and SPheno. These have been extended to automatically include crucial higher order corrections to the diphoton and digluon decay rates for both CP-even and CP-odd scalars. Additionally, we have extended the UFO and CalcHep interfaces of SARAH, to pass the precise information about the effective vertices from the spectrum generator to a Monte-Carlo tool. Finally, as an example to demonstrate the power of the entire setup, we present a new supersymmetric model that accommodates the diphoton excess, explicitly demonstrating how a large width can be obtained. We explicitly show several steps in detail to elucidate the use of these public tools in the precision study of this model. (orig.)

  11. A GIS Tool for evaluating and improving NEXRAD and its application in distributed hydrologic modeling

    Science.gov (United States)

    Zhang, X.; Srinivasan, R.

    2008-12-01

    In this study, a user-friendly GIS tool was developed for evaluating and improving NEXRAD precipitation estimates using raingauge data. The tool can automatically read in raingauge and NEXRAD data, evaluate the accuracy of NEXRAD for each time unit, implement several geostatistical methods to improve the accuracy of NEXRAD using raingauge data, and output spatial precipitation maps for distributed hydrologic models. The geostatistical methods incorporated in the tool include Simple Kriging with varying local means, Kriging with External Drift, Regression Kriging, Co-Kriging, and a new geostatistical method developed by Li et al. (2008). The tool was applied in two test watersheds at hourly and daily temporal scales. Preliminary cross-validation results show that incorporating raingauge data to calibrate NEXRAD can markedly change the spatial pattern of NEXRAD and improve its accuracy (the regression-kriging idea is sketched below). Using different geostatistical methods, the GIS tool was applied to produce long-term precipitation input for a distributed hydrologic model, the Soil and Water Assessment Tool (SWAT). Animated video was generated to vividly illustrate the effect of using different precipitation input data on distributed hydrologic modeling. Currently, this GIS tool is developed as an extension of SWAT, which is used as a water quantity and quality modeling tool by USDA and EPA. The flexible, module-based design of this tool also makes it easy to adapt to other hydrologic models for hydrological modeling and water resources management.
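
    Of the listed methods, regression kriging is the easiest to sketch: regress gauge observations on collocated NEXRAD values, then spatially interpolate the residuals. Below, a scikit-learn Gaussian process stands in for the kriging step (the two are closely related); all data are synthetic.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(1)
        gauge_xy = rng.uniform(0, 50, size=(30, 2))          # gauge locations (km)
        nexrad_at_gauge = rng.gamma(2.0, 2.0, size=30)       # radar estimate (mm/h)
        gauge_obs = 0.8 * nexrad_at_gauge + 1.0 + rng.normal(0, 0.3, 30)

        # 1) regression part: linear correction of the radar field
        coef = np.polyfit(nexrad_at_gauge, gauge_obs, 1)
        residuals = gauge_obs - np.polyval(coef, nexrad_at_gauge)

        # 2) "kriging" part: spatially interpolate the gauge residuals
        gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(0.1))
        gp.fit(gauge_xy, residuals)

        grid_xy = np.array([[10.0, 10.0], [25.0, 40.0]])     # cells to correct
        nexrad_grid = np.array([3.5, 6.2])
        corrected = np.polyval(coef, nexrad_grid) + gp.predict(grid_xy)
        print("corrected precipitation (mm/h):", corrected.round(2))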

  12. TIPPtool: Compositional Specification and Analysis of Markovian Performance Models

    NARCIS (Netherlands)

    Hermanns, H.; Halbwachs, N.; Peled, D.; Mertsiotakis, V.; Siegle, M.

    1999-01-01

    In this short paper we briefly describe a tool based on a Markovian stochastic process algebra. The tool offers both model specification and quantitative model analysis in a compositional fashion, wrapped in a user-friendly graphical front-end.

  13. An empirical modeling tool and glass property database in development of US-DOE radioactive waste glasses

    International Nuclear Information System (INIS)

    Muller, I.; Gan, H.

    1997-01-01

    An integrated glass database has been developed at the Vitreous State Laboratory of the Catholic University of America. The major objective of this tool was to support glass formulation using the MAWS approach (Minimum Additives Waste Stabilization). An empirical modeling capability, based on the properties of over 1000 glasses in the database, was also developed to help formulate glasses from waste streams under multiple user-imposed constraints. The use of this modeling capability, the performance of the resulting models in predicting properties of waste glasses, and the correlation of simple structural theories to glass properties are the subjects of this paper (a minimal regression stand-in is sketched below). (authors)
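
    The empirical modelling described above amounts to regressing measured glass properties on composition. A minimal stand-in (not the VSL models themselves), with synthetic compositions and a synthetic property:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(5)
        # hypothetical compositions: mass fractions of SiO2, B2O3, Na2O, Al2O3
        X = rng.dirichlet([8, 2, 2, 1], size=300)
        # synthetic property, e.g. log10 viscosity at a reference temperature
        y = 3.0 + 4.0 * X[:, 0] - 2.5 * X[:, 1] - 1.5 * X[:, 2] + rng.normal(0, 0.05, 300)

        model = LinearRegression().fit(X, y)        # first-order mixture model
        print("component coefficients:", model.coef_.round(2))
        print("predicted property for a candidate formulation:",
              model.predict([[0.55, 0.12, 0.20, 0.13]]).round(2))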

  14. Results from evaluations of models and cost-effectiveness tools to support introduction decisions for new vaccines need critical appraisal

    Directory of Open Access Journals (Sweden)

    Moorthy Vasee

    2011-05-01

    Full Text Available The World Health Organization (WHO) recommends that the cost-effectiveness (CE) of introducing new vaccines be considered before such a programme is implemented. However, in low- and middle-income countries (LMICs), it is often challenging to perform and interpret the results of model-based economic appraisals of vaccines that benefit from locally relevant data. As a result, WHO embarked on a series of consultations to assess economic analytical tools to support vaccine introduction decisions for pneumococcal, rotavirus and human papillomavirus vaccines. The objective of these assessments is to provide decision makers with a menu of existing CE tools for vaccines and their characteristics, rather than to endorse the use of a single tool. The outcome will provide policy makers in LMICs with information about the feasibility of applying these models to inform their own decision making. We argue that if models and CE analyses are used to inform decisions, they ought to be critically appraised beforehand, including a transparent evaluation of their structure, assumptions and data sources (in isolation or in comparison to similar tools), so that decision makers can use them while being fully aware of their robustness and limitations.

  15. A tool model for predicting atmospheric kinetics with sensitivity analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A package (a tool model) for predicting atmospheric chemical kinetics with sensitivity analysis is presented. A new direct method for calculating the first-order sensitivity coefficients, applying sparse matrix technology to chemical kinetics, is included in the tool model; it is only necessary to triangularize the matrix related to the Jacobian matrix of the model equation. A Gear-type procedure is used to integrate the model equation and its coupled auxiliary sensitivity coefficient equations. The FORTRAN subroutines for the model equation, the sensitivity coefficient equations, and their Jacobian analytical expressions are generated automatically from a chemical mechanism. The kinetic representation of the model equation, its sensitivity coefficient equations, and their Jacobian matrix is presented (a minimal worked example follows the abstract). Various FORTRAN subroutines in packages such as SLODE, modified MA28, and the Gear package, with which the program runs in conjunction, are recommended. The photo-oxidation of dimethyl disulfide is used for illustration.
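
    The direct sensitivity method can be illustrated on a toy one-parameter system: augment the model ODE with its sensitivity equation ds/dt = (∂f/∂x)s + ∂f/∂k and integrate both together with a stiff (Gear-type) solver. The reaction and rate constant below are hypothetical.

        import numpy as np
        from scipy.integrate import solve_ivp

        k = 0.5   # rate constant for the toy reaction A -> B

        def rhs(t, y):
            x, s = y                 # x: concentration, s = dx/dk (sensitivity)
            dxdt = -k * x
            dsdt = -x - k * s        # ds/dt = (df/dx) s + df/dk
            return [dxdt, dsdt]

        # LSODA switches to a BDF (Gear-type) method for stiff problems
        sol = solve_ivp(rhs, [0.0, 10.0], [1.0, 0.0], method="LSODA", rtol=1e-8)
        print("x(10) =", sol.y[0, -1], " dx/dk at t=10 =", sol.y[1, -1])
        # analytic check: x = exp(-k t), dx/dk = -t exp(-k t)
        print("analytic:", np.exp(-k * 10), -10 * np.exp(-k * 10))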

  16. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which allows the simulation of a large panel of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.

  17. A Quasiphysics Intelligent Model for a Long Range Fast Tool Servo

    Science.gov (United States)

    Liu, Qiang; Zhou, Xiaoqin; Lin, Jieqiong; Xu, Pengzi; Zhu, Zhiwei

    2013-01-01

    Accurately modeling the dynamic behaviors of a fast tool servo (FTS) is one of the key issues in the ultraprecision positioning of the cutting tool. Herein, a quasiphysics intelligent model (QPIM), integrating a linear physics model (LPM) and a radial basis function (RBF) based neural model (NM), is developed to accurately describe the dynamic behaviors of a voice coil motor (VCM) actuated long range fast tool servo (LFTS). To identify the parameters of the LPM, a novel Opposition-based Self-adaptive Replacement Differential Evolution (OSaRDE) algorithm is proposed, which has been shown to converge faster without compromising solution quality, outperforming similar evolutionary algorithms considered for comparison (a differential-evolution fitting sketch follows the abstract). The modeling errors of the LPM and the QPIM are investigated by experiment. The modeling error of the LPM presents an obvious trend component, about ±1.15% of the full span range, verifying the efficiency of the proposed OSaRDE algorithm for system identification. As for the QPIM, the trend component in the residual error of the LPM can be well suppressed, and the error of the QPIM remains at noise level. All the results verify the efficiency and superiority of the proposed modeling and identification approaches. PMID:24163627
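
    OSaRDE itself is not publicly packaged; to convey the identification step, the sketch below fits a linear second-order physics model to a synthetic step response using SciPy's standard differential evolution instead. All parameter values are hypothetical.

        import numpy as np
        from scipy.optimize import differential_evolution

        # underdamped second-order step response as the "linear physics model"
        def model(t, wn, zeta):
            wd = wn * np.sqrt(1 - zeta**2)
            return 1 - np.exp(-zeta * wn * t) * (np.cos(wd * t)
                   + zeta / np.sqrt(1 - zeta**2) * np.sin(wd * t))

        t = np.linspace(0, 0.05, 200)
        true = model(t, wn=900.0, zeta=0.25)
        meas = true + np.random.default_rng(3).normal(0, 0.01, t.size)  # noisy data

        def cost(p):                 # sum of squared residuals
            return np.sum((model(t, *p) - meas) ** 2)

        res = differential_evolution(cost, bounds=[(100, 2000), (0.05, 0.9)], seed=3)
        print("identified wn, zeta:", res.x.round(3), " cost:", res.fun)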

  18. A Modeling Tool for Household Biogas Burner Flame Port Design

    Science.gov (United States)

    Decker, Thomas J.

    Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost respectively. To support the ongoing scaling effort of biogas in rural communities, this study has developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created through guidance from previous studies found in the literature. The three highest performing designs identified by the tool were manufactured and tested experimentally to validate tool output and to compare against the original port geometry. The experimental results aligned with the tool's prediction for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.

  19. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behaviour …

  20. Knowledge modelling and reliability processing: presentation of the Figaro language and associated tools

    International Nuclear Information System (INIS)

    Bouissou, M.; Villatte, N.; Bouhadana, H.; Bannelier, M.

    1991-12-01

    EDF has for several years been developing an integrated set of knowledge-based and algorithmic tools for the automation of reliability assessment of complex (especially sequential) systems. In this environment, the reliability expert has at his disposal powerful software tools for qualitative and quantitative processing; in addition, various means are provided to generate the inputs for these tools automatically, through the acquisition of graphical data. The development of these tools has been based on FIGARO, a specific language built to obtain homogeneous system modelling. Various compilers and interpreters translate a FIGARO model into conventional models such as fault trees, Markov chains, and Petri nets (a toy fault-tree evaluation is sketched below). In this report, we introduce the basics of the FIGARO language, illustrating them with examples.
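
    The conventional models that FIGARO compiles to can be evaluated with very little code; for instance, the top-event probability of a toy fault tree with independent basic events (all probabilities hypothetical):

        # gates of a small fault tree: top = OR(pump_fails, AND(valve_a, valve_b))
        def and_gate(*p):            # independent basic events
            out = 1.0
            for q in p:
                out *= q
            return out

        def or_gate(*p):
            out = 1.0
            for q in p:
                out *= (1.0 - q)
            return 1.0 - out

        p_pump, p_valve_a, p_valve_b = 1e-3, 5e-2, 5e-2
        p_top = or_gate(p_pump, and_gate(p_valve_a, p_valve_b))
        print(f"top event probability = {p_top:.6f}")   # about 3.5e-3 here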

  1. A conceptual model to improve performance in virtual teams

    Directory of Open Access Journals (Sweden)

    Shopee Dube

    2016-09-01

    Full Text Available Background: The vast improvement in communication technologies and sophisticated project management tools, methods and techniques has allowed geographically and culturally diverse groups to operate and function in a virtual environment. To succeed in this virtual environment, where time and space are becoming increasingly irrelevant, organisations must define new ways of implementing initiatives. This virtual environment phenomenon has brought about the formation of virtual project teams that allow organisations to harness the skills and know-how of the best resources, irrespective of their location. Objectives: The aim of this article was to investigate performance criteria and develop a conceptual model which can be applied to enhance the success of virtual project teams; there are currently no clear guidelines on performance criteria for managing such teams. Method: A qualitative research methodology was used in this article. The purpose of the content analysis was to explore the literature to understand the concept of performance in virtual project teams and to summarise the findings of the literature reviewed. Results: The research identified a set of performance criteria for virtual project teams: leadership, trust, communication, team cooperation, reliability, motivation, comfort and social interaction. These were used to conceptualise the model. Conclusion: The conceptual model can be used in a holistic way to determine the overall performance of the virtual project team, but each factor can also be analysed individually to determine its impact on overall performance. Knowledge of performance criteria for virtual project teams could aid project managers in enhancing the success of these teams and taking a different approach to better manage and coordinate them.

  2. Maintenance personnel performance simulation (MAPPS) model: overview and evaluation efforts

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.; Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Ryan, T.G.

    1984-01-01

    The development of the MAPPS model has been completed and the model is currently undergoing evaluation. These efforts address a number of identified issues concerning practicality, acceptability, usefulness, and validity. Preliminary analysis of the evaluation data that have been collected indicates that MAPPS will provide comprehensive and reliable data for PRA purposes and for a number of other applications. The MAPPS computer simulation model provides the user with a sophisticated tool for gaining insights into tasks performed by NPP maintenance personnel. Its wide variety of input parameters and output data make it extremely flexible for application to a number of diverse problems. With the demonstration of favorable model evaluation results, the MAPPS model will represent a valuable source of NPP maintainer reliability data and provide PRA studies with a source of data on maintainers that did not previously exist.

  3. Theoretical Model for the Performance of Liquid Ring Pump Based on the Actual Operating Cycle

    Directory of Open Access Journals (Sweden)

    Si Huang

    2017-01-01

    Full Text Available The liquid ring pump is widely applied in many industrial fields due to the advantages of an isothermal compression process, simple structure, and liquid sealing. Based on the actual operating cycle of “suction-compression-discharge-expansion,” a universal theoretical model for the performance of liquid ring pumps was established in this study, to address the problem that earlier theoretical models deviate from the actual performance over the operating cycle. With the major geometric parameters and operating conditions of a liquid ring pump, performance parameters such as the actual suction and discharge capacity, shaft power, and global efficiency can be conveniently predicted by the proposed theoretical model, without the limitation of an empirical range, existing performance data, or a detailed 3D geometry of the pump. The proposed theoretical model was verified against experimental performances of liquid ring pumps and provides a feasible tool for the application of the liquid ring pump.

  4. Development of a transportation planning tool

    International Nuclear Information System (INIS)

    Funkhouser, B.R.; Moyer, J.W.; Ballweg, E.L.

    1994-01-01

    This paper describes the application of simulation modeling and logistics techniques to the development of a planning tool for the Department of Energy (DOE). The focus of the Transportation Planning Model (TPM) tool is to aid DOE and Sandia analysts in the planning of future fleet sizes, driver and support personnel numbers, base site locations, and resource balancing among the base sites. The design approach is to develop a rapid modeling environment which allows analysts to easily set up a shipment scenario and perform multiple "what if" evaluations. The TPM is being developed on personal computers using commercial off-the-shelf (COTS) software tools under the Windows® operating environment. Prototype development of the TPM has been completed.

  5. APMS: An Integrated Suite of Tools for Measuring Performance and Safety

    Science.gov (United States)

    Statler, Irving C.; Lynch, Robert E.; Connors, Mary M. (Technical Monitor)

    1997-01-01

    This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods, which are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through

  6. Computational Human Performance Modeling For Alarm System Design

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo

    2012-07-01

    The introduction of new technologies like adaptive automation systems and advanced alarm processing and presentation techniques in nuclear power plants is already having an impact on the safety and effectiveness of plant operations, and also on the role of the control room operator. This impact is expected to escalate dramatically as more and more nuclear power utilities embark on upgrade projects in order to extend the lifetime of their plants. One of the most visible impacts in control rooms will be the need to replace aging alarm systems. Because most of these alarm systems use obsolete technologies, the methods, techniques and tools that were used to design the previous generation of alarm systems are no longer effective and need to be updated. The same applies to the need to analyze and redefine operators' alarm handling tasks. In the past, methods for analyzing human tasks and workload have relied on crude, paper-based methods that often lacked traceability. New approaches are needed to allow analysts to model and represent the new concepts of alarm operation and human-system interaction. State-of-the-art task simulation tools are now available that offer a cost-effective and efficient method for examining the effect of operator performance in different conditions and operational scenarios. A discrete event simulation system was used by human factors researchers at the Idaho National Laboratory to develop a generic alarm handling model and examine operator performance with a simulated modern alarm system. It allowed analysts to evaluate alarm generation patterns as well as the critical task times and human workload predicted by the system (a minimal queueing sketch follows the abstract).
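
    The discrete-event-simulation approach can be conveyed with a minimal SimPy queueing model (SimPy stands in for the unnamed tool; arrival and handling times are hypothetical): alarms arrive at random, a single operator handles them, and the waiting time before acknowledgement serves as a workload proxy.

        import random
        import simpy

        random.seed(7)
        waits = []

        def alarm_source(env, operator):
            for _ in range(50):
                yield env.timeout(random.expovariate(1 / 20.0))  # alarm every ~20 s
                env.process(handle_alarm(env, operator))

        def handle_alarm(env, operator):
            raised = env.now
            with operator.request() as req:          # single operator resource
                yield req                            # wait until the operator is free
                waits.append(env.now - raised)       # queueing delay = workload proxy
                yield env.timeout(random.uniform(5, 30))  # acknowledgement/diagnosis

        env = simpy.Environment()
        operator = simpy.Resource(env, capacity=1)
        env.process(alarm_source(env, operator))
        env.run()
        print(f"mean wait before handling: {sum(waits) / len(waits):.1f} s")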

  7. Planetary Suit Hip Bearing Model for Predicting Design vs. Performance

    Science.gov (United States)

    Cowley, Matthew S.; Margerum, Sarah; Harvil, Lauren; Rajulu, Sudhakar

    2011-01-01

    Designing a planetary suit is very complex and often requires difficult trade-offs between performance, cost, mass, and system complexity. In order to verify that new suit designs meet requirements, full prototypes must eventually be built and tested with human subjects. Using computer models early in the design phase of new hardware development can be advantageous, allowing virtual prototyping to take place. Having easily modifiable models of the suit hard sections may reduce the time it takes to make changes to the hardware designs and then to understand their impact on suit and human performance. A virtual design environment gives designers the ability to think outside the box and exhaust design possibilities before building and testing physical prototypes with human subjects. Reductions in prototyping and testing may eventually reduce development costs. This study is an attempt to develop computer models of the hard components of the suit with known physical characteristics, supplemented with human subject performance data. Objectives: The primary objective was to develop an articulating solid model of the Mark III hip bearings to be used for evaluating suit design performance of the hip joint. Methods: Solid models of a planetary prototype (Mark III) suit's hip bearings and brief section were reverse-engineered from the prototype. The performance of the models was then compared by evaluating the mobility performance differences between the nominal hardware configuration and hardware modifications. This was accomplished by gathering data from specific suited tasks. Subjects performed maximum flexion and abduction tasks while in a nominal suit bearing configuration and in three off-nominal configurations. Performance data for the hip were recorded using state-of-the-art motion capture technology. Results: The results demonstrate that using solid models of planetary suit hard segments as a performance design tool is feasible. From a general trend perspective

  8. Transformation of UML models to CSP : a case study for graph transformation tools

    NARCIS (Netherlands)

    Varró, D.; Asztalos, M.; Bisztray, D.; Boronat, A.; Dang, D.; Geiß, R.; Greenyer, J.; Van Gorp, P.M.E.; Kniemeyer, O.; Narayanan, A.; Rencis, E.; Weinell, E.; Schürr, A.; Nagl, M.; Zündorf, A.

    2008-01-01

    Graph transformation provides an intuitive mechanism for capturing model transformations. In the current paper, we investigate and compare various graph transformation tools using a compact practical model transformation case study carried out as part of the AGTIVE 2007 Tool Contest [22]. The aim of

  9. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan; Baggu, Murali M.

    2017-05-11

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.
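
    As a rough illustration of the kind of translation step such a converter performs, the sketch below parses OpenDSS "New Line" statements into simple tabular rows. The OpenDSS syntax is standard; the target column layout is an assumption for illustration only (the actual ePHASORSIM input format is spreadsheet-based and richer).

```python
import re

# Illustrative converter fragment: pull "New Line" definitions out of an
# OpenDSS .dss file and emit simple tabular rows. The OpenDSS syntax is
# standard; the target column layout is an assumption for illustration (the
# real ePHASORSIM input is a richer, spreadsheet-based format).
LINE_RE = re.compile(r"^New\s+Line\.(\S+)\s+(.*)$", re.IGNORECASE)

def parse_dss_lines(text):
    rows = []
    for raw in text.splitlines():
        m = LINE_RE.match(raw.strip())
        if not m:
            continue
        params = dict(p.split("=", 1) for p in m.group(2).split() if "=" in p)
        rows.append({
            "name": m.group(1),
            "from_bus": params.get("bus1", ""),
            "to_bus": params.get("bus2", ""),
            "r1": params.get("r1", ""),
            "x1": params.get("x1", ""),
        })
    return rows

sample = "New Line.L1 bus1=650 bus2=632 r1=0.3 x1=0.9 length=2000"
print(parse_dss_lines(sample))
```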

  10. SafetyBarrierManager, a software tool to perform risk analysis using ARAMIS's principles

    DEFF Research Database (Denmark)

    Duijm, Nijs Jan

    2017-01-01

    The ARAMIS project resulted in a number of methodologies, dealing with among others: the development of standard fault trees and "bowties"; the identification and classification of safety barriers; and including the quality of safety management into the quantified risk assessment. After conclusion of the ARAMIS project, Risø National Laboratory started developing a tool that could implement these methodologies, leading to SafetyBarrierManager. The tool is based on the principles of "safety-barrier diagrams", which are very similar to "bowties", with the possibility of performing quantitative analysis. The tool allows constructing comprehensive fault trees, event trees and safety-barrier diagrams. The tool implements the ARAMIS idea of a set of safety barrier types, to which a number of safety management issues can be linked. By rating the quality of these management issues, the operational probability...

  11. Comprehensive Assessment of Models and Events based on Library tools (CAMEL)

    Science.gov (United States)

    Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.

    2017-12-01

    At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
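
    The skill scores named above have standard definitions, sketched below for a single model-data pair; the observation and model arrays are placeholders, not CCMC data.

```python
import numpy as np

# Standard definitions of the skill metrics named above, for one model-data
# pair. The observation and model arrays are placeholders, not CCMC data.
def rmse(obs, mod):
    return float(np.sqrt(np.mean((obs - mod) ** 2)))

def prediction_efficiency(obs, mod):
    return 1.0 - np.mean((obs - mod) ** 2) / np.var(obs)

def event_table(obs, mod, threshold):
    o, m = obs >= threshold, mod >= threshold
    a = int(np.sum(m & o))    # hits
    b = int(np.sum(m & ~o))   # false alarms
    c = int(np.sum(~m & o))   # misses
    d = int(np.sum(~m & ~o))  # correct rejections
    return a, b, c, d

def heidke(a, b, c, d):
    return 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))

obs = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
mod = np.array([0.8, 2.5, 2.2, 4.5, 4.8])
a, b, c, d = event_table(obs, mod, threshold=3.0)
print(f"RMSE={rmse(obs, mod):.3f}  PE={prediction_efficiency(obs, mod):.3f}")
print(f"POD={a / (a + c):.2f}  POFD={b / (b + d):.2f}  HSS={heidke(a, b, c, d):.2f}")
```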

  12. Investigation into the Effects of Textural Properties on Cuttability Performance of a Chisel Tool

    Science.gov (United States)

    Tumac, Deniz; Copur, Hanifi; Balci, Cemal; Er, Selman; Avunduk, Emre

    2018-04-01

    The main objective of this study is to investigate the effect of the textural properties of stones on the cutting performance of a standard chisel tool. In addition, the relationships between textural properties, cutting performance parameters, and physical and mechanical properties were statistically analyzed. For this purpose, physical and mechanical property tests and mineralogical and petrographic analyses were carried out on eighteen natural stone samples, which can be grouped into three fundamentally different geological origins, i.e., metamorphic, igneous, and sedimentary. Then, texture coefficient analyses were performed on the samples. To determine the cuttability of the stones, the samples were cut with a portable linear cutting machine using a standard chisel tool at different depths of cut in unrelieved (non-interactive) cutting mode. The average and maximum forces (normal and cutting) and specific energy were measured, and the obtained values were correlated with texture coefficient, packing weighting, and grain size. With reference to the relation between depth of cut and cutting performance of the chisel tool for the three types of natural stone groups, specific energy decreases with increasing depth of cut, and cutting forces increase in proportion to the depth of cut. The same is observed for the relationship between packing weighting and both specific energy and cutter forces. On the other hand, specific energy and the forces decrease as grain size increases. Based on the findings of the present study, texture coefficient has a strong correlation with specific energy. Generally, lower depth of cut values in the cutting tests show higher and more reliable correlations with texture coefficient than increased depths of cut. The results of the cutting tests also show that, at a lower depth of cut (less than 1.5 mm), even stronger correlations can be observed between texture coefficient and cutting performance. Experimental studies indicate that cutting
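
    For reference, the specific energy reported in unrelieved linear cutting tests is conventionally computed from the mean cutting force and the yield; the expression below is the standard definition, not a formula quoted from the paper.

```latex
% Specific energy in a linear cutting test (standard definition):
% F_c: mean cutting force, l: cutting length, V: volume of rock removed,
% Q' = V / l: yield per unit cutting length.
SE = \frac{F_c \, l}{V} = \frac{F_c}{Q'}
```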

  13. BPMNDiffViz : a tool for BPMN models comparison

    NARCIS (Netherlands)

    Ivanov, S.Y.; Kalenkova, A.A.; Aalst, van der W.M.P.; Daniel, F.; Zugal, S.

    2015-01-01

    Automatic comparison of business processes plays an important role in their analysis and optimization. In this paper we present the web-based tool BPMNDiffViz, which finds business process discrepancies and visualizes them. BPMN (Business Process Model and Notation) 2.0 - one of the most commonly

  14. High performance electromagnetic simulation tools

    Science.gov (United States)

    Gedney, Stephen D.; Whites, Keith W.

    1994-10-01

    Army Research Office Grant #DAAH04-93-G-0453 has supported the purchase of 24 additional compute nodes that were installed in the Intel iPSC/860 hypercube at the University of Kentucky (UK), rendering a 32-node multiprocessor. This facility has allowed the investigators to explore and extend the boundaries of electromagnetic simulation for important areas of defense concern, including microwave monolithic integrated circuit (MMIC) design/analysis and electromagnetic materials research and development. The iPSC/860 has also provided an ideal platform for MMIC circuit simulations. A number of parallel methods based on direct time-domain solutions of Maxwell's equations have been developed on the iPSC/860, including a parallel finite-difference time-domain (FDTD) algorithm and a parallel planar generalized Yee-algorithm (PGY). The iPSC/860 has also provided an ideal platform on which to develop a 'virtual laboratory' to numerically analyze, scientifically study and develop new types of materials with beneficial electromagnetic properties. These materials simulations are capable of assembling hundreds of microscopic inclusions from which an electromagnetic full-wave solution will be obtained in toto. This powerful simulation tool has enabled the full-wave analysis of complex multicomponent MMIC devices and the electromagnetic properties of many types of materials to be performed numerically rather than strictly in the laboratory.

  15. Well performance model

    International Nuclear Information System (INIS)

    Thomas, L.K.; Evans, C.E.; Pierson, R.G.; Scott, S.L.

    1992-01-01

    This paper describes the development and application of a comprehensive oil or gas well performance model. The model contains six distinct sections: stimulation design, tubing and/or casing flow, reservoir and near-wellbore calculations, production forecasting, wellbore heat transmission, and economics. These calculations may be performed separately or in an integrated fashion with data and results shared among the different sections. The model analysis allows evaluation of all aspects of well completion design, including the effects on future production and overall well economics

  16. The realistic consideration of human factors in model based simulation tools for the air traffic control domain.

    Science.gov (United States)

    Duca, Gabriella; Attaianese, Erminia

    2012-01-01

    Advanced Air Traffic Management (ATM) concepts related to automation, airspace organization and operational procedures are driven by the overall goal to increase ATM system performance. Independently of the nature and/or impact of the envisaged changes (e.g. from a short-term procedure adjustment to a very long-term operational concept or aid tools completion), the preliminary assessment of possible gains in airspace/airport capacity, safety and cost-effectiveness is done by running Model Based Simulations (MBSs, also known as Fast Time Simulations - FTS). Since this is not a human-in-the-loop technique, the reliability of MBS results depends on the accuracy and significance of the modeled human factors. Despite that, it can be observed in practice that modeling tools commonly assume a generalized standardization of human behaviors and tasks and consider only a narrow range of the work environment factors that, in reality, affect actual human-system performance. The present paper is aimed at opening a discussion about the possibility of keeping task descriptions and related weights at a high, general level, suitable for efficient use of MBSs, while at the same time increasing simulation reliability by adopting adjustments derived from further variables related to the human aspects of controller workload.

  17. Tool Support for Collaborative Teaching and Learning of Object-Oriented Modelling

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Ratzer, Anne Vinter

    2002-01-01

    Modeling is central to doing and learning object-oriented development. We present a new tool, Ideogramic UML, for gesture-based collaborative modeling with the Unified Modeling Language (UML), which can be used to collaboratively teach and learn modeling. Furthermore, we discuss how we have...

  18. Performance curves of room air conditioners for building energy simulation tools

    International Nuclear Information System (INIS)

    Meissner, José W.; Abadie, Marc O.; Moura, Luís M.; Mendonça, Kátia C.; Mendes, Nathan

    2014-01-01

    Highlights: • Experimental characteristic curves for two room air conditioners are presented. • These results can be implemented in building simulation codes. • The energy consumption under different conditions can be numerically determined. • The product labeled with higher energy efficiency does not always provide the best result. - Abstract: In order to improve the modeling of air conditioners in building simulation tools, the characteristic curves for total cooling capacity, sensible cooling capacity and energy efficiency ratio of two room units were determined. They were obtained by means of standard capacity tests in climatic chambers under a set of environmental conditions described by external dry-bulb and internal wet-bulb temperatures. Afterward, the performance of these two units and that of four other units, with and without taking into account the thermodynamic variations of the surrounding environments, was compared using a whole building simulation program for simulating a conditioned space. The comparative analysis showed that the air conditioner with the higher energy efficiency rating does not always provide the lowest power consumption in real conditions of use.
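
    Characteristic curves of this kind are commonly stored in building simulation tools as biquadratic polynomials of the two driving temperatures. The sketch below shows that standard functional form; the coefficients are placeholders, not the values fitted in this study.

```python
# Standard biquadratic form used by many building simulation tools to store
# capacity or EER characteristic curves. Coefficients are placeholders, not
# the values fitted in this study.
def biquadratic(t_iwb, t_odb, coeffs):
    """t_iwb: internal wet-bulb temp, t_odb: external dry-bulb temp (degC)."""
    a, b, c, d, e, f = coeffs
    return (a + b * t_iwb + c * t_iwb ** 2
            + d * t_odb + e * t_odb ** 2 + f * t_iwb * t_odb)

# Example: capacity modifier relative to rating conditions (assumed coefficients).
coeffs = (0.77, 0.02, 0.0005, -0.005, -0.0002, 0.0004)
print(f"capacity modifier: {biquadratic(19.4, 35.0, coeffs):.3f}")
```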

  19. The development of environmental modeling tools in Brazil for emergency preparedness

    International Nuclear Information System (INIS)

    Rochedo, Elaine R.R.; Vetere, Maria I.C.; Conti, Luiz F.C.; Wasserman, Maria A.V.; Salinas, Isabel C.P.; Pereira, Jose F.; Silva, Diogo N.G.; Vinhas, Denise M

    2008-01-01

    Since the Goiania accident in 1987, the IRD has been developing tools to support decision-making processes after accidents involving radiological public exposure. The Environmental Modelling Project began with the development of the code CORAL, based on the German dynamic model ECOSYS developed by the GSF, with the purpose of assessing the consequences of an accidental contamination of rural areas. Then, in cooperation with the GSF, the IRD developed the model PARATI, based on information from the Chernobyl and Goiania accidents, for the assessment of public exposure due to Cs-137 contamination in urban areas. This model includes the ability to simulate the implementation of countermeasures and their effectiveness in reducing doses to the public. Subsequently, the SIEM - Integrated Emergency System was developed to include CORAL and PARATI, as well as some generic models developed by the IAEA, for short-term dose estimates and to support protective strategies during the emergency phase of an accident. SIEM also incorporated standardized data on the physical behavior of radionuclides and dose conversion factors. Several improvements have been made in order to better adapt the model to Brazilian social, political, economic and climatic characteristics. Currently, a multi-criteria strategy to support decision-making processes after the occurrence of an event of environmental contamination is under development. That work includes the development of a database on countermeasures and a computer model to perform the multi-criteria simulation. At all stages of the work, the pertinent weather and seasonal aspects are considered, in order to obtain a guide to protective actions that accounts for social, economic and climatic characteristics, to be used in multi-criteria optimization processes adequate for tropical climate areas. (author)

  20. Embedding Java Types in CPN Tools

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; Westergaard, Michael

    CPN Tools is a well known editor for Colored Petri nets (CPNs) that is capable of doing state space and performance analysis. The BRITNeY Suite has added yet another feature to CPN Tools for integrating CPN models with Java programs, by providing stubs accessible from the models, to allow the modeller to call methods on Java objects. This paper is about how the stub code is generated, i.e., representing Java classes to Standard ML to be able to call Java code in the CPN models, and how the BRITNeY Suite framework handles the invocations of the stub code. The contribution of this paper is to give...

  1. Experimental and Mathematical Modeling for Prediction of Tool Wear on the Machining of Aluminium 6061 Alloy by High Speed Steel Tools

    Directory of Open Access Journals (Sweden)

    Okokpujie Imhade Princess

    2017-12-01

    Full Text Available In recent machining operations, tool life is one of the most demanding tasks in the production process, especially in the automotive industry. The aim of this paper is to study tool wear on HSS in end milling of aluminium 6061 alloy. The experiments were carried out to investigate tool wear with the machining parameters and to develop a mathematical model using response surface methodology. The various machining parameters selected for the experiment are spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using central composite design (CCD), in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool was measured using a scanning electron microscope (SEM). The optimum machining parameter combination of a spindle speed of 2500 rpm, feed rate of 200 mm/min, axial depth of cut of 20 mm, and radial depth of cut of 1.0 mm was found to achieve the minimum tool wear of 0.213 mm. The mathematical model developed predicted the tool wear with 99.7% accuracy, which is within the acceptable accuracy range for tool wear prediction.

  2. Experimental and Mathematical Modeling for Prediction of Tool Wear on the Machining of Aluminium 6061 Alloy by High Speed Steel Tools

    Science.gov (United States)

    Okokpujie, Imhade Princess; Ikumapayi, Omolayo M.; Okonkwo, Ugochukwu C.; Salawu, Enesi Y.; Afolalu, Sunday A.; Dirisu, Joseph O.; Nwoke, Obinna N.; Ajayi, Oluseyi O.

    2017-12-01

    In recent machining operations, tool life is one of the most demanding tasks in the production process, especially in the automotive industry. The aim of this paper is to study tool wear on HSS in end milling of aluminium 6061 alloy. The experiments were carried out to investigate tool wear with the machining parameters and to develop a mathematical model using response surface methodology. The various machining parameters selected for the experiment are spindle speed (N), feed rate (f), axial depth of cut (a) and radial depth of cut (r). The experiment was designed using central composite design (CCD), in which 31 samples were run on a SIEG 3/10/0010 CNC end milling machine. After each experiment the cutting tool was measured using a scanning electron microscope (SEM). The optimum machining parameter combination of a spindle speed of 2500 rpm, feed rate of 200 mm/min, axial depth of cut of 20 mm, and radial depth of cut of 1.0 mm was found to achieve the minimum tool wear of 0.213 mm. The mathematical model developed predicted the tool wear with 99.7% accuracy, which is within the acceptable accuracy range for tool wear prediction.
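
    As an illustration of the response surface methodology step, a full second-order model in the four coded factors can be fitted by ordinary least squares, as sketched below; the design points and response values are synthetic stand-ins for the 31 CCD runs, not the paper's data.

```python
import numpy as np

# Sketch of the RSM step: fit a full second-order model
#   wear = b0 + sum_i bi*xi + sum_i bii*xi^2 + sum_{i<j} bij*xi*xj
# by ordinary least squares. The 31 design points and responses below are
# synthetic stand-ins for the CCD runs, not the paper's data.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(31, 4))   # coded N, f, a, r
y = 0.2 + 0.03 * X[:, 0] - 0.02 * X[:, 1] + rng.normal(0.0, 0.005, 31)

def design_matrix(X):
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(4)]                 # linear terms
    cols += [X[:, i] ** 2 for i in range(4)]            # quadratic terms
    cols += [X[:, i] * X[:, j]
             for i in range(4) for j in range(i + 1, 4)]  # interactions
    return np.column_stack(cols)

A = design_matrix(X)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ beta
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 of the fitted second-order model: {r2:.3f}")
```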

  3. Theoretical Modeling of Rock Breakage by Hydraulic and Mechanical Tool

    Directory of Open Access Journals (Sweden)

    Hongxiang Jiang

    2014-01-01

    Full Text Available Rock breakage by coupled mechanical and hydraulic action has been developed over the past several decades, but theoretical studies on rock fragmentation by a mechanical tool with water pressure assistance are still lacking. The theoretical model of rock breakage by a mechanical tool was developed based on rock fracture mechanics and the solution of Boussinesq's problem; it can explain the process of rock fragmentation as well as predict the peak reacting force. The theoretical model of rock breakage by coupled mechanical and hydraulic action was developed according to the superposition principle of intensity factors at the crack tip, and the reacting force of the mechanical tool assisted by hydraulic action can be reduced considerably if a crack with a critical length can be produced by mechanical or hydraulic impact. The experimental results indicated that the peak reacting force could be reduced by about 15% with the assistance of medium water pressure, and the quick reduction of the reacting force after the peak value decreased the specific energy consumption of rock fragmentation by the mechanical tool. Crack formation by mechanical or hydraulic impact is the prerequisite for improving the effectiveness of combined breakage.
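
    The superposition step mentioned above can be stated compactly in generic fracture mechanics notation (not necessarily the paper's exact symbols): the mode-I stress intensity factors from the mechanical tool and the water pressure add at the crack tip, and propagation occurs when their sum reaches the fracture toughness.

```latex
% Superposition of mode-I stress intensity factors at the crack tip:
K_{\mathrm{I}}^{\mathrm{total}} = K_{\mathrm{I}}^{\mathrm{mech}} + K_{\mathrm{I}}^{\mathrm{hyd}},
\qquad \text{crack propagation when } K_{\mathrm{I}}^{\mathrm{total}} \ge K_{\mathrm{IC}}
```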

  4. Using ProModel as a simulation tools to assist plant layout design and planning: Case study plastic packaging factory

    Directory of Open Access Journals (Sweden)

    Pochamarn Tearwattanarattikal

    2008-01-01

    Full Text Available This study concerns the application of a simulation model to assist decision making on capacity expansion and plant layout design and planning. The plant layout design concept is applied first to create the physical layouts; the simulation model is then used to test the capability of the plant to meet various demand forecast scenarios. The study employed the ProModel package as a tool, using the model to compare performances in terms of % utilization, characteristics of WIP, and the ability to meet due dates. The verification and validation stages were performed before running the scenarios. The model runs daily production, and the capacity-constraining resources are then identified by % utilization. The capacity expansion policy can be extra shift-working hours or an increased number of machines. After the capacity expansion solutions are found, the physical layout is selected based on the criteria of space available for WIP and easy flow of material.

  5. Modeling Constellation Virtual Missions Using the Vdot(Trademark) Process Management Tool

    Science.gov (United States)

    Hardy, Roger; ONeil, Daniel; Sturken, Ian; Nix, Michael; Yanez, Damian

    2011-01-01

    The authors have identified a software tool suite that will support NASA's Virtual Mission (VM) effort. This is accomplished by transforming a spreadsheet database of mission events, task inputs and outputs, timelines, and organizations into process visualization tools and a Vdot process management model that includes embedded analysis software as well as requirements and information related to data manipulation and transfer. This paper describes the progress to date, the application of the Virtual Mission not only to Constellation but to other architectures, and the pertinence to other aerospace applications. Vdot's intuitive visual interface brings VMs to life by turning static, paper-based processes into active, electronic processes that can be deployed, executed, managed, verified, and continuously improved. A VM can be executed using a computer-based, human-in-the-loop, real-time format, under the direction and control of the NASA VM Manager. Engineers in the various disciplines will not have to be Vdot-proficient but rather can fill out on-line, Excel-type databases with the mission information discussed above. The authors' tool suite converts this database into several process visualization tools for review and into Microsoft Project, which can be imported directly into Vdot. Many tools can be embedded directly into Vdot, and when the necessary data/information is received from a preceding task, the analysis can be initiated automatically. Other NASA analysis tools are too complex for this process, but Vdot automatically notifies the tool user when the data has been received so that analysis can begin. The VM can be simulated from end to end using the authors' tool suite. The planned approach for the Vdot-based process simulation is to generate the process model from a database; other advantages of this semi-automated approach are that the participants can be geographically remote and, after refining the process models via the human-in-the-loop simulation, the

  6. A strategic management model for evaluation of health, safety and environmental performance.

    Science.gov (United States)

    Abbaspour, Majid; Toutounchian, Solmaz; Roayaei, Emad; Nassiri, Parvin

    2012-05-01

    A strategic health, safety, and environmental management system (HSE-MS) involves systematic and cooperative planning in each phase of the lifecycle of a project to ensure that interaction among the industry group, client, contractor, stakeholder, and host community takes place with the highest level of health, safety, and environmental performance standards. Therefore, it seems necessary to assess the HSE-MS performance of contractor(s) by a comparative strategic management model with the aim of continuous improvement. The present Strategic Management Model (SMM) is illustrated by a case study, and the results show that the model is a suitable management tool for decision making in a contract environment, especially in oil and gas fields, based on accepted international standards and within the framework of the management Deming cycle. To develop this model, a data bank was created, which includes statistical data calculated by converting qualitative HSE performance data into quantitative values. On this basis, the structure of the model was formed by defining HSE performance indicators according to the HSE-MS model. In total, 178 indicators were selected and grouped into four attributes. The model output provides quantitative measures of HSE-MS performance as a percentage of an ideal level, with a maximum possible score for each attribute. Defining the strengths and weaknesses of the contractor(s) is another capability of this model. On the other hand, the model provides a ranking that can be used as the basis for decision making at the contractors' pre-qualification phase or during the execution of the project.

  7. Applications and issues of GIS as tool for civil engineering modeling

    Science.gov (United States)

    Miles, S.B.; Ho, C.L.

    1999-01-01

    A tool that has proliferated within civil engineering in recent years is the geographic information system (GIS). The goal of a tool is to supplement ability and knowledge that already exist, not to serve as a replacement for that which is lacking. To secure the benefits and avoid misuse of a burgeoning tool, engineers must understand the limitations, alternatives, and context of the tool. The common benefits of using GIS as a supplement to engineering modeling are summarized. Several brief case studies of GIS modeling applications are taken from popular civil engineering literature to demonstrate the wide use and varied implementation of GIS across the discipline. Drawing from the case studies, limitations regarding traditional GIS data models and the implementation of civil engineering models within current GIS are identified and countered by discussing the direction of the next generation of GIS. The paper concludes by highlighting the potential for the misuse of GIS in the context of engineering modeling and suggests that this potential can be reduced through education and awareness. The goal of this paper is to promote awareness of the issues related to GIS-based modeling and to assist in the formulation of questions regarding the application of current GIS. The technology has experienced much publicity of late, with many engineers being perhaps too excited about the usefulness of current GIS. An undoubtedly beneficial side effect of this, however, is that engineers are becoming more aware of GIS and, hopefully, of the associated subtleties. Civil engineers must stay informed of GIS issues and progress, but more importantly, civil engineers must inform the GIS community to direct the technology development optimally.

  8. Effect of vacuum oxy-nitrocarburizing on the microstructure of tool steels: an experimental and modeling study

    Directory of Open Access Journals (Sweden)

    Nikolova Maria

    2017-01-01

    Full Text Available The thermochemical treatment of tool steels improves the performance of components with respect to surface hardness, wear and tribological performance, as well as corrosion resistance. Compared to the conventional gas ferritic nitrocarburizing process, the original vacuum oxy-nitrocarburizing is a time- and cost-effective, environmentally friendly gas process. Because of the oxidizing nature of the gas atmosphere, there is no need to perform subsequent post-oxidation. In this study, a vacuum oxy-nitrocarburizing process was carried out on four tool steels (AISI H10, H11, H21 and D2) at 570 °C, after hardening and single tempering. The structural analysis of the compound and diffusion layers was performed by optical and electron microscopy, X-ray diffraction and glow discharge optical emission spectrometry (GDOES) methods. A largely monophase ε-layer is formed, with a carbon accumulation in the area adjacent to the substrate. The overlying oxides adjacent to the ε-carbonitride phase contained Fe3O4 (magnetite) as a main constituent. A thermodynamic modelling approach was also applied to understand and optimize the process. The "Equilib" module of the FactSage software, which uses the Gibbs energy minimization method, was used to estimate the possible products during the vacuum oxy-nitrocarburizing process.
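
    The Gibbs energy minimization performed by the Equilib module can be stated generically as follows (the standard constrained formulation, not FactSage's internal notation): minimize the total Gibbs energy over the species amounts n_i subject to element balances, where a_{ji} is the number of atoms of element j in species i and b_j the total amount of element j.

```latex
% Equilibrium composition by Gibbs energy minimization (standard formulation):
\min_{n_i \ge 0} \; G = \sum_i n_i \mu_i
  = \sum_i n_i \left( \mu_i^{\circ} + RT \ln a_i \right)
\quad \text{subject to} \quad
\sum_i a_{ji} \, n_i = b_j \;\; \text{for each element } j
```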

  9. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results We developed a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMMs and their parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both the HMMEditor software and web service are freely available.
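
    To make the Viterbi-path feature concrete, the sketch below decodes the most probable state path for a plain two-state HMM; a real profile HMM adds match, insert, and delete state structure, which is omitted here, and all probabilities are toy values.

```python
import math

# Minimal Viterbi decoding for a plain HMM. A real profile HMM adds match/
# insert/delete state structure, omitted here; all probabilities are toys.
def viterbi(obs, states, start_p, trans_p, emit_p):
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            prev = max(states, key=lambda p: V[t - 1][p] + math.log(trans_p[p][s]))
            V[t][s] = (V[t - 1][prev] + math.log(trans_p[prev][s])
                       + math.log(emit_p[s][obs[t]]))
            back[t][s] = prev
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

states = ("M1", "M2")
start = {"M1": 0.9, "M2": 0.1}
trans = {"M1": {"M1": 0.6, "M2": 0.4}, "M2": {"M1": 0.3, "M2": 0.7}}
emit = {"M1": {"A": 0.7, "C": 0.3}, "M2": {"A": 0.2, "C": 0.8}}
print(viterbi("ACC", states, start, trans, emit))  # most probable state path
```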

  10. Predictive tool of energy performance of cold storage in agrifood industries: The Portuguese case study

    International Nuclear Information System (INIS)

    Nunes, José; Neves, Diogo; Gaspar, Pedro D.; Silva, Pedro D.; Andrade, Luís P.

    2014-01-01

    Highlights: • A predictive tool for assessment of the energy performance of agrifood industries that use cold storage is developed. • The correlations used by the predictive tool result from the largest number of data sets collected to date in Portugal. • Strong relationships between raw material, energy consumption and volume of cold stores were established. • Case studies are analyzed that demonstrate the applicability of the tool. • The tool results are useful in decision-making on practical measures for the improvement of energy efficiency. - Abstract: Food processing and conservation represent decisive factors for the sustainability of the planet, given the significant growth of the world population in the last decades. Therefore, the cooling process during the manufacture and/or storage of food products has been the subject of study and improvement in order to ensure the food supply with good quality and safety. A predictive tool for assessment of the energy performance of agrifood industries that use cold storage is developed in order to contribute to the improvement of the energy efficiency of this industry. The predictive tool is based on a set of correlated characteristic parameters: amount of raw material annually processed, annual energy consumption, and volume of cold rooms. Case studies of application of the predictive tool consider industries in the meat sector, specifically slaughterhouses. The results obtained support decision-making on practical measures for the improvement of energy efficiency in this industry.

  11. Lightweight approach to model traceability in a CASE tool

    Science.gov (United States)

    Vileiniskis, Tomas; Skersys, Tomas; Pavalkis, Saulius; Butleris, Rimantas; Butkiene, Rita

    2017-07-01

    A term "model-driven" is not at all a new buzzword within the ranks of system development community. Nevertheless, the ever increasing complexity of model-driven approaches keeps fueling all kinds of discussions around this paradigm and pushes researchers forward to research and develop new and more effective ways to system development. With the increasing complexity, model traceability, and model management as a whole, becomes indispensable activities of model-driven system development process. The main goal of this paper is to present a conceptual design and implementation of a practical lightweight approach to model traceability in a CASE tool.

  12. An empirical approach for evaluating the usability of model-driven tools

    NARCIS (Netherlands)

    Condori-Fernandez, Nelly; Panach, Jose Ignacio; Baars, Arthur Iwan; Vos, Tanja; Pastor, Oscar

    2013-01-01

    MDD tools are very useful for drawing conceptual models and automating code generation. Even though this would bring many benefits, wide adoption of MDD tools is not yet a reality. Various research activities are being undertaken to find out why and to provide the required solutions. However, insufficient

  13. ModelMage: a tool for automatic model generation, selection and management.

    Science.gov (United States)

    Flöttmann, Max; Schaber, Jörg; Hoops, Stephan; Klipp, Edda; Mendes, Pedro

    2008-01-01

    Mathematical modeling of biological systems usually involves implementing, simulating, and discriminating several candidate models that represent alternative hypotheses. Generating and managing these candidate models is a tedious and difficult task and can easily lead to errors. ModelMage is a tool that facilitates management of candidate models. It is designed for the easy and rapid development, generation, simulation, and discrimination of candidate models. The main idea of the program is to automatically create a defined set of model alternatives from a single master model. The user provides only one SBML model and a set of directives from which the candidate models are created by leaving out species, modifiers or reactions. After generating the models, the software can automatically fit all of them to the data and provide a ranking for model selection, in case data is available. In contrast to other model generation programs, ModelMage aims at generating only a limited set of models that the user can precisely define. ModelMage uses COPASI as a simulation and optimization engine; thus, all simulation and optimization features of COPASI are readily incorporated. ModelMage can be downloaded from http://sysbio.molgen.mpg.de/modelmage and is distributed as free software.
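
    The combinatorial idea behind candidate generation can be sketched generically, as below; the reaction names and the set of removable reactions are hypothetical and stand in for ModelMage's SBML handling and directive syntax.

```python
from itertools import combinations

# Generic sketch of candidate-model generation: from a master model, produce
# every variant obtained by leaving out subsets of removable reactions.
# Reaction names and the removable set are hypothetical, standing in for
# ModelMage's SBML handling and directive syntax.
master = {"r_synthesis", "r_degradation", "r_feedback", "r_transport"}
removable = {"r_feedback", "r_transport"}

candidates = []
for k in range(len(removable) + 1):
    for dropped in combinations(sorted(removable), k):
        candidates.append(master - set(dropped))

for i, model in enumerate(candidates):
    print(f"candidate {i}: {sorted(model)}")
# Each of the 2^2 = 4 candidates would then be fitted to data and ranked.
```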

  14. An Overview of the Object Protocol Model (OPM) and the OPM Data Management Tools.

    Science.gov (United States)

    Chen, I-Min A.; Markowitz, Victor M.

    1995-01-01

    Discussion of database management tools for scientific information focuses on the Object Protocol Model (OPM) and data management tools based on OPM. Topics include the need for new constructs for modeling scientific experiments, modeling object structures and experiments in OPM, queries and updates, and developing scientific database applications…

  15. Automated Improvement of Software Architecture Models for Performance and Other Quality Attributes

    OpenAIRE

    Koziolek, Anne

    2013-01-01

    Quality attributes, such as performance or reliability, are crucial for the success of a software system and largely influenced by the software architecture. Their quantitative prediction supports systematic, goal-oriented software design and forms a base of an engineering approach to software design. This thesis proposes a method and tool to automatically improve component-based software architecture (CBA) models based on such quantitative quality prediction techniques.

  16. A set of tools for determining the LAT performance in specific applications

    International Nuclear Information System (INIS)

    Lott, B.; Ballet, J.; Chiang, J.; Lonjou, V.; Funk, S.

    2007-01-01

    The poster presents a set of simple tools being developed to predict GLAST's performance in specific cases, such as the accumulation time needed to reach a given significance or statistical accuracy for a particular source. Different examples are given, such as the generation of a full-sky sensitivity map.
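
    One such simple-tool calculation can be sketched directly: in the background-dominated limit the significance after time t grows as s t / sqrt(b t), so the accumulation time needed to reach a target significance follows by inversion. The source and background rates below are placeholders, not LAT values.

```python
# Accumulation time to a target significance in the background-dominated
# limit: sigma ~ s*t / sqrt(b*t), hence t = sigma^2 * b / s^2.
# Source and background rates below are placeholders, not LAT values.
def time_to_significance(sigma_target, src_rate, bkg_rate):
    """Rates in counts/s; returns required exposure time in seconds."""
    return sigma_target ** 2 * bkg_rate / src_rate ** 2

t = time_to_significance(5.0, src_rate=1e-3, bkg_rate=5e-3)
print(f"{t / 86400.0:.2f} days to reach 5 sigma")
```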

  17. Tribological performances of new steel grades for hot stamping tools

    Science.gov (United States)

    Medea, F.; Venturato, G.; Ghiotti, A.; Bruschi, S.

    2017-09-01

    In recent years, the use of High Strength Steels (HSS) for structural parts in car body-in-white manufacturing has rapidly increased thanks to their favourable strength-to-weight ratio and stiffness, which allow a reduction of fuel consumption in line with the new, stricter regulations for the control of CO2 emissions. A survey of the technical and scientific literature shows a large interest in the development of different coatings for the blanks, from the traditional Al-Si up to new Zn-based coatings, and in the analysis of hard PVD and CVD coatings and plasma nitriding applied to the tools. By contrast, fewer investigations have focused on the development and testing of new tool steel grades capable of improving the wear resistance and the thermal properties that are required for in-die quenching during forming. On this basis, the paper deals with the analysis and comparison of the tribological performance, in terms of wear, friction and heat transfer, of new tool steel grades for high-temperature applications, characterized by a higher thermal conductivity than the commonly used tool steels. The testing equipment and procedures, as well as the measurement analyses used to evaluate the friction coefficient, wear and heat transfer phenomena, are presented. Emphasis is given to the physical simulation techniques that were specifically developed to reproduce the thermal and mechanical cycles experienced by the metal sheets and dies in industrial practice. The reference industrial process is the direct hot stamping of 22MnB5 HSS coated with the common Al-Si coating for automotive applications.

  18. Chimera Grid Tools

    Science.gov (United States)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  19. Metabolic engineering tools in model cyanobacteria.

    Science.gov (United States)

    Carroll, Austin L; Case, Anna E; Zhang, Angela; Atsumi, Shota

    2018-03-26

    Developing sustainable routes for producing chemicals and fuels is one of the most important challenges in metabolic engineering. Photoautotrophic hosts are particularly attractive because of their potential to utilize light as an energy source and CO2 as a carbon substrate through photosynthesis. Cyanobacteria are unicellular organisms capable of photosynthesis and CO2 fixation. While engineering in heterotrophs, such as Escherichia coli, has resulted in a plethora of tools for strain development and hosts capable of producing valuable chemicals efficiently, these techniques are not always directly transferable to cyanobacteria. However, recent efforts have led to an increase in the scope and scale of chemicals that cyanobacteria can produce. Adaptations of important metabolic engineering tools have also been optimized to function in photoautotrophic hosts, including Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR)-Cas9, 13C Metabolic Flux Analysis (MFA), and Genome-Scale Modeling (GSM). This review explores innovations in cyanobacterial metabolic engineering and highlights how photoautotrophic metabolism has shaped their development. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  20. Visiting Vehicle Ground Trajectory Tool

    Science.gov (United States)

    Hamm, Dustin

    2013-01-01

    The International Space Station (ISS) Visiting Vehicle Group needed a targeting tool for vehicles that rendezvous with the ISS. The Visiting Vehicle Ground Trajectory targeting tool provides the ability to perform both real-time and planning operations for the Visiting Vehicle Group. This tool provides a highly reconfigurable base, which allows the Visiting Vehicle Group to perform their work. The application is composed of a telemetry processing function, a relative motion function, a targeting function, a vector view, and 2D/3D world-map-type graphics. The software tool provides the ability to plan a rendezvous trajectory for vehicles that visit the ISS. It models these relative trajectories using planned and real-time data from the vehicle. The tool monitors ongoing rendezvous trajectory relative motion and ensures that visiting vehicles stay within agreed corridors. The software provides the ability to update or re-plan a rendezvous to support contingency operations. Previously, new parameters could not be added and incorporated into the system on the fly; if an unanticipated capability was not discovered until the vehicle was flying, there was no way to update the tool.
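
    Relative-motion planning for rendezvous with a target in a near-circular orbit is classically based on the Clohessy-Wiltshire equations; whether this particular tool uses them is not stated, so they are given below only as the standard reference model (n is the target's mean motion; x radial, y along-track, z cross-track).

```latex
% Clohessy-Wiltshire equations for relative motion about a circular orbit:
\ddot{x} - 2n\dot{y} - 3n^{2}x = 0, \qquad
\ddot{y} + 2n\dot{x} = 0, \qquad
\ddot{z} + n^{2}z = 0
```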

  1. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  2. Performance modelling for product development of advanced window systems

    DEFF Research Database (Denmark)

    Appelfeld, David

    The research presented in this doctoral thesis shows how the product development (PD) of Complex Fenestration Systems (CFSs) can be facilitated by computer-based analysis to improve the energy efficiency of fenestration systems as well as to improve the indoor environment. The first chapter defines the hypothesis and objectives of the thesis, which is followed by an extended introduction and background. The third chapter briefly suggests the PD framework which is suitable for CFSs. The fourth and fifth chapters refer to the detailed performance modelling of thermal properties (chapter 4) and optical properties (chapter 5) of CFSs. The last chapter concludes the thesis and the individual investigations. It is complicated to holistically evaluate the performance of a prototyped system, since simulation programs evaluate standardised products such as aluminium venetian blinds. State-of-the-art tools...

  3. Test of a causal Human Resource Management-Performance Linkage Model: Evidence from the Greek manufacturing sector

    OpenAIRE

    Katou, A.; Katou, A.

    2011-01-01

    Although a number of studies have recognized the relationship between Human Resource Management (HRM) policies and organisational performance, the mechanisms through which HRM policies lead to organisational performance still remain unexplored. The purpose of this paper is to investigate the pathways leading from HRM policies to organisational performance by using structural equation modelling. Specifically, this analytical tool has been used to test a research framework that is constituted ...

  4. A Decision Support Model and Tool to Assist Financial Decision-Making in Universities

    Science.gov (United States)

    Bhayat, Imtiaz; Manuguerra, Maurizio; Baldock, Clive

    2015-01-01

    In this paper, a model and tool are proposed to assist universities and other mission-based organisations to systematically ascertain the optimal portfolio of projects, in any year, meeting the organisation's risk tolerances and available funds. The model and tool presented build on previous work on university operations and decision support systems…
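
    At its core, such a selection problem can be viewed as a 0/1 knapsack: choose the subset of projects that maximizes value within the available funds (risk constraints are omitted here). The sketch below uses invented project data and brute force, which is adequate at this toy size.

```python
from itertools import combinations

# Illustrative 0/1 knapsack view of the portfolio problem: pick the subset of
# projects with maximum total value whose total cost fits the available funds.
# Project names, costs, and values are invented; brute force suffices here.
projects = {"A": (120, 300), "B": (200, 450), "C": (90, 180), "D": (150, 400)}
budget = 350

best_value, best_subset = 0, ()
for k in range(len(projects) + 1):
    for subset in combinations(projects, k):
        cost = sum(projects[p][0] for p in subset)
        value = sum(projects[p][1] for p in subset)
        if cost <= budget and value > best_value:
            best_value, best_subset = value, subset

print(best_subset, best_value)   # ('B', 'D') 850 with this toy data
```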

  5. Towards an Experimental Framework for Measuring Usability of Model-Driven Tools

    NARCIS (Netherlands)

    Panach, Jose Ignacio; Condori-Fernandez, Nelly; Baar, Arthur; Vos, Tanja; Romeu, Ignacio; Pastor, Oscar; Campos, Pedro; Graham, Nicholas; Jorge, Joaquim; Nunes, Nuno; Palanque, Philippe; Winckler, Marco

    2011-01-01

    According to the Model-Driven Development (MDD) paradigm, analysts can substantially improve the software development process concentrating their efforts on a conceptual model, which can be transformed into code by means of transformation rules applied by a model compiler. However, MDD tools are not

  6. Transition Towards Energy Efficient Machine Tools

    CERN Document Server

    Zein, André

    2012-01-01

    Energy efficiency represents a cost-effective and immediate strategy of a sustainable development. Due to substantial environmental and economic implications, a strong emphasis is put on the electrical energy requirements of machine tools for metalworking processes. The improvement of energy efficiency is however confronted with diverse barriers, which sustain an energy efficiency gap of unexploited potential. The deficiencies lie in the lack of information about the actual energy requirements of machine tools, a minimum energy reference to quantify improvement potential and the possible actions to improve the energy demand. Therefore, a comprehensive concept for energy performance management of machine tools is developed which guides the transition towards energy efficient machine tools. It is structured in four innovative concept modules, which are embedded into step-by-step workflow models. The capability of the performance management concept is demonstrated in an automotive manufacturing environment. The ...

  7. Operation room tool handling and miscommunication scenarios: an object-process methodology conceptual model.

    Science.gov (United States)

    Wachs, Juan P; Frenkel, Boaz; Dori, Dov

    2014-11-01

    various levels of detail; each level is depicted in a separate diagram, and all the diagrams are "aware" of each other as part of the whole model. Providing an ontology of the verbal and non-verbal modalities of communication in the OR, the resulting conceptual model is a solid basis for analyzing and understanding the sources of the large variety of errors occurring in the course of an operation, providing an opportunity to decrease the quantity and severity of mistakes related to the use and misuse of surgical instrumentation. Since the model is event-driven rather than person-driven, the focus is on the factors causing the errors rather than on the specific person. This approach advocates searching for technological solutions to alleviate tool-related errors rather than finger-pointing. Concretely, the model was validated through a structured questionnaire, and it was found that surgeons agreed that the conceptual model was flexible (3.8 of 5, std=0.69), accurate, and generalizable (3.7 of 5, std=0.37 and 3.7 of 5, std=0.85, respectively). The detailed conceptual model of the tool-handling subsystem of an operation performed in an OR focuses on the details of the communication and interactions taking place between the surgeon and the surgical technician during an operation, with the objective of pinpointing the exact circumstances in which errors can happen. Exact and concise specification of the communication events in general, and the surgical instrument requests in particular, is a prerequisite for a methodical analysis of the various modes of errors and the circumstances under which they occur. This has significant potential value both in reducing tool-handling-related errors during an operation and in providing a solid formal basis for designing a cybernetic agent which can replace a surgical technician in routine tool handling activities during an operation, freeing the technician to focus on quality assurance, monitoring and control of the cybernetic agent

  8. GAMBIT: the global and modular beyond-the-standard-model inference tool

    Science.gov (United States)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian

    2017-11-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.

  9. GAMBIT. The global and modular beyond-the-standard-model inference tool

    Energy Technology Data Exchange (ETDEWEB)

    Athron, Peter; Balazs, Csaba [Monash University, School of Physics and Astronomy, Melbourne, VIC (Australia); Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); Bringmann, Torsten; Dal, Lars A.; Gonzalo, Tomas E.; Krislock, Abram; Raklev, Are [University of Oslo, Department of Physics, Oslo (Norway); Buckley, Andy [University of Glasgow, SUPA, School of Physics and Astronomy, Glasgow (United Kingdom); Chrzaszcz, Marcin [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Polish Academy of Sciences, H. Niewodniczanski Institute of Nuclear Physics, Krakow (Poland); Conrad, Jan; Edsjoe, Joakim; Farmer, Ben; Lundberg, Johan [AlbaNova University Centre, Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Stockholm University, Department of Physics, Stockholm (Sweden); Cornell, Jonathan M. [McGill University, Department of Physics, Montreal, QC (Canada); Dickinson, Hugh [University of Minnesota, Minnesota Institute for Astrophysics, Minneapolis, MN (United States); Jackson, Paul; White, Martin [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); University of Adelaide, Department of Physics, Adelaide, SA (Australia); Kvellestad, Anders; Savage, Christopher [NORDITA, Stockholm (Sweden); McKay, James [Imperial College London, Blackett Laboratory, Department of Physics, London (United Kingdom); Mahmoudi, Farvah [Univ Lyon, Univ Lyon 1, ENS de Lyon, CNRS, Centre de Recherche Astrophysique de Lyon UMR5574, Saint-Genis-Laval (France); CERN, Theoretical Physics Department, Geneva (Switzerland); Martinez, Gregory D. [University of California, Physics and Astronomy Department, Los Angeles, CA (United States); Putze, Antje [LAPTh, Universite de Savoie, CNRS, Annecy-le-Vieux (France); Ripken, Joachim [Max Planck Institute for Solar System Research, Goettingen (Germany); Rogan, Christopher [Harvard University, Department of Physics, Cambridge, MA (United States); Saavedra, Aldo [Australian Research Council Centre of Excellence for Particle Physics at the Tera-scale (Australia); The University of Sydney, Faculty of Engineering and Information Technologies, Centre for Translational Data Science, School of Physics, Sydney, NSW (Australia); Scott, Pat [Imperial College London, Blackett Laboratory, Department of Physics, London (United Kingdom); Seo, Seon-Hee [Seoul National University, Department of Physics and Astronomy, Seoul (Korea, Republic of); Serra, Nicola [Universitaet Zuerich, Physik-Institut, Zurich (Switzerland); Weniger, Christoph [University of Amsterdam, GRAPPA, Institute of Physics, Amsterdam (Netherlands); Wild, Sebastian [DESY, Hamburg (Germany); Collaboration: The GAMBIT Collaboration

    2017-11-15

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org. (orig.)

  10. GAMBIT. The global and modular beyond-the-standard-model inference tool

    International Nuclear Information System (INIS)

    Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Dal, Lars A.; Gonzalo, Tomas E.; Krislock, Abram; Raklev, Are; Buckley, Andy; Chrzaszcz, Marcin; Conrad, Jan; Edsjoe, Joakim; Farmer, Ben; Lundberg, Johan; Cornell, Jonathan M.; Dickinson, Hugh; Jackson, Paul; White, Martin; Kvellestad, Anders; Savage, Christopher; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; Wild, Sebastian

    2017-01-01

    We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily-extendible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org. (orig.)

  11. Neuro-Simulation Tool for Enhanced Oil Recovery Screening and Reservoir Performance Prediction

    Directory of Open Access Journals (Sweden)

    Soheil Bahrekazemi

    2017-09-01

    Full Text Available Selecting a suitable enhanced oil recovery (EOR) method for an oilfield is one of the decisions made before production under the natural drive mechanism declines. It requires in-depth knowledge of the reservoir's rock and fluid properties and of the available equipment, as well as an economic evaluation. Feeding such data into a simulator and running the associated workflows is generally very time consuming and costly. An appropriate screening tool is therefore required before any operation is performed; it reduces the number of cases to study and the time needed to design either a pilot section or production under field conditions. In this research, two different and useful screening tools are presented through a graphical user interface. The outputs of just over 900 simulations and verified screening-criteria tables were employed to design them. By means of the gathered data and artificial neural networks, two complementary tools for assessing the suitability of enhanced oil recovery methods were developed. The first screens EOR processes against published tables and charts; the second, a neuro-simulation tool, performs economic evaluation of miscible and immiscible injection of carbon dioxide, nitrogen and natural gas into the reservoir. Through the graphical user interface, the user can identify a suitable method from plots of oil recovery over 20 years of production, gas-injection cost per produced barrel and cumulative oil production, and thereby design the most efficient scenario.
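
    The screening step above is essentially a supervised classification problem: reservoir and fluid properties in, candidate recovery method out. As a rough illustration only (the paper's network architecture, input features and classes are not given here, so everything below is a hypothetical placeholder), a scikit-learn sketch:

```python
# Hypothetical EOR screening classifier; features, classes and data are
# invented placeholders, not the paper's trained network.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Columns: porosity (frac), permeability (mD), oil gravity (API),
# depth (m), oil viscosity (cP)
X = np.array([
    [0.25, 500.0, 18.0,  900.0, 800.0],
    [0.12,  50.0, 35.0, 2500.0,   2.0],
    [0.20, 300.0, 25.0, 1500.0,  50.0],
    [0.10,  10.0, 40.0, 3000.0,   0.8],
])
y = ["steam_injection", "CO2_miscible", "polymer_flood", "N2_miscible"]

screener = make_pipeline(StandardScaler(),
                         MLPClassifier(hidden_layer_sizes=(16, 8),
                                       max_iter=5000, random_state=0))
screener.fit(X, y)
print(screener.predict([[0.22, 400.0, 20.0, 1100.0, 300.0]]))
```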

  12. Conversion and Validation of Distribution System Model from a QSTS-Based Tool to a Real-Time Dynamic Phasor Simulator: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Chamana, Manohar; Prabakar, Kumaraguru; Palmintier, Bryan; Baggu, Murali M.

    2017-04-11

    A software process is developed to convert distribution network models from a quasi-static time-series tool (OpenDSS) to a real-time dynamic phasor simulator (ePHASORSIM). The description of this process in this paper would be helpful for researchers who intend to perform similar conversions. The converter could be utilized directly by users of real-time simulators who intend to perform software-in-the-loop or hardware-in-the-loop tests on large distribution test feeders for a range of use cases, including testing functions of advanced distribution management systems against a simulated distribution system. In the future, the developers intend to release the conversion tool as open source to enable use by others.
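
    As a flavour of what such a converter must do, the sketch below parses OpenDSS-style "New ..." element definitions into flat records that could then be written to a phasor-solver input sheet; the output field layout is illustrative, not ePHASORSIM's actual schema.

```python
# Toy translation step: OpenDSS "New <Type>.<Name> key=value ..." lines
# into dictionaries. A real converter must also handle line geometries,
# transformers, regulators, and so on.
import re

def parse_opendss_line(line):
    """Parse e.g. 'New Load.L1 Bus1=632.1.2.3 kV=4.16 kW=100 kvar=60'."""
    m = re.match(r"New\s+(\w+)\.(\S+)\s+(.*)", line, re.IGNORECASE)
    if not m:
        return None
    elem_type, name, rest = m.groups()
    props = dict(p.split("=", 1) for p in rest.split() if "=" in p)
    return {"type": elem_type, "name": name, **props}

print(parse_opendss_line("New Load.L1 Bus1=632.1.2.3 kV=4.16 kW=100 kvar=60"))
# {'type': 'Load', 'name': 'L1', 'Bus1': '632.1.2.3', 'kV': '4.16', ...}
```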

  13. Development of a climate data analysis tool (CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Marlais, S.M.

    1997-09-01

    The Climate Data Analysis Tool (CDAT) is designed to provide the Program for Climate Model Diagnosis and Intercomparison (PCMDI) at Lawrence Livermore National Laboratory, California, with the capabilities needed to analyze model data with little effort on the part of the scientist, while performing complex mathematical calculations and graphically displaying the results. This software will meet the demanding needs of climate scientists by providing the tools necessary to diagnose, validate, and intercompare large observational and global climate model datasets.

  14. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    Science.gov (United States)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the
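
    TaskDL and mpiDL are IDL tools, but the task-farm pattern they automate is straightforward to express in any MPI binding. A minimal Python/mpi4py sketch (assuming at least as many tasks as workers), in which rank 0 hands out independent tasks and collects results:

```python
# Master/worker task farm; run with e.g. `mpiexec -n 4 python taskfarm.py`.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    tasks = list(range(20))              # independent work items
    results = []
    for dest in range(1, size):          # seed each worker with one task
        comm.send(tasks.pop(), dest=dest)
    pending = size - 1                   # workers still running
    while pending:
        status = MPI.Status()
        results.append(comm.recv(source=MPI.ANY_SOURCE, status=status))
        if tasks:                        # more work: keep that worker busy
            comm.send(tasks.pop(), dest=status.Get_source())
        else:                            # no more work: tell it to stop
            comm.send(None, dest=status.Get_source())
            pending -= 1
    print(sorted(results))
else:
    while True:
        task = comm.recv(source=0)
        if task is None:
            break
        comm.send(task * task, dest=0)   # stand-in for the real per-task analysis
```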

  15. Inspection of the Math Model Tools for On-Orbit Assessment of Impact Damage Report

    Science.gov (United States)

    Harris, Charles E.; Raju, Ivatury S.; Piascik, Robert S.; KramerWhite, Julie A.; Labbe, Steve G.; Rotter, Hank A.

    2007-01-01

    In Spring of 2005, the NASA Engineering and Safety Center (NESC) was engaged by the Space Shuttle Program (SSP) to peer review the suite of analytical tools being developed to support the determination of impact and damage tolerance of the Orbiter Thermal Protection Systems (TPS). The NESC formed an independent review team with the core disciplines of materials, flight sciences, structures, mechanical analysis and thermal analysis. The Math Model Tools reviewed included damage prediction and stress analysis, aeroheating analysis, and thermal analysis tools. Some tools are physics-based and others are empirically derived. Each tool was created for a specific use and timeframe, from certification to real-time pre-launch assessments. In addition, the tools are used together in an integrated strategy for assessing the ramifications of impact damage to tile and RCC. The NESC team conducted a peer review of the engineering data package for each Math Model Tool. This report contains the summary of the team observations and recommendations from these reviews.

  16. Assessment of the Clinical Trainer as a Role Model: A Role Model Apperception Tool (RoMAT)

    NARCIS (Netherlands)

    Jochemsen-van der Leeuw, H. G. A. Ria; van Dijk, Nynke; Wieringa-de Waard, Margreet

    2014-01-01

    Purpose: Positive role modeling by clinical trainers is important for helping trainees learn professional and competent behavior. The authors developed and validated an instrument to assess clinical trainers as role models: the Role Model Apperception Tool (RoMAT). Method: On the basis of a 2011

  17. Tool-specific performance of vibration-reducing gloves for attenuating fingers-transmitted vibration

    Science.gov (United States)

    Welcome, Daniel E.; Dong, Ren G.; Xu, Xueyan S.; Warren, Christopher; McDowell, Thomas W.

    2016-01-01

    BACKGROUND Fingers-transmitted vibration can cause vibration-induced white finger. The effectiveness of vibration-reducing (VR) gloves for reducing hand transmitted vibration to the fingers has not been sufficiently examined. OBJECTIVE The objective of this study is to examine tool-specific performance of VR gloves for reducing finger-transmitted vibrations in three orthogonal directions (3D) from powered hand tools. METHODS A transfer function method was used to estimate the tool-specific effectiveness of four typical VR gloves. The transfer functions of the VR glove fingers in three directions were either measured in this study or during a previous study using a 3D laser vibrometer. More than seventy vibration spectra of various tools or machines were used in the estimations. RESULTS When assessed based on frequency-weighted acceleration, the gloves provided little vibration reduction. In some cases, the gloves amplified the vibration by more than 10%, especially the neoprene glove. However, the neoprene glove did the best when the assessment was based on unweighted acceleration. The neoprene glove was able to reduce the vibration by 10% or more of the unweighted vibration for 27 out of the 79 tools. If the dominant vibration of a tool handle or workpiece was in the shear direction relative to the fingers, as observed in the operation of needle scalers, hammer chisels, and bucking bars, the gloves did not reduce the vibration but increased it. CONCLUSIONS This study confirmed that the effectiveness for reducing vibration varied with the gloves and the vibration reduction of each glove depended on tool, vibration direction to the fingers, and finger location. VR gloves, including certified anti-vibration gloves do not provide much vibration reduction when judged based on frequency-weighted acceleration. However, some of the VR gloves can provide more than 10% reduction of the unweighted vibration for some tools or workpieces. Tools and gloves can be matched for

  18. Balanced Scorecard – Strategic Management Tool of Performance in Public Institutions

    Directory of Open Access Journals (Sweden)

    Carmen Cretu

    2015-02-01

    Full Text Available Balanced Scorecard (BSC) is used to achieve an operational strategic vision at all levels of the organization regarding issues related to performance, strategy, communication, resource allocation, decision-making and competitiveness. BSC was created to overcome the limits of traditional financial and management tools and to ensure unity of vision and long-term action in an organization. The main advantage of the method consists in guiding managers and departments, human resources, and technological and financial resources towards the strategy of the organization. Unfortunately, BSC is mainly used in private companies, because high costs and a lack of specialists pose a real obstacle to implementing this instrument in public institutions. Our argument attempts to show that the Balanced Scorecard can be the most appropriate among all the management tools for the public sector.

  19. Modeling and multi-objective optimization of surface roughness and productivity in dry turning of AISI 52100 steel using (TiCN-TiN) coating cermet tools

    OpenAIRE

    Ouahid Keblouti; Lakhdar Boulanouar; Mohamed Walid Azizi; Mohamed Athmane Yallese

    2017-01-01

    The present work concerns an experimental study of turning AISI 52100 bearing steel with cermet tools coated with a TiCN-TiN layer. The main objectives are, first, to study the effect of cutting parameters and coating material on the performance of the cutting tools and, second, to perform a multi-objective optimization for minimizing surface roughness (Ra) and maximizing material removal rate by the desirability approach. A mathematical model was developed based on the Response Surface Met...

  20. RooStats: Statistical Tools for the LHC

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The RooStats project provides statistical tools for the analysis of LHC data, with emphasis on discoveries, confidence intervals, and combined measurements in both the Bayesian and frequentist approaches. The tools are built on top of the RooFit data modeling language and core ROOT mathematics libraries and persistence technology. These tools have been developed in collaboration with the LHC experiments and used by them to produce numerous physics results, such as the combination of ATLAS and CMS Higgs searches that resulted in a model with more than 200 parameters. We will review new developments which have been included in RooStats and the performance optimizations required to cope with such complex models used by the LHC experiments. We will also show the parallelization capability of these statistical tools using multiple processors via PROOF.

  1. ISAC: A tool for aeroservoelastic modeling and analysis

    Science.gov (United States)

    Adams, William M., Jr.; Hoadley, Sherwood Tiffany

    1993-01-01

    The capabilities of the Interaction of Structures, Aerodynamics, and Controls (ISAC) system of program modules are discussed. The major modeling, analysis, and data management components of ISAC are identified. Equations of motion are displayed for a Laplace-domain representation of the unsteady aerodynamic forces. Options for approximating a frequency-domain representation of unsteady aerodynamic forces with rational functions of the Laplace variable are shown. The linear time-invariant state-space equations of motion that result are discussed. Model generation and analyses of stability and dynamic response characteristics are shown for an aeroelastic vehicle, illustrating some of the capabilities of ISAC as a modeling and analysis tool for aeroelastic applications.

  2. Detection of Cutting Tool Wear using Statistical Analysis and Regression Model

    Science.gov (United States)

    Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin

    2010-10-01

    This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used to develop a regression model and a 3D graphical presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a Colchester Master Tornado T4 CNC turning machine under dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, each yielding its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed from this relationship, and its results show that the I-kaz 3D coefficient value decreases as tool wear increases. The result can then be used for real-time tool wear monitoring.
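
    As a schematic stand-in (the published I-kaz 3D formula differs in detail), the idea of condensing tri-axial force signals into a kurtosis-based scalar and regressing flank wear against it can be sketched as follows, with hypothetical numbers:

```python
# Schematic stand-in for the paper's approach; the published I-kaz 3D
# formula differs in detail, and all numbers here are hypothetical.
import numpy as np
from scipy.stats import kurtosis

def ikaz_like_coefficient(fx, fy, fz):
    """Kurtosis-weighted spread of three force channels (illustrative only)."""
    return np.sqrt(sum(kurtosis(a, fisher=False) * np.var(a) ** 2
                       for a in (fx, fy, fz))) / len(fx)

tri = np.random.default_rng(0).normal(size=(3, 2000))   # fake force signals
print("example coefficient:", round(ikaz_like_coefficient(*tri), 4))

# One coefficient per cutting pass, with flank wear VB (mm) measured at the
# same pass; the linear regression then links the two.
coeffs = [0.92, 0.81, 0.74, 0.66, 0.57]
vb     = [0.05, 0.10, 0.15, 0.21, 0.28]

slope, intercept = np.polyfit(coeffs, vb, 1)
print(f"VB = {slope:.3f} * Z + {intercept:.3f}")  # coefficient falls as wear grows
```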

  3. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    Science.gov (United States)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API -- an Internet-based tool combining DHTML and AJAX -- which allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets at the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which it maps in Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into the equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its
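
    The coordinate utility described above comes down to the standard spherical Mercator mapping; a minimal sketch for Mars (spherical approximation, assumed radius):

```python
# Map planetocentric latitude/longitude to Mercator x/y before overlaying
# data on a Mercator-only canvas such as Google Maps. Spherical-Mars sketch.
import math

R_MARS = 3389.5e3  # mean radius in metres (spherical approximation)

def to_mercator(lat_deg, lon_deg, radius=R_MARS):
    lam, phi = math.radians(lon_deg), math.radians(lat_deg)
    x = radius * lam
    y = radius * math.log(math.tan(math.pi / 4 + phi / 2))
    return x, y

print(to_mercator(30.0, 45.0))  # a fault-trace vertex, for example
```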

  4. Modifying the Soil and Water Assessment Tool to simulate cropland carbon flux: Model development and initial evaluation

    International Nuclear Information System (INIS)

    Zhang, Xuesong; Izaurralde, R. César; Arnold, Jeffrey G.; Williams, Jimmy R.; Srinivasan, Raghavan

    2013-01-01

    Climate change is one of the most compelling modern issues and has important implications for almost every aspect of natural and human systems. The Soil and Water Assessment Tool (SWAT) model has been applied worldwide to support sustainable land and water management in a changing climate. However, the inadequacies of the existing carbon algorithm in SWAT limit its application in assessing impacts of human activities on CO_2 emission, one important source of greenhouse gasses (GHGs) that traps heat in the earth system and results in global warming. In this research, we incorporate a revised version of the CENTURY carbon model into SWAT to describe dynamics of soil organic matter (SOM)-residue and simulate land–atmosphere carbon exchange. We test this new SWAT-C model with daily eddy covariance (EC) observations of net ecosystem exchange (NEE) and evapotranspiration (ET) and annual crop yield at six sites across the U.S. Midwest. Results show that SWAT-C simulates multi-year average NEE and ET well across the spatially distributed sites and captures the majority of the temporal variation of these two variables at a daily time scale at each site. Our analyses also reveal that the performance of SWAT-C is influenced by multiple factors, such as crop management practices (irrigated vs. rainfed), completeness and accuracy of input data, crop species, and initialization of state variables. Overall, the new SWAT-C demonstrates favorable performance for simulating land–atmosphere carbon exchange across agricultural sites with different soils, climate, and management practices. SWAT-C is expected to serve as a useful tool for including carbon flux into consideration in sustainable watershed management under a changing climate. We also note that extensive assessment of SWAT-C with field observations is required for further improving the model and understanding potential uncertainties of applying it across large regions with complex landscapes. - Highlights: • Expanding the SWAT

  5. Modifying the Soil and Water Assessment Tool to simulate cropland carbon flux: Model development and initial evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xuesong; Izaurralde, R. César [Joint Global Change Research Institute, Pacific Northwest National Laboratory and University of Maryland, College Park, MD 20740 (United States); Arnold, Jeffrey G. [Grassland, Soil and Water Research Laboratory USDA-ARS, Temple, TX 76502 (United States); Williams, Jimmy R. [Blackland Research and Extension Center, AgriLIFE Research, 720 E. Blackland Road, Temple, TX 76502 (United States); Srinivasan, Raghavan [Spatial Sciences Laboratory in the Department of Ecosystem Science and Management, Texas A and M University, College Stations, TX 77845 (United States)

    2013-10-01

    Climate change is one of the most compelling modern issues and has important implications for almost every aspect of natural and human systems. The Soil and Water Assessment Tool (SWAT) model has been applied worldwide to support sustainable land and water management in a changing climate. However, the inadequacies of the existing carbon algorithm in SWAT limit its application in assessing impacts of human activities on CO{sub 2} emission, one important source of greenhouse gasses (GHGs) that traps heat in the earth system and results in global warming. In this research, we incorporate a revised version of the CENTURY carbon model into SWAT to describe dynamics of soil organic matter (SOM)-residue and simulate land–atmosphere carbon exchange. We test this new SWAT-C model with daily eddy covariance (EC) observations of net ecosystem exchange (NEE) and evapotranspiration (ET) and annual crop yield at six sites across the U.S. Midwest. Results show that SWAT-C simulates multi-year average NEE and ET well across the spatially distributed sites and captures the majority of the temporal variation of these two variables at a daily time scale at each site. Our analyses also reveal that the performance of SWAT-C is influenced by multiple factors, such as crop management practices (irrigated vs. rainfed), completeness and accuracy of input data, crop species, and initialization of state variables. Overall, the new SWAT-C demonstrates favorable performance for simulating land–atmosphere carbon exchange across agricultural sites with different soils, climate, and management practices. SWAT-C is expected to serve as a useful tool for including carbon flux into consideration in sustainable watershed management under a changing climate. We also note that extensive assessment of SWAT-C with field observations is required for further improving the model and understanding potential uncertainties of applying it across large regions with complex landscapes. - Highlights: • Expanding the

  6. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Wirth, Brian D., E-mail: bdwirth@utk.edu [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Nuclear Science and Engineering Directorate, Oak Ridge National Laboratory, Oak Ridge, TN (United States); Hammond, K.D. [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Krasheninnikov, S.I. [University of California, San Diego, La Jolla, CA (United States); Maroudas, D. [University of Massachusetts, Amherst, Amherst, MA 01003 (United States)

    2015-08-15

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma surface interactions are among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including such processes as helium clustering, which serve as nuclei for gas bubbles; and trap mutation, dislocation loop punching and bubble bursting; which together initiate surface morphological modification.

  7. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    International Nuclear Information System (INIS)

    Wirth, Brian D.; Hammond, K.D.; Krasheninnikov, S.I.; Maroudas, D.

    2015-01-01

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma surface interactions are among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including such processes as helium clustering, which serve as nuclei for gas bubbles; and trap mutation, dislocation loop punching and bubble bursting; which together initiate surface morphological modification

  8. Development Of A Parallel Performance Model For The THOR Neutral Particle Transport Code

    Energy Technology Data Exchange (ETDEWEB)

    Yessayan, Raffi; Azmy, Yousry; Schunert, Sebastian

    2017-02-01

    The THOR neutral particle transport code enables simulation of complex geometries for various problems from reactor simulations to nuclear non-proliferation. It is undergoing a thorough V&V requiring computational efficiency. This has motivated various improvements including angular parallelization, outer iteration acceleration, and development of peripheral tools. For guiding future improvements to the code’s efficiency, better characterization of its parallel performance is useful. A parallel performance model (PPM) can be used to evaluate the benefits of modifications and to identify performance bottlenecks. Using INL’s Falcon HPC, the PPM development incorporates an evaluation of network communication behavior over heterogeneous links and a functional characterization of the per-cell/angle/group runtime of each major code component. After evaluating several possible sources of variability, this resulted in a communication model and a parallel portion model. The former’s accuracy is bounded by the variability of communication on Falcon while the latter has an error on the order of 1%.
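
    A parallel performance model of this general kind can be as simple as fitting measured runtimes to a serial + parallel + communication decomposition. The functional form and timings below are illustrative assumptions, not THOR's published model:

```python
# Illustrative runtime decomposition T(N) = serial + parallel/N + comm(N);
# both the functional form and the timings are invented for this sketch.
import numpy as np
from scipy.optimize import curve_fit

def runtime(N, t_serial, t_par, t_comm):
    return t_serial + t_par / N + t_comm * np.log2(N)

procs = np.array([1, 2, 4, 8, 16, 32], dtype=float)
times = np.array([100.0, 52.0, 28.5, 16.8, 11.2, 8.9])  # hypothetical timings (s)

(t_s, t_p, t_c), _ = curve_fit(runtime, procs, times, p0=(1.0, 100.0, 0.1))
print(f"serial={t_s:.2f}s  parallel={t_p:.2f}s  comm={t_c:.3f}s per log2(N)")
print(f"predicted T(64) = {runtime(64.0, t_s, t_p, t_c):.2f}s")
```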

  9. Performance of Process Damping in Machining Titanium Alloys at Low Cutting Speed with Different Helix Tools

    International Nuclear Information System (INIS)

    Shaharun, M A; Yusoff, A R; Reza, M S; Jalal, K A

    2012-01-01

    Titanium is a strong, lustrous, corrosion-resistant transition metal with a silver color, used to produce strong lightweight alloys for industrial processes, automotive applications, medical instruments and other applications. However, titanium is very difficult to machine due to its poor machinability. When machining titanium alloys with conventional tools, the tool wear rate accelerates rapidly, and machining at high cutting speeds is generally difficult to achieve. In order to gain a better understanding of machining titanium alloy, the interaction between the machine structural system and the cutting process, which results in machining instability, is studied. Process damping is a useful phenomenon that can be exploited to improve the limited productivity of low-speed machining. In this study, experiments are performed to evaluate the process damping performance of milling under different tool helix geometries. The results showed that a 42° helix angle significantly increases process damping performance in machining titanium alloy.

  10. Development of a Mobile Application for Building Energy Prediction Using Performance Prediction Model

    Directory of Open Access Journals (Sweden)

    Yu-Ri Kim

    2016-03-01

    Full Text Available Recently, the Korean government has enforced disclosure of building energy performance, so that such information can help owners and prospective buyers to make suitable investment plans. This building energy performance policy makes it mandatory for building owners to obtain engineering audits and thereby evaluate the energy performance levels of their buildings. However, to calculate energy performance levels (i.e., by the asset rating methodology), a qualified expert needs access to at least the full project documentation and/or to conduct an on-site inspection of the buildings. Energy performance certification costs a lot of time and money. Moreover, the database of certified buildings is still quite small. The need is therefore increasing for a simplified and user-friendly energy performance prediction tool for non-specialists. Also, a database which allows building owners and users to compare best practices is required. In this regard, the current study developed a simplified performance prediction model through experimental design, energy simulations and ANOVA (analysis of variance). Furthermore, using the new prediction model, a related mobile application was also developed.

  11. MbT-Tool: An open-access tool based on Thermodynamic Electron Equivalents Model to obtain microbial-metabolic reactions to be used in biotechnological process.

    Science.gov (United States)

    Araujo, Pablo Granda; Gras, Anna; Ginovart, Marta

    2016-01-01

    Modelling cellular metabolism is a strategic factor in investigating microbial behaviour and interactions, especially for biotechnological processes. A key factor for modelling microbial activity is the calculation of nutrient amounts and products generated as a result of the microbial metabolism. Representing metabolic pathways through balanced reactions is a complex and time-consuming task for biologists, ecologists, modellers and engineers. A new computational tool to represent microbial pathways through microbial metabolic reactions (MMRs) using the approach of the Thermodynamic Electron Equivalents Model has been designed and implemented in the open-access framework NetLogo. This computational tool, called MbT-Tool (Metabolism based on Thermodynamics), can write MMRs for different microbial functional groups, such as aerobic heterotrophs, nitrifiers, denitrifiers, methanogens, sulphate reducers, sulphide oxidizers and fermenters. The MbT-Tool's code contains eighteen organic and twenty inorganic reduction half-reactions, four N-sources (NH4+, NO3-, NO2-, N2) for biomass synthesis and twenty-four microbial empirical formulas, one of which can be determined by the user (CnHaObNc). MbT-Tool is an open-source program capable of writing MMRs based on thermodynamic concepts, which are applicable to a wide range of academic research interested in designing, optimizing and modelling microbial activity without extensive chemical, microbiological and programming experience.
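
    The electron-equivalent bookkeeping behind such MMRs (after Rittmann and McCarty) combines an electron-donor half-reaction Rd, an acceptor half-reaction Ra and a cell-synthesis half-reaction Rc as R = fe*Ra + fs*Rc - Rd, with fe + fs = 1. A minimal sketch using standard per-electron-equivalent stoichiometries (the fe value is an example, not MbT-Tool output):

```python
# Combine reduction half-reactions per electron equivalent into an overall
# microbial metabolic reaction. Products positive, reactants negative.
from collections import defaultdict

def combine(fe, Ra, Rc, Rd):
    """Overall reaction R = fe*Ra + fs*Rc - Rd, with fs = 1 - fe."""
    fs = 1.0 - fe
    overall = defaultdict(float)
    for factor, half in ((fe, Ra), (fs, Rc), (-1.0, Rd)):
        for species, stoich in half.items():
            overall[species] += factor * stoich
    return {s: round(v, 4) for s, v in overall.items() if abs(v) > 1e-9}

# Standard half-reactions per electron equivalent:
Ra = {"O2": -0.25, "H+": -1.0, "e-": -1.0, "H2O": 0.5}                  # O2/H2O
Rc = {"CO2": -0.2, "NH4+": -0.05, "HCO3-": -0.05, "H+": -1.0,
      "e-": -1.0, "C5H7O2N": 0.05, "H2O": 0.45}                         # biomass
Rd = {"CO2": -0.125, "HCO3-": -0.125, "H+": -1.0, "e-": -1.0,
      "CH3COO-": 0.125, "H2O": 0.375}                                   # acetate

print(combine(fe=0.6, Ra=Ra, Rc=Rc, Rd=Rd))  # aerobic heterotroph on acetate
```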

  12. Importance and performance evaluation tools for small and medium companies: critical analysis of national versus international literature

    Directory of Open Access Journals (Sweden)

    Sandro César Bortoluzzi

    2015-12-01

    Full Text Available The research aims to map the importance and performance evaluation tools for small and medium companies. This descriptive and qualitative study analyzed 33 national articles and 21 international ones. Regarding the importance of performance evaluation for small and medium companies, the literature highlights: (i) it increases the success of the network; (ii) it is useful for management; (iii) it strengthens competitiveness; (iv) it consolidates cooperation; and (v) it increases trust among partners. Comparing the national and international literature on the importance of performance evaluation for small and medium companies, similar and complementary aspects can be noticed; that is, there is no disagreement between the authors. The authors use tools consolidated in the literature, such as the Balanced Scorecard, Benchmarking and the Performance Prism, as well as tools proposed specifically to evaluate small and medium networks. The main dimensions evaluated are: (i) exchange of information; (ii) value management in networks; (iii) level of network maturity; (iv) benefits of collaboration; (v) social capital; (vi) collective efficiency; (vii) network life cycle; (viii) efficiency and inefficiency of the networks; and (ix) existence and intensity of the relationship between partners. The critical analysis regarding the performance evaluation concept adopted in the present study shows that the tools proposed or implemented to evaluate small and medium business networks have gaps in the process of identifying criteria, measuring ordinally and cardinally, integrating and generating improvement actions.

  13. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bonvini, Marco [Whisker Labs, Oakland, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Page, Janie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lin, Guanjing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hu, R. Lilly [Univ. of California, Berkeley, CA (United States)

    2017-08-11

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
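
    A minimal sketch of the hybrid idea, under assumed numbers: compare metered chiller power against a physics-based expectation from a part-load efficiency curve (coefficients invented here) and flag sustained deviations:

```python
# Hybrid FDD sketch: physics-based expectation vs. metered data.
# Curve coefficients, threshold and measurements are illustrative assumptions.
import numpy as np

def expected_kw(load_frac, rated_kw=350.0):
    """Part-load power from a quadratic efficiency curve (assumed fit)."""
    plr = np.clip(load_frac, 0.1, 1.0)
    return rated_kw * (0.15 + 0.55 * plr + 0.30 * plr**2)

load  = np.array([0.4, 0.5, 0.7, 0.9, 0.8, 0.6])              # part-load ratios
meter = np.array([168.0, 195.0, 262.0, 371.0, 330.0, 243.0])  # metered kW

residual = (meter - expected_kw(load)) / expected_kw(load)
fault = np.abs(residual) > 0.10                               # 10% band -> flag
for t, (r, f) in enumerate(zip(residual, fault)):
    print(f"t={t}: residual={r:+.1%} {'FAULT' if f else 'ok'}")
```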

  14. Effect of Using Extreme Years in Hydrologic Model Calibration Performance

    Science.gov (United States)

    Goktas, R. K.; Tezel, U.; Kargi, P. G.; Ayvaz, T.; Tezyapar, I.; Mesta, B.; Kentel, E.

    2017-12-01

    Hydrological models are useful in predicting system behaviour and in developing management strategies for controlling it. Specifically, they can be used for evaluating streamflow at ungauged catchments, the effects of climate change or best management practices on water resources, or for identifying pollution sources in a watershed. This study is part of a TUBITAK project named "Development of a geographical information system based decision-making tool for water quality management of Ergene Watershed using pollutant fingerprints". Within the scope of this project, water resources in the Ergene Watershed are studied first. Streamgages in the basin are identified and daily streamflow measurements are obtained from the State Hydraulic Works of Turkey. The streamflow data are analysed using box-whisker plots, hydrographs and flow-duration curves, focusing on the identification of extreme periods, dry or wet. A hydrological model is then developed for the Ergene Watershed using HEC-HMS in the Watershed Modeling System (WMS) environment. The model is calibrated for various time periods, including dry and wet ones, and the calibration performance is evaluated using Nash-Sutcliffe Efficiency (NSE), the correlation coefficient, percent bias (PBIAS) and root mean square error. It is observed that the calibration period affects model performance, and the main purpose of the hydrological model should guide the selection of the calibration period. Acknowledgement: This study is funded by The Scientific and Technological Research Council of Turkey (TUBITAK) under Project Number 115Y064.
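
    The calibration metrics named above have standard definitions; a small self-contained sketch with made-up observed and simulated flows:

```python
# Nash-Sutcliffe Efficiency and percent bias, as commonly defined in
# hydrologic model evaluation; the flow values are invented.
import numpy as np

def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

obs = [12.0, 35.0, 80.0, 44.0, 21.0, 15.0]   # observed daily flow (m3/s)
sim = [10.0, 30.0, 92.0, 40.0, 24.0, 13.0]   # simulated flow (m3/s)

print(f"NSE = {nse(obs, sim):.3f}")          # 1 is perfect; <0 worse than the mean
print(f"PBIAS = {pbias(obs, sim):+.1f}%")    # positive -> model underestimates
```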

  15. Theoretical Tools and Software for Modeling, Simulation and Control Design of Rocket Test Facilities

    Science.gov (United States)

    Richter, Hanz

    2004-01-01

    A rocket test stand and associated subsystems are complex devices whose operation requires that certain preparatory calculations be carried out before a test. In addition, real-time control calculations must be performed during the test, and further calculations are carried out after a test is completed. The latter may be required in order to evaluate whether a particular test conformed to specifications. These calculations are used to set valve positions, pressure setpoints, control gains and other operating parameters so that a desired system behavior is obtained and the test can be successfully carried out. Currently, calculations are made in an ad-hoc fashion and rely on trial-and-error procedures that may involve activating the system with the sole purpose of finding the correct parameter settings. The goals of this project are to develop mathematical models, control methodologies and associated simulation environments to provide a systematic and comprehensive prediction and real-time control capability. The models and controller designs are expected to be useful in two respects: 1) As a design tool, a model is the only way to determine the effects of design choices without building a prototype, which is, in the context of rocket test stands, impracticable; 2) As a prediction and tuning tool, a good model allows system parameters to be set off-line, so that the expected system response conforms to specifications. This includes the setting of physical parameters, such as valve positions, and the configuration and tuning of any feedback controllers in the loop.

  16. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. The model takes the five requirement dimensions and three characteristic dimensions from the SERVQUAL method, and the application methodology from the QFD method. The originality of the SQ model consists in computing a global index that reflects the degree to which the quality characteristics fulfil the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied to the evaluation of a health care services provider.
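
    A minimal sketch of the kind of global index the SQ model computes, assuming a QFD-style relationship matrix between requirement dimensions and quality characteristics and SERVQUAL-style fulfilment scores; all numbers are hypothetical:

```python
# Weighted global quality index: requirement weights are propagated through
# a QFD-style relationship matrix onto the quality characteristics, which
# are then scored. All values below are invented placeholders.
import numpy as np

# rows: 5 requirement dimensions, cols: 3 quality characteristics
relationship = np.array([[9, 3, 1],
                         [3, 9, 3],
                         [1, 3, 9],
                         [3, 1, 3],
                         [9, 1, 1]], dtype=float)
req_weights = np.array([0.30, 0.25, 0.20, 0.15, 0.10])  # requirement importance
char_scores = np.array([0.8, 0.6, 0.7])                 # measured fulfilment, 0..1

char_weights = req_weights @ relationship
char_weights /= char_weights.sum()
global_index = float(char_weights @ char_scores)
print(f"global quality index = {global_index:.2f}")     # 1.0 = requirements fully met
```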

  17. The importance of shared mental models and shared situation awareness for transforming robots from tools to teammates

    Science.gov (United States)

    Ososky, Scott; Schuster, David; Jentsch, Florian; Fiore, Stephen; Shumaker, Randall; Lebiere, Christian; Kurup, Unmesh; Oh, Jean; Stentz, Anthony

    2012-06-01

    Current ground robots are largely employed via tele-operation and provide their operators with useful tools to extend reach, improve sensing, and avoid dangers. To move from robots that are useful as tools to truly synergistic human-robot teaming, however, will require not only greater technical capabilities among robots, but also a better understanding of the ways in which the principles of teamwork can be applied from exclusively human teams to mixed teams of humans and robots. In this respect, a core characteristic that enables successful human teams to coordinate shared tasks is their ability to create, maintain, and act on a shared understanding of the world and the roles of the team and its members in it. The team performance literature clearly points towards two important cornerstones for shared understanding of team members: mental models and situation awareness. These constructs have been investigated as products of teams as well; amongst teams, they are shared mental models and shared situation awareness. Consequently, we are studying how these two constructs can be measured and instantiated in human-robot teams. In this paper, we report results from three related efforts that are investigating process and performance outcomes for human robot teams. Our investigations include: (a) how human mental models of tasks and teams change whether a teammate is human, a service animal, or an advanced automated system; (b) how computer modeling can lead to mental models being instantiated and used in robots; (c) how we can simulate the interactions between human and future robotic teammates on the basis of changes in shared mental models and situation assessment.

  18. Development and application of modeling tools for sodium fast reactor inspection

    Energy Technology Data Exchange (ETDEWEB)

    Le Bourdais, Florian; Marchand, Benoît; Baronian, Vahan [CEA LIST, Centre de Saclay F-91191 Gif-sur-Yvette (France)

    2014-02-18

    To support the development of in-service inspection methods for the Advanced Sodium Technological Reactor for Industrial Demonstration (ASTRID) project led by the French Atomic Energy Commission (CEA), several tools that allow situations specific to sodium-cooled fast reactors (SFRs) to be modeled have been implemented in the CIVA software and exploited. This paper details specific applications and the results obtained. For instance, a new specular reflection model allows the calculation of complex echoes from scattering structures inside the reactor vessel. EMAT transducer simulation models have been implemented to develop new transducers for sodium visualization and imaging. Guided wave analysis tools have been developed to permit defect detection in the vessel shell. Application examples and comparisons with experimental data are presented.

  19. Update on Small Modular Reactors Dynamics System Modeling Tool -- Molten Salt Cooled Architecture

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Richard Edward [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cetiner, Sacit M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Fugate, David L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Qualls, A L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Borum, Robert C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Chaleff, Ethan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rogerson, Doug W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Batteh, John J. [Modelon Corporation (Sweden); Tiller, Michael M. [Xogeny Corporation, Canton, MI (United States)

    2014-08-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the third year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled) concepts, including the use of multiple coupled reactors at a single site. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor SMR models, ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface (ICHMI) technical area, and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the Modular Dynamic SIMulation (MoDSIM) tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the program, (2) developing a library of baseline component modules that can be assembled into full plant models using existing geometry and thermal-hydraulic data, (3) defining modeling conventions for interconnecting component models, and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  20. Formal Development of a Tool for Automated Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth; Kjær, Andreas A.; Le Bliguet, Marie

    2011-01-01

    This paper describes a tool for formal modelling of relay interlocking systems and explains how it has been stepwise, formally developed using the RAISE method. The developed tool takes the circuit diagrams of a relay interlocking system as input and gives as result a state transition system modelling...

  1. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    Science.gov (United States)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Currently available, and soon-to-be available, sophisticated 3D models of particle acceleration and transport in solar flares require a new level of user-friendly visualization and analysis tools allowing quick and easy adjustment of the model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc.). We report the current state of the art of these tools in development, already proved to be highly efficient for direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating in user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/non thermal particle distribution models. By default, the application integrates IDL callable DLL and Shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tool's capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by the NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate

  2. Greenhouse gases from wastewater treatment — A review of modelling tools

    International Nuclear Information System (INIS)

    Mannina, Giorgio; Ekama, George; Caniani, Donatella; Cosenza, Alida; Esposito, Giovanni; Gori, Riccardo; Garrido-Baserba, Manel; Rosso, Diego; Olsson, Gustaf

    2016-01-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state-of-the-art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge on the processes related to N_2O formation, especially due to autotrophic biomass, is still incomplete. The literature review also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a wide vision of the WWTPs has to be considered in order to make them as sustainable as possible. Mechanistic dynamic models were demonstrated as the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs due to the huge difficulties related to data availability and the model complexity. For further improvement in GHG plant-wide modelling and to favour its use at large real scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition must be enhanced. - Highlights: • The state of the art in GHG production/emission/modelling from WWTPs was outlined. • Detailed mechanisms of N_2O production by AOB are still not completely known. • N_2O modelling could be improved considering both AOB pathways contribution. • To improve protocols the regulatory framework among countries has to be equalized. • Plant-wide modelling can help modeller/engineer/operator to reduce GHG emissions.

  3. Greenhouse gases from wastewater treatment — A review of modelling tools

    Energy Technology Data Exchange (ETDEWEB)

    Mannina, Giorgio, E-mail: giorgio.mannina@unipa.it [Dipartimento di Ingegneria Civile, Ambientale, Aerospaziale, dei Materiali, Università di Palermo, Viale delle Scienze, 90100 Palermo (Italy); Ekama, George [Water Research Group, Department of Civil Engineering, University of Cape Town, Rondebosch, 7700 Cape (South Africa); Caniani, Donatella [Department of Engineering and Physics of the Environment, University of Basilicata, viale dell' Ateneo Lucano 10, 85100 Potenza (Italy); Cosenza, Alida [Dipartimento di Ingegneria Civile, Ambientale, Aerospaziale, dei Materiali, Università di Palermo, Viale delle Scienze, 90100 Palermo (Italy); Esposito, Giovanni [Department of Civil and Mechanical Engineering, University of Cassino and the Southern Lazio, Via Di Biasio, 43, 03043 Cassino, FR (Italy); Gori, Riccardo [Department of Civil and Environmental Engineering, University of Florence, Via Santa Marta 3, 50139 Florence (Italy); Garrido-Baserba, Manel [Department of Civil & Environmental Engineering, University of California, Irvine, CA 92697-2175 (United States); Rosso, Diego [Department of Civil & Environmental Engineering, University of California, Irvine, CA 92697-2175 (United States); Water-Energy Nexus Center, University of California, Irvine, CA 92697-2175 (United States); Olsson, Gustaf [Department of Industrial Electrical Engineering and Automation (IEA), Lund University, Box 118, SE-22100 Lund (Sweden)

    2016-05-01

    Nitrous oxide, carbon dioxide and methane are greenhouse gases (GHG) emitted from wastewater treatment that contribute to its carbon footprint. As a result of the increasing awareness of GHG emissions from wastewater treatment plants (WWTPs), new modelling, design, and operational tools have been developed to address and reduce GHG emissions at the plant-wide scale and beyond. This paper reviews the state-of-the-art and the recently developed tools used to understand and manage GHG emissions from WWTPs, and discusses open problems and research gaps. The literature review reveals that knowledge on the processes related to N{sub 2}O formation, especially due to autotrophic biomass, is still incomplete. The literature review also shows that a plant-wide modelling approach that includes GHG is the best option for understanding how to reduce the carbon footprint of WWTPs. Indeed, several studies have confirmed that a wide vision of the WWTPs has to be considered in order to make them as sustainable as possible. Mechanistic dynamic models were demonstrated as the most comprehensive and reliable tools for GHG assessment. Very few plant-wide GHG modelling studies have been applied to real WWTPs due to the huge difficulties related to data availability and the model complexity. For further improvement in GHG plant-wide modelling and to favour its use at large real scale, knowledge of the mechanisms involved in GHG formation and release, and data acquisition must be enhanced. - Highlights: • The state of the art in GHG production/emission/modelling from WWTPs was outlined. • Detailed mechanisms of N{sub 2}O production by AOB are still not completely known. • N{sub 2}O modelling could be improved considering both AOB pathways contribution. • To improve protocols the regulatory framework among countries has to be equalized. • Plant-wide modelling can help modeller/engineer/operator to reduce GHG emissions.

  4. Pandora - a simulation tool for safety assessments. Technical description and user's guide

    International Nuclear Information System (INIS)

    Ekstroem, Per-Anders

    2010-12-01

    This report documents a flexible simulation tool, Pandora, used in several post-closure safety assessments in both Sweden and Finland to assess the radiological dose to man due to releases from radioactive waste repositories. Pandora allows the user to build compartment models to represent the migration and fate of radionuclides in the environment. The tool simplifies the implementation and simulation of radioecological biosphere models in which there exist a large set of radionuclides and input variables. Being based on the well-known technical computing software MATLAB, and especially its interactive graphical environment Simulink, Pandora receives many benefits. MATLAB/Simulink is a highly flexible tool used for simulations of practically any type of dynamic system; it is widely used, continuously maintained, and often upgraded. By basing the tool on this commercial software package, we gain both the graphical interface provided by Simulink and the ability to access the advanced numerical equation-solving routines in MATLAB. Since these numerical methods are well established and quality assured in their MATLAB implementation, the solution methods used in Pandora can be considered to have a high level of quality assurance. The structure of Pandora provides clarity in the model format, which means the model assists its own documentation, since it can be understood by inspecting its structure. With the introduction of the external tool Pandas (Pandora assessment tool), version handling and an integrated way of performing the entire calculation chain have been added. Probabilistic assessments can now be performed within the tool instead of depending on other commercial statistical software such as @Risk.
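
    Mathematically, a compartment model of this kind reduces to a linear ODE system dC/dt = K*C for the radionuclide inventories, which any stiff solver can integrate; MATLAB's solvers play that role for Pandora. A sketch with invented compartments and transfer rates:

```python
# Three-compartment biosphere sketch (soil, water, sediment) with first-order
# transfers and radioactive decay. All rate constants are invented.
import numpy as np
from scipy.integrate import solve_ivp

lam = 1e-3                               # decay constant (1/yr), hypothetical
k_sw, k_ws, k_wsed = 0.05, 0.01, 0.02    # transfer rates (1/yr), assumed

# Column j of K describes losses from and transfers out of compartment j;
# each column sums to -lam, so mass leaves only through decay.
K = np.array([[-(k_sw + lam),  k_ws,                   0.0],
              [  k_sw,        -(k_ws + k_wsed + lam),  0.0],
              [  0.0,          k_wsed,                -lam]])

sol = solve_ivp(lambda t, c: K @ c, (0.0, 200.0), y0=[1.0, 0.0, 0.0],
                method="LSODA", t_eval=np.linspace(0.0, 200.0, 5))
for t, c in zip(sol.t, sol.y.T):
    print(f"t={t:5.1f} yr  soil={c[0]:.3f}  water={c[1]:.3f}  sediment={c[2]:.3f}")
```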

  5. Using Collaborative Simulation Modeling to Develop a Web-Based Tool to Support Policy-Level Decision Making About Breast Cancer Screening Initiation Age

    Directory of Open Access Journals (Sweden)

    Elizabeth S. Burnside MD, MPH, MS

    2017-07-01

    Full Text Available Background: There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. Objective: To use three established simulation models to develop a web-based tool called Mammo OUTPuT. Methods: The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. Results: The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. Conclusions: This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms.

  6. High-Level Performance Modeling of SAR Systems

    Science.gov (United States)

    Chen, Curtis

    2006-01-01

    SAUSAGE (Still Another Utility for SAR Analysis that's General and Extensible) is a computer program for modeling the performance of synthetic-aperture radar (SAR) or interferometric synthetic-aperture radar (InSAR or IFSAR) systems. The user is assumed to be familiar with the basic principles of SAR imaging and interferometry. Given design parameters (e.g., altitude, power, and bandwidth) that characterize a radar system, the software predicts various performance metrics (e.g., signal-to-noise ratio and resolution). SAUSAGE is intended to be a general software tool for quick, high-level evaluation of radar designs; it is not meant to capture all the subtleties, nuances, and particulars of specific systems. SAUSAGE was written to facilitate the exploration of engineering tradeoffs within the multidimensional space of design parameters. Typically, this space is examined through an iterative process of adjusting the values of the design parameters and examining the effects of the adjustments on the overall performance of the system at each iteration. The software is designed to be modular and extensible to enable consideration of a variety of operating modes and antenna beam patterns, including, for example, strip-map and spotlight SAR acquisitions, polarimetry, burst modes, and squinted geometries.
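
    For a flavor of the kind of high-level estimate such a tool produces, the sketch below evaluates single-pulse thermal SNR from the standard monostatic radar equation. All parameter values are invented, and SAUSAGE's own models are far more detailed (real SAR processing also adds large coherent-integration gains on top of the single-pulse figure):

        # Toy high-level radar performance estimate: thermal-noise SNR from
        # the monostatic radar equation. Parameter values are made up.
        import math

        k_B = 1.380649e-23          # Boltzmann constant (J/K)

        def radar_snr(p_tx, gain, wavelength, sigma, r, bandwidth,
                      t_sys=290.0, losses=1.0):
            """Single-pulse SNR for a point target of cross-section sigma (m^2)."""
            num = p_tx * gain**2 * wavelength**2 * sigma
            den = (4 * math.pi)**3 * r**4 * k_B * t_sys * bandwidth * losses
            return num / den

        snr = radar_snr(p_tx=3e3, gain=10**(35/10), wavelength=0.24,
                        sigma=1.0, r=700e3, bandwidth=40e6)
        print(f"Single-pulse SNR: {10*math.log10(snr):.1f} dB")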

  7. Models as Tools of Analysis of a Network Organisation

    Directory of Open Access Journals (Sweden)

    Wojciech Pająk

    2013-06-01

    The paper presents models that may be applied as tools for the analysis of a network organisation. The starting point of the discussion is a definition of the terms supply chain and network organisation. Subsequent parts of the paper present the basic assumptions underlying the analysis of a network organisation. The study then characterises the best-known models utilised in the analysis of a network organisation. The purpose of the article is to define the notion and essence of network organisations and to present the models used for their analysis.

  8. Verification and Performance Analysis for Embedded Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2009-01-01

    This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing, and performance analysis of embedded and real-time systems.

  9. Influence of Workpiece Material on Tool Wear Performance and Tribofilm Formation in Machining Hardened Steel

    Directory of Open Access Journals (Sweden)

    Junfeng Yuan

    2016-04-01

    In addition to the bulk properties of a workpiece material, characteristics of the tribofilms formed as a result of workpiece material mass transfer to the friction surface play a significant role in friction control. This is especially true in cutting of hardened materials, where it is very difficult to use liquid-based lubricants. To better understand wear performance and the formation of beneficial tribofilms, this study presents an assessment of uncoated mixed alumina ceramic tools (Al2O3+TiC) in the turning of two grades of steel, AISI T1 and AISI D2. Both workpiece materials were hardened to 59 HRC and then machined under identical cutting conditions. Comprehensive characterization of the resulting wear patterns and the tribofilms formed at the tool/workpiece interface was made using X-ray Photoelectron Spectroscopy and Scanning Electron Microscopy. Metallographic studies of the workpiece materials were performed before machining, and the surface integrity of the machined part was investigated after machining. Tool life was 23% higher when turning D2 than T1. This improvement in cutting tool life and wear behaviour was attributed to differences in: (1) tribofilm generation on the friction surface and (2) the amount and distribution of carbide phases in the workpiece materials. The results show that wear performance depends both on the properties of the workpiece material and on the characteristics of the tribofilms formed on the friction surface.

  10. An original tool for checking energy performance and certification of buildings by means of Artificial Neural Networks

    International Nuclear Information System (INIS)

    Buratti, C.; Barbanera, M.; Palladino, D.

    2014-01-01

    Highlights: • ANN used as a tool for evaluating the energy performance of buildings. • Training, validation, and testing of the Neural Network with real energy certificate data. • The global energy performance index was chosen as the target of the ANN. • A good correlation and a minimal error were found with certificate data. • A new energy index was defined in order to check the energy certificates. - Abstract: The Energy Performance of Buildings Directive (EPBD) was issued to provide a common strategy for all European countries and to implement several actions for improving the energy efficiency of buildings, which are responsible for 40% of energy consumption. Energy Performance Certificates are provided as a tool to evaluate the energy performance of buildings; however, costly and time-consuming controls are necessary to verify the accuracy of the declared data. Artificial Neural Networks (ANNs) could be useful tools here: they make it possible to estimate energy consumption from specific parameters, to evaluate the accuracy of the data in the energy certificates, and to identify the certificates needing accurate control. In this study, an Artificial Neural Network was developed based on approximately 6500 energy certificates (2700 of which are self-declarations) received by the Umbria Region (central Italy), in order to evaluate the global energy consumption of buildings from several specific parameters reported in the certificates. Data were checked for compliance with energy standards and only the correct certificates were used to train the Neural Network. The implemented Neural Network was tested with database data and a good correlation was found; in particular, the energy performance calculated with the Neural Network presents an error greater than 15 kWh/m²·year with respect to the real value of the global energy performance index in only 3.6% of cases. Finally, a Neural Energy Performance Index (N.E.P.I.) was defined, in order to verify the accuracy of the energy certificates; the
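
    A minimal sketch of the approach, using scikit-learn in place of the authors' network and synthetic stand-ins for the certificate parameters (the features below — floor area, surface-to-volume ratio, mean U-value, system efficiency — and the target relation are assumed for illustration):

        # Train a small neural network to predict the global energy
        # performance index (kWh/m2*year) from certificate-like features.
        # All data here are synthetic stand-ins, not the Umbria certificates.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        n = 6500
        X = np.column_stack([
            rng.uniform(50, 300, n),      # heated floor area (m2)
            rng.uniform(0.3, 1.2, n),     # surface-to-volume ratio
            rng.uniform(0.8, 3.5, n),     # mean envelope U-value (W/m2K)
            rng.uniform(0.6, 1.0, n),     # heating system efficiency
        ])
        # Synthetic "true" index: rises with U-value and S/V, falls with efficiency.
        y = 40 + 60*X[:, 2] + 30*X[:, 1] - 25*(X[:, 3] - 0.6) + rng.normal(0, 5, n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        scaler = StandardScaler().fit(X_tr)
        ann = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        ann.fit(scaler.transform(X_tr), y_tr)
        err = np.abs(ann.predict(scaler.transform(X_te)) - y_te)
        print(f"share of test certificates with error > 15: {(err > 15).mean():.1%}")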

  11. Predicting the Abrasion Resistance of Tool Steels by Means of Neurofuzzy Model

    Directory of Open Access Journals (Sweden)

    Dragutin Lisjak

    2013-07-01

    This work considers the use of neuro-fuzzy set theory to estimate the abrasion wear resistance of steels based on chemical composition, heat treatment (austenitising temperature, quenchant, and tempering temperature), hardness after hardening and tempering at different temperatures, and volume loss of materials according to ASTM G 65-94. Volume-loss test data for the following groups of materials were taken as the fuzzy data set: carbon tool steels, cold-work tool steels, hot-work tool steels, and high-speed steels. The modelled adaptive neuro-fuzzy inference system (ANFIS) is compared to a statistical model of multivariable non-linear regression (MNLR). From the results it can be concluded that abrasion wear resistance can be well estimated for steels whose volume loss is unknown, thus eliminating unnecessary testing.

  12. Micromechanical Models of Mechanical Response of High Performance Fibre Reinforced Cement Composites

    DEFF Research Database (Denmark)

    Li, V. C.; Mihashi, H.; Alwan, J.

    1996-01-01

    The state-of-the-art in micromechanical modeling of the mechanical response of HPFRCC is reviewed. Many advances in modeling have been made over the last decade, to the point that certain properties of composites can be carefully designed using the models as analytic tools. As a result, a new generation of FRC with high performance and economic viability is in sight. However, utilization of micromechanical models for a more comprehensive set of important HPFRCC properties awaits further investigation into the fundamental mechanisms governing composite properties, as well as integrative efforts across responses to different load types. Further, micromechanical models for HPFRCC behavior under complex loading histories, including fracture, fatigue and multiaxial loading, are urgently needed in order to optimize HPFRCC microstructures and enable predictions of such materials in structures...

  13. Developing a fuzzy ANP model for performance appraisal based on firm strategy

    Directory of Open Access Journals (Sweden)

    Seid Mohammad Reza Mirahmadi

    2018-10-01

    The purpose of this study is to develop a fuzzy Analytic Network Process (ANP) model that is able to evaluate employee performance under different strategies. A team of experts in the field of strategic human resource management and the senior management of an organization engaged in steel production were involved in the study. The data collection tool was a questionnaire designed based on the criteria of the organization's performance appraisal system. The results showed that in the cost leadership strategy, compliance with the work hierarchy, quantity of work, and the ability to make important decisions carried the highest coefficients, while in the focus strategy, participation in group work, supervisory and administrative ability, and decision-making ability had the highest importance. In the differentiation strategy, innovation and creativity, quality, and offering constructive suggestions received higher ratings than other criteria. Finally, the developed model was used to evaluate the performance of a sample employee.

  14. MbT-Tool: An open-access tool based on Thermodynamic Electron Equivalents Model to obtain microbial-metabolic reactions to be used in biotechnological process

    Directory of Open Access Journals (Sweden)

    Pablo Araujo Granda

    2016-01-01

    Modelling cellular metabolism is a strategic factor in investigating microbial behaviour and interactions, especially for biotechnological processes. A key factor for modelling microbial activity is the calculation of nutrient amounts and products generated as a result of microbial metabolism. Representing metabolic pathways through balanced reactions is a complex and time-consuming task for biologists, ecologists, modellers and engineers. A new computational tool to represent microbial pathways through microbial metabolic reactions (MMRs), using the approach of the Thermodynamic Electron Equivalents Model, has been designed and implemented in the open-access framework NetLogo. This computational tool, called MbT-Tool (Metabolism based on Thermodynamics), can write MMRs for different microbial functional groups, such as aerobic heterotrophs, nitrifiers, denitrifiers, methanogens, sulphate reducers, sulphide oxidizers and fermenters. The MbT-Tool's code contains eighteen organic and twenty inorganic reduction half-reactions, four N-sources (NH4+, NO3−, NO2−, N2) for biomass synthesis and twenty-four microbial empirical formulas, one of which can be defined by the user (CnHaObNc). MbT-Tool is an open-source program capable of writing MMRs based on thermodynamic concepts, applicable to a wide range of academic research interested in designing, optimizing and modelling microbial activity without extensive chemical, microbiological or programming experience.
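
    The core of the electron-equivalents approach is linear: an overall reaction R = Rd + fe·Ra + fs·Rc, with fe + fs = 1, combines donor, acceptor and cell-synthesis half-reactions written per electron equivalent. The sketch below uses standard textbook half-reactions (acetate donor, O2 acceptor, biomass synthesis with NH4+ as N-source) and an assumed energy fraction fe = 0.6; MbT-Tool derives fe thermodynamically rather than taking it as input:

        # Combine half-reactions per electron equivalent into one overall
        # microbial metabolic reaction. fe = 0.6 is an assumed value.
        from collections import defaultdict

        # Species coefficients: products positive, reactants negative.
        Rd = {"CH3COO-": -1/8, "H2O": -3/8, "CO2": 1/8, "HCO3-": 1/8,
              "H+": 1, "e-": 1}                                   # acetate oxidation
        Ra = {"O2": -1/4, "H+": -1, "e-": -1, "H2O": 1/2}         # O2 reduction
        Rc = {"CO2": -1/5, "HCO3-": -1/20, "NH4+": -1/20, "H+": -1, "e-": -1,
              "C5H7O2N": 1/20, "H2O": 9/20}                       # biomass synthesis

        def combine(fe):
            fs = 1 - fe
            overall = defaultdict(float)
            for rxn, w in ((Rd, 1.0), (Ra, fe), (Rc, fs)):
                for species, coeff in rxn.items():
                    overall[species] += w * coeff
            return {s: round(c, 4) for s, c in overall.items() if abs(c) > 1e-12}

        # The e- and H+ terms cancel; negative = consumed, positive = produced.
        print(combine(fe=0.6))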

  15. Linear regression metamodeling as a tool to summarize and present simulation model results.

    Science.gov (United States)

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
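
    A minimal sketch of the idea with a stand-in model (the net-benefit function and input distributions below are invented, not the authors' cancer cure model): the intercept of the fitted metamodel approximates the base-case outcome, and the coefficients on standardized inputs rank parameter influence:

        # Linear regression metamodel over probabilistic sensitivity
        # analysis (PSA) output. The simulated "model" is a toy function.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000
        p_cure = rng.beta(20, 80, n)              # uncertain cure probability
        cost_tx = rng.normal(30_000, 3_000, n)    # uncertain treatment cost ($)
        qaly_gain = rng.normal(2.0, 0.3, n)       # QALYs gained if cured

        # Net monetary benefit at $50k/QALY -- the simulated model outcome.
        nmb = 50_000 * p_cure * qaly_gain - cost_tx

        # Standardize inputs and fit ordinary least squares.
        X = np.column_stack([p_cure, cost_tx, qaly_gain])
        Xs = (X - X.mean(0)) / X.std(0)
        A = np.column_stack([np.ones(n), Xs])
        beta, *_ = np.linalg.lstsq(A, nmb, rcond=None)

        print("estimated base-case NMB:", round(beta[0]))
        for name, b in zip(["p_cure", "cost_tx", "qaly_gain"], beta[1:]):
            print(f"{name}: change in NMB per 1 SD = {b:+.0f}")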

  16. System capacity and economic modeling computer tool for satellite mobile communications systems

    Science.gov (United States)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.

  17. Collaboro: a collaborative (meta) modeling tool

    Directory of Open Access Journals (Sweden)

    Javier Luis Cánovas Izquierdo

    2016-10-01

    Software development is becoming more and more collaborative, emphasizing the role of end-users in the development process to make sure the final product will satisfy customer needs. This is especially relevant when developing Domain-Specific Modeling Languages (DSMLs), which are modeling languages specifically designed to carry out the tasks of a particular domain. While end-users are actually the experts of the domain for which a DSML is developed, their participation in the DSML specification process is still rather limited nowadays. In this paper, we propose a more community-aware language development process by enabling the active participation of all community members (both developers and end-users) from the very beginning. Our proposal, called Collaboro, is based on a DSML itself, enabling the representation of change proposals during the language design and the discussion (and trace-back) of possible solutions, comments and decisions arising during the collaboration. Collaboro also incorporates a metric-based recommender system to help community members define high-quality notations for the DSMLs. We also show how Collaboro can be used at the model level to facilitate the collaborative specification of software models. Tool support is available both as an Eclipse plug-in and as a web-based solution.

  18. Thermodynamic simulation model for predicting the performance of spark ignition engines using biogas as fuel

    International Nuclear Information System (INIS)

    Nunes de Faria, Mário M.; Vargas Machuca Bueno, Juan P.; Ayad, Sami M.M. Elmassalami; Belchior, Carlos R. Pereira

    2017-01-01

    Highlights: • A 0-D model for performance prediction of SI ICEs fueled with biogas is proposed. • The relative difference between simulated and experimental values was under 5%. • The model can be adapted to different biogas compositions and operating ranges. • It could be a valuable tool for predicting trends and guiding experimentation. • It is suitable for use with biogas supplies in developing regions. - Abstract: Biogas has found its way from developing countries and is now an alternative to fossil fuels in internal combustion engines, with the advantage of lower greenhouse gas emissions. However, its use in gas engines requires engine modifications or adaptations that may be costly. This paper reports the results of experimental performance and emissions tests of an engine-generator unit fueled with biogas produced in a sewage plant in Brazil, operating under different loads, and with suitable engine modifications. The emissions and performance results were in agreement with the literature, and it was confirmed that the penalties to engine performance were more significant than the emission reductions in the operating range tested. Furthermore, a zero-dimensional simulation model was employed to predict performance characteristics: a system of differential thermodynamic equations was solved, obtaining the pressure inside the cylinder as a function of the crank angle for different engine conditions. Mean effective pressure and indicated power were also obtained. The results of simulation and experimental tests of the engine under similar conditions were compared and the model was validated. Although several simplifying assumptions were adopted and empirical correlations were used for the Wiebe function, the model was adequate in predicting engine performance, as the relative difference between simulated and experimental values was lower than 5%. The model can be adapted for use with different raw or enriched biogas compositions and could prove to be a valuable tool to guide
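
    The Wiebe function at the heart of such zero-dimensional models gives the mass fraction burned as a function of crank angle. The sketch below uses the commonly quoted default parameters a = 5 and m = 2 and an assumed burn duration, not values fitted by the authors:

        # Wiebe combustion model: mass fraction burned vs. crank angle.
        # x_b = 1 - exp(-a * ((theta - theta0)/duration)^(m+1))
        import numpy as np

        def wiebe_mfb(theta, theta_start=-10.0, duration=60.0, a=5.0, m=2.0):
            """Mass fraction burned at crank angle theta (degrees)."""
            x = np.clip((theta - theta_start) / duration, 0.0, None)
            return 1.0 - np.exp(-a * x**(m + 1))

        theta = np.linspace(-30, 90, 7)
        print(np.round(wiebe_mfb(theta), 3))  # rises from 0 toward 1 over the burn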

  19. Final Report: Simulation Tools for Parallel Microwave Particle in Cell Modeling

    International Nuclear Information System (INIS)

    Stoltz, Peter H.

    2008-01-01

    Transport of high-power rf fields and the subsequent deposition of rf power into plasma is an important component of developing tokamak fusion energy. Two challenges limit rf heating: (i) breakdown of the metallic structures used to deliver rf power to the plasma, and (ii) developing a detailed understanding of how rf power couples into a plasma. Computer simulation is a main tool for helping solve both of these problems, but one of the premier tools, VORPAL, has traditionally been too difficult for non-experts to use. During this Phase II project, we developed the VorpalView user interface tool. This tool gives Department of Energy researchers a fully graphical interface for analyzing VORPAL output, making it easier to model rf power delivery and deposition in plasmas.

  20. A review of performance measurement’s maturity models

    Directory of Open Access Journals (Sweden)

    María Paula Bertolli

    2017-01-01

    Introduction: In a context as dynamic as today's, SMEs need performance measurement systems (PMS) that are able to generate useful, relevant and reliable information for management. Measuring the maturity of a PMS is an essential step in its evolution toward an ideal state that allows better control of results, improving management and decision making. Objective: To develop a bibliographic review to identify and characterize PMS maturity models, recognizing among them the models most feasible to apply in SMEs, in order to contribute to the strengthening of such systems and facilitate effective and timely decision making in organizations. Methodology: The research question defined is: which existing PMS maturity models can be used by industrial SMEs? The Google Scholar database was consulted using defined search parameters. Based on previously defined criteria, the selected models are compared and conclusions about them are drawn. Results: From the results of the bibliographic search in Google Scholar, different criteria were used to select the models to be characterized and compared. The four models selected were those proposed by Wettstein and Kueng, Van Aken, Tangen and Aho. Conclusions: The models considered most adequate are those proposed by Wettstein and Kueng (2002) and Aho (2012), due to their ease of application and low resource requirements. However, as these models do not include an evaluation tool, one has to be defined by the company.

  1. Performance Estimation of Networked Business Models: Case Study on a Finnish eHealth Service Project

    Directory of Open Access Journals (Sweden)

    Marikka Heikkilä

    2014-08-01

    Purpose: The objective of this paper is to propose and demonstrate a framework for estimating performance in a networked business model. Design/methodology/approach: Our approach is design science, utilising action research to study a case of four independent firms in the Health & Wellbeing sector aiming to jointly provide a new service for business and private customers. The duration of the research study is 3 years. Findings: We propose that a balanced set of performance indicators can be defined by paying attention to all main components of the business model, enriched with measures of network collaboration. The results highlight the importance of measuring all main components of the business model and also the business network partners' views on trust, contracts and fairness. Research implications: This article contributes to the business model literature by combining business modelling with performance evaluation. The article points out that it is essential to create metrics that can be applied to evaluate and improve business model blueprints, but it is also important to measure the collaboration aspects of the business network. Practical implications: Companies have already adopted the Business Model Canvas or similar business model tools to innovate new business models. We suggest that companies continue their business model innovation work by agreeing on a set of performance metrics, building on the business model components enriched with measures of network collaboration. Originality/value: This article contributes to the business model literature and praxis by combining business modelling with performance evaluation.

  2. Achieving informed decision-making for net zero energy buildings design using building performance simulation tools

    NARCIS (Netherlands)

    Attia, S.G.; Gratia, E.; De Herde, A.; Hensen, J.L.M.

    2013-01-01

    Building performance simulation (BPS) is the basis for informed decision-making of Net Zero Energy Buildings (NZEBs) design. This paper aims to investigate the use of building performance simulation tools as a method of informing the design decision of NZEBs. The aim of this study is to evaluate the

  3. Model predictive control as a tool for improving the process operation of MSW combustion plants

    International Nuclear Information System (INIS)

    Leskens, M.; Kessel, L.B.M. van; Bosgra, O.H.

    2005-01-01

    In this paper a feasibility study is presented on the application of the advanced control strategy called model predictive control (MPC) as a tool for obtaining improved process operation performance for municipal solid waste (MSW) combustion plants. The paper starts with a discussion of the operational objectives and control of such plants, from which a motivation follows for applying MPC to them. This is followed by a discussion on the basic idea behind this advanced control strategy. After that, an MPC-based combustion control system is proposed aimed at tackling a typical MSW combustion control problem and, using this proposed control system, an assessment is made of the improvement in performance that an MPC-based MSW combustion control system can provide in comparison to conventional MSW combustion control systems. This assessment is based on simulations using an experimentally obtained process and disturbance model of a real-life large-scale MSW combustion plant
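
    A toy receding-horizon loop conveys the basic MPC idea: at each sample, predict over a horizon with the process model, optimize the future input sequence, apply only the first move, and repeat. The first-order plant, horizon, weight and setpoint below are invented for illustration; an industrial MSW combustion MPC rests on an identified multivariable process and disturbance model such as the one used in the paper:

        # Receding-horizon control of an assumed first-order plant:
        # y[k+1] = a*y[k] + b*u[k] (e.g., steam production vs. waste feed).
        import numpy as np

        a, b = 0.9, 0.5          # assumed discrete-time plant parameters
        N = 10                   # prediction horizon
        lam = 0.1                # penalty on control effort
        setpoint = 100.0         # desired steam production

        y = 80.0                 # current measured output
        for step in range(20):
            # Predicted outputs are linear in the input sequence:
            # y[k+i+1] = a^(i+1)*y + sum_j a^(i-j)*b*u[j]
            Phi = np.zeros((N, N))
            for i in range(N):
                for j in range(i + 1):
                    Phi[i, j] = a**(i - j) * b
            free = np.array([a**(i + 1) * y for i in range(N)])
            # Stack tracking error and effort penalty; solve least squares.
            A_ls = np.vstack([Phi, np.sqrt(lam) * np.eye(N)])
            b_ls = np.concatenate([setpoint - free, np.zeros(N)])
            u_seq, *_ = np.linalg.lstsq(A_ls, b_ls, rcond=None)
            u = u_seq[0]                 # apply only the first move
            y = a * y + b * u            # plant responds; repeat next sample
        print(f"output after 20 steps: {y:.1f} (setpoint {setpoint})")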

  4. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  5. A Unified Model of Performance for Predicting the Effects of Sleep and Caffeine.

    Science.gov (United States)

    Ramakrishnan, Sridhar; Wesensten, Nancy J; Kamimori, Gary H; Moon, James E; Balkin, Thomas J; Reifman, Jaques

    2016-10-01

    Existing mathematical models of neurobehavioral performance cannot predict the beneficial effects of caffeine across the spectrum of sleep loss conditions, limiting their practical utility. Here, we closed this research gap by integrating a model of caffeine effects with the recently validated unified model of performance (UMP) into a single, unified modeling framework. We then assessed the accuracy of this new UMP in predicting performance across multiple studies. We hypothesized that the pharmacodynamics of caffeine vary similarly during both wakefulness and sleep, and that caffeine has a multiplicative effect on performance. Accordingly, to represent the effects of caffeine in the UMP, we multiplied a dose-dependent caffeine factor (which accounts for the pharmacokinetics and pharmacodynamics of caffeine) by the performance estimated in the absence of caffeine. We assessed the UMP predictions in 14 distinct laboratory- and field-study conditions, including 7 different sleep-loss schedules (from 5 h of sleep per night to continuous sleep loss for 85 h) and 6 different caffeine doses (from placebo to repeated 200 mg doses to a single dose of 600 mg). The UMP accurately predicted group-average psychomotor vigilance task performance data across the different sleep loss and caffeine conditions (errors below 6%), and accounting for caffeine improved predictions (after caffeine consumption) by up to 70%. The UMP provides the first comprehensive tool for accurate selection of combinations of sleep schedules and caffeine countermeasure strategies to optimize neurobehavioral performance. © 2016 Associated Professional Sleep Societies, LLC.
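
    The multiplicative structure described above can be sketched in a few lines: a one-compartment pharmacokinetic profile drives a dose-dependent factor that scales the caffeine-free prediction. All parameter values here are assumed for illustration and are not the published UMP coefficients:

        # Caffeine-free performance (PVT lapses; lower is better) multiplied
        # by a dose- and time-dependent caffeine factor. Parameters assumed.
        import math

        def caffeine_concentration(dose_mg, t_h, ka=2.0, ke=0.14):
            """Relative plasma concentration t_h hours after an oral dose."""
            return dose_mg * (math.exp(-ke * t_h) - math.exp(-ka * t_h)) * ka / (ka - ke)

        def caffeine_factor(dose_mg, t_h, m=0.01):
            """Factor in (0, 1]; multiplies the caffeine-free lapse prediction."""
            return 1.0 / (1.0 + m * caffeine_concentration(dose_mg, t_h))

        baseline_lapses = 12.0   # assumed caffeine-free prediction at this time
        for t in (1, 4, 8, 12):
            predicted = baseline_lapses * caffeine_factor(200, t)
            print(f"{t:>2} h after 200 mg: {predicted:.1f} lapses")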

  6. DDT: A Research Tool for Automatic Data Distribution in High Performance Fortran

    Directory of Open Access Journals (Sweden)

    Eduard Ayguadé

    1997-01-01

    This article describes the main features and implementation of our automatic data distribution research tool. The tool (DDT) accepts programs written in Fortran 77 and generates High Performance Fortran (HPF) directives to map arrays onto the memories of the processors and parallelize loops, as well as executable statements to remap these arrays. DDT works by identifying a set of computational phases (procedures and loops). The algorithm builds a search space of candidate solutions for these phases, which is explored looking for the combination that minimizes the overall cost; this cost includes data movement cost and computation cost. The movement cost reflects the cost of accessing remote data during the execution of a phase and the remapping costs that have to be paid in order to execute the phase with the selected mapping. The computation cost includes the cost of executing a phase in parallel according to the selected mapping and the owner-computes rule. The tool supports interprocedural analysis and uses control flow information to identify how phases are sequenced during the execution of the application.
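
    The phase-sequencing search DDT performs can be illustrated with a tiny dynamic program: each phase has candidate mappings with computation costs, and switching mappings between phases incurs a remap cost. The phases, mappings and costs below are invented; DDT derives them from the HPF program's access patterns:

        # Cheapest sequence of array mappings across computational phases,
        # trading computation cost against remapping cost. All costs assumed.
        phases = ["init", "sweep", "reduce"]
        compute_cost = {
            "init":   {"BLOCK": 4, "CYCLIC": 6},
            "sweep":  {"BLOCK": 9, "CYCLIC": 3},
            "reduce": {"BLOCK": 2, "CYCLIC": 5},
        }
        REMAP = 4  # cost of redistributing an array between phases

        best = dict(compute_cost[phases[0]])
        choice = {m: [m] for m in best}
        for phase in phases[1:]:
            new_best, new_choice = {}, {}
            for m, c in compute_cost[phase].items():
                # cheapest way to arrive at mapping m for this phase
                prev = min(best, key=lambda p: best[p] + (0 if p == m else REMAP))
                new_best[m] = best[prev] + (0 if prev == m else REMAP) + c
                new_choice[m] = choice[prev] + [m]
            best, choice = new_best, new_choice

        m_opt = min(best, key=best.get)
        print("mappings per phase:", choice[m_opt], "total cost:", best[m_opt])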

  7. Performance evaluation of Titanium nitride coated tool in turning of mild steel

    Science.gov (United States)

    Srinivas, B.; Pramod Kumar, G.; Cheepu, Muralimohan; Jagadeesh, N.; kumar, K. Ravi; Haribabu, S.

    2018-03-01

    The growing demand for biodegradable materials has opened an avenue for using vegetable oils, coconut oil, etc., as alternatives to conventional coolants in machining operations. In today's manufacturing industries, the demand for surface quality is increasing rapidly, along with dimensional accuracy and geometric tolerances. The present study examines the influence of cutting parameters on surface roughness during the turning of mild steel with a TiN-coated carbide tool, using groundnut oil and soluble oil as coolants. The results showed that the vegetable oil gave a finer surface finish compared with soluble oil. The cutting parameters were optimized with the Taguchi technique. The main objective of this paper is to optimize the cutting parameters and reduce surface roughness while increasing tool life by applying a coating to the carbide inserts. The coating adds cost, but is more economical than changing tools frequently. Plots were generated and analysed to find the relationships between the parameters, which were confirmed by a comparison between the predicted results and the theoretical results.

  8. KENO3D visualization tool for KENO V.a geometry models

    International Nuclear Information System (INIS)

    Bowman, S.M.; Horwedel, J.E.

    1999-01-01

    The Standardized Computer Analyses for Licensing Evaluation (SCALE) computer software system developed at Oak Ridge National Laboratory (ORNL) is widely used and accepted around the world for criticality safety analyses. SCALE includes the well-known KENO V.a three-dimensional Monte Carlo criticality computer code. Criticality safety analyses often require detailed modeling of complex geometries. Checking the accuracy of these models can be enhanced by effective visualization tools. To address this need, ORNL has recently developed a powerful state-of-the-art visualization tool called KENO3D that enables KENO V.a users to interactively display their three-dimensional geometry models. The interactive options include the following: (1) shaded or wireframe images; (2) standard views, such as top view, side view, front view, and isometric three-dimensional view; (3) rotating the model; (4) zooming in on selected locations; (5) selecting parts of the model to display; (6) editing colors and displaying legends; (7) displaying the properties of any unit in the model; (8) creating cutaway views; (9) removing units from the model; and (10) printing images or saving them to common graphics formats.

  9. Teaching leadership in trauma resuscitation: Immediate feedback from a real-time, competency-based evaluation tool shows long-term improvement in resident performance.

    Science.gov (United States)

    Gregg, Shea C; Heffernan, Daithi S; Connolly, Michael D; Stephen, Andrew H; Leuckel, Stephanie N; Harrington, David T; Machan, Jason T; Adams, Charles A; Cioffi, William G

    2016-10-01

    Limited data exist on how to develop resident leadership and communication skills during actual trauma resuscitations. An evaluation tool was developed to grade senior resident performance as the team leader during full-trauma-team activations. Thirty actions that demonstrated the Accreditation Council for Graduate Medical Education core competencies were graded on a Likert scale of 1 (poor) to 5 (exceptional). These actions were grouped by their respective core competencies on 5 × 7-inch index cards. In Phase 1, baseline performance scores were obtained. In Phase 2, trauma-focused communication in-services were conducted early in the academic year, and immediate, personalized feedback sessions were performed after resuscitations based on the evaluation tool. In Phase 3, residents received only evaluation-based feedback following resuscitations. In Phase 1 (October 2009 to April 2010), 27 evaluations were performed on 10 residents. In Phase 2 (April 2010 to October 2010), 28 evaluations were performed on nine residents. In Phase 3 (October 2010 to January 2012), 44 evaluations were performed on 13 residents. Total scores improved significantly between Phases 1 and 2 (p = 0.003) and remained elevated throughout Phase 3. When analyzing performance by competency, significant improvement between Phases 1 and 2 was noted in the core competencies (patient care, knowledge, system-based practice, practice-based learning), with the exception of "communication and professionalism" (p = 0.56). Statistically similar scores were observed between Phases 2 and 3 in all competencies with the exception of "medical knowledge," which showed ongoing significant improvement (p = 0.003). Directed resident feedback sessions utilizing data from a real-time, competency-based evaluation tool have allowed us to improve our residents' abilities to lead trauma resuscitations over a 30-month period. Given pressures to maximize clinical educational opportunities within work-hour constraints, such a model may help

  10. Performance Analysis of a Fluidic Axial Oscillation Tool for Friction Reduction with the Absence of a Throttling Plate

    Directory of Open Access Journals (Sweden)

    Xinxin Zhang

    2017-04-01

    An axial oscillation tool has proved effective in solving problems associated with high friction and torque in the sliding drilling of complex wells. The fluidic axial oscillation tool, based on an output-fed bistable fluidic oscillator, is a type of axial oscillation tool that has become increasingly popular in recent years. The aim of this paper is to analyze the dynamic flow behavior of a fluidic axial oscillation tool without a throttling plate in order to evaluate its overall performance. In particular, the differences between the original design with a throttling plate and the current default design are analyzed in depth, and an improvement is expected for the latter. A commercial computational fluid dynamics code, Fluent, was used to predict the pressure drop and oscillation frequency of the fluidic axial oscillation tool. The results of the numerical simulations agree well with the corresponding experimental results. A sufficient pressure pulse amplitude with a low pressure drop is desired in this study; therefore, relative pulse amplitudes of pressure drop and displacement are introduced. A comparative analysis of the two designs, with and without a throttling plate, indicates that when the supply flow rate is relatively low or higher than a certain value, the design with a throttling plate exhibits better performance; otherwise, the design without a throttling plate is the preferred alternative. In most operating circumstances, in terms of supply flow rate and pressure drop, the design without a throttling plate performs better than the original design.

  11. Software Application Profile: PHESANT: a tool for performing automated phenome scans in UK Biobank.

    Science.gov (United States)

    Millard, Louise A C; Davies, Neil M; Gaunt, Tom R; Davey Smith, George; Tilling, Kate

    2017-10-05

    Epidemiological cohorts typically contain a diverse set of phenotypes such that automation of phenome scans is non-trivial, because they require highly heterogeneous models. For this reason, phenome scans have to date tended to use a smaller homogeneous set of phenotypes that can be analysed in a consistent fashion. We present PHESANT (PHEnome Scan ANalysis Tool), a software package for performing comprehensive phenome scans in UK Biobank. PHESANT tests the association of a specified trait with all continuous, integer and categorical variables in UK Biobank, or a specified subset. PHESANT uses a novel rule-based algorithm to determine how to appropriately test each trait, then performs the analyses and produces plots and summary tables. The PHESANT phenome scan is implemented in R. PHESANT includes a novel Javascript D3.js visualization and accompanying Java code that converts the phenome scan results to the required JavaScript Object Notation (JSON) format. PHESANT is available on GitHub at [https://github.com/MRCIEU/PHESANT]. Git tag v0.5 corresponds to the version presented here. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
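
    The rule-based dispatch at the heart of such a scan can be sketched as follows, in Python rather than PHESANT's R and with deliberately simplistic rules; PHESANT's actual decision rules for UK Biobank field encodings are considerably richer:

        # Toy phenome scan: choose a statistical test per field from its
        # type, run it against the trait of interest, collect results.
        import pandas as pd
        from scipy import stats

        def phenome_scan(df, trait):
            results = []
            for col in df.columns:
                if col == trait:
                    continue
                x, y = df[col], df[trait]
                if pd.api.types.is_float_dtype(x):        # continuous variable
                    stat, p = stats.pearsonr(x, y)
                    test = "pearson"
                elif x.nunique() == 2:                    # binary variable
                    groups = [y[x == v] for v in x.unique()]
                    stat, p = stats.ttest_ind(*groups)
                    test = "t-test"
                else:                                     # categorical variable
                    groups = [y[x == v] for v in x.unique()]
                    stat, p = stats.f_oneway(*groups)
                    test = "anova"
                results.append((col, test, p))
            cols = ["variable", "test", "p"]
            return pd.DataFrame(results, columns=cols).sort_values("p")

        # Usage with a hypothetical data frame df and trait column "bmi":
        # print(phenome_scan(df, trait="bmi").head())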

  12. Graphite-MicroMégas, a tool for DNA modeling

    OpenAIRE

    Hornus , Samuel; Larivière , Damien

    2011-01-01

    MicroMégas is the current state of an ongoing effort to develop tools for modeling the biological assembly of molecules. We present here its DNA modeling part. MicroMégas is implemented as a plug-in to Graphite, which is a research platform for computer graphics, 3D modeling and numerical geometry developed by members of the ALICE team at INRIA. We describe the MicroMégas tool and the techniques it employs for the modeling of assemblies of molecules, in par...

  13. Modelling Machine Tools using Structure Integrated Sensors for Fast Calibration

    Directory of Open Access Journals (Sweden)

    Benjamin Montavon

    2018-02-01

    Monitoring the relative deviation between commanded and actual tool tip position, which limits the volumetric performance of the machine tool, enables the use of contemporary compensation methods to reduce tolerance mismatch and the uncertainties of on-machine measurements. The development of a primarily optical sensor setup capable of being integrated into the machine structure without limiting its operating range is presented. The use of a frequency-modulating interferometer and photosensitive arrays in combination with a Gaussian laser beam allows fast and automated online measurement of the axes' motion errors and thermal conditions, with accuracy comparable to state-of-the-art optical instruments for offline machine tool calibration at lower cost and smaller dimensions. The development is tested through simulation of the sensor setup based on raytracing and Monte Carlo techniques.

  14. Rapid Tooling via Stereolithography

    OpenAIRE

    Montgomery, Eva

    2006-01-01

    Approximately three years ago, composite stereolithography (SL) resins were introduced to the marketplace, offering performance features beyond what traditional SL resins could offer. In particular, the high heat deflection temperatures and high stiffness of these highly filled resins have opened the door to several new rapid prototyping (RP) applications, including wind tunnel test modelling and, more recently, rapid tooling.

  15. NIF capsule performance modeling

    Directory of Open Access Journals (Sweden)

    Weber S.

    2013-11-01

    Post-shot modeling of NIF capsule implosions was performed in order to validate our physical and numerical models. Cryogenic layered target implosions and experiments with surrogate targets produce an abundance of capsule performance data including implosion velocity, remaining ablator mass, times of peak x-ray and neutron emission, core image size, core symmetry, neutron yield, and x-ray spectra. We have attempted to match the integrated data set with capsule-only simulations by adjusting the drive and other physics parameters within expected uncertainties. The simulations include interface roughness, time-dependent symmetry, and a model of mix. We were able to match many of the measured performance parameters for a selection of shots.

  16. Development and Validation of Performance Assessment Tools for Interprofessional Communication and Teamwork (PACT)

    Science.gov (United States)

    Chiu, Chia-Ju

    2014-01-01

    Background: Medical errors caused by breakdowns in teamwork and interprofessional communication contribute to many deaths in the United States each year. Team Strategies and Tools to Enhance Performance and Patient Safety (TeamSTEPPS®) is an evidence-based teamwork system developed to improve communication and teamwork skills among health care…

  17. Multiprocessor performance modeling with ADAS

    Science.gov (United States)

    Hayes, Paul J.; Andrews, Asa M.

    1989-01-01

    A graph managing strategy referred to as the Algorithm to Architecture Mapping Model (ATAMM) appears useful for the time-optimized execution of application algorithm graphs in embedded multiprocessors and for the performance prediction of graph designs. This paper reports the modeling of ATAMM in the Architecture Design and Assessment System (ADAS) to make an independent verification of ATAMM's performance prediction capability and to provide a user framework for the evaluation of arbitrary algorithm graphs. Following an overview of ATAMM and its major functional rules are descriptions of the ADAS model of ATAMM, methods to enter an arbitrary graph into the model, and techniques to analyze the simulation results. The performance of a 7-node graph example is evaluated using the ADAS model and verifies the ATAMM concept by substantiating previously published performance results.

  18. Mathematical modeling of optical glazing performance

    NARCIS (Netherlands)

    Nijnatten, van P.A.; Wittwer, V.; Granqvist, C.G.; Lampert, C.M.

    1994-01-01

    Mathematical modelling can be a powerful tool in the design and optimization of glazing. By calculation, the specifications of a glazing design and the optimal design parameters can be predicted without first building costly prototypes. Furthermore, properties which are difficult to measure, like

  19. Predictive models for PEM-electrolyzer performance using adaptive neuro-fuzzy inference systems

    Energy Technology Data Exchange (ETDEWEB)

    Becker, Steffen [University of Tasmania, Hobart 7001, Tasmania (Australia); Karri, Vishy [Australian College of Kuwait (Kuwait)

    2010-09-15

    Predictive models were built using neural-network-based Adaptive Neuro-Fuzzy Inference Systems for hydrogen flow rate, electrolyzer system efficiency and stack efficiency, respectively. A comprehensive experimental database forms the foundation for the predictive models. It is argued that, due to the high costs associated with hydrogen measuring equipment, these reliable predictive models can be implemented as virtual sensors. These models can also be used on-line for monitoring and safety of hydrogen equipment. The quantitative accuracy of the predictive models is appraised using statistical techniques. These mathematical models are found to be reliable predictive tools with an excellent accuracy of ±3% compared with experimental values. The predictive nature of these models did not show any significant bias toward either over-prediction or under-prediction. These predictive models, built on a sound mathematical and quantitative basis, can be seen as a step towards establishing hydrogen performance prediction models as generic virtual sensors for wider safety and monitoring applications. (author)

  20. Prediction Of Abrasive And Diffusive Tool Wear Mechanisms In Machining

    Science.gov (United States)

    Rizzuti, S.; Umbrello, D.

    2011-01-01

    Tool wear prediction is regarded as a very important task in order to maximize tool performance, minimize cutting costs and improve workpiece quality in cutting. In this research work, an experimental campaign was carried out under varying cutting conditions with the aim of measuring both crater and flank tool wear during machining of AISI 1045 steel with an uncoated P40 carbide tool. In parallel, a FEM-based analysis was developed in order to study the tool wear mechanisms, taking into account the influence of the cutting conditions and the temperatures reached on the tool surfaces. The results show that, when the temperature of the tool rake surface is lower than the activation temperature of the diffusive phenomenon, the wear rate can be estimated by applying an abrasive model. In contrast, in the tool area where the temperature is higher than the diffusive activation temperature, the wear rate can be evaluated by applying a diffusive model. Finally, for temperatures between these two values, a combined abrasive-diffusive wear model made it possible to correctly evaluate the tool wear phenomena.
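
    The two-regime logic the authors describe maps naturally onto a piecewise wear-rate function: an abrasive term below the diffusion activation temperature and an Arrhenius-type diffusive term above it. The constants and the 700 °C threshold below are placeholders, not the calibrated values from the FEM study:

        # Piecewise tool wear-rate model: abrasive below the activation
        # temperature, Usui-style diffusive (Arrhenius) above it.
        import math

        T_ACT = 700.0      # assumed diffusion activation temperature (deg C)

        def wear_rate(sliding_velocity, normal_stress, T_celsius,
                      k_abr=1e-9, c_diff=10.0, q_over_r=14000.0):
            """Tool wear rate; inputs in m/s, MPa and deg C. Constants assumed."""
            T_kelvin = T_celsius + 273.15
            if T_celsius < T_ACT:
                # abrasive term: proportional to load and sliding velocity
                return k_abr * normal_stress * sliding_velocity
            # diffusive term: exponential dependence on absolute temperature
            return c_diff * sliding_velocity * math.exp(-q_over_r / T_kelvin)

        for T in (500, 650, 750, 900):
            print(T, "degC ->", f"{wear_rate(2.0, 800.0, T):.2e}")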